Science.gov

Sample records for air quantitative analysis

  1. Quantitative analysis

    PubMed Central

    Nevin, John A.

    1984-01-01

    Quantitative analysis permits the isolation of invariant relations in the study of behavior. The parameters of these relations can serve as higher-order dependent variables in more extensive analyses. These points are illustrated by reference to quantitative descriptions of performance maintained by concurrent schedules, multiple schedules, and signal-detection procedures. Such quantitative descriptions of empirical data may be derived from mathematical theories, which in turn can lead to novel empirical analyses so long as their terms refer to behavioral and environmental events. Thus, quantitative analysis is an integral aspect of the experimental analysis of behavior. PMID:16812400

  2. Quantitative volatile metabolite profiling of common indoor fungi: relevancy for indoor air analysis.

    PubMed

    Schuchardt, Sven; Kruse, Hermann

    2009-08-01

    Microorganisms such as bacteria and molds produce an enormous variety of volatile metabolites. To determine whether typical microbial volatile metabolites can be used as indicator compounds for the detection of hidden mold in indoor environments, we examined 14 typical indoor fungal strains for their growth rates and their capability to produce volatile organic compounds (VOC) on standard clinical media and on agar medium made from building materials. Air samples from headspace chambers (HSC) were collected daily on Tenax TA adsorption tubes and analyzed by thermal desorption gas chromatography-mass spectrometry. In parallel, metabolic activity was measured by determining oxygen demand, and microbial biomass was assessed by dry weighing. Profiling of the volatile metabolites showed that VOC production depended greatly on fungal strain, culture medium, biological activity, and time. The laboratory-derived maximum emission rates were extrapolated to approximate indoor air concentrations in a hypothetical mold-infested room. The extrapolated indoor air data suggest that most of the microbially produced VOC concentrations were below the analytical detection limit for conventional indoor air analysis. Indoor air analyses conducted in mold-infested homes largely confirmed these findings. The present findings raise doubts about the utility of indicator VOC for the detection of hidden mold growth in indoor environments.
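    The extrapolation step described above, from a laboratory emission rate to an indoor air concentration, can be sketched with a steady-state single-zone box model. The emission rate, room volume, and air-exchange rate below are illustrative assumptions, not values from the study.

```python
# Sketch: extrapolating a laboratory VOC emission rate to an indoor air
# concentration with a steady-state single-zone box model.
# All numbers are illustrative assumptions, not values from the study.

def steady_state_concentration(emission_ug_per_h, room_volume_m3, ach_per_h):
    """Steady-state concentration (ug/m^3) = E / (V * ACH)."""
    return emission_ug_per_h / (room_volume_m3 * ach_per_h)

# hypothetical mold-infested room: 0.5 ug/h emission, 30 m^3, 0.5 air changes/h
c = steady_state_concentration(0.5, 30.0, 0.5)
print(round(c, 3))  # 0.033 ug/m^3 -- near typical VOC detection limits
```

    Even a generous emission rate thus yields sub-detection-limit room concentrations once diluted by normal air exchange, which is consistent with the conclusion of the abstract.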

  3. Application of ion chemistry and the SIFT technique to the quantitative analysis of trace gases in air and on breath

    NASA Astrophysics Data System (ADS)

    Smith, David; Španěl, Patrik

    Our major objective in this paper is to describe a new method we have developed for the analysis of trace gases at partial pressures down to the ppb level in atmospheric air, with special emphasis on the detection and quantification of trace gases on human breath. It involves the use of our selected ion flow tube (SIFT) technique, which we previously developed and used extensively for the study of gas phase ionic reactions occurring in ionized media such as the terrestrial atmosphere and interstellar gas clouds. Before discussing this analytical technique we describe the results of our very recent SIFT and flowing afterglow (FA) studies of the reactions of the H3O+ and OH- ions, of their hydrates H3O+(H2O)1,2,3 and OH-(H2O)1,2, and of NO+ and O2+, with several hydrocarbons and oxygen-bearing organic molecules, studies that are very relevant to our trace gas analytical studies. Then follows a detailed discussion of the application of our SIFT technique to trace gas analysis, after which we present some results obtained for the analyses of laboratory air, the breath of a healthy non-smoking person, the breath of a person who regularly smokes cigarettes, the complex vapours emitted by banana and onion, and the molecules present in a butane/air flame. We show how the quantitative analysis of breath can be achieved from only a single exhalation and in real time (the time response of the instrument is only about 20 ms). We also show how the time variation of breath gases over long time periods can be followed, using the decay of ethanol on the breath after the ingestion of distilled liquor as an example, while simultaneously following several other trace gases including acetone and isoprene, which are very easily detected on the breath of all individuals because of their relatively high partial pressures (typically 100 to 1000 ppb). The breath of a smoker is richer in complex molecules, some nitrogen-containing organics being clearly evident at the 5 to 50 ppb level.
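    The quantification principle behind SIFT analysis can be sketched as follows: in the low-conversion limit, the trace gas number density in the sampled flow follows from the product-to-precursor ion count ratio, the ion-molecule rate coefficient k, and the reaction time t. The counts, rate coefficient, and reaction time below are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of SIFT quantification (low-conversion limit).
# All numbers are illustrative, not taken from the paper.

def trace_gas_density(product_counts, precursor_counts, k_cm3_s, t_s):
    """n_M (cm^-3) ~ (I_product / I_precursor) / (k * t)."""
    return (product_counts / precursor_counts) / (k_cm3_s * t_s)

# assumed: 1 % conversion, k = 3e-9 cm^3/s (a typical proton-transfer
# rate coefficient for H3O+ reactions), reaction time 1 ms in the flow tube
n = trace_gas_density(1.0e4, 1.0e6, 3.0e-9, 1.0e-3)
print(f"{n:.2e} molecules per cm^3 in the sampled flow")
```

    In practice the dilution of the breath or air sample into the helium carrier gas must also be factored back in to recover the original partial pressure.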

  4. Quantitative Analysis of Spectral Interference of Spontaneous Raman Scattering in High-Pressure Fuel-Rich H2-Air Combustion

    NASA Technical Reports Server (NTRS)

    Kojima, Jun; Nguyen, Quang-Viet

    2004-01-01

    We present a theoretical study of the spectral interferences in the spontaneous Raman scattering spectra of major combustion products in 30-atm fuel-rich hydrogen-air flames. An effective methodology is introduced to choose an appropriate line-shape model for simulating Raman spectra in high-pressure combustion environments. The Voigt profile with the additive approximation assumption was found to provide a reasonable model of the spectral line shape for the present analysis. The rotational/vibrational Raman spectra of H2, N2, and H2O were calculated using an anharmonic-oscillator model with the latest collisional broadening coefficients. The calculated spectra were validated against data obtained in a 10-atm fuel-rich H2-air flame and showed excellent agreement. Our quantitative spectral analysis for equivalence ratios ranging from 1.5 to 5.0 revealed substantial amounts of spectral cross-talk between the rotational H2 lines and the N2 O-/Q-branch, and between the vibrational H2 O(3) line and the vibrational H2O spectrum. We also address the temperature dependence of the spectral cross-talk and extend our analysis to include a cross-talk compensation technique that removes the interference arising from the H2 Raman spectra onto the N2 or H2O spectra.
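    As a sketch of the line-shape modeling step, the pseudo-Voigt profile is a common closed-form approximation to the Voigt profile (the convolution of Gaussian Doppler broadening with Lorentzian collisional broadening). The linewidth and mixing parameter below are illustrative, not values fitted in the paper.

```python
import math

# Sketch: pseudo-Voigt line shape, a standard closed-form approximation to
# the Voigt profile used in line-shape modeling. Parameters are illustrative.

def gaussian(x, x0, fwhm):
    s = fwhm / (2.0 * math.sqrt(2.0 * math.log(2.0)))
    return math.exp(-((x - x0) ** 2) / (2.0 * s * s)) / (s * math.sqrt(2.0 * math.pi))

def lorentzian(x, x0, fwhm):
    g = fwhm / 2.0
    return g / (math.pi * ((x - x0) ** 2 + g * g))

def pseudo_voigt(x, x0, fwhm, eta):
    """eta = 1 -> pure Lorentzian (collision-broadened limit); eta = 0 -> Gaussian."""
    return eta * lorentzian(x, x0, fwhm) + (1.0 - eta) * gaussian(x, x0, fwhm)

# peak value at line center for an assumed 2 cm^-1 FWHM line, 50/50 mixing
print(round(pseudo_voigt(0.0, 0.0, 2.0, 0.5), 3))  # 0.394
```

    At high pressure the Lorentzian (collisional) component dominates, which is why the choice of mixing and broadening coefficients matters for quantitative cross-talk estimates.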

  5. Quantitative Analysis of Major Phytochemicals in Orthodox tea (Camellia sinensis), Oxidized under Compressed Air Environment.

    PubMed

    Panda, Brajesh Kumar; Datta, Ashis Kumar

    2016-04-01

    This study describes major changes in the phytochemical composition of orthodox tea (Camellia sinensis var. Assamica) oxidized under compressed air (CA). Oxidation experiments were conducted under air pressures of 101, 202, and 303 kPa for 150 min. Relative changes in the concentrations of caffeine, catechins, theaflavins (TF), and thearubigins (TR) were analyzed. The effect of CA pressure on caffeine concentration during oxidation was found to be nonsignificant, but the degradation of individual catechins and the formation of individual TF were significantly affected by CA pressure. At high CA pressure, TF showed the highest peak value. TR formed more slowly than TF during the initial phase of oxidation. Even though the rate of TR formation was significantly influenced by CA, a portion of the catechins remained unoxidized at the end of oxidation. Except for caffeine, the percent changes in the rates of formation or degradation were more prominent at 202 kPa.
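    Degradation kinetics of the kind discussed above are often described with a first-order model, C(t) = C0·exp(-k·t). The rate constant below is an assumed illustration, not a value fitted in the study.

```python
import math

# Sketch: first-order degradation kinetics of the kind often used to model
# catechin loss during tea oxidation, C(t) = C0 * exp(-k * t).
# The rate constant is an assumed illustration, not a fitted value.

def remaining_fraction(k_per_min, t_min):
    return math.exp(-k_per_min * t_min)

# over the 150-min oxidation used in the study, an assumed k = 0.01 /min
# would leave ~22 % of the initial catechins unoxidized:
print(round(remaining_fraction(0.01, 150.0), 2))  # 0.22
```

    A first-order fit like this is one simple way to make the observed "portion of catechins remained unoxidized" quantitative as a function of pressure.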

  6. An automated gas chromatography time-of-flight mass spectrometry instrument for the quantitative analysis of halocarbons in air

    NASA Astrophysics Data System (ADS)

    Obersteiner, F.; Bönisch, H.; Engel, A.

    2015-09-01

    We present the characterization and application of a new gas chromatography-time-of-flight mass spectrometry instrument (GC-TOFMS) for the quantitative analysis of halocarbons in air samples. The setup comprises three fundamental enhancements compared to our earlier work (Hoker et al., 2015): (1) full automation, (2) a mass resolving power R = m/Δm of the TOFMS (Tofwerk AG, Switzerland) increased up to 4000 and (3) a fully accessible data format of the mass spectrometric data. Automation in combination with the accessible data allowed an in-depth characterization of the instrument. Mass accuracy was found to be around 5 ppm after automatic recalibration of the mass axis in each measurement. A TOFMS configuration giving R = 3500 was chosen to provide an R-to-sensitivity ratio suitable for our purpose. Calculated detection limits were as low as a few femtograms, as mass traces could be made highly specific for selected molecule fragments with the accurate mass information. The precision for substance quantification was 0.15 % at best for an individual measurement and was in general mainly determined by the signal-to-noise ratio of the chromatographic peak. The TOFMS was found to be linear within a concentration range from about 1 pg to 1 ng of analyte per liter of air. At higher concentrations, non-linearities of a few percent were observed (precision level: 0.2 %) but could be attributed to a potential source within the detection system. A straightforward correction for those non-linearities was applied in data processing, again by exploiting the accurate mass information. Based on the overall characterization results, the GC-TOFMS instrument was found to be very well suited for the task of quantitative halocarbon trace gas observation and a big step forward compared to scanning, low-resolution quadrupole MS and to a TOFMS technique reported to be non-linear and restricted by a small dynamic range.
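    To make the figures above concrete: a resolving power R = m/Δm and a ppm-level mass accuracy translate directly into peak widths and extraction-window widths on the mass axis. The fragment mass below is a nominal illustration, not a specific ion from the paper.

```python
# Sketch: what R = m/dm and a 5 ppm mass accuracy mean in practice for
# separating halocarbon fragment ions. The mass is a nominal illustration.

def peak_width(m, resolving_power):
    """FWHM (u) of a peak at mass m for a given R = m/dm."""
    return m / resolving_power

def ppm_window(m, ppm):
    """Half-width (u) of the mass-accuracy window at mass m for a given ppm error."""
    return m * ppm * 1e-6

m = 100.0  # a fragment ion near m/z 100
print(round(peak_width(m, 3500), 4))  # 0.0286 u FWHM at R = 3500
print(round(ppm_window(m, 5), 5))     # 0.0005 u accuracy window at 5 ppm
```

    The accuracy window is far narrower than the peak width, which is what makes the highly specific mass traces (and hence the femtogram detection limits) possible.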

  7. An automated gas chromatography time-of-flight mass spectrometry instrument for the quantitative analysis of halocarbons in air

    NASA Astrophysics Data System (ADS)

    Obersteiner, F.; Bönisch, H.; Engel, A.

    2016-01-01

    We present the characterization and application of a new gas chromatography time-of-flight mass spectrometry instrument (GC-TOFMS) for the quantitative analysis of halocarbons in air samples. The setup comprises three fundamental enhancements compared to our earlier work (Hoker et al., 2015): (1) full automation, (2) a mass resolving power R = m/Δm of the TOFMS (Tofwerk AG, Switzerland) increased up to 4000 and (3) a fully accessible data format of the mass spectrometric data. Automation in combination with the accessible data allowed an in-depth characterization of the instrument. Mass accuracy was found to be approximately 5 ppm on average after automatic recalibration of the mass axis in each measurement. A TOFMS configuration giving R = 3500 was chosen to provide an R-to-sensitivity ratio suitable for our purpose. Calculated detection limits are as low as a few femtograms by means of the accurate mass information. The precision for substance quantification was 0.15 % at best for an individual measurement and was in general mainly determined by the signal-to-noise ratio of the chromatographic peak. Detector non-linearity was found to be insignificant up to a mixing ratio of roughly 150 ppt at 0.5 L sampled volume. At higher concentrations, non-linearities of a few percent were observed (precision level: 0.2 %) but could be attributed to a potential source within the detection system. A straightforward correction for those non-linearities was applied in data processing, again by exploiting the accurate mass information. Based on the overall characterization results, the GC-TOFMS instrument was found to be very well suited for the task of quantitative halocarbon trace gas observation and a big step forward compared to scanning quadrupole MS with low mass resolving power and to a TOFMS technique reported to be non-linear and restricted by a small dynamic range.

  8. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

    According to regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to help the company meet rules and regulations and to assess and describe environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life. In addition, it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  9. A quantitative method for optimized placement of continuous air monitors.

    PubMed

    Whicker, Jeffrey J; Rodgers, John C; Moxley, John S

    2003-11-01

    Alarming continuous air monitors (CAMs) are a critical component for worker protection in facilities that handle large amounts of hazardous materials. In nuclear facilities, continuous air monitors alarm when levels of airborne radioactive materials exceed alarm thresholds, thus prompting workers to exit the room to reduce inhalation exposures. To maintain a high level of worker protection, continuous air monitors are required to detect radioactive aerosol clouds quickly and with good sensitivity. This requires that there are sufficient numbers of continuous air monitors in a room and that they are well positioned. Yet there are no published methodologies to quantitatively determine the optimal number and placement of continuous air monitors in a room. The goal of this study was to develop and test an approach to quantitatively determine optimal number and placement of continuous air monitors in a room. The method we have developed uses tracer aerosol releases (to simulate accidental releases) and the measurement of the temporal and spatial aspects of the dispersion of the tracer aerosol through the room. The aerosol dispersion data is then analyzed to optimize continuous air monitor utilization based on simulated worker exposure. This method was tested in a room within a Department of Energy operated plutonium facility at the Savannah River Site in South Carolina, U.S. Results from this study show that the value of quantitative airflow and aerosol dispersion studies is significant and that worker protection can be significantly improved while balancing the costs associated with CAM programs.
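    The optimization described above can be sketched as a greedy search over candidate monitor locations: given tracer time-to-detection data for several simulated releases, pick the set of locations that minimizes the worst-case detection time. The location names and times below are invented, and the authors' actual exposure-based objective may differ; this is only a sketch of the selection idea.

```python
# Sketch: choosing CAM locations from tracer-release data by greedy search.
# detect_time[loc][release] = seconds until the tracer cloud from `release`
# would trip a CAM placed at `loc`. All numbers are hypothetical.

def greedy_placement(detect_time, n_monitors):
    chosen = []
    n_releases = len(next(iter(detect_time.values())))
    for _ in range(n_monitors):
        def worst_case(loc):
            # with `loc` added, detection time per release is set by the
            # fastest-responding monitor in the chosen set
            return max(min(detect_time[c][r] for c in chosen + [loc])
                       for r in range(n_releases))
        best = min((l for l in detect_time if l not in chosen), key=worst_case)
        chosen.append(best)
    return chosen

times = {  # 3 candidate locations x 3 simulated releases (seconds)
    "door": [30, 90, 120],
    "vent": [80, 20, 60],
    "bench": [100, 70, 25],
}
print(greedy_placement(times, 2))  # ['vent', 'door']
```

    Extending the objective to simulated worker dose instead of detection time, as the study does, only changes the `worst_case` function.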

  10. AIR Model Preflight Analysis

    NASA Technical Reports Server (NTRS)

    Tai, H.; Wilson, J. W.; Maiden, D. L.

    2003-01-01

    The atmospheric ionizing radiation (AIR) ER-2 preflight analysis, one of the first attempts to obtain a relatively complete measurement set of the high-altitude radiation level environment, is described in this paper. The primary thrust is to characterize the atmospheric radiation and to define dose levels at high-altitude flight. A secondary thrust is to develop and validate dosimetric techniques and monitoring devices for protecting aircrews. With a few chosen routes, we can measure the experimental results and validate the AIR model predictions. Eventually, as more measurements are made, we gain more understanding about the hazardous radiation environment and acquire more confidence in the prediction models.

  11. Systemic Analysis Approaches for Air Transportation

    NASA Technical Reports Server (NTRS)

    Conway, Sheila

    2005-01-01

    Air transportation system designers have had only limited success using traditional operations research and parametric modeling approaches in their analyses of innovations. They need a systemic methodology for modeling safety-critical infrastructure that is comprehensive, objective, and sufficiently concrete, yet simple enough to be used with reasonable investment. The methodology must also be amenable to quantitative analysis so that issues of system safety and stability can be rigorously addressed. However, air transportation has proven itself an extensive, complex system whose behavior is difficult to describe, let alone predict. There is a wide range of system analysis techniques available, but some are more appropriate for certain applications than others. Specifically in the area of complex system analysis, the literature suggests that both agent-based models and network analysis techniques may be useful. This paper discusses the theoretical basis for each approach in these applications, and explores their historic and potential further use for air transportation analysis.

  12. Automated quantitative analysis for pneumoconiosis

    NASA Astrophysics Data System (ADS)

    Kondo, Hiroshi; Zhao, Bin; Mino, Masako

    1998-09-01

    Automated quantitative analysis for pneumoconiosis is presented. In this paper, Japanese standard radiographs of pneumoconiosis are categorized by measuring the area density and the number density of small rounded opacities, and the size and shape of the opacities are classified by measuring the equivalent radius of each opacity. The proposed method includes a bi-level unsharp masking filter with a 1D uniform impulse response in order to suppress undesired structures such as the images of blood vessels and ribs in the chest X-ray image. Fuzzy contrast enhancement is also introduced for easy and exact detection of small rounded opacities. Many simulation examples show that the proposed method is more reliable than the former method.
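    The unsharp masking idea behind the filter above can be sketched in one dimension: subtracting a copy smoothed with a uniform (moving-average) kernel suppresses slowly varying structures such as rib and vessel shadows, while small, localized opacities survive. The signal below is synthetic; the paper's actual bi-level filter is more elaborate.

```python
# Sketch: 1-D unsharp masking with a uniform (moving-average) kernel.
# The synthetic signal stands in for one scan line of a radiograph.

def moving_average(signal, width):
    half = width // 2
    padded = [signal[0]] * half + signal + [signal[-1]] * half  # edge padding
    return [sum(padded[i:i + width]) / width for i in range(len(signal))]

def unsharp_mask(signal, width):
    blurred = moving_average(signal, width)
    return [s - b for s, b in zip(signal, blurred)]

# a slow ramp (vessel/rib-like trend) with a small bump (opacity) at index 5
sig = [float(i) for i in range(10)]
sig[5] += 3.0
out = unsharp_mask(sig, 5)
print(max(range(len(out)), key=lambda i: out[i]))  # 5 -- the bump survives
```

    The ramp is almost entirely removed by the subtraction, leaving the localized bump as the dominant feature, which is exactly what a subsequent opacity detector needs.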

  13. Direct quantitative analysis of phthalate esters as micro-contaminants in cleanroom air and wafer surfaces by auto-thermal desorption--gas chromatography--mass spectrometry.

    PubMed

    Kang, Yuhao; Den, Walter; Bai, Hsunling; Ko, Fu-Hsiang

    2005-04-01

    This study established an analytical method for the trace analysis of two phthalate esters, diethyl phthalate (DEP) and di-n-butyl phthalate (DBP), known as the major constituents of cleanroom micro-contamination detrimental to the reliability of semiconductor devices. Using thermal desorption coupled with a GC-MS system, standard tubes were prepared for calibration by delivering liquid standards, pre-vaporized by a quasi-vaporizer, into Tenax GR tubes. This method was capable of achieving detection limits of 0.05 microg m(-3) for 0.1 m3 air samples and 0.03 ng cm(-2) for 150-mm wafer surface density. Actual samples collected from a semiconductor cleanroom showed that the concentration of DBP in a polypropylene wafer box (0.45 microg m(-3)) was nearly four times higher than that in the cleanroom environment (0.12 microg m(-3)). The surface contamination of DBP was 0.67 ng cm(-2) for a wafer stored in the wafer box for 24 h. Furthermore, among the three types of heat-resistant O-ring materials tested, Kalrez was found to be particularly suitable for high-temperature processes in semiconductor cleanrooms due to its low emission of organic vapors. This analytical procedure should serve as an effective monitoring method for organic micro-contamination in cleanroom environments.

  14. Novel Air Stimulation MR-Device for Intraoral Quantitative Sensory Cold Testing

    PubMed Central

    Brönnimann, Ben; Meier, Michael L.; Hou, Mei-Yin; Parkinson, Charles; Ettlin, Dominik A.

    2016-01-01

    The advent of neuroimaging in dental research provides exciting opportunities for relating excitation of trigeminal neurons to human somatosensory perceptions. Cold air sensitivity is one of the most frequent causes of dental discomfort or pain. To date, devices capable of delivering controlled cold air in an MR environment have been unavailable for quantitative sensory testing. This study therefore aimed at constructing and evaluating a novel MR-compatible, computer-controlled cold air stimulation apparatus (CASA) that produces graded air puffs. CASA consisted of a multi-injector air jet delivery system (AJS), a cold exchanger, a cooling agent, and a stimulus application construction. Its feasibility was tested by performing an fMRI stimulation experiment on a single subject experiencing dentine cold sensitivity. The novel device delivered repetitive, stable air stimuli ranging from room temperature (24.5°C ± 2°C) to −35°C, at flow rates between 5 and 17 liters per minute (l/min). These cold air puffs evoked perceptions similar to natural stimuli. Single-subject fMRI analysis yielded brain activations typically associated with acute pain processing, including the thalamus, insular and cingulate cortices, and somatosensory, cerebellar, and frontal brain regions. Thus, the novel CASA allowed for controlled, repetitive quantitative sensory testing using air stimuli at graded temperatures (room temperature down to −35°C) while simultaneously recording brain responses. No other MR-compatible stimulation device currently exists that is capable of providing non-contact, natural-like stimuli over a wide temperature range to tissues in spatially restricted areas such as the mouth. The physical characteristics of this novel device thus hold promise for advancing the field of trigeminal and spinal somatosensory research, namely with respect to comparing therapeutic interventions for dentine hypersensitivity. PMID:27445771

  15. Quantitative analysis of digital microscope images.

    PubMed

    Wolf, David E; Samarasekera, Champika; Swedlow, Jason R

    2013-01-01

    This chapter discusses quantitative analysis of digital microscope images and presents several exercises that illustrate the concepts involved. The basic concepts of quantitative imaging rest on a well-established foundation of signal theory and quantitative data analysis. The chapter presents several examples for understanding the imaging process as a transformation from sample to image, along with the limits and considerations of quantitative analysis. It introduces the concept of digitally correcting images and focuses on some of the more critical types of data transformation and some frequently encountered issues in quantization. Image processing is a form of data processing, of which there are many other examples, such as fitting data to a theoretical curve. In all these cases, care must be taken during all steps of transformation, processing, and quantization.

  16. Quantitative analysis of digital microscope images.

    PubMed

    Wolf, David E; Samarasekera, Champika; Swedlow, Jason R

    2013-01-01

    This chapter discusses quantitative analysis of digital microscope images and presents several exercises that illustrate the concepts involved. The basic concepts of quantitative imaging rest on a well-established foundation of signal theory and quantitative data analysis. The chapter presents several examples for understanding the imaging process as a transformation from sample to image, along with the limits and considerations of quantitative analysis. It introduces the concept of digitally correcting images and focuses on some of the more critical types of data transformation and some frequently encountered issues in quantization. Image processing is a form of data processing, of which there are many other examples, such as fitting data to a theoretical curve. In all these cases, care must be taken during all steps of transformation, processing, and quantization. PMID:23931513

  17. Air Pollution. Part A: Analysis.

    ERIC Educational Resources Information Center

    Ledbetter, Joe O.

    Two facets of the engineering control of air pollution (the analysis of possible problems and the application of effective controls) are covered in this two-volume text. Part A covers Analysis, and Part B, Prevention and Control. (This review is concerned with Part A only.) This volume deals with the terminology, methodology, and symptomatology…

  18. Cancer detection by quantitative fluorescence image analysis.

    PubMed

    Parry, W L; Hemstreet, G P

    1988-02-01

    Quantitative fluorescence image analysis is a rapidly evolving biophysical cytochemical technology with the potential for multiple clinical and basic research applications. We report the application of this technique for bladder cancer detection and discuss its potential usefulness as an adjunct to methods currently used by urologists for the diagnosis and management of bladder cancer. Quantitative fluorescence image analysis is a cytological method that incorporates 2 diagnostic techniques, quantitation of nuclear deoxyribonucleic acid and morphometric analysis, in a single semiautomated system to facilitate the identification of rare events, that is, individual cancer cells. When compared to routine cytopathology for detection of bladder cancer in symptomatic patients, quantitative fluorescence image analysis demonstrated greater sensitivity (76 versus 33 per cent) for the detection of low grade transitional cell carcinoma. The specificity of quantitative fluorescence image analysis in a small control group was 94 per cent, and with the manual method for quantitation of absolute nuclear fluorescence intensity in the screening of high risk asymptomatic subjects the specificity was 96.7 per cent. The more familiar flow cytometry is another fluorescence technique for measurement of nuclear deoxyribonucleic acid. However, rather than identifying individual cancer cells, flow cytometry identifies cellular pattern distributions, that is, the ratio of normal to abnormal cells. Numerous studies by others have shown that flow cytometry is a sensitive method to monitor patients with diagnosed urological disease. Based upon results in separate quantitative fluorescence image analysis and flow cytometry studies, it appears that these 2 fluorescence techniques may be complementary tools for urological screening, diagnosis, and management, and that they may also be useful, separately or in combination, to elucidate the oncogenic process and determine the biological potential of tumors.
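    The sensitivity and specificity percentages quoted above are simple ratios of confusion-matrix counts. The counts below are hypothetical numbers chosen only to reproduce the reported 76 per cent sensitivity and 94 per cent specificity; the study's actual case numbers are not given in this record.

```python
# Sketch: sensitivity and specificity as confusion-matrix ratios.
# The counts are hypothetical, chosen to reproduce the quoted percentages.

def sensitivity(true_pos, false_neg):
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    return true_neg / (true_neg + false_pos)

# e.g. 38 of 50 low-grade tumors detected -> 76 % sensitivity
print(round(100 * sensitivity(38, 12)))  # 76
# e.g. 47 of 50 controls correctly negative -> 94 % specificity
print(round(100 * specificity(47, 3)))   # 94
```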

  19. Australia’s first national level quantitative environmental justice assessment of industrial air pollution

    NASA Astrophysics Data System (ADS)

    Chakraborty, Jayajit; Green, Donna

    2014-04-01

    This study presents the first national level quantitative environmental justice assessment of industrial air pollution in Australia. Specifically, our analysis links the spatial distribution of sites and emissions associated with industrial pollution sources, derived from the National Pollution Inventory, to the Indigenous status and social disadvantage characteristics of communities, derived from Australian Bureau of Statistics indicators. Our results reveal a clear national pattern of environmental injustice based on the locations of industrial pollution sources, as well as the volume and toxicity of air pollution released at these locations. Communities with the highest number of polluting sites, emission volume, and toxicity-weighted air emissions show significantly greater proportions of Indigenous population and higher levels of socio-economic disadvantage. The quantities and toxicities of industrial air pollution are particularly high in communities with the lowest levels of educational attainment and occupational status. These findings emphasize the need for more detailed analysis in specific regions and communities where socially disadvantaged groups are disproportionately impacted by industrial air pollution. Our empirical findings also underscore the growing necessity of incorporating environmental justice considerations into environmental planning and policy-making in Australia.
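    A toxicity-weighted emission score of the kind used to compare communities can be sketched as the sum, over pollutants, of mass emitted times a toxicity weight. The pollutants and weights below are invented for illustration and are not National Pollution Inventory values.

```python
# Sketch: toxicity-weighted emission score per community.
# Pollutant names, masses, and weights are invented for illustration.

def toxicity_weighted_emissions(emissions_kg, weights):
    """Sum over pollutants of (mass emitted * toxicity weight)."""
    return sum(emissions_kg[p] * weights.get(p, 1.0) for p in emissions_kg)

weights = {"benzene": 50.0, "toluene": 1.0, "lead": 500.0}
community_a = {"benzene": 2.0, "toluene": 40.0}  # kg/yr
community_b = {"lead": 0.5, "toluene": 10.0}

print(toxicity_weighted_emissions(community_a, weights))  # 140.0
print(toxicity_weighted_emissions(community_b, weights))  # 260.0
```

    Note how the weighting reverses the ranking: community B emits less mass overall but scores higher once toxicity is taken into account, which is why the study distinguishes volume from toxicity-weighted emissions.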

  20. Quantitative histogram analysis of images

    NASA Astrophysics Data System (ADS)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

    A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram, and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed.

    Program summary
    Title of program: HAWGC
    Catalogue identifier: ADXG_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Computers: Mobile Intel Pentium III, AMD Duron
    Installations: no installation necessary; executable file together with the necessary files for the LabVIEW Run-time engine
    Operating systems or monitors under which the program has been tested: Windows ME/2000/XP
    Programming language used: LabVIEW 7.0
    Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for…
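    The brightness statistics the program reports (average, standard deviation, skewness, kurtosis, median) follow the usual moment formulas; a minimal sketch on synthetic pixel values, independent of the LabVIEW implementation:

```python
import math

# Sketch: histogram statistics via population-moment formulas.
# The pixel values are synthetic.

def histogram_stats(pixels):
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    std = math.sqrt(var)
    skew = sum((p - mean) ** 3 for p in pixels) / (n * std ** 3)
    kurt = sum((p - mean) ** 4 for p in pixels) / (n * var ** 2)
    return mean, std, skew, kurt

pixels = [10, 10, 12, 14, 200]  # mostly dark image with one bright pixel
mean, std, skew, kurt = histogram_stats(pixels)
print(round(mean, 1))  # 49.2
print(skew > 1.0)      # True -- histogram strongly skewed toward bright
```

    A single bright outlier dominates the skewness, which is why thresholding (as the program supports) matters before interpreting these moments.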

  1. Quantitative image analysis of synovial tissue.

    PubMed

    van der Hall, Pascal O; Kraan, Maarten C; Tak, Paul Peter

    2007-01-01

    Quantitative image analysis is a form of imaging that includes microscopic histological quantification, video microscopy, image analysis, and image processing. Hallmarks are the generation of reliable, reproducible, and efficient measurements via strict calibration and step-by-step control of the acquisition, storage, and evaluation of images with dedicated hardware and software. Major advantages of quantitative image analysis over traditional techniques include sophisticated calibration systems, interaction, speed, and control of inter- and intraobserver variation. This results in a well controlled environment, which is essential for quality control and reproducibility, and helps to optimize sensitivity and specificity. To achieve this, an optimal quantitative image analysis system combines solid software engineering with easy interactivity for the operator. Moreover, the system also needs to be as transparent as possible in generating the data, because a "black box design" will deliver uncontrollable results. In addition to these general aspects, for the analysis of synovial tissue specifically, the necessity of interactivity is highlighted by the added value of identifying and quantifying information in areas such as the intimal lining layer, blood vessels, and lymphocyte aggregates. Speed is another important aspect of digital cytometry. Currently, rapidly increasing numbers of samples, together with the accumulation of a variety of markers and detection techniques, have made the use of traditional analysis techniques such as manual quantification and semi-quantitative analysis impractical. It can be anticipated that the development of even more powerful computer systems with sophisticated software will further facilitate reliable analysis at high speed.

  2. Background-oriented schlieren with natural background for quantitative visualization of open-air explosions

    NASA Astrophysics Data System (ADS)

    Mizukaki, T.; Wakabayashi, K.; Matsumura, T.; Nakayama, K.

    2014-01-01

    This study describes an attempt at quantitative visualization of open-air explosions via the background-oriented schlieren (BOS) method. The shock wave propagation curve and overpressure distribution were extracted from the obtained images and compared with the results of the numerical analysis. The potential of extracting the density distribution behind the shock front is also demonstrated. Two open-air explosions were conducted: one with a -kg emulsion explosive and the other with a -kg composition C4 explosive. A high-speed digital video camera was used with a frame rate of and a pixel size of . A natural background, including trees and grass, was used for BOS measurements instead of the random dots used in a laboratory. The overpressure distribution given by the passing shock was estimated from the visualized images. The estimated overpressures agreed with the values recorded by pressure transducers in the test field. The background displacement caused by light refraction inside the spherical shock waves was in good agreement, except at the shock front. The results shown here suggest that the BOS method for open-air experiments could provide increasingly better quantitative and conventional visualization results with increasing spatial resolution of high-speed cameras.
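
    The density extraction behind the shock front rests on the link between refractive index and gas density. A minimal sketch, assuming the standard Gladstone-Dale relation for air and a simple isentropic overpressure estimate (an illustration, not the paper's numerical procedure):

```python
GLADSTONE_DALE_AIR = 2.26e-4  # m^3/kg, approximate for air at visible wavelengths

def density_from_index(n, K=GLADSTONE_DALE_AIR):
    """Gladstone-Dale relation: n - 1 = K * rho, solved for density rho."""
    return (n - 1.0) / K

def overpressure_from_density(rho, rho0=1.204, p0=101325.0, gamma=1.4):
    """Rough isentropic estimate of overpressure from a density ratio:
    p/p0 = (rho/rho0)**gamma. A weak-disturbance approximation (assumption);
    it does not hold across a strong shock front."""
    return p0 * ((rho / rho0) ** gamma - 1.0)
```

    In an actual BOS reconstruction the measured background displacement first gives the line-of-sight-integrated index gradient, which must be inverted (e.g. by assuming spherical symmetry) before applying the relation above.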

  3. Quantitative WDS analysis using electron probe microanalyzer

    SciTech Connect

    Ul-Hamid, Anwar . E-mail: anwar@kfupm.edu.sa; Tawancy, Hani M.; Mohammed, Abdul-Rashid I.; Al-Jaroudi, Said S.; Abbas, Nureddin M.

    2006-04-15

    In this paper, the procedure for conducting quantitative elemental analysis by the ZAF correction method using wavelength dispersive X-ray spectroscopy (WDS) in an electron probe microanalyzer (EPMA) is elaborated. Analysis of a thermal barrier coating (TBC) system formed on a Ni-based single crystal superalloy is presented as an example to illustrate the analysis of samples consisting of a large number of major and minor elements. The analysis was performed using known standards and measured peak-to-background intensity ratios. The procedure for using a separate set of acquisition conditions for major and minor element analysis is explained and its importance is stressed.
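
    ZAF quantification is, at its core, a fixed-point iteration: concentrations are first guessed from the measured k-ratios, the matrix corrections are recomputed for that composition, and the loop repeats until convergence. A hedged sketch, where `zaf_factor` is a hypothetical callback standing in for the real atomic-number (Z), absorption (A) and fluorescence (F) model:

```python
def zaf_iterate(k_ratios, zaf_factor, tol=1e-6, max_iter=100):
    """Fixed-point ZAF iteration: c_i = k_i * ZAF_i(composition).
    k_ratios: element -> k-ratio vs. standard.
    zaf_factor(element, composition): hypothetical combined Z*A*F correction."""
    comp = dict(k_ratios)  # first guess: concentration equals k-ratio
    for _ in range(max_iter):
        new = {el: k * zaf_factor(el, comp) for el, k in k_ratios.items()}
        total = sum(new.values())
        new = {el: c / total for el, c in new.items()}  # normalize to 100 wt.%
        if all(abs(new[el] - comp[el]) < tol for el in new):
            return new
        comp = new
    return comp
```

    With a trivial correction of 1.0 the result is simply the normalized k-ratios; in practice the factors depend strongly on composition, which is why the iteration is needed.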

  4. Seniors' Online Communities: A Quantitative Content Analysis

    ERIC Educational Resources Information Center

    Nimrod, Galit

    2010-01-01

    Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…

  5. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  6. Quantitative ADF STEM: acquisition, analysis and interpretation

    NASA Astrophysics Data System (ADS)

    Jones, L.

    2016-01-01

    Quantitative annular dark-field in the scanning transmission electron microscope (ADF STEM), where image intensities are used to provide composition and thickness measurements, has enjoyed a renaissance during the last decade. Now in a post aberration-correction era many aspects of the technique are being revisited. Here the recent progress and emerging best-practice for such aberration corrected quantitative ADF STEM is discussed including issues relating to proper acquisition of experimental data and its calibration, approaches for data analysis, the utility of such data, its interpretation and limitations.
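
    A central calibration step in quantitative ADF STEM is putting image counts on an absolute scale by normalizing to the incident beam, using a detector-sensitivity scan and a vacuum (black) level. A minimal sketch of that normalization (illustrative values, not a specific microscope's procedure):

```python
import numpy as np

def normalize_adf(image, detector_mean, vacuum_mean):
    """Normalize raw ADF counts to a fraction of the incident beam:
    I_norm = (I - I_vac) / (I_det - I_vac),
    where I_det is the mean signal with the beam on the detector and
    I_vac is the signal with the beam blanked (vacuum level)."""
    return (np.asarray(image, float) - vacuum_mean) / (detector_mean - vacuum_mean)
```

    Once expressed as beam fractions, image intensities can be compared directly against multislice simulations to extract composition and thickness.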

  7. Quantitative Proteomics Analysis of Leukemia Cells.

    PubMed

    Halbach, Sebastian; Dengjel, Jörn; Brummer, Tilman

    2016-01-01

    Chronic myeloid leukemia (CML) is driven by the oncogenic fusion kinase Bcr-Abl, which organizes its own signaling network with various proteins. These proteins, their interactions, and their role in relevant signaling pathways can be analyzed by quantitative mass spectrometry (MS) approaches in various model systems, e.g., in cell culture models. In this chapter, we describe in detail immunoprecipitations and quantitative proteomics analysis using stable isotope labeling by amino acids in cell culture (SILAC) of components of the Bcr-Abl signaling pathway in the human CML cell line K562. PMID:27581145
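
    SILAC quantification ultimately reduces to heavy-to-light intensity ratios per protein, usually reported as log2 fold changes. A minimal sketch (the protein names and intensity values below are hypothetical):

```python
import math

def silac_log2_ratios(heavy, light):
    """Per-protein log2(H/L) SILAC ratios from summed peptide intensities.
    heavy, light: dicts mapping protein identifier -> intensity.
    Proteins missing in either channel are skipped."""
    out = {}
    for prot in heavy:
        h, l = heavy[prot], light.get(prot, 0.0)
        if h > 0 and l > 0:
            out[prot] = math.log2(h / l)
    return out
```

    A positive log2 ratio means enrichment in the heavy-labeled condition (e.g. a specific interactor in an immunoprecipitation), while background binders cluster around zero.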

  8. Quantitative analysis of blood vessel geometry

    NASA Astrophysics Data System (ADS)

    Fuhrman, Michael G.; Abdul-Karim, Othman; Shah, Sujal; Gilbert, Steven G.; Van Bibber, Richard

    2001-07-01

    Re-narrowing or restenosis of a human coronary artery occurs within six months in one third of balloon angioplasty procedures. Accurate and repeatable quantitative analysis of vessel shape is important to characterize the progression and type of restenosis, and to evaluate the effects new therapies might have. A combination of complicated geometry and image variability, and the need for high resolution and large image size, makes visual/manual analysis slow, difficult, and prone to error. The image processing and analysis described here was developed to automate feature extraction of the lumen, internal elastic lamina, neointima, external elastic lamina, and tunica adventitia and to enable an objective, quantitative definition of blood vessel geometry. The quantitative geometrical analysis enables the measurement of several features including perimeter, area, and other metrics of vessel damage. Automation of feature extraction creates a high throughput capability that enables analysis of serial sections for more accurate measurement of restenosis dimensions. Measurement results are input into a relational database where they can be statistically analyzed and compared across studies. As part of the integrated process, results are also imprinted on the images themselves to facilitate auditing of the results. The analysis is fast, repeatable and accurate while allowing the pathologist to control the measurement process.
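
    The geometric metrics mentioned (perimeter, area, and derived damage measures) can be computed directly from extracted boundary contours. A sketch using the shoelace formula, with percent area stenosis as one illustrative derived metric (the paper's own metric definitions may differ):

```python
import math

def polygon_area(points):
    """Shoelace formula for the area of a closed contour given as (x, y) vertices."""
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def polygon_perimeter(points):
    """Perimeter of a closed contour: sum of edge lengths, wrapping around."""
    n = len(points)
    return sum(math.dist(points[i], points[(i + 1) % n]) for i in range(n))

def percent_stenosis(lumen_area, reference_area):
    """Area stenosis relative to a reference area (e.g. bounded by the
    internal elastic lamina); an illustrative damage metric."""
    return 100.0 * (1.0 - lumen_area / reference_area)
```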

  9. Comprehensive quantitative analysis on privacy leak behavior.

    PubMed

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046

  10. Comprehensive Quantitative Analysis on Privacy Leak Behavior

    PubMed Central

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046
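
    The paper's "overall leak degree" combines the four PPN metrics into one score; the exact combination is defined in the paper, but the idea can be illustrated with a weighted sum over normalized metrics. The equal weights below are an assumption for illustration:

```python
def overall_leak_degree(possibility, severity, crypticity, manipulability,
                        weights=(0.25, 0.25, 0.25, 0.25)):
    """Combine the four privacy-leak metrics (each assumed normalized to
    [0, 1]) into a single comparable score via a weighted sum.
    The weights are illustrative, not the paper's definition."""
    metrics = (possibility, severity, crypticity, manipulability)
    if not all(0.0 <= m <= 1.0 for m in metrics):
        raise ValueError("metrics must be normalized to [0, 1]")
    return sum(w * m for w, m in zip(weights, metrics))
```

    A composite score like this lets different software applications be ranked on one axis while the component metrics remain available for per-aspect comparison.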

  11. Quantitative Proteomic Analysis of the Human Nucleolus.

    PubMed

    Bensaddek, Dalila; Nicolas, Armel; Lamond, Angus I

    2016-01-01

    Recent years have witnessed spectacular progress in the field of mass spectrometry (MS)-based quantitative proteomics, including advances in instrumentation, chromatography, sample preparation methods, and experimental design for multidimensional analyses. It is now possible not only to identify most of the protein components of a cell proteome in a single experiment, but also to describe additional proteome dimensions, such as protein turnover rates, posttranslational modifications, and subcellular localization. Furthermore, by comparing the proteome at different time points, it is possible to create a "time-lapse" view of proteome dynamics. By combining high-throughput quantitative proteomics with detailed subcellular fractionation protocols and data analysis techniques it is also now possible to characterize in detail the proteomes of specific subcellular organelles, providing important insights into cell regulatory mechanisms and physiological responses. In this chapter we present a reliable workflow and protocol for MS-based analysis and quantitation of the proteome of nucleoli isolated from human cells. The protocol presented is based on a SILAC analysis of human MCF10A-Src-ER cells with analysis performed on a Q-Exactive Plus Orbitrap MS instrument (Thermo Fisher Scientific). The subsequent chapter describes how to process the resulting raw MS files from this experiment using MaxQuant software and data analysis procedures to evaluate the nucleolar proteome using customized R scripts. PMID:27576725

  12. Continuous Quantitative Measurements on a Linear Air Track

    ERIC Educational Resources Information Center

    Vogel, Eric

    1973-01-01

    Describes the construction and operational procedures of a spark-timing apparatus which is designed to record the back and forth motion of one or two carts on linear air tracks. Applications to measurements of velocity, acceleration, simple harmonic motion, and collision problems are illustrated. (CC)
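
    Spark-tape position data of this kind are reduced to velocities and accelerations with finite differences over the equally spaced timing intervals. A minimal sketch:

```python
def velocities(positions, dt):
    """Central-difference velocities from equally spaced spark positions:
    v_i = (x_{i+1} - x_{i-1}) / (2*dt). Endpoints are dropped."""
    return [(positions[i + 1] - positions[i - 1]) / (2 * dt)
            for i in range(1, len(positions) - 1)]

def accelerations(positions, dt):
    """Second central differences for acceleration:
    a_i = (x_{i+1} - 2*x_i + x_{i-1}) / dt**2."""
    return [(positions[i + 1] - 2 * positions[i] + positions[i - 1]) / dt ** 2
            for i in range(1, len(positions) - 1)]
```

    For a cart under constant acceleration the second differences come out constant, which is exactly the check students can make against the spark record.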

  13. Air sampling and analysis in a rubber vulcanization area.

    PubMed

    Rappaport, S M; Fraser, D A

    1977-05-01

    Results of sampling and analysis of air in a rubber vulcanization area are described. Organic compounds were collected on activated charcoal, desorbed with carbon disulfide and analyzed by gas chromatography. Several previously identified substances were quantitated, including styrene, toluene, ethylbenzene, and several oligomers of 1,3-butadiene. Concentrations ranged from 0.007 to 1.1 ppm.

  14. Quantitative image analysis of celiac disease

    PubMed Central

    Ciaccio, Edward J; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2015-01-01

    We outline the use of quantitative techniques that are currently used for analysis of celiac disease. Image processing techniques can be useful to statistically analyze the pixel data of endoscopic images acquired with standard or videocapsule endoscopy. It is shown how current techniques have evolved to become more useful for gastroenterologists who seek to understand celiac disease and to screen for it in suspected patients. New directions for focus in the development of methodology for diagnosis and treatment of this disease are suggested. It is evident that there are yet broad areas where there is potential to expand the use of quantitative techniques for improved analysis in suspected or known celiac disease patients. PMID:25759524

  15. Quantitative image analysis of celiac disease.

    PubMed

    Ciaccio, Edward J; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2015-03-01

    We outline the use of quantitative techniques that are currently used for analysis of celiac disease. Image processing techniques can be useful to statistically analyze the pixel data of endoscopic images acquired with standard or videocapsule endoscopy. It is shown how current techniques have evolved to become more useful for gastroenterologists who seek to understand celiac disease and to screen for it in suspected patients. New directions for focus in the development of methodology for diagnosis and treatment of this disease are suggested. It is evident that there are yet broad areas where there is potential to expand the use of quantitative techniques for improved analysis in suspected or known celiac disease patients.

  16. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
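
    The prioritization idea can be illustrated as a sort over a severity-times-likelihood risk score, with modeling difficulty breaking ties in favor of scenarios that are easier to model quantitatively. The 1-5 ordinal scales and the combination rule below are assumptions for illustration, not the paper's definitions:

```python
def prioritize(scenarios):
    """Rank hazard scenarios for quantitative analysis.
    Each scenario is (name, severity, likelihood, modeling_difficulty),
    with severity and likelihood on assumed 1-5 ordinal scales.
    Highest severity*likelihood first; among ties, lowest modeling
    difficulty (i.e., easiest to model quantitatively) first."""
    return sorted(scenarios, key=lambda s: (-(s[1] * s[2]), s[3]))
```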

  17. Quantitative mass spectrometry methods for pharmaceutical analysis.

    PubMed

    Loos, Glenn; Van Schepdael, Ann; Cabooter, Deirdre

    2016-10-28

    Quantitative pharmaceutical analysis is nowadays frequently executed using mass spectrometry. Electrospray ionization coupled to a (hybrid) triple quadrupole mass spectrometer is generally used in combination with solid-phase extraction and liquid chromatography. Furthermore, isotopically labelled standards are often used to correct for ion suppression. The challenges in producing sensitive but reliable quantitative data depend on the instrumentation, sample preparation and hyphenated techniques. In this contribution, different approaches to enhance the ionization efficiencies using modified source geometries and improved ion guidance are provided. Furthermore, possibilities to minimize, assess and correct for matrix interferences caused by co-eluting substances are described. With the focus on pharmaceuticals in the environment and bioanalysis, different separation techniques, trends in liquid chromatography and sample preparation methods to minimize matrix effects and increase sensitivity are discussed. Although highly sensitive methods capable of automated multi-residue analysis are generally the goal, (less sensitive) miniaturized set-ups have great potential owing to their suitability for in-field use. This article is part of the themed issue 'Quantitative mass spectrometry'.

  18. Quantitative resilience analysis through control design.

    SciTech Connect

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U.S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience analysis through the application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  19. Quantitative Bias Analysis in Regulatory Settings.

    PubMed

    Lash, Timothy L; Fox, Matthew P; Cooney, Darryl; Lu, Yun; Forshee, Richard A

    2016-07-01

    Nonrandomized studies are essential in the postmarket activities of the US Food and Drug Administration, which, however, must often act on the basis of imperfect data. Systematic errors can lead to inaccurate inferences, so it is critical to develop analytic methods that quantify uncertainty and bias and ensure that these methods are implemented when needed. "Quantitative bias analysis" is an overarching term for methods that estimate quantitatively the direction, magnitude, and uncertainty associated with systematic errors influencing measures of associations. The Food and Drug Administration sponsored a collaborative project to develop tools to better quantify the uncertainties associated with postmarket surveillance studies used in regulatory decision making. We have described the rationale, progress, and future directions of this project. PMID:27196652
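
    A canonical quantitative bias analysis example is back-correcting observed counts for outcome misclassification using assumed sensitivity and specificity. The sketch below implements the standard textbook correction formula under a simple nondifferential scenario; it is an illustration, not the FDA project's tooling:

```python
def corrected_count(observed_cases, total, sensitivity, specificity):
    """Back-correct an observed case count for outcome misclassification:
    true cases A = (a - (1 - Sp) * N) / (Se + Sp - 1),
    where a = observed cases, N = group size."""
    return (observed_cases - (1.0 - specificity) * total) / (sensitivity + specificity - 1.0)

def corrected_risk_ratio(a1, n1, a0, n0, se, sp):
    """Risk ratio after applying the same misclassification correction to
    both exposure groups (nondifferential misclassification assumed)."""
    t1 = corrected_count(a1, n1, se, sp)
    t0 = corrected_count(a0, n0, se, sp)
    return (t1 / n1) / (t0 / n0)
```

    In a full bias analysis the sensitivity and specificity would themselves be drawn from distributions to propagate uncertainty, rather than fixed as here.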

  20. Quantitative NIR Raman analysis in liquid mixtures.

    PubMed

    Sato-Berrú, R Ysacc; Medina-Valtierra, Jorge; Medina-Gutiérrez, Cirilo; Frausto-Reyes, Claudio

    2004-08-01

    The capability to obtain quantitative information in a simple way from Raman spectra is a subject of considerable interest. In this work, this is demonstrated for mixtures of ethanol with water and rhodamine-6G (R-6G) with methanol, which were analyzed directly in glass vessels. The Raman intensities and a simple mathematical model have been used and applied for the analysis of liquid samples. The starting point is to generate a general expression, from the experimental spectra, as the sum of the particular expressions for each pure compound; this yields an expression for the mixture that can be used to determine concentrations from the mixture's Raman spectrum.
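
    The additive model described, in which the mixture spectrum is a weighted sum of the pure-component contributions, can be solved for concentrations by least squares. A generic linear-unmixing sketch (the paper uses its own simple intensity model; this is only an illustration of the same idea):

```python
import numpy as np

def mixture_concentrations(mix_spectrum, pure_spectra):
    """Model the mixture spectrum as a linear combination of pure-component
    spectra and solve for the coefficients by least squares.
    pure_spectra: (n_components, n_wavenumbers); mix_spectrum: (n_wavenumbers,).
    Returns coefficients normalized to fractions summing to 1."""
    A = np.asarray(pure_spectra, float).T          # columns = pure components
    b = np.asarray(mix_spectrum, float)
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs / coeffs.sum()
```

    In practice the pure spectra would be measured under the same instrument settings as the mixture, and non-negativity constraints may be added.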

  1. Quantitative proteomic analysis of intact plastids.

    PubMed

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized cell organelles in plant cells that are differentiated into various forms including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining the overall cell metabolism and sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of differentiation and functional changes of plastids in order to enhance the understanding of vegetality. In this chapter, details of a method for the extraction of intact plastids that makes analysis possible while maintaining the plastid functions are provided; in addition, a quantitative shotgun method for analyzing the composition and changes in the content of proteins in plastids as a result of environmental impacts is described. PMID:24136541

  2. Quantitative proteomic analysis of intact plastids.

    PubMed

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized cell organelles in plant cells that are differentiated into various forms including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining the overall cell metabolism and sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of differentiation and functional changes of plastids in order to enhance the understanding of vegetality. In this chapter, details of a method for the extraction of intact plastids that makes analysis possible while maintaining the plastid functions are provided; in addition, a quantitative shotgun method for analyzing the composition and changes in the content of proteins in plastids as a result of environmental impacts is described.

  3. Multizone Age-of-Air Analysis

    SciTech Connect

    Sherman, Max H.

    2007-07-01

    Age of air is a technique for evaluating ventilation that has been actively used for over 20 years. Age of air quantifies the time it takes for outdoor air to reach a particular location or zone within the indoor environment. Age of air is often also used to quantify the ventilation effectiveness with respect to indoor air quality. In a purely single zone situation this use of age of air is straightforward, but application of age of air techniques in the general multizone environment has not been fully developed. This article looks at expanding those single-zone techniques to the more complicated environment of multizone buildings and in doing so develops further the general concept of age of air. The results of this analysis show that the nominal age of air, as often used, cannot be directly used for determining ventilation effectiveness unless specific assumptions are made regarding source distributions.
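
    In the single-zone case, the local mean age of air can be measured with a tracer-gas step-down (decay) test: it equals the time integral of the concentration normalized to its initial value. A minimal sketch using the trapezoidal rule (an illustration of the standard tracer technique, not this article's multizone extension):

```python
def local_mean_age(concentrations, dt):
    """Local mean age of air from a tracer-gas step-down test:
    age = integral over time of C(t)/C(0),
    evaluated by the trapezoidal rule on equally spaced samples.
    Assumes the decay has been recorded until C(t) is near zero."""
    c0 = concentrations[0]
    norm = [c / c0 for c in concentrations]
    return dt * (sum(norm) - 0.5 * (norm[0] + norm[-1]))
```

    For a perfectly mixed single zone the decay is exponential and the mean age equals the nominal time constant (1/ACH); departures from that value are what the ventilation-effectiveness measures build on.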

  4. Quantitative interactome analysis reveals a chemoresistant edgotype

    PubMed Central

    Chavez, Juan D.; Schweppe, Devin K.; Eng, Jimmy K.; Zheng, Chunxiang; Taipale, Alex; Zhang, Yiyi; Takara, Kohji; Bruce, James E.

    2015-01-01

    Chemoresistance is a common mode of therapy failure for many cancers. Tumours develop resistance to chemotherapeutics through a variety of mechanisms, with proteins serving pivotal roles. Changes in protein conformations and interactions affect the cellular response to environmental conditions contributing to the development of new phenotypes. The ability to understand how protein interaction networks adapt to yield new function or alter phenotype is limited by the inability to determine structural and protein interaction changes on a proteomic scale. Here, chemical crosslinking and mass spectrometry were employed to quantify changes in protein structures and interactions in multidrug-resistant human carcinoma cells. Quantitative analysis of the largest crosslinking-derived, protein interaction network comprising 1,391 crosslinked peptides allows for ‘edgotype' analysis in a cell model of chemoresistance. We detect consistent changes to protein interactions and structures, including those involving cytokeratins, topoisomerase-2-alpha, and post-translationally modified histones, which correlate with a chemoresistant phenotype. PMID:26235782

  5. Empirical Bayes Analysis of Quantitative Proteomics Experiments

    PubMed Central

    Margolin, Adam A.; Ong, Shao-En; Schenone, Monica; Gould, Robert; Schreiber, Stuart L.; Carr, Steven A.; Golub, Todd R.

    2009-01-01

    Background Advances in mass spectrometry-based proteomics have enabled the incorporation of proteomic data into systems approaches to biology. However, development of analytical methods has lagged behind. Here we describe an empirical Bayes framework for quantitative proteomics data analysis. The method provides a statistical description of each experiment, including the number of proteins that differ in abundance between 2 samples, the experiment's statistical power to detect them, and the false-positive probability of each protein. Methodology/Principal Findings We analyzed 2 types of mass spectrometric experiments. First, we showed that the method identified the protein targets of small-molecules in affinity purification experiments with high precision. Second, we re-analyzed a mass spectrometric data set designed to identify proteins regulated by microRNAs. Our results were supported by sequence analysis of the 3′ UTR regions of predicted target genes, and we found that the previously reported conclusion that a large fraction of the proteome is regulated by microRNAs was not supported by our statistical analysis of the data. Conclusions/Significance Our results highlight the importance of rigorous statistical analysis of proteomic data, and the method described here provides a statistical framework to robustly and reliably interpret such data. PMID:19829701

  6. Large scale air monitoring: lichen vs. air particulate matter analysis.

    PubMed

    Rossbach, M; Jayasekera, R; Kniewald, G; Thang, N H

    1999-07-15

    Biological indicator organisms have been widely used for monitoring and banking purposes for many years. Although the complexity of the interactions between organisms and their environment is generally not easily comprehensible, environmental quality assessment using the bioindicator approach offers some convincing advantages compared to direct analysis of soil, water, or air. Measurement of air particulates is restricted to experienced laboratories with access to expensive sampling equipment. Additionally, the amount of material collected generally is just enough for one determination per sampling, so no multidimensional characterization may be possible. Further, fluctuations in air masses have a pronounced effect on the results from air filter sampling. Combining the integrating property of bioindicators with the worldwide availability and particular matrix characteristics of air particulate matter, as a prerequisite for global monitoring of air pollution, is discussed. A new approach for sampling urban dust using large volume filtering devices installed in air conditioners of large hotel buildings is assessed. A first experiment was initiated to collect air particulates (300-500 g each) from a number of hotels during a period of 3-4 months by successive vacuum cleaning of used inlet filters from high volume air conditioning installations, reflecting average concentrations per 3 months in different large cities. This approach is expected to be upgraded and applied for global monitoring. Highly positive correlated elements were found in lichens such as K/S, Zn/P, the rare earth elements (REE), and a significant negative correlation between Hg and Cu was observed in these samples. The ratio of concentrations of elements in dust and Usnea spp. is highest for Cz, Zn and Fe (400-200) and lowest for elements such as Ca, Rb, and Sr (20-10).

  7. Low-cost monitoring of Campylobacter in poultry houses by air sampling and quantitative PCR.

    PubMed

    Søndergaard, M S R; Josefsen, M H; Löfström, C; Christensen, L S; Wieczorek, K; Osek, J; Hoorfar, J

    2014-02-01

    The present study describes the evaluation of a method for the quantification of Campylobacter by air sampling in poultry houses. Sampling was carried out in conventional chicken houses in Poland, in addition to a preliminary sampling in Denmark. Each measurement consisted of three air samples, two standard boot swab fecal samples, and one airborne particle count. Sampling was conducted over an 8-week period in three flocks, assessing the presence and levels of Campylobacter in boot swabs and air samples using quantitative real-time PCR. The detection limit for air sampling was approximately 100 Campylobacter cell equivalents (CCE) per m^3. Airborne particle counts were used to analyze the size distribution of airborne particles (0.3 to 10 um) in the chicken houses in relation to the level of airborne Campylobacter. No correlation was found. Using air sampling, Campylobacter was detected in the flocks right away, while boot swab samples were positive after 2 weeks. All samples collected were positive for Campylobacter from week 2 through the rest of the rearing period for both sampling techniques, although levels 1- to 2-log CCE higher were found with air sampling. At week 8, the levels were approximately 10^4 and 10^5 CCE per sample for boot swabs and air, respectively. In conclusion, using air samples combined with quantitative real-time PCR, Campylobacter contamination could be detected earlier than by boot swabs and was found to be a more convenient technique for monitoring and/or to obtain enumeration data useful for quantitative risk assessment of Campylobacter.
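
    Quantitative real-time PCR converts cycle-threshold (Ct) values to quantities through a log-linear standard curve. A sketch with illustrative curve parameters (a slope of -3.32 corresponds to 100% amplification efficiency; the intercept here is hypothetical, not from this study):

```python
def ct_to_quantity(ct, slope=-3.32, intercept=38.0):
    """Convert a qPCR Ct value to a quantity (e.g. cell equivalents) via a
    standard curve Ct = slope * log10(quantity) + intercept.
    Slope and intercept must come from a dilution series of known standards."""
    return 10 ** ((ct - intercept) / slope)

def per_cubic_metre(quantity, air_volume_m3):
    """Normalize a per-sample quantity to CCE per cubic metre of sampled air."""
    return quantity / air_volume_m3
```

    With a 100%-efficient assay, each 3.32-cycle decrease in Ct corresponds to a tenfold increase in starting template.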

  8. Lipid biomarker analysis for the quantitative analysis of airborne microorganisms

    SciTech Connect

    Macnaughton, S.J.; Jenkins, T.L.; Cormier, M.R.

    1997-08-01

    There is an ever increasing concern regarding the presence of airborne microbial contaminants within indoor air environments. Exposure to such biocontaminants can give rise to a large number of different health effects, including infectious diseases, allergenic responses and respiratory problems. Biocontaminants typically found in indoor air environments include bacteria, fungi, algae, protozoa and dust mites. Mycotoxins, endotoxins, pollens and residues of organisms are also known to cause adverse health effects. A quantitative detection/identification technique independent of culturability that assays both culturable and non-culturable biomass, including endotoxin, is critical in defining risks from indoor air biocontamination. Traditionally, methods employed for the monitoring of microorganism numbers in indoor air environments involve classical culture-based techniques and/or direct microscopic counting. It has been repeatedly documented that viable microorganism counts account for only 0.1-10% of the total community detectable by direct counting. The classic viable microbiologic approach does not provide accurate estimates of microbial fragments or other indoor air components that can act as antigens and induce or potentiate allergic responses. Although bioaerosol samplers are designed to damage the microbes as little as possible, microbial stress has been shown to result from air sampling, aerosolization and microbial collection. Higher collection efficiency results in greater cell damage, while less cell damage often results in lower collection efficiency. Filtration can collect particulates at almost 100% efficiency, but captured microorganisms may become dehydrated and damaged, resulting in non-culturability; however, the lipid biomarker assays described herein do not rely on cell culture. Lipids are components that are universally distributed throughout cells, providing a means of assessment independent of culturability.
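
    Lipid biomarker (PLFA) biomass assays convert a measured total phospholipid fatty acid content into cell numbers via a literature conversion factor. A sketch; the conversion factor below is an often-quoted rough value for bacteria and is an assumption, not a value taken from this paper:

```python
PLFA_CELLS_PER_PMOL = 2.0e4  # rough literature conversion for bacteria (assumption)

def biomass_cells(plfa_pmol, cells_per_pmol=PLFA_CELLS_PER_PMOL):
    """Estimate total cell numbers (culturable plus nonculturable) from the
    total PLFA content of a sample, in picomoles."""
    return plfa_pmol * cells_per_pmol

def airborne_concentration(plfa_pmol, air_volume_m3, cells_per_pmol=PLFA_CELLS_PER_PMOL):
    """Cells per cubic metre of sampled air from a filter's PLFA content."""
    return biomass_cells(plfa_pmol, cells_per_pmol) / air_volume_m3
```

    Because the assay measures a universal cell constituent rather than colony growth, the estimate covers the nonculturable fraction that plate counts miss.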

  9. Quantitative image analysis of WE43-T6 cracking behavior

    NASA Astrophysics Data System (ADS)

    Ahmad, A.; Yahya, Z.

    2013-06-01

    Environment-assisted cracking of WE43 cast magnesium (4.2 wt.% Y, 2.3 wt.% Nd, 0.7 wt.% Zr, 0.8 wt.% HRE) in the T6 peak-aged condition was induced in ambient air in notched specimens. The fracture mechanism was studied using electron backscatter diffraction, serial sectioning and in situ observations of crack propagation. The intermetallic material (rare-earth-enriched divorced intermetallic retained at grain boundaries and predominantly at triple points) was found to play a significant role in initiating the cracks that lead to failure of this material. Quantitative measurements were required for this project. The populations of intermetallic particles and clusters of particles were analyzed using image analysis of metallographic images. This is part of the work to generate a theoretical model of the effect of notch geometry on the static fatigue strength of this material.
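
    The particle-population measurements described above can be approximated with standard image-analysis tools. Below is a minimal sketch in Python, assuming a pre-thresholded binary micrograph and using scipy.ndimage for connected-component labeling; it is not the authors' actual pipeline.

```python
import numpy as np
from scipy import ndimage

def particle_stats(binary_img):
    """Label connected particles; return count, per-particle areas (px),
    and the area fraction occupied by the particle phase."""
    labels, n = ndimage.label(binary_img)
    areas = ndimage.sum(binary_img, labels, index=range(1, n + 1))
    frac = binary_img.sum() / binary_img.size
    return n, areas, frac

# Synthetic "micrograph": two separated intermetallic particles.
img = np.zeros((8, 8), dtype=int)
img[1:3, 1:3] = 1          # 4-pixel particle
img[5:7, 5:8] = 1          # 6-pixel particle
n, areas, frac = particle_stats(img)
print(n, sorted(areas.tolist()), frac)
```

    Real metallographic images would first require segmentation (e.g. thresholding the contrast of the rare-earth-rich phase) before such counting is meaningful.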

  10. Quantitative microstructure analysis of polymer-modified mortars.

    PubMed

    Jenni, A; Herwegh, M; Zurbriggen, R; Aberle, T; Holzer, L

    2003-11-01

    Digital light, fluorescence and electron microscopy in combination with wavelength-dispersive spectroscopy were used to visualize individual polymers, air voids, cement phases and filler minerals in a polymer-modified cementitious tile adhesive. In order to investigate the evolution of, and the processes involved in the formation of, the mortar microstructure, the phase distribution in the mortar was quantified using phase-specific imaging and digital image analysis. The required sample preparation techniques and imaging-related topics are discussed. As a case study, the different techniques were applied to obtain a quantitative characterization of a specific mortar mixture. The results indicate that the mortar fractionates during different stages, ranging from the early fresh-mortar stage to the final hardened-mortar stage. This induces process-dependent enrichments of the phases at specific locations in the mortar. The approach presented provides important information for a comprehensive understanding of the functionality of polymer-modified mortars.

  11. Quantitative analysis of protein turnover in plants.

    PubMed

    Nelson, Clark J; Li, Lei; Millar, A Harvey

    2014-03-01

    Proteins are constantly being synthesised and degraded as plant cells age and as plants grow, develop and adapt the proteome. Given that plants develop through a series of events from germination to fruiting and even undertake whole organ senescence, an understanding of protein turnover as a fundamental part of this process in plants is essential. Both synthesis and degradation processes are spatially separated in a cell across its compartmented structure. The majority of protein synthesis occurs in the cytosol, while synthesis of specific components occurs inside plastids and mitochondria. Degradation of proteins occurs in both the cytosol, through the action of the plant proteasome, and in organelles and lytic structures through different protease classes. Tracking the specific synthesis and degradation rate of individual proteins can be undertaken using stable isotope feeding and the ability of peptide MS to track labelled peptide fractions over time. Mathematical modelling can be used to follow the isotope signature of newly synthesised protein as it accumulates and natural abundance proteins as they are lost through degradation. Different technical and biological constraints govern the potential for the use of (13)C, (15)N, (2)H and (18)O for these experiments in complete labelling and partial labelling strategies. Future development of quantitative protein turnover analysis will involve analysis of protein populations in complexes and subcellular compartments, assessing the effect of PTMs and integrating turnover studies into wider system biology study of plants.
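
    The mathematical modelling described above reduces, in the simplest case, to fitting an exponential loss of natural-abundance (pre-existing) protein after the switch to labelled feed. A hedged sketch with synthetic, noise-free data; the 0.35/day rate and the time points are illustrative, not values from the article.

```python
import numpy as np

# During stable-isotope feeding, pre-existing (natural-abundance) protein
# decays as exp(-k_deg * t) while newly made protein carries the label.
def fit_kdeg(t, unlabeled_fraction):
    """Log-linear least-squares estimate of the degradation rate k_deg."""
    slope, _ = np.polyfit(t, np.log(unlabeled_fraction), 1)
    return -slope

t = np.array([0.0, 1.0, 2.0, 4.0])   # days after the label switch
frac = np.exp(-0.35 * t)             # synthetic data, k_deg = 0.35 per day
k = fit_kdeg(t, frac)
half_life = np.log(2) / k
print(k, half_life)
```

    With real peptide MS data, each time point would carry measurement noise and the labelled fraction would be derived from isotopologue distributions rather than given directly.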

  12. Two-dimensional quantitative measurements of methyl radicals in methane/air flame.

    PubMed

    Wu, Yue; Zhang, Zhili

    2015-01-10

    Two-dimensional (2D) quantitative measurements of methyl (CH3) radicals in an atmospheric-pressure methane/air Hencken flame are performed using the coherent microwave Rayleigh scattering (Radar) from resonance-enhanced multi-photon ionization (REMPI) technique. 2D scanning and subsequent quantification are employed for Radar REMPI, and the 2D quantitative results are used to verify numerical calculations. The line-integral effect of the actual experimental configuration was included in the calculation. A 25% difference exists between the experimental results and the numerical calculation, while the overall concentration distributions from experiment and single-flamelet modeling agree fairly well with each other. PMID:25967612

  13. Applying Knowledge of Quantitative Design and Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  14. Error Propagation Analysis for Quantitative Intracellular Metabolomics

    PubMed Central

    Tillack, Jana; Paczia, Nicole; Nöh, Katharina; Wiechert, Wolfgang; Noack, Stephan

    2012-01-01

    Model-based analyses have become an integral part of modern metabolic engineering and systems biology in order to gain knowledge about complex and not directly observable cellular processes. For quantitative analyses, not only experimental data, but also measurement errors, play a crucial role. The total measurement error of any analytical protocol is the result of an accumulation of single errors introduced by several processing steps. Here, we present a framework for the quantification of intracellular metabolites, including error propagation during metabolome sample processing. Focusing on one specific protocol, we comprehensively investigate all currently known and accessible factors that ultimately impact the accuracy of intracellular metabolite concentration data. All intermediate steps are modeled, and their uncertainty with respect to the final concentration data is rigorously quantified. Finally, on the basis of a comprehensive metabolome dataset of Corynebacterium glutamicum, an integrated error propagation analysis for all parts of the model is conducted, and the most critical steps for intracellular metabolite quantification are detected. PMID:24957773
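
    The accumulation of single errors across processing steps can be illustrated with a Monte Carlo sketch. All numbers below (peak ratio, calibration slope, extraction volume, cell count, and their standard deviations) are invented for illustration; the paper's framework models each protocol step explicitly rather than this simplified product form.

```python
import numpy as np

rng = np.random.default_rng(0)

def propagate(n=100_000):
    """Monte Carlo propagation of independent per-step errors onto the
    final intracellular concentration estimate (illustrative values)."""
    peak   = rng.normal(1.00, 0.03, n)   # measured peak ratio, 3% sd
    slope  = rng.normal(0.50, 0.01, n)   # calibration slope, 2% sd
    volume = rng.normal(2.00, 0.10, n)   # extraction volume (mL), 5% sd
    cells  = rng.normal(1e9, 5e7, n)     # cell count, 5% sd
    conc = peak / slope * volume / cells * 1e9   # amount per 1e9 cells
    return conc.mean(), conc.std() / conc.mean()

mean, rel_sd = propagate()
print(mean, rel_sd)
```

    The relative standard deviation of the result is close to the root-sum-square of the per-step relative errors, which is the first-order expectation for a pure product of independent factors.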

  15. Quantitative gold nanoparticle analysis methods: A review.

    PubMed

    Yu, Lei; Andriola, Angelo

    2010-08-15

    Research and development on gold nanoparticle (AuNP) preparation, characterization, and applications have burgeoned in recent years. Many of the techniques and protocols are very mature, but two major concerns accompany mass production and consumption of AuNP-based products. First, how many AuNPs exist in a dispersion? Second, where are the AuNPs after digestion by the environment, and how many remain? To answer these two questions, reliable and reproducible methods are needed to analyze the existence and population of AuNPs in samples. This review summarizes the most recent chemical and particle quantitative analysis methods that have been used to characterize the concentration (in moles of gold per liter) or population (in particles per mL) of AuNPs. The methods summarized include mass spectrometry, electroanalytical methods, spectroscopic methods, and particle counting methods. These methods may count the number of AuNPs directly or analyze the total concentration of elemental gold in an AuNP dispersion.
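
    For spherical particles, the two quantities the review distinguishes (total gold concentration and particle number concentration) are linked by simple geometry. A back-of-envelope sketch assuming monodisperse spheres of bulk gold density; this is an idealization, and the methods reviewed above measure these quantities directly.

```python
import math

def aunp_number_conc(c_au_g_per_L, diameter_nm, rho_g_cm3=19.3):
    """Particles per mL from total gold concentration, assuming
    monodisperse spheres of bulk gold density (19.3 g/cm^3)."""
    r_cm = diameter_nm * 1e-7 / 2                               # nm -> cm
    mass_per_particle = rho_g_cm3 * (4 / 3) * math.pi * r_cm ** 3  # grams
    return (c_au_g_per_L / 1000) / mass_per_particle            # per mL

# 50 mg/L of gold present as 20 nm particles:
n = aunp_number_conc(0.05, 20)
print(f"{n:.3e} particles/mL")
```

    Polydispersity, non-spherical shapes, and surface coatings all bias such a conversion, which is one reason direct particle-counting methods matter.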

  16. Quantitative Assessment of Detection Frequency for the INL Ambient Air Monitoring Network

    SciTech Connect

    A. Jeffrey Sondrup; Arthur S. Rood

    2014-11-01

    A quantitative assessment of the Idaho National Laboratory (INL) air monitoring network was performed using frequency of detection as the performance metric. The INL air monitoring network consists of 37 low-volume air samplers at 31 different locations. Twenty of the samplers are located on INL (onsite) and 17 are located off INL (offsite). Detection frequencies were calculated using both BEA and ESER laboratory minimum detectable activity (MDA) levels. The CALPUFF Lagrangian puff dispersion model, coupled with 1 year of meteorological data, was used to calculate time-integrated concentrations at sampler locations for a 1-hour release of unit activity (1 Ci) for every hour of the year. The unit-activity time-integrated concentration (TICu) values were calculated at all samplers for releases from eight INL facilities and were then scaled and integrated for a given release quantity and release duration. For each facility, a ground-level release was modeled, emanating either from the center of the facility or from a point where significant emissions are possible. In addition to ground-level releases, three existing stacks at the Advanced Test Reactor Complex, the Idaho Nuclear Technology and Engineering Center, and the Materials and Fuels Complex were also modeled. Meteorological data from the 35 stations comprising the INL Mesonet network, data from the Idaho Falls Regional Airport, upper-air data from the Boise airport, and three-dimensional gridded data from the Weather Research and Forecasting model were used for modeling. Three representative radionuclides identified as key radionuclides in INL’s annual National Emission Standards for Hazardous Air Pollutants evaluations were considered for the frequency-of-detection analysis: Cs-137 (beta-gamma emitter), Pu-239 (alpha emitter), and Sr-90 (beta emitter). Source-specific release quantities were calculated for each radionuclide, such that the maximum inhalation dose at any publicly accessible sampler or the National
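
    Once per-hour unit-release concentrations are in hand, the detection-frequency metric reduces to counting how often the activity collected on a sampler's filter exceeds the laboratory MDA. The sketch below uses invented TICu statistics, sampler flow rate, release quantity and MDA; the report's actual values and scaling procedure differ.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical unit-release time-integrated concentrations for one sampler,
# one value per hourly release over a year (units: Bq*s/m^3 per Bq released).
tic_u = rng.lognormal(mean=-28.0, sigma=2.0, size=8760)

def detection_frequency(tic_u, release_bq, flow_m3_per_s, mda_bq):
    """Fraction of hourly releases whose collected filter activity
    meets or exceeds the lab minimum detectable activity."""
    collected = tic_u * release_bq * flow_m3_per_s   # Bq caught on filter
    return float(np.mean(collected >= mda_bq))

f = detection_frequency(tic_u, release_bq=1e12,
                        flow_m3_per_s=9.4e-4, mda_bq=0.02)
print(f)
```

    Comparing such frequencies across sampler locations and MDA levels (e.g. BEA vs. ESER) is what turns the dispersion-model output into a network performance metric.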

  17. Quantitative analysis of saccadic search strategy

    NASA Astrophysics Data System (ADS)

    Over, E. A. B.

    2007-06-01

    This thesis deals with the quantitative analysis of saccadic search strategy. The goal of the research presented was twofold: (1) to quantify overall characteristics of fixation location and saccade direction, and (2) to identify search strategies using a quantitative description of eye movement parameters. Chapter 2 provides a method to quantify a general property of fixation locations: a quantitative measure based on Voronoi diagrams for characterizing the uniformity of fixation density, which may be thought of as indicating the clustering of fixations. We showed that during a visual search task, a structured (natural) background leads to higher clustering of fixations than a homogeneous background. In addition, in natural stimuli, a search task leads to higher clustering of fixations than the instruction to freely view the stimuli. Chapter 3 provides a method to identify the overall field of saccade directions in the viewing area. We extended the Voronoi method of chapter 2 to create vector maps that indicate the preferred saccade direction for each position in the viewing area. Several measures of these vector maps were used to quantify the influence of observer-dependent and stimulus-dependent factors on saccade direction in a search task with natural scenes. The results showed that the influence of stimulus-dependent factors appeared to be larger than the influence of observer-dependent factors. In chapter 4 we showed that the border of the search area plays a role in the search strategy. In a search experiment with differently shaped areas, we found that search performance was poorer near the luminance edges of the search area. Fixation density, however, was higher in the edge region, and saccade direction was mainly along the edges of the search areas. In a target visibility experiment we established that the visibility of targets near a luminance edge is less than the visibility of

  18. [Development of rapid methods for quantitative analysis of proteolytic reactions].

    PubMed

    Beloivan, O A; Tsvetkova, M N; Bubriak, O A

    2002-01-01

    Approaches to developing express methods for the quantitative control of proteolytic reactions are discussed. Recently, these reactions have taken on special significance for many important problems of theoretical and practical medicine and biology, as well as for technological, pharmacological and ecological monitoring. Traditional methods can be improved both by the use of immobilized enzymes and substrates and by combining various classic biochemical and immunological approaches. The synthesis of substrates with specified properties allows new methods to be realized for studying proteinase activity and the kinetic characteristics of the corresponding reactions, both in vitro and in vivo. The application of biosensor technology is a promising trend, since it allows analysis time and cost to be reduced, the direct interaction between enzymes and their inhibitors and activators to be studied in real time, and quantitative measurements to be performed both in liquids and in air. Besides, biosensor techniques are well suited to computer data processing. PMID:12924013

  19. Visual Analysis of Air Traffic Data

    NASA Technical Reports Server (NTRS)

    Albrecht, George Hans; Pang, Alex

    2012-01-01

    In this paper, we present visual analysis tools to help study the impact of policy changes on air traffic congestion. The tools support visualization of time-varying air traffic density over an area of interest using different time granularity. We use this visual analysis platform to investigate how changing the aircraft separation volume can reduce congestion while maintaining key safety requirements. The same platform can also be used as a decision aid for processing requests for unmanned aerial vehicle operations.

  20. Quantitative assessment of radiation force effect at the dielectric air-liquid interface

    PubMed Central

    Capeloto, Otávio Augusto; Zanuto, Vitor Santaella; Malacarne, Luis Carlos; Baesso, Mauro Luciano; Lukasievicz, Gustavo Vinicius Bassi; Bialkowski, Stephen Edward; Astrath, Nelson Guilherme Castelli

    2016-01-01

    We induce nanometer-scale surface deformation by exploiting momentum conservation of the interaction between laser light and dielectric liquids. The effect of radiation force at the air-liquid interface is quantitatively assessed for fluids with different density, viscosity and surface tension. The imparted pressure on the liquids by continuous or pulsed laser light excitation is fully described by the Helmholtz electromagnetic force density. PMID:26856622

  1. Quantitative surface spectroscopic analysis of multicomponent polymers

    NASA Astrophysics Data System (ADS)

    Zhuang, Hengzhong

    Angle-dependent electron spectroscopy for chemical analysis (ESCA) has been successfully used to examine the surface compositional gradient of a multicomponent polymer. However, photoelectron intensities detected at each take-off angle of ESCA measurements are convoluted signals, and this convolution distorts depth profiles for samples having compositional gradients. To recover the true concentration profiles for the samples, a deconvolution program is described in Chapter 2. The compositional profiles of two classes of important multicomponent polymers, i.e., poly(dimethylsiloxane urethane) (PU-DMS) segmented copolymers and fluorinated poly(amide urethane) block copolymers, are obtained using this program. The effects of the polymer molecular structure and of processing variation on the surface compositional profile have been studied. Besides surface composition, it is desirable to know whether the distribution of segment or block lengths at the surface differs from that in the bulk, because this aspect of surface structure may lead to properties different from those predicted simply from knowledge of the surface composition and the bulk structure. In Chapter 3, we pioneered the direct determination of the distribution of polydimethylsiloxane (PDMS) segment lengths at the surface of PU-DMS using time-of-flight secondary ion mass spectrometry (ToF-SIMS). Exciting preliminary results are provided: for the thick film of PU-DMS with nominal MW of PDMS = 1000, the distribution of PDMS segment lengths at the surface is nearly identical to that in the bulk, whereas in the case of the thick films of PU-DMS with nominal MW of PDMS = 2400, only those PDMS segments with MW of ca. 1000 preferentially segregated at the surface. As potential minimally fouling coatings or biocompatible cardiovascular materials, PU-DMS copolymers eventually come into contact with water once in use. 
Could such an environmental change (from air to aqueous) induce any undesirable

  2. Quantitative Analysis of Radar Returns from Insects

    NASA Technical Reports Server (NTRS)

    Riley, J. R.

    1979-01-01

    When the number of flying insects is low enough to permit their resolution as individual radar targets, quantitative estimates of their aerial density can be developed. Accurate measurements of heading distribution, using a rotating-polarization radar to enhance the wingbeat-frequency method of identification, are presented.

  3. Some Epistemological Considerations Concerning Quantitative Analysis

    ERIC Educational Resources Information Center

    Dobrescu, Emilian

    2008-01-01

    This article presents the author's address at the 2007 "Journal of Applied Quantitative Methods" ("JAQM") prize awarding festivity. The festivity was included in the opening of the 4th International Conference on Applied Statistics, November 22, 2008, Bucharest, Romania. In the address, the author reflects on three theses that question the…

  4. Quantitative analysis of planetary reflectance spectra with principal components analysis

    NASA Technical Reports Server (NTRS)

    Johnson, P. E.; Smith, M. O.; Adams, J. B.

    1985-01-01

    A technique is presented for quantitative analysis of planetary reflectance spectra as mixtures of particles on microscopic and macroscopic scales using principal components analysis. This technique allows for determination of the endmembers being mixed, their abundance, and the scale of mixing, as well as other physical parameters. Eighteen lunar telescopic reflectance spectra of the Copernicus crater region, from 600 nm to 1800 nm in wavelength, are modeled in terms of five likely endmembers: mare basalt, mature mare soil, anorthosite, mature highland soil, and clinopyroxene. These endmembers were chosen from a similar analysis of 92 lunar soil and rock samples. The models fit the data to within 2 percent rms. It is found that the goodness of fit is marginally better for intimate mixing than for macroscopic mixing.
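
    The macroscopic (areal) mixing case is linear in reflectance, so endmember abundances can be recovered by least squares. A toy sketch with two invented endmember spectra; intimate mixing would instead be modeled as linear in single-scattering albedo, and practical work constrains abundances to be non-negative and sum to one.

```python
import numpy as np

# Columns of E are toy endmember reflectance spectra over six bands.
E = np.array([[0.10, 0.80],
              [0.20, 0.70],
              [0.35, 0.60],
              [0.50, 0.55],
              [0.60, 0.50],
              [0.65, 0.45]])
true_f = np.array([0.6, 0.4])      # 60/40 areal (checkerboard) mixture
mixed = E @ true_f                 # observed mixed spectrum

# Recover the abundances by ordinary least squares.
f, *_ = np.linalg.lstsq(E, mixed, rcond=None)
print(np.round(f, 3))
```

    With noisy telescopic spectra the fit residual (the "2 percent rms" above) becomes the figure of merit for choosing between mixing models.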

  6. Using quantitative acid-base analysis in the ICU.

    PubMed

    Lloyd, P; Freebairn, R

    2006-03-01

    The quantitative acid-base 'Strong Ion' calculator is a practical application of quantitative acid-base chemistry, as developed by Peter Stewart and Peter Constable. It quantifies the three independent factors that control acidity, calculates the concentration and charge of unmeasured ions, produces a report based on these calculations and displays a Gamblegram depicting measured ionic species. Used together with the medical history, quantitative acid-base analysis has advantages over traditional approaches.
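
    The first of Stewart's independent variables, the apparent strong ion difference, is a simple sum of measured ionic charges. A minimal sketch (ionized calcium and magnesium default to zero here; a full calculator would also compute the effective SID, the strong ion gap, and the unmeasured-ion charge the abstract mentions).

```python
def apparent_sid(na, k, cl, lactate, ca=0.0, mg=0.0):
    """Apparent strong ion difference (mEq/L) in the Stewart approach:
    fully dissociated cations minus fully dissociated anions."""
    return na + k + ca + mg - cl - lactate

# Typical plasma electrolytes: Na 140, K 4, Cl 105, lactate 1 mEq/L.
sid = apparent_sid(140, 4.0, 105, 1.0)
print(sid)
```

    In a real calculator the apparent SID is compared against an effective SID computed from bicarbonate, albumin, and phosphate charge; the difference is the strong ion gap.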

  7. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  8. Heavy metals in common foodstuff: Quantitative analysis

    SciTech Connect

    Tsoumbaris, P.; Tsoukali-Papadopoulou, H. )

    1994-07-01

    The presence of heavy metals in the human body has always drawn scientific concern, as these are considered responsible for adverse health effects, especially now that the release of toxic wastes into the environment has increased. Some metals are essential for life, others have no known biologic function, favourable or toxic, and some have the potential to produce disease. Those causing toxicity are the ones that accumulate in the body through the food chain, water and air. The purpose of this study is the determination of Pb, Cd, Ni, Mn and Zn in different foodstuffs consumed by inhabitants of the city of Thessaloniki, northern Greece, according to their dietary habits.

  9. Structural and quantitative analysis of Equisetum alkaloids.

    PubMed

    Cramer, Luise; Ernst, Ludger; Lubienski, Marcus; Papke, Uli; Schiebel, Hans-Martin; Jerz, Gerold; Beuerle, Till

    2015-08-01

    Equisetum palustre L. is known for its toxicity to livestock, and several past studies have addressed the isolation and identification of the responsible alkaloids. So far, palustrine (1) and N(5)-formylpalustrine (2) are the known alkaloids of E. palustre. An HPLC-ESI-MS/MS method in combination with a simple sample work-up was developed to identify and quantitate Equisetum alkaloids. Besides the two known alkaloids, six related alkaloids were detected in different Equisetum samples. The structure of the alkaloid palustridiene (3) was derived by comprehensive 1D and 2D NMR experiments. N(5)-Acetylpalustrine (4) was also thoroughly characterized by NMR for the first time. The structure of N(5)-formylpalustridiene (5) is proposed based on mass spectrometry results. Twenty-two E. palustre samples were screened with this method, and in most cases the full set of eight alkaloids was detected in all parts of the plant. High variability of alkaloid content and distribution was found depending on plant organ, plant origin and season, ranging from 88 to 597 mg/kg dried weight. However, palustrine (1) and palustridiene (3) always represented the main alkaloids. For the first time, a comprehensive identification, quantitation and distribution analysis of Equisetum alkaloids was achieved.

  10. Quantitative data analysis of ESAR data

    NASA Astrophysics Data System (ADS)

    Phruksahiran, N.; Chandra, M.

    2013-07-01

    Synthetic aperture radar (SAR) data processing uses the backscattered electromagnetic wave to map the radar reflectivity of the ground surface. The polarization property in radar remote sensing has been used successfully in many applications, especially in target decomposition. This paper presents a case study of experiments performed on ESAR L-band fully polarized data sets from the German Aerospace Center (DLR) to demonstrate the potential of coherent target decomposition and the possibility of using weather radar measurement parameters, such as the differential reflectivity and the linear depolarization ratio, to obtain quantitative information about the ground surface. The raw ESAR data were processed by a SAR simulator developed in MATLAB using the range-Doppler algorithm.

  11. Qualitative and quantitative analysis of endocytic recycling.

    PubMed

    Reineke, James B; Xie, Shuwei; Naslavsky, Naava; Caplan, Steve

    2015-01-01

    Endocytosis, which encompasses the internalization and sorting of plasma membrane (PM) lipids and proteins to distinct membrane-bound intracellular compartments, is a highly regulated and fundamental cellular process by which eukaryotic cells dynamically regulate their PM composition. Indeed, endocytosis is implicated in crucial cellular processes that include proliferation, migration, and cell division as well as maintenance of tissue homeostasis such as apical-basal polarity. Once PM constituents have been taken up into the cell, either via clathrin-dependent endocytosis (CDE) or clathrin-independent endocytosis (CIE), they typically have two fates: degradation through the late-endosomal/lysosomal pathway or returning to the PM via endocytic recycling pathways. In this review, we will detail experimental procedures that allow for both qualitative and quantitative assessment of endocytic recycling of transmembrane proteins internalized by CDE and CIE, using the HeLa cervical cancer cell line as a model system. PMID:26360033

  12. Quantitative infrared analysis of hydrogen fluoride

    SciTech Connect

    Manuta, D.M.

    1997-04-01

    This work was performed at the Portsmouth Gaseous Diffusion Plant, where hydrogen fluoride is produced upon the hydrolysis of UF{sub 6}. This poses a problem in this setting, and a method for determining the mole percent concentration was desired. HF has been considered a non-ideal gas for many years; D. F. Smith utilized complex equations in his HF studies in the 1950s. We have evaluated HF behavior as a function of pressure from three perspectives: (1) absorbance at 3877 cm{sup -1} as a function of pressure for 100% HF; (2) absorbance at 3877 cm{sup -1} as a function of increasing HF partial pressure, with total pressure maintained at 300 mm HgA with nitrogen; and (3) absorbance at 3877 cm{sup -1} for constant HF partial pressure, with total pressure increased to greater than 800 mm HgA with nitrogen. These experiments have shown that at partial pressures up to 35 mm HgA, HF follows the ideal gas law. The absorbance at 3877 cm{sup -1} can therefore be quantitatively analyzed via infrared methods.
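
    In the ideal-gas regime the quantitation reduces to a linear (Beer-Lambert) calibration of absorbance against HF partial pressure. A sketch with invented, perfectly linear calibration data; the 0.012 absorbance per mm HgA slope is illustrative, not the plant's calibration.

```python
import numpy as np

# Synthetic calibration: absorbance at 3877 cm^-1 vs HF partial pressure,
# within the <= 35 mm HgA range where HF behaves ideally.
p_mmHg = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0, 35.0])
absorb = 0.012 * p_mmHg                     # invented, perfectly linear

slope, intercept = np.polyfit(p_mmHg, absorb, 1)

def mole_percent(a, total_pressure_mmHg):
    """Infer HF mole percent from absorbance via the calibration line."""
    partial = (a - intercept) / slope
    return 100 * partial / total_pressure_mmHg

print(mole_percent(0.24, 300))
```

    Above the ideal range, HF oligomerization breaks this linearity, which is why the pressure limit established in the experiments matters for the analysis.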

  13. Quantitative multi-modal NDT data analysis

    SciTech Connect

    Heideklang, René; Shokouhi, Parisa

    2014-02-18

    A single NDT technique is often not adequate to provide assessments about the integrity of test objects with the required coverage or accuracy. In such situations, it is often resorted to multi-modal testing, where complementary and overlapping information from different NDT techniques are combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular, whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity.
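
    A high-level fusion scheme of the kind described can be sketched as a weighted combination of normalized per-modality detection scores followed by a threshold. The scores and equal weights below are invented for illustration and do not reproduce the paper's scheme.

```python
import numpy as np

def fuse(scores, weights=None):
    """Weighted mean of per-modality detection scores (rows = modalities,
    columns = inspection positions), all assumed normalized to [0, 1]."""
    s = np.asarray(scores, dtype=float)
    w = np.ones(s.shape[0]) if weights is None else np.asarray(weights, float)
    return (w[:, None] * s).sum(axis=0) / w.sum()

# Invented normalized scores at four positions from three modalities.
eddy   = np.array([0.9, 0.2, 0.7, 0.1])
gmr    = np.array([0.8, 0.3, 0.4, 0.2])
thermo = np.array([0.7, 0.1, 0.8, 0.6])

fused = fuse([eddy, gmr, thermo])
flaws = fused > 0.5                 # simple decision threshold
print(fused, flaws.tolist())
```

    Exploiting the redundancy this way tends to suppress single-sensor false calls, which is the specificity gain the study reports.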

  14. Using fire tests for quantitative risk analysis

    SciTech Connect

    Ling, W.C.T.; Williamson, R.B.

    1980-03-01

    Fires can be considered a causal chain-of-events in which the growth and spread of fire may cause damage and injury if it is rapid enough to overcome the barriers placed in its way. Fire tests for fire resistance of the barriers can be used in a quantitative risk assessment. The fire growth and spread is modelled in a State Transition Model (STM). The fire barriers are presented as part of the Fire Protection Model (FPM) which is based on a portion of the NFPA Decision Tree. An Emergency Equivalent Network is introduced to couple the Fire Growth Model (FGM) and the FPM so that the spread of fire beyond the room-of-origin can be computed. An example is presented in which a specific building floor plan is analyzed to obtain the shortest expected time for fire to spread between two points. To obtain the probability and time for each link in the network, data from the results of fire tests were used. These results were found to be lacking and new standards giving better data are advocated.
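
    The shortest expected time for fire to spread between two points is a shortest-path computation over the coupled network, with edge weights taken as expected barrier-failure times from fire-resistance tests. A minimal Dijkstra sketch with invented times and room names.

```python
import heapq

def shortest_spread_time(graph, src, dst):
    """Dijkstra over a fire-spread network; edge weights are expected
    barrier-failure times (minutes) taken from fire tests."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    seen = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in seen:
            continue
        seen.add(u)
        if u == dst:
            return d
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")

# Room-of-origin -> corridor -> target room, vs. through a rated wall.
net = {"origin":   [("corridor", 12.0), ("wall", 45.0)],
       "corridor": [("target", 8.0)],
       "wall":     [("target", 5.0)]}
t = shortest_spread_time(net, "origin", "target")
print(t)
```

    Attaching a failure probability to each link as well, as the State Transition Model does, turns this minimum-time query into a full risk calculation.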

  15. AIR INGRESS ANALYSIS: COMPUTATIONAL FLUID DYNAMIC MODELS

    SciTech Connect

    Chang H. Oh; Eung S. Kim; Richard Schultz; Hans Gougar; David Petti; Hyung S. Kang

    2010-08-01

    The Idaho National Laboratory (INL), under the auspices of the U.S. Department of Energy, is performing research and development that focuses on key phenomena important during potential scenarios that may occur in very high temperature reactors (VHTRs). Phenomena Identification and Ranking Studies to date have ranked an air ingress event, following on the heels of a VHTR depressurization, as important with regard to core safety. Consequently, the development of advanced air ingress-related models and of verification and validation data is a very high priority. Following a loss-of-coolant and system depressurization incident, air will enter the core of the high-temperature gas-cooled reactor through the break, possibly causing oxidation of the in-core and reflector graphite structures. Simple core and plant models indicate that, under certain circumstances, the oxidation may proceed at an elevated rate, with additional heat generated from the oxidation reaction itself. Under postulated conditions of fluid flow and temperature, excessive degradation of the lower plenum graphite can lead to a loss of structural support. Excessive oxidation of core graphite can also lead to the release of fission products into the confinement, which could be detrimental to reactor safety. The computational fluid dynamic models developed in this study will improve our understanding of these phenomena. This paper presents two-dimensional and three-dimensional CFD results for the quantitative assessment of the air ingress phenomena. A portion of the results for the density-driven stratified flow in the inlet pipe is compared with experimental results.

  16. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    PubMed Central

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H; Jacobsen, Christina; Vainer, Ben

    2016-01-01

    Background: The opportunity offered by whole slide scanners for automated histological analysis implies an ever-increasing importance of digital pathology. To move beyond conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis, including staining, may benefit. Methods: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm are presented. Results: It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of the above-mentioned components. The digital quantitations are verified by comparison with point grid quantitations performed on the microsections after Van Gieson staining. Conclusion: The presented method is aptly described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework. PMID:27141321

  17. The quantitative failure of human reliability analysis

    SciTech Connect

    Bennett, C.T.

    1995-07-01

    This philosophical treatise argues the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. The author contends that historic and current HRA has failed to inform policy makers who make risk-based decisions about the human contribution to systems performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion and tempers itself with a rational debate over the weight given to subjective and empirical probabilities.

  18. Quantitative Risk Analysis of Obstacle Limitation Standards

    NASA Astrophysics Data System (ADS)

    Sandaradura, Amila Silva

    Obstacle limitation surfaces (OLS) are the main safeguard against objects that can pose a hazard to aircraft operations at and around airports. The standard dimensions of most of these surfaces were estimated from pilot experience at the time they were included in the standard documents. As a result, some of these standards may have been overestimated while others may not provide an adequate level of safety. With airports moving to the Safety Management System (SMS) approach to design and operations safety, proper evaluation of the level of safety provided by OLS at specific sites becomes of great importance to airport operators. There is no published evidence, however, for the estimation of the safety level provided by the existing OLS standards. Moreover, the rationale used by ICAO to establish the existing OLS standards is not readily available in the standard documents. This study therefore collects actual flight path data from air traffic control radars and constructs a methodology to assess the probability of aircraft deviating from their intended/protected path. An extension of the developed methodology can be used to estimate OLS dimensions that provide an acceptable safety level for aircraft operations. This will help establish safe and efficient standard dimensions for the OLS and assess the risk that objects pose to aircraft operations around airports. To assess the existing standards and demonstrate the methodology, three case studies were conducted using aircraft data collected from Ottawa (CYOW), Calgary (CYYC) and Edmonton (CYEG) International Airports.

  19. A Quantitative Analysis of Countries' Research Strengths

    ERIC Educational Resources Information Center

    Saxena, Anurag; Brazer, S. David; Gupta, B. M.

    2009-01-01

    This study employed a multidimensional analysis to evaluate transnational patterns of scientific research to determine relative research strengths among widely varying nations. Findings from this study may inform national policy with regard to the most efficient use of scarce national research resources, including government and private funding.…

  20. Quantitative analysis of cascade impactor samples - revisited

    NASA Astrophysics Data System (ADS)

    Orlić, I.; Chiam, S. Y.; Sanchez, J. L.; Tang, S. M.

    1999-04-01

    Concentrations of aerosols collected in Singapore during the three-month-long haze period that affected the whole South-East Asian region in 1997 are reported. Aerosol samples were collected continuously with a fine aerosol sampler (PM2.5) and occasionally with a single-orifice cascade impactor (CI) sampler. Our results show that in the fine fraction (<2.5 μm) the concentrations of two well-known biomass burning products, K and S, were generally increased by a factor of 2-3 compared with non-hazy periods. However, a discrepancy was noticed, at least for elements with lower atomic number (Ti and below), between the results obtained by the fine aerosol sampler and the cascade impactor. Careful analysis by nuclear microscopy, in particular the Scanning Transmission Ion Microscopy (STIM) technique, revealed that the thicknesses of the lower CI stages exceeded thick-target limits for 2 MeV protons. Detailed depth profiles of all CI stages were therefore measured using the STIM technique, and concentrations were corrected for absorption and proton energy loss. After correcting the results for the actual sample thickness, the concentrations of all major elements (S, Cl, K, Ca) agreed much better with the PM2.5 results. The importance of implementing thick-target corrections in the analysis of CI samples, especially those collected in urban environments, is emphasized. A broad-beam PIXE analysis approach is certainly not adequate in these cases.

  1. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    ERIC Educational Resources Information Center

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  2. Quantitative analysis of heart rate variability

    NASA Astrophysics Data System (ADS)

    Kurths, J.; Voss, A.; Saparin, P.; Witt, A.; Kleiner, H. J.; Wessel, N.

    1995-03-01

    In the modern industrialized countries, several hundred thousand people die every year due to sudden cardiac death. The individual risk for sudden cardiac death cannot be defined precisely by commonly available, noninvasive diagnostic tools such as Holter monitoring, highly amplified ECG and traditional linear analysis of heart rate variability (HRV). Therefore, we apply some rather unconventional methods of nonlinear dynamics to analyze the HRV. In particular, some complexity measures based on symbolic dynamics, as well as a new measure, the renormalized entropy, detect abnormalities in the HRV of several patients who had been classified in the low-risk group by traditional methods. A combination of these complexity measures with the parameters in the frequency domain seems to be a promising way to arrive at a more precise definition of the individual risk. These findings have to be validated by a representative number of patients.
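The symbolic-dynamics idea above can be sketched briefly: map RR intervals to a small alphabet, then measure the Shannon entropy of short symbol words. The three-symbol alphabet, 5% threshold, and word length below are illustrative assumptions, not the authors' exact parameters:

```python
from collections import Counter
from math import log2

def symbolize(rr, alpha=0.05):
    """Map each RR interval to a symbol relative to the series mean:
    0 = near the mean, 1 = above, 2 = below (threshold alpha * mean)."""
    mu = sum(rr) / len(rr)
    out = []
    for x in rr:
        if x > mu * (1 + alpha):
            out.append(1)
        elif x < mu * (1 - alpha):
            out.append(2)
        else:
            out.append(0)
    return out

def word_entropy(symbols, k=3):
    """Shannon entropy (bits) of overlapping k-symbol words; low entropy
    suggests an overly regular, potentially pathological rhythm."""
    words = Counter(tuple(symbols[i:i + k]) for i in range(len(symbols) - k + 1))
    n = sum(words.values())
    return -sum((c / n) * log2(c / n) for c in words.values())

rr = [800, 810, 790, 1000, 805, 795, 600, 810]  # RR intervals in ms, hypothetical
print(word_entropy(symbolize(rr)))  # log2(6) ≈ 2.585 for this toy series
```

A real analysis would run over thousands of beats and combine such measures with frequency-domain parameters, as the abstract describes.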

  3. A quantitative assessment of the relationship between precipitation deficits and air temperature variations

    NASA Astrophysics Data System (ADS)

    He, B.; Wang, H. L.; Wang, Q. F.; Di, Z. H.

    2015-06-01

    Previous studies have reported precipitation deficits related to temperature extremes. However, how and to what extent precipitation deficits affect surface air temperatures is still poorly understood. In this study, the relationship between precipitation deficits and surface temperatures was examined in China from 1960 to 2012 based on monthly temperature and precipitation records from 565 stations. Significant negative correlations were identified in each season, with the strongest relationships in summer, indicating that higher temperatures usually accompanied water-deficient conditions and lower temperatures usually accompanied wet conditions. Examination of the correlations based on 30-year moving windows suggested that the interaction between the two variables has declined over the past three decades. Further investigation indicated a greater impact of extreme dry conditions on temperature than of extreme wet conditions. In addition, a new simple index, the Dry Temperature Index (DTI), was developed and used to quantitatively describe the relationship between water deficits and air temperature variations. We tested and compared the DTI in the coldest month (January) and the hottest month (July) of the year, station by station. In both months, the number of stations with DTI_high ≥ 50% was greater than the number with DTI_high < 50%, indicating that a greater proportion of high temperatures occurred during dry conditions. Based on these results, we conclude that water deficits in China are usually correlated with high temperatures but not with low temperatures.
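The abstract does not give the DTI's exact definition, but its interpretation ("the proportion of higher temperatures occurring during dry conditions") suggests something like the following sketch. The median split and the toy monthly values are assumptions for illustration only:

```python
def dry_temperature_index(temps, precips):
    """Hypothetical DTI-style index: the fraction of high-temperature
    months (above the temperature median) that coincide with dry months
    (precipitation below its median). This is one plausible reading of
    the index described in the abstract, not the paper's definition."""
    t_med = sorted(temps)[len(temps) // 2]
    p_med = sorted(precips)[len(precips) // 2]
    high = [i for i, t in enumerate(temps) if t > t_med]
    if not high:
        return 0.0
    dry_high = sum(1 for i in high if precips[i] < p_med)
    return dry_high / len(high)

# toy station record: the two hottest months are also the two driest
temps = [20, 25, 30, 22, 28, 21]     # degrees C
precips = [90, 40, 30, 85, 50, 95]   # mm
print(dry_temperature_index(temps, precips))  # 1.0
```

An index value ≥ 0.5 at a station would correspond to the "DTI_high ≥ 50%" condition discussed above.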

  4. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effects of individual factors but also their interactions to be estimated. By considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate count methods. PMID:26456251
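Combining independent uncertainty components "mathematically" is conventionally done by root-sum-of-squares, as in standard uncertainty budgets. A minimal sketch, with hypothetical component values (the paper's actual budget is not given in the abstract):

```python
from math import sqrt

def combined_relative_uncertainty(components):
    """Combine independent relative uncertainty components (as
    fractions) by root-sum-of-squares."""
    return sqrt(sum(u * u for u in components))

# hypothetical budget: microorganism type, product matrix, reading error
components = [0.20, 0.15, 0.10]
u = combined_relative_uncertainty(components)
print(f"{u:.1%}")  # about 27%, below the 35% figure cited for plate counts
```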

  6. Quantitative signal analysis in pulsed resonant photoacoustics

    NASA Astrophysics Data System (ADS)

    Schäfer, Stefan; Miklós, András; Hess, Peter

    1997-05-01

    The pulsed excitation of acoustic resonances was studied by means of a high-Q photoacoustic resonator with different types of microphone. The signal strength of the first radial mode was calculated by the basic theory as well as by a modeling program that takes into account the acoustic impedances of the resonator, the acoustic filter system, and the influence of the microphone coupling on the photoacoustic cavity. When the calculated signal strength is used, the high-Q system can be calibrated for trace-gas analysis without a certified gas mixture. The theoretical results were compared with measurements and show good agreement for different microphone configurations. From the measured pressure signal (in pascals per joule), the absorption coefficient of ethylene was calculated; it agreed within 10% with literature values. In addition, a Helmholtz configuration with a highly sensitive 1-in. (2.54-cm) microphone was realized. Although the Q factor was reduced, the sensitivity could be increased by the Helmholtz resonator in the case of pulsed experiments. A maximum sensitivity of the coupled system of 341 mV/Pa was achieved.

  7. Quantitative analysis of in vivo cell proliferation.

    PubMed

    Cameron, Heather A

    2006-11-01

    Injection and immunohistochemical detection of 5-bromo-2'-deoxyuridine (BrdU) has become the standard method for studying the birth and survival of neurons, glia, and other cell types in the nervous system. BrdU, a thymidine analog, becomes stably incorporated into DNA during the S-phase of the cell cycle. Because DNA containing BrdU can be specifically recognized by antibodies, this method allows dividing cells to be marked at any given time and then identified at time points from a few minutes to several years later. BrdU immunohistochemistry is suitable for cell counting to examine the regulation of cell proliferation and cell fate. It can be combined with labeling by other antibodies, allowing confocal analysis of cell phenotype or expression of other proteins. The potential for nonspecific labeling and toxicity are discussed. Although BrdU immunohistochemistry has almost completely replaced tritiated thymidine autoradiography for labeling dividing cells, this method and situations in which it is still useful are also described. PMID:18428635

  8. Quantitative comparisons of various air pollutant emission sources of ozone precursors in East Tennessee - a study evaluated from the emission inventory development

    SciTech Connect

    Bandyopadhyay, N.

    1996-12-31

    The United States Department of the Interior has raised concerns regarding air pollution impacts in the Great Smoky Mountains National Park (GSMNP). The formation of the Southern Appalachian Mountains Initiative (SAMI) is a regional effort to understand the air quality impacts of emission sources upon the Appalachian Mountains. The Tennessee Division of Air Pollution Control (TDAPC) has recently committed additional resources for the analysis of proposals for increased emissions of air pollutants in East Tennessee. The TDAPC plans to assess these effects by conducting an air quality modeling project. The United States Environmental Protection Agency's (US EPA's) Urban Airshed Model (UAM) has been used as the primary air quality model for this purpose. The purpose of this project is to evaluate the expected impact of any major new or modified air pollution source located in Tennessee on ozone in the GSMNP. An accurate emission inventory is essential to any air quality modeling analysis. A modeling inventory has been developed by the TDAPC for the base year 1993. The modeling area includes 40 counties in East and Middle Tennessee and 42 counties in neighboring states. For the counties in Tennessee, a detailed inventory of the point sources was prepared. For the other states inside the modeling domain, the EPA's Aerometric Information Retrieval System (AIRS)-AIRS Facility Subsystem (AFS) was used to obtain point source data. The accuracy of the AFS data for the other states was not addressed. A detailed quantitative analysis has been conducted with the emission inventory developed for the Tennessee counties. The purpose of this study is to quantify the relative contributions of the emissions of Volatile Organic Compounds (VOCs) and Nitrogen Oxides (NOx) from different point, area, mobile and biogenic sources to ozone formation in the vicinity of the GSMNP.

  9. Control of separation and quantitative analysis by GC-FTIR

    NASA Astrophysics Data System (ADS)

    Semmoud, A.; Huvenne, Jean P.; Legrand, P.

    1992-03-01

    Software for 3-D representations of the 'Absorbance-Wavenumber-Retention time' is used to control the quality of the GC separation. Spectral information given by the FTIR detection allows the user to be sure that a chromatographic peak is 'pure.' The analysis of peppermint essential oil is presented as an example. This assurance is absolutely required for quantitative applications. In these conditions, we have worked out a quantitative analysis of caffeine. Correlation coefficients between integrated absorbance measurements and concentration of caffeine are discussed at two steps of the data treatment.
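The quantitative step described above (correlating integrated absorbance with concentration) is an ordinary linear calibration. A minimal sketch with hypothetical caffeine calibration data (the paper's actual values are not given in the abstract):

```python
def linear_fit(x, y):
    """Ordinary least squares y = a*x + b, plus the Pearson
    correlation coefficient r used to judge calibration quality."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    a = sxy / sxx
    b = my - a * mx
    r = sxy / (sxx * syy) ** 0.5
    return a, b, r

# hypothetical calibration: caffeine concentration (mg/mL) vs. integrated absorbance
conc = [0.1, 0.2, 0.4, 0.8]
absb = [0.05, 0.11, 0.21, 0.42]
slope, intercept, r = linear_fit(conc, absb)
print(f"r = {r:.4f}")  # close to 1 indicates a usable calibration
```

An unknown sample's concentration is then read off as `(absorbance - intercept) / slope`.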

  10. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
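The effect of coverage on reliability can be illustrated with a toy duplex model rather than the paper's digraph model of the F18 FCS (which is not reproduced here); the failure rate and mission time below are hypothetical:

```python
from math import exp

def duplex_unreliability(lam, c, t):
    """Mission unreliability of a toy two-channel (duplex) system with
    imperfect fault coverage c: an uncovered first fault (probability
    1-c) fails the system outright; a covered fault degrades the system
    to one channel, so covered failure requires both channels to fail."""
    p = 1 - exp(-lam * t)                      # per-channel failure probability
    return 2 * p * (1 - p) * (1 - c) + p * p   # uncovered single fault + double fault

# hypothetical numbers: 1e-4 /h channel failure rate, 10 h mission
for c in (1.0, 0.99, 0.9):
    print(c, duplex_unreliability(1e-4, c, 10))
```

Even 99% coverage raises the unreliability by orders of magnitude over the perfect-coverage case, which is the qualitative point the paper makes about including coverage in the analysis.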

  11. Air Force geographic information and analysis system

    SciTech Connect

    Henney, D.A.; Jansing, D.S.; Durfee, R.C.; Margle, S.M.; Till, L.E.

    1987-01-01

    A microcomputer-based geographic information and analysis system (GIAS) was developed to assist Air Force planners with environmental analysis, natural resources management, and facility and land-use planning. The system processes raster image data, topological data structures, and geometric or vector data similar to that produced by computer-aided design and drafting (CADD) systems, integrating the data where appropriate. Data types included Landsat imagery, scanned images of base maps, digitized point and chain features, topographic elevation data, USGS stream course data, highway networks, railroad networks, and land use/land cover information from USGS interpreted aerial photography. The system is also being developed to provide an integrated display and analysis capability with base maps and facility data bases prepared on CADD systems. 3 refs.

  12. Quantitative methods for the analysis of zoosporic fungi.

    PubMed

    Marano, Agostina V; Gleason, Frank H; Bärlocher, Felix; Pires-Zottarelli, Carmen L A; Lilje, Osu; Schmidt, Steve K; Rasconi, Serena; Kagami, Maiko; Barrera, Marcelo D; Sime-Ngando, Télesphore; Boussiba, Sammy; de Souza, José I; Edwards, Joan E

    2012-04-01

    Quantitative estimation of zoosporic fungi in the environment has historically received little attention, primarily due to methodological challenges and their complex life cycles. Conventional methods for quantitative analysis of zoosporic fungi have mainly relied on direct observation and baiting techniques, with subsequent fungal identification in the laboratory using morphological characteristics. Although these methods are still fundamentally useful, there has been an increasing preference for quantitative microscopic methods based on staining with fluorescent dyes, as well as the use of hybridization probes. More recently, however, PCR-based methods for profiling and quantification (semi-quantitative and absolute) have proven to be rapid and accurate diagnostic tools for assessing zoosporic fungal assemblages in environmental samples. Further application of next-generation sequencing technologies will advance our quantitative understanding not only of zoosporic fungal ecology but also of their function, through the analysis of their genomes and gene expression as resources and databases expand in the future. Nevertheless, it is still necessary to complement these molecular approaches with cultivation-based methods in order to gain a fuller quantitative understanding of the ecological and physiological roles of zoosporic fungi.

  13. CUMULATIVE RISK ASSESSMENT: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS

    EPA Science Inventory

    INTRODUCTION: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS FOR CUMULATIVE RISK

    Hugh A. Barton1 and Carey N. Pope2
    1US EPA, Office of Research and Development, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC
    2Department of...

  14. Early Child Grammars: Qualitative and Quantitative Analysis of Morphosyntactic Production

    ERIC Educational Resources Information Center

    Legendre, Geraldine

    2006-01-01

    This article reports on a series of 5 analyses of spontaneous production of verbal inflection (tense and person-number agreement) by 2-year-olds acquiring French as a native language. A formal analysis of the qualitative and quantitative results is developed using the unique resources of Optimality Theory (OT; Prince & Smolensky, 2004). It is…

  15. Quantitative transverse flow measurement using OCT speckle decorrelation analysis

    PubMed Central

    Liu, Xuan; Huang, Yong; Ramella-Roman, Jessica C.; Mathews, Scott A.; Kang, Jin U.

    2014-01-01

    We propose an inter-Ascan speckle decorrelation based method that can quantitatively assess blood flow normal to the direction of the OCT imaging beam. To validate this method, we performed a systematic study using both phantom and in vivo animal models. Results show that our speckle analysis method can accurately extract transverse flow speed with high spatial and temporal resolution. PMID:23455305
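The core of inter-A-scan speckle decorrelation is the correlation between successive intensity profiles: faster transverse flow decorrelates the speckle pattern more between acquisitions. A minimal sketch of the correlation step only (the paper's mapping from decorrelation to absolute speed relies on phantom calibration not reproduced here):

```python
def decorrelation(a1, a2):
    """1 - Pearson correlation between two successive A-scan intensity
    profiles; larger values indicate faster transverse flow."""
    n = len(a1)
    m1, m2 = sum(a1) / n, sum(a2) / n
    cov = sum((x - m1) * (y - m2) for x, y in zip(a1, a2))
    v1 = sum((x - m1) ** 2 for x in a1)
    v2 = sum((y - m2) ** 2 for y in a2)
    return 1 - cov / (v1 * v2) ** 0.5

scan = [3.0, 7.0, 2.0, 9.0, 4.0]  # toy A-scan intensities
print(decorrelation(scan, scan))  # 0.0 for identical (static) scans
```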

  16. Quantitating the subtleties of microglial morphology with fractal analysis

    PubMed Central

    Karperien, Audrey; Ahammer, Helmut; Jelinek, Herbert F.

    2013-01-01

    It is well established that microglial form and function are inextricably linked. In recent years, the traditional view that microglial form ranges between “ramified resting” and “activated amoeboid” has been emphasized through advancing imaging techniques that point to microglial form being highly dynamic even within the currently accepted morphological categories. Moreover, microglia adopt meaningful intermediate forms between categories, with considerable crossover in function and varying morphologies as they cycle, migrate, wave, phagocytose, and extend and retract fine and gross processes. From a quantitative perspective, it is problematic to measure such variability using traditional methods, but one way of quantitating such detail is through fractal analysis. The techniques of fractal analysis have been used for quantitating microglial morphology, to categorize gross differences but also to differentiate subtle differences (e.g., amongst ramified cells). Multifractal analysis in particular is one technique of fractal analysis that may be useful for identifying intermediate forms. Here we review current trends and methods of fractal analysis, focusing on box counting analysis, including lacunarity and multifractal analysis, as applied to microglial morphology. PMID:23386810
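The box-counting analysis reviewed above estimates a fractal dimension from how the number of occupied boxes scales with box size. A minimal sketch for a 2-D point set such as a traced cell outline (the box sizes and test shape are illustrative):

```python
import math

def box_count_dimension(points, sizes):
    """Estimate the box-counting dimension of a 2-D point set: count
    occupied boxes at each box size s, then fit the slope of
    log N(s) versus log(1/s) by least squares."""
    logs, logn = [], []
    for s in sizes:
        boxes = {(int(x // s), int(y // s)) for x, y in points}
        logs.append(math.log(1 / s))
        logn.append(math.log(len(boxes)))
    n = len(sizes)
    mx, my = sum(logs) / n, sum(logn) / n
    num = sum((a - mx) * (b - my) for a, b in zip(logs, logn))
    den = sum((a - mx) ** 2 for a in logs)
    return num / den

# sanity check: a densely sampled line segment has dimension ~1
pts = [(i / 1000, i / 1000) for i in range(1000)]
print(box_count_dimension(pts, [0.1, 0.05, 0.025]))  # ≈ 1.0
```

Lacunarity and multifractal spectra, also discussed in the review, build on the same box grid but examine the distribution of counts per box rather than just occupancy.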

  17. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings are given for a group of subjects with significant coronary artery stenosis and a group of controls, determined by a quantitative method for the study of regional myocardial performance based on the frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  18. Improved method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

    An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, using a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  19. Quantitative analysis of culture using millions of digitized books

    PubMed Central

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965
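The basic culturomics operation is tracking a word's relative frequency (occurrences divided by total words printed that year) across time. A minimal sketch over a toy record format assumed for illustration (the real n-gram datasets are far larger and structured differently):

```python
from collections import defaultdict

def relative_frequency(records, word):
    """Per-year relative frequency of a word, given (year, word, count,
    year_total_words) records, in the spirit of n-gram trend analysis."""
    totals = {}
    counts = defaultdict(int)
    for year, w, count, year_total in records:
        totals[year] = year_total
        if w == word:
            counts[year] += count
    return {y: counts[y] / totals[y] for y in sorted(totals)}

# hypothetical toy corpus counts
records = [
    (1900, "telegraph", 50, 10000),
    (1900, "radio", 5, 10000),
    (2000, "telegraph", 2, 20000),
    (2000, "radio", 300, 20000),
]
print(relative_frequency(records, "radio"))  # {1900: 0.0005, 2000: 0.015}
```

Trends in such per-year frequencies underlie the lexicographic, grammatical, and censorship analyses the paper surveys.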

  20. Quantitative analysis of culture using millions of digitized books.

    PubMed

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.

  2. Quantitative assessment of bio-aerosols contamination in indoor air of University dormitory rooms

    PubMed Central

    Hayleeyesus, Samuel Fekadu; Ejeso, Amanuel; Derseh, Fikirte Aklilu

    2015-01-01

    Objectives: The purpose of this study is to provide insight into how students are exposed to indoor bio-aerosols in dormitory rooms and to identify the major factors that govern the contamination levels. Methodology: The bio-aerosol concentration levels in the indoor air of thirty dormitory rooms of Jimma University were determined by taking 120 samples. A passive air sampling technique, the settle plate method using open Petri dishes containing different culture media, was employed to collect samples twice daily. Results: The bio-aerosol contamination detected in the dormitory rooms ranged from 511 to 9960 CFU/m3 for bacteria and from 531 to 6568 CFU/m3 for fungi. Based on the criteria stated by the WHO expert group, 95 of the 120 samples were above the recommended level. The statistical analysis showed that occupancy significantly affected the bacterial concentrations measured in all dormitory rooms at the 6:00 am sampling time (p-value=0.000), and the bacterial concentrations measured in the dormitory rooms differed significantly from each other (p-value=0.013), in line with their significant difference in occupancy (p-value=0.000). Moreover, there was a significant difference in the bacterial contamination level between the 6:00 am and 7:00 pm sampling times (p=0.015), whereas there was no significant difference in the fungal contamination level between the two sampling times (p=0.674). Conclusion: There is excessive bio-aerosol contamination in the indoor air of dormitory rooms of Jimma University; human occupancy produces a marked increase in bacterial contamination levels, and most fungal species present in the room air of the Jimma University dormitories were not human-borne. PMID:26609289

  3. An approach to market analysis for lighter than air transportation of freight

    NASA Technical Reports Server (NTRS)

    Roberts, P. O.; Marcus, H. S.; Pollock, J. H.

    1975-01-01

    An approach is presented to marketing analysis for lighter than air vehicles in a commercial freight market. After a discussion of key characteristics of supply and demand factors, a three-phase approach to marketing analysis is described. The existing transportation systems are quantitatively defined and possible roles for lighter than air vehicles within this framework are postulated. The marketing analysis views the situation from the perspective of both the shipper and the carrier. A demand for freight service is assumed and the resulting supply characteristics are determined. Then, these supply characteristics are used to establish the demand for competing modes. The process is then iterated to arrive at the market solution.
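The iteration described above (assume demand, derive supply characteristics, let those reset demand, repeat) is a fixed-point computation. A minimal sketch with hypothetical linear demand and cost relations (the paper's actual functional forms are not given in the abstract):

```python
def market_solution(demand_fn, supply_cost_fn, q0=100.0, tol=1e-6, max_iter=1000):
    """Fixed-point iteration toward a market solution: start from an
    assumed freight volume, compute the resulting unit cost (supply
    characteristics), let that cost determine new demand, and repeat
    until the volume converges."""
    q = q0
    for _ in range(max_iter):
        cost = supply_cost_fn(q)   # supply characteristics at this volume
        q_new = demand_fn(cost)    # demand induced by that cost
        if abs(q_new - q) < tol:
            return q_new
        q = q_new
    return q

# hypothetical relations: higher cost lowers demand; higher volume lowers unit cost
demand = lambda cost: max(0.0, 500.0 - 10.0 * cost)
unit_cost = lambda q: 40.0 - 0.05 * q
q_star = market_solution(demand, unit_cost)
print(q_star)  # converges to 200.0 for these toy relations
```

Convergence here depends on the slopes of the two relations; a real lighter-than-air market study would estimate them from the shipper and carrier data the paper describes.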

  4. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in terms of quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of a disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. PMID:24889823

  6. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and of a face. The full source code of the developed application is provided as an attachment. PMID:26556680

  8. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to quantitatively evaluate the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study of an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in southern Italy. Once the seismic failure probabilities had been quantified, consequence analysis was performed for those events which may be triggered by loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour lines for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with a QRA obtained by considering only process-related top events is reported for reference. PMID:15908107

  9. Data from quantitative label free proteomics analysis of rat spleen.

    PubMed

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work was obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS-MS analysis was developed using a urea- and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and Maxquant output are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow assessment of the nature of the identified proteins and of the variability in quantitative analysis associated with each sampling strategy, and allow a proper number of replicates to be defined for future quantitative analyses. PMID:27358910

  10. Single-Molecule Sensors: Challenges and Opportunities for Quantitative Analysis.

    PubMed

    Gooding, J Justin; Gaus, Katharina

    2016-09-12

    Measurement science has been converging to smaller and smaller samples, such that it is now possible to detect single molecules. This Review focuses on the next generation of analytical tools that combine single-molecule detection with the ability to measure many single molecules simultaneously and/or process larger and more complex samples. Such single-molecule sensors constitute a new type of quantitative analytical tool, as they perform analysis by molecular counting and thus potentially capture the heterogeneity of the sample. This Review outlines the advantages and potential of these new, quantitative single-molecule sensors, the measurement challenges in making single-molecule devices suitable for analysis, the inspiration biology provides for overcoming these challenges, and some of the solutions currently being explored.

  12. Quantitative NMR Analysis of Partially Substituted Biodiesel Glycerols

    SciTech Connect

    Nagy, M.; Alleman, T. L.; Dyer, T.; Ragauskas, A. J.

    2009-01-01

    Phosphitylation of hydroxyl groups in biodiesel samples with 2-chloro-4,4,5,5-tetramethyl-1,3,2-dioxaphospholane followed by 31P-NMR analysis provides a rapid quantitative analytical technique for determining substitution patterns on partially esterified glycerols. The unique 31P-NMR chemical shift data were established with a series of mono- and di-substituted fatty acid esters of glycerol and then utilized to characterize an industrial sample of partially processed biodiesel.

  13. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    NASA Astrophysics Data System (ADS)

    Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A.

    2013-05-01

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses.

  14. Development of a method to detect and quantify Aspergillus fumigatus conidia by quantitative PCR for environmental air samples.

    PubMed

    McDevitt, James J; Lees, Peter S J; Merz, William G; Schwab, Kellogg J

    2004-10-01

    Exposure to Aspergillus fumigatus is linked with respiratory diseases such as asthma, invasive aspergillosis, hypersensitivity pneumonitis, and allergic bronchopulmonary aspergillosis. Molecular methods using quantitative PCR (qPCR) offer advantages over culture and optical methods for estimating human exposures to microbiological agents such as fungi. We describe an assay that uses lyticase to digest A. fumigatus conidia followed by TaqMan qPCR to quantify released DNA. This method will allow analysis of airborne A. fumigatus samples collected over extended time periods and provide a more representative assessment of chronic exposure. The method was optimized for environmental samples and incorporates: single tube sample preparation to reduce sample loss, maintain simplicity, and avoid contamination; hot start amplification to reduce non-specific primer/probe annealing; and uracil-N-glycosylase to prevent carryover contamination. An A. fumigatus internal standard was developed and used to detect PCR inhibitors potentially found in air samples. The assay detected fewer than 10 A. fumigatus conidia per qPCR reaction and quantified conidia over a 4-log10 range with high linearity (R2 >0.99) and low variability among replicate standards (CV=2.0%) in less than 4 h. The sensitivity and linearity of qPCR for conidia deposited on filters was equivalent to conidia calibration standards. A. fumigatus DNA from 8 isolates was consistently quantified using this method, while non-specific DNA from 14 common environmental fungi, including 6 other Aspergillus species, was not detected. This method provides a means of analyzing long term air samples collected on filters which may enable investigators to correlate airborne environmental A. fumigatus conidia concentrations with adverse health effects.
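    The quantification step underlying an assay like this rests on a qPCR standard curve: the threshold cycle (Ct) is linear in log10 of the starting template amount, and unknowns are read off the inverted fit. A minimal sketch follows; the slope, intercept, and dilution series are made-up illustrative values, not data from the paper.

```python
import math

def fit_standard_curve(quantities, cts):
    """Least-squares fit of Ct = slope * log10(quantity) + intercept."""
    xs = [math.log10(q) for q in quantities]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cts) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, cts))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, intercept

def quantify(ct, slope, intercept):
    """Invert the standard curve to estimate the starting quantity."""
    return 10 ** ((ct - intercept) / slope)

# Illustrative 4-log10 dilution series (10..10^4 conidia) with ideal Cts:
standards = [10, 100, 1000, 10000]
cts = [38.0 - 3.32 * math.log10(q) for q in standards]
slope, intercept = fit_standard_curve(standards, cts)
amplification_efficiency = 10 ** (-1 / slope) - 1  # ~1.0 means ~100% efficient
```

A slope near -3.32 corresponds to perfect doubling per cycle, which is why that value is the usual benchmark for an efficient assay.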

  15. Comprehensive Quantitative Analysis of SQ Injection Using Multiple Chromatographic Technologies.

    PubMed

    Chau, Siu-Leung; Huang, Zhi-Bing; Song, Yan-Gang; Yue, Rui-Qi; Ho, Alan; Lin, Chao-Zhan; Huang, Wen-Hua; Han, Quan-Bin

    2016-01-01

    Quality control of Chinese medicine injections remains a challenge due to our poor knowledge of their complex chemical profile. This study aims to investigate the chemical composition of one of the best-selling injections, Shenqi Fuzheng (SQ) injection (SQI), via a full component quantitative analysis. A total of 15 representative small molecular components of SQI were simultaneously determined using ultra-high performance liquid chromatography (UHPLC) coupled with quadrupole tandem time-of-flight mass spectrometry (Q-TOF-MS); the saccharide composition of SQI was also quantitatively determined by high performance liquid chromatography (HPLC) with an evaporative light scattering detector (ELSD) on an amino column before and after acid hydrolysis. The existence of polysaccharides was also examined on a gel permeation chromatography column. The method was well validated in terms of linearity, sensitivity, precision, accuracy and stability, and was successfully applied to analyze 13 SQI samples. The results demonstrate that up to 94.69% (w/w) of this injection product is quantitatively determined, in which small molecules and monosaccharide/sucrose account for 0.18%-0.21% and 53.49%-58.2%, respectively. The quantitative information contributes to accumulating scientific evidence to better understand the therapeutic efficacy and safety of complex Chinese medicine injections. PMID:27548134

  16. Quantitation of glycerophosphorylcholine by flow injection analysis using immobilized enzymes.

    PubMed

    Mancini, A; Del Rosso, F; Roberti, R; Caligiana, P; Vecchini, A; Binaglia, L

    1996-09-20

    A method for quantitating glycerophosphorylcholine by flow injection analysis is reported in the present paper. Glycerophosphorylcholine phosphodiesterase and choline oxidase, immobilized on controlled-porosity glass beads, are packed in a small reactor inserted in a flow injection manifold. When samples containing glycerophosphorylcholine are injected, glycerophosphorylcholine is hydrolyzed into choline and sn-glycerol-3-phosphate. The free choline produced in this reaction is oxidized to betaine and hydrogen peroxide. Hydrogen peroxide is detected amperometrically. Quantitation of glycerophosphorylcholine in samples containing choline and phosphorylcholine is obtained by inserting, ahead of the reactor, a small column packed with a mixed-bed ion exchange resin. The time needed for each determination does not exceed one minute. The present method, applied to quantitate glycerophosphorylcholine in samples of seminal plasma, gave results comparable with those obtained using the standard enzymatic-spectrophotometric procedure. An alternative procedure, making use of co-immobilized glycerophosphorylcholine phosphodiesterase and glycerol-3-phosphate oxidase for quantitating glycerophosphorylcholine, glycerophosphorylethanolamine, and glycerophosphorylserine, is also described. PMID:8905629

  17. Quantitative Proteomic Approaches for Analysis of Protein S-Nitrosylation.

    PubMed

    Qu, Zhe; Greenlief, C Michael; Gu, Zezong

    2016-01-01

    S-Nitrosylation is a redox-based post-translational modification of a protein in response to nitric oxide (NO) signaling, and it participates in a variety of processes in diverse biological systems. The significance of this type of protein modification in health and diseases is increasingly recognized. In the central nervous system, aberrant S-nitrosylation, due to excessive NO production, is known to cause protein misfolding, mitochondrial dysfunction, transcriptional dysregulation, and neuronal death. This leads to an altered physiological state and consequently contributes to pathogenesis of neurodegenerative disorders. To date, much effort has been made to understand the mechanisms underlying protein S-nitrosylation, and several approaches have been developed to unveil S-nitrosylated proteins from different organisms. Interest in determining the dynamic changes of protein S-nitrosylation under different physiological and pathophysiological conditions has underscored the need for the development of quantitative proteomic approaches. Currently, both gel-based and gel-free mass spectrometry-based quantitative methods are widely used, and they each have advantages and disadvantages but may also be used together to produce complementary data. This review evaluates current available quantitative proteomic techniques for the analysis of protein S-nitrosylation and highlights recent advances, with emphasis on applications in neurodegenerative diseases. An important goal is to provide a comprehensive guide of feasible quantitative proteomic methodologies for examining protein S-nitrosylation in research to yield insights into disease mechanisms, diagnostic biomarkers, and drug discovery.

  19. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    ERIC Educational Resources Information Center

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)

  20. Quantitative analysis of surface electromyography: Biomarkers for convulsive seizures.

    PubMed

    Beniczky, Sándor; Conradsen, Isa; Pressler, Ronit; Wolf, Peter

    2016-08-01

    In electroencephalographic (EEG) practice, muscle activity during seizures is often considered an irritating artefact. This article discusses ways to turn it, by means of surface electromyography (EMG), into a valuable tool of epileptology. Muscles are in direct synaptic contact with motor neurons; therefore, EMG signals provide direct information about the electric activity in the motor cortex. Qualitative analysis of EMG has traditionally been a part of long-term video-EEG recordings. Recent developments in quantitative analysis of EMG signals have yielded valuable information on the pathomechanisms of convulsive seizures, demonstrating that they differ from maximal voluntary contraction and from convulsive psychogenic non-epileptic seizures. Furthermore, the tonic phase of generalised tonic-clonic seizures (GTCS) proved to have different quantitative features than tonic seizures. The high temporal resolution of EMG allowed detailed characterisation of the temporal dynamics of GTCS, suggesting that the same inhibitory mechanisms that try to prevent the build-up of seizure activity contribute to ending the seizure. These findings have clinical implications: the quantitative EMG features provided the pathophysiologic substrate for developing neurophysiologic biomarkers that accurately identify GTCS. This proved efficient both for seizure detection and for objective, automated distinction between convulsive and non-convulsive epileptic seizures.
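    A basic quantitative EMG feature behind such biomarkers is the amplitude envelope, e.g. root-mean-square over short windows, which separates a high-amplitude convulsive burst from baseline. The sketch below is a minimal illustration of windowed RMS; the window length and the toy trace are assumptions, not the authors' published detector.

```python
import math

def windowed_rms(signal, window):
    """RMS amplitude over consecutive non-overlapping windows."""
    return [
        math.sqrt(sum(s * s for s in signal[i:i + window]) / window)
        for i in range(0, len(signal) - window + 1, window)
    ]

# Toy trace: low-amplitude baseline followed by a high-amplitude burst.
trace = [0.1, -0.1, 0.1, -0.1] + [2.0, -2.0, 2.0, -2.0]
envelope = windowed_rms(trace, 4)  # approximately [0.1, 2.0]
```

On real data the window would be chosen relative to the sampling rate, and the envelope's temporal evolution (rather than a single threshold) is what distinguishes tonic from clonic phases.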

  1. Lichens biomonitoring as feasible methodology to assess air pollution in natural ecosystems: combined study of quantitative PAHs analyses and lichen biodiversity in the Pyrenees Mountains.

    PubMed

    Blasco, María; Domeño, Celia; Nerín, Cristina

    2008-06-01

    The air quality in the Aragón valley, in the central Pyrenees, has been assessed by evaluation of lichen biodiversity and mapped by elaboration of the Index of Air Purity (IAP), based on observations of the presence and abundance of eight kinds of lichen with different sensitivities to air pollution. The IAP values obtained have been compared with quantitative analytical measurements of 16 PAHs in the lichen Evernia prunastri, because this species was associated with a wide range of traffic exposure and levels of urbanization. Analyses of PAHs were carried out by the DSASE method followed by an SPE clean-up step and GC-MS analysis. The concentration of total PAHs found in lichen samples from the Aragón valley ranged from 692 to 6420 ng g(-1), and the PAH profile showed a predominance of compounds with three aromatic rings. The influence of road traffic in the area was evident, because values above the median concentration of PAHs (>1092 ng g(-1)), percentage of combustion PAHs (>50%), and equivalent toxicity (>169) were found in lichens collected at places exposed to traffic. The combination of both methods suggests IAP as a general method for evaluating air pollution referenced to PAHs, because it can be correlated with the content of combustion PAHs, and poor lichen biodiversity can be partly explained by the air pollution caused by specific PAHs.
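    Indices of Air Purity of this kind are conventionally computed from the species present at a station and their ecological/abundance scores. The sketch below assumes the classic LeBlanc–De Sloover form, IAP = Σ(Q·f)/10, where Q is a species' ecological (pollution-resistance) index and f its frequency/coverage score; the station values are invented for illustration, and this may differ in detail from the index used in the study.

```python
def index_of_air_purity(species_scores):
    """IAP = sum(Q * f) / 10 over all lichen species found at a station.

    species_scores: list of (Q, f) pairs, where Q is the ecological index
    of the species and f its frequency/coverage score (assumed convention).
    """
    return sum(q * f for q, f in species_scores) / 10.0

# Hypothetical station with three species; (Q, f) values are illustrative.
station = [(14, 3), (9, 5), (22, 1)]
iap = index_of_air_purity(station)  # higher IAP = cleaner air
```

Because both the species count and their abundance enter the sum, a station losing pollution-sensitive species sees its IAP fall even if tolerant species remain abundant.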

  2. Label-Free Technologies for Quantitative Multiparameter Biological Analysis

    PubMed Central

    Qavi, Abraham J.; Washburn, Adam L.; Byeon, Ji-Yeon; Bailey, Ryan C.

    2009-01-01

    In the post-genomic era, information is king and information-rich technologies are critically important drivers in both fundamental biology and medicine. It is now known that single-parameter measurements provide only limited detail and that quantitation of multiple biomolecular signatures can more fully illuminate complex biological function. Label-free technologies have recently attracted significant interest for sensitive and quantitative multiparameter analysis of biological systems. There are several different classes of label-free sensors that are currently being developed both in academia and in industry. In this critical review, we highlight, compare, and contrast some of the more promising approaches. We will describe the fundamental principles of these different methodologies and discuss advantages and disadvantages that might potentially help one in selecting the appropriate technology for a given bioanalytical application. PMID:19221722

  3. Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells.

    PubMed

    Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2015-01-01

    Leaf pavement cells are shaped like jigsaw puzzle pieces in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cell morphogenesis and proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed a quantitative analysis of cortical microtubule orientation in leaf pavement cells of Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules maintained orientations parallel to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells. PMID:26039484
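    Evaluating microtubule orientation "relative to the growth axis" amounts to comparing axial angles, which are equivalent modulo 180°. A minimal sketch of that comparison follows; the angle convention is a standard assumption, not the authors' exact image-processing pipeline.

```python
def angle_to_axis(orientation_deg, axis_deg):
    """Acute angle between a microtubule orientation and the growth axis.

    Orientations are axial (a filament at 10 deg equals one at 190 deg),
    so differences are reduced modulo 180 and folded into [0, 90].
    """
    d = abs(orientation_deg - axis_deg) % 180.0
    return min(d, 180.0 - d)

def mean_deviation(orientations_deg, axis_deg):
    """Average deviation from the growth axis; near 0 means 'parallel'."""
    devs = [angle_to_axis(o, axis_deg) for o in orientations_deg]
    return sum(devs) / len(devs)
```

A population of microtubules "parallel to the growth axis" would show a mean deviation close to 0°, while a random population would average near 45°.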

  5. Biomechanical cell analysis using quantitative phase imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Wax, Adam; Park, Han Sang; Eldridge, William J.

    2016-03-01

    Quantitative phase imaging provides nanometer scale sensitivity and has been previously used to study spectral and temporal characteristics of individual cells in vitro, especially red blood cells. Here we extend this work to study the mechanical responses of individual cells due to the influence of external stimuli. Cell stiffness may be characterized by analyzing the inherent thermal fluctuations of cells but by applying external stimuli, additional information can be obtained. The time dependent response of cells due to external shear stress is examined with high speed quantitative phase imaging and found to exhibit characteristics that relate to their stiffness. However, analysis beyond the cellular scale also reveals internal organization of the cell and its modulation due to pathologic processes such as carcinogenesis. Further studies with microfluidic platforms point the way for using this approach in high throughput assays.

  6. Air

    MedlinePlus

    ... do to protect yourself from dirty air . Indoor air pollution and outdoor air pollution Air can be polluted indoors and it can ... this chart to see what things cause indoor air pollution and what things cause outdoor air pollution! Indoor ...

  7. Quantitative analysis of in vivo confocal microscopy images: a review.

    PubMed

    Patel, Dipika V; McGhee, Charles N

    2013-01-01

    In vivo confocal microscopy (IVCM) is a non-invasive method of examining the living human cornea. The recent trend towards quantitative studies using IVCM has led to the development of a variety of methods for quantifying image parameters. When selecting IVCM images for quantitative analysis, it is important to be consistent regarding the location, depth, and quality of images. All images should be de-identified, randomized, and calibrated prior to analysis. Numerous image analysis software packages are available, each with its own advantages and disadvantages. Criteria for analyzing corneal epithelium, sub-basal nerves, keratocytes, endothelium, and immune/inflammatory cells have been developed, although there is inconsistency among research groups regarding parameter definition. The quantification of stromal nerve parameters, however, remains a challenge. Most studies report lower inter-observer repeatability compared with intra-observer repeatability, and observer experience is known to be an important factor. Standardization of IVCM image analysis through the use of a reading center would be crucial for any future large, multi-centre clinical trials using IVCM.

  8. Fluorescent foci quantitation for high-throughput analysis

    PubMed Central

    Ledesma-Fernández, Elena; Thorpe, Peter H.

    2015-01-01

    A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells. PMID:26290880
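    Focus intensity measurements of the kind this tool automates usually reduce to integrating pixel values over the focus and subtracting a local background estimate. The pure-Python sketch below illustrates that idea under simple assumptions (a rectangular focus box and a median-of-ring background); it is not FociQuant's actual code, which is a separate download.

```python
from statistics import median

def focus_intensity(image, box, pad=1):
    """Background-corrected integrated intensity of one focus.

    image: 2D list of pixel values; box: (row0, row1, col0, col1) bounding
    the focus; background is the median of a 'pad'-wide ring around it.
    """
    r0, r1, c0, c1 = box
    focus = [image[r][c] for r in range(r0, r1) for c in range(c0, c1)]
    ring = [
        image[r][c]
        for r in range(max(0, r0 - pad), min(len(image), r1 + pad))
        for c in range(max(0, c0 - pad), min(len(image[0]), c1 + pad))
        if not (r0 <= r < r1 and c0 <= c < c1)
    ]
    return sum(focus) - median(ring) * len(focus)

# Toy image: a 2x2 bright focus (value 50) on a background of 10.
img = [[10, 10, 10, 10],
       [10, 50, 50, 10],
       [10, 50, 50, 10],
       [10, 10, 10, 10]]
signal = focus_intensity(img, (1, 3, 1, 3))  # 4*50 - 4*10 = 160
```

Using the median of the surrounding ring rather than the mean keeps the background estimate robust to a neighbouring bright focus spilling into the ring.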

  9. Binary Imaging Analysis for Comprehensive Quantitative Assessment of Peripheral Nerve

    PubMed Central

    Hunter, Daniel A.; Moradzadeh, Arash; Whitlock, Elizabeth L.; Brenner, Michael J.; Myckatyn, Terence M.; Wei, Cindy H.; Tung, Thomas H.H.; Mackinnon, Susan E.

    2007-01-01

    Quantitative histomorphometry is the current gold standard for objective measurement of nerve architecture and its components. Many methods still in use rely heavily upon manual techniques that are prohibitively time consuming, predisposing to operator fatigue, sampling error, and overall limited reproducibility. More recently, investigators have attempted to combine the speed of automated morphometry with the accuracy of manual and semi-automated methods. Systematic refinements in binary imaging analysis techniques combined with an algorithmic approach allow for more exhaustive characterization of nerve parameters in the surgically relevant injury paradigms of regeneration following crush, transection, and nerve gap injuries. The binary imaging method introduced here uses multiple bitplanes to achieve reproducible, high throughput quantitative assessment of peripheral nerve. Number of myelinated axons, myelinated fiber diameter, myelin thickness, fiber distributions, myelinated fiber density, and neural debris can be quantitatively evaluated with stratification of raw data by nerve component. Results of this semi-automated method are validated by comparing values against those obtained with manual techniques. The use of this approach results in more rapid, accurate, and complete assessment of myelinated axons than manual techniques. PMID:17675163
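    The core of binary imaging morphometry is thresholding a grayscale image to a bitplane and then labeling connected foreground objects (e.g. myelinated axons) so they can be counted and measured. The sketch below is a minimal pure-Python illustration assuming a single global threshold and 4-connectivity; the published method uses multiple bitplanes and is considerably more elaborate.

```python
def count_objects(gray, threshold):
    """Threshold a 2D grayscale image and count 4-connected foreground blobs."""
    rows, cols = len(gray), len(gray[0])
    binary = [[1 if px >= threshold else 0 for px in row] for row in gray]
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not seen[r][c]:
                count += 1
                stack = [(r, c)]  # flood-fill one object
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols \
                            and binary[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
    return count

# Two separate bright blobs on a dark background:
img = [[0, 9, 0, 0],
       [0, 9, 0, 8],
       [0, 0, 0, 8]]
n_axons = count_objects(img, 5)  # -> 2
```

Per-object measurements such as fiber diameter or myelin thickness would be accumulated inside the flood fill, where each object's pixels are visited exactly once.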

  10. Quantitative option analysis for implementation and management of landfills.

    PubMed

    Kerestecioğlu, Merih

    2016-09-01

    The selection of the most feasible strategy for implementation of landfills is a challenging step. Potential implementation options of landfills cover a wide range, from conventional construction contracts to concessions. Montenegro, seeking to improve the efficiency of the public services while maintaining affordability, was considering privatisation as a way to reduce public spending on service provision. In this study, to determine the most feasible model for construction and operation of a regional landfill, a quantitative risk analysis was implemented in five steps: (i) development of a global risk matrix; (ii) assignment of qualitative probabilities of occurrence and magnitudes of impact; (iii) determination of the risks to be mitigated, monitored, controlled or ignored; (iv) reduction of the main risk elements; and (v) incorporation of quantitative estimates of probability of occurrence and expected impact for each risk element in the reduced risk matrix. The evaluated scenarios were: (i) construction and operation of the regional landfill by the public sector; (ii) construction and operation of the landfill by the private sector and transfer of ownership to the public sector after a pre-defined period; and (iii) operation of the landfill by the private sector, without ownership. The quantitative risk assessment concluded that introduction of a public-private partnership is not the most feasible option, unlike the common belief in several public institutions in developing countries. A management contract was advised for the first years of operation, after which a long-term operating contract may follow. PMID:27354014
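
    The final scoring step, weighting each retained risk element's impact by its probability of occurrence, can be sketched as follows. The scenario names and numbers are purely illustrative, not values from the study.

```python
def expected_risk(risk_matrix):
    """Sum of probability-weighted impacts for one implementation option.
    risk_matrix: list of (probability, impact) pairs for each retained
    risk element. Lower totals are preferred."""
    return sum(p * impact for p, impact in risk_matrix)

# Illustrative scenarios only (hypothetical probabilities and impacts).
scenarios = {
    "public": [(0.3, 8), (0.5, 4)],
    "concession": [(0.6, 9), (0.4, 5)],
}
best = min(scenarios, key=lambda s: expected_risk(scenarios[s]))
```

    Ranking options by this total makes the comparison between public operation, transfer-of-ownership and operation-only models explicit and repeatable.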

  11. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
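
    The bootstrap construction of a pseudo-predator signature with a known diet can be sketched as below. This is a simplified illustration of the general QFASA simulation setup, not Bromaghin's sample-size algorithm itself; the prey names and bootstrap sizes are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def pseudo_predator(prey_sigs, diet, n_boot):
    """Build one pseudo-predator fatty-acid signature with known diet.

    prey_sigs: dict of prey type -> (n_animals, n_fatty_acids) array of
    signatures (rows sum to 1). diet: dict of prey type -> known diet
    proportion. For each prey type, n_boot[type] signatures are
    bootstrap-resampled and averaged; the pseudo-predator is the
    diet-weighted mixture of those bootstrap means.
    """
    mix = 0.0
    for prey, sigs in prey_sigs.items():
        idx = rng.integers(0, len(sigs), size=n_boot[prey])
        mix = mix + diet[prey] * sigs[idx].mean(axis=0)
    return mix
```

    The paper's contribution is choosing the `n_boot` values objectively so that the variability of these pseudo-predators matches realistic predator-to-predator variability.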

  12. A quantitative analysis of IRAS maps of molecular clouds

    NASA Technical Reports Server (NTRS)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-01-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In the present study we use the IRAS continuum maps at 100 and 60 micrometers to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.
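
    A toy example of such a pseudometric, using the density distribution as the 'output' function, is sketched below. This is illustrative only; the paper's distance functions are more elaborate.

```python
import numpy as np

def map_pseudometric(map_a, map_b, bins=16):
    """Pseudometric between two column-density maps: the L1 distance
    between their normalized density distributions. Two distinct maps
    with identical density histograms have distance zero, which is what
    makes this a pseudometric rather than a true metric."""
    lo = min(map_a.min(), map_b.min())
    hi = max(map_a.max(), map_b.max())
    ha, _ = np.histogram(map_a, bins=bins, range=(lo, hi), density=True)
    hb, _ = np.histogram(map_b, bins=bins, range=(lo, hi), density=True)
    return float(np.abs(ha - hb).sum())
```

    Any such distance function lets the set of cloud maps be ordered, which is the basis of the quantitative classification scheme described above.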

  13. Facegram - Objective quantitative analysis in facial reconstructive surgery.

    PubMed

    Gerós, Ana; Horta, Ricardo; Aguiar, Paulo

    2016-06-01

    Evaluation of effectiveness in reconstructive plastic surgery has become an increasingly important asset in comparing and choosing the most suitable medical procedure to handle facial disfigurement. Unfortunately, traditional methods to assess the results of surgical interventions are mostly qualitative and lack information about movement dynamics. Along with this, the few existing methodologies tailored to objectively quantify surgery results are not practical in the medical field due to constraints in terms of cost, complexity and poor suitability to the clinical environment. These limitations underscore an urgent need for a new system to quantify facial movement and allow for easy interpretation by medical experts. With this in mind, we present here a novel method capable of quantitatively and objectively assessing complex facial movements, using a set of morphological, static and dynamic measurements. For this purpose, RGB-D cameras are used to acquire both color and depth images, and a modified block matching algorithm, combining depth and color information, was developed to track the position of anatomical landmarks of interest. The algorithms are integrated into a user-friendly graphical interface and the analysis outcomes are organized into an innovative medical tool, named facegram. This system was developed in close collaboration with plastic surgeons and the methods were validated using control subjects and patients with facial paralysis. The system was shown to provide useful and detailed quantitative information (static and dynamic), making it an appropriate solution for objective quantitative characterization of facial movement in a clinical environment. PMID:26994664

  14. Desiccant Enhanced Evaporative Air Conditioning: Parametric Analysis and Design; Preprint

    SciTech Connect

    Woods, J.; Kozubal, E.

    2012-10-01

    This paper presents a parametric analysis using a numerical model of a new concept in desiccant and evaporative air conditioning. The concept consists of two stages: a liquid desiccant dehumidifier and a dew-point evaporative cooler. Each stage consists of stacked air channel pairs separated by a plastic sheet. In the first stage, a liquid desiccant film removes moisture from the process (supply-side) air through a membrane. An evaporatively-cooled exhaust airstream on the other side of the plastic sheet cools the desiccant. The second-stage indirect evaporative cooler sensibly cools the dried process air. We analyze the tradeoff between device size and energy efficiency. This tradeoff depends strongly on process air channel thicknesses, the ratio of first-stage to second-stage area, and the second-stage exhaust air flow rate. A sensitivity analysis reiterates the importance of the process air boundary layers and suggests a need for increasing airside heat and mass transfer enhancements.

  15. Air Cargo Transportation Route Choice Analysis

    NASA Technical Reports Server (NTRS)

    Obashi, Hiroshi; Kim, Tae-Seung; Oum, Tae Hoon

    2003-01-01

    Using a unique feature of air cargo transshipment data in the Northeast Asian region, this paper identifies the critical factors that determine transshipment route choice. Taking advantage of the variations in transport characteristics in each origin-destination airport pair, the paper uses a discrete choice model to describe the transshipment route choice decision made by an agent (i.e., freight forwarder, consolidator, or large shipper). The analysis incorporates two major factors, monetary cost (such as line-haul cost and landing fee) and time cost (i.e., aircraft turnaround time, including loading and unloading time, customs clearance time, and expected schedule delay), along with other controls. The estimation method considers the presence of unobserved attributes and corrects for the resulting endogeneity by use of appropriate instrumental variables. Estimation results find that transshipment volumes are more sensitive to time cost, and that a reduction in aircraft turnaround time of 1 hour would be worth an increase in airport charges of more than $1000. Simulation exercises measure the impacts of alternative policy scenarios for a Korean airport, which has recently declared its intention to become a future regional hub in Northeast Asia. The results suggest that reducing aircraft turnaround time at the airport would be a more effective strategy than subsidies to reduce airport charges.
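
    A discrete (multinomial logit) route-choice model of the kind described above can be sketched as follows. The coefficients are illustrative assumptions, chosen so that one hour of turnaround time trades off against a bit more than $1000 of airport charges (beta_time / beta_cost > 1000), in the spirit of the estimation results; they are not the paper's estimates.

```python
import math

def route_shares(routes, beta_cost=-0.0008, beta_time=-0.9):
    """Multinomial-logit shares over candidate transshipment routes.
    Each route is (airport_charges_usd, turnaround_time_hr). Utilities
    are linear in cost and time; shares are softmax of the utilities."""
    utils = [beta_cost * c + beta_time * t for c, t in routes]
    denom = sum(math.exp(u) for u in utils)
    return [math.exp(u) / denom for u in utils]
```

    With time weighted this heavily, a route that charges $1000 more but turns aircraft around two hours faster still captures the larger share, which is the policy intuition of the paper.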

  16. Quantitative Northern Blot Analysis of Mammalian rRNA Processing.

    PubMed

    Wang, Minshi; Pestov, Dimitri G

    2016-01-01

    Assembly of eukaryotic ribosomes is an elaborate biosynthetic process that begins in the nucleolus and requires hundreds of cellular factors. Analysis of rRNA processing has been instrumental for studying the mechanisms of ribosome biogenesis and effects of stress conditions on the molecular milieu of the nucleolus. Here, we describe the quantitative analysis of the steady-state levels of rRNA precursors, applicable to studies in mammalian cells and other organisms. We include protocols for gel electrophoresis and northern blotting of rRNA precursors using procedures optimized for the large size of these RNAs. We also describe the ratio analysis of multiple precursors, a technique that facilitates the accurate assessment of changes in the efficiency of individual pre-rRNA processing steps. PMID:27576717
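
    One plausible form of the ratio analysis of multiple precursors is sketched below: compare the downstream/upstream precursor ratio between treated and control samples to isolate the efficiency of a single processing step. The exact formula and the precursor names (47S, 32S are common mammalian pre-rRNA species used here as examples) are assumptions for illustration.

```python
def step_efficiency_change(ctrl, treat, upstream, downstream):
    """Relative change in the efficiency of one pre-rRNA processing step.

    ctrl and treat map precursor names to northern-blot band intensities.
    Returns (treat downstream/upstream ratio) / (ctrl ratio); values
    below 1 suggest the step converting `upstream` into `downstream`
    became less efficient under treatment."""
    ratio_ctrl = ctrl[downstream] / ctrl[upstream]
    ratio_treat = treat[downstream] / treat[upstream]
    return ratio_treat / ratio_ctrl
```

    Taking ratios of precursors within each lane cancels loading differences, which is why this readout is more robust than comparing single band intensities across lanes.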

  17. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    PubMed Central

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes protein glycosylation-targeting enrichment technologies, mainly solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc. Mass Spectrom Rev 34:148–165, 2015. PMID:24889823

  18. Quantitative analysis of motion control in long term microgravity.

    PubMed

    Baroni, G; Ferrigno, G; Anolli, A; Andreoni, G; Pedotti, A

    1998-01-01

    In the frame of the 179-day EUROMIR '95 space mission, two in-flight experiments involved quantitative three-dimensional human movement analysis in microgravity. For this aim, a space-qualified opto-electronic motion analyser based on passive markers was installed onboard the Russian space station MIR and 8 in-flight sessions were performed. The technology and method for the collection of kinematic data are described, evaluating the accuracy of three-dimensional marker localisation. Results confirm the suitability of opto-electronic technology for quantitative human motion analysis on orbital modules and raise a set of "lessons learned", leading to improvement of motion analyser performance together with swifter on-board operations. Within the experimental program of T4, results of three voluntary posture perturbation protocols are described. The analysis suggests that a short-term reinterpretation of proprioceptive information and re-calibration of sensorimotor mechanisms seem to end within the first weeks of flight, while a continuous long-term adaptation process allows the refinement of motor performance, in the frame of never-abandoned terrestrial strategies.

  19. Computer compensation for NMR quantitative analysis of trace components

    SciTech Connect

    Nakayama, T.; Fujiwara, Y.

    1981-07-22

    A computer program has been written that determines trace components and separates overlapping components in multicomponent NMR spectra. The program uses the Lorentzian curve as the theoretical line shape of NMR spectra. The coefficients of the Lorentzian are determined by the method of least squares. Systematic errors such as baseline and phase distortion are compensated, and random errors are smoothed by taking moving averages, so that these processes contribute substantially to decreasing the accumulation time of spectral data. The accuracy of quantitative analysis of trace components has been improved by two significant figures. The program was applied to determining the abundance of 13C and the degree of saponification of PVA.
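
    The separable, linear part of the Lorentzian least-squares problem can be sketched as follows: given peak positions and widths, the component amplitudes of overlapping lines fall out of a linear least-squares solve. This is a minimal illustration, not the original program; in practice the positions and widths would also be refined and moving-average smoothing applied first.

```python
import numpy as np

def lorentzian(x, x0, gamma):
    """Unit-amplitude Lorentzian line shape centered at x0 with
    half-width gamma."""
    return gamma**2 / ((x - x0)**2 + gamma**2)

def fit_amplitudes(x, spectrum, peaks):
    """Recover the amplitude of each (possibly overlapping) Lorentzian
    component by linear least squares. peaks: list of (x0, gamma)
    pairs assumed known from an outer nonlinear search."""
    basis = np.column_stack([lorentzian(x, x0, g) for x0, g in peaks])
    amps, *_ = np.linalg.lstsq(basis, spectrum, rcond=None)
    return amps
```

    Because the model is linear in the amplitudes, a weak trace component overlapping a strong one can still be separated cleanly, which is the program's central task.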

  20. Flow quantitation by radio frequency analysis of contrast echocardiography.

    PubMed

    Rovai, D; Lombardi, M; Mazzarisi, A; Landini, L; Taddei, L; Distante, A; Benassi, A; L'Abbate, A

    1993-03-01

    Contrast echocardiography has the potential for measuring cardiac output and regional blood flow. However, accurate quantitation is limited both by the use of non-standard contrast agents and by the electronic signal distortion inherent in echocardiographic instruments. Thus, the aim of this study was to quantify flow by combining a stable contrast agent with modified echo equipment able to sample the radio frequency (RF) signal from a region of interest (ROI) in the echo image. The contrast agent SHU-454 (0.8 ml) was bolus-injected into an in vitro calf vein at 23 flow rates (ranging from 376 to 3620 ml/min) but constant volume and pressure. The ROI was placed in the centre of the vein; the RF signal was processed in real time and transferred to a personal computer to generate time-intensity curves. In the absence of recirculation, contrast washout slope and mean transit time (MTT) of the curves (1.11-8.52 seconds) yielded excellent correlations with flow: r = 0.93 and 0.95, respectively. To compare the accuracy of RF analysis with that of conventional image processing for flow quantitation, conventional images were collected in the same flow model by two different scanners: a) the mechanical sector scanner used for RF analysis, and b) a conventional electronic sector scanner. These images were digitized off-line, mean videodensity inside an identical ROI was measured, and time-intensity curves were built. MTT by RF was shorter than by videodensitometric analysis of the images generated by the same scanner (p < 0.001). In contrast, MTT by RF was longer than by the conventional scanner (p < 0.001). Significant differences in MTT were also found with changes in the gain settings of the conventional scanner. To study the stability of the contrast effect, 6 contrast injections (20 ml) were performed at a constant flow rate during recirculation: the spontaneous decay in RF signal intensity (t1/2 = 64 +/- 8 seconds) was too slow to affect MTT significantly.
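
    The mean transit time of an indicator-dilution (time-intensity) curve is its intensity-weighted average time, which can be sketched as below. This is a generic MTT computation with a crude baseline correction, not the study's processing chain.

```python
import numpy as np

def mean_transit_time(t, intensity):
    """Intensity-weighted mean transit time of a time-intensity curve
    after baseline subtraction. With no recirculation, MTT falls as
    flow rises, which is why it correlates inversely with flow rate."""
    signal = intensity - intensity.min()   # crude baseline correction
    return float(np.sum(t * signal) / np.sum(signal))
```

    Because MTT is a ratio of weighted sums, it is insensitive to overall gain, one reason RF-based curves behave more consistently than videodensity under different gain settings.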

  1. Quantitative proteomic analysis of drug-induced changes in mycobacteria.

    PubMed

    Hughes, Minerva A; Silva, Jeffrey C; Geromanos, Scott J; Townsend, Craig A

    2006-01-01

    A new approach for qualitative and quantitative proteomic analysis using capillary liquid chromatography and mass spectrometry to study the protein expression response in mycobacteria following isoniazid treatment is discussed. In keeping with known effects on the fatty acid synthase II pathway, proteins encoded by the kas operon (AcpM, KasA, KasB, Accd6) were significantly overexpressed, as were those involved in iron metabolism and cell division suggesting a complex interplay of metabolic events leading to cell death. PMID:16396495

  2. [Quantitative analysis for mast cells in obstructive sialadenitis].

    PubMed

    Diao, G X

    1993-03-01

    Quantitative analysis of mast cells in 27 cases of obstructive sialadenitis, 12 cases of approximately normal salivary gland tissue, and 5 cases of lymphoepithelial lesion of the salivary glands shows that the number of mast cells increases slightly with the severity grade of obstructive sialadenitis, and that this is closely related to fibrosis of the salivary glands and to the grade of inflammatory cell infiltration (dominated by lymphocytes), but not to patient age. In cases of benign lymphoepithelial lesion of the salivary glands with malignant change, whether malignant lymphoma or squamous cell carcinoma, the number of mast cells is markedly decreased.

  3. Functional Regression Models for Epistasis Analysis of Multiple Quantitative Traits.

    PubMed

    Zhang, Futao; Xie, Dan; Liang, Meimei; Xiong, Momiao

    2016-04-01

    To date, most genetic analyses of phenotypes have focused on analyzing single traits or analyzing each phenotype independently. However, joint epistasis analysis of multiple complementary traits will increase statistical power and improve our understanding of the complicated genetic structure of complex diseases. Despite their importance in uncovering the genetic structure of complex traits, the statistical methods for identifying epistasis in multiple phenotypes remain fundamentally unexplored. To fill this gap, we formulate a test for interaction between two genes in multiple quantitative trait analysis as a multiple functional regression (MFRG) in which the genotype functions (genetic variant profiles) are defined as a function of the genomic position of the genetic variants. We use large-scale simulations to calculate Type I error rates for testing interaction between two genes with multiple phenotypes and to compare the power with multivariate pairwise interaction analysis and single-trait interaction analysis by a single-variate functional regression model. To further evaluate performance, the MFRG for epistasis analysis is applied to five phenotypes of exome sequence data from the NHLBI's Exome Sequencing Project (ESP) to detect pleiotropic epistasis. A total of 267 pairs of genes that formed a genetic interaction network showed significant evidence of epistasis influencing five traits. The results demonstrate that the joint interaction analysis of multiple phenotypes has a much higher power to detect interaction than the interaction analysis of a single trait and may open a new direction to fully uncovering the genetic structure of multiple phenotypes.

  4. Functional Regression Models for Epistasis Analysis of Multiple Quantitative Traits.

    PubMed

    Zhang, Futao; Xie, Dan; Liang, Meimei; Xiong, Momiao

    2016-04-01

    To date, most genetic analyses of phenotypes have focused on analyzing single traits or analyzing each phenotype independently. However, joint epistasis analysis of multiple complementary traits will increase statistical power and improve our understanding of the complicated genetic structure of complex diseases. Despite their importance in uncovering the genetic structure of complex traits, the statistical methods for identifying epistasis in multiple phenotypes remain fundamentally unexplored. To fill this gap, we formulate a test for interaction between two genes in multiple quantitative trait analysis as a multiple functional regression (MFRG) in which the genotype functions (genetic variant profiles) are defined as a function of the genomic position of the genetic variants. We use large-scale simulations to calculate Type I error rates for testing interaction between two genes with multiple phenotypes and to compare the power with multivariate pairwise interaction analysis and single-trait interaction analysis by a single-variate functional regression model. To further evaluate performance, the MFRG for epistasis analysis is applied to five phenotypes of exome sequence data from the NHLBI's Exome Sequencing Project (ESP) to detect pleiotropic epistasis. A total of 267 pairs of genes that formed a genetic interaction network showed significant evidence of epistasis influencing five traits. The results demonstrate that the joint interaction analysis of multiple phenotypes has a much higher power to detect interaction than the interaction analysis of a single trait and may open a new direction to fully uncovering the genetic structure of multiple phenotypes. PMID:27104857

  5. Quantitative analysis of Caenorhabditis elegans chemotaxis using a microfluidic device.

    PubMed

    Hu, Liang; Ye, Jinjuan; Tan, Haowei; Ge, Anle; Tang, Lichun; Feng, Xiaojun; Du, Wei; Liu, Bi-Feng

    2015-08-01

    Caenorhabditis elegans, one of the most widely studied model organisms, senses external chemical cues and performs corresponding chemotaxis behaviors through its simple chemosensory neuronal system. To study the mechanism underlying chemosensory behavior, a rapid and reliable method for quantitatively analyzing the worms' behaviors is essential. In this work, we demonstrated a microfluidic approach for investigating chemotaxis responses of worms to chemical gradients. The flow-based microfluidic chip consisted of circular tree-like microchannels, which were able to generate eight flow streams containing stepwise chemical concentrations without differences in flow velocity. Worms' upstream swimming into microchannels with various concentrations was monitored for quantitative analysis of the chemotaxis behavior. By using this microfluidic chip, the attractive and repellent responses of C. elegans to NaCl were successfully quantified within several minutes. The results demonstrated wild type-like repellent responses and severely impaired attractive responses in grk-2 mutant animals with defects in calcium influx. In addition, the chemotaxis analysis of third-stage larvae revealed that their gustatory response differs from that in the adult stage. Thus, our microfluidic method provides a useful platform for studying the chemosensory behaviors of C. elegans and screening chemosensation-related chemical drugs.
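
    A standard way to summarize such assays is the chemotaxis index; the paper's exact metric is not given here, so the mapping of channel counts onto "toward" and "away" below is an illustrative assumption.

```python
def chemotaxis_index(n_toward, n_away):
    """Standard chemotaxis index: +1 for pure attraction, -1 for pure
    repulsion, 0 for indifference. n_toward / n_away would be the
    numbers of worms entering channels above / below the reference
    concentration."""
    total = n_toward + n_away
    return (n_toward - n_away) / total if total else 0.0
```

    Comparing this index between wild-type and grk-2 animals, or between larvae and adults, gives a single number per assay that supports the kind of within-minutes quantification described above.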

  6. Quantitative analysis of Caenorhabditis elegans chemotaxis using a microfluidic device.

    PubMed

    Hu, Liang; Ye, Jinjuan; Tan, Haowei; Ge, Anle; Tang, Lichun; Feng, Xiaojun; Du, Wei; Liu, Bi-Feng

    2015-08-01

    Caenorhabditis elegans, one of the most widely studied model organisms, senses external chemical cues and performs corresponding chemotaxis behaviors through its simple chemosensory neuronal system. To study the mechanism underlying chemosensory behavior, a rapid and reliable method for quantitatively analyzing the worms' behaviors is essential. In this work, we demonstrated a microfluidic approach for investigating chemotaxis responses of worms to chemical gradients. The flow-based microfluidic chip consisted of circular tree-like microchannels, which were able to generate eight flow streams containing stepwise chemical concentrations without differences in flow velocity. Worms' upstream swimming into microchannels with various concentrations was monitored for quantitative analysis of the chemotaxis behavior. By using this microfluidic chip, the attractive and repellent responses of C. elegans to NaCl were successfully quantified within several minutes. The results demonstrated wild type-like repellent responses and severely impaired attractive responses in grk-2 mutant animals with defects in calcium influx. In addition, the chemotaxis analysis of third-stage larvae revealed that their gustatory response differs from that in the adult stage. Thus, our microfluidic method provides a useful platform for studying the chemosensory behaviors of C. elegans and screening chemosensation-related chemical drugs. PMID:26320797

  7. Quantitative analysis of echogenicity for patients with thyroid nodules

    PubMed Central

    Wu, Ming-Hsun; Chen, Chiung-Nien; Chen, Kuen-Yuan; Ho, Ming-Chih; Tai, Hao-Chih; Wang, Yu-Hsin; Chen, Argon; Chang, King-Jen

    2016-01-01

    Hypoechogenicity has been described qualitatively and is potentially subject to intra- and inter-observer variability. The aim of this study was to clarify whether quantitative echoic indexes (EIs) are useful for the detection of malignant thyroid nodules. Overall, 333 participants with 411 nodules were included in the final analysis. Quantification of echogenicity was performed using commercial software (AmCAD-UT; AmCad BioMed, Taiwan). The coordinates of three defined regions, the nodule, thyroid parenchyma, and strap muscle regions, were recorded in the database separately for subsequent analysis. The results showed that hypoechogenicity on ultrasound echogenicity (US-E), as assessed by clinicians, was an independent factor for malignancy. The EI, adjusted EI (EIN-T; EIN-M) and automatic EI(N-R)/R values were all significantly different between benign and malignant nodules, with lower values for malignant nodules. All of the EIs showed similar sensitivity and specificity and had better accuracy than US-E. In conclusion, the proposed quantitative EI appears to be an important advancement over the conventional qualitative US-E, allowing a more reliable distinction between benign and malignant thyroid nodules. PMID:27762299
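
    One plausible form of such an echoic index is the mean nodule gray level normalized by the mean gray level of a reference region (parenchyma or muscle). The paper's exact formulas for EI and its adjusted variants are not given here, so the definition below is an assumption for illustration only.

```python
import numpy as np

def echoic_index(nodule_pixels, reference_pixels):
    """Hypothetical echoic index: mean nodule gray level divided by the
    mean gray level of a reference region. Values well below 1 would
    correspond to hypoechoic nodules, which the study associates with
    malignancy."""
    return float(np.mean(nodule_pixels) / np.mean(reference_pixels))
```

    Normalizing against a reference region inside the same image is what makes such an index comparable across machines and gain settings, unlike raw gray levels.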

  8. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    NASA Astrophysics Data System (ADS)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil, and EOG. Among these available recording techniques, EOG is a major source for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods that capture the characteristics of eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 tested subjects were watching a short (>120 s) animation clip. In response to the animated clip, the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of the pathology in these disorders.
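
    A common first step in separating saccades from smooth pursuit in an EOG trace is a velocity threshold, sketched below. This is a generic technique, not the authors' method; the 100 deg/s threshold is a typical but assumed value.

```python
import numpy as np

def detect_saccades(eog, fs, threshold_deg_s=100.0):
    """Flag samples whose eye velocity exceeds a threshold, separating
    saccades (RS) from slower smooth-pursuit movement. eog: gaze angle
    in degrees; fs: sampling rate in Hz. Returns a boolean mask (one
    element shorter than eog, from the finite difference)."""
    velocity = np.diff(eog) * fs            # deg/s
    return np.abs(velocity) > threshold_deg_s
```

    From such a mask one can derive saccade rate, amplitude, and latency statistics, the kinds of features that would feed a classifier of subject state.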

  9. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    SciTech Connect

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-08-14

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility, which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first-principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and the energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interactions, clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. Finally, the quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K, suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  10. Multivariate calibration applied to the quantitative analysis of infrared spectra

    SciTech Connect

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.
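
    Principal component regression, one of the factor-analysis calibration methods compared above, can be sketched in a few lines of NumPy: project mean-centered spectra onto the top singular vectors, regress the property of interest on the scores, and predict. This is a minimal sketch of PCR in general, not the study's implementation.

```python
import numpy as np

def pcr_fit_predict(X_train, y_train, X_test, n_components=2):
    """Principal component regression on spectra.

    Rows of X are spectra; y is the property to calibrate (e.g. a
    concentration). The low-rank projection is what gives factor
    methods their noise rejection relative to ordinary least squares."""
    x_mean, y_mean = X_train.mean(axis=0), y_train.mean()
    Xc = X_train - x_mean
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    pcs = vt[:n_components].T                      # spectral loadings
    scores = Xc @ pcs
    coef, *_ = np.linalg.lstsq(scores, y_train - y_mean, rcond=None)
    return (X_test - x_mean) @ pcs @ coef + y_mean
```

    PLS differs in that its factors are chosen to maximize covariance with y rather than variance of X alone, which is why it often needs fewer components for the same prediction error.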

  11. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    PubMed Central

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-01-01

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations. PMID:26306198

  12. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    DOE PAGES

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-08-14

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. Finally, the quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  13. Quantitative analysis of intermolecular interactions in orthorhombic rubrene.

    PubMed

    Hathwar, Venkatesha R; Sist, Mattia; Jørgensen, Mads R V; Mamakhel, Aref H; Wang, Xiaoping; Hoffmann, Christina M; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-09-01

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H-H interactions. The electron density features of H-H bonding, and the interaction energy of molecular dimers connected by H-H interaction clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations. PMID:26306198

  14. Segmentation and quantitative analysis of individual cells in developmental tissues.

    PubMed

    Nandy, Kaustav; Kim, Jusub; McCullough, Dean P; McAuliffe, Matthew; Meaburn, Karen J; Yamaguchi, Terry P; Gudla, Prabhakar R; Lockett, Stephen J

    2014-01-01

    Image analysis is vital for extracting quantitative information from biological images and is used extensively, including investigations in developmental biology. The technique commences with the segmentation (delineation) of objects of interest from 2D images or 3D image stacks and is usually followed by the measurement and classification of the segmented objects. This chapter focuses on the segmentation task and here we explain the use of ImageJ, MIPAV (Medical Image Processing, Analysis, and Visualization), and VisSeg, three freely available software packages for this purpose. ImageJ and MIPAV are extremely versatile and can be used in diverse applications. VisSeg is a specialized tool for performing highly accurate and reliable 2D and 3D segmentation of objects such as cells and cell nuclei in images and stacks.
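The segment-then-measure pipeline the chapter describes (delineate objects, then measure and classify them) can be illustrated without ImageJ, MIPAV, or VisSeg. The sketch below uses plain NumPy on a toy image standing in for a nuclei channel; the threshold and object shapes are invented.

```python
import numpy as np

# Minimal segmentation sketch: global threshold, 4-connected component
# labeling via flood fill, then per-object measurement (area).
img = np.zeros((8, 8))
img[1:3, 1:3] = 1.0        # toy "nucleus" 1
img[5:8, 4:7] = 0.8        # toy "nucleus" 2

mask = img > 0.5            # segmentation by a global threshold

def label(mask):
    """4-connected component labeling via iterative flood fill."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue
        current += 1
        stack = [seed]
        while stack:
            y, x = stack.pop()
            if not (0 <= y < mask.shape[0] and 0 <= x < mask.shape[1]):
                continue
            if mask[y, x] and labels[y, x] == 0:
                labels[y, x] = current
                stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return labels, current

labels, n = label(mask)
areas = [int((labels == i).sum()) for i in range(1, n + 1)]
print(n, areas)
```

Real tissue images of course need the more sophisticated 2D/3D methods the chapter covers; this only shows the structure of the workflow.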

  15. Quantitatively understanding cellular uptake of gold nanoparticles via radioactivity analysis

    PubMed Central

    Shao, Xia; Schnau, Paul; Qian, Wei; Wang, Xueding

    2015-01-01

    The development of multifunctional gold nanoparticles (AuNPs) underwent an explosion in the last two decades. However, many questions regarding detailed surface chemistry and how they are affecting the behaviors of AuNPs in vivo and in vitro still need to be addressed before AuNPs can be widely adapted into clinical settings. In this work, radioactivity analysis was employed for quantitative evaluation of I-125 radiolabeled AuNPs uptakes by cancer cells. Facilitated with this new method, we have conducted initial bioevaluation of surfactant-free AuNPs produced by femtosecond laser ablation. Cellular uptake of AuNPs as a function of the RGD density on the AuNP surface, as well as a function of time, has been quantified. The radioactivity analysis may shed light on the dynamic interactions of AuNPs with cancer cells, and help achieve optimized designs of AuNPs for future clinical applications. PMID:26505012

  16. [Quantitative analysis of butachlor, oxadiazon and simetryn by gas chromatography].

    PubMed

    Liu, F; Mu, W; Wang, J

    1999-03-01

    The quantitative analysis of the ingredients in 26% B-O-S (butachlor, oxadiazon and simetryn) emulsion by gas chromatographic method was carried out with a 5% SE-30 on Chromosorb AW DMCS, 2 m x 3 mm i.d., glass column at column temperature of 210 degrees C and detector temperature of 230 degrees C. The internal standard is di-n-butyl sebacate. The retention times of simetryn, internal standard, butachlor and oxadiazon were 6.5, 8.3, 9.9 and 11.9 min respectively. This method has a recovery of 98.62%-100.77% and the coefficients of variation of this analysis of butachlor, oxadiazon and simetryn were 0.46%, 0.32% and 0.57% respectively. All coefficients of linear correlation were higher than 0.999.
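The internal-standard quantitation underlying such a GC method reduces to two ratios: a response factor from a calibration injection, then the unknown amount from peak-area ratios. The peak areas and amounts below are hypothetical, chosen only to show the arithmetic; di-n-butyl sebacate is the internal standard, as in the abstract.

```python
# Internal-standard quantitation sketch (all areas/amounts hypothetical).
def response_factor(area_analyte, area_is, conc_analyte, conc_is):
    """Response factor from a calibration injection of known amounts."""
    return (area_analyte / area_is) / (conc_analyte / conc_is)

def quantify(area_analyte, area_is, conc_is, rf):
    """Analyte amount in an unknown injection via the internal standard."""
    return (area_analyte / area_is) * conc_is / rf

# Calibration: 2.0 mg/mL butachlor with 1.0 mg/mL IS gives areas 1500/800.
rf = response_factor(1500.0, 800.0, 2.0, 1.0)
# Unknown: areas 1200/780 with the same 1.0 mg/mL IS spike.
c = quantify(1200.0, 780.0, 1.0, rf)
print(round(c, 3))
```

Because both analyte and internal standard experience the same injection-volume and detector drift, the area ratio cancels those errors, which is what gives the low coefficients of variation the abstract reports.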

  17. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  18. The effect of pedigree complexity on quantitative trait linkage analysis.

    PubMed

    Dyer, T D; Blangero, J; Williams, J T; Göring, H H; Mahaney, M C

    2001-01-01

    Due to the computational difficulties of performing linkage analysis on large complex pedigrees, most investigators resort to simplifying such pedigrees by some ad hoc strategy. In this paper, we suggest an analytical method to compare the power of various pedigree simplification schemes by using the asymptotic distribution of the likelihood-ratio statistic. We applied the method to the large Hutterite pedigree. Our results indicate that the breaking and reduction of inbreeding loops can greatly diminish the power to localize quantitative trait loci. We also present an efficient Monte Carlo method for estimating identity-by-descent allele sharing in large complex pedigrees. This method is used to facilitate a linkage analysis of serum IgE levels in the Hutterites without simplifying the pedigree.

  19. Quantitative Phase Analysis by the Rietveld Method for Forensic Science.

    PubMed

    Deng, Fei; Lin, Xiaodong; He, Yonghong; Li, Shu; Zi, Run; Lai, Shijun

    2015-07-01

    Quantitative phase analysis (QPA) is helpful to determine the type attribute of the object because it could present the content of the constituents. QPA by Rietveld method requires neither measurement of calibration data nor the use of an internal standard; however, the approximate crystal structure of each phase in a mixture is necessary. In this study, 8 synthetic mixtures composed of potassium nitrate and sulfur were analyzed by Rietveld QPA method. The Rietveld refinement was accomplished with a material analysis using diffraction program and evaluated by three agreement indices. Results showed that Rietveld QPA yielded precise results, with errors generally less than 2.0% absolute. In addition, a criminal case which was broken successfully with the help of Rietveld QPA method was also introduced. This method will allow forensic investigators to acquire detailed information of the material evidence, which could point out the direction for case detection and court proceedings.
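The reason Rietveld QPA needs no calibration data or internal standard is the Hill-Howard relation, which converts refined scale factors directly into weight fractions. The sketch below applies it to the potassium nitrate/sulfur system of the abstract; the scale factors and crystallographic constants are illustrative stand-ins, not refined values from the paper.

```python
# Rietveld QPA sketch via the Hill-Howard relation:
#   W_i = S_i (Z M V)_i / sum_j S_j (Z M V)_j
# where S = refined scale factor, Z = formula units per cell,
# M = formula mass, V = unit-cell volume. Numbers are illustrative.
phases = {
    # name: (scale, Z, M (g/mol), V (A^3))
    "KNO3":   (5.0e-3, 4, 101.10, 380.0),
    "sulfur": (8.0e-5, 16, 256.5, 3300.0),   # alpha-S8 cell, hypothetical S
}

def weight_fractions(phases):
    zmv = {n: s * z * m * v for n, (s, z, m, v) in phases.items()}
    total = sum(zmv.values())
    return {n: val / total for n, val in zmv.items()}

w = weight_fractions(phases)
print({n: round(f, 3) for n, f in w.items()})
```

The accuracy claim in the abstract (errors generally below 2.0% absolute) then rests on how well each refined structure model, and hence each scale factor, fits the measured pattern.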

  20. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics

    PubMed Central

    Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999–2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604
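The "hub role" claim rests on network indicators such as degree centrality. A minimal sketch, with a toy discipline network whose edges are invented for illustration (in the paper they come from co-occurrence of disciplines in PNAS articles):

```python
# Hub-indicator sketch: degree centrality on a toy interdisciplinarity
# network. Edge list is hypothetical.
edges = [
    ("applied mathematics", "biology"),
    ("applied mathematics", "physics"),
    ("applied mathematics", "economics"),
    ("applied mathematics", "computer science"),
    ("biology", "chemistry"),
    ("physics", "chemistry"),
]

nodes = sorted({v for e in edges for v in e})
degree = {v: 0 for v in nodes}
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

# Degree centrality = degree / (n - 1); the hub has the highest value.
n = len(nodes)
centrality = {v: d / (n - 1) for v, d in degree.items()}
hub = max(centrality, key=centrality.get)
print(hub, round(centrality[hub], 2))
```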

  1. Quantitative morphometric analysis for the tectonic characterisation of northern Tunisia.

    NASA Astrophysics Data System (ADS)

    Camafort, Miquel; Pérez-Peña, José Vicente; Booth-Rea, Guillermo; Ranero, César R.; Gràcia, Eulàlia; Azañón, José Miguel; Melki, Fetheddine; Ouadday, Mohamed

    2016-04-01

    Northern Tunisia is characterized by low deformation rates and low to moderate seismicity. Although instrumental seismicity reaches maximum magnitudes of Mw 5.5, some historical earthquakes have occurred with catastrophic consequences in this region. Aiming to improve our knowledge of active tectonics in Tunisia, we carried out both a quantitative morphometric analysis and field study in the north-western region. We applied different morphometric tools, like river profiles, knickpoint analysis, hypsometric curves and integrals and drainage pattern anomalies in order to differentiate between zones with high or low recent tectonic activity. This analysis helps identify uplift and subsidence zones, which we relate to fault activity. Several active faults in a sparse distribution were identified. A selected sector was studied with a field campaign to test the results obtained with the quantitative analysis. During the fieldwork we identified geological evidence of recent activity and a considerable seismogenic potential along El Alia-Teboursouk (ETF) and Dkhila (DF) faults. The ETF fault could be responsible for one of the most devastating historical earthquakes in northern Tunisia that destroyed Utique in 412 A.D. Geological evidence includes fluvial terraces folded by faults, striated and cracked pebbles, clastic dikes, sand volcanoes, coseismic cracks, etc. Although not reflected in the instrumental seismicity, our results support an important seismic hazard, evidenced by the several active tectonic structures identified and the two seismogenic faults described. After obtaining the current active tectonic framework of Tunisia we discuss our results within the western Mediterranean trying to contribute to the understanding of the western Mediterranean tectonic context. With our results, we suggest that the main reason explaining the sparse and scarce seismicity of the area in contrast with the adjacent parts of the Nubia-Eurasia boundary is due to its extended
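One of the morphometric indices named in the abstract, the hypsometric integral, is simple enough to sketch directly. The elevation grid below is synthetic; the index is HI = (mean − min)/(max − min), with high values commonly read as youthful, actively uplifting relief.

```python
import numpy as np

# Hypsometric-integral sketch on a toy catchment DEM (elevations in m).
rng = np.random.default_rng(3)
dem = rng.uniform(200.0, 800.0, (50, 50))   # hypothetical elevation grid

def hypsometric_integral(z):
    """Elevation-relief ratio approximation of the hypsometric integral."""
    return (z.mean() - z.min()) / (z.max() - z.min())

hi = hypsometric_integral(dem)
print(round(hi, 2))
```

A uniformly distributed surface like this one yields HI near 0.5; real catchments in uplifting zones tend toward higher values, which is how the index helps separate zones of high and low recent tectonic activity.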

  2. Quantitative image analysis in sonograms of the thyroid gland

    NASA Astrophysics Data System (ADS)

    Catherine, Skouroliakou; Maria, Lyra; Aristides, Antoniou; Lambros, Vlahos

    2006-12-01

    High-resolution, real-time ultrasound is a routine examination for assessing the disorders of the thyroid gland. However, the current diagnosis practice is based mainly on qualitative evaluation of the resulting sonograms, therefore depending on the physician's experience. Computerized texture analysis is widely employed in sonographic images of various organs (liver, breast), and it has been proven to increase the sensitivity of diagnosis by providing a better tissue characterization. The present study attempts to characterize thyroid tissue by automatic texture analysis. The texture features that are calculated are based on co-occurrence matrices as they have been proposed by Haralick. The sample consists of 40 patients. For each patient two sonographic images (one for each lobe) are recorded in DICOM format. The lobe is manually delineated in each sonogram, and the co-occurrence matrices for 52 separation vectors are calculated. The texture features extracted from each one of these matrices are: contrast, correlation, energy and homogeneity. Principal component analysis is used to select the optimal set of features. The statistical analysis resulted in the extraction of 21 optimal descriptors. The optimal descriptors are all co-occurrence parameters as the first-order statistics did not prove to be representative of the images characteristics. The larger number of components depends mainly on correlation for very close or very far distances. The results indicate that quantitative analysis of thyroid sonograms can provide an objective characterization of thyroid tissue.
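The grey-level co-occurrence matrix (GLCM) computation behind these features can be sketched for a single separation vector. The tiny 4-level image below stands in for a delineated thyroid lobe; three of the four Haralick-style features named in the abstract are computed with their standard definitions (correlation follows analogously from the marginal means and variances).

```python
import numpy as np

# GLCM sketch for one separation vector (dx, dy), pure NumPy.
def glcm(img, dx, dy, levels):
    m = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                m[img[y, x], img[y2, x2]] += 1
    return m / m.sum()                      # normalize to joint probabilities

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
p = glcm(img, dx=1, dy=0, levels=4)

# Haralick-style features from the normalized matrix.
i, j = np.indices(p.shape)
contrast    = np.sum(p * (i - j) ** 2)
energy      = np.sum(p ** 2)
homogeneity = np.sum(p / (1.0 + np.abs(i - j)))
print(round(contrast, 3), round(energy, 3), round(homogeneity, 3))
```

Repeating this for all 52 separation vectors and both lobes gives the feature set the study feeds into principal component analysis.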

  3. Quantitative characterizations of ultrashort echo (UTE) images for supporting air-bone separation in the head.

    PubMed

    Hsu, Shu-Hui; Cao, Yue; Lawrence, Theodore S; Tsien, Christina; Feng, Mary; Grodzki, David M; Balter, James M

    2015-04-01

    Accurate separation of air and bone is critical for creating synthetic CT from MRI to support Radiation Oncology workflow. This study compares two different ultrashort echo-time sequences in the separation of air from bone, and evaluates post-processing methods that correct intensity nonuniformity of images and account for intensity gradients at tissue boundaries to improve this discriminatory power. CT and MRI scans were acquired on 12 patients under an institution review board-approved prospective protocol. The two MRI sequences tested were ultra-short TE imaging using 3D radial acquisition (UTE), and using pointwise encoding time reduction with radial acquisition (PETRA). Gradient nonlinearity correction was applied to both MR image volumes after acquisition. MRI intensity nonuniformity was corrected by vendor-provided normalization methods, and then further corrected using the N4itk algorithm. To overcome the intensity-gradient at air-tissue boundaries, spatial dilations, from 0 to 4 mm, were applied to threshold-defined air regions from MR images. Receiver operating characteristic (ROC) analyses, by comparing predicted (defined by MR images) versus 'true' regions of air and bone (defined by CT images), were performed with and without residual bias field correction and local spatial expansion. The post-processing corrections increased the areas under the ROC curves (AUC) from 0.944 ± 0.012 to 0.976 ± 0.003 for UTE images, and from 0.850 ± 0.022 to 0.887 ± 0.012 for PETRA images, compared to without corrections. When expanding the threshold-defined air volumes, as expected, sensitivity of air identification decreased with an increase in specificity of bone discrimination, but in a non-linear fashion. A 1 mm air mask expansion yielded AUC increases of 1 and 4% for UTE and PETRA images, respectively. UTE images had significantly greater discriminatory power in separating air from bone than PETRA images. Post-processing strategies improved the
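The ROC/AUC comparison at the heart of this study can be sketched with synthetic stand-in data: "scores" play the role of MR-derived air evidence, and labels mark true air versus bone voxels (in the study, defined by CT). The AUC is computed from the Mann-Whitney rank identity rather than a library call.

```python
import numpy as np

# ROC-AUC sketch for a two-class discrimination problem. The score
# distributions below are hypothetical, not the UTE/PETRA intensities.
rng = np.random.default_rng(1)
air  = rng.normal(2.0, 1.0, 500)    # scores for "air" voxels
bone = rng.normal(0.0, 1.0, 500)    # scores for "bone" voxels

scores = np.concatenate([air, bone])
labels = np.concatenate([np.ones(500), np.zeros(500)])

# AUC = P(score_air > score_bone), estimated from ranks (Mann-Whitney U).
order = scores.argsort()
ranks = np.empty(len(scores))
ranks[order] = np.arange(1, len(scores) + 1)
n_pos = int(labels.sum())
n_neg = len(labels) - n_pos
auc = (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
print(round(auc, 3))
```

Sweeping a mask-dilation radius, as the study does, amounts to moving along one axis of the ROC curve: sensitivity to air falls as specificity for bone rises, and the AUC summarizes the whole trade-off in one number.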

  4. Quantitative characterizations of ultrashort echo (UTE) images for supporting air-bone separation in the head

    NASA Astrophysics Data System (ADS)

    Hsu, Shu-Hui; Cao, Yue; Lawrence, Theodore S.; Tsien, Christina; Feng, Mary; Grodzki, David M.; Balter, James M.

    2015-04-01

    Accurate separation of air and bone is critical for creating synthetic CT from MRI to support Radiation Oncology workflow. This study compares two different ultrashort echo-time sequences in the separation of air from bone, and evaluates post-processing methods that correct intensity nonuniformity of images and account for intensity gradients at tissue boundaries to improve this discriminatory power. CT and MRI scans were acquired on 12 patients under an institution review board-approved prospective protocol. The two MRI sequences tested were ultra-short TE imaging using 3D radial acquisition (UTE), and using pointwise encoding time reduction with radial acquisition (PETRA). Gradient nonlinearity correction was applied to both MR image volumes after acquisition. MRI intensity nonuniformity was corrected by vendor-provided normalization methods, and then further corrected using the N4itk algorithm. To overcome the intensity-gradient at air-tissue boundaries, spatial dilations, from 0 to 4 mm, were applied to threshold-defined air regions from MR images. Receiver operating characteristic (ROC) analyses, by comparing predicted (defined by MR images) versus ‘true’ regions of air and bone (defined by CT images), were performed with and without residual bias field correction and local spatial expansion. The post-processing corrections increased the areas under the ROC curves (AUC) from 0.944 ± 0.012 to 0.976 ± 0.003 for UTE images, and from 0.850 ± 0.022 to 0.887 ± 0.012 for PETRA images, compared to without corrections. When expanding the threshold-defined air volumes, as expected, sensitivity of air identification decreased with an increase in specificity of bone discrimination, but in a non-linear fashion. A 1 mm air mask expansion yielded AUC increases of 1 and 4% for UTE and PETRA images, respectively. UTE images had significantly greater discriminatory power in separating air from bone than PETRA images. Post-processing strategies improved the

  5. Epistasis analysis for quantitative traits by functional regression model.

    PubMed

    Zhang, Futao; Boerwinkle, Eric; Xiong, Momiao

    2014-06-01

    The critical barrier in interaction analysis for rare variants is that most traditional statistical methods for testing interactions were originally designed for testing the interaction between common variants and are difficult to apply to rare variants because of their prohibitive computational time and poor ability. The great challenges for successful detection of interactions with next-generation sequencing (NGS) data are (1) lack of methods for interaction analysis with rare variants, (2) severe multiple testing, and (3) time-consuming computations. To meet these challenges, we shift the paradigm of interaction analysis between two loci to interaction analysis between two sets of loci or genomic regions and collectively test interactions between all possible pairs of SNPs within two genomic regions. In other words, we take a genome region as a basic unit of interaction analysis and use high-dimensional data reduction and functional data analysis techniques to develop a novel functional regression model to collectively test interactions between all possible pairs of single nucleotide polymorphisms (SNPs) within two genome regions. By intensive simulations, we demonstrate that the functional regression models for interaction analysis of the quantitative trait have the correct type 1 error rates and a much better ability to detect interactions than the current pairwise interaction analysis. The proposed method was applied to exome sequence data from the NHLBI's Exome Sequencing Project (ESP) and CHARGE-S study. We discovered 27 pairs of genes showing significant interactions after applying the Bonferroni correction (P-values < 4.58 × 10(-10)) in the ESP, and 11 were replicated in the CHARGE-S study.

  6. Functional Linear Models for Association Analysis of Quantitative Traits

    PubMed Central

    Fan, Ruzong; Wang, Yifan; Mills, James L.; Wilson, Alexander F.; Bailey-Wilson, Joan E.; Xiong, Momiao

    2014-01-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. PMID:24130119
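The core of the fixed-effect functional linear model can be sketched compactly: smooth each individual's genotypes over the region onto a small basis, then F-test the basis coefficients against the trait. Everything below is simulated for illustration (positions, allele frequencies, effect sizes); a Fourier basis stands in for whichever basis the method uses in practice.

```python
import numpy as np

# Functional linear model sketch for a quantitative trait.
rng = np.random.default_rng(2)
n, m = 200, 30
pos = np.sort(rng.uniform(0, 1, m))          # variant positions in region
maf = rng.uniform(0.05, 0.5, m)              # hypothetical allele freqs
X = rng.binomial(2, maf, size=(n, m)).astype(float)   # genotypes 0/1/2

# Fourier basis evaluated at the variant positions (k = 3 functions);
# each row of C summarizes one individual's genotype "function".
B = np.column_stack([np.ones(m), np.cos(2 * np.pi * pos),
                     np.sin(2 * np.pi * pos)])
C = X @ B / m                                # functional covariates

beta = np.array([3.0, -2.0, 2.0])            # simulated genetic effects
y = C @ beta + rng.normal(0, 0.5, n)         # simulated quantitative trait

# F test: full model (intercept + functional covariates) vs intercept only.
Z = np.column_stack([np.ones(n), C])
rss1 = np.sum((y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]) ** 2)
rss0 = np.sum((y - y.mean()) ** 2)
k = C.shape[1]
F = ((rss0 - rss1) / k) / (rss1 / (n - k - 1))
print(round(F, 1))
```

The dimension reduction is the point: the test has k degrees of freedom regardless of how many variants (rare or common) fall in the region, which is where the power advantage over variant-by-variant testing comes from.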

  7. Fractal Spectrum Technique for Quantitative Analysis of Volcanic Particle Shapes

    NASA Astrophysics Data System (ADS)

    Maria, A. H.; Carey, S. N.

    2001-12-01

    The shapes of volcanic particles reflect numerous eruptive parameters (e.g. magma viscosity, volatile content, degree of interaction with water) and are useful for understanding fragmentation and transport processes associated with volcanic eruptions. However, quantitative analysis of volcanic particle shapes has proven difficult due to their morphological complexity and variability. Shape analysis based on fractal geometry has been successfully applied to a wide variety of particles and appears to be well suited for describing complex features. The technique developed and applied to volcanic particles in this study uses fractal data produced by dilation of the 2-D particle boundary to produce a full spectrum of fractal dimensions over a range of scales for each particle. Multiple fractal dimensions, which can be described as a fractal spectrum curve, are calculated by taking the first derivative of data points on a standard Richardson plot. Quantitative comparisons are carried out using multivariate statistical techniques such as cluster and principal components analysis. Compared with previous fractal methods that express shape in terms of only one or two fractal dimensions, use of multiple fractal dimensions results in more effective discrimination between samples. In addition, the technique eliminates the subjectivity associated with selecting linear segments on Richardson plots for fractal dimension calculation, and allows direct comparison of particles as long as instantaneous dimensions used as input to multivariate analyses are selected at the same scales for each particle. Applications to samples from well documented eruptions (e.g. Mt. St. Helens, Tambora, Surtsey) indicate that the fractal spectrum technique provides a useful means of characterizing volcanic particles and can be helpful for identifying the products of specific fragmentation processes (volatile exsolution, phreatomagmatic, quench granulation) and modes of volcanic deposition (tephra fall
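The scale-dependent dimension idea can be illustrated with box counting on a synthetic Koch curve (theoretical dimension log 4 / log 3 ≈ 1.26), standing in for a digitized particle outline. The abstract's "fractal spectrum" is the local derivative along such a log-log plot; for brevity this sketch fits a single overall slope.

```python
import numpy as np

# Box-counting sketch on a Koch curve, a stand-in for a particle boundary.
def koch_curve(depth):
    """Vertices of a Koch curve as complex numbers."""
    pts = np.array([0.0 + 0j, 1.0 + 0j])
    rot = np.exp(-1j * np.pi / 3)
    for _ in range(depth):
        a, b = pts[:-1], pts[1:]
        d = (b - a) / 3
        pts = np.column_stack([a, a + d, a + d + d * rot, a + 2 * d]).ravel()
        pts = np.append(pts, b[-1])
    return pts

pts = koch_curve(6)
xy = np.column_stack([pts.real, pts.imag - pts.imag.min()])

# Count occupied boxes at several scales.
sizes = np.array([1 / 8, 1 / 16, 1 / 32, 1 / 64, 1 / 128])
counts = []
for s in sizes:
    boxes = set(map(tuple, np.floor(xy / s).astype(int)))
    counts.append(len(boxes))

# Slope of log N(s) against log(1/s) estimates the fractal dimension.
D = np.polyfit(np.log(1 / sizes), np.log(counts), 1)[0]
print(round(D, 2))
```

The paper's dilation method replaces boxes with disk dilation of the boundary, but both reduce to the slope (or, for the spectrum, the local derivative) of a Richardson-style log-log plot.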

  8. Functional linear models for association analysis of quantitative traits.

    PubMed

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study.

  9. The Quantitative Analysis of Chennai Automotive Industry Cluster

    NASA Astrophysics Data System (ADS)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai, also called the Detroit of India, hosts an automotive industry producing over 40 % of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai faced problems with infrastructure, technology, procurement, production and marketing. The objective is to study the Quantitative Performance of the Chennai Automotive Industry Cluster before (2001-2002) and after the CDA (2008-2009). The methodology adopted is collection of primary data from 100 ACI using a quantitative questionnaire and analysis using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals that there is a high degree of relationship between the variables studied. The RA models constructed establish the strong relationship between the dependent variable and a host of independent variables. The models proposed here reveal the approximate relationship in a closer form. KWT proves that there is no significant difference between the three location clusters with respect to: Net Profit, Production Cost, Marketing Costs, Procurement Costs and Gross Output. This supports that each location has contributed to the development of the automobile component cluster uniformly. The FMT proves that there is no significant difference between industrial units in respect of costs like Production, Infrastructure, Technology, Marketing and Net Profit. To conclude, the Automotive Industries have fully utilized the Physical Infrastructure and Centralised Facilities by adopting CDA and now export their products to North America, South America, Europe, Australia, Africa and Asia. The value chain analysis models have been implemented in all the cluster units. This Cluster Development Approach (CDA) model can be implemented in industries of under developed and developing countries for cost reduction and productivity

  10. Time-series analysis for determining vertical air permeability in unsaturated zones

    SciTech Connect

    Lu, N.

    1999-01-01

    The air pressure in the unsaturated subsurface changes dynamically as the barometric pressure varies with time. Depending on the material properties and boundary conditions, the intensity of the correlation between the atmospheric and subsurface pressures may be evidenced in two persistent patterns: (1) the amplitude attenuation; and (2) the phase lag for the principal modes, such as the diurnal, semidiurnal, and 8-h tides. The amplitude attenuation and the phase lag generally depend on properties that can be classified into two categories: (1) The barometric pressure parameters, such as the apparent pressure amplitudes and frequencies controlled by the atmospheric tides and others; and (2) the material properties of porous media, such as the air viscosity, air-filled porosity, and permeability. Based on the principle of superposition and a Fourier time-series analysis, an analytical solution for predicting the subsurface air pressure variation caused by the atmospheric pressure fluctuation is presented. The air permeability (or pneumatic diffusivity) can be quantitatively determined by using the calculated amplitude attenuations (or phase lags) and the appropriate analytical relations among the parameters of the atmosphere and the porous medium. An analysis using the field data shows that the Fourier time-series analysis may provide a potentially reliable and simple method for predicting the subsurface barometric pressure variation and for determining the air permeability of unsaturated zones.
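As a sketch of the amplitude-attenuation and phase-lag extraction described in the abstract, the snippet below pulls the diurnal mode out of two synthetic pressure records with a discrete Fourier transform. All signal parameters (amplitudes, the 0.6 attenuation, the 0.5 rad lag) are invented for illustration:

```python
import numpy as np

def mode_amplitude_phase(signal, dt, freq):
    """Amplitude and phase of one Fourier mode (e.g. the diurnal tide)."""
    n = len(signal)
    spec = np.fft.rfft(signal - signal.mean())
    freqs = np.fft.rfftfreq(n, d=dt)
    k = np.argmin(np.abs(freqs - freq))   # bin closest to the target mode
    amp = 2 * np.abs(spec[k]) / n
    phase = np.angle(spec[k])
    return amp, phase

# Synthetic example: a surface (barometric) record with a diurnal mode, and a
# subsurface response attenuated by 0.6 and lagging by 0.5 rad.
dt = 3600.0                      # 1-hour sampling interval [s]
t = np.arange(30 * 24) * dt      # 30 days of data
f_diurnal = 1.0 / 86400.0        # diurnal frequency [Hz]
surface = 100.0 * np.cos(2 * np.pi * f_diurnal * t)
subsurface = 60.0 * np.cos(2 * np.pi * f_diurnal * t - 0.5)

a_s, p_s = mode_amplitude_phase(surface, dt, f_diurnal)
a_z, p_z = mode_amplitude_phase(subsurface, dt, f_diurnal)
attenuation = a_z / a_s          # recovers ~0.6
phase_lag = p_s - p_z            # recovers ~0.5 rad
```

With field data, the recovered attenuation ratio and phase difference for each principal mode would then be substituted into the analytical relations to estimate the air permeability or pneumatic diffusivity.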

  11. Bayesian robust analysis for genetic architecture of quantitative traits

    PubMed Central

    Yang, Runqing; Wang, Xin; Li, Jian; Deng, Hongwen

    2009-01-01

    Motivation: In most quantitative trait locus (QTL) mapping studies, phenotypes are assumed to follow normal distributions. Deviations from this assumption may affect the accuracy of QTL detection and lead to detection of spurious QTLs. To improve the robustness of QTL mapping methods, we replaced the normal distribution for residuals in multiple interacting QTL models with the normal/independent distributions, a class of symmetric and long-tailed distributions able to accommodate residual outliers. Subsequently, we developed a Bayesian robust analysis strategy for dissecting the genetic architecture of quantitative traits and for mapping genome-wide interacting QTLs in line crosses. Results: Through computer simulations, we showed that our strategy had a similar power for QTL detection compared with traditional methods assuming normally distributed traits, but had a substantially increased power for non-normal phenotypes. When this strategy was applied to a group of traits associated with physical/chemical characteristics and quality in rice, more main and epistatic QTLs were detected than by traditional Bayesian model analyses under the normal assumption. Contact: runqingyang@sjtu.edu.cn; dengh@umkc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:18974168

  12. Quantitative analysis of the polarization characteristics of atherosclerotic plaques

    NASA Astrophysics Data System (ADS)

    Gubarkova, Ekaterina V.; Kirillin, Michail Y.; Dudenkova, Varvara V.; Kiseleva, Elena B.; Moiseev, Alexander A.; Gelikonov, Grigory V.; Timofeeva, Lidia B.; Fiks, Ilya I.; Feldchtein, Felix I.; Gladkova, Natalia D.

    2016-04-01

    In this study we demonstrate the capability of cross-polarization optical coherence tomography (CP OCT) to assess the condition of collagen and elastin fibers in atherosclerotic plaques based on the ratio of the OCT signal levels in cross- and co-polarizations. We consider the depolarization factor (DF) and the effective birefringence (Δn) as quantitative characteristics of CP OCT images. We revealed that calculation of both DF and Δn in the region of interest (fibrous cap) yields a statistically significant difference between stable and unstable plaques (0.46±0.21 vs 0.09±0.04 for DF; (4.7±1.0)×10⁻⁴ vs (2.5±0.7)×10⁻⁴ for Δn; p<0.05). In parallel with CP OCT we used nonlinear microscopy for analysis of thin cross-sections of atherosclerotic plaque, revealing different average isotropy indices of collagen and elastin fibers for stable and unstable plaques (0.30±0.10 vs 0.70±0.08; p<0.001). The proposed approach for quantitative assessment of CP OCT images allows cross-scattering and birefringence characterization of stable and unstable atherosclerotic plaques.

  13. Quantitative analysis of multiple sclerosis: a feasibility study

    NASA Astrophysics Data System (ADS)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, measurements of white matter (WM), gray matter (GM), and cerebrospinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) the MR data are modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) a new partial volume (PV) model is built into the maximum a posteriori (MAP) segmentation scheme; 3) noise artifacts are minimized by an a priori Markov random field (MRF) penalty reflecting neighborhood correlations of the tissue mixtures. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.

  14. Quantitative analysis of incipient mineral loss in hard tissues

    NASA Astrophysics Data System (ADS)

    Matvienko, Anna; Mandelis, Andreas; Hellen, Adam; Jeon, Raymond; Abrams, Stephen; Amaechi, Bennett

    2009-02-01

    A coupled diffuse-photon-density-wave and thermal-wave theoretical model was developed to describe biothermophotonic phenomena in multi-layered hard tissue structures. Photothermal radiometry was applied as a safe, non-destructive, and highly sensitive tool for the detection of early tooth enamel demineralization to test the theory. An extracted human tooth was treated sequentially with an artificial demineralization gel to simulate controlled mineral loss in the enamel. The experimental setup included a semiconductor laser (659 nm, 120 mW) as the source of the photothermal signal. Modulated laser light generated infrared blackbody radiation from teeth upon absorption and nonradiative energy conversion. The infrared flux emitted by the treated region of the tooth surface and sub-surface was monitored with an infrared detector, both before and after treatment. Frequency scans with a laser beam size of 3 mm were performed in order to guarantee one-dimensionality of the photothermal field. TMR (transverse microradiography) images showed clear differences between sound and demineralized enamel; however, this technique is destructive. Dental radiographs did not indicate any changes. The photothermal signal showed a clear change even after 1 min of gel treatment. From the fittings, thermal and optical properties of sound and demineralized enamel were obtained, which allowed for quantitative differentiation of healthy and non-healthy regions. In conclusion, the developed model was shown to be a promising tool for non-invasive quantitative analysis of early demineralization of hard tissues.

  15. Application of Synchrotron-XRF to Quantitative Elemental Aerosol Analysis

    NASA Astrophysics Data System (ADS)

    Cliff, S. S.; Perry, K. D.; Jimenez-Cruz, M. P.; Cahill, T. A.

    2001-12-01

    Recent advances in synchrotron x-ray fluorescence (s-XRF) analysis of atmospheric particulate matter have improved elemental sensitivity, quantification, and time resolution. Analysis of both filter- and impactor-based aerosol samples has yielded quantitative data for elements Na-U, if present, in ambient aerosols. The increased sensitivity allows higher time resolution through either smaller spatial analysis of time-resolved impactor samples or shorter sample time-integration using filter-based samplers. Of particular interest is the application of s-XRF to aerodynamically sized rotating-substrate impactor samples. These samplers, 8- and 3-stage DRUM impactors, aerodynamically size-classify particles into 8 or 3 categories, respectively. In addition, the rotating substrate allows time-resolved analysis of samples with little or no loss in elemental sensitivity. The s-XRF analyses are performed on Beamline 10.3.1 at the Advanced Light Source, Lawrence Berkeley National Laboratory (ALS-LBL). Beamline 10.3.1, originally designed for materials analysis, has been supplemented with aerosol analysis capability for several substrate options. Typical analysis involves Teflon filters or Mylar impaction substrates. The newly formed Participating Research Team (PRT) for Beamline 10.3.1 encompasses both global climate and materials science research. The s-XRF capabilities of Beamline 10.3.1 are now available to PRT researchers and independent investigators through a proposal process to the ALS. The technology, its application to aerosol research and monitoring, and the availability of the facility to the aerosol research community will be presented.

  16. Quantitative analysis of the reconstruction performance of interpolants

    NASA Technical Reports Server (NTRS)

    Lansing, Donald L.; Park, Stephen K.

    1987-01-01

    The analysis presented provides a quantitative measure of the reconstruction or interpolation performance of linear, shift-invariant interpolants. The performance criterion is the mean square error of the difference between the sampled and reconstructed functions. The analysis is applicable to reconstruction algorithms used in image processing and to many types of splines used in numerical analysis and computer graphics. When formulated in the frequency domain, the mean square error clearly separates the contribution of the interpolation method from the contribution of the sampled data. The equations provide a rational basis for selecting an optimal interpolant; that is, one which minimizes the mean square error. The analysis has been applied to a selection of frequently used data splines and reconstruction algorithms: parametric cubic and quintic Hermite splines, exponential and nu splines (including the special case of the cubic spline), parametric cubic convolution, Keys' fourth-order cubic, and a cubic with a discontinuous first derivative. The emphasis in this paper is on the image-dependent case in which no a priori knowledge of the frequency spectrum of the sampled function is assumed.
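A minimal numerical illustration of the performance criterion in this abstract: sample a smooth test function, reconstruct it with two simple linear, shift-invariant interpolants, and compare mean square reconstruction errors. The paper's analysis is carried out in the frequency domain; this time-domain sketch only demonstrates the metric itself, with an invented test function:

```python
import numpy as np

def f(x):
    """Band-limited test function (chosen arbitrarily for illustration)."""
    return np.sin(2 * np.pi * x) + 0.5 * np.sin(6 * np.pi * x)

xs = np.linspace(0, 1, 33)       # sample grid
xf = np.linspace(0, 1, 1024)     # dense grid for evaluating the error
samples = f(xs)

# Two reconstructions: piecewise-linear vs nearest-neighbor (sample-and-hold).
linear = np.interp(xf, xs, samples)
nearest = samples[np.abs(xf[:, None] - xs[None, :]).argmin(axis=1)]

# Mean square error between the reconstructed and true functions.
mse_linear = float(np.mean((linear - f(xf)) ** 2))
mse_nearest = float(np.mean((nearest - f(xf)) ** 2))
```

For this smooth signal the higher-order interpolant wins (mse_linear < mse_nearest), which is exactly the kind of comparison the frequency-domain formulation makes rigorous and data-independent.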

  17. [Simultaneous quantitative analysis of multielements in Al alloy samples by laser-induced breakdown spectroscopy].

    PubMed

    Sun, Lan-Xiang; Yu, Hai-Bin

    2009-12-01

    The multielement composition of aluminium alloy samples was quantified using laser-induced breakdown spectroscopy (LIBS). An Nd:YAG pulsed laser was used to produce plasma in ambient air. The spectral range of 200-980 nm was obtained simultaneously through a multichannel grating spectrometer and CCD detectors. The authors studied the influence of time delay, laser energy, and the depth profile of elements in the samples on spectral intensity, and optimized the experimental parameters based on this analysis. With the optimal experimental parameters, calibration curves were constructed from four certified aluminum alloy samples for eight elements (Si, Fe, Cu, Mn, Mg, Zn, Sn, and Ni), and the composition of an aluminum sample was quantified. The maximum relative standard deviation (RSD) obtained was 5.89%, and relative errors ranged from -20.99% to 15%. Experimental results show that LIBS is an effective technique for quantitative analysis of aluminum alloy samples, though further improvement in the accuracy of the quantitative analysis is needed.
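The calibration-curve step used here can be sketched as an ordinary least-squares line relating emission-line intensity to certified concentration, inverted to quantify an unknown sample. All numbers below are invented for illustration, not taken from the paper:

```python
import numpy as np

# Hypothetical calibration data for one element: line intensity measured in
# four certified reference samples of known concentration.
conc = np.array([0.5, 1.0, 2.0, 4.0])               # wt%
intensity = np.array([120.0, 230.0, 450.0, 890.0])  # arbitrary units

# Fit the calibration line intensity = slope * conc + intercept.
slope, intercept = np.polyfit(conc, intensity, 1)

def predict_conc(i):
    """Invert the calibration line to quantify an unknown sample."""
    return (i - intercept) / slope

c_unknown = float(predict_conc(340.0))   # concentration of an unknown, wt%
```

In practice each element gets its own calibration curve, and figures of merit such as RSD and relative error (as reported in the abstract) are computed against the certified values.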

  18. Quantitative genetic analysis of injury liability in infants and toddlers

    SciTech Connect

    Phillips, K.; Matheny, A.P. Jr.

    1995-02-27

    A threshold model of latent liability was applied to infant and toddler twin data on total count of injuries sustained during the interval from birth to 36 months of age. A quantitative genetic analysis of estimated twin correlations in injury liability indicated strong genetic dominance effects, but no additive genetic variance was detected. Because interpretations involving overdominance have little research support, the results may be due to low order epistasis or other interaction effects. Boys had more injuries than girls, but this effect was found only for groups whose parents were prompted and questioned in detail about their children's injuries. Activity and impulsivity are two behavioral predictors of childhood injury, and the results are discussed in relation to animal research on infant and adult activity levels, and impulsivity in adult humans. Genetic epidemiological approaches to childhood injury should aid in targeting higher risk children for preventive intervention. 30 refs., 4 figs., 3 tabs.
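For orientation, the classical Falconer-style decomposition of twin correlations is sketched below. The study itself fits a threshold liability model, so this is only the back-of-envelope version, and the correlation values are hypothetical. The ADE variant applies when the MZ correlation exceeds twice the DZ correlation, which is the signature of dominance reported in the abstract:

```python
def ace(r_mz, r_dz):
    """ACE model: additive genetic (a2), common environment (c2),
    unique environment (e2) from MZ and DZ twin correlations."""
    a2 = 2 * (r_mz - r_dz)
    c2 = 2 * r_dz - r_mz
    e2 = 1 - r_mz
    return a2, c2, e2

def ade(r_mz, r_dz):
    """ADE model: additive (a2), dominance (d2), unique environment (e2).
    Indicated when r_mz > 2 * r_dz, i.e. dominance rather than shared
    environment explains the twin resemblance."""
    a2 = 4 * r_dz - r_mz
    d2 = 2 * r_mz - 4 * r_dz
    e2 = 1 - r_mz
    return a2, d2, e2

# Hypothetical correlations with r_mz > 2 * r_dz: dominance, no additive
# variance, matching the qualitative pattern the abstract describes.
a2, d2, e2 = ade(r_mz=0.48, r_dz=0.12)
```

Under these invented numbers the additive component vanishes (a2 = 0) while the dominance component carries the genetic variance (d2 = 0.48), mirroring the paper's "strong dominance, no additive variance" finding.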

  19. Preparation of Buffers. An Experiment for Quantitative Analysis Laboratory

    NASA Astrophysics Data System (ADS)

    Buckley, P. T.

    2001-10-01

    In our experience, students who have a solid grounding in the theoretical aspects of buffers, buffer preparation, and buffering capacity are often at a loss when required to actually prepare a buffer in a research setting. However, there are very few published laboratory experiments pertaining to buffers. This laboratory experiment for the undergraduate quantitative analysis lab gives students hands-on experience in the preparation of buffers. By preparing a buffer to a randomly chosen pH value and comparing the theoretical pH to the actual pH, students apply their theoretical understanding of the Henderson-Hasselbalch equation, activity coefficients, and the effect of adding acid or base to a buffer. This experiment gives students experience in buffer preparation for research situations and helps them in advanced courses such as biochemistry where a fundamental knowledge of buffer systems is essential.
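The pH calculation students would check against the meter follows directly from the Henderson-Hasselbalch equation. The sketch below ignores activity corrections, which the experiment itself asks students to consider when comparing theoretical and measured pH; the acetate pKa is the textbook value:

```python
import math

def buffer_ph(pka, conc_acid, conc_base):
    """Henderson-Hasselbalch: pH = pKa + log10([A-] / [HA])."""
    return pka + math.log10(conc_base / conc_acid)

def ratio_for_ph(pka, target_ph):
    """Conjugate base / acid ratio needed to hit a target pH."""
    return 10 ** (target_ph - pka)

# Acetate buffer (pKa ~ 4.76): equimolar acid and base give pH = pKa.
ph = buffer_ph(4.76, 0.10, 0.10)   # 0.10 M each -> pH 4.76
r = ratio_for_ph(4.76, 5.0)        # ratio needed for a pH 5.0 buffer
```

Comparing this ideal-solution prediction with the measured pH is precisely where activity coefficients enter the discussion.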

  20. Quantitative Image Analysis of HIV-1 Infection in Lymphoid Tissue

    NASA Astrophysics Data System (ADS)

    Haase, Ashley T.; Henry, Keith; Zupancic, Mary; Sedgewick, Gerald; Faust, Russell A.; Melroe, Holly; Cavert, Winston; Gebhard, Kristin; Staskus, Katherine; Zhang, Zhi-Qiang; Dailey, Peter J.; Balfour, Henry H., Jr.; Erice, Alejo; Perelson, Alan S.

    1996-11-01

    Tracking human immunodeficiency virus-type 1 (HIV-1) infection at the cellular level in tissue reservoirs provides opportunities to better understand the pathogenesis of infection and to rationally design and monitor therapy. A quantitative technique was developed to determine viral burden in two important cellular compartments in lymphoid tissues. Image analysis and in situ hybridization were combined to show that in the presymptomatic stages of infection there is a large, relatively stable pool of virions on the surfaces of follicular dendritic cells and a smaller pool of productively infected cells. Despite evidence of constraints on HIV-1 replication in the infected cell population in lymphoid tissues, estimates of the numbers of these cells and the virus they could produce are consistent with the quantities of virus that have been detected in the bloodstream. The cellular sources of virus production and storage in lymphoid tissues can now be studied with this approach over the course of infection and treatment.

  1. Quantitative analysis of forest island pattern in selected Ohio landscapes

    SciTech Connect

    Bowen, G.W.; Burgess, R.L.

    1981-07-01

    The purpose of this study was to quantitatively describe the various aspects of regional distribution patterns of forest islands and relate those patterns to other landscape features. Several maps showing the forest cover of various counties in Ohio were selected as representative examples of forest patterns to be quantified. Ten thousand hectare study areas (landscapes) were delineated on each map. A total of 15 landscapes representing a wide variety of forest island patterns was chosen. Data were converted into a series of continuous variables which contained information pertinent to the sizes, shapes, numbers, and spacing of woodlots within a landscape. The continuous variables were used in a factor analysis to describe the variation among landscapes in terms of forest island pattern. The results showed that forest island patterns are related to topography and other environmental features correlated with topography.

  2. Quantitative analysis of gallstones using laser-induced breakdown spectroscopy.

    PubMed

    Singh, Vivek K; Singh, Vinita; Rai, Awadhesh K; Thakur, Surya N; Rai, Pradeep K; Singh, Jagdish P

    2008-11-01

    The utility of laser-induced breakdown spectroscopy (LIBS) for categorizing different types of gallbladder stone has been demonstrated by analyzing their major and minor constituents. LIBS spectra of three types of gallstone have been recorded in the 200-900 nm spectral region. Calcium is found to be the major element in all types of gallbladder stone. The spectrophotometric method has been used to classify the stones. A calibration-free LIBS method has been used for the quantitative analysis of metal elements, and the results have been compared with those obtained from inductively coupled plasma atomic emission spectroscopy (ICP-AES) measurements. The single-shot LIBS spectra from different points on the cross section (in steps of 0.5 mm from one end to the other) of gallstones have also been recorded to study the variation of constituents from the center to the surface. The presence of different metal elements and their possible role in gallstone formation is discussed.

  3. Large-Scale Quantitative Analysis of Painting Arts

    NASA Astrophysics Data System (ADS)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increment of the roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances.

  4. Large-Scale Quantitative Analysis of Painting Arts

    PubMed Central

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-01-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images – the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increment of the roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances. PMID:25501877
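One plausible reading of the "variety of colors" measure is the Shannon entropy of the color histogram; the paper's exact definition may differ, and the toy palettes below are invented to show the idea of a limited medieval-style palette scoring lower than a richer one:

```python
import math
from collections import Counter

def color_entropy(pixels):
    """Shannon entropy (bits) of the distribution of distinct colors."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Two toy 16-pixel "images": a limited two-color palette vs 16 distinct colors.
limited = [(200, 180, 40)] * 12 + [(90, 30, 20)] * 4
rich = [(i * 16, 255 - i * 16, (i * 37) % 256) for i in range(16)]

h_limited = color_entropy(limited)   # low variety
h_rich = color_entropy(rich)         # high variety (log2(16) = 4 bits)
```

On real paintings one would quantize the color space first so that near-identical shades count as the same bin.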

  5. Quantitative analysis of gallstones using laser-induced breakdown spectroscopy

    SciTech Connect

    Singh, Vivek K.; Singh, Vinita; Rai, Awadhesh K.; Thakur, Surya N.; Rai, Pradeep K.; Singh, Jagdish P

    2008-11-01

    The utility of laser-induced breakdown spectroscopy (LIBS) for categorizing different types of gallbladder stone has been demonstrated by analyzing their major and minor constituents. LIBS spectra of three types of gallstone have been recorded in the 200-900 nm spectral region. Calcium is found to be the major element in all types of gallbladder stone. The spectrophotometric method has been used to classify the stones. A calibration-free LIBS method has been used for the quantitative analysis of metal elements, and the results have been compared with those obtained from inductively coupled plasma atomic emission spectroscopy (ICP-AES) measurements. The single-shot LIBS spectra from different points on the cross section (in steps of 0.5 mm from one end to the other) of gallstones have also been recorded to study the variation of constituents from the center to the surface. The presence of different metal elements and their possible role in gallstone formation is discussed.

  6. Quantitative multielement analysis using high energy particle bombardment

    NASA Technical Reports Server (NTRS)

    Clark, P. J.; Neal, G. F.; Allen, R. O.

    1974-01-01

    Charged particles ranging in energy from 0.8 to 4.0 MeV are used to induce resonant nuclear reactions, Coulomb excitation (gamma rays), and X-ray emission in both thick and thin targets. Quantitative analysis is possible for elements from Li to Pb in complex environmental samples, although the matrix can severely reduce the sensitivity. It is necessary to use a comparator technique for the gamma-rays, while for X-rays an internal standard can be used. A USGS standard rock is analyzed for a total of 28 elements. Water samples can be analyzed either by nebulizing the sample doped with Cs or Y onto a thin formvar film or by extracting the sample (with or without an internal standard) onto ion exchange resin which is pressed into a pellet.

  7. Mass spectrometry-based quantitative analysis and biomarker discovery.

    PubMed

    Suzuki, Naoto

    2011-01-01

    Mass spectrometry-based quantitative analysis and biomarker discovery using a metabolomics approach represent one of the major platforms in clinical fields, including prognosis and diagnosis, assessment of severity and response to therapy in a number of clinical disease states, as well as therapeutic drug monitoring (TDM). This review first summarizes our mass spectrometry-based research strategy and some results on the relationship between cysteinyl leukotriene (cysLT), thromboxane (TX), 12-hydroxyeicosatetraenoic acid (12-HETE) and other metabolites of arachidonic acid and diseases such as atopic dermatitis, rheumatoid arthritis and diabetes mellitus. For the purpose of evaluating the role of these metabolites of arachidonic acid in disease status, we have developed sensitive determination methods with simple solid-phase extraction and applied them in clinical settings. In addition to these endogenous compounds, using mass spectrometry, we have developed practically applicable quantitative methods for TDM. A representative example was a method of TDM for sirolimus, an immunosuppressant agent for recipients of organ transplants, which requires rigorous monitoring of blood levels. As we recognized great potential in mass spectrometry during these studies, we became interested in metabolomics as the non-targeted analysis of metabolites. Now, an established strategy for metabolomics investigation applies to samples from cells, animals and humans to separate groups based on altered patterns of metabolites in biological fluids and to identify metabolites as potential biomarkers discriminating groups. We would be honored if our research using mass spectrometry would contribute useful information to the field of medical pharmacy. PMID:21881303

  8. A Novel Quantitative Approach to Concept Analysis: The Internomological Network

    PubMed Central

    Cook, Paul F.; Larsen, Kai R.; Sakraida, Teresa J.; Pedro, Leli

    2012-01-01

    Background When a construct such as patients’ transition to self-management of chronic illness is studied by researchers across multiple disciplines, the meaning of key terms can become confused. This results from inherent problems in language where a term can have multiple meanings (polysemy) and different words can mean the same thing (synonymy). Objectives To test a novel quantitative method for clarifying the meaning of constructs by examining the similarity of published contexts in which they are used. Method Published terms related to the concept transition to self-management of chronic illness were analyzed using the internomological network (INN), a type of latent semantic analysis that calculates the mathematical relationships between constructs based on the contexts in which researchers use each term. This novel approach was tested by comparing results to those from concept analysis, a best-practice qualitative approach to clarifying meanings of terms. By comparing results of the two methods, the best synonyms of transition to self-management, as well as key antecedent, attribute, and consequence terms, were identified. Results Results from INN analysis were consistent with those from concept analysis. The potential synonyms self-management, transition, and adaptation had the greatest utility. Adaptation was the clearest overall synonym, but had lower cross-disciplinary use. The terms coping and readiness had more circumscribed meanings. The INN analysis confirmed key features of transition to self-management, and suggested related concepts not found by the previous review. Discussion The INN analysis is a promising novel methodology that allows researchers to quantify the semantic relationships between constructs. The method works across disciplinary boundaries, and may help to integrate the diverse literature on self-management of chronic illness. PMID:22592387
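The core latent-semantic-analysis step underlying a method like the INN can be sketched as a truncated SVD of a term-by-context matrix followed by cosine similarity between term vectors. The matrix below is an invented toy, not the authors' corpus, and the term list is only illustrative:

```python
import numpy as np

# Toy term-by-context count matrix (rows: terms, columns: published contexts).
terms = ["transition", "self-management", "adaptation", "coping"]
X = np.array([
    [3, 0, 2, 1],
    [2, 1, 2, 0],
    [2, 0, 1, 1],
    [0, 3, 0, 2],
], dtype=float)

# Latent semantic analysis: a truncated SVD projects terms into a low-rank
# semantic space where co-occurrence patterns, not raw counts, drive similarity.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
term_vecs = U[:, :k] * s[:k]    # term coordinates in the k-dim latent space

def cos(a, b):
    """Cosine similarity between two term vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

sim = cos(term_vecs[0], term_vecs[2])   # e.g. "transition" vs "adaptation"
```

Ranking such pairwise similarities across a real corpus is what lets the method propose synonyms and related concepts quantitatively.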

  9. Quantitative analysis of protein-ligand interactions by NMR.

    PubMed

    Furukawa, Ayako; Konuma, Tsuyoshi; Yanaka, Saeko; Sugase, Kenji

    2016-08-01

    Protein-ligand interactions have been commonly studied through static structures of the protein-ligand complex. Recently, however, there has been increasing interest in investigating the dynamics of protein-ligand interactions both for fundamental understanding of the underlying mechanisms and for drug development. NMR is a versatile and powerful tool, especially because it provides site-specific quantitative information. NMR has widely been used to determine the dissociation constant (KD), in particular, for relatively weak interactions. The simplest NMR method is a chemical-shift titration experiment, in which the chemical-shift changes of a protein in response to ligand titration are measured. There are other quantitative NMR methods, but they mostly apply only to interactions in the fast-exchange regime. These methods derive the dissociation constant from population-averaged NMR quantities of the free and bound states of a protein or ligand. In contrast, the recent advent of new relaxation-based experiments, including R2 relaxation dispersion and ZZ-exchange, has enabled us to obtain kinetic information on protein-ligand interactions in the intermediate- and slow-exchange regimes. Based on R2 dispersion or ZZ-exchange, methods that can determine the association rate, kon, dissociation rate, koff, and KD have been developed. In these approaches, R2 dispersion or ZZ-exchange curves are measured for multiple samples with different protein and/or ligand concentration ratios, and the relaxation data are fitted to theoretical kinetic models. It is critical to choose an appropriate kinetic model, such as the two- or three-state exchange model, to derive the correct kinetic information. The R2 dispersion and ZZ-exchange methods are suitable for the analysis of protein-ligand interactions with a micromolar or sub-micromolar dissociation constant but not for very weak interactions, which are typical in very fast exchange. This contrasts with the population-averaged NMR methods described above, which are best suited to interactions in the fast-exchange regime.
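For the fast-exchange case, the chemical-shift titration analysis reduces to fitting a binding isotherm to the observed shift changes. The sketch below assumes large ligand excess (so the simple hyperbolic isotherm applies rather than the full quadratic binding equation), uses synthetic data, and fits KD by a brute-force grid search to stay dependency-free:

```python
import numpy as np

# Fast-exchange isotherm: dd = dd_max * [L] / (KD + [L]).
# Synthetic titration data generated from known "true" parameters.
L = np.array([0.02, 0.05, 0.1, 0.2, 0.5, 1.0, 2.0])   # ligand concentration [mM]
kd_true, ddmax_true = 0.15, 0.80
dd = ddmax_true * L / (kd_true + L)                    # observed shift changes [ppm]

def sse(kd):
    """Sum of squared residuals for a trial KD; dd_max is solved
    analytically (linear least squares) at each trial KD."""
    x = L / (kd + L)
    ddmax = float(dd @ x) / float(x @ x)
    return float(np.sum((dd - ddmax * x) ** 2))

# Grid search over candidate KD values (step 0.001 mM).
kd_fit = min(np.linspace(0.01, 1.0, 991), key=sse)
```

With real, noisy titrations a nonlinear least-squares fitter and the quadratic isotherm (explicit protein concentration) would replace this grid search, but the structure of the analysis is the same.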

  10. Quantitative Medical Image Analysis for Clinical Development of Therapeutics

    NASA Astrophysics Data System (ADS)

    Analoui, Mostafa

    There has been significant progress in the development of therapeutics for prevention and management of several disease areas in recent years, leading to increased average life expectancy, as well as quality of life, globally. However, due to the complexity of addressing a number of medical needs and the financial burden of developing new classes of therapeutics, there is a need for better tools for decision making and for validation of the efficacy and safety of new compounds. Numerous biological markers (biomarkers) have been proposed either as adjuncts to current clinical endpoints or as surrogates. Imaging biomarkers are among the most rapidly growing class of biomarkers being examined to expedite effective and rational drug development. Clinical imaging often involves a complex set of multi-modality data sets that require rapid and objective analysis, independent of reviewers' bias and training. In this chapter, an overview of imaging biomarkers for drug development is offered, along with the challenges that necessitate quantitative and objective image analysis. Examples of automated and semi-automated analysis approaches are provided, along with a technical review of such methods. These examples include the use of 3D MRI for osteoarthritis, ultrasound vascular imaging, and dynamic contrast-enhanced MRI for oncology. Additionally, a brief overview of regulatory requirements is discussed. In conclusion, this chapter highlights key challenges and future directions in this area.

  11. Automatic quantitative analysis of cardiac MR perfusion images

    NASA Astrophysics Data System (ADS)

    Breeuwer, Marcel M.; Spreeuwers, Luuk J.; Quist, Marcel J.

    2001-07-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the myocardium (the heart muscle) from MR images, using contrast-enhanced ECG-triggered MRI. We have developed an automatic quantitative analysis method, which works as follows. First, image registration is used to compensate for translation and rotation of the myocardium over time. Next, the boundaries of the myocardium are detected and for each position within the myocardium a time-intensity profile is constructed. The time interval during which the contrast agent passes for the first time through the left ventricle and the myocardium is detected and various parameters are measured from the time-intensity profiles in this interval. The measured parameters are visualized as color overlays on the original images. Analysis results are stored, so that they can later on be compared for different stress levels of the heart. The method is described in detail in this paper and preliminary validation results are presented.
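The parameters measured from a time-intensity profile in first-pass perfusion analysis typically include baseline intensity, peak enhancement, time to peak, and maximal upslope. A sketch with an invented first-pass curve (a gamma-variate-like shape; the specific parameters the paper measures are not detailed in the abstract):

```python
import numpy as np

# Toy time-intensity curve for one myocardial position: baseline plus a
# first-pass bolus shaped like t * exp(-t), peaking at t = 8 s (invented).
t = np.arange(0, 30.0, 1.0)                            # time [s]
si = 10 + 40 * (t / 8.0) * np.exp(1 - t / 8.0)         # signal intensity [a.u.]

def perfusion_params(t, si):
    """Extract simple semi-quantitative perfusion parameters."""
    base = float(si[:3].mean())            # pre-contrast baseline (first frames)
    peak_idx = int(np.argmax(si))
    upslope = float(np.max(np.diff(si[:peak_idx + 1]) /
                           np.diff(t[:peak_idx + 1])))
    return {
        "baseline": base,
        "peak": float(si[peak_idx] - base),     # peak enhancement over baseline
        "time_to_peak": float(t[peak_idx]),
        "max_upslope": upslope,
    }

p = perfusion_params(t, si)
```

Computed per position within the myocardial boundary, such parameter maps are what get visualized as the color overlays described above.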

  12. Multiple Trait Analysis of Genetic Mapping for Quantitative Trait Loci

    PubMed Central

    Jiang, C.; Zeng, Z. B.

    1995-01-01

    We present in this paper models and statistical methods for performing multiple trait analysis on mapping quantitative trait loci (QTL) based on the composite interval mapping method. By taking into account the correlated structure of multiple traits, this joint analysis has several advantages, compared with separate analyses, for mapping QTL, including the expected improvement on the statistical power of the test for QTL and on the precision of parameter estimation. Also this joint analysis provides formal procedures to test a number of biologically interesting hypotheses concerning the nature of genetic correlations between different traits. Among the testing procedures considered are those for joint mapping, pleiotropy, QTL by environment interaction, and pleiotropy vs. close linkage. The test of pleiotropy (one pleiotropic QTL at a genome position) vs. close linkage (multiple nearby nonpleiotropic QTL) can have important implications for our understanding of the nature of genetic correlations between different traits in certain regions of a genome and also for practical applications in animal and plant breeding because one of the major goals in breeding is to break unfavorable linkage. Results of extensive simulation studies are presented to illustrate various properties of the analyses. PMID:7672582

  13. BMEWS Capture and Analysis of Reflected Energy Clear Air ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    BMEWS Capture and Analysis of Reflected Energy - Clear Air Force Station, Ballistic Missile Early Warning System Site II, One mile west of mile marker 293.5 on Parks Highway, 5 miles southwest of Anderson, Anderson, Denali Borough, AK

  14. An approach for quantitative image quality analysis for CT

    NASA Astrophysics Data System (ADS)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and to objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end, we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis toolkit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method to generate a modified set of PCA components, as compared to standard principal component analysis (PCA), with sparse loadings, in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.

  15. Air Ingress Analysis: Part 1 - Theoretical Approach

    SciTech Connect

    Chang Ho Oh

    2011-01-01

    Idaho National Laboratory (INL), under the auspices of the U.S. Department of Energy (DOE), is performing research and development that focuses on key phenomena important during potential scenarios that may occur in very high temperature gas-cooled reactors (VHTRs). Phenomena identification and ranking studies to date have ranked an air ingress event, following on the heels of a VHTR depressurization, as important with regard to core safety. Consequently, the development of advanced air-ingress-related models and verification and validation data are a very high priority. Following a loss of coolant and system depressurization incident, air will enter the core of the VHTR through the break, possibly causing oxidation of the graphite core and reflector graphite structure. Simple core and plant models indicate that, under certain circumstances, the oxidation may proceed at an elevated rate with additional heat generated from the oxidation reaction itself. Under postulated conditions of fluid flow and temperature, excessive degradation of lower plenum graphite caused by graphite oxidation can lead to a loss of mechanical strength. Excessive oxidation of core graphite can also lead to a release of fission products into the confinement, which could be detrimental to reactor safety. Analytical models developed in this study will improve our understanding of this phenomenon. This paper presents two sets of analytical models for the qualitative assessment of the air ingress phenomena. The results from the analytical models are compared with results of the computational fluid dynamic models (CFD) in the subsequent paper. The analytical models agree well with those CFD results.

  16. Quantitative DNA Methylation Analysis of Candidate Genes in Cervical Cancer

    PubMed Central

    Siegel, Erin M.; Riggs, Bridget M.; Delmas, Amber L.; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D.

    2015-01-01

Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites) per sequence. A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under the ROC curve (AUC) of methylation in individual genes or a panel were examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97–1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated. PMID:25826459
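The Methylation Index and the >15% binary cut-point described above can be sketched as follows; the per-CpG percentages are invented for illustration and do not come from the study:

```python
# Sketch of the Methylation Index (mean percent methylation across a
# gene's CpG sites) and the binary cut-point call (illustrative data).

def methylation_index(cpg_percents):
    """Mean percent methylation across the CpG sites of one gene."""
    return sum(cpg_percents) / len(cpg_percents)

def is_methylated(cpg_percents, cutoff=15.0):
    """Binary call: methylated if the index exceeds the cut-point."""
    return methylation_index(cpg_percents) > cutoff

tumor = [22.0, 35.5, 18.0, 41.0]   # hypothetical case sample (4 CpG sites)
normal = [3.0, 5.5, 2.0, 4.5]      # hypothetical control sample
```

Binary calls of this kind across a gene panel are what feed the sensitivity, specificity and AUC comparisons reported in the abstract.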

  18. Quantitative Analysis Of Acoustic Emission From Rock Fracture Experiments

    NASA Astrophysics Data System (ADS)

    Goodfellow, Sebastian David

This thesis aims to advance the methods of quantitative acoustic emission (AE) analysis by calibrating sensors, characterizing sources, and applying the results to solve engineering problems. In the first part of this thesis, we built a calibration apparatus and successfully calibrated two commercial AE sensors. The ErgoTech sensor was found to have broadband velocity sensitivity and the Panametrics V103 was sensitive to surface normal displacement. These calibration results were applied to two AE data sets from rock fracture experiments in order to characterize the sources of AE events. The first data set was from an in situ rock fracture experiment conducted at the Underground Research Laboratory (URL). The Mine-By experiment was a large scale excavation response test where both AE (10 kHz - 1 MHz) and microseismicity (MS) (1 Hz - 10 kHz) were monitored. Using the calibration information, magnitude, stress drop, dimension and energy were successfully estimated for 21 AE events recorded in the tensile region of the tunnel wall. Magnitudes were in the range -7.5 < Mw < -6.8, which is consistent with other laboratory AE results, and stress drops were within the range commonly observed for induced seismicity in the field (0.1 - 10 MPa). The second data set was AE collected during a true-triaxial deformation experiment, where the objectives were to characterize laboratory AE sources and identify issues related to moving the analysis from ideal in situ conditions to more complex laboratory conditions in terms of the ability to conduct quantitative AE analysis. We found AE magnitudes in the range -7.8 < Mw < -6.7 and as with the in situ data, stress release was within the expected range of 0.1 - 10 MPa. We identified four major challenges to quantitative analysis in the laboratory, which inhibited our ability to study parameter scaling (M0 ∝ fc^-3 scaling). These challenges were (1) limited knowledge of attenuation which we proved was continuously evolving, (2
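Moment magnitudes in the range quoted above follow from the seismic moment via the standard Hanks-Kanamori relation (a textbook formula, not code from the thesis; the moment value below is illustrative of a small laboratory AE event):

```python
import math

# Moment magnitude from seismic moment M0 (in newton-metres),
# Hanks-Kanamori relation: Mw = (2/3) * (log10(M0) - 9.1).
def moment_magnitude(m0):
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

# A tiny moment, illustrative of laboratory-scale AE:
mw = moment_magnitude(1.0e-2)
```

A moment of 0.01 N·m maps to Mw about -7.4, squarely inside the laboratory range reported in the abstract.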

  19. Quantitative analysis of fault slip evolution in analogue transpression models

    NASA Astrophysics Data System (ADS)

    Leever, Karen; Gabrielsen, Roy H.; Schmid, Dani; Braathen, Alvar

    2010-05-01

    A quantitative analysis of fault slip evolution in crustal scale brittle and brittle-ductile analogue models of doubly vergent transpressional wedges was performed by means of Particle Image Velocimetry (PIV). The kinematic analyses allow detailed comparison between model results and field kinematic data. This novel approach leads to better understanding of the evolution of transpressional orogens such as the Tertiary West Spitsbergen fold and thrust belt in particular and will advance the understanding of transpressional wedge mechanics in general. We ran a series of basal-driven models with convergence angles of 4, 7.5, 15 and 30 degrees. In these crustal scale models, brittle rheology was represented by quartz sand; in one model a viscous PDMS layer was included at shallow depth. Total sand pack thickness was 6cm, its extent 120x60cm. The PIV method was used to calculate a vector field from pairs of images that were recorded from the top of the experiments at a 2mm displacement increment. The slip azimuth on discrete faults was calculated and visualized by means of a directional derivative of this vector field. From this data set, several stages in the evolution of the models could be identified. The stages were defined by changes in the degree of displacement partitioning, i.e. slip along-strike and orthogonal to the plate boundary. A first stage of distributed strain (with no visible faults at the model surface) was followed by a shear lens stage with oblique displacement on pro- and retro-shear. The oblique displacement became locally partitioned during progressive displacement. During the final stage, strain was more fully partitioned between a newly formed central strike slip zone and reverse faults at the sides. Strain partitioning was best developed in the 15 degrees model, which shows near-reverse faults along both sides of the wedge in addition to strike slip displacement in the center. In further analysis we extracted average slip vectors for
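The slip-azimuth and strain-partitioning measures described above reduce to simple per-vector operations on the PIV displacement field. A sketch under assumed conventions (x-axis along the plate boundary; function names and values are illustrative):

```python
import math

def azimuth_deg(dx, dy):
    """Direction of a displacement vector in degrees (0-360, from +x)."""
    return math.degrees(math.atan2(dy, dx)) % 360.0

def partition(dx, dy, strike_deg=0.0):
    """Split a displacement into strike-parallel and strike-normal parts."""
    s = math.radians(strike_deg)
    ux, uy = math.cos(s), math.sin(s)   # unit vector along strike
    along = dx * ux + dy * uy           # boundary-parallel (strike-slip) part
    across = -dx * uy + dy * ux         # boundary-normal (convergent) part
    return along, across

# A displacement increment oblique to an x-parallel plate boundary:
angle = azimuth_deg(1.0, 1.0)
along, across = partition(1.0, 1.0)
```

Full partitioning, in these terms, means the along and across components localize on separate structures (strike-slip zone vs. reverse faults) rather than mixing on one oblique fault.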

  20. Quantitative Phosphoproteomics Analysis of ERBB3/ERBB4 Signaling

    PubMed Central

    Jacobs, Kris; Klammer, Martin; Jordan, Nicole; Elschenbroich, Sarah; Parade, Marc; Jacoby, Edgar; Linders, Joannes T. M.; Brehmer, Dirk; Cools, Jan; Daub, Henrik

    2016-01-01

    The four members of the epidermal growth factor receptor (EGFR/ERBB) family form homo- and heterodimers which mediate ligand-specific regulation of many key cellular processes in normal and cancer tissues. While signaling through the EGFR has been extensively studied on the molecular level, signal transduction through ERBB3/ERBB4 heterodimers is less well understood. Here, we generated isogenic mouse Ba/F3 cells that express full-length and functional membrane-integrated ERBB3 and ERBB4 or ERBB4 alone, to serve as a defined cellular model for biological and phosphoproteomics analysis of ERBB3/ERBB4 signaling. ERBB3 co-expression significantly enhanced Ba/F3 cell proliferation upon neuregulin-1 (NRG1) treatment. For comprehensive signaling studies we performed quantitative mass spectrometry (MS) experiments to compare the basal ERBB3/ERBB4 cell phosphoproteome to NRG1 treatment of ERBB3/ERBB4 and ERBB4 cells. We employed a workflow comprising differential isotope labeling with mTRAQ reagents followed by chromatographic peptide separation and final phosphopeptide enrichment prior to MS analysis. Overall, we identified 9686 phosphorylation sites which could be confidently localized to specific residues. Statistical analysis of three replicate experiments revealed 492 phosphorylation sites which were significantly changed in NRG1-treated ERBB3/ERBB4 cells. Bioinformatics data analysis recapitulated regulation of mitogen-activated protein kinase and Akt pathways, but also indicated signaling links to cytoskeletal functions and nuclear biology. Comparative assessment of NRG1-stimulated ERBB4 Ba/F3 cells revealed that ERBB3 did not trigger defined signaling pathways but more broadly enhanced phosphoproteome regulation in cells expressing both receptors. In conclusion, our data provide the first global picture of ERBB3/ERBB4 signaling and provide numerous potential starting points for further mechanistic studies. PMID:26745281

  1. Semiautomatic Software For Quantitative Analysis Of Cardiac Positron Tomography Studies

    NASA Astrophysics Data System (ADS)

    Ratib, Osman; Bidaut, Luc; Nienaber, Christoph; Krivokapich, Janine; Schelbert, Heinrich R.; Phelps, Michael E.

    1988-06-01

    In order to derive accurate values for true tissue radiotracers concentrations from gated positron emission tomography (PET) images of the heart, which are critical for quantifying noninvasively regional myocardial blood flow and metabolism, appropriate corrections for partial volume effect (PVE) and contamination from adjacent anatomical structures are required. We therefore developed an integrated software package for quantitative analysis of tomographic images which provides for such corrections. A semiautomatic edge detection technique outlines and partitions the myocardium into sectors. Myocardial wall thickness is measured on the images perpendicularly to the detected edges and used to correct for PVE. The programs automatically correct for radioactive decay, activity calibration and cross contaminations for both static and dynamic studies. Parameters derived with these programs include tracer concentrations and their changes over time. They are used for calculating regional metabolic rates and can be further displayed as color coded parametric images. The approach was validated for PET imaging in 11 dog experiments. 2D echocardiograms (Echo) were recorded simultaneously to validate the edge detection and wall thickness measurement techniques. After correction for PVE using automatic WT measurement, regional tissue tracer concentrations derived from PET images correlated well with true tissue concentrations as determined by well counting (r=0.98). These preliminary studies indicate that the developed automatic image analysis technique allows accurate and convenient evaluation of cardiac PET images for the measurement of both, regional tracer tissue concentrations as well as regional myocardial function.

  2. Inside Single Cells: Quantitative Analysis with Advanced Optics and Nanomaterials

    PubMed Central

    Cui, Yi; Irudayaraj, Joseph

    2014-01-01

    Single cell explorations offer a unique window to inspect molecules and events relevant to mechanisms and heterogeneity constituting the central dogma of biology. A large number of nucleic acids, proteins, metabolites and small molecules are involved in determining and fine-tuning the state and function of a single cell at a given time point. Advanced optical platforms and nanotools provide tremendous opportunities to probe intracellular components with single-molecule accuracy, as well as promising tools to adjust single cell activity. In order to obtain quantitative information (e.g. molecular quantity, kinetics and stoichiometry) within an intact cell, achieving the observation with comparable spatiotemporal resolution is a challenge. For single cell studies both the method of detection and the biocompatibility are critical factors as they determine the feasibility, especially when considering live cell analysis. Although a considerable proportion of single cell methodologies depend on specialized expertise and expensive instruments, it is our expectation that the information content and implication will outweigh the costs given the impact on life science enabled by single cell analysis. PMID:25430077

  3. [Quantitative Analysis of Mn in Soil Samples Using LIBS].

    PubMed

    Zhang, Bao-hua; Jiang, Yong-cheng; Zhang, Xian-yan; Cui, Zhi-feng

    2015-06-01

The trace element manganese in agricultural soil (from the Anhui Huaiyuan Nongkang farm) was quantitatively analyzed by laser-induced breakdown spectroscopy. The line at 403.1 nm was selected as the analysis line of Mn. The matrix element Fe in the soil was chosen as the internal calibration element, with 407.2 nm as its analysis line. Ten soil samples were used to construct calibration curves with the traditional method and the internal standard method, and four soil samples were selected as test samples. The experimental results showed that the fitting correlation coefficient (r) is 0.954 when using the traditional method, the maximum relative error of the measured samples is 5.72%, and the detection limit of Mn in soil is 93 mg x kg(-1). When using the internal standard method to construct the calibration curve, the fitting correlation coefficient (r) is 0.983, the relative error of the measured samples is reduced to 4.1%, and the detection limit of Mn in soil is 71 mg x kg(-1). The results indicate that the LIBS technique can be used to detect the trace element Mn in soil. To a certain extent, the internal standard method can improve the accuracy of the measurement.
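The internal standard method above amounts to fitting a line to the Mn(403.1 nm)/Fe(407.2 nm) intensity ratio versus known Mn content and then inverting it for unknowns. A sketch with invented reference values (not the paper's data):

```python
# Internal-standard calibration sketch: ordinary least-squares fit of
# line-intensity ratio against known concentration, then inversion.

def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum(
        (a - mx) ** 2 for a in x)
    return slope, my - slope * mx

conc = [100.0, 200.0, 300.0, 400.0]   # Mn in reference soils, mg/kg (invented)
ratio = [0.21, 0.41, 0.59, 0.81]      # I_Mn / I_Fe intensity ratios (invented)
slope, intercept = fit_line(conc, ratio)

# Predict Mn content of an unknown sample from its measured ratio:
unknown_ratio = 0.50
mn_unknown = (unknown_ratio - intercept) / slope
```

Ratioing against the matrix Fe line compensates for shot-to-shot fluctuations in the laser plasma, which is why it tightens r and the relative error in the abstract.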

  4. Quantitative analysis of polyethylene blends by Fourier transform infrared spectroscopy.

    PubMed

    Cran, Marlene J; Bigger, Stephen W

    2003-08-01

    The quantitative analysis of binary polyethylene (PE) blends by Fourier transform infrared (FT-IR) spectroscopy has been achieved based on the ratio of two absorbance peaks in an FT-IR spectrum. The frequencies for the absorbance ratio are selected based on structural entities of the PE components in the blend. A linear relationship between the absorbance ratio and the blend composition was found to exist if one of the absorbance peaks is distinct to one of the components and the other peak is common to both components. It was also found that any peak resulting from short-chain branching in copolymers (such as linear low-density polyethylene (LLDPE) or metallocene-catalyzed LLDPE (mLLDPE)), is suitable for use as the peak that is designated as being distinct to that component. In order to optimize the linearity of the equation, however, the selection of the second common peak is the most important and depends on the blend system studied. Indeed, under certain circumstances peaks that are not spectrally distinct can be used successfully to apply the method. The method exhibits potential for the routine analysis of PE blends that have been calibrated prior to its application.

  5. Quantitative evaluation of midpalatal suture maturation via fractal analysis

    PubMed Central

    Kwak, Kyoung Ho; Kim, Yong-Il; Kim, Yong-Deok

    2016-01-01

Objective The purpose of this study was to determine whether the results of fractal analysis can be used as criteria for midpalatal suture maturation evaluation. Methods The study included 131 subjects over 18 years of age (range, 18.1–53.4 years) who underwent cone-beam computed tomography. Skeletonized images of the midpalatal suture were obtained via image processing software and used to calculate fractal dimensions. Correlations between maturation stage and fractal dimensions were calculated using Spearman's correlation coefficient. Optimal fractal dimension cut-off values were determined using a receiver operating characteristic curve. Results The distribution of maturation stages of the midpalatal suture according to the cervical vertebrae maturation index was highly variable, and there was a strong negative correlation between maturation stage and fractal dimension (−0.623, p < 0.001). Fractal dimension was a statistically significant indicator of dichotomous results with regard to maturation stage (area under curve = 0.794, p < 0.001). A test in which fractal dimension was used to dichotomize maturation stages into stages A–C versus stages D or E yielded an optimal fractal dimension cut-off value of 1.0235. Conclusions There was a strong negative correlation between fractal dimension and midpalatal suture maturation. Fractal analysis is an objective quantitative method, and therefore we suggest that it may be useful for the evaluation of midpalatal suture maturation. PMID:27668195
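The Spearman correlation used above to relate maturation stage to fractal dimension can be sketched in a few lines; the paired stage/dimension values are invented to show the expected monotone-decreasing pattern, not the study's data:

```python
# Spearman rank correlation sketch (ties averaged), illustrative data.

def ranks(values):
    """Rank data (1-based), averaging ranks over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Pearson correlation of the rank vectors of x and y."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

stage = [1, 2, 3, 4, 5]                   # hypothetical maturation stages
fractal = [1.08, 1.06, 1.05, 1.01, 0.98]  # hypothetical fractal dimensions
rho = spearman(stage, fractal)
```

A strictly decreasing relationship like this one gives rho = -1; the study's observed -0.623 reflects the same direction with real-world scatter.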

  7. Optimal display conditions for quantitative analysis of stereoscopic cerebral angiograms

    SciTech Connect

Charland, P.; Peters, T.

    1996-10-01

For several years the authors have been using a stereoscopic display as a tool in the planning of stereotactic neurosurgical techniques. This PC-based workstation allows the surgeon to interact with and view vascular images in three dimensions, as well as to perform quantitative analysis of the three-dimensional (3-D) space. Some of the perceptual issues relevant to the presentation of medical images on this stereoscopic display were addressed in five experiments. The authors show that a number of parameters--namely the shape, color, and depth cue associated with a cursor--as well as the image filtering and observer position, have a role in improving the observer's perception of a 3-D image and his ability to localize points within the stereoscopically presented 3-D image. However, an analysis of the results indicates that while varying these parameters can lead to an effect on the performance of individual observers, the effects are not consistent across observers, and the mean accuracy remains relatively constant under the different experimental conditions.

  8. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    SciTech Connect

    Xu, Zhe; Wu, Chaochao; Xie, Fang; Slysz, Gordon W.; Tolic, Nikola; Monroe, Matthew E.; Petyuk, Vladislav A.; Payne, Samuel H.; Fujimoto, Grant M.; Moore, Ronald J.; Fillmore, Thomas L.; Schepmoes, Athena A.; Levine, Douglas; Townsend, Reid; Davies, Sherri; Li, Shunqiang; Ellis, Matthew; Boja, Emily; Rivers, Robert; Rodriguez, Henry; Rodland, Karin D.; Liu, Tao; Smith, Richard D.

    2015-01-02

    Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Moreover, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. Peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.

  9. Quantitative assessments of indoor air pollution and the risk of childhood acute leukemia in Shanghai.

    PubMed

    Gao, Yu; Zhang, Yan; Kamijima, Michihiro; Sakai, Kiyoshi; Khalequzzaman, Md; Nakajima, Tamie; Shi, Rong; Wang, Xiaojin; Chen, Didi; Ji, Xiaofan; Han, Kaiyi; Tian, Ying

    2014-04-01

    We investigated the association between indoor air pollutants and childhood acute leukemia (AL). A total of 105 newly diagnosed cases and 105 1:1 gender-, age-, and hospital-matched controls were included. Measurements of indoor pollutants (including nitrogen dioxide (NO2) and 17 types of volatile organic compounds (VOCs)) were taken with diffusive samplers for 64 pairs of cases and controls. Higher concentrations of NO2 and almost half of VOCs were observed in the cases than in the controls and were associated with the increased risk of childhood AL. The use of synthetic materials for wall decoration and furniture in bedroom was related to the risk of childhood AL. Renovating the house in the last 5 years, changing furniture in the last 5 years, closing the doors and windows overnight in the winter and/or summer, paternal smoking history and outdoor pollutants affected VOC concentrations. Our results support the association between childhood AL and indoor air pollution.
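For a 1:1 matched case-control design like the one above, the usual effect estimate is the matched (conditional) odds ratio: the ratio of discordant pairs in which only the case was exposed to those in which only the control was exposed. A sketch with invented pair counts (the study's actual analysis and numbers are not reproduced here):

```python
# Matched-pair odds ratio sketch for a 1:1 case-control design
# (illustrative exposure data, not from the study).

def matched_odds_ratio(pairs):
    """pairs: list of (case_exposed, control_exposed) booleans."""
    case_only = sum(1 for c, k in pairs if c and not k)
    control_only = sum(1 for c, k in pairs if k and not c)
    return case_only / control_only

# 10 hypothetical pairs classified by high-VOC exposure:
pairs = [(True, False)] * 6 + [(False, True)] * 2 + [(True, True)] * 2
or_hat = matched_odds_ratio(pairs)
```

Concordant pairs (both exposed or both unexposed) carry no information in this estimator, which is why only the discordant counts appear.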

  10. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  11. Qualitative and quantitative analysis of organophosphorus pesticide residues using temperature modulated SnO(2) gas sensor.

    PubMed

    Huang, Xingjiu; Liu, Jinhuai; Pi, Zongxin; Yu, Zengliang

    2004-10-01

Qualitative and quantitative analysis of organophosphorus pesticide residues (acephate and trichlorphon) using a temperature-modulated SnO(2) gas sensor was studied. The testing method employed only a single SnO(2)-based gas sensor in a rectangular temperature mode to perform the qualitative analysis of pure pesticide vapor and a binary vapor mixture in air. Experimental results showed that in the range 250-300 degrees C and at a modulating frequency of 20 mHz, high selectivity of the sensor could be achieved. The quantitative analysis of the pure pesticide vapors and their mixture was performed by fast Fourier transformation (FFT). The higher harmonics of the FFT characterized the non-linear properties of the response at the sensor surface. The amplitudes of the higher harmonics exhibited characteristic variations that depend on the concentration and the kinetics of the pesticide species on the sensor surface. PMID:18969637
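The harmonic-amplitude analysis described above can be sketched with a plain discrete Fourier transform over one modulation period; the square-wave-like response below is synthetic (mimicking the rectangular temperature mode), not measured sensor data:

```python
import cmath

# Extract harmonic amplitudes of a periodic sensor response via a
# direct DFT (no external FFT library needed for a sketch).

def harmonic_amplitudes(samples, n_harmonics):
    """Amplitudes of harmonics 1..n_harmonics over one signal period."""
    n = len(samples)
    amps = []
    for k in range(1, n_harmonics + 1):
        coeff = sum(s * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t, s in enumerate(samples))
        amps.append(2 * abs(coeff) / n)
    return amps

# One period of an idealized rectangular response to the temperature cycle:
signal = [1.0] * 32 + [-1.0] * 32
a1, a2, a3 = harmonic_amplitudes(signal, 3)
```

For a pure square wave the even harmonics vanish and the odd ones fall off as 1/k; deviations from that pattern in a real response carry the non-linear, concentration-dependent information the abstract refers to.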

  12. Nanotechnology patents in the automotive industry (a quantitative & qualitative analysis).

    PubMed

    Prasad, Raghavendra; Bandyopadhyay, Tapas K

    2014-01-01

The aim of this article is to present trends in patent filings for applications of nanotechnology in the automobile sector across the world, using a keyword-based patent search. An overview of the patents related to nanotechnology in the automobile industry is provided. The work started from a worldwide patent search to find patents on nanotechnology in the automobile industry and to classify them according to the automobile parts to which they relate and the solutions they provide. Various graphs were then produced to give insight into the trends, and the patents in the various classifications were analyzed. The trends shown in the graphs provide the quantitative analysis, whereas the qualitative analysis is presented in another section. The classification of patents based on the solution they provide was performed by reading the claims, titles, abstracts, and full texts separately. The patentability of nanotechnology inventions is discussed with a view to giving an idea of the requirements and statutory bars to the patentability of nanotechnology inventions. Another objective of the work is to suggest an appropriate framework for companies regarding the use of nanotechnology in the automobile industry and a strategy for patenting the related inventions. For example, US patent US2008-019426A1 discusses an invention related to a lubricant composition. This patent was studied and classified as falling under automobile parts; it was deduced that the patent solves the problem of friction in the engine. One classification is based on the automobile part, while the other is based on the problem being solved; hence two classifications, namely reduction in friction and engine, were created. Similarly, after studying all the patents, a similar matrix has been created

  13. The Measles Vaccination Narrative in Twitter: A Quantitative Analysis

    PubMed Central

    Radzikowski, Jacek; Jacobsen, Kathryn H; Croitoru, Arie; Crooks, Andrew; Delamater, Paul L

    2016-01-01

    Background The emergence of social media is providing an alternative avenue for information exchange and opinion formation on health-related issues. Collective discourse in such media leads to the formation of a complex narrative, conveying public views and perceptions. Objective This paper presents a study of the Twitter narrative regarding vaccination in the aftermath of the 2015 measles outbreak, in terms of both its cyber and physical characteristics. We aimed both to analyze the data and to present a quantitative, interdisciplinary approach for analyzing such open-source data in the context of health narratives. Methods We collected 669,136 tweets referring to vaccination from February 1 to March 9, 2015. These tweets were analyzed to identify key terms, connections among such terms, retweet patterns, the structure of the narrative, and connections to the geographical space. Results The data analysis captures the anatomy of the themes and relations that make up the discussion about vaccination in Twitter. The results highlight the higher impact of stories contributed by news organizations compared to direct tweets by health organizations in communicating health-related information. They also capture the structure of the antivaccination narrative and its terms of reference. Analysis also revealed the relationship between community engagement in Twitter and state policies regarding child vaccination. Residents of Vermont and Oregon, the two states with the highest rates of non-medical exemption from school-entry vaccines nationwide, are leading the social media discussion in terms of participation. Conclusions The interdisciplinary study of health-related debates in social media across the cyber-physical debate nexus leads to a greater understanding of public concerns, views, and responses to health-related issues. Further coalescing such capabilities shows promise towards advancing health communication, thus supporting the design of more

  14. Air-to-air combat analysis - Review of differential-gaming approaches

    NASA Technical Reports Server (NTRS)

    Ardema, M. D.

    1981-01-01

    The problem of evaluating the combat performance of fighter/attack aircraft is discussed, and the mathematical nature of the problem is examined. The following approaches to air combat analysis are reviewed: (1) differential-turning differential game and (2) coplanar differential game. Selected numerical examples of these approaches are presented. The relative advantages and disadvantages of each are analyzed, and it is concluded that air combat analysis is an extremely difficult mathematical problem and that no one method of approach is best for all purposes. The paper concludes with a discussion of how the two approaches might be used in a complementary manner.

  15. Teaching Quantitative Literacy through a Regression Analysis of Exam Performance

    ERIC Educational Resources Information Center

    Lindner, Andrew M.

    2012-01-01

    Quantitative literacy is increasingly essential for both informed citizenship and a variety of careers. Though regression is one of the most common methods in quantitative sociology, it is rarely taught until late in students' college careers. In this article, the author describes a classroom-based activity introducing students to regression…

  16. Quantitative Analysis of Cellular Metabolic Dissipative, Self-Organized Structures

    PubMed Central

    de la Fuente, Ildefonso Martínez

    2010-01-01

    One of the most important goals of the postgenomic era is understanding the metabolic dynamic processes and the functional structures generated by them. Extensive studies during the last three decades have shown that the dissipative self-organization of the functional enzymatic associations, the catalytic reactions produced during the metabolite channeling, the microcompartmentalization of these metabolic processes and the emergence of dissipative networks are the fundamental elements of the dynamical organization of cell metabolism. Here we present an overview of how mathematical models can be used to address the properties of dissipative metabolic structures at different organizational levels, both for individual enzymatic associations and for enzymatic networks. Recent analyses performed with dissipative metabolic networks have shown that unicellular organisms display a singular global enzymatic structure common to all living cellular organisms, which seems to be an intrinsic property of the functional metabolism as a whole. Mathematical models firmly based on experiments and their corresponding computational approaches are needed to fully grasp the molecular mechanisms of metabolic dynamical processes. They are necessary to enable the quantitative and qualitative analysis of the cellular catalytic reactions and also to help comprehend the conditions under which the structural dynamical phenomena and biological rhythms arise. Understanding the molecular mechanisms responsible for the metabolic dissipative structures is crucial for unraveling the dynamics of cellular life. PMID:20957111

  17. Comparison of multivariate calibration methods for quantitative spectral analysis

    SciTech Connect

    Thomas, E.V.; Haaland, D.M. )

    1990-05-15

    The quantitative prediction abilities of four multivariate calibration methods for spectral analyses are compared by using extensive Monte Carlo simulations. The calibration methods compared include inverse least-squares (ILS), classical least-squares (CLS), partial least-squares (PLS), and principal component regression (PCR) methods. ILS is a frequency-limited method while the latter three are capable of full-spectrum calibration. The simulations were performed assuming Beer's law holds and that spectral measurement errors and concentration errors associated with the reference method are normally distributed. Eight different factors that could affect the relative performance of the calibration methods were varied in a two-level, eight-factor experimental design in order to evaluate their effect on the prediction abilities of the four methods. It is found that each of the three full-spectrum methods has its range of superior performance. The frequency-limited ILS method was never the best method, although in the presence of relatively large concentration errors it sometimes yields comparable analysis precision to the full-spectrum methods for the major spectral component. The importance of each factor in the absolute and relative performances of the four methods is compared.
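
    Of the four methods compared, classical least-squares (CLS) is the simplest to illustrate. The numpy sketch below uses synthetic Gaussian-band spectra obeying Beer's law (all band shapes, noise levels, and concentrations are invented for illustration) and shows the two CLS steps: estimating pure-component spectra from calibration samples, then predicting an unknown's concentrations by least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic Beer's-law system: A = C @ K + noise, with two components.
n_cal, n_freq = 30, 100
freqs = np.linspace(0, 1, n_freq)
# Pure-component spectra (Gaussian bands); purely illustrative shapes.
K = np.vstack([np.exp(-((freqs - 0.3) / 0.05) ** 2),
               np.exp(-((freqs - 0.6) / 0.08) ** 2)])
C_cal = rng.uniform(0.1, 1.0, size=(n_cal, 2))            # known concentrations
A_cal = C_cal @ K + rng.normal(0, 1e-3, (n_cal, n_freq))  # measured spectra

# CLS step 1: estimate pure spectra from calibration data, K_hat = argmin ||C K - A||.
K_hat, *_ = np.linalg.lstsq(C_cal, A_cal, rcond=None)

# CLS step 2: predict concentrations of an unknown spectrum by least squares.
c_true = np.array([0.4, 0.7])
a_unknown = c_true @ K + rng.normal(0, 1e-3, n_freq)
c_pred, *_ = np.linalg.lstsq(K_hat.T, a_unknown, rcond=None)

print(np.round(c_pred, 3))  # close to [0.4, 0.7]
```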

  18. Quantitative Financial Analysis of Alternative Energy Efficiency Shareholder Incentive Mechanisms

    SciTech Connect

    Cappers, Peter; Goldman, Charles; Chait, Michele; Edgar, George; Schlegel, Jeff; Shirley, Wayne

    2008-08-03

    Rising energy prices and climate change are central issues in the debate about our nation's energy policy. Many are demanding increased energy efficiency as a way to help reduce greenhouse gas emissions and lower the total cost of electricity and energy services for consumers and businesses. Yet, as the National Action Plan on Energy Efficiency (NAPEE) pointed out, many utilities continue to shy away from seriously expanding their energy efficiency program offerings because they claim there is insufficient profit motivation, or even a financial disincentive, when compared to supply-side investments. With the recent introduction of Duke Energy's Save-a-Watt incentive mechanism and ongoing discussions about decoupling, regulators and policymakers are now faced with an expanded and diverse landscape of financial incentive mechanisms. Determining the 'right' way forward to promote deep and sustainable demand-side resource programs is challenging. Given the renaissance that energy efficiency is currently experiencing, many want to better understand the tradeoffs in stakeholder benefits between these alternative incentive structures before aggressively embarking on a path for which course corrections can be time-consuming and costly. Using a prototypical Southwest utility and a publicly available financial model, we show how various stakeholders (e.g., shareholders, ratepayers, etc.) are affected by these different types of shareholder incentive mechanisms under varying assumptions about program portfolios. This quantitative analysis compares the financial consequences associated with a wide range of alternative incentive structures. The results will help regulators and policymakers better understand the financial implications of DSR program incentive regulation.

  19. Quantitative image analysis of HIV-1 infection in lymphoid tissue

    SciTech Connect

    Haase, A.T.; Zupancic, M.; Cavert, W.

    1996-11-08

    Tracking human immunodeficiency virus-type 1 (HIV-1) infection at the cellular level in tissue reservoirs provides opportunities to better understand the pathogenesis of infection and to rationally design and monitor therapy. A quantitative technique was developed to determine viral burden in two important cellular compartments in lymphoid tissues. Image analysis and in situ hybridization were combined to show that in the presymptomatic stages of infection there is a large, relatively stable pool of virions on the surfaces of follicular dendritic cells and a smaller pool of productively infected cells. Despite evidence of constraints on HIV-1 replication in the infected cell population in lymphoid tissues, estimates of the numbers of these cells and the virus they could produce are consistent with the quantities of virus that have been detected in the bloodstream. The cellular sources of virus production and storage in lymphoid tissues can now be studied with this approach over the course of infection and treatment. 22 refs., 2 figs., 2 tabs.

  20. Quantitative analysis of plasma interleukin-6 by immunoassay on microchip

    NASA Astrophysics Data System (ADS)

    Abe, K.; Hashimoto, Y.; Yatsushiro, S.; Yamamura, S.; Tanaka, M.; Ooie, T.; Baba, Y.; Kataoka, M.

    2012-03-01

    Sandwich enzyme-linked immunoassay (ELISA) is one of the most frequently employed assays for clinical diagnosis, since it enables the investigator to identify specific protein biomarkers. However, the conventional assay using a 96-well microtitration plate is time- and sample-consuming, and therefore is not suitable for rapid diagnosis. To overcome these drawbacks, we performed a sandwich ELISA on a microchip. We employed piezoelectric inkjet printing for deposition and fixation of the first antibody on the microchannel surface (300 μm width and 100 μm depth). The model analyte was interleukin-6 (IL-6), an inflammatory cytokine. After blocking the microchannel, antigen, biotin-labeled second antibody, and avidin-labeled peroxidase were infused into the microchannel and incubated for 20 min, 10 min, and 5 min, respectively. This assay could detect 2 pg/ml and quantitatively measure the range of 0-32 pg/ml. Linear regression analysis of plasma IL-6 concentrations obtained by the microchip and conventional methods exhibited a significant relationship (R2 = 0.9964). This assay reduced the time for the antigen-antibody reaction to 1/6, and the consumption of samples and reagents to 1/50, compared with the conventional method. It enables us to determine plasma IL-6 accurately, with high sensitivity, in less time, and with low consumption of sample and reagents, and thus will be applicable to clinical diagnosis.
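
    The calibration step behind such an assay — fitting a linear standard curve and inverting it to estimate an unknown sample's concentration — can be sketched as below. All absorbance readings are invented for illustration, not values from the paper.

```python
import numpy as np

# Hypothetical IL-6 standard curve: absorbance vs. concentration (pg/ml).
conc = np.array([0.0, 2.0, 4.0, 8.0, 16.0, 32.0])
absorbance = np.array([0.05, 0.11, 0.17, 0.30, 0.55, 1.08])  # made-up readings

# Least-squares line: absorbance = slope * conc + intercept.
slope, intercept = np.polyfit(conc, absorbance, 1)

# Coefficient of determination R^2 for the fit.
pred = slope * conc + intercept
ss_res = np.sum((absorbance - pred) ** 2)
ss_tot = np.sum((absorbance - absorbance.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

# Invert the calibration to estimate an unknown sample's concentration.
unknown_abs = 0.42
unknown_conc = (unknown_abs - intercept) / slope
print(round(r2, 4), round(unknown_conc, 1))
```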

  1. Quantitative produced water analysis using mobile 1H NMR

    NASA Astrophysics Data System (ADS)

    Wagner, Lisabeth; Kalli, Chris; Fridjonsson, Einar O.; May, Eric F.; Stanwix, Paul L.; Graham, Brendan F.; Carroll, Matthew R. J.; Johns, Michael L.

    2016-10-01

    Measurement of oil contamination of produced water is required in the oil and gas industry down to the parts-per-million (ppm) level prior to discharge in order to meet typical environmental legislative requirements. Here we present the use of compact, mobile 1H nuclear magnetic resonance (NMR) spectroscopy, in combination with solid phase extraction (SPE), to meet this metrology need. The NMR hardware employed featured a sufficiently homogeneous magnetic field that chemical shift differences could be used to unambiguously differentiate, and hence quantitatively detect, the required oil and solvent NMR signals. A solvent system consisting of 1% v/v chloroform in tetrachloroethylene was deployed; this provided a comparable 1H NMR signal intensity for the oil and the solvent (chloroform), and hence an internal reference 1H signal from the chloroform, making the measurement effectively self-calibrating. The measurement process was applied to water contaminated with hexane or crude oil over the range 1-30 ppm. The results were validated against known solubility limits as well as infrared analysis and gas chromatography.
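
    The self-calibrating aspect can be illustrated with a minimal ratio calculation: the oil peak area is referenced against the known chloroform peak area. All peak areas and the calibration factor below are invented for illustration.

```python
# Internal-reference quantification sketch: the chloroform peak provides a
# known 1H signal against which the oil signal is ratioed, making the
# measurement self-calibrating. All numbers below are illustrative.

# Integrated peak areas from a hypothetical 1H NMR spectrum (arbitrary units).
area_oil = 1.8
area_chcl3 = 12.0

# Effective oil-equivalent ppm represented by the chloroform reference peak
# (an assumed calibration factor accounting for proton counts and solvent dose).
ref_equivalent_ppm = 60.0

oil_ppm = area_oil / area_chcl3 * ref_equivalent_ppm
print(round(oil_ppm, 1))  # 9.0
```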

  2. Quantitative analysis of biomedical samples using synchrotron radiation microbeams

    NASA Astrophysics Data System (ADS)

    Ektessabi, Ali; Shikine, Shunsuke; Yoshida, Sohei

    2001-07-01

    X-ray fluorescence (XRF) using a synchrotron radiation (SR) microbeam was applied to investigate distributions and concentrations of elements in single neurons of patients with neurodegenerative diseases. In this paper we introduce a computer code that has been developed to quantify the trace elements and matrix elements at the single-cell level. This computer code has been used in studies of several important neurodegenerative diseases such as Alzheimer's disease (AD), Parkinson's disease (PD) and parkinsonism-dementia complex (PDC), as well as in basic biological experiments to determine the elemental changes in cells due to incorporation of foreign metal elements. Substantia nigra (SN) tissue obtained from autopsy specimens of patients with Guamanian parkinsonism-dementia complex (PDC) and from control cases was examined. Quantitative XRF analysis showed that neuromelanin granules of parkinsonian SN contained higher levels of Fe than those of the control; the concentrations were in the ranges of 2300-3100 ppm and 2000-2400 ppm, respectively. In contrast, Zn and Ni in neuromelanin granules of SN tissue from the PDC case were lower than those of the control. In particular, Zn was less than 40 ppm in SN tissue from the PDC case, while it was 560-810 ppm in the control. These changes are considered to be closely related to neurodegeneration and cell death.

  3. Hyperspectral imaging and quantitative analysis for prostate cancer detection

    PubMed Central

    Akbari, Hamed; Halig, Luma V.; Schuster, David M.; Osunkoya, Adeboye; Master, Viraj; Nieh, Peter T.; Chen, Georgia Z.

    2012-01-01

    Abstract. Hyperspectral imaging (HSI) is an emerging modality for various medical applications. Its spectroscopic data may make it possible to noninvasively detect cancer. Quantitative analysis is often necessary in order to differentiate healthy from diseased tissue. We propose the use of an advanced image processing and classification method in order to analyze hyperspectral image data for prostate cancer detection. The spectral signatures were extracted and evaluated in both cancerous and normal tissue. Least squares support vector machines were developed and evaluated for classifying hyperspectral data in order to enhance the detection of cancer tissue. This method was used to detect prostate cancer in tumor-bearing mice and on pathology slides. Spatially resolved images were created to highlight the differences between the reflectance properties of cancer and those of normal tissue. Preliminary results with 11 mice showed that the sensitivity and specificity of the hyperspectral image classification method are 92.8% ± 2.0% and 96.9% ± 1.3%, respectively. Therefore, this imaging method may be able to help physicians to dissect malignant regions with a safe margin and to evaluate the tumor bed after resection. This pilot study may lead to advances in the optical diagnosis of prostate cancer using HSI technology. PMID:22894488
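
    A least-squares SVM of the kind mentioned can be sketched in a few lines of numpy: it solves the standard LS-SVM linear system rather than a quadratic program. This is not the authors' pipeline; the synthetic 2-D data (stand-ins for spectral features), the RBF kernel, and the regularization parameter are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two synthetic 2-D classes (stand-ins for per-pixel spectral features).
X = np.vstack([rng.normal(-1.0, 0.5, (40, 2)), rng.normal(1.0, 0.5, (40, 2))])
y = np.hstack([-np.ones(40), np.ones(40)])

def rbf_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

# LS-SVM dual system (Suykens form): solve for bias b and multipliers alpha.
gamma = 10.0
n = len(y)
Omega = rbf_kernel(X, X)
A = np.zeros((n + 1, n + 1))
A[0, 1:] = 1.0
A[1:, 0] = 1.0
A[1:, 1:] = Omega + np.eye(n) / gamma
rhs = np.concatenate([[0.0], y])
sol = np.linalg.solve(A, rhs)
b, alpha = sol[0], sol[1:]

# Classify new points by the sign of the decision function.
X_test = np.array([[-1.0, -1.0], [1.0, 1.0]])
scores = rbf_kernel(X_test, X) @ alpha + b
print(np.sign(scores))  # expected: negative class, then positive class
```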

  4. Quantitative SERS sensors for environmental analysis of naphthalene.

    PubMed

    Péron, O; Rinnert, E; Toury, T; Lamy de la Chapelle, M; Compère, C

    2011-03-01

    In the investigation of chemical pollutants, such as PAHs (polycyclic aromatic hydrocarbons), at low concentration in aqueous medium, Surface-Enhanced Raman Scattering (SERS) offers an alternative to normal Raman scattering, whose cross-section is inherently low. Indeed, SERS is a very sensitive spectroscopic technique due to the excitation of the surface plasmon modes of the nanostructured metallic film. The surface of quartz substrates was coated with a hydrophobic film obtained by silanization and subsequently reacted with polystyrene (PS) beads coated with gold nanoparticles. The hydrophobic surface of the SERS substrates pre-concentrates non-polar molecules such as naphthalene. Under laser excitation, the SERS-active substrates allow the detection and identification of the target molecules localized close to the gold nanoparticles. The morphology of the SERS substrates based on polystyrene beads surrounded by gold nanoparticles was characterized by scanning electron microscopy (SEM). Furthermore, the Raman fingerprint of the polystyrene serves as an internal spectral reference. On this basis, an innovative method to detect and quantify organic molecules in aqueous media, such as naphthalene in the range of 1 to 20 ppm, was carried out. Such SERS-active substrates tend towards an application as quantitative SERS sensors for the environmental analysis of naphthalene. PMID:21165476

  5. Active contour approach for accurate quantitative airway analysis

    NASA Astrophysics Data System (ADS)

    Odry, Benjamin L.; Kiraly, Atilla P.; Slabaugh, Greg G.; Novak, Carol L.; Naidich, David P.; Lerallut, Jean-Francois

    2008-03-01

    Chronic airway disease causes structural changes in the lungs, including peribronchial thickening and airway dilatation. Multi-detector computed tomography (CT) yields detailed near-isotropic images of the lungs, and thus the potential to obtain quantitative measurements of lumen diameter and airway wall thickness. Such measurements would allow standardized assessment, helping physicians to diagnose and locate airway abnormalities, adapt treatment, and monitor progress over time. However, due to the sheer number of airways per patient, systematic analysis is infeasible in routine clinical practice without automation. We have developed an automated and real-time method based on active contours to estimate both airway lumen and wall dimensions; the method does not require manual contour initialization but only a starting point on the targeted airway. While the lumen contour segmentation is purely region-based, the estimation of the outer diameter considers the inner wall segmentation as well as local intensity variation, in order to anticipate the presence of nearby arteries and exclude them. These properties make the method more robust than the Full-Width Half Maximum (FWHM) approach. Results are demonstrated on a phantom dataset with known dimensions and on a human dataset where the automated measurements are compared against two human operators. The average error on the phantom measurements was 0.10 mm for inner and 0.14 mm for outer diameters, showing sub-voxel accuracy. Similarly, the mean variation from the average manual measurement was 0.14 mm and 0.18 mm for inner and outer diameters, respectively.

  6. Quantitative analysis of flagellar proteins in Drosophila sperm tails.

    PubMed

    Mendes Maia, Teresa; Paul-Gilloteaux, Perrine; Basto, Renata

    2015-01-01

    The cilium has a well-defined structure, which can still accommodate some diversity in morphology and molecular composition to suit the functional requirements of different cell types. The sperm flagellum of the fruit fly Drosophila melanogaster is a good model for studying the genetic regulation of axoneme assembly and motility, due to the wealth of genetic tools publicly available for this organism. In addition, the fruit fly's sperm flagellum displays quite a long axoneme (∼1.8 mm), which may facilitate both histological and biochemical analyses. Here, we present a protocol for imaging and quantitatively analyzing proteins that associate with differentiating and mature sperm flagella in the fly. As an example, we quantify tubulin polyglycylation in wild-type testes and in Bug22 mutant testes, which present defects in the deposition of this posttranslational modification. During sperm biogenesis, flagella appear tightly bundled, which makes it challenging to obtain accurate measurements of protein levels from immunostained specimens. The method we present is based on a novel semiautomated macro for the image processing software ImageJ. It measures fluorescence levels in closely associated sperm tails through an exact distinction between positive and background signals, and provides background-corrected pixel intensity values that can be used directly for data analysis. PMID:25837396
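
    The macro's core operation — separating positive from background pixels and reporting background-corrected intensities — can be sketched in Python as a stand-in for the ImageJ macro. The image, the stripe standing in for a sperm tail, and the simple global threshold are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical 8-bit fluorescence image: a bright stripe over noisy background.
img = rng.normal(20, 3, (64, 64))   # background noise
img[30:34, 10:54] += 80             # a labeled sperm-tail-like stripe
img = np.clip(img, 0, 255)

# Separate positive signal from background with a simple global threshold
# (a stand-in for the macro's signal/background distinction).
threshold = img.mean() + 3 * img.std()
signal_mask = img > threshold

# Background-corrected mean intensity of the positive pixels.
background = img[~signal_mask].mean()
corrected = img[signal_mask].mean() - background
print(corrected > 0)
```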

  7. Copulation patterns in captive hamadryas baboons: a quantitative analysis.

    PubMed

    Nitsch, Florian; Stueckle, Sabine; Stahl, Daniel; Zinner, Dietmar

    2011-10-01

    For primates, as for many other vertebrates, copulation which results in ejaculation is a prerequisite for reproduction. The probability of ejaculation is affected by various physiological and social factors, for example reproductive state of male and female and operational sex-ratio. In this paper, we present quantitative and qualitative data on patterns of sexual behaviour in a captive group of hamadryas baboons (Papio hamadryas), a species with a polygynous-monandric mating system. We observed more than 700 copulations and analysed factors that can affect the probability of ejaculation. Multilevel logistic regression analysis and Akaike's information criterion (AIC) model selection procedures revealed that the probability of successful copulation increased as the size of female sexual swellings increased, indicating increased probability of ovulation, and as the number of females per one-male unit (OMU) decreased. In contrast, occurrence of female copulation calls, sex of the copulation initiator, and previous male aggression toward females did not affect the probability of ejaculation. Synchrony of oestrus cycles also had no effect (most likely because the sample size was too small). We also observed 29 extra-group copulations by two non-adult males. Our results indicate that male hamadryas baboons copulated more successfully around the time of ovulation and that males in large OMUs with many females may be confronted by time or energy-allocation problems.
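
    The modeling approach described — logistic regression on a binary outcome, with candidate models compared by AIC — can be sketched as below. This is a plain (single-level) logistic fit by Newton-Raphson on invented data, not the multilevel model used in the study; all coefficients and sample sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic data mimicking the setting: ejaculation (0/1) vs. swelling size
# and number of females per OMU. The coefficients below are invented.
n = 400
swelling = rng.uniform(0, 1, n)
omu_size = rng.integers(1, 8, n)
logit = -0.5 + 2.0 * swelling - 0.4 * omu_size
p = 1 / (1 + np.exp(-logit))
y = rng.binomial(1, p)

def fit_logistic(X, y, iters=50):
    """Newton-Raphson fit; returns coefficients and log-likelihood."""
    X = np.column_stack([np.ones(len(y)), X])
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = 1 / (1 + np.exp(-X @ beta))
        W = mu * (1 - mu)
        H = X.T @ (W[:, None] * X)
        beta += np.linalg.solve(H, X.T @ (y - mu))
    mu = 1 / (1 + np.exp(-X @ beta))
    ll = np.sum(y * np.log(mu) + (1 - y) * np.log(1 - mu))
    return beta, ll

def aic(ll, k):
    return 2 * k - 2 * ll

# Compare a full model against a swelling-only model by AIC (lower is better).
_, ll_full = fit_logistic(np.column_stack([swelling, omu_size]), y)
_, ll_small = fit_logistic(swelling[:, None], y)
aic_full, aic_small = aic(ll_full, 3), aic(ll_small, 2)
print(aic_full < aic_small)  # expected to favour the full model on this data
```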

  8. Quantitative image analysis of cell colocalization in murine bone marrow.

    PubMed

    Mokhtari, Zeinab; Mech, Franziska; Zehentmeier, Sandra; Hauser, Anja E; Figge, Marc Thilo

    2015-06-01

    Long-term antibody production is a key property of humoral immunity and is accomplished by long-lived plasma cells. They mainly reside in the bone marrow, whose importance as an organ hosting immunological memory is becoming increasingly evident. Signals provided by stromal cells and eosinophils may play an important role for plasma cell maintenance, constituting a survival microenvironment. In this joint study of experiment and theory, we investigated the spatial colocalization of plasma cells, eosinophils and B cells by applying an image-based systems biology approach. To this end, we generated confocal fluorescence microscopy images of histological sections from murine bone marrow that were subsequently analyzed in an automated fashion. This quantitative analysis was combined with computer simulations of the experimental system for hypothesis testing. In particular, we tested the observed spatial colocalization of cells in the bone marrow against the hypothesis that cells are found within available areas at positions that were drawn from a uniform random number distribution. We find that B cells and plasma cells highly colocalize with stromal cells, to an extent larger than in the simulated random situation. While B cells are preferentially in contact with each other, i.e., form clusters among themselves, plasma cells seem to be solitary or organized in aggregates, i.e., loosely defined groups of cells that are not necessarily in direct contact. Our data suggest that the plasma cell bone marrow survival niche facilitates colocalization of plasma cells with stromal cells and eosinophils, respectively, promoting plasma cell longevity.
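
    The hypothesis test described — observed colocalization versus cells placed at uniformly random positions — can be sketched as a Monte Carlo nearest-neighbour comparison. All positions and scales below are invented; the real analysis ran on segmented confocal images.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical 2-D positions (in μm) of stromal and plasma cells, with
# plasma cells deliberately placed near stromal cells.
stromal = rng.uniform(0, 100, (30, 2))
plasma = stromal[rng.integers(0, 30, 50)] + rng.normal(0, 2.0, (50, 2))

def mean_nn_dist(a, b):
    """Mean distance from each point in a to its nearest neighbour in b."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return d.min(axis=1).mean()

observed = mean_nn_dist(plasma, stromal)

# Null model: plasma cells placed uniformly at random in the same area.
null = np.array([mean_nn_dist(rng.uniform(0, 100, (50, 2)), stromal)
                 for _ in range(1000)])
p_value = (np.sum(null <= observed) + 1) / (len(null) + 1)
print(p_value < 0.05)  # colocalization stronger than random placement
```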

  9. Quantitative genetic analysis of flowering time in tomato.

    PubMed

    Jiménez-Gómez, José M; Alonso-Blanco, Carlos; Borja, Alicia; Anastasio, Germán; Angosto, Trinidad; Lozano, Rafael; Martínez-Zapater, José M

    2007-03-01

    Artificial selection of cultivated tomato (Solanum lycopersicum L.) has resulted in the generation of early-flowering, day-length-insensitive cultivars, despite its close relationship to other Solanum species that need more time and specific photoperiods to flower. To investigate the genetic mechanisms controlling flowering time in tomato and related species, we performed a quantitative trait locus (QTL) analysis for flowering time in an F2 mapping population derived from S. lycopersicum and its late-flowering wild relative S. chmielewskii. Flowering time was scored as the number of days from sowing to the opening of the first flower (days to flowering), and as the number of leaves under the first inflorescence (leaf number). QTL analyses detected 2 QTLs affecting days to flowering, which explained 55.3% of the total phenotypic variance, and 6 QTLs for leaf number, accounting for 66.7% of the corresponding phenotypic variance. Four of the leaf number QTLs had not previously been detected for this trait in tomato. Colocation of some QTLs with flowering-time genes included in the genetic map suggests PHYB2, FALSIFLORA, and a tomato FLC-like sequence as candidate genes that might have been targets of selection during the domestication of tomato.

  10. Early child grammars: qualitative and quantitative analysis of morphosyntactic production.

    PubMed

    Legendre, Géraldine

    2006-09-10

    This article reports on a series of 5 analyses of spontaneous production of verbal inflection (tense and person-number agreement) by 2-year-olds acquiring French as a native language. A formal analysis of the qualitative and quantitative results is developed using the unique resources of Optimality Theory (OT; Prince & Smolensky, 2004). It is argued that acquisition of morphosyntax proceeds via overlapping grammars (rather than through abrupt changes), which OT formalizes in terms of partial rather than total constraint rankings. Initially, economy of structure constraints take priority over faithfulness constraints that demand faithful expression of a speaker's intent, resulting in child production of tense that is comparable in level to that of child-directed speech. Using the independent Predominant Length of Utterance measure of syntactic development proposed in Vainikka, Legendre, and Todorova (1999), production of agreement is shown first to lag behind tense then to compete with tense at an intermediate stage of development. As the child's development progresses, faithfulness constraints become more dominant, and the overall production of tense and agreement becomes adult-like.

  11. Limits of normality of quantitative thoracic CT analysis

    PubMed Central

    2013-01-01

    Introduction Although computed tomography (CT) is widely used to investigate different pathologies, quantitative data from normal populations are scarce. Reference values may be useful to estimate the anatomical or physiological changes induced by various diseases. Methods We analyzed 100 helical CT scans taken for clinical purposes and referred as nonpathological by the radiologist. Profiles were manually outlined on each CT scan slice and each voxel was classified according to its gas/tissue ratio. For regional analysis, the lungs were divided into 10 sterno-vertebral levels. Results We studied 53 males and 47 females (age 64 ± 13 years); males had a greater total lung volume, lung gas volume and lung tissue. Noninflated tissue averaged 7 ± 4% of the total lung weight, poorly inflated tissue averaged 18 ± 3%, normally inflated tissue averaged 65 ± 8% and overinflated tissue averaged 11 ± 7%. We found a significant correlation between lung weight and subject's height (P <0.0001, r2 = 0.49); the total lung capacity in a supine position was 4,066 ± 1,190 ml, ~1,800 ml less than the predicted total lung capacity in a sitting position. Superimposed pressure averaged 2.6 ± 0.5 cmH2O. Conclusion Subjects without lung disease present significant amounts of poorly inflated and overinflated tissue. Normal lung weight can be predicted from patient's height with reasonable confidence. PMID:23706034
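
    The voxel classification by gas/tissue ratio can be sketched as follows, using the aeration thresholds conventional in quantitative CT; the HU cut-offs, simulated voxel values, and voxel volume are assumptions for illustration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical lung CT voxel values in Hounsfield units (HU). By the usual
# quantitative-CT convention, a voxel's gas fraction is -HU/1000
# (air = -1000 HU, tissue/water = 0 HU).
hu = rng.uniform(-1000, 100, 10000)

# Conventional aeration compartments used in quantitative CT analysis.
overinflated = np.sum(hu < -900)
normally = np.sum((hu >= -900) & (hu < -500))
poorly = np.sum((hu >= -500) & (hu < -100))
noninflated = np.sum(hu >= -100)

# Tissue (non-gas) fraction of each voxel; clip HU into the physical range.
tissue_frac = 1 + np.clip(hu, -1000, 0) / 1000.0

voxel_ml = 0.001  # assumed voxel volume in ml
tissue_ml = tissue_frac.sum() * voxel_ml
print(overinflated + normally + poorly + noninflated == len(hu))  # True
```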

  12. Quantitative measurement of odor detection thresholds using an air dilution olfactometer, and association with genetic variants in a sample of diverse ancestry

    PubMed Central

    Cook, Gillian R.; Krithika, S; Edwards, Melissa; Kavanagh, Paula

    2014-01-01

    Genetic association studies require a quantitative and reliable method for odor threshold assessment in order to examine the contribution of genetic variants to complex olfactory phenotypes. Our main goal was to assess the feasibility of a portable Scentroid air dilution olfactometer for use in such studies. Using the Scentroid SM110C and the SK5 n-butanol Sensitivity Kit (IDES Canada Inc.), n-butanol odor thresholds were determined for 182 individuals of diverse ancestry (mean age: 20.4 ± 2.5 years; n = 128 female; n = 54 male). Threshold scores from repeat participants were used to calculate a test–retest reliability coefficient, which was statistically significant (r = 0.754, p < 0.001, n = 29), indicating that the Scentroid provides reliable estimates of odor thresholds. In addition, we performed a preliminary genetic analysis evaluating the potential association of n-butanol odor thresholds to six single-nucleotide polymorphisms (SNPs) putatively involved in general olfactory sensitivity (GOS). The results of multiple linear regression analysis revealed no significant association between the SNPs tested and threshold scores. However, our sample size was relatively small, and our study was only powered to identify genetic markers with strong effects on olfactory sensitivity. Overall, we find that the Scentroid provides reliable quantitative measures of odor detection threshold and is well suited for genetic studies of olfactory sensitivity. PMID:25392755
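
    The test-retest reliability coefficient reported above is a Pearson correlation between the two sessions' scores; a minimal sketch with invented threshold scores:

```python
import numpy as np

# Hypothetical test and retest n-butanol threshold scores (dilution steps).
test = np.array([4.0, 6.0, 5.0, 8.0, 3.0, 7.0, 5.0, 6.0, 4.0, 9.0])
retest = np.array([5.0, 6.0, 4.0, 7.0, 3.0, 8.0, 5.0, 5.0, 4.0, 8.0])

# Test-retest reliability as the Pearson correlation between sessions.
r = np.corrcoef(test, retest)[0, 1]
print(round(r, 3))
```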

  13. Communication about vaccinations in Italian websites: a quantitative analysis.

    PubMed

    Tafuri, Silvio; Gallone, Maria S; Gallone, Maria F; Zorico, Ivan; Aiello, Valeria; Germinario, Cinzia

    2014-01-01

    Babies' parents and people who look for information about vaccination often visit anti-vaccine movements' websites, blogs by naturopathic physicians, or natural and alternative medicine practitioners. The aim of this work is to provide a quantitative analysis of the type of information available to Italian people regarding vaccination and a quality analysis of the websites retrieved through our searches. A quality score was created to evaluate the technical level of websites. A search was performed through Yahoo, Google, and MSN using the keywords "vaccine" and "vaccination," combined with the function "OR," in order to identify the most frequently used websites. The two keywords were input in Italian, and the first 15 pages retrieved by each search engine were analyzed. 149 websites were selected through this methodology. Fifty-three percent of the websites belonged to associations, groups, or scientific companies, 32.2% (n = 48) consisted of personal blogs, and 14.8% (n = 22) belonged to National Health System offices. Among all analyzed websites, 15.4% (n = 23) came from anti-vaccine movement groups. 37.6% reported the webmaster's name, 67.8% the webmaster's e-mail, 28.6% indicated the date of the last update, and 46.6% the author's name. The quality score for government sites was higher on average than for anti-vaccine websites, although government sites do not use Web 2.0 functions such as forums. National Health System institutions that have to promote vaccination cannot avoid investing in web communication, because it cannot be managed by private efforts alone but must be the result of synergy among Public Health, private and scientific associations, and social movements.

  14. Quantitative Analysis of Human Cancer Cell Extravasation Using Intravital Imaging.

    PubMed

    Willetts, Lian; Bond, David; Stoletov, Konstantin; Lewis, John D

    2016-01-01

Metastasis, or the spread of cancer cells from a primary tumor to distant sites, is the leading cause of cancer-associated death. Metastasis is a complex multi-step process comprising invasion, intravasation, survival in circulation, extravasation, and formation of metastatic colonies. Currently, in vitro assays are limited in their ability to investigate these intricate processes and do not faithfully reflect metastasis as it occurs in vivo. Traditional in vivo models of metastasis are limited in their ability to visualize the seemingly sporadic behavior of where and when cancer cells spread (Reymond et al., Nat Rev Cancer 13:858-870, 2013). The avian embryo model of metastasis is a powerful platform to study many of the critical steps in the metastatic cascade including the migration, extravasation, and invasion of human cancer cells in vivo (Sung et al., Nat Commun 6:7164, 2015; Leong et al., Cell Rep 8:1558-1570, 2014; Kain et al., Dev Dyn 243:216-28, 2014; Leong et al., Nat Protoc 5:1406-17, 2010; Zijlstra et al., Cancer Cell 13:221-234, 2008; Palmer et al., J Vis Exp 51:2815, 2011). The chicken chorioallantoic membrane (CAM) is a readily accessible and well-vascularized tissue that surrounds the developing embryo. When the chicken embryo is grown in a shell-less, ex ovo environment, the nearly transparent CAM provides an ideal environment for high-resolution fluorescent microscopy approaches. In this model, the embryonic chicken vasculature and labeled cancer cells can be visualized simultaneously to investigate specific steps in the metastatic cascade including extravasation. When combined with the proper image analysis tools, the ex ovo chicken embryo model offers a cost-effective and high-throughput platform for the quantitative analysis of tumor cell metastasis in a physiologically relevant in vivo setting. Here we discuss detailed procedures to quantify cancer cell extravasation in the shell-less chicken embryo model with advanced fluorescence

  15. Quantitative petrographic analysis of Cretaceous sandstones from southwest Montana

    SciTech Connect

Dyman, T.S.; Krystinik, K.B.; Takahashi, K.I.

    1986-05-01

The Albian Blackleaf Formation and the Cenomanian lower Frontier Formation in southwest Montana lie within or east of the fold and thrust belt in the Cretaceous foreland basin complex. The petrography of these strata records a complex interaction between source-area tectonism, basin subsidence, and sedimentation patterns associated with a cyclic sequence of transgressions and regressions. Because the petrographic data set was large (127 thin sections) and difficult to interpret subjectively, statistical techniques were used to establish sample and variable relationships. Theta-mode cluster and correspondence analysis were used to determine the contributing effect (total variance) of key framework grains. Monocrystalline quartz, plagioclase, potassium feldspar, and sandstone-, limestone-, and volcanic-lithic grain content contribute most to the variation in the framework-grain population. Theta-mode cluster and correspondence analysis were also used to identify six petrofacies. Lower Blackleaf petrofacies (I-III) contain abundant monocrystalline quartz (55-90%) and sedimentary lithic grains (10-50%), which are distributed throughout the study area. Petrofacies I-III are differentiated by variable monocrystalline quartz and sedimentary lithic grain content. Upper Blackleaf and lower Frontier petrofacies (IV-VI) exhibit highly variable sedimentary and volcanic lithic ratios and contain less monocrystalline quartz (20-50%) than the lower Blackleaf petrofacies. Information from the quantitative analyses combined with available paleocurrent data indicates that Blackleaf and lower Frontier detritus was derived from varying source areas through time. Lower Blackleaf detritus was derived from Precambrian through Paleozoic sedimentary terranes to the west, north, and east, whereas upper Blackleaf and lower Frontier detritus was derived from both sedimentary and volcanic terranes to the south.
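The sample-to-sample comparisons that underlie this kind of clustering of framework-grain counts can be illustrated with a simple similarity measure. This is an illustrative sketch with invented point counts, not the study's data, and cosine similarity stands in for the distance/similarity measures used in the actual cluster and correspondence analyses:

```python
import math

# Hypothetical framework-grain point counts for three thin sections:
# [monocrystalline quartz, plagioclase, K-feldspar, sedimentary lithics, volcanic lithics]
samples = {
    "Blackleaf-I":  [270, 10, 5, 110, 5],
    "Blackleaf-II": [240, 15, 10, 130, 5],
    "Frontier-V":   [120, 40, 25, 90, 125],
}

def cosine_similarity(a, b):
    """Cosine similarity between two grain-count vectors (1.0 = identical mix)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Quartz-rich lower Blackleaf samples should cluster together,
# apart from the volcanic-lithic-rich Frontier sample.
sim_I_II = cosine_similarity(samples["Blackleaf-I"], samples["Blackleaf-II"])
sim_I_V = cosine_similarity(samples["Blackleaf-I"], samples["Frontier-V"])
```

A full petrofacies analysis would build a similarity (or distance) matrix over all 127 thin sections and cluster it; the ordering `sim_I_II > sim_I_V` is the property such clustering exploits.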

  16. Using Qualitative and Quantitative Methods to Choose a Habitat Quality Metric for Air Pollution Policy Evaluation.

    PubMed

    Rowe, Edwin C; Ford, Adriana E S; Smart, Simon M; Henrys, Peter A; Ashmore, Mike R

    2016-01-01

    Atmospheric nitrogen (N) deposition has had detrimental effects on species composition in a range of sensitive habitats, although N deposition can also increase agricultural productivity and carbon storage, and favours a few species considered of importance for conservation. Conservation targets are multiple, and increasingly incorporate services derived from nature as well as concepts of intrinsic value. Priorities vary. How then should changes in a set of species caused by drivers such as N deposition be assessed? We used a novel combination of qualitative semi-structured interviews and quantitative ranking to elucidate the views of conservation professionals specialising in grasslands, heathlands and mires. Although conservation management goals are varied, terrestrial habitat quality is mainly assessed by these specialists on the basis of plant species, since these are readily observed. The presence and abundance of plant species that are scarce, or have important functional roles, emerged as important criteria for judging overall habitat quality. However, species defined as 'positive indicator-species' (not particularly scarce, but distinctive for the habitat) were considered particularly important. Scarce species are by definition not always found, and the presence of functionally important species is not a sufficient indicator of site quality. Habitat quality as assessed by the key informants was rank-correlated with the number of positive indicator-species present at a site for seven of the nine habitat classes assessed. Other metrics such as species-richness or a metric of scarcity were inconsistently or not correlated with the specialists' assessments. We recommend that metrics of habitat quality used to assess N pollution impacts are based on the occurrence of, or habitat-suitability for, distinctive species. Metrics of this type are likely to be widely applicable for assessing habitat change in response to different drivers. The novel combined

  17. Using Qualitative and Quantitative Methods to Choose a Habitat Quality Metric for Air Pollution Policy Evaluation

    PubMed Central

    Ford, Adriana E. S.; Smart, Simon M.; Henrys, Peter A.; Ashmore, Mike R.

    2016-01-01

    Atmospheric nitrogen (N) deposition has had detrimental effects on species composition in a range of sensitive habitats, although N deposition can also increase agricultural productivity and carbon storage, and favours a few species considered of importance for conservation. Conservation targets are multiple, and increasingly incorporate services derived from nature as well as concepts of intrinsic value. Priorities vary. How then should changes in a set of species caused by drivers such as N deposition be assessed? We used a novel combination of qualitative semi-structured interviews and quantitative ranking to elucidate the views of conservation professionals specialising in grasslands, heathlands and mires. Although conservation management goals are varied, terrestrial habitat quality is mainly assessed by these specialists on the basis of plant species, since these are readily observed. The presence and abundance of plant species that are scarce, or have important functional roles, emerged as important criteria for judging overall habitat quality. However, species defined as ‘positive indicator-species’ (not particularly scarce, but distinctive for the habitat) were considered particularly important. Scarce species are by definition not always found, and the presence of functionally important species is not a sufficient indicator of site quality. Habitat quality as assessed by the key informants was rank-correlated with the number of positive indicator-species present at a site for seven of the nine habitat classes assessed. Other metrics such as species-richness or a metric of scarcity were inconsistently or not correlated with the specialists’ assessments. We recommend that metrics of habitat quality used to assess N pollution impacts are based on the occurrence of, or habitat-suitability for, distinctive species. Metrics of this type are likely to be widely applicable for assessing habitat change in response to different drivers. The novel combined

  18. Quantitative analysis of LISA pathfinder test-mass noise

    NASA Astrophysics Data System (ADS)

    Ferraioli, Luigi; Congedo, Giuseppe; Hueller, Mauro; Vitale, Stefano; Hewitson, Martin; Nofrarias, Miquel; Armano, Michele

    2011-12-01

    LISA Pathfinder (LPF) is a mission aiming to test the critical technology for the forthcoming space-based gravitational-wave detectors. The main scientific objective of the LPF mission is to demonstrate test masses free falling with residual accelerations below 3×10⁻¹⁴ m s⁻²/√Hz at 1 mHz. Reaching such an ambitious target will require a significant amount of system optimization and characterization, which will in turn require accurate and quantitative noise analysis procedures. In this paper, we discuss two main problems associated with the analysis of the data from LPF: i) excess noise detection and ii) noise parameter identification. The mission is focused on the low-frequency region ([0.1, 10] mHz) of the available signal spectrum. In such a region, the signal is dominated by the force noise acting on test masses. At the same time, the mission duration is limited to 90 days and typical data segments will be 24 hours in length. Considering those constraints, noise analysis is expected to deal with a limited amount of non-Gaussian data, since the spectrum statistics will be far from Gaussian and the lowest available frequency is limited by the data length. In this paper, we analyze the details of the expected statistics for spectral data and develop two suitable excess noise estimators. One is based on the statistical properties of the integrated spectrum, the other is based on the Kolmogorov-Smirnov test. The sensitivity of the estimators is discussed theoretically for independent data, then the algorithms are tested on LPF synthetic data. The test on realistic LPF data allows the effect of spectral data correlations on the efficiency of the different noise excess estimators to be highlighted. It also reveals the versatility of the Kolmogorov-Smirnov approach, which can be adapted to provide reasonable results on correlated data from a modified version of the standard equations for the inversion of the test statistic. Closely related to excess noise detection, the
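One way to realize the Kolmogorov-Smirnov approach mentioned above: for Gaussian noise, normalized periodogram powers follow an exponential distribution, so a one-sample KS statistic against that null can flag excess noise. A sketch on simulated spectra (not LPF data, and simpler than the paper's estimators, which also handle spectral correlations):

```python
import math
import random

def ks_statistic_exponential(power, mean_power):
    """One-sample KS statistic of periodogram powers against an exponential
    null (the expected distribution of Gaussian-noise spectral bins)."""
    xs = sorted(p / mean_power for p in power)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        cdf = 1.0 - math.exp(-x)          # exponential(1) CDF
        d = max(d, abs((i + 1) / n - cdf), abs(i / n - cdf))
    return d

random.seed(42)
# Simulated periodogram: exponential powers mimic a pure Gaussian-noise spectrum
quiet = [random.expovariate(1.0) for _ in range(500)]
# "Excess noise" segment: a few bins strongly inflated, e.g. by a spectral feature
noisy = quiet[:480] + [20.0 + random.expovariate(1.0) for _ in range(20)]

d_quiet = ks_statistic_exponential(quiet, sum(quiet) / len(quiet))
d_noisy = ks_statistic_exponential(noisy, sum(noisy) / len(noisy))
```

A large KS statistic relative to its null distribution for the given number of bins signals a departure from the expected noise statistics.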

  19. Column precipitation chromatography: an approach to quantitative analysis of eigencolloids.

    PubMed

    Breynaert, E; Maes, A

    2005-08-01

    A new column precipitation chromatography (CPC) technique, capable of quantitatively measuring technetium eigencolloids in aqueous solutions, is presented. The CPC technique is based on the destabilization and precipitation of eigencolloids by polycations in a confined matrix. Tc(IV) colloids can be quantitatively determined from their precipitation onto the CPC column (separation step) and their subsequent elution upon oxidation to pertechnetate by peroxide (elution step). A clean-bed particle removal model was used to explain the experimental results. PMID:16053321

  20. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Cruikshank, Dale P.; MoreauDalleOre, Cristina; Pendleton, Yvonne J.; Clark, Roger Nelson

    2012-01-01

We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites, Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visible-Infrared Mapping Spectrometer on close fly-bys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at approximately 3.28 micrometers (approximately 3050 per centimeter), and there are four blended bands of aliphatic -CH2- and -CH3 in the range approximately 3.36-3.52 micrometers (approximately 2980-2840 per centimeter). In these data, the aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is NAro:NAliph approximately 24; for Hyperion the value is approximately 12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundances of aliphatic -CH2- and -CH3- are an indication of the lengths of the molecular chain structures, hence the degree of modification of the original material. We derive CH2:CH3 approximately 2.2 in the spectrum of low-albedo material on Iapetus; this value is the same, within measurement errors, as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.

  1. Descriptive Quantitative Analysis of Rearfoot Alignment Radiographic Parameters.

    PubMed

    Meyr, Andrew J; Wagoner, Matthew R

    2015-01-01

Although the radiographic parameters of the transverse talocalcaneal angle (tTCA), calcaneocuboid angle (CCA), talar head uncovering (THU), calcaneal inclination angle (CIA), talar declination angle (TDA), lateral talar-first metatarsal angle (lTFA), and lateral talocalcaneal angle (lTCA) form the basis of the preoperative evaluation and procedure selection for pes planovalgus deformity, the so-called normal values of these measurements are not well established. The objectives of the present study were, first, to retrospectively evaluate the descriptive statistics of these radiographic parameters (tTCA, CCA, THU, CIA, TDA, lTFA, and lTCA) in a large population and, second, to determine an objective basis for defining "normal" versus "abnormal" measurements. As a secondary outcome, the relationship of these variables to the body mass index was assessed. Anteroposterior and lateral foot radiographs from 250 consecutive patients without a history of previous foot and ankle surgery and/or trauma were evaluated. The results revealed mean measurements of 24.12°, 13.20°, 74.32%, 16.41°, 26.64°, 8.37°, and 43.41° for the tTCA, CCA, THU, CIA, TDA, lTFA, and lTCA, respectively. These were generally in line with the reported historical normal values. Descriptive statistical analysis demonstrated that the tTCA, THU, and TDA met the standards to be considered normally distributed, but the CCA, CIA, lTFA, and lTCA demonstrated data characteristics of both parametric and nonparametric distributions. Furthermore, only the CIA (R = -0.2428) and lTCA (R = -0.2449) demonstrated substantial correlation with the body mass index. No differentiations in deformity progression were observed when the radiographic parameters were plotted against each other, so no quantitative basis for defining "normal" versus "abnormal" measurements emerged. PMID:26002682
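A mean ± 2 SD rule is one conventional way to turn descriptive statistics like these into "normal" versus "abnormal" cutoffs, though the study found that such a basis did not clearly emerge from its data. The values below are hypothetical, not the study's measurements:

```python
import math

def mean_sd(values):
    """Sample mean and sample standard deviation (n - 1 denominator)."""
    n = len(values)
    m = sum(values) / n
    sd = math.sqrt(sum((v - m) ** 2 for v in values) / (n - 1))
    return m, sd

# Hypothetical calcaneal inclination angle (CIA) measurements, in degrees
cia = [14.0, 16.5, 18.0, 15.5, 17.0, 16.0, 19.5, 13.5, 16.5, 17.5]
m, sd = mean_sd(cia)
# Conventional (assumed) cutoff: flag values outside mean +/- 2 SD as "abnormal"
normal_range = (m - 2 * sd, m + 2 * sd)
```

This rule presumes a roughly normal distribution, which is exactly why the study's finding of mixed parametric/nonparametric behavior for several parameters matters.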

  2. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    NASA Astrophysics Data System (ADS)

    Cruikshank, D. P.; Dalle Ore, C. M.; Pendleton, Y. J.; Clark, R. N.

    2012-12-01

We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites, Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visible-Infrared Mapping Spectrometer on close fly-bys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at ~3.28 μm (~3050 cm-1), and there are four blended bands of aliphatic -CH2- and -CH3 in the range ~3.36-3.52 μm (~2980-2840 cm-1). In these data, the aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is NAro:NAliph ~24; for Hyperion the value is ~12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundances of aliphatic -CH2- and -CH3- are an indication of the lengths of the molecular chain structures, hence the degree of modification of the original material. We derive CH2:CH3 ~2.2 in the spectrum of low-albedo material on Iapetus; this value is the same, within measurement errors, as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.
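Ratios such as NAro:NAliph and CH2:CH3 are derived from integrated band areas. A sketch of trapezoidal band-area integration over the aromatic and aliphatic wavelength windows, using invented continuum-removed band depths (not Cassini VIMS data):

```python
def band_area(wavelengths_um, band_depth, lo_um, hi_um):
    """Trapezoidal area of a continuum-removed absorption band
    between lo_um and hi_um micrometers."""
    pts = [(w, d) for w, d in zip(wavelengths_um, band_depth)
           if lo_um <= w <= hi_um]
    return sum(0.5 * (d0 + d1) * (w1 - w0)
               for (w0, d0), (w1, d1) in zip(pts, pts[1:]))

# Hypothetical continuum-removed band depths: a strong aromatic feature
# near 3.28 um and weaker aliphatic features in the 3.36-3.52 um range
wl = [3.26, 3.28, 3.30, 3.40, 3.44, 3.48, 3.52]
depth = [0.010, 0.022, 0.010, 0.008, 0.010, 0.009, 0.006]

aromatic = band_area(wl, depth, 3.25, 3.31)
aliphatic = band_area(wl, depth, 3.35, 3.53)
ratio = aromatic / aliphatic
```

Note that a raw area ratio is not yet an abundance ratio: converting it to NAro:NAliph additionally requires the intrinsic absorption strength of each C-H mode.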

  3. Quantitative analysis of harmonic convergence in mosquito auditory interactions.

    PubMed

    Aldersley, Andrew; Champneys, Alan; Homer, Martin; Robert, Daniel

    2016-04-01

    This article analyses the hearing and behaviour of mosquitoes in the context of inter-individual acoustic interactions. The acoustic interactions of tethered live pairs of Aedes aegypti mosquitoes, from same and opposite sex mosquitoes of the species, are recorded on independent and unique audio channels, together with the response of tethered individual mosquitoes to playbacks of pre-recorded flight tones of lone or paired individuals. A time-dependent representation of each mosquito's non-stationary wing beat frequency signature is constructed, based on Hilbert spectral analysis. A range of algorithmic tools is developed to automatically analyse these data, and used to perform a robust quantitative identification of the 'harmonic convergence' phenomenon. The results suggest that harmonic convergence is an active phenomenon, which does not occur by chance. It occurs for live pairs, as well as for lone individuals responding to playback recordings, whether from the same or opposite sex. Male-female behaviour is dominated by frequency convergence at a wider range of harmonic combinations than previously reported, and requires participation from both partners in the duet. New evidence is found to show that male-male interactions are more varied than strict frequency avoidance. Rather, they can be divided into two groups: convergent pairs, typified by tightly bound wing beat frequencies, and divergent pairs, that remain widely spaced in the frequency domain. Overall, the results reveal that mosquito acoustic interaction is a delicate and intricate time-dependent active process that involves both individuals, takes place at many different frequencies, and which merits further enquiry. PMID:27053654
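Tracking a mosquito's time-varying wing-beat frequency is the core measurement here. As a crude, stdlib-only stand-in for the paper's Hilbert spectral analysis, a zero-crossing estimator recovers the frequency of a clean tone; the 600 Hz test tone and 8 kHz sample rate below are invented for illustration:

```python
import math

def zero_crossing_frequency(signal, sample_rate):
    """Estimate a tone's frequency from upward zero crossings.
    A lightweight stand-in for Hilbert-based instantaneous frequency."""
    crossings = [i for i in range(1, len(signal))
                 if signal[i - 1] < 0 <= signal[i]]
    if len(crossings) < 2:
        return 0.0
    # mean sample spacing between successive upward crossings = one period
    period = (crossings[-1] - crossings[0]) / (len(crossings) - 1)
    return sample_rate / period

fs = 8000.0
t = [i / fs for i in range(4000)]
# Simulated wing-beat tone near 600 Hz
tone = [math.sin(2 * math.pi * 600.0 * x) for x in t]
f_est = zero_crossing_frequency(tone, fs)
```

Unlike this estimator, the Hilbert approach yields frequency as a function of time, which is what makes the convergence/divergence dynamics between two mosquitoes observable.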

  4. Quantitative Analysis of the Effective Functional Structure in Yeast Glycolysis

    PubMed Central

    De la Fuente, Ildefonso M.; Cortes, Jesus M.

    2012-01-01

    The understanding of the effective functionality that governs the enzymatic self-organized processes in cellular conditions is a crucial topic in the post-genomic era. In recent studies, Transfer Entropy has been proposed as a rigorous, robust and self-consistent method for the causal quantification of the functional information flow among nonlinear processes. Here, in order to quantify the functional connectivity for the glycolytic enzymes in dissipative conditions we have analyzed different catalytic patterns using the technique of Transfer Entropy. The data were obtained by means of a yeast glycolytic model formed by three delay differential equations where the enzymatic rate equations of the irreversible stages have been explicitly considered. These enzymatic activity functions were previously modeled and tested experimentally by other different groups. The results show the emergence of a new kind of dynamical functional structure, characterized by changing connectivity flows and a metabolic invariant that constrains the activity of the irreversible enzymes. In addition to the classical topological structure characterized by the specific location of enzymes, substrates, products and feedback-regulatory metabolites, an effective functional structure emerges in the modeled glycolytic system, which is dynamical and characterized by notable variations of the functional interactions. The dynamical structure also exhibits a metabolic invariant which constrains the functional attributes of the enzymes. Finally, in accordance with the classical biochemical studies, our numerical analysis reveals in a quantitative manner that the enzyme phosphofructokinase is the key-core of the metabolic system, behaving for all conditions as the main source of the effective causal flows in yeast glycolysis. PMID:22393350
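Transfer entropy, as used above, quantifies directed information flow between two processes. A minimal discrete version (binary states, history length 1), far simpler than the delay-differential glycolysis model in the study, illustrates its asymmetry on synthetic series where Y copies X with a one-step delay:

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Discrete transfer entropy TE(X -> Y) in bits, history length 1:
    sum over p(y_{t+1}, y_t, x_t) * log2[ p(y_{t+1}|y_t,x_t) / p(y_{t+1}|y_t) ]."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))          # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))           # (y_{t+1}, y_t)
    singles_y = Counter(y[:-1])                      # y_t
    n = len(x) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]
        p_cond_y = pairs_yy[(y1, y0)] / singles_y[y0]
        te += p_joint * math.log2(p_cond_full / p_cond_y)
    return te

random.seed(1)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]          # y copies x with one-step delay: strong X -> Y flow
te_xy = transfer_entropy(x, y)   # ~1 bit
te_yx = transfer_entropy(y, x)   # ~0 bits
```

The asymmetry `te_xy >> te_yx` is what lets the method identify causal sources (such as phosphofructokinase in the modeled glycolytic system) rather than mere correlations.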

  5. Quantitative analysis of mycoflora on commercial domestic fruits in Japan.

    PubMed

    Watanabe, Maiko; Tsutsumi, Fumiyuki; Konuma, Rumi; Lee, Ken-Ichi; Kawarada, Kensuke; Sugita-Konishi, Yoshiko; Kumagai, Susumu; Takatori, Kosuke; Konuma, Hirotaka; Hara-Kudo, Yukiko

    2011-09-01

    A comprehensive and quantitative analysis of the mycoflora on the surface of commercial fruit was performed. Nine kinds of fruits grown in Japan were tested. Overall fungal counts on the fruits ranged from 3.1 to 6.5 log CFU/g. The mean percentages of the total yeast counts were higher than those of molds in samples of apples, Japanese pears, and strawberries, ranging from 58.5 to 67.0%, and were lower than those of molds in samples of the other six fruits, ranging from 9.8 to 48.3%. Cladosporium was the most frequent fungus and was found in samples of all nine types of fruits, followed by Penicillium found in eight types of fruits. The fungi with the highest total counts in samples of the various fruits were Acremonium in cantaloupe melons (47.6% of the total fungal count), Aspergillus in grapes (32.2%), Aureobasidium in apples (21.3%), blueberries (63.6%), and peaches (33.6%), Cladosporium in strawberries (38.4%), Cryptococcus in Japanese pears (37.6%), Penicillium in mandarins (22.3%), and Sporobolomyces in lemons (26.9%). These results demonstrated that the mycoflora on the surfaces of these fruits mainly consists of common pre- and postharvest inhabitants of the plants or in the environment; fungi that produce mycotoxins or cause market diseases were not prominent in the mycoflora of healthy fruits. These findings suggest fruits should be handled carefully with consideration given to fungal contaminants, including nonpathogenic fungi, to control the quality of fruits and processed fruit products. PMID:21902918

  6. Quantitative analysis of harmonic convergence in mosquito auditory interactions

    PubMed Central

    Aldersley, Andrew; Champneys, Alan; Robert, Daniel

    2016-01-01

    This article analyses the hearing and behaviour of mosquitoes in the context of inter-individual acoustic interactions. The acoustic interactions of tethered live pairs of Aedes aegypti mosquitoes, from same and opposite sex mosquitoes of the species, are recorded on independent and unique audio channels, together with the response of tethered individual mosquitoes to playbacks of pre-recorded flight tones of lone or paired individuals. A time-dependent representation of each mosquito's non-stationary wing beat frequency signature is constructed, based on Hilbert spectral analysis. A range of algorithmic tools is developed to automatically analyse these data, and used to perform a robust quantitative identification of the ‘harmonic convergence’ phenomenon. The results suggest that harmonic convergence is an active phenomenon, which does not occur by chance. It occurs for live pairs, as well as for lone individuals responding to playback recordings, whether from the same or opposite sex. Male–female behaviour is dominated by frequency convergence at a wider range of harmonic combinations than previously reported, and requires participation from both partners in the duet. New evidence is found to show that male–male interactions are more varied than strict frequency avoidance. Rather, they can be divided into two groups: convergent pairs, typified by tightly bound wing beat frequencies, and divergent pairs, that remain widely spaced in the frequency domain. Overall, the results reveal that mosquito acoustic interaction is a delicate and intricate time-dependent active process that involves both individuals, takes place at many different frequencies, and which merits further enquiry. PMID:27053654

  7. The next generation of low-cost personal air quality sensors for quantitative exposure monitoring

    NASA Astrophysics Data System (ADS)

    Piedrahita, R.; Xiang, Y.; Masson, N.; Ortega, J.; Collier, A.; Jiang, Y.; Li, K.; Dick, R.; Lv, Q.; Hannigan, M.; Shang, L.

    2014-03-01

Advances in embedded systems and low-cost gas sensors are enabling a new wave of low-cost air quality monitoring tools. Our team has been engaged in the development of low-cost wearable air quality monitors (M-Pods) using the Arduino platform. The M-Pods use commercially available metal oxide semiconductor (MOx) sensors to measure CO, O3, NO2, and total VOCs, and NDIR sensors to measure CO2. MOx sensors are low in cost and show high sensitivity near ambient levels; however, they display non-linear output signals and have cross-sensitivity effects. Thus, a quantification system was developed to convert the MOx sensor signals into concentrations. Two deployments were conducted at a regulatory monitoring station in Denver, Colorado. M-Pod concentrations were determined using laboratory calibration techniques and co-location calibrations, in which we place the M-Pods near regulatory monitors and then derive calibration function coefficients using the regulatory monitors as the standard. The form of the calibration function was derived from laboratory experiments. We discuss various techniques used to estimate measurement uncertainties. A separate user study was also conducted to assess personal exposure and M-Pod reliability. In this study, 10 M-Pods were calibrated via co-location multiple times over 4 weeks, and sensor drift was analyzed, with the result being a calibration function that included drift. We found that co-location calibrations perform better than laboratory calibrations; lab calibrations suffer from bias and difficulty in covering the necessary parameter space. During co-location calibrations, median standard errors ranged between 4.0-6.1 ppb for O3, 6.4-8.4 ppb for NO2, 0.28-0.44 ppm for CO, and 16.8 ppm for CO2. Median signal-to-noise (S/N) ratios were lower for the M-Pod sensors than for the regulatory instruments: for NO2, 3.6 compared to 23.4; for O3, 1.4 compared to 1.6; for CO, 1.1 compared to 10.0; and for CO2, 42.2 compared to 300
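In its simplest form, the co-location calibration described above reduces to regressing reference-monitor concentrations on raw sensor signals. A sketch with invented O3 data; real M-Pod calibration functions also include temperature, humidity, cross-sensitivity, and drift terms:

```python
def fit_linear(x, y):
    """Ordinary least-squares fit y = a + b*x."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    return a, b

# Hypothetical co-location data: raw MOx sensor resistance ratio (unitless)
# vs regulatory-monitor O3 concentration (ppb). MOx resistance typically
# falls as oxidant concentration rises, hence the negative slope.
raw = [0.80, 0.72, 0.65, 0.60, 0.55, 0.50]
ref_o3 = [10.0, 20.0, 30.0, 38.0, 45.0, 52.0]
a, b = fit_linear(raw, ref_o3)

def predict(r):
    """Convert a raw sensor reading into a calibrated concentration (ppb)."""
    return a + b * r
```

Residuals of such a fit against the reference monitor are one way to obtain the standard errors quoted in the abstract.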

  8. Air pollution and venous thrombosis: a meta-analysis

    PubMed Central

    Tang, Liang; Wang, Qing-Yun; Cheng, Zhi-Peng; Hu, Bei; Liu, Jing-Di; Hu, Yu

    2016-01-01

Exposure to air pollution has been linked to cardiovascular and respiratory disorders. However, the effect of air pollution on venous thrombotic disorders is uncertain. We performed a meta-analysis to assess the association between air pollution and venous thrombosis. PubMed, Embase, EBM Reviews, Healthstar, Global Health, Nursing Database, and Web of Science were searched for citations on air pollutants (carbon monoxide, sulfur dioxide, nitrogen dioxide, ozone, and particulate matter) and venous thrombosis. Using a random-effects model, overall risk estimates were derived for each increment of 10 μg/m3 of pollutant concentration. Of the 485 in-depth reviewed studies, 8 citations, involving approximately 700,000 events, fulfilled the inclusion criteria. None of the main air pollutants analyzed was associated with an increased risk of venous thrombosis (OR = 1.005, 95% CI = 0.998–1.012 for PM2.5; OR = 0.995, 95% CI = 0.984–1.007 for PM10; OR = 1.006, 95% CI = 0.994–1.019 for NO2). Based on exposure period and thrombosis location, additional subgroup analyses provided results comparable with those of the overall analyses. There was no evidence of publication bias. Therefore, this meta-analysis does not support air pollution as a risk factor for venous thrombosis in the general population. PMID:27600652

  9. Air pollution and venous thrombosis: a meta-analysis

    NASA Astrophysics Data System (ADS)

    Tang, Liang; Wang, Qing-Yun; Cheng, Zhi-Peng; Hu, Bei; Liu, Jing-Di; Hu, Yu

    2016-09-01

Exposure to air pollution has been linked to cardiovascular and respiratory disorders. However, the effect of air pollution on venous thrombotic disorders is uncertain. We performed a meta-analysis to assess the association between air pollution and venous thrombosis. PubMed, Embase, EBM Reviews, Healthstar, Global Health, Nursing Database, and Web of Science were searched for citations on air pollutants (carbon monoxide, sulfur dioxide, nitrogen dioxide, ozone, and particulate matter) and venous thrombosis. Using a random-effects model, overall risk estimates were derived for each increment of 10 μg/m3 of pollutant concentration. Of the 485 in-depth reviewed studies, 8 citations, involving approximately 700,000 events, fulfilled the inclusion criteria. None of the main air pollutants analyzed was associated with an increased risk of venous thrombosis (OR = 1.005, 95% CI = 0.998–1.012 for PM2.5; OR = 0.995, 95% CI = 0.984–1.007 for PM10; OR = 1.006, 95% CI = 0.994–1.019 for NO2). Based on exposure period and thrombosis location, additional subgroup analyses provided results comparable with those of the overall analyses. There was no evidence of publication bias. Therefore, this meta-analysis does not support air pollution as a risk factor for venous thrombosis in the general population.

  10. Air pollution and venous thrombosis: a meta-analysis.

    PubMed

    Tang, Liang; Wang, Qing-Yun; Cheng, Zhi-Peng; Hu, Bei; Liu, Jing-Di; Hu, Yu

    2016-01-01

Exposure to air pollution has been linked to cardiovascular and respiratory disorders. However, the effect of air pollution on venous thrombotic disorders is uncertain. We performed a meta-analysis to assess the association between air pollution and venous thrombosis. PubMed, Embase, EBM Reviews, Healthstar, Global Health, Nursing Database, and Web of Science were searched for citations on air pollutants (carbon monoxide, sulfur dioxide, nitrogen dioxide, ozone, and particulate matter) and venous thrombosis. Using a random-effects model, overall risk estimates were derived for each increment of 10 μg/m3 of pollutant concentration. Of the 485 in-depth reviewed studies, 8 citations, involving approximately 700,000 events, fulfilled the inclusion criteria. None of the main air pollutants analyzed was associated with an increased risk of venous thrombosis (OR = 1.005, 95% CI = 0.998-1.012 for PM2.5; OR = 0.995, 95% CI = 0.984-1.007 for PM10; OR = 1.006, 95% CI = 0.994-1.019 for NO2). Based on exposure period and thrombosis location, additional subgroup analyses provided results comparable with those of the overall analyses. There was no evidence of publication bias. Therefore, this meta-analysis does not support air pollution as a risk factor for venous thrombosis in the general population. PMID:27600652
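Random-effects pooling of per-study risk estimates, as used in this meta-analysis, is most commonly the DerSimonian-Laird estimator (the abstract does not name the estimator; DL is assumed here). A sketch with invented per-study log odds ratios and variances for a PM2.5-type exposure:

```python
import math

def dersimonian_laird(log_ors, variances):
    """DerSimonian-Laird random-effects pooling of log odds ratios.
    Returns (pooled OR, (95% CI lower, 95% CI upper))."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, log_ors)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_ors))
    df = len(log_ors) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)              # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, log_ors)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return math.exp(pooled), (math.exp(pooled - 1.96 * se),
                              math.exp(pooled + 1.96 * se))

# Hypothetical per-study ORs (on the log scale) and variances
log_ors = [math.log(1.01), math.log(0.99), math.log(1.02), math.log(1.00)]
variances = [0.0004, 0.0006, 0.0009, 0.0005]
or_pooled, (ci_lo, ci_hi) = dersimonian_laird(log_ors, variances)
```

A pooled OR near 1.0 with a confidence interval straddling 1.0, as in these invented numbers, corresponds to the null result the abstract reports.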

  11. Quantitative PCR analysis of salivary pathogen burden in periodontitis.

    PubMed

    Salminen, Aino; Kopra, K A Elisa; Hyvärinen, Kati; Paju, Susanna; Mäntylä, Päivi; Buhlin, Kåre; Nieminen, Markku S; Sinisalo, Juha; Pussinen, Pirkko J

    2015-01-01

    Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease (CAD) diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR (qPCR). Median salivary concentrations of Porphyromonas gingivalis, Tannerella forsythia, and Prevotella intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary Aggregatibacter actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4-5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39-4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51-4.52). The highest OR 3.59 (95% CI 1.94-6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and T

  12. Modal analysis of PATHFINDER unmanned air vehicle

    SciTech Connect

    Woehrle, T.G.; Costerus, B.W.; Lee, C.L.

    1994-10-19

    An experimental modal analysis was performed on PATHFINDER, a 450-lb, 100-ft wing span, flying-wing-design aircraft powered by solar/electric motors. The aircraft was softly suspended and then excited using random input from a long-stroke shaker. Modal data were taken from 92 measurement locations on the aircraft using newly designed, lightweight, tri-axial accelerometers. A conventional PC-based data acquisition system provided data handling. Modal parameters were calculated, and animated mode shapes were produced using SMS STARStruct™ Modal Analysis System software. The modal parameters will be used for validation of finite element models, optimum placement of onboard accelerometers during flight testing, and vibration isolation design of sensor platforms.

  13. Tracer-based laser-induced fluorescence measurement technique for quantitative fuel/air-ratio measurements in a hydrogen internal combustion engine.

    PubMed

    Blotevogel, Thomas; Hartmann, Matthias; Rottengruber, Hermann; Leipertz, Alfred

    2008-12-10

    A measurement technique for the quantitative investigation of mixture formation processes in hydrogen internal combustion engines (ICEs) has been developed using tracer-based laser-induced fluorescence (TLIF). The technique can be applied to both fired and motored engine operation. The quantitative TLIF fuel/air-ratio results have been verified by means of linear Raman scattering measurements. Exemplary results of the simultaneous investigation of mixture formation and combustion obtained at an optically accessible hydrogen ICE are shown. PMID:19079454

  14. Tracer-based laser-induced fluorescence measurement technique for quantitative fuel/air-ratio measurements in a hydrogen internal combustion engine.

    PubMed

    Blotevogel, Thomas; Hartmann, Matthias; Rottengruber, Hermann; Leipertz, Alfred

    2008-12-10

    A measurement technique for the quantitative investigation of mixture formation processes in hydrogen internal combustion engines (ICEs) has been developed using tracer-based laser-induced fluorescence (TLIF). The technique can be applied to both fired and motored engine operation. The quantitative TLIF fuel/air-ratio results have been verified by means of linear Raman scattering measurements. Exemplary results of the simultaneous investigation of mixture formation and combustion obtained at an optically accessible hydrogen ICE are shown.

  15. The next generation of low-cost personal air quality sensors for quantitative exposure monitoring

    NASA Astrophysics Data System (ADS)

    Piedrahita, R.; Xiang, Y.; Masson, N.; Ortega, J.; Collier, A.; Jiang, Y.; Li, K.; Dick, R. P.; Lv, Q.; Hannigan, M.; Shang, L.

    2014-10-01

    Advances in embedded systems and low-cost gas sensors are enabling a new wave of low-cost air quality monitoring tools. Our team has been engaged in the development of low-cost, wearable, air quality monitors (M-Pods) using the Arduino platform. These M-Pods house two types of sensors - commercially available metal oxide semiconductor (MOx) sensors used to measure CO, O3, NO2, and total VOCs, and NDIR sensors used to measure CO2. The MOx sensors are low in cost and show high sensitivity near ambient levels; however, they display non-linear output signals and cross-sensitivity effects. Thus, a quantification system was developed to convert the MOx sensor signals into concentrations. We conducted two types of validation studies - first, deployments at a regulatory monitoring station in Denver, Colorado, and second, a user study. In the station deployments, M-Pod concentrations were determined using collocation calibrations and laboratory calibration techniques. M-Pods were placed near regulatory monitors to derive calibration function coefficients, using the regulatory monitors as the standard. The form of the calibration function was derived from laboratory experiments. We discuss various techniques used to estimate measurement uncertainties. The deployments revealed that collocation calibrations provide more accurate concentration estimates than laboratory calibrations. During collocation calibrations, median standard errors ranged between 4.0-6.1 ppb for O3, 6.4-8.4 ppb for NO2, 0.28-0.44 ppm for CO, and 16.8 ppm for CO2. Median signal-to-noise (S/N) ratios for the M-Pod sensors were lower than those of the regulatory instruments: for NO2, 3.6 compared to 23.4; for O3, 1.4 compared to 1.6; for CO, 1.1 compared to 10.0; and for CO2, 42.2 compared to 300-500. By contrast, lab calibrations added bias and made it difficult to cover the range of environmental conditions needed to obtain a good calibration. A separate user study
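
    As a rough sketch of the collocation-calibration step, one can regress the reference-monitor concentrations on the raw sensor signal plus a temperature term (a common cross-sensitivity for MOx sensors) by ordinary least squares. All numbers below are made-up placeholders; the actual M-Pod calibration functions are more elaborate.

```python
import numpy as np

# Hypothetical collocation data: raw sensor signal (V) and ambient
# temperature logged alongside reference-monitor O3 readings (ppb).
signal = np.array([0.42, 0.51, 0.63, 0.70, 0.85, 0.95])
temp_c = np.array([12.0, 15.0, 18.0, 20.0, 24.0, 27.0])
ref_ppb = np.array([18.0, 25.0, 34.0, 40.0, 52.0, 60.0])

# Design matrix: intercept, signal, temperature
X = np.column_stack([np.ones_like(signal), signal, temp_c])
coef, *_ = np.linalg.lstsq(X, ref_ppb, rcond=None)

predicted = X @ coef
rmse = np.sqrt(np.mean((predicted - ref_ppb) ** 2))
print(coef, rmse)
```

    The fitted coefficients then convert raw field signals into concentration estimates, with the residual RMSE serving as a crude uncertainty measure.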

  16. Quantitative Analysis by Isotopic Dilution Using Mass Spectroscopy: The Determination of Caffeine by GC-MS.

    ERIC Educational Resources Information Center

    Hill, Devon W.; And Others

    1988-01-01

    Describes a laboratory technique for quantitative analysis of caffeine by an isotopic dilution method for coupled gas chromatography-mass spectroscopy. Discusses caffeine analysis and experimental methodology. Lists sample caffeine concentrations found in common products. (MVL)
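
    The isotope-dilution arithmetic behind such a GC-MS determination can be reduced to a one-line ratio, assuming equal detector response for the labeled and unlabeled forms and no spectral overlap; the spike amount and peak areas below are invented for illustration.

```python
def isotope_dilution_amount(spike_ng, area_analyte, area_labeled):
    """Simplest isotope-dilution quantitation: the analyte amount equals
    the spiked labeled-standard amount scaled by the measured
    peak-area ratio of the two isotopologues."""
    return spike_ng * (area_analyte / area_labeled)

# Hypothetical run: 500 ng of caffeine-d3 spiked into a beverage extract
amount = isotope_dilution_amount(500.0, 86000, 43000)
print(amount)  # -> 1000.0 ng of caffeine
```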

  17. Quantitative analysis of localized surface plasmons based on molecular probing.

    PubMed

    Deeb, Claire; Bachelot, Renaud; Plain, Jérôme; Baudrion, Anne-Laure; Jradi, Safi; Bouhelier, Alexandre; Soppera, Olivier; Jain, Prashant K; Huang, Libai; Ecoffet, Carole; Balan, Lavinia; Royer, Pascal

    2010-08-24

    We report on the quantitative characterization of the plasmonic optical near-field of a single silver nanoparticle. Our approach relies on nanoscale molecular molding of the confined electromagnetic field by photoactivated molecules. We were able to directly image the dipolar profile of the near-field distribution with a resolution better than 10 nm and to quantify the near-field depth and its enhancement factor. A single nanoparticle spectral signature was also assessed. This quantitative characterization constitutes a prerequisite for developing nanophotonic applications.

  18. Quantitative Analysis of Autophagy using Advanced 3D Fluorescence Microscopy

    PubMed Central

    Changou, Chun A.; Wolfson, Deanna L.; Ahluwalia, Balpreet Singh; Bold, Richard J.; Kung, Hsing-Jien; Chuang, Frank Y.S.

    2013-01-01

    Prostate cancer is the most common malignancy among men in the U.S. While surgery carries a significant risk of impotence and incontinence, traditional chemotherapeutic approaches have been largely unsuccessful. Hormone therapy is effective at early stage, but often fails with the eventual development of hormone-refractory tumors. We have been interested in developing therapeutics targeting a specific metabolic deficiency of tumor cells. We recently showed that prostate tumor cells specifically lack an enzyme (argininosuccinate synthase, or ASS) involved in the synthesis of the amino acid arginine(1). This condition causes the tumor cells to become dependent on exogenous arginine, and they undergo metabolic stress when free arginine is depleted by arginine deiminase (ADI)(1,10). Indeed, we have shown that human prostate cancer cells CWR22Rv1 are effectively killed by ADI with caspase-independent apoptosis and aggressive autophagy (or macroautophagy)(1,2,3). Autophagy is an evolutionarily-conserved process that allows cells to metabolize unwanted proteins by lysosomal breakdown during nutritional starvation(4,5). Although the essential components of this pathway are well-characterized(6,7,8,9), many aspects of the molecular mechanism are still unclear - in particular, what is the role of autophagy in the death-response of prostate cancer cells after ADI treatment? In order to address this question, we required an experimental method to measure the level and extent of autophagic response in cells - and since there are no known molecular markers that can accurately track this process, we chose to develop an imaging-based approach, using quantitative 3D fluorescence microscopy(11,12). 
Using CWR22Rv1 cells specifically-labeled with fluorescent probes for autophagosomes and lysosomes, we show that 3D image stacks acquired with either widefield deconvolution microscopy (and later, with super-resolution, structured-illumination microscopy) can clearly capture the early stages of

  19. Quantitative analysis of autophagy using advanced 3D fluorescence microscopy.

    PubMed

    Changou, Chun A; Wolfson, Deanna L; Ahluwalia, Balpreet Singh; Bold, Richard J; Kung, Hsing-Jien; Chuang, Frank Y S

    2013-01-01

    Prostate cancer is the most common malignancy among men in the U.S. While surgery carries a significant risk of impotence and incontinence, traditional chemotherapeutic approaches have been largely unsuccessful. Hormone therapy is effective at early stage, but often fails with the eventual development of hormone-refractory tumors. We have been interested in developing therapeutics targeting a specific metabolic deficiency of tumor cells. We recently showed that prostate tumor cells specifically lack an enzyme (argininosuccinate synthase, or ASS) involved in the synthesis of the amino acid arginine(1). This condition causes the tumor cells to become dependent on exogenous arginine, and they undergo metabolic stress when free arginine is depleted by arginine deiminase (ADI)(1,10). Indeed, we have shown that human prostate cancer cells CWR22Rv1 are effectively killed by ADI with caspase-independent apoptosis and aggressive autophagy (or macroautophagy)(1,2,3). Autophagy is an evolutionarily-conserved process that allows cells to metabolize unwanted proteins by lysosomal breakdown during nutritional starvation(4,5). Although the essential components of this pathway are well-characterized(6,7,8,9), many aspects of the molecular mechanism are still unclear - in particular, what is the role of autophagy in the death-response of prostate cancer cells after ADI treatment? In order to address this question, we required an experimental method to measure the level and extent of autophagic response in cells - and since there are no known molecular markers that can accurately track this process, we chose to develop an imaging-based approach, using quantitative 3D fluorescence microscopy(11,12). 
Using CWR22Rv1 cells specifically-labeled with fluorescent probes for autophagosomes and lysosomes, we show that 3D image stacks acquired with either widefield deconvolution microscopy (and later, with super-resolution, structured-illumination microscopy) can clearly capture the early

  20. A diagnostic programme for quantitative analysis of proteinuria.

    PubMed

    Hofmann, W; Guder, W G

    1989-09-01

    A spectrum of quantitative methods was adapted to the Kone Specific Analyser for the purpose of recognizing, quantifying and differentiating various forms of proteinuria. Total protein, IgG, albumin and alpha 1-microglobulin (measured by turbidimetry), N-acetyl-beta-D-glucosaminidase activity and creatinine (measured photometrically) were measured in undiluted urine; in addition, alpha 1-microglobulin was measured in serum. Within-run and between-run precision, accuracy and linearity of the turbidimetric methods were in good agreement with nephelometric procedures. All turbidimetric methods exhibited a correlation coefficient r greater than 0.98 when compared with the radial immunodiffusion procedure as reference method. Total protein measured turbidimetrically with the Kone Specific Analyser was in good agreement with the manual biuret procedure. The low detection limits and linearities allowed quantification of urine analytes from the lower range of normals up to ten times the upper limit of normals. The measured analytes exhibited stability in urine at pH 4-8 over at least seven days at 4-6 degrees C and -20 degrees C. Only IgG showed a significant loss (up to 30 percent) when measured after storage at -20 degrees C. Quantities per mol creatinine showed significantly lower intra-individual and inter-individual variability than quantities per liter. In 31 normal persons, the intra-individual variation was lowest for N-acetyl-beta-D-glucosaminidase activity (13%) and highest for total protein (33%), when measured in the second morning urine on 5 consecutive days. When related to creatinine, results obtained in the second morning urine showed no significant differences from those in 24 h urine, except for alpha 1-microglobulin, which gave lower values in 24 h urines. The upper normal limits, calculated as the 95% ranges, were determined from 154 urines of 31 individuals. Nearly all analytes showed an asymmetric distribution. 
Because of a wide tailing of the upper limit
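
    The creatinine-related quantities used in this programme amount to a simple ratio; a toy example (hypothetical numbers) shows how relating an analyte to creatinine suppresses dilution-driven variability:

```python
import statistics

def cv(xs):
    """Coefficient of variation, in percent."""
    return 100.0 * statistics.stdev(xs) / statistics.mean(xs)

# Hypothetical repeat morning urines from one subject: concentrations
# swing with hydration, but the creatinine ratio stays steady.
albumin_mg_l = [12.0, 25.0, 8.0, 18.0, 30.0]
creatinine_mmol_l = [6.0, 12.5, 4.0, 9.0, 15.0]
ratio = [a / c for a, c in zip(albumin_mg_l, creatinine_mmol_l)]
print(cv(albumin_mg_l), cv(ratio))
```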

  1. Quantitative analysis of autophagy using advanced 3D fluorescence microscopy.

    PubMed

    Changou, Chun A; Wolfson, Deanna L; Ahluwalia, Balpreet Singh; Bold, Richard J; Kung, Hsing-Jien; Chuang, Frank Y S

    2013-05-03

    Prostate cancer is the most common malignancy among men in the U.S. While surgery carries a significant risk of impotence and incontinence, traditional chemotherapeutic approaches have been largely unsuccessful. Hormone therapy is effective at early stage, but often fails with the eventual development of hormone-refractory tumors. We have been interested in developing therapeutics targeting a specific metabolic deficiency of tumor cells. We recently showed that prostate tumor cells specifically lack an enzyme (argininosuccinate synthase, or ASS) involved in the synthesis of the amino acid arginine(1). This condition causes the tumor cells to become dependent on exogenous arginine, and they undergo metabolic stress when free arginine is depleted by arginine deiminase (ADI)(1,10). Indeed, we have shown that human prostate cancer cells CWR22Rv1 are effectively killed by ADI with caspase-independent apoptosis and aggressive autophagy (or macroautophagy)(1,2,3). Autophagy is an evolutionarily-conserved process that allows cells to metabolize unwanted proteins by lysosomal breakdown during nutritional starvation(4,5). Although the essential components of this pathway are well-characterized(6,7,8,9), many aspects of the molecular mechanism are still unclear - in particular, what is the role of autophagy in the death-response of prostate cancer cells after ADI treatment? In order to address this question, we required an experimental method to measure the level and extent of autophagic response in cells - and since there are no known molecular markers that can accurately track this process, we chose to develop an imaging-based approach, using quantitative 3D fluorescence microscopy(11,12). 
Using CWR22Rv1 cells specifically-labeled with fluorescent probes for autophagosomes and lysosomes, we show that 3D image stacks acquired with either widefield deconvolution microscopy (and later, with super-resolution, structured-illumination microscopy) can clearly capture the early

  2. Spacelab J air filter debris analysis

    NASA Technical Reports Server (NTRS)

    Obenhuber, Donald C.

    1993-01-01

    Filter debris from the Spacelab module SLJ of STS-49 was analyzed for microbial contamination. Debris from cabin and avionics filters was collected by Kennedy Space Center personnel on 1 Oct. 1992, approximately 5 days postflight. The concentration of microorganisms found was similar to previous Spacelab missions, averaging 7.4E+4 CFU/mL for avionics filter debris and 4.5E+6 CFU/mL for cabin filter debris. A similar diversity of bacterial types was found in the two filters. Of the 13 different bacterial types identified from the cabin and avionics samples, 6 were common to both filters. The overall analysis of these samples as compared to those of previous missions shows no significant differences.

  3. MOLD SPECIFIC QUANTITATIVE PCR: THE EMERGING STANDARD IN MOLD ANALYSIS

    EPA Science Inventory

    Today I will talk about the use of quantitative, or real-time, PCR for the standardized identification and quantification of molds. There are probably at least 100,000 species of molds or fungi, but only about 100 are typically found indoors. Some pose a threat to human...
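
    Mold-specific qPCR assays of this kind typically convert a threshold cycle (Ct) to cell or spore equivalents through a standard curve; a minimal sketch, with hypothetical slope and intercept values, might look like:

```python
def cells_from_ct(ct, slope=-3.32, intercept=38.0):
    """Estimate cell (spore) equivalents from a qPCR threshold cycle
    via a standard curve Ct = slope*log10(N) + intercept.
    Slope and intercept here are hypothetical; a slope near -3.32
    corresponds to 100% amplification efficiency."""
    return 10 ** ((ct - intercept) / slope)

print(round(cells_from_ct(28.04)))  # roughly 1000 cell equivalents
```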

  4. Teaching Quantitative Research Methods: A Quasi-Experimental Analysis.

    ERIC Educational Resources Information Center

    Bridges, George S.; Gillmore, Gerald M.; Pershing, Jana L.; Bates, Kristin A.

    1998-01-01

    Describes an experiment designed to introduce aspects of quantitative reasoning to a large, substantively-focused class in the social sciences. Reveals that participating students' abilities to interpret and manipulate empirical data increased significantly, independent of baseline SAT verbal and mathematics scores. Discusses implications for…

  5. Quantitative and Qualitative Analysis of Biomarkers in Fusarium verticillioides

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this study, a combination HPLC-DART-TOF-MS system was utilized to identify and quantitatively analyze carbohydrates in wild type and mutant strains of Fusarium verticillioides. Carbohydrate fractions were isolated from F. verticillioides cellular extracts by HPLC using a cation-exchange size-excl...

  6. Teaching Quantitative Reasoning for Nonscience Majors through Carbon Footprint Analysis

    ERIC Educational Resources Information Center

    Boose, David L.

    2014-01-01

    Quantitative reasoning is a key intellectual skill, applicable across disciplines and best taught in the context of authentic, relevant problems. Here, I describe and assess a laboratory exercise that has students calculate their "carbon footprint" and evaluate the impacts of various behavior choices on that footprint. Students gather…

  7. Cognitive Task Analysis of Prioritization in Air Traffic Control.

    ERIC Educational Resources Information Center

    Redding, Richard E.; And Others

    A cognitive task analysis was performed to analyze the key cognitive components of the en route air traffic controllers' jobs. The goals were to ascertain expert mental models and decision-making strategies and to identify important differences in controller knowledge, skills, and mental models as a function of expertise. Four groups of…

  8. An Analysis of the Air Conditioning, Refrigerating and Heating Occupation.

    ERIC Educational Resources Information Center

    Frass, Melvin R.; Krause, Marvin

    The general purpose of the occupational analysis is to provide workable, basic information dealing with the many and varied duties performed in the air conditioning, refrigerating, and heating occupation. The document opens with a brief introduction followed by a job description. The bulk of the document is presented in table form. Six duties are…

  9. ANSI/ASHRAE/IESNA Standard 90.1-2010 Preliminary Determination Quantitative Analysis

    SciTech Connect

    Halverson, Mark A.; Liu, Bing; Rosenberg, Michael I.

    2010-11-01

    The United States (U.S.) Department of Energy (DOE) conducted a preliminary quantitative analysis to assess whether buildings constructed according to the requirements of the American National Standards Institute (ANSI)/American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE)/Illuminating Engineering Society of North America (IESNA) Standard 90.1-2010 (ASHRAE Standard 90.1-2010, Standard 90.1-2010, or 2010 edition) would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IESNA Standard 90.1-2007 (ASHRAE Standard 90.1-2007, Standard 90.1-2007, or 2007 edition). The preliminary analysis considered each of the 109 addenda to ASHRAE Standard 90.1-2007 that were included in ASHRAE Standard 90.1-2010. All 109 addenda processed by ASHRAE in the creation of Standard 90.1-2010 from Standard 90.1-2007 were reviewed by DOE, and their combined impact on a suite of 16 building prototype models in 15 ASHRAE climate zones was considered. Most addenda were deemed to have little quantifiable impact on building efficiency for the purpose of DOE’s preliminary determination. However, of the 109 addenda, 34 were preliminarily determined to have a measurable and quantifiable impact.

  10. ANSI/ASHRAE/IESNA Standard 90.1-2007 Final Determination Quantitative Analysis

    SciTech Connect

    Halverson, Mark A.; Liu, Bing; Richman, Eric E.; Winiarski, David W.

    2011-05-01

    The United States (U.S.) Department of Energy (DOE) conducted a final quantitative analysis to assess whether buildings constructed according to the requirements of the American National Standards Institute (ANSI)/American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE)/Illuminating Engineering Society of North America (IESNA) Standard 90.1-2007 would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IESNA Standard 90.1-2004. The final analysis considered each of the 44 addenda to ANSI/ASHRAE/IESNA Standard 90.1-2004 that were included in ANSI/ASHRAE/IESNA Standard 90.1-2007. All 44 addenda processed by ASHRAE in the creation of Standard 90.1-2007 from Standard 90.1-2004 were reviewed by DOE, and their combined impact on a suite of 15 building prototype models in 15 ASHRAE climate zones was considered. Most addenda were deemed to have little quantifiable impact on building efficiency for the purpose of DOE’s final determination. However, of the 44 addenda, 9 were determined to have a measurable and quantifiable impact.

  11. Tobacco Smoke, Indoor Air Pollution and Tuberculosis: A Systematic Review and Meta-Analysis

    PubMed Central

    Lin, Hsien-Ho; Ezzati, Majid; Murray, Megan

    2007-01-01

    Background Tobacco smoking, passive smoking, and indoor air pollution from biomass fuels have been implicated as risk factors for tuberculosis (TB) infection, disease, and death. Tobacco smoking and indoor air pollution are persistent or growing exposures in regions where TB poses a major health risk. We undertook a systematic review and meta-analysis to quantitatively assess the association between these exposures and the risk of infection, disease, and death from TB. Methods and Findings We conducted a systematic review and meta-analysis of observational studies reporting effect estimates and 95% confidence intervals on how tobacco smoking, passive smoke exposure, and indoor air pollution are associated with TB. We identified 33 papers on tobacco smoking and TB, five papers on passive smoking and TB, and five on indoor air pollution and TB. We found substantial evidence that tobacco smoking is positively associated with TB, regardless of the specific TB outcomes. Compared with people who do not smoke, smokers have an increased risk of having a positive tuberculin skin test, of having active TB, and of dying from TB. Although we also found evidence that passive smoking and indoor air pollution increased the risk of TB disease, these associations are less strongly supported by the available evidence. Conclusions There is consistent evidence that tobacco smoking is associated with an increased risk of TB. The finding that passive smoking and biomass fuel combustion also increase TB risk should be substantiated with larger studies in future. TB control programs might benefit from a focus on interventions aimed at reducing tobacco and indoor air pollution exposures, especially among those at high risk for exposure to TB. PMID:17227135

  12. Analysis of flight equipment purchasing practices of representative air carriers

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The process through which representative air carriers decide whether or not to purchase flight equipment was investigated as well as their practices and policies in retiring surplus aircraft. An analysis of the flight equipment investment decision process in ten airlines shows that for the airline industry as a whole, the flight equipment investment decision is in a state of transition from a wholly informal process in earliest years to a much more organized and structured process in the future. Individual air carriers are in different stages with respect to the formality and sophistication associated with the flight equipment investment decision.

  13. Investigation of storage-phosphor autoradiography for the rapid quantitative screening of air filters for emergency response purposes

    NASA Astrophysics Data System (ADS)

    Gallardo, Athena Marie

    Past nuclear accidents, such as Chernobyl, resulted in a large release of radionuclides into the atmosphere. Radiological assessment of the vicinity of the site of the incident is vital to assess the exposure levels and dose received by the population and workers. Therefore, it is critical to thoroughly understand the situation and risks associated with a particular event in a timely manner in order to properly manage the event. Current atmospheric radiological assessments of alpha-emitting radioisotopes include acquiring large quantities of air samples, chemical separation of radionuclides, sample mounting, counting through alpha spectrometry, and analysis of the data. The existing methodology is effective, but time consuming and labor intensive. Autoradiography, and the properties of phosphor imaging films, may be used as an additional technique to facilitate and expedite the alpha analysis process in these types of situations. Although autoradiography is not as sensitive to alpha radiation as alpha spectrometry, it may benefit alpha analysis by providing information about the activity as well as the spatial distribution of radioactivity in the sample under investigation. The objective of this research was to develop an efficient method for quantification and visualization of air filter samples taken in the aftermath of a nuclear emergency through autoradiography using 241Am and 239Pu tracers. Samples containing varying activities of either 241Am or 239Pu tracers were produced through microprecipitation and assayed by alpha spectrometry. The samples were subsequently imaged, and an activity calibration curve was produced by comparing the digital light units recorded from the image to the known activity of the source. The usefulness of different phosphor screens was examined by exposing each type of film to the same standard nuclide for varying quantities of time. Unknown-activity samples created through microprecipitation containing activities of
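
    The activity calibration curve described above - digital light units against known source activity - can be sketched as a simple linear fit that is then inverted to screen unknown filters. The calibration points below are hypothetical.

```python
import numpy as np

# Hypothetical calibration: known source activity (Bq) vs integrated
# digital light units (DLU) read from the phosphor image after a
# fixed exposure time.
activity_bq = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
dlu = np.array([1.1e4, 2.2e4, 4.3e4, 8.6e4, 17.0e4])

slope, intercept = np.polyfit(activity_bq, dlu, 1)

def activity_from_dlu(x):
    """Invert the DLU-vs-activity line to estimate an unknown activity."""
    return (x - intercept) / slope

print(activity_from_dlu(6.4e4))  # estimated activity in Bq
```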

  14. Quantitative and qualitative HPLC analysis of thermogenic weight loss products.

    PubMed

    Schaneberg, B T; Khan, I A

    2004-11-01

    An HPLC method for the qualitative and quantitative determination of seven analytes (caffeine, ephedrine, forskolin, icariin, pseudoephedrine, synephrine, and yohimbine) in thermogenic weight loss preparations available on the market is described in this paper. After 45 min, the seven analytes were separated and detected in the acetonitrile:water (80:20) extract. The method uses a Waters XTerra RP18 (5 μm particle size) column as the stationary phase, a gradient mobile phase of water (5.0 mM SDS) and acetonitrile, and UV detection at 210 nm. The correlation coefficients for the calibration curves and the recovery rates ranged from 0.994 to 0.999 and from 97.45% to 101.05%, respectively. The qualitative and quantitative results are discussed. PMID:15587578

  15. Quantitative sectioning and noise analysis for structured illumination microscopy

    PubMed Central

    Hagen, Nathan; Gao, Liang; Tkaczyk, Tomasz S.

    2011-01-01

    Structured illumination (SI) has long been regarded as a nonquantitative technique for obtaining sectioned microscopic images. Its lack of quantitative results has restricted the use of SI sectioning to qualitative imaging experiments, and has also limited researchers’ ability to compare SI against competing sectioning methods such as confocal microscopy. We show how to modify the standard SI sectioning algorithm to make the technique quantitative, and provide formulas for calculating the noise in the sectioned images. The results indicate that, for an illumination source providing the same spatially-integrated photon flux at the object plane, and for the same effective slice thicknesses, SI sectioning can provide higher SNR images than confocal microscopy for an equivalent setup when the modulation contrast exceeds about 0.09. PMID:22274364
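
    The standard SI sectioning algorithm the authors build on is the three-phase square-law demodulation of Neil et al.; a toy sketch shows how it recovers the in-focus modulation amplitude while a uniform out-of-focus background cancels. The quantitative rescaling introduced in the paper is not reproduced here.

```python
import numpy as np

def si_section(i1, i2, i3):
    """Three-phase structured-illumination sectioning: root-sum-of-squares
    of pairwise frame differences, normalized so a pure cosine
    modulation of amplitude m demodulates to m."""
    return (np.sqrt(2.0) / 3.0) * np.sqrt(
        (i1 - i2) ** 2 + (i2 - i3) ** 2 + (i3 - i1) ** 2)

# Toy single-pixel check: modulated in-focus signal on a uniform
# out-of-focus background, with grating phases 0, 120, 240 degrees.
bg = 100.0   # unmodulated background (rejected)
m = 30.0     # in-focus modulation amplitude (recovered)
phases = [0.0, 2 * np.pi / 3, 4 * np.pi / 3]
frames = [bg + m * np.cos(p) for p in phases]
print(si_section(*frames))  # ~30.0: background gone, amplitude kept
```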

  16. Quantitative architectural analysis: a new approach to cortical mapping.

    PubMed

    Schleicher, Axel; Morosan, Patricia; Amunts, Katrin; Zilles, Karl

    2009-11-01

    Results from functional imaging studies are often still interpreted using the classical architectonic brain maps of Brodmann and his successors. One obvious weakness in traditional, architectural mapping is the subjective nature of localizing borders between cortical areas by means of a purely visual, microscopical examination of histological specimens. To overcome this limitation, objective mapping procedures based on quantitative cytoarchitecture have been generated. As a result, new maps for various species including man were established. In our contribution, principles of quantitative cytoarchitecture and algorithm-based cortical mapping are described for a cytoarchitectural parcellation of the human auditory cortex. Defining cortical borders based on quantified changes in cortical lamination is the decisive step towards a novel, highly improved probabilistic brain atlas.

  17. Quantitative analysis of the human T cell palmitome

    PubMed Central

    Morrison, Eliot; Kuropka, Benno; Kliche, Stefanie; Brügger, Britta; Krause, Eberhard; Freund, Christian

    2015-01-01

    Palmitoylation is a reversible post-translational modification used to inducibly compartmentalize proteins in cellular membranes, affecting the function of receptors and intracellular signaling proteins. The identification of protein “palmitomes” in several cell lines raises the question to what extent this modification is conserved in primary cells. Here we use primary T cells with acyl-biotin exchange and quantitative mass spectrometry to identify a pool of proteins previously unreported as palmitoylated in vivo. PMID:26111759

  18. Comprehensive objective maps of macromolecular conformations by quantitative SAXS analysis

    PubMed Central

    Hura, Greg L.; Budworth, Helen; Dyer, Kevin N.; Rambo, Robert P.; Hammel, Michal

    2013-01-01

    Comprehensive perspectives of macromolecular conformations are required to connect structure to biology. Here we present a small angle X-ray scattering (SAXS) Structural Similarity Map (SSM) and Volatility of Ratio (VR) metric providing comprehensive, quantitative and objective (superposition-independent) perspectives on solution state conformations. We validate VR and SSM utility on human MutSβ, a key ABC ATPase and chemotherapeutic target, by revealing MutSβ DNA sculpting and identifying multiple conformational states for biological activity. PMID:23624664

  19. Fluorescent microscopy approaches of quantitative soil microbial analysis

    NASA Astrophysics Data System (ADS)

    Ivanov, Konstantin; Polyanskaya, Lubov

    2015-04-01

    Classical fluorescent microscopy has been used over recent decades in various microbiological studies of terrestrial ecosystems. The method gives representative results and is simple to apply, which allows its use both as a routine part of large-scale research and in small laboratories. Furthermore, many modifications of the fluorescent microscopy method have been established, depending on the research targets. Combining and comparing several approaches offers an opportunity for quantitative estimation of the microbial community in soil. The first analytical part of the study was dedicated to estimating soil bacterial density by fluorescent microscopy over the course of several 30-day experiments. The purpose of the research was to estimate changes in the soil bacterial community in different soil horizons under aerobic and anaerobic conditions after adding nutrients in two experimental sets: cellulose and chitin. The nalidixic acid method, which inhibits DNA division in gram-negative bacteria, was modified so that this bacterial group could be quantified by fluorescent microscopy. The established approach detected 3-4 times more gram-negative bacterial cells in soil. The role of actinomycetes in soil polymer destruction is traditionally considered dominant compared to that of the gram-negative bacterial group. However, quantification of gram-negative bacteria in chernozem and peatland suggests that the classical view underestimates this group. Chitin introduction had no positive effect on gram-negative bacterial population density in chernozem, but this nutrient did produce fast growth dynamics during the first 3 days of the experiment under both aerobic and anaerobic conditions, confirming the chitinolytic activity of gram-negative bacteria in soil organic matter decomposition. In the next part of the research, the modified method for quantifying soil gram-negative bacteria was compared to fluorescent in situ

  20. Air Pollution and Quality of Sperm: A Meta-Analysis

    PubMed Central

    Fathi Najafi, Tahereh; Latifnejad Roudsari, Robab; Namvar, Farideh; Ghavami Ghanbarabadi, Vahid; Hadizadeh Talasaz, Zahra; Esmaeli, Mahin

    2015-01-01

    Context: Air pollution is common in all countries and affects reproductive functions in men and women. It particularly impacts sperm parameters in men. This meta-analysis aimed to examine the impact of air pollution on the quality of sperm. Evidence Acquisition: The scientific databases of Medline, PubMed, Scopus, Google Scholar, the Cochrane Library, and Elsevier were searched to identify relevant articles published between 1978 and 2013. In the first step, 76 articles were selected. These studies were ecological correlation, cohort, retrospective, cross-sectional, and case-control ones that were found through electronic and hand search of references about air pollution and male infertility. The outcome measurement was the change in sperm parameters. A total of 11 articles were ultimately included in a meta-analysis to examine the impact of air pollution on sperm parameters. The authors applied meta-analysis sheets from the Cochrane Library; extracted data, including the mean and standard deviation of sperm parameters, were then calculated, and finally their confidence intervals (CIs) were compared to the CIs of standard parameters. Results: The CIs for pooled means were as follows: 2.68 ± 0.32 for ejaculation volume (mL), 62.1 ± 15.88 for sperm concentration (million per milliliter), 39.4 ± 5.52 for sperm motility (%), 23.91 ± 13.43 for sperm morphology (%) and 49.53 ± 11.08 for sperm count. Conclusions: The results of this meta-analysis showed that air pollution reduces sperm motility, but has no impact on the other sperm parameters of the spermogram. PMID:26023349
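    Pooled means of this kind come from inverse-variance weighting of study-level results. A minimal sketch of fixed-effect pooling, using hypothetical motility data (the study values below are illustrative, not taken from the meta-analysis):

```python
import math

def pooled_mean_ci(means, sds, ns, z=1.96):
    """Fixed-effect inverse-variance pooled mean with a 95% CI.
    Each study is weighted by 1/SE^2, where SE = sd/sqrt(n)."""
    weights = [n / sd ** 2 for sd, n in zip(sds, ns)]
    pooled = sum(w * m for w, m in zip(weights, means)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - z * se, pooled + z * se)

# Hypothetical sperm-motility (%) results from three studies
means, sds, ns = [38.0, 41.0, 40.0], [8.0, 10.0, 9.0], [120, 80, 150]
mean, (lo, hi) = pooled_mean_ci(means, sds, ns)
```

    Larger studies with smaller standard deviations dominate the pooled estimate, which is why a single big study can pull the CI away from an unweighted average.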

  1. Air-segmented amplitude-modulated multiplexed flow analysis.

    PubMed

    Inui, Koji; Uemura, Takeshi; Ogusu, Takeshi; Takeuchi, Masaki; Tanaka, Hideji

    2011-01-01

    Air-segmentation is applied to amplitude-modulated multiplexed flow analysis, which we proposed recently. Sample solutions, the flow rates of which are varied periodically, are merged with a reagent and/or diluent solution. The merged stream is segmented by air bubbles and, downstream, its absorbance is measured after deaeration. The analytes in the samples are quantified from the amplitudes of the respective wave components in the absorbance. The proposed method is applied to the determination of a food dye, phosphate ions and nitrite ions. Air-segmentation is effective in limiting the amplitude damping caused by axial dispersion, resulting in an improvement in sensitivity. This effect is more pronounced at shorter control periods and longer flow path lengths.
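    Because each sample's flow rate is modulated at its own period, the analyte signals can be separated by extracting the amplitude of each frequency component from the combined absorbance trace. A minimal sketch of that demultiplexing step, with invented modulation frequencies and amplitudes:

```python
import cmath
import math

def component_amplitude(signal, dt, freq):
    """Amplitude of the sinusoidal component at `freq` (Hz) in a
    uniformly sampled signal, via a single-frequency Fourier sum."""
    n = len(signal)
    s = sum(v * cmath.exp(-2j * math.pi * freq * k * dt)
            for k, v in enumerate(signal))
    return 2.0 * abs(s) / n

# Two samples modulated at 0.02 Hz and 0.05 Hz (hypothetical values);
# the combined absorbance also carries a constant baseline.
dt = 1.0  # s per sample
absorbance = [0.30 * math.sin(2 * math.pi * 0.02 * k * dt) +
              0.12 * math.sin(2 * math.pi * 0.05 * k * dt) + 0.5
              for k in range(1000)]
a1 = component_amplitude(absorbance, dt, 0.02)
a2 = component_amplitude(absorbance, dt, 0.05)
```

    With an integer number of cycles in the record, the two components and the baseline are orthogonal, so each recovered amplitude maps back to one sample's analyte concentration.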

  2. A growing role for gender analysis in air pollution epidemiology.

    PubMed

    Clougherty, Jane E

    2011-04-01

    Epidemiologic studies of air pollution effects on respiratory health report significant modification by sex, although results are not uniform. Importantly, it remains unclear whether modifications are attributable to socially derived gendered exposures, to sex-linked physiological differences, or to some interplay thereof. Gender analysis, which aims to disaggregate social from biological differences between males and females, may help to elucidate these possible sources of effect modification. Studies of children suggest stronger effects among boys in early life and among girls in later childhood. The qualitative review describes possible sources of difference in air pollution response between women and men, which may vary by life stage, coexposures, hormonal status, or other factors. The sources of observed effect modifications remain unclear, although gender analytic approaches may help to disentangle gender and sex differences in pollution response. A framework for incorporating gender analysis into environmental epidemiology is offered, along with several potentially useful methods from gender analysis. PMID:21584463

  3. An Inexpensive Electrodeposition Device and Its Use in a Quantitative Analysis Laboratory Exercise

    ERIC Educational Resources Information Center

    Parker, Richard H.

    2011-01-01

    An experimental procedure, using an apparatus that is easy to construct, was developed to incorporate a quantitative electrogravimetric determination of the solution nickel content into an undergraduate or advanced high school quantitative analysis laboratory. This procedure produces results comparable to the procedure used for the gravimetric…

  4. Particle concentration measurement of virus samples using electrospray differential mobility analysis and quantitative amino acid analysis.

    PubMed

    Cole, Kenneth D; Pease, Leonard F; Tsai, De-Hao; Singh, Tania; Lute, Scott; Brorson, Kurt A; Wang, Lili

    2009-07-24

    Virus reference materials are needed to develop and calibrate detection devices and instruments. We used electrospray differential mobility analysis (ES-DMA) and quantitative amino acid analysis (AAA) to determine the particle concentration of three small model viruses (bacteriophages MS2, PP7, and phiX174). The biological activity, purity, and aggregation of the virus samples were measured using plaque assays, denaturing gel electrophoresis, and size-exclusion chromatography. ES-DMA was developed to count the virus particles using gold nanoparticles as internal standards. ES-DMA additionally provides quantitative measurement of the size and extent of aggregation in the virus samples. Quantitative AAA was also used to determine the mass of the viral proteins in the pure virus samples. The samples were hydrolyzed and the masses of the well-recovered amino acids were used to calculate the equivalent concentration of viral particles in the samples. The concentration of the virus samples determined by ES-DMA was in good agreement with the concentration predicted by AAA for these purified samples. The advantages and limitations of ES-DMA and AAA to characterize virus reference materials are discussed.
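    The AAA route from protein mass to particle concentration amounts to dividing the measured viral-protein mass by the protein mass of one virion. A hedged sketch for bacteriophage MS2, assuming a capsid of 180 coat proteins (~13.7 kDa) plus one ~44 kDa maturation protein (approximate literature values, not figures from this study):

```python
AVOGADRO = 6.022e23

def particles_per_ml(protein_ug_per_ml, virion_mass_da):
    """Particle concentration from total viral-protein mass,
    assuming all measured protein belongs to intact virions."""
    grams_per_virion = virion_mass_da / AVOGADRO
    return protein_ug_per_ml * 1e-6 / grams_per_virion

# MS2 capsid: ~180 coat proteins of ~13.7 kDa plus one ~44 kDa
# maturation protein (approximate values, for illustration only)
virion_mass = 180 * 13.7e3 + 44e3          # Da
conc = particles_per_ml(5.0, virion_mass)  # 5 ug/mL viral protein
```

    The "all protein is virion protein" assumption is exactly why the abstract stresses sample purity: free capsid protein or host-protein contamination inflates the apparent particle count.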

  5. Limitations of quantitative oculoplethysmography and of directional Doppler ultrasonography in cerebrovascular diagnosis: assessment of an air-filled OPG system.

    PubMed

    Ginsberg, M D; Greenwood, S A; Goldberg, H I

    1981-01-01

    Five hundred consecutive patients were evaluated for extracranial disease of the internal carotid arteries by an automated, air-filled, digital oculoplethysmographic system (OPG) of the Kartchner type (Zira) and by supraorbital (SO) and supratrochlear (ST) directional Doppler ultrasonography. Cerebral arteriograms were performed in 58 patients (110 vessels), and OPG timing criteria for detecting hemodynamically significant carotid artery stenosis (60% or greater diameter reduction) were ascertained. Optimal criteria were a delay of one ocular pulse, relative to the other, of greater than 12 msec; and a delay of an ocular pulse, relative to the earlier ear (external carotid) pulse, of greater than 36 msec. These criteria correctly identified 73% of vessels with 0 to 59% stenosis and 76% of vessels with 60 to 100% stenosis. However, in 26% of the vessels, OPG was either inconclusive or inaccurate. Correct diagnosis of bilateral hemodynamically significant carotid artery stenoses was made by OPG in 6 of 9 affected patients. SO Doppler was normal in 70% of vessels with 0-59% stenosis, and abnormal in 75% of vessels with 60-100% stenosis. Corresponding percentages for ST Doppler were 95% and 44%. Abnormal Doppler responses to compression of contralateral facial branches were predictive of intracranial cross-collateralization in only 25% of patients. These results suggest that both quantitative OPG in its present form and directional Doppler studies have serious limitations as non-invasive diagnostic methods.

  6. Hydrological drought types in cold climates: quantitative analysis of causing factors and qualitative survey of impacts

    NASA Astrophysics Data System (ADS)

    Van Loon, A. F.; Ploum, S. W.; Parajka, J.; Fleig, A. K.; Garnier, E.; Laaha, G.; Van Lanen, H. A. J.

    2015-04-01

    For drought management and prediction, knowledge of causing factors and socio-economic impacts of hydrological droughts is crucial. Propagation of meteorological conditions in the hydrological cycle results in different hydrological drought types that require separate analysis. In addition to the existing hydrological drought typology, we here define two new drought types related to snow and ice. A snowmelt drought is a deficiency in the snowmelt discharge peak in spring in snow-influenced basins and a glaciermelt drought is a deficiency in the glaciermelt discharge peak in summer in glacierised basins. In 21 catchments in Austria and Norway we studied the meteorological conditions in the seasons preceding and at the time of snowmelt and glaciermelt drought events. Snowmelt droughts in Norway were mainly controlled by below-average winter precipitation, while in Austria both temperature and precipitation played a role. For glaciermelt droughts, the effect of below-average summer air temperature was dominant, both in Austria and Norway. Subsequently, we investigated the impacts of temperature-related drought types (i.e. snowmelt and glaciermelt drought, but also cold and warm snow season drought and rain-to-snow-season drought). In historical archives and drought databases for the US and Europe many impacts were found that can be attributed to these temperature-related hydrological drought types, mainly in the agriculture and electricity production (hydropower) sectors. However, drawing conclusions on the frequency of occurrence of different drought types from reported impacts is difficult, mainly because of reporting biases and the inevitably limited spatial and temporal scales of the information. Finally, this study shows that complete integration of quantitative analysis of causing factors and qualitative analysis of impacts of temperature-related droughts is not yet possible. Analysis of selected events, however, points out that it can be a promising research

  7. A Wavelet Analysis Approach for Categorizing Air Traffic Behavior

    NASA Technical Reports Server (NTRS)

    Drew, Michael; Sheth, Kapil

    2015-01-01

    In this paper two frequency domain techniques are applied to air traffic analysis. The Continuous Wavelet Transform (CWT), like the Fourier Transform, is shown to identify changes in historical traffic patterns caused by Traffic Management Initiatives (TMIs) and weather with the added benefit of detecting when in time those changes take place. Next, with the expectation that it could detect anomalies in the network and indicate the extent to which they affect traffic flows, the Spectral Graph Wavelet Transform (SGWT) is applied to a center based graph model of air traffic. When applied to simulations based on historical flight plans, it identified the traffic flows between centers that have the greatest impact on either neighboring flows, or flows between centers many centers away. Like the CWT, however, it can be difficult to interpret SGWT results and relate them to simulations where major TMIs are implemented, and more research may be warranted in this area. These frequency analysis techniques can detect off-nominal air traffic behavior, but due to the nature of air traffic time series data, so far they prove difficult to apply in a way that provides significant insight or specific identification of traffic patterns.
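    The CWT localizes oscillatory energy in both scale and time, which is what lets it flag when a traffic disturbance occurs, not just that one occurred. A minimal Morlet-wavelet sketch on a synthetic series with a mid-record oscillatory burst (all parameters illustrative):

```python
import numpy as np

def cwt_morlet(x, scales, w0=6.0):
    """Magnitude of a continuous wavelet transform with a complex
    Morlet wavelet; rows are scales, columns are time samples."""
    n = len(x)
    t = np.arange(-(n // 2), n - n // 2)
    out = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        wavelet = (np.exp(1j * w0 * t / s) *
                   np.exp(-0.5 * (t / s) ** 2) / np.sqrt(s))
        out[i] = np.abs(np.convolve(x, wavelet, mode="same"))
    return out

# Synthetic traffic-count series with an oscillatory burst mid-record,
# standing in for a transient disruption such as a TMI
t = np.arange(1024)
x = np.where((t > 400) & (t < 600), np.sin(2 * np.pi * t / 32.0), 0.0)
power = cwt_morlet(x, scales=[4, 8, 16, 32])
```

    The coefficient magnitudes form a ridge at the scale matching the burst's period, confined to the samples where the burst is present, which is the time-localization advantage over a plain Fourier transform.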

  8. Quantitative two-process analysis of avoidance conditioning in goldfish.

    PubMed

    Zhuikov, A Y; Couvillon, P A; Bitterman, M E

    1994-01-01

    The shuttlebox performance of goldfish was studied under standardized conditions in a variety of problems--with or without an avoidance contingency, a conditioned stimulus (CS)-termination contingency, and an escape contingency. The effects of CS-only, unconditioned stimulus (US)-only, and explicitly unpaired training were also examined. All the data could be simulated quantitatively with a version of O. H. Mowrer's (1947) 2-process theory expressed in 2 learning equations (1 classical, the other instrumental) and a performance equation. The good fit suggests that the theory is worth developing further with new experiments designed to challenge it.
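    The two-process structure pairs a classical (fear) learning equation with an instrumental (avoidance) one, linked by a performance rule. The toy linear-operator sketch below illustrates that architecture only; the parameter values and the performance rule are invented, not the fitted equations from the paper:

```python
def simulate_avoidance(trials=60, alpha_c=0.15, alpha_i=0.10, lam=1.0):
    """Toy Mowrer-style two-process simulation: one linear-operator
    equation for classical fear conditioning, one for instrumental
    avoidance, and a simple rule mapping their product to responding."""
    fear, avoid, strength = 0.0, 0.0, []
    for _ in range(trials):
        responded = fear * avoid > 0.25            # performance rule
        if responded:                              # CS alone: fear extinguishes,
            fear += alpha_c * (0.0 - fear)         # avoidance is reinforced
            avoid += alpha_i * (lam - avoid)
        else:                                      # shock delivered: fear grows,
            fear += alpha_c * (lam - fear)         # escape reinforces responding
            avoid += alpha_i * (lam - avoid)
        strength.append(fear * avoid)
    return strength

curve = simulate_avoidance()
```

    Even this caricature reproduces the characteristic shape: response strength rises as fear and avoidance build, then hovers once successful avoidance begins extinguishing the fear that motivates it.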

  9. Quantitative analysis of laminin 5 gene expression in human keratinocytes.

    PubMed

    Akutsu, Nobuko; Amano, Satoshi; Nishiyama, Toshio

    2005-05-01

    To examine the expression of laminin 5 genes (LAMA3, LAMB3, and LAMC2) encoding the three polypeptide chains alpha3, beta3, and gamma2, respectively, in human keratinocytes, we developed novel quantitative polymerase chain reaction (PCR) methods utilizing Thermus aquaticus DNA polymerase, specific primers, and fluorescein-labeled probes with the ABI PRISM 7700 sequence detector system. Gene expression levels of LAMA3, LAMB3, and LAMC2 and glyceraldehyde-3-phosphate dehydrogenase were quantitated reproducibly and sensitively in the range from 1 x 10(2) to 1 x 10(8) gene copies. Basal gene expression level of LAMB3 was about one-tenth of that of LAMA3 or LAMC2 in human keratinocytes, although there was no clear difference among immunoprecipitated protein levels of alpha3, beta3, and gamma2 synthesized in radio-labeled keratinocytes. Human serum augmented gene expressions of LAMA3, LAMB3, and LAMC2 in human keratinocytes to almost the same extent, and this was associated with an increase of the laminin 5 protein content measured by a specific sandwich enzyme-linked immunosorbent assay. These results demonstrate that the absolute mRNA levels generated from the laminin 5 genes do not determine the translated protein levels of the laminin 5 chains in keratinocytes, and indicate that the expression of the laminin 5 genes may be controlled by common regulation mechanisms. PMID:15854126
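    Absolute quantification over a range like 10^2 to 10^8 copies rests on a standard curve relating threshold cycle (Ct) to log copy number. A generic sketch with hypothetical dilution-series values (an ideal assay gives a slope near -3.32 cycles per decade):

```python
import math

def fit_standard_curve(copies, cts):
    """Least-squares fit of Ct = slope * log10(copies) + intercept."""
    x = [math.log10(c) for c in copies]
    n = len(x)
    mx, my = sum(x) / n, sum(cts) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, cts)) /
             sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve for an unknown sample."""
    return 10.0 ** ((ct - intercept) / slope)

# Hypothetical dilution series spanning the 1e2..1e8 copy range
standards = [1e2, 1e4, 1e6, 1e8]
cts = [33.2, 26.6, 19.9, 13.3]       # ~ -3.3 cycles per decade
slope, intercept = fit_standard_curve(standards, cts)
unknown = copies_from_ct(23.3, slope, intercept)
```

    A slope of -3.32 corresponds to 100% amplification efficiency (doubling each cycle), which is one routine check on assay quality.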

  10. Quantitative phenotypic analysis of multistress response in Zygosaccharomyces rouxii complex.

    PubMed

    Solieri, Lisa; Dakal, Tikam C; Bicciato, Silvio

    2014-06-01

    Zygosaccharomyces rouxii complex comprises three yeasts clusters sourced from sugar- and salt-rich environments: haploid Zygosaccharomyces rouxii, diploid Zygosaccharomyces sapae and allodiploid/aneuploid strains of uncertain taxonomic affiliations. These yeasts have been characterized with respect to gene copy number variation, karyotype variability and change in ploidy, but functional diversity in stress responses has not been explored yet. Here, we quantitatively analysed the stress response variation in seven strains of the Z. rouxii complex by modelling growth variables via model and model-free fitting methods. Based on the spline fit as most reliable modelling method, we resolved different interstrain responses to 15 environmental perturbations. Compared with Z. rouxii CBS 732(T) and Z. sapae strains ABT301(T) and ABT601, allodiploid strain ATCC 42981 and aneuploid strains CBS 4837 and CBS 4838 displayed higher multistress resistance and better performance in glycerol respiration even in the presence of copper. μ-based logarithmic phenotypic index highlighted that ABT601 is a slow-growing strain insensitive to stress, whereas ABT301(T) grows fast on rich medium and is sensitive to suboptimal conditions. Overall, the differences in stress response could imply different adaptation mechanisms to sugar- and salt-rich niches. The obtained phenotypic profiling contributes to provide quantitative insights for elucidating the adaptive mechanisms to stress in halo- and osmo-tolerant Zygosaccharomyces yeasts. PMID:24533625
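    Model-free fitting of growth curves typically smooths the data and reads the maximum specific growth rate μ from the steepest slope of log-transformed density. A crude finite-difference stand-in for the spline approach, run on a synthetic logistic curve:

```python
import math

def max_specific_growth_rate(times, density):
    """mu_max as the steepest slope of ln(density) between successive
    samples -- a rough finite-difference stand-in for a spline fit."""
    ln = [math.log(v) for v in density]
    return max((ln[i + 1] - ln[i]) / (times[i + 1] - times[i])
               for i in range(len(ln) - 1))

# Hypothetical logistic growth curve sampled hourly
times = list(range(25))
density = [0.05 + 0.95 / (1.0 + math.exp(-(t - 12) / 2.0)) for t in times]
mu = max_specific_growth_rate(times, density)
```

    On real, noisy optical-density data the spline's smoothing matters, since raw successive slopes amplify measurement noise; this is presumably why the spline fit was judged the most reliable method.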

  11. Qualitative and quantitative analysis of volatile constituents from latrines.

    PubMed

    Lin, Jianming; Aoll, Jackline; Niclass, Yvan; Velazco, Maria Inés; Wünsche, Laurent; Pika, Jana; Starkenmann, Christian

    2013-07-16

    More than 2.5 billion people defecate in the open. The increased commitment of private and public organizations to improving this situation is driving the research and development of new technologies for toilets and latrines. Although key technical aspects are considered by researchers when designing new technologies for developing countries, the basic aspect of offending malodors from human waste is often neglected. With the objective of contributing to technical solutions that are acceptable to global consumers, we investigated the chemical composition of latrine malodors sampled in Africa and India. Field latrines in four countries were evaluated olfactively and the odors qualitatively and quantitatively characterized with three analytical techniques. Sulfur compounds including H2S, methyl mercaptan, and dimethyl mono-, di-, and trisulfide are important in sewage-like odors of pit latrines under anaerobic conditions. Under aerobic conditions, in Nairobi for example, paracresol and indole reached concentrations of 89 and 65 μg/g, respectively, which, along with short chain fatty acids such as butyric acid (13 mg/g) explained the strong rancid, manure and farm yard odor. This work represents the first qualitative and quantitative study of volatile compounds sampled from seven pit latrines in a variety of geographic, technical, and economic contexts in addition to three single stools from India and a pit latrine model system. PMID:23829328

  12. Quantitative analysis of radiation-induced changes in sperm morphology

    SciTech Connect

    Young, I.T.; Gledhill, B.L.; Lake, S.; Wyrobek, A.J.

    1982-09-01

    When developing spermatogenic cells are exposed to radiation, chemical carcinogens or mutagens, the transformation in the morphology of the mature sperm can be used to determine the severity of the exposure. In this study five groups of mice with three mice per group received testicular doses of X irradiation at dosage levels ranging from 0 rad to 120 rad. A random sample of 100 mature sperm per mouse was analyzed five weeks later for the quantitative morphologic transformation as a function of dosage level. The cells were stained with gallocyanin chrome alum (GCA) so that only the DNA in the sperm head was visible. The ACUity quantitative microscopy system at Lawrence Livermore National Laboratory was used to scan the sperm at a sampling density of 16 points per linear micrometer and with 256 brightness levels per point. The contour of each cell was extracted using conventional thresholding techniques on the high-contrast images. For each contour a variety of shape features was then computed to characterize the morphology of that cell. Using the control group and the distribution of their shape features to establish the variability of a normal sperm population, the 95% limits on normal morphology were established. Using only four shape features, a doubling dose of approximately 39 rad was determined. That is, at 39 rad exposure the percentage of abnormal cells was twice that occurring in the control population. This compared to a doubling dose of approximately 70 rad obtained from a concurrent visual procedure.
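    A doubling dose can be read off a fitted dose-response line as the dose at which the abnormality rate reaches twice the fitted control level. A sketch with hypothetical dose-response values chosen to land near the reported ~39 rad:

```python
def doubling_dose(doses, pct_abnormal):
    """Dose at which the fitted % abnormal reaches twice the fitted
    zero-dose level, from a least-squares straight line."""
    n = len(doses)
    mx, my = sum(doses) / n, sum(pct_abnormal) / n
    slope = (sum((d - mx) * (p - my) for d, p in zip(doses, pct_abnormal)) /
             sum((d - mx) ** 2 for d in doses))
    intercept = my - slope * mx          # fitted control (0-rad) level
    return intercept / slope             # solves intercept + slope*D = 2*intercept

# Hypothetical dose-response: % abnormal sperm at 0..120 rad
doses = [0, 30, 60, 90, 120]
pct = [3.0, 5.2, 7.6, 9.9, 12.3]
dd = doubling_dose(doses, pct)
```

    The same calculation applies whether abnormality is scored visually or from automated shape features; a lower doubling dose simply means the assay detects damage at lower exposures.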

  13. Quantitative analysis of pulmonary ventilation scans with N-13 nitrogen gas and positron computed tomography

    SciTech Connect

    Senda, M.; Murata, K.; Itoh, H.; Yonekura, Y.; Saji, H.; Torizuka, K.

    1985-05-01

    The authors developed a quantitative method for the analysis of pulmonary ventilation studies using N-13 labeled nitrogen gas and positron computed tomography (PCT). The subject inhales N-13 nitrogen gas diluted with oxygen gas in a closed circuit. When the count rate reaches equilibrium, in 2 to 4 minutes, the equilibrium phase scan (EQ) is performed for 3 min. Then the radioactive gas is washed out by room air, during which the washout phase scan (WO) is performed for 5 min. Because nitrogen gas is almost insoluble in blood or tissue, the activity of the alveolus can be described with a single-compartment model if the dead space is ignored. The authors integrated the equation over the scanning period of EQ or WO, expressed the pixel count in each scan in terms of V and T, and solved the equations simultaneously to obtain V and T. In clinical studies, poorly ventilated regions, which had decreased counts in EQ images, showed normal values in V images. Fibrotic regions showed normal T and decreased V. The authors' method yields not only the distribution of alveolar volume, which cannot be evaluated in EQ images, but also more accurate regional T values than the Stewart-Hamilton method. Thus it is useful for the evaluation of regional pulmonary ventilatory function.
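    With an insoluble tracer and a single-compartment model, regional washout is mono-exponential, so the time constant T falls out of a log-linear fit to the washout counts. A noise-free sketch (the sampling schedule mirrors the 5-min WO scan above; the counts are synthetic):

```python
import math

def washout_time_constant(times, counts):
    """T from a log-linear least-squares fit of the mono-exponential
    washout model counts(t) = A0 * exp(-t / T)."""
    y = [math.log(c) for c in counts]
    n = len(times)
    mx, my = sum(times) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(times, y)) /
             sum((a - mx) ** 2 for a in times))
    return -1.0 / slope

# Synthetic washout counts sampled every 30 s over a 5-min washout
T_true = 45.0                                    # s, assumed
times = [30.0 * k for k in range(1, 11)]
counts = [1000.0 * math.exp(-t / T_true) for t in times]
T_est = washout_time_constant(times, counts)
```

    Given T, the equilibrium-phase count for the same region then scales with the alveolar volume V, which is how the two integrated scan equations can be solved for both unknowns.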

  14. QUANTITATIVE CT ANALYSIS, AIRFLOW OBSTRUCTION AND LUNG CANCER IN THE PITTSBURGH LUNG SCREENING STUDY

    PubMed Central

    Wilson, David O; Leader, Joseph K; Fuhrman, Carl R; Reilly, John J; Sciurba, Frank C.; Weissfeld, Joel L

    2011-01-01

    Background To study the relationship between emphysema, airflow obstruction and lung cancer in a high risk population we performed quantitative analysis of screening computed tomography (CT) scans. Methods Subjects completed questionnaires, spirometry and low-dose helical chest CT. Analyses compared cases and controls according to automated quantitative analysis of lung parenchyma and airways measures. Results Our case-control study of 117 matched pairs of lung cancer cases and controls did not reveal any airway or lung parenchymal findings on quantitative analysis of screening CT scans that were associated with increased lung cancer risk. Airway measures including wall area %, lumen perimeter, lumen area and average wall HU, and parenchymal measures including lung fraction < −910 Hounsfield Units (HU), were not statistically different between cases and controls. Conclusions The relationship between visual assessment of emphysema and increased lung cancer risk could not be verified by quantitative analysis of low-dose screening CT scans in a high risk tobacco exposed population. PMID:21610523
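    The parenchymal measure cited, lung fraction below −910 HU, is simply the proportion of lung voxels under an attenuation threshold (often called LAA%). A sketch on synthetic HU values (the distribution parameters are invented):

```python
import numpy as np

def emphysema_fraction(hu, threshold=-910.0):
    """LAA%: fraction of lung voxels below a Hounsfield-unit threshold."""
    return float(np.mean(np.asarray(hu) < threshold))

# Synthetic lung: mostly normal parenchyma around -850 HU plus a
# low-attenuation (emphysema-like) tail around -950 HU
rng = np.random.default_rng(0)
voxels = np.concatenate([rng.normal(-850.0, 30.0, 9000),
                         rng.normal(-950.0, 20.0, 1000)])
laa = emphysema_fraction(voxels)
```

    On low-dose screening scans this metric is sensitive to noise and reconstruction kernel, one plausible reason the quantitative measure failed to reproduce the association seen with visual emphysema scoring.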

  15. Analysis of Artifacts Suggests DGGE Should Not Be Used For Quantitative Diversity Analysis

    PubMed Central

    Neilson, Julia W.; Jordan, Fiona L.; Maier, Raina M.

    2014-01-01

    PCR-denaturing gradient gel electrophoresis (PCR-DGGE) is widely used in microbial ecology for the analysis of comparative community structure. However, artifacts generated during PCR-DGGE of mixed template communities impede the application of this technique to quantitative analysis of community diversity. The objective of the current study was to employ an artificial bacterial community to document and analyze artifacts associated with multiband signatures and preferential template amplification and to highlight their impacts on the use of this technique for quantitative diversity analysis. Six bacterial species (three Betaproteobacteria, two Alphaproteobacteria, and one Firmicutes) were amplified individually and in combinations with primers targeting the V7/V8 region of the 16S rRNA gene. Two of the six isolates produced multiband profiles, demonstrating that band number does not correlate directly with α-diversity. Analysis of the multiple bands from one of these isolates confirmed that both bands had identical sequences, which led to the hypothesis that the multiband pattern resulted from two distinct structural conformations of the same amplicon. In addition, consistent preferential amplification was demonstrated following pairwise amplifications of the six isolates. DGGE and real-time PCR analysis identified primer mismatch and PCR inhibition due to 16S rDNA secondary structure as the most probable causes of preferential amplification patterns. Reproducible DGGE community profiles generated in this study confirm that PCR-DGGE provides an excellent high-throughput tool for comparative community structure analysis, but that method-specific artifacts preclude its use for accurate comparative diversity analysis. PMID:23313091

  16. Bridging the gaps for global sustainable development: a quantitative analysis.

    PubMed

    Udo, Victor E; Jansson, Peter Mark

    2009-09-01

    Global human progress occurs in a complex web of interactions between society, technology and the environment as driven by governance and infrastructure management capacity among nations. In our globalizing world, this complex web of interactions over the last 200 years has resulted in the chronic widening of economic and political gaps between the haves and the have-nots, with consequential global cultural and ecosystem challenges. At the bottom of these challenges is the issue of resource limitations on our finite planet with an increasing population. The problem is further compounded by pleasure-driven and poverty-driven ecological depletion and pollution by the haves and the have-nots, respectively. These challenges are explored quantitatively in this paper as global sustainable development (SD), in order to assess the gaps that need to be bridged. Although there has been significant rhetoric on SD, with many qualitative definitions offered, very few quantitative definitions of SD exist. The few that do exist tend to measure SD in terms of social, energy, economic and environmental dimensions. In our research, we used several human survival, development, and progress variables to create an aggregate SD parameter that describes the capacity of nations in three dimensions: social sustainability, environmental sustainability and technological sustainability. Using our proposed quantitative definition of SD and data from relatively reputable secondary sources, 132 nations were ranked and compared. Our comparisons indicate a global hierarchy of needs among nations similar to Maslow's at the individual level. As in Maslow's hierarchy of needs, nations that are struggling to survive are less concerned with environmental sustainability than advanced and stable nations. Nations such as the United States, Canada, Finland, Norway and others have higher SD capacity, and thus, are higher on their hierarchy of needs than nations such as Nigeria, Vietnam, Mexico and other
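    Aggregating normalized dimension scores into a single capacity index and ranking nations can be sketched as follows; the equal weighting and the scores are illustrative assumptions, not the paper's actual SD parameter:

```python
def sd_index(scores):
    """Aggregate normalized (0-1) social, environmental and
    technological scores into one capacity index.  Equal weights
    are an illustrative choice, not the paper's weighting."""
    return sum(scores.values()) / len(scores)

nations = {  # hypothetical normalized scores, not data from the paper
    "Nation A": {"social": 0.9, "environmental": 0.8, "technological": 0.9},
    "Nation B": {"social": 0.4, "environmental": 0.6, "technological": 0.3},
    "Nation C": {"social": 0.7, "environmental": 0.5, "technological": 0.6},
}
ranking = sorted(nations, key=lambda n: sd_index(nations[n]), reverse=True)
```

    In practice the choice of normalization and weights drives the ranking as much as the raw indicators do, which is why any composite index should report both.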

  17. Modeling of X-Ray Fluorescence for Quantitative Analysis

    NASA Astrophysics Data System (ADS)

    Zarkadas, Charalambos

    2010-03-01

    Quantitative XRF algorithms involve mathematical procedures intended to solve a set of equations expressing the total fluorescence intensity of selected X-ray element lines emitted after sample irradiation by a photon source. These equations [1] have been derived under the assumptions of a parallel exciting beam and a perfectly flat, uniform sample, and have been extended to date to describe composite cases such as multilayered samples and samples exhibiting particle-size effects. In state-of-the-art algorithms the equations include most of the physical processes that can contribute to the measured fluorescence signal and make use of evaluated databases for the fundamental parameters included in the calculations. The accuracy of the results obtained depends to a great extent on the completeness of the model used to describe X-ray fluorescence intensities and on the compliance of the actual experimental conditions with the basic assumptions under which the mathematical formulas were derived.

  18. Quantitative analysis of CT scans of ceramic candle filters

    SciTech Connect

    Ferer, M.V.; Smith, D.H.

    1996-12-31

    Candle filters are being developed to remove coal ash and other fine particles (<15 μm) from hot (ca. 1000 K) gas streams. In the present work, a color scanner was used to digitize hard-copy CT X-ray images of cylindrical SiC filters, and linear regressions converted the scanned (color) data to a filter density for each pixel. These data, with the aid of the density of SiC, gave a filter porosity for each pixel. Radial averages, density-density correlation functions, and other statistical analyses were performed on the density data. The CT images also detected the presence and depth of cracks that developed during usage of the filters. The quantitative data promise to be a very useful addition to the color images.
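    The pixel-wise conversion from CT density to porosity, plus a radial average, can be sketched as below; the theoretical SiC density of 3.21 g/cm^3 and the uniform test image are assumptions for illustration:

```python
import numpy as np

SIC_DENSITY = 3.21      # g/cm^3, theoretical density of SiC (assumed)

def porosity_map(density):
    """Per-pixel porosity from a CT density map: phi = 1 - rho/rho_solid."""
    return 1.0 - np.asarray(density) / SIC_DENSITY

def radial_average(img, n_bins=10):
    """Mean value in concentric radial bins about the image centre."""
    ny, nx = img.shape
    y, x = np.indices((ny, nx))
    r = np.hypot(y - (ny - 1) / 2.0, x - (nx - 1) / 2.0)
    bins = (r / (r.max() + 1e-9) * n_bins).astype(int)
    return np.array([img[bins == b].mean() for b in range(n_bins)])

density = np.full((64, 64), 1.9)   # uniform test image: ~40% porous wall
phi = porosity_map(density)
profile = radial_average(phi)
```

    On real filter cross-sections the radial profile reveals density gradients through the wall, and dips in the per-pixel map flag cracks.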

  19. Quantitative Analysis of Matrine in Liquid Crystalline Nanoparticles by HPLC

    PubMed Central

    Peng, Xinsheng; Hu, Min; Ling, Yahao; Tian, Yuan; Zhou, Yanxing; Zhou, Yanfang

    2014-01-01

    A reversed-phase high-performance liquid chromatographic method has been developed to quantitatively determine matrine in liquid crystal nanoparticles. The chromatographic method is carried out using an isocratic system. The mobile phase was composed of methanol-PBS (pH 6.8)-triethylamine (50 : 50 : 0.1%) at a flow rate of 1 mL/min with an SPD-20A UV/vis detector; the detection wavelength was 220 nm. The linearity of matrine is in the range of 1.6 to 200.0 μg/mL. The regression equation is y = 10706x − 2959 (R2 = 1.0). The average recovery is 101.7%; RSD = 2.22% (n = 9). This method provides a simple and accurate strategy to determine matrine in liquid crystalline nanoparticles. PMID:24834359
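    Inverting the reported calibration line turns a measured peak area back into a matrine concentration. A short round-trip sketch using the regression coefficients quoted above:

```python
def concentration_from_area(area, slope=10706.0, intercept=-2959.0):
    """Invert the calibration line y = 10706*x - 2959, where y is
    peak area and x is matrine concentration in ug/mL."""
    return (area - intercept) / slope

# Round trip: the area predicted for 100 ug/mL should map back to 100
area_100 = 10706.0 * 100.0 + (-2959.0)
c = concentration_from_area(area_100)
```

    Concentrations are only trustworthy inside the validated linear range (1.6 to 200.0 μg/mL here); samples above it should be diluted and re-run rather than extrapolated.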

  20. Cross-bridge model of muscle contraction. Quantitative analysis.

    PubMed Central

    Eisenberg, E; Hill, T L; Chen, Y

    1980-01-01

    We recently presented, in a qualitative manner, a cross-bridge model of muscle contraction which was based on a biochemical kinetic cycle for the actomyosin ATPase activity. This cross-bridge model consisted of two cross-bridge states detached from actin and two cross-bridge states attached to actin. In the present paper, we attempt to fit this model quantitatively to both biochemical and physiological data. We find that the resulting complete cross-bridge model is able to account reasonably well for both the isometric transient data observed when a muscle is subjected to a sudden change in length and for the relationship between the velocity of muscle contraction in vivo and the actomyosin ATPase activity in vitro. This model also illustrates the interrelationship between biochemical and physiological data necessary for the development of a complete cross-bridge model of muscle contraction. PMID:6455168

  1. Quantitative error analysis for computer assisted navigation: a feasibility study

    PubMed Central

    Güler, Ö.; Perwög, M.; Kral, F.; Schwarm, F.; Bárdosi, Z. R.; Göbel, G.; Freysinger, W.

    2013-01-01

    Purpose The benefit of computer-assisted navigation depends on the registration process, in which patient features are correlated to preoperative imagery. The operator-induced uncertainty in localizing patient features – the User Localization Error (ULE) – is unknown and most likely dominates the application accuracy. This initial feasibility study aims at providing first data for the ULE with a research navigation system. Methods Active optical navigation was done in CT images of a plastic skull, an anatomic specimen (both with implanted fiducials) and a volunteer with anatomical landmarks exclusively. Each object was registered ten times with 3, 5, 7, and 9 registration points. Measurements were taken at 10 targets (anatomic specimen and volunteer) and 11 targets (plastic skull). The active NDI Polaris system was used under ideal working conditions (tracking accuracy 0.23 mm root mean square, RMS; probe tip calibration 0.18 mm RMS). Variances of tracking along the principal directions were measured as 0.18 mm², 0.32 mm², and 0.42 mm². The ULE was calculated from predicted application accuracy with isotropic and anisotropic models and from experimental variances, respectively. Results The ULE was determined from the variances as 0.45 mm (plastic skull), 0.60 mm (anatomic specimen), and 4.96 mm (volunteer). The predicted application accuracy did not yield consistent values for the ULE. Conclusions Quantitative data of application accuracy could be tested against prediction models with iso- and anisotropic noise models and revealed some discrepancies. This could potentially be due to the fact that navigation and one prediction model wrongly assume isotropic noise (tracking is anisotropic), while the anisotropic noise prediction model assumes an anisotropic registration strategy (registration is isotropic in typical navigation systems). The ULE data are presumably the first quantitative values for the precision of localizing anatomical landmarks and implanted fiducials.

  2. [Quantitative Measurement of Equivalence Ratios of Methane/Air Mixture by Laser-Induced Breakdown Spectroscopy: the Effects of Detector Gated Mode and Laser Wavelength].

    PubMed

    Zuo, Peng; Li, Bo; Yan, Bei-bei; Li, Zhong-shan; Yao, Ming-fa

    2015-11-01

    Laser-induced breakdown spectroscopy (LIBS) has in recent years been increasingly used in combustion diagnostics as a novel spectral analysis method. The quantitative local equivalence ratio of a methane/air mixture is determined by LIBS using different emission intensity ratios of H/O and H/N. A comparison between the calibration curves of H₆₅₆/O₇₇₇ and H₆₅₆/N₇₄₆ in gated mode shows that H₆₅₆/O₇₇₇ achieves better prediction accuracy and higher sensitivity. More spectral intensity ratios (H₆₅₆/O₇₇₇, H₆₅₆/N₅₀₀⁺, H₆₅₆/N₅₆₇ and H₆₅₆/N₇₄₆) can be used for calibration measurements in ungated mode, and H₆₅₆/O₇₇₇ again performs best among them. The comparison between gated and ungated detection modes shows that gated mode offers better accuracy and precision. In addition, the effects of different laser wavelengths (1064, 532 and 355 nm) on LIBS spectra and calibration curves are investigated with the laser focal spot size and laser fluence kept constant. The results show that with longer laser wavelength, the peak intensity and SNR of the H, O and N lines increase, as does the slope of the H₆₅₆/O₇₇₇ calibration curve. Among these three wavelengths, the 1064 nm laser is best suited to measuring the equivalence ratio of a CH₄/air mixture by LIBS. The experimental results are explained in terms of plasma electron density and temperature, which have a significant impact on the emission intensity and the partition functions of hydrogen and oxygen, respectively.

  3. Theoretical and numerical analysis of the corneal air puff test

    NASA Astrophysics Data System (ADS)

    Simonini, Irene; Angelillo, Maurizio; Pandolfi, Anna

    2016-08-01

    Ocular analyzers are used in current clinical practice to estimate, by means of a rapid air jet, the intraocular pressure and other parameters of the eye. In this study, we model the biomechanical response of the human cornea to the dynamic test with two approaches. In the first approach, the corneal system undergoing the air puff test is regarded as a harmonic oscillator. In the second approach, we use patient-specific geometries and the finite element method to simulate the dynamic test on surgically treated corneas. In spite of the different levels of approximation, the qualitative response of the two models is very similar, and the most meaningful results of both models are not significantly affected by the inclusion of viscosity of the corneal material in the dynamic analysis. Finite element calculations reproduce the observed snap-through of the corneal shell, including two applanate configurations, and compare well with in vivo images provided by ocular analyzers, suggesting that the mechanical response of the cornea to the air puff test is actually driven only by the elasticity of the stromal tissue. These observations agree with the dynamic characteristics of the test, since the frequency of the air puff impulse is several orders of magnitude larger than the reciprocal of any reasonable relaxation time for the material, downplaying the role of viscosity during the fast snap-through phase.

  4. Aviation System Analysis Capability Air Carrier Investment Model-Cargo

    NASA Technical Reports Server (NTRS)

    Johnson, Jesse; Santmire, Tara

    1999-01-01

    The purpose of the Aviation System Analysis Capability (ASAC) Air Carrier Investment Model-Cargo (ACIMC) is to examine the economic effects of technology investment on the air cargo market, particularly the market for new cargo aircraft. To do so, we have built an econometrically based model designed to operate like the ACIM. Two main drivers account for virtually all of the demand: the growth rate of the Gross Domestic Product (GDP) and changes in the fare yield (a proxy for the price charged, or fare). These differences arise from a combination of the nature of air cargo demand and the peculiarities of the air cargo market. The net effect of these two factors is that sales of new cargo aircraft are much less sensitive to either increases in GDP or changes in the costs of labor, capital, fuel, materials, and energy associated with the production of new cargo aircraft than are sales of new passenger aircraft. This, in conjunction with the relatively small size of the cargo aircraft market, means that technology improvements to cargo aircraft will do relatively little to spur increased sales of new cargo aircraft.

  5. An analysis of secondary pollutants in Buenos Aires City.

    PubMed

    Reich, Silvia; Magallanes, Jorge; Dawidowski, Laura; Gómez, Darío; Groselj, Neva; Zupan, Jure

    2006-08-01

    Air pollutant concentrations from a monitoring campaign in Buenos Aires City, Argentina, are used to investigate the relationships between ambient levels of ozone (O3), nitric oxide (NO) and nitrogen dioxide (NO2) as a function of NO(x) (= NO + NO2). This campaign, undertaken by the electricity sector, was aimed at elucidating the contribution of thermal power plants to air quality deterioration. Concentrations of carbon monoxide (CO) and sulphur dioxide (SO2) were also registered. The photostationary state (PSS) of the NO, NO2, O3 and peroxy radical species has been analysed. The concept of an 'oxidant' level, OX (= O3 + NO2), which varies with the level of NO(x), has been introduced. It is shown that this level is made up of NO(x)-independent and NO(x)-dependent contributions: the former is a regional contribution that equates to the background O3 level, whereas the latter is a local contribution that correlates with the level of primary pollution. Furthermore, the anticorrelation between NO2 and O3 levels, which is characteristic of the atmospheric photostationary cycle, has been verified. The analysis of the concentrations of the primary pollutants CO and NO strongly suggests that vehicle traffic is their principal source. Levels of continuous measurements of SO2 for Buenos Aires City are reported in this work as a complement to previously published results.
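The OX partitioning described above amounts to a linear regression of OX on NO(x): the intercept estimates the NO(x)-independent regional background, the slope the local, NO(x)-dependent contribution. A minimal sketch on synthetic data (all concentrations hypothetical):

```python
import random
import statistics

random.seed(0)

# Hypothetical hourly observations (ppb): OX = O3 + NO2 versus NOx,
# generated with a 35 ppb regional background and a 0.12 local slope.
nox = [random.uniform(5.0, 120.0) for _ in range(200)]
ox = [35.0 + 0.12 * x + random.gauss(0.0, 1.0) for x in nox]

# Ordinary least squares by hand: slope = cov(x, y) / var(x)
mx, my = statistics.fmean(nox), statistics.fmean(ox)
slope = (sum((x - mx) * (y - my) for x, y in zip(nox, ox))
         / sum((x - mx) ** 2 for x in nox))
intercept = my - slope * mx

print(f"NOx-independent (regional) OX ~ {intercept:.1f} ppb")
print(f"NOx-dependent (local) slope ~ {slope:.3f} ppb OX per ppb NOx")
```

The fit recovers the background and local terms that were built into the synthetic data, mirroring the decomposition used in the study.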

  6. Why social network analysis is important to Air Force applications

    NASA Astrophysics Data System (ADS)

    Havig, Paul R.; McIntire, John P.; Geiselman, Eric; Mohd-Zaid, Fairul

    2012-06-01

    Social network analysis is a powerful tool that helps analysts discover relationships among groups of people as well as individuals. It is the mathematics behind social networks such as Facebook and MySpace. These networks alone generate a huge amount of data, and the issue is only compounded once one adds other electronic media such as e-mail and Twitter. In this paper we outline the basics of social network analysis and how it may be used in current and future Air Force applications.

  7. [Research progress of quantitative analysis for respiratory sinus arrhythmia].

    PubMed

    Sun, Congcong; Zhang, Zhengbo; Wang, Buqing; Liu, Hongyun; Ang, Qing; Wang, Weidong

    2011-12-01

    Respiratory sinus arrhythmia (RSA) refers to the fluctuations of heart rate associated with breathing. It has recently seen increasing use as a noninvasive index of cardiac vagal tone in psychophysiological research. Its analysis is often influenced or distorted by respiratory parameters, posture, and movement. This paper reviews five methods of quantification: the root mean square of successive differences (RMSSD), peak-valley RSA (pvRSA), cosinor fitting, spectral analysis, and joint time-frequency analysis (JTFA). Paced breathing, analysis of covariance, the residual method, and msRSA per liter tidal volume are also discussed as adjustment strategies for the measurement and analysis of RSA. Finally, prospects for solving open problems in RSA research are given.
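Of the five quantification methods listed, RMSSD is the simplest to state: the root mean square of successive differences between adjacent R-R intervals. A minimal sketch (the R-R series is hypothetical):

```python
import math

def rmssd(rr_ms):
    """Root mean square of successive differences of R-R intervals (ms)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical R-R interval series (ms) with respiration-linked fluctuation
rr = [820, 860, 900, 870, 830, 810, 850, 890]
print(round(rmssd(rr), 1))  # → 36.4
```

Larger RMSSD values indicate greater beat-to-beat variability, which is why the index is sensitive to the respiratory and postural confounds the review discusses.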

  8. Rethinking Meta-Analysis: Applications for Air Pollution Data and Beyond

    PubMed Central

    Goodman, Julie E; Petito Boyce, Catherine; Sax, Sonja N; Beyer, Leslie A; Prueitt, Robyn L

    2015-01-01

    Meta-analyses offer a rigorous and transparent systematic framework for synthesizing data that can be used for a wide range of research areas, study designs, and data types. Both the outcome of meta-analyses and the meta-analysis process itself can yield useful insights for answering scientific questions and making policy decisions. Development of the National Ambient Air Quality Standards illustrates many potential applications of meta-analysis. These applications demonstrate the strengths and limitations of meta-analysis, issues that arise in various data realms, how meta-analysis design choices can influence interpretation of results, and how meta-analysis can be used to address bias and heterogeneity. Reviewing available data from a meta-analysis perspective can provide a useful framework and impetus for identifying and refining strategies for future research. Moreover, increased pervasiveness of a meta-analysis mindset—focusing on how the pieces of the research puzzle fit together—would benefit scientific research and data syntheses regardless of whether or not a quantitative meta-analysis is undertaken. While an individual meta-analysis can only synthesize studies addressing the same research question, the results of separate meta-analyses can be combined to address a question encompassing multiple data types. This observation applies to any scientific or policy area where information from a variety of disciplines must be considered to address a broader research question. PMID:25969128

  9. Quantitative analysis of numerical solvers for oscillatory biomolecular system models

    PubMed Central

    Quo, Chang F; Wang, May D

    2008-01-01

    Background This article provides guidelines for selecting optimal numerical solvers for biomolecular system models. Because various parameters of the same system could have drastically different ranges from 10⁻¹⁵ to 10¹⁰, the ODEs can be stiff and ill-conditioned, resulting in non-unique, non-existing, or non-reproducible modeling solutions. Previous studies have not examined in depth how to best select numerical solvers for biomolecular system models, which makes it difficult to experimentally validate the modeling results. To address this problem, we have chosen one of the well-known stiff initial value problems with limit cycle behavior as a test-bed system model. Solving this model, we have illustrated that different answers may result from different numerical solvers. We use MATLAB numerical solvers because they are optimized and widely used by the modeling community. We have also conducted a systematic study of numerical solver performances by using qualitative and quantitative measures such as convergence, accuracy, and computational cost (i.e. in terms of function evaluation, partial derivative, LU decomposition, and "take-off" points). The results show that the modeling solutions can be drastically different using different numerical solvers. Thus, it is important to intelligently select numerical solvers when solving biomolecular system models. Results The classic Belousov-Zhabotinskii (BZ) reaction is described by the Oregonator model and is used as a case study. We report two guidelines in selecting optimal numerical solver(s) for stiff, complex oscillatory systems: (i) for problems with unknown parameters, ode45 is the optimal choice regardless of the relative error tolerance; (ii) for known stiff problems, both ode113 and ode15s are good choices under strict relative tolerance conditions.
Conclusions For any given biomolecular model, by building a library of numerical solvers with quantitative performance assessment metric, we show that it is possible
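The core issue behind solver choice here is stiffness: explicit solvers such as ode45 face a stability limit on step size that implicit solvers such as ode15s avoid. A minimal stdlib sketch on the linear test problem y' = -λy (a stand-in chosen for brevity, not the Oregonator itself, which is far more expensive for explicit methods):

```python
LAM = 1000.0  # stiffness constant: y' = -LAM * y, exact solution decays to 0

def explicit_euler(h, t_end):
    """Forward Euler; stable for this problem only when h < 2 / LAM."""
    y, t = 1.0, 0.0
    while t < t_end:
        y += h * (-LAM * y)
        t += h
    return y

def implicit_euler(h, t_end):
    """Backward Euler; unconditionally stable for this problem."""
    y, t = 1.0, 0.0
    while t < t_end:
        y = y / (1.0 + LAM * h)  # solves y_new = y + h * (-LAM * y_new)
        t += h
    return y

y_exp = explicit_euler(0.003, 0.1)  # h > 2/LAM = 0.002: solution blows up
y_imp = implicit_euler(0.010, 0.1)  # five times that limit: still decays
print(f"explicit Euler, h=0.003: y(0.1) ~ {y_exp:.3g}")
print(f"implicit Euler, h=0.010: y(0.1) ~ {y_imp:.3g}")
```

The explicit scheme diverges while the implicit one decays correctly, which is the same trade-off that makes ode15s (implicit, variable-order BDF/NDF) preferable to ode45 (explicit Runge-Kutta) on known stiff models like the Oregonator.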

  10. Calibration and Data Analysis of the MC-130 Air Balance

    NASA Technical Reports Server (NTRS)

    Booth, Dennis; Ulbrich, N.

    2012-01-01

    Design, calibration, calibration analysis, and intended use of the MC-130 air balance are discussed. The MC-130 balance is an 8.0 inch diameter force balance that has two separate internal air flow systems and one external bellows system. The manual calibration of the balance consisted of a total of 1854 data points with both unpressurized and pressurized air flowing through the balance. A subset of 1160 data points was chosen for the calibration data analysis. The regression analysis of the subset was performed using two fundamentally different analysis approaches. First, the data analysis was performed using a recently developed extension of the Iterative Method. This approach fits gage outputs as a function of both applied balance loads and bellows pressures while still allowing the application of the iteration scheme that is used with the Iterative Method. Then, for comparison, the axial force was also analyzed using the Non-Iterative Method. This alternate approach directly fits loads as a function of measured gage outputs and bellows pressures and does not require a load iteration. The regression models used by both the extended Iterative and Non-Iterative Method were constructed such that they met a set of widely accepted statistical quality requirements. These requirements lead to reliable regression models and prevent overfitting of data because they ensure that no hidden near-linear dependencies between regression model terms exist and that only statistically significant terms are included. Finally, a comparison of the axial force residuals was performed. Overall, axial force estimates obtained from both methods show excellent agreement as the differences of the standard deviation of the axial force residuals are on the order of 0.001 % of the axial force capacity.

  11. Longitudinal Metagenomic Analysis of Hospital Air Identifies Clinically Relevant Microbes

    PubMed Central

    King, Paula; Pham, Long K.; Waltz, Shannon; Sphar, Dan; Yamamoto, Robert T.; Conrad, Douglas; Taplitz, Randy; Torriani, Francesca

    2016-01-01

    We describe the sampling of sixty-three uncultured hospital air samples collected over a six-month period and analysis using shotgun metagenomic sequencing. Our primary goals were to determine the longitudinal metagenomic variability of this environment, identify and characterize genomes of potential pathogens and determine whether they are atypical to the hospital airborne metagenome. Air samples were collected from eight locations which included patient wards, the main lobby and outside. The resulting DNA libraries produced 972 million sequences representing 51 gigabases. Hierarchical clustering of samples by the most abundant 50 microbial orders generated three major nodes which primarily clustered by type of location. Because the indoor locations were longitudinally consistent, episodic relative increases in microbial genomic signatures related to the opportunistic pathogens Aspergillus, Penicillium and Stenotrophomonas were identified as outliers at specific locations. Further analysis of microbial reads specific for Stenotrophomonas maltophilia indicated homology to a sequenced multi-drug resistant clinical strain and we observed broad sequence coverage of resistance genes. We demonstrate that a shotgun metagenomic sequencing approach can be used to characterize the resistance determinants of pathogen genomes that are uncharacteristic for an otherwise consistent hospital air microbial metagenomic profile. PMID:27482891

  12. Quantitative analysis of agricultural land use change in China

    NASA Astrophysics Data System (ADS)

    Chou, Jieming; Dong, Wenjie; Wang, Shuyu; Fu, Yuqing

    This article reviews the potential impacts of climate change on land use change in China. Crop sown area is used as an index to quantitatively analyze the temporal-spatial changes and the utilization of agricultural land. A new concept, the potential multiple cropping index, is defined to reflect potential sowing ability. The impacting mechanism, land use status and its surplus capacity are investigated as well. The main conclusions are as follows. During 1949-2010, agricultural land area was greatest in central China, followed by the country's eastern and western regions. The most rapid increase and decrease of agricultural land were observed in Xinjiang and North China, respectively; Northwest China and South China also changed rapidly. The variation trend before 1980 differed significantly from that after 1980. Agricultural land was affected by both natural and social factors, such as regional climate and environmental changes, population growth, economic development, and the implementation of policies. In this paper, the effects of temperature and urbanization on the coverage of agricultural land are evaluated, and the results show that urbanization greatly affects the amount of agricultural land in South China, Northeast China, Xinjiang and Southwest China. From 1980 to 2009, the extent of agricultural land use increased as the surplus capacity decreased. A large potential space still remains, but future utilization of agricultural land should be carried out with scientific planning and management for sustainable development.

  13. Quantitative analysis of chromosome condensation in fission yeast.

    PubMed

    Petrova, Boryana; Dehler, Sascha; Kruitwagen, Tom; Hériché, Jean-Karim; Miura, Kota; Haering, Christian H

    2013-03-01

    Chromosomes undergo extensive conformational rearrangements in preparation for their segregation during cell divisions. Insights into the molecular mechanisms behind this still poorly understood condensation process require the development of new approaches to quantitatively assess chromosome formation in vivo. In this study, we present a live-cell microscopy-based chromosome condensation assay in the fission yeast Schizosaccharomyces pombe. By automatically tracking the three-dimensional distance changes between fluorescently marked chromosome loci at high temporal and spatial resolution, we analyze chromosome condensation during mitosis and meiosis and deduce defined parameters to describe condensation dynamics. We demonstrate that this method can determine the contributions of condensin, topoisomerase II, and Aurora kinase to mitotic chromosome condensation. We furthermore show that the assay can identify proteins required for mitotic chromosome formation de novo by isolating mutants in condensin, DNA polymerase ε, and F-box DNA helicase I that are specifically defective in pro-/metaphase condensation. Thus, the chromosome condensation assay provides a direct and sensitive system for the discovery and characterization of components of the chromosome condensation machinery in a genetically tractable eukaryote.
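The distance-based readout described above can be sketched as follows. The locus coordinates are hypothetical; only the idea of tracking the 3-D inter-locus distance over time as a condensation measure follows the abstract:

```python
import math

def dist3d(p, q):
    """Euclidean distance between two 3-D locus positions (um)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Hypothetical tracked positions (um) of two marked loci over successive frames
locus_a = [(1.0, 2.0, 0.5), (1.1, 1.8, 0.6), (1.2, 1.5, 0.7)]
locus_b = [(3.0, 4.0, 1.5), (2.7, 3.4, 1.3), (2.2, 2.5, 1.0)]

distances = [dist3d(a, b) for a, b in zip(locus_a, locus_b)]
# Fractional compaction relative to the first frame as a condensation readout
compaction = [d / distances[0] for d in distances]
print([round(d, 2) for d in distances])  # → [3.0, 2.37, 1.45]
```

Condensation parameters (onset time, rate, final compaction) would then be extracted from such distance-versus-time traces across many cells.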

  14. Quantitative analysis of TALE-DNA interactions suggests polarity effects.

    PubMed

    Meckler, Joshua F; Bhakta, Mital S; Kim, Moon-Soo; Ovadia, Robert; Habrian, Chris H; Zykovich, Artem; Yu, Abigail; Lockwood, Sarah H; Morbitzer, Robert; Elsäesser, Janett; Lahaye, Thomas; Segal, David J; Baldwin, Enoch P

    2013-04-01

    Transcription activator-like effectors (TALEs) have revolutionized the field of genome engineering. We present here a systematic assessment of TALE DNA recognition, using quantitative electrophoretic mobility shift assays and reporter gene activation assays. Within TALE proteins, tandem 34-amino acid repeats recognize one base pair each and direct sequence-specific DNA binding through repeat variable di-residues (RVDs). We found that RVD choice can affect affinity by four orders of magnitude, with the relative RVD contribution in the order NG > HD ≈ NN ≫ NI > NK. The NN repeat preferred the base G over A, whereas the NK repeat bound G with 10³-fold lower affinity. We compared AvrBs3, a naturally occurring TALE that recognizes its target using some atypical RVD-base combinations, with a designed TALE that precisely matches 'standard' RVDs with the target bases. This comparison revealed unexpected differences in sensitivity to substitutions of the invariant 5'-T. Another surprising observation was that base mismatches at the 5' end of the target site had more disruptive effects on affinity than those at the 3' end, particularly in designed TALEs. These results provide evidence that TALE-DNA recognition exhibits a hitherto un-described polarity effect, in which the N-terminal repeats contribute more to affinity than C-terminal ones.

  15. Quantitative analysis of TALE–DNA interactions suggests polarity effects

    PubMed Central

    Meckler, Joshua F.; Bhakta, Mital S.; Kim, Moon-Soo; Ovadia, Robert; Habrian, Chris H.; Zykovich, Artem; Yu, Abigail; Lockwood, Sarah H.; Morbitzer, Robert; Elsäesser, Janett; Lahaye, Thomas; Segal, David J.; Baldwin, Enoch P.

    2013-01-01

    Transcription activator-like effectors (TALEs) have revolutionized the field of genome engineering. We present here a systematic assessment of TALE DNA recognition, using quantitative electrophoretic mobility shift assays and reporter gene activation assays. Within TALE proteins, tandem 34-amino acid repeats recognize one base pair each and direct sequence-specific DNA binding through repeat variable di-residues (RVDs). We found that RVD choice can affect affinity by four orders of magnitude, with the relative RVD contribution in the order NG > HD ∼ NN ≫ NI > NK. The NN repeat preferred the base G over A, whereas the NK repeat bound G with 10³-fold lower affinity. We compared AvrBs3, a naturally occurring TALE that recognizes its target using some atypical RVD-base combinations, with a designed TALE that precisely matches ‘standard’ RVDs with the target bases. This comparison revealed unexpected differences in sensitivity to substitutions of the invariant 5′-T. Another surprising observation was that base mismatches at the 5′ end of the target site had more disruptive effects on affinity than those at the 3′ end, particularly in designed TALEs. These results provide evidence that TALE–DNA recognition exhibits a hitherto un-described polarity effect, in which the N-terminal repeats contribute more to affinity than C-terminal ones. PMID:23408851

  16. Temporal Kinetics and Quantitative Analysis of Cryptococcus neoformans Nonlytic Exocytosis

    PubMed Central

    Stukes, Sabriya A.; Cohen, Hillel W.

    2014-01-01

    Cryptococcus neoformans is a facultative intracellular pathogen and the causative agent of cryptococcosis, a disease that is often fatal to those with compromised immune systems. C. neoformans has the capacity to escape phagocytic cells through a process known as nonlytic exocytosis whereby the cryptococcal cell is released from the macrophage into the extracellular environment, leaving both the host and pathogen alive. Little is known about the mechanism behind nonlytic exocytosis, but there is evidence that both the fungal and host cells contribute to the process. In this study, we used time-lapse movies of C. neoformans-infected macrophages to delineate the kinetics and quantitative aspects of nonlytic exocytosis. We analyzed approximately 800 macrophages containing intracellular C. neoformans and identified 163 nonlytic exocytosis events that were further characterized into three subcategories: type I (complete emptying of macrophage), type II (partial emptying of macrophage), and type III (cell-to-cell transfer). The majority of type I and II events occurred after several hours of intracellular residence, whereas type III events occurred significantly (P < 0.001) earlier in the course of macrophage infection. Our results show that nonlytic exocytosis is a morphologically and temporally diverse process that occurs relatively rapidly in the course of macrophage infection. PMID:24595144

  17. On the in vivo action of erythropoietin: a quantitative analysis.

    PubMed

    Papayannopoulou, T; Finch, C A

    1972-05-01

    The composite response of the erythron to exogenous erythropoietin has been studied in normal, splenectomized, and polycythemic mice. After stimulation the normal animal doubled its marrow nucleated red cells by the 3rd day with little further change by the 5th. Nucleated red cells within the spleen began to increase sharply on the 2nd day and, by the 5th, exceeded those in the marrow. The total nucleated erythroid response represented a fourfold increase. Reticulocytes lagged behind the expansion of the nucleated red cell mass, but by the 5th day the original ratio was re-established. Hemoglobin synthesis was increased, but the ratio of hemoglobin synthesized in nucleated red cells and reticulocytes was basically unchanged. Early displacement of marrow reticulocytes into circulation and the production of a larger red cell also occurred. No evidence of a change in the number of erythroid mitoses was found; only a slight decrease in the average cell cycle time was demonstrated. Thus, whereas erythropoietin stimulation induced several changes in erythropoiesis, the increased number of cells entering into the maturing pool appeared to be of greatest quantitative significance.Splenectomy reduced the proliferative response of the erythron over 5 days stimulation to three-fourths that found in the normal animal. This difference, also reflected in a proportionately lower reticulocyte response and increment in circulating red cell mass, suggests that erythropoiesis within the mouse marrow is spatially or otherwise restricted and that the spleen provided a supplemental area of erythroid expansion.

  18. On the in vivo action of erythropoietin: a quantitative analysis

    PubMed Central

    Papayannopoulou, Thalia; Finch, Clement A.

    1972-01-01

    The composite response of the erythron to exogenous erythropoietin has been studied in normal, splenectomized, and polycythemic mice. After stimulation the normal animal doubled its marrow nucleated red cells by the 3rd day with little further change by the 5th. Nucleated red cells within the spleen began to increase sharply on the 2nd day and, by the 5th, exceeded those in the marrow. The total nucleated erythroid response represented a fourfold increase. Reticulocytes lagged behind the expansion of the nucleated red cell mass, but by the 5th day the original ratio was re-established. Hemoglobin synthesis was increased, but the ratio of hemoglobin synthesized in nucleated red cells and reticulocytes was basically unchanged. Early displacement of marrow reticulocytes into circulation and the production of a larger red cell also occurred. No evidence of a change in the number of erythroid mitoses was found; only a slight decrease in the average cell cycle time was demonstrated. Thus, whereas erythropoietin stimulation induced several changes in erythropoiesis, the increased number of cells entering into the maturing pool appeared to be of greatest quantitative significance. Splenectomy reduced the proliferative response of the erythron over 5 days stimulation to three-fourths that found in the normal animal. This difference, also reflected in a proportionately lower reticulocyte response and increment in circulating red cell mass, suggests that erythropoiesis within the mouse marrow is spatially or otherwise restricted and that the spleen provided a supplemental area of erythroid expansion. PMID:5020431

  19. Quantitative Analysis of Synaptic Release at the Photoreceptor Synapse

    PubMed Central

    Duncan, Gabriel; Rabl, Katalin; Gemp, Ian; Heidelberger, Ruth; Thoreson, Wallace B.

    2010-01-01

    Exocytosis from the rod photoreceptor is stimulated by submicromolar Ca2+ and exhibits an unusually shallow dependence on presynaptic Ca2+. To provide a quantitative description of the photoreceptor Ca2+ sensor for exocytosis, we tested a family of conventional and allosteric computational models describing the final Ca2+-binding steps leading to exocytosis. Simulations were fit to two measures of release, evoked by flash-photolysis of caged Ca2+: exocytotic capacitance changes from individual rods and postsynaptic currents of second-order neurons. The best simulations supported the occupancy of only two Ca2+ binding sites on the rod Ca2+ sensor rather than the typical four or five. For most models, the on-rates for Ca2+ binding and maximal fusion rate were comparable to those of other neurons. However, the off-rates for Ca2+ unbinding were unexpectedly slow. In addition to contributing to the high-affinity of the photoreceptor Ca2+ sensor, slow Ca2+ unbinding may support the fusion of vesicles located at a distance from Ca2+ channels. In addition, partial sensor occupancy due to slow unbinding may contribute to the linearization of the first synapse in vision. PMID:20483317
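Why fewer binding sites imply a shallower Ca2+ dependence can be illustrated with a toy equilibrium occupancy model. This is a hedged sketch with a hypothetical Kd and concentrations, not the kinetic (conventional or allosteric) models actually fit in the study:

```python
def fusion_rate(ca, kd, n, rmax=1.0):
    """Toy model: fusion requires all n independent Ca2+ sites occupied,
    each with equilibrium occupancy ca / (ca + kd)."""
    occ = ca / (ca + kd)
    return rmax * occ ** n

# In the low-occupancy regime (ca << kd), doubling Ca2+ multiplies the
# rate by ~2^n, so fewer sites give a shallower Ca2+ dependence.
for n in (2, 5):
    ratio = fusion_rate(0.2, 10.0, n) / fusion_rate(0.1, 10.0, n)
    print(f"n={n}: doubling Ca2+ multiplies rate by ~{ratio:.1f}")
```

With two sites the rate roughly quadruples on doubling Ca2+, while with five it scales about 30-fold; the shallow dependence measured in rods is therefore consistent with the two-site sensor the simulations support.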

  20. Quantitative Analysis of CME Deflections in the Corona

    NASA Astrophysics Data System (ADS)

    Gui, Bin; Shen, Chenglong; Wang, Yuming; Ye, Pinzhong; Liu, Jiajia; Wang, Shui; Zhao, Xuepu

    2011-07-01

    In this paper, ten CME events viewed by the STEREO twin spacecraft are analyzed to study the deflections of CMEs during their propagation in the corona. Based on the three-dimensional information of the CMEs derived by the graduated cylindrical shell (GCS) model (Thernisien, Howard, and Vourlidas in Astrophys. J. 652, 1305, 2006), it is found that the propagation directions of eight CMEs had changed. By applying the theoretical method proposed by Shen et al. (Solar Phys. 269, 389, 2011) to all the CMEs, we found that the deflections are consistent, in strength and direction, with the gradient of the magnetic energy density. There is a positive correlation between the deflection rate and the strength of the magnetic energy density gradient and a weak anti-correlation between the deflection rate and the CME speed. Our results suggest that the deflections of CMEs are mainly controlled by the background magnetic field and can be quantitatively described by the magnetic energy density gradient (MEDG) model.
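The MEDG picture, in which a CME is deflected toward lower magnetic energy density B^2/(2*mu0), can be sketched with a one-dimensional finite difference (field values and grid spacing below are hypothetical, not taken from the ten analyzed events):

```python
MU0 = 4e-7 * 3.141592653589793  # vacuum permeability, T*m/A

def energy_density(b):
    """Magnetic energy density u = B^2 / (2*mu0), in J/m^3."""
    return b * b / (2 * MU0)

def deflection_tendency_1d(b_left, b_right, dx):
    """Sign of the deflecting tendency along one axis: the MEDG model
    pushes the CME down the energy-density gradient, i.e. along -grad(u)."""
    grad_u = (energy_density(b_right) - energy_density(b_left)) / (2 * dx)
    return -grad_u
```

A positive return value means the weaker-field side (here the right) attracts the CME, matching the correlation between deflection and gradient strength reported in the abstract.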

  1. Quantitative proteomic analysis of amphotericin B resistance in Leishmania infantum

    PubMed Central

    Brotherton, Marie-Christine; Bourassa, Sylvie; Légaré, Danielle; Poirier, Guy G.; Droit, Arnaud; Ouellette, Marc

    2014-01-01

    Amphotericin B (AmB) in its liposomal form is now considered as either first- or second-line treatment against Leishmania infections in different parts of the world. Few cases of AmB resistance have been reported and resistance mechanisms toward AmB are still poorly understood. This paper reports a large-scale comparative proteomic study in the context of AmB resistance. Quantitative proteomics using stable isotope labeling of amino acids in cell culture (SILAC) was used to better characterize cytoplasmic and membrane-enriched (ME) proteomes of the in vitro generated Leishmania infantum AmB resistant mutant AmB1000.1. In total, 97 individual proteins were found as differentially expressed between the mutant and its parental sensitive strain (WT). More than half of these proteins were either metabolic enzymes or involved in transcription or translation processes. Key energetic pathways such as glycolysis and the TCA cycle were up-regulated in the mutant. Interestingly, many proteins involved in reactive oxygen species (ROS) scavenging and heat-shock proteins were also up-regulated in the resistant mutant. This work provides a basis for further investigations to understand the roles of proteins differentially expressed in relation to AmB resistance. PMID:25057462

  2. Space-to-Ground Communication for Columbus: A Quantitative Analysis.

    PubMed

    Uhlig, Thomas; Mannel, Thurid; Fortunato, Antonio; Illmer, Norbert

    2015-01-01

    The astronauts on board the International Space Station (ISS) are only the most visible part of a much larger team engaged around the clock in the performance of science and technical activities in space. The bulk of this team is scattered around the globe in five major Mission Control Centers (MCCs), as well as in a number of smaller payload operations centres. Communication between the crew in space and the flight controllers at those locations is an essential element and one of the key drivers to efficient space operations. Such communication can be carried out in different forms, depending on available technical assets and the selected operational approach for the activity at hand. This paper focuses on operational voice communication and provides a quantitative overview of the balance achieved in the Columbus program between collaborative space/ground operations and autonomous on-board activity execution. An interpretation of the current situation is provided, together with a description of potential future approaches for deep space exploration missions. PMID:26290898

  3. Quantitative Proteome Analysis of Leishmania donovani under Spermidine Starvation

    PubMed Central

    Singh, Shalini; Dubey, Vikash Kumar

    2016-01-01

    We have earlier reported antileishmanial activity of hypericin by spermidine starvation. In the current report, we have used a label-free proteome quantitation approach to identify differentially modulated proteins after hypericin treatment. A total of 141 proteins were found to be differentially regulated, with ANOVA P value less than 0.05, in hypericin-treated Leishmania promastigotes. Differentially modulated proteins have been broadly classified under nine major categories. An increase in ribosomal protein S7 suggests the repression of translation. Inhibition of proteins related to the ubiquitin proteasome system, RNA binding protein and translation initiation factor also suggests altered translation. We have also observed increased expression of Hsp 90, Hsp 83-1 and stress inducible protein 1, together with a significantly decreased level of cyclophilin. These stress-related proteins could be the cellular response of the parasite towards hypericin-induced cellular stress. Altered expression of proteins involved in metabolism, biosynthesis and replication of nucleic acids, flagellar movement and signalling also indicates defects in these pathways. The data were analyzed rigorously to gain further insight into hypericin-induced parasitic death. PMID:27123864

  4. Analysis of quantitative trait loci for behavioral laterality in mice.

    PubMed Central

    Roubertoux, Pierre L; Le Roy, Isabelle; Tordjman, Sylvie; Cherfou, Améziane; Migliore-Samour, Danièle

    2003-01-01

    Laterality is believed to have genetic components, as has been deduced from family studies in humans and responses to artificial selection in mice, but these genetic components are unknown and the underlying physiological mechanisms are still a subject of dispute. We measured direction of laterality (preferential use of left or right paws) and degree of laterality (absolute difference between the use of left and right paws) in C57BL/6ByJ (B) and NZB/BlNJ (N) mice and in their F(1) and F(2) intercrosses. Measurements were taken of both forepaws and hind paws. Quantitative trait loci (QTL) did not emerge for direction but did for degree of laterality. One QTL for forepaw (LOD score = 5.6) and a second QTL for hind paw (LOD score = 7.2) were both located on chromosome 4, and their peaks were within the same confidence interval. A QTL for plasma luteinizing hormone concentration was also found in the confidence interval of these two QTL. These results suggest that the physiological mechanisms underlying degree of laterality react to gonadal steroids. PMID:12663540
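LOD scores like the 5.6 and 7.2 quoted above compare the likelihood of a QTL at the test position against the no-QTL null. Under the common regression (Haley-Knott) approximation with normal errors, the score reduces to a ratio of residual sums of squares; the RSS values below are made up for illustration:

```python
import math

def lod_score(rss_null, rss_qtl, n):
    """Regression approximation to the LOD score for n individuals:
    LOD = (n / 2) * log10(RSS_null / RSS_QTL)."""
    return (n / 2) * math.log10(rss_null / rss_qtl)
```

For example, `lod_score(120.0, 80.0, 100)` gives about 8.8, comfortably above typical genome-wide significance thresholds.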

  6. Quantitative analysis of task selection for brain-computer interfaces

    NASA Astrophysics Data System (ADS)

    Llera, Alberto; Gómez, Vicenç; Kappen, Hilbert J.

    2014-10-01

    Objective. To assess quantitatively the impact of task selection in the performance of brain-computer interfaces (BCI). Approach. We consider the task-pairs derived from multi-class BCI imagery movement tasks in three different datasets. We analyze for the first time the benefits of task selection on a large-scale basis (109 users) and evaluate the possibility of transferring task-pair information across days for a given subject. Main results. Selecting the subject-dependent optimal task-pair among three different imagery movement tasks results in approximately 20% potential increase in the number of users that can be expected to control a binary BCI. The improvement is observed with respect to the best task-pair fixed across subjects. The best task-pair selected for each subject individually during a first day of recordings is generally a good task-pair in subsequent days. In general, task learning from the user side has a positive influence in the generalization of the optimal task-pair, but special attention should be given to inexperienced subjects. Significance. These results add significant evidence to existing literature that advocates task selection as a necessary step towards usable BCIs. This contribution motivates further research focused on deriving adaptive methods for task selection on larger sets of mental tasks in practical online scenarios.
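The benefit of subject-dependent task-pair selection can be sketched as an argmax over per-subject pairwise accuracies, compared against the single best pair fixed across all subjects. The 70% accuracy criterion and the toy data are illustrative placeholders, not the paper's datasets or threshold:

```python
def users_above_threshold(per_subject_acc, threshold=0.70):
    """per_subject_acc: one dict per subject mapping task-pair -> accuracy.
    Returns (users controlling a BCI with their personal best pair,
             users controlling it with the best pair fixed across subjects)."""
    subject_best = sum(max(d.values()) >= threshold for d in per_subject_acc)
    pairs = per_subject_acc[0].keys()
    fixed_best = max(
        sum(d[p] >= threshold for d in per_subject_acc) for p in pairs
    )
    return subject_best, fixed_best
```

In the toy data below, per-subject selection lets all three users pass the criterion, while any fixed pair leaves at least one user below it, the same kind of gap the abstract quantifies at roughly 20%.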

  7. Quantitative genetic analysis of salicylic acid perception in Arabidopsis.

    PubMed

    Dobón, Albor; Canet, Juan Vicente; Perales, Lorena; Tornero, Pablo

    2011-10-01

    Salicylic acid (SA) is a phytohormone required for full resistance against some pathogens in Arabidopsis, and NPR1 (Non-Expressor of Pathogenesis Related Genes 1) is the only gene described so far with a strong effect on SA-induced resistance. Additional components of SA perception may escape the traditional mutagenesis approach. An alternative is to search the natural variation of Arabidopsis. Of the different methods tried for analyzing variation between ecotypes, measuring the growth of a virulent isolate of Pseudomonas syringae after exogenous application of SA proved the most effective. Two ecotypes, Edi-0 and Stw-0, have been crossed, and their F2 has been studied. There are two significant quantitative trait loci (QTLs) in this population, and there is one QTL in each of the existing mapping populations Col-4 × Laer-0 and Laer-0 × No-0. The QTLs have different characteristics: one is only detectable at low concentrations of SA, while another acts after the point of crosstalk with methyl jasmonate signalling. Three of the QTLs have candidates described in SA perception, such as NPR1, its interactors, and a calmodulin binding protein.

  8. Quantitative analysis of pheromone-binding protein specificity

    PubMed Central

    Katti, S.; Lokhande, N.; González, D.; Cassill, A.; Renthal, R.

    2012-01-01

    Many pheromones have very low water solubility, posing experimental difficulties for quantitative binding measurements. A new method is presented for determining thermodynamically valid dissociation constants for ligands binding to odorant-binding proteins (OBPs), using β-cyclodextrin as a solubilizer and transfer agent. The method is applied to LUSH, a Drosophila OBP that binds the pheromone 11-cis vaccenyl acetate (cVA). Refolding of LUSH expressed in E. coli was assessed by measuring N-phenyl-1-naphthylamine (NPN) binding and Förster resonance energy transfer between LUSH tryptophan 123 (W123) and NPN. Binding of cVA was measured from quenching of W123 fluorescence as a function of cVA concentration. The equilibrium constant for transfer of cVA between β-cyclodextrin and LUSH was determined from a linked equilibria model. This constant, multiplied by the β-cyclodextrin-cVA dissociation constant, gives the LUSH-cVA dissociation constant: ~100 nM. It was also found that other ligands quench W123 fluorescence. The LUSH-ligand dissociation constants were determined to be ~200 nM for the silk moth pheromone bombykol and ~90 nM for methyl oleate. The results indicate that the ligand-binding cavity of LUSH can accommodate a variety of ligands with strong binding interactions. Implications of this for the pheromone receptor model proposed by Laughlin et al. (Cell 133: 1255-65, 2008) are discussed. PMID:23121132
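The linked-equilibria arithmetic stated in the abstract is a simple product: the cVA transfer constant between β-cyclodextrin and LUSH multiplied by the β-cyclodextrin-cVA dissociation constant yields the LUSH-cVA dissociation constant. The numeric inputs below are hypothetical, chosen only to land near the reported ~100 nM:

```python
def protein_ligand_kd(k_transfer, kd_cyclodextrin):
    """Kd(protein-ligand) = K_transfer * Kd(cyclodextrin-ligand),
    following the linked-equilibria relation described in the abstract."""
    return k_transfer * kd_cyclodextrin

# Hypothetical inputs: transfer constant 0.5, betaCD-cVA Kd of 200 nM.
kd_lush_cva = protein_ligand_kd(0.5, 200e-9)  # -> 100 nM
```

The advantage of the scheme is that neither equilibrium requires the poorly soluble pheromone to be free in bulk water at high concentration.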

  9. Quantitative analysis of virus and plasmid trafficking in cells

    NASA Astrophysics Data System (ADS)

    Lagache, Thibault; Dauty, Emmanuel; Holcman, David

    2009-01-01

    Intracellular transport of DNA carriers is a fundamental step of gene delivery. By combining theoretical and numerical approaches, we study here the trafficking of single and multiple viruses and DNA particles through the cell cytoplasm to a small nuclear pore. We present a physical model to account for certain aspects of cellular organization, starting with the observation that a viral trajectory consists of epochs of pure diffusion and epochs of active transport along microtubules. We define a general degradation rate to describe the limitations of the delivery of plasmid or viral particles to a nuclear pore imposed by various types of direct and indirect hydrolysis activity inside the cytoplasm. By replacing the switching dynamics by a single steady state stochastic description, we obtain estimates for the probability and the mean time for the first of many particles to go from the cell membrane to a small nuclear pore. Computational simulations confirm that our model can be used to analyze and interpret viral trajectories and to estimate quantitatively the success of nuclear delivery.

  11. Depression in Parkinson's disease: a quantitative and qualitative analysis.

    PubMed Central

    Gotham, A M; Brown, R G; Marsden, C D

    1986-01-01

    Depression is a common feature of Parkinson's disease, a fact of both clinical and theoretical significance. Assessment of depression in Parkinson's disease is complicated by overlapping symptomatology in the two conditions, making global assessments based on observer or self-ratings of doubtful validity. The present study aimed to provide both a quantitative and qualitative description of the nature of the depressive changes found in Parkinson's disease as compared with normal elderly subjects and arthritis patients. As with previous studies, the patients with Parkinson's disease scored significantly higher than normal controls on various self-ratings of depression and anxiety but, in this study, did not differ from those with arthritis. Qualitatively, both the Parkinson's disease and the arthritis groups had depression characterised by pessimism and hopelessness, decreased motivation and drive, and increased concern with health. In contrast, the negative affective feelings of guilt, self-blame and worthlessness were absent in both patient groups. This pattern of depression was significantly associated with severity of illness and functional disability. However, these factors account for only a modest proportion of the variability in test scores. Probable unexplored factors are individual differences in coping style and availability of support. PMID:3701347

  12. Quantitative proteomic analysis of the Salmonella-lettuce interaction.

    PubMed

    Zhang, Yuping; Nandakumar, Renu; Bartelt-Hunt, Shannon L; Snow, Daniel D; Hodges, Laurie; Li, Xu

    2014-11-01

    Human pathogens can internalize into food crops through root and surface uptake and persist inside crop plants. The goal of the study was to elucidate the global modulation of bacterial and plant protein expression after Salmonella is internalized into lettuce. A quantitative proteomic approach was used to analyse the protein expression of Salmonella enterica serovar Infantis and lettuce cultivar Green Salad Bowl 24 h after infiltrating S. Infantis into lettuce leaves. Among the 50 differentially expressed proteins identified by comparing internalized S. Infantis against S. Infantis grown in Luria Broth, proteins involved in glycolysis were down-regulated, while one protein involved in ascorbate uptake was up-regulated. Stress response proteins, especially antioxidant proteins, were up-regulated. The modulation in protein expression suggested that internalized S. Infantis might utilize ascorbate as a carbon source and require multiple stress response proteins to cope with stresses encountered in plants. On the other hand, among the 20 differentially expressed lettuce proteins, proteins involved in defense response to bacteria were up-regulated. Moreover, the secreted effector PipB2 of S. Infantis and R proteins of lettuce were induced after bacterial internalization into lettuce leaves, indicating that the human pathogen S. Infantis triggered the defense mechanisms of lettuce, which normally respond to plant pathogens.

  13. Quantitative assessment of human motion using video motion analysis

    NASA Technical Reports Server (NTRS)

    Probe, John D.

    1990-01-01

    In the study of the dynamics and kinematics of the human body, a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video-based motion analysis systems to emerge as a cost-effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video-based Ariel Performance Analysis System to develop data on shirt-sleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. The system is described.

  14. An Introduction to Error Analysis for Quantitative Chemistry

    ERIC Educational Resources Information Center

    Neman, R. L.

    1972-01-01

    Describes two formulas for calculating errors due to instrument limitations which are usually found in gravimetric volumetric analysis and indicates their possible applications to other fields of science. (CC)

  15. Modeling air pollution in the Tracking and Analysis Framework (TAF)

    SciTech Connect

    Shannon, J.D.

    1998-12-31

    The Tracking and Analysis Framework (TAF) is a set of interactive computer models for integrated assessment of the Acid Rain Provisions (Title IV) of the 1990 Clean Air Act Amendments. TAF is designed to execute in minutes on a personal computer, thereby making it feasible for a researcher or policy analyst to examine quickly the effects of alternate modeling assumptions or policy scenarios. Because the development of TAF involves researchers in many different disciplines, TAF has been given a modular structure. In most cases, the modules contain reduced-form models that are based on more complete models exercised off-line. The structure of TAF as of December 1996 is shown. The Atmospheric Pathways Module produces estimates for regional air pollution variables.

  16. Theoretical analysis of the coherence-brightened laser in air

    NASA Astrophysics Data System (ADS)

    Yuan, Luqi; Hokr, Brett H.; Traverso, Andrew J.; Voronine, Dmitri V.; Rostovtsev, Yuri; Sokolov, Alexei V.; Scully, Marlan O.

    2013-02-01

    We present a detailed theoretical study of a recent experiment [A. J. Traverso et al., Proc. Natl. Acad. Sci. USA 109, 15185 (2012)] in which a laserlike source is created in air by pumping with a nanosecond pulse. The source generates radiation in the forward and backward directions. The temporal behavior of the emitted pulses is investigated for different pump shapes and durations. Our analysis indicates that the spiky emission is due to quantum coherence via cooperation between atoms of an ensemble, which leads to strongly oscillatory superfluorescence. We show that these cooperative nonadiabatic coherence effects cannot be described by rate equations and that instead a full set of Maxwell-Bloch equations must be used. We consider a range of parameters and study transitions between various regimes. Understanding these coherence-brightened processes in air should lead to improvements in environmental and atmospheric remote sensing and other applications.

  17. Managing the analysis of air quality impacts under NEPA

    SciTech Connect

    Weber, Y.B.; Leslie, A.C.D.

    1995-12-31

    The National Environmental Policy Act of 1969 (NEPA) mandates the analysis and evaluation of potential impacts of major Federal actions having the potential to affect the environment. The Clean Air Act Amendments of 1990 identify an array of new air quality issues appropriate for analysis in compliance with NEPA. An example is emissions of the 189 hazardous air pollutants identified in Title III. The utility industry estimates that more than 2.4 billion pounds of toxic pollutants were emitted to the atmosphere in 1988, with the potential for resultant adverse health impacts such as cancer, reproductive effects, birth defects, and respiratory illness. The US Department of Energy (DOE) provides Federal funds for projects that utilize coal as the primary fuel, including the approximately 45 projects funded over the past ten years under the Clean Coal Technology Demonstration Program. Provision of Federal funds brings these projects under NEPA review. While electric steam generating units greater than 25 MW are currently excluded from regulatory review for the 189 air toxics listed in Title III, they are not, due to their potential impacts, excluded from NEPA review when Federally funded, in whole or in part. The authors will discuss their experiences drawn from NEPA evaluations of coal-fired power projects, the differences between regulatory requirements and NEPA requirements, source categories, major and area sources, conformity, maximum achievable control technology, mandatory licensing, radionuclides, visibility, toxics found to be emitted from coal combustion, public involvement, citizen suits, the bounty system, and how NEPA review can result in beneficial changes to proposed projects through mitigation measures to avoid or minimize potentially adverse environmental impacts.

  18. Quantitative assessment of human motion using video motion analysis

    NASA Technical Reports Server (NTRS)

    Probe, John D.

    1993-01-01

    In the study of the dynamics and kinematics of the human body a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video based motion analysis systems to emerge as a cost effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video based Ariel Performance Analysis System (APAS) to develop data on shirtsleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. APAS is a fully integrated system of hardware and software for biomechanics and the analysis of human performance and generalized motion measurement. Major components of the complete system include the video system, the AT compatible computer, and the proprietary software.

  19. Software applications toward quantitative metabolic flux analysis and modeling.

    PubMed

    Dandekar, Thomas; Fieselmann, Astrid; Majeed, Saman; Ahmed, Zeeshan

    2014-01-01

    Metabolites and their pathways are central for adaptation and survival. Metabolic modeling elucidates in silico all the possible flux pathways (flux balance analysis, FBA) and predicts the actual fluxes under a given situation; further refinement of these models is possible by including experimental isotopologue data. In this review, we initially introduce the key theoretical concepts and different analysis steps in the modeling process before comparing flux calculation and metabolite analysis programs such as C13, BioOpt, COBRA toolbox, Metatool, efmtool, FiatFlux, ReMatch, VANTED, iMAT and YANA. Their respective strengths and limitations are discussed and compared to alternative software. While data analysis of metabolites, calculation of metabolic fluxes, pathways and their condition-specific changes are all possible, we highlight the considerations that need to be taken into account before deciding on a specific software. Current challenges in the field include the computation of large-scale networks (in elementary mode analysis), regulatory interactions and detailed kinetics, and these are discussed in the light of powerful new approaches.
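The FBA step named above rests on one core constraint: at steady state the stoichiometric matrix times the flux vector is zero, so no internal metabolite accumulates. A minimal sketch on a toy linear pathway (not the output of any of the listed tools):

```python
def is_steady_state(S, v, tol=1e-9):
    """Check the core FBA constraint S . v = 0: every internal metabolite's
    net production rate must vanish at steady state."""
    return all(
        abs(sum(s_ij * v_j for s_ij, v_j in zip(row, v))) <= tol
        for row in S
    )

# Toy network: uptake -> A, A -> B, B -> secretion.
# Rows = internal metabolites (A, B); columns = the three reactions.
S = [
    [1, -1, 0],   # A: made by uptake, consumed by A -> B
    [0, 1, -1],   # B: made by A -> B, consumed by secretion
]
```

Here `is_steady_state(S, [2, 2, 2])` holds, while `[2, 1, 1]` fails because metabolite A would accumulate; FBA tools such as the COBRA toolbox search the flux vectors satisfying this constraint for one optimizing an objective.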

  20. Quantitative analysis of nailfold capillary morphology in patients with fibromyalgia

    PubMed Central

    Choi, Dug-Hyun

    2015-01-01

    Background/Aims Nailfold capillaroscopy (NFC) has been used to examine morphological and functional microcirculation changes in connective tissue diseases. It has been demonstrated that NFC patterns reflect abnormal microvascular dynamics, which may play a role in fibromyalgia (FM) syndrome. The aim of this study was to determine NFC patterns in FM, and their association with clinical features of FM. Methods A total of 67 patients with FM, and 30 age- and sex-matched healthy controls, were included. Nailfold capillary patterns were quantitatively analyzed using computerized NFC. The parameters of interest were as follows: number of capillaries within the central 3 mm, deletion score, apical limb width, capillary width, and capillary dimension. Capillary dimension was determined by calculating the number of capillaries using the Adobe Photoshop version 7.0. Results FM patients had a lower number of capillaries and higher deletion scores on NFC compared to healthy controls (17.3 ± 1.7 vs. 21.8 ± 2.9, p < 0.05; 2.2 ± 0.9 vs. 0.7 ± 0.6, p < 0.05, respectively). Both apical limb width (µm) and capillary width (µm) were significantly decreased in FM patients (1.1 ± 0.2 vs. 3.7 ± 0.6; 5.4 ± 0.5 vs. 7.5 ± 1.4, respectively), indicating that FM patients have abnormally decreased digital capillary diameter and density. Interestingly, there was no difference in capillary dimension between the two groups, suggesting that the length or tortuosity of capillaries in FM patients is increased to compensate for diminished microcirculation. Conclusions FM patients had altered capillary density and diameter in the digits. Diminished microcirculation on NFC may alter capillary density and increase tortuosity. PMID:26161020

  1. Quantitative analysis of wrist electrodermal activity during sleep

    PubMed Central

    Sano, Akane; Picard, Rosalind W.; Stickgold, Robert

    2015-01-01

    We present the first quantitative characterization of electrodermal activity (EDA) patterns on the wrists of healthy adults during sleep using dry electrodes. We compare the new results on the wrist to prior findings on palmar or finger EDA by characterizing data measured from 80 nights of sleep consisting of 9 nights of wrist and palm EDA from 9 healthy adults sleeping at home, 56 nights of wrist and palm EDA from one healthy adult sleeping at home, and 15 nights of wrist EDA from 15 healthy adults in a sleep laboratory, with the latter compared to concurrent polysomnography. While high frequency patterns of EDA called “storms” were identified by eye in the 1960s, we systematically compare thresholds for automatically detecting EDA peaks and establish criteria for EDA storms. We found that more than 80% of EDA peaks occurred in non-REM sleep, specifically during slow-wave sleep (SWS) and non-REM stage 2 sleep (NREM2). Also, EDA amplitude is higher in SWS than in other sleep stages. Longer EDA storms were more likely in the first two quarters of sleep and during SWS and NREM2. We also found from the home studies (65 nights) that EDA levels were higher and the skin conductance peaks were larger and more frequent when measured on the wrist than when measured on the palm. These high-frequency, high-amplitude EDA peaks were sometimes associated with higher skin temperature, but more work examining neurological and other EDA elicitors is needed to elucidate their complete behavior. PMID:25286449
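The peak and storm detection discussed above can be sketched with a simple rise-threshold peak counter and a sliding-window storm criterion. The threshold, window length, and minimum peak count are illustrative placeholders, not the criteria the authors settled on:

```python
def count_peaks(eda, rise_threshold=0.05):
    """Count skin-conductance peaks as samples where the signal rises by
    more than rise_threshold (microsiemens) over one sampling interval."""
    return sum(1 for a, b in zip(eda, eda[1:]) if b - a > rise_threshold)

def is_storm(peak_times, min_peaks=5, window_s=60.0):
    """Flag an EDA 'storm': at least min_peaks peaks falling inside some
    window of window_s seconds (peak_times sorted, in seconds)."""
    for i in range(len(peak_times) - min_peaks + 1):
        if peak_times[i + min_peaks - 1] - peak_times[i] <= window_s:
            return True
    return False
```

Sweeping `rise_threshold` and the storm parameters over a labeled data set is one way to reproduce the kind of threshold comparison the abstract describes.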

  2. Balancing Yin and Yang: Teaching and Learning Qualitative Data Analysis Within an Undergraduate Quantitative Data Analysis Course.

    ERIC Educational Resources Information Center

    Clark, Roger; Lang, Angela

    2002-01-01

    Describes an undergraduate sociology course that taught qualitative and quantitative data analysis. Focuses on two students and how they dealt with and overcame anxiety issues, subsequently achieving higher levels of learning and new learning strategies. (KDR)

  3. Quantitative Immunofluorescence Analysis of Nucleolus-Associated Chromatin.

    PubMed

    Dillinger, Stefan; Németh, Attila

    2016-01-01

    The nuclear distribution of eu- and heterochromatin is nonrandom, heterogeneous, and dynamic, which is mirrored by specific spatiotemporal arrangements of histone posttranslational modifications (PTMs). Here we describe a semiautomated method for the analysis of histone PTM localization patterns within the mammalian nucleus, using confocal laser scanning microscope images of fixed, immunofluorescence-stained cells as the data source. The ImageJ-based process includes the segmentation of the nucleus, together with measurements of total fluorescence intensities, the heterogeneity of the staining, and the frequency of the brightest pixels in the region of interest (ROI). In the presented image analysis pipeline, the perinucleolar chromatin is selected as the primary ROI, and the nuclear periphery as the secondary ROI.

  4. Concentration Analysis: A Quantitative Assessment of Student States.

    ERIC Educational Resources Information Center

    Bao, Lei; Redish, Edward F.

    Multiple-choice tests such as the Force Concept Inventory (FCI) provide useful instruments to probe the distribution of student difficulties on a large scale. However, traditional analysis often relies solely on scores (number of students giving the correct answer). This ignores what can be significant and important information: the distribution…

  5. Reflectance spectroscopy: quantitative analysis techniques for remote sensing applications.

    USGS Publications Warehouse

    Clark, R.N.; Roush, T.L.

    1984-01-01

    Several methods for the analysis of remotely sensed reflectance data are compared, including empirical methods and scattering theories, both of which are important for solving remote sensing problems. The concept of the photon mean path length and the implications for use in modeling reflectance spectra are presented.-from Authors

  6. Concentration Analysis: A Quantitative Assessment of Student States.

    ERIC Educational Resources Information Center

    Bao, Lei; Redish, Edward F.

    2001-01-01

    Explains that multiple-choice tests such as the Force Concept Inventory (FCI) provide useful instruments to probe the distribution of student difficulties on a large scale. Introduces a new method, concentration analysis, to measure how students' responses on multiple-choice questions are distributed. (Contains 18 references.) (Author/YDS)

  7. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…

  8. EXPLoRA-web: linkage analysis of quantitative trait loci using bulk segregant analysis.

    PubMed

    Pulido-Tamayo, Sergio; Duitama, Jorge; Marchal, Kathleen

    2016-07-01

    Identification of genomic regions associated with a phenotype of interest is a fundamental step toward solving questions in biology and improving industrial research. Bulk segregant analysis (BSA) combined with high-throughput sequencing is a technique to efficiently identify these genomic regions associated with a trait of interest. However, distinguishing true from spuriously linked genomic regions and accurately delineating the genomic positions of these truly linked regions requires the use of complex statistical models currently implemented in software tools that are generally difficult to operate for non-expert users. To facilitate the exploration and analysis of data generated by bulk segregant analysis, we present EXPLoRA-web, a web service wrapped around our previously published algorithm EXPLoRA, which exploits linkage disequilibrium to increase the power and accuracy of quantitative trait loci identification in BSA analysis. EXPLoRA-web provides a user-friendly interface that enables easy data upload and parallel processing of different parameter configurations. Results are provided graphically and as BED and/or text files, and the input is expected in widely used formats, enabling straightforward BSA data analysis. The web server is available at http://bioinformatics.intec.ugent.be/explora-web/.

  11. Relative Humidity and its Effect on Sampling and Analysis of Agricultural Odorants in Air

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Source and ambient air sampling techniques used in agricultural air quality studies are seldom validated for the variability in the air matrix (temperature, dust levels, and relative humidity). In particular, relative humidity (RH) affects both field sampling and analysis of air samples. The objec...

  12. Biomolecular computation with molecular beacons for quantitative analysis of target nucleic acids.

    PubMed

    Lim, Hee-Woong; Lee, Seung Hwan; Yang, Kyung-Ae; Yoo, Suk-In; Park, Tai Hyun; Zhang, Byoung-Tak

    2013-01-01

    Molecular beacons are efficient and useful tools for quantitative detection of specific target nucleic acids. Thanks to their simple protocol, molecular beacons have great potential as substrates for biomolecular computing. Here we present a molecular beacon-based biomolecular computing method for quantitative detection and analysis of target nucleic acids. Whereas the conventional quantitative assays using fluorescent dyes have been designed for single target detection or multiplexed detection, the proposed method enables us not only to detect multiple targets but also to compute their quantitative information by weighted-sum of the targets. The detection and computation are performed on a molecular level simultaneously, and the outputs are detected as fluorescence signals. Experimental results show the feasibility and effectiveness of our weighted detection and linear combination method using molecular beacons. Our method can serve as a primitive operation of molecular pattern analysis, and we demonstrate successful binary classifications of molecular patterns made of synthetic oligonucleotide DNA molecules.
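
    The weighted-sum readout described above amounts to a linear combination of target concentrations followed by a threshold, i.e. a linear classifier evaluated at the molecular level. A toy numerical version, with weights and threshold invented for illustration:

```python
def weighted_sum(concentrations, weights):
    """Fluorescence-like readout: weighted sum of target amounts."""
    return sum(w * c for w, c in zip(weights, concentrations))

def classify(concentrations, weights, threshold):
    """Binary pattern classification by thresholding the weighted sum."""
    return 1 if weighted_sum(concentrations, weights) >= threshold else 0

# Hypothetical two-target pattern; weights and threshold are invented
label = classify([1.0, 0.2], weights=[2.0, -1.0], threshold=1.0)
```

    In the paper the sum is formed chemically by the beacons themselves; here it is only simulated arithmetically.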

  13. Space-Time Analysis of the Air Quality Model Evaluation International Initiative (AQMEII) Phase 1 Air Quality Simulations

    EPA Science Inventory

    This study presents an evaluation of summertime daily maximum ozone concentrations over North America (NA) and Europe (EU) using the database generated during Phase 1 of the Air Quality Model Evaluation International Initiative (AQMEII). The analysis focuses on identifying tempor...

  14. Quantitative analysis of cell-free DNA in ovarian cancer

    PubMed Central

    Shao, Xuefeng; He, Yan; Ji, Min; Chen, Xiaofang; Qi, Jing; Shi, Wei; Hao, Tianbo; Ju, Shaoqing

    2015-01-01

    The aim of the present study was to investigate the association between cell-free DNA (cf-DNA) levels and clinicopathological characteristics of patients with ovarian cancer using a branched DNA (bDNA) technique, and to determine the value of quantitative cf-DNA detection in assisting with the diagnosis of ovarian cancer. Serum specimens were collected from 36 patients with ovarian cancer on days 1, 3 and 7 following surgery, and additional serum samples were also collected from 22 benign ovarian tumor cases, and 19 healthy, non-cancerous ovaries. bDNA techniques were used to detect serum cf-DNA concentrations. All data were analyzed using SPSS version 18.0. The cf-DNA levels were significantly increased in the ovarian cancer group compared with those of the benign ovarian tumor group and healthy ovarian group (P<0.01). Furthermore, cf-DNA levels were significantly increased in stage III and IV ovarian cancer compared with those of stages I and II (P<0.01). In addition, cf-DNA levels were significantly increased on the first day post-surgery (P<0.01), and subsequently demonstrated a gradual decrease. In the ovarian cancer group, the area under the receiver operating characteristic curve of cf-DNA and the sensitivity were 0.917 and 88.9%, respectively, which was higher than those of cancer antigen 125 (0.724, 75%) and human epididymis protein 4 (0.743, 80.6%). There was a correlation between the levels of serum cf-DNA and the occurrence and development of ovarian cancer in the patients evaluated. bDNA techniques possessed higher sensitivity and specificity than other methods for the detection of serum cf-DNA in patients exhibiting ovarian cancer, and bDNA techniques are more useful for detecting cf-DNA than other factors. Thus, the present study demonstrated the potential value for the use of bDNA as an adjuvant diagnostic method for ovarian cancer. PMID:26788153
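
    The reported AUC and sensitivity figures come from standard receiver operating characteristic analysis. A minimal, library-free sketch of both computations, using toy scores rather than the study's data:

```python
def roc_auc(pos_scores, neg_scores):
    """AUC via the Mann-Whitney formulation: the probability that a
    random positive case scores above a random negative case
    (ties count as 0.5)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            wins += 1.0 if p > n else 0.5 if p == n else 0.0
    return wins / (len(pos_scores) * len(neg_scores))

def sensitivity(pos_scores, cutoff):
    """Fraction of positive cases at or above the decision cutoff."""
    return sum(1 for p in pos_scores if p >= cutoff) / len(pos_scores)

# Invented cf-DNA-like scores: three cancer cases, three controls
auc = roc_auc([0.9, 0.8, 0.6], [0.7, 0.2, 0.1])
```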

  15. Quantitative analysis of night skyglow amplification under cloudy conditions

    NASA Astrophysics Data System (ADS)

    Kocifaj, Miroslav; Solano Lamphar, Héctor Antonio

    2014-10-01

    The radiance produced by artificial light is a major source of nighttime over-illumination. It can, however, be treated experimentally using ground-based and satellite data. These two types of data complement each other and together have a high information content. For instance, the satellite data enable upward light emissions to be normalized, and this in turn allows skyglow levels at the ground to be modelled under cloudy or overcast conditions. Excessive night lighting imposes an unacceptable burden on nature, humans and professional astronomy. For this reason, there is a pressing need to determine the total amount of downwelling diffuse radiation. Undoubtedly, cloudy periods can cause a significant increase in skyglow as a result of amplification owing to diffuse reflection from clouds. While it is recognized that the amplification factor (AF) varies with cloud cover, the effects of different types of clouds, of atmospheric turbidity and of the geometrical relationships between the positions of an individual observer, the cloud layer, and the light source are in general poorly known. In this paper the AF is quantitatively analysed considering different aerosol optical depths (AODs), urban layout sizes and cloud types with specific albedos and altitudes. The computational results show that the AF peaks near the edges of a city rather than at its centre. In addition, the AF appears to be a decreasing function of AOD, which is particularly important when modelling the skyglow in regions with apparent temporal or seasonal variability of atmospheric turbidity. The findings in this paper will be useful to those designing engineering applications or modelling light pollution, as well as to astronomers and environmental scientists who aim to predict the amplification of skyglow caused by clouds. 
    In addition, the semi-analytical formulae can be used to estimate AF levels, especially in densely populated metropolitan regions for which detailed computations may be CPU-intensive.

  16. On the quantitative analysis and evaluation of magnetic hysteresis data

    NASA Astrophysics Data System (ADS)

    Jackson, Mike; Solheid, Peter

    2010-04-01

    Magnetic hysteresis data are centrally important in pure and applied rock magnetism, but to date, no objective quantitative methods have been developed for assessment of data quality and of the uncertainty in parameters calculated from imperfect data. We propose several initial steps toward such assessment, using loop symmetry as an important key. With a few notable exceptions (e.g., related to field cooling and exchange bias), magnetic hysteresis loops possess a high degree of inversion symmetry (M(H) = -M(-H)). This property enables us to treat the upper and lower half-loops as replicate measurements for quantification of random noise, drift, and offsets. This, in turn, makes it possible to evaluate the statistical significance of nonlinearity, either in the high-field region (due to nonsaturation of the ferromagnetic moment) or over the complete range of applied fields (due to nonnegligible contribution of ferromagnetic phases to the total magnetic signal). It also allows us to quantify the significance of fitting errors for model loops constructed from analytical basis functions. When a statistically significant high-field nonlinearity is found, magnetic parameters must be calculated by approach-to-saturation fitting, e.g., by a model of the form M(H) = Ms + χHF·H + α·H^β. This nonlinear high-field inverse modeling problem is strongly ill conditioned, resulting in large and strongly covariant uncertainties in the fitted parameters, which we characterize through bootstrap analyses. For a variety of materials, including ferrihydrite and mid-ocean ridge basalts, measured in applied fields up to about 1.5 T, we find that the calculated value of the exponent β is extremely sensitive to small differences in the data or in the method of processing and that the overall uncertainty exceeds the range of physically reasonable values. The "unknowability" of β is accompanied by relatively large uncertainties in the other parameters, which can be characterized, if not
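
    The approach-to-saturation model quoted in the abstract, M(H) = Ms + χHF·H + α·H^β, is linear in (Ms, χHF, α) once β is fixed, so one common fitting strategy is a linear least-squares solve on a grid of β values. The sketch below (plain Python, synthetic data) illustrates that strategy; it is not the authors' implementation:

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def fit_fixed_beta(H, M, beta):
    """Least-squares fit of M(H) = Ms + chi*H + alpha*H**beta, beta fixed,
    via the normal equations."""
    rows = [[1.0, h, h ** beta] for h in H]
    ATA = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    ATy = [sum(r[i] * m for r, m in zip(rows, M)) for i in range(3)]
    Ms, chi, alpha = solve3(ATA, ATy)
    sse = sum((Ms + chi * h + alpha * h ** beta - m) ** 2 for h, m in zip(H, M))
    return (Ms, chi, alpha), sse

def fit_approach_to_saturation(H, M, betas):
    """Scan a beta grid, keeping the fit with the smallest squared error."""
    return min(((*fit_fixed_beta(H, M, b), b) for b in betas), key=lambda t: t[1])

# Synthetic high-field data with known parameters (illustrative only)
H = [1.0 + 0.05 * i for i in range(20)]
M = [1.0 + 0.01 * h - 0.1 * h ** -1.5 for h in H]
betas = [-2.5 + 0.1 * i for i in range(21)]
params, sse, beta = fit_approach_to_saturation(H, M, betas)
```

    On noisy real loops, as the abstract warns, the recovered β is highly sensitive to the data; resampling the half-loops (bootstrap) around this fit is one way to expose that uncertainty.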

  17. Probabilistic reliability analysis, quantitative safety goals, and nuclear licensing in the United Kingdom.

    PubMed

    Cannell, W

    1987-09-01

    Although unpublicized, the use of quantitative safety goals and probabilistic reliability analysis for licensing nuclear reactors has become a reality in the United Kingdom. This conclusion results from an examination of the process leading to the licensing of the Sizewell B PWR in England. The licensing process for this reactor has substantial implications for nuclear safety standards in Britain, and is examined in the context of the growing trend towards quantitative safety goals in the United States. PMID:3685540

  18. Concentration analysis: A quantitative assessment of student states

    NASA Astrophysics Data System (ADS)

    Bao, Lei; Redish, Edward F.

    2001-07-01

    Multiple-choice tests such as the Force Concept Inventory (FCI) provide useful instruments to probe the distribution of student difficulties on a large scale. However, traditional analysis often relies solely on scores (number of students giving the correct answer). This ignores what can be significant and important information: the distribution of wrong answers given by the class. In this paper we introduce a new method, concentration analysis, to measure how students' responses on multiple-choice questions are distributed. This information can be used to study if the students have common incorrect models or if the question is effective in detecting student models. When combined with information obtained from qualitative research, the method allows us to identify cleanly what FCI results are telling us about student knowledge.
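
    For a single m-choice item answered by N students with response counts n_i, Bao and Redish's concentration factor is C = [√m/(√m − 1)]·(√(Σ n_i²)/N − 1/√m), which is 0 when answers are spread evenly and 1 when every student picks the same option. A direct transcription (the formula is as published; verify against the paper before reuse):

```python
import math

def concentration(counts):
    """Bao-Redish concentration factor for one multiple-choice item.

    counts[i] = number of students choosing option i. Returns 0.0 for a
    uniform spread of answers and 1.0 when all students agree.
    """
    m = len(counts)
    N = sum(counts)
    root_m = math.sqrt(m)
    return (root_m / (root_m - 1)) * (
        math.sqrt(sum(n * n for n in counts)) / N - 1 / root_m)

uniform = concentration([6, 6, 6, 6, 6])   # evenly spread answers
peaked = concentration([30, 0, 0, 0, 0])   # everyone picks option A
```

    A high C on a wrong answer flags a shared incorrect model, which is exactly the signal plain scores discard.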

  20. Integrated quantitative fractal polarimetric analysis of monolayer lung cancer cells

    NASA Astrophysics Data System (ADS)

    Shrestha, Suman; Zhang, Lin; Quang, Tri; Farrahi, Tannaz; Narayan, Chaya; Deshpande, Aditi; Na, Ying; Blinzler, Adam; Ma, Junyu; Liu, Bo; Giakos, George C.

    2014-05-01

    Digital diagnostic pathology has become one of the most valuable and convenient technological advancements of recent years. It allows us to acquire, store and analyze pathological information from images of histological and immunohistochemical glass slides, which are scanned to create digital slides. In this study, efficient fractal, wavelet-based polarimetric techniques for histological analysis of monolayer lung cancer cells are introduced and different monolayer cancer lines are studied. The outcome of this study indicates that applying fractal, wavelet polarimetric principles to the analysis of squamous carcinoma and adenocarcinoma cell lines may prove extremely useful in discriminating between healthy and lung cancer cells as well as differentiating among different lung cancer cells.

  1. Watershed Planning within a Quantitative Scenario Analysis Framework.

    PubMed

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-01-01

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation.
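
    Once the landscape-based cumulative effects model is fitted, scenario analysis reduces to evaluating the regression under alternative land-use inputs. The coefficients and stressor values below are invented for illustration, not taken from the study:

```python
def predict_condition(intercept, coefs, stressors):
    """Evaluate a fitted multiple linear regression model:
    predicted condition = intercept + sum(coef_i * stressor_i)."""
    return intercept + sum(c * x for c, x in zip(coefs, stressors))

# Hypothetical model: biological condition score vs. % mined area and
# specific conductance (all numbers invented for illustration)
intercept, coefs = 80.0, [-0.6, -0.2]
current = predict_condition(intercept, coefs, [10.0, 50.0])  # baseline scenario
future = predict_condition(intercept, coefs, [30.0, 80.0])   # build-out scenario
```

    Comparing predictions across scenarios is what lets managers weigh permitting and mitigation options before development occurs.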

  3. Quantitative Analysis of PMLA Nanoconjugate Components after Backbone Cleavage

    PubMed Central

    Ding, Hui; Patil, Rameshwar; Portilla-Arias, Jose; Black, Keith L.; Ljubimova, Julia Y.; Holler, Eggehard

    2015-01-01

    Multifunctional polymer nanoconjugates containing multiple components show great promise in cancer therapy, but in most cases complete analysis of each component is difficult. Polymalic acid (PMLA) based nanoconjugates have demonstrated successful brain and breast cancer treatment. They consist of multiple components including targeting antibodies, Morpholino antisense oligonucleotides (AONs), and endosome escape moieties. The component analysis of PMLA nanoconjugates is extremely difficult using conventional spectrometry and HPLC method. Taking advantage of the nature of polyester of PMLA, which can be cleaved by ammonium hydroxide, we describe a method to analyze the content of antibody and AON within nanoconjugates simultaneously using SEC-HPLC by selectively cleaving the PMLA backbone. The selected cleavage conditions only degrade PMLA without affecting the integrity and biological activity of the antibody. Although the amount of antibody could also be determined using the bicinchoninic acid (BCA) method, our selective cleavage method gives more reliable results and is more powerful. Our approach provides a new direction for the component analysis of polymer nanoconjugates and nanoparticles. PMID:25894227

  4. Quantitative proteomics analysis of adsorbed plasma proteins classifies nanoparticles with different surface properties and size

    SciTech Connect

    Zhang, Haizhen; Burnum, Kristin E.; Luna, Maria L.; Petritis, Brianne O.; Kim, Jong Seo; Qian, Weijun; Moore, Ronald J.; Heredia-Langner, Alejandro; Webb-Robertson, Bobbie-Jo M.; Thrall, Brian D.; Camp, David G.; Smith, Richard D.; Pounds, Joel G.; Liu, Tao

    2011-12-01

    In biofluids (e.g., blood plasma), nanoparticles are readily embedded in layers of proteins that can affect their biological activity and biocompatibility. Herein, we report a study on the interactions between human plasma proteins and nanoparticles with a controlled systematic variation of properties, using stable isotope labeling and liquid chromatography-mass spectrometry (LC-MS) based quantitative proteomics. A novel protocol was developed to simplify the isolation of nanoparticle-bound proteins and improve reproducibility. Plasma proteins associated with polystyrene nanoparticles of three different surface chemistries and two sizes, as well as four different exposure times (for a total of 24 different samples), were identified and quantified by LC-MS analysis. Quantitative comparison of relative protein abundances was achieved by spiking an 18O-labeled 'universal reference' into each individually processed unlabeled sample as an internal standard, enabling simultaneous application of both label-free and isotopic labeling quantitation across the sample set. Clustering analysis of the quantitative proteomics data resulted in a distinctive pattern that classifies the nanoparticles based on their surface properties and size. In addition, data from the temporal study indicated that the stable protein 'corona' isolated for the quantitative analysis appeared to form in less than 5 minutes. The comprehensive results obtained herein using quantitative proteomics have potential implications for predicting nanoparticle biocompatibility.
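
    Spiking the same 18O-labeled reference into every sample lets relative abundances be compared across samples as a ratio of light/heavy ratios, since the common reference cancels run-to-run variation. A minimal numeric sketch with invented intensities:

```python
def reference_ratio(light_intensity, heavy_intensity):
    """Abundance of an unlabeled (light) peptide relative to the spiked
    18O-labeled (heavy) universal reference measured in the same run."""
    return light_intensity / heavy_intensity

def fold_change(light_a, heavy_a, light_b, heavy_b):
    """Between-sample fold change via ratio-of-ratios."""
    return reference_ratio(light_a, heavy_a) / reference_ratio(light_b, heavy_b)

# Invented intensities for one peptide: sample A vs sample B
fc = fold_change(10.0, 5.0, 4.0, 5.0)
```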

  5. The Quantitative Analysis of an Analgesic Tablet: An NMR Experiment for the Instrumental Analysis Course

    NASA Astrophysics Data System (ADS)

    Schmedake, Thomas A.; Welch, Lawrence E.

    1996-11-01

    A quantitative analysis experiment is outlined that uses 13C NMR. Initial work utilizes a known compound (acenaphthene) to assess the type of NMR experiment necessary to achieve a proportional response from all of the carbons in the compound. Both gated decoupling and inverse gated decoupling routines with a variety of delay times are inspected, in addition to investigation of paramagnetic additives in conjunction with inverse gated decoupling. Once the experiments with the known compound have illuminated the merits of the differing strategies for obtaining a proportional carbon response, a quantitative assessment of an unknown analgesic tablet is undertaken. The amounts of the two major components of the tablet, acetaminophen and aspirin, are determined following addition of an internal standard to the mixture. The carbon resonances emanating from each compound can be identified using spectra of the pure analgesic components and internal standard. Knowing the concentration of the internal standard and assuming a proportional response from all carbons in the sample allows calculation of the amount of both analytes in the analgesic tablets. Data from an initial laboratory trial are presented that illustrate the accuracy of the procedure.
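
    The final calculation reduces to comparing integrated per-carbon intensities against the internal standard: equal per-carbon areas imply equal molar amounts. The peak areas and the standard's molecular weight below are invented; acetaminophen's carbon count (8) and molecular weight (~151.16 g/mol) are real:

```python
def quantify_by_internal_standard(area_analyte, n_carbons_analyte,
                                  area_std, n_carbons_std,
                                  mg_std, mw_std, mw_analyte):
    """Mass of analyte in a sample from quantitative 13C NMR.

    Assumes every carbon gives a proportional response, so integrated
    area per carbon is proportional to moles of compound present.
    """
    per_carbon_analyte = area_analyte / n_carbons_analyte
    per_carbon_std = area_std / n_carbons_std
    mmol_std = mg_std / mw_std
    mmol_analyte = mmol_std * per_carbon_analyte / per_carbon_std
    return mmol_analyte * mw_analyte

# Invented areas and standard; acetaminophen has 8 carbons, MW ~151.16
mg_acetaminophen = quantify_by_internal_standard(
    area_analyte=30.0, n_carbons_analyte=8,
    area_std=50.0, n_carbons_std=10,
    mg_std=100.0, mw_std=200.0, mw_analyte=151.16)
```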

  6. Quantitative analysis of nitrocellulose and pulp in gunpowder by using thermogravimetric analysis/FTIR

    NASA Astrophysics Data System (ADS)

    Johnson, David J.; Compton, David A.

    1989-12-01

    Thermogravimetric Analysis (TGA) has routinely been used to quantitatively determine the presence of a specific component within a material by direct measurement from the weight loss profile. This technique works well when it is known that the detected weight loss was caused only by that component. If more than one material evolves during a single weight loss, it is impossible to quantify the contribution of each individual component by using stand-alone TGA. However, by coupling an FT-IR to the TGA one may assign evolved gases to a detected weight loss and potentially isolate each individual material. Although a number of gases may evolve during one weight loss, the judicious selection of "Specific Gas Profiles" (SGPs) may allow the experimentalist to isolate each gas. The SGP is a measure of IR absorbance within specific frequency regions as a function of time. Through the use of standards, integration of these profiles allows the operator to quantitate the various components in an unknown. Data from this research show that the nitrocellulose and pulp content in gunpowder samples may be measured using the TGA/FT-IR technique.
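
    Quantitation from a Specific Gas Profile is essentially numerical integration of absorbance over the weight-loss event, scaled by a calibration factor from a known standard. A minimal sketch with invented numbers:

```python
def integrate_profile(times, absorbances):
    """Trapezoidal integral of a Specific Gas Profile (absorbance vs time)."""
    return sum((t1 - t0) * (a0 + a1) / 2.0
               for t0, t1, a0, a1 in zip(times, times[1:],
                                         absorbances, absorbances[1:]))

def component_mass(sample_area, standard_area, standard_mass_mg):
    """Scale the sample's integrated profile by a one-point calibration
    against a standard of known mass."""
    return standard_mass_mg * sample_area / standard_area

# Invented SGP: a symmetric evolution event between t = 0 and t = 3 min
area = integrate_profile([0.0, 1.0, 2.0, 3.0], [0.0, 2.0, 2.0, 0.0])
mg = component_mass(area, standard_area=8.0, standard_mass_mg=10.0)
```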

  7. Preparation, certification and interlaboratory analysis of workplace air filters spiked with high-fired beryllium oxide.

    PubMed

    Oatts, Thomas J; Hicks, Cheryl E; Adams, Amy R; Brisson, Michael J; Youmans-McDonald, Linda D; Hoover, Mark D; Ashley, Kevin

    2012-02-01

    Occupational sampling and analysis for multiple elements is generally approached using various approved methods from authoritative government sources such as the National Institute for Occupational Safety and Health (NIOSH), the Occupational Safety and Health Administration (OSHA) and the Environmental Protection Agency (EPA), as well as consensus standards bodies such as ASTM International. The constituents of a sample can exist as unidentified compounds requiring sample preparation to be chosen appropriately, as in the case of beryllium in the form of beryllium oxide (BeO). An interlaboratory study was performed to collect analytical data from volunteer laboratories to examine the effectiveness of methods currently in use for preparation and analysis of samples containing calcined BeO powder. NIST SRM(®) 1877 high-fired BeO powder (1100 to 1200 °C calcining temperature; count median primary particle diameter 0.12 μm) was used to spike air filter media as a representative form of beryllium particulate matter present in workplace sampling that is known to be resistant to dissolution. The BeO powder standard reference material was gravimetrically prepared in a suspension and deposited onto 37 mm mixed cellulose ester air filters at five different levels between 0.5 μg and 25 μg of Be (as BeO). Sample sets consisting of five BeO-spiked filters (in duplicate) and two blank filters, for a total of twelve unique air filter samples per set, were submitted as blind samples to each of 27 participating laboratories. Participants were instructed to follow their current process for sample preparation and utilize their normal analytical methods for processing samples containing substances of this nature. Laboratories using more than one sample preparation and analysis method were provided with more than one sample set. Results from 34 data sets ultimately received from the 27 volunteer laboratories were subjected to applicable statistical analyses. The observed

  9. A Quantitative Analysis of the Extrinsic and Intrinsic Turnover Factors of Relational Database Support Professionals

    ERIC Educational Resources Information Center

    Takusi, Gabriel Samuto

    2010-01-01

    This quantitative analysis explored the intrinsic and extrinsic turnover factors of relational database support specialists. Two hundred and nine relational database support specialists were surveyed for this research. The research was conducted based on Hackman and Oldham's (1980) Job Diagnostic Survey. Regression analysis and a univariate ANOVA…

  10. Integrating Data Analysis (IDA): Working with Sociology Departments to Address the Quantitative Literacy Gap

    ERIC Educational Resources Information Center

    Howery, Carla B.; Rodriguez, Havidan

    2006-01-01

    The NSF-funded Integrating Data Analysis (IDA) Project undertaken by the American Sociological Association (ASA) and the Social Science Data Analysis Network sought to close the quantitative literacy gap for sociology majors. Working with twelve departments, the project built on lessons learned from ASA's Minority Opportunities through School…

  11. A fully automated method for quantitative cerebral hemodynamic analysis using DSC-MRI.

    PubMed

    Bjørnerud, Atle; Emblem, Kyrre E

    2010-05-01

    Dynamic susceptibility contrast (DSC)-based perfusion analysis from MR images has become an established method for analysis of cerebral blood volume (CBV) in glioma patients. To date, little emphasis has, however, been placed on quantitative perfusion analysis of these patients, mainly due to the associated increased technical complexity and lack of sufficient stability in a clinical setting. The aim of our study was to develop a fully automated analysis framework for quantitative DSC-based perfusion analysis. The method presented here generates quantitative hemodynamic maps without user interaction, combined with automatic segmentation of normal-appearing cerebral tissue. Validation of 101 patients with confirmed glioma after surgery gave mean values for CBF, CBV, and MTT, extracted automatically from normal-appearing whole-brain white and gray matter, in good agreement with literature values. The measured age- and gender-related variations in the same parameters were also in agreement with those in the literature. Several established analysis methods were compared and the resulting perfusion metrics depended significantly on method and parameter choice. In conclusion, we present an accurate, fast, and automatic quantitative perfusion analysis method where all analysis steps are based on raw DSC data only. PMID:20087370

  12. Kinetic Analysis of Amylase Using Quantitative Benedict's and Iodine Starch Reagents

    ERIC Educational Resources Information Center

    Cochran, Beverly; Lunday, Deborah; Miskevich, Frank

    2008-01-01

    Quantitative analysis of carbohydrates is a fundamental analytical tool used in many aspects of biology and chemistry. We have adapted a technique developed by Mathews et al. using an inexpensive scanner and open-source image analysis software to quantify amylase activity using both the breakdown of starch and the appearance of glucose. Breakdown…

  13. Digitally Enhanced Thin-Layer Chromatography: An Inexpensive, New Technique for Qualitative and Quantitative Analysis

    ERIC Educational Resources Information Center

    Hess, Amber Victoria Irish

    2007-01-01

    A study conducted shows that if digital photography is combined with regular thin-layer chromatography (TLC), it could perform highly improved qualitative analysis as well as make accurate quantitative analysis possible for a much lower cost than commercial equipment. The findings suggest that digitally enhanced TLC (DE-TLC) is low-cost and easy…

  14. A Quantitative Content Analysis of Mercer University MEd, EdS, and Doctoral Theses

    ERIC Educational Resources Information Center

    Randolph, Justus J.; Gaiek, Lura S.; White, Torian A.; Slappey, Lisa A.; Chastain, Andrea; Harris, Rose Prejean

    2010-01-01

    Quantitative content analysis of a body of research not only helps budding researchers understand the culture, language, and expectations of scholarship, it helps identify deficiencies and inform policy and practice. Because of these benefits, an analysis of a census of 980 Mercer University MEd, EdS, and doctoral theses was conducted. Each thesis…

  15. Revisiting the quantitative features of surface-assisted laser desorption/ionization mass spectrometric analysis.

    PubMed

    Wu, Ching-Yi; Lee, Kai-Chieh; Kuo, Yen-Ling; Chen, Yu-Chie

    2016-10-28

    Surface-assisted laser desorption/ionization (SALDI) coupled with mass spectrometry (MS) is frequently used to analyse small organics owing to its clean background. Inorganic materials can be used as energy absorbers and the transfer medium to facilitate the desorption/ionization of analytes; thus, they are used as SALDI-assisting materials. Many studies have demonstrated the usefulness of SALDI-MS in quantitative analysis of small organics. However, some characteristics occurring in SALDI-MS require certain attention to ensure the reliability of the quantitative analysis results. The appearance of a coffee-ring effect in SALDI sample preparation is the primary factor that can affect quantitative SALDI-MS analysis results. However, to the best of our knowledge, there are no reports relating to quantitative SALDI-MS analysis that discuss or consider this effect. In this study, the coffee-ring effect is discussed using nanoparticles and nanostructured substrates as SALDI-assisting materials to show how this effect influences SALDI-MS analysis results. Potential solutions for overcoming the existing problems are also suggested. This article is part of the themed issue 'Quantitative mass spectrometry'.

  17. Some remarks on the quantitative analysis of behavior.

    PubMed

    Marr, M J

    1989-01-01

    This paper discusses similarities between the mathematization of operant behavior and the early history of the most mathematical of sciences: physics. Galileo explored the properties of motion without dealing with the causes of motion, focusing on changes in motion. Newton's dynamics were concerned with the action of forces as causes of change. Skinner's rationale for using rate to describe behavior derived from an interest in changes in rate. Reinforcement has played the role of force in the dynamics of behavior. Behavioral momentum and maximization have received mathematical formulations in behavior analysis. Yet to be worked out are the relations between molar and molecular formulations of behavioral theory. PMID:22478028

  20. Quantitative proteomic analysis of cold-responsive proteins in rice.

    PubMed

    Neilson, Karlie A; Mariani, Michael; Haynes, Paul A

    2011-05-01

    Rice is susceptible to cold stress, and with a future of climatic instability we will be unable to produce enough rice to satisfy increasing demand. A thorough understanding of the molecular responses to thermal stress is imperative for engineering cultivars that have greater resistance to low temperature stress. In this study we investigated the proteomic response of rice seedlings to 48, 72 and 96 h of cold stress at 12-14°C. The use of both label-free and iTRAQ approaches in the analysis of global protein expression enabled us to assess the complementarity of the two techniques for use in plant proteomics. The approaches yielded a similar biological response to cold stress despite a disparity in the proteins identified. The label-free approach identified 236 cold-responsive proteins compared to 85 in the iTRAQ results, with only 24 proteins in common. Functional analysis revealed differential expression of proteins involved in transport, photosynthesis, and generation of precursor metabolites and energy; more specifically, histones and vitamin B biosynthetic proteins were observed to be affected by cold stress. PMID:21433000

  1. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration. PMID:27478562
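
The HU-based composition analysis described above can be sketched as a simple voxel-classification step. The HU windows below are illustrative placeholders, not the values used in the reviewed studies:

```python
import numpy as np

# Illustrative HU windows only; the reviewed studies define their own ranges.
HU_WINDOWS = {
    "fat": (-200, -10),
    "connective_or_atrophic": (-9, 40),
    "normal_muscle": (41, 100),
}

def muscle_composition(hu_values):
    """Classify voxels by HU window and report each tissue's fraction of the
    muscle volume, plus the mean HU over all classified voxels."""
    hu = np.asarray(hu_values, dtype=float)
    fractions = {}
    classified = np.zeros(hu.shape, dtype=bool)
    for tissue, (lo, hi) in HU_WINDOWS.items():
        mask = (hu >= lo) & (hu <= hi)
        fractions[tissue] = float(mask.mean())
        classified |= mask
    return fractions, float(hu[classified].mean())

fractions, mean_hu = muscle_composition([-50, -30, 20, 55, 60, 70])
```

Reporting both per-tissue fractions and the mean HU mirrors the two summary forms the review describes.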

  2. Quantitative assessment of human body shape using Fourier analysis

    NASA Astrophysics Data System (ADS)

    Friess, Martin; Rohlf, F. J.; Hsiao, Hongwei

    2004-04-01

    Fall protection harnesses are commonly used to reduce the number and severity of injuries. Increasing the efficiency of harness design requires the size and shape variation of the user population to be assessed in as much detail and as accurately as possible. In light of the unsatisfactory performance of traditional anthropometry with respect to such assessments, we propose the use of 3D laser surface scans of whole bodies and the statistical analysis of elliptic Fourier coefficients. Ninety-eight male and female adults were scanned. Key features of each torso were extracted as a 3D curve along the front, back and thighs. A 3D extension of elliptic Fourier analysis was used to quantify their shape through multivariate statistics. Shape change as a function of size (allometry) was predicted by regressing the coefficients onto stature, weight and hip circumference. Upper and lower limits of torso shape variation were determined and can be used to redefine the design of the harness so that it will fit most individual body shapes. Observed allometric changes are used for adjustments to the harness shape in each size. Finally, the estimated outline data were used as templates for a free-form deformation of the complete torso surface using NURBS models (non-uniform rational B-splines).

  3. [Quantitative analysis of seven phenolic acids in eight Yinqiao Jiedu serial preparations by quantitative analysis of multi-components with single-marker].

    PubMed

    Wang, Jun-jun; Zhang, Li; Guo, Qing; Kou, Jun-ping; Yu, Bo-yang; Gu, Dan-hua

    2015-04-01

    The study aims to develop a unified method to determine seven phenolic acids (neochlorogenic acid, chlorogenic acid, 4-caffeoylquinic acid, caffeic acid, isochlorogenic acid B, isochlorogenic acid A and isochlorogenic acid C) contained in honeysuckle flower, the monarch drug of all eight Yinqiao Jiedu serial preparations, using quantitative analysis of multi-components by single-marker (QAMS). Firstly, chlorogenic acid was used as a reference to obtain the average relative correction factors (RCFs) of the other phenolic acids relative to the reference; columns and instruments from different companies were used to validate the durability of the achieved RCFs at different levels of standard solutions, and honeysuckle flower extract was used as the reference substance to fix the positions of chromatographic peaks. Secondly, the contents of the seven phenolic acids in eight different Yinqiao Jiedu serial preparation samples were calculated based on the RCFs' durability. Finally, the quantitative results were compared between QAMS and the external standard (ES) method. The results showed that the durability of the achieved RCFs is good (RSD 0.80%-2.56%), and there are no differences between the quantitative results of QAMS and ES (relative average deviation < 0.93%). Thus QAMS can be successfully used for the quantitative control of honeysuckle flower principally prescribed in Yinqiao Jiedu serial preparations. PMID:26223132
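
The single-marker arithmetic can be illustrated with a small numeric sketch (function names and all numbers here are hypothetical, not taken from the paper): a relative correction factor links each phenolic acid's detector response to that of the chlorogenic acid marker, so only the marker needs an external standard in routine runs.

```python
def relative_correction_factor(area_ref, conc_ref, area_k, conc_k):
    """RCF of analyte k vs. the marker: the ratio of their response
    sensitivities, determined once from standard solutions of both compounds."""
    return (area_k / conc_k) / (area_ref / conc_ref)

def quantify_by_single_marker(area_k, rcf_k, area_ref, conc_ref):
    """Concentration of analyte k in a sample where only the marker
    (peak area area_ref at concentration conc_ref) is externally calibrated."""
    marker_sensitivity = area_ref / conc_ref  # peak area per unit concentration
    return area_k / (rcf_k * marker_sensitivity)

# Hypothetical calibration: analyte k responds twice as strongly as the marker.
rcf = relative_correction_factor(area_ref=100.0, conc_ref=1.0,
                                 area_k=200.0, conc_k=1.0)
conc_k = quantify_by_single_marker(area_k=300.0, rcf_k=rcf,
                                   area_ref=100.0, conc_ref=1.0)
```

The durability tests in the study amount to checking that `rcf` stays stable across instruments, columns and concentration levels.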

  4. Multivariate processing strategies for enhancing qualitative and quantitative analysis based on infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Wan, Boyong

    2007-12-01

    Airborne passive Fourier transform infrared spectrometry is gaining increased attention in environmental applications because of its great flexibility. Usually, pattern recognition techniques are used for automated analysis of the large amounts of collected data. Challenging problems, however, are the constantly changing background and the high cost of calibration. As the aircraft flies, the background is always changing, and given the great variety of backgrounds and the high expense of data collection from aircraft, the cost of collecting representative training data is formidable. Instead of using airborne data, data generated from simulation strategies can be used for training purposes: training data collected under controlled conditions on the ground, or synthesized from real backgrounds, are both options. With either strategy, classifiers may be developed at much lower cost, and signal processing techniques are needed to extract analyte features. In this dissertation, signal processing methods are applied in either the interferogram or the spectral domain for feature extraction. Pattern recognition methods are then applied to develop binary classifiers for automated detection of air-collected methanol and ethanol vapors. The results demonstrate that, with optimized signal processing methods and training set composition, classifiers trained on ground-collected or synthetic data can classify real air-collected data well. Near-infrared (NIR) spectrometry is emerging as a promising tool for noninvasive blood glucose detection. In combination with multivariate calibration techniques, NIR spectroscopy can give quick quantitative determinations of many species with minimal sample preparation. However, one main problem with NIR calibrations is degradation of the calibration model over time. The varying background information worsens prediction precision and complicates the multivariate models. To mitigate the need for frequent recalibration and

  5. Quantitative Analysis of Photoactivated Localization Microscopy (PALM) Datasets Using Pair-correlation Analysis

    PubMed Central

    Sengupta, Prabuddha; Lippincott-Schwartz, Jennifer

    2013-01-01

    Super-resolution techniques based on a pointillistic approach, such as photoactivated localization microscopy (PALM), involve multiple cycles of sequential activation, imaging and precise localization of single fluorescent molecules. A super-resolution image, containing nanoscopic structural information, is then constructed by compiling all the image sequences. Because the final image resolution is determined by the localization precision of detected single molecules and their density, accurate image reconstruction requires imaging of biological structures labeled with fluorescent molecules at high density. In such image datasets, stochastic variations in photon emission and intervening dark states lead to uncertainties in the identification of single molecules. This, in turn, prevents the proper utilization of the wealth of information on molecular distribution and quantity. A recent strategy for overcoming this problem is pair-correlation analysis applied to PALM. Using rigorous statistical algorithms to estimate the number of detected proteins, this approach allows the spatial organization of molecules to be quantitatively described. PMID:22447653
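
At its core, pair-correlation analysis compares observed pair distances against the density expected for spatially random points. A minimal empirical g(r) for 2D localization coordinates might look like the following sketch (real PALM analyses additionally require edge corrections and a model for repeated blinking, both omitted here):

```python
import math
import numpy as np

def pair_correlation(points, r_edges, area):
    """Empirical 2D g(r): observed pair counts per radial bin divided by the
    counts expected for the same number of points placed at random in `area`."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    dists = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(axis=-1))
    dists = dists[np.triu_indices(n, k=1)]          # each pair counted once
    counts, _ = np.histogram(dists, bins=r_edges)
    edges = np.asarray(r_edges, dtype=float)
    annulus = math.pi * (edges[1:] ** 2 - edges[:-1] ** 2)
    expected = (n * (n - 1) / 2) * annulus / area   # random (CSR) expectation
    return counts / expected

g = pair_correlation([(0, 0), (1, 0), (0, 1)], r_edges=[0.5, 1.5, 2.0], area=4.0)
```

Values of g(r) above 1 at short r indicate clustering beyond what random placement would produce.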

  6. Quantifying fungal viability in air and water samples using quantitative PCR after treatment with propidium monoazide (PMA).

    PubMed

    Vesper, Stephen; McKinstry, Craig; Hartmann, Chris; Neace, Michelle; Yoder, Stephanie; Vesper, Alex

    2008-02-01

    A method is described to discriminate between live and dead cells of the infectious fungi Aspergillus fumigatus, Aspergillus flavus, Aspergillus terreus, Mucor racemosus, Rhizopus stolonifer and Paecilomyces variotii. To test the method, conidial suspensions were heat inactivated at 85 degrees C or held at 5 degrees C (controls) for 1 h. Polycarbonate filters (25 mm diameter, 0.8 microm pore size) were placed on "welled" slides (14 mm diameter) and the filters treated with either PBS or PMA. Propidium monoazide (PMA), which enters dead cells but not live cells, was incubated with cell suspensions, exposed to blue wavelength light-emitting diodes (LED) to inactivate remaining PMA and secure intercalation of PMA with DNA of dead cells. Treated cells were extracted and the live and dead cells evaluated with quantitative PCR (QPCR). After heat treatment and DNA modification with PMA, all fungal species tested showed an approximate 100- to 1000-fold difference in cell viability estimated by QPCR analysis which was consistent with estimates of viability based on culturing.
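
The reported 100- to 1000-fold live/dead discrimination corresponds to a QPCR cycle-threshold (Ct) shift of roughly 6.6 to 10 cycles. A sketch of that conversion, assuming ideal amplification efficiency (the function and numbers are illustrative, not from the study):

```python
def fold_difference(ct_control, ct_treated, efficiency=2.0):
    """Convert a Ct shift into a fold change in detectable template.
    With ideal efficiency, each additional cycle implies half the starting DNA."""
    return efficiency ** (ct_treated - ct_control)
```

For example, a PMA-treated heat-killed sample crossing threshold 10 cycles later than the live control implies about a 1000-fold reduction in amplifiable DNA.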

  8. Automated monitoring and quantitative analysis of feeding behaviour in Drosophila.

    PubMed

    Itskov, Pavel M; Moreira, José-Maria; Vinnik, Ekaterina; Lopes, Gonçalo; Safarik, Steve; Dickinson, Michael H; Ribeiro, Carlos

    2014-08-04

    Food ingestion is one of the defining behaviours of all animals, but its quantification and analysis remain challenging. This is especially the case for feeding behaviour in small, genetically tractable animals such as Drosophila melanogaster. Here, we present a method based on capacitive measurements, which allows the detailed, automated and high-throughput quantification of feeding behaviour. Using this method, we were able to measure the volume ingested in single sips of an individual, and monitor the absorption of food with high temporal resolution. We demonstrate that flies ingest food by rhythmically extending their proboscis with a frequency that is not modulated by the internal state of the animal. Instead, hunger and satiety homeostatically modulate the microstructure of feeding. These results highlight similarities of food intake regulation between insects, rodents, and humans, pointing to a common strategy in how the nervous systems of different animals control food intake.
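
In a capacitance-based setup of this kind, individual sips register as brief signal deflections when the proboscis touches the food electrode. A toy detector (the threshold and trace below are made up) simply counts rising threshold crossings:

```python
def count_sips(capacitance_trace, threshold):
    """Count sips as rising crossings of a capacitance threshold."""
    sips = 0
    in_contact = False
    for value in capacitance_trace:
        if value > threshold and not in_contact:
            sips += 1             # proboscis just touched the food
            in_contact = True
        elif value <= threshold:
            in_contact = False    # contact ended
    return sips

n_sips = count_sips([0, 1, 5, 6, 1, 0, 7, 8, 0], threshold=4)
```

Sip timestamps collected this way are what allow the feeding "microstructure" (sip durations and inter-sip intervals) to be analyzed.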

  9. Quantitative real-time single particle analysis of virions.

    PubMed

    Heider, Susanne; Metzner, Christoph

    2014-08-01

    Providing information about single virus particles has for a long time been mainly the domain of electron microscopy. More recently, technologies have been developed, or adapted from other fields such as nanotechnology, to allow for the real-time quantification of physical virion particles, while concomitantly supplying additional information such as particle diameter. These technologies have progressed to the stage of commercialization, increasing the speed of viral titer measurements from hours to minutes and thus providing a significant advantage for many aspects of virology research and biotechnology applications. Additional advantages lie in the broad spectrum of virus species that may be measured and the possibility to determine the ratio of infectious to total particles. A series of disadvantages remain associated with these technologies, such as a low specificity for viral particles. In this review we discuss these technologies by comparing four systems for real-time single virus particle analysis and quantification.

  11. Quantitative Analysis of Spectral Impacts on Silicon Photodiode Radiometers: Preprint

    SciTech Connect

    Myers, D. R.

    2011-04-01

    Inexpensive broadband pyranometers with silicon photodiode detectors have a non-uniform spectral response over the spectral range of 300-1100 nm. The response region includes only about 70% to 75% of the total energy in the terrestrial solar spectral distribution from 300 nm to 4000 nm. The solar spectrum constantly changes with solar position and atmospheric conditions. The relative spectral distributions of diffuse hemispherical (sky) irradiance and total global hemispherical irradiance are drastically different. This analysis convolves a typical photodiode response with SMARTS 2.9.5 spectral model spectra for different sites and atmospheric conditions. Differences in solar component spectra lead to differences on the order of 2% in global hemispherical and 5% or more in diffuse hemispherical irradiances from silicon radiometers. The result is that errors of more than 7% can occur in the computation of direct normal irradiance from global hemispherical irradiance and diffuse hemispherical irradiance using these radiometers.
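
The closure relation behind that last computation is simple, which is why spectral biases in the two measured components land directly in the derived direct normal irradiance. A sketch (values are illustrative):

```python
import math

def direct_normal_irradiance(ghi, dhi, zenith_deg):
    """Closure relation GHI = DNI * cos(Z) + DHI, solved for DNI (W/m^2).
    A 2% bias in GHI and a 5% bias in DHI of opposite sign can therefore
    produce a considerably larger relative error in DNI."""
    return (ghi - dhi) / math.cos(math.radians(zenith_deg))

dni = direct_normal_irradiance(ghi=800.0, dhi=100.0, zenith_deg=60.0)
```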

  12. Quantitative analysis by mid-infrared spectrometry in food and agro-industrial fields

    NASA Astrophysics Data System (ADS)

    Dupuy, Nathalie; Huvenne, J. P.; Sombret, B.; Legrand, P.

    1993-03-01

    Thanks to the Fourier transform, infrared spectroscopy can now take its place in quality control laboratories, given its precision and the time it saves compared with traditional analysis methods such as HPLC chromatography. Moreover, the growing number of mathematical regression methods, such as Partial Least Squares (PLS) regression, allows multicomponent quantitative analysis of mixtures. Nevertheless, the efficiency of infrared spectrometry as a quantitative analysis method often depends on the choice of an adequate presentation for the sample. In this document, we demonstrate several techniques, such as diffuse reflectance and Attenuated Total Reflectance (ATR), which can be used according to the various physical states of the mixtures. The quantitative analysis of real samples from the food industry enables us to estimate its precision. For instance, the analysis of the three main components (glucose, fructose and maltose) in glucose syrups can be done (using ATR) with a precision in the region of 3%, whereas the time required to obtain an analysis report is about 5 minutes. Finally, multicomponent quantitative analysis is quite feasible by mid-IR spectroscopy.
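
A minimal PLS1 (NIPALS) calibration of the kind described can be sketched on synthetic "spectra" built as linear mixtures; all data here are hypothetical, and a production calibration would use validated chemometrics software rather than this toy implementation.

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Minimal NIPALS PLS1 for a univariate response y (e.g. glucose content)."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)           # weight vector
        t = Xc @ w                       # score
        tt = t @ t
        p = Xc.T @ t / tt                # loading
        W.append(w); P.append(p); q.append((yc @ t) / tt)
        Xc = Xc - np.outer(t, p)         # deflate X
        yc = yc - q[-1] * t              # deflate y
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    coef = W @ np.linalg.solve(P.T @ W, q)
    return x_mean, y_mean, coef

def pls1_predict(model, X):
    x_mean, y_mean, coef = model
    return y_mean + (np.asarray(X, float) - x_mean) @ coef

# Synthetic mixtures: 3 pure-component "spectra", 10 samples; predict component 0.
rng = np.random.default_rng(1)
pure = rng.random((3, 25))
conc = rng.random((10, 3))
spectra = conc @ pure
model = pls1_fit(spectra, conc[:, 0], n_components=3)
pred = pls1_predict(model, spectra)
```

On noiseless linear mixtures, three latent variables recover the component concentration essentially exactly; with real ATR spectra, the number of components is chosen by cross-validation.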

  13. Interlake production established using quantitative hydrocarbon well-log analysis

    SciTech Connect

    Lancaster, J.; Atkinson, A.

    1988-07-01

    Production was established in a new pay zone of the basal Interlake Formation adjacent to production in Midway field in Williams County, North Dakota. Hydrocarbon saturation, which was computed using hydrocarbon well-log (mud-log) data, and computed permeability encouraged the operator to run casing and test this zone. By use of drilling rig parameters, drilling mud properties, hydrocarbon-show data from the mud log, drilled rock and porosity descriptions, and wireline log porosity, this new technique computes oil saturation (percent of porosity) and permeability to the invading filtrate, using the Darcy equation. The Leonardo Fee well was drilled to test the Devonian Duperow, the Silurian upper Interlake, and the Ordovician Red River. The upper two objectives were penetrated downdip from Midway production and there were no hydrocarbon shows. It was determined that the Red River was tight, based on sample examination by well site personnel. The basal Interlake, however, liberated hydrocarbon shows that were analyzed by this new technology. The results of this evaluation accurately predicted this well would be a commercial success when placed in production. Where geophysical log analysis might be questionable, this new evaluation technique may provide answers to anticipated oil saturation and producibility. The encouraging results of hydrocarbon saturation and permeability, produced by this technique, may be largely responsible for the well being in production today.
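
The permeability step rests on a rearrangement of Darcy's law; the proprietary details of the technique are not given, so the following is only the textbook relation in oil-field units:

```python
def darcy_permeability(flow_cm3_s, viscosity_cp, length_cm, area_cm2, dp_atm):
    """Darcy's law, k = q * mu * L / (A * dP), giving permeability in darcys
    when q is in cm^3/s, mu in cP, L in cm, A in cm^2 and dP in atm."""
    return flow_cm3_s * viscosity_cp * length_cm / (area_cm2 * dp_atm)

# By definition of the darcy: 1 cm^3/s of 1 cP fluid through a 1 cm^2 face
# over 1 cm under 1 atm corresponds to k = 1 darcy.
k = darcy_permeability(flow_cm3_s=1.0, viscosity_cp=1.0,
                       length_cm=1.0, area_cm2=1.0, dp_atm=1.0)
```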

  14. Quantitative analysis of American woodcock nest and brood habitat

    USGS Publications Warehouse

    Bourgeois, A.; Keppie, Daniel M.; Owen, Ray B.

    1977-01-01

    Sixteen nest and 19 brood sites of American woodcock (Philohela minor) were examined in northern lower Michigan between 15 April and 15 June 1974 to determine the habitat structure associated with these sites. Woodcock hens utilized young, second-growth forest stands which were similar in species composition for both nesting and brood rearing. A multivariate discriminant function analysis revealed a significant (P < 0.05) difference, however, in habitat structure. Nest habitat was characterized by lower tree density (2176 trees/ha) and basal area (8.6 m²/ha), by being close to forest openings (7 m) and by being situated on dry, relatively well-drained sites. In contrast, woodcock broods were located in sites that had nearly twice the tree density (3934 trees/ha) and basal area (16.5 m²/ha), were located over twice as far from forest openings (18 m) and generally occurred on damp sites, near (8 m) standing water. The importance of these habitat features to the species and possible management implications are discussed.

  15. Mechanistic insights from a quantitative analysis of pollen tube guidance

    PubMed Central

    2010-01-01

    Background: Plant biologists have long speculated about the mechanisms that guide pollen tubes to ovules. Although there is now evidence that ovules emit a diffusible attractant, little is known about how this attractant mediates interactions between the pollen tube and the ovules. Results: We employ a semi-in vitro assay, in which ovules dissected from Arabidopsis thaliana are arranged around a cut style on artificial medium, to elucidate how ovules release the attractant and how pollen tubes respond to it. Analysis of microscopy images of the semi-in vitro system shows that pollen tubes are more attracted to ovules that are incubated on the medium for longer times before pollen tubes emerge from the cut style. The responses of tubes are consistent with their sensing a gradient of an attractant at 100-150 μm, farther than previously reported. Our microscopy images also show that pollen tubes slow their growth near the micropyles of functional ovules with a spatial range that depends on ovule incubation time. Conclusions: We propose a stochastic model that captures these dynamics. In the model, a pollen tube senses a difference in the fraction of receptors bound to an attractant and changes its direction of growth in response; the attractant is continuously released from ovules and spreads isotropically on the medium. The model suggests that the observed slowing greatly enhances the ability of pollen tubes to successfully target ovules. The relation of the results to guidance in vivo is discussed. PMID:20170550
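
The model's steering signal, the difference in the fraction of bound receptors across the tube tip, can be sketched with simple one-site binding (the parameter names `kd` and `gain` are illustrative, not the paper's notation):

```python
def fraction_bound(conc, kd):
    """One-site binding: receptor occupancy at a local attractant concentration."""
    return conc / (conc + kd)

def turning_signal(conc_left, conc_right, kd, gain=1.0):
    """Steering signal proportional to the occupancy difference across the tip.
    It vanishes when both sides see equal concentration, and saturates
    (losing direction information) when conc >> kd."""
    return gain * (fraction_bound(conc_right, kd) - fraction_bound(conc_left, kd))
```

Note the saturation: far above `kd` both sides of the tip are fully bound and the signal disappears, consistent with attraction operating only within a finite sensing range.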

  16. Quantitative analysis of phenol oxidase activity in insect hemolymph.

    PubMed

    Sorrentino, Richard Paul; Small, Chiyedza N; Govind, Shubha

    2002-04-01

    We describe a simple, inexpensive, and robust protocol for the quantification of phenol oxidase activity in insect hemolymph. Discrete volumes of hemolymph from Drosophila melanogaster larvae are applied to pieces of filter paper soaked in an L-3,4-dihydroxyphenylalanine (L-DOPA) solution. Phenol oxidase present in the samples catalyzes melanin synthesis from the L-DOPA precursor, resulting in the appearance of a roughly circular melanized spot on the filter paper. The filter paper is then scanned and analyzed with image-processing software. Each pixel in an image is assigned a grayscale value. The mean of the grayscale values for a circular region of pixels at the center of the image of each spot is used to compute a melanization index (MI) value; the computation is based on a comparison to an external standard (India ink). Numerical MI values for control and experimental larvae can then be pooled and subjected to statistical analysis. This protocol was used to evaluate phenol oxidase activity in larvae of different backgrounds: wild-type, lozenge, hopscotch(Tumorous-lethal) (which induces the formation of large melanotic tumors), and body-color mutations ebony and yellow. Our results demonstrate that this assay is sensitive enough for use in genetic screens with D. melanogaster and could conceivably be used for evaluation of MI from hemolymph of other insects.
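    The MI computation described above (mean grayscale of a circular central region, referenced to an India-ink external standard) can be sketched as follows. The function name and the exact normalization against the ink and blank-paper values are assumptions, not the published formula.

```python
import numpy as np

def melanization_index(image, center, radius, ink_value, blank_value):
    """Melanization index for one spot.

    image: 2-D array of grayscale values (0 = black, 255 = white).
    center, radius: circular region at the middle of the spot.
    ink_value / blank_value: mean grayscale of the India-ink standard
    and of unreacted filter paper, used to rescale darkness to [0, 1].
    """
    yy, xx = np.indices(image.shape)
    mask = (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= radius ** 2
    spot_mean = image[mask].mean()
    # Darker spot -> more melanin; 0 = blank paper, 1 = as dark as ink.
    return (blank_value - spot_mean) / (blank_value - ink_value)

# Synthetic example: a dark spot (value 80) on light paper (value 220).
img = np.full((64, 64), 220.0)
yy, xx = np.indices(img.shape)
img[(yy - 32) ** 2 + (xx - 32) ** 2 <= 10 ** 2] = 80.0
mi = melanization_index(img, (32, 32), 8, ink_value=20.0, blank_value=220.0)
print(round(mi, 2))  # 0.7
```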

  17. Machine learning methods for quantitative analysis of Raman spectroscopy data

    NASA Astrophysics Data System (ADS)

    Madden, Michael G.; Ryder, Alan G.

    2003-03-01

    The automated identification and quantification of illicit materials using Raman spectroscopy is of significant importance for law enforcement agencies. This paper explores the use of Machine Learning (ML) methods in comparison with standard statistical regression techniques for developing automated identification methods. In this work, the ML task is broken into two sub-tasks, data reduction and prediction. In well-conditioned data, the number of samples should be much larger than the number of attributes per sample, to limit the degrees of freedom in predictive models. In this spectroscopy data, the opposite is normally true. Predictive models based on such data have a high number of degrees of freedom, which increases the risk of models over-fitting to the sample data and having poor predictive power. In the work described here, an approach to data reduction based on Genetic Algorithms is described. For the prediction sub-task, the objective is to estimate the concentration of a component in a mixture, based on its Raman spectrum and the known concentrations of previously seen mixtures. Here, Neural Networks and k-Nearest Neighbours are used for prediction. Preliminary results are presented for the problem of estimating the concentration of cocaine in solid mixtures, and compared with previously published results in which statistical analysis of the same dataset was performed. Finally, this paper demonstrates how more accurate results may be achieved by using an ensemble of prediction techniques.
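    The prediction sub-task described above (estimating a component's concentration from its spectrum with k-Nearest Neighbours) can be sketched on toy data. The Genetic Algorithm data-reduction step is not shown, and the synthetic "spectra" below stand in for real Raman features.

```python
import numpy as np

def knn_predict(train_spectra, train_conc, query, k=3):
    """Predict concentration as the mean over the k training spectra
    nearest to the query (Euclidean distance in feature space)."""
    d = np.linalg.norm(train_spectra - query, axis=1)
    nearest = np.argsort(d)[:k]
    return train_conc[nearest].mean()

# Toy data: 5-channel 'spectra' whose intensities scale with concentration.
rng = np.random.default_rng(0)
conc = np.linspace(0.0, 1.0, 20)
spectra = conc[:, None] * np.ones((20, 5)) + rng.normal(0.0, 0.01, (20, 5))
query = 0.5 * np.ones(5)
print(round(float(knn_predict(spectra, conc, query)), 2))
```

    In practice the distance would be computed over the GA-selected spectral channels rather than the full spectrum.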

  18. Quantitative analysis of a transportable matter-wave gravimeter

    NASA Astrophysics Data System (ADS)

    Desruelle, B.; Le Moigne, N.; Bonvalot, S.; Menoret, V.; Vermeulen, P.; Merlet, S.

    2015-12-01

    This paper summarizes the latest results obtained with our second-generation Absolute Quantum Gravimeter (AQG). This instrument relies on advanced matter-wave interferometry techniques, which allow us to precisely characterize the vertical acceleration experienced by a cloud of cold atoms over a free fall of 10 cm. A significant research effort was conducted over the last months to optimize the instrument's sensitivity as well as its rejection of ground vibrations, and we will present the technological solutions that were selected to meet our objectives. We will then present a detailed review of the characterizations performed with this instrument. These data show a very satisfactory sensitivity of the AQG (2 μGal standard deviation after 1000 s of data integration) and very robust behavior against ground vibrations. We will also present a detailed analysis of the long-term behavior of the instrument. These results clearly demonstrate the high potential of matter-wave gravimeters for high-performance absolute gravity measurements. Finally, we will discuss the research activities we are conducting to develop a field version of this instrument.

  19. Quantitative analysis of bloggers' collective behavior powered by emotions

    NASA Astrophysics Data System (ADS)

    Mitrović, Marija; Paltoglou, Georgios; Tadić, Bosiljka

    2011-02-01

    Large-scale data resulting from users' online interactions provide the ultimate source of information to study emergent social phenomena on the Web. From individual actions of users to observable collective behaviors, different mechanisms involving emotions expressed in the posted text play a role. Here we combine approaches of statistical physics with machine-learning methods of text analysis to study the emergence of emotional behavior among Web users. Mapping the high-resolution data from digg.com onto bipartite networks of users and their comments onto posted stories, we identify user communities centered around certain popular posts and determine emotional contents of the related comments by the emotion classifier developed for this type of text. Applied over different time periods, this framework reveals strong correlations between the excess of negative emotions and the evolution of communities. We observe avalanches of emotional comments exhibiting significant self-organized critical behavior and temporal correlations. To explore the robustness of these critical states, we design a network-automaton model on realistic network connections and several control parameters, which can be inferred from the dataset. Dissemination of emotions by a small fraction of very active users appears to critically tune the collective states.

  20. Direct Quantitative Analysis of Arsenic in Coal Fly Ash

    PubMed Central

    Hartuti, Sri; Kambara, Shinji; Takeyama, Akihiro; Kumabe, Kazuhiro; Moritomi, Hiroshi

    2012-01-01

    A rapid, simple method based on graphite furnace atomic absorption spectrometry is described for the direct determination of arsenic in coal fly ash. Solid samples were directly introduced into the atomizer without preliminary treatment. The direct analysis method was not always free of spectral matrix interference, but the stabilization of arsenic by adding palladium nitrate (chemical modifier) and the optimization of the parameters in the furnace program (temperature, rate of temperature increase, hold time, and argon gas flow) gave good results for the total arsenic determination. The optimal furnace program was determined by analyzing different concentrations of a reference material (NIST1633b), which showed the best linearity for calibration. The optimized parameters for the furnace programs for the ashing and atomization steps were as follows: temperatures of 500–1200 and 2150°C, heating rates of 100 and 500°C s−1, hold times of 90 and 7 s, and medium then maximum and medium argon gas flows, respectively. The calibration plots were linear with a correlation coefficient of 0.9699. This method was validated using arsenic-containing raw coal samples in accordance with the requirements of the mass balance calculation; the distribution rate of As in the fly ashes ranged from 101 to 119%. PMID:23251836
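    The calibration step described above (a linear fit over reference-material concentrations, judged by its correlation coefficient) can be sketched with a least-squares fit. The calibration points below are hypothetical, not the NIST1633b data.

```python
import numpy as np

# Hypothetical calibration points: As mass (ng) vs. measured absorbance.
conc = np.array([0.0, 10.0, 20.0, 40.0, 80.0])
absorb = np.array([0.002, 0.051, 0.098, 0.197, 0.389])

slope, intercept = np.polyfit(conc, absorb, 1)
r = np.corrcoef(conc, absorb)[0, 1]  # linearity of the calibration plot
print(f"slope={slope:.5f} intercept={intercept:.4f} r={r:.4f}")

# Convert an unknown sample's absorbance back through the fit line.
unknown_abs = 0.150
print(round(float((unknown_abs - intercept) / slope), 1))  # ng of As
```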

  1. Quantitative real-time single particle analysis of virions

    SciTech Connect

    Heider, Susanne; Metzner, Christoph

    2014-08-15

    Providing information about single virus particles has for a long time been mainly the domain of electron microscopy. More recently, technologies have been developed—or adapted from other fields, such as nanotechnology—to allow for the real-time quantification of physical virion particles, while supplying additional information such as particle diameter concomitantly. These technologies have progressed to the stage of commercialization, increasing the speed of viral titer measurements from hours to minutes, thus providing a significant advantage for many aspects of virology research and biotechnology applications. Additional advantages lie in the broad spectrum of virus species that may be measured and the possibility to determine the ratio of infectious to total particles. A series of disadvantages remain associated with these technologies, such as a low specificity for viral particles. In this review we will discuss these technologies by comparing four systems for real-time single virus particle analysis and quantification. - Highlights: • We introduce four methods for virus particle-based quantification of viruses. • They allow for quantification of a wide range of samples in under an hour. • The additional measurement of size and zeta potential is possible for some.

  2. Quantitative Analysis of the Microstructure of Auxetic Foams

    SciTech Connect

    Gaspar, N.; Smith, C.W.; Miller, E.A.; Seidler, G.T.; Evans, K.E.

    2008-07-28

    The auxetic foams first produced by Lakes have been modelled in a variety of ways, each model trying to reproduce some observed feature of the microscale of the foams. Such features include bent or broken ribs or inverted angles between ribs. These models can reproduce the Poisson's ratio or Poisson's function of auxetic foam if the model parameters are carefully chosen. However these model parameters may not actually reflect the internal structure of the foams. A big problem is that measurement of parameters such as lengths and angles is not straightforward within a 3-d sample. In this work a sample of auxetic foam has been imaged by 3-d X-ray computed tomography. The resulting image is translated to a form that emphasises the geometrical structure of connected ribs. These connected rib data are suitably analysed to describe both the microstructural construction of auxetic foams and the statistical spread of structure, that is, the heterogeneity of an auxetic foam. From the analysis of the microstructure, observations are made about the requirements for microstructural models and comparisons made to previous existing models. From the statistical data, measures of heterogeneity are made that will help with future modelling that includes the heterogeneous aspect of auxetic foams.

  3. Segmentation and learning in the quantitative analysis of microscopy images

    NASA Astrophysics Data System (ADS)

    Ruggiero, Christy; Ross, Amy; Porter, Reid

    2015-02-01

    In material science and bio-medical domains the quantity and quality of microscopy images is rapidly increasing and there is a great need to automatically detect, delineate and quantify particles, grains, cells, neurons and other functional "objects" within these images. These are challenging problems for image processing because of the variability in object appearance that inevitably arises in real world image acquisition and analysis. One of the most promising (and practical) ways to address these challenges is interactive image segmentation. These algorithms are designed to incorporate input from a human operator to tailor the segmentation method to the image at hand. Interactive image segmentation is now a key tool in a wide range of applications in microscopy and elsewhere. Historically, interactive image segmentation algorithms have tailored segmentation on an image-by-image basis, and information derived from operator input is not transferred between images. But recently there has been increasing interest to use machine learning in segmentation to provide interactive tools that accumulate and learn from the operator input over longer periods of time. These new learning algorithms reduce the need for operator input over time, and can potentially provide a more dynamic balance between customization and automation for different applications. This paper reviews the state of the art in this area, provides a unified view of these algorithms, and compares the segmentation performance of various design choices.

  4. The Quantitative Analysis of the Rotational Spectrum of Ncncs

    NASA Astrophysics Data System (ADS)

    Winnewisser, Manfred; Winnewisser, Brenda P.; Medvedev, Ivan R.; De Lucia, Frank C.; Ross, Stephen C.; Koput, Jacek

    2009-06-01

    The analysis of the rotational data which were the basis of our two previous publications about NCNCS as an example of quantum monodromy has been completed, and the data extended to include the 6th excited state of the quasilinear bending mode. This talk will present the results of fitting the data with the GSRB Hamiltonian, which provides structural and potential parameters. Ab initio calculations contributed some parameters that could not be determined from the data. The predicted variation of the expectation value of ρ, which is the complement of the CNC angle, and of the electric dipole transition moment, upon rovibrational excitation indicate the mapping of monodromy in the potential function into these properties of the molecule. B. P. Winnewisser, M. Winnewisser, I. R. Medvedev, M. Behnke, F. C. De Lucia, S. C. Ross and J. Koput Phys. Rev. Lett. 95 (243002), 2005. M. Winnewisser, B. P. Winnewisser, I. R. Medvedev, F. C. De Lucia, S. C. Ross and L. M. Bates J. Mol. Struct. 798 (1-26), 2006.

  5. Segmentation of vascular structures and hematopoietic cells in 3D microscopy images and quantitative analysis

    NASA Astrophysics Data System (ADS)

    Mu, Jian; Yang, Lin; Kamocka, Malgorzata M.; Zollman, Amy L.; Carlesso, Nadia; Chen, Danny Z.

    2015-03-01

    In this paper, we present image processing methods for quantitative study of how the bone marrow microenvironment changes (characterized by altered vascular structure and hematopoietic cell distribution) caused by diseases or various factors. We develop algorithms that automatically segment vascular structures and hematopoietic cells in 3-D microscopy images, perform quantitative analysis of the properties of the segmented vascular structures and cells, and examine how such properties change. In processing images, we apply local thresholding to segment vessels, and add post-processing steps to deal with imaging artifacts. We propose an improved watershed algorithm that relies on both intensity and shape information and can separate multiple overlapping cells better than common watershed methods. We then quantitatively compute various features of the vascular structures and hematopoietic cells, such as the branches and sizes of vessels and the distribution of cells. In analyzing vascular properties, we provide algorithms for pruning fake vessel segments and branches based on vessel skeletons. Our algorithms can segment vascular structures and hematopoietic cells with good quality. We use our methods to quantitatively examine the changes in the bone marrow microenvironment caused by deletion of the Notch pathway. Our quantitative analysis reveals property changes in samples with the Notch pathway deleted. Our tool is useful for biologists to quantitatively measure changes in the bone marrow microenvironment, and for developing possible therapeutic strategies to help the bone marrow microenvironment recover.

  6. Hemato-critical issues in quantitative analysis of dried blood spots: challenges and solutions.

    PubMed

    De Kesel, Pieter Mm; Sadones, Nele; Capiau, Sara; Lambert, Willy E; Stove, Christophe P

    2013-08-01

    Dried blood spot (DBS) sampling for quantitative determination of drugs in blood has entered the bioanalytical arena at a fast pace during the last decade, primarily owing to progress in analytical instrumentation. Despite the many advantages associated with this new sampling strategy, several issues remain, of which the hematocrit issue is undoubtedly the most widely discussed challenge, since strongly deviating hematocrit values may significantly impact DBS-based quantitation. In this review, an overview is given of the different aspects of the 'hematocrit problem' in quantitative DBS analysis. The different strategies that try to cope with this problem are discussed, along with their potential and limitations. Implementation of some of these strategies in practice may help to overcome this important hurdle in DBS assays, further allowing DBS to become an established part of routine quantitative bioanalysis.

  7. Analysis of air quality management with emphasis on transportation sources

    NASA Technical Reports Server (NTRS)

    English, T. D.; Divita, E.; Lees, L.

    1980-01-01

    The current environment and practices of air quality management were examined for three regions: Denver, Phoenix, and the South Coast Air Basin of California. These regions were chosen because the majority of their air pollution emissions are related to mobile sources. The impact of auto exhaust on the air quality management process is characterized and assessed. An examination of the uncertainties in air pollutant measurements, emission inventories, meteorological parameters, atmospheric chemistry, and air quality simulation models is performed. The implications of these uncertainties for current air quality management practices are discussed. A set of corrective actions is recommended to reduce these uncertainties.

  8. Quantitative flux analysis reveals folate-dependent NADPH production

    NASA Astrophysics Data System (ADS)

    Fan, Jing; Ye, Jiangbin; Kamphorst, Jurre J.; Shlomi, Tomer; Thompson, Craig B.; Rabinowitz, Joshua D.

    2014-06-01

    ATP is the dominant energy source in animals for mechanical and electrical work (for example, muscle contraction or neuronal firing). For chemical work, there is an equally important role for NADPH, which powers redox defence and reductive biosynthesis. The most direct route to produce NADPH from glucose is the oxidative pentose phosphate pathway, with malic enzyme sometimes also important. Although the relative contribution of glycolysis and oxidative phosphorylation to ATP production has been extensively analysed, similar analysis of NADPH metabolism has been lacking. Here we demonstrate the ability to directly track, by liquid chromatography-mass spectrometry, the passage of deuterium from labelled substrates into NADPH, and combine this approach with carbon labelling and mathematical modelling to measure NADPH fluxes. In proliferating cells, the largest contributor to cytosolic NADPH is the oxidative pentose phosphate pathway. Surprisingly, a nearly comparable contribution comes from serine-driven one-carbon metabolism, in which oxidation of methylene tetrahydrofolate to 10-formyl-tetrahydrofolate is coupled to reduction of NADP+ to NADPH. Moreover, tracing of mitochondrial one-carbon metabolism revealed complete oxidation of 10-formyl-tetrahydrofolate to make NADPH. As folate metabolism has not previously been considered an NADPH producer, confirmation of its functional significance was undertaken through knockdown of methylenetetrahydrofolate dehydrogenase (MTHFD) genes. Depletion of either the cytosolic or mitochondrial MTHFD isozyme resulted in decreased cellular NADPH/NADP+ and reduced/oxidized glutathione ratios (GSH/GSSG) and increased cell sensitivity to oxidative stress. Thus, although the importance of folate metabolism for proliferating cells has been long recognized and attributed to its function of producing one-carbon units for nucleic acid synthesis, another crucial function of this pathway is generating reducing power.

  9. Quantitative ultrasound texture analysis for clinical decision making support

    NASA Astrophysics Data System (ADS)

    Wu, Jie Ying; Beland, Michael; Konrad, Joseph; Tuomi, Adam; Glidden, David; Grand, David; Merck, Derek

    2015-03-01

    We propose a general ultrasound (US) texture-analysis and machine-learning framework for detecting the presence of disease that is suitable for clinical application across clinicians, disease types, devices, and operators. Its stages are image selection, image filtering, ROI selection, feature parameterization, and classification. Each stage is modular and can be replaced with alternate methods. Thus, this framework is adaptable to a wide range of tasks. Our two preliminary clinical targets are hepatic steatosis and adenomyosis diagnosis. For steatosis, we collected US images from 288 patients and their pathology-determined values of steatosis (%) from biopsies. Two radiologists independently reviewed all images and identified the region of interest (ROI) most representative of the hepatic echotexture for each patient. To parameterize the images into comparable quantities, we filter the US images at multiple scales for various texture responses. For each response, we collect a histogram of pixel features within the ROI, and parameterize it as a Gaussian function using its mean, standard deviation, kurtosis, and skew to create a 36-feature vector. Our algorithm uses a support vector machine (SVM) for classification. Using a threshold of 10%, we achieved 72.81% overall accuracy, 76.18% sensitivity, and 65.96% specificity in identifying steatosis with leave-ten-out cross-validation (p<0.0001). Extending this framework to adenomyosis, we identified 38 patients with MR-confirmed findings of adenomyosis and previous US studies and 50 controls. A single rater picked the best US image and ROI for each case. Using the same processing pipeline, we obtained 76.14% accuracy, 86.00% sensitivity, and 63.16% specificity with leave-one-out cross-validation (p<0.0001).
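    The feature-parameterization stage described above (mean, standard deviation, skewness, and kurtosis of the ROI pixel histogram per filter response; 36 features implies nine responses) can be sketched as below. The simple box-smoothing "filters" are stand-ins for real texture filters, and all names are hypothetical.

```python
import numpy as np

def histogram_moments(pixels):
    """Mean, standard deviation, skewness, and excess kurtosis of the
    pixel-value histogram of a region of interest."""
    m, s = pixels.mean(), pixels.std()
    z = (pixels - m) / s
    return np.array([m, s, (z ** 3).mean(), (z ** 4).mean() - 3.0])

def texture_features(roi, scales=(1, 2, 3, 4, 6, 8, 12, 16, 24)):
    """One 4-moment summary per filter response; nine responses give a
    36-feature vector. Real pipelines use texture filters; here each
    'response' is just a horizontal box smoothing of the given width."""
    feats = []
    for s in scales:
        kernel = np.ones(s) / s
        response = np.apply_along_axis(
            lambda row: np.convolve(row, kernel, mode="same"), 1, roi)
        feats.append(histogram_moments(response.ravel()))
    return np.concatenate(feats)

rng = np.random.default_rng(1)
roi = rng.normal(100.0, 15.0, size=(64, 64))  # synthetic echotexture ROI
v = texture_features(roi)
print(v.shape)  # (36,)
```

    The resulting vectors would then be fed to an SVM classifier with the diagnosis labels.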

  10. Quantitative analysis of cardiovascular modulation in respiratory neural activity.

    PubMed

    Dick, Thomas E; Morris, Kendall F

    2004-05-01

    We propose the 'delta(2)-statistic' for assessing the magnitude and statistical significance of arterial pulse-modulated activity of single neurones and present the results of applying this tool to medullary respiratory-modulated units. This analytical tool is a modification of the eta(2)-statistic and, consequently, based on the analysis of variance. The eta(2)-statistic reflects the consistency of respiratory-modulated activity on a cycle-by-cycle basis. However, directly applying this test to activity during the cardiac cycle proved ineffective because subjects-by-treatments matrices did not contain enough 'information'. We increased information by dividing the cardiac cycle into fewer bins, excluding cycles without activity and summing activity over multiple cycles. The analysed neuronal activity was an existing data set examining the neural control of respiration and cough. Neurones were recorded in the nuclei of the solitary tracts, and in the rostral and caudal ventral respiratory groups of decerebrate, neuromuscularly blocked, ventilated cats (n= 19). Two hundred of 246 spike trains were respiratory modulated; of these 53% were inspiratory (I), 36.5% expiratory (E), 6% IE phase spanning and 4.5% EI phase spanning and responsive to airway stimulation. Nearly half (n= 96/200) of the respiratory-modulated units were significantly pulse modulated and 13 were highly modulated with delta(2) values exceeding 0.3. In 10 of these highly modulated units, eta(2) values were greater than 0.3 and all 13 had, at least, a portion of their activity during expiration. We conclude that cardiorespiratory interaction is reciprocal; in addition to respiratory-modulated activity in a subset of neuronal activity patterns controlling the cardiovascular system, pulse-modulated activity exists in a subset of neuronal activity patterns controlling the respiratory system. 
Thus, cardio-ventilatory coupling apparent in respiratory motor output is evident and, perhaps, derived from the
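    The eta²-style statistic underlying the analysis above (an ANOVA decomposition of a cycles-by-bins spike-count matrix) can be sketched as the fraction of variance explained by bin position within the cycle. The delta² modifications described in the abstract (fewer bins, excluding silent cycles, summing over multiple cycles) are not shown, and the data below are synthetic.

```python
import numpy as np

def eta_squared(counts):
    """ANOVA-style modulation index for an (n_cycles, n_bins) matrix of
    spike counts: SS_between-bins / SS_total, i.e. the fraction of total
    variance explained by position within the cycle."""
    grand = counts.mean()
    ss_total = ((counts - grand) ** 2).sum()
    bin_means = counts.mean(axis=0)
    ss_between = counts.shape[0] * ((bin_means - grand) ** 2).sum()
    return ss_between / ss_total

rng = np.random.default_rng(2)
bins = np.arange(10)
# A unit firing preferentially mid-cycle vs. an unmodulated unit.
modulated = rng.poisson(2 + 8 * np.exp(-(bins - 5.0) ** 2 / 4.0), size=(50, 10))
flat = rng.poisson(5.0, size=(50, 10))
print(round(eta_squared(modulated), 2), round(eta_squared(flat), 2))
```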

  11. [Study of infrared spectroscopy quantitative analysis method for methane gas based on data mining].

    PubMed

    Zhang, Ai-Ju

    2013-10-01

    Monitoring of methane gas is one of the important factors affecting coal mine safety, and online real-time monitoring of methane is used for mine safety protection. To improve the accuracy of model analysis, in the present paper the author uses infrared spectroscopy to study quantitative gas analysis algorithms. Applying data mining techniques to a multi-component infrared spectroscopic quantitative analysis algorithm showed that a cluster-analysis partial least squares algorithm is clearly more accurate than partial least squares alone. In addition, to reduce the influence of errors in individual calibration samples on model accuracy, cluster analysis was used for data preprocessing; this denoising step was found to improve the analysis accuracy.

  12. A Growing Role for Gender Analysis in Air Pollution Epidemiology

    PubMed Central

    Clougherty, Jane E.

    2010-01-01

    Objective Epidemiologic studies of air pollution effects on respiratory health report significant modification by sex, although results are not uniform. Importantly, it remains unclear whether modifications are attributable to socially derived gendered exposures, to sex-linked physiological differences, or to some interplay thereof. Gender analysis, which aims to disaggregate social from biological differences between males and females, may help to elucidate these possible sources of effect modification. Data sources and data extraction A PubMed literature search was performed in July 2009, using the terms “respiratory” and any of “sex” or “gender” or “men and women” or “boys and girls” and either “PM2.5” (particulate matter ≤ 2.5 μm in aerodynamic diameter) or “NO2” (nitrogen dioxide). I reviewed the identified studies, and others cited therein, to summarize current evidence of effect modification, with attention to authors’ interpretation of observed differences. Owing to broad differences in exposure mixes, outcomes, and analytic techniques, with few studies examining any given combination thereof, meta-analysis was not deemed appropriate at this time. Data synthesis More studies of adults report stronger effects among women, particularly for older persons or where using residential exposure assessment. Studies of children suggest stronger effects among boys in early life and among girls in later childhood. Conclusions The qualitative review describes possible sources of difference in air pollution response between women and men, which may vary by life stage, coexposures, hormonal status, or other factors. The sources of observed effect modifications remain unclear, although gender analytic approaches may help to disentangle gender and sex differences in pollution response. A framework for incorporating gender analysis into environmental epidemiology is offered, along with several potentially useful methods from gender analysis.

  13. Quantitative Expression Analysis in Brassica napus by Northern Blot Analysis and Reverse Transcription-Quantitative PCR in a Complex Experimental Setting

    PubMed Central

    Rumlow, Annekathrin; Keunen, Els; Klein, Jan; Pallmann, Philip; Riemenschneider, Anja; Cuypers, Ann

    2016-01-01

    Analysis of gene expression is one of the major ways to better understand plant reactions to changes in environmental conditions. The comparison of many different factors influencing plant growth challenges the gene expression analysis for specific gene-targeted experiments, especially with regard to the choice of suitable reference genes. The aim of this study is to compare expression results obtained by Northern blot, semi-quantitative PCR and RT-qPCR, and to identify a reliable set of reference genes for oilseed rape (Brassica napus L.) suitable for comparing gene expression under complex experimental conditions. We investigated the influence of several factors such as sulfur deficiency, different time points during the day, varying light conditions, and their interaction on gene expression in oilseed rape plants. The expression of selected reference genes was indeed influenced under these conditions in different ways. Therefore, a recently developed algorithm, called GrayNorm, was applied to validate a set of reference genes for normalizing results obtained by Northern blot analysis. After careful comparison of the three methods mentioned above, Northern blot analysis seems to be a reliable and cost-effective alternative for gene expression analysis under a complex growth regime. To use this method in a quantitative way, a number of reference genes was validated, revealing that for our experiment a set of three references provides an appropriate normalization. Semi-quantitative PCR was prone to many handling errors and was difficult to control, while RT-qPCR was very sensitive to expression fluctuations of the reference genes. PMID:27685087
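    Multi-reference normalization of the kind described above can be sketched as normalizing a target gene's relative quantity by the geometric mean of several reference genes. Note this is the generic geometric-mean arithmetic, not the GrayNorm algorithm itself (which evaluates candidate reference combinations); the Ct values and function name are hypothetical.

```python
import math

def normalize(target_ct, ref_cts, efficiency=2.0):
    """Relative quantity of a target gene, normalized to the geometric
    mean of several reference genes (Ct values; efficiency 2.0 means a
    perfect doubling of product per PCR cycle)."""
    target_q = efficiency ** (-target_ct)
    ref_qs = [efficiency ** (-ct) for ct in ref_cts]
    geo_mean = math.prod(ref_qs) ** (1.0 / len(ref_qs))
    return target_q / geo_mean

# Hypothetical Ct values: one target, three reference genes per sample.
control = normalize(24.0, [20.0, 21.0, 19.5])
treated = normalize(22.5, [20.1, 20.9, 19.6])
print(round(treated / control, 2))  # fold change of the target
```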

  14. An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies

    NASA Technical Reports Server (NTRS)

    Kostiuk, Peter F.; Adams, Milton B.; Allinger, Deborah F.; Rosch, Gene; Kuchar, James

    1998-01-01

    The continuing growth of air traffic will place demands on NASA's Air Traffic Management (ATM) system that cannot be accommodated without the creation of significant delays and economic impacts. To deal with this situation, work has begun to develop new approaches to providing a safe and economical air transportation infrastructure. Many of these emerging air transport technologies will represent radically new approaches to ATM, both for ground and air operations.

  15. Analysis of liver connexin expression using reverse transcription quantitative real-time polymerase chain reaction

    PubMed Central

    Maes, Michaël; Willebrords, Joost; Crespo Yanguas, Sara; Cogliati, Bruno; Vinken, Mathieu

    2016-01-01

    Summary Although connexin production is mainly regulated at the protein level, altered connexin gene expression has been identified as the underlying mechanism of several pathologies. When studying the latter, appropriate methods to quantify connexin mRNA levels are required. The present chapter describes a well-established reverse transcription quantitative real-time polymerase chain reaction procedure optimized for analysis of hepatic connexins. The method includes RNA extraction and subsequent quantification, generation of complementary DNA, quantitative real-time polymerase chain reaction and data analysis. PMID:27207283

  17. Analysis of mixed cell cultures with quantitative digital holographic phase microscopy

    NASA Astrophysics Data System (ADS)

    Kemper, Björn; Wibbeling, Jana; Ketelhut, Steffi

    2014-05-01

    In order to study, for example, the influence of pharmaceuticals or pathogens on different cell types under identical measurement conditions and to analyze interactions between different cellular specimens, a minimally-invasive quantitative observation of mixed cell cultures is of particular interest. Quantitative phase microscopy (QPM) provides high resolution detection of optical path length changes that is suitable for stain-free minimally-invasive live cell analysis. Due to low light intensities for object illumination, QPM minimizes the interaction with the sample and is in particular suitable for long term time-lapse investigations, e.g., for the detection of cell morphology alterations due to drugs and toxins. Furthermore, QPM has been demonstrated to be a versatile tool for the quantification of cellular growth, the extraction of morphological parameters, and cell motility. We studied the feasibility of QPM for the analysis of mixed cell cultures. It was explored whether quantitative phase images provide sufficient information to distinguish between different cell types and to extract cell specific parameters. For the experiments quantitative phase imaging with digital holographic microscopy (DHM) was utilized. Mixed cell cultures with different types of human pancreatic tumor cells were observed with quantitative DHM phase contrast up to 35 h. The obtained series of quantitative phase images were evaluated by adapted algorithms for image segmentation. From the segmented images the cellular dry mass and the mean cell thickness were calculated and used in the further analysis as parameters to quantify the reliability of the measurement principle. The obtained results demonstrate that it is possible to characterize the growth of cell types with different morphologies in a mixed cell culture separately by consideration of specimen size and cell thickness in the evaluation of quantitative DHM phase images.
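    The dry-mass computation mentioned above is commonly done via the specific refractive increment (the Barer relation): dry mass = λ/(2πα) times the integral of the phase over the cell area. The sketch below assumes this standard relation with a commonly cited α of about 0.2 μm³/pg; the synthetic phase image and function name are illustrative.

```python
import numpy as np

def dry_mass_pg(phase, pixel_area_um2, wavelength_um=0.532,
                alpha_um3_per_pg=0.2):
    """Dry mass (pg) from a background-corrected phase image (radians):
    DM = lambda / (2*pi*alpha) * sum(phase) * pixel_area, where alpha is
    the specific refractive increment (~0.2 um^3/pg is a common value)."""
    return (wavelength_um / (2 * np.pi * alpha_um3_per_pg)
            * phase.sum() * pixel_area_um2)

# Synthetic 'cell': Gaussian phase bump, 2 rad peak, on 0.25 um pixels.
yy, xx = np.indices((200, 200))
r2 = (yy - 100) ** 2 + (xx - 100) ** 2
phase = 2.0 * np.exp(-r2 / (2 * 40.0 ** 2))
phase[r2 > 80 ** 2] = 0.0  # zero the background outside the cell
print(round(dry_mass_pg(phase, pixel_area_um2=0.0625), 1))
```

    Mean cell thickness follows similarly from the phase divided by the refractive-index difference between cell and medium.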

  18. Quantitative Computed Tomography Protocols Affect Material Mapping and Quantitative Computed Tomography-Based Finite-Element Analysis Predicted Stiffness.

    PubMed

    Giambini, Hugo; Dragomir-Daescu, Dan; Nassr, Ahmad; Yaszemski, Michael J; Zhao, Chunfeng

    2016-09-01

    Quantitative computed tomography-based finite-element analysis (QCT/FEA) has become increasingly popular in an attempt to understand and possibly reduce vertebral fracture risk. It is known that scan acquisition settings affect the Hounsfield units (HU) of the CT voxels. Material property assignment in QCT/FEA, relating HU to Young's modulus, is performed by applying empirical equations. The purpose of this study was to evaluate the effect of QCT scanning protocols on stiffness values predicted by finite-element models. One fresh-frozen cadaveric torso and a QCT calibration phantom were scanned six times, varying voltage and current, and reconstructed to obtain a total of 12 sets of images. Five vertebrae from the torso were experimentally tested to obtain stiffness values. QCT/FEA models of the five vertebrae were developed for each of the 12 image sets, resulting in a total of 60 models. Predicted stiffness was compared to the experimental values. The highest percent difference in stiffness was approximately 480% (80 kVp, 110 mAs, U70), while the lowest was ∼1% (80 kVp, 110 mAs, U30). There was a clear distinction between reconstruction kernels in predicted outcomes, whereas voltage did not show a clear influence on the results. The potential of QCT/FEA as an improvement over conventional fracture risk prediction tools is well established. However, it is important to establish research protocols that can lead to results that translate to the clinical setting. PMID:27428281
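
The material-mapping step described here is typically a two-stage empirical mapping: a linear phantom calibration from HU to equivalent density, followed by a power-law density-modulus relation. A hedged sketch; the slope, intercept, and power-law coefficients below are generic placeholders, not the values used in the study:

```python
def hu_to_density(hu, slope, intercept):
    """Phantom-derived linear calibration: HU -> equivalent density (g/cm^3).
    Slope and intercept come from regressing known phantom densities on HU,
    which is why they shift when the scanning protocol changes."""
    return slope * hu + intercept

def density_to_modulus_mpa(rho, a=4730.0, b=1.56):
    """Generic power-law relation E = a * rho^b (MPa). The coefficients are
    illustrative placeholders, not this study's empirical equation."""
    return a * rho ** b
```

Because the calibration is re-derived per protocol while the power law is fixed, protocol-induced HU shifts propagate directly into the predicted stiffness, which is the effect the study quantifies.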

  19. Quantitative x-ray phase-contrast imaging of air-assisted water sprays with high Weber numbers

    NASA Astrophysics Data System (ADS)

    Wang, Y. J.; Im, Kyoung-Su; Fezzaa, K.; Lee, W. K.; Wang, Jin; Micheli, P.; Laub, C.

    2006-10-01

    X-ray in-line phase-contrast imaging along with a single-image phase retrieval reconstruction was used to visualize the near-nozzle breakup of optically dense water jets atomized by a high-speed, annular air flow. The influence of the atomizing air on water mass distribution was investigated to reveal the complex air/liquid interactions at various breakup stages. Unlike low-Weber-number jets, the breakup of high-Weber-number jets can occur in the liquid core, which causes sudden decreases in liquid volume fraction.

  20. Quantitative underwater 3D motion analysis using submerged video cameras: accuracy analysis and trajectory reconstruction.

    PubMed

    Silvatti, Amanda P; Cerveri, Pietro; Telles, Thiago; Dias, Fábio A S; Baroni, Guido; Barros, Ricardo M L

    2013-01-01

    In this study we investigate the applicability of underwater 3D motion capture based on submerged video cameras, in terms of 3D accuracy analysis and trajectory reconstruction. Static points with the classical direct linear transform (DLT) solution, a moving wand with bundle adjustment, and a moving 2D plate with Zhang's method were considered for camera calibration. As an example of the final application, we reconstructed hand motion trajectories in different swimming styles and qualitatively compared them with Maglischo's model. Four highly trained male swimmers performed butterfly, breaststroke, and freestyle tasks. The middle fingertip trajectories of both hands in the underwater phase were considered. The accuracy (mean absolute error) of the two calibration approaches (wand: 0.96 mm; 2D plate: 0.73 mm) was comparable to out-of-water results and far superior to the classical DLT results (9.74 mm). Among the swimmers, the hand trajectories of the expert swimmer were almost symmetric and in good agreement with Maglischo's model. The kinematic results highlight symmetry or asymmetry between the two hand sides, intra- and inter-subject variability in the motion patterns, and agreement or disagreement with the model. The two outcomes, calibration results and trajectory reconstruction, both move toward quantitative 3D underwater motion analysis.
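
For a calibrated two-camera setup, the DLT-style reconstruction referred to above reduces to linear triangulation: each image observation contributes two linear constraints on the homogeneous 3D point, which is recovered as the null vector of the stacked system. A minimal sketch with NumPy (the projection matrices are assumed known from whichever calibration method is used):

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear triangulation: recover a 3D point from two 3x4 camera
    projection matrices P1, P2 and corresponding image points x1, x2."""
    # Each observation (x, y) gives two rows of the homogeneous system A X = 0.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector of A with smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]   # dehomogenize
```

With more than two cameras, the same rows are simply stacked for every view before the SVD, which is how multi-camera setups like the one in the study improve accuracy.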

  1. The correlation of contrast-enhanced ultrasound and MRI perfusion quantitative analysis in rabbit VX2 liver cancer.

    PubMed

    Xiang, Zhiming; Liang, Qianwen; Liang, Changhong; Zhong, Guimian

    2014-12-01

    Our objective was to explore the value of contrast-enhanced ultrasound (CEUS) and MRI perfusion quantitative analysis in liver cancer, and the correlation between the two methods. A rabbit VX2 liver cancer model was established in this study. For CEUS, SonoVue was administered via the ear vein to dynamically observe and record blood perfusion and its changes in VX2 liver cancer and the surrounding tissue. MRI perfusion quantitative analysis was used to analyze the mean enhancement time (MTE) and the maximal slope of increase (MSI), which were further compared with the pathological examination results. Quantitative indicators from CEUS and from MRI perfusion analysis were compared, and the correlation between them was assessed by correlation analysis. The rabbit VX2 liver cancer model was successfully established. CEUS showed that the time-intensity curve of rabbit VX2 liver cancer followed a "fast in, fast out" pattern, while MRI perfusion quantitative analysis showed that in tumor tissue the quantitative parameter MTE increased and MSI decreased; the difference was statistically significant (P < 0.01). The diagnostic results of CEUS and MRI perfusion quantitative analysis were not significantly different (P > 0.05), but their quantitative parameters were significantly positively correlated (P < 0.05). Both CEUS and MRI perfusion quantitative analysis can dynamically monitor liver cancer lesions and the surrounding liver parenchyma, and their quantitative parameters are correlated. The combined application of both is of value in the early diagnosis of liver cancer.
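
The correlation analysis between the CEUS and MRI perfusion parameters reported above is, in its simplest form, a Pearson correlation over paired per-lesion measurements; the abstract does not specify the statistical software used, so this is only a sketch of the underlying coefficient:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired measurement lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

A significantly positive r between, say, per-lesion CEUS intensity parameters and MRI MTE values is what the abstract's "significantly positively correlated (P < 0.05)" statement summarizes.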

  2. Analysis of 129I in Groundwater Samples: Direct and Quantitative Results below the Drinking Water Standard

    SciTech Connect

    Brown, Christopher F.; Geiszler, Keith N.; Lindberg, Michael J.

    2007-03-03

    Due to its long half-life (15.7 million years) and relatively unencumbered migration in subsurface environments, 129I has been recognized as a contaminant of concern at numerous federal, private, and international facilities. In order to understand the long-term risk associated with 129I at these locations, quantitative analysis of groundwater samples must be performed. However, quantitatively assessing the 129I content of groundwater samples requires specialized extraction and sophisticated analytical techniques, which are complicated and not always available to the general scientific community. This paper highlights an analytical method capable of directly quantifying 129I in groundwater samples at concentrations below the maximum contaminant level (MCL) without the need for sample pre-concentration. Samples were analyzed on a Perkin Elmer ELAN DRC II ICP-MS after minimal dilution, using O2 as the reaction gas. Analysis of continuing calibration verification standards indicated that the DRC mode could be used for quantitative analysis of 129I in samples below the drinking water standard (0.0057 ng/ml, or 1 pCi/L). The low analytical detection limit of 129I analysis in the DRC mode, coupled with minimal sample dilution (1.02x), resulted in a final sample limit of quantification of 0.0051 ng/ml. Subsequent analysis of three groundwater samples containing 129I yielded fully quantitative results in the DRC mode, and spike recovery analyses performed on all three samples confirmed that the groundwater matrix did not adversely impact the analysis of 129I in the DRC mode. This analytical approach has proven to be a cost-effective, high-throughput technique for the direct, quantitative analysis of 129I in groundwater samples at concentrations below the current MCL.
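
The stated equivalence (1 pCi/L ≈ 0.0057 ng/ml) follows directly from the 15.7-million-year half-life via the specific activity of 129I, and the quoted limit of quantification follows from the 1.02x dilution. A sketch verifying the arithmetic; the 0.005 ng/ml instrument limit is inferred from the quoted figures, not stated in the abstract:

```python
import math

AVOGADRO = 6.022e23          # atoms/mol
SECONDS_PER_YEAR = 3.156e7

def specific_activity_bq_per_g(half_life_years, molar_mass_g):
    """Specific activity A = lambda * N = (ln 2 / t_half) * (N_A / M)."""
    decay_const = math.log(2) / (half_life_years * SECONDS_PER_YEAR)
    return decay_const * AVOGADRO / molar_mass_g

# 129I: t_half = 15.7 My, M = 129 g/mol -> roughly 6.5e6 Bq/g
sa = specific_activity_bq_per_g(15.7e6, 129.0)

# Mass concentration equivalent to the 1 pCi/L drinking water standard
BQ_PER_PCI = 0.037
ng_per_ml_at_mcl = (BQ_PER_PCI / sa) * 1e9 / 1000.0   # -> ~0.0057 ng/ml

# LOQ: hypothetical 0.005 ng/ml instrument limit scaled by the 1.02x dilution
loq_ng_per_ml = 0.005 * 1.02                          # -> 0.0051 ng/ml
```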

  3. A critical analysis of air shower structure functions and size spectrum measurements with the NBU air shower array

    NASA Technical Reports Server (NTRS)

    Chaudhuri, N.; Basak, D. K.

    1985-01-01

    A total of 11,000 showers in the size range 10^4 to 10^6 particles detected so far by the NBU air shower array have been analyzed using five different structure functions. A comparison of the structure functions in terms of (1) shower size and (2) electron density at various core distances is discussed to indicate the present status of structure functions in air shower analysis.

  4. High-throughput automated image analysis of neuroinflammation and neurodegeneration enables quantitative assessment of virus neurovirulence

    PubMed Central

    Maximova, Olga A.; Murphy, Brian R.; Pletnev, Alexander G.

    2010-01-01

    Historically, the safety of live attenuated vaccine candidates against neurotropic viruses was assessed by semi-quantitative analysis of virus-induced histopathology in the central nervous system of monkeys. We have developed a high-throughput automated image analysis (AIA) for the quantitative assessment of virus-induced neuroinflammation and neurodegeneration. Evaluation of the results generated by AIA showed that quantitative estimates of lymphocytic infiltration, microglial activation, and neurodegeneration strongly and significantly correlated with results of traditional histopathological scoring. In addition, we show that AIA is a targeted, objective, accurate, and time-efficient approach that provides reliable differentiation of virus neurovirulence. As such, it may become a useful tool in establishing consistent analytical standards across research and development laboratories and regulatory agencies, and may improve the safety evaluation of live virus vaccines. The implementation of this high-throughput AIA will markedly advance many fields of research including virology, neuroinflammation, neuroscience, and vaccinology. PMID:20688036
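
Automated image analysis of this kind typically quantifies markers such as lymphocytic infiltration by thresholding and computing the fraction of positive pixels per tissue region. The paper's actual AIA pipeline is not specified in the abstract, so the following is a deliberately simplified stand-in for that idea:

```python
def positive_fraction(image, threshold):
    """Fraction of pixels at or above an intensity threshold in a 2D image
    (list of rows); a simplified proxy for per-region staining quantification."""
    pixels = [p for row in image for p in row]
    return sum(p >= threshold for p in pixels) / len(pixels)
```

Aggregating such per-region fractions across sections is what allows the quantitative estimates to be correlated against traditional histopathological scores, as the abstract reports.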

  5. Possibility of quantitative estimation of blood cell forms by the spatial-frequency spectrum analysis

    NASA Astrophysics Data System (ADS)

    Spiridonov, Igor N.; Safonova, Larisa P.; Samorodov, Andrey V.

    2000-05-01

    At present, hematology lacks quantitative estimates of parameters that are important for cell classification: cell form and nuclear form. Because of the absence of correlation between morphological parameters and the parameters measured by hemoanalyzers, neither flow cytometers nor computer recognition systems provide a complete clinical blood analysis. Analysis of the spatial-frequency spectra (SFS) of blood samples (smears and liquid probes) permits these forms to be estimated quantitatively. Based on the results of theoretical and experimental research, an algorithm for quantitative form estimation by means of SFS parameters has been created, and criteria for the quality of these estimates have been proposed. A test bench based on coherent optical and digital processors was assembled. The results could be applied to the automated classification of either normal or pathological blood cells in standard blood smears.

  6. A novel rapid quantitative analysis of drug migration on tablets using laser induced breakdown spectroscopy.

    PubMed

    Yokoyama, Makoto; Tourigny, Martine; Moroshima, Kenji; Suzuki, Junsuke; Sakai, Miyako; Iwamoto, Kiyoshi; Takeuchi, Hirofumi

    2010-11-01

    Until now, there have been few reports in which drug migration from the interior to the surface of a tablet has been analyzed quantitatively. In this paper, we propose a novel, rapid, quantitative analysis of drug migration in tablets using laser-induced breakdown spectroscopy (LIBS). To evaluate drug migration, model tablets containing nicardipine hydrochloride as the active pharmaceutical ingredient (API) were prepared by a conventional wet granulation method. Since the color of this API is pale yellow and all excipients are white, the degree of drug migration in these model tablets can be observed by visual inspection. To prepare tablets with different degrees of drug migration, the temperature of the drying process after tableting was varied between 50 and 80 °C. Using these tablets, visual inspection, Fourier transform (FT)-IR mapping, and LIBS analysis were carried out to evaluate drug migration. While drug migration could be observed with all methods, only LIBS analysis provided a quantitative measure, in which the average LIBS intensity correlated with the degree of drug migration set by the drying temperature. Moreover, we compared the sample preparation, data analysis process, and measurement time of visual inspection, FT-IR mapping, and LIBS analysis. The comparison demonstrated that LIBS analysis is the simplest and fastest method for migration monitoring. From these results, we conclude that LIBS analysis is one of the most useful process analytical technology (PAT) tools for addressing the universal problem of drug migration.

  7. ANSI/ASHRAE/IES Standard 90.1-2010 Final Determination Quantitative Analysis

    SciTech Connect

    Halverson, Mark A.; Rosenberg, Michael I.; Liu, Bing

    2011-10-31

    The U.S. Department of Energy (DOE) conducted a final quantitative analysis to assess whether buildings constructed according to the requirements of the American National Standards Institute (ANSI)/American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE)/Illuminating Engineering Society of North America (IESNA) Standard 90.1-2010 (ASHRAE Standard 90.1-2010, Standard 90.1-2010, or 2010 edition) would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IESNA Standard 90.1-2007 (ASHRAE Standard 90.1-2007, Standard 90.1-2007, or 2007 edition). The final analysis considered each of the 109 addenda to ASHRAE Standard 90.1-2007 that were included in ASHRAE Standard 90.1-2010. All 109 addenda processed by ASHRAE in the creation of Standard 90.1-2010 from Standard 90.1-2007 were reviewed by DOE, and their combined impact on a suite of 16 building prototype models in 15 ASHRAE climate zones was considered. Most addenda were deemed to have little quantifiable impact on building efficiency for the purpose of DOE's final determination. However, of the 109 addenda, 34 were preliminarily determined to have a measurable and quantifiable impact. A suite of 240 computer energy simulations for building prototypes complying with ASHRAE 90.1-2007 was developed. These prototypes were then modified in accordance with these 34 addenda to create a second suite of corresponding building simulations reflecting the same buildings compliant with Standard 90.1-2010. The building simulations were conducted using the DOE EnergyPlus building simulation software. The resulting energy use from the complete suite of 480 simulation runs was then converted to energy use intensity (EUI, or energy use per unit floor area) metrics (Site EUI, Primary EUI, and energy cost intensity [ECI]) results for each simulation. 
For each edition of the standard, these EUIs were then aggregated to a national basis for each prototype using weighting factors based on
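
The conversion of simulation output to EUI metrics and their aggregation to a national figure can be sketched as follows; the weighting scheme shown is a generic construction-weighted average, not the report's actual weighting factors:

```python
def site_eui(annual_site_energy_kbtu, floor_area_ft2):
    """Site energy use intensity: annual site energy per unit floor area
    (kBtu/ft2-yr), computed per prototype/climate-zone simulation run."""
    return annual_site_energy_kbtu / floor_area_ft2

def national_weighted_eui(euis, weights):
    """Aggregate per-prototype/climate-zone EUIs into one national value
    using construction-volume weights (illustrative placeholder weights)."""
    return sum(e * w for e, w in zip(euis, weights)) / sum(weights)
```

Comparing the aggregated value for the 90.1-2010 suite against the 90.1-2007 suite gives the national percentage savings the determination reports.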

  8. Thermal analysis of Perforated Metal Air Transportable Package (PMATP) prototype.

    SciTech Connect

    Oneto, Robert; Levine, Howard; Mould, John; Pierce, Jim Dwight

    2003-08-01

    Sandia National Laboratories (SNL) has designed a crash-resistant container, the Perforated Metal Air Transportable Package (PMATP), capable of surviving a worst-case plane crash, including both impact and subsequent fire, for the air transport of plutonium. This report presents thermal analyses of the full-scale PMATP in its undamaged (pre-test) condition and in bounding post-accident states. The goal of these thermal simulations was to evaluate the performance of the package in a worst-case post-crash fire. The full-scale package is approximately 1.6 m long by 0.8 m diameter. The thermal analyses were performed with the FLEX finite element code. This analysis clearly predicts that the PMATP provides acceptable thermal response characteristics, both for the post-accident fire of a one-hour duration and the after-fire heat-soak condition. All predicted temperatures for the primary containment vessel are well within design limits for safety.

  9. Modeling and Analysis of Aluminum/Air Fuel Cell

    NASA Astrophysics Data System (ADS)

    Leon, Armando J.

    The technical and scientific challenges of providing reliable sources of energy for the US and global economy are enormous, especially when combined with the strategic and economic concerns of the last five years. It is clear that, as part of the mix of energy sources necessary to meet these challenges, fuel cell technology will play a critical or even central role. The US Department of Energy, as well as a number of national laboratories and academic institutions, have been aware of the importance of such technology for some time. Recently, car manufacturers, transportation experts, and even utilities have been paying attention to this vital source of energy for the future. In this thesis, a review of the main fuel cell technologies is presented, with a focus on the modeling and control of one particular and promising technology: the aluminum/air fuel cell. The basic principles of this fuel cell technology are presented. A major part of the study consists of a description of the electrochemistry of the process and the modeling and simulation of an aluminum/air fuel cell using Matlab Simulink(TM). The controller design for the proposed model is also presented. Subsequently, a power management unit is designed and analyzed that incorporates an alternative power source; the system commutes between the fuel cell output and the alternative source to meet a changing load demand. Finally, a cost analysis and assessment of this technology for portable devices, conclusions, and recommendations for future work are presented.

  10. Analysis of Air Showers at the Trigger Threshold of KASCADE

    NASA Astrophysics Data System (ADS)

    Scholz, J.; Antoni, T.; Apel, W. D.; Bekk, K.; Bercuci, A.; Blümer, H.; Bozdog, H.; Brancus, I. M.; Büttner, C.; Chilingarian, A.; Daumiller, K.; Doll, P.; Engel, R.; Engler, J.; Feßler, F.; Gils, H. J.; Glasstetter, R.; Haungs, A.; Heck, D.; Hörandel, J. R.; Iwan, A.; Kampert, K-H.; Klages, H. O.; Maier, G.; Mathes, H. J.; Mayer, H. J.; Milke, J.; Müller, M.; Obenland, R.; Oehschläger, J.; Ostapchenko, S.; Petcu, M.; Rebel, H.; Risse, M.; Roth, M.; Schatz, G.; Schieler, H.; Thouw, T.; Ulrich, H.; van Buren, J.; Vardanyan, A.; Weindl, A.; Wochele, J.; Zabierowski, J.

    2003-07-01

    The KASCADE experiment measures extensive air showers. It is 100% efficient for showers induced by primary particles with energies above 10^15 eV, in pursuit of its main goal, the examination of the knee in the flux spectrum at ≈ 5 · 10^15 eV. A specially adapted method to calculate two observables (Nch, the number of charged particles, and Nµ, the number of muons) by means of a maximum likelihood estimate is presented. The estimate combines different detector systems and already works at energies around the trigger threshold of KASCADE at ≈ 10^14 eV. These observables are used to reconstruct a preliminary energy flux spectrum, which is compared with direct measurements and with previous KASCADE measurements at energies above 10^15 eV. Reconstructing the energy spectrum and elemental composition around the trigger threshold of KASCADE is important for two reasons. First, the estimated spectrum must be congruent with the results of direct measurements at higher energies. Second, it provides a cross-check of the interaction models underlying the analysis of extensive air showers.

  11. Heart-rate monitoring by air pressure and causal analysis

    NASA Astrophysics Data System (ADS)

    Tsuchiya, Naoki; Nakajima, Hiroshi; Hata, Yutaka

    2011-06-01

    Among the many vital signs, heart rate (HR) is an important index for diagnosing a person's health condition. For instance, HR provides early indications of cardiac disease, autonomic nervous system behavior, and so forth. Currently, however, HR is measured by electrocardiograph (ECG) only during medical checkups and clinical diagnosis in the resting state, so serious cardiac events in daily life may be missed. Continuous 24-hour HR monitoring is therefore desired, and for use in daily life the monitoring should be noninvasive and minimally intrusive. In this paper, HR monitoring during sleep using air pressure sensors is proposed. The monitoring is realized through causal analysis between air pressure and HR, with the causality described using fuzzy logic. In an experiment on 7 males aged 22-25 (23 on average), the correlation coefficient against ECG was 0.73-0.97 (0.85 on average). In addition, the cause-effect structure for HR monitoring was rearranged by causal decomposition, and the rearranged causality was applied to HR monitoring in a sitting posture. In an additional experiment on 6 males, the correlation coefficient was 0.66-0.86 (0.76 on average). The proposed method therefore appears to have sufficient accuracy and robustness for some daily use cases.

  12. A Comprehensive Analysis of AIRS Near Surface Air Temperature and Water Vapor Over Land and Tropical Ocean

    NASA Astrophysics Data System (ADS)

    Dang, H. V. T.; Lambrigtsen, B.; Manning, E. M.; Fetzer, E. J.; Wong, S.; Teixeira, J.

    2015-12-01

    Version 6 (V6) of the Atmospheric Infrared Sounder's (AIRS) combined infrared and microwave (IR+MW) retrieval of near surface air temperature (NSAT) and water vapor (NSWV) is validated over the United States with the densely populated MESONET data. MESONET data is a collection of surface/near surface meteorological data from many federal and state agencies. The ones used for this analysis are measured from instruments maintained by the National Weather Service (NWS), the Federal Aviation Administration (FAA), and the Interagency Remote Automatic Weather Stations (RAWS), resulting in a little more than four thousand locations throughout the US. Over the Tropical oceans, NSAT and NSWV are compared to a network of moored buoys from the Tropical Atmosphere Ocean/Triangle Trans-Ocean Buoy Network (TAO/TRITON), and the Pilot Research Moored Array in the Tropical Atlantic (PIRATA). With the analysis of AIRS surface and near surface products over ocean, we glean information on how retrieval of NSAT and NSWV over land can be improved and why it needs some adjustments. We also compare AIRS initial guess of near surface products that are trained on fifty days of ECMWF along with AIRS calibrated radiances, to ECMWF analysis data. The comparison is done to show the differing characteristics of AIRS initial guesses from ECMWF.

  13. Targeted multidimensional gas chromatography for the quantitative analysis of suspected allergens in fragrance products.

    PubMed

    Dunn, Michael S; Vulic, Natalie; Shellie, Robert A; Whitehead, Simon; Morrison, Paul; Marriott, Philip J

    2006-10-13

    provided a quick and effective means to qualitatively determine the presence of six SAs in a commercially available air freshener; however, not all were adequately resolved from matrix components. In contrast, quantitation was straightforward using the targeted MDGC method.

  14. Air-sampled Filter Analysis for Endotoxins and DNA Content.

    PubMed

    Lang-Yona, Naama; Mazar, Yinon; Pardo, Michal; Rudich, Yinon

    2016-01-01

    Outdoor aerosol research commonly uses particulate matter sampled on filters. This procedure enables various characterizations of the collected particles to be performed in parallel. The purpose of the method presented here is to obtain a highly accurate and reliable analysis of the endotoxin and DNA content of bio-aerosols extracted from filters. The extraction of high molecular weight organic molecules, such as lipopolysaccharides, from sampled filters involves shaking the sample in a pyrogen-free water-based medium. The subsequent analysis is based on an enzymatic reaction that can be detected using a turbidimetric measurement. As a result of the high organic content on the sampled filters, the extraction of DNA from the samples is performed using a commercial DNA extraction kit that was originally designed for soils and modified to improve the DNA yield. The detection and quantification of specific microbial species using quantitative polymerase chain reaction (q-PCR) analysis are described and compared with other available methods. PMID:27023725

  15. anNET: a tool for network-embedded thermodynamic analysis of quantitative metabolome data

    PubMed Central

    Zamboni, Nicola; Kümmel, Anne; Heinemann, Matthias

    2008-01-01

    Background Compared to other omics techniques, quantitative metabolomics is still in its infancy. Complex sample preparation and analytical procedures render exact quantification extremely difficult. Furthermore, not only the actual measurement but also the subsequent interpretation of quantitative metabolome data to obtain mechanistic insights still lags behind current expectations. Recently, the method of network-embedded thermodynamic (NET) analysis was introduced to address some of these open issues. Building upon principles of thermodynamics, this method allows a quality check of measured metabolite concentrations and makes it possible to spot metabolic reactions where active regulation potentially controls metabolic flux. So far, however, widespread application of NET analysis in metabolomics labs has been hindered by the absence of suitable software. Results We have developed in Matlab a generalized software tool called 'anNET' that affords a user-friendly implementation of the NET analysis algorithm. anNET supports the analysis of any metabolic network for which a stoichiometric model can be compiled; the model size can span from a single reaction to a complete genome-wide network reconstruction including compartments. anNET can (i) test quantitative data sets for thermodynamic consistency, (ii) predict metabolite concentrations beyond the actually measured data, (iii) identify putative sites of active regulation in the metabolic reaction network, and (iv) help localize errors in data sets that were found to be thermodynamically infeasible. We demonstrate the application of anNET with three published Escherichia coli metabolome data sets. Conclusion Our user-friendly and generalized implementation of the NET analysis method in the software anNET allows users to rapidly integrate quantitative metabolome data obtained from virtually any organism. We envision that use of anNET in labs working on quantitative metabolomics will provide the systems biology and
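
The thermodynamic consistency check at the heart of NET analysis rests on the requirement that every reaction carrying forward flux satisfy ΔrG' = ΔrG'° + RT Σ νi ln ci < 0. A minimal sketch of that per-reaction feasibility test; the standard Gibbs energies and concentrations are illustrative inputs, not anNET's actual data handling:

```python
import math

R_KJ = 8.314e-3   # gas constant, kJ/(mol*K)

def reaction_dG(dG0_prime, stoich, conc, T=298.15):
    """Transformed Gibbs energy of reaction (kJ/mol) at given concentrations.
    stoich: metabolite -> stoichiometric coefficient (negative for substrates).
    conc:   metabolite -> concentration (mol/L)."""
    return dG0_prime + R_KJ * T * sum(
        n * math.log(conc[m]) for m, n in stoich.items())

def feasible(dG):
    """A forward flux is thermodynamically feasible only if dG < 0."""
    return dG < 0
```

NET analysis generalizes this to whole networks: it searches for concentration assignments, consistent with the measured data, under which every flux-carrying reaction passes this test, flagging the data set as infeasible when none exist.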

  16. Quantitative analysis of bristle number in Drosophila mutants identifies genes involved in neural development

    NASA Technical Reports Server (NTRS)

    Norga, Koenraad K.; Gurganus, Marjorie C.; Dilda, Christy L.; Yamamoto, Akihiko; Lyman, Richard F.; Patel, Prajal H.; Rubin, Gerald M.; Hoskins, Roger A.; Mackay, Trudy F.; Bellen, Hugo J.

    2003-01-01

    BACKGROUND: The identification of the function of all genes that contribute to specific biological processes and complex traits is one of the major challenges in the postgenomic era. One approach is to employ forward genetic screens in genetically tractable model organisms. In Drosophila melanogaster, P element-mediated insertional mutagenesis is a versatile tool for the dissection of molecular pathways, and there is an ongoing effort to tag every gene with a P element insertion. However, the vast majority of P element insertion lines are viable and fertile as homozygotes and do not exhibit obvious phenotypic defects, perhaps because of the tendency for P elements to insert 5' of transcription units. Quantitative genetic analysis of subtle effects of P element mutations that have been induced in an isogenic background may be a highly efficient method for functional genome annotation. RESULTS: Here, we have tested the efficacy of this strategy by assessing the extent to which screening for quantitative effects of P elements on sensory bristle number can identify genes affecting neural development. We find that such quantitative screens uncover an unusually large number of genes that are known to function in neural development, as well as genes with yet uncharacterized effects on neural development, and novel loci. CONCLUSIONS: Our findings establish the use of quantitative trait analysis for functional genome annotation through forward genetics. Similar analyses of quantitative effects of P element insertions will facilitate our understanding of the genes affecting many other complex traits in Drosophila.

  17. Quantitative Intersectionality: A Critical Race Analysis of the Chicana/o Educational Pipeline

    ERIC Educational Resources Information Center

    Covarrubias, Alejandro

    2011-01-01

    Utilizing the critical race framework of intersectionality, this research reexamines the Chicana/o educational pipeline through a quantitative intersectional analysis. This approach disaggregates data along the intersection of race, class, gender, and citizenship status to provide a detailed portrait of the educational trajectory of Mexican-origin…

  18. Qualitative and quantitative analysis of mixtures of compounds containing both hydrogen and deuterium

    NASA Technical Reports Server (NTRS)

    Crespi, H. L.; Harkness, L.; Katz, J. J.; Norman, G.; Saur, W.

    1969-01-01

    Method allows qualitative and quantitative analysis of mixtures of partially deuterated compounds. Nuclear magnetic resonance spectroscopy determines location and amount of deuterium in organic compounds but not fully deuterated compounds. Mass spectroscopy can detect fully deuterated species but not the location.

  19. A Computer Program for Calculation of Calibration Curves for Quantitative X-Ray Diffraction Analysis.

    ERIC Educational Resources Information Center

    Blanchard, Frank N.

    1980-01-01

    Describes a FORTRAN IV program written to supplement a laboratory exercise dealing with quantitative x-ray diffraction analysis of mixtures of polycrystalline phases in an introductory course in x-ray diffraction. Gives an example of the use of the program and compares calculated and observed calibration data. (Author/GS)

  20. A Quantitative Features Analysis of Recommended No- and Low-Cost Preschool E-Books

    ERIC Educational Resources Information Center

    Parette, Howard P.; Blum, Craig; Luthin, Katie

    2015-01-01

    In recent years, recommended e-books have drawn increasing attention from early childhood education professionals. This study applied a quantitative descriptive features analysis of cost (n = 70) and no-cost (n = 60) e-books recommended by the Texas Computer Education Association. While t tests revealed no statistically significant differences…

  1. A Quantitative Analysis of Cognitive Strategy Usage in the Marking of Two GCSE Examinations

    ERIC Educational Resources Information Center

    Suto, W. M. Irenka; Greatorex, Jackie

    2008-01-01

    Diverse strategies for marking GCSE examinations have been identified, ranging from simple automatic judgements to complex cognitive operations requiring considerable expertise. However, little is known about patterns of strategy usage or how such information could be utilised by examiners. We conducted a quantitative analysis of previous verbal…

  2. Quantitative and Qualitative Analysis of Nutrition and Food Safety Information in School Science Textbooks of India

    ERIC Educational Resources Information Center

    Subba Rao, G. M.; Vijayapushapm, T.; Venkaiah, K.; Pavarala, V.

    2012-01-01

    Objective: To assess quantity and quality of nutrition and food safety information in science textbooks prescribed by the Central Board of Secondary Education (CBSE), India for grades I through X. Design: Content analysis. Methods: A coding scheme was developed for quantitative and qualitative analyses. Two investigators independently coded the…

  3. ANSI/ASHRAE/IES Standard 90.1-2013 Preliminary Determination: Quantitative Analysis

    SciTech Connect

    Halverson, Mark A.; Rosenberg, Michael I.; Wang, Weimin; Zhang, Jian; Mendon, Vrushali V.; Athalye, Rahul A.; Xie, YuLong; Hart, Reid; Goel, Supriya

    2014-03-01

    This report provides a preliminary quantitative analysis to assess whether buildings constructed according to the requirements of ANSI/ASHRAE/IES Standard 90.1-2013 would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IES Standard 90.1-2010.

  4. Gas chromatograph-mass spectrometer (GC/MS) system for quantitative analysis of reactive chemical compounds

    DOEpatents

    Grindstaff, Quirinus G.

    1992-01-01

    Described is a new gas chromatograph-mass spectrometer (GC/MS) system and method for quantitative analysis of reactive chemical compounds. All components of such a GC/MS system external to the oven of the gas chromatograph are programmably temperature controlled to operate at a volatilization temperature specific to the compound(s) sought to be separated and measured.

  5. Quantitative Analysis of Science and Chemistry Textbooks for Indicators of Reform: A Complementary Perspective

    ERIC Educational Resources Information Center

    Kahveci, Ajda

    2010-01-01

    In this study, multiple thematically based and quantitative analysis procedures were utilized to explore the effectiveness of Turkish chemistry and science textbooks in terms of their reflection of reform. The themes gender equity, questioning level, science vocabulary load, and readability level provided the conceptual framework for the analyses.…

  6. A Quantitative Categorical Analysis of Metadata Elements in Image-Applicable Metadata Schemas.

    ERIC Educational Resources Information Center

    Greenberg, Jane

    2001-01-01

    Reports on a quantitative categorical analysis of metadata elements in the Dublin Core, VRA (Visual Resource Association) Core, REACH (Record Export for Art and Cultural Heritage), and EAD (Encoded Archival Description) metadata schemas, all of which can be used for organizing and describing images. Introduces a new schema comparison methodology…

  7. Forty Years of the "Journal of Librarianship and Information Science": A Quantitative Analysis, Part I

    ERIC Educational Resources Information Center

    Furner, Jonathan

    2009-01-01

    This paper reports on the first part of a two-part quantitative analysis of volume 1-40 (1969-2008) of the "Journal of Librarianship and Information Science" (formerly the "Journal of Librarianship"). It provides an overview of the current state of LIS research journal publishing in the UK; a review of the publication and printing history of…

  8. Whose American Government? A Quantitative Analysis of Gender and Authorship in American Politics Texts

    ERIC Educational Resources Information Center

    Cassese, Erin C.; Bos, Angela L.; Schneider, Monica C.

    2014-01-01

    American government textbooks signal to students the kinds of topics that are important and, by omission, the kinds of topics that are not important to the discipline of political science. This article examines portrayals of women in introductory American politics textbooks through a quantitative content analysis of 22 widely used texts. We find…

  9. QUANTITATIVE PCR ANALYSIS OF MOLDS IN THE DUST FROM HOMES OF ASTHMATIC CHILDREN IN NORTH CAROLINA

    EPA Science Inventory

    The vacuum bag (VB) dust was analyzed by mold-specific quantitative PCR. These results were compared to the Environmental Relative Moldiness Index (ERMI) values calculated for each of the homes. The mean and standard deviation (SD) of the ERMI values in the homes of the NC asthmatic children was 16.4 (6.77), compa...

  10. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    ERIC Educational Resources Information Center

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

  11. Teaching Fundamental Skills in Microsoft Excel to First-Year Students in Quantitative Analysis

    ERIC Educational Resources Information Center

    Rubin, Samuel J.; Abrams, Binyomin

    2015-01-01

    Despite their technological savvy, most students entering university lack the necessary computer skills to succeed in a quantitative analysis course, in which they are often expected to input, analyze, and plot results of experiments without any previous formal education in Microsoft Excel or similar programs. This lack of formal education results…

  12. A Colorimetric Analysis Experiment Not Requiring a Spectrophotometer: Quantitative Determination of Albumin in Powdered Egg White

    ERIC Educational Resources Information Center

    Charlton, Amanda K.; Sevcik, Richard S.; Tucker, Dorie A.; Schultz, Linda D.

    2007-01-01

    A general science experiment for high school chemistry students might serve as an excellent review of the concepts of solution preparation, solubility, pH, and qualitative and quantitative analysis of a common food product. The students could learn to use safe laboratory techniques, collect and analyze data using proper scientific methodology and…

  13. Quantitative Analysis of Organic Compounds: A Simple and Rapid Method for Use in Schools

    ERIC Educational Resources Information Center

    Schmidt, Hans-Jurgen

    1973-01-01

    Describes the procedure for making a quantitative analysis of organic compounds suitable for secondary school chemistry classes. Using the Schoniger procedure, the organic compound, such as PVC, is decomposed in a conical flask with oxygen. The products are absorbed in a suitable liquid and analyzed by titration. (JR)

  14. BACT analysis under the Clean Air Act's PSD program

    SciTech Connect

    Simms, P.; Walke, J.

    2006-11-15

    Before a company may build a new major industrial source of air pollution, or make modifications to an existing major source, in the USA it must apply for and receive a Clean Air Act (CAA) Prevention of Significant Deterioration (PSD) permit. State environmental agencies typically issue such permits, either under state law or by exercising delegated authority to implement the federal PSD program. To fully comply with the CAA, the emissions limits identified as BACT must incorporate consideration of more than just add-on emissions control technology; they must also reflect appropriate consideration of fuel quality (e.g. low-sulfur coal) and process changes (e.g. advanced combustion techniques) as means of controlling emissions, and must consider the other environmental and public welfare benefits of the identified emissions control options. Several states, including New Mexico and Illinois, have already determined that innovative technologies, such as Integrated Gasification Combined Cycle (IGCC), must be considered in connection with the BACT analysis for new coal-fired power plants. Even the notion that BACT is categorically limited in scope to the general type of facility proposed is contrary to EPA precedent. For example, the Environmental Appeals Board (EAB) has explained that permitting authorities retain the discretion under the definition of BACT to require dramatically different facility designs (e.g. a natural gas plant instead of a coal-fired power plant). The best advice for any permit applicant is to include in the BACT analysis a careful and honest examination of better-performing alternative processes and/or innovative combustion techniques and to aggressively pursue such options wherever feasible. 17 refs.

  15. Quantitative analysis of ciliary beating in primary ciliary dyskinesia: a pilot study

    PubMed Central

    2012-01-01

    Background Primary ciliary dyskinesia (PCD) is a rare congenital respiratory disorder characterized by abnormal ciliary motility leading to chronic airway infections. Qualitative evaluation of ciliary beat pattern based on digital high-speed videomicroscopy analysis has been proposed in the diagnosis process of PCD. Although this evaluation is easy in typical cases, it becomes difficult when ciliary beating is partially maintained. We postulated that a quantitative analysis of beat pattern would improve PCD diagnosis. We compared quantitative parameters with the qualitative evaluation of ciliary beat pattern in patients in whom the diagnosis of PCD was confirmed or excluded. Methods Nasal nitric oxide measurement, nasal brushings and biopsies were performed prospectively in 34 patients with suspected PCD. In combination with qualitative analysis, 12 quantitative parameters of ciliary beat pattern were determined on high-speed videomicroscopy recordings of beating ciliated edges. The combination of ciliary ultrastructural abnormalities on transmission electron microscopy analysis with low nasal nitric oxide levels was the “gold standard” used to establish the diagnosis of PCD. Results This “gold standard” excluded PCD in 15 patients (non-PCD patients), confirmed PCD in 10 patients (PCD patients) and was inconclusive in 9 patients. Among the 12 parameters, the distance traveled by the cilium tip weighted by the percentage of beating ciliated edges presented 96% sensitivity and 95% specificity. Qualitative evaluation and quantitative analysis were concordant in non-PCD patients. In 9/10 PCD patients, quantitative analysis was concordant with the “gold standard”, while the qualitative evaluation was discordant with the “gold standard” in 3/10 cases. Among the patients with an inconclusive “gold standard”, the use of quantitative parameters supported PCD diagnosis in 4/9 patients (confirmed by the identification of disease-causing mutations in one

  16. Qualitative and quantitative proteomic analysis of formalin-fixed paraffin-embedded (FFPE) tissue.

    PubMed

    Azimzadeh, Omid; Atkinson, Michael J; Tapio, Soile

    2015-01-01

    Formalin-fixed, paraffin-embedded (FFPE) tissue has recently gained interest as an alternative to fresh/frozen tissue for retrospective protein biomarker discovery. However, during the formalin fixation proteins undergo degradation and cross-linking, making conventional protein analysis technologies challenging. Cross-linking is even more challenging when quantitative proteome analysis of FFPE tissue is planned. The use of conventional protein labeling technologies on FFPE tissue has turned out to be problematic as the lysine residue labeling targets are frequently blocked by the formalin treatment. We have established a qualitative and quantitative proteomics analysis technique for FFPE tissues that combines label-free proteomic analysis with optimized protein extraction and separation conditions.

  17. DEVELOPMENT AND ANALYSIS OF AIR QUALITY MODELING SIMULATIONS FOR HAZARDOUS AIR POLLUTANTS

    EPA Science Inventory

    The concentrations of five hazardous air pollutants were simulated using the Community Multi Scale Air Quality (CMAQ) modeling system. Annual simulations were performed over the continental United States for the entire year of 2001 to support human exposure estimates. Results a...

  18. Control of asthma triggers in indoor air with air cleaners: a modeling analysis

    PubMed Central

    Myatt, Theodore A; Minegishi, Taeko; Allen, Joseph G; MacIntosh, David L

    2008-01-01

    Background Reducing exposure to environmental agents indoors shown to increase asthma symptoms or lead to asthma exacerbations is an important component of a strategy to manage asthma for individuals. Numerous investigations have demonstrated that portable air cleaning devices can reduce concentrations of asthma triggers in indoor air; however, their benefits for breathing problems have not always been reproducible. The potential exposure benefits of whole house high efficiency in-duct air cleaners for sensitive subpopulations have yet to be evaluated. Methods We used an indoor air quality modeling system (CONTAM) developed by NIST to examine peak and time-integrated concentrations of common asthma triggers present in indoor air over a year as a function of natural ventilation, portable air cleaners, and forced air ventilation equipped with conventional and high efficiency filtration systems. Emission rates for asthma triggers were based on experimental studies published in the scientific literature. Results Forced air systems with high efficiency filtration were found to provide the best control of asthma triggers: 30–55% lower cat allergen levels, 90–99% lower risk of respiratory infection through the inhalation route of exposure, 90–98% lower environmental tobacco smoke (ETS) levels, and 50–75% lower fungal spore levels than the other ventilation/filtration systems considered. These results indicate that the use of high efficiency in-duct air cleaners provides an effective means of controlling allergen levels not only in a single room, as a portable air cleaner does, but throughout the whole house. Conclusion These findings are useful for evaluating potential benefits of high efficiency in-duct filtration systems for controlling exposure to asthma triggers indoors and for the design of trials of environmental interventions intended to evaluate their utility in practice. PMID:18684328
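    The comparison above can be illustrated with a simplified single-zone mass-balance model, in which filtration adds to the ventilation loss rate. This is a hypothetical sketch with made-up numbers, not the NIST CONTAM multizone model actually used in the study.

```python
# Simplified well-mixed single-zone model of an indoor pollutant,
# illustrating how in-duct filtration lowers steady-state concentrations.
# All parameter values below are illustrative assumptions.

def steady_state_concentration(emission_rate, volume, ach, cadr):
    """Steady-state concentration (e.g. ug/m^3) in a well-mixed room.

    emission_rate : source strength (ug/h)
    volume        : room volume (m^3)
    ach           : air changes per hour from ventilation (1/h)
    cadr          : clean-air delivery rate of the filter (m^3/h)
    """
    # Total loss rate = ventilation + filtration; C_ss = E / (V * (ACH + CADR/V))
    loss_rate = ach + cadr / volume  # 1/h
    return emission_rate / (volume * loss_rate)

# Example: 100 ug/h source in a 250 m^3 house with 0.5 ACH ventilation
no_filter = steady_state_concentration(100, 250, 0.5, 0)
in_duct = steady_state_concentration(100, 250, 0.5, 250)  # one extra ACH-equivalent
print(round(no_filter, 3))  # 0.8
print(round(in_duct, 3))    # 0.267
```

    Doubling the effective loss rate cuts the steady-state level proportionally, which is why whole-house filtration can outperform a portable unit serving a single room.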

  19. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    PubMed Central

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. Conclusions The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil. PMID:25182282

  20. Quantitative analysis of the mixtures of illicit drugs using terahertz time-domain spectroscopy

    NASA Astrophysics Data System (ADS)

    Jiang, Dejun; Zhao, Shusen; Shen, Jingling

    2008-03-01

    A method was proposed to quantitatively inspect mixtures of illicit drugs with the terahertz time-domain spectroscopy technique. The mass percentages of all components in a mixture can be obtained by linear regression analysis, on the assumption that all components in the mixture and their absorption features are known. Because illicit drugs are scarce and expensive, we first used common chemicals (benzophenone, anthraquinone, pyridoxine hydrochloride, and L-ascorbic acid) in the experiment. Then an illicit drug and a common adulterant, methamphetamine and flour, were selected for our experiment. Experimental results were in good agreement with the actual content, which suggests that this could be an effective method for the quantitative identification of illicit drugs.
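    The linear-regression step described above can be sketched as follows: if the pure-component absorption spectra are known, the mixture spectrum is fitted as their weighted sum by least squares. The spectra below are synthetic, not the authors' measured data.

```python
# Minimal sketch of spectral unmixing by linear least squares, assuming the
# absorption spectra of the pure components are known at each frequency.

def unmix_two(component_a, component_b, mixture):
    """Least-squares fit mixture ~ x*A + y*B via the 2x2 normal equations."""
    saa = sum(a * a for a in component_a)
    sbb = sum(b * b for b in component_b)
    sab = sum(a * b for a, b in zip(component_a, component_b))
    sam = sum(a * m for a, m in zip(component_a, mixture))
    sbm = sum(b * m for b, m in zip(component_b, mixture))
    det = saa * sbb - sab * sab
    x = (sam * sbb - sbm * sab) / det
    y = (sbm * saa - sam * sab) / det
    return x, y

# Synthetic check: mixture is 70% A + 30% B sampled at four THz frequencies
A = [0.9, 0.2, 0.1, 0.4]
B = [0.1, 0.8, 0.3, 0.2]
mix = [0.7 * a + 0.3 * b for a, b in zip(A, B)]
x, y = unmix_two(A, B, mix)
print(round(x, 3), round(y, 3))  # 0.7 0.3
```

    Real data would include noise and possibly more components, in which case the same normal-equation approach generalizes to a matrix least-squares solve.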

  1. The other half of the story: effect size analysis in quantitative research.

    PubMed

    Maher, Jessica Middlemis; Markey, Jonathan C; Ebert-May, Diane

    2013-01-01

    Statistical significance testing is the cornerstone of quantitative research, but studies that fail to report measures of effect size are potentially missing a robust part of the analysis. We provide a rationale for why effect size measures should be included in quantitative discipline-based education research. Examples from both biological and educational research demonstrate the utility of effect size for evaluating practical significance. We also provide details about some effect size indices that are paired with common statistical significance tests used in educational research and offer general suggestions for interpreting effect size measures. Finally, we discuss some inherent limitations of effect size measures and provide further recommendations about reporting confidence intervals.
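    One of the effect-size indices commonly paired with a two-sample t test is Cohen's d, computed from the mean difference and a pooled standard deviation. The data below are illustrative, not drawn from the article.

```python
import math

# Cohen's d with a pooled standard deviation, the effect-size index
# conventionally paired with a two-sample t test.

def cohens_d(sample1, sample2):
    n1, n2 = len(sample1), len(sample2)
    m1 = sum(sample1) / n1
    m2 = sum(sample2) / n2
    v1 = sum((x - m1) ** 2 for x in sample1) / (n1 - 1)  # sample variances
    v2 = sum((x - m2) ** 2 for x in sample2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Illustrative example: exam scores under two teaching methods
treatment = [78, 85, 90, 82, 88]
control = [70, 75, 80, 72, 78]
d = cohens_d(treatment, control)
print(round(d, 2))  # 2.15, a large effect by Cohen's conventions (d > 0.8)
```

    Reporting d alongside the p-value conveys the practical magnitude of a difference, which significance testing alone does not.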

  2. Tannin structural elucidation and quantitative ³¹P NMR analysis. 2. Hydrolyzable tannins and proanthocyanidins.

    PubMed

    Melone, Federica; Saladino, Raffaele; Lange, Heiko; Crestini, Claudia

    2013-10-01

    An unprecedented analytical method that allows simultaneous structural and quantitative characterization of all functional groups present in tannins is reported. In situ labeling of all labile H groups (aliphatic and phenolic hydroxyls and carboxylic acids) with a phosphorus-containing reagent (Cl-TMDP) followed by quantitative ³¹P NMR acquisition constitutes a novel fast and reliable analytical tool for the analysis of tannins and proanthocyanidins with significant implications for the fields of food and feed analyses, tannery, and the development of natural polyphenolics containing products. PMID:23998855

  3. Immunochemical approach to indoor aeroallergen quantitation with a new volumetric air sampler: studies with mite, roach, cat, mouse, and guinea pig antigens

    SciTech Connect

    Swanson, M.C.; Agarwal, M.K.; Reed, C.E.

    1985-11-01

    We describe a new high-volume air sampler for determining antigen concentrations in homes and illustrate its use for quantitating airborne house dust mite, cat, cockroach, mouse, and guinea pig antigens. The concentration of house dust-mite antigen was similar from houses in Rochester, Minn. and tenement apartments in Harlem, N. Y., but cockroach and mouse urinary proteins were present only in Harlem. The amount of cat or guinea pig antigen varied as expected with the number of pets in the home. In calm air the airborne concentration of mite and cat antigen was similar throughout the house but increased greatly in a bedroom when bedding was changed. In calm air most of the cat and mite antigens were associated with respirable particles less than 5 microns mean aerodynamic mass diameter, but in air sampled after the bedding was changed, more cat antigen was found in particles greater than 5 microns. The apparatus and technique described can provide objective data concerning the magnitude and the relative distribution and duration of suspended particles of defined sizes, which contain allergen activity.

  4. Advances in liquid chromatography-high-resolution mass spectrometry for quantitative and qualitative environmental analysis.

    PubMed

    Aceña, Jaume; Stampachiacchiere, Serena; Pérez, Sandra; Barceló, Damià

    2015-08-01

    This review summarizes the advances in environmental analysis by liquid chromatography-high-resolution mass spectrometry (LC-HRMS) during the last decade and discusses different aspects of their application. LC-HRMS has become a powerful tool for simultaneous quantitative and qualitative analysis of organic pollutants, enabling their quantitation and the search for metabolites and transformation products or the detection of unknown compounds. LC-HRMS provides more information than low-resolution (LR) MS for each sample because it can accurately determine the mass of the molecular ion and its fragment ions if it can be used for MS-MS. Another advantage is that the data can be processed using either target analysis, suspect screening, retrospective analysis, or non-target screening. With the growing popularity and acceptance of HRMS analysis, current guidelines for compound confirmation need to be revised for quantitative and qualitative purposes. Furthermore, new commercial software and user-built libraries are required to mine data in an efficient and comprehensive way. The scope of this critical review is not to provide a comprehensive overview of the many studies performed with LC-HRMS in the field of environmental analysis, but to reveal its advantages and limitations using different workflows. PMID:26138893

  6. Application of BP Neural Network Based on Genetic Algorithm in Quantitative Analysis of Mixed GAS

    NASA Astrophysics Data System (ADS)

    Chen, Hongyan; Liu, Wenzhen; Qu, Jian; Zhang, Bing; Li, Zhibin

    Addressing the problem of mixed-gas detection with neural networks, and based on the principles of gas detection, a quantitative analysis system for mixed gas was designed by combining a BP network trained with a genetic algorithm with hybrid gas sensors. The local minimum in network learning is the main factor limiting the precision of gas analysis. After improving the learning algorithm on this basis, analyses and tests for CO, CO2 and HC compounds were carried out. The results showed that the above measures effectively improve the accuracy of the neural network for gas analysis.
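    The genetic-algorithm component can be sketched as a population-based search over model weights, which avoids the local minima that plague gradient-only training. The toy model, data, and hyperparameters below are entirely synthetic; the paper's actual network is not reproduced here.

```python
import random

# Toy sketch of a genetic algorithm evolving the weights of a tiny model
# that maps two sensor readings to a gas concentration. Synthetic data:
# the assumed "true" relation is c = 2*s1 + 0.5*s2.

random.seed(0)

data = [((s1, s2), 2.0 * s1 + 0.5 * s2)
        for s1 in (0.1, 0.5, 0.9) for s2 in (0.2, 0.6)]

def mse(w):
    """Mean squared error of the candidate weights on the synthetic data."""
    return sum((w[0] * s1 + w[1] * s2 - c) ** 2
               for (s1, s2), c in data) / len(data)

def evolve(pop_size=40, generations=60):
    pop = [[random.uniform(-5, 5), random.uniform(-5, 5)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=mse)
        survivors = pop[: pop_size // 2]  # selection (elitist: survivors kept)
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = random.sample(survivors, 2)
            child = [(a + b) / 2 for a, b in zip(p1, p2)]      # crossover
            child = [w + random.gauss(0, 0.1) for w in child]  # mutation
            children.append(child)
        pop = survivors + children
    return min(pop, key=mse)

best = evolve()
print([round(w, 2) for w in best])  # converges near the true weights 2.0, 0.5
```

    In the paper's scheme the weights found this way would then seed BP (gradient) training, combining the global search of the GA with the fast local convergence of backpropagation.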

  7. Development of a combined air sampling and quantitative real-time PCR method for detection of Legionella spp.

    PubMed

    Sirigul, Chomrach; Wongwit, Waranya; Phanprasit, Wantanee; Paveenkittiporn, Wantana; Blacksell, Stuart D; Ramasoota, Pongrama

    2006-05-01

    The objective of this study was to develop and optimize a combined method of air sampling and real-time polymerase chain reaction (real-time PCR) for quantifying aerosol Legionella spp. Primers and a TaqMan hydrolysis probe based on the 5S rRNA gene specific for Legionella spp. were used to amplify a specific DNA product of 84 bp. An impinger air sampler plus a T-100 sampling pump was used to collect aerosol Legionella, and as little as 10 fg of Legionella DNA per reaction could be detected. Preliminary studies demonstrated that the developed method could detect aerosol Legionella spp. at 1.5-185 organisms/500 l of air within 5 hours, in contrast to the culture method, which requires a minimum of 7-10 days. PMID:17120970

  8. Quantitative Analysis of Pork and Chicken Products by Droplet Digital PCR

    PubMed Central

    Cai, Yicun; Li, Xiang; Lv, Rong; Yang, Jielin; Li, Jian; He, Yuping; Pan, Liangwen

    2014-01-01

    In this project, a highly precise quantitative method based on the digital polymerase chain reaction (dPCR) technique was developed to determine the weight of pork and chicken in meat products. Real-time quantitative polymerase chain reaction (qPCR) is currently used for quantitative molecular analysis of the presence of species-specific DNAs in meat products. However, qPCR is limited by its amplification efficiency and its reliance on standard curves based on Ct values when detecting and quantifying low-copy-number target DNA, as in some complex mixture meat products. By using the dPCR method, we found that the relationships between the raw meat weight and DNA weight, and between the DNA weight and DNA copy number, were both close to linear. This enabled us to establish formulae to calculate the raw meat weight based on the DNA copy number. The accuracy and applicability of this method were tested and verified using samples of pork and chicken powder mixed in known proportions. Quantitative analysis indicated that dPCR is highly precise in quantifying pork and chicken in meat products and therefore has the potential to be used in routine analysis by government regulators and quality control departments of commercial food and feed enterprises. PMID:25243184
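    The chained linear relationships described above (copy number to DNA weight to raw meat weight) reduce to two division steps. The calibration constants below are made-up illustrative values, not the paper's fitted coefficients.

```python
# Hypothetical sketch of the two linear calibrations: dPCR copy number
# -> DNA weight -> raw meat weight. Both slopes are assumed constants
# for illustration only.

COPIES_PER_NG_DNA = 300.0   # species-specific target copies per ng DNA (assumed)
NG_DNA_PER_MG_MEAT = 2.5    # ng extractable DNA per mg raw meat (assumed)

def meat_weight_mg(copy_number):
    """Estimate raw meat weight (mg) from a dPCR copy-number measurement."""
    dna_ng = copy_number / COPIES_PER_NG_DNA
    return dna_ng / NG_DNA_PER_MG_MEAT

# Example: 7500 measured copies -> 25 ng DNA -> 10 mg meat
print(meat_weight_mg(7500))  # 10.0
```

    Because dPCR counts target molecules absolutely, no standard curve of Ct values is needed; only the two linear calibration constants must be determined experimentally per species.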

  9. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    PubMed Central

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    Abstract New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  10. [Application of calibration curve method and partial least squares regression analysis to quantitative analysis of nephrite samples using XRF].

    PubMed

    Liu, Song; Su, Bo-min; Li, Qing-hui; Gan, Fu-xi

    2015-01-01

    The authors sought a method for quantitative analysis using pXRF without solid bulk stone/jade reference samples. Twenty-four nephrite samples were selected: 17 were calibration samples and the other 7 were test samples. All the nephrite samples were analyzed quantitatively by proton-induced X-ray emission (PIXE) spectroscopy. Based on the PIXE results of the calibration samples, calibration curves were created for the components/elements of interest and used to analyze the test samples quantitatively; then, qualitative spectra of all nephrite samples were obtained by pXRF. According to the PIXE results and the qualitative spectra of the calibration samples, the partial least squares (PLS) method was used for quantitative analysis of the test samples. Finally, the results for the test samples obtained by the calibration curve method, the PLS method, and PIXE were compared to each other, and the accuracy of the calibration curve method and the PLS method was estimated. The results indicate that the PLS method is a viable alternative for quantitative analysis of stone/jade samples.
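    The calibration-curve step amounts to fitting a line of pXRF peak intensity against the PIXE reference concentrations and then evaluating that line for test samples. The intensities and concentrations below are synthetic; the real nephrite data are not reproduced here.

```python
# Sketch of a univariate calibration curve: ordinary least-squares fit of
# concentration against instrument response, then prediction for unknowns.
# All numbers are synthetic illustrations.

def fit_line(x, y):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Calibration set: pXRF peak intensity (counts) vs. PIXE concentration (wt%)
intensity = [120.0, 240.0, 360.0, 480.0]
concentration = [1.0, 2.0, 3.0, 4.0]
slope, intercept = fit_line(intensity, concentration)

def predict(counts):
    """Concentration estimate for a test-sample intensity reading."""
    return slope * counts + intercept

print(round(predict(300.0), 3))  # 2.5
```

    The PLS variant extends this idea to many correlated spectral channels at once, which is why it can outperform single-peak calibration on complex matrices.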

  11. Quantitative LC-MS/MS Glycomic Analysis of Biological Samples Using AminoxyTMT.

    PubMed

    Zhou, Shiyue; Hu, Yunli; Veillon, Lucas; Snovida, Sergei I; Rogers, John C; Saba, Julian; Mechref, Yehia

    2016-08-01

    Protein glycosylation plays an important role in various biological processes, such as modification of protein function, regulation of protein-protein interactions, and control of turnover rates of proteins. Moreover, glycans have been considered as potential biomarkers for many mammalian diseases and development of aberrant glycosylation profiles is an important indicator of the pathology of a disease or cancer. Hence, quantitation is an important aspect of a comprehensive glycomics study. Although numerous MS-based quantitation strategies have been developed in the past several decades, some issues affecting sensitivity and accuracy of quantitation still exist, and the development of more effective quantitation strategies is still required. Aminoxy tandem mass tag (aminoxyTMT) reagents are recently commercialized isobaric tags which enable relative quantitation of up to six different glycan samples simultaneously. In this study, liquid chromatography and mass spectrometry conditions have been optimized to achieve reliable LC-MS/MS quantitative glycomic analysis using aminoxyTMT reagents. Samples were resuspended in 0.2 M sodium chloride solution to promote the formation of sodium adduct precursor ions, which leads to higher MS/MS reporter ion yields. This method was first evaluated with glycans from model glycoproteins and pooled human blood serum samples. The observed variation of reporter ion ratios was generally less than 10% relative to the theoretical ratio. Even for the highly complex minor N-glycans, the variation was still below 15%. This strategy was further applied to the glycomic profiling of N-glycans released from blood serum samples of patients with different esophageal diseases. Our results demonstrate the benefits of utilizing aminoxyTMT reagents for reliable quantitation of biological glycomic samples. PMID:27377957

  12. [A multivariate nonlinear model for quantitative analysis in laser-induced breakdown spectroscopy].

    PubMed

    Chen, Xing-Long; Fu, Hong-Bo; Wang, Jing-Ge; Ni, Zhi-Bo; He, Wen-Gan; Xu, Jun; Rao, Rui-Zhong; Dong, Rui-Zhong

    2014-11-01

    Most quantitative models used in laser-induced breakdown spectroscopy (LIBS) are based on the hypothesis that the laser-induced plasma approaches the state of local thermal equilibrium (LTE). However, local equilibrium is possible only during a specific time segment of the plasma's evolution. Because the population of each energy level does not follow the Boltzmann distribution under non-LTE conditions, quantitative models using a single spectral line would be inaccurate. A multivariate nonlinear model, in which LTE is not required, was proposed in this article to reduce signal fluctuation and improve the accuracy of quantitative analysis. This multivariate nonlinear model was compared with an internal calibration model that is based on the LTE condition. The content of Mn in steel samples was determined using the two models, respectively. A smaller error and a smaller relative standard deviation (RSD) were observed with the multivariate nonlinear model. This result demonstrates that the multivariate nonlinear model can improve measurement accuracy and repeatability.

  13. Stable isotope labeling of mammals (SILAM) for in vivo quantitative proteomic analysis.

    PubMed

    Rauniyar, Navin; McClatchy, Daniel B; Yates, John R

    2013-06-15

    Metabolic labeling of rodent proteins with ¹⁵N, a heavy stable isotope of nitrogen, provides an efficient way for relative quantitation of differentially expressed proteins. Here we describe a protocol for metabolic labeling of rats with an ¹⁵N-enriched spirulina diet. As a case study, we also demonstrate the application of ¹⁵N-enriched tissue as a common internal standard in quantitative analysis of differentially expressed proteins in neurodevelopment in rats at two different time points, postnatal day 1 and 45. We briefly discuss the bioinformatics tools, ProLucid and Census, which can easily be used in a sequential manner to identify and quantitate relative protein levels on a proteomic scale. PMID:23523555

  14. Ambient Air Pollution and Preeclampsia: A Spatiotemporal Analysis

    PubMed Central

    Figueras, Francesc; Basagaña, Xavier; Beelen, Rob; Martinez, David; Cirach, Marta; Schembari, Anna; Hoek, Gerard; Brunekreef, Bert; Nieuwenhuijsen, Mark J

    2013-01-01

    Citation: Dadvand P, Figueras F, Basagaña X, Beelen R, Martinez D, Cirach M, Schembari A, Hoek G, Brunekreef B, Nieuwenhuijsen MJ. 2013. Ambient air pollution and preeclampsia: a spatiotemporal analysis. Environ Health Perspect 121:1365–1371; http://dx.doi.org/10.1289/ehp.1206430 PMID:24021707

  15. Sampling and analysis of terpenes in air. An interlaboratory comparison

    NASA Astrophysics Data System (ADS)

    Larsen, Bo; Bomboi-Mingarro, Teresa; Brancaleoni, Enzo; Calogirou, Aggelos; Cecinato, Angelo; Coeur, Cecile; Chatzinestis, Ioannis; Duane, Matthew; Frattoni, Massimiliano; Fugit, Jean-Luc; Hansen, Ute; Jacob, Veronique; Mimikos, Nikolaos; Hoffmann, Thorsten; Owen, Susan; Perez-Pastor, Rosa; Reichmann, Andreas; Seufert, Gunther; Staudt, Michael; Steinbrecher, Rainer

    An interlaboratory comparison on the sampling and analysis of terpenes in air was held within the framework of the BEMA (Biogenic Emissions in the Mediterranean Area) project in May 1995. Samples were drawn and analysed by 10 European laboratories from a dynamic artificial air generator in which five terpenes were present at low ng ℓ⁻¹ levels and ozone varied between 8 and 125 ppbv. Significant improvements over previous inter-comparison exercises in the quality of results were observed. At the ozone mixing ratio of 8 ppbv, good agreement among laboratories was obtained for all test compounds, with mean values close to the target concentration. At higher mixing ratios, ozone reduced terpene recoveries and decreased the precision of the measurements due to ozonolysis during sampling. For β-pinene this effect was negligible, but for the more reactive compounds significant losses were observed in some laboratories (cis-β-ocimene = trans-β-ocimene > linalool > d-limonene). The detrimental effect of ozone was significantly lower for the laboratories which removed ozone prior to sampling by scrubbers. Parallel sampling was carried out with a standardised sampler and each individual laboratory's own device. Good agreement between the two sets of results was obtained, clearly showing that the majority of laboratories used efficient sampling systems. Two different standard solutions were analysed by each laboratory. Only in a few cases did interference in the GC separation cause problems for the quantification of the terpenes (nonanal/linalool). However, the preparation of standards for calibrating the analytical equipment (GC-MS or GC-FID) was identified as a source of error in some laboratories.

  16. Identification and quantitative analysis of chemical compounds based on multiscale linear fitting of terahertz spectra

    NASA Astrophysics Data System (ADS)

    Qiao, Lingbo; Wang, Yingxin; Zhao, Ziran; Chen, Zhiqiang

    2014-07-01

    Terahertz (THz) time-domain spectroscopy is considered an attractive tool for the analysis of chemical composition. Traditional methods for identification and quantitative analysis of chemical compounds by THz spectroscopy are all based on full-spectrum data. However, the intrinsic features of the THz spectrum lie only in the absorption peaks, owing to the existence of disturbances such as unexpected components, scattering effects, and barrier materials. We propose a strategy that utilizes Lorentzian parameters of THz absorption peaks, extracted by a multiscale linear fitting method, for both identification of pure chemicals and quantitative analysis of mixtures. The multiscale linear fitting method can automatically remove background content and accurately determine the Lorentzian parameters of the absorption peaks. The high recognition rate for 16 pure chemical compounds and the accurately predicted concentrations for theophylline-lactose mixtures demonstrate the practicability of our approach.
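Extracting Lorentzian peak parameters from an absorption feature sitting on a background can be sketched with an ordinary least-squares fit; the peak parameters, background slope, and noise level below are illustrative, and this single-scale fit stands in for the paper's multiscale procedure:

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(f, a, f0, g, b0, b1):
    # Lorentzian absorption peak plus a linear background term of the
    # kind the fitting step is meant to remove.
    return a * g**2 / ((f - f0) ** 2 + g**2) + b0 + b1 * f

f = np.linspace(0.5, 2.5, 400)            # frequency axis in THz
true = lorentzian(f, 1.0, 1.2, 0.05, 0.1, 0.02)
rng = np.random.default_rng(1)
spec = true + rng.normal(0, 0.005, f.size)

p0 = [0.5, 1.1, 0.1, 0.0, 0.0]            # rough initial guess
popt, _ = curve_fit(lorentzian, f, spec, p0=p0)
print(round(popt[1], 3))                  # recovered peak center (THz)
```

The recovered center, amplitude, and width are the compact "fingerprint" used downstream for identification and concentration prediction.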

  17. Aerosol analysis for the regional air pollution study. Final report

    SciTech Connect

    Jaklevic, J.M.; Gatti, R.C.; Goulding, F.S.; Loo, B.W.; Thompson, A.C.

    1980-05-01

    The design and operation of an aerosol sampling and analysis program implemented during the 1975 to 1977 St. Louis Regional Air Pollution Study is described. A network of ten samplers was operated at selected sites in the St. Louis area, and the total mass and elemental composition of the collected particulates were determined. Sampling periods of 2 to 24 hours were employed. The samplers were capable of collecting aerosol particles in two distinct size ranges corresponding to fine (< 2.4 µm diameter) and coarse (> 2.4 µm diameter) particles. This unique feature allowed the separation of the particulate samples into two distinct fractions with differing chemical origins and health effects. The analysis methods were also newly developed for use in the St. Louis RAPS study. Total particulate mass was measured by a beta-particle attenuation method in which a precision of ±5 µg/cm² could be obtained in a one-minute measurement time. Elemental compositions of the samples were determined using an energy-dispersive x-ray fluorescence method in which detection limits of 5 ng/cm² or less were routinely achieved for elements ranging in atomic number from Al to Pb. The advantages of these analytical methods over more conventional techniques arise from the ability to automate the measurements. During the course of the two-year study, more than 35,000 individual samples were processed and a total of 28 concentrations were measured for each sample.
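Beta-particle attenuation converts a transmitted count rate into areal mass via m = -(1/µ)·ln(I/I₀). A worked example with illustrative values (the attenuation coefficient and count rates are not RAPS instrument data):

```python
import math

# Beta-attenuation gauging: areal mass from transmitted counts.
mu = 0.25          # mass attenuation coefficient, cm^2/mg (assumed)
I0 = 120000.0      # counts with a blank filter
I = 100000.0       # counts with the loaded filter

m = -math.log(I / I0) / mu          # areal mass, mg/cm^2
print(round(m * 1000, 1))           # in ug/cm^2
```

Because only two count measurements are needed, the method lends itself to the automated one-minute measurements described in the abstract.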

  18. Spatial analysis of the tuberculosis treatment dropout, Buenos Aires, Argentina.

    PubMed

    Herrero, María Belén; Arrossi, Silvina; Ramos, Silvina; Braga, Jose Ueleres

    2015-01-01

    OBJECTIVE Identify spatial distribution patterns of the proportion of nonadherence to tuberculosis treatment and its associated factors. METHODS We conducted an ecological study based on secondary and primary data from municipalities of the metropolitan area of Buenos Aires, Argentina. An exploratory analysis of the characteristics of the area and the distributions of the cases included in the sample (proportion of nonadherence) was also carried out along with a multifactor analysis by linear regression. The variables related to the characteristics of the population, residences and families were analyzed. RESULTS Areas with higher proportion of the population without social security benefits (p = 0.007) and of households with unsatisfied basic needs had a higher risk of nonadherence (p = 0.032). In addition, the proportion of nonadherence was higher in areas with the highest proportion of households with no public transportation within 300 meters (p = 0.070). CONCLUSIONS We found a risk area for the nonadherence to treatment characterized by a population living in poverty, with precarious jobs and difficult access to public transportation. PMID:26270011

  1. Quantitative and qualitative analysis of the electrical activity of rectus abdominis muscle portions.

    PubMed

    Negrão Filho, R de Faria; Bérzin, F; Souza, G da Cunha

    2003-01-01

    The purpose of this study was to investigate the electrical behavior pattern of the rectus abdominis muscle through qualitative and quantitative analysis of the electromyographic signal obtained from its superior, middle and inferior portions during dynamic and static activities. Ten athletic male volunteers (mean age 17.8 years, SD = 1.6) with no history of musculoskeletal dysfunction were studied. For the quantitative analysis, the RMS (root mean square) values obtained from the electromyographic signal during the isometric exercises were normalized and expressed as percentages of the maximum voluntary isometric contraction. For the qualitative analysis of the dynamic activity, the electromyographic signal was processed by full-wave rectification, linear envelope and normalization (amplitude and time), and the resulting curve of the processed signal was submitted to descriptive graphic analysis. The results of the quantitative study show that there is no statistically significant difference among the portions of the muscle. Qualitative analysis demonstrated two aspects: the presence of a common electrical activation pattern in the portions of the rectus abdominis muscle, and the absence of significant differences in the inclination angles of the electrical activity curve during the isotonic exercises. PMID:12964259
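The RMS normalization step can be sketched as follows; the EMG samples are illustrative numbers rather than data from the study:

```python
import numpy as np

# Normalize the RMS amplitude of an EMG burst to a percentage of the
# maximum voluntary isometric contraction (%MVIC).
def rms(x):
    return np.sqrt(np.mean(np.asarray(x, dtype=float) ** 2))

task_emg = np.array([0.11, -0.09, 0.13, -0.12, 0.10])   # mV, during exercise
mvic_emg = np.array([0.40, -0.38, 0.42, -0.41, 0.39])   # mV, max contraction

pct_mvic = rms(task_emg) / rms(mvic_emg) * 100
print(round(pct_mvic, 1))
```

Expressing each portion's activity in %MVIC is what makes amplitudes comparable across electrode sites and subjects.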

  2. Identification of ambient air sampling and analysis methods for the 189 Title III air toxics

    SciTech Connect

    Mukund, R.; Kelly, T.J.; Gordon, S.M.; Hays, M.J.

    1994-12-31

    The state of development of ambient air measurement methods for the 189 Hazardous Air Pollutants (HAPs) in Title III of the Clean Air Act Amendments was surveyed. Measurement methods for the HAPs were identified by reviews of established methods and by literature searches for pertinent research techniques. Methods were segregated by their degree of development into Applicable, Likely, and Potential methods. This survey identified a total of 183 methods, applicable in varying degrees to ambient air measurements of one or more HAPs. As a basis for classifying the HAPs and evaluating the applicability of measurement methods, a survey of a variety of chemical and physical properties of the HAPs was also conducted. The results of both the methods and properties surveys were tabulated for each of the 189 HAPs. The current state of development of ambient measurement methods for the 189 HAPs was then assessed from the results of the survey, and recommendations for method development initiatives were made.

  3. Emerging flow injection mass spectrometry methods for high-throughput quantitative analysis.

    PubMed

    Nanita, Sergio C; Kaldon, Laura G

    2016-01-01

    Where does flow injection analysis mass spectrometry (FIA-MS) stand relative to ambient mass spectrometry (MS) and chromatography-MS? Improvements in FIA-MS methods have resulted in fast-expanding uses of this technique. Key advantages of FIA-MS over chromatography-MS are fast analysis (typical run time <60 s) and method simplicity, and FIA-MS offers high-throughput without compromising sensitivity, precision and accuracy as much as ambient MS techniques. Consequently, FIA-MS is increasingly becoming recognized as a suitable technique for applications where quantitative screening of chemicals needs to be performed rapidly and reliably. The FIA-MS methods discussed herein have demonstrated quantitation of diverse analytes, including pharmaceuticals, pesticides, environmental contaminants, and endogenous compounds, at levels ranging from parts-per-billion (ppb) to parts-per-million (ppm) in very complex matrices (such as blood, urine, and a variety of foods of plant and animal origin), allowing successful applications of the technique in clinical diagnostics, metabolomics, environmental sciences, toxicology, and detection of adulterated/counterfeited goods. The recent boom in applications of FIA-MS for high-throughput quantitative analysis has been driven in part by (1) the continuous improvements in sensitivity and selectivity of MS instrumentation, (2) the introduction of novel sample preparation procedures compatible with standalone mass spectrometric analysis such as salting out assisted liquid-liquid extraction (SALLE) with volatile solutes and NH4(+) QuEChERS, and (3) the need to improve efficiency of laboratories to satisfy increasing analytical demand while lowering operational cost. The advantages and drawbacks of quantitative analysis by FIA-MS are discussed in comparison to chromatography-MS and ambient MS (e.g., DESI, LAESI, DART). Generally, FIA-MS sits 'in the middle' between ambient MS and chromatography-MS, offering a balance between analytical

  4. Quantitative planar laser-induced fluorescence imaging of multi-component fuel/air mixing in a firing gasoline-direct-injection engine: Effects of residual exhaust gas on quantitative PLIF

    SciTech Connect

    Williams, Ben; Ewart, Paul; Wang, Xiaowei; Stone, Richard; Ma, Hongrui; Walmsley, Harold; Cracknell, Roger; Stevens, Robert; Richardson, David; Fu, Huiyu; Wallace, Stan

    2010-10-15

    A study of in-cylinder fuel-air mixing distributions in a firing gasoline-direct-injection engine is reported using planar laser-induced fluorescence (PLIF) imaging. A multi-component fuel synthesised from three pairs of components chosen to simulate light, medium and heavy fractions was seeded with one of three tracers, each chosen to co-evaporate with and thus follow one of the fractions, in order to account for differential volatility of such components in typical gasoline fuels. In order to make quantitative measurements of fuel-air ratio from PLIF images, initial calibration was by recording PLIF images of homogeneous fuel-air mixtures under similar conditions of in-cylinder temperature and pressure using a re-circulation loop and a motored engine. This calibration method was found to be affected by two significant factors. Firstly, calibration was affected by variation of signal collection efficiency arising from build-up of absorbing deposits on the windows during firing cycles, which are not present under motored conditions. Secondly, the effects of residual exhaust gas present in the firing engine were not accounted for using a calibration loop with a motored engine. In order to account for these factors a novel method of PLIF calibration is presented whereby 'bookend' calibration measurements for each tracer separately are performed under firing conditions, utilising injection into a large upstream heated plenum to promote the formation of homogeneous in-cylinder mixtures. These calibration datasets contain sufficient information to not only characterise the quantum efficiency of each tracer during a typical engine cycle, but also monitor imaging efficiency, and, importantly, account for the impact of exhaust gas residuals (EGR). By use of this method EGR is identified as a significant factor in quantitative PLIF for fuel mixing diagnostics in firing engines. The effects of cyclic variation in fuel concentration on burn rate are analysed for different
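The "bookend" idea reduces, at its core, to a two-point map from PLIF signal to fuel-air ratio anchored by homogeneous-mixture calibration images. The counts and equivalence ratios below are invented for illustration, and the paper's actual procedure additionally corrects for window deposits and residual exhaust gas:

```python
# Two-point ("bookend") calibration sketch: homogeneous-mixture images at
# two known equivalence ratios bracket the firing data, so a measured
# signal maps linearly to equivalence ratio.  Illustrative values only.
s_lo, phi_lo = 120.0, 0.5     # calibration point 1: mean counts, equiv. ratio
s_hi, phi_hi = 480.0, 2.0     # calibration point 2

def signal_to_phi(s):
    return phi_lo + (s - s_lo) * (phi_hi - phi_lo) / (s_hi - s_lo)

print(signal_to_phi(300.0))
```

Acquiring both calibration points under firing conditions is what lets the bookends absorb cycle-dependent effects (tracer quantum efficiency, imaging efficiency, EGR) that a motored-engine calibration misses.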

  5. HPTLC Hyphenated with FTIR: Principles, Instrumentation and Qualitative Analysis and Quantitation

    NASA Astrophysics Data System (ADS)

    Cimpoiu, Claudia

    In recent years, much effort has been devoted to the coupling of high-performance thin-layer chromatography (HPTLC) with spectrometric methods because of the robustness and simplicity of HPTLC and the need for detection techniques that provide identification and determination of sample constituents. IR is one of the spectroscopic methods that have been coupled with HPTLC. IR spectroscopy has a high potential for the elucidation of molecular structures, and its characteristic absorption bands can be used for compound-specific detection. The coupled HPTLC-FTIR method has been widely used in modern laboratories for qualitative and quantitative analysis. The potential of this method is demonstrated by its application in different fields of analysis such as drug analysis, forensic analysis, food analysis, environmental analysis, and biological analysis. The hyphenated HPTLC-FTIR technique will be developed further in the future with the aim of taking full advantage of this method.

  6. Quantitative analysis of biological tissues using Fourier transform-second-harmonic generation imaging

    NASA Astrophysics Data System (ADS)

    Ambekar Ramachandra Rao, Raghu; Mehta, Monal R.; Toussaint, Kimani C., Jr.

    2010-02-01

    We demonstrate the use of Fourier transform-second-harmonic generation (FT-SHG) imaging of collagen fibers as a means of performing quantitative analysis of images of selected spatial regions in porcine trachea, ear, and cornea. Two quantitative markers, preferred orientation and maximum spatial frequency, are proposed for differentiating structural information between various spatial regions of interest in the specimens. The ear shows consistent maximum spatial frequency and orientation, as also observed in its real-space image. However, there are observable changes in the orientation and minimum feature size of fibers in the trachea, indicating a more random organization. Finally, the analysis is applied to a 3D image stack of the cornea. It is shown that the standard deviation of the orientation is sensitive to the randomness in fiber orientation. Regions with variations in the maximum spatial frequency, but with relatively constant orientation, suggest that maximum spatial frequency is useful as an independent quantitative marker. We emphasize that FT-SHG is a simple, yet powerful, tool for extracting information from images that is not obvious in real space. This technique can be used as a quantitative biomarker to assess the structure of collagen fibers that may change due to damage from disease or physical injury.
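The preferred-orientation marker can be sketched by locating the dominant peak of a 2D Fourier transform; here a synthetic stripe pattern at a known angle stands in for an SHG image of aligned fibers:

```python
import numpy as np

# Synthetic "fiber" image: sinusoidal stripes at a known orientation.
n = 128
y, x = np.mgrid[0:n, 0:n]
theta = np.deg2rad(30)                      # known stripe-normal direction
k = 8                                       # cycles across the field
img = np.sin(2 * np.pi * k * (x * np.cos(theta) + y * np.sin(theta)) / n)

F = np.abs(np.fft.fftshift(np.fft.fft2(img)))
F[n // 2, n // 2] = 0                       # suppress the DC term
ky, kx = np.unravel_index(np.argmax(F), F.shape)
angle = np.degrees(np.arctan2(ky - n // 2, kx - n // 2)) % 180
print(round(angle, 1))                      # recovered orientation, degrees
```

The distance of the same spectral peak from the origin gives the maximum spatial frequency, the second marker proposed in the abstract.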

  7. Sources of Technical Variability in Quantitative LC-MS Proteomics: Human Brain Tissue Sample Analysis.

    SciTech Connect

    Piehowski, Paul D.; Petyuk, Vladislav A.; Orton, Daniel J.; Xie, Fang; Moore, Ronald J.; Ramirez Restrepo, Manuel; Engel, Anzhelika; Lieberman, Andrew P.; Albin, Roger L.; Camp, David G.; Smith, Richard D.; Myers, Amanda J.

    2013-05-03

    To design a robust quantitative proteomics study, an understanding of both the inherent heterogeneity of the biological samples being studied as well as the technical variability of the proteomics methods and platform is needed. Additionally, accurately identifying the technical steps associated with the largest variability would provide valuable information for the improvement and design of future processing pipelines. We present an experimental strategy that allows for a detailed examination of the variability of the quantitative LC-MS proteomics measurements. By replicating analyses at different stages of processing, various technical components can be estimated and their individual contribution to technical variability can be dissected. This design can be easily adapted to other quantitative proteomics pipelines. Herein, we applied this methodology to our label-free workflow for the processing of human brain tissue. For this application, the pipeline was divided into four critical components: Tissue dissection and homogenization (extraction), protein denaturation followed by trypsin digestion and SPE clean-up (digestion), short-term run-to-run instrumental response fluctuation (instrumental variance), and long-term drift of the quantitative response of the LC-MS/MS platform over the 2-week period of continuous analysis (instrumental stability). From this analysis, we found the following contributions to variability: extraction (72%) >> instrumental variance (16%) > instrumental stability (8.4%) > digestion (3.1%). Furthermore, the stability of the platform and its suitability for discovery proteomics studies is demonstrated.
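The subtraction logic of such a nested variance decomposition can be sketched as follows; the variance values are chosen to mirror the reported ranking, not taken from the study:

```python
# Nested replication: the variance observed at each replication level
# includes all downstream sources, so each stage's own contribution is
# obtained by subtraction.  Illustrative variances only.
var_injection  = 0.16   # repeated LC-MS runs of one digest
var_digestion  = 0.19   # separate digests of one homogenate (incl. injection)
var_extraction = 0.91   # separate homogenizations (incl. everything below)

contrib = {
    "instrumental": var_injection,
    "digestion": var_digestion - var_injection,
    "extraction": var_extraction - var_digestion,
}
total = sum(contrib.values())
for stage, v in contrib.items():
    print(f"{stage}: {100 * v / total:.0f}%")
```

The key design point is that replicates must be introduced at every pipeline stage; a single end-to-end replication cannot separate the components.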

  8. Quantitative approaches to utilizing mutational analysis and disulfide crosslinking for modeling a transmembrane domain.

    PubMed Central

    Lee, G. F.; Hazelbauer, G. L.

    1995-01-01

    The transmembrane domain of chemoreceptor Trg from Escherichia coli contains four transmembrane segments in its native homodimer, two from each subunit. We had previously used mutational analysis and sulfhydryl cross-linking between introduced cysteines to obtain data relevant to the three-dimensional organization of this domain. In the current study we used Fourier analysis to assess these data quantitatively for periodicity along the sequences of the segments. The analyses provided a strong indication of alpha-helical periodicity in the first transmembrane segment and a substantial indication of that periodicity for the second segment. On this basis, we considered both segments as idealized alpha-helices and proceeded to model the transmembrane domain as a unit of four helices. For this modeling, we calculated helical crosslinking moments, parameters analogous to helical hydrophobic moments, as a quantitative way of condensing and utilizing a large body of crosslinking data. Crosslinking moments were used to define the relative separation and orientation of helical pairs, thus creating a quantitatively derived model for the transmembrane domain of Trg. Utilization of Fourier transforms to provide a quantitative indication of periodicity in data from analyses of transmembrane segments, in combination with helical crosslinking moments to position helical pairs should be useful in modeling other transmembrane domains. PMID:7549874
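The periodicity assessment can be sketched as a power spectrum of a per-residue property, which for an ideal alpha-helix peaks near 100° per residue (360°/3.6 residues per turn); the "crosslinking propensity" series below is synthetic test data, not values from Trg:

```python
import numpy as np

# Fourier power spectrum of a per-residue property along a TM segment.
rng = np.random.default_rng(2)
j = np.arange(18)                               # 18-residue segment
x = np.cos(np.deg2rad(100) * j) + rng.normal(0, 0.2, j.size)
x = x - x.mean()                                # remove the DC component

omega = np.deg2rad(np.arange(1, 180))           # 1..179 deg per residue
power = np.abs(np.exp(-1j * np.outer(omega, j)) @ x) ** 2
peak = np.degrees(omega[np.argmax(power)])
print(peak)                                     # should sit near 100 deg
```

A pronounced peak near 100°/residue is the quantitative signature of alpha-helical periodicity that the authors' Fourier analysis looks for.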

  9. Models and methods for quantitative analysis of surface-enhanced Raman spectra.

    PubMed

    Li, Shuo; Nyagilo, James O; Dave, Digant P; Gao, Jean

    2014-03-01

    The quantitative analysis of surface-enhanced Raman spectra using scattering nanoparticles has shown potential and promising applications in in vivo molecular imaging. Diverse approaches have been used for quantitative analysis of Raman spectral information, which can be categorized as direct classical least squares models, full spectrum multivariate calibration models, selected multivariate calibration models, and latent variable regression (LVR) models. However, the working principle of these methods in the Raman spectra application remains poorly understood and a clear picture of the overall performance of each model is missing. Based on the characteristics of the Raman spectra, in this paper, we first provide the theoretical foundation of the aforementioned commonly used models and show why the LVR models are more suitable for quantitative analysis of the Raman spectra. Then, we demonstrate the fundamental connections and differences between different LVR methods, such as principal component regression, reduced-rank regression, partial least square regression (PLSR), canonical correlation regression, and robust canonical analysis, by comparing their objective functions and constraints. We further prove that PLSR is literally a blend of multivariate calibration and feature extraction model that relates concentrations of nanotags to spectrum intensity. These features (a.k.a. latent variables) satisfy two purposes: the best representation of the predictor matrix and correlation with the response matrix. These illustrations give a new understanding of the traditional PLSR and explain why PLSR exceeds other methods in quantitative analysis of the Raman spectra problem. In the end, all the methods are tested on the Raman spectra datasets with different evaluation criteria to evaluate their performance.
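Principal component regression, one of the LVR models discussed, can be sketched in a few lines of linear algebra: project centered spectra onto the leading principal components, then regress concentration on the scores. The two-component "spectra" below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)
wav = np.linspace(0, 1, 60)
s1 = np.exp(-((wav - 0.3) / 0.05) ** 2)         # pure-component "spectra"
s2 = np.exp(-((wav - 0.7) / 0.05) ** 2)
c = rng.uniform(0, 1, 20)                        # concentrations of tag 1
X = np.outer(c, s1) + np.outer(1 - c, s2) + rng.normal(0, 0.01, (20, 60))

Xc, yc = X - X.mean(0), c - c.mean()             # center predictors and response
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2
T = Xc @ Vt[:k].T                                # scores on k latent variables
b = np.linalg.lstsq(T, yc, rcond=None)[0]        # regress response on scores
pred = T @ b + c.mean()
rmse = np.sqrt(np.mean((pred - c) ** 2))
print(round(rmse, 4))
```

PCR chooses latent variables purely to represent the predictor matrix; PLSR, as the abstract notes, additionally makes them correlate with the response, which is the essential difference between the two LVR families.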

  10. Quantitative Analysis of the Nanopore Translocation Dynamics of Simple Structured Polynucleotides

    PubMed Central

    Schink, Severin; Renner, Stephan; Alim, Karen; Arnaut, Vera; Simmel, Friedrich C.; Gerland, Ulrich

    2012-01-01

    Nanopore translocation experiments are increasingly applied to probe the secondary structures of RNA and DNA molecules. Here, we report two vital steps toward establishing nanopore translocation as a tool for the systematic and quantitative analysis of polynucleotide folding: (1) Using α-hemolysin pores and a diverse set of different DNA hairpins, we demonstrate that backward nanopore force spectroscopy is particularly well suited for quantitative analysis. In contrast to forward translocation from the vestibule side of the pore, backward translocation times do not appear to be significantly affected by pore-DNA interactions. (2) We develop and verify experimentally a versatile mesoscopic theoretical framework for the quantitative analysis of translocation experiments with structured polynucleotides. The underlying model is based on sequence-dependent free energy landscapes constructed using the known thermodynamic parameters for polynucleotide basepairing. This approach limits the adjustable parameters to a small set of sequence-independent parameters. After parameter calibration, the theoretical model predicts the translocation dynamics of new sequences. These predictions can be leveraged to generate a baseline expectation even for more complicated structures where the assumptions underlying the one-dimensional free energy landscape may no longer be satisfied. Taken together, backward translocation through α-hemolysin pores combined with mesoscopic theoretical modeling is a promising approach for label-free single-molecule analysis of DNA and RNA folding. PMID:22225801

  11. Stand-off Raman spectroscopy: a powerful technique for qualitative and quantitative analysis of inorganic and organic compounds including explosives.

    PubMed

    Zachhuber, Bernhard; Ramer, Georg; Hobro, Alison; Chrysostom, Engelene T H; Lendl, Bernhard

    2011-06-01

    A pulsed stand-off Raman system has been built and optimised for the qualitative and quantitative analysis of inorganic and organic samples including explosives. The system consists of a frequency doubled Q-switched Nd:YAG laser (532 nm, 10 Hz, 4.4 ns pulse length), aligned coaxially with a 6″ Schmidt-Cassegrain telescope for the collection of Raman scattered light. The telescope was coupled via a fibre optic bundle to an Acton standard series SP-2750 spectrograph with a PI-MAX 1024RB intensified CCD camera equipped with a 500-ps gating option for detection. Gating proved to be essential for achieving high signal-to-noise ratios in the recorded stand-off Raman spectra. In some cases, gating also allowed suppression of disturbing fluorescence signals. For the first time, quantitative analysis of stand-off Raman spectra was performed using both univariate and multivariate methods of data analysis. To correct for possible variation in instrumental parameters, the nitrogen band of ambient air was used as an internal standard. For the univariate method, stand-off Raman spectra obtained at a distance of 9 m on sodium chloride pellets containing varying amounts of ammonium nitrate (0-100%) were used. For the multivariate quantification of ternary xylene mixtures (0-100%), stand-off spectra at a distance of 5 m were used. The univariate calibration of ammonium nitrate yielded R² values of 0.992, and the multivariate quantitative analysis yielded root mean square errors of prediction of 2.26%, 1.97% and 1.07% for o-, m- and p-xylene, respectively. Stand-off Raman spectra obtained at a distance of 10 m yielded a detection limit of 174 μg for NaClO₃. Furthermore, to assess the applicability of stand-off Raman spectroscopy for explosives detection in "real-world" scenarios, their detection on different background materials (nylon, polyethylene and part of a car body) and in the presence of interferents (motor oil, fuel oil and soap) at a distance of 20 m was also
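The univariate calibration with the ambient-air N₂ band as internal standard can be sketched as follows; all band areas and concentrations are illustrative, not the study's data:

```python
import numpy as np

# Analyte band area is divided by the N2 band area before the linear fit,
# cancelling shot-to-shot variation in laser energy and collection.
conc = np.array([0.0, 20.0, 40.0, 60.0, 80.0, 100.0])       # % NH4NO3
analyte = np.array([0.02, 0.95, 2.10, 2.98, 4.05, 5.01])    # band area
n2_ref = np.array([1.00, 0.97, 1.03, 0.99, 1.01, 0.98])     # N2 band area

y = analyte / n2_ref                     # internal-standard normalization
slope, intercept = np.polyfit(conc, y, 1)
fit = slope * conc + intercept
r2 = 1 - np.sum((y - fit) ** 2) / np.sum((y - np.mean(y)) ** 2)
print(round(r2, 3))
```

The nitrogen band is a convenient internal standard precisely because ambient air is always present along the stand-off beam path.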

  12. Quantitative wake analysis of a freely swimming fish using 3D synthetic aperture PIV

    NASA Astrophysics Data System (ADS)

    Mendelson, Leah; Techet, Alexandra H.

    2015-07-01

    Synthetic aperture PIV (SAPIV) is used to quantitatively analyze the wake behind a giant danio (Danio aequipinnatus) swimming freely in a seeded quiescent tank. The experiment is designed with minimal constraints on animal behavior to ensure that natural swimming occurs. The fish exhibits forward swimming and turning behaviors at speeds between 0.9 and 1.5 body lengths/second. Results show clearly isolated and linked vortex rings in the wake structure, as well as the thrust jet coming off a visual hull reconstruction of the fish body. As a benchmark for quantitative analysis of volumetric PIV data, the vortex circulation and impulse are computed using methods consistent with those applied to planar PIV data. Volumetric momentum analysis frameworks are discussed for linked and asymmetric vortex structures, laying a foundation for further volumetric studies of swimming hydrodynamics with SAPIV. Additionally, a novel weighted refocusing method is presented as an improvement to SAPIV reconstruction.
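Computing vortex circulation from a planar velocity field (Γ = ∫ω dA) can be checked against a synthetic Lamb-Oseen vortex of known circulation, a common sanity test before applying the same operation to measured PIV slices:

```python
import numpy as np

# Lamb-Oseen vortex with unit circulation on a uniform grid.
gamma_true, rc = 1.0, 0.2
n, L = 201, 2.0
x = np.linspace(-L / 2, L / 2, n)
X, Y = np.meshgrid(x, x)
r = np.hypot(X, Y) + 1e-12
u_theta = gamma_true / (2 * np.pi * r) * (1 - np.exp(-(r / rc) ** 2))
u, v = -u_theta * Y / r, u_theta * X / r       # Cartesian velocity components

dx = x[1] - x[0]
vort = np.gradient(v, dx, axis=1) - np.gradient(u, dx, axis=0)  # dv/dx - du/dy
gamma = np.sum(vort) * dx * dx                 # area integral of vorticity
print(round(gamma, 3))                         # should recover ~1.0
```

For volumetric SAPIV data, the same finite-difference curl and area integral are applied slice by slice through the reconstructed volume.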

  13. Quantitative Schlieren analysis applied to holograms of crystals grown on Spacelab 3

    NASA Technical Reports Server (NTRS)

    Brooks, Howard L.

    1986-01-01

    In order to extract additional information about crystals grown in the microgravity environment of Spacelab, a quantitative schlieren analysis technique was developed for use in a Holography Ground System of the Fluid Experiment System. Utilizing the Unidex position controller, it was possible to measure deviation angles produced by refractive index gradients of 0.5 milliradians. Additionally, refractive index gradient maps for any recorded time during the crystal growth were drawn and used to create solute concentration maps for the environment around the crystal. The technique was applied to flight holograms of Cell 204 of the Fluid Experiment System that were recorded during the Spacelab 3 mission on STS 51B. A triglycine sulfate crystal was grown under isothermal conditions in the cell, and the data gathered with the quantitative schlieren analysis technique are consistent with a diffusion-limited growth process.
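    The deviation angles above relate to refractive-index gradients through the standard schlieren ray-deflection relation, ε ≈ (L/n)·dn/dy for a ray crossing a cell of optical path length L. The cell length and index below are hypothetical values chosen only to show the arithmetic; only the 0.5 mrad angle comes from the abstract.

```python
# Schlieren relation: deflection epsilon ≈ (L / n) * dn/dy.
L = 0.01          # assumed optical path length through the cell, m
n = 1.33          # assumed refractive index of the growth solution
epsilon = 0.5e-3  # smallest measurable deviation angle, rad (from the text)

# Minimum resolvable transverse refractive-index gradient, 1/m.
dn_dy = epsilon * n / L
```

    Integrating such gradients across the recorded hologram plane is what yields the refractive-index and, via a concentration-index calibration, solute-concentration maps described above.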

  14. New approaches for the analysis of confluent cell layers with quantitative phase digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Pohl, L.; Kaiser, M.; Ketelhut, S.; Pereira, S.; Goycoolea, F.; Kemper, Björn

    2016-03-01

    Digital holographic microscopy (DHM) enables high resolution non-destructive inspection of technical surfaces and minimally-invasive label-free live cell imaging. However, the analysis of confluent cell layers represents a challenge, as quantitative DHM phase images in this case do not provide sufficient information for image segmentation, determination of the cellular dry mass or calculation of the cell thickness. We present novel strategies for the analysis of confluent cell layers with quantitative DHM phase contrast utilizing a histogram-based evaluation procedure. The applicability of our approach is illustrated by quantification of drug-induced cell morphology changes, and it is shown that the method can reliably quantify global morphology changes of confluent cell layers.
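    The idea behind a histogram-based evaluation is that even when individual cells in a confluent layer cannot be segmented, global statistics of the phase distribution (mean, mode, spread) still track morphology changes. A sketch on a synthetic phase map with hypothetical values; the specific statistics are illustrative, not the paper's exact procedure:

```python
import numpy as np

# Synthetic quantitative phase image of a confluent layer (radians).
rng = np.random.default_rng(0)
phase = rng.normal(loc=2.0, scale=0.3, size=(256, 256))

# Histogram-based global descriptors of the layer.
counts, edges = np.histogram(phase, bins=100)
mode_phase = 0.5 * (edges[np.argmax(counts)] + edges[np.argmax(counts) + 1])
mean_phase = phase.mean()
```

    A drug-induced morphology change would shift these descriptors between time points, without any need for cell-by-cell segmentation.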

  15. Quantitative enantiomeric analysis of chlorcyclizine, hydroxyzine, and meclizine by capillary electrophoresis.

    PubMed

    Ho, Yu-Hsiang; Wu, Hsin-Lung; Wu, Shou-Mei; Chen, Su-Hwei; Kou, Hwang-Shang

    2003-07-01

    A simple capillary zone electrophoresis method was developed for the quantitative enantiomeric analysis of piperazine antihistamines suspected of teratogenicity in animals. Enantioseparation of chlorcyclizine, hydroxyzine, and meclizine was performed in glycine buffer (0.6 mol L(-1); pH 3.00) with sulfated beta-cyclodextrin (5 mg mL(-1)) as a chiral selector, and the separated drugs were monitored with an ultraviolet detector. The lower limit of quantitation for the individual enantiomers is 10 μmol L(-1), using an achiral piperazine drug (cyclizine) as internal standard. The method is simple and rapid with a short run time (<5 min) for the analysis of chlorcyclizine, hydroxyzine or meclizine enantiomers. PMID:12830360

  16. [Influence of sample surface roughness on mathematical model of NIR quantitative analysis of wood density].

    PubMed

    Huang, An-Min; Fei, Ben-Hua; Jiang, Ze-Hui; Hse, Chung-Yun

    2007-09-01

    Near infrared spectroscopy is widely used as a quantitative method, with regression techniques as the main multivariate tools for building prediction models; however, the accuracy of the results is affected by many factors. In the present paper, the influence of sample surface roughness on the mathematical model for NIR quantitative analysis of wood density was studied. The experiments showed that when the roughness of the predicted samples was consistent with that of the calibration samples, the results were good; otherwise the error was much higher. The roughness-mixed model was more flexible and adaptable to different sample roughnesses, and its prediction ability was much better than that of the single-roughness model.
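    A "roughness-mixed" calibration pools spectra measured on differently finished surfaces into one regression so the model stays valid for either finish. A minimal least-squares sketch on synthetic three-band spectra; the band loadings, roughness baseline shift, and noise level are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
density = rng.uniform(0.4, 0.8, size=40)      # wood density, g/cm^3

def spectrum(d, rough):
    """Synthetic 3-band NIR spectrum: density-dependent bands plus a
    roughness-dependent baseline offset and a little noise."""
    base = d * np.array([1.0, 0.5, 0.2])
    offset = 0.05 if rough else 0.0
    return base + offset + rng.normal(0, 0.002, 3)

# Mixed calibration set: alternate smooth and rough samples.
X = np.array([spectrum(d, i % 2 == 0) for i, d in enumerate(density)])
A = np.hstack([X, np.ones((len(X), 1))])      # add intercept column
coef, *_ = np.linalg.lstsq(A, density, rcond=None)
pred = A @ coef
rmse = np.sqrt(np.mean((pred - density) ** 2))
```

    Because both roughness classes appear in the calibration, the fitted coefficients cancel the roughness offset, which is the flexibility the abstract attributes to the roughness-mixed model; a model calibrated on one roughness only would carry that offset as bias.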

  17. Quantitative analysis of saltwater-freshwater relationships in groundwater systems-A historical perspective

    USGS Publications Warehouse

    Reilly, T.E.; Goodman, A.S.

    1985-01-01

    Although much progress has been made toward the mathematical description of saltwater-freshwater relationships in groundwater systems since the late 19th century, the advective and dispersive mechanisms involved are still incompletely understood. This article documents the major historical advances in this subject and summarizes the major directions of current studies. From the time of Badon Ghyben and Herzberg, it has been recognized that density is important in mathematically describing saltwater-freshwater systems. Other mechanisms, such as hydrodynamic dispersion, were identified later and are still not fully understood. Quantitative analysis of a saltwater-freshwater system attempts to mathematically describe the physical system and the important mechanisms using reasonable simplifications and assumptions. This paper, in developing the history of quantitative analysis, discusses many of these simplifications and assumptions and their effect on describing and understanding the phenomenon. © 1985.

  18. Quantitative analysis on the urban flood mitigation effect by the extensive green roof system.

    PubMed

    Lee, J Y; Moon, H J; Kim, T I; Kim, H W; Han, M Y

    2013-10-01

    Extensive green-roof systems are expected to have a synergetic effect in mitigating urban runoff, decreasing temperature and supplying water to a building. Mitigation of runoff through rainwater retention requires the effective design of a green-roof catchment. This study identified how to improve building runoff mitigation through quantitative analysis of an extensive green-roof system. Quantitative analysis of green-roof runoff characteristics indicated that the extensive green roof has a high water-retaining capacity in response to rainfall of less than 20 mm/h. As the rainfall intensity increased, the water-retaining capacity decreased. The catchment efficiency of an extensive green roof ranged from 0.44 to 0.52, indicating reduced runoff compared with the 0.9 efficiency of a concrete roof. Therefore, extensive green roofs are an effective stormwater best-management practice, and the proposed parameters can be applied to an algorithm for rainwater-harvesting tank design.
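    Using the catchment efficiencies reported above, the runoff reduction for a given storm is simple arithmetic: runoff depth ≈ efficiency × rainfall depth. The one-hour storm depth below is a hypothetical example; the efficiencies come from the abstract.

```python
# One hour of rain at the 20 mm/h threshold mentioned above.
rainfall_mm = 20.0
eff_green = 0.48       # mid-range of the reported 0.44-0.52
eff_concrete = 0.9     # reported concrete-roof efficiency

runoff_green = eff_green * rainfall_mm        # 9.6 mm
runoff_concrete = eff_concrete * rainfall_mm  # 18.0 mm
reduction = 1 - runoff_green / runoff_concrete
```

    For this storm the extensive green roof sheds roughly 47% less water than the concrete roof, which is the mitigation margin available to a downstream rainwater-harvesting tank design.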

  19. Quantitative analysis of terahertz spectra for illicit drugs using adaptive-range micro-genetic algorithm

    NASA Astrophysics Data System (ADS)

    Chen, Yi; Ma, Yong; Lu, Zheng; Peng, Bei; Chen, Qin

    2011-08-01

    In the field of anti-illicit drug applications, many suspicious mixture samples might consist of various drug components, for example a mixture of methamphetamine, heroin, and amoxicillin, which makes spectral identification very difficult. A terahertz spectroscopic quantitative analysis method using an adaptive range micro-genetic algorithm with a variable internal population (ARVIPɛμGA) has been proposed. Five mixture cases are discussed using ARVIPɛμGA-driven quantitative terahertz spectroscopic analysis in this paper. The simulation results agree with previous experimental results and with results obtained using other experimental and numerical techniques, suggesting that the proposed method has potential applications in terahertz spectral identification of drug mixture components.
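    The underlying quantification problem is spectral unmixing: model the measured mixture spectrum as a weighted sum of pure-component reference spectra and solve for the weights. The sketch below solves it by plain least squares on synthetic Gaussian bands (the paper instead searches the weight space with a micro-genetic algorithm); the band positions and the labels attached to them are illustrative only.

```python
import numpy as np

freq = np.linspace(0.2, 2.5, 200)   # THz frequency axis

def band(center, width):
    """Synthetic absorption band (Gaussian) on the THz axis."""
    return np.exp(-((freq - center) / width) ** 2)

# Hypothetical pure-component reference spectra.
refs = np.column_stack([band(0.8, 0.1),    # stand-in component 1
                        band(1.4, 0.15),   # stand-in component 2
                        band(2.0, 0.12)])  # stand-in component 3
true_w = np.array([0.5, 0.3, 0.2])
mixture = refs @ true_w              # noise-free mixture spectrum

# Recover the component weights from the mixture.
w, *_ = np.linalg.lstsq(refs, mixture, rcond=None)
```

    With noisy, overlapping real spectra the least-squares surface becomes ill-conditioned, which is where a genetic search such as the ARVIPɛμGA earns its keep.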

  20. Molecular Orientation Analysis of Alkyl Methylene Groups from Quantitative Coherent Anti-Stokes Raman Scattering Spectroscopy.

    PubMed

    Zhang, Chi; Wang, Jie; Jasensky, Joshua; Chen, Zhan

    2015-04-16

    Quantitative data analysis in coherent anti-Stokes Raman scattering (CARS) spectroscopy is important for extracting molecular structural information. We developed a method to derive molecular tilt angle with respect to the surface normal based on quantitative CARS spectral analysis. We showed that the tilt angle of methylene alkyl chains on a surface can be directly obtained from the CH2 symmetric/asymmetric peak ratio in a CARS spectrum. The lipid alkyl chain tilt angle from a lipid monolayer was measured to be ∼0° and was verified by sum frequency generation spectroscopy, which probes the orientations of the lipid methyl end groups. The tilt angle of a silane monolayer alkyl chain was derived to be ∼35°, which agrees with the theoretical prediction. This method is submonolayer sensitive and can also be used to interpret polarization-dependent signals in CARS microscopy. It can be applied to elucidate detailed molecular structure from CARS spectroscopic and microscopic measurements.
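    Once a ratio-versus-angle curve is in hand, a measured CH2 symmetric/asymmetric ratio is converted to a tilt angle by inverting that curve. The cos²-shaped curve below is a hypothetical monotone stand-in; the real mapping follows from the CARS polarizability tensor analysis, which the abstract does not reproduce.

```python
import numpy as np

# Hypothetical calibration curve: peak ratio as a function of tilt angle.
angles = np.linspace(0, 60, 61)                       # tilt angle, degrees
ratio_curve = 3.0 * np.cos(np.radians(angles)) ** 2   # stand-in mapping

# Invert the (monotone decreasing) curve for a measured ratio.
# np.interp needs increasing x, so both arrays are reversed.
measured_ratio = 2.0
tilt = np.interp(measured_ratio, ratio_curve[::-1], angles[::-1])
```

    For this stand-in curve a ratio of 2.0 inverts to a tilt near 35°, coincidentally close to the silane-monolayer result quoted above; with the true tensor-derived curve the same lookup yields the reported angles.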