Science.gov

Sample records for quantitative analysis aqua

  1. Quantitative analysis

    PubMed Central

    Nevin, John A.

    1984-01-01

    Quantitative analysis permits the isolation of invariant relations in the study of behavior. The parameters of these relations can serve as higher-order dependent variables in more extensive analyses. These points are illustrated by reference to quantitative descriptions of performance maintained by concurrent schedules, multiple schedules, and signal-detection procedures. Such quantitative descriptions of empirical data may be derived from mathematical theories, which in turn can lead to novel empirical analyses so long as their terms refer to behavioral and environmental events. Thus, quantitative analysis is an integral aspect of the experimental analysis of behavior. PMID:16812400

  2. AquaLite, a bioluminescent label for immunoassay and nucleic acid detection: quantitative analyses at the attomol level

    NASA Astrophysics Data System (ADS)

    Smith, David F.; Stults, Nancy L.

    1996-04-01

AquaLite® is a direct, bioluminescent label capable of detecting attomol levels of analyte in clinical immunoassays and assays for the quantitative measurement of nucleic acids. Bioluminescent immunoassays (BIAs) require no radioisotopes and avoid complex fluorescent measurements and many of the variables of indirect enzyme immunoassays (EIAs). AquaLite, a recombinant form of the photoprotein aequorin from a bioluminescent jellyfish, is coupled directly to antibodies to prepare bioluminescent conjugates for assay development. When the AquaLite-antibody complex is exposed to a solution containing calcium ions, a flash of blue light (λmax = 469 nm) is generated. The light signal is measured in commercially available luminometers that simultaneously inject a calcium solution and detect subattomol photoprotein levels in either test tubes or microtiter plates. Immunometric or 'sandwich' type assays are available for the quantitative measurement of human endocrine hormones and nucleic acids. The AquaLite TSH assay can detect 1 attomol of thyroid stimulating hormone (TSH) in 0.2 mL of human serum and is a useful clinical tool for diagnosing hyperthyroid patients. AquaLite-based nucleic acid detection permits quantifying attomol levels of specific nucleic acid markers and represents a possible solution to the difficult problem of quantifying the targets of nucleic acid amplification methods.

  3. Quantitative Hydrocarbon Surface Analysis

    NASA Technical Reports Server (NTRS)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  4. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

According to regulations relating to the implementation and use of risk analysis in petroleum activities, issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company in meeting rules and regulations and in assessing and describing the environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life; in addition, it incorporates the effect of seasonal fluctuations into the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  5. Merging MODIS Terra and Aqua Level 3 Aerosol Optical Thickness for Giovanni Online Data Analysis and Visualization

    NASA Astrophysics Data System (ADS)

    Zubko, V.; Leptoukh, G.; Gopalan, A.

    2007-12-01

Holding a vast amount of satellite-obtained environmental data, the Goddard Earth Sciences Data and Information Services Center (GES DISC) researches ways to combine multi-sensor data to increase their usefulness, and to integrate them into the GES DISC Interactive Online Visualization and Analysis Infrastructure (Giovanni). Here, we studied the performance of various methods for merging and interpolating the Moderate Resolution Imaging Spectroradiometer (MODIS) Terra and Aqua Level 3 Aerosol Optical Thickness (AOT). To quickly validate the accuracy of the merger, we introduced two confidence functions, which characterize the percentage of merged AOT pixels as a function of the relative deviation of the merged AOT from the original Terra and Aqua AOTs, scaled by either the original AOT standard deviations or the AOT means. An experiment with three different methods of pure merging (no interpolation): simple arithmetic averaging (SIM), maximum likelihood estimation (MLE), and weighting by pixel counts (WPC), demonstrated that the three methods produce closely similar AOTs, with MLE (SIM) slightly preferable when validating with respect to AOT standard deviations (AOT means). Another experiment, with eight different methods of combined merging and interpolation applied to a variety of scenes with different gap patterns, showed that the best method is to merge the Terra and Aqua AOTs first and then apply Optimal Interpolation to fill in the gaps. The sensitivity of the results to the gap patterns and to the radius of influence was also assessed.
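The three pure-merging estimators named in the abstract (SIM, MLE, WPC) can be sketched for a single Level 3 grid cell as below. This is an illustrative stand-in, not the operational Giovanni code: the function names are invented, and the per-pixel standard deviations and pixel counts are hypothetical inputs.

```python
def merge_sim(t, a):
    """Simple arithmetic average (SIM) of Terra and Aqua AOT for one pixel."""
    return 0.5 * (t + a)

def merge_mle(t, a, sd_t, sd_a):
    """Inverse-variance weighted average, the maximum likelihood estimate (MLE)."""
    wt, wa = 1.0 / sd_t**2, 1.0 / sd_a**2
    return (wt * t + wa * a) / (wt + wa)

def merge_wpc(t, a, n_t, n_a):
    """Average weighted by the pixel counts (WPC) behind each Level 3 value."""
    return (n_t * t + n_a * a) / (n_t + n_a)

# Hypothetical pixel: Terra AOT 0.30 (sd 0.05, 40 counts), Aqua AOT 0.20 (sd 0.10, 10 counts)
print(merge_sim(0.30, 0.20))                  # 0.25
print(merge_mle(0.30, 0.20, 0.05, 0.10))      # ~0.28, pulled toward the lower-variance Terra value
print(merge_wpc(0.30, 0.20, 40, 10))          # ~0.28, pulled toward the higher-count Terra value
```

Note how MLE and WPC both weight the better-sampled sensor more heavily, which is why the abstract finds them close to each other and only slightly different from SIM.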

  6. Quantitative Techniques in Volumetric Analysis

    NASA Astrophysics Data System (ADS)

    Zimmerman, John; Jacobsen, Jerrold J.

    1996-12-01

Quantitative Techniques in Volumetric Analysis is a visual library of techniques used in making volumetric measurements. This 40-minute VHS videotape is designed as a resource for introducing students to proper volumetric methods and procedures. The entire tape, or relevant segments, can also be used to review procedures in subsequent experiments that rely on the traditional art of quantitative analysis laboratory practice. The techniques included are: quantitative transfer of a solid with a weighing spoon; quantitative transfer of a solid with a finger-held weighing bottle; quantitative transfer of a solid with a paper-strap-held bottle; quantitative transfer of a solid with a spatula; examples of common quantitative weighing errors; quantitative transfer of a solid from dish to beaker to volumetric flask; quantitative transfer of a solid from dish to volumetric flask; use of a volumetric transfer pipet; a complete acid-base titration; and hand-technique variations. The conventional view of contemporary quantitative chemical measurement tends to focus on instrumental systems, computers, and robotics. In this view, the analyst is relegated to placing standards and samples on a tray; a robotic arm delivers a sample to the analysis center, while a computer controls the analysis conditions and records the results. In spite of this, it is rare to find an analysis process that does not rely on some aspect of more traditional quantitative analysis techniques, such as careful dilution to the mark of a volumetric flask. Clearly, errors in a classical step will affect the quality of the final analysis. Because of this, it is still important for students to master the key elements of the traditional art of quantitative chemical analysis laboratory practice. 
Some aspects of chemical analysis, like careful rinsing to ensure quantitative transfer, are often an automated part of an instrumental process that must be understood by the analyst. This video material carefully documents several options in the process of quantitatively weighing and transferring a solid, quantitatively transferring a liquid aliquot with a pipet, and performing a quantitative volumetric titration. There are many local variants in each of these procedures. For example, some prefer to transfer solid with a weighing spoon, some with a finger-held bottle, and some with a paper-strap-held bottle. Students should follow the local preference, but should be aware of other acceptable options. Whatever the option chosen, the procedure must be done reproducibly if analysis quality is to be optimized. Acknowledgments: Quantitative Techniques in Volumetric Analysis was created with support from Project SERAPHIM and the NSF Directorate for Education and Human Resources, grant MDR-9154099.

  7. Quantitative analysis of 'calanchi

    NASA Astrophysics Data System (ADS)

    Agnesi, Valerio; Cappadonia, Chiara; Conoscenti, Christian; Costanzo, Dario; Rotigliano, Edoardo

    2010-05-01

Three years (2006-2009) of monitoring data from two calanchi sites located in the western Sicilian Apennines are analyzed and discussed; the data come from two networks of erosion pins and a rainfall gauge station. The aim of the present research is to quantitatively analyze the effects of water erosion and to investigate their relationships with rainfall trends and with specific properties of the two calanchi fronts. Each site was equipped with a grid of randomly distributed erosion pins: 41 nodes at the "Catalfimo" site and 13 nodes at the "Ottosalme" site (in light of the general homogeneity of its geomorphologic conditions). The erosion pins consist of iron stakes, 100 cm long and 1.6 cm in diameter, graduated at 2 cm intervals. Repeated readings at the erosion pins allowed point topographic height variations to be estimated; a total of 21 surveys were made remotely by acquiring high-resolution photographs from a fixed viewpoint. Since the two calanchi sites are very close to each other (a few hundred meters), a single rainfall gauge station was installed, assuming strict climatic homogeneity of the investigated area. Rainfall data have been processed to derive the rain erosivity index signal, detecting a total of 27 erosive events. Despite the short distance between the two sites, because of a different geologic setting, the calanchi fronts are characterized by the outcropping of different levels of the same formation (Terravecchia fm., Middle-Late Miocene); as a consequence, the mineralogical, textural, and geotechnical (index) properties, as well as the topographic and geomorphologic characteristics, change. 
Therefore, in order to define the framework in which the two erosion pin grids were installed, 40 rock samples were analyzed and a detailed geomorphologic survey was carried out; in particular, plasticity index, liquid limit, carbonate content, pH, granulometric fractions and their mineralogic properties, electrical conductivity, and sodium adsorption ratio (SAR) were characterized. Analysis of the data allows relationships to be defined between the response of each erosion pin, the erosive rainfall events, the micro-hydrology of its position, and the lithotechnical properties of the outcropping rocks. The estimates of the mean annual erosion rate and of the erosivity index, as well as the results of the terrain analysis, largely agree with published data from similar sites affected by calanchi development. Moreover, the results reflect well the differences in morphologic features and their distribution on the two calanchi fronts; of particular interest is the spatial distribution and variability of piping landforms, which markedly influence the development of gullies, especially at the "Catalfimo" site, where a high frequency of pipes of different typologies can be detected.

  8. Analysis of Prototype Collection 5 Products of Leaf Area Index from Terra and Aqua MODIS Sensors

    NASA Astrophysics Data System (ADS)

    Shabanov, N. V.; Yang, W.; Dong, H.; Knyazikhin, Y.; Myneni, R.

    2005-12-01

A prototype product suite, containing the Terra 8-day, Aqua 8-day, and combined Terra-Aqua 8- and 4-day products, was generated as part of testing for the next version of the MODIS LAI products, Collection 5. These products were analyzed for consistency between Terra and Aqua retrievals. The potential for combining retrievals from the two sensors to derive improved products, by reducing the impact of environmental conditions and shortening the temporal compositing period, was also explored. The results suggest no significant discrepancies between large-area averages of Terra and Aqua 8-day surface reflectances and LAI products. The differences over smaller regions, however, can be large due to the random nature of residual atmospheric effects. Best-quality radiative-transfer-based retrievals can be expected in 90-95% of the pixels with mostly herbaceous cover and in about 50-75% of the pixels with woody vegetation during the growing season. The rate of best-quality retrievals during the growing season is mostly restricted by aerosol contamination of the MODIS data. The combined 8-day product helps to minimize this effect and increases the amount of best-quality retrievals by 10-20% over woody vegetation. The combined 8-day product did not result in more main-algorithm retrievals during the winter period, because the extent of snow contamination of Terra and Aqua observations is similar. Likewise, the number of cloudy pixels in single-sensor and combined products is also similar. The combined 4-day product provides advantages for fine-time-step phenology monitoring, especially during the transition periods of spring and fall. Implementation of the fine time step does not reduce retrieval accuracy: the LAI magnitudes, seasonal profiles, and amount of best-quality retrievals were found to be comparable between the combined 4-day and the single-sensor 8-day products. 
Finally, it was found that both Terra and Aqua surface reflectances over northern high-latitude needleleaf forests demonstrate seasonal trends inverse to those of the radiative transfer simulations of the MODIS LAI algorithm. This leads to anomalous LAI seasonality in the retrievals and needs to be investigated further against corresponding field measurements through the seasonal cycle. Overall, the obtained results indicate that further improvement of the MODIS LAI products is restricted mainly not by the amount, but by the precision, of the MODIS observations.

  9. Improved centroid moment tensor analyses in the NIED AQUA (Accurate and QUick Analysis system for source parameters)

    NASA Astrophysics Data System (ADS)

    Kimura, H.; Asano, Y.; Matsumoto, T.

    2012-12-01

The rapid determination of hypocentral parameters and their transmission to the public are valuable components of disaster mitigation. We have operated an automatic system for this purpose, termed the Accurate and QUick Analysis system for source parameters (AQUA), since 2005 (Matsumura et al., 2006). In this system, the initial hypocenter, the moment tensor (MT), and the centroid moment tensor (CMT) solutions are automatically determined and posted on the NIED Hi-net Web site (www.hinet.bosai.go.jp). This paper describes improvements made to the AQUA to overcome limitations that became apparent after the 2011 Tohoku Earthquake (05:46:17 UTC, March 11, 2011). The improvements included the processing of NIED F-net velocity-type strong-motion records, because NIED F-net broadband seismographs saturate for great earthquakes such as the 2011 Tohoku Earthquake. These velocity-type strong-motion seismographs provide unsaturated records not only for the 2011 Tohoku Earthquake, but also at recording stations located close to the epicenters of M>7 earthquakes. We used 0.005-0.020 Hz records for M>7.5 earthquakes, in contrast to the 0.01-0.05 Hz records employed in the original system. The initial hypocenters, determined from arrival times picked from seismograms recorded by NIED Hi-net stations, can have large errors in magnitude and hypocenter location, especially for great earthquakes or earthquakes located far from the onland Hi-net network. The size of the 2011 Tohoku Earthquake was initially underestimated by the AQUA to be around M5 at the initial stage of rupture. Numerous aftershocks occurred at the outer rise east of the Japan trench, where a great earthquake is anticipated to occur. Hence, we modified the system to repeat the MT analyses assuming a larger size for all earthquakes whose magnitude was initially underestimated. 
We also broadened the search range of centroid depth for earthquakes located far from the onland Hi-net network. After implementing the above improvements, the CMT solution for the 2011 Tohoku Earthquake was successfully determined with a moment magnitude (Mw) of 8.6 (9.04 × 10^21 Nm). The focal mechanisms and centroid depths of the 2011 Tohoku Earthquake and M>7 aftershocks, as obtained using the improved system, are in agreement with those from the GlobalCMT. The sizes of these earthquakes are also consistent with those of GlobalCMT, with differences of less than Mw 0.1 except for the mainshock (Mw9.1, 5.31 × 10^22 Nm, GlobalCMT). This discrepancy may indicate that the bandwidth used in the analysis is insufficient for an earthquake of this size. To address this shortcoming, we used 0.0025-0.0100 Hz records and obtained a magnitude of Mw8.9 (3.35 × 10^22 Nm). This result is consistent with the GlobalCMT and other results (e.g., Mw 9.0, 3.43 × 10^22 Nm reported by Ozawa et al., 2011; Mw9.0, 4.42 × 10^22 Nm reported by Suzuki et al., 2011). Using the improved system, the CMT analysis for the 2011 Tohoku Earthquake is estimated to be completed within 12 minutes of the origin time.
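The moment magnitudes quoted in this abstract can be checked against the scalar moments using the standard Hanks-Kanamori relation Mw = (2/3)(log10 M0 − 9.1), with M0 in N·m; a quick sketch (the function name is illustrative):

```python
import math

def moment_magnitude(m0_newton_meters):
    """Moment magnitude Mw from scalar seismic moment M0 (N·m),
    via the Hanks-Kanamori relation Mw = (2/3)(log10 M0 - 9.1)."""
    return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)

# Values quoted in the abstract for the 2011 Tohoku Earthquake:
print(f"{moment_magnitude(9.04e21):.2f}")  # 8.57 -> quoted as Mw 8.6 (first improved solution)
print(f"{moment_magnitude(3.35e22):.2f}")  # 8.95 -> quoted as Mw 8.9 (broadened 0.0025-0.0100 Hz band)
print(f"{moment_magnitude(5.31e22):.2f}")  # 9.08 -> quoted as Mw 9.1 (GlobalCMT mainshock)
```

The recomputed values agree with the quoted magnitudes to within rounding, confirming the internal consistency of the abstract's numbers.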

  10. Issues in Data Fusion for Use in an Interactive Online Analysis System using MODIS Terra and Aqua Daily Aerosol Data

    NASA Astrophysics Data System (ADS)

    Gopalan, A.; Zubko, V.; Leptoukh, G. G.

    2008-12-01

Data fusion, defined here as consisting of merging and interpolation, is a method of combining spatio-temporally near-coincident satellite observations to provide complete global or regional maps of geophysical variables for comparison with transport models and ground-station observations. We investigate various methods, challenges, and limitations of data fusion, with and without interpolation, as a first step towards merging datasets archived at the NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC) and made public through the Goddard Interactive Online Visualization and Analysis Infrastructure (Giovanni) data portals. As a prototype for the data fusion algorithm, this study uses daily global observations of Aerosol Optical Thickness (AOT), as measured by the MODerate resolution Imaging Spectroradiometer (MODIS) onboard the Terra and Aqua satellites. The goal is to develop a very fast online method for data fusion for implementation in Giovanni. We demonstrate three different methods of fusion (without interpolation): Simple Arithmetic Averaging (SIM), Maximum Likelihood Estimate (MLE), and Weighting by Pixel Counts (WPC). All three methods are roughly comparable, with MLE (SIM) slightly preferable when validating with respect to the AOT standard deviations (AOT means). To evaluate the fused product, we introduce two confidence functions, which characterize the percentage of fused AOT pixels as a function of the relative deviation of the fused AOT from the initial Terra and Aqua AOTs. Gaps in the daily global maps of AOT arise from regions of sun glint, clouds, gaps between orbit tracks at low latitudes, and other sources of missing data. Data fusion with spatial interpolation produces spatially contiguous fields (global and regional maps) for dust-event tracking and for comparison with, and input to, 3-D global and regional models. 
Eight combinations of merging and interpolation were applied to scenes with regular and irregular data-gap patterns. The Cumulative SemiVariogram (CSV) was found to be sensitive to the spatial distribution and fraction of gap areas and is thus useful for assessing the sensitivity of the merged data to gap patterns and to the radius of influence. Our results show that the merging-interpolation procedure can produce complete spatial fields with acceptable errors. In this work we also examine some of the challenges involved in data fusion, including the treatment of biases in the individual measurements with respect to a validation standard and the assumptions made about the spatial and temporal distribution of the parameter.
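The confidence functions described above (the percentage of fused pixels whose deviation from the original sensor values stays within a given relative threshold) can be sketched as follows. This is a minimal stand-in, not the authors' implementation; the pixel values, per-pixel scale factors, and thresholds are all hypothetical:

```python
def confidence_curve(fused, original, scale, thresholds):
    """For each threshold, return the percentage of pixels whose relative
    deviation |fused - original| / scale is within that threshold.
    `scale` plays the role of the per-pixel AOT standard deviation or mean."""
    n = len(fused)
    pcts = []
    for thr in thresholds:
        ok = sum(1 for f, o, s in zip(fused, original, scale)
                 if abs(f - o) / s <= thr)
        pcts.append(100.0 * ok / n)
    return pcts

# Hypothetical 4-pixel scene: fused AOT vs. original Terra AOT,
# scaled by (assumed) per-pixel Terra standard deviations.
fused    = [0.25, 0.30, 0.18, 0.38]
terra    = [0.24, 0.33, 0.20, 0.30]
sd_terra = [0.05] * 4
print(confidence_curve(fused, terra, sd_terra, [0.5, 1.0, 2.0]))  # [50.0, 75.0, 100.0]
```

A fusion method whose curve rises to 100% at smaller thresholds stays closer to the original sensor values, which is how the two experiments above rank SIM, MLE, and WPC.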

  11. A Quantitative Fitness Analysis Workflow

    PubMed Central

    Lydall, D.A.

    2012-01-01

Quantitative Fitness Analysis (QFA) is an experimental and computational workflow for comparing the fitnesses of microbial cultures grown in parallel [1-4]. QFA can be applied to focused observations of single cultures but is most useful for genome-wide genetic interaction or drug screens investigating up to thousands of independent cultures. The central experimental method is the inoculation of independent, dilute liquid microbial cultures onto solid agar plates, which are incubated and regularly photographed. Photographs from each time-point are analyzed to produce quantitative cell density estimates, which are used to construct growth curves, from which quantitative fitness measures are derived. Culture fitnesses can be compared to quantify and rank genetic interaction strengths or drug sensitivities. The effect on culture fitness of any treatment added to the substrate agar (e.g. small molecules, antibiotics or nutrients) or applied to plates externally (e.g. UV irradiation, temperature) can be quantified by QFA. The QFA workflow produces growth rate estimates analogous to those obtained by spectrophotometric measurement of parallel liquid cultures in 96-well or 200-well plate readers. Importantly, QFA has significantly higher throughput than such methods. QFA cultures grow on a solid agar surface and are therefore well aerated during growth without the need for stirring or shaking. QFA throughput is not as high as that of some Synthetic Genetic Array (SGA) screening methods [5,6]. However, since QFA cultures are heavily diluted before being inoculated onto agar, QFA can capture more complete growth curves, including exponential and saturation phases [3]. For example, growth curve observations allow culture doubling times to be estimated directly with high precision, as discussed previously [1]. Here we present a specific QFA protocol applied to thousands of S. cerevisiae cultures which are automatically handled by robots during inoculation, incubation and imaging. Any of these automated steps can be replaced by an equivalent manual procedure, with an associated reduction in throughput, and we also present a lower-throughput manual protocol. The same QFA software tools can be applied to images captured in either workflow. We have extensive experience applying QFA to cultures of the budding yeast S. cerevisiae, but we expect that QFA will prove equally useful for examining cultures of the fission yeast S. pombe and bacterial cultures. PMID:22907268
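The growth-rate step the abstract describes, deriving a doubling time from a time series of cell-density estimates, can be illustrated with a minimal sketch: fit log(density) against time over the exponential phase and convert the slope to a doubling time. This is an illustrative stand-in under simplified assumptions (pure exponential phase, least-squares fit), not the published QFA software:

```python
import math

def doubling_time(times_h, densities):
    """Estimate culture doubling time (hours) from exponential-phase
    cell-density estimates, via a least-squares fit of log(density) vs time."""
    logs = [math.log(d) for d in densities]
    n = len(times_h)
    mt = sum(times_h) / n
    ml = sum(logs) / n
    slope = (sum((t - mt) * (l - ml) for t, l in zip(times_h, logs))
             / sum((t - mt) ** 2 for t in times_h))  # growth rate per hour
    return math.log(2) / slope  # hours per doubling

# Synthetic culture that exactly doubles every 2 h
times = [0, 2, 4, 6, 8]
dens = [1.0 * 2 ** (t / 2) for t in times]
print(doubling_time(times, dens))  # ~2.0
```

Real QFA data would also include lag and saturation phases, so the full workflow fits a complete growth model rather than a single log-linear segment.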

  12. Aqua CERES First Light

    NASA Technical Reports Server (NTRS)

    2002-01-01

NASA's latest Earth Observing System satellite, Aqua, is dedicated to advancing our understanding of Earth's water cycle. Launched on May 4, 2002, Aqua has successfully completed its checkout period and is fully operational. Collected by multiple instruments, Aqua data and images are crucial to improving our knowledge of global climate change. The Clouds and the Earth's Radiant Energy System (CERES) instrument is one of six on board the Aqua satellite. CERES detects the amount of outgoing heat and reflected sunlight leaving the planet. A detailed understanding of how clouds affect the energy balance is essential for better climate change predictions. These Aqua images show CERES measurements over the United States from June 22, 2002. Clear ocean regions, shown in dark blue on the left image, reflect the least amount of sunlight back to space. Clear land areas, shown in lighter blue, reflect more solar energy. Clouds and snow-covered surfaces, shown in white and green, reflect the greatest amounts of sunlight back to space. Clear warm regions, shown in yellow over much of the western United States on the right image, emit the most heat. High, cold clouds, shown in blue and white, significantly reduce the amount of heat lost to space. Aqua is part of NASA's Earth Science Enterprise, a long-term research effort dedicated to understanding and protecting our home planet. Through the study of Earth, NASA will help provide sound science to policy and economic decision makers to better life here, while developing the technologies needed to explore the universe and search for life beyond our home planet. Image courtesy CERES Science Team, NASA Langley Research Center

  13. Multiyear analysis of Terra/Aqua MODIS aerosol optical depth and ground observations over tropical urban region of Hyderabad, India

    NASA Astrophysics Data System (ADS)

    Kharol, Shailesh Kumar; Badarinath, K. V. S.; Sharma, Anu Rani; Kaskaoutis, D. G.; Kambezidis, H. D.

    2011-03-01

Remote sensing of global aerosols has attracted great scientific interest for a variety of applications related to global warming and climate change. The present study uses Level 2 (10 × 10 km) and Level 3 (1° × 1°) Terra/Aqua MODIS (C005) derived aerosol optical depths at 550 nm (AOD550) and compares them with ground-based (MICROTOPS-II, MT) sun-photometer AOD550 measurements over Hyderabad, India, for the period 2002-2008. The correlation coefficient (R²) between Level 3 Terra/Aqua MODIS and MT AOD550 ranges from 0.30 to 0.46 across seasons. Even lower correlations are revealed when the Level 2 MODIS data are used (R² = 0.16-0.30). The Level 3 MODIS AOD550 significantly underestimates the MT AOD550, while the Level 2 AOD550 values are much larger than those of Level 3. The comparison of Terra and Aqua MODIS AOD550 at the regional scale, and especially over urban/industrial areas with significant diurnal aerosol variation, constitutes a real challenge and may reveal the ability of the two sensors to capture the temporal variation of the aerosol loading within a time interval of ~3 h. The results show relatively good correlation (R² ~ 0.6-0.7) for the Level 3 dataset; the Level 2 data, however, show large scatter and very poor correlations. The mean seasonal AOD550 values are similar, although Terra AOD550 is higher than that obtained from Aqua. Both satellite and ground-based measurements show remarkable increasing trends in AOD over Hyderabad, attributed to the extension of the urbanized area, population growth, motor vehicles, and local emissions.
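The R² statistic used throughout this comparison is the squared Pearson correlation between coincident AOD550 samples from the two instruments. A minimal sketch, using entirely hypothetical sample values:

```python
def r_squared(x, y):
    """Coefficient of determination R^2 as the squared Pearson correlation,
    as used to compare satellite and sun-photometer AOD550 series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov * cov / (vx * vy)

# Hypothetical coincident AOD550 samples (sun photometer vs. satellite)
microtops = [0.45, 0.52, 0.60, 0.38, 0.70]
modis     = [0.40, 0.50, 0.55, 0.35, 0.62]
print(round(r_squared(microtops, modis), 2))  # high agreement in this toy case
```

Note that R² measures only how well one series tracks the other; it is blind to a constant bias, which is why the study reports the systematic underestimation separately from the correlations.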

  14. Echocardiographic visual processing and quantitative analysis

    NASA Astrophysics Data System (ADS)

    Zheng, Yufeng; Du, Ji; Qi, Long; Yu, Daoyin

    1996-12-01

This paper presents a new system for echocardiographic visual processing and quantitative analysis, which can be used in computer-aided diagnosis. Detection of the left ventricular boundaries in cardiac images is essential for quantitative analysis. Our approach incorporates a priori knowledge of heart geometry (a set of constraints), its brightness distribution, and Sobel operator edge detection in polar coordinates into the boundary detection algorithm. We then employ an outline chain-code tracking technique for echocardiogram computation and quantitative analysis. The procedure is demonstrated using echocardiograms of the human heart.
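The Sobel step can be illustrated in Cartesian coordinates (the paper applies it in polar coordinates around the ventricle; this simplified sketch just shows the standard 3×3 kernels producing a gradient magnitude on a toy image):

```python
import math

SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]  # horizontal-gradient kernel
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]  # vertical-gradient kernel

def sobel_magnitude(img):
    """Gradient magnitude via 3x3 Sobel convolution (interior pixels only)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            gx = sum(SOBEL_X[a][b] * img[i - 1 + a][j - 1 + b]
                     for a in range(3) for b in range(3))
            gy = sum(SOBEL_Y[a][b] * img[i - 1 + a][j - 1 + b]
                     for a in range(3) for b in range(3))
            out[i][j] = math.hypot(gx, gy)
    return out

# Toy image with a vertical step edge: bright half on the right
img = [[0, 0, 10, 10]] * 4
mag = sobel_magnitude(img)
print(mag[1])  # [0.0, 40.0, 40.0, 0.0] -- strong response at the edge columns
```

In the paper's setting, running the same operator along rays in polar coordinates makes the roughly circular ventricular boundary appear as a near-horizontal edge, which simplifies the subsequent chain-code tracking.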

  15. Aqua Education and Public Outreach

    NASA Astrophysics Data System (ADS)

    Graham, S. M.; Parkinson, C. L.; Chambers, L. H.; Ray, S. E.

    2011-12-01

    NASA's Aqua satellite was launched on May 4, 2002, with six instruments designed to collect data about the Earth's atmosphere, biosphere, hydrosphere, and cryosphere. Since the late 1990s, the Aqua mission has involved considerable education and public outreach (EPO) activities, including printed products, formal education, an engineering competition, webcasts, and high-profile multimedia efforts. The printed products include Aqua and instrument brochures, an Aqua lithograph, Aqua trading cards, NASA Fact Sheets on Aqua, the water cycle, and weather forecasting, and an Aqua science writers' guide. On-going formal education efforts include the Students' Cloud Observations On-Line (S'COOL) Project, the MY NASA DATA Project, the Earth System Science Education Alliance, and, in partnership with university professors, undergraduate student research modules. Each of these projects incorporates Aqua data into its inquiry-based framework. Additionally, high school and undergraduate students have participated in summer internship programs. An earlier formal education activity was the Aqua Engineering Competition, which was a high school program sponsored by the NASA Goddard Space Flight Center, Morgan State University, and the Baltimore Museum of Industry. The competition began with the posting of a Round 1 Aqua-related engineering problem in December 2002 and concluded in April 2003 with a final round of competition among the five finalist teams. The Aqua EPO efforts have also included a wide range of multimedia products. Prior to launch, the Aqua team worked closely with the Special Projects Initiative (SPI) Office to produce a series of live webcasts on Aqua science and the Cool Science website aqua.nasa.gov/coolscience, which displays short video clips of Aqua scientists and engineers explaining the many aspects of the Aqua mission. 
These video clips, the Aqua website, and numerous presentations have benefited from dynamic visualizations showing the Aqua launch, instrument deployments, instrument sensing, and the Aqua orbit. More recently, in 2008 the Aqua team worked with the ViewSpace production team from the Space Telescope Science Institute to create an 18-minute ViewSpace feature showcasing the science and applications of the Aqua mission. Then in 2010 and 2011, Aqua and other NASA Earth-observing missions partnered with National CineMedia on the "Know Your Earth" (KYE) project. During January and July 2010 and 2011, KYE ran 2-minute segments highlighting questions that promoted global climate literacy on lobby LCD screens in movie theaters throughout the U.S. Among the ongoing Aqua EPO efforts is the incorporation of Aqua data sets onto the Dynamic Planet, a large digital video globe that projects a wide variety of spherical data sets. Aqua also has a highly successful collaboration with EarthSky communications on the production of an Aqua/EarthSky radio show and podcast series. To date, eleven productions have been completed and distributed via the EarthSky network. In addition, a series of eight video podcasts (i.e., vodcasts) are under production by NASA Goddard TV in conjunction with Aqua personnel, highlighting various aspects of the Aqua mission.

  16. Quantitative analysis of digital microscope images.

    PubMed

    Wolf, David E; Samarasekera, Champika; Swedlow, Jason R

    2013-01-01

    This chapter discusses quantitative analysis of digital microscope images, using several exercises as worked examples. The basic concepts of quantitative imaging rest on a well-established foundation of signal theory and quantitative data analysis. The imaging process is treated as a transformation from sample to image, with attention to the limits of, and considerations involved in, quantitative analysis. The chapter introduces the concept of digitally correcting images and focuses on some of the more critical types of data transformation and some of the frequently encountered issues in quantization. Image processing is a form of data processing, comparable to fitting data to a theoretical curve, and in all such cases it is critical that care be taken during every step of transformation, processing, and quantization. PMID:23931513
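    As one concrete instance of "digitally correcting the images," a standard dark-frame/flat-field correction can be sketched as follows. This is a generic illustration on synthetic data, not one of the chapter's own exercises:

```python
import numpy as np

def correct_image(raw, dark, flat):
    """Dark-frame subtraction followed by flat-field correction.

    raw  : raw image counts
    dark : dark frame (detector offset recorded with the shutter closed)
    flat : flat-field image of a uniformly illuminated target
    """
    raw = raw.astype(float)
    dark = dark.astype(float)
    flat = flat.astype(float)
    gain = flat - dark          # per-pixel response of the detector/optics
    gain /= gain.mean()         # normalize so the correction preserves scale
    return (raw - dark) / gain  # corrected intensities

# Synthetic example: a uniform sample seen through uneven illumination
truth = np.full((64, 64), 100.0)               # true, uniform signal
shading = np.linspace(0.5, 1.5, 64)[None, :]   # illumination gradient
dark = np.full((64, 64), 10.0)
raw = truth * shading + dark
flat = 200.0 * shading + dark

corrected = correct_image(raw, dark, flat)
print(float(corrected.std()))  # near zero: the shading has been removed
```

After correction the synthetic image is flat again, which is exactly the property quantitative measurements (e.g., intensity ratios between regions) depend on.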

  17. Quantitative analysis of qualitative images

    NASA Astrophysics Data System (ADS)

    Hockney, David; Falco, Charles M.

    2005-03-01

    We show optical evidence that demonstrates artists as early as Jan van Eyck and Robert Campin (c1425) used optical projections as aids for producing their paintings. We also have found optical evidence within works by later artists, including Bermejo (c1475), Lotto (c1525), Caravaggio (c1600), de la Tour (c1650), Chardin (c1750) and Ingres (c1825), demonstrating a continuum in the use of optical projections by artists, along with an evolution in the sophistication of that use. However, even for paintings where we have been able to extract unambiguous, quantitative evidence of the direct use of optical projections for producing certain of the features, this does not mean that paintings are effectively photographs. Because the hand and mind of the artist are intimately involved in the creation process, understanding these complex images requires more than can be obtained from only applying the equations of geometrical optics.

  18. Analysis of Raman Lidar and radiosonde measurements from the AWEX-G field campaign and its relation to Aqua validation

    NASA Technical Reports Server (NTRS)

    Whiteman, D. N.; Russo, F.; Demoz, B.; Miloshevich, L. M.; Veselovskii, I.; Hannon, S.; Wang, Z.; Vomel, H.; Schmidlin, F.; Lesht, B.

    2005-01-01

    Early work within the Aqua validation activity revealed there to be large differences in water vapor measurement accuracy among the various technologies in use for providing validation data. The validation measurements were made at globally distributed sites making it difficult to isolate the sources of the apparent measurement differences among the various sensors, which included both Raman lidar and radiosonde. Because of this, the AIRS Water Vapor Experiment-Ground (AWEX-G) was held in October - November, 2003 with the goal of bringing validation technologies to a common site for intercomparison and resolution of the measurement discrepancies. Using the University of Colorado Cryogenic Frostpoint Hygrometer (CFH) as the water vapor reference, the AWEX-G field campaign resulted in new correction techniques for both Raman lidar, Vaisala RS80-H and RS90/92 measurements that significantly improve the absolute accuracy of those measurement systems particularly in the upper troposphere. Mean comparisons of radiosondes and lidar are performed demonstrating agreement between corrected sensors and the CFH to generally within 5% thereby providing data of sufficient accuracy for Aqua validation purposes. Examples of the use of the correction techniques in radiance and retrieval comparisons are provided and discussed.

  19. Analysis of Raman Lidar and Radiosonde Measurements from the AWEX-G Field Campaign and Its Relation to Aqua Validation

    NASA Technical Reports Server (NTRS)

    Whiteman, D. N.; Russo, F.; Demoz, B.; Miloshevich, L. M.; Veselovskii, I.; Hannon, S.; Wang, Z.; Vomel, H.; Schmidlin, F.; Lesht, B.; Moore, P. J.; Beebe, A. S.; Gambacorta, A.; Barnet, C.

    2006-01-01

    Early work within the Aqua validation activity revealed there to be large differences in water vapor measurement accuracy among the various technologies in use for providing validation data. The validation measurements were made at globally distributed sites making it difficult to isolate the sources of the apparent measurement differences among the various sensors, which included both Raman lidar and radiosonde. Because of this, the AIRS Water Vapor Experiment-Ground (AWEX-G) was held in October-November 2003 with the goal of bringing validation technologies to a common site for intercomparison and resolving the measurement discrepancies. Using the University of Colorado Cryogenic Frostpoint Hygrometer (CFH) as the water vapor reference, the AWEX-G field campaign permitted correction techniques to be validated for Raman lidar, Vaisala RS80-H and RS90/92 that significantly improve the absolute accuracy of water vapor measurements from these systems particularly in the upper troposphere. Mean comparisons of radiosondes and lidar are performed demonstrating agreement between corrected sensors and the CFH to generally within 5% thereby providing data of sufficient accuracy for Aqua validation purposes. Examples of the use of the correction techniques in radiance and retrieval comparisons are provided and discussed.

  20. Qualitative and Quantitative Analysis: Interpretation of Electropherograms

    NASA Astrophysics Data System (ADS)

    Szumski, Michał; Buszewski, Bogusław

    In this chapter the basic information on qualitative and quantitative analysis in CE is provided. Migration time and spectral data are described as the most important parameters used for identification of compounds. The parameters that negatively influence qualitative analysis are briefly mentioned. In the quantitative analysis section the external standard and internal standard calibration methods are described. Variables influencing peak height and peak area in capillary electrophoresis are briefly summarized. Also, a discussion on electrodispersion and its influence on an observed peak shape is provided.
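    The internal standard calibration mentioned here reduces to a response-factor calculation: the analyte peak area is normalized by the area of a co-injected internal standard (IS) to cancel injection-volume variability. A minimal numeric sketch with invented areas and concentrations:

```python
def response_factor(area_analyte, conc_analyte, area_is, conc_is):
    """RF = (A_analyte / A_IS) * (C_IS / C_analyte), from a calibration run."""
    return (area_analyte / area_is) * (conc_is / conc_analyte)

def quantify(area_analyte, area_is, conc_is, rf):
    """Unknown concentration from the peak-area ratio and the response factor."""
    return (area_analyte / area_is) * conc_is / rf

# Calibration run: 10 mg/L analyte spiked with 5 mg/L internal standard
rf = response_factor(area_analyte=2000.0, conc_analyte=10.0,
                     area_is=1000.0, conc_is=5.0)

# Sample run: same IS concentration, new measured areas
c_unknown = quantify(area_analyte=3000.0, area_is=800.0, conc_is=5.0, rf=rf)
print(rf, c_unknown)  # 1.0 18.75
```

Because both peaks experience the same injection, the area *ratio* is robust even when absolute peak areas drift between runs, which is the point of the internal standard method.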

  1. Quantitative WDS analysis using electron probe microanalyzer

    SciTech Connect

    Ul-Hamid, Anwar . E-mail: anwar@kfupm.edu.sa; Tawancy, Hani M.; Mohammed, Abdul-Rashid I.; Al-Jaroudi, Said S.; Abbas, Nureddin M.

    2006-04-15

    In this paper, the procedure for conducting quantitative elemental analysis by the ZAF correction method using wavelength dispersive X-ray spectroscopy (WDS) in an electron probe microanalyzer (EPMA) is elaborated. Analysis of a thermal barrier coating (TBC) system formed on a Ni-based single crystal superalloy is presented as an example to illustrate the analysis of samples consisting of a large number of major and minor elements. The analysis was performed using known standards and measured peak-to-background intensity ratios. The procedure for using separate sets of acquisition conditions for major and minor element analysis is explained and its importance is stressed.

  2. A Quantitative Approach to Scar Analysis

    PubMed Central

    Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia

    2011-01-01

    Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology. PMID:21281794
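    The two measures named in this abstract are straightforward to compute on a binarized collagen image. A minimal sketch, not the authors' code: box-counting for the fractal dimension and a gliding-box estimate for lacunarity (window sizes are illustrative assumptions):

```python
import numpy as np

def box_count(img, size):
    """Number of size x size boxes containing any foreground pixel."""
    h, w = img.shape
    count = 0
    for i in range(0, h, size):
        for j in range(0, w, size):
            if img[i:i + size, j:j + size].any():
                count += 1
    return count

def fractal_dimension(img, sizes=(2, 4, 8, 16, 32)):
    """Box-counting dimension: slope of log N(s) versus log(1/s)."""
    counts = [box_count(img, s) for s in sizes]
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

def lacunarity(img, size):
    """Gliding-box lacunarity: var/mean^2 + 1 of the box masses."""
    from numpy.lib.stride_tricks import sliding_window_view
    masses = sliding_window_view(img, (size, size)).sum(axis=(2, 3))
    m = masses.mean()
    return masses.var() / (m * m) + 1.0

# Sanity check: a completely filled plane has dimension 2 and lacunarity 1
img = np.ones((64, 64), dtype=np.uint8)
print(round(fractal_dimension(img), 2))  # 2.0
print(lacunarity(img, 8))                # 1.0
```

Scar tissue, with its more parallel collagen bundles, would be expected to yield a lower fractal dimension and different lacunarity than the more basket-weave architecture of unwounded dermis, which is the discrimination the study reports.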

  3. Seniors' Online Communities: A Quantitative Content Analysis

    ERIC Educational Resources Information Center

    Nimrod, Galit

    2010-01-01

    Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…

  4. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.
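    Quantitation from the conductivity detector ultimately reduces to integrating each eluted peak and applying a calibration. A minimal sketch with a synthetic peak; the calibration numbers are illustrative, not from the patent:

```python
import numpy as np

def peak_area(time, signal, baseline=0.0):
    """Trapezoidal area of a detector peak above a flat baseline."""
    y = np.clip(np.asarray(signal, dtype=float) - baseline, 0.0, None)
    t = np.asarray(time, dtype=float)
    return float(((y[1:] + y[:-1]) * 0.5 * np.diff(t)).sum())

# Synthetic conductivity trace: a Gaussian peak of known area as a sanity check
t = np.linspace(0.0, 10.0, 2001)          # retention-time axis, minutes
sigma, area_true = 0.3, 5.0
peak = (area_true / (sigma * np.sqrt(2 * np.pi))
        * np.exp(-((t - 5.0) ** 2) / (2 * sigma ** 2)))

a = peak_area(t, peak)

# One-point external calibration: area per unit concentration
k = a / 10.0               # response from a 10 ppm standard (illustrative)
c_unknown = (0.5 * a) / k  # a half-size sample peak -> 5 ppm
print(round(a, 3), round(c_unknown, 3))
```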

  5. Comprehensive quantitative analysis on privacy leak behavior.

    PubMed

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046

  6. Comprehensive Quantitative Analysis on Privacy Leak Behavior

    PubMed Central

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046

  7. High-energy PIXE: quantitative analysis

    NASA Astrophysics Data System (ADS)

    Denker, A.; Opitz-Coutureau, J.; Campbell, J. L.; Maxwell, J. A.; Hopman, T.

    2004-06-01

    In recent years, high-energy PIXE was applied successfully for qualitative analysis on art and archaeological objects, e.g. coins, bronzes, sculptures, brooches. However, in the absence of software for quantitative analysis the full benefit inherent in the PIXE technique was not obtained. For example, a bronze could easily be distinguished from a brass, but the concentrations could not be rigorously compared within a set of bronzes. In this paper, the first quantitative analysis by high-energy PIXE is presented. The Guelph PIXE Software Package GUPIX has been extended to proton energies up to 100 MeV, so that high-energy PIXE spectra can be evaluated and concentrations derived. Measurements on metal and alloy standards at two different proton energies have been performed and the obtained compositions were compared to the certified values. The results will be presented and deviations discussed.

  8. Quantitative analysis of colony morphology in yeast

    PubMed Central

    Ruusuvuori, Pekka; Lin, Jake; Scott, Adrian C.; Tan, Zhihao; Sorsa, Saija; Kallio, Aleksi; Nykter, Matti; Yli-Harja, Olli; Shmulevich, Ilya; Dudley, Aimée M.

    2014-01-01

    Microorganisms often form multicellular structures such as biofilms and structured colonies that can influence the organism’s virulence, drug resistance, and adherence to medical devices. Phenotypic classification of these structures has traditionally relied on qualitative scoring systems that limit detailed phenotypic comparisons between strains. Automated imaging and quantitative analysis have the potential to improve the speed and accuracy of experiments designed to study the genetic and molecular networks underlying different morphological traits. For this reason, we have developed a platform that uses automated image analysis and pattern recognition to quantify phenotypic signatures of yeast colonies. Our strategy enables quantitative analysis of individual colonies, measured at a single time point or over a series of time-lapse images, as well as the classification of distinct colony shapes based on image-derived features. Phenotypic changes in colony morphology can be expressed as changes in feature space trajectories over time, thereby enabling the visualization and quantitative analysis of morphological development. To facilitate data exploration, results are plotted dynamically through an interactive Yeast Image Analysis web application (YIMAA; http://yimaa.cs.tut.fi) that integrates the raw and processed images across all time points, allowing exploration of the image-based features and principal components associated with morphological development. PMID:24447135

  9. Quantitative image analysis of celiac disease

    PubMed Central

    Ciaccio, Edward J; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2015-01-01

    We outline the use of quantitative techniques that are currently used for analysis of celiac disease. Image processing techniques can be useful to statistically analyze the pixel data of endoscopic images acquired with standard or videocapsule endoscopy. It is shown how current techniques have evolved to become more useful for gastroenterologists who seek to understand celiac disease and to screen for it in suspected patients. New directions for focus in the development of methodology for diagnosis and treatment of this disease are suggested. It is evident that there are yet broad areas where there is potential to expand the use of quantitative techniques for improved analysis in suspected or known celiac disease patients. PMID:25759524

  10. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity in modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
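    The prioritization idea can be sketched as a simple scoring rule. The scales, weighting scheme, and scenario names below are invented for illustration; the paper does not specify a particular formula:

```python
# Rank hazard scenarios by severity and likelihood, then discount those
# that would be hard to model quantitatively ("modeling difficulty").
# All scores are on illustrative 1-5 ordinal scales.

def priority(severity, likelihood, difficulty):
    """Higher risk and lower modeling difficulty -> better candidate
    for quantitative analysis."""
    risk = severity * likelihood  # simple risk index
    return risk / difficulty      # discount hard-to-model scenarios

scenarios = {
    "wake encounter on parallel approach": (5, 3, 2),
    "runway incursion":                    (5, 2, 4),
    "minor taxiway conflict":              (2, 4, 1),
}

ranked = sorted(scenarios, key=lambda s: priority(*scenarios[s]), reverse=True)
for name in ranked:
    print(name, round(priority(*scenarios[name]), 2))
```

Note the effect of the third metric: a lower-risk but easily modeled scenario can outrank a higher-risk one whose quantitative model would be intractable, which is the framework's stated purpose.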

  11. Aqua Satellite Mission Educational Outreach

    NASA Astrophysics Data System (ADS)

    Parkinson, C. L.; Graham, S. M.

    2003-12-01

    An important component of the Aqua mission, launched into space on May 4, 2002 with a suite of six instruments from the U.S., Japan, and Brazil, is the effort to educate the public about the mission and the science topics that it addresses. This educational outreach includes printed products, web casts, other web-based materials, animations, presentations, and a student contest. The printed products include brochures for the mission as a whole and for the instruments, NASA Fact Sheets on the mission, the water cycle, and weather forecasting, an Aqua Science Writers' Guide, an Aqua lithograph, posters, and trading cards. Animations include animations of the launch, the orbit, instrument deployments, instrument sensing, and several of the data products. Each of these materials is available on the Aqua web site at http://aqua.nasa.gov, as are archived versions of the eight Aqua web casts. The web casts were done live on the internet and focused on the spacecraft, the science, the launch, and the validation efforts. All web casts had key Aqua personnel as live guests and had a web-based chat session allowing viewers to ask questions. Other web-based materials include a "Cool Science" section of the aqua.nasa.gov website, with videos of Aqua scientists and engineers speaking about Aqua and the science and engineering behind it, arranged in a framework organized for the convenience of teachers dealing with core curriculum requirements. The web casts and "Cool Science" site were produced by the Special Project Initiatives Office at NASA's Goddard Space Flight Center. Outreach presentations about Aqua have been given at schools, universities, and public forums at many locations around the world, especially in the U.S. A competition was held for high school students during the 2002-03 school year, culminating in April 2003, with five finalist teams competing for the top slots, followed by an awards ceremony. 
The competition had all the student teams analyzing an anomalous situation encountered by Aqua shortly after launch and the five finalist teams determining how best to handle a hypothetical degradation of the solid state recorder.

  12. Quantitative resilience analysis through control design.

    SciTech Connect

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U. S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  13. Quantitative analysis of retinal changes in hypertension

    NASA Astrophysics Data System (ADS)

    Giansanti, Roberto; Boemi, Massimo; Fumelli, Paolo; Passerini, Giorgio; Zingaretti, Primo

    1995-05-01

    Arterial hypertension is a high prevalence disease in Western countries and it is associated with increased risk for cardiovascular accidents. Retinal vessel changes are common findings in patients suffering from long-standing hypertensive disease. Morphological evaluations of the fundus oculi represent a fundamental tool for the clinical approach to the patient with hypertension. A qualitative analysis of the retinal lesions is usually performed and this implies severe limitations both in the classification of the different degrees of the pathology and in the follow-up of the disease. A diagnostic system based on a quantitative analysis of the retinal changes could overcome these problems. Our computerized approach was intended for this scope. The paper concentrates on the results and the implications of a computerized approach to the automatic extraction of numerical indexes describing morphological details of the fundus oculi. A previously developed image processing and recognition system, documented elsewhere and briefly described here, was successfully tested in pre-clinical experiments and applied in the evaluation of normal as well as of pathological fundus. The software system was developed to extract indexes such as caliber and path of vessels, local tortuosity of arteries and arterioles, positions and angles of crossings between two vessels. The reliability of the results, justified by their low variability, makes feasible the standardization of quantitative parameters to be used both in the diagnosis and in the prognosis of hypertension, and also allows prospective studies based upon them.
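    The local tortuosity index mentioned in this abstract is commonly computed as the arc-chord ratio of a vessel centerline. A minimal sketch under that assumption (this is not the authors' software):

```python
import numpy as np

def tortuosity(points):
    """Arc-chord tortuosity of a vessel centerline.

    points : (N, 2) array of centerline coordinates.
    Returns arc length / chord length (1.0 for a straight segment).
    """
    pts = np.asarray(points, dtype=float)
    arc = np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))
    chord = np.linalg.norm(pts[-1] - pts[0])
    return arc / chord

# Straight vessel segment: tortuosity is exactly 1
straight = [(0, 0), (1, 1), (2, 2)]

# Sinusoidally wandering vessel: tortuosity exceeds 1
x = np.linspace(0, 2 * np.pi, 200)
wavy = np.column_stack([x, np.sin(x)])

print(round(tortuosity(straight), 3))  # 1.0
print(round(tortuosity(wavy), 3))      # > 1
```

In a screening system such as the one described, per-segment indexes like this would be aggregated over the segmented vessel tree to yield the numerical descriptors of the fundus.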

  14. Quantitative interactome analysis reveals a chemoresistant edgotype

    PubMed Central

    Chavez, Juan D.; Schweppe, Devin K.; Eng, Jimmy K.; Zheng, Chunxiang; Taipale, Alex; Zhang, Yiyi; Takara, Kohji; Bruce, James E.

    2015-01-01

    Chemoresistance is a common mode of therapy failure for many cancers. Tumours develop resistance to chemotherapeutics through a variety of mechanisms, with proteins serving pivotal roles. Changes in protein conformations and interactions affect the cellular response to environmental conditions contributing to the development of new phenotypes. The ability to understand how protein interaction networks adapt to yield new function or alter phenotype is limited by the inability to determine structural and protein interaction changes on a proteomic scale. Here, chemical crosslinking and mass spectrometry were employed to quantify changes in protein structures and interactions in multidrug-resistant human carcinoma cells. Quantitative analysis of the largest crosslinking-derived, protein interaction network comprising 1,391 crosslinked peptides allows for ‘edgotype' analysis in a cell model of chemoresistance. We detect consistent changes to protein interactions and structures, including those involving cytokeratins, topoisomerase-2-alpha, and post-translationally modified histones, which correlate with a chemoresistant phenotype. PMID:26235782

  15. Automated quantitative image analysis of nanoparticle assembly

    NASA Astrophysics Data System (ADS)

    Murthy, Chaitanya R.; Gao, Bo; Tao, Andrea R.; Arya, Gaurav

    2015-05-01

    The ability to characterize higher-order structures formed by nanoparticle (NP) assembly is critical for predicting and engineering the properties of advanced nanocomposite materials. Here we develop a quantitative image analysis software to characterize key structural properties of NP clusters from experimental images of nanocomposites. This analysis can be carried out on images captured at intermittent times during assembly to monitor the time evolution of NP clusters in a highly automated manner. The software outputs averages and distributions in the size, radius of gyration, fractal dimension, backbone length, end-to-end distance, anisotropic ratio, and aspect ratio of NP clusters as a function of time along with bootstrapped error bounds for all calculated properties. The polydispersity in the NP building blocks and biases in the sampling of NP clusters are accounted for through the use of probabilistic weights. This software, named Particle Image Characterization Tool (PICT), has been made publicly available and could be an invaluable resource for researchers studying NP assembly. To demonstrate its practical utility, we used PICT to analyze scanning electron microscopy images taken during the assembly of surface-functionalized metal NPs of differing shapes and sizes within a polymer matrix. PICT is used to characterize and analyze the morphology of NP clusters, providing quantitative information that can be used to elucidate the physical mechanisms governing NP assembly. Electronic supplementary information (ESI) available: Derivation and implementation of unbiased feature measurement, calculation of empirical distribution of single particle areas, calculation of self-similarity dimensions by regression on cluster data, and validation of image analysis algorithms. See DOI: 10.1039/c5nr00809c
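    Some of the per-cluster metrics listed above (radius of gyration, anisotropic ratio) can be computed from segmented pixel coordinates via the gyration tensor. A minimal sketch of that calculation, not PICT itself:

```python
import numpy as np

def cluster_metrics(coords):
    """Structural metrics of one cluster from its 2-D pixel coordinates.

    Returns the radius of gyration and an anisotropy ratio: the square root
    of the ratio of the gyration tensor's eigenvalues (1.0 for an isotropic
    cluster; it diverges as the cluster approaches a perfect line).
    """
    pts = np.asarray(coords, dtype=float)
    centered = pts - pts.mean(axis=0)
    gyration_tensor = centered.T @ centered / len(pts)
    eigvals = np.sort(np.linalg.eigvalsh(gyration_tensor))
    rg = np.sqrt(eigvals.sum())
    anisotropy = np.sqrt(eigvals[-1] / eigvals[0])
    return rg, anisotropy

# Sanity check: a 5x5 square grid of particles is perfectly isotropic
xs, ys = np.meshgrid(range(5), range(5))
square = np.column_stack([xs.ravel(), ys.ravel()])
rg, aniso = cluster_metrics(square)
print(round(float(rg), 3), round(float(aniso), 3))  # 2.0 1.0
```

Computed per cluster and per frame, metrics like these give exactly the kind of time-resolved distributions the abstract describes.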

  16. Uncertainty analysis in quantitative risk assessment

    SciTech Connect

    Quin, S.; Widera, G.E.O.

    1996-02-01

    Inservice inspection is of great significance to a number of industries, especially the petrochemical and nuclear power industries. Of the quantitative approaches applied to inservice inspection, the failure modes, effects, and criticality analysis (FMECA) methodology is recommended. FMECA can provide a straightforward illustration of how risk can be used to prioritize components for inspection (ASME, 1991). But, at present, it has two limitations. One is that it cannot be used in the situation where components have multiple failure modes. The other is that it cannot be used in the situation where the uncertainties in the data of components have nonuniform distributions. In engineering practice, these two situations exist in many cases. In this paper, two methods based on fuzzy set theory are presented to treat these problems. The methods proposed here can be considered as a supplement to FMECA, thus extending its range of applicability.

  17. Near Real Time Quantitative Gas Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Herget, William F.; Tromp, Marianne L.; Anderson, Charles R.

    1985-12-01

    A Fourier transform infrared (FT-IR) - based system has been developed and is undergoing evaluation for near real time multicomponent quantitative analysis of undiluted gaseous automotive exhaust emissions. The total system includes: (1) a gas conditioning system (GCS) for tracer gas injection, gas mixing, and temperature stabilization; and (2) an exhaust gas analyzer (EGA) consisting of a sample cell, an FT-IR system, and a computerized data processing system. Tests have shown that the system can monitor about 20 individual species (concentrations down to the 1-20 ppm range) with a time resolution of one second. Tests have been conducted on a chassis dynamometer system utilizing different autos, different fuels, and different driving cycles. Results were compared with those obtained using a standard constant volume sampling (CVS) system.

  18. Method for quantitative analysis of flocculation performance.

    PubMed

    Tse, Ian C; Swetland, Karen; Weber-Shirk, Monroe L; Lion, Leonard W

    2011-05-01

    The sedimentation rate and the post-sedimentation residual turbidity of flocculated suspensions are properties central to the design and operation of unit processes following flocculation in a water treatment plant. A method for comparing flocculation performance based on these two properties is described. The flocculation residual turbidity analyzer (FReTA) records the turbidity of flocculent suspensions undergoing quiescent settling. The fixed distance across which flocs must travel to clear the measurement volume allows sedimentation velocity distributions of the flocculent suspension to be calculated from the raw turbidity data. By fitting the transformed turbidity data with a modified gamma distribution, the mean and variance of sedimentation velocity can be obtained along with the residual turbidity after a period of settling. This new analysis method can be used to quantitatively compare how differences in flocculator operating conditions affect the sedimentation velocity distribution of flocs as well as the post-sedimentation residual turbidity. PMID:21497877
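    FReTA fits a modified gamma distribution to the transformed turbidity data. As a simplified stand-in for that step, a plain gamma distribution can be fit by the method of moments to a sample of sedimentation velocities; the velocities below are simulated, not FReTA data, and the parameter values are illustrative:

```python
import numpy as np

def gamma_moments(v):
    """Gamma shape/scale from the sample mean and variance
    (method of moments: shape = mean^2/var, scale = var/mean)."""
    mean, var = v.mean(), v.var()
    return mean * mean / var, var / mean

# Simulated floc sedimentation velocities (mm/s), gamma-distributed
rng = np.random.default_rng(42)
true_shape, true_scale = 3.0, 0.5
velocities = rng.gamma(true_shape, true_scale, size=100_000)

shape, scale = gamma_moments(velocities)
print(round(shape, 2), round(scale, 2))  # close to 3.0 and 0.5
```

In the instrument itself the velocity sample comes from the fixed settling distance: a floc that clears the measurement volume at time t settled at v = d/t, so the turbidity decay curve maps directly onto the cumulative velocity distribution being fit.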

  19. Quantitative analysis of phagocytosis and phagosome maturation.

    PubMed

    Sattler, Natascha; Monroy, Roger; Soldati, Thierry

    2013-01-01

    Phagocytosis and phagosome maturation lead to killing and digestion of bacteria by protozoans and innate immune phagocytes. Phagocytosis of particles expressing or coupled to various fluorescent reporters and sensors can be used to monitor quantitatively various parameters of this central biological process. In this chapter we detail different labeling techniques of bacteria and latex beads used to measure adhesion and uptake by FACS analysis. We also describe methods to use fluorescent reporter dyes (FITC or DQgreen) coupled to silica beads to measure the kinetics of acidification and proteolysis. Measurements can be performed either at the single-cell level, using live microscopy, or for a whole cell population, with a fluorescence microplate reader. PMID:23494319

  20. Aqua 10 Years After Launch

    NASA Technical Reports Server (NTRS)

    Parkinson, Claire L.

    2013-01-01

    A little over ten years ago, in the early morning hours of May 4, 2002, crowds of spectators stood anxiously watching as the Delta II rocket carrying NASA's Aqua spacecraft lifted off from its launch pad at Vandenberg Air Force Base in California at 2:55 a.m. The rocket quickly went through a low-lying cloud cover, after which the main portion of the rocket fell to the waters below and the rocket's second stage proceeded to carry Aqua south across the Pacific, onward over Antarctica, and north to Africa, where the spacecraft separated from the rocket 59.5 minutes after launch. Then, 12.5 minutes later, the solar array unfurled over Europe, and Aqua was on its way in the first of what by now have become over 50,000 successful orbits of the Earth.

  1. Quantitative Analysis of Tremors in Welders

    PubMed Central

    Sanchez-Ramos, Juan; Reimer, Dacy; Zesiewicz, Theresa; Sullivan, Kelly; Nausieda, Paul A.

    2011-01-01

    Background: Workers chronically exposed to manganese in welding fumes may develop an extra-pyramidal syndrome with postural and action tremors. Objectives: To determine the utility of tremor analysis in distinguishing tremors among workers exposed to welding fumes, patients with Idiopathic Parkinson’s Disease (IPD), and patients with Essential Tremor (ET). Methods: Retrospective study of tremor recorded in welders and in subjects from academic movement disorders clinics. Quantitative tremor analysis was performed and associated with clinical status. Results: Postural tremor intensity was increased in welders and ET subjects and was associated with visibly greater amplitude of tremor with arms extended. Mean center frequencies (Cf) of welders and patients with ET were significantly higher than the mean Cf of IPD subjects. Although both the welders and the ET group exhibited a higher Cf with arms extended, welders could be distinguished from the ET subjects by a significantly lower Cf of the rest tremor than that measured in ET subjects. Conclusions: In the context of an appropriate exposure history and neurological examination, tremor analysis may be useful in the diagnosis of manganese-related extra-pyramidal manifestations. PMID:21655131

  2. Quantitative proteomic analysis of single pancreatic islets

    PubMed Central

    Waanders, Leonie F.; Chwalek, Karolina; Monetti, Mara; Kumar, Chanchal; Lammert, Eckhard; Mann, Matthias

    2009-01-01

    Technological developments make mass spectrometry (MS)-based proteomics a central pillar of biochemical research. MS has been very successful in cell culture systems, where sample amounts are not limiting. To extend its capabilities to extremely small, physiologically distinct cell types isolated from tissue, we developed a high sensitivity chromatographic system that measures nanogram protein mixtures for 8 h with very high resolution. This technology is based on splitting gradient effluents into a capture capillary and provides an inherent technical replicate. In a single analysis, this allowed us to characterize kidney glomeruli isolated by laser capture microdissection to a depth of more than 2,400 proteins. From pooled pancreatic islets of Langerhans, another type of “miniorgan,” we obtained an in-depth proteome of 6,873 proteins, many of them involved in diabetes. We quantitatively compared the proteome of single islets, containing 2,000–4,000 cells, treated with high or low glucose levels, and covered most of the characteristic functions of beta cells. Our ultrasensitive analysis recapitulated known hyperglycemic changes but also revealed up-regulated components such as the mitochondrial stress regulator Park7. Direct proteomic analysis of functionally distinct cellular structures opens up perspectives in physiology and pathology. PMID:19846766

  3. Quantitative Analysis of Triple Mutant Genetic Interactions

    PubMed Central

    Braberg, Hannes; Alexander, Richard; Shales, Michael; Xu, Jiewei; Franks-Skiba, Kathleen E.; Wu, Qiuqin; Haber, James E.; Krogan, Nevan J.

    2014-01-01

    The quantitative analysis of genetic interactions between pairs of gene mutations has proven effective for characterizing cellular functions but can miss important interactions for functionally redundant genes. To address this limitation, we have developed an approach termed Triple Mutant Analysis (TMA). The procedure relies on a query strain that contains two deletions in a pair of redundant or otherwise related genes, which is crossed against a panel of candidate deletion strains to isolate triple mutants and measure their growth. A central feature of TMA is to interrogate mutants that are synthetically sick when two other genes are deleted but interact minimally with either single deletion. This approach has been valuable for discovering genes that restore critical functions when the principal actors are deleted. TMA has also uncovered double mutant combinations that produce severe defects because a third protein becomes deregulated and acts in a deleterious fashion, and it has revealed functional differences between proteins presumed to act together. The protocol is optimized for Singer ROTOR pinning robots, takes 3 weeks to complete, and measures interactions for up to 30 double mutants against a library of 1536 single mutants. PMID:25010907

  4. Quantitative analysis of infrared contrast enhancement algorithms

    NASA Astrophysics Data System (ADS)

    Weith-Glushko, Seth; Salvaggio, Carl

    2007-04-01

    Dynamic range reduction and contrast enhancement are two image-processing methods required when developing thermal camera systems. The two methods must be performed in such a way that the high-dynamic-range imagery output from current sensors is compressed in a pleasing way for display on lower-dynamic-range monitors. This research presents a quantitative analysis of four infrared contrast enhancement algorithms, three drawn from the literature and one developed by the author: tail-less plateau equalization (TPE), adaptive plateau equalization (APE), the method according to Aare Mällo (MEAM), and infrared multi-scale retinex (IMSR). TPE and APE are histogram-based methods, requiring the calculation of the probability density of digital counts within an image. MEAM and IMSR are frequency-domain methods, which operate on input imagery that has been split into components of differing spatial frequency content. After a rate-of-growth analysis and a psychophysical trial, MEAM was found to be the best algorithm.
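    Plateau equalization, the family to which TPE and APE belong, can be sketched in a few lines. This is a generic variant with invented parameter values, not the specific algorithms evaluated in the paper: the histogram is clipped at a plateau value before the cumulative mapping is built, which keeps heavily populated background bins from consuming most of the output gray levels.

```python
import numpy as np

def plateau_equalize(img, plateau, out_max=255):
    """Histogram-equalize a high-dynamic-range frame after clipping the
    histogram at `plateau`, so dominant background bins cannot
    monopolize the output gray levels."""
    hist = np.bincount(img.ravel(), minlength=int(img.max()) + 1)
    clipped = np.minimum(hist, plateau)           # cap each bin at the plateau
    cdf = np.cumsum(clipped).astype(float)
    lut = np.clip(np.round(cdf / cdf[-1] * out_max), 0, out_max).astype(np.uint8)
    return lut[img]

rng = np.random.default_rng(0)
frame = rng.integers(1000, 5000, size=(64, 64))   # stand-in 14-bit IR frame
display = plateau_equalize(frame, plateau=50)     # 8-bit display image
```

The plateau value controls the trade-off: a low plateau approaches a linear stretch, while a high plateau approaches ordinary histogram equalization.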

  5. QUANTITATIVE TRAIT LOCUS ANALYSIS AND METABOLIC PATHWAYS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The development of molecular markers for crop plants has enabled research on the genetic basis of quantitative traits. However, despite more than a decade of these studies, called quantitative trait locus (QTL) analyses, the molecular basis for variation in most agronomic traits is still largely unk...

  6. Applying Knowledge of Quantitative Design and Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  7. Quantitative methods for ecological network analysis.

    PubMed

    Ulanowicz, Robert E

    2004-12-01

    The analysis of networks of ecological trophic transfers is a useful complement to simulation modeling in the quest for understanding whole-ecosystem dynamics. Trophic networks can be studied in quantitative and systematic fashion at several levels. Indirect relationships between any two individual taxa in an ecosystem, which often differ in either nature or magnitude from their direct influences, can be assayed using techniques from linear algebra. The same mathematics can also be employed to ascertain where along the trophic continuum any individual taxon is operating, or to map the web of connections into a virtual linear chain that summarizes trophodynamic performance by the system. Backtracking algorithms with pruning have been written which identify pathways for the recycle of materials and energy within the system. The pattern of such cycling often reveals modes of control or types of functions exhibited by various groups of taxa. The performance of the system as a whole at processing material and energy can be quantified using information theory. In particular, the complexity of process interactions can be parsed into separate terms that distinguish organized, efficient performance from the capacity for further development and recovery from disturbance. Finally, the sensitivities of the information-theoretic system indices appear to identify the dynamical bottlenecks in ecosystem functioning. PMID:15556474
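    The linear-algebra step mentioned above can be illustrated on a toy web. Here G and its entries are invented: G[i, j] is the fraction of taxon j's intake received directly from taxon i, so matrix powers give multi-step (indirect) flows, and the geometric series (I - G)^-1 sums direct and indirect influence.

```python
import numpy as np

# hypothetical 3-taxon web: 0 -> 1 -> 2, plus a weak direct link 0 -> 2
G = np.array([[0.0, 0.8, 0.1],
              [0.0, 0.0, 0.7],
              [0.0, 0.0, 0.0]])

direct = G
two_step = G @ G                        # indirect influence via one intermediary
total = np.linalg.inv(np.eye(3) - G)    # sum of G^0 + G^1 + G^2 + ...

# taxon 0's total influence on taxon 2 combines the weak direct link (0.1)
# with the dominant indirect path through taxon 1 (0.8 * 0.7 = 0.56)
```

This is exactly the sense in which indirect relationships "often differ in either nature or magnitude" from direct ones: here the indirect contribution is more than five times the direct link.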

  8. Quantitative analysis of planetary reflectance spectra with principal components analysis

    NASA Technical Reports Server (NTRS)

    Johnson, P. E.; Smith, M. O.; Adams, J. B.

    1985-01-01

    A technique is presented for quantitative analysis of planetary reflectance spectra as mixtures of particles on microscopic and macroscopic scales using principal components analysis. This technique allows for determination of the endmembers being mixed, their abundance, and the scale of mixing, as well as other physical parameters. Eighteen lunar telescopic reflectance spectra of the Copernicus crater region, from 600 nm to 1800 nm in wavelength, are modeled in terms of five likely endmembers: mare basalt, mature mare soil, anorthosite, mature highland soil, and clinopyroxene. These endmembers were chosen from a similar analysis of 92 lunar soil and rock samples. The models fit the data to within 2 percent rms. It is found that the goodness of fit is marginally better for intimate mixing over macroscopic mixing.
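    For the macroscopic-mixing case, the model reduces to ordinary linear least squares on endmember spectra. Below is a sketch with invented endmember reflectances; intimate (microscopic) mixing is nonlinear in reflectance and is typically handled by first converting to single-scattering albedo, which is not shown here.

```python
import numpy as np

wavelengths = np.linspace(600, 1800, 7)   # nm, matching the study's range
# hypothetical endmember reflectance spectra sampled at 7 wavelengths
basalt = np.array([0.08, 0.09, 0.10, 0.11, 0.10, 0.12, 0.13])
soil   = np.array([0.15, 0.18, 0.22, 0.26, 0.29, 0.31, 0.33])
E = np.column_stack([basalt, soil])       # endmember matrix (bands x endmembers)

observed = 0.3 * basalt + 0.7 * soil      # synthetic 30/70 macroscopic mixture
abund, *_ = np.linalg.lstsq(E, observed, rcond=None)
rms = np.sqrt(np.mean((E @ abund - observed) ** 2))
```

With real spectra the fit is not exact, and the rms misfit (2 percent in the study) is itself a diagnostic of whether the chosen endmembers and mixing scale are appropriate.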

  9. Some Epistemological Considerations Concerning Quantitative Analysis

    ERIC Educational Resources Information Center

    Dobrescu, Emilian

    2008-01-01

    This article presents the author's address at the 2007 "Journal of Applied Quantitative Methods" ("JAQM") prize awarding festivity. The festivity was included in the opening of the 4th International Conference on Applied Statistics, November 22, 2008, Bucharest, Romania. In the address, the author reflects on three theses that question the…

  10. In aqua vivo EPID dosimetry

    SciTech Connect

    Wendling, Markus; McDermott, Leah N.; Mans, Anton; Olaciregui-Ruiz, Igor; Pecharroman-Gallego, Raul; Sonke, Jan-Jakob; Stroom, Joep; Herk, Marcel J.; Mijnheer, Ben van

    2012-01-15

    Purpose: At the Netherlands Cancer Institute-Antoni van Leeuwenhoek Hospital, in vivo dosimetry using an electronic portal imaging device (EPID) has been implemented for almost all high-energy photon treatments of cancer with curative intent. Lung cancer treatments were initially excluded, because the original back-projection dose-reconstruction algorithm uses water-based scatter-correction kernels and therefore does not account for tissue inhomogeneities accurately. The aim of this study was to test a new method, in aqua vivo EPID dosimetry, for fast dose verification of lung cancer irradiations during actual patient treatment. Methods: The key feature of our method is the dose reconstruction in the patient from EPID images, obtained during the actual treatment, whereby the images have been converted to a situation as if the patient consisted entirely of water; hence, the method is termed in aqua vivo. This is done by multiplying the measured in vivo EPID image with the ratio of two digitally reconstructed transmission images for the unit-density and inhomogeneous tissue situations. For dose verification, a comparison is made with the calculated dose distribution with the inhomogeneity correction switched off. IMRT treatment verification is performed for each beam in 2D using a 2D γ evaluation, while for the verification of volumetric-modulated arc therapy (VMAT) treatments a 3D γ evaluation is applied using the same parameters (3%, 3 mm). The method was tested using two inhomogeneous phantoms simulating a tumor in lung, measuring its sensitivity to patient positioning errors. Subsequently, five IMRT and five VMAT clinical lung cancer treatments were investigated, using both the conventional back-projection algorithm and the in aqua vivo method. The verification results of the in aqua vivo method were statistically analyzed for 751 lung cancer patients treated with IMRT and 50 lung cancer patients treated with VMAT. Results: The improvements from applying the in aqua vivo approach are considerable. The percentage of γ values ≤1 increased on average from 66.2% to 93.1% and from 43.6% to 97.5% for the IMRT and VMAT cases, respectively. The corresponding mean γ value decreased from 0.99 to 0.43 for the IMRT cases and from 1.71 to 0.40 for the VMAT cases, which is similar to the accepted clinical values for the verification of IMRT treatments of prostate, rectum, and head-and-neck cancers. The deviation between the reconstructed and planned dose at the isocenter diminished on average from 5.3% to 0.5% for the VMAT patients and was almost the same, within 1%, for the IMRT cases. The in aqua vivo verification results for IMRT and VMAT treatments of a large group of patients had a mean γ of approximately 0.5, a percentage of γ values ≤1 larger than 89%, and a difference in the isocenter dose value of less than 1%. Conclusions: With the in aqua vivo approach for the verification of lung cancer treatments (IMRT and VMAT), we can achieve results with the same accuracy as obtained during in vivo EPID dosimetry of sites without large inhomogeneities.
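    The γ evaluation combines a dose-difference criterion (3%) with a distance-to-agreement criterion (3 mm); a point passes when the combined index is at most 1. A minimal brute-force 1D sketch with toy dose profiles follows; clinical implementations are 2D/3D and heavily optimized.

```python
import numpy as np

def gamma_1d(ref, ref_x, eval_dose, eval_x, dose_tol=0.03, dist_tol=3.0):
    """For each reference point, gamma = min over evaluated points of
    sqrt((dose_diff / dose_tol)^2 + (distance / dist_tol)^2),
    with the dose difference globally normalized to the reference maximum."""
    gammas = np.empty(len(ref))
    for i, (d_r, x_r) in enumerate(zip(ref, ref_x)):
        dd = (eval_dose - d_r) / (dose_tol * ref.max())  # dose-difference term
        dx = (eval_x - x_r) / dist_tol                   # distance-to-agreement term
        gammas[i] = np.sqrt(dd ** 2 + dx ** 2).min()
    return gammas

x = np.arange(0, 50, 1.0)                  # position in mm
planned = np.exp(-((x - 25) / 10) ** 2)    # toy planned dose profile
measured = np.exp(-((x - 26) / 10) ** 2)   # same profile shifted by 1 mm
g = gamma_1d(planned, x, measured, x)
pass_rate = np.mean(g <= 1.0)
```

A 1 mm shift is comfortably inside the 3 mm distance criterion, so every point passes; a 5 mm shift in the high-gradient region would not.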

  11. Use quantitative analysis to manage fire risk

    SciTech Connect

    Mowrer, D.S.

    1995-04-01

    By incorporating quantitative engineering tools into fire-risk evaluations, safety engineers and managers can mitigate hazards caused by pressurized oil. Fire remains the number one factor for unscheduled downtime, lost production and equipment damage. Pressurized oil is a common and sometimes underestimated fire hazard. This hazard exists in many industrial facilities and can fuel a pool fire or an intense torch fire. Standard sprinkler protection does not affect torch fires, and either oil fire type can rapidly weaken structural steel members to the point of failure. Two case histories show how safety engineers using quantitative tools analyze fire hazards presented by pressurized oil and reduce its risks. The techniques are simple to use and readily available.

  12. EGFR Protein Expression in Non-Small Cell Lung Cancer Predicts Response to an EGFR Tyrosine Kinase Inhibitor – A Novel Antibody for Immunohistochemistry or AQUA Technology

    PubMed Central

    Mascaux, Celine; Wynes, Murry W.; Kato, Yasufumi; Tran, Cindy; Asuncion, Bernadette Reyna; Zhao, Jason M.; Gustavson, Mark; Ranger-Moore, Jim; Gaire, Fabien; Matsubayashi, Jun; Nagao, Toshitaka; Yoshida, Koichi; Ohira, Tatuso; Ikeda, Norihiko; Hirsch, Fred R.

    2011-01-01

    Introduction Epidermal growth factor receptor (EGFR) protein expression in non-small cell lung cancer (NSCLC) is not recommended for predicting response to EGFR tyrosine kinase inhibitors (TKIs) due to conflicting results, all using antibodies detecting the EGFR external domain (ED). We tested the predictive value of EGFR protein expression for response to an EGFR TKI using an antibody that detects the intracellular domain (ID) and compared fluorescence-based Automated QUantitative Analysis (AQUA) technology to immunohistochemistry (IHC). Methods Specimens from 98 gefitinib-treated NSCLC Japanese patients were evaluated by IHC (n=98/98) and AQUA technology (n=70/98). EGFR ID- (5B7) and ED-specific antibodies (3C6 and 31G7) were compared. Results EGFR expression evaluated with 5B7 was significantly higher in responders versus non-responders to gefitinib both with IHC and with AQUA. ED-specific antibodies did not significantly predict response. Using AQUA and the ID-specific antibody resulted in the best prediction performance, with positive and negative predictive values (PPV/NPV) for responders of 50% and 87%, respectively. EGFR expression with the ID-specific antibody and AQUA also predicted responders among EGFR mutated patients. Increased EGFR expression with the ID antibody was associated with increased median PFS (11.7 months vs 5.0, Log-rank p=0.034) and OS (38.6 vs 14.9, p=0.040) from gefitinib therapy. Conclusions EGFR protein expression using an ID-specific antibody specifically predicts response to gefitinib in NSCLC patients, including EGFR mutated patients, and increased PFS/OS from gefitinib. These data suggest that the choice of diagnostic antibody and methodology matters for predicting response and outcome to specific therapies. The potential clinical application needs further validation. PMID:21994417

  13. Structural and quantitative analysis of Equisetum alkaloids.

    PubMed

    Cramer, Luise; Ernst, Ludger; Lubienski, Marcus; Papke, Uli; Schiebel, Hans-Martin; Jerz, Gerold; Beuerle, Till

    2015-08-01

    Equisetum palustre L. is known for its toxicity for livestock. Several studies in the past addressed the isolation and identification of the responsible alkaloids. So far, palustrine (1) and N(5)-formylpalustrine (2) are the known alkaloids of E. palustre. An HPLC-ESI-MS/MS method in combination with a simple sample work-up was developed to identify and quantitate Equisetum alkaloids. Besides the two known alkaloids, six related alkaloids were detected in different Equisetum samples. The structure of the alkaloid palustridiene (3) was derived by comprehensive 1D and 2D NMR experiments. N(5)-Acetylpalustrine (4) was also thoroughly characterized by NMR for the first time. The structure of N(5)-formylpalustridiene (5) is proposed based on mass spectrometry results. Twenty-two E. palustre samples were screened with this method, and in most cases the full set of eight alkaloids was detected in all parts of the plant. A high variability of alkaloid content and distribution was found depending on plant organ, plant origin and season, ranging from 88 to 597 mg/kg dried weight. However, palustrine (1) and palustridiene (3) always represented the main alkaloids. For the first time, a comprehensive identification, quantitation and distribution analysis of Equisetum alkaloids was achieved. PMID:25823584

  14. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  15. Joint association analysis of bivariate quantitative and qualitative traits.

    PubMed

    Yuan, Mengdie; Diao, Guoqing

    2011-01-01

    Univariate genome-wide association analysis of quantitative and qualitative traits has been investigated extensively in the literature. In the presence of correlated phenotypes, it is more intuitive to analyze all phenotypes simultaneously. We describe an efficient likelihood-based approach for the joint association analysis of quantitative and qualitative traits in unrelated individuals. We assume a probit model for the qualitative trait, under which an unobserved latent variable and a prespecified threshold determine the value of the qualitative trait. To jointly model the quantitative and qualitative traits, we assume that the quantitative trait and the latent variable follow a bivariate normal distribution. The latent variable is allowed to be correlated with the quantitative phenotype. Simultaneous modeling of the quantitative and qualitative traits allows us to make more precise inference on the pleiotropic genetic effects. We derive likelihood ratio tests for the testing of genetic effects. An application to the Genetic Analysis Workshop 17 data is provided. The new method yields reasonable power and meaningful results for the joint association analysis of the quantitative trait Q1 and the qualitative trait disease status at SNPs with sufficiently large minor allele frequencies (MAF). PMID:22373162

  16. Interspecific competition between Microcystis aeruginosa and Anabaena flos-aquae from Taihu Lake, China.

    PubMed

    Zhang, Xue-Wei; Fu, Jie; Song, Shuang; Zhang, Ping; Yang, Xian-Hui; Zhang, Li-Rong; Luo, Yin; Liu, Chang-Hong; Zhu, Hai-Liang

    2014-01-01

    Microcystis and Anabaena are the main cyanobacteria that cause cyanobacterial blooms in Taihu Lake, China. The mechanism of population competition between M. aeruginosa and A. flos-aquae was studied by co-cultivation in the laboratory. The growth of M. aeruginosa was inhibited, while the growth of A. flos-aquae was promoted. The degree of inhibition or promotion was related to the ratio of the initial cell densities. Both cell-free filtrates of A. flos-aquae and co-culture inhibited M. aeruginosa growth, while both cell-free filtrates of M. aeruginosa and co-culture promoted A. flos-aquae growth. Analysis of the cell-free filtrate by gas chromatography-mass spectrometry indicated that M. aeruginosa and A. flos-aquae may secrete some extracellular allelochemicals that inhibit (promote) the growth of M. aeruginosa (A. flos-aquae) in co-culture. These compounds included sulfur compounds, naphthalene derivatives, cedrene derivatives, quinones, phenol derivatives, diphenyl derivatives, anthracene derivatives, and phthalate esters. This study can help to understand the characteristics of M. aeruginosa and A. flos-aquae and to provide new concepts for the control of cyanobacterial blooms in Taihu Lake. PMID:24772823

  17. Aqua's First 10 Years: An Overview

    NASA Technical Reports Server (NTRS)

    Parkinson, Claire L.

    2012-01-01

    NASA's Aqua spacecraft was launched at 2:55 a.m. on May 4, 2002, from Vandenberg Air Force Base in California, into a near-polar, sun-synchronous orbit at an altitude of 705 km. Aqua carries six Earth-observing instruments to collect data on water in all its forms (liquid, vapor, and solid) and on a wide variety of additional Earth system variables (Parkinson 2003). The design lifetime for Aqua's prime mission was 6 years, and Aqua is now well into its extended mission, approaching 10 years of successful operations. The Aqua data have been used for hundreds of scientific studies and continue to be used for scientific discovery and numerous practical applications.

  18. A Full Snow Season in Yellowstone: A Database of Restored Aqua Band 6

    NASA Technical Reports Server (NTRS)

    Gladkova, Irina; Grossberg, Michael; Bonev, George; Romanov, Peter; Riggs, George; Hall, Dorothy

    2013-01-01

    The algorithms for estimating snow extent for the Moderate Resolution Imaging Spectroradiometer (MODIS) optimally use the 1.6-μm channel, which is unavailable for MODIS on Aqua due to detector damage. As a test bed to demonstrate that Aqua band 6 can be restored, we chose the area surrounding Yellowstone and Grand Teton national parks. In such rugged and difficult-to-access terrain, satellite images are particularly important for providing an estimation of snow-cover extent. For the full 2010-2011 snow season covering the Yellowstone region, we have used quantitative image restoration to create a database of restored Aqua band 6. The database includes restored radiances, normalized vegetation index, normalized snow index, thermal data, and band-6-based snow-map products. The restored Aqua-band-6 data have also been regridded and combined with Terra data to produce a snow-cover map that utilizes both Terra and Aqua snow maps. Using this database, we show that the restored Aqua-band-6-based snow-cover extent has a comparable performance with respect to ground stations to the one based on Terra. The result of a restored band 6 from Aqua is that we have an additional band-6 image of the Yellowstone region each day. This image can be used to mitigate cloud occlusion, using the same algorithms used for band 6 on Terra. We show an application of this database of restored band-6 images to illustrate the value of creating a cloud gap filling using the National Aeronautics and Space Administration's operational cloud masks and data from both Aqua and Terra.
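    The reason band 6 matters for snow mapping is the normalized difference snow index (NDSI): snow is bright in the visible (band 4, 0.55 μm) but dark at 1.6 μm (band 6). A sketch with invented reflectances, using the commonly cited NDSI > 0.4 screen; the operational snow product applies additional tests (thermal, vegetation) not shown here.

```python
import numpy as np

def ndsi(band4, band6):
    """Normalized difference snow index from band-4 and band-6 reflectances."""
    return (band4 - band6) / (band4 + band6 + 1e-12)

# toy reflectance triples: two snow-covered pixels and one bare-ground pixel
b4 = np.array([0.8, 0.6, 0.2])
b6 = np.array([0.1, 0.2, 0.25])
snow_mask = ndsi(b4, b6) > 0.4
```

Without a restored band 6, Aqua cannot compute this index directly, which is what motivates the restoration database described above.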

  19. A QUANTITATIVE ANALYSIS OF DISTANT OPEN CLUSTERS

    SciTech Connect

    Janes, Kenneth A.; Hoq, Sadia

    2011-03-15

    The oldest open star clusters are important for tracing the history of the Galactic disk, but many of the more distant clusters are heavily reddened and projected against the rich stellar background of the Galaxy. We have undertaken an investigation of several distant clusters (Berkeley 19, Berkeley 44, King 25, NGC 6802, NGC 6827, Berkeley 52, Berkeley 56, NGC 7142, NGC 7245, and King 9) to develop procedures for separating probable cluster members from the background field. We next created a simple quantitative approach for finding approximate cluster distances, reddenings, and ages. We first conclude that, with the possible exception of King 25, they are probably all physical clusters. We also find that for these distant clusters our typical errors are about ±0.07 in E(B - V), ±0.15 in log(age), and ±0.25 in (m - M)₀. The clusters range in age from 470 Myr to 7 Gyr and range from 7.1 to 16.4 kpc from the Galactic center.
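    The quoted ±0.25 mag uncertainty in the true distance modulus (m - M)₀ translates directly into a distance uncertainty of roughly 12%, since d[pc] = 10^((m - M)₀ / 5 + 1). A quick check with a hypothetical modulus:

```python
def distmod_to_pc(mu0):
    """Distance in parsecs from the true (extinction-corrected)
    distance modulus (m - M)_0."""
    return 10 ** (mu0 / 5 + 1)

d = distmod_to_pc(14.0)            # hypothetical cluster: about 6.3 kpc
factor = distmod_to_pc(14.25) / d  # multiplicative effect of a +0.25 mag error
```

The error factor is 10^(0.25/5) ≈ 1.12, i.e. about 12% in distance per 0.25 mag of modulus uncertainty, independent of the distance itself.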

  20. Advances in Quantitative Trait Analysis in Yeast

    PubMed Central

    Liti, Gianni; Louis, Edward J.

    2012-01-01

    Understanding the genetic mechanisms underlying complex traits is one of the next frontiers in biology. The budding yeast Saccharomyces cerevisiae has become an important model for elucidating the mechanisms that govern natural genetic and phenotypic variation. This success is partially due to its intrinsic biological features, such as the short sexual generation time, high meiotic recombination rate, and small genome size. Precise reverse genetics technologies allow the high throughput manipulation of genetic information with exquisite precision, offering the unique opportunity to experimentally measure the phenotypic effect of genetic variants. Population genomic and phenomic studies have revealed widespread variation between diverged populations, characteristic of man-made environments, as well as geographic clusters of wild strains along with naturally occurring recombinant strains (mosaics). Here, we review these recent studies and provide a perspective on how these previously unappreciated levels of variation can help to bridge our understanding of the genotype-phenotype gap, keeping budding yeast at the forefront of genetic studies. Not only are quantitative trait loci (QTL) being mapped with high resolution down to the nucleotide; for the first time, QTLs of modest effect, and complex interactions among QTLs and between QTLs and the environment, are being determined experimentally using next-generation deep sequencing of selected pools of individuals as well as multi-generational crosses. PMID:22916041

  1. Quantitative multi-modal NDT data analysis

    SciTech Connect

    Heideklang, René; Shokouhi, Parisa

    2014-02-18

    A single NDT technique is often not adequate to provide assessments about the integrity of test objects with the required coverage or accuracy. In such situations, one often resorts to multi-modal testing, where complementary and overlapping information from different NDT techniques is combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task that involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity.

  2. Using fire tests for quantitative risk analysis

    SciTech Connect

    Ling, W.C.T.; Williamson, R.B.

    1980-03-01

    Fires can be considered a causal chain-of-events in which the growth and spread of fire may cause damage and injury if it is rapid enough to overcome the barriers placed in its way. Fire tests for fire resistance of the barriers can be used in a quantitative risk assessment. The fire growth and spread is modelled in a State Transition Model (STM). The fire barriers are presented as part of the Fire Protection Model (FPM) which is based on a portion of the NFPA Decision Tree. An Emergency Equivalent Network is introduced to couple the Fire Growth Model (FGM) and the FPM so that the spread of fire beyond the room-of-origin can be computed. An example is presented in which a specific building floor plan is analyzed to obtain the shortest expected time for fire to spread between two points. To obtain the probability and time for each link in the network, data from the results of fire tests were used. These results were found to be lacking and new standards giving better data are advocated.
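    Once each barrier link in the coupled network carries an expected breach time, the shortest expected fire-spread time between two points is a standard shortest-path computation. A sketch with an invented room topology and times; the paper derives its link times and probabilities from fire-test data, which is not reproduced here.

```python
import heapq

def shortest_expected_time(graph, src, dst):
    """Dijkstra's algorithm over expected barrier-breach times (minutes)."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        t, node = heapq.heappop(pq)
        if node == dst:
            return t
        if t > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, w in graph.get(node, []):
            nt = t + w
            if nt < dist.get(nbr, float("inf")):
                dist[nbr] = nt
                heapq.heappush(pq, (nt, nbr))
    return float("inf")

# rooms A..D; each edge weight is the expected minutes for fire to breach
# the barrier between the two rooms (hypothetical values)
graph = {
    "A": [("B", 20.0), ("C", 45.0)],
    "B": [("D", 30.0)],
    "C": [("D", 10.0)],
}
t = shortest_expected_time(graph, "A", "D")  # A->B->D (50 min) beats A->C->D (55 min)
```

Upgrading a single barrier (say the A-B door) changes one edge weight and can reroute the critical path, which is exactly the kind of what-if question the quantitative framework supports.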

  3. Multiple quantitative trait analysis using bayesian networks.

    PubMed

    Scutari, Marco; Howell, Phil; Balding, David J; Mackay, Ian

    2014-09-01

    Models for genome-wide prediction and association studies usually target a single phenotypic trait. However, in animal and plant genetics it is common to record information on multiple phenotypes for each individual that will be genotyped. Modeling traits individually disregards the fact that they are most likely associated due to pleiotropy and shared biological basis, thus providing only a partial, confounded view of genetic effects and phenotypic interactions. In this article we use data from a Multiparent Advanced Generation Inter-Cross (MAGIC) winter wheat population to explore Bayesian networks as a convenient and interpretable framework for the simultaneous modeling of multiple quantitative traits. We show that they are equivalent to multivariate genetic best linear unbiased prediction (GBLUP) and that they are competitive with single-trait elastic net and single-trait GBLUP in predictive performance. Finally, we discuss their relationship with other additive-effects models and their advantages in inference and interpretation. MAGIC populations provide an ideal setting for this kind of investigation because the very low population structure and large sample size result in predictive models with good power and limited confounding due to relatedness. PMID:25236454
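    The abstract notes that the Bayesian-network formulation is equivalent to multivariate GBLUP. As a rough single-trait sketch of the GBLUP idea, which is in turn equivalent to ridge regression on marker genotypes, using entirely synthetic SNP data:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 40, 100
X = rng.integers(0, 3, size=(n, p)).astype(float)  # SNP dosages coded 0/1/2
beta = rng.normal(0.0, 0.1, p)                     # true (simulated) marker effects
y = X @ beta + rng.normal(0.0, 0.5, n)             # phenotype = genetics + noise

# ridge / GBLUP-style shrinkage estimate of marker effects:
# beta_hat = (X'X + lambda * I)^-1 X'y
lam = 10.0
beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
y_hat = X @ beta_hat                               # genomic predictions
```

The multivariate extension replaces y with a matrix of traits and borrows strength across them, which is where the Bayesian-network view adds interpretability about which traits are directly linked.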

  4. Quantitative infrared analysis of hydrogen fluoride

    SciTech Connect

    Manuta, D.M.

    1997-04-01

    This work was performed at the Portsmouth Gaseous Diffusion Plant, where hydrogen fluoride is produced upon the hydrolysis of UF{sub 6}. This poses a problem in this setting, and a method for determining the mole percent concentration was desired. HF has been considered a non-ideal gas for many years. D. F. Smith utilized complex equations in his HF studies in the 1950s. We have evaluated HF behavior as a function of pressure from three different perspectives: (1) absorbance at 3877 cm{sup -1} as a function of pressure for 100% HF; (2) absorbance at 3877 cm{sup -1} as a function of increasing partial pressure of HF, with total pressure maintained at 300 mm HgA with nitrogen; (3) absorbance at 3877 cm{sup -1} for constant partial pressure of HF, with total pressure increased to greater than 800 mm HgA with nitrogen. These experiments have shown that at partial pressures up to 35 mm HgA, HF follows the ideal gas law. The absorbance at 3877 cm{sup -1} can be quantitatively analyzed via infrared methods.
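
    In the ideal-gas regime established above, absorbance should scale linearly with HF partial pressure (Beer-Lambert with c = P/RT). A sketch with an assumed molar absorptivity and path length, not measured HF constants:

```python
# Ideal-gas form of the Beer-Lambert law: absorbance is linear in partial
# pressure when concentration c = P/(RT). The epsilon and path length below
# are hypothetical values for illustration, not measured HF constants.
R = 62.3637          # L·mmHg/(mol·K)
T = 298.15           # K
epsilon = 150.0      # L/(mol·cm), assumed molar absorptivity at 3877 cm^-1
path_cm = 10.0       # assumed cell path length

def absorbance(p_mmHg):
    c = p_mmHg / (R * T)          # mol/L from the ideal gas law
    return epsilon * path_cm * c

# Linearity check over the ideal-gas regime (partial pressures up to 35 mm HgA).
for p in (5, 15, 25, 35):
    print(f"{p:3d} mmHg -> A = {absorbance(p):.4f}")
```

    Deviations from this straight line at higher pressures would signal the non-ideal (oligomerizing) behavior the abstract describes.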

  5. Multiple Quantitative Trait Analysis Using Bayesian Networks

    PubMed Central

    Scutari, Marco; Howell, Phil; Balding, David J.; Mackay, Ian

    2014-01-01

    Models for genome-wide prediction and association studies usually target a single phenotypic trait. However, in animal and plant genetics it is common to record information on multiple phenotypes for each individual that will be genotyped. Modeling traits individually disregards the fact that they are most likely associated due to pleiotropy and shared biological basis, thus providing only a partial, confounded view of genetic effects and phenotypic interactions. In this article we use data from a Multiparent Advanced Generation Inter-Cross (MAGIC) winter wheat population to explore Bayesian networks as a convenient and interpretable framework for the simultaneous modeling of multiple quantitative traits. We show that they are equivalent to multivariate genetic best linear unbiased prediction (GBLUP) and that they are competitive with single-trait elastic net and single-trait GBLUP in predictive performance. Finally, we discuss their relationship with other additive-effects models and their advantages in inference and interpretation. MAGIC populations provide an ideal setting for this kind of investigation because the very low population structure and large sample size result in predictive models with good power and limited confounding due to relatedness. PMID:25236454

  6. Model-free linkage analysis of a quantitative trait.

    PubMed

    Morris, Nathan J; Stein, Catherine M

    2012-01-01

    Model-free methods of linkage analysis for quantitative traits are a class of easily implemented, computationally efficient, and statistically robust approaches to searching for linkage to a quantitative trait. By "model-free" we refer to methods of linkage analysis that do not fully specify a genetic model (i.e., the causal allele frequency and penetrance functions). In this chapter, we briefly survey the methods that are available, and then we discuss the necessary steps to implement an analysis using the programs GENIBD, SIBPAL, and RELPAL in the Statistical Analysis for Genetic Epidemiology (S.A.G.E.) software suite. PMID:22307705
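
    One classic model-free method in this family is Haseman-Elston regression, in which the squared trait difference of each sib pair is regressed on the proportion of alleles shared identical by descent (IBD) at a marker. A sketch on simulated sib-pair data, not output from the S.A.G.E. programs:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated sib pairs: IBD sharing is 0, 1/2, or 1 with the usual
# 1:2:1 probabilities; under linkage, squared trait differences
# shrink as IBD sharing rises.
n_pairs = 2000
pi_ibd = rng.choice([0.0, 0.5, 1.0], size=n_pairs, p=[0.25, 0.5, 0.25])
sq_diff = 2.0 - 1.2 * pi_ibd + rng.normal(0, 0.3, n_pairs)

# Ordinary least squares: sq_diff = a + b * pi_ibd.
A = np.column_stack([np.ones(n_pairs), pi_ibd])
(a, b), *_ = np.linalg.lstsq(A, sq_diff, rcond=None)
print(f"intercept {a:.2f}, slope {b:.2f}  (negative slope suggests linkage)")
```

    A significantly negative slope is the linkage signal; the test statistic in practice is a one-sided t-test on b.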

  7. The quantitative failure of human reliability analysis

    SciTech Connect

    Bennett, C.T.

    1995-07-01

    This philosophical treatise argues the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. In fact, the author contends that historic and current HRA have failed to inform policy makers who must make decisions based on the risk that humans contribute to system performance. He argues for an HRA based on Bayesian inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion and tempers itself with a rational debate over the weight given to subjective and empirical probabilities.
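
    The fact-tempered Bayesian inference the author advocates can be illustrated by a conjugate Beta-binomial update, in which an expert-opinion prior on a human error probability is revised by observed performance. The prior parameters and trial counts below are invented:

```python
# Bayesian update of a human error probability (HEP): a Beta prior from
# expert opinion is tempered by observed task performance. All numbers
# are purely illustrative.
alpha, beta = 1.0, 99.0        # prior: expert opinion, mean HEP = 0.01
errors, trials = 3, 120        # empirical evidence from simulator trials

alpha_post = alpha + errors
beta_post = beta + (trials - errors)
prior_mean = alpha / (alpha + beta)
post_mean = alpha_post / (alpha_post + beta_post)
print(f"prior mean HEP {prior_mean:.4f} -> posterior mean HEP {post_mean:.4f}")
```

    The posterior mean weights opinion and evidence in proportion to their effective sample sizes, which is exactly the balance the treatise calls for.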

  8. A Quantitative Analysis of Countries' Research Strengths

    ERIC Educational Resources Information Center

    Saxena, Anurag; Brazer, S. David; Gupta, B. M.

    2009-01-01

    This study employed a multidimensional analysis to evaluate transnational patterns of scientific research to determine relative research strengths among widely varying nations. Findings from this study may inform national policy with regard to the most efficient use of scarce national research resources, including government and private funding.…

  9. Quantitative analysis of chromatin proteomes in disease.

    PubMed

    Monte, Emma; Chen, Haodong; Kolmakova, Maria; Parvatiyar, Michelle; Vondriska, Thomas M; Franklin, Sarah

    2012-01-01

    In the nucleus reside the proteomes whose functions are most intimately linked with gene regulation. Adult mammalian cardiomyocyte nuclei are unique due to the high percentage of binucleated cells,(1) the predominantly heterochromatic state of the DNA, and the non-dividing nature of the cardiomyocyte, which renders adult nuclei in a permanent state of interphase.(2) Transcriptional regulation during development and disease has been well studied in this organ,(3-5) but what remains relatively unexplored is the role played by the nuclear proteins responsible for DNA packaging and expression, and how these proteins control changes in transcriptional programs that occur during disease.(6) In the developed world, heart disease is the number one cause of mortality for both men and women.(7) Insight into how nuclear proteins cooperate to regulate the progression of this disease is critical for advancing current treatment options. Mass spectrometry is the ideal tool for addressing these questions, as it allows an unbiased annotation of the nuclear proteome and relative quantification of how the abundance of these proteins changes with disease. While there have been several proteomic studies of mammalian nuclear protein complexes,(8-13) until recently(14) there had been only one study examining the cardiac nuclear proteome, and it considered the entire nucleus rather than exploring the proteome at the level of nuclear subcompartments.(15) In large part, this shortage of work is due to the difficulty of isolating cardiac nuclei. Cardiac nuclei occur within a rigid and dense actin-myosin apparatus to which they are connected via multiple extensions from the endoplasmic reticulum, to the extent that myocyte contraction alters their overall shape.(16) Additionally, cardiomyocytes are 40% mitochondria by volume,(17) which necessitates enrichment of the nucleus apart from the other organelles. Here we describe a protocol for cardiac nuclear enrichment and further fractionation into biologically relevant compartments. Furthermore, we detail methods for label-free quantitative mass spectrometric dissection of these fractions: techniques amenable to in vivo experimentation in various animal models and organ systems where metabolic labeling is not feasible. PMID:23299252

  10. Quantitative Analysis of Chromatin Proteomes in Disease

    PubMed Central

    Monte, Emma; Chen, Haodong; Kolmakova, Maria; Parvatiyar, Michelle; Vondriska, Thomas M.; Franklin, Sarah

    2012-01-01

    In the nucleus reside the proteomes whose functions are most intimately linked with gene regulation. Adult mammalian cardiomyocyte nuclei are unique due to the high percentage of binucleated cells,1 the predominantly heterochromatic state of the DNA, and the non-dividing nature of the cardiomyocyte, which renders adult nuclei in a permanent state of interphase.2 Transcriptional regulation during development and disease has been well studied in this organ,3-5 but what remains relatively unexplored is the role played by the nuclear proteins responsible for DNA packaging and expression, and how these proteins control changes in transcriptional programs that occur during disease.6 In the developed world, heart disease is the number one cause of mortality for both men and women.7 Insight into how nuclear proteins cooperate to regulate the progression of this disease is critical for advancing current treatment options. Mass spectrometry is the ideal tool for addressing these questions, as it allows an unbiased annotation of the nuclear proteome and relative quantification of how the abundance of these proteins changes with disease. While there have been several proteomic studies of mammalian nuclear protein complexes,8-13 until recently14 there had been only one study examining the cardiac nuclear proteome, and it considered the entire nucleus rather than exploring the proteome at the level of nuclear subcompartments.15 In large part, this shortage of work is due to the difficulty of isolating cardiac nuclei. Cardiac nuclei occur within a rigid and dense actin-myosin apparatus to which they are connected via multiple extensions from the endoplasmic reticulum, to the extent that myocyte contraction alters their overall shape.16 Additionally, cardiomyocytes are 40% mitochondria by volume,17 which necessitates enrichment of the nucleus apart from the other organelles. Here we describe a protocol for cardiac nuclear enrichment and further fractionation into biologically relevant compartments. Furthermore, we detail methods for label-free quantitative mass spectrometric dissection of these fractions: techniques amenable to in vivo experimentation in various animal models and organ systems where metabolic labeling is not feasible. PMID:23299252

  11. Longitudinal Association Analysis of Quantitative Traits

    PubMed Central

    Fan, Ruzong; Zhang, Yiwei; Albert, Paul S.; Liu, Aiyi; Wang, Yuanjia; Xiong, Momiao

    2015-01-01

    Longitudinal genetic studies provide a valuable resource for exploring key genetic and environmental factors that affect complex traits over time. Genetic analysis of longitudinal data that incorporates temporal variation is important for understanding the genetic architecture and biological variation of common complex diseases. Despite their importance, there is a paucity of statistical methods for analyzing longitudinal human genetic data. In this article, longitudinal methods are developed for temporal association mapping to analyze population longitudinal data. Both parametric and nonparametric models are proposed. The models can be applied to multiple diallelic genetic markers such as single-nucleotide polymorphisms and to multiallelic markers such as microsatellites. By analytical formulae, we show that the models take both linkage disequilibrium and temporal trends into account simultaneously. A variance-covariance structure is constructed to model the single-measurement variation and multiple-measurement correlations of an individual based on the theory of stochastic processes. Novel penalized spline models are used to estimate the time-dependent mean functions and regression coefficients. The methods were applied to analyze Framingham Heart Study data from Genetic Analysis Workshop (GAW) 13 and GAW 16. The temporal trends and genetic effects on systolic blood pressure are successfully detected by the proposed approaches. Simulation studies indicate that the nonparametric penalized linear model is the best choice for fitting real data. This research sheds light on the important area of longitudinal genetic analysis and provides a basis for future methodological investigations and practical applications. PMID:22965819

  12. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    ERIC Educational Resources Information Center

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  13. QUANTITATIVE TRAIT LOCUS ANALYSIS AS A GENE DISCOVERY TOOL

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Quantitative trait locus analysis has been a mainstay approach for obtaining a genetic description of complex agronomic traits for plants. What is sometimes overlooked is the role QTL analysis can play in identifying genes that underlay complex traits. In this chapter, I will describe the basic st...

  14. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    ERIC Educational Resources Information Center

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  15. Quantitative fault tree analysis using the Set Evaluation Program (SEP)

    SciTech Connect

    Olman, M.D.

    1982-09-01

    This report describes the use of a program that aids in the analysis of fault trees. The user has the option of executing one or more of a number of procedures developed to perform the quantitative analysis of a fault tree.
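
    SEP's internal procedures are not detailed in the abstract, but a typical quantitative fault tree evaluation sums minimal cut set probabilities under the rare-event approximation. A sketch with a hypothetical tree:

```python
# Quantitative fault tree evaluation from minimal cut sets: the top-event
# probability is approximated by the sum of cut-set products (rare-event
# approximation). Component probabilities and cut sets are hypothetical.
p = {"pump": 1e-3, "valve": 5e-4, "sensor": 2e-3, "operator": 1e-2}
min_cut_sets = [
    {"pump", "valve"},          # both must fail for this cut set
    {"sensor", "operator"},
]

def cut_set_prob(cs):
    """Probability that every component in one cut set fails (independence assumed)."""
    prob = 1.0
    for comp in cs:
        prob *= p[comp]
    return prob

# Rare-event approximation: sum of cut-set probabilities.
p_top = sum(cut_set_prob(cs) for cs in min_cut_sets)
print(f"top event probability ~ {p_top:.3e}")
```

    The approximation is conservative and accurate when all cut-set probabilities are small; exact evaluation would use inclusion-exclusion.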

  16. Quantitative Analysis of Galaxy-Galaxy Lensing

    NASA Astrophysics Data System (ADS)

    Schneider, Peter; Rix, Hans-Walter

    1997-01-01

    Gravitational light deflection due to mass along the line of sight will distort the images of background sources. Although an individual galaxy is not massive enough to cause a detectable lensing distortion in the background population, this effect can be measured statistically for a population of galaxies, and a first detection was claimed recently by Brainerd, Blandford, & Smail (BBS). BBS modeled their observations by describing galaxy halos as isothermal spheres of velocity dispersion σ, truncated at a radius s, where σ and s scale with the luminosity of the galaxy. Through Monte Carlo simulations they predicted the mean image polarization as a function of radius and compared it to the observations. In this paper we follow up on this discovery by developing a maximum-likelihood analysis that can constrain the halo properties of distant galaxy populations through "galaxy-galaxy" lensing; with it we show that the mean masses and sizes of halos can be estimated accurately, without excessive data requirements. The proposed maximum-likelihood analysis contains several important new elements: (1) it takes full account of the actual image ellipticities, positions, and apparent magnitudes, and as a consequence, it provides more efficient parameter estimation; (2) it provides automatically the proper relative weight for images of different ellipticities; (3) it uses a redshift probability distribution for each galaxy image and does not require a foreground lens-background image dichotomy; (4) it provides a rigorous means to investigate the covariances among the parameters that describe the halo model. We apply this analysis technique to simulated observations, using for ease of comparison the same lens model as BBS, and determine the best-fitting values, σ* and s*, corresponding to an L* galaxy. We explore two different observing strategies: (1) taking deep images (e.g., with HST) on small fields, and (2) using shallower images on larger fields. From these simulations we find that σ* can be determined to ≲10% accuracy if a sample of about 5000 galaxies with measured ellipticities is available, down to R ≲ 23. The corresponding data can be obtained on a 4 m class telescope in a few nights of very good seeing. Alternatively, the same accuracy in the determination of σ* can be achieved from about 10 moderately deep WFPC2 fields, on which galaxy shapes can be measured to about R ~ 25 and for which ground-based images are available on which the WFPC2 fields are centered. Firm lower limits can be set on the radial extent of the halo, but the maximal halo extent is poorly constrained. We show that this likelihood approach can also be used to constrain other parameters of the galaxy population, such as the Tully-Fisher index or the mean redshift of the galaxies as a function of apparent magnitude. Finally, we show how multicolor information, constraining the redshift of individual galaxies, can dramatically improve the accuracy of the parameter determination.
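
    For orientation, the isothermal-sphere model underlying this kind of analysis predicts a simple tangential shear profile. The sketch below uses an untruncated singular isothermal sphere with an assumed velocity dispersion and lens-source distance ratio (illustrative values, not the paper's fits):

```python
import math

# Tangential shear of a singular isothermal sphere (SIS) lens: the image
# distortion falls off as 1/theta, with the Einstein radius set by the
# velocity dispersion. All values below are illustrative assumptions.
c = 299792.458            # speed of light, km/s
sigma = 160.0             # km/s, assumed halo velocity dispersion
Dls_over_Ds = 0.5         # assumed lens-source / observer-source distance ratio

# Einstein radius in radians, then arcseconds.
theta_E = 4 * math.pi * (sigma / c) ** 2 * Dls_over_Ds
theta_E_arcsec = math.degrees(theta_E) * 3600

def shear(theta_arcsec):
    """SIS tangential shear at angular separation theta."""
    return theta_E_arcsec / (2 * theta_arcsec)

print(f"theta_E = {theta_E_arcsec:.3f} arcsec, shear at 30 arcsec = {shear(30):.4f}")
```

    Shears of order a percent or less at tens of arcseconds are why the signal must be stacked over thousands of lens-source pairs.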

  17. Aqua-vanadyl ion interaction with Nafion® membranes

    SciTech Connect

    Vijayakumar, Murugesan; Govind, Niranjan; Li, Bin; Wei, Xiaoliang; Nie, Zimin; Thevuthasan, Suntharampillai; Sprenkle, Vince L.; Wang, Wei

    2015-03-23

    A lack of comprehensive understanding of the interactions between the Nafion membrane and battery electrolytes prevents the straightforward tailoring of optimal materials for redox flow battery applications. In this work, we analyzed the interaction between the aqua-vanadyl cation and sulfonic sites within the pores of Nafion membranes using combined theoretical and experimental X-ray spectroscopic methods. Molecular-level interactions, namely solvent-shared and contact ion pair mechanisms, are discussed based on vanadium and sulfur K-edge spectroscopic analysis.

  18. Aqua-vanadyl ion interaction with Nafion® membranes

    DOE PAGES Beta

    Vijayakumar, Murugesan; Govind, Niranjan; Li, Bin; Wei, Xiaoliang; Nie, Zimin; Thevuthasan, Suntharampillai; Sprenkle, Vince L.; Wang, Wei

    2015-03-23

    A lack of comprehensive understanding of the interactions between the Nafion membrane and battery electrolytes prevents the straightforward tailoring of optimal materials for redox flow battery applications. In this work, we analyzed the interaction between the aqua-vanadyl cation and sulfonic sites within the pores of Nafion membranes using combined theoretical and experimental X-ray spectroscopic methods. Molecular-level interactions, namely solvent-shared and contact ion pair mechanisms, are discussed based on vanadium and sulfur K-edge spectroscopic analysis.

  19. Analysis of the influence of river discharge and wind on the Ebro turbid plume using MODIS-Aqua and MODIS-Terra data

    NASA Astrophysics Data System (ADS)

    Fernández-Nóvoa, D.; Mendes, R.; deCastro, M.; Dias, J. M.; Sánchez-Arcilla, A.; Gómez-Gesteira, M.

    2015-02-01

    The turbid plume formed at many river mouths influences the adjacent coastal area because it transports sediments, nutrients, and pollutants. The effects of the main forcings affecting the Ebro turbid plume were analyzed using data obtained from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor onboard the Aqua and Terra satellites over the period 2003-2011. Composite images were obtained for days under certain river discharge conditions (different flow regimes) and different types of wind (alongshore and cross-shore winds) in order to obtain a representative plume pattern for each situation. River discharge was the main driver of the Ebro River plume, followed by wind as the secondary force and regional oceanic circulation as the third. Turbid plume extension increased monotonically with increasing river discharge. Under high river discharge conditions (>355 m³ s⁻¹), wind distributed the plume in the dominant wind direction. Seaward winds (mistral) produced the largest extension of the plume (1893 km²), whereas southern alongshore winds produced the smallest (1325 km²). Northern alongshore winds induced the highest mean turbidity of the plume, and southern alongshore winds the lowest. Regardless of wind conditions, more than 70% of the plume extension was located south of the river mouth, influenced by the regional oceanic circulation.

  20. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate count methods. PMID:26456251
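
    Combining uncertainty elements "mathematically" normally means root-sum-of-squares of the relative components, as in GUM-style budgets. The component values below are invented, not the paper's measured contributions:

```python
import math

# Combining relative uncertainty components in quadrature. The component
# values are illustrative stand-ins, not the paper's measured budget.
components = {
    "microorganism type": 0.18,
    "product matrix": 0.12,
    "reading/interpreting": 0.20,
}
u_combined = math.sqrt(sum(u ** 2 for u in components.values()))
print(f"combined relative uncertainty: {u_combined:.1%}")
```

    Quadrature addition assumes the components are independent; correlated sources would require covariance terms.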

  1. Synergism of MODIS Aerosol Remote Sensing from Terra and Aqua

    NASA Technical Reports Server (NTRS)

    Ichoku, Charles; Kaufman, Yoram J.; Remer, Lorraine A.

    2003-01-01

    The MODerate-resolution Imaging Spectro-radiometer (MODIS) sensors, aboard the Earth Observing System (EOS) Terra and Aqua satellites, are proving highly capable of measuring the global distribution and properties of aerosols. Terra and Aqua were launched on December 18, 1999 and May 4, 2002, respectively, with daytime equator crossing times of approximately 10:30 am and 1:30 pm. Several aerosol parameters are retrieved at 10-km spatial resolution from MODIS daytime data over land and ocean surfaces. The parameters retrieved include: aerosol optical thickness (AOT) at 0.47, 0.55, and 0.66 micron wavelengths over land, and at 0.47, 0.55, 0.66, 0.87, 1.2, 1.6, and 2.1 microns over ocean; the Angstrom exponent over land and ocean; and effective radii and the proportion of AOT contributed by small-mode aerosols over ocean. Since the beginning of its operation, the quality of Terra-MODIS aerosol products (especially AOT) has been evaluated periodically by cross-correlation with equivalent data sets acquired by ground-based (and occasionally also airborne) sunphotometers, particularly those coordinated within the framework of the AErosol RObotic NETwork (AERONET). Terra-MODIS AOT data have been found to meet or exceed pre-launch accuracy expectations and have been applied to various studies dealing with local, regional, and global aerosol monitoring. The results of these Terra-MODIS aerosol data validation efforts and studies have been reported in several scientific papers and conferences. Although Aqua-MODIS is still young, it is already yielding high-quality aerosol data products, which are also subjected to careful periodic evaluation similar to that implemented for the Terra-MODIS products. This paper presents results of validation of Aqua-MODIS aerosol products with AERONET, as well as a comparative evaluation against corresponding Terra-MODIS data.
In addition, we show interesting independent and synergistic applications of MODIS aerosol data from both Terra and Aqua. In certain situations, this combined analysis of Terra- and Aqua-MODIS data offers an insight into the diurnal cycle of aerosol loading.
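
    Validation against AERONET of the kind described typically reports the bias and RMSE of matched AOT pairs. The paired retrievals below are invented for illustration:

```python
import math

# Validation-style comparison of satellite AOT retrievals against
# ground-truth sunphotometer values: mean bias and RMSE. The paired
# values below are made up for illustration, not real matchups.
aeronet = [0.10, 0.25, 0.40, 0.15, 0.30]
modis   = [0.12, 0.23, 0.44, 0.16, 0.27]

n = len(aeronet)
bias = sum(m - a for m, a in zip(modis, aeronet)) / n
rmse = math.sqrt(sum((m - a) ** 2 for m, a in zip(modis, aeronet)) / n)
print(f"bias = {bias:+.3f}, RMSE = {rmse:.3f}")
```

    Real validation studies also report regression slope and the fraction of matchups falling within a pre-launch expected-error envelope.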

  2. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
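
    The effect of imperfect coverage can be sketched with a simple primary/standby model: an uncovered fault fails the system outright, while a covered fault hands over to the standby unit. The failure rate and coverage below are assumed values, not F18 FCS data:

```python
import math

# Reliability with imperfect fault coverage, primary/standby pair:
# an uncovered primary fault fails the system immediately; a covered
# fault switches to the standby, which must then also fail. Numbers
# are illustrative assumptions, not F18 FCS failure rates.
lam = 1e-4      # per-hour component failure rate (assumed)
t = 10.0        # mission time, hours
cov = 0.98      # coverage probability (assumed)

p_fail = 1 - math.exp(-lam * t)              # single-component failure probability

p_system = (1 - cov) * p_fail + cov * p_fail ** 2
p_perfect = p_fail ** 2                      # perfect-coverage reference
print(f"unreliability: perfect coverage {p_perfect:.2e}, cov=0.98 {p_system:.2e}")
```

    Even 98% coverage leaves the uncovered-fault term dominating the redundant-failure term by more than an order of magnitude, which is the point the paper makes about including coverage in the analysis.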

  3. Early Child Grammars: Qualitative and Quantitative Analysis of Morphosyntactic Production

    ERIC Educational Resources Information Center

    Legendre, Geraldine

    2006-01-01

    This article reports on a series of 5 analyses of spontaneous production of verbal inflection (tense and person-number agreement) by 2-year-olds acquiring French as a native language. A formal analysis of the qualitative and quantitative results is developed using the unique resources of Optimality Theory (OT; Prince & Smolensky, 2004). It is…

  4. CUMULATIVE RISK ASSESSMENT: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS

    EPA Science Inventory

    INTRODUCTION: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS FOR CUMULATIVE RISK

    Hugh A. Barton1 and Carey N. Pope2
    1US EPA, Office of Research and Development, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC
    2Department of...

  5. Quantitative methods for the analysis of zoosporic fungi.

    PubMed

    Marano, Agostina V; Gleason, Frank H; Bärlocher, Felix; Pires-Zottarelli, Carmen L A; Lilje, Osu; Schmidt, Steve K; Rasconi, Serena; Kagami, Maiko; Barrera, Marcelo D; Sime-Ngando, Télesphore; Boussiba, Sammy; de Souza, José I; Edwards, Joan E

    2012-04-01

    Quantitative estimation of zoosporic fungi in the environment has historically received little attention, primarily due to methodological challenges and their complex life cycles. Conventional methods for the quantitative analysis of zoosporic fungi have to date mainly relied on direct observation and baiting techniques, with subsequent fungal identification in the laboratory using morphological characteristics. Although these methods are still fundamentally useful, there has been an increasing preference for quantitative microscopic methods based on staining with fluorescent dyes, as well as the use of hybridization probes. More recently, however, PCR-based methods for profiling and quantification (semi-quantitative and absolute) have proven to be rapid and accurate diagnostic tools for assessing zoosporic fungal assemblages in environmental samples. Further application of next-generation sequencing technologies will advance our quantitative understanding not only of zoosporic fungal ecology but also of zoosporic fungal function, through the analysis of their genomes and gene expression as resources and databases expand in the future. Nevertheless, it is still necessary to complement these molecular-based approaches with cultivation-based methods in order to gain a fuller quantitative understanding of the ecological and physiological roles of zoosporic fungi. PMID:22360942
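
    Absolute PCR-based quantification of the kind mentioned above rests on a standard curve of threshold cycle (Ct) against log copy number. The standards and unknown Ct below are invented:

```python
# qPCR absolute quantification sketch: fit a standard curve
# Ct = m * log10(copies) + b, then read unknowns off the line.
# Standards and Ct values are invented for illustration.
standards = [(7, 12.1), (6, 15.5), (5, 18.9), (4, 22.3), (3, 25.7)]  # (log10 copies, Ct)

n = len(standards)
sx = sum(x for x, _ in standards); sy = sum(y for _, y in standards)
sxx = sum(x * x for x, _ in standards); sxy = sum(x * y for x, y in standards)
m = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # least-squares slope
b = (sy - m * sx) / n                           # intercept

efficiency = 10 ** (-1 / m) - 1                 # amplification efficiency from slope
ct_unknown = 20.0
log_copies = (ct_unknown - b) / m
print(f"slope {m:.2f}, efficiency {efficiency:.1%}, unknown ~ {10 ** log_copies:.2e} copies")
```

    A slope near -3.32 corresponds to 100% amplification efficiency; slopes far from that flag inhibition or pipetting problems.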

  6. Quantitating the subtleties of microglial morphology with fractal analysis

    PubMed Central

    Karperien, Audrey; Ahammer, Helmut; Jelinek, Herbert F.

    2013-01-01

    It is well established that microglial form and function are inextricably linked. In recent years, the traditional view that microglial form ranges between “ramified resting” and “activated amoeboid” has been emphasized through advancing imaging techniques that point to microglial form being highly dynamic even within the currently accepted morphological categories. Moreover, microglia adopt meaningful intermediate forms between categories, with considerable crossover in function and varying morphologies as they cycle, migrate, wave, phagocytose, and extend and retract fine and gross processes. From a quantitative perspective, it is problematic to measure such variability using traditional methods, but one way of quantitating such detail is through fractal analysis. The techniques of fractal analysis have been used for quantitating microglial morphology, to categorize gross differences but also to differentiate subtle differences (e.g., amongst ramified cells). Multifractal analysis in particular is one technique of fractal analysis that may be useful for identifying intermediate forms. Here we review current trends and methods of fractal analysis, focusing on box counting analysis, including lacunarity and multifractal analysis, as applied to microglial morphology. PMID:23386810
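
    The box-counting analysis reviewed here can be sketched in a few lines; a filled square, whose box-counting dimension is exactly 2, serves as a sanity check:

```python
import numpy as np

# Box-counting sketch: estimate fractal dimension by counting occupied
# boxes at successively coarser grids. The test image is a filled square
# rather than a real microglial silhouette.
size = 256
img = np.zeros((size, size), dtype=bool)
img[64:192, 64:192] = True          # filled 128x128 square

sizes, counts = [], []
for box in (2, 4, 8, 16, 32):
    # Count boxes of side `box` containing at least one foreground pixel.
    reshaped = img.reshape(size // box, box, size // box, box)
    occupied = reshaped.any(axis=(1, 3)).sum()
    sizes.append(box)
    counts.append(occupied)

# Dimension = -slope of log(count) versus log(box size).
slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
print(f"estimated box-counting dimension: {-slope:.2f}")
```

    For a microglial outline the same procedure yields a fractional dimension between 1 and 2, and sweeping the grid origin and orientation (as lacunarity and multifractal analyses do) probes the subtler variation the review discusses.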

  7. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings from a group of subjects with significant coronary artery stenosis are given. A group of controls determined by use of a quantitative method for the study of regional myocardial performance based on the frame-by-frame analysis of biplane left ventricular angiograms are presented. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions, timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  8. Quantitative transverse flow assessment using OCT speckle decorrelation analysis

    NASA Astrophysics Data System (ADS)

    Liu, Xuan; Huang, Yong; Ramella-Roman, Jessica C.; Kang, Jin U.

    2013-03-01

    In this study, we demonstrate the use of inter-A-scan speckle decorrelation analysis of optical coherence tomography (OCT) to assess fluid flow. This method allows quantitative measurement of fluid flow in a plane normal to the scanning beam. To validate this method, OCT images were obtained from a microfluidic channel with bovine milk flowing at different speeds. We also imaged a blood vessel in an in vivo animal model and performed speckle analysis to assess blood flow.
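
    The decorrelation principle can be illustrated with a toy model in which a fraction of the speckle pattern is refreshed between successive A-scans, standing in for transverse flow; this is simulated noise, not real OCT data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Inter-A-scan speckle decorrelation sketch: faster transverse flow means
# successive A-scans share less speckle, so their correlation drops.
# `alpha` is the fraction of the speckle pattern replaced between A-scans,
# a stand-in for flow speed in this toy model.
def decorrelation(alpha, n=4096):
    a = rng.standard_normal(n)
    b = np.sqrt(1 - alpha) * a + np.sqrt(alpha) * rng.standard_normal(n)
    return 1 - np.corrcoef(a, b)[0, 1]

for alpha in (0.1, 0.5, 0.9):
    print(f"refresh fraction {alpha:.1f} -> decorrelation {decorrelation(alpha):.2f}")
```

    In a real system the decorrelation-versus-speed relation is calibrated against known flows, such as the milk phantom described above.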

  9. Quantitative analysis of culture using millions of digitized books

    PubMed Central

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965
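
    A culturomics-style query reduces to computing relative n-gram frequencies per yearly slice of the corpus. The toy corpus below is invented, not the Google Books data:

```python
from collections import Counter

# Culturomics-style sketch: track the relative frequency of a 1-gram
# across yearly text slices. The tiny "corpus" is invented.
corpus = {
    1900: "the telegraph hummed and the telegraph clicked",
    1950: "the television glowed while the radio played",
    2000: "the internet connected the world to the internet",
}

def relative_frequency(word, text):
    tokens = text.split()
    return Counter(tokens)[word] / len(tokens)

for year, text in corpus.items():
    print(f"{year}: freq('telegraph') = {relative_frequency('telegraph', text):.3f}")
```

    At the scale of millions of books, the same per-year normalization is what turns raw counts into the rise-and-fall trajectories the paper analyzes.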

  10. Quantitative analysis of culture using millions of digitized books.

    PubMed

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  11. Improved method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, which utilizes a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low-electrical-conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  12. Quantitative Rietveld analysis of CAC clinker phases using synchrotron radiation

    SciTech Connect

    Guirado, F. . E-mail: francesc.guirado@urv.cat; Gali, S.

    2006-11-15

The quantitative Rietveld analyses of twenty samples of CAC from four different manufacturers around the world, one synthetic mixture and a NIST standard were performed using synchrotron radiation. Compared with conventional XRD, synchrotron powder diffraction made it possible to identify new minor phases, improve the characterization of solid solutions of iron-rich CAC phases, and reduce preferential orientation and microabsorption effects. Diffraction data were complemented with XRF and TG/DT analyses. Synchrotron results were used as a reference test to improve the performance of conventional powder diffraction through an accurate selection of refinable profile and structural parameters, and permitted several recommendations to be extracted for conventional quantitative Rietveld procedures. It is shown that, with these recommendations in mind, conventional XRD-based Rietveld analyses are comparable to those obtained from synchrotron data. In summary, quantitative XRD Rietveld analysis is confirmed as an excellent tool for the CAC cement industry.

  13. Automatic Quantitative Analysis Of Digital Cardiac Angiographic Sequences

    NASA Astrophysics Data System (ADS)

    Hack, Stanley N.; Lele, Surendra; Streicker, Mark; Mostafavi, Hassan

    1989-05-01

    This paper describes a fully automated system for quantitatively analyzing digital coronary angiograms and ventriculograms. Angiographic and ventriculographic image sequences are initially acquired in a purely digital format and stored on a video-rate digital disk system. Left ventriculograms and coronary angiograms are analyzed by this system which functionally is divided into three distinct categories: 1) Image Sequence Review, 2) Coronary Vasculature Analysis, and 3) Left Ventricular Analysis. The software system described integrates the quantitative analysis functions required in a cardiac catheterization laboratory into a single, menu-driven package. The angiograms and ventriculograms displayed, enhanced, and analyzed are in a digital format. All analyses are performed automatically with provisions for user intervention and editing. The algorithms and models upon which this package is based are described.

  14. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in terms of quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of a certain disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. PMID:24889823

  15. Effect of MODIS Terra Radiometric Calibration Improvements on Collection 6 Deep Blue Aerosol Products: Validation and Terra/Aqua Consistency

    NASA Technical Reports Server (NTRS)

    Sayer, A. M.; Hsu, N. C.; Bettenhausen, C.; Jeong, M.-J.; Meister, G.

    2015-01-01

    The Deep Blue (DB) algorithm's primary data product is midvisible aerosol optical depth (AOD). DB applied to Moderate Resolution Imaging Spectroradiometer (MODIS) measurements provides a data record since early 2000 for MODIS Terra and mid-2002 for MODIS Aqua. In the previous data version (Collection 5, C5), DB production from Terra was halted in 2007 due to sensor degradation; the new Collection 6 (C6) has both improved science algorithms and sensor radiometric calibration. This includes additional calibration corrections developed by the Ocean Biology Processing Group to address MODIS Terra's gain, polarization sensitivity, and detector response versus scan angle, meaning DB can now be applied to the whole Terra record. Through validation with Aerosol Robotic Network (AERONET) data, it is shown that the C6 DB Terra AOD quality is stable throughout the mission to date. Compared to the C5 calibration, in recent years the RMS error compared to AERONET is smaller by approximately 0.04 over bright (e.g., desert) and approximately 0.01-0.02 over darker (e.g., vegetated) land surfaces, and the fraction of points in agreement with AERONET within expected retrieval uncertainty higher by approximately 10% and approximately 5%, respectively. Comparisons to the Aqua C6 time series reveal a high level of correspondence between the two MODIS DB data records, with a small positive (Terra-Aqua) average AOD offset <0.01. The analysis demonstrates both the efficacy of the new radiometric calibration efforts and that the C6 MODIS Terra DB AOD data remain stable (to better than 0.01 AOD) throughout the mission to date, suitable for quantitative scientific analyses.
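The two validation statistics quoted above (RMS error against AERONET and the fraction of retrievals falling within an expected-error envelope) can be sketched as follows. This is a minimal illustration: the AOD arrays and the envelope coefficients `a` and `b` are hypothetical placeholders, not the actual Deep Blue values.

```python
import numpy as np

# Hypothetical matched retrievals: MODIS DB AOD vs. collocated AERONET AOD.
modis_aod = np.array([0.12, 0.35, 0.08, 0.50, 0.22])
aeronet_aod = np.array([0.10, 0.30, 0.09, 0.55, 0.20])

# RMS error of the satellite retrieval against the ground truth.
rmse = np.sqrt(np.mean((modis_aod - aeronet_aod) ** 2))

# Fraction of points within an assumed expected-error envelope of the
# common +/-(a + b * AOD) form; a and b here are placeholders.
a, b = 0.05, 0.20
envelope = a + b * aeronet_aod
frac_in_envelope = np.mean(np.abs(modis_aod - aeronet_aod) <= envelope)
```

Comparing such statistics between two calibrations (C5 vs. C6) is then a matter of computing both quantities on each matched dataset.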

  16. An Improved Quantitative Analysis Method for Plant Cortical Microtubules

    PubMed Central

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has so far been paid to the quantitative image analysis of plant cortical microtubules. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in the preprocessing of the original microtubule image. The Intrinsic Mode Function 1 (IMF1) image obtained by decomposition was then selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. In order to further verify its reliability, the proposed texture analysis method was used to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges well while reducing noise, and that the geometrical characteristics of the texture were evident. Four texture parameters extracted by GLCM clearly reflected the different arrangements in the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for the quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies a reference for other similar studies. PMID:24744684
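As a minimal sketch of the GLCM texture step described above (not the authors' implementation), a co-occurrence matrix and two classic Haralick parameters can be computed directly in NumPy on a toy image; the 4x4 patch is hypothetical data standing in for an IMF1 sub-image.

```python
import numpy as np

# Toy 4-level grayscale patch (hypothetical stand-in for an IMF1 image).
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])

levels = 4
# Grey-Level Co-occurrence Matrix for the horizontal neighbour offset (0, 1).
glcm = np.zeros((levels, levels))
for i, j in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
    glcm[i, j] += 1
glcm /= glcm.sum()  # normalise to joint probabilities

# Two of the classic GLCM texture parameters.
ii, jj = np.indices((levels, levels))
contrast = np.sum(glcm * (ii - jj) ** 2)
energy = np.sum(glcm ** 2)
```

In practice one would aggregate such parameters over several offsets and angles before comparing microtubule arrangements.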

  17. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study of an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined for both building-like and non-building-like industrial components, have been crossed with outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in southern Italy. Once the seismic failure probabilities had been quantified, consequence analysis was performed for those events which may be triggered by loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference. PMID:15908107

  18. Quantitative NMR Analysis of Partially Substituted Biodiesel Glycerols

    SciTech Connect

    Nagy, M.; Alleman, T. L.; Dyer, T.; Ragauskas, A. J.

    2009-01-01

Phosphitylation of hydroxyl groups in biodiesel samples with 2-chloro-4,4,5,5-tetramethyl-1,3,2-dioxaphospholane followed by 31P-NMR analysis provides a rapid quantitative analytical technique for the determination of substitution patterns on partially esterified glycerols. The unique 31P-NMR chemical shift data were established with a series of mono- and di-substituted fatty acid esters of glycerol and then utilized to characterize an industrial sample of partially processed biodiesel.

  19. Quantitative SERS analysis of azorubine (E 122) in sweet drinks.

    PubMed

    Peksa, Vlastimil; Jahn, Martin; Štolcová, Lucie; Schulz, Volker; Proška, Jan; Procházka, Marek; Weber, Karina; Cialla-May, Dana; Popp, Jürgen

    2015-03-01

Considering both the potential effects on human health and the need for knowledge of food composition, quantitative detection of synthetic dyes in foodstuffs and beverages is an important issue. For the first time, we report a fast quantitative analysis of the food and drink colorant azorubine (E 122) in different types of beverages using surface-enhanced Raman scattering (SERS) without any sample preparation. Seven commercially available sweet drinks (including two negative controls) with high levels of complexity (sugar/artificial sweetener, ethanol content, etc.) were tested. Highly uniform Au "film over nanospheres" (FON) substrates, together with the use of the Raman signal from the silicon support as an internal intensity standard, enabled us to quantitatively determine the concentration of azorubine in each drink. SERS spectral analysis provided sufficient sensitivity (0.5-500 mg L(-1)), and the determined azorubine concentrations closely correlated with those obtained by a standard HPLC technique. The analysis was direct, without the need for any pretreatment of the drinks or the Au surface. Our SERS approach is a simple and rapid (35 min) prescan method which can easily be implemented for field application and for preliminary testing of food samples. PMID:25664564

  20. Quantitative analysis of vitamin A using Fourier transform Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Hancewicz, Thomas M.; Petty, Chris

    1995-11-01

Near-infrared Fourier transform Raman spectroscopy has been successfully used to quantitatively analyze vitamin A additives in a sorbitan mono-oleate base vehicle. Although measurements can be made on the raw materials, their high viscosity makes them difficult to handle in an industrial testing lab. Accurate quantitation is possible using a simple dilution of the sample, which reduces the overall measurement time by speeding up preparation and clean-up. Results are quantified over a range of 0.05 mg ml-1 up to 1 mg ml-1 using a partial least-squares analysis model. Factors affecting quantitative analysis using FT Raman instrumentation in an industrial environment are discussed. Application of multiplicative scatter correction (MSC) as a pretreatment step for Raman data is discussed with reference to the partial least squares (PLS) calibration. A discussion is also presented of the information embedded in the latent PLS factors and how analysis of these factors can often add to an understanding of the chemical information being modeled.
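The MSC pretreatment mentioned above has a compact standard form: each spectrum is regressed on the mean spectrum and the fitted offset and slope are removed. A minimal sketch with hypothetical spectra follows (the subsequent PLS regression step is omitted):

```python
import numpy as np

# Hypothetical raw spectra (rows = samples, columns = Raman shifts).
spectra = np.array([[1.0, 2.0, 3.0, 4.0],
                    [2.1, 4.0, 6.2, 8.1],
                    [0.9, 2.1, 2.9, 4.2]])

# Multiplicative scatter correction: regress each spectrum on the mean
# spectrum (x = a + b * ref) and remove the fitted offset and slope.
ref = spectra.mean(axis=0)
corrected = np.empty_like(spectra)
for k, x in enumerate(spectra):
    b, a = np.polyfit(ref, x, 1)   # slope, intercept
    corrected[k] = (x - a) / b
```

After correction, each spectrum fits the reference with unit slope and zero intercept, so multiplicative scatter differences no longer dominate the calibration.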

  1. Quantitative Motion Analysis in Two and Three Dimensions.

    PubMed

    Wessels, Deborah J; Lusche, Daniel F; Kuhl, Spencer; Scherer, Amanda; Voss, Edward; Soll, David R

    2016-01-01

    This chapter describes 2D quantitative methods for motion analysis as well as 3D motion analysis and reconstruction methods. Emphasis is placed on the analysis of dynamic cell shape changes that occur through extension and retraction of force generating structures such as pseudopodia and lamellipodia. Quantitative analysis of these structures is an underutilized tool in the field of cell migration. Our intent, therefore, is to present methods that we developed in an effort to elucidate mechanisms of basic cell motility, directed cell motion during chemotaxis, and metastasis. We hope to demonstrate how application of these methods can more clearly define alterations in motility that arise due to specific mutations or disease and hence, suggest mechanisms or pathways involved in normal cell crawling and treatment strategies in the case of disease. In addition, we present a 4D tumorigenesis model for high-resolution analysis of cancer cells from cell lines and human cancer tissue in a 3D matrix. Use of this model led to the discovery of the coalescence of cancer cell aggregates and unique cell behaviors not seen in normal cells or normal tissue. Graphic illustrations to visually display and quantify cell shape are presented along with algorithms and formulae for calculating select 2D and 3D motion analysis parameters. PMID:26498790

  2. Quantitative Proteomic Approaches for Analysis of Protein S-Nitrosylation.

    PubMed

    Qu, Zhe; Greenlief, C Michael; Gu, Zezong

    2016-01-01

    S-Nitrosylation is a redox-based post-translational modification of a protein in response to nitric oxide (NO) signaling, and it participates in a variety of processes in diverse biological systems. The significance of this type of protein modification in health and diseases is increasingly recognized. In the central nervous system, aberrant S-nitrosylation, due to excessive NO production, is known to cause protein misfolding, mitochondrial dysfunction, transcriptional dysregulation, and neuronal death. This leads to an altered physiological state and consequently contributes to pathogenesis of neurodegenerative disorders. To date, much effort has been made to understand the mechanisms underlying protein S-nitrosylation, and several approaches have been developed to unveil S-nitrosylated proteins from different organisms. Interest in determining the dynamic changes of protein S-nitrosylation under different physiological and pathophysiological conditions has underscored the need for the development of quantitative proteomic approaches. Currently, both gel-based and gel-free mass spectrometry-based quantitative methods are widely used, and they each have advantages and disadvantages but may also be used together to produce complementary data. This review evaluates current available quantitative proteomic techniques for the analysis of protein S-nitrosylation and highlights recent advances, with emphasis on applications in neurodegenerative diseases. An important goal is to provide a comprehensive guide of feasible quantitative proteomic methodologies for examining protein S-nitrosylation in research to yield insights into disease mechanisms, diagnostic biomarkers, and drug discovery. PMID:26544640

  3. Quantitative analysis of the selective pressure exerted on homologous proteins.

    PubMed

    Blanca, F; Ferraz, C; Sri Widada, J; Liautard, J P

    1990-07-01

Evolution analysis is used to locate the regions of a protein that are important for its function or structure. The rate of evolution is generally constant for a given family of homologous sequences. Starting from this observation, an algorithm is proposed to establish quantitatively the sequence zones where selective pressure is maximal. A program that computes this pressure has been written in PASCAL. Analysis of results on several sequences validates this theoretical approach, and this knowledge can be used as a starting point for carrying out site-directed mutagenesis. PMID:2207746

  4. Quantitative Epistasis Analysis and Pathway Inference from Genetic Interaction Data

    PubMed Central

    Phenix, Hilary; Morin, Katy; Batenchuk, Cory; Parker, Jacob; Abedi, Vida; Yang, Liu; Tepliakova, Lioudmila; Perkins, Theodore J.; Kærn, Mads

    2011-01-01

Inferring regulatory and metabolic network models from quantitative genetic interaction data remains a major challenge in systems biology. Here, we present a novel quantitative model for interpreting epistasis within pathways responding to an external signal. The model provides the basis of an experimental method to determine the architecture of such pathways, and establishes a new set of rules to infer the order of genes within them. The method also allows the extraction of quantitative parameters enabling a new level of information to be added to genetic network models. It is applicable to any system where the impact of combinatorial loss-of-function mutations can be quantified with sufficient accuracy. We test the method by conducting a systematic analysis of a thoroughly characterized eukaryotic gene network, the galactose utilization pathway in Saccharomyces cerevisiae. For this purpose, we quantify the effects of single and double gene deletions on two phenotypic traits, fitness and reporter gene expression. We show that applying our method to fitness traits reveals the order of metabolic enzymes and the effects of accumulating metabolic intermediates. Conversely, the analysis of expression traits reveals the order of transcriptional regulatory genes, secondary regulatory signals and their relative strength. Strikingly, when the analyses of the two traits are combined, the method correctly infers ~80% of the known relationships without any false positives. PMID:21589890
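The core epistasis measurement on fitness traits can be illustrated with the standard multiplicative null model; the fitness values below are hypothetical, and the paper's own model for signal-responding pathways is more elaborate than this baseline.

```python
# Hypothetical relative fitness values (wild type = 1.0).
w_a = 0.8    # single deletion of gene A
w_b = 0.9    # single deletion of gene B
w_ab = 0.72  # double deletion of A and B

# Multiplicative null model: no interaction if w_ab == w_a * w_b.
epsilon = w_ab - w_a * w_b
# epsilon > 0: alleviating interaction; epsilon < 0: aggravating;
# epsilon == 0: no epistasis under this null model.
```

Here the double mutant matches the product of the singles, so the two genes show no epistasis under the multiplicative model.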

  5. Quantitative analysis of transferrin cycling by automated fluorescence microscopy.

    PubMed

    Hirschmann, David T; Kasper, Christoph A; Spiess, Martin

    2015-01-01

Surface receptors are transported between the plasma membrane and intracellular compartments by various endocytic mechanisms and by recycling via different pathways from sorting or recycling endosomes. The analysis of cellular components involved in mediating or regulating these transport steps is of high current interest and requires quantitative methods to determine rates of endocytosis and/or recycling. Various biochemical procedures to measure uptake of labeled ligand molecules or internalization and reappearance of surface-labeled receptors have been developed. Here, we describe a quantitative method based on fluorescence microscopy of adherent cells taking advantage of the transferrin (Tf) receptor as the prototype of cycling transport receptors. Tf is endocytosed with bound Fe(3+) and, upon release of the iron ion in endosomes, recycled as apo-Tf together with the receptor. To follow the ligand-receptor complex, fluorescently labeled Tf is used and detected microscopically with or without releasing Tf from cell surface receptors by acid stripping. To go beyond the observation of a few individual cells, automated fluorescence microscopy is employed to image thousands of cells at different time points and in parallel with different treatments (such as chemical inhibitors, siRNA silencing, or transfection of candidate genes) in a 96-well format. Computer-assisted image analysis allows unbiased quantitation of the Tf content of each cell and distinguishes between different cell populations. PMID:25702129

  6. Label-Free Technologies for Quantitative Multiparameter Biological Analysis

    PubMed Central

    Qavi, Abraham J.; Washburn, Adam L.; Byeon, Ji-Yeon; Bailey, Ryan C.

    2009-01-01

    In the post-genomic era, information is king and information-rich technologies are critically important drivers in both fundamental biology and medicine. It is now known that single-parameter measurements provide only limited detail and that quantitation of multiple biomolecular signatures can more fully illuminate complex biological function. Label-free technologies have recently attracted significant interest for sensitive and quantitative multiparameter analysis of biological systems. There are several different classes of label-free sensors that are currently being developed both in academia and in industry. In this critical review, we highlight, compare, and contrast some of the more promising approaches. We will describe the fundamental principles of these different methodologies and discuss advantages and disadvantages that might potentially help one in selecting the appropriate technology for a given bioanalytical application. PMID:19221722

  7. Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells.

    PubMed

    Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2015-01-01

    Leaf pavement cells are shaped like a jigsaw puzzle in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cells morphogenesis and proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed quantitative analysis of cortical microtubule orientation in leaf pavement cells in Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules kept parallel orientations to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells. PMID:26039484

  8. Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells

    PubMed Central

    Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2015-01-01

    Leaf pavement cells are shaped like a jigsaw puzzle in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cells morphogenesis and proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed quantitative analysis of cortical microtubule orientation in leaf pavement cells in Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules kept parallel orientations to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells. PMID:26039484

  9. Quantitative 3D analysis of huge nanoparticle assemblies.

    PubMed

    Zanaga, Daniele; Bleichrodt, Folkert; Altantzis, Thomas; Winckelmans, Naomi; Palenstijn, Willem Jan; Sijbers, Jan; de Nijs, Bart; van Huis, Marijn A; Sánchez-Iglesias, Ana; Liz-Marzán, Luis M; van Blaaderen, Alfons; Joost Batenburg, K; Bals, Sara; Van Tendeloo, Gustaaf

    2015-12-17

    Nanoparticle assemblies can be investigated in 3 dimensions using electron tomography. However, it is not straightforward to obtain quantitative information such as the number of particles or their relative position. This becomes particularly difficult when the number of particles increases. We propose a novel approach in which prior information on the shape of the individual particles is exploited. It improves the quality of the reconstruction of these complex assemblies significantly. Moreover, this quantitative Sparse Sphere Reconstruction approach yields directly the number of particles and their position as an output of the reconstruction technique, enabling a detailed 3D analysis of assemblies with as many as 10 000 particles. The approach can also be used to reconstruct objects based on a very limited number of projections, which opens up possibilities to investigate beam sensitive assemblies where previous reconstructions with the available electron tomography techniques failed. PMID:26607629

  10. Quantitative analysis of enzymatic assays using indoxyl-based substrates.

    PubMed

    Fanjul-Bolado, Pablo; González-García, María Begoña; Costa-García, Agustín

    2006-11-01

Hydrolysis of indoxyl-based substrates by hydrolytic enzymes is a commonly used semiquantitative detection system that generates a water-insoluble indigo dye which is difficult to quantify. This work describes the quantitative analysis and enzyme kinetics for alkaline phosphatase (AP) and 5-bromo-4-chloro-3-indoxyl phosphate (BCIP) in solution, obtained by applying known solubilization methodology from the textile industry to the enzymatic product. The proposal is based on the reduction of the tetrahalo-indigo blue dye in a basic medium, with the aim of generating its aqueous-soluble parent compound, termed indigo white, which gives a rich yellow color in solution and is fluorescent. A quantitative ELISA (where a soluble end product is required) is accomplished for the first time using BCIP as substrate. PMID:17036214

  11. Universal Quantitative NMR Analysis of Complex Natural Samples

    PubMed Central

    Simmler, Charlotte; Napolitano, José G.; McAlpine, James B.; Chen, Shao-Nong; Pauli, Guido F.

    2013-01-01

    Nuclear Magnetic Resonance (NMR) is a universal and quantitative analytical technique. Being a unique structural tool, NMR also competes with metrological techniques for purity determination and reference material analysis. In pharmaceutical research, applications of quantitative NMR (qNMR) cover mostly the identification and quantification of drug and biological metabolites. Offering an unbiased view of the sample composition, and the possibility to simultaneously quantify multiple compounds, qNMR has become the method of choice for metabolomic studies and quality control of complex natural samples such as foods, plants or herbal remedies, and biofluids. In this regard, NMR-based metabolomic studies, dedicated to both the characterization of herbal remedies and clinical diagnosis, have increased considerably. PMID:24484881
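The purity determinations mentioned above reduce, in the common internal-calibrant setup, to a ratio of normalized signal integrals. A sketch with hypothetical numbers (all masses, integrals, and molar masses below are illustrative, not from any cited study):

```python
# Hypothetical qNMR purity calculation against an internal calibrant.
# I: integrated signal area, N: number of nuclei giving rise to the signal,
# M: molar mass (g/mol), m: weighed mass (mg), P_c: calibrant purity fraction.
I_a, N_a, M_a, m_a = 1.8, 2, 180.16, 10.0   # analyte (illustrative values)
I_c, N_c, M_c, m_c = 9.0, 9, 178.23, 10.0   # internal calibrant
P_c = 0.999

# Standard internal-standard qNMR relation for analyte purity.
purity = (I_a / N_a) / (I_c / N_c) * (M_a / M_c) * (m_c / m_a) * P_c
```

The same ratio-of-normalized-integrals logic underlies multi-component quantification in complex natural samples, with one resolved signal per analyte.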

  12. Quantitative 3D analysis of huge nanoparticle assemblies

    NASA Astrophysics Data System (ADS)

    Zanaga, Daniele; Bleichrodt, Folkert; Altantzis, Thomas; Winckelmans, Naomi; Palenstijn, Willem Jan; Sijbers, Jan; de Nijs, Bart; van Huis, Marijn A.; Sánchez-Iglesias, Ana; Liz-Marzán, Luis M.; van Blaaderen, Alfons; Joost Batenburg, K.; Bals, Sara; van Tendeloo, Gustaaf

    2015-12-01

Nanoparticle assemblies can be investigated in 3 dimensions using electron tomography. However, it is not straightforward to obtain quantitative information such as the number of particles or their relative position. This becomes particularly difficult when the number of particles increases. We propose a novel approach in which prior information on the shape of the individual particles is exploited. It improves the quality of the reconstruction of these complex assemblies significantly. Moreover, this quantitative Sparse Sphere Reconstruction approach yields directly the number of particles and their position as an output of the reconstruction technique, enabling a detailed 3D analysis of assemblies with as many as 10 000 particles. The approach can also be used to reconstruct objects based on a very limited number of projections, which opens up possibilities to investigate beam sensitive assemblies where previous reconstructions with the available electron tomography techniques failed. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr06962a

  13. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    ERIC Educational Resources Information Center

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)
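The quantitation step in such an atomic absorption experiment is a linear calibration against standards, followed by interpolation of the unknown. A minimal sketch with hypothetical standards:

```python
import numpy as np

# Hypothetical calibration standards for one metal by atomic absorption.
conc = np.array([0.0, 1.0, 2.0, 4.0])          # mg/L
absorbance = np.array([0.00, 0.11, 0.22, 0.44])  # measured absorbance

# In the Beer-Lambert linear region, absorbance is proportional to
# concentration; fit a straight line to the standards.
slope, intercept = np.polyfit(conc, absorbance, 1)

# Quantify an unknown sample from its measured absorbance.
unknown_abs = 0.33
unknown_conc = (unknown_abs - intercept) / slope
```

In the experiment described, the same calibration would be prepared per element (Ni, Co, Fe) after the mini-column separation.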

  14. Lipid biomarker analysis for the quantitative analysis of airborne microorganisms

    SciTech Connect

    Macnaughton, S.J.; Jenkins, T.L.; Cormier, M.R.

    1997-08-01

There is an ever-increasing concern regarding the presence of airborne microbial contaminants within indoor air environments. Exposure to such biocontaminants can give rise to a large number of different health effects, including infectious diseases, allergenic responses and respiratory problems. Biocontaminants typically found in indoor air environments include bacteria, fungi, algae, protozoa and dust mites. Mycotoxins, endotoxins, pollens and residues of organisms are also known to cause adverse health effects. A quantitative detection/identification technique independent of culturability, which assays both culturable and non-culturable biomass including endotoxin, is critical in defining risks from indoor air biocontamination. Traditionally, methods employed for monitoring microorganism numbers in indoor air environments involve classical culture-based techniques and/or direct microscopic counting. It has been repeatedly documented that viable microorganism counts account for only 0.1-10% of the total community detectable by direct counting. The classic viable microbiological approach does not provide accurate estimates of microbial fragments or other indoor air components that can act as antigens and induce or potentiate allergic responses. Although bioaerosol samplers are designed to damage the microbes as little as possible, microbial stress has been shown to result from air sampling, aerosolization and microbial collection. Higher collection efficiency results in greater cell damage, while less cell damage often results in lower collection efficiency. Filtration can collect particulates at almost 100% efficiency, but captured microorganisms may become dehydrated and damaged, resulting in non-culturability. The lipid biomarker assays described herein, however, do not rely on cell culture: lipids are components that are universally distributed throughout cells, providing a means of assessment independent of culturability.

  15. Variability in quantitative cardiac magnetic resonance perfusion analysis.

    PubMed

    Bratis, K; Nagel, Eike

    2013-06-01

By taking advantage of its high spatial resolution and noninvasive, nontoxic nature, first-pass perfusion cardiovascular magnetic resonance (CMR) has become an indispensable tool for the noninvasive detection of reversible myocardial ischemia. A potential advantage of perfusion CMR is its ability to quantitatively assess perfusion reserve within a myocardial segment, expressed semi-quantitatively by the myocardial perfusion reserve index (MPRI) and fully quantitatively by absolute myocardial blood flow (MBF). In contrast to the high accuracy and reliability of CMR in evaluating cardiac function and volumes, perfusion CMR is adversely affected by multiple sources of variability during both data acquisition and post-processing. Image acquisition techniques, contrast agents and doses, variable blood flow at rest, and variable reactions to stress all influence the acquired data. Mechanisms underlying the variability in perfusion CMR post-processing, as well as their clinical significance, are yet to be fully elucidated. The development of a universal, reproducible, accurate and easily applicable tool for CMR perfusion analysis remains a challenge and would substantially reinforce the role of perfusion CMR in improving clinical care. PMID:23825774
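As a minimal sketch of the semi-quantitative index mentioned above: MPRI is commonly computed as the stress-to-rest ratio of myocardial signal upslopes, each normalized by the left-ventricular blood-pool upslope. This is one common definition, not necessarily the exact formulation used in the article; the function name and example values are illustrative.

```python
def mpri(stress_myo_upslope, stress_lv_upslope, rest_myo_upslope, rest_lv_upslope):
    """Semi-quantitative myocardial perfusion reserve index: ratio of the
    stress and rest myocardial signal upslopes, each normalized by the
    corresponding left-ventricular blood-pool upslope."""
    stress_ru = stress_myo_upslope / stress_lv_upslope  # relative upslope at stress
    rest_ru = rest_myo_upslope / rest_lv_upslope        # relative upslope at rest
    return stress_ru / rest_ru

# A blunted stress response in an ischemic segment gives MPRI < 1:
print(mpri(2.0, 10.0, 1.5, 6.0))  # → 0.8
```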

  16. Chromatin immunoprecipitation: optimization, quantitative analysis and data normalization

    PubMed Central

    Haring, Max; Offermann, Sascha; Danker, Tanja; Horst, Ina; Peterhansel, Christoph; Stam, Maike

    2007-01-01

    Background Chromatin remodeling, histone modifications and other chromatin-related processes play a crucial role in gene regulation. A very useful technique to study these processes is chromatin immunoprecipitation (ChIP). ChIP is widely used for a few model systems, including Arabidopsis, but establishment of the technique for other organisms is still remarkably challenging. Furthermore, quantitative analysis of the precipitated material and normalization of the data is often underestimated, negatively affecting data quality. Results We developed a robust ChIP protocol, using maize (Zea mays) as a model system, and present a general strategy to systematically optimize this protocol for any type of tissue. We propose endogenous controls for active and for repressed chromatin, and discuss various other controls that are essential for successful ChIP experiments. We found that the use of quantitative PCR (QPCR) is crucial for obtaining high-quality ChIP data, and we explain why. The method of data normalization has a major impact on the quality of ChIP analyses. Therefore, we analyzed different normalization strategies, resulting in a thorough discussion of the advantages and drawbacks of the various approaches. Conclusion Here we provide a robust ChIP protocol and strategy to optimize the protocol for any type of tissue; we argue that quantitative real-time PCR (QPCR) is the best method to analyze the precipitates, and present comprehensive insights into data normalization. PMID:17892552
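As one concrete example of the kind of normalization strategy discussed above, many ChIP-QPCR analyses report enrichment as percent of input. A minimal sketch, assuming ideal two-fold amplification per cycle (the function name and defaults are ours, not the paper's):

```python
import math

def percent_input(ct_ip, ct_input, input_fraction=0.01):
    """Percent-of-input normalization for ChIP-QPCR.

    ct_ip          -- Ct of the immunoprecipitated sample
    ct_input       -- Ct measured on the saved input chromatin
    input_fraction -- fraction of chromatin kept as input (default 1%)
    """
    # Shift the input Ct so that it represents 100% of the chromatin.
    ct_input_adjusted = ct_input - math.log2(1.0 / input_fraction)
    # Assuming a doubling of product per QPCR cycle.
    return 100.0 * 2.0 ** (ct_input_adjusted - ct_ip)

# An IP coming up 5 cycles later than a 1% input recovers ~0.03% of input:
print(round(percent_input(25.0, 20.0), 3))
```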

  17. Fluorescent foci quantitation for high-throughput analysis

    PubMed Central

    Ledesma-Fernández, Elena; Thorpe, Peter H.

    2015-01-01

    A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells. PMID:26290880
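The FociQuant tools themselves are freely available from the authors; as an independent, simplified illustration of the underlying idea (threshold the image, label connected foci, integrate background-corrected intensity per focus), in Python with NumPy/SciPy:

```python
import numpy as np
from scipy import ndimage

def quantify_foci(image, threshold):
    """Segment bright foci above `threshold` and return the
    background-corrected integrated intensity of each focus."""
    mask = image > threshold
    labels, n = ndimage.label(mask)        # connected-component labeling
    background = np.median(image[~mask])   # crude background estimate
    intensities = []
    for i in range(1, n + 1):
        pix = image[labels == i]
        intensities.append(float(pix.sum() - background * pix.size))
    return intensities

# Toy image: one 2x2 focus of value 10 on a background of 1.
img = np.ones((8, 8))
img[2:4, 2:4] = 10.0
print(quantify_foci(img, 5.0))  # → [36.0]  (4 pixels x (10 - 1))
```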

  18. A quantitative analysis of IRAS maps of molecular clouds

    NASA Technical Reports Server (NTRS)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-01-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In the present study we use the IRAS continuum maps at 100 and 60 micrometer(s) to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.
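To make the pseudometric idea concrete, here is a toy example of a distance function of the kind described: it compares only the density distributions of two maps, so two different maps with identical distributions sit at distance zero, which is the defining property of a pseudometric. This sketch is ours, not the authors' actual output functions:

```python
import numpy as np

def density_pseudometric(map_a, map_b, bins=32, lo=0.0, hi=1.0):
    """Total-variation distance between the normalized density histograms
    of two maps.  Distinct maps sharing the same density distribution get
    distance 0, so this is a pseudometric rather than a metric."""
    edges = np.linspace(lo, hi, bins + 1)
    width = edges[1] - edges[0]
    pa, _ = np.histogram(map_a, bins=edges, density=True)
    pb, _ = np.histogram(map_b, bins=edges, density=True)
    return 0.5 * width * np.abs(pa - pb).sum()

rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = rng.random((64, 64))
print(density_pseudometric(a, a.T))  # same distribution, different map → 0.0
```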

  19. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
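The construction described, bootstrap-sampling prey signatures to build a pseudo-predator signature with known diet proportions, can be sketched as follows. This is simplified: real QFASA simulations also apply calibration coefficients, and the paper's contribution is precisely an algorithm for choosing the bootstrap sample sizes. All names and values here are illustrative.

```python
import numpy as np

def pseudo_predator(prey_sigs, diet, n_boot, rng):
    """Construct one pseudo-predator fatty acid signature.

    prey_sigs -- dict: prey type -> (n_samples, n_fatty_acids) array
    diet      -- dict: prey type -> known diet proportion (sums to 1)
    n_boot    -- bootstrap sample size drawn per prey type
    """
    n_fa = next(iter(prey_sigs.values())).shape[1]
    sig = np.zeros(n_fa)
    for prey, proportion in diet.items():
        samples = prey_sigs[prey]
        idx = rng.integers(0, len(samples), size=n_boot)  # bootstrap rows
        sig += proportion * samples[idx].mean(axis=0)     # mixed mean signature
    return sig / sig.sum()                                # renormalize

rng = np.random.default_rng(42)
prey = {"seal": rng.dirichlet(np.ones(5), 30),
        "fish": rng.dirichlet(np.ones(5), 30)}
sig = pseudo_predator(prey, {"seal": 0.7, "fish": 0.3}, n_boot=20, rng=rng)
print(sig)  # a 5-component signature summing to 1
```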

  20. Quantitative proteomic analysis of yeast DNA replication proteins.

    PubMed

    Kubota, Takashi; Stead, David A; Hiraga, Shin-ichiro; ten Have, Sara; Donaldson, Anne D

    2012-06-01

    Chromatin is dynamically regulated, and proteomic analysis of its composition can provide important information about chromatin functional components. Many DNA replication proteins, for example, bind chromatin at specific times during the cell cycle. Proteomic investigation can also be used to characterize changes in chromatin composition in response to perturbations such as DNA damage, while useful information is obtained by testing the effects on chromatin composition of mutations in chromosome stability pathways. We have successfully used the method of stable isotope labeling by amino acids in cell culture (SILAC) for quantitative proteomic analysis of normal and pathological changes to yeast chromatin. Here we describe this proteomic method for analyzing changes to Saccharomyces cerevisiae chromatin, illustrating the procedure with an analysis of the changes that occur in chromatin composition as cells progress from a G1 phase block (induced by alpha factor) into S phase (in the presence of DNA replication inhibitor hydroxyurea). PMID:22465796
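In SILAC, relative abundance comes from heavy-to-light intensity ratios of labeled peptide pairs, commonly summarized per protein by the median of log ratios. A minimal sketch of that summary step (our illustration, not the authors' pipeline):

```python
import numpy as np

def protein_silac_ratio(heavy_intensities, light_intensities):
    """Protein-level heavy/light SILAC ratio, summarized as the median of
    per-peptide log2 ratios (robust to a single outlier peptide)."""
    h = np.asarray(heavy_intensities, dtype=float)
    lt = np.asarray(light_intensities, dtype=float)
    log_ratios = np.log2(h / lt)
    return float(2.0 ** np.median(log_ratios))

# Three peptides of a protein roughly 2-fold enriched on chromatin:
print(protein_silac_ratio([200.0, 410.0, 95.0], [100.0, 200.0, 50.0]))  # → 2.0
```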

  1. Quantitative strain analysis in AlGaAs-based devices

    NASA Astrophysics Data System (ADS)

    Tomm, Jens W.; Gerhardt, Axel; Müller, Roland; Biermann, Mark L.; Holland, Joseph P.; Lorenzen, Dirk; Kaulfersch, Eberhard

    2003-06-01

    We present a strategy for quantitative spectroscopic analysis of packaging-induced strain using both finite element analysis and band-structure calculations. This approach holds for a wide class of AlGaAs-based, and related, devices, among them high-power "cm-bars." The influence on the results of particular device structure properties, such as intrinsic strain and quantum-well geometry, is analyzed. We compare theoretical results based on a uniaxial stress model with photocurrent data obtained from an externally strained cm-bar, and obtain better agreement than from alternative strain models. The general approach is also applicable to the analysis of all data that refer to changes of the electronic band structure, such as absorption and photoluminescence.

  2. Teaching Neuroinformatics with an Emphasis on Quantitative Locus Analysis

    PubMed Central

    Grisham, William; Korey, Christopher A.; Schottler, Natalie A.; McCauley, Lisa Beck; Beatty, Jackson

    2012-01-01

    Although powerful bioinformatics tools are available for free on the web and are used by neuroscience professionals on a daily basis, neuroscience students are largely ignorant of them. This Neuroinformatics module weaves together several bioinformatics tools to make a comprehensive unit. This unit encompasses quantifying a phenotype through a Quantitative Trait Locus (QTL) analysis, which links phenotype to loci on chromosomes that likely had an impact on the phenotype. Students then are able to sift through a list of genes in the region(s) of the chromosome identified by the QTL analysis and find a candidate gene that has relatively high expression in the brain region of interest. Once such a candidate gene is identified, students can find out more information about the gene, including the cells/layers in which it is expressed, the sequence of the gene, and an article about the gene. All of the resources employed are available at no cost via the internet. Didactic elements of this instructional module include genetics, neuroanatomy, Quantitative Trait Locus analysis, molecular techniques in neuroscience, and statistics—including multiple regression, ANOVA, and a bootstrap technique. This module was presented at the Faculty for Undergraduate Neuroscience (FUN) 2011 Workshop at Pomona College and can be accessed at http://mdcune.psych.ucla.edu/modules/bioinformatics. PMID:23493834

  3. Neutron diffractometer INES for quantitative phase analysis of archaeological objects

    NASA Astrophysics Data System (ADS)

    Imberti, S.; Kockelmann, W.; Celli, M.; Grazzi, F.; Zoppi, M.; Botti, A.; Sodo, A.; Imperiale, M. Leo; de Vries-Melein, M.; Visser, D.; Postma, H.

    2008-03-01

    With the Italian Neutron Experimental Station (INES), a new general-purpose neutron powder diffractometer is available at ISIS, characterized by high resolution at low d-spacings and particularly suited for the quantitative phase analysis of a wide range of archaeological materials. Time-of-flight neutron diffraction is notable for being a non-destructive technique, allowing a reliable determination of the phase compositions of multiphase artefacts, with or without superficial corrosion layers. A selection of archaeometric studies carried out during the first year of the INES user programme is presented here to demonstrate the capabilities of the instrument.
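Quantitative phase analysis from powder diffraction data of this kind typically proceeds via Rietveld refinement, where phase weight fractions follow from the refined scale factors through the Hill-Howard relation. A sketch of that final step, with illustrative values (the instrument's actual analysis chain is not described here):

```python
def phase_weight_fractions(scales, zmv):
    """Weight fractions from Rietveld scale factors via the Hill-Howard
    relation W_i = S_i (ZMV)_i / sum_j S_j (ZMV)_j, where Z is the number
    of formula units per cell, M the formula mass, V the cell volume."""
    szmv = [s * z for s, z in zip(scales, zmv)]
    total = sum(szmv)
    return [v / total for v in szmv]

# Two phases with refined scales 1.0 and 0.5 and (ZMV) products 200 and 100:
fractions = phase_weight_fractions([1.0, 0.5], [200.0, 100.0])
print(fractions)  # → [0.8, 0.2]
```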

  4. Quantitative trace analysis of complex mixtures using SABRE hyperpolarization.

    PubMed

    Eshuis, Nan; van Weerdenburg, Bram J A; Feiters, Martin C; Rutjes, Floris P J T; Wijmenga, Sybren S; Tessari, Marco

    2015-01-26

    Signal amplification by reversible exchange (SABRE) is an emerging nuclear spin hyperpolarization technique that strongly enhances NMR signals of small molecules in solution. However, such signal enhancements have never been exploited for concentration determination, as the efficiency of SABRE can strongly vary between different substrates or even between nuclear spins in the same molecule. The first application of SABRE for the quantitative analysis of a complex mixture is now reported. Despite the inherent complexity of the system under investigation, which involves thousands of competing binding equilibria, analytes at concentrations in the low micromolar range could be quantified from single-scan SABRE spectra using a standard-addition approach. PMID:25469822
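The standard-addition approach mentioned above quantifies an analyte by spiking known amounts into the sample and extrapolating the linear response back to zero signal. A generic sketch of that calculation (not the authors' SABRE-specific processing):

```python
import numpy as np

def standard_addition_conc(added, signal):
    """Estimate an unknown concentration by the standard-addition method:
    fit signal = slope*added + intercept; the unknown concentration is
    intercept/slope (the magnitude of the x-intercept)."""
    slope, intercept = np.polyfit(added, signal, 1)
    return intercept / slope

# Synthetic example: true unknown concentration 8 uM, linear response.
added = np.array([0.0, 5.0, 10.0, 20.0])
signal = 3.0 * (added + 8.0)
print(standard_addition_conc(added, signal))  # ≈ 8.0
```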

  5. Computer compensation for NMR quantitative analysis of trace components

    SciTech Connect

    Nakayama, T.; Fujiwara, Y.

    1981-07-22

    A computer program has been written that determines trace components and separates overlapping components in multicomponent NMR spectra. This program uses the Lorentzian curve as a theoretical curve of NMR spectra. The coefficients of the Lorentzian are determined by the method of least squares. Systematic errors such as baseline/phase distortion are compensated and random errors are smoothed by taking moving averages, so that these processes contribute substantially to decreasing the accumulation time of spectral data. The accuracy of quantitative analysis of trace components has been improved by two significant figures. This program was applied to determining the abundance of 13C and the saponification degree of PVA.
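The core fitting step, least-squares determination of Lorentzian coefficients after moving-average smoothing, can be re-illustrated in modern Python with SciPy (the original program predates these libraries; peak parameters below are synthetic):

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, amplitude, center, hwhm):
    """Lorentzian line shape with half-width at half-maximum `hwhm`."""
    return amplitude * hwhm**2 / ((x - center) ** 2 + hwhm**2)

rng = np.random.default_rng(1)
x = np.linspace(-5.0, 5.0, 201)
# Synthetic spectrum: one peak plus noise; smooth with a moving average
# before the least-squares fit, as the program described above does.
y = lorentzian(x, 10.0, 0.5, 0.8) + rng.normal(0.0, 0.05, x.size)
y_smooth = np.convolve(y, np.ones(5) / 5.0, mode="same")
popt, _ = curve_fit(lorentzian, x, y_smooth, p0=[5.0, 0.0, 1.0])
print(popt)  # ≈ [10, 0.5, 0.8]
```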

  6. Quantitative proteomic and phosphoproteomic analysis of Trypanosoma cruzi amastigogenesis.

    PubMed

    Queiroz, Rayner M L; Charneau, Sébastien; Mandacaru, Samuel C; Schwämmle, Veit; Lima, Beatriz D; Roepstorff, Peter; Ricart, Carlos A O

    2014-12-01

    Chagas disease is a tropical neglected disease endemic in Latin America caused by the protozoan Trypanosoma cruzi. The parasite has four major life stages: epimastigote, metacyclic trypomastigote, bloodstream trypomastigote, and amastigote. The differentiation from infective trypomastigotes into replicative amastigotes, called amastigogenesis, takes place in vivo inside mammalian host cells after a period of incubation in an acidic phagolysosome. This differentiation process can be mimicked in vitro by incubating tissue-culture-derived trypomastigotes in acidic DMEM. Here we used this well-established differentiation protocol to perform a comprehensive quantitative proteomic and phosphoproteomic analysis of T. cruzi amastigogenesis. Samples from fully differentiated forms and two biologically relevant intermediate time points were Lys-C/trypsin digested, iTRAQ-labeled, and multiplexed. Subsequently, phosphopeptides were enriched using a TiO2 matrix. Non-phosphorylated peptides were fractionated via hydrophilic interaction liquid chromatography prior to LC-MS/MS analysis. LC-MS/MS and bioinformatics procedures were used for protein and phosphopeptide quantitation, identification, and phosphorylation site assignment. We were able to identify regulated proteins and pathways involved in coordinating amastigogenesis. We also observed that a significant proportion of the regulated proteins were membrane proteins. Modulated phosphorylation events coordinated by protein kinases and phosphatases that are part of the signaling cascade induced by incubation in acidic medium were also evinced. To our knowledge, this work is the most comprehensive quantitative proteomics study of T. cruzi amastigogenesis, and these data will serve as a trustworthy basis for future studies and possibly for the identification of new drug targets. PMID:25225356

  7. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    PubMed Central

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-01-01

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interactions, clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K, suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations. PMID:26306198

  8. Quantitative analysis of volume images: electron microscopic tomography of HIV

    NASA Astrophysics Data System (ADS)

    Nystroem, Ingela; Bengtsson, Ewert W.; Nordin, Bo G.; Borgefors, Gunilla

    1994-05-01

    Three-dimensional objects should be represented by 3D images. So far, most of the evaluation of images of 3D objects has been done visually, either by looking at slices through the volumes or by looking at 3D graphic representations of the data. In many applications a more quantitative evaluation would be valuable. Our application is the analysis of volume images of the causative agent of the acquired immune deficiency syndrome (AIDS), namely human immunodeficiency virus (HIV), produced by electron microscopic tomography (EMT). A structural analysis of the virus is of importance. The representation of some of the interesting structural features will depend on the orientation and the position of the object relative to the digitization grid. We describe a method of defining orientation and position of objects based on the moment of inertia of the objects in the volume image. In addition to a direct quantification of the 3D object, a quantitative description of the convex deficiency may provide valuable information about the geometrical properties. The convex deficiency is the volume object subtracted from its convex hull. We describe an algorithm for creating an enclosing polyhedron approximating the convex hull of an arbitrarily shaped object.
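The moment-of-inertia definition of object orientation can be sketched as follows: treat each object voxel as a unit point mass, form the inertia tensor about the centroid, and take its eigenvectors as the principal axes. This is our minimal illustration of the general idea, not the authors' implementation:

```python
import numpy as np

def principal_axes(volume):
    """Orientation of a binary volume image from its moment-of-inertia
    tensor: returns centroid, eigenvalues and principal axes."""
    coords = np.argwhere(volume > 0).astype(float)
    centroid = coords.mean(axis=0)
    d = coords - centroid
    # Inertia tensor of unit point masses at the object voxels.
    inertia = np.zeros((3, 3))
    for x, y, z in d:
        r2 = x * x + y * y + z * z
        inertia += r2 * np.eye(3) - np.outer((x, y, z), (x, y, z))
    w, v = np.linalg.eigh(inertia)  # eigenvalues ascending, axes as columns
    return centroid, w, v

# Elongated rod along the z-axis: smallest moment of inertia about z.
vol = np.zeros((5, 5, 21), dtype=int)
vol[2, 2, :] = 1
c, w, v = principal_axes(vol)
print(c)  # → [2. 2. 10.]
```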

  9. Multivariate calibration applied to the quantitative analysis of infrared spectra

    SciTech Connect

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.
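Of the factor-analysis methods compared above, principal component regression is the simplest to sketch: project mean-centered spectra onto their leading principal components, then regress concentration on the resulting scores. A toy illustration with synthetic two-component spectra (not the article's calibration code):

```python
import numpy as np

def pcr_fit(X, y, n_components):
    """Principal component regression: regress y on the scores of the
    top principal components of the mean-centered spectra X."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc = X - x_mean
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    loadings = vt[:n_components].T          # spectral loadings
    scores = Xc @ loadings
    coef, *_ = np.linalg.lstsq(scores, y - y_mean, rcond=None)
    return x_mean, y_mean, loadings, coef

def pcr_predict(model, X):
    x_mean, y_mean, loadings, coef = model
    return y_mean + (X - x_mean) @ loadings @ coef

# Toy calibration: concentrations mix two "pure" spectra linearly.
rng = np.random.default_rng(0)
conc = rng.random(20)
pure = rng.random((2, 50))
spectra = np.outer(conc, pure[0]) + np.outer(1 - conc, pure[1])
model = pcr_fit(spectra, conc, 2)
pred = pcr_predict(model, spectra)
print(np.max(np.abs(pred - conc)))  # near machine precision on noiseless data
```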

  10. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    NASA Astrophysics Data System (ADS)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease, as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil, and EOG. Among the available recording techniques, EOG is a major modality for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods that capture the characteristics of eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 tested subjects were watching a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of pathology in these disorders.

  11. Quantitatively understanding cellular uptake of gold nanoparticles via radioactivity analysis

    PubMed Central

    Shao, Xia; Schnau, Paul; Qian, Wei; Wang, Xueding

    2015-01-01

    The development of multifunctional gold nanoparticles (AuNPs) underwent an explosion in the last two decades. However, many questions regarding their detailed surface chemistry and how it affects the behavior of AuNPs in vivo and in vitro still need to be addressed before AuNPs can be widely adopted in clinical settings. In this work, radioactivity analysis was employed for quantitative evaluation of I-125 radiolabeled AuNP uptake by cancer cells. Facilitated with this new method, we have conducted initial bioevaluation of surfactant-free AuNPs produced by femtosecond laser ablation. Cellular uptake of AuNPs as a function of the RGD density on the AuNP surface, as well as a function of time, has been quantified. The radioactivity analysis may shed light on the dynamic interactions of AuNPs with cancer cells, and help achieve optimized designs of AuNPs for future clinical applications. PMID:26405436

  12. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics

    PubMed Central

    Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999–2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlate, increasingly co-occur, and have an equilibrium relationship in the long run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  13. Quantitatively Understanding Cellular Uptake of Gold Nanoparticles via Radioactivity Analysis.

    PubMed

    Shao, Xia; Schnau, Paul; Qian, Wei; Wang, Xueding

    2015-05-01

    The development of multifunctional gold nanoparticles (AuNPs) underwent an explosion in the last two decades. However, many questions regarding their detailed surface chemistry and how it affects the behavior of AuNPs in vivo and in vitro still need to be addressed before AuNPs can be widely adopted in clinical settings. In this work, radioactivity analysis was employed for quantitative evaluation of I-125 radiolabeled AuNP uptake by cancer cells. Facilitated with this new method, we have conducted initial bioevaluation of surfactant-free AuNPs produced by femtosecond laser ablation. Cellular uptake of AuNPs as a function of the RGD density on the AuNP surface, as well as a function of time, has been quantified. The radioactivity analysis may shed light on the dynamic interactions of AuNPs with cancer cells, and help achieve optimized designs of AuNPs for future clinical applications. PMID:26505012

  14. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  15. Quantitative analysis of cyclic beta-turn models.

    PubMed Central

    Perczel, A.; Fasman, G. D.

    1992-01-01

    The beta-turn is a frequently found structural unit in the conformation of globular proteins. Although the circular dichroism (CD) spectra of the alpha-helix and beta-pleated sheet are well defined, there remains some ambiguity concerning the pure component CD spectra of the different types of beta-turns. Recently, it has been reported (Hollósi, M., Kövér, K.E., Holly, S., Radics, L., & Fasman, G.D., 1987, Biopolymers 26, 1527-1572; Perczel, A., Hollósi, M., Foxman, B.M., & Fasman, G.D., 1991a, J. Am. Chem. Soc. 113, 9772-9784) that some pseudohexapeptides (e.g., the cyclo[(delta)Ava-Gly-Pro-Aaa-Gly] where Aaa = Ser, Ser(OtBu), or Gly) in many solvents adopt a conformational mixture of type I and the type II beta-turns, although the X-ray-determined conformation was an ideal type I beta-turn. In addition to these pseudohexapeptides, conformational analysis was also carried out on three pseudotetrapeptides and three pseudooctapeptides. The target of the conformation analysis reported herein was to determine whether the ring stress of the above beta-turn models has an influence on their conformational properties. Quantitative nuclear Overhauser effect (NOE) measurements yielded interproton distances. The conformational average distances so obtained were interpreted utilizing molecular dynamics (MD) simulations to yield the conformational percentages. These conformational ratios were correlated with the conformational weights obtained by quantitative CD analysis of the same compounds. The pure component CD curves of type I and type II beta-turns were also obtained, using a recently developed algorithm (Perczel, A., Tusnády, G., Hollósi, M., & Fasman, G.D., 1991b, Protein Eng. 4(6), 669-679). For the first time the results of a CD deconvolution, based on the CD spectra of 14 beta-turn models, were assigned by quantitative NOE results. The NOE experiments confirmed the ratios of the component curves found for the two major beta-turns by CD analysis. 
These results can now be used to enhance the conformational determination of globular proteins on the basis of their CD spectra. PMID:1304345
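The deconvolution step described above treats a measured CD spectrum as a linear combination of pure component curves; once pure curves are known, component weights follow from least squares. A toy sketch of that idea only (the cited algorithm is more elaborate, constraining the weights to be non-negative and to sum to one):

```python
import numpy as np

def component_weights(pure_curves, measured):
    """Unconstrained least-squares weights of pure component CD curves in
    a measured spectrum.  pure_curves: (n_components, n_wavelengths)."""
    weights, *_ = np.linalg.lstsq(pure_curves.T, measured, rcond=None)
    return weights

# Two synthetic "pure" component curves mixed 60:40.
wavelengths = np.linspace(190.0, 250.0, 61)
pure = np.vstack([np.sin(wavelengths / 20.0), np.cos(wavelengths / 15.0)])
measured = 0.6 * pure[0] + 0.4 * pure[1]
print(component_weights(pure, measured))  # ≈ [0.6, 0.4]
```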

  16. Quantitative image analysis in sonograms of the thyroid gland

    NASA Astrophysics Data System (ADS)

    Catherine, Skouroliakou; Maria, Lyra; Aristides, Antoniou; Lambros, Vlahos

    2006-12-01

    High-resolution, real-time ultrasound is a routine examination for assessing the disorders of the thyroid gland. However, the current diagnosis practice is based mainly on qualitative evaluation of the resulting sonograms, therefore depending on the physician's experience. Computerized texture analysis is widely employed in sonographic images of various organs (liver, breast), and it has been proven to increase the sensitivity of diagnosis by providing a better tissue characterization. The present study attempts to characterize thyroid tissue by automatic texture analysis. The texture features that are calculated are based on co-occurrence matrices as they have been proposed by Haralick. The sample consists of 40 patients. For each patient two sonographic images (one for each lobe) are recorded in DICOM format. The lobe is manually delineated in each sonogram, and the co-occurrence matrices for 52 separation vectors are calculated. The texture features extracted from each one of these matrices are: contrast, correlation, energy and homogeneity. Principal component analysis is used to select the optimal set of features. The statistical analysis resulted in the extraction of 21 optimal descriptors. The optimal descriptors are all co-occurrence parameters, as the first-order statistics did not prove to be representative of the image characteristics. The relatively large number of components is driven mainly by correlation at very short and very long separation distances. The results indicate that quantitative analysis of thyroid sonograms can provide an objective characterization of thyroid tissue.
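The four features named above can be computed from a gray-level co-occurrence matrix as follows. This is a didactic NumPy version for a single separation vector, assuming the image has already been quantized to `levels` gray levels; production code would use an optimized library implementation.

```python
import numpy as np

def glcm_features(img, dx=1, dy=0, levels=4):
    """Gray-level co-occurrence matrix for one separation vector (dx, dy),
    plus the Haralick-type descriptors contrast, correlation, energy and
    homogeneity.  `img` must contain integers in [0, levels)."""
    glcm = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            glcm[img[y, x], img[y + dy, x + dx]] += 1
    glcm /= glcm.sum()  # normalize counts to joint probabilities
    i, j = np.indices((levels, levels))
    mu_i, mu_j = (i * glcm).sum(), (j * glcm).sum()
    sd_i = np.sqrt(((i - mu_i) ** 2 * glcm).sum())
    sd_j = np.sqrt(((j - mu_j) ** 2 * glcm).sum())
    return {
        "contrast": ((i - j) ** 2 * glcm).sum(),
        "correlation": ((i - mu_i) * (j - mu_j) * glcm).sum() / (sd_i * sd_j),
        "energy": (glcm ** 2).sum(),
        "homogeneity": (glcm / (1.0 + np.abs(i - j))).sum(),
    }

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [2, 2, 3, 3],
                [2, 2, 3, 3]])
feats = glcm_features(img)
print(feats["contrast"])  # → 0.333... (4 of 12 horizontal pairs differ by 1)
```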

  17. Epistasis analysis for quantitative traits by functional regression model

    PubMed Central

    Zhang, Futao; Boerwinkle, Eric; Xiong, Momiao

    2014-01-01

    The critical barrier in interaction analysis for rare variants is that most traditional statistical methods for testing interactions were originally designed for common variants and are difficult to apply to rare variants because of their prohibitive computational time and poor power. The great challenges for successful detection of interactions with next-generation sequencing (NGS) data are (1) lack of methods for interaction analysis with rare variants, (2) severe multiple testing, and (3) time-consuming computations. To meet these challenges, we shift the paradigm of interaction analysis from two loci to two sets of loci or genomic regions: we take a genomic region as the basic unit of interaction analysis and use high-dimensional data reduction and functional data analysis techniques to develop a novel functional regression model that collectively tests interactions between all possible pairs of single nucleotide polymorphisms (SNPs) within two genomic regions. By intensive simulations, we demonstrate that the functional regression models for interaction analysis of the quantitative trait have the correct type I error rates and substantially higher power to detect interactions than the current pairwise interaction analysis. The proposed method was applied to exome sequence data from the NHLBI's Exome Sequencing Project (ESP) and the CHARGE-S study. We discovered 27 pairs of genes showing significant interactions after applying the Bonferroni correction (P-values < 4.58 × 10⁻¹⁰) in the ESP, and 11 were replicated in the CHARGE-S study. PMID:24803592

  18. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    PubMed

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). The current data analysis platform typically relies on protein-level ratios, which are obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework, EBprot, that directly models the peptide-protein hierarchy and rewards proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study showing that the peptide-level analysis of EBprot provides better receiver-operating characteristics and more accurate estimation of false discovery rates than methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time-course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). PMID:25913743

  19. Functional Linear Models for Association Analysis of Quantitative Traits

    PubMed Central

    Fan, Ruzong; Wang, Yifan; Mills, James L.; Wilson, Alexander F.; Bailey-Wilson, Joan E.; Xiong, Momiao

    2014-01-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. PMID:24130119

  20. Operational Experiences in Planning and Reconstructing Aqua Inclination Maneuvers

    NASA Technical Reports Server (NTRS)

    Rand, David; Reilly, Jacqueline; Schiff, Conrad

    2004-01-01

    As the lead satellite in NASA's growing Earth Observing System (EOS) PM constellation, it is increasingly critical that Aqua maintain its various orbit requirements. The two of interest for this paper are maintaining an orbit inclination that provides for a consistent mean local time and a semi-major axis (SMA) that allows for ground track repeatability. Maneuvers to adjust the orbit inclination involve several flight dynamics constraints and complexities that make planning such maneuvers challenging. In particular, coupling between the orbital and attitude degrees of freedom leads to changes in SMA when changes in inclination are effected. A long-term mission mean local time trend analysis was performed in order to determine the size and placement of the required inclination maneuvers. Following this analysis, detailed modeling of each burn and its various segments was performed to determine its effects on the immediate orbit state. Data gathered from an inclination slew test of the spacecraft and the first inclination maneuver uncovered discrepancies in the modeling method that were investigated and resolved. The new modeling techniques were applied and validated during the second spacecraft inclination maneuver. These improvements should position Aqua to successfully complete a series of inclination maneuvers in the fall of 2004. The following paper presents the events and results related to these maneuvers.

  1. Quantitative analysis of multiple sclerosis: a feasibility study

    NASA Astrophysics Data System (ADS)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, the measurements of white matter (WM), gray matter (GM), and cerebral spinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) the MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) a new partial volume (PV) model is built into the maximum a posteriori (MAP) segmentation scheme; 3) noise artifacts are minimized by an a priori Markov random field (MRF) penalty encoding neighborhood correlation within the tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.
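
    As a minimal sketch of the mixture idea, a plain 1-D Gaussian-mixture EM over voxel intensities is shown below; the published method additionally models intensity inhomogeneity, partial volume, and the MRF spatial prior, none of which appear here. `em_gmm_1d` is a hypothetical name.

```python
import numpy as np

def em_gmm_1d(x, k, iters=100):
    """Plain 1-D Gaussian-mixture EM over voxel intensities.
    Returns mixture weights, means, and standard deviations."""
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))  # spread initial means
    sd = np.full(k, x.std())
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: class responsibilities for every voxel
        dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / sd
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the responsibilities
        n = resp.sum(axis=0)
        w = n / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / n
        sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / n)
    return w, mu, sd
```

    With k = 3 the recovered components play the role of the CSF, GM, and WM intensity clusters from which the tissue volumes are read off.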

  2. Quantitative modeling and data analysis of SELEX experiments

    NASA Astrophysics Data System (ADS)

    Djordjevic, Marko; Sengupta, Anirvan M.

    2006-03-01

    SELEX (systematic evolution of ligands by exponential enrichment) is an experimental procedure that allows the extraction, from an initially random pool of DNA, of those oligomers with high affinity for a given DNA-binding protein. We address what is a suitable experimental and computational procedure to infer parameters of transcription factor-DNA interaction from SELEX experiments. To answer this, we use a biophysical model of transcription factor-DNA interactions to quantitatively model SELEX. We show that a standard procedure is unsuitable for obtaining accurate interaction parameters. However, we theoretically show that a modified experiment in which chemical potential is fixed through different rounds of the experiment allows robust generation of an appropriate dataset. Based on our quantitative model, we propose a novel bioinformatic method of data analysis for such a modified experiment and apply it to extract the interaction parameters for a mammalian transcription factor CTF/NFI. From a practical point of view, our method results in a significantly improved false positive/false negative trade-off, as compared to both the standard information theory based method and a widely used empirically formulated procedure.
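
    The fixed-chemical-potential protocol can be illustrated with a toy enrichment loop. The Fermi-Dirac binding probability is the standard biophysical form, but the function below is only an idealized sketch (no sampling noise, no free-protein depletion); names and numbers are illustrative.

```python
import numpy as np

def selex_rounds(energies, freqs, mu, n_rounds, kT=1.0):
    """Idealized SELEX: each round, sequence i is retained with
    Fermi-Dirac binding probability p_i = 1 / (1 + exp((E_i - mu)/kT)),
    with the chemical potential mu held fixed across rounds as the
    modified protocol prescribes; frequencies are then renormalized."""
    f = np.asarray(freqs, dtype=float)
    p = 1.0 / (1.0 + np.exp((np.asarray(energies) - mu) / kT))
    for _ in range(n_rounds):
        f = f * p          # selection by binding
        f = f / f.sum()    # renormalize the pool
    return f
```

    Starting from a uniform pool, the lowest-energy (highest-affinity) oligomers come to dominate after a few rounds, which is the enrichment the inference method exploits.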

  3. Quantitative modeling and data analysis of SELEX experiments

    NASA Astrophysics Data System (ADS)

    Djordjevic, Marko; Sengupta, Anirvan M.

    2006-03-01

    SELEX (Systematic Evolution of Ligands by Exponential Enrichment) is an experimental procedure that allows extracting, from an initially random pool of DNA, those oligomers with high affinity for a given DNA-binding protein. We address what is a suitable experimental and computational procedure to infer parameters of transcription factor-DNA interaction from SELEX experiments. To answer this, we use a biophysical model of transcription factor-DNA interactions to quantitatively model SELEX. We show that a standard procedure is unsuitable for obtaining accurate interaction parameters. However, we theoretically show that a modified experiment in which chemical potential is fixed through different rounds of the experiment allows robust generation of an appropriate data set. Based on our quantitative model, we propose a novel bioinformatic method of data analysis for such a modified experiment and apply it to extract the interaction parameters for a mammalian transcription factor CTF/NFI. From a practical point of view, our method results in a significantly improved false positive/false negative trade-off, as compared to both the standard information theory based method and a widely used empirically formulated procedure. This work will appear in Physical Biology. This work was supported by NIH grant GM67794. Final parts of this work were supported by NSF under Agreement No. 0112050 and NSF grant MCB-0418891.

  4. Quantitative analysis of gene function in the Drosophila embryo.

    PubMed Central

    Tracey, W D; Ning, X; Klingler, M; Kramer, S G; Gergen, J P

    2000-01-01

    The specific functions of gene products frequently depend on the developmental context in which they are expressed. Thus, studies on gene function will benefit from systems that allow for manipulation of gene expression within model systems where the developmental context is well defined. Here we describe a system that allows for genetically controlled overexpression of any gene of interest under normal physiological conditions in the early Drosophila embryo. This regulated expression is achieved through the use of Drosophila lines that express a maternal mRNA for the yeast transcription factor GAL4. Embryos derived from females that express GAL4 maternally activate GAL4-dependent UAS transgenes at uniform levels throughout the embryo during the blastoderm stage of embryogenesis. The expression levels can be quantitatively manipulated through the use of lines that have different levels of maternal GAL4 activity. Specific phenotypes are produced by expression of a number of different developmental regulators with this system, including genes that normally do not function during Drosophila embryogenesis. Analysis of the response to overexpression of runt provides evidence that this pair-rule segmentation gene has a direct role in repressing transcription of the segment-polarity gene engrailed. The maternal GAL4 system will have applications both for the measurement of gene activity in reverse genetic experiments as well as for the identification of genetic factors that have quantitative effects on gene function in vivo. PMID:10628987

  5. Bayesian robust analysis for genetic architecture of quantitative traits

    PubMed Central

    Yang, Runqing; Wang, Xin; Li, Jian; Deng, Hongwen

    2009-01-01

    Motivation: In most quantitative trait locus (QTL) mapping studies, phenotypes are assumed to follow normal distributions. Deviations from this assumption may affect the accuracy of QTL detection and lead to detection of spurious QTLs. To improve the robustness of QTL mapping methods, we replaced the normal distribution for residuals in multiple interacting QTL models with the normal/independent distributions, a class of symmetric and long-tailed distributions able to accommodate residual outliers. Subsequently, we developed a Bayesian robust analysis strategy for dissecting the genetic architecture of quantitative traits and for mapping genome-wide interacting QTLs in line crosses. Results: Through computer simulations, we showed that our strategy had similar power for QTL detection compared with traditional methods assuming normally distributed traits, but substantially increased power for non-normal phenotypes. When this strategy was applied to a group of traits associated with physical/chemical characteristics and quality in rice, more main and epistatic QTLs were detected than in traditional Bayesian model analyses under the normal assumption. Contact: runqingyang@sjtu.edu.cn; dengh@umkc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:18974168
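
    The normal/independent construction can be sketched directly: dividing a normal draw by the square root of an independent Gamma-distributed weight yields Student-t residuals, whose heavy tails absorb outliers. This is a generic illustration of the distribution family, not the paper's sampler; the function name is hypothetical.

```python
import numpy as np

def t_residuals_via_scale_mixture(n, df, sigma=1.0, seed=0):
    """Draw heavy-tailed residuals as a normal/independent scale mixture:
    e = z / sqrt(w), with z ~ N(0, sigma^2) and w ~ Gamma(df/2, rate df/2)
    (i.e. w ~ chi2_df / df), gives Student-t residuals with df degrees
    of freedom."""
    rng = np.random.default_rng(seed)
    z = rng.normal(0.0, sigma, n)
    w = rng.gamma(df / 2.0, 2.0 / df, n)  # numpy takes (shape, scale)
    return z / np.sqrt(w)
```

    In the Bayesian sampler the per-observation weights w play the role of automatic down-weights for outlying residuals.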

  6. EDXRF quantitative analysis of chromophore chemical elements in corundum samples.

    PubMed

    Bonizzoni, L; Galli, A; Spinolo, G; Palanza, V

    2009-12-01

    Corundum is a crystalline form of aluminum oxide (Al(2)O(3)) and is one of the rock-forming minerals. When aluminum oxide is pure, the mineral is colorless, but the presence of trace amounts of other elements such as iron, titanium, and chromium in the crystal lattice gives the typical colors (including blue, red, violet, pink, green, yellow, orange, gray, white, colorless, and black) of gemstone varieties. The starting point for our work is the quantitative evaluation of the concentration of chromophore chemical elements, with the best precision possible, to match data obtained by other techniques such as optical absorption and photoluminescence. The aim is to give an interpretation of the absorption bands present in the NIR and visible ranges which do not involve intervalence charge transfer transitions (Fe(2+) --> Fe(3+) and Fe(2+) --> Ti(4+)), commonly considered responsible for the important features of the blue sapphire absorption spectra. We therefore developed a method to evaluate as accurately as possible the autoabsorption and secondary excitation effects that are frequent sources of significant errors in quantitative EDXRF analysis. PMID:19821113

  7. Advance in orientation microscopy: quantitative analysis of nanocrystalline structures.

    PubMed

    Seyring, Martin; Song, Xiaoyan; Rettenmayr, Markus

    2011-04-26

    The special properties of nanocrystalline materials are generally accepted to be a consequence of the high density of planar defects (grain and twin boundaries) and their characteristics. However, until now, nanograin structures have not been characterized with similar detail and statistical relevance as coarse-grained materials, due to the lack of an appropriate method. In the present paper, a novel method based on quantitative nanobeam diffraction in transmission electron microscopy (TEM) is presented to determine the misorientation of adjacent nanograins and subgrains. Spatial resolution of <5 nm can be achieved. This method is applicable to characterize orientation relationships in wire, film, and bulk materials with nanocrystalline structures. As a model material, nanocrystalline Cu is used. Several important features of the nanograin structure are discovered utilizing quantitative analysis: the fraction of twin boundaries is substantially higher than that observed in bright-field images in the TEM; small angle grain boundaries are prominent; there is an obvious dependence of the grain boundary characteristics on grain size distribution and mean grain size. PMID:21375327
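
    Once each grain's orientation has been indexed from its nanobeam diffraction pattern, the misorientation between adjacent grains reduces to the angle of the relative rotation. A minimal sketch (ignoring the crystal symmetry operators that a real analysis of cubic Cu must apply):

```python
import numpy as np

def misorientation_deg(Ra, Rb):
    """Misorientation angle (degrees) between two grain orientations
    given as rotation matrices: the rotation angle of Ra @ Rb.T.
    Crystal symmetry reduction is deliberately omitted."""
    R = Ra @ Rb.T
    c = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    return float(np.degrees(np.arccos(c)))
```

    Small-angle boundaries (small returned angles) and twin boundaries (characteristic fixed angles about specific axes) would then be classified from this value.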

  8. Quantitative analysis of the reconstruction performance of interpolants

    NASA Technical Reports Server (NTRS)

    Lansing, Donald L.; Park, Stephen K.

    1987-01-01

    The analysis presented provides a quantitative measure of the reconstruction or interpolation performance of linear, shift-invariant interpolants. The performance criterion is the mean square error of the difference between the sampled and reconstructed functions. The analysis is applicable to reconstruction algorithms used in image processing and to many types of splines used in numerical analysis and computer graphics. When formulated in the frequency domain, the mean square error clearly separates the contribution of the interpolation method from the contribution of the sampled data. The equations provide a rational basis for selecting an optimal interpolant; that is, one which minimizes the mean square error. The analysis has been applied to a selection of frequently used data splines and reconstruction algorithms: parametric cubic and quintic Hermite splines, exponential and nu splines (including the special case of the cubic spline), parametric cubic convolution, Keys' fourth-order cubic, and a cubic with a discontinuous first derivative. The emphasis in this paper is on the image-dependent case in which no a priori knowledge of the frequency spectrum of the sampled function is assumed.
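
    The performance criterion can also be estimated numerically: the paper formulates the mean square error in the frequency domain, while the sketch below simply evaluates it on a dense grid for two simple reconstruction kernels. This is an illustrative stand-in, not the paper's analysis; all names are hypothetical.

```python
import numpy as np

def reconstruction_mse(f, interp, dx, n_dense=10000):
    """Mean square error between f and its reconstruction from samples
    spaced dx apart, estimated on a dense grid over [0, 1]."""
    xs = np.arange(0.0, 1.0 + dx / 2, dx)  # sample points
    ys = f(xs)
    xd = np.linspace(0.0, 1.0, n_dense)
    return float(np.mean((f(xd) - interp(xd, xs, ys)) ** 2))

def nearest(xq, xs, ys):
    """Zero-order (nearest-neighbour) reconstruction."""
    idx = np.clip(np.round(xq / (xs[1] - xs[0])).astype(int), 0, len(xs) - 1)
    return ys[idx]

def linear(xq, xs, ys):
    """First-order (piecewise-linear) reconstruction."""
    return np.interp(xq, xs, ys)

f = lambda x: np.sin(2 * np.pi * x)
mse_n = reconstruction_mse(f, nearest, 0.05)
mse_l = reconstruction_mse(f, linear, 0.05)
```

    For this smooth signal the linear interpolant gives a much smaller mean square error than zero-order reconstruction, which is the kind of ranking the frequency-domain formulation delivers analytically for each interpolant.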

  9. Quantitative analysis and classification of AFM images of human hair.

    PubMed

    Gurden, S P; Monteiro, V F; Longo, E; Ferreira, M M C

    2004-07-01

    The surface topography of human hair, as defined by the outer layer of cellular sheets, termed cuticles, largely determines the cosmetic properties of the hair. The condition of the cuticles is of great cosmetic importance, but also has the potential to aid diagnosis in the medical and forensic sciences. Atomic force microscopy (AFM) has been demonstrated to offer unique advantages for analysis of the hair surface, mainly due to the high image resolution and the ease of sample preparation. This article presents an algorithm for the automatic analysis of AFM images of human hair. The cuticular structure is characterized using a series of descriptors, such as step height, tilt angle and cuticle density, allowing quantitative analysis and comparison of different images. The usefulness of this approach is demonstrated by a classification study. Thirty-eight AFM images were measured, consisting of hair samples from (a) untreated and bleached hair samples, and (b) the root and distal ends of the hair fibre. The multivariate classification technique partial least squares discriminant analysis is used to test the ability of the algorithm to characterize the images according to the properties of the hair samples. Most of the images (86%) were found to be classified correctly. PMID:15230871

  10. Large-Scale Quantitative Analysis of Painting Arts

    PubMed Central

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-01-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paints to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images – the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances. PMID:25501877
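
    Two of the three measures lend themselves to compact sketches: "variety of colors" as the entropy of a quantized color histogram, and a crude roughness exponent as the slope of a log-log brightness-increment plot. These are generic stand-ins for the paper's exact estimators; all names are hypothetical.

```python
import numpy as np

def color_entropy(img_rgb, bins=8):
    """Shannon entropy of the quantized RGB histogram: one simple
    proxy for the 'variety of colors' in an image."""
    q = (img_rgb // (256 // bins)).reshape(-1, 3)
    codes = q[:, 0] * bins * bins + q[:, 1] * bins + q[:, 2]
    p = np.bincount(codes, minlength=bins ** 3) / len(codes)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def roughness_exponent(brightness, max_lag=64):
    """Slope of log mean |I(x+l) - I(x)| versus log lag: a crude
    Hurst-style roughness estimate for a 1-D brightness profile."""
    lags = np.arange(1, max_lag + 1)
    d = np.array([np.mean(np.abs(brightness[l:] - brightness[:-l]))
                  for l in lags])
    slope, _ = np.polyfit(np.log(lags), np.log(d), 1)
    return float(slope)
```

    A flat single-color canvas scores zero entropy, and a Brownian-motion-like brightness profile scores a roughness exponent near 0.5; paintings fall somewhere between these extremes.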

  11. Quantitative analysis of forest island pattern in selected Ohio landscapes

    SciTech Connect

    Bowen, G.W.; Burgess, R.L.

    1981-07-01

    The purpose of this study was to quantitatively describe the various aspects of regional distribution patterns of forest islands and relate those patterns to other landscape features. Several maps showing the forest cover of various counties in Ohio were selected as representative examples of forest patterns to be quantified. Ten thousand hectare study areas (landscapes) were delineated on each map. A total of 15 landscapes representing a wide variety of forest island patterns was chosen. Data were converted into a series of continuous variables which contained information pertinent to the sizes, shape, numbers, and spacing of woodlots within a landscape. The continuous variables were used in a factor analysis to describe the variation among landscapes in terms of forest island pattern. The results showed that forest island patterns are related to topography and other environmental features correlated with topography.

  12. Quantitative multielement analysis using high energy particle bombardment

    NASA Technical Reports Server (NTRS)

    Clark, P. J.; Neal, G. F.; Allen, R. O.

    1974-01-01

    Charged particles ranging in energy from 0.8 to 4.0 MeV are used to induce resonant nuclear reactions, Coulomb excitation (gamma rays), and X-ray emission in both thick and thin targets. Quantitative analysis is possible for elements from Li to Pb in complex environmental samples, although the matrix can severely reduce the sensitivity. It is necessary to use a comparator technique for the gamma rays, while for X-rays an internal standard can be used. A USGS standard rock is analyzed for a total of 28 elements. Water samples can be analyzed either by nebulizing the sample doped with Cs or Y onto a thin Formvar film or by extracting the sample (with or without an internal standard) onto ion exchange resin which is pressed into a pellet.

  13. Quantitative genetic analysis of injury liability in infants and toddlers

    SciTech Connect

    Phillips, K.; Matheny, A.P. Jr.

    1995-02-27

    A threshold model of latent liability was applied to infant and toddler twin data on total count of injuries sustained during the interval from birth to 36 months of age. A quantitative genetic analysis of estimated twin correlations in injury liability indicated strong genetic dominance effects, but no additive genetic variance was detected. Because interpretations involving overdominance have little research support, the results may be due to low order epistasis or other interaction effects. Boys had more injuries than girls, but this effect was found only for groups whose parents were prompted and questioned in detail about their children's injuries. Activity and impulsivity are two behavioral predictors of childhood injury, and the results are discussed in relation to animal research on infant and adult activity levels, and impulsivity in adult humans. Genetic epidemiological approaches to childhood injury should aid in targeting higher risk children for preventive intervention. 30 refs., 4 figs., 3 tabs.

  14. Quantitative image analysis of WE43-T6 cracking behavior

    NASA Astrophysics Data System (ADS)

    Ahmad, A.; Yahya, Z.

    2013-06-01

    Environment-assisted cracking of WE43 cast magnesium (4.2 wt.% Yt, 2.3 wt.% Nd, 0.7% Zr, 0.8% HRE) in the T6 peak-aged condition was induced in ambient air in notched specimens. The mechanism of fracture was studied using electron backscatter diffraction, serial sectioning and in situ observations of crack propagation. The intermetallic material (rare-earth-enriched divorced intermetallic retained at grain boundaries and predominantly at triple points) was found to play a significant role in initiating the cracks that lead to failure of this material. Quantitative measurements were required for this project. The populations of intermetallic particles and clusters of intermetallic particles were analyzed using image analysis of metallographic images. This is part of the work to generate a theoretical model of the effect of notch geometry on the static fatigue strength of this material.

  15. Large-scale quantitative analysis of painting arts.

    PubMed

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-01-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paints to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances. PMID:25501877

  16. Large-Scale Quantitative Analysis of Painting Arts

    NASA Astrophysics Data System (ADS)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paints to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances.

  17. Quantitative analysis of creatinine in urine by metalized nanostructured parylene.

    PubMed

    Wang, Hui; Malvadkar, Niranjan; Koytek, S; Bylander, J; Reeves, W Brian; Demirel, Melik C

    2010-01-01

    A highly accurate, real-time multisensor agent monitor for biomarker detection is required for early detection of kidney diseases. Urine creatinine level can provide useful information on the status of the kidney. We prepare nanostructured surface-enhanced Raman spectroscopy (SERS) substrates without template or lithography, which provides controllable, well-organized nanostructures on the surface, for the quantitative analysis of creatinine concentration in urine. We present our work on sensitivity of the SERS substrate to urine samples collected from diabetic patients and healthy persons. We report the preparation of a new type of SERS substrate, which provides fast (<10 s), highly sensitive (creatinine concentration <0.5 microg/mL) and reproducible (<5% variation) detection of urine. Our method to analyze the creatinine level in urine is in good agreement with the enzymatic method. PMID:20459278

  18. Quantitative analysis of creatinine in urine by metalized nanostructured parylene

    NASA Astrophysics Data System (ADS)

    Wang, Hui; Malvadkar, Niranjan; Koytek, S.; Bylander, J.; Reeves, W. Brian; Demirel, Melik C.

    2010-03-01

    A highly accurate, real-time multisensor agent monitor for biomarker detection is required for early detection of kidney diseases. Urine creatinine level can provide useful information on the status of the kidney. We prepare nanostructured surface-enhanced Raman spectroscopy (SERS) substrates without template or lithography, which provides controllable, well-organized nanostructures on the surface, for the quantitative analysis of creatinine concentration in urine. We present our work on sensitivity of the SERS substrate to urine samples collected from diabetic patients and healthy persons. We report the preparation of a new type of SERS substrate, which provides fast (<10 s), highly sensitive (creatinine concentration <0.5 µg/mL) and reproducible (<5% variation) detection of urine. Our method to analyze the creatinine level in urine is in good agreement with the enzymatic method.

  19. A Novel Quantitative Approach to Concept Analysis: The Internomological Network

    PubMed Central

    Cook, Paul F.; Larsen, Kai R.; Sakraida, Teresa J.; Pedro, Leli

    2012-01-01

    Background When a construct such as patients’ transition to self-management of chronic illness is studied by researchers across multiple disciplines, the meaning of key terms can become confused. This results from inherent problems in language where a term can have multiple meanings (polysemy) and different words can mean the same thing (synonymy). Objectives To test a novel quantitative method for clarifying the meaning of constructs by examining the similarity of published contexts in which they are used. Method Published terms related to the concept transition to self-management of chronic illness were analyzed using the internomological network (INN), a type of latent semantic analysis that calculates the mathematical relationships between constructs based on the contexts in which researchers use each term. This novel approach was tested by comparing results to those from concept analysis, a best-practice qualitative approach to clarifying meanings of terms. By comparing results of the two methods, the best synonyms of transition to self-management, as well as key antecedent, attribute, and consequence terms, were identified. Results Results from INN analysis were consistent with those from concept analysis. The potential synonyms self-management, transition, and adaptation had the greatest utility. Adaptation was the clearest overall synonym, but had lower cross-disciplinary use. The terms coping and readiness had more circumscribed meanings. The INN analysis confirmed key features of transition to self-management, and suggested related concepts not found by the previous review. Discussion The INN analysis is a promising novel methodology that allows researchers to quantify the semantic relationships between constructs. The method works across disciplinary boundaries, and may help to integrate the diverse literature on self-management of chronic illness. PMID:22592387
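
    The INN's core operation, latent semantic analysis, can be sketched generically: build a term-by-context matrix, truncate its SVD, and compare terms by cosine similarity. The toy matrix and names below are illustrative only, not the INN's actual corpus or implementation.

```python
import numpy as np

def lsa_term_vectors(X, k):
    """Truncated-SVD embedding of a term-by-context count matrix X;
    each row becomes a k-dimensional term vector."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :k] * s[:k]

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy matrix: rows = terms ('transition', 'adaptation', 'enzyme'),
# columns = publication contexts; the first two terms share contexts.
X = np.array([[1.0, 1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0]])
T = lsa_term_vectors(X, 2)
```

    Terms used in the same contexts end up with cosine similarity near 1 and unrelated terms near 0, mirroring how the INN ranks candidate synonyms.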

  20. Quantitative Medical Image Analysis for Clinical Development of Therapeutics

    NASA Astrophysics Data System (ADS)

    Analoui, Mostafa

There has been significant progress in the development of therapeutics for the prevention and management of several disease areas in recent years, leading to increased average life expectancy, as well as quality of life, globally. However, due to the complexity of addressing a number of medical needs and the financial burden of developing new classes of therapeutics, there is a need for better tools for decision making and for validating the efficacy and safety of new compounds. Numerous biological markers (biomarkers) have been proposed either as adjuncts to current clinical endpoints or as surrogates. Imaging biomarkers are among the most rapidly growing classes of biomarkers being examined to expedite effective and rational drug development. Clinical imaging often involves complex, multi-modality data sets that require rapid and objective analysis, independent of a reviewer's bias and training. In this chapter, an overview of imaging biomarkers for drug development is offered, along with the challenges that necessitate quantitative and objective image analysis. Examples of automated and semi-automated analysis approaches are provided, along with a technical review of such methods. These examples include the use of 3D MRI for osteoarthritis, ultrasound vascular imaging, and dynamic contrast-enhanced MRI for oncology. Additionally, a brief overview of regulatory requirements is discussed. In conclusion, this chapter highlights key challenges and future directions in this area.

  1. Quantitative polymerase chain reaction analysis by deconvolution of internal standard

    PubMed Central

    2010-01-01

    Background Quantitative Polymerase Chain Reaction (qPCR) is a collection of methods for estimating the number of copies of a specific DNA template in a sample, but one that is not universally accepted because it can lead to highly inaccurate (albeit precise) results. The fundamental problem is that qPCR methods use mathematical models that explicitly or implicitly apply an estimate of amplification efficiency, the error of which is compounded in the analysis to unacceptable levels. Results We present a new method of qPCR analysis that is efficiency-independent and yields accurate and precise results in controlled experiments. The method depends on a computer-assisted deconvolution that finds the point of concordant amplification behavior between the "unknown" template and an admixed amplicon standard. We apply the method to demonstrate dexamethasone-induced changes in gene expression in lymphoblastic leukemia cell lines. Conclusions This method of qPCR analysis does not use any explicit or implicit measure of efficiency, and may therefore be immune to problems inherent in other qPCR approaches. It yields an estimate of absolute initial copy number of template, and controlled tests show it generates accurate results. PMID:20429911
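The error-compounding problem the abstract describes follows directly from the standard exponential qPCR model, N_c = N0 * (1 + E)^c: back-calculating N0 raises any error in the efficiency estimate E to the power of the cycle count. A minimal numeric illustration (the threshold copies, Ct, and efficiencies below are assumed, not from the paper):

```python
# Standard qPCR model: copies after c cycles, N_c = N0 * (1 + E)**c,
# where E is the amplification efficiency (0 < E <= 1).

def n0_estimate(n_threshold, ct, eff):
    """Initial copy number implied by threshold copies, Ct, and efficiency."""
    return n_threshold / (1 + eff) ** ct

n_thr, ct = 1e10, 25.0                     # assumed threshold copies and Ct
true_n0 = n0_estimate(n_thr, ct, 0.95)     # true efficiency
biased_n0 = n0_estimate(n_thr, ct, 0.90)   # efficiency misjudged by only 0.05

fold_error = biased_n0 / true_n0
print(f"fold error in N0 from a 0.05 efficiency error: {fold_error:.2f}x")
```

Even a 0.05 misestimate of efficiency produces a nearly two-fold error in the inferred initial copy number after 25 cycles, which is why an efficiency-independent method is attractive.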

  2. Quantitative analysis and parametric display of regional myocardial mechanics

    NASA Astrophysics Data System (ADS)

    Eusemann, Christian D.; Bellemann, Matthias E.; Robb, Richard A.

    2000-04-01

Quantitative assessment of regional heart motion has significant potential for more accurate diagnosis of heart disease and/or cardiac irregularities. Local heart motion may be studied from medical imaging sequences. Using functional parametric mapping, regional myocardial motion during a cardiac cycle can be color mapped onto a deformable heart model to obtain a better understanding of the structure-to-function relationships in the myocardium, including regional patterns of akinesis or dyskinesis associated with ischemia or infarction. In this study, 3D reconstructions were obtained from the Dynamic Spatial Reconstructor at 15 time points throughout one cardiac cycle of pre-infarct and post-infarct hearts. Deformable models were created from the 3D images for each time point of the cardiac cycles. From these polygonal models, regional excursions and velocities of each vertex representing a unit of myocardium were calculated for successive time intervals. The calculated results were visualized through model animations and/or specially formatted static images. The time point of regional maximum velocity and excursion of myocardium through the cardiac cycle was displayed using color mapping, and the absolute values of regional maximum velocity and maximum excursion were displayed in a similar manner. Using animations, local myocardial velocity changes were visualized as color changes on the cardiac surface during the cardiac cycle. Moreover, the magnitude and direction of motion for individual segments of myocardium could be displayed. Comparison of these dynamic parametric displays suggests that the ability to encode quantitative functional information on dynamic cardiac anatomy enhances the diagnostic value of 4D images of the heart. Myocardial mechanics quantified in this way adds a new dimension to the analysis of cardiac functional disease, including regional patterns of akinesis and dyskinesis associated with ischemia and infarction. Similarly, disturbances in regional contractility and filling may be detected and evaluated using such measurements and displays.
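The per-vertex motion measures described above (excursion and velocity over successive time intervals) can be sketched as follows. The frame count, time step, and vertex coordinates are illustrative stand-ins, not the 15-time-point DSR reconstructions used in the study:

```python
import numpy as np

# Stand-in vertex trajectories: n_frames surface-model snapshots of
# n_vertices 3D points, sampled dt seconds apart (all values assumed).
rng = np.random.default_rng(0)
n_frames, n_vertices, dt = 5, 100, 0.05
positions = rng.normal(size=(n_frames, n_vertices, 3))

steps = np.diff(positions, axis=0)        # displacement per time interval
step_len = np.linalg.norm(steps, axis=2)  # shape (n_frames-1, n_vertices)

excursion = step_len.sum(axis=0)          # total path length per vertex
velocity = step_len / dt                  # speed per vertex per interval
peak_velocity = velocity.max(axis=0)      # regional maximum velocity
peak_time = velocity.argmax(axis=0)       # interval at which the peak occurs

print(excursion.shape, peak_velocity.shape)
```

Color mapping `peak_velocity` and `peak_time` onto the surface model would give the kind of parametric displays the abstract describes.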

  3. Quantitative DNA Methylation Analysis of Candidate Genes in Cervical Cancer

    PubMed Central

    Siegel, Erin M.; Riggs, Bridget M.; Delmas, Amber L.; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D.

    2015-01-01

Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites) per sequence. A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under the ROC curve (AUC) of methylation in individual genes or a panel were examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97–1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer, and the DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated. PMID:25826459
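The Methylation Index, the >15% binary cut-point, and the AUC comparison described above can be sketched in a few lines. The per-CpG-site percentages below are invented for illustration; only the definitions (mean across sites, 15% threshold, rank-based AUC) follow the abstract.

```python
import numpy as np

def methylation_index(site_pcts):
    """Mean percent methylation across the CpG sites of one gene."""
    return float(np.mean(site_pcts))

def auc(scores_cases, scores_controls):
    """Area under the ROC curve via the Mann-Whitney U statistic."""
    wins = sum((c > n) + 0.5 * (c == n)
               for c in scores_cases for n in scores_controls)
    return wins / (len(scores_cases) * len(scores_controls))

# Invented per-site percent-methylation readings (~3 sites per gene here;
# the study averaged ~4-9 sites).
cases = [methylation_index(s) for s in ([40, 55, 60], [22, 18, 30], [70, 65, 80])]
controls = [methylation_index(s) for s in ([2, 5, 1], [12, 9, 14], [4, 3, 6])]

hypermethylated = [m > 15.0 for m in cases + controls]   # binary cut-point
print(f"AUC: {auc(cases, controls):.2f}")
```

With fully separated toy data the AUC is 1.0; in the study, combining four genes' methylation levels with HPV and age raised the AUC from 0.79 to 0.99.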

  4. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    PubMed

    Siegel, Erin M; Riggs, Bridget M; Delmas, Amber L; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D

    2015-01-01

Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites) per sequence. A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under the ROC curve (AUC) of methylation in individual genes or a panel were examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer, and the DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated. PMID:25826459

  5. Quantitative Analysis Of Acoustic Emission From Rock Fracture Experiments

    NASA Astrophysics Data System (ADS)

    Goodfellow, Sebastian David

This thesis aims to advance the methods of quantitative acoustic emission (AE) analysis by calibrating sensors, characterizing sources, and applying the results to solve engineering problems. In the first part of this thesis, we built a calibration apparatus and successfully calibrated two commercial AE sensors. The ErgoTech sensor was found to have broadband velocity sensitivity, and the Panametrics V103 was sensitive to surface normal displacement. These calibration results were applied to two AE data sets from rock fracture experiments in order to characterize the sources of AE events. The first data set was from an in situ rock fracture experiment conducted at the Underground Research Laboratory (URL). The Mine-By experiment was a large-scale excavation response test where both AE (10 kHz - 1 MHz) and microseismicity (MS) (1 Hz - 10 kHz) were monitored. Using the calibration information, magnitude, stress drop, dimension and energy were successfully estimated for 21 AE events recorded in the tensile region of the tunnel wall. Magnitudes were in the range -7.5 < Mw < -6.8, which is consistent with other laboratory AE results, and stress drops were within the range commonly observed for induced seismicity in the field (0.1 - 10 MPa). The second data set was AE collected during a true-triaxial deformation experiment, where the objectives were to characterize laboratory AE sources and identify issues related to moving the analysis from ideal in situ conditions to more complex laboratory conditions in terms of the ability to conduct quantitative AE analysis. We found AE magnitudes in the range -7.8 < Mw < -6.7 and, as with the in situ data, stress release was within the expected range of 0.1 - 10 MPa. We identified four major challenges to quantitative analysis in the laboratory, which inhibited our ability to study parameter scaling (M0 ∝ fc^-3 scaling): (1) limited knowledge of attenuation, which we proved was continuously evolving; (2) the use of a narrow frequency band for acquisition; (3) the inability to identify P and S waves given the small sample size; and (4) acquisition using a narrow amplitude range given a low signal-to-noise ratio. Moving forward to the final stage of this thesis, with the ability to characterize the sources of AE, we applied our method to study an engineering problem. We chose hydraulic fracturing because of its obvious importance in the future of Canadian energy production. During a hydraulic fracture treatment, whether in a lab or in the field, energy is added to the system via hydraulic pressure. The injection energy, which is on the order of 10 J in the lab and 100 GJ in the field, is used in the creation of new fracture surface area, the radiation of elastic waves, and aseismic deformation. In the field, it has been consistently shown that the amount of induced seismic energy radiated is between 1e-7 % and 1e-3 % of the injection energy. We tested these findings by calculating the AE energy as a percentage of the injection energy and found that for eight laboratory hydraulic fracture experiments, the seismic energy ranged from 7.02e-08 % to 1.24e-04 % of the injection energy. These results support the field observations, leading to the conclusion that radiated seismic energy is a very small component of the hydraulic fracture energy budget and that the dominant energy budget term is aseismic deformation.
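The energy-budget check described above is a simple ratio: radiated AE energy expressed as a percentage of the injection energy. The injection energy below follows the ~10 J lab figure quoted in the abstract; the summed AE energy is an assumed stand-in for one experiment.

```python
# Seismic efficiency: radiated AE energy as a percentage of injection energy.
injection_energy_j = 10.0    # ~10 J in the lab, per the abstract
ae_energy_j = 1.0e-8         # summed radiated AE energy (assumed value)

seismic_pct = 100.0 * ae_energy_j / injection_energy_j
print(f"seismic efficiency: {seismic_pct:.2e} %")

# Compare against the laboratory range reported in the abstract.
in_reported_range = 7.02e-8 <= seismic_pct <= 1.24e-4
```

The assumed value lands inside the reported 7.02e-08 % to 1.24e-04 % range, illustrating how small the radiated-energy term is relative to the total budget.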

  6. Quantitative Phosphoproteomics Analysis of ERBB3/ERBB4 Signaling.

    PubMed

    Wandinger, Sebastian K; Lahortiga, Idoya; Jacobs, Kris; Klammer, Martin; Jordan, Nicole; Elschenbroich, Sarah; Parade, Marc; Jacoby, Edgar; Linders, Joannes T M; Brehmer, Dirk; Cools, Jan; Daub, Henrik

    2016-01-01

    The four members of the epidermal growth factor receptor (EGFR/ERBB) family form homo- and heterodimers which mediate ligand-specific regulation of many key cellular processes in normal and cancer tissues. While signaling through the EGFR has been extensively studied on the molecular level, signal transduction through ERBB3/ERBB4 heterodimers is less well understood. Here, we generated isogenic mouse Ba/F3 cells that express full-length and functional membrane-integrated ERBB3 and ERBB4 or ERBB4 alone, to serve as a defined cellular model for biological and phosphoproteomics analysis of ERBB3/ERBB4 signaling. ERBB3 co-expression significantly enhanced Ba/F3 cell proliferation upon neuregulin-1 (NRG1) treatment. For comprehensive signaling studies we performed quantitative mass spectrometry (MS) experiments to compare the basal ERBB3/ERBB4 cell phosphoproteome to NRG1 treatment of ERBB3/ERBB4 and ERBB4 cells. We employed a workflow comprising differential isotope labeling with mTRAQ reagents followed by chromatographic peptide separation and final phosphopeptide enrichment prior to MS analysis. Overall, we identified 9686 phosphorylation sites which could be confidently localized to specific residues. Statistical analysis of three replicate experiments revealed 492 phosphorylation sites which were significantly changed in NRG1-treated ERBB3/ERBB4 cells. Bioinformatics data analysis recapitulated regulation of mitogen-activated protein kinase and Akt pathways, but also indicated signaling links to cytoskeletal functions and nuclear biology. Comparative assessment of NRG1-stimulated ERBB4 Ba/F3 cells revealed that ERBB3 did not trigger defined signaling pathways but more broadly enhanced phosphoproteome regulation in cells expressing both receptors. In conclusion, our data provide the first global picture of ERBB3/ERBB4 signaling and provide numerous potential starting points for further mechanistic studies. PMID:26745281

  7. Quantitative Phosphoproteomics Analysis of ERBB3/ERBB4 Signaling

    PubMed Central

    Jacobs, Kris; Klammer, Martin; Jordan, Nicole; Elschenbroich, Sarah; Parade, Marc; Jacoby, Edgar; Linders, Joannes T. M.; Brehmer, Dirk; Cools, Jan; Daub, Henrik

    2016-01-01

    The four members of the epidermal growth factor receptor (EGFR/ERBB) family form homo- and heterodimers which mediate ligand-specific regulation of many key cellular processes in normal and cancer tissues. While signaling through the EGFR has been extensively studied on the molecular level, signal transduction through ERBB3/ERBB4 heterodimers is less well understood. Here, we generated isogenic mouse Ba/F3 cells that express full-length and functional membrane-integrated ERBB3 and ERBB4 or ERBB4 alone, to serve as a defined cellular model for biological and phosphoproteomics analysis of ERBB3/ERBB4 signaling. ERBB3 co-expression significantly enhanced Ba/F3 cell proliferation upon neuregulin-1 (NRG1) treatment. For comprehensive signaling studies we performed quantitative mass spectrometry (MS) experiments to compare the basal ERBB3/ERBB4 cell phosphoproteome to NRG1 treatment of ERBB3/ERBB4 and ERBB4 cells. We employed a workflow comprising differential isotope labeling with mTRAQ reagents followed by chromatographic peptide separation and final phosphopeptide enrichment prior to MS analysis. Overall, we identified 9686 phosphorylation sites which could be confidently localized to specific residues. Statistical analysis of three replicate experiments revealed 492 phosphorylation sites which were significantly changed in NRG1-treated ERBB3/ERBB4 cells. Bioinformatics data analysis recapitulated regulation of mitogen-activated protein kinase and Akt pathways, but also indicated signaling links to cytoskeletal functions and nuclear biology. Comparative assessment of NRG1-stimulated ERBB4 Ba/F3 cells revealed that ERBB3 did not trigger defined signaling pathways but more broadly enhanced phosphoproteome regulation in cells expressing both receptors. In conclusion, our data provide the first global picture of ERBB3/ERBB4 signaling and provide numerous potential starting points for further mechanistic studies. PMID:26745281

  8. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  9. Optimal display conditions for quantitative analysis of stereoscopic cerebral angiograms

    SciTech Connect

    Charland, P.; Peters, T.

    1996-10-01

For several years the authors have been using a stereoscopic display as a tool in the planning of stereotactic neurosurgical techniques. This PC-based workstation allows the surgeon to interact with and view vascular images in three dimensions, as well as to perform quantitative analysis of the three-dimensional (3-D) space. Some of the perceptual issues relevant to the presentation of medical images on this stereoscopic display were addressed in five experiments. The authors show that a number of parameters (namely the shape, color, and depth cue associated with a cursor), as well as the image filtering and observer position, have a role in improving the observer's perception of a 3-D image and his ability to localize points within the stereoscopically presented 3-D image. However, an analysis of the results indicates that while varying these parameters can lead to an effect on the performance of individual observers, the effects are not consistent across observers, and the mean accuracy remains relatively constant under the different experimental conditions.

  10. Quantitative analysis of noninvasive diagnostic procedures for induction motor drives

    NASA Astrophysics Data System (ADS)

    Eltabach, Mario; Antoni, Jerome; Najjar, Micheline

    2007-10-01

This paper reports quantitative analyses of spectral fault components in five noninvasive diagnostic procedures that use input electric signals to detect different types of abnormalities in induction motors. Besides the traditional one-phase current spectrum analysis (SC), the diagnostic procedures based on spectrum analysis of the instantaneous partial powers (Pab, Pcb), the total power (Pabc), and the current space vector modulus (csvm) are considered. The aim of this comparison study is to improve the diagnosis tools for detection of electromechanical faults in electrical machines by using the most suitable diagnostic procedure, given some motor and fault characteristics. Defining a severity factor as the increase in amplitude of the fault characteristic frequency, with respect to the healthy condition, enables us to study the sensitivity of the electrical diagnostic tools. As a result, it is shown that the relationship between the angular displacement of the current sideband components at frequencies (f ± fosc) is directly related to the type of induction motor fault. It is also shown that the total instantaneous power diagnostic procedure exhibits the highest values of the detection criterion in the case of mechanical faults, while in the case of electrical faults the most reliable diagnostic procedure is tightly related to the value of the motor power factor angle and the combined motor-load inertia. Finally, simulation and experimental results show good agreement with the theoretical fault modeling results.
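The severity factor defined above (the increase in amplitude of the fault characteristic frequency relative to the healthy condition) can be sketched with a synthetic current signal. The 50 Hz supply, 5 Hz fault oscillation, and sideband amplitude are assumptions for illustration, not the paper's experimental values.

```python
import numpy as np

# Synthetic stator current: healthy signal vs one with fault sidebands
# at f_supply ± f_osc (all frequencies and amplitudes assumed).
fs, T = 1000.0, 2.0
t = np.arange(0, T, 1 / fs)
f_supply, f_osc = 50.0, 5.0

healthy = np.sin(2 * np.pi * f_supply * t)
faulty = (healthy
          + 0.05 * np.sin(2 * np.pi * (f_supply - f_osc) * t)
          + 0.05 * np.sin(2 * np.pi * (f_supply + f_osc) * t))

def sideband_amplitude(signal, f_target):
    """Amplitude of the spectral line nearest f_target (Hz)."""
    spec = np.abs(np.fft.rfft(signal)) / (len(signal) / 2)
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    return float(spec[np.argmin(np.abs(freqs - f_target))])

a_healthy = sideband_amplitude(healthy, f_supply - f_osc)
a_faulty = sideband_amplitude(faulty, f_supply - f_osc)
severity = a_faulty - a_healthy   # increase over the healthy condition
print(f"sideband amplitude increase: {severity:.3f}")
```

The same amplitude comparison applied to the partial-power, total-power, or space-vector-modulus spectra is what lets the paper rank the five procedures by sensitivity.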

  11. Quantitative analysis of triple-mutant genetic interactions.

    PubMed

    Braberg, Hannes; Alexander, Richard; Shales, Michael; Xu, Jiewei; Franks-Skiba, Kathleen E; Wu, Qiuqin; Haber, James E; Krogan, Nevan J

    2014-08-01

    The quantitative analysis of genetic interactions between pairs of gene mutations has proven to be effective for characterizing cellular functions, but it can miss important interactions for functionally redundant genes. To address this limitation, we have developed an approach termed triple-mutant analysis (TMA). The procedure relies on a query strain that contains two deletions in a pair of redundant or otherwise related genes, which is crossed against a panel of candidate deletion strains to isolate triple mutants and measure their growth. A central feature of TMA is to interrogate mutants that are synthetically sick when two other genes are deleted but interact minimally with either single deletion. This approach has been valuable for discovering genes that restore critical functions when the principal actors are deleted. TMA has also uncovered double-mutant combinations that produce severe defects because a third protein becomes deregulated and acts in a deleterious fashion, and it has revealed functional differences between proteins presumed to act together. The protocol is optimized for Singer ROTOR pinning robots, takes 3 weeks to complete and measures interactions for up to 30 double mutants against a library of 1,536 single mutants. PMID:25010907

  12. Quantitative EEG analysis in post-traumatic anosmia.

    PubMed

    Bonanni, E; Borghetti, D; Fabbrini, M; Maestri, M; Cignoni, F; Sartucci, F; Murri, L

    2006-12-11

Many objective and quantitative methods have been developed to create a procedure or a device to prove, describe and quantify olfactory deficit and anosmia, especially after head trauma. Electrophysiological testing through olfactoelectroencephalography (olfactoEEG) is based on brain activity desynchronisation, and on the subsequent disappearance of alpha activity over the posterior regions after an olfactory stimulus. Yet traditional evaluation of the EEG can be difficult because of little or hardly detectable alpha activity on the posterior regions ('alpha rare'). The aim of this study was to evaluate the Olfactory Stop Reaction (OSR) by means of frequency band power calculation and subsequent topographical mapping in patients with post-traumatic anosmia who presented 'alpha rare' EEG. Twenty-five consecutive patients, affected by anosmia caused by head trauma, underwent EEG recording with olfactory stimulation. After signal processing and analysis, an Olfactory Stop Reaction was detected in 17 out of 25 patients; moreover, in these patients we detected a significant decrease in alpha band power in the occipital regions and an increase in theta band power over midline frontal and central regions after olfactory stimulation. In the remaining eight patients, no significant variation in band power was observed. In conclusion, an objective evaluation of olfactory function with this method of automatic EEG signal analysis allows the limits of psychophysical methods and traditional EEG to be overcome and attempts to fulfil the requirements for standardization of olfactory function evaluation. PMID:17113930
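The band-power comparison underlying the OSR detection described above can be sketched as follows. The signals are synthetic (a strong 10 Hz alpha rhythm that attenuates after the stimulus); the sampling rate, epoch length, and amplitudes are assumptions, not the study's recording parameters.

```python
import numpy as np

# Synthetic pre- and post-stimulus EEG epochs: a 10 Hz alpha rhythm
# plus noise, attenuated after the olfactory stimulus (values assumed).
fs = 250.0
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(1)
pre = 10 * np.sin(2 * np.pi * 10 * t) + rng.normal(size=t.size)
post = 2 * np.sin(2 * np.pi * 10 * t) + rng.normal(size=t.size)

def band_power(x, fs, lo, hi):
    """Integrated periodogram power within [lo, hi] Hz."""
    spec = np.abs(np.fft.rfft(x)) ** 2 / (fs * len(x))
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return float(spec[mask].sum())

alpha_pre = band_power(pre, fs, 8, 13)     # alpha band before stimulus
alpha_post = band_power(post, fs, 8, 13)   # alpha band after stimulus
print(f"alpha suppression: {100 * (1 - alpha_post / alpha_pre):.0f}%")
```

A significant post-stimulus drop in occipital alpha power (with a rise in midline theta) is the quantitative signature the study used in place of visual EEG inspection.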

  13. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    PubMed Central

    2015-01-01

    Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high-throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with postexcision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Moreover, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. Peptidomics complements results obtainable from conventional bottom-up proteomics and provides insights not readily obtainable from such approaches. PMID:25350482

  14. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    SciTech Connect

    Xu, Zhe; Wu, Chaochao; Xie, Fang; Slysz, Gordon W.; Tolic, Nikola; Monroe, Matthew E.; Petyuk, Vladislav A.; Payne, Samuel H.; Fujimoto, Grant M.; Moore, Ronald J.; Fillmore, Thomas L.; Schepmoes, Athena A.; Levine, Douglas; Townsend, Reid; Davies, Sherri; Li, Shunqiang; Ellis, Matthew; Boja, Emily; Rivers, Robert; Rodriguez, Henry; Rodland, Karin D.; Liu, Tao; Smith, Richard D.

    2015-01-01

Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high-throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Moreover, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. Peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.

  15. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    SciTech Connect

    Xu, Zhe; Wu, Chaochao; Xie, Fang; Slysz, Gordon W.; Tolic, Nikola; Monroe, Matthew E.; Petyuk, Vladislav A.; Payne, Samuel H.; Fujimoto, Grant M.; Moore, Ronald J.; Fillmore, Thomas L.; Schepmoes, Athena A.; Levine, Douglas; Townsend, Reid; Davies, Sherri; Li, Shunqiang; Ellis, Matthew; Boja, Emily; Rivers, Robert; Rodriguez, Henry; Rodland, Karin D.; Liu, Tao; Smith, Richard D.

    2015-01-02

    Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Moreover, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. Peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.

  16. Quantitative analysis of PPT1 interactome in human neuroblastoma cells

    PubMed Central

Scifo, Enzo; Szwajda, Agnieszka; Soliymani, Rabah; Pezzini, Francesco; Bianchi, Marzia; Dapkunas, Arvydas; Dębski, Janusz; Uusi-Rauva, Kristiina; Dadlez, Michał; Gingras, Anne-Claude; Tyynelä, Jaana; Simonati, Alessandro; Jalanko, Anu; Baumann, Marc H.; Lalowski, Maciej

    2015-01-01

Mutations in the CLN1 gene, which encodes palmitoyl protein thioesterase 1 (PPT1, also known as CLN1), cause infantile NCL (INCL, MIM#256730). PPT1 removes long fatty acid chains such as palmitate from modified cysteine residues of proteins. The data shown here result from protein complexes isolated from PPT1-expressing SH-SY5Y stable cells that were subjected to single-step affinity purification coupled to mass spectrometry (AP-MS). Prior to the MS analysis, we utilised a modified filter-aided sample preparation (FASP) protocol. Based on label-free quantitative analysis of the data by SAINT, 23 PPT1 interacting partners (IPs) were identified. A dense connectivity in the PPT1 network was further revealed by functional coupling and extended network analyses, linking it to mitochondrial ATP synthesis coupled protein transport and the thioester biosynthetic process. Moreover, the terms inhibition of organismal death, movement disorders and concentration of lipid were predicted to be altered in the PPT1 network. Data presented here are related to Scifo et al. (J. Proteomics, 123 (2015) 42–53). PMID:26217791

  17. Nanotechnology patents in the automotive industry (a quantitative & qualitative analysis).

    PubMed

    Prasad, Raghavendra; Bandyopadhyay, Tapas K

    2014-01-01

    The aim of this article is to present trends in patent filings for applications of nanotechnology in the automobile sector worldwide, using keyword-based patent searches. An overview of patents related to nanotechnology in the automobile industry is provided. The work began with a worldwide patent search to find patents on nanotechnology in the automobile industry and to classify them according to the automobile parts to which they relate and the solutions they provide. Graphs were then produced to reveal various trends, followed by an analysis of the patents within each classification. The trends shown in the graphs provide the quantitative analysis, whereas the qualitative analysis is presented in a separate section. The classification of patents by the solution they provide was performed by reading the claims, titles, abstracts and full texts separately. The patentability of nanotechnology inventions is discussed with a view to outlining the requirements for, and statutory bars to, patenting such inventions. A further objective of the current work is to suggest an appropriate framework for companies regarding the use of nanotechnology in the automobile industry, together with a suggested strategy for patenting the related inventions. For example, US patent application US2008-019426A1 discusses an invention related to a lubricant composition. This patent was studied and classified under automobile parts; from it, it was deduced that the problem of friction in the engine is being solved. One classification is based on the automobile part, the other on the problem being solved; hence two classifications, namely reduction in friction and engine, were created. Similarly, after studying all the patents, a classification matrix was created. PMID:25336172

  18. Automated quantitative gait analysis in animal models of movement disorders

    PubMed Central

    2010-01-01

    Background Accurate and reproducible behavioral tests in animal models are of major importance in the development and evaluation of new therapies for central nervous system disease. In this study we investigated for the first time gait parameters of rat models for Parkinson's disease (PD), Huntington's disease (HD) and stroke using the Catwalk method, a novel automated gait analysis test. Static and dynamic gait parameters were measured in all animal models, and these data were compared to readouts of established behavioral tests, such as the cylinder test in the PD and stroke rats and the rotarod tests for the HD group. Results Hemiparkinsonian rats were generated by unilateral injection of the neurotoxin 6-hydroxydopamine in the striatum or in the medial forebrain bundle. For Huntington's disease, a transgenic rat model expressing a truncated huntingtin fragment with multiple CAG repeats was used. Thirdly, a stroke model was generated by a photothrombotic induced infarct in the right sensorimotor cortex. We found that multiple gait parameters were significantly altered in all three disease models compared to their respective controls. Behavioural deficits could be efficiently measured using the cylinder test in the PD and stroke animals, and in the case of the PD model, the deficits in gait essentially confirmed results obtained by the cylinder test. However, in the HD model and the stroke model the Catwalk analysis proved more sensitive than the rotarod test and also added new and more detailed information on specific gait parameters. Conclusion The automated quantitative gait analysis test may be a useful tool to study both motor impairment and recovery associated with various neurological motor disorders. PMID:20691122

  19. Automated quantitative analysis of coordinated locomotor behaviour in rats.

    PubMed

    Tanger, H J; Vanwersch, R A; Wolthuis, O L

    1984-03-01

    Disturbances of motor coordination are usually difficult to quantify. Therefore, a method was developed for the automated quantitative analysis of the movements of the dyed paws of stepping rats, registered by a colour TV camera. The signals from the TV-video system were converted by an electronic interface into voltages proportional to the X- and Y-coordinates of the paws, from which a desktop computer calculated the movements of these paws in time and distance. Application 1 analysed the steps of a rat walking in a hollow rotating wheel. The results showed low variability of the walking pattern, the method was insensitive to low doses of alcohol, but was suitable to quantify overt, e.g. neurotoxic, locomotor disturbances or recovery thereof. In application 2 hurdles were placed in a similar hollow wheel and the rats were trained to step from the top of one hurdle to another. Physostigmine-induced disturbances of this acquired complex motor task could be detected at doses far below those that cause overt symptoms. PMID:6738111
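
    The core computation described above, converting successive (X, Y) paw coordinates into movement in time and distance, reduces to distances between consecutive tracked positions. A minimal sketch in Python (hypothetical coordinates; the original used an electronic TV-video interface and a desktop computer):

```python
import math

def step_lengths(coords):
    """Distances between successive (x, y) paw positions.

    coords: list of (x, y) tuples sampled over time, e.g. from a
    tracking system. Returns one distance per frame-to-frame step.
    """
    return [
        math.dist(coords[i], coords[i + 1])
        for i in range(len(coords) - 1)
    ]

def total_path(coords):
    """Total distance travelled by the paw."""
    return sum(step_lengths(coords))

# Example: a paw moving 3 units right, then 4 units up.
path = [(0.0, 0.0), (3.0, 0.0), (3.0, 4.0)]
print(step_lengths(path))  # [3.0, 4.0]
print(total_path(path))    # 7.0
```

    Per-step timing (and hence speed) follows by dividing each step length by the frame interval of the recording system.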

  20. Quantitative analysis by digital processing of streakline flow visualization images

    NASA Astrophysics Data System (ADS)

    Borleteau, J.-P.; Cognet, G.

    1986-01-01

    An experimental technique is described for the quantitative analysis of smoke filament (streakline) flow visualization images. The streakline, smoke droplets emitted by a point source, is illuminated by a sheet of laser light. The 1-2 micron diameter droplets trace the diffusive motions in the flow. Fast photography is performed to record the streaklines using ASA 400 film (an 800 ASA film system is under development). The film frames are scanned by a video camera linked to a mainframe (PDP 11-34) computer to digitize the images within a 256 x 256 pixel grid with 256 gray levels. Sample results are provided from a study of the entrained mixing layer of two plane parallel flows at speeds of 2 and 1 m/sec, respectively. The photography was performed at 600 frames/sec with 1/3000 sec exposures, yielding a spatial resolution of 1.5 mm. The method furnishes data for calculating the probability distribution for the presence of an oil droplet at any point of the flow, as well as recording the turbulent structures which form. Correlations between the vertical displacements of the droplets and the longitudinal speed of the flow permit calculating the frequency of passage of the vortices.

  1. Quantitative analysis of retinal glycerolipid molecular species acetylated by acetolysis.

    PubMed

    Choe, H G; Wiegand, R D; Anderson, R E

    1989-03-01

    A method for the quantitative analysis of molecular species of 1,2-diacylglycerol acetates (1,2-DGAC) containing polyunsaturated fatty acids is described. Phosphatidylethanolamine (PE) isolated from frog retina was used to test the method. PE was converted to 1,2-DGAC by acetolysis. The molecular species of the 1,2-DGAC were resolved by reverse-phase high performance liquid chromatography (HPLC), detected by UV absorption spectroscopy at 210 nm, and identified by gas-liquid chromatography (GLC) of the fatty acid methyl esters (FAME). Molar response curves were generated for each DGAC molecular species that eluted as a single entity from HPLC by determining the moles of fatty acids in the molecular species collected and the response (peak area unit) of the UV detector. Each molecular species response curve was linear from about 10 pmoles to 4-8 nmoles, allowing the slope of each curve to be used as a molar absorptivity. This method provides a means for quantification of most of the molecular species of all glycerolipid classes. PMID:2786049

  2. SILAC-based quantitative proteomic analysis of gastric cancer secretome

    PubMed Central

    Marimuthu, Arivusudar; Subbannayya, Yashwanth; Sahasrabuddhe, Nandini A.; Balakrishnan, Lavanya; Syed, Nazia; Sekhar, Nirujogi Raja; Katte, Teesta V.; Pinto, Sneha M.; Srikanth, Srinivas M.; Kumar, Praveen; Pawar, Harsh; Kashyap, Manoj K.; Maharudraiah, Jagadeesha; Ashktorab, Hassan; Smoot, Duane T; Ramaswamy, Girija; Kumar, Rekha V.; Cheng, Yulan; Meltzer, Stephen J; Roa, Juan Carlos; Chaerkady, Raghothama; Prasad, T.S. Keshava; Harsha, H. C.; Chatterjee, Aditi; Pandey, Akhilesh

    2013-01-01

    Purpose Gastric cancer is a commonly occurring cancer in Asia and one of the leading causes of cancer deaths. However, there is no reliable blood-based screening test for this cancer. Identifying proteins secreted from tumor cells could lead to the discovery of clinically useful biomarkers for early detection of gastric cancer. Experimental design A SILAC-based quantitative proteomic approach was employed to identify secreted proteins that were differentially expressed between neoplastic and non-neoplastic gastric epithelial cells. Proteins from the secretome were subjected to SDS-PAGE and SCX-based fractionation, followed by mass spectrometric analysis on an LTQ-Orbitrap Velos mass spectrometer. Immunohistochemical labeling was employed to validate a subset of candidates using tissue microarrays. Results We identified 2,205 proteins in the gastric cancer secretome of which 263 proteins were overexpressed >4-fold in gastric cancer-derived cell lines as compared to non-neoplastic gastric epithelial cells. Three candidate proteins, proprotein convertase subtilisin/kexin type 9 (PCSK9), lectin mannose binding 2 (LMAN2) and PDGFA associated protein 1 (PDAP1), were validated by immunohistochemical labeling. Conclusions and clinical relevance We report here the largest cancer secretome described to date. The novel biomarkers identified in the current study are excellent candidates for further testing as early detection biomarkers for gastric adenocarcinoma. PMID:23161554

  3. A new method for the quantitative analysis of endodontic microleakage.

    PubMed

    Haïkel, Y; Wittenmeyer, W; Bateman, G; Bentaleb, A; Allemann, C

    1999-03-01

    The aim of this in vitro study was to evaluate the apical seal obtained with three commonly used root canal sealing cements: Sealapex, AH Plus or Topseal, and Sealite, using a new method based on the quantitative analysis of 125I-radiolabeled lysozyme penetration. One hundred thirteen teeth with straight single root canals were instrumented to master apical point #25/30. These were divided into three groups: (i) negative control (4 roots) covered with two layers of nail polish, (ii) test group (105 roots) obturated by laterally condensed guttapercha with the three cements; and (iii) positive control (4 roots) obturated without cement. The groups were then immersed in 125I lysozyme solution for a period of 1, 7, 14, or 28 days. After removal, six sections of 0.8 mm length each were made of each root with a fine diamond wire. Each section was analyzed for activity by a gamma counter, corrected for decay, and used to quantify protein penetration. Leakage was high in the positive control and almost negligible in the negative control. AH Plus (Topseal) and Sealapex showed similar leakage behavior over time, with AH Plus (Topseal) performing better. Sealite showed acceptable leakage up until day 14, after which a large increase occurred, presumably due to three-dimensional instability. PMID:10321181
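
    The gamma counts in this protocol were corrected for radioactive decay before quantification. A minimal sketch of such a correction for 125I (assuming its physical half-life of about 59.4 days; the constant and function names are illustrative, not from the paper):

```python
import math

I125_HALF_LIFE_DAYS = 59.4  # approximate physical half-life of 125I

def decay_corrected(counts, elapsed_days, half_life=I125_HALF_LIFE_DAYS):
    """Correct a measured count rate back to the reference time.

    Counts measured after `elapsed_days` are multiplied by
    exp(lambda * t), with lambda = ln(2) / half-life.
    """
    decay_constant = math.log(2) / half_life
    return counts * math.exp(decay_constant * elapsed_days)

# After one half-life, half the activity remains, so the
# correction factor is 2.
print(round(decay_corrected(1000.0, 59.4)))  # 2000
```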

  4. Quantitative Comparison of Mitotic Spindles by Confocal Image Analysis

    SciTech Connect

    Price, Jeffery R; Aykac, Deniz; Gleason, Shaun Scott

    2005-01-01

    The mitotic spindle is a subcellular protein structure that facilitates chromosome segregation and is crucial to cell division. We describe an image processing approach to quantitatively characterize and compare mitotic spindles that have been imaged three dimensionally using confocal microscopy with fixed-cell preparations. The proposed approach is based on a set of features that are computed from each image stack representing a spindle. We compare several spindle datasets of varying biological (genotype) and/or environmental (drug treatment) conditions. The goal of this effort is to aid biologists in detecting differences between spindles that may not be apparent under subjective visual inspection, and furthermore, to eventually automate such analysis in high-throughput scenarios (thousands of images) where manual inspection would be unreasonable. Experimental results on positive- and negative-control data indicate that the proposed approach is indeed effective. Differences are detected when it is known they do exist (positive control) and no differences are detected when there are none (negative control). In two other experimental comparisons, results indicate structural spindle differences that biologists had not observed previously.

  5. Quantitative texture analysis of talc in mantle hydrated mylonites

    NASA Astrophysics Data System (ADS)

    Benitez-Perez, J. M.; Gomez Barreiro, J.; Wenk, H. R.; Vogel, S. C.; Soda, Y.; Voltolini, M.; Martinez-Catalan, J. R.

    2014-12-01

    A quantitative texture analysis of talc-serpentinite mylonites developed in highly deformed ultramafic rocks from different orogenic contexts has been performed with neutron diffraction at HIPPO (Los Alamos National Laboratory). The mineral assemblage, metamorphic evolution and deformation fabric of these samples can be correlated with those expected along the shallow levels (<100 km; <5 GPa) of a subduction zone. The hydration of mantle (ultramafic) rocks at those levels is likely to occur dynamically, with important implications for seismogenesis. Given the high anisotropy of the major phases in the samples (i.e. talc and antigorite), they are expected to influence the seismic anisotropy of the whole system in the presence of texture. However, to date there have been no data on the crystallographic preferred orientation of talc, and examples of antigorite textures are very limited. We explore the contribution of talc texture to the seismic anisotropy of mantle hydrated mylonites. Acknowledgements: This work has been funded by research project CGL2011-22728 of the Spanish Ministry of Economy and Competitiveness. JGB and JMBP are grateful to the Ramón y Cajal and FPI funding programs. Access to HIPPO (LANSCE) to conduct diffraction experiments is kindly acknowledged.

  6. Quantitative analysis of biomedical samples using synchrotron radiation microbeams

    NASA Astrophysics Data System (ADS)

    Ektessabi, Ali; Shikine, Shunsuke; Yoshida, Sohei

    2001-07-01

    X-ray fluorescence (XRF) using a synchrotron radiation (SR) microbeam was applied to investigate distributions and concentrations of elements in single neurons of patients with neurodegenerative diseases. In this paper we introduce a computer code that has been developed to quantify the trace elements and matrix elements at the single cell level. This computer code has been used in studies of several important neurodegenerative diseases such as Alzheimer's disease (AD), Parkinson's disease (PD) and parkinsonism-dementia complex (PDC), as well as in basic biological experiments to determine the elemental changes in cells due to incorporation of foreign metal elements. Substantia nigra (SN) tissue obtained from autopsy specimens of patients with Guamanian parkinsonism-dementia complex (PDC) and control cases was examined. Quantitative XRF analysis showed that neuromelanin granules of parkinsonian SN contained higher levels of Fe than those of the control, with concentrations in the ranges of 2300-3100 ppm and 2000-2400 ppm, respectively. In contrast, Zn and Ni in neuromelanin granules of SN tissue from the PDC case were lower than those of the control. In particular, Zn was less than 40 ppm in SN tissue from the PDC case, whereas it was 560-810 ppm in the control. These changes are considered to be closely related to the neurodegeneration and cell death.

  7. Gas mixing by cardiogenic oscillations: a theoretical quantitative analysis.

    PubMed

    Slutsky, A S

    1981-11-01

    A quantitative theoretical model of the enhanced gas mixing secondary to cardiogenic oscillations is presented, based on the concept of augmented gas transport within the tracheobronchial tree (Science 209: 609, 1980). The model assumes "well-mixed" flow in the upper airways, with the enhanced mixing described by Deff = Dmol + K * u * d, where Deff is the effective diffusivity; Dmol, the molecular diffusivity; K, a constant; u, the root-mean-square flow; and d, the airway diameter. In the smaller airways, an analysis based on Taylor laminar dispersion is used, described by Deff = Dmol + (1/192) (ud)^2/Dmol. The model predicts that, in dogs, cardiogenic oscillations should enhance gas mixing about 10-fold, depending on the flow rates generated by the heart. Other predictions are that the augmentation of gas mixing should be greater 1) at lower lung volumes, 2) with sulfur hexafluoride vs. helium or air, 3) after peripheral airway dilation, and 4) after central airway constriction. Theoretical predictions are very close to published experimental results where available. This model should help in the development of mathematical models of gas mixing within the lungs that include the contribution of cardiogenic oscillations. PMID:7298465
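
    The two transport regimes of the model can be written directly as functions. A minimal sketch (symbols as in the abstract; the numeric inputs below are illustrative, not the dog-airway values used in the paper):

```python
def deff_well_mixed(d_mol, k, u, d):
    """Effective diffusivity in the upper airways (well-mixed regime):
    Deff = Dmol + K * u * d."""
    return d_mol + k * u * d

def deff_taylor(d_mol, u, d):
    """Effective diffusivity in the smaller airways (Taylor laminar
    dispersion): Deff = Dmol + (1/192) * (u * d)**2 / Dmol."""
    return d_mol + (u * d) ** 2 / (192.0 * d_mol)

# Illustrative (not physiological) numbers: molecular diffusivity
# 0.2 cm^2/s, rms velocity 1 cm/s, airway diameter 0.5 cm, K = 1.
print(deff_well_mixed(0.2, 1.0, 1.0, 0.5))  # 0.7
print(deff_taylor(0.2, 1.0, 0.5))
```

    Note that the Taylor term grows with the square of u*d while the well-mixed term is linear in it, which is why the two regimes are applied to different airway generations.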

  8. Limits of normality of quantitative thoracic CT analysis

    PubMed Central

    2013-01-01

    Introduction Although computed tomography (CT) is widely used to investigate different pathologies, quantitative data from normal populations are scarce. Reference values may be useful to estimate the anatomical or physiological changes induced by various diseases. Methods We analyzed 100 helical CT scans taken for clinical purposes and referred as nonpathological by the radiologist. Profiles were manually outlined on each CT scan slice and each voxel was classified according to its gas/tissue ratio. For regional analysis, the lungs were divided into 10 sterno-vertebral levels. Results We studied 53 males and 47 females (age 64 ± 13 years); males had a greater total lung volume, lung gas volume and lung tissue. Noninflated tissue averaged 7 ± 4% of the total lung weight, poorly inflated tissue averaged 18 ± 3%, normally inflated tissue averaged 65 ± 8% and overinflated tissue averaged 11 ± 7%. We found a significant correlation between lung weight and subject's height (P <0.0001, r2 = 0.49); the total lung capacity in a supine position was 4,066 ± 1,190 ml, ~1,800 ml less than the predicted total lung capacity in a sitting position. Superimposed pressure averaged 2.6 ± 0.5 cmH2O. Conclusion Subjects without lung disease present significant amounts of poorly inflated and overinflated tissue. Normal lung weight can be predicted from patient's height with reasonable confidence. PMID:23706034
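
    The voxel classification by gas/tissue ratio can be illustrated with Hounsfield-unit (HU) bins. The cutoffs below are the conventional quantitative-CT values (a voxel at -1000 HU is treated as all gas, at 0 HU as all tissue); the study's exact thresholds may differ:

```python
def classify_voxel(hu):
    """Classify a lung voxel by its Hounsfield unit value.

    Conventional quantitative-CT bins (assumed here; the paper may
    use slightly different cutoffs):
      overinflated:       HU < -900
      normally inflated:  -900 <= HU < -500
      poorly inflated:    -500 <= HU < -100
      noninflated:        HU >= -100
    """
    if hu < -900:
        return "overinflated"
    if hu < -500:
        return "normally inflated"
    if hu < -100:
        return "poorly inflated"
    return "noninflated"

def gas_fraction(hu):
    """Gas/tissue ratio: -1000 HU is all gas, 0 HU all tissue
    (water-equivalent); values are clamped to [0, 1]."""
    return min(max(-hu / 1000.0, 0.0), 1.0)

print(classify_voxel(-950))          # overinflated
print(classify_voxel(-700))          # normally inflated
print(round(gas_fraction(-700), 2))  # 0.7
```

    Summing `gas_fraction` over all lung voxels (times voxel volume) gives the lung gas volume; the complement gives tissue weight, as reported in the abstract.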

  9. Quantitative Financial Analysis of Alternative Energy Efficiency Shareholder Incentive Mechanisms

    SciTech Connect

    Cappers, Peter; Goldman, Charles; Chait, Michele; Edgar, George; Schlegel, Jeff; Shirley, Wayne

    2008-08-03

    Rising energy prices and climate change are central issues in the debate about our nation's energy policy. Many are demanding increased energy efficiency as a way to help reduce greenhouse gas emissions and lower the total cost of electricity and energy services for consumers and businesses. Yet, as the National Action Plan on Energy Efficiency (NAPEE) pointed out, many utilities continue to shy away from seriously expanding their energy efficiency program offerings because they claim there is insufficient profit motivation, or even a financial disincentive, when compared to supply-side investments. With the recent introduction of Duke Energy's Save-a-Watt incentive mechanism and ongoing discussions about decoupling, regulators and policymakers are now faced with an expanded and diverse landscape of financial incentive mechanisms. Determining the 'right' way forward to promote deep and sustainable demand-side resource programs is challenging. Due to the renaissance that energy efficiency is currently experiencing, many want to better understand the tradeoffs in stakeholder benefits between these alternative incentive structures before aggressively embarking on a path for which course corrections can be time-consuming and costly. Using a prototypical Southwest utility and a publicly available financial model, we show how various stakeholders (e.g., shareholders and ratepayers) are affected by these different types of shareholder incentive mechanisms under varying assumptions about program portfolios. This quantitative analysis compares the financial consequences associated with a wide range of alternative incentive structures. The results will help regulators and policymakers better understand the financial implications of DSR program incentive regulation.

  10. Hyperspectral imaging and quantitative analysis for prostate cancer detection

    NASA Astrophysics Data System (ADS)

    Akbari, Hamed; Halig, Luma V.; Schuster, David M.; Osunkoya, Adeboye; Master, Viraj; Nieh, Peter T.; Chen, Georgia Z.; Fei, Baowei

    2012-07-01

    Hyperspectral imaging (HSI) is an emerging modality for various medical applications. Its spectroscopic data may make it possible to noninvasively detect cancer. Quantitative analysis is often necessary in order to differentiate healthy from diseased tissue. We propose the use of an advanced image processing and classification method in order to analyze hyperspectral image data for prostate cancer detection. The spectral signatures were extracted and evaluated in both cancerous and normal tissue. Least squares support vector machines were developed and evaluated for classifying hyperspectral data in order to enhance the detection of cancer tissue. This method was used to detect prostate cancer in tumor-bearing mice and on pathology slides. Spatially resolved images were created to highlight the differences between the reflectance properties of cancer and those of normal tissue. Preliminary results with 11 mice showed that the sensitivity and specificity of the hyperspectral image classification method are 92.8% ± 2.0% and 96.9% ± 1.3%, respectively. Therefore, this imaging method may be able to help physicians to dissect malignant regions with a safe margin and to evaluate the tumor bed after resection. This pilot study may lead to advances in the optical diagnosis of prostate cancer using HSI technology.

  11. Quantitative analysis of plasma interleukin-6 by immunoassay on microchip

    NASA Astrophysics Data System (ADS)

    Abe, K.; Hashimoto, Y.; Yatsushiro, S.; Yamamura, S.; Tanaka, M.; Ooie, T.; Baba, Y.; Kataoka, M.

    2012-03-01

    Sandwich enzyme-linked immunosorbent assay (ELISA) is one of the most frequently employed assays for clinical diagnosis, since it enables the investigator to identify specific protein biomarkers. However, the conventional assay using a 96-well microtitration plate is time- and sample-consuming, and therefore is not suitable for rapid diagnosis. To overcome these drawbacks, we performed a sandwich ELISA on a microchip. We employed piezoelectric inkjet printing for deposition and fixation of the first antibody on the microchannel surface (300 μm width and 100 μm depth). The model analyte was interleukin-6 (IL-6), an inflammatory cytokine. After blocking the microchannel, the antigen, biotin-labeled second antibody, and avidin-labeled peroxidase were infused into the microchannel and incubated for 20 min, 10 min, and 5 min, respectively. This assay could detect 2 pg/ml and quantitatively measure the range of 0-32 pg/ml. Linear regression analysis of plasma IL-6 concentrations obtained by the microchip and conventional methods exhibited a significant relationship (R2 = 0.9964). This assay reduced the time for the antigen-antibody reaction to 1/6, and the consumption of samples and reagents to 1/50, compared with the conventional method. It enables us to determine plasma IL-6 with accuracy, high sensitivity, time saving, and low consumption of sample and reagents, and thus will be applicable to clinical diagnosis.
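
    The comparison of microchip and conventional measurements rests on an ordinary least-squares calibration line and its R2. A minimal sketch with hypothetical standards (the concentrations and signals below are made up for illustration, not the paper's data):

```python
def linear_fit(xs, ys):
    """Least-squares slope, intercept and R^2 for paired data."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    ss_res = sum((y - (slope * x + intercept)) ** 2
                 for x, y in zip(xs, ys))
    ss_tot = sum((y - mean_y) ** 2 for y in ys)
    r_squared = 1.0 - ss_res / ss_tot
    return slope, intercept, r_squared

# Hypothetical standards: IL-6 concentration (pg/ml) vs. assay signal.
conc = [0.0, 4.0, 8.0, 16.0, 32.0]
signal = [0.01, 0.20, 0.41, 0.79, 1.62]
slope, intercept, r2 = linear_fit(conc, signal)
print(round(r2, 3))
```

    Unknown samples are then read off the fitted line as (signal - intercept) / slope.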

  12. Hyperspectral imaging and quantitative analysis for prostate cancer detection

    PubMed Central

    Akbari, Hamed; Halig, Luma V.; Schuster, David M.; Osunkoya, Adeboye; Master, Viraj; Nieh, Peter T.; Chen, Georgia Z.; Fei, Baowei

    2012-01-01

    Abstract. Hyperspectral imaging (HSI) is an emerging modality for various medical applications. Its spectroscopic data might be able to be used to noninvasively detect cancer. Quantitative analysis is often necessary in order to differentiate healthy from diseased tissue. We propose the use of an advanced image processing and classification method in order to analyze hyperspectral image data for prostate cancer detection. The spectral signatures were extracted and evaluated in both cancerous and normal tissue. Least squares support vector machines were developed and evaluated for classifying hyperspectral data in order to enhance the detection of cancer tissue. This method was used to detect prostate cancer in tumor-bearing mice and on pathology slides. Spatially resolved images were created to highlight the differences of the reflectance properties of cancer versus those of normal tissue. Preliminary results with 11 mice showed that the sensitivity and specificity of the hyperspectral image classification method are 92.8% to 2.0% and 96.9% to 1.3%, respectively. Therefore, this imaging method may be able to help physicians to dissect malignant regions with a safe margin and to evaluate the tumor bed after resection. This pilot study may lead to advances in the optical diagnosis of prostate cancer using HSI technology. PMID:22894488

  13. Communication about vaccinations in Italian websites: a quantitative analysis.

    PubMed

    Tafuri, Silvio; Gallone, Maria S; Gallone, Maria F; Zorico, Ivan; Aiello, Valeria; Germinario, Cinzia

    2014-01-01

    Babies' parents and people who look for information about vaccination often visit anti-vaccine movement websites, or blogs by naturopathic physicians or natural and alternative medicine practitioners. The aim of this work is to provide a quantitative analysis of the type of information available to Italian people regarding vaccination and a quality analysis of the websites retrieved through our searches. A quality score was created to evaluate the technical level of websites. A search was performed through Yahoo, Google, and MSN using the keywords "vaccine" and "vaccination", combined with the function "OR", in order to identify the most frequently used websites. The two keywords were input in Italian, and the first 15 pages retrieved by each search engine were analyzed. 149 websites were selected through this methodology. Fifty-three percent of the websites belonged to associations, groups, or scientific companies, 32.2% (n = 48) consisted of personal blogs, and 14.8% (n = 22) belonged to National Health System offices. Among all analyzed websites, 15.4% (n = 23) came from anti-vaccine movement groups. 37.6% reported the webmaster's name, 67.8% the webmaster's e-mail, 28.6% the date of the last update, and 46.6% the author's name. The quality score for government sites was higher on average than for anti-vaccine websites, although government sites do not use Web 2.0 functions, such as forums. National Health System institutions that have to promote vaccination cannot avoid investing in web communication, because it cannot be managed by private efforts alone but must be the result of synergy among Public Health bodies, private and scientific associations, and social movements. PMID:24607988

  14. Global Tractography with Embedded Anatomical Priors for Quantitative Connectivity Analysis

    PubMed Central

    Lemkaddem, Alia; Skiöldebrand, Didrik; Dal Palú, Alessandro; Thiran, Jean-Philippe; Daducci, Alessandro

    2014-01-01

    Tractography algorithms provide us with the ability to non-invasively reconstruct fiber pathways in the white matter (WM) by exploiting the directional information described with diffusion magnetic resonance. These methods can be divided into two major classes, local and global. Local methods reconstruct each fiber tract iteratively by considering only directional information at the voxel level and its neighborhood. Global methods, on the other hand, reconstruct all the fiber tracts of the whole brain simultaneously by solving a global energy minimization problem. The latter have shown improvements compared to previous techniques, but these algorithms still suffer from an important shortcoming that is crucial in the context of brain connectivity analyses. As no anatomical priors are usually considered during the reconstruction process, the recovered fiber tracts are not guaranteed to connect cortical regions and, as a matter of fact, most of them stop prematurely in the WM; this violates important properties of neural connections, which are known to originate in the gray matter (GM) and develop in the WM. Hence, this shortcoming poses serious limitations for the use of these techniques for the assessment of the structural connectivity between brain regions and, de facto, it can potentially bias any subsequent analysis. Moreover, the estimated tracts are not quantitative: every fiber contributes the same weight toward the predicted diffusion signal. In this work, we propose a novel approach for global tractography that is specifically designed for connectivity analysis applications which: (i) explicitly enforces anatomical priors of the tracts in the optimization and (ii) considers the effective contribution of each of them, i.e., volume, to the acquired diffusion magnetic resonance imaging (MRI) image. 
We evaluated our approach on both a realistic diffusion MRI phantom and in vivo data, and also compared its performance to existing tractography algorithms. PMID:25452742

  15. Understanding Maneuver Uncertainties during Inclination Maneuvers of the Aqua Spacecraft

    NASA Technical Reports Server (NTRS)

    McKinley, David P.

    2007-01-01

    During the Fall 2006 inclination campaign for the Aqua spacecraft, it was discovered that there was significant uncertainty in the prediction of the semi-major axis change during a maneuver. The low atmospheric drag environment at the time of the maneuvers amplified the effects of this uncertainty, leading to a potential violation of the spacecraft ground-track requirements. In order to understand the uncertainty, a Monte Carlo simulation was developed to characterize the expected semi-major axis change uncertainty given the observed behavior of the spacecraft propulsion and attitude control systems during a maneuver. This expected uncertainty was then used to develop new analysis tools to ensure that future inclination maneuver plans will meet ground-track control requirements in the presence of the error.
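
    A Monte Carlo characterization of this kind can be sketched by sampling thrust-magnitude and pointing errors and propagating them through the near-circular-orbit relation da = (2/n) * dv_t, where n is the mean motion and dv_t the tangential velocity change. All numbers below are illustrative, not Aqua's actual maneuver parameters:

```python
import math
import random

MU_EARTH = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def sma_change(a_km, dv_mps, pointing_err_rad):
    """Semi-major axis change (km) from a small burn on a
    near-circular orbit: da = (2/n) * dv_tangential."""
    n = math.sqrt(MU_EARTH / a_km ** 3)                # mean motion, rad/s
    dv_tangential = (dv_mps / 1000.0) * math.cos(pointing_err_rad)
    return (2.0 / n) * dv_tangential

def monte_carlo(a_km, dv_nominal_mps, dv_sigma_frac, point_sigma_deg,
                trials=20000, seed=42):
    """Sample thrust-magnitude and pointing errors; return the mean
    and standard deviation of the semi-major axis change."""
    rng = random.Random(seed)
    samples = []
    for _ in range(trials):
        dv = dv_nominal_mps * (1.0 + rng.gauss(0.0, dv_sigma_frac))
        point = math.radians(rng.gauss(0.0, point_sigma_deg))
        samples.append(sma_change(a_km, dv, point))
    mean = sum(samples) / trials
    var = sum((s - mean) ** 2 for s in samples) / (trials - 1)
    return mean, math.sqrt(var)

# Illustrative inputs: 7080 km orbit radius, 0.1 m/s burn,
# 2% thrust-magnitude sigma, 1 deg pointing sigma.
mean_da, sigma_da = monte_carlo(7080.0, 0.1, 0.02, 1.0)
print(round(mean_da * 1000.0, 1), "m mean change, sigma",
      round(sigma_da * 1000.0, 2), "m")
```

    The spread of the sampled da values is the quantity that matters for ground-track control: it bounds how far a planned maneuver can miss its target semi-major axis.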

  16. Analysis of quantitative phase detection based on optical information processing

    NASA Astrophysics Data System (ADS)

    Tao, Wang; Tu, Jiang-Chen; Chun, Kuang-Tao; Yu, Han-Wang; Xin, Du

    2009-07-01

    Phase objects exist widely in nature, for example biological cells, optical components and atmospheric flow fields. The phase detection of objects has great significance in basic research, nondestructive testing, aerospace, military weapons and other areas. The usual methods of phase-object detection include the interference method, the grating method, the schlieren method and the phase-contrast method. These methods have their own advantages, but they also have disadvantages in detection precision, environmental requirements, cost, detection rate, detection range and detection linearity in various applications; even the most sophisticated of them, the phase-contrast method, which is mainly used for microscopic structures, lacks a quantitative analysis of the magnitude of the object's phase and of the relationship between image contrast and the optical system. In this paper, various phase detection methods and their characteristics in different applications are analyzed on the basis of optical information processing, and a phase detection system based on optical filtering is constructed. First, the frequency spectrum of the phase object is obtained by a Fourier-transform lens; then the spectrum is modified by a filter; finally, an image that represents the phase distribution through its light intensity is obtained by the inverse Fourier transform. The advantages and disadvantages of commonly used filters, such as the quarter-wavelength phase filter, the high-pass filter and the edge filter, are analyzed, their phase resolution is compared in the same optical information processing system, and the factors affecting phase resolution are pointed out. The paper concludes that for any application there exists an optimal filter that maximizes detection accuracy. Finally, we discuss how to design such an optimal filter, through which the phase-testing ability of an optical information processing system can be improved the most.
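
    The filtering pipeline described (Fourier transform, spectrum modification, inverse transform) can be sketched with NumPy. The example below applies a quarter-wavelength shift to the zero-frequency component, the classic Zernike phase-contrast filter, so that a weak phase object becomes visible as intensity contrast (a simplified simulation under these assumptions, not the authors' system):

```python
import numpy as np

def phase_contrast_image(phase, dc_shift=np.pi / 2):
    """Simulate a 4f phase-contrast system on a weak phase object.

    The object field is exp(i*phase). Its spectrum is computed with
    an FFT, the zero-frequency (DC) component is phase-shifted by a
    quarter wave, and the intensity of the inverse transform then
    varies approximately linearly with the object's phase.
    """
    field = np.exp(1j * phase)
    spectrum = np.fft.fft2(field)
    spectrum[0, 0] *= np.exp(1j * dc_shift)  # quarter-wave plate at DC
    out = np.fft.ifft2(spectrum)
    return np.abs(out) ** 2

# A weak phase bump in an otherwise flat field: without the filter
# the intensity would be uniform (|exp(i*phase)|^2 == 1 everywhere).
phase = np.zeros((64, 64))
phase[24:40, 24:40] = 0.2  # radians
image = phase_contrast_image(phase)
print(image[32, 32] > image[0, 0])  # the bump appears as contrast
```

    With `dc_shift=0` the filter does nothing and the output intensity is uniform, which is exactly the invisibility of phase objects that the filtering is designed to overcome.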

  17. Quantitative analysis of flavanones and chalcones from willow bark.

    PubMed

    Freischmidt, A; Untergehrer, M; Ziegler, J; Knuth, S; Okpanyi, S; Müller, J; Kelber, O; Weiser, D; Jürgenliemk, G

    2015-09-01

    Willow bark extracts are used for the treatment of fever, pain and inflammation. Recent clinical and pharmacological research revealed that not only the salicylic alcohol derivatives, but also the polyphenols significantly contribute to these effects. Quantitative analysis of the European Pharmacopoeia still focuses on the determination of the salicylic alcohol derivatives. The objective of the present study was the development of an effective quantification method for the determination of as many flavanone and chalcone glycosides as possible in Salix purpurea and other Salix species as well as commercial preparations thereof. As Salix species contain a diverse spectrum of the glycosidated flavanones naringenin, eriodictyol, and the chalcone chalconaringenin, a subsequent acidic and enzymatic hydrolysis was developed to yield naringenin and eriodictyol as aglycones, which were quantified by HPLC. The 5-O-glucosides were cleaved with 11.5% TFA before subsequent hydrolysis of the 7-O-glucosides with an almond β-glucosidase at pH 6-7. The method was validated with regard to LOD, LOQ, intraday and interday precision, accuracy, stability, recovery, time of hydrolysis, robustness and applicability to extracts. All 5-O- and 7-O-glucosides of naringenin, eriodictyol and chalconaringenin were completely hydrolysed and converted to naringenin and eriodictyol. The LOD of the HPLC method was 0.77 µM for naringenin and 0.45 µM for eriodictyol. The LOQ was 2.34 µM for naringenin and 1.35 µM for eriodictyol. The method is robust with regard to sample weight, but susceptible concerning enzyme deterioration. The developed method is applicable to the determination of flavanone and chalcone glycosides in willow bark and corresponding preparations. PMID:26492639
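
LOD and LOQ figures of the kind reported here are commonly derived from a calibration line using the 3.3σ/S and 10σ/S conventions of ICH Q2(R1); the concentrations and peak areas below are hypothetical, not the paper's data:

```python
import numpy as np

# Hypothetical calibration data: concentration (µM) vs. HPLC peak area.
conc = np.array([1.0, 2.5, 5.0, 10.0, 25.0, 50.0])
area = np.array([0.98, 2.60, 5.10, 9.80, 25.30, 49.70])

slope, intercept = np.polyfit(conc, area, 1)       # linear calibration
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                      # std. error of the regression

lod = 3.3 * sigma / slope                          # ICH Q2(R1) convention
loq = 10.0 * sigma / slope
```

By construction LOQ/LOD = 10/3.3, so a reported LOD/LOQ pair that departs strongly from that ratio signals a different estimation approach.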

  18. Quantitative analysis of analgoantipyretics in dosage form using planar chromatography.

    PubMed

    Franeta, J T; Agbaba, D D; Eric, S M; Pavkov, S P; Vladimirov, S D; Aleksic, M B

    2001-03-01

    In the therapy of pain of weaker genesis, frequently used drugs usually represent a mix of analgoantipyretics of different chemical structures, mostly derivatives of salicylic acid, pyrazolone and p-aminophenol, as well as derivatives of propionic and acetylsalicylic acid. For the determination of these drugs, different chromatographic methods have been applied, mostly HPLC, due to the lower polarity (pyrazolone derivatives), thermolability and nonvolatility of the compounds investigated. The TLC method, considering advantages which include simplicity, reasonable sensitivity, rapidity, excellent resolving power and low cost, has been successfully explored for the determination of analgoantipyretic compounds. The aim of this work was to develop a simple and rapid HPTLC method for the determination of acetylsalicylic acid, paracetamol, caffeine and phenobarbitone in dosage form. The determination of analgoantipyretics was performed on pre-coated HPTLC silica gel plates (10 × 20 cm) by development in the mobile phase dichloromethane-ethyl acetate-cyclohexane-isopropanol-0.1 M HCl-formic acid (9:8:3:1.5:0.2:0.2 v/v/v/v/v/v). Migration distances (68.6 ± 0.2 mm, 54.1 ± 0.1 mm, 36.4 ± 0.14 mm and 85.9 ± 0.11 mm for acetylsalicylic acid, paracetamol, caffeine and phenobarbitone, respectively) with low RSD values (0.13-0.39%) showed a satisfactory reproducibility of the chromatographic system. A TLC scanner was used for direct evaluation of the chromatograms in the reflectance/absorbance mode. The established calibration curves (r > 0.999), precision (0.3-1.02%) and detection limits, as well as recovery values (96.51-98.1%), were validated and found to be satisfactory. The method was found to be reproducible and convenient for the quantitative analysis of the compounds investigated in their dosage forms. PMID:11248516

  19. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Cruikshank, Dale P.; MoreauDalleOre, Cristina; Pendleton, Yvonne J.; Clark, Roger Nelson

    2012-01-01

    We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites: Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visible-Infrared Mapping Spectrometer on close flybys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at approximately 3.28 micrometers (approximately 3050 per centimeter) and four blended bands of aliphatic -CH2- and -CH3 in the range approximately 3.36-3.52 micrometers (approximately 2980-2840 per centimeter). The aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is NAro:NAliph approximately 24; for Hyperion the value is approximately 12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundance of aliphatic -CH2- and -CH3 is an indication of the lengths of the molecular chain structures, hence the degree of modification of the original material. We derive CH2:CH3 approximately 2.2 in the spectrum of low-albedo material on Iapetus; this value is the same within measurement errors as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, the conclusion that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.

  20. Descriptive Quantitative Analysis of Rearfoot Alignment Radiographic Parameters.

    PubMed

    Meyr, Andrew J; Wagoner, Matthew R

    2015-01-01

    Although the radiographic parameters of the transverse talocalcaneal angle (tTCA), calcaneocuboid angle (CCA), talar head uncovering (THU), calcaneal inclination angle (CIA), talar declination angle (TDA), lateral talar-first metatarsal angle (lTFA), and lateral talocalcaneal angle (lTCA) form the basis of the preoperative evaluation and procedure selection for pes planovalgus deformity, the so-called normal values of these measurements are not well established. The objectives of the present study were, first, to retrospectively evaluate the descriptive statistics of these radiographic parameters (tTCA, CCA, THU, CIA, TDA, lTFA, and lTCA) in a large population and, second, to determine an objective basis for defining "normal" versus "abnormal" measurements. As a secondary outcome, the relationship of these variables to the body mass index was assessed. Anteroposterior and lateral foot radiographs from 250 consecutive patients without a history of previous foot and ankle surgery and/or trauma were evaluated. The results revealed mean measurements of 24.12°, 13.20°, 74.32%, 16.41°, 26.64°, 8.37°, and 43.41° for the tTCA, CCA, THU, CIA, TDA, lTFA, and lTCA, respectively. These were generally in line with the reported historical normal values. Descriptive statistical analysis demonstrated that the tTCA, THU, and TDA met the standards to be considered normally distributed, but the CCA, CIA, lTFA, and lTCA demonstrated data characteristics of both parametric and nonparametric distributions. Furthermore, only the CIA (R = -0.2428) and lTCA (R = -0.2449) demonstrated substantial correlation with the body mass index. No clear break points in deformity progression were observed when the radiographic parameters were plotted against each other, so no quantitative basis emerged for defining "normal" versus "abnormal" measurements. PMID:26002682

  1. Quantitative Analysis of the Effective Functional Structure in Yeast Glycolysis

    PubMed Central

    De la Fuente, Ildefonso M.; Cortes, Jesus M.

    2012-01-01

    The understanding of the effective functionality that governs the enzymatic self-organized processes in cellular conditions is a crucial topic in the post-genomic era. In recent studies, Transfer Entropy has been proposed as a rigorous, robust and self-consistent method for the causal quantification of the functional information flow among nonlinear processes. Here, in order to quantify the functional connectivity for the glycolytic enzymes in dissipative conditions we have analyzed different catalytic patterns using the technique of Transfer Entropy. The data were obtained by means of a yeast glycolytic model formed by three delay differential equations where the enzymatic rate equations of the irreversible stages have been explicitly considered. These enzymatic activity functions were previously modeled and tested experimentally by other different groups. The results show the emergence of a new kind of dynamical functional structure, characterized by changing connectivity flows and a metabolic invariant that constrains the activity of the irreversible enzymes. In addition to the classical topological structure characterized by the specific location of enzymes, substrates, products and feedback-regulatory metabolites, an effective functional structure emerges in the modeled glycolytic system, which is dynamical and characterized by notable variations of the functional interactions. The dynamical structure also exhibits a metabolic invariant which constrains the functional attributes of the enzymes. Finally, in accordance with the classical biochemical studies, our numerical analysis reveals in a quantitative manner that the enzyme phosphofructokinase is the key-core of the metabolic system, behaving for all conditions as the main source of the effective causal flows in yeast glycolysis. PMID:22393350
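
Transfer Entropy, as used in this study, can be estimated from a pair of time series by discretising the states and comparing the conditional distributions of the driven variable with and without the driver's history. Below is a minimal histogram-based sketch with one-step memory and a hypothetical linearly coupled pair, not the authors' implementation:

```python
import numpy as np

def transfer_entropy(x, y, bins=8):
    """Histogram estimate of TE(X -> Y) with one-step memory:
    TE = sum p(y1, y0, x0) * log[ p(y1 | y0, x0) / p(y1 | y0) ]."""
    y1, y0, x0 = y[1:], y[:-1], x[:-1]
    p_xyz, _ = np.histogramdd(np.column_stack([y1, y0, x0]), bins=bins)
    p_xyz /= p_xyz.sum()               # joint p(y1, y0, x0)
    p_yy = p_xyz.sum(axis=2)           # p(y1, y0)
    p_yx = p_xyz.sum(axis=0)           # p(y0, x0)
    p_y0 = p_yy.sum(axis=0)            # p(y0)
    te = 0.0
    for i, j, k in zip(*np.nonzero(p_xyz)):
        # log[ p(y1,y0,x0) p(y0) / (p(y0,x0) p(y1,y0)) ] equals the ratio above
        te += p_xyz[i, j, k] * np.log(
            p_xyz[i, j, k] * p_y0[j] / (p_yx[j, k] * p_yy[i, j]))
    return te

# Hypothetical coupled pair: x drives y, but not vice versa.
rng = np.random.default_rng(0)
n = 5000
x = rng.standard_normal(n)
y = np.empty(n)
y[0] = 0.0
for t in range(n - 1):
    y[t + 1] = 0.7 * x[t] + 0.5 * rng.standard_normal()

te_xy = transfer_entropy(x, y)   # driver -> driven: clearly positive
te_yx = transfer_entropy(y, x)   # reverse direction: near zero (bias only)
```

For this pair TE(X→Y) clearly exceeds TE(Y→X), identifying X as the source of the causal flow; the study applies the same logic to the modeled glycolytic enzyme activities.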

  2. Visual Modeling for Aqua Ventus I off Monhegan Island, ME

    SciTech Connect

    Hanna, Luke A.; Whiting, Jonathan M.; Copping, Andrea E.

    2013-11-27

    To assist the University of Maine in demonstrating a clear pathway to project completion, PNNL has developed visualization models of the Aqua Ventus I project that accurately depict the Aqua Ventus I turbines from various points on Monhegan Island, ME and the surrounding area. With a hub height of 100 meters, the Aqua Ventus I turbines are large and may be seen from many areas on Monhegan Island, potentially disrupting important viewsheds. By developing these visualization models, which consist of actual photographs taken from Monhegan Island and the surrounding area with the Aqua Ventus I turbines superimposed within each photograph, PNNL intends to support the project’s siting and permitting process by providing the Monhegan Island community and various other stakeholders with a probable glimpse of how the Aqua Ventus I project will appear.

  3. Quantitative descriptive analysis and principal component analysis for sensory characterization of ultrapasteurized milk.

    PubMed

    Chapman, K W; Lawless, H T; Boor, K J

    2001-01-01

    Quantitative descriptive analysis was used to describe the key attributes of nine ultrapasteurized (UP) milk products of various fat levels, including two lactose-reduced products, from two dairy plants. Principal components analysis identified four significant principal components that accounted for 87.6% of the variance in the sensory attribute data. Principal component scores indicated that the location of each UP milk along each of four scales primarily corresponded to cooked, drying/lingering, sweet, and bitter attributes. Overall product quality was modeled as a function of the principal components using multiple least squares regression (R2 = 0.810). These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring UP fluid milk product attributes that are important to consumers. PMID:11210023
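
The PCA step, locating each milk sample along components built from the sensory attribute data, can be sketched with a plain SVD; the attribute matrix below is hypothetical, standing in for the trained panel's mean scores:

```python
import numpy as np

# Hypothetical panel means: rows = UP milk samples, columns = sensory
# attributes (cooked, drying/lingering, sweet, bitter), scored 0-10.
X = np.array([
    [6.1, 3.2, 4.0, 1.1],
    [5.8, 2.9, 4.2, 1.0],
    [7.4, 4.8, 3.1, 2.2],
    [2.1, 1.0, 6.5, 0.4],
    [2.5, 1.3, 6.1, 0.6],
    [7.9, 5.1, 2.8, 2.6],
])

Xc = X - X.mean(axis=0)                  # mean-centre each attribute
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                           # principal component scores per sample
explained = s**2 / np.sum(s**2)          # fraction of variance per component
```

The rows of `Vt` are the attribute loadings; a statement like "four components account for 87.6% of the variance" corresponds to `explained[:4].sum()`.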

  4. Quantitative PCR analysis of salivary pathogen burden in periodontitis

    PubMed Central

    Salminen, Aino; Kopra, K. A. Elisa; Hyvärinen, Kati; Paju, Susanna; Mäntylä, Päivi; Buhlin, Kåre; Nieminen, Markku S.; Sinisalo, Juha; Pussinen, Pirkko J.

    2015-01-01

    Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease (CAD) diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR (qPCR). Median salivary concentrations of Porphyromonas gingivalis, Tannerella forsythia, and Prevotella intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary Aggregatibacter actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4–5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39–4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51–4.52). The highest OR 3.59 (95% CI 1.94–6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and T. forsythia were used. Salivary diagnostics of periodontitis has potential especially in large-scale population studies and health promotion. The cumulative strategy appears to be useful in the analysis of salivary bacteria as markers of periodontitis. PMID:26484315
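
An odds ratio of the kind reported here can be computed from a 2×2 exposure/outcome table with a Woolf-type (log-method) confidence interval; the counts below are hypothetical, not the Parogene data:

```python
import numpy as np

def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Odds ratio with a 95% CI from a 2x2 table (Woolf / log method)."""
    a, b, c, d = map(float, (exposed_cases, exposed_controls,
                             unexposed_cases, unexposed_controls))
    or_ = (a * d) / (b * c)
    se = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)    # SE of log(OR)
    lo, hi = np.exp(np.log(or_) + np.array([-1.96, 1.96]) * se)
    return or_, (lo, hi)

# Hypothetical counts: high vs. low bacterial burden index among subjects
# with moderate-to-severe vs. no-to-mild periodontitis.
or_, ci = odds_ratio(120, 80, 90, 150)
```

The study's adjusted ORs come from logistic regression rather than a raw table, but the interpretation of the point estimate and CI is the same.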

  5. Quantitative PCR analysis of salivary pathogen burden in periodontitis.

    PubMed

    Salminen, Aino; Kopra, K A Elisa; Hyvärinen, Kati; Paju, Susanna; Mäntylä, Päivi; Buhlin, Kåre; Nieminen, Markku S; Sinisalo, Juha; Pussinen, Pirkko J

    2015-01-01

    Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease (CAD) diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR (qPCR). Median salivary concentrations of Porphyromonas gingivalis, Tannerella forsythia, and Prevotella intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary Aggregatibacter actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4-5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39-4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51-4.52). The highest OR 3.59 (95% CI 1.94-6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and T. forsythia were used. Salivary diagnostics of periodontitis has potential especially in large-scale population studies and health promotion. The cumulative strategy appears to be useful in the analysis of salivary bacteria as markers of periodontitis. PMID:26484315

  6. Quantitative genetics

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The majority of economically important traits targeted for cotton improvement are quantitatively inherited. In this chapter, the current state of cotton quantitative genetics is described and separated into four components. These components include: 1) traditional quantitative inheritance analysis, ...

  7. A Quantitative Analysis of the Behavioral Checklist of the Movement ABC Motor Test

    ERIC Educational Resources Information Center

    Ruiz, Luis Miguel; Gomez, Marta; Graupera, Jose Luis; Gutierrez, Melchor; Linaza, Jose Luis

    2007-01-01

    The fifth section of the Henderson and Sugden's Movement ABC Checklist is part of the general Checklist that accompanies The Movement ABC Battery. The authors maintain that the analysis of this section must be mainly qualitative instead of quantitative. The main objective of this study was to employ a quantitative analysis of this behavioural…

  8. [Addition internal standard method in chromatographic quantitative analysis].

    PubMed

    Zheng, Y J; Kang, Y; Feng, Y Z; Zhang, R; Zhang, W B

    2001-09-01

    The internal standard method is a conventional chromatographic quantitation method that requires one or several internal standards to be added. The internal standard component must not be contained in the sample, and a good separation between the internal standard and the sample components is needed. In many cases, selecting an internal standard is not convenient, or is even precluded by the separation of the components. In this paper, we combine the internal standard method and the addition method to form a new chromatographic quantitation method named the addition internal standard method. The principles of the addition internal standard method are suitable not only for chromatographic quantitation but also for polarography, etc. The related theory and foundation of the method are defined, the operation steps and the conditions suitable to the method are discussed, and the advantages and disadvantages of this method are explained in detail. PMID:12545448
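
The addition component of such a method rests on standard addition: spike aliquots of the sample with known analyte amounts, regress the (internal-standard-normalised) response on the added amount, and read the original content off the x-intercept. A minimal sketch with hypothetical numbers, not the paper's worked example:

```python
import numpy as np

# Hypothetical data: known analyte amounts spiked into aliquots of the
# sample (µg), and the measured analyte/internal-standard area ratio.
added = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
ratio = np.array([0.52, 0.78, 1.03, 1.55, 2.58])

slope, intercept = np.polyfit(added, ratio, 1)   # linear response assumed

# Extrapolating the line to zero response, the magnitude of the x-intercept
# is the amount of analyte originally present in the aliquot.
original_amount = intercept / slope
```

Normalising each response to an internal standard before the regression is what makes the combined scheme robust to injection-volume and detector drift.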

  9. Quantitative Analysis by Isotopic Dilution Using Mass Spectroscopy: The Determination of Caffeine by GC-MS.

    ERIC Educational Resources Information Center

    Hill, Devon W.; And Others

    1988-01-01

    Describes a laboratory technique for quantitative analysis of caffeine by an isotopic dilution method for coupled gas chromatography-mass spectroscopy. Discusses caffeine analysis and experimental methodology. Lists sample caffeine concentrations found in common products. (MVL)
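
Isotope dilution quantitation reduces to a ratio measurement: because the labelled spike behaves chemically like the native analyte, the native amount follows from the known spike amount and the measured ion-signal ratio. A single-point sketch with hypothetical peak areas, assuming equal response factors and no overlap between the ion signals:

```python
# Single-point isotope dilution for caffeine with a labelled internal
# standard (e.g. caffeine-d3); all numbers below are hypothetical.
spike_ng = 500.0          # known amount of labelled standard added
area_native = 84000.0     # extracted-ion peak area of native caffeine
area_labeled = 70000.0    # extracted-ion peak area of the labelled spike

# Equal-response assumption: amount ratio equals measured area ratio.
caffeine_ng = spike_ng * (area_native / area_labeled)
```

In practice a response-factor calibration corrects for any residual difference between the labelled and unlabelled forms.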

  10. Quantitative analysis of a labeled 13C system in NMR

    NASA Astrophysics Data System (ADS)

    Liu, Huawei; Zhang, Shanmin

    2015-02-01

    A quantitative method to analyze labeled S spin systems is proposed. Different from non-labeled systems, the reciprocity relation breaks down in labeled systems because of residual S-spin dipole-dipole interactions and indirect interactions. Under fast MAS, however, these interactions are reduced considerably and require a longer time to communicate between different parts of the Hamiltonian. Therefore, within a short contact time (<0.5 ms), the same quantitative method can be used for both non-labeled and labeled systems. The method is independent of local structures and can be applied to samples of multiple substances.

  11. Quantitative analysis of sensor for pressure waveform measurement

    PubMed Central

    2010-01-01

    Background Arterial pressure waveforms contain important diagnostic and physiological information, since their contour depends on a healthy cardiovascular system [1]. A sensor is placed over the measured artery and some contact pressure is applied to record the pressure waveform. However, where should the sensor be located to detect a complete pressure waveform suitable for diagnosis, and how much contact pressure should be applied over the pulse point? These two problems remain unresolved. Method In this study, we propose a quantitative analysis to evaluate the pressure waveform, in order to locate the sensor and to apply the appropriate force between the sensor and the radial artery. A two-axis mechanism and a modified sensor were designed to estimate the radial arterial width and detect the contact pressure. The template matching method was used to analyze the pressure waveform. In the X-axis scan, we found that the arterial diameter changed waveform (ADCW) and the pressure waveform would change from small to large and then back to small again as the sensor was moved across the radial artery. In the Z-axis scan, we also found that the ADCW and the pressure waveform would change from small to large and then back to small again as the applied contact pressure continuously increased. Results In the X-axis scan, the template correlation coefficients at the left and right boundaries of the radial arterial width were 0.987 ± 0.016 and 0.978 ± 0.028, respectively. In the Z-axis scan, when the excess contact pressure exceeded 100 mm Hg, the template correlation fell below 0.983. In applying force, when using the maximum amplitude as the criterion level, the lower contact pressure (r = 0.988 ± 0.004) was better than the higher contact pressure (r = 0.976 ± 0.012).
    Conclusions Although the optimal detection position is close to the middle of the radial artery, the pressure waveform also has good completeness, with a template correlation coefficient above 0.99, when the position is within ± 1 mm of the middle of the radial artery. In applying force, using the maximum amplitude as the criterion level, the lower contact pressure was better than the higher contact pressure. PMID:20092621
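
The template matching used to score waveform completeness can be sketched as a sliding Pearson correlation between a stored pulse template and the measured signal; the pulse shape below is synthetic, not the study's data:

```python
import numpy as np

def best_match(signal, template):
    """Slide the template along a 1-D signal and return the offset with the
    highest Pearson correlation coefficient (simple template matching)."""
    m = len(template)
    best_i, best_r = 0, -np.inf
    for i in range(len(signal) - m + 1):
        seg = signal[i:i + m]
        if seg.std() == 0:
            continue                     # flat segment: correlation undefined
        r = np.corrcoef(seg, template)[0, 1]
        if r > best_r:
            best_i, best_r = i, r
    return best_i, best_r

# A half-sine-squared "pulse" template, and a signal holding a scaled,
# offset copy of it starting at sample 30.
t = np.linspace(0.0, 1.0, 50)
template = np.sin(np.pi * t) ** 2
signal = np.concatenate([np.full(30, 0.1),
                         1.8 * template + 0.1,
                         np.full(20, 0.1)])

offset, r = best_match(signal, template)
```

Because the Pearson coefficient is invariant to scale and offset, a distorted or clipped pulse lowers r even when its amplitude matches, which is why the study uses r as a completeness criterion.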

  12. A new quantitative method for gunshot residue analysis by ion beam analysis.

    PubMed

    Christopher, Matthew E; Warmenhoeven, John-William; Romolo, Francesco S; Donghi, Matteo; Webb, Roger P; Jeynes, Christopher; Ward, Neil I; Kirkby, Karen J; Bailey, Melanie J

    2013-08-21

    Imaging and analyzing gunshot residue (GSR) particles using the scanning electron microscope equipped with an energy dispersive X-ray spectrometer (SEM-EDS) is a standard technique that can provide important forensic evidence, but the discrimination power of this technique is limited due to low sensitivity to trace elements and difficulties in obtaining quantitative results from small particles. A new, faster method using a scanning proton microbeam and Particle Induced X-ray Emission (µ-PIXE), together with Elastic Backscattering Spectrometry (EBS), is presented for the non-destructive, quantitative analysis of the elemental composition of single GSR particles. In this study, the GSR particles all contained Pb, Ba and Sb. The precision of the method is assessed. The grouping behaviour of different makes of ammunition is determined using multivariate analysis. The protocol correctly groups the cartridges studied here, with a confidence >99%, irrespective of the firearm or population of particles selected. PMID:23775063

  13. Teaching Quantitative Reasoning for Nonscience Majors through Carbon Footprint Analysis

    ERIC Educational Resources Information Center

    Boose, David L.

    2014-01-01

    Quantitative reasoning is a key intellectual skill, applicable across disciplines and best taught in the context of authentic, relevant problems. Here, I describe and assess a laboratory exercise that has students calculate their "carbon footprint" and evaluate the impacts of various behavior choices on that footprint. Students gather…
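
A carbon-footprint exercise of this kind reduces to summing activity levels times emission factors and then comparing behaviour scenarios. The factors and activity levels below are illustrative placeholders, not the values used in the course:

```python
# Hypothetical per-unit emission factors (kg CO2e); illustrative only.
EMISSION_FACTORS = {
    "car_km": 0.19,           # per km driven in a petrol car
    "flight_km": 0.15,        # per passenger-km flown
    "electricity_kwh": 0.40,  # per kWh of grid electricity
    "beef_kg": 27.0,          # per kg of beef consumed
}

def annual_footprint(activity):
    """Sum activity levels x emission factors -> kg CO2e per year."""
    return sum(EMISSION_FACTORS[k] * v for k, v in activity.items())

# One student's hypothetical year, and the same year without flying.
student = {"car_km": 8000, "flight_km": 3000,
           "electricity_kwh": 1800, "beef_kg": 25}
baseline = annual_footprint(student)
no_flights = annual_footprint({**student, "flight_km": 0})
```

Comparing `baseline` with `no_flights` quantifies a single behaviour choice, which is the kind of reasoning the exercise asks students to practise.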

  14. MOLD SPECIFIC QUANTITATIVE PCR: THE EMERGING STANDARD IN MOLD ANALYSIS

    EPA Science Inventory

    Today I will talk about the use of quantitative or Real time PCR for the standardized identification and quantification of molds. There are probably at least 100,000 species of molds or fungi. But there are actually about 100 typically found indoors. Some pose a threat to human...

  15. Quantitative and Qualitative Analysis of Biomarkers in Fusarium verticillioides

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this study, a combination HPLC-DART-TOF-MS system was utilized to identify and quantitatively analyze carbohydrates in wild type and mutant strains of Fusarium verticillioides. Carbohydrate fractions were isolated from F. verticillioides cellular extracts by HPLC using a cation-exchange size-excl...

  17. Aqua splint suture technique in isolated zygomatic arch fractures.

    PubMed

    Kim, Dong-Kyu; Kim, Seung Kyun; Lee, Jun Ho; Park, Chan Hum

    2014-04-01

    Various methods have been used to treat zygomatic arch fractures, but no optimal modality exists for reducing these fractures and supporting the depressed bone fragments without causing esthetic problems and discomfort. We developed a novel aqua splint and suture technique for stabilizing isolated zygomatic arch fractures. The objective of this study was to evaluate the effect of this novel aqua splint and suture technique in isolated zygomatic arch fractures. Patients with isolated zygomatic arch fractures were treated by a single surgeon in a single center from January 2000 through December 2012. The classic Gillies approach without external fixation was performed from January 2000 to December 2003, while the novel technique has been performed since 2004. 67 consecutive patients were included (classic method, n = 32; novel method, n = 35). Informed consent was obtained from all patients. The novel aqua splint and suture technique was performed in the following fashion: first, we evaluated the bony alignment intraoperatively by ultrasonography and then reduced the depressed fracture surgically using the Gillies approach. Thereafter, to stabilize the fracture and obtain a smooth facial contour, we made an aqua splint that fit the facial contour and placed monofilament nonabsorbable sutures around the fractured zygomatic arch. The novel aqua splint and suture technique was significantly associated with better cosmetic and functional results. In conclusion, the aqua splint suture technique is simple, quick, safe, and effective for stabilizing repositioned zygomatic arch fractures, and can be a good alternative procedure in isolated zygomatic arch fractures. PMID:23793598

  18. Is the new AquaTrainer® snorkel valid for VO2 assessment in swimming?

    PubMed

    Baldari, C; Fernandes, R J; Meucci, M; Ribeiro, J; Vilas-Boas, J P; Guidetti, L

    2013-04-01

    The Cosmed AquaTrainer® snorkel, in connection with the K4b2 analyzer, is the most recent instrument used for real time gas analysis during swimming. This study aimed to test whether a new AquaTrainer® snorkel with 2 (SV2) or 4 (SV4) valves is comparable to a standard face mask (Mask), being valid for real time gas analysis under controlled laboratory and swimming pool conditions. 9 swimmers performed 2 swimming and 3 cycling tests at 3 different workloads on separate days. Tests were performed in random order, at constant exercise load with direct turbine temperature measurements, breathing with Mask, SV4 and SV2 while cycling, and with SV2 and SV4 while swimming. A high agreement was obtained using Passing-Bablok regression analysis for oxygen consumption, carbon dioxide production, tidal volume, pulmonary ventilation, expiratory fractions of oxygen and carbon dioxide, and heart rate when comparing the different conditions in swimming and cycling. Proportional and fixed differences were always rejected (the 95% CI always contained the value 1 for the slope and 0 for the intercept). In conclusion, the new SV2 AquaTrainer® snorkel can be considered a valid device for gas analysis, being comparable to the Mask and the SV4 in cycling, and to the SV4 in swimming. PMID:23041962
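
Passing-Bablok regression, used here to test for fixed and proportional bias between devices, fits a line from the shifted median of all pairwise slopes. A minimal sketch of the point estimates (the confidence intervals, which carry the actual validity test, are omitted):

```python
import numpy as np
from itertools import combinations

def passing_bablok(x, y):
    """Passing-Bablok slope and intercept (point estimates only), following
    the original 1983 procedure: median of pairwise slopes, shifted by the
    number of slopes below -1, with slopes equal to -1 discarded."""
    slopes = []
    for (xi, yi), (xj, yj) in combinations(zip(x, y), 2):
        if xi == xj:
            continue                    # undefined slope
        s = (yj - yi) / (xj - xi)
        if s != -1:
            slopes.append(s)
    slopes = np.sort(slopes)
    n = len(slopes)
    k = int(np.sum(slopes < -1))        # offset for the shifted median
    if n % 2:                           # odd: single shifted middle element
        b = slopes[(n + 1) // 2 + k - 1]
    else:                               # even: mean of the two middle elements
        b = 0.5 * (slopes[n // 2 + k - 1] + slopes[n // 2 + k])
    a = np.median(np.asarray(y) - b * np.asarray(x))
    return b, a

# Two devices in perfect agreement up to a scale and an offset.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = 2.0 * x + 1.0
slope, intercept = passing_bablok(x, y)
```

Agreement between methods is concluded when the slope CI contains 1 and the intercept CI contains 0, which is exactly what the study reports for all compared conditions.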

  19. Quantitative PCR Analysis of DNA Aptamer Pharmacokinetics in Mice

    PubMed Central

    Perschbacher, Katherine; Smestad, John A.; Peters, Justin P.; Standiford, Miranda M.; Denic, Aleksandar; Wootla, Bharath; Warrington, Arthur E.; Rodriguez, Moses

    2015-01-01

    DNA aptamer oligonucleotides and their protein conjugates show promise as therapeutics in animal models of diseases such as multiple sclerosis. These molecules are large and highly charged, raising questions about their biodistribution and pharmacokinetics in mammals. Here we exploit the power of quantitative polymerase chain reaction to accurately quantitate the tissue distribution of 40-nucleotide DNA aptamers and their streptavidin conjugates after intraperitoneal injection in mice. We show remarkably rapid distribution to peripheral tissues including the central nervous system. Modeling of tissue distribution data reveals the importance of DNA aptamer sequence, 3′ modification, and protein conjugation in enhancing tissue exposure. These data help to interpret the previously observed effectiveness of aptamer conjugates, as opposed to free aptamers, in stimulating central nervous system remyelination in a mouse model of multiple sclerosis. PMID:25536292

  20. Quantitative PCR analysis of DNA aptamer pharmacokinetics in mice.

    PubMed

    Perschbacher, Katherine; Smestad, John A; Peters, Justin P; Standiford, Miranda M; Denic, Aleksandar; Wootla, Bharath; Warrington, Arthur E; Rodriguez, Moses; Maher, L James

    2015-02-01

    DNA aptamer oligonucleotides and their protein conjugates show promise as therapeutics in animal models of diseases such as multiple sclerosis. These molecules are large and highly charged, raising questions about their biodistribution and pharmacokinetics in mammals. Here we exploit the power of quantitative polymerase chain reaction to accurately quantitate the tissue distribution of 40-nucleotide DNA aptamers and their streptavidin conjugates after intraperitoneal injection in mice. We show remarkably rapid distribution to peripheral tissues including the central nervous system. Modeling of tissue distribution data reveals the importance of DNA aptamer sequence, 3' modification, and protein conjugation in enhancing tissue exposure. These data help to interpret the previously observed effectiveness of aptamer conjugates, as opposed to free aptamers, in stimulating central nervous system remyelination in a mouse model of multiple sclerosis. PMID:25536292

  1. Quantitative sectioning and noise analysis for structured illumination microscopy

    PubMed Central

    Hagen, Nathan; Gao, Liang; Tkaczyk, Tomasz S.

    2011-01-01

    Structured illumination (SI) has long been regarded as a nonquantitative technique for obtaining sectioned microscopic images. Its lack of quantitative results has restricted the use of SI sectioning to qualitative imaging experiments, and has also limited researchers’ ability to compare SI against competing sectioning methods such as confocal microscopy. We show how to modify the standard SI sectioning algorithm to make the technique quantitative, and provide formulas for calculating the noise in the sectioned images. The results indicate that, for an illumination source providing the same spatially-integrated photon flux at the object plane, and for the same effective slice thicknesses, SI sectioning can provide higher SNR images than confocal microscopy for an equivalent setup when the modulation contrast exceeds about 0.09. PMID:22274364
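
    The standard SI sectioning algorithm referenced above combines three images taken with the illumination grid phase-shifted by 2π/3: the unmodulated widefield background cancels and only in-focus, modulated light survives. The sketch below shows the common square-law (root-sum-of-squares) estimator on per-pixel lists; the paper's quantitative corrections and noise formulas are not reproduced here.

```python
import math

def si_section(i1, i2, i3):
    """Square-law sectioning from three grid images phase-shifted by 2*pi/3.
    The constant (out-of-focus) component cancels in the pairwise
    differences; the result recovers the in-focus modulation amplitude."""
    return [
        (math.sqrt(2.0) / 3.0) * math.sqrt((a - b) ** 2 + (a - c) ** 2 + (b - c) ** 2)
        for a, b, c in zip(i1, i2, i3)
    ]

# Two pixels: one in focus (modulated, amplitude 3 on offset 10),
# one out of focus (constant 10) -- phases 0, 2pi/3, 4pi/3.
phases = [0.0, 2 * math.pi / 3, 4 * math.pi / 3]
imgs = [[10 + 3 * math.cos(p), 10.0] for p in phases]
sectioned = si_section(*imgs)
```

The in-focus pixel yields its modulation amplitude (3) while the out-of-focus pixel is rejected to zero, which is exactly the sectioning behavior the abstract quantifies.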

  2. Quantitative analysis of HSV gene expression during lytic infection

    PubMed Central

    Turner, Anne-Marie W.; Arbuckle, Jesse H.; Kristie, Thomas M.

    2014-01-01

    Herpes Simplex Virus (HSV) is a human pathogen that establishes latency and undergoes periodic reactivation, resulting in chronic recurrent lytic infection. HSV lytic infection is characterized by an organized cascade of three gene classes; however, successful transcription and expression of the first, the immediate-early class, is critical to the overall success of viral infection. This initial event of lytic infection is also highly dependent on host cell factors. This unit uses RNA interference and small-molecule inhibitors to examine the role of host and viral proteins in HSV lytic infection. Methods detailing isolation of viral and host RNA and genomic DNA, followed by quantitative real-time PCR, allow characterization of effects on viral transcription and replication, respectively. Western blot can be used to confirm quantitative PCR results. This combination of protocols represents a starting point for researchers interested in virus-host interactions during HSV lytic infection. PMID:25367270

  3. Quantitative spectroscopic analysis of and distance to SN1999em

    NASA Astrophysics Data System (ADS)

    Dessart, L.; Hillier, D. J.

    2006-02-01

    Multi-epoch multi-wavelength spectroscopic observations of photospheric-phase type II supernovae (SN) provide information on massive-star progenitor properties, the core-collapse mechanism, and distances in the Universe. Following successes of recent endeavors (Dessart & Hillier 2005a, A&A, 437, 667; 2005b, A&A, 439, 671) with the non-LTE model atmosphere code CMFGEN (Hillier & Miller 1998, ApJ, 496, 407), we present a detailed quantitative spectroscopic analysis of the type II SN1999em and, using the Expanding Photosphere Method (EPM) or synthetic fits to observed spectra, à la Baron et al. (2004, ApJ, 616, 91), we estimate its distance. Selecting eight epochs, which cover the first 38 days after discovery, we obtain satisfactory fits to optical spectroscopic observations of SN1999em (including the UV and near-IR ranges when available). We use the same iron-group metal content for the ejecta, the same power-law density distribution (with exponent n = 10-12), and a Hubble-velocity law at all times. We adopt a H/He/C/N/O abundance pattern compatible with CNO-cycle equilibrium values for a RSG/BSG progenitor, with C/O enhanced and N depleted at later times. The overall evolution of the spectral energy distribution, whose peak shifts to longer wavelengths as time progresses, reflects the steady temperature/ionization-level decrease of the ejecta, associated non-linearly with a dramatic shift to ions with stronger line-blocking powers in the UV and optical (Fe ii, Ti ii). In the parameter space investigated, CMFGEN is very sensitive and provides photospheric temperatures and velocities, reddenings, and the H/He abundance ratio with an accuracy of ±500 K, ±10%, 0.05 and 50%, respectively. Following Leonard et al. (2002, PASP, 114, 35), and their use of correction factors from Hamuy et al. (2001, ApJ, 558, 615), we estimate an EPM distance to SN1999em that also falls 30% short of the Cepheid distance of 11.7 Mpc to its host galaxy NGC 1637 (Leonard et al. 
2003, ApJ, 594, 247). However, using the systematically higher correction factors of Dessart & Hillier (2005b) removes the discrepancy. A significant scatter, arising primarily from errors in the correction factors and derived temperatures, is seen in distances derived using different band passes. However, adopting both correction factors and corresponding color-temperatures from tailored models to each observation leads to a good agreement between distance estimates obtained from different band passes. The need for detailed model computations thus defeats the appeal and simplicity of the original EPM method, which uses tabulated correction factors and broadband fluxes, for distance determinations. However, detailed fits to SN optical spectra, based on tailored models for individual SN observations, offer a promising approach to obtaining accurate distances, either through the EPM or via the technique of Baron et al. (2004). Our best distance estimate to SN1999em is 11.5 ± 1.0 Mpc. We note that achieving 10-20% accuracy in such distance estimates requires multiple observations, covering preferentially a range of early epochs preceding the hydrogen-recombination phase.

  4. Fluorescent microscopy approaches of quantitative soil microbial analysis

    NASA Astrophysics Data System (ADS)

    Ivanov, Konstantin; Polyanskaya, Lubov

    2015-04-01

    Classical fluorescent microscopy has been used over recent decades in many microbiological studies of terrestrial ecosystems. The method provides representative results and is simple to apply, which makes it suitable both as a routine part of large-scale research and for small laboratories. Furthermore, many modifications of the fluorescent microscopy method have been established, depending on the research target. Combining and comparing several approaches offers an opportunity for quantitative estimation of the soil microbial community. The first analytical part of the study was dedicated to estimating soil bacterial density by fluorescent microscopy over the course of several 30-day experiments. The purpose of the research was to estimate changes in the soil bacterial community in different soil horizons under aerobic and anaerobic conditions after adding nutrients in two experimental sets: cellulose and chitin. The nalidixic acid method, which inhibits DNA division in gram-negative bacteria, was modified so that this bacterial group could be quantified by fluorescent microscopy. The established approach detected 3-4 times more gram-negative bacterial cells in soil. The role of actinomycetes in the destruction of soil polymers is traditionally considered dominant compared with that of gram-negative bacteria. However, quantification of gram-negative bacteria in chernozem and peatland showed that the classical view underestimates this bacterial group. Chitin introduction had no positive effect on gram-negative bacterial population density in chernozem, but this nutrient produced fast growth dynamics during the first 3 days of the experiment under both aerobic and anaerobic conditions, confirming the chitinolytic activity of gram-negative bacteria in the decomposition of soil organic matter.
In the next part of the research, the modified method for quantifying soil gram-negative bacteria was compared with fluorescence in situ hybridization (FISH). This approach was used to evaluate the contribution of each gram-negative bacterial group. No significant difference between the main soil gram-negative bacterial groups (phyla Proteobacteria and Bacteroidetes) was found in the chernozem topsoil under either aerobic or anaerobic conditions. Thus, soil gram-negative bacteria as a group play an important ecological role in natural polymer degradation. Another approach, using a cascade filtration technique to estimate bacterial population density in chernozem, was compared with the classical fluorescent microscopy method. Quantification of soil bacteria by cascade filtration uses filters of different pore diameters and filtration of a fixed amount of soil suspension. Compared with the classical fluorescent microscopy method, the filtration modification quantified more bacterial cells. Thus, biomass estimates of soil bacteria obtained by classical fluorescent microscopy alone may be underestimates, and combining it with the cascade filtration technique helps avoid this potential experimental error. Overall, the combination and comparison of several modifications of fluorescent microscopy established during this research provide complementary approaches to quantifying soil bacteria and analyzing the ecological roles of soil microorganisms.

  5. Quantitative analysis of the human T cell palmitome

    PubMed Central

    Morrison, Eliot; Kuropka, Benno; Kliche, Stefanie; Brügger, Britta; Krause, Eberhard; Freund, Christian

    2015-01-01

    Palmitoylation is a reversible post-translational modification used to inducibly compartmentalize proteins in cellular membranes, affecting the function of receptors and intracellular signaling proteins. The identification of protein “palmitomes” in several cell lines raises the question of the extent to which this modification is conserved in primary cells. Here we use primary T cells with acyl-biotin exchange and quantitative mass spectrometry to identify a pool of proteins previously unreported as palmitoylated in vivo. PMID:26111759

  6. Semi-quantitative spectrographic analysis and rank correlation in geochemistry

    USGS Publications Warehouse

    Flanagan, F.J.

    1957-01-01

    The rank correlation coefficient, rs, which involves less computation than the product-moment correlation coefficient, r, can be used to indicate the degree of relationship between two elements. The method is applicable in situations where the assumptions underlying normal distribution correlation theory may not be satisfied. Semi-quantitative spectrographic analyses which are reported as grouped or partly ranked data can be used to calculate rank correlations between elements. ?? 1957.
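
    The rank correlation coefficient referred to here is Spearman's r_s = 1 - 6·Σd²/(n(n²-1)), computed on ranks rather than raw values; grouped semi-quantitative spectrographic data produce ties, which are conventionally assigned average ranks. A small illustrative implementation (note that the d² formula is exact only in the absence of ties; with many ties, Pearson's r on the ranks is preferred):

```python
def average_ranks(values):
    """1-based ranks with ties assigned the average rank, as is
    conventional for grouped semi-quantitative data."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """r_s = 1 - 6*sum(d^2) / (n*(n^2 - 1)) on (tie-averaged) ranks."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

rs = spearman([1, 2, 3, 4, 5], [2, 4, 6, 8, 10])  # perfectly monotone pair
```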

  7. An Inexpensive Electrodeposition Device and Its Use in a Quantitative Analysis Laboratory Exercise

    ERIC Educational Resources Information Center

    Parker, Richard H.

    2011-01-01

    An experimental procedure, using an apparatus that is easy to construct, was developed to incorporate a quantitative electrogravimetric determination of the solution nickel content into an undergraduate or advanced high school quantitative analysis laboratory. This procedure produces results comparable to the procedure used for the gravimetric…

  9. Software for quantitative analysis of radiotherapy: overview, requirement analysis and design solutions.

    PubMed

    Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O

    2013-06-01

    Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of the treatment and support the prediction of outcome. In this paper, we first identify functional, conceptual and general requirements on a software system for quantitative analysis of radiotherapy. Further, we present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptual problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via dose iterator pattern; analysis database design). As a proof of concept we developed a software library "RTToolbox" following the presented design principles. The RTToolbox is available as an open source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles. PMID:23523366
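
    The "dose iterator pattern" named above decouples analysis algorithms from how the dose grid is stored. As a purely hypothetical illustration (not the actual RTToolbox API), any analysis that consumes doses through a plain iterator works unchanged whatever the underlying storage:

```python
# Hypothetical sketch of the dose-iterator idea: DVH-style statistics
# never touch the grid's internal layout, only its iterator.
class FlatDoseGrid:
    """Dose grid stored as a flat list; other layouts only need __iter__."""
    def __init__(self, doses):
        self._doses = list(doses)

    def __iter__(self):  # the decoupling point
        return iter(self._doses)

def mean_dose(dose_iterable):
    """Mean dose over all voxels, using only iteration."""
    total, n = 0.0, 0
    for d in dose_iterable:
        total += d
        n += 1
    return total / n

def volume_at_least(dose_iterable, threshold):
    """Fraction of voxels receiving >= threshold (a single DVH point)."""
    doses = list(dose_iterable)
    return sum(d >= threshold for d in doses) / len(doses)

grid = FlatDoseGrid([0.0, 1.0, 2.0, 3.0])  # toy 4-voxel grid
md = mean_dose(grid)
v2 = volume_at_least(grid, 2.0)
```

Swapping in a sparse or tiled grid class with the same `__iter__` contract would leave both analysis functions untouched, which is the point of the pattern.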

  10. Quantitative analysis of a wind energy conversion model

    NASA Astrophysics Data System (ADS)

    Zucker, Florian; Gräbner, Anna; Strunz, Andreas; Meyn, Jan-Peter

    2015-03-01

    A rotor of 12 cm diameter is attached to a precision electric motor, used as a generator, to make a model wind turbine. Output power of the generator is measured in a wind tunnel with up to 15 m s^-1 air velocity. The maximum power is 3.4 W, and the power conversion factor from kinetic to electric energy is cp = 0.15. The v^3 power law is confirmed. The model illustrates several technically important features of industrial wind turbines quantitatively.
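
    The quoted conversion factor follows from the wind-power relation P = ½·c_p·ρ·A·v³ with A the rotor disc area. A quick check using the abstract's own figures (air density assumed to be about 1.2 kg/m³) reproduces c_p ≈ 0.15:

```python
import math

def power_coefficient(p_elec, rho, diameter, v):
    """c_p = P_elec / (0.5 * rho * A * v^3), with A = pi * d^2 / 4."""
    area = math.pi * diameter ** 2 / 4
    return p_elec / (0.5 * rho * area * v ** 3)

# Figures from the abstract: 12 cm rotor, 3.4 W at 15 m/s; rho is assumed.
cp = power_coefficient(3.4, 1.2, 0.12, 15.0)
```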

  11. Electrospray Mass Spectrometry for Quantitative Plasma Proteome Analysis

    PubMed Central

    Wang, Hong; Hanash, Sam

    2015-01-01

    Electrospray ionization mass spectrometry (ESI-MS) is an efficient soft-ionization procedure for large biomolecules. However, it is a rather delicate process to produce charged molecules for mass-to-charge ratio (m/z) based measurement. In this chapter, the mechanism of ESI is briefly presented, and the experimental pipeline for quantitative profiling of plasma proteins (immunodepletion prefractionation, protein isotope tagging, 2D-HPLC separation of intact proteins, and LC-MS) is presented as applied by our group in studies of cancer biomarker discovery. PMID:19544026

  12. Radionuclide quantitation of left-to-right cardiac shunts using deconvolution analysis: concise communication

    SciTech Connect

    Ham, H.R.; Dobbeleir, A.; Virat, P.; Piepsz, A.; Lenaers, A.

    1981-08-01

    Quantitative radionuclide angiocardiography (QRAC) was performed with and without deconvolution analysis (DA) in 87 children with various heart disorders. QRAC shunt quantitation was possible without DA in 70% of the cases and with DA in 95%. Among 21 patients with prolonged bolus injections, quantitation of the shunt was possible in 52% of the cases without DA and in all cases with DA. Correlation between oximetry and QRAC with DA was better than between oximetry and QRAC without DA. It is concluded that QRAC with DA is a more reliable, noninvasive means for detection and quantitation of left-to-right cardiac shunts than QRAC without DA.
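
    Deconvolution analysis of this kind treats the observed time-activity curve as the convolution of the (possibly prolonged) bolus input with the system response, and recovers the response numerically so the result no longer depends on injection quality. A noise-free illustrative sketch using recursive division (clinical data would be filtered first; curves below are hypothetical):

```python
def convolve(x, y):
    """Discrete convolution of two sequences."""
    out = [0.0] * (len(x) + len(y) - 1)
    for i, xi in enumerate(x):
        for j, yj in enumerate(y):
            out[i + j] += xi * yj
    return out

def deconvolve(observed, bolus):
    """Recover h from observed = bolus (*) h by recursive division.
    Noise-free sketch only: each h[n] is solved from the convolution sum
    using the already-recovered earlier samples."""
    h = []
    for n in range(len(observed)):
        acc = sum(bolus[k] * h[n - k]
                  for k in range(1, min(n, len(bolus) - 1) + 1))
        h.append((observed[n] - acc) / bolus[0])
    return h

bolus = [1.0, 1.0]           # prolonged (non-ideal) injection profile
response = [2.0, 1.0, 0.0]   # hypothetical time-activity response
observed = convolve(bolus, response)
recovered = deconvolve(observed, bolus)
```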

  13. Aqua satellite orbiting the Earth - Duration: 116 seconds.

    NASA Video Gallery

    This animation shows the Aqua satellite orbiting the Earth on August 27, 2005 by revealing MODIS true-color imagery for that day. This animation is on a cartesian map projection, so the satellite w...

  14. Calibration Adjustments to the MODIS Aqua Ocean Color Bands

    NASA Technical Reports Server (NTRS)

    Meister, Gerhard

    2012-01-01

    After the end of the SeaWiFS mission in 2010 and the MERIS mission in 2012, the ocean color products of the MODIS on Aqua are the only remaining source to continue the ocean color climate data record until the VIIRS ocean color products become operational (expected for summer 2013). The MODIS on Aqua is well beyond its expected lifetime, and the calibration accuracy of the short wavelengths (412 nm and 443 nm) has deteriorated in recent years. Initially, SeaWiFS data were used to improve the MODIS Aqua calibration, but this solution was not applicable after the end of the SeaWiFS mission. In 2012, a new calibration methodology was applied by the MODIS calibration and support team using desert sites to improve the degradation trending. This presentation presents further improvements to this new approach. The 2012 reprocessing of the MODIS Aqua ocean color products is based on the new methodology.

  15. Building No. 905, showing typical aqua medias or rain hoods ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Building No. 905, showing typical aqua medias or rain hoods - Presidio of San Francisco, Enlisted Men's Barracks Type, West end of Crissy Field, between Pearce & Maudlin Streets, San Francisco, San Francisco County, CA

  16. ES8 Aqua-FM3 Ed3

    Atmospheric Science Data Center

    2016-02-10

    ... Detailed CERES ERBE-like Level 2 (ES-8) Product Information. Collection Guide: ES8 CG R3V4 (PDF) ... (Edition3 for Terra and Aqua; Edition2 for TRMM) are approved for science publications. SCAR-B Block: ...

  17. QUANTITATIVE ANALYSIS OF PERIODONTAL PATHOGENS IN PERIODONTITIS AND GINGIVITIS.

    PubMed

    Scapoli, L; Girardi, A; Palmieri, A; Martinelli, M; Cura, F; Lauritano, D; Carinci, F

    2015-01-01

    Periodontal tissues surround the teeth and provide their attachment. Periodontal diseases include a mild and reversible form named gingivitis, and periodontitis, which is the main cause of tooth loss in adults. Gingivitis, which affects the gums and the coronal junctional epithelium, and periodontitis, which is characterized by loss of connective tissue attachment, are caused by a persistent inflammatory response promoted by alteration of the periodontal biofilm. The aim of the study was to test whether the prevalence or relative amount of each species was associated with a particular clinical condition. Periodontal evaluation of 539 unrelated patients was performed by the Periodontal Screening and Recording (PSR) system. Subgingival samples were obtained from the site with the worst PSR score. A selection of eleven bacterial species was evaluated by quantitative real-time PCR. Some bacterial species were found to be associated with all phases of periodontal disease, such as Tannerella forsythia, Treponema denticola, and Treponema lecithinolyticum, while other species were more specifically associated with periodontitis, such as Porphyromonas endodontalis and Porphyromonas gingivalis, or with gingivitis, such as Capnocytophaga ochracea and Campylobacter rectus. Quantitative and qualitative analyses help to better understand the microbial changes associated with different stages of periodontal disease. PMID:26511188

  18. A Quantitative Proteomic Analysis of In Vitro Assembled Chromatin.

    PubMed

    Völker-Albert, Moritz Carl; Pusch, Miriam Caroline; Fedisch, Andreas; Schilcher, Pierre; Schmidt, Andreas; Imhof, Axel

    2016-03-01

    The structure of chromatin is critical for many aspects of cellular physiology and is considered to be the primary medium for storing epigenetic information. It is defined by the histone molecules that constitute the nucleosome, the positioning of the nucleosomes along the DNA, and the non-histone proteins that associate with it. These factors help to establish and maintain a largely DNA sequence-independent but surprisingly stable structure. Chromatin is extensively disassembled and reassembled during DNA replication, repair, recombination or transcription in order to allow the necessary factors to gain access to their substrate. Despite such constant interference with chromatin structure, the epigenetic information is generally well maintained. Surprisingly, the mechanisms that coordinate chromatin assembly and ensure its fidelity are not particularly well understood. Here, we use label-free quantitative mass spectrometry to describe the kinetics of in vitro assembled chromatin supported by an embryo extract prepared from preblastoderm Drosophila melanogaster embryos. The use of a data-independent acquisition method for proteome-wide quantitation allows a time-resolved comparison of in vitro chromatin assembly. A comparison of our in vitro data with proteomic studies of replicative chromatin assembly in vivo reveals an extensive overlap, showing that the in vitro system can be used for investigating the kinetics of chromatin assembly in a proteome-wide manner. PMID:26811354

  19. Quantitative phenotypic analysis of multistress response in Zygosaccharomyces rouxii complex.

    PubMed

    Solieri, Lisa; Dakal, Tikam C; Bicciato, Silvio

    2014-06-01

    Zygosaccharomyces rouxii complex comprises three yeasts clusters sourced from sugar- and salt-rich environments: haploid Zygosaccharomyces rouxii, diploid Zygosaccharomyces sapae and allodiploid/aneuploid strains of uncertain taxonomic affiliations. These yeasts have been characterized with respect to gene copy number variation, karyotype variability and change in ploidy, but functional diversity in stress responses has not been explored yet. Here, we quantitatively analysed the stress response variation in seven strains of the Z. rouxii complex by modelling growth variables via model and model-free fitting methods. Based on the spline fit as most reliable modelling method, we resolved different interstrain responses to 15 environmental perturbations. Compared with Z. rouxii CBS 732(T) and Z. sapae strains ABT301(T) and ABT601, allodiploid strain ATCC 42981 and aneuploid strains CBS 4837 and CBS 4838 displayed higher multistress resistance and better performance in glycerol respiration even in the presence of copper. ?-based logarithmic phenotypic index highlighted that ABT601 is a slow-growing strain insensitive to stress, whereas ABT301(T) grows fast on rich medium and is sensitive to suboptimal conditions. Overall, the differences in stress response could imply different adaptation mechanisms to sugar- and salt-rich niches. The obtained phenotypic profiling contributes to provide quantitative insights for elucidating the adaptive mechanisms to stress in halo- and osmo-tolerant Zygosaccharomyces yeasts. PMID:24533625

  20. Quantitative analysis of biopolymers by matrix-assisted laser desorption

    SciTech Connect

    Tang, K.; Allman, S.L.; Jones, R.B.; Chen, C.H.

    1993-08-01

    During the past few years, major efforts have been made to use mass spectrometry to measure biopolymers because of the great potential benefit to biological and medical research. Although the theoretical details of laser desorption and ionization mechanisms of MALDI are not yet fully understood, several models have been presented to explain the production of large biopolymer ions. In brief, it is very difficult to obtain reliable measurements of the absolute quantity of analytes by MALDI. If MALDI is going to become a routine analytical tool, it is obvious that quantitative measurement capability must be pursued. Oligonucleotides and protein samples used in this work were purchased from commercial sources. Nicotinic acid was used as matrix for both types of biopolymers. From this experiment, it is seen that it is difficult to obtain absolute quantitative measurements of biopolymers using MALDI. However, internal calibration with molecules having similar chemical properties can be used to resolve these difficulties. Chemical reactions between biopolymers must be avoided to prevent the destruction of the analyte materials. 10 refs., 8 figs.
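
    The internal-calibration strategy described above infers the analyte amount from its signal ratio to a co-analyzed internal standard of known quantity, with a response factor absorbing differences in desorption/ionization efficiency. A minimal sketch with hypothetical numbers (not the authors' actual calibration procedure):

```python
def quantify_with_internal_standard(signal_analyte, signal_is,
                                    amount_is, response_factor=1.0):
    """Analyte amount from the signal ratio to an internal standard of
    known amount; response_factor corrects for unequal per-mole response
    (1.0 means the standard responds identically to the analyte)."""
    return (signal_analyte / signal_is) * amount_is / response_factor

# Hypothetical: analyte peak 1500 counts, internal standard (10 pmol)
# peak 1000 counts, assumed equal response.
amount = quantify_with_internal_standard(1500.0, 1000.0, 10.0)
```

The response factor itself would be determined beforehand from mixtures of known composition, which is why the standard must be chemically similar to the analyte.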

  1. Qualitative and quantitative analysis of volatile constituents from latrines.

    PubMed

    Lin, Jianming; Aoll, Jackline; Niclass, Yvan; Velazco, Maria Inés; Wünsche, Laurent; Pika, Jana; Starkenmann, Christian

    2013-07-16

    More than 2.5 billion people defecate in the open. The increased commitment of private and public organizations to improving this situation is driving the research and development of new technologies for toilets and latrines. Although key technical aspects are considered by researchers when designing new technologies for developing countries, the basic aspect of offending malodors from human waste is often neglected. With the objective of contributing to technical solutions that are acceptable to global consumers, we investigated the chemical composition of latrine malodors sampled in Africa and India. Field latrines in four countries were evaluated olfactively and the odors qualitatively and quantitatively characterized with three analytical techniques. Sulfur compounds including H2S, methyl mercaptan, and dimethyl mono-, di-, and trisulfide are important in sewage-like odors of pit latrines under anaerobic conditions. Under aerobic conditions, in Nairobi for example, paracresol and indole reached concentrations of 89 and 65 µg/g, respectively, which, along with short-chain fatty acids such as butyric acid (13 mg/g), explained the strong rancid, manure and farm yard odor. This work represents the first qualitative and quantitative study of volatile compounds sampled from seven pit latrines in a variety of geographic, technical, and economic contexts in addition to three single stools from India and a pit latrine model system. PMID:23829328

  2. Quantitative interpretation of mineral hyperspectral images based on principal component analysis and independent component analysis methods.

    PubMed

    Jiang, Xiping; Jiang, Yu; Wu, Fang; Wu, Fenghuang

    2014-01-01

    Interpretation of mineral hyperspectral images provides large amounts of high-dimensional data, which is often complicated by mixed pixels. The quantitative interpretation of hyperspectral images is known to be extremely difficult when three types of information are unknown, namely, the number of pure pixels, the spectrum of pure pixels, and the mixing matrix. The problem is made even more complex by the disturbance of noise. The key to interpreting abstract mineral component information, i.e., pixel unmixing and abundance inversion, is how to effectively reduce noise, dimension, and redundancy. A three-step procedure is developed in this study for quantitative interpretation of hyperspectral images. First, the principal component analysis (PCA) method can be used to process the pixel spectrum matrix and keep characteristic vectors with larger eigenvalues. This can effectively reduce the noise and redundancy, which facilitates the abstraction of major component information. Second, the independent component analysis (ICA) method can be used to identify and unmix the pixels based on the linear mixed model. Third, the pure-pixel spectrums can be normalized for abundance inversion, which gives the abundance of each pure pixel. In numerical experiments, both simulation data and actual data were used to demonstrate the performance of our three-step procedure. Under simulation data, the results of our procedure were compared with theoretical values. Under the actual data measured from core hyperspectral images, the results obtained through our algorithm are compared with those of similar software (Mineral Spectral Analysis 1.0, Nanjing Institute of Geology and Mineral Resources). The comparisons show that our method is effective and can provide reference for quantitative interpretation of hyperspectral images. PMID:24694708
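
    The full pipeline above uses PCA for noise and dimension reduction and ICA for unmixing. As a much-reduced illustration of the final abundance-inversion step only, the linear mixing model for a pixel with two known endmember spectra has a closed-form least-squares solution (the spectra below are hypothetical):

```python
def abundance_two_endmembers(pixel, s1, s2):
    """Least-squares abundance a in the model pixel ~ a*s1 + (1-a)*s2,
    clipped to the physically meaningful range [0, 1]."""
    d = [u - v for u, v in zip(s1, s2)]                       # s1 - s2
    num = sum((p - v) * di for p, v, di in zip(pixel, s2, d)) # (x - s2).d
    den = sum(di * di for di in d)                            # |d|^2
    a = num / den
    return max(0.0, min(1.0, a))

# Hypothetical 3-band endmember spectra and a 30/70 mixed pixel
s1 = [0.9, 0.8, 0.1]
s2 = [0.1, 0.2, 0.7]
mixed = [0.3 * u + 0.7 * v for u, v in zip(s1, s2)]
a = abundance_two_endmembers(mixed, s1, s2)
```

With more endmembers this becomes a constrained least-squares problem, which is where the PCA/ICA preprocessing of the paper earns its keep.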

  3. Improvements to direct quantitative analysis of multiple microRNAs facilitating faster analysis.

    PubMed

    Ghasemi, Farhad; Wegman, David W; Kanoatov, Mirzo; Yang, Burton B; Liu, Stanley K; Yousef, George M; Krylov, Sergey N

    2013-11-01

    Studies suggest that patterns of deregulation in sets of microRNA (miRNA) can be used as cancer diagnostic and prognostic biomarkers. Establishing a "miRNA fingerprint"-based diagnostic technique requires a suitable miRNA quantitation method. The appropriate method must be direct, sensitive, capable of simultaneous analysis of multiple miRNAs, rapid, and robust. Direct quantitative analysis of multiple microRNAs (DQAMmiR) is a recently introduced capillary electrophoresis-based hybridization assay that satisfies most of these criteria. Previous implementations of the method suffered, however, from slow analysis time and required lengthy and stringent purification of hybridization probes. Here, we introduce a set of critical improvements to DQAMmiR that address these technical limitations. First, we have devised an efficient purification procedure that achieves the required purity of the hybridization probe in a fast and simple fashion. Second, we have optimized the concentrations of the DNA probe to decrease the hybridization time to 10 min. Lastly, we have demonstrated that the increased probe concentrations and decreased incubation time removed the need for masking DNA, further simplifying the method and increasing its robustness. The presented improvements bring DQAMmiR closer to use in a clinical setting. PMID:24127917

  4. Bridging the gaps for global sustainable development: a quantitative analysis.

    PubMed

    Udo, Victor E; Jansson, Peter Mark

    2009-09-01

    Global human progress occurs in a complex web of interactions between society, technology and the environment as driven by governance and infrastructure management capacity among nations. In our globalizing world, this complex web of interactions over the last 200 years has resulted in the chronic widening of economic and political gaps between the haves and the have-nots, with consequential global cultural and ecosystem challenges. At the bottom of these challenges is the issue of resource limitations on our finite planet with an increasing population. The problem is further compounded by pleasure-driven and poverty-driven ecological depletion and pollution by the haves and the have-nots respectively. These challenges are explored quantitatively in this paper as global sustainable development (SD), in order to assess the gaps that need to be bridged. Although there has been significant rhetoric on SD, with many qualitative definitions offered, very few quantitative definitions of SD exist. The few that do exist tend to measure SD in terms of social, energy, economic and environmental dimensions. In our research, we used several human survival, development, and progress variables to create an aggregate SD parameter that describes the capacity of nations in three dimensions: social sustainability, environmental sustainability and technological sustainability. Using our proposed quantitative definition of SD and data from relatively reputable secondary sources, 132 nations were ranked and compared. Our comparisons indicate a global hierarchy of needs among nations similar to Maslow's at the individual level. As in Maslow's hierarchy of needs, nations that are struggling to survive are less concerned with environmental sustainability than advanced and stable nations. 
Nations such as the United States, Canada, Finland, Norway and others have higher SD capacity, and thus, are higher on their hierarchy of needs than nations such as Nigeria, Vietnam, Mexico and other developing nations. To bridge such gaps, we suggest that global public policy for local to global governance and infrastructure management may be necessary. Such global public policy requires holistic development strategies in contrast to the very simplistic north-south, developed-developing nations dichotomies. PMID:19500899
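
    The abstract does not spell out its aggregation scheme. Purely as a hypothetical sketch of how such a three-dimensional capacity index can be built, one common construction min-max normalizes each dimension across nations and then averages the three normalized scores per nation:

```python
def minmax(values):
    """Min-max normalize a list to [0, 1] across nations."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def sd_capacity(social, environmental, technological):
    """Hypothetical aggregate: normalize each dimension across nations,
    then average the three normalized scores per nation."""
    dims = [minmax(social), minmax(environmental), minmax(technological)]
    return [sum(col) / 3 for col in zip(*dims)]

# Three toy nations scored on three dimensions (all numbers invented)
scores = sd_capacity([80, 40, 60], [70, 30, 50], [90, 10, 50])
```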

  5. Quantitative analysis of inhomogeneous luminance effect on visibility of text

    NASA Astrophysics Data System (ADS)

    Haraguchi, Takeshi; Suzuki, Taka-Aki; Okajima, Katsunori

    2009-11-01

    In the present study, we measured the visibility of several types of Japanese text on a liquid crystal display (LCD) with a spatially inhomogeneous luminance and extended the visibility index function (VIF) to explain the current experimental results with a higher degree of accuracy. We quantitatively analyzed the effect of an inhomogeneous luminance, which was produced by the graphical representation of a background without reflected light and by reflected light on a homogeneous background. These results showed that the visibility of text was influenced by the inhomogeneity of the background luminance in a domain that depended on text size. We then applied to the VIF a weighted average background luminance computed with a two-dimensional Gaussian function whose distribution width was related to the text size. Finally, we proposed a modified VIF and showed that the new method was able to precisely estimate the actual visibility of text with an inhomogeneous luminance.
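
    The modification described above replaces the background luminance with a Gaussian-weighted average centered on the text, with the Gaussian width tied to text size. A minimal sketch of that weighting step (the grid, center, and sigma values are hypothetical, and the mapping from text size to sigma is not reproduced here):

```python
import math

def weighted_background_luminance(lum, cx, cy, sigma):
    """Average of a 2-D luminance map weighted by a Gaussian centered on
    the text at (cx, cy); sigma would scale with text size."""
    num = den = 0.0
    for y, row in enumerate(lum):
        for x, l in enumerate(row):
            w = math.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))
            num += w * l
            den += w
    return num / den

# Sanity check: a uniform background must return its own luminance
uniform = [[100.0] * 5 for _ in range(5)]
lw = weighted_background_luminance(uniform, 2, 2, 1.0)
```

On an inhomogeneous map the same call discounts luminance far from the text, which is the behavior the modified VIF needs.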

  6. MICROFLUIDIC PLATFORM FOR THE QUANTITATIVE ANALYSIS OF LEUKOCYTE MIGRATION SIGNATURES

    PubMed Central

    Wong, Elisabeth; Briscoe, David M.; Irimia, Daniel

    2014-01-01

    Leukocyte migration into tissues is characteristic of inflammation. It is usually measured in vitro as the average displacement of populations of cells towards a chemokine gradient, not acknowledging other patterns of cell migration. Here, we designed and validated a microfluidic migration platform to simultaneously analyze four qualitative migration patterns: chemo-attraction, -repulsion, -kinesis and -inhibition, using single-cell quantitative metrics of direction, speed, persistence, and fraction of cells responding. We find that established chemokines C5a and IL-8 induce chemoattraction and repulsion in equal proportions, resulting in the dispersal of cells. These migration signatures are characterized by high persistence and speed and are independent of the chemokine dose or receptor expression. Furthermore, we find that twice as many T-lymphocytes migrate away than towards SDF-1 and their directional migration patterns are not persistent. Overall, our platform characterizes migratory signature responses and uncovers an avenue for precise characterization of leukocyte migration and therapeutic modulators. PMID:25183261

  7. Quantitative analysis on electric dipole energy in Rashba band splitting

    NASA Astrophysics Data System (ADS)

    Hong, Jisook; Rhim, Jun-Won; Kim, Changyoung; Ryong Park, Seung; Hoon Shim, Ji

    2015-09-01

    We report on quantitative comparison between the electric dipole energy and the Rashba band splitting in model systems of Bi and Sb triangular monolayers under a perpendicular electric field. We used both first-principles and tight binding calculations on p-orbitals with spin-orbit coupling. First-principles calculation shows Rashba band splitting in both systems. It also shows asymmetric charge distributions in the Rashba split bands which are induced by the orbital angular momentum. We calculated the electric dipole energies from coupling of the asymmetric charge distribution and external electric field, and compared it to the Rashba splitting. Remarkably, the total split energy is found to come mostly from the difference in the electric dipole energy for both Bi and Sb systems. A perturbative approach for long wave length limit starting from tight binding calculation also supports that the Rashba band splitting originates mostly from the electric dipole energy difference in the strong atomic spin-orbit coupling regime.

  8. Dynamic Quantitative Trait Locus Analysis of Plant Phenomic Data.

    PubMed

    Li, Zitong; Sillanpää, Mikko J

    2015-12-01

    Advanced platforms have recently become available for automatic and systematic quantification of plant growth and development. These new techniques can efficiently produce multiple measurements of phenotypes over time, and introduce time as an extra dimension to quantitative trait locus (QTL) studies. Functional mapping utilizes a class of statistical models for identifying QTLs associated with the growth characteristics of interest. A major benefit of functional mapping is that it integrates information over multiple timepoints, and therefore could increase the statistical power for QTL detection. We review the current development of computationally efficient functional mapping methods which provide invaluable tools for analyzing large-scale timecourse data that are readily available in our post-genome era. PMID:26482958

  9. Copper in silicon: Quantitative analysis of internal and proximity gettering

    SciTech Connect

    McHugo, S.A.; Flink, C.; Weber, E.R.

    1997-08-01

    The behavior of copper in the presence of a proximity gettering mechanism and a standard internal gettering mechanism in silicon was studied. He implantation-induced cavities in the near surface region were used as a proximity gettering mechanism and oxygen precipitates in the bulk of the material provided internal gettering sites. Moderate levels of copper contamination were introduced by ion implantation such that the copper was not supersaturated during the anneals, thus providing realistic copper contamination/gettering conditions. Copper concentrations at cavities and internal gettering sites were quantitatively measured after the annealings. In this manner, the gettering effectiveness of cavities was measured when in direct competition with internal gettering sites. The cavities were found to be the dominant gettering mechanism with only a small amount of copper gettered at the internal gettering sites. These results reveal the benefits of a segregation-type gettering mechanism for typical contamination conditions.

  10. Influence of energy straggling on quantitative PIXE analysis

    NASA Astrophysics Data System (ADS)

    Ruvalcaba, J. L.; Miranda, J.

    1996-04-01

    Examples of numerical calculations aimed at estimating the effect of proton energy straggling on quantitative PIXE are presented. Special attention is paid to low-energy PIXE and external-beam PIXE. X-ray yield calculations including beam energy straggling were carried out for beam energies ranging from 0.3 to 3 MeV, and the X-ray yield variations due to energy straggling were determined. Proton energy straggling was calculated using the uncorrected Bohr formula. K- and L-shell ionization cross-section variations were considered in the X-ray yield calculations. Results indicate that in some cases the error induced in the produced X-ray yield may be up to a few percent in low-energy PIXE. For external-beam PIXE and high-energy PIXE the X-ray yield variations are very small (less than 0.5%).
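
    For reference, the uncorrected Bohr formula mentioned above can be evaluated directly. A minimal sketch, using the standard textbook practical prefactor (0.26 keV² per 10¹⁸ atoms/cm²); that prefactor is my assumption for illustration, not a value taken from the abstract.

```python
import math

def bohr_straggling_fwhm_keV(z1, z2, areal_density_1e18):
    """Bohr (uncorrected) energy-straggling estimate for an ion beam
    traversing a target layer.

    Omega_B^2 = 4*pi*Z1^2*e^4*N*Z2*t, which in practical units reduces to
    Omega_B^2 [keV^2] ~= 0.26 * Z1^2 * Z2 * (N*t / 1e18 atoms/cm^2).
    Returns the FWHM of the energy distribution (= 2.355 * Omega_B).
    The numerical prefactor is the standard textbook value, assumed here."""
    variance_keV2 = 0.26 * z1 ** 2 * z2 * areal_density_1e18
    return 2.355 * math.sqrt(variance_keV2)
```

    For protons (Z1 = 1) on silicon (Z2 = 14) at an areal density of 10¹⁸ atoms/cm², this gives an FWHM of a few keV, small compared with MeV-scale beam energies but not always negligible for shallow ionization cross-section variations.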

  11. Temporal analysis of neural differentiation using quantitative proteomics

    PubMed Central

    Chaerkady, Raghothama; Kerr, Candace L.; Marimuthu, Arivusudar; Kelkar, Dhanashree S.; Kashyap, Manoj Kumar; Gucek, Marjan; Gearhart, John D.; Pandey, Akhilesh

    2009-01-01

    The ability to derive neural progenitors, differentiated neurons and glial cells from human embryonic stem cells (hESCs) with high efficiency holds promise for a number of clinical applications. However, investigating the temporal events is crucial for defining the underlying mechanisms that drive this process of differentiation along different lineages. We carried out quantitative proteomic profiling using a multiplexed approach capable of analyzing eight different samples simultaneously to monitor the temporal dynamics of protein abundance as human embryonic stem cells differentiate into motor neurons or astrocytes. Using this approach, a catalog of ~1,200 proteins along with their relative quantitative expression patterns was generated. The differential expression of the large majority of these proteins has not previously been reported or studied in the context of neural differentiation. As expected, two of the widely used markers of pluripotency - alkaline phosphatase (ALPL) and LIN28 - were found to be downregulated during differentiation while S-100 and tenascin C were upregulated in astrocytes. Neurofilament 3, doublecortin and CaM kinase-like 1, and nestin proteins were upregulated during motor neuron differentiation. We identified a number of proteins whose expression was largely confined to specific cell types - embryonic stem cells, embryoid bodies and differentiating motor neurons. For example, glycogen phosphorylase (PYGL) and fatty acid binding protein 5 (FABP5) were enriched in ESCs while beta spectrin (SPTBN5) was highly expressed in embryoid bodies. Karyopherin, heat shock 27 kDa protein 1 and cellular retinoic acid binding protein 2 (CRABP2) were upregulated in differentiating motor neurons but were downregulated in mature motor neurons. We validated some of the novel markers of the differentiation process using immunoblotting and immunocytochemical labeling. 
To our knowledge, this is the first large scale temporal proteomic profiling of human stem cell differentiation into neural cell types highlighting proteins with limited or undefined roles in neural fate. PMID:19173612

  12. Quantitative error analysis for computer assisted navigation: a feasibility study

    PubMed Central

    Güler, Ö.; Perwög, M.; Kral, F.; Schwarm, F.; Bárdosi, Z. R.; Göbel, G.; Freysinger, W.

    2013-01-01

    Purpose The benefit of computer-assisted navigation depends on the registration process, in which patient features are correlated to preoperative imagery. The operator-induced uncertainty in localizing patient features, the User Localization Error (ULE), is unknown and most likely dominates the application accuracy. This initial feasibility study aims at providing first data for the ULE with a research navigation system. Methods Active optical navigation was done in CT images of a plastic skull, an anatomic specimen (both with implanted fiducials) and a volunteer with anatomical landmarks exclusively. Each object was registered ten times with 3, 5, 7, and 9 registration points. Measurements were taken at 10 targets (anatomic specimen and volunteer) and 11 targets (plastic skull). The active NDI Polaris system was used under ideal working conditions (tracking accuracy 0.23 mm root mean square, RMS; probe tip calibration 0.18 mm RMS). Variances of tracking along the principal directions were measured as 0.18 mm², 0.32 mm², and 0.42 mm². The ULE was calculated from predicted application accuracy with isotropic and anisotropic models and from experimental variances, respectively. Results The ULE was determined from the variances as 0.45 mm (plastic skull), 0.60 mm (anatomic specimen), and 4.96 mm (volunteer). The predicted application accuracy did not yield consistent values for the ULE. Conclusions Quantitative data of application accuracy could be tested against prediction models with iso- and anisotropic noise models and revealed some discrepancies. This could potentially be due to the facts that navigation and one prediction model wrongly assume isotropic noise (tracking is anisotropic), while the anisotropic noise prediction model assumes an anisotropic registration strategy (registration is isotropic in typical navigation systems).
The ULE data are presumably the first quantitative values for the precision of localizing anatomical landmarks and implanted fiducials. Submillimetric localization is possible for implanted screws; anatomic landmarks are not suitable for high-precision clinical navigation. PMID:23387758
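
    One common way to isolate an operator-induced localization error from known system errors is quadrature subtraction of independent variances. A minimal sketch under that assumption; the paper's actual iso-/anisotropic prediction models are more involved than this.

```python
import math

def user_localization_error(observed_fle_mm, tracking_rms_mm, tip_cal_rms_mm):
    """Estimate a User Localization Error by subtracting, in quadrature,
    known system error contributions from the observed fiducial
    localization error. The quadrature model (independent, zero-mean
    error sources) is an assumption for illustration, not the paper's
    exact derivation."""
    system_var = tracking_rms_mm ** 2 + tip_cal_rms_mm ** 2
    ule_var = observed_fle_mm ** 2 - system_var
    if ule_var < 0:
        raise ValueError("observed error smaller than system error budget")
    return math.sqrt(ule_var)
```

    With the tracking (0.23 mm RMS) and tip-calibration (0.18 mm RMS) figures quoted above, an observed localization error just above 0.5 mm already implies a sub-half-millimeter operator contribution of roughly the magnitude the study reports for implanted fiducials.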

  13. Response Neighborhoods in Online Learning Networks: A Quantitative Analysis

    ERIC Educational Resources Information Center

    Aviv, Reuven; Erlich, Zippy; Ravid, Gilad

    2005-01-01

    The theoretical foundations of response mechanisms in networks of online learners are revealed by statistical analysis of p* Markov models for the networks. Our comparative analysis of two networks shows that the minimal-effort hunt-for-social-capital mechanism controls a major behavior of both networks: a negative tendency to respond. Differences in…

  14. Status of Terra and Aqua MODIS Instruments

    NASA Technical Reports Server (NTRS)

    Xiong, Xiaoxiong; Wenny, Brian N.; Kuyper, James; Salomonson, Vincent; Barnes, William

    2008-01-01

    Currently, two nearly identical MODIS instruments are operating in space: one on the Terra spacecraft launched in December 1999 and another on the Aqua spacecraft launched in May 2002. MODIS has 36 spectral bands with wavelengths covering the visible (VIS) to the long-wave infrared (LWIR). Since launch, MODIS observations and data products have contributed significantly to studies of changes in the Earth system of land, oceans, and atmosphere. To maintain its on-orbit calibration and data product quality, MODIS was built with a comprehensive set of on-board calibrators, consisting of a solar diffuser (SD) and a solar diffuser stability monitor (SDSM) for the reflective solar bands (RSB) and an on-board blackbody (BB) for the thermal emissive bands (TEB). Both instruments have demonstrated good performance. The primary Level 1B (L1B) data products are top-of-atmosphere (TOA) reflectance for the RSB and radiance for the TEB. This paper provides an overview of MODIS calibration methodologies, activities, lifetime on-orbit performance and challenging issues for each MODIS, the impact on L1B product quality, and lessons learned for future sensors such as the NPOESS VIIRS.

  15. Quantitative assessment of human motion using video motion analysis

    NASA Technical Reports Server (NTRS)

    Probe, John D.

    1990-01-01

    In the study of the dynamics and kinematics of the human body, a wide variety of technologies was developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development coupled with recent advances in video technology have allowed video based motion analysis systems to emerge as a cost effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video based Ariel Performance Analysis System to develop data on shirt-sleeved and space-suited human performance in order to plan efficient on orbit intravehicular and extravehicular activities. The system is described.

  16. Signature lipid biomarker analysis for the quantitative analysis of environmental microbial ecology

    SciTech Connect

    White, D.C. |; Ringelberg, D.B.

    1996-12-31

    The assessment of microbes and their in situ interactions in various environments has proved to be a difficult task requiring the application of non-traditional methodology. Classical microbiological methods, which were so successful with infectious disease, have severe limitations for the analysis of environmental samples. Pure-culture isolation, biochemical testing and/or enumeration by direct microscopic counting or most probable number (MPN) destroy most of the interactions between the various components within the environment. These disruptive methods, which require isolation, are not well suited for the estimation of total biomass or the assessment of community composition within environmental samples. These classical methods provide little insight into the in situ phenotypic activity of the extant microbiota, since several of these techniques depend on microbial growth and thus select against many environmental microorganisms which are non-culturable under a wide range of conditions. An analysis method was developed for quantitatively determining microbial communities in slimes, muds, soils, bioreactors, and sediments.

  17. Quantitative analysis of pheromone-binding protein specificity

    PubMed Central

    Katti, S.; Lokhande, N.; González, D.; Cassill, A.; Renthal, R.

    2012-01-01

    Many pheromones have very low water solubility, posing experimental difficulties for quantitative binding measurements. A new method is presented for determining thermodynamically valid dissociation constants for ligands binding to odorant-binding proteins (OBPs), using β-cyclodextrin as a solubilizer and transfer agent. The method is applied to LUSH, a Drosophila OBP that binds the pheromone 11-cis vaccenyl acetate (cVA). Refolding of LUSH expressed in E. coli was assessed by measuring N-phenyl-1-naphthylamine (NPN) binding and Förster resonance energy transfer between LUSH tryptophan 123 (W123) and NPN. Binding of cVA was measured from quenching of W123 fluorescence as a function of cVA concentration. The equilibrium constant for transfer of cVA between β-cyclodextrin and LUSH was determined from a linked equilibria model. This constant, multiplied by the β-cyclodextrin-cVA dissociation constant, gives the LUSH-cVA dissociation constant: ~100 nM. It was also found that other ligands quench W123 fluorescence. The LUSH-ligand dissociation constants were determined to be ~200 nM for the silk moth pheromone bombykol and ~90 nM for methyl oleate. The results indicate that the ligand-binding cavity of LUSH can accommodate a variety of ligands with strong binding interactions. Implications of this for the pheromone receptor model proposed by Laughlin et al. (Cell 133: 1255–65, 2008) are discussed. PMID:23121132
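
    The linked-equilibria bookkeeping in the abstract reduces to a single multiplication. The individual constants below are hypothetical, chosen only so the product matches the reported ~100 nM; the abstract does not give the two factors separately.

```python
def protein_ligand_kd(transfer_equilibrium_const, cyclodextrin_kd_nM):
    """Linked-equilibria relation used in the abstract: the dissociation
    constant of the protein-ligand complex equals the equilibrium constant
    for transfer of the ligand from beta-cyclodextrin to the protein,
    multiplied by the cyclodextrin-ligand dissociation constant."""
    return transfer_equilibrium_const * cyclodextrin_kd_nM

# Hypothetical factors chosen so the product matches the reported ~100 nM:
kd = protein_ligand_kd(0.05, 2000.0)  # -> 100.0 nM
```

    The point of the construction is that neither factor requires dissolving the poorly soluble pheromone directly in water: the cyclodextrin carries it, and only the transfer step involves the protein.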

  18. Quantitative XRD analysis of {110} twin density in biotic aragonites.

    PubMed

    Suzuki, Michio; Kim, Hyejin; Mukai, Hiroki; Nagasawa, Hiromichi; Kogure, Toshihiro

    2012-12-01

    {110} Twin densities in biotic aragonite have been estimated quantitatively from the peak widths of specific reflections in powder X-ray diffraction (XRD) patterns, together with direct confirmation of the twins using transmission electron microscopy (TEM). The influence of the twin density on the peak widths in the XRD pattern was simulated using the DIFFaX program, treating the {110} twin as an interstratification of two types of aragonite unit layers with a mirrored relationship. The simulation suggested that the twin density can be estimated from the difference in peak width between the 111 and 021 reflections, or between the 221 and 211 reflections. Biotic aragonite in the crossed-lamellar microstructure (three species) and nacreous microstructure (four species) of molluscan shells, fish otoliths (two species), and a coral were investigated. The XRD analyses indicated that aragonite crystals in the crossed-lamellar microstructure of the three species contain a high density of twins, which is consistent with the TEM examination. On the other hand, aragonite in the nacre of the four species showed almost no difference in peak width between the paired reflections, indicating low twin densities. The results for the fish otoliths varied between the species. Such variation in twin density among biotic aragonites may reflect different schemes of crystal growth in biomineralization. PMID:22989562

  19. Quantitative proteomic analysis of tumor reversion in multiple myeloma cells.

    PubMed

    Ge, Feng; Zhang, Liang; Tao, Sheng-Ce; Kitazato, Kaio; Zhang, Zhi-Ping; Zhang, Xian-En; Bi, Li-Jun

    2011-02-01

    Tumor reversion is defined as the process by which cancer cells lose their malignant phenotype. However, relatively little is known about the cellular proteome changes that occur during the reversion process. A biological model of multiple myeloma (MM) reversion was established by using the H-1 parvovirus as a tool to select for revertant cells from MM cells. Isolated revertant cells displayed a strongly suppressed malignant phenotype both in vitro and in vivo. To explore possible mechanisms of MM reversion, the protein profiles of the revertant and parental MM cells were compared using a quantitative proteomic strategy termed SILAC-MS. Our results revealed that 379 proteins were either activated or inhibited during the reversion process, with a much greater proportion of the proteins, including STAT3, TCTP, CDC2, BAG2, and PCNA, being inhibited. Of these, STAT3, which was significantly downregulated, was selected for further functional studies. Inhibition of STAT3 expression by RNA interference resulted in suppression of the malignant phenotype and concomitant downregulation of TCTP expression, suggesting that myeloma reversion operates, at least in part, through inhibition of STAT3. Our results provide novel insights into the mechanisms of tumor reversion and suggest new alternative approaches for MM treatment. PMID:21080727

  20. Quantitative proteomic analysis of the Salmonella-lettuce interaction

    PubMed Central

    Zhang, Yuping; Nandakumar, Renu; Bartelt-Hunt, Shannon L; Snow, Daniel D; Hodges, Laurie; Li, Xu

    2014-01-01

    Human pathogens can internalize into food crops through root and surface uptake and persist inside crop plants. The goal of the study was to elucidate the global modulation of bacterial and plant protein expression after Salmonella internalizes into lettuce. A quantitative proteomic approach was used to analyse the protein expression of Salmonella enterica serovar Infantis and lettuce cultivar Green Salad Bowl 24 h after infiltrating S. Infantis into lettuce leaves. Among the 50 differentially expressed proteins identified by comparing internalized S. Infantis against S. Infantis grown in Luria broth, proteins involved in glycolysis were down-regulated, while one protein involved in ascorbate uptake was up-regulated. Stress response proteins, especially antioxidant proteins, were up-regulated. The modulation in protein expression suggested that internalized S. Infantis might utilize ascorbate as a carbon source and require multiple stress response proteins to cope with stresses encountered in plants. On the other hand, among the 20 differentially expressed lettuce proteins, proteins involved in defense response to bacteria were up-regulated. Moreover, the secreted effector PipB2 of S. Infantis and R proteins of lettuce were induced after bacterial internalization into lettuce leaves, indicating that the human pathogen S. Infantis triggered the defense mechanisms of lettuce, which normally respond to plant pathogens. PMID:24512637

  1. Quantitative analysis of virus and plasmid trafficking in cells

    NASA Astrophysics Data System (ADS)

    Lagache, Thibault; Dauty, Emmanuel; Holcman, David

    2009-01-01

    Intracellular transport of DNA carriers is a fundamental step of gene delivery. By combining theoretical and numerical approaches, we study here the trafficking of single and multiple viruses and DNA particles through the cell cytoplasm to a small nuclear pore. We present a physical model to account for certain aspects of cellular organization, starting with the observation that a viral trajectory consists of epochs of pure diffusion and epochs of active transport along microtubules. We define a general degradation rate to describe the limitations imposed on the delivery of plasmid or viral particles to a nuclear pore by various types of direct and indirect hydrolysis activity inside the cytoplasm. By replacing the switching dynamics with a single steady-state stochastic description, we obtain estimates for the probability and the mean time for the first of many particles to go from the cell membrane to a small nuclear pore. Computational simulations confirm that our model can be used to analyze and interpret viral trajectories and to estimate quantitatively the success of nuclear delivery.
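
    A toy Monte Carlo of the switching dynamics described above (alternating free diffusion and directed transport, with a degradation rate) can illustrate the probability-and-mean-time quantities the model estimates. The 1D geometry and all parameter values here are illustrative assumptions, not the paper's model.

```python
import math
import random

def simulate_delivery(n_particles=2000, dt=0.01, k_deg=0.05,
                      d_coeff=1.0, v_active=2.0, p_switch=0.1,
                      start=10.0, seed=1):
    """Toy 1D version of the switching model: each particle alternates
    between free diffusion and directed transport toward the nucleus
    (position 0), and may be degraded at rate k_deg while in transit.
    Returns the fraction of particles reaching the nucleus and their
    mean arrival time. All parameters are illustrative assumptions."""
    rng = random.Random(seed)
    arrivals = []
    for _ in range(n_particles):
        x, t, active = start, 0.0, False
        while x > 0.0:
            if rng.random() < k_deg * dt:   # degraded in the cytoplasm
                break
            if rng.random() < p_switch:     # switch transport mode
                active = not active
            if active:
                x -= v_active * dt          # directed motion along a microtubule
            else:
                x += math.sqrt(2 * d_coeff * dt) * rng.gauss(0, 1)  # diffusion
            t += dt
        else:                               # loop exited without break: arrived
            arrivals.append(t)
    p_hit = len(arrivals) / n_particles
    mean_t = sum(arrivals) / len(arrivals) if arrivals else float("nan")
    return p_hit, mean_t
```

    Raising the degradation rate or lowering the fraction of time spent in active transport drives the arrival probability down, which is the competition the paper's steady-state description captures analytically.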

  2. Quantitative analysis of agricultural land use change in China

    NASA Astrophysics Data System (ADS)

    Chou, Jieming; Dong, Wenjie; Wang, Shuyu; Fu, Yuqing

    This article reviews the potential impacts of climate change on land use change in China. Crop sown area is used as an index to quantitatively analyze the temporal-spatial changes and the utilization of agricultural land. A new concept, the potential multiple cropping index, is defined to reflect the potential sowing ability. The impacting mechanisms, land use status and its surplus capacity are investigated as well. The main conclusions are as follows. During 1949-2010, agricultural land was greatest in amount in the middle of China, followed by the country's eastern and western regions. The most rapid increase and decrease of agricultural land were observed in Xinjiang and North China respectively; Northwest China and South China also changed rapidly. The variation trend before 1980 differed significantly from that after 1980. Agricultural land was affected by both natural and social factors, such as regional climate and environmental changes, population growth, economic development, and the implementation of policies. In this paper, the effects of temperature and urbanization on the coverage of agricultural land are evaluated, and the results show that urbanization greatly affects the amount of agricultural land in South China, Northeast China, Xinjiang and Southwest China. From 1980 to 2009, the extent of agricultural land use increased as the surplus capacity decreased. Still, a large remaining potential space is available, but the future utilization of agricultural land should be carried out with scientific planning and management for sustainable development.

  3. A quantitative proteomic analysis of long-term memory

    PubMed Central

    2010-01-01

    Background Memory is the ability to store, retain, and later retrieve learned information. Long-term memory (LTM) formation requires: DNA transcription, RNA translation, and the trafficking of newly synthesized proteins. Several components of these processes have already been identified. However, due to the complexity of the memory formation process, there likely remain many yet to be identified proteins involved in memory formation and persistence. Results Here we use a quantitative proteomic method to identify novel memory-associated proteins in neural tissue taken from animals that were trained in vivo to form a long-term memory. We identified 8 proteins that were significantly up-regulated, and 13 that were significantly down-regulated in the LTM trained animals as compared to two different control groups. In addition we found 19 proteins unique to the trained animals, and 12 unique proteins found only in the control animals. Conclusions These results both confirm the involvement of previously identified memory proteins such as: protein kinase C (PKC), adenylate cyclase (AC), and proteins in the mitogen-activated protein kinase (MAPK) pathway. In addition these results provide novel protein candidates (e.g. UHRF1 binding protein) on which to base future studies. PMID:20331892

  4. Quantitative analysis of ultrasound images for computer-aided diagnosis.

    PubMed

    Wu, Jie Ying; Tuomi, Adam; Beland, Michael D; Konrad, Joseph; Glidden, David; Grand, David; Merck, Derek

    2016-01-01

    We propose an adaptable framework for analyzing ultrasound (US) images quantitatively to provide computer-aided diagnosis using machine learning. Our preliminary clinical targets are hepatic steatosis, adenomyosis, and craniosynostosis. For steatosis and adenomyosis, we collected US studies from 288 and 88 patients, respectively, as well as their biopsy- or magnetic resonance-confirmed diagnoses. Radiologists identified a region of interest (ROI) on each image. We filtered the US images for various texture responses and used the pixel intensity distribution within each ROI as feature parameterizations. Our craniosynostosis dataset consisted of 22 CT-confirmed cases and 22 age-matched controls. One physician manually measured the vectors from the center of the skull to the outer cortex at every 10 deg for each image, and we used the principal directions as shape features for parameterization. These parameters and the known diagnoses were used to train classifiers. Testing with cross-validation, we obtained 72.74% accuracy and 0.71 area under the receiver operating characteristic curve for steatosis ([Formula: see text]), 77.27% and 0.77 for adenomyosis ([Formula: see text]), and 88.63% and 0.89 for craniosynostosis ([Formula: see text]). Our framework is able to detect a variety of diseases with high accuracy. We hope to include it as a routinely available support system in the clinic. PMID:26835502
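
    The pipeline described above (ROI feature parameterization, then a cross-validated classifier) can be sketched minimally. The summary-statistic features and the nearest-centroid classifier here are simplified stand-ins I chose for illustration, not the texture-filter features and classifiers actually used in the study.

```python
import numpy as np

def roi_features(roi):
    """Pixel-intensity distribution features for one ROI, in the spirit of
    the abstract (the real feature set, built from texture filter
    responses, is richer). Simple summary statistics as a stand-in."""
    roi = np.asarray(roi, dtype=float).ravel()
    return np.array([roi.mean(), roi.std(),
                     np.percentile(roi, 25), np.percentile(roi, 75)])

def nearest_centroid_loo(features, labels):
    """Leave-one-out accuracy of a nearest-centroid classifier: a minimal
    stand-in for the cross-validated machine-learning step."""
    X, y = np.asarray(features, dtype=float), np.asarray(labels)
    correct = 0
    for i in range(len(X)):
        mask = np.arange(len(X)) != i     # hold out sample i
        cents = {c: X[mask][y[mask] == c].mean(axis=0)
                 for c in np.unique(y[mask])}
        pred = min(cents, key=lambda c: np.linalg.norm(X[i] - cents[c]))
        correct += int(pred == y[i])
    return correct / len(X)
```

    Swapping in richer features (filter-bank responses per ROI) and a stronger classifier changes only the two function bodies; the held-out evaluation loop is the part that corresponds to the cross-validation figures quoted in the abstract.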

  5. Quantitative analysis of pheromone-binding protein specificity.

    PubMed

    Katti, S; Lokhande, N; González, D; Cassill, A; Renthal, R

    2013-02-01

    Many pheromones have very low water solubility, posing experimental difficulties for quantitative binding measurements. A new method is presented for determining thermodynamically valid dissociation constants for ligands binding to pheromone-binding proteins, using β-cyclodextrin as a solubilizer and transfer agent. The method is applied to LUSH, a Drosophila odorant-binding protein that binds the pheromone 11-cis vaccenyl acetate (cVA). Refolding of LUSH expressed in Escherichia coli was assessed by measuring N-phenyl-1-naphthylamine (NPN) binding and Förster resonance energy transfer between LUSH tryptophan 123 (W123) and NPN. Binding of cVA was measured from quenching of W123 fluorescence as a function of cVA concentration. The equilibrium constant for transfer of cVA between β-cyclodextrin and LUSH was determined from a linked equilibria model. This constant, multiplied by the β-cyclodextrin-cVA dissociation constant, gives the LUSH-cVA dissociation constant: ~100 nM. It was also found that other ligands quench W123 fluorescence. The LUSH-ligand dissociation constants were determined to be ~200 nM for the silk moth pheromone bombykol and ~90 nM for methyl oleate. The results indicate that the ligand-binding cavity of LUSH can accommodate a variety of ligands with strong binding interactions. Implications of this for the Laughlin, Ha, Jones and Smith model of pheromone reception are discussed. PMID:23121132

  6. Quantitative Analysis of Synaptic Release at the Photoreceptor Synapse

    PubMed Central

    Duncan, Gabriel; Rabl, Katalin; Gemp, Ian; Heidelberger, Ruth; Thoreson, Wallace B.

    2010-01-01

    Exocytosis from the rod photoreceptor is stimulated by submicromolar Ca2+ and exhibits an unusually shallow dependence on presynaptic Ca2+. To provide a quantitative description of the photoreceptor Ca2+ sensor for exocytosis, we tested a family of conventional and allosteric computational models describing the final Ca2+-binding steps leading to exocytosis. Simulations were fit to two measures of release, evoked by flash-photolysis of caged Ca2+: exocytotic capacitance changes from individual rods and postsynaptic currents of second-order neurons. The best simulations supported the occupancy of only two Ca2+ binding sites on the rod Ca2+ sensor rather than the typical four or five. For most models, the on-rates for Ca2+ binding and maximal fusion rate were comparable to those of other neurons. However, the off-rates for Ca2+ unbinding were unexpectedly slow. In addition to contributing to the high affinity of the photoreceptor Ca2+ sensor, slow Ca2+ unbinding may support the fusion of vesicles located at a distance from Ca2+ channels. In addition, partial sensor occupancy due to slow unbinding may contribute to the linearization of the first synapse in vision. PMID:20483317
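
    A minimal steady-state occupancy model shows why requiring only two bound Ca2+ sites gives a shallower dependence than the typical four or five. Independent identical sites, and the specific kd and r_max values, are assumptions for illustration; the paper fit full kinetic models, not this closed form.

```python
def fusion_rate(ca, kd=0.5e-6, n=2, r_max=1.0):
    """Steady-state occupancy model: fusion requires n Ca2+ binding sites
    to be occupied (the abstract's best fits supported n = 2 rather than
    4-5). Independent identical sites are an assumption; kd and r_max
    are illustrative values, not fitted parameters from the paper."""
    occupancy = ca / (ca + kd)   # fraction of one site bound at [Ca2+] = ca
    return r_max * occupancy ** n
```

    At [Ca2+] = kd each site is half-occupied, so the rate is r_max * 0.5**n; lowering n from 4 to 2 raises that midpoint response 4-fold, which is the sense in which fewer required sites flatten (linearize) the Ca2+ dependence.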

  7. Temporal Kinetics and Quantitative Analysis of Cryptococcus neoformans Nonlytic Exocytosis

    PubMed Central

    Stukes, Sabriya A.; Cohen, Hillel W.

    2014-01-01

    Cryptococcus neoformans is a facultative intracellular pathogen and the causative agent of cryptococcosis, a disease that is often fatal to those with compromised immune systems. C. neoformans has the capacity to escape phagocytic cells through a process known as nonlytic exocytosis whereby the cryptococcal cell is released from the macrophage into the extracellular environment, leaving both the host and pathogen alive. Little is known about the mechanism behind nonlytic exocytosis, but there is evidence that both the fungal and host cells contribute to the process. In this study, we used time-lapse movies of C. neoformans-infected macrophages to delineate the kinetics and quantitative aspects of nonlytic exocytosis. We analyzed approximately 800 macrophages containing intracellular C. neoformans and identified 163 nonlytic exocytosis events that were further characterized into three subcategories: type I (complete emptying of macrophage), type II (partial emptying of macrophage), and type III (cell-to-cell transfer). The majority of type I and II events occurred after several hours of intracellular residence, whereas type III events occurred significantly (P < 0.001) earlier in the course of macrophage infection. Our results show that nonlytic exocytosis is a morphologically and temporally diverse process that occurs relatively rapidly in the course of macrophage infection. PMID:24595144

  8. Quantitative analysis on electric dipole energy in Rashba band splitting

    PubMed Central

    Hong, Jisook; Rhim, Jun-Won; Kim, Changyoung; Ryong Park, Seung; Hoon Shim, Ji

    2015-01-01

    We report on a quantitative comparison between the electric dipole energy and the Rashba band splitting in model systems of Bi and Sb triangular monolayers under a perpendicular electric field. We used both first-principles and tight-binding calculations on p-orbitals with spin-orbit coupling. First-principles calculation shows Rashba band splitting in both systems. It also shows asymmetric charge distributions in the Rashba split bands which are induced by the orbital angular momentum. We calculated the electric dipole energies from coupling of the asymmetric charge distribution and external electric field, and compared them to the Rashba splitting. Remarkably, the total split energy is found to come mostly from the difference in the electric dipole energy for both Bi and Sb systems. A perturbative approach in the long-wavelength limit starting from the tight-binding calculation also supports that the Rashba band splitting originates mostly from the electric dipole energy difference in the strong atomic spin-orbit coupling regime. PMID:26323493

  9. Quantitative MRI analysis of salivary glands in sickle cell disease

    PubMed Central

    Liao, J; Saito, N; Ozonoff, A; Jara, H; Steinberg, M; Sakai, O

    2012-01-01

    Objectives The purpose of this prospective study was to characterize the MR relaxometric features of the major salivary glands in patients with sickle cell disease (SCD). Methods 15 patients with SCD (aged 19.8–43.6 years) and 12 controls were imaged with the mixed turbo-spin echo pulse sequence. The major salivary glands were manually segmented and T1, T2 and secular T2 relaxometry histograms were modelled with Gaussian functions. Results Shortened T1 relaxation times were seen solely in the submandibular glands of patients with SCD (747.5 ± 54.8 ms vs 807.1 ± 38.3 ms, p < 0.001). Slight T2 and secular T2 shortening were seen in the parotid gland; however, this difference was not significant (p = 0.07). The sublingual gland showed no changes under MR relaxometry. There was no difference in glandular volumes, and no correlation was demonstrated between history of blood transfusion and salivary gland relaxometry. Conclusions Patients with SCD exhibited changes in quantitative MRI T1 relaxometry histograms of the submandibular glands. PMID:23166360

  10. Space-to-Ground Communication for Columbus: A Quantitative Analysis.

    PubMed

    Uhlig, Thomas; Mannel, Thurid; Fortunato, Antonio; Illmer, Norbert

    2015-01-01

    The astronauts on board the International Space Station (ISS) are only the most visible part of a much larger team engaged around the clock in the performance of science and technical activities in space. The bulk of this team is scattered around the globe in five major Mission Control Centers (MCCs), as well as in a number of smaller payload operations centres. Communication between the crew in space and the flight controllers at those locations is an essential element and one of the key drivers to efficient space operations. Such communication can be carried out in different forms, depending on available technical assets and the selected operational approach for the activity at hand. This paper focuses on operational voice communication and provides a quantitative overview of the balance achieved in the Columbus program between collaborative space/ground operations and autonomous on-board activity execution. An interpretation of the current situation is provided, together with a description of potential future approaches for deep space exploration missions. PMID:26290898

  11. Quantitative proteomic analysis of amphotericin B resistance in Leishmania infantum

    PubMed Central

    Brotherton, Marie-Christine; Bourassa, Sylvie; Légaré, Danielle; Poirier, Guy G.; Droit, Arnaud; Ouellette, Marc

    2014-01-01

    Amphotericin B (AmB) in its liposomal form is now considered as either first- or second-line treatment against Leishmania infections in different parts of the world. Few cases of AmB resistance have been reported and resistance mechanisms toward AmB are still poorly understood. This paper reports a large-scale comparative proteomic study in the context of AmB resistance. Quantitative proteomics using stable isotope labeling of amino acids in cell culture (SILAC) was used to better characterize cytoplasmic and membrane-enriched (ME) proteomes of the in vitro generated Leishmania infantum AmB-resistant mutant AmB1000.1. In total, 97 individual proteins were found as differentially expressed between the mutant and its parental sensitive strain (WT). More than half of these proteins were either metabolic enzymes or involved in transcription or translation processes. Key energetic pathways such as glycolysis and the TCA cycle were up-regulated in the mutant. Interestingly, many proteins involved in reactive oxygen species (ROS) scavenging and heat-shock proteins were also up-regulated in the resistant mutant. This work provides a basis for further investigations to understand the roles of proteins differentially expressed in relation with AmB resistance. PMID:25057462

  13. Quantitative properties of achromatic color induction: an edge integration analysis.

    PubMed

    Rudd, Michael E; Zemach, Iris K

    2004-05-01

    Edge integration refers to a hypothetical process by which the visual system combines information about the local contrast, or luminance ratios, at luminance borders within an image to compute a scale of relative reflectances for the regions between the borders. The results of three achromatic color matching experiments, in which a test disk and a matching disk were each surrounded by one or more rings of varying luminance, were analyzed in terms of three alternative quantitative edge integration models: (1) a generalized Retinex algorithm, in which achromatic color is computed from a weighted sum of log luminance ratios, with weights free to vary as a function of distance from the test (Weighted Log Luminance Ratio model); (2) an elaboration of the first model, in which the weights given to distant edges are reduced by a percentage that depends on the log luminance ratios of borders lying between the distant edges and the target (Weighted Log Luminance Ratio model with Blockage); and (3) an alternative modification of the first model, in which Michelson contrasts are substituted for log luminance ratios in the achromatic color computation (Weighted Michelson Contrast model). The experimental results support the Weighted Log Luminance Ratio model over the other two edge integration models. The Weighted Log Luminance Ratio model is also shown to provide a better fit to the achromatic color matching data than does Wallach's Ratio Rule, which states that the two disks will match in achromatic color when their respective disk/ring luminance ratios are equal. PMID:15031090
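Model (1) above reduces to a short computation. The sketch below is a hypothetical minimal form of the Weighted Log Luminance Ratio model, with the distance-dependent weights supplied by the caller rather than fitted to matching data as in the study:

```python
import math

def weighted_log_ratio_lightness(luminances, weights):
    """Weighted Log Luminance Ratio model (sketch): the achromatic color of
    the target region is a weighted sum of log luminance ratios at the edges
    separating it from successively more distant rings. `luminances` is
    ordered from the target outward; weights[i] applies to the edge between
    region i and region i+1 and typically declines with distance."""
    if len(weights) != len(luminances) - 1:
        raise ValueError("need exactly one weight per edge")
    return sum(w * math.log(luminances[i] / luminances[i + 1])
               for i, w in enumerate(weights))
```

With all weights equal, the sum telescopes to the log ratio of the target against the outermost region, recovering a Retinex-style computation; distance-dependent weights are what break that equivalence and distinguish the fitted model.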

  14. Quantitative analysis of task selection for brain-computer interfaces

    NASA Astrophysics Data System (ADS)

    Llera, Alberto; Gómez, Vicenç; Kappen, Hilbert J.

    2014-10-01

    Objective. To assess quantitatively the impact of task selection on the performance of brain-computer interfaces (BCI). Approach. We consider the task-pairs derived from multi-class BCI imagery movement tasks in three different datasets. We analyze for the first time the benefits of task selection on a large-scale basis (109 users) and evaluate the possibility of transferring task-pair information across days for a given subject. Main results. Selecting the subject-dependent optimal task-pair among three different imagery movement tasks results in an approximately 20% potential increase in the number of users that can be expected to control a binary BCI. The improvement is observed with respect to the best task-pair fixed across subjects. The best task-pair selected for each subject individually during a first day of recordings is generally a good task-pair in subsequent days. In general, task learning from the user side has a positive influence on the generalization of the optimal task-pair, but special attention should be given to inexperienced subjects. Significance. These results add significant evidence to existing literature that advocates task selection as a necessary step towards usable BCIs. This contribution motivates further research focused on deriving adaptive methods for task selection on larger sets of mental tasks in practical online scenarios.
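The comparison the authors draw, one task-pair fixed across all subjects versus each subject's individually best pair, can be sketched as follows. The accuracy values, the 0.7 "control" threshold, and all names are illustrative assumptions, not the paper's data or criteria:

```python
def controllable_fraction(acc_by_user, threshold=0.7):
    """acc_by_user[user][pair] is a user's classification accuracy for one
    imagery task-pair. Returns the fraction of users above the control
    threshold when (a) the single best pair is fixed across all users and
    (b) each user is assigned their individually best pair."""
    pairs = set.intersection(*(set(a) for a in acc_by_user.values()))
    fixed = max(pairs, key=lambda p: sum(a[p] >= threshold
                                         for a in acc_by_user.values()))
    n = len(acc_by_user)
    frac_fixed = sum(a[fixed] >= threshold for a in acc_by_user.values()) / n
    frac_individual = sum(max(a.values()) >= threshold
                          for a in acc_by_user.values()) / n
    return frac_fixed, frac_individual
```

The gap between the two returned fractions is the quantity the abstract reports as an approximately 20% potential increase in controllable users.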

  15. Quantitative Brightness Analysis of Fluorescence Intensity Fluctuations in E. Coli

    PubMed Central

    Hur, Kwang-Ho; Mueller, Joachim D.

    2015-01-01

    The brightness measured by fluorescence fluctuation spectroscopy specifies the average stoichiometry of a labeled protein in a sample. Here we extended brightness analysis, which has been mainly applied in eukaryotic cells, to prokaryotic cells with E. coli serving as a model system. The small size of the E. coli cell introduces unique challenges for applying brightness analysis that are addressed in this work. Photobleaching leads to a depletion of fluorophores and a reduction of the brightness of protein complexes. In addition, the E. coli cell and the point spread function of the instrument only partially overlap, which influences intensity fluctuations. To address these challenges we developed MSQ analysis, which is based on the mean Q-value of segmented photon count data, and combined it with the analysis of axial scans through the E. coli cell. The MSQ method recovers brightness, concentration, and diffusion time of soluble proteins in E. coli. We applied MSQ to measure the brightness of EGFP in E. coli and compared it to solution measurements. We further used MSQ analysis to determine the oligomeric state of nuclear transport factor 2 labeled with EGFP expressed in E. coli cells. The results obtained demonstrate the feasibility of quantifying the stoichiometry of proteins by brightness analysis in a prokaryotic cell. PMID:26099032
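The core of a Q-based brightness estimate can be sketched as below. This is a simplification under stated assumptions (ideal detector, no photobleaching correction, no point-spread-function or cell-geometry correction, which are precisely the refinements the MSQ method adds); the function name and segment length are illustrative:

```python
import numpy as np

def segmented_mean_q(photon_counts, segment_len=1000):
    """Split a photon-count trace into segments, compute Mandel's
    Q = (variance - mean) / mean within each segment, and average.
    Q is ~0 for pure shot noise (Poisson statistics) and grows with the
    molecular brightness of the diffusing species; segmenting suppresses
    slow drifts such as photobleaching-driven intensity decay."""
    counts = np.asarray(photon_counts, dtype=float)
    n_seg = len(counts) // segment_len
    segs = counts[:n_seg * segment_len].reshape(n_seg, segment_len)
    means = segs.mean(axis=1)
    variances = segs.var(axis=1, ddof=1)
    return float(np.mean((variances - means) / means))
```

A dimer of EGFP, for instance, would roughly double the Q-value of the monomer at matched intensity, which is the stoichiometry readout exploited in the nuclear transport factor 2 measurement.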

  16. Quantitative histology analysis of the ovarian tumour microenvironment.

    PubMed

    Lan, Chunyan; Heindl, Andreas; Huang, Xin; Xi, Shaoyan; Banerjee, Susana; Liu, Jihong; Yuan, Yinyin

    2015-01-01

    Concerted efforts in genomic studies examining RNA transcription and DNA methylation patterns have revealed profound insights into prognostic ovarian cancer subtypes. On the other hand, abundant histology slides have been generated to date, yet their uses remain very limited and largely qualitative. Our goal is to develop automated histology analysis as an alternative subtyping technology for ovarian cancer that is cost-efficient and does not rely on DNA quality. We developed an automated system for scoring primary tumour sections of 91 late-stage ovarian cancers to identify single cells. We demonstrated high accuracy of our system based on expert pathologists' scores (cancer = 97.1%, stromal = 89.1%) as well as compared to immunohistochemistry scoring (correlation = 0.87). The percentage of stromal cells in all cells is significantly associated with poor overall survival after controlling for clinical parameters including debulking status and age (multivariate analysis p = 0.0021, HR = 2.54, CI = 1.40-4.60) and progression-free survival (multivariate analysis p = 0.022, HR = 1.75, CI = 1.09-2.82). We demonstrate how automated image analysis enables objective quantification of the microenvironmental composition of ovarian tumours. Our analysis reveals a strong effect of the tumour microenvironment on ovarian cancer progression and highlights the potential of therapeutic interventions that target the stromal compartment or cancer-stroma signalling in the stroma-high, late-stage ovarian cancer subset. PMID:26573438

  18. Quantitative assessment of human motion using video motion analysis

    NASA Technical Reports Server (NTRS)

    Probe, John D.

    1993-01-01

    In the study of the dynamics and kinematics of the human body a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video based motion analysis systems to emerge as a cost effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video based Ariel Performance Analysis System (APAS) to develop data on shirtsleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. APAS is a fully integrated system of hardware and software for biomechanics and the analysis of human performance and generalized motion measurement. Major components of the complete system include the video system, the AT compatible computer, and the proprietary software.

  19. Dynamic and still microcirculatory image analysis for quantitative microcirculation research

    NASA Astrophysics Data System (ADS)

    Ying, Xiaoyou; Xiu, Rui-juan

    1994-05-01

    Based on analyses of various types of digital microcirculatory images (DMCI), we summarized the image features of DMCI, the digitization requirements for digital microcirculatory imaging, and the basic characteristics of DMCI processing. A dynamic and still imaging separation processing (DSISP) mode was designed for developing a DMCI workstation and for DMCI processing. The original images in this study were clinical microcirculatory images of human finger nail-bed and conjunctival microvasculature, and intravital microvascular network images from animal tissues or organs. A series of dynamic and still microcirculatory image analysis functions was developed in this study. The experimental results indicate that most of the established analog video image analysis methods for microcirculatory measurement can be realized in a more flexible way based on DMCI. More information can be rapidly extracted from the quality-improved DMCI by employing intelligent digital image analysis methods. The DSISP mode is well suited to building a DMCI workstation.

  20. Quantitative spectroscopy for the analysis of GOME data

    NASA Technical Reports Server (NTRS)

    Chance, K.

    1997-01-01

    Accurate analysis of the global ozone monitoring experiment (GOME) data to obtain atmospheric constituents requires reliable, traceable spectroscopic parameters for atmospheric absorption and scattering. Results are summarized for research that includes: the re-determination of Rayleigh scattering cross sections and phase functions for the 200 nm to 1000 nm range; the analysis of solar spectra to obtain a high-resolution reference spectrum with excellent absolute vacuum wavelength calibration; Ring effect cross sections and phase functions determined directly from accurate molecular parameters of N2 and O2; O2 A band line intensities and pressure broadening coefficients; and the analysis of absolute accuracies for ultraviolet and visible absorption cross sections of O3 and other trace species measurable by GOME.
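As a reference point for the first item in that list, the standard textbook form of the Rayleigh scattering cross section per molecule (not the re-determined values reported by this work) can be written down directly. The refractive index and King correction factor are held fixed here, although in reality both are wavelength-dependent; that is the main simplification of this sketch:

```python
import math

def rayleigh_cross_section(wavelength_m, n=1.000293, N=2.546899e25, f_king=1.049):
    """Textbook Rayleigh cross section per molecule, in m^2:
    sigma = 24 pi^3 / (lambda^4 N^2) * ((n^2 - 1) / (n^2 + 2))^2 * F_K,
    with n the refractive index of air, N the molecular number density at
    standard conditions, and F_K the King factor correcting for molecular
    anisotropy. All default values are nominal, not fitted."""
    n2 = n * n
    return (24.0 * math.pi ** 3 / (wavelength_m ** 4 * N ** 2)
            * ((n2 - 1.0) / (n2 + 2.0)) ** 2 * f_king)
```

The familiar lambda^-4 dependence falls out immediately: halving the wavelength increases the cross section sixteen-fold, which is why accurate cross sections matter most at the ultraviolet end of the GOME range.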

  1. Quantitative analysis of three-dimensional landmark coordinate data

    NASA Astrophysics Data System (ADS)

    Richtsmeier, Joan T.

    1991-04-01

    The advantages of using three-dimensional (3D) data in the description and analysis of biological forms are obvious: these data provide realistic, geometrically integrated models of the forms under study and can be rotated, translated, and dissected electronically for viewing. 3D coordinate data can be collected from several sources, including computed tomographic images, stereo photographs, specially designed microscopes, and digitizers. But once collected, how can these data be analyzed to address biologically relevant research questions? This paper demonstrates the capabilities of two analytical techniques, finite-element scaling analysis and Euclidean distance matrix analysis, in the comparison of 3D biological forms. Examples include studies of growth of the craniofacial complex and analyses of differences in form between members of biologically defined groups (e.g., species, sexes, diagnostic categories).
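Of the two techniques named, Euclidean distance matrix analysis is the easier to sketch. The fragment below computes a form matrix from 3D landmark coordinates and the pairwise distance ratios that the method compares; it is a minimal illustration of the core idea, not the published procedure with its statistical testing:

```python
import numpy as np

def form_matrix(landmarks):
    """All pairwise inter-landmark distances for one specimen.
    landmarks: (k, 3) array of 3D landmark coordinates."""
    diffs = landmarks[:, None, :] - landmarks[None, :, :]
    return np.sqrt((diffs ** 2).sum(axis=-1))

def form_difference_ratios(lm_a, lm_b):
    """EDMA-style comparison: ratios of corresponding inter-landmark
    distances in specimen A versus specimen B. If every ratio equals the
    same constant, the two forms differ only in scale. Because only
    distances are used, the specimens' coordinate systems (rotation,
    translation) drop out without any superimposition step."""
    i, j = np.triu_indices(len(lm_a), k=1)
    return form_matrix(lm_a)[i, j] / form_matrix(lm_b)[i, j]
```

Localized form differences, such as craniofacial growth concentrated in one region, show up as ratios that deviate from the overall scaling constant only for distances involving the affected landmarks.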

  2. Epilepsy surgery failure in children: a quantitative and qualitative analysis

    PubMed Central

    Englot, Dario J.; Han, Seunggu J.; Rolston, John D.; Ivan, Michael E.; Kuperman, Rachel A.; Chang, Edward F.; Gupta, Nalin; Sullivan, Joseph E.; Auguste, Kurtis I.

    2015-01-01

    Object Resection is a safe and effective treatment option for children with pharmacoresistant focal epilepsy, but some patients continue to experience seizures after surgery. While most studies of pediatric epilepsy surgery focus on predictors of postoperative seizure outcome, these factors are often not modifiable, and the reasons for surgical failure may remain unclear. Methods The authors performed a retrospective cohort study of children and adolescents who received focal resective surgery for pharmacoresistant epilepsy. Both quantitative and qualitative analyses of factors associated with persistent postoperative seizures were conducted. Results Records were reviewed from 110 patients, ranging in age from 6 months to 19 years at the time of surgery, who underwent a total of 115 resections. At a mean 3.1-year follow-up, 76% of patients were free of disabling seizures (Engel Class I outcome). Seizure freedom was predicted by temporal lobe surgery compared with extra-temporal resection, tumor or mesial temporal sclerosis compared with cortical dysplasia or other pathologies, and by a lower preoperative seizure frequency. Factors associated with persistent seizures (Engel Class II–IV outcome) included residual epileptogenic tissue adjacent to the resection cavity (40%), an additional epileptogenic zone distant from the resection cavity (32%), and the presence of a hemispheric epilepsy syndrome (28%). Conclusions While seizure outcomes in pediatric epilepsy surgery may be improved by the use of high-resolution neuroimaging and invasive electrographic studies, a more aggressive resection should be considered in certain patients, including hemispherectomy if a hemispheric epilepsy syndrome is suspected. Family counseling regarding treatment expectations is critical, and reoperation may be warranted in select cases. PMID:25127098

  3. Quantitative analysis of nailfold capillary morphology in patients with fibromyalgia

    PubMed Central

    Choi, Dug-Hyun

    2015-01-01

    Background/Aims Nailfold capillaroscopy (NFC) has been used to examine morphological and functional microcirculation changes in connective tissue diseases. It has been demonstrated that NFC patterns reflect abnormal microvascular dynamics, which may play a role in fibromyalgia (FM) syndrome. The aim of this study was to determine NFC patterns in FM, and their association with clinical features of FM. Methods A total of 67 patients with FM, and 30 age- and sex-matched healthy controls, were included. Nailfold capillary patterns were quantitatively analyzed using computerized NFC. The parameters of interest were as follows: number of capillaries within the central 3 mm, deletion score, apical limb width, capillary width, and capillary dimension. Capillary dimension was determined by calculating the number of capillaries using the Adobe Photoshop version 7.0. Results FM patients had a lower number of capillaries and higher deletion scores on NFC compared to healthy controls (17.3 ± 1.7 vs. 21.8 ± 2.9, p < 0.05; 2.2 ± 0.9 vs. 0.7 ± 0.6, p < 0.05, respectively). Both apical limb width (µm) and capillary width (µm) were significantly decreased in FM patients (1.1 ± 0.2 vs. 3.7 ± 0.6; 5.4 ± 0.5 vs. 7.5 ± 1.4, respectively), indicating that FM patients have abnormally decreased digital capillary diameter and density. Interestingly, there was no difference in capillary dimension between the two groups, suggesting that the length or tortuosity of capillaries in FM patients is increased to compensate for diminished microcirculation. Conclusions FM patients had altered capillary density and diameter in the digits. Diminished microcirculation on NFC may alter capillary density and increase tortuosity. PMID:26161020

  4. Quantitative flow cytometric analysis of ABO red cell antigens.

    PubMed

    Sharon, R; Fibach, E

    1991-01-01

    A flow cytometry method has been employed to quantitatively compare the expression of A, B and H antigens on various red blood cells (RBC). The H substance was directly labelled by fluorescein-conjugated anti-H lectin, and the A and B antigens by indirect staining, first with monoclonal anti-A or anti-B antibodies, followed by fluorescein (FITC)- or phycoerythrin (PE)-labelled anti-mouse immunoglobulin (Ig) antibodies. More than a ten-fold difference in cellular fluorescence intensity was found within each sample. Both the percentage and the mean fluorescence of the positive subpopulation for each antigen were determined. Each RBC population was characterized with respect to the expression of A, B or H antigen by a compound mean value that was the calculated product of these two parameters. The results demonstrated a reciprocal relationship between the compound means of A or B and H. The ratio of A/H or B/H was found to be most informative. Homozygotes for A or B had ratios of greater than 200 and greater than 30, respectively, while heterozygotes (AO or BO) had ratios of less than 5. This method could also distinguish between A1 and A2; RBC carrying the A1 phenotype (as determined by agglutination with anti-A1 lectin) showed a higher A/H ratio than those carrying A2. In contrast to the reciprocity in the expression of A (or B) and H found in RBC obtained from different individuals, a direct correlation was found in the expression of these antigens by individual cells within a given population.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:1764978
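The scoring the abstract describes, percentage positive times mean fluorescence and then antigen ratios, reduces to simple arithmetic. The sketch below uses illustrative numbers; the > 200 and < 5 cut-offs are the values quoted in the abstract for the A/H ratio, and the function names are hypothetical:

```python
def compound_mean(percent_positive, mean_fluorescence):
    """Compound expression value for one antigen: the product of the
    positive-subpopulation fraction and that subpopulation's mean
    fluorescence, as defined in the abstract."""
    return (percent_positive / 100.0) * mean_fluorescence

def a_over_h(a_compound, h_compound):
    """A/H ratio of compound means; the abstract reports ratios > 200 for
    AA homozygotes and < 5 for AO heterozygotes."""
    return a_compound / h_compound
```

For example, a sample that is 90% A-positive with mean fluorescence 500, against 10% H-positive at mean 20, gives an A/H ratio well above the homozygote cut-off, reflecting the reciprocal A-versus-H relationship the method exploits.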

  5. Quantitative analysis of wrist electrodermal activity during sleep.

    PubMed

    Sano, Akane; Picard, Rosalind W; Stickgold, Robert

    2014-12-01

    We present the first quantitative characterization of electrodermal activity (EDA) patterns on the wrists of healthy adults during sleep using dry electrodes. We compare the new results on the wrist to the prior findings on palmar or finger EDA by characterizing data measured from 80 nights of sleep consisting of 9 nights of wrist and palm EDA from 9 healthy adults sleeping at home, 56 nights of wrist and palm EDA from one healthy adult sleeping at home, and 15 nights of wrist EDA from 15 healthy adults in a sleep laboratory, with the latter compared to concurrent polysomnography. While high frequency patterns of EDA called "storms" were identified by eye in the 1960s, we systematically compare thresholds for automatically detecting EDA peaks and establish criteria for EDA storms. We found that more than 80% of the EDA peaks occurred in non-REM sleep, specifically during slow-wave sleep (SWS) and non-REM stage 2 sleep (NREM2). Also, EDA amplitude is higher in SWS than in other sleep stages. Longer EDA storms were more likely to occur in the first two quarters of sleep and during SWS and NREM2. We also found from the home studies (65 nights) that EDA levels were higher and the skin conductance peaks were larger and more frequent when measured on the wrist than when measured on the palm. These EDA high frequency peaks and high amplitude were sometimes associated with higher skin temperature, but more work is needed looking at neurological and other EDA elicitors in order to elucidate their complete behavior. PMID:25286449
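The peak-threshold and storm criteria the authors systematize can be sketched as follows. The rise threshold, window length, and minimum peak count below are placeholder values, not the paper's calibrated criteria:

```python
import numpy as np

def detect_eda_peaks(eda, min_rise=0.02):
    """Mark a skin-conductance peak wherever the sample-to-sample rise
    drops back below min_rise (microsiemens) after having exceeded it."""
    d = np.diff(eda)
    onsets = (d[:-1] >= min_rise) & (d[1:] < min_rise)
    return np.flatnonzero(onsets) + 1

def count_storms(peak_times_s, window_s=60.0, min_peaks=5):
    """An EDA 'storm' here: at least min_peaks peaks inside a window_s-second
    span. Peaks belonging to a detected storm are consumed so each storm is
    counted once."""
    peaks = np.asarray(sorted(peak_times_s), dtype=float)
    storms, i = 0, 0
    while i < len(peaks):
        j = int(np.searchsorted(peaks, peaks[i] + window_s))
        if j - i >= min_peaks:
            storms += 1
            i = j  # skip past this storm's peaks
        else:
            i += 1
    return storms
```

With per-epoch sleep-stage labels, counting peaks and storms per stage would reproduce the kind of SWS/NREM2 breakdown reported above.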

  6. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominantly on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…

  7. Procedures for Quantitative Analysis of Change Facilitator Interventions.

    ERIC Educational Resources Information Center

    Hord, Shirley M.; Hall, Gene E.

    The procedures and coding schema that have been developed by the Research on the Improvement Process (RIP) Program for analyzing the frequency of interventions and for examining their internal characteristics are described. In two in-depth ethnographic studies of implementation efforts, interventions were the focus of data collection and analysis

  8. Quantitative Analysis of Cell Migration Using Optical Flow

    PubMed Central

    Boric, Katica; Orio, Patricio; Viéville, Thierry; Whitlock, Kathleen

    2013-01-01

    Neural crest cells exhibit dramatic migration behaviors as they populate their distant targets. Using a line of zebrafish expressing green fluorescent protein (sox10:EGFP) in neural crest cells we developed an assay to analyze and quantify cell migration as a population, and use it here to characterize in detail the subtle defects in cell migration caused by ethanol exposure during early development. The challenge was to quantify changes in the in vivo migration of all Sox10:EGFP expressing cells in the visual field of time-lapse movies. To perform this analysis we used an Optical Flow algorithm for motion detection and combined the analysis with a fit to an affine transformation. Through this analysis we detected and quantified significant differences in the cell migrations of Sox10:EGFP positive cranial neural crest populations in ethanol treated versus untreated embryos. Specifically, treatment affected migration by increasing the left-right asymmetry of the migrating cells and by altering the direction of cell movements. Thus, by applying this novel computational analysis, we were able to quantify the movements of populations of cells, allowing us to detect subtle changes in cell behaviors. Because cranial neural crest cells contribute to the formation of the frontal mass these subtle differences may underlie commonly observed facial asymmetries in normal human populations. PMID:23936049
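The "optical flow combined with an affine fit" step can be sketched with a least-squares solve. This is a generic implementation of fitting a 2D affine transformation to point correspondences (the displacement field an optical-flow pass would produce), not the authors' exact pipeline:

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 2D affine fit: find A (2x2) and t (2,) minimizing
    ||src @ A.T + t - dst||^2. A summarizes the coherent rotation, shear,
    and scaling of the moving cell population; t is its mean translation.
    Left-right asymmetry of migration would appear as asymmetry in A."""
    src = np.asarray(src, dtype=float)
    X = np.hstack([src, np.ones((len(src), 1))])  # design rows [x, y, 1]
    params, *_ = np.linalg.lstsq(X, np.asarray(dst, dtype=float), rcond=None)
    return params[:2].T, params[2]  # A, t
```

Fitting one affine transform per frame pair turns a dense flow field into a handful of interpretable motion parameters, which is what makes population-level comparisons between treated and untreated embryos tractable.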

  9. Reflectance spectroscopy: quantitative analysis techniques for remote sensing applications.

    USGS Publications Warehouse

    Clark, R.N.; Roush, T.L.

    1984-01-01

    Several methods for the analysis of remotely sensed reflectance data are compared, including empirical methods and scattering theories, both of which are important for solving remote sensing problems. The concept of the photon mean path length and the implications for use in modeling reflectance spectra are presented.-from Authors

  10. Mass spectrometry for real-time quantitative breath analysis.

    PubMed

    Smith, David; Španěl, Patrik; Herbig, Jens; Beauchamp, Jonathan

    2014-06-01

    Breath analysis research is being successfully pursued using a variety of analytical methods, prominent amongst which are gas chromatography with mass spectrometry, GC-MS, ion mobility spectrometry, IMS, and the fast flow and flow-drift tube techniques called selected ion flow tube mass spectrometry, SIFT-MS, and proton transfer reaction mass spectrometry, PTR-MS. In this paper the case is made for real-time breath analysis by obviating sample collection into bags or onto traps that can suffer from partial degradation of breath metabolites or the introduction of impurities. Real-time analysis of a broad range of volatile chemical compounds can be best achieved using SIFT-MS and PTR-MS, which are sufficiently sensitive and rapid to allow the simultaneous analyses of several trace gas metabolites in single breath exhalations. The basic principles and the ion chemistry that underpin these two analytical techniques are briefly described and the differences between them, including their respective strengths and weaknesses, are revealed, especially with reference to the analysis of the complex matrix that is exhaled breath. A recent innovation is described that combines time-of-flight mass spectrometry with the proton transfer flow-drift tube reactor, PTR-TOFMS, which provides greater resolution in the analytical mass spectrometer and allows separation of protonated isobaric molecules. Examples are presented of some recent data that well illustrate the quality and real-time feature of SIFT-MS and PTR-MS for the analysis of exhaled breath for physiological/biochemical/pharmacokinetics studies and for the identification and quantification of biomarkers relating to specific disease states. PMID:24682047

  11. Clinical value of quantitative analysis of ST slope during exercise.

    PubMed Central

    Ascoop, C A; Distelbrink, C A; De Lang, P A

    1977-01-01

    The diagnostic performance of automatic analysis of the exercise electrocardiogram in detecting ischaemic heart disease was studied in 147 patients with angiographically documented coronary disease. The results were compared with the results of visual analysis of the same recordings. Using a bicycle ergometer we tried to reach at least 90 per cent of the predicted maximal heart rate of the patient. Two bipolar thoracic leads (CM5, CC5) were used. In the visual analysis the criterion of the so-called ischaemic ST segment was applied. For the automatic analysis the population was divided into a learning group (N=87) and a testing group (N=60). In the learning group first critical values were computed for different ST measurements that provided optimal separation between patients with (CAG POS.) and without (CAG. NEG.) significant coronary stenoses as revealed by coronary arteriography. These critical values were kept unchanged when applied to the testing group. With respect to the visual method an increase of the sensitivity by 0.45 and 0.36 was obtained by the automatic analysis in the learning and testing group, respectively. The best separation between CAG. POS. and CAG. NEG. group was reached using a criterion consisting of a linear combination of the slope of the initial part of the ST segment and the ST depression; the sensitivity being 0.70 and 0.60, respectively, in the learning and testing group. Using a criterion based on the area between the baseline and the ST segment (the SX integral) these values were 0.42 and 0.49, respectively. All specificities were kept to at least 0.90. PMID:319813
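
    The linear-combination criterion above can be sketched as a simple decision rule; the weights and critical value below are illustrative placeholders, not the values fitted on the study's learning group.

```python
def st_criterion(st_slope_mv_per_s, st_depression_mv,
                 w_slope=0.5, w_depr=4.0, critical=0.0):
    """Hypothetical exercise-ECG rule: combine the initial ST-segment slope
    and the ST depression linearly; a low score flags probable ischaemia.
    A shallow slope and a deep depression both push the score down."""
    score = w_slope * st_slope_mv_per_s - w_depr * st_depression_mv
    return score < critical  # True = test positive

# A near-horizontal, depressed ST segment is flagged; a steep upslope is not.
flat_depressed = st_criterion(0.1, 0.15)
steep_upslope = st_criterion(3.0, 0.02)
```

    In the study, the critical values were fitted on the learning group and then held fixed for the testing group, which is why the two sensitivities differ.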

  12. Quantitative analysis of night skyglow amplification under cloudy conditions

    NASA Astrophysics Data System (ADS)

    Kocifaj, Miroslav; Solano Lamphar, Héctor Antonio

    2014-10-01

    The radiance produced by artificial light is a major source of nighttime over-illumination. It can, however, be treated experimentally using ground-based and satellite data. These two types of data complement each other and together have a high information content. For instance, the satellite data enable upward light emissions to be normalized, and this in turn allows skyglow levels at the ground to be modelled under cloudy or overcast conditions. Excessive night lighting imposes an unacceptable burden on nature, humans and professional astronomy. For this reason, there is a pressing need to determine the total amount of downwelling diffuse radiation. Undoubtedly, cloudy periods can cause a significant increase in skyglow as a result of amplification owing to diffuse reflection from clouds. While it is recognized that the amplification factor (AF) varies with cloud cover, the effects of different types of clouds, of atmospheric turbidity and of the geometrical relationships between the positions of an individual observer, the cloud layer, and the light source are in general poorly known. In this paper the AF is quantitatively analysed considering different aerosol optical depths (AODs), urban layout sizes and cloud types with specific albedos and altitudes. The computational results show that the AF peaks near the edges of a city rather than at its centre. In addition, the AF appears to be a decreasing function of AOD, which is particularly important when modelling the skyglow in regions with apparent temporal or seasonal variability of atmospheric turbidity. The findings in this paper will be useful to those designing engineering applications or modelling light pollution, as well as to astronomers and environmental scientists who aim to predict the amplification of skyglow caused by clouds. In addition, the semi-analytical formulae can be used to estimate the AF levels, especially in densely populated metropolitan regions for which detailed computations may be CPU-intensive. These new results are of theoretical and experimental significance as they will motivate experimentalists to collect data from various regions to build an overall picture of the AF, and will encourage modellers to test the consistency with theoretical predictions.

  13. Quantitative analysis of cell-free DNA in ovarian cancer

    PubMed Central

    Shao, Xuefeng; He, Yan; Ji, Min; Chen, Xiaofang; Qi, Jing; Shi, Wei; Hao, Tianbo; Ju, Shaoqing

    2015-01-01

    The aim of the present study was to investigate the association between cell-free DNA (cf-DNA) levels and clinicopathological characteristics of patients with ovarian cancer using a branched DNA (bDNA) technique, and to determine the value of quantitative cf-DNA detection in assisting with the diagnosis of ovarian cancer. Serum specimens were collected from 36 patients with ovarian cancer on days 1, 3 and 7 following surgery, and additional serum samples were also collected from 22 benign ovarian tumor cases, and 19 healthy, non-cancerous ovaries. bDNA techniques were used to detect serum cf-DNA concentrations. All data were analyzed using SPSS version 18.0. The cf-DNA levels were significantly increased in the ovarian cancer group compared with those of the benign ovarian tumor group and healthy ovarian group (P<0.01). Furthermore, cf-DNA levels were significantly increased in stage III and IV ovarian cancer compared with those of stages I and II (P<0.01). In addition, cf-DNA levels were significantly increased on the first day post-surgery (P<0.01), and subsequently demonstrated a gradual decrease. In the ovarian cancer group, the area under the receiver operating characteristic curve of cf-DNA and the sensitivity were 0.917 and 88.9%, respectively, which was higher than those of cancer antigen 125 (0.724, 75%) and human epididymis protein 4 (0.743, 80.6%). There was a correlation between the levels of serum cf-DNA and the occurrence and development of ovarian cancer in the patients evaluated. bDNA techniques possessed higher sensitivity and specificity than other methods for the detection of serum cf-DNA in patients exhibiting ovarian cancer, and bDNA techniques are more useful for detecting cf-DNA than other factors. Thus, the present study demonstrated the potential value for the use of bDNA as an adjuvant diagnostic method for ovarian cancer. PMID:26788153
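
    The area under the receiver operating characteristic curve reported above (0.917 for cf-DNA) can be computed directly from the marker values of the two groups; a minimal pure-Python sketch with made-up numbers:

```python
def roc_auc(positives, negatives):
    """AUC by pairwise comparison: the probability that a randomly chosen
    diseased sample scores higher than a randomly chosen control.
    Ties count as half a win."""
    wins = 0.0
    for p in positives:
        for n in negatives:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(positives) * len(negatives))

# Perfectly separated groups give AUC = 1.0; heavy overlap drives it toward 0.5.
auc_perfect = roc_auc([3.1, 4.2, 5.0], [1.0, 2.2])
auc_mixed = roc_auc([1, 3], [2, 4])
```

    This pairwise formulation is equivalent to integrating the ROC curve, which is how a single number like 0.917 summarizes a marker's ability to separate cancer from control sera.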

  14. Quantitative analysis of structural neuroimaging of mesial temporal lobe epilepsy

    PubMed Central

    Memarian, Negar; Thompson, Paul M; Engel, Jerome; Staba, Richard J

    2013-01-01

    Mesial temporal lobe epilepsy (MTLE) is the most common of the surgically remediable drug-resistant epilepsies. MRI is the primary diagnostic tool to detect anatomical abnormalities and, when combined with EEG, can more accurately identify an epileptogenic lesion, which is often hippocampal sclerosis in cases of MTLE. As structural imaging technology has advanced the surgical treatment of MTLE and other lesional epilepsies, so too have the analysis techniques that are used to measure different structural attributes of the brain. These techniques, which are reviewed here and have been used chiefly in basic research of epilepsy and in studies of MTLE, have identified different types and the extent of anatomical abnormalities that can extend beyond the affected hippocampus. These results suggest that structural imaging and sophisticated imaging analysis could provide important information to identify networks capable of generating spontaneous seizures and ultimately help guide surgical therapy that improves postsurgical seizure-freedom outcomes. PMID:24319498

  15. Integrated quantitative fractal polarimetric analysis of monolayer lung cancer cells

    NASA Astrophysics Data System (ADS)

    Shrestha, Suman; Zhang, Lin; Quang, Tri; Farrahi, Tannaz; Narayan, Chaya; Deshpande, Aditi; Na, Ying; Blinzler, Adam; Ma, Junyu; Liu, Bo; Giakos, George C.

    2014-05-01

    Digital diagnostic pathology has become one of the most valuable and convenient technological advancements in recent years. It allows us to acquire, store and analyze pathological information from images of histological and immunohistochemical glass slides that are scanned to create digital slides. In this study, efficient fractal, wavelet-based polarimetric techniques for histological analysis of monolayer lung cancer cells are introduced and different monolayer cancer lines are studied. The outcome of this study indicates that applying fractal, wavelet polarimetric principles to the analysis of squamous carcinoma and adenocarcinoma cell lines may prove extremely useful in discriminating between healthy and lung cancer cells as well as in differentiating among different lung cancer cells.

  16. Quantitative analysis of reformulated gasoline by mass spectrometry

    SciTech Connect

    Drinkwater, D.; Nero, V.

    1995-12-31

    Reformulated gasoline (RFG) is a new product offered in the United States since January 1, 1995 in an effort by the EPA to meet the Clean Air Act. RFG is a highly-designed product with tight specifications for aromatics, benzene, oxygenate content, distillation properties, sulfur content, toxics content, and vapor pressure. Because of all the new regulatory requirements on this product, a number of new analytical techniques have been specified for its analysis, including GC/MS for total aromatics. Data from the EPA and proposed ASTM methods for RFG analysis are presented, along with data suggesting a number of extensions and improvements on these methods. The accuracy and precision of these methods are evaluated.

  17. African Primary Care Research: Quantitative analysis and presentation of results

    PubMed Central

    Ogunbanjo, Gboyega A.

    2014-01-01

    Abstract This article is part of a series on Primary Care Research Methods. The article describes types of continuous and categorical data, how to capture data in a spreadsheet, how to use descriptive and inferential statistics and, finally, gives advice on how to present the results in text, figures and tables. The article intends to help Master's level students with writing the data analysis section of their research proposal and presenting their results in their final research report. PMID:26245435

  18. A contribution to the quantitative analysis of transmitted images.

    PubMed

    Dzubur, A; Danilović, Z; Caklović, N; Seiwerth, S

    1995-01-01

    The scope of this paper is to contribute to the problem of connecting two relatively distinct, mostly technological aspects of modern pathology. The authors analyze the differences between the values of morphometric and other parameters, such as area, density and perimeter, obtained on the same set of images before and after transmission. The results show that the differences due to the compression-transmission-decompression module, which is inherent to the image transmission process, are negligible. PMID:8526566

  19. Quantitative analysis of PMLA nanoconjugate components after backbone cleavage.

    PubMed

    Ding, Hui; Patil, Rameshwar; Portilla-Arias, Jose; Black, Keith L; Ljubimova, Julia Y; Holler, Eggehard

    2015-01-01

    Multifunctional polymer nanoconjugates containing multiple components show great promise in cancer therapy, but in most cases complete analysis of each component is difficult. Polymalic acid (PMLA) based nanoconjugates have demonstrated successful brain and breast cancer treatment. They consist of multiple components including targeting antibodies, Morpholino antisense oligonucleotides (AONs), and endosome escape moieties. The component analysis of PMLA nanoconjugates is extremely difficult using conventional spectrometry and HPLC method. Taking advantage of the nature of polyester of PMLA, which can be cleaved by ammonium hydroxide, we describe a method to analyze the content of antibody and AON within nanoconjugates simultaneously using SEC-HPLC by selectively cleaving the PMLA backbone. The selected cleavage conditions only degrade PMLA without affecting the integrity and biological activity of the antibody. Although the amount of antibody could also be determined using the bicinchoninic acid (BCA) method, our selective cleavage method gives more reliable results and is more powerful. Our approach provides a new direction for the component analysis of polymer nanoconjugates and nanoparticles. PMID:25894227

  20. Quantitative Analysis of PMLA Nanoconjugate Components after Backbone Cleavage

    PubMed Central

    Ding, Hui; Patil, Rameshwar; Portilla-Arias, Jose; Black, Keith L.; Ljubimova, Julia Y.; Holler, Eggehard

    2015-01-01

    Multifunctional polymer nanoconjugates containing multiple components show great promise in cancer therapy, but in most cases complete analysis of each component is difficult. Polymalic acid (PMLA) based nanoconjugates have demonstrated successful brain and breast cancer treatment. They consist of multiple components including targeting antibodies, Morpholino antisense oligonucleotides (AONs), and endosome escape moieties. The component analysis of PMLA nanoconjugates is extremely difficult using conventional spectrometry and HPLC method. Taking advantage of the nature of polyester of PMLA, which can be cleaved by ammonium hydroxide, we describe a method to analyze the content of antibody and AON within nanoconjugates simultaneously using SEC-HPLC by selectively cleaving the PMLA backbone. The selected cleavage conditions only degrade PMLA without affecting the integrity and biological activity of the antibody. Although the amount of antibody could also be determined using the bicinchoninic acid (BCA) method, our selective cleavage method gives more reliable results and is more powerful. Our approach provides a new direction for the component analysis of polymer nanoconjugates and nanoparticles. PMID:25894227

  1. Multispectral Cloud Retrievals from MODIS on Terra and Aqua

    NASA Technical Reports Server (NTRS)

    King, Michael D.; Platnick, Steven; Ackerman, Steven A.; Menzel, W. Paul; Gray, Mark A.; Moody, Eric G.

    2002-01-01

    The Moderate Resolution Imaging Spectroradiometer (MODIS) was developed by NASA and launched onboard the Terra spacecraft on December 18, 1999 and the Aqua spacecraft on April 26, 2002. MODIS scans a swath width sufficient to provide nearly complete global coverage every two days from each polar-orbiting, sun-synchronous platform at an altitude of 705 km, and provides images in 36 spectral bands between 0.415 and 14.235 microns with spatial resolutions of 250 m (2 bands), 500 m (5 bands) and 1000 m (29 bands). In this paper we will describe the various methods being used for the remote sensing of cloud properties using MODIS data, focusing primarily on the MODIS cloud mask used to distinguish clouds, clear sky, heavy aerosol, and shadows on the ground, and on the remote sensing of cloud optical properties, especially cloud optical thickness and effective radius of water drops and ice crystals. Additional properties of clouds derived from multispectral thermal infrared measurements, especially cloud top pressure and emissivity, will also be described. Results will be presented of MODIS cloud properties both over the land and over the ocean, showing the consistency in cloud retrievals over various ecosystems used in the retrievals. The implications of this new observing system on global analysis of the Earth's environment will be discussed.

  2. Phase analysis in duplex stainless steel: comparison of EBSD and quantitative metallography methods

    NASA Astrophysics Data System (ADS)

    Michalska, J.; Chmiela, B.

    2014-03-01

    The purpose of the research was to work out the qualitative and quantitative analysis of phases in DSS in the as-received state and after thermal aging. For quantitative purposes, SEM observations, EDS analyses and electron backscattered diffraction (EBSD) methods were employed. Qualitative analysis of phases was performed by two methods: EBSD and classical quantitative metallography. A juxtaposition of different etchants for revealing the microstructure and a brief review of sample preparation methods for EBSD studies were presented. Different ways of sample preparation were tested, and based on these results a detailed methodology of DSS phase analysis was developed, including surface finishing, selective etching methods and image acquisition. The advantages and disadvantages of the applied methods were pointed out, and the accuracy of the phase analysis performed by both methods was compared.

  3. Response of the cyanobacterium Microcystis flos-aquae to levofloxacin.

    PubMed

    Wan, Jinjin; Guo, Peiyong; Zhang, Shengxiang

    2014-03-01

    The effects of levofloxacin (LEV) on Microcystis flos-aquae and its mechanism were investigated by determining the responses of some parameters of M. flos-aquae to LEV stress, including growth inhibition ratio, chlorophyll a content, superoxide dismutase (SOD) and catalase (CAT) activities, malondialdehyde (MDA) content, Fv/F0 and Fv/Fm, etc. The results indicated that LEV at 0.001-0.1 μg L(-1) could stimulate the growth of M. flos-aquae and increase the chlorophyll a content but did not induce a significant increase in the activity of antioxidant enzymes (SOD and CAT) and the content of MDA. When the LEV concentration exceeded 10 μg L(-1), the growth of M. flos-aquae could be significantly inhibited (the highest inhibition ratio can be up to 88.38 % at 100 μg L(-1)) and chlorophyll a content, SOD and CAT activities, and MDA content also significantly decreased in a concentration-dependent manner, indicating that high concentrations of LEV caused a severe oxidative stress on algal cells, resulting in a large number of reactive oxygen species produced in algal cells and thereby inhibiting the growth of algae. At the same time, the Fv/Fm and Fv/F0 values of M. flos-aquae decreased significantly with both exposure time and increasing test concentration of LEV, showing that the process of photosynthesis was inhibited. PMID:24288061

  4. Quantitative analysis of mouse corpus callosum from electron microscopy images.

    PubMed

    West, Kathryn L; Kelm, Nathaniel D; Carson, Robert P; Does, Mark D

    2015-12-01

    This article provides morphometric analysis of 72 electron microscopy images from control (n=4) and hypomyelinated (n=2) mouse corpus callosum. Measures of axon diameter and g-ratio were tabulated across all brains from two regions of the corpus callosum and a non-linear relationship between axon diameter and g-ratio was observed. These data are related to the accompanying research article comparing multiple methods of measuring g-ratio entitled 'A revised model for estimating g-ratio from MRI' (West et al., NeuroImage, 2015). PMID:26504893
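
    The g-ratio tabulated in the study is the ratio of inner (axon) diameter to outer (myelinated fiber) diameter; when myelin thickness rather than fiber diameter is measured, it can be derived as in this minimal sketch (the example values are illustrative, not from the dataset):

```python
def g_ratio(axon_diameter_um, myelin_thickness_um):
    """g-ratio = axon diameter / fiber diameter, where the fiber diameter
    adds one myelin sheath thickness on each side of the axon."""
    fiber_diameter = axon_diameter_um + 2.0 * myelin_thickness_um
    return axon_diameter_um / fiber_diameter

# A 0.6 um axon with 0.15 um of myelin on each side: g = 0.6 / 0.9
g = g_ratio(0.6, 0.15)
```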

  5. Quantitative Analysis and Validation of Method Using HPTLC

    NASA Astrophysics Data System (ADS)

    Dhandhukia, Pinakin C.; Thakker, Janki N.

    High performance thin layer chromatography is an emerging alternative analytical technique in comparison with conventional column chromatography because of its simplicity, rapidity, accuracy, robustness, and cost effectiveness. The choice from a vast array of supporting matrices and solvent systems enables separation of almost all types of analytes except volatiles. The first step of robust method development for routine quantification is to check the stability of the analyte during the various steps of chromatographic development, followed by preparation of calibration curves. Thereafter, various validation aspects of the analysis, namely peak purity, linearity and range, precision, limit of detection, limit of quantification, robustness, and accuracy, have to be measured.
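
    The limit of detection and limit of quantification mentioned above are commonly estimated from the calibration curve as 3.3·σ/S and 10·σ/S, where σ is the residual standard deviation and S the slope; a sketch assuming an ordinary least-squares calibration:

```python
def calibration_stats(conc, response):
    """Fit response = slope * conc + intercept by least squares, then derive
    LOD = 3.3 * sigma / slope and LOQ = 10 * sigma / slope from the
    residual standard deviation sigma."""
    n = len(conc)
    mx = sum(conc) / n
    my = sum(response) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    slope = sum((x - mx) * (y - my) for x, y in zip(conc, response)) / sxx
    intercept = my - slope * mx
    residuals = [y - (slope * x + intercept) for x, y in zip(conc, response)]
    sigma = (sum(r * r for r in residuals) / (n - 2)) ** 0.5
    return slope, intercept, 3.3 * sigma / slope, 10.0 * sigma / slope

# A perfectly linear calibration has zero residual error, hence LOD = LOQ = 0.
slope, intercept, lod, loq = calibration_stats([1, 2, 3, 4, 5], [2, 4, 6, 8, 10])
```

    With real densitometric data the residuals are non-zero, and the LOD/LOQ estimates grow with scatter about the calibration line.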

  6. Some remarks on the quantitative analysis of behavior

    PubMed Central

    Marr, M. Jackson

    1989-01-01

    This paper discusses similarities between the mathematization of operant behavior and the early history of the most mathematical of sciences—physics. Galileo explored the properties of motion without dealing with the causes of motion, focusing on changes in motion. Newton's dynamics were concerned with the action of forces as causes of change. Skinner's rationale for using rate to describe behavior derived from an interest in changes in rate. Reinforcement has played the role of force in the dynamics of behavior. Behavioral momentum and maximization have received mathematical formulations in behavior analysis. Yet to be worked out are the relations between molar and molecular formulations of behavioral theory. PMID:22478028

  7. Some remarks on the quantitative analysis of behavior.

    PubMed

    Marr, M J

    1989-01-01

    This paper discusses similarities between the mathematization of operant behavior and the early history of the most mathematical of sciences-physics. Galileo explored the properties of motion without dealing with the causes of motion, focusing on changes in motion. Newton's dynamics were concerned with the action of forces as causes of change. Skinner's rationale for using rate to describe behavior derived from an interest in changes in rate. Reinforcement has played the role of force in the dynamics of behavior. Behavioral momentum and maximization have received mathematical formulations in behavior analysis. Yet to be worked out are the relations between molar and molecular formulations of behavioral theory. PMID:22478028

  8. Quantitative proteomics analysis of adsorbed plasma proteins classifies nanoparticles with different surface properties and size

    SciTech Connect

    Zhang, Haizhen; Burnum, Kristin E.; Luna, Maria L.; Petritis, Brianne O.; Kim, Jong Seo; Qian, Weijun; Moore, Ronald J.; Heredia-Langner, Alejandro; Webb-Robertson, Bobbie-Jo M.; Thrall, Brian D.; Camp, David G.; Smith, Richard D.; Pounds, Joel G.; Liu, Tao

    2011-12-01

    In biofluids (e.g., blood plasma) nanoparticles are readily embedded in layers of proteins that can affect their biological activity and biocompatibility. Herein, we report a study on the interactions between human plasma proteins and nanoparticles with a controlled systematic variation of properties using stable isotope labeling and liquid chromatography-mass spectrometry (LC-MS) based quantitative proteomics. A novel protocol was developed to simplify the isolation of nanoparticle-bound proteins and improve reproducibility. Plasma proteins associated with polystyrene nanoparticles with three different surface chemistries and two sizes as well as for four different exposure times (for a total of 24 different samples) were identified and quantified by LC-MS analysis. Quantitative comparison of relative protein abundances was achieved by spiking an 18O-labeled 'universal reference' into each individually processed unlabeled sample as an internal standard, enabling simultaneous application of both label-free and isotopic labeling quantitation across the sample set. Clustering analysis of the quantitative proteomics data resulted in a distinctive pattern that classifies the nanoparticles based on their surface properties and size. In addition, the temporal data indicated that the stable protein 'corona' that was isolated for the quantitative analysis appeared to be formed in less than 5 minutes. The comprehensive results obtained herein using quantitative proteomics have potential implications towards predicting nanoparticle biocompatibility.

  9. Quantitative Analysis of Guanine Nucleotide Exchange Factors (GEFs) as Enzymes

    PubMed Central

    Randazzo, Paul A; Jian, Xiaoying; Chen, Pei-Wen; Zhai, Peng; Soubias, Olivier; Northup, John K

    2014-01-01

    The proteins that possess guanine nucleotide exchange factor (GEF) activity, which include about 800 G protein coupled receptors (GPCRs), 15 Arf GEFs, 81 Rho GEFs, 8 Ras GEFs, and others for other families of GTPases, catalyze the exchange of GTP for GDP on all regulatory guanine nucleotide binding proteins. Despite their importance as catalysts, relatively few exchange factors (we are aware of only eight for ras superfamily members) have been rigorously characterized kinetically. In some cases, kinetic analysis has been simplistic leading to erroneous conclusions about mechanism (as discussed in a recent review). In this paper, we compare two approaches for determining the kinetic properties of exchange factors: (i) examining individual equilibria, and (ii) analyzing the exchange factors as enzymes. Each approach, when thoughtfully used, provides important mechanistic information about the exchange factors. The analysis as enzymes is described in further detail. With the focus on the production of the biologically relevant guanine nucleotide binding protein complexed with GTP (G•GTP), we believe it is conceptually simpler to connect the kinetic properties to cellular effects. Further, the experiments are often more tractable than those used to analyze the equilibrium system and, therefore, more widely accessible to scientists interested in the function of exchange factors. PMID:25332840
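
    Analyzing an exchange factor as an enzyme, as the authors advocate, means characterizing G•GTP production with steady-state kinetics; a minimal Michaelis-Menten sketch (the parameter values are arbitrary, not from the paper):

```python
def gef_initial_rate(kcat, km, gef_total, g_gdp):
    """Michaelis-Menten initial rate of G*GTP production catalyzed by a GEF:
    v = kcat * [GEF] * [G*GDP] / (Km + [G*GDP])."""
    return kcat * gef_total * g_gdp / (km + g_gdp)

# At a substrate concentration equal to Km, the rate is half of Vmax.
v_half = gef_initial_rate(2.0, 5.0, 0.1, 5.0)  # Vmax = 2.0 * 0.1 = 0.2
```

    Fitting kcat and Km from initial-rate measurements at varied substrate concentrations is what allows the catalytic efficiency kcat/Km of different GEFs to be compared on a common footing.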

  10. High-throughput quantitative N-glycan analysis of glycoproteins.

    PubMed

    Doherty, Margaret; McManus, Ciara A; Duke, Rebecca; Rudd, Pauline M

    2012-01-01

    N-linked oligosaccharides are complex non-template-derived structures that are attached to the side chains of asparagine, via the nitrogen atom. Specific changes in the N-glycans of serum glycoproteins have been associated with the pathogenesis of many diseases. The oligosaccharides present on the C(H)2 domain of immunoglobulins are known to modulate the effector functions of the molecule. These glycans provoke various biological effects, necessitating the development of robust high-throughput technology in order to fully characterize the N-glycosylation profile. This chapter describes in detail four methods to release N-glycans from the glycoprotein of interest. Two of these protocols, referred to as the "In-Gel Block" and "1D sodium dodecyl sulfate-polyacrylamide gel electrophoresis" methods, require immobilization of the glycoprotein prior to analysis. An automated method is also described, involving the purification of immunoglobulins directly from fermentation media, and, finally, an "In-solution method" is detailed, which directly releases the N-glycans into solution. HILIC and WAX-HPLC are used to analyze the N-glycan profile. Exoglycosidase enzymes digestion arrays, in combination with computer-assisted data analysis, are used to determine both the sequence and linkage of the N-glycans present. PMID:22735961

  11. Quantitative phosphoproteomic analysis of prion-infected neuronal cells

    PubMed Central

    2010-01-01

    Prion diseases or transmissible spongiform encephalopathies (TSEs) are fatal diseases associated with the conversion of the cellular prion protein (PrPC) to the abnormal prion protein (PrPSc). Since the molecular mechanisms in pathogenesis are widely unclear, we analyzed the global phospho-proteome and detected a differential pattern of tyrosine- and threonine phosphorylated proteins in PrPSc-replicating and pentosan polysulfate (PPS)-rescued N2a cells in two-dimensional gel electrophoresis. To quantify phosphorylated proteins, we performed a SILAC (stable isotope labeling by amino acids in cell culture) analysis and identified 105 proteins, which showed a regulated phosphorylation upon PrPSc infection. Among those proteins, we validated the dephosphorylation of stathmin and Cdc2 and the induced phosphorylation of cofilin in PrPSc-infected N2a cells in Western blot analyses. Our analysis showed for the first time a differentially regulated phospho-proteome in PrPSc infection, which could contribute to the establishment of novel protein markers and to the development of novel therapeutic intervention strategies in targeting prion-associated disease. PMID:20920157

  12. Quantitative assessment of human body shape using Fourier analysis

    NASA Astrophysics Data System (ADS)

    Friess, Martin; Rohlf, F. J.; Hsiao, Hongwei

    2004-04-01

    Fall protection harnesses are commonly used to reduce the number and severity of injuries. Increasing the efficiency of harness design requires the size and shape variation of the user population to be assessed as detailed and as accurately as possible. In light of the unsatisfactory performance of traditional anthropometry with respect to such assessments, we propose the use of 3D laser surface scans of whole bodies and the statistical analysis of elliptic Fourier coefficients. Ninety-eight male and female adults were scanned. Key features of each torso were extracted as a 3D curve along front, back and the thighs. A 3D extension of elliptic Fourier analysis was used to quantify their shape through multivariate statistics. Shape change as a function of size (allometry) was predicted by regressing the coefficients onto stature, weight and hip circumference. Upper and lower limits of torso shape variation were determined and can be used to redefine the design of the harness that will fit most individual body shapes. Observed allometric changes are used for adjustments to the harness shape in each size. Finally, the estimated outline data were used as templates for a free-form deformation of the complete torso surface using NURBS models (non-uniform rational B-splines).
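
    As a simplified illustration of the idea (complex Fourier descriptors of a planar contour, not the full 3D elliptic Fourier formulation used in the study), the harmonic magnitudes below summarize a closed outline; for a circle, the first harmonic recovers the radius and the higher harmonics vanish:

```python
import cmath
import math

def fourier_descriptors(points, n_harmonics=4):
    """Magnitudes of the first few complex Fourier coefficients of a closed
    2D contour sampled as (x, y) points; rotation changes only the phase,
    so the magnitudes describe shape."""
    z = [complex(x, y) for x, y in points]
    n = len(z)
    return [abs(sum(z[p] * cmath.exp(-2j * math.pi * k * p / n)
                    for p in range(n)) / n)
            for k in range(1, n_harmonics + 1)]

# Sample a circle of radius 2: only the first harmonic is non-zero.
circle = [(2 * math.cos(2 * math.pi * p / 64),
           2 * math.sin(2 * math.pi * p / 64)) for p in range(64)]
fd = fourier_descriptors(circle)
```

    Coefficient vectors like these can then feed multivariate statistics (e.g. regression onto stature or weight), which is the role the elliptic Fourier coefficients play in the study.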

  13. Kinetic Analysis of Amylase Using Quantitative Benedict's and Iodine Starch Reagents

    ERIC Educational Resources Information Center

    Cochran, Beverly; Lunday, Deborah; Miskevich, Frank

    2008-01-01

    Quantitative analysis of carbohydrates is a fundamental analytical tool used in many aspects of biology and chemistry. We have adapted a technique developed by Mathews et al. using an inexpensive scanner and open-source image analysis software to quantify amylase activity using both the breakdown of starch and the appearance of glucose. Breakdown…
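
    Once scanned pixel intensities have been converted to glucose concentrations via a calibration curve, amylase activity reduces to the slope of concentration against time; a sketch with invented readings (the helper names and numbers are illustrative, not from the article):

```python
def ls_slope(x, y):
    """Least-squares slope of y against x."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    return (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
            / sum((xi - mx) ** 2 for xi in x))

def amylase_activity(times_min, glucose_mm):
    """Rate of glucose appearance (mM/min) from timed measurements."""
    return ls_slope(times_min, glucose_mm)

# Invented readings rising by 0.5 mM per minute.
rate = amylase_activity([0, 1, 2, 3], [0.0, 0.5, 1.0, 1.5])
```

    The same slope computation applies to starch disappearance from the iodine assay, with the sign of the rate reversed.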

  14. A Quantitative Content Analysis of Mercer University MEd, EdS, and Doctoral Theses

    ERIC Educational Resources Information Center

    Randolph, Justus J.; Gaiek, Lura S.; White, Torian A.; Slappey, Lisa A.; Chastain, Andrea; Harris, Rose Prejean

    2010-01-01

    Quantitative content analysis of a body of research not only helps budding researchers understand the culture, language, and expectations of scholarship, it helps identify deficiencies and inform policy and practice. Because of these benefits, an analysis of a census of 980 Mercer University MEd, EdS, and doctoral theses was conducted. Each thesis…

  15. A Quantitative Analysis of the Extrinsic and Intrinsic Turnover Factors of Relational Database Support Professionals

    ERIC Educational Resources Information Center

    Takusi, Gabriel Samuto

    2010-01-01

    This quantitative analysis explored the intrinsic and extrinsic turnover factors of relational database support specialists. Two hundred and nine relational database support specialists were surveyed for this research. The research was conducted based on Hackman and Oldham's (1980) Job Diagnostic Survey. Regression analysis and a univariate ANOVA…

  16. Kinetic Analysis of Amylase Using Quantitative Benedict's and Iodine Starch Reagents

    ERIC Educational Resources Information Center

    Cochran, Beverly; Lunday, Deborah; Miskevich, Frank

    2008-01-01

    Quantitative analysis of carbohydrates is a fundamental analytical tool used in many aspects of biology and chemistry. We have adapted a technique developed by Mathews et al. using an inexpensive scanner and open-source image analysis software to quantify amylase activity using both the breakdown of starch and the appearance of glucose. Breakdown…

  17. Digitally Enhanced Thin-Layer Chromatography: An Inexpensive, New Technique for Qualitative and Quantitative Analysis

    ERIC Educational Resources Information Center

    Hess, Amber Victoria Irish

    2007-01-01

    A study shows that combining digital photography with regular thin-layer chromatography (TLC) can greatly improve qualitative analysis and make accurate quantitative analysis possible at a much lower cost than commercial equipment. The findings suggest that digitally enhanced TLC (DE-TLC) is low-cost and easy…

  19. [Study on the multivariate quantitative analysis method for steel alloy elements using LIBS].

    PubMed

    Gu, Yan-hong; Li, Ying; Tian, Ye; Lu, Yuan

    2014-08-01

    Quantitative analysis of steel alloys was carried out using laser-induced breakdown spectroscopy (LIBS), taking into account the complex matrix effects in steel alloy samples. The laser-induced plasma was generated by a Q-switched Nd:YAG laser operating at 1064 nm with a pulse width of 10 ns and a repetition rate of 10 Hz. The LIBS signal was coupled into an echelle spectrometer and recorded by a highly sensitive ICCD detector. To establish the best experimental conditions, several parameters, such as the detection delay, the ICCD integration gate width and the detection position relative to the sample surface, were optimized. The experiments showed that the optimum detection delay was 1.5 μs, the optimal integration gate width was 2 μs and the best detection position was 1.5 mm below the sample surface. The samples used in the experiments were ten standard steel alloy samples and two unknown steel alloy samples. The quantitative analysis was carried out with the optimized experimental parameters, taking the elements Cr and Ni in the steel alloy samples as the detection targets. The analysis was performed with three methods: conventional univariate calibration, multiple linear regression, and partial least squares (PLS). The correlation coefficients of the calibration curves obtained with the univariate method were not high, and the relative errors for the two predicted samples were unsatisfactory, so univariate calibration cannot effectively serve quantitative analysis of multi-component, complex-matrix steel alloy samples. Multiple linear regression improved the accuracy effectively, and PLS proved to be the best of the three methods. Based on PLS, the correlation coefficient of the calibration curve is 0.981 for Cr and 0.995 for Ni. The concentrations of Cr and Ni in the two target samples were determined by PLS calibration, with relative errors for the two unknown steel alloy samples below 6.62% and 1.49%, respectively. These results show that in the quantitative analysis of steel alloys, the PLS calibration method effectively reduces the matrix effect and improves the accuracy of quantitative analysis. PMID:25508749
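The multivariate-calibration idea behind the paper's MLR/PLS models can be sketched as follows. This is a minimal illustration with synthetic data (the paper's actual ICCD spectra are not reproduced here): "spectra" whose channel intensities depend linearly on two element concentrations are regressed against known standards, then used to predict unknowns.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for LIBS calibration data: 10 "standard" samples whose
# 200-channel spectra depend linearly on two element concentrations
# (think Cr, Ni) plus noise. All numbers here are invented.
conc = rng.uniform(0.5, 5.0, size=(10, 2))           # known concentrations
basis = rng.normal(size=(2, 200))                    # characteristic line shapes
spectra = conc @ basis + 0.05 * rng.normal(size=(10, 200))

# Multivariate calibration: regress concentrations on several informative
# channels instead of a single emission line (univariate calibration).
channels = np.argsort(np.abs(basis).sum(axis=0))[-4:]    # 4 strongest channels
X = np.column_stack([spectra[:, channels], np.ones(10)]) # add intercept column
coef, *_ = np.linalg.lstsq(X, conc, rcond=None)

# Predict two "unknown" samples from their spectra.
unknown_conc = rng.uniform(0.5, 5.0, size=(2, 2))
unknown_spec = unknown_conc @ basis + 0.05 * rng.normal(size=(2, 200))
Xu = np.column_stack([unknown_spec[:, channels], np.ones(2)])
pred = Xu @ coef
rel_err = np.abs(pred - unknown_conc) / unknown_conc * 100  # percent error
print(rel_err)
```

Using several channels at once is what lets the model absorb matrix effects that defeat a single-line calibration; PLS goes further by building latent components from all channels.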

  1. Quantitative real-time single particle analysis of virions

    PubMed Central

    Heider, Susanne; Metzner, Christoph

    2014-01-01

    Providing information about single virus particles has for a long time been mainly the domain of electron microscopy. More recently, technologies have been developed—or adapted from other fields, such as nanotechnology—to allow for the real-time quantification of physical virion particles, while supplying additional information such as particle diameter concomitantly. These technologies have progressed to the stage of commercialization increasing the speed of viral titer measurements from hours to minutes, thus providing a significant advantage for many aspects of virology research and biotechnology applications. Additional advantages lie in the broad spectrum of virus species that may be measured and the possibility to determine the ratio of infectious to total particles. A series of disadvantages remain associated with these technologies, such as a low specificity for viral particles. In this review we will discuss these technologies by comparing four systems for real-time single virus particle analysis and quantification. PMID:24999044

  2. Automated monitoring and quantitative analysis of feeding behaviour in Drosophila

    PubMed Central

    Itskov, Pavel M.; Moreira, José-Maria; Vinnik, Ekaterina; Lopes, Gonçalo; Safarik, Steve; Dickinson, Michael H.; Ribeiro, Carlos

    2014-01-01

    Food ingestion is one of the defining behaviours of all animals, but its quantification and analysis remain challenging. This is especially the case for feeding behaviour in small, genetically tractable animals such as Drosophila melanogaster. Here, we present a method based on capacitive measurements, which allows the detailed, automated and high-throughput quantification of feeding behaviour. Using this method, we were able to measure the volume ingested in single sips of an individual, and monitor the absorption of food with high temporal resolution. We demonstrate that flies ingest food by rhythmically extending their proboscis with a frequency that is not modulated by the internal state of the animal. Instead, hunger and satiety homeostatically modulate the microstructure of feeding. These results highlight similarities of food intake regulation between insects, rodents, and humans, pointing to a common strategy in how the nervous systems of different animals control food intake. PMID:25087594

  3. Quantitative Analysis of Single Particle Trajectories: Mean Maximal Excursion Method

    PubMed Central

    Tejedor, Vincent; Bénichou, Olivier; Voituriez, Raphael; Jungmann, Ralf; Simmel, Friedrich; Selhuber-Unkel, Christine; Oddershede, Lene B.; Metzler, Ralf

    2010-01-01

    An increasing number of experimental studies employ single particle tracking to probe the physical environment in complex systems. We here propose and discuss what we believe are new methods to analyze the time series of the particle traces, in particular, for subdiffusion phenomena. We discuss the statistical properties of mean maximal excursions (MMEs), i.e., the maximal distance covered by a test particle up to time t. Compared to traditional methods focusing on the mean-squared displacement we show that the MME analysis performs better in the determination of the anomalous diffusion exponent. We also demonstrate that combination of regular moments with moments of the MME method provides additional criteria to determine the exact physical nature of the underlying stochastic subdiffusion processes. We put the methods to test using experimental data as well as simulated time series from different models for normal and anomalous dynamics such as diffusion on fractals, continuous time random walks, and fractional Brownian motion. PMID:20371337
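A minimal sketch of the MME idea, using simulated Brownian trajectories (all parameters invented here): the ensemble-averaged running maximum of the displacement should scale as t^(α/2) when the MSD scales as t^α, so a log-log fit of the MME estimates the anomalous exponent.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate 500 two-dimensional Brownian trajectories (normal diffusion,
# alpha = 1, so the MME should grow ~ t^0.5).
n_traj, n_steps = 500, 1000
steps = rng.normal(size=(n_traj, n_steps, 2))
paths = np.cumsum(steps, axis=1)

# Mean maximal excursion: for each trajectory, the running maximum of the
# distance from the starting point, averaged over the ensemble.
dist = np.linalg.norm(paths, axis=2)                 # |r(t)| per trajectory
mme = np.maximum.accumulate(dist, axis=1).mean(axis=0)

# Estimate the scaling exponent from a log-log fit; the fitted slope
# estimates alpha/2 (expected 0.5 here).
t = np.arange(1, n_steps + 1)
slope = np.polyfit(np.log(t[10:]), np.log(mme[10:]), 1)[0]
print(f"fitted MME exponent ~ {slope:.2f}")
```

For subdiffusive data (α < 1) the same fit would return a slope below 0.5, which is how the MME discriminates anomalous from normal dynamics.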

  4. Quantitative Analysis of Spectral Impacts on Silicon Photodiode Radiometers: Preprint

    SciTech Connect

    Myers, D. R.

    2011-04-01

    Inexpensive broadband pyranometers with silicon photodiode detectors have a non-uniform spectral response over the spectral range of 300-1100 nm. The response region includes only about 70% to 75% of the total energy in the terrestrial solar spectral distribution from 300 nm to 4000 nm. The solar spectrum constantly changes with solar position and atmospheric conditions. Relative spectral distributions of diffuse hemispherical irradiance sky radiation and total global hemispherical irradiance are drastically different. This analysis convolves a typical photodiode response with SMARTS 2.9.5 spectral model spectra for different sites and atmospheric conditions. Differences in solar component spectra lead to differences on the order of 2% in global hemispherical and 5% or more in diffuse hemispherical irradiances from silicon radiometers. The result is that errors of more than 7% can occur in the computation of direct normal irradiance from global hemispherical irradiance and diffuse hemispherical irradiance using these radiometers.
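The core of the analysis, weighting a spectrum by a band-limited detector response, can be pictured with toy curves. Both the spectrum and the response below are invented stand-ins, not the SMARTS 2.9.5 spectra or the measured photodiode response used in the paper.

```python
import numpy as np

# Toy solar-like spectral irradiance on a 1 nm grid from 300 to 4000 nm.
wl = np.arange(300.0, 4001.0)
spectrum = np.exp(-((wl - 800.0) / 700.0) ** 2)      # crude spectral shape

# Toy silicon photodiode response: nonzero only over ~300-1100 nm,
# peaking near 950 nm (real responses are tabulated, not Gaussian).
response = np.where(wl <= 1100.0, np.exp(-((wl - 950.0) / 250.0) ** 2), 0.0)

# Fraction of total spectral energy inside the 300-1100 nm response region
# (the paper quotes roughly 70-75% for real terrestrial spectra).
frac = spectrum[wl <= 1100.0].sum() / spectrum.sum()

# The radiometer's signal is the response-weighted integral, so a change in
# spectral shape shifts the reading even at constant total irradiance.
signal = (spectrum * response).sum()
print(f"energy fraction in 300-1100 nm: {frac:.2f}")
```

Running the same weighting against direct, global, and diffuse spectra is what exposes the component-dependent errors the paper quantifies.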

  5. Quantitative radiographic analysis of fiber reinforced polymer composites.

    PubMed

    Baidya, K P; Ramakrishna, S; Rahman, M; Ritchie, A

    2001-01-01

    X-ray radiographic examination of the bone fracture healing process is a widely used method in the treatment and management of patients. Medical devices made of metallic alloys reportedly produce considerable artifacts that make the interpretation of radiographs difficult. Fiber reinforced polymer composite materials have been proposed to replace metallic alloys in certain medical devices because of their radiolucency, light weight, and tailorable mechanical properties. The primary objective of this paper is to provide a comparative radiographic analysis of different fiber reinforced polymer composites that are considered suitable for biomedical applications. The composite materials investigated consist of glass, aramid (Kevlar-29), and carbon reinforcement fibers, and epoxy and polyether-ether-ketone (PEEK) matrices. The total mass attenuation coefficient of each material was measured using clinical X-rays (50 keV). The carbon fiber reinforced composites were found to be more radiolucent than the glass and Kevlar fiber reinforced composites. PMID:11261603
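The total mass attenuation coefficient follows from a transmission measurement via the Beer-Lambert law. A minimal sketch, with hypothetical numbers (not taken from the paper):

```python
import math

def mass_attenuation_coefficient(i0, i, density_g_cm3, thickness_cm):
    """Total mass attenuation coefficient (cm^2/g) from transmission:
    I = I0 * exp(-mu_m * rho * t), so mu_m = ln(I0/I) / (rho * t)."""
    return math.log(i0 / i) / (density_g_cm3 * thickness_cm)

# Hypothetical measurement: a 0.5 cm composite plate of density 1.6 g/cm^3
# transmitting 80% of the incident 50 keV beam intensity.
mu_m = mass_attenuation_coefficient(1.0, 0.80, 1.6, 0.5)
print(f"mu/rho = {mu_m:.3f} cm^2/g")
```

A lower mu/rho means a more radiolucent material, which is the basis for comparing the carbon, glass, and Kevlar composites.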

  6. Automated monitoring and quantitative analysis of feeding behaviour in Drosophila.

    PubMed

    Itskov, Pavel M; Moreira, José-Maria; Vinnik, Ekaterina; Lopes, Gonçalo; Safarik, Steve; Dickinson, Michael H; Ribeiro, Carlos

    2014-01-01

    Food ingestion is one of the defining behaviours of all animals, but its quantification and analysis remain challenging. This is especially the case for feeding behaviour in small, genetically tractable animals such as Drosophila melanogaster. Here, we present a method based on capacitive measurements, which allows the detailed, automated and high-throughput quantification of feeding behaviour. Using this method, we were able to measure the volume ingested in single sips of an individual, and monitor the absorption of food with high temporal resolution. We demonstrate that flies ingest food by rhythmically extending their proboscis with a frequency that is not modulated by the internal state of the animal. Instead, hunger and satiety homeostatically modulate the microstructure of feeding. These results highlight similarities of food intake regulation between insects, rodents, and humans, pointing to a common strategy in how the nervous systems of different animals control food intake. PMID:25087594

  7. Quantitative analysis of caffeine applied to pharmaceutical industry

    NASA Astrophysics Data System (ADS)

    Baucells, M.; Ferrer, N.; Gómez, P.; Lacort, G.; Roura, M.

    1993-03-01

    The direct determination of some compounds, like caffeine, in pharmaceutical samples without sample pretreatment and without separation of these compounds from the matrix (acetyl salicylic acid, paracetamol,…) is very worthwhile. It enables analysis to be performed quickly and without the problems associated with sample manipulation. The samples were diluted directly in KBr powder. We used both diffuse reflectance (DRIFT) and transmission techniques to measure the intensity of the caffeine peaks in the pharmaceutical matrix. Limits of detection and determination, relative standard deviation, and recovery, obtained using caffeine in the same matrix as in the pharmaceutical product, are reported. Two methods for the quantification of caffeine were used: the calibration-curve and standard-addition techniques.
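The standard-addition technique mentioned above can be sketched numerically: spike known amounts of analyte into the sample, fit signal versus added amount, and read the unknown content off the x-intercept. The signals below are synthetic (an assumed linear sensitivity), not the paper's data.

```python
import numpy as np

k = 0.8                                       # assumed sensitivity (signal/mg)
added = np.array([0.0, 5.0, 10.0, 20.0])      # mg caffeine spiked into aliquots
true_content = 12.0                           # mg, what the method should recover
signal = k * (true_content + added)           # synthetic linear response

# Fit signal vs. added amount; the unknown content is the magnitude of the
# x-intercept: content = intercept / slope.
slope, intercept = np.polyfit(added, signal, 1)
content = intercept / slope
print(f"recovered caffeine: {content:.1f} mg")
```

Standard addition is attractive here precisely because the matrix (aspirin, paracetamol) affects sample and spikes identically, so matrix effects cancel in the extrapolation.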

  8. Quantitative space-bandwidth product analysis in digital holography.

    PubMed

    Claus, Daniel; Iliescu, Daciana; Bryanston-Cross, Peter

    2011-12-01

    The space-bandwidth product (SBP) is a measure of the information capacity an optical system possesses. The two information-processing steps in digital holography, recording and reconstruction, are analyzed with respect to the SBP. The recording setups for a Fresnel hologram, a Fourier hologram, and an image-plane hologram, which represent the most commonly used configurations in digital holography, are investigated. For the recording process, the SBP required to capture the entire object information is calculated by analyzing the recorded interference pattern in the hologram plane. The paraxial diffraction model is used to simulate the light propagation from the object to the hologram plane. The SBP in the reconstruction process is represented by the product of the reconstructed field of view and the spatial frequency bandwidth. The outcome of this analysis is the best SBP-adapted digital holographic setup. PMID:22192996
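The field-of-view times bandwidth product for the recording sensor can be pictured with assumed sensor parameters (not from the paper): the SBP per dimension is the spatial extent multiplied by the Nyquist-limited spatial-frequency bandwidth.

```python
# Assumed sensor parameters for illustration only.
pixel_pitch_um = 5.0          # pixel pitch of the camera
n_pixels = 2048               # pixels per dimension

extent_mm = n_pixels * pixel_pitch_um / 1000.0       # field of view, mm
bandwidth_lp_mm = 1000.0 / (2 * pixel_pitch_um)      # Nyquist limit, line pairs/mm
sbp_1d = extent_mm * bandwidth_lp_mm                 # resolvable samples per axis
print(extent_mm, bandwidth_lp_mm, sbp_1d)
```

Note that sbp_1d comes out to n_pixels / 2: the sensor fixes the total information capacity, and the choice of Fresnel, Fourier, or image-plane geometry determines how that capacity is split between object extent and resolution.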

  9. Quantitative proteomic analysis of the brainstem following lethal sarin exposure.

    PubMed

    Meade, Mitchell L; Hoffmann, Andrea; Makley, Meghan K; Snider, Thomas H; Schlager, John J; Gearhart, Jeffery M

    2015-06-22

    The brainstem represents a major tissue area affected by sarin organophosphate poisoning due to its function in respiratory and cardiovascular control. While the acute toxic effects of sarin on brainstem-related responses are relatively unknown, other brain areas, e.g., the cortex and cerebellum, have been studied more extensively. The study objective was to analyze the guinea pig brainstem toxicology response following sarin (2×LD50) exposure by proteome pathway analysis to gain insight into the complex regulatory mechanisms that lead to impairment of respiratory and cardiovascular control. Guinea pig exposure to sarin resulted in the typical acute behavior/physiology outcomes with death between 15 and 25 min. In addition, brain and blood acetylcholinesterase activity was significantly reduced by sarin to 95% and 89% of control values, respectively. Isobaric-tagged (iTRAQ) liquid chromatography tandem mass spectrometry (LC-MS/MS) identified 198 total proteins, of which 23% were upregulated and 18% were downregulated following sarin exposure. Direct gene ontology (GO) analysis revealed a sarin-specific broad-spectrum proteomic profile including glutamate-mediated excitotoxicity, calcium overload, energy depletion responses, and compensatory carbohydrate metabolism, increases in ROS defense, DNA damage and chromatin remodeling, HSP response, targeted protein degradation (ubiquitination) and cell death response. With regard to the sarin-dependent effect on respiration, our study supports the potential interference of sarin with CO2/H+-sensitive chemoreceptor neurons of the brainstem retrotrapezoid nucleus (RTN) that send excitatory glutamatergic projections to the respiratory centers. In conclusion, this study gives insight into the brainstem broad-spectrum proteome following acute sarin exposure, and the information gained will assist in the development of novel countermeasures. PMID:25842371

  10. Scalp Surgery: Quantitative Analysis of Follicular Unit Growth

    PubMed Central

    Caruana, Giorgia

    2015-01-01

    Background: Over the years, different kinds of hair transplantation have been compared in an attempt to overcome male pattern alopecia and, at the same time, maximize both the survival and growth rate of grafted hair. In this study, we have assessed the survival and growth rate of follicular units (FU) in an in vitro model, as compared with that of conventional hair micrografts, to experimentally evaluate and elaborate on the differences between these 2 approaches in hair transplantation procedures. Methods: Group A (control; n = 100 follicles) was composed of hair micrografts, whereas FUs were assigned to Group B (experimental; n = 100 follicles, n = 35 FUs). Each group was cultured for a period of 10 days; the total stretch of follicles was measured soon after the harvest and 10 days later. The Kruskal-Wallis one-way analysis of variance on ranks test was used to perform statistical analysis. Results: The growth rate of follicles from Group A (mean 10-day shaft growth rate = 0.30 mm) proved to be statistically different compared with that of Group B (mean 10-day shaft growth rate = 0.23 mm). Conversely, our data did not show any significant difference between the survival rate of hair grafts from these 2 groups. Conclusions: Our data highlighted a reduced FU shaft growth compared with that of hair micrografts, corroborating, to a certain extent, the hypothesis that a significant amount of adipose tissue surrounding the follicle included in the graft may result in an inadequate nourishment supply to follicular cells. PMID:26579345

  11. Digital photogrammetry for quantitative wear analysis of retrieved TKA components.

    PubMed

    Grochowsky, J C; Alaways, L W; Siskey, R; Most, E; Kurtz, S M

    2006-11-01

    The use of new materials in knee arthroplasty demands a way to accurately quantify wear in retrieved components. Methods such as damage scoring, coordinate measurement, and in vivo wear analysis have been used in the past. The limitations of these methods illustrate the need for a methodology that can accurately quantify wear, is relatively easy to perform, and requires a minimal amount of expensive equipment. Off-the-shelf digital photogrammetry represents a potentially quick and easy alternative. Eighty tibial inserts were visually examined for front- and backside wear and digitally photographed in the presence of two calibrated reference fields. All images were segmented (via manual and automated algorithms) using Adobe Photoshop and National Institutes of Health ImageJ. Finally, wear was determined using ImageJ and Rhinoceros software. The absolute accuracy of the method and its repeatability/reproducibility by different observers were measured in order to determine the uncertainty of wear measurements. To determine if variation in wear measurements was due to implant design, 35 implants of the three most prevalent designs were subjected to retrieval analysis. The overall accuracy of area measurements was 97.8%. The error in automated segmentation was found to be significantly lower than that of manual segmentation. The photogrammetry method was found to be reasonably accurate and repeatable in measuring 2-D areas and applicable to determining wear. No significant variation in uncertainty was detected among different implant designs. Photogrammetry has a broad range of applicability since it is size- and design-independent. A minimal amount of off-the-shelf equipment is needed for the procedure, and no proprietary knowledge of the implant is required. PMID:16649169
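The calibrated-area step at the heart of this kind of photogrammetry can be sketched as follows: count the pixels in a segmented wear region and scale by a mm²-per-pixel factor derived from a reference field of known size photographed alongside the insert. All numbers below are invented for illustration.

```python
import numpy as np

# Pretend segmented binary mask (True = wear region) from a photograph.
image = np.zeros((100, 100), dtype=bool)
image[20:60, 30:80] = True                 # 40 x 50 pixel "wear" patch

# Calibration from a reference square of known physical size in the frame.
ref_pixels = 400                           # pixels covered by the reference
ref_area_mm2 = 25.0                        # its known physical area
mm2_per_pixel = ref_area_mm2 / ref_pixels

wear_area_mm2 = image.sum() * mm2_per_pixel
print(f"wear area: {wear_area_mm2:.1f} mm^2")
```

Because the scale comes from a reference imaged with each part, the method is independent of implant size and design, which is the property the paper highlights.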

  12. [Quantitative analysis of seven phenolic acids in eight Yinqiao Jiedu serial preparations by quantitative analysis of multi-components with single-marker].

    PubMed

    Wang, Jun-jun; Zhang, Li; Guo, Qing; Kou, Jun-ping; Yu, Bo-yang; Gu, Dan-hua

    2015-04-01

    The study aims to develop a unified method to determine seven phenolic acids (neochlorogenic acid, chlorogenic acid, 4-caffeoylquinic acid, caffeic acid, isochlorogenic acid B, isochlorogenic acid A and isochlorogenic acid C) contained in honeysuckle flower, the monarch drug of all eight Yinqiao Jiedu serial preparations, using quantitative analysis of multi-components by single marker (QAMS). Firstly, chlorogenic acid was used as the reference to obtain the average relative correction factors (RCFs) of the other phenolic acids relative to it; columns and instruments from different companies were used to validate the durability of the RCFs at different concentration levels of standard solutions; and honeysuckle flower extract was used as the reference substance to locate the chromatographic peaks. Secondly, the contents of the seven phenolic acids in eight different Yinqiao Jiedu serial preparation samples were calculated from the validated RCFs. Finally, the quantitative results were compared between QAMS and the external standard (ES) method. The results showed that the durability of the RCFs is good (RSD 0.80%-2.56%) and that there are no differences between the quantitative results of QAMS and ES (relative average deviation < 0.93%), so QAMS can be successfully applied to the quantitative control of honeysuckle flower as principally prescribed in the Yinqiao Jiedu serial preparations. PMID:26223132
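The arithmetic behind QAMS can be sketched in a few lines. This is a hedged illustration with invented peak areas and concentrations (not the paper's data): calibrate a response factor for the single marker, derive a relative correction factor (RCF) for another analyte from mixed standards, then quantify that analyte in a sample from its peak area alone.

```python
# Response factor: peak area per unit concentration.
def response_factor(area, conc):
    return area / conc

# From mixed standards: the marker (chlorogenic acid) and one other analyte
# (say caffeic acid). All areas/concentrations below are invented.
k_marker = response_factor(area=1500.0, conc=10.0)   # marker standard
k_other = response_factor(area=900.0, conc=10.0)     # other analyte standard

# Relative correction factor of the other analyte vs. the marker.
rcf = k_marker / k_other

# In a real sample only the marker needs direct calibration; the other
# analyte's concentration follows from its peak area and the RCF.
area_marker_sample, area_other_sample = 1200.0, 450.0
conc_marker = area_marker_sample / k_marker
conc_other = area_other_sample * rcf / k_marker
print(conc_marker, rcf, conc_other)
```

The durability checks in the paper amount to verifying that rcf stays stable across instruments, columns, and concentration levels; once it does, one marker standard quantifies all seven acids.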

  13. Label-Free Quantitative Proteomics and N-terminal Analysis of Human Metastatic Lung Cancer Cells

    PubMed Central

    Min, Hophil; Han, Dohyun; Kim, Yikwon; Cho, Jee Yeon; Jin, Jonghwa; Kim, Youngsoo

    2014-01-01

    Proteomic analysis is helpful in identifying cancer-associated proteins that are differentially expressed and fragmented and that can be annotated as dysregulated networks and pathways during metastasis. To examine the metastatic process in lung cancer, we performed a proteomics study by label-free quantitative analysis and N-terminal analysis in 2 human non-small-cell lung cancer cell lines with disparate metastatic potentials: NCI-H1703 (primary cell, stage I) and NCI-H1755 (metastatic cell, stage IV). We identified 2130 proteins, 1355 of which were common to both cell lines. In the label-free quantitative analysis, we used the NSAF normalization method, yielding 242 differentially expressed proteins. For the N-terminal proteome analysis, 325 N-terminal peptides, including 45 novel fragments, were identified in the 2 cell lines. Based on the two proteomic analyses, 11 quantitatively expressed proteins and 8 N-terminal peptides were enriched for the focal adhesion pathway. Most proteins from the quantitative analysis were upregulated in metastatic cancer cells, whereas a novel fragment of CRKL was detected only in primary cancer cells. This study increases our understanding of the NSCLC metastasis proteome. PMID:24805778
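The NSAF normalization mentioned above is simple to state: each protein's spectral counts are divided by its length (longer proteins yield more peptides), then normalized so the factors sum to one across the run. A minimal sketch with invented counts and lengths:

```python
# Invented example data: spectral counts and lengths for three proteins.
spectral_counts = [120, 45, 300]   # spectra matched to each protein
lengths = [400, 150, 1200]         # protein lengths (residues)

# Spectral abundance factor: counts corrected for protein length.
saf = [c / l for c, l in zip(spectral_counts, lengths)]

# Normalize so the factors sum to 1 across the run, making NSAF values
# comparable between LC-MS/MS runs of different total depth.
total = sum(saf)
nsaf = [s / total for s in saf]
print(nsaf)
```

Comparing NSAF values between the primary and metastatic cell-line runs is what yields the differentially expressed protein list.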

  14. Funtools: Fits Users Need Tools for Quick, Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Mandel, Eric; Brederkamp, Joe (Technical Monitor)

    2001-01-01

    The Funtools project arose out of conversations with astronomers about the decline in their software development efforts over the past decade. A stated reason for this decline is that it takes too much effort to master one of the existing FITS libraries simply in order to write a few analysis programs. This problem is exacerbated by the fact that astronomers typically develop new programs only occasionally, and the long interval between coding efforts often necessitates re-learning the FITS interfaces. We therefore set ourselves the goal of developing a minimal buy-in FITS library for researchers who are occasional (but serious) coders. In this case, "minimal buy-in" meant "easy to learn, easy to use, and easy to re-learn next month". Based on conversations with astronomers interested in writing code, we concluded that this goal could be achieved by emphasizing two essential capabilities. The first was the ability to write FITS programs without knowing much about FITS, i.e., without having to deal with the arcane rules for generating a properly formatted FITS file. The second was to support the use of already-familiar C/Unix facilities, especially C structs and Unix stdio. Taken together, these two capabilities would allow researchers to leverage their existing programming expertise while minimizing the need to learn new and complex coding rules.

  15. Quantitative Analysis with Heavy Ion E-TOF ERD

    SciTech Connect

    Banks, J.C.; Doyle, B.L.; Font, A. Climent

    1999-07-23

    Heavy-ion TOF ERD combined with energy detection (E-TOF-ERD) is a powerful analytical technique that takes advantage of the following facts: the scattering cross section is usually very high (~10^-21 cm^2/sr) compared with regular He RBS (~10^-25 cm^2/sr); unlike the energy resolution of ordinary solid-state surface-barrier detectors, the time resolution is almost independent of the atomic mass of the detected element; and the coincident detection of time and energy signals allows mass separation of overlapping signals with the same energy (or time of flight). Measurements on several oxides have been performed with the E-TOF-ERD setup at Sandia National Laboratories using an incident beam of 10-15 MeV Au. Information on the composition of the sample is obtained from the time-domain spectrum, which is converted to the energy domain and then analyzed with existing software codes. During quantification of the results, problems were found related to the interaction of the beam with the sample and to the tabulated stopping powers for heavy ions.
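The time-to-energy conversion underlying the analysis is just non-relativistic kinematics: once the recoil's mass is known from the E-TOF coincidence, E = m(L/t)²/2. A minimal sketch with an assumed flight path and an illustrative recoil (numbers not from the paper):

```python
import math

AMU_KG = 1.66053906660e-27     # atomic mass unit, kg
EV_J = 1.602176634e-19         # electron volt, J

def tof_to_energy_mev(mass_amu, flight_path_m, tof_ns):
    """Kinetic energy (MeV) from a time of flight: E = m * (L/t)^2 / 2.
    Non-relativistic, adequate for MeV-range recoils."""
    v = flight_path_m / (tof_ns * 1e-9)              # speed, m/s
    e_joule = 0.5 * mass_amu * AMU_KG * v * v
    return e_joule / EV_J / 1e6

# Illustrative: an oxygen recoil (16 amu) over an assumed 0.5 m flight path
# arriving 100 ns after the start signal.
e = tof_to_energy_mev(16.0, 0.5, 100.0)
print(f"{e:.2f} MeV")
```

Because t appears squared, the energy resolution of the converted spectrum is set by the timing resolution, which, as noted above, is nearly mass-independent.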

  16. A quantitative analysis of small atomic and molecular systems

    NASA Astrophysics Data System (ADS)

    Pollet, Lode

    2011-05-01

    Ultracold atoms in an optical lattice provide a unique toolbox for emulating the prototypical models of condensed matter physics. Before the optical lattice system can be trusted as a quantum simulator, however, it needs to be validated and benchmarked against known results, for which quantum Monte Carlo simulations are ideally suited. In the first part of this talk, an overview of recent numerical studies of ultracold bosonic and fermionic systems in an optical lattice will be given, starting with a full comparison of experimental time-of-flight images of bosons in an optical lattice with ab-initio simulations. Next, the temperature and entropy in present experiments on fermions in an optical lattice will be estimated, and the full thermodynamics on approach to the Neel temperature will be presented. In the second part of the talk, a similar numerical analysis will be given for polar bosonic molecules, with special emphasis on the feasibility of observing supersolid phases. We thank the Swiss National Science Foundation under grant PZ00P2_131892 for financial support.

  17. Quantitative safety analysis using fracture mechanics and ultrasonic stress measurements

    NASA Astrophysics Data System (ADS)

    Clark, Al V.; Anderson, Ted L.; Lozev, Margarit G.; Fuchs, P. A.

    1996-11-01

    Fracture mechanics can be applied to assess the safety of cracked bridge members. If the crack length and stresses are known, the crack driving force (stress intensity factor, K) can be calculated. K was calculated for hot-rolled beams as a function of crack length. K eventually becomes negative, indicating no further crack propagation. However, a cracked girder becomes compliant and 'sheds' load to uncracked neighboring members. Our calculations show that the changes in both compliance and load-carrying capacity of the cracked girder are small until the girder is deeply cracked. A finite-element analysis of a cracked girder showed that by determining the bending stresses at about one beam depth from the crack it is possible to determine K. Measurement of these stresses was simulated in a field test. The method used small changes in sound speed to determine stress. The ultrasonic transducers required no couplants and no surface preparation. They were also used to measure stresses in an integral-backwall bridge.

  18. Machine learning methods for quantitative analysis of Raman spectroscopy data

    NASA Astrophysics Data System (ADS)

    Madden, Michael G.; Ryder, Alan G.

    2003-03-01

    The automated identification and quantification of illicit materials using Raman spectroscopy is of significant importance for law enforcement agencies. This paper explores the use of Machine Learning (ML) methods in comparison with standard statistical regression techniques for developing automated identification methods. In this work, the ML task is broken into two sub-tasks, data reduction and prediction. In well-conditioned data, the number of samples should be much larger than the number of attributes per sample, to limit the degrees of freedom in predictive models. In this spectroscopy data, the opposite is normally true. Predictive models based on such data have a high number of degrees of freedom, which increases the risk of models over-fitting to the sample data and having poor predictive power. In the work described here, an approach to data reduction based on Genetic Algorithms is described. For the prediction sub-task, the objective is to estimate the concentration of a component in a mixture, based on its Raman spectrum and the known concentrations of previously seen mixtures. Here, Neural Networks and k-Nearest Neighbours are used for prediction. Preliminary results are presented for the problem of estimating the concentration of cocaine in solid mixtures, and compared with previously published results in which statistical analysis of the same dataset was performed. Finally, this paper demonstrates how more accurate results may be achieved by using an ensemble of prediction techniques.
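The prediction sub-task described above can be sketched with a plain k-nearest-neighbours regressor on synthetic mixture spectra (the paper's actual cocaine dataset is not reproduced here; all spectra below are invented linear mixes of a target component and a background):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in for Raman spectra of solid mixtures: each 200-channel
# spectrum is a linear mix of a "target component" and a background, plus noise.
component = rng.normal(size=200)
background = rng.normal(size=200)
conc = rng.uniform(0, 1, size=60)                    # known concentrations
spectra = (np.outer(conc, component)
           + np.outer(1 - conc, background)
           + 0.02 * rng.normal(size=(60, 200)))

def knn_predict(train_X, train_y, x, k=3):
    """kNN regression: average the concentrations of the k most spectrally
    similar training mixtures (Euclidean distance)."""
    d = np.linalg.norm(train_X - x, axis=1)
    return train_y[np.argsort(d)[:k]].mean()

# Train on 50 mixtures, predict the concentration of 10 held-out ones.
train_X, train_y = spectra[:50], conc[:50]
pred = np.array([knn_predict(train_X, train_y, x) for x in spectra[50:]])
mae = np.mean(np.abs(pred - conc[50:]))
print(f"mean absolute error: {mae:.3f}")
```

In the paper's setting, a Genetic Algorithm would first select a reduced channel subset before the distance computation, countering the many-attributes/few-samples problem the abstract describes.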

  19. Quantitative analysis of American woodcock nest and brood habitat

    USGS Publications Warehouse

    Bourgeois, A.

    1977-01-01

    Sixteen nest and 19 brood sites of American woodcock (Philohela minor) were examined in northern lower Michigan between 15 April and 15 June 1974 to determine habitat structure associated with these sites. Woodcock hens utilized young, second-growth forest stands which were similar in species composition for both nesting and brood rearing. A multivariate discriminant function analysis revealed a significant (P < 0.05) difference, however, in habitat structure. Nest habitat was characterized by lower tree density (2176 trees/ha) and basal area (8.6 m2/ha), by being close to forest openings (7 m), and by being situated on dry, relatively well drained sites. In contrast, woodcock broods were located in sites that had nearly twice the tree density (3934 trees/ha) and basal area (16.5 m2/ha), were located over twice as far from forest openings (18 m), and generally occurred on damp sites near (8 m) standing water. Importance of these habitat features to the species and possible management implications are discussed.

  20. Direct Quantitative Analysis of Arsenic in Coal Fly Ash

    PubMed Central

    Hartuti, Sri; Kambara, Shinji; Takeyama, Akihiro; Kumabe, Kazuhiro; Moritomi, Hiroshi

    2012-01-01

    A rapid, simple method based on graphite furnace atomic absorption spectrometry is described for the direct determination of arsenic in coal fly ash. Solid samples were directly introduced into the atomizer without preliminary treatment. The direct analysis method was not always free of spectral matrix interference, but the stabilization of arsenic by adding palladium nitrate (chemical modifier) and the optimization of the parameters in the furnace program (temperature, rate of temperature increase, hold time, and argon gas flow) gave good results for the total arsenic determination. The optimal furnace program was determined by analyzing different concentrations of a reference material (NIST1633b), which showed the best linearity for calibration. The optimized parameters for the furnace programs for the ashing and atomization steps were as follows: temperatures of 500–1200 and 2150°C, heating rates of 100 and 500°C s-1, hold times of 90 and 7 s, and medium then maximum and medium argon gas flows, respectively. The calibration plots were linear with a correlation coefficient of 0.9699. This method was validated using arsenic-containing raw coal samples in accordance with the requirements of the mass balance calculation; the distribution rate of As in the fly ashes ranged from 101 to 119%. PMID:23251836
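    The calibration-plot step above amounts to an ordinary least-squares line through absorbance readings at known arsenic levels, judged by its correlation coefficient. The sketch below shows that computation with invented standards (the numbers are not NIST1633b data).

    ```python
    def linear_fit(xs, ys):
        """Least-squares slope, intercept, and Pearson r for a calibration line."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sxx = sum((x - mx) ** 2 for x in xs)
        syy = sum((y - my) ** 2 for y in ys)
        slope = sxy / sxx
        intercept = my - slope * mx
        r = sxy / (sxx * syy) ** 0.5
        return slope, intercept, r

    conc = [0.0, 10.0, 20.0, 40.0]          # hypothetical As standards (ng)
    absorbance = [0.01, 0.11, 0.20, 0.41]   # hypothetical peak absorbances
    slope, intercept, r = linear_fit(conc, absorbance)
    ```

    Unknown samples are then read off the line as (absorbance - intercept) / slope, with r reporting how trustworthy that inversion is.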

  1. Quantitative analysis of bloggers' collective behavior powered by emotions

    NASA Astrophysics Data System (ADS)

    Mitrović, Marija; Paltoglou, Georgios; Tadić, Bosiljka

    2011-02-01

    Large-scale data resulting from users' online interactions provide the ultimate source of information to study emergent social phenomena on the Web. From individual actions of users to observable collective behaviors, different mechanisms involving emotions expressed in the posted text play a role. Here we combine approaches of statistical physics with machine-learning methods of text analysis to study the emergence of emotional behavior among Web users. Mapping the high-resolution data from digg.com onto bipartite networks of users and their comments onto posted stories, we identify user communities centered around certain popular posts and determine emotional contents of the related comments by the emotion classifier developed for this type of text. Applied over different time periods, this framework reveals strong correlations between the excess of negative emotions and the evolution of communities. We observe avalanches of emotional comments exhibiting significant self-organized critical behavior and temporal correlations. To explore the robustness of these critical states, we design a network-automaton model on realistic network connections and several control parameters, which can be inferred from the dataset. Dissemination of emotions by a small fraction of very active users appears to critically tune the collective states.

  2. Segmentation and learning in the quantitative analysis of microscopy images

    NASA Astrophysics Data System (ADS)

    Ruggiero, Christy; Ross, Amy; Porter, Reid

    2015-02-01

    In material science and bio-medical domains the quantity and quality of microscopy images is rapidly increasing and there is a great need to automatically detect, delineate and quantify particles, grains, cells, neurons and other functional "objects" within these images. These are challenging problems for image processing because of the variability in object appearance that inevitably arises in real world image acquisition and analysis. One of the most promising (and practical) ways to address these challenges is interactive image segmentation. These algorithms are designed to incorporate input from a human operator to tailor the segmentation method to the image at hand. Interactive image segmentation is now a key tool in a wide range of applications in microscopy and elsewhere. Historically, interactive image segmentation algorithms have tailored segmentation on an image-by-image basis, and information derived from operator input is not transferred between images. But recently there has been increasing interest to use machine learning in segmentation to provide interactive tools that accumulate and learn from the operator input over longer periods of time. These new learning algorithms reduce the need for operator input over time, and can potentially provide a more dynamic balance between customization and automation for different applications. This paper reviews the state of the art in this area, provides a unified view of these algorithms, and compares the segmentation performance of various design choices.

  3. Aroma in rice: genetic analysis of a quantitative trait.

    PubMed

    Lorieux, M; Petrov, M; Huang, N; Guiderdoni, E; Ghesquière, A

    1996-11-01

    A new approach succeeded in tagging for the first time a major gene and two QTLs controlling grain aroma in rice. It combined two techniques: quantification of volatile compounds in the cooking water by gas chromatography, and molecular marker mapping. Four types of molecular markers were used (RFLPs, RAPDs, STSs, isozymes). Evaluation and mapping were performed on a doubled haploid line population which (1) allowed precise character evaluation by enabling the analysis of large quantities of grains per genotype and (2) made possible the comparison of gas chromatography results and sensory tests. The population size (135 lines) provided good mapping precision. Several markers on chromosome 8 were found to be closely linked to a major gene controlling the presence of 2-acetyl-1-pyrroline (AcPy), the main compound of rice aroma. Moreover, our results showed that AcPy concentration in plants is regulated by at least two chromosomal regions. Estimates of recombination fractions on chromosome 8 were corrected for strong segregation distortion. This study confirms that AcPy is the major component of aroma. Use of the markers linked to the AcPy major gene and QTLs for marker-assisted selection by successive backcrosses may be envisaged. PMID:24162494
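    In a doubled haploid population, each line carries a single homozygous gametic genotype, so the recombination fraction between two markers is simply the share of recombinant lines. The sketch below illustrates that estimate with invented genotype calls (the paper's distortion correction is not modeled).

    ```python
    def recombination_fraction(marker_a, marker_b):
        """Fraction of doubled haploid lines recombinant between two markers.

        marker_a, marker_b: per-line parental-allele calls ('P1' or 'P2').
        Assumes no missing data; segregation-distortion correction omitted.
        """
        pairs = list(zip(marker_a, marker_b))
        recombinant = sum(1 for a, b in pairs if a != b)
        return recombinant / len(pairs)

    # Toy population of 135 lines (matching the study's size), 15 recombinants.
    a_calls = ["P1"] * 60 + ["P2"] * 60 + ["P1"] * 8 + ["P2"] * 7
    b_calls = ["P1"] * 60 + ["P2"] * 60 + ["P2"] * 8 + ["P1"] * 7
    rf = recombination_fraction(a_calls, b_calls)
    ```

    Small rf values indicate tight linkage, which is how the chromosome 8 markers flanking the AcPy gene were identified.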

  4. Selective quantitative analysis of the intensity of immunohistochemical reactions.

    PubMed

    Thal, D R; Horn, M; Schlote, W

    1995-04-01

    The present study reports a new method for the densitometric measurement of the intensity of immunohistochemical reactions. This method is based on a program for the Kontron VIDAS image analysis system and has been designed for the measurement of small differences in the relative intensity of immunohistochemical reactions. Immunohistochemistry was performed with the avidin-biotin-peroxidase complex and diaminobenzidine-HCl and H2O2 for enzyme visualization. Several methods for shade correction and image processing were elaborated. The study was carried out on gerbil Purkinje cells using monoclonal antibodies raised against calbindin D28k. Prerequisites for correct measurement were standardized preparation, i.e., identical thickness of the paraffin sections, identical performance of immunohistochemistry, and avoidance of any counterstaining. The evaluation of small intensity differences of immunohistochemical reactions was found to be feasible either by subtractive shade correction and standardized normalization or by shade correction by division by a reference image and standardized thresholding. Small differences in antigen concentration were not detectable without additional image processing. PMID:7660737
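    The "shade correction by division by a reference image" step can be sketched as a per-pixel divide against a blank-field image, so uneven illumination cancels. The tiny nested-list images and the rescaling constant below are illustrative assumptions, not VIDAS parameters.

    ```python
    def shade_correct(image, reference, scale=128):
        """Divide each pixel by the matching blank-field pixel and rescale.

        image, reference: nested lists of grey values (0-255). The max(r, 1)
        guard avoids division by zero in dark reference pixels.
        """
        return [
            [min(255, round(scale * p / max(r, 1))) for p, r in zip(prow, rrow)]
            for prow, rrow in zip(image, reference)
        ]

    image = [[100, 120], [60, 200]]
    reference = [[200, 240], [200, 250]]  # brighter where illumination is stronger
    corrected = shade_correct(image, reference)
    ```

    After correction, equal stain densities under unequal lighting map to similar grey values, which is what makes small intensity differences measurable.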

  5. Quantitative video-based gait pattern analysis for hemiparkinsonian rats.

    PubMed

    Lee, Hsiao-Yu; Hsieh, Tsung-Hsun; Liang, Jen-I; Yeh, Ming-Long; Chen, Jia-Jin J

    2012-09-01

    Gait disturbances are common in the rat model of Parkinson's disease (PD) induced by administering 6-hydroxydopamine. However, few studies have simultaneously assessed spatiotemporal gait indices and the kinematic information of PD rats during overground locomotion. This study utilized a simple, accurate, and reproducible method for quantifying the spatiotemporal and kinematic changes of gait patterns in hemiparkinsonian rats. A transparent walkway with a tilted mirror was set to capture underview footprints and lateral joint ankle images using a high-speed and high-resolution digital camera. The footprint images were semi-automatically processed with a threshold setting to identify the boundaries of soles and the critical points of each hindlimb for deriving the spatiotemporal and kinematic indices of gait. Following PD lesion, asymmetrical gait patterns including a significant decrease in the step/stride length and increases in the base of support and ankle joint angle were found. Increased footprint length, toe spread, and intermediary toe spread were found, indicating a compensatory gait pattern for impaired locomotor function. The temporal indices showed a significant decrease in walking speed with increased durations of the stance/swing phase and double support time, which was more evident in the affected hindlimb. Furthermore, the ankle kinematic data showed that the joint angle decreased at the toe contact stage. We conclude that the proposed gait analysis method can precisely detect locomotor function changes in PD rats, which is useful for objective assessment of novel treatments in the PD animal model. PMID:22707230
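    Two of the spatiotemporal indices above can be sketched directly from digitized footprint data: stride length from successive contact positions of the same hindlimb, and stance duration from touch-down and lift-off timestamps. The coordinates and times below are invented, not the paper's video measurements.

    ```python
    import math

    def stride_lengths(prints):
        """Distances between successive (x, y) contact positions of one paw (cm)."""
        return [math.dist(p, q) for p, q in zip(prints, prints[1:])]

    def stance_durations(contacts):
        """Stance time per step from (touch_down_s, lift_off_s) pairs."""
        return [off - on for on, off in contacts]

    left_hind = [(0.0, 0.0), (12.0, 0.5), (24.0, 0.0)]      # hypothetical prints, cm
    strides = stride_lengths(left_hind)
    stances = stance_durations([(0.00, 0.18), (0.40, 0.61)])  # hypothetical times, s
    ```

    Comparing these indices between the affected and unaffected hindlimbs is what exposes the asymmetry reported after the lesion.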

  6. Quantitative analysis of fall risk using TUG test.

    PubMed

    Zakaria, Nor Aini; Kuwae, Yutaka; Tamura, Toshiyo; Minato, Kotaro; Kanaya, Shigehiko

    2015-01-01

    We examined fall risk among the elderly using a wearable inertial sensor, combining accelerometer and gyrosensor devices, applied during the Timed Up and Go (TUG) test. Subjects were categorised into two groups, low fall risk and high fall risk, with a duration of 13.5 s to complete the TUG test as the threshold between them. One sensor was attached at the subject's waist dorsally, and acceleration and gyrosensor signals in three directions were extracted during the test. The analysis was carried out in phases: sit-bend, bend-stand, walking, turning, stand-bend and bend-sit. Comparisons between the two groups showed that time parameters along with root mean square (RMS) value, amplitude and other parameters could reveal the activities in each phase. Classification using the RMS value of angular velocity for the sit-stand phase, the RMS value of acceleration for the walking phase and the amplitude of the angular velocity signal for the turning phase, along with time parameters, suggests that this is an improved method of evaluating fall risk, which promises benefits in terms of improving elderly quality of life. PMID:23964848
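    The per-phase RMS feature described above is a plain root-mean-square over the sensor samples belonging to each movement phase. The signal values and phase boundaries below are synthetic, standing in for segmented gyrosensor data.

    ```python
    import math

    def rms(samples):
        """Root mean square of a signal segment."""
        return math.sqrt(sum(x * x for x in samples) / len(samples))

    # Hypothetical yaw angular-velocity trace (rad/s), already phase-segmented.
    gyro_yaw = [0.1, -0.2, 0.3, -0.1, 0.2, 2.1, -1.8, 2.0, -2.2, 0.1]
    rms_walk = rms(gyro_yaw[0:5])   # walking phase: little yaw rotation
    rms_turn = rms(gyro_yaw[5:9])   # turning phase: large yaw rotation
    ```

    A classifier then combines such per-phase RMS and amplitude values with phase durations to separate the low- and high-risk groups.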

  7. Quantitative real-time single particle analysis of virions

    SciTech Connect

    Heider, Susanne; Metzner, Christoph

    2014-08-15

    Providing information about single virus particles has for a long time been mainly the domain of electron microscopy. More recently, technologies have been developed—or adapted from other fields, such as nanotechnology—to allow for the real-time quantification of physical virion particles, while supplying additional information such as particle diameter concomitantly. These technologies have progressed to the stage of commercialization, increasing the speed of viral titer measurements from hours to minutes and thus providing a significant advantage for many aspects of virology research and biotechnology applications. Additional advantages lie in the broad spectrum of virus species that may be measured and the possibility to determine the ratio of infectious to total particles. A series of disadvantages remain associated with these technologies, such as a low specificity for viral particles. In this review we will discuss these technologies by comparing four systems for real-time single virus particle analysis and quantification. - Highlights: • We introduce four methods for virus particle-based quantification of viruses. • They allow for quantification of a wide range of samples in under an hour. • The additional measurement of size and zeta potential is possible for some.

  8. Methods for quantitative analysis of trabecular bone structure.

    PubMed

    Cortet, B; Colin, D; Dubois, P; Delcambre, B; Marchandise, X

    1995-12-01

    Bone mineral density accounts for 70% to 80% of the mechanical resistance of bone but provides no information on bone tissue structure. The vertebral fracture risk increases with advancing age irrespective of whether or not bone mineral density decreases, suggesting that changes in bone microarchitecture contribute significantly to the development of osteoporosis. In contrast to bone mass, bone architecture is difficult to evaluate. Among the various methods developed to investigate bone structure, biomechanical studies are of limited value since they are done on cadaver bones. Measurement of microarchitectural parameters (e.g., mean trabecular thickness, density and separation) in bone specimens obtained by needle biopsy is the gold-standard technique. Parameters reflecting trabecular interconnections (e.g., total number of nodes and free ends) can also be measured on needle biopsy specimens. New techniques of as yet unproven validity include star volume and trabecular bone pattern factor measurement. Noninvasive techniques capable of supplying qualitative information about bone tissue are also under study. Ultrasonography can theoretically provide data on bone microarchitecture but has not yet been proven useful in clinical practice. Statistical, structural, or fractal analysis techniques can be used to evaluate bone texture on digitized roentgenograms, computed tomography sections, or magnetic resonance imaging displays; although this approach holds great promise, it is still under evaluation and has not yet been compared with histomorphometry. Lastly, the apparent relaxation time of bone marrow determined using magnetic resonance imaging may also provide information on bone structure. PMID:8869221

  9. Compensation for Time-Dependent Star Tracker Thermal Deformation on the Aqua Spacecraft

    NASA Technical Reports Server (NTRS)

    Hashmall, Joseph A.; Natanson, Gregory; Glickman, Jonathan; Sedlak, Joseph

    2004-01-01

    Analysis of attitude sensor data from the Aqua mission showed small but systematic differences between batch least-squares and extended Kalman filter attitudes. These differences were also found to be correlated with star tracker residuals, gyro bias estimates, and star tracker baseplate temperatures. This paper describes the analysis that shows that these correlations are all consistent with a single cause: time-dependent thermal deformation of star tracker alignments. These varying alignments can be separated into relative and common components. The relative misalignments can be determined and compensated for. The common misalignments can only be determined in special cases.

  10. Statistical shape analysis using 3D Poisson equation-A quantitatively validated approach.

    PubMed

    Gao, Yi; Bouix, Sylvain

    2016-05-01

    Statistical shape analysis has been an important area of research with applications in biology, anatomy, neuroscience, agriculture, paleontology, etc. Unfortunately, the proposed methods are rarely quantitatively evaluated, and as shown in recent studies, when they are evaluated, significant discrepancies exist in their outputs. In this work, we concentrate on the problem of finding the consistent location of deformation between two populations of shapes. We propose a new shape analysis algorithm along with a framework to perform a quantitative evaluation of its performance. Specifically, the algorithm constructs a Signed Poisson Map (SPoM) by solving two Poisson equations on the volumetric shapes of arbitrary topology, and statistical analysis is then carried out on the SPoMs. The method is quantitatively evaluated on synthetic shapes and applied on real shape data sets in brain structures. PMID:26874288
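    The computational building block here is solving a discrete Poisson equation on a grid. As a minimal sketch, the snippet below solves the 1-D problem u'' = -2 with u(0) = u(1) = 0 (exact solution u(x) = x(1 - x)) by Jacobi iteration; the paper solves two such equations on 3-D volumetric shapes, which this toy case does not attempt.

    ```python
    def jacobi_poisson_1d(n=32, iters=8000):
        """Jacobi iteration for u'' = f on [0, 1] with u(0) = u(1) = 0.

        Uses the standard update u_i = (u_{i-1} + u_{i+1} - h^2 f_i) / 2
        on a grid of n intervals; f = -2 so the exact solution is x(1 - x).
        """
        h = 1.0 / n
        f = [-2.0] * (n + 1)
        u = [0.0] * (n + 1)
        for _ in range(iters):
            u = [0.0] + [
                0.5 * (u[i - 1] + u[i + 1] - h * h * f[i]) for i in range(1, n)
            ] + [0.0]
        return u

    u = jacobi_poisson_1d()
    mid = u[16]  # grid point x = 0.5; exact value 0.25
    ```

    For a quadratic solution the central-difference discretization is exact at the nodes, so the only error left is the (tiny) Jacobi convergence residual; production solvers use multigrid or sparse direct methods instead.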

  11. Quantitative flux analysis reveals folate-dependent NADPH production

    PubMed Central

    Fan, Jing; Ye, Jiangbin; Kamphorst, Jurre J.; Shlomi, Tomer; Thompson, Craig B.; Rabinowitz, Joshua D.

    2014-01-01

    ATP is the dominant energy source in animals for mechanical and electrical work (e.g., muscle contraction, neuronal firing). For chemical work, there is an equally important role for NADPH, which powers redox defense and reductive biosynthesis1. The most direct route to produce NADPH from glucose is the oxidative pentose phosphate pathway (oxPPP), with malic enzyme sometimes also important. While the relative contribution of glycolysis and oxidative phosphorylation to ATP production has been extensively analyzed, similar analysis of NADPH metabolism has been lacking. Here we demonstrate the ability to directly track, by liquid chromatography-mass spectrometry, the passage of deuterium from labeled substrates into NADPH, and combine this approach with carbon labeling and mathematical modeling to measure cytosolic NADPH fluxes. In proliferating cells, the largest contributor to cytosolic NADPH is the oxPPP. Surprisingly a nearly comparable contribution comes from serine-driven one-carbon metabolism, where oxidation of methylene tetrahydrofolate to 10-formyl-tetrahydrofolate is coupled to reduction of NADP+ to NADPH. Moreover, tracing of mitochondrial one-carbon metabolism revealed complete oxidation of 10-formyl-tetrahydrofolate to make NADPH. Since folate metabolism has not previously been considered an NADPH producer, confirmation of its functional significance was undertaken through knockdown of methylenetetrahydrofolate dehydrogenase (MTHFD) genes. Depletion of either the cytosolic or mitochondrial MTHFD isozyme resulted in decreased cellular NADPH/NADP+ and GSH/GSSG ratios and increased cell sensitivity to oxidative stress. Thus, while the importance of folate metabolism for proliferating cells has been long recognized and attributed to its function of producing one carbon units for nucleic acid synthesis, another crucial function of this pathway is generating reducing power. PMID:24805240

  12. Quantitative flux analysis reveals folate-dependent NADPH production

    NASA Astrophysics Data System (ADS)

    Fan, Jing; Ye, Jiangbin; Kamphorst, Jurre J.; Shlomi, Tomer; Thompson, Craig B.; Rabinowitz, Joshua D.

    2014-06-01

    ATP is the dominant energy source in animals for mechanical and electrical work (for example, muscle contraction or neuronal firing). For chemical work, there is an equally important role for NADPH, which powers redox defence and reductive biosynthesis. The most direct route to produce NADPH from glucose is the oxidative pentose phosphate pathway, with malic enzyme sometimes also important. Although the relative contribution of glycolysis and oxidative phosphorylation to ATP production has been extensively analysed, similar analysis of NADPH metabolism has been lacking. Here we demonstrate the ability to directly track, by liquid chromatography-mass spectrometry, the passage of deuterium from labelled substrates into NADPH, and combine this approach with carbon labelling and mathematical modelling to measure NADPH fluxes. In proliferating cells, the largest contributor to cytosolic NADPH is the oxidative pentose phosphate pathway. Surprisingly, a nearly comparable contribution comes from serine-driven one-carbon metabolism, in which oxidation of methylene tetrahydrofolate to 10-formyl-tetrahydrofolate is coupled to reduction of NADP+ to NADPH. Moreover, tracing of mitochondrial one-carbon metabolism revealed complete oxidation of 10-formyl-tetrahydrofolate to make NADPH. As folate metabolism has not previously been considered an NADPH producer, confirmation of its functional significance was undertaken through knockdown of methylenetetrahydrofolate dehydrogenase (MTHFD) genes. Depletion of either the cytosolic or mitochondrial MTHFD isozyme resulted in decreased cellular NADPH/NADP+ and reduced/oxidized glutathione ratios (GSH/GSSG) and increased cell sensitivity to oxidative stress. Thus, although the importance of folate metabolism for proliferating cells has been long recognized and attributed to its function of producing one-carbon units for nucleic acid synthesis, another crucial function of this pathway is generating reducing power.

  13. Quantitative ultrasound texture analysis for clinical decision making support

    NASA Astrophysics Data System (ADS)

    Wu, Jie Ying; Beland, Michael; Konrad, Joseph; Tuomi, Adam; Glidden, David; Grand, David; Merck, Derek

    2015-03-01

    We propose a general ultrasound (US) texture-analysis and machine-learning framework for detecting the presence of disease that is suitable for clinical application across clinicians, disease types, devices, and operators. Its stages are image selection, image filtering, ROI selection, feature parameterization, and classification. Each stage is modular and can be replaced with alternate methods. Thus, this framework is adaptable to a wide range of tasks. Our two preliminary clinical targets are hepatic steatosis and adenomyosis diagnosis. For steatosis, we collected US images from 288 patients and their pathology-determined values of steatosis (%) from biopsies. Two radiologists independently reviewed all images and identified the region of interest (ROI) most representative of the hepatic echotexture for each patient. To parameterize the images into comparable quantities, we filter the US images at multiple scales for various texture responses. For each response, we collect a histogram of pixel features within the ROI, and parameterize it as a Gaussian function using its mean, standard deviation, kurtosis, and skew to create a 36-feature vector. Our algorithm uses a support vector machine (SVM) for classification. Using a threshold of 10%, we achieved 72.81% overall accuracy, 76.18% sensitivity, and 65.96% specificity in identifying steatosis with leave-ten-out cross-validation (p<0.0001). Extending this framework to adenomyosis, we identified 38 patients with MR-confirmed findings of adenomyosis and previous US studies and 50 controls. A single rater picked the best US image and ROI for each case. Using the same processing pipeline, we obtained 76.14% accuracy, 86.00% sensitivity, and 63.16% specificity with leave-one-out cross-validation (p<0.0001).
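    The feature-parameterization stage above reduces each ROI histogram to four moments: mean, standard deviation, skew, and kurtosis. The sketch below computes those four numbers for one synthetic pixel sample; in the paper this is repeated per filter response to build the 36-feature vector, and the SVM stage is omitted here.

    ```python
    import math

    def roi_features(pixels):
        """Mean, std, skewness, and kurtosis of the pixel values in an ROI.

        Population (biased) moment estimators, adequate for a sketch.
        """
        n = len(pixels)
        mean = sum(pixels) / n
        var = sum((p - mean) ** 2 for p in pixels) / n
        std = math.sqrt(var)
        skew = sum((p - mean) ** 3 for p in pixels) / (n * std**3)
        kurt = sum((p - mean) ** 4 for p in pixels) / (n * std**4)
        return mean, std, skew, kurt

    pixels = [10, 12, 11, 13, 12, 50]  # one bright outlier skews the histogram
    mean, std, skew, kurt = roi_features(pixels)
    ```

    The single bright outlier shows up as positive skew and heavy-tailed kurtosis, which is the kind of echotexture difference the classifier is meant to pick up.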

  14. Segmentation of vascular structures and hematopoietic cells in 3D microscopy images and quantitative analysis

    NASA Astrophysics Data System (ADS)

    Mu, Jian; Yang, Lin; Kamocka, Malgorzata M.; Zollman, Amy L.; Carlesso, Nadia; Chen, Danny Z.

    2015-03-01

    In this paper, we present image processing methods for quantitative study of how the bone marrow microenvironment changes (characterized by altered vascular structure and hematopoietic cell distribution) caused by diseases or various factors. We develop algorithms that automatically segment vascular structures and hematopoietic cells in 3-D microscopy images, perform quantitative analysis of the properties of the segmented vascular structures and cells, and examine how such properties change. In processing images, we apply local thresholding to segment vessels, and add post-processing steps to deal with imaging artifacts. We propose an improved watershed algorithm that relies on both intensity and shape information and can separate multiple overlapping cells better than common watershed methods. We then quantitatively compute various features of the vascular structures and hematopoietic cells, such as the branches and sizes of vessels and the distribution of cells. In analyzing vascular properties, we provide algorithms for pruning fake vessel segments and branches based on vessel skeletons. Our algorithms can segment vascular structures and hematopoietic cells with good quality. We use our methods to quantitatively examine the changes in the bone marrow microenvironment caused by deletion of the Notch pathway. Our quantitative analysis reveals property changes in samples with the Notch pathway deleted. Our tool is useful for biologists to quantitatively measure changes in the bone marrow microenvironment and to develop possible therapeutic strategies that help the bone marrow microenvironment recover.
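    The segmentation front end above (thresholding, then grouping foreground voxels into objects) can be sketched in 2-D as connected-component labeling by breadth-first search. A fixed threshold stands in for the paper's local thresholding, and the improved watershed that splits touching cells is deliberately not reproduced here.

    ```python
    from collections import deque

    def label_components(image, threshold):
        """Threshold a 2-D grey image and label 4-connected foreground regions.

        Returns (number_of_components, label_image). BFS flood fill; a stand-in
        for the paper's local thresholding + watershed pipeline.
        """
        h, w = len(image), len(image[0])
        labels = [[0] * w for _ in range(h)]
        current = 0
        for sy in range(h):
            for sx in range(w):
                if image[sy][sx] > threshold and labels[sy][sx] == 0:
                    current += 1
                    labels[sy][sx] = current
                    queue = deque([(sy, sx)])
                    while queue:
                        y, x = queue.popleft()
                        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                            if (0 <= ny < h and 0 <= nx < w
                                    and image[ny][nx] > threshold
                                    and labels[ny][nx] == 0):
                                labels[ny][nx] = current
                                queue.append((ny, nx))
        return current, labels

    image = [
        [0, 9, 9, 0, 0],
        [0, 9, 0, 0, 8],
        [0, 0, 0, 8, 8],
    ]
    n_cells, labels = label_components(image, threshold=5)
    ```

    Plain labeling merges overlapping cells into one region, which is precisely the failure mode the paper's intensity-and-shape watershed is designed to fix.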

  15. Corrections to the MODIS Aqua Calibration Derived From MODIS Aqua Ocean Color Products

    NASA Technical Reports Server (NTRS)

    Meister, Gerhard; Franz, Bryan Alden

    2013-01-01

    Ocean color products, such as chlorophyll-a concentration, can be derived from the top-of-atmosphere radiances measured by imaging sensors on earth-orbiting satellites. There are currently three National Aeronautics and Space Administration sensors in orbit capable of providing ocean color products. One of these sensors is the Moderate Resolution Imaging Spectroradiometer (MODIS) on the Aqua satellite, whose ocean color products are currently the most widely used of the three. A recent improvement to the MODIS calibration methodology has used land targets to improve the calibration accuracy. This study evaluates the new calibration methodology and describes further calibration improvements that are built upon the new methodology by including ocean measurements in the form of global temporally averaged water-leaving reflectance measurements. The calibration improvements presented here mainly modify the calibration at the scan edges, taking advantage of the good performance of the land target trending in the center of the scan.

  16. Quantitative underwater 3D motion analysis using submerged video cameras: accuracy analysis and trajectory reconstruction.

    PubMed

    Silvatti, Amanda P; Cerveri, Pietro; Telles, Thiago; Dias, Fábio A S; Baroni, Guido; Barros, Ricardo M L

    2013-01-01

    In this study we aim at investigating the applicability of underwater 3D motion capture based on submerged video cameras in terms of 3D accuracy analysis and trajectory reconstruction. Static points with the classical direct linear transform (DLT) solution, a moving wand with bundle adjustment, and a moving 2D plate with Zhang's method were considered for camera calibration. As an example of the final application, we reconstructed hand motion trajectories in different swimming styles and qualitatively compared these with Maglischo's model. Four highly trained male swimmers performed butterfly, breaststroke and freestyle tasks. The middle fingertip trajectories of both hands in the underwater phase were considered. The accuracy (mean absolute error) of the two calibration approaches (wand: 0.96 mm - 2D plate: 0.73 mm) was comparable to out-of-water results and highly superior to the classical DLT results (9.74 mm). Among all the swimmers, the hand trajectories of the expert swimmer in each style were almost symmetric and in good agreement with Maglischo's model. The kinematic results highlight symmetry or asymmetry between the two hand sides, intra- and inter-subject variability in terms of the motion patterns, and agreement or disagreement with the model. The two outcomes, calibration results and trajectory reconstruction, both support quantitative 3D underwater motion analysis. PMID:22435960

  17. AquaSMART: Water & Boating Safety, Grades 3-5. Teacher's Guide.

    ERIC Educational Resources Information Center

    Texas State Dept. of Parks and Wildlife, Austin.

    This teacher's guide accompanies a program designed to teach water and boating safety to students in grades 3-5. The written curriculum accompanies a video, AquaSMART 3-5. The theme of the curriculum is AquaSMART. To become AquaSMART, students must learn 10 basic lessons for water and boating safety. The written curriculum begins with an overview…

  18. AquaSMART: Water & Boating Safety, Grades K-2. Teacher's Guide.

    ERIC Educational Resources Information Center

    Texas State Dept. of Parks and Wildlife, Austin.

    This teacher's guide accompanies a program designed to teach water and boating safety to students in grades K-2. The written curriculum accompanies a video, AquaSMART K-2. The theme of the curriculum is AquaSMART. To become AquaSMART, students must learn 10 basic lessons for water and boating safety. The teacher's guide begins with an overview of…

  19. Computerized rapid high resolution quantitative analysis of plasma lipoproteins based upon single vertical spin centrifugation.

    PubMed

    Cone, J T; Segrest, J P; Chung, B H; Ragland, J B; Sabesin, S M; Glasscock, A

    1982-08-01

    A method has been developed for rapidly quantitating the cholesterol concentration of normal and certain variant lipoproteins in a large number of patients (over 240 in one week). The method employs a microcomputer interfaced to the vertical autoprofiler (VAP) described earlier (Chung et al. 1981. J. Lipid Res. 22: 1003-1014). Software developed to accomplish rapid on-line analysis of the VAP signal uses peak shapes and positions derived from prior VAP analysis of isolated authentic lipoproteins HDL, LDL, and VLDL to quantitate these species in a VAP profile. Variant lipoproteins VHDL (a species with density greater than that of HDL(3)), MDL (a species, most likely Lp(a), with density intermediate between that of HDL and LDL), and IDL are subsequently quantitated by a method combining difference calculations with curve shapes. The procedure has been validated qualitatively by negative stain electron microscopy, gradient gel electrophoresis, strip electrophoresis, chemical analysis of the lipids, radioimmunoassay of the apolipoproteins, and measurement of the density of the peak centers. It has been validated quantitatively by comparison with Lipid Research Clinic methodology for HDL-, LDL-, and VLDL-cholesterol, and for MDL- and IDL-cholesterol by comparison of the amounts of MDL or IDL predicted to be present by the method with that known to be present following standard addition to whole plasma. These validations show that the method is a rapid and accurate technique of lipoprotein analysis suitable for the routine screening of patients for abnormal amounts of normal or variant lipoproteins, as well as for use as a research tool for quantitation of changes in cholesterol content of six or seven different plasma lipoprotein fractions.-Cone, J. T., J. P. Segrest, B. H. Chung, J. B. Ragland, S. M. Sabesin, and A. Glasscock. Computerized rapid high resolution quantitative analysis of plasma lipoproteins based upon single vertical spin centrifugation. PMID:7130860

  20. Quantitative analysis of urinary stone composition with micro-Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Huang, Yi-Yu; Chiu, Yi-Chun; Chiang, Huihua Kenny; Chou, Y. H. Jet; Lu, Shing-Hwa; Chiu, Allen W.

    2010-02-01

    Urolithiasis is a common, troublesome disease with a high recurrence rate (60% within five years). Accurate identification of urinary stone composition is important for treatment and prevention. Our previous studies demonstrated that a micro-Raman spectroscopy (MRS)-based approach successfully identifies the composition of tiny stone powders obtained after minimally invasive urological surgery, but quantitative analysis of urinary stones had not yet been established. In this study, human urinary stones composed of two of the components COM, COD, HAP, and uric acid were analyzed quantitatively using a 632.98 nm Raman spectrometric system. The quantitative analysis was based on calibration curves constructed from known mixtures of synthetically prepared pure COM, COD, HAP, and uric acid. First, binary mixtures (COM with HAP, COM with COD, or COM with uric acid) were prepared at various concentration (mole fraction) ratios. Second, the intensities of the characteristic bands at 1462 cm-1 (COM), 1477 cm-1 (COD), 961 cm-1 (HAP), and 1402 cm-1 (uric acid) were used for the intensity calculations. Spectra of binary mixtures of known concentration ratio were recorded as the basis for the quantitative analysis; plotting the ratios of the relative intensities of the corresponding Raman bands against the inverse of the COM concentration yielded a linear dependence. Third, urinary stone fragments collected from patients after treatment were analyzed using the calibration curves, and unknown samples were quantified by interpolation. We thus developed an MRS-based quantitative analytical method for measuring two-component urinary stones.
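
    The calibration-and-interpolation step described above can be sketched as follows. This is a minimal illustration, not the authors' code: the band-intensity values and the simple linear model are invented for a hypothetical COM/COD binary mixture.

```python
import numpy as np

# Hypothetical calibration data for binary COM/COD mixtures:
# x = known mole fraction of COM in the synthetic mixture,
# y = relative intensity I_1462 / (I_1462 + I_1477) of the COM band.
x_cal = np.array([0.0, 0.25, 0.50, 0.75, 1.0])
y_cal = np.array([0.02, 0.27, 0.51, 0.76, 0.98])

# Least-squares straight line through the calibration points
slope, intercept = np.polyfit(x_cal, y_cal, 1)

def com_fraction(intensity_ratio):
    """Invert the calibration line to estimate the COM mole fraction
    of an unknown stone fragment from its measured band-intensity ratio."""
    return (intensity_ratio - intercept) / slope

# Unknown fragment: measured relative intensity of the 1462 cm^-1 band
print(round(com_fraction(0.40), 2))
```

    The same pattern (one calibration line per binary pair, then interpolation) applies to the COM/HAP and COM/uric-acid mixtures.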

  1. Analysis of mixed cell cultures with quantitative digital holographic phase microscopy

    NASA Astrophysics Data System (ADS)

    Kemper, Björn; Wibbeling, Jana; Ketelhut, Steffi

    2014-05-01

    In order to study, for example, the influence of pharmaceuticals or pathogens on different cell types under identical measurement conditions, and to analyze interactions between different cellular specimens, a minimally invasive quantitative observation of mixed cell cultures is of particular interest. Quantitative phase microscopy (QPM) provides high resolution detection of optical path length changes that is suitable for stain-free, minimally invasive live cell analysis. Due to the low light intensities used for object illumination, QPM minimizes interaction with the sample and is particularly suitable for long term time-lapse investigations, e.g., for the detection of cell morphology alterations due to drugs and toxins. Furthermore, QPM has been demonstrated to be a versatile tool for the quantification of cellular growth, the extraction of morphological parameters, and cell motility. We studied the feasibility of QPM for the analysis of mixed cell cultures. It was explored whether quantitative phase images provide sufficient information to distinguish between different cell types and to extract cell specific parameters. For the experiments, quantitative phase imaging with digital holographic microscopy (DHM) was utilized. Mixed cell cultures with different types of human pancreatic tumor cells were observed with quantitative DHM phase contrast for up to 35 h. The obtained series of quantitative phase images were evaluated by adapted algorithms for image segmentation. From the segmented images, the cellular dry mass and the mean cell thickness were calculated and used in the further analysis as parameters to quantify the reliability of the measurement principle. The obtained results demonstrate that it is possible to characterize the growth of cell types with different morphologies in a mixed cell culture separately by consideration of specimen size and cell thickness in the evaluation of quantitative DHM phase images.
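
    The two parameters named above follow from a segmented phase image by standard QPM relations: dry mass is proportional to the integrated optical path difference, and thickness follows from the phase via the cell-medium refractive index difference. The sketch below assumes hypothetical values for the wavelength, pixel size, refraction increment (~0.19 ml/g is a commonly cited figure), and index difference; none of these numbers come from the paper.

```python
import numpy as np

WAVELENGTH_UM = 0.532        # assumed illumination wavelength (µm)
ALPHA_UM3_PER_PG = 0.19      # assumed specific refraction increment (~0.19 ml/g)
DELTA_N = 0.04               # assumed cell-medium refractive index difference
PIXEL_AREA_UM2 = 0.1 * 0.1   # assumed lateral sampling (µm^2 per pixel)

def dry_mass_pg(phase_rad):
    """Cellular dry mass (pg) from a segmented phase image (radians):
    m = (lambda / 2*pi*alpha) * sum(phase) * pixel_area."""
    opd_sum_um = WAVELENGTH_UM / (2 * np.pi) * phase_rad.sum()
    return opd_sum_um * PIXEL_AREA_UM2 / ALPHA_UM3_PER_PG

def mean_thickness_um(phase_rad):
    """Mean cell thickness (µm) over the segmented region:
    d = phase * lambda / (2*pi*delta_n)."""
    mask = phase_rad > 0
    return WAVELENGTH_UM / (2 * np.pi * DELTA_N) * phase_rad[mask].mean()

# Toy segmented cell: uniform 2 rad phase over a 50x50-pixel region
cell = np.zeros((100, 100))
cell[25:75, 25:75] = 2.0
print(dry_mass_pg(cell), mean_thickness_um(cell))
```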

  2. Laser-induced breakdown spectroscopy for in situ qualitative and quantitative analysis of mineral ores

    NASA Astrophysics Data System (ADS)

    Pořízka, P.; Demidov, A.; Kaiser, J.; Keivanian, J.; Gornushkin, I.; Panne, U.; Riedel, J.

    2014-11-01

    In this work, the potential of laser-induced breakdown spectroscopy (LIBS) for discrimination and analysis of geological materials was examined. The research was focused on classification of mineral ores using their LIBS spectra prior to quantitative determination of copper. Quantitative analysis is not a trivial task in LIBS measurement because intensities of emission lines in laser-induced plasmas (LIP) are strongly affected by the sample matrix (matrix effect). To circumvent this effect, typically matrix-matched standards are used to obtain matrix-dependent calibration curves. If the sample set consists of a mixture of different matrices, even in this approach, the corresponding matrix has to be known prior to the downstream data analysis. For this categorization, the multielemental character of LIBS spectra can be of help. In this contribution, a principal component analysis (PCA) was employed on the measured data set to discriminate individual rocks as individual matrices against each other according to their overall elemental composition. Twenty-seven igneous rock samples were analyzed in the form of fine dust, classified and subsequently quantitatively analyzed. Two different LIBS setups in two laboratories were used to prove the reproducibility of classification and quantification. A superposition of partial calibration plots constructed from the individual clustered data displayed a large improvement in precision and accuracy compared to the calibration plot constructed from all ore samples. The classification of mineral samples with complex matrices can thus be recommended prior to LIBS system calibration and quantitative analysis.
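
    The two-step workflow (unsupervised matrix classification, then per-matrix calibration) can be sketched on synthetic data. Everything below is invented for illustration: the "spectra" are random vectors with a matrix-dependent background and one Cu channel whose sensitivity depends on the matrix, standing in for the matrix effect the paper describes.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic stand-in for LIBS spectra: two ore matrices with different
# background levels, plus one Cu emission channel whose sensitivity
# depends on the matrix (the "matrix effect"). All values are invented.
n, n_ch, cu_ch = 40, 64, 10
matrix = rng.integers(0, 2, n)                 # true matrix label (unknown in practice)
cu = rng.uniform(0.5, 5.0, n)                  # Cu content, wt%
spectra = rng.normal(0.0, 0.02, (n, n_ch))
spectra += np.where(matrix[:, None] == 0, 0.5, 3.0)        # matrix-dependent background
spectra[:, cu_ch] += cu * np.where(matrix == 0, 1.0, 0.4)  # matrix-dependent sensitivity

# Step 1: PCA scores + clustering separate the matrices before calibration
scores = PCA(n_components=3).fit_transform(spectra)
cluster = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)

# Step 2: one calibration line per cluster instead of a single global line
sens = {}
for k in (0, 1):
    sel = cluster == k
    sens[k], _ = np.polyfit(cu[sel], spectra[sel, cu_ch], 1)
    print(f"cluster {k}: Cu sensitivity {sens[k]:.2f} per wt%")
```

    A single global calibration over both matrices would average the two sensitivities and degrade both accuracy and precision, which is the effect the partial, per-cluster calibration plots avoid.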

  3. Possibility of quantitative estimation of blood cell forms by the spatial-frequency spectrum analysis

    NASA Astrophysics Data System (ADS)

    Spiridonov, Igor N.; Safonova, Larisa P.; Samorodov, Andrey V.

    2000-05-01

    At present, hematology lacks quantitative estimates of two parameters that are important for cell classification: cell form and nuclear form. Because these morphological parameters do not correlate with the parameters measured by hemoanalyzers, neither flow cytometers nor computer recognition systems provide a complete clinical blood analysis. Analysis of the spatial-frequency spectra (SFS) of blood samples (smears and liquid probes) permits estimating these forms quantitatively. Based on the theoretical and experimental research carried out, an algorithm for quantitative form estimation from SFS parameters has been created, and criteria for the quality of these estimates have been proposed. A test bench based on coherent optical and digital processors was assembled. The results could be applied to the automated classification of either normal or pathological blood cells in standard blood smears.
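
    The paper does not reproduce its algorithm, but the underlying idea, deriving a form descriptor from the spatial-frequency spectrum, can be illustrated with a digital stand-in: the second-moment anisotropy of the 2D power spectrum distinguishes round from elongated objects. The descriptor below is a generic illustration, not the authors' SFS parameter set.

```python
import numpy as np

def sfs_anisotropy(img):
    """Form descriptor from the spatial-frequency spectrum (SFS):
    ratio of the principal second moments of the 2D power spectrum.
    Approximately 1 for round objects, larger for elongated ones."""
    P = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    cy, cx = P.shape[0] // 2, P.shape[1] // 2
    P[cy, cx] = 0.0                      # drop the DC term (no shape information)
    y, x = np.indices(P.shape)
    y, x = y - cy, x - cx
    w = P / P.sum()
    cov = np.array([[np.sum(w * x * x), np.sum(w * x * y)],
                    [np.sum(w * x * y), np.sum(w * y * y)]])
    ev = np.sort(np.linalg.eigvalsh(cov))
    return ev[1] / ev[0]

# Toy "cells": a disk and a 4:1 ellipse on a 128x128 grid
yy, xx = np.indices((128, 128))
round_cell = ((xx - 64) ** 2 + (yy - 64) ** 2 < 20 ** 2).astype(float)
oval_cell = (((xx - 64) / 40) ** 2 + ((yy - 64) / 10) ** 2 < 1).astype(float)
print(sfs_anisotropy(round_cell), sfs_anisotropy(oval_cell))
```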

  4. The correlation of contrast-enhanced ultrasound and MRI perfusion quantitative analysis in rabbit VX2 liver cancer.

    PubMed

    Xiang, Zhiming; Liang, Qianwen; Liang, Changhong; Zhong, Guimian

    2014-12-01

    Our objective was to explore the value of contrast-enhanced ultrasound (CEUS) and quantitative MRI perfusion analysis in liver cancer and the correlation between the two methods. A rabbit VX2 liver cancer model was established. For CEUS, SonoVue was administered via the ear vein to dynamically observe and record blood perfusion and its changes in VX2 liver cancer and the surrounding tissue. Quantitative MRI perfusion analysis was used to measure the mean enhancement time (MTE) and the maximal slope of increase (MSI), which were further compared with the pathological findings. Quantitative indicators from CEUS and from MRI perfusion analysis were compared, and their correlation was assessed by correlation analysis. The rabbit VX2 liver cancer model was successfully established. CEUS showed that the time-intensity curve of rabbit VX2 liver cancer followed a "fast in, fast out" pattern, while quantitative MRI perfusion analysis showed that the MTE of tumor tissue increased and the MSI decreased; the difference was statistically significant (P < 0.01). The diagnostic results of CEUS and quantitative MRI perfusion analysis were not significantly different (P > 0.05), but their quantitative parameters were significantly positively correlated (P < 0.05). CEUS and quantitative MRI perfusion analysis can both dynamically monitor liver cancer lesions and the surrounding liver parenchyma, and their quantitative parameters are correlated. The combined application of both is of value in the early diagnosis of liver cancer. PMID:25123838
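
    Descriptors like the ones compared above are typically read off a time-intensity curve (TIC). As a generic sketch (not the study's software), the peak time, steepest wash-in slope, and mean wash-out rate of a synthetic "fast in, fast out" curve can be computed directly:

```python
import numpy as np

def tic_metrics(t, intensity):
    """Wash-in/wash-out descriptors of a time-intensity curve:
    peak time, maximal wash-in slope (an MSI analogue), and mean wash-out rate."""
    i_peak = int(np.argmax(intensity))
    slopes = np.diff(intensity) / np.diff(t)
    max_washin = slopes[:i_peak].max()      # steepest rise before the peak
    washout = -slopes[i_peak:].mean()       # average fall after the peak
    return t[i_peak], max_washin, washout

# Toy "fast in, fast out" enhancement curve (arbitrary units, seconds)
t = np.linspace(0.0, 60.0, 121)
tic = 100.0 * (1 - np.exp(-t / 3.0)) * np.exp(-t / 20.0)
print(tic_metrics(t, tic))
```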

  5. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    ERIC Educational Resources Information Center

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

  6. Quantitative Analysis of Organic Compounds: A Simple and Rapid Method for Use in Schools

    ERIC Educational Resources Information Center

    Schmidt, Hans-Jurgen

    1973-01-01

    Describes the procedure for making a quantitative analysis of organic compounds suitable for secondary school chemistry classes. Using the Schoniger procedure, the organic compound, such as PVC, is decomposed in a conical flask with oxygen. The products are absorbed in a suitable liquid and analyzed by titration. (JR)

  7. ANSI/ASHRAE/IES Standard 90.1-2013 Preliminary Determination: Quantitative Analysis

    SciTech Connect

    Halverson, Mark A.; Rosenberg, Michael I.; Wang, Weimin; Zhang, Jian; Mendon, Vrushali V.; Athalye, Rahul A.; Xie, YuLong; Hart, Reid; Goel, Supriya

    2014-03-01

    This report provides a preliminary quantitative analysis to assess whether buildings constructed according to the requirements of ANSI/ASHRAE/IES Standard 90.1-2013 would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IES Standard 90.1-2010.

  8. Teaching Fundamental Skills in Microsoft Excel to First-Year Students in Quantitative Analysis

    ERIC Educational Resources Information Center

    Rubin, Samuel J.; Abrams, Binyomin

    2015-01-01

    Despite their technological savvy, most students entering university lack the necessary computer skills to succeed in a quantitative analysis course, in which they are often expected to input, analyze, and plot results of experiments without any previous formal education in Microsoft Excel or similar programs. This lack of formal education results…

  9. Forty Years of the "Journal of Librarianship and Information Science": A Quantitative Analysis, Part I

    ERIC Educational Resources Information Center

    Furner, Jonathan

    2009-01-01

    This paper reports on the first part of a two-part quantitative analysis of volume 1-40 (1969-2008) of the "Journal of Librarianship and Information Science" (formerly the "Journal of Librarianship"). It provides an overview of the current state of LIS research journal publishing in the UK; a review of the publication and printing history of…

  10. Whose American Government? A Quantitative Analysis of Gender and Authorship in American Politics Texts

    ERIC Educational Resources Information Center

    Cassese, Erin C.; Bos, Angela L.; Schneider, Monica C.

    2014-01-01

    American government textbooks signal to students the kinds of topics that are important and, by omission, the kinds of topics that are not important to the discipline of political science. This article examines portrayals of women in introductory American politics textbooks through a quantitative content analysis of 22 widely used texts. We find…

  11. A Quantitative Discourse Analysis of Student-Initiated Checks of Understanding during Teacher-Fronted Lessons

    ERIC Educational Resources Information Center

    Shepherd, Michael A.

    2012-01-01

    Recent research highlights the paradoxical importance of students' being able to check their understanding with teachers and of teachers' constraining student participation. Using quantitative discourse analysis, this paper examines third graders' discursive strategies in initiating such checks and teachers' strategies in constraining them. The…

  12. A GC-FID method for quantitative analysis of N,N-carbonyldiimidazole.

    PubMed

    Lee, Claire; Mangion, Ian

    2016-03-20

    N,N-Carbonyldiimidazole (CDI), a common synthetic reagent used in commercial scale pharmaceutical synthesis, is known to be sensitive to hydrolysis from ambient moisture. This liability demands a simple, robust analytical method to quantitatively determine reagent quality to ensure reproducible performance in chemical reactions. This work describes a protocol for a rapid GC-FID based analysis of CDI. PMID:26773533
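
    The abstract does not spell out the quantitation step, but GC-FID assays of reagent quality commonly use internal-standard quantitation with a relative response factor. The sketch below shows that generic calculation; the peak areas, masses, and response factor are invented and are not taken from the paper.

```python
def assay_wt_pct(area_analyte, area_istd, mass_istd_mg, mass_sample_mg, rrf):
    """Generic internal-standard GC-FID quantitation:
    analyte wt% = (A_analyte / A_istd) / RRF * (m_istd / m_sample) * 100,
    where RRF is the analyte's response factor relative to the internal standard."""
    return area_analyte / area_istd / rrf * mass_istd_mg / mass_sample_mg * 100.0

# Hypothetical run: peak areas 1.2e6 (analyte) and 1.0e6 (internal standard),
# 10 mg internal standard weighed into a 12 mg sample, RRF = 1.05
print(round(assay_wt_pct(1.2e6, 1.0e6, 10.0, 12.0, 1.05), 2))
```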

  13. A Colorimetric Analysis Experiment Not Requiring a Spectrophotometer: Quantitative Determination of Albumin in Powdered Egg White

    ERIC Educational Resources Information Center

    Charlton, Amanda K.; Sevcik, Richard S.; Tucker, Dorie A.; Schultz, Linda D.

    2007-01-01

    A general science experiment for high school chemistry students might serve as an excellent review of the concepts of solution preparation, solubility, pH, and qualitative and quantitative analysis of a common food product. The students could learn to use safe laboratory techniques, collect and analyze data using proper scientific methodology and…

  16. A Quantitative Features Analysis of Recommended No- and Low-Cost Preschool E-Books

    ERIC Educational Resources Information Center

    Parette, Howard P.; Blum, Craig; Luthin, Katie

    2015-01-01

    In recent years, recommended e-books have drawn increasing attention from early childhood education professionals. This study applied a quantitative descriptive features analysis of cost (n = 70) and no-cost (n = 60) e-books recommended by the Texas Computer Education Association. While t tests revealed no statistically significant differences…

  17. Quantitative Analysis of Science and Chemistry Textbooks for Indicators of Reform: A Complementary Perspective

    ERIC Educational Resources Information Center

    Kahveci, Ajda

    2010-01-01

    In this study, multiple thematically based and quantitative analysis procedures were utilized to explore the effectiveness of Turkish chemistry and science textbooks in terms of their reflection of reform. The themes gender equity, questioning level, science vocabulary load, and readability level provided the conceptual framework for the analyses.…

  20. Qualitative and quantitative analysis of mixtures of compounds containing both hydrogen and deuterium

    NASA Technical Reports Server (NTRS)

    Crespi, H. L.; Harkness, L.; Katz, J. J.; Norman, G.; Saur, W.

    1969-01-01

    Method allows qualitative and quantitative analysis of mixtures of partially deuterated compounds. Nuclear magnetic resonance spectroscopy determines location and amount of deuterium in organic compounds but not fully deuterated compounds. Mass spectroscopy can detect fully deuterated species but not the location.

  2. Quantitative and Qualitative Analysis of Nutrition and Food Safety Information in School Science Textbooks of India

    ERIC Educational Resources Information Center

    Subba Rao, G. M.; Vijayapushapm, T.; Venkaiah, K.; Pavarala, V.

    2012-01-01

    Objective: To assess quantity and quality of nutrition and food safety information in science textbooks prescribed by the Central Board of Secondary Education (CBSE), India for grades I through X. Design: Content analysis. Methods: A coding scheme was developed for quantitative and qualitative analyses. Two investigators independently coded the…

  4. A Quantitative Analysis of Cognitive Strategy Usage in the Marking of Two GCSE Examinations

    ERIC Educational Resources Information Center

    Suto, W. M. Irenka; Greatorex, Jackie

    2008-01-01

    Diverse strategies for marking GCSE examinations have been identified, ranging from simple automatic judgements to complex cognitive operations requiring considerable expertise. However, little is known about patterns of strategy usage or how such information could be utilised by examiners. We conducted a quantitative analysis of previous verbal…

  5. Quantitative analysis of TATB impurities by thin layer chromatography. Process development endeavor No. 205

    SciTech Connect

    Schaeffer, C.L.

    1983-01-01

    A method for the quantitative analysis of impurities in TATB was developed using a densitometer to analyze spots on thin-layer chromatographic plates. The precision of the densitometer itself averaged a standard deviation of 3.98%; spotting and developing the plate reduced the precision to a standard deviation of 5.4%.

  7. Quantitative Intersectionality: A Critical Race Analysis of the Chicana/o Educational Pipeline

    ERIC Educational Resources Information Center

    Covarrubias, Alejandro

    2011-01-01

    Utilizing the critical race framework of intersectionality, this research reexamines the Chicana/o educational pipeline through a quantitative intersectional analysis. This approach disaggregates data along the intersection of race, class, gender, and citizenship status to provide a detailed portrait of the educational trajectory of Mexican-origin…

  8. Gas chromatograph-mass spectrometer (GC/MS) system for quantitative analysis of reactive chemical compounds

    DOEpatents

    Grindstaff, Quirinus G. (Oak Ridge, TN)

    1992-01-01

    Described is a new gas chromatograph-mass spectrometer (GC/MS) system and method for quantitative analysis of reactive chemical compounds. All components of such a GC/MS system external to the oven of the gas chromatograph are programmably temperature controlled to operate at a volatilization temperature specific to the compound(s) sought to be separated and measured.

  9. QUANTITATIVE PCR ANALYSIS OF MOLDS IN THE DUST FROM HOMES OF ASTHMATIC CHILDREN IN NORTH CAROLINA

    EPA Science Inventory

    The vacuum bag (VB) dust was analyzed by mold specific quantitative PCR. These results were compared to the analysis survey calculated for each of the homes. The mean and standard deviation (SD) of the ERMI values in the homes of the NC asthmatic children was 16.4 (6.77), compa...

  10. Clinical applications of a quantitative analysis of regional left ventricular wall motion

    NASA Technical Reports Server (NTRS)

    Leighton, R. F.; Rich, J. M.; Pollack, M. E.; Altieri, P. I.

    1975-01-01

    Observations that may have clinical application are summarized. They were obtained from a quantitative analysis of wall motion used to detect both hypokinesis and tardokinesis in left ventricular cineangiograms. The method was based on statistical comparisons with normal values for regional wall motion derived from the cineangiograms of patients found not to have heart disease.

  11. A Quantitative Categorical Analysis of Metadata Elements in Image-Applicable Metadata Schemas.

    ERIC Educational Resources Information Center

    Greenberg, Jane

    2001-01-01

    Reports on a quantitative categorical analysis of metadata elements in the Dublin Core, VRA (Visual Resource Association) Core, REACH (Record Export for Art and Cultural Heritage), and EAD (Encoded Archival Description) metadata schemas, all of which can be used for organizing and describing images. Introduces a new schema comparison methodology…

  12. Quantitative analysis of bristle number in Drosophila mutants identifies genes involved in neural development

    NASA Technical Reports Server (NTRS)

    Norga, Koenraad K.; Gurganus, Marjorie C.; Dilda, Christy L.; Yamamoto, Akihiko; Lyman, Richard F.; Patel, Prajal H.; Rubin, Gerald M.; Hoskins, Roger A.; Mackay, Trudy F.; Bellen, Hugo J.

    2003-01-01

    BACKGROUND: The identification of the function of all genes that contribute to specific biological processes and complex traits is one of the major challenges in the postgenomic era. One approach is to employ forward genetic screens in genetically tractable model organisms. In Drosophila melanogaster, P element-mediated insertional mutagenesis is a versatile tool for the dissection of molecular pathways, and there is an ongoing effort to tag every gene with a P element insertion. However, the vast majority of P element insertion lines are viable and fertile as homozygotes and do not exhibit obvious phenotypic defects, perhaps because of the tendency for P elements to insert 5' of transcription units. Quantitative genetic analysis of subtle effects of P element mutations that have been induced in an isogenic background may be a highly efficient method for functional genome annotation. RESULTS: Here, we have tested the efficacy of this strategy by assessing the extent to which screening for quantitative effects of P elements on sensory bristle number can identify genes affecting neural development. We find that such quantitative screens uncover an unusually large number of genes that are known to function in neural development, as well as genes with yet uncharacterized effects on neural development, and novel loci. CONCLUSIONS: Our findings establish the use of quantitative trait analysis for functional genome annotation through forward genetics. Similar analyses of quantitative effects of P element insertions will facilitate our understanding of the genes affecting many other complex traits in Drosophila.

  13. Synthesis of quantitative and qualitative evidence for accident analysis in risk-based highway planning.

    PubMed

    Lambert, James H; Peterson, Kenneth D; Joshi, Nilesh N

    2006-09-01

    Accident analysis involves the use of both quantitative and qualitative data in decision-making. The aim of this paper is to demonstrate the synthesis of relevant quantitative and qualitative evidence for accident analysis and for planning a large and diverse portfolio of highway investment projects. The proposed analysis and visualization techniques along with traditional mathematical modeling serve as an aid to planners, engineers, and the public in comparing the benefits of current and proposed improvement projects. The analysis uses data on crash rates, average daily traffic, cost estimates from highway agency databases, and project portfolios for regions and localities. It also utilizes up to two motivations out of seven that are outlined in the Transportation Equity Act for the 21st Century (TEA-21). Three case studies demonstrate the risk-based approach to accident analysis for short- and long-range transportation plans. The approach is adaptable to other topics in accident analysis and prevention that involve the use of quantitative and qualitative evidence, risk analysis, and multi-criteria decision-making for project portfolio selection. PMID:16730627
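
    The quantitative side of such comparisons typically starts from an exposure-normalized crash rate computed from the crash counts and average daily traffic the paper mentions. As a rough sketch (the traffic figures below are invented, not from the case studies):

```python
def crash_rate_per_mvmt(crashes, adt, years, segment_miles):
    """Crashes per million vehicle-miles of travel (MVMT) for a road segment:
    the standard exposure-normalized rate used to rank candidate projects."""
    exposure_mvmt = adt * 365 * years * segment_miles / 1e6
    return crashes / exposure_mvmt

# Hypothetical segment: 24 crashes over 3 years, ADT 15,000, 2.5 miles long
print(round(crash_rate_per_mvmt(24, 15_000, 3, 2.5), 2))
```

    Rates like this, rather than raw crash counts, let segments with very different traffic volumes be compared on equal footing before the multi-criteria portfolio selection.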

  14. A quantitative ratiometric sensor for time-resolved analysis of auxin dynamics

    PubMed Central

    Wend, Sabrina; Bosco, Cristina Dal; Kämpf, Michael M.; Ren, Fugang; Palme, Klaus; Weber, Wilfried; Dovzhenko, Alexander; Zurbriggen, Matias D.

    2013-01-01

    Time-resolved quantitative analysis of auxin-mediated processes in plant cells is as of yet limited. By applying a synergistic mammalian and plant synthetic biology approach, we have developed a novel ratiometric luminescent biosensor with wide applicability in the study of auxin metabolism, transport, and signalling. The sensitivity and kinetic properties of our genetically encoded biosensor open new perspectives for the analysis of highly complex auxin dynamics in plant growth and development. PMID:23787479

  15. [Application of calibration curve method and partial least squares regression analysis to quantitative analysis of nephrite samples using XRF].

    PubMed

    Liu, Song; Su, Bo-min; Li, Qing-hui; Gan, Fu-xi

    2015-01-01

    The authors sought a method for quantitative analysis using pXRF without solid bulk stone/jade reference samples. Twenty-four nephrite samples were selected: 17 as calibration samples and the other 7 as test samples. All nephrite samples were first analyzed quantitatively by proton-induced X-ray emission spectroscopy (PIXE). Based on the PIXE results for the calibration samples, calibration curves were created for the components/elements of interest and used to analyze the test samples quantitatively; qualitative spectra of all nephrite samples were then obtained by pXRF. Using the PIXE results and the qualitative spectra of the calibration samples, partial least squares (PLS) regression was applied for the quantitative analysis of the test samples. Finally, the results for the test samples obtained by the calibration curve method, the PLS method, and PIXE were compared, and the accuracy of the calibration curve and PLS methods was estimated. The results indicate that the PLS method is a viable alternative for the quantitative analysis of stone/jade samples. PMID:25993858

  16. Quantitative analysis of the mixtures of illicit drugs using terahertz time-domain spectroscopy

    NASA Astrophysics Data System (ADS)

    Jiang, Dejun; Zhao, Shusen; Shen, Jingling

    2008-03-01

    A method is proposed for the quantitative inspection of mixtures of illicit drugs using terahertz time-domain spectroscopy. The mass percentage of each component in a mixture can be obtained by linear regression analysis, on the assumption that all components in the mixture and their absorption features are known. Because illicit drugs are scarce and expensive, we first used common chemicals (benzophenone, anthraquinone, pyridoxine hydrochloride, and L-ascorbic acid) in the experiment; an illicit drug and a common adulterant, methamphetamine and flour, were then selected. The experimental results were in close agreement with the actual contents, suggesting that this can be an effective method for the quantitative identification of illicit drugs.
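
    The linear-regression unmixing step can be sketched as a constrained least-squares fit of the mixture spectrum against the known pure-component spectra. The reference spectra below are invented shapes (a Gaussian absorption feature versus a featureless baseline), not measured THz data; a non-negativity constraint is a natural choice here since mass fractions cannot be negative.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical pure-component THz absorption spectra
freqs = np.linspace(0.2, 2.5, 50)                   # THz
ref_a = np.exp(-0.5 * ((freqs - 1.2) / 0.1) ** 2)   # component with a sharp feature
ref_b = 0.3 + 0.2 * freqs                           # featureless adulterant

# Simulated mixture spectrum: 30% A + 70% B plus measurement noise
rng = np.random.default_rng(2)
mix = 0.3 * ref_a + 0.7 * ref_b + rng.normal(0.0, 0.005, freqs.size)

# Non-negative linear regression recovers the component weights
A = np.column_stack([ref_a, ref_b])
weights, _ = nnls(A, mix)
print(weights / weights.sum())    # weights, equal to mass fractions by construction
```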

  17. Analysis of focal adhesion turnover: A quantitative live cell imaging example

    PubMed Central

    Stehbens, Samantha

    2014-01-01

    Recent advances in optical and fluorescent protein technology have rapidly raised expectations in cell biology, allowing quantitative insights into dynamic intracellular processes like never before. However, quantitative live cell imaging comes with many challenges, including how best to translate dynamic microscopy data into numerical outputs that can be used to make meaningful comparisons, rather than relying on representative data sets. Here we use analysis of focal adhesion turnover dynamics as a straightforward, specific example of how to image, measure, and analyze intracellular dynamics; we believe this outlines a thought process that can provide guidance on how to understand dynamic microscopy data of other intracellular structures. PMID:24974036
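
    One common numerical output in focal adhesion turnover studies is an apparent rate constant obtained by fitting an exponential model to a fluorescence trace. The sketch below fits a single-exponential assembly model to a synthetic trace; the model, trace, and parameter values are illustrative, not the paper's protocol.

```python
import numpy as np
from scipy.optimize import curve_fit

def assembly(t, i_max, k):
    """Single-exponential model often fitted to focal adhesion fluorescence
    during assembly; k is the apparent assembly rate constant (1/s)."""
    return i_max * (1 - np.exp(-k * t))

# Synthetic intensity trace for one adhesion (a.u.), sampled every 30 s
t = np.arange(0.0, 600.0, 30.0)
rng = np.random.default_rng(3)
trace = assembly(t, 100.0, 0.01) + rng.normal(0.0, 2.0, t.size)

(i_max, k), _ = curve_fit(assembly, t, trace, p0=(80.0, 0.005))
print(f"apparent assembly rate: {k:.4f} 1/s, half-time = {np.log(2) / k:.0f} s")
```

    Fitting every adhesion in a movie this way yields a distribution of rate constants that can be compared statistically between conditions, rather than relying on a single representative example.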

  18. In-line quantitative phase imaging for damage detection and analysis

    NASA Astrophysics Data System (ADS)

    Douti, Dam-Bé L.; Aknoun, Sherazade; Monneret, Serge; Hecquet, Christophe; Commandré, Mireille; Gallais, Laurent

    2014-10-01

    We investigate quantitative phase imaging as a measurement method for laser damage detection and for the analysis of laser-induced modification of optical materials. Experiments were conducted with a wavefront sensor based on the lateral shearing interferometry technique, coupled to a high-magnification optical microscope. The system was used for in situ observation of optical thin films and bulk samples irradiated by 500 fs pulses. We show that the technique achieves high sensitivity and convenient operation, and can provide quantitative information on refractive index or surface modifications of the samples under test.

  19. Quantitative trait loci analysis of powdery mildew disease resistance in the Arabidopsis thaliana accession kashmir-1.

    PubMed Central

    Wilson, I W; Schiff, C L; Hughes, D E; Somerville, S C

    2001-01-01

    Powdery mildews are economically important diseases caused by obligate biotrophic fungi of the Erysiphales. To understand the complex inheritance of resistance to powdery mildew disease in the model plant Arabidopsis thaliana, quantitative trait loci analysis was performed using a set of recombinant inbred lines derived from a cross between the resistant accession Kashmir-1 and the susceptible accession Columbia glabrous1. We identified and mapped three independent powdery mildew quantitative disease resistance loci, which act additively to confer disease resistance. The locus with the strongest effect on resistance was mapped to a 500-kbp interval on chromosome III. PMID:11454776

  20. Advances in liquid chromatography-high-resolution mass spectrometry for quantitative and qualitative environmental analysis.

    PubMed

    Aceña, Jaume; Stampachiacchiere, Serena; Pérez, Sandra; Barceló, Damià

    2015-08-01

    This review summarizes the advances in environmental analysis by liquid chromatography-high-resolution mass spectrometry (LC-HRMS) during the last decade and discusses different aspects of their application. LC-HRMS has become a powerful tool for simultaneous quantitative and qualitative analysis of organic pollutants, enabling their quantitation and the search for metabolites and transformation products or the detection of unknown compounds. LC-HRMS provides more information than low-resolution (LR) MS for each sample because it can accurately determine the mass of the molecular ion and, when MS-MS is available, of its fragment ions. Another advantage is that the data can be processed using target analysis, suspect screening, retrospective analysis, or non-target screening. With the growing popularity and acceptance of HRMS analysis, current guidelines for compound confirmation need to be revised for quantitative and qualitative purposes. Furthermore, new commercial software and user-built libraries are required to mine data in an efficient and comprehensive way. The scope of this critical review is not to provide a comprehensive overview of the many studies performed with LC-HRMS in the field of environmental analysis, but to reveal its advantages and limitations using different workflows. PMID:26138893

  1. The quantitative discrimination between shrinkage and gas microporosity in cast aluminum alloys using spatial data analysis

    SciTech Connect

    Anson, J.P.; Gruzleski, J.E.

    1999-11-01

    Microporosity in cast aluminum alloys may originate from hydrogen gas evolution, microshrinkage, or a combination of both. A spatial analysis method for the quantitative discrimination between shrinkage and gas porosity is presented and explained. It is shown that shrinkage pores can be selected and analyzed separately from gas pores by nearest-neighbor analysis. The principles of spatial statistics are discussed, and the types of spatial point patterns, complete spatial randomness, and nearest-neighbor cluster analysis are reviewed with respect to microporosity analysis. The pore distribution of a cast Al-7% Si (A356) foundry alloy is used as an example.
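
    The nearest-neighbor idea can be sketched with the Clark-Evans ratio, which compares the observed mean nearest-neighbour distance with the value expected under complete spatial randomness: values near 1 indicate a random (gas-like) pore pattern, values well below 1 a clustered (shrinkage-like) one. The point patterns below are synthetic, not measured pore data, and the statistic here ignores edge corrections.

```python
import math, random

def clark_evans(points, area):
    """Clark-Evans nearest-neighbour ratio R for a 2-D point pattern.
    R ~ 1: complete spatial randomness; R << 1: clustering."""
    n = len(points)
    nn = []
    for i, (xi, yi) in enumerate(points):
        d = min(math.hypot(xi - xj, yi - yj)
                for j, (xj, yj) in enumerate(points) if j != i)
        nn.append(d)
    observed = sum(nn) / n
    expected = 0.5 / math.sqrt(n / area)  # mean NN distance under CSR
    return observed / expected

random.seed(1)
# Random pattern (gas-pore-like) on a unit field:
gas = [(random.random(), random.random()) for _ in range(200)]
# Clustered pattern (shrinkage-like): tight groups around a few centres:
centres = [(random.random(), random.random()) for _ in range(10)]
shrink = [(cx + random.gauss(0, 0.01), cy + random.gauss(0, 0.01))
          for cx, cy in centres for _ in range(20)]
print(clark_evans(gas, 1.0), clark_evans(shrink, 1.0))
```

    A production analysis would add edge corrections and a significance test, but the separation between the two pattern types is already clear from the raw ratio.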

  2. [White matter fiber tractography and quantitative analysis of diffusion tensor imaging].

    PubMed

    Shimoji, Keigo; Tokumaru, Aya M

    2015-04-01

    Magnetic resonance diffusion tensor imaging (DTI) is a noninvasive technique that can identify and quantify white matter tracts by evaluating the diffusion of water in biological tissues. Association fibers are bundles of axons within the brain that unite different parts of the same cerebral hemisphere, projection fibers are bundles of axons that unite the cortex with lower parts of the brain and the spinal cord, and commissural fibers are axon bundles that connect the two hemispheres of the brain. Quantitative analysis of DTI can be roughly classified into three types: region-of-interest analysis, tract-specific analysis, and fully automated, hypothesis-free whole-brain analysis. PMID:25846596

  3. Application of BP Neural Network Based on Genetic Algorithm in Quantitative Analysis of Mixed GAS

    NASA Astrophysics Data System (ADS)

    Chen, Hongyan; Liu, Wenzhen; Qu, Jian; Zhang, Bing; Li, Zhibin

    To address the problem of mixed-gas detection with neural networks, and based on an analysis of the principles of gas detection, a quantitative analysis system for mixed gases was designed by combining a genetic-algorithm-optimized BP neural network with an array of gas sensors. Local minima in network learning are the main factor limiting the precision of gas analysis. The learning algorithm was improved accordingly, and analyses and tests for CO, CO2 and HC compounds were carried out. The results showed that these measures effectively improve the accuracy of the neural network for gas analysis.
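
    The core idea of replacing gradient descent with a genetic search over network weights (to escape local minima) can be sketched as follows. The sensor data, network size and GA settings below are illustrative assumptions, not the authors' configuration.

```python
import math, random

random.seed(0)  # deterministic run

def predict(w, x):
    """Tiny 2-input, 2-hidden-unit (tanh), 1-output network; 9 weights."""
    h0 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h1 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return w[6] * h0 + w[7] * h1 + w[8]

def mse(w, data):
    return sum((predict(w, x) - y) ** 2 for x, y in data) / len(data)

# Synthetic calibration set: "concentration" = 0.5*s1 + 0.3*s2 (normalized)
data = [((s1 / 10, s2 / 10), 0.5 * s1 / 10 + 0.3 * s2 / 10)
        for s1 in range(10) for s2 in range(10)]

pop = [[random.uniform(-1, 1) for _ in range(9)] for _ in range(40)]
initial_mse = min(mse(w, data) for w in pop)
for gen in range(60):
    pop.sort(key=lambda w: mse(w, data))
    elite = pop[:10]                      # elitism: keep the 10 fittest
    children = []
    while len(children) < 30:
        p1, p2 = random.sample(elite, 2)
        child = [random.choice(g) for g in zip(p1, p2)]  # uniform crossover
        if random.random() < 0.5:                        # mutation
            child[random.randrange(9)] += random.gauss(0, 0.2)
        children.append(child)
    pop = elite + children
best = min(pop, key=lambda w: mse(w, data))
print(initial_mse, mse(best, data))
```

    Elitism guarantees the best fitness never worsens between generations; in the papers, such a GA is typically used to seed or replace the gradient phase of BP training.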

  4. Quantitative image analysis for the characterization of microbial aggregates in biological wastewater treatment: a review.

    PubMed

    Costa, J C; Mesquita, D P; Amaral, A L; Alves, M M; Ferreira, E C

    2013-09-01

    Quantitative image analysis techniques have gained an undeniable role in several fields of research during the last decade. In the field of biological wastewater treatment (WWT) processes, several computer applications have been developed for monitoring microbial entities, either as individual cells or in different types of aggregates. New descriptors have been defined that are more reliable, objective, and useful than the subjective and time-consuming parameters classically used to monitor biological WWT processes. Examples of this application include the objective prediction of filamentous bulking, known to be one of the most problematic phenomena occurring in activated sludge technology. Image analysis has also demonstrated its usefulness in classifying protozoa and metazoa populations. In high-rate anaerobic processes based on granular sludge, aggregation times and fragmentation phenomena could be detected during critical events, e.g., toxic and organic overloads. Currently, the major efforts and needs are in the development of quantitative image analysis techniques applied to stained samples, using either classical or fluorescence-based staining. The use of quantitative morphological parameters in process control and online applications is also being investigated. This work reviews the major advances of quantitative image analysis applied to biological WWT processes. PMID:23716077

  5. Quantitative Assessment of In-solution Digestion Efficiency Identifies Optimal Protocols for Unbiased Protein Analysis*

    PubMed Central

    León, Ileana R.; Schwämmle, Veit; Jensen, Ole N.; Sprenger, Richard R.

    2013-01-01

    The majority of mass spectrometry-based protein quantification studies uses peptide-centric analytical methods and thus strongly relies on efficient and unbiased protein digestion protocols for sample preparation. We present a novel objective approach to assess protein digestion efficiency using a combination of qualitative and quantitative liquid chromatography-tandem MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein fractions. We evaluated nine trypsin-based digestion protocols, based on standard in-solution or on spin filter-aided digestion, including new optimized protocols. We investigated various reagents for protein solubilization and denaturation (dodecyl sulfate, deoxycholate, urea), several trypsin digestion conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents before analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative liquid chromatography-tandem MS workflow quantified over 3700 distinct peptides with 96% completeness between all protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows for efficient, unbiased generation and recovery of peptides from all protein classes, including membrane proteins. This deoxycholate-assisted protocol was also optimal for spin filter-aided digestions as compared with existing methods. PMID:23792921

  6. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    PubMed Central

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    Abstract New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  7. Quantitative Analysis of Pork and Chicken Products by Droplet Digital PCR

    PubMed Central

    Cai, Yicun; Li, Xiang; Lv, Rong; Yang, Jielin; Li, Jian; He, Yuping; Pan, Liangwen

    2014-01-01

    In this project, a highly precise quantitative method based on the digital polymerase chain reaction (dPCR) technique was developed to determine the weight of pork and chicken in meat products. Real-time quantitative polymerase chain reaction (qPCR) is currently used for quantitative molecular analysis of the presence of species-specific DNAs in meat products. However, qPCR is limited by its amplification efficiency and its reliance on standard curves based on Ct values when detecting and quantifying low-copy-number target DNA, as in some complex mixed meat products. Using the dPCR method, we found that the relationship between raw meat weight and DNA weight and that between DNA weight and DNA copy number were both close to linear. This enabled us to establish formulae to calculate the raw meat weight from the DNA copy number. The accuracy and applicability of this method were tested and verified using samples of pork and chicken powder mixed in known proportions. Quantitative analysis indicated that dPCR is highly precise in quantifying pork and chicken in meat products and therefore has the potential to be used in routine analysis by government regulators and quality control departments of commercial food and feed enterprises. PMID:25243184
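
    The two chained linear calibrations described above (copies → DNA mass → raw meat weight) can be sketched as follows. The copies-per-nanogram and DNA-yield slopes are hypothetical placeholders, not values from the study.

```python
# Sketch of the chained linear calibrations; slope values are hypothetical.

COPIES_PER_NG_DNA = {"pork": 305.0, "chicken": 290.0}   # copies / ng DNA
NG_DNA_PER_MG_MEAT = {"pork": 2.1, "chicken": 2.4}      # ng DNA / mg meat

def meat_weight_mg(species, copy_number):
    """Invert copy number -> DNA mass -> raw meat weight."""
    dna_ng = copy_number / COPIES_PER_NG_DNA[species]
    return dna_ng / NG_DNA_PER_MG_MEAT[species]

def mass_fractions(copies_by_species):
    """Relative raw-meat composition of a mixed product."""
    weights = {s: meat_weight_mg(s, c) for s, c in copies_by_species.items()}
    total = sum(weights.values())
    return {s: w / total for s, w in weights.items()}

# dPCR counts corresponding (under these slopes) to a 50/50 mixture:
print(mass_fractions({"pork": 64050.0, "chicken": 69600.0}))
```

    In a real assay both slopes would come from calibration with known meat weights, as the study's linearity checks describe.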

  8. [A multivariate nonlinear model for quantitative analysis in laser-induced breakdown spectroscopy].

    PubMed

    Chen, Xing-Long; Fu, Hong-Bo; Wang, Jing-Ge; Ni, Zhi-Bo; He, Wen-Gan; Xu, Jun; Rao Rui-zhong; Dong, Rui-Zhong

    2014-11-01

    Most quantitative models used in laser-induced breakdown spectroscopy (LIBS) are based on the hypothesis that the laser-induced plasma approaches the state of local thermal equilibrium (LTE). However, local equilibrium is possible only during a specific time segment of the plasma evolution. Because the populations of the energy levels do not follow a Boltzmann distribution under non-LTE conditions, quantitative models using a single spectral line would be inaccurate. A multivariate nonlinear model, in which LTE is not required, was proposed in this article to reduce signal fluctuation and improve the accuracy of quantitative analysis. This multivariate nonlinear model was compared with the internal calibration model, which is based on the LTE condition. The content of Mn in steel samples was determined using the two models, respectively. A smaller error and a smaller relative standard deviation (RSD) were observed with the multivariate nonlinear model. This result demonstrates that the multivariate nonlinear model can improve measurement accuracy and repeatability. PMID:25752066

  9. The Trouble With Tailings: How Alteration Mineralogy can Hinder Quantitative Phase Analysis of Mineral Waste

    NASA Astrophysics Data System (ADS)

    Wilson, S. A.; Mills, S. J.; Dipple, G. M.; Raudsepp, M.

    2009-05-01

    Quantitative phase analysis, using the Rietveld method and X-ray powder-diffraction data, has become a standard technique for analysis of mineral waste from mining operations. This method relies upon the availability of well defined crystal structures for all detectable mineral phases in a sample. An even more basic assumption, central to quantitative mineralogy, is that all significant mineral phases can be detected from X-ray diffraction data. This is not always the case, because X-ray amorphous and nanocrystalline mineral phases can develop within geological samples as a result of chemical weathering. The extent of mineral-water interaction to which mine tailings are exposed, during processing and storage, makes these materials particularly susceptible to weathering and alteration. We have used the Rietveld method and X-ray powder-diffraction data to quantify the uptake of atmospheric CO2 into secondary carbonate minerals at two operating mines: the Diavik Diamond Mine, Northwest Territories, Canada, and the Mount Keith Nickel Mine, Western Australia, Australia. At Diavik, nominally anhydrous minerals in kimberlitic mine tailings have been found to contain X-ray amorphous material and hydroxyl groups detectable by Raman spectroscopy. A series of weighed mixtures, prepared to simulate kimberlite mine tailings, has been used to assess the effects of X-ray amorphous material on quantitative phase analysis of Diavik tailings. At Mount Keith, hydrated sulphate minerals and halide minerals develop efflorescent crusts at the surface of the tailings storage facility. Hydrated sulphate minerals in these mine tailings commonly decompose to amorphous substances rather than dehydrating to produce minerals detectable from X-ray powder-diffraction patterns. Nanocrystalline and X-ray amorphous material in mine tailings can affect the accuracy of quantitative determinations of CO2 trapping and abundances of sulphur-bearing minerals associated with redox reactions. 
Here we assess the impact of amorphous material on quantitative X-ray diffraction results with particular reference to CO2 sequestration and suggest strategies for detection and analysis.

  10. Development of a new real-time MCE-based quantitative analysis system

    NASA Astrophysics Data System (ADS)

    Zhang, Mingqiang; Sun, Fengrong; Yao, Gui-hua

    2009-10-01

    The quantitative analysis system for real-time myocardial contrast echocardiography can measure the values of A (microvascular cross-sectional area or myocardial blood volume), β (myocardial microbubble velocity), A·β (myocardial blood flow), A-EER (endo-epi ratio of A), β-EER and A·β-EER from the signal intensity of real-time 2-D grayscale images and power Doppler images; draw time-intensity curves to show how microbubble scattering intensity varies in the subendocardial and subepicardial layers across myocardial segments; and estimate the hemodynamic parameters by nonlinear regression analysis. The system also conforms to the digital imaging and communications in medicine (DICOM) standard and can be integrated into the picture archiving and communication system (PACS). Examples from a clinical study indicated the clinical effectiveness of the system and the reliability of the quantitative analysis techniques employed in the system.

  11. Quantitative trait locus analysis of susceptibility to diet-induced atherosclerosis in recombinant inbred mice

    SciTech Connect

    Hyman, R.W.; Frank, S.; Warden, C.H.

    1994-12-01

    Quantitative trait locus (QTL) analysis is a statistical method that can be applied to identify loci making a significant impact on a phenotype. For the phenotype of susceptibility to diet-induced atherosclerosis in the mouse, we have studied four quantitative traits: area of aortic fatty streaks and serum concentrations of high-density lipoprotein-bound cholesterol (HDL-cholesterol), apolipoprotein A-I, and apolipoprotein A-II (apo A-II). QTL analysis revealed a significant locus on distal chromosome 1 impacting serum apo A-II concentration on a high-fat diet and serum HDL-cholesterol concentration on a chow diet. This locus is presumably Apoa-2, the structural gene for apo A-II. QTL analysis of aortic fatty streaks failed to reveal a significant locus. 19 refs., 3 tabs.

  12. Identification and quantitative analysis of chemical compounds based on multiscale linear fitting of terahertz spectra

    NASA Astrophysics Data System (ADS)

    Qiao, Lingbo; Wang, Yingxin; Zhao, Ziran; Chen, Zhiqiang

    2014-07-01

    Terahertz (THz) time-domain spectroscopy is considered as an attractive tool for the analysis of chemical composition. The traditional methods for identification and quantitative analysis of chemical compounds by THz spectroscopy are all based on full-spectrum data. However, intrinsic features of the THz spectrum only lie in absorption peaks due to existence of disturbances, such as unexpected components, scattering effects, and barrier materials. We propose a strategy that utilizes Lorentzian parameters of THz absorption peaks, extracted by a multiscale linear fitting method, for both identification of pure chemicals and quantitative analysis of mixtures. The multiscale linear fitting method can automatically remove background content and accurately determine Lorentzian parameters of the absorption peaks. The high recognition rate for 16 pure chemical compounds and the accurate predicted concentrations for theophylline-lactose mixtures demonstrate the practicability of our approach.
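
    A minimal illustration of extracting Lorentzian peak parameters from an absorption peak. The paper's multiscale linear fitting method is not reproduced here; instead, a coarse grid search over centre and half-width with a closed-form least-squares amplitude is used, on synthetic data.

```python
# Sketch: recover Lorentzian parameters (amplitude A, centre f0,
# half-width g) of an isolated absorption peak; data are synthetic.

def lorentz(f, A, f0, g):
    return A * g * g / ((f - f0) ** 2 + g * g)

freqs = [0.8 + 0.01 * i for i in range(100)]          # frequency axis (THz)
peak = [lorentz(f, 1.5, 1.2, 0.05) for f in freqs]    # true A, f0, g

def fit(freqs, spec):
    """Grid search over (f0, g); amplitude solved by linear least squares."""
    best, best_err = None, float("inf")
    for f0 in [0.8 + 0.01 * i for i in range(100)]:
        for g in [0.01 * j for j in range(1, 20)]:
            shape = [g * g / ((f - f0) ** 2 + g * g) for f in freqs]
            A = (sum(s * y for s, y in zip(shape, spec))
                 / sum(s * s for s in shape))
            err = sum((A * s - y) ** 2 for s, y in zip(shape, spec))
            if err < best_err:
                best, best_err = (A, f0, g), err
    return best

print(fit(freqs, peak))  # → close to (1.5, 1.2, 0.05)
```

    A real implementation would fit several overlapping Lorentzians and remove the baseline first, which is what the multiscale linear fitting step automates.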

  13. Determinations of Carbon Dioxide by Titration: New Experiments for General, Physical, and Quantitative Analysis Courses

    NASA Astrophysics Data System (ADS)

    Crossno, S. K.; Kalbus, L. H.; Kalbus, G. E.

    1996-02-01

    The determination of mixtures containing NaOH and Na2CO3 or Na2CO3 and NaHCO3 by titration is a common experiment in a Quantitative Analysis course. This determination can be adapted for the analysis of CO2 within a sample. The CO2 is released and absorbed in a solution containing excess NaOH. Titration with standard HCl leads to the determination of CO2 present in the sample. A number of interesting experiments in Quantitative Analysis, General and/or Physical Chemistry have been developed. Among these are the following determinations: CO2 content in carbonated beverages, carbonate and bicarbonate in various real life samples, and the molecular weight of CO2.
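
    The back-titration arithmetic behind the experiment can be sketched as follows. CO2 absorbed in excess NaOH forms Na2CO3; titrating with standard HCl to the phenolphthalein end point and then to the methyl-orange end point brackets the carbonate, so the HCl used between the two end points measures the CO2. The molarity and volumes below are hypothetical.

```python
# Back-titration arithmetic for the CO2 determination (hypothetical data).

def co2_mg(molarity_hcl, v1_ml, v2_ml):
    """Moles of CO2 = moles of HCl delivered between the first
    (phenolphthalein) and second (methyl orange) end points.
    44.01 g/mol is the molar mass of CO2."""
    mol_co2 = molarity_hcl * (v2_ml - v1_ml) / 1000.0
    return mol_co2 * 44.01 * 1000.0  # mg of CO2

# 0.1000 M HCl; end points at 18.40 mL and 23.40 mL:
print(co2_mg(0.1000, 18.40, 23.40))  # ≈ 22.0 mg CO2
```

    The same arithmetic, with two end-point differences, handles the NaOH/Na2CO3 and Na2CO3/NaHCO3 mixture determinations mentioned above.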

  14. Accuracy improvement of quantitative analysis in laser-induced breakdown spectroscopy using modified wavelet transform.

    PubMed

    Zou, X H; Guo, L B; Shen, M; Li, X Y; Hao, Z Q; Zeng, Q D; Lu, Y F; Wang, Z M; Zeng, X Y

    2014-05-01

    A modified background-removal algorithm based on the wavelet transform was developed for spectrum correction in laser-induced breakdown spectroscopy (LIBS). The optimal wavelet function type, decomposition level, and scaling factor were determined by the root-mean-square error of calibration (RMSEC) of the univariate regression model of the analyzed element, which served as the optimization criterion. After background removal by this modified algorithm, judged by the RMSEC, root-mean-square error of cross-validation (RMSECV), and average relative error (ARE) criteria, the accuracy of quantitative analysis of chromium (Cr), vanadium (V), copper (Cu), and manganese (Mn) in low-alloy steel was significantly improved. The results demonstrate that the algorithm developed is an effective pretreatment method in LIBS that significantly improves the accuracy of quantitative analysis. PMID:24921726

  15. Identification and Comparison of Candidate Olfactory Genes in the Olfactory and Non-Olfactory Organs of Elm Pest Ambrostoma quadriimpressum (Coleoptera: Chrysomelidae) Based on Transcriptome Analysis

    PubMed Central

    Wang, Yinliang; Chen, Qi; Zhao, Hanbo; Ren, Bingzhong

    2016-01-01

    The leaf beetle Ambrostoma quadriimpressum (Coleoptera: Chrysomelidae) is a predominant forest pest that causes substantial damage to the lumber industry and city management. However, no effective and environmentally friendly chemical method has been discovered to control this pest. Until recently, the molecular basis of the olfactory system in A. quadriimpressum was completely unknown. In this study, antennae and leg transcriptomes were analyzed and compared using deep sequencing data to identify the olfactory genes in A. quadriimpressum. Moreover, the expression profiles of both male and female candidate olfactory genes were analyzed and validated by bioinformatics, motif analysis, homology analysis, semi-quantitative RT-PCR and RT-qPCR experiments in antennal and non-olfactory organs to explore the candidate olfactory genes that might play key roles in the life cycle of A. quadriimpressum. As a result, approximately 102.9 million and 97.3 million clean reads were obtained from the libraries created from the antennae and legs, respectively. Annotation led to 34,344 unigenes, which were matched to known proteins. Annotation data revealed that the number of genes in the antennae with binding functions and receptor activity was greater than in the legs. Furthermore, many pathway genes were differentially expressed in the two organs. Sixteen candidate odorant binding proteins (OBPs), 10 chemosensory proteins (CSPs), 34 odorant receptors (ORs), 20 ionotropic receptors (IRs) and 2 sensory neuron membrane proteins (SNMPs) and their isoforms were identified. Additionally, 15 OBPs, 9 CSPs, 18 ORs, 6 IRs and 2 SNMPs were predicted to be complete ORFs. Using RT-PCR, RT-qPCR and homology analysis, AquaOBP1/2/4/7/C1/C6, AquaCSP3/9, AquaOR8/9/10/14/15/18/20/26/29/33, AquaIR8a/13/25a showed olfactory-specific expression, indicating that these genes might play a key role in olfaction-related behaviors in A. quadriimpressum such as foraging and seeking.
AquaOBP4/C5, AquaCSP7/9/10, AquaOR17/24/32 and AquaIR4 were highly expressed in the antennae of males, suggesting that these genes are related to sex-specific behaviors; male-specific expression trends were observed for most candidate olfactory genes, which supports the existence of a female-produced sex pheromone in A. quadriimpressum. All of these results could provide valuable information and guidance for future functional studies of these genes and better molecular knowledge of the olfactory system in A. quadriimpressum. PMID:26800515

  17. Quantitative and qualitative analysis of the electrical activity of rectus abdominis muscle portions.

    PubMed

    Negrão Filho, R de Faria; Bérzin, F; Souza, G da Cunha

    2003-01-01

    The purpose of this study was to investigate the electrical behavior pattern of the rectus abdominis muscle by qualitative and quantitative analysis of the electromyographic signal obtained from its superior, medium and inferior portions during dynamic and static activities. Ten male athletic volunteers (mean age = 17.8 years, SD = 1.6) with no history of musculoskeletal dysfunction were studied. For the quantitative analysis, the RMS (root mean square) values obtained from the electromyographic signal during the isometric exercises were normalized and expressed as percentages of the maximum voluntary isometric contraction. For the qualitative analysis of the dynamic activity, the electromyographic signal was processed by full-wave rectification, linear envelope and normalization (amplitude and time), and the resulting curve was submitted to descriptive graphic analysis. The results of the quantitative study show that there is no statistically significant difference among the portions of the muscle. Qualitative analysis demonstrated two aspects: the presence of a common electrical activation pattern in the portions of the rectus abdominis muscle, and the absence of significant difference in the inclination angles of the electrical activity curve during the isotonic exercises. PMID:12964259
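
    The quantitative step described above (RMS of an EMG window, normalized to the maximum voluntary isometric contraction, MVIC) amounts to the following; signal values are synthetic.

```python
import math

# Sketch: RMS of an EMG window, expressed as a percentage of the MVIC RMS.

def rms(signal):
    """Root mean square of a sampled signal."""
    return math.sqrt(sum(s * s for s in signal) / len(signal))

def percent_mvic(signal, mvic_rms):
    """Normalize window RMS to the maximum voluntary isometric contraction."""
    return 100.0 * rms(signal) / mvic_rms

emg_window = [0.02, -0.05, 0.04, -0.03, 0.05, -0.04]  # mV, synthetic
print(percent_mvic(emg_window, mvic_rms=0.08))
```

    Normalizing to %MVIC is what makes RMS values comparable across subjects and muscle portions.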

  18. Aqua-planet simulations of the formation of the South Atlantic convergence zone

    NASA Technical Reports Server (NTRS)

    Nieto Ferreira, Rosana; Chao, Winston C.

    2013-01-01

    The impact of Amazon Basin convection and cold fronts on the formation and maintenance of the South Atlantic convergence zone (SACZ) is studied using aqua-planet simulations with a general circulation model. In the model, a circular patch of warm sea-surface temperature (SST) is used to mimic the effect of the Amazon Basin on South American monsoon convection. The aqua-planet simulations were designed to study the effect of the strength and latitude of Amazon Basin convection on the formation of the SACZ. The simulations indicate that the strength of the SACZ increases as the Amazon convection intensifies and is moved away from the equator. Of the two controls studied here, the latitude of the Amazon convection exerts the strongest effect on the strength of the SACZ. An analysis of the synoptic-scale variability in the simulations shows the importance of frontal systems in the formation of the aqua-planet SACZ. Composite time series of frontal systems that occurred in the simulations show that a robust SACZ occurs when fronts penetrate into the subtropics and become stationary there as they cross eastward of the longitude of the Amazon Basin. Moisture convergence associated with these frontal systems produces rainfall along the model SACZ region and along a large portion of the northern model Amazon Basin. Simulations in which the warm SST patch was too weak or too close to the equator did not produce frontal systems that extended into the tropics and became stationary, and did not form a SACZ. In the model, the SACZ forms as Amazon Basin convection strengthens and migrates far enough southward to allow frontal systems to penetrate into the tropics and stall over South America. This result is in agreement with observations that the SACZ tends to form after the onset of the monsoon season in the Amazon Basin.

  19. HPTLC Hyphenated with FTIR: Principles, Instrumentation and Qualitative Analysis and Quantitation

    NASA Astrophysics Data System (ADS)

    Cimpoiu, Claudia

    In recent years, much effort has been devoted to the coupling of high-performance thin-layer chromatography (HPTLC) with spectrometric methods, because of the robustness and simplicity of HPTLC and the need for detection techniques that provide identification and determination of sample constituents. IR is one of the spectroscopic methods that have been coupled with HPTLC. IR spectroscopy has a high potential for the elucidation of molecular structures, and its characteristic absorption bands can be used for compound-specific detection. The coupled HPTLC-FTIR method is widely used in modern laboratories for qualitative and quantitative analysis. The potential of this method is demonstrated by its application in different fields of analysis such as drug analysis, forensic analysis, food analysis, environmental analysis, and biological analysis. The hyphenated HPTLC-FTIR technique will continue to be developed with the aim of taking full advantage of the method.

  20. Qualitative and Quantitative Analysis with Contrast-Enhanced Ultrasonography: Diagnostic Value in Hypoechoic Renal Angiomyolipoma

    PubMed Central

    Lu, Qing; Wang, Wen-ping; Li, Cui-xian; Xue, Li-yun

    2015-01-01

    Objective To evaluate the value of enhancement features and quantitative parameters of contrast-enhanced ultrasonography (CEUS) in differentiating solid hypoechoic renal angiomyolipomas (AMLs) from clear cell renal cell carcinomas (ccRCCs). Materials and Methods We analyzed the enhancement features and quantitative parameters of CEUS in 174 hypoechoic renal masses (32 AMLs and 142 ccRCCs) included in the study. Results A centripetal enhancement pattern was more common in AMLs than in ccRCCs on CEUS (71.9% vs. 23.2%, p < 0.001). At peak enhancement, all AMLs showed homogeneous enhancement (100% in AMLs vs. 27.5% in ccRCCs; p < 0.001). Quantitative analysis showed no significant differences in rise time or time to peak. The tumor-to-cortex (TOC) enhancement ratio in AMLs was significantly lower than that in ccRCCs (p < 0.001). The criteria of centripetal enhancement and homogeneous peak enhancement together with a TOC ratio < 91.0%, used to differentiate hypoechoic AMLs from ccRCCs, resulted in a sensitivity and specificity of 68.9% and 95.8%, respectively. Conclusion Both qualitative and quantitative analysis with CEUS are valuable in the differential diagnosis of hypoechoic renal AMLs from ccRCCs. PMID:25741195
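
    The decision rule reported above (centripetal enhancement, homogeneous peak enhancement, and tumor-to-cortex ratio below 91%) can be written as a one-line classifier; the thresholds come from the abstract, and this is a sketch of the criteria only, not a diagnostic tool.

```python
# Sketch of the reported CEUS rule for flagging probable AMLs.

def suggests_aml(centripetal, homogeneous_peak, toc_ratio_percent):
    """True when all three reported criteria for AML are met
    (sensitivity 68.9%, specificity 95.8% in the cited study)."""
    return centripetal and homogeneous_peak and toc_ratio_percent < 91.0

print(suggests_aml(True, True, 85.0))   # → True
print(suggests_aml(True, False, 85.0))  # → False
```

    The high specificity and moderate sensitivity reported suggest the rule is better at ruling in AML than at ruling it out.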

  1. QARIP: a web server for quantitative proteomic analysis of regulated intramembrane proteolysis

    PubMed Central

    Ivankov, Dmitry N.; Bogatyreva, Natalya S.; Hönigschmid, Peter; Dislich, Bastian; Hogl, Sebastian; Kuhn, Peer-Hendrik; Frishman, Dmitrij; Lichtenthaler, Stefan F.

    2013-01-01

    Regulated intramembrane proteolysis (RIP) is a critical mechanism for intercellular communication and regulates the function of membrane proteins through sequential proteolysis. RIP typically starts with ectodomain shedding of membrane proteins by extracellular membrane-bound proteases followed by intramembrane proteolysis of the resulting membrane-tethered fragment. However, for the majority of RIP proteases the corresponding substrates and thus, their functions, remain unknown. Proteome-wide identification of RIP protease substrates is possible by mass spectrometry-based quantitative comparison of RIP substrates or their cleavage products between different biological states. However, this requires quantification of peptides from only the ectodomain or cytoplasmic domain. Current analysis software does not allow matching peptides to either domain. Here we present the QARIP (Quantitative Analysis of Regulated Intramembrane Proteolysis) web server which matches identified peptides to the protein transmembrane topology. QARIP allows determination of quantitative ratios separately for the topological domains (cytoplasmic, ectodomain) of a given protein and is thus a powerful tool for quality control, improvement of quantitative ratios and identification of novel substrates in proteomic RIP datasets. To our knowledge, the QARIP web server is the first tool directly addressing the phenomenon of RIP. The web server is available at http://webclu.bio.wzw.tum.de/qarip/. This website is free and open to all users and there is no login requirement. PMID:23729472
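    The core matching step described above, assigning each identified peptide to a topological domain and then aggregating quantitative ratios per domain, can be sketched as follows. The topology boundaries, peptide coordinates, and ratios are hypothetical illustrations, not output of the QARIP server:

```python
def classify_peptide(start, end, topology):
    """topology: list of (domain_name, first_residue, last_residue), 1-based.
    Returns the domain fully containing the peptide, else 'spanning'."""
    for name, lo, hi in topology:
        if lo <= start and end <= hi:
            return name
    return "spanning"

# Hypothetical single-pass membrane protein: ectodomain 1-100,
# transmembrane 101-123, cytoplasmic 124-200.
topo = [("ectodomain", 1, 100), ("transmembrane", 101, 123),
        ("cytoplasmic", 124, 200)]

# Quantitative ratios averaged separately per domain, as QARIP reports them.
peptides = [(10, 25, 2.0), (30, 44, 1.8), (130, 150, 1.0)]  # (start, end, ratio)
by_domain = {}
for s, e, r in peptides:
    by_domain.setdefault(classify_peptide(s, e, topo), []).append(r)
ratios = {d: sum(v) / len(v) for d, v in by_domain.items()}
```

    A shed ectodomain and a retained membrane-tethered fragment would show up here as different per-domain ratios for the same protein.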

  2. Dynamic analysis of pathogen-infected host cells using quantitative phase microscopy

    NASA Astrophysics Data System (ADS)

    Lee, Seungrag; Kim, Young Ran; Lee, Ji Yong; Rhee, Joon Haeng; Park, Chang-Soo; Kim, Dug Young

    2011-03-01

    We present the real-time quantitative analysis of Vibrio vulnificus-infected host cells using quantitative phase microscopy (QPM) based on interferometric techniques. This provides the ability to retrieve the phase or optical path-length distribution over the cell with nanometer path-length sensitivity from a single interferogram image. We have used QPM to study dynamic cell morphologic changes and to noninvasively quantify the cell volumes of rat basophilic leukemia RBL-2H3 cells infected with V. vulnificus strains: wild type (MO6-24/O) and RtxA1 toxin mutant (CMM770). During the process of V. vulnificus infection in RBL-2H3 cells, the dynamic changes of quantitative phase images, cell volumes, and areas were observed in real time using QPM. In contrast, dramatic changes were not detected in RBL-2H3 cells infected with the noncytotoxic RtxA1 toxin mutant. The results showed good correlation between QPM analysis and biochemical assays, such as lactate dehydrogenase assay or β-hexosaminidase release assay. We suggest that QPM is a powerful quantitative method to study the dynamic process of host cells infected with pathogens in a noninvasive manner.

  3. Sources of Technical Variability in Quantitative LC-MS Proteomics: Human Brain Tissue Sample Analysis

    PubMed Central

    Piehowski, Paul D.; Petyuk, Vladislav A.; Orton, Daniel J.; Xie, Fang; Ramirez-Restrepo, Manuel; Engel, Anzhelika; Lieberman, Andrew P.; Albin, Roger L.; Camp, David G.; Smith, Richard D.; Myers, Amanda J.

    2013-01-01

    To design a robust quantitative proteomics study, an understanding of both the inherent heterogeneity of the biological samples being studied and the technical variability of the proteomics methods and platform is needed. Additionally, accurately identifying the technical steps associated with the largest variability would provide valuable information for the improvement and design of future processing pipelines. We present an experimental strategy that allows for a detailed examination of the variability of the quantitative LC-MS proteomics measurements. By replicating analyses at different stages of processing, various technical components can be estimated and their individual contribution to technical variability can be dissected. This design can be easily adapted to other quantitative proteomics pipelines. Herein, we applied this methodology to our label-free workflow for the processing of human brain tissue. For this application, the pipeline was divided into four critical components: Tissue dissection and homogenization (extraction), protein denaturation followed by trypsin digestion and SPE clean-up (digestion), short-term run-to-run instrumental response fluctuation (instrumental variance), and long-term drift of the quantitative response of the LC-MS/MS platform over the 2-week period of continuous analysis (instrumental stability). From this analysis, we found the following contributions to variability: extraction (72%) >> instrumental variance (16%) > instrumental stability (8.4%) > digestion (3.1%). Furthermore, the stability of the platform and its suitability for discovery proteomics studies are demonstrated. PMID:23495885
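    The percentage breakdown quoted above is simply each stage's variance component divided by the total. A toy sketch with invented variance values chosen to roughly mirror the reported ordering:

```python
def percent_contributions(components):
    """Convert per-stage variance components to percent of total variance."""
    total = sum(components.values())
    return {k: 100.0 * v / total for k, v in components.items()}

# Hypothetical stage variances (e.g. squared CVs), invented for illustration.
stages = {"extraction": 7.2, "instrumental variance": 1.6,
          "instrumental stability": 0.84, "digestion": 0.31}
shares = percent_contributions(stages)
```

    Note that this assumes the stage components are independent, so their variances add; that is what nested replication at each processing stage is designed to justify.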

  4. Sources of Technical Variability in Quantitative LC-MS Proteomics: Human Brain Tissue Sample Analysis.

    SciTech Connect

    Piehowski, Paul D.; Petyuk, Vladislav A.; Orton, Daniel J.; Xie, Fang; Moore, Ronald J.; Ramirez Restrepo, Manuel; Engel, Anzhelika; Lieberman, Andrew P.; Albin, Roger L.; Camp, David G.; Smith, Richard D.; Myers, Amanda J.

    2013-05-03

    To design a robust quantitative proteomics study, an understanding of both the inherent heterogeneity of the biological samples being studied and the technical variability of the proteomics methods and platform is needed. Additionally, accurately identifying the technical steps associated with the largest variability would provide valuable information for the improvement and design of future processing pipelines. We present an experimental strategy that allows for a detailed examination of the variability of the quantitative LC-MS proteomics measurements. By replicating analyses at different stages of processing, various technical components can be estimated and their individual contribution to technical variability can be dissected. This design can be easily adapted to other quantitative proteomics pipelines. Herein, we applied this methodology to our label-free workflow for the processing of human brain tissue. For this application, the pipeline was divided into four critical components: Tissue dissection and homogenization (extraction), protein denaturation followed by trypsin digestion and SPE clean-up (digestion), short-term run-to-run instrumental response fluctuation (instrumental variance), and long-term drift of the quantitative response of the LC-MS/MS platform over the 2-week period of continuous analysis (instrumental stability). From this analysis, we found the following contributions to variability: extraction (72%) >> instrumental variance (16%) > instrumental stability (8.4%) > digestion (3.1%). Furthermore, the stability of the platform and its suitability for discovery proteomics studies are demonstrated.

  5. Quantitative analysis of biological tissues using Fourier transform-second-harmonic generation imaging

    NASA Astrophysics Data System (ADS)

    Ambekar Ramachandra Rao, Raghu; Mehta, Monal R.; Toussaint, Kimani C., Jr.

    2010-02-01

    We demonstrate the use of Fourier transform-second-harmonic generation (FT-SHG) imaging of collagen fibers as a means of performing quantitative analysis of obtained images of selected spatial regions in porcine trachea, ear, and cornea. Two quantitative markers, preferred orientation and maximum spatial frequency, are proposed for differentiating structural information between various spatial regions of interest in the specimens. The ear shows consistent maximum spatial frequency and orientation as also observed in its real-space image. However, there are observable changes in the orientation and minimum feature size of fibers in the trachea indicating a more random organization. Finally, the analysis is applied to a 3D image stack of the cornea. It is shown that the standard deviation of the orientation is sensitive to the randomness in fiber orientation. Regions with variations in the maximum spatial frequency, but with relatively constant orientation, suggest that maximum spatial frequency is useful as an independent quantitative marker. We emphasize that FT-SHG is a simple, yet powerful, tool for extracting information from images that is not obvious in real space. This technique can be used as a quantitative biomarker to assess the structure of collagen fibers that may change due to damage from disease or physical injury.
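    The preferred-orientation marker rests on the fact that a dominant fiber direction appears as a peak direction in the image's 2-D power spectrum. A minimal sketch on a synthetic striped image standing in for collagen fibers:

```python
import numpy as np

n = 64
x = np.arange(n)
# Synthetic "fibers": vertical stripes (intensity varies along x only),
# 8 cycles across the image width.
img = np.sin(2 * np.pi * 8 * x / n)[None, :] * np.ones((n, 1))

# Centered 2-D power spectrum.
spec = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
spec[n // 2, n // 2] = 0.0  # suppress the DC term

ky, kx = np.unravel_index(np.argmax(spec), spec.shape)
fy, fx = ky - n // 2, kx - n // 2  # peak frequency vector from center
if fx < 0 or (fx == 0 and fy < 0):  # fold the symmetric twin peak
    fy, fx = -fy, -fx
orientation = np.degrees(np.arctan2(fy, fx))  # 0 deg: intensity varies along x
```

    For real images one would integrate spectral power over angle rather than take a single peak, but the principle (orientation and maximum spatial frequency read off in Fourier space) is the same.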

  6. Analysis of Sputtered Neutrals by Nonresonant Multiphoton Ionization. II. A Quantitative Composition Analysis of Cu-Al Alloy

    NASA Astrophysics Data System (ADS)

    Kawatoh, Eizoh; Shimizu, Ryuichi

    1991-04-01

    A sputtered neutral mass analysis by multiphoton ionization was applied to the composition analysis of Cu-Al alloy. The reliability of the quantitative estimation of the signal intensity was improved by using an ion-counting system with a simple flight tube. Measured intensities of Cu and Al from a Cu-Al alloy sample represented atomic compositions with high accuracy. Errors in the estimation of the surface chemical composition are within several percent.

  7. An Ibm PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis

    NASA Astrophysics Data System (ADS)

    Kim, Yongmin; Alexander, Thomas

    1986-06-01

    In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.

  8. Quantitative Analysis of the Nanopore Translocation Dynamics of Simple Structured Polynucleotides

    PubMed Central

    Schink, Severin; Renner, Stephan; Alim, Karen; Arnaut, Vera; Simmel, Friedrich C.; Gerland, Ulrich

    2012-01-01

    Nanopore translocation experiments are increasingly applied to probe the secondary structures of RNA and DNA molecules. Here, we report two vital steps toward establishing nanopore translocation as a tool for the systematic and quantitative analysis of polynucleotide folding: 1), Using α-hemolysin pores and a diverse set of different DNA hairpins, we demonstrate that backward nanopore force spectroscopy is particularly well suited for quantitative analysis. In contrast to forward translocation from the vestibule side of the pore, backward translocation times do not appear to be significantly affected by pore-DNA interactions. 2), We develop and verify experimentally a versatile mesoscopic theoretical framework for the quantitative analysis of translocation experiments with structured polynucleotides. The underlying model is based on sequence-dependent free energy landscapes constructed using the known thermodynamic parameters for polynucleotide basepairing. This approach limits the adjustable parameters to a small set of sequence-independent parameters. After parameter calibration, the theoretical model predicts the translocation dynamics of new sequences. These predictions can be leveraged to generate a baseline expectation even for more complicated structures where the assumptions underlying the one-dimensional free energy landscape may no longer be satisfied. Taken together, backward translocation through α-hemolysin pores combined with mesoscopic theoretical modeling is a promising approach for label-free single-molecule analysis of DNA and RNA folding. PMID:22225801

  9. Applications of global quantitative 18F-FDG-PET analysis in temporal lobe epilepsy.

    PubMed

    Peter, Jonah; Houshmand, Sina; Werner, Thomas J; Rubello, Domenico; Alavi, Abass

    2016-03-01

    Temporal lobe epilepsy (TLE) is a prevalent neurodegenerative disease associated with various neuropsychiatric disorders and decreased quality of life. Much has been said about the use of fluorine-18 fluorodeoxyglucose positron emission tomography (18F-FDG-PET), magnetic resonance imaging (MRI), and computed tomography in the qualitative assessment of TLE. However, research into the applications of quantitative measurements to treat and diagnose TLE is severely lacking in the literature. Global quantitative analysis using 18F-FDG-PET is a powerful tool in the metabolic assessment of TLE, and can more accurately identify seizure lateralization and the potential effects of treatment as compared with visual assessments and traditional region-of-interest quantification. Therefore, there is a pressing need to introduce these novel methods to the treatment of TLE. Although 18F-FDG-PET is most commonly used for visual assessments, qualitative analysis is associated with high levels of interobserver and intraobserver variability. Semiquantitative analysis using the standardized uptake value is a more consistently accurate measure of the hypometabolic patterns seen in TLE patients. Novel methods of global quantitative analysis developed in our laboratory have the potential to improve TLE assessment by limiting variability and correcting for the partial volume effect. It is of great importance to adopt these techniques into the mainstream diagnosis and treatment of TLE in order to improve patient care worldwide. PMID:26588069
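    The standardized uptake value (SUV) mentioned above normalizes the measured tissue tracer concentration by the injected dose per unit body mass. A minimal sketch of the standard formula (decay correction omitted; the example numbers are illustrative, not from this study):

```python
def suv(tissue_kbq_per_ml, injected_dose_mbq, body_weight_kg):
    """SUV = C_tissue / (injected dose / body weight), assuming 1 g/mL tissue
    density so that kBq/mL and kBq/g are interchangeable."""
    dose_kbq = injected_dose_mbq * 1000.0
    weight_g = body_weight_kg * 1000.0
    return tissue_kbq_per_ml / (dose_kbq / weight_g)

# Example: 5 kBq/mL regional uptake, 370 MBq injected, 70 kg patient.
value = suv(5.0, 370.0, 70.0)
```

    A hypometabolic temporal lobe would show a lower SUV than the contralateral side, which is what semiquantitative lateralization compares.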

  10. Status of Aqua MODIS On-orbit Calibration and Characterization

    NASA Technical Reports Server (NTRS)

    Xiong, X.; Barnes, W.; Chiang, K.; Erives, H.; Che, N.; Sun, J.; Isaacman, A.; Salomonson, V.

    2004-01-01

    The MODIS Flight Model 1 (FM1) has been in operation for more than two years since its launch onboard NASA's Earth Observing System (EOS) Aqua spacecraft on May 4, 2002. The MODIS has 36 spectral bands: 20 reflective solar bands (RSB) with center wavelengths from 0.41 to 2.2 micron and 16 thermal emissive bands (TEB) from 3.7 to 14.5 micron. It provides the science community observations (data products) of the Earth's land, oceans, and atmosphere for a broad range of applications. Its primary on-orbit calibration and characterization activities are performed using a solar diffuser (SD) and a solar diffuser stability monitor (SDSM) system for the RSB and a blackbody for the TEB. Another on-board calibrator (OBC) known as the spectro-radiometric calibration assembly (SRCA) is used for the instrument's spatial (TEB and RSB) and spectral (RSB only) characterization. We present in this paper the status of Aqua MODIS calibration and characterization during its first two years of on-orbit operation. Discussions will be focused on the calibration activities executed on-orbit in order to maintain and enhance the instrument's performance and the quality of its Level 1B (L1B) data products. We also provide comparisons between Aqua MODIS and Terra MODIS (launched in December 1999), including their similarity and difference in response trending and optics degradation. Existing data and results show that Aqua MODIS bands 8 (0.412 micron) and 9 (0.443 micron) have much smaller degradation than Terra MODIS bands 8 and 9. The most noticeable feature shown in the RSB trending is that the mirror side differences in Aqua MODIS are extremely small and stable (<0.1%) while the Terra MODIS RSB trending has shown significant mirror side difference and wavelength dependent degradation. The overall stability of the Aqua MODIS TEB is also better than that of the Terra MODIS during their first two years of on-orbit operation.

  11. Biomonitoring and risk assessment on earth and during exploratory missions using AquaHab ®

    NASA Astrophysics Data System (ADS)

    Slenzka, K.; Dünne, M.; Jastorff, B.

    2008-12-01

    Bioregenerative closed ecological life support systems (CELSS) will be necessary in the context of exploration, revitalizing the atmosphere, recycling waste water, and producing food for the human CELSS crew. During long-term space travel and stays far from Earth in a hostile environment, far from any hospital or surgical care, it will be necessary to know much more about contamination by chemicals and drugs, both in a specific sense and as introduced by the humans themselves. Additionally, there is a strong need on Earth for more relevant standardized test systems, including aquatic ones, for the prospective risk assessment of chemicals and drugs on a laboratory scale. Current standardized test systems are mono-species tests, and thus do not represent system aspects and have reduced environmental relevance. The experience gained during recent years in our research group led to the development of a self-sustaining closed aquatic habitat/facility, called AquaHab®, which can serve both space exploration and terrestrial applications. The AquaHab® module can house several fish species, snails, plants, amphipods, and bacteria. The possibility of using different effect endpoints with certain beneficial characteristics is the basis for the application of AquaHab® in different fields. The influence of drugs and chemicals can be tested at several trophic and ecosystem levels, guaranteeing high relevance for aquatic systems in the real environment. Analyses of effect parameters of different complexity (e.g., general biological and water-chemical parameters, activity of biotransforming enzymes) yield a broad spectrum of sensitivities. Combined with residue analyses (including all metabolites), this leads to an extended prospective risk assessment of a chemical on Earth and in a closed life support system. The possibility of also measuring sensitive "online" parameters (e.g., behavior, respiration/photosynthetic activity) enables quick and sensitive analysis of the effects of water contaminants in the respective environments. AquaHab® is currently being developed into an early-warning biomonitoring system using genetically modified fish and green algae. The additional implementation of biosensors/biochips is also discussed.

  12. Quantitative analysis of terahertz spectra for illicit drugs using adaptive-range micro-genetic algorithm

    NASA Astrophysics Data System (ADS)

    Chen, Yi; Ma, Yong; Lu, Zheng; Peng, Bei; Chen, Qin

    2011-08-01

    In the field of anti-illicit drug applications, many suspicious mixture samples might consist of various drug components—for example, a mixture of methamphetamine, heroin, and amoxicillin—which makes spectral identification very difficult. A terahertz spectroscopic quantitative analysis method using an adaptive range micro-genetic algorithm with a variable internal population (ARVIPɛμGA) has been proposed. Five mixture cases are analyzed in this paper using ARVIPɛμGA-driven quantitative terahertz spectroscopic analysis. The simulation results agree with previous experimental results and with those obtained using other experimental and numerical techniques, suggesting that the proposed method has potential for terahertz spectral identification of drug mixture components.

  13. Quantitative analysis of surface characteristics and morphology in Death Valley, California using AIRSAR data

    NASA Technical Reports Server (NTRS)

    Kierein-Young, K. S.; Kruse, F. A.; Lefkoff, A. B.

    1992-01-01

    The Jet Propulsion Laboratory Airborne Synthetic Aperture Radar (JPL-AIRSAR) is used to collect full polarimetric measurements at P-, L-, and C-bands. These data are analyzed using the radar analysis and visualization environment (RAVEN). The AIRSAR data are calibrated using in-scene corner reflectors to allow for quantitative analysis of the radar backscatter. RAVEN is used to extract surface characteristics. Inversion models are used to calculate quantitative surface roughness values and fractal dimensions. These values are used to generate synthetic surface plots that represent the small-scale surface structure of areas in Death Valley. These procedures are applied to a playa, smooth salt-pan, and alluvial fan surfaces in Death Valley. Field measurements of surface roughness are used to verify the accuracy.

  14. Quantitative Analysis for Monitoring Formulation of Lubricating Oil Using Terahertz Time-Domain Transmission Spectroscopy

    NASA Astrophysics Data System (ADS)

    Tian, Lu; Zhao, Kun; Zhou, Qing-Li; Shi, Yu-Lei; Zhang, Cun-Lin

    2012-04-01

    The quantitative analysis of zinc isopropyl-isooctyl-dithiophosphate (T204) mixed with lube base oil from Korea with viscosity index 70 (T204-Korea70) is presented using terahertz time-domain spectroscopy (THz-TDS). Compared with the middle-infrared spectra of zinc n-butyl-isooctyl-dithiophosphate (T202) and T204, the THz spectra of T202 and T204 show weak, broad absorption bands. The absorption coefficients of the T204-Korea70 system follow Beer's law at concentrations from 0.124% to 4.024%. The experimental absorption spectra of T204-Korea70 agree with those calculated from the standard absorption coefficients of T204 and Korea70. The quantitative analysis enables a strategy to monitor the formulation of lubricating oil in real time.
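    Because Beer's law is linear, a mixture spectrum can be modeled as a concentration-weighted sum of the pure-component standard spectra, and the concentrations recovered by least squares. A sketch with synthetic stand-in spectra (not real THz data):

```python
import numpy as np

freqs = np.linspace(0.2, 2.0, 50)             # THz axis (illustrative)
eps_t204 = np.exp(-(freqs - 1.0) ** 2 / 0.1)  # stand-in standard spectrum, additive
eps_base = 0.5 + 0.1 * freqs                  # stand-in standard spectrum, base oil

true_c = np.array([0.03, 0.97])               # 3% additive, 97% base oil
mixture = true_c[0] * eps_t204 + true_c[1] * eps_base

# Recover concentrations by linear least squares (Beer's law is linear).
A = np.column_stack([eps_t204, eps_base])
c_fit, *_ = np.linalg.lstsq(A, mixture, rcond=None)
```

    With real spectra the fit would be over measured absorption coefficients and would carry noise, but the decomposition step is this linear solve.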

  15. Three-dimensional quantitative analysis of cardiovascular flow and heart wall motions in living zebrafish embryos

    NASA Astrophysics Data System (ADS)

    Lu, Jian; Fraser, Scott; Gharib, Morteza

    2007-11-01

    Progenitor factors such as blood flow induced mechanical forces play a key role in vertebrate heart development. However, three-dimensional (3-D) characteristics of in vivo cardiovascular flow and heart wall motions are not well understood due to the lack of proper imaging tools with sufficient spatial and temporal resolutions for quantitative analysis. In this study, a real-time high-speed 3-D imaging system based on defocusing digital particle image velocimetry was used to study dynamic cell motions in living zebrafish embryos. 500-nm fluorescent microspheres were injected into the blood stream to label the cardiovascular flow. Cardiac blood flow velocities were measured during a cardiac cycle at various early embryonic stages. The heart wall of a zebrafish embryo was labeled by a few fluorescent microspheres adhered to the wall. 3-D dynamic motions were reconstructed and quantitative analysis such as strain measurement was performed.

  16. Quantitative Schlieren analysis applied to holograms of crystals grown on Spacelab 3

    NASA Technical Reports Server (NTRS)

    Brooks, Howard L.

    1986-01-01

    In order to extract additional information about crystals grown in the microgravity environment of Spacelab, a quantitative schlieren analysis technique was developed for use in a Holography Ground System of the Fluid Experiment System. Utilizing the Unidex position controller, it was possible to measure deviation angles produced by refractive index gradients of 0.5 milliradians. Additionally, refractive index gradient maps for any recorded time during the crystal growth were drawn and used to create solute concentration maps for the environment around the crystal. The technique was applied to flight holograms of Cell 204 of the Fluid Experiment System that were recorded during the Spacelab 3 mission on STS 51B. A triglycine sulfate crystal was grown under isothermal conditions in the cell and the data gathered with the quantitative schlieren analysis technique is consistent with a diffusion limited growth process.

  17. Quantitative analysis on the urban flood mitigation effect by the extensive green roof system.

    PubMed

    Lee, J Y; Moon, H J; Kim, T I; Kim, H W; Han, M Y

    2013-10-01

    Extensive green-roof systems are expected to have a synergetic effect in mitigating urban runoff, decreasing temperature and supplying water to a building. Mitigation of runoff through rainwater retention requires the effective design of a green-roof catchment. This study identified how to improve building runoff mitigation through quantitative analysis of an extensive green-roof system. Quantitative analysis of green-roof runoff characteristics indicated that the extensive green roof has a high water-retaining capacity in response to rainfall of less than 20 mm/h. As the rainfall intensity increased, the water-retaining capacity decreased. The catchment efficiency of an extensive green roof ranged from 0.44 to 0.52, indicating reduced runoff compared with an efficiency of 0.9 for a concrete roof. Therefore, extensive green roofs are an effective storm-water best-management practice, and the proposed parameters can be applied to an algorithm for rainwater-harvesting tank design. PMID:23892044
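    The runoff numbers above reduce to simple arithmetic: runoff equals catchment efficiency times rainfall, and retention is the remainder. A quick sketch using the reported efficiencies (0.44-0.52 for the extensive green roof, 0.9 for concrete):

```python
def runoff_mm(rain_mm, efficiency):
    """Runoff depth for a given rainfall depth and catchment efficiency."""
    return efficiency * rain_mm

rain = 20.0  # mm, near the intensity below which retention was high
green = runoff_mm(rain, 0.52)     # upper bound for the extensive green roof
concrete = runoff_mm(rain, 0.9)
reduction = 1.0 - green / concrete  # fractional runoff reduction vs. concrete
```

    Even at the green roof's least favorable efficiency (0.52), runoff is roughly 42% lower than from a concrete roof for the same rainfall.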

  18. Quantitative wake analysis of a freely swimming fish using 3D synthetic aperture PIV

    NASA Astrophysics Data System (ADS)

    Mendelson, Leah; Techet, Alexandra H.

    2015-07-01

    Synthetic aperture PIV (SAPIV) is used to quantitatively analyze the wake behind a giant danio (Danio aequipinnatus) swimming freely in a seeded quiescent tank. The experiment is designed with minimal constraints on animal behavior to ensure that natural swimming occurs. The fish exhibits forward swimming and turning behaviors at speeds between 0.9 and 1.5 body lengths/second. Results show clearly isolated and linked vortex rings in the wake structure, as well as the thrust jet coming off of a visual hull reconstruction of the fish body. As a benchmark for quantitative analysis of volumetric PIV data, the vortex circulation and impulse are computed using methods consistent with those applied to planar PIV data. Volumetric momentum analysis frameworks are discussed for linked and asymmetric vortex structures, laying a foundation for further volumetric studies of swimming hydrodynamics with SAPIV. Additionally, a novel weighted refocusing method is presented as an improvement to SAPIV reconstruction.

  19. An efficient approach to the quantitative analysis of humic acid in water.

    PubMed

    Wang, Xue; Li, Bao Qiong; Zhai, Hong Lin; Xiong, Meng Yi; Liu, Ying

    2016-01-01

    Rayleigh and Raman scattering inevitably appear in fluorescence measurements, making quantitative analysis more difficult, especially when target signals and scattering signals overlap. Based on grayscale images of three-dimensional fluorescence spectra, a linear model with two selected Zernike moments was established for the determination of humic acid and applied to the quantitative analysis of a real sample taken from the Yellow River. The correlation coefficient (R(2)) and leave-one-out cross-validation correlation coefficient (R(2)cv) were up to 0.9994 and 0.9987, respectively. The average recovery reached 96.28%. Compared with the N-way partial least squares and alternating trilinear decomposition methods, our approach was immune to scattering and noise signals owing to its powerful multi-resolution characteristic, and the results obtained were more reliable and accurate; the approach could also be applied in food analyses. PMID:26213072
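    The calibration step, a linear model relating image-moment features to known concentrations fit by least squares, can be sketched generically. The Zernike-moment computation itself is omitted here; the feature values below are invented placeholders, not real fluorescence-image moments:

```python
import numpy as np

# Two "moment" features per calibration image (placeholder values).
features = np.array([[0.12, 0.40],
                     [0.25, 0.35],
                     [0.40, 0.28],
                     [0.55, 0.22]])
conc = np.array([1.0, 2.0, 3.0, 4.0])  # known calibration concentrations

# Fit conc ~ w1*m1 + w2*m2 + b by ordinary least squares.
X = np.column_stack([features, np.ones(len(conc))])
w, *_ = np.linalg.lstsq(X, conc, rcond=None)
pred = X @ w
r2 = 1 - np.sum((conc - pred) ** 2) / np.sum((conc - conc.mean()) ** 2)
```

    In the paper the features are two selected Zernike moments of the grayscale spectrum image, chosen because moment-based features are insensitive to the localized scattering signals.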

  20. Quantitative analysis of voids in percolating structures in two-dimensional N-body simulations

    NASA Technical Reports Server (NTRS)

    Harrington, Patrick M.; Melott, Adrian L.; Shandarin, Sergei F.

    1993-01-01

    We present in this paper a quantitative method for defining void size in large-scale structure based on percolation threshold density. Beginning with two-dimensional gravitational clustering simulations smoothed to the threshold of nonlinearity, we perform percolation analysis to determine the large scale structure. The resulting objective definition of voids has a natural scaling property, is topologically interesting, and can be applied immediately to redshift surveys.
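    The percolation-based void definition can be illustrated with a toy grid: cells below the threshold density are "empty", connected empty regions are voids, and void size is the cell count of each component. A minimal sketch with an invented density grid:

```python
from collections import deque

def void_sizes(density, threshold):
    """Sizes of 4-connected components of cells with density < threshold."""
    rows, cols = len(density), len(density[0])
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for i in range(rows):
        for j in range(cols):
            if density[i][j] < threshold and not seen[i][j]:
                size, q = 0, deque([(i, j)])
                seen[i][j] = True
                while q:  # breadth-first flood fill of one void
                    y, x = q.popleft()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and not seen[ny][nx]
                                and density[ny][nx] < threshold):
                            seen[ny][nx] = True
                            q.append((ny, nx))
                sizes.append(size)
    return sorted(sizes, reverse=True)

# Invented density grid; threshold 1.0 separates "empty" (0.1) from dense (2.0).
grid = [[0.1, 0.1, 2.0, 0.1],
        [2.0, 0.1, 2.0, 0.1],
        [0.1, 2.0, 2.0, 2.0]]
sizes = void_sizes(grid, 1.0)
```

    In the paper the threshold is the percolation threshold density of the smoothed simulation field, which gives the void definition its natural scaling property.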

  1. Network analysis of quantitative proteomics on asthmatic bronchi: effects of inhaled glucocorticoid treatment

    PubMed Central

    2011-01-01

    Background Proteomic studies of respiratory disorders have the potential to identify protein biomarkers for diagnosis and disease monitoring. Utilisation of sensitive quantitative proteomic methods creates opportunities to determine individual patient proteomes. The aim of the current study was to determine if quantitative proteomics of bronchial biopsies from asthmatics can distinguish relevant biological functions and whether inhaled glucocorticoid treatment affects these functions. Methods Endobronchial biopsies were taken from untreated asthmatic patients (n = 12) and healthy controls (n = 3). Asthmatic patients were randomised to double blind treatment with either placebo or budesonide (800 μg daily for 3 months) and new biopsies were obtained. Proteins extracted from the biopsies were digested and analysed using isobaric tags for relative and absolute quantitation combined with a nanoLC-LTQ Orbitrap mass spectrometer. Spectra obtained were used to identify and quantify proteins. Pathway analysis was performed using Ingenuity Pathway Analysis to identify significant biological pathways in asthma and determine how the expression of these pathways was changed by treatment. Results More than 1800 proteins were identified and quantified in the bronchial biopsies of subjects. The pathway analysis revealed acute phase response signalling, cell-to-cell signalling and tissue development associations with proteins expressed in asthmatics compared to controls. The functions and pathways associated with placebo and budesonide treatment showed distinct differences, including the decreased association with acute phase proteins as a result of budesonide treatment compared to placebo. Conclusions Proteomic analysis of bronchial biopsy material can be used to identify and quantify proteins using highly sensitive technologies, without the need for pooling of samples from several patients. Distinct pathophysiological features of asthma can be identified using this approach and the expression of these features is changed by inhaled glucocorticoid treatment. Quantitative proteomics may be applied to identify mechanisms of disease that may assist in the accurate and timely diagnosis of asthma. Trial registration ClinicalTrials.gov registration NCT01378039 PMID:21939520

  2. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, rib movement is assessed by fluoroscopy. However, the shadows of lung vessels and bronchi overlapping the ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called the "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). A bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, forming velocity maps. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movement: limited rib movement appeared as reduced velocity vectors and left-right asymmetric distributions on the vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movement without additional radiation dose.
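    A local velocity map of the general kind described can be sketched with simple block matching between consecutive frames. The block size, search range, and synthetic frames below are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def local_velocity(frame_a, frame_b, block=8, search=4):
    """Per-block displacement vectors between two frames, found by
    exhaustive block matching (minimum sum of squared differences)."""
    h, w = frame_a.shape
    vectors = {}
    for y in range(search, h - block - search, block):
        for x in range(search, w - block - search, block):
            ref = frame_a[y:y + block, x:x + block].astype(float)
            best, best_dv = np.inf, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    cand = frame_b[y + dy:y + dy + block,
                                   x + dx:x + dx + block].astype(float)
                    ssd = np.sum((ref - cand) ** 2)
                    if ssd < best:
                        best, best_dv = ssd, (dy, dx)
            vectors[(y, x)] = best_dv
    return vectors

# Synthetic example: a bright horizontal bar (a stand-in for a rib edge)
# shifted down by 2 pixels between frames.
a = np.zeros((32, 32)); a[10:14, 8:24] = 1.0
b = np.zeros((32, 32)); b[12:16, 8:24] = 1.0
v = local_velocity(a, b)  # blocks covering the bar edge report (dy, dx) = (2, 0)
```

Blocks that see only the featureless interior of a structure suffer the usual aperture problem; in practice the edges (here, the block containing the bar's left edge) carry the reliable vectors.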

  3. Exploratory data analysis groupware for qualitative and quantitative electrophoretic gel analysis over the Internet-WebGel.

    PubMed

    Lemkin, P F; Myrick, J M; Lakshmanan, Y; Shue, M J; Patrick, J L; Hornbeck, P V; Thornwal, G C; Partin, A W

    1999-12-01

    Many scientists use quantitative measurements to compare the presence and amount of various proteins and nucleotides among series of one- and two-dimensional (1-D and 2-D) electrophoretic gels. These gels are often scanned into digital image files. Gel spots are then quantified using stand-alone analysis software. However, as more research collaborations take place over the Internet, it has become useful to share intermediate quantitative data between researchers. This allows research group members to investigate their data and share their work in progress. We developed a World Wide Web group-accessible software system, WebGel, for interactively exploring qualitative and quantitative differences between electrophoretic gels. Such Internet databases are useful for publishing quantitative data and allow other researchers to explore the data with respect to their own research. Because intermediate results of one user may be shared with their collaborators using WebGel, this form of active data-sharing constitutes a groupware method for enhancing collaborative research. Quantitative and image gel data from a stand-alone gel image processing system are copied to a database accessible on the WebGel Web server. These data are then available for analysis by the WebGel database program residing on that server. Visualization is critical for better understanding of the data. WebGel helps organize labeled gel images into montages of corresponding spots as seen in these different gels. Various views of multiple gel images, including sets of spots, normalization spots, labeled spots, segmented gels, etc. may also be displayed. These displays are active and may be used for performing database operations directly on individual protein spots by simply clicking on them. Corresponding regions between sets of gels may be visually analyzed using Flicker-comparison (Electrophoresis 1997, 18, 122-140) as one of the WebGel methods for qualitative analysis.
Quantitative exploratory data analysis can be performed by comparing protein concentration values between corresponding spots for multiple samples run in separate gels. These data are then used to generate reports on statistical differences between sets of gels (e.g., between different disease states such as benign or metastatic cancers, etc.). Using combined visual and quantitative methods, WebGel can help bridge the analysis of dissimilar gels which are difficult to analyze with stand-alone systems and can serve as a collaborative Internet tool in a groupware setting. PMID:10612275

  4. Symmetry breaking in the early mammalian embryo: the case for quantitative single-cell imaging analysis.

    PubMed

    Welling, Maaike; Ponti, Aaron; Pantazis, Periklis

    2016-03-01

    In recent years, advances in imaging probes, cutting-edge microscopy techniques and powerful bioinformatics image analysis have markedly expanded the imaging toolbox available to developmental biologists. Apart from traditional qualitative studies, embryonic development can now be investigated in vivo with improved spatiotemporal resolution, with more detailed quantitative analyses down to the single-cell level of the developing embryo. Such imaging tools can provide many benefits to investigate the emergence of the asymmetry in the early mammalian embryo. Quantitative single-cell imaging has provided a deeper knowledge of the dynamic processes of how and why apparently indistinguishable cells adopt separate fates that ensure proper lineage allocation and segregation. To advance our understanding of the mechanisms governing such cell fate decisions, we will need to address current limitations of fluorescent probes, while at the same time take on challenges in image processing and analysis. New discoveries and developments in quantitative, single-cell imaging analysis will ultimately enable a truly comprehensive, multi-dimensional and multi-scale investigation of the dynamic morphogenetic processes that work in concert to shape the embryo. PMID:26316520

  5. Quantitative trace element analysis of microdroplet residues by secondary ion mass spectrometry

    SciTech Connect

    Odom, R.W.; Lux, G.; Fleming, R.H.; Chu, P.K.; Niemeyer, I.C.; Blattner, R.J.

    1988-10-01

    This paper reports the results of secondary ion mass spectrometry (SIMS) analyses of the elemental components contained in microvolume liquid residues deposited onto high-purity graphite substrates. These residues were formed by evaporating the solvent in 25-nL volumes of standard solutions containing the analyte element and a known mass of a yttrium internal standard. The capability of the SIMS technique to quantitatively measure the mass of the analyte was determined from these standard samples. The relative ion yields of Al, Ca, Mn, Fe, Co, Cu, Zn, Se, and Pb with respect to the Y internal standard were determined. The minimum detectable quantities (MDQ) of these elements were measured along with the precision of the SIMS analysis. Gram detectivities for this set of elements dissolved in the 25-nL volumes ranged between 85 pg and 2.0 fg, corresponding to molar detectivities ranging between 16 μM and 1.6 nM. Stable isotope dilution analysis of samples containing enriched ²⁰⁶Pb demonstrated quantitative measurement accuracy within 3% of the true values for samples containing 4.0 mM Pb. SIMS analyses of the NBS bovine serum reference standard indicate that this technique can provide useful quantitative analysis of selected elements contained in a microvolume of biological fluids.
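    The internal-standard arithmetic behind such measurements can be sketched as below. The counts, masses, and the relative sensitivity factor (RSF) formulation are generic illustrations, not the authors' exact calibration.

```python
# Internal-standard quantification as used in SIMS microdroplet analysis:
# the analyte ion signal is ratioed to the Y internal standard, and a
# relative sensitivity factor (RSF) from standards converts the measured
# ratio into a mass. All numbers below are made up for illustration.

def relative_sensitivity_factor(i_analyte, i_standard, m_analyte, m_standard):
    """RSF from a standard solution of known analyte/IS masses."""
    return (i_analyte / i_standard) / (m_analyte / m_standard)

def analyte_mass(i_analyte, i_standard, m_standard, rsf):
    """Unknown analyte mass from measured ion intensities."""
    return (i_analyte / i_standard) * m_standard / rsf

# Calibration droplet: 2.0 ng analyte with 10.0 ng Y gives 4000 vs 5000 counts.
rsf = relative_sensitivity_factor(4000, 5000, 2.0, 10.0)   # -> 4.0
# Unknown droplet: 1200 (analyte) vs 6000 (Y) counts, 10.0 ng Y added.
mass = analyte_mass(1200, 6000, 10.0, rsf)                 # -> 0.5 ng
```

Ratioing to a co-deposited internal standard cancels droplet-to-droplet variations in sputter yield and instrument transmission, which is why the technique tolerates the small, irregular residues.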

  6. Quantitative analysis of the epitaxial recrystallization effect induced by swift heavy ions in silicon carbide

    NASA Astrophysics Data System (ADS)

    Benyagoub, A.

    2015-12-01

    This paper discusses recent results on the recrystallization effect induced by swift heavy ions (SHI) in pre-damaged silicon carbide. The recrystallization kinetics was followed by using increasing SHI fluences and by starting from different levels of initial damage within the SiC samples. The quantitative analysis of the data shows that the recrystallization rate depends drastically on the local amount of crystalline material: it is nil in fully amorphous regions and becomes more significant with increasing amount of crystalline material. For instance, in samples initially nearly half-disordered, the recrystallization rate per incident ion is found to be 3 orders of magnitude higher than that observed with the well-known IBIEC process using low energy ions. This high rate can therefore not be accounted for by the existing IBIEC models. Moreover, decreasing the electronic energy loss leads to a drastic reduction of the recrystallization rate. A comprehensive quantitative analysis of all the experimental results shows that the SHI-induced high recrystallization rate can only be explained by a mechanism based on the melting of the amorphous zones through a thermal spike process, followed by an epitaxial recrystallization initiated from the neighboring crystalline regions if the size of the latter exceeds a certain critical value. This quantitative analysis also reveals that recent molecular dynamics calculations supposed to reproduce this phenomenon are wrong, since they overestimate the recrystallization rate by a factor of ~40.
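    A toy kinetic law with the reported qualitative behaviour (no recrystallization in fully amorphous material, faster growth where more crystalline material is present) is the logistic form df/dΦ = k·f·(1 − f). This is only a didactic stand-in, not the authors' model; the rate constant and fluence values are arbitrary.

```python
import math

# Logistic toy model for the crystalline fraction f versus SHI fluence phi:
# df/dphi = k * f * (1 - f). The rate vanishes as f -> 0 (fully amorphous),
# matching the qualitative behaviour reported. k, phi, f0 are illustrative.

def crystalline_fraction(f0, k, phi):
    """Closed-form logistic solution; requires 0 < f0 <= 1."""
    return 1.0 / (1.0 + (1.0 - f0) / f0 * math.exp(-k * phi))

f_half = crystalline_fraction(f0=0.5, k=2.0, phi=1.0)         # grows past 0.5
f_amorphous = crystalline_fraction(f0=1e-9, k=2.0, phi=1.0)   # stays ~0
```

The two evaluations mirror the paper's observation: a half-disordered sample recrystallizes readily, while a (near-)fully amorphous one barely moves.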

  7. Toward quantitative deuterium analysis with laser-induced breakdown spectroscopy using atmospheric-pressure helium gas

    SciTech Connect

    Hedwig, Rinda; Lie, Zener Sukra; Kurniawan, Koo Hendrik; Kagawa, Kiichiro; Tjia, May On

    2010-01-15

    An experimental study has been carried out for the development of quantitative deuterium analysis using neodymium-doped yttrium aluminum garnet (Nd:YAG) laser-induced breakdown spectroscopy (LIBS) with atmospheric-pressure surrounding He gas, by exploring the appropriate experimental conditions and a special sample cleaning technique. The result demonstrates the achievement of full resolution between the D and H emission lines from zircaloy-4 samples, which is a prerequisite for the desired quantitative analysis. Further, a linear calibration line with zero intercept was obtained for the emission intensity of deuterium from a number of zircaloy samples doped with predetermined concentrations of deuterium. The result was obtained by setting a +4 mm defocusing position for the laser beam, a 6 μs detection gating time, and a 7 mm imaging position of the plasma for the detection, combined with a special procedure of repeated laser cleaning of the samples. This study has thus provided the basis for the development of practical quantitative deuterium analysis by LIBS.
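    A least-squares calibration line constrained through the origin, like the zero-intercept line reported, can be computed in one step. The concentrations and intensities below are invented for illustration.

```python
import numpy as np

# Least-squares line forced through zero intercept: slope = sum(x*y) / sum(x*x).
# Concentrations and emission intensities are illustrative values only.

def fit_through_origin(x, y):
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return float(np.dot(x, y) / np.dot(x, x))

conc = [0.0, 10.0, 20.0, 40.0]       # hypothetical D concentrations
signal = [0.0, 52.0, 98.0, 201.0]    # hypothetical D-line intensities
slope = fit_through_origin(conc, signal)
predicted = slope * 25.0             # expected intensity at concentration 25
```

Forcing the intercept to zero is appropriate when a blank sample demonstrably gives no signal, as the fully resolved D line on an undoped sample would.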

  8. Low-dose CT for quantitative analysis in acute respiratory distress syndrome

    PubMed Central

    2013-01-01

    Introduction The clinical use of serial quantitative computed tomography (CT) to characterize lung disease and guide the optimization of mechanical ventilation in patients with acute respiratory distress syndrome (ARDS) is limited by the risk of cumulative radiation exposure and by the difficulties and risks related to transferring patients to the CT room. We evaluated the effects of tube current-time product (mAs) variations on quantitative results in healthy lungs and in experimental ARDS in order to support the use of low-dose CT for quantitative analysis. Methods In 14 sheep chest CT was performed at baseline and after the induction of ARDS via intravenous oleic acid injection. For each CT session, two consecutive scans were obtained applying two different mAs: 60 mAs was paired with 140, 15 or 7.5 mAs. All other CT parameters were kept unaltered (tube voltage 120 kVp, collimation 32 × 0.5 mm, pitch 0.85, matrix 512 × 512, pixel size 0.625 × 0.625 mm). Quantitative results obtained at different mAs were compared via Bland-Altman analysis. Results Good agreement was observed between 60 mAs and 140 mAs and between 60 mAs and 15 mAs (all biases less than 1%). A further reduction of mAs to 7.5 mAs caused an increase in the bias of poorly aerated and nonaerated tissue (-2.9% and 2.4%, respectively) and determined a significant widening of the limits of agreement for the same compartments (-10.5% to 4.8% for poorly aerated tissue and -5.9% to 10.8% for nonaerated tissue). Estimated mean effective dose at 140, 60, 15 and 7.5 mAs corresponded to 17.8, 7.4, 2.0 and 0.9 mSv, respectively. Image noise of scans performed at 140, 60, 15 and 7.5 mAs corresponded to 10, 16, 38 and 74 Hounsfield units, respectively. Conclusions A reduction of effective dose up to 70% has been achieved with minimal effects on lung quantitative results. 
Low-dose computed tomography provides accurate quantitative results and could be used to characterize lung compartment distribution and possibly monitor time-course of ARDS with a lower risk of exposure to ionizing radiation. A further radiation dose reduction is associated with lower accuracy in quantitative results. PMID:24004842
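    The Bland-Altman bias and 95% limits of agreement used for the dose comparison can be sketched as follows; the paired values are synthetic, not data from the study.

```python
import numpy as np

# Bland-Altman agreement between paired measurements from two protocols
# (e.g., a reference-dose and a reduced-dose scan). Values are synthetic.

def bland_altman(a, b):
    """Return (bias, (lower, upper)) 95% limits of agreement for a - b."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return float(bias), (float(bias - 1.96 * sd), float(bias + 1.96 * sd))

ref = [25.0, 40.0, 55.0, 70.0, 85.0]   # hypothetical tissue fractions (%)
low = [26.0, 39.0, 56.0, 69.0, 86.0]   # same lungs, reduced-dose protocol
bias, (lower, upper) = bland_altman(ref, low)
```

A small bias with narrow limits, as in the 60 vs 15 mAs comparison above, indicates the two protocols can be used interchangeably; widening limits, as at 7.5 mAs, signal that agreement is breaking down even if the mean bias stays small.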

  9. Novel analysis for improved validity in semi-quantitative 2-deoxyglucose autoradiographic imaging.

    PubMed

    Dawson, Neil; Ferrington, Linda; Olverman, Henry J; Kelly, Paul A T

    2008-10-30

    The original [(14)C]-2-deoxyglucose autoradiographic imaging technique allows for the quantitative determination of local cerebral glucose utilisation (LCMRglu) [Sokoloff L, Reivich M, Kennedy C, Desrosiers M, Patlak C, Pettigrew K, et al. The 2-deoxyglucose-C-14 method for measurement of local cerebral glucose utilisation-theory, procedure and normal values in conscious and anesthetized albino rats. J Neurochem 1977;28:897-916]. The range of applications to which the quantitative method can be readily applied is limited, however, by the requirement for the intermittent measurement of arterial radiotracer and glucose concentrations throughout the experiment, via intravascular cannulation. Some studies have applied a modified, semi-quantitative approach to estimate LCMRglu while circumventing the requirement for intravascular cannulation [Kelly S, Bieneman A, Uney J, McCulloch J. Cerebral glucose utilization in transgenic mice over-expressing heat shock protein 70 is altered by dizocilpine. Eur J Neurosci 2002;15(6):945-52; Jordan GR, McCulloch J, Shahid M, Hill DR, Henry B, Horsburgh K. Regionally selective and dose-dependent effects of the ampakines Org 26576 and Org 24448 on local cerebral glucose utilisation in the mouse as assessed by C-14-2-deoxyglucose autoradiography. Neuropharmacology 2005;49(2):254-64]. In this method only a terminal blood sample is collected for the determination of plasma [(14)C] and [glucose], and the rate of LCMRglu in each brain region of interest (RoI) is estimated by comparing the [(14)C] concentration in each region relative to a selected control region, which is proposed to demonstrate metabolic stability between the experimental groups.
Here we show that the semi-quantitative method has reduced validity in the measurement of LCMRglu as compared to the quantitative method and that the validity of this technique is further compromised by the inability of the methods applied within the analysis to appropriately determine metabolic stability in the selected standard region. To address these issues we have developed a novel form of analysis that provides an index of LCMRglu (iLCMRglu) for application when using the semi-quantitative approach. Provided that the methodological constraints inherent in 2-deoxyglucose autoradiography (e.g. normoglycaemia) are met this analytical technique both increases the validity of LCMRglu estimation by the semi-quantitative method and also allows for its broader experimental application. PMID:18762213
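    The semi-quantitative ratioing step described above reduces to normalising each region's [14C] concentration by that of the chosen control region. The region names and values below are invented for illustration.

```python
# Sketch of the semi-quantitative step: regional [14C] concentrations are
# expressed relative to a control region assumed to be metabolically stable
# across experimental groups. Region names and values are illustrative.

def relative_uptake(roi_c14, control_c14):
    """Each region's [14C] as a fraction of the control region's [14C]."""
    return {roi: value / control_c14 for roi, value in roi_c14.items()}

regions = {"frontal_cortex": 118.0, "hippocampus": 96.0, "striatum": 132.0}
ratios = relative_uptake(regions, control_c14=120.0)  # hippocampus -> 0.8
```

The paper's central caution applies directly to this sketch: the whole analysis is only as valid as the assumption that `control_c14` is genuinely stable between the groups being compared.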

  10. Summary of Terra and Aqua MODIS Long-Term Performance

    NASA Technical Reports Server (NTRS)

    Xiong, Xiaoxiong (Jack); Wenny, Brian N.; Angal, Amit; Barnes, William; Salomonson, Vincent

    2011-01-01

    Since launch in December 1999, the MODIS ProtoFlight Model (PFM) onboard the Terra spacecraft has successfully operated for more than 11 years. Its Flight Model (FM) onboard the Aqua spacecraft, launched in May 2002, has also successfully operated for over 9 years. MODIS observations are made in 36 spectral bands at three nadir spatial resolutions and are calibrated and characterized regularly by a set of on-board calibrators (OBC). Nearly 40 science products, supporting a variety of land, ocean, and atmospheric applications, are continuously derived from the calibrated reflectances and radiances of each MODIS instrument and widely distributed to the world-wide user community. Following an overview of MODIS instrument operation and calibration activities, this paper provides a summary of both Terra and Aqua MODIS long-term performance. Special considerations that are critical to maintaining MODIS data quality and beneficial for future missions are also discussed.

  11. Applications of quantitative 1H- and 13C-NMR spectroscopy in drug analysis.

    PubMed

    Pieters, L A; Vlietinck, A J

    1989-01-01

    The usefulness of 1H and 13C Fourier transform (FT) nuclear magnetic resonance spectroscopy (1H- and 13C-NMR) as quantitative methods stems from the potential direct relationship between the area under an NMR peak and the number of the particular type of nuclei that give rise to the signal, though it is necessary, especially for quantitative 13C-NMR, to take some precautions. The experimental limitations that have to be overcome in order to obtain quantitative 13C-NMR spectra are associated with the relaxation time, the nuclear Overhauser effect (NOE), and the NMR instrument itself (filter characteristics, power level of the exciting pulse, dynamic range, digital resolution). Practical problems aside, 13C-NMR has a greater potential than 1H-NMR for the study of organic systems. The sensitivity of 13C chemical shifts to small differences in molecular environment, coupled with a large chemical shift range, gives a "chromatographic" separation of resonances of interest, and has made 13C-NMR an attractive method for analysing complex mixtures. Some applications of quantitative 1H- and 13C-NMR spectroscopy in drug analysis are discussed. PMID:2490526
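    The quantitative-NMR relationship described above (peak area proportional to the number of contributing nuclei) leads to a simple concentration formula when an internal standard of known concentration is present. The integrals, nuclei counts, and standard below are illustrative assumptions.

```python
# Quantitative NMR arithmetic: peak area is proportional to the number of
# contributing nuclei, so the analyte concentration follows from per-nucleus
# normalised areas against an internal standard of known concentration.
# Integrals, proton counts and the standard are illustrative only.

def qnmr_concentration(area_analyte, n_analyte, area_std, n_std, conc_std):
    return (area_analyte / n_analyte) / (area_std / n_std) * conc_std

# Hypothetical: an analyte methyl singlet (3 H) against a 2-H standard at 10 mM.
conc = qnmr_concentration(6.0, 3, 4.0, 2, 10.0)  # -> 10.0 mM
```

The precautions listed in the abstract (full relaxation between pulses, NOE suppression for 13C) exist precisely so that this area-to-nuclei proportionality actually holds.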

  12. Surface Albedo/BRDF Parameters (Terra/Aqua MODIS)

    DOE Data Explorer

    Trishchenko, Alexander

    2008-01-15

    Spatially and temporally complete surface spectral albedo/BRDF products over the ARM SGP area were generated using data from the two Moderate Resolution Imaging Spectroradiometer (MODIS) sensors on the Terra and Aqua satellites. A landcover-based fitting (LBF) algorithm was developed to derive the BRDF model parameters and albedo product (Luo et al., 2004a). The approach employs a landcover map and multi-day clear-sky composites of directional surface reflectance. The landcover map is derived from the Landsat TM 30-meter data set (Trishchenko et al., 2004a), and the surface reflectances are from MODIS 500m-resolution 8-day composite products (MOD09/MYD09). The MOD09/MYD09 data are re-arranged into 10-day intervals for compatibility with other satellite products, such as those from the NOAA/AVHRR and SPOT/VGT sensors. The LBF method increases the success rate of the BRDF fitting process and enables more accurate monitoring of surface temporal changes during periods of rapid spring vegetation green-up and autumn leaf-fall, as well as changes due to agricultural practices and snowcover variations (Luo et al., 2004b; Trishchenko et al., 2004b). Albedo/BRDF products for MODIS on Terra and MODIS on Aqua, as well as for the combined Terra/Aqua dataset, are generated at 500m spatial resolution every 10 days since March 2000 (Terra) and July 2002 (Aqua and combined). The purpose of the latter product is to obtain a more comprehensive dataset that takes advantage of multi-sensor observations (Trishchenko et al., 2002). To fill data gaps due to cloud presence, various interpolation procedures are applied, based on a multi-year observation database and on results from other locations with similar landcover properties. A special seasonal smoothing procedure is also applied to further remove outliers and artifacts in the data series.
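    BRDF fitting with a linear kernel-driven model, the general family such products use, reduces to ordinary least squares. The kernel values and reflectances below are synthetic placeholders, not MODIS data, and the two-kernel form is a generic assumption rather than the LBF algorithm itself.

```python
import numpy as np

# Linear kernel-driven BRDF fit: reflectance = f_iso + f_vol*K_vol + f_geo*K_geo.
# Kernel values and reflectances are synthetic placeholders; real kernel values
# come from the sun/view geometry of each clear-sky composite observation.

k_vol = np.array([0.10, 0.25, 0.40, 0.15, 0.30])    # volumetric kernel values
k_geo = np.array([-0.05, 0.10, 0.20, 0.00, 0.12])   # geometric kernel values
refl = np.array([0.21, 0.26, 0.33, 0.22, 0.28])     # observed reflectances

A = np.column_stack([np.ones_like(k_vol), k_vol, k_geo])
(f_iso, f_vol, f_geo), *_ = np.linalg.lstsq(A, refl, rcond=None)
fitted = A @ np.array([f_iso, f_vol, f_geo])         # model reconstruction
```

Because the model is linear in its parameters, even a handful of clear-sky looks at different geometries suffices for a fit, which is what makes multi-day composites workable inputs.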

  13. Quantitative analysis of lunar crater's landscape: automatic detection, classification and geological applications

    NASA Astrophysics Data System (ADS)

    Li, Ke; Chen, Jianping; He, Shujun; Zhang, Mingchao

    2013-04-01

    Lunar craters are the most important geological tectonic features on the Moon and are among the most studied features of its surface, since they provide the relative age of a surface unit and broader information about lunar geology. Quantitative analysis of crater landscapes is an important approach in lunar geological unit dating, which plays a key role in understanding and reconstructing lunar geological evolution. In this paper, a new approach to automatic crater detection and classification is proposed, based on the quantitative analysis of crater landscapes with digital terrain models (DTMs) of different spatial resolutions. The approach includes the following key points: 1) A new crater detection method that uses profile-similarity parameters as the distinguishing marks is presented; it overcomes the high error rate of earlier DTM-based crater detection algorithms. 2) Craters are classified by the morphological characteristics of their profiles; this quantitative classification method overcomes the subjectivity of earlier descriptive classification methods. To verify the usefulness of the proposed method, the pre-selected landing area of China's Chang'e-III lunar mission, Sinus Iridum, was chosen as the experimental zone. DTMs with different resolutions from the Chang'e-I Laser Altimeter, the Chang'e-I Stereoscopic Camera and the Lunar Orbiter Laser Altimeter (LOLA) were used for crater detection and classification. Dating results for each geological unit were obtained using the crater size-frequency distribution (CSFD) method. Comparison with earlier dating and manual classification data shows that the results obtained by our method are strongly consistent with the former results.
    By combining automatic crater detection and classification, this paper provides a quantitative approach that can analyze lunar crater landscapes and extract geological information from them. The approach can also be applied to other planetary bodies such as Mars.
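    A profile-similarity score of the general kind used for DTM-based detection can be sketched as a normalised correlation between an elevation cross-section and an idealised crater template. The template and profiles below are synthetic illustrations, not the paper's parameters.

```python
import numpy as np

# Profile-similarity test in the spirit of the DTM-based detection step: an
# elevation cross-section is compared to an idealised bowl-shaped template
# using a normalised correlation coefficient. Profiles here are synthetic.

def profile_similarity(profile, template):
    p = (profile - profile.mean()) / profile.std()
    t = (template - template.mean()) / template.std()
    return float(np.mean(p * t))  # 1.0 means identical shape

x = np.linspace(-1.0, 1.0, 41)
template = x ** 2                  # idealised parabolic crater bowl
crater = 0.5 * x ** 2 + 0.02       # scaled, offset bowl: same shape
ridge = -np.abs(x)                 # inverted-V ridge: different shape

s_crater = profile_similarity(crater, template)   # close to 1
s_ridge = profile_similarity(ridge, template)     # negative
```

Normalising out mean and scale is what lets one template match craters of different depths and diameters, which is the practical appeal of profile-shape criteria over absolute elevation thresholds.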

  14. Qualitative and quantitative analysis of systems and synthetic biology constructs using P systems.

    PubMed

    Konur, Savas; Gheorghe, Marian; Dragomir, Ciprian; Mierla, Laurentiu; Ipate, Florentin; Krasnogor, Natalio

    2015-01-16

    Computational models are perceived as an attractive alternative to mathematical models (e.g., ordinary differential equations). These models incorporate a set of methods for specifying, modeling, testing, and simulating biological systems. In addition, they can be analyzed using algorithmic techniques (e.g., formal verification). This paper shows how formal verification is utilized in systems and synthetic biology through qualitative vs quantitative analysis. Here, we choose two well-known case studies: quorum sensing in P. aeruginosa and a pulse generator. The paper reports verification analysis of the two systems carried out using several model checking tools integrated into the Infobiotics Workbench platform, where system models are based on stochastic P systems. PMID:25090609
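    Stochastic P system models of the kind verified above rest on the same kinetics as Gillespie-style stochastic simulation. The toy degradation reaction below is an illustrative stand-in, unrelated to the paper's case studies.

```python
import random

# Minimal Gillespie-style simulation of a toy degradation reaction A -> 0
# with propensity k * n: the kind of stochastic kinetics that underlies
# stochastic P system models. Rate, horizon and seed are arbitrary.

def gillespie_decay(n0, k, t_end, seed=42):
    random.seed(seed)
    t, n = 0.0, n0
    while n > 0:
        wait = random.expovariate(k * n)   # exponential waiting time
        if t + wait > t_end:
            break
        t += wait
        n -= 1
    return n

remaining = gillespie_decay(n0=100, k=1.0, t_end=1.0)
```

Model checkers for such systems ask questions over the distribution of many such trajectories (e.g., "with what probability is n below 50 by time 1?") rather than inspecting a single run.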

  15. Stable-isotope dilution LC–MS for quantitative biomarker analysis

    PubMed Central

    Ciccimaro, Eugene; Blair, Ian A

    2010-01-01

    The ability to conduct validated analyses of biomarkers is critically important in order to establish the sensitivity and selectivity of the biomarker in identifying a particular disease. The use of stable-isotope dilution (SID) methodology in combination with LC–MS/MS provides the highest possible analytical specificity for quantitative determinations. This methodology is now widely used in the discovery and validation of putative exposure and disease biomarkers. This review will describe the application of SID LC–MS methodology for the analysis of small-molecule and protein biomarkers. It will also discuss potential future directions for the use of this methodology for rigorous biomarker analysis. PMID:20352077
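    The core stable-isotope dilution calculation is a single ratio. The areas and spike amount below are invented, and the unit response ratio is an assumption appropriate for a co-eluting heavy-isotope analogue.

```python
# Stable-isotope dilution arithmetic: a known amount of heavy-labelled
# internal standard is spiked into the sample, and the analyte amount
# follows from the light/heavy peak-area ratio (unit response ratio
# assumed, as for a co-eluting heavy-isotope analogue). Numbers invented.

def sid_amount(area_light, area_heavy, spiked_amount):
    return (area_light / area_heavy) * spiked_amount

amount = sid_amount(area_light=3.0e5, area_heavy=5.0e5, spiked_amount=50.0)
# (3e5 / 5e5) * 50 = 30 units of analyte
```

Because the labelled standard co-elutes and ionises identically to the analyte, matrix suppression affects both equally and cancels in the ratio; this is the source of the high analytical specificity the review describes.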

  16. Quantitative analysis of tin alloy combined with artificial neural network prediction

    SciTech Connect

    Oh, Seong Y.; Yueh, Fang-Yu; Singh, Jagdish P.

    2010-05-01

    Laser-induced breakdown spectroscopy was applied to quantitative analysis of three impurities in Sn alloy. The impurities analysis was based on the internal standard method using the Sn I 333.062-nm line as the reference line to achieve the best reproducible results. Minor-element concentrations (Ag, Cu, Pb) in the alloy were comparatively evaluated by artificial neural networks (ANNs) and calibration curves. ANN was found to effectively predict elemental concentrations with a trend of nonlinear growth due to self-absorption. The limits of detection for Ag, Cu, and Pb in Sn alloy were determined to be 29, 197, and 213 ppm, respectively.
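    Detection limits like those quoted are commonly derived from the calibration slope and the blank noise via the 3σ criterion. The blank readings and slope below are invented, and this generic criterion is not necessarily the authors' exact procedure.

```python
import numpy as np

# Limit of detection from a calibration curve: LOD = 3 * sigma_blank / slope,
# a common criterion for LIBS calibrations. Blank readings and slope invented.

def limit_of_detection(blank_signals, slope):
    sigma = float(np.std(blank_signals, ddof=1))
    return 3.0 * sigma / slope

blanks = [10.2, 9.8, 10.1, 9.9, 10.0]          # hypothetical blank intensities
lod = limit_of_detection(blanks, slope=0.005)  # slope in intensity per ppm
```

The internal-standard ratioing described in the abstract improves the LOD indirectly: normalising to the Sn I 333.062 nm line reduces shot-to-shot scatter, shrinking the blank's standard deviation.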

  17. Quantitative X-ray diffraction analysis of oxides formed on superalloys

    NASA Technical Reports Server (NTRS)

    Garlick, R. G.

    1972-01-01

    Methods were developed for quantitative analysis by X-ray diffraction of the oxides Al2O3, NiO, Cr2O3, CoO, and CoCr2O4 within a standard deviation of about 10 percent of the weight fraction reported, or within 1 percent absolute. These error limits assume that the sample oxides are well characterized and that the physicochemical structure of the oxides in the samples is identical with that in the synthesized standards. Results are given for the use of one of the techniques in the analysis of spalls from a series of oxidation tests of the cobalt-base alloy WI-52.
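    One generic way to quantify phase mixtures from diffraction intensities is the reference-intensity-ratio (RIR) approach sketched below; it is offered as an illustration of intensity-to-weight-fraction conversion, not as the specific standards-based technique of this study. Intensities and RIR constants are invented.

```python
# Reference-intensity-ratio (RIR) style quantification for powder XRD:
# each phase's strongest-peak intensity is scaled by a per-phase calibration
# constant, and the scaled values are normalised to weight fractions.
# Peak intensities and RIR constants below are illustrative only.

def xrd_weight_fractions(intensities, rir):
    scaled = {phase: intensities[phase] / rir[phase] for phase in intensities}
    total = sum(scaled.values())
    return {phase: value / total for phase, value in scaled.items()}

peaks = {"NiO": 500.0, "Cr2O3": 300.0, "Al2O3": 100.0}
rir = {"NiO": 5.0, "Cr2O3": 3.0, "Al2O3": 1.0}
fractions = xrd_weight_fractions(peaks, rir)   # weight fractions summing to 1
```

The caveat in the abstract maps directly onto the `rir` constants: they are only valid when the sample's phases are structurally identical to the standards used to measure them.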

  18. Quantitative spectral and orientational analysis in surface sum frequency generation vibrational spectroscopy (SFG-VS)

    NASA Astrophysics Data System (ADS)

    Wang, Hong-Fei; Gan, Wei; Lu, Rong; Rao, Yi; Wu, Bao-Hua

    Sum frequency generation vibrational spectroscopy (SFG-VS) has been proven to be a uniquely effective spectroscopic technique in the investigation of molecular structure and conformations, as well as the dynamics of molecular interfaces. However, the ability to apply SFG-VS to complex molecular interfaces has been limited by the difficulty of extracting quantitative information from SFG-VS experiments. In this review, we assess the limitations, issues and techniques, as well as the methodologies, of quantitative orientational and spectral analysis with SFG-VS. Based on these assessments, we also summarize recent developments in methodologies for quantitative orientational and spectral analysis in SFG-VS, and their applications to detailed analysis of SFG-VS data of various vapour/neat liquid interfaces. A rigorous formulation of the polarization null angle (PNA) method is given for accurate determination of the orientational parameter D = ⟨cos θ⟩/⟨cos³ θ⟩, and a comparison between the PNA method and the commonly used polarization intensity ratio (PIR) method is discussed. The polarization and incident angle dependencies of the SFG-VS intensity are also reviewed, in the light of how experimental arrangements can be optimized to effectively extract crucial information from SFG-VS experiments. The values and models of the local field factors in the molecular layers are discussed. In order to examine the validity and limitations of the bond polarizability derivative model, the general expressions for molecular hyperpolarizability tensors and their expression with the bond polarizability derivative model for C3v, C2v and C∞v molecular groups are given in the two appendixes. We show that the bond polarizability derivative model can quantitatively describe many aspects of the intensities observed in the SFG-VS spectrum of the vapour/neat liquid interfaces in different polarizations.
    Using the polarization analysis in SFG-VS, polarization selection rules or guidelines are developed for assignment of the SFG-VS spectrum. Using these selection rules, SFG-VS spectra of vapour/diol and vapour/n-alcohol (n = 1-8) interfaces are assigned, and some of the ambiguity and confusion, as well as their implications for previous IR and Raman assignments, are duly discussed. The ability to assign a SFG-VS spectrum using the polarization selection rules makes SFG-VS not only an effective and useful vibrational spectroscopy technique for interface studies, but also a complementary vibrational spectroscopy method for general condensed phase studies. These developments put quantitative orientational and spectral analysis in SFG-VS on a more solid foundation. The formulations, concepts and issues discussed in this review are expected to find broad applications in investigations of molecular interfaces in the future.
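    The orientational parameter D = ⟨cos θ⟩/⟨cos³ θ⟩ can be evaluated numerically for any assumed orientation distribution; a Gaussian distribution is a common model assumption in this literature. The tilt angle and width below are example values only.

```python
import numpy as np

# Numerical evaluation of the orientational parameter D = <cos(theta)> /
# <cos^3(theta)> for a Gaussian orientation distribution (a common model
# assumption in SFG-VS analysis). Angles and widths are example values.

def orientational_parameter(theta0_deg, sigma_deg, n=20001):
    theta = np.linspace(0.0, np.pi, n)
    gauss = np.exp(-0.5 * ((theta - np.radians(theta0_deg))
                           / np.radians(sigma_deg)) ** 2)
    weight = gauss * np.sin(theta)           # solid-angle weighting
    c1 = np.sum(np.cos(theta) * weight) / np.sum(weight)
    c3 = np.sum(np.cos(theta) ** 3 * weight) / np.sum(weight)
    return float(c1 / c3)

# For a near-delta distribution at theta0, D -> 1 / cos^2(theta0).
d = orientational_parameter(theta0_deg=30.0, sigma_deg=0.5)
```

The delta-function limit gives a quick sanity check: at a 30° tilt, D should approach 1/cos²(30°) = 4/3, and the narrow-Gaussian evaluation reproduces this.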

  19. Development of an AQUA Based Near-Surface Parameter Retrieval

    NASA Technical Reports Server (NTRS)

    Roberts, Brent; Clayson, Carol Anne

    2010-01-01

    The production of a satellite-based turbulent surface flux product relies critically upon the near-surface input parameters. Development of retrieval algorithms for the necessary near-surface variables of wind speed, specific humidity, air temperature, and sea surface temperature proceeded relatively independently of one another until recently. The use of a neural network approach using Special Sensor Microwave/Imager (SSM/I) data in conjunction with a first-guess sea surface temperature has led to successful retrieval of all parameters simultaneously. However, SSM/I frequencies lack inherent sensitivity to the sea surface temperature (SST). Recent studies have found that improved air temperature and humidity retrievals can be obtained via inclusion of microwave sounding channels weighted in the lower troposphere. The inclusion of SSM/I-like frequencies as well as SST-sensitive microwave channels on AMSR-E, along with AMSU-A sounding data onboard the AQUA platform, provides a unique opportunity: the ability to make near-simultaneous (in space and time) measurements allowing the retrieval of all the near-surface variables, including SST. This study shows results of a new algorithm designed to take advantage of the unique sampling ability of AQUA-based sensors. Results from a neural network based methodology are shown and compared to in-situ observations of near-surface variables. Implications for the creation of an AQUA-based turbulent surface flux product are also discussed.
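    The shape of such a neural-network retrieval (several channels of brightness temperature in, several near-surface parameters out) can be sketched as a one-hidden-layer forward pass. The weights here are random placeholders, not a trained retrieval, and the channel/output counts are illustrative.

```python
import numpy as np

# Shape sketch of a one-hidden-layer network of the kind used in
# neural-network retrievals: channel brightness temperatures in,
# near-surface parameters out. Weights are untrained placeholders.

def mlp_forward(x, w1, b1, w2, b2):
    hidden = np.tanh(x @ w1 + b1)   # nonlinear hidden layer
    return hidden @ w2 + b2         # linear output layer

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 7))                       # 5 scenes x 7 channels
w1 = rng.normal(scale=0.3, size=(7, 12)); b1 = np.zeros(12)
w2 = rng.normal(scale=0.3, size=(12, 4)); b2 = np.zeros(4)
retrieved = mlp_forward(x, w1, b1, w2, b2)        # e.g. wind, q, Ta, SST
```

The attraction of a single multi-output network, as the abstract notes, is that all near-surface variables are retrieved simultaneously from the same co-located measurement vector rather than by independent per-variable algorithms.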

  20. Evaluating the impact of cold focal plane temperature on Aqua MODIS thermal emissive band calibration

    NASA Astrophysics Data System (ADS)

    Li, Yonghong; Wu, Aisheng; Wenny, Brian; Xiong, Xiaoxiong

    2015-09-01

    Aqua MODIS, the second MODIS instrument of the NASA Earth Observing System, has operated for over thirteen years since launch in 2002. MODIS has sixteen thermal emissive bands (TEB) located on two separate cold focal plane assemblies (CFPA). The TEB are calibrated using onboard blackbody and space view observations. The MODIS CFPA temperature is controlled by a radiative cooler and heaters in order to maintain detector gain stability. Beginning in 2006, the CFPA temperature gradually drifted from its designed operating temperature with increasing orbital and seasonal fluctuations, with the largest observed impacts on the TEB photoconductive (PC) bands. In Aqua Collection 6 (C6), a correction to the detector gain due to the CFPA temperature variation is applied for data after mid-2012. This paper evaluates the impact of the CFPA temperature variation on the TEB PC band calibration through comparisons with simultaneous nadir overpass (SNO) measurements from the Infrared Atmospheric Sounding Interferometer (IASI) and the Atmospheric Infrared Sounder (AIRS). Our analysis shows that the current L1B product from mid-2011 to mid-2012 is affected by the CFPA temperature fluctuation. The MODIS-IASI comparison shows that no drift is observed in the PC bands over the CFPA temperature variation range. Similarly, in the MODIS-AIRS comparison, bands 31-34 show nearly no trend over the range of CFPA temperature, while a slight drift in bands 35-36 is seen.
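    A drift check in the spirit of these SNO comparisons amounts to fitting a linear trend to the MODIS-minus-reference differences over time. The differences below are synthetic; a near-zero slope indicates no calibration drift.

```python
import numpy as np

# Trend fit to a time series of MODIS-minus-reference brightness-temperature
# differences, in the spirit of the SNO comparisons. Values are synthetic;
# a slope consistent with zero means no detectable calibration drift.

t = np.arange(10, dtype=float)  # e.g., months since the comparison start
dbt = np.array([0.12, 0.08, 0.11, 0.09, 0.10,
                0.10, 0.11, 0.09, 0.12, 0.08])  # differences in kelvin
slope, intercept = np.polyfit(t, dbt, 1)        # K per month, offset in K
```

Separating a constant inter-sensor bias (the intercept) from a time-dependent drift (the slope) is what allows a stable offset between instruments to be tolerated while real gain changes are flagged.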

  1. High Throughput Quantitative Analysis of Serum Proteins using Glycopeptide Capture and Liquid Chromatography Mass Spectrometry

    SciTech Connect

    Zhang, Hui; Yi, Eugene C.; Li, Xiao-jun; Mallick, Parag; Kelly-Spratt, Karen S.; Masselon, Christophe D.; Camp, David G.; Smith, Richard D.; Kemp, Christopher; Aebersold, Ruedi

    2005-02-01

    It is expected that the composition of the serum proteome can provide valuable information about the state of the human body in health and disease, and that this information can be extracted via quantitative proteomic measurements. Suitable proteomic techniques need to be sensitive, reproducible and robust to detect potential biomarkers below the level of highly expressed proteins, to generate data sets that are comparable between experiments and laboratories, and to have high throughput to support statistical studies. In this paper, we report a method for high throughput quantitative analysis of serum proteins. It consists of the selective isolation of peptides that are N-linked glycosylated in the intact protein, the analysis of these now deglycosylated peptides by LC-ESI-MS, and the comparative analysis of the resulting patterns. By focusing selectively on a few formerly N-linked glycopeptides per serum protein, the complexity of the analyte sample is significantly reduced and the sensitivity and throughput of serum proteome analysis are increased compared with the analysis of total tryptic peptides from unfractionated samples. We provide data that document the performance of the method and show that sera from untreated normal mice and genetically identical mice with carcinogen induced skin cancer can be unambiguously discriminated using unsupervised clustering of the resulting peptide patterns. We further identify, by tandem mass spectrometry, some of the peptides that were consistently elevated in cancer mice compared to their control littermates.

  2. High Throughput Quantitative Analysis of Serum Proteins Using Glycopeptide Capture and Liquid Chromatography Mass Spectrometry

    SciTech Connect

    Zhang, Hui; Yi, Eugene C.; Li, Xiao-jun; Mallick, Parag; Kelly-Spratt, Karen S.; Masselon, Christophe D.; Camp, David G.; Smith, Richard D.; Kemp, Christopher J.; Aebersold, Ruedi

    2005-02-01

    It is expected that the composition of the serum proteome can provide valuable information about the state of the human body in health and disease and that this information can be extracted via quantitative proteomic measurements. Suitable proteomic techniques need to be sensitive, reproducible, and robust to detect potential biomarkers below the level of highly expressed proteins, generate data sets that are comparable between experiments and laboratories, and have high throughput to support statistical studies. Here we report a method for high throughput quantitative analysis of serum proteins. It consists of the selective isolation of peptides that are N-linked glycosylated in the intact protein, the analysis of these now deglycosylated peptides by liquid chromatography electrospray ionization mass spectrometry, and the comparative analysis of the resulting patterns. By focusing selectively on a few formerly N-linked glycopeptides per serum protein, the complexity of the analyte sample is significantly reduced and the sensitivity and throughput of serum proteome analysis are increased compared with the analysis of total tryptic peptides from unfractionated samples. We provide data that document the performance of the method and show that sera from untreated normal mice and genetically identical mice with carcinogen-induced skin cancer can be unambiguously discriminated using unsupervised clustering of the resulting peptide patterns. We further identify, by tandem mass spectrometry, some of the peptides that were consistently elevated in cancer mice compared with their control littermates.
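
The discrimination step above relies on unsupervised clustering of peptide abundance patterns. Below is a minimal, generic sketch of that idea (single-linkage agglomerative clustering into two groups on invented abundance vectors), not the authors' actual pipeline:

```python
# Illustration of unsupervised clustering of peptide patterns:
# single-linkage agglomerative clustering into two groups, on synthetic
# peptide-abundance vectors ("cancer-like" samples have elevated peptides).

def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def single_linkage(samples, k):
    """Repeatedly merge the two closest clusters until k clusters remain."""
    clusters = [[i] for i in range(len(samples))]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist(samples[a], samples[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)
    return clusters

# Rows: glycopeptide abundances; first three "control", last three "cancer".
data = [
    [1.0, 1.1, 0.9], [1.2, 0.9, 1.0], [0.9, 1.0, 1.1],
    [3.1, 2.8, 3.0], [2.9, 3.2, 3.1], [3.0, 3.0, 2.9],
]
groups = single_linkage(data, k=2)
print(sorted(sorted(g) for g in groups))  # -> [[0, 1, 2], [3, 4, 5]]
```

With well-separated patterns, the two recovered clusters coincide with the control and cancer sample groups, which is the behavior the abstract describes.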

  3. RASL-seq for Massively Parallel and Quantitative Analysis of Gene Expression

    PubMed Central

    Li, Hairi; Qu, Jinsong; Fu, Xiang-Dong

    2012-01-01

    Large-scale, quantitative analysis of gene expression can be accomplished by microarray or RNA-seq analysis. While these methods are applicable to genome-wide analysis, it is often desirable to quantify expression of a more limited set of genes in thousands or even tens of thousands of biological samples. For example, some studies may need to monitor a sizable panel of key genes under many different experimental conditions, during development, or after treatment with a large library of small molecules, for which current genome-wide methods are either inefficient or cost-prohibitive. This unit presents a method that permits quantitative profiling of several hundred selected genes in a large number of samples by coupling RNA-mediated oligonucleotide Annealing, Selection, and Ligation with Next-Gen sequencing (RASL-seq). The method even allows direct analysis of RNA levels in cell lysates and is also adaptable to full automation, making it ideal for large-scale analysis of multiple biological pathways or regulatory gene networks in the context of systematic genetic or chemical genetic perturbations. PMID:22470064

  4. Oufti: an integrated software package for high-accuracy, high-throughput quantitative microscopy analysis.

    PubMed

    Paintdakhi, Ahmad; Parry, Bradley; Campos, Manuel; Irnov, Irnov; Elf, Johan; Surovtsev, Ivan; Jacobs-Wagner, Christine

    2016-02-01

    With the realization that bacteria display phenotypic variability among cells and exhibit complex subcellular organization critical for cellular function and behavior, microscopy has re-emerged as a primary tool in bacterial research during the last decade. However, the bottleneck in today's single-cell studies is quantitative image analysis of cells and fluorescent signals. Here, we address current limitations through the development of Oufti, a stand-alone, open-source software package for automated measurements of microbial cells and fluorescence signals from microscopy images. Oufti provides computational solutions for tracking touching cells in confluent samples, handles various cell morphologies, offers algorithms for quantitative analysis of both diffraction and non-diffraction-limited fluorescence signals and is scalable for high-throughput analysis of massive datasets, all with subpixel precision. All functionalities are integrated in a single package. The graphical user interface, which includes interactive modules for segmentation, image analysis and post-processing analysis, makes the software broadly accessible to users irrespective of their computational skills. PMID:26538279

  5. Quantitative Analysis of Protein Expression to Study Lineage Specification in Mouse Preimplantation Embryos.

    PubMed

    Saiz, Nestor; Kang, Minjung; Schrode, Nadine; Lou, Xinghua; Hadjantonakis, Anna-Katerina

    2016-01-01

    This protocol presents a method to perform quantitative, single-cell in situ analyses of protein expression to study lineage specification in mouse preimplantation embryos. The procedures necessary for embryo collection, immunofluorescence, imaging on a confocal microscope, and image segmentation and analysis are described. This method allows quantitation of the expression of multiple nuclear markers and the spatial (XYZ) coordinates of all cells in the embryo. It takes advantage of MINS, an image segmentation software tool specifically developed for the analysis of confocal images of preimplantation embryos and embryonic stem cell (ESC) colonies. MINS carries out unsupervised nuclear segmentation across the X, Y and Z dimensions, and produces information on cell position in three-dimensional space, as well as nuclear fluorescence levels for all channels with minimal user input. While this protocol has been optimized for the analysis of images of preimplantation stage mouse embryos, it can easily be adapted to the analysis of any other samples exhibiting a good signal-to-noise ratio and where high nuclear density poses a hurdle to image segmentation (e.g., expression analysis of embryonic stem cell (ESC) colonies, differentiating cells in culture, embryos of other species or stages, etc.). PMID:26967230

  6. [Quantitative analysis of surface composition of polypropylene blends using attenuated total reflectance FTIR spectroscopy].

    PubMed

    Chen, Han-jia; Zhu, Ya-fei; Zhang, Yi; Xu, Jia-rui

    2008-08-01

    The surface composition and structure of solid organic polymers influence many of their properties and applications. Oligomers such as poly(ethylene glycol) (PEG), poly(methyl methacrylate) (PMMA), poly(butyl methacrylate) (PBMA) and their graft copolymers of polybutadiene and polypropylene were used as macromolecular surface modifiers of polypropylene. The compositions on the surface and in the bulk of the polypropylene (PP) blends were determined quantitatively using attenuated total reflectance FTIR spectroscopy (ATR-FTIR) with a variable-angle multiple-reflection ATR accessory and transmission FTIR measurements, respectively. After validation against the Lambert-Beer law, the bands at 1103 and 1733 cm(-1) were used as the characteristic absorbance bands for quantitative determination of the surface composition of the modifiers, representing the poly(ethylene glycol) and carbonyl segments in the PP blends, respectively. The determination error can be effectively eliminated by calibrating the wavelength and using the absorption peak area ratio as the calibration basis for the quantitative analysis. To minimize the effect of contact between the polymer film and the internal reflection element on the absolute absorbance, the technique of "band ratioing" was developed, and it was verified that the error in the peak area ratios of interest can be reduced to 5% or below, making ATR-FTIR suitable as a quantitative tool for surface composition. Working curves were then established and used to calculate the composition of the corresponding functional groups at the film surface of the PP blends. The depth distribution of modifiers at the surface of the blend films can also be determined by changing the incident angle, on the basis of the equation for the penetration depth of the evanescent wave in ATR spectra. The results indicated that ATR-FTIR can be used to determine quantitatively the surface composition and distribution of modifiers, with reproducible and reliable measurement results. PMID:18975806
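
The "band ratioing" plus working-curve procedure described above reduces to fitting a line through (known composition, peak-area ratio) pairs and inverting it for unknown films. A minimal sketch with invented numbers (the specific compositions and area ratios below are illustrative, not measured values from the paper):

```python
# Working-curve calibration: relate the carbonyl/reference peak-area ratio
# to known modifier content by least squares, then invert for an unknown.
# All numbers are hypothetical.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Known modifier mass fractions (%) and their measured peak-area ratios.
composition = [1.0, 2.0, 4.0, 8.0]
area_ratio  = [0.11, 0.21, 0.39, 0.82]   # A(carbonyl) / A(reference band)

slope, intercept = linear_fit(composition, area_ratio)

def predict_composition(ratio):
    """Invert the working curve: surface composition from a measured ratio."""
    return (ratio - intercept) / slope

print(round(predict_composition(0.30), 2))
```

Using the area ratio rather than absolute absorbance is what cancels the film/crystal contact variation the abstract mentions.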

  7. Investigating reference genes for quantitative real-time PCR analysis across four chicken tissues.

    PubMed

    Bagés, S; Estany, J; Tor, M; Pena, R N

    2015-04-25

    Accurate normalization of data is required to correct for different efficiencies and errors during the processing of samples in reverse transcription PCR analysis. The chicken is one of the main livestock species and its genome was one of the first reported and used in large scale transcriptomic analysis. Despite this, the chicken has not been investigated regarding the identification of reference genes suitable for the quantitative PCR analysis of growth and fattening genes. In this study, five candidate reference genes (B2M, RPL32, SDHA, TBP and YWHAZ) were evaluated to determine the most stable internal reference for quantitative PCR normalization in the two main commercial muscles (pectoralis major (breast) and biceps femoris (thigh)), liver and abdominal fat. Four statistical methods (geNorm, NormFinder, CV and BestKeeper) were used in the evaluation of the most suitable combination of reference genes. Additionally, a comprehensive ranking was established with the RefFinder tool. This analysis identified YWHAZ and TBP as the recommended combination for the analysis of biceps femoris and liver, YWHAZ and RPL32 for pectoralis major and RPL32 and B2M for abdominal fat and across-tissue studies. The final ranking for each tool changed slightly but overall the results, and most particularly the ability to discard the least robust candidates, were consistent between tools. The selection and number of reference genes were validated using SCD, a target gene related to fat metabolism. Overall, the results can be directly used to quantitate target gene expression in different tissues or in validation studies from larger transcriptomic experiments. PMID:25680290
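
One of the four stability measures used above is the coefficient of variation (CV) of each candidate gene's quantification cycle (Cq) values across samples: the gene with the lowest CV is ranked most stable. A minimal sketch with invented Cq values (not data from the study):

```python
# CV-based stability ranking of candidate reference genes: the gene whose
# Cq values vary least across samples (lowest CV) is the most stable.
# Cq values below are invented for illustration.
from statistics import mean, stdev

cq = {
    "B2M":   [22.1, 23.8, 21.5, 24.0, 22.9],
    "RPL32": [18.2, 18.4, 18.1, 18.5, 18.3],
    "YWHAZ": [20.0, 20.3, 19.9, 20.2, 20.1],
}

def cv(values):
    """Coefficient of variation (%) = 100 * sd / mean."""
    return 100.0 * stdev(values) / mean(values)

ranking = sorted(cq, key=lambda gene: cv(cq[gene]))
print(ranking)  # most stable gene first
```

Tools such as geNorm and NormFinder use different stability models, which is why the abstract combines several methods into a comprehensive ranking rather than relying on CV alone.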

  8. [Near-infrared spectrum quantitative analysis model based on principal components selected by elastic net].

    PubMed

    Chen, Wan-hui; Liu, Xu-hua; He, Xiong-kui; Min, Shun-geng; Zhang, Lu-da

    2010-11-01

    Elastic net is an improvement of the least-squares method obtained by introducing L1 and L2 penalties, and it has the advantage of variable selection. A quantitative analysis model built with Elastic net can improve prediction accuracy. Using 89 wheat samples as the experimental material, the spectral principal components of the samples were selected by Elastic net. An analysis model relating the near-infrared spectra to the wheat protein content was established, and the feasibility of using Elastic net to establish the quantitative analysis model was confirmed. In the experiment, the 89 wheat samples were randomly divided into two groups, with 60 samples forming the model set and 29 samples the prediction set. The 60 samples were used to build an analysis model to predict the protein contents of the 29 samples; the correlation coefficient (R) between the predicted values and the chemically observed values was 0.9849, with a mean relative error of 2.48%. To further investigate the feasibility and stability of the model, the 89 samples were randomly split five times, with 60 samples as the model set and 29 as the prediction set. The five groups of principal components selected by Elastic net for building the model were basically consistent, and compared with the PCR and PLS methods, the model prediction accuracies were all better than PCR and similar to PLS. Given that Elastic net can perform variable selection and the resulting model predicts well, Elastic net is a suitable method for building chemometric quantitative analysis models. PMID:21284156
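
The variable-selection behavior credited to Elastic net comes from its combined L1/L2 penalty. A minimal univariate sketch (synthetic data, not the paper's spectral model): minimize (1/2n)*sum((y - b*x)^2) + lam*(a*|b| + (1-a)*b^2/2), whose solution is a soft-thresholded, ridge-shrunk least-squares coefficient.

```python
# Univariate elastic net in closed form: the L1 part (soft threshold) can
# zero the coefficient out entirely (variable selection); the L2 part
# shrinks it smoothly.

def soft_threshold(rho, t):
    """Shrink rho toward zero by t; exactly zero inside [-t, t]."""
    if rho > t:
        return rho - t
    if rho < -t:
        return rho + t
    return 0.0

def elastic_net_1d(x, y, lam, alpha):
    n = len(x)
    rho = sum(xi * yi for xi, yi in zip(x, y)) / n
    z = sum(xi * xi for xi in x) / n
    return soft_threshold(rho, lam * alpha) / (z + lam * (1.0 - alpha))

x = [-2.0, -1.0, 0.0, 1.0, 2.0]
y = [-4.1, -1.9, 0.1, 2.0, 3.9]          # roughly y = 2x

b_ols = elastic_net_1d(x, y, lam=0.0, alpha=0.5)   # no penalty: OLS slope
b_en  = elastic_net_1d(x, y, lam=1.0, alpha=0.5)   # shrunk toward zero
b_off = elastic_net_1d(x, y, lam=10.0, alpha=1.0)  # strong L1: selected out
print(b_ols, b_en, b_off)
```

In the multivariate case (as in the wheat study), the same soft-threshold update is applied coordinate by coordinate until convergence.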

  9. Calibration strategy for semi-quantitative direct gas analysis using inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Gerdes, Kirk; Carter, Kimberly E.

    2011-09-01

    A process is described by which an ICP-MS equipped with an Octopole Reaction System (ORS) is calibrated using liquid phase standards to facilitate direct analysis of gas phase samples. The instrument response to liquid phase standards is analyzed to produce empirical factors relating ion generation and transmission efficiencies to standard operating parameters. Empirical factors generated for liquid phase samples are then used to produce semi-quantitative analysis of both mixed liquid/gas samples and pure gas samples. The method developed is similar to the semi-quantitative analysis algorithms in the commercial software, which have here been expanded to include gas phase elements such as Xe and Kr. Equations for prediction of relative ionization efficiencies and isotopic transmission are developed for several combinations of plasma operating conditions, which allows adjustment of limited parameters between liquid and gas injection modes. In particular, the plasma temperature and electron density are calculated from comparison of experimental results to the predictions of the Saha equation. Comparisons between operating configurations are made to determine the robustness of the analysis to plasma conditions and instrument operating parameters. Using the methods described in this research, the elemental concentrations in a liquid standard containing 45 analytes and treated as an unknown sample were quantified accurately to ± 50% for most elements using 133Cs as a single internal reference. The method is used to predict liquid phase mercury within 12% of the actual concentration and gas phase mercury within 28% of the actual concentration. The results verify that the calibration method facilitates accurate semi-quantitative, gas phase analysis of metal species with sufficient sensitivity to quantify metal concentrations lower than 1 ppb for many metallic analytes.
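
The Saha equation mentioned above relates the ion/neutral population ratio to plasma temperature and electron density. A hedged sketch of evaluating it (SI constants; the partition-function ratio is set to 1 for brevity, and the Cs ionization energy and electron density below are illustrative values, not results from the study):

```python
# Saha relation:
#   n_ion / n_neutral = 2*(g_i/g_0) * (2*pi*m_e*k*T/h^2)^1.5 * exp(-chi/(k*T)) / n_e
# Higher temperature (at fixed n_e) pushes the equilibrium toward ionization.
import math

M_E = 9.10938e-31      # electron mass, kg
K_B = 1.380649e-23     # Boltzmann constant, J/K
H   = 6.62607e-34      # Planck constant, J*s
EV  = 1.602177e-19     # joules per electronvolt

def saha_ratio(T, n_e, chi_ev, g_ratio=1.0):
    """n_ion/n_neutral at temperature T (K), electron density n_e (m^-3)."""
    thermal = (2.0 * math.pi * M_E * K_B * T / H**2) ** 1.5
    return 2.0 * g_ratio * thermal * math.exp(-chi_ev * EV / (K_B * T)) / n_e

# Cs (first ionization energy ~3.89 eV) at an ICP-like n_e ~ 1e21 m^-3:
cool = saha_ratio(6000.0, 1e21, 3.89)
hot  = saha_ratio(8000.0, 1e21, 3.89)
print(cool, hot)
```

Comparing measured ionization efficiencies against such predictions is what allows the plasma temperature and electron density to be back-calculated, as the abstract describes.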

  10. Quantitative analysis of volatiles in edible oils following accelerated oxidation using broad spectrum isotope standards.

    PubMed

    Gómez-Cortés, Pilar; Sacks, Gavin L; Brenna, J Thomas

    2015-05-01

    Analyses of food volatiles generated by processing are widely reported, but comparison across studies is challenging, in part because most reports are inherently semi-quantitative for most analytes owing to the limited availability of chemical standards. We recently introduced a novel strategy for the creation of broad-spectrum isotopic standards for accurate quantitative food chemical analysis. Here we apply the principle to the quantification of 25 volatiles in seven thermally oxidised edible oils. After extended oxidation, total volatiles of high n-3 oils (flax, fish, cod liver) were 120-170 mg/kg, while those of low n-3 vegetable oils were <50 mg/kg. Separate experiments on the thermal degradation of d5-ethyl linolenate indicate that off-aroma volatiles originate throughout the n-3 molecule and not solely from the n-3 terminal end. These data represent the first report using broad-spectrum isotopically labelled standards for quantitative characterisation of processing-induced volatile generation across related foodstuffs, and they verify the origin of specific volatiles from parent n-3 fatty acids. PMID:25529686

  11. Quantitative scintigraphy with deconvolutional analysis for the dynamic measurement of hepatic function

    SciTech Connect

    Tagge, E.P.; Campbell, D.A. Jr.; Reichle, R.; Averill, D.R. Jr.; Merion, R.M.; Dafoe, D.C.; Turcotte, J.G.; Juni, J.E.

    1987-06-01

    A mathematical technique known as deconvolutional analysis was used to provide a critical and previously missing element in the computations required to quantitate hepatic function scintigraphically. This computer-assisted technique allowed for the determination of the time required, in minutes, for a labeled bilirubin analog (99mTc-disofenin) to enter the liver via blood and exit via bile. This interval was referred to as the mean transit time (MTT). The critical process provided by deconvolution is the mathematical simulation of a bolus injection of tracer directly into the afferent blood supply of the liver. The raw data required for this simulation are obtained from the intravenous injection of labeled disofenin, a member of the HIDA family of radiopharmaceuticals. In this study, we performed experiments documenting that the simulation process itself is accurate. We then calculated the MTT under a variety of experimental conditions involving progressive hepatic ischemia/reperfusion injury and correlated these results with the results of simultaneously performed BSP determinations and hepatic histology. The experimental group with the most pronounced histologic findings (necrosis, vacuolization, disorganization of hepatic cords) also had the most prolonged MTT and BSP half-life. However, both quantitative imaging and BSP testing were able to identify milder degrees of hepatic ischemic injury not reflected in the histologic evaluation. Quantitative imaging with deconvolutional analysis is a technique easily adaptable to the standard nuclear medicine minicomputer. It provides rapid results and appears to be a sensitive monitor of hepatic functional disturbances resulting from ischemia and reperfusion.
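
The deconvolution step described above can be sketched in a few lines: given the input (blood) curve i and the measured organ curve o = i * h, recover the impulse response h sample by sample, then take the mean transit time as the first moment of h. Curves below are synthetic (frames 1 min apart), not clinical data:

```python
# Discrete deconvolution for mean transit time (MTT) estimation.

def convolve(i, h):
    n = len(i)
    return [sum(i[k] * h[t - k] for k in range(t + 1)) for t in range(n)]

def deconvolve(o, i):
    """Solve o = i * h for h, sample by sample (requires i[0] != 0)."""
    h = []
    for t in range(len(o)):
        acc = sum(i[k] * h[t - k] for k in range(1, t + 1))
        h.append((o[t] - acc) / i[0])
    return h

def mean_transit_time(h, dt=1.0):
    """First moment of the impulse response, in units of dt."""
    return dt * sum(t * v for t, v in enumerate(h)) / sum(h)

i = [4.0, 3.0, 2.0, 1.0, 0.5, 0.25]          # input (blood) activity
h_true = [0.0, 0.5, 0.3, 0.15, 0.05, 0.0]    # true impulse response
o = convolve(i, h_true)                      # simulated liver curve

h = deconvolve(o, i)
print(round(mean_transit_time(h), 3))  # -> 1.75
```

In practice the measured curves are noisy, so clinical implementations regularize the deconvolution rather than inverting it exactly as this sketch does.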

  12. Quantitative analysis of phytosterols in edible oils using APCI liquid chromatography-tandem mass spectrometry

    PubMed Central

    Mo, Shunyan; Dong, Linlin; Hurst, W. Jeffrey; van Breemen, Richard B.

    2014-01-01

    Previous methods for the quantitative analysis of phytosterols have usually used GC-MS and require elaborate sample preparation including chemical derivatization. Other common methods such as HPLC with absorbance detection do not provide information regarding the identity of the analytes. To address the need for an assay that utilizes mass selectivity while avoiding derivatization, a quantitative method based on LC-tandem mass spectrometry (LC-MS-MS) was developed and validated for the measurement of six abundant dietary phytosterols and structurally related triterpene alcohols including brassicasterol, campesterol, cycloartenol, β-sitosterol, stigmasterol, and lupeol in edible oils. Samples were saponified, extracted with hexane and then analyzed using reversed phase HPLC with positive ion atmospheric pressure chemical ionization tandem mass spectrometry and selected reaction monitoring. The utility of the LC-MS-MS method was demonstrated by analyzing 14 edible oils. All six compounds were present in at least some of the edible oils. The most abundant phytosterol in all samples was β-sitosterol, which was highest in corn oil at 4.35 ± 0.03 mg/g, followed by campesterol in canola oil at 1.84 ± 0.01 mg/g. The new LC-MS-MS method for the quantitative analysis of phytosterols provides a combination of speed, selectivity and sensitivity that exceed those of previous assays. PMID:23884629

  13. Emerging flow injection mass spectrometry methods for high-throughput quantitative analysis.

    PubMed

    Nanita, Sergio C; Kaldon, Laura G

    2016-01-01

    Where does flow injection analysis mass spectrometry (FIA-MS) stand relative to ambient mass spectrometry (MS) and chromatography-MS? Improvements in FIA-MS methods have resulted in fast-expanding uses of this technique. Key advantages of FIA-MS over chromatography-MS are fast analysis (typical run time <60 s) and method simplicity, and FIA-MS offers high throughput without compromising sensitivity, precision and accuracy as much as ambient MS techniques. Consequently, FIA-MS is increasingly becoming recognized as a suitable technique for applications where quantitative screening of chemicals needs to be performed rapidly and reliably. The FIA-MS methods discussed herein have demonstrated quantitation of diverse analytes, including pharmaceuticals, pesticides, environmental contaminants, and endogenous compounds, at levels ranging from parts-per-billion (ppb) to parts-per-million (ppm) in very complex matrices (such as blood, urine, and a variety of foods of plant and animal origin), allowing successful applications of the technique in clinical diagnostics, metabolomics, environmental sciences, toxicology, and detection of adulterated/counterfeited goods. The recent boom in applications of FIA-MS for high-throughput quantitative analysis has been driven in part by (1) the continuous improvements in sensitivity and selectivity of MS instrumentation, (2) the introduction of novel sample preparation procedures compatible with standalone mass spectrometric analysis such as salting-out assisted liquid-liquid extraction (SALLE) with volatile solutes and NH4+ QuEChERS, and (3) the need to improve efficiency of laboratories to satisfy increasing analytical demand while lowering operational cost. The advantages and drawbacks of quantitative analysis by FIA-MS are discussed in comparison to chromatography-MS and ambient MS (e.g., DESI, LAESI, DART). 
Generally, FIA-MS sits 'in the middle' between ambient MS and chromatography-MS, offering a balance between analytical capability and sample analysis throughput suitable for broad applications in life sciences, agricultural chemistry, consumer safety, and beyond. Graphical abstract Position of FIA-MS relative to chromatography-MS and ambient MS in terms of analytical figures of merit and sample analysis throughput. PMID:26670771

  14. Mechanical Model Analysis for Quantitative Evaluation of Liver Fibrosis Based on Ultrasound Tissue Elasticity Imaging

    NASA Astrophysics Data System (ADS)

    Shiina, Tsuyoshi; Maki, Tomonori; Yamakawa, Makoto; Mitake, Tsuyoshi; Kudo, Masatoshi; Fujimoto, Kenji

    2012-07-01

    Precise evaluation of the stage of chronic hepatitis C with respect to fibrosis has become an important issue to prevent the occurrence of cirrhosis and to initiate appropriate therapeutic intervention such as viral eradication using interferon. Ultrasound tissue elasticity imaging, i.e., elastography, can visualize tissue hardness/softness, and its clinical usefulness has been studied for detecting and evaluating tumors. We have recently reported that the texture of the elasticity image changes as fibrosis progresses. To evaluate fibrosis progression quantitatively on the basis of ultrasound tissue elasticity imaging, we introduced a mechanical model of fibrosis progression, simulated the process by which hepatic fibrosis affects elasticity images, and compared the results with those of clinical data analysis. As a result, it was confirmed that even in diffuse diseases like chronic hepatitis, the patterns of elasticity images are related to fibrous structural changes caused by hepatic disease and can be used to derive features for quantitative evaluation of the fibrosis stage.

  15. Fourier transform infrared spectroscopy quantitative analysis of SF6 partial discharge decomposition components

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoxing; Liu, Heng; Ren, Jiangbo; Li, Jian; Li, Xin

    2015-02-01

    Gas-insulated switchgear (GIS) internal SF6 gas produces specific decomposition components under partial discharge (PD). By detecting these characteristic decomposition components, such information as the type and level of GIS internal insulation deterioration can be obtained effectively, and the status of GIS internal insulation can be evaluated. SF6 was selected as the background gas for Fourier transform infrared spectroscopy (FTIR) detection in this study. SOF2, SO2F2, SO2, and CO were selected as the characteristic decomposition components for system analysis. The standard infrared absorption spectra of the four characteristic components were measured, the optimal absorption peaks were recorded, and the corresponding absorption coefficients were calculated. Quantitative detection experiments on the four characteristic components were conducted. The variation trends of the volume fractions of the four characteristic components with PD time were analyzed, and the quantitative relationships among gas production rate, PD time, and PD quantity were studied under five different PD quantities.

  16. Quantitative quenching evaluation and direct intracellular metabolite analysis in Penicillium chrysogenum.

    PubMed

    Meinert, Sabine; Rapp, Sina; Schmitz, Katja; Noack, Stephan; Kornfeld, Georg; Hardiman, Timo

    2013-07-01

    Sustained progress in metabolic engineering methodologies has stimulated new efforts toward optimizing fungal production strains such as through metabolite analysis of Penicillium chrysogenum industrial-scale processes. Accurate intracellular metabolite quantification requires sampling procedures that rapidly stop metabolism (quenching) and avoid metabolite loss via the cell membrane (leakage). When sampling protocols are validated, the quenching efficiency is generally not quantitatively assessed. For fungal metabolomics, quantitative biomass separation using centrifugation is a further challenge. In this study, P. chrysogenum intracellular metabolites were quantified directly from biomass extracts using automated sampling and fast filtration. A master/slave bioreactor concept was applied to provide industrial production conditions. Metabolic activity during sampling was monitored by 13C tracing. Enzyme activities were efficiently stopped and metabolite leakage was absent. This work provides a reliable method for P. chrysogenum metabolomics and will be an essential base for metabolic engineering of industrial processes. PMID:23541815

  17. Quantitative analysis of virgin coconut oil in cream cosmetics preparations using fourier transform infrared (FTIR) spectroscopy.

    PubMed

    Rohman, A; Man, Yb Che; Sismindari

    2009-10-01

    Today, virgin coconut oil (VCO) is becoming a valuable oil and an attractive topic for researchers because of its several biological activities. In the cosmetics industry, VCO is an excellent material that functions as a skin moisturizer and softener. Therefore, it is important to develop a fast and reliable quantitative analytical method. Fourier transform infrared (FTIR) spectroscopy with the attenuated total reflectance (ATR) sample handling technique can be successfully used to quantify VCO in cream cosmetic preparations. A multivariate analysis using a partial least squares (PLS) calibration model revealed a good relationship between the actual and FTIR-predicted values of VCO, with a coefficient of determination (R2) of 0.998. PMID:19783522

  18. Qualitative and quantitative analysis of Eclipta prostrata L. by LC/MS.

    PubMed

    Han, Lifeng; Liu, Erwei; Kojo, Agyemang; Zhao, Jing; Li, Wei; Zhang, Yi; Wang, Tao; Gao, Xiumei

    2015-01-01

    Eclipta prostrata L. is one of the Chinese medicinal tonics, usually used for treating loose teeth, dizziness, tinnitus, hemoptysis, hematuria, and uterine bleeding. However, quality control of this herbal medicine has not been satisfactory. This study reports its qualitative and quantitative analyses based on an LC/MS method. UHPLC-DAD-Q-TOF-MS fingerprinting and MS fragmentation cleavage pathways were investigated for qualitative analysis. Furthermore, a method was established for the simultaneous quantitative determination of nine compounds in E. prostrata: luteolin 7-O-β-D-glucopyranoside, ecliptasaponin C, luteolin, eclalbasaponin IV, apigenin, ecliptasaponin A, echinocystic acid 28-O-β-D-glucopyranoside, echinocystic acid, and 3-oxo-16α-hydroxy-olean-12-en-28-oic acid. The method was validated for samples of E. prostrata from different habitats. The results showed good linearity, precision, accuracy, and repeatability, so the method can be used for determining the contents of the nine compounds in E. prostrata from different habitats. PMID:25667939

  19. Fourier transform infrared spectroscopy quantitative analysis of SF6 partial discharge decomposition components.

    PubMed

    Zhang, Xiaoxing; Liu, Heng; Ren, Jiangbo; Li, Jian; Li, Xin

    2015-02-01

    Gas-insulated switchgear (GIS) internal SF6 gas produces specific decomposition components under partial discharge (PD). By detecting these characteristic decomposition components, such information as the type and level of GIS internal insulation deterioration can be obtained effectively, and the status of GIS internal insulation can be evaluated. SF6 was selected as the background gas for Fourier transform infrared spectroscopy (FTIR) detection in this study. SOF2, SO2F2, SO2, and CO were selected as the characteristic decomposition components for system analysis. The standard infrared absorption spectra of the four characteristic components were measured, the optimal absorption peaks were recorded, and the corresponding absorption coefficients were calculated. Quantitative detection experiments on the four characteristic components were conducted. The variation trends of the volume fractions of the four characteristic components with PD time were analyzed, and the quantitative relationships among gas production rate, PD time, and PD quantity were studied under five different PD quantities. PMID:25459612

  20. A tool for the quantitative spatial analysis of complex cellular systems.

    PubMed

    Fernandez-Gonzalez, Rodrigo; Barcellos-Hoff, Mary Helen; Ortiz-de-Solórzano, Carlos

    2005-09-01

    Spatial events largely determine the biology of cells, tissues, and organs. In this paper, we present a tool for the quantitative spatial analysis of heterogeneous cell populations, and we show experimental validation of this tool using both artificial and real (mammary gland tissue) data, in two and three dimensions. We present the refined relative neighborhood graph as a means to establish neighborhood between cells in an image while modeling the topology of the tissue. Then, we introduce the M function as a method to quantitatively evaluate the existence of spatial patterns within one cell population or the relationship between the spatial distributions of multiple cell populations. Finally, we show a number of examples that demonstrate the feasibility of our approach. PMID:16190466
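
The relative neighborhood graph used above to define cell neighbors has a simple rule: two points p and q are connected unless some third point r is closer to both of them than they are to each other. A brute-force O(n^3) sketch on synthetic 2-D "cell centroid" coordinates (not the paper's refined variant or its mammary gland data):

```python
# Relative neighborhood graph (RNG): edge (i, j) exists iff no third point r
# satisfies max(d(i, r), d(j, r)) < d(i, j).

def d(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def rng_edges(points):
    edges = []
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            dij = d(points[i], points[j])
            blocked = any(
                max(d(points[i], points[r]), d(points[j], points[r])) < dij
                for r in range(n) if r not in (i, j)
            )
            if not blocked:
                edges.append((i, j))
    return edges

cells = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (1.0, 1.0)]
print(rng_edges(cells))  # -> [(0, 1), (1, 2), (1, 3)]
```

Note how the long edges (0, 2), (0, 3), and (2, 3) are suppressed by the intermediate point, which is how the graph models local tissue topology rather than raw proximity.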

  1. TOPICAL REVIEW Quantitative strain analysis of surfaces and interfaces using extremely asymmetric x-ray diffraction

    NASA Astrophysics Data System (ADS)

    Akimoto, Koichi; Emoto, Takashi

    2010-12-01

    Strain can reduce carrier mobility and the reliability of electronic devices and affect the growth mode of thin films and the stability of nanometer-scale crystals. To control lattice strain, a technique for measuring the minute lattice strain at surfaces and interfaces is needed. Recently, an extremely asymmetric x-ray diffraction method has been developed for this purpose. By employing Darwin's dynamical x-ray diffraction theory, quantitative evaluation of strain at surfaces and interfaces becomes possible. In this paper, we review our quantitative strain analysis studies on native SiO2/Si interfaces, reconstructed Si surfaces, Ni/Si(111)-H interfaces, sputtered III-V compound semiconductor surfaces, high-k/Si interfaces, and Au ion-implanted Si.

  2. The Other Half of the Story: Effect Size Analysis in Quantitative Research

    PubMed Central

    Maher, Jessica Middlemis; Markey, Jonathan C.; Ebert-May, Diane

    2013-01-01

    Statistical significance testing is the cornerstone of quantitative research, but studies that fail to report measures of effect size are potentially missing a robust part of the analysis. We provide a rationale for why effect size measures should be included in quantitative discipline-based education research. Examples from both biological and educational research demonstrate the utility of effect size for evaluating practical significance. We also provide details about some effect size indices that are paired with common statistical significance tests used in educational research and offer general suggestions for interpreting effect size measures. Finally, we discuss some inherent limitations of effect size measures and provide further recommendations about reporting confidence intervals. PMID:24006382
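
    As a concrete instance of pairing an effect-size index with a common significance test, the sketch below computes Cohen's d using a pooled standard deviation; the score samples and the choice of the pooled-SD variant are illustrative, not taken from the article.

```python
import statistics

def cohens_d(group1, group2):
    """Cohen's d with a pooled (sample) standard deviation -- a standard
    effect-size index reported alongside a two-sample t test."""
    n1, n2 = len(group1), len(group2)
    s1, s2 = statistics.variance(group1), statistics.variance(group2)
    pooled_sd = (((n1 - 1) * s1 + (n2 - 1) * s2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(group1) - statistics.mean(group2)) / pooled_sd

# Hypothetical exam scores for two instructional treatments.
treatment = [82, 85, 88, 90, 79, 86]
control = [75, 78, 80, 83, 77, 74]
d = cohens_d(treatment, control)  # "large" effect by Cohen's benchmarks
```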

  3. Kinetics analysis and quantitative calculations for the successive radioactive decay process

    NASA Astrophysics Data System (ADS)

    Zhou, Zhiping; Yan, Deyue; Zhao, Yuliang; Chai, Zhifang

    2015-01-01

    The general radioactive decay kinetics equations with branching were developed and their analytical solutions were derived by the Laplace transform method. The time dependence of all the nuclide concentrations can be easily obtained by applying the equations to any known radioactive decay series. Taking the thorium decay series as an example, the evolution over time of the concentrations of the various member nuclides was obtained by quantitative numerical calculation. The method can be applied to the quantitative prediction and analysis of daughter nuclides in complicated successive decay processes with branching, such as the natural radioactive decay series, nuclear reactors, nuclear waste disposal, nuclear spallation, the synthesis and identification of superheavy nuclides, and radioactive ion beam physics and chemistry.
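
    For the simplest two-member chain without branching, the Laplace-transform solutions described above reduce to the classic Bateman form, sketched below in Python. The decay constants are illustrative and not tied to any real series.

```python
import math

def bateman_two_member(N0, lam1, lam2, t):
    """Analytic concentrations for a two-member chain 1 -> 2 -> (stable):
    N0 atoms of nuclide 1 at t = 0, nuclide 2 initially absent.
    Assumes lam1 != lam2 (the usual non-degenerate Bateman case)."""
    N1 = N0 * math.exp(-lam1 * t)
    N2 = N0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return N1, N2

# Illustrative decay constants (per hour) and a 5-hour elapsed time.
N1, N2 = bateman_two_member(1e6, 0.1, 0.5, t=5.0)
```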

  4. Cholera Modeling: Challenges to Quantitative Analysis and Predicting the Impact of Interventions

    PubMed Central

    Grad, Yonatan H.; Miller, Joel C.; Lipsitch, Marc

    2012-01-01

    Several mathematical models of epidemic cholera have recently been proposed in response to outbreaks in Zimbabwe and Haiti. These models aim to estimate the dynamics of cholera transmission and the impact of possible interventions, with a goal of providing guidance to policy-makers in deciding among alternative courses of action, including vaccination, provision of clean water, and antibiotics. Here we discuss concerns about model misspecification, parameter uncertainty, and spatial heterogeneity intrinsic to models for cholera. We argue for caution in interpreting quantitative predictions, particularly predictions of the effectiveness of interventions. We specify sensitivity analyses that would be necessary to improve confidence in model-based quantitative prediction, and suggest types of monitoring in future epidemic settings that would improve analysis and prediction. PMID:22659546
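
    A minimal sketch of the class of model under discussion, assuming a Tien-and-Earn-style SIWR formulation with both direct and waterborne transmission; all parameter values and initial conditions below are invented for illustration, not calibrated to any outbreak.

```python
def simulate_siwr(beta_i=0.4, beta_w=0.5, xi=0.1, gamma=0.2, days=200, dt=0.05):
    """Forward-Euler integration of a minimal SIWR cholera model.
    S, I, R are population fractions; W is a rescaled water-reservoir
    pathogen concentration (one common nondimensionalised form)."""
    S, I, W, R = 0.999, 0.001, 0.0, 0.0
    for _ in range(int(days / dt)):
        new_inf = (beta_i * I + beta_w * W) * S  # direct + waterborne force of infection
        dS = -new_inf
        dI = new_inf - gamma * I
        dW = xi * (I - W)  # shedding into, and decay of, the reservoir
        dR = gamma * I
        S, I, W, R = S + dt * dS, I + dt * dI, W + dt * dW, R + dt * dR
    return S, I, W, R

S, I, W, R = simulate_siwr()
```

    Sensitivity analyses of the kind the authors call for amount to re-running such a simulation over ranges of beta_i, beta_w, and xi and comparing predicted attack rates.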

  5. Quantitative Reverse Transcription-qPCR-Based Gene Expression Analysis in Plants.

    PubMed

    Abdallah, Heithem Ben; Bauer, Petra

    2016-01-01

    The investigation of gene expression is an initial and essential step to understand the function of a gene in a physiological context. Reverse transcription-quantitative real-time PCR (RT-qPCR) assays are reproducible, quantitative, and fast. They can be adapted to study model and non-model plant species without the need to have whole genome or transcriptome sequence data available. Here, we provide a protocol for a reliable RT-qPCR assay, which can be easily adapted to any plant species of interest. We describe the design of the qPCR strategy and primer design, considerations for plant material generation, RNA preparation and cDNA synthesis, qPCR setup and run, and qPCR data analysis, interpretation, and final presentation. PMID:26577777
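
    The data-analysis step of such an assay is commonly the 2^-ΔΔCt (Livak) method. A sketch with hypothetical Ct values, assuming roughly 100% amplification efficiency for both primer pairs:

```python
def delta_delta_ct(ct_target_sample, ct_ref_sample, ct_target_ctrl, ct_ref_ctrl):
    """Relative expression by the 2^-ddCt method: normalise the target gene's
    Ct to a reference gene within each condition, then compare conditions."""
    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl
    return 2.0 ** -(d_ct_sample - d_ct_ctrl)

# Hypothetical Ct values: target vs. reference gene, treated plant vs. control.
fold_change = delta_delta_ct(22.0, 18.0, 25.0, 18.0)  # 8-fold upregulation
```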

  6. The Aqua-Planet Experiment (APE): CONTROL SST Simulation

    NASA Technical Reports Server (NTRS)

    Blackburn, Michael; Williamson, David L.; Nakajima, Kensuke; Ohfuchi, Wataru; Takahashi, Yoshiyuki O.; Hayashi, Yoshi-Yuki; Nakamura, Hisashi; Ishiwatari, Masaki; Mcgregor, John L.; Borth, Hartmut; Wirth, Volkmar; Frank, Helmut; Bechtold, Peter; Wedi, Nils P.; Tomita, Hirofumi; Satoh, Masaki; Zhao, Ming; Held, Isaac M.; Suarez, Max J.; Lee, Myong-In; Watanabe, Masahiro; Kimoto, Masahide; Liu, Yimin; Wang, Zaizhi; Molod, Andrea M.; Rajendran, Kavirajan; Kotoh, Akio; Stratton, Rachel

    2013-01-01

    Climate simulations by 16 atmospheric general circulation models (AGCMs) are compared on an aqua-planet, a water-covered Earth with prescribed sea surface temperature varying only in latitude. The idealised configuration is designed to expose differences in the circulation simulated by different models. Basic features of the aqua-planet climate are characterised by comparison with Earth. The models display a wide range of behaviour. The balanced component of the tropospheric mean flow, and mid-latitude eddy covariances subject to budget constraints, vary relatively little among the models. In contrast, differences in damping in the dynamical core strongly influence transient eddy amplitudes. Historical uncertainty in modelled lower stratospheric temperatures persists in APE. Aspects of the circulation generated more directly by interactions between the resolved fluid dynamics and parameterized moist processes vary greatly. The tropical Hadley circulation forms either a single or double inter-tropical convergence zone (ITCZ) at the equator, with large variations in mean precipitation. The equatorial wave spectrum shows a wide range of precipitation intensity and propagation characteristics. Kelvin mode-like eastward propagation with remarkably constant phase speed dominates in most models. Westward propagation, less dispersive than the equatorial Rossby modes, dominates in a few models or occurs within an eastward propagating envelope in others. The mean structure of the ITCZ is related to precipitation variability, consistent with previous studies. The aqua-planet global energy balance is unknown but the models produce a surprisingly large range of top of atmosphere global net flux, dominated by differences in shortwave reflection by clouds. A number of newly developed models, not optimised for Earth climate, contribute to this. Possible reasons for differences in the optimised models are discussed. The aqua-planet configuration is intended as one component of an experimental hierarchy used to evaluate AGCMs. This comparison does suggest that the range of model behaviour could be better understood and reduced in conjunction with Earth climate simulations. Controlled experimentation is required to explore individual model behavior and investigate convergence of the aqua-planet climate with increasing resolution.

  7. Quantitative uncertainty and sensitivity analysis of a PWR control rod ejection accident

    SciTech Connect

    Pasichnyk, I.; Perin, Y.; Velkov, K.

    2013-07-01

    The paper describes the results of the quantitative Uncertainty and Sensitivity (U/S) Analysis of a Rod Ejection Accident (REA) which is simulated by the coupled system code ATHLET-QUABOX/CUBBOX applying the GRS tool for U/S analysis SUSA/XSUSA. For the present study, a UOX/MOX mixed core loading based on a generic PWR is modeled. A control rod ejection is calculated for two reactor states: Hot Zero Power (HZP) and 30% of nominal power. The worst cases for the rod ejection are determined by steady-state neutronic simulations taking into account the maximum reactivity insertion in the system and the power peaking factor. For the U/S analysis 378 uncertain parameters are identified and quantified (thermal-hydraulic initial and boundary conditions, input parameters and variations of the two-group cross sections). Results for uncertainty and sensitivity analysis are presented for safety important global and local parameters. (authors)

  8. Optimisation of Lime-Soda process parameters for reduction of hardness in aqua-hatchery practices using Taguchi methods.

    PubMed

    Yavalkar, S P; Bhole, A G; Babu, P V Vijay; Prakash, Chandra

    2012-04-01

    This paper presents the optimisation of Lime-Soda process parameters for the reduction of hardness in aqua-hatchery practices in the context of M. rosenbergii. Fresh water used in the development of fisheries needs to be of suitable quality, and the lack of desirable quality in the available fresh water is generally the confronting constraint. On the Indian subcontinent, groundwater is the only source of raw water; it has varying degrees of hardness and is thus unsuitable for fresh-water prawn hatchery practices (M. rosenbergii). In order to make use of hard water in aqua-hatchery contexts, the Lime-Soda process has been recommended. The efficacy of the various process parameters, such as lime dose, soda ash dose, and detention time, on the reduction of hardness needs to be examined. This paper determines the parameter settings for the CIFE well water, which is quite hard, using the Taguchi experimental design method. Taguchi orthogonal arrays, the signal-to-noise ratio, and analysis of variance (ANOVA) were applied to determine the chemical dosages and to analyse their effect on hardness reduction. Tests carried out with the optimal levels of the Lime-Soda process parameters confirmed the efficacy of the Taguchi optimisation method. Emphasis has been placed on optimising the chemical doses required to reduce total hardness using the Taguchi method and ANOVA, to suit the available raw water quality for aqua-hatchery practices, especially for the fresh-water prawn M. rosenbergii. PMID:24749379
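
    For a response like residual hardness, the Taguchi signal-to-noise ratio of the "smaller is better" type applies; the setting with the higher ratio is preferred. A Python sketch with hypothetical readings (not data from the study):

```python
import math

def sn_smaller_is_better(values):
    """Taguchi signal-to-noise ratio for a 'smaller is better' response,
    SN = -10 * log10(mean(y^2)); larger (less negative) SN is better."""
    return -10.0 * math.log10(sum(y * y for y in values) / len(values))

# Hypothetical residual-hardness readings (mg/L as CaCO3) from replicate
# runs at two dosage settings of lime and soda ash.
sn_a = sn_smaller_is_better([120.0, 118.0, 122.0])
sn_b = sn_smaller_is_better([95.0, 99.0, 97.0])  # lower hardness -> higher SN
```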

  9. Quantitative and Qualitative Analysis of the Quality of Life of Individuals With Eating Disorders

    PubMed Central

    McCune, Ashley M.; Mandal, Konoy; Lundgren, Jennifer D.

    2015-01-01

    Objective: To examine the quality of a broad range of life domains using both quantitative and qualitative methodologies. Method: Forty-eight individuals seeking inpatient treatment for an eating disorder (mean age = 29.8 years, female = 100%, white = 96.4%) from 2007 to 2009 completed the Quality of Life Inventory (QOLI) and the Eating Disorder Examination Questionnaire; a medical chart review confirmed diagnosis and treatment history. Patients diagnosed with anorexia nervosa (n = 24) and bulimia nervosa (n = 24) were compared. Body mass index (kg/m2), treatment history, number of comorbid psychiatric conditions, and eating disorder severity were used to predict quality of life. Finally, an inductive content analysis was performed on qualitative QOLI responses to contextualize the quantitative findings. Results: Participants with anorexia nervosa, compared to those with bulimia nervosa, reported significantly less satisfaction with the domain of relatives (F1,46 = 5.35; P = .025); no other significant group differences were found. The only significant predictor of QOLI global score was number of previous treatments (F1,41 = 8.67; P = .005; R2 = 0.175). Content analysis of qualitative data yielded complementary findings to the quantitative data; interesting group differences emerged for satisfaction with health with implications for measuring quality of life domains. Conclusions: Across several life domains, individuals seeking treatment for anorexia nervosa and bulimia nervosa appear to have similar levels of satisfaction, as evidenced by numeric and descriptive responses. Satisfaction with relatives, however, appears to differ between groups and suggests a specific target for intervention among patients in treatment for anorexia nervosa (eg, a family-based intervention such as the Maudsley approach). 
The use of quantitative and qualitative assessments, such as the QOLI, provides more clinically meaningful, contextualized information about quality of life than traditional self-report assessments alone. PMID:26445689

  10. Quantitative Gene Expression Analysis in Microdissected Archival Formalin-Fixed and Paraffin-Embedded Tumor Tissue

    PubMed Central

    Specht, Katja; Richter, Thomas; Müller, Ulrike; Walch, Axel; Werner, Martin; Höfler, Heinz

    2001-01-01

    Formalin-fixed, paraffin-embedded tissue is the most widely available material for retrospective clinical studies. In combination with the potential of genomics, these tissues represent an invaluable resource for the elucidation of disease mechanisms and validation of differentially expressed genes as novel therapeutic targets or prognostic indicators. We describe here an approach that, in combination with laser-assisted microdissection allows quantitative gene expression analysis in formalin-fixed, paraffin-embedded archival tissue. Using an optimized RNA microscale extraction procedure in conjunction with real-time quantitative reverse transcriptase-polymerase chain reaction based on fluorogenic TaqMan methodology, we analyzed the expression of a panel of cancer-relevant genes, EGF-R, HER-2/neu, FGF-R4, p21/WAF1/Cip1, MDM2, and HPRT and PGK as controls. We demonstrate that expression level determinations from formalin-fixed, paraffin-embedded tissues are accurate and reproducible. Measurements were comparable to those obtained with matching fresh-frozen tissue and neither fixation grade nor time significantly affected the results. Laser microdissection studies with 5-μm thick sections and defined numbers of tumor cells demonstrated that reproducible quantitation of specific mRNAs can be achieved with only 50 cells. We applied our approach to HER-2/neu quantitative gene expression analysis in 54 microdissected tumor and nonneoplastic archival samples from patients with Barrett’s esophageal adenocarcinoma and showed that the results matched those obtained in parallel by fluorescence in situ hybridization and immunohistochemistry. Thus, the combination of laser-assisted microdissection and real-time TaqMan reverse transcriptase-polymerase chain reaction opens new avenues for the investigation and clinical validation of gene expression changes in archival tissue specimens. PMID:11159180

  11. Noninvasive Characterization of Locally Advanced Breast Cancer Using Textural Analysis of Quantitative Ultrasound Parametric Images

    PubMed Central

    Tadayyon, Hadi; Sadeghi-Naini, Ali; Czarnota, Gregory J.

    2014-01-01

    PURPOSE: The identification of tumor pathologic characteristics is an important part of breast cancer diagnosis, prognosis, and treatment planning but currently requires biopsy as its standard. Here, we investigated a noninvasive quantitative ultrasound method for the characterization of breast tumors in terms of their histologic grade, which can be used with clinical diagnostic ultrasound data. METHODS: Tumors of 57 locally advanced breast cancer patients were analyzed as part of this study. Seven quantitative ultrasound parameters were determined from each tumor region from the radiofrequency data, including mid-band fit, spectral slope, 0-MHz intercept, scatterer spacing, attenuation coefficient estimate, average scatterer diameter, and average acoustic concentration. Parametric maps were generated corresponding to the region of interest, from which four textural features, including contrast, energy, homogeneity, and correlation, were determined as further tumor characterization parameters. Data were examined on the basis of tumor subtypes based on histologic grade (grade I versus grade II to III). RESULTS: Linear discriminant analysis of the means of the parametric maps resulted in classification accuracy of 79%. On the other hand, the linear combination of the texture features of the parametric maps resulted in classification accuracy of 82%. Finally, when both the means and textures of the parametric maps were combined, the best classification accuracy was obtained (86%). CONCLUSIONS: Textural characteristics of quantitative ultrasound spectral parametric maps provided discriminant information about different types of breast tumors. The use of texture features significantly improved the results of ultrasonic tumor characterization compared to conventional mean values. 
Thus, this study suggests that texture-based quantitative ultrasound analysis of in vivo breast tumors can provide complementary diagnostic information about tumor histologic characteristics. PMID:25500086
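
    The four texture features named above are standard grey-level co-occurrence matrix (GLCM, Haralick-style) statistics. A minimal pure-Python sketch with a single one-pixel-right offset and a tiny hypothetical quantised "parametric map" (the paper's actual offsets and quantisation are not specified here):

```python
def glcm_features(image, levels):
    """GLCM for a one-pixel horizontal offset, plus the four texture
    statistics: contrast, energy, homogeneity, correlation.
    `image` is a list of rows of integer grey levels in [0, levels)."""
    glcm = [[0.0] * levels for _ in range(levels)]
    total = 0
    for row in image:
        for a, b in zip(row, row[1:]):  # co-occurring horizontal neighbours
            glcm[a][b] += 1
            total += 1
    p = [[v / total for v in row] for row in glcm]  # normalise to probabilities

    mean_i = sum(i * pij for i, row in enumerate(p) for pij in row)
    mean_j = sum(j * row[j] for row in p for j in range(levels))
    var_i = sum((i - mean_i) ** 2 * pij for i, row in enumerate(p) for pij in row)
    var_j = sum((j - mean_j) ** 2 * row[j] for row in p for j in range(levels))

    contrast = sum((i - j) ** 2 * p[i][j] for i in range(levels) for j in range(levels))
    energy = sum(pij * pij for row in p for pij in row)
    homogeneity = sum(p[i][j] / (1 + abs(i - j)) for i in range(levels) for j in range(levels))
    denom = (var_i * var_j) ** 0.5
    correlation = (sum((i - mean_i) * (j - mean_j) * p[i][j]
                       for i in range(levels) for j in range(levels)) / denom) if denom else 0.0
    return {"contrast": contrast, "energy": energy,
            "homogeneity": homogeneity, "correlation": correlation}

# A tiny 4x4 map quantised to 4 grey levels, standing in for a QUS parametric map.
feats = glcm_features([[0, 0, 1, 1],
                       [0, 0, 1, 1],
                       [2, 2, 3, 3],
                       [2, 2, 3, 3]], levels=4)
```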

  12. Assessment of Parent-of-Origin Effects in Linkage Analysis of Quantitative Traits

    PubMed Central

    Hanson, Robert L.; Kobes, Sayuko; Lindsay, Robert S.; Knowler, William C.

    2001-01-01

    Methods are presented for incorporation of parent-of-origin effects into linkage analysis of quantitative traits. The estimated proportion of marker alleles shared identical by descent is first partitioned into a component derived from the mother and a component derived from the father. These parent-specific estimates of allele sharing are used in variance-components or Haseman-Elston methods of linkage analysis so that the effect of the quantitative-trait locus carried on the maternally derived chromosome is potentially different from the effect of the locus on the paternally derived chromosome. Statistics for linkage between trait and marker loci derived from either or both parents are then calculated, as are statistics for testing whether the effect of the maternally derived locus is equal to that of the paternally derived locus. Analyses of data simulated for 956 siblings from 263 nuclear families who had participated in a linkage study revealed that type I error rates for these statistics were generally similar to nominal values. Power to detect an imprinted locus was substantially increased when analyzed with a model allowing for parent-of-origin effects, compared with analyses that assumed equal effects; for example, for an imprinted locus accounting for 30% of the phenotypic variance, the expected LOD score was 4.5 when parent-of-origin effects were incorporated into the analysis, compared with 3.1 when these effects were ignored. The ability to include parent-of-origin effects within linkage analysis of quantitative traits will facilitate genetic dissection of complex traits. PMID:11254452
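
    A toy sketch of the Haseman-Elston variant of this idea: squared sib-pair trait differences are regressed separately on maternally and paternally derived IBD sharing, so an imprinted locus shows a (negative) slope only for the transmitting parent. The data below are synthetic and noise-free, purely to illustrate the mechanics.

```python
def ols_slope(x, y):
    """Slope of a simple least-squares regression of y on x."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    return sxy / sxx

# Synthetic sib pairs: parent-specific IBD sharing (0, 0.5, or 1) and squared
# trait differences constructed to depend on the maternal component only.
ibd_maternal = [0.0, 0.5, 1.0, 0.0, 0.5, 1.0]
ibd_paternal = [0.0, 1.0, 0.5, 1.0, 0.0, 0.5]
sq_diff = [4.0, 2.5, 1.0, 4.0, 2.5, 1.0]  # = 4 - 3 * ibd_maternal

slope_m = ols_slope(ibd_maternal, sq_diff)  # negative: maternal linkage signal
slope_p = ols_slope(ibd_paternal, sq_diff)  # zero: no paternal effect
```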

  13. Reconstructing Past Depositional and Diagenetic Processes through Quantitative Stratigraphic Analysis of the Martian Sedimentary Rock Record

    NASA Astrophysics Data System (ADS)

    Stack, Kathryn M.

    High-resolution orbital and in situ observations acquired of the Martian surface during the past two decades provide the opportunity to study the rock record of Mars at an unprecedented level of detail. This dissertation consists of four studies whose common goal is to establish new standards for the quantitative analysis of visible and near-infrared data from the surface of Mars. Through the compilation of global image inventories, application of stratigraphic and sedimentologic statistical methods, and use of laboratory analogs, this dissertation provides insight into the history of past depositional and diagenetic processes on Mars. The first study presents a global inventory of stratified deposits observed in images from the High Resolution Image Science Experiment (HiRISE) camera on-board the Mars Reconnaissance Orbiter. This work uses the widespread coverage of high-resolution orbital images to make global-scale observations about the processes controlling sediment transport and deposition on Mars. The next chapter presents a study of bed thickness distributions in Martian sedimentary deposits, showing how statistical methods can be used to establish quantitative criteria for evaluating the depositional history of stratified deposits observed in orbital images. The third study tests the ability of spectral mixing models to obtain quantitative mineral abundances from near-infrared reflectance spectra of clay and sulfate mixtures in the laboratory for application to the analysis of orbital spectra of sedimentary deposits on Mars. The final study employs a statistical analysis of the size, shape, and distribution of nodules observed by the Mars Science Laboratory Curiosity rover team in the Sheepbed mudstone at Yellowknife Bay in Gale crater. This analysis is used to evaluate hypotheses for nodule formation and to gain insight into the diagenetic history of an ancient habitable environment on Mars.

  14. Quantitative Analysis of STD-NMR Spectra of Reversibly Forming Ligand-Receptor Complexes

    NASA Astrophysics Data System (ADS)

    Krishna, N. Rama; Jayalakshmi, V.

    We describe our work on the quantitative analysis of STD-NMR spectra of reversibly forming ligand-receptor complexes. This analysis is based on the theory of complete relaxation and conformational exchange matrix analysis of saturation transfer (CORCEMA-ST) effects. As part of this work, we have developed two separate versions of the CORCEMA-ST program. The first version predicts the expected STD intensities for a given model of a ligand-protein complex, and compares them quantitatively with the experimental data. This version is very useful for rapidly determining if a model for a given ligand-protein complex is compatible with the STD-NMR data obtained in solution. It is also useful in determining the optimal experimental conditions for undertaking the STD-NMR measurements on a given complex by computer simulations. In the second version of the CORCEMA-ST program, we have implemented a torsion angle refinement feature for the bound ligand within the protein binding pocket. In this approach, the global minimum for the bound ligand conformation is obtained by a hybrid structure refinement protocol involving CORCEMA-ST calculation of intensities and simulated annealing refinement of torsion angles of the bound ligand using STD-NMR intensities as experimental constraints to minimize a pseudo-energy function. This procedure is useful in refining and improving the initial models based on crystallography, computer docking, or other procedures to generate models for the bound ligand within the protein binding pocket compatible with solution STD-NMR data. In this chapter we describe the properties of the STD-NMR spectra, including the dependence of the intensities on various parameters. We also describe the results of the CORCEMA-ST analyses of experimental STD-NMR data on some ligand-protein complexes to illustrate the quantitative analysis of the data using this method. This CORCEMA-ST program is likely to be useful in structure-based drug design efforts.

  15. Qualitative and quantitative analysis of calcium-based microfillers using terahertz spectroscopy and imaging.

    PubMed

    Abina, Andreja; Puc, Uroš; Jeglič, Anton; Prah, Jana; Venckevičius, Rimvydas; Kašalynas, Irmantas; Valušis, Gintaras; Zidanšek, Aleksander

    2015-10-01

    In different industrial applications, several strictly defined parameters of calcium-based microfillers, such as average particle size, particle size distribution, morphology, specific surface area, polymorphism, and chemical purity, play a key role in determining their usefulness and effectiveness. An analytical tool is therefore required for the rapid, non-destructive characterization of calcium-based microfillers during synthesis or before their use in a further manufacturing process. Since spectroscopic techniques are preferred over microscopy and thermogravimetry, particularly because of their non-destructive nature and short analysis times, we applied terahertz (THz) spectroscopy to analyse the concentration of calcite microfillers in a polymer matrix, their granulation, and their chemical treatment. Based on the analysis of peak absorbance amplitude, peak frequency position, and the appearance of additional spectral features, quantitative and qualitative analysis was successfully achieved. THz imaging was also applied for both quantitative and qualitative analysis of calcium-based microfillers: using a spatial distribution map, the inhomogeneity in the concentration of calcium carbonate in the polymer matrix was characterized. Moreover, different calcium compounds were detected in binary mixtures by THz spectroscopy and imaging. Finally, we demonstrated that the applied spectroscopic technique offers valuable results and can, in combination with other spectroscopic and microscopic techniques, serve as a powerful rapid analytical tool. PMID:26078145

  16. SILAC-Based Quantitative Proteomic Analysis of Diffuse Large B-Cell Lymphoma Patients.

    PubMed

    Rüetschi, Ulla; Stenson, Martin; Hasselblom, Sverker; Nilsson-Ehle, Herman; Hansson, Ulrika; Fagman, Henrik; Andersson, Per-Ola

    2015-01-01

    Diffuse large B-cell lymphoma (DLBCL), the most common lymphoma, is a heterogeneous disease where the outcome for patients with early relapse or refractory disease is very poor, even in the era of immunochemotherapy. In order to describe possible differences in global protein expression and network patterns, we performed a SILAC-based shotgun (LC-MS/MS) quantitative proteomic analysis in fresh-frozen tumor tissue from two groups of DLBCL patients with totally different clinical outcome: (i) early relapsed or refractory and (ii) long-term progression-free patients. We could identify over 3,500 proteins; more than 1,300 were quantified in all patients and 87 were significantly differentially expressed. By functional annotation analysis on the 66 proteins overexpressed in the progression-free patient group, we found an enrichment of proteins involved in the regulation and organization of the actin cytoskeleton. Also, five proteins from actin cytoskeleton regulation, applied in a supervised regression analysis, could discriminate the two patient groups. In conclusion, SILAC-based shotgun quantitative proteomic analysis appears to be a powerful tool to explore the proteome in DLBCL tumor tissue. Also, as progression-free patients had a higher expression of proteins involved in the actin cytoskeleton protein network, such a pattern indicates a functional role in the sustained response to immunochemotherapy. PMID:26060582

  17. An artificial neural network approach to laser-induced breakdown spectroscopy quantitative analysis

    NASA Astrophysics Data System (ADS)

    D'Andrea, Eleonora; Pagnotta, Stefano; Grifoni, Emanuela; Lorenzetti, Giulia; Legnaioli, Stefano; Palleschi, Vincenzo; Lazzerini, Beatrice

    2014-09-01

    The usual approach to laser-induced breakdown spectroscopy (LIBS) quantitative analysis is based on the use of calibration curves, suitably built using appropriate reference standards. More recently, statistical methods relying on the principles of artificial neural networks (ANNs) are increasingly used. However, ANN analysis is often used as a ‘black box’ system and the peculiarities of LIBS spectra are not fully exploited. An a priori exploration of the raw data contained in the LIBS spectra, carried out by a neural network that learns which areas of the spectrum are significant for a subsequent neural network responsible for the calibration, is able to reveal important information that is initially unknown, although already contained within the spectrum. This communication will demonstrate that an approach based on neural networks specially tailored for dealing with LIBS spectra provides a viable, fast and robust method for LIBS quantitative analysis. This would allow the use of a relatively limited number of reference samples for the training of the network, with respect to the current approaches, and provide a fully automatable approach for the analysis of a large number of samples.

  18. Common and distinct neural correlates of personal and vicarious reward: A quantitative meta-analysis.

    PubMed

    Morelli, Sylvia A; Sacchet, Matthew D; Zaki, Jamil

    2015-05-15

    Individuals experience reward not only when directly receiving positive outcomes (e.g., food or money), but also when observing others receive such outcomes. This latter phenomenon, known as vicarious reward, is a perennial topic of interest among psychologists and economists. More recently, neuroscientists have begun exploring the neuroanatomy underlying vicarious reward. Here we present a quantitative whole-brain meta-analysis of this emerging literature. We identified 25 functional neuroimaging studies that included contrasts between vicarious reward and a neutral control, and subjected these contrasts to an activation likelihood estimate (ALE) meta-analysis. This analysis revealed a consistent pattern of activation across studies, spanning structures typically associated with the computation of value (especially ventromedial prefrontal cortex) and mentalizing (including dorsomedial prefrontal cortex and superior temporal sulcus). We further quantitatively compared this activation pattern to activation foci from a previous meta-analysis of personal reward. Conjunction analyses yielded overlapping VMPFC activity in response to personal and vicarious reward. Contrast analyses identified preferential engagement of the nucleus accumbens in response to personal as compared to vicarious reward, and in mentalizing-related structures in response to vicarious as compared to personal reward. These data shed light on the common and unique components of the reward that individuals experience directly and through their social connections. PMID:25554428

  19. Quantitative Analysis in the General Chemistry Laboratory: Training Students to Analyze Individual Results in the Context of Collective Data

    ERIC Educational Resources Information Center

    Ling, Chris D.; Bridgeman, Adam J.

    2011-01-01

    Titration experiments are ideal for generating large data sets for use in quantitative-analysis activities that are meaningful and transparent to general chemistry students. We report the successful implementation of a sophisticated quantitative exercise in which the students identify a series of unknown acids by determining their molar masses…
