Multivariate Quantitative Chemical Analysis  

NASA Technical Reports Server (NTRS)

Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which it is necessary to know and control critical properties of products via quantitative chemical analyses of products. In addition to chemical composition, also used to determine such physical properties as densities and strengths.
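The two-component calibration described above can be sketched with a minimal least-squares model. This is an illustrative assumption, not the NASA method itself: it assumes the measured response of a mixture is a linear blend of the two pure-component responses, and solves for the single unknown fraction in closed form. All names and "spectra" below are hypothetical.

```python
def mixture_fraction(pure_a, pure_b, mixture):
    """Estimate the fraction x of component A in a two-component
    mixture, assuming a linear mixing model (an assumption, not
    the original method): mixture ~ x*pure_a + (1-x)*pure_b.
    Closed-form least-squares solution for the single unknown x."""
    diffs = [a - b for a, b in zip(pure_a, pure_b)]
    resid = [m - b for m, b in zip(mixture, pure_b)]
    num = sum(r * d for r, d in zip(resid, diffs))
    den = sum(d * d for d in diffs)
    return num / den

# Hypothetical responses of the two pure components at 5 channels
A = [1.0, 0.8, 0.2, 0.1, 0.05]
B = [0.1, 0.3, 0.9, 0.7, 0.40]
# Synthetic 30/70 blend
M = [0.3 * a + 0.7 * b for a, b in zip(A, B)]
print(round(mixture_fraction(A, B, M), 3))  # 0.3
```

With noisy real measurements the same formula returns the least-squares estimate of the blend ratio rather than an exact value.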

Kinchen, David G.; Capezza, Mary



Quantitative Hydrocarbon Surface Analysis  

NASA Technical Reports Server (NTRS)

The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

Douglas, Vonnie M.



Science Writers' Guide to Aqua  

NSDL National Science Digital Library

This guide provides an overview of the Aqua mission, instruments, research, science teams, and Aqua website. This information is provided to aid the professional science writer in writing stories and articles related to the Aqua mission. Note: this guide was produced before Aqua was launched; for the most recent information on Aqua, consult the Aqua mission website.




NSDL National Science Digital Library

In this activity, students construct their own rocket-powered boat called an "aqua-thruster." These aqua-thrusters will be made from a film canister and will use carbon dioxide gas, produced from a chemical reaction between an antacid tablet and water, to propel them. Students observe the effect that the surface area of this simulated solid rocket fuel has on thrust.

Integrated Teaching And Learning Program


A Quantitative Fitness Analysis Workflow  

PubMed Central

Quantitative Fitness Analysis (QFA) is an experimental and computational workflow for comparing fitnesses of microbial cultures grown in parallel [1-4]. QFA can be applied to focused observations of single cultures but is most useful for genome-wide genetic interaction or drug screens investigating up to thousands of independent cultures. The central experimental method is the inoculation of independent, dilute liquid microbial cultures onto solid agar plates which are incubated and regularly photographed. Photographs from each time-point are analyzed, producing quantitative cell density estimates, which are used to construct growth curves, allowing quantitative fitness measures to be derived. Culture fitnesses can be compared to quantify and rank genetic interaction strengths or drug sensitivities. The effect on culture fitness of any treatments added into substrate agar (e.g. small molecules, antibiotics or nutrients) or applied to plates externally (e.g. UV irradiation, temperature) can be quantified by QFA. The QFA workflow produces growth rate estimates analogous to those obtained by spectrophotometric measurement of parallel liquid cultures in 96-well or 200-well plate readers. Importantly, QFA has significantly higher throughput compared with such methods. QFA cultures grow on a solid agar surface and are therefore well aerated during growth without the need for stirring or shaking. QFA throughput is not as high as that of some Synthetic Genetic Array (SGA) screening methods [5,6]. However, since QFA cultures are heavily diluted before being inoculated onto agar, QFA can capture more complete growth curves, including exponential and saturation phases [3]. For example, growth curve observations allow culture doubling times to be estimated directly with high precision, as discussed previously [1]. Here we present a specific QFA protocol applied to thousands of S. cerevisiae cultures which are automatically handled by robots during inoculation, incubation and imaging.
Any of these automated steps can be replaced by an equivalent, manual procedure, with an associated reduction in throughput, and we also present a lower throughput manual protocol. The same QFA software tools can be applied to images captured in either workflow. We have extensive experience applying QFA to cultures of the budding yeast S. cerevisiae but we expect that QFA will prove equally useful for examining cultures of the fission yeast S. pombe and bacterial cultures. PMID:22907268
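The doubling-time estimation mentioned in the abstract can be illustrated with a much-simplified stand-in for QFA's growth-curve modelling: a log-linear least-squares fit over exponential-phase density estimates. This is a sketch under that assumption, not the QFA software's actual fitting procedure.

```python
import math

def doubling_time(times, densities):
    """Estimate culture doubling time from exponential-phase cell
    density estimates via a log-linear least-squares fit (a
    simplified stand-in for QFA's full growth-curve modelling)."""
    logs = [math.log(d) for d in densities]
    n = len(times)
    mt = sum(times) / n
    ml = sum(logs) / n
    slope = (sum((t - mt) * (l - ml) for t, l in zip(times, logs))
             / sum((t - mt) ** 2 for t in times))
    return math.log(2) / slope  # same units as `times` per doubling

# Synthetic exponential growth, doubling every 2 h
ts = [0, 1, 2, 3, 4]
ds = [1000 * 2 ** (t / 2) for t in ts]
print(round(doubling_time(ts, ds), 2))  # 2.0
```

In practice the fit would be restricted to time points identified as exponential phase; saturation-phase points would bias the slope downward.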

Lydall, D.A.



Aquae Urbis Romae: the Waters of the City of Rome  

Microsoft Academic Search

Summary form only given. This presentation will provide an introduction to and demonstration of Aquae Urbis Romae: the Waters of the City of Rome. Aquae Urbis Romae is a synoptic analysis of the hydrological and hydraulic history of Rome that presents water as a single continuum related to nearly 3000 years of urban development. It is published as a free, public, interactive World Wide Web site.

K. W. Rinne; Jesús Lorés; Emili Junyent



Quantitative Analysis of Transcript Accumulation  

E-print Network

) to as great as 100:1. MATERIALS AND METHODS Plant Materials and RNA Extractions Upland cotton (Gos) and three natural homoeologous gene pairs expressed in tetraploid cotton (Gossypium hirsutum) ovules to resolve highly similar gene products [>98% identical in polyploid cotton (6)] and to provide quantitative

Wendel, Jonathan F.


Aqua Education and Public Outreach  

NASA Astrophysics Data System (ADS)

NASA's Aqua satellite was launched on May 4, 2002, with six instruments designed to collect data about the Earth's atmosphere, biosphere, hydrosphere, and cryosphere. Since the late 1990s, the Aqua mission has involved considerable education and public outreach (EPO) activities, including printed products, formal education, an engineering competition, webcasts, and high-profile multimedia efforts. The printed products include Aqua and instrument brochures, an Aqua lithograph, Aqua trading cards, NASA Fact Sheets on Aqua, the water cycle, and weather forecasting, and an Aqua science writers' guide. On-going formal education efforts include the Students' Cloud Observations On-Line (S'COOL) Project, the MY NASA DATA Project, the Earth System Science Education Alliance, and, in partnership with university professors, undergraduate student research modules. Each of these projects incorporates Aqua data into its inquiry-based framework. Additionally, high school and undergraduate students have participated in summer internship programs. An earlier formal education activity was the Aqua Engineering Competition, which was a high school program sponsored by the NASA Goddard Space Flight Center, Morgan State University, and the Baltimore Museum of Industry. The competition began with the posting of a Round 1 Aqua-related engineering problem in December 2002 and concluded in April 2003 with a final round of competition among the five finalist teams. The Aqua EPO efforts have also included a wide range of multimedia products. Prior to launch, the Aqua team worked closely with the Special Projects Initiative (SPI) Office to produce a series of live webcasts on Aqua science and the Cool Science website, which displays short video clips of Aqua scientists and engineers explaining the many aspects of the Aqua mission. 
These video clips, the Aqua website, and numerous presentations have benefited from dynamic visualizations showing the Aqua launch, instrument deployments, instrument sensing, and the Aqua orbit. More recently, in 2008 the Aqua team worked with the ViewSpace production team from the Space Telescope Science Institute to create an 18-minute ViewSpace feature showcasing the science and applications of the Aqua mission. Then in 2010 and 2011, Aqua and other NASA Earth-observing missions partnered with National CineMedia on the "Know Your Earth" (KYE) project. During January and July 2010 and 2011, KYE ran 2-minute segments highlighting questions that promoted global climate literacy on lobby LCD screens in movie theaters throughout the U.S. Among the ongoing Aqua EPO efforts is the incorporation of Aqua data sets onto the Dynamic Planet, a large digital video globe that projects a wide variety of spherical data sets. Aqua also has a highly successful collaboration with EarthSky communications on the production of an Aqua/EarthSky radio show and podcast series. To date, eleven productions have been completed and distributed via the EarthSky network. In addition, a series of eight video podcasts (i.e., vodcasts) are under production by NASA Goddard TV in conjunction with Aqua personnel, highlighting various aspects of the Aqua mission.

Graham, S. M.; Parkinson, C. L.; Chambers, L. H.; Ray, S. E.



Analysis of Raman Lidar and radiosonde measurements from the AWEX-G field campaign and its relation to Aqua validation  

NASA Technical Reports Server (NTRS)

Early work within the Aqua validation activity revealed there to be large differences in water vapor measurement accuracy among the various technologies in use for providing validation data. The validation measurements were made at globally distributed sites making it difficult to isolate the sources of the apparent measurement differences among the various sensors, which included both Raman lidar and radiosonde. Because of this, the AIRS Water Vapor Experiment-Ground (AWEX-G) was held in October - November, 2003 with the goal of bringing validation technologies to a common site for intercomparison and resolution of the measurement discrepancies. Using the University of Colorado Cryogenic Frostpoint Hygrometer (CFH) as the water vapor reference, the AWEX-G field campaign resulted in new correction techniques for both Raman lidar, Vaisala RS80-H and RS90/92 measurements that significantly improve the absolute accuracy of those measurement systems particularly in the upper troposphere. Mean comparisons of radiosondes and lidar are performed demonstrating agreement between corrected sensors and the CFH to generally within 5% thereby providing data of sufficient accuracy for Aqua validation purposes. Examples of the use of the correction techniques in radiance and retrieval comparisons are provided and discussed.

Whiteman, D. N.; Russo, F.; Demoz, B.; Miloshevich, L. M.; Veselovskii, I.; Hannon, S.; Wang, Z.; Vomel, H.; Schmidlin, F.; Lesht, B.



Analysis of Raman Lidar and Radiosonde Measurements from the AWEX-G Field Campaign and Its Relation to Aqua Validation  

NASA Technical Reports Server (NTRS)

Early work within the Aqua validation activity revealed there to be large differences in water vapor measurement accuracy among the various technologies in use for providing validation data. The validation measurements were made at globally distributed sites making it difficult to isolate the sources of the apparent measurement differences among the various sensors, which included both Raman lidar and radiosonde. Because of this, the AIRS Water Vapor Experiment-Ground (AWEX-G) was held in October-November 2003 with the goal of bringing validation technologies to a common site for intercomparison and resolving the measurement discrepancies. Using the University of Colorado Cryogenic Frostpoint Hygrometer (CFH) as the water vapor reference, the AWEX-G field campaign permitted correction techniques to be validated for Raman lidar, Vaisala RS80-H and RS90/92 that significantly improve the absolute accuracy of water vapor measurements from these systems particularly in the upper troposphere. Mean comparisons of radiosondes and lidar are performed demonstrating agreement between corrected sensors and the CFH to generally within 5% thereby providing data of sufficient accuracy for Aqua validation purposes. Examples of the use of the correction techniques in radiance and retrieval comparisons are provided and discussed.

Whiteman, D. N.; Russo, F.; Demoz, B.; Miloshevich, L. M.; Veselovskii, I.; Hannon, S.; Wang, Z.; Vomel, H.; Schmidlin, F.; Lesht, B.; Moore, P. J.; Beebe, A. S.; Gambacorta, A.; Barnet, C.



Christhin: Quantitative Analysis of Thin Layer Chromatography  

E-print Network

Manual for Christhin 0.1.36. Christhin (Chromatography Riser Thin) is software developed for the quantitative analysis of data obtained from thin-layer chromatography (TLC). Once installed on your computer, the program is very easy to use, and provides data quickly and accurately. This manual describes the program, and reading it should be enough to use the program properly.

Barchiesi, Maximiliano; Renaudo, Carlos; Rossi, Pablo; Pramparo, María de Carmen; Nepote, Valeria; Grosso, Nelson Ruben; Gayol, María Fernanda



Method and apparatus for chromatographic quantitative analysis  


An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.
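The quantitation step at the conductivity cell can be sketched as peak-area integration of the detector trace. This is a toy illustration (trapezoidal rule, constant baseline, unit sample spacing), not the patented apparatus's actual signal processing; all names and data are hypothetical.

```python
def peak_area(signal, baseline):
    """Net area of a detector trace above a constant baseline,
    a toy stand-in for integrating one eluting anion's
    conductivity peak for quantitation (trapezoidal rule,
    unit sample spacing; illustrative only)."""
    net = [max(s - baseline, 0.0) for s in signal]
    return sum((a + b) / 2 for a, b in zip(net, net[1:]))

trace = [1.0, 1.0, 3.0, 5.0, 3.0, 1.0, 1.0]  # synthetic conductivity peak
print(peak_area(trace, 1.0))  # 8.0
```

A real chromatography workstation would also fit the baseline and resolve overlapping peaks before integrating.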

Fritz, James S. (Ames, IA); Gjerde, Douglas T. (Ames, IA); Schmuckler, Gabriella (Haifa, IL)



Comprehensive quantitative analysis on privacy leak behavior.

PubMed Central
Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046
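The "overall leak degree" combining the four PPN-based metrics can be sketched as follows. The paper's exact aggregation formula is not reproduced here; a weighted arithmetic mean over metrics normalized to [0, 1] is an assumption made purely for illustration.

```python
def overall_leak_degree(possibility, severity, crypticity, manipulability,
                        weights=(0.25, 0.25, 0.25, 0.25)):
    """Combine the four PPN-based privacy-leak metrics into one
    comparable score. Weighted arithmetic mean is an assumed
    aggregation, not the paper's actual formula."""
    metrics = (possibility, severity, crypticity, manipulability)
    assert all(0.0 <= m <= 1.0 for m in metrics), "metrics assumed in [0, 1]"
    return sum(w * m for w, m in zip(weights, metrics))

# Hypothetical scores for one software sample
print(overall_leak_degree(0.8, 0.6, 0.4, 0.2))  # 0.5
```

A single scalar like this is what makes ranking the leak behavior of different software samples straightforward.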

Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan




Quantitative analysis of colony morphology in yeast  

PubMed Central

Microorganisms often form multicellular structures such as biofilms and structured colonies that can influence the organism’s virulence, drug resistance, and adherence to medical devices. Phenotypic classification of these structures has traditionally relied on qualitative scoring systems that limit detailed phenotypic comparisons between strains. Automated imaging and quantitative analysis have the potential to improve the speed and accuracy of experiments designed to study the genetic and molecular networks underlying different morphological traits. For this reason, we have developed a platform that uses automated image analysis and pattern recognition to quantify phenotypic signatures of yeast colonies. Our strategy enables quantitative analysis of individual colonies, measured at a single time point or over a series of time-lapse images, as well as the classification of distinct colony shapes based on image-derived features. Phenotypic changes in colony morphology can be expressed as changes in feature space trajectories over time, thereby enabling the visualization and quantitative analysis of morphological development. To facilitate data exploration, results are plotted dynamically through an interactive Yeast Image Analysis web application (YIMAA) that integrates the raw and processed images across all time points, allowing exploration of the image-based features and principal components associated with morphological development. PMID:24447135
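The "image-derived features" idea can be illustrated with two of the simplest possible measurements on a binary colony mask: area and boundary-pixel count. This is a toy sketch, far simpler than the feature set the YIMAA platform actually computes; the function and data are hypothetical.

```python
def colony_features(mask):
    """Two toy image-derived features for a binary colony mask
    (list of rows of 0/1): pixel area and boundary-pixel count.
    A heavily simplified stand-in for a real feature extractor."""
    area = sum(sum(row) for row in mask)
    h, w = len(mask), len(mask[0])
    boundary = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x]:
                nbrs = [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
                # A colony pixel is boundary if any 4-neighbour is
                # outside the image or outside the colony.
                if any(not (0 <= ny < h and 0 <= nx < w) or not mask[ny][nx]
                       for ny, nx in nbrs):
                    boundary += 1
    return area, boundary

square = [[1, 1], [1, 1]]
print(colony_features(square))  # (4, 4)
```

Tracking such features across time-lapse frames gives exactly the kind of feature-space trajectory the abstract describes.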

Ruusuvuori, Pekka; Lin, Jake; Scott, Adrian C.; Tan, Zhihao; Sorsa, Saija; Kallio, Aleksi; Nykter, Matti; Yli-Harja, Olli; Shmulevich, Ilya; Dudley, Aimee M.



Informatics and Quantitative Analysis in Biological Imaging  

NSDL National Science Digital Library

Biological imaging is now a quantitative technique for probing cellular structure and dynamics and is increasingly used for cell-based screens. However, the bioinformatics tools required for hypothesis-driven analysis of digital images are still immature. We are developing the Open Microscopy Environment (OME) as an informatics solution for the storage and analysis of optical microscope image data. OME aims to automate image analysis, modeling, and mining of large sets of images and specifies a flexible data model, a relational database, and an XML-encoded file standard that is usable by potentially any software tool. With this design, OME provides a first step toward biological image informatics.

Jason Swedlow (University of Dundee); Ilya Goldberg (National Institutes of Health, Laboratory of Genetics, National Institute on Aging); Erik Brauner (Harvard Medical School, Institute of Chemistry and Cell Biology); Peter K. Sorger (Harvard Medical School/Massachusetts Institute of Technology)



Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis  

NASA Technical Reports Server (NTRS)

Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity in modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
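The prioritization of scenarios by severity, likelihood, and modeling difficulty can be sketched as a simple scored ranking. The scoring rule below (severity times likelihood, discounted by difficulty) is an illustrative assumption, not the paper's framework; the scenario names and scales are invented.

```python
def prioritize(scenarios):
    """Rank hazard scenarios for quantitative analysis: high
    severity and likelihood rank first, and high modeling
    difficulty discounts a scenario. The score is an assumed
    illustration, not the paper's actual metric combination."""
    def score(s):
        name, severity, likelihood, difficulty = s
        return severity * likelihood / difficulty
    return sorted(scenarios, key=score, reverse=True)

# (name, severity, likelihood, modeling difficulty) on invented 1-5 scales
hazards = [
    ("wake encounter", 4, 3, 2),
    ("runway incursion", 5, 1, 4),
    ("blunder", 3, 3, 1),
]
print([h[0] for h in prioritize(hazards)])
```

The ordering surfaces scenarios where quantitative modeling is both worthwhile (risk) and tractable (difficulty).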

Shortle, J. F.; Allocco, M.



Aqua Satellite Mission Educational Outreach  

NASA Astrophysics Data System (ADS)

An important component of the Aqua mission, launched into space on May 4, 2002 with a suite of six instruments from the U.S., Japan, and Brazil, is the effort to educate the public about the mission and the science topics that it addresses. This educational outreach includes printed products, web casts, other web-based materials, animations, presentations, and a student contest. The printed products include brochures for the mission as a whole and for the instruments, NASA Fact Sheets on the mission, the water cycle, and weather forecasting, an Aqua Science Writers' Guide, an Aqua lithograph, posters, and trading cards. Animations include animations of the launch, the orbit, instrument deployments, instrument sensing, and several of the data products. Each of these materials is available on the Aqua web site, as are archived versions of the eight Aqua web casts. The web casts were done live on the internet and focused on the spacecraft, the science, the launch, and the validation efforts. All web casts had key Aqua personnel as live guests and had a web-based chat session allowing viewers to ask questions. Other web-based materials include a "Cool Science" section of the website, with videos of Aqua scientists and engineers speaking about Aqua and the science and engineering behind it, arranged in a framework organized for the convenience of teachers dealing with core curriculum requirements. The web casts and "Cool Science" site were produced by the Special Project Initiatives Office at NASA's Goddard Space Flight Center. Outreach presentations about Aqua have been given at schools, universities, and public forums at many locations around the world, especially in the U.S. A competition was held for high school students during the 2002-03 school year, culminating in April 2003, with five finalist teams competing for the top slots, followed by an awards ceremony.
The competition had all the student teams analyzing an anomalous situation encountered by Aqua shortly after launch and the five finalist teams determining how best to handle a hypothetical degradation of the solid state recorder.

Parkinson, C. L.; Graham, S. M.



Journal of Financial and Quantitative Analysis (JFQA)  

NSDL National Science Digital Library

The Journal of Financial and Quantitative Analysis is published by the School of Business Administration at the University of Washington, Seattle. It publishes theoretical and empirical research in financial economics. The web site details the instructions on submitting articles for publications; table of contents and abstracts of previous articles are also accessible. Full text of articles that have been accepted for publication are available in Acrobat format.



Influence analysis in quantitative trait loci detection.  


This paper presents systematic methods for the detection of influential individuals that affect the log odds (LOD) score curve. We derive general formulas of influence functions for profile likelihoods and introduce them into two standard quantitative trait locus detection methods-the interval mapping method and single marker analysis. Besides influence analysis on specific LOD scores, we also develop influence analysis methods on the shape of the LOD score curves. A simulation-based method is proposed to assess the significance of the influence of the individuals. These methods are shown useful in the influence analysis of a real dataset of an experimental population from an F2 mouse cross. By receiver operating characteristic analysis, we confirm that the proposed methods show better performance than existing diagnostics. PMID:24740424

Dou, Xiaoling; Kuriki, Satoshi; Maeno, Akiteru; Takada, Toyoyuki; Shiroishi, Toshihiko



Environmental Projects in the Quantitative Analysis Lab  

NASA Astrophysics Data System (ADS)

This article describes a revised design of the Quantitative Analysis course that includes a laboratory emphasizing teamwork, experimental design and hands-on exposure to instrumentation for environmental analysis. In the laboratory, spreadsheet use is introduced during the first week. Over the next six weeks, students individually conduct one gravimetric and three titrimetric analyses. Five investigative rotations are conducted using instrumental methods such as HPLC, GC, IC, FTIR, AA and UV-VIS over a seven week period, with students working in groups of three. Such an approach allows efficient and intensive use of expensive instruments that have traditionally not been a major component of the Quantitative Analysis laboratory. In the rotation experiments, there is strong emphasis on development of hypotheses, experimental design and environmental sampling. It is required that one experiment utilize solid phase extraction for sample preparation. Within groups, each person has the specific duties of project manager, chemist or instrument specialist. Evaluation emphasizes data analysis and interpretation. These changes in the laboratory have been facilitated by modifications to the traditional lecture sequence. Methods of analysis are now discussed in the first nine weeks of the semester, and chemical equilibria are the focus of the final six weeks.

Weidenhamer, Jeffrey D.



Aqua satellite orbiting the Earth  

NASA Video Gallery

This animation shows the Aqua satellite orbiting the Earth on August 27, 2005 by revealing MODIS true-color imagery for that day. This animation is on a cartesian map projection, so the satellite w...


Quantitative resilience analysis through control design.  

SciTech Connect

Critical infrastructure resilience has become a national priority for the U. S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.
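One common way to make resilience quantitative, in the spirit of the report's goal, is to measure the cumulative gap between nominal and disrupted system performance over a disruption-and-recovery episode. The function below is an illustrative measure of that kind, not the report's specific formulation; the data are synthetic.

```python
def resilience_loss(nominal, disrupted):
    """Cumulative performance gap between nominal and disrupted
    system trajectories over unit time steps. A generic
    quantitative-resilience measure used here for illustration,
    not the formulation developed in the report."""
    return sum(n - d for n, d in zip(nominal, disrupted))

nom = [1.0] * 5                       # nominal performance
dis = [1.0, 0.4, 0.6, 0.8, 1.0]       # disruption, then recovery
print(resilience_loss(nom, dis))
```

Faster recovery shrinks the gap, so control strategies can be compared by how much loss they avert under the same disruption.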

Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris (Sandia National Laboratories, Carlsbad, NM)



Quantitative surface analysis by total electron yield.  


When the surface of a solid sample is irradiated under vacuum by x-rays, an electron emission, owing to photoabsorption, can be measured. When the electrons are detected without regard to their kinetic energies, the total electron yield (TEY) is determined. With a tunable x-ray monochromator the TEY is measured below and above one of the absorption edges of a given element. A jumplike increase of the TEY signal, due to the additional photoabsorption in the corresponding atomic level, can be observed (qualitative analysis). The height of this jump can be correlated to the concentration (quantitative analysis). A fundamental parameter approach for primary and secondary excitations shows how to use TEY for quantitative analysis. The information depth lambda of this new method is approximately 2-400 nm, depending on the chemical elements and on the original kinetic energies of Auger and photoelectrons. Thus, TEY is located between photoelectron spectrometry and x-ray fluorescence analysis. PMID:15048496
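The edge-jump measurement can be sketched as the difference between mean TEY signal just above and just below the edge energy. This direct estimate is a simplification for illustration; the paper itself relates the jump to concentration through a fundamental-parameter model. The data below are synthetic.

```python
def edge_jump(energies, tey, edge):
    """TEY jump at an absorption edge: mean signal at energies
    at or above the edge minus mean signal below it. A
    simplified estimate, not the paper's fundamental-parameter
    treatment."""
    below = [s for e, s in zip(energies, tey) if e < edge]
    above = [s for e, s in zip(energies, tey) if e >= edge]
    return sum(above) / len(above) - sum(below) / len(below)

E = [700, 705, 710, 715, 720, 725]          # eV (synthetic scan)
S = [1.0, 1.1, 1.0, 2.0, 2.1, 2.0]          # TEY with a step at ~712 eV
print(round(edge_jump(E, S, 712), 3))  # 1.0
```

In a real analysis the pre- and post-edge regions would be fitted and extrapolated to the edge energy rather than simply averaged.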

Ebel, H; Zagler, N; Svagera, R; Ebel, M; Kaitna, R



Quantitative Analysis of Surface Barkhausen Noise Measurements  

NASA Astrophysics Data System (ADS)

Barkhausen noise is the result of abrupt magnetic domain activity around pinning sites in ferromagnetic materials. Our recent work has investigated permeability-independent measurements of Barkhausen noise, made possible by directly controlling the magnetic circuit flux and its derivative. This approach, as opposed to control of the excitation coil current used in many other studies, significantly reduces sensitivity to lift-off, improves on measurement reproducibility and also improves analysis capability. Here, quantitative measurement and analysis techniques that can be applied to the measured waveforms are demonstrated. These start with single Barkhausen events, and expand to all the measured events in a full sweep around the hysteresis loop. A set of non-redundant parameters, useful for the characterization and analysis of Barkhausen noise, is presented.
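One of the simplest single-number parameters derivable from a measured Barkhausen waveform is its RMS amplitude. This is a generic illustration of waveform parameterization, not one of the specific non-redundant parameters the study presents; the signal is synthetic.

```python
def barkhausen_rms(signal):
    """Root-mean-square amplitude of a Barkhausen noise record
    about its mean. A generic waveform parameter, shown for
    illustration; not the paper's specific parameter set."""
    n = len(signal)
    mean = sum(signal) / n
    return (sum((s - mean) ** 2 for s in signal) / n) ** 0.5

print(barkhausen_rms([1.0, -1.0, 1.0, -1.0]))  # 1.0
```

Parameters like this, computed per Barkhausen event and over full hysteresis-loop sweeps, are what enable quantitative comparison across measurements.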

White, S.; Krause, T.; Clapham, L.



Quantitative textural analysis of phenocryst zoning patterns  

NASA Astrophysics Data System (ADS)

The textural complexity of phenocrysts has made quantitative analysis of large populations of crystals a challenging study. Because each phenocryst expresses a unique localized event in the volcanic interior, no single crystal necessarily records the complete pre-eruptive history of the magmatic system as a whole. Synthesizing the textural and compositional records of many crystals, however, should provide a more complete understanding of conditions prior to eruption. In this research, we present new techniques for quantitative analysis of individual crystals and across populations of crystals. We apply those techniques to back-scattered electron images of complexly zoned plagioclase from El Chichón volcano, Mexico. Analysis begins with Gaussian filtering to remove noise from the images and create more qualitatively distinct zoning patterns. Because pixel intensity is directly correlated with Anorthite content, compositional anisotropy is then calculated throughout each image by determining the distance from a grid point at which variation in pixel intensity exceeds a pre-determined standard deviation; both regular and adaptive grid spacings are used, and length scales are calculated in 8 directions. The resulting textural maps are analogous to a vector field and quantify 2-dimensional variation in texture. With both types of grid spacing, changes in magnitude and orientation of textural anisotropy and length scale indicate different crystal zones. The adaptive grid spacing, however, describes non-uniform textural variation more completely and has a higher measurement density in regions of high-frequency variation. In general, textural regions commonly described as clean or smooth show longer length scales and aligned anisotropies, whereas shorter length scales with variable anisotropies identify areas commonly described as patchy, dusty, or rough. 
The comparison and correlation of textural and compositional zoning help determine how different crystals record the same magmatic event. This analytical technique presents a systematic method for quantitative description of crystal textures and permits the evaluation of large populations of crystals.
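The length-scale measurement described above can be illustrated in one dimension: walk outward from a grid point along an intensity profile until the intensity deviates from the starting value by more than a threshold. The real method works in 2-D along 8 directions with a standard-deviation criterion; this 1-D sketch with a fixed threshold is a simplification, and the data are synthetic.

```python
def length_scale(row, start, threshold):
    """Distance from index `start` along a 1-D intensity profile
    until intensity deviates from the starting value by more
    than `threshold`. A 1-D sketch of the 8-direction, 2-D
    textural length-scale measurement described above."""
    base = row[start]
    for step in range(1, len(row) - start):
        if abs(row[start + step] - base) > threshold:
            return step
    return len(row) - start - 1  # threshold never exceeded in the profile

profile = [10, 10, 11, 10, 25, 26, 25]  # smooth zone, then a sharp boundary
print(length_scale(profile, 0, 5))  # 4
```

Long length scales with aligned directions correspond to the "clean" crystal zones; short, variable ones to the "patchy" or "dusty" zones.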

Niespolo, E.; Andrews, B. J.



Quantitative analysis of NMR spectra with chemometrics  

NASA Astrophysics Data System (ADS)

The number of applications of chemometrics to series of NMR spectra is rapidly increasing due to an emerging interest for quantitative NMR spectroscopy e.g. in the pharmaceutical and food industries. This paper gives an analysis of advantages and limitations of applying the two most common chemometric procedures, Principal Component Analysis (PCA) and Multivariate Curve Resolution (MCR), to a designed set of 231 simple alcohol mixture (propanol, butanol and pentanol) 1H 400 MHz spectra. The study clearly demonstrates that the major advantage of chemometrics is the visualisation of larger data structures which adds a new exploratory dimension to NMR research. While robustness and powerful data visualisation and exploration are the main qualities of the PCA method, the study demonstrates that the bilinear MCR method is an even more powerful method for resolving pure component NMR spectra from mixtures when certain conditions are met.
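
The PCA step applied to such mixture spectra can be sketched as below. The three Gaussian "peaks" standing in for propanol, butanol and pentanol resonances are invented for illustration; real 1H spectra are far richer, and the paper's designed set has 231 mixtures rather than the 60 simulated here.

```python
import numpy as np

rng = np.random.default_rng(1)
axis = np.linspace(0, 10, 500)          # mock chemical-shift axis (ppm)

def peak(center, width=0.15):
    return np.exp(-((axis - center) ** 2) / (2 * width ** 2))

pure = np.vstack([peak(2.0), peak(4.5), peak(7.0)])   # 3 pure-component spectra
conc = rng.dirichlet(np.ones(3), size=60)             # 60 random mixture designs
X = conc @ pure + rng.normal(0, 0.01, (60, 500))      # mixture spectra + noise

# PCA via SVD of the mean-centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)
scores = U[:, :2] * s[:2]               # sample coordinates for score plots

# With 3 concentrations summing to 1, two PCs capture nearly all variance.
print(explained[:3], scores.shape)
```

Plotting the two score columns gives exactly the kind of data-structure visualisation the abstract credits to chemometrics.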

Winning, H.; Larsen, F. H.; Bro, R.; Engelsen, S. B.



Quantitative analysis of spirality in elliptical galaxies  

NASA Astrophysics Data System (ADS)

We use an automated galaxy morphology analysis method to quantitatively measure the spirality of galaxies classified manually as elliptical. The data set used for the analysis consists of 60,518 galaxy images with redshift obtained by the Sloan Digital Sky Survey (SDSS) and classified manually by Galaxy Zoo, as well as the RC3 and NA10 catalogues. We measure the spirality of the galaxies by using the Ganalyzer method, which transforms the galaxy image to its radial intensity plot to detect galaxy spirality that is in many cases difficult to notice by manual observation of the raw galaxy image. Experimental results using manually classified elliptical and S0 galaxies with redshift <0.3 suggest that galaxies classified manually as elliptical and S0 exhibit a nonzero spirality signal. These results suggest that the human eye observing the raw galaxy image might not always be the most effective way of detecting spirality and curves in the arms of galaxies.
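
The core of a radial-intensity-plot transformation can be sketched as follows. This is an illustrative stand-in, not the published Ganalyzer algorithm: the synthetic images and the drift-of-peak-angle spirality measure are assumptions made for demonstration.

```python
import numpy as np

def radial_intensity_peaks(img, center, radii, n_theta=360):
    """For each radius, return the polar angle of maximum intensity."""
    cy, cx = center
    thetas = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    peaks = []
    for r in radii:
        ys = np.clip(np.round(cy + r * np.sin(thetas)).astype(int), 0, img.shape[0] - 1)
        xs = np.clip(np.round(cx + r * np.cos(thetas)).astype(int), 0, img.shape[1] - 1)
        peaks.append(thetas[np.argmax(img[ys, xs])])
    return np.array(peaks)

# Synthetic images: a one-armed spiral (peak angle drifts with radius)
# versus a smooth, mildly lopsided non-spiral blob (peak angle ~constant).
yy, xx = np.mgrid[0:101, 0:101]
r = np.hypot(yy - 50, xx - 50) + 1e-9
ang = np.arctan2(yy - 50, xx - 50) % (2 * np.pi)
spiral = np.exp(-((ang - (0.1 * r) % (2 * np.pi)) ** 2) / 0.1) * np.exp(-r / 30)
blob = (1.0 + 0.2 * np.cos(ang - 1.0)) * np.exp(-r / 100.0)

radii = np.arange(5, 40, 5)
drift_spiral = np.ptp(np.unwrap(radial_intensity_peaks(spiral, (50, 50), radii)))
drift_blob = np.ptp(np.unwrap(radial_intensity_peaks(blob, (50, 50), radii)))
print(drift_spiral, drift_blob)   # spiral arms show a clear angular drift
```

A spiral arm appears in the radial intensity plot as a peak whose angle shifts systematically with radius; a non-spiral pattern does not.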

Dojcsak, Levente; Shamir, Lior



Virtual bronchoscopy for quantitative airway analysis  

NASA Astrophysics Data System (ADS)

We propose a new quantitative method for detailed analysis of the major airways. Using a 3D MDCT chest image as input, the method involves three major steps: (1) segmentation of the airway tree, (2) extraction of the central-axis structure of the major airways, and (3) a novel improvement on the standard full-width half-maximum approach for airway-wall delineation. The method produces measurements for all defined tree branches. These measurements include various airway diameters and cross-sectional area values. To facilitate the examination of these measurements, we also demonstrate an integrated virtual-bronchoscopic analysis system that enables flexible interrogation of the airways. Of particular note are techniques for unraveling and viewing the topography of selected airways. A large series of phantom and human tests confirm the efficacy of our methods.
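
The standard full-width half-maximum (FWHM) rule that step (3) improves on can be sketched in one dimension. The profile below is a hypothetical intensity trace along a ray cast outward through the airway wall; the numbers are invented, and this is not the authors' improved method.

```python
import numpy as np

def fwhm_wall(profile):
    """Sub-voxel inner/outer wall positions via the half-maximum rule."""
    p = np.asarray(profile, float)
    k = int(np.argmax(p))                     # wall peak along the ray
    half_in = (p[k] + p[0]) / 2.0             # half level on the lumen side
    half_out = (p[k] + p[-1]) / 2.0           # half level on the parenchyma side
    inner = np.interp(half_in, p[:k + 1], np.arange(k + 1))
    xs = np.arange(k, p.size)
    outer = np.interp(half_out, p[k:][::-1], xs[::-1])  # falling edge, reversed
    return inner, outer

# Mock HU profile: dark lumen, bright wall centred at 5 mm, dark parenchyma.
x = np.linspace(0.0, 10.0, 101)               # mm along the ray
profile = -900.0 + 900.0 * np.exp(-((x - 5.0) ** 2) / 0.5)
inner, outer = fwhm_wall(profile)
thickness = (outer - inner) * (x[1] - x[0])   # FWHM wall thickness, mm
print(thickness)
```

For the Gaussian wall above, the FWHM thickness equals 2.355 times its sigma of 0.5 mm, about 1.18 mm, which the interpolation recovers.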

Kiraly, Atilla P.; Reinhardt, Joseph M.; Hoffman, Eric A.; McLennan, Geoffrey; Higgins, William E.



Aqua 10 Years After Launch  

NASA Technical Reports Server (NTRS)

A little over ten years ago, in the early morning hours of May 4, 2002, crowds of spectators stood anxiously watching as the Delta II rocket carrying NASA's Aqua spacecraft lifted off from its launch pad at Vandenberg Air Force Base in California at 2:55 a.m. The rocket quickly went through a low-lying cloud cover, after which the main portion of the rocket fell to the waters below and the rocket's second stage proceeded to carry Aqua south across the Pacific, onward over Antarctica, and north to Africa, where the spacecraft separated from the rocket 59.5 minutes after launch. Then, 12.5 minutes later, the solar array unfurled over Europe, and Aqua was on its way in the first of what by now have become over 50,000 successful orbits of the Earth.

Parkinson, Claire L.



Quantitative local analysis of nonlinear systems  

NASA Astrophysics Data System (ADS)

This thesis investigates quantitative methods for local robustness and performance analysis of nonlinear dynamical systems with polynomial vector fields. We propose measures to quantify systems' robustness against uncertainties in initial conditions (regions-of-attraction) and external disturbances (local reachability/gain analysis). S-procedure and sum-of-squares relaxations are used to translate Lyapunov-type characterizations to sum-of-squares optimization problems. These problems are typically bilinear/nonconvex (due to local analysis rather than global) and their size grows rapidly with state/uncertainty space dimension. Our approach is based on exploiting system theoretic interpretations of these optimization problems to reduce their complexity. We propose a methodology incorporating simulation data in formal proof construction enabling more reliable and efficient search for robustness and performance certificates compared to the direct use of general purpose solvers. This technique is adapted both to region-of-attraction and reachability analysis. We extend the analysis to uncertain systems by taking an intentionally simplistic and potentially conservative route, namely employing parameter-independent rather than parameter-dependent certificates. The conservatism is simply reduced by a branch-and-bound type refinement procedure. The main thrust of these methods is their suitability for parallel computing achieved by decomposing otherwise challenging problems into relatively tractable smaller ones. We demonstrate proposed methods on several small/medium size examples in each chapter and apply each method to a benchmark example with an uncertain short period pitch axis model of an aircraft. Additional practical issues leading to a more rigorous basis for the proposed methodology as well as promising further research topics are also addressed.
We show that stability of linearized dynamics is not only necessary but also sufficient for the feasibility of the formulations in region-of-attraction analysis. Furthermore, we generalize an upper bound refinement procedure in local reachability/gain analysis which effectively generates non-polynomial certificates from polynomial ones. Finally, broader applicability of optimization-based tools stringently depends on the availability of scalable/hierarchical algorithms. As an initial step toward this direction, we propose a local small-gain theorem and apply it to stability region analysis in the presence of unmodeled dynamics.
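
The flavor of region-of-attraction estimation with a quadratic Lyapunov function can be illustrated as below. This is a stand-in, not the thesis' sum-of-squares machinery: the level set is only checked by dense sampling, which suggests rather than proves the certificate, and the reverse-time Van der Pol system is a standard benchmark chosen here for illustration.

```python
import numpy as np

# Reverse-time Van der Pol: x1' = -x2, x2' = x1 + (x1^2 - 1) x2.
A = np.array([[0.0, -1.0], [1.0, -1.0]])      # linearization at the origin

# Solve the Lyapunov equation A^T P + P A = -I via the Kronecker trick.
I2 = np.eye(2)
K = np.kron(I2, A.T) + np.kron(A.T, I2)
P = np.linalg.solve(K, -I2.flatten()).reshape(2, 2)

def vdot(x1, x2):
    """Derivative of V = x^T P x along the nonlinear vector field."""
    f = np.array([-x2, x1 + (x1 ** 2 - 1.0) * x2])
    return 2.0 * (P @ np.array([x1, x2])) @ f

theta = np.linspace(0.0, 2.0 * np.pi, 720)
L = np.linalg.cholesky(np.linalg.inv(P))      # maps the unit circle to {V = 1}
best_c = 0.0
for c in np.linspace(0.05, 3.0, 60):
    pts = np.sqrt(c) * (L @ np.vstack([np.cos(theta), np.sin(theta)]))
    if all(vdot(x1, x2) < 0.0 for x1, x2 in pts.T):
        best_c = c                            # {V <= c} passes the sampled check
    else:
        break
print(best_c)   # inner estimate of the region of attraction {V <= best_c}
```

Sum-of-squares programming replaces the sampling loop with a certificate that Vdot is negative on the whole level set, which is what makes the thesis' estimates provable.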

Topcu, Ufuk


Challenges in the Modeling and Quantitative Analysis of Safety-Critical Automotive Systems  

E-print Network

Qualitative methods ("identify failures"): Qualitative FMEA; Qualitative Fault Tree Analysis; Event Tree Analysis. Quantitative methods ("predict frequency of failures"): Quantitative FMEA; Quantitative Fault Tree Analysis; Event Tree Analysis; Markov

Leue, Stefan


Error Propagation Analysis for Quantitative Intracellular Metabolomics  

PubMed Central

Model-based analyses have become an integral part of modern metabolic engineering and systems biology in order to gain knowledge about complex and not directly observable cellular processes. For quantitative analyses, not only experimental data, but also measurement errors, play a crucial role. The total measurement error of any analytical protocol is the result of an accumulation of single errors introduced by several processing steps. Here, we present a framework for the quantification of intracellular metabolites, including error propagation during metabolome sample processing. Focusing on one specific protocol, we comprehensively investigate all currently known and accessible factors that ultimately impact the accuracy of intracellular metabolite concentration data. All intermediate steps are modeled, and their uncertainty with respect to the final concentration data is rigorously quantified. Finally, on the basis of a comprehensive metabolome dataset of Corynebacterium glutamicum, an integrated error propagation analysis for all parts of the model is conducted, and the most critical steps for intracellular metabolite quantification are detected. PMID:24957773
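
Error propagation through a chain of processing steps can be sketched by Monte Carlo simulation. The model below is illustrative only: the concentration formula, the step errors, and their magnitudes are invented and are not the framework or protocol of the paper.

```python
import random
import statistics

# Toy model: concentration c = n / (V * f), where n is the measured amount,
# V the sampled biovolume, and f the extraction recovery. Each processing
# step contributes its own (assumed) relative error.
random.seed(42)

def simulate(n_runs=20000):
    concs = []
    for _ in range(n_runs):
        n = random.gauss(50.0, 2.0)        # measured amount, nmol (4% error)
        v = random.gauss(2.0, 0.06)        # biovolume, uL (3% error)
        f = random.gauss(0.9, 0.045)       # extraction recovery (5% error)
        concs.append(n / (v * f))
    return statistics.mean(concs), statistics.stdev(concs)

mean_c, sd_c = simulate()
rel_err = sd_c / mean_c
# First-order propagation predicts sqrt(0.04^2 + 0.03^2 + 0.05^2), about 7.1%.
print(mean_c, rel_err)
```

Comparing per-step contributions this way is how one identifies the "most critical steps" the abstract refers to: the largest single relative error dominates the total.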

Tillack, Jana; Paczia, Nicole; Noh, Katharina; Wiechert, Wolfgang; Noack, Stephan



A Framework for Quantitative Security Analysis of Machine Learning  

E-print Network

A Framework for Quantitative Security Analysis of Machine Learning. Pavel Laskov, Universität… …for quantitative security analysis of machine learning methods. Key issues of this framework are a formal… Categories: I.5.2 [Pattern Recognition]: Design Methodology: Classifier design and evaluation; Parameter learning. General Terms

Freytag, Johann-Christoph


Quantitative Analysis of Hypoperfusion in Acute Stroke  

PubMed Central

Background and Purpose This study compares the concordance between arterial spin labeling (ASL) and dynamic susceptibility contrast (DSC) for the identification of regional hypoperfusion and diffusion-perfusion mismatch tissue classification using a quantitative method. Methods The inclusion criteria for this retrospective study were as follows: patients with acute ischemic syndrome with symptom onset <24 hours and acquisition of both ASL and DSC MR perfusion. The volumes of infarction and hypoperfused lesions were calculated on ASL and DSC multi-parametric maps. Patients were classified into reperfused, matched, or mismatch groups using time to maximum >6 sec as the reference. In a subset of patients who were successfully recanalized, the identical analysis was performed and the infarction and hypoperfused lesion volumes were used for paired pre- and posttreatment comparisons. Results Forty-one patients met our inclusion criteria. Twenty patients underwent successful endovascular revascularization (TICI>2a), resulting in a total of 61 ASL-DSC data pairs for comparison. The hypoperfusion volume on ASL-cerebral blood flow best approximated the DSC-time to peak volume (r=0.83) in the pretreatment group and time to maximum (r=0.46) after recanalization. Both ASL-cerebral blood flow and DSC-TTP overestimated the hypoperfusion volume compared with time to maximum volume in pretreatment (F=27.41, P<0.0001) and recanalized patients (F=8.78, P<0.0001). Conclusions ASL-cerebral blood flow overestimates the DSC time to maximum hypoperfusion volume and mismatch classification in patients with acute ischemic syndrome. Continued overestimation of hypoperfused volume after recanalization suggests that flow pattern and velocity changes, in addition to arterial transit delay, can affect the performance of ASL. PMID:23988646

Nael, Kambiz; Meshksar, Arash; Liebeskind, David S.; Coull, Bruce M.; Krupinski, Elizabeth A.; Villablanca, J. Pablo



In aqua vivo EPID dosimetry  

SciTech Connect

Purpose: At the Netherlands Cancer Institute-Antoni van Leeuwenhoek Hospital in vivo dosimetry using an electronic portal imaging device (EPID) has been implemented for almost all high-energy photon treatments of cancer with curative intent. Lung cancer treatments were initially excluded, because the original back-projection dose-reconstruction algorithm uses water-based scatter-correction kernels and therefore does not account for tissue inhomogeneities accurately. The aim of this study was to test a new method, in aqua vivo EPID dosimetry, for fast dose verification of lung cancer irradiations during actual patient treatment. Methods: The key feature of our method is the dose reconstruction in the patient from EPID images, obtained during the actual treatment, whereby the images have been converted to a situation as if the patient consisted entirely of water; hence, the method is termed in aqua vivo. This is done by multiplying the measured in vivo EPID image with the ratio of two digitally reconstructed transmission images for the unit-density and inhomogeneous tissue situation. For dose verification, a comparison is made with the calculated dose distribution with the inhomogeneity correction switched off. IMRT treatment verification is performed for each beam in 2D using a 2D {gamma} evaluation, while for the verification of volumetric-modulated arc therapy (VMAT) treatments in 3D a 3D {gamma} evaluation is applied using the same parameters (3%, 3 mm). The method was tested using two inhomogeneous phantoms simulating a tumor in lung and measuring its sensitivity for patient positioning errors. Subsequently five IMRT and five VMAT clinical lung cancer treatments were investigated, using both the conventional back-projection algorithm and the in aqua vivo method. The verification results of the in aqua vivo method were statistically analyzed for 751 lung cancer patients treated with IMRT and 50 lung cancer patients treated with VMAT. 
Results: The improvements by applying the in aqua vivo approach are considerable. The percentage of {gamma} values {<=}1 increased on average from 66.2% to 93.1% and from 43.6% to 97.5% for the IMRT and VMAT cases, respectively. The corresponding mean {gamma} value decreased from 0.99 to 0.43 for the IMRT cases and from 1.71 to 0.40 for the VMAT cases, which is similar to the accepted clinical values for the verification of IMRT treatments of prostate, rectum, and head-and-neck cancers. The deviation between the reconstructed and planned dose at the isocenter diminished on average from 5.3% to 0.5% for the VMAT patients and was almost the same, within 1%, for the IMRT cases. The in aqua vivo verification results for IMRT and VMAT treatments of a large group of patients had a mean {gamma} of approximately 0.5, a percentage of {gamma} values {<=}1 larger than 89%, and a difference of the isocenter dose value less than 1%. Conclusions: With the in aqua vivo approach for the verification of lung cancer treatments (IMRT and VMAT), we can achieve results with the same accuracy as obtained during in vivo EPID dosimetry of sites without large inhomogeneities.
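
The 2D gamma evaluation (3%, 3 mm) used above can be sketched with a brute-force implementation. Real gamma implementations are heavily optimised and interpolate the dose grids; the version below, with a synthetic dose distribution, is for illustration only.

```python
import numpy as np

def gamma_2d(ref, meas, spacing_mm, dd=0.03, dta_mm=3.0):
    """Global gamma index: dose difference normalised to the reference maximum,
    distance-to-agreement searched over the whole measured grid."""
    ny, nx = ref.shape
    yy, xx = np.mgrid[0:ny, 0:nx] * spacing_mm
    dmax = ref.max()
    gamma = np.empty_like(ref)
    for iy in range(ny):
        for ix in range(nx):
            dist2 = (yy - iy * spacing_mm) ** 2 + (xx - ix * spacing_mm) ** 2
            dose2 = ((meas - ref[iy, ix]) / (dd * dmax)) ** 2
            gamma[iy, ix] = np.sqrt(np.min(dist2 / dta_mm ** 2 + dose2))
    return gamma

# Identical distributions give gamma = 0 everywhere; a 2% global offset
# passes (gamma <= 1), while a 10% offset fails at some points.
ref = np.outer(np.hanning(20), np.hanning(20)) * 100.0
g_same = gamma_2d(ref, ref, spacing_mm=2.0)
g_small = gamma_2d(ref, ref * 1.02, spacing_mm=2.0)
g_big = gamma_2d(ref, ref * 1.10, spacing_mm=2.0)
print(g_same.max(), (g_small <= 1).mean(), g_big.max())
```

The percentage of points with gamma <= 1 and the mean gamma, as reported in the abstract, are then simple summaries of the resulting map.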

Wendling, Markus; McDermott, Leah N.; Mans, Anton; Olaciregui-Ruiz, Igor; Pecharroman-Gallego, Raul; Sonke, Jan-Jakob; Stroom, Joep; Herk, Marcel J.; Mijnheer, Ben van [Department of Radiation Oncology, Netherlands Cancer Institute-Antoni van Leeuwenhoek Hospital, Plesmanlaan 121, 1066 CX Amsterdam (Netherlands)




E-print Network

College of Canada, Kingston, ON, K7K 7B4 ABSTRACT. Barkhausen noise is the result of abrupt magnetic response to elastic stress [3]. Consistent BN requires that M be reproduced for each measurement. A general… in the student poster competition of 2007. CP975, Review of Quantitative Nondestructive Evaluation Vol. 27, ed

Clapham, Lynann


Addressing an inadequacy in quantitative analysis: examining NMP technology selection  

Microsoft Academic Search

In its early years NASA's New Millennium Program (NMP) applied quantitative analysis methods in making complex decisions when selecting technology for flight validation. In 1997, the quantitative approach showed serious signs of weakness in the selection of technologies for DS3. In response, the NMP began to explore new methods to select its technologies. To understand this inadequacy in

M. Bergmann; Martin G. Buehler



Quantitative analysis of comparative genomic hybridization  

SciTech Connect

Comparative genomic hybridization (CGH) is a new molecular cytogenetic method for the detection of chromosomal imbalances. Following cohybridization of DNA prepared from a sample to be studied and control DNA to normal metaphase spreads, probes are detected via different fluorochromes. The ratio of the test and control fluorescence intensities along a chromosome reflects the relative copy number of segments of a chromosome in the test genome. Quantitative evaluation of CGH experiments is required for the determination of low copy changes, e.g., monosomy or trisomy, and for the definition of the breakpoints involved in unbalanced rearrangements. In this study, a program for quantitation of CGH preparations is presented. This program is based on the extraction of the fluorescence ratio profile along each chromosome, followed by averaging of individual profiles from several metaphase spreads. Objective parameters critical for quantitative evaluations were tested, and the criteria for selection of suitable CGH preparations are described. The granularity of the chromosome painting and the regional inhomogeneity of fluorescence intensities in metaphase spreads proved to be crucial parameters. The coefficient of variation of the ratio value for chromosomes in balanced state (CVBS) provides a general quality criterion for CGH experiments. Different cutoff levels (thresholds) of average fluorescence ratio values were compared for their specificity and sensitivity with regard to the detection of chromosomal imbalances. 27 refs., 15 figs., 1 tab.
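
The profile-averaging and thresholding steps described above can be sketched on simulated data. The noise level, number of spreads, and the 0.75/1.25 thresholds below are illustrative assumptions, not the program's actual parameters.

```python
import numpy as np

# Simulate test/control fluorescence ratio profiles along one chromosome
# from several metaphase spreads, average them, and flag gains/losses.
rng = np.random.default_rng(7)
n_spreads, n_points = 8, 100

true_ratio = np.ones(n_points)
true_ratio[60:] = 1.5                   # simulated trisomic segment (gain)

profiles = true_ratio + rng.normal(0, 0.15, (n_spreads, n_points))
mean_profile = profiles.mean(axis=0)    # averaging over metaphase spreads

# CVBS-like quality criterion: coefficient of variation in the balanced region.
cv_balanced = np.std(profiles[:, :60]) / np.mean(profiles[:, :60])

gain = mean_profile > 1.25              # illustrative cutoff levels
loss = mean_profile < 0.75
print(cv_balanced, gain[:60].sum(), gain[60:].mean())
```

Averaging over spreads shrinks the per-point noise by the square root of their number, which is why the fixed cutoffs separate the balanced and trisomic segments cleanly here.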

Manoir, S. du; Bentz, M.; Joos, S. [Abteilung Organisation komplexer Genome, Heidelberg (Germany)]|[Institut fuer Humangenetik, Heidelberg (Germany)] [and others



Quantitative analysis of cerebral white matter anatomy from diffusion MRI  

E-print Network

In this thesis we develop algorithms for quantitative analysis of white matter fiber tracts from diffusion MRI. The presented methods enable us to look at the variation of a diffusion measure along a fiber tract in a single ...

Maddah, Mahnaz



A Full Snow Season in Yellowstone: A Database of Restored Aqua Band 6  

NASA Technical Reports Server (NTRS)

The algorithms for estimating snow extent for the Moderate Resolution Imaging Spectroradiometer (MODIS) optimally use the 1.6-micron channel, which is unavailable for MODIS on Aqua due to detector damage. As a test bed to demonstrate that Aqua band 6 can be restored, we chose the area surrounding Yellowstone and Grand Teton national parks. In such rugged and difficult-to-access terrain, satellite images are particularly important for providing an estimation of snow-cover extent. For the full 2010-2011 snow season covering the Yellowstone region, we have used quantitative image restoration to create a database of restored Aqua band 6. The database includes restored radiances, normalized vegetation index, normalized snow index, thermal data, and band-6-based snow-map products. The restored Aqua-band-6 data have also been regridded and combined with Terra data to produce a snow-cover map that utilizes both Terra and Aqua snow maps. Using this database, we show that the restored Aqua-band-6-based snow-cover extent has performance comparable, with respect to ground stations, to the one based on Terra. The result of a restored band 6 from Aqua is that we have an additional band-6 image of the Yellowstone region each day. This image can be used to mitigate cloud occlusion, using the same algorithms used for band 6 on Terra. We show an application of this database of restored band-6 images to illustrate the value of cloud gap filling using the National Aeronautics and Space Administration's operational cloud masks and data from both Aqua and Terra.

Gladkova, Irina; Grossberg, Michael; Bonev, George; Romanov, Peter; Riggs, George; Hall, Dorothy



Aqua's First 10 Years: An Overview  

NASA Technical Reports Server (NTRS)

NASA's Aqua spacecraft was launched at 2:55 a.m. on May 4, 2002, from Vandenberg Air Force Base in California, into a near-polar, sun-synchronous orbit at an altitude of 705 km. Aqua carries six Earth-observing instruments to collect data on water in all its forms (liquid, vapor, and solid) and on a wide variety of additional Earth system variables (Parkinson 2003). The design lifetime for Aqua's prime mission was 6 years, and Aqua is now well into its extended mission, approaching 10 years of successful operations. The Aqua data have been used for hundreds of scientific studies and continue to be used for scientific discovery and numerous practical applications.

Parkinson, Claire L.



Quantitative data analysis of ESAR data  

NASA Astrophysics Data System (ADS)

Synthetic aperture radar (SAR) data processing uses the backscattered electromagnetic wave to map the radar reflectivity of the ground surface. The polarization property in radar remote sensing has been used successfully in many applications, especially in target decomposition. This paper presents a case study of experiments performed on ESAR L-Band fully polarized data sets from the German Aerospace Center (DLR) to demonstrate the potential of coherent target decomposition and the possibility of using weather-radar measurement parameters, such as the differential reflectivity and the linear depolarization ratio, to obtain quantitative information about the ground surface. The raw ESAR data were processed by a SAR simulator developed in MATLAB using the Range-Doppler algorithm.

Phruksahiran, N.; Chandra, M.



Database design and implementation for quantitative image analysis research.  


Quantitative image analysis (QIA) goes beyond subjective visual assessment to provide computer measurements of the image content, typically following image segmentation to identify anatomical regions of interest (ROIs). Commercially available picture archiving and communication systems focus on storage of image data. They are not well suited to efficient storage and mining of new types of quantitative data. In this paper, we present a system that integrates image segmentation, quantitation, and characterization with database and data mining facilities. The paper includes generic process and data models for QIA in medicine and describes their practical use. The data model is based upon the Digital Imaging and Communications in Medicine (DICOM) data hierarchy, which is augmented with tables to store segmentation results (ROIs) and quantitative data from multiple experiments. Data mining for statistical analysis of the quantitative data is described along with example queries. The database is implemented in PostgreSQL on a UNIX server. Database requirements and capabilities are illustrated through two quantitative imaging experiments related to lung cancer screening and assessment of emphysema lung disease. The system can manage the large amounts of quantitative data necessary for research, development, and deployment of computer-aided diagnosis tools. PMID:15787012
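
A miniature of the augmented data model described above can be sketched with an embedded database. Table and column names here are invented for illustration (the paper's system uses PostgreSQL and the full DICOM hierarchy; this sketch keeps only a patient/study/series spine plus ROI and measurement tables).

```python
import sqlite3

schema = """
CREATE TABLE patient (patient_id TEXT PRIMARY KEY);
CREATE TABLE study   (study_uid TEXT PRIMARY KEY,
                      patient_id TEXT REFERENCES patient);
CREATE TABLE series  (series_uid TEXT PRIMARY KEY,
                      study_uid TEXT REFERENCES study);
CREATE TABLE roi     (roi_id INTEGER PRIMARY KEY,
                      series_uid TEXT REFERENCES series,
                      label TEXT, experiment TEXT);
CREATE TABLE measurement (roi_id INTEGER REFERENCES roi,
                          name TEXT, value REAL, unit TEXT);
"""
con = sqlite3.connect(":memory:")
con.executescript(schema)
con.execute("INSERT INTO patient VALUES ('P001')")
con.execute("INSERT INTO study VALUES ('S1', 'P001')")
con.execute("INSERT INTO series VALUES ('SER1', 'S1')")
con.execute("INSERT INTO roi VALUES (1, 'SER1', 'emphysema', 'exp-A')")
con.execute("INSERT INTO measurement VALUES (1, 'mean_HU', -912.5, 'HU')")

# Example mining query: average measurement value per ROI label.
row = con.execute("""SELECT r.label, AVG(m.value)
                     FROM roi r JOIN measurement m USING (roi_id)
                     GROUP BY r.label""").fetchone()
print(row)
```

Keeping segmentation results and derived quantities in relational tables, rather than only image files, is what makes the statistical mining queries in the paper possible.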

Brown, Matthew S; Shah, Sumit K; Pais, Richard C; Lee, Yeng-Zhong; McNitt-Gray, Michael F; Goldin, Jonathan G; Cardenas, Alfonso F; Aberle, Denise R



Quantitative multi-modal NDT data analysis  

NASA Astrophysics Data System (ADS)

A single NDT technique is often not adequate to provide assessments about the integrity of test objects with the required coverage or accuracy. In such situations, one often resorts to multi-modal testing, where complementary and overlapping information from different NDT techniques is combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular, whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity.
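
A simplified decision-level fusion scheme can be sketched as below. The three "sensor" maps are synthetic stand-ins for the Eddy Current, GMR and thermography data, and the averaging rule and noise levels are assumptions; this is not the paper's fusion scheme.

```python
import numpy as np

rng = np.random.default_rng(3)
truth = np.zeros((64, 64), bool)
truth[30:34, 10:50] = True              # a groove-like defect

def sensor_map(noise, sensitivity):
    """One modality's detection score map: signal + noise, min-max normalised."""
    m = truth.astype(float) * sensitivity + rng.normal(0, noise, truth.shape)
    return (m - m.min()) / (m.max() - m.min())

maps = [sensor_map(0.35, 1.0), sensor_map(0.30, 0.8), sensor_map(0.40, 0.9)]
fused = np.mean(maps, axis=0)           # simple averaging fusion rule

def specificity_at_full_sensitivity(score):
    """Specificity when the threshold is set to keep every defect pixel."""
    thr = score[truth].min()
    fp = (score[~truth] >= thr).mean()
    return 1.0 - fp

best_single = max(specificity_at_full_sensitivity(m) for m in maps)
print(specificity_at_full_sensitivity(fused), best_single)
```

Averaging independent noise across modalities shrinks its standard deviation, so at fixed (full) sensitivity the fused map rejects more background, mirroring the specificity gain reported in the abstract.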

Heideklang, René; Shokouhi, Parisa



Multiple quantitative trait analysis using bayesian networks.  


Models for genome-wide prediction and association studies usually target a single phenotypic trait. However, in animal and plant genetics it is common to record information on multiple phenotypes for each individual that will be genotyped. Modeling traits individually disregards the fact that they are most likely associated due to pleiotropy and shared biological basis, thus providing only a partial, confounded view of genetic effects and phenotypic interactions. In this article we use data from a Multiparent Advanced Generation Inter-Cross (MAGIC) winter wheat population to explore Bayesian networks as a convenient and interpretable framework for the simultaneous modeling of multiple quantitative traits. We show that they are equivalent to multivariate genetic best linear unbiased prediction (GBLUP) and that they are competitive with single-trait elastic net and single-trait GBLUP in predictive performance. Finally, we discuss their relationship with other additive-effects models and their advantages in inference and interpretation. MAGIC populations provide an ideal setting for this kind of investigation because the very low population structure and large sample size result in predictive models with good power and limited confounding due to relatedness. PMID:25236454
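
Single-trait GBLUP, the baseline the Bayesian networks are shown to be equivalent to, can be sketched on simulated data. Genotypes, marker effects, and the variance-ratio parameter below are simulated assumptions; this is not the MAGIC wheat data or the article's code.

```python
import numpy as np

rng = np.random.default_rng(11)
n, p = 200, 500                          # individuals x markers
M = rng.binomial(2, 0.4, (n, p)).astype(float)   # 0/1/2 genotype dosages
Z = M - M.mean(axis=0)                   # centred marker matrix
G = Z @ Z.T / p                          # genomic relationship matrix

beta = rng.normal(0, 0.1, p)             # simulated marker effects
g_true = Z @ beta                        # true breeding values
y = g_true + rng.normal(0, g_true.std(), n)      # heritability ~ 0.5

# BLUP of breeding values: u = G (G + lambda I)^{-1} (y - mean(y)),
# with lambda the (assumed known) residual-to-genetic variance ratio.
lam = 1.0
u = G @ np.linalg.solve(G + lam * np.eye(n), y - y.mean())

acc = np.corrcoef(u, g_true)[0, 1]       # accuracy against the simulated truth
print(acc)
```

A multivariate extension stacks the traits and replaces lam with a ratio of covariance matrices, which is the multi-trait GBLUP the article relates to Bayesian networks.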

Scutari, Marco; Howell, Phil; Balding, David J; Mackay, Ian



A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis  

ERIC Educational Resources Information Center

In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

Katharaki, Maria; Katharakis, George



Quantitative control of idealized analysis models of thin designs  

E-print Network

Quantitative control of idealized analysis models of thin designs. Ming Li, Junzhe Zheng, Ralph Martin. In engineering analysis, model idealization is often used, where defeaturing and/or local dimension reduction of thin… During this process, an initial step of model idealization [1, 2] is often performed to convert the fully

Martin, Ralph R.


Quantitative Evaluation of Nuclear Cataract Using Image Analysis  

Microsoft Academic Search

To quantitatively evaluate nuclear lens opacification, we applied image analysis techniques. Utilizing a newly developed anterior eye segment analysis system, Scheimpflug slit images were taken in 65 eyes with transparent lenses and 31 eyes with nuclear cataract. In transparent lenses, scattering light intensity of the anterior fetal nucleus (AFN) was equal to or less than that of the posterior fetal

Kazuyuki Sasaki; Kuruto Fujisawa; Yasuo Sakamoto



Methods of quantitative fire hazard analysis  

SciTech Connect

Simplified fire hazard analysis methods have been developed as part of the FIVE risk-based fire induced vulnerability evaluation methodology for nuclear power plants. These fire hazard analyses are intended to permit plant fire protection personnel to conservatively evaluate the potential for credible exposure fires to cause critical damage to essential safe-shutdown equipment and thereby screen from further analysis spaces where a significant fire hazard clearly does not exist. This document addresses the technical bases for the fire hazard analysis methods. A separate user's guide addresses the implementation of the fire screening methodology, which has been implemented with three worksheets and a number of look-up tables. The worksheets address different locations of targets relative to exposure fire sources. The look-up tables address fire-induced conditions in enclosures in terms of three stages: a fire plume/ceiling jet period, an unventilated enclosure smoke filling period and a ventilated quasi-steady period.

Mowrer, F.W. (Mowrer (Frederick W.), Adelphi, MD (United States))



Quantitative Immunoelectrophoretic Analysis of Streptococcus pyogenes Membrane  

PubMed Central

The antigenic composition and molecular structure of the plasma membrane of Streptococcus pyogenes (group A; M type 6) were studied by crossed immunoelectrophoresis (XIE) and other related quantitative immunoelectrophoretic techniques. After establishment of a reference pattern of 29 immunoprecipitates, the relative differences in amounts of individual antigens contained in membranes isolated from cells that were harvested during the exponential or stationary phase of growth were examined. Relative increases and decreases in amounts of individual antigens were estimated from the areas subtended by immunoprecipitates after XIE of Triton X-100 extracts. The asymmetric distribution of antigens on the inner and outer surfaces of the membrane was established in absorption experiments with intact, stable protoplasts. Of the 29 immunoprecipitates, 8 appeared to contain antigens exposed on the outer surface of the membrane, whereas 11 appeared to contain antigens either located on the inner surface or unexposed. Six antigens appeared to have limited exposure on the outer surface, and four others remain to be assigned. Certain immunoprecipitates were characterized with respect to enzymatic activity or interaction with the lectin concanavalin A. Reduced nicotinamide adenine dinucleotide dehydrogenase (EC, adenosine triphosphatase (EC, and polynucleotide phosphorylase (EC were demonstrated by zymogram techniques. The latter two activities were present within the same immunoprecipitate, suggesting the occurrence of a multienzyme complex. In addition, the areas under the immunoprecipitates containing the three enzymatic activities were not affected by absorption of antimembrane immunoglobulin with intact protoplasts and thus appeared to be located on the inner surface of the membrane. 
The results from absorption experiments also suggested that the exposure of outer protoplast surface antigens was greater on protoplasts from exponential-phase cells than on those from stationary-phase cells, even when found in increased amounts in the latter. PMID:160891

Kessler, Robert E.; van de Rijn, Ivo



Journal of Quantitative Analysis in Manuscript 1416  

E-print Network

…/loss records of each team. KEYWORDS: scoring statistics, hot hand, stochastics, random walk, Poisson process. …for helpful comments on an earlier version of the manuscript. This work was supported in part by NSF grant DMR… …is at odds with the data, however. Impartial analysis of individual player data in basketball has discredited

Redner, Sidney


Water property monitoring and assessment for China's inland Lake Taihu from MODIS-Aqua measurements  

Microsoft Academic Search

We provide results of quantitative measurements and characterization for inland freshwater Lake Taihu from the Moderate Resolution Imaging Spectroradiometer (MODIS) on the satellite Aqua. China's Lake Taihu, which is located in the Yangtze River delta in one of the world's most urbanized and heavily populated areas, contains consistently highly turbid waters in addition to frequent large seasonal algae blooms in

Menghua Wang; Wei Shi; Junwu Tang



Quantitative analysis of heart rate variability  

Microsoft Academic Search

In the modern industrialized countries, several hundred thousand people die every year due to sudden cardiac death. The individual risk for this sudden cardiac death cannot be defined precisely by commonly available, non-invasive diagnostic tools like Holter monitoring, highly amplified ECG and traditional linear analysis of heart rate variability (HRV). Therefore, we apply some rather unconventional methods of nonlinear dynamics to analyse the

J. Kurths; A. Voss; P. Saparin; A. Witt; H. J. Kleiner; N. Wessel



Synergism of MODIS Aerosol Remote Sensing from Terra and Aqua  

NASA Technical Reports Server (NTRS)

The MODerate-resolution Imaging Spectro-radiometer (MODIS) sensors, aboard the Earth Observing System (EOS) Terra and Aqua satellites, are showing excellent competence at measuring the global distribution and properties of aerosols. Terra and Aqua were launched on December 18, 1999 and May 4, 2002 respectively, with daytime equator crossing times of approximately 10:30 am and 1:30 pm respectively. Several aerosol parameters are retrieved at 10-km spatial resolution from MODIS daytime data over land and ocean surfaces. The parameters retrieved include: aerosol optical thickness (AOT) at 0.47, 0.55 and 0.66 micron wavelengths over land, and at 0.47, 0.55, 0.66, 0.87, 1.2, 1.6, and 2.1 microns over ocean; Angstrom exponent over land and ocean; and effective radii, and the proportion of AOT contributed by the small mode aerosols over ocean. Since the beginning of its operation, the quality of Terra-MODIS aerosol products (especially AOT) has been evaluated periodically by cross-correlation with equivalent data sets acquired by ground-based (and occasionally also airborne) sunphotometers, particularly those coordinated within the framework of the AErosol Robotic NETwork (AERONET). Terra-MODIS AOT data have been found to meet or exceed pre-launch accuracy expectations, and have been applied to various studies dealing with local, regional, and global aerosol monitoring. The results of these Terra-MODIS aerosol data validation efforts and studies have been reported in several scientific papers and conferences. Although Aqua-MODIS is still young, it is already yielding formidable aerosol data products, which are also subjected to careful periodic evaluation similar to that implemented for the Terra-MODIS products. This paper presents results of validation of Aqua-MODIS aerosol products with AERONET, as well as comparative evaluation against corresponding Terra-MODIS data. 
In addition, we show interesting independent and synergistic applications of MODIS aerosol data from both Terra and Aqua. In certain situations, this combined analysis of Terra- and Aqua-MODIS data offers an insight into the diurnal cycle of aerosol loading.

Ichoku, Charles; Kaufman, Yoram J.; Remer, Lorraine A.



Quantitative analysis of heart rate variability  

NASA Astrophysics Data System (ADS)

In the modern industrialized countries every year several hundred thousands of people die due to sudden cardiac death. The individual risk for this sudden cardiac death cannot be defined precisely by commonly available, noninvasive diagnostic tools like Holter monitoring, highly amplified ECG and traditional linear analysis of heart rate variability (HRV). Therefore, we apply some rather unconventional methods of nonlinear dynamics to analyze the HRV. In particular, some complexity measures that are based on symbolic dynamics, as well as a new measure, the renormalized entropy, detect abnormalities in the HRV of several patients who have been classified in the low-risk group by traditional methods. A combination of these complexity measures with the parameters in the frequency domain seems to be a promising way to get a more precise definition of the individual risk. These findings have to be validated by a representative number of patients.
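The symbolic-dynamics complexity measure described in this abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the 4-symbol alphabet, the 5% tolerance band around the mean, and the word length are illustrative assumptions.

```python
import numpy as np

def symbolic_dynamics_entropy(rr, word_len=3):
    """Shannon entropy of short symbol 'words' built from an RR-interval series.

    Each interval is coded relative to the series mean on a 4-level
    alphabet; low word entropy flags reduced HRV complexity.
    """
    rr = np.asarray(rr, dtype=float)
    mu = rr.mean()
    a = 0.05  # tolerance band around the mean (illustrative choice)
    # 4-symbol alphabet: far below / slightly below / slightly above / far above
    symbols = np.digitize(rr, [mu * (1 - a), mu, mu * (1 + a)])
    words = [tuple(symbols[i:i + word_len])
             for i in range(len(symbols) - word_len + 1)]
    _, counts = np.unique(words, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())
```

A perfectly regular series yields zero entropy, while a noisy series spreads probability over many words and yields a high value.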

Kurths, J.; Voss, A.; Saparin, P.; Witt, A.; Kleiner, H. J.; Wessel, N.



A quantitative analysis of the F18 flight control system  

NASA Technical Reports Server (NTRS)

This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.

Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann



Crystal growth, stability and photoluminescence studies of tetra aqua diglycine magnesium (II) hexa aqua magnesium (II) bis sulfate crystal  

NASA Astrophysics Data System (ADS)

Single crystals of tetra aqua diglycine magnesium (II) hexa aqua magnesium (II) bis sulfate have been grown from saturated aqueous solution by the slow evaporation solution growth technique. The solubility of the title compound in water at various temperatures has been determined. Single-crystal X-ray diffraction analyses reveal that the title compound crystallizes in the triclinic system with space group P1¯. Fourier transform infrared spectral analyses confirm the presence of functional groups in the grown crystal. The thermal stability of the grown crystal has been investigated by thermogravimetric and differential scanning calorimetric analysis, which indicates that the material is stable up to 100 °C. The crystalline perfection of the grown crystal has been evaluated by a high-resolution X-ray diffraction technique. Vickers microhardness measurements indicate the mechanical strength of the grown crystal. Photoluminescence of the grown crystal has been investigated and reveals that the crystal has blue-violet fluorescence emission.

Senthil Murugan, G.; Ramasamy, P.



Quantitative transverse flow measurement using OCT speckle decorrelation analysis  

PubMed Central

We propose an inter-Ascan speckle decorrelation based method that can quantitatively assess blood flow normal to the direction of the OCT imaging beam. To validate this method, we performed a systematic study using both phantom and in vivo animal models. Results show that our speckle analysis method can accurately extract transverse flow speed with high spatial and temporal resolution. PMID:23455305
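The general idea behind inter-A-scan speckle decorrelation can be sketched as follows. This is an illustrative toy, not the authors' algorithm: the frame layout (rows as successive A-scans) and the plain Pearson correlation are assumptions.

```python
import numpy as np

def interascan_decorrelation(frames):
    """Mean Pearson correlation between successive A-scans.

    Faster transverse flow refreshes the speckle pattern between
    adjacent A-scans, so the mean correlation drops toward zero;
    a static sample stays near 1.
    """
    frames = np.asarray(frames, dtype=float)
    rhos = []
    for a, b in zip(frames[:-1], frames[1:]):
        a = a - a.mean()
        b = b - b.mean()
        rhos.append((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))
    return float(np.mean(rhos))
```

In practice the measured decorrelation would be mapped to a flow speed through a calibration curve obtained from phantoms, as the abstract describes.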

Liu, Xuan; Huang, Yong; Ramella-Roman, Jessica C.; Mathews, Scott A.; Kang, Jin U.



Issues Related to Data Analysis and Quantitative Methods in PER  

NSDL National Science Digital Library

This paper, presented at the 2002 Physics Education Research Conference, offers authors' discussion of some issues that always arise, implicitly or explicitly, when conducting quantitative research and carrying out data analysis in Physics Education Research. (Most are relevant for qualitative research as well.)

Meltzer, David (David Elliott)



Quantitative analysis of thallium-201 myocardial emission computed tomography  

SciTech Connect

The clinical usefulness of quantitative analysis of exercise thallium-201 myocardial emission computed tomography (ECT) was evaluated in twenty coronary artery disease patients and 10 normal controls. Long-axis and short-axis myocardial images of the left ventricle were interpreted quantitatively using circumferential profile analysis, and two types of abnormality were studied: (1) diminished initial distribution (stress defect) and (2) slow washout of thallium-201, evidenced by patients' initial thallium-201 uptake and 3-hour washout rate profiles falling below the normal limits, respectively. Two diagnostic criteria, a stress defect criterion and a combined criterion of stress defect and slow washout, were used to detect significant coronary artery lesions (greater than or equal to 75% luminal narrowing). The ischemic volume was also evaluated with quantitative analysis of thallium-201 ECT. The diagnostic accuracy of the stress defect criterion was 95% in left anterior descending, 90% in right coronary and 78% in left circumflex coronary artery lesions. The combined criterion of stress defect and slow washout increased detection sensitivity with moderate loss of specificity for identification of individual coronary artery lesions. The ischemic myocardial volume evaluated with the combined criterion was significantly larger in triple-vessel than in single-vessel disease (p<0.05). It was concluded that quantitative analysis of exercise thallium-201 myocardial ECT images was useful for evaluation of coronary artery lesions.

Okada, M.; Kawai, N.; Yamamoto, S.; Matsushima, H.; Kato, R.; Tanahashi, Y.; Sotobata, I.; Obata, Y.; Sakuma, S.



Quantitative Analysis of the Effective Functional Structure in Yeast Glycolysis  

E-print Network

Quantitative Analysis of the Effective Functional Structure in Yeast Glycolysis Ildefonso M. De la for the glycolytic enzymes in dissipative conditions we have analyzed different catalytic patterns using of the irreversible enzymes. In addition to the classical topological structure characterized by the specific location

Cortes, Jesus


Perceptual Analysis of Talking Avatar Head Movements: A Quantitative Perspective  

E-print Network

Perceptual Analysis of Talking Avatar Head Movements: A Quantitative Perspective Xiaohan Ma ABSTRACT Lifelike interface agents (e.g. talking avatars) have been in- creasingly used in human-head motion characteristics of talk- ing avatars. Specifically, we quantify the correlation be- tween

Azevedo, Ricardo


A quantitative approach for medical device Health Hazard Analysis  

Microsoft Academic Search

Health Hazard Analysis (HHA) is one major type of patient health risk assessment for medical device field performance issues. The U.S. Food and Drug Administration (FDA) provides an online form listing the information needed for an HHA. In this paper, we illustrate a quantitative HHA approach, which is structured in a rigorous risk assessment framework, with several critical steps, concepts and

Mingxiao Jiang; Kathy Herzog; Thomas Pepin; Michael D. Baca



Original article Quantitative and qualitative analysis of hydrosoluble  

E-print Network

Original article: Quantitative and qualitative analysis of hydrosoluble organic matter in bitumen. Organic compounds leached from bitumen may complex heavy metals [12] and increase their solubility. Indeed, the generation of water-soluble organic complexing agents could affect the integrity

Paris-Sud XI, Université de


Quantitative Analysis of White Matter Fiber Properties along Geodesic Paths  

E-print Network

on anatomical and functional criteria. These fiber bundles are assessed in-vivo by techniques commonly called Quantitative Analysis of White Matter Fiber Properties along Geodesic Paths Pierre Fillard, resonance technique to study white matter properties and alterations of fiber integrity due to pathology

Gerig, Guido


Quantitative analysis of multispectral fundus images  

E-print Network

fundus photography, which provides an RGB image of the fundus. Modern digital fundus cameras can provide Quantitative analysis of multispectral fundus images I. B. Styles, A. Calcagni, E. Claridge for extracting histological parameters from multispectral images of the ocular fundus. The new method uses

Claridge, Ela





Quantitative analysis of regional myocardial performance in coronary artery disease  

NASA Technical Reports Server (NTRS)

Findings from a group of subjects with significant coronary artery stenosis are given. A group of controls determined by use of a quantitative method for the study of regional myocardial performance based on the frame-by-frame analysis of biplane left ventricular angiograms are presented. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions, timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

Stewart, D. K.; Dodge, H. T.; Frimer, M.



An improved quantitative analysis method for plant cortical microtubules.  


The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has been paid to the quantitative image analysis of plant cortical microtubules so far. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in the preprocessing of the original microtubule image. The Intrinsic Mode Function 1 (IMF1) image obtained by decomposition was then selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. Meanwhile, in order to further verify its reliability, the proposed texture analysis method was used to distinguish different images of Arabidopsis microtubules. The results showed that the effect of the BEMD algorithm on edge preservation accompanied by noise reduction was positive, and the geometrical characteristic of the texture was obvious. Four texture parameters extracted by GLCM clearly reflected the different arrangements between the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for the quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies references for other similar studies. PMID:24744684
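The GLCM texture parameters of the kind mentioned in this abstract can be sketched in plain NumPy. This is an illustrative re-implementation, not the authors' code: the quantization scheme, the single horizontal offset, and the choice of four features (contrast, energy, homogeneity, correlation) are assumptions; a non-constant input image is assumed for the correlation term.

```python
import numpy as np

def glcm_features(img, levels=8):
    """Contrast, energy, homogeneity and correlation from a grey-level
    co-occurrence matrix with a single horizontal (0, 1) offset."""
    img = np.asarray(img)
    # quantize intensities onto `levels` grey levels
    q = (img.astype(float) * levels / (img.max() + 1)).astype(int)
    glcm = np.zeros((levels, levels))
    for i, j in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[i, j] += 1
    p = glcm / glcm.sum()                      # joint probability matrix
    idx = np.arange(levels)
    ii, jj = np.meshgrid(idx, idx, indexing="ij")
    mu_i = (ii * p).sum()
    mu_j = (jj * p).sum()
    sd_i = np.sqrt(((ii - mu_i) ** 2 * p).sum())
    sd_j = np.sqrt(((jj - mu_j) ** 2 * p).sum())
    return {
        "contrast": ((ii - jj) ** 2 * p).sum(),
        "energy": (p ** 2).sum(),
        "homogeneity": (p / (1 + np.abs(ii - jj))).sum(),
        "correlation": ((ii - mu_i) * (jj - mu_j) * p).sum() / (sd_i * sd_j),
    }
```

High-contrast striped textures give large `contrast`, while smooth gradients give values near zero, which is the kind of separation the abstract relies on to distinguish microtubule arrangements.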

Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng



Quantitative EDXS analysis of organic materials using the ζ-factor method.  


In this study we successfully applied the ζ-factor method to perform quantitative X-ray analysis of organic thin films consisting of light elements. With its ability to intrinsically correct for X-ray absorption, this method significantly improved the quality of the quantification as well as the accuracy of the results compared to conventional techniques, in particular regarding the quantification of light elements. We describe in detail the process of determining sensitivity factors (ζ-factors) using a single standard specimen and the involved parameter optimization for the estimation of ζ-factors for elements not contained in the standard. The ζ-factor method was then applied to perform quantitative analysis of organic semiconducting materials frequently used in organic electronics. Finally, the results were verified and discussed concerning validity and accuracy. PMID:24012932

Fladischer, Stefanie; Grogger, Werner



Oscillations in glycolysis: multifactorial quantitative analysis in muscle extract  

Microsoft Academic Search

A multifactorial quantitative analysis of oscillations in glycolysis was conducted in the postmicrosomal supernatant of rat muscle homogenates incubated in the presence of yeast hexokinase. Oscillations in adenine nucleotides, D-fructose 1,6-bisphosphate, triose phosphates, L-glycerol 3-phosphate, 3HOH generation from D-[5-3H]glucose, NADH and L-lactate production were documented. The occurrence of such oscillations was found to depend mainly on the balance between the

Greta Marynissen; Abdullah Sener; Willy J. Malaisse



Application of the quantitative IR analysis of copolymers of trioxane  

Microsoft Academic Search

The copolymers of trioxane and 5 to 50% phenyl- or n-butyl glycidyl ethers were investigated with elemental analysis and IR spectroscopy. Using as an internal standard the methylene absorbance at 1470 cm⁻¹ and the absorbance of C-O-C groups at 900, 930, 1100 and 1240 cm⁻¹, it was determined that the absorbance at 1100 cm⁻¹ is available for quantitative IR analysis

G. Sirashki; Iv. Glavchev; R. Mateva



A Quantitative Analysis of Preclusivity vs. Similarity Based Rough Approximations  

Microsoft Academic Search

In the context of generalized rough sets, it is possible to introduce in an Information System two different rough approximations. These are induced, respectively, by a Similarity and a Preclusivity relation ([3,4]). It is possible to show that the last one is always better than the first one. Here, we present a quantitative analysis of the relative performances of the

Gianpiero Cattaneo; Davide Ciucci



Quantitative analysis of food fatty acids by capillary gas chromatography  

Microsoft Academic Search

The superior efficiency of capillary columns is desirable for the gas chromatographic analysis of complex mixtures of fatty acids, but there have been some reservations regarding quantitation and reproducibility. This paper discusses the use of wall-coated glass capillary columns in a semiautomated system for the determination of food fatty acids. Glass columns coated with SP2340 were used for extended periods

H. T. Slover; E. Lanza



Quantitative analysis of lipids by thin-layer chromatography  

Microsoft Academic Search

A procedure is described for the quantitative analysis of neutral and phospholipids by thin-layer chromatography (TLC) employing densitometry. The chromatoplates are prepared with the usual solvent systems. The spots are charred under standard conditions and analyzed with a Photovolt Corp. densitometer equipped with a special stage designed for holding 20×20 cm chromatoplates. Each spot on the chromatoplate gives a peak

M. L. Blank; J. A. Schmit; O. S. Privett



Multipoint Quantitative-Trait Linkage Analysis in General Pedigrees  

Microsoft Academic Search

Summary: Multipoint linkage analysis of quantitative-trait loci (QTLs) has previously been restricted to sibships and small pedigrees. In this article, we show how variance-component linkage methods can be used in pedigrees of arbitrary size and complexity, and we develop a general framework for multipoint identity-by-descent (IBD) probability calculations. We extend the sib-pair multipoint mapping approach of Fulker et

Laura Almasy; John Blangero



BiDirectional optical communication with AquaOptical II  

E-print Network

This paper describes AquaOptical II, a bidirectional, high data-rate, long-range, underwater optical communication system. The system uses the software radio principle. Each AquaOptical II modem can be programmed to transmit ...

Doniec, Marek Wojciech


Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.  

ERIC Educational Resources Information Center

Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)

Anderson, James L.; And Others



Quantitative analysis of patient-specific dosimetric IMRT verification  

NASA Astrophysics Data System (ADS)

Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community.
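The dose-difference/distance-to-agreement criterion the abstract recommends (e.g., 3%/3 mm) can be sketched in one dimension. This is a simplified illustration, not the authors' verification software: the OR-combination of the two tests, the window search, and the assumed 1 mm pixel spacing are all assumptions.

```python
import numpy as np

def pass_rate(measured, planned, dose_tol=0.03, dta_px=3):
    """Fraction of profile points passing a dose-difference OR
    distance-to-agreement test (simplified 1-D analogue of 3%/3 mm,
    assuming 1 mm pixels so dta_px = 3)."""
    measured = np.asarray(measured, dtype=float)
    planned = np.asarray(planned, dtype=float)
    ok = np.zeros(measured.size, dtype=bool)
    for i, m in enumerate(measured):
        lo, hi = max(0, i - dta_px), min(planned.size, i + dta_px + 1)
        window = planned[lo:hi]
        # a point passes if some planned dose within the DTA window
        # agrees to within the relative dose tolerance
        ok[i] = np.any(np.abs(window - m) <= dose_tol * np.maximum(window, 1e-9))
    return float(ok.mean())
```

A small spatial shift of a smooth profile still passes almost everywhere (caught by the distance criterion), which is exactly why a pure point-by-point dose comparison would be too strict.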

Budgell, G. J.; Perrin, B. A.; Mott, J. H. L.; Fairfoul, J.; Mackay, R. I.



Bayesian Shrinkage Analysis of Quantitative Trait Loci for Dynamic Traits  

PubMed Central

Many quantitative traits are measured repeatedly during the life of an organism. Such traits are called dynamic traits. The pattern of the changes of a dynamic trait is called the growth trajectory. Studying the growth trajectory may enhance our understanding of the genetic architecture of the growth trajectory. Recently, we developed an interval-mapping procedure to map QTL for dynamic traits under the maximum-likelihood framework. We fit the growth trajectory by Legendre polynomials. The method intended to map one QTL at a time and the entire QTL analysis involved scanning the entire genome by fitting multiple single-QTL models. In this study, we propose a Bayesian shrinkage analysis for estimating and mapping multiple QTL in a single model. The method is a combination between the shrinkage mapping for individual quantitative traits and the Legendre polynomial analysis for dynamic traits. The multiple-QTL model is implemented in two ways: (1) a fixed-interval approach where a QTL is placed in each marker interval and (2) a moving-interval approach where the position of a QTL can be searched in a range that covers many marker intervals. Simulation study shows that the Bayesian shrinkage method generates much better signals for QTL than the interval-mapping approach. We propose several alternative methods to present the results of the Bayesian shrinkage analysis. In particular, we found that the Wald test-statistic profile can serve as a mechanism to test the significance of a putative QTL. PMID:17435239
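The Legendre-polynomial representation of a growth trajectory that underlies both the interval-mapping and the shrinkage methods can be sketched as follows. The data here are synthetic: the quadratic "true" curve, the ages, and the noise level are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
ages = np.linspace(0, 10, 21)                 # measurement times
true_traj = 5 + 3 * ages - 0.2 * ages ** 2    # hypothetical growth curve
obs = true_traj + 0.1 * rng.standard_normal(ages.size)

# Order-3 Legendre expansion; .fit maps the ages onto the canonical
# [-1, 1] domain internally, as the Legendre basis requires
model = np.polynomial.Legendre.fit(ages, obs, deg=3)
fitted = model(ages)
rmse = float(np.sqrt(np.mean((fitted - true_traj) ** 2)))
```

In the QTL setting each genetic effect would itself be expanded in such a basis, so that a whole trajectory is summarized by a handful of Legendre coefficients per locus.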

Yang, Runqing; Xu, Shizhong



Quantitative Remote Laser-Induced Breakdown Spectroscopy by Multivariate Analysis  

NASA Astrophysics Data System (ADS)

The ChemCam instrument selected for the Mars Science Laboratory (MSL) rover includes a remote Laser-Induced Breakdown Spectrometer (LIBS) that will quantitatively probe samples up to 9 m from the rover mast. LIBS is fundamentally an elemental analysis technique. LIBS involves focusing a Nd:YAG laser operating at 1064 nm onto the surface of the sample. The laser ablates material from the surface, generating an expanding plasma containing electronically excited ions, atoms, and small molecules. As these electronically excited species relax back to the ground state, they emit light at wavelengths characteristic of the species present in the sample. Some of this emission is directed into one of three dispersive spectrometers. In this paper, we studied a suite of 18 igneous and highly-metamorphosed samples from a wide variety of parageneses for which chemical analyses by XRF were already available. Rocks were chosen to represent a range of chemical composition from basalt to rhyolite, thus providing significant variations in all of the major element contents (Si, Fe, Al, Ca, Na, K, O, Ti, Mg, and Mn). These samples were probed at a 9 m standoff distance under experimental conditions that are similar to ChemCam. Extracting quantitative elemental concentrations from LIBS spectra is complicated by chemical matrix effects. Conventional methods for obtaining quantitative chemical data from LIBS analyses are compared with new multivariate analysis (MVA) techniques that appear to compensate for these chemical matrix effects. The traditional analyses use specific elemental peak heights or areas, which are compared with calibration curves for each element at one or more emission lines for a series of standard samples. Because of matrix effects, the calibration standards generally must have similar chemistries to the unknown samples, and thus this conventional approach imposes severe limitations on application of the technique to remote analyses. 
In this suite of samples, the use of traditional methods results in chemical analyses with significant uncertainties. Alternatively, greatly-improved quantitative elemental analysis was accomplished by using a Partial Least Squares (PLS) calibration model for all of the major elements of interest. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are then employed to predict the rock-type of the sample. These MVA techniques appear to compensate for these matrix effects because the analysis finds correlations between the spectra (independent variables), the individual elements of interest (dependent variables such as Si) as well as the other elements in the matrix.

Clegg, S. M.; Sklute, E. C.; Dyar, M. D.; Barefield, J. E.; Wiens, R. C.



Universal Drag Tag for Direct Quantitative Analysis of Multiple David W. Wegman,  

E-print Network

. There is a need for a method to detect such fingerprints, which requires direct, quantitative analysis of multiple practical. Here, we focus on advancing one such method termed direct, quantitative analysis of multiple miUniversal Drag Tag for Direct Quantitative Analysis of Multiple MicroRNAs David W. Wegman, Leonid T

Krylov, Sergey


A quantitative analysis of IRAS maps of molecular clouds  

NASA Technical Reports Server (NTRS)

We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In this present study we use the IRAS continuum maps at 100 and 60 micrometer(s) to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.
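One simple instance of a pseudometric on a space of maps can be sketched as follows. This is an illustrative choice, not the authors' formal system: the L1 distance between normalized density histograms, which is symmetric and satisfies the triangle inequality but can be zero for distinct maps (hence "pseudo").

```python
import numpy as np

def map_pseudometric(map_a, map_b, bins=32):
    """Pseudometric between two column-density maps: L1 distance between
    their normalized density histograms. It ignores pixel-level geometry,
    so two distinct maps with identical density distributions are at
    distance zero."""
    lo = min(map_a.min(), map_b.min())
    hi = max(map_a.max(), map_b.max())
    ha, _ = np.histogram(map_a, bins=bins, range=(lo, hi), density=True)
    hb, _ = np.histogram(map_b, bins=bins, range=(lo, hi), density=True)
    # scale by the bin width so the value approximates an integral
    return float(np.abs(ha - hb).sum() * (hi - lo) / bins)
```

A family of such functions, one per physical characteristic (density, topology, self-gravity, filamentarity), is what allows the clouds to be ordered by "complexity" as described above.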

Wiseman, Jennifer J.; Adams, Fred C.



A Quantitative Analysis of IRAS Maps of Molecular Clouds  

E-print Network

We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps; this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In this present study we use the IRAS continuum maps at 100$\\mu$m and 60$\\mu$m to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the ``output'' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental ``complexity'' of these star forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more ``complex'' environments.

Jennifer J. Wiseman; Fred C. Adams



Teaching Neuroinformatics with an Emphasis on Quantitative Locus Analysis  

PubMed Central

Although powerful bioinformatics tools are available for free on the web and are used by neuroscience professionals on a daily basis, neuroscience students are largely ignorant of them. This Neuroinformatics module weaves together several bioinformatics tools to make a comprehensive unit. This unit encompasses quantifying a phenotype through a Quantitative Trait Locus (QTL) analysis, which links phenotype to loci on chromosomes that likely had an impact on the phenotype. Students then are able to sift through a list of genes in the region(s) of the chromosome identified by the QTL analysis and find a candidate gene that has relatively high expression in the brain region of interest. Once such a candidate gene is identified, students can find out more information about the gene, including the cells/layers in which it is expressed, the sequence of the gene, and an article about the gene. All of the resources employed are available at no cost via the internet. Didactic elements of this instructional module include genetics, neuroanatomy, Quantitative Trait Locus analysis, molecular techniques in neuroscience, and statistics—including multiple regression, ANOVA, and a bootstrap technique. This module was presented at the Faculty for Undergraduate Neuroscience (FUN) 2011 Workshop at Pomona College and can be accessed at PMID:23493834

Grisham, William; Korey, Christopher A.; Schottler, Natalie A.; McCauley, Lisa Beck; Beatty, Jackson



Quantitative Analysis Of Cristobalite In The Presence Of Quartz  

NASA Astrophysics Data System (ADS)

The detection and quantitation of cristobalite in quartz is necessary to calculate threshold limit values (TLV) for free crystalline silica (FCS) as proposed by the American Conference of Governmental Industrial Hygienists (ACGIH). The cristobalite standard used in this study was made by heating diatomaceous earth to the transition temperature for cristobalite. The potassium bromide (KBr) pellet method was used for the analysis. Potassium cyanide (KCN) was used as an internal standard. Samples ranged from 5% to 30% cristobalite in quartz. Precision for this method is within 2%.

Totten, Gary A.



Lipid biomarker analysis for the quantitative analysis of airborne microorganisms  

SciTech Connect

There is an ever increasing concern regarding the presence of airborne microbial contaminants within indoor air environments. Exposure to such biocontaminants can give rise to large numbers of different health effects including infectious diseases, allergenic responses and respiratory problems. Biocontaminants typically found in indoor air environments include bacteria, fungi, algae, protozoa and dust mites. Mycotoxins, endotoxins, pollens and residues of organisms are also known to cause adverse health effects. A quantitative detection/identification technique independent of culturability that assays both culturable and non-culturable biomass including endotoxin is critical in defining risks from indoor air biocontamination. Traditionally, methods employed for the monitoring of microorganism numbers in indoor air environments involve classical culture-based techniques and/or direct microscopic counting. It has been repeatedly documented that viable microorganism counts only account for between 0.1-10% of the total community detectable by direct counting. The classic viable microbiologic approach does not provide accurate estimates of microbial fragments or other indoor air components that can act as antigens and induce or potentiate allergic responses. Although bioaerosol samplers are designed to damage the microbes as little as possible, microbial stress has been shown to result from air sampling, aerosolization and microbial collection. Higher collection efficiency results in greater cell damage while less cell damage often results in lower collection efficiency. Filtration can collect particulates at almost 100% efficiency, but captured microorganisms may become dehydrated and damaged, resulting in non-culturability; however, the lipid biomarker assays described herein do not rely on cell culture. Lipids are components that are universally distributed throughout cells, providing a means of assessment independent of culturability.

Macnaughton, S.J.; Jenkins, T.L.; Cormier, M.R. [Microbial Insights Inc., Rockford, TN (United States)] [and others]




E-print Network

GINI COEFFICIENTS, SOCIAL NETWORK ANALYSIS, AND MARKOV CHAINS: QUANTITATIVE METHODS FOR ANALYZING MANAGEMENT. Project No.: 538. Title of Project: Gini Coefficients, Social Network Analysis and Markov Chains


Quantitative analysis of volume images: electron microscopic tomography of HIV  

NASA Astrophysics Data System (ADS)

Three-dimensional objects should be represented by 3D images. So far, most of the evaluation of images of 3D objects has been done visually, either by looking at slices through the volumes or by looking at 3D graphic representations of the data. In many applications a more quantitative evaluation would be valuable. Our application is the analysis of volume images of the causative agent of the acquired immune deficiency syndrome (AIDS), namely the human immunodeficiency virus (HIV), produced by electron microscopic tomography (EMT). A structural analysis of the virus is of importance. The representation of some of the interesting structural features will depend on the orientation and the position of the object relative to the digitization grid. We describe a method of defining orientation and position of objects based on the moment of inertia of the objects in the volume image. In addition to a direct quantification of the 3D object, a quantitative description of the convex deficiency may provide valuable information about the geometrical properties. The convex deficiency is the volume object subtracted from its convex hull. We describe an algorithm for creating an enclosing polyhedron approximating the convex hull of an arbitrarily shaped object.
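The orientation method the abstract describes is based on the moment of inertia of the segmented object. A minimal numpy sketch of that idea (function and variable names are ours, not the paper's): build the inertia tensor from the voxel coordinates of a binary volume and take its eigenvectors as the principal axes.

```python
import numpy as np

def principal_axes(volume):
    """Position and orientation of a binary 3-D object from its inertia tensor."""
    coords = np.argwhere(volume > 0).astype(float)   # (N, 3) voxel coordinates
    centroid = coords.mean(axis=0)                   # position of the object
    d = coords - centroid                            # centred coordinates
    # Moment-of-inertia tensor: I = sum_i (|d_i|^2 * Id - d_i d_i^T)
    inertia = np.eye(3) * (d ** 2).sum() - d.T @ d
    evals, evecs = np.linalg.eigh(inertia)           # eigenvalues ascending
    return centroid, evecs.T                         # rows are principal axes

# Example: a thin rod along the z axis; its smallest-inertia axis is z.
vol = np.zeros((9, 9, 9), dtype=np.uint8)
vol[4, 4, 1:8] = 1
centroid, axes = principal_axes(vol)
```

The first returned axis (smallest inertia) is the object's elongation direction, which fixes a grid-independent reference frame for subsequent measurements.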

Nystroem, Ingela; Bengtsson, Ewert W.; Nordin, Bo G.; Borgefors, Gunilla



Fusing Quantitative Requirements Analysis with Model-based Systems Engineering  

NASA Technical Reports Server (NTRS)

A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven



Quantitative Analysis of Genetic and Neuronal Multi-Perturbation Experiments  

PubMed Central

Perturbation studies, in which functional performance is measured after deletion, mutation, or lesion of elements of a biological system, have been traditionally employed in many fields in biology. The vast majority of these studies have been qualitative and have employed single perturbations, often resulting in little phenotypic effect. Recently, newly emerging experimental techniques have allowed researchers to carry out concomitant multi-perturbations and to uncover the causal functional contributions of system elements. This study presents a rigorous and quantitative multi-perturbation analysis of gene knockout and neuronal ablation experiments. In both cases, a quantification of the elements' contributions, and new insights and predictions, are provided. Multi-perturbation analysis has a potentially wide range of applications and is gradually becoming an essential tool in biology. PMID:16322764
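The abstract does not give the quantification formula; one established way to assign causal contributions from multi-perturbation (knockout/ablation) data is the Shapley value, sketched here on a toy three-element system. The data set and names are illustrative, not from the paper.

```python
from itertools import permutations
from math import factorial

def shapley(elements, perf):
    """Shapley-value contribution of each element.

    perf(frozenset) -> measured performance when exactly that subset is intact;
    each element's contribution is its marginal effect averaged over orderings.
    """
    contrib = {e: 0.0 for e in elements}
    for order in permutations(elements):
        present = set()
        for e in order:
            before = perf(frozenset(present))
            present.add(e)
            contrib[e] += perf(frozenset(present)) - before
    n_orders = factorial(len(elements))
    return {e: c / n_orders for e, c in contrib.items()}

# Toy multi-perturbation data: A is essential, while B and C are redundant
# with each other (performance needs A plus at least one of B, C).
data = {frozenset(s): 1.0 if ('A' in s and ('B' in s or 'C' in s)) else 0.0
        for s in ['', 'A', 'B', 'C', 'AB', 'AC', 'BC', 'ABC']}
scores = shapley('ABC', lambda s: data[s])
```

Note how the redundant pair B and C split one unit of credit between them while A receives the larger share, exactly the kind of insight single perturbations miss.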

Meilijson, Isaac; Kupiec, Martin; Ruppin, Eytan



A method for quantitative wet chemical analysis of urinary calculi.  


We describe a simple method for quantitative chemical analysis of urinary calculi requiring no specialized equipment. Pulverized calculi are dried over silica gel at room temperature and dissolved in nitric acid, which was the only effective agent for complete dissolution. Calcium, magnesium, ammonium, and phosphate are then determined by conventional methods. Oxalate is determined by a method based on the quenching action of oxalate on the fluorescence of a zirconium-flavonol complex. Uric acid, when treated with nitric acid, is stoichiometrically converted to alloxan, which is determined fluorimetrically with 1,2-phenylenediamine. Similarly, cystine is oxidized by nitric acid to sulfate, which is determined turbidimetrically as barium sulfate. Protein is determined spectrophotometrically as xanthoprotein. The total mass recovery of authentic calculi was 92.2 +/- 6.7 (SD) per cent. The method permits analysis of calculi as small as 1.0 mg. Internal quality control is performed with specially designed control samples. PMID:6086179

Larsson, L; Sörbo, B; Tiselius, H G; Ohman, S



[Quantitative analysis of transformer oil dissolved gases using FTIR].  


Chromatographic on-line monitoring of dissolved gases in transformer oil has drawbacks: it requires a carrier gas and regular calibration, and its safety is limited. We therefore attempted to establish a dissolved gas analysis (DGA) system based on Fourier transform infrared (FTIR) spectroscopy. Taking into account the small amounts of the characteristic gases, the number of components, the detection-limit and safety requirements, and the difficulty for the degasser of excluding interfering gases, a quantitative analysis model was established based on sparse partial least squares, piecewise section correction, and a feature-variable extraction algorithm using improved Tikhonov (TR) regularization. For the characteristic gases CH4, C2H4, C2H6, and CO2, the results show that FTIR meets DGA requirements with a spectral resolution of 1 cm(-1) and an optical path of 10 cm. PMID:24369641
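The abstract's full model (sparse PLS with piecewise correction) cannot be reconstructed from the abstract alone, but the underlying calibration step, inverting overlapping component spectra under Beer-Lambert linearity with Tikhonov regularization, can be sketched as follows. The pure-component spectra and concentrations here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration: the measured absorbance spectrum is a linear mixture
# of two strongly overlapping pure-component spectra (Beer-Lambert), plus noise.
wn = np.linspace(0, 1, 200)                       # normalized "wavenumber" axis
k1 = np.exp(-((wn - 0.4) / 0.05) ** 2)            # pure spectrum, component 1
k2 = np.exp(-((wn - 0.5) / 0.08) ** 2)            # pure spectrum, component 2
K = np.column_stack([k1, k2])                     # (200, 2) pure-component matrix

c_true = np.array([0.7, 0.3])                     # true concentrations
a = K @ c_true + 1e-3 * rng.standard_normal(200)  # measured mixture spectrum

# Tikhonov-regularized inversion: c = (K'K + lam*I)^-1 K' a
lam = 1e-6
c_est = np.linalg.solve(K.T @ K + lam * np.eye(2), K.T @ a)
```

Even with heavily overlapped bands, the regularized inversion recovers both concentrations; the regularization weight trades noise amplification against bias.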

Zhao, An-xin; Tang, Xiao-jun; Wang, Er-zhen; Zhang, Zhong-hua; Liu, Jun-hua



Glioblastoma multiforme: exploratory radiogenomic analysis by using quantitative image features.  


Purpose To derive quantitative image features from magnetic resonance (MR) images that characterize the radiographic phenotype of glioblastoma multiforme (GBM) lesions and to create radiogenomic maps associating these features with various molecular data. Materials and Methods Clinical, molecular, and MR imaging data for GBMs in 55 patients were obtained from the Cancer Genome Atlas and the Cancer Imaging Archive after local ethics committee and institutional review board approval. Regions of interest (ROIs) corresponding to the enhancing and necrotic portions of the tumor and to peritumoral edema were drawn, and quantitative image features were derived from these ROIs. Robust quantitative image features were defined on the basis of an intraclass correlation coefficient of 0.6 for a digital algorithmic modification and a test-retest analysis. The robust features were visualized by using hierarchic clustering and were correlated with survival by using Cox proportional hazards modeling. Next, these robust image features were correlated with manual radiologist annotations from the Visually Accessible Rembrandt Images (VASARI) feature set and with GBM molecular subgroups by using nonparametric statistical tests. A bioinformatic algorithm was used to create gene expression modules, defined as sets of coexpressed genes together with a multivariate model of cancer driver genes predictive of each module's expression pattern. Modules were correlated with robust image features by using the Spearman correlation test to create radiogenomic maps and to link robust image features with molecular pathways. Results Eighteen image features passed the robustness analysis and were further analyzed for the three types of ROIs, for a total of 54 image features.
Three enhancement features were significantly correlated with survival, 77 significant correlations were found between robust quantitative features and the VASARI feature set, and seven image features were correlated with molecular subgroups (P < .05 for all). A radiogenomic map was created to link image features with gene expression modules and allowed linkage of 56% (30 of 54) of the image features with biologic processes. Conclusion Radiogenomic approaches in GBM have the potential to predict clinical and molecular characteristics of tumors noninvasively. © RSNA, 2014. Online supplemental material is available for this article. PMID:24827998
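The map-building step the abstract names is a standard nonparametric test: each robust image feature is correlated with each gene-expression module score using the Spearman rank test. A sketch with synthetic data (55 hypothetical patients, invented values; scipy assumed available):

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)

# Toy data: one robust image feature and one gene-expression module score
# across 55 hypothetical patients; the module tracks the feature monotonically.
n = 55
feature = rng.random(n)
module = feature ** 3 + 0.02 * rng.standard_normal(n)  # monotone relation + noise

rho, p = spearmanr(feature, module)  # rank correlation and its p-value
```

Because Spearman works on ranks, the nonlinear (cubic) relationship is still detected as a strong monotone association; in the study this test would be repeated across all feature-module pairs with appropriate multiple-testing control.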

Gevaert, Olivier; Mitchell, Lex A; Achrol, Achal S; Xu, Jiajing; Echegaray, Sebastian; Steinberg, Gary K; Cheshier, Samuel H; Napel, Sandy; Zaharchuk, Greg; Plevritis, Sylvia K



Operational Experiences in Planning and Reconstructing Aqua Inclination Maneuvers  

NASA Technical Reports Server (NTRS)

As the lead satellite in NASA's growing Earth Observing System (EOS) PM constellation, it is increasingly critical that Aqua maintain its various orbit requirements. The two of interest for this paper are maintaining an orbit inclination that provides for a consistent mean local time and a semi-major axis (SMA) that allows for ground track repeatability. Maneuvers to adjust the orbit inclination involve several flight dynamics constraints and complexities which make planning such maneuvers challenging. In particular, coupling between the orbital and attitude degrees of freedom leads to changes in SMA when changes in inclination are effected. A long-term mission mean local time trend analysis was performed in order to determine the size and placement of the required inclination maneuvers. Following this analysis, detailed modeling of each burn and its various segments was performed to determine its effects on the immediate orbit state. Data gathered from an inclination slew test of the spacecraft and the first inclination maneuver uncovered discrepancies in the modeling method that were investigated and resolved. The new modeling techniques were applied and validated during the second spacecraft inclination maneuver. These improvements should position Aqua to successfully complete a series of inclination maneuvers in the fall of 2004. The following paper presents the events and results related to these maneuvers.

Rand, David; Reilly, Jacqueline; Schiff, Conrad



Development of Analytical and Reporting Skills in Quantitative Analysis  

NASA Astrophysics Data System (ADS)

Development of data analysis and reporting skills by Quantitative Analysis students is promoted through a series of activities in lecture and laboratory. Students learn the basics of chemical measurement, data reduction and statistical analysis. They first apply those skills on self-collected laboratory data and report the results in instructor-defined laboratory write-ups. Structured groups are used for some experiments. Following that, students submit experimental write-ups in which they decide what data analysis to do, how to do them and what conclusions to draw. The final step is a self-designed special project in which students propose an analysis project, carry it out, analyze the data and report it in a poster format. Instructor observations and student survey data are presented regarding this set of activities. Students learn to analyze data and draw conclusions and learn to take responsibility for deciding when and how to do those tasks. Students are positive about their learning, but express discomfort at being given the freedom and responsibility to decide what to do.

Eierman, R. J.



Quantitative analysis of cyclic beta-turn models.  

PubMed Central

The beta-turn is a frequently found structural unit in the conformation of globular proteins. Although the circular dichroism (CD) spectra of the alpha-helix and beta-pleated sheet are well defined, there remains some ambiguity concerning the pure component CD spectra of the different types of beta-turns. Recently, it has been reported (Hollósi, M., Kövér, K.E., Holly, S., Radics, L., & Fasman, G.D., 1987, Biopolymers 26, 1527-1572; Perczel, A., Hollósi, M., Foxman, B.M., & Fasman, G.D., 1991a, J. Am. Chem. Soc. 113, 9772-9784) that some pseudohexapeptides (e.g., the cyclo[(delta)Ava-Gly-Pro-Aaa-Gly] where Aaa = Ser, Ser(OtBu), or Gly) in many solvents adopt a conformational mixture of type I and the type II beta-turns, although the X-ray-determined conformation was an ideal type I beta-turn. In addition to these pseudohexapeptides, conformational analysis was also carried out on three pseudotetrapeptides and three pseudooctapeptides. The target of the conformation analysis reported herein was to determine whether the ring stress of the above beta-turn models has an influence on their conformational properties. Quantitative nuclear Overhauser effect (NOE) measurements yielded interproton distances. The conformational average distances so obtained were interpreted utilizing molecular dynamics (MD) simulations to yield the conformational percentages. These conformational ratios were correlated with the conformational weights obtained by quantitative CD analysis of the same compounds. The pure component CD curves of type I and type II beta-turns were also obtained, using a recently developed algorithm (Perczel, A., Tusnády, G., Hollósi, M., & Fasman, G.D., 1991b, Protein Eng. 4(6), 669-679). For the first time the results of a CD deconvolution, based on the CD spectra of 14 beta-turn models, were assigned by quantitative NOE results. The NOE experiments confirmed the ratios of the component curves found for the two major beta-turns by CD analysis. 
These results can now be used to enhance the conformational determination of globular proteins on the basis of their CD spectra. PMID:1304345

Perczel, A.; Fasman, G. D.



From screening to quantitative sensitivity analysis. A unified approach  

NASA Astrophysics Data System (ADS)

The present work is a sequel to a recent one published in this journal, where the superiority of the 'radial design' for computing the 'total sensitivity index' was ascertained. Both concepts belong to sensitivity analysis of model output. A radial design is one whereby, starting from a random point in the hyperspace of the input factors, one step in turn is taken for each factor. The procedure is iterated a number of times with a different starting random point so as to collect a sample of elementary shifts for each factor. The total sensitivity index is a powerful sensitivity measure which can be estimated based on such a sample. Given the similarity between the total sensitivity index and a screening test known as the method of the elementary effects (or method of Morris), we test the radial design on this method. Both methods are best practices: the total sensitivity index in the class of the quantitative measures, and the elementary effects in that of the screening methods. We find that the radial design is indeed superior even for the computation of the elementary effects method. This opens the door to a sensitivity analysis strategy whereby the analyst can start with a small number of points (screening-wise) and then - depending on the results - possibly increase the number of points until a fully quantitative measure can be computed. Also of interest to practitioners is that a radial design is nothing other than an iterated 'One factor At a Time' (OAT) approach. OAT is a radial design of size one. While OAT is not a good practice, modelers in all domains keep using it for sensitivity analysis for reasons discussed elsewhere (Saltelli and Annoni, 2010) [23]. With the present approach modelers are offered a straightforward and economical upgrade of their OAT which maintains OAT's appeal of having just one factor moved at each step.
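A radial design really is an iterated OAT, as the abstract notes. Here is a sketch of elementary-effect estimation from such a design, applied to a toy linear function whose true effects are known; the implementation details (function names, the mu* summary) are ours, with mu* being the mean absolute elementary effect commonly used with the Morris method.

```python
import numpy as np

def radial_elementary_effects(f, k, r, rng):
    """Elementary effects from a radial design on the unit hypercube.

    From each random base point x, each factor i is moved in turn to a new
    random value while the others stay at the base (an iterated OAT step),
    giving one elementary effect per factor per base point.
    """
    effects = np.empty((r, k))
    for j in range(r):
        x = rng.random(k)
        fx = f(x)
        for i in range(k):
            xi = x.copy()
            xi[i] = rng.random()
            if abs(xi[i] - x[i]) < 1e-9:          # degenerate step: force a move
                xi[i] = (x[i] + 0.5) % 1.0
            effects[j, i] = (f(xi) - fx) / (xi[i] - x[i])
    return np.abs(effects).mean(axis=0)           # mu*: mean absolute effect

# Toy model: factor 0 matters ten times more than factor 1; factor 2 is inert.
f = lambda x: 10 * x[0] + 1 * x[1] + 0 * x[2]
mu_star = radial_elementary_effects(f, k=3, r=50, rng=np.random.default_rng(2))
```

For a linear model each elementary effect equals the factor's coefficient exactly, so the screening ranking is recovered from very few model runs; for nonlinear models the same sample can later be reused toward a total-sensitivity estimate.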

Campolongo, Francesca; Saltelli, Andrea; Cariboni, Jessica



A quantitative analysis of chain-schedule performance  

PubMed Central

Six pigeons were trained with a chain variable-interval variable-interval schedule on the left key and with reinforcers available on the right key on a single variable-interval schedule arranged concurrently with both links of the chain. All three schedules were separately and systematically varied over a wide range of mean intervals. During these manipulations, the obtained reinforcer rates on constant arranged schedules also frequently changed systematically. Increasing reinforcer rates in Link 2 of the chain increased response rates in both links and decreased response rates in the variable-interval schedule concurrently available with Link 2. Increasing Link-1 reinforcer rates increased Link-1 response rates and decreased Link-2 response rates. Increasing reinforcer rates on the right-key schedule decreased response rates in Link 1 of the chain but did not affect the rate in Link 2. The results extend and amplify previous analyses of chain-schedule performance and help define the effects that a quantitative model must describe. However, the complexity of the results, and the fact that constant arranged reinforcer schedules did not necessarily lead to constant obtained reinforcer rates, precluded a quantitative analysis. PMID:16812574

Davison, Michael; McCarthy, Dianne



Quantitative analysis of multiple sclerosis: a feasibility study  

NASA Astrophysics Data System (ADS)

Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. In the treatment of MS, measurements of white matter (WM), gray matter (GM), and cerebrospinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) The MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) A new partial volume (PV) model is built in establishing the maximum a posteriori (MAP) segmentation scheme; 3) Noise artifacts are minimized by an a priori Markov random field (MRF) penalty indicating neighborhood correlation from tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.

Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong



Quantitative analysis of incipient mineral loss in hard tissues  

NASA Astrophysics Data System (ADS)

A coupled diffuse-photon-density-wave and thermal-wave theoretical model was developed to describe the biothermophotonic phenomena in multi-layered hard tissue structures. Photothermal Radiometry was applied as a safe, non-destructive, and highly sensitive tool for the detection of early tooth enamel demineralization to test the theory. Extracted human tooth was treated sequentially with an artificial demineralization gel to simulate controlled mineral loss in the enamel. The experimental setup included a semiconductor laser (659 nm, 120 mW) as the source of the photothermal signal. Modulated laser light generated infrared blackbody radiation from teeth upon absorption and nonradiative energy conversion. The infrared flux emitted by the treated region of the tooth surface and sub-surface was monitored with an infrared detector, both before and after treatment. Frequency scans with a laser beam size of 3 mm were performed in order to guarantee one-dimensionality of the photothermal field. TMR images showed clear differences between sound and demineralized enamel, however this technique is destructive. Dental radiographs did not indicate any changes. The photothermal signal showed clear change even after 1 min of gel treatment. As a result of the fittings, thermal and optical properties of sound and demineralized enamel were obtained, which allowed for quantitative differentiation of healthy and non-healthy regions. In conclusion, the developed model was shown to be a promising tool for non-invasive quantitative analysis of early demineralization of hard tissues.

Matvienko, Anna; Mandelis, Andreas; Hellen, Adam; Jeon, Raymond; Abrams, Stephen; Amaechi, Bennett



Quantitative analysis of the reconstruction performance of interpolants  

NASA Technical Reports Server (NTRS)

The analysis presented provides a quantitative measure of the reconstruction or interpolation performance of linear, shift-invariant interpolants. The performance criterion is the mean square error of the difference between the sampled and reconstructed functions. The analysis is applicable to reconstruction algorithms used in image processing and to many types of splines used in numerical analysis and computer graphics. When formulated in the frequency domain, the mean square error clearly separates the contribution of the interpolation method from the contribution of the sampled data. The equations provide a rational basis for selecting an optimal interpolant; that is, one which minimizes the mean square error. The analysis has been applied to a selection of frequently used data splines and reconstruction algorithms: parametric cubic and quintic Hermite splines, exponential and nu splines (including the special case of the cubic spline), parametric cubic convolution, Keys' fourth-order cubic, and a cubic with a discontinuous first derivative. The emphasis in this paper is on the image-dependent case in which no a priori knowledge of the frequency spectrum of the sampled function is assumed.
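The mean-square-error criterion can be illustrated empirically: reconstruct a known signal from its samples with two shift-invariant interpolants and compare the errors. This toy example uses zero-order (nearest-neighbour) and first-order (linear) reconstruction rather than the paper's spline catalogue.

```python
import numpy as np

# Smooth test signal sampled coarsely, then reconstructed on a fine grid
# by two shift-invariant interpolants; the empirical MSE ranks them.
t_fine = np.linspace(0, 1, 1001)
truth = np.sin(2 * np.pi * 3 * t_fine)

t_samp = np.linspace(0, 1, 41)                 # coarse samples, spacing 0.025
samples = np.sin(2 * np.pi * 3 * t_samp)

# Zero-order (nearest-neighbour) reconstruction
idx = np.round(t_fine * 40).astype(int)
nearest = samples[idx]

# First-order (linear) reconstruction
linear = np.interp(t_fine, t_samp, samples)

mse = lambda y: np.mean((y - truth) ** 2)
mse_nearest, mse_linear = mse(nearest), mse(linear)
```

As the paper's frequency-domain formulation predicts, the higher-order kernel attenuates less of the signal's spectrum between samples and yields a markedly smaller mean square error for the same data.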

Lansing, Donald L.; Park, Stephen K.



Quantitative genetic analysis of injury liability in infants and toddlers  

SciTech Connect

A threshold model of latent liability was applied to infant and toddler twin data on total count of injuries sustained during the interval from birth to 36 months of age. A quantitative genetic analysis of estimated twin correlations in injury liability indicated strong genetic dominance effects, but no additive genetic variance was detected. Because interpretations involving overdominance have little research support, the results may be due to low order epistasis or other interaction effects. Boys had more injuries than girls, but this effect was found only for groups whose parents were prompted and questioned in detail about their children's injuries. Activity and impulsivity are two behavioral predictors of childhood injury, and the results are discussed in relation to animal research on infant and adult activity levels, and impulsivity in adult humans. Genetic epidemiological approaches to childhood injury should aid in targeting higher risk children for preventive intervention. 30 refs., 4 figs., 3 tabs.
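The pattern the abstract reports, strong dominance with no detectable additive variance, follows from classical twin-model algebra: with rMZ = A + D and rDZ = A/2 + D/4, a DZ correlation well below half the MZ correlation implicates dominance. A sketch of the Falconer-style point estimates (not the paper's threshold-liability fit; the correlations below are invented to reproduce the pattern):

```python
# Classical ADE twin-model point estimates from MZ and DZ correlations.
def ade_estimates(r_mz, r_dz):
    """rMZ = A + D, rDZ = A/2 + D/4  =>  D = 2(rMZ - 2*rDZ), A = 4*rDZ - rMZ."""
    d2 = 2 * (r_mz - 2 * r_dz)   # dominance variance
    a2 = 4 * r_dz - r_mz         # additive variance
    e2 = 1 - r_mz                # non-shared environment
    return a2, d2, e2

# Illustrative correlations: rDZ is only a quarter of rMZ, so the algebra
# attributes all familial resemblance to dominance and none to additive effects.
a2, d2, e2 = ade_estimates(0.40, 0.10)
```

With rDZ exactly rMZ/4, the additive term vanishes and all genetic variance loads on dominance, mirroring the abstract's conclusion.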

Phillips, K.; Matheny, A.P. Jr. [Univ. of Louisville Medical School, KY (United States)



Early changes in thallium distribution. Effect on quantitative analysis  

SciTech Connect

Thirty-two patients with coronary artery disease and an abnormality on an initial anterior view thallium scan had repeat images obtained after delays of 30 and 240 minutes. Scans were analyzed by quantitative criteria. Comparison of the initial stress study with the 30-minute redistribution scan showed no significant change in 11 patients, defects becoming smaller in 13 patients, and defects becoming larger in eight patients. When comparing the stress or the early redistribution images with the late redistribution scans, the diagnosis (e.g., scar vs. ischemia) would have been affected in 14 cases. Analysis of the sources of variability showed that all the apparent worsening but only part of the defect resolution could be explained by variability inherent to repositioning the patient. Thus, the size of an initial defect is very sensitive to the time between the end of exercise and the onset of data collection, and the nature of changes in scan appearance is complex.

Makler, P.T. Jr.; McCarthy, D.M.; Alavi, A.



Quantitative image analysis of WE43-T6 cracking behavior  

NASA Astrophysics Data System (ADS)

Environment-assisted cracking of WE43 cast magnesium (4.2 wt.% Y, 2.3 wt.% Nd, 0.7 wt.% Zr, 0.8 wt.% HRE) in the T6 peak-aged condition was induced in ambient air in notched specimens. The mechanism of fracture was studied using electron backscatter diffraction, serial sectioning and in situ observations of crack propagation. The rare-earth-enriched divorced intermetallic, retained at grain boundaries and predominantly at triple points, was found to play a significant role in initiating the cracks that lead to failure of this material. Quantitative measurements were required for this project. The populations of intermetallic particles and clusters of intermetallic particles were analyzed using image analysis of metallographic images. This is part of the work to generate a theoretical model of the effect of notch geometry on the static fatigue strength of this material.

Ahmad, A.; Yahya, Z.



Quantitative comparison of analysis methods for spectroscopic optical coherence tomography  

PubMed Central

Spectroscopic optical coherence tomography (sOCT) enables the mapping of chromophore concentrations and image contrast enhancement in tissue. Acquisition of depth resolved spectra by sOCT requires analysis methods with optimal spectral/spatial resolution and spectral recovery. In this article, we quantitatively compare the available methods, i.e. the short time Fourier transform (STFT), wavelet transforms, the Wigner-Ville distribution and the dual window method through simulations in tissue-like media. We conclude that all methods suffer from the trade-off in spectral/spatial resolution, and that the STFT is the optimal method for the specific application of the localized quantification of hemoglobin concentration and oxygen saturation. PMID:24298417
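The trade-off the authors quantify can be seen in a few lines: a short-time Fourier transform localizes a depth-varying spectrum, with spectral resolution set by the window length. A synthetic illustration (not sOCT data; window and signal parameters are invented):

```python
import numpy as np

def stft_mag(x, win, hop):
    """Magnitude short-time Fourier transform; the window length fixes the
    spectral/spatial resolution trade-off discussed in the abstract."""
    frames = []
    for start in range(0, len(x) - len(win) + 1, hop):
        seg = x[start:start + len(win)] * win
        frames.append(np.abs(np.fft.rfft(seg)))
    return np.array(frames)          # (n_frames, n_bins)

# Depth-dependent signal: low frequency in the first half, high in the second
# (a stand-in for a depth-varying absorption spectrum).
n = 2048
t = np.arange(n)
x = np.where(t < n // 2, np.sin(2 * np.pi * 0.05 * t), np.sin(2 * np.pi * 0.20 * t))

S = stft_mag(x, np.hanning(128), hop=64)
early, late = S[2].argmax(), S[-3].argmax()   # dominant bin early vs. late
```

A longer window would sharpen the two spectral peaks but smear the depth (time) boundary between them, which is exactly the resolution trade-off the paper compares across analysis methods.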

Bosschaart, Nienke; van Leeuwen, Ton G.; Aalders, Maurice C. G.; Faber, Dirk J.



Quantitative analysis of gallstones using laser-induced breakdown spectroscopy  

SciTech Connect

The utility of laser-induced breakdown spectroscopy (LIBS) for categorizing different types of gallbladder stone has been demonstrated by analyzing their major and minor constituents. LIBS spectra of three types of gallstone have been recorded in the 200-900 nm spectral region. Calcium is found to be the major element in all types of gallbladder stone. The spectrophotometric method has been used to classify the stones. A calibration-free LIBS method has been used for the quantitative analysis of metal elements, and the results have been compared with those obtained from inductively coupled plasma atomic emission spectroscopy (ICP-AES) measurements. The single-shot LIBS spectra from different points on the cross section (in steps of 0.5 mm from one end to the other) of gallstones have also been recorded to study the variation of constituents from the center to the surface. The presence of different metal elements and their possible role in gallstone formation is discussed.

Singh, Vivek K.; Singh, Vinita; Rai, Awadhesh K.; Thakur, Surya N.; Rai, Pradeep K.; Singh, Jagdish P



Quantitative analysis of forest island pattern in selected Ohio landscapes  

SciTech Connect

The purpose of this study was to quantitatively describe the various aspects of regional distribution patterns of forest islands and relate those patterns to other landscape features. Several maps showing the forest cover of various counties in Ohio were selected as representative examples of forest patterns to be quantified. Ten thousand hectare study areas (landscapes) were delineated on each map. A total of 15 landscapes representing a wide variety of forest island patterns was chosen. Data were converted into a series of continuous variables which contained information pertinent to the sizes, shape, numbers, and spacing of woodlots within a landscape. The continuous variables were used in a factor analysis to describe the variation among landscapes in terms of forest island pattern. The results showed that forest island patterns are related to topography and other environmental features correlated with topography.

Bowen, G.W.; Burgess, R.L.



A Novel Quantitative Approach to Concept Analysis: The Internomological Network  

PubMed Central

Background When a construct such as patients' transition to self-management of chronic illness is studied by researchers across multiple disciplines, the meaning of key terms can become confused. This results from inherent problems in language where a term can have multiple meanings (polysemy) and different words can mean the same thing (synonymy). Objectives To test a novel quantitative method for clarifying the meaning of constructs by examining the similarity of published contexts in which they are used. Method Published terms related to the concept transition to self-management of chronic illness were analyzed using the internomological network (INN), a type of latent semantic analysis that calculates the mathematical relationships between constructs based on the contexts in which researchers use each term. This novel approach was tested by comparing results to those from concept analysis, a best-practice qualitative approach to clarifying meanings of terms. By comparing results of the two methods, the best synonyms of transition to self-management, as well as key antecedent, attribute, and consequence terms, were identified. Results Results from INN analysis were consistent with those from concept analysis. The potential synonyms self-management, transition, and adaptation had the greatest utility. Adaptation was the clearest overall synonym, but had lower cross-disciplinary use. The terms coping and readiness had more circumscribed meanings. The INN analysis confirmed key features of transition to self-management, and suggested related concepts not found by the previous review. Discussion The INN analysis is a promising novel methodology that allows researchers to quantify the semantic relationships between constructs. The method works across disciplinary boundaries, and may help to integrate the diverse literature on self-management of chronic illness. PMID:22592387
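The INN is described as a type of latent semantic analysis over publication contexts. The core LSA computation, truncated SVD of a term-by-context matrix followed by cosine similarity in the latent space, can be sketched on a toy matrix; the terms and counts below are invented for illustration.

```python
import numpy as np

# Tiny term-by-context matrix: rows are terms, columns are publication contexts.
# "transition" and "adaptation" appear in similar contexts; "coping" does not.
terms = ["transition", "adaptation", "coping"]
X = np.array([[3.0, 2.0, 0.0, 1.0],
              [2.0, 3.0, 0.0, 1.0],
              [0.0, 0.0, 4.0, 3.0]])

# Latent semantic analysis: truncated SVD, then cosine similarity of the
# term vectors in the reduced latent space.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
latent = U[:, :k] * s[:k]                        # term vectors, k dimensions

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

sim_syn = cos(latent[0], latent[1])   # transition vs. adaptation: near-synonyms
sim_far = cos(latent[0], latent[2])   # transition vs. coping: distinct usage
```

High latent-space similarity flags candidate synonyms (polysemy/synonymy resolution), which is the comparison the paper validates against qualitative concept analysis.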

Cook, Paul F.; Larsen, Kai R.; Sakraida, Teresa J.; Pedro, Leli



Quantitative analysis on sensitivity of shearography in NDT  

NASA Astrophysics Data System (ADS)

Shearography is a powerful optical technique for both stress analysis and nondestructive testing (NDT) of composites. The sensitivity of the method, however, is often a source of confusion. At its maximum, the method is as sensitive as holography, and the sensitivity is approximately proportional to the shear distance when the shear distance is relatively small. How the sensitivity changes from zero to this maximum is a useful question for NDT applications, because the size of defects is often not large compared with the shear distance. In this paper, the interpretation method for shearography was studied first. A new method of interpreting shearograms, the so-called 'difference of twin points' displacements', was presented. This method does not rely on the assumption of a small shear distance and can be used for sensitivity analysis at any shear distance. A mechanical model of the defect was built to analyze the sensitivity of shearography in NDT. The quantitative relationship between the sensitivity of shearography and the shear distance was derived mathematically, and the error of the classical interpretation was analyzed.

Guo, Guangping; Qin, Yuwen



Application of neural networks to quantitative spectrometry analysis  

NASA Astrophysics Data System (ADS)

Accurate quantitative analysis of complex spectra (fission and activation products) relies upon experts' knowledge; in some cases several hours, or even days, of tedious calculation are needed. This is because current software is unable to solve deconvolution problems when several rays overlap. We have shown that such analysis can be correctly handled by a neural network, and that the procedure can be automated with minimal laboratory measurements for network training, as long as all the elements of the analysed solution appear in the training set and adequate scaling of the input data is performed. Once the network has been trained, analysis is carried out in a few seconds. In an intercomparison test among several well-known laboratories, in which unknown quantities of 57Co, 58Co, 85Sr, 88Y, 131I, 139Ce and 141Ce present in a sample had to be determined, the results yielded by our network ranked it among the best. The method is described, including the experimental setup and measurements, training-set design, definition of the relevant input parameters, input-data scaling and network training. The main results are presented together with a statistical model allowing prediction of the network's error.
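A toy version of the idea, a network trained on synthetic overlapped spectra with known quantities and then applied to an unseen mixture, can be sketched as follows. This uses a single linear layer trained by gradient descent, far simpler than the authors' networks; the "nuclide" responses and all parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy "gamma spectra": each of 3 nuclides has a fixed response over 64 channels,
# with two peaks deliberately overlapping; measurements are noisy linear mixes.
ch = np.arange(64)
peaks = [18, 22, 40]
R = np.stack([np.exp(-0.5 * ((ch - p) / 2.5) ** 2) for p in peaks])  # (3, 64)

A_train = rng.random((200, 3))                     # known activities (training)
X_train = A_train @ R + 0.01 * rng.standard_normal((200, 64))

# One-layer linear network trained by gradient descent on squared error.
W = np.zeros((64, 3))
lr = 0.05
for _ in range(2000):
    pred = X_train @ W
    grad = X_train.T @ (pred - A_train) / len(X_train)
    W -= lr * grad

x_new = np.array([0.5, 0.2, 0.9]) @ R              # unseen mixed spectrum
a_est = x_new @ W                                   # estimated activities
```

Despite the overlap of the peaks at channels 18 and 22, the trained network deconvolves the unseen mixture correctly, which is the capability the abstract credits the network with where ray-overlap defeats conventional software.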

Pilato, V.; Tola, F.; Martinez, J. M.; Huver, M.



Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops  

NASA Technical Reports Server (NTRS)

This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine whether engineering modeling is applicable to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

Shortle, John F.; Allocco, Michael



Quantitative analysis of triple-mutant genetic interactions.  


The quantitative analysis of genetic interactions between pairs of gene mutations has proven to be effective for characterizing cellular functions, but it can miss important interactions for functionally redundant genes. To address this limitation, we have developed an approach termed triple-mutant analysis (TMA). The procedure relies on a query strain that contains two deletions in a pair of redundant or otherwise related genes, which is crossed against a panel of candidate deletion strains to isolate triple mutants and measure their growth. A central feature of TMA is to interrogate mutants that are synthetically sick when two other genes are deleted but interact minimally with either single deletion. This approach has been valuable for discovering genes that restore critical functions when the principal actors are deleted. TMA has also uncovered double-mutant combinations that produce severe defects because a third protein becomes deregulated and acts in a deleterious fashion, and it has revealed functional differences between proteins presumed to act together. The protocol is optimized for Singer ROTOR pinning robots, takes 3 weeks to complete and measures interactions for up to 30 double mutants against a library of 1,536 single mutants. PMID:25010907

Braberg, Hannes; Alexander, Richard; Shales, Michael; Xu, Jiewei; Franks-Skiba, Kathleen E; Wu, Qiuqin; Haber, James E; Krogan, Nevan J



Nanotechnology patents in the automotive industry (a quantitative & qualitative analysis).  


The aim of this article is to present trends in patent filings for applications of nanotechnology in the automobile sector worldwide, using keyword-based patent searches. An overview of nanotechnology-related patents in the automobile industry is provided. The work begins with a worldwide patent search to find patents on nanotechnology in the automobile industry and classifies them according to the automobile parts to which they relate and the solutions they provide. Graphs were then produced to reveal trends, and the patents were analyzed across the various classifications. The trends shown in the graphs provide the quantitative analysis, whereas the qualitative analysis is presented in a separate section. The classification of patents by the solution they provide was performed by reading the claims, titles, abstracts and full texts separately. The patentability of nanotechnology inventions is discussed to give an idea of the requirements and statutory bars to patenting such inventions. A further objective is to suggest an appropriate framework for companies regarding the use of nanotechnology in the automobile industry, and a strategy for patenting the related inventions. For example, the US patent document US2008-019426A1 discusses an invention related to a lubricant composition. This patent was studied and classified under automobile parts, and it was deduced that it addresses the problem of friction in the engine. One classification is based on the automobile part, the other on the problem being solved; hence two classes, reduction in friction and engine, were created. After studying all the patents in the same way, a similar matrix was created. PMID:25336172

Prasad, Raghavendra; Bandyopadhyay, Tapas K



Rapid inorganic ion analysis using quantitative microchip capillary electrophoresis.  


Rapid quantitative microchip capillary electrophoresis (CE) for online monitoring of drinking water, enabling inorganic ion separation in less than 15 s, is presented. When cationic and anionic standards at different concentrations were compared, the analysis of the cationic species resulted in non-linear calibration curves. We interpret this effect as a variation in the volume of the injected sample plug caused by changes of the electroosmotic flow (EOF) due to the strong interaction of bivalent cations with the glass surface; this explanation is supported by the observation of severe peak tailing. Conducting microchip CE analysis in a glass microchannel, optimized conditions were obtained for the cationic species K+, Na+, Ca2+ and Mg2+ using a background electrolyte consisting of 30 mmol/L histidine and 2-(N-morpholino)ethanesulfonic acid, containing 0.5 mmol/L potassium chloride to reduce surface interaction and 4 mmol/L tartaric acid as a complexing agent, resulting in a pH value of 5.8. Applying reversed-EOF co-migration for the anionic species Cl-, SO42- and HCO3-, optimized separation occurs in a background electrolyte consisting of 10 mmol/L 4-(2-hydroxyethyl)-1-piperazineethanesulfonic acid (HEPES) and 10 mmol/L HEPES sodium salt, containing 0.05 mmol/L CTAB (cetyltrimethylammonium bromide), resulting in a pH value of 7.5. The detection limits are 20 micromol/L for the monovalent cationic and anionic species and 10 micromol/L for the divalent species. These values make the method very suitable for many applications, including the analysis of abundant ions in tap water as demonstrated in this paper. PMID:16310794
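The calibration-curve arithmetic behind detection limits like those quoted above can be sketched as follows; the peak areas and concentrations are hypothetical, and the 3-sigma detection-limit estimate is a common convention, not necessarily the exact procedure used in the paper:

```python
import numpy as np

# Hypothetical calibration data: peak area vs. concentration (micromol/L).
conc = np.array([10.0, 20.0, 50.0, 100.0, 200.0])
area = np.array([0.52, 1.01, 2.49, 5.10, 9.95])   # illustrative, roughly linear

# Least-squares calibration line: area = slope * conc + intercept.
slope, intercept = np.polyfit(conc, area, 1)

# Residual standard deviation of the fit (2 fitted parameters).
residuals = area - (slope * conc + intercept)
s_res = residuals.std(ddof=2)

# A common 3-sigma detection-limit estimate.
lod = 3 * s_res / slope
print(f"slope = {slope:.4f} area units per micromol/L, LOD ~ {lod:.1f} micromol/L")
```

A strongly curved calibration, as reported for the bivalent cations, would show up here as structured residuals and an inflated `s_res`.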

Vrouwe, Elwin X; Luttge, Regina; Olthuis, Wouter; van den Berg, Albert



The Quantitative Analysis of an Analgesic Tablet: An NMR Experiment for the Instrumental Analysis Course  

Microsoft Academic Search

A quantitative analysis experiment is outlined that uses 13C NMR. Initial work utilizes a known compound (acenapthene) to assess the type of NMR experiment necessary to achieve a proportional response from all of the carbons in the compound. Both gated decoupling and inverse gated decoupling routines with a variety of delay times are inspected, in addition to investigation of paramagnetic

Thomas A. Schmedake; Lawrence E. Welch



Quantitative high-throughput analysis of transcription factor binding specificities.  


We present a general high-throughput approach to accurately quantify DNA-protein interactions, which can facilitate the identification of functional genetic polymorphisms. The method, tested here on two structurally distinct transcription factors (TFs), NF-kappaB and OCT-1, comprises three steps: (i) optimized selection of DNA variants to be tested experimentally, which we show is superior to selecting variants at random; (ii) a quantitative protein-DNA binding assay using microarray and surface plasmon resonance technologies; (iii) prediction of binding affinity for all DNA variants in the consensus space using a statistical model based on principal coordinates analysis. For the protein-DNA binding assay, we identified a polyacrylamide/ester glass activation chemistry which formed exclusively covalent bonds with 5'-amino-modified DNA duplexes and hindered non-specific electrostatic attachment of DNA. Full accessibility of the DNA duplexes attached to polyacrylamide-modified slides was confirmed by the high degree of data correlation with the electrophoretic mobility shift assay (correlation coefficient 93%). This approach offers the potential for high-throughput determination of TF binding profiles and predicting the effects of single nucleotide polymorphisms on TF binding affinity. New DNA binding data for OCT-1 are presented. PMID:14990752

Linnell, Jane; Mott, Richard; Field, Simon; Kwiatkowski, Dominic P; Ragoussis, Jiannis; Udalova, Irina A



Quantitative analysis of ceramics by laser-induced breakdown spectroscopy  

NASA Astrophysics Data System (ADS)

A quantitative elemental analysis of ceramics was carried out with laser-induced breakdown spectroscopy. A Q-switched Nd:YAG laser was focused on ceramic targets in an argon atmosphere at reduced pressure, and the emission spectra from laser-induced plasma were measured using time-resolved spectroscopy. The experimental results showed that in argon at approximately 200 Torr, the spectral line intensity and the line-to-background ratio were maximized by observing the laser plasma with a time delay of 0.4 μs. Also, time-resolved measurement of a spectrum in the initial stage of plasma generation (~1 μs) was effective for improving the slope of the calibration curve. Based on the results, standard ceramic samples were analyzed for magnesium, aluminum, calcium, iron and titanium, and linear calibration curves with a slope of unity were obtained by measuring spectra with a gate width of 0.4 μs at a delay time of 0.4 μs after the laser pulse in argon at 200 Torr.

Kuzuya, M.; Murakami, M.; Maruyama, N.



SILAC-based quantitative proteomic analysis of gastric cancer secretome  

PubMed Central

Purpose Gastric cancer is a commonly occurring cancer in Asia and one of the leading causes of cancer deaths. However, there is no reliable blood-based screening test for this cancer. Identifying proteins secreted from tumor cells could lead to the discovery of clinically useful biomarkers for early detection of gastric cancer. Experimental design A SILAC-based quantitative proteomic approach was employed to identify secreted proteins that were differentially expressed between neoplastic and non-neoplastic gastric epithelial cells. Proteins from the secretome were subjected to SDS-PAGE and SCX-based fractionation, followed by mass spectrometric analysis on an LTQ-Orbitrap Velos mass spectrometer. Immunohistochemical labeling was employed to validate a subset of candidates using tissue microarrays. Results We identified 2,205 proteins in the gastric cancer secretome of which 263 proteins were overexpressed >4-fold in gastric cancer-derived cell lines as compared to non-neoplastic gastric epithelial cells. Three candidate proteins, proprotein convertase subtilisin/kexin type 9 (PCSK9), lectin mannose binding 2 (LMAN2) and PDGFA associated protein 1 (PDAP1), were validated by immunohistochemical labeling. Conclusions and clinical relevance We report here the largest cancer secretome described to date. The novel biomarkers identified in the current study are excellent candidates for further testing as early detection biomarkers for gastric adenocarcinoma. PMID:23161554
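The >4-fold overexpression filter applied to the SILAC ratios above can be sketched as below; the protein names echo the candidates in the abstract, but all intensity values are invented for illustration:

```python
# Hypothetical SILAC (heavy, light) channel intensities per protein.
# In a SILAC experiment the ratio of the two channels gives the relative
# abundance between the compared cell populations.
proteins = {
    "PCSK9": (8.2e6, 1.6e6),
    "LMAN2": (5.5e6, 1.1e6),
    "PDAP1": (9.1e6, 1.7e6),
    "ACTB":  (3.0e6, 2.9e6),   # housekeeping control, ratio near 1
}

ratios = {name: heavy / light for name, (heavy, light) in proteins.items()}

# Keep proteins overexpressed more than 4-fold, as in the abstract's cutoff.
overexpressed = sorted(name for name, r in ratios.items() if r > 4)
print(overexpressed)
```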

Marimuthu, Arivusudar; Subbannayya, Yashwanth; Sahasrabuddhe, Nandini A.; Balakrishnan, Lavanya; Syed, Nazia; Sekhar, Nirujogi Raja; Katte, Teesta V.; Pinto, Sneha M.; Srikanth, Srinivas M.; Kumar, Praveen; Pawar, Harsh; Kashyap, Manoj K.; Maharudraiah, Jagadeesha; Ashktorab, Hassan; Smoot, Duane T; Ramaswamy, Girija; Kumar, Rekha V.; Cheng, Yulan; Meltzer, Stephen J; Roa, Juan Carlos; Chaerkady, Raghothama; Prasad, T.S. Keshava; Harsha, H. C.; Chatterjee, Aditi; Pandey, Akhilesh



A new method for the quantitative analysis of endodontic microleakage.  


The aim of this in vitro study was to evaluate the apical seal obtained with three commonly used root canal sealing cements: Sealapex, AH Plus or Topseal, and Sealite, using a new method based on the quantitative analysis of 125I-radiolabeled lysozyme penetration. One hundred thirteen teeth with straight single root canals were instrumented to master apical point #25/30. These were divided into three groups: (i) negative control (4 roots) covered with two layers of nail polish, (ii) test group (105 roots) obturated by laterally condensed guttapercha with the three cements; and (iii) positive control (4 roots) obturated without cement. The groups were then immersed in 125I lysozyme solution for a period of 1, 7, 14, or 28 days. After removal, six sections of 0.8 mm length each were made of each root with a fine diamond wire. Each section was analyzed for activity by a gamma counter, corrected for decay, and used to quantify protein penetration. Leakage was high in the positive control and almost negligible in the negative control. AH Plus (Topseal) and Sealapex showed similar leakage behavior over time, with AH Plus (Topseal) performing better. Sealite showed acceptable leakage up until day 14, after which a large increase occurred, presumably due to three-dimensional instability. PMID:10321181
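The decay correction mentioned above can be sketched as follows. The 125I physical half-life of about 59.4 days is a known constant; the measured count value is hypothetical, and this is only the standard back-correction to the reference date, not the study's full quantitation procedure:

```python
T_HALF_I125 = 59.4  # days; physical half-life of iodine-125

def decay_corrected(counts, elapsed_days):
    """Correct a measured count rate back to the reference (immersion) date,
    so sections counted on different days are directly comparable."""
    return counts * 2 ** (elapsed_days / T_HALF_I125)

# A root section counted 28 days after immersion (hypothetical counts):
measured = 1000.0
corrected = decay_corrected(measured, 28.0)
print(f"correction factor at day 28: {corrected / measured:.3f}")
```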

Haïkel, Y; Wittenmeyer, W; Bateman, G; Bentaleb, A; Allemann, C



Comparison of different surface quantitative analysis methods: Application to corium  

NASA Astrophysics Data System (ADS)

In the case of a severe hypothetical accident in a pressurized water reactor, the reactor assembly melts partially or completely. The resulting material, named "corium," flows out and spreads at the bottom of the reactor. In order to limit and control the consequences of such an accident, it is necessary to know precisely the specifications of the basic O-U-Zr system. These specifications should lead to an understanding of the physico-chemical phenomena occurring at very high temperatures, from the study at room temperature of solidified structures. Toward that goal, a corium mix was processed by electron-beam melting at very high temperature (3000 K), followed by quenching of the ingot in an Isabell[1] evaporator. Metallographic analyses were then necessary in order to validate thermodynamic databases established with the Thermo-Calc software.[2,3] The study consists of developing a global quantitative surface analysis method that is fast and reliable, in order to determine the overall composition of the corium.

Guilbaud, Nathalie; Blin, Delphine; Pérodeaud, Phllippe; Dugne, Olivier; Guéneau, Christine



Quantitative SERS sensors for environmental analysis of naphthalene.  


In the investigation of chemical pollutants, such as PAHs (polycyclic aromatic hydrocarbons) at low concentration in aqueous media, surface-enhanced Raman scattering (SERS) offers an alternative to the inherently low cross-section of normal Raman scattering. Indeed, SERS is a very sensitive spectroscopic technique due to the excitation of the surface plasmon modes of the nanostructured metallic film. The surface of quartz substrates was coated with a hydrophobic film obtained by silanization and subsequently reacted with polystyrene (PS) beads coated with gold nanoparticles. The hydrophobic surface of the SERS substrates pre-concentrates non-polar molecules such as naphthalene. Under laser excitation, the SERS-active substrates allow the detection and identification of the target molecules localized close to the gold nanoparticles. The morphology of the SERS substrates based on polystyrene beads surrounded by gold nanoparticles was characterized by scanning electron microscopy (SEM). Furthermore, the Raman fingerprint of the polystyrene serves as an internal spectral reference. To this end, an innovative method to detect and quantify organic molecules, such as naphthalene in the range of 1 to 20 ppm, in aqueous media was carried out. Such SERS-active substrates tend towards an application as quantitative SERS sensors for the environmental analysis of naphthalene. PMID:21165476
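Normalizing the analyte band against the polystyrene internal-reference band, as described above, can be sketched as follows; all band intensities are hypothetical and the linear calibration is an assumed model, not the paper's reported data:

```python
import numpy as np

# Hypothetical band intensities (arbitrary units) at several naphthalene
# concentrations; the polystyrene band serves as the internal reference.
conc_ppm  = np.array([1.0, 5.0, 10.0, 20.0])
naph_band = np.array([120.0, 580.0, 1190.0, 2310.0])
ps_band   = np.array([1000.0, 980.0, 1020.0, 990.0])  # should stay ~constant

# Dividing by the reference band removes laser-power and collection drift.
normalized = naph_band / ps_band
slope, intercept = np.polyfit(conc_ppm, normalized, 1)

def predict_ppm(naph, ps):
    """Estimate concentration of an unknown from its two band intensities."""
    return ((naph / ps) - intercept) / slope

print(f"estimated concentration: {predict_ppm(890.0, 1005.0):.1f} ppm")
```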

Péron, O; Rinnert, E; Toury, T; Lamy de la Chapelle, M; Compère, C



Quantitative analysis of chromosome motion in Drosophila melanogaster  

NASA Astrophysics Data System (ADS)

We present an algorithm for estimating nonrigid motion of chromosomes in 4D microscopic images. Chromosomes are represented by a graph and motion estimation is formulated as a graph matching problem. All chromosomes within the graph are located, and then simulated annealing is used to find the mapping of chromosomes at time t onto chromosomes at time t+1 that minimizes the integrated displacement along each chromosome. Results with actual 4D images indicate that this model-based approach is sufficiently robust to correctly track the motion of chromosomes under conditions of limited spatial and temporal resolution. Using the motion estimate, previously developed methods for the quantitative analysis of 3D structure are extended to four dimensions, allowing chromosome mobility, flexibility, and interactions to be measured. Application of these algorithms to 4D images of Drosophila metaphase chromosomes in vivo allows visualization of clearly defined domains of high chromosomal flexibility, as well as other regions of distinctly lower chromosomal mobility which may coincide with centromeres.
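The displacement-minimizing matching by simulated annealing can be sketched on a toy problem as below; this reduces each chromosome to a single 3-D point and invents all coordinates and annealing parameters, so it illustrates only the optimization idea, not the authors' graph representation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "chromosome" positions at time t and t+1 (six objects, 3-D coordinates).
pts_t = rng.uniform(0, 10, size=(6, 3))
true_perm = rng.permutation(6)
pts_t1 = pts_t[true_perm] + rng.normal(0, 0.05, size=(6, 3))  # small motion

def cost(perm):
    # Total displacement if object i at time t maps to perm[i] at time t+1.
    return np.linalg.norm(pts_t - pts_t1[perm], axis=1).sum()

# Simulated annealing over permutations: propose pairwise swaps, accept
# uphill moves with probability exp(-delta / T), cool gradually.
perm = np.arange(6)
best, best_cost = perm.copy(), cost(perm)
T = 5.0
for step in range(4000):
    i, j = rng.integers(0, 6, size=2)
    cand = perm.copy()
    cand[i], cand[j] = cand[j], cand[i]
    delta = cost(cand) - cost(perm)
    if delta < 0 or rng.random() < np.exp(-delta / T):
        perm = cand
    if cost(perm) < best_cost:
        best, best_cost = perm.copy(), cost(perm)
    T *= 0.999

print("recovered matching:", best.tolist(), "cost:", round(float(best_cost), 3))
```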

Marshall, Wallace F.; Agard, David A.; Sedat, John W.



Quantitative Financial Analysis of Alternative Energy Efficiency Shareholder Incentive Mechanisms  

SciTech Connect

Rising energy prices and climate change are central issues in the debate about our nation's energy policy. Many are demanding increased energy efficiency as a way to help reduce greenhouse gas emissions and lower the total cost of electricity and energy services for consumers and businesses. Yet, as the National Action Plan on Energy Efficiency (NAPEE) pointed out, many utilities continue to shy away from seriously expanding their energy efficiency program offerings because they claim there is insufficient profit-motivation, or even a financial disincentive, when compared to supply-side investments. With the recent introduction of Duke Energy's Save-a-Watt incentive mechanism and ongoing discussions about decoupling, regulators and policymakers are now faced with an expanded and diverse landscape of financial incentive mechanisms. Determining the 'right' way forward to promote deep and sustainable demand-side resource programs is challenging. Due to the renaissance that energy efficiency is currently experiencing, many want to better understand the tradeoffs in stakeholder benefits between these alternative incentive structures before aggressively embarking on a path for which course corrections can be time-consuming and costly. Using a prototypical Southwest utility and a publicly available financial model, we show how various stakeholders (e.g. shareholders, ratepayers, etc.) are affected by these different types of shareholder incentive mechanisms under varying assumptions about program portfolios. This quantitative analysis compares the financial consequences associated with a wide range of alternative incentive structures. The results will help regulators and policymakers better understand the financial implications of DSR program incentive regulation.

Cappers, Peter; Goldman, Charles; Chait, Michele; Edgar, George; Schlegel, Jeff; Shirley, Wayne



Communication about vaccinations in Italian websites: a quantitative analysis.  


Babies' parents and people who look for information about vaccination often visit anti-vaccine movement websites, or blogs by naturopathic physicians and natural or alternative medicine practitioners. The aim of this work is to provide a quantitative analysis of the type of information available to Italian people regarding vaccination and a quality analysis of the websites retrieved through our searches. A quality score was created to evaluate the technical level of websites. A search was performed through Yahoo, Google, and MSN using the keywords "vaccine" and "vaccination," combined with the function "OR," in order to identify the most frequently used websites. The two keywords were input in Italian, and the first 15 pages retrieved by each search engine were analyzed. In total, 149 websites were selected through this methodology. Fifty-three percent of the websites belonged to associations, groups, or scientific companies, 32.2% (n = 48) consisted of a personal blog and 14.8% (n = 22) belonged to National Health System offices. Among all analyzed websites, 15.4% (n = 23) came from anti-vaccine movement groups. 37.6% reported the webmaster's name, 67.8% the webmaster's e-mail, 28.6% indicated the date of the last update and 46.6% the author's name. The quality score for government sites was higher on average than for anti-vaccine websites, although government sites do not use Web 2.0 functions such as forums. National Health System institutions that have to promote vaccination cannot avoid investing in web communication, because it cannot be managed by private efforts alone but must be the result of synergy among Public Health institutions, private and scientific associations, and social movements. PMID:24607988

Tafuri, Silvio; Gallone, Maria S; Gallone, Maria F; Zorico, Ivan; Aiello, Valeria; Germinario, Cinzia



Fabrication of gold tips by chemical etching in aqua regia  

NASA Astrophysics Data System (ADS)

We present a method to produce sharp gold tips for applications in apertureless near-field optical microscopy and spectroscopy. Thin gold wires are tapered by chemical etching in aqua regia, covered by an isooctane protective layer. Tips with apical radii of curvature of <50 nm are obtained with a 40% yield. The tip performances have been checked by shear-force imaging of amyloid fibrils samples and compared to optical fiber probes. The analysis of the tip morphology, carried out by scanning electron microscopy, shows the existence of two different etching processes occurring in bulk and at the liquid-liquid interface. A simple analytical model is presented to describe the dynamics of the tip formation at the liquid-liquid meniscus interface that fits remarkably well the experimental results in terms of tip shape and length.

Bonaccorso, F.; Calogero, G.; Di Marco, G.; Maragò, O. M.; Gucciardi, P. G.; Giorgianni, U.; Channon, K.; Sabatino, G.



Understanding Maneuver Uncertainties during Inclination Maneuvers of the Aqua Spacecraft  

NASA Technical Reports Server (NTRS)

During the Fall 2006 inclination campaign for the Aqua spacecraft it was discovered that there was significant uncertainty in the prediction of the semi-major axis change during a maneuver. The low atmospheric drag environment at the time of the maneuvers amplified the effects of this uncertainty, leading to a potential violation of the spacecraft ground-track requirements. In order to understand the uncertainty, a Monte Carlo simulation was developed to characterize the expected semi-major axis change uncertainty given the observed behavior of the spacecraft propulsion and attitude control systems during a maneuver. This expected uncertainty was then used to develop new analysis tools to ensure that future inclination maneuver plans will meet ground-track control requirements in the presence of the error.
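A Monte Carlo treatment of this kind can be sketched as below, assuming (as a simplification, not from the paper) that the semi-major axis change comes from the small along-track component of a nominally cross-track burn, via the first-order relation da = 2 a² v / μ · dv. The orbit size, delta-v, and error magnitudes are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

MU = 398600.4418        # km^3/s^2, Earth gravitational parameter
a = 7077.0              # km, roughly a 705-km-altitude circular orbit
v = np.sqrt(MU / a)     # circular orbit speed, km/s

# Hypothetical maneuver: a mostly out-of-plane burn whose small along-track
# component changes the semi-major axis.  Assumed 1-sigma errors:
dv_total = 20e-3                  # km/s total delta-v
yaw_nominal = np.radians(90.0)    # nominally pure cross-track thrust
sigma_mag = 0.02                  # 2% magnitude uncertainty
sigma_yaw = np.radians(0.5)       # 0.5 deg pointing uncertainty

N = 100_000
mag = dv_total * (1 + sigma_mag * rng.standard_normal(N))
yaw = yaw_nominal + sigma_yaw * rng.standard_normal(N)
dv_along = mag * np.cos(yaw)      # along-track component of each sample

# First-order variational relation for an along-track delta-v:
#   da = 2 * a**2 * v / MU * dv
da = 2 * a**2 * v / MU * dv_along

print(f"mean da = {da.mean() * 1000:.1f} m, sigma = {da.std() * 1000:.1f} m")
```

The resulting spread of `da` is exactly the quantity that matters for ground-track control when drag is too low to absorb the error.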

McKinley, David P.



Visual Modeling for Aqua Ventus I off Monhegan Island, ME  

SciTech Connect

To assist the University of Maine in demonstrating a clear pathway to project completion, PNNL has developed visualization models of the Aqua Ventus I project that accurately depict the Aqua Ventus I turbines from various points on Monhegain Island, ME and the surrounding area. With a hub height of 100 meters, the Aqua Ventus I turbines are large and may be seen from many areas on Monhegan Island, potentially disrupting important viewsheds. By developing these visualization models, which consist of actual photographs taken from Monhegan Island and the surrounding area with the Aqua Ventus I turbines superimposed within each photograph, PNNL intends to support the project’s siting and permitting process by providing the Monhegan Island community and various other stakeholders with a probable glimpse of how the Aqua Ventus I project will appear.

Hanna, Luke A.; Whiting, Jonathan M.; Copping, Andrea E.



Quantitative strain analysis of single crystals using x-ray topography  

Microsoft Academic Search

The x-ray topography technique images diffraction intensity variations of a crystal. The use of a CCD camera enables the measurement of different spatial resolutions. Currently an x-ray topograph with spatial resolution of 1 micron has been achieved, but the quantitative data analysis has not been explored widely. Quantitative strain analysis on these images extends new capabilities in crystal study. We

Y. Zhong; Y. S. Chu; A. T. Macrander; S. F. Krasnicki



A Quantitative Analysis of the Behavioral Checklist of the Movement ABC Motor Test  

ERIC Educational Resources Information Center

The fifth section of the Henderson and Sugden's Movement ABC Checklist is part of the general Checklist that accompanies The Movement ABC Battery. The authors maintain that the analysis of this section must be mainly qualitative instead of quantitative. The main objective of this study was to employ a quantitative analysis of this behavioural…

Ruiz, Luis Miguel; Gomez, Marta; Graupera, Jose Luis; Gutierrez, Melchor; Linaza, Jose Luis



Quantitative x-ray diffraction phase analysis of coarse airborne particulate collected by cascade impactor sampling  

Microsoft Academic Search

Mineralogical composition of Castellon (Spanish Mediterranean coast) atmospheric aerosol was studied by X-ray diffraction by sampling with a cascade impactor without filters. Quantitative phase analysis of natural phases present in the atmospheric coarse aerosol was performed using a modified version of the computer program MENGE, that uses the standardless X-ray method developed by Rius for the quantitative analysis of multiphase

L. E. Ochando; J. M. Amigó



The Trouble With Tailings: How Alteration Mineralogy can Hinder Quantitative Phase Analysis of Mineral Waste  

Microsoft Academic Search

Quantitative phase analysis, using the Rietveld method and X-ray powder-diffraction data, has become a standard technique for analysis of mineral waste from mining operations. This method relies upon the availability of well defined crystal structures for all detectable mineral phases in a sample. An even more basic assumption, central to quantitative mineralogy, is that all significant mineral phases can be

S. A. Wilson; S. J. Mills; G. M. Dipple; M. Raudsepp



Multivariate analysis of allozymic and quantitative trait variation in Alnus rubra  

E-print Network

Multivariate analysis of allozymic and quantitative trait variation in Alnus rubra: geographic (Alnus rubra Bong.). Principal components analysis showed that variation in quantitative traits can … The authors studied geographic differentiation among 65 provenances of red alder (Alnus rubra Bong.) from

Hamann, Andreas


Quantitative Analysis of a Licensing Examination Using Adverse Impact.  

ERIC Educational Resources Information Center

This study of the nursing examination for licensure in Michigan shows how adverse impact may be used as a legally mandated, quantitative tool to assess selection procedures in nursing and other health professions. (Author/CM)
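The abstract does not give the exact computation, but adverse impact is commonly quantified with the four-fifths rule, which can be sketched as below; the pass counts are hypothetical:

```python
def adverse_impact_ratio(pass_a, total_a, pass_b, total_b):
    """Ratio of the lower group's pass rate to the higher group's pass rate."""
    rate_a = pass_a / total_a
    rate_b = pass_b / total_b
    return min(rate_a, rate_b) / max(rate_a, rate_b)

# Hypothetical licensure results for two candidate groups:
ratio = adverse_impact_ratio(pass_a=720, total_a=800, pass_b=350, total_b=500)

# Four-fifths rule: a ratio below 0.8 is conventionally taken as evidence
# of adverse impact in the selection procedure.
flagged = ratio < 0.8
print(f"impact ratio = {ratio:.2f}, adverse impact flagged: {flagged}")
```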

Chesney, James D.; Engel, Rafael J.



Quantitative Analysis by Isotopic Dilution Using Mass Spectroscopy: The Determination of Caffeine by GC-MS.  

ERIC Educational Resources Information Center

Describes a laboratory technique for quantitative analysis of caffeine by an isotopic dilution method for coupled gas chromatography-mass spectroscopy. Discusses caffeine analysis and experimental methodology. Lists sample caffeine concentrations found in common products. (MVL)
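The core arithmetic of isotopic dilution can be sketched in a deliberately simplified form: spike a known amount of isotopically labeled caffeine, measure the unlabeled-to-labeled ion-abundance ratio, and scale. The numbers are hypothetical, and the no-spectral-overlap assumption is a simplification of real IDMS calculations:

```python
def analyte_amount(spike_amount_mg, measured_ratio):
    """Simplified isotope dilution: amount of unlabeled analyte from the
    measured unlabeled/labeled ion-abundance ratio, assuming the labeled
    spike contributes nothing to the unlabeled channel and vice versa."""
    return measured_ratio * spike_amount_mg

# Hypothetical numbers: 2.0 mg of labeled caffeine spiked into the extract,
# and a GC-MS peak-area ratio (unlabeled m/z : labeled m/z) of 1.6.
amount = analyte_amount(2.0, 1.6)
print(f"caffeine in sample: {amount:.2f} mg")
```

Because the labeled internal standard experiences the same losses as the analyte during workup, the ratio, and hence the result, is insensitive to recovery.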

Hill, Devon W.; And Others



Quantitative analysis of the effective functional structure in yeast glycolysis.  


The understanding of the effective functionality that governs the enzymatic self-organized processes in cellular conditions is a crucial topic in the post-genomic era. In recent studies, Transfer Entropy has been proposed as a rigorous, robust and self-consistent method for the causal quantification of the functional information flow among nonlinear processes. Here, in order to quantify the functional connectivity for the glycolytic enzymes in dissipative conditions we have analyzed different catalytic patterns using the technique of Transfer Entropy. The data were obtained by means of a yeast glycolytic model formed by three delay differential equations where the enzymatic rate equations of the irreversible stages have been explicitly considered. These enzymatic activity functions were previously modeled and tested experimentally by other different groups. The results show the emergence of a new kind of dynamical functional structure, characterized by changing connectivity flows and a metabolic invariant that constrains the activity of the irreversible enzymes. In addition to the classical topological structure characterized by the specific location of enzymes, substrates, products and feedback-regulatory metabolites, an effective functional structure emerges in the modeled glycolytic system, which is dynamical and characterized by notable variations of the functional interactions. The dynamical structure also exhibits a metabolic invariant which constrains the functional attributes of the enzymes. Finally, in accordance with the classical biochemical studies, our numerical analysis reveals in a quantitative manner that the enzyme phosphofructokinase is the key-core of the metabolic system, behaving for all conditions as the main source of the effective causal flows in yeast glycolysis. PMID:22393350
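A minimal plug-in Transfer Entropy estimator for binarized time series can be sketched as below; this is an illustration of the quantity itself on synthetic data, not the authors' implementation or their glycolytic model:

```python
import numpy as np
from collections import Counter

def transfer_entropy(y, x):
    """Plug-in transfer entropy TE(Y -> X) for binary sequences, in bits:
    sum over (x_{t+1}, x_t, y_t) of p * log2[ p(x_{t+1}|x_t,y_t) / p(x_{t+1}|x_t) ]."""
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))
    pairs_xy = Counter(zip(x[:-1], y[:-1]))
    pairs_xx = Counter(zip(x[1:], x[:-1]))
    singles = Counter(x[:-1])
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_xy[(x0, y0)]
        p_cond_hist = pairs_xx[(x1, x0)] / singles[x0]
        te += p_joint * np.log2(p_cond_full / p_cond_hist)
    return te

rng = np.random.default_rng(3)
y = rng.integers(0, 2, 5000)
x_coupled = np.roll(y, 1)          # x copies y with a one-step lag
x_coupled[0] = 0
x_indep = rng.integers(0, 2, 5000)  # no causal link to y

te_coupled = transfer_entropy(y, x_coupled)
te_indep = transfer_entropy(y, x_indep)
print("TE(y -> x_coupled):", round(float(te_coupled), 3))
print("TE(y -> x_indep):  ", round(float(te_indep), 3))
```

The coupled pair yields roughly one bit per step (x is fully determined by the previous y), while the independent pair yields a value near zero, which is the directional, causal character of the measure exploited in the study.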

De la Fuente, Ildefonso M; Cortes, Jesus M



Quantitative Proteomics Analysis of Inborn Errors of Cholesterol Synthesis  

PubMed Central

Smith-Lemli-Opitz syndrome (SLOS) and lathosterolosis are malformation syndromes with cognitive deficits caused by mutations of 7-dehydrocholesterol reductase (DHCR7) and lathosterol 5-desaturase (SC5D), respectively. DHCR7 encodes the last enzyme in the Kandutsch-Russel cholesterol biosynthetic pathway, and impaired DHCR7 activity leads to a deficiency of cholesterol and an accumulation of 7-dehydrocholesterol. SC5D catalyzes the synthesis of 7-dehydrocholesterol from lathosterol. Impaired SC5D activity leads to a similar deficiency of cholesterol but an accumulation of lathosterol. Although the genetic and biochemical causes underlying both syndromes are known, the pathophysiological processes leading to the developmental defects remain unclear. To study the pathophysiological mechanisms underlying SLOS and lathosterolosis neurological symptoms, we performed quantitative proteomics analysis of SLOS and lathosterolosis mouse brain tissue and identified multiple biological pathways affected in Dhcr7Δ3–5/Δ3–5 and Sc5d−/− E18.5 embryos. These include alterations in mevalonate metabolism, apoptosis, glycolysis, oxidative stress, protein biosynthesis, intracellular trafficking, and cytoskeleton. Comparison of proteome alterations in both Dhcr7Δ3–5/Δ3–5 and Sc5d−/− brain tissues helps elucidate whether perturbed protein expression was due to decreased cholesterol or a toxic effect of sterol precursors. Validation of the proteomics results confirmed increased expression of isoprenoid and cholesterol synthetic enzymes. This alteration of isoprenoid synthesis may underlie the altered posttranslational modification of Rab7, a small GTPase that is functionally dependent on prenylation with geranylgeranyl, that we identified and validated in this study. These data suggested that although cholesterol synthesis is impaired in both Dhcr7Δ3–5/Δ3–5 and Sc5d−/− embryonic brain tissues, the synthesis of nonsterol isoprenoids may be increased and thus contribute to SLOS and lathosterolosis pathology. This proteomics study has provided insight into the pathophysiological mechanisms of SLOS and lathosterolosis, and understanding these pathophysiological changes will help guide clinical therapy for SLOS and lathosterolosis. PMID:20305089

Jiang, Xiao-Sheng; Backlund, Peter S.; Wassif, Christopher A.; Yergey, Alfred L.; Porter, Forbes D.



Analysis of quantitative phase detection based on optical information processing  

NASA Astrophysics Data System (ADS)

Phase objects are widespread in nature: biological cells, optical components, atmospheric flow fields, and so on. The phase detection of objects is of great significance in basic research, nondestructive testing, aerospace, military weapons, and other areas. The usual methods of phase-object detection include the interference method, the grating method, the schlieren method, and the phase-contrast method. Each of these has its own advantages, but each also has disadvantages in detection precision, environmental requirements, cost, detection rate, detection range, or detection linearity in various applications; even the most sophisticated of them, the phase-contrast method, which is mainly used for microscopic structures, lacks a quantitative account of the size of the object's phase and of the relationship between image contrast and the optical system. In this paper, various phase-detection methods and their characteristics in different applications are analyzed from the standpoint of optical information processing, and a phase-detection system based on optical filtering is constructed. First, the frequency spectrum of the phase object is obtained with a Fourier-transform lens; the spectrum is then modified appropriately by a filter; finally, an image whose light intensity represents the phase distribution is recovered by an inverse Fourier transform. The advantages and disadvantages of commonly used filters such as the quarter-wavelength phase filter, the high-pass filter, and the edge filter are analyzed, their phase resolution is compared within the same optical information processing system, and the factors limiting phase resolution are identified. The paper concludes that for any given application there exists an optimal filter that maximizes detection accuracy. Finally, we discuss how to design such an optimal filter so as to maximize the phase-testing ability of an optical information processing system.
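The filtering pipeline described above (Fourier transform, spectral filtering, inverse transform) can be sketched numerically. The following minimal simulation uses an arbitrary weak phase object and a simple high-pass filter standing in for the filters discussed in the abstract; it shows how filtering converts an otherwise invisible phase distribution into intensity contrast.

```python
import numpy as np

def phase_to_intensity(phase, cutoff=3):
    """Simulate a 4f optical system: FFT of the phase object,
    high-pass spatial filtering, then inverse FFT to intensity."""
    field = np.exp(1j * phase)                    # pure phase object: |field| == 1
    spectrum = np.fft.fftshift(np.fft.fft2(field))
    # High-pass filter: block the low spatial frequencies (the DC core)
    n = phase.shape[0]
    y, x = np.ogrid[:n, :n]
    r = np.hypot(x - n // 2, y - n // 2)
    spectrum[r < cutoff] = 0.0
    image = np.fft.ifft2(np.fft.ifftshift(spectrum))
    return np.abs(image) ** 2                     # a detector records intensity

# A weak circular phase bump on a flat background (hypothetical object)
n = 128
y, x = np.ogrid[:n, :n]
phase = 0.3 * ((x - n // 2) ** 2 + (y - n // 2) ** 2 < 15 ** 2)

plain = np.abs(np.exp(1j * phase)) ** 2           # no filtering: uniform intensity
filtered = phase_to_intensity(phase)

print("direct-image contrast:", plain.std())      # essentially zero
print("filtered-image contrast:", filtered.std()) # clearly nonzero
```

Without the filter the detector sees a uniform field, since a pure phase object does not modulate intensity; after high-pass filtering the phase structure appears as intensity contrast, which is the principle the abstract describes.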

Tao, Wang; Tu, Jiang-Chen; Chun, Kuang-Tao; Yu, Han-Wang; Xin, Du



Quantitative Analysis of the Effective Functional Structure in Yeast Glycolysis  

PubMed Central

The understanding of the effective functionality that governs the enzymatic self-organized processes in cellular conditions is a crucial topic in the post-genomic era. In recent studies, Transfer Entropy has been proposed as a rigorous, robust and self-consistent method for the causal quantification of the functional information flow among nonlinear processes. Here, in order to quantify the functional connectivity of the glycolytic enzymes under dissipative conditions, we analyzed different catalytic patterns using the technique of Transfer Entropy. The data were obtained by means of a yeast glycolytic model formed by three delay differential equations in which the enzymatic rate equations of the irreversible stages are explicitly considered. These enzymatic activity functions were previously modeled and tested experimentally by other groups. The results show the emergence of a new kind of dynamical functional structure, characterized by changing connectivity flows and a metabolic invariant that constrains the activity of the irreversible enzymes. In addition to the classical topological structure characterized by the specific location of enzymes, substrates, products and feedback-regulatory metabolites, an effective functional structure emerges in the modeled glycolytic system, which is dynamical and characterized by notable variations of the functional interactions. The dynamical structure also exhibits a metabolic invariant which constrains the functional attributes of the enzymes. Finally, in accordance with the classical biochemical studies, our numerical analysis reveals in a quantitative manner that the enzyme phosphofructokinase is the key core of the metabolic system, behaving under all conditions as the main source of the effective causal flows in yeast glycolysis. PMID:22393350
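Transfer Entropy, as used above, can be estimated from joint histograms of time-series symbols. The sketch below is a generic histogram estimator applied to synthetic data (not the glycolytic model of the paper); it illustrates the key property the method relies on, namely that the estimate is larger in the driving direction than in the reverse direction.

```python
import numpy as np

def transfer_entropy(x, y, bins=4):
    """Transfer entropy TE(X->Y) in nats, estimated from joint histograms
    of the symbol triplets (y_{t+1}, y_t, x_t)."""
    xs = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    ys = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
    trip = np.stack([ys[1:], ys[:-1], xs[:-1]], axis=1)
    uniq, counts = np.unique(trip, axis=0, return_counts=True)
    p_xyz = counts / counts.sum()        # p(y_{t+1}, y_t, x_t)
    te = 0.0
    for (y1, y0, x0), p in zip(uniq, p_xyz):
        p_y0x0 = p_xyz[(uniq[:, 1] == y0) & (uniq[:, 2] == x0)].sum()
        p_y1y0 = p_xyz[(uniq[:, 0] == y1) & (uniq[:, 1] == y0)].sum()
        p_y0 = p_xyz[uniq[:, 1] == y0].sum()
        # TE = sum p * log[ p(y1|y0,x0) / p(y1|y0) ]
        te += p * np.log((p / p_y0x0) / (p_y1y0 / p_y0))
    return te

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = np.empty_like(x)
y[0] = 0.0
for t in range(1, len(x)):               # y is driven by x with a one-step lag
    y[t] = 0.8 * x[t - 1] + 0.2 * rng.normal()

print("TE(x->y):", transfer_entropy(x, y))
print("TE(y->x):", transfer_entropy(y, x))
```

Because y is driven by x and not vice versa, TE(x→y) substantially exceeds TE(y→x); in the paper the analogous asymmetries identify phosphofructokinase as the dominant source of causal flow.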

De la Fuente, Ildefonso M.; Cortes, Jesus M.



Quantitative analysis of mycoflora on commercial domestic fruits in Japan.  


A comprehensive and quantitative analysis of the mycoflora on the surface of commercial fruit was performed. Nine kinds of fruits grown in Japan were tested. Overall fungal counts on the fruits ranged from 3.1 to 6.5 log CFU/g. The mean percentages of total yeast counts were higher than those of molds in samples of apples, Japanese pears, and strawberries, ranging from 58.5 to 67.0%, and were lower than those of molds in samples of the other six fruits, ranging from 9.8 to 48.3%. Cladosporium was the most frequent fungus and was found in samples of all nine types of fruits, followed by Penicillium, found in eight types of fruits. The fungi with the highest total counts in samples of the various fruits were Acremonium in cantaloupe melons (47.6% of the total fungal count), Aspergillus in grapes (32.2%), Aureobasidium in apples (21.3%), blueberries (63.6%), and peaches (33.6%), Cladosporium in strawberries (38.4%), Cryptococcus in Japanese pears (37.6%), Penicillium in mandarins (22.3%), and Sporobolomyces in lemons (26.9%). These results demonstrate that the mycoflora on the surfaces of these fruits consists mainly of common pre- and postharvest inhabitants of the plants or their environment; fungi that produce mycotoxins or cause market diseases were not prominent in the mycoflora of healthy fruits. These findings suggest that fruits should be handled carefully, with consideration given to fungal contaminants, including nonpathogenic fungi, to control the quality of fruits and processed fruit products. PMID:21902918

Watanabe, Maiko; Tsutsumi, Fumiyuki; Konuma, Rumi; Lee, Ken-Ichi; Kawarada, Kensuke; Sugita-Konishi, Yoshiko; Kumagai, Susumu; Takatori, Kosuke; Konuma, Hirotaka; Hara-Kudo, Yukiko



Quantitative picture analysis of freeze-fracture electron-micrographs.  


A method for three-dimensional reconstruction of the surface profile of artificial and natural membranes from freeze-quenched electron micrographs is presented. The direct relation between the Pt-layer thickness and the local orientation of the membrane allows a reconstruction of the surface. The efficiency of this method is demonstrated by the quantitative analysis of some fine structures. The essential results are: 1. In low-resolution observations, structural elements of a yeast cell were quantitatively described: (i) The diameter of a yeast cell is determined (4.2 μm). (ii) The cell wall thickness is measured (150 nm). (iii) The dimension of vesicles encapsulated in the cell wall is determined (60-80 nm). (iv) The damlike protrusion in the plasma membrane has a triangular cross section; its height is 23 nm and its half width 50 nm. The particle assembly in the damlike protrusion is in a crystalline state. The change in surface curvature is probably due to a phase separation of a biaxial cluster in a uniaxial membrane. (v) Membrane-bound particles can be distinguished by their surface profiles. 2. The resolution of surface profiles is limited by the size of the platinum grain. An averaging procedure can lead to a resolution of 0.2 nm. This increase in resolution can be understood with an uncertainty relation: the uncertainty of the profile in one dimension times the uncertainty in the other dimension (the averaging length) equals the area of the platinum grain. The monolayer thicknesses of dipalmitoyl phosphatidyl choline and dimyristoyl phosphatidyl choline are distinguishable, 2.6 +/- 0.2 nm and 2.4 +/- 0.2 nm respectively. The surface profile of a two-dimensional crystal in the membrane of a yeast cell can be determined with high accuracy. The two profiles of the inner and outer monolayer do not fit exactly together; part of the membrane-bound particle is pulled out of the monolayer during the fracturing procedure. 3.
The third part investigates special fluctuations of the surface. (i) A mixture of dipalmitoyl phosphatidyl choline and dioleoyl phosphatidyl choline shows a periodic structure. The fluctuation beyond this periodicity can be explained by a spinodal decomposition during cryofixation. (ii) The fluctuation of a periodic structure can also be induced by thermal motion. The fluctuation of dimyristoyl phosphatidyl choline quenched from a temperature between the pre- and main transition determines only one kind of elastic constant; this curvature elastic constant is of the order of 10(-20) Joule. (iii) The fluctuation of the particle density can be related to the particle-particle compressibility. We chose the clusters induced by polylysine in a membrane with charged and uncharged lipids as particles. The compressibility is of the order of 10(-6) Newton/m, which is comparable to that of a monolayer in a gaseous state. PMID:6784180

Gruler, H



Aqua splint suture technique in isolated zygomatic arch fractures.  


Various methods have been used to treat zygomatic arch fractures, but no optimal modality exists for reducing these fractures and supporting the depressed bone fragments without causing esthetic problems and lasting discomfort. We developed a novel aqua splint and suture technique for stabilizing isolated zygomatic arch fractures. The objective of this study was to evaluate the effect of this technique in isolated zygomatic arch fractures. Patients with isolated zygomatic arch fractures were treated by a single surgeon in a single center from January 2000 through December 2012. The classic Gillies approach without external fixation was performed from January 2000 to December 2003, while the novel technique has been performed since 2004. Sixty-seven consecutive patients were included (classic method, n = 32; novel method, n = 35). Informed consent was obtained from all patients. The novel aqua splint and suture technique was performed in the following fashion: first, we evaluated the bony alignment intraoperatively by ultrasonography and then reduced the depressed fracture surgically using the Gillies approach. Thereafter, to stabilize the fracture and obtain a smooth facial contour, we made an aqua splint that fit the face and placed monofilament nonabsorbable sutures around the fractured zygomatic arch. The novel technique was significantly associated with better cosmetic and functional results. In conclusion, the aqua splint suture technique is simple, quick, safe, and effective for stabilizing repositioned zygomatic arch fractures, and it can be a good alternative procedure for isolated zygomatic arch fractures. PMID:23793598

Kim, Dong-Kyu; Kim, Seung Kyun; Lee, Jun Ho; Park, Chan Hum



Quantitative analysis of sensor for pressure waveform measurement  

PubMed Central

Background Arterial pressure waveforms contain important diagnostic and physiological information, since their contour depends on a healthy cardiovascular system [1]. A sensor is placed over the measured artery and some contact pressure is applied to measure the pressure waveform. However, where should the sensor be located to detect a complete pressure waveform suitable for diagnosis, and how much contact pressure should be applied over the pulse point? These two problems have remained unresolved. Method In this study, we propose a quantitative analysis to evaluate the pressure waveform, locating the sensor position and determining the appropriate force between the sensor and the radial artery. A two-axis mechanism and a modified sensor were designed to estimate the radial arterial width and detect the contact pressure. The template matching method was used to analyze the pressure waveform. In the X-axis scan, we found that the arterial diameter changed waveform (ADCW) and the pressure waveform would change from small to large and then back to small again as the sensor was moved across the radial artery. In the Z-axis scan, we also found that the ADCW and the pressure waveform would change from small to large and then back to small again as the applied contact pressure continuously increased. Results In the X-axis scan, the template correlation coefficients at the left and right boundaries of the radial arterial width were 0.987 ± 0.016 and 0.978 ± 0.028, respectively. In the Z-axis scan, when the excess contact pressure exceeded 100 mm Hg, the template correlation fell below 0.983. In applying force, when using the maximum amplitude as the criterion, the lower contact pressure (r = 0.988 ± 0.004) performed better than the higher contact pressure (r = 0.976 ± 0.012).
Conclusions Although the optimal detection position is close to the middle of the radial artery, the pressure waveform also shows good completeness, with a template correlation coefficient above 0.99, when the position is within ± 1 mm of the middle of the radial artery. In applying force, using the maximum amplitude as the criterion, the lower contact pressure was better than the higher contact pressure. PMID:20092621
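The template matching used above reduces, in essence, to a correlation coefficient between the measured waveform and a reference template. A minimal sketch, with a hypothetical Gaussian systolic peak standing in for the study's clinical templates:

```python
import numpy as np

def template_correlation(waveform, template):
    """Pearson correlation coefficient between a measured pressure
    waveform and a reference template (both of equal length)."""
    w = (waveform - waveform.mean()) / waveform.std()
    t = (template - template.mean()) / template.std()
    return float(np.mean(w * t))

# Hypothetical pulse template: a Gaussian systolic peak
n = 200
t_axis = np.linspace(0, 1, n)
template = np.exp(-((t_axis - 0.3) / 0.1) ** 2)

rng = np.random.default_rng(1)
good = template + 0.02 * rng.normal(size=n)        # well-positioned sensor
poor = 0.3 * template + 0.3 * rng.normal(size=n)   # off-artery / excess pressure

print("good placement r:", template_correlation(good, template))
print("poor placement r:", template_correlation(poor, template))
```

A well-positioned sensor with appropriate contact pressure yields a correlation near 1, while a displaced or over-pressed sensor yields a much lower value, mirroring the thresholds (0.99, 0.983) reported in the abstract.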



Overview of the Aqua/AMSR-E 2003 soil moisture experiment in Brazil (SMEX03 Brazil)  

Microsoft Academic Search

This study presents an overview of the field design and satellite data analysis strategy of the Brazilian SMEX03 (Soil Moisture Experiment in 2003) campaign. The goal of SMEX03 Brazil is to validate existing algorithms for retrieving soil moisture from Aqua/AMSR-E data under tropical savanna vegetation cover. The test site corresponded to the Barreiras region, located in the western part

E. E. Sano; E. D. Assad; T. J. Jackson; W. Crow; A. Hsu



Quantitative HPLC analysis of cardiac glycosides in Digitalis purpurea leaves.  


An analytical method for the determination of cardiac glycosides in Digitalis purpurea leaves by HPLC was developed. Quantitation was carried out with lanatoside A incorporated as an internal standard. The present method is sufficiently precise and relatively simple. PMID:7673934

Ikeda, Y; Fujii, Y; Nakaya, I; Yamazaki, M



Binary imaging analysis for comprehensive quantitative histomorphometry of peripheral nerve  

Microsoft Academic Search

Quantitative histomorphometry is the current gold standard for objective measurement of nerve architecture and its components. Many methods still in use rely heavily upon manual techniques that are prohibitively time consuming, predisposing to operator fatigue, sampling error, and overall limited reproducibility. More recently, investigators have attempted to combine the speed of automated morphometry with the accuracy of manual and semi-automated

Daniel A. Hunter; Arash Moradzadeh; Elizabeth L. Whitlock; Michael J. Brenner; Terence M. Myckatyn; Cindy H. Wei; Thomas H. H. Tung; Susan E. Mackinnon



Simulation and Quantitative Interpretation of Electron Spectra for Surface Analysis  

Microsoft Academic Search

Quantitative interpretation of electron spectra requires a thorough understanding of the surface sensitivity of the technique, or, in other words, the transfer of the signal electrons from the source to the detector. The theory of electron transport of relevance for XPS, AES, REELS, EPES and related techniques is meanwhile well established. Within the framework of the partial intensity approach it

W. S. M. Werner


Features of the Quantitative Analysis in Moessbauer Spectroscopy  

SciTech Connect

Results describing the effect of different factors on errors in the quantitative determination of the phase composition of substances studied by Moessbauer absorption spectroscopy are presented, and ways of accounting for these factors are suggested. The effectiveness of the suggested methods is verified by analyzing standard and unknown compositions.

Semenov, V. G.; Panchuk, V. V. [St. Petersburg State University, St. Petersburg (Russian Federation); Irkaev, S. M. [Institute for Analytical Instrumentation RAS, St. Petersburg (Russian Federation)



Business process quantitative analysis and optimization for service industries  

Microsoft Academic Search

The total time a customer spends in the business process system, called the customer cycle-time, is a major contributor to overall customer satisfaction. Business process analysts and designers are frequently asked to design process solutions with optimal performance. Simulation models have been very popular to quantitatively evaluate the business processes; however, simulation is time-consuming and it also requires extensive modeling

Lixiang Jiang



Issues in qualitative and quantitative risk analysis for developmental toxicology  

Microsoft Academic Search

The qualitative and quantitative evaluation of risk in developmental toxicology has been discussed in several recent publications. A number of issues still are to be resolved in this area. The qualitative evaluation and interpretation of end points in developmental toxicology depends on an understanding of the biological events leading to the end points observed, the relationships among end points, and

Carole A. Kimmel; David W. Gaylor



Ethnicity and body image: Quantitative and qualitative analysis  

Microsoft Academic Search

Objective: Cultural diversity in body image has been studied elsewhere. In this study, we extend previous research by inclusion of (1) multiple ethnic groups for comparison and (2) measures for the assessment of multiple dimensions of body image. Method: Partici- pants were college students who self-identified as African, Asian, Caucasian, or Hispanic- American. Quantitative measures of weight-related body image and

Madeline Altabe



Depression in Parkinson's disease: a quantitative and qualitative analysis  

Microsoft Academic Search

Depression is a common feature of Parkinson's disease, a fact of both clinical and theoretical significance. Assessment of depression in Parkinson's disease is complicated by overlapping symptomatology in the two conditions, making global assessments based on observer or self-ratings of doubtful validity. The present study aimed to provide both a quantitative and qualitative description of the nature of the depressive

A M Gotham; R G Brown; C D Marsden



An Inexpensive Electrodeposition Device and Its Use in a Quantitative Analysis Laboratory Exercise  

ERIC Educational Resources Information Center

An experimental procedure, using an apparatus that is easy to construct, was developed to incorporate a quantitative electrogravimetric determination of the solution nickel content into an undergraduate or advanced high school quantitative analysis laboratory. This procedure produces results comparable to the procedure used for the gravimetric…

Parker, Richard H.



Quantitative analysis of directional spontaneous emission spectra from light sources in photonic crystals  

E-print Network

Quantitative analysis of directional spontaneous emission spectra from light sources in photonic crystals with disorder. Using a model comprising diffuse light transport and photonic band structure, we quantitatively analyze the directional spectra of spontaneously emitted light in real photonic crystals, which is essential in the interpretation of quantum

Vos, Willem L.


Multi observation PET image analysis for patient follow-up quantitation and therapy assessment  

E-print Network

Multi observation PET image analysis for patient follow-up quantitation and therapy assessment. Institut Telecom - Telecom Bretagne, Brest, F-29200 France. Abstract. In Positron Emission Tomography (PET), quantitative parameters are often restricted to the maximum SUV measured in PET scans during the treatment. Such measurements do

Brest, Université de


Genome-Wide Analysis of Expression Quantitative Trait Loci in Breast Cancer - Nicholas Knoblauch, TCGA Scientitifc Symposium 2012



Quantitative and qualitative HPLC analysis of thermogenic weight loss products.  


An HPLC qualitative and quantitative method for seven analytes (caffeine, ephedrine, forskolin, icariin, pseudoephedrine, synephrine, and yohimbine) in thermogenic weight loss preparations available on the market is described in this paper. After 45 min, the seven analytes were separated and detected in the acetonitrile:water (80:20) extract. The method uses a Waters XTerra RP18 (5 μm particle size) column as the stationary phase, a gradient mobile phase of water (5.0 mM SDS) and acetonitrile, and UV detection at 210 nm. The correlation coefficients for the calibration curves and the recovery rates ranged from 0.994 to 0.999 and from 97.45% to 101.05%, respectively. The qualitative and quantitative results are discussed. PMID:15587578
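The calibration-curve correlation coefficients and recovery rates reported above are standard chromatographic quantities. The following sketch, with illustrative numbers not taken from the paper, shows how both are computed from a linear fit of peak area against standard concentration.

```python
import numpy as np

# Hypothetical HPLC calibration: peak areas measured at known standard
# concentrations (values illustrative, not from the paper).
conc = np.array([5.0, 10.0, 25.0, 50.0, 100.0])         # standards, μg/mL
area = np.array([102.0, 208.0, 515.0, 1021.0, 2050.0])  # detector response

slope, intercept = np.polyfit(conc, area, 1)
r = np.corrcoef(conc, area)[0, 1]           # calibration linearity

# Recovery: spike a sample to 25 μg/mL and back-calculate from its area
spiked_area = 512.0
found = (spiked_area - intercept) / slope
recovery_pct = 100.0 * found / 25.0

print("calibration r:", r)
print("recovery %:", recovery_pct)
```

A correlation coefficient near 1 and a recovery near 100% correspond to the 0.994–0.999 and 97.45–101.05% ranges the abstract reports as evidence of method validity.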

Schaneberg, B T; Khan, I A



Quantitative Analysis of HSV Gene Expression during Lytic Infection.  


Herpes Simplex Virus (HSV) is a human pathogen that establishes latency and undergoes periodic reactivation, resulting in chronic recurrent lytic infection. HSV lytic infection is characterized by an organized cascade of three gene classes; however, successful transcription and expression of the first, the immediate early class, is critical to the overall success of viral infection. This initial event of lytic infection is also highly dependent on host cell factors. This unit uses RNA interference and small molecule inhibitors to examine the role of host and viral proteins in HSV lytic infection. Methods detailing isolation of viral and host RNA and genomic DNA followed by quantitative real-time PCR allow characterization of impacts on viral transcription and replication, respectively. Western blots can be used to confirm quantitative PCR results. This combination of protocols represents a starting point for researchers interested in virus-host interactions during HSV lytic infection. © 2014 by John Wiley & Sons, Inc. PMID:25367270
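Quantitative real-time PCR results of the kind described in this unit are commonly expressed as fold changes via the comparative 2^−ΔΔCt method; the sketch below uses illustrative Ct values, not data from the protocol.

```python
# Relative quantification of transcripts by the comparative 2^-ΔΔCt method
# (a standard qPCR analysis; the Ct values below are illustrative).

def ddct_fold_change(ct_target_treated, ct_ref_treated,
                     ct_target_control, ct_ref_control):
    """Fold change = 2^-ΔΔCt, where ΔCt = Ct(target) - Ct(reference)."""
    dct_treated = ct_target_treated - ct_ref_treated
    dct_control = ct_target_control - ct_ref_control
    return 2.0 ** -(dct_treated - dct_control)

# Example: a hypothetical immediate-early transcript after host-factor knockdown
fold = ddct_fold_change(ct_target_treated=26.0, ct_ref_treated=18.0,
                        ct_target_control=23.0, ct_ref_control=18.0)
print(fold)  # 0.125, i.e., an 8-fold reduction in transcript level
```

Normalizing to a host reference gene corrects for input differences, so the fold change isolates the effect of the treatment on viral transcription.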

Turner, Anne-Marie W; Arbuckle, Jesse H; Kristie, Thomas M



Quantitative analysis of complexes in electron irradiated CZ silicon  

Microsoft Academic Search

Complexes in helium- or electron-irradiated silicon are quantitatively analyzed by highly sensitive and accurate infrared (IR) absorption spectroscopy. Carbon concentration (1×10(15)–1×10(17) cm(-3)) and helium dose (5×10(12)–5×10(13) cm(-2)) or electron dose (1×10(15)–1×10(17) cm(-2)) are varied over two orders of magnitude in a relatively low regime compared to previous works. It is demonstrated that the carbon-related complex in low carbon concentration silicon of commercial

N. Inoue; H. Ohyama; Y. Goto; T. Sugiyama



Building No. 905, showing typical aqua medias or rain hoods ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

Building No. 905, showing typical aqua medias or rain hoods - Presidio of San Francisco, Enlisted Men's Barracks Type, West end of Crissy Field, between Pearce & Maudlin Streets, San Francisco, San Francisco County, CA


Aqua MODIS 8-Year On-Orbit Operation and Calibration  

NASA Technical Reports Server (NTRS)

Launched in May 2002, the NASA EOS Aqua MODIS has successfully operated for more than 8 years. Observations from Aqua MODIS and its predecessor, Terra MODIS, have generated an unprecedented amount of data products and made significant contributions to studies of changes in the Earth's system of land, oceans, and atmosphere. MODIS collects data in 36 spectral bands: 20 reflective solar bands (RSB) and 16 thermal emissive bands (TEB). It has a set of on-board calibrators (OBC), providing sensor on-orbit radiometric, spectral, and spatial calibration and characterization. This paper briefly summarizes Aqua MODIS on-orbit operation and calibration activities and illustrates instrument on-orbit performance from launch to the present. Discussions are focused on OBC functions and changes in detector radiometric gains, spectral responses, and spatial registrations. With ongoing calibration effort, Aqua MODIS will continue serving the science community with high quality data products.

Xiong, Xiaoxiong; Angal, Amit; Madhavan, Sriharsha; Choi, Taeyoung; Dodd, Jennifer; Geng, Xu; Wang, Zhipeng; Toller, Gary; Barnes, William



Quantitation of DNA methylation by melt curve analysis  

PubMed Central

Background Methylation of DNA is a common mechanism for silencing genes, and aberrant methylation is increasingly being implicated in many diseases such as cancer. There is a need for robust, inexpensive methods to quantitate methylation across a region containing a number of CpGs. We describe and validate a rapid, in-tube method to quantitate DNA methylation using the melt data obtained following amplification of bisulfite modified DNA in a real-time thermocycler. Methods We first describe a mathematical method to normalise the raw fluorescence data generated by heating the amplified bisulfite modified DNA. From this normalised data the temperatures at which melting begins and finishes can be calculated, which reflect the less and more methylated template molecules present respectively. Also the T50, the temperature at which half the amplicons are melted, which represents the summative methylation of all the CpGs in the template mixture, can be calculated. These parameters describe the methylation characteristics of the region amplified in the original sample. Results For validation we used synthesized oligonucleotides and DNA from fresh cells and formalin fixed paraffin embedded tissue, each with known methylation. Using our quantitation we could distinguish between unmethylated, partially methylated and fully methylated oligonucleotides mixed in varying ratios. There was a linear relationship between T50 and the dilution of methylated into unmethylated DNA. We could quantitate the change in methylation over time in cell lines treated with the demethylating drug 5-aza-2'-deoxycytidine, and the differences in methylation associated with complete, clonal or no loss of MGMT expression in formalin fixed paraffin embedded tissues. Conclusion We have validated a rapid, simple in-tube method to quantify methylation which is robust and reproducible, utilizes easily designed primers and does not need proprietary algorithms or software. 
The technique does not depend on any operator manipulation or interpretation of the melt curves, and is suitable for use in any laboratory with a real-time thermocycler. The parameters derived provide an objective description and quantitation of the methylation in a specimen, and can be used for statistical comparisons of methylation between specimens. PMID:19393074
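The normalization and T50 calculation described above can be sketched as follows; the melt curve here is a synthetic sigmoid, not data from the paper.

```python
import numpy as np

def t50(temps, fluorescence):
    """Normalize a melt curve and return T50, the temperature at which
    half of the amplicons have melted (linear interpolation)."""
    f = np.asarray(fluorescence, dtype=float)
    # Normalize so the melted fraction runs from 0 (all double-stranded) to 1
    melted_fraction = (f.max() - f) / (f.max() - f.min())
    # First temperature step at which the melted fraction crosses 0.5
    idx = np.argmax(melted_fraction >= 0.5)
    t0, t1 = temps[idx - 1], temps[idx]
    m0, m1 = melted_fraction[idx - 1], melted_fraction[idx]
    return t0 + (0.5 - m0) * (t1 - t0) / (m1 - m0)

# Synthetic melt curve: fluorescence falls sigmoidally, centred at 80 °C
temps = np.linspace(70, 90, 201)
fluor = 1.0 / (1.0 + np.exp((temps - 80.0) / 0.8))
print("T50 =", round(t50(temps, fluor), 2), "°C")
```

Because more methylated (CG-richer after bisulfite conversion) templates melt at higher temperatures, a shift of T50 toward higher values reflects greater summative methylation of the amplified region, which is how the parameter is used in the abstract.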




PubMed Central

Background To study the relationship between emphysema, airflow obstruction and lung cancer in a high-risk population, we performed quantitative analysis of screening computed tomography (CT) scans. Methods Subjects completed questionnaires, spirometry and low-dose helical chest CT. Analyses compared cases and controls according to automated quantitative analysis of lung parenchyma and airway measures. Results Our case-control study of 117 matched pairs of lung cancer cases and controls did not reveal any airway or lung parenchymal findings on quantitative analysis of screening CT scans that were associated with increased lung cancer risk. Airway measures including wall area %, lumen perimeter, lumen area and average wall HU, and parenchymal measures including the lung fraction below −910 Hounsfield units (HU), were not statistically different between cases and controls. Conclusions The relationship between visual assessment of emphysema and increased lung cancer risk could not be verified by quantitative analysis of low-dose screening CT scans in a high-risk tobacco-exposed population. PMID:21610523

Wilson, David O; Leader, Joseph K; Fuhrman, Carl R; Reilly, John J; Sciurba, Frank C.; Weissfeld, Joel L



Journal of Quantitative Analysis in Sports, Volume 6, Issue 3 (2010), Article 8  

E-print Network

Journal of Quantitative Analysis in Sports Volume 6, Issue 3 2010 Article 8 A Point-Mass Kellogg School of Management, Northwestern University, b- University

Jensen, Shane T.


Quantitative analysis of Cenozoic palynofloras from Patagonia, southern South America Mirta E. Quattrocchioa  

E-print Network

Quantitative analysis of Cenozoic palynofloras from Patagonia, southern South America. Keywords: Cenozoic; southern South America; Patagonia; palynofloras; statistics. Here we review the compositional changes of Cenozoic palynomorph assemblages in Patagonia

Bermingham, Eldredge


Automated Quantitative Analysis of Capnogram Shape for COPD–Normal and COPD–CHF Classification  

E-print Network

We develop an approach to quantitative analysis of carbon dioxide concentration in exhaled breath, recorded as a function of time by capnography. The generated waveform – or capnogram – is currently used in clinical practice ...

Mieloszyk, Rebecca J.


Abstract--Combining content analysis of television programs with quantitative audience measurement can  

E-print Network

Abstract-- Combining content analysis of television programs with quantitative audience measurement can provide insights into customer reactions to advertisements and program content. This work and advertisers have a vested interest in gathering accurate information about the viewing preferences

Fisher, Kathleen


Factors Affecting Akinete Differentiation in Anabaena flos-aquae  

Microsoft Academic Search

The effects of pH values, nutrients, and preservation temperature on akinete formation in Anabaena flos-aquae were examined at 30 ± 2 °C under fluorescent light at an intensity of 3000 lux. A marked effect of pH 6 and pH 7 on akinete formation was observed. By the third day, A. flos-aquae subjected to pH 6 and pH 7 developed the

Qidong Wan; Xiumin Sun; Rui Chen; Peizhong Zheng; Wenyu Lu; Jianying Shen



Quantitative proteome analysis of proteins of the K562 cell line by 18O-labeling and LC-MS/MS technology  

Microsoft Academic Search

Aim Quantitative analysis of global protein levels, termed 'quantitative proteomics', is important for a system-based understanding of the molecular function of each protein component and is expected to provide insights into the molecular mechanisms of various biological processes and systems. This study aims to establish quantitative proteomics for the K562 cell line. Methods The protein quantitation of the K562 cell line by

Ying Tan; Zhi-Qiang Ge; Chang-Xiao Liu



Improvements to direct quantitative analysis of multiple microRNAs facilitating faster analysis.  


Studies suggest that patterns of deregulation in sets of microRNA (miRNA) can be used as cancer diagnostic and prognostic biomarkers. Establishing a "miRNA fingerprint"-based diagnostic technique requires a suitable miRNA quantitation method. The appropriate method must be direct, sensitive, capable of simultaneous analysis of multiple miRNAs, rapid, and robust. Direct quantitative analysis of multiple microRNAs (DQAMmiR) is a recently introduced capillary electrophoresis-based hybridization assay that satisfies most of these criteria. Previous implementations of the method suffered, however, from slow analysis time and required lengthy and stringent purification of hybridization probes. Here, we introduce a set of critical improvements to DQAMmiR that address these technical limitations. First, we have devised an efficient purification procedure that achieves the required purity of the hybridization probe in a fast and simple fashion. Second, we have optimized the concentrations of the DNA probe to decrease the hybridization time to 10 min. Lastly, we have demonstrated that the increased probe concentrations and decreased incubation time removed the need for masking DNA, further simplifying the method and increasing its robustness. The presented improvements bring DQAMmiR closer to use in a clinical setting. PMID:24127917

Ghasemi, Farhad; Wegman, David W; Kanoatov, Mirzo; Yang, Burton B; Liu, Stanley K; Yousef, George M; Krylov, Sergey N



Qualitative and quantitative analysis of volatile constituents from latrines.  


More than 2.5 billion people defecate in the open. The increased commitment of private and public organizations to improving this situation is driving the research and development of new technologies for toilets and latrines. Although key technical aspects are considered by researchers when designing new technologies for developing countries, the basic aspect of offensive malodors from human waste is often neglected. With the objective of contributing to technical solutions that are acceptable to global consumers, we investigated the chemical composition of latrine malodors sampled in Africa and India. Field latrines in four countries were evaluated olfactorily, and the odors were qualitatively and quantitatively characterized with three analytical techniques. Sulfur compounds including H2S, methyl mercaptan, and dimethyl mono-, di-, and trisulfide are important in the sewage-like odors of pit latrines under anaerobic conditions. Under aerobic conditions, in Nairobi for example, para-cresol and indole reached concentrations of 89 and 65 µg/g, respectively, which, along with short-chain fatty acids such as butyric acid (13 mg/g), explained the strong rancid, manure and farmyard odor. This work represents the first qualitative and quantitative study of volatile compounds sampled from seven pit latrines in a variety of geographic, technical, and economic contexts, in addition to three single stools from India and a pit latrine model system. PMID:23829328

Lin, Jianming; Aoll, Jackline; Niclass, Yvan; Velazco, Maria Inés; Wünsche, Laurent; Pika, Jana; Starkenmann, Christian



Quantitative Raman spectroscopy for the analysis of carrot bioactives.  


Rapid quantitative near-infrared Fourier transform Raman analyses of the key phytonutrients in carrots, polyacetylenes and carotenoids, are reported here for the first time. Solvent extracts of 31 carrot lines were analyzed for these phytonutrients by conventional methods: polyacetylenes by GC-FID and carotenoids by visible spectrophotometry. Carotenoid concentrations were 0-5586 µg g(-1) dry weight (DW). Polyacetylene concentrations were 74-4846 µg g(-1) DW, highest in wild carrots. The polyacetylenes were falcarinol, 6-1237 µg g(-1) DW; falcarindiol, 42-3475 µg g(-1) DW; and falcarindiol 3-acetate, 27-649 µg g(-1) DW. Strong Raman bands for carotenoids gave good correlation with results by visible spectrophotometry. A chemometric model capable of quantitating carotenoids from Raman data was developed. A classification model for rapidly distinguishing carrots with high and low polyacetylene concentrations (limit of detection = 1400 µg g(-1)), based on Raman spectral intensity in the region of 2250 cm(-1), was produced. PMID:23441972

Killeen, Daniel P; Sansom, Catherine E; Lill, Ross E; Eason, Jocelyn R; Gordon, Keith C; Perry, Nigel B



ImatraNMR: novel software for batch integration and analysis of quantitative NMR spectra.  


Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to the traditional quantitative 1D (1)H and (13)C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, many processing packages are available from spectrometer manufacturers and third-party developers, and most are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single spectra or a few at a time, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to its capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, so it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently been tested only on Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use and is provided with source code upon request. PMID:21705250

Mäkelä, A V; Heikkilä, O; Kilpeläinen, I; Heikkinen, S



Quantitative cw Overhauser DNP Analysis of Hydration Dynamics  

E-print Network

Liquid state Overhauser Effect Dynamic Nuclear Polarization (ODNP) has experienced a recent resurgence of interest. In particular, a new manifestation of the ODNP measurement probes the translational mobility of water within 5-10 Å of an ESR-active spin probe (i.e., the local translational diffusivity D_{local} near an electron spin resonance active molecule). Such spin probes, typically stable nitroxide radicals, have been attached to the surface or interior of macromolecules, including proteins, polymers, and membrane vesicles. Despite the unique specificity of this measurement, it requires only a standard X-band (~10 GHz) continuous wave (cw) electron spin resonance (ESR) spectrometer, coupled with a standard nuclear magnetic resonance (NMR) spectrometer. Here, we present a set of developments and corrections that improve the accuracy of quantitative ODNP and allow it to be applied to samples at concentrations more than two orders of magnitude lower than were previously feasible.

Franck, John M; Han, Songi



Quantitative Analysis of Matrine in Liquid Crystalline Nanoparticles by HPLC  

PubMed Central

A reversed-phase high-performance liquid chromatographic method has been developed for the quantitative determination of matrine in liquid crystalline nanoparticles. The chromatographic method uses an isocratic system. The mobile phase was composed of methanol-PBS (pH 6.8)-triethylamine (50:50:0.1%) at a flow rate of 1 mL/min, with an SPD-20A UV/vis detector; the detection wavelength was 220 nm. The linearity of matrine is in the range of 1.6 to 200.0 µg/mL. The regression equation is y = 10706x - 2959 (R2 = 1.0). The average recovery is 101.7%; RSD = 2.22% (n = 9). This method provides a simple and accurate strategy for determining matrine in liquid crystalline nanoparticles. PMID:24834359

Peng, Xinsheng; Hu, Min; Ling, Yahao; Tian, Yuan; Zhou, Yanxing; Zhou, Yanfang
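The reported calibration above can be inverted to back-calculate a sample concentration from a measured peak area. The sketch below assumes (this is not stated explicitly in the abstract) that y is the peak area and x the concentration in µg/mL, the conventional assignment for an HPLC calibration curve.

```python
# Sketch: back-calculating matrine concentration from an HPLC peak area
# using the reported calibration y = 10706x - 2959.
# Assumption (not stated in the abstract): y = peak area, x = conc. (ug/mL).

SLOPE = 10706.0
INTERCEPT = -2959.0
LINEAR_RANGE = (1.6, 200.0)  # ug/mL, reported linearity range

def concentration_from_area(peak_area: float) -> float:
    """Invert the linear calibration to obtain concentration in ug/mL."""
    conc = (peak_area - INTERCEPT) / SLOPE
    lo, hi = LINEAR_RANGE
    if not (lo <= conc <= hi):
        raise ValueError(f"{conc:.2f} ug/mL is outside the validated range")
    return conc

if __name__ == "__main__":
    # An area generated from 100 ug/mL should round-trip exactly:
    area = SLOPE * 100.0 + INTERCEPT
    print(round(concentration_from_area(area), 3))  # -> 100.0
```

The range check mirrors standard practice: a result outside the validated linearity range should be diluted or re-run, not reported.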



Quantitative cw Overhauser DNP Analysis of Hydration Dynamics  

E-print Network

Liquid state Overhauser Effect Dynamic Nuclear Polarization (ODNP) has experienced a recent resurgence of interest. In particular, a new manifestation of the ODNP measurement probes the translational mobility of water within 5-10 Å of an ESR-active spin probe (i.e., the local translational diffusivity D_{local} near an electron spin resonance active molecule). Such spin probes, typically stable nitroxide radicals, have been attached to the surface or interior of macromolecules, including proteins, polymers, and membrane vesicles. Despite the unique specificity of this measurement, it requires only a standard X-band (~10 GHz) continuous wave (cw) electron spin resonance (ESR) spectrometer, coupled with a standard nuclear magnetic resonance (NMR) spectrometer. Here, we present a set of developments and corrections that improve the accuracy of quantitative ODNP and allow it to be applied to samples at concentrations more than two orders of magnitude lower than were previously feasible.

John M. Franck; Anna Pavlova; Songi Han



Quantitative analysis of CT scans of ceramic candle filters  

SciTech Connect

Candle filters are being developed to remove coal ash and other fine particles (<15 µm) from hot (ca. 1000 K) gas streams. In the present work, a color scanner was used to digitize hard-copy CT X-ray images of cylindrical SiC filters, and linear regressions converted the scanned (color) data to a filter density for each pixel. These data, with the aid of the density of SiC, gave a filter porosity for each pixel. Radial averages, density-density correlation functions, and other statistical analyses were performed on the density data. The CT images also detected the presence and depth of cracks that developed during usage of the filters. The quantitative data promise to be a very useful addition to the color images.

Ferer, M.V. [West Virginia Univ., Morgantown, WV (United States); Smith, D.H. [Morgantown Energy Technology Center, WV (United States)



Value of quantitative analysis of circulating cell free DNA as a screening tool for lung cancer: A meta-analysis  

Microsoft Academic Search

Objective: Quantitative analysis of circulating cell-free DNA is considered a possible aid for lung cancer screening. We aimed to comprehensively review the evidence for the use of circulating cell-free DNA to screen for lung cancer.

Ruifeng Zhang; Fangchun Shao; Xiaohong Wu; Kejing Ying



Introducing quantitative life cycle analysis into the chemical engineering curriculum  

Microsoft Academic Search

This paper briefly describes the various stages used to perform a life cycle analysis (LCA) of a product or process. The analysis is simplified in its outputs in that it focuses only on energy usage and carbon and sulfur dioxide emissions. The main point, however, is to provide examples to first-year engineering and science students that illustrate the principle

G. M. Evans; K. P. Galvin; E. Doroodchi



Quantitative proteomic analysis of amphotericin B resistance in Leishmania infantum.  


Amphotericin B (AmB) in its liposomal form is now considered either first- or second-line treatment against Leishmania infections in different parts of the world. Few cases of AmB resistance have been reported, and resistance mechanisms toward AmB are still poorly understood. This paper reports a large-scale comparative proteomic study in the context of AmB resistance. Quantitative proteomics using stable isotope labeling of amino acids in cell culture (SILAC) was used to better characterize the cytoplasmic and membrane-enriched (ME) proteomes of the in vitro generated Leishmania infantum AmB-resistant mutant AmB1000.1. In total, 97 individual proteins were found to be differentially expressed between the mutant and its parental sensitive strain (WT). More than half of these proteins were either metabolic enzymes or involved in transcription or translation processes. Key energetic pathways such as glycolysis and the TCA cycle were up-regulated in the mutant. Interestingly, many proteins involved in reactive oxygen species (ROS) scavenging and heat-shock proteins were also up-regulated in the resistant mutant. This work provides a basis for further investigations to understand the roles of proteins differentially expressed in relation with AmB resistance. PMID:25057462

Brotherton, Marie-Christine; Bourassa, Sylvie; Légaré, Danielle; Poirier, Guy G; Droit, Arnaud; Ouellette, Marc



Quantitative analysis of tumor mitochondrial RNA using microarray  

PubMed Central

AIM: To design a novel method to rapidly detect quantitative alterations of mtRNA in patients with tumors. METHODS: Oligo 6.22 and Primer Premier 5.0 software were used to design 15 pairs of primers for mtRNA cDNA probes in light of the functional and structural properties of mtDNA, and RT-PCR amplification was then used to produce 15 mtRNA probes from one normal gastric mucosal tissue. Total RNA extracted from 9 gastric cancers and corresponding normal gastric mucosal tissues was reverse transcribed into cDNA labeled with fluorescein. The spotted mtDNA microarrays were made and hybridized. Finally, the microarrays were scanned with a GeneTAC laser scanner to obtain the hybridization results. Northern blot was used to confirm the microarray results. RESULTS: The hybridized spots were distinct, with clear and consistent backgrounds. After the data were standardized against the housekeeping genes, the results showed that the expression levels of some mitochondrial genes in gastric carcinoma differed from those in the corresponding non-cancerous regions. CONCLUSION: The mtDNA expression microarray can rapidly, massively and exactly detect the quantity of mtRNA in tissues and cells. In addition, the whole expression profile of mtRNA from a tumor patient can be obtained on just one slide using this method, providing an effective way to investigate the relationship between mtDNA expression and tumorigenesis. PMID:15609393

Han, Cheng-Bo; Mao, Xiao-Yun; Xin, Yan; Wang, Shao-Cheng; Ma, Jia-Ming; Zhao, Yu-Jie
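The housekeeping-gene standardization step described above can be sketched numerically: each channel is scaled so its housekeeping spots agree, after which per-probe tumor/normal ratios become comparable. The probe layout and intensity values below are illustrative, not data from the study.

```python
import numpy as np

# Sketch of housekeeping-gene normalization for two-channel spot data:
# scale each channel so its housekeeping spots average to 1.0, then
# compute per-probe tumor/normal expression ratios. Values are invented.

def normalize_to_housekeeping(signal, housekeeping_idx):
    """Scale a channel so its housekeeping spots average to 1.0."""
    signal = np.asarray(signal, dtype=float)
    return signal / signal[housekeeping_idx].mean()

tumor  = np.array([520., 130., 980., 400.])   # 4 mtRNA probes, tumor channel
normal = np.array([260., 140., 450., 200.])   # same probes, normal channel
hk = [3]                                      # probe 3 treated as housekeeping

ratio = normalize_to_housekeeping(tumor, hk) / normalize_to_housekeeping(normal, hk)
print(ratio.round(2))  # per-probe tumor/normal expression ratio
```

By construction the housekeeping probe's ratio is 1.0; ratios above or below 1 then indicate relative over- or under-expression in the tumor channel.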



Quantitative analysis of task selection for brain-computer interfaces  

NASA Astrophysics Data System (ADS)

Objective. To assess quantitatively the impact of task selection in the performance of brain-computer interfaces (BCI). Approach. We consider the task-pairs derived from multi-class BCI imagery movement tasks in three different datasets. We analyze for the first time the benefits of task selection on a large-scale basis (109 users) and evaluate the possibility of transferring task-pair information across days for a given subject. Main results. Selecting the subject-dependent optimal task-pair among three different imagery movement tasks results in approximately 20% potential increase in the number of users that can be expected to control a binary BCI. The improvement is observed with respect to the best task-pair fixed across subjects. The best task-pair selected for each subject individually during a first day of recordings is generally a good task-pair in subsequent days. In general, task learning from the user side has a positive influence in the generalization of the optimal task-pair, but special attention should be given to inexperienced subjects. Significance. These results add significant evidence to existing literature that advocates task selection as a necessary step towards usable BCIs. This contribution motivates further research focused on deriving adaptive methods for task selection on larger sets of mental tasks in practical online scenarios.

Llera, Alberto; Gómez, Vicenç; Kappen, Hilbert J.
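The subject-dependent selection step described above reduces, in its simplest form, to picking the imagery task-pair with the best decoding performance for each subject instead of fixing one pair for everyone. The sketch below uses invented accuracy values purely to illustrate the selection rule.

```python
# Sketch of subject-dependent optimal task-pair selection: per subject,
# choose the imagery task-pair with the highest classification accuracy.
# Subjects, task names, and accuracies below are illustrative only.

accuracies = {
    "s1": {("left", "right"): 0.62, ("left", "feet"): 0.81, ("right", "feet"): 0.70},
    "s2": {("left", "right"): 0.77, ("left", "feet"): 0.64, ("right", "feet"): 0.58},
}

def best_pair(acc_by_pair):
    """Return the task-pair with maximal accuracy for one subject."""
    return max(acc_by_pair, key=acc_by_pair.get)

for subject, acc in accuracies.items():
    print(subject, best_pair(acc), acc[best_pair(acc)])
```

Note that the best fixed pair across both subjects here would be suboptimal for each individually, which is the paper's core argument for per-subject selection.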



Optimization of quantitative sonographic diagnostic analysis of breast lesions  

NASA Astrophysics Data System (ADS)

The purpose of this study was to enhance the ability of quantitative sonography to distinguish between B-scan images of malignant and benign lesions of the breast. Several second-order pixel gray level statistics have been used to achieve a good but not acceptable diagnostic accuracy in characterizing breast lesions. Therefore, this study sought to optimize the diagnostic accuracy of second order statistics. The co-occurrence matrix is the most useful second-order statistic so far studied. It is an estimate of the joint probability distribution of gray levels of two pixels separated by a given distance and orientation. Several distances and orientations have been tried previously, but no systematic attempt had been made to find the optimum parameters for diagnosis. In this study, co-occurrence statistics of malignant and benign lesion images were determined as a function of distance and orientation. In particular, the correlation function was modeled as a separable, exponential function, first order for increments in both the x and y directions. Model parameters were used as features for discriminating benign from cancer lesions. An attempt was made to optimize the features by excluding the noisy data from the fit and again using the model parameters.

Krasner, Brian; Garra, Brian S.; Mun, Seong K.
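The second-order statistic discussed above, the co-occurrence matrix, can be computed directly: count pairs of gray levels at a fixed pixel offset (distance and orientation) and normalize to a joint probability estimate. The image and parameters below are illustrative.

```python
import numpy as np

# Minimal gray-level co-occurrence matrix (GLCM) for one offset (dx, dy),
# normalized so entries estimate the joint probability of gray-level pairs.

def glcm(image, dx, dy, levels):
    """Co-occurrence probabilities P[i, j] for pixel pairs offset by (dx, dy)."""
    img = np.asarray(image)
    h, w = img.shape
    P = np.zeros((levels, levels), dtype=float)
    for y in range(max(0, -dy), min(h, h - dy)):
        for x in range(max(0, -dx), min(w, w - dx)):
            P[img[y, x], img[y + dy, x + dx]] += 1
    return P / P.sum()

img = np.array([[0, 0, 1],
                [1, 2, 2],
                [2, 2, 2]])
P = glcm(img, dx=1, dy=0, levels=3)   # distance 1, horizontal orientation
print(P)
```

Sweeping `dx`, `dy` over distances and orientations, as the study does, yields a family of such matrices from which texture features (e.g., correlation) are derived.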



Quantitative analysis of in-air output ratio  

PubMed Central

Output factor (Scp) is one of the important factors required to calculate monitor units (MU), and is divided into two components: the phantom scatter factor (Sp) and the in-air output ratio (Sc). Generally, Sc for arbitrary fields is calculated using several methods based on Sc determined by absorbed-dose measurements for several square fields. However, there are calculation errors when the treatment field has a large aspect ratio and the openings of the upper and lower collimators are exchanged. To determine Sc accurately, scattered photons from the treatment head and particles backscattered into the monitor chamber must be analyzed individually. In this report, a simulation model that agreed well with measured Sc was constructed, and the dose variation caused by scattered photons from the treatment head and by particles backscattered into the monitor chamber was analyzed quantitatively. The results showed that the contribution of scattered photons from the primary collimator was larger than that of the flattening filter, and backscattered particles were affected by not only the upper jaw but also the lower jaw. In future work, a new Sc determination algorithm based on the results of this report will be proposed. PMID:23292148

Miyashita, Hisayuki; Hatanaka, Shogo; Fujita, Yukio; Hashimoto, Shimpei; Myojyoyama, Atsushi; Saitoh, Hidetoshi



Temporal Kinetics and Quantitative Analysis of Cryptococcus neoformans Nonlytic Exocytosis  

PubMed Central

Cryptococcus neoformans is a facultative intracellular pathogen and the causative agent of cryptococcosis, a disease that is often fatal to those with compromised immune systems. C. neoformans has the capacity to escape phagocytic cells through a process known as nonlytic exocytosis whereby the cryptococcal cell is released from the macrophage into the extracellular environment, leaving both the host and pathogen alive. Little is known about the mechanism behind nonlytic exocytosis, but there is evidence that both the fungal and host cells contribute to the process. In this study, we used time-lapse movies of C. neoformans-infected macrophages to delineate the kinetics and quantitative aspects of nonlytic exocytosis. We analyzed approximately 800 macrophages containing intracellular C. neoformans and identified 163 nonlytic exocytosis events that were further characterized into three subcategories: type I (complete emptying of macrophage), type II (partial emptying of macrophage), and type III (cell-to-cell transfer). The majority of type I and II events occurred after several hours of intracellular residence, whereas type III events occurred significantly (P < 0.001) earlier in the course of macrophage infection. Our results show that nonlytic exocytosis is a morphologically and temporally diverse process that occurs relatively rapidly in the course of macrophage infection. PMID:24595144

Stukes, Sabriya A.; Cohen, Hillel W.



Temporal kinetics and quantitative analysis of Cryptococcus neoformans nonlytic exocytosis.  


Cryptococcus neoformans is a facultative intracellular pathogen and the causative agent of cryptococcosis, a disease that is often fatal to those with compromised immune systems. C. neoformans has the capacity to escape phagocytic cells through a process known as nonlytic exocytosis whereby the cryptococcal cell is released from the macrophage into the extracellular environment, leaving both the host and pathogen alive. Little is known about the mechanism behind nonlytic exocytosis, but there is evidence that both the fungal and host cells contribute to the process. In this study, we used time-lapse movies of C. neoformans-infected macrophages to delineate the kinetics and quantitative aspects of nonlytic exocytosis. We analyzed approximately 800 macrophages containing intracellular C. neoformans and identified 163 nonlytic exocytosis events that were further characterized into three subcategories: type I (complete emptying of macrophage), type II (partial emptying of macrophage), and type III (cell-to-cell transfer). The majority of type I and II events occurred after several hours of intracellular residence, whereas type III events occurred significantly (P < 0.001) earlier in the course of macrophage infection. Our results show that nonlytic exocytosis is a morphologically and temporally diverse process that occurs relatively rapidly in the course of macrophage infection. PMID:24595144

Stukes, Sabriya A; Cohen, Hillel W; Casadevall, Arturo



Quantitative analysis of chaperone network throughput in budding yeast  

PubMed Central

The network of molecular chaperones mediates the folding and translocation of the many proteins encoded in the genome of eukaryotic organisms, as well as the response to stress. It has been particularly well characterised in the budding yeast, Saccharomyces cerevisiae, where 63 known chaperones have been annotated and recent affinity purification and MS/MS experiments have helped characterise the attendant network of chaperone targets to a high degree. In this study, we apply our QconCAT methodology to directly quantify the set of yeast chaperones in absolute terms (copies per cell) via SRM MS. Firstly, we compare these to existing quantitative estimates of these yeast proteins, highlighting differences between approaches. Secondly, we cast the results into the context of the chaperone target network and show a distinct relationship between the abundance of individual chaperones and their targets. This allows us to characterise the 'throughput' of protein molecules passing through individual chaperones and their groups on a proteome-wide scale in an unstressed model eukaryote for the first time. The results demonstrate specialisations of the chaperone classes, which display different overall workloads, efficiencies and preferences for the sub-cellular localisation of their targets. The novel integration of the interactome data with quantification supports re-estimation of the protein throughput passing through molecular chaperones. Additionally, although chaperones target fewer than 40% of annotated proteins, we show that they mediate the folding of the majority of protein molecules (~62% of the total protein flux in the cell), highlighting their importance. PMID:23420633

Brownridge, Philip; Lawless, Craig; Payapilly, Aishwarya B; Lanthaler, Karin; Holman, Stephen W; Harman, Victoria M; Grant, Christopher M; Beynon, Robert J; Hubbard, Simon J



Analysis of quantitative trait loci for behavioral laterality in mice.  

PubMed Central

Laterality is believed to have genetic components, as has been deduced from family studies in humans and responses to artificial selection in mice, but these genetic components are unknown and the underlying physiological mechanisms are still a subject of dispute. We measured direction of laterality (preferential use of left or right paws) and degree of laterality (absolute difference between the use of left and right paws) in C57BL/6ByJ (B) and NZB/BlNJ (N) mice and in their F(1) and F(2) intercrosses. Measurements were taken of both forepaws and hind paws. Quantitative trait loci (QTL) did not emerge for direction but did for degree of laterality. One QTL for forepaw (LOD score = 5.6) and the second QTL for hind paw (LOD score = 7.2) were both located on chromosome 4 and their peaks were within the same confidence interval. A QTL for plasma luteinizing hormone concentration was also found in the confidence interval of these two QTL. These results suggest that the physiological mechanisms underlying degree of laterality react to gonadal steroids. PMID:12663540

Roubertoux, Pierre L; Le Roy, Isabelle; Tordjman, Sylvie; Cherfou, Ameziane; Migliore-Samour, Daniele



Quantitative analysis of pheromone-binding protein specificity  

PubMed Central

Many pheromones have very low water solubility, posing experimental difficulties for quantitative binding measurements. A new method is presented for determining thermodynamically valid dissociation constants for ligands binding to odorant-binding proteins (OBPs), using β-cyclodextrin as a solubilizer and transfer agent. The method is applied to LUSH, a Drosophila OBP that binds the pheromone 11-cis vaccenyl acetate (cVA). Refolding of LUSH expressed in E. coli was assessed by measuring N-phenyl-1-naphthylamine (NPN) binding and Förster resonance energy transfer between LUSH tryptophan 123 (W123) and NPN. Binding of cVA was measured from quenching of W123 fluorescence as a function of cVA concentration. The equilibrium constant for transfer of cVA between β-cyclodextrin and LUSH was determined from a linked equilibria model. This constant, multiplied by the β-cyclodextrin-cVA dissociation constant, gives the LUSH-cVA dissociation constant: ~100 nM. It was also found that other ligands quench W123 fluorescence. The LUSH-ligand dissociation constants were determined to be ~200 nM for the silk moth pheromone bombykol and ~90 nM for methyl oleate. The results indicate that the ligand-binding cavity of LUSH can accommodate a variety of ligands with strong binding interactions. Implications of this for the pheromone receptor model proposed by Laughlin et al. (Cell 133: 1255-65, 2008) are discussed. PMID:23121132

Katti, S.; Lokhande, N.; Gonzalez, D.; Cassill, A.; Renthal, R.
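The linked-equilibria relation stated above is a simple product: the transfer equilibrium constant times the cyclodextrin-ligand dissociation constant gives the protein-ligand Kd. The component values below are illustrative (the abstract reports only the final ~100 nM), chosen so the product matches that figure.

```python
# Sketch of the linked-equilibria arithmetic from the abstract:
# Kd(protein-ligand) = K_transfer * Kd(cyclodextrin-ligand).
# The two inputs below are hypothetical; only the ~100 nM product
# is reported in the abstract.

def protein_kd(k_transfer: float, kd_cyclodextrin: float) -> float:
    """Protein-ligand dissociation constant via the transfer equilibrium."""
    return k_transfer * kd_cyclodextrin

kd = protein_kd(0.01, 10e-6)   # hypothetical: K_transfer = 0.01, Kd(CD) = 10 uM
print(f"{kd * 1e9:.0f} nM")    # -> 100 nM
```

The point of the construction is that neither equilibrium alone requires dissolving the poorly soluble ligand directly in water; the cyclodextrin step supplies the solubility.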



Quantitative proteomic analysis of the Salmonella-lettuce interaction.  


Human pathogens can be internalized into food crops through root and surface uptake and can persist inside crop plants. The goal of this study was to elucidate the global modulation of bacterial and plant protein expression after Salmonella internalizes into lettuce. A quantitative proteomic approach was used to analyse the protein expression of Salmonella enterica serovar Infantis and lettuce cultivar Green Salad Bowl 24 h after infiltrating S. Infantis into lettuce leaves. Among the 50 differentially expressed proteins identified by comparing internalized S. Infantis against S. Infantis grown in Luria broth, proteins involved in glycolysis were down-regulated, while one protein involved in ascorbate uptake was up-regulated. Stress response proteins, especially antioxidant proteins, were up-regulated. The modulation in protein expression suggested that internalized S. Infantis might utilize ascorbate as a carbon source and require multiple stress response proteins to cope with stresses encountered in plants. On the other hand, among the 20 differentially expressed lettuce proteins, proteins involved in the defense response to bacteria were up-regulated. Moreover, the secreted effector PipB2 of S. Infantis and R proteins of lettuce were induced after bacterial internalization into lettuce leaves, indicating that the human pathogen S. Infantis triggered the defense mechanisms of lettuce, which normally respond to plant pathogens. PMID:24512637

Zhang, Yuping; Nandakumar, Renu; Bartelt-Hunt, Shannon L; Snow, Daniel D; Hodges, Laurie; Li, Xu



A quantitative astronomical analysis of the Orion Correlation Theory  

E-print Network

The link between the three major Giza pyramids and the stars of the Orion Belt has long been the subject of various qualitative speculations. An important role in this framework is played by a controversial theory, the so-called "Orion Correlation Theory" (OCT), according to which a perfect coincidence would exist between the mutual positions of the three stars of the Orion Belt and those of the main Giza pyramids. In the present paper the OCT is subjected to quantitative astronomical and astrophysical verification, in order to assess its compatibility with the results of both naked-eye astrometry and photometry. In particular, a linear correlation is found between the heights of the monuments and the present brightness of the Orion Belt stars. Based on these analyses, it is possible to conclude that the OCT is not incompatible with what is expected for the stars of the Orion Belt on the basis of naked-eye astrometry and photometry, as well as stellar evolution theory.

Orofino, Vincenzo
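The height-brightness correlation claimed above is easy to check numerically. The sketch below uses approximate public values for the pyramid heights and Belt-star V magnitudes, paired as the OCT pairs them (Khufu-Alnitak, Khafre-Alnilam, Menkaure-Mintaka); treat the numbers as illustrative rather than the paper's dataset.

```python
import math

# Quick Pearson-correlation check of pyramid height vs. stellar brightness.
# Heights (m) and V magnitudes are approximate public values, not the
# paper's data; flux converts magnitude to relative brightness.

heights = [146.6, 143.5, 65.5]          # Khufu, Khafre, Menkaure (approx., m)
magnitudes = [1.77, 1.69, 2.23]         # Alnitak, Alnilam, Mintaka (approx., V)
flux = [10 ** (-0.4 * m) for m in magnitudes]  # brighter star -> larger flux

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(heights, flux)
print(f"r = {r:.2f}")  # strongly positive with these illustrative values
```

With only three points such a correlation is suggestive at best, which is why the paper's argument also leans on astrometry and stellar-evolution considerations.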



Phase analysis in duplex stainless steel: comparison of EBSD and quantitative metallography methods  

NASA Astrophysics Data System (ADS)

The purpose of the research was to carry out qualitative and quantitative analysis of the phases in DSS in the as-received state and after thermal aging. For quantitative purposes, SEM observations, EDS analyses and electron backscatter diffraction (EBSD) methods were employed. Quantitative analysis of phases was performed by two methods: EBSD and classical quantitative metallography. A juxtaposition of different etchants for revealing the microstructure and a brief review of sample-preparation methods for EBSD studies are presented. Different ways of sample preparation were tested, and based on these results a detailed methodology of DSS phase analysis was developed, including surface finishing, selective etching methods and image acquisition. The advantages and disadvantages of the applied methods were pointed out, and the accuracy of the phase analysis performed by both methods was compared.

Michalska, J.; Chmiela, B.



Quantitative determination of total lipid hydroperoxides by a flow injection analysis system  

Microsoft Academic Search

A flow injection analysis (FIA) system coupled with a fluorescence detection system using diphenyl-1-pyrenylphosphine (DPPP) was developed as a highly sensitive and reproducible quantitative method for total lipid hydroperoxide analysis. Fluorescence analysis of the DPPP oxide generated by the reaction of lipid hydroperoxides with DPPP enabled a quantitative determination of the total amount of lipid hydroperoxides. Use of 1-myristoyl-2-(12-((7-nitro-2-1,3-benzoxadiazol-4-yl)amino) dodecanoyl)-sn-glycero-3-phosphocholine as

Jeong-Ho Sohn; Yusuke Taki; Hideki Ushio; Toshiaki Ohshima



Quantitative proteomics analysis of adsorbed plasma proteins classifies nanoparticles with different surface properties and size  

SciTech Connect

In biofluids (e.g., blood plasma), nanoparticles are readily embedded in layers of proteins that can affect their biological activity and biocompatibility. Herein, we report a study on the interactions between human plasma proteins and nanoparticles with a controlled, systematic variation of properties, using stable isotope labeling and liquid chromatography-mass spectrometry (LC-MS) based quantitative proteomics. A novel protocol was developed to simplify the isolation of nanoparticle-bound proteins and improve reproducibility. Plasma proteins associated with polystyrene nanoparticles with three different surface chemistries and two sizes, as well as four different exposure times (for a total of 24 different samples), were identified and quantified by LC-MS analysis. Quantitative comparison of relative protein abundances was achieved by spiking an 18O-labeled 'universal reference' into each individually processed unlabeled sample as an internal standard, enabling simultaneous application of both label-free and isotopic-labeling quantitation across the sample set. Clustering analysis of the quantitative proteomics data resulted in a distinctive pattern that classifies the nanoparticles based on their surface properties and size. In addition, data from the temporal study indicated that the stable protein 'corona' isolated for the quantitative analysis appeared to be formed in less than 5 minutes. The comprehensive results obtained herein using quantitative proteomics have potential implications for predicting nanoparticle biocompatibility.

Zhang, Haizhen; Burnum, Kristin E.; Luna, Maria L.; Petritis, Brianne O.; Kim, Jong Seo; Qian, Weijun; Moore, Ronald J.; Heredia-Langner, Alejandro; Webb-Robertson, Bobbie-Jo M.; Thrall, Brian D.; Camp, David G.; Smith, Richard D.; Pounds, Joel G.; Liu, Tao
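The spiked-reference scheme described above can be sketched as follows: every unlabeled sample is ratioed against the same 18O-labeled universal reference, so samples processed separately become directly comparable through light/heavy ratios. All intensity values below are illustrative.

```python
import numpy as np

# Sketch of 'universal reference' quantitation: each sample's light
# (unlabeled) peptide intensity is divided by the co-analyzed heavy
# (18O-labeled) reference intensity; cross-sample fold changes are then
# ratios of these ratios. Numbers are invented for illustration.

# rows = peptides, columns = samples
light = np.array([[2.0e6, 4.1e6, 1.0e6],
                  [5.0e5, 4.8e5, 5.2e5]])   # sample signal
heavy = np.array([[1.0e6, 1.0e6, 1.1e6],
                  [5.0e5, 5.1e5, 5.0e5]])   # spiked reference signal

rel_abundance = light / heavy   # light/heavy ratio per peptide, per sample
# fold change of sample 2 vs sample 1 for peptide 0, via the reference:
fold = rel_abundance[0, 1] / rel_abundance[0, 0]
print(round(fold, 2))  # -> 2.05
```

Because the reference cancels out run-to-run variation, the fold change reflects the biological difference between samples rather than processing differences.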



A Quantitative Content Analysis of Mercer University MEd, EdS, and Doctoral Theses  

ERIC Educational Resources Information Center

Quantitative content analysis of a body of research not only helps budding researchers understand the culture, language, and expectations of scholarship, it helps identify deficiencies and inform policy and practice. Because of these benefits, an analysis of a census of 980 Mercer University MEd, EdS, and doctoral theses was conducted. Each thesis…

Randolph, Justus J.; Gaiek, Lura S.; White, Torian A.; Slappey, Lisa A.; Chastain, Andrea; Harris, Rose Prejean



A Quantitative Analysis of the Extrinsic and Intrinsic Turnover Factors of Relational Database Support Professionals  

ERIC Educational Resources Information Center

This quantitative analysis explored the intrinsic and extrinsic turnover factors of relational database support specialists. Two hundred and nine relational database support specialists were surveyed for this research. The research was conducted based on Hackman and Oldham's (1980) Job Diagnostic Survey. Regression analysis and a univariate ANOVA…

Takusi, Gabriel Samuto



Quantitative analysis of water-soluble vitamins by ATR-FTIR spectroscopy  

Microsoft Academic Search

HPLC and microbiology are the methods traditionally employed to control the vitamin content in food mixtures. However, considerations of cost, time of analysis per sample and complexities involved in the technique have hampered the acceptance of those methods for raw materials analysis. Fourier Transform Infrared (FTIR) spectroscopy has substantial potential as a quantitative quality control tool for the food industry.

C. Wojciechowski; N. Dupuy; C. D. Ta; J. P. Huvenne; P. Legrand



Kinetic Analysis of Amylase Using Quantitative Benedict's and Iodine Starch Reagents  

ERIC Educational Resources Information Center

Quantitative analysis of carbohydrates is a fundamental analytical tool used in many aspects of biology and chemistry. We have adapted a technique developed by Mathews et al. using an inexpensive scanner and open-source image analysis software to quantify amylase activity using both the breakdown of starch and the appearance of glucose. Breakdown…

Cochran, Beverly; Lunday, Deborah; Miskevich, Frank



A fully automated method for quantitative cerebral hemodynamic analysis using DSC-MRI  

PubMed Central

Dynamic susceptibility contrast (DSC)-based perfusion analysis from MR images has become an established method for analysis of cerebral blood volume (CBV) in glioma patients. To date, little emphasis has, however, been placed on quantitative perfusion analysis of these patients, mainly due to the associated increased technical complexity and lack of sufficient stability in a clinical setting. The aim of our study was to develop a fully automated analysis framework for quantitative DSC-based perfusion analysis. The method presented here generates quantitative hemodynamic maps without user interaction, combined with automatic segmentation of normal-appearing cerebral tissue. Validation of 101 patients with confirmed glioma after surgery gave mean values for CBF, CBV, and MTT, extracted automatically from normal-appearing whole-brain white and gray matter, in good agreement with literature values. The measured age- and gender-related variations in the same parameters were also in agreement with those in the literature. Several established analysis methods were compared and the resulting perfusion metrics depended significantly on method and parameter choice. In conclusion, we present an accurate, fast, and automatic quantitative perfusion analysis method where all analysis steps are based on raw DSC data only. PMID:20087370
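As a rough sketch of the standard DSC perfusion relations underlying pipelines like this one (not this paper's specific implementation): CBV is proportional to the area under the tissue contrast-concentration curve, and MTT follows from the central volume theorem, MTT = CBV / CBF. The curve samples and flow estimate below are hypothetical illustration values.

```python
# Basic DSC perfusion relations: CBV from the area under the concentration
# curve (trapezoidal rule), MTT from the central volume theorem.
# Concentration values and the CBF estimate are hypothetical.

def trapz(y, dt):
    """Trapezoidal-rule integral of evenly sampled values y with spacing dt."""
    return sum((y[i] + y[i + 1]) * dt / 2 for i in range(len(y) - 1))

concentration = [0.0, 2.0, 5.0, 3.0, 1.0, 0.0]  # hypothetical tissue curve
dt = 1.0                                         # seconds between samples

cbv = trapz(concentration, dt)  # proportional to cerebral blood volume
cbf = 2.2                       # hypothetical flow estimate (from deconvolution)
mtt = cbv / cbf                 # mean transit time via the central volume theorem
```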

Bjørnerud, Atle; Emblem, Kyrre E



Dynamic and still microcirculatory image analysis for quantitative microcirculation research  

NASA Astrophysics Data System (ADS)

Based on analyses of various types of digital microcirculatory images (DMCI), we summarize the image features of DMCI, the digitizing demands for digital microcirculatory imaging, and the basic characteristics of DMCI processing. A dynamic and still imaging separation processing (DSISP) mode was designed for developing a DMCI workstation and the DMCI processing. Original images in this study were clinical microcirculatory images from human finger nail-bed and conjunctiva microvasculature, and intravital microvascular network images from animal tissues or organs. A series of dynamic and still microcirculatory image analysis functions were developed in this study. The experimental results indicate that most of the established analog video image analysis methods for microcirculatory measurement can be realized in a more flexible way based on the DMCI. More information can be rapidly extracted from the quality-improved DMCI by employing intelligent digital image analysis methods. The DSISP mode is very suitable for building a DMCI workstation.

Ying, Xiaoyou; Xiu, Rui-juan



Quantitative analysis of three-dimensional landmark coordinate data  

NASA Astrophysics Data System (ADS)

The advantages of using three-dimensional (3D) data in the description and analysis of biological forms are obvious: these data provide realistic, geometrically integrated models of the forms under study and can be rotated, translated, and dissected electronically for viewing. 3D coordinate data can be collected from several sources, including computed tomographic images, stereo photographs, specially designed microscopes, and digitizers. But once collected, how can these data be analyzed to address biologically relevant research questions? This paper demonstrates the capabilities of two analytical techniques, finite-element scaling analysis and Euclidean distance matrix analysis, in the comparison of 3D biological forms. Examples include studies of growth of the craniofacial complex and analyses of differences in form between members of biologically defined groups (e.g., species, sexes, diagnostic categories).

Richtsmeier, Joan T.



Quantitative Analysis of Shoot Development and Branching Patterns in Actinidia  

PubMed Central

We developed a framework for the quantitative description of Actinidia vine architecture, classifying shoots into three types (short, medium and long) corresponding to the modes of node number distribution and the presence/absence of neoformed nodes. Short and medium shoots were self-terminated and had only preformed nodes. Based on the cut-off point between their two modes of node number distribution, short shoots were defined as having nine or less nodes, and medium shoots as having more than nine nodes. Long shoots were non-terminated and had a number of neoformed nodes; the total number of nodes per shoot was up to 90. Branching patterns for each parent shoot type were represented by a succession of branching zones. Probabilities of different types of axillary production (latent bud, short, medium or long shoot) and the distributions of length for each branching zone were estimated from experimental data using hidden semi-Markov chain stochastic models. Branching was acrotonic on short and medium parent shoots, with most axillary shoots being located near the shoot tip. For long parent shoots, branching was mesotonic, with most long axillary shoots being located in the transition zone between the preformed and neoformed part of the parent shoot. Although the shoot classification is based on node number distribution there was a marked difference in average (per shoot) internode length between the shoot types, with mean values of 9, 27 and 47 mm for short, medium and long shoots, respectively. Bud and shoot development is discussed in terms of environmental controls. PMID:12096808
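The classification rule stated above (non-terminated shoots are long; self-terminated shoots are short at nine or fewer nodes, otherwise medium) can be expressed directly. This is an illustrative restatement of the abstract's rule, not code from the study.

```python
def classify_shoot(node_count, self_terminated):
    """Classify an Actinidia shoot by the abstract's rule:
    non-terminated shoots are 'long'; self-terminated shoots are
    'short' with nine or fewer nodes, otherwise 'medium'."""
    if not self_terminated:
        return "long"
    return "short" if node_count <= 9 else "medium"

# Examples matching the definitions in the abstract
short_type = classify_shoot(5, True)    # "short"
medium_type = classify_shoot(15, True)  # "medium"
long_type = classify_shoot(60, False)   # "long"
```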




Quantitative analysis of monoclonal antibodies by cation-exchange chromatofocusing.  


A robust cation-exchange chromatofocusing method was developed for the routine analysis of a recombinant humanized monoclonal IgG antibody. We compare the chromatofocusing method to the conventional cation-exchange chromatography (CEX) employing a salt gradient and show clear advantages of chromatofocusing over CEX. We demonstrate the suitability of the present chromatofocusing method for its intended purpose by testing the validation characteristics. To our knowledge, this is the first chromatofocusing method developed for the routine analysis of monoclonal antibody charge species. PMID:19560777

Rozhkova, Anna



Procedures for Quantitative Analysis of Change Facilitator Interventions.  

ERIC Educational Resources Information Center

The procedures and coding schema that have been developed by the Research on the Improvement Process (RIP) Program for analyzing the frequency of interventions and for examining their internal characteristics are described. In two in-depth ethnographic studies of implementation efforts, interventions were the focus of data collection and analysis.…

Hord, Shirley M.; Hall, Gene E.


Regression Commonality Analysis: A Technique for Quantitative Theory Building  

ERIC Educational Resources Information Center

When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…

Nimon, Kim; Reio, Thomas G., Jr.



Quantitative analysis of marine oils by capillary supercritical fluid chromatography  

Microsoft Academic Search

Supercritical fluid chromatographic analysis methods have been employed in the examination of several marine oils for the group separation of free fatty acids, retinol, ergocalciferol, cholecalciferol, squalene, tocopherols, cholesterol, wax esters, diacylglycerols, cholesteryl esters, and triacylglycerols. The oils were derived from characteristic species including shark, seal, edible and trash fish. The supercritical fluid chromatography (SFC) method used for the separation

A. Staby; C. Borch-Jensen; S. Balchen; J. Mollerup



Quantitative Analysis With the Probabilistic Model Checker PRISM  

Microsoft Academic Search

Probabilistic model checking is a formal verification technique for establishing the correctness, performance and reliability of systems which exhibit stochastic behaviour. As in conventional verification, a precise mathematical model of a real-life system is constructed first, and, given formal specifications of one or more properties of this system, an analysis of these properties is performed. The exploration of the

Marta Z. Kwiatkowska; Gethin Norman; David Parker



Reflectance Spectroscopy: Quantitative Analysis Techniques for Remote Sensing Applications  

Microsoft Academic Search

Several methods for the analysis of remotely sensed reflectance data are compared, including empirical methods and scattering theories, both of which are important for solving remote sensing problems. The concept of the photon mean optical path length and the implications for use in modeling reflectance spectra are presented. It is shown that the mean optical path length in a particulate

Roger N. Clark; Ted L. Roush



Quantitative Analysis of White Matter Fiber Properties along Geodesic Paths  

Microsoft Academic Search

Diffusion Tensor Imaging (DTI) is becoming a routine magnetic resonance technique to study white matter properties and alterations of fiber integrity due to pathology. The advanced MRI technique needs postprocessing by adequate image processing and visualization tools. Analyses of DTI in clinical studies so far use manual definition of regions of interest or image matching followed by voxel-based

Pierre Fillard; John H. Gilmore; Joseph Piven; Weili Lin; Guido Gerig



Quantitative assessment of Chicago air pollution through analysis of covariance  

NASA Astrophysics Data System (ADS)

A covariance analysis of source-receptor relationships is reported based on a 6-month air-pollution monitoring study in Chicago. Regular simultaneous measurements were made of total and respirable particulate matter (dp, particle diameter, < 2 μm); sulfate, nitrate and 18 elemental compositions for both size ranges; electrical aerosol analyzer size distributions; condensation nuclei counts; light scatter; CO, SO₂, NO, NO₂ and O₃; temperature, UV radiation, rainfall, humidity, and wind speed and direction. The ratio between light scatter and respirable particle concentration was 2.72 m² g⁻¹, which agrees well with ratios measured at a variety of other locations. Analysis of covariance by wind direction found enrichment, with respect to TSP, for Fe, V, As and NO₃⁻. The patterns were consistent with existing point-source locations and strengths. The effect of street salting was also evident, as Na was enriched with respect to the winter season for dp > 2 μm. On days of low humidity (< 8 mm Hg), SO₄²⁻ concentrations were linearly associated with SO₂ levels (r² = 0.72). This relationship probably reflects primary source contributions, defined as sulfate directly emitted or formed relatively soon after discharge. At higher humidities the SO₂-SO₄²⁻ relationship was not linear, and secondary contributions, as reflected in ozone concentrations from preceding days, appeared to be important. The average primary source contribution to SO₄²⁻ concentration was estimated at 5.5 μg m⁻³, which represented about 60% of the average level. An analysis of nitrate-NO₂ concentrations suggested a linear relationship (r² = 0.58). Analysis of covariance was found to better discriminate between source categories responsible for differences in particulate matter components than the more commonly used analysis of variance and linear regression techniques.
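The low-humidity sulfate-SO2 relationship reported above is an ordinary least-squares regression; a minimal sketch of such a fit, using synthetic illustration values rather than the Chicago measurements:

```python
# Ordinary least-squares fit with coefficient of determination r^2,
# of the kind used for the sulfate-vs-SO2 regression described above.
# The data points are synthetic illustration values.

def ols_r2(x, y):
    """Return (slope, intercept, r^2) of the least-squares line y ~ x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

so2 = [5.0, 10.0, 15.0, 20.0]    # hypothetical SO2 levels
sulfate = [3.1, 5.9, 9.2, 11.8]  # hypothetical sulfate levels
slope, intercept, r2 = ols_r2(so2, sulfate)
```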

Scheff, Peter A.; Wadden, Richard A.; Allen, Robert J.


Mass spectrometry for real-time quantitative breath analysis.  


Breath analysis research is being successfully pursued using a variety of analytical methods, prominent amongst which are gas chromatography with mass spectrometry, GC-MS, ion mobility spectrometry, IMS, and the fast flow and flow-drift tube techniques called selected ion flow tube mass spectrometry, SIFT-MS, and proton transfer reaction mass spectrometry, PTR-MS. In this paper the case is made for real-time breath analysis by obviating sample collection into bags or onto traps that can suffer from partial degradation of breath metabolites or the introduction of impurities. Real-time analysis of a broad range of volatile chemical compounds can be best achieved using SIFT-MS and PTR-MS, which are sufficiently sensitive and rapid to allow the simultaneous analyses of several trace gas metabolites in single breath exhalations. The basic principles and the ion chemistry that underpin these two analytical techniques are briefly described and the differences between them, including their respective strengths and weaknesses, are revealed, especially with reference to the analysis of the complex matrix that is exhaled breath. A recent innovation is described that combines time-of-flight mass spectrometry with the proton transfer flow-drift tube reactor, PTR-TOFMS, which provides greater resolution in the analytical mass spectrometer and allows separation of protonated isobaric molecules. Examples are presented of some recent data that well illustrate the quality and real-time feature of SIFT-MS and PTR-MS for the analysis of exhaled breath for physiological/biochemical/pharmacokinetics studies and for the identification and quantification of biomarkers relating to specific disease states. PMID:24682047

Smith, David; Španěl, Patrik; Herbig, Jens; Beauchamp, Jonathan



Quantitative trait loci analysis of swine meat quality traits.  


A QTL study was performed in large half-sib families to characterize the genetic background of variation in pork quality traits as well as to examine the possibilities of including QTL in a marker-assisted selection scheme. The quality traits included ultimate pH in LM and the semimembranosus, drip loss, and the Minolta color measurements L*, a*, and b* representing meat lightness, redness, and yellowness, respectively. The families consist of 3,883 progenies of 12 Duroc boars that were evaluated to identify the QTL. The linkage map consists of 462 SNP markers on 18 porcine autosomes. Quantitative trait loci were mapped using a linear mixed model with fixed factors (sire, sex, herd, month, sow age) and random factors (polygenic effect, QTL effects, and litter). Chromosome-wide and genome-wide significance thresholds were determined by Piepho's approach, and 95% Bayes credibility intervals were estimated from a posterior distribution of the QTL position. In total, 31 QTL for the 6 meat quality traits were found to be significant at the 5% chromosome-wide level, among which 11 QTL were significant at the 5% genome-wide level and 5 of these were significant at the 0.1% genome-wide level. Segregation of the identified QTL in different families was also investigated. Most of the identified QTL segregated in 1 or 2 families. For the QTL affecting ultimate pH in LM and semimembranosus and L* and b* value on SSC6, the positions of the QTL and the shapes of the likelihood curves were almost the same. In addition, a strong correlation of the estimated effects of these QTL was found between the 4 traits, indicating that the same genes control these traits. A similar pattern was seen on SSC15 for the QTL affecting ultimate pH in the 2 muscles and drip loss. The results from this study will be helpful for fine mapping and identifying genes affecting meat quality traits, and tightly linked markers may be incorporated into marker-assisted selection programs. PMID:20495113

Li, H D; Lund, M S; Christensen, O F; Gregersen, V R; Henckel, P; Bendixen, C



Quantitative analysis of night skyglow amplification under cloudy conditions  

NASA Astrophysics Data System (ADS)

The radiance produced by artificial light is a major source of nighttime over-illumination. It can, however, be treated experimentally using ground-based and satellite data. These two types of data complement each other and together have a high information content. For instance, the satellite data enable upward light emissions to be normalized, and this in turn allows skyglow levels at the ground to be modelled under cloudy or overcast conditions. Excessive night lighting imposes an unacceptable burden on nature, humans and professional astronomy. For this reason, there is a pressing need to determine the total amount of downwelling diffuse radiation. Undoubtedly, cloudy periods can cause a significant increase in skyglow as a result of amplification owing to diffuse reflection from clouds. While it is recognized that the amplification factor (AF) varies with cloud cover, the effects of different types of clouds, of atmospheric turbidity and of the geometrical relationships between the positions of an individual observer, the cloud layer, and the light source are in general poorly known. In this paper the AF is quantitatively analysed considering different aerosol optical depths (AODs), urban layout sizes and cloud types with specific albedos and altitudes. The computational results show that the AF peaks near the edges of a city rather than at its centre. In addition, the AF appears to be a decreasing function of AOD, which is particularly important when modelling the skyglow in regions with apparent temporal or seasonal variability of atmospheric turbidity. The findings in this paper will be useful to those designing engineering applications or modelling light pollution, as well as to astronomers and environmental scientists who aim to predict the amplification of skyglow caused by clouds. 
In addition, the semi-analytical formulae can be used to estimate the AF levels, especially in densely populated metropolitan regions for which detailed computations may be CPU-intensive. These new results are of theoretical and experimental significance as they will motivate experimentalists to collect data from various regions to build an overall picture of the AF, and will encourage modellers to test the consistency with theoretical predictions.
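At its simplest, the amplification factor analysed above is the ratio of skyglow radiance under cloud to the clear-sky radiance at the same site; a minimal sketch with hypothetical radiance values (not the paper's semi-analytical formulae):

```python
# Amplification factor (AF) as the cloudy-to-clear skyglow radiance ratio.
# The radiance values are hypothetical illustration numbers.

def amplification_factor(cloudy_radiance, clear_radiance):
    """Ratio of skyglow radiance under cloud to the clear-sky value."""
    if clear_radiance <= 0:
        raise ValueError("clear-sky radiance must be positive")
    return cloudy_radiance / clear_radiance

# A low cloud deck reflecting city light can multiply ground-level skyglow.
af = amplification_factor(cloudy_radiance=6.0, clear_radiance=1.5)
```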

Kocifaj, Miroslav; Solano Lamphar, Héctor Antonio



Quantitative analysis of structural neuroimaging of mesial temporal lobe epilepsy  

PubMed Central

Mesial temporal lobe epilepsy (MTLE) is the most common of the surgically remediable drug-resistant epilepsies. MRI is the primary diagnostic tool to detect anatomical abnormalities and, when combined with EEG, can more accurately identify an epileptogenic lesion, which is often hippocampal sclerosis in cases of MTLE. As structural imaging technology has advanced the surgical treatment of MTLE and other lesional epilepsies, so too have the analysis techniques that are used to measure different structural attributes of the brain. These techniques, which are reviewed here and have been used chiefly in basic research of epilepsy and in studies of MTLE, have identified different types and the extent of anatomical abnormalities that can extend beyond the affected hippocampus. These results suggest that structural imaging and sophisticated imaging analysis could provide important information to identify networks capable of generating spontaneous seizures and ultimately help guide surgical therapy that improves postsurgical seizure-freedom outcomes. PMID:24319498

Memarian, Negar; Thompson, Paul M; Engel, Jerome; Staba, Richard J



Integrated quantitative fractal polarimetric analysis of monolayer lung cancer cells  

NASA Astrophysics Data System (ADS)

Digital diagnostic pathology has become one of the most valuable and convenient technological advancements of recent years. It allows us to acquire, store and analyze pathological information from images of histological and immunohistochemical glass slides, which are scanned to create digital slides. In this study, efficient fractal, wavelet-based polarimetric techniques for histological analysis of monolayer lung cancer cells are introduced and different monolayer cancer lines are studied. The outcome of this study indicates that applying fractal, wavelet polarimetric principles to the analysis of squamous carcinoma and adenocarcinoma cell lines may prove extremely useful in discriminating between healthy and lung cancer cells as well as differentiating among different lung cancer cells.

Shrestha, Suman; Zhang, Lin; Quang, Tri; Farrahi, Tannaz; Narayan, Chaya; Deshpande, Aditi; Na, Ying; Blinzler, Adam; Ma, Junyu; Liu, Bo; Giakos, George C.



Quantitative TP73 Transcript Analysis in Hepatocellular Carcinomas  

Microsoft Academic Search

Purpose: The p53 family member p73 displays significant homology to p53, but data from primary tumors demonstrating increased expression levels of p73 in the absence of any gene mutations argue against a classical tumor suppressor function. A detailed analysis of the p73 protein in tumor tissues has revealed expression of two classes of p73 isoforms. Whereas the

Thorsten Stiewe; Sebastian Tuve; Martin Peter; Andrea Tannapfel; Ahmet H. Elmaagacli; Brigitte M. Putzer



A quantitative-experiential analysis of human emotions  

Microsoft Academic Search

A study was carried out in which participants provided written first-person present-tense descriptions of experiences of ordinary human emotions and visual analogue scale responses to intensities of desire, expectation, and positive or negative emotional feeling that were present during these experiences. Functional interrelationships found in this analysis could be described by general equations of the form (1) Feeling intensity = K1·Desire + K2·Desire × Expectation for positive approach

Donald D. Price; James E. Barrell; James J. Barrell



Enabling Quantitative Analysis in Ambient Ionization Mass Spectrometry: Internal Standard Coated Capillary Samplers  

PubMed Central

We describe a sampling method using glass capillaries for quantitative analysis of trace analytes in small volumes of complex mixtures (~1 μL) using ambient ionization mass spectrometry. The internal surface of a sampling glass capillary was coated with an internal standard and then used to draw liquid sample, transferring both the analyte and internal standard in a single fixed volume onto a substrate for analysis. The internal standard was automatically mixed into the sample during this process, and the volumes of the internal standard solution and sample are both fixed by the capillary volume. Precision in quantitation is insensitive to variations in the length of the capillary, making preparation of the sampling capillary simple and providing a robust sampling protocol. Significant improvements in quantitation accuracy were obtained for analysis of 1 μL samples using various ambient ionization methods. PMID:23731380

Liu, Jiangjiang; Cooks, R. Graham; Ouyang, Zheng



Quantitative spatiotemporal image analysis of fluorescein angiography in age-related macular degeneration  

NASA Astrophysics Data System (ADS)

Interpretation and analysis of retinal angiographic studies has been largely qualitative. Quantitative analysis of pathologic fundus features will facilitate interpretation and potentiate clinical studies where precise image metrology is vital. Fluorescein angiography studies of patients with age- related macular degeneration were digitized. Sequential temporal images were spatially-registered with polynomial warping algorithms, allowing for the construction of a three- dimensional (two spatial and one temporal) angiogram vector. Temporal profiles through spatially-registered, temporally- sequential pixels were computed. Characteristic temporal profiles for fundus background, retinal vasculature, retinal pigment epithelial atrophy, and choroidal neovascular (CNV) membranes were observed, allowing for pixel assignment and fundus feature quantitation. Segmentation and quantitation of fundus features including geographic atrophy and CNV is facilitated by spatio-temporal image analysis.
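The spatio-temporal construction described above, spatially registered frames stacked along time with a temporal intensity profile read out per pixel, can be sketched as follows. The tiny frames are hypothetical illustration data, not angiographic images.

```python
# Stack of spatially registered frames (two spatial dimensions plus time),
# with per-pixel temporal profiles read out, as in the abstract's
# three-dimensional angiogram vector. Frames are hypothetical 2x2 arrays.

frames = [                      # three registered 2x2 frames (t = 0, 1, 2)
    [[0, 10], [20, 30]],
    [[5, 40], [25, 35]],
    [[3, 80], [22, 33]],
]

def temporal_profile(frames, row, col):
    """Intensity of one spatially registered pixel across all time points."""
    return [f[row][col] for f in frames]

# A vessel-like pixel shows rapid dye filling; background stays nearly flat.
vessel = temporal_profile(frames, 0, 1)
background = temporal_profile(frames, 0, 0)
```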

Berger, Jeffrey W.



Identifying severity of electroporation through quantitative image analysis  

NASA Astrophysics Data System (ADS)

Electroporation is the formation of reversible hydrophilic pores in the cell membrane under electric fields. Severity of electroporation is challenging to measure and quantify. An image analysis method is developed, and the initial results with a fabricated microfluidic device are reported. The microfluidic device contains integrated microchannels and coplanar interdigitated electrodes allowing low-voltage operation and low-power consumption. Noninvasive human buccal cell samples were specifically stained, and electroporation was induced. Captured image sequences were analyzed for pixel color ranges to quantify the severity of electroporation. The method can detect even a minor occurrence of electroporation and can perform comparative studies.
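A minimal sketch of the pixel colour-range counting idea described above, scoring severity as the fraction of pixels that fall within the stain's range; the image values and thresholds are hypothetical, not the paper's calibration:

```python
# Quantify electroporation severity as the fraction of pixels whose
# intensity falls in the stain's range [lo, hi]. The 3x3 "image" and
# the 200-255 threshold are hypothetical illustration values.

def severity(image, lo, hi):
    """Fraction of pixels with value in [lo, hi]."""
    flat = [px for row in image for px in row]
    hits = sum(1 for px in flat if lo <= px <= hi)
    return hits / len(flat)

image = [
    [10, 200, 210],
    [205, 50, 220],
    [30, 215, 40],
]

# Stained (electroporated) pixels assumed to fall in 200-255.
frac = severity(image, 200, 255)
```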

Morshed, Bashir I.; Shams, Maitham; Mussivand, Tofy



Quantitative assessment of p-glycoprotein expression and function using confocal image analysis.  


P-glycoprotein is implicated in clinical drug resistance; thus, rapid quantitative analysis of its expression and activity is of paramount importance to the design and success of novel therapeutics. The scope for the application of quantitative imaging and image analysis tools in this field is reported here at "proof of concept" level. P-glycoprotein expression was utilized as a model for quantitative immunofluorescence and subsequent spatial intensity distribution analysis (SpIDA). Following expression studies, p-glycoprotein inhibition as a function of verapamil concentration was assessed in two cell lines using live cell imaging of intracellular Calcein retention and a routine monolayer fluorescence assay. Intercellular and sub-cellular distributions in the expression of the p-glycoprotein transporter between parent and MDR1-transfected Madin-Darby Canine Kidney cell lines were examined. We have demonstrated that quantitative imaging can provide dose-response parameters while permitting direct microscopic analysis of intracellular fluorophore distributions in live and fixed samples. Analysis with SpIDA offers the ability to detect heterogeneity in the distribution of labeled species, and in conjunction with live cell imaging and immunofluorescence staining may be applied to the determination of pharmacological parameters or analysis of biopsies, providing a rapid prognostic tool. PMID:25158832
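Dose-response parameters of the kind such imaging assays yield are commonly extracted with a four-parameter logistic fit; a sketch with hypothetical parameter values (not measured verapamil data):

```python
# Four-parameter logistic dose-response curve, the standard form for
# extracting parameters such as IC50. All numbers are hypothetical.

def logistic_response(dose, bottom, top, ic50, hill):
    """Response at a given dose under a four-parameter logistic model."""
    return bottom + (top - bottom) / (1 + (dose / ic50) ** hill)

# At dose == ic50 the response is midway between bottom and top,
# which is the defining property of the IC50 parameter.
resp = logistic_response(dose=2.0, bottom=0.0, top=100.0, ic50=2.0, hill=1.0)
```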

Hamrang, Zahra; Arthanari, Yamini; Clarke, David; Pluen, Alain




Microsoft Academic Search

A quantitative analysis of prostate-specific antigen (PSA) in samples of human blood serum by fluorescence immunochromatography using monoclonal antibodies to PSA was developed. The fluorescence immunochromatographic analysis system is composed of an anti-PSA monoclonal antibody (mAb), fluorescence conjugates in detection solution, an immunochromatographic assay strip, and a laser fluorescence scanner. A fluorescence immunochromatographic analysis system was employed to detect PSA on the

Jisun Yoo; Young Mee Jung; Jong Hoon Hahn; Dongjin Pyo



Quantitative Analysis of Guanine Nucleotide Exchange Factors (GEFs) as Enzymes  

PubMed Central

The proteins that possess guanine nucleotide exchange factor (GEF) activity, which include about 800 G protein coupled receptors (GPCRs) [1], 15 Arf GEFs [2], 81 Rho GEFs [3], 8 Ras GEFs [4], and others for other families of GTPases [5], catalyze the exchange of GTP for GDP on all regulatory guanine nucleotide binding proteins. Despite their importance as catalysts, relatively few exchange factors (we are aware of only eight for ras superfamily members) have been rigorously characterized kinetically [5-13]. In some cases, kinetic analysis has been simplistic, leading to erroneous conclusions about mechanism (as discussed in a recent review [14]). In this paper, we compare two approaches for determining the kinetic properties of exchange factors: (i) examining individual equilibria, and (ii) analyzing the exchange factors as enzymes. Each approach, when thoughtfully used [14,15], provides important mechanistic information about the exchange factors. The analysis as enzymes is described in further detail. With the focus on the production of the biologically relevant guanine nucleotide binding protein complexed with GTP (G•GTP), we believe it is conceptually simpler to connect the kinetic properties to cellular effects. Further, the experiments are often more tractable than those used to analyze the equilibrium system and, therefore, more widely accessible to scientists interested in the function of exchange factors. PMID:25332840
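Analyzing an exchange factor as an enzyme means treating initial rates of G•GTP production with standard Michaelis-Menten kinetics, v = Vmax·[S] / (Km + [S]); a sketch with hypothetical constants, not measured GEF parameters:

```python
# Michaelis-Menten initial-rate analysis, the "exchange factor as enzyme"
# treatment. Vmax, Km, and substrate concentrations are hypothetical.

def mm_rate(s, vmax, km):
    """Initial reaction rate at substrate concentration s."""
    return vmax * s / (km + s)

vmax, km = 10.0, 2.0                 # hypothetical enzymatic constants
substrate = [0.5, 2.0, 8.0, 32.0]    # GDP-bound G protein concentrations
rates = [mm_rate(s, vmax, km) for s in substrate]

# At s == Km the rate is half-maximal, the defining property of Km.
half_max = mm_rate(km, vmax, km)
```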

Randazzo, Paul A; Jian, Xiaoying; Chen, Pei-Wen; Zhai, Peng; Soubias, Olivier; Northup, John K



The Correlation of Contrast-Enhanced Ultrasound and MRI Perfusion Quantitative Analysis in Rabbit VX2 Liver Cancer.  


Our objective was to explore the value of contrast-enhanced ultrasound (CEUS) and quantitative MRI perfusion analysis in liver cancer and the correlation between these two methods. A rabbit VX2 liver cancer model was established in this study. CEUS was applied: SonoVue was administered to the rabbits via the ear vein to dynamically observe and record blood perfusion and its changes in VX2 liver cancer and the surrounding tissue. Quantitative MRI perfusion analysis was used to analyze the mean enhancement time and the pattern of maximal slope increase, which were further compared with the pathological examination results. Quantitative indicators from CEUS and MRI perfusion analysis were compared, and the correlation between them was assessed by correlation analysis. The rabbit VX2 liver cancer model was successfully established. CEUS showed that the time-intensity curve of rabbit VX2 liver cancer followed a "fast in, fast out" pattern, while quantitative MRI perfusion analysis showed that the quantitative parameter MTE of tumor tissue increased and MSI decreased; the difference was statistically significant (P < 0.01). The diagnostic results of CEUS and quantitative MRI perfusion analysis were not significantly different (P > 0.05), but their quantitative parameters were significantly positively correlated (P < 0.05). CEUS and quantitative MRI perfusion analysis can both dynamically monitor liver cancer lesions and the surrounding liver parenchyma, and their quantitative parameters are correlated. The combined application of both is of importance in the early diagnosis of liver cancer. PMID:25123838

Xiang, Zhiming; Liang, Qianwen; Liang, Changhong; Zhong, Guimian



Quantitative Analysis of Bloggers Collective Behavior Powered by Emotions  

E-print Network

Large-scale data resulting from users online interactions provide the ultimate source of information to study emergent social phenomena on the Web. From individual actions of users to observable collective behaviors, different mechanisms involving emotions expressed in the posted text play a role. Here we combine approaches of statistical physics with machine-learning methods of text analysis to study emergence of the emotional behavior among Web users. Mapping the high-resolution data from onto bipartite network of users and their comments onto posted stories, we identify user communities centered around certain popular posts and determine emotional contents of the related comments by the emotion classifier developed for this type of text. Applied over different time periods, this framework reveals strong correlations between the excess of negative emotions and the evolution of communities. We observe avalanches of emotional comments exhibiting significant self-organized critical behavior and temporal correlations.

Mitrović, Marija; Tadić, Bosiljka



Quantitative Analysis of Genealogy Using Digitised Family Trees  

E-print Network

Driven by the popularity of television shows such as Who Do You Think You Are? many millions of users have uploaded their family tree to web projects such as WikiTree. Analysis of this corpus enables us to investigate genealogy computationally. The study of heritage in the social sciences has led to an increased understanding of ancestry and descent but such efforts are hampered by difficult to access data. Genealogical research is typically a tedious process involving trawling through sources such as birth and death certificates, wills, letters and land deeds. Decades of research have developed and examined hypotheses on population sex ratios, marriage trends, fertility, lifespan, and the frequency of twins and triplets. These can now be tested on vast datasets containing many billions of entries using machine learning tools. Here we survey the use of genealogy data mining using family trees dating back centuries and featuring profiles on nearly 7 million individuals based in over 160 countries. These data a...

Fire, Michael; Elovici, Yuval



Quantitative real-time single particle analysis of virions.  


Providing information about single virus particles has for a long time been mainly the domain of electron microscopy. More recently, technologies have been developed-or adapted from other fields, such as nanotechnology-to allow for the real-time quantification of physical virion particles, while supplying additional information such as particle diameter concomitantly. These technologies have progressed to the stage of commercialization increasing the speed of viral titer measurements from hours to minutes, thus providing a significant advantage for many aspects of virology research and biotechnology applications. Additional advantages lie in the broad spectrum of virus species that may be measured and the possibility to determine the ratio of infectious to total particles. A series of disadvantages remain associated with these technologies, such as a low specificity for viral particles. In this review we will discuss these technologies by comparing four systems for real-time single virus particle analysis and quantification. PMID:24999044

Heider, Susanne; Metzner, Christoph



Quantitative analysis of caffeine applied to pharmaceutical industry  

NASA Astrophysics Data System (ADS)

The direct determination of some compounds like caffeine in pharmaceutical samples without sample pretreatment and without the separation of these compounds from the matrix (acetyl salicylic acid, paracetamol,…) is very worthwhile. It enables analysis to be performed quickly and without the problems associated with sample manipulation. The samples were diluted directly in KBr powder. We used both diffuse reflectance (DRIFT) and transmission techniques in order to measure the intensity of the peaks of the caffeine in the pharmaceutical matrix. Limits of detection and determination, relative standard deviation, and recovery using caffeine in the same matrix as in the pharmaceutical product are reported. Two methods for the quantification of caffeine were used: calibration line and standard addition techniques.

Baucells, M.; Ferrer, N.; Gómez, P.; Lacort, G.; Roura, M.
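The standard-addition technique named in the abstract can be sketched numerically: spike aliquots of the sample with known amounts of analyte, fit a line to the response, and read the original concentration off the x-intercept. The data and the linear detector response below are synthetic illustrations, not measurements from the paper.

```python
import numpy as np

def standard_addition_concentration(added, signal):
    """Estimate the analyte concentration in a sample by the
    standard-addition method: fit signal = m*added + b and take the
    magnitude of the x-intercept, b/m (assumes a linear response)."""
    m, b = np.polyfit(added, signal, 1)
    return b / m

# Synthetic data: a sample at 2.0 concentration units, spiked with
# known additions; detector response is proportional to total analyte.
added = np.array([0.0, 1.0, 2.0, 3.0])
signal = 0.5 * (2.0 + added)
print(round(standard_addition_concentration(added, signal), 6))
```

The same fit on real DRIFT or transmission peak intensities would replace the idealized `signal` array.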



Digital photogrammetry for quantitative wear analysis of retrieved TKA components.  


The use of new materials in knee arthroplasty demands a way in which to accurately quantify wear in retrieved components. Methods such as damage scoring, coordinate measurement, and in vivo wear analysis have been used in the past. The limitations in these methods illustrate a need for a different methodology that can accurately quantify wear, which is relatively easy to perform and uses a minimal amount of expensive equipment. Off-the-shelf digital photogrammetry represents a potentially quick and easy alternative to what is readily available. Eighty tibial inserts were visually examined for front and backside wear and digitally photographed in the presence of two calibrated reference fields. All images were segmented (via manual and automated algorithms) using Adobe Photoshop and National Institute of Health ImageJ. Finally, wear was determined using ImageJ and Rhinoceros software. The absolute accuracy of the method and repeatability/reproducibility by different observers were measured in order to determine the uncertainty of wear measurements. To determine if variation in wear measurements was due to implant design, 35 implants of the three most prevalent designs were subjected to retrieval analysis. The overall accuracy of area measurements was 97.8%. The error in automated segmentation was found to be significantly lower than that of manual segmentation. The photogrammetry method was found to be reasonably accurate and repeatable in measuring 2-D areas and applicable to determining wear. There was no significant variation in uncertainty detected among different implant designs. Photogrammetry has a broad range of applicability since it is size- and design-independent. A minimal amount of off-the-shelf equipment is needed for the procedure and no proprietary knowledge of the implant is needed. PMID:16649169

Grochowsky, J C; Alaways, L W; Siskey, R; Most, E; Kurtz, S M



Quantitative Surface Analysis of NBS Standard Materials and Mt. St. Helens Ash by Electron Spectroscopy for Chemical Analysis  

Microsoft Academic Search

Results are presented which develop a quantitative method of surface analysis by ESCA for complex heterogeneous systems. Calibration and application of the method to determination of surface weight percentages are discussed. Mt. St. Helens Ash is used to authenticate the method; results agree with bulk analysis to ±20%. Results from NBS standard materials are used to establish detection limits of

Joseph A. Gardella Jr; David M. Hercules



77 FR 63801 - Aqua-Leisure Industries, Inc., Provisional Acceptance of a Settlement Agreement and Order  

Federal Register 2010, 2011, 2012, 2013

...CPSC Docket No. 13-C0001] Aqua-Leisure Industries, Inc., Provisional Acceptance...provisionally-accepted Settlement Agreement with Aqua-Leisure Industries, Inc., containing a civil...SAFETY COMMISSION In the Matter of: Aqua-Leisure Industries, Inc. CPSC Docket...



[Clinical application of image processing system and quantitative analysis of left ventriculogram and coronary arteriogram].  


Cine left ventriculography and coronary arteriography remain among the most important methods in the diagnosis of coronary heart disease and other coronary arterial diseases. An image processing system, the "IA-87 Medical Image Processing System", for quantitative analysis of cine-coronary and left ventricular angiograms has been developed on an IBM-PC/AT computer. The major functions of this system are: (1) left ventricular volume determination: the left ventricular contour can be drawn automatically or semi-automatically, and the systolic and diastolic volumes of the left ventricle are calculated by Simpson's, length-area and chord-length methods; (2) left ventricular segmental wall motion analysis: using rectilinear and polar methods, the segmental ejection fraction and normalized segmental contraction are determined; (3) dynamic display of the cardiac cycle; (4) quantitative analysis of coronary arterial lesions, such as stenosis. In a series of cases free from cardiac disease, the normal coronary artery (40 cases) and left ventricle (30 cases) were quantitatively analysed using the "IA-87 Medical Image Processing System", and normal values of coronary artery diameter and left ventricular function among Chinese subjects were obtained. At the same time, in a series of 45 cases with coronary heart disease (anterior wall infarction, posterior wall infarction, and left ventricular aneurysm, 15 cases each), quantitative analysis of the left ventricle was made. The results showed that the system is of significant value for the quantitative diagnosis of ischemic heart disease as well as for evaluation of therapeutic effect and prediction of prognosis. PMID:1879318

Dai, R
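Of the volume methods listed in the abstract, Simpson's (method of discs) is the most direct to sketch: slice the ventricle along its long axis and sum disc volumes. The single-plane form and the cylinder check below are a textbook illustration, not the IA-87 system's implementation.

```python
import math

def lv_volume_simpson(diameters_cm, long_axis_cm):
    """Single-plane Simpson's rule (method of discs): slice the
    ventricle into N discs along the long axis; a disc of diameter d
    contributes pi/4 * d**2 * (L/N) to the volume (result in ml)."""
    h = long_axis_cm / len(diameters_cm)
    return sum(math.pi / 4.0 * d * d * h for d in diameters_cm)

# Sanity check on an idealized shape: 20 equal chords of 4 cm over an
# 8 cm long axis describe a cylinder of volume pi * r^2 * L = 32*pi ml.
diams = [4.0] * 20
print(round(lv_volume_simpson(diams, 8.0), 1))
```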



Project #05R: Xiaoping Hu and Jue Zhang: Quantitative fMRI and Network Analysis of Acupuncture Induced Brain Activity  

E-print Network

This project is a collaboration that has performed a quantitative fMRI study of acupuncture. This collaboration proved fruitful as it allowed us to establish quantitative fMRI as a viable approach to study acupuncture and led to very

Weber, Rodney


Laser-induced breakdown spectroscopy for in situ qualitative and quantitative analysis of mineral ores  

NASA Astrophysics Data System (ADS)

In this work, the potential of laser-induced breakdown spectroscopy (LIBS) for discrimination and analysis of geological materials was examined. The research was focused on classification of mineral ores using their LIBS spectra prior to quantitative determination of copper. Quantitative analysis is not a trivial task in LIBS measurement because intensities of emission lines in laser-induced plasmas (LIP) are strongly affected by the sample matrix (matrix effect). To circumvent this effect, typically matrix-matched standards are used to obtain matrix-dependent calibration curves. If the sample set consists of a mixture of different matrices, even in this approach, the corresponding matrix has to be known prior to the downstream data analysis. For this categorization, the multielemental character of LIBS spectra can be of help. In this contribution, a principal component analysis (PCA) was employed on the measured data set to discriminate individual rocks as individual matrices against each other according to their overall elemental composition. Twenty-seven igneous rock samples were analyzed in the form of fine dust, classified and subsequently quantitatively analyzed. Two different LIBS setups in two laboratories were used to prove the reproducibility of classification and quantification. A superposition of partial calibration plots constructed from the individual clustered data displayed a large improvement in precision and accuracy compared to the calibration plot constructed from all ore samples. The classification of mineral samples with complex matrices can thus be recommended prior to LIBS system calibration and quantitative analysis.

Pořízka, P.; Demidov, A.; Kaiser, J.; Keivanian, J.; Gornushkin, I.; Panne, U.; Riedel, J.
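The classification step the abstract describes, projecting multielemental spectra onto principal components so samples cluster by matrix before per-matrix calibration, can be sketched with an SVD-based PCA. The two synthetic "matrices" below (spectra differing in which channel carries the dominant line) are hypothetical stand-ins for LIBS data.

```python
import numpy as np

def pca_scores(spectra, n_components=2):
    """Project mean-centered spectra onto their leading principal
    components via SVD; a minimal stand-in for the PCA matrix
    classification step described in the abstract."""
    X = spectra - spectra.mean(axis=0)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:n_components].T

# Two synthetic "matrices": ten spectra each, dominant emission line
# at a different channel (hypothetical data, not LIBS measurements).
rng = np.random.default_rng(0)
a = rng.normal(0.0, 0.05, (10, 50)); a[:, 5] += 1.0
b = rng.normal(0.0, 0.05, (10, 50)); b[:, 20] += 1.0
scores = pca_scores(np.vstack([a, b]))
labels = scores[:, 0] > 0
separated = (labels[:10] == labels[0]).all() and \
            (labels[10:] == labels[10]).all() and labels[0] != labels[10]
print(bool(separated))
```

Calibration curves would then be fitted separately within each cluster, mirroring the partial calibration plots described above.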



Direct Quantitative Analysis of Arsenic in Coal Fly Ash  

PubMed Central

A rapid, simple method based on graphite furnace atomic absorption spectrometry is described for the direct determination of arsenic in coal fly ash. Solid samples were directly introduced into the atomizer without preliminary treatment. The direct analysis method was not always free of spectral matrix interference, but the stabilization of arsenic by adding palladium nitrate (chemical modifier) and the optimization of the parameters in the furnace program (temperature, rate of temperature increase, hold time, and argon gas flow) gave good results for the total arsenic determination. The optimal furnace program was determined by analyzing different concentrations of a reference material (NIST1633b), which showed the best linearity for calibration. The optimized parameters for the furnace programs for the ashing and atomization steps were as follows: temperatures of 500–1200 and 2150°C, heating rates of 100 and 500°C s⁻¹, hold times of 90 and 7 s, and medium then maximum and medium argon gas flows, respectively. The calibration plots were linear with a correlation coefficient of 0.9699. This method was validated using arsenic-containing raw coal samples in accordance with the requirements of the mass balance calculation; the distribution rate of As in the fly ashes ranged from 101 to 119%. PMID:23251836

Hartuti, Sri; Kambara, Shinji; Takeyama, Akihiro; Kumabe, Kazuhiro; Moritomi, Hiroshi
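The linearity figure the abstract reports (a correlation coefficient of 0.9699 for the NIST1633b calibration) is just a Pearson correlation over the standard series. The concentrations and absorbances below are hypothetical values for illustration only.

```python
import numpy as np

def calibration_r(conc, response):
    """Pearson correlation coefficient of a calibration line; the
    abstract reports r = 0.9699 for the NIST1633b-based calibration."""
    return np.corrcoef(conc, response)[0, 1]

# Hypothetical standard masses (ng) vs integrated absorbance readings.
conc = np.array([0.0, 10.0, 20.0, 40.0, 80.0])
absorbance = np.array([0.002, 0.110, 0.215, 0.430, 0.860])
print(calibration_r(conc, absorbance) > 0.99)
```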



Quantitative image analysis of histological sections of coronary arteries  

NASA Astrophysics Data System (ADS)

The study of coronary arteries has evolved from examining gross anatomy and morphology to scrutinizing micro-anatomy and cellular composition. Technological advances such as high-resolution digital microscopes and high precision cutting devices have allowed examination of coronary artery morphology and pathology at micron resolution. We have developed a software toolkit to analyze histological sections. In particular, we are currently engaged in examining normal coronary arteries in order to provide the foundation for study of remodeled tissue. The first of two coronary arteries was stained for elastin and collagen. The second coronary artery was sectioned and stained for cellular nuclei and smooth muscle. High resolution light microscopy was used to image the sections. Segmentation was accomplished initially with slice-to-slice thresholding algorithms. These segmentation techniques choose optimal threshold values by modeling the tissue as one or more distributions. Morphology and image statistics were used to further differentiate the thresholded data into different tissue categories and thereby refine the results of the segmentation. Specificity/sensitivity analysis suggests that automatic segmentation can be very effective. For both tissue samples, greater than 90% specificity was achieved. Summed voxel projection and maximum intensity projection appear to be effective 3-D visualization tools. Shading methods also provide useful visualization, however it is important to incorporate combined 2-D and 3-D displays. Surface rendering techniques (e.g. color mapping) can be used for visualizing parametric data. Preliminary results are promising, but continued development of algorithms is needed.

Holmes, David R., III; Robb, Richard A.
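One common way to realize the "model the tissue as one or more distributions" thresholding the abstract describes is Otsu's method, which picks the threshold maximizing between-class variance of a bimodal histogram. The sketch and synthetic image below are illustrative; they are not the toolkit's own algorithm.

```python
import numpy as np

def otsu_threshold(img, bins=256):
    """Global threshold maximizing between-class variance (Otsu)."""
    hist, edges = np.histogram(img, bins=bins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                    # weight of the lower class
    mu = np.cumsum(p * centers)          # cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * (1 - w0))
    return centers[np.nanargmax(sigma_b)]

# Bimodal synthetic "image": dark background and bright nuclei.
rng = np.random.default_rng(1)
img = np.concatenate([rng.normal(50, 5, 5000), rng.normal(200, 5, 5000)])
t = otsu_threshold(img)
print(50 < t < 200)  # the threshold falls between the two modes
```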



Mechanistic insights from a quantitative analysis of pollen tube guidance  

PubMed Central

Background Plant biologists have long speculated about the mechanisms that guide pollen tubes to ovules. Although there is now evidence that ovules emit a diffusible attractant, little is known about how this attractant mediates interactions between the pollen tube and the ovules. Results We employ a semi-in vitro assay, in which ovules dissected from Arabidopsis thaliana are arranged around a cut style on artificial medium, to elucidate how ovules release the attractant and how pollen tubes respond to it. Analysis of microscopy images of the semi-in vitro system shows that pollen tubes are more attracted to ovules that are incubated on the medium for longer times before pollen tubes emerge from the cut style. The responses of tubes are consistent with their sensing a gradient of an attractant at 100-150 ?m, farther than previously reported. Our microscopy images also show that pollen tubes slow their growth near the micropyles of functional ovules with a spatial range that depends on ovule incubation time. Conclusions We propose a stochastic model that captures these dynamics. In the model, a pollen tube senses a difference in the fraction of receptors bound to an attractant and changes its direction of growth in response; the attractant is continuously released from ovules and spreads isotropically on the medium. The model suggests that the observed slowing greatly enhances the ability of pollen tubes to successfully target ovules. The relation of the results to guidance in vivo is discussed. PMID:20170550
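The sensing mechanism the model proposes, a pollen tube responding to the difference in the fraction of receptors bound across its tip, can be sketched with simple one-site binding. The 1/r attractant profile and all parameter values below are illustrative assumptions, not quantities fitted in the paper.

```python
def bound_fraction(c, kd):
    """Fraction of receptors occupied at attractant concentration c
    (one-site binding with dissociation constant kd)."""
    return c / (c + kd)

def sensed_difference(r, tip_width, kd, source_rate, diff_coeff):
    """Difference in receptor occupancy between the near and far edges
    of a tube tip at distance r from an ovule, assuming an illustrative
    attractant profile c(r) = q / (D * r)."""
    c_near = source_rate / (diff_coeff * (r - tip_width / 2))
    c_far = source_rate / (diff_coeff * (r + tip_width / 2))
    return bound_fraction(c_near, kd) - bound_fraction(c_far, kd)

# The sensed signal is positive but decays with distance from the
# ovule; compare the 100 and 150 um range noted in the abstract.
d100 = sensed_difference(100.0, 5.0, 1.0, 50.0, 1.0)
d150 = sensed_difference(150.0, 5.0, 1.0, 50.0, 1.0)
print(d100 > d150 > 0)
```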



Funtools: FITS Users Need Tools for Quick, Quantitative Analysis  

NASA Technical Reports Server (NTRS)

The Funtools project arose out of conversations with astronomers about the decline in their software development efforts over the past decade. A stated reason for this decline is that it takes too much effort to master one of the existing FITS libraries simply in order to write a few analysis programs. This problem is exacerbated by the fact that astronomers typically develop new programs only occasionally, and the long interval between coding efforts often necessitates re-learning the FITS interfaces. We therefore set ourselves the goal of developing a minimal buy-in FITS library for researchers who are occasional (but serious) coders. In this case, "minimal buy-in" meant "easy to learn, easy to use, and easy to re-learn next month". Based on conversations with astronomers interested in writing code, we concluded that this goal could be achieved by emphasizing two essential capabilities. The first was the ability to write FITS programs without knowing much about FITS, i.e., without having to deal with the arcane rules for generating a properly formatted FITS file. The second was to support the use of already-familiar C/Unix facilities, especially C structs and Unix stdio. Taken together, these two capabilities would allow researchers to leverage their existing programming expertise while minimizing the need to learn new and complex coding rules.

Mandel, Eric; Brederkamp, Joe (Technical Monitor)



Quantitative analysis of cell columns in the cerebral cortex.  


We present a quantified imaging method that describes the cell column in mammalian cortex. The minicolumn is an ideal template with which to examine cortical organization because it is a basic unit of function, complete in itself, which interacts with adjacent and distant columns to form more complex levels of organization. The subtle details of columnar anatomy should reflect physiological changes that have occurred in evolution as well as those that might be caused by pathologies in the brain. In this semiautomatic method, images of Nissl-stained tissue are digitized or scanned into a computer imaging system. The software detects the presence of cell columns and describes details of their morphology and of the surrounding space. Columns are detected automatically on the basis of cell-poor and cell-rich areas using a Gaussian distribution. A line is fit to the cell centers by least squares analysis. The line becomes the center of the column from which the precise location of every cell can be measured. On this basis, several algorithms describe the distribution of cells from the center line and in relation to the available surrounding space. Other algorithms use cluster analyses to determine the spatial orientation of every column. PMID:10771070

Buxhoeveden, D P; Switala, A E; Roy, E; Casanova, M F
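The core measurement in the abstract, a least-squares line through the cell centers and each cell's distance from that center line, can be sketched directly. The synthetic minicolumn below (cells scattered about a vertical axis) is illustrative; it is not the authors' software.

```python
import numpy as np

def column_axis_and_spread(x, depth):
    """Least-squares fit of the column center line (lateral position x
    as a function of cortical depth) and each cell's perpendicular
    distance from that line."""
    m, b = np.polyfit(depth, x, 1)             # x = m*depth + b
    dist = np.abs(m * depth - x + b) / np.hypot(m, 1.0)
    return (m, b), dist

# Synthetic minicolumn: 30 cells scattered around a vertical axis x=2.
rng = np.random.default_rng(2)
depth = np.linspace(0.0, 100.0, 30)
x = 2.0 + rng.normal(0.0, 0.3, 30)
(m, b), dist = column_axis_and_spread(x, depth)
print(dist.mean() < 1.0 and abs(m) < 0.05)
```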



Machine learning methods for quantitative analysis of Raman spectroscopy data  

NASA Astrophysics Data System (ADS)

The automated identification and quantification of illicit materials using Raman spectroscopy is of significant importance for law enforcement agencies. This paper explores the use of Machine Learning (ML) methods in comparison with standard statistical regression techniques for developing automated identification methods. In this work, the ML task is broken into two sub-tasks, data reduction and prediction. In well-conditioned data, the number of samples should be much larger than the number of attributes per sample, to limit the degrees of freedom in predictive models. In this spectroscopy data, the opposite is normally true. Predictive models based on such data have a high number of degrees of freedom, which increases the risk of models over-fitting to the sample data and having poor predictive power. In the work described here, an approach to data reduction based on Genetic Algorithms is described. For the prediction sub-task, the objective is to estimate the concentration of a component in a mixture, based on its Raman spectrum and the known concentrations of previously seen mixtures. Here, Neural Networks and k-Nearest Neighbours are used for prediction. Preliminary results are presented for the problem of estimating the concentration of cocaine in solid mixtures, and compared with previously published results in which statistical analysis of the same dataset was performed. Finally, this paper demonstrates how more accurate results may be achieved by using an ensemble of prediction techniques.

Madden, Michael G.; Ryder, Alan G.
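The prediction sub-task described above, estimating a component's concentration from the k nearest previously seen spectra, reduces to a short kNN regression. The toy spectra below, whose intensity scales linearly with concentration, are a hypothetical stand-in for Raman data.

```python
import numpy as np

def knn_predict(train_spectra, train_conc, query, k=3):
    """Estimate a mixture component's concentration as the mean over
    the k training spectra nearest (Euclidean) to the query spectrum,
    mirroring the k-Nearest Neighbours prediction sub-task."""
    d = np.linalg.norm(train_spectra - query, axis=1)
    return train_conc[np.argsort(d)[:k]].mean()

# Toy spectra whose intensity scales linearly with concentration.
conc = np.array([0.1, 0.2, 0.3, 0.4, 0.5])
spectra = np.outer(conc, np.ones(20))
query = 0.33 * np.ones(20)
print(round(knn_predict(spectra, conc, query, k=3), 2))
```

A Genetic Algorithm reduction step, as in the paper, would shrink the channel set before the distance computation.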




Quantitative analysis of fall risk using TUG test.  


We examined fall risk among the elderly using a wearable inertial sensor, which combines accelerometer and gyrosensor devices, during the Timed Up and Go (TUG) test. Subjects were categorised into low fall risk and high fall risk groups, with a duration of 13.5 s taken to complete the TUG test as the threshold between them. One sensor was attached dorsally at the subject's waist, and acceleration and gyrosensor signals in three directions were extracted during the test. The analysis was carried out in phases: sit-bend, bend-stand, walking, turning, stand-bend and bend-sit. Comparisons between the two groups showed that time parameters along with root mean square (RMS) value, amplitude and other parameters could reveal the activities in each phase. Classification using the RMS value of the angular velocity for the sit-stand phase, the RMS value of acceleration for the walking phase and the amplitude of the angular velocity signal for the turning phase, along with time parameters, suggests that this is an improved method of evaluating fall risk, which promises benefits in terms of improving elderly quality of life. PMID:23964848

Zakaria, Nor Aini; Kuwae, Yutaka; Tamura, Toshiyo; Minato, Kotaro; Kanaya, Shigehiko
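The RMS parameter used above for phase-wise comparison is a one-line computation per signal segment. The angular-velocity samples below are synthetic, not data from the study.

```python
import numpy as np

def rms(segment):
    """Root-mean-square value of a sensor-signal segment, one of the
    per-phase parameters used to compare the two risk groups."""
    segment = np.asarray(segment, dtype=float)
    return float(np.sqrt(np.mean(segment ** 2)))

# Synthetic angular-velocity samples for a turning phase (rad/s).
turn = np.array([0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5])
print(round(rms(turn), 4))
```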



Quantitative analysis of bloggers' collective behavior powered by emotions  

NASA Astrophysics Data System (ADS)

Large-scale data resulting from users' online interactions provide the ultimate source of information to study emergent social phenomena on the Web. From individual actions of users to observable collective behaviors, different mechanisms involving emotions expressed in the posted text play a role. Here we combine approaches of statistical physics with machine-learning methods of text analysis to study the emergence of emotional behavior among Web users. Mapping the high-resolution data from onto bipartite networks of users and their comments onto posted stories, we identify user communities centered around certain popular posts and determine emotional contents of the related comments by the emotion classifier developed for this type of text. Applied over different time periods, this framework reveals strong correlations between the excess of negative emotions and the evolution of communities. We observe avalanches of emotional comments exhibiting significant self-organized critical behavior and temporal correlations. To explore the robustness of these critical states, we design a network-automaton model on realistic network connections and several control parameters, which can be inferred from the dataset. Dissemination of emotions by a small fraction of very active users appears to critically tune the collective states.

Mitrović, Marija; Paltoglou, Georgios; Tadić, Bosiljka



Biochemical and quantitative analysis of Tamm Horsfall protein in rats.  


The involvement of Tamm Horsfall protein (THP) in nephrolithiasis is currently under investigation in several laboratories. Although rat is a commonly used species as an in vivo model for such studies, there is only limited information available about the biochemical properties and excretion profile of THP in normal rats. In order to characterize rat THP, we purified and analyzed normal male rat THP, and compared it with normal human male urinary THP by gel electrophoresis. Both THPs migrated at approximately 90 kDa, and stained similarly for protein (Coomassie blue) as well as carbohydrates (periodic acid Schiff reagent). Compositional analysis revealed that rat THP was largely similar to human THP in amino acid and carbohydrate contents but showed differences in the individual sugar components from other mammals. There was considerable variation in the day-to-day urinary excretion of THP in normal rats, with values ranging from 552.96 micrograms to 2865.60 micrograms and a mean value of 1679.54 micrograms per 24 h. It was concluded from this study that rat THP did not contain any unusual biochemical components and was primarily similar to human THP in composition and mean urinary concentration. PMID:9373916

Gokhale, J A; Glenton, P A; Khan, S R



Analysis of 129I in Groundwater Samples: Direct and Quantitative Results below the Drinking Water Standard  

SciTech Connect

Due to its long half-life (15.7 million years) and relatively unencumbered migration in subsurface environments, 129I has been recognized as a contaminant of concern at numerous federal, private, and international facilities. In order to understand the long-term risk associated with 129I at these locations, quantitative analysis of groundwater samples must be performed. However, the ability to quantitatively assess the 129I content in groundwater samples requires specialized extraction and sophisticated analytical techniques, which are complicated and not always available to the general scientific community. This paper highlights an analytical method capable of directly quantifying 129I in groundwater samples at concentrations below the MCL without the need for sample pre-concentration. Samples were analyzed on a Perkin Elmer ELAN DRC II ICP-MS after minimal dilution using O2 as the reaction gas. Analysis of continuing calibration verification standards indicated that the DRC mode could be used for quantitative analysis of 129I in samples below the drinking water standard (0.0057 ng/ml or 1 pCi/L). The low analytical detection limit of 129I analysis in the DRC mode coupled with minimal sample dilution (1.02x) resulted in a final sample limit of quantification of 0.0051 ng/ml. Subsequent analysis of three groundwater samples containing 129I resulted in fully quantitative results in the DRC mode, and spike recovery analyses performed on all three samples confirmed that the groundwater matrix did not adversely impact the analysis of 129I in the DRC mode. This analytical approach has been proven to be a cost-effective, high-throughput technique for the direct, quantitative analysis of 129I in groundwater samples at concentrations below the current MCL.

Brown, Christopher F.; Geiszler, Keith N.; Lindberg, Michael J.
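The arithmetic behind the reported sample limit of quantification is simply the instrument limit scaled by the dilution factor; assuming an instrument LOQ of 0.005 ng/ml (an inference from the numbers in the abstract), the 1.02x dilution reproduces the stated 0.0051 ng/ml.

```python
def sample_loq(instrument_loq_ng_ml, dilution_factor):
    """Dilution-corrected limit of quantification: the smallest sample
    concentration quantifiable after accounting for sample dilution."""
    return instrument_loq_ng_ml * dilution_factor

# Assumed instrument LOQ of 0.005 ng/ml with the reported 1.02x
# dilution; the result sits below the 0.0057 ng/ml (1 pCi/L) MCL.
print(round(sample_loq(0.005, 1.02), 4))
```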



Quantitative and chemical fingerprint analysis for quality control of Rhizoma Coptidis chinensis based on UPLC-PAD combined with chemometrics methods  

Microsoft Academic Search

To control the quality of Rhizoma Coptidis, a method based on ultra performance liquid chromatography with photodiode array detector (UPLC-PAD) was developed for quantitative analysis of five active alkaloids and chemical fingerprint analysis. In quantitative analysis, the five alkaloids showed good regression (R > 0.9992) within test ranges and the recovery of the method was in the range of

Wei-Jun Kong; Yan-Ling Zhao; Xiao-He Xiao; Cheng Jin; Zu-Lun Li



Quantitative underwater 3D motion analysis using submerged video cameras: accuracy analysis and trajectory reconstruction.  


In this study we aim at investigating the applicability of underwater 3D motion capture based on submerged video cameras in terms of 3D accuracy analysis and trajectory reconstruction. Static points with classical direct linear transform (DLT) solution, a moving wand with bundle adjustment and a moving 2D plate with Zhang's method were considered for camera calibration. As an example of the final application, we reconstructed the hand motion trajectories in different swimming styles and qualitatively compared this with Maglischo's model. Four highly trained male swimmers performed butterfly, breaststroke and freestyle tasks. The middle fingertip trajectories of both hands in the underwater phase were considered. The accuracy (mean absolute error) of the two calibration approaches (wand: 0.96 mm - 2D plate: 0.73 mm) was comparable to out of water results and highly superior to the classical DLT results (9.74 mm). Among all the swimmers, the hands' trajectories of the expert swimmer in the style were almost symmetric and in good agreement with Maglischo's model. The kinematic results highlight symmetry or asymmetry between the two hand sides, intra- and inter-subject variability in terms of the motion patterns and agreement or disagreement with the model. The two outcomes, calibration results and trajectory reconstruction, both move towards the quantitative 3D underwater motion analysis. PMID:22435960

Silvatti, Amanda P; Cerveri, Pietro; Telles, Thiago; Dias, Fábio A S; Baroni, Guido; Barros, Ricardo M L
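Once a calibration method (classic DLT, wand bundle adjustment, or Zhang's planar method) has produced projection matrices for the submerged cameras, a 3-D point such as a fingertip is recovered by linear triangulation. The sketch below uses toy pinhole cameras and a synthetic point, not the study's underwater calibration.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT-style) triangulation of a 3-D point from two
    calibrated cameras; the calibration step supplies the 3x4
    projection matrices P1 and P2."""
    rows = []
    for P, (u, v) in ((P1, uv1), (P2, uv2)):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    _, _, Vt = np.linalg.svd(np.array(rows))
    X = Vt[-1]                      # null vector of the stacked system
    return X[:3] / X[3]

# Two toy pinhole cameras observing the point (1, 2, 10) (synthetic
# projection matrices for illustration).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([1.0, 2.0, 10.0, 1.0])
uv1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
uv2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
print(np.allclose(triangulate(P1, P2, uv1, uv2), [1.0, 2.0, 10.0]))
```

With noisy image points, the same least-squares machinery yields the reconstruction errors (for example, the reported 0.73-9.74 mm accuracies) rather than an exact recovery.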



Quantitative flux analysis reveals folate-dependent NADPH production.  


ATP is the dominant energy source in animals for mechanical and electrical work (for example, muscle contraction or neuronal firing). For chemical work, there is an equally important role for NADPH, which powers redox defence and reductive biosynthesis. The most direct route to produce NADPH from glucose is the oxidative pentose phosphate pathway, with malic enzyme sometimes also important. Although the relative contribution of glycolysis and oxidative phosphorylation to ATP production has been extensively analysed, similar analysis of NADPH metabolism has been lacking. Here we demonstrate the ability to directly track, by liquid chromatography-mass spectrometry, the passage of deuterium from labelled substrates into NADPH, and combine this approach with carbon labelling and mathematical modelling to measure NADPH fluxes. In proliferating cells, the largest contributor to cytosolic NADPH is the oxidative pentose phosphate pathway. Surprisingly, a nearly comparable contribution comes from serine-driven one-carbon metabolism, in which oxidation of methylene tetrahydrofolate to 10-formyl-tetrahydrofolate is coupled to reduction of NADP(+) to NADPH. Moreover, tracing of mitochondrial one-carbon metabolism revealed complete oxidation of 10-formyl-tetrahydrofolate to make NADPH. As folate metabolism has not previously been considered an NADPH producer, confirmation of its functional significance was undertaken through knockdown of methylenetetrahydrofolate dehydrogenase (MTHFD) genes. Depletion of either the cytosolic or mitochondrial MTHFD isozyme resulted in decreased cellular NADPH/NADP(+) and reduced/oxidized glutathione ratios (GSH/GSSG) and increased cell sensitivity to oxidative stress. Thus, although the importance of folate metabolism for proliferating cells has been long recognized and attributed to its function of producing one-carbon units for nucleic acid synthesis, another crucial function of this pathway is generating reducing power. PMID:24805240

Fan, Jing; Ye, Jiangbin; Kamphorst, Jurre J; Shlomi, Tomer; Thompson, Craig B; Rabinowitz, Joshua D



Clinical applications of a quantitative analysis of regional left ventricular wall motion  

NASA Technical Reports Server (NTRS)

Observations were summarized which may have clinical application. These were obtained from a quantitative analysis of wall motion that was used to detect both hypokinesis and tardokinesis in left ventricular cineangiograms. The method was based on statistical comparisons with normal values for regional wall motion derived from the cineangiograms of patients who were found not to have heart disease.

Leighton, R. F.; Rich, J. M.; Pollack, M. E.; Altieri, P. I.



Quantitative Analysis of Circumferential Plaque Distribution in Human Coronary Arteries in  

E-print Network

· Symptomatic coronary artery disease and atherosclerosis are among the leading causes of death in many ... to be understood. · 3-D fusion of x-ray coronary angiography and intravascular ultrasound (IVUS) data allows

Wahle, Andreas


Quantitative analysis of the effective inter-enzyme connectivity in glycolysis  

E-print Network

Jesus M. Cortes, Granada, Spain. Keywords: glycolysis; inter-enzyme connectivity. The ...-generating mechanism in yeast glycolysis is based on the self-catalytic regulation of the enzyme

Cortes, Jesus


C emQUANT ® software Mathematical modeling in quantitative phase analysis of Portland cement  

Microsoft Academic Search

It is necessary to determine a complete mineralogy of clinker cement to correctly understand, interpret, and predict the outcome of any plant production process. The cement industry's standard method (ASTM C 150) used in quantitative phase analysis of alite, belite, aluminate, and ferrite has long been known to provide approximate concentrations. The wet chemical and optical microscopy methods are too

B. Feret; C. F. Feret



Thermodynamic Analysis of Bridging Bubbles and a Quantitative Comparison with the Measured Hydrophobic  

E-print Network

Received September 2, 1999; in final form February 9, 2000. The shape of a bubble bridging two colloidal spheres is obtained by minimization of the constrained Gibbs free energy. Bubbles

Attard, Phil


Quantitative analysis of urinary stone composition with micro-Raman spectroscopy  

Microsoft Academic Search

Urolithiasis is a common, disturbing disease with a high recurrence rate (60% in five years). Accurate identification of urinary stone composition is important for treatment and prevention purposes. Our previous studies have demonstrated that a micro-Raman spectroscopy (MRS)-based approach successfully detects the composition of tiny stone powders after minimally invasive urological surgery. However, quantitative analysis of urinary stones had not yet been established.

Yi-Yu Huang; Yi-Chun Chiu; Huihua Kenny Chiang; Y. H. Jet Chou; Shing-Hwa Lu; Allen W. Chiu



A Quantitative Genetic Analysis of Nuclear-Cytoplasmic Male Sterility in Structured Populations of Silene vulgaris  

Microsoft Academic Search

Gynodioecy, the coexistence of functionally female and hermaphroditic morphs within plant populations, often has a complicated genetic basis involving several cytoplasmic male-sterility factors and nuclear restorers. This complexity has made it difficult to study the genetics and evolution of gynodioecy in natural populations. We use a quantitative genetic analysis of crosses within and among populations of Silene vulgaris to

Douglas R. Taylor; Matthew S. Olson; David E. McCauley


Using Quantitative Analysis to Implement Autonomic IT Systems Radu Calinescu and Marta Kwiatkowska  

E-print Network

... policies (i.e., system objectives), an autonomic manager monitors the system components through sensors ... system development. First, it is generic: our autonomic manager can be reconfigured for use with any

Oxford, University of


Toward a Quantitative Analysis of Online Pornography Antoine Mazières (SenS-INRA)  

E-print Network

... available for further study. Keywords: online pornography; computational social sciences; sexual categories. ... of pornography, revealing a proliferation of `diff'rent strokes for diff'rent folks' (Williams 1992), she shed

Paris-Sud XI, Université de


Deep tags: toward a quantitative analysis of online pornography Antoine Mazières  

E-print Network

Antoine Mazières, Mathieu ... ... been made publicly available for further study. Keywords: online pornography; computational social sciences. ... of the world. (Sigel 2000, 12) When Linda Williams compared different kinds of pornography, revealing

Paris-Sud XI, Université de


Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues  

ERIC Educational Resources Information Center

This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

Lieber, Eli



Party Strategy and Media Bias: A Quantitative Analysis of the 2005 UK Election Campaign  

Microsoft Academic Search

This article investigates the current state of press partisanship in the UK. Utilizing content analysis data from the 2005 General Election campaign, recent hypotheses about press dealignment are tested with quantitative methods. Partisan tendencies in reporting are measured in terms of coverage bias, statement bias, and agenda bias. As the governing party, Labour benefits from coverage bias in all papers,

Heinz Brandenburg



Climate change and the socioeconomics of global food production: A quantitative analysis  

E-print Network

Andrew J. Dougill and Piers M. Forster, August 2010. Centre for Climate Change Economics and Policy Working Paper No. 29. The Centre for Climate Change Economics and Policy (CCCEP) was established

Rambaut, Andrew



Quantitative Analysis of Photosynthate Unloading in Developing Seeds of Phaseolus vulgaris L.

E-print Network

Ph.D. dissertation by Erle Christopher Ellis, Cornell University, January 1990, submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy. The pathway ...

Ellis, Erle C.


Formal Support for Quantitative Analysis of Residual Risks in Safety-Critical Systems  

E-print Network

... in safety-critical systems, new challenges to lower the costs and decrease time-to-market while preserving ..., and in particular, the impact of probable faults on system-level safety. Every potential fault must


Quantitative analysis of sulfur functional groups in natural organic matter by XANES spectroscopy  

E-print Network

Methods for the quantitative analysis of sulfur functionalities in natural organic matter from S K-edge XANES spectroscopy are presented. ...-induced errors, inherent to the choice of a particular curve, are typically lower than 5% of total sulfur


Regulation of Actin Dynamics in Rapidly Moving Cells: A Quantitative Analysis  

E-print Network

... details of actin dynamics in protrusion associated with cell motility. The model is based on the dendritic ... differential equations for diffusion and reactions of sequestered actin complexes, nucleation, and growth

Keshet, Leah


The validity of commercial LIBS for quantitative analysis of brass alloy — comparison of WDXRF and AAS  

NASA Astrophysics Data System (ADS)

Commercial low-cost laser-induced breakdown spectroscopy (LIBS) has been successfully employed for the quantitative analysis of a Cu-based alloy using a Nd:YAG laser at 1064 nm. The main aim of the present investigation is to explore the benefits of a commercial low-cost LIBS setup. It was recognized that some trace elements such as Al and S could not be detected by LIBS even with a high-resolution spectrometer. The main difficulty in quantifying Cu as the basic component of a brass alloy is the self-absorption of Cu spectral lines, an effect that becomes more pronounced at Cu concentrations above 65%. However, a few Cu lines, such as the one at 330.795 nm, remain useful because of their lower susceptibility to self-absorption. LIBS, flame atomic absorption spectrometry (FAAS), and wavelength-dispersive X-ray fluorescence (WDXRF) were compared for the detection of major and trace metals in the Cu-based alloy. In the case of WDXRF, the brass samples were analyzed with a standardless quantitative analysis program based on a fundamental-parameter approach. The quantitative analysis results were acceptable for most of the major and minor elements of the brass sample. Therefore, commercial low-cost LIBS would be useful for quantitative analysis of most elements in different types of alloys.

Shaltout, Abdallah A.; Abdel-Aal, M. S.; Mostafa, N. Y.



Integrated Computer-Aided Engineering 10 (2003) 369-385: A quantitative analysis of evolvability for an  

E-print Network

While fuzzy logic control has many advantages over traditional methods, it also has some drawbacks at the design stage in that it is difficult to determine optimal parameters. Therefore, many researchers have ...

Cho, Sung-Bae


Quantitative Analysis of Organic Compounds: A Simple and Rapid Method for Use in Schools  

ERIC Educational Resources Information Center

Describes the procedure for making a quantitative analysis of organic compounds suitable for secondary school chemistry classes. Using the Schoniger procedure, the organic compound, such as PVC, is decomposed in a conical flask with oxygen. The products are absorbed in a suitable liquid and analyzed by titration. (JR)

Schmidt, Hans-Jurgen



Selenium and Lung Cancer: A Quantitative Analysis of Heterogeneity in the Current  

E-print Network

... on selenium and lung cancer and identify sources of heterogeneity among studies. When all studies were ... Overall, these results suggest that selenium may have some protective effect against lung cancer

California at Berkeley, University of


Whose American Government? A Quantitative Analysis of Gender and Authorship in American Politics Texts  

ERIC Educational Resources Information Center

American government textbooks signal to students the kinds of topics that are important and, by omission, the kinds of topics that are not important to the discipline of political science. This article examines portrayals of women in introductory American politics textbooks through a quantitative content analysis of 22 widely used texts. We find…

Cassese, Erin C.; Bos, Angela L.; Schneider, Monica C.



Banking and interest rates in monetary policy analysis: A quantitative exploration  

Microsoft Academic Search

The paper reconsiders the role of money and banking in monetary policy analysis by including a banking sector and money in an optimizing model otherwise of a standard type. The model is implemented quantitatively, with a calibration based on US data. It is reasonably successful in providing an endogenous explanation for substantial steady-state differentials between the interbank policy rate and

Marvin Goodfriend; Bennett T. McCallum



Use of laser ablation in quantitative analysis of the elemental composition of art pigments  

NASA Astrophysics Data System (ADS)

The ability to use laser ablation for preparation of art pigment samples in quantitative analysis of their elemental composition by atomic emission spectroscopy of inductively coupled plasma is shown. The proposed technique enables one to eliminate errors associated with both the influence of strong acids and the stoichiometric disruption in a sample.

Klyachkovskaya, E. V.; Muravitskaya, E. V.; Kozhukh, N. M.; Rozantsev, V. A.; Belkov, M. V.; Ershov-Pavlov, E. A.



A New Optical Design Of A Fundus Imaging Device For Quantitative Retinal Analysis  

NASA Astrophysics Data System (ADS)

A new optical design has been incorporated into an instrument for use in digital imaging and quantitative analysis of fundus images. By using an optical design program, in combination with custom electronics, we have developed a new instrument for fundus imaging.

Schalck, Robert E.



Forty Years of the "Journal of Librarianship and Information Science": A Quantitative Analysis, Part I  

ERIC Educational Resources Information Center

This paper reports on the first part of a two-part quantitative analysis of volume 1-40 (1969-2008) of the "Journal of Librarianship and Information Science" (formerly the "Journal of Librarianship"). It provides an overview of the current state of LIS research journal publishing in the UK; a review of the publication and printing history of…

Furner, Jonathan



Quantitative AFM analysis of phase separated borosilicate glass (hal-00110659, version 1 - 31 Oct 2006)  

E-print Network

Borosilicate glass samples were prepared by applying various heat treatments. Using selective chemical etching ... coarsening of a sodium borosilicate glass [24]. In the present work we focus our study on a sodium

Paris-Sud XI, Université de


Quantitative analysis of dimethyl titanocene by iodometric titration, gas chromatography and NMR  

Microsoft Academic Search

In this study we report the use of an automated iodometric titration method and a novel gas chromatography (GC) method for the quantitative analysis of dimethyl titanocene (DMT), a key raw material in drug synthesis. Both approaches are based on the reaction of DMT in toluene or tetrahydrofuran solutions with iodine. In the case of iodometric titration, excess iodine is

Anant Vailaya; Tao Wang; Yadan Chen; Mark Huffman



Quantitative analysis by X-ray induced total electron yield (TEY) compared to XRFA  

Microsoft Academic Search

The theoretical concepts of the two methods are similar. Consequently, comparable fundamental parameter algorithms can be developed and applied to quantitative analysis of bulk specimens and to investigation of thin layers by TEY and by XRFA. Whereas the sampling depth of XRFA is determined by photoelectric absorption, for TEY the escape probability of electrons reduces this quantity to values of

Horst Ebel



Quantitative Phase Analysis (QPA) Nicola V. Y. Scarlett and Ian C. Madsen  

E-print Network

Is this still true? "The uncertainty of the quantitative determination of phase composition by X-ray diffraction ..." (...E. and H.P. Klug, X-ray diffraction analysis of crystalline dusts, Analytical Chemistry, 1948, 20: p. 886). Disadvantages of QPA via XRD: single-peak methods are strongly affected by any

Magee, Joseph W.


Quantitative analysis of dose-effect relationships: the combined effects of multiple drugs or enzyme inhibitors  

Microsoft Academic Search

We demonstrate here the application of a single and generalized method for analyzing dose-effect relationships in enzymatic, cellular and whole animal systems. We also examine the problem of quantitating the effects of multiple inhibitors on such systems and provide definitions of summation of effects, and consequently of synergism and antagonism. Since the proposed method of analysis is derived from generalized
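The dose-effect framework summarized above appears to be the median-effect method with its combination index for synergism and antagonism. A minimal numerical sketch under that assumption (the median-effect equation fa/fu = (D/Dm)^m; all doses and parameters below are hypothetical, not from the paper):

```python
# Median-effect dose-effect analysis and combination index (CI) --
# a minimal sketch of the generalized framework described above.
# Assumes the standard median-effect equation fa/fu = (D/Dm)^m.

def dose_for_effect(fa, Dm, m):
    """Dose producing fractional effect fa for a single agent
    with median-effect dose Dm and slope m."""
    fu = 1.0 - fa
    return Dm * (fa / fu) ** (1.0 / m)

def combination_index(fa, d1, d2, Dm1, m1, Dm2, m2):
    """CI < 1 indicates synergism, CI = 1 summation (additivity),
    CI > 1 antagonism, for doses (d1, d2) jointly producing effect fa."""
    Dx1 = dose_for_effect(fa, Dm1, m1)  # dose of drug 1 alone for effect fa
    Dx2 = dose_for_effect(fa, Dm2, m2)  # dose of drug 2 alone for effect fa
    return d1 / Dx1 + d2 / Dx2

# Example: each drug alone needs its Dm for a 50% effect; if half of each
# Dm together produced that same 50% effect, CI = 0.5 + 0.5 = 1 (additive).
ci = combination_index(0.5, d1=1.0, d2=2.0, Dm1=2.0, m1=1.0, Dm2=4.0, m2=1.0)
print(ci)  # 1.0
```

A CI computed this way below 1 would be reported as synergism in the sense defined above.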




Qualitative and quantitative analysis of mixtures of compounds containing both hydrogen and deuterium  

NASA Technical Reports Server (NTRS)

Method allows qualitative and quantitative analysis of mixtures of partially deuterated compounds. Nuclear magnetic resonance spectroscopy determines location and amount of deuterium in organic compounds but not fully deuterated compounds. Mass spectroscopy can detect fully deuterated species but not the location.

Crespi, H. L.; Harkness, L.; Katz, J. J.; Norman, G.; Saur, W.



Error-tracking clustering gives quantitative statistics to DNA segmentation analysis  

E-print Network

Chih-Hao Chen et al. Affiliations: ... Taiwan 30043; Department of Surgery, Cathay General Hospital, Taipei, Taiwan; Cathay Medical Research Institute, Cathay General Hospital, Taipei, Taiwan; Graduate Institute of Statistics, National Central ...

Lee, H.C. Paul


A simple LIBS method for fast quantitative analysis of fly ashes  

Microsoft Academic Search

An evaluation of the quantitative analysis of major elements (Ca, Al, Mg, Si and Fe) present in fly ashes was made using a simple and cost-effective LIBS system. LIBS parameters were optimized to obtain the best sensitivity and repeatability. For this purpose, different binders were compared, leading to the best sensitivity and mechanical stability when a binder containing silver and cellulose was

Alice Stankova; Nicole Gilon; Lionel Dutruch; Viktor Kanicky



Quantitative Analysis of Tumor Vascularity in Benign and Malignant Solid Thyroid  

E-print Network

... in differentiating malignant and benign solid thyroid nodules, using tumor histologic evaluation as the reference standard. Methods: Eighty-six solid thyroid tumors (46 malignant and 40 benign) in 56 consecutive patients

Miga, Michael I.


Intraoperative Brain Shift and Deformation: A Quantitative Analysis of Cortical Displacement in 28 Cases  

E-print Network

Although surgeons have generally considered the magnitude of intraoperative brain shift sufficiently small ... of surgery, the nature of the cranial opening, the region of the brain involved, the duration of surgery

Frey, Pascal


A Computer Program for Calculation of Calibration Curves for Quantitative X-Ray Diffraction Analysis.  

ERIC Educational Resources Information Center

Describes a FORTRAN IV program written to supplement a laboratory exercise dealing with quantitative x-ray diffraction analysis of mixtures of polycrystalline phases in an introductory course in x-ray diffraction. Gives an example of the use of the program and compares calculated and observed calibration data. (Author/GS)
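The calibration-curve idea behind such a program can be sketched in a few lines. This is a generic internal-standard sketch, not the FORTRAN IV program described above: the intensity ratio of an analyte peak to an internal-standard peak is taken as linear in the analyte weight fraction, and the fitted line is inverted for unknowns. All numbers are illustrative, not measured data.

```python
# Least-squares calibration curve for quantitative XRD via the
# internal-standard method: the ratio of a diffraction peak intensity of
# the analyte phase to that of an internal standard is linear in the
# analyte weight fraction. Values below are hypothetical.

def fit_line(x, y):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Known analyte weight fractions in calibration mixtures, and the
# corresponding measured intensity ratios (hypothetical):
w = [0.10, 0.20, 0.40, 0.60]
r = [0.21, 0.39, 0.81, 1.19]

a, b = fit_line(w, r)

def weight_fraction(ratio):
    """Invert the calibration line to estimate an unknown's weight fraction."""
    return (ratio - b) / a

print(round(weight_fraction(0.60), 3))
```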

Blanchard, Frank N.



Quantitative Analysis of the Resolved X-ray Emission Line Profiles of O Stars  

E-print Network

Talk outline: 1. Chandra spectra: emission lines are broad and asymmetric; 2. Hot-star X-rays in context; 3. Hot-star winds; 4. Emission line shapes: constraints on hot plasma distribution and wind mass ...

Cohen, David


Quantitative analysis of bristle number in Drosophila mutants identifies genes involved in neural development  

NASA Technical Reports Server (NTRS)

BACKGROUND: The identification of the function of all genes that contribute to specific biological processes and complex traits is one of the major challenges in the postgenomic era. One approach is to employ forward genetic screens in genetically tractable model organisms. In Drosophila melanogaster, P element-mediated insertional mutagenesis is a versatile tool for the dissection of molecular pathways, and there is an ongoing effort to tag every gene with a P element insertion. However, the vast majority of P element insertion lines are viable and fertile as homozygotes and do not exhibit obvious phenotypic defects, perhaps because of the tendency for P elements to insert 5' of transcription units. Quantitative genetic analysis of subtle effects of P element mutations that have been induced in an isogenic background may be a highly efficient method for functional genome annotation. RESULTS: Here, we have tested the efficacy of this strategy by assessing the extent to which screening for quantitative effects of P elements on sensory bristle number can identify genes affecting neural development. We find that such quantitative screens uncover an unusually large number of genes that are known to function in neural development, as well as genes with yet uncharacterized effects on neural development, and novel loci. CONCLUSIONS: Our findings establish the use of quantitative trait analysis for functional genome annotation through forward genetics. Similar analyses of quantitative effects of P element insertions will facilitate our understanding of the genes affecting many other complex traits in Drosophila.

Norga, Koenraad K.; Gurganus, Marjorie C.; Dilda, Christy L.; Yamamoto, Akihiko; Lyman, Richard F.; Patel, Prajal H.; Rubin, Gerald M.; Hoskins, Roger A.; Mackay, Trudy F.; Bellen, Hugo J.



Tow Test Results of an AquaPod Fish Cage  

Microsoft Academic Search

The AquaPod™ system, developed by Ocean Farm Technologies Inc., is a submersible, spherical net pen. This cage was deployed at the University of New Hampshire's (UNH) Open Ocean Aquaculture (OOA) site for seven months for testing purposes. The cage was towed from Portsmouth Harbor to the UNH OOA site, located 15 km offshore in the Gulf of Maine, in September

J. DeCew; S. Page; C. A. Turmelle; J. Irish



AquaNodes: An Underwater Sensor Network Iuliu Vasilescu  

E-print Network

communication and support for sensing and mobility. The nodes in the system are connected acoustically ... experiments. The sensor nodes are called AquaNodes and are shown in Figure 1. These nodes package instruments ... to be able to assist in the deployment of measuring systems or to act as part of large-scale data

Farritor, Shane


A Framework for Qualitative and Quantitative Formal Model-Based Safety Analysis  

Microsoft Academic Search

In model-based safety analysis, both qualitative aspects (i.e., what must go wrong for a system failure) and quantitative aspects (i.e., how probable a system failure is) are very important. For both aspects, methods and tools are available. However, until now, new and independent models had to be built for each aspect of the analysis. This paper proposes the SAML framework as a

M. Gudemann; F. Ortmeier



Variable selection method for quantitative trait analysis based on parallel genetic algorithm  

PubMed Central

Selection of important genetic and environmental factors is of strong interest in quantitative trait analyses. In this study, we use a parallel genetic algorithm (PGA) to identify genetic and environmental factors in genetic association studies of complex human diseases. Our method can take account of both multiple markers across the genome and environmental factors, and can also be used to perform fine mapping, based on the results of haplotype analysis, to select the markers that are associated with the quantitative traits. Using both simulated and real examples, we show that PGA is able to choose the variables correctly and is also an easy-to-use variable selection tool. PMID:19799600
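A toy, serial sketch of GA-based variable selection in the spirit of the abstract above. The paper's PGA is parallel and works on real genotype and environmental data; everything here, including the correlation-based fitness with a size penalty, is an illustrative stand-in.

```python
# Toy genetic algorithm for variable selection: individuals are bit-masks
# over candidate variables; fitness rewards masks that explain the trait.
import random

def fitness(mask, X, y):
    """Sum of per-variable squared correlations with the trait, penalized
    for model size (an illustrative stand-in for the paper's criterion)."""
    n = len(y)
    my = sum(y) / n
    syy = sum((b - my) ** 2 for b in y)
    score = 0.0
    for j, bit in enumerate(mask):
        if not bit:
            continue
        xs = [row[j] for row in X]
        mx = sum(xs) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(xs, y))
        sxx = sum((a - mx) ** 2 for a in xs)
        if sxx > 0 and syy > 0:
            score += sxy * sxy / (sxx * syy)
    return score - 0.05 * sum(mask)

def evolve(X, y, n_vars, pop=30, gens=40):
    """Elitist GA: keep the fitter half, refill with one-point crossover
    plus a single-bit mutation."""
    population = [[random.randint(0, 1) for _ in range(n_vars)]
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda m: -fitness(m, X, y))
        parents = population[:pop // 2]
        children = []
        while len(parents) + len(children) < pop:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_vars)
            child = a[:cut] + b[cut:]
            child[random.randrange(n_vars)] ^= 1  # point mutation
            children.append(child)
        population = parents + children
    return max(population, key=lambda m: fitness(m, X, y))

random.seed(7)
# Simulated data: the trait depends only on variables 0 and 2.
X = [[random.gauss(0, 1) for _ in range(6)] for _ in range(80)]
y = [row[0] + 2.0 * row[2] + random.gauss(0, 0.1) for row in X]
best = evolve(X, y, n_vars=6)
print(best)
```

On this simulated trait the returned mask should include variables 0 and 2, mirroring the "choose the variables correctly" claim.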

Mukhopadhyay, Siuli; George, Varghese; Xu, Hongyan



A critical appraisal of techniques, software packages, and standards for quantitative proteomic analysis.  


New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool ( ) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

Gonzalez-Galarza, Faviel F; Lawless, Craig; Hubbard, Simon J; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R



[Quantitative analysis model of multi-component complex oil spill source based on near infrared spectroscopy].  


Near-infrared spectroscopy was used for quantitative analysis of a simulated complex oil-spill source. Three light petroleum products, i.e., gasoline, diesel fuel, and kerosene, were selected and mixed in different concentration proportions as simulated oil-spill samples, and their near-infrared spectra in the range of 8 000-12 000 cm(-1) were collected with a Fourier-transform near-infrared spectrometer. After processing the NIR spectra with different pretreatment methods, the partial least squares method was used to establish a quantitative analysis model for the mixed oil-spill samples. For gasoline, diesel fuel, and kerosene, the second-derivative method is the optimal pretreatment, and for these three oil components, in the ranges of 8 501.3-7 999.8 and 6 102.1-4 597.8 cm(-1); 6 549.5-4 597.8 cm(-1); and 7 999.8-7 498.4 and 6 102.1-4 597.8 cm(-1), the correlation coefficients R2 of the prediction models are 0.998 2, 0.990 2 and 0.993 6, respectively, while the RMSEP values are 0.474 7, 0.936 1 and 1.013 1, respectively. The experimental results show that near-infrared spectroscopy can quantitatively determine the content of each component in the simulated mixed oil-spill samples, and thus this method can provide an effective means for the quantitative detection and analysis of complex marine oil-spill sources. PMID:23427535

Tan, Ai-Ling; Bi, Wei-Hong



Quantitative analysis of estimated scattering coefficient and phase retardation for ovarian tissue characterization.  


In this report, optical scattering coefficient and phase retardation quantitatively estimated from polarization-sensitive OCT (PSOCT) were used for ovarian tissue characterization. A total of 33 ex vivo ovaries (normal: n = 26, malignant: n = 7) obtained from 18 patients were investigated. A specificity of 100% and a sensitivity of 86% were achieved by using estimated scattering coefficient alone; and a specificity of 100% and a sensitivity of 43% were obtained by using phase retardation alone. However, a superior specificity of 100% and sensitivity of 100% were achieved if these two parameters were used together for classifying normal and malignant ovaries. Quantitative measurement of collagen content obtained from Sirius red histology sections shows that it correlates with estimated scattering coefficient and phase retardation. Our initial results demonstrate that quantitative analysis of PSOCT could be a potentially valuable method for distinguishing normal from malignant ovarian tissues during minimally invasive surgery and help guide surgical intervention. PMID:22808427
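The gain from combining the two PSOCT parameters can be illustrated with a simple either-threshold rule. The cutoffs and sample values below are hypothetical; only the sensitivity/specificity arithmetic is the point, and the abstract does not state that this exact rule was used.

```python
# Minimal sketch of a two-parameter rule: call an ovary malignant when
# EITHER the estimated scattering coefficient or the phase retardation
# exceeds its cutoff. All numbers are hypothetical.

def classify(mu_s, retardation, mu_s_cut=2.0, ret_cut=0.5):
    """Return True (malignant) if either parameter crosses its cutoff."""
    return mu_s >= mu_s_cut or retardation >= ret_cut

# (scattering coefficient, phase retardation, truly malignant?)
samples = [
    (1.2, 0.20, False), (1.5, 0.30, False), (1.8, 0.25, False),
    (2.5, 0.40, True),  (1.9, 0.70, True),  (3.0, 0.80, True),
]

tp = sum(classify(m, r) and y for m, r, y in samples)
tn = sum(not classify(m, r) and not y for m, r, y in samples)
sensitivity = tp / sum(y for *_, y in samples)
specificity = tn / sum(not y for *_, y in samples)
print(sensitivity, specificity)  # 1.0 1.0 on this toy data
```

Note how the second malignant sample is caught only by the retardation cutoff, mirroring the abstract's finding that neither parameter alone reaches 100% sensitivity.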

Yang, Yi; Wang, Tianheng; Wang, Xiaohong; Sanders, Melinda; Brewer, Molly; Zhu, Quing



[Quantitative analysis of two-component polymer blends (PEG/PE) by infrared spectroscopy].  


The concentration of the components of PEG/PE blends was analyzed quantitatively by infrared spectroscopy. The absorption peak-area ratio of the selected mixture peaks, used as the calibrating basis for the quantitative analysis, was more reasonable than the peak-area ratio of the pure peaks. A theoretical equation derived from the Beer-Lambert law was used to establish the working curve for calculating the composition of the corresponding functional groups in films of the PEG/PE blends. The characteristic peaks of the crystal cannot be selected as the calibrating basis for the quantitative measurement, because crystallization has a great effect on the intensity of the absorption peaks. PMID:16827348
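The working-curve idea can be made concrete: by the Beer-Lambert law, the ratio of the two components' peak areas is proportional to the ratio of their concentrations, so a single proportionality constant fitted to calibration films lets unknowns be read off. The calibration values below are hypothetical, not the paper's data.

```python
# Sketch of a Beer-Lambert working curve from peak-area ratios.
# Known PEG/PE weight ratios of calibration films and the measured
# area ratio A_PEG / A_PE of the selected mixture peaks (hypothetical):
known_ratio = [0.25, 0.50, 1.00, 2.00]
area_ratio = [0.30, 0.61, 1.18, 2.41]

# Beer-Lambert: area_ratio = k * weight_ratio; estimate k by least
# squares through the origin.
k = sum(a * w for a, w in zip(area_ratio, known_ratio)) / \
    sum(w * w for w in known_ratio)

def peg_pe_ratio(measured_area_ratio):
    """Estimate the PEG/PE weight ratio of an unknown film."""
    return measured_area_ratio / k

print(round(peg_pe_ratio(1.20), 2))  # ~1.0 with these numbers
```

Fitting through the origin encodes the physical constraint that zero PEG gives zero PEG absorbance.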

Wu, Hong; Lin, Zhi-yong; Qian, Hao



Direct evidence for the role of light-mediated gas vesicle collapse in the buoyancy regulation of Anabaena flos-aquae (cyanobacteria)  

Microsoft Academic Search

Quantitative measurements were made of the changes in gas vacuole volume and the major components of cell mass (protein and carbohydrate) in cultures of Anabaena flos-aquae which lost buoyancy as they were shifted from low to high light intensity. Assuming densities of 1,300 kg m(-3) for protein and 1,600 kg m(-3) for carbohydrate, we calculated the change in ballast brought about by changes
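The ballast calculation reduces to simple arithmetic: each kilogram of a cell component denser than water contributes its mass minus the mass of water it displaces. The per-cell mass increases below are hypothetical; the densities are the ones assumed in the abstract, and the water density is a standard value.

```python
# Ballast arithmetic for buoyancy regulation (illustrative values).
def excess_mass(m, rho, rho_water=998.0):
    """Ballast contributed by mass m (kg) of a component of density rho
    (kg m^-3): the component's mass minus the water it displaces."""
    return m * (1.0 - rho_water / rho)

# Hypothetical per-cell mass increases after a shift to high light:
d_carb = 2.0e-13     # kg of new carbohydrate (density 1,600 kg m^-3)
d_protein = 0.5e-13  # kg of new protein (density 1,300 kg m^-3)

ballast = excess_mass(d_carb, 1600.0) + excess_mass(d_protein, 1300.0)
print(ballast)
```

Because carbohydrate is the denser component, each unit of carbohydrate mass adds more ballast than the same mass of protein, which is why carbohydrate accumulation is effective at sinking the cells.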




Quantitative trait locus analysis of susceptibility to diet-induced atherosclerosis in recombinant inbred mice  

SciTech Connect

Quantitative trait locus (QTL) analysis is a statistical method that can be applied to identify loci making a significant impact on a phenotype. For the phenotype of susceptibility to diet-induced atherosclerosis in the mouse, we have studied four quantitative traits: area of aortic fatty streaks and serum concentrations of high-density lipoprotein-bound cholesterol (HDL-cholesterol), apolipoprotein A-I, and apolipoprotein A-II (apo A-II). QTL analysis revealed a significant locus on distal chromosome 1 impacting serum apo A-II concentration on a high-fat diet and serum HDL-cholesterol concentration on a chow diet. This locus is presumably Apoa-2, the structural gene for apo A-II. QTL analysis of aortic fatty streaks failed to reveal a significant locus. 19 refs., 3 tabs.

Hyman, R.W. [Stanford Univ. School of Medicine, CA (United States)]; Frank, S.; Warden, C.H. [Univ. of California, Los Angeles, CA (United States)] [and others]



Aqua-planet simulations of the formation of the South Atlantic convergence zone  

NASA Technical Reports Server (NTRS)

The impact of Amazon Basin convection and cold fronts on the formation and maintenance of the South Atlantic convergence zone (SACZ) is studied using aqua-planet simulations with a general circulation model. In the model, a circular patch of warm sea-surface temperature (SST) is used to mimic the effect of the Amazon Basin on South American monsoon convection. The aqua-planet simulations were designed to study the effect of the strength and latitude of Amazon Basin convection on the formation of the SACZ. The simulations indicate that the strength of the SACZ increases as the Amazon convection intensifies and is moved away from the equator. Of the two controls studied here, the latitude of the Amazon convection exerts the strongest effect on the strength of the SACZ. An analysis of the synoptic-scale variability in the simulations shows the importance of frontal systems in the formation of the aqua-planet SACZ. Composite time series of frontal systems that occurred in the simulations show that a robust SACZ occurs when fronts penetrate into the subtropics and become stationary there as they cross eastward of the longitude of the Amazon Basin. Moisture convergence associated with these frontal systems produces rainfall not only along the model SACZ region but also along a large portion of the northern model Amazon Basin. Simulations in which the warm SST patch was too weak or too close to the equator did not produce frontal systems that extended into the tropics and became stationary, and did not form a SACZ. In the model, the SACZ forms as Amazon Basin convection strengthens and migrates far enough southward to allow frontal systems to penetrate into the tropics and stall over South America. This result is in agreement with observations that the SACZ tends to form after the onset of the monsoon season in the Amazon Basin.

Nieto Ferreira, Rosana; Chao, Winston C.



Effects of heavy-metal stress on cyanobacterium Anabaena flos-aquae.  


The influence of two metals, copper and cadmium, was studied on the growth and ultrastructures of cyanobacterium Anabaena flos-aquae grown at three different temperatures: 10 degrees C, 20 degrees C, and 30 degrees C. The highest concentration of chlorophyll a was observed at 20 degrees C and the lowest at 10 degrees C. Both toxic metal ions, Cu(2+) and Cd(2+), inhibited growth of the tested cyanobacterium. Chlorophyll a concentration decreased with the increase of metal concentration. A 50% decrease in the growth of the A. flos-aquae population, compared with the control, was reached at 0.61 mg l(-1) cadmium and at 0.35 mg l(-1) copper (at 20 degrees C). Copper at all temperatures tested was proven to be more toxic than cadmium. At 3 mg l(-1), the lysis and distortion of cells was observed; however, after incubation at 9 mg l(-1) cadmium, most of the cells were still intact, and only intrathylakoidal spaces started to appear. Copper caused considerably greater changes in the protein system of A. flos-aquae than did cadmium; in this case, not only phycobilins but also total proteins were destroyed. The aim of this study was also to identify the place of metal accumulation and sorption in the tested cyanobacterium. Analysis of the energy-dispersion spectra of the characteristic x-ray radiation of trichomes and their sheaths showed that cadmium was completely accumulated in cells but was not found in the sheath. The spectrum of the isolated sheath after treatment with copper exhibited only traces of the metal, but isolated cells without a sheath showed a high peak of copper. PMID:15657804

Surosz, W; Palinska, K A



Errors in Quantitative Image Analysis due to Platform-Dependent Image Scaling  

PubMed Central

PURPOSE: To evaluate the ability of various software (SW) tools used for quantitative image analysis to properly account for source-specific image scaling employed by magnetic resonance imaging manufacturers. METHODS: A series of gadoteridol-doped distilled water solutions (0%, 0.5%, 1%, and 2% volume concentrations) was prepared for manual substitution into one (of three) phantom compartments to create “variable signal,” whereas the other two compartments (containing mineral oil and 0.25% gadoteridol) were held unchanged. Pseudodynamic images were acquired over multiple series using four scanners such that the histogram of pixel intensities varied enough to provoke variable image scaling from series to series. Additional diffusion-weighted images were acquired of an ice-water phantom to generate scanner-specific apparent diffusion coefficient (ADC) maps. The resulting pseudodynamic images and ADC maps were analyzed by eight centers of the Quantitative Imaging Network using 16 different SW tools to measure compartment-specific region-of-interest intensity. RESULTS: Images generated by one of the scanners appeared to have additional intensity scaling that was not accounted for by the majority of tested quantitative image analysis SW tools. Incorrect image scaling leads to intensity measurement bias near 100%, compared to nonscaled images. CONCLUSION: Corrective actions for image scaling are suggested for manufacturers and the quantitative imaging community. PMID:24772209
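The scaling failure described here is, at bottom, a missed linear transform of stored pixel values. A minimal sketch (the slope/intercept values and ROI data are hypothetical, not drawn from the study) of applying a DICOM-style rescale before computing ROI statistics:

```python
def apply_rescale(stored, slope, intercept):
    # Recover the intended intensity from a stored pixel value using a
    # linear scaling: intensity = stored * slope + intercept.
    return [s * slope + intercept for s in stored]

def roi_mean(values):
    # Mean intensity over a region of interest.
    return sum(values) / len(values)

# Hypothetical example: the same physical signal stored under two scalings.
stored_a = [100, 110, 120]   # series stored with slope 2.0
stored_b = [200, 220, 240]   # same signal, stored with slope 1.0
mean_a = roi_mean(apply_rescale(stored_a, 2.0, 0.0))
mean_b = roi_mean(apply_rescale(stored_b, 1.0, 0.0))
print(mean_a, mean_b)  # 220.0 220.0
```

After correction the ROI means agree; comparing raw stored values across these two series would show the near-100% bias reported above.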

Chenevert, Thomas L; Malyarenko, Dariya I; Newitt, David; Li, Xin; Jayatilake, Mohan; Tudorica, Alina; Fedorov, Andriy; Kikinis, Ron; Liu, Tiffany Ting; Muzi, Mark; Oborski, Matthew J; Laymon, Charles M; Li, Xia; Thomas, Yankeelov; Jayashree, Kalpathy-Cramer; Mountz, James M; Kinahan, Paul E; Rubin, Daniel L; Fennessy, Fiona; Huang, Wei; Hylton, Nola; Ross, Brian D



Quantitative analysis of dipyridamole-thallium images for the detection of coronary artery disease  

SciTech Connect

To determine if the detection of coronary artery disease by dipyridamole-thallium imaging is improved by quantitative versus qualitative analysis, and combining quantitative variables, 80 patients with chest pain (53 with and 27 without coronary artery disease) who underwent cardiac catheterization were studied. Segmental thallium initial uptake, linear clearance, monoexponential clearance and redistribution were measured from early, intermediate and delayed images acquired in three projections. Normal values were determined from 13 other clinically normal subjects. When five segments per view were used for quantitative analysis, sensitivity and specificity were 87 and 63%, respectively, for uptake, 77 and 67% for linear clearance, 60 and 60% for monoexponential clearance and 62 and 56% for redistribution. Of the four variables, uptake and linear clearance were the most sensitive (p less than 0.01) and specificity did not differ significantly. Using three segments per view, the specificity of uptake increased to 78% without a significant change in sensitivity (85%). With this approach, sensitivity and specificity did not differ from those of qualitative analysis (85 and 78%, respectively). Stepwise logistic regression analysis demonstrated that the best quantitative thallium correlate of the presence of coronary artery disease was a combination variable of ''either abnormal uptake or abnormal linear clearance, or both.'' Using five segments per view, the model's specificity (85%) was greater than that of uptake alone, with similar sensitivity (92%). Using three segments per view, the model's specificity (93%) was greater than that of uptake alone and of qualitative analysis (p less than 0.05), with similar sensitivity (85%).
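The reported rates reduce to simple counts of true/false positives and negatives. A minimal sketch (the raw counts below are illustrative reconstructions consistent with the reported 87%/63% figures for uptake, not published values):

```python
def sensitivity(tp, fn):
    # Fraction of diseased patients correctly flagged as abnormal.
    return tp / (tp + fn)

def specificity(tn, fp):
    # Fraction of disease-free patients correctly read as normal.
    return tn / (tn + fp)

# Hypothetical counts: 46 of 53 diseased patients abnormal,
# 17 of 27 disease-free patients normal.
print(round(sensitivity(46, 7), 2))   # 0.87
print(round(specificity(17, 10), 2))  # 0.63
```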

Ruddy, T.D.; Dighero, H.R.; Newell, J.B.; Pohost, G.M.; Strauss, H.W.; Okada, R.D.; Boucher, C.A.



Quantitative Analysis of the Nanopore Translocation Dynamics of Simple Structured Polynucleotides  

PubMed Central

Nanopore translocation experiments are increasingly applied to probe the secondary structures of RNA and DNA molecules. Here, we report two vital steps toward establishing nanopore translocation as a tool for the systematic and quantitative analysis of polynucleotide folding: 1), Using α-hemolysin pores and a diverse set of different DNA hairpins, we demonstrate that backward nanopore force spectroscopy is particularly well suited for quantitative analysis. In contrast to forward translocation from the vestibule side of the pore, backward translocation times do not appear to be significantly affected by pore-DNA interactions. 2), We develop and verify experimentally a versatile mesoscopic theoretical framework for the quantitative analysis of translocation experiments with structured polynucleotides. The underlying model is based on sequence-dependent free energy landscapes constructed using the known thermodynamic parameters for polynucleotide basepairing. This approach limits the adjustable parameters to a small set of sequence-independent parameters. After parameter calibration, the theoretical model predicts the translocation dynamics of new sequences. These predictions can be leveraged to generate a baseline expectation even for more complicated structures where the assumptions underlying the one-dimensional free energy landscape may no longer be satisfied. Taken together, backward translocation through α-hemolysin pores combined with mesoscopic theoretical modeling is a promising approach for label-free single-molecule analysis of DNA and RNA folding. PMID:22225801

Schink, Severin; Renner, Stephan; Alim, Karen; Arnaut, Vera; Simmel, Friedrich C.; Gerland, Ulrich



Biomonitoring and risk assessment on earth and during exploratory missions using AquaHab ®  

NASA Astrophysics Data System (ADS)

Bioregenerative closed ecological life support systems (CELSS) will be necessary in the exploration context to revitalize the atmosphere, recycle waste water and produce food for the human CELSS crew. During long-term space travel and stays far from Earth, in a hostile environment and far from any hospital or surgical capability, it will be necessary to know much more about chemical and drug contamination, including that caused by the crew themselves. Additionally, there is a strong need on Earth for more relevant standardized test systems, including aquatic ones, for the prospective risk assessment of chemicals and drugs on a laboratory scale. Current standardized test systems are mono-species tests, and thus do not represent system aspects and have reduced environmental relevance. The experience gained over recent years in our research group led to the development of a self-sustaining closed aquatic habitat/facility, called AquaHab ®, which can serve both space exploration and Earth applications. The AquaHab ® module can be the home of several fish species, snails, plants, amphipods and bacteria. The possibility of using different effect endpoints with beneficial characteristics is the basis for the application of AquaHab ® in different fields. The influence of drugs and chemicals can be tested on several trophic and ecosystem levels, guaranteeing high relevance for aquatic systems in the real environment. Analyses of effect parameters of different complexity (e.g. general biological and water-chemical parameters, activity of biotransforming enzymes) yield broad spectra of sensitivity. Combined with residue analyses (including all metabolites), this leads to an extended prospective risk assessment of a chemical on Earth and in a closed life support system. The possibility of also measuring sensitive "online" parameters (e.g. behavior, respiration/photosynthetic activity) enables quick and sensitive effect analysis of water contaminants in the respective environments. AquaHab ® is currently being developed into an early-warning biomonitoring system using genetically modified fish and green algae. The additional implementation of biosensors/biochips is also discussed.

Slenzka, K.; Dünne, M.; Jastorff, B.



Quantitative analysis of saltwater-freshwater relationships in groundwater systems-A historical perspective  

USGS Publications Warehouse

Although much progress has been made toward the mathematical description of saltwater-freshwater relationships in groundwater systems since the late 19th century, the advective and dispersive mechanisms involved are still incompletely understood. This article documents the major historical advances in this subject and summarizes the major direction of current studies. From the time of Badon Ghyben and Herzberg, it has been recognized that density is important in mathematically describing saltwater-freshwater systems. Other mechanisms, such as hydrodynamic dispersion, were identified later and are still not fully understood. Quantitative analysis of a saltwater-freshwater system attempts to mathematically describe the physical system and the important mechanisms using reasonable simplifications and assumptions. This paper, in developing the history of quantitative analysis, discusses many of these simplifications and assumptions and their effect on describing and understanding the phenomenon. © 1985.
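The density relationship first noted by Badon Ghyben and Herzberg reduces to a hydrostatic balance: with freshwater density ρf and seawater density ρs, the freshwater-saltwater interface lies at depth z = ρf·h/(ρs − ρf) below sea level, roughly 40 times the freshwater head h for typical densities. A minimal sketch:

```python
def ghyben_herzberg_depth(head_m, rho_fresh=1000.0, rho_salt=1025.0):
    # Hydrostatic balance: depth of the freshwater-saltwater interface
    # below sea level, z = rho_f * h / (rho_s - rho_f).
    return rho_fresh * head_m / (rho_salt - rho_fresh)

# With typical densities the factor is 40: a 2 m water-table head
# implies an interface about 80 m below sea level.
print(ghyben_herzberg_depth(2.0))  # 80.0
```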

Reilly, T.E.; Goodman, A.S.



Quantitative analysis of surface characteristics and morphology in Death Valley, California using AIRSAR data  

NASA Technical Reports Server (NTRS)

The Jet Propulsion Laboratory Airborne Synthetic Aperture Radar (JPL-AIRSAR) is used to collect full polarimetric measurements at P-, L-, and C-bands. These data are analyzed using the radar analysis and visualization environment (RAVEN). The AIRSAR data are calibrated using in-scene corner reflectors to allow for quantitative analysis of the radar backscatter. RAVEN is used to extract surface characteristics. Inversion models are used to calculate quantitative surface roughness values and fractal dimensions. These values are used to generate synthetic surface plots that represent the small-scale surface structure of areas in Death Valley. These procedures are applied to a playa, smooth salt-pan, and alluvial fan surfaces in Death Valley. Field measurements of surface roughness are used to verify the accuracy.

Kierein-Young, K. S.; Kruse, F. A.; Lefkoff, A. B.



Quantitative analysis on the urban flood mitigation effect by the extensive green roof system.  


Extensive green-roof systems are expected to have a synergetic effect in mitigating urban runoff, decreasing temperature and supplying water to a building. Mitigation of runoff through rainwater retention requires the effective design of a green-roof catchment. This study identified how to improve building runoff mitigation through quantitative analysis of an extensive green-roof system. Quantitative analysis of green-roof runoff characteristics indicated that the extensive green roof has a high water-retaining capacity in response to rainfall of less than 20 mm/h. As the rainfall intensity increased, the water-retaining capacity decreased. The catchment efficiency of an extensive green roof ranged from 0.44 to 0.52, indicating reduced runoff compared with an efficiency of 0.9 for a concrete roof. Therefore, extensive green roofs are an effective storm-water best-management practice, and the proposed parameters can be applied to an algorithm for rainwater-harvesting tank design. PMID:23892044
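The catchment-efficiency parameters reported above map directly to runoff volumes; a minimal sketch (the roof area and rainfall depth are hypothetical) comparing the green-roof range with a concrete roof:

```python
def runoff_volume(rain_mm, area_m2, efficiency):
    # Runoff (m^3) = rainfall depth (m) * catchment area * efficiency.
    return rain_mm / 1000.0 * area_m2 * efficiency

# Hypothetical 100 m^2 roof under 20 mm of rain, using the paper's
# efficiency range (0.44-0.52 for a green roof vs 0.9 for concrete).
green = runoff_volume(20, 100, 0.48)
concrete = runoff_volume(20, 100, 0.9)
print(round(green, 2), round(concrete, 2))  # 0.96 1.8
```

The difference between the two volumes is the retention that a rainwater-harvesting tank sized from these parameters would not need to capture.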

Lee, J Y; Moon, H J; Kim, T I; Kim, H W; Han, M Y



Quantitative Schlieren analysis applied to holograms of crystals grown on Spacelab 3  

NASA Technical Reports Server (NTRS)

In order to extract additional information about crystals grown in the microgravity environment of Spacelab, a quantitative schlieren analysis technique was developed for use in a Holography Ground System of the Fluid Experiment System. Utilizing the Unidex position controller, it was possible to measure deviation angles produced by refractive index gradients of 0.5 milliradians. Additionally, refractive index gradient maps for any recorded time during the crystal growth were drawn and used to create solute concentration maps for the environment around the crystal. The technique was applied to flight holograms of Cell 204 of the Fluid Experiment System that were recorded during the Spacelab 3 mission on STS 51B. A triglycine sulfate crystal was grown under isothermal conditions in the cell and the data gathered with the quantitative schlieren analysis technique is consistent with a diffusion limited growth process.

Brooks, Howard L.



DAnTE: a statistical tool for quantitative analysis of -omics data  

Microsoft Academic Search

Summary: DAnTE (Data Analysis Tool Extension) is a statistical tool designed to address challenges associated with quantitative bottom-up, shotgun proteomics data. This tool has also been dem- onstrated for microarray data and can easily be extended to other high-throughput data types. DAnTE features selected normalization methods, missing value imputation algorithms, peptide to protein rollup methods, an extensive array of plotting

Ashoka D. Polpitiya; Wei-jun Qian; Navdeep Jaitly; Vladislav A. Petyuk; Joshua N. Adkins; David G. Camp II; Gordon A. Anderson; Richard D. Smith



Quantitative 31P nuclear magnetic resonance analysis of the phospholipids of erythrocyte membranes using detergent  

Microsoft Academic Search

31P Nuclear magnetic resonance (NMR) spectra of human erythrocyte lysates dissolved in sodium cholate were acquired. The narrow resonances of phospholipids were mostly well resolved, allowing identification and accurate quantitative analysis of phospholipid classes of the erythrocyte membranes. The ether-linked phosphatidylethanolamine components of the erythrocyte membranes were identified, based on the removal of plasmalogens by acidolysis and of diacyl phospholipid

M. Hossein Nouri-Sorkhabi; Lesley C. Wright; David R. Sullivan; Philip W. Kuchel



Quantitative Analysis of the Effect of Salt Concentration on Enzymatic Catalysis  

Microsoft Academic Search

Like pH, salt concentration can have a dramatic effect on enzymatic catalysis. Here, a general equation is derived for the quantitative analysis of salt-rate profiles: kcat/KM = (kcat/KM)MAX/(1 + ([Na+]/KNa+)^n), where (kcat/KM)MAX is the physical limit of kcat/KM, KNa+ is the salt concentration at which kcat/KM = (kcat/KM)MAX/2, and -n is the slope of the linear region in a plot
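The derived equation is straightforward to evaluate numerically; a minimal sketch with illustrative parameter values (not taken from the study):

```python
def kcat_over_km(na, kmax, k_na, n):
    # kcat/KM = (kcat/KM)_MAX / (1 + ([Na+]/K_Na+)^n)
    # na: salt concentration, kmax: physical limit of kcat/KM,
    # k_na: half-saturation salt concentration, n: Hill-like exponent.
    return kmax / (1.0 + (na / k_na) ** n)

# By construction, at [Na+] = K_Na+ the activity is half its limit.
print(kcat_over_km(0.1, 1e7, 0.1, 2))  # 5000000.0
```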

Ronald T. Raines



Quantitative analysis of microstructure and its related electrical property of SOFC anode, Ni–YSZ cermet  

Microsoft Academic Search

The microstructural and electrical properties of Ni–YSZ composite anode of solid oxide fuel cells (SOFC) were investigated. We measured the electrical conductivity via 4-probe DC technique as a function of Ni content (10–70 vol.%) in order to examine the correlation with the microstructure of Ni–YSZ cermet. Image analysis based on quantitative microscopic theory was performed to quantify the microstructure of

J.-H Lee; H Moon; H.-W Lee; J Kim; J.-D Kim; K.-H Yoon



A unified framework for clustering and quantitative analysis of white matter fiber tracts  

Microsoft Academic Search

We present a novel approach for joint clustering and point-by-point mapping of white matter fiber pathways. Knowledge of the point correspondence along the fiber pathways is not only necessary for accurate clustering of the trajectories into fiber bundles, but also crucial for any tract-oriented quantitative analysis. We employ an expectation-maximization (EM) algorithm to cluster the trajectories in a gamma

Mahnaz Maddah; W. Eric L. Grimson; Simon K. Warfield; William M. Wells III



Quantitative analysis of specific target DNA oligomers using a DNA-immobilized packed-column system  

Microsoft Academic Search

Although a DNA-immobilized packed-column (DNA-packed column), which relies on sequence-dependent interactions of target DNA or mRNA (in the mobile phase) with DNA probes (on the silica particle) in a continuous flow process, could be considered as an alternative platform for quantitative analysis of specific DNA to DNA chip methodology, the performance in practice has not been satisfactory. In this study,

Seung Pil Pack; Tae-Hwe Heo; Kamakshaiah Charyulu Devarayapalli; Keisuke Makino


Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results  

NASA Astrophysics Data System (ADS)

Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, the rib movement is assessed in fluoroscopy. However, the shadows of lung vessels and bronchi overlapping ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). Bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: Limited rib movements were indicated as a reduction of rib movement and left-right asymmetric distribution on vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.
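A local velocity vector of the kind mapped above is a frame-to-frame displacement scaled by pixel spacing and frame interval; a minimal sketch with hypothetical values (the displacement, pixel size and frame rate below are illustrative, not from the study):

```python
def velocity_vector(dx_px, dy_px, pixel_mm, frame_interval_s):
    # Convert a measured displacement (pixels per frame) into mm/s
    # using the detector pixel spacing and acquisition frame interval.
    return (dx_px * pixel_mm / frame_interval_s,
            dy_px * pixel_mm / frame_interval_s)

# Hypothetical: a rib edge moves 3 px downward between frames,
# with 0.4 mm pixels at 10 frames/s (0.1 s interval).
vx, vy = velocity_vector(0, 3, 0.4, 0.1)
print(vx, round(vy, 2))  # 0.0 12.0
```

Mapping such vectors over local areas of the bone image, as described above, reveals left-right asymmetries and reduced magnitudes in impaired cases.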

Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.



Powerful regression-based quantitative-trait linkage analysis of general pedigrees  

Microsoft Academic Search

We present a new method of quantitative-trait linkage analysis that combines the simplicity and robustness of regression-based methods and the generality and greater power of variance-components models. The new method is based on a regression of estimated identity-by-descent (IBD) sharing between relative pairs on the squared sums and squared differences of trait values of the relative pairs. The method is

Pak C. Sham; Shaun Purcell; Stacey S. Cherny; Gonçalo R. Abecasis



Quantitative Transcript Analysis in Plants: Improved First-strand cDNA Synthesis  

Microsoft Academic Search

The quantity and quality of first-strand cDNA directly influence the accuracy of transcriptional analysis and quantification. Using a plant-derived ?-tubulin as a model system, the effect of oligo sequence and DTT on the quality and quantity of first-strand cDNA synthesis was assessed via a combination of semi-quantitative PCR and real-time PCR. The results indicated that anchored oligo dT significantly improved

Nai-Zhong XIAO; Lei BA; Preben Bach HOLM; Xing-Zhi WANG; Steve BOWRA



Attenuated Total Internal Reflectance Infrared Spectroscopy (ATR-FTIR): A Quantitative Approach for Kidney Stone Analysis  

PubMed Central

The impact of kidney stone disease is significant worldwide, yet methods for quantifying stone components remain limited. A new approach requiring minimal sample preparation for the quantitative analysis of kidney stone components has been investigated utilizing attenuated total internal reflectance infrared spectroscopy (ATR-FTIR). Calcium oxalate monohydrate (COM) and hydroxylapatite (HAP), two of the most common constituents of urinary stones, were used for quantitative analysis. Calibration curves were constructed using integrated band intensities of four infrared absorptions versus concentration (weight %). The correlation coefficients of the calibration curves range from 0.997 to 0.93. The limits of detection range from 0.07 ± 0.02% COM/HAP where COM is the analyte and HAP the matrix to 0.26 ± 0.07% HAP/COM where HAP is the analyte and COM the matrix. This study shows that linear calibration curves can be generated for the quantitative analysis of stone mixtures provided the system is well understood especially with respect to particle size. PMID:19589213
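Calibration curves and detection limits of this kind are ordinary least-squares constructions; a minimal sketch using hypothetical band intensities (not the study's data) and the common 3-sigma detection-limit criterion:

```python
def linear_fit(x, y):
    # Ordinary least squares for y = a*x + b (the calibration curve).
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

def detection_limit(sigma_blank, slope):
    # 3-sigma criterion: LOD = 3 * s_blank / slope.
    return 3.0 * sigma_blank / slope

# Hypothetical integrated band intensities vs weight-% analyte:
conc = [0.0, 1.0, 2.0, 5.0]
intensity = [0.00, 0.21, 0.40, 1.01]
slope, intercept = linear_fit(conc, intensity)
print(round(slope, 3))
print(round(detection_limit(0.005, slope), 3))
```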

Gulley-Stahl, Heather J.; Haas, Jennifer A.; Schmidt, Katherine A.; Evan, Andrew P.; Sommer, André J.



Quantitative chemical analysis of ocular melanosomes in stained and non-stained tissues.  


Energy-filtered Analytical Electron Microscopy (AEM) was used to image the ultrastructure and determine quantitatively the chemical composition of rat melanosomes of the choroid and the Retinal Pigment Epithelium (RPE). For the first time, the effect of staining in elemental analysis of melanosomes was investigated. Detection limits and accuracies of the applied methods were determined. Compared to previous work applying only quantitative Energy Dispersive X-ray microanalysis (EDX) in the TEM (Eibl, O., et al., 2006. Micron 37, 262), here we present a combined quantitative EDX and Electron Energy Loss Spectroscopy (EELS) analysis, including N. This yields the fraction of eumelanin and pheomelanin in melanosomes by the S/N mole fraction ratio. Melanosomes of the sepia ink sac, used as eumelanin standard, showed an S/N mole fraction ratio of <0.004. Thus, they consist primarily of eumelanin as reported by degradation analysis. In contrast, melanosomes of the rats contained mixed melanin with significant amounts of pheomelanin (S/N 0.02) in the RPE and the choroid. Consistent with the previous publication, it was shown that oxygen mole fractions are especially large in melanosomes (7-10 at.%) compared to other cell compartments, e.g. 2-4 at.% oxygen in the cytoplasm. In the melanosomes of non-stained tissue, the oxygen mole fraction clearly correlated with the Ca mole fraction. EDX spectra used for quantitative analysis had about 15,000 net counts under the oxygen peak, which is necessary to obtain (i) a small statistical error for oxygen and (ii) optimum minimum detectable mole fractions for S, Ca and transition metals. The precise determination of the oxygen mole fraction in melanosomes is important for understanding metabolism. Therefore, a detailed analysis was carried out on the possible errors affecting quantification. 
While O, S, and N mole fractions yielded similar results in stained and non-stained ocular melanosomes of rats, transition metals can only be determined reliably in non-stained tissues. High-precision EDX analysis of melanosomes yielded minimum detectable mole fractions of less than 0.04 at.% for Cu and Zn; these elements were present in melanosomes with mole fractions of about 0.3 at.% and 0.1 at.%, respectively. Zn is of great importance for metabolism and for age-related macular degeneration. Its mole fraction in melanosomes of rats is large enough to be detected and to be quantitatively analyzed by EDX spectroscopy. Ultrastructural information can now be correlated to the elemental composition. This is important to better understand the physical and chemical properties of melanosomal metabolism and turnover. PMID:21330141

Biesemeier, Antje; Schraermeyer, Ulrich; Eibl, Oliver



Low-dose CT for quantitative analysis in acute respiratory distress syndrome  

PubMed Central

Introduction The clinical use of serial quantitative computed tomography (CT) to characterize lung disease and guide the optimization of mechanical ventilation in patients with acute respiratory distress syndrome (ARDS) is limited by the risk of cumulative radiation exposure and by the difficulties and risks related to transferring patients to the CT room. We evaluated the effects of tube current-time product (mAs) variations on quantitative results in healthy lungs and in experimental ARDS in order to support the use of low-dose CT for quantitative analysis. Methods In 14 sheep chest CT was performed at baseline and after the induction of ARDS via intravenous oleic acid injection. For each CT session, two consecutive scans were obtained applying two different mAs: 60 mAs was paired with 140, 15 or 7.5 mAs. All other CT parameters were kept unaltered (tube voltage 120 kVp, collimation 32 × 0.5 mm, pitch 0.85, matrix 512 × 512, pixel size 0.625 × 0.625 mm). Quantitative results obtained at different mAs were compared via Bland-Altman analysis. Results Good agreement was observed between 60 mAs and 140 mAs and between 60 mAs and 15 mAs (all biases less than 1%). A further reduction of mAs to 7.5 mAs caused an increase in the bias of poorly aerated and nonaerated tissue (-2.9% and 2.4%, respectively) and determined a significant widening of the limits of agreement for the same compartments (-10.5% to 4.8% for poorly aerated tissue and -5.9% to 10.8% for nonaerated tissue). Estimated mean effective dose at 140, 60, 15 and 7.5 mAs corresponded to 17.8, 7.4, 2.0 and 0.9 mSv, respectively. Image noise of scans performed at 140, 60, 15 and 7.5 mAs corresponded to 10, 16, 38 and 74 Hounsfield units, respectively. Conclusions A reduction of effective dose up to 70% has been achieved with minimal effects on lung quantitative results. 
Low-dose computed tomography provides accurate quantitative results and could be used to characterize lung compartment distribution and possibly monitor time-course of ARDS with a lower risk of exposure to ionizing radiation. A further radiation dose reduction is associated with lower accuracy in quantitative results. PMID:24004842
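The Bland-Altman comparison used above reduces to the mean paired difference (bias) and its 1.96-SD limits of agreement; a minimal sketch with hypothetical paired measurements (not the study's data):

```python
from statistics import mean, stdev

def bland_altman(a, b):
    # Bias = mean paired difference; limits of agreement = bias +/- 1.96 SD.
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    half_width = 1.96 * stdev(diffs)
    return bias, bias - half_width, bias + half_width

# Hypothetical paired tissue fractions (%) at low vs reference mAs:
low = [10.0, 12.0, 11.0, 13.0]
ref = [10.5, 11.5, 11.5, 12.5]
bias, lo, hi = bland_altman(low, ref)
print(round(bias, 2))  # 0.0
print(round(lo, 2), round(hi, 2))
```

A near-zero bias with narrow limits, as reported here for 60 vs 140 and 15 mAs, indicates the two dose settings can be used interchangeably for quantitative analysis.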



Optimization of homonuclear 2D NMR for fast quantitative analysis: application to tropine-nortropine mixtures.  


Quantitative analysis by (1)H NMR is often hampered by heavily overlapping signals that may occur for complex mixtures, especially those containing similar compounds. Bidimensional homonuclear NMR spectroscopy can overcome this difficulty. A thorough review of acquisition and post-processing parameters was carried out to obtain accurate and precise, quantitative 2D J-resolved and DQF-COSY spectra in a much reduced time, thus limiting the impact of spectrometer instabilities over time. The number of t(1) increments was reduced as much as possible, and standard deviation was improved by optimization of spectral width, number of transients, phase cycling and apodization function. Localized polynomial baseline corrections were applied to the relevant chemical shift areas. Our method was applied to tropine-nortropine mixtures. Quantitative J-resolved spectra were obtained in less than 3 min and quantitative DQF-COSY spectra in 12 min, with an accuracy of 3% for J-spectroscopy and 2% for DQF-COSY, and a standard deviation smaller than 1%. PMID:17118605

Giraudeau, Patrick; Guignard, Nadia; Hillion, Emilie; Baguet, Evelyne; Akoka, Serge



Quantitative analysis of powder mixtures by Raman spectrometry: the influence of particle size and its correction.  


Particle size distribution and compactness have significant confounding effects on Raman signals of powder mixtures, which cannot be effectively modeled or corrected by traditional multivariate linear calibration methods such as partial least-squares (PLS), and therefore greatly deteriorate the predictive abilities of Raman calibration models for powder mixtures. The ability to obtain directly quantitative information from Raman signals of powder mixtures with varying particle size distribution and compactness is, therefore, of considerable interest. In this study, an advanced quantitative Raman calibration model was developed to explicitly account for the confounding effects of particle size distribution and compactness on Raman signals of powder mixtures. Under the theoretical guidance of the proposed Raman calibration model, an advanced dual calibration strategy was adopted to separate the Raman contributions caused by the changes in mass fractions of the constituents in powder mixtures from those induced by the variations in the physical properties of samples, and hence achieve accurate quantitative determination for powder mixture samples. The proposed Raman calibration model was applied to the quantitative analysis of backscatter Raman measurements of a proof-of-concept model system of powder mixtures consisting of barium nitrate and potassium chromate. The average relative prediction error obtained by the proposed Raman calibration model was less than one-third of the corresponding value of the best performing PLS model for mass fractions of barium nitrate in powder mixtures with variations in particle size distribution, as well as compactness. PMID:22468859

Chen, Zeng-Ping; Li, Li-Mei; Jin, Jing-Wen; Nordon, Alison; Littlejohn, David; Yang, Jing; Zhang, Juan; Yu, Ru-Qin



Summary of Terra and Aqua MODIS Long-Term Performance  

NASA Technical Reports Server (NTRS)

Since launch in December 1999, the MODIS ProtoFlight Model (PFM) onboard the Terra spacecraft has successfully operated for more than 11 years. Its Flight Model (FM) onboard the Aqua spacecraft, launched in May 2002, has also successfully operated for over 9 years. MODIS observations are made in 36 spectral bands at three nadir spatial resolutions and are calibrated and characterized regularly by a set of on-board calibrators (OBC). Nearly 40 science products, supporting a variety of land, ocean, and atmospheric applications, are continuously derived from the calibrated reflectances and radiances of each MODIS instrument and widely distributed to the world-wide user community. Following an overview of MODIS instrument operation and calibration activities, this paper provides a summary of both Terra and Aqua MODIS long-term performance. Special considerations that are critical to maintaining MODIS data quality and beneficial for future missions are also discussed.

Xiong, Xiaoxiong (Jack); Wenny, Brian N.; Angal, Amit; Barnes, William; Salomonson, Vincent



Quantitative maps of genetic interactions in yeast - Comparative evaluation and integrative analysis  

PubMed Central

Background: High-throughput genetic screening approaches have enabled systematic means to study how interactions among gene mutations contribute to quantitative fitness phenotypes, with the aim of providing insights into the functional wiring diagrams of genetic interaction networks on a global scale. However, little is known about how well these quantitative interaction measurements agree across the screening approaches, which hinders their integrated use toward improving the coverage and quality of the genetic interaction maps in yeast and other organisms. Results: Using large-scale data matrices from epistatic miniarray profiling (E-MAP), genetic interaction mapping (GIM), and synthetic genetic array (SGA) approaches, we carried out here a systematic comparative evaluation among these quantitative maps of genetic interactions in yeast. The relatively low association between the original interaction measurements or their customized scores could be improved using a matrix-based modelling framework, which enables the use of single- and double-mutant fitness estimates and measurements, respectively, when scoring genetic interactions. Toward an integrative analysis, we show how the detections from the different screening approaches can be combined to suggest novel positive and negative interactions which are complementary to those obtained using any single screening approach alone. The matrix approximation procedure has been made available to support the design and analysis of future screening studies. Conclusions: We have shown here that even if the correlation between the currently available quantitative genetic interaction maps in yeast is relatively low, their comparability can be improved by means of our computational matrix approximation procedure, which will enable integrative analysis and detection of a wider spectrum of genetic interactions using data from the complementary screening approaches. PMID:21435228
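
A common scoring convention in such screens defines the interaction of a gene pair as the deviation of the measured double-mutant fitness from the multiplicative expectation formed from the single-mutant fitnesses. This is a generic sketch of that convention with hypothetical fitness values, not the paper's matrix approximation procedure.

```python
import numpy as np

# Hypothetical single-mutant fitness values (wild type = 1.0).
w_single = np.array([0.9, 0.8, 0.6])

# Hypothetical measured double-mutant fitness matrix W[i, j].
w_double = np.array([
    [np.nan, 0.72, 0.30],
    [0.72, np.nan, 0.48],
    [0.30, 0.48, np.nan],
])

# Multiplicative neutral expectation: E[i, j] = w_i * w_j.
expected = np.outer(w_single, w_single)

# Interaction score: negative = synthetic sick/lethal,
# positive = epistatic buffering, near zero = no interaction.
epsilon = w_double - expected
print(np.round(epsilon, 2))
```

Here the first pair scores near zero (neutral) and the first/third pair scores strongly negative (synthetic sick), illustrating the sign conventions used when comparing maps.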



Quantitative X-ray diffraction analysis of oxides formed on superalloys  

NASA Technical Reports Server (NTRS)

Methods were developed for quantitative analysis by X-ray diffraction of the oxides Al2O3, NiO, Cr2O3, CoO, and CoCr2O4 within a standard deviation of about 10 percent of the weight fraction reported or within 1 percent absolute. These error limits assume that the sample oxides are well characterized and that the physiochemical structure of the oxides in the samples is identical with that in the synthesized standards. Results are given for the use of one of the techniques in the analysis of spalls from a series of oxidation tests of the cobalt base alloy WI-52.
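
The abstract does not state the quantification formula used; one standard scheme for quantitative XRD of this kind is reference-intensity-ratio (RIR) normalization, in which each phase's integrated peak intensity is scaled by its RIR and the scaled values are normalized to weight fractions. The sketch below uses hypothetical intensities and RIR values purely for illustration.

```python
# RIR quantification: weight fraction of phase i is (I_i / RIR_i)
# normalised over all phases. All numbers here are placeholders,
# not values from the report.
intensities = {"Al2O3": 1200.0, "NiO": 800.0, "Cr2O3": 500.0}
rir = {"Al2O3": 1.0, "NiO": 3.5, "Cr2O3": 2.0}   # hypothetical, vs. corundum

scaled = {p: intensities[p] / rir[p] for p in intensities}
total = sum(scaled.values())
weight_fractions = {p: scaled[p] / total for p in scaled}
print(weight_fractions)
```

In practice the report's approach of synthesized standards amounts to measuring these per-phase scale factors directly, which is why the error limits depend on how well the standards match the sample oxides.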

Garlick, R. G.



Quantitative analysis of gold nanoparticles in single cells by laser ablation inductively coupled plasma-mass spectrometry.  


Single cell analysis has become an important field of research in recent years reflecting the heterogeneity of cellular responses in biological systems. Here, we demonstrate a new method, based on laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS), which can quantify in situ gold nanoparticles (Au NPs) in single cells. Dried residues of picoliter droplets ejected by a commercial inkjet printer were used to simulate matrix-matched calibration standards. The gold mass in single cells exposed to 100 nM NIST Au NPs (Reference material 8012, 30 nm) for 4 h showed a log-normal distribution, ranging from 1.7 to 72 fg Au per cell, which approximately corresponds to 9 to 370 Au NPs per cell. The average result from 70 single cells (15 ± 13 fg Au per cell) was in good agreement with the result from an aqua regia digest solution of 1.2 × 10⁶ cells (18 ± 1 fg Au per cell). The limit of quantification was 1.7 fg Au. This paper demonstrates the great potential of LA-ICPMS for single cell analysis and for studying biological responses to metal drugs or NPs at the single cell level. PMID:25225851
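
The mass-to-particle-count conversion can be approximated from the geometry of a solid gold sphere. The sketch below uses the nominal 30 nm diameter; the counts in the abstract (9 to 370 NPs for 1.7 to 72 fg) imply a slightly smaller mass per particle, presumably because the conversion used the measured rather than nominal particle size of RM 8012.

```python
import math

# Convert a measured Au mass per cell to an approximate nanoparticle count,
# assuming solid gold spheres at the nominal 30 nm diameter.
rho_au = 19.3e-21            # g per nm^3 (19.3 g/cm^3)
diameter_nm = 30.0
volume_nm3 = math.pi / 6.0 * diameter_nm ** 3
mass_per_np_fg = rho_au * volume_nm3 * 1e15   # femtograms per particle

def np_count(mass_fg):
    return mass_fg / mass_per_np_fg

print(mass_per_np_fg, np_count(1.7), np_count(72.0))
```

With the nominal diameter each particle weighs roughly 0.27 fg, giving about 6 to 260 particles over the reported mass range; a modestly smaller effective diameter reproduces the 9 to 370 figures.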

Wang, Meng; Zheng, Ling-Na; Wang, Bing; Chen, Han-Qing; Zhao, Yu-Liang; Chai, Zhi-Fang; Reid, Helen J; Sharp, Barry L; Feng, Wei-Yue




PubMed Central

This study assesses the utility of compartmental analysis of SPECT data in lateralizing ictal onset in cases of putative mesial temporal lobe epilepsy (mTLE). An institutional archival review provided 46 patients (18M, 28F) operated on for putative mTLE who achieved an Engel class Ia postoperative outcome. This established the standard to assure a true ictal origin. Ictal and interictal SPECT images were separately coregistered to the T1-weighted (T1W) magnetic resonance (MR) image using a rigid transformation, and the intensities were matched with an l1 norm minimization technique. The T1W MR image was segmented into separate structures using an atlas-based automatic segmentation technique, with the hippocampi manually segmented to improve accuracy. Mean ictal-interictal intensity difference values were calculated for select subcortical structures and the accuracy of lateralization evaluated using a linear classifier. Hippocampal SPECT analysis yielded the highest lateralization accuracy (91%) followed by the amygdala (87%), putamen (67%) and thalamus (61%). Comparative FLAIR and volumetric analyses yielded 89% and 78% accuracies, respectively. A multi-modality analysis did not generate a higher accuracy (89%). A quantitative, anatomically compartmented approach to SPECT analysis yields a particularly high lateralization accuracy in mTLE, comparable to that of quantitative FLAIR MR imaging. Hippocampal segmentation in this regard correlates well with ictal origin and shows good reliability in the preoperative analysis. PMID:21454055
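
At its simplest, lateralization from per-structure ictal-interictal difference values reduces to comparing the left and right scores. The study trained a linear classifier rather than applying this bare rule, and all values below are hypothetical; the sketch only shows the shape of the decision and the accuracy computation.

```python
# Hypothetical hippocampal ictal-interictal difference scores per patient,
# with the surgically confirmed side of onset as ground truth.
patients = [
    {"left": 12.4, "right": 3.1, "true_side": "left"},
    {"left": 2.0, "right": 9.8, "true_side": "right"},
    {"left": 7.5, "right": 7.1, "true_side": "left"},
]

# Call the side with the larger ictal increase the side of onset.
correct = sum(
    ("left" if p["left"] > p["right"] else "right") == p["true_side"]
    for p in patients
)
accuracy = correct / len(patients)
print(accuracy)
```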

Jafari-Khouzani, Kourosh; Elisevich, Kost; Karvelis, Kastytis C.; Soltanian-Zadeh, Hamid



High Throughput Quantitative Analysis of Serum Proteins using Glycopeptide Capture and Liquid Chromatography Mass Spectrometry  

SciTech Connect

It is expected that the composition of the serum proteome can provide valuable information about the state of the human body in health and disease, and that this information can be extracted via quantitative proteomic measurements. Suitable proteomic techniques need to be sensitive, reproducible and robust to detect potential biomarkers below the level of highly expressed proteins, to generate data sets that are comparable between experiments and laboratories, and have high throughput to support statistical studies. In this paper, we report a method for high throughput quantitative analysis of serum proteins. It consists of the selective isolation of peptides that are N-linked glycosylated in the intact protein, the analysis of these, now de-glycosylated, peptides by LC-ESI-MS, and the comparative analysis of the resulting patterns. By focusing selectively on a few formerly N-linked glycopeptides per serum protein, the complexity of the analyte sample is significantly reduced and the sensitivity and throughput of serum proteome analysis are increased compared with the analysis of total tryptic peptides from unfractionated samples. We provide data that document the performance of the method and show that sera from untreated normal mice and genetically identical mice with carcinogen induced skin cancer can be unambiguously discriminated using unsupervised clustering of the resulting peptide patterns. We further identify, by tandem mass spectrometry, some of the peptides that were consistently elevated in cancer mice compared to their control littermates.

Zhang, Hui; Yi, Eugene C.; Li, Xiao-jun; Mallick, Parag; Kelly-Spratt, Karen S.; Masselon, Christophe D.; Camp, David G.; Smith, Richard D.; Kemp, Christopher; Aebersold, Ruedi



High-resolution mass spectrometry for integrated qualitative and quantitative analysis of pharmaceuticals in biological matrices.  


Quantitative and qualitative high-resolution (HR) dependent and independent acquisition schemes on a QqTOF MS (with resolving power 20,000-40,000) were investigated for the analysis of pharmaceutical compounds in biological fluids. High-resolution selected reaction monitoring (HR-SRM) was found to be linear over three orders of magnitude for quantitative analysis of paracetamol in human plasma, offering a real alternative to triple quadrupole LC-SRM/MS. Metabolic stability of talinolol in microsomes was characterized by use of three different acquisition schemes: (i) information-dependent acquisition (IDA) with a TOF MS experiment as survey scan and product-ion scan as dependent scan; (ii) MS(ALL) by collecting TOF mass spectra with and without fragmentation by alternating the collision energy of the collision cell between a low (i.e., 10 eV) and high setting (i.e., 40 eV); and (iii) a novel independent acquisition mode referred to as "sequential window acquisition of all theoretical fragment-ion spectra" (SWATH) or "global precursor ions scan mode" (GPS), in which sequential precursor-ion windows (typically 20 u) are used to collect precursor and fragment ions in the same spectrum using a collision energy range. SWATH or GPS was found to be superior to IDA or MS(ALL) in combination with UHPLC for qualitative analysis but requires a rapidly acquiring mass spectrometer. Finally, the GPS concept was used for QUAL/QUAN analysis (i.e. integration of qualitative and quantitative analysis) of bosentan and its metabolites in urine over a concentration range from 5 to 2,500 ng mL⁻¹. PMID:22203371
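
The SWATH/GPS scheme tiles the precursor m/z range with fixed-width isolation windows (typically 20 u, per the abstract). The sketch below generates such a window list; the m/z range and the 1 u overlap between adjacent windows are illustrative assumptions, not parameters from the paper.

```python
# Sequential precursor-isolation windows for a SWATH/GPS-style acquisition:
# fixed-width windows tiling the precursor m/z range.
def swath_windows(mz_start, mz_end, width=20.0, overlap=1.0):
    windows = []
    low = mz_start
    while low < mz_end:
        high = min(low + width, mz_end)
        windows.append((low, high))
        low += width - overlap        # small overlap avoids edge losses
    return windows

wins = swath_windows(400.0, 1200.0)
print(len(wins), wins[0], wins[-1])
```

Each window is fragmented in turn, so every precursor in the range contributes fragment ions to some spectrum, which is what makes the mode "independent" of any survey-scan selection.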

Hopfgartner, Gérard; Tonoli, David; Varesio, Emmanuel



Quantitative scintigraphy with deconvolutional analysis for the dynamic measurement of hepatic function  

SciTech Connect

A mathematical technique known as deconvolutional analysis was used to provide a critical and previously missing element in the computations required to quantitate hepatic function scintigraphically. This computer-assisted technique allowed for the determination of the time required, in minutes, for a labeled bilirubin analog ((99m)Tc-disofenin) to enter the liver via blood and exit via bile. This interval was referred to as the mean transit time (MTT). The critical process provided for by deconvolution is the mathematical simulation of a bolus injection of tracer directly into the afferent blood supply of the liver. The raw data required for this simulation are obtained from the intravenous injection of labeled disofenin, a member of the HIDA family of radiopharmaceuticals. In this study, we perform experiments which document that the simulation process itself is accurate. We then calculate the MTT under a variety of experimental conditions involving progressive hepatic ischemia/reperfusion injury and correlate these results with the results of simultaneously performed BSP determinations and hepatic histology. The experimental group with the most pronounced histologic findings (necrosis, vacuolization, disorganization of hepatic cords) also had the most prolonged MTT and BSP half-life. However, both quantitative imaging and BSP testing are able to identify milder degrees of hepatic ischemic injury not reflected in the histologic evaluation. Quantitative imaging with deconvolutional analysis is a technique easily adaptable to the standard nuclear medicine minicomputer. It provides rapid results and appears to be a sensitive monitor of hepatic functional disturbances resulting from ischemia and reperfusion.
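
Discretely, the measured liver curve is the convolution of the input (blood) curve with the hepatic impulse response, which can be written as a lower-triangular Toeplitz system; deconvolution amounts to inverting that system, and the MTT is the first moment of the recovered impulse response. The noise-free numpy sketch below uses synthetic curves; real clinical data would require regularization before inversion.

```python
import numpy as np

dt = 0.1
t = np.arange(0, 20, dt)
n = t.size

blood = np.exp(-t)                      # synthetic afferent (input) curve
h_true = (1 / 3.0) * np.exp(-t / 3.0)   # hepatic impulse response, MTT = 3 min

# liver(t) = integral of blood(t - u) h(u) du, discretised as a
# lower-triangular Toeplitz matrix acting on h.
idx = np.subtract.outer(np.arange(n), np.arange(n))
A = dt * np.where(idx >= 0, blood[idx], 0.0)
liver = A @ h_true

# Deconvolution: simulate the bolus response by inverting the convolution.
h_est = np.linalg.solve(A, liver)

# Mean transit time = first moment of the recovered impulse response.
mtt = np.sum(t * h_est) / np.sum(h_est)
print(round(mtt, 2))
```

The recovered MTT matches the 3-minute time constant of the synthetic impulse response up to discretization error.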

Tagge, E.P.; Campbell, D.A. Jr.; Reichle, R.; Averill, D.R. Jr.; Merion, R.M.; Dafoe, D.C.; Turcotte, J.G.; Juni, J.E.



Quantitative analysis of CD34+ stem cells using RT-PCR on whole cells.  


We have employed RT-PCR of whole cells to develop a quantitative method for estimating the number of rare cells expressing a unique mRNA in a large, mixed population of cells. We have demonstrated that RT-PCR can be done on whole cells without the need for extraction of the RNA. This saves considerable time and effort, and allows quantitative analysis to be based on the total number of cells analyzed in a given aliquot and the presence or absence of the specific RT-PCR product. We have employed a limiting dilution series on whole cells, with multiple aliquots at each cell concentration, to achieve more statistical power in the analysis of a rare cell type. We have used a nested amplification of the CD34 mRNA to be able to detect a single cell expressing the CD34 mRNA in a larger population of non-CD34-expressing cells. We demonstrate that by using this technique, cells from blood and bone marrow containing the CD34 mRNA can be followed quantitatively during a multistep purification involving immunoadsorption followed by fluorescence-activated cell sorting. We also demonstrate that many cells that express the CD34 protein on their surface no longer contain detectable levels of CD34 mRNA, a phenomenon that appears to be developmentally regulated. PMID:7518719
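
A limiting-dilution readout of this kind converts to a frequency estimate via Poisson statistics: if each aliquot holds N cells and a fraction p of aliquots yields no RT-PCR product, the frequency of expressing cells is approximately -ln(p)/N. The simulation below uses hypothetical numbers to show the estimator recovering a known frequency; it is not the paper's data.

```python
import math
import random

random.seed(1)
true_freq = 1 / 500.0           # hypothetical: one CD34+ cell per 500 cells
cells_per_aliquot = 1000
n_aliquots = 2000

negatives = 0
for _ in range(n_aliquots):
    # An aliquot scores negative only if it contains zero CD34+ cells.
    positive = any(random.random() < true_freq
                   for _ in range(cells_per_aliquot))
    if not positive:
        negatives += 1

p_neg = negatives / n_aliquots
est_freq = -math.log(p_neg) / cells_per_aliquot
print(est_freq)
```

With enough aliquots the estimate converges on the true frequency, which is why multiple aliquots per dilution level add statistical power.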

Molesh, D A; Hall, J M



Quantitative trace element analysis of individual fly ash particles by means of X-ray microfluorescence.  


A new quantification procedure was developed for the evaluation of X-ray microfluorescence (XRF) data sets obtained from individual particles, based on iterative Monte Carlo (MC) simulation. Combined with the high sensitivity of synchrotron radiation-induced XRF spectroscopy, the method was used to obtain quantitative information down to trace-level concentrations from micrometer-sized particulate matter. The detailed XRF simulation model was validated by comparison of calculated and experimental XRF spectra obtained for glass microsphere standards, resulting in uncertainties in the range of 3-10% for the calculated elemental sensitivities. The simulation model was applied for the quantitative analysis of X-ray tube and synchrotron radiation-induced scanning micro-XRF spectra of individual coal and wood fly ash particles originating from different Hungarian power plants. By measuring the same particles by both methods the major, minor, and trace element compositions of the particles were determined. The uncertainty of the MC based quantitative analysis scheme is estimated to be in the range of 5-30%. PMID:11924974
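
The iterative loop behind simulation-based quantification can be caricatured as: simulate line intensities for an assumed composition, scale the composition by the measured-to-simulated intensity ratios, and repeat to convergence. In the sketch below a linear response stands in for the full Monte Carlo photon-transport simulation, so the loop converges immediately; all numbers are synthetic and the scheme is a generic illustration, not the authors' code.

```python
import numpy as np

sensitivity = np.array([2.0, 0.5, 1.2])      # counts per unit concentration
true_conc = np.array([0.1, 0.6, 0.3])
measured = sensitivity * true_conc           # pretend these were measured

conc = np.full(3, 1.0 / 3.0)                 # initial composition guess
for _ in range(20):
    simulated = sensitivity * conc           # MC simulation would go here
    conc *= measured / simulated             # correct by intensity ratios
    conc /= conc.sum()                       # renormalise to unit total

print(np.round(conc, 3))
```

With a realistic nonlinear simulator (absorption and enhancement effects), the same loop needs several iterations, which is where the quoted 5-30% uncertainty of the full analysis arises.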

Vincze, L; Somogyi, A; Osan, J; Vekemans, B; Torok, S; Janssens, K; Adams, F



Quantitative topographic anatomy of the femoral ACL footprint: a micro-CT analysis.  


The femoral footprint of the anterior cruciate ligament (ACL) is a much-studied anatomic structure, predominantly due to its importance during ACL reconstruction surgery. A new technique utilising high-resolution micro-computed tomography (micro-CT) is described, allowing detailed three-dimensional (3D) quantitative analysis of this structure. Seven cadaveric knees were scanned using micro-CT, yielding 3D data with a reconstructed voxel size of 60 μm. A novel method of 3D surface extraction was developed and validated, facilitating both qualitative observation of surface details and quantitative topographic assessment using colour-coded relief maps. Images were displayed on an immersive 3D visualisation wall, and ten experienced ACL clinicians were surveyed as to the presence and morphology of osseous landmarks, providing qualitative assessment of whether such features can be reliably identified for navigation during surgery. Both quantitative analysis and qualitative assessment of the footprints in this study showed significant variability in the presence and morphology of osseous landmarks, with the lateral intercondylar ridge being objectively present in four out of seven relief maps, although reportedly seen in six out of seven cases in the qualitative study, suggesting an element of subjectivity and interpretation. This is the first study to utilise micro-CT in the study of ACL anatomy. PMID:25256807

Norman, Daniel G; Getgood, Alan; Thornby, John; Bird, Jonathan; Turley, Glen A; Spalding, Tim; Williams, Mark A



Optimisation of Lime-Soda process parameters for reduction of hardness in aqua-hatchery practices using Taguchi methods.  


This paper presents the optimisation of Lime-Soda process parameters for the reduction of hardness in aqua-hatchery practices in the context of M. rosenbergii. Fresh water used in the development of fisheries needs to be of suitable quality, and a lack of suitable quality in the available fresh water is generally the limiting constraint. On the Indian subcontinent, groundwater is the only source of raw water; it has a varying degree of hardness and is thus unsuitable for fresh water prawn hatchery practices (M. rosenbergii). In order to make use of hard water in the context of aqua-hatchery, the Lime-Soda process has been recommended. The efficacy of the various process parameters, such as lime, soda ash and detention time, on the reduction of hardness needs to be examined. This paper proposes to determine the parameter settings for the CIFE well water, which is quite hard, by using the Taguchi experimental design method. Taguchi orthogonal arrays, the signal-to-noise ratio, and the analysis of variance (ANOVA) have been applied to determine the dosages and to analyse their effect on hardness reduction. The tests carried out with optimal levels of Lime-Soda process parameters confirmed the efficacy of the Taguchi optimisation method. Emphasis has been placed on optimisation of the chemical doses required to reduce the total hardness using the Taguchi method and ANOVA, to suit the available raw water quality for aqua-hatchery practices, especially for the fresh water prawn M. rosenbergii. PMID:24749379
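
For a response to be minimised, such as residual hardness, Taguchi analysis presumably uses the smaller-the-better signal-to-noise ratio, -10·log10 of the mean squared response, and picks the factor level with the highest S/N. The sketch below uses hypothetical replicate measurements at three lime dose levels; the dose labels and values are not from the paper.

```python
import math

# Hypothetical residual-hardness measurements (mg/L as CaCO3) for three
# replicate runs at each of three lime dose levels.
runs = {
    "lime_low": [220.0, 210.0, 230.0],
    "lime_mid": [140.0, 150.0, 145.0],
    "lime_high": [120.0, 135.0, 128.0],
}

def sn_smaller_better(values):
    # Taguchi smaller-the-better S/N ratio in decibels.
    return -10.0 * math.log10(sum(v * v for v in values) / len(values))

sn = {level: sn_smaller_better(v) for level, v in runs.items()}
best = max(sn, key=sn.get)       # highest S/N marks the preferred level
print(sn, best)
```

In a full Taguchi study the same computation is repeated per factor column of the orthogonal array, and ANOVA then apportions the variance among lime, soda ash and detention time.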

Yavalkar, S P; Bhole, A G; Babu, P V Vijay; Prakash, Chandra



Growth mixture modeling as an exploratory analysis tool in longitudinal quantitative trait loci analysis  

Microsoft Academic Search

We examined the properties of growth mixture modeling in finding longitudinal quantitative trait loci in a genome-wide association study. Two software packages are commonly used in these analyses: Mplus and the SAS TRAJ procedure. We analyzed the 200 replicates of the simulated data with these programs using three tests: the likelihood-ratio test statistic, a direct test of genetic model coefficients,

Su-Wei Chang; Hoan Choi; Ke Li; Rose Saint Fleur; Chengrui Huang; Tong Shen; Kwangmi Ahn; Derek Gordon; Wonkuk Kim; Rongling Wu; Stephen J Finch



Review of Department of Defense Education Activity (DoDEA) Schools. Volume II: Quantitative Analysis of Educational Quality. IDA Paper.  

ERIC Educational Resources Information Center

This volume compiles, and presents in integrated form, the Institute for Defense Analyses' (IDA) quantitative analysis of educational quality provided by the Department of Defense's dependent schools. It covers the quantitative aspects of volume 1 in greater detail and presents some analyses deemed too technical for that volume. The first task in…

Anderson, Lowell Bruce; Bracken, Jerome; Bracken, Marilyn C.


Analysis on the Go: Quantitation of Drugs of Abuse in Dried Urine with Digital Microfluidics and Miniature Mass Spectrometry  

E-print Network

The development of a method coupling digital microfluidics and a miniature mass spectrometer is described, applied to the quantitation of drugs of abuse in urine. A custom digital microfluidic system was designed to deliver droplets

Zandstra, Peter W.


Quantitative analysis of electrophoresis data: novel (Nucleic Acids Research, 1997, Vol. 25, No. 4, pp. 850-860, Oxford University Press)

E-print Network

The software provides an interface and includes a number of features which offer significant advantages over existing methods for quantitative gel analysis. The method uses curve fitting with a non-linear least-squares optimization

Tullius, Thomas D.


Plasticity of the corticospinal tract in early blindness revealed by quantitative analysis of fractional anisotropy based on diffusion  

E-print Network

and early childhood. The structural and functional adaptations in cortical areas of the early blind have been studied. The aim of this study is to investigate the plasticity of the corticospinal tract (CST) in early blindness by tract-based quantitative analysis of fractional anisotropy.



Quantitative Analysis of Snake Venoms Using Soluble Polymer-based Isotope Labeling*S?  

PubMed Central

We present the design and synthesis of a new quantitative strategy termed soluble polymer-based isotope labeling (SoPIL) and its application as a novel and inclusive method for the identification and relative quantification of individual proteins in complex snake venoms. The SoPIL reagent selectively captures and isolates cysteine-containing peptides, and the subsequent tagged peptides are released and analyzed using nanoflow liquid chromatography-tandem mass spectrometry. The SoPIL strategy was used to quantify venom proteins from two pairs of venomous snakes: Crotalus scutulatus scutulatus type A, C. scutulatus scutulatus type B, Crotalus oreganus helleri, and Bothrops colombiensis. The hemorrhagic, hemolytic, clotting ability, and fibrinogenolytic activities of crude venoms were measured and correlated with difference in protein abundance determined by the SoPIL analysis. The SoPIL approach could provide an efficient and widely applicable tool for quantitative proteomics. PMID:18089550

Galan, Jacob A.; Guo, Minjie; Sanchez, Elda E.; Cantu, Esteban; Rodriguez-Acosta, Alexis; Perez, John C.; Tao, W. Andy



Mechanical Model Analysis for Quantitative Evaluation of Liver Fibrosis Based on Ultrasound Tissue Elasticity Imaging  

NASA Astrophysics Data System (ADS)

Precise evaluation of the stage of chronic hepatitis C with respect to fibrosis has become an important issue to prevent the occurrence of cirrhosis and to initiate appropriate therapeutic intervention such as viral eradication using interferon. Ultrasound tissue elasticity imaging, i.e., elastography, can visualize tissue hardness/softness, and its clinical usefulness for detecting and evaluating tumors has been studied. We have recently reported that the texture of the elasticity image changes as fibrosis progresses. To evaluate fibrosis progression quantitatively on the basis of ultrasound tissue elasticity imaging, we introduced a mechanical model of fibrosis progression, simulated the process by which hepatic fibrosis affects elasticity images, and compared the results with those of clinical data analysis. As a result, it was confirmed that even in diffuse diseases like chronic hepatitis, the patterns of elasticity images are related to fibrous structural changes caused by hepatic disease and can be used to derive features for quantitative evaluation of fibrosis stage.

Shiina, Tsuyoshi; Maki, Tomonori; Yamakawa, Makoto; Mitake, Tsuyoshi; Kudo, Masatoshi; Fujimoto, Kenji



A Voxel-Map Quantitative Analysis Approach for Atherosclerotic Noncalcified Plaques of the Coronary Artery Tree  

PubMed Central

Noncalcified plaques (NCPs) are associated with the presence of lipid-core plaques that are prone to rupture. Thus, it is important to detect and monitor the development of NCPs. Contrast-enhanced coronary Computed Tomography Angiography (CTA) is a potential imaging technique to identify atherosclerotic plaques in the whole coronary tree, but it fails to provide information about vessel walls. In order to overcome the limitations of coronary CTA and provide more meaningful quantitative information for percutaneous coronary intervention (PCI), we proposed a Voxel-Map based on mathematical morphology to quantitatively analyze the noncalcified plaques on a three-dimensional coronary artery wall model (3D-CAWM). This approach is a combination of Voxel-Map analysis techniques, plaque locating, and anatomical location related labeling, which together provide more detailed and comprehensive visualization of the coronary tree wall. PMID:24348749

Li, Ying; Chen, Wei; Chen, Yonglin; Chu, Chun; Fang, Bingji; Tan, Liwen



TOPICAL REVIEW Quantitative strain analysis of surfaces and interfaces using extremely asymmetric x-ray diffraction  

NASA Astrophysics Data System (ADS)

Strain can reduce carrier mobility and the reliability of electronic devices and affect the growth mode of thin films and the stability of nanometer-scale crystals. To control lattice strain, a technique for measuring the minute lattice strain at surfaces and interfaces is needed. Recently, an extremely asymmetric x-ray diffraction method has been developed for this purpose. By employing Darwin's dynamical x-ray diffraction theory, quantitative evaluation of strain at surfaces and interfaces becomes possible. In this paper, we review our quantitative strain analysis studies on native SiO2/Si interfaces, reconstructed Si surfaces, Ni/Si(111)-H interfaces, sputtered III-V compound semiconductor surfaces, high-k/Si interfaces, and Au ion-implanted Si.

Akimoto, Koichi; Emoto, Takashi



Qualitative and quantitative analysis of the constituents in Danmu preparations by UPLC-PDA-TOF-MS.  


An ultra performance liquid chromatography coupled with photodiode array detection and time-of-flight tandem mass spectrometry (UPLC-PDA-TOF-MS) method was developed for the quality assessment of Danmu preparations, a commonly used traditional Chinese medicine. Thirty-three compounds from Danmu preparations were simultaneously detected; among them, 14 compounds were unequivocally identified based on their retention behaviors, UV spectra, and MS and MS(n) data by comparison with reference substances, and the others were tentatively characterized based on the literature. Twelve of the 33 compounds were simultaneously determined by UPLC-PDA, and validation of the quantitative method, including recoveries, linearity, sensitivity, precision and repeatability, was carried out; the results demonstrated that the method satisfies the requirements of quantitative analysis. The results suggest that the established method is a powerful and reliable analytical tool for the quality control of Danmu preparations and the characterization of multiple constituents in a complex chemical system. PMID:23988988

Zhu, Fenxia; Chen, Jiaquan; Wang, Jingjing; Yin, Rong; Li, Xiufeng; Jia, Xiaobin



The Aqua-Planet Experiment (APE): CONTROL SST Simulation  

NASA Technical Reports Server (NTRS)

Climate simulations by 16 atmospheric general circulation models (AGCMs) are compared on an aqua-planet, a water-covered Earth with prescribed sea surface temperature varying only in latitude. The idealised configuration is designed to expose differences in the circulation simulated by different models. Basic features of the aqua-planet climate are characterised by comparison with Earth. The models display a wide range of behaviour. The balanced component of the tropospheric mean flow, and mid-latitude eddy covariances subject to budget constraints, vary relatively little among the models. In contrast, differences in damping in the dynamical core strongly influence transient eddy amplitudes. Historical uncertainty in modelled lower stratospheric temperatures persists in APE. Aspects of the circulation generated more directly by interactions between the resolved fluid dynamics and parameterized moist processes vary greatly. The tropical Hadley circulation forms either a single or double inter-tropical convergence zone (ITCZ) at the equator, with large variations in mean precipitation. The equatorial wave spectrum shows a wide range of precipitation intensity and propagation characteristics. Kelvin mode-like eastward propagation with remarkably constant phase speed dominates in most models. Westward propagation, less dispersive than the equatorial Rossby modes, dominates in a few models or occurs within an eastward propagating envelope in others. The mean structure of the ITCZ is related to precipitation variability, consistent with previous studies. The aqua-planet global energy balance is unknown but the models produce a surprisingly large range of top of atmosphere global net flux, dominated by differences in shortwave reflection by clouds. A number of newly developed models, not optimised for Earth climate, contribute to this. Possible reasons for differences in the optimised models are discussed. The aqua-planet configuration is intended as one component of an experimental hierarchy used to evaluate AGCMs. This comparison does suggest that the range of model behaviour could be better understood and reduced in conjunction with Earth climate simulations. Controlled experimentation is required to explore individual model behavior and investigate convergence of the aqua-planet climate with increasing resolution.

Blackburn, Michael; Williamson, David L.; Nakajima, Kensuke; Ohfuchi, Wataru; Takahashi, Yoshiyuki O.; Hayashi, Yoshi-Yuki; Nakamura, Hisashi; Ishiwatari, Masaki; Mcgregor, John L.; Borth, Hartmut; Wirth, Volkmar; Frank, Helmut; Bechtold, Peter; Wedi, Nils P.; Tomita, Hirofumi; Satoh, Masaki; Zhao, Ming; Held, Isaac M.; Suarez, Max J.; Lee, Myong-In; Watanabe, Masahiro; Kimoto, Masahide; Liu, Yimin; Wang, Zaizhi; Molod, Andrea M.; Rajendran, Kavirajan; Kotoh, Akio; Stratton, Rachel



Quantitative Analysis of Lesion Morphology and Texture Features for Diagnostic Prediction in Breast MRI  

PubMed Central

Rationale and Objectives To investigate the feasibility using quantitative morphology/texture features of breast lesions for diagnostic prediction; and to explore the association of computerized features with lesion phenotype appearance on MRI. Materials and Methods 43 malignant/28 benign lesions were used in this study. A systematic approach from automated lesion segmentation, quantitative feature extraction, diagnostic feature selection using artificial neural network (ANN), and lesion classification was carried out. Eight morphological parameters and 10 GLCM (gray level co-occurrence matrices) texture features were obtained from each lesion. The diagnostic performance of selected features to differentiate between malignant and benign lesions was analyzed using the ROC analysis. Results Six features were selected by ANN using leave-one-out cross validation, including Compactness, NRL Entropy, Volume, Gray Level Entropy, Gray Level Sum Average, and Homogeneity. The area under the ROC curve was 0.86. When dividing the database into half training and half validation set, a classifier of 5 features selected in the half training set achieved AUC of 0.82 in the other half validation set. The selected morphology feature “Compactness” was associated with shape and margin in BI-RADS lexicon, round shape and smooth margin for the benign lesions and more irregular shape for the malignant lesions. The selected texture features were associated with homogeneous/heterogeneous patterns and the enhancement intensity. The malignant lesions had higher intensity and broader distribution in the enhancement histogram (more heterogeneous) compared to the benign ones. Conclusion Quantitative analysis of morphology/texture features of breast lesions was feasible, and these features could be selected by ANN to form a classifier for differential diagnosis. 
Establishing the link between computer-based features and the visual descriptors defined in the BI-RADS lexicon will provide a foundation for the acceptance of quantitative diagnostic features in the development of computer-aided diagnosis (CAD). PMID:19000868
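The AUC values reported above (0.86 leave-one-out, 0.82 hold-out) summarize how well the classifier's scores separate malignant from benign lesions. As a minimal illustration (not the authors' code), the area under the ROC curve equals the probability that a randomly chosen malignant lesion scores higher than a randomly chosen benign one, which can be computed directly from the two score lists:

```python
def roc_auc(malignant_scores, benign_scores):
    """AUC via the Mann-Whitney U statistic: the fraction of
    malignant/benign pairs in which the malignant case outranks
    the benign one (ties count half)."""
    wins = 0.0
    for m in malignant_scores:
        for b in benign_scores:
            if m > b:
                wins += 1.0
            elif m == b:
                wins += 0.5
    return wins / (len(malignant_scores) * len(benign_scores))
```

A perfectly separating classifier gives 1.0, and a classifier no better than chance gives 0.5.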

Nie, Ke; Chen, Jeon-Hor; Yu, Hon J.; Chu, Yong; Nalcioglu, Orhan; Su, Min-Ying



Quantitative multivariate analytical strategy for paleoenvironmental analysis of mixed benthic foraminiferal assemblages  

SciTech Connect

Fossil assemblages of benthic foraminifera commonly contain taxa that were not associated together during life. A variety of processes act to modify living assemblages during the transition to fossil assemblages: transport of tests by traction and gravity currents, taphonomic filtering, and rapid shifting of environments in response to sea level fluctuations, to name a few. Unraveling the nature of faunal mixing can provide insights into depositional processes and paleoenvironmental history of particular lithofacies. A quantitative multivariate analytical strategy is presented to address these problems, using the late Cenozoic Yakataga Formation, Gulf of Alaska as a specific example. A variety of lithofacies are present within the Yakataga Formation, including normal marine mudstones, sandstones, coquinas and conglomerates, and glaciomarine diamictites. Comparison of fossil assemblages with modern foraminiferal distributions indicates significant faunal mixing in most lithofacies, particularly the diamictites. Quantitative analysis includes cluster analysis to define broad patterns in faunal similarity, R-mode factor analysis to define species interrelationships, and Q-mode polytopic vector analysis to 'unmix' the assemblages into their component biofacies. Two broad patterns of faunal mixing are identified: (1) comprehensive mixing of all possible biofacies within a particular bathymetric range and (2) mixing of very shallow (innermost neritic) with deeper (upper bathyal) assemblages, bypassing environments from outer neritic areas. Diamictites are shown to form in a variety of water depths from inner neritic to upper bathyal.

Lagoe, M.B. (Univ. of Texas, Austin (United States))



Quantitative analysis of dimethyl titanocene by iodometric titration, gas chromatography and NMR.  


In this study we report the use of an automated iodometric titration method and a novel gas chromatography (GC) method for the quantitative analysis of dimethyl titanocene (DMT), a key raw material in drug synthesis. Both approaches are based on the reaction of DMT in toluene or tetrahydrofuran solutions with iodine. In the case of iodometric titration, excess iodine is titrated with a standardized aqueous sodium thiosulfate solution to a potentiometric end-point for the determination of DMT concentration. Alternatively, GC is employed to measure the concentration of iodomethane, a product of the reaction between DMT and iodine, in order to determine the concentration of DMT in the solution. Excellent agreement between iodometric titration, GC, and NMR results for several DMT samples confirms the accuracy of the two methods and strongly supports the use of either method as a replacement for the expensive NMR assay in quantitative DMT analysis. The relatively few sources of error associated with the two methods, their ubiquity, and their ease of application make them the analytical methods of choice for routine analysis. Both methods have been validated according to ICH requirements. The use of the iodometric titration method for DMT analysis is demonstrated with two applications. PMID:11377038
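The back-titration arithmetic implied above is straightforward. A hedged sketch (not from the paper): it assumes each Ti-CH3 bond consumes one I2, i.e. 2 mol I2 per mol DMT, which is exposed as a parameter, and uses the standard thiosulfate stoichiometry I2 + 2 S2O3^2- -> 2 I^- + S4O6^2-:

```python
def dmt_concentration(n_i2_added_mmol, v_thio_ml, c_thio_mol_l,
                      v_sample_ml, i2_per_dmt=2.0):
    """Back-titration of excess iodine to find DMT concentration.

    n(I2, excess) = 0.5 * n(thiosulfate), from the thiosulfate reaction;
    n(DMT) = (I2 added - I2 excess) / i2_per_dmt (assumed stoichiometry).
    Returns concentration in mol/L (mmol/mL)."""
    n_i2_excess = 0.5 * v_thio_ml * c_thio_mol_l      # mmol
    n_dmt = (n_i2_added_mmol - n_i2_excess) / i2_per_dmt
    return n_dmt / v_sample_ml
```

For example, if 5.0 mmol I2 is added to a 10 mL sample and the excess consumes 20 mL of 0.1 M thiosulfate, the DMT concentration works out to 0.2 M under the assumed 1:2 stoichiometry.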

Vailaya, A; Wang, T; Chen, Y; Huffman, M



Quantitative Analysis of STD-NMR Spectra of Reversibly Forming Ligand-Receptor Complexes  

NASA Astrophysics Data System (ADS)

We describe our work on the quantitative analysis of STD-NMR spectra of reversibly forming ligand-receptor complexes. This analysis is based on the theory of complete relaxation and conformational exchange matrix analysis of saturation transfer (CORCEMA-ST) effects. As part of this work, we have developed two separate versions of the CORCEMA-ST program. The first version predicts the expected STD intensities for a given model of a ligand-protein complex, and compares them quantitatively with the experimental data. This version is very useful for rapidly determining if a model for a given ligand-protein complex is compatible with the STD-NMR data obtained in solution. It is also useful in determining the optimal experimental conditions for undertaking the STD-NMR measurements on a given complex by computer simulations. In the second version of the CORCEMA-ST program, we have implemented a torsion angle refinement feature for the bound ligand within the protein binding pocket. In this approach, the global minimum for the bound ligand conformation is obtained by a hybrid structure refinement protocol involving CORCEMA-ST calculation of intensities and simulated annealing refinement of torsion angles of the bound ligand using STD-NMR intensities as experimental constraints to minimize a pseudo-energy function. This procedure is useful in refining and improving the initial models based on crystallography, computer docking, or other procedures to generate models for the bound ligand within the protein binding pocket compatible with solution STD-NMR data. In this chapter we describe the properties of the STD-NMR spectra, including the dependence of the intensities on various parameters. We also describe the results of the CORCEMA-ST analyses of experimental STD-NMR data on some ligand-protein complexes to illustrate the quantitative analysis of the data using this method. This CORCEMA-ST program is likely to be useful in structure-based drug design efforts.

Krishna, N. Rama; Jayalakshmi, V.


Automated Quantitative Analysis of Capnogram Shape for COPD-Normal and COPD-CHF Classification.  


We develop an approach to quantitative analysis of carbon dioxide concentration in exhaled breath, recorded as a function of time by capnography. The generated waveform, or capnogram, is currently used in clinical practice to establish the presence of respiration as well as determine respiratory rate and end-tidal CO2 concentration. The capnogram shape also has diagnostic value, but is presently assessed qualitatively, by visual inspection. Prior approaches to quantitatively characterizing the capnogram shape have explored the correlation of various geometric parameters with pulmonary function tests. These studies attempted to characterize the capnogram in normal subjects and patients with cardiopulmonary disease, but no consistent progress was made, and no translation into clinical practice was achieved. We apply automated quantitative analysis to discriminate between chronic obstructive pulmonary disease (COPD) and congestive heart failure (CHF), and between COPD and normal. Capnograms were collected from 30 normal subjects, 56 COPD patients, and 53 CHF patients. We computationally extract four physiologically based capnogram features. Classification on a hold-out test set was performed by an ensemble of classifiers employing quadratic discriminant analysis, designed through cross validation on a labeled training set. Using 80 exhalations of each capnogram record in the test set, performance analysis with bootstrapping yields areas under the receiver operating characteristic (ROC) curve of 0.89 (95% CI: 0.72-0.96) for COPD/CHF classification, and 0.98 (95% CI: 0.82-1.0) for COPD/normal classification. This classification performance is obtained with a run time sufficiently fast for real-time monitoring. PMID:24967981
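Quadratic discriminant analysis, the classifier family named above, fits one Gaussian per class and assigns a record to the class with the higher likelihood. A single-feature sketch for intuition (illustrative only; the study used four features and an ensemble of classifiers):

```python
import math
import statistics

def train_qda_1d(class_a, class_b):
    """Fit a per-class Gaussian to one scalar feature and return a
    classifier that picks the class with the larger log-likelihood.
    Unequal class variances are what make the boundary quadratic."""
    def fit(xs):
        return statistics.mean(xs), statistics.pvariance(xs)
    (ma, va), (mb, vb) = fit(class_a), fit(class_b)

    def loglik(x, mean, var):
        return -0.5 * math.log(2 * math.pi * var) - (x - mean) ** 2 / (2 * var)

    def classify(x):
        return "A" if loglik(x, ma, va) >= loglik(x, mb, vb) else "B"
    return classify
```

Training on a labeled set and scoring a hold-out set, as in the study, then amounts to calling `train_qda_1d` on the training features and `classify` on each test exhalation.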

Mieloszyk, Rebecca J; Verghese, George C; Deitch, Kenneth; Cooney, Brendan; Khalid, Abdullah; Mirre-Gonzalez, Milciades A; Heldt, Thomas; Krauss, Baruch S



Quantitative analysis of multi-component gas mixture based on AOTF-NIR spectroscopy  

NASA Astrophysics Data System (ADS)

Near Infrared (NIR) spectroscopy has attracted wide attention and found application in many domains in recent years because of its remarkable advantages. Until now, however, NIR spectrometers have been used mainly for liquid and solid analysis. In this paper, a new quantitative analysis method for gas mixtures using a new-generation NIR spectrometer is explored. To collect the NIR spectra of gas mixtures, an evacuable gas cell was designed and fitted to a Luminar 5030-731 Acousto-Optic Tunable Filter (AOTF)-NIR spectrometer. Standard gas samples of methane (CH4), ethane (C2H6), and propane (C3H8) were diluted with ultra-pure nitrogen via precision volumetric gas flow controllers to dynamically obtain gas mixture samples of different concentrations. The gas mixtures were injected into the gas cell and spectra between 1100 nm and 2300 nm were collected. The feature components extracted from the gas mixture spectra by Partial Least Squares (PLS) were used as the inputs of a Support Vector Regression (SVR) machine to establish the quantitative analysis model. The effectiveness of the model was tested on the samples of the prediction set. The prediction Root Mean Square Error (RMSE) for CH4, C2H6, and C3H8 is 1.27%, 0.89%, and 1.20%, respectively, when the concentrations of the component gases are over 0.5%. This shows that the AOTF-NIR spectrometer with a gas cell can be used for gas mixture analysis, and that PLS combined with SVR performs well in NIR spectroscopic analysis. This paper provides a basis for extending NIR spectroscopic analysis to gas detection.
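The dynamic dilution step described above determines each sample's nominal composition from the flow-controller settings. A sketch of that arithmetic (the flow values are hypothetical, and ideal mixing of the gas streams is assumed):

```python
def mixture_fractions(flows_sccm):
    """Nominal mole fraction of each component gas from volumetric
    flow-controller settings (sccm), assuming ideal mixing."""
    total = sum(flows_sccm.values())
    return {gas: flow / total for gas, flow in flows_sccm.items()}
```

For instance, blending 2 sccm CH4 and 1 sccm C2H6 into 97 sccm N2 yields nominal fractions of 2% and 1%; sweeping the controller set points generates the concentration series used to train and test the PLS-SVR model.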

Hao, Huimin; Zhang, Yong; Liu, Junhua




EPA Science Inventory

A simple method for the quantitative determination of elemental sulfur on oxidized sulfide minerals is described. Extraction of elemental sulfur in perchloroethylene and subsequent analysis with high-performance liquid chromatography were used to ascertain the total elemental ...


Structured Qualitative Research: Organizing "Mountains of Words" for Data Analysis, both Qualitative and Quantitative  

PubMed Central

Qualitative research creates mountains of words. U.S. federal funding supports mostly structured qualitative research, which is designed to test hypotheses using semi-quantitative coding and analysis. The authors have 30 years of experience in designing and completing major qualitative research projects, mainly funded by the US National Institute on Drug Abuse [NIDA]. This article reports on strategies for planning, organizing, collecting, managing, storing, retrieving, analyzing, and writing about qualitative data so as to most efficiently manage the mountains of words collected in large-scale ethnographic projects. Multiple benefits accrue from this approach. Several different staff members can contribute to the data collection, even when working from remote locations. Field expenditures are linked to units of work so productivity is measured, many staff in various locations have access to use and analyze the data, quantitative data can be derived from data that is primarily qualitative, and improved efficiencies of resources are developed. The major difficulties involve a need for staff who can program and manage large databases, and who can be skillful analysts of both qualitative and quantitative data. PMID:20222777

Johnson, Bruce D.; Dunlap, Eloise; Benoit, Ellen



Quantitative analysis of a prostate-specific antigen in serum using fluorescence immunochromatography.  


A method for the quantitative analysis of prostate-specific antigen (PSA) in samples of human blood serum by fluorescence immunochromatography using monoclonal antibodies to PSA was developed. The fluorescence immunochromatographic analysis system is composed of an anti-PSA monoclonal antibody (mAb), fluorescence conjugates in a detection solution, an immunochromatographic assay strip, and a laser fluorescence scanner. The system detects PSA on the basis of the area ratio between the control line and the test line of the strip. Under optimal conditions, the area ratio was proportional to PSA concentration ranging from 0.72 to 46.0 ng/mL, with a detection limit of 0.72 ng/mL. PMID:21113839
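Because the area ratio is proportional to PSA concentration over the working range, quantitation reduces to fitting a calibration line to standards and inverting it for unknowns. A sketch with hypothetical calibration points (not data from the study):

```python
def fit_calibration(concs, ratios):
    """Ordinary least squares fit of ratio = a + b * conc."""
    n = len(concs)
    mx = sum(concs) / n
    my = sum(ratios) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(concs, ratios))
         / sum((x - mx) ** 2 for x in concs))
    a = my - b * mx
    return a, b

def conc_from_ratio(ratio, a, b):
    """Invert the calibration line to read a concentration (ng/mL)."""
    return (ratio - a) / b
```

In practice the fit would use PSA standards spanning 0.72-46.0 ng/mL, and readings below the 0.72 ng/mL detection limit would be reported as such rather than extrapolated.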

Yoo, Jisun; Jung, Young Mee; Hahn, Jong Hoon; Pyo, Dongjin



Quantitation of X-Ray Radiographic Elemental Maps Using Factorial Analysis of Correspondence: Methods and Programs  

NASA Astrophysics Data System (ADS)

We examine the problem of building quantitative elemental maps from X-ray absorption images (radiography). As we suggested in a previous publication in Microbeam Analysis, factorial analysis of correspondence is shown to be an optimal method, in the least squares sense, for solving the multilinear equation system given by Beer's law: it relates to an efficient description of the problem in the concentration phase space. We explain how factorial analysis is related to singular value decomposition and we give a complete description of the algorithm. The method can be applied to any multilinear analytical technique as well. Programs are written in C and Mathematica® languages. Academic users may obtain the relevant software (source and code) as freeware directly from the authors.
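The multilinear system in question is Beer's law: each measured absorbance is a weighted sum of the component contributions. The paper solves it optimally via factorial analysis of correspondence, which is related to singular value decomposition; as a simpler illustration of the same least-squares problem, the normal equations can be solved directly (the small absorptivity matrix below is synthetic, for demonstration only):

```python
def beers_law_least_squares(E, a):
    """Least-squares concentrations c minimizing ||E c - a||, where
    row i of E holds the per-component absorptivities at measurement i
    and a[i] is the observed absorbance. Solves the normal equations
    (E^T E) c = E^T a by Gaussian elimination with partial pivoting."""
    m, k = len(E), len(E[0])
    ete = [[sum(E[i][p] * E[i][q] for i in range(m)) for q in range(k)]
           for p in range(k)]
    eta = [sum(E[i][p] * a[i] for i in range(m)) for p in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(ete[r][col]))
        ete[col], ete[piv] = ete[piv], ete[col]
        eta[col], eta[piv] = eta[piv], eta[col]
        for row in range(col + 1, k):
            f = ete[row][col] / ete[col][col]
            for j in range(col, k):
                ete[row][j] -= f * ete[col][j]
            eta[row] -= f * eta[col]
    c = [0.0] * k
    for row in range(k - 1, -1, -1):
        c[row] = (eta[row]
                  - sum(ete[row][j] * c[j] for j in range(row + 1, k))) / ete[row][row]
    return c
```

An SVD-based solver, as the factorial-analysis formulation implies, is numerically preferable when E is ill-conditioned; the normal-equations route above is only the shortest correct statement of the least-squares problem.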

Trebbia, P.; Ferrar, G.



Quantitative Analysis of Food and Feed Samples with Droplet Digital PCR  

PubMed Central

In this study, the applicability of droplet digital PCR (ddPCR) for routine analysis in food and feed samples was demonstrated with the quantification of genetically modified organisms (GMOs). Real-time quantitative polymerase chain reaction (qPCR) is currently used for quantitative molecular analysis of the presence of GMOs in products. However, its use is limited for detecting and quantifying very small numbers of DNA targets, as in some complex food and feed matrices. Using ddPCR duplex assay, we have measured the absolute numbers of MON810 transgene and hmg maize reference gene copies in DNA samples. Key performance parameters of the assay were determined. The ddPCR system is shown to offer precise absolute and relative quantification of targets, without the need for calibration curves. The sensitivity (five target DNA copies) of the ddPCR assay compares well with those of individual qPCR assays and of the chamber digital PCR (cdPCR) approach. It offers a dynamic range over four orders of magnitude, greater than that of cdPCR. Moreover, when compared to qPCR, the ddPCR assay showed better repeatability at low target concentrations and a greater tolerance to inhibitors. Finally, ddPCR throughput and cost are advantageous relative to those of qPCR for routine GMO quantification. It is thus concluded that ddPCR technology can be applied for routine quantification of GMOs, or any other domain where quantitative analysis of food and feed samples is needed. PMID:23658750
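Absolute quantification without a calibration curve, as claimed above, rests on Poisson statistics: from the fraction p of positive droplets, the mean number of target copies per droplet is lambda = -ln(1 - p). A sketch of that standard correction (the ~0.85 nL droplet volume is an assumed nominal value, not stated in the abstract):

```python
import math

def ddpcr_quantify(positives, total_droplets, droplet_nl=0.85):
    """Poisson-corrected copy estimate for a droplet digital PCR run.
    Returns (estimated copies in the partitioned volume, copies per uL).
    The correction accounts for droplets holding more than one copy."""
    p = positives / total_droplets
    lam = -math.log(1.0 - p)            # mean copies per droplet
    copies = lam * total_droplets
    copies_per_ul = lam / (droplet_nl * 1e-3)
    return copies, copies_per_ul
```

Relative GMO content then follows as the ratio of MON810 transgene copies to hmg reference-gene copies from the duplex assay.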

Morisset, Dany; Stebih, Dejan; Milavec, Mojca; Gruden, Kristina; Zel, Jana



Quantitative analysis of liver function in percutaneous transhepatic biliary drainage patients  

SciTech Connect

The diagnostic usefulness of Tc-99m DISIDA cholescintigraphy as a predictor of eventual catheter and hepatic function in patients who have undergone percutaneous transhepatic biliary drainage (PTBD) for extrahepatic biliary obstruction was evaluated. Twenty-nine cholescintigrams were performed in 14 patients. The examinations were divided into two groups: Group A (N = 17), in which the patient's clinical status deteriorated within two to three days post-PTBD, and Group B (N = 12), in which the patients did well clinically post-PTBD. No significant difference between the two groups was demonstrated by visual analysis of the analog images or by analysis of serum bilirubin levels. A computer program, developed by the authors, quantitates several parameters of DISIDA kinetics, reflecting hepatic function based upon compartmental analysis. A significant difference (P < .001) was demonstrated between the mean transport constants (blood clearance constant = k1; hepatic clearance constant = k2) for the two groups. It is concluded that serum bilirubin levels and visual inspection of analog images are inadequate independent predictors of hepatic function in patients post-PTBD. The transport constants k1 and k2 are quantitative parameters of hepatic function that may be of prognostic value in patients post-PTBD.
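The transport constants come from compartmental modeling of the tracer kinetics. As an illustrative sketch (not the authors' program), a clearance constant like k1 can be estimated from a mono-exponential washout A(t) = A0 * exp(-k * t) by a log-linear least-squares fit; the time-activity values below are synthetic:

```python
import math

def fit_washout_constant(times_min, activities):
    """Estimate k in A(t) = A0 * exp(-k t) by least-squares fitting a
    straight line to log(activity) versus time and negating the slope."""
    ys = [math.log(a) for a in activities]
    n = len(times_min)
    mt = sum(times_min) / n
    my = sum(ys) / n
    slope = (sum((t - mt) * (y - my) for t, y in zip(times_min, ys))
             / sum((t - mt) ** 2 for t in times_min))
    return -slope
```

A full two-compartment fit would estimate k1 and k2 jointly from the blood and liver curves; the log-linear fit above shows only the core arithmetic of extracting a rate constant from sampled activity data.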

Velchik, M.G.; Schwartz, W.; London, J.W.; Makler, P.T. Jr.; Alavi, A.



Adaptive wavelet transform suppresses background and noise for quantitative analysis by Raman spectrometry.  


Discrete wavelet transform (DWT) provides a well-established means for spectral denoising and baseline elimination to enhance resolution and improve the performance of calibration and classification models. However, the limitation of a fixed filter bank can prevent the optimal application of conventional DWT for the multiresolution analysis of spectra of arbitrarily varying noise and background. This paper presents a novel methodology based on an improved, second-generation adaptive wavelet transform (AWT) algorithm. This AWT methodology uses a spectrally adapted lifting scheme to generate an infinite basis of wavelet filters from a single conventional wavelet, and then finds the optimal one. Such pretreatment combined with a multivariate calibration approach such as partial least squares can greatly enhance the utility of Raman spectroscopy for quantitative analysis. The present work demonstrates this methodology using two dispersive Raman spectral data sets, incorporating lactic acid and melamine in pure water and in milk solutions. The results indicate that AWT can separate spectral background and noise from signals of interest more efficiently than conventional DWT, thus improving the effectiveness of Raman spectroscopy for quantitative analysis and classification. PMID:21331486
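The idea underlying any DWT pretreatment, adaptive or not, is the same three steps: transform, shrink the detail coefficients, invert. A one-level sketch with the plain (non-adaptive) Haar wavelet and soft thresholding, for contrast with the adaptive lifting scheme described above:

```python
import math

def haar_soft_denoise(signal, threshold):
    """One-level Haar DWT, soft-threshold the detail coefficients,
    then inverse transform. len(signal) must be even. Small details
    (noise) are suppressed; the smooth approximation is preserved."""
    s2 = math.sqrt(2.0)
    approx = [(a + b) / s2 for a, b in zip(signal[0::2], signal[1::2])]
    detail = [(a - b) / s2 for a, b in zip(signal[0::2], signal[1::2])]
    shrunk = [math.copysign(max(abs(d) - threshold, 0.0), d) for d in detail]
    out = []
    for a, d in zip(approx, shrunk):
        out.extend([(a + d) / s2, (a - d) / s2])
    return out
```

The AWT method of the paper goes further by adapting the lifting filters to the spectrum at hand; the fixed Haar filter here is exactly the kind of one-size-fits-all filter bank whose limitations motivate that work.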

Chen, Da; Chen, Zhiwen; Grant, Edward



Qualitative and Quantitative Analysis for Facial Complexion in Traditional Chinese Medicine  

PubMed Central

Facial diagnosis is an important and very intuitive diagnostic method in Traditional Chinese Medicine (TCM). However, due to its qualitative, experience-based, and subjective character, traditional facial diagnosis has certain limitations in clinical medicine. Computerized inspection methods provide classification models to recognize facial complexion (including color and gloss). However, previous works only studied the classification problem of facial complexion, which we regard as qualitative analysis; the severity or degree of facial complexion, needed for quantitative analysis, has not yet been reported. This paper aims to provide both qualitative and quantitative analysis of facial complexion. We propose a novel feature representation of facial complexion from the whole face of patients. The features are established with four chromaticity bases split up by luminance distribution in CIELAB color space. The chromaticity bases are constructed from the facial dominant color using two-level clustering; the optimal luminance distribution is determined through simple experimental comparisons. The features are shown to be more distinctive than previous facial complexion feature representations. Complexion recognition proceeds by training an SVM classifier with the optimal model parameters. In addition, the features are further improved by weighted fusion of five local regions. Extensive experimental results show that the proposed features achieve the highest facial color recognition performance, with a total accuracy of 86.89%. Furthermore, the proposed recognition framework can analyze both the color and gloss degrees of facial complexion by learning a ranking function. PMID:24967342

Zhao, Changbo; Li, Guo-zheng; Li, Fufeng; Wang, Zhi; Liu, Chang



Application of relativistic electrons for the quantitative analysis of trace elements  

NASA Astrophysics Data System (ADS)

Particle-induced X-ray emission methods (PIXE) have been extended to relativistic electrons to induce X-ray emission (REIXE) for quantitative trace-element analysis. The electron beam (20 ≤ E0 ≤ 70 MeV) was supplied by the Darmstadt electron linear accelerator DALINAC. Systematic measurements of absolute K-, L- and M-shell ionization cross sections revealed a scaling behaviour of inner-shell ionization cross sections from which X-ray production cross sections can be deduced for any element of interest for a quantitative sample investigation. Using a multielemental mineral monazite sample from Malaysia, the sensitivity of REIXE is compared to well-established methods of trace-element analysis such as proton- and X-ray-induced X-ray fluorescence analysis. The achievable detection limit for very heavy elements amounts to about 100 ppm for the REIXE method. As an example of an application, the investigation of a sample prepared from manganese nodules picked up from the Pacific deep sea is discussed, which showed the expected high mineral content of Fe, Ni, Cu and Ti, although the search for traces of Pt did not show any measurable content within an upper limit of 250 ppm.

Hoffmann, D. H. H.; Brendel, C.; Genz, H.; Löw, W.; Richter, A.



Quantitative analysis of nanoparticle internalization in mammalian cells by high resolution X-ray microscopy  

PubMed Central

Background Quantitative analysis of nanoparticle uptake at the cellular level is critical to nanomedicine procedures. In particular, it is required for a realistic evaluation of their effects. Unfortunately, quantitative measurements of nanoparticle uptake still pose a formidable technical challenge. We present here a method to tackle this problem and analyze the number of metal nanoparticles present in different types of cells. The method relies on high-lateral-resolution (better than 30 nm) transmission x-ray microimages with both absorption contrast and phase contrast -- including two-dimensional (2D) projection images and three-dimensional (3D) tomographic reconstructions that directly show the nanoparticles. Results Practical tests were successfully conducted on bare and polyethylene glycol (PEG) coated gold nanoparticles obtained by x-ray irradiation. Using two different cell lines, EMT and HeLa, we obtained the number of nanoparticle clusters uptaken by each cell and the cluster size. Furthermore, the analysis revealed interesting differences between 2D and 3D cultured cells as well as between 2D and 3D data for the same 3D specimen. Conclusions We demonstrated the feasibility and effectiveness of our method, proving that it is accurate enough to measure the nanoparticle uptake differences between cells as well as the sizes of the formed nanoparticle clusters. The differences between 2D and 3D cultures and 2D and 3D images stress the importance of the 3D analysis which is made possible by our approach. PMID:21477355



DAnTE: a statistical tool for quantitative analysis of -omics data  

SciTech Connect

DAnTE (Data Analysis Tool Extension) is a statistical tool designed to address challenges unique to quantitative bottom-up, shotgun proteomics data. This tool has also been demonstrated for microarray data and can easily be extended to other high-throughput data types. DAnTE features selected normalization methods, missing value imputation algorithms, peptide to protein rollup methods, an extensive array of plotting functions, and a comprehensive ANOVA scheme that can handle unbalanced data and random effects. The Graphical User Interface (GUI) is designed to be very intuitive and user friendly.

Polpitiya, Ashoka D.; Qian, Weijun; Jaitly, Navdeep; Petyuk, Vladislav A.; Adkins, Joshua N.; Camp, David G.; Anderson, Gordon A.; Smith, Richard D.



Uranium Isotopic and Quantitative Analysis Using a Mechanically-Cooled HPGe Detector  

SciTech Connect

A new, portable high-resolution spectroscopy system based on a high-purity germanium detector cooled with a miniature Stirling-cycle cooler, ORTEC trans-SPEC, has recently become commercially available. The use of a long-life mechanical cooling system eliminates the need for liquid nitrogen. The purpose of this study was to determine the applicability of this new instrument for isotopic and quantitative analyses of uranium samples. The results of the performance of the trans-SPEC with the combination of PC-FRAM and ISOTOPIC software packages are described in this paper. An optimal set of analysis parameters for uranium measurements is proposed.

Solodov, Alexander A [ORNL



[Application of MALT-CLS method to FTIR quantitative analysis of atmospheric trace gas].  


The MALT-CLS method for quantitative analysis of atmospheric trace gases by FTIR spectrometry was studied. Experiments in several configurations are described, such as a long-path White cell and passive remote sensing of aircraft. The distinguishing characteristic of this method is that the calibration spectra are calculated from the HITRAN database of absorption line parameters using the MALT program, with environmental and instrumental effects included in the calculation. It is particularly useful in long open-path and solar FTIR spectroscopy, and in passive remote sensing by FTIR. PMID:17020024

Gao, Min-Guang; Liu, Wen-Qing; Zhang, Tian-Shu; Liu, Jian-Guo; Lu, Yi-Huai; Xu, Liang; Zhu, Jun



Modular isotopomer synthesis of γ-hydroxybutyric acid for a quantitative analysis of metabolic fates.  


Herein we report a study combining metabolomics and mass isotopomer analysis to investigate the biochemical fate of γ-hydroxybutyric acid (GHB). Using various (13)C incorporation labeling patterns in GHB, we have discovered that GHB is catabolized by previously unknown processes that include (i) direct β-oxidation to acetyl-CoA and glycolate, (ii) α-oxidation to 3-hydroxypropionyl-CoA and formate, and (iii) cleavage of C-4 to yield 3-hydroxypropionate and CO2. We further utilized the unique attributes of our labeling patterns and the resultant isotopomers to quantitate the relative flux down the identified pathways. PMID:24933109

Sadhukhan, Sushabhan; Zhang, Guo-Fang; Tochtrop, Gregory P



Quantitative Evaluation of Paracetamol and Caffeine from Pharmaceutical Preparations Using Image Analysis and RP-TLC  

Microsoft Academic Search

A reversed-phase high-performance thin-layer chromatographic method combined with image analysis was developed and validated for the simultaneous quantitative evaluation of paracetamol and caffeine in pharmaceutical preparations. RP-HPTLC-W18 chromatographic plates were used as the stationary phase and methanol:glacial acetic acid:water (25:4.3:70.7; v:v:v) as the mobile phase. The detection of the spots and the image documentation were carried out under 254 nm UV radiation.

Florin Soponar; Augustin Cătălin Moț; Costel Sârbu



Quantitative determination of electric field strengths within dynamically operated devices using EBIC analysis in the SEM.  


Although the electron beam-induced current (EBIC) technique was invented in the seventies, it remains a powerful technique for failure analysis and reliability investigations of modern materials and devices. Time-resolved and stroboscopic microanalyses, using sampled Fourier components decomposed by modulated charge carrier excitation, are introduced. Quantitative determination of electric field strengths within dynamically operated devices in the scanning electron microscope (SEM) is demonstrated. This technique allows investigations of diffusion and drift processes and of variations of electric field distributions inside active devices. PMID:18523959

Pugatschow, Anton; Heiderhoff, Ralf; Balk, Ludwig J



Quantitative analysis of brain pathology based on MRI and brain atlases--applications for cerebral palsy.  


We have developed a new method to provide a comprehensive quantitative analysis of brain anatomy in cerebral palsy patients, which makes use of two techniques: diffusion tensor imaging and automated 3D whole brain segmentation based on our brain atlas and a nonlinear normalization technique (large-deformation diffeomorphic metric mapping). This method was applied to 13 patients and normal controls. The reliability of the automated segmentation revealed close agreement with the manual segmentation. We illustrate some potential applications for individual characterization and group comparison. This technique also provides a framework for determining the impact of various neuroanatomic features on brain functions. PMID:20920589

Faria, Andreia V; Hoon, Alexander; Stashinko, Elaine; Li, Xin; Jiang, Hangyi; Mashayekh, Ameneh; Akhter, Kazi; Hsu, John; Oishi, Kenichi; Zhang, Jiangyang; Miller, Michael I; van Zijl, Peter C M; Mori, Susumu



A quantitative analysis of the mechanism that controls body size in Manduca sexta  

PubMed Central

Background Body size is controlled by mechanisms that terminate growth when the individual reaches a species-specific size. In insects, it is a pulse of ecdysone at the end of larval life that causes the larva to stop feeding and growing and initiate metamorphosis. Body size is a quantitative trait, so it is important that the problem of control of body size be analyzed quantitatively. The processes that control the timing of ecdysone secretion in larvae of the moth Manduca sexta are sufficiently well understood that they can be described in a rigorous manner. Results We develop a quantitative description of the empirical data on body size determination that accurately predicts body size for diverse genetic strains. We show that body size is fully determined by three fundamental parameters: the growth rate, the critical weight (which signals the initiation of juvenile hormone breakdown), and the interval between the critical weight and the secretion of ecdysone. All three parameters are easily measured and differ between genetic strains and environmental conditions. The mathematical description we develop can be used to explain how variables such as growth rate, nutrition, and temperature affect body size. Conclusion Our analysis shows that there is no single locus of control of body size, but that body size is a system property that depends on interactions among the underlying determinants of the three fundamental parameters. A deeper mechanistic understanding of body size will be obtained by research aimed at uncovering the molecular mechanisms that give these three parameters their particular quantitative values. PMID:16879739
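Under the common simplifying assumption that larval growth is exponential between the critical weight and ecdysone secretion, the three fundamental parameters identified above combine into a simple prediction of mass at the end of growth. This sketch is our illustration of that relationship, not the authors' full quantitative model, and the parameter values below are hypothetical:

```python
import math

def predicted_final_mass(critical_weight_g, growth_rate_per_day, interval_days):
    """Mass at ecdysone secretion, assuming exponential growth at a
    constant rate from the critical weight through the interval (ICG)."""
    return critical_weight_g * math.exp(growth_rate_per_day * interval_days)
```

The form makes the paper's point concrete: a strain with a higher critical weight, a faster growth rate, or a longer interval matures at a larger size, so body size is a joint property of all three parameters rather than of any single locus of control.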

Nijhout, HF; Davidowitz, G; Roff, DA




PubMed Central

Image data are increasingly encountered and are of growing importance in many areas of science. Much of these data are quantitative image data, which are characterized by intensities that represent some measurement of interest in the scanned images. The data typically consist of multiple images on the same domain and the goal of the research is to combine the quantitative information across images to make inference about populations or interventions. In this paper, we present a unified analysis framework for the analysis of quantitative image data using a Bayesian functional mixed model approach. This framework is flexible enough to handle complex, irregular images with many local features, and can model the simultaneous effects of multiple factors on the image intensities and account for the correlation between images induced by the design. We introduce a general isomorphic modeling approach to fitting the functional mixed model, of which the wavelet-based functional mixed model is one special case. With suitable modeling choices, this approach leads to efficient calculations and can result in flexible modeling and adaptive smoothing of the salient features in the data. The proposed method has the following advantages: it can be run automatically, it produces inferential plots indicating which regions of the image are associated with each factor, it simultaneously considers the practical and statistical significance of findings, and it controls the false discovery rate. Although the method we present is general and can be applied to quantitative image data from any application, in this paper we focus on image-based proteomic data. We apply our method to an animal study investigating the effects of opiate addiction on the brain proteome. Our image-based functional mixed model approach finds results that are missed with conventional spot-based analysis approaches. 
In particular, we find that the significant regions of the image identified by the proposed method frequently correspond to subregions of visible spots that may represent post-translational modifications or co-migrating proteins that cannot be visually resolved from adjacent, more abundant proteins on the gel image. Thus, it is possible that this image-based approach may actually improve the realized resolution of the gel, revealing differentially expressed proteins that would not have even been detected as spots by modern spot-based analyses. PMID:22408711

Morris, Jeffrey S.; Baladandayuthapani, Veerabhadran; Herrick, Richard C.; Sanna, Pietro; Gutstein, Howard



76 FR 41854 - Aqua Society, Inc., Centurion Gold Holdings, Inc., and PowerRaise, Inc.; Order of Suspension of...  

Federal Register 2010, 2011, 2012, 2013

...File No. 500-1] Aqua Society, Inc., Centurion Gold...of current and accurate information concerning the securities of Aqua Society, Inc. because it has not...of current and accurate information concerning the...



A New 3-Dimensional Dynamic Quantitative Analysis System of Facial Motion: An Establishment and Reliability Test  

PubMed Central

This study aimed to establish a 3-dimensional dynamic quantitative facial motion analysis system, and then determine its accuracy and test-retest reliability. The system could automatically reconstruct the motion of the observational points. A standardized T-shaped rod and L-shaped rods were used to evaluate the static and dynamic accuracy of the system. Nineteen healthy volunteers were recruited to test the reliability of the system. The average static distance error measurement was 0.19 mm, and the average angular error was 0.29°. Measurement accuracy decreased as the distance between the cameras and the object increased, with 80 cm considered optimal. It took only 58 seconds to perform the full facial measurement process. The average intra-class correlation coefficients for distance measurement and angular measurement were 0.973 and 0.794, respectively. The results demonstrated that we successfully established a practical 3-dimensional dynamic quantitative analysis system that is accurate and reliable enough to meet both clinical and research needs. PMID:25390881
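The reported static accuracy is a mean error over repeated rod measurements; a minimal sketch of that computation, with made-up distances, is:

```python
import statistics

def mean_distance_error(measured_mm, true_mm):
    """Mean absolute error between the system's measured rod distances
    and the known distances (the static-accuracy figure reported as
    0.19 mm in the study)."""
    return statistics.fmean(abs(m - t) for m, t in zip(measured_mm, true_mm))
```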

Feng, Guodong; Zhao, Yang; Tian, Xu; Gao, Zhiqiang



A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework  

NASA Astrophysics Data System (ADS)

An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. Identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks (the general one and the aircraft-oriented one) were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
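As a rough illustration of the AHP step, priority weights for framework criteria can be derived from a pairwise-comparison matrix via its principal eigenvector. The number of criteria and the judgment values below are hypothetical, not taken from the paper.

```python
import numpy as np

# AHP sketch: weights for three hypothetical framework criteria
# (e.g. usability, integration, extensibility) from a pairwise-
# comparison matrix, via its principal eigenvector.
A = np.array([[1.0, 3.0, 5.0],
              [1.0 / 3.0, 1.0, 3.0],
              [1.0 / 5.0, 1.0 / 3.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()   # normalized priority vector
```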

Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo


Quantitative Analysis of Signaling Networks across Differentially Embedded Tumors Highlights Interpatient Heterogeneity in Human Glioblastoma.  


Glioblastoma multiforme (GBM) is the most aggressive malignant primary brain tumor, with a dismal mean survival even with the current standard of care. Although in vitro cell systems can provide mechanistic insight into the regulatory networks governing GBM cell proliferation and migration, clinical samples provide a more physiologically relevant view of oncogenic signaling networks. However, clinical samples are not widely available and may be embedded for histopathologic analysis. With the goal of accurately identifying activated signaling networks in GBM tumor samples, we investigated the impact of embedding in optimal cutting temperature (OCT) compound followed by flash freezing in LN2 vs immediate flash freezing (iFF) in LN2 on protein expression and phosphorylation-mediated signaling networks. Quantitative proteomic and phosphoproteomic analysis of 8 pairs of tumor specimens revealed minimal impact of the different sample processing strategies and highlighted the large interpatient heterogeneity present in these tumors. Correlation analyses of the differentially processed tumor sections identified activated signaling networks present in selected tumors and revealed the differential expression of transcription, translation, and degradation associated proteins. This study demonstrates the capability of quantitative mass spectrometry for identification of in vivo oncogenic signaling networks from human tumor specimens that were either OCT-embedded or immediately flash-frozen. PMID:24927040

Johnson, Hannah; White, Forest M



Histogram Analysis of Hepatobiliary Phase MR Imaging as a Quantitative Value for Liver Cirrhosis: Preliminary Observations  

PubMed Central

Purpose To investigate whether histogram analysis of the hepatobiliary phase on gadoxetate-enhanced MRI could be used as a quantitative index for determination of liver cirrhosis. Materials and Methods A total of 63 patients [26 in a normal liver function (NLF) group and 37 in a cirrhotic group] underwent gadoxetate-enhanced MRI, and hepatobiliary phase images were obtained at 20 minutes after contrast injection. The signal intensity of the hepatic parenchyma was measured at four different regions of interest (ROI) of the liver, avoiding vessels and bile ducts. Standard deviation (SD), coefficient of variation (CV), and corrected CV were calculated on the histograms at the ROIs. The distributions of CVs calculated from the ROI histogram were examined and statistical analysis was carried out. Results The CV value was 0.041±0.009 (mean CV±SD) in the NLF group, while that of the cirrhotic group was 0.071±0.020. There were statistically significant differences in the CVs and corrected CV values between the NLF and cirrhotic groups (p<0.001). The most accurate cut-off value among CVs for distinguishing the normal from the cirrhotic group was 0.052 (sensitivity 83.8% and specificity 88.5%). There was no statistically significant difference in SD between the NLF and cirrhotic groups (p=0.307). Conclusion The CV of histograms of the hepatobiliary phase on gadoxetate-enhanced MRI may be useful as a quantitative value for determining the presence of liver cirrhosis. PMID:24719131
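The histogram index here is a plain coefficient of variation; a minimal sketch, assuming population SD (the abstract does not state the SD convention) and using the reported 0.052 cut-off, is:

```python
import statistics

def roi_cv(intensities):
    """Coefficient of variation (SD / mean) of the ROI histogram, the
    quantitative index used in the study."""
    return statistics.pstdev(intensities) / statistics.fmean(intensities)

def classify_liver(cv, cutoff=0.052):
    # 0.052 is the reported optimal cut-off; higher CV suggests cirrhosis
    return "cirrhotic" if cv > cutoff else "normal"
```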

Kim, Honsoul; Sun, Mark; Sirlin, Claude B.



WormFarm: a quantitative control and measurement device toward automated Caenorhabditis elegans aging analysis.  


Caenorhabditis elegans is a leading model organism for studying the basic mechanisms of aging. Progress has been limited, however, by the lack of an automated system for quantitative analysis of longevity and mean lifespan. To address this barrier, we developed 'WormFarm', an integrated microfluidic device for culturing nematodes. Cohorts of 30-50 animals are maintained throughout their lifespan in each of eight separate chambers on a single WormFarm polydimethylsiloxane chip. Design features allow for automated removal of progeny and efficient control of environmental conditions. In addition, we have developed computational algorithms for automated analysis of video footage to quantitate survival and other phenotypes, such as body size and motility. As proof-of-principle, we show here that WormFarm successfully recapitulates survival data obtained from a standard plate-based assay for both RNAi-mediated and dietary-induced changes in lifespan. Further, using a fluorescent reporter in conjunction with WormFarm, we report an age-associated decrease in fluorescent intensity of GFP in transgenic worms expressing GFP tagged with a mitochondrial import signal under the control of the myo-3 promoter. This marker may therefore serve as a useful biomarker of biological age and aging rate. PMID:23442149
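Survival quantitation from the scored video footage can be sketched as a simple complete-observation survival curve. This assumes every animal's death day is observed (no censoring); the actual WormFarm image-analysis algorithms are more involved.

```python
def survival_curve(death_days, cohort_size):
    """Fraction of the cohort alive at the end of each day, given the
    per-animal death days scored by automated video analysis."""
    curve = []
    alive = cohort_size
    for day in range(1, max(death_days) + 1):
        alive -= death_days.count(day)
        curve.append(alive / cohort_size)
    return curve
```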

Xian, Bo; Shen, Jie; Chen, Weiyang; Sun, Na; Qiao, Nan; Jiang, Dongqing; Yu, Tao; Men, Yongfan; Han, Zhijun; Pang, Yuhong; Kaeberlein, Matt; Huang, Yanyi; Han, Jing-Dong J



Quantitative deuterium analysis of titanium samples in ultraviolet laser-induced low-pressure helium plasma.  


An experimental study of ultraviolet (UV) laser-induced plasma spectroscopy (LIPS) on Ti samples with low-pressure surrounding He gas has been carried out to demonstrate its applicability to quantitative micro-analysis of deuterium impurities in titanium without the spectral interference from the ubiquitous surface water. This was achieved by adopting the optimal experimental condition ascertained in this study, which is specified by 5 mJ laser energy, 10 Torr helium pressure, and 1-50 μs measurement window, which resulted in consistent D emission enhancement and effective elimination of spectral interference from surface water. As a result, a linear calibration line exhibiting a zero intercept was obtained from Ti samples doped with various D impurity concentrations. An additional measurement also yielded a detection limit of about 40 ppm for D impurity, well below the acceptable threshold of damaging H concentration in Ti and its alloys. Each of these measurements was found to produce a crater size of only 25 μm in diameter, and they may therefore qualify as nondestructive measurements. The result of this study has therefore paved the way for conducting further experiments with hydrogen-doped Ti samples and the technical implementation of quantitative micro-analysis of detrimental hydrogen impurity in Ti metal and its alloys, which is the ultimate goal of this study. PMID:20412619
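The zero-intercept calibration line and a detection limit can be sketched as below. The k = 3 criterion in the detection-limit helper is a common convention assumed here; the paper's exact criterion is not stated in the abstract.

```python
def calibration_slope_through_origin(concentrations, intensities):
    """Least-squares slope of a calibration line forced through zero,
    matching the zero-intercept line reported for the doped Ti samples."""
    sxy = sum(c * i for c, i in zip(concentrations, intensities))
    sxx = sum(c * c for c in concentrations)
    return sxy / sxx

def detection_limit(noise_sd, slope, k=3.0):
    """k-sigma detection limit estimate (k = 3 assumed, a common
    convention rather than the paper's stated formula)."""
    return k * noise_sd / slope
```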

Abdulmadjid, Syahrun Nur; Lie, Zener Sukra; Niki, Hideaki; Pardede, Marincan; Hedwig, Rinda; Lie, Tjung Jie; Jobiliong, Eric; Kurniawan, Koo Hendrik; Fukumoto, Ken-Ichi; Kagawa, Kiichiro; Tjia, May On



Quantitative Analysis of Flow Processes in a Sand Using Synchrotron-Based X-ray Microtomography  

SciTech Connect

Pore-scale multiphase flow experiments were developed to nondestructively visualize water flow in a sample of porous material using X-ray microtomography. The samples were exposed to similar boundary conditions as in a previous investigation, which examined the effect of initial flow rate on observed dynamic effects in the measured capillary pressure-saturation curves; a significantly higher residual saturation and higher capillary pressures were found when the sample was drained fast using a high air-phase pressure. Prior work applying the X-ray microtomography technique to pore-scale multiphase flow problems has been of a mostly qualitative nature and no experiments have been presented in the existing literature where a truly quantitative approach to investigating the multiphase flow process has been taken, including a thorough image-processing scheme. The tomographic images presented here show, both by qualitative comparison and quantitative analysis in the form of a nearest neighbor analysis, that the dynamic effects seen in previous experiments are likely due to the fast and preferential drainage of large pores in the sample. Once a continuous drained path has been established through the sample, further drainage of the remaining pores, which have been disconnected from the main flowing water continuum, is prevented.

Wildenschild, D.; Hopmans, J.W.; Rivers, M.L.; Kent, A.J.R. (OSU); (UCD); (UC)



Quantitative analysis of 3D mitral complex geometry using support vector machines.  


Quantitative analysis of 3D mitral complex geometry is crucial for a better understanding of its dysfunction. This work aims to characterize the geometry of the mitral complex and utilize a support-vector-machine-based classifier built on geometric parameters to support the diagnosis of congenital mitral regurgitation (MR). The method has the following steps: (1) description of the 3D geometry of the mitral complex and establishment of its local reference coordinate system, (2) calculation of geometric parameters and (3) analysis and classification of these parameters. The study included a control group of 20 normal young children (11 boys, 9 girls, mean age 5.96 ± 3.12 years) with normal mitral apparatus structure and 20 patients (9 boys, 11 girls, mean age 5.59 ± 3.30 years) with severe congenital MR. The average classification accuracy reached 90.0% in the present population, suggesting the possibility of exploring quantitative associations between mitral complex geometry and the mechanism of congenital MR. PMID:22735308
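As a minimal stand-in for the paper's support vector machine, a nearest-centroid classifier over hypothetical geometric parameter vectors illustrates the classification step (the feature values and group data below are invented):

```python
import math

def centroid(points):
    """Mean of a list of equal-length feature vectors."""
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(len(points[0]))]

def classify_mitral(x, normal_points, mr_points):
    """Nearest-centroid stand-in for the SVM: assign a vector of
    geometric parameters to whichever group's centroid is closer."""
    c_normal, c_mr = centroid(normal_points), centroid(mr_points)
    return "MR" if math.dist(x, c_mr) < math.dist(x, c_normal) else "normal"
```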

Song, Wei; Yang, Xin; Sun, Kun



Image registration and analysis for quantitative myocardial perfusion: application to dynamic circular cardiac CT  

NASA Astrophysics Data System (ADS)

Large area detector computed tomography systems with fast rotating gantries enable volumetric dynamic cardiac perfusion studies. Prospectively, ECG-triggered acquisitions limit the data acquisition to a predefined cardiac phase and thereby reduce x-ray dose and limit motion artefacts. Even in the case of highly accurate prospective triggering and stable heart rate, spatial misalignment of the cardiac volumes acquired and reconstructed per cardiac cycle may occur due to small motion pattern variations from cycle to cycle. These misalignments reduce the accuracy of the quantitative analysis of myocardial perfusion parameters on a per voxel basis. An image-based solution to this problem is elastic 3D image registration of dynamic volume sequences with variable contrast, as it is introduced in this contribution. After circular cone-beam CT reconstruction of cardiac volumes covering large areas of the myocardial tissue, the complete series is aligned with respect to a chosen reference volume. The results of the registration process and the perfusion analysis with and without registration are evaluated quantitatively in this paper. The spatial alignment leads to improved quantification of myocardial perfusion for three different pig data sets.

Isola, A. A.; Schmitt, H.; van Stevendaal, U.; Begemann, P. G.; Coulon, P.; Boussel, L.; Grass, M.



Quantitative determination of total lipid hydroperoxides by a flow injection analysis system.  


A flow injection analysis (FIA) system coupled with a fluorescence detection system using diphenyl-1-pyrenylphosphine (DPPP) was developed as a highly sensitive and reproducible quantitative method of total lipid hydroperoxide analysis. Fluorescence analysis of DPPP oxide generated by the reaction of lipid hydroperoxides with DPPP enabled a quantitative determination of the total amount of lipid hydroperoxides. Use of 1-myristoyl-2-(12-((7-nitro-2-1,3-benzoxadiazol-4-yl)amino) dodecanoyl)-sn-glycero-3-phosphocholine as the internal standard improved the sensitivity and reproducibility of the analysis. Several commercially available edible oils, including soybean oil, rapeseed oil, olive oil, corn oil, canola oil, safflower oil, mixed vegetable oils, cod liver oil, and sardine oil, were analyzed by the FIA system for the quantitative determination of total lipid hydroperoxides. The minimal amounts of sample oils required were 50 μg of soybean oil (PV = 2.71 meq/kg) and 3 mg of sardine oil (PV = 0.38 meq/kg) for a single injection. Thus, sensitivity was sufficient for the detection of a small amount and/or low concentration of hydroperoxides in common edible oils. The recovery of sample oils in the FIA system ranged between 87.2±2.6% and 102±5.1% when PV ranged between 0.38 and 58.8 meq/kg. The CVs in the analyses of soybean oil (PV = 3.25 meq/kg), cod liver oil (PV = 6.71 meq/kg), rapeseed oil (PV = 12.3 meq/kg), and sardine oil (PV = 63.8 meq/kg) were 4.31, 5.66, 8.27, and 11.2%, respectively, demonstrating sufficient reproducibility of the FIA system for the determination of lipid hydroperoxides. The squared correlation coefficient (r²) between the FIA system and the official AOCS iodometric titration method in a linear regression analysis was estimated at 0.9976 within the range of 0.35-77.8 meq/kg of PV (n = 42). Thus, the FIA system provided satisfactory detection limits, recovery, and reproducibility.
The FIA system was further applied to evaluate changes in the total amounts of lipid hydroperoxides in fish muscle stored on ice. PMID:15884769
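The recovery and CV figures reported above follow standard definitions, which can be sketched as:

```python
import statistics

def percent_cv(values):
    """Percent coefficient of variation (sample SD / mean x 100), the
    reproducibility figure reported for repeated oil analyses."""
    return 100.0 * statistics.stdev(values) / statistics.fmean(values)

def percent_recovery(measured, spiked):
    """Recovery of a known spiked amount, as a percentage."""
    return 100.0 * measured / spiked
```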

Sohn, Jeong-Ho; Taki, Yusuke; Ushio, Hideki; Ohshima, Toshiaki



Efficiency calibration of an HPGe X-ray detector for quantitative PIXE analysis  

NASA Astrophysics Data System (ADS)

Particle Induced X-ray Emission (PIXE) is an analytical technique which provides reliable and accurate quantitative results without the need for standards when the efficiency of the X-ray detection system is calibrated. The ion beam microprobe of the Ion Beam Modification and Analysis Laboratory at the University of North Texas is equipped with a 100 mm² high purity germanium X-ray detector (Canberra GUL0110 Ultra-LEGe). In order to calibrate the efficiency of the detector for standardless PIXE analysis, we measured the X-ray yield of a set of commercially available X-ray fluorescence standards. The set contained elements from low atomic number Z = 11 (sodium) to higher atomic numbers to cover the X-ray energy region from 1.25 keV to about 20 keV, where the detector is most efficient. The effective charge was obtained from the proton backscattering yield of a calibrated particle detector.
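Efficiency calibration of this kind reduces to a measured-to-expected ratio at each standard's X-ray energy plus interpolation in between. Linear interpolation is a simplifying assumption here; smooth parametric fits over energy are also common.

```python
def interpolate_efficiency(energy_kev, cal_points):
    """Detector efficiency at an arbitrary X-ray energy by linear
    interpolation between calibration points measured with standards.
    cal_points: (energy_keV, efficiency) pairs sorted by energy."""
    for (e0, f0), (e1, f1) in zip(cal_points, cal_points[1:]):
        if e0 <= energy_kev <= e1:
            t = (energy_kev - e0) / (e1 - e0)
            return f0 + t * (f1 - f0)
    raise ValueError("energy outside the calibrated range")
```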

Mulware, Stephen J.; Baxley, Jacob D.; Rout, Bibhudutta; Reinert, Tilo



Quantitative RT-PCR gene expression analysis of laser microdissected tissue samples  

PubMed Central

Quantitative reverse transcription polymerase chain reaction (qRT-PCR) is a valuable tool for measuring gene expression in biological samples. However, unique challenges are encountered when studies are performed on cells microdissected from tissues derived from animal models or the clinic, including specimen related issues, variability of RNA template quality and quantity, and normalization. qRT-PCR using small amounts of mRNA derived from dissected cell populations requires adaptation of standard methods to allow meaningful comparisons across sample sets. The protocol described here presents the rationale, technical steps, normalization strategy, and data analysis necessary to generate reliable gene expression measurements of transcripts from dissected samples. The entire protocol from tissue microdissection through qRT-PCR analysis requires approximately 16 hours. PMID:19478806

Erickson, Heidi S.; Albert, Paul S.; Gillespie, John W.; Rodriguez-Canales, Jaime; Linehan, W. Marston; Pinto, Peter A.; Chuaqui, Rodrigo F.; Emmert-Buck, Michael R.



Quantitative PCR Analysis of Molds in the Dust from Homes of Asthmatic Children in North Carolina  

SciTech Connect

The vacuum cleaner bag (VCB) dust from the homes of 19 asthmatic children in North Carolina (NC) was analyzed by mold-specific quantitative PCR. These results were compared to the analysis of the VCB dust from 157 homes in the HUD “American Healthy Home Survey” of homes in the US. The American Relative Moldiness Index (ARMI) was calculated for each of the homes. The mean and standard deviation (SD) of the ARMI values in the homes of the NC asthmatic children was 11.0 (5.3), compared to the HUD survey VCB ARMI value mean and SD of 6.6 (4.4). The median ARMI value was significantly higher (p < 0.001) in the asthmatic children's homes. The molds Chaetomium globosum and Eurotium amstelodami were the primary species making the ARMI values higher in the NC homes. Vacuum cleaner bag dust sampling may be a less expensive but still useful method of home mold analysis.
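The ARMI formula itself is not given in the abstract. A sketch modeled on the related, published ERMI (sum of log10 concentrations of water-damage-associated molds minus that of common outdoor molds) would look like the following; the species grouping and exact ARMI formula are assumptions.

```python
import math

def relative_moldiness_index(group1_conc, group2_conc):
    """ERMI/ARMI-style index sketch: sum of log10 concentrations of
    water-damage-associated molds (group 1) minus the same sum for
    common outdoor molds (group 2)."""
    return (sum(math.log10(c) for c in group1_conc)
            - sum(math.log10(c) for c in group2_conc))
```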

Vesper, Stephen J.; McKinstry, Craig A.; Ashley, Peter; Haugland, Richard A.; Yeatts, Karin; Bradham, Karen; Svendsen, Eric



Estimation of crack and damage progression in concrete by quantitative acoustic emission analysis  

SciTech Connect

The kinematics of cracking can be represented by the moment tensor. To distinguish moment tensor components from acoustic emission waveforms, the SiGMA (simplified Green's functions for moment tensor analysis) procedure was developed. By applying the procedure to bending tests of notched beams, cracks in the fracture process zone of cementitious materials can be identified by kinematic means. In addition to cracks, estimation of the damage level in structural concrete is also conducted, based on the acoustic emission activity of a concrete sample under compression. Depending on the damage resulting from existing microcracks, acoustic emission generation behavior is quantitatively estimated by rate process analysis. Damage mechanics is introduced to quantify the degree of damage. Determining the current damage level using acoustic emission without information on undamaged concrete is attempted by correlating the damage value with the rate process.

Ohtsu, Masayasu [Kumamoto Univ. (Japan). Dept. of Civil Engineering and Architecture]



Quantitative analysis of fitness and genetic interactions in yeast on a genome scale.  


Global quantitative analysis of genetic interactions is a powerful approach for deciphering the roles of genes and mapping functional relationships among pathways. Using colony size as a proxy for fitness, we developed a method for measuring fitness-based genetic interactions from high-density arrays of yeast double mutants generated by synthetic genetic array (SGA) analysis. We identified several experimental sources of systematic variation and developed normalization strategies to obtain accurate single- and double-mutant fitness measurements, which rival the accuracy of other high-resolution studies. We applied the SGA score to examine the relationship between physical and genetic interaction networks, and we found that positive genetic interactions connect across functionally distinct protein complexes revealing a network of genetic suppression among loss-of-function alleles. PMID:21076421
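The fitness-based interaction score in SGA analysis is conventionally the deviation of the double mutant's fitness from a multiplicative expectation, which can be sketched as:

```python
def interaction_score(w_ab, w_a, w_b):
    """SGA-style genetic interaction score: deviation of the observed
    double-mutant fitness from the multiplicative expectation.
    Negative scores indicate synthetic sick/lethal interactions;
    positive scores indicate suppression."""
    return w_ab - w_a * w_b
```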

Baryshnikova, Anastasia; Costanzo, Michael; Kim, Yungil; Ding, Huiming; Koh, Judice; Toufighi, Kiana; Youn, Ji-Young; Ou, Jiongwen; San Luis, Bryan-Joseph; Bandyopadhyay, Sunayan; Hibbs, Matthew; Hess, David; Gingras, Anne-Claude; Bader, Gary D; Troyanskaya, Olga G; Brown, Grant W; Andrews, Brenda; Boone, Charles; Myers, Chad L



[Study on temperature correctional models of quantitative analysis with near infrared spectroscopy].  


The effect of environment temperature on quantitative analysis by near infrared spectroscopy was studied. The temperature correction model was calibrated with 45 wheat samples at different environment temperatures, with the temperature as an external variable. The constant temperature model was calibrated with 45 wheat samples at the same temperature. The predicted results of the two models for the protein contents of wheat samples at different temperatures were compared. The results showed that the mean standard error of prediction (SEP) of the temperature correction model was 0.333, but the SEP of the constant temperature (22 degrees C) model increased as the temperature difference enlarged, rising to 0.602 when the model was used at 4 degrees C. It is suggested that the temperature correction model improves the analysis precision. PMID:16201365
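The SEP figure of merit can be sketched as a root-mean-square prediction error. The bias-uncorrected form is assumed here; the paper's exact formula may use a bias-corrected variant.

```python
import math

def sep(predicted, reference):
    """Standard error of prediction, bias-uncorrected RMS form:
    the discrepancy between model predictions and reference values."""
    n = len(predicted)
    return math.sqrt(sum((p - r) ** 2 for p, r in zip(predicted, reference)) / n)
```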

Zhang, Jun; Chen, Hua-cai; Chen, Xing-dan



Screening hypochromism (sieve effect) in red blood cells: a quantitative analysis  

PubMed Central

Multiwavelength UV-visible spectroscopy, Kramers-Kronig analysis, and several other experimental and theoretical tools have been applied over the last several decades to fathom absorption and scattering of light by suspensions of micron-sized pigmented particles, including red blood cells, but a satisfactory quantitative analysis of the difference between the absorption spectra of suspension of intact and lysed red blood cells is still lacking. It is stressed that such a comparison is meaningful only if the pertinent spectra are free from, or have been corrected for, scattering losses, and it is shown that Duysens’ theory can, whereas that of Vekshin cannot, account satisfactorily for the observed hypochromism of suspensions of red blood cells. PMID:24761307
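Duysens-type flattening can be illustrated with a single-layer sketch: part of the beam traverses absorbing particles and part passes between them, which lowers the apparent absorbance relative to the lysed (homogeneous) case. The single-layer geometry is a simplification of the full theory.

```python
import math

def suspension_absorbance(a_particle, covered_fraction):
    """Single-layer sieve-effect sketch: a fraction `covered_fraction`
    of the beam crosses particles of internal absorbance `a_particle`;
    the rest passes between them unattenuated. The resulting absorbance
    is flattened relative to the same pigment dissolved uniformly."""
    transmittance = (1.0 - covered_fraction) + covered_fraction * 10.0 ** (-a_particle)
    return -math.log10(transmittance)
```

For example, with half the beam covered by particles of internal absorbance 1.0, the observed absorbance falls below the 0.5 expected for the equivalent homogeneous (lysed) solution, which is the hypochromism being quantified.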

Razi Naqvi, K.



Quantitative Fourier transform infrared analysis of gas phase cigarette smoke and other gas mixtures  

SciTech Connect

A new method for the analysis of selected components in complex gas mixtures has been developed utilizing a relatively inexpensive Fourier transform infrared spectrometer and a continuous flow gas cell. The method was used to monitor nitric oxide and nitrogen dioxide concentrations in cigarette smoke with time. Using multivariate least-square regression analysis, it is possible to simultaneously quantitate both NO and NO{sub 2}, even in the presence of overlapping peaks. Using this method, the oxidation of nitric oxide in the presence of isoprene in cigarette smoke and in a model system was followed with time. The method also can be applied to other compounds in smoke or to any other gaseous mixture.
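The multivariate least-squares step rests on Beer-Lambert additivity: the mixture spectrum is a linear combination of pure-component spectra, so overlapping NO and NO2 bands can still be resolved. A sketch with synthetic Gaussian bands (not real FTIR data):

```python
import numpy as np

# The mixture spectrum is modeled as K @ c, where the columns of K are
# pure-component spectra and c holds concentrations; least squares
# recovers c even when the bands overlap. All spectra are synthetic.
wavenumbers = np.linspace(0.0, 1.0, 50)
pure_no = np.exp(-((wavenumbers - 0.3) ** 2) / 0.01)   # synthetic "NO" band
pure_no2 = np.exp(-((wavenumbers - 0.4) ** 2) / 0.01)  # overlapping "NO2" band
K = np.column_stack([pure_no, pure_no2])               # pure-component matrix

true_conc = np.array([2.0, 0.5])
mixture = K @ true_conc                                # simulated mixture spectrum

est_conc, *_ = np.linalg.lstsq(K, mixture, rcond=None)
```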

Cueto, R.; Church, D.F.; Pryor, W.A. (Louisiana State Univ., Baton Rouge (USA))



The Recent Progress in Quantitative Medical Image Analysis for Computer Aided Diagnosis Systems  

PubMed Central

Computer-aided diagnosis (CAD) has become one of the major research subjects in medical imaging and diagnostic radiology. Many different CAD schemes are being developed for use in the detection and/or characterization of various lesions found through various types of medical imaging. These imaging technologies employ conventional projection radiography, computed tomography, magnetic resonance imaging, ultrasonography, etc. In order to achieve a high performance level for a computerized diagnosis, it is important to employ effective image analysis techniques in the major steps of a CAD scheme. The main objective of this review is to attempt to introduce the diverse methods used for quantitative image analysis, and to provide a guide for clinicians. PMID:22084808

Kim, Tae-Yun; Son, Jaebum



Clean Lakes Program Phase 2 Project. Report for Lake Le-Aqua-Na Stephenson County, Illinois.  

National Technical Information Service (NTIS)

Lake Le-Aqua-Na is a 39.5 acre (16.0 ha) recreational impoundment located in Le-Aqua-Na State Park, Stephenson County, Illinois. The lake is owned and managed by the Illinois Department of Conservation (IDOC). A Phase I diagnostic/feasibility study, condu...




SciTech Connect

Mobile modular installation ''Aqua-Express'' is a liquid low level and intermediate level radioactive waste (LL&ILRW) treatment facility, intended for small research centers and other organizations whose activity generates small quantities (up to 500 m3/year) of low and intermediate level radioactive waste water. Mobile modular installation ''Aqua-Express'' has the following features: (1) filtration, sorption and ultrafiltration units are used for LL&ILRW purification; (2) installation ''Aqua-Express'' consists of a cascade of three autonomous aqueous liquid waste-purifying installations; (3) installation ''Aqua-Express'' is mobile; it can be transported by car, train, ship, or plane, as well as placed in a standard transport (sea or railway) container; (4) installation ''Aqua-Express'' does not include any technological equipment for conditioning the secondary radioactive waste. The productivity of the installation ''Aqua-Express'' in purified water depends on the composition of the initial liquid waste and reaches up to 300 l/h. This report describes the design of the installation ''Aqua-Express'', the theory of LRW purification in it, and some results of its use in cleaning real radioactive waters at the State unitary enterprise MosNPO ''Radon''.

Karlin, Yurii; Dmitriev, Sergey; Iljin, Vadim; Ojovan, Mihail; Burcl, Rudolf



OP-0001-W3-F2 GTEx Discrepancy Checklist for the Aqua Kit - Postmortem

GTEx Discrepancy Checklist for Aqua Kit – Postmortem OP-0001-W3-F2 VER. 1.2.0 Effective Date: 04/19/2012 Page 1 of 1 Instruction: Use one form for each aqua box received Condition of kit received ☐ Acceptable ☐ Damaged, usable ☐ Damaged, NOT usable:


OP-0001-W3-F1 GTEx Discrepancy Checklist for the Aqua Kit - Surgical

GTEx Discrepancy Checklist for Aqua Kit – Surgical OP-0001-W3-F1 VER. 1.2.0 Effective Date: 04/19/2012 Page 1 of 1 Instruction: Use one form for each aqua kit received Condition of kit received ☐ Acceptable


Network component analysis provides quantitative insights on an Arabidopsis transcription factor-gene regulatory network  

PubMed Central

Background Gene regulatory networks (GRNs) are models of molecule-gene interactions instrumental in the coordination of gene expression. Transcription factor (TF)-GRNs are an important subset of GRNs that characterize gene expression as the effect of TFs acting on their target genes. Although such networks can qualitatively summarize TF-gene interactions, it is highly desirable to quantitatively determine the strengths of the interactions in a TF-GRN as well as the magnitudes of TF activities. To our knowledge, such analysis is rare in plant biology. A computational methodology developed for this purpose is network component analysis (NCA), which has been used for studying large-scale microbial TF-GRNs to obtain nontrivial, mechanistic insights. In this work, we employed NCA to quantitatively analyze a plant TF-GRN important in floral development using available regulatory information from AGRIS, by processing previously reported gene expression data from four shoot apical meristem cell types. Results The NCA model satisfactorily accounted for gene expression measurements in a TF-GRN of seven TFs (LFY, AG, SEPALLATA3 [SEP3], AP2, AGL15, HY5 and AP3/PI) and 55 genes. NCA found strong interactions between certain TF-gene pairs including LFY→MYB17, AG→CRC, AP2→RD20, AGL15→RAV2 and HY5→HLH1, and the direction of the interaction (activation or repression) for some AGL15 targets for which this information was not previously available. The activity trends of four TFs (LFY, AG, HY5 and AP3/PI) as deduced by NCA correlated well with the changes in expression levels of the genes encoding these TFs across all four cell types; such a correlation was not observed for SEP3, AP2 and AGL15. Conclusions For the first time, we have reported the use of NCA to quantitatively analyze a plant TF-GRN important in floral development for obtaining nontrivial information about connectivity strengths between TFs and their target genes as well as TF activity.
However, since NCA relies on documented connectivity information about the underlying TF-GRN, it is currently limited in its application to larger plant networks because of the lack of documented connectivities. In the future, the identification of interactions between plant TFs and their target genes on a genome scale would allow the use of NCA to provide quantitative regulatory information about plant TF-GRNs, leading to improved insights on cellular regulatory programs. PMID:24228871
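The NCA decomposition models log-expression as E = A·P, with the zero pattern of A fixed by the documented TF-gene links. A minimal sketch of one least-squares update for the TF activities P with A held fixed, on synthetic data (full NCA alternates updates of A and P under the connectivity constraints):

```python
import numpy as np

# Log-expression E (genes x samples) is modeled as E = A @ P, where
# A (genes x TFs) carries connection strengths with a fixed zero
# pattern and P holds TF activities. One least-squares step for P.
rng = np.random.default_rng(1)
A = np.array([[1.0, 0.0],
              [0.5, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])            # fixed connectivity, 4 genes x 2 TFs
P_true = rng.normal(size=(2, 4))     # TF activities across 4 cell types
E = A @ P_true                       # synthetic expression matrix

P_est, *_ = np.linalg.lstsq(A, E, rcond=None)
```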



Twenty-five years of quantitative PCR for gene expression analysis.  


Following its invention 25 years ago, PCR has been adapted for numerous molecular biology applications. Gene expression analysis by reverse-transcription quantitative PCR (RT-qPCR) has been a key enabling technology of the post-genome era. Since the founding of BioTechniques, this journal has been a resource for the improvements in qPCR technology, experimental design, and data analysis. qPCR and, more specifically, real-time qPCR has become a routine and robust approach for measuring the expression of genes of interest, validating microarray experiments, and monitoring biomarkers. The use of real-time qPCR has nearly supplanted other approaches (e.g., Northern blotting, RNase protection assays). This review examines the current state of qPCR for gene expression analysis now that the method has reached a mature stage of development and implementation. Specifically, the different fluorescent reporter technologies of real-time qPCR are discussed as well as the selection of endogenous controls. The conceptual framework for data analysis methods is also presented to demystify these analysis techniques. The future of qPCR remains bright as the technology becomes more rapid, cost-effective, easier to use, and capable of higher throughput. PMID:18474036
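The routine relative-quantification arithmetic behind real-time qPCR is the Livak 2^-ΔΔCt method, which normalizes the target gene to an endogenous control and compares conditions:

```python
def fold_change(ct_target_treated, ct_ref_treated, ct_target_control, ct_ref_control):
    """Livak 2^-ddCt relative quantification: the target gene's Ct is
    normalized to an endogenous control in each sample, then the
    treated sample is compared with the control sample."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    return 2.0 ** (-(d_ct_treated - d_ct_control))
```

Note that the method assumes roughly 100% amplification efficiency for both target and reference; efficiency-corrected variants relax this.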

VanGuilder, Heather D; Vrana, Kent E; Freeman, Willard M



Targeted three-dimensional liquid chromatography: a versatile tool for quantitative trace analysis in complex matrices.  


Targeted multidimensional liquid chromatography (MDLC), commonly referred to as 'coupled-column' or 'heartcutting', has been used extensively since the 1970s for analysis of low concentration constituents in complex biological and environmental samples. A primary benefit of adding additional dimensions of separation to conventional HPLC separations is that the additional resolving power provided by the added dimensions can greatly simplify method development for complex samples. Despite the long history of targeted MDLC, nearly all published reports involve two-dimensional methods, and very few have explored the benefits of adding a third dimension of separation. In this work we capitalize on recent advances in reversed-phase HPLC to construct a three-dimensional HPLC system for targeted analysis built on three very different reversed-phase columns. Using statistical peak overlap theory and one of the most recent models of reversed-phase selectivity we use simulations to show the potential benefit of adding a third dimension to a MDLC system. We then demonstrate this advantage experimentally by developing targeted methods for the analysis of a variety of broadly relevant molecules in different sample matrices including urban wastewater treatment effluent, human urine, and river water. We find in each case that excellent separations of the target compounds from the sample matrix are obtained using one set of very similar separation conditions for all of the target compound/sample matrix combinations, thereby significantly reducing the normally tedious method development process. A rigorous quantitative comparison of this approach to conventional 1DLC-MS/MS also shows that targeted 3DLC with UV detection is quantitatively accurate for the target compounds studied, with method detection limits in the low parts-per-trillion range of concentrations. 
We believe this work represents a first step toward the development of a targeted 3D analysis system that will be more effective than previous 2D separations as a tool for the rapid development of robust methods for quantitation of low concentration constituents in complex mixtures. PMID:21047638

Simpkins, Scott W; Bedard, Jeremy W; Groskreutz, Stephen R; Swenson, Michael M; Liskutin, Tomas E; Stoll, Dwight R



Automated quantitative spectroscopic analysis combining background subtraction, cosmic ray removal, and peak fitting.  


An integrated concept for post-acquisition spectrum analysis was developed for in-line (real-time) and off-line applications that preserves absolute spectral quantification; after the initializing parameter setup, only minimal user intervention is required. This spectral evaluation suite is composed of a sequence of tasks specifically addressing cosmic ray removal, background subtraction, and peak analysis and fitting, together with the treatment of two-dimensional charge-coupled device array data. One may use any of the individual steps on their own, or may exclude steps from the chain if so desired. For the background treatment, the canonical rolling-circle filter (RCF) algorithm was adopted, but it was coupled with a Savitzky-Golay filtering step on the locus-array generated from a single RCF pass. This novel only-two-parameter procedure vastly improves on the RCF's deficiency to overestimate the baseline level in spectra with broad peak features. The peak analysis routine developed here is an only-two-parameter (amplitude and position) fitting algorithm that relies on numerical line shape profiles rather than on analytical functions. The overall analysis chain was programmed in National Instrument's LabVIEW; this software allows for easy incorporation of this spectrum analysis suite into any LabVIEW-managed instrument control, data-acquisition environment, or both. The strength of the individual tasks and the integrated program sequence are demonstrated for the analysis of a wide range of (although not necessarily limited to) Raman spectra of varying complexity and exhibiting nonanalytical line profiles. In comparison to other analysis algorithms and functions, our new approach for background subtraction, peak analysis, and fitting returned vastly improved quantitative results, even for "hidden" details in the spectra, in particular, for nonanalytical line profiles. All software is available for download. PMID:23876734
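The background-treatment idea described above (a rolling-circle-filter pass whose locus array is then Savitzky-Golay filtered) can be approximated in a few lines. In this sketch a morphological opening (rolling minimum then rolling maximum) stands in for the rolling-circle filter, so it is an illustrative approximation rather than the authors' algorithm; the parameter names are assumptions.

```python
import numpy as np
from scipy.ndimage import minimum_filter1d, maximum_filter1d
from scipy.signal import savgol_filter

def baseline(spectrum, radius=50, sg_window=31, sg_order=3):
    """Crude spectral baseline estimate: a morphological opening
    (erosion followed by dilation, standing in for the rolling-circle
    filter) suppresses narrow peaks, and a Savitzky-Golay pass then
    smooths the resulting locus array, as in the two-step scheme above."""
    size = 2 * radius + 1
    locus = maximum_filter1d(minimum_filter1d(spectrum, size), size)
    return savgol_filter(locus, sg_window, sg_order)
```

Subtracting `baseline(spectrum)` from the raw trace leaves the peaks on a near-flat background, provided the structuring-element radius exceeds the peak widths.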

James, Timothy M; Schlösser, Magnus; Lewis, Richard J; Fischer, Sebastian; Bornschein, Beate; Telle, Helmut H



A Novel Image-Analysis Toolbox Enabling Quantitative Analysis of Root System Architecture  

PubMed Central

We present in this paper a novel, semiautomated image-analysis software to streamline the quantitative analysis of root growth and architecture of complex root systems. The software combines a vectorial representation of root objects with a powerful tracing algorithm that accommodates a wide range of image sources and quality. The root system is treated as a collection of roots (possibly connected) that are individually represented as parsimonious sets of connected segments. Pixel coordinates and gray level are therefore turned into intuitive biological attributes such as segment diameter and orientation as well as distance to any other segment or topological position. As a consequence, user interaction and data analysis directly operate on biological entities (roots) and are not hampered by the spatially discrete, pixel-based nature of the original image. The software supports a sampling-based analysis of root system images, in which detailed information is collected on a limited number of roots selected by the user according to specific research requirements. The use of the software is illustrated with a time-lapse analysis of cluster root formation in lupin (Lupinus albus) and an architectural analysis of the maize (Zea mays) root system. The software, SmartRoot, is an operating system-independent freeware based on ImageJ and relies on cross-platform standards for communication with data-analysis software. PMID:21771915

Lobet, Guillaume; Pages, Loic; Draye, Xavier



Assessment of ERCC1 and XPF Protein Expression Using Quantitative Immunohistochemistry in Nasopharyngeal Carcinoma Patients Undergoing Curative Intent Treatment  

SciTech Connect

Purpose: We sought to evaluate the prognostic/predictive value of ERCC1 and XPF in patients with nonmetastatic nasopharyngeal carcinoma (NPC) treated with curative intent. Methods and Materials: ERCC1 and XPF protein expression was evaluated by immunofluorescence combined with automated quantitative analysis (AQUA) using the FL297 and 3F2 antibodies, respectively. ERCC1 and XPF protein expression levels were correlated with clinical outcomes. Results: Patient characteristics were as follows: mean age 52 years (range, 18-85 years), 67% male, 72% Karnofsky performance status (KPS) ≥90%, World Health Organization (WHO) type 1/2/3 = 12%/28%/60%, stage III/IV 65%. With a median follow-up time of 50 months (range, 2.9 to 120 months), the 5-year overall survival (OS) was 70.8%. Median standardized nuclear AQUA scores were used as cutpoints for ERCC1 (n=138) and XPF (n=130) protein expression. Agreement between dichotomized ERCC1 and XPF scores was high at 79.4% (kappa = 0.587, P<.001). Neither biomarker predicted locoregional recurrence, disease-free survival (DFS), or OS after adjustment for age and KPS, irrespective of stratification by stage, WHO type, or treatment. Conclusions: Neither ERCC1 nor XPF, analyzed by quantitative immunohistochemistry using the FL297 and 3F2 antibodies, was prognostic or predictive in this cohort of NPC patients.

Jagdis, Amanda [Department of Internal Medicine, Faculty of Medicine, University of British Columbia, Vancouver, British Columbia (Canada)] [Department of Internal Medicine, Faculty of Medicine, University of British Columbia, Vancouver, British Columbia (Canada); Phan, Tien [Department of Radiation Oncology, Tom Baker Cancer Centre, Calgary, Alberta (Canada) [Department of Radiation Oncology, Tom Baker Cancer Centre, Calgary, Alberta (Canada); Faculty of Medicine, University of Calgary, Calgary, Alberta (Canada); Klimowicz, Alexander C. [Department of Oncology, Tom Baker Cancer Centre, Calgary, Alberta (Canada) [Department of Oncology, Tom Baker Cancer Centre, Calgary, Alberta (Canada); Faculty of Medicine, University of Calgary, Calgary, Alberta (Canada); Laskin, Janessa J. [Department of Medical Oncology, British Columbia Cancer Agency–Vancouver, Vancouver, British Columbia (Canada) [Department of Medical Oncology, British Columbia Cancer Agency–Vancouver, Vancouver, British Columbia (Canada); Faculty of Medicine, University of British Columbia, Vancouver, British Columbia (Canada); Lau, Harold Y. [Department of Radiation Oncology, Tom Baker Cancer Centre, Calgary, Alberta (Canada) [Department of Radiation Oncology, Tom Baker Cancer Centre, Calgary, Alberta (Canada); Faculty of Medicine, University of Calgary, Calgary, Alberta (Canada); Petrillo, Stephanie K. [Functional Tissue Imaging Unit, Translational Research Laboratory, Tom Baker Cancer Centre, Calgary, Alberta (Canada)] [Functional Tissue Imaging Unit, Translational Research Laboratory, Tom Baker Cancer Centre, Calgary, Alberta (Canada); Siever, Jodi E. [Department of Biostatistics, Public Health Innovation and Decision Support Population and Public Health, Alberta Health Services, Calgary, Alberta (Canada)] [Department of Biostatistics, Public Health Innovation and Decision Support Population and Public Health, Alberta Health Services, Calgary, Alberta (Canada); Thomson, Thomas A. 
[Department of Pathology, British Columbia Cancer Agency–Vancouver, Vancouver, British Columbia (Canada) [Department of Pathology, British Columbia Cancer Agency–Vancouver, Vancouver, British Columbia (Canada); Faculty of Medicine, University of British Columbia, Vancouver, British Columbia (Canada); Magliocco, Anthony M. [Department of Pathology, Tom Baker Cancer Centre, Calgary, Alberta (Canada) [Department of Pathology, Tom Baker Cancer Centre, Calgary, Alberta (Canada); Faculty of Medicine, University of Calgary, Calgary, Alberta (Canada); Hao, Desirée, E-mail: [Department of Medical Oncology, Tom Baker Cancer Centre, Calgary, Alberta (Canada) [Department of Medical Oncology, Tom Baker Cancer Centre, Calgary, Alberta (Canada); Faculty of Medicine, University of Calgary, Calgary, Alberta (Canada)



Aqua de Ney, California, a spring of unique chemical character  

USGS Publications Warehouse

The chemistry of water of Aqua de Ney, a cold spring of unusual character located in Siskiyou County, Calif., has been re-examined as part of a study of the relation of water chemistry to rock environment. The water has a pH of 11.6 and a silica content of 4000 parts per million (p.p.m.), the highest values known to occur in natural ground waters. The rocks exposed nearby consist of two volcanic sequences, one predominantly basaltic in composition, the other highly siliceous. Neither these rocks nor the sedimentary and igneous rocks presumed to underlie the area at depth seem to offer explanation of the unusual mineralization which includes 240 p.p.m. of boron, 1000 p.p.m. of sulphide (as H2S), and 148 p.p.m. of ammonia nitrogen (as NH4) in a water that is predominantly sodium chloride and sodium carbonate in character. By analogy, it is assumed that water from Aqua de Ney is the product of an initial mixture of connate sea water with a calcium magnesium sulphate water. It is postulated that ion exchange has increased the content of sodium and reduced that of calcium and magnesium, and that sulphate reduction has brought about the high alkalinity, high pH, and high content of sulphide. The large silica value is explained as the result of solution of silica by water having the high pH observed. © 1961.

Feth, J.H.; Rogers, S.M.; Roberson, C.E.



Quantitative proteomic analysis reveals potential diagnostic markers and pathways involved in pathogenesis of renal cell carcinoma  

PubMed Central

There are no serum biomarkers for the accurate diagnosis of clear cell renal cell carcinoma (ccRCC). Diagnosis and decision of nephrectomy rely on imaging, which is not always accurate. Non-invasive diagnostic biomarkers are urgently required. In this study, we performed quantitative proteomics analysis on a total of 199 patients, including 30 matched pairs of normal kidney and ccRCC, using isobaric tags for relative and absolute quantitation (iTRAQ) labeling and LC-MS/MS analysis to identify differentially expressed proteins. We found 55 proteins significantly dysregulated in ccRCC compared to normal kidney tissue. 54 were previously reported to play a role in carcinogenesis, and 39 are secreted proteins. Dysregulation of alpha-enolase (ENO1), L-lactate dehydrogenase A chain (LDHA), heat shock protein beta-1 (HSPB1/Hsp27), and 10 kDa heat shock protein, mitochondrial (HSPE1) was confirmed in two independent sets of patients by western blot and immunohistochemistry. Pathway analysis, validated by PCR, showed glucose metabolism is altered in ccRCC compared to normal kidney tissue. In addition, we examined the utility of Hsp27 as a biomarker in serum and urine. In ccRCC patients, Hsp27 was elevated in the urine and serum, and high serum Hsp27 was associated with high-grade (Grade 3–4) tumors. These data together identify potential diagnostic biomarkers for ccRCC and shed new light on the molecular mechanisms that are dysregulated and contribute to the pathogenesis of ccRCC. Hsp27 is a promising diagnostic marker for ccRCC although further large-scale studies are required. Also, molecular profiling may help pave the road to the discovery of new therapies. PMID:24504108

DeSouza, Leroi V.; Krakovska-Yutz, Olga; Metias, Shereen; Romaschin, Alexander D.; Honey, R. John; Stewart, Robert; Pace, Kenneth; Lee, Jason; Jewett, Michael AS; Bjarnason, Georg A.; Siu, K.W. Michael; Yousef, George M.



Quantitative analysis of chromosome in situ hybridization signal in paraffin-embedded tissue sections  

SciTech Connect

Interphase cytogenetic analysis using chromosome-specific probes is increasingly being used to detect chromosomal aberrations on paraffin-embedded tissue sections. However, quantitative analysis of the hybridization signal is confounded by the nuclear slicing that occurs during sectioning. To determine the sensitivity and accuracy of chromosome in situ hybridization for detecting numerical chromosomal aberrations on paraffin-embedded sections, in situ hybridization was performed on sections derived from mixtures of cell populations with known frequencies of numerical chromosomal aberrations, and the Chromosome Index (CI) was calculated (i.e., total number of signal spots/number of nuclei counted) as a quantitative measure of chromosome copy number. The presence of 25% or more monosomic or tetrasomic cells in a given population was easily detected as a change in CI (P < 0.05). Lower degrees of polysomy could be detected as a small percentage of nuclear fragments with >2 signal spots. The CI was not significantly influenced by a change in section thickness from 4 to 8 µm, by an increase in cell size from 478 to 986 µm³, or by the choice of detection method (fluorescence vs. conventional bright-field microscopy). Comparative analysis of touch preparations and tissue sections from the corresponding breast tumors showed that CI accurately reflects the average copy number of chromosomes in intact nuclei and may actually be superior to in situ hybridization on whole nuclei for the detection of numerical chromosomal changes in defined histologic areas. This method is thus a sensitive and accurate means of studying genetic changes in premalignant and malignant tissue, and of assessing the genetic changes associated with specific phenotypes. 27 refs., 8 figs., 3 tabs.
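The Chromosome Index defined above is straightforward to compute from per-nucleus spot counts; a minimal sketch (function name illustrative):

```python
def chromosome_index(spot_counts):
    """Chromosome Index (CI): total number of hybridization signal spots
    divided by the number of nuclei (or nuclear slices) counted."""
    return sum(spot_counts) / len(spot_counts)
```

For instance, a purely disomic population scores CI = 2.0, while a mixture containing 25% monosomic nuclei shifts it to 0.25 × 1 + 0.75 × 2 = 1.75, which is the kind of change the study detects statistically.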

Dhingra, K.; Emami, K.; Hortobagyi, G.N. [Univ. of Texas M.D. Anderson Cancer Center, Houston, TX (United States)] [and others



Quantitative meta-analysis on the effects of defaunation of the rumen on growth, intake and digestion in ruminants  

Microsoft Academic Search

A quantitative meta-analysis was applied on 90 publications and 169 comparisons dealing with defaunation of the rumen (removal of protozoa from the rumen) in order to point out the major quantitative effects of defaunation and identify interfering factors. Generally speaking, defaunation significantly (P<0.01) increased average daily gain (11% on average, 64 trials) but did not affect dry matter intake. As

M Eugène; H Archimède; D Sauvant



Quantitative human chorionic gonadotropin analysis. I. Comparison of an immunoradiometric assay and a radioimmunoassay  

SciTech Connect

An immunoradiometric assay (IRMA) for the quantitative analysis of human chorionic gonadotropin (hCG) was evaluated for specificity, sensitivity, accuracy and precision. The results were compared with those of the conventional radioimmunoassay (RIA) used in our laboratory. The IRMA is a solid-phase, double-antibody immunoassay that sandwiches the intact hCG molecule between the two antibodies. It has specificity, accuracy, and precision which are similar to those of the RIA. The RIA is based upon the assumptions that the antigenicity of the tracer is not altered by the iodination process and that the antibody reacts equally with all of the antigens, including the radiolabeled antigen. The IRMA does not use radiolabeled antigens and thus is free of the assumptions made in the conventional RIA. The IRMA may be more accurate at the lower limits of the assay because it does not require logarithmic transformations. Since the IRMA does not measure the free beta-subunit of hCG, it cannot be endorsed as the sole technique to quantitate hCG in patients with gestational trophoblastic neoplasia until the significance of the free beta-subunit in these patients is determined.

Shapiro, A.I.; Wu, T.F.; Ballon, S.C.; Lamb, E.J.



Quantitative analysis on PUVA-induced skin photodamages using optical coherence tomography  

NASA Astrophysics Data System (ADS)

Psoralen plus ultraviolet A radiation (PUVA) therapy is an important clinical treatment for skin diseases such as vitiligo and psoriasis, but it is associated with an increased risk of skin photodamage, especially photoaging. Since skin biopsy alters the original skin morphology and always entails iatrogenic trauma, optical coherence tomography (OCT) appears to be a promising technique for studying skin damage in vivo. In this study, Balb/c mice treated with 8-methoxypsoralen (8-MOP) prior to UVA irradiation were used as the PUVA-induced photodamage model. In vivo OCT images of the dorsal skin of the photodamaged (model) and normal (control) groups were obtained at 0, 24, 48, and 72 hours after irradiation, and the results were quantitatively analyzed in combination with histological information. The experimental results showed that PUVA-damaged skin had an increased epidermal thickness (ET), a reduced attenuation coefficient in the OCT signal, and an increased brightness of the epidermal layer compared with the control group. In conclusion, noninvasive high-resolution imaging techniques such as OCT may be a promising tool for photobiological studies aimed at assessing photodamage and repair processes in vivo. OCT enables quantitative analysis of changes in photodamaged skin, such as ET and dermal collagen, providing a basis for the treatment and prevention of skin photodamage.

Zhai, Juan; Guo, Zhouyi; Liu, Zhiming; Xiong, Honglian; Zeng, Changchun; Jin, Ying



Ultrasensitive, self-calibrated cavity ring-down spectrometer for quantitative trace gas analysis.  


A cavity ring-down spectrometer is built for trace gas detection using telecom distributed feedback (DFB) diode lasers. The longitudinal modes of the ring-down cavity are used as frequency markers without active-locking either the laser or the high-finesse cavity. A control scheme is applied to scan the DFB laser frequency, matching the cavity modes one by one in sequence and resulting in a correct index at each recorded spectral data point, which allows us to calibrate the spectrum with a relative frequency precision of 0.06 MHz. Besides the frequency precision of the spectrometer, a sensitivity (noise-equivalent absorption) of 4×10⁻¹¹ cm⁻¹ Hz⁻¹/² has also been demonstrated. A minimum detectable absorption coefficient of 5×10⁻¹² cm⁻¹ has been obtained by averaging about 100 spectra recorded in 2 h. The quantitative accuracy is tested by measuring the CO2 concentrations in N2 samples prepared by the gravimetric method, and the relative deviation is less than 0.3%. The trace detection capability is demonstrated by detecting CO2 of ppbv-level concentrations in a high-purity nitrogen gas sample. Simple structure, high sensitivity, and good accuracy make the instrument very suitable for quantitative trace gas analysis. PMID:25402995
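Absorption coefficients like those quoted above follow from the standard cavity ring-down relation α = (1/c)(1/τ − 1/τ₀), where τ and τ₀ are the ring-down times with and without the absorber. The sketch below illustrates that textbook relation; it is not taken from the paper, and the names are illustrative.

```python
def absorption_coefficient(tau, tau0, c=2.99792458e10):
    """Sample absorption coefficient from cavity ring-down times:
    alpha = (1/c) * (1/tau - 1/tau0), where tau is the ring-down time
    with the absorber present and tau0 that of the empty cavity.
    With c, the speed of light, in cm/s, alpha comes out in cm^-1."""
    return (1.0 / c) * (1.0 / tau - 1.0 / tau0)
```

A shortened ring-down time (τ < τ₀) yields a positive absorption coefficient; equal times yield zero.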

Chen, Bing; Sun, Yu R; Zhou, Ze-Yi; Chen, Jian; Liu, An-Wen; Hu, Shui-Ming



Quantitative analysis of intrinsic skin aging in dermal papillae by in vivo harmonic generation microscopy  

PubMed Central

Chronological skin aging is associated with flattening of the dermal-epidermal junction (DEJ), but to date no quantitative analysis focusing on the aging changes in the dermal papillae (DP) has been performed. The aim of the study is to determine the architectural changes and the collagen density related to chronological aging in the dermal papilla zone (DPZ) by in vivo harmonic generation microscopy (HGM) with a sub-femtoliter spatial resolution. We recruited 48 Asian subjects and obtained in vivo images on the sun-protected volar forearm. Six parameters were defined to quantify 3D morphological changes of the DPZ, which we analyzed both manually and computationally to study their correlation with age. The depth of DPZ, the average height of isolated DP, and the 3D interdigitation index decreased with age, while DP number density, DP volume, and the collagen density in DP remained constant over time. In vivo high-resolution HGM technology has uncovered chronological aging-related variations in DP, and sheds light on real-time quantitative skin fragility assessment and disease diagnostics based on collagen density and morphology. PMID:25401037

Liao, Yi-Hua; Kuo, Wei-Cheng; Chou, Sin-Yo; Tsai, Cheng-Shiun; Lin, Guan-Liang; Tsai, Ming-Rung; Shih, Yuan-Ta; Lee, Gwo-Giun; Sun, Chi-Kuang



RGB color calibration for quantitative image analysis: the "3D thin-plate spline" warping approach.  


In the last years the need to numerically define color by its coordinates in n-dimensional space has increased strongly. Colorimetric calibration is fundamental in food processing and other biological disciplines to quantitatively compare samples' color during workflow with many devices. Several software programmes are available to perform standardized colorimetric procedures, but they are often too imprecise for scientific purposes. In this study, we applied the Thin-Plate Spline interpolation algorithm to calibrate colours in sRGB space (the corresponding Matlab code is reported in the Appendix). This was compared with other two approaches. The first is based on a commercial calibration system (ProfileMaker) and the second on a Partial Least Square analysis. Moreover, to explore device variability and resolution two different cameras were adopted and for each sensor, three consecutive pictures were acquired under four different light conditions. According to our results, the Thin-Plate Spline approach reported a very high efficiency of calibration allowing the possibility to create a revolution in the in-field applicative context of colour quantification not only in food sciences, but also in other biological disciplines. These results are of great importance for scientific color evaluation when lighting conditions are not controlled. Moreover, it allows the use of low cost instruments while still returning scientifically sound quantitative data. PMID:22969337
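The 3D thin-plate-spline warp described above can be sketched with SciPy's `RBFInterpolator`, which supports a thin-plate-spline kernel in three dimensions. This is an illustrative stand-in for the authors' Matlab code, and the synthetic chart-patch data in the usage check are assumptions.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def fit_color_calibration(measured_rgb, reference_rgb):
    """Fit a 3D thin-plate-spline warp from measured to reference sRGB
    coordinates using color-chart patch correspondences (N x 3 arrays).
    Returns a callable mapping M x 3 measured colors to calibrated ones."""
    return RBFInterpolator(np.asarray(measured_rgb, float),
                           np.asarray(reference_rgb, float),
                           kernel='thin_plate_spline')
```

With the default zero smoothing the warp interpolates the chart patches exactly, and because the thin-plate-spline model includes a linear polynomial part it also recovers purely affine device distortions on held-out colors.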

Menesatti, Paolo; Angelini, Claudio; Pallottino, Federico; Antonucci, Francesca; Aguzzi, Jacopo; Costa, Corrado



Quantitative nanostructural and single-molecule force spectroscopy biomolecular analysis of human-saliva-derived exosomes.  


Exosomes are naturally occurring nanoparticles with unique structure, surface biochemistry, and mechanical characteristics. These distinct nanometer-sized bioparticles are secreted from the surfaces of oral epithelial cells into saliva and are of interest as oral-cancer biomarkers. We use high-resolution AFM to show single-vesicle quantitative differences between exosomes derived from the saliva of normal individuals and oral cancer patients. Compared to normal exosomes (circular, 67.4 ± 2.9 nm), our findings indicate that cancer exosome populations are significantly increased in saliva and display irregular morphologies, increased vesicle size (98.3 ± 4.6 nm), and higher intervesicular aggregation. At the single-vesicle level, cancer exosomes exhibit significantly (P < 0.05) increased CD63 surface densities. To our knowledge, this represents the first report detecting single-exosome surface protein variations. Additionally, high-resolution AFM imaging of cancer saliva samples revealed discrete multivesicular bodies with intraluminal exosomes enclosed. We discuss the use of quantitative, nanoscale ultrastructural and surface biomolecular analysis of saliva exosomes at single-vesicle and single-protein sensitivities as a potentially new oral cancer diagnostic. PMID:22017459

Sharma, Shivani; Gillespie, Boyd M; Palanisamy, Viswanathan; Gimzewski, James K



Molecular analysis of mitotic chromosome condensation using a quantitative time-resolved fluorescence microscopy assay.  


Chromosomes condense during mitotic entry to facilitate their segregation. Condensation is typically assayed in fixed preparations, limiting analysis of contributing factors. Here, we describe a quantitative method to monitor condensation kinetics in living cells expressing GFP fused to a core histone. We demonstrate the utility of this method by using it to analyze the molecular requirements for the condensation of holocentric chromosomes during the first division of the Caenorhabditis elegans embryo. In control embryos, the fluorescence intensity distribution for nuclear GFP:histone changes during two distinct time intervals separated by a plateau phase. During the first interval, primary condensation converts diffuse chromatin into discrete linear chromosomes. After the plateau, secondary condensation compacts the curvilinear chromosomes to form shorter bar-shaped structures. We quantitatively compared the consequences on this characteristic profile of depleting the condensin complex, the mitosis-specific histone H3 kinase Aurora B, the centromeric histone CENP-A, and CENP-C, a conserved protein required for kinetochore assembly. Both condensin and CENP-A play critical but distinct roles in primary condensation. In contrast, depletion of CENP-C slows but does not prevent primary condensation. Finally, Aurora B inhibition has no effect on primary condensation, but slightly delays secondary condensation. These results provide insights into the process of condensation, help resolve apparent contradictions from prior studies, and indicate that CENP-A chromatin has an intrinsic role in the condensation of holocentric chromosomes that is independent of its requirement for kinetochore assembly. PMID:17005720

Maddox, Paul S; Portier, Nathan; Desai, Arshad; Oegema, Karen



Quantitative analysis of zinc in rat hippocampal mossy fibers by nuclear microscopy.  


Zinc (Zn) is involved in regulating mental and motor functions of the brain. Previous approaches have determined Zn content in the brain using semi-quantitative histological methods. We present here an alternative approach to map and quantify Zn levels in the synapses from mossy fibers to the CA3 region of the hippocampus. Based on nuclear microscopy, a combination of imaging and analysis techniques encompassing scanning transmission ion microscopy (STIM), Rutherford backscattering spectrometry (RBS), and particle-induced X-ray emission (PIXE), it enables quantitative elemental mapping down to parts-per-million (µg/g dry weight) levels of zinc in rat hippocampal mossy fibers. Our results indicate a laminar-specific Zn concentration of 240±9 µM wet weight (135±5 µg/g dry weight) in the stratum lucidum (SL), compared to 144±6 µM wet weight (81±3 µg/g dry weight) in the stratum pyramidale (SP) and 78±10 µM wet weight (44±5 µg/g dry weight) in the stratum oriens (SO) of the hippocampus. The mossy fiber terminals in CA3 are mainly located in the SL; hence the Zn is suggested to reside within this axonal presynaptic terminal system. PMID:22766378

Zhang, Binbin; Ren, Minqin; Sheu, Fwu-Shan; Watt, Frank; Routtenberg, Aryeh



Quantitative evaluation of DNA methylation by optimization of a differential-high resolution melt analysis protocol  

PubMed Central

DNA methylation is a key regulator of gene transcription. Alterations in DNA methylation patterns are common in most cancers, occur early in carcinogenesis and can be detected in body fluids. Reliable and sensitive quantitative assays are required to improve the diagnostic role of methylation in the management of cancer patients. Here we present an optimized procedure, based on differential-high resolution melting analysis (D-HRMA), for the rapid and accurate quantification of methylated DNA. Two sets of primers are used in a single tube for the simultaneous amplification of the methylated (M) and unmethylated (UM) DNA sequences in D-HRMA. After HRM, differential fluorescence was calculated at the specific melting temperature after automatic subtraction of UM-DNA fluorescence. Quantification was calculated by interpolation on an external standard curve generated by serial dilutions of M-DNA. To optimize the protocol, nine primer sets were accurately selected on the basis of the number of CpG on promoters of hTERT and Bcl2 genes. The use of optimized D-HRMA allowed us to detect up to 0.025% M-DNA. D-HRMA results of DNA from 85 bladder cancers were comparable to those obtained with real time quantitative methylation specific PCR. In addition, D-HRMA appears suitable for rapid and efficient measurements in ‘in vitro’ experiments on methylation patterns after treatment with demethylating drugs. PMID:19454604
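Quantification by interpolation on an external standard curve, as described above, is commonly done by fitting the measured signal against the logarithm of the serial-dilution concentrations and inverting the fit for unknowns. The sketch below assumes a linear-in-log model; both the model and the names are generic illustrations, not the authors' exact procedure.

```python
import numpy as np

def make_standard_curve(percent_m, signal):
    """Fit a line to signal vs log10(% methylated DNA) for the serial
    dilution standards, and return a function mapping a measured
    differential-fluorescence signal back to % methylated DNA."""
    slope, intercept = np.polyfit(np.log10(percent_m), signal, 1)
    return lambda s: 10.0 ** ((s - intercept) / slope)
```

Given standards spanning, say, 0.025% to 100% M-DNA, an unknown sample's differential fluorescence is converted to a methylation percentage by the returned function.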

Malentacchi, Francesca; Forni, Giulia; Vinci, Serena; Orlando, Claudio
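The external-standard-curve step of D-HRMA described above can be illustrated with a small interpolation routine. The dilution points and fluorescence values below are invented for illustration and do not come from the paper:

```python
def interpolate_methylation(standards, signal):
    """standards: (percent_methylated, fluorescence) pairs from serial
    dilutions of methylated DNA. Returns the percent methylation of an
    unknown by linear interpolation between the bracketing standards."""
    pts = sorted(standards, key=lambda p: p[1])
    for (p0, f0), (p1, f1) in zip(pts, pts[1:]):
        if f0 <= signal <= f1:
            return p0 + (p1 - p0) * (signal - f0) / (f1 - f0)
    raise ValueError("signal outside the standard curve")

# Hypothetical standard curve spanning the reported 0.025% sensitivity.
standards = [(0.025, 1.0), (0.25, 8.0), (2.5, 60.0), (25.0, 400.0), (100.0, 1500.0)]
unknown_pct = interpolate_methylation(standards, 34.0)  # falls between 0.25% and 2.5%
```

A real assay would typically fit a regression through the standards rather than interpolate piecewise, but the principle is the same.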



A semi-quantitative approach to GMO risk-benefit analysis.  


In many countries there are increasing calls for the benefits of genetically modified organisms (GMOs) to be considered as well as the risks, and for a risk-benefit analysis to form an integral part of GMO regulatory frameworks. This trend represents a shift away from the strict emphasis on risks, which is encapsulated in the Precautionary Principle that forms the basis for the Cartagena Protocol on Biosafety, and which is reflected in the national legislation of many countries. The introduction of risk-benefit analysis of GMOs would be facilitated if clear methodologies were available to support the analysis. Up to now, methodologies for risk-benefit analysis that would be applicable to the introduction of GMOs have not been well defined. This paper describes a relatively simple semi-quantitative methodology that could be easily applied as a decision support tool, giving particular consideration to the needs of regulators in developing countries where there are limited resources and experience. The application of the methodology is demonstrated using the release of an insect resistant maize variety in South Africa as a case study. The applicability of the method in the South African regulatory system is also discussed, as an example of what might be involved in introducing changes into an existing regulatory process. PMID:21197601

Morris, E Jane



Oqtans: the RNA-seq workbench in the cloud for complete and reproducible quantitative transcriptome analysis.  


We present Oqtans, an open-source workbench for quantitative transcriptome analysis that is integrated into Galaxy. Its distinguishing features include customizable computational workflows and a modular pipeline architecture that facilitates comparative assessment of tool and data quality. Oqtans integrates an assortment of machine-learning-powered tools into Galaxy, which show performance superior or equal to state-of-the-art tools. The implemented tools comprise a complete transcriptome analysis workflow: short-read alignment, transcript identification/quantification and differential expression analysis. Oqtans and Galaxy facilitate persistent storage, data exchange and documentation of intermediate results and analysis workflows. We illustrate how Oqtans aids the interpretation of data from different experiments in easy-to-understand use cases. Users can easily create their own workflows and extend Oqtans by integrating specific tools. Oqtans is available as (i) a cloud machine image with a demo instance at, (ii) a public Galaxy instance at, (iii) a git repository containing all installed software (; most of which is also available from (iv) the Galaxy Toolshed and (v) a share string to use along with Galaxy CloudMan. PMID:24413671

Sreedharan, Vipin T; Schultheiss, Sebastian J; Jean, Géraldine; Kahles, André; Bohnert, Regina; Drewe, Philipp; Mudrakarta, Pramod; Görnitz, Nico; Zeller, Georg; Rätsch, Gunnar



Quantitative Analysis of Various Metalloprotein Compositional Stoichiometries with Simultaneous PIXE and NRA  

NASA Astrophysics Data System (ADS)

Stoichiometric characterization has been carried out on multiple metalloproteins using a combination of Ion Beam Analysis methods and a newly modified preparation technique. Particle Induced X-ray emission (PIXE) spectroscopy is a non-destructive ion beam analysis technique well suited to determine the concentrations of heavy elements. Nuclear Reaction Analysis (NRA) is a technique which measures the areal density of a thin target from scattering cross sections of 3.4 MeV protons. A combination of NRA and PIXE has been developed to provide a quantitative technique for the determination of stoichiometric metal ion ratios in metalloproteins. About one third of all proteins are metalloproteins, and most do not have well determined stoichiometric compositions for the metals they contain. Current work focuses on establishing a standard method in which to prepare protein samples. The method involves placing drops of protein solutions on aluminized polyethylene terephthalate (Mylar) and allowing them to dry. This technique has been tested for several proteins of known stoichiometry to determine cofactor content and has proven to be a reliable analysis method, accurately determining metal stoichiometry in cytochrome c, superoxide dismutase, concanavalin A, vitamin B12, and hemoglobin.

McCubbin, Andrew; Deyoung, Paul; Peaslee, Graham; Sibley, Megan; Warner, Joshua
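The stoichiometry determination enabled by combining the two techniques above reduces to a molar ratio: PIXE supplies the areal density of the metal, NRA the areal density of the dried protein target. A minimal sketch with hypothetical hemoglobin-like numbers (the areal densities and molar masses here are illustrative, not values from the study):

```python
def metal_per_protein(metal_ug_cm2, metal_molar_mass,
                      protein_ug_cm2, protein_molar_mass):
    """Metal ions per protein molecule from areal densities (µg/cm²).
    g/mol equals µg/µmol, so dividing µg/cm² by molar mass gives µmol/cm²."""
    metal_umol = metal_ug_cm2 / metal_molar_mass
    protein_umol = protein_ug_cm2 / protein_molar_mass
    return metal_umol / protein_umol

# Hemoglobin carries 4 Fe per tetramer (~64.5 kg/mol); with 100 µg/cm²
# of protein, the matching Fe areal density would be ~0.346 µg/cm².
ratio = metal_per_protein(0.3464, 55.845, 100.0, 64500.0)
```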



Quantitative analysis of metformin in antidiabetic tablets by laser-induced breakdown spectroscopy  

NASA Astrophysics Data System (ADS)

The production of counterfeit and low-quality drugs harms human health and causes losses to pharmaceutical industries and tax revenue losses to governments. Several methods currently exist for pharmaceutical product analysis; nevertheless, most of them depend on complex and time-consuming steps such as sample preparation. In contrast to conventional methods, laser-induced breakdown spectroscopy (LIBS) is evaluated here as a potential analytical technique for the rapid screening and quality control of anti-diabetic solid formulations. In this paper the authors propose a simple method for the qualitative and quantitative analysis of Active Pharmaceutical Ingredients (APIs) such as metformin hydrochloride. Ten-nanosecond (FWHM) pulses from a Nd:YAG laser produce the induced breakdown for the analysis. The emitted light is collected, focused into a Czerny-Turner spectrograph, and dispersed onto an ICCD camera for detection. Atomic emission from chlorine atoms, present only in the API, was used as the analyte signal, and the analysis was improved using bromine as an internal standard. Linear calibration curves were prepared from synthetic samples, achieving linearity higher than 99%. The results were compared with HPLC results, and the method was validated by statistical methods. The validation analysis suggests that the two methods show no significant differences, i.e., the proposed method can be implemented for in-situ, real-time monitoring of the pharmaceutical production process or for inspection and verification of authenticity.

Contreras, U.; Ornelas-Soto, N.; Meneses-Nava, M. A.; Barbosa-García, O.; López-de-Alba, P. L.; López-Martínez, L.
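The internal-standard calibration described above (a Cl emission line normalized by a Br line, fit against synthetic standards) amounts to an ordinary least-squares line that is then inverted for unknowns. A sketch with hypothetical concentrations and intensity ratios, not data from the paper:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

conc = [100.0, 200.0, 300.0, 400.0, 500.0]  # metformin in standards (hypothetical units)
ratio = [0.21, 0.39, 0.62, 0.80, 1.01]      # I(Cl)/I(Br) intensity ratio (hypothetical)

m, b = fit_line(conc, ratio)
unknown = (0.55 - b) / m  # invert the calibration for a measured ratio of 0.55
```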



The Lottia gigantea shell matrix proteome: re-analysis including MaxQuant iBAQ quantitation and phosphoproteome analysis  

PubMed Central

Background Although the importance of proteins of the biomineral organic matrix and their posttranslational modifications for biomineralization is generally recognized, the number of published matrix proteomes is still small. This is mostly due to the lack of comprehensive sequence databases, usually derived from genomic sequencing projects. However, in-depth mass spectrometry-based proteomic analysis, which critically depends on high-quality sequence databases, is a very fast tool to identify candidates for functional biomineral matrix proteins and their posttranslational modifications. Identification of such candidate proteins is facilitated by at least approximate quantitation of the identified proteins, because the most abundant ones may also be the most interesting candidates for further functional analysis. Results Re-quantification of previously identified Lottia shell matrix proteins using the intensity-based absolute quantification (iBAQ) method as implemented in the MaxQuant identification and quantitation software showed that only 57 of the 382 accepted identifications constituted 98% of the total identified matrix proteome. This group of proteins did not contain obvious intracellular proteins, such as cytoskeletal components or ribosomal proteins, invariably identified as minor components of high-throughput biomineral matrix proteomes. Fourteen of these major proteins were phosphorylated to a variable extent. Altogether, we identified 52 phosphorylation sites in 20 of the 382 accepted proteins with high confidence. Conclusions We show that iBAQ quantitation may be a useful tool to narrow down the group of functional biomineral matrix protein candidates for further research in cell biology, genetics or materials research. Knowledge of posttranslational modifications in these major proteins could be a valuable addition to previously published proteomes. This is especially true for phosphorylation, because this modification has already been shown to modify mineralization processes in some instances. PMID:25018669



Normalization and statistical analysis of quantitative proteomics data generated by metabolic labeling.  


Comparative proteomics is a powerful analytical method for learning about the responses of biological systems to changes in growth parameters. To make confident inferences about biological responses, proteomics approaches must incorporate appropriate statistical measures of quantitative data. In the present work we applied microarray-based normalization and statistical analysis (significance testing) methods to analyze quantitative proteomics data generated from the metabolic labeling of a marine bacterium (Sphingopyxis alaskensis). Quantitative data were generated for 1,172 proteins, representing 1,736 high confidence protein identifications (54% genome coverage). To test approaches for normalization, cells were grown at a single temperature, metabolically labeled with (14)N or (15)N, and combined in different ratios to give an artificially skewed data set. Inspection of ratio versus average (MA) plots determined that a fixed value median normalization was most suitable for the data. To determine an appropriate statistical method for assessing differential abundance, a fold-change approach, Student's t test, unmoderated t test, and empirical Bayes moderated t test were applied to proteomics data from cells grown at two temperatures. Inverse metabolic labeling was used with multiple technical and biological replicates, and proteomics was performed on cells that were combined based on equal optical density of cultures (providing skewed data) or on cell extracts that were combined to give equal amounts of protein (no skew). To account for arbitrarily complex experiment-specific parameters, a linear modeling approach was used to analyze the data using the limma package in R/Bioconductor. A high quality list of statistically significant differentially abundant proteins was obtained by using lowess normalization (after inspection of MA plots) and applying the empirical Bayes moderated t test.
The approach also effectively controlled for the number of false discoveries and corrected for the multiple testing problem using the Storey-Tibshirani false discovery rate (Storey, J. D., and Tibshirani, R. (2003) Statistical significance for genomewide studies. Proc. Natl. Acad. Sci. U.S.A. 100, 9440-9445). The approach we have developed is generally applicable to quantitative proteomics analyses of diverse biological systems. PMID:19605365

Ting, Lily; Cowley, Mark J; Hoon, Seah Lay; Guilhaus, Michael; Raftery, Mark J; Cavicchioli, Ricardo
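The fixed-value median normalization applied above after inspecting MA plots can be sketched in a few lines: each protein contributes M = log2(heavy/light) and A = the mean log2 intensity, and subtracting the median M centres the ratio distribution at zero. The intensities below are toy values, not data from the study:

```python
import math

# Toy heavy (15N) and light (14N) channel intensities for five proteins.
heavy = [1200.0, 950.0, 400.0, 2200.0, 130.0]
light = [1000.0, 1000.0, 500.0, 1000.0, 100.0]

# MA-plot coordinates: M = log ratio, A = average log intensity.
M = [math.log2(h / l) for h, l in zip(heavy, light)]
A = [0.5 * (math.log2(h) + math.log2(l)) for h, l in zip(heavy, light)]

# Fixed-value median normalization: shift all log ratios by the median M.
med = sorted(M)[len(M) // 2]
M_norm = [m - med for m in M]
```

The empirical Bayes moderated t test applied afterwards in the paper comes from limma and is not reproduced here.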



The AquaDEB project (phase I): Analysing the physiological flexibility of aquatic species and connecting physiological diversity to ecological and evolutionary processes by using Dynamic Energy Budgets  

NASA Astrophysics Data System (ADS)

The European Research Project AquaDEB (2007-2011, joins the skills and expertise of French and Dutch research institutes and universities to analyse the physiological flexibility of aquatic organisms and to link it to ecological and evolutionary processes within a common theoretical framework for quantitative bioenergetics [Kooijman, S.A.L.M., 2000. Dynamic energy and mass budgets in biological systems. Cambridge University Press, Cambridge]. The main scientific objectives of AquaDEB are i) to study and compare the sensitivity of aquatic species (mainly molluscs and fish) to environmental variability of natural or human origin, and ii) to evaluate the related consequences at different biological levels (individual, population, ecosystem) and temporal scales (life cycle, population dynamics, evolution). At mid-term, the AquaDEB collaboration has already yielded interesting results by quantifying the bioenergetic processes of various aquatic species (e.g. molluscs, fish, crustaceans, algae) within a single mathematical framework. It has also brought together scientists with different backgrounds (e.g. mathematics, microbiology, ecology, chemistry) working in different fields (e.g. aquaculture, fisheries, ecology, agronomy, ecotoxicology, climate change).
For the coming two years, the priorities of the AquaDEB collaboration will be: (i) to compare energetic and physiological strategies among species through the DEB parameter values and to identify the factors responsible for any differences in bioenergetics and physiology, and to compare dynamic (DEB) versus static (SEB) energy models for studying the physiological performance of aquatic species; (ii) to consider different scenarios of environmental disruption (excess of nutrients, diffuse or massive pollution, exploitation by man, climate change) to forecast effects on growth, reproduction and survival of key species; (iii) to scale up the models for a few species from the individual level to the level of evolutionary processes.

Alunno-Bruscia, Marianne; van der Veer, Henk W.; Kooijman, Sebastiaan A. L. M.



Over-seasons analysis of quantitative trait loci affecting phenolic content and antioxidant capacity in raspberry.  


This study examined the total phenol content (TPC) and total anthocyanin content (TAC) in ripe fruit of progeny of a mapping population generated from a cross between the European red raspberry cv. Glen Moy (Rubus idaeus var. idaeus) and the North American red raspberry cv. Latham (Rubus idaeus var. strigosus) over five seasons in two different growing environments. Measurements of antioxidant capacity (FRAP and TEAC) were also carried out. TPC was highly correlated with TEAC and FRAP across the entire data set. Anthocyanin content was genotype-dependent but also correlated with TPC, although the proportion of anthocyanin compounds varied between progeny. Quantitative trait locus (QTL) analysis was carried out, and key markers were tested for consistency of effects over sites and years. Four regions, on linkage groups 2, 3, 5, and 6, were identified. These agree with QTLs from a previous study over a single season and indicate that the QTL effects were robust over seasons. PMID:22583495

Dobson, Patricia; Graham, Julie; Stewart, D; Brennan, Rex; Hackett, Christine A; McDougall, Gordon J



Quantitative analysis of ginger components in commercial products using liquid chromatography with electrochemical array detection  

PubMed Central

For the first time, a sensitive reversed-phase HPLC electrochemical array method has been developed for the quantitative analysis of eight major ginger components ([6]-, [8]-, and [10]-gingerol, [6]-, [8]-, and [10]-shogaol, [6]-paradol, and [1]-dehydrogingerdione) in eleven ginger-containing commercial products. The method was validated, with limits of detection as low as 7.3–20.2 pg and limits of quantification of 14.5–40.4 pg. Using this method, we quantified the levels of the eight ginger components in eleven different commercial products. We found that both the levels of and the ratios among the eight compounds vary greatly between commercial products. PMID:21090746

Shao, Xi; Lv, Lishuang; Parks, Tiffany; Wu, Hou; Ho, Chi-Tang; Sang, Shengmin
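Detection and quantification limits like those quoted above are conventionally derived from the blank noise and the calibration slope. The sketch below uses the common ICH-style formulas (LOD = 3.3σ/S, LOQ = 10σ/S) with made-up numbers; the authors' exact procedure may differ:

```python
def lod_loq(sigma_blank, slope):
    """ICH-style estimates: LOD = 3.3*sigma/S, LOQ = 10*sigma/S,
    where sigma is the standard deviation of the blank response and
    S is the slope of the calibration curve."""
    return 3.3 * sigma_blank / slope, 10.0 * sigma_blank / slope

# Hypothetical blank noise and calibration slope (signal units per pg).
lod, loq = lod_loq(sigma_blank=0.011, slope=5.0)
```

Note the reported LOQ/LOD ratios (~2) differ from the 10/3.3 ≈ 3 implied by these formulas, so the study likely used an empirical signal-to-noise criterion instead.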



Quantitative analysis of the flexibility effect of cisplatin on circular DNA  

NASA Astrophysics Data System (ADS)

We study the effects of cisplatin on the circular configuration of DNA using atomic force microscopy (AFM) and observe that the DNA gradually transforms from a circle-like structure to a complex configuration with an intersection and interwound structures. An algorithm is developed to extract the configuration profiles of circular DNA from AFM images, and the radius of gyration is used to describe the flexibility of circular DNA. The quantitative analysis demonstrates that the radius of gyration gradually decreases, and two distinct regimes in the change of flexibility of circular DNA are found as the cisplatin concentration increases. Furthermore, a model is proposed and discussed to explain the mechanism underlying the complicated interaction between DNA and cisplatin.

Ji, Chao; Zhang, Lingyun; Wang, Peng-Ye
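The radius of gyration used above as a flexibility descriptor is computed directly from the contour points extracted from an AFM image: the root-mean-square distance of the points from their centroid. A self-contained sketch with a toy contour:

```python
import math

def radius_of_gyration(points):
    """Rg of a set of 2D contour points: sqrt of the mean squared
    distance of the points from their centroid."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    return math.sqrt(sum((x - cx) ** 2 + (y - cy) ** 2 for x, y in points) / n)

# Toy contour: the four corners of a unit square; Rg = sqrt(0.5).
square = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]
rg = radius_of_gyration(square)
```

As the DNA compacts under cisplatin binding, points move toward the centroid and Rg decreases, which is the trend the paper reports.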



Quantitative measurement of phase variation amplitude of ultrasonic diffraction grating based on diffraction spectral analysis  

NASA Astrophysics Data System (ADS)

A new method based on diffraction spectral analysis is proposed for the quantitative measurement of the phase variation amplitude of an ultrasonic diffraction grating. For a traveling wave, the phase variation amplitude of the grating depends on the intensity of the zeroth- and first-order diffraction waves. By contrast, for a standing wave, this amplitude depends on the intensity of the zeroth-, first-, and second-order diffraction waves. The proposed method is verified experimentally. The measured phase variation amplitude ranges from 0 to 2π, with a relative error of approximately 5%. A nearly linear relation exists between the phase variation amplitude and driving voltage. Our proposed method can also be applied to ordinary sinusoidal phase gratings.

Pan, Meiyan; Zeng, Yingzhi; Huang, Zuohua
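In the thin-grating (Raman–Nath) regime, the nth diffraction order of a sinusoidal phase grating has intensity proportional to J_n(φ)², where φ is the phase variation amplitude; the abstract does not state its exact inversion, so take this as the standard textbook model rather than the authors' procedure. J_n can be evaluated from its integral representation with only the standard library:

```python
import math

def bessel_j(n, x, steps=2000):
    """J_n(x) = (1/pi) * integral_0^pi cos(n*t - x*sin(t)) dt,
    evaluated by the trapezoidal rule."""
    h = math.pi / steps
    total = 0.5 * (math.cos(0.0) + math.cos(n * math.pi))  # endpoint terms
    for k in range(1, steps):
        t = k * h
        total += math.cos(n * t - x * math.sin(t))
    return total * h / math.pi

phi = 1.0  # phase variation amplitude in radians (illustrative)
# Relative intensities of the zeroth, first, and second diffraction orders.
orders = [bessel_j(n, phi) ** 2 for n in range(3)]
```

Measuring two or three order intensities and inverting these relations recovers φ, which is the quantity the paper measures over 0 to 2π.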



Quantitative Analysis of Chiari-Like Malformation and Syringomyelia in the Griffon Bruxellois Dog  

PubMed Central

This study aimed to develop a system of quantitative analysis of canine Chiari-like malformation and syringomyelia on variable quality MRI. We made a series of measurements from magnetic resonance DICOM images from Griffon Bruxellois dogs with and without Chiari-like malformation and syringomyelia and identified several significant variables. We found that in the Griffon Bruxellois dog, Chiari-like malformation is characterized by an apparent shortening of the entire cranial base and possibly by increased proximity of the atlas to the occiput. As a compensatory change, there appears to be an increased height of the rostral cranial cavity with lengthening of the dorsal cranial vault and considerable reorganization of the brain parenchyma including ventral deviation of the olfactory bulbs and rostral invagination of the cerebellum under the occipital lobes. PMID:24533070

Knowler, Susan P.; McFadyen, Angus K.; Freeman, Courtenay; Kent, Marc; Platt, Simon R.; Kibar, Zoha; Rusbridge, Clare



Quantitative analysis of generation and branch defects in G5 poly(amidoamine) dendrimer  

PubMed Central

Although methods have been developed to synthesize and isolate generation 5 (G5) PAMAM dendrimers containing precise numbers of ligands per polymer particle, the presence of skeletal and generational defects in this material can substantially hamper the process. Here we provide a quantitative analysis of G5 PAMAM dendrimer defects via high performance liquid chromatography, potentiometric titration, mass spectrometry, size exclusion chromatography, and nuclear magnetic resonance. We identified, isolated, and characterized the major structural defects of the G5 dendrimer: trailing generations and dimer, trimer, and tetramer species. We determined that the G5 material present in the as-received mixture contains 93 arms on average. We have developed two model systems capable of generating the experimentally observed mass range and polydispersity at defect rates of 8–15%. PMID:24058210

van Dongen, Mallory A.; Desai, Ankur; Orr, Bradford G.; Baker, James R.; Holl, Mark M. Banaszak



Quantitative analysis of Scanning Tunneling Microscopy images for surface structure determination: Sulfur on Re(0001)  

SciTech Connect

Scanning Tunneling Microscopy (STM) images of adsorbed atoms and molecules on single crystal substrates provide important information on surface structure and order. In many cases images are interpreted qualitatively based on other information on the system. To obtain quantitative information, a theoretical analysis of the STM image is required. A new method of calculating STM images is presented that includes a full description of the STM tip and surface structure. This method is applied to experimental STM images of sulfur adsorbed on Re(0001). Effects of adsorption site, adsorbate geometry, tip composition and tunnel gap resistance on STM image contrast are analyzed. The chemical identity of tip apex atom and substrate subsurface structure are both shown to significantly affect STM image contrast.

Ogletree, D.F.; Dunphy, J.C.; Salmeron, M.B. [Lawrence Berkeley Lab., CA (United States)]; Sautet, P. [ENS, Lyon (France), Lab. de Chimie Théorique; Centre National de la Recherche Scientifique (CNRS), Villeurbanne (France), Inst. de Recherches sur la Catalyse]



Quantitative analysis of polynuclear aromatic hydrocarbons in liquid fuels. Final report Oct 76-Oct 78  

SciTech Connect

Polynuclear aromatic hydrocarbons (PNAs), formed in combustion processes with liquid hydrocarbon fuels, contribute to mobile source exhaust emissions. Because a correlation between PNA levels in automobile exhaust and pre-existent PNAs in fuel had been demonstrated in previous work, a quantitative analysis of 12 individual polynuclear aromatic hydrocarbons present in various aircraft turbine, diesel, and gasoline test fuels was carried out in this project. The PNAs included phenanthrene, anthracene, fluoranthene, pyrene, benzo(a)anthracene, chrysene, triphenylene, benzo(a)pyrene, benzo(e)pyrene, benzo(g,h,i)perylene, coronene and anthanthrene. The fuel samples were analyzed by combined gas chromatography/mass spectrometry (GC-MS) after a preliminary isolation/concentration scheme. Liquid crystal chromatographic columns were employed to resolve isomeric PNAs. The results indicated that anthanthrene and coronene were not detected in any of the samples analyzed.

Parr, J.L.



Quantitative analysis of sesquiterpene lactones in extract of Arnica montana L. by 1H NMR spectroscopy.  


1H NMR spectroscopy was used for the quantitative analysis of the sesquiterpene lactones present in a crude lactone fraction isolated from Arnica montana. Eight main components, the tigloyl, methacryloyl, isobutyryl and 2-methylbutyryl esters of helenalin (H) and 11α,13-dihydrohelenalin (DH), were identified in the studied sample. The method allows determination of the total amount of sesquiterpene lactones and of the quantities of the helenalin-type and 11α,13-dihydrohelenalin-type esters separately. Furthermore, 6-O-tigloylhelenalin (HT, 1), 6-O-methacryloylhelenalin (HM, 2), 6-O-tigloyl-11α,13-dihydrohelenalin (DHT, 5), and 6-O-methacryloyl-11α,13-dihydrohelenalin (DHM, 6) were quantified as individual components. PMID:20837387

Staneva, Jordanka; Denkova, Pavletta; Todorova, Milka; Evstatieva, Ljuba