Science.gov

Sample records for quantitative hydrogen analysis

  1. Quantitative infrared analysis of hydrogen fluoride

    SciTech Connect

    Manuta, D.M.

    1997-04-01

This work was performed at the Portsmouth Gaseous Diffusion Plant, where hydrogen fluoride is produced upon the hydrolysis of UF₆. This poses a problem in this setting, and a method for determining the mole percent concentration was desired. HF has been considered a non-ideal gas for many years; D. F. Smith utilized complex equations in his HF studies in the 1950s. We have evaluated HF behavior as a function of pressure from three different perspectives: (1) absorbance at 3877 cm⁻¹ as a function of pressure for 100% HF; (2) absorbance at 3877 cm⁻¹ as a function of increasing partial pressure of HF, with total pressure maintained at 300 mm HgA with nitrogen; (3) absorbance at 3877 cm⁻¹ for constant partial pressure of HF, with total pressure increased to greater than 800 mm HgA with nitrogen. These experiments have shown that at partial pressures up to 35 mm HgA, HF follows the ideal gas law. The absorbance at 3877 cm⁻¹ can thus be quantitatively analyzed via infrared methods.
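The linearity check at the heart of this kind of study can be sketched as a least-squares fit of absorbance against partial pressure; the pressure and absorbance values below are hypothetical illustrations, not measurements from the study:

```python
# Sketch: testing Beer's-law (ideal-gas) linearity of an IR absorbance
# band against partial pressure. All data values are hypothetical.
import numpy as np

# Hypothetical (partial pressure [mm HgA], absorbance) pairs in the
# region where the gas behaves ideally (below ~35 mm HgA).
pressure = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0, 35.0])
absorbance = np.array([0.051, 0.102, 0.149, 0.201, 0.252, 0.298, 0.351])

# Least-squares line A = m*P + b
m, b = np.polyfit(pressure, absorbance, 1)

# Coefficient of determination as a linearity check
pred = m * pressure + b
ss_res = np.sum((absorbance - pred) ** 2)
ss_tot = np.sum((absorbance - absorbance.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

print(f"slope = {m:.4f} A per mm HgA, intercept = {b:.4f}, R^2 = {r2:.4f}")
```

An R² close to 1 over the low-pressure range is what justifies using a single calibration slope for quantitative analysis.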

  2. Scattering influences in quantitative fission neutron radiography for the in situ analysis of hydrogen distribution in metal hydrides

    NASA Astrophysics Data System (ADS)

    Börries, S.; Metz, O.; Pranzas, P. K.; Bücherl, T.; Söllradl, S.; Dornheim, M.; Klassen, T.; Schreyer, A.

    2015-10-01

    In situ neutron radiography allows for the time-resolved study of hydrogen distribution in metal hydrides. However, for a precise quantitative investigation of a time-dependent hydrogen content within a host material, an exact knowledge of the corresponding attenuation coefficient is necessary. Additionally, the effect of scattering has to be considered as it is known to violate Beer's law, which is used to determine the amount of hydrogen from a measured intensity distribution. Within this study, we used a metal hydride inside two different hydrogen storage tanks as host systems, consisting of steel and aluminum. The neutron beam attenuation by hydrogen was investigated in these two different setups during the hydrogen absorption process. A linear correlation to the amount of absorbed hydrogen was found, allowing for a readily quantitative investigation. Further, an analysis of scattering contributions on the measured intensity distributions was performed and is described in detail.
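The quantification step described above rests on inverting the Beer-Lambert attenuation law for the transmitted neutron intensity; a minimal sketch, using a hypothetical attenuation coefficient rather than a value from the paper:

```python
import numpy as np

# Sketch: inverting the Beer-Lambert law I = I0 * exp(-mu_H * x)
# to estimate an effective hydrogen path length from a measured
# neutron transmission. mu_H is a hypothetical calibration constant.

def hydrogen_path_length(I, I0, mu_H):
    """Effective hydrogen 'thickness' x (cm) from transmission I/I0."""
    return -np.log(I / I0) / mu_H

mu_H = 2.5     # hypothetical attenuation coefficient (1/cm)
I0 = 10000.0   # open-beam intensity (counts)
I = 3500.0     # transmitted intensity behind the hydride bed (counts)

x = hydrogen_path_length(I, I0, mu_H)
print(f"effective hydrogen path length: {x:.3f} cm")
```

The abstract's point is that scattering adds intensity on top of this exponential model, so the coefficient must be determined in the actual tank geometry rather than taken from tabulated cross sections.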

  3. Qualitative and quantitative analysis of mixtures of compounds containing both hydrogen and deuterium

    NASA Technical Reports Server (NTRS)

    Crespi, H. L.; Harkness, L.; Katz, J. J.; Norman, G.; Saur, W.

    1969-01-01

    Method allows qualitative and quantitative analysis of mixtures of partially deuterated compounds. Nuclear magnetic resonance spectroscopy determines location and amount of deuterium in organic compounds but not fully deuterated compounds. Mass spectroscopy can detect fully deuterated species but not the location.

  4. Quantitative hydrogen analysis in minerals based on a semi-empirical approach

    NASA Astrophysics Data System (ADS)

    Kristiansson, P.; Borysiuk, M.; Ros, L.; Skogby, H.; Abdel, N.; Elfman, M.; Nilsson, E. J. C.; Pallon, J.

    2013-07-01

Hydrogen in minerals normally occurs as hydroxyl ions associated with defects at specific crystallographic sites and is normally characterized by infrared spectroscopy (FTIR). For quantification purposes, the FTIR technique has proven less precise, since calibrations against independent methods are needed. Hydrogen analysis by the NMP technique can solve many of these problems, owing to its low detection limit, high lateral resolution, insignificant matrix effects, and ability to discriminate surface-adsorbed water. The technique has been shown to work both on thin samples and on thicker geological samples. To avoid disturbance from surface contamination, the hydrogen is analyzed inside semi-thick geological samples. The technique is an elastic recoil technique in which both the incident projectile (proton) and the recoiled hydrogen are detected in coincidence in a segmented detector. Both the traditional annular system, with the detector divided into two halves, and the new double-sided silicon strip detector (DSSSD) have been used. In this work we present an upgraded version of the technique, studying two sets of mineral standards combined with pre-sample charge normalization. To improve data processing time, we suggest a very simple semi-empirical approach for data evaluation. The advantages and drawbacks of the approach are discussed, and a possible extension of the model is suggested.

  5. Quantitative analysis of the hydrogen peroxide formed in aqueous cigarette tar extracts

    SciTech Connect

Nakayama, T.; Church, D.F.; Pryor, W.A.

    1989-01-01

We have established, for the first time, a reliable method to quantitate hydrogen peroxide (H₂O₂) generated in aqueous extracts of cigarette smoke tar. The aqueous tar extract was passed through a short reverse-phase column, and its H₂O₂ concentration was determined by differential pulse polarography using an automatic reference subtraction system. The H₂O₂ concentration increased with aging, pH, and temperature; the presence of superoxide dismutase led to lower H₂O₂ concentrations. This method was applied to many kinds of research and commercial cigarettes. With a few exceptions, the amount of H₂O₂ formed after a fixed time from each cigarette's smoke was proportional to its tar yield.

  6. Purity analysis of hydrogen cyanide, cyanogen chloride and phosgene by quantitative (13)C NMR spectroscopy.

    PubMed

    Henderson, Terry J; Cullinan, David B

    2007-11-01

Hydrogen cyanide, cyanogen chloride and phosgene are produced in tremendously large quantities today by the chemical industry. The compounds are also particularly attractive to foreign states and terrorists seeking an inexpensive mass-destruction capability. Along with contemporary warfare agents, therefore, the US Army evaluates protective equipment used by warfighters and domestic emergency responders against the compounds, and requires their certification at > or = 95 carbon atom % before use. We have investigated the (13)C spin-lattice relaxation behavior of the compounds to develop a quantitative NMR method for characterizing chemical lots supplied to the Army. Behavior was assessed at 75 and 126 MHz for temperatures between 5 and 15 degrees C to hold the compounds in their liquid states, dramatically improving detection sensitivity. T(1) values for cyanogen chloride and phosgene were comparable, ranging between 20 and 31 s. Hydrogen cyanide values were significantly shorter at 10-18 s, most likely because of a (1)H--(13)C dipolar contribution to relaxation not possible for the other compounds. The T(1) measurements were used to derive relaxation delays for collecting the quantitative (13)C data sets. At 126 MHz, only a single data acquisition with a cryogenic probehead gave a signal-to-noise ratio exceeding that necessary for certifying the compounds at > or = 95 carbon atom % and 99% confidence. Data acquired at 75 MHz with a conventional probehead, however, required > or = 5 acquisitions to reach this certifying signal-to-noise ratio for phosgene, and > or = 12 acquisitions were required for the other compounds under these same conditions. In terms of accuracy and execution time, the NMR method rivals typical chromatographic methods. PMID:17924355
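Two rules of thumb underlie a protocol like this one: the inter-pulse relaxation delay for quantitative work is commonly set to about 5 × T1 (for >99% longitudinal recovery), and signal-to-noise grows with the square root of the number of coadded acquisitions. A sketch with the longest T1 from the abstract's reported range and otherwise hypothetical SNR values:

```python
import math

# Sketch of two quantitative-NMR rules of thumb.
# (1) Relaxation delay ~ 5 * T1 for essentially full recovery.
# (2) SNR(n) = SNR(1) * sqrt(n) for n coadded acquisitions.
# The 31 s T1 is from the abstract's reported range; the SNR
# values below are hypothetical illustrations.

def relaxation_delay(t1_seconds, factor=5.0):
    """Delay (s) between pulses for quantitative acquisition."""
    return factor * t1_seconds

def acquisitions_needed(snr_single, snr_target):
    """Smallest n with snr_single * sqrt(n) >= snr_target."""
    return math.ceil((snr_target / snr_single) ** 2)

delay = relaxation_delay(31.0)            # longest phosgene T1
n_acq = acquisitions_needed(30.0, 100.0)  # hypothetical SNRs

print(f"relaxation delay: {delay:.0f} s, acquisitions needed: {n_acq}")
```

The long T1 values explain why each quantitative scan is so slow here: a ~31 s T1 forces a delay of minutes per transient.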

  7. Quantitative analysis of hydrogen in SiO2/SiN/SiO2 stacks using atom probe tomography

    NASA Astrophysics Data System (ADS)

    Kunimune, Yorinobu; Shimada, Yasuhiro; Sakurai, Yusuke; Inoue, Masao; Nishida, Akio; Han, Bin; Tu, Yuan; Takamizawa, Hisashi; Shimizu, Yasuo; Inoue, Koji; Yano, Fumiko; Nagai, Yasuyoshi; Katayama, Toshiharu; Ide, Takashi

    2016-04-01

    We have demonstrated that it is possible to reproducibly quantify hydrogen concentration in the SiN layer of a SiO2/SiN/SiO2 (ONO) stack structure using ultraviolet laser-assisted atom probe tomography (APT). The concentration of hydrogen atoms detected using APT increased gradually during the analysis, which could be explained by the effect of hydrogen adsorption from residual gas in the vacuum chamber onto the specimen surface. The amount of adsorbed hydrogen in the SiN layer was estimated by analyzing another SiN layer with an extremely low hydrogen concentration (<0.2 at. %). Thus, by subtracting the concentration of adsorbed hydrogen, the actual hydrogen concentration in the SiN layer was quantified as approximately 1.0 at. %. This result was consistent with that obtained by elastic recoil detection analysis (ERDA), which confirmed the accuracy of the APT quantification. The present results indicate that APT enables the imaging of the three-dimensional distribution of hydrogen atoms in actual devices at a sub-nanometer scale.

  8. Mass spectrometry-based quantitative proteomic analysis of Salmonella enterica serovar Enteritidis protein expression upon exposure to hydrogen peroxide

    PubMed Central

    2010-01-01

    Background Salmonella enterica, a common food-borne bacterial pathogen, is believed to change its protein expression profile in the presence of different environmental stress such as that caused by the exposure to hydrogen peroxide (H2O2), which can be generated by phagocytes during infection and represents an important antibacterial mechanism of host cells. Among Salmonella proteins, the effectors of Salmonella pathogenicity island 1 and 2 (SPI-1 and SPI-2) are of particular interest since they are expressed during host infection in vivo and are important for invasion of epithelial cells and for replication in organs during systemic infection, respectively. However, the expression profiles of these proteins upon exposure to H2O2 or to host cells in vivo during the established phase of systemic infection have not been extensively studied. Results Using stable isotope labeling coupled with mass spectrometry, we performed quantitative proteomic analysis of Salmonella enterica serovar Enteritidis and identified 76 proteins whose expression is modulated upon exposure to H2O2. SPI-1 effector SipC was expressed about 3-fold higher and SopB was expressed approximately 2-fold lower in the presence of H2O2, while no significant change in the expression of another SPI-1 protein SipA was observed. The relative abundance of SipA, SipC, and SopB was confirmed by Western analyses, validating the accuracy and reproducibility of our approach for quantitative analysis of protein expression. Furthermore, immuno-detection showed substantial expression of SipA and SipC but not SopB in the late phase of infection in macrophages and in the spleen of infected mice. Conclusions We have identified Salmonella proteins whose expression is modulated in the presence of H2O2. Our results also provide the first direct evidence that SipC is highly expressed in the spleen at late stage of salmonellosis in vivo. These results suggest a possible role of SipC and other regulated proteins in

  9. Quantitative determination of hydrogen in solids by gas chromatography.

    PubMed

    Addach, H; Berçot, P; Wery, M; Rezrazi, M

    2004-11-19

Processes such as electroplating and acid cleaning are notorious causes of post-processing failure through hydrogen embrittlement, so the determination of the amount of hydrogen in metals is of great importance. An analysis method for investigating the hydrogen content of solids has been established, based on hot extraction and a gas chromatography system. Hot extraction in inert gas enables complete and/or partial removal of the hydrogen from the samples, and the gas chromatography system is used to quantitatively determine the amount of thermally desorbed hydrogen. The effect of baking conditions on the hydrogen desorption rate of zinc-plated steel parts is investigated, followed by an analysis of the polarisation conditions during chromium electroplating. PMID:15584242

  10. Hydrogen quantitative risk assessment workshop proceedings.

    SciTech Connect

    Groth, Katrina M.; Harris, Aaron P.

    2013-09-01

The Quantitative Risk Assessment (QRA) Toolkit Introduction Workshop was held at Energetics on June 11-12. The workshop was co-hosted by Sandia National Laboratories (Sandia) and HySafe, the International Association for Hydrogen Safety. The objective of the workshop was twofold: (1) Present a hydrogen-specific methodology and toolkit (currently under development) for conducting QRA to support the development of codes and standards and safety assessments of hydrogen-fueled vehicles and fueling stations, and (2) Obtain feedback on the needs of early-stage users (hydrogen as well as potential leveraging for Compressed Natural Gas [CNG] and Liquefied Natural Gas [LNG]) and set priorities for "Version 1" of the toolkit in the context of the commercial evolution of hydrogen fuel cell electric vehicles (FCEV). The workshop consisted of an introduction and three technical sessions: Risk Informed Development and Approach; CNG/LNG Applications; and Introduction of a Hydrogen Specific QRA Toolkit.

  11. Quantitative Hydrocarbon Surface Analysis

    NASA Technical Reports Server (NTRS)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  12. Quantitative analysis of desorption and decomposition kinetics of formic acid on Cu(111): The importance of hydrogen bonding between adsorbed species

    SciTech Connect

    Shiozawa, Yuichiro; Koitaya, Takanori; Mukai, Kozo; Yoshimoto, Shinya; Yoshinobu, Jun

    2015-12-21

    Quantitative analysis of desorption and decomposition kinetics of formic acid (HCOOH) on Cu(111) was performed by temperature programmed desorption (TPD), X-ray photoelectron spectroscopy, and time-resolved infrared reflection absorption spectroscopy. The activation energy for desorption is estimated to be 53–75 kJ/mol by the threshold TPD method as a function of coverage. Vibrational spectra of the first layer HCOOH at 155.3 K show that adsorbed molecules form a polymeric structure via the hydrogen bonding network. Adsorbed HCOOH molecules are dissociated gradually into monodentate formate species. The activation energy for the dissociation into monodentate formate species is estimated to be 65.0 kJ/mol at a submonolayer coverage (0.26 molecules/surface Cu atom). The hydrogen bonding between adsorbed HCOOH species plays an important role in the stabilization of HCOOH on Cu(111). The monodentate formate species are stabilized at higher coverages, because of the lack of vacant sites for the bidentate formation.
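The abstract extracts its activation energies with the threshold TPD method; a related, simpler textbook route from a TPD peak temperature to a first-order desorption energy is the Redhead approximation, sketched below with hypothetical parameters (not values from this study):

```python
import math

# Sketch: the Redhead approximation for first-order desorption,
# Ea = R * Tp * (ln(nu * Tp / beta) - 3.64).
# The peak temperature, heating rate and attempt frequency below
# are hypothetical illustrations, not values from the paper.

R = 8.314  # gas constant, J/(mol K)

def redhead_ea(T_peak, beta, nu=1e13):
    """Activation energy (J/mol) from TPD peak temperature T_peak (K),
    heating rate beta (K/s) and attempt frequency nu (1/s)."""
    return R * T_peak * (math.log(nu * T_peak / beta) - 3.64)

ea = redhead_ea(T_peak=180.0, beta=1.0)
print(f"Ea ~ {ea / 1000:.1f} kJ/mol")
```

The threshold method used in the paper instead fits the low-temperature onset of the desorption trace, which yields the coverage-dependent energies (53-75 kJ/mol) quoted above.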

  13. Quantitative analysis of desorption and decomposition kinetics of formic acid on Cu(111): The importance of hydrogen bonding between adsorbed species

    NASA Astrophysics Data System (ADS)

    Shiozawa, Yuichiro; Koitaya, Takanori; Mukai, Kozo; Yoshimoto, Shinya; Yoshinobu, Jun

    2015-12-01

    Quantitative analysis of desorption and decomposition kinetics of formic acid (HCOOH) on Cu(111) was performed by temperature programmed desorption (TPD), X-ray photoelectron spectroscopy, and time-resolved infrared reflection absorption spectroscopy. The activation energy for desorption is estimated to be 53-75 kJ/mol by the threshold TPD method as a function of coverage. Vibrational spectra of the first layer HCOOH at 155.3 K show that adsorbed molecules form a polymeric structure via the hydrogen bonding network. Adsorbed HCOOH molecules are dissociated gradually into monodentate formate species. The activation energy for the dissociation into monodentate formate species is estimated to be 65.0 kJ/mol at a submonolayer coverage (0.26 molecules/surface Cu atom). The hydrogen bonding between adsorbed HCOOH species plays an important role in the stabilization of HCOOH on Cu(111). The monodentate formate species are stabilized at higher coverages, because of the lack of vacant sites for the bidentate formation.

  14. Analysis of hydrogen isotope mixtures

    DOEpatents

    Villa-Aleman, Eliel

    1994-01-01

    An apparatus and method for determining the concentrations of hydrogen isotopes in a sample. Hydrogen in the sample is separated from other elements using a filter selectively permeable to hydrogen. Then the hydrogen is condensed onto a cold finger or cryopump. The cold finger is rotated as pulsed laser energy vaporizes a portion of the condensed hydrogen, forming a packet of molecular hydrogen. The desorbed hydrogen is ionized and admitted into a mass spectrometer for analysis.

  15. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

According to regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company in meeting rules and regulations and in assessing and describing environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life, and in addition incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  16. Quantitative Microscale Hydrogenation of Vegetable Oils

    NASA Astrophysics Data System (ADS)

    Blanchard, Daniel E.

    2003-05-01

    A new hydrogenation experiment for the introductory organic chemistry laboratory is described. It is a classic synthetic reaction applied to a consumer product. It provides an opportunity to discuss fats and fatty acids, their physical behavior, and their nutritional value. The experiment is relatively easy to set up and requires no special equipment. Calculations utilizing the ideal gas law allow the determination of the efficiency of the hydrogenation. With this experiment a link is made between the organic chemistry laboratory and the everyday life of the student in an attempt to show the relevance between organic chemistry and modern life.
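The efficiency calculation the experiment calls for (moles of H₂ consumed via the ideal gas law, compared against the theoretical double-bond content) can be sketched as follows; all numbers are hypothetical classroom-scale values, not data from the article:

```python
# Sketch: hydrogenation efficiency from the ideal gas law,
# n = PV / (RT), in the spirit of the microscale experiment above.
# Every numeric value here is a hypothetical illustration.

R = 0.082057  # L·atm/(mol·K)

def moles_h2(volume_l, pressure_atm, temp_k):
    """Moles of H2 consumed, from the measured gas volume change."""
    return pressure_atm * volume_l / (R * temp_k)

n_consumed = moles_h2(volume_l=0.050, pressure_atm=1.0, temp_k=298.0)
n_theoretical = 0.0025  # hypothetical moles of C=C in the oil sample

efficiency = 100.0 * n_consumed / n_theoretical
print(f"H2 consumed: {n_consumed:.5f} mol, efficiency: {efficiency:.0f}%")
```

This is the whole quantitative content of the exercise: one gas-volume reading, one stoichiometric comparison.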

  17. Study of Surface Damage caused by Laser Irradiation for Quantitative Hydrogen Analysis in Zircaloy using Laser-induced Plasma Breakdown Spectrometry

    SciTech Connect

    Fukumoto, K.; Yamada, N.; Niki, H.; Maruyama, T.; Kagawa, K.

    2009-03-17

The surface damage caused by laser irradiation is studied to investigate the possibility of performing a depth-profile analysis of the hydrogen concentration in zircaloy-4 alloys using laser-induced plasma breakdown spectrometry. After laser irradiation, a heat-affected zone extending about 3 µm down from the top surface can be seen. The depth of this heat-affected zone is independent of the laser power density in the range 10⁸ to 10⁹ W/cm². In order to obtain the depth profile of the hydrogen concentration in zircaloy-4 alloys, the power density of laser shots must be greater than 1.3×10⁹ W/cm².

  18. Hydrogen Data Book from the Hydrogen Analysis Resource Center

    DOE Data Explorer

The Hydrogen Data Book contains a wide range of factual information on hydrogen and fuel cells (e.g., hydrogen properties, hydrogen production and delivery data, and information on fuel cells and fuel cell vehicles), and it also provides other data that might be useful in analyses of hydrogen infrastructure in the United States (e.g., demographic data and data on energy supply and/or infrastructure). It's made available from the Hydrogen Analysis Resource Center along with a wealth of related information. The related information includes guidelines for DOE Hydrogen Program Analysis, various calculator tools, a hydrogen glossary, related websites, and analysis tools relevant to hydrogen and fuel cells. [From http://hydrogen.pnl.gov/cocoon/morf/hydrogen

  19. Quantitative intracerebral brain hemorrhage analysis

    NASA Astrophysics Data System (ADS)

    Loncaric, Sven; Dhawan, Atam P.; Cosic, Dubravko; Kovacevic, Domagoj; Broderick, Joseph; Brott, Thomas

    1999-05-01

In this paper, a system for 3-D quantitative analysis of human spontaneous intracerebral brain hemorrhage (ICH) is described. The purpose of the developed system is to perform quantitative 3-D measurements of the parameters of the ICH region from computed tomography (CT) images. The parameter measured in this phase of the system's development is the volume of the hemorrhage region. The goal of the project is to measure these parameters for a large number of patients having ICH and to correlate the measured parameters with patient morbidity and mortality.
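The core volume measurement in a system like this reduces to counting segmented voxels and scaling by the voxel size; a minimal sketch with a synthetic mask and hypothetical CT slice spacing:

```python
import numpy as np

# Sketch: hemorrhage volume from a binary 3-D segmentation mask.
# The mask and voxel spacings below are synthetic illustrations.

def region_volume_ml(mask, spacing_mm):
    """Volume (mL) of a binary 3-D mask given voxel spacing
    (dz, dy, dx) in millimetres; 1 mL = 1000 mm^3."""
    voxel_mm3 = spacing_mm[0] * spacing_mm[1] * spacing_mm[2]
    return mask.sum() * voxel_mm3 / 1000.0

# Synthetic mask: a 20x20x20-voxel block flagged as hemorrhage
mask = np.zeros((64, 64, 64), dtype=bool)
mask[10:30, 10:30, 10:30] = True

vol = region_volume_ml(mask, spacing_mm=(5.0, 0.5, 0.5))
print(f"hemorrhage volume: {vol:.1f} mL")
```

In practice the hard part is the segmentation itself; once a mask exists, the volume is this one multiply-and-sum.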

  20. Software for quantitative trait analysis

    PubMed Central

    2005-01-01

    This paper provides a brief overview of software currently available for the genetic analysis of quantitative traits in humans. Programs that implement variance components, Markov Chain Monte Carlo (MCMC), Haseman-Elston (H-E) and penetrance model-based linkage analyses are discussed, as are programs for measured genotype association analyses and quantitative trait transmission disequilibrium tests. The software compared includes LINKAGE, FASTLINK, PAP, SOLAR, SEGPATH, ACT, Mx, MERLIN, GENEHUNTER, Loki, Mendel, SAGE, QTDT and FBAT. Where possible, the paper provides URLs for acquiring these programs through the internet, details of the platforms for which the software is available and the types of analyses performed. PMID:16197737

  1. Image analysis and quantitative morphology.

    PubMed

    Mandarim-de-Lacerda, Carlos Alberto; Fernandes-Santos, Caroline; Aguila, Marcia Barbosa

    2010-01-01

    Quantitative studies are increasingly found in the literature, particularly in the fields of development/evolution, pathology, and neurosciences. Image digitalization converts tissue images into a numeric form by dividing them into very small regions termed picture elements or pixels. Image analysis allows automatic morphometry of digitalized images, and stereology aims to understand the structural inner three-dimensional arrangement based on the analysis of slices showing two-dimensional information. To quantify morphological structures in an unbiased and reproducible manner, appropriate isotropic and uniform random sampling of sections, and updated stereological tools are needed. Through the correct use of stereology, a quantitative study can be performed with little effort; efficiency in stereology means as little counting as possible (little work), low cost (section preparation), but still good accuracy. This short text provides a background guide for non-expert morphologists. PMID:19960334

  2. Technical Analysis of Hydrogen Production

    SciTech Connect

    Ali T-Raissi

    2005-01-14

The aim of this work was to assess issues of cost and performance associated with the production and storage of hydrogen via the following three feedstocks: sub-quality natural gas (SQNG), ammonia (NH₃), and water. Three technology areas were considered: (1) hydrogen production utilizing SQNG resources, (2) hydrogen storage in ammonia and amine-borane complexes for fuel cell applications, and (3) hydrogen from solar thermochemical cycles for splitting water. This report summarizes our findings with the following objectives: technoeconomic analysis of the feasibility of technology areas 1-3; evaluation of the hydrogen production cost for technology area 1; and feasibility of ammonia and/or amine-borane complexes (technology area 2) as a means of hydrogen storage on board fuel-cell-powered vehicles. For each technology area, we reviewed the open literature with respect to the following criteria: process efficiency, cost, safety, ease of implementation, and impact of the latest materials innovations, if any. We employed various process analysis platforms, including FactSage chemical equilibrium software and Aspen Technology's AspenPlus and HYSYS chemical process simulation programs, to determine the performance of the prospective hydrogen production processes.

  3. Bioimaging for quantitative phenotype analysis.

    PubMed

    Chen, Weiyang; Xia, Xian; Huang, Yi; Chen, Xingwei; Han, Jing-Dong J

    2016-06-01

    With the development of bio-imaging techniques, an increasing number of studies apply these techniques to generate a myriad of image data. Its applications range from quantification of cellular, tissue, organismal and behavioral phenotypes of model organisms, to human facial phenotypes. The bio-imaging approaches to automatically detect, quantify, and profile phenotypic changes related to specific biological questions open new doors to studying phenotype-genotype associations and to precisely evaluating molecular changes associated with quantitative phenotypes. Here, we review major applications of bioimage-based quantitative phenotype analysis. Specifically, we describe the biological questions and experimental needs addressable by these analyses, computational techniques and tools that are available in these contexts, and the new perspectives on phenotype-genotype association uncovered by such analyses. PMID:26850283

  4. Optimization of quantitative infrared analysis

    NASA Astrophysics Data System (ADS)

    Duerst, Richard W.; Breneman, W. E.; Dittmar, Rebecca M.; Drugge, Richard E.; Gagnon, Jim E.; Pranis, Robert A.; Spicer, Colleen K.; Stebbings, William L.; Westberg, J. W.; Duerst, Marilyn D.

    1994-01-01

A number of industrial processes, especially quality assurance procedures, accept information on relative quantities of components in mixtures whenever absolute values for the quantitative analysis are unavailable. These relative quantities may be determined from infrared intensity ratios even though known standards are unavailable. Repeatability (vs. precision) in quantitative analysis is a critical parameter for meaningful results. In any given analysis, multiple runs provide "answers" with a certain standard deviation; obviously, the lower the standard deviation, the better the precision. In attempting to minimize the standard deviation and thus improve precision, we need to delineate which contributing factors we have control over (such as sample preparation techniques and data analysis methodology) and which factors we have little control over (environmental and instrument noise, for example). For a given set of conditions, the best instrumental precision achievable on an IR instrument should be determinable. Traditionally, the term "signal-to-noise" (S/N) has been used for a single spectrum, realizing that S/N improves with an increase in the number of scans coadded for generation of that single spectrum. However, the S/N ratio does not directly reflect the precision achievable for an absorbing band. We prefer the phrase "maximum achievable instrument precision" (MAIP), which is equivalent to the minimum relative standard deviation for a given peak (either height or area) in the spectra. For a specific analysis, the analyst should have in mind the desired precision; only if the desired precision is less stringent than the MAIP will the analysis be feasible. Once the MAIP is established, other experimental procedures may be modified to improve the analytical precision if it falls short of expectations (the MAIP).
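The MAIP as defined above is just the minimum relative standard deviation of a peak measurement over replicate spectra; a sketch of that estimate, with synthetic replicate peak areas standing in for real spectra:

```python
import numpy as np

# Sketch: estimating a precision floor in the spirit of the MAIP -
# the relative standard deviation of a peak area over replicate
# spectra of one sample. The peak areas below are synthetic.

def relative_std_percent(values):
    """Sample relative standard deviation, in percent."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

# Hypothetical peak areas from ten back-to-back spectra of one sample
peak_areas = [10.02, 9.98, 10.05, 9.97, 10.01,
              10.03, 9.99, 10.00, 10.04, 9.96]

maip = relative_std_percent(peak_areas)
print(f"relative standard deviation: {maip:.2f}%")
```

An analysis is feasible, in the authors' terms, only if the precision it demands is coarser than this instrument-limited floor.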

  5. Automated quantitative analysis for pneumoconiosis

    NASA Astrophysics Data System (ADS)

    Kondo, Hiroshi; Zhao, Bin; Mino, Masako

    1998-09-01

Automated quantitative analysis for pneumoconiosis is presented. In this paper, Japanese standard radiographs of pneumoconiosis are categorized by measuring the area density and number density of small rounded opacities, and the size and shape of the opacities are classified from measurements of the equivalent radius of each opacity. The proposed method includes a bi-level unsharp masking filter with a 1D uniform impulse response in order to eliminate undesired structures such as the images of blood vessels and ribs in the chest X-ray. Fuzzy contrast enhancement is also introduced for easy and exact detection of small rounded opacities. Many simulation examples show that the proposed method is more reliable than the former method.

  6. A Quantitative Fitness Analysis Workflow

    PubMed Central

    Lydall, D.A.

    2012-01-01

Quantitative Fitness Analysis (QFA) is an experimental and computational workflow for comparing fitnesses of microbial cultures grown in parallel(1,2,3,4). QFA can be applied to focused observations of single cultures but is most useful for genome-wide genetic interaction or drug screens investigating up to thousands of independent cultures. The central experimental method is the inoculation of independent, dilute liquid microbial cultures onto solid agar plates which are incubated and regularly photographed. Photographs from each time-point are analyzed, producing quantitative cell density estimates, which are used to construct growth curves, allowing quantitative fitness measures to be derived. Culture fitnesses can be compared to quantify and rank genetic interaction strengths or drug sensitivities. The effect on culture fitness of any treatments added into substrate agar (e.g. small molecules, antibiotics or nutrients) or applied to plates externally (e.g. UV irradiation, temperature) can be quantified by QFA. The QFA workflow produces growth rate estimates analogous to those obtained by spectrophotometric measurement of parallel liquid cultures in 96-well or 200-well plate readers. Importantly, QFA has significantly higher throughput compared with such methods. QFA cultures grow on a solid agar surface and are therefore well aerated during growth without the need for stirring or shaking. QFA throughput is not as high as that of some Synthetic Genetic Array (SGA) screening methods(5,6). However, since QFA cultures are heavily diluted before being inoculated onto agar, QFA can capture more complete growth curves, including exponential and saturation phases(3). For example, growth curve observations allow culture doubling times to be estimated directly with high precision, as discussed previously(1). Here we present a specific QFA protocol applied to thousands of S. cerevisiae cultures which are automatically handled by robots during inoculation, incubation and imaging
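The doubling-time estimate mentioned above can be sketched as a log-linear fit to the exponential phase of a growth curve; the time points and cell-density values below are synthetic, not QFA data:

```python
import numpy as np

# Sketch: culture doubling time from exponential-phase growth data,
# by fitting log(density) against time. All data here are synthetic.

def doubling_time(times_h, densities):
    """Doubling time (h) from an exponential-phase growth curve."""
    slope, _ = np.polyfit(times_h, np.log(densities), 1)  # rate, 1/h
    return np.log(2.0) / slope

times = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
dens = 0.01 * 2.0 ** (times / 2.5)  # synthetic: true doubling time 2.5 h

dt = doubling_time(times, dens)
print(f"estimated doubling time: {dt:.2f} h")
```

Fitting a full logistic model additionally captures the saturation phase that QFA's dilute inoculation makes observable, but the exponential-phase fit is the core of the doubling-time measure.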

  7. A quantitative fitness analysis workflow.

    PubMed

    Banks, A P; Lawless, C; Lydall, D A

    2012-01-01

    Quantitative Fitness Analysis (QFA) is an experimental and computational workflow for comparing fitnesses of microbial cultures grown in parallel(1,2,3,4). QFA can be applied to focused observations of single cultures but is most useful for genome-wide genetic interaction or drug screens investigating up to thousands of independent cultures. The central experimental method is the inoculation of independent, dilute liquid microbial cultures onto solid agar plates which are incubated and regularly photographed. Photographs from each time-point are analyzed, producing quantitative cell density estimates, which are used to construct growth curves, allowing quantitative fitness measures to be derived. Culture fitnesses can be compared to quantify and rank genetic interaction strengths or drug sensitivities. The effect on culture fitness of any treatments added into substrate agar (e.g. small molecules, antibiotics or nutrients) or applied to plates externally (e.g. UV irradiation, temperature) can be quantified by QFA. The QFA workflow produces growth rate estimates analogous to those obtained by spectrophotometric measurement of parallel liquid cultures in 96-well or 200-well plate readers. Importantly, QFA has significantly higher throughput compared with such methods. QFA cultures grow on a solid agar surface and are therefore well aerated during growth without the need for stirring or shaking. QFA throughput is not as high as that of some Synthetic Genetic Array (SGA) screening methods(5,6). However, since QFA cultures are heavily diluted before being inoculated onto agar, QFA can capture more complete growth curves, including exponential and saturation phases(3). For example, growth curve observations allow culture doubling times to be estimated directly with high precision, as discussed previously(1). Here we present a specific QFA protocol applied to thousands of S. 
cerevisiae cultures which are automatically handled by robots during inoculation, incubation and imaging.
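The high-precision doubling-time estimation mentioned in the abstract amounts to a log-linear fit over the exponential phase of a growth curve. A minimal sketch in Python, using synthetic data with a hypothetical growth rate rather than real QFA photograph-derived densities:

```python
import numpy as np

def doubling_time(hours, densities):
    """Estimate culture doubling time (h) from exponential-phase
    cell-density estimates via a log-linear least-squares fit."""
    slope, _ = np.polyfit(hours, np.log(densities), 1)
    return np.log(2) / slope

# Synthetic exponential-phase data with an assumed true rate of 0.35/h
t = np.linspace(0, 10, 20)
d = 0.01 * np.exp(0.35 * t)
print(round(doubling_time(t, d), 3))  # ln(2)/0.35 ≈ 1.98 h
```

In a real QFA analysis the fit would be restricted to time-points within the exponential phase of the logistic growth curve.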

  8. Quantitative analysis of endogenous compounds.

    PubMed

    Thakare, Rhishikesh; Chhonker, Yashpal S; Gautam, Nagsen; Alamoudi, Jawaher Abdullah; Alnouti, Yazen

    2016-09-01

    Accurate quantitative analysis of endogenous analytes is essential for several clinical and non-clinical applications. LC-MS/MS is the technique of choice for quantitative analyses. Absolute quantification by LC/MS requires preparing standard curves in the same matrix as the study samples so that the matrix effect and the extraction efficiency for analytes are the same in both the standard and study samples. However, by definition, analyte-free biological matrices do not exist for endogenous compounds. To address the lack of blank matrices for the quantification of endogenous compounds by LC-MS/MS, four approaches are used, including the standard addition, the background subtraction, the surrogate matrix, and the surrogate analyte methods. This review article presents an overview of these approaches, cites and summarizes their applications, and compares their advantages and disadvantages. In addition, we discuss in detail the validation requirements and compatibility with FDA guidelines to ensure method reliability in quantifying endogenous compounds. The standard addition, background subtraction, and the surrogate analyte approaches allow the use of the same matrix for the calibration curve as the one to be analyzed in the test samples. However, in the surrogate matrix approach, various matrices such as artificial, stripped, and neat matrices are used as surrogate matrices for the actual matrix of study samples. For the surrogate analyte approach, it is required to demonstrate similarity in matrix effect and recovery between surrogate and authentic endogenous analytes. Similarly, for the surrogate matrix approach, it is required to demonstrate similar matrix effect and extraction recovery in both the surrogate and original matrices. All these methods represent indirect approaches to quantify endogenous compounds and, regardless of what approach is followed, it has to be shown that none of the validation criteria have been compromised due to the indirect analyses.
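Of the four approaches listed, standard addition reduces to a line fit and an extrapolation: the endogenous concentration is the magnitude of the x-intercept of response versus added concentration. A sketch with hypothetical numbers (not drawn from any validated assay):

```python
import numpy as np

def standard_addition(spiked_conc, responses):
    """Estimate the endogenous analyte concentration by the method of
    standard addition: fit response vs. added concentration and
    extrapolate to the x-intercept (intercept / slope)."""
    slope, intercept = np.polyfit(spiked_conc, responses, 1)
    return intercept / slope

# Hypothetical spike levels (ng/mL) and detector responses generated
# from an assumed endogenous level of 5 ng/mL and response factor 2.0
added = np.array([0.0, 2.5, 5.0, 10.0])
resp = 2.0 * (added + 5.0)
print(standard_addition(added, resp))  # ≈ 5.0 ng/mL
```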

  9. Quantitative analysis of sandstone porosity

    SciTech Connect

    Ferrell, R.E. Jr.; Carpenter, P.K.

    1988-01-01

    A quantitative analysis of changes in porosity associated with sandstone diagenesis was accomplished with digital back-scattered electron image analysis techniques. The volume percent (vol. %) of macroporosity, quartz, clay minerals, feldspar, and other constituents combined with stereological parameters, such as the size and shape of the analyzed features, permitted the determination of cement volumes, the ratio of primary to secondary porosity, and the relative abundance of detrital and authigenic clay minerals. The analyses were produced with a JEOL 733 Superprobe and a TRACOR/NORTHERN 5700 Image Analyzer System. The results provided a numerical evaluation of sedimentological facies controls and diagenetic effects on the permeabilities of potential reservoirs. In a typical application, subtle differences in the diagenetic development of porosity were detected in Wilcox sandstones from central Louisiana. Mechanical compaction of these shoreface sandstones has reduced the porosity to approximately 20%. In most samples with permeabilities greater than 10 md, the measured ratio of macroporosity to microporosity associated with pore-filling kaolinite was 3:1. In other sandstones with lower permeabilities, the measured ratio was higher, but the volume of pore-filling clay was essentially the same. An analysis of the frequency distribution of pore diameters and shapes revealed that the latter samples contained 2-3 vol% of grain-dissolution or moldic porosity. Fluid entry to these large pores was restricted and the clays produced from the grain dissolution products reduced the observed permeability. The image analysis technique provided valuable data for the distinction of productive and nonproductive intervals in this reservoir.
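The vol.% figures described above come from counting classified pixels in back-scattered electron images. A schematic sketch, assuming a hypothetical pre-classified label map (the phase proportions are invented, loosely echoing the ~20% porosity figure) rather than real JEOL/TRACOR output:

```python
import numpy as np

# Hypothetical phase labels for a classified BSE image:
# 0 = macroporosity, 1 = quartz, 2 = clay minerals, 3 = feldspar
rng = np.random.default_rng(0)
labels = rng.choice(4, size=(200, 200), p=[0.20, 0.55, 0.15, 0.10])

def phase_fractions(label_img, n_phases):
    """Volume percent of each phase from pixel (point-count) totals."""
    counts = np.bincount(label_img.ravel(), minlength=n_phases)
    return 100.0 * counts / label_img.size

vol = phase_fractions(labels, 4)
print(vol.round(1))  # roughly [20, 55, 15, 10] vol.%
```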

  10. Quantitative Analysis of Glaciated Landscapes

    NASA Astrophysics Data System (ADS)

    Huerta, A. D.

    2005-12-01

    The evolution of glaciated mountains is at the heart of the debate over Late Cenozoic linkages between climate and tectonics. Traditionally, the development of high summit elevations is attributed to tectonic processes. However, much of the high elevation of the Transantarctic Mountains can be attributed solely to uplift in response to glacial erosion (Stern et al., 2005). The Transantarctic Mountains (TAM) provide an unparalleled opportunity to study glacial erosion. The mountain range has experienced glacial conditions since Oligocene time. In the higher and drier regions of the TAM there is only a thin veneer of ice and snow draping the topography. In these regions landforms that were shaped during earlier climatic conditions are preserved. In fact, both glacial and fluvial landforms dating as far back as 18 Ma are preserved locally. In addition, the TAM are ideal for studying glacial erosion since the range has experienced minimal tectonic uplift since late Oligocene time, thus isolating the erosion signal from any tectonic signal. With the advent of digital data sets and GIS methodologies, quantitative analysis can identify key aspects of glaciated landscape morphology, and thus develop powerful analytical techniques for objective study of glaciation. Inspection of USGS topographic maps of the TAM reveals that mountain tops display an extreme range of glacial modification. For example, in the Mt. Rabot region (83°-84° S), mountain peaks are strongly affected by glaciation; cirque development is advanced, with cirque diameters on the order of several kilometers, and cirque confluence has resulted in the formation of "knife-edge" arêtes up to 10 km long. In contrast, in the Mt. Murchison area (73°-74° S) cirque development is youthful, and there is minimal development of arêtes. Preliminary work indicates that analysis of DEMs and contour lines can be used to distinguish degree of glaciation. In particular, slope, curvature, and power spectrum analysis
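The slope and curvature metrics proposed for distinguishing degree of glaciation can be computed from a DEM by centered finite differences. A sketch on a synthetic surface (the grid spacing and test function are illustrative assumptions, not TAM data):

```python
import numpy as np

def slope_and_curvature(dem, spacing=1.0):
    """Gradient magnitude (slope) and Laplacian (curvature) of a DEM
    computed with centered finite differences."""
    gy, gx = np.gradient(dem, spacing)
    slope = np.hypot(gx, gy)
    gyy = np.gradient(gy, spacing, axis=0)
    gxx = np.gradient(gx, spacing, axis=1)
    return slope, gxx + gyy

# Synthetic paraboloid z = x^2 + y^2: slope grows radially outward,
# and the Laplacian curvature is a constant 4 everywhere
x = np.linspace(-5, 5, 101)
X, Y = np.meshgrid(x, x)
slope, curv = slope_and_curvature(X**2 + Y**2, spacing=x[1] - x[0])
print(round(curv[50, 50], 2))  # ≈ 4.0
```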

  11. Quantitative analysis of retinal OCT.

    PubMed

    Sonka, Milan; Abràmoff, Michael D

    2016-10-01

    Clinical acceptance of 3-D OCT retinal imaging brought rapid development of quantitative 3-D analysis of retinal layers, vasculature, and retinal lesions, and facilitated new research in retinal diseases. One of the cornerstones of many such analyses is segmentation and thickness quantification of retinal layers and the choroid, with an inherently 3-D simultaneous multi-layer LOGISMOS (Layered Optimal Graph Image Segmentation for Multiple Objects and Surfaces) segmentation approach being extremely well suited for the task. Once retinal layers are segmented, regional thickness, brightness, or texture-based indices of individual layers can be easily determined and thus contribute to our understanding of retinal or optic nerve head (ONH) disease processes, and can be employed for determination of disease status, treatment responses, visual function, etc. Out of many applications, examples provided in this paper focus on image-guided therapy and outcome prediction in age-related macular degeneration and on assessing visual function from retinal layer structure in glaucoma. PMID:27503080

  12. Task D: Hydrogen safety analysis

    SciTech Connect

    Swain, M.R.; Sievert, B.G.; Swain, M.N.

    1996-10-01

    This report covers two topics. The first is a review of codes, standards, regulations, recommendations, certifications, and pamphlets which address safety of gaseous fuels. The second is an experimental investigation of hydrogen flame impingement. Four areas of concern in the conversion of natural gas safety publications to hydrogen safety publications are delineated. Two suggested design criteria for hydrogen vehicle fuel systems are proposed. It is concluded from the experimental work that lightweight, low-cost firewalls to resist hydrogen flame impingement are feasible.

  13. Quantitative Analysis of Face Symmetry.

    PubMed

    Tamir, Abraham

    2015-06-01

    The major objective of this article was to report quantitatively the degree of human face symmetry for images taken from the Internet. From the original image of a certain person that appears in the center of each triplet, 2 symmetric combinations were constructed that are based on the left part of the image and its mirror image (left-left) and on the right part of the image and its mirror image (right-right). By applying computer software that enables determination of the length, surface area, and perimeter of any geometric shape, the following measurements were obtained for each triplet: face perimeter and area; distance between the pupils; mouth length; its perimeter and area; nose length and face length, usually below the ears; as well as the area and perimeter of the pupils. Then, for each of the above measurements, the value C, which characterizes the degree of symmetry of the real image with respect to the combinations right-right and left-left, was calculated. C appears on the right-hand side below each image. A high value of C indicates low symmetry, and as the value decreases, the symmetry increases. The magnitude on the left relates to the pupils and compares the difference between the area and perimeter of the 2 pupils. The major conclusion arrived at here is that the human face is asymmetric to some degree; the degree of asymmetry is reported quantitatively under each portrait. PMID:26080172

  14. Quantitative analysis of digital microscope images.

    PubMed

    Wolf, David E; Samarasekera, Champika; Swedlow, Jason R

    2013-01-01

    This chapter discusses quantitative analysis of digital microscope images and presents several exercises to provide examples to explain the concept. This chapter also presents the basic concepts in quantitative analysis for imaging, but these concepts rest on a well-established foundation of signal theory and quantitative data analysis. This chapter presents several examples for understanding the imaging process as a transformation from sample to image and the limits and considerations of quantitative analysis. This chapter introduces the concept of digitally correcting images and also focuses on some of the more critical types of data transformation and some of the frequently encountered issues in quantization. Image processing represents a form of data processing. There are many examples of data processing, such as fitting the data to a theoretical curve. In all these cases, it is critical that care is taken during all steps of transformation, processing, and quantization. PMID:23931513

  15. Quantitative analysis of qualitative images

    NASA Astrophysics Data System (ADS)

    Hockney, David; Falco, Charles M.

    2005-03-01

    We show optical evidence demonstrating that artists as early as Jan van Eyck and Robert Campin (c1425) used optical projections as aids for producing their paintings. We also have found optical evidence within works by later artists, including Bermejo (c1475), Lotto (c1525), Caravaggio (c1600), de la Tour (c1650), Chardin (c1750) and Ingres (c1825), demonstrating a continuum in the use of optical projections by artists, along with an evolution in the sophistication of that use. However, even for paintings where we have been able to extract unambiguous, quantitative evidence of the direct use of optical projections for producing certain of the features, this does not mean that paintings are effectively photographs. Because the hand and mind of the artist are intimately involved in the creation process, understanding these complex images requires more than can be obtained from only applying the equations of geometrical optics.

  16. Quantitation of glycerophosphorylcholine by flow injection analysis using immobilized enzymes.

    PubMed

    Mancini, A; Del Rosso, F; Roberti, R; Caligiana, P; Vecchini, A; Binaglia, L

    1996-09-20

    A method for quantitating glycerophosphorylcholine by flow injection analysis is reported in the present paper. Glycerophosphorylcholine phosphodiesterase and choline oxidase, immobilized on controlled porosity glass beads, are packed in a small reactor inserted in a flow injection manifold. When samples containing glycerophosphorylcholine are injected, glycerophosphorylcholine is hydrolyzed into choline and sn-glycerol-3-phosphate. The free choline produced in this reaction is oxidized to betaine and hydrogen peroxide. Hydrogen peroxide is detected amperometrically. Quantitation of glycerophosphorylcholine in samples containing choline and phosphorylcholine is obtained by inserting, ahead of the reactor, a small column packed with a mixed bed ion exchange resin. The time needed for each determination does not exceed one minute. The present method, applied to quantitate glycerophosphorylcholine in samples of seminal plasma, gave results comparable with those obtained using the standard enzymatic-spectrophotometric procedure. An alternative procedure, making use of co-immobilized glycerophosphorylcholine phosphodiesterase and glycerol-3-phosphate oxidase for quantitating glycerophosphorylcholine, glycerophosphorylethanolamine and glycerophosphorylserine, is also described. PMID:8905629

  17. Qualitative and Quantitative Analysis: Interpretation of Electropherograms

    NASA Astrophysics Data System (ADS)

    Szumski, Michał; Buszewski, Bogusław

    In this chapter the basic information on qualitative and quantitative analysis in CE is provided. Migration time and spectral data are described as the most important parameters used for identification of compounds. The parameters that negatively influence qualitative analysis are briefly mentioned. In the quantitative analysis section the external standard and internal standard calibration methods are described. Variables influencing peak height and peak area in capillary electrophoresis are briefly summarized. Also, a discussion of electrodispersion and its influence on an observed peak shape is provided.
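The internal standard calibration mentioned above normalizes the analyte peak area by the internal standard's area before applying a response factor, compensating for injection variability. A minimal sketch; the peak areas, concentrations, and response factor below are hypothetical:

```python
def internal_standard_conc(a_analyte, a_is, c_is, rf):
    """Analyte concentration from peak areas via an internal standard:
    C = (A_analyte / A_IS) * C_IS / RF, where the response factor RF
    is determined beforehand from calibration standards."""
    return (a_analyte / a_is) * c_is / rf

# Calibration standard: RF = (A_a / A_is) / (C_a / C_is)
rf = (150.0 / 100.0) / (30.0 / 20.0)   # -> 1.0 for these made-up areas
print(internal_standard_conc(120.0, 100.0, 20.0, rf))  # 24.0 (same units as C_IS)
```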

  18. Quantitative histogram analysis of images

    NASA Astrophysics Data System (ADS)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

    A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed. Program summary. Title of program: HAWGC. Catalogue identifier: ADXG_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computers: Mobile Intel Pentium III, AMD Duron. Installations: no installation necessary; executable file together with necessary files for the LabVIEW Run-time engine. Operating systems or monitors under which the program has been tested: Windows ME/2000/XP. Programming language used: LabVIEW 7.0. Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for
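The histogram parameters listed (average brightness, standard deviation, variance, extrema, skewness, kurtosis, median) can be reproduced in a few lines of NumPy. This is a sketch of the same calculations on a synthetic image, not the LabVIEW program itself:

```python
import numpy as np

def histogram_stats(grey):
    """Summary statistics of a greyscale image, mirroring the
    histogram parameters computed by the routine described above."""
    g = np.asarray(grey, dtype=float).ravel()
    mean, std = g.mean(), g.std()
    z = (g - mean) / std
    return {
        "mean": mean, "std": std, "var": g.var(),
        "min": g.min(), "max": g.max(),
        "skewness": np.mean(z**3),
        "kurtosis": np.mean(z**4) - 3.0,   # excess kurtosis
        "median": np.median(g),
    }

# Synthetic "image": Gaussian brightness around 128 with spread 20
rng = np.random.default_rng(1)
img = rng.normal(128, 20, size=(64, 64))
s = histogram_stats(img)
print(f"mean={s['mean']:.1f} skew={s['skewness']:.2f}")  # mean near 128, skew near 0
```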

  19. Mobile app-based quantitative scanometric analysis.

    PubMed

    Wong, Jessica X H; Liu, Frank S F; Yu, Hua-Zhong

    2014-12-16

    The feasibility of using smartphones and other mobile devices as the detection platform for quantitative scanometric assays is demonstrated. The different scanning modes (color, grayscale, black/white) and grayscale converting protocols (average, weighted average/luminosity, and software specific) have been compared in determining the optical darkness ratio (ODR) values, a conventional quantitation measure for scanometric assays. A mobile app was developed to image and analyze scanometric assays, as demonstrated by paper-printed tests and a biotin-streptavidin assay on a plastic substrate. Primarily for ODR analysis, the app has been shown to perform as well as a traditional desktop scanner, confirming that smartphones (and other mobile devices) promise to be a practical platform for accurate, quantitative chemical analysis and medical diagnostics. PMID:25420202
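The two grayscale conversion protocols compared above (plain average versus weighted average/luminosity) differ only in their channel weights. The sketch below illustrates both plus a darkness ratio; the luminosity weights (BT.601-style) and the ODR formula (spot darkness relative to background) are assumptions for illustration, since the abstract does not define them:

```python
import numpy as np

def to_grey(rgb, mode="luminosity"):
    """Convert an RGB image to greyscale by plain averaging or by a
    common luminosity weighting (assumed BT.601-style weights)."""
    weights = {"average": (1/3, 1/3, 1/3),
               "luminosity": (0.299, 0.587, 0.114)}[mode]
    return rgb @ np.array(weights)

# Hypothetical 2x2 RGB patch, channel values in [0, 255]
rgb = np.array([[[200, 100, 50], [10, 10, 10]],
                [[255, 255, 255], [0, 0, 0]]], dtype=float)
grey = to_grey(rgb, "average")

# Assumed darkness ratio: spot darkness relative to a background pixel
background, spot = grey[1, 0], grey[0, 1]
odr = (background - spot) / background
print(round(grey[1, 0]), round(odr, 2))  # 255 0.96
```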

  20. Quantitative WDS analysis using electron probe microanalyzer

    SciTech Connect

    Ul-Hamid, Anwar (E-mail: anwar@kfupm.edu.sa); Tawancy, Hani M.; Mohammed, Abdul-Rashid I.; Al-Jaroudi, Said S.; Abbas, Nureddin M.

    2006-04-15

    In this paper, the procedure for conducting quantitative elemental analysis by the ZAF correction method using wavelength dispersive X-ray spectroscopy (WDS) in an electron probe microanalyzer (EPMA) is elaborated. Analysis of a thermal barrier coating (TBC) system formed on a Ni-based single crystal superalloy is presented as an example to illustrate the analysis of samples consisting of a large number of major and minor elements. The analysis was performed using known standards and measured peak-to-background intensity ratios. The procedure for using a separate set of acquisition conditions for major and minor element analysis is explained and its importance is stressed.
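Schematically, the ZAF quantification step multiplies the measured intensity k-ratio (sample versus standard) by the standard's concentration and the atomic-number (Z), absorption (A), and fluorescence (F) correction factors. A sketch with hypothetical factor values; in practice the Z, A, and F terms come from matrix-correction models, not fixed constants:

```python
def zaf_concentration(k_ratio, c_standard, z, a, f):
    """Schematic ZAF quantification: measured k-ratio times the
    standard's concentration, corrected for atomic number (Z),
    absorption (A) and fluorescence (F) effects."""
    return k_ratio * c_standard * z * a * f

# Hypothetical values for one element, pure-element standard (100 wt%)
print(round(zaf_concentration(0.25, 100.0, 1.05, 1.10, 0.98), 1))  # ≈ 28.3 wt%
```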

  1. STS-35 scrub 3 hydrogen leak analysis

    NASA Technical Reports Server (NTRS)

    Seymour, Dave

    1991-01-01

    During the summer of 1990, space shuttle Columbia experienced both an external tank/orbiter disconnect hydrogen leak and multiple internal aft compartment hydrogen leaks. After the third scrub of STS-35, a leak investigation team was organized. In support of this team, an analysis of the data obtained during scrub 3 was performed. Based on this analysis, the engine 2 prevalve was concluded to be the most likely leak location and to account for most of the observed leakage.

  2. Seniors' Online Communities: A Quantitative Content Analysis

    ERIC Educational Resources Information Center

    Nimrod, Galit

    2010-01-01

    Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…

  3. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  4. Quantitative Proteomics Analysis of Leukemia Cells.

    PubMed

    Halbach, Sebastian; Dengjel, Jörn; Brummer, Tilman

    2016-01-01

    Chronic myeloid leukemia (CML) is driven by the oncogenic fusion kinase Bcr-Abl, which organizes its own signaling network with various proteins. These proteins, their interactions, and their role in relevant signaling pathways can be analyzed by quantitative mass spectrometry (MS) approaches in various model systems, e.g., in cell culture models. In this chapter, we describe in detail immunoprecipitations and quantitative proteomics analysis using stable isotope labeling by amino acids in cell culture (SILAC) of components of the Bcr-Abl signaling pathway in the human CML cell line K562. PMID:27581145

  5. Quantitative analysis of blood vessel geometry

    NASA Astrophysics Data System (ADS)

    Fuhrman, Michael G.; Abdul-Karim, Othman; Shah, Sujal; Gilbert, Steven G.; Van Bibber, Richard

    2001-07-01

    Re-narrowing or restenosis of a human coronary artery occurs within six months in one third of balloon angioplasty procedures. Accurate and repeatable quantitative analysis of vessel shape is important to characterize the progression and type of restenosis, and to evaluate effects new therapies might have. A combination of complicated geometry and image variability, and the need for high resolution and large image size makes visual/manual analysis slow, difficult, and prone to error. The image processing and analysis described here was developed to automate feature extraction of the lumen, internal elastic lamina, neointima, external elastic lamina, and tunica adventitia and to enable an objective, quantitative definition of blood vessel geometry. The quantitative geometrical analysis enables the measurement of several features including perimeter, area, and other metrics of vessel damage. Automation of feature extraction creates a high throughput capability that enables analysis of serial sections for more accurate measurement of restenosis dimensions. Measurement results are input into a relational database where they can be statistically analyzed and compared across studies. As part of the integrated process, results are also imprinted on the images themselves to facilitate auditing of the results. The analysis is fast, repeatable and accurate while allowing the pathologist to control the measurement process.
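Per-feature metrics such as area and perimeter follow directly from a traced boundary contour. A minimal sketch using the shoelace formula; the contour here is a hypothetical rectangle, not a real vessel boundary:

```python
import numpy as np

def polygon_area_perimeter(xy):
    """Area (shoelace formula) and perimeter of a closed contour
    given as an (N, 2) array of vertices in order."""
    x, y = xy[:, 0], xy[:, 1]
    area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
    edges = np.roll(xy, -1, axis=0) - xy
    perimeter = np.hypot(edges[:, 0], edges[:, 1]).sum()
    return area, perimeter

# Hypothetical lumen contour: a 10 x 5 rectangle
contour = np.array([[0, 0], [10, 0], [10, 5], [0, 5]], dtype=float)
area, perimeter = polygon_area_perimeter(contour)
print(area, perimeter)  # 50.0 30.0
```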

  6. Quantitative Tools for Dissection of Hydrogen-Producing Metabolic Networks-Final Report

    SciTech Connect

    Rabinowitz, Joshua D.; Dismukes, G. Charles; Rabitz, Herschel A.; Amador-Noguez, Daniel

    2012-10-19

    During this project we have pioneered the development of integrated experimental-computational technologies for the quantitative dissection of metabolism in hydrogen and biofuel producing microorganisms (i.e. C. acetobutylicum and various cyanobacteria species). The application of these new methodologies resulted in many significant advances in the understanding of the metabolic networks and metabolism of these organisms, and has provided new strategies to enhance their hydrogen or biofuel producing capabilities. As an example, using mass spectrometry, isotope tracers, and quantitative flux-modeling we mapped the metabolic network structure in C. acetobutylicum. This resulted in a comprehensive and quantitative understanding of central carbon metabolism that could not have been obtained using genomic data alone. We discovered that biofuel production in this bacterium, which only occurs during stationary phase, requires a global remodeling of central metabolism (involving large changes in metabolite concentrations and fluxes) that has the effect of redirecting resources (carbon and reducing power) from biomass production into solvent production. This new holistic, quantitative understanding of metabolism is now being used as the basis for metabolic engineering strategies to improve solvent production in this bacterium. In another example, making use of newly developed technologies for monitoring hydrogen and NAD(P)H levels in vivo, we dissected the metabolic pathways for photobiological hydrogen production by cyanobacteria Cyanothece sp. This investigation led to the identification of multiple targets for improving hydrogen production. Importantly, the quantitative tools and approaches that we have developed are broadly applicable and we are now using them to investigate other important biofuel producers, such as cellulolytic bacteria.
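The quantitative flux modeling mentioned above rests on steady-state mass balances: for each internal metabolite, production must equal consumption (S·v = 0). A toy illustration with a hypothetical three-reaction network, not the actual C. acetobutylicum model:

```python
import numpy as np

# Toy steady-state flux balance: A -> B (v1), B -> C (v2), B -> D (v3).
# The internal metabolite B must satisfy production = consumption, so
# its stoichiometric row over the fluxes [v1, v2, v3] is:
S_B = np.array([1.0, -1.0, -1.0])

# Suppose the uptake flux v1 and one product flux v2 are measured
# (e.g. from isotope-tracer data); the balance then fixes v3.
v1, v2 = 10.0, 6.0
v3 = v1 - v2   # from S_B @ [v1, v2, v3] = 0
assert np.isclose(S_B @ np.array([v1, v2, v3]), 0.0)
print(v3)  # 4.0
```

Genome-scale models extend this same balance to a full stoichiometric matrix with many metabolites and fluxes.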

  7. Comprehensive quantitative analysis on privacy leak behavior.

    PubMed

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046

  8. Comprehensive Quantitative Analysis on Privacy Leak Behavior

    PubMed Central

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046

  9. Quantitative Proteomic Analysis of the Human Nucleolus.

    PubMed

    Bensaddek, Dalila; Nicolas, Armel; Lamond, Angus I

    2016-01-01

    Recent years have witnessed spectacular progress in the field of mass spectrometry (MS)-based quantitative proteomics, including advances in instrumentation, chromatography, sample preparation methods, and experimental design for multidimensional analyses. It is now possible not only to identify most of the protein components of a cell proteome in a single experiment, but also to describe additional proteome dimensions, such as protein turnover rates, posttranslational modifications, and subcellular localization. Furthermore, by comparing the proteome at different time points, it is possible to create a "time-lapse" view of proteome dynamics. By combining high-throughput quantitative proteomics with detailed subcellular fractionation protocols and data analysis techniques it is also now possible to characterize in detail the proteomes of specific subcellular organelles, providing important insights into cell regulatory mechanisms and physiological responses. In this chapter we present a reliable workflow and protocol for MS-based analysis and quantitation of the proteome of nucleoli isolated from human cells. The protocol presented is based on a SILAC analysis of human MCF10A-Src-ER cells with analysis performed on a Q-Exactive Plus Orbitrap MS instrument (Thermo Fisher Scientific). The subsequent chapter describes how to process the resulting raw MS files from this experiment using MaxQuant software and data analysis procedures to evaluate the nucleolar proteome using customized R scripts. PMID:27576725

  10. Quantitative observations of hydrogen-induced, slow crack growth in a low alloy steel

    NASA Technical Reports Server (NTRS)

    Nelson, H. G.; Williams, D. P.

    1973-01-01

    Hydrogen-induced slow crack growth, da/dt, was studied in AISI-SAE 4130 low alloy steel in gaseous hydrogen and distilled water environments as a function of applied stress intensity, K, at various temperatures, hydrogen pressures, and alloy strength levels. At low values of K, da/dt was found to exhibit a strong exponential K dependence (Stage 1 growth) in both hydrogen and water. At intermediate values of K, da/dt exhibited a small but finite K dependence (Stage 2), with the Stage 2 slope being greater in hydrogen than in water. In hydrogen, at a constant K, (da/dt) sub 2 varied inversely with alloy strength level and varied essentially in the same complex manner with temperature and hydrogen pressure as noted previously. The results of this study provide support for most of the qualitative predictions of the lattice decohesion theory as recently modified by Oriani. The lack of quantitative agreement between data and theory and the inability of theory to explain the observed pressure dependence of slow crack growth are mentioned and possible rationalizations to account for these differences are presented.
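The "strong exponential K dependence" of Stage 1 growth described above is commonly written da/dt = A·exp(B·K), so A and B follow from a log-linear fit of measured growth rates. A sketch with synthetic values; the constants and units are illustrative, not the study's data:

```python
import numpy as np

# Stage 1 model: da/dt = A * exp(B * K). On a log scale this is a
# straight line, so A and B come from a log-linear least-squares fit.
K = np.array([20.0, 25.0, 30.0, 35.0])   # stress intensity, hypothetical units
dadt = 1e-9 * np.exp(0.3 * K)            # synthetic "measured" growth rates
B, lnA = np.polyfit(K, np.log(dadt), 1)
print(round(B, 3))  # recovers B ≈ 0.3 (and exp(lnA) ≈ 1e-9)
```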

  11. Quantitative image analysis of celiac disease

    PubMed Central

    Ciaccio, Edward J; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2015-01-01

We outline the use of quantitative techniques that are currently used for analysis of celiac disease. Image processing techniques can be useful to statistically analyze the pixel data of endoscopic images acquired with standard or videocapsule endoscopy. It is shown how current techniques have evolved to become more useful for gastroenterologists who seek to understand celiac disease and to screen for it in suspected patients. New directions for focus in the development of methodology for diagnosis and treatment of this disease are suggested. It is evident that there are yet broad areas where there is potential to expand the use of quantitative techniques for improved analysis in suspected or known celiac disease patients. PMID:25759524

  12. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity in modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
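    The prioritization idea can be sketched as a small scoring exercise: rank scenarios by a risk metric and discount those that are hard to model quantitatively. The scales, weights, and scenario values below are illustrative assumptions, not the paper's actual metric.

```python
from dataclasses import dataclass

@dataclass
class HazardScenario:
    name: str
    severity: int             # 1 (minor) .. 5 (catastrophic), assumed scale
    likelihood: int           # 1 (improbable) .. 5 (frequent), assumed scale
    modeling_difficulty: int  # 1 (easy to model) .. 5 (intractable), assumed scale

def priority(s: HazardScenario) -> float:
    """Higher risk and lower modeling difficulty make a scenario a better
    candidate for quantitative analysis (illustrative formula)."""
    risk = s.severity * s.likelihood
    return risk / s.modeling_difficulty

# Hypothetical scenarios loosely themed on parallel-runway operations
scenarios = [
    HazardScenario("wake encounter on parallel approach", 5, 3, 2),
    HazardScenario("runway incursion", 5, 2, 4),
    HazardScenario("comm frequency congestion", 2, 4, 1),
]

for s in sorted(scenarios, key=priority, reverse=True):
    print(f"{s.name}: priority {priority(s):.1f}")
```

    The point of the ranking is that a severe but nearly unmodelable scenario may be a worse target for quantitative analysis than a moderate, easily modeled one.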

  13. Quantitative resilience analysis through control design.

    SciTech Connect

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U. S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  14. Quantitative Bias Analysis in Regulatory Settings.

    PubMed

    Lash, Timothy L; Fox, Matthew P; Cooney, Darryl; Lu, Yun; Forshee, Richard A

    2016-07-01

    Nonrandomized studies are essential in the postmarket activities of the US Food and Drug Administration, which, however, must often act on the basis of imperfect data. Systematic errors can lead to inaccurate inferences, so it is critical to develop analytic methods that quantify uncertainty and bias and ensure that these methods are implemented when needed. "Quantitative bias analysis" is an overarching term for methods that estimate quantitatively the direction, magnitude, and uncertainty associated with systematic errors influencing measures of associations. The Food and Drug Administration sponsored a collaborative project to develop tools to better quantify the uncertainties associated with postmarket surveillance studies used in regulatory decision making. We have described the rationale, progress, and future directions of this project. PMID:27196652
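    One standard quantitative bias analysis is correcting an observed 2x2 exposure table for misclassification, given assumed classification sensitivity and specificity. The counts and validity parameters below are invented for illustration and are not from the FDA project described above.

```python
def corrected_counts(observed_exposed, observed_unexposed, se, sp):
    """Back-calculate true exposed/unexposed counts from observed counts,
    given exposure-classification sensitivity (se) and specificity (sp)."""
    n = observed_exposed + observed_unexposed
    true_exposed = (observed_exposed - n * (1 - sp)) / (se - (1 - sp))
    return true_exposed, n - true_exposed

# Hypothetical case-control data; assumed nondifferential se = 0.85, sp = 0.95
a, b = corrected_counts(215, 1285, 0.85, 0.95)   # cases
c, d = corrected_counts(668, 2832, 0.85, 0.95)   # controls

observed_or = (215 / 1285) / (668 / 2832)
corrected_or = (a / b) / (c / d)
print(f"observed OR {observed_or:.2f}, bias-corrected OR {corrected_or:.2f}")
```

    A full bias analysis would also propagate uncertainty in se and sp (e.g., by sampling them from distributions), which is the kind of tooling the project describes.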

  15. Accuracy in Quantitative 3D Image Analysis

    PubMed Central

    Bassel, George W.

    2015-01-01

    Quantitative 3D imaging is becoming an increasingly popular and powerful approach to investigate plant growth and development. With the increased use of 3D image analysis, standards to ensure the accuracy and reproducibility of these data are required. This commentary highlights how image acquisition and postprocessing can introduce artifacts into 3D image data and proposes steps to increase both the accuracy and reproducibility of these analyses. It is intended to aid researchers entering the field of 3D image processing of plant cells and tissues and to help general readers in understanding and evaluating such data. PMID:25804539

  16. Comparison of Hydrogen Sulfide Analysis Techniques

    ERIC Educational Resources Information Center

    Bethea, Robert M.

    1973-01-01

    A summary and critique of common methods of hydrogen sulfide analysis is presented. Procedures described are: reflectance from silver plates and lead acetate-coated tiles, lead acetate and mercuric chloride paper tapes, sodium nitroprusside and methylene blue wet chemical methods, infrared spectrophotometry, and gas chromatography. (BL)

  17. Guide for Hydrogen Hazards Analysis on Components and Systems

    NASA Technical Reports Server (NTRS)

    Beeson, Harold; Woods, Stephen

    2003-01-01

The physical and combustion properties of hydrogen give rise to hazards that must be considered when designing and operating a hydrogen system. One of the major concerns in the use of hydrogen is that of fire or detonation because of hydrogen's wide flammability range, low ignition energy, and flame speed. Other concerns include the contact and interaction of hydrogen with materials, such as the hydrogen embrittlement of materials and the formation of hydrogen hydrides. The low temperatures of liquid and slush hydrogen bring other concerns related to material compatibility and pressure control; this is especially important when dissimilar, adjoining materials are involved. The potential hazards arising from these properties and design features necessitate a proper hydrogen hazards analysis before introducing a material, component, or system into hydrogen service. The objective of this guide is to describe the NASA Johnson Space Center White Sands Test Facility hydrogen hazards analysis method that should be performed before hydrogen is used in components and/or systems. The method is consistent with standard practices for analyzing hazards. It is recommended that this analysis be made before implementing a hydrogen component qualification procedure. A hydrogen hazards analysis is a useful tool for hydrogen-system designers, system and safety engineers, and facility managers. A hydrogen hazards analysis can identify problem areas before hydrogen is introduced into a system, preventing damage to hardware, delay or loss of mission or objective, and possible injury or loss of life.

  18. Screening analysis of solar thermochemical hydrogen concepts.

    SciTech Connect

    Diver, Richard B., Jr.; Kolb, Gregory J.

    2008-03-01

A screening analysis was performed to identify concentrating solar power (CSP) concepts that produce hydrogen with the highest efficiency. Several CSP concepts were identified that have the potential to be much more efficient than today's low-temperature electrolysis technology. They combine a central receiver or dish with either a thermochemical cycle or a high-temperature electrolyzer that operates at temperatures >600 °C. The solar-to-hydrogen efficiencies of the best central receiver concepts exceed 20%, significantly better than the 14% value predicted for low-temperature electrolysis.

  19. Quantitative architectural analysis of bronchial intraepithelial neoplasia

    NASA Astrophysics Data System (ADS)

    Guillaud, Martial; MacAulay, Calum E.; Le Riche, Jean C.; Dawe, Chris; Korbelik, Jagoda; Lam, Stephen

    2000-04-01

Considerable variation exists among pathologists in the interpretation of intraepithelial neoplasia, making it difficult to determine the natural history of these lesions and to establish management guidelines for chemoprevention. The aim of the study is to evaluate architectural features of pre-neoplastic progression in lung cancer, and to search for a correlation between architectural index and conventional pathology. Quantitative architectural analysis was performed on a series of normal lung biopsies and carcinoma in situ (CIS). Centers of gravity of the nuclei within a pre-defined region of interest were used as seeds to generate a Voronoi diagram. About 30 features derived from the Voronoi diagram, its dual the Delaunay tessellation, and the minimum spanning tree were extracted. A discriminant analysis was performed to separate the two groups. The architectural index was calculated for each of the bronchial biopsies that were interpreted as hyperplasia, metaplasia, or mild, moderate, or severe dysplasia by conventional histopathology criteria. As a group, lesions classified as CIS by conventional histopathology criteria could be distinguished from dysplasia using the architectural index. Metaplasia was distinct from hyperplasia, and hyperplasia from normal. There was overlap between severe and moderate dysplasia, but mild dysplasia could be distinguished from moderate dysplasia. Bronchial intraepithelial neoplastic lesions can thus be graded objectively by architectural features. Combining architectural features with nuclear morphometric features may improve the quantitation of the changes occurring during the intraepithelial neoplastic process.
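    The graph constructions named above (Delaunay tessellation and minimum spanning tree built on nuclear centers) can be sketched with standard SciPy routines. The point set here is random, standing in for segmented nuclear centroids, and the two features computed are just simple examples of the ~30 the study extracts.

```python
import numpy as np
from scipy.spatial import Delaunay, distance_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(1)
pts = rng.random((60, 2))          # stand-in for nuclear centers of gravity

tri = Delaunay(pts)

# Collect unique Delaunay edges and compute their mean length
edges = set()
for simplex in tri.simplices:
    for i in range(3):
        a, b = sorted((simplex[i], simplex[(i + 1) % 3]))
        edges.add((a, b))
edge_lengths = [np.linalg.norm(pts[a] - pts[b]) for a, b in edges]

# Minimum spanning tree over the full pairwise distance graph
mst = minimum_spanning_tree(distance_matrix(pts, pts))
mst_lengths = mst.data             # weights of the 59 MST edges

print(f"mean Delaunay edge: {np.mean(edge_lengths):.3f}")
print(f"mean MST edge:      {mst_lengths.mean():.3f}")
```

    In a real pipeline, statistics of these edge-length distributions (means, variances, etc.) become the architectural features fed into discriminant analysis.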

  20. Quantitative interactome analysis reveals a chemoresistant edgotype

    PubMed Central

    Chavez, Juan D.; Schweppe, Devin K.; Eng, Jimmy K.; Zheng, Chunxiang; Taipale, Alex; Zhang, Yiyi; Takara, Kohji; Bruce, James E.

    2015-01-01

    Chemoresistance is a common mode of therapy failure for many cancers. Tumours develop resistance to chemotherapeutics through a variety of mechanisms, with proteins serving pivotal roles. Changes in protein conformations and interactions affect the cellular response to environmental conditions contributing to the development of new phenotypes. The ability to understand how protein interaction networks adapt to yield new function or alter phenotype is limited by the inability to determine structural and protein interaction changes on a proteomic scale. Here, chemical crosslinking and mass spectrometry were employed to quantify changes in protein structures and interactions in multidrug-resistant human carcinoma cells. Quantitative analysis of the largest crosslinking-derived, protein interaction network comprising 1,391 crosslinked peptides allows for ‘edgotype' analysis in a cell model of chemoresistance. We detect consistent changes to protein interactions and structures, including those involving cytokeratins, topoisomerase-2-alpha, and post-translationally modified histones, which correlate with a chemoresistant phenotype. PMID:26235782

  2. Quantitative analysis of NMR spectra with chemometrics

    NASA Astrophysics Data System (ADS)

    Winning, H.; Larsen, F. H.; Bro, R.; Engelsen, S. B.

    2008-01-01

    The number of applications of chemometrics to series of NMR spectra is rapidly increasing due to an emerging interest for quantitative NMR spectroscopy e.g. in the pharmaceutical and food industries. This paper gives an analysis of advantages and limitations of applying the two most common chemometric procedures, Principal Component Analysis (PCA) and Multivariate Curve Resolution (MCR), to a designed set of 231 simple alcohol mixture (propanol, butanol and pentanol) 1H 400 MHz spectra. The study clearly demonstrates that the major advantage of chemometrics is the visualisation of larger data structures which adds a new exploratory dimension to NMR research. While robustness and powerful data visualisation and exploration are the main qualities of the PCA method, the study demonstrates that the bilinear MCR method is an even more powerful method for resolving pure component NMR spectra from mixtures when certain conditions are met.
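    The PCA side of such a chemometric analysis can be sketched on synthetic data: mixtures of a few pure-component "spectra" should be captured almost entirely by as many principal components as there are components. The Gaussian peaks below stand in for real 1H NMR spectra and are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
axis = np.linspace(0, 10, 500)            # pseudo chemical-shift axis

def peak(center, width=0.3):
    return np.exp(-((axis - center) ** 2) / (2 * width**2))

# Three pure-component "spectra" (assumed peak positions)
pure = np.vstack([peak(2.0), peak(5.0), peak(8.0)])

# 50 mixtures with random non-negative concentrations plus a little noise
conc = rng.random((50, 3))
spectra = conc @ pure + rng.normal(0, 0.01, (50, 500))

# PCA via SVD on mean-centered data
centered = spectra - spectra.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
explained = S**2 / np.sum(S**2)

print(f"variance explained by 3 PCs: {explained[:3].sum():.3f}")
```

    MCR would go further and recover the pure spectra themselves (e.g., under non-negativity constraints), which is why the paper finds it more powerful for resolving mixtures when its conditions are met.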

  3. Quantitative dissection of hydrogen bond-mediated proton transfer in the ketosteroid isomerase active site

    PubMed Central

    Sigala, Paul A.; Fafarman, Aaron T.; Schwans, Jason P.; Fried, Stephen D.; Fenn, Timothy D.; Caaveiro, Jose M. M.; Pybus, Brandon; Ringe, Dagmar; Petsko, Gregory A.; Boxer, Steven G.; Herschlag, Daniel

    2013-01-01

    Hydrogen bond networks are key elements of protein structure and function but have been challenging to study within the complex protein environment. We have carried out in-depth interrogations of the proton transfer equilibrium within a hydrogen bond network formed to bound phenols in the active site of ketosteroid isomerase. We systematically varied the proton affinity of the phenol using differing electron-withdrawing substituents and incorporated site-specific NMR and IR probes to quantitatively map the proton and charge rearrangements within the network that accompany incremental increases in phenol proton affinity. The observed ionization changes were accurately described by a simple equilibrium proton transfer model that strongly suggests the intrinsic proton affinity of one of the Tyr residues in the network, Tyr16, does not remain constant but rather systematically increases due to weakening of the phenol–Tyr16 anion hydrogen bond with increasing phenol proton affinity. Using vibrational Stark spectroscopy, we quantified the electrostatic field changes within the surrounding active site that accompany these rearrangements within the network. We were able to model these changes accurately using continuum electrostatic calculations, suggesting a high degree of conformational restriction within the protein matrix. Our study affords direct insight into the physical and energetic properties of a hydrogen bond network within a protein interior and provides an example of a highly controlled system with minimal conformational rearrangements in which the observed physical changes can be accurately modeled by theoretical calculations. PMID:23798390
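    The simplest form of an equilibrium proton-transfer model like the one invoked above relates the fraction of proton transfer to the proton-affinity (pKa) difference between the two sites. The pKa values below are illustrative, not the paper's measured ones.

```python
def fraction_phenolate(pka_phenol, pka_tyr):
    """Equilibrium fraction of the phenol that is deprotonated when it
    shares a proton with a tyrosine of the given pKa (two-state model)."""
    delta = pka_phenol - pka_tyr
    return 1.0 / (1.0 + 10.0 ** delta)

# Electron-withdrawing substituents lower the phenol pKa, pushing the
# proton onto the tyrosine side of the network.
for pka in (5.5, 7.0, 8.5):
    print(f"phenol pKa {pka}: fraction phenolate {fraction_phenolate(pka, 7.0):.2f}")
```

    The paper's key refinement is that the effective tyrosine pKa itself shifts as the hydrogen bond to the phenolate weakens, so in their fit the second argument would not stay constant.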

  4. Materials characterization through quantitative digital image analysis

    SciTech Connect

    J. Philliber; B. Antoun; B. Somerday; N. Yang

    2000-07-01

A digital image analysis system has been developed to allow advanced quantitative measurement of microstructural features. This capability is maintained as part of the microscopy facility at Sandia, Livermore. The system records images digitally, eliminating the use of film. Images obtained from other sources may also be imported into the system. Subsequent digital image processing enhances image appearance through contrast and brightness adjustments. The system measures a variety of user-defined microstructural features--including area fraction, particle size and spatial distributions, grain sizes and orientations of elongated particles. These measurements are made in a semi-automatic mode through the use of macro programs and a computer-controlled translation stage. A routine has been developed to create large montages of 50+ separate images. Individual image frames are matched to the nearest pixel to create seamless montages. Results from three different studies are presented to illustrate the capabilities of the system.

  5. Thermal Analysis of Cryogenic Hydrogen Liquid Separator

    NASA Technical Reports Server (NTRS)

    Congiardo, Jared F.; Fortier, Craig R. (Editor)

    2014-01-01

During launch for the new Space Launch System (SLS), liquid hydrogen is bled through the engines during replenish, pre-press, and extended pre-press to condition the engines prior to launch. The predicted bleed flow rates are larger than for the shuttle program. A consequence of the increased flow rates is having liquid hydrogen in the vent system, which the facility was never designed to handle. To remedy the problem, a liquid separator is being designed into the system to accumulate the liquid propellant and protect the facility flare stack (which can only handle gas). The attached document is a presentation of the current thermal-fluid analysis performed for the separator and will be presented at the Thermal and Fluid Analysis Workshop (NASA workshop) next week in Cleveland, Ohio.

  6. Quantitative proteomic analysis of single pancreatic islets

    PubMed Central

    Waanders, Leonie F.; Chwalek, Karolina; Monetti, Mara; Kumar, Chanchal; Lammert, Eckhard; Mann, Matthias

    2009-01-01

    Technological developments make mass spectrometry (MS)-based proteomics a central pillar of biochemical research. MS has been very successful in cell culture systems, where sample amounts are not limiting. To extend its capabilities to extremely small, physiologically distinct cell types isolated from tissue, we developed a high sensitivity chromatographic system that measures nanogram protein mixtures for 8 h with very high resolution. This technology is based on splitting gradient effluents into a capture capillary and provides an inherent technical replicate. In a single analysis, this allowed us to characterize kidney glomeruli isolated by laser capture microdissection to a depth of more than 2,400 proteins. From pooled pancreatic islets of Langerhans, another type of “miniorgan,” we obtained an in-depth proteome of 6,873 proteins, many of them involved in diabetes. We quantitatively compared the proteome of single islets, containing 2,000–4,000 cells, treated with high or low glucose levels, and covered most of the characteristic functions of beta cells. Our ultrasensitive analysis recapitulated known hyperglycemic changes but we also find components up-regulated such as the mitochondrial stress regulator Park7. Direct proteomic analysis of functionally distinct cellular structures opens up perspectives in physiology and pathology. PMID:19846766

  7. Analysis of Hydrogen Production from Renewable Electricity Sources: Preprint

    SciTech Connect

    Levene, J. I.; Mann, M. K.; Margolis, R.; Milbrandt, A.

    2005-09-01

    To determine the potential for hydrogen production via renewable electricity sources, three aspects of the system are analyzed: a renewable hydrogen resource assessment, a cost analysis of hydrogen production via electrolysis, and the annual energy requirements of producing hydrogen for refueling. The results indicate that ample resources exist to produce transportation fuel from wind and solar power. However, hydrogen prices are highly dependent on electricity prices.
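    The abstract's conclusion that hydrogen price tracks electricity price follows from simple energy accounting, sketched below. The 39.4 kWh/kg figure is the approximate higher heating value of hydrogen; the electrolyzer efficiency and electricity prices are assumed for illustration, and capital costs are ignored.

```python
KWH_PER_KG_H2_HHV = 39.4   # approximate higher heating value of hydrogen

def h2_cost_per_kg(elec_price_per_kwh, electrolyzer_efficiency=0.65):
    """Electricity-cost component of electrolytic hydrogen in $/kg
    (illustrative; excludes capital and operating costs)."""
    return elec_price_per_kwh * KWH_PER_KG_H2_HHV / electrolyzer_efficiency

for price in (0.03, 0.06, 0.10):
    print(f"electricity at ${price:.2f}/kWh -> ${h2_cost_per_kg(price):.2f}/kg H2")
```

    Because the relationship is linear, a doubling of the electricity price doubles this cost component, which is the dependence the analysis highlights.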

  8. Geographically Based Hydrogen Demand & Infrastructure Analysis (Presentation)

    SciTech Connect

    Melendez, M.

    2006-05-18

    Presentation given at the 2006 DOE Hydrogen, Fuel Cells & Infrastructure Technologies Program Annual Merit Review in Washington, D.C., May 16-19, 2006, discusses potential future hydrogen demand and the infrastructure needed to support hydrogen vehicles.

  9. Applying Knowledge of Quantitative Design and Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  10. Error Propagation Analysis for Quantitative Intracellular Metabolomics

    PubMed Central

    Tillack, Jana; Paczia, Nicole; Nöh, Katharina; Wiechert, Wolfgang; Noack, Stephan

    2012-01-01

    Model-based analyses have become an integral part of modern metabolic engineering and systems biology in order to gain knowledge about complex and not directly observable cellular processes. For quantitative analyses, not only experimental data, but also measurement errors, play a crucial role. The total measurement error of any analytical protocol is the result of an accumulation of single errors introduced by several processing steps. Here, we present a framework for the quantification of intracellular metabolites, including error propagation during metabolome sample processing. Focusing on one specific protocol, we comprehensively investigate all currently known and accessible factors that ultimately impact the accuracy of intracellular metabolite concentration data. All intermediate steps are modeled, and their uncertainty with respect to the final concentration data is rigorously quantified. Finally, on the basis of a comprehensive metabolome dataset of Corynebacterium glutamicum, an integrated error propagation analysis for all parts of the model is conducted, and the most critical steps for intracellular metabolite quantification are detected. PMID:24957773
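    The accumulation of single-step errors described above can be sketched with a Monte Carlo propagation: a metabolite concentration computed from several processing steps, each with its own measurement error. All step values and error magnitudes below are invented for illustration, not taken from the paper's protocol.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Assumed protocol: conc = peak_area / (slope * dilution * extraction_yield)
peak_area = rng.normal(1000.0, 20.0, N)   # detector counts, 2% relative error
slope = rng.normal(50.0, 1.0, N)          # calibration slope, 2% relative error
dilution = rng.normal(10.0, 0.1, N)       # dilution factor, 1% relative error
yield_ = rng.normal(0.80, 0.04, N)        # extraction yield, 5% relative error

conc = peak_area / (slope * dilution * yield_)
rel_err = conc.std() / conc.mean()

# For independent multiplicative errors the relative variances add:
expected = np.sqrt(0.02**2 + 0.02**2 + 0.01**2 + 0.05**2)
print(f"simulated {rel_err:.3f} vs analytic {expected:.3f}")
```

    The comparison also shows which step dominates: here the 5% extraction-yield error contributes most of the total variance, which is exactly the kind of "most critical step" the analysis identifies.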

  11. Quantitative methods for ecological network analysis.

    PubMed

    Ulanowicz, Robert E

    2004-12-01

    The analysis of networks of ecological trophic transfers is a useful complement to simulation modeling in the quest for understanding whole-ecosystem dynamics. Trophic networks can be studied in quantitative and systematic fashion at several levels. Indirect relationships between any two individual taxa in an ecosystem, which often differ in either nature or magnitude from their direct influences, can be assayed using techniques from linear algebra. The same mathematics can also be employed to ascertain where along the trophic continuum any individual taxon is operating, or to map the web of connections into a virtual linear chain that summarizes trophodynamic performance by the system. Backtracking algorithms with pruning have been written which identify pathways for the recycle of materials and energy within the system. The pattern of such cycling often reveals modes of control or types of functions exhibited by various groups of taxa. The performance of the system as a whole at processing material and energy can be quantified using information theory. In particular, the complexity of process interactions can be parsed into separate terms that distinguish organized, efficient performance from the capacity for further development and recovery from disturbance. Finally, the sensitivities of the information-theoretic system indices appear to identify the dynamical bottlenecks in ecosystem functioning. PMID:15556474
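    The linear-algebra idea mentioned above can be made concrete: direct trophic transfers form a matrix D, and total influence over paths of every length is the geometric series I + D + D² + ... = (I - D)⁻¹ when it converges. The three-taxon flow fractions below are invented for illustration.

```python
import numpy as np

# D[i, j]: fraction of taxon j's intake received directly from taxon i
# (assumed values for a producer -> herbivore -> carnivore chain)
D = np.array([
    [0.0, 0.6, 0.1],   # producer feeds herbivore and, weakly, carnivore
    [0.0, 0.0, 0.7],   # herbivore feeds carnivore
    [0.0, 0.0, 0.0],
])

I = np.eye(3)
total = np.linalg.inv(I - D)   # direct + indirect contributions, all path lengths

direct = D[0, 2]
with_indirect = total[0, 2]
print(f"producer -> carnivore: direct {direct:.2f}, total {with_indirect:.2f}")
```

    Here the indirect route through the herbivore (0.6 × 0.7 = 0.42) dwarfs the direct link (0.1), illustrating how indirect relationships can differ in magnitude from direct ones.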

  12. Diffusion Analysis Of Hydrogen-Desorption Measurements

    NASA Technical Reports Server (NTRS)

    Danford, Merlin D.

    1988-01-01

Distribution of hydrogen in metal explains observed desorption rate. Report describes application of diffusion theory to analysis of experimental data on uptake and elimination of hydrogen in high-strength alloys at 25 degrees C. Study part of program aimed at understanding embrittlement of metals by hydrogen. Two nickel-base alloys, Rene 41 and Waspaloy, and one ferrous alloy, 4340 steel, studied. Desorption of hydrogen explained by distribution of hydrogen in metal. "Fast" hydrogen apparently not due to formation of hydrides on and below surface as previously proposed.
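    The diffusion-theory picture can be sketched with the standard series solution for one-dimensional out-diffusion from a plate: the fraction of uniformly dissolved hydrogen remaining at time t. The diffusivity and thickness below are assumed round numbers, not the report's fitted values.

```python
import math

def fraction_remaining(D, L, t, terms=200):
    """Fraction of uniformly dissolved solute still in a plate of
    thickness L (both faces desorbing) after time t, by the classic
    Fickian series solution."""
    total = 0.0
    for n in range(terms):
        k = 2 * n + 1
        total += (8 / (k * math.pi) ** 2) * math.exp(-(k * math.pi / L) ** 2 * D * t)
    return total

D = 1e-10   # m^2/s, assumed hydrogen diffusivity
L = 1e-3    # m, assumed plate thickness

for hours in (0, 1, 10):
    t = hours * 3600.0
    print(f"t = {hours:2d} h: fraction remaining {fraction_remaining(D, L, t):.3f}")
```

    Fitting curves of this form to measured desorption data is how a hydrogen distribution (rather than surface hydrides) can be shown to account for the "fast" component.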

  13. Some Epistemological Considerations Concerning Quantitative Analysis

    ERIC Educational Resources Information Center

    Dobrescu, Emilian

    2008-01-01

    This article presents the author's address at the 2007 "Journal of Applied Quantitative Methods" ("JAQM") prize awarding festivity. The festivity was included in the opening of the 4th International Conference on Applied Statistics, November 22, 2008, Bucharest, Romania. In the address, the author reflects on three theses that question the…

  14. Quantitative Analysis of Radar Returns from Insects

    NASA Technical Reports Server (NTRS)

    Riley, J. R.

    1979-01-01

When the number of flying insects is low enough to permit their resolution as individual radar targets, quantitative estimates of their aerial density are developed. Accurate measurements of heading distribution, using a rotating-polarization radar to enhance the wingbeat-frequency method of identification, are presented. (BL)

  15. A Qualitative-Quantitative H-NMR Experiment for the Instrumental Analysis Laboratory.

    ERIC Educational Resources Information Center

    Phillips, John S.; Leary, James J.

    1986-01-01

    Describes an experiment combining qualitative and quantitative information from hydrogen nuclear magnetic resonance spectra. Reviews theory, discusses the experimental approach, and provides sample results. (JM)

  16. Analysis of NTSC's Timekeeping Hydrogen Masers

    NASA Astrophysics Data System (ADS)

    Song, H. J.; Dong, S. W.; Wang, Z. M.; Qu, L. L.; Jing, Y. J.; Li, W.

    2015-11-01

In this article, hydrogen masers were tested in the NTSC (National Time Service Center) timekeeping laboratory. To avoid the impact of the larger noise of caesium atomic clocks, TA(k) or UTC(k) was not used as a reference; instead, four hydrogen masers were mutually referenced and tested. The frequency stabilities of the hydrogen masers were analyzed using the four-cornered hat method, and the Allan standard deviation of each single hydrogen maser was estimated at different sampling times. Then, according to the characteristics of hydrogen masers, the Gaussian noise of each maser was separated by removing the trend term, excluding outliers, and smoothing the data with mathematical methods; finally, a single hydrogen maser's Gaussian noise was estimated through a Kolmogorov-Smirnov normality test.
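    The N-cornered hat idea rests on a simple identity: each pairwise clock comparison measures the sum of two clocks' variances, so the individual variances can be solved for algebraically. The sketch below uses the classic three-cornered variant on simulated white-noise "clocks" with invented noise levels; the four-maser case adds redundant pairs but follows the same algebra.

```python
import numpy as np

rng = np.random.default_rng(7)
true_sigma = [1.0, 2.0, 0.5]          # assumed per-clock noise levels
clocks = [s * rng.normal(size=50_000) for s in true_sigma]

def pair_var(i, j):
    """Variance of the measurable clock difference i - j."""
    return np.var(clocks[i] - clocks[j])

# Three-cornered hat: var_i = (s_ij + s_ik - s_jk) / 2
s01, s02, s12 = pair_var(0, 1), pair_var(0, 2), pair_var(1, 2)
est = [
    0.5 * (s01 + s02 - s12),
    0.5 * (s01 + s12 - s02),
    0.5 * (s02 + s12 - s01),
]

for true, v in zip(true_sigma, est):
    print(f"true var {true**2:.2f}, estimated {v:.2f}")
```

    With real masers the same decomposition is applied to Allan variances at each averaging time rather than to raw sample variances.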

  17. Quantitative analysis of comparative genomic hybridization

    SciTech Connect

    Manoir, S. du; Bentz, M.; Joos, S. |

    1995-01-01

    Comparative genomic hybridization (CGH) is a new molecular cytogenetic method for the detection of chromosomal imbalances. Following cohybridization of DNA prepared from a sample to be studied and control DNA to normal metaphase spreads, probes are detected via different fluorochromes. The ratio of the test and control fluorescence intensities along a chromosome reflects the relative copy number of segments of a chromosome in the test genome. Quantitative evaluation of CGH experiments is required for the determination of low copy changes, e.g., monosomy or trisomy, and for the definition of the breakpoints involved in unbalanced rearrangements. In this study, a program for quantitation of CGH preparations is presented. This program is based on the extraction of the fluorescence ratio profile along each chromosome, followed by averaging of individual profiles from several metaphase spreads. Objective parameters critical for quantitative evaluations were tested, and the criteria for selection of suitable CGH preparations are described. The granularity of the chromosome painting and the regional inhomogeneity of fluorescence intensities in metaphase spreads proved to be crucial parameters. The coefficient of variation of the ratio value for chromosomes in balanced state (CVBS) provides a general quality criterion for CGH experiments. Different cutoff levels (thresholds) of average fluorescence ratio values were compared for their specificity and sensitivity with regard to the detection of chromosomal imbalances. 27 refs., 15 figs., 1 tab.

  18. The hydrogen anomaly in neutron Compton scattering: new experiments and a quantitative theoretical explanation

    NASA Astrophysics Data System (ADS)

    Karlsson, E. B.; Hartmann, O.; Chatzidimitriou-Dreismann, C. A.; Abdul-Redah, T.

    2016-08-01

    No consensus has been reached so far about the hydrogen anomaly problem in Compton scattering of neutrons, although strongly reduced H cross-sections were first reported almost 20 years ago. Over the years, this phenomenon has been observed in many different hydrogen-containing materials. Here, we use yttrium hydrides as test objects, YH2, YH3, YD2 and YD3, Y(H x D1‑x )2 and Y(H x D1‑x )3, for which we observe H anomalies increasing with transferred momentum q. We also observe reduced deuteron cross-sections in YD2 and YD3 and have followed those up to scattering angles of 140° corresponding to high momentum transfers. In addition to data taken using the standard Au-197 foils for neutron energy selection, the present work includes experiments with Rh-103 foils and comparisons were also made with data from different detector setups. The H and D anomalies are discussed in terms of the different models proposed for their interpretation. The ‘electron loss model’ (which assumes energy transfer to excited electrons) is contradicted by the present data, but it is shown here that exchange effects in scattering from two or more protons (or deuterons) in the presence of large zero-point vibrations, can explain quantitatively the reduction of the cross-sections as well as their q-dependence. Decoherence processes also play an essential role. In a scattering time representation, shake-up processes can be followed on the attosecond scale. The theory also shows that large anomalies can appear only when the neutron coherence lengths (determined by energy selection and detector geometry) are about the same size as the distance between the scatterers.

  19. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  20. Analysis of Hybrid Hydrogen Systems: Final Report

    SciTech Connect

    Dean, J.; Braun, R.; Munoz, D.; Penev, M.; Kinchin, C.

    2010-01-01

    This report examines biomass pathways for hydrogen production and how they can be hybridized to support renewable electricity generation. Two hybrid systems were studied in detail for process feasibility and economic performance. The best-performing system was estimated to produce hydrogen at a cost ($1.67/kg) within the Department of Energy target ($2.10/kg) for central biomass-derived hydrogen production while also providing value-added energy services to the electric grid.

  1. Structural and quantitative analysis of Equisetum alkaloids.

    PubMed

    Cramer, Luise; Ernst, Ludger; Lubienski, Marcus; Papke, Uli; Schiebel, Hans-Martin; Jerz, Gerold; Beuerle, Till

    2015-08-01

    Equisetum palustre L. is known for its toxicity to livestock. Several past studies addressed the isolation and identification of the responsible alkaloids. So far, palustrine (1) and N(5)-formylpalustrine (2) are the known alkaloids of E. palustre. An HPLC-ESI-MS/MS method combined with a simple sample work-up was developed to identify and quantitate Equisetum alkaloids. Besides the two known alkaloids, six related alkaloids were detected in different Equisetum samples. The structure of the alkaloid palustridiene (3) was derived by comprehensive 1D and 2D NMR experiments. N(5)-Acetylpalustrine (4) was also thoroughly characterized by NMR for the first time. The structure of N(5)-formylpalustridiene (5) is proposed based on mass spectrometry results. Twenty-two E. palustre samples were screened with this method, and in most cases the full set of eight alkaloids was detected in all parts of the plant. Alkaloid content and distribution varied considerably with plant organ, plant origin and season, ranging from 88 to 597 mg/kg dry weight. However, palustrine (1) and palustridiene (3) always represented the main alkaloids. For the first time, a comprehensive identification, quantitation and distribution analysis of Equisetum alkaloids was achieved. PMID:25823584

  2. The solar-hydrogen economy: an analysis

    NASA Astrophysics Data System (ADS)

    Reynolds, Warren D.

    2007-09-01

    The 20th Century was the age of the Petroleum Economy, while the 21st Century is certainly the age of the Solar-Hydrogen Economy. The global Solar-Hydrogen Economy that is now emerging follows a different logic. Under this new economic paradigm, new machines and methods are once again being developed while companies are restructuring. The Petroleum Economy will be briefly explored in relation to oil consumption, Hubbert's curve, and oil reserves, with emphasis on the "oil crash". Concerns and criticisms about the Hydrogen Economy will be addressed by debunking some of the "hydrogen myths". There are three major driving factors for the establishment of the Solar-Hydrogen Economy: the environment, the economy with the coming "oil crash", and national security. The New Energy decentralization pathway has developed many progressive features, e.g., reducing dependence on oil and cutting air pollution and CO2 emissions. The technical and economic aspects of the various Solar-Hydrogen energy options and combinations will be analyzed. A proposed 24-hour/day 200 MWe solar-hydrogen power plant for the U.S. with selected energy options will be discussed. There are fast-emerging Solar-Hydrogen energy infrastructures in the U.S., Europe, Japan and China. Some of the major infrastructure projects in the transportation and energy sectors will be discussed. The current and projected growth of the Solar-Hydrogen Economy through 2045 will be given.

  3. Fast, quantitative, and nondestructive evaluation of hydrided LWR fuel cladding by small angle incoherent neutron scattering of hydrogen

    DOE PAGESBeta

    Yan, Y.; Qian, S.; Littrell, K.; Parish, C. M.; Plummer, L. K.

    2015-02-13

    A non-destructive neutron scattering method to precisely measure the uptake of hydrogen and the distribution of hydride precipitates in light water reactor (LWR) fuel cladding was developed. Zircaloy-4 cladding used in commercial LWRs was used to produce hydrided specimens. The hydriding apparatus consists of a closed stainless steel vessel that contains Zr alloy specimens and hydrogen gas. Following hydrogen charging, the hydrogen content of the hydrided specimens was measured using the vacuum hot extraction method, by which the samples with the desired hydrogen concentration were selected for the neutron study. Optical microscopy shows that our hydriding procedure results in a uniform distribution of circumferential hydrides across the wall. Small angle neutron incoherent scattering was performed in the High Flux Isotope Reactor at Oak Ridge National Laboratory. This study demonstrates that the hydrogen in commercial Zircaloy-4 cladding can be measured very accurately in minutes by this nondestructive method over a wide range of hydrogen concentrations, from a very small amount (~20 ppm) to over 1000 ppm. The hydrogen distribution in a tube sample was obtained by scaling the neutron scattering rate with a factor determined by a calibration process using standard, destructive direct chemical analysis methods on the specimens. This scale factor will be used in future tests with unknown hydrogen concentrations, thus providing a nondestructive method for absolute hydrogen concentration determination.
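
The calibration logic described above (scaling the neutron scattering rate by a factor fitted against standards of known hydrogen content) can be sketched as follows. This is a hedged illustration, not the authors' code; the count rates below are invented, and a linear rate-to-concentration relation is assumed.

```python
# Hypothetical sketch of the calibration: the neutron scattering rate is
# assumed proportional to hydrogen content, so a scale factor fitted on
# standards of known concentration converts count rates from unknown
# specimens into absolute hydrogen concentrations (ppm).
# All numbers are illustrative, not measured data.

def fit_scale_factor(rates, concentrations_ppm):
    """Least-squares slope through the origin: concentration = k * rate."""
    num = sum(r * c for r, c in zip(rates, concentrations_ppm))
    den = sum(r * r for r in rates)
    return num / den

def hydrogen_ppm(rate, scale_factor):
    """Convert a measured scattering rate to hydrogen concentration."""
    return scale_factor * rate

# Standards characterized by a destructive method such as vacuum hot
# extraction (values invented for the example):
standard_rates = [0.8, 4.1, 16.2, 40.5]      # counts/s (arbitrary)
standard_ppm = [20.0, 100.0, 400.0, 1000.0]  # ppm hydrogen

k = fit_scale_factor(standard_rates, standard_ppm)
print(round(hydrogen_ppm(8.1, k)))  # 200
```

In the study itself, the scale factor was anchored to destructive direct chemical analysis on reference specimens; the sketch only shows the arithmetic of applying it.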

  5. Joint association analysis of bivariate quantitative and qualitative traits.

    PubMed

    Yuan, Mengdie; Diao, Guoqing

    2011-01-01

    Univariate genome-wide association analysis of quantitative and qualitative traits has been investigated extensively in the literature. In the presence of correlated phenotypes, it is more intuitive to analyze all phenotypes simultaneously. We describe an efficient likelihood-based approach for the joint association analysis of quantitative and qualitative traits in unrelated individuals. We assume a probit model for the qualitative trait, under which an unobserved latent variable and a prespecified threshold determine the value of the qualitative trait. To jointly model the quantitative and qualitative traits, we assume that the quantitative trait and the latent variable follow a bivariate normal distribution. The latent variable is allowed to be correlated with the quantitative phenotype. Simultaneous modeling of the quantitative and qualitative traits allows us to make more precise inference on the pleiotropic genetic effects. We derive likelihood ratio tests for testing genetic effects. An application to the Genetic Analysis Workshop 17 data is provided. The new method yields reasonable power and meaningful results for the joint association analysis of the quantitative trait Q1 and the qualitative trait disease status at SNPs with sufficiently large minor allele frequencies (MAF). PMID:22373162

  6. Quantitative data analysis of ESAR data

    NASA Astrophysics Data System (ADS)

    Phruksahiran, N.; Chandra, M.

    2013-07-01

    Synthetic aperture radar (SAR) data processing uses the backscattered electromagnetic wave to map the radar reflectivity of the ground surface. The polarization property in radar remote sensing has been used successfully in many applications, especially in target decomposition. This paper presents a case study of experiments performed on ESAR L-band fully polarized data sets from the German Aerospace Center (DLR) to demonstrate the potential of coherent target decomposition and the possibility of using weather radar measurement parameters, such as the differential reflectivity and the linear depolarization ratio, to obtain quantitative information about the ground surface. The raw ESAR data were processed by a SAR simulator developed in MATLAB using the Range-Doppler algorithm.
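
The two weather-radar parameters mentioned above have simple standard definitions as power ratios of the polarimetric channels, expressed in decibels. A minimal sketch under those standard definitions (the channel powers are toy values; the study's actual processing chain is not reproduced):

```python
import math

# Standard polarimetric definitions (not the authors' code):
#   ZDR = 10*log10(P_hh / P_vv)   differential reflectivity
#   LDR = 10*log10(P_hv / P_hh)   linear depolarization ratio
# P_hh, P_vv are co-polar powers; P_hv is the cross-polar power.

def zdr_db(p_hh, p_vv):
    return 10.0 * math.log10(p_hh / p_vv)

def ldr_db(p_hv, p_hh):
    return 10.0 * math.log10(p_hv / p_hh)

# Toy co- and cross-polar powers for one resolution cell:
p_hh, p_vv, p_hv = 4.0, 2.0, 0.04
print(round(zdr_db(p_hh, p_vv), 2))   # 3.01
print(round(ldr_db(p_hv, p_hh), 2))   # -20.0
```

In real SAR data these powers would come per-pixel from the calibrated HH, VV and HV channels of the focused image.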

  7. Qualitative and quantitative analysis of endocytic recycling.

    PubMed

    Reineke, James B; Xie, Shuwei; Naslavsky, Naava; Caplan, Steve

    2015-01-01

    Endocytosis, which encompasses the internalization and sorting of plasma membrane (PM) lipids and proteins to distinct membrane-bound intracellular compartments, is a highly regulated and fundamental cellular process by which eukaryotic cells dynamically regulate their PM composition. Indeed, endocytosis is implicated in crucial cellular processes that include proliferation, migration, and cell division as well as maintenance of tissue homeostasis such as apical-basal polarity. Once PM constituents have been taken up into the cell, either via clathrin-dependent endocytosis (CDE) or clathrin-independent endocytosis (CIE), they typically have two fates: degradation through the late-endosomal/lysosomal pathway or returning to the PM via endocytic recycling pathways. In this review, we will detail experimental procedures that allow for both qualitative and quantitative assessment of endocytic recycling of transmembrane proteins internalized by CDE and CIE, using the HeLa cervical cancer cell line as a model system. PMID:26360033

  8. Radioisotopic neutron transmission spectrometry: Quantitative analysis by using partial least-squares method.

    PubMed

    Kim, Jong-Yun; Choi, Yong Suk; Park, Yong Joon; Jung, Sung-Hee

    2009-01-01

    Neutron spectrometry, based on the scattering of high-energy fast neutrons from a radioisotope source and their slowing down by light hydrogen atoms, is a useful technique for non-destructive, quantitative measurement of hydrogen content, because it has a large measuring volume and is not affected by temperature, pressure, pH or color. The most common choices for a radioisotope neutron source are 252Cf or 241Am-Be. In this study, 252Cf with a neutron flux of 6.3×10^6 n/s was used as an attractive neutron source because of its high neutron flux and weak radioactivity. Pulse-height neutron spectra were obtained using an in-house-built radioisotopic neutron spectrometric system equipped with a 3He detector and a multi-channel analyzer, including a neutron shield. As a preliminary study, a polyethylene block (density of approximately 0.947 g/cc and area of 40 cm × 25 cm) was used for the determination of hydrogen content using multivariate calibration models, depending on the thickness of the block. Compared with the results obtained from a simple linear calibration model, the partial least-squares regression (PLSR) method offered better performance in quantitative data analysis. It also revealed that the PLSR method in a neutron spectrometric system is promising for real-time, online monitoring of powder processes to determine the content of any type of molecule containing hydrogen nuclei. PMID:19285419
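
A multivariate calibration of the kind described (PLSR of spectra against hydrogen content) can be sketched with a minimal PLS1 (NIPALS) implementation. This is an illustrative sketch on synthetic data assuming a linear spectrum-to-content relation; the channel count, noise level and component number are invented, and it is not the authors' code.

```python
import numpy as np

# Minimal PLS1 (NIPALS) sketch for multivariate calibration of pulse-height
# spectra against a hydrogen-content response. Synthetic data only.

def pls1_fit(X, y, n_components):
    """Return coefficients b and intercept b0 so that y ≈ X @ b + b0."""
    X = X.astype(float).copy()
    y = y.astype(float).copy()
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc                 # weight: covariance direction
        w /= np.linalg.norm(w)
        t = Xc @ w                    # score
        tt = t @ t
        p = Xc.T @ t / tt             # X loading
        q = (yc @ t) / tt             # y loading
        Xc -= np.outer(t, p)          # deflate
        yc -= q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    b = W @ np.linalg.solve(P.T @ W, Q)
    b0 = y_mean - x_mean @ b
    return b, b0

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 64))                  # 40 spectra, 64 channels
true_b = np.zeros(64)
true_b[[5, 20, 41]] = [2.0, -1.0, 0.5]         # few informative channels
y = X @ true_b + 0.01 * rng.normal(size=40)    # toy "hydrogen content"

b, b0 = pls1_fit(X, y, n_components=8)
resid = y - (X @ b + b0)
r2 = 1.0 - resid.var() / y.var()
print(r2)
```

In a real application X would hold measured pulse-height spectra and y the hydrogen contents of calibration blocks; the component number would be chosen by cross-validation.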

  9. Towards a Quantitative OCT Image Analysis

    PubMed Central

    Garcia Garrido, Marina; Beck, Susanne C.; Mühlfriedel, Regine; Julien, Sylvie; Schraermeyer, Ulrich; Seeliger, Mathias W.

    2014-01-01

    Background: Optical coherence tomography (OCT) is an invaluable diagnostic tool for the detection and follow-up of retinal pathology in patients and experimental disease models. However, as morphological structures and layering in health as well as their alterations in disease are complex, segmentation procedures have not yet reached a satisfactory level of performance. Therefore, raw images and qualitative data are commonly used in clinical and scientific reports. Here, we assess the value of OCT reflectivity profiles as a basis for a quantitative characterization of the retinal status in a cross-species comparative study. Methods: Spectral-Domain Optical Coherence Tomography (OCT), confocal Scanning-Laser Ophthalmoscopy (SLO), and Fluorescein Angiography (FA) were performed in mice (Mus musculus), gerbils (Gerbillus perpallidus), and cynomolgus monkeys (Macaca fascicularis) using the Heidelberg Engineering Spectralis system, and additional SLOs and FAs were obtained with the HRA I (same manufacturer). Reflectivity profiles were extracted from 8-bit greyscale OCT images using the ImageJ software package (http://rsb.info.nih.gov/ij/). Results: Reflectivity profiles obtained from OCT scans of all three animal species correlated well with ex vivo histomorphometric data. Each of the retinal layers showed a typical pattern that varied in relative size and degree of reflectivity across species. In general, plexiform layers showed a higher level of reflectivity than nuclear layers. A comparison of reflectivity profiles from specialized retinal regions (e.g. visual streak in gerbils, fovea in non-human primates) with respective regions of human retina revealed multiple similarities. In a model of Retinitis Pigmentosa (RP), the value of reflectivity profiles for the follow-up of therapeutic interventions was demonstrated. 
Conclusions: OCT reflectivity profiles provide a detailed, quantitative description of retinal layers and structures including specialized retinal regions.
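
The profile-extraction step lends itself to a short sketch: treat a B-scan as an 8-bit greyscale array (rows corresponding to depth) and average the grey value per row over a lateral window. The image below is synthetic; the study itself used ImageJ on real scans, and the layer positions here are invented.

```python
import numpy as np

# Hedged sketch of reflectivity-profile extraction from an 8-bit OCT image.

def reflectivity_profile(image, col_start, col_stop):
    """Mean 8-bit grey value per depth row across a lateral window."""
    window = image[:, col_start:col_stop].astype(float)
    return window.mean(axis=1)

# Toy B-scan: 100 depth samples x 50 A-scans, with two uniform bands
# standing in for retinal layers of different reflectivity.
scan = np.zeros((100, 50), dtype=np.uint8)
scan[20:25, :] = 200   # high-reflectivity band (e.g. a plexiform layer)
scan[60:70, :] = 90    # lower-reflectivity band (e.g. a nuclear layer)

profile = reflectivity_profile(scan, 10, 40)
print(profile[22], profile[65], profile[0])  # 200.0 90.0 0.0
```

On real data, the profile would then be aligned to histomorphometric layer boundaries for the cross-species comparison described above.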

  10. Quantitative multi-modal NDT data analysis

    SciTech Connect

    Heideklang, René; Shokouhi, Parisa

    2014-02-18

    A single NDT technique is often not adequate to provide assessments about the integrity of test objects with the required coverage or accuracy. In such situations, one often resorts to multi-modal testing, where complementary and overlapping information from different NDT techniques is combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task that involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor in detection specificity, while retaining the same level of sensitivity.

  11. Multiple quantitative trait analysis using bayesian networks.

    PubMed

    Scutari, Marco; Howell, Phil; Balding, David J; Mackay, Ian

    2014-09-01

    Models for genome-wide prediction and association studies usually target a single phenotypic trait. However, in animal and plant genetics it is common to record information on multiple phenotypes for each individual that will be genotyped. Modeling traits individually disregards the fact that they are most likely associated due to pleiotropy and shared biological basis, thus providing only a partial, confounded view of genetic effects and phenotypic interactions. In this article we use data from a Multiparent Advanced Generation Inter-Cross (MAGIC) winter wheat population to explore Bayesian networks as a convenient and interpretable framework for the simultaneous modeling of multiple quantitative traits. We show that they are equivalent to multivariate genetic best linear unbiased prediction (GBLUP) and that they are competitive with single-trait elastic net and single-trait GBLUP in predictive performance. Finally, we discuss their relationship with other additive-effects models and their advantages in inference and interpretation. MAGIC populations provide an ideal setting for this kind of investigation because the very low population structure and large sample size result in predictive models with good power and limited confounding due to relatedness. PMID:25236454

  12. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    PubMed Central

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H; Jacobsen, Christina; Vainer, Ben

    2016-01-01

    Background: The opportunity offered by whole slide scanners of automated histological analysis implies an ever increasing importance of digital pathology. To go beyond conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis, including staining, may benefit. Methods: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means of quantitating cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm are presented. Results: It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of the above-mentioned components. The digital quantitations are verified by comparison to point grid quantitations performed on the microsections after Van Gieson staining. Conclusion: The presented method is aptly described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework. PMID:27141321
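
A threshold-based two-channel quantitation of the general kind described can be sketched as follows. The thresholds, channel semantics and component labels are invented placeholders; this is not the paper's algorithm, only an illustration of classifying pixels by their two-wavelength signature and reporting area fractions.

```python
import numpy as np

# Illustrative sketch: separate tissue components by simple thresholds on
# autofluorescence intensity in two channels, then quantitate each
# component as a pixel (area) fraction. All parameters are invented.

def quantitate(ch1, ch2, t_bright=150, t_ratio=1.5):
    bright = ch1 > t_bright                                   # channel-1 bright
    ch2_dominant = (~bright) & (ch2 > t_ratio * np.maximum(ch1, 1))
    other = ~(bright | ch2_dominant)
    n = ch1.size
    return {"bright": bright.sum() / n,
            "ch2_dominant": ch2_dominant.sum() / n,
            "other": other.sum() / n}

# Toy two-channel image, 10 x 10 pixels:
ch1 = np.zeros((10, 10)); ch2 = np.zeros((10, 10))
ch1[:2, :] = 200   # bright in channel 1  -> "bright" class
ch2[5:, :] = 50    # channel-2 dominant   -> "ch2_dominant" class

fractions = quantitate(ch1, ch2)
print(fractions["bright"], fractions["ch2_dominant"])  # 0.2 0.5
```

The paper's actual pipeline additionally produces outlines of each component and validates against point-grid counts on stained sections.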

  13. Molecular orbital analysis of the hydrogen bonded water dimer

    NASA Astrophysics Data System (ADS)

    Wang, Bo; Jiang, Wanrun; Dai, Xin; Gao, Yang; Wang, Zhigang; Zhang, Rui-Qin

    2016-02-01

    As an essential interaction in nature, hydrogen bonding plays a crucial role in many material formations and biological processes, requiring deeper understanding. Here, using density functional theory and post-Hartree-Fock methods, we reveal two hydrogen bonding molecular orbitals crossing the hydrogen-bond’s O and H atoms in the water dimer. Energy decomposition analysis also shows a non-negligible contribution of the induction term. Our finding sheds light on the essential understanding of hydrogen bonding in ice, liquid water, functional materials and biological systems.

  14. Molecular orbital analysis of the hydrogen bonded water dimer

    PubMed Central

    Wang, Bo; Jiang, Wanrun; Dai, Xin; Gao, Yang; Wang, Zhigang; Zhang, Rui-Qin

    2016-01-01

    As an essential interaction in nature, hydrogen bonding plays a crucial role in many material formations and biological processes, requiring deeper understanding. Here, using density functional theory and post-Hartree-Fock methods, we reveal two hydrogen bonding molecular orbitals crossing the hydrogen-bond’s O and H atoms in the water dimer. Energy decomposition analysis also shows a non-negligible contribution of the induction term. Our finding sheds light on the essential understanding of hydrogen bonding in ice, liquid water, functional materials and biological systems. PMID:26905305

  15. The stable hydrogen isotopic composition of sedimentary plant waxes as quantitative proxy for rainfall in the West African Sahel

    NASA Astrophysics Data System (ADS)

    Niedermeyer, Eva M.; Forrest, Matthew; Beckmann, Britta; Sessions, Alex L.; Mulch, Andreas; Schefuß, Enno

    2016-07-01

    Various studies have demonstrated that the stable hydrogen isotopic composition (δD) of terrestrial leaf waxes tracks that of precipitation (δDprecip), both spatially across climate gradients and over a range of timescales. Yet reconstructed estimates of δDprecip and corresponding rainfall typically remain largely qualitative, due mainly to uncertainties in the plant ecosystem net fractionation, relative humidity, and the stability of the amount effect through time. Here we present δD values of the C31 n-alkane (δDwax) from a marine sediment core offshore of the Northwest (NW) African Sahel covering the past 100 years and overlapping with the instrumental record of rainfall. We use this record to investigate whether accurate, quantitative estimates of past rainfall can be derived from our δDwax time series. We infer the composition of vegetation (C3/C4) within the continental catchment area by analysis of the stable carbon isotopic composition of the same compounds (δ13Cwax), calculate a net ecosystem fractionation factor, and correct the δDwax time series accordingly to derive δDprecip. Using the present-day relationship between δDprecip and the amount of precipitation in the tropics, we derive quantitative estimates of past precipitation amounts. Our data show that (a) vegetation composition can be inferred from δ13Cwax, (b) the calculated net ecosystem fractionation represents a reasonable estimate, and (c) estimated total amounts of rainfall based on δDwax correspond to instrumental records of rainfall. Our study has important implications for future studies aiming to reconstruct rainfall based on δDwax; the combined data presented here demonstrate that it is feasible to infer absolute rainfall amounts from sedimentary δDwax in tandem with δ13Cwax in specific depositional settings.
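
The two-step arithmetic described above (removing a net ecosystem fractionation from δDwax, then inverting a linear amount-effect relation) can be sketched with standard permil algebra. The fractionation value and regression coefficients below are illustrative placeholders, not the study's calibrated values.

```python
# Hedged sketch of the calculation chain. Standard permil algebra:
#   delta_precip = (delta_wax + 1000) / (epsilon_net/1000 + 1) - 1000
# followed by inversion of a hypothetical linear "amount effect"
#   delta_precip = slope * P + intercept
# All numerical inputs are placeholders, not the paper's calibration.

def dD_precip_from_wax(dD_wax, epsilon_net):
    """Remove the net ecosystem fractionation (permil notation)."""
    return (dD_wax + 1000.0) / (epsilon_net / 1000.0 + 1.0) - 1000.0

def rainfall_mm(dD_precip, slope=-60.0, intercept=0.0):
    """Invert a hypothetical linear amount effect.
    slope in permil per (m/yr) is a placeholder, not a calibrated value."""
    return (dD_precip - intercept) / slope * 1000.0  # mm/yr

dDp = dD_precip_from_wax(dD_wax=-150.0, epsilon_net=-120.0)
print(round(dDp, 1))             # -34.1
print(round(rainfall_mm(dDp)))   # 568
```

In the study, epsilon_net is derived from the δ13Cwax-based C3/C4 vegetation estimate, and the δDprecip-to-amount relation comes from the observed tropical amount effect.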

  16. The quantitative failure of human reliability analysis

    SciTech Connect

    Bennett, C.T.

    1995-07-01

    This philosophical treatise examines the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. In fact, the author argues that historic and current HRA have failed to inform policy makers who must make decisions based on the risk that humans contribute to systems performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion, and tempers itself with a rational debate over the weight given to subjective and empirical probabilities.

  17. A Quantitative Analysis of Countries' Research Strengths

    ERIC Educational Resources Information Center

    Saxena, Anurag; Brazer, S. David; Gupta, B. M.

    2009-01-01

    This study employed a multidimensional analysis to evaluate transnational patterns of scientific research to determine relative research strengths among widely varying nations. Findings from this study may inform national policy with regard to the most efficient use of scarce national research resources, including government and private funding.…

  18. Hydrogen release from irradiated elastomers measured by Nuclear Reaction Analysis

    NASA Astrophysics Data System (ADS)

    Jagielski, J.; Ostaszewska, U.; Bielinski, D. M.; Grambole, D.; Romaniec, M.; Jozwik, I.; Kozinski, R.; Kosinska, A.

    2016-03-01

    Ion irradiation is an interesting method for modifying elastomers, especially their friction and wear properties. The main structural effect caused by heavy ions is a massive loss of hydrogen from the surface layer, leading to its smoothing and shrinking. This paper presents results on hydrogen release from various elastomers upon irradiation with H+, He+ and Ar+, studied using the Nuclear Reaction Analysis (NRA) method. Analysis of the experimental data indicates that the hydrogen release is controlled by inelastic collisions between the ions and target electrons. The last part of the study focuses on a preliminary analysis of the mechanical properties of the irradiated rubbers.

  19. Hydrogen storage and delivery system development: Analysis

    SciTech Connect

    Handrock, J.L.

    1996-10-01

    Hydrogen storage and delivery is an important element in effective hydrogen utilization for energy applications and is an important part of the FY1994-1998 Hydrogen Program Implementation Plan. This project is part of the Field Work Proposal entitled Hydrogen Utilization in Internal Combustion Engines (ICE). The goal of the Hydrogen Storage and Delivery System Development Project is to expand the state-of-the-art of hydrogen storage and delivery system design and development. At the foundation of this activity is the development of both analytical and experimental evaluation platforms. These tools provide the basis for an integrated approach to coupling hydrogen storage and delivery technology to the operating characteristics of potential hydrogen energy applications. Results of the analytical model development portion of this project are discussed. Analytical models have been developed for internal combustion engine (ICE) hybrid and fuel cell driven vehicles. The dependence of hydride storage system weight and energy use efficiency on engine brake efficiency and exhaust temperature for ICE hybrid vehicle applications is examined. Results show that while storage system weight decreases with increasing engine brake efficiency, energy use efficiency remains relatively unchanged. The development, capability, and use of a recently developed fuel cell vehicle storage system model are also discussed. As an example of model use, power distribution and control for a simulated driving cycle are presented. Model calibration results for fuel cell fluid inlet and exit temperatures at various fuel cell idle speeds, assumed fuel cell heat capacities, and ambient temperatures are presented. The model predicts general increases in temperature with fuel cell power and differences between inlet and exit temperatures, but underpredicts absolute temperature values, especially at higher power levels.

  20. Structural and atoms-in-molecules analysis of hydrogen-bond network around nitroxides in liquid water.

    PubMed

    Houriez, Céline; Masella, Michel; Ferré, Nicolas

    2010-09-28

    In this study, we investigated the hydrogen-bond network patterns involving the NO moieties of five small nitroxides in liquid water by analyzing nanosecond-scale molecular dynamics trajectories. To this end, we implemented two types of hydrogen-bond definitions: one based on electronic structure, using Bader's atoms-in-molecules analysis, and one based on geometric criteria. In each definition framework, the nitroxide/water hydrogen-bond networks vary considerably from one nitroxide to another. Moreover, each definition clearly leads to a different picture of nitroxide hydration. For instance, the electronic structure-based definition predicts a number of hydrogen bonds around the nitroxide NO moiety usually larger than the geometric ones. One particularly interesting result is that the strength of a nitroxide/water hydrogen bond does not depend on its linearity, leading us to question the relevance of geometric definitions based on angular cutoffs for studying this type of hydrogen bond. Moreover, none of the hydrogen-bond definitions we consider in the present study is able to quantitatively correlate the strength of nitroxide/water hydrogen-bond networks with the aqueous nitroxide spin properties. This clearly shows that the hydrogen-bonding concept is not reliable enough to draw quantitative conclusions concerning such properties. PMID:20886951
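
A geometric hydrogen-bond definition of the kind compared in this study can be sketched as a distance cutoff plus an angular cutoff. The cutoff values below are common literature choices (3.5 Å donor-acceptor distance, 150° donor-H-acceptor angle), not necessarily those used by the authors.

```python
import math

# Illustrative geometric H-bond criterion: donor-acceptor distance below a
# cutoff AND a near-linear donor-H...acceptor angle. Cutoffs are common
# literature values, assumed for this sketch.

def angle_deg(a, vertex, c):
    """Angle a-vertex-c in degrees."""
    v1 = [a[i] - vertex[i] for i in range(3)]
    v2 = [c[i] - vertex[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

def is_hbond(donor, h, acceptor, r_cut=3.5, angle_cut=150.0):
    return (math.dist(donor, acceptor) <= r_cut
            and angle_deg(donor, h, acceptor) >= angle_cut)

# Near-linear O-H...O geometry (coordinates in Angstrom):
o_donor = (0.0, 0.0, 0.0)
h_atom = (0.96, 0.0, 0.0)
o_acceptor = (2.8, 0.1, 0.0)
print(is_hbond(o_donor, h_atom, o_acceptor))  # True
```

The study's point is precisely that such angular cutoffs may be questionable for nitroxide/water hydrogen bonds, since bond strength was found not to depend on linearity.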

  2. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    ERIC Educational Resources Information Center

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  3. Influence of corrosion layers on quantitative analysis

    NASA Astrophysics Data System (ADS)

    Denker, A.; Bohne, W.; Opitz-Coutureau, J.; Rauschenberg, J.; Röhrich, J.; Strub, E.

    2005-09-01

    Art historians and restorers in charge of ancient metal objects are often reluctant to remove the corrosion layer evolved over time, as this would change the appearance of the artefact dramatically. Therefore, when an elemental analysis of the objects is required, this has to be done by penetrating the corrosion layer. In this work the influence of corrosion was studied on Chinese and Roman coins, where removal of oxidized material was possible. Measurements on spots with and without corrosion are presented and the results discussed.

  4. Quantitative transcriptome analysis using RNA-seq.

    PubMed

    Külahoglu, Canan; Bräutigam, Andrea

    2014-01-01

    RNA-seq has emerged as the technology of choice to quantify gene expression. This technology is a convenient, accurate tool for quantifying diurnal changes in gene expression, gene discovery, differential promoter use, and splice variants for all genes expressed in a single tissue. Thus, RNA-seq experiments provide sequence information and absolute expression values for transcripts, in addition to the relative quantification available with microarrays or qRT-PCR. The depth of information gained by sequencing requires careful assessment of RNA intactness and DNA contamination. Although RNA-seq is comparatively recent, a standard analysis framework has emerged around the Bowtie2, TopHat, and Cufflinks packages. With the rising popularity of RNA-seq, these tools have become manageable for researchers without much bioinformatics knowledge or programming skills. Here, we present a workflow for an RNA-seq experiment from experimental planning to biological data extraction. PMID:24792045

  5. Quantitative surface spectroscopic analysis of multicomponent polymers

    NASA Astrophysics Data System (ADS)

    Zhuang, Hengzhong

    Angle-dependent electron spectroscopy for chemical analysis (ESCA) has been successfully used to examine the surface compositional gradient of a multicomponent polymer. However, the photoelectron intensities detected at each take-off angle of ESCA measurements are convoluted signals. The convoluted nature of the signal distorts depth profiles for samples having compositional gradients. To recover the true concentration profiles for the samples, a deconvolution program is described in Chapter 2. The compositional profiles of two classes of important multicomponent polymers, i.e., poly(dimethylsiloxane urethane) (PU-DMS) segmented copolymers and fluorinated poly(amide urethane) block copolymers, are obtained using this program. The effects of the polymer molecular structure and of processing variations on the surface compositional profile have been studied. Besides surface composition, it is desirable to know whether the distribution of segment or block lengths at the surface differs from that in the bulk, because this aspect of surface structure may lead to properties different from those predicted simply from knowledge of the surface composition and the bulk structure. In Chapter 3, we pioneered the direct determination of the distribution of polydimethylsiloxane (PDMS) segment lengths at the surface of PU-DMS using time-of-flight secondary ion mass spectrometry (SIMS). Exciting preliminary results are provided: for the thick film of PU-DMS with nominal MW of PDMS = 1000, the distribution of PDMS segment lengths at the surface is nearly identical to that in the bulk, whereas in the case of the thick films of PU-DMS with nominal MW of PDMS = 2400, only those PDMS segments with MW of ca. 1000 preferentially segregate at the surface. As potential minimal-fouling coatings or biocompatible cardiovascular materials, PU-DMS copolymers eventually come into contact with water once in use. 
Could such an environmental change (from air to aqueous) induce any undesirable

  6. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media or plate-count technique on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effects of individual factors but also their interactions to be estimated. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate count methods. PMID:26456251
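Combining the elements of uncertainty mathematically, as the abstract describes, typically means summing independent relative components in quadrature. A minimal sketch; the component names and magnitudes below are illustrative assumptions, not values from the paper.

```python
import math

# Illustrative relative uncertainty components (fractions, not the paper's data):
components = {
    "microorganism_type": 0.20,
    "product_matrix": 0.15,
    "reading_error": 0.18,
}

def combined_relative_uncertainty(parts):
    """Combine independent (uncorrelated) relative uncertainty components
    in quadrature, the standard root-sum-of-squares rule."""
    return math.sqrt(sum(u ** 2 for u in parts))

u_c = combined_relative_uncertainty(components.values())
print(f"combined relative uncertainty: {u_c:.1%}")
```

With these placeholder components the combined figure stays below the 35% bound the paper cites as acceptable for plate count methods.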

  7. [Qualitative and quantitative gamma-hydroxybutyrate analysis].

    PubMed

    Petek, Maja Jelena; Vrdoljak, Ana Lucić

    2006-12-01

    Gamma-hydroxybutyrate (GHB) is a naturally occurring compound present in the brain and peripheral tissues of mammals. It is a minor metabolite and precursor of gamma-aminobutyric acid (GABA). Like GABA, GHB is believed to play a role in neurotransmission. GHB was first synthesized in vitro in 1960, when its depressant and hypnotic effects on the central nervous system were revealed. In the 1960s it was used as an anaesthetic and later as an alternative to anabolic steroids, in order to enhance muscle growth. However, after it was shown to cause strong physical dependence and severe side effects, GHB was banned. For the last fifteen years, GHB has been abused for its intoxicating effects such as euphoria, reduced inhibitions, and sedation. Illicitly, it is available as a white powder or a clear liquid. Paradoxically, GHB can easily be manufactured from its precursor gamma-butyrolactone (GBL), which has not yet been banned. Because of the many car accidents and criminal acts in which it is involved, GHB has become an important object of forensic laboratory analysis. This paper describes gas and liquid chromatography, infrared spectroscopy, microscopy, colourimetry, and nuclear magnetic resonance as methods for the detection and quantification of GHB in urine and illicit products. PMID:17265679

  8. Quantitative analysis of in vivo cell proliferation.

    PubMed

    Cameron, Heather A

    2006-11-01

    Injection and immunohistochemical detection of 5-bromo-2'-deoxyuridine (BrdU) has become the standard method for studying the birth and survival of neurons, glia, and other cell types in the nervous system. BrdU, a thymidine analog, becomes stably incorporated into DNA during the S-phase of the cell cycle. Because DNA containing BrdU can be specifically recognized by antibodies, this method allows dividing cells to be marked at any given time and then identified at time points from a few minutes to several years later. BrdU immunohistochemistry is suitable for cell counting to examine the regulation of cell proliferation and cell fate. It can be combined with labeling by other antibodies, allowing confocal analysis of cell phenotype or expression of other proteins. The potential for nonspecific labeling and toxicity is discussed. Although BrdU immunohistochemistry has almost completely replaced tritiated thymidine autoradiography for labeling dividing cells, this older method and the situations in which it is still useful are also described. PMID:18428635

  9. Control of separation and quantitative analysis by GC-FTIR

    NASA Astrophysics Data System (ADS)

    Semmoud, A.; Huvenne, Jean P.; Legrand, P.

    1992-03-01

    Software for 3-D representations of the 'Absorbance-Wavenumber-Retention time' is used to control the quality of the GC separation. Spectral information given by the FTIR detection allows the user to be sure that a chromatographic peak is 'pure.' The analysis of peppermint essential oil is presented as an example. This assurance is absolutely required for quantitative applications. In these conditions, we have worked out a quantitative analysis of caffeine. Correlation coefficients between integrated absorbance measurements and concentration of caffeine are discussed at two steps of the data treatment.
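The correlation between integrated absorbance and concentration that the abstract discusses is an ordinary linear calibration. A minimal sketch of the fit and its correlation coefficient; the caffeine concentrations and absorbance values below are hypothetical, not the paper's measurements.

```python
def linear_fit(x, y):
    """Ordinary least-squares line y = a + b*x and the Pearson correlation
    coefficient r, the statistic used to judge calibration quality."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    r = sxy / (sxx * syy) ** 0.5
    return a, b, r

# Hypothetical caffeine standards: concentration (mg/mL) vs. integrated absorbance
conc = [0.1, 0.2, 0.4, 0.8]
absorb = [0.21, 0.39, 0.82, 1.58]
a, b, r = linear_fit(conc, absorb)
print(f"intercept {a:.3f}, slope {b:.3f}, r {r:.4f}")
```

A correlation coefficient near 1 and an intercept near zero are what one checks at each step of the data treatment before trusting the quantitative result.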

  10. Configuration analysis of nickel hydrogen cell

    NASA Technical Reports Server (NTRS)

    Holleck, G.

    1978-01-01

    The significance of various stack configurations and components for the cycle life of nickel hydrogen cells intended for synchronous orbit use was evaluated. Failure modes involving electrolyte management and O2 management were resolved through modifications to the reservoir, the wick, and/or the stack configuration.

  11. Hydrogen Technical Analysis -- Dissemination of Information

    SciTech Connect

    George Kervitsky, Jr.

    2006-03-20

    SENTECH is a small energy and environmental consulting firm providing technical, analytical, and communications solutions to technology management issues. The activities proposed by SENTECH focused on gathering and developing communications materials and information, and on various dissemination activities to present the benefits of hydrogen energy to a broad audience while at the same time establishing permanent communications channels to enable continued two-way dialog with these audiences in future years. Effective communications and information dissemination are critical to the acceptance of new technology. Hydrogen technologies face the additional challenge of safety preconceptions formed primarily as a result of the crash of the Hindenburg. Effective communications play a key role in all aspects of human interaction and will help to overcome the perceptual barriers, whether of safety, economics, or benefits. As originally proposed, SENTECH identified three distinct information dissemination activities to address three distinct but important audiences; these formed the basis for the task structure used in phases 1 and 2. The tasks were: (1) Print information--brochures that target specific segments of the population, to be distributed via relevant technical conferences and traditional distribution channels. (2) Face-to-face meetings--with industries identified as having a stake in hydrogen energy; the three industry audiences are architect/engineering firms, renewable energy firms, and energy companies that have not made a commitment to hydrogen. (3) Educational forums--the final audience is students--the future engineers, technicians, and energy consumers; SENTECH will expand on its previous educational work in this area. 
The communications activities proposed by SENTECH and completed as a result of this cooperative agreement were designed to complement the research and development work funded by the DOE by presenting the technical achievements and validations

  12. Ab initio charge analysis of pure and hydrogenated perovskites

    NASA Astrophysics Data System (ADS)

    Bork, N.; Bonanos, N.; Rossmeisl, J.; Vegge, T.

    2011-02-01

    We present a density functional theory based Bader analysis of the charge distribution in pure and hydrogenated SrTiO3. We find that the hydrogen defect carries a +0.56e charge and the OH defect carries a +0.50e charge compared to the host oxygen. Calculations on BaNbO3, CaTiO3, and SrZrO3 support these findings. The distribution of the remaining electronic density decays exponentially with distance to the hydrogen defect. Diffusional paths are calculated wherein the hydrogenic species retain a charge between +0.57 and +0.54e showing that hydrogen permeation should not be viewed as consisting of virtually independent protonic and electronic transport processes.

  13. Hydrogen detection near surfaces and shallow interfaces with resonant nuclear reaction analysis

    NASA Astrophysics Data System (ADS)

    Wilde, Markus; Fukutani, Katsuyuki

    2014-12-01

    This review introduces hydrogen depth profiling by nuclear reaction analysis (NRA) via the resonant 1H(15N,αγ)12C reaction as a versatile method for the highly depth-resolved observation of hydrogen (H) at solid surfaces and interfaces. The technique is quantitative, non-destructive, and readily applied to a large variety of materials. Its fundamentals, instrumental requirements, advantages and limitations are described in detail, and its main performance benchmarks in terms of depth resolution and sensitivity are compared to those of elastic recoil detection (ERD) as a competing method. The wide range of 1H(15N,αγ)12C NRA applications in research of hydrogen-related phenomena at surfaces and interfaces is reviewed. Special emphasis is placed on the powerful combination of 1H(15N,αγ)12C NRA with surface science techniques of in-situ target preparation and characterization, as the NRA technique is ideally suited to investigate hydrogen interactions with atomically controlled surfaces and intact interfaces. In conjunction with thermal desorption spectroscopy, 15N NRA can assess the thermal stability of absorbed hydrogen species in different depth locations against diffusion and desorption. Hydrogen diffusion dynamics in the near-surface region, including transitions of hydrogen between the surface and the bulk, and between shallow interfaces of nanostructured thin layer stacks can directly be visualized. As a unique feature of 15N NRA, the analysis of Doppler-broadened resonance excitation curves allows for the direct measurement of the zero-point vibrational energy of hydrogen atoms adsorbed on single crystal surfaces.
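The depth-profiling principle behind ¹⁵N NRA can be sketched numerically: the reaction is resonant at a fixed ¹⁵N energy (6.385 MeV), so raising the beam energy above the resonance shifts the reaction deeper into the target by the excess energy divided by the stopping power. The stopping-power value below is an illustrative placeholder, not a tabulated figure for any particular material.

```python
def nra_probe_depth(e_beam_keV, e_res_keV=6385.0, stopping_keV_per_nm=1.5):
    """Depth probed in a 1H(15N,ag)12C profile: the resonant reaction occurs
    where the slowing 15N ions reach the resonance energy, i.e. at depth
    (E_beam - E_res) / S, with S the 15N stopping power of the target.
    S here is an assumed round number for illustration only."""
    if e_beam_keV < e_res_keV:
        return 0.0  # below resonance: no resonant yield from the bulk
    return (e_beam_keV - e_res_keV) / stopping_keV_per_nm

# Scanning the beam 150 keV above resonance probes ~100 nm below the surface:
print(round(nra_probe_depth(6535.0), 1))  # → 100.0
```

Stepping the beam energy and recording the gamma yield at each step therefore traces the hydrogen concentration as a function of depth.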

  14. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
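The effect of coverage on reliability can be illustrated with the simplest possible redundant structure. This is a generic duplex-with-coverage sketch, not the F18 FCS digraph model, and the failure probabilities are made up for illustration.

```python
def two_unit_standby_reliability(r, c):
    """Reliability of a primary/spare pair under a simple coverage model:
    the system survives if the primary works, or if the primary fails but
    the failure is covered (detected and recovered) with probability c and
    the spare then works.  Illustrative structure only."""
    return r + (1.0 - r) * c * r

perfect = two_unit_standby_reliability(0.99, 1.0)   # ideal coverage
partial = two_unit_standby_reliability(0.99, 0.95)  # imperfect coverage
print(perfect, partial)
```

Even a 5% coverage shortfall multiplies the unreliability severalfold here (from 1e-4 to about 6e-4), which is the qualitative point the paper makes about including coverage in the reliability analysis.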

  15. A hydrogen energy carrier. Volume 2: Systems analysis

    NASA Technical Reports Server (NTRS)

    Savage, R. L. (Editor); Blank, L. (Editor); Cady, T. (Editor); Cox, K. (Editor); Murray, R. (Editor); Williams, R. D. (Editor)

    1973-01-01

    A systems analysis of hydrogen as an energy carrier in the United States indicated that it is feasible to use hydrogen in all energy use areas, except some types of transportation. These use areas are industrial, residential and commercial, and electric power generation. Saturation concept and conservation concept forecasts of future total energy demands were made. Projected costs of producing hydrogen from coal or from nuclear heat combined with thermochemical decomposition of water are in the range $1.00 to $1.50 per million Btu of hydrogen produced. Other methods are estimated to be more costly. The use of hydrogen as a fuel will require the development of large-scale transmission and storage systems. A pipeline system similar to the existing natural gas pipeline system appears practical, if design factors are included to avoid hydrogen environment embrittlement of pipeline metals. Conclusions from the examination of the safety, legal, environmental, economic, political and societal aspects of hydrogen fuel are that a hydrogen energy carrier system would be compatible with American values and the existing energy system.

  16. Quantitative transverse flow measurement using OCT speckle decorrelation analysis

    PubMed Central

    Liu, Xuan; Huang, Yong; Ramella-Roman, Jessica C.; Mathews, Scott A.; Kang, Jin U.

    2014-01-01

    We propose an inter-A-scan speckle decorrelation-based method that can quantitatively assess blood flow normal to the direction of the OCT imaging beam. To validate this method, we performed a systematic study using both phantom and in vivo animal models. Results show that our speckle analysis method can accurately extract transverse flow speed with high spatial and temporal resolution. PMID:23455305
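The core statistic of such a method is the decorrelation between successive A-scan intensity profiles: the faster the transverse flow, the more the speckle pattern changes within the fixed inter-A-scan time. The sketch below shows only that statistic, not the paper's full calibration from decorrelation to absolute speed; the sample profiles are invented.

```python
def decorrelation(a, b):
    """1 minus the Pearson correlation between two successive A-scan
    intensity profiles.  Identical profiles give 0; faster transverse
    motion within the inter-A-scan time gives larger values."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return 1.0 - cov / (va * vb) ** 0.5

scan = [1.0, 3.0, 2.0, 5.0]                    # hypothetical A-scan intensities
identical = decorrelation(scan, scan)          # static scatterers
noisy = decorrelation(scan, [1.2, 2.8, 2.3, 4.7])  # slightly displaced speckle
```

Mapping decorrelation values to flow speed then requires a calibration, which is what the phantom experiments in the paper provide.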

  17. Early Child Grammars: Qualitative and Quantitative Analysis of Morphosyntactic Production

    ERIC Educational Resources Information Center

    Legendre, Geraldine

    2006-01-01

    This article reports on a series of 5 analyses of spontaneous production of verbal inflection (tense and person-number agreement) by 2-year-olds acquiring French as a native language. A formal analysis of the qualitative and quantitative results is developed using the unique resources of Optimality Theory (OT; Prince & Smolensky, 2004). It is…

  18. Tracer-based laser-induced fluorescence measurement technique for quantitative fuel/air-ratio measurements in a hydrogen internal combustion engine.

    PubMed

    Blotevogel, Thomas; Hartmann, Matthias; Rottengruber, Hermann; Leipertz, Alfred

    2008-12-10

    A measurement technique for the quantitative investigation of mixture formation processes in hydrogen internal combustion engines (ICEs) has been developed using tracer-based laser-induced fluorescence (TLIF). This technique can be employed in both fired and motored engine operation. The quantitative TLIF fuel/air-ratio results have been verified by means of linear Raman scattering measurements. Exemplary results of the simultaneous investigation of mixture formation and combustion, obtained in an optically accessible hydrogen ICE, are shown. PMID:19079454

  19. Quantitating the subtleties of microglial morphology with fractal analysis

    PubMed Central

    Karperien, Audrey; Ahammer, Helmut; Jelinek, Herbert F.

    2013-01-01

    It is well established that microglial form and function are inextricably linked. In recent years, the traditional view that microglial form ranges between “ramified resting” and “activated amoeboid” has been emphasized through advancing imaging techniques that point to microglial form being highly dynamic even within the currently accepted morphological categories. Moreover, microglia adopt meaningful intermediate forms between categories, with considerable crossover in function and varying morphologies as they cycle, migrate, wave, phagocytose, and extend and retract fine and gross processes. From a quantitative perspective, it is problematic to measure such variability using traditional methods, but one way of quantitating such detail is through fractal analysis. The techniques of fractal analysis have been used for quantitating microglial morphology, to categorize gross differences but also to differentiate subtle differences (e.g., amongst ramified cells). Multifractal analysis in particular is one technique of fractal analysis that may be useful for identifying intermediate forms. Here we review current trends and methods of fractal analysis, focusing on box counting analysis, including lacunarity and multifractal analysis, as applied to microglial morphology. PMID:23386810
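Box counting, the central technique the review describes, estimates a fractal dimension from how the number of occupied boxes scales with box size. A minimal sketch on a set of 2-D points; real microglial analyses work on segmented cell outlines and use dedicated software.

```python
import math

def box_count_dimension(points, sizes):
    """Estimate the box-counting dimension of a 2-D point set: count the
    occupied boxes N(s) at each box size s, then fit the slope of
    log N(s) against log(1/s) by least squares."""
    logs, logN = [], []
    for s in sizes:
        boxes = {(int(x // s), int(y // s)) for x, y in points}
        logs.append(math.log(1.0 / s))
        logN.append(math.log(len(boxes)))
    n = len(sizes)
    mx, my = sum(logs) / n, sum(logN) / n
    num = sum((a - mx) * (b - my) for a, b in zip(logs, logN))
    den = sum((a - mx) ** 2 for a in logs)
    return num / den

# Sanity check: points along a straight line should give a dimension near 1.
line = [(i * 0.01, i * 0.01) for i in range(1000)]
d = box_count_dimension(line, [0.05, 0.1, 0.2, 0.5])
print(round(d, 2))
```

A ramified cell outline would fall between 1 (a smooth curve) and 2 (a filled plane), which is the range where the subtle morphological differences discussed above show up.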

  20. Development of a quantitative autoradiography image analysis system

    SciTech Connect

    Hoffman, T.J.; Volkert, W.A.; Holmes, R.A.

    1986-03-01

    A low-cost image analysis system suitable for quantitative autoradiography (QAR) analysis has been developed. Autoradiographs can be digitized using a conventional Newvicon television camera interfaced to an IBM-XT microcomputer. Software routines for image digitization and capture permit the acquisition of thresholded or windowed images with graphic overlays that can be saved to storage devices. Image analysis software performs all background and non-linearity corrections prior to display as black/white or pseudocolor images. Relating pixel intensity to a standard radionuclide concentration allows the production of quantitative maps of tissue radiotracer concentrations. An easily modified subroutine is provided so that the appropriate operational equations can be applied when parameters such as regional cerebral blood flow or regional cerebral glucose metabolism are under investigation. This system could provide smaller research laboratories with the capability for QAR analysis at relatively low cost.

  1. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings from a group of subjects with significant coronary artery stenosis are given. A group of controls determined by use of a quantitative method for the study of regional myocardial performance based on the frame-by-frame analysis of biplane left ventricular angiograms are presented. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions, timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  2. Quantitative numerical analysis of transient IR-experiments on buildings

    NASA Astrophysics Data System (ADS)

    Maierhofer, Ch.; Wiggenhauser, H.; Brink, A.; Röllig, M.

    2004-12-01

    Impulse-thermography has been established as a fast and reliable tool in many areas of non-destructive testing. In recent years, several investigations have applied active thermography to civil engineering. For quantitative investigations in this area of application, finite difference calculations have been performed for systematic studies of the influence of environmental conditions, heating power and time, defect depth and size, and the thermal properties of the bulk material (concrete). The comparison of simulated and experimental data enables the quantitative analysis of defects.
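The finite difference calculations mentioned above solve the transient heat equation in the structure. A minimal 1-D explicit (FTCS) sketch; the diffusivity is a rough literature figure for concrete and the heating scenario is invented, so this only illustrates the numerical scheme, not the paper's model.

```python
def heat_step(u, alpha, dt, dx):
    """One explicit finite-difference (FTCS) step of the 1-D heat equation
    u_t = alpha * u_xx with fixed-temperature boundaries.  Stable only for
    r = alpha*dt/dx**2 <= 0.5."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme unstable for this step size"
    return [u[0]] + [u[i] + r * (u[i + 1] - 2 * u[i] + u[i - 1])
                     for i in range(1, len(u) - 1)] + [u[-1]]

# Concrete slab at 20 C whose surface is suddenly held at 60 C by the heater:
alpha = 8e-7          # approximate thermal diffusivity of concrete, m^2/s
dx, dt = 0.01, 50.0   # 1 cm grid, 50 s time step -> r = 0.4 (stable)
u = [60.0] + [20.0] * 20
for _ in range(100):
    u = heat_step(u, alpha, dt, dx)
```

A buried defect would be modelled by locally changing the thermal properties, and the resulting surface-temperature transient is what gets compared with the IR measurement.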

  3. Scanning tunneling microscopy on rough surfaces-quantitative image analysis

    NASA Astrophysics Data System (ADS)

    Reiss, G.; Brückl, H.; Vancea, J.; Lecheler, R.; Hastreiter, E.

    1991-07-01

    In this communication, the application of scanning tunneling microscopy (STM) for a quantitative evaluation of roughnesses and mean island sizes of polycrystalline thin films is discussed. Provided strong conditions concerning the resolution are satisfied, the results are in good agreement with standard techniques as, for example, transmission electron microscopy. Owing to its high resolution, STM can supply a better characterization of surfaces than established methods, especially concerning the roughness. Microscopic interpretations of surface dependent physical properties thus can be considerably improved by a quantitative analysis of STM images.

  4. Quantitative analysis of culture using millions of digitized books

    PubMed Central

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  5. Improved method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

    An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, which utilizes a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low-electrical-conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  6. Quantitative analysis of culture using millions of digitized books.

    PubMed

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  7. Quantitative deuterium analysis of titanium samples in ultraviolet laser-induced low-pressure helium plasma.

    PubMed

    Abdulmadjid, Syahrun Nur; Lie, Zener Sukra; Niki, Hideaki; Pardede, Marincan; Hedwig, Rinda; Lie, Tjung Jie; Jobiliong, Eric; Kurniawan, Koo Hendrik; Fukumoto, Ken-Ichi; Kagawa, Kiichiro; Tjia, May On

    2010-04-01

    An experimental study of ultraviolet (UV) laser-induced plasma spectroscopy (LIPS) on Ti samples in low-pressure surrounding He gas has been carried out to demonstrate its applicability to the quantitative micro-analysis of deuterium impurities in titanium without spectral interference from the ubiquitous surface water. This was achieved by adopting the optimal experimental condition ascertained in this study, namely 5 mJ laser energy, 10 Torr helium pressure, and a 1-50 µs measurement window, which resulted in consistent D emission enhancement and effective elimination of spectral interference from surface water. As a result, a linear calibration line exhibiting a zero intercept was obtained from Ti samples doped with various D impurity concentrations. An additional measurement also yielded a detection limit of about 40 ppm for D impurity, well below the acceptable threshold of damaging H concentration in Ti and its alloys. Each of these measurements was found to produce a crater of only 25 µm in diameter, and they may therefore qualify as nondestructive measurements. The results of this study have therefore paved the way for further experiments with hydrogen-doped Ti samples and for the technical implementation of quantitative micro-analysis of detrimental hydrogen impurities in Ti metal and its alloys, which is the ultimate goal of this study. PMID:20412619
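A detection limit derived from a linear calibration line, as reported above, is conventionally computed with the 3-sigma criterion. The slope and blank noise below are hypothetical numbers chosen only to show the arithmetic, not the paper's measured values.

```python
def detection_limit(slope, blank_sd, k=3.0):
    """Limit of detection from a linear calibration: LOD = k * sigma_blank
    / slope, with k = 3 the common 3-sigma criterion."""
    return k * blank_sd / slope

# Hypothetical: slope 2.5e-4 intensity units per ppm D, blank noise 3.3e-3
lod_ppm = detection_limit(2.5e-4, 3.3e-3)
print(round(lod_ppm))  # → 40
```

A zero intercept matters here because any residual surface-water signal would raise the blank and hence the achievable detection limit.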

  8. Analysis of Hydrogen and Competing Technologies for Utility-Scale Energy Storage (Presentation)

    SciTech Connect

    Steward, D.

    2010-02-11

    Presentation about the National Renewable Energy Laboratory's analysis of hydrogen energy storage scenarios, including analysis framework, levelized cost comparison of hydrogen and competing technologies, analysis results, and conclusions drawn from the analysis.

  9. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, both for quantitative assays and for qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capture, lectin-specific capture, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with reference to recently published studies. PMID:24889823

  10. Analysis of experimental hydrogen engine data and hydrogen vehicle performance and emissions simulation

    SciTech Connect

    Aceves, S.A.

    1996-10-01

    This paper reports the engine and vehicle simulation and analysis done at Lawrence Livermore (LLNL) as part of a joint optimized hydrogen engine development effort. Project participants are: Sandia National Laboratory; Los Alamos National Laboratory; and the University of Miami. Fuel cells are considered the ideal power source for future vehicles, due to their high efficiency and low emissions. However, extensive use of fuel cells in light-duty vehicles is likely to be years away, due to their high manufacturing cost. Hydrogen-fueled, spark-ignited, homogeneous-charge engines offer a near-term alternative to fuel cells. Hydrogen in a spark-ignited engine can be burned at very low equivalence ratios. NO{sub x} emissions can be reduced to less than 10 ppm without a catalyst. HC and CO emissions may result from oxidation of engine oil but, with proper design, are negligible (a few ppm). Lean operation also results in increased indicated efficiency due to the thermodynamic properties of the gaseous mixture contained in the cylinder. The high effective octane number of hydrogen allows the use of a high compression ratio, further increasing engine efficiency. In this paper, a simplified engine model is used for predicting hydrogen engine efficiency and emissions. The model uses basic thermodynamic equations for the compression and expansion processes, along with an empirical correlation for heat transfer, to predict engine indicated efficiency. A friction correlation and a supercharger/turbocharger model are then used to calculate brake thermal efficiency. The model is validated against many experimental points obtained in a recent evaluation of a hydrogen research engine. The experimental data are used to adjust the empirical constants in the heat release rate and heat transfer correlations. The results indicate that hydrogen lean-burn spark-ignited engines can provide Equivalent Zero Emission Vehicle (EZEV) levels in either a series hybrid or a conventional automobile.
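The efficiency benefit of a higher compression ratio that the abstract attributes to hydrogen's high effective octane number follows from the ideal Otto-cycle relation. A minimal sketch of that textbook limit; the paper's actual model adds heat-transfer and friction corrections, and the compression ratios used here are illustrative.

```python
def otto_efficiency(compression_ratio, gamma=1.4):
    """Ideal Otto-cycle indicated efficiency: eta = 1 - r**(1 - gamma),
    where r is the compression ratio and gamma the ratio of specific
    heats of the working gas (lean mixtures approach air's ~1.4)."""
    return 1.0 - compression_ratio ** (1.0 - gamma)

eta_high = otto_efficiency(14.0)  # compression ratio hydrogen can tolerate
eta_low = otto_efficiency(10.0)   # typical gasoline-limited ratio
print(f"{eta_low:.3f} -> {eta_high:.3f}")
```

Lean operation helps twice in this picture: it permits the higher r and it raises the effective gamma of the in-cylinder mixture.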

  11. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, on the skin of a human foot and of a face. The full source code of the developed application is also provided as an attachment. (Figure: the main window of the program during dynamic analysis of the foot thermal image.) PMID:26556680

  12. An Analysis of Critical Factors for Quantitative Immunoblotting

    PubMed Central

    Janes, Kevin A.

    2015-01-01

    Immunoblotting (also known as Western blotting) combined with digital image analysis can be a reliable method for analyzing the abundance of proteins and protein modifications, but not every immunoblot-analysis combination produces an accurate result. Here, I illustrate how sample preparation, protocol implementation, detection scheme, and normalization approach profoundly affect the quantitative performance of immunoblotting. This study implemented diagnostic experiments that assess an immunoblot-analysis workflow for accuracy and precision. The results showed that ignoring such diagnostics can lead to pseudoquantitative immunoblot data that dramatically overestimate or underestimate true differences in protein abundance. PMID:25852189

  13. An Improved Quantitative Analysis Method for Plant Cortical Microtubules

    PubMed Central

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has so far been paid to quantitative image analysis of plant cortical microtubules. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied to preprocess the original microtubule image. The Intrinsic Mode Function 1 (IMF1) image obtained from the decomposition was then selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. To further verify its reliability, the proposed texture analysis method was used to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges while reducing noise, and that the geometrical characteristics of the texture were pronounced. Four texture parameters extracted by GLCM clearly reflected the different arrangements in the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies a reference for other similar studies. PMID:24744684
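    GLCM texture parameters of the kind used here can be computed directly from a co-occurrence matrix. A minimal pure-NumPy sketch on synthetic images (a generic illustration; the paper's pipeline additionally applies BEMD before the GLCM step):

```python
import numpy as np

def glcm_features(img, levels=8):
    """Horizontal grey-level co-occurrence matrix and three texture
    parameters (img values assumed scaled to [0, 1])."""
    q = np.round(img * (levels - 1)).astype(int)     # quantize grey levels
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1                              # count horizontal pairs
    p = glcm / glcm.sum()                            # normalize to probabilities
    i, j = np.indices(p.shape)
    return {"contrast": float(((i - j) ** 2 * p).sum()),
            "energy": float((p ** 2).sum()),
            "homogeneity": float((p / (1 + np.abs(i - j))).sum())}

flat = glcm_features(np.zeros((8, 8)))               # featureless image
striped = glcm_features(np.tile([0.0, 1.0], (8, 4))) # strongly oriented texture
```

    A striped (microtubule-like) texture yields much higher contrast and lower energy than a featureless image, which is how such parameters separate different filament arrangements.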

  14. Data from quantitative label free proteomics analysis of rat spleen.

    PubMed

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work was obtained by label-free quantitative proteomic analysis of rat spleen. A robust method for extracting proteins from rat spleen tissue for LC-MS/MS analysis was developed using a urea- and SDS-based buffer, and different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); the Progenesis and MaxQuant outputs are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins and the variability in the quantitative analysis associated with each sampling strategy to be assessed, and allow a proper number of replicates to be defined for future quantitative analyses. PMID:27358910

  15. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures for quantitatively evaluating the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other natural hazards, either directly or through cascade ('domino') effects. This paper addresses the integration of structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) through a representative case study of an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined for both building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in southern Italy. Once the seismic failure probabilities were quantified, consequence analysis was performed for those events that may be triggered by loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with a QRA obtained by considering only process-related top events is reported for reference. PMID:15908107
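    The core "crossing" step, combining a component fragility curve with a discretized hazard, reduces to a weighted sum over ground-motion levels. A minimal sketch with a hypothetical lognormal fragility and invented hazard numbers (not the study's curves or site data):

```python
import numpy as np
from math import erf, log, sqrt

def fragility(pga, median=0.4, beta=0.5):
    """Lognormal fragility curve: P(failure | PGA), with median capacity
    and logarithmic standard deviation beta (hypothetical values)."""
    return 0.5 * (1.0 + erf(log(pga / median) / (beta * sqrt(2.0))))

# discretized hazard: annual probability of shaking in each PGA bin (invented)
pga_bins  = np.array([0.1, 0.2, 0.4, 0.8])     # peak ground acceleration, g
p_shaking = np.array([1e-2, 3e-3, 8e-4, 1e-4]) # annual probability per bin

# annual failure probability = sum over bins of P(fail | PGA) * P(PGA)
p_fail = sum(fragility(g) * p for g, p in zip(pga_bins, p_shaking))
```

    The resulting `p_fail` is what then feeds the consequence analysis for loss-of-containment events.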

  16. Hydrogen and deuterium loss from the terrestrial atmosphere - A quantitative assessment of nonthermal escape fluxes

    NASA Technical Reports Server (NTRS)

    Yung, Yuk L.; Wen, Jun-Shan; Moses, Julianne I.; Landry, Bridget M.; Allen, Mark; Hsu, Kuang-Jung

    1989-01-01

    A comprehensive one-dimensional photochemical model extending from the middle atmosphere (50 km) to the exobase (432 km) has been used to study the escape of hydrogen and deuterium from the earth's atmosphere. The model incorporates recent advances in chemical kinetics as well as atmospheric observations by satellites, especially the Atmosphere Explorer C satellite. The results suggest that the escape fluxes of both H and D are limited by the upward transport of total hydrogen and total deuterium at the homopause. About one fourth of total hydrogen escape is thermal, the rest being nonthermal. It is shown that escape of D is nonthermal and that charge exchange and polar wind are important mechanisms for the nonthermal escape of H and D.

  17. Water-hydrogen isotope exchange process analysis

    SciTech Connect

    Fedorchenko, O.; Alekseev, I.; Uborsky, V.

    2008-07-15

    A numerical method is needed to solve the equation system describing the general case of heterogeneous isotope exchange between gaseous hydrogen and liquid water in a column. A computer model of the column that merely outputs the isotope compositions in the flows leaving the column is, like the experimental column itself, a 'black box' to a certain extent: the solution is not transparent and occasionally not fully comprehended. An approximate analytical solution was derived from the ZXY-diagram (McCabe-Thiele diagram), which illustrates the solution of the renewed computer model called 'EVIO-4.2'. Several 'unusual' results and dependences have been analyzed and explained. (authors)

  18. Single-Molecule Sensors: Challenges and Opportunities for Quantitative Analysis.

    PubMed

    Gooding, J Justin; Gaus, Katharina

    2016-09-12

    Measurement science has been converging to smaller and smaller samples, such that it is now possible to detect single molecules. This Review focuses on the next generation of analytical tools that combine single-molecule detection with the ability to measure many single molecules simultaneously and/or process larger and more complex samples. Such single-molecule sensors constitute a new type of quantitative analytical tool, as they perform analysis by molecular counting and thus potentially capture the heterogeneity of the sample. This Review outlines the advantages and potential of these new, quantitative single-molecule sensors, the measurement challenges in making single-molecule devices suitable for analysis, the inspiration biology provides for overcoming these challenges, and some of the solutions currently being explored. PMID:27444661

  19. Real-Time Quantitative Analysis of H2, He, O2, and Ar by Quadrupole Ion Trap Mass Spectrometry

    NASA Technical Reports Server (NTRS)

    Ottens, Andrew K.; Harrison, W. W.; Griffin, Timothy P.; Helms, William R.; Voska, N. (Technical Monitor)

    2002-01-01

    The use of a quadrupole ion trap mass spectrometer for quantitative analysis of hydrogen and helium as well as other permanent gases is demonstrated. The customized instrument utilizes the mass selective instability mode of mass analysis as with commercial instruments; however, this instrument operates at a greater RF trapping frequency and without a buffer gas. With these differences, a useable mass range from 2 to over 50 Da is achieved, as required by NASA for monitoring the Space Shuttle during a launch countdown. The performance of the ion trap is evaluated using part-per-million concentrations of hydrogen, helium, oxygen and argon mixed into a nitrogen gas stream. Relative accuracy and precision when quantitating the four analytes were better than the NASA-required minimum of 10% error and 5% deviation, respectively. Limits of detection were below the NASA requirement of 25-ppm hydrogen and 100-ppm helium; those for oxygen and argon were slightly higher than the requirement. The instrument provided adequate performance at fast data recording rates, demonstrating the utility of an ion trap mass spectrometer as a real-time quantitative monitoring device for permanent gas analysis.
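    The acceptance criteria quoted (relative error under 10%, relative standard deviation under 5%) can be checked mechanically from replicate readings. A small sketch with invented numbers for a hypothetical 25 ppm hydrogen standard; this is a generic illustration of the figures of merit, not the instrument's processing chain:

```python
import statistics

def quantitation_check(measured, true_ppm, max_error=0.10, max_rsd=0.05):
    """Relative accuracy and precision of replicate concentration readings
    against the stated acceptance limits."""
    mean = statistics.mean(measured)
    rel_error = abs(mean - true_ppm) / true_ppm      # accuracy criterion
    rsd = statistics.stdev(measured) / mean          # precision criterion
    return rel_error <= max_error and rsd <= max_rsd

# hypothetical replicate readings for a 25 ppm hydrogen standard
ok = quantitation_check([24.1, 25.6, 24.8, 25.3], 25.0)
```

    A scattered replicate set such as `[20.0, 30.0, 25.0, 25.0]` passes the accuracy test but fails the 5% precision limit.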

  20. Quantitative NMR Analysis of Partially Substituted Biodiesel Glycerols

    SciTech Connect

    Nagy, M.; Alleman, T. L.; Dyer, T.; Ragauskas, A. J.

    2009-01-01

    Phosphitylation of hydroxyl groups in biodiesel samples with 2-chloro-4,4,5,5-tetramethyl-1,3,2-dioxaphospholane followed by 31P-NMR analysis provides a rapid quantitative analytical technique for the determination of substitution patterns on partially esterified glycerols. The unique 31P-NMR chemical shift data were established with a series of mono- and di-substituted fatty acid esters of glycerol and then utilized to characterize an industrial sample of partially processed biodiesel.

  1. Quantitative analysis of chaotic synchronization by means of coherence

    NASA Astrophysics Data System (ADS)

    Shabunin, A.; Astakhov, V.; Kurths, J.

    2005-07-01

    We use an index of chaotic synchronization based on the averaged coherence function for quantitative analysis of the loss of complete synchronization in unidirectionally coupled oscillators and maps. We demonstrate that this index distinguishes different stages of the synchronization breakdown. It is invariant to time delay and insensitive to small noise and distortions, which can affect the signals accessible to measurement. Peculiarities of the destruction of synchronization in maps and oscillators are investigated.
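    An averaged-coherence index of this general kind can be estimated from segment-averaged cross-spectra. A minimal NumPy sketch (a generic magnitude-squared coherence estimate, not the authors' exact index or coupled systems):

```python
import numpy as np

def mean_coherence(x, y, nseg=8):
    """Synchronization index: magnitude-squared coherence averaged over
    frequency, with spectra estimated from nseg signal segments."""
    n = len(x) // nseg
    X = np.fft.rfft(x[:n * nseg].reshape(nseg, n), axis=1)
    Y = np.fft.rfft(y[:n * nseg].reshape(nseg, n), axis=1)
    pxx = (np.abs(X) ** 2).mean(axis=0)          # auto-spectrum of x
    pyy = (np.abs(Y) ** 2).mean(axis=0)          # auto-spectrum of y
    pxy = (X * np.conj(Y)).mean(axis=0)          # cross-spectrum
    return float((np.abs(pxy) ** 2 / (pxx * pyy)).mean())

rng = np.random.default_rng(0)
x = np.sin(0.2 * np.arange(1024)) + 0.1 * rng.standard_normal(1024)
c_locked = mean_coherence(x, x)                          # full synchronization
c_broken = mean_coherence(x, rng.standard_normal(1024))  # synchronization lost
```

    Identical signals give an index of 1; statistically independent ones give a value near 1/nseg, so the index drops as synchronization is destroyed.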

  2. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    NASA Astrophysics Data System (ADS)

    Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A.

    2013-05-01

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses.
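    The normal-mode frequencies in such experiments can be read directly off the spectrum of an acceleration record. A minimal sketch on a synthetic record (the 1.25 Hz mode frequency and 100 Hz sampling rate are invented for illustration):

```python
import numpy as np

def dominant_freq(accel, fs):
    """Dominant oscillation frequency of an acceleration record (FFT peak)."""
    spec = np.abs(np.fft.rfft(accel - accel.mean()))  # remove DC offset first
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    return freqs[spec.argmax()]

# synthetic 20 s record of a normal mode at 1.25 Hz, sampled at 100 Hz
fs = 100.0
t = np.arange(0.0, 20.0, 1.0 / fs)
a = 0.3 * np.sin(2 * np.pi * 1.25 * t)
f = dominant_freq(a, fs)
```

    Applied separately to records of the symmetric and asymmetric modes, this recovers the two normal-mode frequencies from which the coupling strength follows.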

  3. Geospatial analysis and seasonal changes in water-equivalent hydrogen in eastern equatorial Mars

    NASA Astrophysics Data System (ADS)

    Clevy, June Renee

    2014-10-01

    This dissertation describes the relationship between hydrogen abundance, as measured through epithermal neutron counts, and the topographic, geologic, and surficial features in the equatorial region of eastern Mars. In Chapter 1, I present an alternative method for resampling the epithermal neutron count data collected by the neutron spectrometer from Mars Odyssey's Gamma Ray Spectrometer suite. Chapter 2 provides a seasonal breakdown of mean and median epithermal neutron count rates and examines areas of static, seasonal, and episodic hydrogen enrichment. Armed with new maps of mean epithermal neutron count rates and derivative maps of weight percent water-equivalent hydrogen, I examine in Chapter 3 the spatial relationships between equatorial hydrogen concentrations and satellite-measured surface properties such as elevation, its derivatives slope and aspect, albedo, dust cover, geologic units, and valley networks. The chapters in this dissertation represent a workflow from the development of the Water Equivalent Hydrogen dataset used in this research (Chapter 1), to an analysis of seasonal changes in the hydrogen signal (Chapter 2), and the relationships between these data and measurements of elevation, crustal thickness, surface composition, and geomorphology (Chapter 3). These investigations were made possible by the application of terrestrial geographic information science to planetary geology through Geographic Information Systems (GIS). Neighborhood processing allowed me to refine the spatial resolution of the epithermal neutron counts in the first chapter. Class frequency tables permitted the identification of changes over time in Chapter 2 and facilitated the identification of high- and low-variability areas. Finally, a quantitative process known as the Location Quotient, which builds upon frequency tables, was applied to identify more frequent than expected combinations of hydrogen abundance and other martian data (e.g., elevation) for the purpose of
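    The Location Quotient mentioned above compares a class's local share with its global share; values above 1 flag combinations that occur more often than expected. A minimal sketch with hypothetical frequency-table counts (the pixel numbers are invented):

```python
def location_quotient(local_count, local_total, global_count, global_total):
    """LQ = (local share of a class) / (global share of that class).
    LQ > 1 means the class is over-represented locally."""
    return (local_count / local_total) / (global_count / global_total)

# hypothetical frequency-table entries: pixels of high hydrogen abundance
# within one elevation class versus the whole mapped region
lq = location_quotient(120, 400, 900, 9000)
```

    Here the high-hydrogen class fills 30% of the elevation class but only 10% of the region, so LQ = 3: a candidate association between hydrogen abundance and that elevation band.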

  4. General quantitative model for coal liquefaction kinetics: the thermal cleavage/hydrogen donor capping mechanism. [59 references

    SciTech Connect

    Gangwer, T

    1980-01-01

    A mechanism for coal liquefaction, based on the concept of thermal cleavage/hydrogen donor capping complexes, is proposed, and the quantitative agreement between the derived rate laws and the kinetic data obtained from fifteen publications is presented. The mechanism provides rate laws which describe the preasphaltene, asphaltene, oil, and gas time/yield curves for the coal liquefaction process. A simplistic dissolution model is presented and used to relate the proposed mechanism to the experimentally observed products. Based on the quality of the mechanistic fit to the reported coal liquefaction systems, which cover a diverse range of reaction conditions, coal types, and donor solvent compositions, it is proposed that the donor solvent/thermal bond cleavage/hydrogen capping mechanism provides a good quantitative description of the rate-limiting process. Interpretation of the rate constant/temperature dependencies in terms of transition state theory indicates that formation of the activated complex can involve either physically or chemically controlled steps. A uniform free energy of activation of 52 kcal was found for the diverse liquefaction systems, indicating that a common transition state describes the reactions. The proposed mechanism thus unifies the diverse liquefaction kinetic data by using a set of uniform reaction sequences, with a common transition state, to describe the conversion chemistry. The mechanism thereby creates a common basis for intercomparison, interpretation, and evaluation of coal conversion for the broad range of processes currently being investigated in the liquefaction field.
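    Transition-state theory connects the quoted activation free energy (52 kcal, presumably per mole) to a rate constant through the Eyring equation k = (k_B T / h) exp(−ΔG‡ / RT). A small sketch at an illustrative liquefaction temperature (the 700 K value is an assumption, not from the abstract):

```python
from math import exp

KB = 1.380649e-23    # Boltzmann constant, J/K
H  = 6.62607015e-34  # Planck constant, J s
R  = 8.314462618     # gas constant, J/(mol K)

def eyring_rate(dG_kcal_per_mol, T):
    """Transition-state-theory (Eyring) rate constant for an activation
    free energy given in kcal/mol, at temperature T in kelvin."""
    dG = dG_kcal_per_mol * 4184.0        # kcal/mol -> J/mol
    return (KB * T / H) * exp(-dG / (R * T))

# the reported uniform 52 kcal barrier at an assumed 700 K process temperature
k = eyring_rate(52.0, 700.0)             # first-order rate constant, 1/s
```

    With a 52 kcal/mol barrier the rate constant is on the order of 1e-3 per second at 700 K and rises steeply with temperature, consistent with liquefaction requiring elevated temperatures.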

  5. Quantitative Motion Analysis in Two and Three Dimensions.

    PubMed

    Wessels, Deborah J; Lusche, Daniel F; Kuhl, Spencer; Scherer, Amanda; Voss, Edward; Soll, David R

    2016-01-01

    This chapter describes 2D quantitative methods for motion analysis as well as 3D motion analysis and reconstruction methods. Emphasis is placed on the analysis of dynamic cell shape changes that occur through extension and retraction of force generating structures such as pseudopodia and lamellipodia. Quantitative analysis of these structures is an underutilized tool in the field of cell migration. Our intent, therefore, is to present methods that we developed in an effort to elucidate mechanisms of basic cell motility, directed cell motion during chemotaxis, and metastasis. We hope to demonstrate how application of these methods can more clearly define alterations in motility that arise due to specific mutations or disease and hence, suggest mechanisms or pathways involved in normal cell crawling and treatment strategies in the case of disease. In addition, we present a 4D tumorigenesis model for high-resolution analysis of cancer cells from cell lines and human cancer tissue in a 3D matrix. Use of this model led to the discovery of the coalescence of cancer cell aggregates and unique cell behaviors not seen in normal cells or normal tissue. Graphic illustrations to visually display and quantify cell shape are presented along with algorithms and formulae for calculating select 2D and 3D motion analysis parameters. PMID:26498790

  6. Energy utilization and efficiency analysis for hydrogen fuel cell vehicles

    NASA Astrophysics Data System (ADS)

    Moore, R. M.; Hauer, K. H.; Ramaswamy, S.; Cunningham, J. M.

    This paper presents the results of an energy analysis for load-following versus battery-hybrid direct-hydrogen fuel cell vehicles. The analysis utilizes dynamic fuel cell vehicle simulation tools previously presented [R.M. Moore, K.H. Hauer, J. Cunningham, S. Ramaswamy, A dynamic simulation tool for the battery-hybrid hydrogen fuel cell vehicle, Fuel Cells, submitted for publication; R.M. Moore, K.H. Hauer, D.J. Friedman, J.M. Cunningham, P. Badrinarayanan, S.X. Ramaswamy, A. Eggert, A dynamic simulation tool for hydrogen fuel cell vehicles, J. Power Sources, 141 (2005) 272-285], and evaluates energy utilization and efficiency for standardized drive cycles used in the US, Europe and Japan.

  7. Quantitative evaluation on activated property-tunable bulk liquid water with reduced hydrogen bonds using deconvoluted Raman spectroscopy.

    PubMed

    Chen, Hsiao-Chien; Mai, Fu-Der; Yang, Kuang-Hsuan; Chen, Liang-Yih; Yang, Chih-Ping; Liu, Yu-Chuan

    2015-01-01

    The interesting properties of water with a distinguishable hydrogen-bonding structure at interfaces or in confined environments have drawn wide attention. However, because these unique properties are found only in the interfacial phase and in confined environments, their applications are limited. In addition, quantitative evaluation of these unique properties, which are associated with enhanced physical and chemical activities of water, represents a notable challenge. Here we report a practicable production of free-standing liquid water at room temperature with a weakly hydrogen-bonded structure, termed Au nanoparticle (NP)-treated (AuNT) water, prepared via plasmon-induced hot-electron transfer on resonantly illuminated gold NPs (AuNPs). Compared with untreated bulk water (deionized water), the prepared AuNT water exhibits many distinct activities in common physical and chemical reactions, such as high solubilities of NaCl and O2. Reducing the interaction energy between water molecules also provides a lower overpotential and higher efficiency in electrolytic hydrogen production. In addition, these enhanced catalytic activities of AuNT water are tunable by mixing with deionized water, and most of these tunable activities are linearly proportional to its degree of non-hydrogen-bonded structure (DNHBS), which is derived from the O-H stretching band in the deconvoluted Raman spectrum. PMID:25471522
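    A DNHBS-style figure of merit is the fractional area of the non-hydrogen-bonded component among the deconvoluted O-H stretching bands. A sketch with hypothetical Gaussian components (the band positions, widths, and amplitudes are invented; the paper's deconvolution parameters are not given here):

```python
import numpy as np

def gaussian(x, amp, center, width):
    """Gaussian band shape used for spectral deconvolution."""
    return amp * np.exp(-((x - center) ** 2) / (2.0 * width ** 2))

# hypothetical deconvoluted O-H stretching components (cm^-1): bands near
# 3200 and 3430 assigned to hydrogen-bonded water, 3620 to non-hydrogen-bonded
x = np.linspace(2800.0, 3900.0, 2000)
dx = x[1] - x[0]
amps = {3200.0: 1.0, 3430.0: 0.8, 3620.0: 0.5}
areas = {c: gaussian(x, a, c, 60.0).sum() * dx for c, a in amps.items()}

# degree of non-hydrogen-bonded structure: area fraction of the ~3620 band
dnhbs = areas[3620.0] / sum(areas.values())
```

    With equal band widths the fraction reduces to the amplitude ratio, here about 0.22; in practice the fitted areas of real deconvoluted spectra would be used.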

  8. ANALYSIS OF AVAILABLE HYDROGEN DATA & ACCUMULATION OF HYDROGEN IN UNVENTED TRANSURANIC (TRU) DRUMS

    SciTech Connect

    DAYLEY, L

    2004-06-24

    This document provides a response to the second action required in the approval of the Justification for Continued Operations (JCO) for Assay and Shipment of Transuranic (TRU) Waste Containers in 218-W-4C. The Waste Management Project continues to make progress toward shipping certified TRU waste to the Waste Isolation Pilot Plant (WIPP). As the existing inventory of TRU waste in the Central Waste Complex (CWC) storage buildings is shipped, and the uncovered inventory is removed from the trenches and prepared for shipment from the Hanford Site, the covered inventory of suspect TRU wastes must be retrieved and prepared for processing for shipment to WIPP. Accumulation of hydrogen in unvented TRU waste containers is a concern because of the possibility of explosive mixtures of hydrogen and oxygen, so the frequency and consequence of these gas mixtures resulting in an explosion must be addressed. The purpose of this study is to recommend an approach and schedule for venting TRU waste containers in the low-level burial ground (LLBG) trenches in conjunction with TRU Retrieval Project activities. The study provides a detailed analysis of the expected probability of hydrogen gas accumulating in significant quantities in unvented drums. Hydrogen gas accumulation in TRU drums is presented and evaluated in three categories: hydrogen concentrations below 5 vol%, between 5 and 15 vol%, and above 15 vol%. This analysis is based on complex-wide experience with TRU waste drums, available experimental data, and evaluations of storage conditions. Data reviewed in this report include experience from the Idaho National Engineering and Environmental Laboratory (INEEL), the Savannah River Site (SRS), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL), the Rocky Flats sites, the Matrix Depletion Program, and the National Transportation and Packaging Program. Based on this analysis, as well as an assessment of the probability and

  9. Quantitative multivariate analysis of dynamic multicellular morphogenic trajectories.

    PubMed

    White, Douglas E; Sylvester, Jonathan B; Levario, Thomas J; Lu, Hang; Streelman, J Todd; McDevitt, Todd C; Kemp, Melissa L

    2015-07-01

    Interrogating the fundamental cell biology principles that govern tissue morphogenesis is critical to a better understanding of developmental biology and to engineering novel multicellular systems. Recently, functional micro-tissues derived from pluripotent embryonic stem cell (ESC) aggregates have provided novel platforms for experimental investigation; however, elucidating the factors directing emergent spatial phenotypic patterns remains a significant challenge. Computational modelling techniques offer a unique complementary approach to probe mechanisms regulating morphogenic processes and provide a wealth of spatio-temporal data, but quantitative analysis of simulations and comparison to experimental data are extremely difficult. Quantitative descriptions of spatial phenomena across multiple systems and scales would enable unprecedented comparisons of computational simulations with experimental systems, thereby leveraging the inherent power of computational methods to interrogate the mechanisms governing emergent properties of multicellular biology. To address these challenges, we developed a portable pattern-recognition pipeline consisting of: the conversion of cellular images into networks, extraction of novel features via network analysis, and generation of morphogenic trajectories. This methodology enabled the quantitative description of morphogenic pattern trajectories that could be compared across diverse systems: computational modelling of multicellular structures, differentiation of stem cell aggregates, and gastrulation of cichlid fish. Moreover, this method identified novel spatio-temporal features associated with different stages of embryo gastrulation and elucidated a complex paracrine mechanism capable of explaining differences in spatio-temporal pattern kinetics in ESC aggregates of different sizes. PMID:26095427
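    The first pipeline stage, converting a cellular image into a network and extracting features from it, can be sketched with cell centroids and a distance threshold. This is a generic illustration (the centroids, radius, and features are invented, not the authors' feature set):

```python
import numpy as np

def cells_to_network(centroids, radius):
    """Connect every pair of cell centroids closer than `radius`;
    return a boolean adjacency matrix."""
    d = np.linalg.norm(centroids[:, None, :] - centroids[None, :, :], axis=2)
    return (d < radius) & (d > 0)           # exclude self-connections

def network_features(adj):
    """Simple per-image network features for a morphogenic trajectory."""
    degrees = adj.sum(axis=1)
    return {"mean_degree": float(degrees.mean()),
            "n_edges": int(adj.sum() // 2)}

# four hypothetical cell centroids on a unit square
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
feats = network_features(cells_to_network(pts, radius=1.1))
```

    Tracking such feature vectors frame by frame yields the kind of trajectory that can be compared across simulations, ESC aggregates, and embryos.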

  10. Quantitative analysis of surface electromyography: Biomarkers for convulsive seizures.

    PubMed

    Beniczky, Sándor; Conradsen, Isa; Pressler, Ronit; Wolf, Peter

    2016-08-01

    In electroencephalographic (EEG) practice, muscle activity during seizures is often considered an irritating artefact. This article discusses ways to turn it, by means of surface electromyography (EMG), into a valuable tool of epileptology. Muscles are in direct synaptic contact with motor neurons; therefore, EMG signals provide direct information about the electrical activity in the motor cortex. Qualitative analysis of EMG has traditionally been part of long-term video-EEG recordings. Recent developments in quantitative analysis of EMG signals have yielded valuable information on the pathomechanisms of convulsive seizures, demonstrating that the activity differs from maximal voluntary contraction and from convulsive psychogenic non-epileptic seizures. Furthermore, the tonic phase of generalised tonic-clonic seizures (GTCS) proved to have different quantitative features than tonic seizures. The high temporal resolution of EMG allowed detailed characterisation of the temporal dynamics of the GTCS, suggesting that the same inhibitory mechanisms that try to prevent the build-up of seizure activity contribute to ending the seizure. These findings have clinical implications: the quantitative EMG features provided the pathophysiological substrate for developing neurophysiological biomarkers that accurately identify GTCS. This proved efficient both for seizure detection and for objective, automated distinction between convulsive and non-convulsive epileptic seizures. PMID:27212115

  11. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    ERIC Educational Resources Information Center

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)

  12. Label-Free Technologies for Quantitative Multiparameter Biological Analysis

    PubMed Central

    Qavi, Abraham J.; Washburn, Adam L.; Byeon, Ji-Yeon; Bailey, Ryan C.

    2009-01-01

    In the post-genomic era, information is king and information-rich technologies are critically important drivers in both fundamental biology and medicine. It is now known that single-parameter measurements provide only limited detail and that quantitation of multiple biomolecular signatures can more fully illuminate complex biological function. Label-free technologies have recently attracted significant interest for sensitive and quantitative multiparameter analysis of biological systems. There are several different classes of label-free sensors that are currently being developed both in academia and in industry. In this critical review, we highlight, compare, and contrast some of the more promising approaches. We will describe the fundamental principles of these different methodologies and discuss advantages and disadvantages that might potentially help one in selecting the appropriate technology for a given bioanalytical application. PMID:19221722

  13. Biomechanical cell analysis using quantitative phase imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Wax, Adam; Park, Han Sang; Eldridge, William J.

    2016-03-01

    Quantitative phase imaging provides nanometer scale sensitivity and has been previously used to study spectral and temporal characteristics of individual cells in vitro, especially red blood cells. Here we extend this work to study the mechanical responses of individual cells due to the influence of external stimuli. Cell stiffness may be characterized by analyzing the inherent thermal fluctuations of cells but by applying external stimuli, additional information can be obtained. The time dependent response of cells due to external shear stress is examined with high speed quantitative phase imaging and found to exhibit characteristics that relate to their stiffness. However, analysis beyond the cellular scale also reveals internal organization of the cell and its modulation due to pathologic processes such as carcinogenesis. Further studies with microfluidic platforms point the way for using this approach in high throughput assays.

  14. Microcomputer-based digital image analysis system for quantitative autoradiography

    SciTech Connect

    Hoffman, T.J.; Volkert, W.A.; Holmes, R.A.

    1988-01-01

    A computerized image processing system utilizing an IBM-XT personal microcomputer with the capability of performing quantitative cerebral autoradiography is described. All of the system components are standard computer and optical hardware that can be easily assembled. The system has 512 horizontal by 512 vertical axis resolution with 8 bits per pixel (256 gray levels). Unlike other dedicated image processing systems, the IBM-XT permits the assembly of an efficient, low-cost image analysis system without sacrificing other capabilities of the IBM personal computer. The application of this system in both qualitative and quantitative autoradiography has been the principal factor in developing a new radiopharmaceutical to measure regional cerebral blood flow.

  15. Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells.

    PubMed

    Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2015-01-01

    Leaf pavement cells are shaped like jigsaw puzzle pieces in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cell morphogenesis and have proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed a quantitative analysis of cortical microtubule orientation in leaf pavement cells of Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules kept orientations parallel to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine also inhibited the symmetric division of guard mother cells. PMID:26039484

  16. Isotopic disproportionation during hydrogen isotopic analysis of nitrogen-bearing organic compounds

    USGS Publications Warehouse

    Nair, Sreejesh; Geilmann, Heike; Coplen, Tyler B.; Qi, Haiping; Gehre, Matthias; Schimmelmann, Arndt; Brand, Willi A.

    2015-01-01

    Rationale: High-precision hydrogen isotope ratio analysis of nitrogen-bearing organic materials using high-temperature conversion (HTC) techniques has proven troublesome in the past. Formation of reaction products other than molecular hydrogen (H2) has been suspected as a possible cause of incomplete H2 yield and hydrogen isotopic fractionation. Methods: The classical HTC reactor setup and a modified version including elemental chromium, both operated at temperatures in excess of 1400 °C, have been compared using a selection of nitrogen-bearing organic compounds, including caffeine. A focus of the experiments was to avoid or suppress hydrogen cyanide (HCN) formation and to reach quantitative H2 yields. The technique also was optimized to provide acceptable sample throughput. Results: The classical HTC reaction of a number of selected compounds exhibited H2 yields from 60 to 90 %. Yields close to 100 % were measured for the experiments with the chromium-enhanced reactor. The δ2H values also were substantially different between the two types of experiments. For the majority of the compounds studied, a highly significant relationship was observed between the amount of missing H2 and the number of nitrogen atoms in the molecules, suggesting the pyrolytic formation of HCN as a byproduct. A similar linear relationship was found between the amount of missing H2 and the observed hydrogen isotopic result, reflecting isotopic fractionation. Conclusions: The classical HTC technique to produce H2 from organic materials using high temperatures in the presence of glassy carbon is not suitable for nitrogen-bearing compounds. Adding chromium to the reaction zone improves the yield to 100 % in most cases. The initial formation of HCN is accompanied by a strong hydrogen isotope effect, with the observed hydrogen isotope results on H2 being substantially shifted to more negative δ2H values. The reaction can be understood as an initial disproportionation leading to H2 and HCN

  17. Hydrogen measurement during steam oxidation using coupled thermogravimetric analysis and quadrupole mass spectrometry

    DOE PAGES Beta

    Parkison, Adam J.; Nelson, Andrew Thomas

    2016-01-11

    An analytical technique is presented with the goal of measuring reaction kinetics during steam oxidation reactions for three cases in which obtaining kinetics information often requires a prohibitive amount of time and cost. The technique presented relies on coupling thermogravimetric analysis (TGA) with a quantitative hydrogen measurement technique using quadrupole mass spectrometry (QMS). The first case considered is in differentiating between the kinetics of steam oxidation reactions and those for simultaneously reacting gaseous impurities such as nitrogen or oxygen. The second case allows one to independently measure the kinetics of oxide and hydride formation for systems in which both of these reactions are known to take place during steam oxidation. The third case deals with measuring the kinetics of formation for competing volatile and non-volatile oxides during certain steam oxidation reactions. In order to meet the requirements of the coupled technique, a methodology is presented which attempts to provide quantitative measurement of hydrogen generation using QMS in the presence of an interfering fragmentation species, namely water vapor. This is achieved such that all calibrations and corrections are performed during the TGA baseline and steam oxidation programs, making system operation virtually identical to standard TGA. Benchmarking results showed a relative error in hydrogen measurement of 5.7–8.4% following the application of a correction factor. Lastly, suggestions are made for possible improvements to the presented technique so that it may be better applied to the three cases presented.
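The interference correction described in this abstract can be sketched in a few lines. This is a minimal illustration of the idea only: water vapor fragments in the QMS ion source and contributes to the m/z = 2 signal, so a fraction of the m/z = 18 (H2O) signal, calibrated during the dry baseline program, is subtracted before converting to a hydrogen generation rate. The fragmentation ratio and calibration factor below are invented assumptions, not values from the paper.

```python
# Illustrative constants (assumed, not from the study):
frag_ratio = 0.012   # (m/z 2)/(m/z 18) signal ratio for pure steam, from the baseline run
cal_factor = 2.4e-9  # mol H2 per unit of corrected m/z 2 signal, from a calibration gas

def hydrogen_rate(sig_mz2, sig_mz18):
    """Return the H2 generation rate implied by the water-corrected m/z = 2 signal."""
    corrected = sig_mz2 - frag_ratio * sig_mz18  # remove the H2O fragmentation contribution
    return cal_factor * corrected

rate = hydrogen_rate(sig_mz2=5.0e4, sig_mz18=2.0e6)  # mol/s for one illustrative reading
```

Because both the fragmentation ratio and the calibration factor are measured during the baseline and calibration programs, applying the correction adds nothing to the steam oxidation run itself, consistent with the authors' goal of keeping operation identical to standard TGA.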

  18. Quantitative analysis of astrogliosis in drug-dependent humans.

    PubMed

    Weber, Marco; Scherf, Nico; Kahl, Thomas; Braumann, Ulf-Dietrich; Scheibe, Patrick; Kuska, Jens-Peer; Bayer, Ronny; Büttner, Andreas; Franke, Heike

    2013-03-15

    Drug addiction is a chronic, relapsing disease caused by neurochemical and molecular changes in the brain. In this human autopsy study, qualitative and quantitative changes of glial fibrillary acidic protein (GFAP)-positive astrocytes in the hippocampus of 26 lethally intoxicated drug addicts and 35 matched controls are described. The morphological characterization of these cells reflected alterations representative of astrogliosis. Neither quantification of GFAP-positive cells nor Western blot analysis indicated statistically significant differences between drug fatalities and controls. By semi-quantitative scoring, however, a significant shift towards higher numbers of activated astrocytes in the drug group was detected. To assess morphological changes quantitatively, graph-based representations of astrocyte morphology were obtained from single-cell images captured by confocal laser scanning microscopy. Their underlying structures were used to quantify changes in astroglial fibers in an automated fashion. This morphometric analysis yielded significant differences between the investigated groups for four different measures of fiber characteristics (Euclidean distance, graph distance, number of graph elements, fiber skeleton distance), indicating that, e.g., astrocytes in drug addicts on average exhibit significant elongation of fiber structures as well as a two-fold increase in GFAP-positive fibers as compared with those in controls. In conclusion, the present data show characteristic differences in the morphology of hippocampal astrocytes in drug addicts versus controls and further support the involvement of astrocytes in the human pathophysiology of drug addiction. The automated quantification of astrocyte morphologies provides a novel, testable way to assess fiber structures in a quantitative manner as opposed to standard, qualitative descriptions. PMID:23337617

  19. Bayesian Shrinkage Analysis of Quantitative Trait Loci for Dynamic Traits

    PubMed Central

    Yang, Runqing; Xu, Shizhong

    2007-01-01

    Many quantitative traits are measured repeatedly during the life of an organism. Such traits are called dynamic traits. The pattern of the changes of a dynamic trait is called the growth trajectory. Studying the growth trajectory may enhance our understanding of its genetic architecture. Recently, we developed an interval-mapping procedure to map QTL for dynamic traits under the maximum-likelihood framework. We fit the growth trajectory by Legendre polynomials. The method was intended to map one QTL at a time, and the entire QTL analysis involved scanning the entire genome by fitting multiple single-QTL models. In this study, we propose a Bayesian shrinkage analysis for estimating and mapping multiple QTL in a single model. The method combines the shrinkage mapping for individual quantitative traits with the Legendre polynomial analysis for dynamic traits. The multiple-QTL model is implemented in two ways: (1) a fixed-interval approach, where a QTL is placed in each marker interval, and (2) a moving-interval approach, where the position of a QTL can be searched in a range that covers many marker intervals. Simulation study shows that the Bayesian shrinkage method generates much better signals for QTL than the interval-mapping approach. We propose several alternative methods to present the results of the Bayesian shrinkage analysis. In particular, we found that the Wald test-statistic profile can serve as a mechanism to test the significance of a putative QTL. PMID:17435239
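The Legendre-polynomial fit of a growth trajectory mentioned in this abstract can be sketched with NumPy's `legendre` module. The measurement times and coefficients below are invented for illustration; ages are rescaled to [-1, 1], the natural domain of the Legendre basis.

```python
import numpy as np
from numpy.polynomial import legendre

# Hypothetical trajectory: a trait measured at 8 ages, ages rescaled to [-1, 1]
t = np.linspace(-1.0, 1.0, 8)
true_coef = np.array([10.0, 3.0, 1.5])   # assumed series in the Legendre basis
y = legendre.legval(t, true_coef)        # noise-free measurements for clarity

coef = legendre.legfit(t, y, deg=2)      # least-squares fit of the growth trajectory
y_fit = legendre.legval(t, coef)         # smoothed trajectory at the observation ages
```

In a QTL analysis of this kind, the fitted Legendre coefficients (rather than the raw repeated measurements) become the quantities modeled as functions of genotype.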

  20. U.S. Department of Energy Hydrogen Storage Cost Analysis

    SciTech Connect

    Law, Karen; Rosenfeld, Jeffrey; Han, Vickie; Chan, Michael; Chiang, Helena; Leonard, Jon

    2013-03-11

    The overall objective of this project is to conduct cost analyses and estimate costs for on- and off-board hydrogen storage technologies under development by the U.S. Department of Energy (DOE) on a consistent, independent basis. This can help guide DOE and stakeholders toward the most promising research, development and commercialization pathways for hydrogen-fueled vehicles. A specific focus of the project is to estimate hydrogen storage system cost in high-volume production scenarios relative to the DOE target that was in place when this cost analysis was initiated. This report and its results reflect work conducted by TIAX between 2004 and 2012, including recent refinements and updates. The report provides a system-level evaluation of costs and performance for four broad categories of on-board hydrogen storage: (1) reversible on-board metal hydrides (e.g., magnesium hydride, sodium alanate); (2) regenerable off-board chemical hydrogen storage materials (e.g., hydrolysis of sodium borohydride, ammonia borane); (3) high surface area sorbents (e.g., carbon-based materials); and (4) advanced physical storage (e.g., 700-bar compressed, cryo-compressed and liquid hydrogen). Additionally, the off-board efficiency and processing costs of several hydrogen storage systems were evaluated and reported, including: (1) liquid carrier, (2) sodium borohydride, (3) ammonia borane, and (4) magnesium hydride. TIAX applied a bottom-up costing methodology customized to analyze and quantify the processes used in the manufacture of hydrogen storage systems. This methodology, used in conjunction with ® software and other tools, developed costs for all major tank components, balance-of-tank, tank assembly, and system assembly. Based on this methodology, the figure below shows the projected on-board high-volume factory costs of the various analyzed hydrogen storage systems, as designed. Reductions in the key cost drivers may bring hydrogen storage system costs closer to this DOE target

  1. Analysis of hydrogen plasma in MPCVD reactor

    NASA Astrophysics Data System (ADS)

    Shivkumar, Gayathri

    The aim of this work is to build a numerical model that can predict the plasma properties of hydrogen plasmas inside a Seki Technotron Corp. AX5200S MPCVD system so that it may be used to understand and optimize the conditions for the growth of carbon nanostructures. A 2D model of the system is used in the finite element high-frequency Maxwell solver and heat transfer solver in COMSOL Multiphysics, where the solvers are coupled with user-defined functions (UDFs) to analyze the plasma. A simplified chemistry model is formulated in order to determine the electron temperature in the plasma. This is used in the UDFs, which calculate the electron number density as well as the electron temperature. A Boltzmann equation solver for electrons in weakly ionized gases under uniform electric fields, called BOLSIG+, is used to obtain certain input parameters required for these UDFs. The system is modeled for several reactor geometries at pressures of 10 Torr and 30 Torr and powers ranging from 300 W to 700 W. The variation of plasma characteristics with changes in input conditions is studied, and the electric field, electron number density, electron temperature and gas temperature are seen to increase with increasing power. Electric field, electron number density and electron temperature decrease and gas temperature increases with increasing pressure. The modeling results are compared with experimental measurements and a good agreement is found after calibrating the parameter gamma in Funer's model to match experimental electron number densities. The gas temperature is seen to have a weak dependence on power and a strong dependence on gas pressure. On average, the gas temperature at a point 5 mm above the center of the puck increases from about 1000 K at a pressure of 10 Torr to about 1500 K at 30 Torr. The inclusion of the pillar produces an increase in the maximum electron number density of approximately 50%; it is higher under some conditions. It increases the maximum electron

  2. Hydrogen sensor

    DOEpatents

    Duan, Yixiang; Jia, Quanxi; Cao, Wenqing

    2010-11-23

    A hydrogen sensor for detecting/quantitating hydrogen and hydrogen isotopes includes a sampling line and a microplasma generator that excites hydrogen from a gas sample and produces light emission from excited hydrogen. A power supply provides power to the microplasma generator, and a spectrometer generates an emission spectrum from the light emission. A programmable computer is adapted for determining whether or not the gas sample includes hydrogen, and for quantitating the amount of hydrogen and/or hydrogen isotopes present in the gas sample.
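The detection step of such a sensor can be sketched as a peak test against the Balmer-alpha emission line of atomic hydrogen at 656.28 nm (a known spectroscopic wavelength). Everything else here, including the tolerance, signal-to-noise threshold, and the synthetic spectrum, is an illustrative assumption, not the patent's algorithm.

```python
import numpy as np

H_ALPHA_NM = 656.28  # Balmer-alpha emission line of atomic hydrogen

def hydrogen_present(wavelengths, intensities, tol_nm=1.0, snr=5.0):
    """Flag hydrogen if a peak near H-alpha rises snr * noise above the baseline."""
    baseline = np.median(intensities)
    off_line = np.abs(wavelengths - H_ALPHA_NM) > 10 * tol_nm   # line-free region
    noise = np.std(intensities[off_line])
    window = np.abs(wavelengths - H_ALPHA_NM) <= tol_nm
    return bool(intensities[window].max() - baseline > snr * noise)

# Synthetic emission spectrum: flat baseline, weak noise, Gaussian line at H-alpha
wl = np.linspace(600.0, 700.0, 2001)
rng = np.random.default_rng(0)
spectrum = 1.0 + 0.02 * rng.standard_normal(wl.size)
spectrum += 5.0 * np.exp(-0.5 * ((wl - H_ALPHA_NM) / 0.3) ** 2)
```

Isotope discrimination would use the same logic at the slightly shifted deuterium line; quantitation would calibrate the line intensity against known concentrations.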

  3. Binary Imaging Analysis for Comprehensive Quantitative Assessment of Peripheral Nerve

    PubMed Central

    Hunter, Daniel A.; Moradzadeh, Arash; Whitlock, Elizabeth L.; Brenner, Michael J.; Myckatyn, Terence M.; Wei, Cindy H.; Tung, Thomas H.H.; Mackinnon, Susan E.

    2007-01-01

    Quantitative histomorphometry is the current gold standard for objective measurement of nerve architecture and its components. Many methods still in use rely heavily upon manual techniques that are prohibitively time consuming, predisposing to operator fatigue, sampling error, and overall limited reproducibility. More recently, investigators have attempted to combine the speed of automated morphometry with the accuracy of manual and semi-automated methods. Systematic refinements in binary imaging analysis techniques combined with an algorithmic approach allow for more exhaustive characterization of nerve parameters in the surgically relevant injury paradigms of regeneration following crush, transection, and nerve gap injuries. The binary imaging method introduced here uses multiple bitplanes to achieve reproducible, high throughput quantitative assessment of peripheral nerve. Number of myelinated axons, myelinated fiber diameter, myelin thickness, fiber distributions, myelinated fiber density, and neural debris can be quantitatively evaluated with stratification of raw data by nerve component. Results of this semi-automated method are validated by comparing values against those obtained with manual techniques. The use of this approach results in more rapid, accurate, and complete assessment of myelinated axons than manual techniques. PMID:17675163

  4. A quantitative analysis of IRAS maps of molecular clouds

    NASA Technical Reports Server (NTRS)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-01-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In the present study we use the IRAS continuum maps at 100 and 60 μm to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.
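The pseudometric idea in this abstract can be sketched concretely: compare two maps through a scalar "output" functional, so the distance is the absolute difference of the functional's values. The functional below (fraction of pixels above twice the median density) is an invented stand-in for the paper's density, topology, self-gravity, and filamentarity measures.

```python
import numpy as np

def output_distance(map_a, map_b, functional):
    """Distance between two maps as |F(A) - F(B)| for a scalar 'output'
    functional F. Symmetry and the triangle inequality hold, but distinct
    maps with equal F have distance 0 -- a pseudometric, not a metric."""
    return abs(functional(map_a) - functional(map_b))

# Illustrative functional (not one from the paper): a crude probe of the
# density distribution's high-density tail.
def dense_fraction(m):
    return float(np.mean(m > 2.0 * np.median(m)))

rng = np.random.default_rng(1)
smooth = rng.lognormal(sigma=0.3, size=(64, 64))   # narrow density distribution
clumpy = rng.lognormal(sigma=2.0, size=(64, 64))   # broad, more "complex" distribution
d = output_distance(smooth, clumpy, dense_fraction)
```

Ordering a set of maps by such distances from a reference map is what yields the quantitative "complexity" ranking of the clouds.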

  5. A quantitative analysis of IRAS maps of molecular clouds

    NASA Astrophysics Data System (ADS)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-11-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In the present study we use the IRAS continuum maps at 100 and 60 μm to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.

  6. Quantitative option analysis for implementation and management of landfills.

    PubMed

    Kerestecioğlu, Merih

    2016-09-01

    The selection of the most feasible strategy for implementation of landfills is a challenging step. Potential implementation options of landfills cover a wide range, from conventional construction contracts to concessions. Montenegro, seeking to improve the efficiency of the public services while maintaining affordability, was considering privatisation as a way to reduce public spending on service provision. In this study, to determine the most feasible model for construction and operation of a regional landfill, a quantitative risk analysis was implemented with five steps: (i) development of a global risk matrix; (ii) assignment of qualitative probabilities of occurrences and magnitude of impacts; (iii) determination of the risks to be mitigated, monitored, controlled or ignored; (iv) reduction of the main risk elements; and (v) incorporation of quantitative estimates of probability of occurrence and expected impact for each risk element in the reduced risk matrix. The evaluated scenarios were: (i) construction and operation of the regional landfill by the public sector; (ii) construction and operation of the landfill by private sector and transfer of the ownership to the public sector after a pre-defined period; and (iii) operation of the landfill by the private sector, without ownership. The quantitative risk assessment concluded that introduction of a public private partnership is not the most feasible option, unlike the common belief in several public institutions in developing countries. A management contract for the first years of operation was advised to be implemented, after which, a long term operating contract may follow. PMID:27354014
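The final scoring step of such a risk analysis, assigning each risk element an expected loss (probability times impact) and mapping it to a handling decision, can be sketched as follows. The risk names, probabilities, impacts, and thresholds are invented for illustration; they are not values from the study.

```python
# Hypothetical reduced risk matrix (all values assumed, impact in M EUR)
risks = {
    "tariff_collection_shortfall": {"probability": 0.4, "impact": 5.0},
    "construction_delay":          {"probability": 0.2, "impact": 2.0},
    "regulatory_change":           {"probability": 0.1, "impact": 0.5},
}

def classify(expected_loss, mitigate_above=1.0, monitor_above=0.2):
    """Map an expected loss (probability x impact) to a handling decision."""
    if expected_loss >= mitigate_above:
        return "mitigate"
    return "monitor" if expected_loss >= monitor_above else "ignore"

scored = {
    name: {"expected_loss": r["probability"] * r["impact"],
           "action": classify(r["probability"] * r["impact"])}
    for name, r in risks.items()
}
```

Running the same scoring under each ownership scenario, with scenario-specific probabilities and impacts, is what allows the scenarios to be compared on a common quantitative footing.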

  7. Facegram - Objective quantitative analysis in facial reconstructive surgery.

    PubMed

    Gerós, Ana; Horta, Ricardo; Aguiar, Paulo

    2016-06-01

    Evaluation of effectiveness in reconstructive plastic surgery has become an increasingly important asset in comparing and choosing the most suitable medical procedure to handle facial disfigurement. Unfortunately, traditional methods to assess the results of surgical interventions are mostly qualitative and lack information about movement dynamics. Along with this, the few existing methodologies tailored to objectively quantify surgery results are not practical in the medical field due to constraints in terms of cost, complexity and poor suitability to the clinical environment. These limitations enforce an urgent need for the creation of a new system to quantify facial movement and allow for easy interpretation by medical experts. With this in mind, we present here a novel method capable of quantitatively and objectively assessing complex facial movements, using a set of morphological, static and dynamic measurements. For this purpose, RGB-D cameras are used to acquire both color and depth images, and a modified block matching algorithm, combining depth and color information, was developed to track the position of anatomical landmarks of interest. The algorithms are integrated into a user-friendly graphical interface and the analysis outcomes are organized into an innovative medical tool, named facegram. This system was developed in close collaboration with plastic surgeons and the methods were validated using control subjects and patients with facial paralysis. The system was shown to provide useful and detailed quantitative information (static and dynamic), making it an appropriate solution for objective quantitative characterization of facial movement in a clinical environment. PMID:26994664

  8. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
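The pseudo-predator construction at the core of this abstract can be sketched generically: bootstrap-resample prey signatures in proportions fixed by a known diet, pool them, and average. This is a simplified illustration of the idea, not the paper's calibrated algorithm, and the species names and data are invented.

```python
import numpy as np

def pseudo_predator(prey_sigs, diet, n_boot, rng):
    """Build one pseudo-predator fatty acid signature with a known diet.

    prey_sigs: dict species -> (n_samples, n_fatty_acids) proportion arrays
    diet:      dict species -> diet proportion, summing to 1
    n_boot:    total bootstrap sample size, split across species by diet
    """
    parts = []
    for species, prop in diet.items():
        sigs = prey_sigs[species]
        k = max(1, round(prop * n_boot))
        idx = rng.integers(0, len(sigs), size=k)   # bootstrap: sample with replacement
        parts.append(sigs[idx])
    sig = np.vstack(parts).mean(axis=0)            # pooled mean signature
    return sig / sig.sum()                         # renormalize to proportions

rng = np.random.default_rng(7)
prey = {"seal": rng.dirichlet(np.ones(8), size=50),
        "fish": rng.dirichlet(np.ones(8), size=50)}
sig = pseudo_predator(prey, {"seal": 0.7, "fish": 0.3}, n_boot=30, rng=rng)
```

The paper's contribution is choosing `n_boot` objectively so that the simulated signatures have realistic variability; here it is simply a fixed input.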

  9. Lipid biomarker analysis for the quantitative analysis of airborne microorganisms

    SciTech Connect

    Macnaughton, S.J.; Jenkins, T.L.; Cormier, M.R.

    1997-08-01

    There is an ever increasing concern regarding the presence of airborne microbial contaminants within indoor air environments. Exposure to such biocontaminants can give rise to a large number of different health effects including infectious diseases, allergenic responses and respiratory problems. Biocontaminants typically found in indoor air environments include bacteria, fungi, algae, protozoa and dust mites. Mycotoxins, endotoxins, pollens and residues of organisms are also known to cause adverse health effects. A quantitative detection/identification technique independent of culturability that assays both culturable and non-culturable biomass including endotoxin is critical in defining risks from indoor air biocontamination. Traditionally, methods employed for the monitoring of microorganism numbers in indoor air environments involve classical culture-based techniques and/or direct microscopic counting. It has been repeatedly documented that viable microorganism counts only account for between 0.1–10% of the total community detectable by direct counting. The classic viable microbiologic approach does not provide accurate estimates of microbial fragments or other indoor air components that can act as antigens and induce or potentiate allergic responses. Although bioaerosol samplers are designed to damage the microbes as little as possible, microbial stress has been shown to result from air sampling, aerosolization and microbial collection. Higher collection efficiency results in greater cell damage, while less cell damage often results in lower collection efficiency. Filtration can collect particulates at almost 100% efficiency, but captured microorganisms may become dehydrated and damaged resulting in non-culturability; however, the lipid biomarker assays described herein do not rely on cell culture. Lipids are components that are universally distributed throughout cells, providing a means of assessment independent of culturability.

  10. Quantitative Northern Blot Analysis of Mammalian rRNA Processing.

    PubMed

    Wang, Minshi; Pestov, Dimitri G

    2016-01-01

    Assembly of eukaryotic ribosomes is an elaborate biosynthetic process that begins in the nucleolus and requires hundreds of cellular factors. Analysis of rRNA processing has been instrumental for studying the mechanisms of ribosome biogenesis and effects of stress conditions on the molecular milieu of the nucleolus. Here, we describe the quantitative analysis of the steady-state levels of rRNA precursors, applicable to studies in mammalian cells and other organisms. We include protocols for gel electrophoresis and northern blotting of rRNA precursors using procedures optimized for the large size of these RNAs. We also describe the ratio analysis of multiple precursors, a technique that facilitates the accurate assessment of changes in the efficiency of individual pre-rRNA processing steps. PMID:27576717

  11. Hydrogen Scenario Analysis Summary Report: Analysis of the Transition to Hydrogen Fuel Cell Vehicles and the Potential Hydrogen Energy Infrastructure Requirements

    SciTech Connect

    Greene, David L; Leiby, Paul Newsome; James, Brian; Perez, Julie; Melendez, Margo; Milbrandt, Anelia; Unnasch, Stefan; Rutherford, Daniel; Hooks, Matthew

    2008-03-01

    Infrastructure Technologies Program (HFCIT) has supported a series of analyses to evaluate alternative scenarios for deployment of millions of hydrogen fueled vehicles and supporting infrastructure. To ensure that these alternative market penetration scenarios took into consideration the thinking of the automobile manufacturers, energy companies, industrial hydrogen suppliers, and others from the private sector, DOE held several stakeholder meetings to explain the analyses, describe the models, and solicit comments about the methods, assumptions, and preliminary results (U.S. DOE, 2006a). The first stakeholder meeting was held on January 26, 2006, to solicit guidance during the initial phases of the analysis; this was followed by a second meeting on August 9-10, 2006, to review the preliminary results. A third and final meeting was held on January 31, 2007, to discuss the final analysis results. More than 60 hydrogen energy experts from industry, government, national laboratories, and universities attended these meetings and provided their comments to help guide DOE's analysis. The final scenarios attempt to reflect the collective judgment of the participants in these meetings. However, they should not be interpreted as having been explicitly endorsed by DOE or any of the stakeholders participating. The DOE analysis examined three vehicle penetration scenarios: Scenario 1--Production of thousands of vehicles per year by 2015 and hundreds of thousands per year by 2019. This option is expected to lead to a market penetration of 2.0 million fuel cell vehicles (FCV) by 2025. Scenario 2--Production of thousands of FCVs by 2013 and hundreds of thousands by 2018. This option is expected to lead to a market penetration of 5.0 million FCVs by 2025. Scenario 3--Production of thousands of FCVs by 2013, hundreds of thousands by 2018, and millions by 2021 such that market penetration is 10 million by 2025. 
Scenario 3 was formulated to comply with the NAS recommendation: 'DOE should map out

  12. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    PubMed Central

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in aspects of quantitative assays as well as qualitative profiling of glycoproteins. Because it has been widely recognized that aberrant glycosylation in a glycoprotein may be involved in the progression of a certain disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with the protein glycosylation-targeting enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recent published studies. © 2014 The Authors. Mass Spectrometry Reviews 34:148–165, 2015. Published by Wiley Periodicals, Inc. PMID:24889823

  13. Quantitative Analysis Of Cristobalite In The Presence Of Quartz

    NASA Astrophysics Data System (ADS)

    Totten, Gary A.

    1985-12-01

    The detection and quantitation of cristobalite in quartz is necessary to calculate threshold limit values (TLV) for free crystalline silica (FCS) as proposed by the American Conference of Governmental Industrial Hygienists (ACGIH). The cristobalite standard used in this study was made by heating diatomaceous earth to the transition temperature for cristobalite. The potassium bromide (KBr) pellet method was used for the analysis. Potassium cyanide (KCN) was used as an internal standard. Samples ranged from 5% to 30% cristobalite in quartz. Precision for this method is within 2%.
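The internal-standard quantitation used in work like this can be sketched as a linear calibration of the analyte-to-standard band ratio. All numbers below are invented for illustration, not measured values from the study: standards of known cristobalite content define the line, and the line is inverted for an unknown.

```python
import numpy as np

# Hypothetical calibration standards: % cristobalite vs. measured band ratio
pct_cristobalite = np.array([5.0, 10.0, 20.0, 30.0])   # standards, % in quartz
band_ratio = np.array([0.11, 0.21, 0.40, 0.61])        # A_cristobalite / A_KCN

slope, intercept = np.polyfit(pct_cristobalite, band_ratio, 1)  # linear calibration

def percent_from_ratio(measured_ratio):
    """Invert the linear calibration for an unknown sample."""
    return (measured_ratio - intercept) / slope

estimate = percent_from_ratio(0.30)   # unknown sample's band ratio
```

Ratioing against the KCN band cancels pellet-to-pellet variations in mass and path length, which is the reason an internal standard is used at all.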

  14. Quantitative proteomic analysis of drug-induced changes in mycobacteria.

    PubMed

    Hughes, Minerva A; Silva, Jeffrey C; Geromanos, Scott J; Townsend, Craig A

    2006-01-01

    A new approach for qualitative and quantitative proteomic analysis using capillary liquid chromatography and mass spectrometry to study the protein expression response in mycobacteria following isoniazid treatment is discussed. In keeping with known effects on the fatty acid synthase II pathway, proteins encoded by the kas operon (AcpM, KasA, KasB, Accd6) were significantly overexpressed, as were those involved in iron metabolism and cell division suggesting a complex interplay of metabolic events leading to cell death. PMID:16396495

  15. Functional Regression Models for Epistasis Analysis of Multiple Quantitative Traits

    PubMed Central

    Xie, Dan; Liang, Meimei; Xiong, Momiao

    2016-01-01

    To date, most genetic analyses of phenotypes have focused on analyzing single traits or analyzing each phenotype independently. However, joint epistasis analysis of multiple complementary traits will increase statistical power and improve our understanding of the complicated genetic structure of the complex diseases. Despite their importance in uncovering the genetic structure of complex traits, the statistical methods for identifying epistasis in multiple phenotypes remain fundamentally unexplored. To fill this gap, we formulate a test for interaction between two genes in multiple quantitative trait analysis as a multiple functional regression (MFRG) in which the genotype functions (genetic variant profiles) are defined as a function of the genomic position of the genetic variants. We use large-scale simulations to calculate Type I error rates for testing interaction between two genes with multiple phenotypes and to compare the power with multivariate pairwise interaction analysis and single trait interaction analysis by a single-variate functional regression model. To further evaluate performance, the MFRG for epistasis analysis is applied to five phenotypes of exome sequence data from the NHLBI’s Exome Sequencing Project (ESP) to detect pleiotropic epistasis. A total of 267 pairs of genes that formed a genetic interaction network showed significant evidence of epistasis influencing five traits. The results demonstrate that the joint interaction analysis of multiple phenotypes has a much higher power to detect interaction than the interaction analysis of a single trait and may open a new direction to fully uncovering the genetic structure of multiple phenotypes. PMID:27104857

  16. Functional Regression Models for Epistasis Analysis of Multiple Quantitative Traits.

    PubMed

    Zhang, Futao; Xie, Dan; Liang, Meimei; Xiong, Momiao

    2016-04-01

    To date, most genetic analyses of phenotypes have focused on analyzing single traits or analyzing each phenotype independently. However, joint epistasis analysis of multiple complementary traits will increase statistical power and improve our understanding of the complicated genetic structure of the complex diseases. Despite their importance in uncovering the genetic structure of complex traits, the statistical methods for identifying epistasis in multiple phenotypes remain fundamentally unexplored. To fill this gap, we formulate a test for interaction between two genes in multiple quantitative trait analysis as a multiple functional regression (MFRG) in which the genotype functions (genetic variant profiles) are defined as a function of the genomic position of the genetic variants. We use large-scale simulations to calculate Type I error rates for testing interaction between two genes with multiple phenotypes and to compare the power with multivariate pairwise interaction analysis and single trait interaction analysis by a single-variate functional regression model. To further evaluate performance, the MFRG for epistasis analysis is applied to five phenotypes of exome sequence data from the NHLBI's Exome Sequencing Project (ESP) to detect pleiotropic epistasis. A total of 267 pairs of genes that formed a genetic interaction network showed significant evidence of epistasis influencing five traits. The results demonstrate that the joint interaction analysis of multiple phenotypes has a much higher power to detect interaction than the interaction analysis of a single trait and may open a new direction to fully uncovering the genetic structure of multiple phenotypes. PMID:27104857

  17. Systems Analysis of the Hydrogen Transition with HyTrans

    SciTech Connect

    Leiby, Paul Newsome; Greene, David L; Bowman, David Charles; Tworek, Elzbieta

    2007-01-01

    The U.S. Federal government is carefully considering the merits and long-term prospects of hydrogen-fueled vehicles. NAS (1) has called for the careful application of systems analysis tools to structure the complex assessment required. Others, raising cautionary notes, question whether a consistent and plausible transition to hydrogen light-duty vehicles can be identified (2) and whether that transition would, on balance, be environmentally preferred. Modeling the market transition to hydrogen-powered vehicles is an inherently complex process, encompassing hydrogen production, delivery and retailing, vehicle manufacturing, and vehicle choice and use. We describe the integration of key technological and market factors in a dynamic transition model, HyTrans. The usefulness of HyTrans and its predictions depends on three key factors: (1) the validity of the economic theories that underpin the model, (2) the authenticity with which the key processes are represented, and (3) the accuracy of specific parameter values used in the process representations. This paper summarizes the theoretical basis of HyTrans, and highlights the implications of key parameter specifications with sensitivity analysis.

  18. Multivariate calibration applied to the quantitative analysis of infrared spectra

    NASA Astrophysics Data System (ADS)

    Haaland, David M.

    1992-03-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the noninvasive determination of glucose levels in diabetics is an ultimate goal of this research.
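
    The principal component regression (PCR) approach named above can be sketched on synthetic spectra. The two Gaussian "pure spectra", noise level, and component count below are illustrative assumptions, not values from the study; the structure (mean-center, SVD, regress concentrations on leading scores) is the standard PCR recipe.

```python
import numpy as np

rng = np.random.default_rng(1)
n_cal, n_wn, n_comp = 40, 200, 3     # calibration samples, wavenumbers, PCs

# Hypothetical 2-analyte system: each concentration scales a Gaussian band.
wn = np.linspace(0, 1, n_wn)
pure = np.stack([np.exp(-((wn - c) / 0.05) ** 2) for c in (0.3, 0.7)])
C = rng.uniform(0, 1, (n_cal, 2))    # known concentrations
A = C @ pure + 0.01 * rng.normal(size=(n_cal, n_wn))  # Beer's-law mixing

# PCR: regress concentrations on scores of the leading principal components.
A_mean, C_mean = A.mean(0), C.mean(0)
U, s, Vt = np.linalg.svd(A - A_mean, full_matrices=False)
T = U[:, :n_comp] * s[:n_comp]       # calibration scores
B, *_ = np.linalg.lstsq(T, C - C_mean, rcond=None)

def predict(spectra):
    """Project new spectra onto the retained loadings and apply the fit."""
    scores = (spectra - A_mean) @ Vt[:n_comp].T
    return scores @ B + C_mean

C_test = np.array([[0.2, 0.6]])
A_test = C_test @ pure + 0.01 * rng.normal(size=(1, n_wn))
print(np.round(predict(A_test), 2))
```

    Truncating to a few components is what gives PCR (and PLS) its noise rejection: noise directions with small singular values are simply discarded rather than inverted.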

  19. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    SciTech Connect

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-08-14

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. Finally, the quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  20. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    DOE PAGES Beta

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-08-14

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. Finally, the quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  1. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    PubMed Central

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-01-01

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations. PMID:26306198

  2. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    NASA Astrophysics Data System (ADS)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: infrared detection (IR), video-oculography (VOG), scleral eye coil, and EOG. Among these recording techniques, EOG is a major source for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods that capture the characteristics of eye movement were proposed to accurately categorize the state of neurodegenerative subjects. EOG recordings were taken while 5 subjects watched a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including smooth vertical pursuit (SVP), smooth horizontal pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of pathology in these disorders.

  3. Multivariate calibration applied to the quantitative analysis of infrared spectra

    SciTech Connect

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.

  4. Segmentation and quantitative analysis of individual cells in developmental tissues.

    PubMed

    Nandy, Kaustav; Kim, Jusub; McCullough, Dean P; McAuliffe, Matthew; Meaburn, Karen J; Yamaguchi, Terry P; Gudla, Prabhakar R; Lockett, Stephen J

    2014-01-01

    Image analysis is vital for extracting quantitative information from biological images and is used extensively, including investigations in developmental biology. The technique commences with the segmentation (delineation) of objects of interest from 2D images or 3D image stacks and is usually followed by the measurement and classification of the segmented objects. This chapter focuses on the segmentation task and here we explain the use of ImageJ, MIPAV (Medical Image Processing, Analysis, and Visualization), and VisSeg, three freely available software packages for this purpose. ImageJ and MIPAV are extremely versatile and can be used in diverse applications. VisSeg is a specialized tool for performing highly accurate and reliable 2D and 3D segmentation of objects such as cells and cell nuclei in images and stacks. PMID:24318825

  5. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  6. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics

    PubMed Central

    Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999–2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  7. A method for quantitative wet chemical analysis of urinary calculi.

    PubMed

    Larsson, L; Sörbo, B; Tiselius, H G; Ohman, S

    1984-06-27

    We describe a simple method for quantitative chemical analysis of urinary calculi requiring no specialized equipment. Pulverized calculi are dried over silica gel at room temperature and dissolved in nitric acid, which was the only effective agent for complete dissolution. Calcium, magnesium, ammonium, and phosphate are then determined by conventional methods. Oxalate is determined by a method based on the quenching action of oxalate on the fluorescence of a zirconium-flavonol complex. Uric acid, when treated with nitric acid, is stoichiometrically converted to alloxan, which is determined fluorimetrically with 1,2-phenylenediamine. Similarly, cystine is oxidized by nitric acid to sulfate, which is determined turbidimetrically as barium sulfate. Protein is determined spectrophotometrically as xanthoprotein. The total mass recovery of authentic calculi was 92.2 +/- 6.7 (SD) per cent. The method permits analysis of calculi as small as 1.0 mg. Internal quality control is performed with specially designed control samples. PMID:6086179

  8. [Quantitative analysis of transformer oil dissolved gases using FTIR].

    PubMed

    Zhao, An-xin; Tang, Xiao-jun; Wang, Er-zhen; Zhang, Zhong-hua; Liu, Jun-hua

    2013-09-01

    Chromatography for online monitoring of transformer dissolved gases has the drawbacks of requiring a carrier gas and regular calibration, and of low safety, so we attempted to establish a dissolved gas analysis system based on Fourier transform infrared spectroscopy. Taking into account the small amounts of the characteristic gases, the many components, the detection-limit and safety requirements, and the difficulty for the degasser of excluding interference gases, a quantitative analysis model was established based on sparse partial least squares, piecewise section correction, and feature variable extraction using improved TR regularization. For the characteristic gases CH4, C2H6, C2H6, and CO2, the results show that FTIR meets DGA requirements with a spectral wavenumber resolution of 1 cm(-1) and an optical path of 10 cm. PMID:24369641

  9. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    PubMed

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  10. Quantitative morphometric analysis for the tectonic characterisation of northern Tunisia.

    NASA Astrophysics Data System (ADS)

    Camafort, Miquel; Pérez-Peña, José Vicente; Booth-Rea, Guillermo; Ranero, César R.; Gràcia, Eulàlia; Azañón, José Miguel; Melki, Fetheddine; Ouadday, Mohamed

    2016-04-01

    Northern Tunisia is characterized by low deformation rates and low to moderate seismicity. Although instrumental seismicity reaches maximum magnitudes of Mw 5.5, some historical earthquakes have occurred with catastrophic consequences in this region. Aiming to improve our knowledge of active tectonics in Tunisia, we carried out both a quantitative morphometric analysis and a field study in the north-western region. We applied different morphometric tools, such as river profiles, knickpoint analysis, hypsometric curves and integrals, and drainage pattern anomalies, in order to differentiate between zones with high or low recent tectonic activity. This analysis helps identify uplift and subsidence zones, which we relate to fault activity. Several active faults in a sparse distribution were identified. A selected sector was studied with a field campaign to test the results obtained with the quantitative analysis. During the fieldwork we identified geological evidence of recent activity and a considerable seismogenic potential along the El Alia-Teboursouk (ETF) and Dkhila (DF) faults. The ETF fault could be responsible for one of the most devastating historical earthquakes in northern Tunisia, which destroyed Utique in 412 A.D. Geological evidence includes fluvial terraces folded by faults, striated and cracked pebbles, clastic dikes, sand volcanoes, coseismic cracks, etc. Although not reflected in the instrumental seismicity, our results support an important seismic hazard, evidenced by the several active tectonic structures identified and the two seismogenic faults described. After obtaining the current active tectonic framework of Tunisia we discuss our results within the western Mediterranean, trying to contribute to the understanding of the western Mediterranean tectonic context. With our results, we suggest that the main reason explaining the sparse and scarce seismicity of the area in contrast with the adjacent parts of the Nubia-Eurasia boundary is due to its extended
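
    One of the morphometric tools named above, the hypsometric integral, has a simple closed form, HI = (mean − min) / (max − min) of catchment elevation. The sketch below computes it for two hypothetical catchments; all elevation values are invented for illustration.

```python
import numpy as np

def hypsometric_integral(elev):
    """Hypsometric integral HI = (mean - min) / (max - min).
    Higher values suggest a youthful, tectonically active landscape;
    lower values suggest a more eroded, mature one."""
    e = np.asarray(elev, dtype=float)
    return (e.mean() - e.min()) / (e.max() - e.min())

# Hypothetical DEM elevations (m) sampled from two small catchments.
uplifted = [900, 850, 820, 780, 700, 400]
subdued  = [900, 500, 420, 380, 350, 300]
print(round(hypsometric_integral(uplifted), 2),
      round(hypsometric_integral(subdued), 2))
```

    In practice the index is computed per catchment over a full DEM, and its spatial pattern is compared against mapped faults, as in the analysis above.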

  11. Quantum dots assisted laser desorption/ionization mass spectrometric detection of carbohydrates: qualitative and quantitative analysis.

    PubMed

    Bibi, Aisha; Ju, Huangxian

    2016-04-01

    A quantum dots (QDs) assisted laser desorption/ionization mass spectrometric (QDA-LDI-MS) strategy was proposed for qualitative and quantitative analysis of a series of carbohydrates. The adsorption of carbohydrates on the modified surface of different QDs as the matrices depended mainly on the formation of hydrogen bonding, which led to higher MS intensity than those with conventional organic matrix. The effects of QDs concentration and sample preparation method were explored for improving the selective ionization process and the detection sensitivity. The proposed approach offered a new dimension to the application of QDs as matrices for MALDI-MS research of carbohydrates. It could be used for quantitative measurement of glucose concentration in human serum with good performance. The QDs used as a matrix showed the advantages of low background, high sensitivity, convenient sample preparation and excellent stability under vacuum. The QDs assisted LDI-MS approach has promising application to the analysis of carbohydrates in complex biological samples. Copyright © 2016 John Wiley & Sons, Ltd. PMID:27041659

  12. Stable Hydrogen Isotopic Composition of Sedimentary Plant Waxes as Quantitative Proxy for Rainfall in the West African Sahel

    NASA Astrophysics Data System (ADS)

    Niedermeyer, E. M.; Forrest, M.; Beckmann, B.; Sessions, A. L.; Mulch, A.; Schefuß, E.

    2015-12-01

    Multiple studies have demonstrated that the stable hydrogen isotopic composition (δD) of terrestrial leaf waxes (δDwax) tracks that of precipitation (δDprecip) both spatially across climate gradients and on a range of different timescales. Yet, reconstructed estimates of δDprecip and corresponding rainfall typically remain largely relative, due mainly to uncertainties in plant ecosystem net fractionation, relative humidity, and the stability of the amount effect through time. We present δDwax together with corresponding stable carbon isotopic compositions (δ13Cwax) from a marine sediment core offshore from the North West (NW) African Sahel covering the past 100 years and overlapping with the instrumental record of rainfall. We developed a framework within which we produced a quantitative reconstruction of rainfall based on a δDwax time series, and compared it to records of rainfall in the terrestrial catchment area. The combined datasets demonstrate the feasibility to derive an accurate quantitative estimate of precipitation based on δDwax in specific depositional settings.
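
    A minimal sketch of the kind of δDwax-to-rainfall inversion described above: convert leaf-wax δD to precipitation δD with a net ecosystem fractionation, then invert an amount-effect regression. The fractionation value and regression coefficients below are illustrative assumptions, not the study's calibrated framework.

```python
EPS_NET = -120.0  # assumed net ecosystem fractionation (permil)

def d_precip(d_wax, eps=EPS_NET):
    """Remove the net fractionation: alpha = eps/1000 + 1."""
    alpha = eps / 1000.0 + 1.0
    return (d_wax + 1000.0) / alpha - 1000.0

def rainfall_mm(d_p, slope=-60.0, intercept=-10.0):
    """Invert a hypothetical amount-effect line
    dD_precip = intercept + slope * P (P in m/yr); returns mm/yr."""
    return (d_p - intercept) / slope * 1000.0

dp = d_precip(-150.0)          # illustrative measured dDwax of -150 permil
print(round(dp, 1), round(rainfall_mm(dp)))
```

    The negative slope encodes the amount effect (heavier rainfall gives more depleted precipitation); uncertainty in the net fractionation term is the dominant caveat noted in the abstract.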

  13. Quantitative image analysis in sonograms of the thyroid gland

    NASA Astrophysics Data System (ADS)

    Catherine, Skouroliakou; Maria, Lyra; Aristides, Antoniou; Lambros, Vlahos

    2006-12-01

    High-resolution, real-time ultrasound is a routine examination for assessing the disorders of the thyroid gland. However, the current diagnosis practice is based mainly on qualitative evaluation of the resulting sonograms, therefore depending on the physician's experience. Computerized texture analysis is widely employed in sonographic images of various organs (liver, breast), and it has been proven to increase the sensitivity of diagnosis by providing a better tissue characterization. The present study attempts to characterize thyroid tissue by automatic texture analysis. The texture features that are calculated are based on co-occurrence matrices as they have been proposed by Haralick. The sample consists of 40 patients. For each patient two sonographic images (one for each lobe) are recorded in DICOM format. The lobe is manually delineated in each sonogram, and the co-occurrence matrices for 52 separation vectors are calculated. The texture features extracted from each one of these matrices are: contrast, correlation, energy and homogeneity. Principal component analysis is used to select the optimal set of features. The statistical analysis resulted in the extraction of 21 optimal descriptors. The optimal descriptors are all co-occurrence parameters, as the first-order statistics did not prove to be representative of the images' characteristics. The larger number of components depends mainly on correlation at very short or very long separation distances. The results indicate that quantitative analysis of thyroid sonograms can provide an objective characterization of thyroid tissue.
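
    The four co-occurrence features listed above (contrast, correlation, energy, homogeneity) follow directly from Haralick's definitions. The sketch below builds a normalized gray-level co-occurrence matrix for one separation vector and derives the features; the patch is random toy data, not a sonogram.

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=8):
    """Normalized gray-level co-occurrence matrix for one separation vector."""
    P = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            P[img[y, x], img[y + dy, x + dx]] += 1
    return P / P.sum()

def haralick(P):
    """Contrast, correlation, energy, homogeneity from a normalized GLCM."""
    i, j = np.indices(P.shape)
    mu_i, mu_j = (i * P).sum(), (j * P).sum()
    sd_i = np.sqrt(((i - mu_i) ** 2 * P).sum())
    sd_j = np.sqrt(((j - mu_j) ** 2 * P).sum())
    return {
        'contrast':    ((i - j) ** 2 * P).sum(),
        'correlation': ((i - mu_i) * (j - mu_j) * P).sum() / (sd_i * sd_j),
        'energy':      (P ** 2).sum(),
        'homogeneity': (P / (1.0 + np.abs(i - j))).sum(),
    }

# Toy patch quantized to 8 gray levels (a real study would use the
# delineated lobe region and all 52 separation vectors).
rng = np.random.default_rng(2)
patch = rng.integers(0, 8, (32, 32))
feats = haralick(glcm(patch))
print({k: round(v, 3) for k, v in feats.items()})
```

    Computing the same features over many separation vectors, as in the study, yields the feature set that principal component analysis then reduces.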

  14. Hydrogen and Water: An Engineering, Economic and Environmental Analysis

    SciTech Connect

    Simon, A J; Daily, W; White, R G

    2010-01-06

    The multi-year program plan for the Department of Energy's Hydrogen and Fuel Cells Technology Program (USDOE, 2007a) calls for the development of system models to determine economic, environmental and cross-cutting impacts of the transition to a hydrogen economy. One component of the hydrogen production and delivery chain is water; water's use and disposal can incur costs and environmental consequences for almost any industrial product. It has become increasingly clear that due to factors such as competing water demands and climate change, the potential for a water-constrained world is real. Thus, any future hydrogen economy will need to be constructed so that any associated water impacts are minimized. This, in turn, requires the analysis and comparison of specific hydrogen production schemes in terms of their water use. Broadly speaking, two types of water are used in hydrogen production: process water and cooling water. In the production plant, process water is used as a direct input for the conversion processes (e.g. steam for Steam Methane Reforming {l_brace}SMR{r_brace}, water for electrolysis). Cooling water, by distinction, is used indirectly to cool related fluids or equipment, and is an important factor in making plant processes efficient and reliable. Hydrogen production further relies on water used indirectly to generate other feedstocks required by a hydrogen plant. This second order indirect water is referred to here as 'embedded' water. For example, electricity production uses significant quantities of water; this 'thermoelectric cooling' contributes significantly to the total water footprint of the hydrogen production chain. A comprehensive systems analysis of the hydrogen economy includes the aggregate of the water intensities from every step in the production chain including direct, indirect, and embedded water. Process and cooling waters have distinct technical quality requirements. Process water, which is typically high purity (limited dissolved

  15. The destruction chemistry of organophosphorus compounds in flames -- I: Quantitative determination of final phosphorus-containing species in hydrogen-oxygen flames

    SciTech Connect

    Korobeinichev, O.P.; Ilyin, S.B.; Shvartsberg, V.M.; Chernov, A.A.

    1999-09-01

    The combustion of organophosphorus compounds (OPC) is of considerable interest in connection with the disposal of toxic and hazardous chemical wastes and other undesirable substances containing phosphorus, including chemical warfare agents (CWA) such as the nerve agents sarin and VX. This paper presents the results of a quantitative determination of the composition of final phosphorus-containing products (PO, PO{sub 2}, HOPO, and HOPO{sub 2}) from the destruction of the organophosphorus compounds trimethyl phosphate (TMP) and dimethyl methylphosphonate (DMMP) in premixed hydrogen-oxygen flames. The flames were stabilized on a flat burner at 47 Torr and probed using molecular beam mass spectrometric techniques. Quantitative analysis of these species is difficult, due to problems with mass spectrometric calibrations. Also, these compounds are unstable under normal conditions and are not readily available. To solve this problem, a material balance equation for the element phosphorus has been used to analyze the results in stoichiometric, rich, and lean flames, doped with different amounts of TMP and DMMP. A system of linear nondegenerate material balance equations was solved using the Singular Value Decomposition (SVD) algorithm. The calculated calibration coefficients for the phosphorus species have allowed their mole fractions to be derived. How the concentrations of PO, PO{sub 2}, HOPO, and HOPO{sub 2} depend on the initial concentrations of DMMP or TMP and on the mixture's composition has been studied. The measurements are compared to the results of thermochemical equilibrium calculations.
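
    The material-balance calibration described above reduces to an overdetermined linear system: in each flame, the calibration-weighted sum of the four species signals must equal the total phosphorus fed as dopant. A minimal sketch using an SVD pseudoinverse, with invented intensities and coefficients (not the paper's data):

```python
import numpy as np

# Each row: signal intensities of (PO, PO2, HOPO, HOPO2) in one flame;
# each right-hand side: total phosphorus fed as dopant.  The data are
# generated from hypothetical 'true' calibration coefficients plus noise.
rng = np.random.default_rng(3)
k_true = np.array([2.0, 1.5, 3.0, 0.8])       # hypothetical calibration
I = rng.uniform(0.1, 1.0, (10, 4))            # 10 flames, 4 species
P_total = I @ k_true + 0.01 * rng.normal(size=10)

# Solve the overdetermined balance I @ k = P_total via SVD pseudoinverse.
U, s, Vt = np.linalg.svd(I, full_matrices=False)
k = Vt.T @ ((U.T @ P_total) / s)

# With calibration coefficients in hand, species mole fractions in any
# one flame follow from the calibrated signals.
mole_fractions = I[0] * k / (I[0] * k).sum()
print(np.round(k, 2))
```

    Using the SVD (rather than a direct normal-equations solve) is what keeps the inversion stable when the balance equations are nearly degenerate.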

  16. Hydrogen-fueled scramjets: Potential for detailed combustor analysis

    NASA Technical Reports Server (NTRS)

    Beach, H. L., Jr.

    1976-01-01

    Combustion research related to hypersonic scramjet (supersonic combustion ramjet) propulsion is discussed from the analytical point of view. Because the fuel is gaseous hydrogen, mixing is single phase and the chemical kinetics are well known; therefore, the potential for analysis is good relative to hydrocarbon-fueled engines. Recent progress in applying two and three dimensional analytical techniques to mixing and reacting flows indicates cause for optimism, and identifies several areas for continuing effort.

  17. Thermodynamic analysis of acetic acid steam reforming for hydrogen production

    NASA Astrophysics Data System (ADS)

    Goicoechea, Saioa; Ehrich, Heike; Arias, Pedro L.; Kockmann, Norbert

    2015-04-01

    A thermodynamic analysis of hydrogen generation by acetic acid steam reforming has been carried out with respect to applications in solid oxide fuel cells. The effect of operating parameters on equilibrium composition has been examined focusing especially on hydrogen and carbon monoxide production, which are the fuels in this type of fuel cell. The temperature, steam to acetic acid ratio, and to a lesser extent pressure affect significantly the equilibrium product distribution due to their influence on steam reforming, thermal decomposition and water-gas shift reaction. The study shows that steam reforming of acetic acid with a steam to acetic acid ratio of 2 to 1 is thermodynamically feasible with hydrogen, carbon monoxide and water as the main products at the equilibrium at temperatures higher than 700 °C, and achieving CO/CO2 ratios higher than 1. Thus, it can be concluded that within the operation temperature range of solid oxide fuel cells - between 700 °C and 1000 °C - the production of a gas rich in hydrogen and carbon monoxide is promoted.

  18. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    PubMed

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). Current data analysis platforms typically rely on protein-level ratios, which are obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework, EBprot, that directly models the peptide-protein hierarchy and rewards proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study showing that the peptide-level analysis of EBprot provides better receiver operating characteristics and more accurate estimation of the false discovery rates than methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot to different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and to another dataset for time-course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to existing statistical methods for DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). PMID:25913743
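The core argument, that reproducibility across peptides carries evidence which protein-level summary ratios discard, can be illustrated with a toy calculation (my own sketch, not the EBprot model): two proteins with the same mean peptide log-ratio but different peptide counts yield very different test statistics.

```python
# Toy stand-in for peptide-level evidence (not EBprot itself): a protein backed
# by many concordant peptide log2-ratios gives much stronger evidence of
# differential expression than the same mean ratio from few peptides.
import math
import statistics

def t_statistic(log_ratios):
    """One-sample t statistic of peptide log2-ratios against 0 (no change)."""
    n = len(log_ratios)
    mean = statistics.fmean(log_ratios)
    sd = statistics.stdev(log_ratios)
    return mean / (sd / math.sqrt(n))

few = [1.1, 0.6]                          # 2 peptides, mean log2-ratio ~0.85
many = [0.9, 0.8, 1.0, 0.7, 0.9, 0.8]     # 6 peptides, same mean ~0.85

print(t_statistic(few), t_statistic(many))  # the 6-peptide protein scores higher
```

A protein-level ratio collapses both cases to the same single number (~0.85), which is exactly the information loss the abstract describes.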

  19. Functional linear models for association analysis of quantitative traits.

    PubMed

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants, common variants, or a combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. Using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants, adjusting for covariates. Extensive simulation analysis shows that the F-distributed tests of the proposed fixed effect functional linear models have higher power than the sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) in most cases for three scenarios: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to their optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and of similarity among different individuals, while SKAT and SKAT-O model only the similarities and pairwise LD and do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. PMID:24130119

  20. The Quantitative Analysis of Chennai Automotive Industry Cluster

    NASA Astrophysics Data System (ADS)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai is often called the Detroit of India because its automotive industry produces over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai, and Thirumudivakkam industrial estates of Chennai faced problems in infrastructure, technology, procurement, production, and marketing. The objective is to study the quantitative performance of the Chennai automotive industry cluster before (2001-2002) and after (2008-2009) the Cluster Development Approach (CDA). The methodology adopted is collection of primary data from 100 ACI using a quantitative questionnaire, analyzed using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables; the models proposed here reveal the approximate relationship in closer form. The KWT shows no significant difference between the three location clusters with respect to net profit, production cost, marketing costs, procurement costs, and gross output, supporting the conclusion that each location has contributed uniformly to the development of the automobile component cluster. The FMT shows no significant difference between industrial units in respect of costs such as production, infrastructure, technology, marketing, and net profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralised facilities by adopting the CDA and now export their products to North America, South America, Europe, Australia, Africa, and Asia. Value chain analysis models have been implemented in all the cluster units. This CDA model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity

  2. Quantitative analysis of the polarization characteristics of atherosclerotic plaques

    NASA Astrophysics Data System (ADS)

    Gubarkova, Ekaterina V.; Kirillin, Michail Y.; Dudenkova, Varvara V.; Kiseleva, Elena B.; Moiseev, Alexander A.; Gelikonov, Grigory V.; Timofeeva, Lidia B.; Fiks, Ilya I.; Feldchtein, Felix I.; Gladkova, Natalia D.

    2016-04-01

    In this study we demonstrate the capability of cross-polarization optical coherence tomography (CP OCT) to assess the condition of collagen and elastin fibers in atherosclerotic plaques based on the ratio of the OCT signal levels in the cross- and co-polarizations. We consider the depolarization factor (DF) and the effective birefringence (Δn) as quantitative characteristics of CP OCT images. We found that calculation of both DF and Δn in the region of interest (the fibrous cap) yields a statistically significant difference between stable and unstable plaques (0.46 ± 0.21 vs 0.09 ± 0.04 for DF; (4.7 ± 1.0) × 10^-4 vs (2.5 ± 0.7) × 10^-4 for Δn; p < 0.05). In parallel with CP OCT we used nonlinear microscopy for analysis of thin cross-sections of atherosclerotic plaque, revealing different average isotropy indices of collagen and elastin fibers for stable and unstable plaques (0.30 ± 0.10 vs 0.70 ± 0.08; p < 0.001). The proposed approach for quantitative assessment of CP OCT images allows cross-scattering and birefringence characterization of stable and unstable atherosclerotic plaques.

  3. Bayesian robust analysis for genetic architecture of quantitative traits

    PubMed Central

    Yang, Runqing; Wang, Xin; Li, Jian; Deng, Hongwen

    2009-01-01

    Motivation: In most quantitative trait locus (QTL) mapping studies, phenotypes are assumed to follow normal distributions. Deviations from this assumption may affect the accuracy of QTL detection and lead to detection of spurious QTLs. To improve the robustness of QTL mapping methods, we replaced the normal distribution for residuals in multiple interacting QTL models with the normal/independent distributions, a class of symmetric, long-tailed distributions able to accommodate residual outliers. Subsequently, we developed a Bayesian robust analysis strategy for dissecting the genetic architecture of quantitative traits and for mapping genome-wide interacting QTLs in line crosses. Results: Through computer simulations, we showed that our strategy has power for QTL detection similar to that of traditional methods assuming normally distributed traits, but substantially increased power for non-normal phenotypes. When this strategy was applied to a group of traits associated with physical/chemical characteristics and quality in rice, more main-effect and epistatic QTLs were detected than with traditional Bayesian analyses under the normal assumption. Contact: runqingyang@sjtu.edu.cn; dengh@umkc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:18974168

  4. Quantitative analysis of gene function in the Drosophila embryo.

    PubMed Central

    Tracey, W D; Ning, X; Klingler, M; Kramer, S G; Gergen, J P

    2000-01-01

    The specific functions of gene products frequently depend on the developmental context in which they are expressed. Thus, studies on gene function will benefit from systems that allow for manipulation of gene expression within model systems where the developmental context is well defined. Here we describe a system that allows for genetically controlled overexpression of any gene of interest under normal physiological conditions in the early Drosophila embryo. This regulated expression is achieved through the use of Drosophila lines that express a maternal mRNA for the yeast transcription factor GAL4. Embryos derived from females that express GAL4 maternally activate GAL4-dependent UAS transgenes at uniform levels throughout the embryo during the blastoderm stage of embryogenesis. The expression levels can be quantitatively manipulated through the use of lines that have different levels of maternal GAL4 activity. Specific phenotypes are produced by expression of a number of different developmental regulators with this system, including genes that normally do not function during Drosophila embryogenesis. Analysis of the response to overexpression of runt provides evidence that this pair-rule segmentation gene has a direct role in repressing transcription of the segment-polarity gene engrailed. The maternal GAL4 system will have applications both for the measurement of gene activity in reverse genetic experiments as well as for the identification of genetic factors that have quantitative effects on gene function in vivo. PMID:10628987

  5. Quantitative analysis of multiple sclerosis: a feasibility study

    NASA Astrophysics Data System (ADS)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, measurements of white matter (WM), gray matter (GM), and cerebrospinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) the MR data are modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) a new partial volume (PV) model is built into the maximum a posteriori (MAP) segmentation scheme; 3) noise artifacts are minimized by an a priori Markov random field (MRF) penalty indicating neighborhood correlation from tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.

  6. Quantitative PCR analysis of laryngeal muscle fiber types

    PubMed Central

    Van Daele, Douglas J.

    2013-01-01

    Voice and swallowing dysfunction resulting from recurrent laryngeal nerve paralysis can be improved with vocal fold injections or laryngeal framework surgery. However, denervation atrophy can cause late-term clinical failure. A major determinant of skeletal muscle physiology is myosin heavy chain (MyHC) expression, and previous protein analyses have shown changes in laryngeal muscle fiber MyHC isoforms with denervation. RNA analyses in this setting have not been performed, and understanding RNA levels will allow interventions better designed to reverse processes such as denervation. Total RNA was extracted from bilateral thyroarytenoid (TA), posterior cricoarytenoid (PCA), and cricothyroid (CT) muscles in rats. Primers were designed using published MyHC isoform sequences. SYBR Green real-time reverse transcription-polymerase chain reaction (SYBR-RT-PCR) was used for quantification. The electropherogram showed a clear separation of total RNA into 28S and 18S subunits. Melting curves showed single peaks for all MyHC primers. All MyHC isoforms were identified in all muscles with varying degrees of expression. Quantitative PCR is a sensitive method to detect MyHC isoforms in laryngeal muscle. Isoform expression by mRNA analysis was similar to previous protein analyses but showed some important differences. This technique can be used to quantitatively assess the response to interventions targeted at maintaining muscle bulk after denervation. PMID:20430402
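For SYBR Green real-time PCR data of this kind, relative transcript abundance is commonly computed with the 2^-ΔΔCt (Livak) method. A hedged sketch with made-up Ct values; the abstract does not report its quantification formula or data, and the 18S reference gene and muscle comparison below are assumptions for illustration:

```python
# Relative expression by the standard 2^-ΔΔCt method: normalize the target gene
# to a reference gene in both the sample and a calibrator, then take the fold
# change. All Ct values below are hypothetical.
def relative_expression(ct_target, ct_reference, ct_target_cal, ct_reference_cal):
    """Fold change of target vs the calibrator sample, reference-normalized."""
    delta_ct_sample = ct_target - ct_reference        # ΔCt of the sample
    delta_ct_cal = ct_target_cal - ct_reference_cal   # ΔCt of the calibrator
    return 2.0 ** -(delta_ct_sample - delta_ct_cal)   # 2^-ΔΔCt

# Hypothetical: a MyHC isoform in TA muscle vs CT muscle, normalized to 18S.
fold = relative_expression(ct_target=22.0, ct_reference=12.0,
                           ct_target_cal=24.0, ct_reference_cal=12.0)
print(fold)  # -> 4.0 (TA expresses ~4x the calibrator level)
```

The method assumes near-100% amplification efficiency for both primer pairs, which is what the single-peak melting curves in the abstract help support.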

  7. Quantitative analysis of incipient mineral loss in hard tissues

    NASA Astrophysics Data System (ADS)

    Matvienko, Anna; Mandelis, Andreas; Hellen, Adam; Jeon, Raymond; Abrams, Stephen; Amaechi, Bennett

    2009-02-01

    A coupled diffuse-photon-density-wave and thermal-wave theoretical model was developed to describe biothermophotonic phenomena in multi-layered hard tissue structures. Photothermal radiometry was applied as a safe, non-destructive, and highly sensitive tool for the detection of early tooth enamel demineralization to test the theory. An extracted human tooth was treated sequentially with an artificial demineralization gel to simulate controlled mineral loss in the enamel. The experimental setup included a semiconductor laser (659 nm, 120 mW) as the source of the photothermal signal. Modulated laser light generated infrared blackbody radiation from the tooth upon absorption and nonradiative energy conversion. The infrared flux emitted by the treated region of the tooth surface and sub-surface was monitored with an infrared detector, both before and after treatment. Frequency scans with a laser beam size of 3 mm were performed in order to guarantee one-dimensionality of the photothermal field. TMR images showed clear differences between sound and demineralized enamel; however, this technique is destructive. Dental radiographs did not indicate any changes. The photothermal signal showed a clear change even after 1 min of gel treatment. From the fittings, thermal and optical properties of sound and demineralized enamel were obtained, which allowed for quantitative differentiation of healthy and non-healthy regions. In conclusion, the developed model was shown to be a promising tool for non-invasive quantitative analysis of early demineralization of hard tissues.

  8. EDXRF quantitative analysis of chromophore chemical elements in corundum samples.

    PubMed

    Bonizzoni, L; Galli, A; Spinolo, G; Palanza, V

    2009-12-01

    Corundum is a crystalline form of aluminum oxide (Al(2)O(3)) and is one of the rock-forming minerals. When aluminum oxide is pure, the mineral is colorless, but trace amounts of other elements such as iron, titanium, and chromium in the crystal lattice give the typical colors (including blue, red, violet, pink, green, yellow, orange, gray, white, colorless, and black) of the gemstone varieties. The starting point for our work is the quantitative evaluation of the concentration of chromophore chemical elements with the best precision possible, to match the data obtained by different techniques such as optical absorption and photoluminescence. The aim is to give an interpretation of the absorption bands present in the NIR and visible ranges which do not involve intervalence charge transfer transitions (Fe(2+) --> Fe(3+) and Fe(2+) --> Ti(4+)), commonly considered responsible for the important features of blue sapphire absorption spectra. We therefore developed a method to evaluate as accurately as possible the autoabsorption effects and the secondary excitation effects, which are frequently sources of relevant errors in quantitative EDXRF analysis. PMID:19821113

  9. Analysis of generalized interictal discharges using quantitative EEG.

    PubMed

    da Silva Braga, Aline Marques; Fujisao, Elaine Keiko; Betting, Luiz Eduardo

    2014-12-01

    Experimental evidence from animal models of absence seizures suggests a focal source for the initiation of generalized spike-and-wave (GSW) discharges. Furthermore, clinical studies indicate that patients diagnosed with idiopathic generalized epilepsy (IGE) exhibit focal electroencephalographic abnormalities involving the thalamo-cortical circuitry. This circuitry is a key network implicated in the initiation of generalized discharges and may contribute to the pathophysiology of GSW discharges. Quantitative electroencephalogram (qEEG) analysis may be able to detect abnormalities associated with the initiation of GSW discharges. The objective of this study was to determine, using qEEG analysis, whether interictal GSW discharges exhibit focal characteristics. In this study, 75 EEG recordings from 64 patients were analyzed. All EEG recordings analyzed contained at least one GSW discharge. EEG recordings were obtained with a 22-channel recorder with electrodes positioned according to the international 10-20 system of electrode placement. EEG activity was recorded for 20 min, including photic stimulation and hyperventilation. The EEG recordings were visually inspected, and the first unequivocally confirmed generalized spike was marked for each discharge. Three methods of source imaging analysis were applied: dipole source imaging (DSI), classical LORETA analysis recursively applied (CLARA), and equivalent dipole of independent components with cluster analysis. A total of 753 GSW discharges were identified and spatiotemporally analyzed. Source evaluation using all three techniques revealed that the frontal lobe was the principal source of GSW discharges (70%), followed by the parietal and occipital lobes (14%) and the basal ganglia (12%). The main anatomical sources of GSW discharges were the anterior cingulate cortex (36%) and the medial frontal gyrus (23%). Source analysis did not reveal a common focal source of GSW discharges. However

  10. Quantitative multi-image analysis for biomedical Raman spectroscopic imaging.

    PubMed

    Hedegaard, Martin A B; Bergholt, Mads S; Stevens, Molly M

    2016-05-01

    Imaging by Raman spectroscopy enables unparalleled label-free insights into cell and tissue composition at the molecular level. With established approaches limited to single image analysis, there are currently no general guidelines or consensus on how to quantify biochemical components across multiple Raman images. Here, we describe a broadly applicable methodology for the combination of multiple Raman images into a single image for analysis. This is achieved by removing image specific background interference, unfolding the series of Raman images into a single dataset, and normalisation of each Raman spectrum to render comparable Raman images. Multivariate image analysis is finally applied to derive the contributing 'pure' biochemical spectra for relative quantification. We present our methodology using four independently measured Raman images of control cells and four images of cells treated with strontium ions from substituted bioactive glass. We show that the relative biochemical distribution per area of the cells can be quantified. In addition, using k-means clustering, we are able to discriminate between the two cell types over multiple Raman images. This study shows a streamlined quantitative multi-image analysis tool for improving cell/tissue characterisation and opens new avenues in biomedical Raman spectroscopic imaging. PMID:26833935
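The unfold-and-normalise step described above can be sketched in a few lines. This is my own illustration with assumed array shapes, using standard normal variate (SNV) scaling as a stand-in for the paper's image-specific background removal and normalisation, not the authors' code:

```python
# Combine several Raman images into one spectral matrix for joint analysis.
# Each image is assumed to be a (ny, nx, n_wavenumbers) hyperspectral cube;
# it is unfolded to (n_pixels, n_wavenumbers), all images are stacked, and
# each spectrum is SNV-normalised so spectra from different images compare.
import numpy as np

def unfold_and_normalise(images):
    """images: list of (ny, nx, k) arrays -> (total_pixels, k) matrix."""
    spectra = np.concatenate([im.reshape(-1, im.shape[-1]) for im in images])
    # Standard normal variate: mean-centre and scale each spectrum (row).
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std

rng = np.random.default_rng(0)
imgs = [rng.random((8, 8, 100)) for _ in range(4)]  # four mock Raman images
X = unfold_and_normalise(imgs)
print(X.shape)  # -> (256, 100): 4 images x 64 pixels each, 100 wavenumbers
```

Multivariate decomposition or k-means clustering, as in the abstract, would then operate on the rows of the combined matrix `X`.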

  11. Quantitative analysis of the reconstruction performance of interpolants

    NASA Technical Reports Server (NTRS)

    Lansing, Donald L.; Park, Stephen K.

    1987-01-01

    The analysis presented provides a quantitative measure of the reconstruction or interpolation performance of linear, shift-invariant interpolants. The performance criterion is the mean square error of the difference between the sampled and reconstructed functions. The analysis is applicable to reconstruction algorithms used in image processing and to many types of splines used in numerical analysis and computer graphics. When formulated in the frequency domain, the mean square error clearly separates the contribution of the interpolation method from the contribution of the sampled data. The equations provide a rational basis for selecting an optimal interpolant; that is, one which minimizes the mean square error. The analysis has been applied to a selection of frequently used data splines and reconstruction algorithms: parametric cubic and quintic Hermite splines, exponential and nu splines (including the special case of the cubic spline), parametric cubic convolution, Keys' fourth-order cubic, and a cubic with a discontinuous first derivative. The emphasis in this paper is on the image-dependent case in which no a priori knowledge of the frequency spectrum of the sampled function is assumed.
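The performance criterion is easy to reproduce numerically. A minimal spatial-domain sketch of my own, using plain linear interpolation rather than the frequency-domain formulation and spline families analyzed in the paper: compute the mean square error between a function and its reconstruction from samples, and observe that it falls as sampling density rises.

```python
# Mean square reconstruction error of an interpolant, evaluated numerically
# for linear interpolation of sin(x). Illustrative only; the paper derives
# this criterion analytically in the frequency domain.
import numpy as np

x_dense = np.linspace(0, 2 * np.pi, 2001)  # dense grid standing in for x
truth = np.sin(x_dense)                    # the function being sampled

def reconstruction_mse(n_samples):
    """MSE between sin(x) and its linear-interpolant reconstruction."""
    xs = np.linspace(0, 2 * np.pi, n_samples)
    reconstructed = np.interp(x_dense, xs, np.sin(xs))
    return float(np.mean((truth - reconstructed) ** 2))

# Denser sampling -> smaller mean square error for the same interpolant.
print(reconstruction_mse(8), reconstruction_mse(32))
```

Comparing different interpolants at a fixed sampling density, as the paper does, amounts to swapping `np.interp` for other reconstruction kernels and ranking the resulting errors.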

  12. Quantitative analysis of gallstones using laser-induced breakdown spectroscopy

    SciTech Connect

    Singh, Vivek K.; Singh, Vinita; Rai, Awadhesh K.; Thakur, Surya N.; Rai, Pradeep K.; Singh, Jagdish P

    2008-11-01

    The utility of laser-induced breakdown spectroscopy (LIBS) for categorizing different types of gallbladder stone has been demonstrated by analyzing their major and minor constituents. LIBS spectra of three types of gallstone have been recorded in the 200-900 nm spectral region. Calcium is found to be the major element in all types of gallbladder stone. The spectrophotometric method has been used to classify the stones. A calibration-free LIBS method has been used for the quantitative analysis of metal elements, and the results have been compared with those obtained from inductively coupled plasma atomic emission spectroscopy (ICP-AES) measurements. The single-shot LIBS spectra from different points on the cross section (in steps of 0.5 mm from one end to the other) of gallstones have also been recorded to study the variation of constituents from the center to the surface. The presence of different metal elements and their possible role in gallstone formation is discussed.

  13. Quantitative genetic analysis of injury liability in infants and toddlers

    SciTech Connect

    Phillips, K.; Matheny, A.P. Jr.

    1995-02-27

    A threshold model of latent liability was applied to infant and toddler twin data on total count of injuries sustained during the interval from birth to 36 months of age. A quantitative genetic analysis of estimated twin correlations in injury liability indicated strong genetic dominance effects, but no additive genetic variance was detected. Because interpretations involving overdominance have little research support, the results may be due to low order epistasis or other interaction effects. Boys had more injuries than girls, but this effect was found only for groups whose parents were prompted and questioned in detail about their children's injuries. Activity and impulsivity are two behavioral predictors of childhood injury, and the results are discussed in relation to animal research on infant and adult activity levels, and impulsivity in adult humans. Genetic epidemiological approaches to childhood injury should aid in targeting higher risk children for preventive intervention. 30 refs., 4 figs., 3 tabs.
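The inference pattern behind "strong dominance, no additive variance" follows from the classical expectations for twin correlations, rMZ = a² + d² and rDZ = a²/2 + d²/4. A toy sketch with invented correlations (not the study's data) shows how a DZ correlation near one quarter of the MZ correlation points to dominance:

```python
# Decompose twin correlations into additive (a^2) and dominance (d^2) genetic
# variance components using the classical expectations
#   rMZ = a^2 + d^2        (monozygotic twins share all genetic effects)
#   rDZ = a^2/2 + d^2/4    (dizygotic twins share half additive, quarter dominance)
# Solving the two equations gives the closed forms below.
def additive_dominance(r_mz, r_dz):
    additive = 4 * r_dz - r_mz       # a^2
    dominance = 2 * r_mz - 4 * r_dz  # d^2
    return additive, dominance

# rDZ near rMZ/4 is the pattern the abstract reports: dominance, no additive.
print(additive_dominance(0.6, 0.15))
```

With rDZ = rMZ/4 the additive component vanishes, which is the signature the abstract attributes to dominance (or, as it notes, possibly epistasis).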

  14. Quantitative analysis of forest island pattern in selected Ohio landscapes

    SciTech Connect

    Bowen, G.W.; Burgess, R.L.

    1981-07-01

    The purpose of this study was to quantitatively describe the various aspects of regional distribution patterns of forest islands and relate those patterns to other landscape features. Several maps showing the forest cover of various counties in Ohio were selected as representative examples of forest patterns to be quantified. Ten thousand hectare study areas (landscapes) were delineated on each map. A total of 15 landscapes representing a wide variety of forest island patterns was chosen. Data were converted into a series of continuous variables which contained information pertinent to the sizes, shape, numbers, and spacing of woodlots within a landscape. The continuous variables were used in a factor analysis to describe the variation among landscapes in terms of forest island pattern. The results showed that forest island patterns are related to topography and other environmental features correlated with topography.

  15. Quantitative multielement analysis using high energy particle bombardment

    NASA Technical Reports Server (NTRS)

    Clark, P. J.; Neal, G. F.; Allen, R. O.

    1974-01-01

    Charged particles ranging in energy from 0.8 to 4.0 MeV are used to induce resonant nuclear reactions, Coulomb excitation (gamma rays), and X-ray emission in both thick and thin targets. Quantitative analysis is possible for elements from Li to Pb in complex environmental samples, although the matrix can severely reduce the sensitivity. It is necessary to use a comparator technique for the gamma rays, while for X-rays an internal standard can be used. A USGS standard rock is analyzed for a total of 28 elements. Water samples can be analyzed either by nebulizing the sample, doped with Cs or Y, onto a thin Formvar film or by extracting the sample (with or without an internal standard) onto ion exchange resin which is pressed into a pellet.

  16. Quantitative image analysis of WE43-T6 cracking behavior

    NASA Astrophysics Data System (ADS)

    Ahmad, A.; Yahya, Z.

    2013-06-01

    Environment-assisted cracking of WE43 cast magnesium (4.2 wt.% Y, 2.3 wt.% Nd, 0.7 wt.% Zr, 0.8 wt.% HRE) in the T6 peak-aged condition was induced in ambient air in notched specimens. The mechanism of fracture was studied using electron backscatter diffraction, serial sectioning, and in situ observations of crack propagation. The intermetallic material (rare-earth-enriched divorced intermetallic retained at grain boundaries, predominantly at triple points) was found to play a significant role in initiating the cracks that lead to failure of this material. Quantitative measurements were required for this project: the populations of intermetallic particles and clusters of intermetallic particles were analyzed using image analysis of metallographic images. This is part of work to generate a theoretical model of the effect of notch geometry on the static fatigue strength of this material.

  17. [Accounting for Expected Linkage in Biometric Analysis of Quantitative Traits].

    PubMed

    Mikhailov, M E

    2015-08-01

    The problem of accounting for the genetic effects of expected linkage in the disposition of random loci was solved for the additive-dominant model. The Comstock-Robinson estimates of the sum of squares of dominance effects, the sum of squares of additive effects, and the average degree of dominance were modified. Wright's estimate of the number of loci controlling the variation of a quantitative trait was also modified, and its sphere of application was extended. Formulas that eliminate the effect of linkage on average were derived for these estimates. The unbiased estimates were applied to the analysis of maize data. Our results show that the most likely cause of heterosis is dominance rather than overdominance and that the main part of the heterotic effect is provided by dozens of genes. PMID:26601496

  18. Automated quantitative cytological analysis using portable microfluidic microscopy.

    PubMed

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. PMID:25990413

  19. Large-Scale Quantitative Analysis of Painting Arts

    PubMed Central

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-01-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images – the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances. PMID:25501877

  20. Sensitive LC MS quantitative analysis of carbohydrates by Cs+ attachment.

    PubMed

    Rogatsky, Eduard; Jayatillake, Harsha; Goswami, Gayotri; Tomuta, Vlad; Stein, Daniel

    2005-11-01

    The development of a sensitive assay for the quantitative analysis of carbohydrates from human plasma using LC/MS/MS is described in this paper. After sample preparation, carbohydrates were cationized by Cs(+) after their separation by normal phase liquid chromatography on an amino based column. Cesium is capable of forming a quasi-molecular ion [M + Cs](+) with neutral carbohydrate molecules in the positive ion mode of electrospray ionization mass spectrometry. The mass spectrometer was operated in multiple reaction monitoring mode, and transitions [M + 133] --> 133 were monitored (M, carbohydrate molecular weight). The new method is robust, highly sensitive, rapid, and does not require postcolumn addition or derivatization. It is useful in clinical research for measurement of carbohydrate molecules by isotope dilution assay. PMID:16182559
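
    The transition scheme [M + 133] → 133 described above can be sketched directly; the analyte list and nominal masses below are illustrative assumptions, not values from the paper.

```python
# Sketch of the MRM transition scheme described above: neutral carbohydrates
# are cationized by Cs+, so the precursor ion is [M + Cs]+ and the product
# ion is the bare Cs+ cation. Nominal masses are used here for illustration.
CS_NOMINAL = 133  # nominal mass of cesium, the product ion monitored

# Hypothetical analyte list (nominal molecular weights)
carbohydrates = {
    "glucose": 180,
    "fructose": 180,
    "sucrose": 342,
    "lactose": 342,
}

def mrm_transition(mol_weight, cs=CS_NOMINAL):
    """Return the (precursor, product) m/z pair for [M + Cs]+ -> Cs+."""
    return (mol_weight + cs, cs)

for name, mw in carbohydrates.items():
    precursor, product = mrm_transition(mw)
    print(f"{name}: {precursor} -> {product}")
```

    Note that isomers such as glucose and fructose share the same transition, which is why the chromatographic separation on the amino column precedes detection.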

  1. Quantitative Image Analysis of HIV-1 Infection in Lymphoid Tissue

    NASA Astrophysics Data System (ADS)

    Haase, Ashley T.; Henry, Keith; Zupancic, Mary; Sedgewick, Gerald; Faust, Russell A.; Melroe, Holly; Cavert, Winston; Gebhard, Kristin; Staskus, Katherine; Zhang, Zhi-Qiang; Dailey, Peter J.; Balfour, Henry H., Jr.; Erice, Alejo; Perelson, Alan S.

    1996-11-01

    Tracking human immunodeficiency virus-type 1 (HIV-1) infection at the cellular level in tissue reservoirs provides opportunities to better understand the pathogenesis of infection and to rationally design and monitor therapy. A quantitative technique was developed to determine viral burden in two important cellular compartments in lymphoid tissues. Image analysis and in situ hybridization were combined to show that in the presymptomatic stages of infection there is a large, relatively stable pool of virions on the surfaces of follicular dendritic cells and a smaller pool of productively infected cells. Despite evidence of constraints on HIV-1 replication in the infected cell population in lymphoid tissues, estimates of the numbers of these cells and the virus they could produce are consistent with the quantities of virus that have been detected in the bloodstream. The cellular sources of virus production and storage in lymphoid tissues can now be studied with this approach over the course of infection and treatment.

  2. Large-Scale Quantitative Analysis of Painting Arts

    NASA Astrophysics Data System (ADS)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances.

  3. A Novel Quantitative Approach to Concept Analysis: The Internomological Network

    PubMed Central

    Cook, Paul F.; Larsen, Kai R.; Sakraida, Teresa J.; Pedro, Leli

    2012-01-01

    Background When a construct such as patients’ transition to self-management of chronic illness is studied by researchers across multiple disciplines, the meaning of key terms can become confused. This results from inherent problems in language where a term can have multiple meanings (polysemy) and different words can mean the same thing (synonymy). Objectives To test a novel quantitative method for clarifying the meaning of constructs by examining the similarity of published contexts in which they are used. Method Published terms related to the concept transition to self-management of chronic illness were analyzed using the internomological network (INN), a type of latent semantic analysis to calculate the mathematical relationships between constructs based on the contexts in which researchers use each term. This novel approach was tested by comparing results to those from concept analysis, a best-practice qualitative approach to clarifying meanings of terms. By comparing results of the two methods, the best synonyms of transition to self-management, as well as key antecedent, attribute, and consequence terms, were identified. Results Results from INN analysis were consistent with those from concept analysis. The potential synonyms self-management, transition, and adaptation had the greatest utility. Adaptation was the clearest overall synonym, but had lower cross-disciplinary use. The terms coping and readiness had more circumscribed meanings. The INN analysis confirmed key features of transition to self-management, and suggested related concepts not found by the previous review. Discussion The INN analysis is a promising novel methodology that allows researchers to quantify the semantic relationships between constructs. The method works across disciplinary boundaries, and may help to integrate the diverse literature on self-management of chronic illness. PMID:22592387
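
    The core idea of comparing constructs by the similarity of their published contexts can be sketched with a toy cosine-similarity computation; the term-context counts below are hypothetical, and the actual INN applies latent semantic analysis to a large literature corpus.

```python
# Toy sketch of the idea behind the INN: represent each construct by the
# contexts in which it occurs and compare constructs by cosine similarity.
# The tiny term-context vectors here are purely illustrative.
import math

def cosine(u, v):
    """Cosine similarity between two equal-length count vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

# Hypothetical counts of each term across five shared publication contexts
contexts = {
    "self-management": [4, 2, 0, 1, 3],
    "transition":      [3, 2, 1, 1, 2],
    "coping":          [0, 1, 4, 3, 0],
}

sim = cosine(contexts["self-management"], contexts["transition"])
print(round(sim, 3))  # near 1.0: the two terms appear in similar contexts
```

    Terms used in similar contexts score near 1.0 (candidate synonyms), while terms with more circumscribed meanings, like "coping" above, score lower.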

  4. A quantitative histological analysis of the dilated ureter of childhood.

    PubMed

    Lee, B R; Partin, A W; Epstein, J I; Quinlan, D M; Gosling, J A; Gearhart, J P

    1992-11-01

    A quantitative histological study of the dilated ureter of childhood was performed on 26 ureters. The specimens were from 15 male and 11 female patients 10 days to 12 years old (mean age 2.0 years). A color image analysis system was used to examine and compare collagen and smooth muscle components of the muscularis layers to normal control ureters of similar age. In comparing primary obstructed (12) to primary refluxing (14) megaureters and control ureters (6), there was a statistically significant difference in the collagen-to-smooth muscle ratio (p < 0.001) between the primary obstructed and primary refluxing megaureter groups. For patients with primary refluxing megaureter there was a 2-fold increase in the tissue matrix ratio of collagen-to-smooth muscle when compared to patients with primary obstructed megaureter. In the primary obstructed megaureters the amount of collagen and smooth muscle was not statistically different from controls (p > 0.01). The increased tissue matrix ratio of 2.0 +/- 0.35 (collagen-to-smooth muscle) in the refluxing megaureter group compared to 0.78 +/- 0.22 in the obstructed megaureter group and 0.52 +/- 0.12 in controls was found to be due not only to a marked increase in collagen but also to a significant decrease in the smooth muscle component of the tissue. Primary obstructed and normal control ureters had similar quantitative amounts of smooth muscle with 60 +/- 5% and 61 +/- 6%, respectively, while refluxing megaureters had only 40 +/- 5% smooth muscle. The percentage collagen was 36 +/- 5% in the obstructed megaureter group and 30 +/- 5% in controls, with refluxing megaureters having 58 +/- 5% collagen on analysis. Our findings emphasize the significant differences in the structural components (collagen and smooth muscle) of the dilated ureter of childhood, and provide us with further insight into the pathological nature of these dilated ureters and their surgical repair. PMID:1433552

  5. Insights from Hydrogen Refueling Station Manufacturing Competitiveness Analysis

    SciTech Connect

    Mayyas, Ahmad

    2015-12-18

    In work for the Clean Energy Manufacturing Analysis Center (CEMAC), NREL is currently collaborating with Great Lakes Wind Network on a comprehensive hydrogen refueling station (HRS) manufacturing competitiveness and supply chain analysis. In this project, CEMAC will be looking at several metrics that will facilitate understanding of the interactions between and within the HRS supply chain; these metrics include innovation potential, intellectual property, learning curves, related industries and clustering, existing supply chains, ease of doing business, and regulations and safety. This presentation to Fuel Cell Seminar and Energy Exposition 2015 highlights initial findings from CEMAC's analysis.

  6. Quantitative analysis of protein-ligand interactions by NMR.

    PubMed

    Furukawa, Ayako; Konuma, Tsuyoshi; Yanaka, Saeko; Sugase, Kenji

    2016-08-01

    Protein-ligand interactions have been commonly studied through static structures of the protein-ligand complex. Recently, however, there has been increasing interest in investigating the dynamics of protein-ligand interactions both for fundamental understanding of the underlying mechanisms and for drug development. NMR is a versatile and powerful tool, especially because it provides site-specific quantitative information. NMR has widely been used to determine the dissociation constant (KD), in particular, for relatively weak interactions. The simplest NMR method is a chemical-shift titration experiment, in which the chemical-shift changes of a protein in response to ligand titration are measured. There are other quantitative NMR methods, but they mostly apply only to interactions in the fast-exchange regime. These methods derive the dissociation constant from population-averaged NMR quantities of the free and bound states of a protein or ligand. In contrast, the recent advent of new relaxation-based experiments, including R2 relaxation dispersion and ZZ-exchange, has enabled us to obtain kinetic information on protein-ligand interactions in the intermediate- and slow-exchange regimes. Based on R2 dispersion or ZZ-exchange, methods that can determine the association rate, kon, dissociation rate, koff, and KD have been developed. In these approaches, R2 dispersion or ZZ-exchange curves are measured for multiple samples with different protein and/or ligand concentration ratios, and the relaxation data are fitted to theoretical kinetic models. It is critical to choose an appropriate kinetic model, such as the two- or three-state exchange model, to derive the correct kinetic information. The R2 dispersion and ZZ-exchange methods are suitable for the analysis of protein-ligand interactions with a micromolar or sub-micromolar dissociation constant but not for very weak interactions, which are typical in very fast exchange.
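
    A minimal sketch of the chemical-shift titration analysis mentioned above, assuming fast exchange, a 1:1 binding model, and negligible ligand depletion; the synthetic data and grid-search fit are illustrative, not the authors' implementation.

```python
# Minimal sketch of a chemical-shift titration analysis (fast exchange),
# assuming excess ligand so [L]free ~ [L]total. The observed shift change
# follows a 1:1 binding isotherm: ddelta = ddelta_max * L / (KD + L).
# Synthetic data and the grid-search fit below are illustrative only.

def isotherm(L, kd, dmax):
    return dmax * L / (kd + L)

def fit_kd(ligand_conc, ddelta, kd_grid):
    """Grid-search KD; for each candidate KD the best ddelta_max has a
    closed form (linear least squares). Returns (kd, dmax) minimizing SSE."""
    best = None
    for kd in kd_grid:
        x = [L / (kd + L) for L in ligand_conc]
        dmax = sum(xi * yi for xi, yi in zip(x, ddelta)) / sum(xi * xi for xi in x)
        sse = sum((yi - dmax * xi) ** 2 for xi, yi in zip(x, ddelta))
        if best is None or sse < best[0]:
            best = (sse, kd, dmax)
    return best[1], best[2]

# Synthetic titration generated with KD = 50 uM, ddelta_max = 0.25 ppm
L_uM = [10, 25, 50, 100, 200, 400, 800]
shifts = [isotherm(L, 50.0, 0.25) for L in L_uM]

kd_est, dmax_est = fit_kd(L_uM, shifts, kd_grid=[k / 2 for k in range(1, 400)])
print(f"KD ~ {kd_est:.1f} uM, ddelta_max ~ {dmax_est:.3f} ppm")
```

    In practice the free-ligand approximation fails for tight binding, which is one reason the relaxation-based methods discussed above are preferred in the micromolar regime.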

  7. Hydrogen Trailer Storage Facility (Building 878). Consequence analysis

    SciTech Connect

    Banda, Z.; Wood, C.L.

    1994-12-01

    The Department of Energy Order 5500.3A requires that facility-specific hazards assessments be prepared, maintained, and used for emergency planning purposes. This consequence analysis documents the impact that a hydrogen accident could have on employees, the general public, and nearby facilities. The computer model ARCHIE was utilized to determine discharge rates, toxic vapor dispersion analyses, flammable vapor cloud hazards, explosion hazards, and flame jets for the Hydrogen Trailer Storage Facility located at Building 878. To determine overpressurization effects, hand calculations derived from the Department of the Air Force Manual, "Structures to Resist the Effects of Accidental Explosions," were utilized. The greatest distances at which a postulated facility event will produce the Lower Flammability and the Lower Detonation Levels are 1,721 feet and 882 feet, respectively. The greatest distance at which 10.0 psi overpressure (i.e., total building destruction) is reached is 153 feet.
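
    Hand calculations of blast overpressure versus distance, such as those in the cited Air Force manual, are conventionally organized around cube-root scaling. As a sketch of that convention (the familiar Hopkinson-Cranz scaled distance, not the document's specific computation), peak overpressure at standoff distance R from a charge of TNT-equivalent mass W is read from empirical curves as a function of the scaled distance:

```latex
% Hopkinson-Cranz (cube-root) scaled distance; generic convention only
Z = \frac{R}{W^{1/3}}
```

    Fixed-overpressure contours, like the 153-foot distance to 10.0 psi quoted above, then correspond to fixed values of Z for a given equivalent yield.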

  8. Quantitative Medical Image Analysis for Clinical Development of Therapeutics

    NASA Astrophysics Data System (ADS)

    Analoui, Mostafa

    There has been significant progress in development of therapeutics for prevention and management of several disease areas in recent years, leading to increased average life expectancy, as well as quality of life, globally. However, due to complexity of addressing a number of medical needs and financial burden of development of new class of therapeutics, there is a need for better tools for decision making and validation of efficacy and safety of new compounds. Numerous biological markers (biomarkers) have been proposed either as adjunct to current clinical endpoints or as surrogates. Imaging biomarkers are among rapidly increasing biomarkers, being examined to expedite effective and rational drug development. Clinical imaging often involves a complex set of multi-modality data sets that require rapid and objective analysis, independent of reviewer's bias and training. In this chapter, an overview of imaging biomarkers for drug development is offered, along with challenges that necessitate quantitative and objective image analysis. Examples of automated and semi-automated analysis approaches are provided, along with technical review of such methods. These examples include the use of 3D MRI for osteoarthritis, ultrasound vascular imaging, and dynamic contrast enhanced MRI for oncology. Additionally, a brief overview of regulatory requirements is discussed. In conclusion, this chapter highlights key challenges and future directions in this area.

  9. Automatic quantitative analysis of cardiac MR perfusion images

    NASA Astrophysics Data System (ADS)

    Breeuwer, Marcel M.; Spreeuwers, Luuk J.; Quist, Marcel J.

    2001-07-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the myocardium (the heart muscle) from MR images, using contrast-enhanced ECG-triggered MRI. We have developed an automatic quantitative analysis method, which works as follows. First, image registration is used to compensate for translation and rotation of the myocardium over time. Next, the boundaries of the myocardium are detected and for each position within the myocardium a time-intensity profile is constructed. The time interval during which the contrast agent passes for the first time through the left ventricle and the myocardium is detected and various parameters are measured from the time-intensity profiles in this interval. The measured parameters are visualized as color overlays on the original images. Analysis results are stored, so that they can later on be compared for different stress levels of the heart. The method is described in detail in this paper and preliminary validation results are presented.
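
    The per-position parameter measurement described above can be sketched with common semi-quantitative first-pass parameters (peak enhancement, time to peak, maximum upslope); the synthetic curve and the parameter choice are illustrative assumptions, not the paper's exact definitions.

```python
# Illustrative sketch (not the paper's implementation) of semi-quantitative
# parameters commonly measured from a myocardial time-intensity profile
# during the first pass of the contrast agent.

def perfusion_parameters(times, intensities):
    baseline = intensities[0]
    enhanced = [i - baseline for i in intensities]   # enhancement over baseline
    peak = max(enhanced)
    t_peak = times[enhanced.index(peak)]
    upslope = max(                                    # steepest wash-in slope
        (enhanced[k + 1] - enhanced[k]) / (times[k + 1] - times[k])
        for k in range(len(times) - 1)
    )
    return {"peak": peak, "time_to_peak": t_peak, "max_upslope": upslope}

# Synthetic first-pass curve (arbitrary units, one sample per heartbeat)
t = list(range(10))
signal = [10, 10, 12, 20, 35, 48, 52, 50, 46, 44]
print(perfusion_parameters(t, signal))
```

    Computing these values per myocardial position, after the registration step described above, yields the color-overlay parameter maps.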

  10. Macro-System Model for Hydrogen Energy Systems Analysis in Transportation: Preprint

    SciTech Connect

    Diakov, V.; Ruth, M.; Sa, T. J.; Goldsby, M. E.

    2012-06-01

    The Hydrogen Macro System Model (MSM) is a simulation tool that links existing and emerging hydrogen-related models to perform rapid, cross-cutting analysis. It allows analysis of the economics, primary energy-source requirements, and emissions of hydrogen production and delivery pathways.

  11. An approach for quantitative image quality analysis for CT

    NASA Astrophysics Data System (ADS)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and to objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates PCA components with sparse loadings, as compared to the standard principal component analysis (PCA), used in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.
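
    As one concrete example of the metrics listed above, a modulation transfer function (MTF) estimate from an edge profile can be sketched as follows; the pure-Python DFT and the ideal step edge are illustrative, not the tool kit's implementation.

```python
# Minimal sketch of an MTF estimate from an edge profile: the edge spread
# function (ESF) is differentiated to a line spread function (LSF), whose
# Fourier magnitude, normalized to DC, gives the MTF.
import cmath

def mtf_from_esf(esf):
    lsf = [b - a for a, b in zip(esf, esf[1:])]       # derivative -> LSF
    n = len(lsf)
    spectrum = [
        abs(sum(lsf[k] * cmath.exp(-2j * cmath.pi * f * k / n) for k in range(n)))
        for f in range(n // 2 + 1)
    ]
    dc = spectrum[0]
    return [s / dc for s in spectrum]                  # normalize so MTF(0) = 1

# Ideal step edge: the LSF is an impulse, so the MTF should be flat (all ones)
esf_ideal = [0, 0, 0, 0, 1, 1, 1, 1]
mtf = mtf_from_esf(esf_ideal)
print([round(m, 3) for m in mtf])
```

    For a real scanner the edge is blurred, the LSF broadens, and the MTF rolls off with spatial frequency, which is what the phantom measurements quantify.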

  12. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    PubMed

    Siegel, Erin M; Riggs, Bridget M; Delmas, Amber L; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D

    2015-01-01

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites) per sequence. A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under ROC curve (AUC) of methylation in individual genes or a panel was examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated. PMID:25826459
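
    The Methylation Index and cut-point described above can be sketched directly; the gene names and per-CpG values below are hypothetical examples, not the study's data.

```python
# Sketch of the Methylation Index described above: the mean percent
# methylation across the CpG sites assayed for a gene, with a binary call
# at the >15% cut-point. Gene names and values are illustrative only.

CUTPOINT = 15.0  # percent methylation

def methylation_index(cpg_percents):
    return sum(cpg_percents) / len(cpg_percents)

def is_methylated(cpg_percents, cutpoint=CUTPOINT):
    return methylation_index(cpg_percents) > cutpoint

# Hypothetical pyrosequencing reads (% methylation per CpG site)
sample = {
    "DAPK1": [40.0, 55.0, 60.0, 48.0],   # elevated -> called methylated
    "CDH1":  [3.0, 5.0, 2.0, 6.0, 4.0],  # low -> called unmethylated
}
for gene, sites in sample.items():
    print(gene, round(methylation_index(sites), 1), is_methylated(sites))
```

    The binary calls for a gene panel are what feed the sensitivity/specificity and AUC comparisons reported above.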

  13. Quantitative DNA Methylation Analysis of Candidate Genes in Cervical Cancer

    PubMed Central

    Siegel, Erin M.; Riggs, Bridget M.; Delmas, Amber L.; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D.

    2015-01-01

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites) per sequence. A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under ROC curve (AUC) of methylation in individual genes or a panel was examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97–1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated. PMID:25826459

  14. Quantitative Analysis Of Acoustic Emission From Rock Fracture Experiments

    NASA Astrophysics Data System (ADS)

    Goodfellow, Sebastian David

    This thesis aims to advance the methods of quantitative acoustic emission (AE) analysis by calibrating sensors, characterizing sources, and applying the results to solve engineering problems. In the first part of this thesis, we built a calibration apparatus and successfully calibrated two commercial AE sensors. The ErgoTech sensor was found to have broadband velocity sensitivity and the Panametrics V103 was sensitive to surface normal displacement. These calibration results were applied to two AE data sets from rock fracture experiments in order to characterize the sources of AE events. The first data set was from an in situ rock fracture experiment conducted at the Underground Research Laboratory (URL). The Mine-By experiment was a large-scale excavation response test where both AE (10 kHz - 1 MHz) and microseismicity (MS) (1 Hz - 10 kHz) were monitored. Using the calibration information, magnitude, stress drop, dimension and energy were successfully estimated for 21 AE events recorded in the tensile region of the tunnel wall. Magnitudes were in the range -7.5 < Mw < -6.8, which is consistent with other laboratory AE results, and stress drops were within the range commonly observed for induced seismicity in the field (0.1 - 10 MPa). The second data set was AE collected during a true-triaxial deformation experiment, where the objectives were to characterize laboratory AE sources and identify issues related to moving the analysis from ideal in situ conditions to more complex laboratory conditions in terms of the ability to conduct quantitative AE analysis. We found AE magnitudes in the range -7.8 < Mw < -6.7 and as with the in situ data, stress release was within the expected range of 0.1 - 10 MPa. We identified four major challenges to quantitative analysis in the laboratory, which inhibited our ability to study parameter scaling (M0 ∝ fc^-3 scaling). These challenges were (1) limited knowledge of attenuation, which we proved was continuously evolving, (2
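
    Moment magnitudes in the quoted range (Mw ≈ -7) follow from the standard Hanks-Kanamori moment-magnitude relation; the sketch below assumes that generic relation with the seismic moment M0 in newton-meters, not the thesis's specific calibration path.

```python
import math

# Generic Hanks-Kanamori moment-magnitude relation (M0 in newton-meters),
# illustrating how the small negative AE magnitudes quoted above arise from
# laboratory-scale seismic moments. A sketch, not the thesis's calculation.

def moment_magnitude(m0_newton_meters):
    return (2.0 / 3.0) * math.log10(m0_newton_meters) - 6.07

def seismic_moment(mw):
    """Inverse relation: M0 in newton-meters for a given Mw."""
    return 10 ** (1.5 * (mw + 6.07))

# A laboratory-scale AE event: M0 of a few millinewton-meters
mw = moment_magnitude(7.2e-3)
print(f"Mw = {mw:.2f}")  # a small negative magnitude, as in the text
```

    The same relation makes clear why calibrated sensors are essential: M0 (and hence Mw) can only be recovered if absolute ground motion is known.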

  15. Quantitative Phosphoproteomics Analysis of ERBB3/ERBB4 Signaling

    PubMed Central

    Jacobs, Kris; Klammer, Martin; Jordan, Nicole; Elschenbroich, Sarah; Parade, Marc; Jacoby, Edgar; Linders, Joannes T. M.; Brehmer, Dirk; Cools, Jan; Daub, Henrik

    2016-01-01

    The four members of the epidermal growth factor receptor (EGFR/ERBB) family form homo- and heterodimers which mediate ligand-specific regulation of many key cellular processes in normal and cancer tissues. While signaling through the EGFR has been extensively studied on the molecular level, signal transduction through ERBB3/ERBB4 heterodimers is less well understood. Here, we generated isogenic mouse Ba/F3 cells that express full-length and functional membrane-integrated ERBB3 and ERBB4 or ERBB4 alone, to serve as a defined cellular model for biological and phosphoproteomics analysis of ERBB3/ERBB4 signaling. ERBB3 co-expression significantly enhanced Ba/F3 cell proliferation upon neuregulin-1 (NRG1) treatment. For comprehensive signaling studies we performed quantitative mass spectrometry (MS) experiments to compare the basal ERBB3/ERBB4 cell phosphoproteome to NRG1 treatment of ERBB3/ERBB4 and ERBB4 cells. We employed a workflow comprising differential isotope labeling with mTRAQ reagents followed by chromatographic peptide separation and final phosphopeptide enrichment prior to MS analysis. Overall, we identified 9686 phosphorylation sites which could be confidently localized to specific residues. Statistical analysis of three replicate experiments revealed 492 phosphorylation sites which were significantly changed in NRG1-treated ERBB3/ERBB4 cells. Bioinformatics data analysis recapitulated regulation of mitogen-activated protein kinase and Akt pathways, but also indicated signaling links to cytoskeletal functions and nuclear biology. Comparative assessment of NRG1-stimulated ERBB4 Ba/F3 cells revealed that ERBB3 did not trigger defined signaling pathways but more broadly enhanced phosphoproteome regulation in cells expressing both receptors. In conclusion, our data provide the first global picture of ERBB3/ERBB4 signaling and provide numerous potential starting points for further mechanistic studies. PMID:26745281

  16. Inside Single Cells: Quantitative Analysis with Advanced Optics and Nanomaterials

    PubMed Central

    Cui, Yi; Irudayaraj, Joseph

    2014-01-01

    Single cell explorations offer a unique window to inspect molecules and events relevant to mechanisms and heterogeneity constituting the central dogma of biology. A large number of nucleic acids, proteins, metabolites and small molecules are involved in determining and fine-tuning the state and function of a single cell at a given time point. Advanced optical platforms and nanotools provide tremendous opportunities to probe intracellular components with single-molecule accuracy, as well as promising tools to adjust single cell activity. In order to obtain quantitative information (e.g. molecular quantity, kinetics and stoichiometry) within an intact cell, achieving the observation with comparable spatiotemporal resolution is a challenge. For single cell studies both the method of detection and the biocompatibility are critical factors as they determine the feasibility, especially when considering live cell analysis. Although a considerable proportion of single cell methodologies depend on specialized expertise and expensive instruments, it is our expectation that the information content and implication will outweigh the costs given the impact on life science enabled by single cell analysis. PMID:25430077

  17. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    SciTech Connect

    Xu, Zhe; Wu, Chaochao; Xie, Fang; Slysz, Gordon W.; Tolic, Nikola; Monroe, Matthew E.; Petyuk, Vladislav A.; Payne, Samuel H.; Fujimoto, Grant M.; Moore, Ronald J.; Fillmore, Thomas L.; Schepmoes, Athena A.; Levine, Douglas; Townsend, Reid; Davies, Sherri; Li, Shunqiang; Ellis, Matthew; Boja, Emily; Rivers, Robert; Rodriguez, Henry; Rodland, Karin D.; Liu, Tao; Smith, Richard D.

    2015-01-02

    Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Moreover, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. Peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.

  19. Inside single cells: quantitative analysis with advanced optics and nanomaterials.

    PubMed

    Cui, Yi; Irudayaraj, Joseph

    2015-01-01

    Single-cell explorations offer a unique window to inspect molecules and events relevant to mechanisms and heterogeneity constituting the central dogma of biology. A large number of nucleic acids, proteins, metabolites, and small molecules are involved in determining and fine-tuning the state and function of a single cell at a given time point. Advanced optical platforms and nanotools provide tremendous opportunities to probe intracellular components with single-molecule accuracy, as well as promising tools to adjust single-cell activity. To obtain quantitative information (e.g., molecular quantity, kinetics, and stoichiometry) within an intact cell, achieving the observation with comparable spatiotemporal resolution is a challenge. For single-cell studies, both the method of detection and the biocompatibility are critical factors as they determine the feasibility, especially when considering live-cell analysis. Although a considerable proportion of single-cell methodologies depend on specialized expertise and expensive instruments, it is our expectation that the information content and implication will outweigh the costs given the impact on life science enabled by single-cell analysis. PMID:25430077

  20. Quantitative genetic analysis of the metabolic syndrome in Hispanic children.

    PubMed

    Butte, Nancy F; Comuzzie, Anthony G; Cole, Shelley A; Mehta, Nitesh R; Cai, Guowen; Tejero, Maria; Bastarrachea, Raul; Smith, E O'Brian

    2005-12-01

    Childhood obesity is associated with a constellation of metabolic derangements including glucose intolerance, hypertension, and dyslipidemia, referred to as metabolic syndrome. The purpose of this study was to investigate genetic and environmental factors contributing to the metabolic syndrome in Hispanic children. Metabolic syndrome, defined as having three or more metabolic risk components, was determined in 1030 Hispanic children, ages 4-19 y, from 319 families enrolled in the VIVA LA FAMILIA study. Anthropometry, body composition by dual energy x-ray absorptiometry, clinical signs, and serum biochemistries were measured using standard techniques. Risk factor analysis and quantitative genetic analysis were performed. Of the overweight children, 20%, or 28% if abnormal liver function is included in the definition, presented with the metabolic syndrome. Odds ratios for the metabolic syndrome were significantly increased by body mass index z-score and fasting serum insulin; independent effects of sex, age, puberty, and body composition were not seen. Heritabilities +/- SE for waist circumference, triglycerides (TG), HDL, systolic blood pressure (SBP), glucose, and alanine aminotransferase (ALT) were highly significant. Pleiotropy (a common set of genes affecting two traits) detected between SBP and waist circumference, SBP and glucose, HDL and waist circumference, ALT and waist circumference, and TG and ALT may underlie the clustering of the components of the metabolic syndrome. Significant heritabilities and pleiotropy seen for the components of the metabolic syndrome indicate a strong genetic contribution to the metabolic syndrome in overweight Hispanic children. PMID:16306201

  1. Optimal display conditions for quantitative analysis of stereoscopic cerebral angiograms

    SciTech Connect

    Charland, P.; Peters, T.

    1996-10-01

    For several years the authors have been using a stereoscopic display as a tool in the planning of stereotactic neurosurgical techniques. This PC-based workstation allows the surgeon to interact with and view vascular images in three dimensions, as well as to perform quantitative analysis of the three-dimensional (3-D) space. Some of the perceptual issues relevant to the presentation of medical images on this stereoscopic display were addressed in five experiments. The authors show that a number of parameters--namely the shape, color, and depth cue associated with a cursor--as well as the image filtering and observer position, have a role in improving the observer's perception of a 3-D image and his ability to localize points within the stereoscopically presented 3-D image. However, an analysis of the results indicates that while varying these parameters can lead to an effect on the performance of individual observers, the effects are not consistent across observers, and the mean accuracy remains relatively constant under the different experimental conditions.

  2. Quantitative analysis of noninvasive diagnostic procedures for induction motor drives

    NASA Astrophysics Data System (ADS)

    Eltabach, Mario; Antoni, Jerome; Najjar, Micheline

    2007-10-01

    This paper reports quantitative analyses of spectral fault components in five noninvasive diagnostic procedures that use input electric signals to detect different types of abnormalities in induction motors. Besides the traditional one-phase current spectrum analysis "SC", diagnostic procedures based on spectrum analysis of the instantaneous partial powers "P_ab" and "P_cb", the total power "P_abc", and the current space vector modulus "csvm" are considered. The aim of this comparative study is to improve diagnostic tools for the detection of electromechanical faults in electrical machines by selecting the most suitable diagnostic procedure, given some motor and fault characteristics. Defining a severity factor as the increase in amplitude of the fault characteristic frequency with respect to the healthy condition enables us to study the sensitivity of the electrical diagnostic tools. As a result, it is shown that the relationship between the angular displacements of the current sideband components at frequencies (f ± f_osc) is directly related to the type of induction motor fault. It is also shown that the total instantaneous power procedure exhibits the highest values of the detection criterion for mechanical faults, whereas for electrical faults the most reliable procedure is tightly related to the value of the motor power factor angle and the motor-load inertia. Finally, simulation and experimental results show good agreement with the theoretical fault-modeling results.
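The severity factor defined above (the amplitude increase of the fault characteristic component relative to the healthy condition) can be sketched numerically. A minimal illustration with synthetic current signals; the sampling rate, fault frequency f0 + f_osc, sideband amplitudes, and the dB-scale form of the factor are all illustrative assumptions, not values from the paper:

```python
# Hedged sketch: spectral severity factor for a fault sideband.
# All frequencies and amplitudes below are illustrative assumptions.
import numpy as np

fs = 1000.0           # sampling rate (Hz), assumed
f0, fosc = 50.0, 8.0  # supply and fault oscillation frequencies (Hz), assumed
t = np.arange(0, 2.0, 1.0 / fs)

def sideband_amplitude(signal, f_target, fs):
    """Amplitude of the spectral line closest to f_target."""
    spec = np.abs(np.fft.rfft(signal)) / len(signal) * 2
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    return spec[np.argmin(np.abs(freqs - f_target))]

healthy = np.sin(2 * np.pi * f0 * t) + 0.001 * np.sin(2 * np.pi * (f0 + fosc) * t)
faulty  = np.sin(2 * np.pi * f0 * t) + 0.05  * np.sin(2 * np.pi * (f0 + fosc) * t)

a_healthy = sideband_amplitude(healthy, f0 + fosc, fs)
a_faulty  = sideband_amplitude(faulty,  f0 + fosc, fs)

# Severity factor: increase of the fault component over the healthy
# condition, expressed here in dB.
severity_db = 20 * np.log10(a_faulty / a_healthy)
print(f"severity factor: {severity_db:.1f} dB")
```

Both tones fall on exact FFT bins here (0.5 Hz resolution), so the sideband amplitudes are recovered without leakage.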

  3. Semiautomatic Software For Quantitative Analysis Of Cardiac Positron Tomography Studies

    NASA Astrophysics Data System (ADS)

    Ratib, Osman; Bidaut, Luc; Nienaber, Christoph; Krivokapich, Janine; Schelbert, Heinrich R.; Phelps, Michael E.

    1988-06-01

    In order to derive accurate values for true tissue radiotracer concentrations from gated positron emission tomography (PET) images of the heart, which are critical for noninvasively quantifying regional myocardial blood flow and metabolism, appropriate corrections for the partial volume effect (PVE) and for contamination from adjacent anatomical structures are required. We therefore developed an integrated software package for quantitative analysis of tomographic images which provides such corrections. A semiautomatic edge detection technique outlines and partitions the myocardium into sectors. Myocardial wall thickness is measured on the images perpendicularly to the detected edges and used to correct for PVE. The programs automatically correct for radioactive decay, activity calibration, and cross contamination for both static and dynamic studies. Parameters derived with these programs include tracer concentrations and their changes over time. They are used for calculating regional metabolic rates and can be further displayed as color-coded parametric images. The approach was validated for PET imaging in 11 dog experiments. 2D echocardiograms (Echo) were recorded simultaneously to validate the edge detection and wall thickness measurement techniques. After correction for PVE using automatic wall thickness measurement, regional tissue tracer concentrations derived from PET images correlated well with true tissue concentrations as determined by well counting (r=0.98). These preliminary studies indicate that the developed automatic image analysis technique allows accurate and convenient evaluation of cardiac PET images for the measurement of both regional tracer tissue concentrations and regional myocardial function.
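The partial volume correction step can be sketched as dividing the measured concentration by a recovery coefficient derived from the measured wall thickness. A minimal illustration, assuming a 1-D Gaussian resolution model for a slab of tissue; the model and all numbers are illustrative assumptions, not the paper's actual correction scheme:

```python
# Hedged sketch: first-order partial-volume correction via a recovery
# coefficient. The 1-D Gaussian slab model and numbers are assumptions.
import math

def recovery_coefficient(thickness_mm, fwhm_mm):
    """Fraction of true activity recovered at the centre of a slab of the
    given thickness imaged with Gaussian resolution of the given FWHM."""
    sigma = fwhm_mm / (2 * math.sqrt(2 * math.log(2)))
    # integral of a unit-area Gaussian over [-t/2, t/2]
    return math.erf(thickness_mm / (2 * sigma * math.sqrt(2)))

fwhm = 12.0        # scanner resolution (mm), assumed
thickness = 10.0   # wall thickness from edge detection (mm), assumed
measured = 0.65    # measured tracer concentration (arbitrary units), assumed

rc = recovery_coefficient(thickness, fwhm)
corrected = measured / rc   # PVE-corrected concentration
print(f"RC = {rc:.3f}, corrected concentration = {corrected:.3f}")
```

The corrected value is always larger than the measured one, since a wall thinner than roughly twice the scanner resolution recovers only part of the true activity.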

  4. Estimation of Hydrogen-Exchange Protection Factors from MD Simulation Based on Amide Hydrogen Bonding Analysis.

    PubMed

    Park, In-Hee; Venable, John D; Steckler, Caitlin; Cellitti, Susan E; Lesley, Scott A; Spraggon, Glen; Brock, Ansgar

    2015-09-28

    Hydrogen exchange (HX) studies have provided critical insight into our understanding of protein folding, structure, and dynamics. More recently, hydrogen exchange mass spectrometry (HX-MS) has become a widely applicable tool for HX studies. The interpretation of the wealth of data generated by HX-MS experiments, as well as by other HX methods, would greatly benefit from the availability of exchange predictions derived from structures or models for comparison with experiment. Most reported computational HX modeling studies have employed solvent-accessible-surface-area-based metrics in attempts to interpret HX data on the basis of structures or models. In this study, a computational HX-MS prediction method based on classification of the amide hydrogen bonding modes, mimicking the local unfolding model, is demonstrated. Analysis of the NH bonding configurations from molecular dynamics (MD) simulation snapshots is used to determine the partitioning over bonded and nonbonded NH states, which is directly mapped into a protection factor (PF) using a logistic growth function. Predicted PFs are then used for calculating deuteration values of peptides and compared with experimental data. Hydrogen exchange MS data for fatty acid synthase thioesterase (FAS-TE) collected over a range of pH values and temperatures were used for a detailed evaluation of the approach. High correlation between prediction and experiment is observed for observable fragment peptides in FAS-TE and in additional benchmark systems, which included various apo/holo proteins for which literature data were available. In addition, it is shown that HX modeling can improve experimental resolution through decomposition of in-exchange curves into rate classes, which correlate with predictions from MD. Successful rate class decompositions provide further evidence that the presented approach captures the underlying physical processes correctly at the single-residue level. This assessment is further strengthened in a comparison of
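The core mapping described above (H-bonded fraction of MD snapshots → protection factor via a logistic function → peptide deuterium uptake) can be sketched as follows. The logistic parameters, intrinsic exchange rates, and toy peptide are illustrative assumptions, not the paper's fitted values:

```python
# Hedged sketch: bonded-fraction -> protection factor -> deuteration.
# All parameter values below are illustrative assumptions.
import math

def log_pf(bonded_fraction, k=10.0, midpoint=0.5, max_log_pf=6.0):
    """Logistic map: fraction of snapshots with the NH hydrogen-bonded -> log10 PF."""
    return max_log_pf / (1.0 + math.exp(-k * (bonded_fraction - midpoint)))

def residue_deuteration(k_int, bonded_fraction, t_seconds):
    """Fractional deuterium uptake of one residue after t seconds of exchange."""
    k_obs = k_int / (10.0 ** log_pf(bonded_fraction))  # k_obs = k_int / PF
    return 1.0 - math.exp(-k_obs * t_seconds)

# A toy 3-residue peptide: per-residue intrinsic rates (s^-1) and
# H-bonded fractions from (hypothetical) MD snapshot analysis.
k_int = [10.0, 5.0, 8.0]
bonded = [0.1, 0.5, 0.95]
uptake = sum(residue_deuteration(k, b, t_seconds=60.0)
             for k, b in zip(k_int, bonded)) / len(k_int)
print(f"predicted peptide deuteration after 60 s: {uptake:.2f}")
```

A mostly unprotected residue (bonded 10% of the time) exchanges almost fully within a minute, while a strongly H-bonded one barely exchanges, which is the behaviour the logistic PF map is meant to capture.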

  5. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selection criteria to determine whether engineering modeling is applicable to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  6. Integrated analysis of hydrogen passenger vehicle transportation pathways

    SciTech Connect

    Thomas, C.E.; James, B.D.; Lomax, F.D. Jr.; Kuhn, I.F. Jr.

    1998-08-01

    Hydrogen-powered fuel cell vehicles will reduce local air pollution, greenhouse gas emissions and oil imports. Other alternative vehicles such as gasoline- or methanol-powered fuel cell vehicles, natural gas vehicles and various hybrid electric vehicles with internal combustion engines may also provide significant environmental and national security advantages. This report summarizes a two-year project to compare the direct hydrogen fuel cell vehicle with other alternatives in terms of estimated cost and estimated societal benefits, all relative to a conventional gasoline-powered internal combustion engine vehicle. The cost estimates used in this study involve ground-up, detailed analysis of the major components of a fuel cell vehicle system, assuming mass production in automotive quantities. The authors have also estimated the cost of both gasoline and methanol onboard fuel processors, as well as the cost of stationary hydrogen fueling system components including steam methane reformers, electrolyzers, compressors and stationary storage systems. Sixteen different vehicle types are compared with respect to mass production cost, local air pollution and greenhouse gas emissions.

  7. Analysis of a microwave-heated planar propagating hydrogen plasma

    SciTech Connect

    Knecht, J.P.; Micci, M.M.

    1988-02-01

    The heating of a gas to high temperatures by absorption of microwave radiation has been proposed as a potential electrothermal rocket propulsion system. One possible mode of microwave energy absorption is by means of a planar plasma region propagating toward the source of the microwave radiation. Such a planar propagating plasma can be spatially stabilized by a gas stream flowing in the same direction as the microwave radiation with a velocity equal to the plasma propagation velocity. A one-dimensional analysis of the microwave-heated planar propagating plasma in hydrogen gas was developed to predict maximum gas temperatures and propagation velocities. The governing electromagnetic and energy equations were numerically integrated with temperature-dependent thermodynamic properties of equilibrium hydrogen. The propagation velocity eigenvalue was solved for by means of an iterative technique. The temperature distribution in the gas, propagation velocities, and the percentages of power absorbed, reflected, and transmitted were obtained as a function of incident microwave power at a frequency of 2.45 GHz for hydrogen gas pressures of 1 and 10 atm. 19 references.

  8. New Analysis of Hydrogen and Deuterium Escape from Venus

    NASA Astrophysics Data System (ADS)

    Donahue, Thomas M.

    1999-10-01

    This paper is concerned with the time required for escape of hydrogen and deuterium to produce the present D/H ratio in Venus water, the sizes of the original hydrogen reservoirs and their sensitivity to the magnitude of the present escape fluxes, the characteristics of exogenous and endogenous hydrogen sources, and the D/H ratio for primordial Venus hydrogen. The procedure followed allowed the H escape flux to vary over a large range, the ratio of input to escape flux to vary from 0 to 1, and the fractionation factor, which expresses the relative efficiency of D and H escape, to vary between 0.02 and 0.5. It was found that, unless deuterium escape is very efficient, the present H escape flux (averaged over a solar cycle) cannot be larger than about 10^7 cm^-2 s^-1 if today's water is to be the remnant of water deposited eons ago. On the other hand, if the escape flux is as large as 3×10^7 cm^-2 s^-1, today's water would be the remnant of water outgassed only about 500 million years ago. These conclusions are relatively insensitive to factors other than the magnitude of the escape flux. Since recent analysis of escape fluxes indicates that the H escape flux may be in the neighborhood of 3×10^7 cm^-2 s^-1 and the fractionation factor may be 0.14 or larger, the suggestion of Grinspoon (1993, Nature 363, 1702-1704) that the water now on Venus was created during a recent massive resurfacing event is credible. However, since it is still possible that the average escape flux is as small as 7×10^6 cm^-2 s^-1, the choice between 4 and 0.5 Gyr must await a resolution of this conflict by reanalysis of Pioneer Venus Lyman α data (Paxton, L., D. E. Anderson, and A. I. F. Stewart 1988, J. Geophys. Res. 93, 1766-1772).
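The sensitivity of the inferred timescale to the escape flux can be illustrated with a back-of-the-envelope lifetime estimate, time = reservoir / flux. The reservoir size below is an illustrative assumption chosen only to show the scaling with the fluxes quoted in the abstract; the paper's actual analysis also accounts for resupply and D/H fractionation:

```python
# Hedged back-of-the-envelope sketch: reservoir lifetime vs escape flux.
# N_H is an assumed, illustrative hydrogen column reservoir, not a value
# from the paper.
N_H = 1.3e24              # assumed reservoir (H atoms cm^-2)
SECONDS_PER_GYR = 3.15576e16

fluxes = (7e6, 1e7, 3e7)  # escape fluxes discussed in the abstract (cm^-2 s^-1)
lifetimes = {phi: N_H / phi / SECONDS_PER_GYR for phi in fluxes}
for phi, t_gyr in lifetimes.items():
    print(f"flux {phi:.0e} cm^-2 s^-1 -> reservoir depleted in ~{t_gyr:.1f} Gyr")
```

The inverse scaling is the point: a factor-of-four uncertainty in the flux translates directly into a factor-of-four uncertainty in the age of the water, which is why the abstract's 4 Gyr versus 0.5 Gyr question hinges on the Lyman α reanalysis.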

  9. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    PubMed

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pair long) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software tools (AmphoraNet, a webserver implementation of AMPHORA2; MG-RAST; and MEGAN5) for their capability of assigning quantitative phylogenetic information to the data, describing the frequency of appearance of microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some software assigns higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic software tool to this genome length bias. Therefore, we have made a simple benchmark for the evaluation of "taxon counting" in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads with an average length of 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software tool fails on this simple task, it will surely fail on most real metagenomes. We applied the three tools to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We have found that AMPHORA2/AmphoraNet gave the most accurate results and the other two software were under
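The benchmark construction described above can be sketched directly; the genome names and lengths are illustrative assumptions. Naive read counting then makes the genome length bias visible: with equal copy numbers, read proportions track genome length rather than the ideal 1/3 per taxon:

```python
# Hedged sketch: a simple taxon-counting benchmark showing genome length
# bias. Genome names and lengths are illustrative assumptions.
import random

random.seed(0)
genomes = {"taxonA": 2_000_000, "taxonB": 4_000_000, "taxonC": 6_000_000}
read_len, copies = 150, 5

# Shear each genome (same number of copies of each) into ~150 bp reads;
# we only track which taxon each read came from.
reads = []
for taxon, length in genomes.items():
    n_reads = copies * length // read_len
    reads.extend([taxon] * n_reads)
random.shuffle(reads)

# An ideal quantitative tool should report 1/3 per taxon (equal copies);
# naive read counting instead reports proportions of genome length.
counts = {t: reads.count(t) for t in genomes}
total = len(reads)
proportions = {t: counts[t] / total for t in genomes}
print(proportions)
```

Here the longest genome contributes half of all reads despite being present in the same copy number, which is exactly the bias the benchmark is built to expose.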

  10. Rapid inorganic ion analysis using quantitative microchip capillary electrophoresis.

    PubMed

    Vrouwe, Elwin X; Luttge, Regina; Olthuis, Wouter; van den Berg, Albert

    2006-01-13

    Rapid quantitative microchip capillary electrophoresis (CE) for online monitoring of drinking water, enabling inorganic ion separation in less than 15 s, is presented. When cationic and anionic standards at different concentrations were compared, the analysis of cationic species resulted in non-linear calibration curves. We interpret this effect as a variation in the volume of the injected sample plug caused by changes in the electroosmotic flow (EOF) due to the strong interaction of bivalent cations with the glass surface. This explanation is supported by the observation of severe peak tailing. For microchip CE analysis in a glass microchannel, optimized conditions are obtained for the cationic species K+, Na+, Ca2+, and Mg2+ using a background electrolyte consisting of 30 mmol/L histidine and 2-(N-morpholino)ethanesulfonic acid, containing 0.5 mmol/L potassium chloride to reduce surface interaction and 4 mmol/L tartaric acid as a complexing agent, resulting in a pH value of 5.8. Applying reversed-EOF co-migration for the anionic species Cl-, SO42-, and HCO3-, optimized separation occurs in a background electrolyte consisting of 10 mmol/L 4-(2-hydroxyethyl)-1-piperazineethanesulfonic acid (HEPES) and 10 mmol/L HEPES sodium salt, containing 0.05 mmol/L CTAB (cetyltrimethylammonium bromide), resulting in a pH value of 7.5. The detection limits are 20 micromol/L for the monovalent cationic and anionic species and 10 micromol/L for the divalent species. These values make the method very suitable for many applications, including the analysis of abundant ions in tap water as demonstrated in this paper. PMID:16310794
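Detection limits like those quoted above are commonly derived from a linear calibration curve, e.g. as 3σ of the fit residuals divided by the slope. A minimal sketch with synthetic peak-area data; the concentrations, areas, and the 3σ convention are illustrative assumptions, not the paper's calibration data:

```python
# Hedged sketch: linear calibration curve and a 3-sigma detection limit
# for one ionic species. The data points are synthetic assumptions.
import numpy as np

conc = np.array([0.0, 25.0, 50.0, 100.0, 200.0])   # micromol/L, assumed
area = np.array([0.2, 5.3, 10.1, 20.4, 40.1])      # peak areas, assumed

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)        # residual std, 2 fitted parameters

lod = 3.0 * sigma / slope            # 3-sigma limit of detection
r = np.corrcoef(conc, area)[0, 1]
print(f"slope={slope:.3f}, r={r:.4f}, LOD={lod:.1f} micromol/L")
```

A strongly curved calibration (as reported above for bivalent cations) would show up here as structured residuals and a degraded correlation coefficient, flagging that a straight-line fit is inappropriate.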

  11. The Measles Vaccination Narrative in Twitter: A Quantitative Analysis

    PubMed Central

    Radzikowski, Jacek; Jacobsen, Kathryn H; Croitoru, Arie; Crooks, Andrew; Delamater, Paul L

    2016-01-01

    Background The emergence of social media is providing an alternative avenue for information exchange and opinion formation on health-related issues. Collective discourse in such media leads to the formation of a complex narrative, conveying public views and perceptions. Objective This paper presents a study of the Twitter narrative regarding vaccination in the aftermath of the 2015 measles outbreak, in terms of both its cyber and physical characteristics. We aimed to contribute to the analysis of the data, as well as to present a quantitative interdisciplinary approach to analyzing such open-source data in the context of health narratives. Methods We collected 669,136 tweets referring to vaccination from February 1 to March 9, 2015. These tweets were analyzed to identify key terms, connections among such terms, retweet patterns, the structure of the narrative, and connections to the geographical space. Results The data analysis captures the anatomy of the themes and relations that make up the discussion about vaccination in Twitter. The results highlight the higher impact of stories contributed by news organizations compared to direct tweets by health organizations in communicating health-related information. They also capture the structure of the antivaccination narrative and its terms of reference. The analysis also revealed the relationship between community engagement in Twitter and state policies regarding child vaccination. Residents of Vermont and Oregon, the two states with the highest rates of non-medical exemption from school-entry vaccines nationwide, are leading the social media discussion in terms of participation. Conclusions The interdisciplinary study of health-related debates in social media across the cyber-physical debate nexus leads to a greater understanding of public concerns, views, and responses to health-related issues. Further coalescing such capabilities shows promise towards advancing health communication, thus supporting the design of more
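The key-term and term-connection analysis described in the Methods can be sketched as simple term and co-occurrence counting over tweets; the toy tweets and term list below are illustrative assumptions, not the study's data:

```python
# Hedged sketch: key-term frequency and term co-occurrence counting, the
# kind of step that underlies a narrative analysis. Toy data only.
from collections import Counter
from itertools import combinations

tweets = [
    "measles outbreak fuels vaccination debate",
    "vaccination exemption rates rise in some states",
    "news story on measles vaccination and exemption policy",
]
terms = {"measles", "vaccination", "exemption", "outbreak"}

term_counts = Counter()
cooccurrence = Counter()
for tweet in tweets:
    present = sorted(terms & set(tweet.split()))   # key terms in this tweet
    term_counts.update(present)
    cooccurrence.update(combinations(present, 2))  # pairs seen together

print(term_counts.most_common(2))
print(cooccurrence.most_common(1))
```

Aggregated over hundreds of thousands of tweets, such pair counts define the weighted term-connection graph whose structure the paper analyzes.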

  12. Nanotechnology patents in the automotive industry (a quantitative & qualitative analysis).

    PubMed

    Prasad, Raghavendra; Bandyopadhyay, Tapas K

    2014-01-01

    The aim of this article is to present trends in patent filings for the application of nanotechnology to the automobile sector across the world, using keyword-based patent searching. An overview of the patents related to nanotechnology in the automobile industry is provided. The current work started with a worldwide patent search to find patents on nanotechnology in the automobile industry and to classify them according to the automobile parts to which they relate and the solutions they provide. Various graphs were then produced to give insight into the trends, and the patents in the various classifications were analyzed. The trends shown in the graphs provide the quantitative analysis, whereas the qualitative analysis is presented in another section. The classification of patents based on the solution they provide was performed by reading the claims, titles, abstracts, and full texts separately. The patentability of nanotechnology inventions is discussed with a view to giving an idea of the requirements and statutory bars to the patentability of such inventions. Another objective of the current work is to suggest an appropriate framework for companies regarding the use of nanotechnology in the automobile industry and a suggested strategy for patenting the related inventions. For example, the US patent with publication number US2008-019426A1 discusses an invention related to a lubricant composition. This patent was studied and classified as falling under the classification of automobile parts. After studying this patent, it was deduced that it addresses the problem of friction in the engine. One classification is based on the automobile part, while the other is based on the problem being solved. Hence, two classifications, namely reduction in friction and engine, were created. Similarly, after studying all the patents, a similar matrix has been created

  13. Techno Economic Analysis of Hydrogen Production by gasification of biomass

    SciTech Connect

    Francis Lau

    2002-12-01

    Biomass represents a large potential feedstock resource for environmentally clean processes that produce power or chemicals. It lends itself to both biological and thermal conversion processes, and both options are currently being explored. Hydrogen can be produced in a variety of ways. The majority of the hydrogen produced in this country is produced through natural gas reforming and is used as chemical feedstock in refinery operations. In this report we examine the production of hydrogen by gasification of biomass. Biomass is defined as organic matter that is available on a renewable basis through natural processes or as a by-product of processes that use renewable resources. The majority of biomass is used in combustion processes, in mills that use the renewable resources, to produce electricity for end-use product generation. This report explores the use of hydrogen as a fuel derived from gasification of three candidate biomass feedstocks: bagasse, switchgrass, and a nutshell mix that consists of 40% almond nutshell, 40% almond prunings, and 20% walnut shell. In this report, the technical and economic potential of producing hydrogen from biomass gasification is assessed. The resource base was assessed to determine a process scale from feedstock costs and availability. Solids handling systems were researched. A GTI proprietary gasifier model was used in combination with a Hysys® design and simulation program to determine the amount of hydrogen that can be produced from each candidate biomass feed. Cost estimates were developed, and government programs and incentives were analyzed. Finally, the barriers to the production and commercialization of hydrogen from biomass were determined. The end use of the hydrogen produced from this system is small PEM fuel cells for automobiles. Pyrolysis of biomass was also considered. Pyrolysis is a reaction in which biomass or coal is partially vaporized by heating. Gasification is a more

  14. Teaching Quantitative Literacy through a Regression Analysis of Exam Performance

    ERIC Educational Resources Information Center

    Lindner, Andrew M.

    2012-01-01

    Quantitative literacy is increasingly essential for both informed citizenship and a variety of careers. Though regression is one of the most common methods in quantitative sociology, it is rarely taught until late in students' college careers. In this article, the author describes a classroom-based activity introducing students to regression…

  15. Separation and quantitative analysis of alkyl sulfate ethoxymers by HPLC.

    PubMed

    Morvan, Julien; Hubert-Roux, Marie; Agasse, Valérie; Cardinael, Pascal; Barbot, Florence; Decock, Gautier; Bouillon, Jean-Philippe

    2008-01-01

    The separation of alkyl sulfate ethoxymers is investigated on various high-performance liquid chromatography (HPLC) stationary phases: Acclaim C18 Surfactant, Surfactant C8, and Hypercarb. For a fixed alkyl chain length, ethoxymers are eluted in order of increasing number of ethoxylated units on Acclaim C18 Surfactant, whereas the reverse elution order is observed on Surfactant C8 and Hypercarb. Moreover, on an Acclaim C18 Surfactant column, non-ethoxylated compounds are eluted within their ethoxymer distribution, and the use of a sodium acetate additive in the mobile phase leads to co-elution of the ethoxymers. HPLC stationary phases dedicated to surfactant analysis are evaluated by means of the Tanaka test. Surfactant C8 presents high silanol activity, whereas Acclaim C18 Surfactant shows high steric selectivity. For alkyl sulfates, the linearity of the calibration curve and the limits of detection and quantitation are evaluated. The amount of sodium laureth sulfate raw material found in a commercial body product is in agreement with the specification of the manufacturer. PMID:19007494

  16. Copulation patterns in captive hamadryas baboons: a quantitative analysis.

    PubMed

    Nitsch, Florian; Stueckle, Sabine; Stahl, Daniel; Zinner, Dietmar

    2011-10-01

    For primates, as for many other vertebrates, copulation that results in ejaculation is a prerequisite for reproduction. The probability of ejaculation is affected by various physiological and social factors, for example the reproductive states of the male and female and the operational sex ratio. In this paper, we present quantitative and qualitative data on patterns of sexual behaviour in a captive group of hamadryas baboons (Papio hamadryas), a species with a polygynous-monandric mating system. We observed more than 700 copulations and analysed factors that can affect the probability of ejaculation. Multilevel logistic regression analysis and Akaike's information criterion (AIC) model selection procedures revealed that the probability of successful copulation increased as the size of female sexual swellings increased, indicating increased probability of ovulation, and as the number of females per one-male unit (OMU) decreased. In contrast, the occurrence of female copulation calls, the sex of the copulation initiator, and previous male aggression toward females did not affect the probability of ejaculation. Synchrony of oestrus cycles also had no effect (most likely because the sample size was too small). We also observed 29 extra-group copulations by two non-adult males. Our results indicate that male hamadryas baboons copulated more successfully around the time of ovulation and that males in large OMUs with many females may be confronted by time- or energy-allocation problems. PMID:21710159
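The AIC-based model comparison described above can be sketched with a toy logistic model. The paper used multilevel models; this flat single-predictor version fitted by plain gradient ascent, and the synthetic data, are simplifying assumptions:

```python
# Hedged sketch: comparing a null model against a swelling-size model by
# AIC. Toy data and a flat (non-multilevel) logistic fit; assumptions only.
import numpy as np

rng = np.random.default_rng(1)
n = 400
swelling = rng.uniform(0, 1, n)                    # standardized swelling size
p_true = 1 / (1 + np.exp(-(-1.0 + 3.0 * swelling)))
success = rng.binomial(1, p_true)                  # 1 = copulation ended in ejaculation

def fit_logistic(features, y, steps=5000, lr=0.1):
    """Maximum-likelihood logistic fit by gradient ascent; returns (beta, logL)."""
    X = np.column_stack([np.ones(len(y))] + list(features))
    beta = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-X @ beta))
        beta += lr * X.T @ (y - p) / len(y)
    p = np.clip(1 / (1 + np.exp(-X @ beta)), 1e-12, 1 - 1e-12)
    return beta, np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

_, ll_null = fit_logistic([], success)             # intercept only, k = 1
_, ll_swell = fit_logistic([swelling], success)    # + swelling size, k = 2
aic_null = 2 * 1 - 2 * ll_null
aic_swell = 2 * 2 - 2 * ll_swell
print(f"AIC null={aic_null:.1f}  AIC swelling={aic_swell:.1f}")
```

The lower AIC for the swelling model reflects that its improved likelihood outweighs the penalty for the extra parameter, which is the criterion the study's model selection rests on.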

  17. Quantitative analysis of protein dynamics during asymmetric cell division.

    PubMed

    Mayer, Bernd; Emery, Gregory; Berdnik, Daniela; Wirtz-Peitz, Frederik; Knoblich, Juergen A

    2005-10-25

    In dividing Drosophila sensory organ precursor (SOP) cells, the fate determinant Numb and its associated adaptor protein Pon localize asymmetrically and segregate into the anterior daughter cell, where Numb influences cell fate by repressing Notch signaling. Asymmetric localization of both proteins requires the protein kinase aPKC and its substrate Lethal (2) giant larvae (Lgl). Because both Numb and Pon localization require actin and myosin, lateral transport along the cell cortex has been proposed as a possible mechanism for their asymmetric distribution. Here, we use quantitative live analysis of GFP-Pon and Numb-GFP fluorescence and fluorescence recovery after photobleaching (FRAP) to characterize the dynamics of Numb and Pon localization during SOP division. We demonstrate that Numb and Pon rapidly exchange between a cytoplasmic pool and the cell cortex and that preferential recruitment from the cytoplasm is responsible for their asymmetric distribution during mitosis. Expression of a constitutively active form of aPKC impairs membrane recruitment of GFP-Pon. This defect can be rescued by coexpression of nonphosphorylatable Lgl, indicating that Lgl is the main target of aPKC. We propose that a high-affinity binding site is asymmetrically distributed by aPKC and Lgl and is responsible for asymmetric localization of cell-fate determinants during mitosis. PMID:16243032

  18. Hyperspectral imaging and quantitative analysis for prostate cancer detection

    PubMed Central

    Akbari, Hamed; Halig, Luma V.; Schuster, David M.; Osunkoya, Adeboye; Master, Viraj; Nieh, Peter T.; Chen, Georgia Z.

    2012-01-01

    Hyperspectral imaging (HSI) is an emerging modality for various medical applications. Its spectroscopic data may enable noninvasive detection of cancer. Quantitative analysis is often necessary in order to differentiate healthy from diseased tissue. We propose the use of an advanced image processing and classification method in order to analyze hyperspectral image data for prostate cancer detection. The spectral signatures were extracted and evaluated in both cancerous and normal tissue. Least squares support vector machines were developed and evaluated for classifying hyperspectral data in order to enhance the detection of cancer tissue. This method was used to detect prostate cancer in tumor-bearing mice and on pathology slides. Spatially resolved images were created to highlight the differences of the reflectance properties of cancer versus those of normal tissue. Preliminary results with 11 mice showed that the sensitivity and specificity of the hyperspectral image classification method are 92.8% ± 2.0% and 96.9% ± 1.3%, respectively. Therefore, this imaging method may be able to help physicians to dissect malignant regions with a safe margin and to evaluate the tumor bed after resection. This pilot study may lead to advances in the optical diagnosis of prostate cancer using HSI technology. PMID:22894488

  19. Quantitative Analysis of Cellular Metabolic Dissipative, Self-Organized Structures

    PubMed Central

    de la Fuente, Ildefonso Martínez

    2010-01-01

    One of the most important goals of the postgenomic era is understanding the metabolic dynamic processes and the functional structures generated by them. Extensive studies during the last three decades have shown that the dissipative self-organization of the functional enzymatic associations, the catalytic reactions produced during the metabolite channeling, the microcompartmentalization of these metabolic processes and the emergence of dissipative networks are the fundamental elements of the dynamical organization of cell metabolism. Here we present an overview of how mathematical models can be used to address the properties of dissipative metabolic structures at different organizational levels, both for individual enzymatic associations and for enzymatic networks. Recent analyses performed with dissipative metabolic networks have shown that unicellular organisms display a singular global enzymatic structure common to all living cellular organisms, which seems to be an intrinsic property of the functional metabolism as a whole. Mathematical models firmly based on experiments and their corresponding computational approaches are needed to fully grasp the molecular mechanisms of metabolic dynamical processes. They are necessary to enable the quantitative and qualitative analysis of the cellular catalytic reactions and also to help comprehend the conditions under which the structural dynamical phenomena and biological rhythms arise. Understanding the molecular mechanisms responsible for the metabolic dissipative structures is crucial for unraveling the dynamics of cellular life. PMID:20957111

  20. Hyperspectral imaging and quantitative analysis for prostate cancer detection

    NASA Astrophysics Data System (ADS)

    Akbari, Hamed; Halig, Luma V.; Schuster, David M.; Osunkoya, Adeboye; Master, Viraj; Nieh, Peter T.; Chen, Georgia Z.; Fei, Baowei

    2012-07-01

    Hyperspectral imaging (HSI) is an emerging modality for various medical applications. Its spectroscopic data may enable noninvasive detection of cancer. Quantitative analysis is often necessary in order to differentiate healthy from diseased tissue. We propose the use of an advanced image processing and classification method in order to analyze hyperspectral image data for prostate cancer detection. The spectral signatures were extracted and evaluated in both cancerous and normal tissue. Least squares support vector machines were developed and evaluated for classifying hyperspectral data in order to enhance the detection of cancer tissue. This method was used to detect prostate cancer in tumor-bearing mice and on pathology slides. Spatially resolved images were created to highlight the differences of the reflectance properties of cancer versus those of normal tissue. Preliminary results with 11 mice showed that the sensitivity and specificity of the hyperspectral image classification method are 92.8% ± 2.0% and 96.9% ± 1.3%, respectively. Therefore, this imaging method may be able to help physicians to dissect malignant regions with a safe margin and to evaluate the tumor bed after resection. This pilot study may lead to advances in the optical diagnosis of prostate cancer using HSI technology.

  1. Quantitative phase imaging applied to laser damage detection and analysis.

    PubMed

    Douti, Dam-Bé L; Chrayteh, Mhamad; Aknoun, Sherazade; Doualle, Thomas; Hecquet, Christophe; Monneret, Serge; Gallais, Laurent

    2015-10-01

    We investigate phase imaging as a measurement method for laser damage detection and for the analysis of laser-induced modification of optical materials. Experiments have been conducted with a wavefront sensor based on lateral shearing interferometry associated with a high-magnification optical microscope. The system has been used for the in-line observation of optical thin films and bulk samples, laser irradiated under two different conditions: 500 fs pulses at 343 and 1030 nm, and millisecond-to-second irradiation with a CO2 laser at 10.6 μm. We investigate the measurement of the laser-induced damage threshold of optical materials by the detection of phase changes and show that the technique achieves high sensitivity, with optical path difference measurements lower than 1 nm. Additionally, the quantitative information on the refractive index or surface modification of the samples under test that is provided by the system has been compared to classical metrology instruments used for laser damage or laser ablation characterization (an atomic force microscope, a differential interference contrast microscope, and an optical surface profiler). An accurate in-line measurement of the morphology of laser-ablated sites, from a few nanometers to a hundred microns in depth, is shown. PMID:26479612

  2. Quantitative SERS sensors for environmental analysis of naphthalene.

    PubMed

    Péron, O; Rinnert, E; Toury, T; Lamy de la Chapelle, M; Compère, C

    2011-03-01

    In the investigation of chemical pollutants such as PAHs (polycyclic aromatic hydrocarbons) at low concentration in aqueous media, Surface-Enhanced Raman Scattering (SERS) offers an alternative that compensates for the inherently low cross-section of normal Raman scattering. Indeed, SERS is a very sensitive spectroscopic technique owing to the excitation of the surface plasmon modes of a nanostructured metallic film. The surface of quartz substrates was coated with a hydrophobic film obtained by silanization and subsequently reacted with polystyrene (PS) beads coated with gold nanoparticles. The hydrophobic surface of the SERS substrates pre-concentrates non-polar molecules such as naphthalene. Under laser excitation, the SERS-active substrates allow the detection and identification of target molecules localized close to the gold nanoparticles. The morphology of the SERS substrates based on polystyrene beads surrounded by gold nanoparticles was characterized by scanning electron microscopy (SEM). Furthermore, the Raman fingerprint of the polystyrene serves as an internal spectral reference. On this basis, an innovative method to detect and quantify organic molecules such as naphthalene, in the range of 1 to 20 ppm in aqueous media, was carried out. Such SERS-active substrates tend towards an application as quantitative SERS sensors for the environmental analysis of naphthalene. PMID:21165476

  3. Quantitative analysis of flagellar proteins in Drosophila sperm tails.

    PubMed

    Mendes Maia, Teresa; Paul-Gilloteaux, Perrine; Basto, Renata

    2015-01-01

    The cilium has a well-defined structure, which can still accommodate some diversity in morphology and molecular composition to suit the functional requirements of different cell types. The sperm flagellum of the fruit fly Drosophila melanogaster is a good model for studying the genetic regulation of axoneme assembly and motility, due to the wealth of genetic tools publicly available for this organism. In addition, the fruit fly's sperm flagellum displays quite a long axoneme (∼1.8 mm), which may facilitate both histological and biochemical analyses. Here, we present a protocol for imaging and quantitatively analyzing proteins that associate with differentiating and mature fly sperm flagella. As an example, we use the quantification of tubulin polyglycylation in wild-type testes and in Bug22 mutant testes, which present defects in the deposition of this posttranslational modification. During sperm biogenesis, flagella appear tightly bundled, which makes it challenging to obtain accurate measurements of protein levels from immunostained specimens. The method we present is based on a novel semiautomated macro installed in the image processing software ImageJ. It allows fluorescence levels to be measured in closely associated sperm tails, through an exact distinction between positive and background signals, and provides background-corrected pixel intensity values that can be used directly for data analysis. PMID:25837396
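    The background-corrected intensities produced by such a macro come down to simple arithmetic on masked pixels. The sketch below illustrates that arithmetic on a synthetic image; it is not the ImageJ macro itself, and the image values are invented.

```python
import numpy as np

def corrected_intensity(img, signal_mask):
    """Mean pixel intensity inside a mask minus the median of the
    surrounding background (generic background correction)."""
    background = np.median(img[~signal_mask])
    return img[signal_mask].mean() - background

# Synthetic image: flat background of 10 with a bright 'flagellum' stripe of 60
img = np.full((64, 64), 10.0)
mask = np.zeros((64, 64), bool)
mask[30:34, :] = True
img[mask] = 60.0
```

Using the background median rather than the mean makes the correction robust to a few bright pixels from neighbouring, closely bundled tails.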

  4. Quantitative Financial Analysis of Alternative Energy Efficiency Shareholder Incentive Mechanisms

    SciTech Connect

    Cappers, Peter; Goldman, Charles; Chait, Michele; Edgar, George; Schlegel, Jeff; Shirley, Wayne

    2008-08-03

    Rising energy prices and climate change are central issues in the debate about our nation's energy policy. Many are demanding increased energy efficiency as a way to help reduce greenhouse gas emissions and lower the total cost of electricity and energy services for consumers and businesses. Yet, as the National Action Plan on Energy Efficiency (NAPEE) pointed out, many utilities continue to shy away from seriously expanding their energy efficiency program offerings because they claim there is insufficient profit motivation, or even a financial disincentive, when compared to supply-side investments. With the recent introduction of Duke Energy's Save-a-Watt incentive mechanism and ongoing discussions about decoupling, regulators and policymakers are now faced with an expanded and diverse landscape of financial incentive mechanisms. Determining the 'right' way forward to promote deep and sustainable demand-side resource programs is challenging. Due to the renaissance that energy efficiency is currently experiencing, many want to better understand the tradeoffs in stakeholder benefits between these alternative incentive structures before aggressively embarking on a path for which course corrections can be time-consuming and costly. Using a prototypical Southwest utility and a publicly available financial model, we show how various stakeholders (e.g. shareholders, ratepayers, etc.) are affected by these different types of shareholder incentive mechanisms under varying assumptions about program portfolios. This quantitative analysis compares the financial consequences associated with a wide range of alternative incentive structures. The results will help regulators and policymakers better understand the financial implications of DSR program incentive regulation.

  5. Quantitative analysis of biomedical samples using synchrotron radiation microbeams

    NASA Astrophysics Data System (ADS)

    Ektessabi, Ali; Shikine, Shunsuke; Yoshida, Sohei

    2001-07-01

    X-ray fluorescence (XRF) using a synchrotron radiation (SR) microbeam was applied to investigate distributions and concentrations of elements in single neurons of patients with neurodegenerative diseases. In this paper we introduce a computer code that has been developed to quantify the trace elements and matrix elements at the single cell level. This computer code has been used in studies of several important neurodegenerative diseases such as Alzheimer's disease (AD), Parkinson's disease (PD) and parkinsonism-dementia complex (PDC), as well as in basic biological experiments to determine the elemental changes in cells due to incorporation of foreign metal elements. The substantia nigra (SN) tissue obtained from autopsy specimens of patients with Guamanian parkinsonism-dementia complex (PDC) and control cases was examined. Quantitative XRF analysis showed that neuromelanin granules of parkinsonian SN contained higher levels of Fe than those of the control; the concentrations were in the ranges of 2300-3100 ppm and 2000-2400 ppm, respectively. In contrast, Zn and Ni in neuromelanin granules of SN tissue from the PDC case were lower than in the control. In particular, Zn was less than 40 ppm in SN tissue from the PDC case, while it was 560-810 ppm in the control. These changes are considered to be closely related to the neurodegeneration and cell death.

  6. Hyperspectral imaging and quantitative analysis for prostate cancer detection.

    PubMed

    Akbari, Hamed; Halig, Luma V; Schuster, David M; Osunkoya, Adeboye; Master, Viraj; Nieh, Peter T; Chen, Georgia Z; Fei, Baowei

    2012-07-01

    Hyperspectral imaging (HSI) is an emerging modality for various medical applications. Its spectroscopic data may enable noninvasive detection of cancer. Quantitative analysis is often necessary in order to differentiate healthy from diseased tissue. We propose the use of an advanced image processing and classification method in order to analyze hyperspectral image data for prostate cancer detection. The spectral signatures were extracted and evaluated in both cancerous and normal tissue. Least squares support vector machines were developed and evaluated for classifying hyperspectral data in order to enhance the detection of cancer tissue. This method was used to detect prostate cancer in tumor-bearing mice and on pathology slides. Spatially resolved images were created to highlight the differences of the reflectance properties of cancer versus those of normal tissue. Preliminary results with 11 mice showed that the sensitivity and specificity of the hyperspectral image classification method are 92.8% ± 2.0% and 96.9% ± 1.3%, respectively. Therefore, this imaging method may be able to help physicians to dissect malignant regions with a safe margin and to evaluate the tumor bed after resection. This pilot study may lead to advances in the optical diagnosis of prostate cancer using HSI technology. PMID:22894488
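    A least-squares SVM, the classifier named in this record, differs from a standard SVM in that training reduces to solving a single linear system rather than a quadratic program. Below is a small numpy sketch on toy two-class data; the kernel width, regularization value, and data are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

def rbf(A, B, sigma=1.0):
    # Gaussian kernel matrix between row-vector sets A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    # Solve the LS-SVM system  [[0, 1^T], [1, K + I/gamma]] [b; a] = [0; y]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y.astype(float)]))
    return sol[0], sol[1:]          # bias b, dual weights alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    return np.sign(rbf(X_new, X_train, sigma) @ alpha + b)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
y = np.array([1.0] * 20 + [-1.0] * 20)   # toy labels: 'cancer' vs 'normal'
b, alpha = lssvm_fit(X, y)
pred = lssvm_predict(X, b, alpha, X)
```

In an HSI setting, each row of X would be a pixel's spectral signature rather than a 2-D toy point.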

  7. Quantitative image analysis of HIV-1 infection in lymphoid tissue

    SciTech Connect

    Haase, A.T.; Zupancic, M.; Cavert, W.

    1996-11-08

    Tracking human immunodeficiency virus-type 1 (HIV-1) infection at the cellular level in tissue reservoirs provides opportunities to better understand the pathogenesis of infection and to rationally design and monitor therapy. A quantitative technique was developed to determine viral burden in two important cellular compartments in lymphoid tissues. Image analysis and in situ hybridization were combined to show that in the presymptomatic stages of infection there is a large, relatively stable pool of virions on the surfaces of follicular dendritic cells and a smaller pool of productively infected cells. Despite evidence of constraints on HIV-1 replication in the infected cell population in lymphoid tissues, estimates of the numbers of these cells and the virus they could produce are consistent with the quantities of virus that have been detected in the bloodstream. The cellular sources of virus production and storage in lymphoid tissues can now be studied with this approach over the course of infection and treatment. 22 refs., 2 figs., 2 tabs.

  8. Quantitative trait locus analysis for hemostasis and thrombosis

    PubMed Central

    Sa, Qila; Hart, Erika; Hill, Annie E.; Nadeau, Joseph H.

    2009-01-01

    Susceptibility to thrombosis varies in human populations as well as in many inbred mouse strains. The objective of this study was to characterize the genetic control of thrombotic risk on three chromosomes. Previously, utilizing a tail-bleeding/rebleeding assay as a surrogate of hemostasis and thrombosis function, three mouse chromosome substitution strains (CSS) (B6-Chr5A/J, Chr11A/J, Chr17A/J) were identified (Hmtb1, Hmtb2, Hmtb3). The tail-bleeding/rebleeding assay is widely used and distinguishes mice with genetic defects in blood clot formation or dissolution. In the present study, quantitative trait locus (QTL) analysis revealed a significant locus for rebleeding (clot stability) time (time between cessation of initial bleeding and start of the second bleeding) on chromosome 5, a suggestive locus for bleeding time (time between start of bleeding and cessation of bleeding) also on chromosome 5, two suggestive loci for clot stability on chromosome 17, and one on chromosome 11. The three CSS and the parent A/J had elevated clot stability times. There was no interaction of genes on chromosome 11 with genes on chromosome 5 or chromosome 17. On chromosome 17, twenty-three candidate genes were identified in synteny with previously identified loci for thrombotic risk on human chromosome 18. Thus, we have identified new QTLs and candidate genes not previously known to influence thrombotic risk. PMID:18787898

  9. Communication about vaccinations in Italian websites: a quantitative analysis.

    PubMed

    Tafuri, Silvio; Gallone, Maria S; Gallone, Maria F; Zorico, Ivan; Aiello, Valeria; Germinario, Cinzia

    2014-01-01

    Babies' parents and people who look for information about vaccination often visit anti-vaccine movement websites, or blogs by naturopathic physicians or natural and alternative medicine practitioners. The aim of this work is to provide a quantitative analysis of the type of information available to Italian people regarding vaccination and a quality analysis of the websites retrieved through our searches. A quality score was created to evaluate the technical level of the websites. A search was performed through Yahoo, Google, and MSN using the keywords "vaccine" and "vaccination," combined with the function "OR" in order to identify the most frequently used websites. The two keywords were input in Italian, and the first 15 pages retrieved by each search engine were analyzed. 149 websites were selected through this methodology. Fifty-three percent of the websites belonged to associations, groups, or scientific companies, 32.2% (n = 48) consisted of personal blogs, and 14.8% (n = 22) belonged to National Health System offices. Among all analyzed websites, 15.4% (n = 23) came from anti-vaccine movement groups. 37.6% reported the webmaster's name, 67.8% the webmaster's e-mail, 28.6% indicated the date of the last update, and 46.6% the author's name. The quality score for government sites was higher on average than for anti-vaccine websites, although government sites do not use Web 2.0 functions such as forums. National Health System institutions that have to promote vaccination cannot avoid investing in web communication, which cannot be managed by private efforts alone but must be the result of synergy among Public Health institutions, private and scientific associations, and social movements. PMID:24607988

  10. Quantitative Analysis of Human Cancer Cell Extravasation Using Intravital Imaging.

    PubMed

    Willetts, Lian; Bond, David; Stoletov, Konstantin; Lewis, John D

    2016-01-01

    Metastasis, or the spread of cancer cells from a primary tumor to distant sites, is the leading cause of cancer-associated death. Metastasis is a complex multi-step process comprised of invasion, intravasation, survival in circulation, extravasation, and formation of metastatic colonies. Currently, in vitro assays are limited in their ability to investigate these intricate processes and do not faithfully reflect metastasis as it occurs in vivo. Traditional in vivo models of metastasis are limited in their ability to visualize the seemingly sporadic behavior of where and when cancer cells spread (Reymond et al., Nat Rev Cancer 13:858-870, 2013). The avian embryo model of metastasis is a powerful platform to study many of the critical steps in the metastatic cascade, including the migration, extravasation, and invasion of human cancer cells in vivo (Sung et al., Nat Commun 6:7164, 2015; Leong et al., Cell Rep 8:1558-1570, 2014; Kain et al., Dev Dyn 243:216-28, 2014; Leong et al., Nat Protoc 5:1406-17, 2010; Zijlstra et al., Cancer Cell 13:221-234, 2008; Palmer et al., J Vis Exp 51:2815, 2011). The chicken chorioallantoic membrane (CAM) is a readily accessible and well-vascularized tissue that surrounds the developing embryo. When the chicken embryo is grown in a shell-less, ex ovo environment, the nearly transparent CAM provides an ideal environment for high-resolution fluorescence microscopy approaches. In this model, the embryonic chicken vasculature and labeled cancer cells can be visualized simultaneously to investigate specific steps in the metastatic cascade, including extravasation. When combined with the proper image analysis tools, the ex ovo chicken embryo model offers a cost-effective and high-throughput platform for the quantitative analysis of tumor cell metastasis in a physiologically relevant in vivo setting. Here we discuss detailed procedures to quantify cancer cell extravasation in the shell-less chicken embryo model with advanced fluorescence

  11. Spectroscopic investigation and hydrogen-bonding analysis of triazinones.

    PubMed

    Dhas, Devadhas Arul; Joe, Isaac Hubert; Roy, Solomon Dawn Dharma; Balachandran, Sreedharan

    2012-08-01

    NIR FT-Raman, FTIR and UV-vis spectra of the herbicide metamitron were recorded and analyzed. The aromaticities, equilibrium geometries, bonding features, electrostatic potentials, and harmonic vibrational wavenumbers of the monomers and dimers of triazinone derivatives were also investigated with the aid of BLYP/6-311G(df,p) density functional theory. Features in the vibrational spectra were assigned with the aid of the VEDA 4 program. The calculated results were a good match to the experimental data obtained from FTIR, Raman, and electronic absorption spectra. Mulliken population analysis was performed to obtain the atomic charges, and the HOMO-LUMO energies were also calculated. NBO analysis highlighted the intra- and intermolecular N-H…O and C-H…O hydrogen bonds in the crystal structures of the triazinones. The solvent effect was calculated using time-dependent density functional theory in combination with the polarizable continuum model. PMID:22350295

  12. Quantitative analysis of LISA pathfinder test-mass noise

    NASA Astrophysics Data System (ADS)

    Ferraioli, Luigi; Congedo, Giuseppe; Hueller, Mauro; Vitale, Stefano; Hewitson, Martin; Nofrarias, Miquel; Armano, Michele

    2011-12-01

    LISA Pathfinder (LPF) is a mission aiming to test the critical technology for the forthcoming space-based gravitational-wave detectors. The main scientific objective of the LPF mission is to demonstrate test masses free falling with residual accelerations below 3×10^-14 m s^-2/√Hz at 1 mHz. Reaching such an ambitious target will require a significant amount of system optimization and characterization, which will in turn require accurate and quantitative noise analysis procedures. In this paper, we discuss two main problems associated with the analysis of the data from LPF: (i) excess noise detection and (ii) noise parameter identification. The mission is focused on the low-frequency region ([0.1, 10] mHz) of the available signal spectrum. In such a region, the signal is dominated by the force noise acting on the test masses. At the same time, the mission duration is limited to 90 days and typical data segments will be 24 hours in length. Considering those constraints, noise analysis is expected to deal with a limited amount of non-Gaussian data, since the spectrum statistics will be far from Gaussian and the lowest available frequency is limited by the data length. In this paper, we analyze the details of the expected statistics for spectral data and develop two suitable excess noise estimators. One is based on the statistical properties of the integrated spectrum, the other is based on the Kolmogorov-Smirnov test. The sensitivity of the estimators is discussed theoretically for independent data, then the algorithms are tested on LPF synthetic data. The test on realistic LPF data allows the effect of spectral data correlations on the efficiency of the different noise excess estimators to be highlighted. It also reveals the versatility of the Kolmogorov-Smirnov approach, which can be adapted to provide reasonable results on correlated data from a modified version of the standard equations for the inversion of the test statistic.
Closely related to excess noise detection, the
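    The Kolmogorov-Smirnov idea used above rests on the fact that, for Gaussian noise, normalized periodogram bins follow an exponential distribution; departures from that law signal excess noise. The sketch below illustrates the principle on synthetic bins and is not the LPF estimator itself.

```python
import numpy as np
from scipy import stats

def excess_noise_test(psd_bins):
    """One-sample KS test of normalized periodogram bins against the
    unit-mean exponential law expected for pure Gaussian noise."""
    x = np.asarray(psd_bins, float)
    return stats.kstest(x / x.mean(), "expon")

rng = np.random.default_rng(2)
clean = rng.exponential(1.0, 2000)                     # noise-only spectrum
excess = np.concatenate([rng.exponential(1.0, 1000),
                         rng.exponential(9.0, 1000)])  # half the bins inflated
res_clean = excess_noise_test(clean)
res_excess = excess_noise_test(excess)
```

The clean spectrum yields a small KS statistic, while the contaminated one is rejected decisively; for correlated spectral data, as the abstract notes, the test statistic's inversion has to be modified.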

  13. Hydrogen analysis depth calibration by CORTEO Monte-Carlo simulation

    NASA Astrophysics Data System (ADS)

    Moser, M.; Reichart, P.; Bergmaier, A.; Greubel, C.; Schiettekatte, F.; Dollinger, G.

    2016-03-01

    Hydrogen imaging with sub-μm lateral resolution and sub-ppm sensitivity has become possible with coincident proton-proton (pp) scattering analysis (Reichart et al., 2004). Depth information is evaluated from the energy sum signal with respect to the energy loss of both protons on their path through the sample. To first order, there is no angular dependence due to elastic scattering. To second order, a path length effect due to different energy losses on the paths of the two protons causes an angular dependence of the energy sum. Therefore, the energy sum signal has to be de-convoluted depending on the matrix composition, i.e. mainly the atomic number Z, in order to get a depth-calibrated hydrogen profile. Although the path effect can be calculated analytically to first order, multiple scattering effects lead to significant deviations in the depth profile. Hence, in our new approach, we use the CORTEO Monte-Carlo code (Schiettekatte, 2008) in order to calculate the depth of a coincidence event depending on the scattering angle. The code takes the individual detector geometry into account. In this paper we show that the code correctly reproduces measured pp-scattering energy spectra with roughness effects considered. With more than 100 μm thick Mylar-sandwich targets (Si, Fe, Ge) we demonstrate the deconvolution of the energy spectra on our current multistrip detector at the microprobe SNAKE at the Munich tandem accelerator lab. As a result, hydrogen profiles can be evaluated with an accuracy in depth of about 1% of the sample thickness.

  14. Quantitative Analysis of the Effective Functional Structure in Yeast Glycolysis

    PubMed Central

    De la Fuente, Ildefonso M.; Cortes, Jesus M.

    2012-01-01

    The understanding of the effective functionality that governs the enzymatic self-organized processes in cellular conditions is a crucial topic in the post-genomic era. In recent studies, Transfer Entropy has been proposed as a rigorous, robust and self-consistent method for the causal quantification of the functional information flow among nonlinear processes. Here, in order to quantify the functional connectivity for the glycolytic enzymes in dissipative conditions we have analyzed different catalytic patterns using the technique of Transfer Entropy. The data were obtained by means of a yeast glycolytic model formed by three delay differential equations where the enzymatic rate equations of the irreversible stages have been explicitly considered. These enzymatic activity functions were previously modeled and tested experimentally by other different groups. The results show the emergence of a new kind of dynamical functional structure, characterized by changing connectivity flows and a metabolic invariant that constrains the activity of the irreversible enzymes. In addition to the classical topological structure characterized by the specific location of enzymes, substrates, products and feedback-regulatory metabolites, an effective functional structure emerges in the modeled glycolytic system, which is dynamical and characterized by notable variations of the functional interactions. The dynamical structure also exhibits a metabolic invariant which constrains the functional attributes of the enzymes. Finally, in accordance with the classical biochemical studies, our numerical analysis reveals in a quantitative manner that the enzyme phosphofructokinase is the key-core of the metabolic system, behaving for all conditions as the main source of the effective causal flows in yeast glycolysis. PMID:22393350
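    Transfer Entropy, the measure used in this study, quantifies how much the past of a series X improves prediction of a series Y beyond what Y's own past provides. Below is a minimal plug-in estimator for symbolic (here binary) series with history length 1; it is illustrative only, not the authors' implementation, and the test series are synthetic.

```python
import numpy as np
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Plug-in TE(X->Y) in bits, history length 1:
    sum over states of p(y1,y0,x0) * log2[ p(y1|y0,x0) / p(y1|y0) ]."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    cond_both = Counter(zip(y[:-1], x[:-1]))
    pairs_y = Counter(zip(y[1:], y[:-1]))
    hist_y = Counter(y[:-1])
    n = len(x) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_next_given_both = c / cond_both[(y0, x0)]
        p_next_given_self = pairs_y[(y1, y0)] / hist_y[y0]
        te += (c / n) * log2(p_next_given_both / p_next_given_self)
    return te

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 5000)
y = np.roll(x, 1)             # y copies x with a one-step delay: TE(X->Y) ~ 1 bit
z = rng.integers(0, 2, 5000)  # independent series: TE(X->Z) ~ 0
te_drive = transfer_entropy(x, y)
te_null = transfer_entropy(x, z)
```

Applying such an estimator pairwise to enzyme activity time series is, in outline, how directed functional connectivity maps like the one described above are built.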

  15. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Cruikshank, Dale P.; MoreauDalleOre, Cristina; Pendleton, Yvonne J.; Clark, Roger Nelson

    2012-01-01

    We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites: Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visible-Infrared Mapping Spectrometer on close fly-bys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at approximately 3.28 micrometers (approximately 3050 per centimeter), and four blended bands of aliphatic -CH2- and -CH3 in the range approximately 3.36-3.52 micrometers (approximately 2980-2840 per centimeter). The aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is NAro:NAliph approximately 24; for Hyperion the value is approximately 12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundance of aliphatic -CH2- and -CH3- is an indication of the lengths of the molecular chain structures, hence the degree of modification of the original material. We derive CH2:CH3 approximately 2.2 in the spectrum of low-albedo material on Iapetus; this value is the same, within measurement errors, as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.
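    Band-strength ratios such as the aromatic-to-aliphatic ratio discussed above are, at bottom, ratios of integrated band areas. The toy sketch below shows that integration on a synthetic spectrum with two Gaussian bands; the band positions, widths, and amplitudes are illustrative and carry no physical meaning.

```python
import numpy as np
from scipy.integrate import trapezoid

def gaussian(x, amp, center, width):
    return amp * np.exp(-(x - center) ** 2 / (2 * width ** 2))

# Synthetic spectrum: an 'aromatic' band near 3.28 um, an 'aliphatic' one near 3.42 um
wav = np.linspace(3.1, 3.7, 4000)                     # wavelength axis, micrometers
spec = gaussian(wav, 1.0, 3.28, 0.01) + gaussian(wav, 0.25, 3.42, 0.02)

split = wav < 3.35                                    # boundary between the two bands
area_arom = trapezoid(spec[split], wav[split])
area_aliph = trapezoid(spec[~split], wav[~split])
ratio = area_arom / area_aliph                        # analytic value: (1.0*0.01)/(0.25*0.02) = 2.0
```

In practice a continuum must be removed first and the blended aliphatic bands deconvolved, but the final abundance-ratio step reduces to exactly this kind of area ratio.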

  16. Quantitative analysis of harmonic convergence in mosquito auditory interactions

    PubMed Central

    Aldersley, Andrew; Champneys, Alan; Robert, Daniel

    2016-01-01

    This article analyses the hearing and behaviour of mosquitoes in the context of inter-individual acoustic interactions. The acoustic interactions of tethered live pairs of Aedes aegypti mosquitoes, from same and opposite sex mosquitoes of the species, are recorded on independent and unique audio channels, together with the response of tethered individual mosquitoes to playbacks of pre-recorded flight tones of lone or paired individuals. A time-dependent representation of each mosquito's non-stationary wing beat frequency signature is constructed, based on Hilbert spectral analysis. A range of algorithmic tools is developed to automatically analyse these data, and used to perform a robust quantitative identification of the ‘harmonic convergence’ phenomenon. The results suggest that harmonic convergence is an active phenomenon, which does not occur by chance. It occurs for live pairs, as well as for lone individuals responding to playback recordings, whether from the same or opposite sex. Male–female behaviour is dominated by frequency convergence at a wider range of harmonic combinations than previously reported, and requires participation from both partners in the duet. New evidence is found to show that male–male interactions are more varied than strict frequency avoidance. Rather, they can be divided into two groups: convergent pairs, typified by tightly bound wing beat frequencies, and divergent pairs, that remain widely spaced in the frequency domain. Overall, the results reveal that mosquito acoustic interaction is a delicate and intricate time-dependent active process that involves both individuals, takes place at many different frequencies, and which merits further enquiry. PMID:27053654

  17. Quantitative analysis of harmonic convergence in mosquito auditory interactions.

    PubMed

    Aldersley, Andrew; Champneys, Alan; Homer, Martin; Robert, Daniel

    2016-04-01

    This article analyses the hearing and behaviour of mosquitoes in the context of inter-individual acoustic interactions. The acoustic interactions of tethered live pairs of Aedes aegypti mosquitoes, in same-sex and opposite-sex pairings, are recorded on independent and unique audio channels, together with the response of tethered individual mosquitoes to playbacks of pre-recorded flight tones of lone or paired individuals. A time-dependent representation of each mosquito's non-stationary wing beat frequency signature is constructed, based on Hilbert spectral analysis. A range of algorithmic tools is developed to automatically analyse these data, and used to perform a robust quantitative identification of the 'harmonic convergence' phenomenon. The results suggest that harmonic convergence is an active phenomenon, which does not occur by chance. It occurs for live pairs, as well as for lone individuals responding to playback recordings, whether from the same or opposite sex. Male-female behaviour is dominated by frequency convergence at a wider range of harmonic combinations than previously reported, and requires participation from both partners in the duet. New evidence is found to show that male-male interactions are more varied than strict frequency avoidance. Rather, they can be divided into two groups: convergent pairs, typified by tightly bound wing beat frequencies, and divergent pairs, that remain widely spaced in the frequency domain. Overall, the results reveal that mosquito acoustic interaction is a delicate and intricate time-dependent active process that involves both individuals, takes place at many different frequencies, and which merits further enquiry. PMID:27053654

  18. Analysis of quantitative phase detection based on optical information processing

    NASA Astrophysics Data System (ADS)

    Tao, Wang; Tu, Jiang-Chen; Chun, Kuang-Tao; Yu, Han-Wang; Xin, Du

    2009-07-01

    Phase objects exist widely in nature: biological cells, optical components, atmospheric flow fields, and so on. The phase detection of objects has great significance in basic research, nondestructive testing, aerospace, military weapons, and other areas. The usual methods of phase-object detection include the interference method, the grating method, the schlieren method, and the phase-contrast method. Each of these methods has its own advantages, but each also has disadvantages in detection precision, environmental requirements, cost, detection rate, detection range, or detection linearity in various applications; even the most sophisticated of them, the phase-contrast method, mainly used on microscopic structures, lacks a quantitative account of the magnitude of the object's phase and of the relationship between image contrast and the optical system. In this paper, various phase-detection methods and their characteristics in different applications are analyzed from the standpoint of optical information processing, and a phase-detection system based on optical filtering is constructed. First, the frequency spectrum of the phase object is obtained with a Fourier-transform lens; then the spectrum is modified by a suitable filter; finally, an inverse Fourier transform yields an image whose intensity distribution represents the phase distribution. The advantages and disadvantages of commonly used filters such as the quarter-wavelength phase filter, the high-pass filter, and the edge filter are analyzed, their phase resolution is compared within the same optical information processing system, and the factors limiting phase resolution are identified. The paper concludes that for any given application there exists an optimal filter that maximizes detection accuracy. Finally, we discuss how to design such an optimal filter so that the phase-testing ability of an optical information processing system is improved the most.
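
    The transform-filter-inverse-transform pipeline described above can be simulated numerically. The sketch below applies a Zernike-style quarter-wave filter (one of the filter types named in the abstract) to a weak synthetic phase object; the grid size and phase amplitude are assumptions for illustration.

```python
import numpy as np

# Weak phase object: uniform amplitude, small Gaussian phase bump
n = 256
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
phase = 0.3 * np.exp(-(x**2 + y**2) / (2 * 20**2))
field = np.exp(1j * phase)

# Without filtering, the intensity carries essentially no contrast
plain = np.abs(field)**2

# 4f processing: FFT -> filter at the Fourier plane -> inverse FFT.
# Zernike-style filter: advance only the zero (DC) order by a quarter wave.
spectrum = np.fft.fftshift(np.fft.fft2(field))
zernike = np.ones((n, n), complex)
zernike[n // 2, n // 2] = np.exp(1j * np.pi / 2)
image = np.fft.ifft2(np.fft.ifftshift(spectrum * zernike))
contrast = np.abs(image)**2

# The filtered image converts phase variation into intensity variation
print(plain.std() < 1e-10, contrast.std() > 1e-3)  # -> True True
```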

  19. Descriptive Quantitative Analysis of Rearfoot Alignment Radiographic Parameters.

    PubMed

    Meyr, Andrew J; Wagoner, Matthew R

    2015-01-01

    Although the radiographic parameters of the transverse talocalcaneal angle (tTCA), calcaneocuboid angle (CCA), talar head uncovering (THU), calcaneal inclination angle (CIA), talar declination angle (TDA), lateral talar-first metatarsal angle (lTFA), and lateral talocalcaneal angle (lTCA) form the basis of the preoperative evaluation and procedure selection for pes planovalgus deformity, the so-called normal values of these measurements are not well established. The objectives of the present study were, first, to retrospectively evaluate the descriptive statistics of these radiographic parameters (tTCA, CCA, THU, CIA, TDA, lTFA, and lTCA) in a large population and, second, to determine an objective basis for defining "normal" versus "abnormal" measurements. As a secondary outcome, the relationship of these variables to the body mass index was assessed. Anteroposterior and lateral foot radiographs from 250 consecutive patients without a history of previous foot and ankle surgery and/or trauma were evaluated. The results revealed a mean measurement of 24.12°, 13.20°, 74.32%, 16.41°, 26.64°, 8.37°, and 43.41° for the tTCA, CCA, THU, CIA, TDA, lTFA, and lTCA, respectively. These were generally in line with the reported historical normal values. Descriptive statistical analysis demonstrated that the tTCA, THU, and TDA met the standards to be considered normally distributed but that the CCA, CIA, lTFA, and lTCA demonstrated data characteristics of both parametric and nonparametric distributions. Furthermore, only the CIA (R = -0.2428) and lTCA (R = -0.2449) demonstrated substantial correlation with the body mass index. No differentiation in deformity progression was observed when the radiographic parameters were plotted against each other that would lead to a quantitative basis for defining "normal" versus "abnormal" measurements. PMID:26002682
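
    The two statistical steps the abstract relies on, a normality screen to decide between parametric and nonparametric description, and a correlation of each angle with body mass index, can be sketched as below. The data here are synthetic stand-ins generated around the reported means; the spreads and the strength of the BMI trend are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_patients = 250

# Synthetic stand-ins for two of the measured parameters (degrees)
tTCA = rng.normal(24.12, 4.0, n_patients)
CIA = rng.normal(16.41, 3.0, n_patients)
# BMI built with a weak negative dependence on CIA, mimicking the
# reported negative correlation (all coefficients are assumptions)
bmi = 30 - 0.2 * CIA + rng.normal(0, 1.5, n_patients)

# Normality screen (Shapiro-Wilk): p > 0.05 suggests a normal fit,
# supporting parametric descriptive statistics for that parameter
w_stat, p_norm = stats.shapiro(tTCA)

# Pearson correlation of a radiographic angle with body mass index
r, p_corr = stats.pearsonr(CIA, bmi)
print(bool(r < 0))  # -> True
```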

  20. Analysis of IUE observations of hydrogen in comets

    NASA Technical Reports Server (NTRS)

    Combi, Michael R.; Feldman, Paul D.

    1993-01-01

    The large body of hydrogen Lyman-alpha observations of cometary comae obtained with the International Ultraviolet Explorer satellite has gone generally unanalyzed because of two main modeling complications. First, the inner comae of many bright (gas productive) comets are often optically thick to solar Lyman-alpha radiation. Second, even in the case of a small comet (low gas production) the large IUE aperture is quite small as compared with the immense size of the hydrogen coma, so an accurate model which properly accounts for the spatial distribution of the coma is required to invert the inferred brightnesses to column densities and finally to H atom production rates. Our Monte Carlo particle trajectory model (MCPTM), which for the first time provides the realistic full phase space distribution of H atoms throughout the coma, was used as the basis for the analysis of IUE observations of the inner coma. The MCPTM includes the effects of the vectorial ejection of the H atoms upon dissociation of their parent species (H2O and OH) and of their partial collisional thermalization. Both of these effects are crucial to characterize the velocity distribution of the H atoms. A new spherical radiative transfer calculation based on our MCPTM was developed to analyze IUE observations of optically thick H comae. The models were applied to observations of comets P/Giacobini-Zinner and P/Halley.

  1. Dynamic mechanical analysis of hydrogen purification substrates and membranes

    NASA Astrophysics Data System (ADS)

    Steinborn, Brandon

    Porous 420 stainless steel hydrogen purification substrates were fabricated using an ExOne R2 printer and sintered at temperatures of 1075 °C and 1100 °C for times ranging from 15 minutes to 240 minutes. Coatings of 1 micron silica beads, silica sol-gel, and palladium were applied to the sintered structure. Mechanical properties/degradation of each substrate/coating combination were evaluated using a cyclic 3-point loading condition imposed by a TA Q800 dynamic mechanical analysis (DMA) unit. A constant deformation procedure was used while the required drive force for deformation and the elasticity (tan delta) were recorded throughout the cycle. Findings with respect to coating additions include: drive force increases with the addition of each coating; tan delta decreases with ceramic additions and increases with palladium addition (eventually decreasing when the membrane fails); and tan delta values become comparable with the addition of palladium regardless of other parameters. Findings with respect to sintering time and temperature include: drive force increases with increased sintering time and temperature; tan delta increases with increased sintering time at 1075 °C; and tan delta decreases with increased sintering time at 1100 °C. Overall, the palladium layer would likely remain intact in service, since actual force oscillations in service are less extreme than in testing; poisoning would likely be the life-limiting factor. Keywords: Sintering, dynamic mechanical properties, porous stainless steel, hydrogen purification, sol-gel.

  2. Quantitative PCR analysis of salivary pathogen burden in periodontitis

    PubMed Central

    Salminen, Aino; Kopra, K. A. Elisa; Hyvärinen, Kati; Paju, Susanna; Mäntylä, Päivi; Buhlin, Kåre; Nieminen, Markku S.; Sinisalo, Juha; Pussinen, Pirkko J.

    2015-01-01

    Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease (CAD) diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR (qPCR). Median salivary concentrations of Porphyromonas gingivalis, Tannerella forsythia, and Prevotella intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary Aggregatibacter actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4–5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39–4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51–4.52). The highest OR 3.59 (95% CI 1.94–6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and

  3. Quantification of Hydrogen Concentrations in Surface and Interface Layers and Bulk Materials through Depth Profiling with Nuclear Reaction Analysis.

    PubMed

    Wilde, Markus; Ohno, Satoshi; Ogura, Shohei; Fukutani, Katsuyuki; Matsuzaki, Hiroyuki

    2016-01-01

    Nuclear reaction analysis (NRA) via the resonant (1)H((15)N,αγ)(12)C reaction is a highly effective method of depth profiling that quantitatively and non-destructively reveals the hydrogen density distribution at surfaces, at interfaces, and in the volume of solid materials with high depth resolution. The technique applies a (15)N ion beam of 6.385 MeV provided by an electrostatic accelerator and specifically detects the (1)H isotope at depths up to about 2 μm from the target surface. Surface H coverages are measured with a sensitivity on the order of ~10(13) cm(-2) (~1% of a typical atomic monolayer density) and H volume concentrations with a detection limit of ~10(18) cm(-3) (~100 at. ppm). The near-surface depth resolution is 2-5 nm for surface-normal (15)N ion incidence onto the target and can be enhanced to values below 1 nm for very flat targets by adopting a surface-grazing incidence geometry. The method is versatile and readily applied to any high vacuum compatible homogeneous material with a smooth surface (no pores). Electrically conductive targets usually tolerate the ion beam irradiation with negligible degradation. Hydrogen quantitation and correct depth analysis require knowledge of the elementary composition (besides hydrogen) and mass density of the target material. Especially in combination with ultra-high vacuum methods for in-situ target preparation and characterization, (1)H((15)N,αγ)(12)C NRA is ideally suited for hydrogen analysis at atomically controlled surfaces and nanostructured interfaces. As examples, we demonstrate the application of (15)N NRA at the MALT Tandem accelerator facility of the University of Tokyo (1) to quantitatively measure the surface coverage and the bulk concentration of hydrogen in the near-surface region of a H2-exposed Pd(110) single crystal, and (2) to determine the depth location and layer density of hydrogen near the interfaces of thin SiO2 films on Si(100). PMID:27077920
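
    The depth scale of resonant NRA follows from the fact that (15)N ions entering above the resonance energy slow down in the target until they reach it, so the probed depth is the energy surplus divided by the stopping power. The stopping-power value below is an assumed round number for illustration, not a tabulated value for any particular material.

```python
# Depth probed by resonant 15N NRA: d = (E_beam - E_res) / S
E_RES_KEV = 6385.0     # 1H(15N,ag)12C resonance energy (keV), from the abstract
S_KEV_PER_NM = 3.0     # assumed stopping power of the target for 15N (keV/nm)

def probed_depth_nm(e_beam_kev: float) -> float:
    """Depth (nm) at which a beam of energy e_beam_kev meets the resonance."""
    if e_beam_kev < E_RES_KEV:
        raise ValueError("beam energy below resonance: no depth is probed")
    return (e_beam_kev - E_RES_KEV) / S_KEV_PER_NM

# Raising the beam energy 300 keV above the resonance probes 100 nm deep
print(probed_depth_nm(6685.0))  # -> 100.0
```

    Scanning the beam energy therefore scans the probed depth, which is how a full hydrogen depth profile is built up.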

  4. Quantitative Analysis by Isotopic Dilution Using Mass Spectroscopy: The Determination of Caffeine by GC-MS.

    ERIC Educational Resources Information Center

    Hill, Devon W.; And Others

    1988-01-01

    Describes a laboratory technique for quantitative analysis of caffeine by an isotopic dilution method for coupled gas chromatography-mass spectroscopy. Discusses caffeine analysis and experimental methodology. Lists sample caffeine concentrations found in common products. (MVL)
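
    The quantitation principle behind isotope dilution GC-MS can be sketched in a few lines: a known mass of an isotopically labeled internal standard (e.g. a deuterated caffeine analogue, assumed here) is spiked into the sample, and the analyte amount follows from the ratio of the two mass-chromatogram peak areas. The equal-response assumption and all numbers below are illustrative, not from the article.

```python
# Isotope dilution with a labeled internal standard: assuming equal
# detector response for the two isotopologues, the analyte mass is the
# spike mass scaled by the measured peak-area ratio.
def analyte_mass_mg(spike_mass_mg: float,
                    area_analyte: float,
                    area_spike: float) -> float:
    return spike_mass_mg * (area_analyte / area_spike)

# 5 mg of labeled spike; analyte peak 1.5x the spike peak -> 7.5 mg caffeine
print(analyte_mass_mg(5.0, 1.5e6, 1.0e6))  # -> 7.5
```

    Because analyte and standard co-elute and are chemically near-identical, losses during workup cancel in the ratio, which is the key advantage of the method.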

  5. Pressure Rise Analysis When Hydrogen Leak from a Cracked Pipe in the Cryogenic Hydrogen System in J-PARC

    NASA Astrophysics Data System (ADS)

    Tatsumoto, H.; Aso, T.; Hasegawa, S.; Ushijima, I.; Kato, T.; Ohtsu, K.; Ikeda, Y.

    2006-04-01

    As one of the main experimental facilities in the Japan Proton Accelerator Research Complex (J-PARC), an intense spallation neutron source (JSNS) driven by a 1 MW proton beam is being constructed. Cryogenic hydrogen at supercritical pressure is selected as a moderator. The total nuclear heating at the moderators is estimated to be 3.7 kW. A hydrogen system to cool the moderators has been designed. The most severe off-normal event for the cryogenic hydrogen system is considered to be a hydrogen leak when a pipe cracks. In such a case, the hydrogen must be discharged to atmosphere quickly and safely. An analytical code that simulates the pressure change during a hydrogen leak was developed. A pressure rise analysis for various crack sizes was performed, and the required sizes for relief devices were determined. The required safety valve diameter is φ42.7 mm, and the rupture disc for the vacuum layer should have a diameter of 37.1 mm.

  6. Pressure Rise Analysis When Hydrogen Leak from a Cracked Pipe in the Cryogenic Hydrogen System in J-PARC

    SciTech Connect

    Tatsumoto, H.; Aso, T.; Hasegawa, S.; Ushijima, I.; Kato, T.; Ohtsu, K.; Ikeda, Y.

    2006-04-27

    As one of the main experimental facilities in the Japan Proton Accelerator Research Complex (J-PARC), an intense spallation neutron source (JSNS) driven by a 1 MW proton beam is being constructed. Cryogenic hydrogen at supercritical pressure is selected as a moderator. The total nuclear heating at the moderators is estimated to be 3.7 kW. A hydrogen system to cool the moderators has been designed. The most severe off-normal event for the cryogenic hydrogen system is considered to be a hydrogen leak when a pipe cracks. In such a case, the hydrogen must be discharged to atmosphere quickly and safely. An analytical code that simulates the pressure change during a hydrogen leak was developed. A pressure rise analysis for various crack sizes was performed, and the required sizes for relief devices were determined. The required safety valve diameter is φ42.7 mm, and the rupture disc for the vacuum layer should have a diameter of 37.1 mm.
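
    The analytical code itself is not described in detail in these records. As a rough back-of-envelope companion to the pressure-rise analysis, the pressure of leaked hydrogen warming inside a closed vacuum layer can be bounded with the ideal gas law; the leaked amount, jacket volume, and temperature below are assumptions for illustration only.

```python
# Ideal-gas estimate of the pressure after n_mol of leaked H2 warms up
# inside a fixed, initially evacuated volume: p = n R T / V.
R = 8.314  # universal gas constant, J/(mol K)

def pressure_pa(n_mol: float, volume_m3: float, temp_k: float) -> float:
    """Ideal-gas pressure (Pa) of n_mol moles in volume_m3 at temp_k."""
    return n_mol * R * temp_k / volume_m3

# 2 mol of H2 warming to 300 K inside an assumed 0.5 m^3 vacuum jacket
print(round(pressure_pa(2.0, 0.5, 300.0)))  # -> 9977
```

    A transient analysis like the one in the paper tracks how fast this pressure is reached and sizes the relief devices so it is never exceeded.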

  7. Quantitative analysis of localized surface plasmons based on molecular probing.

    PubMed

    Deeb, Claire; Bachelot, Renaud; Plain, Jérôme; Baudrion, Anne-Laure; Jradi, Safi; Bouhelier, Alexandre; Soppera, Olivier; Jain, Prashant K; Huang, Libai; Ecoffet, Carole; Balan, Lavinia; Royer, Pascal

    2010-08-24

    We report on the quantitative characterization of the plasmonic optical near-field of a single silver nanoparticle. Our approach relies on nanoscale molecular molding of the confined electromagnetic field by photoactivated molecules. We were able to directly image the dipolar profile of the near-field distribution with a resolution better than 10 nm and to quantify the near-field depth and its enhancement factor. A single nanoparticle spectral signature was also assessed. This quantitative characterization constitutes a prerequisite for developing nanophotonic applications. PMID:20687536

  8. Quantitative analysis of autophagy using advanced 3D fluorescence microscopy.

    PubMed

    Changou, Chun A; Wolfson, Deanna L; Ahluwalia, Balpreet Singh; Bold, Richard J; Kung, Hsing-Jien; Chuang, Frank Y S

    2013-01-01

    Prostate cancer is the leading form of malignancy among men in the U.S. While surgery carries a significant risk of impotence and incontinence, traditional chemotherapeutic approaches have been largely unsuccessful. Hormone therapy is effective at an early stage, but often fails with the eventual development of hormone-refractory tumors. We have been interested in developing therapeutics targeting specific metabolic deficiencies of tumor cells. We recently showed that prostate tumor cells specifically lack an enzyme (argininosuccinate synthase, or ASS) involved in the synthesis of the amino acid arginine(1). This condition causes the tumor cells to become dependent on exogenous arginine, and they undergo metabolic stress when free arginine is depleted by arginine deiminase (ADI)(1,10). Indeed, we have shown that human prostate cancer cells CWR22Rv1 are effectively killed by ADI with caspase-independent apoptosis and aggressive autophagy (or macroautophagy)(1,2,3). Autophagy is an evolutionarily-conserved process that allows cells to metabolize unwanted proteins by lysosomal breakdown during nutritional starvation(4,5). Although the essential components of this pathway are well-characterized(6,7,8,9), many aspects of the molecular mechanism are still unclear - in particular, what is the role of autophagy in the death-response of prostate cancer cells after ADI treatment? In order to address this question, we required an experimental method to measure the level and extent of autophagic response in cells - and since there are no known molecular markers that can accurately track this process, we chose to develop an imaging-based approach, using quantitative 3D fluorescence microscopy(11,12).
Using CWR22Rv1 cells specifically-labeled with fluorescent probes for autophagosomes and lysosomes, we show that 3D image stacks acquired with either widefield deconvolution microscopy (and later, with super-resolution, structured-illumination microscopy) can clearly capture the early

  9. Quantitative Proteomic Analysis of Differentially Expressed Protein Profiles Involved in Pancreatic Ductal Adenocarcinoma

    PubMed Central

    Kuo, Kung-Kai; Kuo, Chao-Jen; Chiu, Chiang-Yen; Liang, Shih-Shin; Huang, Chun-Hao; Chi, Shu-Wen; Tsai, Kun-Bow; Chen, Chiao-Yun; Hsi, Edward; Cheng, Kuang-Hung; Chiou, Shyh-Horng

    2016-01-01

    Objectives The aim of this study was to identify differentially expressed proteins among various stages of pancreatic ductal adenocarcinoma (PDAC) by shotgun proteomics using nano-liquid chromatography coupled tandem mass spectrometry and stable isotope dimethyl labeling. Methods Differentially expressed proteins were identified and compared based on the mass spectral differences of their isotope-labeled peptide fragments generated from protease digestion. Results Our quantitative proteomic analysis of the differentially expressed proteins with stable isotope (deuterium/hydrogen ratio, ≥2) identified a total of 353 proteins, with at least 5 protein biomarker proteins that were significantly differentially expressed between cancer and normal mice by at least a 2-fold alteration. These 5 protein biomarker candidates include α-enolase, α-catenin, 14-3-3 β, VDAC1, and calmodulin with high confidence levels. The expression levels were also found to be in agreement with those examined by Western blot and histochemical staining. Conclusions The systematic decrease or increase of these identified marker proteins may potentially reflect the morphological aberrations and diseased stages of pancreas carcinoma throughout progressive developments leading to PDAC. The results would form a firm foundation for future work concerning validation and clinical translation of some identified biomarkers into targeted diagnosis and therapy for various stages of PDAC. PMID:26262590
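
    The selection criterion the abstract describes, keeping proteins whose deuterium/hydrogen labeling ratio shows at least a 2-fold alteration, can be expressed as a simple filter. The protein names echo candidates named in the abstract, but the ratio values (and the "housekeeping" entry) are invented for illustration.

```python
# Filter dimethyl-labeling quantitation results: retain proteins with at
# least a 2-fold change in either direction (ratio >= 2 or <= 0.5).
ratios = {
    "alpha-enolase": 2.6,    # illustrative heavy/light ratio, up-regulated
    "14-3-3 beta": 0.4,      # illustrative ratio, 2.5-fold down-regulated
    "housekeeping-X": 1.1,   # hypothetical unchanged protein, filtered out
}

candidates = {p: r for p, r in ratios.items() if r >= 2.0 or r <= 0.5}
print(sorted(candidates))  # -> ['14-3-3 beta', 'alpha-enolase']
```

    In the actual workflow these ratios come from the mass-spectral intensities of isotope-labeled peptide pairs, aggregated per protein, before the fold-change cut is applied.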

  10. Analysis of hydrogen-bond interaction potentials from the electron density: Integration of NCI regions

    PubMed Central

    Contreras-García, Julia; Yang, Weitao; Johnson, Erin R.

    2013-01-01

    Hydrogen bonds are of crucial relevance to many problems in chemistry, biology, and materials science. The recently-developed NCI (Non-Covalent Interactions) index enables real-space visualization of both attractive (van der Waals and hydrogen-bonding) and repulsive (steric) interactions based on properties of the electron density. It is thus an optimal index to describe the interplay of stabilizing and de-stabilizing contributions that determine stable minima on hydrogen-bonding potential-energy surfaces (PESs). In the framework of density-functional theory, energetics are completely determined by the electron density. Consequently, NCI will be shown to allow quantitative treatment of hydrogen-bond energetics. The evolution of NCI regions along a PES follows a well-behaved pattern which, upon integration of the electron density, is capable of mimicking conventional hydrogen-bond interatomic potentials. PMID:21786796
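
    The NCI index is built on the reduced density gradient, s = |∇ρ| / (2(3π²)^(1/3) ρ^(4/3)), which drops toward zero in regions of non-covalent interaction. The sketch below evaluates s on a one-dimensional model density (a single Gaussian); a real NCI calculation uses the molecular electron density on a 3D grid, so this is only a minimal illustration of the formula.

```python
import numpy as np

# Model density: a single Gaussian on a 1D grid (stand-in for rho(r))
x = np.linspace(-3.0, 3.0, 601)
rho = np.exp(-x**2)
grad = np.gradient(rho, x)

# Reduced density gradient s = |grad rho| / (2 (3 pi^2)^(1/3) rho^(4/3))
c = 2.0 * (3.0 * np.pi**2) ** (1.0 / 3.0)
s = np.abs(grad) / (c * rho ** (4.0 / 3.0))

# s vanishes where the density gradient does (here, at the maximum x = 0)
print(round(float(s[300]), 6))  # -> 0.0
```

    In NCI analysis it is the low-s regions at low density, located between molecules, that are isolated and (in this paper) integrated to mimic hydrogen-bond interaction potentials.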

  11. A new quantitative method for gunshot residue analysis by ion beam analysis.

    PubMed

    Christopher, Matthew E; Warmenhoeven, John-William; Romolo, Francesco S; Donghi, Matteo; Webb, Roger P; Jeynes, Christopher; Ward, Neil I; Kirkby, Karen J; Bailey, Melanie J

    2013-08-21

    Imaging and analyzing gunshot residue (GSR) particles using a scanning electron microscope equipped with an energy-dispersive X-ray spectrometer (SEM-EDS) is a standard technique that can provide important forensic evidence, but the discrimination power of this technique is limited due to low sensitivity to trace elements and difficulties in obtaining quantitative results from small particles. A new, faster method using a scanning proton microbeam and Particle Induced X-ray Emission (μ-PIXE), together with Elastic Backscattering Spectrometry (EBS), is presented for the non-destructive, quantitative analysis of the elemental composition of single GSR particles. In this study, all of the GSR particles contained Pb, Ba, and Sb. The precision of the method is assessed. The grouping behaviour of different makes of ammunition is determined using multivariate analysis. The protocol correctly groups the cartridges studied here, with a confidence >99%, irrespective of the firearm or population of particles selected. PMID:23775063
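
    The multivariate grouping step can be illustrated with a small principal-component analysis: elemental composition vectors (here Pb/Ba/Sb percentages) from particles of different ammunition makes are projected onto their principal axes, where particles from the same make cluster together. All of the numbers below are invented; the paper's actual statistical treatment is not reproduced here.

```python
import numpy as np

# Invented Pb/Ba/Sb compositions (wt%) for two hypothetical ammunition
# makes, two particles each
X = np.array([
    [60.0, 25.0, 15.0],   # make A, particle 1
    [61.0, 24.0, 15.0],   # make A, particle 2
    [40.0, 35.0, 25.0],   # make B, particle 1
    [41.0, 34.0, 25.0],   # make B, particle 2
])
Xc = X - X.mean(axis=0)

# PCA via SVD: rows of Vt are the principal axes, U * S the scores
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * S
pc1 = scores[:, 0]

# Along the first component, same-make particles sit close together and
# the two makes are widely separated
print(bool(abs(pc1[0] - pc1[1]) < 2.0 and abs(pc1[0] - pc1[2]) > 10.0))  # -> True
```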

  12. Teaching Quantitative Reasoning for Nonscience Majors through Carbon Footprint Analysis

    ERIC Educational Resources Information Center

    Boose, David L.

    2014-01-01

    Quantitative reasoning is a key intellectual skill, applicable across disciplines and best taught in the context of authentic, relevant problems. Here, I describe and assess a laboratory exercise that has students calculate their "carbon footprint" and evaluate the impacts of various behavior choices on that footprint. Students gather…

  13. MOLD SPECIFIC QUANTITATIVE PCR: THE EMERGING STANDARD IN MOLD ANALYSIS

    EPA Science Inventory

    Today I will talk about the use of quantitative or Real time PCR for the standardized identification and quantification of molds. There are probably at least 100,000 species of molds or fungi. But there are actually about 100 typically found indoors. Some pose a threat to human...

  14. Features of the Quantitative Analysis in Moessbauer Spectroscopy

    SciTech Connect

    Semenov, V. G.; Panchuk, V. V.; Irkaev, S. M.

    2010-07-13

    The results describing the effect of different factors on errors in quantitative determination of the phase composition of studied substances by Moessbauer spectroscopy absorption are presented, and the ways of using them are suggested. The effectiveness of the suggested methods is verified by an example of analyzing standard and unknown compositions.

  15. Quantitative and Qualitative Analysis of Biomarkers in Fusarium verticillioides

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this study, a combination HPLC-DART-TOF-MS system was utilized to identify and quantitatively analyze carbohydrates in wild type and mutant strains of Fusarium verticillioides. Carbohydrate fractions were isolated from F. verticillioides cellular extracts by HPLC using a cation-exchange size-excl...

  16. Quantitative Analysis of Radionuclides in Process and Environmental Samples

    SciTech Connect

    Boni, A.L.

    2003-02-21

    An analytical method was developed for the radiochemical separation and quantitative recovery of ruthenium, zirconium, niobium, neptunium, cobalt, iron, zinc, strontium, rare earths, chromium and cesium from a wide variety of natural materials. This paper discusses this analytical method, based on the anion exchange properties of the various radionuclides, although both ion exchange and precipitation techniques are incorporated.

  17. Technical Analysis of Hydrogen Production: Evaluation of H2 Mini-Grids

    SciTech Connect

    Lasher, Stephen; Sinha, Jayanti

    2005-05-03

    We have assessed the transportation of hydrogen as a metal hydride slurry through pipelines over a short distance from a neighborhood hydrogen production facility to local points of use. The assessment was conducted in the context of a hydrogen "mini-grid" serving both vehicle fueling and stationary fuel cell power systems for local building heat and power. The concept was compared to a compressed gaseous hydrogen mini-grid option and to a stand-alone hydrogen fueling station. Based on our analysis results, we have concluded that the metal hydride slurry concept has the potential to provide significant reductions in overall energy use compared to liquid or chemical hydride delivery, but only modest reductions in overall energy use, hydrogen cost, and GHG emissions compared to compressed gaseous hydrogen delivery. However, given the inherent (and perceived) safety and reasonable cost/efficiency of metal hydride slurry systems, additional research and analysis are warranted. The concept could potentially overcome the public acceptance barrier associated with perceptions about hydrogen delivery (including liquid hydrogen tanker trucks and high-pressure gaseous hydrogen pipelines or tube trailers) and facilitate the development of a near-term hydrogen infrastructure.

  18. Analysis of IUE Observations of Hydrogen in Comets

    NASA Technical Reports Server (NTRS)

    Combi, Michael R.; Feldman, Paul D.

    1998-01-01

    The 15 years' worth of hydrogen Lyman-alpha observations of cometary comae obtained with the International Ultraviolet Explorer (IUE) satellite had gone generally unanalyzed because of two main modeling complications. First, the inner comae of many bright (gas productive) comets are often optically thick to solar Lyman-alpha radiation. Second, even in the case of a small comet (low gas production) the large IUE aperture is quite small as compared with the immense size of the hydrogen coma, so an accurate model which properly accounts for the spatial distribution of the coma is required to invert the inferred brightnesses to column densities and finally to H atom production rates. Our Monte Carlo particle trajectory model (MCPTM), which for the first time provides the realistic full phase space distribution of H atoms throughout the coma, has been used as the basis for the analysis of IUE observations of the inner coma. The MCPTM includes the effects of the vectorial ejection of the H atoms upon dissociation of their parent species (H2O and OH) and of their partial collisional thermalization. Both of these effects are crucial to characterize the velocity distribution of the H atoms. This combination of the MCPTM and spherical radiative transfer code had already been shown to be successful in understanding the moderately optically thick coma of comet P/Giacobini-Zinner and the coma of comet Halley, which varied from being slightly to very optically thick. Both of these comets were observed during solar minimum conditions. Solar activity affects both the photochemistry of water and the solar Lyman-alpha radiation flux. The overall plan of this program was to concentrate on comets observed by IUE at other times during the solar cycle, most importantly during the two solar maxima of 1980 and 1990. Described herein are the work performed and the results obtained.

  19. A survey and analysis of commercially available hydrogen sensors

    NASA Technical Reports Server (NTRS)

    Hunter, Gary W.

    1992-01-01

    The performance requirements for hydrogen detection in aerospace applications often exceed those of more traditional applications. In order to ascertain the applicability of existing hydrogen sensors to aerospace applications, a survey was conducted of commercially available point-contact hydrogen sensors, and their operation was analyzed. The operation of the majority of commercial hydrogen sensors falls into four main categories: catalytic combustion, electrochemical, semiconducting oxide sensors, and thermal conductivity detectors. The physical mechanism involved in hydrogen detection for each main category is discussed in detail. From an understanding of the detection mechanism, each category of sensor is evaluated for use in a variety of space and propulsion environments. In order to meet the needs of aerospace applications, the development of point-contact hydrogen sensors that are based on concepts beyond those used in commercial sensors is necessary.

  20. In-line calibration of Raman systems for analysis of gas mixtures of hydrogen isotopologues with sub-percent accuracy.

    PubMed

    Schlösser, Magnus; Seitz, Hendrik; Rupp, Simone; Herwig, Philipp; Alecu, Catalin Gabriel; Sturm, Michael; Bornschein, Beate

    2013-03-01

    Highly accurate, in-line, and real-time composition measurements of gases are mandatory in many processing applications. The quantitative analysis of mixtures of hydrogen isotopologues (H2, D2, T2, HD, HT, and DT) is of high importance in such fields as DT fusion, neutrino mass measurements using tritium β-decay or photonuclear experiments where HD targets are used. Raman spectroscopy is a favorable method for these tasks. In this publication we present a method for the in-line calibration of Raman systems for the nonradioactive hydrogen isotopologues. It is based on precise volumetric gas mixing of the homonuclear species H2/D2 and a controlled catalytic production of the heteronuclear species HD. Systematic effects like spurious exchange reactions with wall materials and others are considered with care during the procedure. A detailed discussion of statistical and systematic uncertainties is presented which finally yields a calibration accuracy of better than 0.4%. PMID:23320553
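
    The heteronuclear species HD cannot simply be mixed volumetrically like H2 and D2; as the abstract notes, it is produced catalytically from them. A minimal sketch of the underlying equilibrium H2 + D2 ⇌ 2HD is shown below, using the high-temperature statistical limit K = 4 as the default; the true equilibrium constant is temperature dependent, so this value is an assumption for illustration.

```python
from math import sqrt

def equilibrate(h2: float, d2: float, k: float = 4.0):
    """Equilibrium (H2, D2, HD) mole fractions from initial H2/D2 moles.

    Let x be the moles of H2 (and of D2) consumed; then
    K = (2x)^2 / ((h2 - x)(d2 - x)), i.e.
    (4 - K) x^2 + K (h2 + d2) x - K h2 d2 = 0.
    """
    a, b, c = 4.0 - k, k * (h2 + d2), -k * h2 * d2
    if a == 0.0:
        x = -c / b                                  # K = 4: linear case
    else:
        x = (-b + sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
    total = h2 + d2          # moles are conserved: 2 consumed -> 2 HD made
    return (h2 - x) / total, (d2 - x) / total, 2.0 * x / total

# Equimolar H2/D2 with K = 4 equilibrates to 25% H2, 25% D2, 50% HD
print(equilibrate(0.5, 0.5))  # -> (0.25, 0.25, 0.5)
```

    Mixtures prepared this way, with composition known from the gas metering and the equilibrium, are what allow the Raman response of each isotopologue to be calibrated to sub-percent accuracy.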

  1. Cost Analysis of a Concentrator Photovoltaic Hydrogen Production System

    SciTech Connect

    Thompson, J. R.; McConnell, R. D.; Mosleh, M.

    2005-08-01

    The development of efficient, renewable methods of producing hydrogen is essential to the success of the hydrogen economy. Since the feedstock for electrolysis is water, no harmful pollutants are emitted during use of the fuel. Furthermore, it has become evident that concentrator photovoltaic (CPV) systems have a number of unique attributes that could shortcut the development process and increase the efficiency of hydrogen production to the point where economics will drive commercial development to mass scale.

  2. Quantitative analysis of HSV gene expression during lytic infection

    PubMed Central

    Turner, Anne-Marie W.; Arbuckle, Jesse H.; Kristie, Thomas M.

    2014-01-01

    Herpes Simplex Virus (HSV) is a human pathogen that establishes latency and undergoes periodic reactivation, resulting in chronic recurrent lytic infection. HSV lytic infection is characterized by an organized cascade of three gene classes; however, successful transcription and expression of the first, the immediate early class, is critical to the overall success of viral infection. This initial event of lytic infection is also highly dependent on host cell factors. This unit uses RNA interference and small molecule inhibitors to examine the role of host and viral proteins in HSV lytic infection. Methods detailing isolation of viral and host RNA and genomic DNA, followed by quantitative real-time PCR, allow characterization of impacts on viral transcription and replication, respectively. Western blot can be used to confirm quantitative PCR results. This combination of protocols represents a starting point for researchers interested in virus-host interactions during HSV lytic infection. PMID:25367270

  3. Quantitative and qualitative HPLC analysis of thermogenic weight loss products.

    PubMed

    Schaneberg, B T; Khan, I A

    2004-11-01

    An HPLC method for the qualitative and quantitative determination of seven analytes (caffeine, ephedrine, forskolin, icariin, pseudoephedrine, synephrine, and yohimbine) in thermogenic weight loss preparations available on the market is described in this paper. The seven analytes were separated and detected within 45 min in the acetonitrile:water (80:20) extract. The method uses a Waters XTerra RP18 (5 microm particle size) column as the stationary phase, a gradient mobile phase of water (5.0 mM SDS) and acetonitrile, and UV detection at 210 nm. The correlation coefficients for the calibration curves and the recovery rates ranged from 0.994 to 0.999 and from 97.45% to 101.05%, respectively. The qualitative and quantitative results are discussed. PMID:15587578
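
    Figures of merit like those reported (correlation coefficients of 0.994-0.999, recoveries of 97.45-101.05%) come from fitting an external calibration curve and back-calculating spiked samples. A minimal sketch of that arithmetic, using made-up peak areas rather than the paper's data:

```python
# Hedged sketch: illustrative calibration data, not values from the study.

def linear_fit(x, y):
    """Ordinary least-squares fit y = slope*x + intercept; returns (slope, intercept, r)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / (sxx * syy) ** 0.5  # correlation coefficient
    return slope, intercept, r

# Illustrative calibration standards (ug/mL) and peak areas for one analyte.
conc = [5.0, 10.0, 25.0, 50.0, 100.0]
area = [52.0, 101.0, 255.0, 498.0, 1003.0]
slope, intercept, r = linear_fit(conc, area)

# Recovery check: a sample spiked at 20 ug/mL whose peak area reads 196.
measured = (196.0 - intercept) / slope   # back-calculated concentration
recovery_pct = measured / 20.0 * 100.0   # % recovery
```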

  4. Analysis of hydrogen vehicles with cryogenic high pressure storage

    SciTech Connect

    Aceves, S. M.; Berry, G. D.

    1998-06-19

    Insulated pressure vessels are cryogenic-capable pressure vessels that can be fueled with liquid hydrogen (LIQ) or ambient-temperature compressed hydrogen (CH2). Insulated pressure vessels offer the advantages of liquid hydrogen tanks (low weight and volume) with reduced disadvantages (lower energy requirement for hydrogen liquefaction and reduced evaporative losses). This paper evaluates the applicability of insulated pressure vessels for light-duty vehicles, presenting an evaluation of evaporative losses and insulation requirements and a description of the current experimental plans for testing insulated pressure vessels. The results show significant advantages to the use of insulated pressure vessels for light-duty vehicles.

  5. Fluorescent microscopy approaches of quantitative soil microbial analysis

    NASA Astrophysics Data System (ADS)

    Ivanov, Konstantin; Polyanskaya, Lubov

    2015-04-01

    Classical fluorescent microscopy has been used over recent decades in a wide range of microbiological studies of terrestrial ecosystems. The method yields representative results and is simple to apply, which makes it suitable both as a routine part of large-scale research and for small laboratories. Furthermore, depending on the research target, many modifications of the fluorescent microscopy method have been established. Combining and comparing several approaches offers an opportunity for quantitative estimation of the soil microbial community. The first analytical part of the study was dedicated to estimating soil bacterial density by fluorescent microscopy over the course of several 30-day experiments. The purpose of the research was to estimate changes in the soil bacterial community in different soil horizons under aerobic and anaerobic conditions after adding nutrients in two experimental sets: cellulose and chitin. The nalidixic acid method, which inhibits DNA division in gram-negative bacteria, was modified so that this bacterial group could be quantified by fluorescent microscopy. The established approach detected 3-4 times more gram-negative bacterial cells in soil. The role of actinomycetes in soil polymer destruction is traditionally considered dominant relative to that of the gram-negative bacterial group. However, quantification of gram-negative bacteria in chernozem and peatland shows that the classical view underestimates this bacterial group. Chitin introduction had no positive effect on gram-negative bacterial population density in chernozem, but this nutrient did produce fast growth dynamics during the first 3 days of the experiment under both aerobic and anaerobic conditions, confirming the chitinolytic activity of gram-negative bacteria in soil organic matter decomposition. In the next part of the research, the modified method for quantifying soil gram-negative bacteria was compared to fluorescent in situ

  6. Comprehensive objective maps of macromolecular conformations by quantitative SAXS analysis

    PubMed Central

    Hura, Greg L.; Budworth, Helen; Dyer, Kevin N.; Rambo, Robert P.; Hammel, Michal

    2013-01-01

    Comprehensive perspectives of macromolecular conformations are required to connect structure to biology. Here we present a small angle X-ray scattering (SAXS) Structural Similarity Map (SSM) and Volatility of Ratio (VR) metric providing comprehensive, quantitative and objective (superposition-independent) perspectives on solution state conformations. We validate VR and SSM utility on human MutSβ, a key ABC ATPase and chemotherapeutic target, by revealing MutSβ DNA sculpting and identifying multiple conformational states for biological activity. PMID:23624664

  7. Advanced hydrogen/oxygen thrust chamber design analysis

    NASA Technical Reports Server (NTRS)

    Shoji, J. M.

    1973-01-01

    The results are reported of the advanced hydrogen/oxygen thrust chamber design analysis program. The primary objectives of this program were to: (1) provide an in-depth analytical investigation to develop thrust chamber cooling and fatigue life limitations of an advanced, high pressure, high performance H2/O2 engine design of 20,000-pounds (88960.0 N) thrust; and (2) integrate the existing heat transfer analysis, thermal fatigue and stress aspects for advanced chambers into a comprehensive computer program. Thrust chamber designs and analyses were performed to evaluate various combustor materials, coolant passage configurations (tubes and channels), and cooling circuits to define the nominal 1900 psia (1.31 x 10 to the 7th power N/sq m) chamber pressure, 300-cycle life thrust chamber. The cycle life capability of the selected configuration was then determined for three duty cycles. Also, the influence of cycle life and chamber pressure on thrust chamber design was investigated by varying the cycle life requirements at the nominal chamber pressure and by varying the chamber pressure at the nominal cycle life requirement.

  8. Elastic recoil detection analysis of hydrogen with 7Li ions using a polyimide foil as a thick hydrogen reference

    NASA Astrophysics Data System (ADS)

    Pelicon, Primož; Razpet, Alenka; Markelj, Sabina; Čadež, Iztok; Budnar, Miloš

    2005-01-01

    Elastic recoil detection analysis (ERDA) with an absorber foil using a 4.2 MeV 7Li2+ beam was utilized for evaluation of hydrogen depth profiles. Since recoil cross-sections when using Li ions as projectiles are not well known, the energy dependent ratio between the experimental yield and the yield calculated using the Rutherford recoil cross-section was obtained from an ERDA spectrum of a thick polyimide (Kapton) sample. It was estimated that this ratio does not significantly depend on sample composition. Therefore it was used for correction of measured spectra analyzed by existing simulation and evaluation programs in which the Rutherford recoil cross-sections were applied. The correction procedure has been verified in round-robin measurements of well-characterized Si:H thin layers. Application of the method for determination of a hydrogen depth concentration profile in hydrogen-containing graphite samples is presented.

  9. Analysis of combined hydrogen, heat, and power as a bridge to a hydrogen transition.

    SciTech Connect

    Mahalik, M.; Stephan, C.

    2011-01-18

    Combined hydrogen, heat, and power (CHHP) technology is envisioned as a means to providing heat and electricity, generated on-site, to large end users, such as hospitals, hotels, and distribution centers, while simultaneously producing hydrogen as a by-product. The hydrogen can be stored for later conversion to electricity, used on-site (e.g., in forklifts), or dispensed to hydrogen-powered vehicles. Argonne has developed a complex-adaptive-system model, H2CAS, to simulate how vehicles and infrastructure can evolve in a transition to hydrogen. This study applies the H2CAS model to examine how CHHP technology can be used to aid the transition to hydrogen. It does not attempt to predict the future or provide one forecast of system development. Rather, the purpose of the model is to understand how the system works. The model uses a 50- by 100-mile rectangular grid of 1-square-mile cells centered on the Los Angeles metropolitan area. The major expressways are incorporated into the model, and local streets are considered to be ubiquitous, except where there are natural barriers. The model has two types of agents. Driver agents are characterized by a number of parameters: home and job locations, income, various types of 'personalities' reflective of marketing distinctions (e.g., innovators, early adopters), willingness to spend extra money on 'green' vehicles, etc. At the beginning of the simulations, almost all driver agents own conventional vehicles. They drive around the metropolitan area, commuting to and from work and traveling to various other destinations. As they do so, they observe the presence or absence of facilities selling hydrogen. If they find such facilities conveniently located along their routes, they are motivated to purchase a hydrogen-powered vehicle when it becomes time to replace their present vehicle. 
Conversely, if they find that they would be inconvenienced by having to purchase hydrogen earlier than necessary or if they become worried that they

  10. An Inexpensive Electrodeposition Device and Its Use in a Quantitative Analysis Laboratory Exercise

    ERIC Educational Resources Information Center

    Parker, Richard H.

    2011-01-01

    An experimental procedure, using an apparatus that is easy to construct, was developed to incorporate a quantitative electrogravimetric determination of the solution nickel content into an undergraduate or advanced high school quantitative analysis laboratory. This procedure produces results comparable to the procedure used for the gravimetric…

  11. Quantitative analysis of single amino acid variant peptides associated with pancreatic cancer in serum by an isobaric labeling quantitative method.

    PubMed

    Nie, Song; Yin, Haidi; Tan, Zhijing; Anderson, Michelle A; Ruffin, Mack T; Simeone, Diane M; Lubman, David M

    2014-12-01

    Single amino acid variations are highly associated with many human diseases. The direct detection of peptides containing single amino acid variants (SAAVs) derived from nonsynonymous single nucleotide polymorphisms (SNPs) in serum can provide unique opportunities for SAAV-associated biomarker discovery. In the present study, an isobaric labeling quantitative strategy was applied to identify and quantify variant peptides in serum samples of pancreatic cancer patients and other benign controls. The largest number of SAAV peptides in serum to date, comprising 96 unique variant peptides, was quantified in this analysis, of which five variant peptides showed a statistically significant difference between pancreatic cancer and other controls (p-value < 0.05). Significant differences in the variant peptide SDNCEDTPEAGYFAVAVVK from serotransferrin were detected between pancreatic cancer and controls, which was further validated by selected reaction monitoring (SRM) analysis. The novel biomarker panel obtained by combining α-1-antichymotrypsin (AACT), Thrombospondin-1 (THBS1) and this variant peptide showed an excellent diagnostic performance in discriminating pancreatic cancer from healthy controls (AUC = 0.98) and chronic pancreatitis (AUC = 0.90). These results suggest that large-scale analysis of SAAV peptides in serum may provide a new direction for biomarker discovery research. PMID:25393578

  12. Geographically Based Hydrogen Consumer Demand and Infrastructure Analysis: Final Report

    SciTech Connect

    Melendez, M.; Milbrandt, A.

    2006-10-01

    In FY 2004 and 2005, NREL developed a proposed minimal infrastructure to support nationwide deployment of hydrogen vehicles by offering infrastructure scenarios that facilitated interstate travel. This report identifies key metropolitan areas and regions on which to focus infrastructure efforts during the early hydrogen transition.

  13. Hydrogen engine performance analysis project. First quarterly report, March 1980

    SciTech Connect

    Adt, Jr, R R; Swain, M R; Pappas, J M

    1980-01-01

    Progress in a program aimed at obtaining operational and performance data on a prototype pre-intake-valve-closing (PreIVC) fuel-ingestion hydrogen-fueled automotive engine is reported. Information is included on the construction and testing of an unthrottled hydrogen delivery system and on flashback during starting. It was determined that the flashback was caused by runaway surface ignition. (LCL)

  14. Hydrogen engine performance analysis project. Second annual report

    SciTech Connect

    Adt, Jr., R. R.; Swain, M. R.; Pappas, J. M.

    1980-01-01

    Progress in a 3 year research program to evaluate the performance and emission characteristics of hydrogen-fueled internal combustion engines is reported. Fifteen hydrogen engine configurations will be subjected to performance and emissions characterization tests. During the first two years, baseline data for throttled and unthrottled, carburetted and timed hydrogen induction, PreIVC hydrogen-fueled engine configurations, with and without exhaust gas recirculation (EGR) and water injection, were obtained. These data, along with descriptions of the test engine and its components, the test apparatus, experimental techniques, experiments performed and the results obtained, are given. Analyses of other hydrogen-engine project data are also presented and compared with the results of the present effort. The unthrottled engine vis-a-vis the throttled engine is found, in general, to exhibit higher brake thermal efficiency. The unthrottled engine also yields lower NOx emissions, which were found to be a strong function of fuel-air equivalence ratio. (LCL)

  15. Quantitative analysis of a wind energy conversion model

    NASA Astrophysics Data System (ADS)

    Zucker, Florian; Gräbner, Anna; Strunz, Andreas; Meyn, Jan-Peter

    2015-03-01

    A rotor of 12 cm diameter is attached to a precision electric motor, used as a generator, to make a model wind turbine. Output power of the generator is measured in a wind tunnel at air velocities of up to 15 m/s. The maximum power is 3.4 W; the power conversion factor from kinetic to electric energy is cp = 0.15. The v^3 power law is confirmed. The model illustrates several technically important features of industrial wind turbines quantitatively.
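
    The reported conversion factor can be checked directly from the abstract's numbers: cp is the electrical output divided by the kinetic power of the air stream through the swept area. A quick sketch, assuming standard air density of 1.2 kg/m^3 (not stated in the abstract):

```python
import math

rho = 1.2                      # air density, kg/m^3 (assumed)
d = 0.12                       # rotor diameter, m
v = 15.0                       # air velocity, m/s
P_el = 3.4                     # measured electrical power, W

A = math.pi * (d / 2) ** 2     # swept area, m^2
P_wind = 0.5 * rho * A * v**3  # kinetic power of the air stream (v^3 law)
cp = P_el / P_wind             # power conversion factor, ~0.15
```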

  16. Software for quantitative analysis of radiotherapy: overview, requirement analysis and design solutions.

    PubMed

    Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O

    2013-06-01

    Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of the treatment and support the prediction of outcome. In this paper, we first identify functional, conceptual and general requirements on a software system for quantitative analysis of radiotherapy. Further we present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptual problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via dose iterator pattern; analysis database design). As a proof of concept we developed a software library "RTToolbox" following the presented design principles. The RTToolbox is available as open source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles. PMID:23523366
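
    The "dose iterator pattern" mentioned above decouples analysis algorithms from how a dose distribution is stored: algorithms consume per-voxel doses through a generic iterator interface. A minimal sketch of the idea; the class and function names are hypothetical, not the actual RTToolbox API:

```python
from abc import ABC, abstractmethod
from typing import Iterator


class DoseAccessor(ABC):
    """Abstract source of per-voxel dose values (Gy)."""
    @abstractmethod
    def doses(self) -> Iterator[float]: ...


class InMemoryDose(DoseAccessor):
    """One concrete backend; a file- or DICOM-backed accessor would plug in the same way."""
    def __init__(self, voxels):
        self._voxels = list(voxels)

    def doses(self):
        return iter(self._voxels)


def mean_dose(accessor: DoseAccessor) -> float:
    """Analysis code sees only the iterator, never the storage layout."""
    total, n = 0.0, 0
    for d in accessor.doses():
        total += d
        n += 1
    return total / n


dose = InMemoryDose([1.5, 2.0, 2.5, 2.0])  # toy per-voxel doses in Gy
```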

  17. Technoeconomic analysis of renewable hydrogen production, storage, and detection systems

    SciTech Connect

    Mann, M.K.; Spath, P.L.; Kadam, K.

    1996-10-01

    Technical and economic feasibility studies of different degrees of completeness and detail have been performed on several projects being funded by the Department of Energy's Hydrogen Program. Work this year focused on projects at the National Renewable Energy Laboratory, although analyses of projects at other institutions are underway or planned. Highly detailed analyses were completed on a fiber optic hydrogen leak detector and a process to produce hydrogen from biomass via pyrolysis followed by steam reforming of the pyrolysis oil. Less detailed economic assessments of solar and biologically-based hydrogen production processes have been performed and focused on the steps that need to be taken to improve the competitive position of these technologies. Sensitivity analyses were conducted on all analyses to reveal the degree to which the cost results are affected by market changes and technological advances. For hydrogen storage by carbon nanotubes, a survey of the competing storage technologies was made in order to set a baseline for cost goals. A determination of the likelihood of commercialization was made for nearly all systems examined. Hydrogen from biomass via pyrolysis and steam reforming was found to have significant economic potential if a coproduct option could be co-commercialized. Photoelectrochemical hydrogen production may have economic potential, but only if low-cost cells can be modified to split water and to avoid surface oxidation. The use of bacteria to convert the carbon monoxide in biomass syngas to hydrogen was found to be slightly more expensive than the high end of currently commercial hydrogen, although there are significant opportunities to reduce costs. Finally, the cost of installing a fiber-optic chemochromic hydrogen detection system in passenger vehicles was found to be very low and competitive with alternative sensor systems.

  18. Qualitative and quantitative analysis of volatile constituents from latrines.

    PubMed

    Lin, Jianming; Aoll, Jackline; Niclass, Yvan; Velazco, Maria Inés; Wünsche, Laurent; Pika, Jana; Starkenmann, Christian

    2013-07-16

    More than 2.5 billion people defecate in the open. The increased commitment of private and public organizations to improving this situation is driving the research and development of new technologies for toilets and latrines. Although key technical aspects are considered by researchers when designing new technologies for developing countries, the basic aspect of offending malodors from human waste is often neglected. With the objective of contributing to technical solutions that are acceptable to global consumers, we investigated the chemical composition of latrine malodors sampled in Africa and India. Field latrines in four countries were evaluated olfactively and the odors qualitatively and quantitatively characterized with three analytical techniques. Sulfur compounds including H2S, methyl mercaptan, and dimethyl mono-, di-, and trisulfide are important in sewage-like odors of pit latrines under anaerobic conditions. Under aerobic conditions, in Nairobi for example, paracresol and indole reached concentrations of 89 and 65 μg/g, respectively, which, along with short chain fatty acids such as butyric acid (13 mg/g) explained the strong rancid, manure and farm yard odor. This work represents the first qualitative and quantitative study of volatile compounds sampled from seven pit latrines in a variety of geographic, technical, and economic contexts in addition to three single stools from India and a pit latrine model system. PMID:23829328

  19. Quantitative analysis of radiation-induced changes in sperm morphology.

    PubMed

    Young, I T; Gledhill, B L; Lake, S; Wyrobek, A J

    1982-09-01

    When developing spermatogenic cells are exposed to radiation, chemical carcinogens or mutagens, the transformation in the morphology of the mature sperm can be used to determine the severity of the exposure. In this study five groups of mice with three mice per group received testicular doses of X irradiation at dosage levels ranging from 0 rad to 120 rad. A random sample of 100 mature sperm per mouse was analyzed five weeks later for the quantitative morphologic transformation as a function of dosage level. The cells were stained with gallocyanin chrome alum (GCA) so that only the DNA in the sperm head was visible. The ACUity quantitative microscopy system at Lawrence Livermore National Laboratory was used to scan the sperm at a sampling density of 16 points per linear micrometer and with 256 brightness levels per point. The contour of each cell was extracted using conventional thresholding techniques on the high-contrast images. For each contour a variety of shape features was then computed to characterize the morphology of that cell. Using the control group and the distribution of their shape features to establish the variability of a normal sperm population, the 95% limits on normal morphology were established. Using only four shape features, a doubling dose of approximately 39 rad was determined. That is, at 39 rad exposure the percentage of abnormal cells was twice that occurring in the control population. This compared to a doubling dose of approximately 70 rad obtained from a concurrent visual procedure. PMID:6184000
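
    A doubling dose like the ~39 rad reported here is the dose at which the abnormal-cell fraction reaches twice the control value; with a linear dose-response pct = a*D + b, it is simply D = b/a. A sketch with made-up, idealized data (the study's actual dose-response values are not reproduced here):

```python
def doubling_dose(doses, pct_abnormal):
    """Least-squares fit pct = a*dose + b, then solve a*D + b = 2*b for D = b/a."""
    n = len(doses)
    mx = sum(doses) / n
    my = sum(pct_abnormal) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(doses, pct_abnormal))
    sxx = sum((x - mx) ** 2 for x in doses)
    a = sxy / sxx            # slope: extra % abnormal per rad
    b = my - a * mx          # intercept: control (0 rad) % abnormal
    return b / a             # dose where % abnormal doubles

# Illustrative linear dose-response over the study's 0-120 rad range.
doses = [0, 30, 60, 90, 120]          # rad
pct = [4.0, 7.0, 10.0, 13.0, 16.0]    # % abnormal sperm (made up)
```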

  20. Quantitative phenotypic analysis of multistress response in Zygosaccharomyces rouxii complex.

    PubMed

    Solieri, Lisa; Dakal, Tikam C; Bicciato, Silvio

    2014-06-01

    Zygosaccharomyces rouxii complex comprises three yeasts clusters sourced from sugar- and salt-rich environments: haploid Zygosaccharomyces rouxii, diploid Zygosaccharomyces sapae and allodiploid/aneuploid strains of uncertain taxonomic affiliations. These yeasts have been characterized with respect to gene copy number variation, karyotype variability and change in ploidy, but functional diversity in stress responses has not been explored yet. Here, we quantitatively analysed the stress response variation in seven strains of the Z. rouxii complex by modelling growth variables via model and model-free fitting methods. Based on the spline fit as most reliable modelling method, we resolved different interstrain responses to 15 environmental perturbations. Compared with Z. rouxii CBS 732(T) and Z. sapae strains ABT301(T) and ABT601, allodiploid strain ATCC 42981 and aneuploid strains CBS 4837 and CBS 4838 displayed higher multistress resistance and better performance in glycerol respiration even in the presence of copper. μ-based logarithmic phenotypic index highlighted that ABT601 is a slow-growing strain insensitive to stress, whereas ABT301(T) grows fast on rich medium and is sensitive to suboptimal conditions. Overall, the differences in stress response could imply different adaptation mechanisms to sugar- and salt-rich niches. The obtained phenotypic profiling contributes to provide quantitative insights for elucidating the adaptive mechanisms to stress in halo- and osmo-tolerant Zygosaccharomyces yeasts. PMID:24533625

  1. Quantitative analysis of laminin 5 gene expression in human keratinocytes.

    PubMed

    Akutsu, Nobuko; Amano, Satoshi; Nishiyama, Toshio

    2005-05-01

    To examine the expression of laminin 5 genes (LAMA3, LAMB3, and LAMC2) encoding the three polypeptide chains alpha3, beta3, and gamma2, respectively, in human keratinocytes, we developed novel quantitative polymerase chain reaction (PCR) methods utilizing Thermus aquaticus DNA polymerase, specific primers, and fluorescein-labeled probes with the ABI PRISM 7700 sequence detector system. Gene expression levels of LAMA3, LAMB3, and LAMC2 and glyceraldehyde-3-phosphate dehydrogenase were quantitated reproducibly and sensitively in the range from 1 x 10(2) to 1 x 10(8) gene copies. Basal gene expression level of LAMB3 was about one-tenth of that of LAMA3 or LAMC2 in human keratinocytes, although there was no clear difference among immunoprecipitated protein levels of alpha3, beta3, and gamma2 synthesized in radio-labeled keratinocytes. Human serum augmented gene expressions of LAMA3, LAMB3, and LAMC2 in human keratinocytes to almost the same extent, and this was associated with an increase of the laminin 5 protein content measured by a specific sandwich enzyme-linked immunosorbent assay. These results demonstrate that the absolute mRNA levels generated from the laminin 5 genes do not determine the translated protein levels of the laminin 5 chains in keratinocytes, and indicate that the expression of the laminin 5 genes may be controlled by common regulation mechanisms. PMID:15854126
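
    Quantification over a 1e2-1e8 copy range as described above rests on a standard curve relating Ct (threshold cycle) to log10 of the input copy number. A generic sketch of that inversion, with an idealized curve (slope -3.32 corresponds to 100% PCR efficiency; these are not the paper's fitted values):

```python
def copies_from_ct(ct, slope, intercept):
    """Invert the qPCR standard curve Ct = slope*log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

# Idealized standard curve: Ct = -3.32 * log10(copies) + 38.0
slope, intercept = -3.32, 38.0

# A sample crossing threshold at Ct = 24.72 corresponds to about 1e4 copies.
copies = copies_from_ct(24.72, slope, intercept)
```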

  2. Quantitative Analysis of the Enhanced Permeation and Retention (EPR) Effect

    PubMed Central

    Ulmschneider, Martin B.; Searson, Peter C.

    2015-01-01

    Tumor vasculature is characterized by a variety of abnormalities including irregular architecture, poor lymphatic drainage, and the upregulation of factors that increase the paracellular permeability. The increased permeability is important in mediating the uptake of an intravenously administered drug in a solid tumor and is known as the enhanced permeation and retention (EPR) effect. Studies in animal models have demonstrated a cut-off size of 500 nm - 1 µm for molecules or nanoparticles to extravasate into a tumor; however, surprisingly little is known about the kinetics of the EPR effect. Here we present a pharmacokinetic model to quantitatively assess the influence of the EPR effect on the uptake of a drug into a solid tumor. We use pharmacokinetic data for Doxil and doxorubicin from human clinical trials to illustrate how the EPR effect influences tumor uptake. This model provides a quantitative framework to guide preclinical trials of new chemotherapies and ultimately to develop design rules that can increase targeting efficiency and decrease unwanted side effects in normal tissue. PMID:25938565
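
    The competition the model captures can be sketched as two compartments: drug clears from plasma at rate k_el while extravasating into the tumor at rate k_ep (the EPR term), so tumor uptake depends on how long the drug circulates. The rate constants below are illustrative, not the paper's fitted values for Doxil or doxorubicin:

```python
def tumor_uptake(k_el, k_ep, dose=1.0, t_end=72.0, dt=0.01):
    """Forward-Euler integration of
       dC_p/dt = -(k_el + k_ep) * C_p   (plasma)
       dC_t/dt =  k_ep * C_p            (tumor)
    Returns the tumor amount at t_end as a fraction of dose."""
    cp, ct = dose, 0.0
    t = 0.0
    while t < t_end:
        dcp = -(k_el + k_ep) * cp
        dct = k_ep * cp
        cp += dcp * dt
        ct += dct * dt
        t += dt
    return ct

# A long-circulating formulation (slow clearance, Doxil-like) accumulates
# far more in the tumor than a rapidly cleared free drug, same k_ep.
uptake_slow_clear = tumor_uptake(k_el=0.02, k_ep=0.01)  # rates in 1/h
uptake_fast_clear = tumor_uptake(k_el=0.5, k_ep=0.01)
```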

  3. Quantitative analysis of Babesia ovis infection in sheep and ticks.

    PubMed

    Erster, Oran; Roth, Asael; Wollkomirsky, Ricardo; Leibovich, Benjamin; Savitzky, Igor; Zamir, Shmuel; Molad, Thea; Shkap, Varda

    2016-05-15

    A quantitative PCR, based on the gene encoding Babesia ovis Surface Protein D (BoSPD) was developed and applied to investigate the presence of Babesia ovis (B. ovis) in its principal vector, the tick Rhipicephalus bursa (R. bursa), and in the ovine host. Quantification of B. ovis in experimentally-infected lambs showed a sharp increase in parasitemia 10-11days in blood-inoculated and adult tick-infested lambs, and 24days in a larvae-infested lamb. A gradual decrease of parasitemia was observed in the following months, with parasites detectable 6-12 months post-infection. Examination of the parasite load in adult R. bursa during the post-molting period using the quantitative PCR assay revealed a low parasite load during days 2-7 post-molting, followed by a sharp increase, until day 11, which corresponded to the completion of the pre-feeding period. The assay was then used to detect B. ovis in naturally-infected sheep and ticks. Examination of samples from 8 sheep and 2 goats from infected flocks detected B. ovis in both goats and in 7 out of the 8 sheep. Additionally, B. ovis was detected in 9 tick pools (5 ticks in each pool) and two individual ticks removed from sheep in infected flocks. PMID:27084469

  4. Quantitative analysis of thermal spray deposits using stereology

    SciTech Connect

    Leigh, S.H.; Sampath, S.; Herman, H.; Berndt, C.C.; Montavon, G.; Coddet, C.

    1995-12-31

    Stereology deals with protocols for describing a 3-D space, when only 2-D sections through solid bodies are available. This paper describes a stereological characterization of the microstructure of a thermal spray deposit. The aim of this work is to present results on the stereological characterization of a thermal spray deposit, using two approaches known as DeHoff`s and Cruz-Orive`s protocols. The individual splats are assumed to have an oblate spheroidal shape. The splat size distribution and elongation ratio distribution of splats are calculated using quantitative information from 2-D plane sections. The stereological methods are implemented to investigate the microstructure of a water stabilized plasma spray-formed Al{sub 2}O{sub 3}-13wt.%TiO{sub 2}. Results are obtained with both protocols. The splat sizes range from 0 to 60 {micro}m and shape factors from 0.4 to 1.0. The splats within the deposit seem to be much smaller and thicker (i.e., lower spreading) than those of the first layer deposited onto the substrate. The approach described in this work provides helpful quantitative information on the 3-D microstructure of thermal spray deposit.

  5. ISE Analysis of Hydrogen Sulfide in Cigarette Smoke

    NASA Astrophysics Data System (ADS)

    Li, Guofeng; Polk, Brian J.; Meazell, Liz A.; Hatchett, David W.

    2000-08-01

    Many advanced undergraduate analytical laboratory courses focus on exposing students to various modern instruments. However, students rarely have the opportunity to construct their own analytical tools for solving practical problems. We designed an experiment in which students are required to build their own analytical module, a potentiometric device composed of a Ag/AgCl reference electrode, a Ag/Ag2S ion selective electrode (ISE), and a pH meter used as a voltmeter, to determine the amount of hydrogen sulfide in cigarette smoke. Very simple techniques were developed for constructing these electrodes. Cigarette smoke is collected by a gas washing bottle into a 0.1 M NaOH solution. The amount of sulfide in the cigarette smoke solution is analyzed by standard addition of sulfide solution while monitoring the response of the Ag/Ag2S ISE. The collected data are further evaluated using the Gran plot technique to determine the concentration of sulfide in the cigarette smoke solution. The experiment has been successfully incorporated into the lab course Instrumental Analysis at Georgia Institute of Technology. Students enjoy the idea of constructing an analytical tool themselves and applying their classroom knowledge to solve real-life problems. While learning electrochemistry, they also get a chance to visualize the health hazard posed by cigarette smoking.
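
    The Gran plot evaluation works by linearizing the Nernstian ISE response: for additions Vs of a standard Cs to sample volume V0, the Gran function G = (V0 + Vs) * 10^(E/S) is proportional to C0*V0 + Cs*Vs, so its x-intercept gives the sample concentration. A sketch with synthetic data (S = -29.6 mV/decade is the ideal slope assumed for the divalent sulfide ion; none of these numbers are from the experiment):

```python
import math

V0 = 50.0        # sample volume, mL
Cs = 1.0e-3      # standard sulfide concentration, M
S = -29.6        # electrode slope, mV per decade (assumed Nernstian)

C0_true = 1.0e-4                  # "unknown" used only to synthesize data
adds = [0.0, 1.0, 2.0, 3.0, 4.0]  # added standard volumes, mL

# Synthetic potentials from the Nernst equation (E0 arbitrarily set to 0 mV).
E = [S * math.log10((C0_true * V0 + Cs * v) / (V0 + v)) for v in adds]

# Gran function: linear in Vs, proportional to C0*V0 + Cs*Vs.
G = [(V0 + v) * 10 ** (e / S) for v, e in zip(adds, E)]

# Least-squares line through (Vs, G); the x-intercept recovers C0.
n = len(adds)
mx = sum(adds) / n
my = sum(G) / n
sxy = sum((x - mx) * (y - my) for x, y in zip(adds, G))
sxx = sum((x - mx) ** 2 for x in adds)
slope = sxy / sxx
intercept = my - slope * mx
Vs_star = -intercept / slope      # x-intercept (negative), mL
C0 = -Vs_star * Cs / V0           # recovered sample concentration, M
```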

  6. U.S. Geographic Analysis of the Cost of Hydrogen from Electrolysis

    SciTech Connect

    Saur, G.; Ainscough, C.

    2011-12-01

    This report summarizes a U.S. geographic analysis of the cost of hydrogen from electrolysis. Wind-based water electrolysis represents a viable path to renewably produced hydrogen, which might be used for hydrogen-based transportation fuels, for energy storage to augment electricity grid services, or as a supplement for other industrial hydrogen uses. This analysis focuses on the levelized cost of producing green hydrogen, rather than market prices, which would require more extensive knowledge of an hourly or daily hydrogen market. However, the costs of hydrogen presented here do include a small profit from an internal rate of return on the system. The cost of renewable wind-based hydrogen production is very sensitive to the cost of the wind electricity. Using differently priced grid electricity to supplement the system had only a small effect on the cost of hydrogen, because wind electricity was always used, either directly or indirectly, to fully generate the hydrogen. Wind classes 3-6 across the U.S. were examined, and the costs of hydrogen ranged from $3.74/kg to $5.86/kg. These costs do not quite meet the 2015 DOE targets for central or distributed hydrogen production ($3.10/kg and $3.70/kg, respectively), so more work is needed on reducing the cost of wind electricity and of the electrolyzers. If the production tax credit (PTC) and investment tax credit (ITC) are claimed, however, many of the sites will meet both targets. For a subset of distributed refueling stations where there is also inexpensive, open space nearby, this could be an alternative to central hydrogen production and distribution.
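
    The levelized-cost framing can be illustrated with a toy calculation: annualize the capital with a capital recovery factor, add O&M and electricity, and divide by annual output. All numbers below (capital cost, electricity use, plant size, rates) are hypothetical placeholders, not figures from the report:

```python
def levelized_cost_h2(capex_usd, om_usd_per_yr, elec_kwh_per_kg,
                      elec_usd_per_kwh, kg_per_yr, rate=0.10, years=20):
    """Levelized hydrogen cost in $/kg: annualized capital + O&M + electricity."""
    # Capital recovery factor converts upfront capex into an equivalent annuity.
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
    annual = (capex_usd * crf + om_usd_per_yr
              + elec_kwh_per_kg * elec_usd_per_kwh * kg_per_yr)
    return annual / kg_per_yr

# Hypothetical 1,000 kg/day electrolysis plant at 90% capacity factor,
# 55 kWh/kg and $0.05/kWh wind electricity.
lcoh = levelized_cost_h2(capex_usd=5e6, om_usd_per_yr=0.2e6,
                         elec_kwh_per_kg=55.0, elec_usd_per_kwh=0.05,
                         kg_per_yr=1000 * 365 * 0.9)
print(lcoh)  # roughly $5/kg with these assumptions
```

    With these placeholder inputs the electricity term ($2.75/kg) dominates, which is consistent with the report's observation that the result is very sensitive to the wind electricity price.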

  7. Theoretical analysis of hydrogen spillover mechanism on carbon nanotubes

    PubMed Central

    Juarez-Mosqueda, Rosalba; Mavrandonakis, Andreas; Kuc, Agnieszka B.; Pettersson, Lars G. M.; Heine, Thomas

    2015-01-01

    The spillover mechanism of molecular hydrogen on carbon nanotubes in the presence of catalytically active platinum clusters was critically and systematically investigated by using density-functional theory. Our simulation model includes a Pt4 cluster for the catalyst nanoparticle and curved and planar circumcoronene for two exemplary single-walled carbon nanotubes (CNTs): the (10,10) CNT and one of large diameter, respectively. Our results show that the H2 molecule dissociates spontaneously on the Pt4 cluster. However, the dissociated H atoms have to overcome a barrier of more than 2 eV to migrate from the catalyst to the CNT, even if the Pt4 cluster is at full saturation with six adsorbed and dissociated hydrogen molecules. Previous investigations have shown that the mobility of hydrogen atoms on the CNT surface is hindered by a barrier. We find that instead the Pt4 catalyst may move along the outer surface of the CNT with an activation energy of only 0.16 eV, and that this effect offers the possibility of full hydrogenation of the CNT. Thus, although we have not found a low-energy pathway to spillover onto the CNT, we suggest, based on our calculations and calculated data reported in the literature, that in the hydrogen-spillover process the observed saturation of the CNT at hydrogen background pressure occurs through mobile Pt nanoclusters, which move on the substrate more easily than the substrate-chemisorbed hydrogens, and deposit or reattach hydrogens in the process. Initial hydrogenation of the carbon substrate, however, is thermodynamically unfavoured, suggesting that defects should play a significant role. PMID:25699250

  8. Bridging the gaps for global sustainable development: a quantitative analysis.

    PubMed

    Udo, Victor E; Jansson, Peter Mark

    2009-09-01

    Global human progress occurs in a complex web of interactions between society, technology and the environment, driven by governance and infrastructure management capacity among nations. In our globalizing world, this complex web of interactions over the last 200 years has resulted in the chronic widening of economic and political gaps between the haves and the have-nots, with consequential global cultural and ecosystem challenges. At the bottom of these challenges is the issue of resource limitations on our finite planet with an increasing population. The problem is further compounded by pleasure-driven and poverty-driven ecological depletion and pollution by the haves and the have-nots, respectively. This paper explores these challenges quantitatively as global sustainable development (SD), in order to assess the gaps that need to be bridged. Although there has been significant rhetoric on SD, with very many qualitative definitions offered, very few quantitative definitions of SD exist. The few that do exist tend to measure SD in terms of social, energy, economic and environmental dimensions. In our research, we used several human survival, development, and progress variables to create an aggregate SD parameter that describes the capacity of nations in three dimensions: social sustainability, environmental sustainability and technological sustainability. Using our proposed quantitative definition of SD and data from relatively reputable secondary sources, 132 nations were ranked and compared. Our comparisons indicate a global hierarchy of needs among nations similar to Maslow's at the individual level. As in Maslow's hierarchy of needs, nations that are struggling to survive are less concerned with environmental sustainability than advanced and stable nations. Nations such as the United States, Canada, Finland, Norway and others have higher SD capacity, and thus, are higher on their hierarchy of needs than nations such as Nigeria, Vietnam, Mexico and other

  9. A survey and analysis of experimental hydrogen sensors

    NASA Technical Reports Server (NTRS)

    Hunter, Gary W.

    1992-01-01

    In order to ascertain the applicability of hydrogen sensors to aerospace applications, a survey of promising experimental point-contact hydrogen sensors was conducted and their operation analyzed. The techniques discussed are metal-oxide-semiconductor (MOS) based sensors, catalytic resistor sensors, acoustic wave detectors, and pyroelectric detectors. All of these sensors depend on the interaction of hydrogen with Pd or a Pd alloy. It is concluded that no single technique will meet the needs of aerospace applications; rather, a combination of approaches is necessary. The most promising combination is an MOS-based sensor with a catalytic resistor.

  10. Hydrogen Fueling Station in Honolulu, Hawaii Feasibility Analysis

    SciTech Connect

    Porter Hill; Michael Penev

    2014-08-01

    The Department of Energy Hydrogen & Fuel Cells Program Plan (September 2011) identifies the use of hydrogen for government and fleet electric vehicles as a key step for achieving “reduced greenhouse gas emissions; reduced oil consumption; expanded use of renewable power …; highly efficient energy conversion; fuel flexibility …; reduced air pollution; and highly reliable grid-support.” This report synthesizes several pieces of existing information that can inform a decision regarding the viability of deploying a hydrogen (H2) fueling station at the Fort Armstrong site in Honolulu, Hawaii.

  11. Quantitative aspects of ESR and spin trapping of hydroxyl radicals and hydrogen atoms in gamma-irradiated aqueous solutions

    SciTech Connect

    Carmichael, A.J.; Makino, K.; Riesz, P.

    1984-11-01

    The efficiencies of 5,5-dimethyl-1-pyrroline N-oxide (DMPO) and α-(4-pyridyl-1-oxide)-N-tert-butylnitrone (POBN) in spin trapping hydroxyl radicals and hydrogen atoms, respectively, were studied in γ-irradiated solutions where the radical yields are accurately known. The effects of dose, spin trap concentration, and pH, and of the stability of the spin adducts, on the spin-trapping efficiency were investigated. In degassed or N2-saturated solutions the spin-trapping efficiencies were 35% for DMPO and hydroxyl radicals and 14% for POBN and hydrogen atoms. The low spin-trapping efficiency of DMPO may be explained by the reaction of hydroxyl radicals abstracting hydrogen from the DMPO molecule to produce carbon radicals, in addition to their addition to the N=C double bond to form nitroxide radicals. For POBN, the low spin-trapping efficiency for hydrogen atoms is explained in terms of addition reactions of hydrogen atoms to the aromatic ring and the pyridinium and nitrone oxygens.
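
    A trapping efficiency of this kind is the ratio of measured spin adduct to the radiolytic radical yield, which follows from the absorbed dose and the radiation chemical yield (G-value). A sketch using the textbook G(OH) for deaerated water and a hypothetical measured adduct concentration (not a value from the study):

```python
# 1 molecule per 100 eV of absorbed energy corresponds to ~0.1036 umol/J.
MOL_PER_100EV_TO_UMOL_PER_J = 0.1036

def radical_conc_mol_l(dose_gy, g_per_100ev, density_kg_l=1.0):
    """Radiolytic radical concentration from absorbed dose (1 Gy = 1 J/kg)."""
    umol_per_kg = dose_gy * g_per_100ev * MOL_PER_100EV_TO_UMOL_PER_J
    return umol_per_kg * density_kg_l * 1e-6   # mol/L

# Example: 100 Gy dose, G(OH) ~ 2.7 molecules/100 eV (textbook value for
# deaerated water); a hypothetical 9.8 uM DMPO-OH adduct then implies ~35%.
oh_generated = radical_conc_mol_l(100.0, 2.7)
efficiency = 9.8e-6 / oh_generated
print(oh_generated, efficiency)
```

    The same arithmetic, run in reverse, is how an ESR-quantified adduct concentration is converted into the efficiency percentages quoted in the abstract.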

  12. [Quantitative analysis of alloy steel based on laser induced breakdown spectroscopy with partial least squares method].

    PubMed

    Cong, Zhi-Bo; Sun, Lan-Xiang; Xin, Yong; Li, Yang; Qi, Li-Feng; Yang, Zhi-Jia

    2014-02-01

    In the present paper, both the partial least squares (PLS) method and the calibration curve (CC) method are used to quantitatively analyze laser-induced breakdown spectroscopy (LIBS) data obtained from standard alloy steel samples. Both major and trace elements were quantitatively analyzed. Comparing the results of the two calibration methods yields some useful conclusions: for major elements, the PLS method is better than the CC method in quantitative analysis; more importantly, for trace elements, the CC method cannot give quantitative results because of their extremely weak characteristic spectral lines, whereas the PLS method retains a good quantitative capability. The regression coefficients of the PLS method are compared with the original spectral data, including background interference, to explain the advantage of the PLS method in LIBS quantitative analysis. The results prove that the PLS method is suitable for the quantitative analysis of trace elements, such as C, by laser-induced breakdown spectroscopy in the metallurgical industry. PMID:24822436

  13. Quantitative analysis on electric dipole energy in Rashba band splitting

    NASA Astrophysics Data System (ADS)

    Hong, Jisook; Rhim, Jun-Won; Kim, Changyoung; Ryong Park, Seung; Hoon Shim, Ji

    2015-09-01

    We report a quantitative comparison between the electric dipole energy and the Rashba band splitting in model systems of Bi and Sb triangular monolayers under a perpendicular electric field. We used both first-principles and tight-binding calculations on p-orbitals with spin-orbit coupling. The first-principles calculations show Rashba band splitting in both systems. They also show asymmetric charge distributions in the Rashba-split bands, which are induced by the orbital angular momentum. We calculated the electric dipole energies from the coupling of the asymmetric charge distribution to the external electric field and compared them to the Rashba splitting. Remarkably, the total splitting energy is found to come mostly from the difference in the electric dipole energy for both the Bi and Sb systems. A perturbative approach in the long-wavelength limit, starting from the tight-binding calculation, also supports the conclusion that the Rashba band splitting originates mostly from the electric dipole energy difference in the strong atomic spin-orbit coupling regime.

  14. Quantitative analysis of CT scans of ceramic candle filters

    SciTech Connect

    Ferer, M.V.; Smith, D.H.

    1996-12-31

    Candle filters are being developed to remove coal ash and other fine particles (<15 µm) from hot (ca. 1000 K) gas streams. In the present work, a color scanner was used to digitize hard-copy CT X-ray images of cylindrical SiC filters, and linear regressions converted the scanned (color) data to a filter density for each pixel. These data, with the aid of the density of SiC, gave a filter porosity for each pixel. Radial averages, density-density correlation functions, and other statistical analyses were performed on the density data. The CT images also detected the presence and depth of cracks that developed during usage of the filters. The quantitative data promise to be a very useful addition to the color images.
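
    The pixel-wise conversion from a calibrated density map to porosity is a one-liner given the theoretical density of SiC (about 3.21 g/cm^3). The small density patch below is invented for illustration:

```python
import numpy as np

RHO_SIC = 3.21  # g/cm^3, theoretical density of SiC

def porosity_map(density_map_g_cm3):
    """Per-pixel porosity from a calibrated CT density map: phi = 1 - rho/rho_solid."""
    phi = 1.0 - np.asarray(density_map_g_cm3, dtype=float) / RHO_SIC
    return np.clip(phi, 0.0, 1.0)   # guard against calibration overshoot

# Hypothetical 3x3 patch of pixel densities (g/cm^3)
patch = [[2.1, 2.2, 2.0],
         [2.3, 1.9, 2.2],
         [2.1, 2.2, 2.4]]
phi = porosity_map(patch)
print(phi.mean())   # mean porosity of the patch, ~0.33 here
```

    Radial averages and density-density correlations, as used in the paper, would then be computed on maps like `phi` rather than on the raw color values.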

  15. Quantitative Analysis of Cancer Metastasis using an Avian Embryo Model

    PubMed Central

    Palmer, Trenis D.; Lewis, John; Zijlstra, Andries

    2011-01-01

    During metastasis cancer cells disseminate from the primary tumor, invade into surrounding tissues, and spread to distant organs. Metastasis is a complex process that can involve many tissue types, span variable time periods, and often occur deep within organs, making it difficult to investigate and quantify. In addition, the efficacy of the metastatic process is influenced by multiple steps in the metastatic cascade, making it difficult to evaluate the contribution of a single aspect of tumor cell behavior. As a consequence, metastasis assays are frequently performed in experimental animals to provide a necessarily realistic context in which to study metastasis. Unfortunately, these models are further complicated by their complex physiology. The chick embryo is a unique in vivo model that overcomes many limitations to studying metastasis, due to the accessibility of the chorioallantoic membrane (CAM), a well-vascularized extra-embryonic tissue located underneath the eggshell that is receptive to the xenografting of tumor cells (figure 1). Moreover, since the chick embryo is naturally immunodeficient, the CAM readily supports the engraftment of both normal and tumor tissues. Most importantly, the avian CAM successfully supports most cancer cell characteristics including growth, invasion, angiogenesis, and remodeling of the microenvironment. This makes the model exceptionally useful for the investigation of the pathways that lead to cancer metastasis and to predict the response of metastatic cancer to new potential therapeutics. The detection of disseminated cells by species-specific Alu PCR makes it possible to quantitatively assess metastasis in organs that are colonized by as few as 25 cells. Using the human epidermoid carcinoma cell line HEp3, we apply this model to analyze spontaneous metastasis of cancer cells to distant organs, including the chick liver and lung.
Furthermore, using the Alu-PCR protocol we demonstrate the sensitivity and reproducibility of the

  16. Quantitative error analysis for computer assisted navigation: a feasibility study

    PubMed Central

    Güler, Ö.; Perwög, M.; Kral, F.; Schwarm, F.; Bárdosi, Z. R.; Göbel, G.; Freysinger, W.

    2013-01-01

    Purpose The benefit of computer-assisted navigation depends on the registration process, at which patient features are correlated to some preoperative imagery. The operator-induced uncertainty in localizing patient features - the User Localization Error (ULE) - is unknown and most likely dominates the application accuracy. This initial feasibility study aims at providing first data for the ULE with a research navigation system. Methods Active optical navigation was done in CT images of a plastic skull, an anatomic specimen (both with implanted fiducials) and a volunteer with anatomical landmarks exclusively. Each object was registered ten times with 3, 5, 7, and 9 registration points. Measurements were taken at 10 (anatomic specimen and volunteer) and 11 targets (plastic skull). The active NDI Polaris system was used under ideal working conditions (tracking accuracy 0.23 mm root mean square, RMS; probe tip calibration 0.18 mm RMS). Variances of tracking along the principal directions were measured as 0.18 mm², 0.32 mm², and 0.42 mm². The ULE was calculated from predicted application accuracy with isotropic and anisotropic models and from experimental variances, respectively. Results The ULE was determined from the variances as 0.45 mm (plastic skull), 0.60 mm (anatomic specimen), and 4.96 mm (volunteer). The predicted application accuracy did not yield consistent values for the ULE. Conclusions Quantitative data of application accuracy could be tested against prediction models with iso- and anisotropic noise models and revealed some discrepancies. This could potentially be due to the facts that navigation and one prediction model wrongly assume isotropic noise (tracking is anisotropic), while the anisotropic noise prediction model assumes an anisotropic registration strategy (registration is isotropic in typical navigation systems). The ULE data are presumably the first quantitative values for the precision of localizing anatomical landmarks and implanted fiducials
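
    One plausible reading of "ULE calculated from experimental variances" is a quadrature subtraction of the known instrument noise terms from the measured overall spread, assuming independent error sources. This decomposition is our assumption, and the overall spread used below is hypothetical, not one of the paper's measurements:

```python
from math import sqrt

def user_localization_error(sigma_total_mm, sigma_tracking_mm, sigma_tip_mm):
    """Isolate the user-induced error assuming independent Gaussian sources:
    sigma_total^2 = sigma_tracking^2 + sigma_tip^2 + ULE^2 (an assumed model)."""
    ule_sq = sigma_total_mm ** 2 - sigma_tracking_mm ** 2 - sigma_tip_mm ** 2
    if ule_sq < 0:
        raise ValueError("measured spread is below the instrument noise floor")
    return sqrt(ule_sq)

# System figures quoted above (0.23 mm tracking RMS, 0.18 mm tip calibration)
# combined with a hypothetical overall localization spread of 0.55 mm:
print(user_localization_error(0.55, 0.23, 0.18))
```

    The point of the subtraction is that the residual after removing instrument terms is attributable to the operator, which is exactly the quantity the study set out to estimate.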

  17. Two-dimensional Raman mole-fraction and temperature measurements for hydrogen-nitrogen mixture analysis.

    PubMed

    Braeuer, Andreas; Leipertz, Alfred

    2009-02-01

    A two-dimensional laser Raman technique was developed and applied to directly probe the population number of selected rotational and vibrational energy levels of hydrogen and nitrogen. Using three cameras simultaneously, temperature and mole fraction images could be detected. Three different combinations of rotational and vibrational Raman signals of hydrogen and nitrogen were analyzed to identify the combination that is most suitable for future mixture analysis in hydrogen internal combustion engines. Here the experiments were conducted in an injection chamber where hot hydrogen was injected into room temperature nitrogen at 1.1 MPa. PMID:19183582
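
    Raman thermometry of this kind rests on the Boltzmann distribution: the ratio of two level populations, probed via their Raman line intensities, fixes the temperature. A minimal two-level sketch with hypothetical level energies and degeneracies (not the actual hydrogen/nitrogen level data):

```python
from math import exp, log

K_B = 8.617333e-5  # Boltzmann constant, eV/K

def two_level_temperature(n1, n2, g1, g2, e1_ev, e2_ev):
    """Temperature from the Boltzmann population ratio of two levels (e2 > e1):
    N1/N2 = (g1/g2) * exp(-(E1 - E2) / kT)."""
    return (e2_ev - e1_ev) / (K_B * log((n1 * g2) / (n2 * g1)))

# Round trip at 500 K with hypothetical rotational levels
T = 500.0
e1, e2, g1, g2 = 0.00, 0.05, 5, 9
n1 = g1 * exp(-e1 / (K_B * T))
n2 = g2 * exp(-e2 / (K_B * T))
print(two_level_temperature(n1, n2, g1, g2, e1, e2))  # recovers ~500 K
```

    In the actual measurement many rotational lines are available, so the temperature is typically obtained from a Boltzmann-plot fit over all probed levels rather than from a single pair.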

  18. Technoeconomic analysis of different options for the production of hydrogen from sunlight, wind, and biomass

    SciTech Connect

    Mann, M.K.; Spath, P.L.; Amos, W.A.

    1998-08-01

    To determine their technical and economic viability and to provide insight into where each technology is in its development cycle, different options to produce hydrogen from sunlight, wind, and biomass were studied. Additionally, costs for storing and transporting hydrogen were determined for different hydrogen quantities and storage times. The analysis of hydrogen from sunlight examined the selling price of hydrogen from two technologies: direct photoelectrochemical (PEC) conversion of sunlight and photovoltaic (PV)-generated electricity production followed by electrolysis. The wind analysis was based on wind-generated electricity production followed by electrolysis. In addition to the base case analyses, which assume that hydrogen is the sole product, three alternative scenarios explore the economic impact of integrating the PV- and wind-based systems with the electric utility grid. Results show that PEC hydrogen production has the potential to be economically feasible. Additionally, the economics of the PV and wind electrolysis systems are improved by interaction with the grid. The analysis of hydrogen from biomass focused on three gasification technologies. The systems are: low pressure, indirectly-heated gasification followed by steam reforming; high pressure, oxygen-blown gasification followed by steam reforming; and pyrolysis followed by partial oxidation. For each of the systems studied, the downstream process steps include shift conversion followed by hydrogen purification. Only the low pressure system produces hydrogen within the range of the current industry selling prices (typically $0.7–$2/kg, or $5–$14/GJ on an HHV basis). A sensitivity analysis showed that, for the other two systems, in order to bring the hydrogen selling price down to $2/kg, negative-priced feedstocks would be required.
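
    The quoted $/kg and $/GJ ranges are linked by hydrogen's higher heating value, roughly 141.9 MJ/kg:

```python
HHV_H2_GJ_PER_KG = 0.1419  # higher heating value of hydrogen, ~141.9 MJ/kg

def usd_per_kg_to_usd_per_gj(price_per_kg):
    """Convert a hydrogen price from $/kg to $/GJ on an HHV basis."""
    return price_per_kg / HHV_H2_GJ_PER_KG

for p in (0.7, 2.0):
    print(f"${p}/kg -> ${usd_per_kg_to_usd_per_gj(p):.1f}/GJ (HHV)")
```

    Running this reproduces the abstract's parenthetical range: $0.7/kg is about $4.9/GJ and $2/kg about $14.1/GJ.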

  19. How resonance assists hydrogen bonding interactions: an energy decomposition analysis.

    PubMed

    Beck, John Frederick; Mo, Yirong

    2007-01-15

    The block-localized wave function (BLW) method, a variant of ab initio valence bond (VB) theory, was employed to explore the nature of resonance-assisted hydrogen bonds (RAHBs) and to investigate the mechanism of synergistic interplay between pi delocalization and hydrogen-bonding interactions. We examined the dimers of formic acid, formamide, 4-pyrimidinone, 2-pyridinone, 2-hydroxypyridine, and 2-hydroxycyclopenta-2,4-dien-1-one. In addition, we studied the interactions in beta-diketone enols with a simplified model, namely the hydrogen bonds of 3-hydroxypropenal with both ethenol and formaldehyde. The intermolecular interaction energies, either with or without the involvement of pi resonance, were decomposed into Heitler-London energy (ΔEHL), polarization energy (ΔEpol), charge transfer energy (ΔECT), and electron correlation energy (ΔEcor) terms. This allows for the examination of the character of hydrogen bonds and of the impact of pi conjugation on hydrogen-bonding interactions. Although it has been proposed that resonance-assisted hydrogen bonds are accompanied by an increase in covalent character, our analyses showed that the enhanced interactions mostly originate from the classical dipole-dipole (i.e., electrostatic) attraction, as resonance redistributes the electron density and increases the dipole moments in the monomers. The covalency of the hydrogen bonds, however, changes very little. This disputes the belief that RAHB is primarily covalent in nature. Accordingly, we recommend the term "resonance-assisted binding (RAB)" instead of "resonance-assisted hydrogen bonding (RAHB)" to highlight the electrostatic (long-range) rather than electron-transfer nature of the enhanced stabilization in RAHBs. PMID:17143867

  20. Response Neighborhoods in Online Learning Networks: A Quantitative Analysis

    ERIC Educational Resources Information Center

    Aviv, Reuven; Erlich, Zippy; Ravid, Gilad

    2005-01-01

    The theoretical foundations of response mechanisms in networks of online learners are revealed by statistical analysis of p* Markov models for the networks. Our comparative analysis of two networks shows that the minimal-effort hunt-for-social-capital mechanism controls a major behavior of both networks: a negative tendency to respond. Differences in…

  1. [Research progress of quantitative analysis for respiratory sinus arrhythmia].

    PubMed

    Sun, Congcong; Zhang, Zhengbo; Wang, Buqing; Liu, Hongyun; Ang, Qing; Wang, Weidong

    2011-12-01

    Respiratory sinus arrhythmia (RSA) refers to the fluctuations of heart rate associated with breathing. It has recently been used increasingly as a noninvasive index of cardiac vagal tone in psychophysiological research. Its analysis is often influenced or distorted by respiratory parameters, posture, movement, etc. This paper reviews five methods of quantification: the root mean square of successive differences (RMSSD), peak-valley RSA (pvRSA), cosinor fitting, spectral analysis, and joint time-frequency analysis (JTFA). Paced breathing, analysis of covariance, the residual method, and msRSA per liter tidal volume are also discussed as strategies for adjusting the measurement and analysis of RSA. Finally, some prospects for solving open problems in RSA research are given. PMID:22295719
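
    Two of the reviewed quantification methods, RMSSD and peak-valley RSA, reduce to a few lines over a beat-interval series. The RR values and breath segmentation below are illustrative, not clinical data:

```python
from math import sqrt

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences (ms)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return sqrt(sum(d * d for d in diffs) / len(diffs))

def pv_rsa(rr_ms, breath_bounds):
    """Peak-valley RSA: mean (longest - shortest) RR interval per breath.

    breath_bounds are (start, stop) index pairs delimiting each breath cycle.
    """
    spans = [max(rr_ms[a:b]) - min(rr_ms[a:b]) for a, b in breath_bounds]
    return sum(spans) / len(spans)

rr = [800, 820, 850, 830, 790, 780, 810, 845, 860, 820]  # illustrative RR series
print(rmssd(rr))                      # ~28.6 ms
print(pv_rsa(rr, [(0, 5), (5, 10)]))  # 70.0 ms over two breaths
```

    RMSSD needs no respiratory signal at all, whereas pvRSA requires the breath segmentation, which is one reason the review treats respiratory parameters as a confound to be controlled.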

  2. Analysis of Hydrogenated Ti-Zr-Ni alloys.

    NASA Astrophysics Data System (ADS)

    Majzoub, E. H.; Viano, A. M.; Kelton, K. F.; Yelon, W. B.; Goldman, A. I.

    1996-03-01

    The polytetrahedral order in quasicrystals deserves close attention for pragmatic reasons. It is well known that hydrogen atoms prefer to sit in tetrahedral interstitial sites in transition metals and their alloys. Since the number of tetrahedral interstices is presumed to be large in quasicrystals, it is interesting to see how much hydrogen can be stored in them. Additionally, hydrogen in quasicrystals can be of great help in locating tetrahedral sites, since neutron diffraction of a loaded sample will determine the positions of the hydrogen atoms. We present results of elastic neutron scattering studies of deuterated Ti45Zr38Ni17. Samples were deuterated using the gas-phase method; samples were hydrogenated both by the gas-phase method and by electrochemical means. Differential scanning calorimetry (DSC) was performed on loaded samples to determine the stability of the phases formed and to estimate transition enthalpies. Approximate binding energies of hydrogen in the host alloy were determined using the DSC enthalpies and are shown to be in the range of 1 eV/H atom for Ti45Zr38Ni17 alloys. Chemical variations in the alloy Ti45-xVxZr38Ni17 are shown to affect this binding energy quite dramatically.

  3. Quantitative material analysis by dual-energy computed tomography for industrial NDT applications

    NASA Astrophysics Data System (ADS)

    Nachtrab, F.; Weis, S.; Keßling, P.; Sukowski, F.; Haßler, U.; Fuchs, T.; Uhlmann, N.; Hanke, R.

    2011-05-01

    Dual-energy computed tomography (DECT) is an established method in the field of medical CT to obtain quantitative information on a material of interest instead of mean attenuation coefficients only. In the field of industrial X-ray imaging, dual-energy techniques have been used to solve special problems on a case-by-case basis rather than as a standard tool. Our goal is to develop an easy-to-use dual-energy solution that can be handled by the average industrial operator without the need for a specialist. We are aiming at providing dual-energy CT as a measurement tool for those cases where qualitative images are not enough and one needs additional quantitative information (e.g. mass density ρ and atomic number Z) about the sample at hand. Our solution is based on an algorithm proposed by Heismann et al. (2003) [1] for application in medical CT. As input data this algorithm needs two CT data sets, one with a low (LE) and one with a high (HE) effective energy. A first-order linearization is applied to the raw data, and two volumes are reconstructed thereafter. The dual-energy analysis is done voxel by voxel, using a pre-calculated function F(Z) that incorporates the parameters of the low- and high-energy measurements (such as tube voltage, filtration and detector sensitivity). As a result, two volume data sets are obtained, one providing information about the mass density ρ in each voxel, the other providing the effective atomic number Z of the material therein. One main difference between medical and industrial CT is that the range of materials that can be contained in a sample is much wider and can cover the whole range of elements, from hydrogen to uranium. Heismann's algorithm is limited to the range of elements Z=1-30, because for Z>30 the function F(Z) as given by Heismann is not a bijective function anymore. While this still seems very suitable for medical application, it is not enough to cover the complete range of industrial applications.
We therefore investigated the
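
    A voxel-wise ρ-Z decomposition in the spirit of Heismann's algorithm can be sketched as a table inversion: the LE/HE attenuation ratio is a monotone function F(Z), inverted by interpolation, after which ρ follows from the HE attenuation and the mass attenuation at that Z. The tabulated values below are invented placeholders, not the spectrum-weighted functions from the paper:

```python
import numpy as np

# Hypothetical pre-tabulated, spectrum-weighted quantities versus Z:
# F(Z) = mu_LE/mu_HE ratio, and the HE mass-attenuation coefficient (cm^2/g).
Z_TAB = np.array([6.0, 8.0, 13.0, 20.0, 26.0])
F_TAB = np.array([1.10, 1.18, 1.45, 1.90, 2.40])       # monotone in Z
MU_MASS_HE = np.array([0.18, 0.19, 0.20, 0.25, 0.30])

def rho_z(mu_le, mu_he):
    """Voxel-wise (rho, Z) from low/high-energy attenuation coefficients (1/cm)."""
    z = np.interp(mu_le / mu_he, F_TAB, Z_TAB)      # invert the monotone F(Z)
    rho = mu_he / np.interp(z, Z_TAB, MU_MASS_HE)   # g/cm^3 from mu = rho * (mu/rho)
    return rho, z

# A voxel with roughly aluminum-like behaviour (attenuation values hypothetical):
rho, z = rho_z(mu_le=0.78, mu_he=0.538)
print(rho, z)   # close to rho ~ 2.7 g/cm^3, Z ~ 13 with these tables
```

    The non-bijectivity problem the authors mention corresponds to F_TAB ceasing to be monotone above Z≈30, at which point this simple interpolation-based inversion breaks down.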

  4. Quantitative analysis of numerical solvers for oscillatory biomolecular system models

    PubMed Central

    Quo, Chang F; Wang, May D

    2008-01-01

    Background This article provides guidelines for selecting optimal numerical solvers for biomolecular system models. Because various parameters of the same system could have drastically different ranges from 10^-15 to 10^10, the ODEs can be stiff and ill-conditioned, resulting in non-unique, non-existing, or non-reproducible modeling solutions. Previous studies have not examined in depth how to best select numerical solvers for biomolecular system models, which makes it difficult to experimentally validate the modeling results. To address this problem, we have chosen one of the well-known stiff initial value problems with limit cycle behavior as a test-bed system model. Solving this model, we have illustrated that different answers may result from different numerical solvers. We use MATLAB numerical solvers because they are optimized and widely used by the modeling community. We have also conducted a systematic study of numerical solver performances by using qualitative and quantitative measures such as convergence, accuracy, and computational cost (i.e. in terms of function evaluation, partial derivative, LU decomposition, and "take-off" points). The results show that the modeling solutions can be drastically different using different numerical solvers. Thus, it is important to intelligently select numerical solvers when solving biomolecular system models. Results The classic Belousov-Zhabotinskii (BZ) reaction is described by the Oregonator model and is used as a case study. We report two guidelines in selecting optimal numerical solver(s) for stiff, complex oscillatory systems: (i) for problems with unknown parameters, ode45 is the optimal choice regardless of the relative error tolerance; (ii) for known stiff problems, both ode113 and ode15s are good choices under strict relative tolerance conditions. 
Conclusions For any given biomolecular model, by building a library of numerical solvers with quantitative performance assessment metric, we show that it is possible
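
    The Oregonator test problem can be reproduced outside MATLAB; here SciPy's `solve_ivp` with the implicit Radau method stands in for the stiff MATLAB solvers the study benchmarks (the solver choice and tolerances below are a generic illustration, not the study's setup):

```python
import numpy as np
from scipy.integrate import solve_ivp

def oregonator(t, u):
    """Classic stiff Field-Noyes Oregonator model of the BZ reaction."""
    x, y, z = u
    return [77.27 * (y - x * y + x - 8.375e-6 * x * x),
            (-y - x * y + z) / 77.27,
            0.161 * (x - z)]

# A stiff (implicit) solver integrates the limit cycle with modest step
# counts; an explicit method like RK45 needs vastly more function
# evaluations on the same span, which is the point the article quantifies.
sol = solve_ivp(oregonator, (0.0, 30.0), [1.0, 2.0, 3.0],
                method="Radau", rtol=1e-6, atol=1e-9)
print(sol.success, sol.nfev)
```

    Comparing `sol.nfev` between `method="Radau"` and `method="RK45"` on this model is a quick way to see the computational-cost gap the authors measure between stiff and non-stiff solver families.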

  5. ImatraNMR: novel software for batch integration and analysis of quantitative NMR spectra.

    PubMed

    Mäkelä, A V; Heikkilä, O; Kilpeläinen, I; Heikkinen, S

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to the traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing a large number of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently been tested only with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request. PMID:21705250
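
    The batch-integration workflow such software automates, integrating the same ppm regions across many spectra and exporting the areas to CSV, can be sketched with NumPy. The spectrum names, peak shapes, and regions below are made up for illustration:

```python
import csv
import numpy as np

def batch_integrate(spectra, ppm_axis, regions):
    """Integrate fixed ppm regions across many 1D spectra (trapezoid rule)."""
    rows = []
    for name, intensity in spectra.items():
        row = {"spectrum": name}
        for lo, hi in regions:
            m = (ppm_axis >= lo) & (ppm_axis <= hi)
            x, y = ppm_axis[m], intensity[m]
            # Manual trapezoid rule; abs() keeps areas positive even if the
            # ppm axis is stored in descending order, as is common in NMR.
            row[f"{lo}-{hi} ppm"] = abs(np.sum((y[1:] + y[:-1]) / 2 * np.diff(x)))
        rows.append(row)
    return rows

# Two synthetic spectra sharing a Gaussian peak centred at 2.0 ppm
ppm = np.linspace(0.0, 10.0, 2001)
peak = np.exp(-((ppm - 2.0) ** 2) / (2 * 0.05 ** 2))
rows = batch_integrate({"sample_a": peak, "sample_b": 2.0 * peak},
                       ppm, [(1.5, 2.5)])
with open("integrals.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
print(rows[1]["1.5-2.5 ppm"] / rows[0]["1.5-2.5 ppm"])  # area ratio ~2.0
```

    The CSV output mirrors the article's design choice: flat text tables let the integrals flow straight into spreadsheets or Matlab for the downstream quantitative analysis.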

  6. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    NASA Astrophysics Data System (ADS)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to the traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing a large number of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently been tested only with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.

  7. Quantitative assessment of human motion using video motion analysis

    NASA Technical Reports Server (NTRS)

    Probe, John D.

    1990-01-01

    In the study of the dynamics and kinematics of the human body, a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for the analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video-based motion analysis systems to emerge as a cost-effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video-based Ariel Performance Analysis System to develop data on shirt-sleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. The system is described.

  8. Space-to-Ground Communication for Columbus: A Quantitative Analysis

    PubMed Central

    Uhlig, Thomas; Mannel, Thurid; Fortunato, Antonio; Illmer, Norbert

    2015-01-01

    The astronauts on board the International Space Station (ISS) are only the most visible part of a much larger team engaged around the clock in the performance of science and technical activities in space. The bulk of this team is scattered around the globe in five major Mission Control Centers (MCCs), as well as in a number of smaller payload operations centres. Communication between the crew in space and the flight controllers at those locations is an essential element and one of the key drivers of efficient space operations. Such communication can be carried out in different forms, depending on available technical assets and the selected operational approach for the activity at hand. This paper focuses on operational voice communication and provides a quantitative overview of the balance achieved in the Columbus program between collaborative space/ground operations and autonomous on-board activity execution. An interpretation of the current situation is provided, together with a description of potential future approaches for deep space exploration missions. PMID:26290898

  9. Quantitative analysis on electric dipole energy in Rashba band splitting.

    PubMed

    Hong, Jisook; Rhim, Jun-Won; Kim, Changyoung; Ryong Park, Seung; Hoon Shim, Ji

    2015-01-01

    We report a quantitative comparison between the electric dipole energy and the Rashba band splitting in model systems of Bi and Sb triangular monolayers under a perpendicular electric field. We used both first-principles and tight-binding calculations on p-orbitals with spin-orbit coupling. The first-principles calculation shows Rashba band splitting in both systems. It also shows asymmetric charge distributions in the Rashba-split bands, which are induced by the orbital angular momentum. We calculated the electric dipole energies from the coupling of the asymmetric charge distribution and the external electric field, and compared them to the Rashba splitting. Remarkably, the total split energy is found to come mostly from the difference in the electric dipole energy for both the Bi and Sb systems. A perturbative approach in the long-wavelength limit, starting from the tight-binding calculation, also supports that the Rashba band splitting originates mostly from the electric dipole energy difference in the strong atomic spin-orbit coupling regime. PMID:26323493
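
    The paper's result rests on first-principles and tight-binding calculations; as a minimal illustration of what "Rashba band splitting" means, the sketch below (not the authors' code, and with arbitrary parameter values) diagonalizes the textbook two-band Rashba Hamiltonian H = (hbar^2 k^2 / 2m) I + alpha (sigma_x k_y - sigma_y k_x), whose two branches are split by 2*alpha*|k|.

```python
import numpy as np

def rashba_bands(kx, ky, alpha=1.0, hbar2_over_2m=1.0):
    """Eigenvalues of H = (hbar^2 k^2/2m)*I + alpha*(sigma_x*ky - sigma_y*kx)."""
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
    k2 = kx ** 2 + ky ** 2
    H = hbar2_over_2m * k2 * np.eye(2) + alpha * (sx * ky - sy * kx)
    return np.linalg.eigvalsh(H)   # ascending order

# Splitting at |k| = 0.3 with alpha = 0.5 should be 2 * 0.5 * 0.3 = 0.3
lo, hi = rashba_bands(0.3, 0.0, alpha=0.5)
splitting = hi - lo
```

The paper's point is that, in the strong spin-orbit regime, this splitting energy is accounted for almost entirely by the electric dipole energy difference of the two asymmetric charge distributions.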

  10. Quantitative analysis on electric dipole energy in Rashba band splitting

    PubMed Central

    Hong, Jisook; Rhim, Jun-Won; Kim, Changyoung; Ryong Park, Seung; Hoon Shim, Ji

    2015-01-01

    We report a quantitative comparison between the electric dipole energy and the Rashba band splitting in model systems of Bi and Sb triangular monolayers under a perpendicular electric field. We used both first-principles and tight-binding calculations on p-orbitals with spin-orbit coupling. The first-principles calculation shows Rashba band splitting in both systems. It also shows asymmetric charge distributions in the Rashba-split bands, which are induced by the orbital angular momentum. We calculated the electric dipole energies from the coupling of the asymmetric charge distribution and the external electric field, and compared them to the Rashba splitting. Remarkably, the total split energy is found to come mostly from the difference in the electric dipole energy for both the Bi and Sb systems. A perturbative approach in the long-wavelength limit, starting from the tight-binding calculation, also supports that the Rashba band splitting originates mostly from the electric dipole energy difference in the strong atomic spin-orbit coupling regime. PMID:26323493

  11. Quantitative analysis of electroluminescence images from polymer solar cells

    NASA Astrophysics Data System (ADS)

    Seeland, Marco; Rösch, Roland; Hoppe, Harald

    2012-01-01

    We introduce the micro-diode model (MDM), based on a discrete network of interconnected diodes, which allows for a quantitative description of lateral electroluminescence emission images obtained from organic bulk-heterojunction solar cells. Besides the distributed solar cell description, the equivalent-circuit (network) model considers interface and bulk resistances as well as the sheet resistance of the semitransparent electrode. The application of this model allows direct calculation of the lateral current and voltage distribution within the solar cell and thus accounts well for effects known as current crowding. In addition, network parameters such as internal resistances and the sheet resistance of the more resistive electrode can be determined. Furthermore, upon introduction of current sources, the micro-diode model is also able to describe and predict current-voltage characteristics for solar cell devices under illumination. The local nature of this description yields important conclusions concerning the geometry-dependent performance and the validity of classical models and equivalent circuits describing thin-film solar cells.
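
    As a hedged, linearized one-dimensional caricature of such a diode network (not the authors' implementation, and with hypothetical parameter values): nodes along the semitransparent electrode are linked by the sheet resistance, and each node is shunted to the back contact through a small-signal diode conductance. Solving the resulting tridiagonal ladder shows the node voltage, and hence the local electroluminescence emission, decaying away from the contact, which is the current-crowding effect described above.

```python
def node_voltages(n=50, r_sheet=1.0, g_diode=0.05, v_contact=1.0):
    """1D ladder: neighbouring nodes linked by r_sheet, every node shunted
    to the back electrode by a (linearized) diode conductance g_diode.
    Node 0 is held at the contact voltage; solved with the Thomas algorithm."""
    g = 1.0 / r_sheet
    # tridiagonal system: sub (a), main (b), super (c) diagonals, rhs (d)
    a = [0.0] + [-g] * (n - 1)
    b = [1.0] + [2 * g + g_diode] * (n - 2) + [g + g_diode]
    c = [0.0] + [-g] * (n - 2) + [0.0]
    d = [v_contact] + [0.0] * (n - 1)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):            # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    v = [0.0] * n
    v[-1] = dp[-1]
    for i in range(n - 2, -1, -1):   # back substitution
        v[i] = dp[i] - cp[i] * v[i + 1]
    return v

v = node_voltages()
# v decays monotonically with distance from the contact (current crowding)
```

The full MDM is nonlinear (each shunt is an exponential diode, solved self-consistently); the linearized ladder only reproduces the qualitative voltage decay.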

  12. Quantitative analysis of ultrasound images for computer-aided diagnosis.

    PubMed

    Wu, Jie Ying; Tuomi, Adam; Beland, Michael D; Konrad, Joseph; Glidden, David; Grand, David; Merck, Derek

    2016-01-01

    We propose an adaptable framework for analyzing ultrasound (US) images quantitatively to provide computer-aided diagnosis using machine learning. Our preliminary clinical targets are hepatic steatosis, adenomyosis, and craniosynostosis. For steatosis and adenomyosis, we collected US studies from 288 and 88 patients, respectively, as well as their biopsy- or magnetic resonance-confirmed diagnoses. Radiologists identified a region of interest (ROI) on each image. We filtered the US images for various texture responses and used the pixel intensity distribution within each ROI as feature parameterizations. Our craniosynostosis dataset consisted of 22 CT-confirmed cases and 22 age-matched controls. One physician manually measured the vectors from the center of the skull to the outer cortex at every 10 degrees for each image, and we used the principal directions as shape features for parameterization. These parameters and the known diagnoses were used to train classifiers. Testing with cross-validation, we obtained 72.74% accuracy and 0.71 area under the receiver operating characteristic curve for steatosis ([Formula: see text]), 77.27% and 0.77 for adenomyosis ([Formula: see text]), and 88.63% and 0.89 for craniosynostosis ([Formula: see text]). Our framework is able to detect a variety of diseases with high accuracy. We hope to include it as a routinely available support system in the clinic. PMID:26835502

  13. Quantitative proteomic analysis of the Salmonella-lettuce interaction

    PubMed Central

    Zhang, Yuping; Nandakumar, Renu; Bartelt-Hunt, Shannon L; Snow, Daniel D; Hodges, Laurie; Li, Xu

    2014-01-01

    Human pathogens can be internalized into food crops through root and surface uptake and can persist inside crop plants. The goal of this study was to elucidate the global modulation of bacterial and plant protein expression after Salmonella internalization in lettuce. A quantitative proteomic approach was used to analyse the protein expression of Salmonella enterica serovar Infantis and lettuce cultivar Green Salad Bowl 24 h after infiltrating S. Infantis into lettuce leaves. Among the 50 differentially expressed proteins identified by comparing internalized S. Infantis against S. Infantis grown in Luria Broth, proteins involved in glycolysis were down-regulated, while one protein involved in ascorbate uptake was up-regulated. Stress response proteins, especially antioxidant proteins, were up-regulated. The modulation in protein expression suggested that internalized S. Infantis might utilize ascorbate as a carbon source and require multiple stress response proteins to cope with stresses encountered in plants. On the other hand, among the 20 differentially expressed lettuce proteins, proteins involved in the defense response to bacteria were up-regulated. Moreover, the secreted effector PipB2 of S. Infantis and R proteins of lettuce were induced after bacterial internalization into lettuce leaves, indicating that the human pathogen S. Infantis triggered the defense mechanisms of lettuce, which normally respond to plant pathogens. PMID:24512637

  14. Quantitative analysis of pheromone-binding protein specificity

    PubMed Central

    Katti, S.; Lokhande, N.; González, D.; Cassill, A.; Renthal, R.

    2012-01-01

    Many pheromones have very low water solubility, posing experimental difficulties for quantitative binding measurements. A new method is presented for determining thermodynamically valid dissociation constants for ligands binding to pheromone-binding proteins (OBPs), using β-cyclodextrin as a solubilizer and transfer agent. The method is applied to LUSH, a Drosophila OBP that binds the pheromone 11-cis vaccenyl acetate (cVA). Refolding of LUSH expressed in E. coli was assessed by measuring N-phenyl-1-naphthylamine (NPN) binding and Förster resonance energy transfer between LUSH tryptophan 123 (W123) and NPN. Binding of cVA was measured from quenching of W123 fluorescence as a function of cVA concentration. The equilibrium constant for transfer of cVA between β-cyclodextrin and LUSH was determined from a linked-equilibria model. This constant, multiplied by the β-cyclodextrin-cVA dissociation constant, gives the LUSH-cVA dissociation constant: ~100 nM. It was also found that other ligands quench W123 fluorescence. The LUSH-ligand dissociation constants were determined to be ~200 nM for the silk moth pheromone bombykol and ~90 nM for methyl oleate. The results indicate that the ligand-binding cavity of LUSH can accommodate a variety of ligands with strong binding interactions. Implications of this for the pheromone receptor model proposed by Laughlin et al. (Cell 133: 1255-65, 2008) are discussed. PMID:23121132
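
    The key arithmetic step is the linked-equilibria relation quoted above: the LUSH-cVA dissociation constant is the transfer equilibrium constant multiplied by the β-cyclodextrin-cVA dissociation constant. The sketch below encodes just that product; the two input values are hypothetical, chosen only so that the result reproduces the reported ~100 nM, since the abstract does not give the individual constants.

```python
def lush_kd_nM(k_transfer, kd_betaCD_nM):
    """Linked equilibria as stated in the abstract:
    Kd(LUSH-cVA) = K_transfer * Kd(betaCD-cVA).
    With this convention, K_transfer describes transfer of the ligand
    from the protein into beta-cyclodextrin."""
    return k_transfer * kd_betaCD_nM

# Hypothetical inputs (illustration only): a transfer constant of 0.01
# and a cyclodextrin Kd of 10 uM multiply out to the reported ~100 nM.
kd = lush_kd_nM(k_transfer=0.01, kd_betaCD_nM=10000.0)
```

The point of the trick is that β-cyclodextrin keeps the poorly soluble pheromone in solution, while the product of the two measurable constants recovers the thermodynamically valid protein-ligand Kd.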

  15. Quantitative Dopant/Impurity Analysis for ICF Targets

    NASA Astrophysics Data System (ADS)

    Huang, Haibo; Nikroo, Abbas; Stephens, Richard; Eddinger, Samual; Xu, Hongwei; Chen, K. C.; Moreno, Kari

    2008-11-01

    We developed a number of new or improved metrology techniques to measure the spatial distributions of multiple elements in ICF ablator capsules to tight NIF specifications (0.5±0.1 at% Cu, 0.25±0.10 at% Ar, 0.4±0.4 at% O). The elements are either the graded dopants for shock timing, such as Cu in Be, or process-induced impurities, such as Ar and O. Their low concentration, high spatial variation and simultaneous presence make the measurement very difficult. We solved this metrology challenge by combining several techniques: Cu and Ar profiles can be nondestructively measured by operating Contact Radiography (CR) in a differential mode. The result, as well as the O profile, can be checked destructively by a quantitative Energy Dispersive Spectroscopy (EDS) method. Non-spatially resolved methods, such as absorption edge spectroscopy (and to a lesser accuracy, x-ray fluorescence) can calibrate the Ar and Cu measurement in EDS and CR. In addition, oxygen pick-up during mandrel removal can be validated by before-and-after CR and by density change. Use of all these methods gives multiple checks on the reported profiles.

  16. Analysis of alpha-synuclein-associated proteins by quantitative proteomics.

    PubMed

    Zhou, Yong; Gu, Guangyu; Goodlett, David R; Zhang, Terry; Pan, Catherine; Montine, Thomas J; Montine, Kathleen S; Aebersold, Ruedi H; Zhang, Jing

    2004-09-10

    To identify the proteins associated with soluble alpha-synuclein (AS) that might promote AS aggregation, a key event leading to neurodegeneration, we quantitatively compared protein profiles of AS-associated protein complexes in MES cells exposed to rotenone, a pesticide that produces parkinsonism in animals and induces Lewy body (LB)-like inclusions in the remaining dopaminergic neurons, and to vehicle. We identified more than 250 proteins associated with Nonidet P-40 soluble AS, and demonstrated that at least 51 of these proteins displayed significant differences in their relative abundance in AS complexes under conditions where rotenone was cytotoxic and induced formation of cytoplasmic inclusions immunoreactive to anti-AS. Overexpressing one of these proteins, heat shock protein (hsp) 70, not only protected cells from rotenone-mediated cytotoxicity but also decreased soluble AS aggregation. Furthermore, the protection afforded by hsp70 transfection appeared to be related to suppression of rotenone-induced oxidative stress as well as mitochondrial and proteasomal dysfunction. PMID:15234983

  17. Quantitative analysis of virus and plasmid trafficking in cells

    NASA Astrophysics Data System (ADS)

    Lagache, Thibault; Dauty, Emmanuel; Holcman, David

    2009-01-01

    Intracellular transport of DNA carriers is a fundamental step of gene delivery. By combining theoretical and numerical approaches, we study here the trafficking of single and multiple viruses and DNA particles through the cell cytoplasm to a small nuclear pore. We present a physical model to account for certain aspects of cellular organization, starting with the observation that a viral trajectory consists of epochs of pure diffusion and epochs of active transport along microtubules. We define a general degradation rate to describe the limitations that various types of direct and indirect hydrolysis activity inside the cytoplasm impose on the delivery of plasmid or viral particles to a nuclear pore. By replacing the switching dynamics with a single steady-state stochastic description, we obtain estimates for the probability and the mean time for the first of many particles to go from the cell membrane to a small nuclear pore. Computational simulations confirm that our model can be used to analyze and interpret viral trajectories and to estimate quantitatively the success of nuclear delivery.
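
    The competition between transport and degradation described above can be illustrated with a much-reduced one-dimensional analogue (not the paper's model): a particle diffusing with diffusivity D between a pore at x = 0 and a reflecting membrane at x = L, subject to first-order degradation at rate k. The probability u(x) of reaching the pore before degradation solves D u'' = k u with u(0) = 1 and u'(L) = 0, giving a closed form. All parameter values below are hypothetical.

```python
import math

def reach_probability(x, L, D, k):
    """Probability that a particle starting at x reaches the pore at 0
    before first-order degradation at rate k, with a reflecting boundary
    at L.  Solves D*u'' = k*u, u(0)=1, u'(L)=0:
        u(x) = cosh(lam*(L - x)) / cosh(lam*L),  lam = sqrt(k / D)."""
    lam = math.sqrt(k / D)
    return math.cosh(lam * (L - x)) / math.cosh(lam * L)

p_weak = reach_probability(x=5.0, L=10.0, D=1.0, k=0.001)
p_strong = reach_probability(x=5.0, L=10.0, D=1.0, k=1.0)
# stronger degradation sharply lowers delivery probability
```

Even this toy formula captures the paper's qualitative conclusion: cytoplasmic hydrolysis activity limits nuclear delivery, and the probability of success decays rapidly as the degradation rate grows relative to the transport rate.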

  18. Analysis of copy number variation using quantitative interspecies competitive PCR.

    PubMed

    Williams, Nigel M; Williams, Hywel; Majounie, Elisa; Norton, Nadine; Glaser, Beate; Morris, Huw R; Owen, Michael J; O'Donovan, Michael C

    2008-10-01

    Over recent years, small submicroscopic DNA copy-number variants (CNVs) have been highlighted as an important source of variation in the human genome, human phenotypic diversity and disease susceptibility. Consequently, there is a pressing need for the development of methods that allow the efficient, accurate and cheap measurement of genomic copy number polymorphisms in clinical cohorts. We have developed a simple competitive PCR-based method to determine DNA copy number, which uses the entire genome of a single chimpanzee as a competitor, thus eliminating the requirement for competitive sequences to be synthesized for each assay. This results in the requirement for only a single reference sample for all assays and dramatically increases the potential for large numbers of loci to be analysed in multiplex. In this study we establish proof of concept by accurately detecting previously characterized mutations at the PARK2 locus and then demonstrating the potential of quantitative interspecies competitive PCR (qicPCR) to accurately genotype CNVs in association studies by analysing chromosome 22q11 deletions in a sample of previously characterized patients and normal controls. PMID:18697816
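
    The ratio logic of a competitive assay like this can be sketched in a few lines: the human:chimpanzee product ratio of a test sample is normalized against the mean ratio of controls of known copy number. This is an illustration of the general principle only; the normalization scheme and all numbers are assumptions, not the authors' published calibration.

```python
def copy_number(test_ratio, control_ratios, control_copies=2):
    """Estimate copy number from the human:chimpanzee competitive-PCR
    product ratio, normalized against diploid (two-copy) controls."""
    mean_control = sum(control_ratios) / len(control_ratios)
    return control_copies * test_ratio / mean_control

# Hypothetical 22q11 deletion carrier: roughly half as much human product
cn = copy_number(test_ratio=0.51, control_ratios=[0.98, 1.02, 1.00])
```

Because the chimpanzee genome supplies a competitor at every locus simultaneously, the same single reference sample serves every assay, which is what makes large multiplexed panels practical.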

  19. Quantitative proteomic analysis of the Salmonella-lettuce interaction.

    PubMed

    Zhang, Yuping; Nandakumar, Renu; Bartelt-Hunt, Shannon L; Snow, Daniel D; Hodges, Laurie; Li, Xu

    2014-11-01

    Human pathogens can be internalized into food crops through root and surface uptake and can persist inside crop plants. The goal of this study was to elucidate the global modulation of bacterial and plant protein expression after Salmonella internalization in lettuce. A quantitative proteomic approach was used to analyse the protein expression of Salmonella enterica serovar Infantis and lettuce cultivar Green Salad Bowl 24 h after infiltrating S. Infantis into lettuce leaves. Among the 50 differentially expressed proteins identified by comparing internalized S. Infantis against S. Infantis grown in Luria Broth, proteins involved in glycolysis were down-regulated, while one protein involved in ascorbate uptake was up-regulated. Stress response proteins, especially antioxidant proteins, were up-regulated. The modulation in protein expression suggested that internalized S. Infantis might utilize ascorbate as a carbon source and require multiple stress response proteins to cope with stresses encountered in plants. On the other hand, among the 20 differentially expressed lettuce proteins, proteins involved in the defense response to bacteria were up-regulated. Moreover, the secreted effector PipB2 of S. Infantis and R proteins of lettuce were induced after bacterial internalization into lettuce leaves, indicating that the human pathogen S. Infantis triggered the defense mechanisms of lettuce, which normally respond to plant pathogens. PMID:24512637

  20. Quantitative Analysis of CME Deflections in the Corona

    NASA Astrophysics Data System (ADS)

    Gui, Bin; Shen, Chenglong; Wang, Yuming; Ye, Pinzhong; Liu, Jiajia; Wang, Shui; Zhao, Xuepu

    2011-07-01

    In this paper, ten CME events viewed by the STEREO twin spacecraft are analyzed to study the deflections of CMEs during their propagation in the corona. Based on the three-dimensional information of the CMEs derived by the graduated cylindrical shell (GCS) model (Thernisien, Howard, and Vourlidas in Astrophys. J. 652, 1305, 2006), it is found that the propagation directions of eight CMEs had changed. By applying the theoretical method proposed by Shen et al. ( Solar Phys. 269, 389, 2011) to all the CMEs, we found that the deflections are consistent, in strength and direction, with the gradient of the magnetic energy density. There is a positive correlation between the deflection rate and the strength of the magnetic energy density gradient and a weak anti-correlation between the deflection rate and the CME speed. Our results suggest that the deflections of CMEs are mainly controlled by the background magnetic field and can be quantitatively described by the magnetic energy density gradient (MEDG) model.

  1. Analysis of quantitative trait loci for behavioral laterality in mice.

    PubMed Central

    Roubertoux, Pierre L; Le Roy, Isabelle; Tordjman, Sylvie; Cherfou, Améziane; Migliore-Samour, Danièle

    2003-01-01

    Laterality is believed to have genetic components, as has been deduced from family studies in humans and responses to artificial selection in mice, but these genetic components are unknown and the underlying physiological mechanisms are still a subject of dispute. We measured direction of laterality (preferential use of left or right paws) and degree of laterality (absolute difference between the use of left and right paws) in C57BL/6ByJ (B) and NZB/BlNJ (N) mice and in their F(1) and F(2) intercrosses. Measurements were taken of both forepaws and hind paws. Quantitative trait loci (QTL) did not emerge for direction but did for degree of laterality. One QTL for forepaw (LOD score = 5.6) and the second QTL for hind paw (LOD score = 7.2) were both located on chromosome 4 and their peaks were within the same confidence interval. A QTL for plasma luteinizing hormone concentration was also found in the confidence interval of these two QTL. These results suggest that the physiological mechanisms underlying degree of laterality react to gonadal steroids. PMID:12663540
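
    A LOD score such as the 5.6 and 7.2 reported above is the base-10 logarithm of a likelihood ratio, so a LOD of 7.2 corresponds to odds of 10^7.2 in favor of a QTL at that position. Under a standard normal-model interval-mapping approximation (not necessarily the exact method used in this study), the score can be computed from residual sums of squares; the mouse count and variance figures below are hypothetical.

```python
import math

def lod_from_rss(n, rss_null, rss_qtl):
    """Normal-model QTL scan approximation:
    LOD = (n / 2) * log10(RSS_null / RSS_QTL),
    where n is the number of individuals and RSS_null / RSS_QTL are the
    residual sums of squares without / with the QTL genotype term."""
    return (n / 2.0) * math.log10(rss_null / rss_qtl)

# Hypothetical example: 200 F2 mice, modest variance explained by genotype
lod = lod_from_rss(n=200, rss_null=118.0, rss_qtl=100.0)
```

The formula makes clear why LOD scores grow with both sample size and the fraction of phenotypic variance the locus explains.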

  2. Quantitative Proteome Analysis of Leishmania donovani under Spermidine Starvation

    PubMed Central

    Singh, Shalini; Dubey, Vikash Kumar

    2016-01-01

    We have earlier reported antileishmanial activity of hypericin through spermidine starvation. In the current report, we have used a label-free proteome quantitation approach to identify differentially modulated proteins after hypericin treatment. A total of 141 proteins were found to be differentially regulated, with ANOVA P values less than 0.05, in hypericin-treated Leishmania promastigotes. The differentially modulated proteins have been broadly classified under nine major categories. An increase in ribosomal protein S7 suggests repression of translation. Inhibition of proteins related to the ubiquitin-proteasome system, an RNA-binding protein and a translation initiation factor also suggests altered translation. We have also observed increased expression of Hsp 90, Hsp 83–1 and stress-inducible protein 1, together with a significantly decreased level of cyclophilin. These stress-related proteins could be a cellular response of the parasite to hypericin-induced cellular stress. Altered expression of proteins in the corresponding pathways also pointed to defects in metabolism, nucleic acid biosynthesis and replication, flagellar movement and signalling in the parasite. The data were analyzed rigorously to gain further insight into hypericin-induced parasite death. PMID:27123864

  3. Quantitative dual-probe microdialysis: mathematical model and analysis.

    PubMed

    Chen, Kevin C; Höistad, Malin; Kehr, Jan; Fuxe, Kjell; Nicholson, Charles

    2002-04-01

    Steady-state microdialysis is a widely used technique to monitor the concentration changes and distributions of substances in tissues. To obtain more information about brain tissue properties from microdialysis, a dual-probe approach was applied to infuse and sample the radiotracer, [3H]mannitol, simultaneously both in agar gel and in the rat striatum. Because the molecules released by one probe and collected by the other must diffuse through the interstitial space, the concentration profile exhibits dynamic behavior that permits the assessment of the diffusion characteristics in the brain extracellular space and the clearance characteristics. In this paper a mathematical model for dual-probe microdialysis was developed to study brain interstitial diffusion and clearance processes. Theoretical expressions for the spatial distribution of the infused tracer in the brain extracellular space and the temporal concentration at the probe outlet were derived. A fitting program was developed using the simplex algorithm, which finds local minima of the standard deviations between experiments and theory by adjusting the relevant parameters. The theoretical curves accurately fitted the experimental data and generated realistic diffusion parameters, implying that the mathematical model is capable of predicting the interstitial diffusion behavior of [3H]mannitol and that it will be a valuable quantitative tool in dual-probe microdialysis. PMID:12067242
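
    The fitting idea described above can be illustrated with a much simpler stand-in model (not the paper's full dual-probe solution): a steady-state point source with first-order clearance gives C(r) = Q/(4*pi*D*r) * exp(-r*sqrt(k/D)), and diffusion and clearance parameters can be recovered by minimizing the squared error against measured concentrations. The paper used the simplex algorithm; a plain grid search keeps this sketch dependency-free. All parameter values and probe distances are hypothetical.

```python
import math

def conc(r, Q, D, k):
    """Steady-state concentration at distance r from a point source of
    strength Q, with diffusivity D and first-order clearance rate k."""
    return Q / (4.0 * math.pi * D * r) * math.exp(-r * math.sqrt(k / D))

# Synthetic "measurements" generated from known parameters
D_true, k_true, Q = 5e-6, 1e-4, 1.0
radii = [0.02, 0.05, 0.1]
data = [conc(r, Q, D_true, k_true) for r in radii]

# Grid search over (D, k), minimizing the sum of squared errors
D_grid = [1e-6, 2e-6, 5e-6, 1e-5, 2e-5]
k_grid = [1e-5, 5e-5, 1e-4, 5e-4, 1e-3]
best = None
for D in D_grid:
    for k in k_grid:
        err = sum((conc(r, Q, D, k) - c) ** 2 for r, c in zip(radii, data))
        if best is None or err < best[0]:
            best = (err, D, k)
_, D_fit, k_fit = best
```

With noise-free synthetic data the search recovers the generating parameters exactly; real dual-probe data additionally require the time-dependent solution and a proper optimizer, as in the paper.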

  4. Space-to-Ground Communication for Columbus: A Quantitative Analysis.

    PubMed

    Uhlig, Thomas; Mannel, Thurid; Fortunato, Antonio; Illmer, Norbert

    2015-01-01

    The astronauts on board the International Space Station (ISS) are only the most visible part of a much larger team engaged around the clock in the performance of science and technical activities in space. The bulk of this team is scattered around the globe in five major Mission Control Centers (MCCs), as well as in a number of smaller payload operations centres. Communication between the crew in space and the flight controllers at those locations is an essential element and one of the key drivers of efficient space operations. Such communication can be carried out in different forms, depending on available technical assets and the selected operational approach for the activity at hand. This paper focuses on operational voice communication and provides a quantitative overview of the balance achieved in the Columbus program between collaborative space/ground operations and autonomous on-board activity execution. An interpretation of the current situation is provided, together with a description of potential future approaches for deep space exploration missions. PMID:26290898

  5. Temporal kinetics and quantitative analysis of Cryptococcus neoformans nonlytic exocytosis.

    PubMed

    Stukes, Sabriya A; Cohen, Hillel W; Casadevall, Arturo

    2014-05-01

    Cryptococcus neoformans is a facultative intracellular pathogen and the causative agent of cryptococcosis, a disease that is often fatal to those with compromised immune systems. C. neoformans has the capacity to escape phagocytic cells through a process known as nonlytic exocytosis whereby the cryptococcal cell is released from the macrophage into the extracellular environment, leaving both the host and pathogen alive. Little is known about the mechanism behind nonlytic exocytosis, but there is evidence that both the fungal and host cells contribute to the process. In this study, we used time-lapse movies of C. neoformans-infected macrophages to delineate the kinetics and quantitative aspects of nonlytic exocytosis. We analyzed approximately 800 macrophages containing intracellular C. neoformans and identified 163 nonlytic exocytosis events that were further characterized into three subcategories: type I (complete emptying of macrophage), type II (partial emptying of macrophage), and type III (cell-to-cell transfer). The majority of type I and II events occurred after several hours of intracellular residence, whereas type III events occurred significantly (P < 0.001) earlier in the course of macrophage infection. Our results show that nonlytic exocytosis is a morphologically and temporally diverse process that occurs relatively rapidly in the course of macrophage infection. PMID:24595144

  6. Quantitative analysis of synaptic release at the photoreceptor synapse.

    PubMed

    Duncan, Gabriel; Rabl, Katalin; Gemp, Ian; Heidelberger, Ruth; Thoreson, Wallace B

    2010-05-19

    Exocytosis from the rod photoreceptor is stimulated by submicromolar Ca(2+) and exhibits an unusually shallow dependence on presynaptic Ca(2+). To provide a quantitative description of the photoreceptor Ca(2+) sensor for exocytosis, we tested a family of conventional and allosteric computational models describing the final Ca(2+)-binding steps leading to exocytosis. Simulations were fit to two measures of release, evoked by flash-photolysis of caged Ca(2+): exocytotic capacitance changes from individual rods and postsynaptic currents of second-order neurons. The best simulations supported the occupancy of only two Ca(2+) binding sites on the rod Ca(2+) sensor rather than the typical four or five. For most models, the on-rates for Ca(2+) binding and maximal fusion rate were comparable to those of other neurons. However, the off-rates for Ca(2+) unbinding were unexpectedly slow. In addition to contributing to the high-affinity of the photoreceptor Ca(2+) sensor, slow Ca(2+) unbinding may support the fusion of vesicles located at a distance from Ca(2+) channels. In addition, partial sensor occupancy due to slow unbinding may contribute to the linearization of the first synapse in vision. PMID:20483317
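
    The finding that only two Ca2+ binding sites fit the data connects directly to the "unusually shallow" dependence mentioned above: with independent, identical sites, release scales as the n-th power of site occupancy, so fewer sites give a shallower response to a change in Ca2+. The minimal occupancy model below is an illustration with hypothetical numbers, not the paper's fitted kinetic scheme.

```python
def release_rate(ca, K, n, max_rate=1.0):
    """Release proportional to the probability that all n independent
    Ca-binding sites (dissociation constant K) are occupied."""
    occ = ca / (ca + K)
    return max_rate * occ ** n

# Fold-change in release for a doubling of Ca (0.5 -> 1.0 uM), K = 5 uM:
fold_2_sites = release_rate(1.0, 5.0, 2) / release_rate(0.5, 5.0, 2)
fold_5_sites = release_rate(1.0, 5.0, 5) / release_rate(0.5, 5.0, 5)
# the 2-site sensor responds far more shallowly than a 5-site sensor
```

This shallow, near-linear behavior is what the authors suggest helps linearize the first synapse in vision.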

  7. Quantitative Analysis of Synaptic Release at the Photoreceptor Synapse

    PubMed Central

    Duncan, Gabriel; Rabl, Katalin; Gemp, Ian; Heidelberger, Ruth; Thoreson, Wallace B.

    2010-01-01

    Exocytosis from the rod photoreceptor is stimulated by submicromolar Ca2+ and exhibits an unusually shallow dependence on presynaptic Ca2+. To provide a quantitative description of the photoreceptor Ca2+ sensor for exocytosis, we tested a family of conventional and allosteric computational models describing the final Ca2+-binding steps leading to exocytosis. Simulations were fit to two measures of release, evoked by flash-photolysis of caged Ca2+: exocytotic capacitance changes from individual rods and postsynaptic currents of second-order neurons. The best simulations supported the occupancy of only two Ca2+ binding sites on the rod Ca2+ sensor rather than the typical four or five. For most models, the on-rates for Ca2+ binding and maximal fusion rate were comparable to those of other neurons. However, the off-rates for Ca2+ unbinding were unexpectedly slow. In addition to contributing to the high-affinity of the photoreceptor Ca2+ sensor, slow Ca2+ unbinding may support the fusion of vesicles located at a distance from Ca2+ channels. In addition, partial sensor occupancy due to slow unbinding may contribute to the linearization of the first synapse in vision. PMID:20483317

  8. Quantitative analysis of task selection for brain-computer interfaces

    NASA Astrophysics Data System (ADS)

    Llera, Alberto; Gómez, Vicenç; Kappen, Hilbert J.

    2014-10-01

    Objective. To assess quantitatively the impact of task selection in the performance of brain-computer interfaces (BCI). Approach. We consider the task-pairs derived from multi-class BCI imagery movement tasks in three different datasets. We analyze for the first time the benefits of task selection on a large-scale basis (109 users) and evaluate the possibility of transferring task-pair information across days for a given subject. Main results. Selecting the subject-dependent optimal task-pair among three different imagery movement tasks results in approximately 20% potential increase in the number of users that can be expected to control a binary BCI. The improvement is observed with respect to the best task-pair fixed across subjects. The best task-pair selected for each subject individually during a first day of recordings is generally a good task-pair in subsequent days. In general, task learning from the user side has a positive influence in the generalization of the optimal task-pair, but special attention should be given to inexperienced subjects. Significance. These results add significant evidence to existing literature that advocates task selection as a necessary step towards usable BCIs. This contribution motivates further research focused on deriving adaptive methods for task selection on larger sets of mental tasks in practical online scenarios.

  9. Temporal Kinetics and Quantitative Analysis of Cryptococcus neoformans Nonlytic Exocytosis

    PubMed Central

    Stukes, Sabriya A.; Cohen, Hillel W.

    2014-01-01

    Cryptococcus neoformans is a facultative intracellular pathogen and the causative agent of cryptococcosis, a disease that is often fatal to those with compromised immune systems. C. neoformans has the capacity to escape phagocytic cells through a process known as nonlytic exocytosis whereby the cryptococcal cell is released from the macrophage into the extracellular environment, leaving both the host and pathogen alive. Little is known about the mechanism behind nonlytic exocytosis, but there is evidence that both the fungal and host cells contribute to the process. In this study, we used time-lapse movies of C. neoformans-infected macrophages to delineate the kinetics and quantitative aspects of nonlytic exocytosis. We analyzed approximately 800 macrophages containing intracellular C. neoformans and identified 163 nonlytic exocytosis events that were further characterized into three subcategories: type I (complete emptying of macrophage), type II (partial emptying of macrophage), and type III (cell-to-cell transfer). The majority of type I and II events occurred after several hours of intracellular residence, whereas type III events occurred significantly (P < 0.001) earlier in the course of macrophage infection. Our results show that nonlytic exocytosis is a morphologically and temporally diverse process that occurs relatively rapidly in the course of macrophage infection. PMID:24595144

  10. Economic Analysis of a Nuclear Reactor Powered High-Temperature Electrolysis Hydrogen Production Plant

    SciTech Connect

    E. A. Harvego; M. G. McKellar; M. S. Sohal; J. E. O'Brien; J. S. Herring

    2008-08-01

    A reference design for a commercial-scale high-temperature electrolysis (HTE) plant for hydrogen production was developed to provide a basis for comparing the HTE concept with other hydrogen production concepts. The reference plant design is driven by a high-temperature helium-cooled nuclear reactor coupled to a direct Brayton power cycle. The reference design reactor power is 600 MWt, with a primary system pressure of 7.0 MPa, and reactor inlet and outlet fluid temperatures of 540°C and 900°C, respectively. The electrolysis unit used to produce hydrogen includes 4,009,177 cells with a per-cell active area of 225 cm2. The optimized design for the reference hydrogen production plant operates at a system pressure of 5.0 MPa, and utilizes an air-sweep system to remove the excess oxygen that is evolved on the anode (oxygen) side of the electrolyzer. The inlet air for the air-sweep system is compressed to the system operating pressure of 5.0 MPa in a four-stage compressor with intercooling. The alternating-current (AC) to direct-current (DC) conversion efficiency is 96%. The overall system thermal-to-hydrogen production efficiency (based on the lower heating value of the produced hydrogen) is 47.12% at a hydrogen production rate of 2.356 kg/s. An economic analysis of this plant was performed using the standardized H2A Analysis Methodology developed by the Department of Energy (DOE) Hydrogen Program, and using realistic financial and cost estimating assumptions. The results of the economic analysis demonstrated that the HTE hydrogen production plant driven by a high-temperature helium-cooled nuclear power plant can deliver hydrogen at a competitive cost. A cost of $3.23/kg of hydrogen was calculated assuming an internal rate of return of 10%.
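
The quoted thermal-to-hydrogen efficiency can be cross-checked from the figures in the abstract. Only the 600 MWt reactor power and the 2.356 kg/s production rate come from the record; the lower heating value of hydrogen used below is the standard literature value, assumed here.

```python
# Cross-check of the reported LHV-based efficiency:
#   efficiency = (H2 mass rate * LHV of H2) / reactor thermal power
LHV_H2 = 119.96e6          # J/kg, lower heating value of hydrogen (assumed)
reactor_power = 600e6      # W thermal, from the abstract
h2_rate = 2.356            # kg/s, from the abstract

efficiency = h2_rate * LHV_H2 / reactor_power
print(f"{efficiency:.2%}")  # close to the reported 47.12%
```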

  11. [Quantitative spectrum analysis of characteristic gases of spontaneous combustion coal].

    PubMed

    Liang, Yun-Tao; Tang, Xiao-Jun; Luo, Hai-Zhu; Sun, Yong

    2011-09-01

    Spontaneous combustion gas analysis poses several challenges: a variety of gases must be measured, detection limits are low, and the safety requirements are critical. To address these, Fourier transform infrared (FTIR) spectral analysis is presented for analyzing the characteristic gases of spontaneous combustion. In this paper, the analysis method is first introduced by combining the characteristics of the analytes' absorption spectra with the analysis requirements. Parameter setting, sample preparation, feature variable extraction, and analysis model building are taken into consideration, and the methods of sample preparation, feature extraction, and model building are described in detail. Eleven gases were then tested with a Tensor 27 spectrometer: CH4, C2H6, C3H8, iC4H10, nC4H10, C2H4, C3H6, C2H2, SF6, CO and CO2. The optical path length was 10 cm and the spectral resolution was set to 1 cm(-1). The test results show that the detection limit for all analytes is less than 2 x 10(-6). All detection limits meet the measurement requirements for spontaneous combustion gases, which means that FTIR may be an ideal instrument and that the analysis method used in this paper is suitable for online measurement of spontaneous combustion gases. PMID:22097853

  12. Quantitative Brightness Analysis of Fluorescence Intensity Fluctuations in E. Coli

    PubMed Central

    Hur, Kwang-Ho; Mueller, Joachim D.

    2015-01-01

    The brightness measured by fluorescence fluctuation spectroscopy specifies the average stoichiometry of a labeled protein in a sample. Here we extended brightness analysis, which has been mainly applied in eukaryotic cells, to prokaryotic cells with E. coli serving as a model system. The small size of the E. coli cell introduces unique challenges for applying brightness analysis that are addressed in this work. Photobleaching leads to a depletion of fluorophores and a reduction of the brightness of protein complexes. In addition, the E. coli cell and the point spread function of the instrument only partially overlap, which influences intensity fluctuations. To address these challenges we developed MSQ analysis, which is based on the mean Q-value of segmented photon count data, and combined it with the analysis of axial scans through the E. coli cell. The MSQ method recovers brightness, concentration, and diffusion time of soluble proteins in E. coli. We applied MSQ to measure the brightness of EGFP in E. coli and compared it to solution measurements. We further used MSQ analysis to determine the oligomeric state of nuclear transport factor 2 labeled with EGFP expressed in E. coli cells. The results obtained demonstrate the feasibility of quantifying the stoichiometry of proteins by brightness analysis in a prokaryotic cell. PMID:26099032
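
The quantity underlying the MSQ approach described above is a Q-value computed on segmented photon-count data. A minimal sketch of that core statistic is Mandel's Q (variance over mean, minus one) averaged over fixed-length segments; the published method adds corrections (e.g. for photobleaching and partial overlap with the point spread function) that are not reproduced here, and the data below are hypothetical.

```python
# Minimal sketch in the spirit of MSQ analysis: Mandel's Q per segment,
#   Q = var(counts)/mean(counts) - 1,
# averaged over equal-length segments of the photon count record.
def mandel_q(counts):
    n = len(counts)
    mean = sum(counts) / n
    var = sum((k - mean) ** 2 for k in counts) / n
    return var / mean - 1.0

def mean_segmented_q(counts, seg_len):
    """Average Q over non-overlapping segments of length seg_len."""
    segments = [counts[i:i + seg_len] for i in range(0, len(counts), seg_len)]
    qs = [mandel_q(s) for s in segments if len(s) == seg_len]
    return sum(qs) / len(qs)
```

For an ideal Poisson process Q approaches 0; brightness-related correlations raise it, which is what makes the segmented mean Q informative.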

  13. Analysis of hydrogen cyanide in air in a case of attempted cyanide poisoning.

    PubMed

    Magnusson, R; Nyholm, S; Åstot, C

    2012-10-10

    A 32-year-old man attempted to poison his ex-girlfriend with hydrogen cyanide by hiding the pesticide Uragan D2 in her car. During the police investigation, chemical analysis of the air inside the car was performed. Hydrogen cyanide was detected through on-site air analysis using a portable Fourier transform infrared (FTIR) spectroscopy gas analyzer and colorimetric gas detection tubes. Furthermore, impinger air-sampling was performed for off-site sample preparation and analysis by gas chromatography-mass spectrometry (GC-MS). All three independent techniques demonstrated the presence of hydrogen cyanide, at concentrations of 14-20 ppm. Owing to the high volatility of hydrogen cyanide, the temperature and the time since exposure have a substantial effect on the likelihood of detecting hydrogen cyanide at a crime scene. The prevailing conditions (closed space, low temperature) must have supported the preservation of HCN in the car thus enabling the identification even though the analysis was performed several days after the hydrogen cyanide source was removed. This paper demonstrates the applicability of combining on-site FTIR measurements and off-site GC-MS analysis of a crime scene in order to ensure fast detection as well as unambiguous identification for forensic purposes of hydrogen cyanide in air. PMID:22704552

  14. [Quantitative Analysis of Immuno-fluorescence of Nuclear Factor-κB Activation].

    PubMed

    Xiu, Min; He, Feng; Lou, Yuanlei; Xu, Lu; Xiong, Jieqi; Wang, Ping; Liu, Sisun; Guo, Fei

    2015-06-01

    Immuno-fluorescence can qualitatively reveal nuclear translocation events, among which translocation of NF-κB/p65 indicates activation of NF-κB signaling pathways. Immuno-fluorescence analysis software developed in-house (with independent intellectual property rights) can quantitatively analyze the dynamic localization of NF-κB/p65 by computing relative fluorescence units in nuclei and cytoplasm. We verified the quantitative analysis by Western blot. When we applied the software to the analysis of nuclear translocation in lipopolysaccharide (LPS)-induced (0.5 h, 1 h, 2 h, 4 h) primary human umbilical vein endothelial cells (HUVECs), we found that nuclear translocation peaked at 2 h, consistent with the Western blot verification results, indicating that the immuno-fluorescence analysis software can be applied to the quantitative analysis of immuno-fluorescence. PMID:26485997

  15. Quantitative assessment of human motion using video motion analysis

    NASA Technical Reports Server (NTRS)

    Probe, John D.

    1993-01-01

    In the study of the dynamics and kinematics of the human body a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video based motion analysis systems to emerge as a cost effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video based Ariel Performance Analysis System (APAS) to develop data on shirtsleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. APAS is a fully integrated system of hardware and software for biomechanics and the analysis of human performance and generalized motion measurement. Major components of the complete system include the video system, the AT compatible computer, and the proprietary software.

  16. Quantitative Analysis of 3′-Hydroxynorcotinine in Human Urine

    PubMed Central

    Upadhyaya, Pramod

    2015-01-01

    Introduction: Based on previous metabolism studies carried out in patas monkeys, we hypothesized that urinary 3′-hydroxynorcotinine could be a specific biomarker for uptake and metabolism of the carcinogen N′-nitrosonornicotine in people who use tobacco products. Methods: We developed a method for quantitation of 3′-hydroxynorcotinine in human urine. [Pyrrolidinone-13C4]3′-hydroxynorcotinine was added to urine as an internal standard, the samples were treated with β-glucuronidase, partially purified by solid supported liquid extraction and quantified by liquid chromatography–electrospray ionization–tandem mass spectrometry. Results: The method was accurate (average accuracy = 102%) and precise (coefficient of variation = 5.6%) in the range of measurement. 3′-Hydroxynorcotinine was detected in 48 urine samples from smokers (mean 393±287 pmol/ml urine) and 12 samples from individuals who had stopped smoking and were using the nicotine patch (mean 658±491 pmol/ml urine), but not in any of 10 samples from nonsmokers. Conclusions: Since the amounts of 3′-hydroxynorcotinine found in smokers’ urine were approximately 50 times greater than the anticipated daily dose of N′-nitrosonornicotine, we concluded that it is a metabolite of nicotine or one of its metabolites, comprising perhaps 1% of nicotine intake in smokers. Therefore, it would not be suitable as a specific biomarker for uptake and metabolism of N′-nitrosonornicotine. Since 3′-hydroxynorcotinine has never been previously reported as a constituent of human urine, further studies are required to determine its source and mode of formation. PMID:25324430

  17. Quantitative analysis of nailfold capillary morphology in patients with fibromyalgia

    PubMed Central

    Choi, Dug-Hyun

    2015-01-01

    Background/Aims Nailfold capillaroscopy (NFC) has been used to examine morphological and functional microcirculation changes in connective tissue diseases. It has been demonstrated that NFC patterns reflect abnormal microvascular dynamics, which may play a role in fibromyalgia (FM) syndrome. The aim of this study was to determine NFC patterns in FM, and their association with clinical features of FM. Methods A total of 67 patients with FM, and 30 age- and sex-matched healthy controls, were included. Nailfold capillary patterns were quantitatively analyzed using computerized NFC. The parameters of interest were as follows: number of capillaries within the central 3 mm, deletion score, apical limb width, capillary width, and capillary dimension. Capillary dimension was determined by calculating the number of capillaries using the Adobe Photoshop version 7.0. Results FM patients had a lower number of capillaries and higher deletion scores on NFC compared to healthy controls (17.3 ± 1.7 vs. 21.8 ± 2.9, p < 0.05; 2.2 ± 0.9 vs. 0.7 ± 0.6, p < 0.05, respectively). Both apical limb width (µm) and capillary width (µm) were significantly decreased in FM patients (1.1 ± 0.2 vs. 3.7 ± 0.6; 5.4 ± 0.5 vs. 7.5 ± 1.4, respectively), indicating that FM patients have abnormally decreased digital capillary diameter and density. Interestingly, there was no difference in capillary dimension between the two groups, suggesting that the length or tortuosity of capillaries in FM patients is increased to compensate for diminished microcirculation. Conclusions FM patients had altered capillary density and diameter in the digits. Diminished microcirculation on NFC may alter capillary density and increase tortuosity. PMID:26161020

  18. Quantitative analysis of wrist electrodermal activity during sleep.

    PubMed

    Sano, Akane; Picard, Rosalind W; Stickgold, Robert

    2014-12-01

    We present the first quantitative characterization of electrodermal activity (EDA) patterns on the wrists of healthy adults during sleep using dry electrodes. We compare the new results on the wrist to the prior findings on palmar or finger EDA by characterizing data measured from 80 nights of sleep consisting of 9 nights of wrist and palm EDA from 9 healthy adults sleeping at home, 56 nights of wrist and palm EDA from one healthy adult sleeping at home, and 15 nights of wrist EDA from 15 healthy adults in a sleep laboratory, with the latter compared to concurrent polysomnography. While high frequency patterns of EDA called "storms" were identified by eye in the 1960s, we systematically compare thresholds for automatically detecting EDA peaks and establish criteria for EDA storms. We found that more than 80% of the EDA peaks occurred in non-REM sleep, specifically during slow-wave sleep (SWS) and non-REM stage 2 sleep (NREM2). Also, EDA amplitude is higher in SWS than in other sleep stages. Longer EDA storms were more likely to occur in the first two quarters of sleep and during SWS and NREM2. We also found from the home studies (65 nights) that EDA levels were higher and the skin conductance peaks were larger and more frequent when measured on the wrist than when measured on the palm. These EDA high frequency peaks and high amplitude were sometimes associated with higher skin temperature, but more work is needed looking at neurological and other EDA elicitors in order to elucidate their complete behavior. PMID:25286449
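
The detection pipeline described above (threshold-based peak detection, then a "storm" criterion over peak rates) can be sketched as follows. The amplitude threshold and storm rule here are illustrative placeholders, not the criteria established in the paper.

```python
# Illustrative EDA processing sketch: find peaks above an amplitude
# threshold, then flag "storm" minutes as runs of >= min_run consecutive
# minutes each containing >= min_peaks peaks. Thresholds are hypothetical.
def detect_peaks(signal, threshold):
    """Indices of local maxima exceeding an amplitude threshold."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i] > threshold
            and signal[i] > signal[i - 1]
            and signal[i] >= signal[i + 1]]

def storm_minutes(peaks_per_minute, min_peaks=3, min_run=2):
    """Count minutes that lie min_run or deeper into a qualifying run."""
    count, run = 0, 0
    for n in peaks_per_minute:
        run = run + 1 if n >= min_peaks else 0
        if run >= min_run:
            count += 1
    return count
```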

  19. Thermal Desorption Analysis of Hydrogen in High Strength Martensitic Steels

    NASA Astrophysics Data System (ADS)

    Enomoto, M.; Hirakami, D.; Tarui, T.

    2012-02-01

    Thermal desorption analyses (TDA) were conducted in high strength martensitic steels containing carbon from 0.33 to 1.0 mass pct, which were charged with hydrogen at 1223 K (950 °C) under one atmosphere of hydrogen and quenched to room temperature. In 0.33C steel, which had the highest Ms temperature, only one desorption peak was observed around 373 K (100 °C), whereas two peaks, one at a similar temperature and the other around and above 573 K (300 °C), were observed in the other steels, the height of the second peak increasing with carbon content. In 0.82C steel, both peaks disappeared during exposure at room temperature in 1 week, whereas the peak heights decreased gradually over 2 weeks in specimens electrolytically charged with hydrogen and aged for varying times at room temperature. From computer simulation, by means of the McNabb-Foster theory coupled with theories of carbon segregation, these peaks are likely to be due to trapping of hydrogen in the strain fields and cores of dislocations, and presumably to a lesser extent in prior austenite grain boundaries. The results also indicate that carbon atoms prevent and even expel hydrogen from trapping sites during quenching and aging in these steels.

  20. EXPLoRA-web: linkage analysis of quantitative trait loci using bulk segregant analysis.

    PubMed

    Pulido-Tamayo, Sergio; Duitama, Jorge; Marchal, Kathleen

    2016-07-01

    Identification of genomic regions associated with a phenotype of interest is a fundamental step toward solving questions in biology and improving industrial research. Bulk segregant analysis (BSA) combined with high-throughput sequencing is a technique to efficiently identify these genomic regions associated with a trait of interest. However, distinguishing true from spuriously linked genomic regions and accurately delineating the genomic positions of these truly linked regions requires the use of complex statistical models currently implemented in software tools that are generally difficult to operate for non-expert users. To facilitate the exploration and analysis of data generated by bulked segregant analysis, we present EXPLoRA-web, a web service wrapped around our previously published algorithm EXPLoRA, which exploits linkage disequilibrium to increase the power and accuracy of quantitative trait loci identification in BSA analysis. EXPLoRA-web provides a user friendly interface that enables easy data upload and parallel processing of different parameter configurations. Results are provided graphically and as BED file and/or text file and the input is expected in widely used formats, enabling straightforward BSA data analysis. The web server is available at http://bioinformatics.intec.ugent.be/explora-web/. PMID:27105844
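
A common first step in sequencing-based BSA, upstream of the statistical model EXPLoRA implements, is the delta-SNP-index: the difference in alternate-allele frequency between the two bulks at each variant site. The sketch below shows that baseline quantity only (it is not EXPLoRA's algorithm), with hypothetical read counts.

```python
# Delta-SNP-index sketch for bulk segregant analysis: per variant,
#   SNP-index = alt_reads / total_reads   (per bulk)
#   delta     = index(high bulk) - index(low bulk)
# Regions truly linked to the trait show delta far from 0.
def snp_index(alt_reads, total_reads):
    return alt_reads / total_reads

def delta_snp_index(high_bulk, low_bulk):
    """Each bulk: list of (alt_reads, total_reads), one tuple per variant."""
    return [snp_index(*h) - snp_index(*l) for h, l in zip(high_bulk, low_bulk)]
```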

  1. Lifecycle Cost Analysis of Hydrogen Versus Other Technologies for Electrical Energy Storage

    SciTech Connect

    Steward, D.; Saur, G.; Penev, M.; Ramsden, T.

    2009-11-01

    This report presents the results of an analysis evaluating the economic viability of hydrogen for medium- to large-scale electrical energy storage applications compared with three other storage technologies: batteries, pumped hydro, and compressed air energy storage (CAES).

  2. FLOW INJECTION ANALYSIS OF TRACE HYDROGEN PEROXIDE USING AN IMMOBILIZED ENZYME REACTOR (JOURNAL VERSION)

    EPA Science Inventory

    Sub-parts per billion (ppb) levels of aqueous hydrogen peroxide have been determined with a flow injection analysis system employing a single bead string reactor composed of horseradish peroxidase covalently bound to an aminated macroporous polymeric absorbent with glutaraldehyde...

  3. Application of diffusion theory to the analysis of hydrogen desorption data at 25 deg C

    SciTech Connect

    Danford, M.D.

    1985-10-01

    The application of diffusion theory to the analysis of hydrogen desorption data (coulombs of H/sub 2/ desorbed versus time) has been studied. From these analyses, important information concerning hydrogen solubilities and the nature of the hydrogen distributions in the metal has been obtained. Two nickel base alloys, Rene' 41 and Waspaloy, and one ferrous alloy, 4340 steel, are studied in this work. For the nickel base alloys, it is found that the hydrogen distributions after electrolytic charging conform closely to those which would be predicted by diffusion theory. For Waspaloy samples charged at 5,000 psi, it is found that the hydrogen distributions are essentially the same as those obtained by electrolytic charging. The hydrogen distributions in electrolytically charged 4340 steel, on the other hand, are essentially uniform in nature, which would not be predicted by diffusion theory. A possible explanation has been proposed. Finally, it is found that the hydrogen desorption is completely explained by the nature of the hydrogen distribution in the metal, and that the fast hydrogen is not due to surface and sub-surface hydride formation, as was originally proposed.
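
The diffusion-theory prediction this record compares against can be illustrated with the classical series solution for desorption from a plane sheet (uniform initial concentration, both faces held at zero concentration). The diffusivity and thickness below are illustrative values, not ones from the report.

```python
# Fraction desorbed from a plane sheet of thickness L (Crank's series):
#   M_t/M_inf = 1 - sum_{n>=0} 8/((2n+1)^2 pi^2) * exp(-D (2n+1)^2 pi^2 t / L^2)
import math

def fraction_desorbed(D, L, t, terms=50):
    if t <= 0:
        return 0.0
    s = 0.0
    for n in range(terms):
        k = 2 * n + 1
        s += 8.0 / (k * k * math.pi ** 2) * math.exp(-D * (k * math.pi / L) ** 2 * t)
    return 1.0 - s

D = 1e-10   # m^2/s, illustrative room-temperature diffusivity
L = 1e-3    # m, illustrative sheet thickness
```

Plotting this fraction against measured coulombs-versus-time curves is essentially the comparison the report makes for the nickel-base alloys.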

  4. The application of diffusion theory to the analysis of hydrogen desorption data at 25 deg C

    NASA Technical Reports Server (NTRS)

    Danford, M. D.

    1985-01-01

    The application of diffusion theory to the analysis of hydrogen desorption data (coulombs of H2 desorbed versus time) has been studied. From these analyses, important information concerning hydrogen solubilities and the nature of the hydrogen distributions in the metal has been obtained. Two nickel base alloys, Rene' 41 and Waspaloy, and one ferrous alloy, 4340 steel, are studied in this work. For the nickel base alloys, it is found that the hydrogen distributions after electrolytic charging conform closely to those which would be predicted by diffusion theory. For Waspaloy samples charged at 5,000 psi, it is found that the hydrogen distributions are essentially the same as those obtained by electrolytic charging. The hydrogen distributions in electrolytically charged 4340 steel, on the other hand, are essentially uniform in nature, which would not be predicted by diffusion theory. A possible explanation has been proposed. Finally, it is found that the hydrogen desorption is completely explained by the nature of the hydrogen distribution in the metal, and that the fast hydrogen is not due to surface and sub-surface hydride formation, as was originally proposed.

  5. Quantitative analysis and purity evaluation of medroxyprogesterone acetate by HPLC.

    PubMed

    Cavina, G; Valvo, L; Alimenti, R

    1985-01-01

    A reversed-phase high-performance liquid chromatographic method was developed for the assay of medroxyprogesterone acetate and for the detection and determination of related steroids present as impurities in the drug. The method was compared with the normal-phase technique of the USP XX and was also applied to the analysis of tablets and injectable suspensions. PMID:16867645

  6. Quantitative modeling and analysis in environmental studies. Technical report

    SciTech Connect

    Gaver, D.P.

    1994-10-01

    This paper reviews some of the many mathematical modeling and statistical data analysis problems that arise in environmental studies. It makes no claim to be comprehensive nor truly up-to-date. It will appear as a chapter in a book on ecotoxicology to be published by CRC Press, probably in 1995. Workshops leading to the book creation were sponsored by The Conte Foundation.

  7. Procedures for Quantitative Analysis of Change Facilitator Interventions.

    ERIC Educational Resources Information Center

    Hord, Shirley M.; Hall, Gene E.

    The procedures and coding schema that have been developed by the Research on the Improvement Process (RIP) Program for analyzing the frequency of interventions and for examining their internal characteristics are described. In two in-depth ethnographic studies of implementation efforts, interventions were the focus of data collection and analysis.…

  8. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…
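
For the two-predictor case, commonality analysis partitions the full model's R^2 into each predictor's unique contribution plus their common (shared) contribution, computed from the R^2 of the full and reduced models. The sketch below assumes that standard partition; the data are synthetic.

```python
# Two-predictor regression commonality analysis sketch:
#   unique(x1) = R2(x1,x2) - R2(x2)
#   unique(x2) = R2(x1,x2) - R2(x1)
#   common     = R2(x1) + R2(x2) - R2(x1,x2)
# The three components sum to the full-model R2 by construction.
import numpy as np

def r_squared(X, y):
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    tss = (y - y.mean()) @ (y - y.mean())
    return 1.0 - (resid @ resid) / tss

def commonality_two_predictors(x1, x2, y):
    r2_full = r_squared(np.column_stack([x1, x2]), y)
    r2_1 = r_squared(x1[:, None], y)
    r2_2 = r_squared(x2[:, None], y)
    return {
        "unique_x1": r2_full - r2_2,
        "unique_x2": r2_full - r2_1,
        "common": r2_1 + r2_2 - r2_full,
        "total_r2": r2_full,
    }
```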

  9. Concentration Analysis: A Quantitative Assessment of Student States.

    ERIC Educational Resources Information Center

    Bao, Lei; Redish, Edward F.

    2001-01-01

    Explains that multiple-choice tests such as the Force Concept Inventory (FCI) provide useful instruments to probe the distribution of student difficulties on a large scale. Introduces a new method, concentration analysis, to measure how students' responses on multiple-choice questions are distributed. (Contains 18 references.) (Author/YDS)
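
The concentration measure described here maps a response distribution over m choices to a value between 0 (uniform spread) and 1 (everyone picks the same choice). The sketch below assumes the standard definition of the concentration factor; verify the exact formula against the paper before relying on it.

```python
# Concentration factor sketch: for m choices with response counts n_i
# summing to N,
#   C = sqrt(m)/(sqrt(m)-1) * ( sqrt(sum n_i^2)/N - 1/sqrt(m) )
# C = 0 for a uniform distribution of answers, C = 1 for full agreement.
import math

def concentration(counts):
    m = len(counts)
    N = sum(counts)
    root_m = math.sqrt(m)
    return (root_m / (root_m - 1)) * (math.sqrt(sum(n * n for n in counts)) / N - 1 / root_m)
```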

  10. Concentration Analysis: A Quantitative Assessment of Student States.

    ERIC Educational Resources Information Center

    Bao, Lei; Redish, Edward F.

    Multiple-choice tests such as the Force Concept Inventory (FCI) provide useful instruments to probe the distribution of student difficulties on a large scale. However, traditional analysis often relies solely on scores (number of students giving the correct answer). This ignores what can be significant and important information: the distribution…

  11. Reflectance spectroscopy: quantitative analysis techniques for remote sensing applications.

    USGS Publications Warehouse

    Clark, R.N.; Roush, T.L.

    1984-01-01

    Several methods for the analysis of remotely sensed reflectance data are compared, including empirical methods and scattering theories, both of which are important for solving remote sensing problems. The concept of the photon mean path length and the implications for use in modeling reflectance spectra are presented. (From authors)

  12. Quantitative histology analysis of the ovarian tumour microenvironment.

    PubMed

    Lan, Chunyan; Heindl, Andreas; Huang, Xin; Xi, Shaoyan; Banerjee, Susana; Liu, Jihong; Yuan, Yinyin

    2015-01-01

    Concerted efforts in genomic studies examining RNA transcription and DNA methylation patterns have revealed profound insights into prognostic ovarian cancer subtypes. On the other hand, abundant histology slides have been generated to date, yet their uses remain very limited and largely qualitative. Our goal is to develop automated histology analysis as an alternative subtyping technology for ovarian cancer that is cost-efficient and does not rely on DNA quality. We developed an automated system for scoring primary tumour sections from 91 late-stage ovarian cancers to identify single cells. We demonstrated high accuracy of our system based on expert pathologists' scores (cancer = 97.1%, stromal = 89.1%) as well as compared to immunohistochemistry scoring (correlation = 0.87). The percentage of stromal cells in all cells is significantly associated with poor overall survival after controlling for clinical parameters including debulking status and age (multivariate analysis p = 0.0021, HR = 2.54, CI = 1.40-4.60) and progression-free survival (multivariate analysis p = 0.022, HR = 1.75, CI = 1.09-2.82). We demonstrate how automated image analysis enables objective quantification of microenvironmental composition of ovarian tumours. Our analysis reveals a strong effect of the tumour microenvironment on ovarian cancer progression and highlights the potential of therapeutic interventions that target the stromal compartment or cancer-stroma signalling in the stroma-high, late-stage ovarian cancer subset. PMID:26573438

  13. Clinical value of quantitative analysis of ST slope during exercise.

    PubMed Central

    Ascoop, C A; Distelbrink, C A; De Lang, P A

    1977-01-01

    The diagnostic performance of automatic analysis of the exercise electrocardiogram in detecting ischaemic heart disease was studied in 147 patients with angiographically documented coronary disease. The results were compared with the results of visual analysis of the same recordings. Using a bicycle ergometer we tried to reach at least 90 per cent of the predicted maximal heart rate of the patient. Two bipolar thoracic leads (CM5, CC5) were used. In the visual analysis the criterion of the so-called ischaemic ST segment was applied. For the automatic analysis the population was divided into a learning group (N=87) and a testing group (N=60). In the learning group, critical values were first computed for different ST measurements that provided optimal separation between patients with (CAG POS.) and without (CAG NEG.) significant coronary stenoses as revealed by coronary arteriography. These critical values were kept unchanged when applied to the testing group. With respect to the visual method, an increase in sensitivity of 0.45 and 0.36 was obtained by the automatic analysis in the learning and testing groups, respectively. The best separation between the CAG POS. and CAG NEG. groups was reached using a criterion consisting of a linear combination of the slope of the initial part of the ST segment and the ST depression, the sensitivity being 0.70 and 0.60, respectively, in the learning and testing groups. Using a criterion based on the area between the baseline and the ST segment (the SX integral), these values were 0.42 and 0.49, respectively. All specificities were kept to at least 0.90. PMID:319813
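
The best-performing criterion in this record is a thresholded linear combination of two ST measurements. A sketch of that decision rule is below; the weights and critical value are hypothetical placeholders, not the values fitted in the study's learning group.

```python
# Illustrative linear-combination exercise-ECG criterion: classify as
# positive when a weighted sum of the initial ST-segment slope and the
# ST depression crosses a critical value. Weights/threshold hypothetical.
def st_criterion(st_slope_mv_per_s, st_depression_mv,
                 w_slope=-1.0, w_depr=5.0, critical=0.5):
    score = w_slope * st_slope_mv_per_s + w_depr * st_depression_mv
    return score >= critical
```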

  14. Quantitative analysis of acrylamide labeled serum proteins by LC-MS/MS.

    PubMed

    Faca, Vitor; Coram, Marc; Phanstiel, Doug; Glukhova, Veronika; Zhang, Qing; Fitzgibbon, Matthew; McIntosh, Martin; Hanash, Samir

    2006-08-01

    Isotopic labeling of cysteine residues with acrylamide was previously utilized for relative quantitation of proteins by MALDI-TOF. Here, we explored and compared the application of deuterated and (13)C isotopes of acrylamide for quantitative proteomic analysis using LC-MS/MS and high-resolution FTICR mass spectrometry. The method was applied to human serum samples that were immunodepleted of abundant proteins. Our results show reliable quantitation of proteins across an abundance range that spans 5 orders of magnitude based on ion intensities and known protein concentration in plasma. The use of (13)C isotope of acrylamide had a slightly greater advantage relative to deuterated acrylamide, because of shifts in elution of deuterated acrylamide relative to its corresponding nondeuterated compound by reversed-phase chromatography. Overall, the use of acrylamide for differentially labeling intact proteins in complex mixtures, in combination with LC-MS/MS provides a robust method for quantitative analysis of complex proteomes. PMID:16889424

  15. Normal coordinate analysis, hydrogen bonding, and conformation analysis of heptane-3,5-dione

    NASA Astrophysics Data System (ADS)

    Soltani-Ghoshkhaneh, Samira; Vakili, Mohammad; Tayyari, Sayyed Faramaraz; Berenji, Ali Reza

    2016-01-01

    Fourier transform Raman and infrared spectral measurements have been made for heptane-3,5-dione (HPD) and simultaneously compared with those of acetylacetone (AA) to give a clear understanding of the substitution effect of ethyl groups (in β-positions) on the structure, electron delocalization, and intramolecular hydrogen bonding (IHB). The molecular structure, conformational stabilities, and intramolecular hydrogen bonding of different oxo-enol forms of HPD have been investigated by the MP2, BLYP, B2PLYP, TPSSh, and B3LYP methods, using various basis sets, and experimental results. The energy differences between the four stable chelated forms E1-E4 are relatively negligible. The theoretical and experimental results obtained for the stable oxo-enol forms of HPD have been compared with each other and also with those of AA. According to the theoretical calculations, HPD has a hydrogen bond strength of about 15.9 kcal/mol, calculated at the B3LYP/6-311++G** level, the same as that of AA (15.9 kcal/mol). This similarity in IHB strength is also consistent with the experimental band frequency shifts of the OH/OD and O···O stretching and OH/OD out-of-plane bending frequencies and the chemical shift of the O-H group. The molecular stability and the hydrogen bond strength were also investigated by applying topological analysis, geometry calculations, and spectroscopic results. Potential energy distribution (PED) and normal coordinate analyses have also been performed. A complete assignment of the observed band frequencies has been suggested, indicating the presence of four HPD forms in comparable amounts in the sample.

  16. Quantitative analysis of cell-free DNA in ovarian cancer

    PubMed Central

    Shao, Xuefeng; He, Yan; Ji, Min; Chen, Xiaofang; Qi, Jing; Shi, Wei; Hao, Tianbo; Ju, Shaoqing

    2015-01-01

    The aim of the present study was to investigate the association between cell-free DNA (cf-DNA) levels and clinicopathological characteristics of patients with ovarian cancer using a branched DNA (bDNA) technique, and to determine the value of quantitative cf-DNA detection in assisting with the diagnosis of ovarian cancer. Serum specimens were collected from 36 patients with ovarian cancer on days 1, 3 and 7 following surgery, and additional serum samples were also collected from 22 benign ovarian tumor cases and 19 healthy individuals with non-cancerous ovaries. bDNA techniques were used to detect serum cf-DNA concentrations. All data were analyzed using SPSS version 18.0. The cf-DNA levels were significantly increased in the ovarian cancer group compared with those of the benign ovarian tumor group and healthy ovarian group (P<0.01). Furthermore, cf-DNA levels were significantly increased in stage III and IV ovarian cancer compared with those of stages I and II (P<0.01). In addition, cf-DNA levels were significantly increased on the first day post-surgery (P<0.01), and subsequently demonstrated a gradual decrease. In the ovarian cancer group, the area under the receiver operating characteristic curve for cf-DNA and the corresponding sensitivity were 0.917 and 88.9%, respectively, which were higher than those of cancer antigen 125 (0.724, 75%) and human epididymis protein 4 (0.743, 80.6%). There was a correlation between serum cf-DNA levels and the occurrence and development of ovarian cancer in the patients evaluated. The bDNA technique possessed higher sensitivity and specificity than other methods for the detection of serum cf-DNA in patients with ovarian cancer, and was more useful for detecting cf-DNA than the other markers evaluated. Thus, the present study demonstrated the potential value of bDNA as an adjuvant diagnostic method for ovarian cancer. PMID:26788153
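
The AUC comparison reported above (cf-DNA 0.917 vs. CA-125 0.724 and HE4 0.743) rests on the area under the ROC curve, which equals the probability that a randomly chosen case scores higher than a randomly chosen control (the Mann-Whitney U statistic, rescaled). A minimal sketch with synthetic scores:

```python
# ROC AUC via pairwise comparison of case vs. control scores:
# each case/control pair contributes 1 if the case scores higher,
# 0.5 on ties, 0 otherwise; the average over all pairs is the AUC.
def roc_auc(case_scores, control_scores):
    wins = 0.0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))
```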

  17. Quantitative analysis of night skyglow amplification under cloudy conditions

    NASA Astrophysics Data System (ADS)

    Kocifaj, Miroslav; Solano Lamphar, Héctor Antonio

    2014-10-01

    The radiance produced by artificial light is a major source of nighttime over-illumination. It can, however, be treated experimentally using ground-based and satellite data. These two types of data complement each other and together have a high information content. For instance, the satellite data enable upward light emissions to be normalized, and this in turn allows skyglow levels at the ground to be modelled under cloudy or overcast conditions. Excessive night lighting imposes an unacceptable burden on nature, humans and professional astronomy. For this reason, there is a pressing need to determine the total amount of downwelling diffuse radiation. Undoubtedly, cloudy periods can cause a significant increase in skyglow as a result of amplification owing to diffuse reflection from clouds. While it is recognized that the amplification factor (AF) varies with cloud cover, the effects of different types of clouds, of atmospheric turbidity and of the geometrical relationships between the positions of an individual observer, the cloud layer, and the light source are in general poorly known. In this paper the AF is quantitatively analysed considering different aerosol optical depths (AODs), urban layout sizes and cloud types with specific albedos and altitudes. The computational results show that the AF peaks near the edges of a city rather than at its centre. In addition, the AF appears to be a decreasing function of AOD, which is particularly important when modelling the skyglow in regions with apparent temporal or seasonal variability of atmospheric turbidity. The findings in this paper will be useful to those designing engineering applications or modelling light pollution, as well as to astronomers and environmental scientists who aim to predict the amplification of skyglow caused by clouds. In addition, the semi-analytical formulae can be used to estimate the AF levels, especially in densely populated metropolitan regions for which detailed computations may be CPU-intensive.

  18. Probabilistic reliability analysis, quantitative safety goals, and nuclear licensing in the United Kingdom.

    PubMed

    Cannell, W

    1987-09-01

    Although unpublicized, the use of quantitative safety goals and probabilistic reliability analysis for licensing nuclear reactors has become a reality in the United Kingdom. This conclusion results from an examination of the process leading to the licensing of the Sizewell B PWR in England. The licensing process for this reactor has substantial implications for nuclear safety standards in Britain, and is examined in the context of the growing trend towards quantitative safety goals in the United States. PMID:3685540

  19. Integrated quantitative fractal polarimetric analysis of monolayer lung cancer cells

    NASA Astrophysics Data System (ADS)

    Shrestha, Suman; Zhang, Lin; Quang, Tri; Farrahi, Tannaz; Narayan, Chaya; Deshpande, Aditi; Na, Ying; Blinzler, Adam; Ma, Junyu; Liu, Bo; Giakos, George C.

    2014-05-01

    Digital diagnostic pathology has become one of the most valuable and convenient advancements in technology in recent years. It allows us to acquire, store and analyze pathological information from the images of histological and immunohistochemical glass slides, which are scanned to create digital slides. In this study, efficient fractal, wavelet-based polarimetric techniques for histological analysis of monolayer lung cancer cells are introduced and different monolayer cancer lines are studied. The outcome of this study indicates that applying fractal, wavelet polarimetric principles to the analysis of squamous carcinoma and adenocarcinoma cancer cell lines may prove extremely useful in discriminating between healthy and lung cancer cells, as well as in differentiating among different lung cancer cell lines.

  20. Quantitative Immunofluorescence Analysis of Nucleolus-Associated Chromatin.

    PubMed

    Dillinger, Stefan; Németh, Attila

    2016-01-01

    The nuclear distribution of eu- and heterochromatin is nonrandom, heterogeneous, and dynamic, which is mirrored by specific spatiotemporal arrangements of histone posttranslational modifications (PTMs). Here we describe a semiautomated method for the analysis of histone PTM localization patterns within the mammalian nucleus, using confocal laser scanning microscope images of fixed, immunofluorescence-stained cells as the data source. The ImageJ-based process includes the segmentation of the nucleus, as well as measurements of total fluorescence intensities, the heterogeneity of the staining, and the frequency of the brightest pixels in the region of interest (ROI). In the presented image analysis pipeline, the perinucleolar chromatin is selected as the primary ROI, and the nuclear periphery as the secondary ROI. PMID:27576710
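A rough Python analog of the three ROI measurements named above (total intensity, staining heterogeneity, brightest-pixel frequency), with an invented pixel list standing in for a segmented ROI; the actual pipeline is ImageJ-based:

```python
# Sketch of the three ROI measurements (total intensity, heterogeneity,
# frequency of brightest pixels) in plain Python. The 8-pixel "ROI" is
# invented; heterogeneity is expressed here as a coefficient of variation.

def roi_stats(pixels, top_percent=5.0):
    n = len(pixels)
    total = sum(pixels)
    mean = total / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    cv = (var ** 0.5) / mean              # heterogeneity of the staining
    cutoff = sorted(pixels)[int(n * (1 - top_percent / 100.0))]
    brightest = sum(1 for p in pixels if p >= cutoff) / n
    return total, cv, brightest

roi = [10, 12, 11, 55, 13, 12, 60, 11]    # hypothetical perinucleolar ROI
total, cv, frac = roi_stats(roi, top_percent=25.0)
print(total, round(cv, 3), frac)
```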

  1. Quantitative Analysis of PMLA Nanoconjugate Components after Backbone Cleavage

    PubMed Central

    Ding, Hui; Patil, Rameshwar; Portilla-Arias, Jose; Black, Keith L.; Ljubimova, Julia Y.; Holler, Eggehard

    2015-01-01

    Multifunctional polymer nanoconjugates containing multiple components show great promise in cancer therapy, but in most cases complete analysis of each component is difficult. Polymalic acid (PMLA) based nanoconjugates have demonstrated successful brain and breast cancer treatment. They consist of multiple components including targeting antibodies, Morpholino antisense oligonucleotides (AONs), and endosome escape moieties. The component analysis of PMLA nanoconjugates is extremely difficult using conventional spectrometry and HPLC method. Taking advantage of the nature of polyester of PMLA, which can be cleaved by ammonium hydroxide, we describe a method to analyze the content of antibody and AON within nanoconjugates simultaneously using SEC-HPLC by selectively cleaving the PMLA backbone. The selected cleavage conditions only degrade PMLA without affecting the integrity and biological activity of the antibody. Although the amount of antibody could also be determined using the bicinchoninic acid (BCA) method, our selective cleavage method gives more reliable results and is more powerful. Our approach provides a new direction for the component analysis of polymer nanoconjugates and nanoparticles. PMID:25894227

  2. Quantitative Analysis of Calcium Spikes in Noisy Fluorescent Background

    PubMed Central

    Janicek, Radoslav; Hotka, Matej; Zahradníková, Alexandra, Jr.; Zahradníková, Alexandra; Zahradník, Ivan

    2013-01-01

    Intracellular calcium signals are studied by laser-scanning confocal fluorescence microscopy. The required spatio-temporal resolution makes description of calcium signals difficult because of the low signal-to-noise ratio. We designed a new procedure of calcium spike analysis based on their fitting with a model. The accuracy and precision of calcium spike description were tested on synthetic datasets generated either with randomly varied spike parameters and Gaussian noise of constant amplitude, or with constant spike parameters and Gaussian noise of various amplitudes. Statistical analysis was used to evaluate the performance of spike fitting algorithms. The procedure was optimized for reliable estimation of calcium spike parameters and for dismissal of false events. A new algorithm was introduced that corrects the acquisition time of pixels in line-scan images that is in error due to sequential acquisition of individual pixels along the space coordinate. New software was developed in Matlab and provided for general use. It allows interactive dissection of temporal profiles of calcium spikes from x-t images, their fitting with predefined function(s) and acceptance of results on statistical grounds, thus allowing efficient analysis and reliable description of calcium signaling in cardiac myocytes down to the in situ function of ryanodine receptors. PMID:23741324
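The validation strategy described (synthetic spikes plus Gaussian noise, then least-squares fitting) can be sketched as follows; the pulse model and all parameters here are illustrative stand-ins, not the authors' spike function:

```python
import math, random

# Toy analog of the paper's validation approach: generate a synthetic
# spike with Gaussian noise, then fit a model by least squares. A simple
# rise*decay pulse is used for illustration only.

def model(t, amp, tau_r, tau_d):
    return amp * (1 - math.exp(-t / tau_r)) * math.exp(-t / tau_d)

random.seed(1)
true = dict(amp=2.0, tau_r=1.0, tau_d=5.0)
ts = [0.2 * i for i in range(1, 101)]
data = [model(t, **true) + random.gauss(0, 0.02) for t in ts]

def sse(amp, tau_r, tau_d):
    return sum((model(t, amp, tau_r, tau_d) - y) ** 2 for t, y in zip(ts, data))

# coarse grid search instead of a gradient fitter, to stay dependency-free
best = min(((a / 10, r / 10, d / 10)
            for a in range(10, 31, 2)
            for r in range(5, 16)
            for d in range(30, 71, 4)),
           key=lambda p: sse(*p))
print("fitted (amp, tau_r, tau_d):", best)
```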

  3. Watershed Planning within a Quantitative Scenario Analysis Framework.

    PubMed

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-01-01

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation. PMID:27501287
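A minimal sketch of the landscape-based cumulative effects step: ordinary least squares on hypothetical stressor gradients, then a prediction for a proposed land-use scenario. All site data below are invented:

```python
# Sketch of a landscape-based cumulative effects model: multiple linear
# regression of an aquatic condition score on land-use stressors, then
# prediction under a hypothetical future scenario. Numbers are invented.

def ols(X, y):
    """Least squares via normal equations (X'X)b = X'y, Gaussian elimination."""
    k = len(X[0])
    A = [[sum(X[i][p] * X[i][q] for i in range(len(X))) for q in range(k)]
         for p in range(k)]
    b = [sum(X[i][p] * y[i] for i in range(len(X))) for p in range(k)]
    for col in range(k):                      # elimination with partial pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

# columns: intercept, % mined area, % residential area (hypothetical gradients)
X = [[1, 0, 5], [1, 10, 5], [1, 20, 10], [1, 30, 20], [1, 40, 25], [1, 5, 30]]
y = [90, 80, 68, 55, 44, 72]                  # biotic index observed at each site
beta = ols(X, y)
scenario = [1, 25, 15]                        # proposed future land use
predicted = sum(b_ * x_ for b_, x_ in zip(beta, scenario))
print("coefficients:", [round(v, 3) for v in beta], "predicted index:", round(predicted, 1))
```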

  4. Phase analysis in duplex stainless steel: comparison of EBSD and quantitative metallography methods

    NASA Astrophysics Data System (ADS)

    Michalska, J.; Chmiela, B.

    2014-03-01

    The purpose of the research was to work out a qualitative and quantitative analysis of phases in DSS in the as-received state and after thermal aging. SEM observations, EDS analyses and electron backscattered diffraction (EBSD) methods were employed. Quantitative analysis of phases was performed by two methods: EBSD and classical quantitative metallography. A juxtaposition of different etchants for revealing the microstructure and a brief review of sample preparation methods for EBSD studies are presented. Different ways of sample preparation were tested, and based on these results a detailed methodology of DSS phase analysis was developed, including surface finishing, selective etching methods and image acquisition. The advantages and disadvantages of the applied methods were pointed out, and the accuracy of the phase analysis performed by both methods was compared.

  5. Technical Analysis of the Hydrogen Energy Station Concept, Phase I and Phase II

    SciTech Connect

    TIAX, LLC

    2005-05-04

    patterns would be most viable for an energy station, TIAX developed several criteria for selecting a representative set of technology configurations. TIAX applied these criteria to all possible technology configurations to determine an optimized set for further analysis, as shown in Table ES-1. This analysis also considered potential energy station operational scenarios and their impact upon hydrogen and power production. For example, an energy station with a 50-kWe reformer could generate enough hydrogen to serve up to 12 vehicles/day (at 5 kg/fill) or generate up to 1,200 kWh/day, as shown in Figure ES-1. Buildings that would be well suited for an energy station would utilize both the thermal and electrical output of the station. Optimizing the generation and utilization of thermal energy, hydrogen, and electricity requires a detailed look at the energy transfer within the energy station and the transfer between the station and nearby facilities. TIAX selected the Baseline configuration given in Table ES-1 for an initial analysis of the energy and mass transfer expected from an operating energy station. Phase II: The purpose of this technical analysis was to analyze the development of a hydrogen-dispensing infrastructure for transportation applications through the installation of a 50-75 kW stationary fuel cell-based energy station at federal building sites. The various scenarios, costs, designs and impacts of such a station were quantified for a hypothetical cost-shared program that utilizes a natural gas reformer to provide hydrogen fuel for both the stack(s) and a limited number of fuel cell-powered vehicles, with the possibility of using cogeneration to support the building heat load.
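The Figure ES-1 trade-off quoted above follows from simple arithmetic, sketched here (the 12-vehicle hydrogen figure is taken from the report as given):

```python
# Back-of-envelope check of the figures quoted above: a 50-kWe energy
# station run all day yields 1,200 kWh, or alternatively enough hydrogen
# for about 12 vehicle fills at 5 kg each (60 kg/day).

POWER_KW = 50.0
HOURS_PER_DAY = 24.0
FILL_KG = 5.0
VEHICLES_PER_DAY = 12

electricity_kwh_per_day = POWER_KW * HOURS_PER_DAY   # all-electric mode
hydrogen_kg_per_day = VEHICLES_PER_DAY * FILL_KG     # all-hydrogen mode

print(electricity_kwh_per_day)   # 1200.0 kWh/day, matching Figure ES-1
print(hydrogen_kg_per_day)       # 60.0 kg H2/day
```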

  6. Quantitative proteomics analysis of adsorbed plasma proteins classifies nanoparticles with different surface properties and size

    SciTech Connect

    Zhang, Haizhen; Burnum, Kristin E.; Luna, Maria L.; Petritis, Brianne O.; Kim, Jong Seo; Qian, Weijun; Moore, Ronald J.; Heredia-Langner, Alejandro; Webb-Robertson, Bobbie-Jo M.; Thrall, Brian D.; Camp, David G.; Smith, Richard D.; Pounds, Joel G.; Liu, Tao

    2011-12-01

    In biofluids (e.g., blood plasma) nanoparticles are readily embedded in layers of proteins that can affect their biological activity and biocompatibility. Herein, we report a study on the interactions between human plasma proteins and nanoparticles with a controlled systematic variation of properties, using stable isotope labeling and liquid chromatography-mass spectrometry (LC-MS) based quantitative proteomics. A novel protocol was developed to simplify the isolation of nanoparticle-bound proteins and improve reproducibility. Plasma proteins associated with polystyrene nanoparticles with three different surface chemistries and two sizes, as well as four different exposure times (for a total of 24 different samples), were identified and quantified by LC-MS analysis. Quantitative comparison of relative protein abundances was achieved by spiking an 18O-labeled 'universal reference' into each individually processed unlabeled sample as an internal standard, enabling simultaneous application of both label-free and isotopic labeling quantitation across the sample set. Clustering analysis of the quantitative proteomics data resulted in distinctive patterns that classify the nanoparticles based on their surface properties and size. In addition, data from the temporal study indicated that the stable protein 'corona' that was isolated for the quantitative analysis appeared to be formed in less than 5 minutes. The comprehensive results obtained herein using quantitative proteomics have potential implications towards predicting nanoparticle biocompatibility.
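The 'universal reference' spiking strategy can be sketched numerically: because every sample is ratioed against the same 18O-labeled standard, light/heavy ratios are comparable across separately processed runs. Intensities and run names below are invented:

```python
# Sketch of the universal-reference idea: each unlabeled sample is
# measured against the same 18O-labeled spike, so light/heavy ratios
# become comparable across runs. All intensities are invented.

# light (sample) and heavy (18O reference) peptide intensities per run
runs = {
    "60nm_plain_5min":  {"light": 8.0e6, "heavy": 4.0e6},
    "60nm_amine_5min":  {"light": 2.0e6, "heavy": 4.0e6},
    "200nm_plain_5min": {"light": 6.0e6, "heavy": 3.0e6},
}

ratios = {name: v["light"] / v["heavy"] for name, v in runs.items()}

# cross-sample comparison: fold change between two nanoparticle types
fold = ratios["60nm_plain_5min"] / ratios["60nm_amine_5min"]
print(ratios, "fold change plain vs amine:", fold)
```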

  7. Digitally Enhanced Thin-Layer Chromatography: An Inexpensive, New Technique for Qualitative and Quantitative Analysis

    ERIC Educational Resources Information Center

    Hess, Amber Victoria Irish

    2007-01-01

    A study shows that when digital photography is combined with regular thin-layer chromatography (TLC), it can provide highly improved qualitative analysis as well as make accurate quantitative analysis possible for a much lower cost than commercial equipment. The findings suggest that digitally enhanced TLC (DE-TLC) is low-cost and easy…

  8. Kinetic Analysis of Amylase Using Quantitative Benedict's and Iodine Starch Reagents

    ERIC Educational Resources Information Center

    Cochran, Beverly; Lunday, Deborah; Miskevich, Frank

    2008-01-01

    Quantitative analysis of carbohydrates is a fundamental analytical tool used in many aspects of biology and chemistry. We have adapted a technique developed by Mathews et al. using an inexpensive scanner and open-source image analysis software to quantify amylase activity using both the breakdown of starch and the appearance of glucose. Breakdown…
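A plausible sketch of the scanner-based quantitation (not the authors' exact software): build a linear calibration from glucose standards, then turn a time course of intensities into an amylase rate. All readings below are invented:

```python
# Sketch of scanner-based enzyme kinetics: a linear calibration from
# glucose standards (image intensity vs concentration), then a rate from
# the recovered concentrations over time. All numbers are invented.

def linfit(xs, ys):
    """Simple least-squares line: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# calibration: scanned intensity for known glucose standards (mM)
std_conc = [0.0, 2.0, 4.0, 8.0]
std_int = [5.0, 25.0, 45.0, 85.0]             # idealized, perfectly linear
m, c = linfit(std_conc, std_int)

def intensity_to_conc(i):
    return (i - c) / m

# kinetics: intensity sampled each minute as starch is hydrolyzed
times = [0, 1, 2, 3, 4]
conc = [intensity_to_conc(i) for i in [5.0, 15.0, 25.0, 35.0, 45.0]]
rate, _ = linfit(times, conc)                 # mM glucose per minute
print("rate =", rate)
```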

  9. Integrating Data Analysis (IDA): Working with Sociology Departments to Address the Quantitative Literacy Gap

    ERIC Educational Resources Information Center

    Howery, Carla B.; Rodriguez, Havidan

    2006-01-01

    The NSF-funded Integrating Data Analysis (IDA) Project undertaken by the American Sociological Association (ASA) and the Social Science Data Analysis Network sought to close the quantitative literacy gap for sociology majors. Working with twelve departments, the project built on lessons learned from ASA's Minority Opportunities through School…

  10. A Quantitative Analysis of the Extrinsic and Intrinsic Turnover Factors of Relational Database Support Professionals

    ERIC Educational Resources Information Center

    Takusi, Gabriel Samuto

    2010-01-01

    This quantitative analysis explored the intrinsic and extrinsic turnover factors of relational database support specialists. Two hundred and nine relational database support specialists were surveyed for this research. The research was conducted based on Hackman and Oldham's (1980) Job Diagnostic Survey. Regression analysis and a univariate ANOVA…

  11. A Quantitative Content Analysis of Mercer University MEd, EdS, and Doctoral Theses

    ERIC Educational Resources Information Center

    Randolph, Justus J.; Gaiek, Lura S.; White, Torian A.; Slappey, Lisa A.; Chastain, Andrea; Harris, Rose Prejean

    2010-01-01

    Quantitative content analysis of a body of research not only helps budding researchers understand the culture, language, and expectations of scholarship, it helps identify deficiencies and inform policy and practice. Because of these benefits, an analysis of a census of 980 Mercer University MEd, EdS, and doctoral theses was conducted. Each thesis…

  12. Some remarks on the quantitative analysis of behavior

    PubMed Central

    Marr, M. Jackson

    1989-01-01

    This paper discusses similarities between the mathematization of operant behavior and the early history of the most mathematical of sciences—physics. Galileo explored the properties of motion without dealing with the causes of motion, focusing on changes in motion. Newton's dynamics were concerned with the action of forces as causes of change. Skinner's rationale for using rate to describe behavior derived from an interest in changes in rate. Reinforcement has played the role of force in the dynamics of behavior. Behavioral momentum and maximization have received mathematical formulations in behavior analysis. Yet to be worked out are the relations between molar and molecular formulations of behavioral theory. PMID:22478028

  13. Identifying severity of electroporation through quantitative image analysis

    NASA Astrophysics Data System (ADS)

    Morshed, Bashir I.; Shams, Maitham; Mussivand, Tofy

    2011-04-01

    Electroporation is the formation of reversible hydrophilic pores in the cell membrane under electric fields. Severity of electroporation is challenging to measure and quantify. An image analysis method is developed, and the initial results with a fabricated microfluidic device are reported. The microfluidic device contains integrated microchannels and coplanar interdigitated electrodes allowing low-voltage operation and low-power consumption. Noninvasive human buccal cell samples were specifically stained, and electroporation was induced. Captured image sequences were analyzed for pixel color ranges to quantify the severity of electroporation. The method can detect even a minor occurrence of electroporation and can perform comparative studies.
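One way to implement the pixel-color-range counting described, as a hedged sketch with an invented 3x3 RGB "image" and stain window:

```python
# Sketch of the pixel-color-range idea: count pixels whose RGB values
# fall inside a dye-specific window and report the stained fraction as a
# severity score. The image and the blue-stain window are invented.

def in_range(px, lo, hi):
    return all(l <= ch <= h for ch, l, h in zip(px, lo, hi))

def severity(image, lo, hi):
    flat = [px for row in image for px in row]
    hits = sum(1 for px in flat if in_range(px, lo, hi))
    return hits / len(flat)

image = [
    [(20, 20, 200), (20, 25, 190), (240, 240, 240)],
    [(30, 20, 210), (250, 250, 250), (245, 240, 235)],
    [(25, 30, 195), (240, 245, 250), (22, 18, 205)],
]
# hypothetical window for a blue stain taken up by electroporated cells
score = severity(image, lo=(0, 0, 180), hi=(60, 60, 255))
print(f"stained fraction: {score:.2f}")
```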

  14. Hydrogen Safety Project chemical analysis support task: Window "C" volatile organic analysis

    SciTech Connect

    Gillespie, B.M.; Stromatt, R.W.; Ross, G.A.; Hoope, E.A.

    1992-01-01

    This data package contains the results obtained by Pacific Northwest Laboratory (PNL) staff in the characterization of samples for the 101-SY Hydrogen Safety Project. The samples were submitted for analysis by Westinghouse Hanford Company (WHC) under the Technical Project Plan (TPP) 17667 and the Quality Assurance Plan MCS-027. They came from a core taken during Window "C" after the May 1991 gas release event. The analytical procedures required for analysis were defined in the Test Instructions (TI) prepared by the PNL 101-SY Analytical Chemistry Laboratory (ACL) Project Management Office in accordance with the TPP and the QA Plan. The requested analysis for these samples was volatile organic analysis. The quality control (QC) requirements for each sample are defined in the Test Instructions for each sample. The QC requirements outlined in the procedures and requested in the WHC statement of work were followed.

  16. Milestone report TCTP application to the SSME hydrogen system analysis

    NASA Technical Reports Server (NTRS)

    Richards, J. S.

    1975-01-01

    The Transient Cryogen Transfer Computer Program (TCTP) developed and verified for LOX systems by analyses of Skylab S-1B stage loading data from John F. Kennedy Space Center launches was extended to include hydrogen as the working fluid. The feasibility of incorporating TCTP into the space shuttle main engine dynamic model was studied. The program applications are documented.

  17. Quantitative proteomic analysis of cold-responsive proteins in rice.

    PubMed

    Neilson, Karlie A; Mariani, Michael; Haynes, Paul A

    2011-05-01

    Rice is susceptible to cold stress, and with a future of climatic instability we will be unable to produce enough rice to satisfy increasing demand. A thorough understanding of the molecular responses to thermal stress is imperative for engineering cultivars with greater resistance to low temperature stress. In this study we investigated the proteomic response of rice seedlings to 48, 72 and 96 h of cold stress at 12-14°C. The use of both label-free and iTRAQ approaches in the analysis of global protein expression enabled us to assess the complementarity of the two techniques for use in plant proteomics. The approaches yielded a similar biological response to cold stress despite a disparity in proteins identified. The label-free approach identified 236 cold-responsive proteins compared to 85 in the iTRAQ results, with only 24 proteins in common. Functional analysis revealed differential expression of proteins involved in transport, photosynthesis, and generation of precursor metabolites and energy; more specifically, histones and vitamin B biosynthetic proteins were observed to be affected by cold stress. PMID:21433000
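The identification overlap quoted above (236 label-free, 85 iTRAQ, 24 shared) is set arithmetic; the placeholder protein IDs below mirror only the counts from the abstract:

```python
# Organizing the quoted identification counts with sets: 236 label-free
# vs 85 iTRAQ identifications, 24 shared. IDs are placeholders.

label_free = {f"LF{i}" for i in range(236 - 24)} | {f"SH{i}" for i in range(24)}
itraq      = {f"IT{i}" for i in range(85 - 24)}  | {f"SH{i}" for i in range(24)}

shared = label_free & itraq
union = label_free | itraq
jaccard = len(shared) / len(union)
print(len(label_free), len(itraq), len(shared), len(union), round(jaccard, 3))
```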

  18. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies show changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration. PMID:27478562

  19. Quantitative assessment of human body shape using Fourier analysis

    NASA Astrophysics Data System (ADS)

    Friess, Martin; Rohlf, F. J.; Hsiao, Hongwei

    2004-04-01

    Fall protection harnesses are commonly used to reduce the number and severity of injuries. Increasing the efficiency of harness design requires the size and shape variation of the user population to be assessed as detailed and as accurately as possible. In light of the unsatisfactory performance of traditional anthropometry with respect to such assessments, we propose the use of 3D laser surface scans of whole bodies and the statistical analysis of elliptic Fourier coefficients. Ninety-eight male and female adults were scanned. Key features of each torso were extracted as a 3D curve along the front, back and thighs. A 3D extension of elliptic Fourier analysis was used to quantify their shape through multivariate statistics. Shape change as a function of size (allometry) was predicted by regressing the coefficients onto stature, weight and hip circumference. Upper and lower limits of torso shape variation were determined and can be used to redefine the design of the harness that will fit most individual body shapes. Observed allometric changes are used for adjustments to the harness shape in each size. Finally, the estimated outline data were used as templates for a free-form deformation of the complete torso surface using NURBS models (non-uniform rational B-splines).
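The underlying 2D elliptic Fourier descriptors (in the classic Kuhl-Giardina form) can be sketched as follows; the paper's method is a 3D extension of this. The circle test simply checks that the first harmonic recovers the radius:

```python
import math

# Sketch of 2D elliptic Fourier descriptors (Kuhl-Giardina form) for a
# closed outline, the basis that the paper extends to 3D torso curves.
# Verified on a densely sampled unit circle.

def efd_harmonic(points, n):
    """Coefficients (a_n, b_n, c_n, d_n) for a closed polygonal outline."""
    dx = [points[i][0] - points[i - 1][0] for i in range(len(points))]
    dy = [points[i][1] - points[i - 1][1] for i in range(len(points))]
    dt = [math.hypot(u, v) for u, v in zip(dx, dy)]
    T = sum(dt)
    t = [0.0]
    for seg in dt:
        t.append(t[-1] + seg)
    k = T / (2 * n * n * math.pi * math.pi)
    w = 2 * math.pi * n / T
    a = k * sum(dx[i] / dt[i] * (math.cos(w * t[i + 1]) - math.cos(w * t[i]))
                for i in range(len(dx)))
    b = k * sum(dx[i] / dt[i] * (math.sin(w * t[i + 1]) - math.sin(w * t[i]))
                for i in range(len(dx)))
    c = k * sum(dy[i] / dt[i] * (math.cos(w * t[i + 1]) - math.cos(w * t[i]))
                for i in range(len(dy)))
    d = k * sum(dy[i] / dt[i] * (math.sin(w * t[i + 1]) - math.sin(w * t[i]))
                for i in range(len(dy)))
    return a, b, c, d

circle = [(math.cos(2 * math.pi * i / 360), math.sin(2 * math.pi * i / 360))
          for i in range(360)]
a1, b1, c1, d1 = efd_harmonic(circle, 1)
print(a1, b1, c1, d1)
```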

  20. iTRAQ-Based Quantitative Proteomic Analysis of Nasopharyngeal Carcinoma.

    PubMed

    Cai, Xin-Zhang; Zeng, Wei-Qun; Xiang, Yi; Liu, Yi; Zhang, Hong-Min; Li, Hong; She, Sha; Yang, Min; Xia, Kun; Peng, Shi-Fang

    2015-07-01

    Nasopharyngeal carcinoma (NPC) is a common disease in the southern provinces of China with a poor prognosis. To better understand the pathogenesis of NPC and identify proteins involved in NPC carcinogenesis, we applied iTRAQ coupled with two-dimensional LC-MS/MS to compare the proteome profiles of NPC tissues and the adjacent non-tumor tissues. We identified 54 proteins with differential expression in NPC and the adjacent non-tumor tissues. The differentially expressed proteins were further determined by RT-PCR and Western blot analysis. In addition, the up-regulation of HSPB1, NPM1 and NCL were determined by immunohistochemistry using tissue microarray. Functionally, we found that siRNA mediated knockdown of NPM1 inhibited the migration and invasion of human NPC CNE1 cell line. In summary, this is the first study on proteome analysis of NPC tissues using an iTRAQ method, and we identified many new differentially expressed proteins which are potential targets for the diagnosis and therapy of NPC. PMID:25648846

  1. a Study of the Synchrotron Laue Method for Quantitative Crystal Structure Analysis.

    NASA Astrophysics Data System (ADS)

    Gomez de Anderez, Dora M.

    1990-01-01

    Available from UMI in association with The British Library. Quantitative crystal structure analyses have been carried out on small-molecule crystals using synchrotron radiation and the Laue method. A variety of single-crystal structure determinations and associated refinements are used and compared with the monochromatic analyses. The new molecular structure of 7-amino-5-bromo-4-methyl-2-oxo-1,2,3,4-tetrahydro-1,6-naphthyridine-8-carbonitrile (C_{10}H_9ON_4Br·H_2O) has been determined, first using monochromatic Mo Kalpha radiation and a four-circle diffractometer, then using synchrotron Laue diffraction photography. The structure refinements showed R-factors of 4.97 and 14.0% for the Mo Kalpha and Laue data, respectively. The molecular structure of (S)-2-chloro-2-fluoro-N-((S)-1-phenylethyl)ethanamide (C_{10}H_{11}ClFNO) has been determined using the same crystal throughout, for X-ray monochromatic analyses (Mo Kalpha and Cu Kalpha) followed by synchrotron Laue data collection. The Laue and monochromatic data compare favourably. The R-factors (on F) were 6.23, 6.45 and 8.19% for the Mo Kalpha, Cu Kalpha and Laue data sets, respectively. The molecular structure of 3-(5-hydroxy-3-methyl-1-phenylpyrazol-4-yl)-1,3-diphenyl-prop-2-en-1-one (C_{25}H_{20}N_2O_2) has been determined using the synchrotron Laue method. The results compare very well with Mo Kalpha monochromatic data. The R-factors (on F) were 4.60 and 5.29% for the Mo Kalpha and Laue analyses, respectively. The ability of the Laue method to locate the 20 hydrogen atoms in this structure is assessed. The structure analysis of the benzil compound ((C_6H_5CO)_2) was carried out using the synchrotron Laue method, firstly at room temperature and secondly at low temperature (-114 °C). The structure shows an R-factor (on F) of 13.06% and 6.85% for each data set, respectively. The synchrotron Laue method was used to collect data for ergocalciferol (vitamin D_2). The same crystal was also used to record oscillation
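The R-factor quoted throughout this abstract is the standard crystallographic residual R = Σ|F_obs − F_calc| / Σ|F_obs|, sketched here with invented structure factor amplitudes:

```python
# The crystallographic R-factor (on F): the agreement residual between
# observed and calculated structure factor amplitudes, reported as a
# percentage. The amplitudes below are invented.

def r_factor(f_obs, f_calc):
    return sum(abs(o - c) for o, c in zip(f_obs, f_calc)) / sum(abs(o) for o in f_obs)

f_obs = [100.0, 80.0, 60.0, 40.0, 20.0]
f_calc = [97.0, 83.0, 58.0, 41.0, 19.0]
print(f"R = {r_factor(f_obs, f_calc):.2%}")
```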

  2. [Quantitative analysis of seven phenolic acids in eight Yinqiao Jiedu serial preparations by quantitative analysis of multi-components with single-marker].

    PubMed

    Wang, Jun-jun; Zhang, Li; Guo, Qing; Kou, Jun-ping; Yu, Bo-yang; Gu, Dan-hua

    2015-04-01

    The study aims to develop a unified method to determine seven phenolic acids (neochlorogenic acid, chlorogenic acid, 4-caffeoylquinic acid, caffeic acid, isochlorogenic acid B, isochlorogenic acid A and isochlorogenic acid C) contained in honeysuckle flower, the monarch drug of all eight Yinqiao Jiedu serial preparations, using quantitative analysis of multi-components by single-marker (QAMS). Firstly, chlorogenic acid was used as a reference to obtain the average relative correction factors (RCFs) of the other phenolic acids relative to the reference; columns and instruments from different companies were used to validate the durability of the achieved RCFs at different levels of standard solutions; and honeysuckle flower extract was used as the reference substance to fix the positions of chromatographic peaks. Secondly, the contents of the seven phenolic acids in eight different Yinqiao Jiedu serial preparation samples were calculated based on the RCF durability. Finally, the quantitative results were compared between QAMS and the external standard (ES) method. The results showed that the durability of the achieved RCFs is good (RSD 0.80% - 2.56%), and there are no differences between the quantitative results of QAMS and ES (relative average deviation < 0.93%). Thus, QAMS can be successfully used for the quantitative control of honeysuckle flower as principally prescribed in Yinqiao Jiedu serial preparations. PMID:26223132
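The QAMS calculation can be sketched directly from its definition: calibrate a relative correction factor (RCF) for each analyte against the chlorogenic acid marker, then quantify from peak areas using only the marker's response. All areas and concentrations below are invented:

```python
# Sketch of QAMS quantitation: calibrate an RCF for each analyte against
# the single marker (chlorogenic acid), then quantify the analyte from
# its peak area without its own standard. All numbers are invented.

def rcf(area_marker, conc_marker, area_analyte, conc_analyte):
    """f = (A_m / C_m) / (A_a / C_a), from mixed standard solutions."""
    return (area_marker / conc_marker) / (area_analyte / conc_analyte)

def conc_by_qams(area_analyte, f, area_marker, conc_marker):
    """C_a = f * A_a * C_m / A_m."""
    return f * area_analyte * conc_marker / area_marker

# calibration with standards (marker: chlorogenic acid), hypothetical values
f_neo = rcf(area_marker=5000, conc_marker=10.0,
            area_analyte=4000, conc_analyte=10.0)

# sample run: marker quantified externally at 12 ug/mL, area 6000
c_neo = conc_by_qams(area_analyte=3000, f=f_neo,
                     area_marker=6000, conc_marker=12.0)
print(f_neo, c_neo)   # 1.25 7.5
```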

  3. Analysis of nuclear quantum effects on hydrogen bonding.

    PubMed

    Swalina, Chet; Wang, Qian; Chakraborty, Arindam; Hammes-Schiffer, Sharon

    2007-03-22

    The impact of nuclear quantum effects on hydrogen bonding is investigated for a series of hydrogen fluoride (HF)n clusters and a partially solvated fluoride anion, F-(H2O). The nuclear quantum effects are included using the path integral formalism in conjunction with the Car-Parrinello molecular dynamics (PICPMD) method and using the second-order vibrational perturbation theory (VPT2) approach. For the HF clusters, a directional change in the impact of nuclear quantum effects on the hydrogen-bonding strength is observed as the clusters evolve toward the condensed phase. Specifically, the inclusion of nuclear quantum effects increases the F-F distances for the (HF)n=2-4 clusters and decreases the F-F distances for the (HF)n>4 clusters. This directional change occurs because the enhanced electrostatic interactions between the HF monomers become more dominant than the zero point energy effects of librational modes as the size of the HF clusters increases. For the F-(H2O) system, the inclusion of nuclear quantum effects decreases the F-O distance and strengthens the hydrogen bonding interaction between the fluoride anion and the water molecule because of enhanced electrostatic interactions. The vibrationally averaged 19F shielding constant for F-(H2O) is significantly lower than the value for the equilibrium geometry, indicating that the electronic density on the fluorine decreases as a result of the quantum delocalization of the shared hydrogen. Deuteration of this system leads to an increase in the vibrationally averaged F-O distance and nuclear magnetic shielding constant because of the smaller degree of quantum delocalization for deuterium. PMID:17388289

  4. A life cycle cost analysis framework for geologic storage of hydrogen : a scenario analysis.

    SciTech Connect

    Kobos, Peter Holmes; Lord, Anna Snider; Borns, David James

    2010-10-01

    The U.S. Department of Energy has an interest in large-scale hydrogen geostorage, which would offer substantial buffer capacity to meet possible disruptions in supply. Geostorage options being considered are salt caverns, depleted oil/gas reservoirs, aquifers and, potentially, hard rock caverns. DOE has an interest in assessing the geological, geomechanical and economic viability of these types of hydrogen storage options. This study has developed an economic analysis methodology to address the costs entailed in developing and operating an underground geologic storage facility. This year the tool was updated specifically to (1) provide a fully arrayed version such that all four types of geologic storage options can be assessed at the same time, (2) incorporate specific scenarios illustrating the model's capability, and (3) incorporate more accurate model input assumptions for the wells and storage site modules. Drawing from the knowledge gained from underground large-scale geostorage of natural gas and petroleum in the U.S. and from the potential to store relatively large volumes of CO{sub 2} in geological formations, the hydrogen storage assessment modeling will continue to build on these strengths while maintaining modeling transparency such that other modeling efforts may draw from this project.

  5. Quantitative Analysis of Spectral Impacts on Silicon Photodiode Radiometers: Preprint

    SciTech Connect

    Myers, D. R.

    2011-04-01

    Inexpensive broadband pyranometers with silicon photodiode detectors have a non-uniform spectral response over the spectral range of 300-1100 nm. The response region includes only about 70% to 75% of the total energy in the terrestrial solar spectral distribution from 300 nm to 4000 nm. The solar spectrum constantly changes with solar position and atmospheric conditions. Relative spectral distributions of diffuse hemispherical irradiance sky radiation and total global hemispherical irradiance are drastically different. This analysis convolves a typical photodiode response with SMARTS 2.9.5 spectral model spectra for different sites and atmospheric conditions. Differences in solar component spectra lead to differences on the order of 2% in global hemispherical and 5% or more in diffuse hemispherical irradiances from silicon radiometers. The result is that errors of more than 7% can occur in the computation of direct normal irradiance from global hemispherical irradiance and diffuse hemispherical irradiance using these radiometers.
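    The final step mentioned above, computing direct normal irradiance from global and diffuse hemispherical irradiance, uses the closure relation GHI = DNI·cos(Z) + DHI, so component errors propagate directly into the derived DNI. A small sketch (the error magnitudes below are illustrative assumptions, not values from the paper):

```python
import math

def dni_from_components(ghi, dhi, zenith_deg):
    """Direct normal irradiance from the closure relation
    GHI = DNI*cos(Z) + DHI  =>  DNI = (GHI - DHI)/cos(Z)."""
    cz = math.cos(math.radians(zenith_deg))
    return (ghi - dhi) / cz

# Hypothetical true components: GHI 800, DHI 100 W/m^2 at Z = 30 deg.
true_dni = dni_from_components(800.0, 100.0, 30.0)
# Assume a +2% GHI error and a -5% DHI error from spectral mismatch:
biased_dni = dni_from_components(800.0 * 1.02, 100.0 * 0.95, 30.0)
err_pct = 100.0 * (biased_dni - true_dni) / true_dni
print(round(true_dni, 1), round(err_pct, 1))  # 808.3 3.0
```

Because the two component errors have opposite signs, they compound rather than cancel in the difference GHI - DHI, which is how errors exceeding either individual component error can arise.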

  6. Quantitative radiographic analysis of fiber reinforced polymer composites.

    PubMed

    Baidya, K P; Ramakrishna, S; Rahman, M; Ritchie, A

    2001-01-01

    X-ray radiographic examination of the bone fracture healing process is a widely used method in the treatment and management of patients. Medical devices made of metallic alloys reportedly produce considerable artifacts that make the interpretation of radiographs difficult. Fiber reinforced polymer composite materials have been proposed to replace metallic alloys in certain medical devices because of their radiolucency, light weight, and tailorable mechanical properties. The primary objective of this paper is to provide a comparative radiographic analysis of different fiber reinforced polymer composites that are considered suitable for biomedical applications. Composite materials investigated consist of glass, aramid (Kevlar-29), and carbon reinforcement fibers, and epoxy and polyether-ether-ketone (PEEK) matrices. The total mass attenuation coefficient of each material was measured using clinical X-rays (50 keV). The carbon fiber reinforced composites were found to be more radiolucent than the glass and Kevlar fiber reinforced composites. PMID:11261603
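    A total mass attenuation coefficient of the kind measured here follows from a transmission measurement via the Beer-Lambert law. A minimal sketch with hypothetical values (the density, thickness, and transmission below are assumed for illustration, not taken from the paper):

```python
import math

def mass_attenuation(i0, i, density_g_cm3, thickness_cm):
    """Total mass attenuation coefficient mu/rho (cm^2/g) from
    Beer-Lambert: I = I0 * exp(-(mu/rho) * rho * t)."""
    return math.log(i0 / i) / (density_g_cm3 * thickness_cm)

# Hypothetical measurement on a composite plate: 40% of the incident
# X-ray intensity is transmitted through a 0.5 cm, 1.6 g/cm^3 sample.
mu_rho = mass_attenuation(100.0, 40.0, 1.6, 0.5)
print(round(mu_rho, 3))  # 1.145
```

A lower mu/rho at the clinical energy corresponds to the greater radiolucency reported for the carbon fiber composites.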

  7. Automated monitoring and quantitative analysis of feeding behaviour in Drosophila

    PubMed Central

    Itskov, Pavel M.; Moreira, José-Maria; Vinnik, Ekaterina; Lopes, Gonçalo; Safarik, Steve; Dickinson, Michael H.; Ribeiro, Carlos

    2014-01-01

    Food ingestion is one of the defining behaviours of all animals, but its quantification and analysis remain challenging. This is especially the case for feeding behaviour in small, genetically tractable animals such as Drosophila melanogaster. Here, we present a method based on capacitive measurements, which allows the detailed, automated and high-throughput quantification of feeding behaviour. Using this method, we were able to measure the volume ingested in single sips of an individual, and monitor the absorption of food with high temporal resolution. We demonstrate that flies ingest food by rhythmically extending their proboscis with a frequency that is not modulated by the internal state of the animal. Instead, hunger and satiety homeostatically modulate the microstructure of feeding. These results highlight similarities of food intake regulation between insects, rodents, and humans, pointing to a common strategy in how the nervous systems of different animals control food intake. PMID:25087594

  8. Digital photogrammetry for quantitative wear analysis of retrieved TKA components.

    PubMed

    Grochowsky, J C; Alaways, L W; Siskey, R; Most, E; Kurtz, S M

    2006-11-01

    The use of new materials in knee arthroplasty demands a way in which to accurately quantify wear in retrieved components. Methods such as damage scoring, coordinate measurement, and in vivo wear analysis have been used in the past. The limitations in these methods illustrate a need for a different methodology that can accurately quantify wear, which is relatively easy to perform and uses a minimal amount of expensive equipment. Off-the-shelf digital photogrammetry represents a potentially quick and easy alternative to what is readily available. Eighty tibial inserts were visually examined for front and backside wear and digitally photographed in the presence of two calibrated reference fields. All images were segmented (via manual and automated algorithms) using Adobe Photoshop and National Institute of Health ImageJ. Finally, wear was determined using ImageJ and Rhinoceros software. The absolute accuracy of the method and repeatability/reproducibility by different observers were measured in order to determine the uncertainty of wear measurements. To determine if variation in wear measurements was due to implant design, 35 implants of the three most prevalent designs were subjected to retrieval analysis. The overall accuracy of area measurements was 97.8%. The error in automated segmentation was found to be significantly lower than that of manual segmentation. The photogrammetry method was found to be reasonably accurate and repeatable in measuring 2-D areas and applicable to determining wear. There was no significant variation in uncertainty detected among different implant designs. Photogrammetry has a broad range of applicability since it is size- and design-independent. A minimal amount of off-the-shelf equipment is needed for the procedure and no proprietary knowledge of the implant is needed. PMID:16649169

  9. Hydration of hyaluronan polysaccharide observed by IR spectrometry. II. Definition and quantitative analysis of elementary hydration spectra and water uptake.

    PubMed

    Haxaire, K; Maréchal, Y; Milas, M; Rinaudo, M

    2003-01-01

    We recorded a series of spectra of sodium hyaluronan (HA) films that were in equilibrium with their surrounding humid atmosphere. The hygrometry of this atmosphere extended from 0 to 0.97 relative humidity. We performed a quantitative analysis of the corresponding series of hydration spectra, which are the difference spectra of the film at a defined hygrometry minus the spectrum of the dried film (hygrometry = 0). The principle of this analysis is to use this series of hydration spectra to define a limited number (four) of "elementary hydration spectra" over which all hydration spectra can be decomposed with good accuracy. This decomposition, combined with measurements, in these elementary hydration spectra, of the three characteristic vibrational bands of H(2)O, allowed us to calculate the number of H(2)O molecules at their origin and hence the hydration number under different relative humidity conditions. This number compares well with that determined by thermogravimetry. Furthermore, the decomposition defines, for each hygrometry value, which chemical mechanisms, represented by elementary hydration spectra, are active. This analysis is pursued by determining, for the elementary hydration spectra, the number of hydrogen bonds established by each of the four alcohol groups found in each disaccharide repeat unit, before performing the same analysis for the amide and carboxylate groups. These results are later utilized to discuss the structure of HA at various stages of hydration. PMID:12722111

  10. Quantitative analysis by mid-infrared spectrometry in food and agro-industrial fields

    NASA Astrophysics Data System (ADS)

    Dupuy, Nathalie; Huvenne, J. P.; Sombret, B.; Legrand, P.

    1993-03-01

    Thanks to what has been achieved with the Fourier transform, infrared spectroscopy can now become a state-of-the-art tool in quality control laboratories, considering its precision and the time it saves compared with traditional analysis methods such as HPLC chromatography. Moreover, the increasing number of new mathematical regression methods, such as Partial Least Squares (PLS) regression, allows multicomponent quantitative analysis of mixtures. Nevertheless, the efficiency of infrared spectrometry as a quantitative analysis method often depends on the choice of an adequate presentation for the sample. In this document, we demonstrate several techniques, such as diffuse reflectance and Attenuated Total Reflectance (ATR), which can be chosen according to the various physical states of the mixtures. The quantitative analysis of real samples from the food industry enables us to estimate its precision. For instance, the analysis of the three main components (glucose, fructose and maltose) in glucose syrups can be done (using ATR) with a precision in the region of 3%, whereas the time required to obtain an analysis report is about 5 minutes. Finally, multicomponent quantitative analysis is quite feasible by mid-IR spectroscopy.
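    The paper's calibrations use PLS regression; as a simpler stand-in, multicomponent quantification can be sketched with classical least squares, where a mixture spectrum is modeled as a concentration-weighted sum of pure-component spectra. Everything below (band positions, concentrations, noise level) is synthetic:

```python
import numpy as np

wavenumbers = np.linspace(0, 1, 200)

def band(center, width):
    """Gaussian absorption band on the normalized wavenumber axis."""
    return np.exp(-((wavenumbers - center) / width) ** 2)

# Hypothetical pure-component spectra standing in for glucose,
# fructose and maltose.
S = np.stack([band(0.3, 0.05), band(0.5, 0.05), band(0.7, 0.05)], axis=1)

true_c = np.array([0.5, 0.3, 0.2])
mixture = S @ true_c + 0.001 * np.random.default_rng(0).standard_normal(200)

# Solve S c ~= mixture in the least-squares sense.
est_c, *_ = np.linalg.lstsq(S, mixture, rcond=None)
print(np.round(est_c, 2))  # close to [0.5, 0.3, 0.2]
```

PLS improves on this when the pure spectra are unknown or collinear; the least-squares version only illustrates the underlying linear-mixture assumption.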

  11. High throughput comparative proteome analysis using a quantitative cysteinyl-peptide enrichment technology

    SciTech Connect

    Liu, Tao; Qian, Weijun; Strittmatter, Eric F.; Camp, David G.; Anderson, Gordon A.; Thrall, Brian D.; Smith, Richard D.

    2004-09-15

    A new quantitative cysteinyl-peptide enrichment technology (QCET) was developed to achieve higher efficiency, greater dynamic range, and higher throughput in quantitative proteomics that use stable-isotope labeling techniques combined with high resolution liquid chromatography (LC)-mass spectrometry (MS). This approach involves {sup 18}O labeling of tryptic peptides, high efficiency enrichment of cysteine-containing peptides, and confident protein identification and quantification using the accurate mass and time tag strategy. Proteome profiling of naive and in vitro-differentiated human mammary epithelial cells using QCET resulted in the identification and quantification of 603 proteins in a single LC-Fourier transform ion cyclotron resonance MS analysis. Advantages of this technology include: (1) a simple, highly efficient method for enriching cysteinyl-peptides; (2) a high throughput strategy suitable for extensive proteome analysis; and (3) improved labeling efficiency for better quantitative measurements. This technology enhances both the functional analysis of biological systems and the detection of potential clinical biomarkers.

  12. Statistical shape analysis using 3D Poisson equation-A quantitatively validated approach.

    PubMed

    Gao, Yi; Bouix, Sylvain

    2016-05-01

    Statistical shape analysis has been an important area of research with applications in biology, anatomy, neuroscience, agriculture, paleontology, etc. Unfortunately, the proposed methods are rarely quantitatively evaluated, and as shown in recent studies, when they are evaluated, significant discrepancies exist in their outputs. In this work, we concentrate on the problem of finding the consistent location of deformation between two populations of shapes. We propose a new shape analysis algorithm along with a framework to perform a quantitative evaluation of its performance. Specifically, the algorithm constructs a Signed Poisson Map (SPoM) by solving two Poisson equations on volumetric shapes of arbitrary topology, and statistical analysis is then carried out on the SPoMs. The method is quantitatively evaluated on synthetic shapes and applied to real shape data sets of brain structures. PMID:26874288
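    The core numerical step, solving a Poisson equation on a volumetric shape with a Dirichlet boundary condition, can be sketched in 2-D with plain Jacobi iteration (a toy stand-in for the volumetric SPoM construction; the disk shape and parameters are hypothetical):

```python
import numpy as np

def solve_poisson(mask, rhs=1.0, iters=2000):
    """Jacobi iteration for -laplacian(u) = rhs inside `mask`,
    with u = 0 outside (Dirichlet boundary). Grid spacing h = 1."""
    u = np.zeros(mask.shape)
    for _ in range(iters):
        avg = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                      np.roll(u, 1, 1) + np.roll(u, -1, 1))
        u = np.where(mask, avg + 0.25 * rhs, 0.0)
    return u

# Hypothetical shape: a filled disk on a 33x33 grid.
yy, xx = np.mgrid[-16:17, -16:17]
disk = xx**2 + yy**2 <= 12**2
u = solve_poisson(disk)
# u is largest near the center and falls to zero at the boundary,
# so level sets of u encode depth within the shape.
print(round(float(u.max()), 1))
```

In the actual method, statistics are then computed on maps derived from such solutions rather than on the raw shapes.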

  13. Quantitative autoradiography with radiopharmaceuticals, Part 1: Digital film-analysis system by videodensitometry: concise communication

    SciTech Connect

    Yonekura, Y.; Brill, A.B.; Som, P.; Bennett, G.W.; Fand, I.

    1983-03-01

    A simple low-cost digital film-analysis system using videodensitometry was developed to quantitate autoradiograms. It is based on a TV-film analysis system coupled to a minicomputer. Digital sampling of transmitted light intensities through the autoradiogram is performed with 8-bit gray levels according to the selected array size (128 X 128 to 1024 X 1024). The performance characteristics of the system provide sufficient stability, uniformity, linearity, and intensity response for use in quantitative analysis. Digital images of the autoradiograms are converted to radioactivity content, pixel by pixel, using step-wedge standards. This type of low-cost system can be installed on conventional mini-computers commonly used in modern nuclear medical facilities. Quantitative digital autoradiography can play an important role, with applications stretching from dosimetry calculations of radiopharmaceuticals to metabolic studies in conjunction with positron-emission tomography.
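    The pixel-by-pixel conversion described above, from transmitted-light gray level to optical density and then to radioactivity via step-wedge standards, can be sketched as follows (the wedge calibration values are hypothetical):

```python
import numpy as np

def to_optical_density(gray, i0=255.0):
    """Transmitted-light gray level -> optical density, OD = log10(I0/I)."""
    return np.log10(i0 / np.clip(gray, 1, None))

# Hypothetical step-wedge calibration: known activities (kBq/g) and the
# optical densities they produced on this film.
wedge_od = np.array([0.1, 0.4, 0.8, 1.3, 1.9])
wedge_activity = np.array([0.0, 5.0, 15.0, 40.0, 100.0])

def od_to_activity(od):
    """Piecewise-linear lookup against the step-wedge standards."""
    return np.interp(od, wedge_od, wedge_activity)

pixels = np.array([200, 100, 25], dtype=float)   # 8-bit gray levels
activity = od_to_activity(to_optical_density(pixels))
print(np.round(activity, 1))  # activity rises as the film darkens
```

In practice the film response is nonlinear, which is why a measured wedge curve rather than a single scale factor is interpolated.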

  14. Global Assessment of Hydrogen Technologies – Tasks 3 & 4 Report Economic, Energy, and Environmental Analysis of Hydrogen Production and Delivery Options in Select Alabama Markets: Preliminary Case Studies

    SciTech Connect

    Fouad, Fouad H.; Peters, Robert W.; Sisiopiku, Virginia P.; Sullivan, Andrew J.; Gillette, Jerry; Elgowainy, Amgad; Mintz, Marianne

    2007-12-01

    This report documents a set of case studies developed to estimate the cost of producing, storing, delivering, and dispensing hydrogen for light-duty vehicles for several scenarios involving metropolitan areas in Alabama. While the majority of the scenarios focused on centralized hydrogen production and pipeline delivery, alternative delivery modes were also examined. Although Alabama was used as the case study for this analysis, the results provide insights into the unique requirements for deploying hydrogen infrastructure in smaller urban and rural environments that lie outside the DOE’s high priority hydrogen deployment regions. Hydrogen production costs were estimated for three technologies – steam-methane reforming (SMR), coal gasification, and thermochemical water-splitting using advanced nuclear reactors. In all cases examined, SMR has the lowest production cost for the demands associated with metropolitan areas in Alabama. Although other production options may be less costly for larger hydrogen markets, these were not examined within the context of the case studies.

  15. Segmentation and learning in the quantitative analysis of microscopy images

    NASA Astrophysics Data System (ADS)

    Ruggiero, Christy; Ross, Amy; Porter, Reid

    2015-02-01

    In material science and bio-medical domains the quantity and quality of microscopy images is rapidly increasing and there is a great need to automatically detect, delineate and quantify particles, grains, cells, neurons and other functional "objects" within these images. These are challenging problems for image processing because of the variability in object appearance that inevitably arises in real world image acquisition and analysis. One of the most promising (and practical) ways to address these challenges is interactive image segmentation. These algorithms are designed to incorporate input from a human operator to tailor the segmentation method to the image at hand. Interactive image segmentation is now a key tool in a wide range of applications in microscopy and elsewhere. Historically, interactive image segmentation algorithms have tailored segmentation on an image-by-image basis, and information derived from operator input is not transferred between images. But recently there has been increasing interest to use machine learning in segmentation to provide interactive tools that accumulate and learn from the operator input over longer periods of time. These new learning algorithms reduce the need for operator input over time, and can potentially provide a more dynamic balance between customization and automation for different applications. This paper reviews the state of the art in this area, provides a unified view of these algorithms, and compares the segmentation performance of various design choices.

  16. Quantitative analysis of cardiac lesions in chronic canine chagasic cardiomyopathy.

    PubMed

    Caliari, Marcelo Vidigal; do Pilar Machado, Raquel; de Lana, Marta; Caja, Rosângela Aparecida França; Carneiro, Cláudia Martins; Bahia, Maria Teresinha; dos Santos, César Augusto Bueno; Magalhaes, Gustavo Albergaria; Sampaio, Ivan Barbosa Machado; Tafuri, Washington Luiz

    2002-01-01

    Lesions observed in chronic chagasic cardiopathy frequently produce electrocardiographic alterations and affect cardiac function. Through a computerized morphometrical analysis we quantified the areas occupied by cardiac muscle, connective and adipose tissues in the right atrium of dogs experimentally infected with Trypanosoma cruzi. All of the infected dogs showed chronic myocarditis with variable levels of cardiac muscle reduction, fibrosis and adipose tissue replacement. In the atrial myocardium of dogs infected with strains Be78 and Be62, cardiac muscle represented 34 and 50%, fibrosis 28 and 32%, and adipose tissue 38 and 18%, respectively. The fibrosis observed was both diffuse and focal and mostly intrafascicular, either partially or completely interrupting the path of muscle bundles. Such histological alterations probably contributed to the appearance of the electrocardiographic disturbances verified in 10 out of 11 dogs, which are also common in human chronic chagasic cardiopathy. Fibrosis was the most important microscopic finding, since it produces rearrangements of collagen fibers in relation to myocardiocytes that change the anatomical physiognomy and mechanical behavior of the myocardium. These abnormalities can contribute to the appearance of cardiac malfunction, arrhythmias and congestive cardiac insufficiency, as observed in two of the analyzed dogs. Strain Be78 caused greater destruction of atrial cardiac muscle than strain Be62. PMID:12436168

  17. Machine learning methods for quantitative analysis of Raman spectroscopy data

    NASA Astrophysics Data System (ADS)

    Madden, Michael G.; Ryder, Alan G.

    2003-03-01

    The automated identification and quantification of illicit materials using Raman spectroscopy is of significant importance for law enforcement agencies. This paper explores the use of Machine Learning (ML) methods in comparison with standard statistical regression techniques for developing automated identification methods. In this work, the ML task is broken into two sub-tasks, data reduction and prediction. In well-conditioned data, the number of samples should be much larger than the number of attributes per sample, to limit the degrees of freedom in predictive models. In this spectroscopy data, the opposite is normally true. Predictive models based on such data have a high number of degrees of freedom, which increases the risk of models over-fitting to the sample data and having poor predictive power. In the work described here, an approach to data reduction based on Genetic Algorithms is described. For the prediction sub-task, the objective is to estimate the concentration of a component in a mixture, based on its Raman spectrum and the known concentrations of previously seen mixtures. Here, Neural Networks and k-Nearest Neighbours are used for prediction. Preliminary results are presented for the problem of estimating the concentration of cocaine in solid mixtures, and compared with previously published results in which statistical analysis of the same dataset was performed. Finally, this paper demonstrates how more accurate results may be achieved by using an ensemble of prediction techniques.
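    The k-Nearest Neighbours prediction step, estimating a concentration from a spectrum by averaging the concentrations of the most similar previously seen mixtures, can be sketched with synthetic spectra (the single-band model and all values below are illustrative assumptions, not the paper's Raman data):

```python
import numpy as np

def knn_predict(train_X, train_y, query, k=3):
    """Predict a concentration as the mean over the k nearest
    training spectra (Euclidean distance)."""
    d = np.linalg.norm(train_X - query, axis=1)
    nearest = np.argsort(d)[:k]
    return train_y[nearest].mean()

rng = np.random.default_rng(1)
grid = np.linspace(0, 1, 50)

def spectrum(conc):
    # Hypothetical: one band whose height scales with concentration.
    return conc * np.exp(-((grid - 0.5) / 0.1) ** 2) \
        + 0.002 * rng.standard_normal(50)

train_c = np.linspace(0.0, 1.0, 21)
train_X = np.stack([spectrum(c) for c in train_c])

query = spectrum(0.42)        # unseen mixture at 42% concentration
pred = knn_predict(train_X, train_c, query)
print(round(pred, 2))         # lands near the true value of 0.42
```

Real spectra would first pass through the Genetic Algorithm data-reduction step described above before distances are computed, precisely to avoid the high-dimensionality problem the paper discusses.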

  18. Quantitative real-time single particle analysis of virions

    SciTech Connect

    Heider, Susanne; Metzner, Christoph

    2014-08-15

    Providing information about single virus particles has for a long time been mainly the domain of electron microscopy. More recently, technologies have been developed—or adapted from other fields, such as nanotechnology—to allow for the real-time quantification of physical virion particles, while concomitantly supplying additional information such as particle diameter. These technologies have progressed to the stage of commercialization, increasing the speed of viral titer measurements from hours to minutes and thus providing a significant advantage for many aspects of virology research and biotechnology applications. Additional advantages lie in the broad spectrum of virus species that may be measured and the possibility of determining the ratio of infectious to total particles. A series of disadvantages remain associated with these technologies, such as a low specificity for viral particles. In this review we discuss these technologies by comparing four systems for real-time single virus particle analysis and quantification. Highlights: • We introduce four methods for virus particle-based quantification of viruses. • They allow for quantification of a wide range of samples in under an hour. • The additional measurement of size and zeta potential is possible for some.

  19. Funtools: Fits Users Need Tools for Quick, Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Mandel, Eric; Brederkamp, Joe (Technical Monitor)

    2001-01-01

    The Funtools project arose out of conversations with astronomers about the decline in their software development efforts over the past decade. A stated reason for this decline is that it takes too much effort to master one of the existing FITS libraries simply in order to write a few analysis programs. This problem is exacerbated by the fact that astronomers typically develop new programs only occasionally, and the long interval between coding efforts often necessitates re-learning the FITS interfaces. We therefore set ourselves the goal of developing a minimal buy-in FITS library for researchers who are occasional (but serious) coders. In this case, "minimal buy-in" meant "easy to learn, easy to use, and easy to re-learn next month". Based on conversations with astronomers interested in writing code, we concluded that this goal could be achieved by emphasizing two essential capabilities. The first was the ability to write FITS programs without knowing much about FITS, i.e., without having to deal with the arcane rules for generating a properly formatted FITS file. The second was to support the use of already-familiar C/Unix facilities, especially C structs and Unix stdio. Taken together, these two capabilities would allow researchers to leverage their existing programming expertise while minimizing the need to learn new and complex coding rules.

  20. Quantitative image analysis of histological sections of coronary arteries

    NASA Astrophysics Data System (ADS)

    Holmes, David R., III; Robb, Richard A.

    2000-06-01

    The study of coronary arteries has evolved from examining gross anatomy and morphology to scrutinizing micro-anatomy and cellular composition. Technological advances such as high-resolution digital microscopes and high-precision cutting devices have allowed examination of coronary artery morphology and pathology at micron resolution. We have developed a software toolkit to analyze histological sections. In particular, we are currently engaged in examining normal coronary arteries in order to provide the foundation for studying remodeled tissue. The first of two coronary arteries was stained for elastin and collagen. The second coronary artery was sectioned and stained for cellular nuclei and smooth muscle. High-resolution light microscopy was used to image the sections. Segmentation was accomplished initially with slice-to-slice thresholding algorithms. These segmentation techniques choose optimal threshold values by modeling the tissue as one or more distributions. Morphology and image statistics were used to further differentiate the thresholded data into different tissue categories and thereby refine the results of the segmentation. Specificity/sensitivity analysis suggests that automatic segmentation can be very effective: for both tissue samples, greater than 90% specificity was achieved. Summed voxel projection and maximum intensity projection appear to be effective 3-D visualization tools. Shading methods also provide useful visualization; however, it is important to incorporate combined 2-D and 3-D displays. Surface rendering techniques (e.g. color mapping) can be used for visualizing parametric data. Preliminary results are promising, but continued development of algorithms is needed.
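    Threshold selection by "modeling the tissue as one or more distributions" can be illustrated with Otsu's method, which picks the gray level maximizing between-class variance under a two-distribution model (a generic stand-in for illustration; the authors' exact algorithm is not specified in the abstract):

```python
import numpy as np

def otsu_threshold(image):
    """Threshold maximizing between-class variance (Otsu's method),
    a standard way to separate two intensity distributions."""
    hist, _ = np.histogram(image, bins=256, range=(0, 256))
    p = hist / hist.sum()
    levels = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()     # class weights
        if w0 == 0 or w1 == 0:
            continue
        m0 = (levels[:t] * p[:t]).sum() / w0  # class means
        m1 = (levels[t:] * p[t:]).sum() / w1
        var = w0 * w1 * (m0 - m1) ** 2        # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Hypothetical two-tissue image: dark background ~50, bright tissue ~180.
rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(50, 10, 5000), rng.normal(180, 10, 5000)])
img = np.clip(img, 0, 255)
t = otsu_threshold(img)
print(50 < t < 180)  # True: threshold falls between the two modes
```

Running such a threshold slice-to-slice, then refining with morphology and image statistics as described, mirrors the pipeline in the abstract.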

  1. Quantitative analysis of bloggers' collective behavior powered by emotions

    NASA Astrophysics Data System (ADS)

    Mitrović, Marija; Paltoglou, Georgios; Tadić, Bosiljka

    2011-02-01

    Large-scale data resulting from users' online interactions provide the ultimate source of information to study emergent social phenomena on the Web. From individual actions of users to observable collective behaviors, different mechanisms involving emotions expressed in the posted text play a role. Here we combine approaches of statistical physics with machine-learning methods of text analysis to study the emergence of emotional behavior among Web users. Mapping the high-resolution data from digg.com onto bipartite networks of users and their comments onto posted stories, we identify user communities centered around certain popular posts and determine emotional contents of the related comments by the emotion classifier developed for this type of text. Applied over different time periods, this framework reveals strong correlations between the excess of negative emotions and the evolution of communities. We observe avalanches of emotional comments exhibiting significant self-organized critical behavior and temporal correlations. To explore the robustness of these critical states, we design a network-automaton model on realistic network connections and several control parameters, which can be inferred from the dataset. Dissemination of emotions by a small fraction of very active users appears to critically tune the collective states.

  2. Quantitative Analysis of the Microstructure of Auxetic Foams

    SciTech Connect

    Gaspar, N.; Smith, C.W.; Miller, E.A.; Seidler, G.T.; Evans, K.E.

    2008-07-28

    The auxetic foams first produced by Lakes have been modelled in a variety of ways, each model trying to reproduce some observed feature of the microscale of the foams. Such features include bent or broken ribs or inverted angles between ribs. These models can reproduce the Poisson's ratio or Poisson's function of auxetic foam if the model parameters are carefully chosen. However these model parameters may not actually reflect the internal structure of the foams. A big problem is that measurement of parameters such as lengths and angles is not straightforward within a 3-d sample. In this work a sample of auxetic foam has been imaged by 3-d X-ray computed tomography. The resulting image is translated to a form that emphasises the geometrical structure of connected ribs. This connected rib data are suitably analysed to describe both the microstructural construction of auxetic foams and the statistical spread of structure, that is, the heterogeneity of an auxetic foam. From the analysis of the microstructure, observations are made about the requirements for microstructural models and comparisons made to previous existing models. From the statistical data, measures of heterogeneity are made that will help with future modelling that includes the heterogeneous aspect of auxetic foams.

  3. Quantitative Analysis with Heavy Ion E-TOF ERD

    SciTech Connect

    Banks, J.C.; Doyle, B.L.; Font, A. Climent

    1999-07-23

    Heavy-ion TOF ERD combined with energy detection (E-TOF-ERD) is a powerful analytical technique that takes advantage of the following facts: the scattering cross section is usually very high ({approximately}10{sup {minus}21} cm{sup 2}/sr) compared with regular He RBS ({approximately}10{sup {minus}25} cm{sup 2}/sr); in contrast to the energy resolution of ordinary solid-state surface-barrier detectors, the time resolution is almost independent of the atomic mass of the detected element; and the detection in coincidence of time and energy signals allows for the mass separation of overlapping signals with the same energy (or time of flight). Measurements on several oxides have been performed with the E-TOF-ERD setup at Sandia National Laboratories using an incident beam of 10-15 MeV Au. Information on the composition of the sample is obtained from the time-domain spectrum, which is converted to the energy domain; the analysis is then performed using existing software codes. During the quantification of the results, problems were found related to the interaction of the beam with the sample and to the tabulated values of the stopping powers for heavy ions.

  4. Direct Quantitative Analysis of Arsenic in Coal Fly Ash

    PubMed Central

    Hartuti, Sri; Kambara, Shinji; Takeyama, Akihiro; Kumabe, Kazuhiro; Moritomi, Hiroshi

    2012-01-01

    A rapid, simple method based on graphite furnace atomic absorption spectrometry is described for the direct determination of arsenic in coal fly ash. Solid samples were directly introduced into the atomizer without preliminary treatment. The direct analysis method was not always free of spectral matrix interference, but the stabilization of arsenic by adding palladium nitrate (chemical modifier) and the optimization of the parameters in the furnace program (temperature, rate of temperature increase, hold time, and argon gas flow) gave good results for the total arsenic determination. The optimal furnace program was determined by analyzing different concentrations of a reference material (NIST1633b), which showed the best linearity for calibration. The optimized parameters for the furnace programs for the ashing and atomization steps were as follows: temperatures of 500–1200 and 2150°C, heating rates of 100 and 500°C s−1, hold times of 90 and 7 s, and medium then maximum and medium argon gas flows, respectively. The calibration plots were linear with a correlation coefficient of 0.9699. This method was validated using arsenic-containing raw coal samples in accordance with the requirements of the mass balance calculation; the distribution rate of As in the fly ashes ranged from 101 to 119%. PMID:23251836
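
    The calibration step described above (standards of known concentration fit by least squares, then inverted to read an unknown) can be sketched as follows; the standard masses and absorbances are invented placeholders, not the NIST1633b data:

```python
# Sketch: linear calibration curve for graphite furnace AAS quantification.
def linear_fit(xs, ys):
    """Ordinary least squares y = a*x + b; returns (a, b, r)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    syy = sum((y - my) ** 2 for y in ys)
    a = sxy / sxx
    b = my - a * mx
    r = sxy / (sxx * syy) ** 0.5   # correlation coefficient
    return a, b, r

conc = [0.0, 10.0, 20.0, 40.0]       # hypothetical standard masses (ng As)
absb = [0.002, 0.101, 0.198, 0.405]  # hypothetical peak absorbances
slope, intercept, r = linear_fit(conc, absb)
unknown = (0.250 - intercept) / slope  # invert the line to read an unknown
```
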

  5. Fibrin Architecture in Clots: A Quantitative Polarized Light Microscopy Analysis

    PubMed Central

    Whittaker, Peter; Przyklenk, Karin

    2009-01-01

    Fibrin plays a vital structural role in thrombus integrity. Thus, the ability to assess fibrin architecture has the potential to provide insight into thrombosis and thrombolysis. Fibrin has an anisotropic molecular structure, which enables it to be seen with polarized light. Therefore, we aimed to determine if automated polarized light microscopy methods of quantifying two structural parameters: fibrin fiber bundle orientation and fibrin's optical retardation (OR: a measure of molecular anisotropy) could be used to assess thrombi. To compare fibrin fiber bundle orientation we analyzed picrosirius red-stained sections obtained from clots formed: (A) in vitro, (B) in injured and stenotic coronary arteries, and (C) in surgically created aortic aneurysms (n = 6 for each group). To assess potential changes in OR, we examined fibrin in picrosirius red-stained clots formed after ischemic preconditioning (10 minutes ischemia + 10 minutes reflow; a circumstance shown to enhance lysability) and in control clots (n = 8 each group). The degree of fibrin organization differed significantly according to the location of clot formation; fibrin was most aligned in the aneurysms and least aligned in vitro whereas fibrin in the coronary clots had an intermediate organization. The OR of fibrin in the clots formed after ischemic preconditioning was lower than that in controls (2.9 ± 0.5 nm versus 5.4 ± 1.0 nm, P < 0.05). The automated polarized light analysis methods not only enabled fibrin architecture to be assessed, but also revealed structural differences in clots formed under different circumstances. PMID:19054699

  6. Quantitative analysis of American woodcock nest and brood habitat

    USGS Publications Warehouse

    Bourgeois, A.

    1977-01-01

    Sixteen nest and 19 brood sites of American woodcock (Philohela minor) were examined in northern lower Michigan between 15 April and 15 June 1974 to determine habitat structure associated with these sites. Woodcock hens utilized young, second-growth forest stands which were similar in species composition for both nesting and brood rearing. A multivariate discriminant function analysis revealed a significant (P < 0.05) difference, however, in habitat structure. Nest habitat was characterized by lower tree density (2176 trees/ha) and basal area (8.6 m2/ha), by being close to forest openings (7 m), and by being situated on dry, relatively well drained sites. In contrast, woodcock broods were located in sites that had nearly twice the tree density (3934 trees/ha) and basal area (16.5 m2/ha), were located over twice as far from forest openings (18 m), and generally occurred on damp sites, near (8 m) standing water. The importance of these habitat features to the species and possible management implications are discussed.

  7. Segmentation of vascular structures and hematopoietic cells in 3D microscopy images and quantitative analysis

    NASA Astrophysics Data System (ADS)

    Mu, Jian; Yang, Lin; Kamocka, Malgorzata M.; Zollman, Amy L.; Carlesso, Nadia; Chen, Danny Z.

    2015-03-01

    In this paper, we present image processing methods for the quantitative study of changes in the bone marrow microenvironment (characterized by altered vascular structure and hematopoietic cell distribution) caused by disease or other factors. We develop algorithms that automatically segment vascular structures and hematopoietic cells in 3-D microscopy images, perform quantitative analysis of the properties of the segmented vascular structures and cells, and examine how such properties change. In processing images, we apply local thresholding to segment vessels, and add post-processing steps to deal with imaging artifacts. We propose an improved watershed algorithm that relies on both intensity and shape information and can separate multiple overlapping cells better than common watershed methods. We then quantitatively compute various features of the vascular structures and hematopoietic cells, such as the branches and sizes of vessels and the distribution of cells. In analyzing vascular properties, we provide algorithms for pruning spurious vessel segments and branches based on vessel skeletons. Our algorithms can segment vascular structures and hematopoietic cells with good quality. We use our methods to quantitatively examine the changes in the bone marrow microenvironment caused by deletion of the Notch pathway. Our quantitative analysis reveals property changes in samples with the Notch pathway deleted. Our tool is useful for biologists to quantitatively measure changes in the bone marrow microenvironment, and for developing possible therapeutic strategies to help the bone marrow microenvironment recover.
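
    Local thresholding of the kind used here for vessel segmentation compares each voxel against a statistic of its neighborhood rather than a single global cutoff. A minimal 2-D NumPy sketch using box means computed via an integral image; the window size and offset are arbitrary, and this is a generic illustration, not the authors' implementation:

```python
import numpy as np

def local_threshold(img, win=5, offset=0.0):
    """Binarize: keep pixels brighter than their local (win x win) mean + offset."""
    pad = win // 2
    padded = np.pad(img.astype(float), pad, mode="reflect")
    # Integral image lets every box sum be read off in O(1).
    ii = np.cumsum(np.cumsum(padded, axis=0), axis=1)
    ii = np.pad(ii, ((1, 0), (1, 0)))  # zero row/column for clean indexing
    h, w = img.shape
    box_sum = (ii[win:win + h, win:win + w] - ii[:h, win:win + w]
               - ii[win:win + h, :w] + ii[:h, :w])
    local_mean = box_sum / (win * win)
    return img > local_mean + offset
```

    The same scheme extends to 3-D by adding a third cumulative sum; in practice the post-processing steps the abstract mentions are needed to suppress the artifacts that survive thresholding.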

  8. Hydrogen isotope analysis of amino acids and whole cells reflects biosynthetic processing of nutrient- and water-derived hydrogen

    NASA Astrophysics Data System (ADS)

    Griffin, P.; Newsome, S.; Steele, A.; Fogel, M. L.

    2011-12-01

    Hydrogen (H) isotopes serve as sensitive tracers of biochemical processes that can be exploited to answer critical questions in biogeochemistry, ecology, and microbiology. Despite this apparent utility, relatively little is known about the specific mechanisms of H isotope fractionation involved in biosynthesis. In order to understand how organisms incorporate hydrogen from their chemical milieu into biomass, we have cultured the model bacterium E. coli MG1655 in a variety of media composed of deuterium-labeled nutrients and waters. Isotopic analysis of bulk cell mass reveals that the H fractionation between media water and cell material varies as a function of the nutrient source, with commonly used organic food sources (glucose and tryptone) leading to far smaller fractionation signals than non-standard ones (such as formamide, adenine, and urea). In addition, we have completed compound specific isotope analysis of amino acids using combined GC-IRMS. Amino acids harvested from E. coli cultured on glucose in water of varied D/H composition possess an extraordinary range of isotopic compositions (400-600‰). Furthermore, these amino acids follow a systematic distribution of D/H where proline is always heaviest and glycine is always lightest. However, when the short-chain peptide tryptone is used in place of glucose, only the non-essential amino acids reflect media water D/H values, suggesting the direct incorporation of some media-borne amino acids into cellular protein. These observations provide a foundation for understanding the cellular routing of hydrogen obtained from food and water sources and indicate that D/H analysis can serve as a powerful probe of biological function.
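
    Hydrogen isotope results like these are conventionally reported in delta notation relative to the VSMOW standard. A small sketch of that bookkeeping (the example values are arbitrary, not data from the study):

```python
VSMOW_D_H = 155.76e-6  # D/H ratio of Vienna Standard Mean Ocean Water

def delta_d_permil(sample_d_h):
    """delta-D of a sample, in per mil relative to VSMOW."""
    return (sample_d_h / VSMOW_D_H - 1.0) * 1000.0

def fractionation_alpha(delta_product, delta_substrate):
    """Fractionation factor between two pools given their delta-D values."""
    return (1000.0 + delta_product) / (1000.0 + delta_substrate)
```
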

  9. Cooperativity of intermolecular hydrogen bonds in microsolvated DMSO and DMF clusters: a DFT, AIM, and NCI analysis.

    PubMed

    Venkataramanan, Natarajan Sathiyamoorthy

    2016-07-01

    Density functional theory (DFT) calculations are performed to study the hydrogen bonding in DMSO-water and DMF-water complexes. Quantitative molecular electrostatic potential (MESP) and atoms-in-molecules (AIM) analyses are applied to quantify the relative complexation of DMSO and DMF with water molecules. The interaction energy of DMSO with water molecules was higher than in the DMF-water complexes. The existence of a cooperativity effect aids strong complex formation. A linear dependence was observed between the hydrogen bond energies, EHB, and the total electron densities at the BCPs of the microsolvated complexes, which supports the existence of a cooperativity effect in the complexation process. Due to the stronger DMSO/DMF and water interaction, the water molecules in the formed complexes have a different structure than isolated water clusters. NCI analysis shows that the steric area is more pronounced in the DMF-water complex than in the DMSO-water complex, which accounts for the lower stability of the DMF-water complexes compared to the DMSO-water complex. PMID:27278055

  10. Quantitative flux analysis reveals folate-dependent NADPH production

    NASA Astrophysics Data System (ADS)

    Fan, Jing; Ye, Jiangbin; Kamphorst, Jurre J.; Shlomi, Tomer; Thompson, Craig B.; Rabinowitz, Joshua D.

    2014-06-01

    ATP is the dominant energy source in animals for mechanical and electrical work (for example, muscle contraction or neuronal firing). For chemical work, there is an equally important role for NADPH, which powers redox defence and reductive biosynthesis. The most direct route to produce NADPH from glucose is the oxidative pentose phosphate pathway, with malic enzyme sometimes also important. Although the relative contribution of glycolysis and oxidative phosphorylation to ATP production has been extensively analysed, similar analysis of NADPH metabolism has been lacking. Here we demonstrate the ability to directly track, by liquid chromatography-mass spectrometry, the passage of deuterium from labelled substrates into NADPH, and combine this approach with carbon labelling and mathematical modelling to measure NADPH fluxes. In proliferating cells, the largest contributor to cytosolic NADPH is the oxidative pentose phosphate pathway. Surprisingly, a nearly comparable contribution comes from serine-driven one-carbon metabolism, in which oxidation of methylene tetrahydrofolate to 10-formyl-tetrahydrofolate is coupled to reduction of NADP+ to NADPH. Moreover, tracing of mitochondrial one-carbon metabolism revealed complete oxidation of 10-formyl-tetrahydrofolate to make NADPH. As folate metabolism has not previously been considered an NADPH producer, confirmation of its functional significance was undertaken through knockdown of methylenetetrahydrofolate dehydrogenase (MTHFD) genes. Depletion of either the cytosolic or mitochondrial MTHFD isozyme resulted in decreased cellular NADPH/NADP+ and reduced/oxidized glutathione ratios (GSH/GSSG) and increased cell sensitivity to oxidative stress. Thus, although the importance of folate metabolism for proliferating cells has been long recognized and attributed to its function of producing one-carbon units for nucleic acid synthesis, another crucial function of this pathway is generating reducing power.

  11. Quantitative ultrasound texture analysis for clinical decision making support

    NASA Astrophysics Data System (ADS)

    Wu, Jie Ying; Beland, Michael; Konrad, Joseph; Tuomi, Adam; Glidden, David; Grand, David; Merck, Derek

    2015-03-01

    We propose a general ultrasound (US) texture-analysis and machine-learning framework for detecting the presence of disease that is suitable for clinical application across clinicians, disease types, devices, and operators. Its stages are image selection, image filtering, ROI selection, feature parameterization, and classification. Each stage is modular and can be replaced with alternate methods. Thus, this framework is adaptable to a wide range of tasks. Our two preliminary clinical targets are hepatic steatosis and adenomyosis diagnosis. For steatosis, we collected US images from 288 patients and their pathology-determined values of steatosis (%) from biopsies. Two radiologists independently reviewed all images and identified the region of interest (ROI) most representative of the hepatic echotexture for each patient. To parameterize the images into comparable quantities, we filter the US images at multiple scales for various texture responses. For each response, we collect a histogram of pixel features within the ROI, and parameterize it as a Gaussian function using its mean, standard deviation, kurtosis, and skew to create a 36-feature vector. Our algorithm uses a support vector machine (SVM) for classification. Using a threshold of 10%, we achieved 72.81% overall accuracy, 76.18% sensitivity, and 65.96% specificity in identifying steatosis with leave-ten-out cross-validation (p<0.0001). Extending this framework to adenomyosis, we identified 38 patients with MR-confirmed findings of adenomyosis and previous US studies and 50 controls. A single rater picked the best US image and ROI for each case. Using the same processing pipeline, we obtained 76.14% accuracy, 86.00% sensitivity, and 63.16% specificity with leave-one-out cross-validation (p<0.0001).
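
    The feature-parameterization step above reduces each filter-response histogram to four moment statistics. A sketch of that reduction; reading the 36 features as nine filter responses times four statistics is an inference about the layout, not something the abstract states:

```python
def moment_features(values):
    """Mean, standard deviation, skewness, and kurtosis of the pixel values
    behind one histogram (population definitions)."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    std = var ** 0.5
    skew = sum((v - mean) ** 3 for v in values) / n / std ** 3
    kurt = sum((v - mean) ** 4 for v in values) / n / std ** 4
    return [mean, std, skew, kurt]
```

    Concatenating these four numbers across all filter responses yields the fixed-length vector an SVM can consume.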

  12. Analysis of liver connexin expression using reverse transcription quantitative real-time polymerase chain reaction

    PubMed Central

    Maes, Michaël; Willebrords, Joost; Crespo Yanguas, Sara; Cogliati, Bruno; Vinken, Mathieu

    2016-01-01

    Although connexin production is mainly regulated at the protein level, altered connexin gene expression has been identified as the underlying mechanism of several pathologies. When studying the latter, appropriate methods to quantify connexin mRNA levels are required. The present chapter describes a well-established reverse transcription quantitative real-time polymerase chain reaction procedure optimized for analysis of hepatic connexins. The method includes RNA extraction and subsequent quantification, generation of complementary DNA, quantitative real-time polymerase chain reaction and data analysis. PMID:27207283
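
    Data analysis in such RT-qPCR workflows commonly expresses target expression relative to a reference gene with the 2^-ΔΔCt method; the abstract does not spell out the calculation, so this is a generic sketch with invented Ct values:

```python
def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """2^-ddCt: target expression normalized to a reference gene,
    relative to the control condition (assumes ~100% PCR efficiency)."""
    ddct = ((ct_target_treated - ct_ref_treated)
            - (ct_target_control - ct_ref_control))
    return 2.0 ** -ddct

# invented Ct values: a connexin target against a reference transcript
example = fold_change(24.0, 18.0, 26.0, 18.0)  # 4-fold higher in treated cells
```
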

  14. New bone formation in the in vivo implantation of bioceramics. A quantitative analysis.

    PubMed

    Wu, H; Zhu, T B; Du, J Y; Hong, G X; Sun, S Z; Xu, X H

    1992-09-01

    Two kinds of synthetic biomaterial, porous tricalcium phosphate (PTCP) and magnetic porous tricalcium phosphate (MPTCP) ceramic granules, were implanted in rat femur. Over a period of 4 months, the assessment of serial histological sections, scanning electron microphotographs and quantitative analysis of bone formation in the sections showed that both ceramics are biocompatible and degradable in vivo. More new bone formation occurred in the MPTCP group. Endochondral ossification was seen in both groups. The quantitative analysis in this study is reliable, and may be suitable for similar experimental models. PMID:1288979

  15. Quantitative analysis of complex three-dimensional microstructures

    NASA Astrophysics Data System (ADS)

    Genau, Amber Lynn

    The morphological evolution due to coarsening is analyzed for two distinctive types of microstructure. First, the feasibility of characterizing spatial correlations of interfacial curvature in topologically complex structures is demonstrated with the analysis of bicontinuous two-phase mixtures produced using phase field modeling. For structures produced with both conserved and nonconserved dynamics, new characteristic length scales are identified. In the nonconserved case, despite the local evolution law governing interfacial motion, long-range correlations develop that lead to a characteristic length scale associated with the distance between high curvature tunnels. In the conserved case the diffusional dynamics leads to a length scale that is related to correlations and anticorrelations between regions of curvature of opposite sign. Positive correlations due to this length scale can be measured out to seven times the characteristic length of the system. Spatial correlations are also compared for symmetric and asymmetric mixtures produced with conserved dynamics. In addition, the microstructure of directionally solidified and isothermally coarsened Pb-Sn samples are examined at various coarsening times. The samples, composed of Pb-69.1wt%Sn, have an overall volume fraction of 22% solid which is not uniformly distributed through the sample but clustered into regions of approximately 37% solid separated by empty eutectic regions. The morphology of the dendrites, both in the dense regions and at the edge of the eutectic spaces is analyzed using three-dimensional reconstructions, Interface Shape Distributions and Interface Normal Distributions. These methods are used to track the evolution of the structures from being dominated by secondary and tertiary arms in the plane perpendicular to the solidification direction to predominance of the primary stalks running in the solidification direction. 
Finally, the method of characterizing spatial correlations introduced above

  16. Thermodynamic analysis of alternate energy carriers, hydrogen and chemical heat pipes

    NASA Technical Reports Server (NTRS)

    Cox, K. E.; Carty, R. H.; Conger, W. L.; Soliman, M. A.; Funk, J. E.

    1976-01-01

    Hydrogen and chemical heat pipes were proposed as methods of transporting energy from a primary energy source (nuclear, solar) to the user. In the chemical heat pipe system, primary energy is transformed into the energy of a reversible chemical reaction; the chemical species are then transmitted or stored until the energy is required. Analysis of thermochemical hydrogen schemes and chemical heat pipe systems on a second-law efficiency, or available work, basis shows that hydrogen is superior, especially if the end use of the chemical heat pipe is electrical power.
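
    The second-law comparison rests on exergy: heat delivered at temperature T can yield at most the Carnot fraction of work relative to the ambient temperature T0, while a fuel's chemical energy is nearly all available work. A minimal sketch of the heat-side bookkeeping (temperatures are illustrative):

```python
def heat_exergy(q_joule, t_hot_k, t0_k=298.15):
    """Maximum work obtainable from heat q delivered at t_hot_k,
    Carnot-limited against the ambient temperature t0_k."""
    return q_joule * (1.0 - t0_k / t_hot_k)
```

    For example, heat delivered at twice the ambient temperature carries only half its energy as available work, which is the kind of penalty the chemical heat pipe incurs and hydrogen largely avoids.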

  17. Multi-criteria analysis on how to select solar radiation hydrogen production system

    SciTech Connect

    Badea, G.; Naghiu, G. S.; Felseghi, R.-A.; Giurca, I.; Răboacă, S.; Aşchilean, I.

    2015-12-23

    The purpose of this article is to present a method of selecting hydrogen-production systems using the electric power obtained in photovoltaic systems, and as a selecting method, we suggest the use of the Advanced Multi-Criteria Analysis based on the FRISCO formula. According to the case study on how to select the solar radiation hydrogen production system, the most convenient alternative is the alternative A4, namely the technical solution involving a hydrogen production system based on the electrolysis of water vapor obtained with concentrated solar thermal systems and electrical power obtained using concentrating photovoltaic systems.

  18. Thermodynamic analysis on the role of hydrogen in anodic stress corrosion cracking

    SciTech Connect

    Qiao, L.; Mao, X.

    1995-11-01

    A synergistic effect of hydrogen and stress on the corrosion rate was analyzed thermodynamically. The results showed that an interaction of stress and hydrogen could increase the corrosion rate remarkably. Stress corrosion cracking (SCC) of austenitic stainless steel (ASS) was investigated in boiling chloride solution to confirm the analysis. Hydrogen introduced into the specimen became concentrated at the crack tip during SCC in boiling LiCl solution (143 C). The concentration factor is about 3, which is consistent with results calculated from stress-induced diffusion.
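
    The crack-tip concentration factor follows from stress-assisted diffusion, where the equilibrium enrichment scales as exp(σ_h·V_H/RT). A sketch with illustrative numbers chosen only to show how a factor near 3 can arise; the stress and partial molar volume below are assumptions, not values from the paper:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def enrichment(sigma_h_mpa, v_h_cm3_mol, t_k):
    """Equilibrium hydrogen enrichment C/C0 = exp(sigma_h * V_H / (R*T))."""
    sigma_pa = sigma_h_mpa * 1.0e6       # hydrostatic stress, Pa
    v_m3 = v_h_cm3_mol * 1.0e-6          # partial molar volume of H, m^3/mol
    return math.exp(sigma_pa * v_m3 / (R * t_k))

# e.g. an assumed 1900 MPa crack-tip hydrostatic stress, V_H = 2.0 cm^3/mol,
# at the boiling LiCl temperature of about 416 K
factor = enrichment(1900.0, 2.0, 416.0)
```
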

  20. Mechanistic and microkinetic analysis of CO2 hydrogenation on ceria.

    PubMed

    Cheng, Zhuo; Lo, Cynthia S

    2016-03-21

    We use density functional theory (DFT) calculations to investigate the mechanism of CO2 hydrogenation to methanol on a reduced ceria (110) catalyst, which has previously been shown to activate CO2. Two reaction channels to methanol are identified: (1) the COOH pathway via a carboxyl intermediate and (2) the HCOO pathway via a formate intermediate. While formaldehyde (H2CO) appears to be the key intermediate for methanol synthesis, other intermediates, including carbene diol, formic acid, and methanol, are not feasible due to their high formation energies. Furthermore, direct formyl hydrogenation to formaldehyde is not feasible due to its high activation barrier. Instead, we find that conversion of H-formalin (H2COOH*) to formaldehyde is kinetically more favorable. The formaldehyde is then converted to methoxy (H3CO*), and finally hydrogenated to form methanol. Microkinetic analyses reveal the rate-limiting steps in the reaction network and establish that the HCOO route is the dominant pathway for methanol formation on this catalyst. PMID:26955867
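
    Microkinetic analyses of this kind turn DFT barriers into elementary rate constants, typically via transition-state theory. A generic Eyring-equation sketch, not the authors' code; the barrier and temperature are placeholders, and the DFT barrier is treated as the activation free energy, which is an approximation:

```python
import math

KB_EV = 8.617333262e-5   # Boltzmann constant, eV/K
H_EV = 4.135667696e-15   # Planck constant, eV*s

def eyring_rate(barrier_ev, t_k):
    """TST rate constant k = (kB*T/h) * exp(-Ea/(kB*T)), in 1/s."""
    return (KB_EV * t_k / H_EV) * math.exp(-barrier_ev / (KB_EV * t_k))
```

    Comparing such rate constants across the elementary steps is what identifies the rate-limiting step and the dominant pathway.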

  1. Delving into sensible measures to enhance the environmental performance of biohydrogen: A quantitative approach based on process simulation, life cycle assessment and data envelopment analysis.

    PubMed

    Martín-Gamboa, Mario; Iribarren, Diego; Susmozas, Ana; Dufour, Javier

    2016-08-01

    A novel approach is developed to evaluate quantitatively the influence of operational inefficiency in biomass production on the life-cycle performance of hydrogen from biomass gasification. Vine-growers and process simulation are used as key sources of inventory data. The life cycle assessment of biohydrogen according to current agricultural practices for biomass production is performed, as well as that of target biohydrogen according to agricultural practices optimised through data envelopment analysis. Only 20% of the vineyards assessed operate efficiently, and the benchmarked reduction percentages of operational inputs range from 45% to 73% in the average vineyard. The fulfilment of operational benchmarks avoiding irregular agricultural practices is concluded to significantly improve the environmental profile of biohydrogen (e.g., impact reductions above 40% for eco-toxicity and global warming). Finally, it is shown that this type of bioenergy system can be an excellent replacement for conventional hydrogen in terms of global warming and non-renewable energy demand. PMID:27155266

  2. Computerized rapid high resolution quantitative analysis of plasma lipoproteins based upon single vertical spin centrifugation.

    PubMed

    Cone, J T; Segrest, J P; Chung, B H; Ragland, J B; Sabesin, S M; Glasscock, A

    1982-08-01

    A method has been developed for rapidly quantitating the cholesterol concentration of normal and certain variant lipoproteins in a large number of patients (over 240 in one week). The method employs a microcomputer interfaced to the vertical autoprofiler (VAP) described earlier (Chung et al. 1981. J. Lipid Res. 22: 1003-1014). Software developed to accomplish rapid on-line analysis of the VAP signal uses peak shapes and positions derived from prior VAP analysis of isolated authentic lipoproteins HDL, LDL, and VLDL to quantitate these species in a VAP profile. Variant lipoproteins VHDL (a species with density greater than that of HDL(3)), MDL (a species, most likely Lp(a), with density intermediate between that of HDL and LDL), and IDL are subsequently quantitated by a method combining difference calculations with curve shapes. The procedure has been validated qualitatively by negative stain electron microscopy, gradient gel electrophoresis, strip electrophoresis, chemical analysis of the lipids, radioimmunoassay of the apolipoproteins, and measurement of the density of the peak centers. It has been validated quantitatively by comparison with Lipid Research Clinic methodology for HDL-, LDL-, and VLDL-cholesterol, and for MDL- and IDL-cholesterol by comparison of the amounts of MDL or IDL predicted to be present by the method with that known to be present following standard addition to whole plasma. These validations show that the method is a rapid and accurate technique of lipoprotein analysis suitable for the routine screening of patients for abnormal amounts of normal or variant lipoproteins, as well as for use as a research tool for quantitation of changes in cholesterol content of six or seven different plasma lipoprotein fractions.-Cone, J. T., J. P. Segrest, B. H. Chung, J. B. Ragland, S. M. Sabesin, and A. Glasscock. Computerized rapid high resolution quantitative analysis of plasma lipoproteins based upon single vertical spin centrifugation. PMID:7130860
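
    The core of the quantitation, fitting stored peak shapes to a measured profile to recover each lipoprotein fraction's contribution, amounts to linear least squares in the peak amplitudes. A two-peak Gaussian sketch in that spirit; the peak positions and widths are invented, not the VAP software's stored shapes:

```python
import math

def gaussian(x, mu, sigma):
    """Unit-height Gaussian peak shape."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2)

def fit_amplitudes(xs, signal, peaks):
    """Least-squares amplitudes for two fixed peak shapes (normal equations)."""
    (m1, s1), (m2, s2) = peaks
    g1 = [gaussian(x, m1, s1) for x in xs]
    g2 = [gaussian(x, m2, s2) for x in xs]
    a11 = sum(v * v for v in g1)
    a12 = sum(u * v for u, v in zip(g1, g2))
    a22 = sum(v * v for v in g2)
    b1 = sum(u * y for u, y in zip(g1, signal))
    b2 = sum(u * y for u, y in zip(g2, signal))
    det = a11 * a22 - a12 * a12
    c1 = (b1 * a22 - b2 * a12) / det
    c2 = (a11 * b2 - a12 * b1) / det
    return c1, c2
```

    With the shapes fixed in advance, overlapping fractions separate cleanly because only the amplitudes are free, which is what makes the on-line analysis fast.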

  3. Analysis of mixed cell cultures with quantitative digital holographic phase microscopy

    NASA Astrophysics Data System (ADS)

    Kemper, Björn; Wibbeling, Jana; Ketelhut, Steffi

    2014-05-01

    In order to study, for example, the influence of pharmaceuticals or pathogens on different cell types under identical measurement conditions, and to analyze interactions between different cellular specimens, a minimally-invasive quantitative observation of mixed cell cultures is of particular interest. Quantitative phase microscopy (QPM) provides high resolution detection of optical path length changes that is suitable for stain-free minimally-invasive live cell analysis. Due to low light intensities for object illumination, QPM minimizes the interaction with the sample and is in particular suitable for long term time-lapse investigations, e.g., for the detection of cell morphology alterations due to drugs and toxins. Furthermore, QPM has been demonstrated to be a versatile tool for the quantification of cellular growth, the extraction of morphological parameters, and cell motility. We studied the feasibility of QPM for the analysis of mixed cell cultures. It was explored whether quantitative phase images provide sufficient information to distinguish between different cell types and to extract cell specific parameters. For the experiments, quantitative phase imaging with digital holographic microscopy (DHM) was utilized. Mixed cell cultures with different types of human pancreatic tumor cells were observed with quantitative DHM phase contrast for up to 35 h. The obtained series of quantitative phase images were evaluated by adapted algorithms for image segmentation. From the segmented images the cellular dry mass and the mean cell thickness were calculated and used in the further analysis as parameters to quantify the reliability of the measurement principle. The obtained results demonstrate that it is possible to characterize the growth of cell types with different morphologies in a mixed cell culture separately by considering specimen size and cell thickness in the evaluation of quantitative DHM phase images.
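
    Cellular dry mass is commonly obtained from quantitative phase images through the relation m = λ/(2πα)·∫Δφ dA, with a specific refractive increment α of roughly 0.19 μm³/pg for cellular material. A sketch of that conversion; the wavelength and α below are typical literature assumptions, not parameters from this study:

```python
import math

def dry_mass_pg(phase_rad, pixel_area_um2, wavelength_um=0.532, alpha=0.19):
    """Dry mass (pg) from a 2-D phase map (radians): sum the phase over the
    segmented cell, scale by lambda / (2*pi*alpha) and the pixel area."""
    total_phase = sum(sum(row) for row in phase_rad)
    return wavelength_um / (2.0 * math.pi * alpha) * total_phase * pixel_area_um2
```

    Segmentation quality matters directly here: every pixel added to or dropped from the cell mask changes the integrated phase and hence the reported dry mass.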

  4. Fracture mechanics analysis of a high-pressure hydrogen facility compressor

    NASA Technical Reports Server (NTRS)

    Vroman, G. A.

    1974-01-01

    The investigation and analysis of a high-pressure hydrogen facility compressor is chronicled, and a life prediction based on fracture mechanics is presented. Crack growth rates in SA 105 Gr II steel are developed for the condition of sustained loading, using a hypothesis of hydrogen embrittlement associated with plastic zone reverse yielding. The resultant formula is compared with test data obtained from laboratory specimens.
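
    A fracture-mechanics life prediction of this sort typically integrates a crack-growth law, e.g. a Paris-type relation da/dN = C·ΔK^m with ΔK = Y·Δσ·√(πa), from the initial flaw size to a critical size. A numerical sketch; C, m, and the geometry factor Y are generic placeholders, not the SA 105 Gr II data or the sustained-load formula developed in the paper:

```python
import math

def cycles_to_failure(a0_m, af_m, d_sigma_mpa,
                      c=1e-11, m=3.0, y=1.12, steps=10000):
    """Midpoint integration of N = integral da / (C * dK^m) from a0 to af.
    Units: a in m, d_sigma in MPa, dK in MPa*sqrt(m), C matched to those."""
    da = (af_m - a0_m) / steps
    n_cycles = 0.0
    a = a0_m
    for _ in range(steps):
        dk = y * d_sigma_mpa * math.sqrt(math.pi * (a + da / 2.0))
        n_cycles += da / (c * dk ** m)
        a += da
    return n_cycles
```

    The strong ΔK dependence means most of the life is spent while the crack is still small, which is why the assumed initial flaw size dominates the prediction.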

  5. Quantitative Analysis of Spectral Interference of Spontaneous Raman Scattering in High-Pressure Fuel-Rich H2-Air Combustion

    NASA Technical Reports Server (NTRS)

    Kojima, Jun; Nguyen, Quang-Viet

    2004-01-01

    We present a theoretical study of the spectral interferences in the spontaneous Raman scattering spectra of major combustion products in 30-atm fuel-rich hydrogen-air flames. An effective methodology is introduced to choose an appropriate line-shape model for simulating Raman spectra in high-pressure combustion environments. The Voigt profile with the additive approximation assumption was found to provide a reasonable model of the spectral line shape for the present analysis. The rotational/vibrational Raman spectra of H2, N2, and H2O were calculated using an anharmonic-oscillator model using the latest collisional broadening coefficients. The calculated spectra were validated with data obtained in a 10-atm fuel-rich H2-air flame and showed excellent agreement. Our quantitative spectral analysis for equivalence ratios ranging from 1.5 to 5.0 revealed substantial amounts of spectral cross-talk between the rotational H2 lines and the N2 O-/Q-branch; and between the vibrational H2O(0,3) line and the vibrational H2O spectrum. We also address the temperature dependence of the spectral cross-talk and extend our analysis to include a cross-talk compensation technique that removes the interference arising from the H2 Raman spectra onto the N2 or H2O spectra.
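
    Where evaluating the full Voigt profile is costly, the pseudo-Voigt (a weighted sum of a Gaussian and a Lorentzian of equal width) is a common stand-in for pressure-broadened line shapes. A sketch with the mixing parameter and width as free inputs; this is a generic illustration, not the line-shape code used in the paper:

```python
import math

def pseudo_voigt(x, center, fwhm, eta):
    """Unit-height pseudo-Voigt: eta=0 is pure Gaussian, eta=1 pure Lorentzian."""
    hw = fwhm / 2.0
    g = math.exp(-math.log(2.0) * ((x - center) / hw) ** 2)
    l = 1.0 / (1.0 + ((x - center) / hw) ** 2)
    return (1.0 - eta) * g + eta * l
```

    At higher pressure the Lorentzian (collisional) share grows, so eta is effectively the knob that tracks pressure broadening in a fit.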

  6. Transcriptional analysis of Mycobacterium fortuitum cultures upon hydrogen peroxide treatment using the novel standard rrnA-P1

    PubMed Central

    Núñez, María Carmen; Menéndez, María Carmen; Rebollo, María José; García, María J

    2008-01-01

    Background The ability of an intracellular pathogen to establish infection depends on the capacity of the organism to survive and replicate inside the host. Mycobacterium fortuitum is a bacterium that contains genes involved in the detoxification of reactive oxygen species such as those produced by the host during infection. In this work, we investigate the effects of hydrogen peroxide on the transcription and expression of these genes by developing a real time quantitative PCR technique (qRT-PCR) using the ribosomal promoter region (rrnA-P1) as reference product for quantification of the mRNA levels. Results M. fortuitum cultures were treated with different hydrogen peroxide concentrations (0.02 to 20 mM) during several periods of time (30 to 120 minutes). The activity of the enzymes KatGII and SodA, and the transcription of the corresponding genes, were evaluated. The transcriptional regulator furAII gene was also studied. The ribosomal promoter region rrnA-P1 was validated as a reference product under the stress conditions checked by qRT-PCR. Minor changes were observed under the conditions tested except when bacteria were incubated in the presence of 20 mM hydrogen peroxide. Under those conditions, the levels of transcription of the three genes under study increased at 30 minutes of treatment. The viability of the bacteria was not influenced under the conditions tested. Conclusion In this work, we have quantified transcriptional responses to stress suggesting that the opportunistic pathogen M. fortuitum is more resistant and differs in behaviour in the presence of hydrogen peroxide, when compared to the major pathogen Mycobacterium tuberculosis and the saprophyte Mycobacterium smegmatis. Besides, we demonstrate the mycobacterial non-coding region rrnA-P1 to be a suitable reference product in the analysis of qRT-PCR transcriptional data of M. fortuitum. PMID:18565220

  7. Quantitative Computed Tomography Protocols Affect Material Mapping and Quantitative Computed Tomography-Based Finite-Element Analysis Predicted Stiffness.

    PubMed

    Giambini, Hugo; Dragomir-Daescu, Dan; Nassr, Ahmad; Yaszemski, Michael J; Zhao, Chunfeng

    2016-09-01

    Quantitative computed tomography-based finite-element analysis (QCT/FEA) has become increasingly popular in an attempt to understand and possibly reduce vertebral fracture risk. It is known that scanning acquisition settings affect the Hounsfield units (HU) of the CT voxels. Material property assignment in QCT/FEA, relating HU to Young's modulus, is performed by applying empirical equations. The purpose of this study was to evaluate the effect of QCT scanning protocols on stiffness values predicted by finite-element models. One fresh-frozen cadaveric torso and a QCT calibration phantom were scanned six times, varying voltage and current, and reconstructed to obtain a total of 12 sets of images. Five vertebrae from the torso were experimentally tested to obtain stiffness values. QCT/FEA models of the five vertebrae were developed from the 12 image datasets, resulting in a total of 60 models. Predicted stiffness was compared to the experimental values. The highest percent difference in stiffness was approximately 480% (80 kVp, 110 mAs, U70), while the lowest was ∼1% (80 kVp, 110 mAs, U30). There was a clear distinction between reconstruction kernels in predicted outcomes, whereas voltage did not present a clear influence on results. The potential of QCT/FEA as an improvement to conventional fracture risk prediction tools is well established. However, it is important to establish research protocols that can lead to results that can be translated to the clinical setting. PMID:27428281
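
    The material-mapping step the abstract refers to — HU to density via the calibration phantom, then density to Young's modulus via an empirical equation — can be sketched as below. The coefficients are hypothetical placeholders, not the study's values; the study's point is that protocol changes effectively shift this mapping.

```python
# Sketch of the QCT/FEA material-mapping step: CT Hounsfield units are
# converted to density using the calibration phantom fit, then to Young's
# modulus through an empirical power law. All coefficients below are
# hypothetical placeholders, not values from the study.

def hu_to_density(hu, slope=0.0008, intercept=0.01):
    """Phantom calibration: density in g/cm^3 from HU (hypothetical fit)."""
    return slope * hu + intercept

def density_to_modulus(rho, a=10500.0, b=2.29):
    """Empirical power law E = a * rho^b, in MPa (hypothetical coefficients)."""
    return a * rho ** b

for hu in (200.0, 400.0, 800.0):
    rho = hu_to_density(hu)
    print(f"HU={hu:5.0f}  rho={rho:.3f} g/cm^3  E={density_to_modulus(rho):8.1f} MPa")
```

    Because the modulus depends on HU through a power law, even modest protocol-induced shifts in HU are amplified in the predicted stiffness, consistent with the large percent differences reported.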

  8. FUNDAMENTAL SAFETY TESTING AND ANALYSIS OF HYDROGEN STORAGE MATERIALS AND SYSTEMS

    SciTech Connect

    Anton, D

    2007-05-01

    Hydrogen is seen as the future automobile energy storage medium due to its inherent cleanliness upon oxidation and its ready utilization in fuel cell applications. Its physical storage in light-weight, low-volume systems is a key technical requirement. In searching for ever higher gravimetric and volumetric density hydrogen storage materials and systems, it is inevitable that higher energy density materials will be studied and used. To make safe and commercially acceptable systems, it is important to understand, quantitatively, the risks involved in using and handling these materials and to develop appropriate risk mitigation strategies to handle unforeseen accidental events. To evaluate these materials and systems, an IPHE-sanctioned program was initiated in 2006, partnering laboratories from Europe, North America and Japan. The objective of this international program is to understand the physical risks involved in the synthesis, handling and utilization of solid-state hydrogen storage materials and to develop methods to mitigate these risks. This understanding will support the ultimate acceptance of commercial high-density hydrogen storage system designs. An overview of the approaches to be taken to achieve this objective will be given. Initial experimental results will be presented on environmental exposure of NaAlH{sub 4}, a candidate high-density hydrogen storage compound. The tests to be shown are based on United Nations recommendations for the transport of hazardous materials and include air and water exposure of the hydride at three hydrogen charge levels in various physical configurations. Additional tests developed by the American Society for Testing and Materials were used to quantify the dust cloud ignition characteristics of this material, which may result from accidental high-energy impacts and system breach. Results of these tests are shown along with necessary risk mitigation techniques used in the synthesis and fabrication of a prototype hydrogen storage

  9. Analysis of Improved Reference Design for a Nuclear-Driven High Temperature Electrolysis Hydrogen Production Plant

    SciTech Connect

    Edwin A. Harvego; James E. O'Brien; Michael G. McKellar

    2010-06-01

    The use of High Temperature Electrolysis (HTE) for the efficient production of hydrogen without the greenhouse gas emissions associated with conventional fossil-fuel hydrogen production techniques has been under investigation at the Idaho National Engineering Laboratory (INL) for the last several years. The activities at the INL have included the development, testing and analysis of large numbers of solid oxide electrolysis cells, and the analyses of potential plant designs for large-scale production of hydrogen using an advanced Very-High Temperature Reactor (VHTR) to provide the process heat and electricity to drive the electrolysis process. The results of these system analyses, using the UniSim process analysis software, have shown that the HTE process, when coupled to a VHTR capable of operating at reactor outlet temperatures of 800 °C to 950 °C, has the potential to produce the large quantities of hydrogen needed to meet future energy and transportation needs with hydrogen production efficiencies in excess of 50%. In addition, economic analyses performed on the INL reference plant design, optimized to maximize the hydrogen production rate for a 600 MWt VHTR, have shown that a large nuclear-driven HTE hydrogen production plant can be economically competitive with conventional hydrogen production processes, particularly when the penalties associated with greenhouse gas emissions are considered. The results of this research led to the selection in 2009 of HTE as the preferred concept in the U.S. Department of Energy (DOE) hydrogen technology down-selection process. However, the down-selection process, along with continued technical assessments at the INL, has resulted in a number of proposed modifications and refinements to improve the original INL reference HTE design. These modifications include changes in plant configuration, operating conditions and individual component designs. This paper describes the resulting new INL reference design and presents

  10. Possibility of quantitative estimation of blood cell forms by the spatial-frequency spectrum analysis

    NASA Astrophysics Data System (ADS)

    Spiridonov, Igor N.; Safonova, Larisa P.; Samorodov, Andrey V.

    2000-05-01

    At present, hematology lacks quantitative estimates of parameters important for cell classification: cell form and nuclear form. Because there is no correlation between morphological parameters and the parameters measured by hemoanalyzers, neither flow cytometers nor computer recognition systems provide a complete clinical blood analysis. Analysis of the spatial-frequency spectra (SFS) of blood samples (smears and liquid probes) permits quantitative estimation of these forms. Based on the results of theoretical and experimental research, an algorithm for quantitative estimation of form by means of SFS parameters has been created, and criteria for the quality of these estimates have been proposed. A test bench based on coherent optical and digital processors was assembled. The results could be applied to the automated classification of either normal or pathological blood cells in standard blood smears.

  11. Applied research for quantitative analysis of fluorescent whitening agent in emulsion paint

    NASA Astrophysics Data System (ADS)

    Zhang, Lin

    Fluorescent whitening agents (FWAS) are widely used in emulsion paint for their brightening effect. In spite of the extensive use of FWAS, there are no reports on methods for measuring FWAS in emulsion paint. In this work, a very simple quantitative approach is proposed. Based on the digital grayscale images of three-dimensional fluorescence spectra and two-dimensional fluorescence images, several wavelet moment invariants are calculated and used to establish standard models for the quantitative analysis. The influence of storage time and exposure time is also studied. Measurement results indicated the feasibility and precision of this method for quantitative analysis of FWAS. The results also provide a reliable basis for the application of FWAS in emulsion paint. Keywords: fluorescent whitening agents, three-dimensional fluorescence spectra, fluorescence image, wavelet moment invariants

  12. Hydrogen Energy Storage (HES) and Power-to-Gas Economic Analysis; NREL (National Renewable Energy Laboratory)

    SciTech Connect

    Eichman, Joshua

    2015-07-30

    This presentation summarizes opportunities for hydrogen energy storage and power-to-gas and presents the results of a market analysis performed by the National Renewable Energy Laboratory to quantify the value of energy storage. Hydrogen energy storage and power-to-gas systems can integrate multiple energy sectors, including electricity, transportation, and industry. Because of the flexibility of hydrogen systems, a variety of system configurations are possible, and each configuration provides different value to the owner, customers, and grid system operator. This presentation provides an economic comparison of hydrogen storage, power-to-gas, and conventional storage systems. The total cost is compared to the revenue from participation in a variety of markets to assess economic competitiveness. It is found that the sale of hydrogen for transportation or industrial use greatly increases competitiveness. Electrolyzers operating as demand response devices (i.e., selling hydrogen and grid services) are economically competitive, while hydrogen storage systems that input electricity and output only electricity have an unfavorable business case. Additionally, tighter integration with the grid provides greater revenue (energy, ancillary service, and capacity markets are explored). Lastly, additional hours of storage capacity are not necessarily more competitive in current energy and ancillary service markets, and electricity markets will require new mechanisms to appropriately compensate long-duration storage devices.

  13. Hydrogen Fuel Cell Analysis: Lessons Learned from Stationary Power Generation Final Report

    SciTech Connect

    Scott E. Grasman; John W. Sheffield; Fatih Dogan; Sunggyu Lee; Umit O. Koylu; Angie Rolufs

    2010-04-30

    This study considered opportunities for hydrogen in stationary applications in order to make recommendations related to RD&D strategies that incorporate lessons learned and best practices from relevant national and international stationary power efforts, as well as cost and environmental modeling of pathways. The study analyzed the different strategies utilized in power generation systems and identified the different challenges and opportunities for producing and using hydrogen as an energy carrier. Specific objectives included both a synopsis/critical analysis of lessons learned from previous stationary power programs and recommendations for a strategy for hydrogen infrastructure deployment. This strategy incorporates all hydrogen pathways and a combination of distributed power generating stations, and provides an overview of stationary power markets, benefits of hydrogen-based stationary power systems, and competitive and technological challenges. The motivation for this project was to identify the lessons learned from prior stationary power programs, including the most significant obstacles, how these obstacles have been approached, outcomes of the programs, and how this information can be used by the Hydrogen, Fuel Cells & Infrastructure Technologies Program to meet program objectives primarily related to hydrogen pathway technologies (production, storage, and delivery) and implementation of fuel cell technologies for distributed stationary power. In addition, the lessons learned address environmental and safety concerns, including codes and standards, and education of key stakeholders.

  14. Quantitative analysis of bristle number in Drosophila mutants identifies genes involved in neural development

    NASA Technical Reports Server (NTRS)

    Norga, Koenraad K.; Gurganus, Marjorie C.; Dilda, Christy L.; Yamamoto, Akihiko; Lyman, Richard F.; Patel, Prajal H.; Rubin, Gerald M.; Hoskins, Roger A.; Mackay, Trudy F.; Bellen, Hugo J.

    2003-01-01

    BACKGROUND: The identification of the function of all genes that contribute to specific biological processes and complex traits is one of the major challenges in the postgenomic era. One approach is to employ forward genetic screens in genetically tractable model organisms. In Drosophila melanogaster, P element-mediated insertional mutagenesis is a versatile tool for the dissection of molecular pathways, and there is an ongoing effort to tag every gene with a P element insertion. However, the vast majority of P element insertion lines are viable and fertile as homozygotes and do not exhibit obvious phenotypic defects, perhaps because of the tendency for P elements to insert 5' of transcription units. Quantitative genetic analysis of subtle effects of P element mutations that have been induced in an isogenic background may be a highly efficient method for functional genome annotation. RESULTS: Here, we have tested the efficacy of this strategy by assessing the extent to which screening for quantitative effects of P elements on sensory bristle number can identify genes affecting neural development. We find that such quantitative screens uncover an unusually large number of genes that are known to function in neural development, as well as genes with yet uncharacterized effects on neural development, and novel loci. CONCLUSIONS: Our findings establish the use of quantitative trait analysis for functional genome annotation through forward genetics. Similar analyses of quantitative effects of P element insertions will facilitate our understanding of the genes affecting many other complex traits in Drosophila.

  15. Forty Years of the "Journal of Librarianship and Information Science": A Quantitative Analysis, Part I

    ERIC Educational Resources Information Center

    Furner, Jonathan

    2009-01-01

    This paper reports on the first part of a two-part quantitative analysis of volume 1-40 (1969-2008) of the "Journal of Librarianship and Information Science" (formerly the "Journal of Librarianship"). It provides an overview of the current state of LIS research journal publishing in the UK; a review of the publication and printing history of…

  16. Quantitative and Qualitative Analysis of Nutrition and Food Safety Information in School Science Textbooks of India

    ERIC Educational Resources Information Center

    Subba Rao, G. M.; Vijayapushapm, T.; Venkaiah, K.; Pavarala, V.

    2012-01-01

    Objective: To assess quantity and quality of nutrition and food safety information in science textbooks prescribed by the Central Board of Secondary Education (CBSE), India for grades I through X. Design: Content analysis. Methods: A coding scheme was developed for quantitative and qualitative analyses. Two investigators independently coded the…

  17. A Computer Program for Calculation of Calibration Curves for Quantitative X-Ray Diffraction Analysis.

    ERIC Educational Resources Information Center

    Blanchard, Frank N.

    1980-01-01

    Describes a FORTRAN IV program written to supplement a laboratory exercise dealing with quantitative x-ray diffraction analysis of mixtures of polycrystalline phases in an introductory course in x-ray diffraction. Gives an example of the use of the program and compares calculated and observed calibration data. (Author/GS)

  18. Quantitative Analysis of Organic Compounds: A Simple and Rapid Method for Use in Schools

    ERIC Educational Resources Information Center

    Schmidt, Hans-Jurgen

    1973-01-01

    Describes the procedure for making a quantitative analysis of organic compounds suitable for secondary school chemistry classes. Using the Schoniger procedure, the organic compound, such as PVC, is decomposed in a conical flask with oxygen. The products are absorbed in a suitable liquid and analyzed by titration. (JR)

  19. Whose American Government? A Quantitative Analysis of Gender and Authorship in American Politics Texts

    ERIC Educational Resources Information Center

    Cassese, Erin C.; Bos, Angela L.; Schneider, Monica C.

    2014-01-01

    American government textbooks signal to students the kinds of topics that are important and, by omission, the kinds of topics that are not important to the discipline of political science. This article examines portrayals of women in introductory American politics textbooks through a quantitative content analysis of 22 widely used texts. We find…

  20. QUANTITATIVE PCR ANALYSIS OF MOLDS IN THE DUST FROM HOMES OF ASTHMATIC CHILDREN IN NORTH CAROLINA

    EPA Science Inventory

    The vacuum bag (VB) dust was analyzed by mold-specific quantitative PCR. These results were compared to the analysis survey calculated for each of the homes. The mean and standard deviation (SD) of the ERMI values in the homes of the NC asthmatic children was 16.4 (6.77), compa...

  1. A Quantitative Features Analysis of Recommended No- and Low-Cost Preschool E-Books

    ERIC Educational Resources Information Center

    Parette, Howard P.; Blum, Craig; Luthin, Katie

    2015-01-01

    In recent years, recommended e-books have drawn increasing attention from early childhood education professionals. This study applied a quantitative descriptive features analysis of cost (n = 70) and no-cost (n = 60) e-books recommended by the Texas Computer Education Association. While t tests revealed no statistically significant differences…

  2. A Quantitative Categorical Analysis of Metadata Elements in Image-Applicable Metadata Schemas.

    ERIC Educational Resources Information Center

    Greenberg, Jane

    2001-01-01

    Reports on a quantitative categorical analysis of metadata elements in the Dublin Core, VRA (Visual Resource Association) Core, REACH (Record Export for Art and Cultural Heritage), and EAD (Encoded Archival Description) metadata schemas, all of which can be used for organizing and describing images. Introduces a new schema comparison methodology…

  3. ANSI/ASHRAE/IES Standard 90.1-2013 Preliminary Determination: Quantitative Analysis

    SciTech Connect

    Halverson, Mark A.; Rosenberg, Michael I.; Wang, Weimin; Zhang, Jian; Mendon, Vrushali V.; Athalye, Rahul A.; Xie, YuLong; Hart, Reid; Goel, Supriya

    2014-03-01

    This report provides a preliminary quantitative analysis to assess whether buildings constructed according to the requirements of ANSI/ASHRAE/IES Standard 90.1-2013 would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IES Standard 90.1-2010.

  4. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    ERIC Educational Resources Information Center

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

  5. Teaching Fundamental Skills in Microsoft Excel to First-Year Students in Quantitative Analysis

    ERIC Educational Resources Information Center

    Rubin, Samuel J.; Abrams, Binyomin

    2015-01-01

    Despite their technological savvy, most students entering university lack the necessary computer skills to succeed in a quantitative analysis course, in which they are often expected to input, analyze, and plot results of experiments without any previous formal education in Microsoft Excel or similar programs. This lack of formal education results…

  6. Quantitative Intersectionality: A Critical Race Analysis of the Chicana/o Educational Pipeline

    ERIC Educational Resources Information Center

    Covarrubias, Alejandro

    2011-01-01

    Utilizing the critical race framework of intersectionality, this research reexamines the Chicana/o educational pipeline through a quantitative intersectional analysis. This approach disaggregates data along the intersection of race, class, gender, and citizenship status to provide a detailed portrait of the educational trajectory of Mexican-origin…

  7. Gender Differences in Learning Styles: A Narrative Review and Quantitative Meta-Analysis.

    ERIC Educational Resources Information Center

    Severiens, Sabine E.; Ten Dam, Geert T. N.

    1994-01-01

    Research since 1980 on gender and learning styles of students over age 18 is reviewed for commonalities in theory and research methodology. In addition, a quantitative meta-analysis was undertaken on two measures of learning style and study behavior to determine the direction and magnitude of gender differences in various samples. (Author/MSE)

  8. A Colorimetric Analysis Experiment Not Requiring a Spectrophotometer: Quantitative Determination of Albumin in Powdered Egg White

    ERIC Educational Resources Information Center

    Charlton, Amanda K.; Sevcik, Richard S.; Tucker, Dorie A.; Schultz, Linda D.

    2007-01-01

    A general science experiment for high school chemistry students might serve as an excellent review of the concepts of solution preparation, solubility, pH, and qualitative and quantitative analysis of a common food product. The students could learn to use safe laboratory techniques, collect and analyze data using proper scientific methodology and…

  9. A Quantitative Analysis of Cognitive Strategy Usage in the Marking of Two GCSE Examinations

    ERIC Educational Resources Information Center

    Suto, W. M. Irenka; Greatorex, Jackie

    2008-01-01

    Diverse strategies for marking GCSE examinations have been identified, ranging from simple automatic judgements to complex cognitive operations requiring considerable expertise. However, little is known about patterns of strategy usage or how such information could be utilised by examiners. We conducted a quantitative analysis of previous verbal…

  10. Gas chromatograph-mass spectrometer (GC/MS) system for quantitative analysis of reactive chemical compounds

    DOEpatents

    Grindstaff, Quirinus G.

    1992-01-01

    Described is a new gas chromatograph-mass spectrometer (GC/MS) system and method for quantitative analysis of reactive chemical compounds. All components of such a GC/MS system external to the oven of the gas chromatograph are programmably temperature controlled to operate at a volatilization temperature specific to the compound(s) sought to be separated and measured.

  11. A Quantitative Discourse Analysis of Student-Initiated Checks of Understanding during Teacher-Fronted Lessons

    ERIC Educational Resources Information Center

    Shepherd, Michael A.

    2012-01-01

    Recent research highlights the paradoxical importance of students' being able to check their understanding with teachers and of teachers' constraining student participation. Using quantitative discourse analysis, this paper examines third graders' discursive strategies in initiating such checks and teachers' strategies in constraining them. The…

  12. Clinical applications of a quantitative analysis of regional left ventricular wall motion

    NASA Technical Reports Server (NTRS)

    Leighton, R. F.; Rich, J. M.; Pollack, M. E.; Altieri, P. I.

    1975-01-01

    Observations were summarized which may have clinical application. These were obtained from a quantitative analysis of wall motion that was used to detect both hypokinesis and tardokinesis in left ventricular cineangiograms. The method was based on statistical comparisons with normal values for regional wall motion derived from the cineangiograms of patients who were found not to have heart disease.

  13. Determination of hydrogen in metals, semiconductors, and other materials by cold neutron prompt gamma-ray activation analysis

    SciTech Connect

    Paul, R.L.; Lindstrom, R.M.

    1998-12-31

    Cold neutron prompt gamma-ray activation analysis has proven useful for nondestructive measurement of trace hydrogen. The sample is irradiated in a beam of neutrons; the presence of hydrogen is confirmed by the emission of a 2223 keV gamma-ray. Detection limits for hydrogen are 3 mg/kg in quartz and 8 mg/kg in titanium. The authors have used the technique to measure hydrogen in titanium alloys, germanium, quartz, fullerenes and their derivatives, and other materials.

  14. Synthesis of quantitative and qualitative evidence for accident analysis in risk-based highway planning.

    PubMed

    Lambert, James H; Peterson, Kenneth D; Joshi, Nilesh N

    2006-09-01

    Accident analysis involves the use of both quantitative and qualitative data in decision-making. The aim of this paper is to demonstrate the synthesis of relevant quantitative and qualitative evidence for accident analysis and for planning a large and diverse portfolio of highway investment projects. The proposed analysis and visualization techniques, along with traditional mathematical modeling, serve as an aid to planners, engineers, and the public in comparing the benefits of current and proposed improvement projects. The analysis uses data on crash rates, average daily traffic, and cost estimates from highway agency databases, as well as project portfolios for regions and localities. It also draws on up to two of the seven motivations outlined in the Transportation Equity Act for the 21st Century (TEA-21). Three case studies demonstrate the risk-based approach to accident analysis for short- and long-range transportation plans. The approach is adaptable to other topics in accident analysis and prevention that involve the use of quantitative and qualitative evidence, risk analysis, and multi-criteria decision-making for project portfolio selection. PMID:16730627

  15. Electroencephalography reactivity for prognostication of post-anoxic coma after cardiopulmonary resuscitation: A comparison of quantitative analysis and visual analysis.

    PubMed

    Liu, Gang; Su, Yingying; Jiang, Mengdi; Chen, Weibi; Zhang, Yan; Zhang, Yunzhou; Gao, Daiquan

    2016-07-28

    Electroencephalogram reactivity (EEG-R) is a positive predictive factor for assessing outcomes in comatose patients. Most studies assess the prognostic value of EEG-R using visual analysis; however, this method is prone to subjectivity. We sought to categorize EEG-R with a quantitative approach. We retrospectively studied consecutive comatose patients who had an EEG-R recording performed 1-3 days after cardiopulmonary resuscitation (CPR) or during normothermia after therapeutic hypothermia. EEG-R was assessed via visual analysis and quantitative analysis separately. Clinical outcomes were followed up at 3 months and dichotomized as recovery of awareness or no recovery of awareness. A total of 96 patients met the inclusion criteria, and 38 (40%) patients had recovered awareness at the 3-month follow-up. Of 27 patients with EEG-R on visual analysis, 22 recovered awareness; of the 69 patients who did not demonstrate EEG-R, 16 recovered awareness. The sensitivity and specificity of visually assessed EEG-R were 58% and 91%, respectively. The area under the receiver operating characteristic curve for the quantitative analysis was 0.92 (95% confidence interval, 0.87-0.97), with a best cut-off value of 0.10. EEG-R measured by quantitative analysis might be a good method for predicting the recovery of awareness in patients with post-anoxic coma after CPR. PMID:27181515
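
    The reported 58% sensitivity and 91% specificity for visually assessed EEG-R follow directly from the counts given in the abstract; a minimal sketch recomputing them (treating "EEG-R present" as a positive prediction of recovery):

```python
# Sensitivity/specificity of visually assessed EEG-R for predicting recovery
# of awareness, recomputed from the counts reported in the abstract:
# 96 patients, 38 recovered; EEG-R present in 27 (22 recovered),
# absent in 69 (16 recovered).

tp = 22            # EEG-R present, recovered awareness
fn = 38 - 22       # EEG-R absent but recovered (16)
fp = 27 - 22       # EEG-R present, no recovery (5)
tn = 69 - 16       # EEG-R absent, no recovery (53)

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")
# -> sensitivity = 58%, specificity = 91%
```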

  16. Improvements in the gaseous hydrogen-water equilibration technique for hydrogen isotope ratio analysis

    USGS Publications Warehouse

    Coplen, T.B.; Wildman, J.D.; Chen, J.

    1991-01-01

    Improved precision in the H2-H2O equilibration method for δD analysis has been achieved in an automated system. Reduction of the 1-σ standard deviation of a single mass-spectrometer analysis to 1.3‰ is achieved by (1) bonding catalyst to glass rods and assigning use to specific equilibration chambers to monitor performance of catalyst, (2) improving the apparatus design, and (3) reducing the H3+ contribution of the mass-spectrometer ion source. For replicate analysis of a water sample, the standard deviation improved to 0.8‰. H2S-bearing samples and samples as small as 0.1 mL can be analyzed routinely with this method.
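
    For context, the δD values whose precision is quoted here are per-mil deviations of a sample's D/H ratio from the VSMOW standard. A minimal sketch of the delta computation; the sample ratio used is hypothetical:

```python
# Delta notation used in hydrogen isotope ratio analysis: dD is the per-mil
# deviation of the sample's D/H ratio from the VSMOW standard
# (D/H of VSMOW ~= 155.76e-6). The sample ratio below is hypothetical.

R_VSMOW = 155.76e-6          # D/H of Vienna Standard Mean Ocean Water

def delta_d(r_sample, r_std=R_VSMOW):
    """delta-D in per mil."""
    return (r_sample / r_std - 1.0) * 1000.0

# A hypothetical meteoric-water sample depleted in deuterium:
print(f"{delta_d(143.3e-6):.1f} per mil")  # -> -80.0 per mil
```

    Against deviations of tens to hundreds of per mil in natural waters, a single-analysis precision of 1.3‰ (0.8‰ for replicates) is small, which is the point of the improvements described.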

  17. System Evaluation and Economic Analysis of a HTGR Powered High-Temperature Electrolysis Hydrogen Production Plant

    SciTech Connect

    Michael G. McKellar; Edwin A. Harvego; Anastasia A. Gandrik

    2010-10-01

    A design for a commercial-scale high-temperature electrolysis (HTE) plant for hydrogen production has been developed. The HTE plant is powered by a high-temperature gas-cooled reactor (HTGR) whose configuration and operating conditions are based on the latest design parameters planned for the Next Generation Nuclear Plant (NGNP). The current HTGR reference design specifies a reactor power of 600 MWt, with a primary system pressure of 7.0 MPa, and reactor inlet and outlet fluid temperatures of 322°C and 750°C, respectively. The power conversion unit will be a Rankine steam cycle with a power conversion efficiency of 40%. The reference hydrogen production plant operates at a system pressure of 5.0 MPa and utilizes a steam-sweep system to remove the excess oxygen that is evolved on the anode (oxygen) side of the electrolyzer. The overall system thermal-to-hydrogen production efficiency (based on the higher heating value of the produced hydrogen) is 40.4% at a hydrogen production rate of 1.75 kg/s and an oxygen production rate of 13.8 kg/s. An economic analysis of this plant was performed with realistic financial and cost-estimating assumptions. The results of the economic analysis demonstrated that the HTE hydrogen production plant driven by a high-temperature helium-cooled nuclear power plant can deliver hydrogen at a cost of $3.67/kg, assuming an internal rate of return (IRR) of 12% and a debt-to-equity ratio of 80%/20%. A second analysis shows that if the power cycle efficiency increases to 44.4%, the hydrogen production efficiency increases to 42.8%, and the hydrogen and oxygen production rates are 1.85 kg/s and 14.6 kg/s, respectively. At the higher power cycle efficiency and an IRR of 12%, the cost of hydrogen production is $3.50/kg.
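
    The quoted thermal-to-hydrogen efficiency can be sanity-checked with a back-of-envelope ratio of hydrogen HHV output to reactor thermal power. This simple ratio gives roughly 41%, consistent with the reported 40.4%; the paper's figure comes from a full plant power balance, so an exact match is not expected.

```python
# Back-of-envelope check of the reported thermal-to-hydrogen efficiency:
# hydrogen energy output (higher heating value basis) divided by reactor
# thermal power. The simple ratio ignores plant-internal power flows, so it
# slightly overestimates the reported 40.4%.

HHV_H2 = 141.9e6      # higher heating value of hydrogen, J/kg (approximate)
m_dot_h2 = 1.75       # hydrogen production rate, kg/s (from the abstract)
q_reactor = 600.0e6   # reactor thermal power, W (600 MWt)

eta = m_dot_h2 * HHV_H2 / q_reactor
print(f"thermal-to-hydrogen efficiency ~ {eta:.1%}")  # ~ 41%, vs. 40.4% reported
```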

  18. Quantitative comparison of analysis methods for spectroscopic optical coherence tomography: reply to comment

    PubMed Central

    Bosschaart, Nienke; van Leeuwen, Ton G.; Aalders, Maurice C.G.; Faber, Dirk J.

    2014-01-01

    We reply to the comment by Kraszewski et al on “Quantitative comparison of analysis methods for spectroscopic optical coherence tomography.” We present additional simulations evaluating the proposed window function. We conclude that our simulations show good qualitative agreement with the results of Kraszewski, in support of their conclusion that SOCT optimization should include window shape, next to choice of window size and analysis algorithm. PMID:25401016

  19. Shifts in metabolic hydrogen sinks in the methanogenesis-inhibited ruminal fermentation: a meta-analysis

    PubMed Central

    Ungerfeld, Emilio M.

    2015-01-01

    Maximizing the flow of metabolic hydrogen ([H]) in the rumen away from CH4 and toward volatile fatty acids (VFA) would increase the efficiency of ruminant production and decrease its environmental impact. The objectives of this meta-analysis were: (i) To quantify shifts in metabolic hydrogen sinks when inhibiting ruminal methanogenesis in vitro; and (ii) To understand the variation in shifts of metabolic hydrogen sinks among experiments and between batch and continuous cultures systems when methanogenesis is inhibited. Batch (28 experiments, N = 193) and continuous (16 experiments, N = 79) culture databases of experiments with at least 50% inhibition in CH4 production were compiled. Inhibiting methanogenesis generally resulted in less fermentation and digestion in most batch culture, but not in most continuous culture, experiments. Inhibiting CH4 production in batch cultures resulted in redirection of metabolic hydrogen toward propionate and H2 but not butyrate. In continuous cultures, there was no overall metabolic hydrogen redirection toward propionate or butyrate, and H2 as a proportion of metabolic hydrogen spared from CH4 production was numerically smaller compared to batch cultures. Dihydrogen accumulation was affected by type of substrate and methanogenesis inhibitor, with highly fermentable substrates resulting in greater redirection of metabolic hydrogen toward H2 when inhibiting methanogenesis, and some oils causing small or no H2 accumulation. In both batch and continuous culture, there was a decrease in metabolic hydrogen recovered as the sum of propionate, butyrate, CH4 and H2 when inhibiting methanogenesis, and it is speculated that as CH4 production decreases metabolic hydrogen could be increasingly incorporated into formate, microbial biomass, and perhaps, reductive acetogenesis in continuous cultures. Energetic benefits of inhibiting methanogenesis depended on the inhibitor and its concentration and on the in vitro system. PMID:25699029

  20. Quantitative analysis of the mixtures of illicit drugs using terahertz time-domain spectroscopy

    NASA Astrophysics Data System (ADS)

    Jiang, Dejun; Zhao, Shusen; Shen, Jingling

    2008-03-01

    A method was proposed to quantitatively inspect mixtures of illicit drugs with terahertz time-domain spectroscopy. The mass percentages of all components in a mixture can be obtained by linear regression analysis, on the assumption that all components in the mixture and their absorption features are known. Because illicit drugs are scarce and expensive, we first used common chemicals in the experiment: benzophenone, anthraquinone, pyridoxine hydrochloride and L-ascorbic acid. Then an illicit drug and a common adulterant, methamphetamine and flour, were selected for our experiment. Experimental results were in close agreement with the actual content, which suggests that this could be an effective method for quantitative identification of illicit drugs.
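
    The linear-regression step described here amounts to spectral unmixing: the mixture spectrum is modeled as a weighted sum of the known pure-component spectra, and the weights (mass fractions) are recovered by least squares. A minimal sketch with synthetic stand-in spectra, not real THz data:

```python
# Sketch of the linear-regression unmixing described in the abstract: if each
# pure component's absorption spectrum is known, a mixture spectrum is a
# weighted sum and the weights (mass fractions) fall out of least squares.
# The "spectra" below are synthetic Gaussians, not measured THz features.
import numpy as np

freqs = np.linspace(0.2, 2.6, 50)                      # THz axis (arbitrary)
comp_a = np.exp(-((freqs - 1.2) / 0.15) ** 2)          # synthetic absorption peak A
comp_b = np.exp(-((freqs - 1.8) / 0.20) ** 2)          # synthetic absorption peak B

true_fractions = np.array([0.3, 0.7])
mixture = true_fractions[0] * comp_a + true_fractions[1] * comp_b

A = np.column_stack([comp_a, comp_b])                  # design matrix of pure spectra
fractions, *_ = np.linalg.lstsq(A, mixture, rcond=None)
print(fractions)   # ~ [0.3, 0.7]
```

    With real, noisy spectra the same fit applies, though a non-negativity constraint on the fractions is often added in practice.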