Science.gov

Sample records for quantitative hydrogen analysis

  1. Quantitative infrared analysis of hydrogen fluoride

    SciTech Connect

    Manuta, D.M.

    1997-04-01

    This work was performed at the Portsmouth Gaseous Diffusion Plant, where hydrogen fluoride is produced upon the hydrolysis of UF6. This poses a problem in this setting, and a method for determining the mole percent concentration of HF was desired. HF has been considered a non-ideal gas for many years; D. F. Smith utilized complex equations in his HF studies in the 1950s. We have evaluated HF behavior as a function of pressure from three different perspectives: (1) absorbance at 3877 cm^-1 as a function of pressure for 100% HF; (2) absorbance at 3877 cm^-1 as a function of increasing HF partial pressure, with the total pressure maintained at 300 mm HgA with nitrogen; and (3) absorbance at 3877 cm^-1 for constant HF partial pressure, with the total pressure increased to greater than 800 mm HgA with nitrogen. These experiments have shown that at partial pressures up to 35 mm HgA, HF follows the ideal gas law. The absorbance at 3877 cm^-1 can therefore be quantitatively analyzed via infrared methods.
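
    The quantitation implied here rests on a Beer-Lambert relation that stays linear in HF partial pressure while the ideal gas law holds. A minimal sketch of such a calibration, with invented pressures and absorbances standing in for the paper's data:

```python
import numpy as np

# Hypothetical calibration: HF partial pressure (mm HgA) vs. absorbance at
# 3877 cm^-1. In the ideal-gas regime, Beer-Lambert predicts A ~ k*p.
p = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0, 35.0])
A = np.array([0.021, 0.043, 0.065, 0.084, 0.107, 0.128, 0.149])

k, b = np.polyfit(p, A, 1)                 # linear fit A = k*p + b
p_sample = (0.110 - b) / k                 # invert calibration for a sample
mole_pct = 100.0 * p_sample / 300.0        # mole % HF at 300 mm HgA total
print(f"k = {k:.4f} /mmHgA, p_HF ~ {p_sample:.1f} mm HgA, ~{mole_pct:.1f} mol %")
```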

  2. Qualitative and quantitative analysis of solar hydrogen generation literature from 2001 to 2014.

    PubMed

    Maghami, Mohammad Reza; Asl, Shahin Navabi; Rezadad, Mohammad Esmaeil; Ale Ebrahim, Nader; Gomes, Chandima

    Solar hydrogen generation is one of the new topics in the field of renewable energy. Recently, the rate of investigation into hydrogen generation has grown dramatically in many countries. Many studies have been done on hydrogen generation from natural resources such as wind, solar, coal, etc. In this work we evaluated the global scientific production of solar hydrogen generation papers from 2001 to 2014 in all journals of all the subject categories of the Science Citation Index compiled by the Institute for Scientific Information (ISI), Philadelphia, USA. "Solar hydrogen generation" was used as the keyword to search titles, abstracts, and keywords. The published output analysis showed that research on hydrogen generation from the sun steadily increased over the past 14 years, and that the annual paper production in 2013 was about three times that of 2010. The number of papers considered in this research is 141, published from 2001 to 2014. There are clear distinctions among the author keywords used in publications from the five most productive countries (USA, China, Australia, Germany, and India) in solar hydrogen studies. Quantitative and qualitative analysis methods were used to evaluate the development of global scientific production in this specific research field. The analytical results provide several key findings and an overview of hydrogen production via solar hydrogen generation.

  3. Quantitative hydrogen analysis of zircaloy-4 in laser-induced breakdown spectroscopy with ambient helium gas

    SciTech Connect

    Ramli, Muliadi; Fukumoto, Ken-ichi; Niki, Hideaki; Abdulmadjid, Syahrun Nur; Idris, Nasrullah; Maruyama, Tadashi; Kagawa, Kiichiro; Tjia, May On; Pardede, Marincan; Kurniawan, Koo Hendrik; Hedwig, Rinda; Lie, Zener Sukra; Lie, Tjung Jie; Kurniawan, Davy Putra

    2007-12-01

    This experiment was carried out to address the need for overcoming the difficulties encountered in hydrogen analysis by means of plasma emission spectroscopy in atmospheric ambient gas. The result of this study on zircaloy-4 samples from a nuclear power plant demonstrates the possibility of attaining a very sharp emission line from hydrogen impurities, with a very low background and practical elimination of spectral contamination of the hydrogen emission arising from surface water and water vapor in the atmospheric ambient gas. This was achieved by employing ultrapure ambient helium gas as well as proper defocusing of the laser irradiation and a large number of repeated precleaning laser shots at the same spot on the sample surface. Further adjustment of the gating time led to a significant reduction of the spectral width and an improvement of the detection sensitivity to ~50 ppm. Finally, a linear calibration curve with zero intercept was also obtained for the zircaloy-4 samples. These results demonstrate the feasibility of this technique for practical in situ and quantitative analysis of hydrogen impurity in zircaloy-4 tubes used in light water nuclear power plants.
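
    The zero-intercept calibration reported above amounts to a proportional model I = m·C, whose through-origin least-squares solution is m = Σ(C·I)/Σ(C²). A sketch with invented reference values, not the paper's data:

```python
import numpy as np

# Hypothetical LIBS calibration: certified H content (ppm) in zircaloy
# references vs. background-corrected H emission intensity.
C = np.array([50.0, 100.0, 200.0, 400.0])     # ppm H
I = np.array([1.1e3, 2.0e3, 4.2e3, 8.1e3])    # counts

m = np.sum(C * I) / np.sum(C**2)              # through-origin least squares
C_unknown = 3.0e3 / m                         # ppm for a measured intensity
print(f"sensitivity ~ {m:.1f} counts/ppm, unknown ~ {C_unknown:.0f} ppm H")
```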

  4. Scattering influences in quantitative fission neutron radiography for the in situ analysis of hydrogen distribution in metal hydrides

    NASA Astrophysics Data System (ADS)

    Börries, S.; Metz, O.; Pranzas, P. K.; Bücherl, T.; Söllradl, S.; Dornheim, M.; Klassen, T.; Schreyer, A.

    2015-10-01

    In situ neutron radiography allows for the time-resolved study of hydrogen distribution in metal hydrides. However, for a precise quantitative investigation of a time-dependent hydrogen content within a host material, exact knowledge of the corresponding attenuation coefficient is necessary. Additionally, the effect of scattering has to be considered, as it is known to cause deviations from Beer's law, which is used to determine the amount of hydrogen from a measured intensity distribution. Within this study, we used a metal hydride inside two different hydrogen storage tanks, consisting of steel and aluminum, as host systems. The neutron beam attenuation by hydrogen was investigated in these two setups during the hydrogen absorption process. A linear correlation to the amount of absorbed hydrogen was found, allowing for a straightforward quantitative investigation. Further, an analysis of the scattering contributions to the measured intensity distributions was performed and is described in detail.
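
    Beer's law as used here relates the measured transmission to the hydrogen areal density; if scattered neutrons reaching the detector are ignored (the deviation the paper analyzes), the inversion is a single logarithm. A sketch with assumed placeholder values for the cross section and intensities:

```python
import numpy as np

# Beer's law for a hydrided sample: I = I0 * exp(-(mu_host*t + sigma_H*N_H)),
# with N_H the hydrogen areal density (atoms/cm^2).
I0, I = 1.00, 0.62        # open-beam and transmitted intensities (normalized)
mu_host_t = 0.15          # host (tank + alloy) attenuation, dimensionless
sigma_H = 3.0e-23         # assumed effective H cross section, cm^2 (~30 barn)

N_H = (np.log(I0 / I) - mu_host_t) / sigma_H
print(f"hydrogen areal density ~ {N_H:.2e} atoms/cm^2")
```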

  5. Highly hydrogen-sensitive thermal desorption spectroscopy system for quantitative analysis of low hydrogen concentration (~1 × 10^16 atoms/cm^3) in thin-film samples

    NASA Astrophysics Data System (ADS)

    Hanna, Taku; Hiramatsu, Hidenori; Sakaguchi, Isao; Hosono, Hideo

    2017-05-01

    We developed a highly hydrogen-sensitive thermal desorption spectroscopy (HHS-TDS) system to detect and quantitatively analyze low hydrogen concentrations in thin films. The system was connected to an in situ sample-transfer chamber system, manipulators, and an rf magnetron sputtering thin-film deposition chamber under an ultra-high-vacuum (UHV) atmosphere of ~10^-8 Pa. The following key requirements were proposed in developing the HHS-TDS: (i) a low hydrogen residual partial pressure, (ii) a low hydrogen exhaust velocity, and (iii) minimization of hydrogen thermal desorption except from the bulk region of the thin films. To satisfy these requirements, appropriate materials and components were selected, and the system was constructed to extract the maximum performance from each component. Consequently, ~2000 times higher sensitivity to hydrogen than that of a commercially available UHV-TDS system was achieved using H+-implanted Si samples. Quantitative analysis of an amorphous oxide semiconductor InGaZnO4 thin film (1 cm × 1 cm × 1 μm thickness, hydrogen concentration of 4.5 × 10^17 atoms/cm^3) was demonstrated using the HHS-TDS system. This concentration level cannot be detected using UHV-TDS or secondary ion mass spectroscopy (SIMS) systems. The hydrogen detection limit of the HHS-TDS system was estimated to be ~1 × 10^16 atoms/cm^3, which implies ~2 orders of magnitude higher sensitivity than that of SIMS and resonance nuclear reaction systems (~10^18 atoms/cm^3).
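
    A common way to put TDS signals on an absolute scale, and the general scheme behind calibrating with H+-implanted Si, is to ratio the integrated desorption signal of the film against that of a reference with a known implanted dose. A sketch with invented numbers, not the paper's data:

```python
# All values are hypothetical placeholders.
area_film = 4.2e-9       # integrated H2 desorption signal of the film (A*s)
area_ref = 2.1e-8        # integrated signal of the H+-implanted reference (A*s)
dose_ref = 1.0e16        # implanted H dose of the reference, atoms/cm^2
thickness = 1.0e-4       # film thickness, cm (1 um)

n_areal = dose_ref * area_film / area_ref    # H atoms per cm^2 in the film
c_H = n_areal / thickness                    # volume concentration, atoms/cm^3
print(f"H concentration ~ {c_H:.1e} atoms/cm^3")
```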

  6. Quantitative analysis of hydrogenated diamondlike carbon films by visible Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Singha, Achintya; Ghosh, Aditi; Roy, Anushree; Ray, Nihar Ranjan

    2006-08-01

    The correlations between the properties of hydrogenated diamondlike carbon films and their Raman spectra have been investigated. The films were prepared by a plasma deposition technique, keeping different hydrogen-to-methane ratios during the growth process. The hydrogen concentration, sp3 content, hardness, and optical Tauc gap of the materials have been estimated from a detailed analysis of their Raman spectra. We have also measured the same parameters of the films using other commonly used techniques: sp3 content by X-ray photoelectron spectroscopy, Tauc gap by ellipsometric measurements, and hardness by microhardness testing. The reasons for the mismatch between the characteristics of the films as obtained by Raman measurements and by the above-mentioned techniques are discussed. We emphasize the importance of visible Raman spectroscopy in reliably predicting the above key properties of diamondlike carbon films.

  7. Quantitative analysis of the hydrogen peroxide formed in aqueous cigarette tar extracts

    SciTech Connect

    Nakayama, T.; Church, D.F.; Pryor, W.A.

    1989-01-01

    We have established, for the first time, a reliable method to quantitate the hydrogen peroxide (H2O2) generated in aqueous extracts of cigarette smoke tar. The aqueous tar extract was passed through a short reverse-phase column and its H2O2 concentration determined by differential pulse polarography using an automatic reference subtraction system. The H2O2 concentration increased with aging, pH, and temperature; the presence of superoxide dismutase led to lower H2O2 concentrations. This method was applied to many kinds of research and commercial cigarettes. With a few exceptions, the amount of H2O2 formed after a fixed time from each cigarette's smoke was proportional to its tar yield.

  8. Purity analysis of hydrogen cyanide, cyanogen chloride and phosgene by quantitative 13C NMR spectroscopy.

    PubMed

    Henderson, Terry J; Cullinan, David B

    2007-11-01

    Hydrogen cyanide, cyanogen chloride and phosgene are produced in tremendously large quantities today by the chemical industry. The compounds are also particularly attractive to foreign states and terrorists seeking an inexpensive mass-destruction capability. Along with contemporary warfare agents, therefore, the US Army evaluates protective equipment used by warfighters and domestic emergency responders against the compounds, and requires their certification at ≥95 carbon atom % before use. We have investigated the 13C spin-lattice relaxation behavior of the compounds to develop a quantitative NMR method for characterizing chemical lots supplied to the Army. Behavior was assessed at 75 and 126 MHz at temperatures between 5 and 15 °C to hold the compounds in their liquid states, dramatically improving detection sensitivity. T1 values for cyanogen chloride and phosgene were comparable, ranging between 20 and 31 s. Hydrogen cyanide values were significantly shorter, at 10-18 s, most likely because of a 1H-13C dipolar contribution to relaxation not possible for the other compounds. The T1 measurements were used to derive relaxation delays for collecting the quantitative 13C data sets. At 126 MHz, only a single data acquisition with a cryogenic probehead gave a signal-to-noise ratio exceeding that necessary for certifying the compounds at ≥95 carbon atom % and 99% confidence. Data acquired at 75 MHz with a conventional probehead, however, required ≥5 acquisitions to reach this certifying signal-to-noise ratio for phosgene, and ≥12 acquisitions were required for the other compounds under the same conditions. In terms of accuracy and execution time, the NMR method rivals typical chromatographic methods.
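
    Two standard planning rules sit behind such measurements: waiting roughly 5×T1 between pulses so the 13C magnetization recovers to better than 99%, and averaging N scans for a √N gain in signal-to-noise. A sketch with assumed SNR numbers (the certification threshold below is a placeholder, not the Army's figure):

```python
import math

T1_max = 31.0                  # longest measured T1 in the set, seconds
recycle_delay = 5.0 * T1_max   # conservative quantitative recycle delay

snr_single = 40.0              # assumed single-scan SNR
snr_target = 250.0             # assumed SNR needed for certification
n_scans = math.ceil((snr_target / snr_single) ** 2)   # SNR grows as sqrt(N)

print(f"recycle delay ~ {recycle_delay:.0f} s, scans needed ~ {n_scans}, "
      f"total ~ {n_scans * recycle_delay / 60:.0f} min")
```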

  9. Qualitative and quantitative analysis of mixtures of compounds containing both hydrogen and deuterium

    NASA Technical Reports Server (NTRS)

    Crespi, H. L.; Harkness, L.; Katz, J. J.; Norman, G.; Saur, W.

    1969-01-01

    Method allows qualitative and quantitative analysis of mixtures of partially deuterated compounds. Nuclear magnetic resonance spectroscopy determines the location and amount of deuterium in organic compounds but cannot analyze fully deuterated compounds. Mass spectrometry can detect fully deuterated species but not the location of the deuterium.

  10. Quantitative analysis of hydrogen in SiO2/SiN/SiO2 stacks using atom probe tomography

    NASA Astrophysics Data System (ADS)

    Kunimune, Yorinobu; Shimada, Yasuhiro; Sakurai, Yusuke; Inoue, Masao; Nishida, Akio; Han, Bin; Tu, Yuan; Takamizawa, Hisashi; Shimizu, Yasuo; Inoue, Koji; Yano, Fumiko; Nagai, Yasuyoshi; Katayama, Toshiharu; Ide, Takashi

    2016-04-01

    We have demonstrated that it is possible to reproducibly quantify hydrogen concentration in the SiN layer of a SiO2/SiN/SiO2 (ONO) stack structure using ultraviolet laser-assisted atom probe tomography (APT). The concentration of hydrogen atoms detected using APT increased gradually during the analysis, which could be explained by the effect of hydrogen adsorption from residual gas in the vacuum chamber onto the specimen surface. The amount of adsorbed hydrogen in the SiN layer was estimated by analyzing another SiN layer with an extremely low hydrogen concentration (<0.2 at. %). Thus, by subtracting the concentration of adsorbed hydrogen, the actual hydrogen concentration in the SiN layer was quantified as approximately 1.0 at. %. This result was consistent with that obtained by elastic recoil detection analysis (ERDA), which confirmed the accuracy of the APT quantification. The present results indicate that APT enables the imaging of the three-dimensional distribution of hydrogen atoms in actual devices at a sub-nanometer scale.
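
    The correction described above reduces to a single subtraction once the low-hydrogen reference SiN layer pins down the adsorption contribution; the values below follow the abstract, with the apparent level an assumed round number:

```python
measured_H = 1.2   # apparent H in the SiN layer of interest, at. % (assumed)
adsorbed_H = 0.2   # apparent H in the ultra-low-H reference SiN, at. %

actual_H = measured_H - adsorbed_H   # adsorption-corrected concentration
print(f"actual H concentration ~ {actual_H:.1f} at. %")   # ~1.0 at. %
```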

  11. iTRAQ-based quantitative proteomic analysis reveals new metabolic pathways of wheat seedling growth under hydrogen peroxide stress.

    PubMed

    Ge, Pei; Hao, Pengchao; Cao, Min; Guo, Guangfang; Lv, Dongwen; Subburaj, Saminathan; Li, Xiaohui; Yan, Xing; Xiao, Jitian; Ma, Wujun; Yan, Yueming

    2013-10-01

    As an abundant ROS, hydrogen peroxide (H2O2) plays pivotal roles in plant growth and development. In this work, we conducted for the first time an iTRAQ-based quantitative proteomic analysis of wheat seedling growth under different exogenous H2O2 treatments. The growth of seedlings and roots was significantly restrained by increased H2O2 concentration stress. Malondialdehyde, soluble sugar, and proline contents as well as peroxidase activity increased with increasing H2O2 levels. A total of 3,425 proteins were identified by iTRAQ, of which 157 showed differential expression and 44 were newly identified H2O2-responsive proteins. H2O2-responsive proteins were mainly involved in stress/defense/detoxification, signal transduction, and carbohydrate metabolism. It is clear that up-regulated expression of signal transduction and stress/defense/detoxification-related proteins under H2O2 stress, such as plasma membrane intrinsic protein 1, fasciclin-like arabinogalactan protein, and superoxide dismutase, could contribute to the H2O2 tolerance of wheat seedlings. Increased gluconeogenesis (phosphoenolpyruvate carboxykinase) and decreased pyruvate kinase proteins are potentially related to the higher H2O2 tolerance of wheat seedlings. A metabolic pathway of wheat seedling growth under H2O2 stress is presented.

  12. Quantitative kinetic analysis of hydrogen transfer reactions from dietary polyphenols to the DPPH radical.

    PubMed

    Goupy, Pascale; Dufour, Claire; Loonis, Michele; Dangles, Olivier

    2003-01-29

    Diphenylpicrylhydrazyl (DPPH) is widely used for quickly assessing the ability of polyphenols to transfer labile H atoms to radicals, a likely mechanism of antioxidant protection. This popular test generally pays no attention to the kinetics of H atom transfer, which could however be even more important than the total H-atom-donating capacities (stoichiometry, EC50) typically evaluated. In the present work, a series of dietary polyphenols belonging to the most representative families (flavonols from onion, flavanol monomers and oligomers from barley, and caffeic acid and caffeoyl esters from artichoke and endive) are characterized not only by their total stoichiometries (n_tot) but also by their rate constants of first H atom abstraction by DPPH (k1), deduced from kinetic analysis of the decay of the DPPH visible band following addition of the antioxidant. The mildly reactive DPPH radical allows good discrimination between polyphenols, as demonstrated by the relatively large ranges of k1 (ca. 400-5000 M^-1 s^-1) and n_tot (ca. 1-5) values typically measured with antioxidants having a single polyphenolic nucleus. With antioxidants displaying more than one polyphenolic nucleus (procyanidin oligomers, dicaffeoyl esters), the kinetic analysis makes it possible to demonstrate significant differences in reactivity between the subunits (two distinct k1 values whose ratio lies in the range 3-10) and nonadditive stoichiometries.
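
    One way to extract a k1 of the kind reported here is a pseudo-first-order fit to the decay of the DPPH visible band after adding excess antioxidant. A sketch on synthetic data (the concentration, noise level, and rate constant are all invented):

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0, 60, 13)                   # s
AH = 5.0e-5                                  # antioxidant concentration, M
A = 0.30 + 0.60 * np.exp(-1500.0 * AH * t)   # synthetic decay, k1 = 1500 /M/s
A += np.random.default_rng(0).normal(0, 0.004, t.size)

def model(t, A_inf, A0, k1):
    # pseudo-first-order decay: A(t) = A_inf + (A0 - A_inf) * exp(-k1*[AH]*t)
    return A_inf + (A0 - A_inf) * np.exp(-k1 * AH * t)

popt, _ = curve_fit(model, t, A, p0=(0.3, 0.9, 1000.0))
print(f"k1 ~ {popt[2]:.0f} M^-1 s^-1")       # within the 400-5000 range above
```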

  13. Quantitative determination of hydrogen in solids by gas chromatography.

    PubMed

    Addach, H; Berçot, P; Wery, M; Rezrazi, M

    2004-11-19

    Processes such as electroplating or acid cleaning are notorious causes of post-processing failure through hydrogen embrittlement, so the determination of the amount of hydrogen in metals is of great importance. An analysis method for investigating the hydrogen content of solids has been established, based on hot extraction and a gas chromatography system. Hot extraction in inert gas enables complete and/or partial removal of the hydrogen from the samples. A gas chromatography system is used to determine quantitatively the amount of thermally desorbed hydrogen. The influence of baking operating conditions on the hydrogen desorption rate of zinc-plated steel parts is investigated. An analysis of the polarisation conditions in chromium electroplating is then given.

  14. Highly hydrogen-sensitive thermal desorption spectroscopy system for quantitative analysis of low hydrogen concentration (~1 × 10^16 atoms/cm^3) in thin-film samples.

    PubMed

    Hanna, Taku; Hiramatsu, Hidenori; Sakaguchi, Isao; Hosono, Hideo

    2017-05-01

    We developed a highly hydrogen-sensitive thermal desorption spectroscopy (HHS-TDS) system to detect and quantitatively analyze low hydrogen concentrations in thin films. The system was connected to an in situ sample-transfer chamber system, manipulators, and an rf magnetron sputtering thin-film deposition chamber under an ultra-high-vacuum (UHV) atmosphere of ~10^-8 Pa. The following key requirements were proposed in developing the HHS-TDS: (i) a low hydrogen residual partial pressure, (ii) a low hydrogen exhaust velocity, and (iii) minimization of hydrogen thermal desorption except from the bulk region of the thin films. To satisfy these requirements, appropriate materials and components were selected, and the system was constructed to extract the maximum performance from each component. Consequently, ~2000 times higher sensitivity to hydrogen than that of a commercially available UHV-TDS system was achieved using H+-implanted Si samples. Quantitative analysis of an amorphous oxide semiconductor InGaZnO4 thin film (1 cm × 1 cm × 1 μm thickness, hydrogen concentration of 4.5 × 10^17 atoms/cm^3) was demonstrated using the HHS-TDS system. This concentration level cannot be detected using UHV-TDS or secondary ion mass spectroscopy (SIMS) systems. The hydrogen detection limit of the HHS-TDS system was estimated to be ~1 × 10^16 atoms/cm^3, which implies ~2 orders of magnitude higher sensitivity than that of SIMS and resonance nuclear reaction systems (~10^18 atoms/cm^3).

  15. Hydrogen quantitative risk assessment workshop proceedings.

    SciTech Connect

    Groth, Katrina M.; Harris, Aaron P.

    2013-09-01

    The Quantitative Risk Assessment (QRA) Toolkit Introduction Workshop was held at Energetics on June 11-12. The workshop was co-hosted by Sandia National Laboratories (Sandia) and HySafe, the International Association for Hydrogen Safety. The objective of the workshop was twofold: (1) present a hydrogen-specific methodology and toolkit (currently under development) for conducting QRA to support the development of codes and standards and safety assessments of hydrogen-fueled vehicles and fueling stations, and (2) obtain feedback on the needs of early-stage users (hydrogen as well as potential leveraging for Compressed Natural Gas [CNG] and Liquefied Natural Gas [LNG]) and set priorities for “Version 1” of the toolkit in the context of the commercial evolution of hydrogen fuel cell electric vehicles (FCEV). The workshop consisted of an introduction and three technical sessions: Risk-Informed Development and Approach; CNG/LNG Applications; and Introduction of a Hydrogen-Specific QRA Toolkit.

  16. Quantitative phase analysis from powder diffraction using the Rietveld method in hydrogen storage alloys based on TiCr

    NASA Astrophysics Data System (ADS)

    Martinez, A.; Bellon, D.; Reina, L.

    2016-08-01

    Hydrogen storage is one of the important steps in the implementation of the hydrogen economy, and metal hydrides are a promising way to achieve this goal. We present in this work the use of Rietveld analysis to structurally characterize TiCr-based alloys that are able to store hydrogen. TiCr1.1V0.9, TiCr1.1V0.45Nb0.45, TiCr1.1V0.2Nb0.8, and TiCr1.1Nb0.9 alloys were synthesized in an arc furnace under argon atmosphere. Phase analysis was performed by X-ray diffraction (XRD), followed by refinement of both the lattice parameters and the phase percentages. Our results confirmed that a bcc structure, mostly combined with a small percentage of Laves phases, leads to important properties in this area. Rietveld analysis was performed with the FullProf program, which allows us to obtain the different structural parameters.
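
    Quantitative phase analysis from Rietveld refinement conventionally uses the Hill-Howard relation, W_i = S_i(ZMV)_i / Σ_j S_j(ZMV)_j, where the scale factors S come from the refinement (e.g. FullProf) and Z, M, V are the formula units per cell, formula mass, and cell volume. A sketch with placeholder numbers, not the paper's refined values:

```python
phases = {
    # name:      (S: scale,  Z: f.u./cell,  M: g/mol,  V: A^3)  -- placeholders
    "bcc":       (1.2e-3,    2,             100.2,     28.0),
    "C14 Laves": (9.0e-5,    4,             204.5,    160.0),
}

zmv = {name: s * z * m * v for name, (s, z, m, v) in phases.items()}
total = sum(zmv.values())
for name, val in zmv.items():
    print(f"{name}: {100 * val / total:.1f} wt%")
```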

  17. Quantitative analysis of desorption and decomposition kinetics of formic acid on Cu(111): The importance of hydrogen bonding between adsorbed species

    SciTech Connect

    Shiozawa, Yuichiro; Koitaya, Takanori; Mukai, Kozo; Yoshimoto, Shinya; Yoshinobu, Jun

    2015-12-21

    Quantitative analysis of the desorption and decomposition kinetics of formic acid (HCOOH) on Cu(111) was performed by temperature-programmed desorption (TPD), X-ray photoelectron spectroscopy, and time-resolved infrared reflection absorption spectroscopy. The activation energy for desorption is estimated to be 53-75 kJ/mol by the threshold TPD method, as a function of coverage. Vibrational spectra of the first-layer HCOOH at 155.3 K show that the adsorbed molecules form a polymeric structure via a hydrogen-bonding network. Adsorbed HCOOH molecules dissociate gradually into monodentate formate species. The activation energy for dissociation into monodentate formate species is estimated to be 65.0 kJ/mol at a submonolayer coverage (0.26 molecules/surface Cu atom). Hydrogen bonding between adsorbed HCOOH species plays an important role in the stabilization of HCOOH on Cu(111). The monodentate formate species are stabilized at higher coverages because of the lack of vacant sites for bidentate formation.
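
    For comparison, the quickest textbook estimate of a first-order desorption energy from a TPD peak is the Redhead formula (a different, simpler method than the threshold-TPD analysis the paper uses); all inputs below are assumed values:

```python
import math

R = 8.314     # J/(mol K)
nu = 1e13     # assumed attempt frequency, 1/s
Tp = 210.0    # assumed desorption peak temperature, K
beta = 1.0    # assumed heating rate, K/s

# Redhead approximation (first-order): E = R*Tp*(ln(nu*Tp/beta) - 3.64)
E = R * Tp * (math.log(nu * Tp / beta) - 3.64)
print(f"E_des ~ {E / 1e3:.0f} kJ/mol")    # ~55 kJ/mol, inside the 53-75 range
```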

  18. Analysis of hydrogen isotope mixtures

    DOEpatents

    Villa-Aleman, Eliel

    1994-01-01

    An apparatus and method for determining the concentrations of hydrogen isotopes in a sample. Hydrogen in the sample is separated from other elements using a filter selectively permeable to hydrogen. Then the hydrogen is condensed onto a cold finger or cryopump. The cold finger is rotated as pulsed laser energy vaporizes a portion of the condensed hydrogen, forming a packet of molecular hydrogen. The desorbed hydrogen is ionized and admitted into a mass spectrometer for analysis.
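
    The mass-spectrometric step implies converting molecular peaks back to atom fractions. Restricted to H and D (m/z 2 = H2, 3 = HD, 4 = D2) and ignoring differing ionization efficiencies, the bookkeeping is simple; the peak values below are invented:

```python
peaks = {"H2": 0.55, "HD": 0.30, "D2": 0.15}   # normalized ion intensities

# Each molecule carries two atoms; D atoms come from HD (one) and D2 (two).
x_D = (peaks["HD"] + 2 * peaks["D2"]) / (2 * sum(peaks.values()))
print(f"atomic D fraction ~ {x_D:.2f}")        # 0.30 here
```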

  19. Quantitative biology of hydrogen peroxide signaling.

    PubMed

    Antunes, Fernando; Brito, Paula Matos

    2017-10-01

    Hydrogen peroxide (H2O2) controls signaling pathways in cells by oxidative modulation of the activity of redox-sensitive proteins denominated redox switches. Here, quantitative biology concepts are applied to review how H2O2 fulfills a key role in information transmission. The equations described lay the foundation of H2O2 signaling, give new insights into H2O2 signaling mechanisms, and help extract new information from common redox signaling experiments. A key characteristic of H2O2 signaling is that the ratio between reduction and oxidation of redox switches determines the range of H2O2 concentrations to which they respond. Thus, a redox switch with low H2O2-dependent oxidability and a slow reduction rate responds to the same range of H2O2 concentrations as a redox switch with high H2O2-dependent oxidability that is rapidly reduced; yet in the first case the response time is slow, while in the second it is rapid. H2O2 sensing and transmission of information can be done directly or by complex mechanisms in which oxidation is relayed between proteins before oxidizing the final regulatory redox target. In spite of being a very simple molecule, H2O2 has a key role in cellular signaling, with the reliability of the information transmitted depending on the inherent chemical reactivity of redox switches, on the presence of localized H2O2 pools, and on the molecular recognition between redox switches and their partners.
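
    The ratio statement above can be written as a steady-state equation: a switch oxidized at rate k_ox[H2O2] and re-reduced at rate k_red has oxidized fraction f_ox = k_ox[H2O2] / (k_ox[H2O2] + k_red), so its midpoint sits at [H2O2] = k_red/k_ox while its response time scales as 1/(k_ox[H2O2] + k_red). A sketch with placeholder rate constants:

```python
import numpy as np

H2O2 = np.logspace(-9, -5, 5)   # M

# Two switches with the same k_red/k_ox ratio (same midpoint, 10 nM) but
# 100-fold different absolute rates, i.e. different response times.
for k_ox, k_red, label in [(1e5, 1e-3, "low oxidability, slow reduction "),
                           (1e7, 1e-1, "high oxidability, fast reduction")]:
    f_ox = k_ox * H2O2 / (k_ox * H2O2 + k_red)
    print(label, np.round(f_ox, 3))
```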

  20. Identification of suitable reference genes for real-time quantitative PCR analysis of hydrogen peroxide-treated human umbilical vein endothelial cells.

    PubMed

    Li, Tianyi; Diao, Hongying; Zhao, Lei; Xing, Yue; Zhang, Jichang; Liu, Ning; Yan, Youyou; Tian, Xin; Sun, Wei; Liu, Bin

    2017-04-05

    Oxidative stress can induce cell injury in vascular endothelial cells, which is the initial event in the development of atherosclerosis. Although quantitative real-time polymerase chain reaction (qRT-PCR) has been widely used in gene expression studies of oxidative stress injuries, the use of carefully validated reference genes has not received sufficient attention in related studies. The objective of this study, therefore, was to select a set of stably expressed reference genes for use in qRT-PCR normalization in oxidative stress injuries in human umbilical vein endothelial cells (HUVECs) induced by hydrogen peroxide (H2O2). Using geNorm analysis, we found that five stably expressed reference genes were sufficient for normalization in qRT-PCR analysis of HUVECs treated with H2O2. The genes with the most stable expression according to geNorm were U6, TFRC, RPLP0, GAPDH, and ACTB, and according to NormFinder were ALAS1, TFRC, U6, GAPDH, and ACTB. Taken together, our study demonstrated that the expression stability of reference genes may differ according to the statistical program used. U6, TFRC, RPLP0, GAPDH, and ACTB constituted the optimal set of reference genes for gene expression studies performed by qRT-PCR assays in HUVECs under oxidative stress.
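
    The geNorm stability measure used above is simple to state: for each candidate gene j, M_j is the mean standard deviation of the pairwise log2 expression ratios against every other candidate, and a lower M means more stable expression. A sketch on random placeholder data standing in for the qRT-PCR matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
expr = rng.lognormal(5.0, 0.3, size=(12, 5))   # rows: samples, cols: genes
genes = ["U6", "TFRC", "RPLP0", "GAPDH", "ACTB"]

log_expr = np.log2(expr)
M = [np.mean([np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
              for k in range(len(genes)) if k != j])
     for j in range(len(genes))]

for g, m in sorted(zip(genes, M), key=lambda x: x[1]):
    print(f"{g}: M = {m:.3f}")                 # ranked, most stable first
```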

  1. Quantitative analysis of hydrogen sites and occupancy in deep mantle hydrous wadsleyite using single crystal neutron diffraction

    NASA Astrophysics Data System (ADS)

    Purevjav, Narangoo; Okuchi, Takuo; Tomioka, Naotaka; Wang, Xiaoping; Hoffmann, Christina

    2016-10-01

    Evidence from seismological and mineralogical studies increasingly indicates that water from the oceans has been transported to the deep earth to form water-bearing dense mantle minerals. Wadsleyite [(Mg, Fe2+)2SiO4] has been identified as one of the most important host minerals incorporating this type of water, which is capable of storing the entire mass of the oceans as a hidden reservoir. To understand the effects of such water on the physical properties and chemical evolution of Earth’s interior, it is essential to determine where in the crystal structure the hydration occurs and which chemical bonds are altered and weakened after hydration. Here, we conduct a neutron time-of-flight single-crystal Laue diffraction study on hydrous wadsleyite. Single crystals were grown under pressure to a size suitable for the experiment and with physical qualities representative of wet, deep mantle conditions. The results of this neutron single crystal diffraction study unambiguously demonstrate the method of hydrogen incorporation into the wadsleyite, which is qualitatively different from that of its denser polymorph, ringwoodite, in the wet mantle. The difference is a vital clue towards understanding why these dense mantle minerals show distinctly different softening behaviours after hydration.

  2. Quantitative analysis of hydrogen sites and occupancy in deep mantle hydrous wadsleyite using single crystal neutron diffraction

    PubMed Central

    Purevjav, Narangoo; Okuchi, Takuo; Tomioka, Naotaka; Wang, Xiaoping; Hoffmann, Christina

    2016-01-01

    Evidence from seismological and mineralogical studies increasingly indicates that water from the oceans has been transported to the deep earth to form water-bearing dense mantle minerals. Wadsleyite [(Mg, Fe2+)2SiO4] has been identified as one of the most important host minerals incorporating this type of water, which is capable of storing the entire mass of the oceans as a hidden reservoir. To understand the effects of such water on the physical properties and chemical evolution of Earth’s interior, it is essential to determine where in the crystal structure the hydration occurs and which chemical bonds are altered and weakened after hydration. Here, we conduct a neutron time-of-flight single-crystal Laue diffraction study on hydrous wadsleyite. Single crystals were grown under pressure to a size suitable for the experiment and with physical qualities representative of wet, deep mantle conditions. The results of this neutron single crystal diffraction study unambiguously demonstrate the method of hydrogen incorporation into the wadsleyite, which is qualitatively different from that of its denser polymorph, ringwoodite, in the wet mantle. The difference is a vital clue towards understanding why these dense mantle minerals show distinctly different softening behaviours after hydration. PMID:27725749

  3. Quantitative analysis of hydrogen in SiO2/SiN/SiO2 stacks using atom probe tomography

    SciTech Connect

    Kunimune, Yorinobu; Shimada, Yasuhiro; Sakurai, Yusuke; Katayama, Toshiharu; Ide, Takashi; Inoue, Masao; Nishida, Akio; Han, Bin; Tu, Yuan; Takamizawa, Hisashi; Shimizu, Yasuo; Inoue, Koji; Nagai, Yasuyoshi; Yano, Fumiko

    2016-04-15

    We have demonstrated that it is possible to reproducibly quantify hydrogen concentration in the SiN layer of a SiO2/SiN/SiO2 (ONO) stack structure using ultraviolet laser-assisted atom probe tomography (APT). The concentration of hydrogen atoms detected using APT increased gradually during the analysis, which could be explained by the effect of hydrogen adsorption from residual gas in the vacuum chamber onto the specimen surface. The amount of adsorbed hydrogen in the SiN layer was estimated by analyzing another SiN layer with an extremely low hydrogen concentration (<0.2 at. %). Thus, by subtracting the concentration of adsorbed hydrogen, the actual hydrogen concentration in the SiN layer was quantified as approximately 1.0 at. %. This result was consistent with that obtained by elastic recoil detection analysis (ERDA), which confirmed the accuracy of the APT quantification. The present results indicate that APT enables the imaging of the three-dimensional distribution of hydrogen atoms in actual devices at a sub-nanometer scale.

  4. Multivariate Quantitative Chemical Analysis

    NASA Technical Reports Server (NTRS)

    Kinchen, David G.; Capezza, Mary

    1995-01-01

    Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto an object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which it is necessary to know and control critical properties of products via quantitative chemical analyses of the products. In addition to chemical composition, also used to determine such physical properties as densities and strengths.

  5. Quantitative Hydrocarbon Surface Analysis

    NASA Technical Reports Server (NTRS)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  6. A Volumetric Method for Titrimetric Analysis of Hydrogen Peroxide

    DTIC Science & Technology

    1985-05-06

    This report describes a titrimetric method of analysis for hydrogen peroxide, based on a ferrous ammonium sulfate/potassium dichromate oxidation-reduction procedure. Report keywords: hydrogen peroxide; quantitative analysis; potassium dichromate; volumetric analysis; ferrous ammonium sulfate.
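
    The stoichiometry behind such a back-titration is standard redox chemistry: H2O2 consumes two Fe(II), and dichromate then titrates the unreacted Fe(II) six at a time. A worked sketch with invented volumes and concentrations, not the report's procedure:

```python
# H2O2 + 2 Fe(II) + 2 H+       -> 2 Fe(III) + 2 H2O
# Cr2O7(2-) + 6 Fe(II) + 14 H+ -> 2 Cr(III) + 6 Fe(III) + 7 H2O
V_Fe, M_Fe = 25.00e-3, 0.1000   # L and mol/L of Fe(II) added in excess
V_Cr, M_Cr = 5.20e-3, 0.01667   # L and mol/L of dichromate used on the excess

n_Fe_excess = 6 * V_Cr * M_Cr                 # mol Fe(II) left unreacted
n_H2O2 = (V_Fe * M_Fe - n_Fe_excess) / 2      # mol H2O2 in the aliquot
print(f"H2O2 in aliquot: {1e3 * n_H2O2:.3f} mmol")
```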

  7. Hydrogen Data Book from the Hydrogen Analysis Resource Center

    DOE Data Explorer

    The Hydrogen Data Book contains a wide range of factual information on hydrogen and fuel cells (e.g., hydrogen properties, hydrogen production and delivery data, and information on fuel cells and fuel cell vehicles), and it also provides other data that might be useful in analyses of hydrogen infrastructure in the United States (e.g., demographic data and data on energy supply and/or infrastructure). It's made available from the Hydrogen Analysis Resource Center along with a wealth of related information. The related information includes guidelines for DOE Hydrogen Program Analysis, various calculator tools, a hydrogen glossary, related websites, and analysis tools relevant to hydrogen and fuel cells. [From http://hydrogen.pnl.gov/cocoon/morf/hydrogen]

  8. Study of Surface Damage caused by Laser Irradiation for Quantitative Hydrogen Analysis in Zircaloy using Laser-induced Plasma Breakdown Spectrometry

    SciTech Connect

    Fukumoto, K.; Yamada, N.; Niki, H.; Maruyama, T.; Kagawa, K.

    2009-03-17

    The surface damage caused by laser irradiation is studied to investigate the possibility of performing a depth-profile analysis of the hydrogen concentration in zircaloy-4 alloys using laser-induced plasma breakdown spectrometry. After laser irradiation, a heat-affected zone extending about 3 μm down from the top surface can be seen. The depth of this heat-affected zone is independent of the laser power density in the range 10^8 to 10^9 W/cm^2. In order to obtain the depth profile of the hydrogen concentration in zircaloy-4 alloys, the power density of the laser shots must be greater than 1.3×10^9 W/cm^2.
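
    The power densities quoted are pulse irradiances, E/(τ·A). A sketch with assumed Nd:YAG-class pulse parameters (not the paper's settings) showing a combination just above the 1.3×10^9 W/cm^2 threshold:

```python
import math

E_pulse = 100e-3   # assumed pulse energy, J
tau = 8e-9         # assumed pulse duration, s
spot_d = 0.1       # assumed focused spot diameter on the sample, cm

area = math.pi * (spot_d / 2) ** 2            # cm^2
I = E_pulse / (tau * area)                    # W/cm^2
print(f"power density ~ {I:.2e} W/cm^2")      # ~1.6e9 here
```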

  9. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

    According to the regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company in meeting rules and regulations and in assessing and describing environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life. In addition, it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  10. [Quantitative determination of penicillins by iodometry using potassium hydrogen peroxymonosulfate].

    PubMed

    Blazhevskiĭ, N E; Karpova, S P; Kabachyĭ, V I

    2013-01-01

    The kinetics and stoichiometry of the S-oxidation of semisynthetic penicillins (amoxicillin trihydrate, ampicillin trihydrate, oxacillin sodium salt, and ticarcillin disodium salt) by potassium hydrogen peroxymonosulfate in aqueous solutions at pH 3-6 were studied by iodometric titration: 1 mol of KHSO5 reacts per 1 mol of penicillin, and the quantitative interaction is achieved within 1 min (time of observation). A unified method was developed, and the possibility of quantification of penicillins by the iodometric method using potassium hydrogen peroxymonosulfate as an analytical reagent was shown.

  11. Semiquantal analysis of hydrogen bond

    NASA Astrophysics Data System (ADS)

    Ando, Koji

    2006-07-01

    The semiquantal time-dependent Hartree (SQTDH) theory is applied to the coupled Morse and modified Lippincott-Schroeder (LS) model potentials of the hydrogen bond. The structural correlation between the heavy-atom distance and the proton position, the geometric isotope effect, the energy of hydrogen bond formation, and the proton vibrational frequency shift are examined over a broad range of structural parameters. In particular, the geometric isotope effect is found to depend notably on the choice of the potential model, for which the LS potential gives an isotope shift of the heavy-atom distance in the range of 0.02-0.04 Å, in quantitative agreement with experimental findings from an assortment of hydrogen-bonded crystals. The fourth-order expansion approximation to the semiquantal extended potential was confirmed to be highly accurate in reproducing the full SQTDH results. The approximation is computationally efficient and flexible enough to be applied to general models of the hydrogen bond.

  12. Technical Analysis of Hydrogen Production

    SciTech Connect

    Ali T-Raissi

    2005-01-14

    The aim of this work was to assess issues of cost and performance associated with the production and storage of hydrogen via the following three feedstocks: sub-quality natural gas (SQNG), ammonia (NH3), and water. Three technology areas were considered: (1) hydrogen production utilizing SQNG resources, (2) hydrogen storage in ammonia and amine-borane complexes for fuel cell applications, and (3) hydrogen from solar thermochemical cycles for splitting water. This report summarizes our findings with the following objectives: technoeconomic analysis of the feasibility of technology areas 1-3; evaluation of the hydrogen production cost for technology area 1; and feasibility of ammonia and/or amine-borane complexes (technology area 2) as a means of hydrogen storage on board fuel cell powered vehicles. For each technology area, we reviewed the open literature with respect to the following criteria: process efficiency, cost, safety, ease of implementation, and impact of the latest materials innovations, if any. We employed various process analysis platforms, including FactSage chemical equilibrium software and Aspen Technology's AspenPlus and HYSYS chemical process simulation programs, to determine the performance of the prospective hydrogen production processes.

  13. Quantitative Techniques in Volumetric Analysis

    NASA Astrophysics Data System (ADS)

    Zimmerman, John; Jacobsen, Jerrold J.

    1996-12-01

    Quantitative Techniques in Volumetric Analysis is a visual library of techniques used in making volumetric measurements. This 40-minute VHS videotape is designed as a resource for introducing students to proper volumetric methods and procedures. The entire tape, or relevant segments, can also be used to review procedures used in subsequent experiments that rely on the traditional art of quantitative analysis laboratory practice. The techniques included are: quantitative transfer of a solid with a weighing spoon; quantitative transfer of a solid with a finger-held weighing bottle; quantitative transfer of a solid with a paper-strap-held bottle; quantitative transfer of a solid with a spatula; examples of common quantitative weighing errors; quantitative transfer of a solid from dish to beaker to volumetric flask; quantitative transfer of a solid from dish to volumetric flask; use of a volumetric transfer pipet; a complete acid-base titration; and hand technique variations. The conventional view of contemporary quantitative chemical measurement tends to focus on instrumental systems, computers, and robotics. In this view, the analyst is relegated to placing standards and samples on a tray. A robotic arm delivers a sample to the analysis center, while a computer controls the analysis conditions and records the results. In spite of this, it is rare to find an analysis process that does not rely on some aspect of more traditional quantitative analysis techniques, such as careful dilution to the mark of a volumetric flask. Clearly, errors in a classical step will affect the quality of the final analysis. Because of this, it is still important for students to master the key elements of the traditional art of quantitative chemical analysis laboratory practice. Some aspects of chemical analysis, like careful rinsing to ensure quantitative transfer, are often an automated part of an instrumental process that must be understood by the analyst.

  14. Quantitative analysis of PET studies.

    PubMed

    Weber, Wolfgang A

    2010-09-01

    Quantitative analysis can be included relatively easily in clinical PET imaging protocols, but in order to obtain meaningful quantitative results one needs to follow a standardized protocol for image acquisition and data analysis. Important factors to consider are the calibration of the PET scanner, the radiotracer uptake time, and the approach for defining regions of interest. Using such standardized acquisition protocols, quantitative parameters of tumor metabolism or receptor status can be derived from tracer kinetic analysis and from simplified approaches such as the calculation of standardized uptake values (SUVs).
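
    The SUV mentioned above is the ratio of the tissue activity concentration to the injected activity per unit body weight, with the dose decay-corrected to scan time. A sketch with illustrative numbers for an F-18 tracer:

```python
C_tissue = 12.0e3      # ROI activity concentration, Bq/mL
A_injected = 350e6     # injected activity, Bq
weight_g = 75e3        # body weight, g (1 g ~ 1 mL of tissue assumed)
t_up, t_half = 60.0, 109.8   # uptake time and F-18 half-life, min

A_at_scan = A_injected * 0.5 ** (t_up / t_half)   # decay-corrected dose
suv = C_tissue / (A_at_scan / weight_g)
print(f"SUV ~ {suv:.1f}")
```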

  15. Quantitative analysis in megageomorphology

    NASA Technical Reports Server (NTRS)

    Mayer, L.

    1985-01-01

    Megageomorphology is the study of regional topographic features and their relations to independent geomorphic variables that operate at the regional scale. These independent variables can be classified as either tectonic or climatic in nature. Quantitative megageomorphology stresses the causal relations between plate tectonic factors and landscape features or correlations between climatic factors and geomorphic processes. In addition, the cumulative effect of tectonics and climate on landscape evolution, operating simultaneously in a complex system of energy transfer, is of interest. Regional topographic differentiation, say between continents and ocean floors, is largely the result of the different densities and density contrasts within the oceanic and continental lithosphere and their isostatic consequences. Regional tectonic processes that alter these lithospheric characteristics include rifting, collision, subduction, transpression and transtension.

  16. Thermomagnetic analysis of hydrogenated nickel

    NASA Astrophysics Data System (ADS)

    Tavares, S. S. M.; Miraglia, S.; Lafuente, A.; Fruchart, D.

    2002-04-01

    The effect of hydrogen inserted by electrolytic charging on the magnetic properties of nickel is discussed by taking into account thermomagnetic analysis (TMA), X-ray diffraction, and saturation magnetization results. After hydrogenation, thin foils of nickel presented a biphasic structure of metastable β-NiHx (x = 0.67±0.07) and α-Ni (with <0.03 at% H). During room-temperature aging the β-NiHx hydride decomposes into α-Ni and H2. The TMA heating curves obtained just after hydrogenation show two magnetic transitions, the first in the range 100-120 °C and the second at the Curie point of Ni. Between the first and second transitions an abrupt increase of magnetization is observed, due to the formation of more ferromagnetic nickel from the hydride decomposition. On the other hand, the first transition of the TMA curve can only be attributed to the ferromagnetism of some regions of the β phase.

  17. Software for quantitative trait analysis.

    PubMed

    Almasy, Laura; Warren, Diane M

    2005-09-01

    This paper provides a brief overview of software currently available for the genetic analysis of quantitative traits in humans. Programs that implement variance components, Markov Chain Monte Carlo (MCMC), Haseman-Elston (H-E) and penetrance model-based linkage analyses are discussed, as are programs for measured genotype association analyses and quantitative trait transmission disequilibrium tests. The software compared includes LINKAGE, FASTLINK, PAP, SOLAR, SEGPATH, ACT, Mx, MERLIN, GENEHUNTER, Loki, Mendel, SAGE, QTDT and FBAT. Where possible, the paper provides URLs for acquiring these programs through the internet, details of the platforms for which the software is available and the types of analyses performed.

  18. Software for quantitative trait analysis

    PubMed Central

    2005-01-01

    This paper provides a brief overview of software currently available for the genetic analysis of quantitative traits in humans. Programs that implement variance components, Markov Chain Monte Carlo (MCMC), Haseman-Elston (H-E) and penetrance model-based linkage analyses are discussed, as are programs for measured genotype association analyses and quantitative trait transmission disequilibrium tests. The software compared includes LINKAGE, FASTLINK, PAP, SOLAR, SEGPATH, ACT, Mx, MERLIN, GENEHUNTER, Loki, Mendel, SAGE, QTDT and FBAT. Where possible, the paper provides URLs for acquiring these programs through the internet, details of the platforms for which the software is available and the types of analyses performed. PMID:16197737

  19. Quantitative two-photon laser-induced fluorescence of hydrogen atoms in a 1 kW arcjet thruster

    NASA Astrophysics Data System (ADS)

    Wysong, I. J.; Pobst, J. A.

    1998-08-01

    Quantitative measurements of atomic hydrogen are reported for an arcjet thruster using two-photon laser-induced fluorescence. Number density, axial and radial velocity, and temperature of ground-state atomic hydrogen are obtained at the nozzle exit plane and in the downstream plume of a 1 kW arcjet operating on hydrogen propellant. Details of the technique and data analysis are provided. Comparisons with other related available data are made, as well as with several computational models. The observed dissociation fraction of 31% is significantly higher than predicted by the models.

  1. Skin moisturization by hydrogenated polyisobutene--quantitative and visual evaluation.

    PubMed

    Dayan, Nava; Sivalenka, Rajarajeswari; Chase, John

    2009-01-01

    Hydrogenated polyisobutene (HP) is used in topically applied cosmetic/personal care formulations as an emollient that leaves a pleasing skin feel when applied and rubbed in. This effect, although distinguishable to the user, is difficult to define and quantify. Recognizing that some of the physical properties of HP, such as film formation and wear resistance, may contribute to skin moisturization, we designed a short-term pilot study to follow changes in skin moisturization. HP's incorporation into an o/w emulsion at 8% yielded increased viscosity and reduced emulsion droplet size compared to the emollient ester CCT (capric/caprylic triglyceride) or a control formulation. Quantitative data indicate that application of the o/w emulsion formulation containing either HP or CCT significantly elevated skin moisture content and reduced transepidermal water loss (TEWL) by a maximum of approximately 33% relative to the control formulation within 3 h, and maintained this up to 6 h. Visual observation of skin treated with the HP-containing formulation showed fine texture and clear contrast compared to the control or CCT formulation, confirming this effect. As a result of increased hydration, skin conductivity, measured in terms of corneometer values, was also elevated significantly, by about tenfold as early as 20 min after HP or CCT application, and was maintained throughout the test period. Throughout the test period the HP formulation was 5-10% more effective than the CCT formulation, both in reduction of TEWL and in increased skin conductivity. Thus, compared to the emollient ester (CCT), HP showed a unique capability for a long-lasting effect in retaining moisture and improving skin texture.

  2. Task D: Hydrogen safety analysis

    SciTech Connect

    Swain, M.R.; Sievert, B.G.; Swain, M.N.

    1996-10-01

    This report covers two topics. The first is a review of codes, standards, regulations, recommendations, certifications, and pamphlets which address the safety of gaseous fuels. The second is an experimental investigation of hydrogen flame impingement. Four areas of concern in the conversion of natural gas safety publications to hydrogen safety publications are delineated. Two suggested design criteria for hydrogen vehicle fuel systems are proposed. It is concluded from the experimental work that lightweight, low-cost firewalls to resist hydrogen flame impingement are feasible.

  3. Quantitative analysis of glycated proteins.

    PubMed

    Priego-Capote, Feliciano; Ramírez-Boo, María; Finamore, Francesco; Gluck, Florent; Sanchez, Jean-Charles

    2014-02-07

    The proposed protocol presents a comprehensive approach for large-scale qualitative and quantitative analysis of glycated proteins (GP) in complex biological samples, including biological fluids and cell lysates such as plasma and red blood cells. The method, named glycation isotopic labeling (GIL), is based on the differential labeling of proteins with isotopic [13C6]-glucose, which supports quantitation of the resulting glycated peptides after enzymatic digestion with endoproteinase Glu-C. The key principle of the GIL approach is the detection of doublet signals for each glycated peptide in MS precursor scanning (glycated peptide with in vivo [12C6]- and in vitro [13C6]-glucose). The mass shift of the doublet signals is +6, +3, or +2 Da, depending on the peptide charge state and the number of glycation sites. The intensity ratio between doublet signals generates quantitative information on glycated proteins that can be related to the glycemic state of the studied samples. Tandem mass spectrometry with high-energy collisional dissociation (HCD-MS2) and data-dependent methods with collision-induced dissociation (CID-MS3 neutral loss scan) are used for qualitative analysis.
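
    The quoted +6/+3/+2 Da shifts follow from one relation: each glycation site differs by six 13C-12C mass units (~6.02 Da) between the light and heavy forms, so on the m/z axis the doublet is separated by 6n/z. A small helper making that explicit:

```python
DELTA_PER_SITE = 6.02013   # Da: 6 x (13C - 12C) mass difference per site

def doublet_spacing(n_sites: int, charge: int) -> float:
    """m/z separation between [12C6]- and [13C6]-glucose peptide signals."""
    return n_sites * DELTA_PER_SITE / charge

for n, z in [(1, 1), (1, 2), (1, 3), (2, 2)]:
    print(f"{n} site(s), charge {z}+: {doublet_spacing(n, z):.2f} m/z units")
```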

  4. Bioimaging for quantitative phenotype analysis.

    PubMed

    Chen, Weiyang; Xia, Xian; Huang, Yi; Chen, Xingwei; Han, Jing-Dong J

    2016-06-01

    With the development of bio-imaging techniques, an increasing number of studies apply these techniques to generate a myriad of image data. Its applications range from quantification of cellular, tissue, organismal and behavioral phenotypes of model organisms, to human facial phenotypes. The bio-imaging approaches to automatically detect, quantify, and profile phenotypic changes related to specific biological questions open new doors to studying phenotype-genotype associations and to precisely evaluating molecular changes associated with quantitative phenotypes. Here, we review major applications of bioimage-based quantitative phenotype analysis. Specifically, we describe the biological questions and experimental needs addressable by these analyses, computational techniques and tools that are available in these contexts, and the new perspectives on phenotype-genotype association uncovered by such analyses.

  5. Quantitative analysis of retinal OCT.

    PubMed

    Sonka, Milan; Abràmoff, Michael D

    2016-10-01

    Clinical acceptance of 3-D OCT retinal imaging brought rapid development of quantitative 3-D analysis of retinal layers, vasculature, retinal lesions as well as facilitated new research in retinal diseases. One of the cornerstones of many such analyses is segmentation and thickness quantification of retinal layers and the choroid, with an inherently 3-D simultaneous multi-layer LOGISMOS (Layered Optimal Graph Image Segmentation for Multiple Objects and Surfaces) segmentation approach being extremely well suited for the task. Once retinal layers are segmented, regional thickness, brightness, or texture-based indices of individual layers can be easily determined and thus contribute to our understanding of retinal or optic nerve head (ONH) disease processes and can be employed for determination of disease status, treatment responses, visual function, etc. Out of many applications, examples provided in this paper focus on image-guided therapy and outcome prediction in age-related macular degeneration and on assessing visual function from retinal layer structure in glaucoma.

  6. Chemical/hydrogen energy systems analysis

    NASA Astrophysics Data System (ADS)

    Beller, M.

    1982-12-01

    Four hydrogen energy technologies are addressed, including hydrogen recovery and separation using hydride technology, photochemical hydrogen production, and anode depolarization in electrolytic hydrogen production.

  7. Quantitative analysis of endogenous compounds.

    PubMed

    Thakare, Rhishikesh; Chhonker, Yashpal S; Gautam, Nagsen; Alamoudi, Jawaher Abdullah; Alnouti, Yazen

    2016-09-05

    Accurate quantitative analysis of endogenous analytes is essential for several clinical and non-clinical applications, and LC-MS/MS is the technique of choice for quantitative analyses. Absolute quantification by LC-MS requires preparing standard curves in the same matrix as the study samples, so that the matrix effect and the extraction efficiency for analytes are the same in both the standard and study samples. However, by definition, analyte-free biological matrices do not exist for endogenous compounds. To address the lack of blank matrices for the quantification of endogenous compounds by LC-MS/MS, four approaches are used: the standard addition, background subtraction, surrogate matrix, and surrogate analyte methods. This review article presents an overview of these approaches, cites and summarizes their applications, and compares their advantages and disadvantages. In addition, we discuss in detail the validation requirements and compatibility with FDA guidelines needed to ensure method reliability in quantifying endogenous compounds. The standard addition, background subtraction, and surrogate analyte approaches allow the use of the same matrix for the calibration curve as the one to be analyzed in the test samples. In the surrogate matrix approach, however, various matrices such as artificial, stripped, and neat matrices are used as surrogates for the actual matrix of the study samples. For the surrogate analyte approach, it is required to demonstrate similarity in matrix effect and recovery between the surrogate and authentic endogenous analytes. Similarly, for the surrogate matrix approach, it is required to demonstrate similar matrix effect and extraction recovery in both the surrogate and original matrices. All these methods represent indirect approaches to quantifying endogenous compounds, and regardless of which approach is followed, it has to be shown that none of the validation criteria have been compromised due to the indirect analyses.
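
    Of the four approaches, standard addition is the most compact to show: spike the sample itself with known analyte amounts, fit the response against the spike, and the endogenous concentration is the x-intercept magnitude, intercept/slope. A sketch with invented LC-MS/MS responses:

```python
import numpy as np

added = np.array([0.0, 5.0, 10.0, 20.0])        # spiked concentration, ng/mL
response = np.array([2.10, 3.05, 4.15, 6.20])   # peak area ratio vs. internal std

slope, intercept = np.polyfit(added, response, 1)
c_endogenous = intercept / slope                # x-intercept magnitude
print(f"endogenous concentration ~ {c_endogenous:.1f} ng/mL")
```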

  8. Prediction of psychotropic properties of lisuride hydrogen maleate by quantitative pharmaco-electroencephalogram.

    PubMed

    Itil, T M; Herrmann, W M; Akpinar, S

    1975-07-01

    Based on "quantitative pharmaco-EEG" using computer-analyzed EEG (CEEG) measurements, unknown CNS effects of lisuride hydrogen maleate (LHM) were established. CEEG profiles of LHM in low dosages (less than or equal to 10 mcg) are similar to CNS "inhibitory" compounds, while in higher dosages (25 mcg to 100 mcg) they resemble "psychostimulant" compounds. By measuring the brain function using computer period analysis of cerebral biopotentials, dose-efficacy relations were found (in the range of 25-75 mcg) which suggest the bioavailability of LHM at the CNS level. By comparing the CEEG profiles of LHM with the previously studied compounds, five different clinical uses of LHM were predicted. The pilot trials suggest that LHM may have therapeutic potentials in patients with "aging" and/or organic brain syndromes, and in children with behavioral disturbances.

  9. Quantitative Tools for Dissection of Hydrogen-Producing Metabolic Networks-Final Report

    SciTech Connect

    Rabinowitz, Joshua D.; Dismukes, G. Charles; Rabitz, Herschel A.; Amador-Noguez, Daniel

    2012-10-19

    During this project we have pioneered the development of integrated experimental-computational technologies for the quantitative dissection of metabolism in hydrogen- and biofuel-producing microorganisms (i.e., C. acetobutylicum and various cyanobacterial species). The application of these new methodologies resulted in many significant advances in the understanding of the metabolic networks and metabolism of these organisms, and has provided new strategies to enhance their hydrogen- or biofuel-producing capabilities. As an example, using mass spectrometry, isotope tracers, and quantitative flux modeling, we mapped the metabolic network structure in C. acetobutylicum. This resulted in a comprehensive and quantitative understanding of central carbon metabolism that could not have been obtained using genomic data alone. We discovered that biofuel production in this bacterium, which only occurs during stationary phase, requires a global remodeling of central metabolism (involving large changes in metabolite concentrations and fluxes) that has the effect of redirecting resources (carbon and reducing power) from biomass production into solvent production. This new holistic, quantitative understanding of metabolism is now being used as the basis for metabolic engineering strategies to improve solvent production in this bacterium. In another example, making use of newly developed technologies for monitoring hydrogen and NAD(P)H levels in vivo, we dissected the metabolic pathways for photobiological hydrogen production by the cyanobacterium Cyanothece sp. This investigation led to the identification of multiple targets for improving hydrogen production. Importantly, the quantitative tools and approaches that we have developed are broadly applicable and we are now using them to investigate other important biofuel producers, such as cellulolytic bacteria.
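
    The flux-modeling component lends itself to a compact illustration. The sketch below is a toy flux-balance calculation (not the authors' model): maximize a product flux subject to steady-state mass balance S.v = 0 and capacity bounds, solved as a linear program. The network and bounds are invented.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: v1 (substrate uptake) -> A, v2: A -> B, v3: B -> product.
# Rows = metabolites A and B; columns = reactions v1..v3.
S = np.array([[1.0, -1.0,  0.0],
              [0.0,  1.0, -1.0]])

c = [0.0, 0.0, -1.0]                  # linprog minimizes, so maximize v3
bounds = [(0.0, 10.0)] * 3            # uptake capped at 10 (arbitrary units)
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(res.x)                          # -> [10. 10. 10.] at steady state
```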

  10. Quantitative analysis of glycoprotein glycans.

    PubMed

    Orlando, Ron

    2013-01-01

    The ability to quantitatively determine changes in the N- and O-linked glycans is an essential component of comparative glycomics. Multiple strategies are available by which this can be accomplished, including both label-free approaches and isotopic labeling strategies. The focus of this chapter is to describe each of these approaches while providing insight into their strengths and weaknesses, so that glycomic investigators can make an educated choice of the strategy that is best suited for their particular application.

  11. Quantitative observations of hydrogen-induced, slow crack growth in a low alloy steel

    NASA Technical Reports Server (NTRS)

    Nelson, H. G.; Williams, D. P.

    1973-01-01

    Hydrogen-induced slow crack growth, da/dt, was studied in AISI-SAE 4130 low alloy steel in gaseous hydrogen and distilled water environments as a function of applied stress intensity, K, at various temperatures, hydrogen pressures, and alloy strength levels. At low values of K, da/dt was found to exhibit a strong exponential K dependence (Stage 1 growth) in both hydrogen and water. At intermediate values of K, da/dt exhibited a small but finite K dependence (Stage 2), with the Stage 2 slope being greater in hydrogen than in water. In hydrogen, at a constant K, the Stage 2 growth rate (da/dt)_2 varied inversely with alloy strength level and varied in essentially the same complex manner with temperature and hydrogen pressure as noted previously. The results of this study provide support for most of the qualitative predictions of the lattice decohesion theory as recently modified by Oriani. The lack of quantitative agreement between data and theory, and the inability of the theory to explain the observed pressure dependence of slow crack growth, are mentioned, and possible rationalizations to account for these differences are presented.
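
    A Stage 1 law of the form da/dt = A*exp(B*K) can be recovered from measurements by a log-linear fit; a minimal sketch with invented data points (not the paper's data) follows.

```python
import numpy as np

K = np.array([20.0, 25.0, 30.0, 35.0])       # stress intensity (hypothetical)
dadt = np.array([1e-8, 8e-8, 6e-7, 5e-6])    # crack growth rate, m/s (hypothetical)

# ln(da/dt) = ln(A) + B*K  ->  ordinary linear regression on the log
B, lnA = np.polyfit(K, np.log(dadt), 1)
print(f"A = {np.exp(lnA):.2e} m/s, B = {B:.3f} per unit K")
```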

  12. The hydrogen abundance in stars: a first major step for quantitative astrophysics

    NASA Astrophysics Data System (ADS)

    Cenadelli, Davide

    2008-07-01

    Historiography has recognized that Saha's work in the early 1920s was the beginning of a quantitative era in astrophysics, and the deduction of the large hydrogen abundance in stars around 1930 was a major outcome of Saha's theory. In this paper, the development of stellar physics in these years is analysed, and the recognition of the hydrogen abundance is pointed out as the first major achievement of the quantitative era. This idea is sustained from two different points of view. First, there exists a tight scientific continuity from Saha's investigative papers up to Russell's 1929 paper, where the hydrogen abundance was clearly worked out: the whole of the 1920s should therefore be considered as a scientific continuum that paved the way for modern stellar spectroscopy. Second, in 1932 the same conclusion was reached by Strömgren and Eddington, who were working on the problem of internal stellar structure. Thus, the hydrogen abundance can be viewed as the first major step of the quantitative era, as it led to the first sound theory of stellar structure, both for the inner and the surface regions of stars.

  13. Quantitative Analysis of Face Symmetry.

    PubMed

    Tamir, Abraham

    2015-06-01

    The major objective of this article was to report quantitatively the degree of human face symmetry for images taken from the Internet. From the original image of a certain person that appears in the center of each triplet, 2 symmetric combinations were constructed, based on the left part of the image and its mirror image (left-left) and on the right part of the image and its mirror image (right-right). By applying computer software that enables determination of the length, surface area, and perimeter of any geometric shape, the following measurements were obtained for each triplet: face perimeter and area; distance between the pupils; mouth length, perimeter, and area; nose length and face length, usually below the ears; as well as the area and perimeter of the pupils. Then, for each of the above measurements, the value C, which characterizes the degree of symmetry of the real image with respect to the combinations right-right and left-left, was calculated. C appears on the right-hand side below each image. A high value of C indicates low symmetry, and as the value decreases, the symmetry increases. The magnitude on the left relates to the pupils and compares the difference between the area and perimeter of the 2 pupils. The major conclusion arrived at here is that the human face is asymmetric to some degree; the degree of asymmetry is reported quantitatively under each portrait.
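
    The left-left and right-right composites described above are straightforward to build; the sketch below constructs them with NumPy and scores asymmetry with a simple normalized pixel difference. The index here is an assumed stand-in, not the article's exact C definition, and larger values mean lower symmetry, matching the sense of C.

```python
import numpy as np

def composites(img):
    """img: 2-D grayscale face image with the facial midline at the
    center column; returns the left-left and right-right combinations."""
    half = img.shape[1] // 2
    left, right = img[:, :half], img[:, -half:]
    left_left = np.hstack([left, left[:, ::-1]])      # left + mirrored left
    right_right = np.hstack([right[:, ::-1], right])  # mirrored right + right
    return left_left, right_right

def asymmetry_index(img):
    ll, rr = composites(img)
    # Mean absolute difference between the two composites
    return float(np.abs(ll.astype(float) - rr.astype(float)).mean())
```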

  14. Guide for Hydrogen Hazards Analysis on Components and Systems

    NASA Technical Reports Server (NTRS)

    Beeson, Harold; Woods, Stephen

    2003-01-01

    The physical and combustion properties of hydrogen give rise to hazards that must be considered when designing and operating a hydrogen system. One of the major concerns in the use of hydrogen is that of fire or detonation because of hydrogen's wide flammability range, low ignition energy, and high flame speed. Other concerns include the contact and interaction of hydrogen with materials, such as the hydrogen embrittlement of materials and the formation of hydrides. The low temperatures of liquid and slush hydrogen bring other concerns related to material compatibility and pressure control; this is especially important when dissimilar, adjoining materials are involved. The potential hazards arising from these properties and design features necessitate a proper hydrogen hazards analysis before introducing a material, component, or system into hydrogen service. The objective of this guide is to describe the NASA Johnson Space Center White Sands Test Facility hydrogen hazards analysis method that should be performed before hydrogen is used in components and/or systems. The method is consistent with standard practices for analyzing hazards. It is recommended that this analysis be made before implementing a hydrogen component qualification procedure. A hydrogen hazards analysis is a useful tool for hydrogen-system designers, system and safety engineers, and facility managers. A hydrogen hazards analysis can identify problem areas before hydrogen is introduced into a system, preventing damage to hardware, delay or loss of mission or objective, and possible injury or loss of life.

  15. Comparison of Hydrogen Sulfide Analysis Techniques

    ERIC Educational Resources Information Center

    Bethea, Robert M.

    1973-01-01

    A summary and critique of common methods of hydrogen sulfide analysis is presented. Procedures described are: reflectance from silver plates and lead acetate-coated tiles, lead acetate and mercuric chloride paper tapes, sodium nitroprusside and methylene blue wet chemical methods, infrared spectrophotometry, and gas chromatography. (BL)

  17. Quantitative analysis of qualitative images

    NASA Astrophysics Data System (ADS)

    Hockney, David; Falco, Charles M.

    2005-03-01

    We show optical evidence that demonstrates artists as early as Jan van Eyck and Robert Campin (c1425) used optical projections as aids for producing their paintings. We also have found optical evidence within works by later artists, including Bermejo (c1475), Lotto (c1525), Caravaggio (c1600), de la Tour (c1650), Chardin (c1750) and Ingres (c1825), demonstrating a continuum in the use of optical projections by artists, along with an evolution in the sophistication of that use. However, even for paintings where we have been able to extract unambiguous, quantitative evidence of the direct use of optical projections for producing certain of the features, this does not mean that paintings are effectively photographs. Because the hand and mind of the artist are intimately involved in the creation process, understanding these complex images requires more than can be obtained from only applying the equations of geometrical optics.

  18. Screening analysis of solar thermochemical hydrogen concepts.

    SciTech Connect

    Diver, Richard B., Jr.; Kolb, Gregory J.

    2008-03-01

    A screening analysis was performed to identify concentrating solar power (CSP) concepts that produce hydrogen with the highest efficiency. Several CSP concepts were identified that have the potential to be much more efficient than today's low-temperature electrolysis technology. They combine a central receiver or dish with either a thermochemical cycle or a high-temperature electrolyzer operating at temperatures >600 °C. The solar-to-hydrogen efficiencies of the best central receiver concepts exceed 20%, significantly better than the 14% value predicted for low-temperature electrolysis.
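
    Solar-to-hydrogen efficiency in the sense used here is hydrogen energy out (on a lower-heating-value basis) divided by solar energy in; a back-of-envelope check with purely illustrative numbers, not figures from the report:

```python
LHV_H2 = 120e6        # J/kg, lower heating value of hydrogen
h2_rate = 0.010       # kg/s of hydrogen produced (hypothetical plant)
solar_in = 5.5e6      # W of concentrated solar input (hypothetical)

eta = h2_rate * LHV_H2 / solar_in
print(f"solar-to-hydrogen efficiency: {eta:.1%}")   # ~21.8%
```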

  19. Sensitivity analysis in quantitative microbial risk assessment.

    PubMed

    Zwietering, M H; van Gerwen, S J

    2000-07-15

    The occurrence of foodborne disease remains a widespread problem in both the developing and the developed world. A systematic and quantitative evaluation of food safety is important to control the risk of foodborne diseases. World-wide, many initiatives are being taken to develop quantitative risk analysis. However, the quantitative evaluation of food safety in all its aspects is very complex, especially since in many cases specific parameter values are not available. Often many variables have large statistical variability while the quantitative effect of various phenomena is unknown. Therefore, sensitivity analysis can be a useful tool to determine the main risk-determining phenomena, as well as the aspects that mainly determine the inaccuracy in the risk estimate. This paper presents three stages of sensitivity analysis. First, deterministic analysis selects the most relevant determinants for risk. Overlooking of exceptional, but relevant cases is prevented by a second, worst-case analysis. This analysis finds relevant process steps in worst-case situations, and shows the relevance of variations of factors for risk. The third, stochastic analysis, studies the effects of variations of factors for the variability of risk estimates. Care must be taken that the assumptions made as well as the results are clearly communicated. Stochastic risk estimates are, like deterministic ones, just as good (or bad) as the available data, and the stochastic analysis must not be used to mask lack of information. Sensitivity analysis is a valuable tool in quantitative risk assessment by determining critical aspects and effects of variations.
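
    The first, deterministic stage can be as simple as one-at-a-time variation of each input over its plausible range to see which factor moves the risk estimate most. The sketch below uses an invented exponential dose-response model and invented ranges, not a model from the paper.

```python
# Hypothetical single-hit dose-response model: P(ill) given dose,
# per-organism infectivity r, and number of servings.
def risk(dose, r, servings):
    return 1.0 - (1.0 - r) ** (dose * servings)

baseline = dict(dose=100.0, r=1e-4, servings=1.0)
ranges = {"dose": (10.0, 1000.0), "r": (1e-5, 1e-3), "servings": (1.0, 10.0)}

# Vary one factor at a time; wide output spans flag risk-determining inputs
for name, (lo, hi) in ranges.items():
    low, high = (risk(**{**baseline, name: v}) for v in (lo, hi))
    print(f"{name:9s} risk range: {low:.2e} .. {high:.2e}")
```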

  20. Quantitative histogram analysis of images

    NASA Astrophysics Data System (ADS)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

    A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram, and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed.
    Program summary -- Title of program: HAWGC. Catalogue identifier: ADXG_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computers: Mobile Intel Pentium III, AMD Duron. Installations: no installation necessary; executable file together with necessary files for LabVIEW Run-time engine. Operating systems under which the program has been tested: Windows ME/2000/XP. Programming language used: LabVIEW 7.0. Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for
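
    A compact Python analogue (assumed, not the LabVIEW original) of the conversion and histogram statistics the program reports might look as follows; the coefficient defaults and threshold behaviour are illustrative choices.

```python
import numpy as np
from scipy import stats

def to_gray(rgb, coeffs=(0.299, 0.587, 0.114)):
    """RGB -> intensity-linear greyscale with selectable coefficients."""
    return np.tensordot(rgb[..., :3].astype(float), coeffs, axes=([-1], [0]))

def histogram_stats(gray, lower=0, upper=255):
    """Statistics over pixels within [lower, upper] threshold levels."""
    px = gray[(gray >= lower) & (gray <= upper)].ravel()
    return {"mean": px.mean(), "std": px.std(ddof=1),
            "variance": px.var(ddof=1), "min": px.min(), "max": px.max(),
            "mode": int(np.bincount(px.astype(int)).argmax()),
            "skewness": float(stats.skew(px)),
            "kurtosis": float(stats.kurtosis(px)),
            "median": float(np.median(px))}
```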

  1. Thermal Analysis of Cryogenic Hydrogen Liquid Separator

    NASA Technical Reports Server (NTRS)

    Congiardo, Jared F.; Fortier, Craig R. (Editor)

    2014-01-01

    During launch of the new Space Launch System (SLS), liquid hydrogen is bled through the engines during replenish, pre-press, and extended pre-press to condition the engines prior to launch. The predicted bleed flow rates are larger than for the Shuttle program. A consequence of the increased flow rates is having liquid hydrogen in the vent system, which the facility was never designed to handle. To remedy the problem, a liquid separator is being designed into the system to accumulate the liquid propellant and protect the facility flare stack (which can only handle gas). The attached document is a presentation of the current thermal/fluid analysis performed for the separator and will be presented at the Thermal and Fluid Analysis Workshop (a NASA workshop) next week in Cleveland, Ohio.

  2. Analysis of Hydrogen Production from Renewable Electricity Sources: Preprint

    SciTech Connect

    Levene, J. I.; Mann, M. K.; Margolis, R.; Milbrandt, A.

    2005-09-01

    To determine the potential for hydrogen production via renewable electricity sources, three aspects of the system are analyzed: a renewable hydrogen resource assessment, a cost analysis of hydrogen production via electrolysis, and the annual energy requirements of producing hydrogen for refueling. The results indicate that ample resources exist to produce transportation fuel from wind and solar power. However, hydrogen prices are highly dependent on electricity prices.

  3. Mobile app-based quantitative scanometric analysis.

    PubMed

    Wong, Jessica X H; Liu, Frank S F; Yu, Hua-Zhong

    2014-12-16

    The feasibility of using smartphones and other mobile devices as the detection platform for quantitative scanometric assays is demonstrated. The different scanning modes (color, grayscale, black/white) and grayscale conversion protocols (average, weighted average/luminosity, and software specific) have been compared in determining the optical darkness ratio (ODR) values, a conventional quantitation measure for scanometric assays. A mobile app was developed to image and analyze scanometric assays, as demonstrated by paper-printed tests and a biotin-streptavidin assay on a plastic substrate. Primarily for ODR analysis, the app has been shown to perform as well as a traditional desktop scanner, demonstrating that smartphones (and other mobile devices) promise to be a practical platform for accurate, quantitative chemical analysis and medical diagnostics.
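
    The two grayscale conversion protocols compared above differ only in their channel weights; a minimal sketch follows, together with one plausible ODR definition (the paper's exact formula is not quoted here, so treat `odr` as an assumption).

```python
import numpy as np

def to_gray(rgb, mode="luminosity"):
    r, g, b = (rgb[..., i].astype(float) for i in range(3))
    if mode == "average":
        return (r + g + b) / 3.0
    if mode == "luminosity":             # weighted average
        return 0.299 * r + 0.587 * g + 0.114 * b
    raise ValueError(f"unknown mode: {mode}")

def odr(spot, background):
    """Darkness of a test spot relative to local background;
    darker spots give larger values (assumed ODR definition)."""
    return float((background.mean() - spot.mean()) / background.mean())
```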

  4. Quantitative WDS analysis using electron probe microanalyzer

    SciTech Connect

    Ul-Hamid, Anwar; Tawancy, Hani M.; Mohammed, Abdul-Rashid I.; Al-Jaroudi, Said S.; Abbas, Nureddin M.

    2006-04-15

    In this paper, the procedure for conducting quantitative elemental analysis by the ZAF correction method using wavelength dispersive X-ray spectroscopy (WDS) in an electron probe microanalyzer (EPMA) is elaborated. Analysis of a thermal barrier coating (TBC) system formed on a Ni-based single crystal superalloy is presented as an example to illustrate the analysis of samples consisting of a large number of major and minor elements. The analysis was performed using known standards and measured peak-to-background intensity ratios. The procedure for using separate sets of acquisition conditions for major and minor element analysis is explained and its importance is stressed.

  5. Quantitative dissection of hydrogen bond-mediated proton transfer in the ketosteroid isomerase active site

    PubMed Central

    Sigala, Paul A.; Fafarman, Aaron T.; Schwans, Jason P.; Fried, Stephen D.; Fenn, Timothy D.; Caaveiro, Jose M. M.; Pybus, Brandon; Ringe, Dagmar; Petsko, Gregory A.; Boxer, Steven G.; Herschlag, Daniel

    2013-01-01

    Hydrogen bond networks are key elements of protein structure and function but have been challenging to study within the complex protein environment. We have carried out in-depth interrogations of the proton transfer equilibrium within a hydrogen bond network formed to bound phenols in the active site of ketosteroid isomerase. We systematically varied the proton affinity of the phenol using differing electron-withdrawing substituents and incorporated site-specific NMR and IR probes to quantitatively map the proton and charge rearrangements within the network that accompany incremental increases in phenol proton affinity. The observed ionization changes were accurately described by a simple equilibrium proton transfer model that strongly suggests the intrinsic proton affinity of one of the Tyr residues in the network, Tyr16, does not remain constant but rather systematically increases due to weakening of the phenol–Tyr16 anion hydrogen bond with increasing phenol proton affinity. Using vibrational Stark spectroscopy, we quantified the electrostatic field changes within the surrounding active site that accompany these rearrangements within the network. We were able to model these changes accurately using continuum electrostatic calculations, suggesting a high degree of conformational restriction within the protein matrix. Our study affords direct insight into the physical and energetic properties of a hydrogen bond network within a protein interior and provides an example of a highly controlled system with minimal conformational rearrangements in which the observed physical changes can be accurately modeled by theoretical calculations. PMID:23798390

  6. Seniors' Online Communities: A Quantitative Content Analysis

    ERIC Educational Resources Information Center

    Nimrod, Galit

    2010-01-01

    Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…

  8. Quantitative analysis of arm movement smoothness

    NASA Astrophysics Data System (ADS)

    Szczesna, Agnieszka; Błaszczyszyn, Monika

    2017-07-01

    The paper deals with the problem of quantitative smoothness analysis of motion data. We investigated values of movement unit, fluidity, and jerk for the healthy and paralyzed arms of patients with hemiparesis after stroke. Patients were performing a drinking task. To validate the approach, the movements of 24 patients were captured using an optical motion capture system.
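
    Jerk-based smoothness indices of the kind named above are typically computed from the third derivative of position; the sketch below implements the standard dimensionless jerk (smaller is smoother) with finite differences. This is a generic formulation, not necessarily the exact metric used by the authors.

```python
import numpy as np

def dimensionless_jerk(pos, dt):
    """pos: 1-D position samples of one movement; dt: sampling interval (s)."""
    vel = np.gradient(pos, dt)
    acc = np.gradient(vel, dt)
    jerk = np.gradient(acc, dt)
    duration = dt * (len(pos) - 1)
    amplitude = np.ptp(pos)              # movement extent
    # Integrated squared jerk, made dimensionless by duration^5 / amplitude^2
    return np.trapz(jerk ** 2, dx=dt) * duration ** 5 / amplitude ** 2
```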

  9. A quantitative approach to scar analysis.

    PubMed

    Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia

    2011-02-01

    Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology.
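
    The fractal-dimension half of the method reduces to box counting on a binarized image; a simplified sketch follows (the paper pairs this with lacunarity analysis, which is omitted here for brevity).

```python
import numpy as np

def box_count_dimension(mask, sizes=(2, 4, 8, 16, 32, 64)):
    """mask: 2-D boolean image (True = collagen signal)."""
    counts = []
    for s in sizes:
        h, w = (d - d % s for d in mask.shape)        # trim to multiples of s
        blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())  # occupied s-by-s boxes
    # N(s) ~ s^(-D): D is the slope of log N against log(1/s)
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope
```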

  11. Final Report: Hydrogen Storage System Cost Analysis

    SciTech Connect

    James, Brian David; Houchins, Cassidy; Huya-Kouadio, Jennie Moton; DeSantis, Daniel A.

    2016-09-30

    The Fuel Cell Technologies Office (FCTO) has identified hydrogen storage as a key enabling technology for advancing hydrogen and fuel cell power technologies in transportation, stationary, and portable applications. Consequently, FCTO has established targets to chart the progress of developing and demonstrating viable hydrogen storage technologies for transportation and stationary applications. This cost assessment project supports the overall FCTO goals by identifying the current technology system components, performance levels, and manufacturing/assembly techniques most likely to lead to the lowest system storage cost. Furthermore, the project forecasts the cost of these systems at a variety of annual manufacturing rates to allow comparison to the overall 2017 and “Ultimate” DOE cost targets. The cost breakdown of the system components and manufacturing steps can then be used to guide future research and development (R&D) decisions. The project was led by Strategic Analysis Inc. (SA) and aided by Rajesh Ahluwalia and Thanh Hua from Argonne National Laboratory (ANL) and Lin Simpson at the National Renewable Energy Laboratory (NREL). Since SA coordinated the project activities of all three organizations, this report includes a technical description of all project activity. This report represents a summary of contract activities and findings under SA’s five year contract to the US Department of Energy (Award No. DE-EE0005253) and constitutes the “Final Scientific Report” deliverable. Project publications and presentations are listed in the Appendix.

  12. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  13. Quantitative ADF STEM: acquisition, analysis and interpretation

    NASA Astrophysics Data System (ADS)

    Jones, L.

    2016-01-01

    Quantitative annular dark-field scanning transmission electron microscopy (ADF STEM), where image intensities are used to provide composition and thickness measurements, has enjoyed a renaissance during the last decade. Now, in a post-aberration-correction era, many aspects of the technique are being revisited. Here the recent progress and emerging best practice for such aberration-corrected quantitative ADF STEM are discussed, including issues relating to the proper acquisition of experimental data and its calibration, approaches for data analysis, the utility of such data, its interpretation, and limitations.

  14. Quantitative analysis of blood vessel geometry

    NASA Astrophysics Data System (ADS)

    Fuhrman, Michael G.; Abdul-Karim, Othman; Shah, Sujal; Gilbert, Steven G.; Van Bibber, Richard

    2001-07-01

    Re-narrowing or restenosis of a human coronary artery occurs within six months in one third of balloon angioplasty procedures. Accurate and repeatable quantitative analysis of vessel shape is important to characterize the progression and type of restenosis, and to evaluate the effects new therapies might have. A combination of complicated geometry and image variability, together with the need for high resolution and large image size, makes visual/manual analysis slow, difficult, and prone to error. The image processing and analysis described here were developed to automate feature extraction of the lumen, internal elastic lamina, neointima, external elastic lamina, and tunica adventitia, and to enable an objective, quantitative definition of blood vessel geometry. The quantitative geometrical analysis enables the measurement of several features, including perimeter, area, and other metrics of vessel damage. Automation of feature extraction creates a high-throughput capability that enables analysis of serial sections for more accurate measurement of restenosis dimensions. Measurement results are input into a relational database where they can be statistically analyzed and compared across studies. As part of the integrated process, results are also imprinted on the images themselves to facilitate auditing of the results. The analysis is fast, repeatable, and accurate while allowing the pathologist to control the measurement process.
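
    For the perimeter and area measurements named above, a traced boundary contour is all that is needed; a minimal shoelace-formula sketch (illustrative, not the authors' pipeline) follows.

```python
import numpy as np

def area_perimeter(xy):
    """xy: (N, 2) array of boundary points, ordered around the contour."""
    x, y = xy[:, 0], xy[:, 1]
    xn, yn = np.roll(x, -1), np.roll(y, -1)      # next vertex (wraps around)
    area = 0.5 * abs(np.sum(x * yn - xn * y))    # shoelace formula
    perimeter = np.sum(np.hypot(xn - x, yn - y))
    return area, perimeter
```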

  15. Comprehensive Quantitative Analysis on Privacy Leak Behavior

    PubMed Central

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046

  17. High-energy PIXE: quantitative analysis

    NASA Astrophysics Data System (ADS)

    Denker, A.; Opitz-Coutureau, J.; Campbell, J. L.; Maxwell, J. A.; Hopman, T.

    2004-06-01

    In recent years, high-energy PIXE was applied successfully for qualitative analysis on art and archaeological objects, e.g. coins, bronzes, sculptures, brooches. However, in the absence of software for quantitative analysis the full benefit inherent in the PIXE technique was not obtained. For example, a bronze could easily be distinguished from a brass, but the concentrations could not be rigorously compared within a set of bronzes. In this paper, the first quantitative analysis by high-energy PIXE is presented. The Guelph PIXE Software Package GUPIX has been extended to proton energies up to 100 MeV, so that high-energy PIXE spectra can be evaluated and concentrations derived. Measurements on metal and alloy standards at two different proton energies have been performed and the obtained compositions were compared to the certified values. The results will be presented and deviations discussed.

  18. Quantitative analysis of colony morphology in yeast.

    PubMed

    Ruusuvuori, Pekka; Lin, Jake; Scott, Adrian C; Tan, Zhihao; Sorsa, Saija; Kallio, Aleksi; Nykter, Matti; Yli-Harja, Olli; Shmulevich, Ilya; Dudley, Aimée M

    2014-01-01

    Microorganisms often form multicellular structures such as biofilms and structured colonies that can influence the organism's virulence, drug resistance, and adherence to medical devices. Phenotypic classification of these structures has traditionally relied on qualitative scoring systems that limit detailed phenotypic comparisons between strains. Automated imaging and quantitative analysis have the potential to improve the speed and accuracy of experiments designed to study the genetic and molecular networks underlying different morphological traits. For this reason, we have developed a platform that uses automated image analysis and pattern recognition to quantify phenotypic signatures of yeast colonies. Our strategy enables quantitative analysis of individual colonies, measured at a single time point or over a series of time-lapse images, as well as the classification of distinct colony shapes based on image-derived features. Phenotypic changes in colony morphology can be expressed as changes in feature space trajectories over time, thereby enabling the visualization and quantitative analysis of morphological development. To facilitate data exploration, results are plotted dynamically through an interactive Yeast Image Analysis web application (YIMAA; http://yimaa.cs.tut.fi) that integrates the raw and processed images across all time points, allowing exploration of the image-based features and principal components associated with morphological development.

  19. Quantitative laser atom probe analyses of hydrogenation-disproportionated Nd-Fe-B powders.

    PubMed

    Sepehri-Amin, H; Ohkubo, T; Nishiuchi, T; Hirosawa, S; Hono, K

    2011-05-01

    We report a successful atom probe tomography of hydrides in hydrogenation-disproportionated Nd-Fe-B powder using a green femtosecond laser. The atom probe specimens were prepared from one particle of powder using the focused ion beam lift-out method. The atom probe tomography taken from an α-Fe/NdH(2) structure suggested that B and Ga (trace added element) were partitioned in the NdH(2) phase. The hydrogen concentration of 64 at% determined from the atom probe analysis was in excellent agreement with the stoichiometry of the NdH(2) phase.
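
    For reference, the stoichiometric value follows directly from the formula unit: NdH2 contains two hydrogen atoms out of three atoms in total, i.e. 2/3 ≈ 66.7 at% hydrogen, so the measured 64 at% lies within a few atomic percent of the ideal hydride composition.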

  20. Good practices for quantitative bias analysis.

    PubMed

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

    Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis? We hope that our guide to good practices for conducting and presenting bias analyses will encourage
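
    One of the simplest bias models of the kind discussed above corrects exposure misclassification given assumed sensitivity and specificity of classification; the sketch below (with invented counts and bias parameters) back-calculates the "true" counts from the observed ones.

```python
def adjust_exposure(a_obs, b_obs, se=0.90, sp=0.95):
    """Back-calculate true exposed/unexposed counts from observed counts,
    given assumed classification sensitivity (se) and specificity (sp):
    a_obs = se*A + (1 - sp)*(N - A)  =>  A = (a_obs - (1-sp)*N) / (se - (1-sp))
    """
    n = a_obs + b_obs
    a_true = (a_obs - (1.0 - sp) * n) / (se - (1.0 - sp))
    return a_true, n - a_true

print(adjust_exposure(120, 880))   # -> (~82.4, ~917.6) under these parameters
```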

  1. Diffusion Analysis Of Hydrogen-Desorption Measurements

    NASA Technical Reports Server (NTRS)

    Danford, Merlin D.

    1988-01-01

    Distribution of hydrogen in metal explains observed desorption rate. Report describes application of diffusion theory to analysis of experimental data on uptake and elimination of hydrogen in high-strength alloys at 25 °C. Study is part of program aimed at understanding embrittlement of metals by hydrogen. Two nickel-base alloys, Rene 41 and Waspaloy, and one ferrous alloy, 4340 steel, were studied. Desorption of hydrogen is explained by distribution of hydrogen in metal. "Fast" hydrogen apparently not due to formation of hydrides on and below surface as previously proposed.
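
    The diffusion-theory fit implied here typically rests on the standard Fickian series solution for desorption from a plane sheet; a minimal sketch of the fraction of hydrogen remaining follows (generic textbook solution, not the report's code).

```python
import numpy as np

def fraction_remaining(t, D, L, n_terms=50):
    """Hydrogen fraction still in a plate of half-thickness L after
    desorption time t, for diffusivity D (uniform initial content)."""
    n = np.arange(n_terms)
    k = 2 * n + 1
    return np.sum(8.0 / (np.pi ** 2 * k ** 2)
                  * np.exp(-D * (k * np.pi / (2 * L)) ** 2 * t))
```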

  2. Quantitative analysis to guide orphan drug development.

    PubMed

    Lesko, L J

    2012-08-01

    The development of orphan drugs for rare diseases has made impressive strides in the past 10 years. There has been a surge in orphan drug designations, but new drug approvals have not kept up. This article presents a three-pronged hierarchical strategy for quantitative analysis of data at the descriptive, mechanistic, and systems levels of the biological system that could represent a standardized and rational approach to orphan drug development. Examples are provided to illustrate the concept.

  3. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity in modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
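
    A toy prioritization in the spirit of the framework might combine the three metrics as follows; the scenarios, scales, and scoring rule are invented placeholders, not the authors' scheme.

```python
# Severity, likelihood, and modeling difficulty on 1-5 scales (all invented)
scenarios = [
    ("wake encounter on parallel approach", 4, 3, 2),
    ("missed-approach conflict",            5, 2, 4),
    ("runway incursion",                    5, 1, 1),
]

# Favor severe, likely scenarios that are comparatively easy to model
for name, sev, lik, diff in sorted(
        scenarios, key=lambda s: s[1] * s[2] / s[3], reverse=True):
    print(f"{name:38s} priority = {sev * lik / diff:.2f}")
```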

  5. Influence analysis in quantitative trait loci detection

    PubMed Central

    Dou, Xiaoling; Kuriki, Satoshi; Maeno, Akiteru; Takada, Toyoyuki; Shiroishi, Toshihiko

    2014-01-01

    This paper presents systematic methods for the detection of influential individuals that affect the log odds (LOD) score curve. We derive general formulas of influence functions for profile likelihoods and introduce them into two standard quantitative trait locus detection methods—the interval mapping method and single marker analysis. Besides influence analysis on specific LOD scores, we also develop influence analysis methods on the shape of the LOD score curves. A simulation-based method is proposed to assess the significance of the influence of the individuals. These methods are shown useful in the influence analysis of a real dataset of an experimental population from an F2 mouse cross. By receiver operating characteristic analysis, we confirm that the proposed methods show better performance than existing diagnostics. PMID:24740424

  7. Quantitative resilience analysis through control design.

    SciTech Connect

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U.S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience analysis through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  8. An Analysis of NTSC's Timekeeping Hydrogen Masers

    NASA Astrophysics Data System (ADS)

    Hui-jie, Song; Shao-wu, Dong; Zheng-ming, Wang; Li-li, Qu; Yue-juan, Jing; Wei, Li

    2016-10-01

    In this article, the hydrogen masers in the NTSC (National Time Service Center) timekeeping laboratory are tested. In order to avoid the impact of the larger noise of caesium atomic clocks, TA(k) or UTC(k) is not used as the reference; instead, the four hydrogen masers are mutually referenced and tested. The frequency stability of the hydrogen masers is analyzed using the four-cornered hat method, and the Allan standard deviations of each single hydrogen maser at different sample times are estimated. Then, according to the characteristics of hydrogen masers, the data are detrended, outliers are excluded, and the series is smoothed to separate the Gaussian noise of each hydrogen maser, which is finally assessed with the Kolmogorov-Smirnov test.
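
    The two core computations named above, Allan variance from phase data and the N-cornered hat separation of individual-clock variances from pairwise comparisons, can be sketched compactly. This is a simplified illustration under the usual uncorrelated-clock assumption, not NTSC's production code.

```python
import numpy as np

def allan_var(phase, tau0, m):
    """Overlapping Allan variance at averaging time m*tau0, from phase
    (time-difference) data of one clock pair, sampled every tau0 seconds."""
    x = np.asarray(phase, float)
    d = x[2 * m:] - 2.0 * x[m:-m] + x[:-2 * m]
    return np.mean(d ** 2) / (2.0 * (m * tau0) ** 2)

def cornered_hat(pairwise, n_clocks):
    """Solve s_ij = v_i + v_j (uncorrelated clocks) for each clock's
    variance v_i, given {(i, j): measured pairwise Allan variance}."""
    rows, rhs = [], []
    for (i, j), s in pairwise.items():
        row = np.zeros(n_clocks)
        row[i] = row[j] = 1.0
        rows.append(row)
        rhs.append(s)
    v, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return v   # per-clock Allan variances
```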

  9. Analysis of NTSC's Timekeeping Hydrogen Masers

    NASA Astrophysics Data System (ADS)

    Song, H. J.; Dong, S. W.; Wang, Z. M.; Qu, L. L.; Jing, Y. J.; Li, W.

    2015-11-01

    In this article, the hydrogen masers in the NTSC (National Time Service Center) timekeeping laboratory were tested. In order to avoid the impact of the larger noise of caesium atomic clocks, TA(k) or UTC(k) was not used as the reference; the four hydrogen masers were instead mutually referenced and tested. The frequency stabilities of the hydrogen masers were analyzed using the four-cornered hat method, and the Allan standard deviation of each single hydrogen maser was estimated at different sampling times. Then, according to the characteristics of hydrogen masers, the data were detrended, outliers excluded, and the series smoothed with mathematical methods to separate the Gaussian noise of the hydrogen masers; finally, each single hydrogen maser's Gaussian noise was estimated through the Kolmogorov-Smirnov normality test.

  10. Quantitative petrostructure analysis. Technical summary report

    SciTech Connect

    Warren, N.

    1980-09-01

    The establishment of quantitative techniques would lead to the development of predictive tools which would be of obvious importance in applied geophysics and engineering. In rock physics, it would help establish laws for averaging the effects of finite densities of real cracks and pores. It would also help in elucidating the relation between observed complex crack structures and various models for the mechanical properties of single cracks. The petrostructure study is addressed to this problem. The purpose of the effort is to quantitatively characterize the mineral and crack texture of granitic rock samples. The rock structures are to be characterized in such a way that the results can be used (1) to constrain the modelling of the effect of cracks on the physical properties of rocks, and (2) to test the possibility of establishing quantitative and predictive relations between petrographic observables and whole rock properties. Statistical techniques are being developed and being applied to the problem of parameterizing complex texture and crack patterns of rock, and of measuring correlation of these parameters to other measurable variables. The study is an application in factor analysis.

  11. Quantitative textural analysis of phenocryst zoning patterns

    NASA Astrophysics Data System (ADS)

    Niespolo, E.; Andrews, B. J.

    2011-12-01

    The textural complexity of phenocrysts has made quantitative analysis of large populations of crystals a challenging study. Because each phenocryst expresses a unique localized event in the volcanic interior, no single crystal necessarily records the complete pre-eruptive history of the magmatic system as a whole. Synthesizing the textural and compositional records of many crystals, however, should provide a more complete understanding of conditions prior to eruption. In this research, we present new techniques for quantitative analysis of individual crystals and across populations of crystals. We apply those techniques to back-scattered electron images of complexly zoned plagioclase from El Chichón volcano, Mexico. Analysis begins with Gaussian filtering to remove noise from the images and create more qualitatively distinct zoning patterns. Because pixel intensity is directly correlated with Anorthite content, compositional anisotropy is then calculated throughout each image by determining the distance from a grid point at which variation in pixel intensity exceeds a pre-determined standard deviation; both regular and adaptive grid spacings are used, and length scales are calculated in 8 directions. The resulting textural maps are analogous to a vector field and quantify 2-dimensional variation in texture. With both types of grid spacing, changes in magnitude and orientation of textural anisotropy and length scale indicate different crystal zones. The adaptive grid spacing, however, describes non-uniform textural variation more completely and has a higher measurement density in regions of high-frequency variation. In general, textural regions commonly described as clean or smooth show longer length scales and aligned anisotropies, whereas shorter length scales with variable anisotropies identify areas commonly described as patchy, dusty, or rough. The comparison and correlation of textural and compositional zoning help determine how different crystals record the

  12. Quantitative NIR Raman analysis in liquid mixtures.

    PubMed

    Sato-Berrú, R Ysacc; Medina-Valtierra, Jorge; Medina-Gutiérrez, Cirilo; Frausto-Reyes, Claudio

    2004-08-01

    The capability to obtain quantitative information in a simple way from Raman spectra is a subject of considerable interest. In this work, this is demonstrated for mixtures of ethanol with water and rhodamine-6G (R-6G) with methanol, which were analyzed directly in glass vessels. The Raman intensities and a simple mathematical model have been used and applied for the analysis of liquid samples. The starting point is to generate a general expression, from the experimental spectra, as the sum of the particular expressions for each pure compound; this yields an expression for the mixture which can be used to determine the concentrations of its components from the Raman spectrum.
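
    The sum-of-pure-component idea amounts to a linear mixture model; the sketch below recovers mixing weights by non-negative least squares on synthetic single-band "spectra" (purely illustrative, not the paper's data or exact model).

```python
import numpy as np
from scipy.optimize import nnls

wn = np.linspace(400, 3600, 500)                   # wavenumber axis
pure_a = np.exp(-0.5 * ((wn - 880) / 15) ** 2)     # mock "ethanol" band
pure_b = np.exp(-0.5 * ((wn - 3400) / 120) ** 2)   # mock "water" band
P = np.column_stack([pure_a, pure_b])

rng = np.random.default_rng(0)
mixture = 0.3 * pure_a + 0.7 * pure_b + 0.005 * rng.standard_normal(wn.size)

weights, _ = nnls(P, mixture)                      # non-negative weights
print(weights)                                     # ~ [0.3, 0.7]
```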

  13. Quantitative analysis of non-Hodgkin's lymphoma.

    PubMed Central

    Abbott, C R; Blewitt, R W; Bird, C C

    1982-01-01

    A preliminary attempt has been made to characterise a small series of non-Hodgkin's lymphomas (NHL) by morphometric means using the Quantimet 720 Kontron MOP/AMO3 image analysis systems. In most cases it was found that the distribution of nuclear area and correlation between mean nuclear area and frequency per unit field, corresponded closely with tumour classification determined by light microscopy. These results suggest that it may be possible to devise an objective and reproducible grading system for NHL using quantitative morphometric techniques. PMID:7040479

  14. Systems analysis of hydrogen supplementation in natural gas pipelines

    SciTech Connect

    Hermelee, A.; Beller, M.; D'Acierno, J.

    1981-11-01

    The potential for hydrogen supplementation in natural gas pipelines is analyzed for a specific site from both mid-term (1985) and long-term perspectives. The concept of supplementing natural gas with the addition of hydrogen in the existing gas pipeline system serves to provide a transport and storage medium for hydrogen while eliminating the high investment costs associated with constructing separate hydrogen pipelines. This paper examines incentives and barriers to the implementation of this concept. The analysis is performed with the assumption that current developmental programs will achieve a process for cost-effectively separating pure hydrogen from natural gas/hydrogen mixtures to produce a separable and versatile chemical and fuel commodity. The energy systems formulation used to evaluate the role of hydrogen in the energy infrastructure is the Reference Energy System (RES). The RES is a network diagram that provides an analytic framework for incorporating all resources, technologies, and uses of energy in a uniform manner. A major aspect of the study is to perform a market analysis of traditional uses of resources in the various consuming sectors and the potential for hydrogen substitution in these sectors. The market analysis will focus on areas of industry where hydrogen is used as a feedstock rather than for its fuel-use opportunities to replace oil and natural gas. The sectors of industry where hydrogen is currently used and where its use can be expanded or substituted for other resources include petroleum refining, chemicals, iron and steel, and other minor uses.

  15. Analysis of Hybrid Hydrogen Systems: Final Report

    SciTech Connect

    Dean, J.; Braun, R.; Munoz, D.; Penev, M.; Kinchin, C.

    2010-01-01

    Report on biomass pathways for hydrogen production and how they can be hybridized to support renewable electricity generation. Two hybrid systems were studied in detail for process feasibility and economic performance. The best-performing system was estimated to produce hydrogen at costs ($1.67/kg) within Department of Energy targets ($2.10/kg) for central biomass-derived hydrogen production while also providing value-added energy services to the electric grid.

  16. The solar-hydrogen economy: an analysis

    NASA Astrophysics Data System (ADS)

    Reynolds, Warren D.

    2007-09-01

    The 20th Century was the age of the Petroleum Economy, while the 21st Century is certainly the age of the Solar-Hydrogen Economy. The global Solar-Hydrogen Economy that is now emerging follows a different logic. Under this new economic paradigm, new machines and methods are once again being developed while companies are restructuring. The Petroleum Economy will be briefly explored in relation to oil consumption, Hubbert's curve, and oil reserves, with emphasis on the "oil crash". Concerns and criticisms about the Hydrogen Economy will be addressed by debunking some of the "hydrogen myths". There are three major driving factors for the establishment of the Solar-Hydrogen Economy: the environment, the economy with the coming "oil crash", and national security. The New Energy decentralization pathway has developed many progressive features, e.g., reducing dependence on oil and reducing air pollution and CO2 emissions. The technical and economic aspects of the various Solar-Hydrogen energy options and combinations will be analyzed. A proposed 24-hour/day 200 MWe solar-hydrogen power plant for the U.S. with selected energy options will be discussed. There are fast-emerging Solar-Hydrogen energy infrastructures in the U.S., Europe, Japan and China. Some of the major infrastructure projects in the transportation and energy sectors will be discussed. The current and projected growth of the Solar-Hydrogen Economy through 2045 will be given.

  17. Quantitative analysis of retinal changes in hypertension

    NASA Astrophysics Data System (ADS)

    Giansanti, Roberto; Boemi, Massimo; Fumelli, Paolo; Passerini, Giorgio; Zingaretti, Primo

    1995-05-01

    Arterial hypertension is a high-prevalence disease in Western countries and is associated with increased risk of cardiovascular accidents. Retinal vessel changes are common findings in patients suffering from long-standing hypertensive disease. Morphological evaluation of the fundus oculi is a fundamental tool in the clinical approach to the patient with hypertension. A qualitative analysis of the retinal lesions is usually performed, and this implies severe limitations both in the classification of the different degrees of the pathology and in the follow-up of the disease. A diagnostic system based on a quantitative analysis of the retinal changes could overcome these problems. Our computerized approach was developed for this purpose. The paper concentrates on the results and the implications of a computerized approach to the automatic extraction of numerical indexes describing morphological details of the fundus oculi. A previously developed image processing and recognition system, documented elsewhere and briefly described here, was successfully tested in pre-clinical experiments and applied to the evaluation of normal as well as pathological fundi. The software system was developed to extract indexes such as caliber and path of vessels, local tortuosity of arteries and arterioles, and positions and angles of crossings between two vessels. The reliability of the results, justified by their low variability, makes feasible the standardization of quantitative parameters to be used both in the diagnosis and in the prognosis of hypertension, and also allows prospective studies based upon them.

  18. Quantitative architectural analysis of bronchial intraepithelial neoplasia

    NASA Astrophysics Data System (ADS)

    Guillaud, Martial; MacAulay, Calum E.; Le Riche, Jean C.; Dawe, Chris; Korbelik, Jagoda; Lam, Stephen

    2000-04-01

    Considerable variation exists among pathologists in the interpretation of intraepithelial neoplasia, making it difficult to determine the natural history of these lesions and to establish management guidelines for chemoprevention. The aim of the study is to evaluate architectural features of pre-neoplastic progression in lung cancer and to search for a correlation between an architectural index and conventional pathology. Quantitative architectural analysis was performed on a series of normal lung biopsies and carcinoma in situ (CIS). Centers of gravity of the nuclei within a pre-defined region of interest were used as seeds to generate a Voronoi diagram. About 30 features derived from the Voronoi diagram, its dual the Delaunay tessellation, and the minimum spanning tree were extracted. A discriminant analysis was performed to separate the two groups. The architectural index was calculated for each of the bronchial biopsies that were interpreted as hyperplasia, metaplasia, or mild, moderate or severe dysplasia by conventional histopathology criteria. As a group, lesions classified as CIS by conventional histopathology criteria could be distinguished from dysplasia using the architectural index. Metaplasia was distinct from hyperplasia, and hyperplasia from normal. There was overlap between severe and moderate dysplasia, but mild dysplasia could be distinguished from moderate dysplasia. Bronchial intraepithelial neoplastic lesions can thus be graded objectively by architectural features. Combining architectural features with nuclear morphometric features may improve the quantitation of the changes occurring during the intra-epithelial neoplastic process.
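
    The sketch below illustrates, under stated assumptions, the kind of graph-derived features described: a Delaunay tessellation and a minimum spanning tree built over nuclear centers of gravity, summarized by edge-length statistics. The point cloud is synthetic, not biopsy data, and the feature set is a small subset of the roughly 30 used in the study.

      # Delaunay tessellation and MST over nuclear centres; summarise edge lengths.
      import numpy as np
      from scipy.spatial import Delaunay
      from scipy.spatial.distance import pdist, squareform
      from scipy.sparse.csgraph import minimum_spanning_tree

      rng = np.random.default_rng(1)
      centres = rng.uniform(0, 100, size=(50, 2))       # nuclear centres of gravity

      tri = Delaunay(centres)
      edges = set()
      for simplex in tri.simplices:
          for i in range(3):                            # the 3 edges of each triangle
              a, b = sorted((int(simplex[i]), int(simplex[(i + 1) % 3])))
              edges.add((a, b))
      lengths = np.array([np.linalg.norm(centres[a] - centres[b]) for a, b in edges])

      mst = minimum_spanning_tree(squareform(pdist(centres)))
      print(f'Delaunay edges: mean {lengths.mean():.1f}, sd {lengths.std():.1f}')
      print(f'MST edges: mean {mst.data.mean():.1f}, sd {mst.data.std():.1f}')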

  19. Quantitative interactome analysis reveals a chemoresistant edgotype

    PubMed Central

    Chavez, Juan D.; Schweppe, Devin K.; Eng, Jimmy K.; Zheng, Chunxiang; Taipale, Alex; Zhang, Yiyi; Takara, Kohji; Bruce, James E.

    2015-01-01

    Chemoresistance is a common mode of therapy failure for many cancers. Tumours develop resistance to chemotherapeutics through a variety of mechanisms, with proteins serving pivotal roles. Changes in protein conformations and interactions affect the cellular response to environmental conditions contributing to the development of new phenotypes. The ability to understand how protein interaction networks adapt to yield new function or alter phenotype is limited by the inability to determine structural and protein interaction changes on a proteomic scale. Here, chemical crosslinking and mass spectrometry were employed to quantify changes in protein structures and interactions in multidrug-resistant human carcinoma cells. Quantitative analysis of the largest crosslinking-derived, protein interaction network comprising 1,391 crosslinked peptides allows for ‘edgotype' analysis in a cell model of chemoresistance. We detect consistent changes to protein interactions and structures, including those involving cytokeratins, topoisomerase-2-alpha, and post-translationally modified histones, which correlate with a chemoresistant phenotype. PMID:26235782

  20. The method of quantitative automatic metallographic analysis

    NASA Astrophysics Data System (ADS)

    Martyushev, N. V.; Skeeba, V. Yu

    2017-01-01

    A brief analysis of existing software for computer processing of microstructure photographs is presented, followed by a description of the software package developed by the authors. This software product is intended for quantitative metallographic analysis of digital photographs of the microstructure of materials. It calculates the volume fraction and the average size of particles of the structure over several hundred secants (depending on the photograph resolution) in one field of view. In addition, a special module built into the software assesses the degree to which the shape of individual particles and impurities deviates from spherical. The article presents the main algorithms used during the creation of the software product, and the formulae by which the software calculates the parameters of the microstructure. It is shown that the reliability of the calculations depends on the quality of preparation of the microstructure.
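
    A minimal sketch of the secant (line-intercept) idea such software is built on, assuming a binarized micrograph: the fraction of secant pixels falling on particles estimates the volume fraction, and the mean run length of particle pixels estimates the average intercept size. The image below is synthetic, not a real micrograph.

      # Secant (line-intercept) metallography on a binary particle mask.
      import numpy as np

      rng = np.random.default_rng(2)
      img = rng.random((512, 512)) < 0.2   # synthetic binary 'particle' mask

      def secant_stats(mask, step=8):
          chords, hits, total = [], 0, 0
          for row in mask[::step]:                      # horizontal secants
              total += row.size
              hits += int(row.sum())
              padded = np.concatenate(([0], row.view(np.int8), [0]))
              d = np.diff(padded)                       # +1 at run starts, -1 at ends
              starts, ends = np.where(d == 1)[0], np.where(d == -1)[0]
              chords.extend(ends - starts)              # particle run lengths
          mean_intercept = float(np.mean(chords)) if chords else 0.0
          return hits / total, mean_intercept

      vf, li = secant_stats(img)
      print(f'volume fraction ~ {vf:.3f}, mean intercept ~ {li:.1f} px')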

  1. Quantitative laryngeal electromyography: turns and amplitude analysis.

    PubMed

    Statham, Melissa McCarty; Rosen, Clark A; Nandedkar, Sanjeev D; Munin, Michael C

    2010-10-01

    Laryngeal electromyography (LEMG) is primarily a qualitative examination, with no standardized approach to interpretation. The objectives of our study were to establish quantitative norms for motor unit recruitment in controls and to compare these with interference pattern analysis in patients with unilateral vocal fold paralysis (VFP). In a retrospective case-control study, we performed LEMG of the thyroarytenoid-lateral cricoarytenoid muscle complex (TA-LCA) in 21 controls and 16 patients with unilateral VFP. Our standardized protocol used a concentric needle electrode with subjects performing variable-force TA-LCA contraction. To quantify the interference pattern density, we measured turns and mean amplitude per turn for ≥10 epochs (each 500 milliseconds). Logarithmic regression analysis between amplitude and turns was used to calculate slope and intercept. Standard deviation was calculated to further define the confidence interval, enabling generation of a linear-scale graphical "cloud" of activity containing ≥90% of data points for controls and patients. Median age of controls and patients was similar (50.7 vs. 48.5 years). In controls, TA-LCA amplitude with variable contraction ranged from 145-1112 μV, and regression analysis comparing mean amplitude per turn to root-mean-square amplitude demonstrated high correlation (R = 0.82). In controls performing variable contraction, median turns per second was significantly higher compared to patients (450 vs. 290, P = .002). We first present interference pattern analysis of the TA-LCA in healthy adults and patients with unilateral VFP. Our findings indicate that motor unit recruitment can be quantitatively measured within the TA-LCA. Additionally, patients with unilateral VFP had significantly reduced turns when compared with controls.
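
    The sketch below shows one plausible implementation of a turns-and-amplitude measurement on a single 500 ms epoch; the 100 uV reversal threshold and the synthetic signal are illustrative assumptions, not the study's protocol.

      # Willison-style turns counting: a 'turn' is a direction reversal whose
      # excursion since the previous turning point exceeds a threshold.
      import numpy as np

      def turns_and_amplitude(x, threshold=100.0):
          turns, candidate, direction = [], x[0], 0
          for v in x[1:]:
              if direction <= 0 and v > candidate + threshold:
                  turns.append(candidate)               # trough deep enough to count
                  direction, candidate = +1, v
              elif direction >= 0 and v < candidate - threshold:
                  turns.append(candidate)               # peak high enough to count
                  direction, candidate = -1, v
              elif (direction >= 0 and v > candidate) or (direction <= 0 and v < candidate):
                  candidate = v                         # extend the current excursion
          amps = np.abs(np.diff(turns))
          return len(turns), (float(amps.mean()) if amps.size else 0.0)

      rng = np.random.default_rng(3)
      t = np.linspace(0, 0.5, 5000)                     # one 500 ms epoch at 10 kHz
      emg = 300 * np.sin(2 * np.pi * 80 * t) + 50 * rng.standard_normal(t.size)
      n_turns, mean_amp = turns_and_amplitude(emg)
      print(f'{2 * n_turns} turns/s, mean amplitude per turn {mean_amp:.0f} uV')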

  2. Automated quantitative image analysis of nanoparticle assembly

    NASA Astrophysics Data System (ADS)

    Murthy, Chaitanya R.; Gao, Bo; Tao, Andrea R.; Arya, Gaurav

    2015-05-01

    The ability to characterize higher-order structures formed by nanoparticle (NP) assembly is critical for predicting and engineering the properties of advanced nanocomposite materials. Here we develop a quantitative image analysis software to characterize key structural properties of NP clusters from experimental images of nanocomposites. This analysis can be carried out on images captured at intermittent times during assembly to monitor the time evolution of NP clusters in a highly automated manner. The software outputs averages and distributions in the size, radius of gyration, fractal dimension, backbone length, end-to-end distance, anisotropic ratio, and aspect ratio of NP clusters as a function of time along with bootstrapped error bounds for all calculated properties. The polydispersity in the NP building blocks and biases in the sampling of NP clusters are accounted for through the use of probabilistic weights. This software, named Particle Image Characterization Tool (PICT), has been made publicly available and could be an invaluable resource for researchers studying NP assembly. To demonstrate its practical utility, we used PICT to analyze scanning electron microscopy images taken during the assembly of surface-functionalized metal NPs of differing shapes and sizes within a polymer matrix. PICT is used to characterize and analyze the morphology of NP clusters, providing quantitative information that can be used to elucidate the physical mechanisms governing NP assembly.
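
    As a hedged sketch of two of the descriptors PICT reports, the code below computes a radius of gyration and a mass-radius estimate of fractal dimension for a synthetic particle cloud; it is an illustration of the quantities, not PICT itself.

      # Radius of gyration and mass-radius fractal dimension of a synthetic cluster.
      import numpy as np

      rng = np.random.default_rng(7)
      pts = rng.standard_normal((500, 2)).cumsum(axis=0)   # random-walk 'cluster'

      centre = pts.mean(axis=0)
      rg = np.sqrt(np.mean(np.sum((pts - centre) ** 2, axis=1)))

      # Mass-radius method: the point count N(r) inside radius r scales as r**df.
      r = np.linalg.norm(pts - centre, axis=1)
      radii = np.logspace(np.log10(np.median(r)), np.log10(r.max()), 10)
      counts = np.array([(r <= ri).sum() for ri in radii])
      df = np.polyfit(np.log(radii), np.log(counts), 1)[0]
      print(f'Rg = {rg:.1f}, fractal dimension ~ {df:.2f}')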

  3. Materials characterization through quantitative digital image analysis

    SciTech Connect

    J. Philliber; B. Antoun; B. Somerday; N. Yang

    2000-07-01

    A digital image analysis system has been developed to allow advanced quantitative measurement of microstructural features. This capability is maintained as part of the microscopy facility at Sandia, Livermore. The system records images digitally, eliminating the use of film. Images obtained from other sources may also be imported into the system. Subsequent digital image processing enhances image appearance through contrast and brightness adjustments. The system measures a variety of user-defined microstructural features--including area fraction, particle size and spatial distributions, and grain sizes and orientations of elongated particles. These measurements are made in a semi-automatic mode through the use of macro programs and a computer-controlled translation stage. A routine has been developed to create large montages of 50+ separate images. Individual image frames are matched to the nearest pixel to create seamless montages. Results from three different studies are presented to illustrate the capabilities of the system.

  4. Near Real Time Quantitative Gas Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Herget, William F.; Tromp, Marianne L.; Anderson, Charles R.

    1985-12-01

    A Fourier transform infrared (FT-IR) - based system has been developed and is undergoing evaluation for near real time multicomponent quantitative analysis of undiluted gaseous automotive exhaust emissions. The total system includes: (1) a gas conditioning system (GCS) for tracer gas injection, gas mixing, and temperature stabilization; and (2) an exhaust gas analyzer (EGA) consisting of a sample cell, an FT-IR system, and a computerized data processing system. Tests have shown that the system can monitor about 20 individual species (concentrations down to the 1-20 ppm range) with a time resolution of one second. Tests have been conducted on a chassis dynamometer system utilizing different autos, different fuels, and different driving cycles. Results were compared with those obtained using a standard constant volume sampling (CVS) system.
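
    A minimal sketch of the multicomponent quantitation underlying such an analyzer, assuming Beer-Lambert additivity: with reference absorptivity spectra K (absorbance per ppm at fixed path length), the measured absorbance A ~ K @ c is solved for the concentration vector c. The spectra below are synthetic stand-ins, not instrument data.

      # Multicomponent Beer-Lambert: A = K @ c, solved for c by least squares.
      import numpy as np

      rng = np.random.default_rng(4)
      species = ['CO', 'NO', 'CH4']
      K = np.abs(rng.standard_normal((400, len(species)))) * 1e-3   # absorbance per ppm

      true_ppm = np.array([120.0, 15.0, 8.0])
      A = K @ true_ppm + 1e-3 * rng.standard_normal(400)            # 'measured' absorbance

      ppm, *_ = np.linalg.lstsq(K, A, rcond=None)
      for name, c in zip(species, ppm):
          print(f'{name}: {c:.1f} ppm')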

  5. Fast, quantitative, and nondestructive evaluation of hydrided LWR fuel cladding by small angle incoherent neutron scattering of hydrogen

    DOE PAGES

    Yan, Y.; Qian, S.; Littrell, K.; ...

    2015-02-13

    A non-destructive neutron scattering method to precisely measure the uptake of hydrogen and the distribution of hydride precipitates in light water reactor (LWR) fuel cladding was developed. Zircaloy-4 cladding used in commercial LWRs was used to produce hydrided specimens. The hydriding apparatus consists of a closed stainless steel vessel that contains Zr alloy specimens and hydrogen gas. Following hydrogen charging, the hydrogen content of the hydrided specimens was measured using the vacuum hot extraction method, by which the samples with desired hydrogen concentration were selected for the neutron study. Optical microscopy shows that our hydriding procedure results in uniform distribution of circumferential hydrides across the wall. Small angle neutron incoherent scattering was performed in the High Flux Isotope Reactor at Oak Ridge National Laboratory. This study demonstrates that the hydrogen in commercial Zircaloy-4 cladding can be measured very accurately in minutes by this nondestructive method over a wide range of hydrogen concentrations, from a very small amount (~20 ppm) to over 1000 ppm. The hydrogen distribution in a tube sample was obtained by scaling the neutron scattering rate with a factor determined by a calibration process using standard, destructive direct chemical analysis methods on the specimens. This scale factor will be used in future tests with unknown hydrogen concentrations, thus providing a nondestructive method for absolute hydrogen concentration determination.

  6. Fast, quantitative, and nondestructive evaluation of hydrided LWR fuel cladding by small angle incoherent neutron scattering of hydrogen

    SciTech Connect

    Yan, Y.; Qian, S.; Littrell, K.; Parish, C. M.; Plummer, L. K.

    2015-02-13

    A non-destructive neutron scattering method to precisely measure the uptake of hydrogen and the distribution of hydride precipitates in light water reactor (LWR) fuel cladding was developed. Zircaloy-4 cladding used in commercial LWRs was used to produce hydrided specimens. The hydriding apparatus consists of a closed stainless steel vessel that contains Zr alloy specimens and hydrogen gas. Following hydrogen charging, the hydrogen content of the hydrided specimens was measured using the vacuum hot extraction method, by which the samples with desired hydrogen concentration were selected for the neutron study. Optical microscopy shows that our hydriding procedure results in uniform distribution of circumferential hydrides across the wall. Small angle neutron incoherent scattering was performed in the High Flux Isotope Reactor at Oak Ridge National Laboratory. This study demonstrates that the hydrogen in commercial Zircaloy-4 cladding can be measured very accurately in minutes by this nondestructive method over a wide range of hydrogen concentrations, from a very small amount (~20 ppm) to over 1000 ppm. The hydrogen distribution in a tube sample was obtained by scaling the neutron scattering rate with a factor determined by a calibration process using standard, destructive direct chemical analysis methods on the specimens. This scale factor will be used in future tests with unknown hydrogen concentrations, thus providing a nondestructive method for absolute hydrogen concentration determination.
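
    The calibration step described (scaling the neutron count rate to absolute hydrogen content via destructively analyzed standards) reduces to a one-parameter linear fit, sketched below with invented numbers.

      # One-parameter fit through the origin: count_rate = k * ppm.
      import numpy as np

      h_ppm_standards = np.array([20.0, 100.0, 300.0, 600.0, 1000.0])   # destructive assay
      count_rate = np.array([1.1, 5.2, 15.3, 30.9, 51.0])               # arbitrary units

      # Least-squares slope through the origin: k = sum(x*y) / sum(x*x).
      k = (count_rate @ h_ppm_standards) / (h_ppm_standards @ h_ppm_standards)

      unknown_rate = 22.4
      print(f'estimated hydrogen: {unknown_rate / k:.0f} ppm')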

  7. The hydrogen anomaly in neutron Compton scattering: new experiments and a quantitative theoretical explanation

    NASA Astrophysics Data System (ADS)

    Karlsson, E. B.; Hartmann, O.; Chatzidimitriou-Dreismann, C. A.; Abdul-Redah, T.

    2016-08-01

    No consensus has been reached so far about the hydrogen anomaly problem in Compton scattering of neutrons, although strongly reduced H cross-sections were first reported almost 20 years ago. Over the years, this phenomenon has been observed in many different hydrogen-containing materials. Here, we use yttrium hydrides as test objects, YH2, YH3, YD2 and YD3, as well as Y(HxD1-x)2 and Y(HxD1-x)3, for which we observe H anomalies increasing with transferred momentum q. We also observe reduced deuteron cross-sections in YD2 and YD3 and have followed those up to scattering angles of 140°, corresponding to high momentum transfers. In addition to data taken using the standard Au-197 foils for neutron energy selection, the present work includes experiments with Rh-103 foils, and comparisons were also made with data from different detector setups. The H and D anomalies are discussed in terms of the different models proposed for their interpretation. The 'electron loss model' (which assumes energy transfer to excited electrons) is contradicted by the present data, but it is shown here that exchange effects in scattering from two or more protons (or deuterons) in the presence of large zero-point vibrations can explain quantitatively the reduction of the cross-sections as well as their q-dependence. Decoherence processes also play an essential role. In a scattering time representation, shake-up processes can be followed on the attosecond scale. The theory also shows that large anomalies can appear only when the neutron coherence lengths (determined by energy selection and detector geometry) are about the same size as the distance between the scatterers.

  8. Quantitative Analysis of Tremors in Welders

    PubMed Central

    Sanchez-Ramos, Juan; Reimer, Dacy; Zesiewicz, Theresa; Sullivan, Kelly; Nausieda, Paul A.

    2011-01-01

    Background: Workers chronically exposed to manganese in welding fumes may develop an extra-pyramidal syndrome with postural and action tremors. Objectives: To determine the utility of tremor analysis in distinguishing tremors among workers exposed to welding fumes, patients with Idiopathic Parkinson’s Disease (IPD) and Essential Tremor (ET). Methods: Retrospective study of recorded tremor in subjects from academic Movement Disorders Clinics and Welders. Quantitative tremor analysis was performed and associated with clinical status. Results: Postural tremor intensity was increased in Welders and ET and was associated with visibly greater amplitude of tremor with arms extended. Mean center frequencies (Cf) of welders and patients with ET were significantly higher than the mean Cf of PD subjects. Although both the welders and the ET group exhibited a higher Cf with arms extended, welders could be distinguished from the ET subjects by a significantly lower Cf of the rest tremor than that measured in ET subjects. Conclusions: In the context of an appropriate exposure history and neurological examination, tremor analysis may be useful in the diagnosis of manganese-related extra-pyramidal manifestations. PMID:21655131
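
    A hedged sketch of one common way to obtain a tremor centre frequency (Cf): the power-weighted mean frequency of the recorded signal's spectrum within a tremor band. The signal, band edges, and sample rate below are illustrative assumptions, not the clinics' protocol.

      # Centre frequency as the power-weighted mean of the Welch spectrum
      # within a plausible 2-12 Hz tremor band.
      import numpy as np
      from scipy.signal import welch

      fs = 100.0                                        # Hz, assumed sample rate
      t = np.arange(0, 30, 1 / fs)
      rng = np.random.default_rng(5)
      accel = np.sin(2 * np.pi * 6.5 * t) + 0.5 * rng.standard_normal(t.size)

      f, pxx = welch(accel, fs=fs, nperseg=1024)
      band = (f >= 2) & (f <= 12)
      cf = np.sum(f[band] * pxx[band]) / np.sum(pxx[band])
      print(f'centre frequency ~ {cf:.1f} Hz')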

  9. Nonlinear dynamics and quantitative EEG analysis.

    PubMed

    Jansen, B H

    1996-01-01

    Quantitative, computerized electroencephalogram (EEG) analysis appears to be based on a phenomenological approach to EEG interpretation, and is primarily rooted in linear systems theory. A fundamentally different approach to computerized EEG analysis, however, is making its way into the laboratories. The basic idea, inspired by recent advances in the area of nonlinear dynamics and chaos theory, is to view an EEG as the output of a deterministic system of relatively simple complexity, but containing nonlinearities. This suggests that studying the geometrical dynamics of EEGs, and the development of neurophysiologically realistic models of EEG generation may produce more successful automated EEG analysis techniques than the classical, stochastic methods. A review of the fundamentals of chaos theory is provided. Evidence supporting the nonlinear dynamics paradigm to EEG interpretation is presented, and the kind of new information that can be extracted from the EEG is discussed. A case is made that a nonlinear dynamic systems viewpoint to EEG generation will profoundly affect the way EEG interpretation is currently done.

  10. Hydrogen storage and delivery system development: Analysis

    SciTech Connect

    Handrock, J.L.

    1996-10-01

    Hydrogen storage and delivery is an important element in effective hydrogen utilization for energy applications and is an important part of the FY1994-1998 Hydrogen Program Implementation Plan. This project is part of the Field Work Proposal entitled Hydrogen Utilization in Internal Combustion Engines (ICE). The goal of the Hydrogen Storage and Delivery System Development Project is to expand the state of the art of hydrogen storage and delivery system design and development. At the foundation of this activity is the development of both analytical and experimental evaluation platforms. These tools provide the basis for an integrated approach for coupling hydrogen storage and delivery technology to the operating characteristics of potential hydrogen energy use applications. Results of the analytical model development portion of this project are discussed. Analytical models have been developed for internal combustion engine (ICE) hybrid and fuel cell driven vehicles. The dependence of hydride storage system weight and energy use efficiency on engine brake efficiency and exhaust temperature for ICE hybrid vehicle applications is examined. Results show that while storage system weight decreases with increasing engine brake efficiency, energy use efficiency remains relatively unchanged. The development, capability, and use of a recently developed fuel cell vehicle storage system model are also discussed. As an example of model use, power distribution and control for a simulated driving cycle are presented. Model calibration results of fuel cell fluid inlet and exit temperatures at various fuel cell idle speeds, assumed fuel cell heat capacities, and ambient temperatures are presented. The model predicts general increases in temperature with fuel cell power and differences between inlet and exit temperatures, but underpredicts absolute temperature values, especially at higher power levels.

  11. Quantitative Analysis of Triple Mutant Genetic Interactions

    PubMed Central

    Braberg, Hannes; Alexander, Richard; Shales, Michael; Xu, Jiewei; Franks-Skiba, Kathleen E.; Wu, Qiuqin; Haber, James E.; Krogan, Nevan J.

    2014-01-01

    The quantitative analysis of genetic interactions between pairs of gene mutations has proven effective for characterizing cellular functions, but can miss important interactions for functionally redundant genes. To address this limitation, we have developed an approach termed Triple Mutant Analysis (TMA). The procedure relies on a query strain that contains two deletions in a pair of redundant or otherwise related genes, which is crossed against a panel of candidate deletion strains to isolate triple mutants and measure their growth. A central feature of TMA is to interrogate mutants that are synthetically sick when two other genes are deleted but interact minimally with either single deletion. This approach has been valuable for discovering genes that restore critical functions when the principal actors are deleted. TMA has also uncovered double mutant combinations that produce severe defects because a third protein becomes deregulated and acts in a deleterious fashion, and it has revealed functional differences between proteins presumed to act together. The protocol is optimized for Singer ROTOR pinning robots, takes 3 weeks to complete, and measures interactions for up to 30 double mutants against a library of 1536 single mutants. PMID:25010907

  12. Seniors' online communities: a quantitative content analysis.

    PubMed

    Nimrod, Galit

    2010-06-01

    To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. There was a constant increase in the daily activity level during the research period. Content analysis identified 13 main subjects discussed in the communities, including (in descending order) "Fun on line," "Retirement," "Family," "Health," "Work and Study," "Recreation," "Finance," "Religion and Spirituality," "Technology," "Aging," "Civic and Social," "Shopping," and "Travels." The overall tone was somewhat more positive than negative. The findings suggest that the utilities of Information and Communications Technologies for older adults that were identified in previous research are valid for seniors' online communities as well. However, the findings suggest several other possible benefits, which may be available only in online communities. The communities may provide social support, contribute to self-preservation, and serve as an opportunity for self-discovery and growth. Because they offer both leisure activity and an expanded social network, it is suggested that active participation in the communities may contribute to the well-being of older adults. Directions for future research and applied implications are further discussed.

  13. Quantitative analysis of protein turnover in plants.

    PubMed

    Nelson, Clark J; Li, Lei; Millar, A Harvey

    2014-03-01

    Proteins are constantly being synthesised and degraded as plant cells age and as plants grow, develop and adapt the proteome. Given that plants develop through a series of events from germination to fruiting and even undertake whole organ senescence, an understanding of protein turnover as a fundamental part of this process in plants is essential. Both synthesis and degradation processes are spatially separated in a cell across its compartmented structure. The majority of protein synthesis occurs in the cytosol, while synthesis of specific components occurs inside plastids and mitochondria. Degradation of proteins occurs in both the cytosol, through the action of the plant proteasome, and in organelles and lytic structures through different protease classes. Tracking the specific synthesis and degradation rate of individual proteins can be undertaken using stable isotope feeding and the ability of peptide MS to track labelled peptide fractions over time. Mathematical modelling can be used to follow the isotope signature of newly synthesised protein as it accumulates and natural abundance proteins as they are lost through degradation. Different technical and biological constraints govern the potential for the use of (13)C, (15)N, (2)H and (18)O for these experiments in complete labelling and partial labelling strategies. Future development of quantitative protein turnover analysis will involve analysis of protein populations in complexes and subcellular compartments, assessing the effect of PTMs and integrating turnover studies into wider system biology study of plants.
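
    A minimal sketch of the first-order model commonly fitted to such labelling data (assumed here, not quoted from the paper): the labelled fraction of a protein pool rises as 1 - exp(-kt), and the fitted rate k gives a half-life.

      # First-order labelling kinetics: fraction labelled = 1 - exp(-k t).
      import numpy as np
      from scipy.optimize import curve_fit

      def labelled_fraction(t, k):
          return 1.0 - np.exp(-k * t)

      t_days = np.array([0.5, 1, 2, 4, 7])              # sampling times, illustrative
      frac = np.array([0.18, 0.33, 0.54, 0.79, 0.93])   # measured labelled fraction

      (k_fit,), _ = curve_fit(labelled_fraction, t_days, frac, p0=[0.3])
      print(f'k = {k_fit:.2f} /day, half-life = {np.log(2) / k_fit:.1f} days')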

  14. Spatial, Temporal, and Quantitative Manipulation of Intracellular Hydrogen Peroxide in Cultured Cells

    PubMed Central

    Alim, Ishraq; Haskew-Layton, Renee E.; Aleyasin, Hossein; Guo, Hengchang; Ratan, Rajiv R.

    2015-01-01

    Hydrogen peroxide (H2O2) is produced endogenously in a number of cellular compartments, including the mitochondria, the endoplasmic reticulum, peroxisomes, and at the plasma membrane, and can play divergent roles as a second messenger or a pathological toxin. It is assumed that the tuned production of H2O2 within neuronal and non-neuronal cells regulates a discrete balance between survival and death. However, a major challenge in understanding the physiological versus pathological role of H2O2 in cells has been the lack of validated methods that can spatially, temporally, and quantitatively modulate H2O2 production. A promising means of regulating endogenous H2O2 is the expression of the peroxide-producing enzyme D-amino acid oxidase (DAAO, from Rhodotorula gracilis, lacking a peroxisomal targeting sequence). Using viral vectors to express DAAO in distinct cell types, and targeting sequences to direct DAAO to distinct subcellular sites, we can manipulate H2O2 production by applying the substrate D-alanine or permeable analogs of D-alanine. In this chapter, we describe the use of DAAO to produce H2O2 in culture models and the real-time visual validation of this technique using two-photon microscopy and chemoselective fluorescent probes. PMID:25416362

  15. Applying Knowledge of Quantitative Design and Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  16. Quantitative color analysis for capillaroscopy image segmentation.

    PubMed

    Goffredo, Michela; Schmid, Maurizio; Conforto, Silvia; Amorosi, Beatrice; D'Alessio, Tommaso; Palma, Claudio

    2012-06-01

    This communication introduces a novel approach for quantitatively evaluating the role of color space decomposition in digital nailfold capillaroscopy analysis. It is clinically recognized that alterations of the capillary pattern at the periungual skin region are directly related to dermatologic and rheumatic diseases. The proposed algorithm for the segmentation of digital capillaroscopy images is optimized with respect to the choice of the color space and the contrast variation. Since the color space is a critical factor for segmenting low-contrast images, an exhaustive comparison between different color channels is conducted and a novel color channel combination is presented. Results from images of 15 healthy subjects are compared with annotated data, i.e. selected images approved by clinicians. From this comparison, a set of figures of merit is extracted, highlighting the algorithm's capability to correctly segment capillaries, their shape and their number. Experimental tests show that the optimized procedure for capillary segmentation, based on a novel color channel combination, achieves an average accuracy higher than 0.8 and extracts capillaries whose shape and granularity are acceptable. The obtained results are particularly encouraging for future developments on the classification of capillary patterns with respect to dermatologic and rheumatic diseases.

  17. The development of 1,3-diphenylisobenzofuran as a highly selective probe for the detection and quantitative determination of hydrogen peroxide.

    PubMed

    Żamojć, Krzysztof; Zdrowowicz, Magdalena; Rudnicki-Velasquez, Paweł Błażej; Krzymiński, Karol; Zaborowski, Bartłomiej; Niedziałkowski, Paweł; Jacewicz, Dagmara; Chmurzyński, Lech

    2017-01-01

    1,3-Diphenylisobenzofuran (DPBF) has been developed as a selective probe for the detection and quantitative determination of hydrogen peroxide in samples containing different reactive nitrogen and oxygen species (RNOS). DPBF is a fluorescent probe which, for almost 20 years, was believed to react in a highly specific manner toward some reactive oxygen species (ROS) such as singlet oxygen and hydroxyl, alkyloxy or alkylperoxy radicals. Under the action of these species, DPBF is rapidly transformed to 1,2-dibenzoylbenzene (DBB). In order to check whether DPBF can act as a unique indicator of the total amount of different RNOS, as well as of oxidative stress caused by an overproduction of these species, a series of experiments was carried out in which DPBF reacted with peroxynitrite anion, superoxide anion, hydrogen peroxide, hypochlorite anion, and anions commonly present under biological conditions, namely nitrite and nitrate. In all cases except hydrogen peroxide, the product of the reaction is DBB. Only under the action of H2O2 is 9-hydroxyanthracen-10(9H)-one (oxanthrone) formed. This product has been identified with the use of fluorescence spectroscopy, NMR spectroscopy, high performance liquid chromatography coupled with mass spectrometry, infrared spectroscopy, elemental analysis, and cyclic voltammetry (CV). A linear relationship was found between the decrease in the fluorescence intensity of DPBF and the concentration of hydrogen peroxide in the concentration range of 0.196-3.941 mM. DPBF responds to hydrogen peroxide in a very specific way, with limits of detection and quantitation of 88 and 122.8 μM, respectively. The kinetics of the reaction between DPBF and H2O2 was also studied.
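
    The reported detection and quantitation limits can be reproduced in form (not in value) with the standard 3.3*s/m and 10*s/m definitions applied to a linear calibration; the sketch below uses invented calibration points, not the paper's data.

      # Linear calibration, then LOD = 3.3*s/m and LOQ = 10*s/m
      # (s: residual standard deviation as a blank-noise proxy; m: slope).
      import numpy as np

      conc_mM = np.array([0.2, 0.8, 1.6, 2.4, 3.2, 3.9])            # invented points
      signal_drop = np.array([0.051, 0.198, 0.402, 0.586, 0.811, 0.972])

      m, b = np.polyfit(conc_mM, signal_drop, 1)
      s = (signal_drop - (m * conc_mM + b)).std(ddof=2)
      print(f'LOD ~ {3.3 * s / m * 1000:.0f} uM, LOQ ~ {10 * s / m * 1000:.0f} uM')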

  18. Hydrogen Technical Analysis -- Dissemination of Information

    SciTech Connect

    George Kervitsky, Jr.

    2006-03-20

    SENTECH is a small energy and environmental consulting firm providing technical, analytical, and communications solutions to technology management issues. The activities proposed by SENTECH focused on gathering and developing communications materials and information, and on various dissemination activities to present the benefits of hydrogen energy to a broad audience while at the same time establishing permanent communications channels to enable continued two-way dialog with these audiences in future years. Effective communications and information dissemination is critical to the acceptance of new technology. Hydrogen technologies face the additional challenge of safety preconceptions formed primarily as a result of the crash of the Hindenburg. Effective communications play a key role in all aspects of human interaction and help to overcome perceptual barriers, whether of safety, economics, or benefits. As originally proposed, SENTECH identified three distinct information dissemination activities to address three distinct but important audiences; these formed the basis for the task structure used in phases 1 and 2. The tasks were: (1) Print information--brochures that target a certain segment of the population, distributed via relevant technical conferences and traditional distribution channels. (2) Face-to-face meetings--with industries identified to have a stake in hydrogen energy; the three industry audiences are architect/engineering firms, renewable energy firms, and energy companies that have not made a commitment to hydrogen. (3) Educational forums--the final audience is students, the future engineers, technicians, and energy consumers; SENTECH will expand on its previous educational work in this area. The communications activities proposed by SENTECH and completed as a result of this cooperative agreement were designed to complement the research and development work funded by the DOE by presenting the technical achievements and validations

  19. A hydrogen energy carrier. Volume 2: Systems analysis

    NASA Technical Reports Server (NTRS)

    Savage, R. L. (Editor); Blank, L. (Editor); Cady, T. (Editor); Cox, K. (Editor); Murray, R. (Editor); Williams, R. D. (Editor)

    1973-01-01

    A systems analysis of hydrogen as an energy carrier in the United States indicated that it is feasible to use hydrogen in all energy use areas, except some types of transportation. These use areas are industrial, residential and commercial, and electric power generation. Saturation concept and conservation concept forecasts of future total energy demands were made. Projected costs of producing hydrogen from coal or from nuclear heat combined with thermochemical decomposition of water are in the range $1.00 to $1.50 per million Btu of hydrogen produced. Other methods are estimated to be more costly. The use of hydrogen as a fuel will require the development of large-scale transmission and storage systems. A pipeline system similar to the existing natural gas pipeline system appears practical, if design factors are included to avoid hydrogen environment embrittlement of pipeline metals. Conclusions from the examination of the safety, legal, environmental, economic, political and societal aspects of hydrogen fuel are that a hydrogen energy carrier system would be compatible with American values and the existing energy system.

  20. Hydrogen detection near surfaces and shallow interfaces with resonant nuclear reaction analysis

    NASA Astrophysics Data System (ADS)

    Wilde, Markus; Fukutani, Katsuyuki

    2014-12-01

    This review introduces hydrogen depth profiling by nuclear reaction analysis (NRA) via the resonant 1H(15N,αγ)12C reaction as a versatile method for the highly depth-resolved observation of hydrogen (H) at solid surfaces and interfaces. The technique is quantitative, non-destructive, and readily applied to a large variety of materials. Its fundamentals, instrumental requirements, advantages and limitations are described in detail, and its main performance benchmarks in terms of depth resolution and sensitivity are compared to those of elastic recoil detection (ERD) as a competing method. The wide range of 1H(15N,αγ)12C NRA applications in research of hydrogen-related phenomena at surfaces and interfaces is reviewed. Special emphasis is placed on the powerful combination of 1H(15N,αγ)12C NRA with surface science techniques of in-situ target preparation and characterization, as the NRA technique is ideally suited to investigate hydrogen interactions with atomically controlled surfaces and intact interfaces. In conjunction with thermal desorption spectroscopy, 15N NRA can assess the thermal stability of absorbed hydrogen species in different depth locations against diffusion and desorption. Hydrogen diffusion dynamics in the near-surface region, including transitions of hydrogen between the surface and the bulk, and between shallow interfaces of nanostructured thin layer stacks can directly be visualized. As a unique feature of 15N NRA, the analysis of Doppler-broadened resonance excitation curves allows for the direct measurement of the zero-point vibrational energy of hydrogen atoms adsorbed on single crystal surfaces.
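
    The depth scale of resonant NRA follows from a simple relation: probing depth = (beam energy - resonance energy) / stopping power. The sketch below uses the 6.385 MeV resonance of 1H(15N,alpha-gamma)12C; the stopping power is an assumed illustrative value, whereas real analyses take it from tables such as SRIM.

      # Probing depth of the 6.385 MeV 1H(15N,ag)12C resonance.
      E_RES_KEV = 6385.0

      def probing_depth_nm(beam_energy_kev, stopping_kev_per_nm=1.5):
          # 1.5 keV/nm is an assumed illustrative stopping power, not a tabulated value.
          return (beam_energy_kev - E_RES_KEV) / stopping_kev_per_nm

      for e_kev in (6385.0, 6425.0, 6465.0):
          print(f'E = {e_kev:.0f} keV -> depth ~ {probing_depth_nm(e_kev):.0f} nm')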

  1. The stable hydrogen isotopic composition of sedimentary plant waxes as quantitative proxy for rainfall in the West African Sahel

    NASA Astrophysics Data System (ADS)

    Niedermeyer, Eva M.; Forrest, Matthew; Beckmann, Britta; Sessions, Alex L.; Mulch, Andreas; Schefuß, Enno

    2016-07-01

    Various studies have demonstrated that the stable hydrogen isotopic composition (δD) of terrestrial leaf waxes tracks that of precipitation (δDprecip) both spatially across climate gradients and over a range of different timescales. Yet reconstructed estimates of δDprecip and corresponding rainfall typically remain largely qualitative, due mainly to uncertainties in plant ecosystem net fractionation, relative humidity, and the stability of the amount effect through time. Here we present δD values of the C31 n-alkane (δDwax) from a marine sediment core offshore the Northwest (NW) African Sahel covering the past 100 years and overlapping with the instrumental record of rainfall. We use this record to investigate whether accurate, quantitative estimates of past rainfall can be derived from our δDwax time series. We inferred the composition of vegetation (C3/C4) within the continental catchment area by analysis of the stable carbon isotopic composition of the same compounds (δ13Cwax), calculated a net ecosystem fractionation factor, and corrected the δDwax time series accordingly to derive δDprecip. Using the present-day relationship between δDprecip and the amount of precipitation in the tropics, we derive quantitative estimates of past precipitation amounts. Our data show that (a) vegetation composition can be inferred from δ13Cwax, (b) the calculated net ecosystem fractionation represents a reasonable estimate, and (c) estimated total amounts of rainfall based on δDwax correspond to instrumental records of rainfall. Our study has important implications for future studies aiming to reconstruct rainfall based on δDwax; the combined data presented here demonstrate that it is feasible to infer absolute rainfall amounts from sedimentary δDwax in tandem with δ13Cwax in specific depositional settings.
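
    A hedged sketch of the correction chain described: convert measured wax δD to precipitation δD with a net ecosystem fractionation, then invert an amount-effect relation for rainfall. The fractionation, slope, and intercept below are illustrative assumptions, not the paper's calibration.

      # deltaD_wax -> deltaD_precip via net fractionation eps (per mil),
      # then a hypothetical linear amount-effect: dDp = intercept + slope * P.
      def dD_precip(dD_wax, eps_net=-121.0):            # eps_net is an assumption
          return (dD_wax + 1000.0) / (eps_net / 1000.0 + 1.0) - 1000.0

      def rainfall_mm_per_month(dDp, slope=-0.3, intercept=10.0):   # hypothetical
          return (dDp - intercept) / slope

      dDw = -150.0                                      # illustrative C31 n-alkane value
      dDp = dD_precip(dDw)
      print(f'dD_precip ~ {dDp:.0f} per mil, rainfall ~ {rainfall_mm_per_month(dDp):.0f} mm/month')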

  2. A Qualitative-Quantitative H-NMR Experiment for the Instrumental Analysis Laboratory.

    ERIC Educational Resources Information Center

    Phillips, John S.; Leary, James J.

    1986-01-01

    Describes an experiment combining qualitative and quantitative information from hydrogen nuclear magnetic resonance spectra. Reviews theory, discusses the experimental approach, and provides sample results. (JM)

  3. Analysis Sharpens Mars Hydrogen Map, Hinting Equatorial Water Ice

    NASA Image and Video Library

    2017-09-28

    Re-analysis of 2002-2009 data from a hydrogen-finding instrument on NASA's Mars Odyssey orbiter increased the resolution of maps of hydrogen abundance. The reprocessed data (lower map) shows more "water-equivalent hydrogen" (darker blue) in some parts of this equatorial region of Mars. Puzzlingly, this suggests the possible presence of water ice just beneath the surface near the equator, though it would not be thermodynamically stable there. The upper map uses raw data from Odyssey's neutron spectrometer instrument, which senses the energy state of neutrons coming from Mars, providing an indication of how much hydrogen is present in the top 3 feet (1 meter) of the surface. Hydrogen detected by Odyssey at high latitudes of Mars in 2002 was confirmed to be in the form of water ice by the follow-up NASA Phoenix Mars Lander mission in 2008. A 2017 reprocessing of the older data applied image-reconstruction techniques often used to reduce blurring in medical imaging data. The results are shown here for an area straddling the equator for about one-fourth the circumference of the planet, centered at 175 degrees west longitude. The white contours outline lobes of a formation called Medusae Fossae, coinciding with some areas of higher hydrogen abundance in the enhanced-resolution analysis. The black line indicates the limit of a relatively young lava plain, coinciding with areas of lower hydrogen abundance in the enhanced-resolution analysis. The color-coding key for hydrogen abundance in both maps is indicated by the horizontal bar, in units expressing how much water would be present in the ground if the hydrogen were all in the form of water. Units of the equivalent water weight, as a percentage of the material in the ground, are correlated with counts recorded by the spectrometer, ranging from less than 1 weight-percent water equivalent (red) to more than 30 percent (dark blue). https://photojournal.jpl.nasa.gov/catalog/PIA21848

  4. Quantitative Analysis of Hypoperfusion in Acute Stroke

    PubMed Central

    Nael, Kambiz; Meshksar, Arash; Liebeskind, David S.; Coull, Bruce M.; Krupinski, Elizabeth A.; Villablanca, J. Pablo

    2014-01-01

    Background and Purpose This study compares the concordance between arterial spin labeling (ASL) and dynamic susceptibility contrast (DSC) for the identification of regional hypoperfusion and diffusion-perfusion mismatch tissue classification using a quantitative method. Methods The inclusion criteria for this retrospective study were as follows: patients with acute ischemic syndrome with symptom onset <24 hours and acquisition of both ASL and DSC MR perfusion. The volumes of infarction and hypoperfused lesions were calculated on ASL and DSC multi-parametric maps. Patients were classified into reperfused, matched, or mismatch groups using time to maximum >6 sec as the reference. In a subset of patients who were successfully recanalized, the identical analysis was performed and the infarction and hypoperfused lesion volumes were used for paired pre- and posttreatment comparisons. Results Forty-one patients met our inclusion criteria. Twenty patients underwent successful endovascular revascularization (TICI>2a), resulting in a total of 61 ASL-DSC data pairs for comparison. The hypoperfusion volume on ASL-cerebral blood flow best approximated the DSC-time to peak volume (r=0.83) in the pretreatment group and time to maximum (r=0.46) after recanalization. Both ASL-cerebral blood flow and DSC-TTP overestimated the hypoperfusion volume compared with time to maximum volume in pretreatment (F=27.41, P<0.0001) and recanalized patients (F=8.78, P<0.0001). Conclusions ASL-cerebral blood flow overestimates the DSC time to maximum hypoperfusion volume and mismatch classification in patients with acute ischemic syndrome. Continued overestimation of hypoperfused volume after recanalization suggests that flow pattern and velocity changes, in addition to arterial transit delay, can affect the performance of ASL. PMID:23988646

  5. System level analysis of hydrogen storage options.

    SciTech Connect

    Ahluwalia, R. K.; Peng, J.-C.; Hua, T. Q.; Kumar, R.; Satyapal, S.; USDOE

    2006-01-01

    The overall objective of this effort is to support DOE with independent system level analyses of various H2 storage approaches, to help to assess and down-select options, and to determine the feasibility of meeting DOE targets. Specific objectives in Fiscal Year 2008 included: (1) Model various developmental hydrogen storage systems, (2) Provide results to Centers of Excellence (CoEs) for assessment of performance targets and goals, (3) Develop models to 'reverse-engineer' particular approaches, (4) Identify interface issues, opportunities, and data needs for technology development. Several different approaches are being pursued to develop on-board hydrogen storage systems with the goal of meeting DOE targets for light-duty vehicular applications. Each approach has unique characteristics, such as the thermal energy and temperature of charge and discharge, kinetics of the physical and chemical process steps involved, and requirements for the materials and energy interfaces between the storage system and the fuel supply system on the one hand, and the fuel user on the other. Other storage system design and operating parameters influence the projected system costs as well. We are developing models to understand the characteristics of storage systems based on these approaches and to evaluate their potential to meet the DOE targets for on-board applications. Our approach is to develop thermodynamic, kinetic, and engineering models of the various hydrogen storage systems being developed under DOE sponsorship.

  6. Skeleton-based cerebrovascular quantitative analysis.

    PubMed

    Wang, Xingce; Liu, Enhui; Wu, Zhongke; Zhai, Feifei; Zhu, Yi-Cheng; Shui, Wuyang; Zhou, Mingquan

    2016-12-20

    Cerebrovascular disease is the most common cause of death worldwide, with millions of deaths annually. Interest is increasing in understanding the geometric factors that influence cerebrovascular diseases such as stroke. Cerebrovascular shape analyses are essential for the diagnosis and pathological identification of these conditions. The current study aimed to provide a stable and consistent methodology for quantitative Circle of Willis (CoW) analysis and to identify geometric changes in this structure. An entire pipeline was designed with emphasis on automating each step. The stochastic segmentation was improved and volumetric data were obtained. The L1 medial axis method was applied to the vessel volumetric data, which yielded a discrete skeleton dataset. A B-spline curve was used to fit the skeleton, and geometric values were proposed for the one-dimensional skeleton and radius. The calculations used to derive these values are illustrated in detail. In one example (No. 47 in the open dataset), all values for the different branches of the CoW were calculated. The anterior communicating artery (ACo) was the shortest vessel, with a length of 2.6 mm. The range of the curvature of all vessels was (0.3, 0.9) ± (0.1, 1.4). The range of the torsion was (-12.4, 0.8) ± (0, 48.7). The mean radius value range was (3.1, 1.5) ± (0.1, 0.7) mm, and the mean angle value range was (2.2, 2.9) ± (0, 0.2) mm. Apart from the torsion variance values in a few vessels, the variance values of all vessel characteristics remained near 1. The distribution of the radii of the symmetrical posterior cerebral arteries (PCA) and the angle values of the symmetrical posterior communicating arteries (PCo) demonstrated a certain correlation between the corresponding values of symmetrical vessels in the CoW. The data verified the stability of our methodology. Our method was appropriate for the analysis of large medical image datasets derived from the automated pipeline for populations. This method was applicable to
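
    Curvature and torsion along a fitted centreline follow from the standard Frenet formulas; the sketch below uses finite differences on sampled points of a synthetic helix standing in for a vessel (a B-spline fit would supply the same derivatives analytically).

      # Frenet curvature and torsion from sampled centreline points.
      import numpy as np

      t = np.linspace(0, 4 * np.pi, 400)
      r = np.column_stack([np.cos(t), np.sin(t), 0.2 * t])   # synthetic helix 'vessel'

      d1 = np.gradient(r, t, axis=0)
      d2 = np.gradient(d1, t, axis=0)
      d3 = np.gradient(d2, t, axis=0)

      c12 = np.cross(d1, d2)
      curvature = np.linalg.norm(c12, axis=1) / np.linalg.norm(d1, axis=1) ** 3
      torsion = np.einsum('ij,ij->i', c12, d3) / np.linalg.norm(c12, axis=1) ** 2
      # For this helix the exact values are 1/1.04 ~ 0.962 and 0.2/1.04 ~ 0.192.
      print(f'curvature ~ {curvature.mean():.3f}, torsion ~ {torsion.mean():.3f}')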

  7. Quantitative Data Analysis--In the Graduate Curriculum

    ERIC Educational Resources Information Center

    Albers, Michael J.

    2017-01-01

    A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of critical thinking for how to analyze the data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…

  8. Quantitative Auger analysis of Nb-Ge superconducting alloys

    SciTech Connect

    Buitrago, R.H.

    1980-01-01

    The feasibility of using Auger electron spectroscopy for quantitative analysis was investigated by studying Nb/sub 3/Ge thin-film Auger data with different approaches. A method based on elemental standards gave quantitative values consistent with reported Nb-Ge data. Alloy sputter yields were also calculated, and the results were consistent with those for pure elements.
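
    A minimal sketch of standards-based Auger quantitation, assuming relative sensitivity factors: atomic fractions follow X_i = (I_i/S_i) / sum_j (I_j/S_j). The intensities and factors below are invented for illustration.

      # Atomic fractions from relative sensitivity factors:
      # X_i = (I_i / S_i) / sum_j (I_j / S_j)
      intensities = {'Nb': 5200.0, 'Ge': 1400.0}        # peak heights, invented
      sensitivity = {'Nb': 0.68, 'Ge': 0.40}            # illustrative RSFs

      norm = {el: intensities[el] / sensitivity[el] for el in intensities}
      total = sum(norm.values())
      for el, v in norm.items():
          print(f'{el}: {100 * v / total:.1f} at.%')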

  9. Analysis of experimental hydrogen engine data and hydrogen vehicle performance and emissions simulation

    SciTech Connect

    Aceves, S.A.

    1996-10-01

    This paper reports the engine and vehicle simulation and analysis done at Lawrence Livermore National Laboratory (LLNL) as part of a joint optimized hydrogen engine development effort. Project participants are: Sandia National Laboratory; Los Alamos National Laboratory; and the University of Miami. Fuel cells are considered the ideal power source for future vehicles, due to their high efficiency and low emissions. However, extensive use of fuel cells in light-duty vehicles is likely to be years away, due to their high manufacturing cost. Hydrogen-fueled, spark-ignited, homogeneous-charge engines offer a near-term alternative to fuel cells. Hydrogen in a spark-ignited engine can be burned at very low equivalence ratios. NO{sub x} emissions can be reduced to less than 10 ppm without a catalyst. HC and CO emissions may result from oxidation of engine oil, but with proper design are negligible (a few ppm). Lean operation also results in increased indicated efficiency due to the thermodynamic properties of the gaseous mixture contained in the cylinder. The high effective octane number of hydrogen allows the use of a high compression ratio, further increasing engine efficiency. In this paper, a simplified engine model is used for predicting hydrogen engine efficiency and emissions. The model uses basic thermodynamic equations for the compression and expansion processes, along with an empirical correlation for heat transfer, to predict engine indicated efficiency. A friction correlation and a supercharger/turbocharger model are then used to calculate brake thermal efficiency. The model is validated with many experimental points obtained in a recent evaluation of a hydrogen research engine. The experimental data are used to adjust the empirical constants in the heat release rate and heat transfer correlation. The results indicate that hydrogen lean-burn spark-ignited engines can provide Equivalent Zero Emission Vehicle (EZEV) levels in either a series hybrid or a conventional automobile.
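
    The flavor of such a simplified model can be conveyed by an ideal Otto-cycle estimate degraded by assumed heat-transfer and friction terms; all parameter values below are illustrative assumptions, not those of the LLNL model.

      # Ideal Otto efficiency 1 - r**(1 - gamma), degraded by assumed losses.
      def brake_thermal_efficiency(cr, gamma=1.35, heat_loss=0.10, friction=0.07):
          eta_indicated = (1.0 - cr ** (1.0 - gamma)) * (1.0 - heat_loss)
          return eta_indicated - friction

      # Lean hydrogen mixtures keep gamma closer to that of air (~1.4) than
      # stoichiometric gasoline mixtures do, which raises the ideal efficiency.
      for cr in (10, 12, 14):
          print(cr, f'{brake_thermal_efficiency(cr):.3f}')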

  10. Hydrogen embrittlement II. Analysis of hydrogen-enhanced decohesion across (111) planes in α-Fe

    NASA Astrophysics Data System (ADS)

    Katzarov, Ivaylo H.; Paxton, Anthony T.

    2017-08-01

    This is the second of two papers that present a theoretical analysis of the phenomenon of hydrogen embrittlement of α-Fe. We make contact between the thermodynamic-kinetic continuum and cohesive zone models and the quantum-mechanical magnetic tight-binding approximation to interatomic forces. We are able to solve a coupled set of equations using quantum mechanically obtained atomistic data to follow the decohesion process in time as traction is applied to a hydrogen charged crystal and decohesion occurs between two (111) crystal planes. This scheme will be readily extended from transgranular to intergranular failure, although the complexities of the trapping sites in the cohesive zone associated with a grain boundary will greatly complicate the calculation of the configurational energy. Hydrogen-enhanced decohesion postulated widely in the field has not yet been demonstrated experimentally, although our calculations find a reduction in the ideal cohesive strength as a result of dissolved hydrogen in α-Fe from 30 to 22 GPa. Because of the well-known steep and nonlinear relation between plastic and ideal elastic work of fracture, this represents a very significant reduction in toughness as a result of a hydrogen concentration of less than ten atomic parts per million.

  11. Quantitative Analysis of Radar Returns from Insects

    NASA Technical Reports Server (NTRS)

    Riley, J. R.

    1979-01-01

    When the number of flying insects is low enough to permit their resolution as individual radar targets, quantitative estimates of their aerial density are developed. Accurate measurements of the heading distribution, using a rotating-polarization radar to enhance the wingbeat-frequency method of identification, are also presented.

  12. Some Epistemological Considerations Concerning Quantitative Analysis

    ERIC Educational Resources Information Center

    Dobrescu, Emilian

    2008-01-01

    This article presents the author's address at the 2007 "Journal of Applied Quantitative Methods" ("JAQM") prize awarding festivity. The festivity was included in the opening of the 4th International Conference on Applied Statistics, November 22, 2008, Bucharest, Romania. In the address, the author reflects on three theses that…

  14. Analysis of Hydrogen and Competing Technologies for Utility-Scale Energy Storage (Presentation)

    SciTech Connect

    Steward, D.

    2010-02-11

    Presentation about the National Renewable Energy Laboratory's analysis of hydrogen energy storage scenarios, including analysis framework, levelized cost comparison of hydrogen and competing technologies, analysis results, and conclusions drawn from the analysis.

  15. Radioisotopic neutron transmission spectrometry: Quantitative analysis by using partial least-squares method.

    PubMed

    Kim, Jong-Yun; Choi, Yong Suk; Park, Yong Joon; Jung, Sung-Hee

    2009-01-01

    Neutron spectrometry, based on the scattering of high-energy fast neutrons from a radioisotope and their slowing-down by light hydrogen atoms, is a useful technique for the non-destructive, quantitative measurement of hydrogen content because it has a large measuring volume and is not affected by temperature, pressure, pH value or color. The most common choice of radioisotope neutron source is (252)Cf or (241)Am-Be. In this study, (252)Cf with a neutron flux of 6.3x10(6) n/s has been used as an attractive neutron source because of its high neutron flux and weak radioactivity. Pulse-height neutron spectra have been obtained using an in-house-built radioisotopic neutron spectrometric system equipped with a (3)He detector and multi-channel analyzer, including a neutron shield. As a preliminary study, a polyethylene block (density of approximately 0.947 g/cc and area of 40 cm x 25 cm) was used for the determination of hydrogen content with multivariate calibration models, depending on the thickness of the block. Compared with the results obtained from a simple linear calibration model, the partial least-squares regression (PLSR) method offered better performance in the quantitative data analysis. It also revealed that the PLSR method in a neutron spectrometric system can be promising for the real-time, online monitoring of powder processes to determine the content of any type of molecule containing hydrogen nuclei.
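
    The multivariate calibration step can be sketched as follows. This is a toy stand-in, assuming synthetic spectra and scikit-learn's PLSRegression in place of whatever software the authors actually used; the spectrum size and hydrogen contents are invented.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

# X: pulse-height neutron spectra (one row per polyethylene block),
# y: known hydrogen contents. Synthetic stand-ins here; real spectra
# would come from the 3He detector / multi-channel analyzer.
rng = np.random.default_rng(0)
y = np.linspace(1.0, 10.0, 30)                        # e.g. relative H content
X = np.outer(y, rng.random(512)) + rng.normal(0, 0.05, (30, 512))

pls = PLSRegression(n_components=3)                   # number of latent variables
score = cross_val_score(pls, X, y, cv=5, scoring="r2").mean()
print(f"cross-validated R^2: {score:.3f}")

pls.fit(X, y)
print("predicted H content of first block:", pls.predict(X[:1])[0, 0])
```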

  16. Quantitative analysis of planetary reflectance spectra with principal components analysis

    NASA Technical Reports Server (NTRS)

    Johnson, P. E.; Smith, M. O.; Adams, J. B.

    1985-01-01

    A technique is presented for quantitative analysis of planetary reflectance spectra as mixtures of particles on microscopic and macroscopic scales using principal components analysis. This technique allows for determination of the endmembers being mixed, their abundance, and the scale of mixing, as well as other physical parameters. Eighteen lunar telescopic reflectance spectra of the Copernicus crater region, from 600 nm to 1800 nm in wavelength, are modeled in terms of five likely endmembers: mare basalt, mature mare soil, anorthosite, mature highland soil, and clinopyroxene. These endmembers were chosen from a similar analysis of 92 lunar soil and rock samples. The models fit the data to within 2 percent rms. It is found that the goodness of fit is marginally better for intimate mixing than for macroscopic mixing.
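
    The macroscopic (linear) part of such mixture modeling reduces to a constrained least-squares fit of endmember spectra to each measured spectrum; intimate mixing would instead be fit in single-scattering-albedo space. A toy version with invented endmember spectra and abundances, not the five lunar endmembers of the study:

```python
import numpy as np
from scipy.optimize import nnls

# Linear ("macroscopic") mixing sketch: fit a measured reflectance
# spectrum as a nonnegative combination of endmember spectra, then
# report abundances and the RMS misfit.
wavelengths = np.linspace(600, 1800, 120)                      # nm
E = np.vstack([  # rows below are made-up endmember spectra
    0.10 + 0.05 * (wavelengths / 1800),
    0.25 - 0.04 * np.exp(-((wavelengths - 950) / 120.0) ** 2),  # 1-um band
    0.35 + 0.02 * np.sin(wavelengths / 300.0),
]).T

true_f = np.array([0.5, 0.3, 0.2])
measured = E @ true_f + np.random.default_rng(1).normal(0, 0.002, len(wavelengths))

f, _ = nnls(E, measured)          # nonnegative abundances
f /= f.sum()                      # normalize so abundances sum to one
rms = np.sqrt(np.mean((E @ f - measured) ** 2))
print("abundances:", f.round(3), "rms:", rms)
```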

  18. Analysis of Two Quantitative Ultrasound Approaches.

    PubMed

    Muleki-Seya, Pauline; Han, Aiguo; Andre, Michael P; Erdman, John W; O'Brien, William D

    2017-09-01

    There are two well-known ultrasonic approaches for extracting sets of quantitative parameters: the Lizzi-Feleppa (LF) parameters: slope, intercept, and midband; and the quantitative ultrasound (QUS)-derived parameters: effective scatterer diameter (ESD) and effective acoustic concentration (EAC). In this study, the relation between the LF and QUS-derived parameters is studied theoretically and experimentally on ex vivo mouse livers. As expected from theory, LF slope is correlated with ESD ([Formula: see text]), and from experimental data, LF midband is correlated with EAC ([Formula: see text]). However, LF intercept is correlated with neither ESD ([Formula: see text]) nor EAC ([Formula: see text]). The unexpected correlation observed between LF slope and EAC ([Formula: see text]) likely results from the high correlation between ESD and EAC due to the inversion process. For liver fat percentage estimation, an important potential medical application, the parameters presenting the best correlation are EAC ([Formula: see text]) and LF midband ([Formula: see text]).

  19. Tracer-based laser-induced fluorescence measurement technique for quantitative fuel/air-ratio measurements in a hydrogen internal combustion engine.

    PubMed

    Blotevogel, Thomas; Hartmann, Matthias; Rottengruber, Hermann; Leipertz, Alfred

    2008-12-10

    A measurement technique for the quantitative investigation of mixture formation processes in hydrogen internal combustion engines (ICEs) has been developed using tracer-based laser-induced fluorescence (TLIF). The technique can be applied to both fired and motored engine operation. The quantitative TLIF fuel/air-ratio results have been verified by means of linear Raman scattering measurements. Exemplary results of the simultaneous investigation of mixture formation and combustion, obtained at an optically accessible hydrogen ICE, are shown.

  20. ANALYSIS OF AVAILABLE HYDROGEN DATA & ACCUMULATION OF HYDROGEN IN UNVENTED TRANSURANIC (TRU) DRUMS

    SciTech Connect

    DAYLEY, L

    2004-06-24

    This document provides a response to the second action required in the approval for the Justification for Continued Operations (JCO) Assay and Shipment of Transuranic (TRU) Waste Containers in 218-W-4C. The Waste Management Project continues to make progress toward shipping certified TRU waste to the Waste Isolation Pilot Plant (WIPP). As the existing inventory of TRU waste in the Central Waste Complex (CWC) storage buildings is shipped, and the uncovered inventory is removed from the trenches and prepared for shipment from the Hanford Site, the covered inventory of suspect TRU wastes must be retrieved and prepared for processing for shipment to WIPP. Accumulation of hydrogen in unvented TRU waste containers is a concern due to the possibility of explosive mixtures of hydrogen and oxygen. The frequency and consequence of these gas mixtures resulting in an explosion must be addressed. The purpose of this study is to recommend an approach and schedule for venting TRU waste containers in the low-level burial ground (LLBG) trenches in conjunction with TRU Retrieval Project activities. This study provides a detailed analysis of the expected probability of hydrogen gas accumulating in significant quantities in unvented drums. Hydrogen gas accumulation in TRU drums is presented and evaluated in three categories: hydrogen concentrations below 5 vol%; between 5 and 15 vol%; and above 15 vol%. The analysis is based on complex-wide experience with TRU waste drums, available experimental data, and evaluations of storage conditions. Data reviewed in this report include experience from the Idaho National Engineering and Environmental Laboratory (INEEL), Savannah River Site (SRS), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL), the Rocky Flats sites, the Matrix Depletion Program and the National Transportation and Packaging Program. Based on this analysis, as well as an assessment of the probability and

  1. Hydrogen sensor

    DOEpatents

    Duan, Yixiang; Jia, Quanxi; Cao, Wenqing

    2010-11-23

    A hydrogen sensor for detecting/quantitating hydrogen and hydrogen isotopes includes a sampling line and a microplasma generator that excites hydrogen from a gas sample and produces light emission from the excited hydrogen. A power supply provides power to the microplasma generator, and a spectrometer generates an emission spectrum from the light emission. A programmable computer is adapted for determining whether or not the gas sample includes hydrogen, and for quantitating the amount of hydrogen and/or hydrogen isotopes present in the gas sample.

  2. Geospatial analysis and seasonal changes in water-equivalent hydrogen in eastern equatorial Mars

    NASA Astrophysics Data System (ADS)

    Clevy, June Renee

    2014-10-01

    This dissertation describes the relationship between hydrogen abundance, as measured through epithermal neutron counts, and the topographic, geologic, and surficial features in the equatorial region of eastern Mars. In Chapter 1, I present an alternative method for resampling the epithermal neutron count data collected by the neutron spectrometer from Mars Odyssey's Gamma Ray Spectrometer suite. Chapter 2 provides a seasonal breakdown of mean and median epithermal neutron count rates and examines areas of static, seasonal, and episodic hydrogen enrichment. Armed with new maps of mean epithermal neutron count rates and derivative maps of weight percent water-equivalent hydrogen, I examine the spatial relationships between equatorial hydrogen concentrations and satellite-measured surface properties such as elevation, its derivatives slope and aspect, albedo, dust cover, geologic units, and valley networks in Chapter 3. The chapters in this dissertation represent a workflow from the development of the Water Equivalent Hydrogen dataset used in this research (Chapter 1), to an analysis of seasonal changes in the hydrogen signal (Chapter 2), and the relationships between these data and measurements of elevation, crustal thickness, surface composition, and geomorphology (Chapter 3). These investigations were made possible by the application of terrestrial geographic information science to planetary geology through Geographic Information Systems (GIS). Neighborhood processing allowed me to refine the spatial resolution of the epithermal neutron counts in the first chapter. Class frequency tables permitted the identification of changes over time in Chapter 2 and facilitated the identification of high- and low-variability areas. Finally, a quantitative process known as the Location Quotient, which builds upon frequency tables, was applied to identify more-frequent-than-expected combinations of hydrogen abundance and other martian data (e.g., elevation) for the purpose of
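
    The Location Quotient mentioned at the end is simply the ratio of a local share to a global share, with values above one indicating over-representation. The cell counts below are invented to show the arithmetic only.

```python
# Location-quotient sketch: how much more often does a given hydrogen-
# abundance class co-occur with a given elevation class than expected
# from the global class frequencies? All counts are invented.
cooccurrence = 120        # cells that are both "high WEH" and "low elevation"
high_weh_cells = 400      # all "high WEH" cells
low_elev_cells = 900      # all "low elevation" cells
total_cells = 5000

local_share = cooccurrence / high_weh_cells     # share of high-WEH cells that are low-lying
global_share = low_elev_cells / total_cells     # share of all cells that are low-lying
lq = local_share / global_share
print(f"location quotient = {lq:.2f}")          # ~1.67: over-represented pairing
```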

  3. Analysis of experimental hydrogen engine data and hydrogen vehicle performance and emissions simulation

    SciTech Connect

    Aceves, S.M.

    1996-09-01

    This paper reports the engine and vehicle simulation and analysis done at Lawrence Livermore (LLNL) as a part of a joint optimized hydrogen engine development effort. Project participants are: Sandia National Laboratory, California (SNLC), responsible for experimental evaluation; Los Alamos National Laboratory (LANL), responsible for detailed fluid mechanics engine evaluations; and the University of Miami, responsible for engine friction reduction. Fuel cells are considered the ideal power source for future vehicles, due to their high efficiency and low emissions. However, extensive use of fuel cells in light-duty vehicles is likely to be years away, due to their high manufacturing cost. Hydrogen-fueled, spark-ignited, homogeneous-charge engines offer a near-term alternative to fuel cells. Hydrogen in a spark-ignited engine can be burned at very low equivalence ratios, so that NO{sub x} emissions can be reduced to less than 10 ppm without a catalyst. HC and CO emissions may result from oxidation of engine oil but, with proper design, are negligible (a few ppm). Lean operation also results in increased indicated efficiency due to the thermodynamic properties of the gaseous mixture contained in the cylinder. The high effective octane number of hydrogen allows the use of a high compression ratio, further increasing engine efficiency.

  4. Structural and quantitative analysis of Equisetum alkaloids.

    PubMed

    Cramer, Luise; Ernst, Ludger; Lubienski, Marcus; Papke, Uli; Schiebel, Hans-Martin; Jerz, Gerold; Beuerle, Till

    2015-08-01

    Equisetum palustre L. is known for its toxicity to livestock. Several studies in the past have addressed the isolation and identification of the responsible alkaloids. So far, palustrine (1) and N(5)-formylpalustrine (2) are the known alkaloids of E. palustre. An HPLC-ESI-MS/MS method in combination with a simple sample work-up was developed to identify and quantitate Equisetum alkaloids. Besides the two known alkaloids, six related alkaloids were detected in different Equisetum samples. The structure of the alkaloid palustridiene (3) was derived by comprehensive 1D and 2D NMR experiments. N(5)-Acetylpalustrine (4) was also thoroughly characterized by NMR for the first time. The structure of N(5)-formylpalustridiene (5) is proposed based on mass spectrometry results. Twenty-two E. palustre samples were screened with this method, and in most cases the set of all eight alkaloids was detected in all parts of the plant. A high variability of alkaloid content and distribution was found depending on plant organ, plant origin and season, ranging from 88 to 597 mg/kg dried weight. However, palustrine (1) and palustridiene (3) always represented the main alkaloids. For the first time, a comprehensive identification, quantitation and distribution analysis of Equisetum alkaloids has been achieved.

  5. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  7. A kinetic model for quantitative evaluation of the effect of hydrogen and osmolarity on hydrogen production by Caldicellulosiruptor saccharolyticus

    PubMed Central

    2011-01-01

    Background: Caldicellulosiruptor saccharolyticus has attracted increased interest as an industrial hydrogen (H2) producer. The aim of the present study was to develop a kinetic growth model for this extreme thermophile. The model is based on Monod kinetics supplemented with the inhibitory effects of H2 and osmotic pressure, as well as the liquid-to-gas mass transfer of H2. Results: Mathematical expressions were developed to enable the simulation of microbial growth, substrate consumption and product formation. The model parameters were determined by fitting them to experimental data. The derived model corresponded well with experimental data from batch fermentations in which the stripping rates and substrate concentrations were varied. The model was used to simulate the inhibition of growth by H2 and solute concentrations, giving a critical dissolved H2 concentration of 2.2 mmol/L and a critical osmolarity of 0.27 to 0.29 mol/L. The inhibition by H2, being a function of the dissolved H2 concentration, was demonstrated to be mainly dependent on H2 productivity and the mass transfer rate. The latter can be improved by increasing the stripping rate, thereby allowing higher H2 productivity. The experimentally determined degree of oversaturation of dissolved H2 was 12 to 34 times the equilibrium concentration and was comparable to the values given by the model. Conclusions: The derived model is the first mechanistically based model for fermentative H2 production and provides useful information to improve the understanding of the growth behavior of C. saccharolyticus. The model can be used to determine optimal operating conditions for H2 production regarding the substrate concentration and the stripping rate. PMID:21914204
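
    A minimal sketch of a Monod-type model with dissolved-H2 inhibition and liquid-to-gas mass transfer, in the spirit of the model described: only the 2.2 mmol/L critical H2 concentration is taken from the abstract; every other parameter value, and the exact form of the inhibition term, is an illustrative guess.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (NOT the fitted values from the paper):
mu_max, Ks, Y_xs = 0.8, 1.0, 0.1     # 1/h, mmol/L, g biomass per mmol sugar
Y_hx, kLa, H_eq = 25.0, 10.0, 0.05   # mmol H2 per g biomass, 1/h, mmol/L
H_crit = 2.2                         # critical dissolved H2, mmol/L (abstract)

def rhs(t, state):
    X, S, H = state                  # biomass, substrate, dissolved H2
    mu = mu_max * S / (Ks + S) * max(0.0, 1.0 - H / H_crit)
    dX = mu * X
    dS = -mu * X / Y_xs
    dH = Y_hx * mu * X - kLa * (H - H_eq)   # production minus stripping
    return [dX, dS, dH]

sol = solve_ivp(rhs, (0.0, 20.0), [0.01, 20.0, H_eq], max_step=0.05)
print("final biomass, substrate, dissolved H2:", sol.y[:, -1].round(3))
```

    Raising kLa (the stripping rate) in this sketch lowers the quasi-steady dissolved H2 and so relieves the growth inhibition, which is the qualitative behavior the abstract reports.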

  8. Continuous quantitative local cerebral blood flow measurement. Calibration of thermal conductivity measurements by the hydrogen clearance method.

    PubMed

    Cusick, J F; Myklebust, J

    1980-01-01

    The capability of a miniaturized probe to measure local cerebral blood flow in a continuous and quantitative manner is described. Incorporating thermal conductivity measurements based on the isothermal principle with the hydrogen clearance method allows calibration of the thermal conductivity component in absolute terms. Evaluation of this system in 14 cats showed a linear relationship between the two measurement methods. The major limitation of this combined probe system is the need for routine intermittent recalibration, so that changes in tissue thermal conductivity induced by physiologic alterations during the experimental procedure may be recognized and resolved.

  9. Qualitative and quantitative analysis of endocytic recycling.

    PubMed

    Reineke, James B; Xie, Shuwei; Naslavsky, Naava; Caplan, Steve

    2015-01-01

    Endocytosis, which encompasses the internalization and sorting of plasma membrane (PM) lipids and proteins to distinct membrane-bound intracellular compartments, is a highly regulated and fundamental cellular process by which eukaryotic cells dynamically regulate their PM composition. Indeed, endocytosis is implicated in crucial cellular processes that include proliferation, migration, and cell division, as well as in the maintenance of tissue homeostasis, such as apical-basal polarity. Once PM constituents have been taken up into the cell, either via clathrin-dependent endocytosis (CDE) or clathrin-independent endocytosis (CIE), they typically have two fates: degradation through the late-endosomal/lysosomal pathway or return to the PM via endocytic recycling pathways. In this review, we detail experimental procedures that allow for both qualitative and quantitative assessment of the endocytic recycling of transmembrane proteins internalized by CDE and CIE, using the HeLa cervical cancer cell line as a model system.

  10. U.S. Department of Energy Hydrogen Storage Cost Analysis

    SciTech Connect

    Law, Karen; Rosenfeld, Jeffrey; Han, Vickie; Chan, Michael; Chiang, Helena; Leonard, Jon

    2013-03-11

    The overall objective of this project is to conduct cost analyses and estimate costs for on- and off-board hydrogen storage technologies under development by the U.S. Department of Energy (DOE) on a consistent, independent basis. This can help guide DOE and stakeholders toward the most promising research, development and commercialization pathways for hydrogen-fueled vehicles. A specific focus of the project is to estimate hydrogen storage system cost in high-volume production scenarios relative to the DOE target that was in place when this cost analysis was initiated. This report and its results reflect work conducted by TIAX between 2004 and 2012, including recent refinements and updates. The report provides a system-level evaluation of costs and performance for four broad categories of on-board hydrogen storage: (1) reversible on-board metal hydrides (e.g., magnesium hydride, sodium alanate); (2) regenerable off-board chemical hydrogen storage materials (e.g., hydrolysis of sodium borohydride, ammonia borane); (3) high-surface-area sorbents (e.g., carbon-based materials); and (4) advanced physical storage (e.g., 700-bar compressed, cryo-compressed and liquid hydrogen). Additionally, the off-board efficiency and processing costs of several hydrogen storage systems were evaluated and reported, including: (1) liquid carrier, (2) sodium borohydride, (3) ammonia borane, and (4) magnesium hydride. TIAX applied a bottom-up costing methodology customized to analyze and quantify the processes used in the manufacture of hydrogen storage systems. This methodology, used in conjunction with DFMA® software and other tools, developed costs for all major tank components, balance-of-tank, tank assembly, and system assembly. Based on this methodology, the projected on-board high-volume factory costs of the various analyzed hydrogen storage systems were estimated, as designed. Reductions in the key cost drivers may bring hydrogen storage system costs closer to the DOE target.

  11. Hydrogen and deuterium loss from the terrestrial atmosphere - A quantitative assessment of nonthermal escape fluxes

    NASA Technical Reports Server (NTRS)

    Yung, Yuk L.; Wen, Jun-Shan; Moses, Julianne I.; Landry, Bridget M.; Allen, Mark; Hsu, Kuang-Jung

    1989-01-01

    A comprehensive one-dimensional photochemical model extending from the middle atmosphere (50 km) to the exobase (432 km) has been used to study the escape of hydrogen and deuterium from the earth's atmosphere. The model incorporates recent advances in chemical kinetics as well as atmospheric observations by satellites, especially the Atmosphere Explorer C satellite. The results suggest that the escape fluxes of both H and D are limited by the upward transport of total hydrogen and total deuterium at the homopause. About one fourth of total hydrogen escape is thermal, the rest being nonthermal. It is shown that escape of D is nonthermal and that charge exchange and polar wind are important mechanisms for the nonthermal escape of H and D.

  12. Joint association analysis of bivariate quantitative and qualitative traits.

    PubMed

    Yuan, Mengdie; Diao, Guoqing

    2011-11-29

    Univariate genome-wide association analysis of quantitative and qualitative traits has been investigated extensively in the literature. In the presence of correlated phenotypes, it is more intuitive to analyze all phenotypes simultaneously. We describe an efficient likelihood-based approach for the joint association analysis of quantitative and qualitative traits in unrelated individuals. We assume a probit model for the qualitative trait, under which an unobserved latent variable and a prespecified threshold determine the value of the qualitative trait. To jointly model the quantitative and qualitative traits, we assume that the quantitative trait and the latent variable follow a bivariate normal distribution. The latent variable is allowed to be correlated with the quantitative phenotype. Simultaneous modeling of the quantitative and qualitative traits allows us to make more precise inference on the pleiotropic genetic effects. We derive likelihood ratio tests for testing the genetic effects. An application to the Genetic Analysis Workshop 17 data is provided. The new method yields reasonable power and meaningful results for the joint association analysis of the quantitative trait Q1 and the qualitative trait disease status at SNPs whose minor allele frequency is not too small.
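
    Under the described probit/bivariate-normal model, the joint likelihood for one individual factorizes into the marginal density of the quantitative trait times the conditional probability that the latent variable crosses the threshold. A sketch with illustrative parameter values (the function name and all numbers are invented; a likelihood ratio test would compare such log-likelihoods maximized with and without the SNP effect):

```python
import numpy as np
from scipy.stats import norm

def joint_loglik(y, disease, mu_y, sd_y, mu_l, rho, threshold):
    """Log-likelihood of (quantitative trait y, disease status) for one
    individual, with a unit-variance latent variable L and corr(Y, L) = rho."""
    # marginal density of the observed quantitative trait
    ll = norm.logpdf(y, mu_y, sd_y)
    # conditional law of the latent variable given Y = y
    cond_mean = mu_l + rho * (y - mu_y) / sd_y
    cond_sd = np.sqrt(1.0 - rho ** 2)
    p_case = norm.sf(threshold, cond_mean, cond_sd)   # P(L > threshold | y)
    ll += np.log(p_case if disease else 1.0 - p_case)
    return ll

print(joint_loglik(y=1.3, disease=1, mu_y=0.0, sd_y=1.0,
                   mu_l=0.0, rho=0.4, threshold=1.0))
```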

  13. Isotopic disproportionation during hydrogen isotopic analysis of nitrogen-bearing organic compounds

    USGS Publications Warehouse

    Nair, Sreejesh; Geilmann, Heike; Coplen, Tyler B.; Qi, Haiping; Gehre, Matthias; Schimmelmann, Arndt; Brand, Willi A.

    2015-01-01

    Rationale: High-precision hydrogen isotope ratio analysis of nitrogen-bearing organic materials using high-temperature conversion (HTC) techniques has proven troublesome in the past. Formation of reaction products other than molecular hydrogen (H2) has been suspected as a possible cause of incomplete H2 yield and hydrogen isotopic fractionation. Methods: The classical HTC reactor setup and a modified version including elemental chromium, both operated at temperatures in excess of 1400 °C, were compared using a selection of nitrogen-bearing organic compounds, including caffeine. A focus of the experiments was to avoid or suppress hydrogen cyanide (HCN) formation and to reach quantitative H2 yields. The technique was also optimized to provide acceptable sample throughput. Results: The classical HTC reaction of a number of selected compounds exhibited H2 yields from 60 to 90%. Yields close to 100% were measured for the experiments with the chromium-enhanced reactor. The δ2H values also were substantially different between the two types of experiments. For the majority of the compounds studied, a highly significant relationship was observed between the amount of missing H2 and the number of nitrogen atoms in the molecules, suggesting the pyrolytic formation of HCN as a byproduct. A similar linear relationship was found between the amount of missing H2 and the observed hydrogen isotopic result, reflecting isotopic fractionation. Conclusions: The classical HTC technique to produce H2 from organic materials using high temperatures in the presence of glassy carbon is not suitable for nitrogen-bearing compounds. Adding chromium to the reaction zone improves the yield to 100% in most cases. The initial formation of HCN is accompanied by a strong hydrogen isotope effect, with the observed hydrogen isotope results on H2 being substantially shifted to more negative δ2H values. The reaction can be understood as an initial disproportionation leading to H2 and HCN.

  14. Quantitative analysis of immobilized metalloenzymes by atomic absorption spectroscopy.

    PubMed

    Opwis, Klaus; Knittel, Dierk; Schollmeyer, Eckhard

    2004-12-01

    A new, sensitive assay for the quantitative determination of immobilized metal-containing enzymes has been developed using atomic absorption spectroscopy (AAS). In contrast with conventionally used indirect methods, the described quantitative AAS assay for metalloenzymes allows more exact analyses, because the carrier material with the immobilized enzyme is investigated directly. As an example, the validity and reliability of the method were examined by fixing the iron-containing enzyme catalase on cotton fabrics using different immobilization techniques. Sample preparation was carried out by dissolving the loaded fabrics in sulfuric acid before oxidising the residues with hydrogen peroxide. The iron concentrations were determined by flame atomic absorption spectrometry after calibration of the spectrometer with solutions of the free enzyme at different concentrations.

  15. Analysis of surface, subsurface, and bulk hydrogen in ZnO using nuclear reaction analysis

    SciTech Connect

    Traeger, F.; Kauer, M.; Woell, Ch.; Rogalla, D.; Becker, H.-W.

    2011-08-15

    Hydrogen concentrations in ZnO single crystals exposing different surfaces have been determined to be in the range of (0.02-0.04) at.%, with an error of {+-}0.01 at.%, using nuclear reaction analysis. In the subsurface region, the hydrogen concentration was found to be higher by up to a factor of 10. In contrast to the hydrogen in the bulk, part of the subsurface hydrogen is less strongly bound: it can be removed by heating to 550 deg. C and reaccommodated by loading with atomic hydrogen. Exposing the ZnO(1010) surface to water above room temperature or to atomic hydrogen leads to hydroxylation with the same coverage of hydrogen in both cases.

  16. Quantitative Analysis of HIV-1 Preintegration Complexes

    PubMed Central

    Engelman, Alan; Oztop, Ilker; Vandegraaff, Nick; Raghavendra, Nidhanapati K.

    2009-01-01

    Retroviral replication proceeds through the formation of a provirus, an integrated DNA copy of the viral RNA genome. The linear cDNA product of reverse transcription is the integration substrate and two different integrase activities, 3′ processing and DNA strand transfer, are required for provirus formation. Integrase nicks the cDNA ends adjacent to phylogenetically-conserved CA dinucleotides during 3′ processing. After nuclear entry and locating a suitable chromatin acceptor site, integrase joins the recessed 3′-OHs to the 5′-phosphates of a double-stranded staggered cut in the DNA target. Integrase functions in the context of a large nucleoprotein complex, called the preintegration complex (PIC), and PICs are analyzed to determine levels of integrase 3′ processing and DNA strand transfer activities that occur during acute virus infection. Denatured cDNA end regions are monitored by indirect end-labeling to measure the extent of 3′ processing. Native PICs can efficiently integrate their viral cDNA into exogenously added target DNA in vitro, and Southern blotting or nested PCR assays are used to quantify the resultant DNA strand transfer activity. This study details HIV-1 infection, PIC extraction, partial purification, and quantitative analyses of integrase 3′ processing and DNA strand transfer activities. PMID:19233280

  17. Quantitative multi-modal NDT data analysis

    SciTech Connect

    Heideklang, René; Shokouhi, Parisa

    2014-02-18

    A single NDT technique is often not adequate to provide assessments of the integrity of test objects with the required coverage or accuracy. In such situations, one often resorts to multi-modal testing, in which complementary and overlapping information from different NDT techniques is combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these are discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of eddy current, giant magnetoresistance (GMR) and thermography measurements on a reference metallic specimen with built-in grooves is presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity.

  18. Hydrogen measurement during steam oxidation using coupled thermogravimetric analysis and quadrupole mass spectrometry

    DOE PAGES

    Parkison, Adam J.; Nelson, Andrew Thomas

    2016-01-11

    An analytical technique is presented with the goal of measuring reaction kinetics during steam oxidation reactions for three cases in which obtaining kinetics information often requires a prohibitive amount of time and cost. The technique presented relies on coupling thermogravimetric analysis (TGA) with a quantitative hydrogen measurement technique using quadrupole mass spectrometry (QMS). The first case considered is in differentiating between the kinetics of steam oxidation reactions and those for simultaneously reacting gaseous impurities such as nitrogen or oxygen. The second case allows one to independently measure the kinetics of oxide and hydride formation for systems in which both of these reactions are known to take place during steam oxidation. The third case deals with measuring the kinetics of formation for competing volatile and non-volatile oxides during certain steam oxidation reactions. In order to meet the requirements of the coupled technique, a methodology is presented which attempts to provide quantitative measurement of hydrogen generation using QMS in the presence of an interfering fragmentation species, namely water vapor. This is achieved such that all calibrations and corrections are performed during the TGA baseline and steam oxidation programs, making system operation virtually identical to standard TGA. Benchmarking results showed a relative error in hydrogen measurement of 5.7–8.4% following the application of a correction factor. Lastly, suggestions are made for possible improvements to the presented technique so that it may be better applied to the three cases presented.
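
    The water-interference problem the methodology addresses can be shown in miniature: the m/z = 2 channel mixes sample hydrogen with H2+ fragments of water, so the water contribution is estimated from the m/z = 18 channel. All currents, the fragmentation ratio, the sensitivity and the correction factor below are invented placeholders, not the paper's calibration values.

```python
# Sketch of a water-fragmentation correction for QMS hydrogen measurement.
i_mz2 = 4.2e-9          # total ion current at m/z = 2 (A), assumed
i_mz18 = 6.0e-8         # ion current at m/z = 18, i.e. water (A), assumed
frag_ratio = 0.010      # (m/z=2 from water) / (m/z=18), from a calibration run
sensitivity = 5.0       # mol H2 per coulomb at m/z = 2, from calibration gas
correction = 1.07       # empirical correction factor (cf. the 5.7-8.4% error)

h2_current = i_mz2 - frag_ratio * i_mz18      # subtract water's contribution
h2_rate = correction * sensitivity * h2_current
print(f"hydrogen generation rate: {h2_rate:.3e} mol/s")
```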

  20. Analysis of data from spilling experiments performed with liquid hydrogen.

    PubMed

    Statharas, J C; Venetsanos, A G; Bartzis, J G; Würtz, J; Schmidtchen, U

    2000-10-02

    This work describes the modelling of liquid hydrogen release experiments using the ADREA-HF 3-D time-dependent finite volume code for cloud dispersion, jointly developed by DEMOKRITOS and JRC-Ispra. The experiments were performed by Battelle Ingenieurtechnik for BAM (Bundesanstalt für Materialforschung und -prüfung), Berlin, in the frame of the Euro-Quebec-Hydro-Hydrogen-Pilot-Project, and they mainly deal with LH2 near-ground releases between buildings. In the present study, experimental trial #5 was selected for simulation because it provided the largest number of sensor readings. The simulations illustrated the complex behaviour of LH2 dispersion in the presence of buildings, characterized by complicated wind patterns, plume back flow near the source, dense-gas behaviour at near range and significant buoyant behaviour at far range. The simulations showed the strong effect of ground heating on the LH2 dispersion. The model also revealed major features of the dispersion that had to do with the "dense" behaviour of the cold hydrogen and the buoyant behaviour of the "warming-up" gas, as well as the interaction of the building and the release wake. Such behaviour was in qualitative and even quantitative agreement with the experiment. The results are given in terms of concentration time series, scatter plots, contour plots, wind field vector plots and 3-D concentration wireframes. Given all experimental uncertainties, the model gives reasonable results for concentration levels.

  1. Hydrogen

    PubMed Central

    Bockris, John O’M.

    2011-01-01

    The idea of a “Hydrogen Economy” is that carbon-containing fuels should be replaced by hydrogen, thus eliminating air pollution and the growth of CO2 in the atmosphere. However, storage of a gas, its transport and its reconversion to electricity double the cost of H2 from the electrolyzer. Methanol made with CO2 from the atmosphere is a zero-carbon fuel created from inexhaustible components of the atmosphere. Extensive work on the splitting of water by bacteria shows that if wastes are used as the origin of feed for certain bacteria, the cost of hydrogen becomes lower than any yet known. The first creation of hydrogen and electricity from light was carried out in 1976 by Ohashi et al. at Flinders University in Australia. Improvements in knowledge of the structure of the semiconductor-solution system used in the solar breakdown of water have led to the discovery of surface states which take part in giving rise to hydrogen (Khan). Photoelectrocatalysis brought a tenfold increase in the efficiency of the photoproduction of hydrogen from water. The use of two-electrode cells, with p- and n-type semiconductors respectively, was first introduced by Uosaki in 1978. Most photoanodes decompose during photoelectrolysis. To avoid this, it has been necessary to create a transparent shield between the semiconductor, with its electronic properties, and the solution. In this way, 8.5% at 25 °C and 9.5% at 50 °C were reached in the photodissociation of water (GaP and InAs) by Kainthla and Barbara Zeleney in 1989. A large consortium has been funded by the US government at the California Institute of Technology under the direction of Nathan Lewis. The decomposition of water by light is the main aim of this group. Whether light will be the origin of the post-fossil-fuel supply of energy may be questionable, but the strongest program in this direction is likely to come from Cal. Tech. PMID:28824125

  2. Quantitative Spectral Analysis of Evolved Low-Mass Stars

    NASA Astrophysics Data System (ADS)

    Werner, Klaus; Rauch, Thomas; Kruk, Jeffrey W.

    2009-09-01

    The hydrogen deficiency in extremely hot post-AGB stars of spectral class PG1159 is probably caused by a (very) late helium-shell flash or an AGB final thermal pulse that consumes the hydrogen envelope, exposing the usually hidden intershell region. Thus, the photospheric element abundances of these stars allow us to draw conclusions about details of nuclear burning and mixing processes in the precursor AGB stars. We compare predicted element abundances to those determined by quantitative spectral analyses performed with advanced non-LTE model atmospheres. A good qualitative and quantitative agreement is found for many species (He, C, N, O, Ne, F, Si, Ar), but discrepancies for others (P, S, Fe) point at shortcomings in stellar evolution models for AGB stars. Almost all of the chemical trace elements in these hot stars can only be identified in the UV spectral range. The Far Ultraviolet Spectroscopic Explorer and the Hubble Space Telescope played a crucial role in this research.

  3. The quantitative failure of human reliability analysis

    SciTech Connect

    Bennett, C.T.

    1995-07-01

    This philosophical treatise argues the merits of human reliability analysis (HRA) in the context of the nuclear power industry. In it, the author attacks historic and current HRA as having failed to inform the policy makers who make risk-based decisions about the contribution of humans to systems performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, one that advocates a systems analysis process employing cogent heuristics when using opinion, and that tempers itself with rational debate over the weight given to subjective and empirical probabilities.

  4. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy.

    PubMed

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H; Jacobsen, Christina; Vainer, Ben

    2016-01-01

    The opportunity offered by whole-slide scanners of automated histological analysis implies an ever-increasing importance of digital pathology. To go beyond the capabilities of conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis, including staining, may benefit. This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole-slide scanner and an image-based quantitation algorithm are presented. It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of the above-mentioned components. The digital quantitations are verified by comparison to point-grid quantitations performed on the microsections after Van Gieson staining. The presented method is aptly described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework.

  5. Quantitative and Sociological Analysis of Blog Networks

    NASA Astrophysics Data System (ADS)

    Bachnik, W.; Szymczyk, S.; Leszczynski, S.; Podsiadlo, R.; Rymszewicz, E.; Kurylo, L.; Makowiec, D.; Bykowska, B.

    2005-10-01

    This paper examines the emerging phenomenon of blogging, using three different Polish blogging services as the base of the research. The authors show that blog networks share the characteristics of complex networks (gamma coefficients, small worlds, cliques, etc.). Elements of sociometric analysis were used to demonstrate the existence of social structures in the blog networks.

  6. Quantitative analysis of Li by PIGE technique

    NASA Astrophysics Data System (ADS)

    Fonseca, M.; Mateus, R.; Santos, C.; Cruz, J.; Silva, H.; Luis, H.; Martins, L.; Jesus, A. P.

    2017-09-01

    In this work, the cross section of the reaction 7Li(p,pγ)7Li (γ: 478 keV) was measured over the proton energy range 2.0-4.2 MeV. The measurements were carried out at the 3 MV Tandem Accelerator at the CTN/IST Laboratory in Lisbon. To validate the obtained results, calculated gamma-ray yields were compared, at several proton energy values, with experimental yields for thick samples made of inorganic compounds containing lithium. In order to quantify the light elements present in the samples, we used a standard-free method for PIGE in thick samples, based on the Emitted Radiation Yield Analysis (ERYA) code, which integrates the nuclear-reaction excitation function along the depth of the sample. We also demonstrated the capability of the technique for the analysis of Li ores, such as spodumene, lithium muscovite and holmquistite, and of Li alloys for plasma-facing materials, showing that this is a reliable and accurate method for PIGE analysis of Li in thick samples.
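
    The standard-free idea behind ERYA is that the thick-target gamma yield is the excitation function integrated over depth, i.e. over energy via the stopping power. A toy version follows; the shapes of sigma(E) and S(E) are invented stand-ins, whereas ERYA itself uses measured cross sections and stopping-power databases.

```python
import numpy as np

# Thick-target PIGE yield sketch: Y(E0) ~ n_Li * integral of sigma(E)/S(E)
# from 0 (here, the lower edge of the measured range) up to the beam energy E0.
E = np.linspace(2.0, 4.2, 500)                         # proton energy, MeV
sigma = 0.1 + 0.4 * np.exp(-((E - 3.0) / 0.4) ** 2)    # mb, invented shape
S = 40.0 - 5.0 * (E - 2.0)                             # stopping power, arbitrary units

def thick_target_yield(E0, n_li_fraction):
    """Relative gamma yield for beam energy E0 and Li atomic fraction."""
    mask = E <= E0
    return n_li_fraction * np.trapz(sigma[mask] / S[mask], E[mask])

# Relative yields at two beam energies for a sample with 5 at.% Li:
print(thick_target_yield(3.5, 0.05), thick_target_yield(4.2, 0.05))
```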

  7. A quantitative analysis of the reactions involved in stratospheric ozone depletion in the polar vortex core

    NASA Astrophysics Data System (ADS)

    Wohltmann, Ingo; Lehmann, Ralph; Rex, Markus

    2017-09-01

    We present a quantitative analysis of the chemical reactions involved in polar ozone depletion in the stratosphere and of the relevant reaction pathways and cycles. While the reactions involved in polar ozone depletion are well known, quantitative estimates of the importance of individual reactions or reaction cycles are rare. In particular, there is no comprehensive and quantitative study of the reaction rates and cycles averaged over the polar vortex under conditions of heterogeneous chemistry so far. We show time series of reaction rates averaged over the core of the polar vortex in winter and spring for all relevant reactions and indicate which reaction pathways and cycles are responsible for the vortex-averaged net change of the key species involved in ozone depletion, i.e., ozone, chlorine species (ClOx, HCl, ClONO2), bromine species, nitrogen species (HNO3, NOx) and hydrogen species (HOx). For clarity, we focus on one Arctic winter (2004-2005) and one Antarctic winter (2006) in a layer in the lower stratosphere around 54 hPa and show results for additional pressure levels and winters in the Supplement. Mixing ratios and reaction rates are obtained from runs of the ATLAS Lagrangian chemistry and transport model (CTM) driven by the European Centre for Medium-Range Weather Forecasts (ECMWF) ERA-Interim reanalysis data. An emphasis is put on the partitioning of the relevant chemical families (nitrogen, hydrogen, chlorine, bromine and odd oxygen) and activation and deactivation of chlorine.

  8. Quantitative Analysis of Immunohistochemistry in Melanoma Tumors.

    PubMed

    Lilyquist, Jenna; White, Kirsten Anne Meyer; Lee, Rebecca J; Philips, Genevieve K; Hughes, Christopher R; Torres, Salina M

    2017-04-01

    Identification of positive staining is often qualitative and subjective. This is particularly troublesome in pigmented melanoma lesions, because melanin is difficult to distinguish from the brown stain resulting from immunohistochemistry (IHC) using horseradish peroxidase developed with 3,3'-diaminobenzidine (HRP-DAB). We sought to identify and quantify positive staining, particularly in melanoma lesions. We visualized G-protein-coupled estrogen receptor (GPER) expression developed with HRP-DAB and counterstained with Azure B (which stains melanin) in melanoma tissue sections (n = 3). Matched sections (n = 3), along with 22 unmatched sections, were stained only with Azure B as a control. Breast tissue (n = 1) was used as a positive HRP-DAB control. Images of the stained tissues were generated using a Nuance Spectral Imaging Camera. Analysis of the images was performed using the Nuance Spectral Imaging software and SlideBook. Data were analyzed using a Kruskal-Wallis one-way analysis of variance (ANOVA). We showed that a pigmented melanoma tissue doubly stained with anti-GPER HRP-DAB and Azure B can be unmixed using spectra derived from a matched Azure B-only section and an anti-GPER HRP-DAB control. We unmixed each of the melanoma lesions using each of the Azure B spectra, evaluated the mean intensity of positive staining, and examined the distribution of the mean intensities (P = .73; Kruskal-Wallis). These results suggest that this method does not require a matched Azure B-only control section for every melanoma lesion, allowing precious tissues to be conserved for other studies. Importantly, this quantification method reduces the subjectivity of protein expression analysis and provides a valuable tool for accurate evaluation, particularly of pigmented tissues.
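
    Spectral unmixing of this kind is, at each pixel, a nonnegative least-squares fit of the measured spectrum to endmember spectra (here, DAB and Azure B). The endmember shapes below are synthetic placeholders; in the study they come from the single-stain control sections.

```python
import numpy as np
from scipy.optimize import nnls

# Per-pixel unmixing sketch: measured spectrum = w1 * DAB + w2 * AzureB.
bands = np.linspace(420, 720, 31)                   # wavelength samples, nm
dab = np.exp(-((bands - 460) / 60.0) ** 2)          # invented DAB spectral shape
azure = np.exp(-((bands - 630) / 50.0) ** 2)        # invented Azure B shape
A = np.column_stack([dab, azure])                   # endmember matrix

pixel = 0.7 * dab + 0.4 * azure                     # a doubly stained pixel
weights, residual = nnls(A, pixel)                  # nonnegative fit
print("DAB, AzureB weights:", weights.round(3), "residual:", residual)
```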

  9. Chromatic Image Analysis For Quantitative Thermal Mapping

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1995-01-01

    Chromatic image analysis system (CIAS) developed for use in noncontact measurements of temperatures on aerothermodynamic models in hypersonic wind tunnels. Based on concept of temperature coupled to shift in color spectrum for optical measurement. Video camera images fluorescence emitted by phosphor-coated model at two wavelengths. Temperature map of model then computed from relative brightnesses in video images of model at those wavelengths. Eliminates need for intrusive, time-consuming, contact temperature measurements by gauges, making it possible to map temperatures on complex surfaces in timely manner and at reduced cost.

  10. Influence of corrosion layers on quantitative analysis

    NASA Astrophysics Data System (ADS)

    Denker, A.; Bohne, W.; Opitz-Coutureau, J.; Rauschenberg, J.; Röhrich, J.; Strub, E.

    2005-09-01

    Art historians and restorers in charge of ancient metal objects are often reluctant to remove the corrosion layer that has evolved over time, as this would change the appearance of the artefact dramatically. Therefore, when an elemental analysis of such objects is required, it has to be done by penetrating the corrosion layer. In this work, the influence of corrosion was studied on Chinese and Roman coins for which removal of the oxidized material was possible. Measurements on spots with and without corrosion are presented and the results discussed.

  11. Quantitative Analysis in Nuclear Medicine Imaging

    NASA Astrophysics Data System (ADS)

    Zaidi, Habib

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy has only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases.

  12. Segmentation and Quantitative Analysis of Epithelial Tissues.

    PubMed

    Aigouy, Benoit; Umetsu, Daiki; Eaton, Suzanne

    2016-01-01

    Epithelia are tissues that regulate exchanges with the environment. They are very dynamic and can acquire virtually any shape; at the cellular level, they are composed of cells tightly connected by junctions. Most often, epithelia are amenable to live imaging; however, the large number of cells composing an epithelium and the absence of informatics tools dedicated to epithelial analysis have largely prevented tissue-scale studies. Here we present Tissue Analyzer, a free tool that can be used to segment and analyze epithelial cells and monitor tissue dynamics.

  13. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    ERIC Educational Resources Information Center

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  15. Hydrogen and Water: An Engineering, Economic and Environmental Analysis

    SciTech Connect

    Simon, A J; Daily, W; White, R G

    2010-01-06

    The multi-year program plan for the Department of Energy's Hydrogen and Fuel Cells Technology Program (USDOE, 2007a) calls for the development of system models to determine economic, environmental and cross-cutting impacts of the transition to a hydrogen economy. One component of the hydrogen production and delivery chain is water; water's use and disposal can incur costs and environmental consequences for almost any industrial product. It has become increasingly clear that due to factors such as competing water demands and climate change, the potential for a water-constrained world is real. Thus, any future hydrogen economy will need to be constructed so that any associated water impacts are minimized. This, in turn, requires the analysis and comparison of specific hydrogen production schemes in terms of their water use. Broadly speaking, two types of water are used in hydrogen production: process water and cooling water. In the production plant, process water is used as a direct input for the conversion processes (e.g., steam for steam methane reforming (SMR), water for electrolysis). Cooling water, by distinction, is used indirectly to cool related fluids or equipment, and is an important factor in making plant processes efficient and reliable. Hydrogen production further relies on water used indirectly to generate other feedstocks required by a hydrogen plant. This second-order indirect water is referred to here as 'embedded' water. For example, electricity production uses significant quantities of water; this 'thermoelectric cooling' contributes significantly to the total water footprint of the hydrogen production chain. A comprehensive systems analysis of the hydrogen economy includes the aggregate of the water intensities from every step in the production chain, including direct, indirect, and embedded water. Process and cooling waters have distinct technical quality requirements. Process water, which is typically high purity (limited dissolved

  16. Quantitative Analysis of Seismicity in Iran

    NASA Astrophysics Data System (ADS)

    Raeesi, Mohammad; Zarifi, Zoya; Nilfouroushan, Faramarz; Boroujeni, Samar Amini; Tiampo, Kristy

    2016-12-01

    We use historical and recent major earthquakes and GPS geodetic data to compute the seismic strain rate, geodetic slip deficit, static stress drop, the parameters of the magnitude-frequency distribution and the geodetic strain rate in the Iranian Plateau, in order to identify seismically mature fault segments and regions. Our analysis suggests that 11 fault segments are in the mature stage of the earthquake cycle, with the possibility of generating major earthquakes. These faults are primarily located in the north and east of Iran. Four seismically mature regions in southern Iran with the potential for damaging strong earthquakes are also identified. We also delineate four additional fault segments in Iran that can generate major earthquakes without robust clues to their maturity. The most important fault segment in this study is the strike-slip system near the capital city of Tehran, with the potential to cause more than one million fatalities.

  18. Quantitative analysis of heart rate variability

    NASA Astrophysics Data System (ADS)

    Kurths, J.; Voss, A.; Saparin, P.; Witt, A.; Kleiner, H. J.; Wessel, N.

    1995-03-01

    In the modern industrialized countries, several hundred thousand people die every year due to sudden cardiac death. The individual risk for sudden cardiac death cannot be defined precisely by commonly available, noninvasive diagnostic tools such as Holter monitoring, highly amplified ECG, and traditional linear analysis of heart rate variability (HRV). Therefore, we apply some rather unconventional methods of nonlinear dynamics to analyze the HRV. In particular, some complexity measures that are based on symbolic dynamics, as well as a new measure, the renormalized entropy, detect abnormalities in the HRV of several patients who had been classified in the low-risk group by traditional methods. A combination of these complexity measures with the parameters in the frequency domain seems to be a promising way to obtain a more precise definition of the individual risk. These findings have to be validated on a representative number of patients.
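
    A toy version of a symbolic-dynamics complexity measure makes the idea concrete. This sketch is not the authors' method; the alphabet, thresholds, and word length are illustrative:

        import numpy as np
        from collections import Counter

        def symbolic_word_entropy(rr, word_len=3):
            # Symbolize each beat relative to the series mean (4 symbols:
            # strongly/weakly below or above), then take the Shannon
            # entropy of the distribution of short symbol words.
            mu = np.mean(rr)
            sym = np.digitize(rr, [0.96 * mu, mu, 1.04 * mu])
            words = [tuple(sym[i:i + word_len])
                     for i in range(len(sym) - word_len + 1)]
            p = np.array(list(Counter(words).values()), dtype=float)
            p /= p.sum()
            return -np.sum(p * np.log2(p))

        rr = np.random.normal(0.8, 0.05, 1000)   # synthetic RR intervals (s)
        print(symbolic_word_entropy(rr))         # lower = more regular rhythm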

  19. Economic Analysis of Hydrogen Production from Wind: Preprint

    SciTech Connect

    Levene, J. I.

    2005-05-01

    The purpose of this analysis is to determine the cost of using wind energy to produce hydrogen for use as a transportation fuel. This analysis assumes that a market exists for 50,000 kg of hydrogen per day produced from wind at the wind site; only production costs to the front gate are included, no delivery or dispensing costs are included. Three different scenarios are examined: near term, representing technology currently available in 2005; mid term, which represents technological improvements and price reductions in the next 5-10 years; and long term, which represents the best technology gains and price reductions surmised by industry at this point, covering the next 10-25 years.

  20. Hydrogen-fueled scramjets: Potential for detailed combustor analysis

    NASA Technical Reports Server (NTRS)

    Beach, H. L., Jr.

    1976-01-01

    Combustion research related to hypersonic scramjet (supersonic combustion ramjet) propulsion is discussed from the analytical point of view. Because the fuel is gaseous hydrogen, mixing is single phase and the chemical kinetics are well known; therefore, the potential for analysis is good relative to hydrocarbon-fueled engines. Recent progress in applying two- and three-dimensional analytical techniques to mixing and reacting flows indicates cause for optimism and identifies several areas for continuing effort.

  1. Hydrogen embrittlement I. Analysis of hydrogen-enhanced localized plasticity: Effect of hydrogen on the velocity of screw dislocations in α-Fe

    NASA Astrophysics Data System (ADS)

    Katzarov, Ivaylo H.; Pashov, Dimitar L.; Paxton, Anthony T.

    2017-08-01

    We demonstrate a kinetic Monte Carlo simulation tool, based on published data using first-principles quantum mechanics, applied to answer the question: under which conditions of stress, temperature, and nominal hydrogen concentration does the presence of hydrogen in iron increase or decrease the screw dislocation velocity? Furthermore, we examine the conditions under which hydrogen-induced shear localization is likely to occur. Our simulations yield quantitative data on dislocation velocity and the ranges of hydrogen concentration within which a large gradient of velocity as a function of concentration is expected to be observed and thereby contribute to a self-perpetuating localization of plasticity—a phenomenon that has been linked to hydrogen-induced fracture and fatigue failure in ultrahigh strength steel. We predict the effect of hydrogen in generating debris made up of edge dipoles trailing in the wake of gliding screw dislocations and their role in pinning. We also simulate the competing effects of softening by enhanced kink-pair generation and hardening by solute pinning. Our simulations act as a bridge between first-principles quantum mechanics and discrete dislocation dynamics, and at the same time offer the prospect of a fully physics-based dislocation dynamics method.
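
    The rate-selection core of a kinetic Monte Carlo simulation is compact. The sketch below is a generic Gillespie-style step with invented rates, not the authors' first-principles-parameterized tool:

        import numpy as np

        def kmc_step(rates, rng):
            # Pick an event with probability proportional to its rate and
            # advance the clock by an exponential waiting time.
            total = rates.sum()
            event = rng.choice(len(rates), p=rates / total)
            return event, rng.exponential(1.0 / total)

        # Toy competition between kink-pair nucleation on a screw
        # dislocation and escape of a trapped hydrogen atom.
        rng = np.random.default_rng(1)
        rates = np.array([2.0e3, 5.0e2])     # events per second (invented)
        t = 0.0
        n_kink = 0
        for _ in range(10000):
            event, dt = kmc_step(rates, rng)
            t += dt
            n_kink += (event == 0)
        print(n_kink / t)                    # effective kink-pair rate (~2e3/s)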

  2. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability demonstrated no effect of culture media or plate-count technique on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was established. The most appropriate method for statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors but also their interactions to be estimated. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate-count methods.
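
    Combining independent relative-uncertainty components follows the usual root-sum-of-squares rule. A minimal sketch with invented component values (the paper's figures are not reproduced here):

        import math

        def combined_relative_uncertainty(components):
            # GUM-style combination of independent relative uncertainties (%).
            return math.sqrt(sum(u ** 2 for u in components))

        # Hypothetical contributions: microorganism type, product matrix,
        # reading/interpretation error.
        print(combined_relative_uncertainty([20.0, 15.0, 18.0]))   # ~30.8%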

  3. Quantiprot - a Python package for quantitative analysis of protein sequences.

    PubMed

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of this approach is that quantitative properties define a multidimensional solution space in which sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams, and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by a model can be compared to actually observed sequences.
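
    One of the characteristics mentioned, the Zipf's law coefficient of n-gram frequencies, fits in a few lines. This is a generic sketch, not Quantiprot's API:

        import numpy as np
        from collections import Counter

        def zipf_coefficient(seq, n=2):
            # Slope of log(frequency) vs. log(rank) for n-gram counts.
            counts = Counter(seq[i:i + n] for i in range(len(seq) - n + 1))
            freqs = np.array(sorted(counts.values(), reverse=True), float)
            ranks = np.arange(1, len(freqs) + 1)
            slope, _ = np.polyfit(np.log(ranks), np.log(freqs), 1)
            return -slope

        rng = np.random.default_rng(0)
        aas = list("ACDEFGHIKLMNPQRSTVWY")
        seq = "".join(rng.choice(aas, size=5000, p=rng.dirichlet(np.ones(20))))
        print(zipf_coefficient(seq))   # synthetic sequence, illustrative only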

  4. Control of separation and quantitative analysis by GC-FTIR

    NASA Astrophysics Data System (ADS)

    Semmoud, A.; Huvenne, Jean P.; Legrand, P.

    1992-03-01

    Software for 3-D representations of 'Absorbance-Wavenumber-Retention time' data is used to control the quality of the GC separation. Spectral information given by the FTIR detection allows the user to be sure that a chromatographic peak is 'pure.' The analysis of peppermint essential oil is presented as an example. This assurance is absolutely required for quantitative applications. Under these conditions, we have worked out a quantitative analysis of caffeine. Correlation coefficients between integrated absorbance measurements and caffeine concentration are discussed at two steps of the data treatment.

  5. Quantitative flow cytometric analysis of membrane antigen expression.

    PubMed

    D'hautcourt, Jean-Luc

    2002-11-01

    Immunological analysis for cell antigens has been performed by flow cytometry in a qualitative fashion for over thirty years. During that time it has become increasingly apparent that quantitative measurements such as number of antigens per cell provide unique and useful information. This unit on quantitative flow cytometry (QFCM) describes the most commonly used protocols, both direct and indirect, and the major methods of analysis for the number of antibody binding sites on a cell or particle. Practical applications include detection of antigen under- or overexpression in hematological malignancies, distinguishing between B cell lymphoproliferative disorders, and precise diagnosis of certain rare diseases.

  6. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
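
    The effect of imperfect coverage is easy to see in a toy two-component model. This is a textbook-style approximation with invented rates, not the paper's digraph model:

        import math

        def duplex_unreliability(lam, c, t):
            # An uncovered first fault (probability 1 - c) defeats the
            # redundancy outright; covered faults only fail the system
            # when both components are lost.
            p = 1.0 - math.exp(-lam * t)       # component failure prob.
            any_fault = 1.0 - (1.0 - p) ** 2
            return (1.0 - c) * any_fault + c * p ** 2

        for c in (1.0, 0.99):                  # perfect vs. 99% coverage
            print(c, duplex_unreliability(lam=1e-4, c=c, t=10.0))

    Even with highly reliable components, the 99% coverage case is dominated by the uncovered-fault term, which is the qualitative point of including coverage in the reliability analysis.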

  8. Issues in Quantitative Analysis of Ultraviolet Imager (UV) Data: Airglow

    NASA Technical Reports Server (NTRS)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.

  10. Macro-System Model for Hydrogen Energy Systems Analysis in Transportation: Preprint

    SciTech Connect

    Diakov, V.; Ruth, M.; Sa, T. J.; Goldsby, M. E.

    2012-06-01

    The Hydrogen Macro System Model (MSM) is a simulation tool that links existing and emerging hydrogen-related models to perform rapid, cross-cutting analysis. It allows analysis of the economics, primary energy-source requirements, and emissions of hydrogen production and delivery pathways.

  11. Hydrogen Trailer Storage Facility (Building 878). Consequence analysis

    SciTech Connect

    Banda, Z.; Wood, C.L.

    1994-12-01

    The Department of Energy Order 5500.3A requires facility-specific hazards assessments be prepared, maintained, and used for emergency planning purposes. This consequence analysis documents the impact that a hydrogen accident could have on employees, the general public, and nearby facilities. The computer model ARCHIE was utilized to determine discharge rates, toxic vapor dispersion analyses, flammable vapor cloud hazards, explosion hazards, and flame jets for the Hydrogen Trailer Storage Facility located at Building 878. To determine overpressurization effects, hand calculations derived from the Department of the Air Force manual, "Structures to Resist the Effects of Accidental Explosions," were utilized. The greatest distances at which a postulated facility event will produce the Lower Flammability and Lower Detonation Levels are 1,721 feet and 882 feet, respectively. The greatest distance at which 10.0 psi overpressure (i.e., total building destruction) is reached is 153 feet.

  12. Insights from Hydrogen Refueling Station Manufacturing Competitiveness Analysis

    SciTech Connect

    Mayyas, Ahmad

    2015-12-18

    In work for the Clean Energy Manufacturing Analysis Center (CEMAC), NREL is currently collaborating with Great Lakes Wind Network on a comprehensive hydrogen refueling station (HRS) manufacturing competitiveness and supply chain analysis. In this project, CEMAC will be looking at several metrics that will facilitate understanding of the interactions between and within the HRS supply chain; these metrics include innovation potential, intellectual property, learning curves, related industries and clustering, existing supply chains, ease of doing business, and regulations and safety. This presentation to the Fuel Cell Seminar and Energy Exposition 2015 highlights initial findings from CEMAC's analysis.

  13. Estimation of Hydrogen-Exchange Protection Factors from MD Simulation Based on Amide Hydrogen Bonding Analysis.

    PubMed

    Park, In-Hee; Venable, John D; Steckler, Caitlin; Cellitti, Susan E; Lesley, Scott A; Spraggon, Glen; Brock, Ansgar

    2015-09-28

    Hydrogen exchange (HX) studies have provided critical insight into our understanding of protein folding, structure, and dynamics. More recently, hydrogen exchange mass spectrometry (HX-MS) has become a widely applicable tool for HX studies. The interpretation of the wealth of data generated by HX-MS experiments as well as other HX methods would greatly benefit from the availability of exchange predictions derived from structures or models for comparison with experiment. Most reported computational HX modeling studies have employed solvent-accessible-surface-area based metrics in attempts to interpret HX data on the basis of structures or models. In this study, a computational HX-MS prediction method based on classification of the amide hydrogen bonding modes mimicking the local unfolding model is demonstrated. Analysis of the NH bonding configurations from molecular dynamics (MD) simulation snapshots is used to determine partitioning over bonded and nonbonded NH states and is directly mapped into a protection factor (PF) using a logistic growth function. Predicted PFs are then used for calculating deuteration values of peptides and compared with experimental data. Hydrogen exchange MS data for fatty acid synthase thioesterase (FAS-TE) collected for a range of pHs and temperatures was used for detailed evaluation of the approach. High correlation between prediction and experiment for observable fragment peptides is observed in the FAS-TE and additional benchmarking systems that included various apo/holo proteins for which literature data were available. In addition, it is shown that HX modeling can improve experimental resolution through decomposition of in-exchange curves into rate classes, which correlate with prediction from MD. Successful rate class decompositions provide further evidence that the presented approach captures the underlying physical processes correctly at the single residue level. This assessment is further strengthened in a comparison of
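
    The two mappings at the heart of the method, from bonded fraction to protection factor and from protection factor to deuterium uptake, can be sketched directly. All parameter values below are illustrative, not the fitted ones:

        import numpy as np

        def protection_factor(f_bonded, pf_max=1e7, k=10.0, f0=0.85):
            # Logistic growth curve mapping the fraction of MD snapshots in
            # which the amide NH is hydrogen bonded to a protection factor.
            return 1.0 + (pf_max - 1.0) / (1.0 + np.exp(-k * (f_bonded - f0)))

        def peptide_deuteration(k_int, pf, t):
            # Sum of single-residue exponentials with rates k_int / PF.
            k_obs = np.asarray(k_int) / np.asarray(pf)
            return np.sum(1.0 - np.exp(-k_obs * t))

        k_int = [10.0, 3.0, 0.5]                 # intrinsic rates, 1/min
        pf = protection_factor(np.array([0.2, 0.7, 0.95]))
        print(peptide_deuteration(k_int, pf, t=10.0))   # deuterons at 10 min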

  15. CUMULATIVE RISK ASSESSMENT: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS

    EPA Science Inventory

    INTRODUCTION: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS FOR CUMULATIVE RISK

    Hugh A. Barton1 and Carey N. Pope2
    1US EPA, Office of Research and Development, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC
    2Department of...

  17. Position-Specific Hydrogen and Carbon Isotope Fractionations of Light Hydrocarbons by Quantitative NMR

    NASA Astrophysics Data System (ADS)

    Liu, C.; Mcgovern, G. P.; Horita, J.

    2015-12-01

    Traditional isotope ratio mass spectrometry methods for measuring the 2H/1H and 13C/12C ratios of organic molecules provide only average isotopic values for whole molecules. In the measurement process, valuable information on position-specific isotope fractionation (PSIF) between non-equivalent H and C positions is lost, although PSIF can provide very useful additional information about the origins and history of organic molecules. Quantitative nuclear magnetic resonance (NMR) spectrometry can measure the 2H and 13C PSIF of organic molecules nondestructively. The 2H and 13C signals from different positions of a given molecule show up as distinctive peaks in an NMR spectrum, and their peak areas are proportional to the 2H and 13C populations at each position. Moreover, quantitative NMR can be applied to a wide variety of organic molecules. We have been developing quantitative NMR methods to determine the 2H and 13C PSIF of light hydrocarbons (propane, butane, and pentane), using J-Young and custom-made high-pressure NMR cells. With careful conditioning of the NMR spectrometer (e.g. tuning, shimming) and effective 1H-13C decoupling, precision better than ±10‰ (2H) and ±1‰ (13C) is readily attainable after several hours of acquisition. Measurement time depends on the relaxation time of the nucleus of interest and the total number of scans needed for high signal-to-noise ratios. Our data for commercial, pure hydrocarbon samples showed that 2H PSIF in the hydrocarbons can be larger than 60‰ and that 13C PSIF can be as large as 15‰. Comparison with theoretical calculations indicates that the PSIF patterns of some hydrocarbon samples reflect non-equilibrium processes in their production.

  18. Real-Time Quantitative Analysis of H2, He, O2, and Ar by Quadrupole Ion Trap Mass Spectrometry

    NASA Technical Reports Server (NTRS)

    Ottens, Andrew K.; Harrison, W. W.; Griffin, Timothy P.; Helms, William R.; Voska, N. (Technical Monitor)

    2002-01-01

    The use of a quadrupole ion trap mass spectrometer for quantitative analysis of hydrogen and helium as well as other permanent gases is demonstrated. The customized instrument utilizes the mass selective instability mode of mass analysis as with commercial instruments; however, this instrument operates at a greater RF trapping frequency and without a buffer gas. With these differences, a useable mass range from 2 to over 50 Da is achieved, as required by NASA for monitoring the Space Shuttle during a launch countdown. The performance of the ion trap is evaluated using part-per-million concentrations of hydrogen, helium, oxygen and argon mixed into a nitrogen gas stream. Relative accuracy and precision when quantitating the four analytes were better than the NASA-required minimum of 10% error and 5% deviation, respectively. Limits of detection were below the NASA requirement of 25-ppm hydrogen and 100-ppm helium; those for oxygen and argon were slightly higher than the requirement. The instrument provided adequate performance at fast data recording rates, demonstrating the utility of an ion trap mass spectrometer as a real-time quantitative monitoring device for permanent gas analysis.

  20. Quantitating the subtleties of microglial morphology with fractal analysis

    PubMed Central

    Karperien, Audrey; Ahammer, Helmut; Jelinek, Herbert F.

    2013-01-01

    It is well established that microglial form and function are inextricably linked. In recent years, the traditional view that microglial form ranges between “ramified resting” and “activated amoeboid” has been emphasized through advancing imaging techniques that point to microglial form being highly dynamic even within the currently accepted morphological categories. Moreover, microglia adopt meaningful intermediate forms between categories, with considerable crossover in function and varying morphologies as they cycle, migrate, wave, phagocytose, and extend and retract fine and gross processes. From a quantitative perspective, it is problematic to measure such variability using traditional methods, but one way of quantitating such detail is through fractal analysis. The techniques of fractal analysis have been used for quantitating microglial morphology, to categorize gross differences but also to differentiate subtle differences (e.g., amongst ramified cells). Multifractal analysis in particular is one technique of fractal analysis that may be useful for identifying intermediate forms. Here we review current trends and methods of fractal analysis, focusing on box counting analysis, including lacunarity and multifractal analysis, as applied to microglial morphology. PMID:23386810
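
    The box-counting dimension at the core of these methods takes only a few lines. A minimal sketch on a synthetic binary mask (a real analysis would use a segmented cell silhouette and more careful scale handling):

        import numpy as np

        def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
            # Count occupied boxes at several grid scales and fit
            # log(count) against log(1/size); the slope estimates D.
            counts = []
            for s in sizes:
                h = (mask.shape[0] // s) * s
                w = (mask.shape[1] // s) * s
                blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
                counts.append(blocks.any(axis=(1, 3)).sum())
            slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)),
                                  np.log(counts), 1)
            return slope

        mask = np.random.rand(256, 256) > 0.7   # noise stand-in (D near 2)
        print(box_counting_dimension(mask))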

  2. Hydrogen Scenario Analysis Summary Report: Analysis of the Transition to Hydrogen Fuel Cell Vehicles and the Potential Hydrogen Energy Infrastructure Requirements

    SciTech Connect

    Greene, David L; Leiby, Paul Newsome; James, Brian; Perez, Julie; Melendez, Margo; Milbrandt, Anelia; Unnasch, Stefan; Rutherford, Daniel; Hooks, Matthew

    2008-03-01

    The DOE Hydrogen, Fuel Cells and Infrastructure Technologies Program (HFCIT) has supported a series of analyses to evaluate alternative scenarios for deployment of millions of hydrogen fueled vehicles and supporting infrastructure. To ensure that these alternative market penetration scenarios took into consideration the thinking of the automobile manufacturers, energy companies, industrial hydrogen suppliers, and others from the private sector, DOE held several stakeholder meetings to explain the analyses, describe the models, and solicit comments about the methods, assumptions, and preliminary results (U.S. DOE, 2006a). The first stakeholder meeting was held on January 26, 2006, to solicit guidance during the initial phases of the analysis; this was followed by a second meeting on August 9-10, 2006, to review the preliminary results. A third and final meeting was held on January 31, 2007, to discuss the final analysis results. More than 60 hydrogen energy experts from industry, government, national laboratories, and universities attended these meetings and provided their comments to help guide DOE's analysis. The final scenarios attempt to reflect the collective judgment of the participants in these meetings. However, they should not be interpreted as having been explicitly endorsed by DOE or any of the stakeholders participating. The DOE analysis examined three vehicle penetration scenarios: Scenario 1--Production of thousands of vehicles per year by 2015 and hundreds of thousands per year by 2019. This option is expected to lead to a market penetration of 2.0 million fuel cell vehicles (FCVs) by 2025. Scenario 2--Production of thousands of FCVs by 2013 and hundreds of thousands by 2018. This option is expected to lead to a market penetration of 5.0 million FCVs by 2025. Scenario 3--Production of thousands of FCVs by 2013, hundreds of thousands by 2018, and millions by 2021, such that market penetration is 10 million by 2025. Scenario 3 was formulated to comply with the NAS recommendation: 'DOE should map out

  3. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings are given for a group of subjects with significant coronary artery stenosis and for a group of controls, determined by use of a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions, timing, and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  5. Quantitative transverse flow assessment using OCT speckle decorrelation analysis

    NASA Astrophysics Data System (ADS)

    Liu, Xuan; Huang, Yong; Ramella-Roman, Jessica C.; Kang, Jin U.

    2013-03-01

    In this study, we demonstrate the use of inter-A-scan speckle decorrelation analysis of optical coherence tomography (OCT) to assess fluid flow. This method allows quantitative measurement of fluid flow in a plane normal to the scanning beam. To validate this method, OCT images were obtained from a microfluidic channel with bovine milk flowing at different speeds. We also imaged a blood vessel in an in vivo animal model and performed speckle analysis to assess blood flow.
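
    The decorrelation measure itself is simple; converting it to absolute flow speed requires a calibration such as the milk-channel phantom. A minimal sketch on synthetic A-scans, not the authors' processing chain:

        import numpy as np

        def ascan_decorrelation(a1, a2):
            # 1 - Pearson correlation between successive A-scans; faster
            # transverse flow decorrelates the speckle more strongly.
            a1 = (a1 - a1.mean()) / a1.std()
            a2 = (a2 - a2.mean()) / a2.std()
            return 1.0 - np.mean(a1 * a2)

        rng = np.random.default_rng(0)
        static = rng.normal(size=1024)
        print(ascan_decorrelation(static, static))                 # ~0: no flow
        print(ascan_decorrelation(static, rng.normal(size=1024)))  # ~1: fast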

  6. Improved method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

    An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, utilizing a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  7. Quantitative analysis of single-molecule superresolution images

    PubMed Central

    Coltharp, Carla; Yang, Xinxing; Xiao, Jie

    2014-01-01

    This review highlights the quantitative capabilities of single-molecule localization-based superresolution imaging methods. In addition to revealing fine structural details, the molecule coordinate lists generated by these methods provide the critical ability to quantify the number, clustering, and colocalization of molecules with 10 – 50 nm resolution. Here we describe typical workflows and precautions for quantitative analysis of single-molecule superresolution images. These guidelines include potential pitfalls and essential control experiments, allowing critical assessment and interpretation of superresolution images. PMID:25179006

  8. Quantitative analysis of culture using millions of digitized books.

    PubMed

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.

  10. Quantitative numerical analysis of transient IR-experiments on buildings

    NASA Astrophysics Data System (ADS)

    Maierhofer, Ch.; Wiggenhauser, H.; Brink, A.; Röllig, M.

    2004-12-01

    Impulse-thermography has been established as a fast and reliable tool in many areas of non-destructive testing. In recent years several investigations have been done to apply active thermography to civil engineering. For quantitative investigations in this area of application, finite difference calculations have been performed for systematic studies on the influence of environmental conditions, heating power and time, defect depth and size and thermal properties of the bulk material (concrete). The comparison of simulated and experimental data enables the quantitative analysis of defects.
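
    The kind of finite difference calculation described can be sketched in one dimension: an impulse-heated wall cools freely where heat flows unimpeded and stays warmer above an insulating defect. All material values below are rough placeholders, not the study's parameters:

        import numpy as np

        def surface_temperature(alpha, n_steps=20000, dt=5e-3, dx=1e-3):
            # Explicit 1-D heat diffusion; alpha is the thermal
            # diffusivity (m^2/s) of each 1 mm cell.
            alpha = np.asarray(alpha, dtype=float)
            T = np.zeros(len(alpha))
            T[0] = 30.0                         # heating impulse (K)
            trace = []
            for _ in range(n_steps):
                lap = np.empty_like(T)
                lap[1:-1] = T[2:] - 2 * T[1:-1] + T[:-2]
                lap[0] = T[1] - T[0]            # insulated front face
                lap[-1] = T[-2] - 2 * T[-1]     # rear face near ambient
                T = T + alpha * dt / dx ** 2 * lap
                trace.append(T[0])
            return np.array(trace)

        sound = surface_temperature([7e-7] * 50)   # homogeneous concrete
        flawed = [7e-7] * 50
        flawed[5] = 5e-8                           # insulating flaw at ~5 mm
        contrast = surface_temperature(flawed) - sound
        print(contrast.max())     # peak surface contrast over the defect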

  11. Integrated analysis of hydrogen passenger vehicle transportation pathways

    SciTech Connect

    Thomas, C.E.; James, B.D.; Lomax, F.D. Jr.; Kuhn, I.F. Jr.

    1998-08-01

    Hydrogen-powered fuel cell vehicles will reduce local air pollution, greenhouse gas emissions and oil imports. Other alternative vehicles such as gasoline- or methanol-powered fuel cell vehicles, natural gas vehicles and various hybrid electric vehicles with internal combustion engines may also provide significant environmental and national security advantages. This report summarizes a two-year project to compare the direct hydrogen fuel cell vehicle with other alternatives in terms of estimated cost and estimated societal benefits, all relative to a conventional gasoline-powered internal combustion engine vehicle. The cost estimates used in this study involve ground-up, detailed analysis of the major components of a fuel cell vehicle system, assuming mass production in automotive quantities. The authors have also estimated the cost of both gasoline and methanol onboard fuel processors, as well as the cost of stationary hydrogen fueling system components including steam methane reformers, electrolyzers, compressors and stationary storage systems. Sixteen different vehicle types are compared with respect to mass production cost, local air pollution and greenhouse gas emissions.

  12. Final Report: Hydrogen Production Pathways Cost Analysis (2013 – 2016)

    SciTech Connect

    James, Brian David; DeSantis, Daniel Allan; Saur, Genevieve

    2016-09-30

    This report summarizes work conducted under a three-year Department of Energy (DOE) funded project awarded to Strategic Analysis, Inc. (SA) to analyze multiple hydrogen (H2) production technologies and project their corresponding levelized production cost of H2. The analysis was conducted using the H2A Hydrogen Analysis Tool developed by the DOE and the National Renewable Energy Laboratory (NREL). The project was led by SA but conducted in close collaboration with NREL and Argonne National Laboratory (ANL). In-depth techno-economic analysis (TEA) of five different H2 production methods was conducted. These TEAs developed projections for capital costs, fuel/feedstock usage, energy usage, indirect capital costs, land usage, labor requirements, and other parameters for each H2 production pathway, and used the resulting cost and system parameters as inputs to the H2A discounted cash flow model to project the production cost of H2 ($/kgH2). Five technologies were analyzed as part of the project and are summarized in this report: proton exchange membrane (PEM) electrolysis, high-temperature solid oxide electrolysis cell (SOEC) technology, dark fermentation of biomass for H2 production, H2 production via monolithic piston-type reactors with rapid swing reforming and regeneration reactions, and the reformer-electrolyzer-purifier (REP) technology developed by Fuel Cell Energy, Inc. (FCE).
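
    The levelization step of a discounted cash flow model reduces to one formula: discounted lifetime costs divided by discounted lifetime hydrogen output. A simplified sketch with placeholder inputs (not H2A case values):

        def levelized_cost_of_hydrogen(capex, opex_per_year, kg_per_year,
                                       years=20, rate=0.10):
            # LCOH in $/kg H2 from a bare-bones discounted cash flow.
            disc = [(1 + rate) ** -t for t in range(1, years + 1)]
            costs = capex + sum(opex_per_year * d for d in disc)
            h2 = sum(kg_per_year * d for d in disc)
            return costs / h2

        # Hypothetical 50,000 kg/day plant (~18e6 kg/yr).
        print(levelized_cost_of_hydrogen(capex=60e6, opex_per_year=8e6,
                                         kg_per_year=18e6))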

  13. Markov chain Monte Carlo linkage analysis of complex quantitative phenotypes.

    PubMed

    Hinrichs, A; Reich, T

    2001-01-01

    We report a Markov chain Monte Carlo analysis of the five simulated quantitative traits in Genetic Analysis Workshop 12 using the Loki software. Our objectives were to determine the efficacy of the Markov chain Monte Carlo method and to test a new scoring technique. Our initial blind analysis, on replicate 42 (the "best replicate"), successfully detected four out of the five disease loci and found no false positives. A power analysis shows that the software could usually detect 4 of the 10 trait/gene combinations at an empirical point-wise p-value of 1.5 × 10^-4.

  14. Spotsizer: High-throughput quantitative analysis of microbial growth.

    PubMed

    Bischof, Leanne; Převorovský, Martin; Rallis, Charalampos; Jeffares, Daniel C; Arzhaeva, Yulia; Bähler, Jürg

    2016-10-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license.

  15. Quantitative Rietveld analysis of CAC clinker phases using synchrotron radiation

    SciTech Connect

    Guirado, F. (E-mail: francesc.guirado@urv.cat); Gali, S.

    2006-11-15

    Quantitative Rietveld analyses of twenty samples of CAC from four different manufacturers around the world, one synthetic mixture, and a NIST standard were performed using synchrotron radiation. As compared with conventional XRD, synchrotron powder diffraction made it possible to find new minor phases, improve the characterization of solid solutions of iron-rich CAC phases, and reduce preferential orientation and microabsorption effects. Diffraction data were complemented with XRF and TG/DT analyses. Synchrotron results were used as a reference test to improve the performance of conventional powder diffraction through an accurate selection of refinable profile and structural parameters, and several recommendations for conventional quantitative Rietveld procedures were extracted. It is shown that with these recommendations in mind, conventional XRD-based Rietveld analyses are comparable to those obtained from synchrotron data. In summary, quantitative XRD Rietveld analysis is confirmed as an excellent tool for the CAC cement industry.

  17. The destruction chemistry of organophosphorus compounds in flames -- I: Quantitative determination of final phosphorus-containing species in hydrogen-oxygen flames

    SciTech Connect

    Korobeinichev, O.P.; Ilyin, S.B.; Shvartsberg, V.M.; Chernov, A.A.

    1999-09-01

    The combustion of organophosphorus compounds (OPC) is of considerable interest in connection with the disposal of toxic and hazardous chemical wastes and other undesirable substances containing phosphorus, including chemical warfare agents (CWA) such as the nerve agents sarin and VX. This paper presents the results of a quantitative determination of the composition of final phosphorus-containing products (PO, PO{sub 2}, HOPO, and HOPO{sub 2}) from the destruction of the organophosphorus compounds trimethyl phosphate (TMP) and dimethyl methylphosphonate (DMMP) in premixed hydrogen-oxygen flames. The flames were stabilized on a flat burner at 47 Torr and probed using molecular beam mass spectrometric techniques. Quantitative analysis of these species is difficult due to problems with mass spectrometric calibrations; these compounds are also unstable under normal conditions and are not readily available. To solve this problem, a material balance equation for the element phosphorus has been used to analyze the results in stoichiometric, rich, and lean flames doped with different amounts of TMP and DMMP. A system of linear nondegenerate material balance equations was solved using the Singular Value Decomposition (SVD) algorithm. The calculated calibration coefficients for the phosphorus species have allowed their mole fractions to be derived. How the concentrations of PO, PO{sub 2}, HOPO, and HOPO{sub 2} depend on the initial concentrations of DMMP or TMP and on the mixture's composition has been studied. The measurements are compared to the results of thermochemical equilibrium calculations.
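
    The calibration problem reduces to a linear system: for each flame, the unknown coefficients convert the four species signals into mole fractions that must close the phosphorus balance. A sketch with invented numbers (numpy's least-squares solver is itself SVD based):

        import numpy as np

        S = np.array([[1.2, 3.1, 4.0, 2.2],    # PO, PO2, HOPO, HOPO2 signals
                      [0.8, 2.5, 5.1, 3.0],    # one row per flame condition
                      [1.5, 2.0, 3.2, 4.1],
                      [0.6, 3.8, 4.4, 1.9],
                      [1.1, 2.9, 3.7, 2.8]])
        P = np.array([1.0, 1.0, 1.5, 2.0, 1.2]) * 1e-3   # dopant loadings

        c, *_ = np.linalg.lstsq(S, P, rcond=None)
        print(c)          # calibration coefficient per species
        print(S @ c - P)  # residuals of the phosphorus balance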

  18. A strategy to apply quantitative epistasis analysis on developmental traits.

    PubMed

    Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei

    2017-05-15

    Genetic interactions are key to understanding complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from the phenotypic measurements. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method can accommodate various metazoan phenotypes with performance comparable to that of methods used in single-cell growth studies. Compared with qualitative observations, this quantitative epistasis method enabled detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.
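
    The core score in a multiplicative epistasis framework is the deviation of the double perturbation from the product of the singles. A bare-bones sketch with illustrative values (the paper's statistical treatment is more involved):

        def epistasis_score(w_xy, w_x, w_y):
            # Phenotypes normalized to wild type = 1.0; negative scores
            # indicate aggravating (synthetic) interactions.
            return w_xy - w_x * w_y

        w_x, w_y, w_xy = 0.85, 0.90, 0.60   # hypothetical body-length values
        print(epistasis_score(w_xy, w_x, w_y))   # -0.165: aggravating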

  19. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, in quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recent published studies.

  20. Techno Economic Analysis of Hydrogen Production by gasification of biomass

    SciTech Connect

    Francis Lau

    2002-12-01

    Biomass represents a large potential feedstock resource for environmentally clean processes that produce power or chemicals. It lends itself to both biological and thermal conversion processes, and both options are currently being explored. Hydrogen can be produced in a variety of ways. The majority of the hydrogen produced in this country is produced through natural gas reforming and is used as chemical feedstock in refinery operations. In this report we examine the production of hydrogen by gasification of biomass. Biomass is defined as organic matter that is available on a renewable basis through natural processes or as a by-product of processes that use renewable resources. The majority of biomass is used in combustion processes, in mills that use the renewable resources, to produce electricity for end-use product generation. This report explores the use of hydrogen as a fuel derived from gasification of three candidate biomass feedstocks: bagasse, switchgrass, and a nutshell mix that consists of 40% almond nutshell, 40% almond prunings, and 20% walnut shell. In this report, the technical and economic potential of producing hydrogen from biomass gasification is assessed. The resource base was assessed to determine a process scale from feedstock costs and availability. Solids handling systems were researched. A GTI proprietary gasifier model was used in combination with a Hysys® design and simulation program to determine the amount of hydrogen that can be produced from each candidate biomass feed. Cost estimations were developed and government programs and incentives were analyzed. Finally, the barriers to the production and commercialization of hydrogen from biomass were determined. The end use of the hydrogen produced from this system is small PEM fuel cells for automobiles. Pyrolysis of biomass was also considered. Pyrolysis is a reaction in which biomass or coal is partially vaporized by heating. Gasification is a more

  1. Considerations in the analysis of hydrogen exchange mass spectrometry data

    PubMed Central

    Wales, Thomas E.; Eggertson, Michael J.; Engen, John R.

    2013-01-01

    A major component of a hydrogen exchange mass spectrometry experiment is the analysis of protein and peptide mass spectra to yield information about deuterium incorporation. The processing of the data produced includes the identification of each peptic peptide to create a master table/array of peptide sequence, retention time and retention time range, mass range, and undeuterated mass. The amount of deuterium incorporated into each of the peptides in this array must then be determined. Various software platforms have been developed to perform this specific type of data analysis. We describe the fundamental parameters to be considered at each step along the way and how data processing, either by an individual or by software, must approach the analysis. PMID:23666730

  2. Stable hydrogen isotopic analysis of nanomolar molecular hydrogen by automatic multi-step gas chromatographic separation.

    PubMed

    Komatsu, Daisuke D; Tsunogai, Urumu; Kamimura, Kanae; Konno, Uta; Ishimura, Toyoho; Nakagawa, Fumiko

    2011-11-15

    We have developed a new automated analytical system that employs a continuous flow isotope ratio mass spectrometer to determine the stable hydrogen isotopic composition (δD) of nanomolar quantities of molecular hydrogen (H(2)) in an air sample. This method improves on previous methods, attaining simpler and lower-cost analyses, especially by avoiding the use of expensive or special devices such as a Toepler pump, a cryogenic refrigerator, and a special evacuation system to keep the temperature of a coolant under reduced pressure. Instead, the system allows H(2) purification from the air matrix via automatic multi-step gas chromatographic separation using the coolants of both liquid nitrogen (77 K) and liquid nitrogen + ethanol (158 K) under 1 atm pressure. The analytical precision of the δD determination using the developed method was better than 4‰ for >5 nmol injections (250 mL STP for a 500 ppbv air sample) and better than 15‰ for 1 nmol injections, regardless of the δD value, within 1 h for one sample analysis. Using the developed system, the δD values of H(2) can be quantified for atmospheric samples as well as samples of representative sources and sinks, including those containing small quantities of H(2), such as H(2) in soil pores or aqueous environments, for which there is currently little δD data available. As an example of such trace H(2) analyses, we report here the isotope fractionations during H(2) uptake by soils in a static chamber. The δD values of H(2) in these H(2)-depleted environments can be useful in constraining the budgets of atmospheric H(2) by applying an isotope mass balance model.
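
    The reported values follow the standard delta notation, δD = (R_sample/R_standard - 1) x 1000 with R = D/H and VSMOW as the reference. A one-function sketch:

        def delta_d_permil(r_sample, r_vsmow=1.5576e-4):
            # delta-D in per mil; 1.5576e-4 is the VSMOW D/H ratio.
            return (r_sample / r_vsmow - 1.0) * 1000.0

        print(delta_d_permil(1.40e-4))   # D-depleted H2: about -101 per mil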

  3. Data from quantitative label free proteomics analysis of rat spleen.

    PubMed

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work was obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS-MS analysis was developed using a urea- and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins to be assessed, the variability in quantitative analysis associated with each sampling strategy to be evaluated, and a proper number of replicates to be defined for future quantitative analyses.

  4. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study of an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined for both building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in southern Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference.
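    The core operation of crossing a fragility curve with PSHA output reduces to a discrete total-probability sum; a minimal sketch with placeholder hazard and fragility parameters (not the paper's values):

```python
import numpy as np
from scipy.stats import norm

# Lognormal fragility for one tank: P(failure | PGA), median theta (g), log-std beta
theta, beta = 0.6, 0.5

pga_bins = np.array([0.1, 0.2, 0.4, 0.8])              # g
annual_p_exceed = np.array([1e-2, 3e-3, 5e-4, 4e-5])   # hypothetical PSHA curve

p_bin = -np.diff(np.append(annual_p_exceed, 0.0))      # probability mass per bin
p_fail_given_pga = norm.cdf(np.log(pga_bins / theta) / beta)

annual_p_failure = float(np.sum(p_fail_given_pga * p_bin))
```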

  5. An improved quantitative analysis method for plant cortical microtubules.

    PubMed

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has been paid to quantitative image analysis of plant cortical microtubules so far. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in the preprocessing of the original microtubule image. The Intrinsic Mode Function 1 (IMF1) image obtained from the decomposition was then selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. Meanwhile, in order to further verify its reliability, the proposed texture analysis method was used to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges well while reducing noise, and the geometrical character of the texture was distinct. Four texture parameters extracted by GLCM clearly reflected the different arrangements in the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies a reference for other similar studies.
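    GLCM texture extraction of the kind described is available in scikit-image; a minimal sketch on stand-in data (the paper does not name its four parameters, so four common ones are used here):

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

img = np.random.randint(0, 256, (256, 256), dtype=np.uint8)  # stand-in for IMF1

glcm = graycomatrix(img, distances=[1],
                    angles=[0, np.pi/4, np.pi/2, 3*np.pi/4],
                    levels=256, symmetric=True, normed=True)

features = {p: graycoprops(glcm, p).mean()
            for p in ("contrast", "correlation", "energy", "homogeneity")}
```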

  6. Quantitative analysis of synchrotron radiation intravenous angiographic images

    NASA Astrophysics Data System (ADS)

    Sarnelli, Anna; Nemoz, Christian; Elleaume, Hélène; Estève, François; Bertrand, Bernard; Bravin, Alberto

    2005-02-01

    A medical research protocol on clinical intravenous coronary angiography has been completed at the European Synchrotron Radiation Facility (ESRF) biomedical beamline. The aim was to investigate the accuracy of intravenous coronary angiography based on the K-edge digital subtraction technique for the detection of in-stent restenosis. For each patient, diagnosis was performed on the synchrotron radiation images and checked against conventional selective coronary angiography taken as the gold standard. In this paper, the methods of image processing and the results of the quantitative analysis are described. Image processing includes beam harmonic contamination correction, spatial deconvolution and the extraction of a 'contrast' and a 'tissue' image from each pair of radiograms simultaneously acquired at energies bracketing the K-edge of iodine. Quantitative analysis includes the estimation of the vessel diameter, the calculation of the absolute iodine concentration profiles along the coronary arteries and the measurement of the degree of stenosis.
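    The 'contrast'/'tissue' decomposition is, per pixel, a two-equation linear system in the attenuations measured just below and just above the iodine K-edge; a minimal sketch with illustrative attenuation coefficients:

```python
import numpy as np

# ln(I0/I) = mu_I(E) * a_I + mu_T(E) * a_T at the two energies bracketing
# the iodine K-edge (33.17 keV); the coefficients below are illustrative only
M = np.array([[22.0, 0.34],    # E just below the edge
              [36.0, 0.33]])   # E just above the edge (iodine mu jumps)

y = np.array([1.10, 1.25])                  # measured log-attenuations, one pixel
a_iodine, a_tissue = np.linalg.solve(M, y)  # area densities (g/cm^2)
```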

  7. Implementing a Quantitative Analysis Design Tool for Future Generation Interfaces

    DTIC Science & Technology

    2012-03-01

    future MAC-enabled systems. A human-computer interaction (HCI) Index, originally applied to multi-function displays, was applied to the prototype Vigilant...Spirit interface. A modified version of the HCI Index was successfully applied to perform a quantitative analysis of the baseline VSCS interface and...two modified interface designs. The modified HCI Index incorporates the Hick-Hyman decision time, Fitts' Law time, and the physical actions
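    The abstract's ingredients are standard HCI timing models; a minimal sketch of how they combine additively (the regression constants and the actual HCI Index weighting are assumptions, not taken from the report):

```python
import math

def hick_hyman(n_alternatives, a=0.2, b=0.15):
    """Decision time (s) for choosing among n alternatives."""
    return a + b * math.log2(n_alternatives)

def fitts(distance, width, a=0.1, b=0.1):
    """Shannon-form Fitts' law movement time (s)."""
    return a + b * math.log2(distance / width + 1.0)

# One interface action: pick among 8 options, hit a 20 px target 300 px away,
# plus a fixed physical action time of 0.3 s
task_time = hick_hyman(8) + fitts(300, 20) + 0.3
```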

  8. Quantitative NMR Analysis of Partially Substituted Biodiesel Glycerols

    SciTech Connect

    Nagy, M.; Alleman, T. L.; Dyer, T.; Ragauskas, A. J.

    2009-01-01

    Phosphitylation of hydroxyl groups in biodiesel samples with 2-chloro-4,4,5,5-tetramethyl-1,3,2-dioxaphospholane followed by 31P-NMR analysis provides a rapid quantitative analytical technique for the determination of substitution patterns on partially esterified glycerols. The unique 31P-NMR chemical shift data were established with a series of mono- and di-substituted fatty acid esters of glycerol and then utilized to characterize an industrial sample of partially processed biodiesel.

  9. Comprehensive Quantitative Analysis of SQ Injection Using Multiple Chromatographic Technologies.

    PubMed

    Chau, Siu-Leung; Huang, Zhi-Bing; Song, Yan-Gang; Yue, Rui-Qi; Ho, Alan; Lin, Chao-Zhan; Huang, Wen-Hua; Han, Quan-Bin

    2016-08-19

    Quality control of Chinese medicine injections remains a challenge due to our poor knowledge of their complex chemical profile. This study aims to investigate the chemical composition of one of the best-selling injections, Shenqi Fuzheng (SQ) injection (SQI), via a full component quantitative analysis. A total of 15 representative small-molecule components of SQI were simultaneously determined using ultra-high performance liquid chromatography (UHPLC) coupled with quadrupole time-of-flight mass spectrometry (Q-TOF-MS); the saccharide composition of SQI was also quantitatively determined by high performance liquid chromatography (HPLC) with an evaporative light scattering detector (ELSD) on an amino column before and after acid hydrolysis. The existence of polysaccharides was also examined on a gel permeation chromatography column. The method was well validated in terms of linearity, sensitivity, precision, accuracy and stability, and was successfully applied to analyze 13 SQI samples. The results demonstrate that up to 94.69% (w/w) of this injection product is quantitatively determined, of which small molecules account for 0.18%-0.21% and monosaccharide/sucrose for 53.49%-58.2%. The quantitative information contributes to accumulating scientific evidence to better understand the efficacy and safety of complex Chinese medicine injections.

  10. Quantitative Proteomic Approaches for Analysis of Protein S-Nitrosylation.

    PubMed

    Qu, Zhe; Greenlief, C Michael; Gu, Zezong

    2016-01-04

    S-Nitrosylation is a redox-based post-translational modification of a protein in response to nitric oxide (NO) signaling, and it participates in a variety of processes in diverse biological systems. The significance of this type of protein modification in health and diseases is increasingly recognized. In the central nervous system, aberrant S-nitrosylation, due to excessive NO production, is known to cause protein misfolding, mitochondrial dysfunction, transcriptional dysregulation, and neuronal death. This leads to an altered physiological state and consequently contributes to pathogenesis of neurodegenerative disorders. To date, much effort has been made to understand the mechanisms underlying protein S-nitrosylation, and several approaches have been developed to unveil S-nitrosylated proteins from different organisms. Interest in determining the dynamic changes of protein S-nitrosylation under different physiological and pathophysiological conditions has underscored the need for the development of quantitative proteomic approaches. Currently, both gel-based and gel-free mass spectrometry-based quantitative methods are widely used, and they each have advantages and disadvantages but may also be used together to produce complementary data. This review evaluates current available quantitative proteomic techniques for the analysis of protein S-nitrosylation and highlights recent advances, with emphasis on applications in neurodegenerative diseases. An important goal is to provide a comprehensive guide of feasible quantitative proteomic methodologies for examining protein S-nitrosylation in research to yield insights into disease mechanisms, diagnostic biomarkers, and drug discovery.

  11. Quantitative analysis of the heterogeneous population of endocytic vesicles.

    PubMed

    Kozlov, Konstantin; Kosheverova, Vera; Kamentseva, Rimma; Kharchenko, Marianna; Sokolkova, Alena; Kornilova, Elena; Samsonova, Maria

    2017-03-07

    The quantitative characterization of endocytic vesicles in images acquired with a microscope is critically important for deciphering endocytosis mechanisms. Image segmentation is the most important step of quantitative image analysis. Despite the availability of many segmentation methods, accurate segmentation is challenging when the images are heterogeneous with respect to object shape and signal intensity, as is typical for images of endocytic vesicles. We present a Morphological reconstruction and Contrast mapping segmentation method (MrComas) for the segmentation of the endocytic vesicle population that copes with this heterogeneity in shape and intensity. The method uses morphological opening and closing by reconstruction in the vicinity of local minima and maxima, respectively, thus creating strong contrast between their basins of attraction. As a consequence, the intensity is flattened within the objects and their edges are enhanced. The method accurately recovered quantitative characteristics of synthetic images that preserve characteristic features of the endocytic vesicle population. In benchmarks and quantitative comparisons with two other popular segmentation methods, namely manual thresholding and the Squash plugin, MrComas showed the best segmentation results on real biological images of EGFR (Epidermal Growth Factor Receptor) endocytosis. As a proof of feasibility, the method was applied to quantify the dynamic behavior of Early Endosomal Autoantigen 1 (EEA1)-positive endosome subpopulations during EGF-stimulated endocytosis.
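    The opening/closing-by-reconstruction step that MrComas builds on can be sketched with scikit-image (a generic illustration of the technique on stand-in data, not the published implementation):

```python
import numpy as np
from skimage import morphology

# Stand-in for one fluorescence frame of vesicles (hypothetical data)
img = np.random.rand(256, 256)
footprint = morphology.disk(2)

# Opening by reconstruction: remove small bright speckle, preserve object shapes
eroded = morphology.erosion(img, footprint)
opened = morphology.reconstruction(eroded, img, method='dilation')

# Closing by reconstruction: fill small dark gaps inside objects
dilated = morphology.dilation(opened, footprint)
flattened = morphology.reconstruction(dilated, opened, method='erosion')
# 'flattened' has near-constant intensity inside objects and enhanced edges
```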

  12. Hydrogen analysis depth calibration by CORTEO Monte-Carlo simulation

    NASA Astrophysics Data System (ADS)

    Moser, M.; Reichart, P.; Bergmaier, A.; Greubel, C.; Schiettekatte, F.; Dollinger, G.

    2016-03-01

    Hydrogen imaging with sub-μm lateral resolution and sub-ppm sensitivity has become possible with coincident proton-proton (pp) scattering analysis (Reichart et al., 2004). Depth information is evaluated from the energy sum signal with respect to the energy loss of both protons on their path through the sample. To first order, there is no angular dependence due to elastic scattering. To second order, a path length effect due to different energy losses on the paths of the two protons causes an angular dependence of the energy sum. Therefore, the energy sum signal has to be de-convoluted depending on the matrix composition, i.e. mainly the atomic number Z, in order to get a depth-calibrated hydrogen profile. Although the path effect can be calculated analytically to first order, multiple scattering effects lead to significant deviations in the depth profile. Hence, in our new approach, we use the CORTEO Monte-Carlo code (Schiettekatte, 2008) in order to calculate the depth of a coincidence event depending on the scattering angle. The code takes the individual detector geometry into account. In this paper we show that the code correctly reproduces measured pp-scattering energy spectra with roughness effects considered. With more than 100 μm thick Mylar-sandwich targets (Si, Fe, Ge) we demonstrate the deconvolution of the energy spectra on our current multistrip detector at the microprobe SNAKE at the Munich tandem accelerator lab. As a result, hydrogen profiles can be evaluated with a depth accuracy of about 1% of the sample thickness.

  13. Pressure Rise Analysis When Hydrogen Leak from a Cracked Pipe in the Cryogenic Hydrogen System in J-PARC

    NASA Astrophysics Data System (ADS)

    Tatsumoto, H.; Aso, T.; Hasegawa, S.; Ushijima, I.; Kato, T.; Ohtsu, K.; Ikeda, Y.

    2006-04-01

    As one of the main experimental facilities of the Japan Proton Accelerator Research Complex (J-PARC), an intense spallation neutron source (JSNS) driven by a 1 MW proton beam is being constructed. Cryogenic hydrogen at supercritical pressure was selected as the moderator material. The total nuclear heating at the moderators is estimated to be 3.7 kW. A hydrogen system to cool the moderators has been designed. The most severe off-normal event for the cryogenic hydrogen system is considered to be a hydrogen leak when a pipe cracks. In such a case, the hydrogen must be discharged to the atmosphere quickly and safely. An analytical code that simulates the pressure change during a hydrogen leak was developed. A pressure rise analysis for various crack sizes was performed, and the required sizes of the relief devices were determined: a safety valve diameter of 42.7 mm, and a rupture disc diameter of 37.1 mm for the vacuum layer.
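    For intuition only, the simplest version of such a pressure-rise calculation is ideal-gas accumulation in a fixed volume before any relief device opens; every number below is hypothetical, and the actual code models the blowdown of supercritical hydrogen:

```python
R, T, M = 8.314, 300.0, 2.016e-3   # J/(mol K); K (assumed); kg/mol for H2
V = 10.0                            # m^3, hypothetical enclosure volume
mdot = 1.0e-3                       # kg/s, hypothetical leak rate

P, dt = 0.0, 0.1                    # Pa, s
for _ in range(6000):               # 10 minutes of leakage
    P += (mdot / M) * R * T / V * dt   # dP/dt = n_dot * R * T / V
print(P)                            # pressure rise if nothing relieves the gas
```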

  14. Quantitative Phosphoproteomic Analysis of T-Cell Receptor Signaling.

    PubMed

    Ahsan, Nagib; Salomon, Arthur R

    2017-01-01

    TCR signaling critically depends on protein phosphorylation across many proteins. Localization of each phosphorylation event relative to the T-cell receptor (TCR) and canonical T-cell signaling proteins will provide clues about the structure of TCR signaling networks. Quantitative phosphoproteomic analysis by mass spectrometry provides a wide-scale view of cellular phosphorylation networks. However, analysis of phosphorylation by mass spectrometry is still challenging due to the low abundance of phosphorylated proteins relative to all proteins and the extraordinary diversity of phosphorylation sites across the proteome. Highly selective enrichment of phosphorylated peptides is essential to provide the most comprehensive view of the phosphoproteome. Optimization of phosphopeptide enrichment methods coupled with highly sensitive mass spectrometry workflows significantly improves the sequencing depth of the phosphoproteome to over 10,000 unique phosphorylation sites from complex cell lysates. Here we describe a step-by-step method for phosphoproteomic analysis that has achieved widespread success for identification of serine, threonine, and tyrosine phosphorylation. Reproducible quantification of relative phosphopeptide abundance is provided by intensity-based label-free quantitation. An ideal set of mass spectrometry analysis parameters that optimizes the yield of identified sites is also provided. We also provide guidelines for the bioinformatic analysis of this type of data to assess the quality of the data and to comply with proteomic data reporting requirements.

  15. A Quantitative Method for Microtubule Analysis in Fluorescence Images.

    PubMed

    Lan, Xiaodong; Li, Lingfei; Hu, Jiongyu; Zhang, Qiong; Dang, Yongming; Huang, Yuesheng

    2015-12-01

    Microtubule analysis is of significant value for a better understanding of normal and pathological cellular processes. Although immunofluorescence microscopic techniques have proven useful in the study of microtubules, comparative results commonly rely on a descriptive and subjective visual analysis. We developed an objective and quantitative method based on image processing and analysis of fluorescently labeled microtubular patterns in cultured cells. We used a multi-parameter approach by analyzing four quantifiable characteristics to compose our quantitative feature set. Then we interpreted specific changes in the parameters and revealed the contribution of each feature set using principal component analysis. In addition, we verified that different treatment groups could be clearly discriminated using principal components of the multi-parameter model. High predictive accuracy of four commonly used multi-classification methods confirmed our method. These results demonstrated the effectiveness and efficiency of our method in the analysis of microtubules in fluorescence images. Application of the analytical methods presented here provides information concerning the organization and modification of microtubules, and could aid in the further understanding of structural and functional aspects of microtubules under normal and pathological conditions.
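    The multi-parameter PCA step described above takes a few lines with scikit-learn; a minimal sketch on stand-in data (four features per cell, as in the paper, but random values here):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

X = np.random.rand(120, 4)                 # rows: cells, columns: four features

pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(X))
print(pca.explained_variance_ratio_)       # contribution of each component
print(abs(pca.components_))                # feature loadings per component
```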

  16. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    ERIC Educational Resources Information Center

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  17. Analysis of IUE observations of hydrogen in comets

    NASA Technical Reports Server (NTRS)

    Combi, Michael R.; Feldman, Paul D.

    1993-01-01

    The large body of hydrogen Lyman-alpha observations of cometary comae obtained with the International Ultraviolet Explorer satellite has gone generally unanalyzed because of two main modeling complications. First, the inner comae of many bright (gas-productive) comets are often optically thick to solar Lyman-alpha radiation. Second, even in the case of a small comet (low gas production), the large IUE aperture covers only a small fraction of the immense hydrogen coma, so an accurate model which properly accounts for the spatial distribution of the coma is required to invert the inferred brightnesses to column densities and finally to H atom production rates. Our Monte Carlo particle trajectory model (MCPTM), which for the first time provides the realistic full phase space distribution of H atoms throughout the coma, was used as the basis for the analysis of IUE observations of the inner coma. The MCPTM includes the effects of the vectorial ejection of the H atoms upon dissociation of their parent species (H2O and OH) and of their partial collisional thermalization. Both of these effects are crucial to characterize the velocity distribution of the H atoms. A new spherical radiative transfer calculation based on our MCPTM was developed to analyze IUE observations of optically thick H comae. The models were applied to observations of comets P/Giacobini-Zinner and P/Halley.

  18. Quantitative multivariate analysis of dynamic multicellular morphogenic trajectories.

    PubMed

    White, Douglas E; Sylvester, Jonathan B; Levario, Thomas J; Lu, Hang; Streelman, J Todd; McDevitt, Todd C; Kemp, Melissa L

    2015-07-01

    Interrogating fundamental cell biology principles that govern tissue morphogenesis is critical to better understanding of developmental biology and engineering novel multicellular systems. Recently, functional micro-tissues derived from pluripotent embryonic stem cell (ESC) aggregates have provided novel platforms for experimental investigation; however elucidating the factors directing emergent spatial phenotypic patterns remains a significant challenge. Computational modelling techniques offer a unique complementary approach to probe mechanisms regulating morphogenic processes and provide a wealth of spatio-temporal data, but quantitative analysis of simulations and comparison to experimental data is extremely difficult. Quantitative descriptions of spatial phenomena across multiple systems and scales would enable unprecedented comparisons of computational simulations with experimental systems, thereby leveraging the inherent power of computational methods to interrogate the mechanisms governing emergent properties of multicellular biology. To address these challenges, we developed a portable pattern recognition pipeline consisting of: the conversion of cellular images into networks, extraction of novel features via network analysis, and generation of morphogenic trajectories. This novel methodology enabled the quantitative description of morphogenic pattern trajectories that could be compared across diverse systems: computational modelling of multicellular structures, differentiation of stem cell aggregates, and gastrulation of cichlid fish. Moreover, this method identified novel spatio-temporal features associated with different stages of embryo gastrulation, and elucidated a complex paracrine mechanism capable of explaining spatiotemporal pattern kinetic differences in ESC aggregates of different sizes.
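    One common way to realize the 'images into networks' step is a Delaunay graph over cell centroids followed by graph statistics; a hedged sketch (the pipeline's actual construction and features are not specified in the abstract):

```python
import numpy as np
import networkx as nx
from scipy.spatial import Delaunay

pts = np.random.rand(200, 2)        # stand-in cell centroids from one frame

tri = Delaunay(pts)
G = nx.Graph()
for simplex in tri.simplices:       # connect cells sharing a triangle edge
    for i in range(3):
        G.add_edge(int(simplex[i]), int(simplex[(i + 1) % 3]))

# Example network features for one time point of a morphogenic trajectory
mean_degree = 2 * G.number_of_edges() / G.number_of_nodes()
clustering = nx.average_clustering(G)
```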

  19. Cell poking: quantitative analysis of indentation of thick viscoelastic layers.

    PubMed

    Duszyk, M; Schwab, B; Zahalak, G I; Qian, H; Elson, E L

    1989-04-01

    A recently introduced device, the cell poker, measures the force required to indent the exposed surface of a cell adherent to a rigid substratum. The cell poker has provided phenomenological information about the viscoelastic properties of several different types of cells, about mechanical changes triggered by external stimuli, and about the role of the cytoskeleton in these mechanical functions. Except in special cases, however, it has not been possible to extract quantitative estimates of viscosity and elasticity moduli from cell poker measurements. This paper presents cell poker measurements of well characterized viscoelastic polymeric materials, polydimethylsiloxanes of different degrees of polymerization, in a simple shape, a flat, thick layer, which for our purposes can be treated as a half space. Analysis of the measurements in terms of a linear viscoelasticity theory yields viscosity values for three polymer samples in agreement with those determined by measurements on a macroscopic scale. Theoretical analysis further indicates that the measured limiting static elasticity of the layers may result from the tension generated at the interface between the polymer and water. This work demonstrates the possibility of obtaining quantitative viscoelastic material properties from cell poker measurements and represents the first step in extending these quantitative studies to more complicated structures including cells.

  1. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    ERIC Educational Resources Information Center

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)

  2. Quantitative analysis of endocytosis with cytoplasmic pHluorin chimeras.

    PubMed

    Prosser, Derek C; Whitworth, Karen; Wendland, Beverly

    2010-09-01

    The pH-sensitive green fluorescent protein (GFP) variant pHluorin is typically fused to the extracellular domain of transmembrane proteins to monitor endocytosis. Here, we have turned pHluorin inside-out, and show that cytoplasmic fusions of pHluorin are effective quantitative reporters for endocytosis and multivesicular body (MVB) sorting. In yeast in particular, fusion of GFP and its variants on the extracellular side of transmembrane proteins can result in perturbed trafficking. In contrast, cytoplasmic fusions are well tolerated, allowing for the quantitative assessment of trafficking of virtually any transmembrane protein. Quenching of degradation-resistant pHluorin in the acidic vacuole permits quantification of extravacuolar cargo proteins at steady-state levels and is compatible with kinetic analysis of endocytosis in live cells.

  3. Biomechanical cell analysis using quantitative phase imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Wax, Adam; Park, Han Sang; Eldridge, William J.

    2016-03-01

    Quantitative phase imaging provides nanometer-scale sensitivity and has previously been used to study spectral and temporal characteristics of individual cells in vitro, especially red blood cells. Here we extend this work to study the mechanical responses of individual cells under external stimuli. Cell stiffness may be characterized by analyzing the inherent thermal fluctuations of cells, but applying external stimuli yields additional information. The time-dependent response of cells to external shear stress is examined with high-speed quantitative phase imaging and found to exhibit characteristics that relate to their stiffness. However, analysis beyond the cellular scale also reveals the internal organization of the cell and its modulation due to pathologic processes such as carcinogenesis. Further studies with microfluidic platforms point the way toward using this approach in high-throughput assays.

  4. [Simultaneous quantitative analysis of four lignanoids in Schisandra chinensis by quantitative analysis of multi-components by single marker].

    PubMed

    He, Feng-Cheng; Li, Shou-Xin; Zhao, Zhi-Quan; Dong, Jin-Ping; Liu, Wu-Zhan; Su, Rui-Qiang

    2012-07-01

    The aim of the study is to establish a new method of quality evaluation and to validate its feasibility by the simultaneous quantitative assay of four lignanoids in Schisandra chinensis. A new quality evaluation method, quantitative analysis of multi-components by single marker (QAMS), was established and validated with Schisandra chinensis. Four main lignanoids, schisandrin, schisantherin A, deoxyschizandrin and gamma-schizandrin, were selected as analytes, with schisandrin as the internal reference substance. Their contents in 13 different batches of samples were determined by both the external standard method and QAMS. The method was evaluated by comparing the quantitative results from the two approaches. No significant differences were found between the contents of the four lignanoids in the 13 batches of S. chinensis determined by the external standard method and by QAMS. QAMS is feasible for the simultaneous determination of the four lignanoids when some authentic standard substances are unavailable, and the developed method can be used for quality control of S. chinensis.
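    The arithmetic behind QAMS is a relative-correction-factor rescaling against the single internal reference; a minimal sketch with placeholder factors and peak areas (not the paper's values):

```python
# Relative correction factors (response of reference / response of analyte),
# established once with authentic standards; placeholder values below
rcf = {"schisantherin A": 1.21, "deoxyschizandrin": 0.87, "gamma-schizandrin": 0.93}

A_ref, C_ref = 1.52e6, 0.105   # peak area and content (mg/g) of schisandrin
areas = {"schisantherin A": 4.1e5, "deoxyschizandrin": 2.8e5,
         "gamma-schizandrin": 3.3e5}

contents = {k: rcf[k] * A * C_ref / A_ref for k, A in areas.items()}
```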

  5. Quantitative 3D analysis of huge nanoparticle assemblies

    NASA Astrophysics Data System (ADS)

    Zanaga, Daniele; Bleichrodt, Folkert; Altantzis, Thomas; Winckelmans, Naomi; Palenstijn, Willem Jan; Sijbers, Jan; de Nijs, Bart; van Huis, Marijn A.; Sánchez-Iglesias, Ana; Liz-Marzán, Luis M.; van Blaaderen, Alfons; Joost Batenburg, K.; Bals, Sara; van Tendeloo, Gustaaf

    2015-12-01

    Nanoparticle assemblies can be investigated in 3 dimensions using electron tomography. However, it is not straightforward to obtain quantitative information such as the number of particles or their relative position. This becomes particularly difficult when the number of particles increases. We propose a novel approach in which prior information on the shape of the individual particles is exploited. It improves the quality of the reconstruction of these complex assemblies significantly. Moreover, this quantitative Sparse Sphere Reconstruction approach yields directly the number of particles and their position as an output of the reconstruction technique, enabling a detailed 3D analysis of assemblies with as many as 10 000 particles. The approach can also be used to reconstruct objects based on a very limited number of projections, which opens up possibilities to investigate beam sensitive assemblies where previous reconstructions with the available electron tomography techniques failed.

  6. Quantitative MRI for analysis of peritumoral edema in malignant gliomas

    PubMed Central

    Warntjes, J. B. Marcel; Smedby, Örjan; Lundberg, Peter

    2017-01-01

    Background and purpose: Damage to the blood-brain barrier with subsequent contrast enhancement is a hallmark of glioblastoma. Non-enhancing tumor invasion into the peritumoral edema is, however, not usually visible on conventional magnetic resonance imaging. New quantitative techniques using relaxometry offer additional information about tissue properties. The aim of this study was to evaluate the longitudinal relaxation rate R1, the transverse relaxation rate R2, and proton density in the peritumoral edema in a group of patients with malignant glioma before surgery, to assess whether relaxometry can detect changes not visible on conventional images. Methods: In a prospective study, 24 patients with suspected malignant glioma were examined before surgery. A standard MRI protocol was used with the addition of a quantitative MR method (MAGIC), which measured R1, R2, and proton density. The diagnosis of malignant glioma was confirmed after biopsy/surgery. In 19 patients, synthetic MR images were then created from the MAGIC scan, and ROIs were placed in the peritumoral edema to obtain the quantitative values. Dynamic susceptibility contrast perfusion was used to obtain cerebral blood volume (rCBV) data for the peritumoral edema. Voxel-based statistical analysis was performed using a mixed linear model. Results: R1, R2, and rCBV decrease with increasing distance from the contrast-enhancing part of the tumor. There is a significant increase in the R1 gradient after contrast agent injection (P < .0001). There is a heterogeneous pattern of relaxation values in the peritumoral edema adjacent to the contrast-enhancing part of the tumor. Conclusion: Quantitative analysis with relaxometry of peritumoral edema in malignant gliomas detects tissue changes not visualized on conventional MR images. The finding that R1 and R2 decrease with distance from the tumor implies shorter relaxation times closer to the tumor, which could reflect tumor invasion into the peritumoral edema. However, these findings need to be validated in the future.

  7. Sampler-sensor for preconcentration and quantitation of atmospheric hydrogen sulfide

    SciTech Connect

    LaRue, R.; Ataman, O.Y.; Hautman, D.P.; Gerhardt, G.; Zimmer, H.; Mark, H.B. Jr.

    1987-09-15

    Increasing concern over atmospheric environmental problems has necessitated the design and development of new analytical techniques as alternatives to presently employed methods. The use of solid sorbents for collection of pollutants in air has gained general acceptance, and criteria for this type of system have often been presented. In general practice, solid sorbents are subjected to a desorption process in order to prepare an analyte solution prior to an analytical detection procedure of choice. Therefore, the solid sorbents employed are designed to provide physical and chemical characteristics for efficient uptake of the analyte while still allowing effective desorption, so as to achieve the high overall recovery necessary for the required accuracy. Collection of atmospheric H{sub 2}S on a Cd(II)-exchanged zeolite as a solid sorbent has been studied in combination with several analytical techniques as diverse as X-ray fluorescence spectrometry, combustion analysis by nondispersive infrared (IR) measurement, diffuse reflectance Fourier transform infrared (FTIR) spectrometry, visible spectrometry, and photoacoustic spectrometry, applied both to the intact solid sorbent and to its leached solution after conversion to methylene blue. The present paper reports initial data on a novel sampler-sensor that uses a pretreated filter paper as the solid sorbent, with an air channel in the shape of a planar spiral, giving a direct visual readout of low levels of H{sub 2}S.

  8. Quantitative analysis of in vivo confocal microscopy images: a review.

    PubMed

    Patel, Dipika V; McGhee, Charles N

    2013-01-01

    In vivo confocal microscopy (IVCM) is a non-invasive method of examining the living human cornea. The recent trend towards quantitative studies using IVCM has led to the development of a variety of methods for quantifying image parameters. When selecting IVCM images for quantitative analysis, it is important to be consistent regarding the location, depth, and quality of images. All images should be de-identified, randomized, and calibrated prior to analysis. Numerous image analysis software packages are available, each with its own advantages and disadvantages. Criteria for analyzing corneal epithelium, sub-basal nerves, keratocytes, endothelium, and immune/inflammatory cells have been developed, although there is inconsistency among research groups regarding parameter definitions. The quantification of stromal nerve parameters, however, remains a challenge. Most studies report lower inter-observer repeatability compared with intra-observer repeatability, and observer experience is known to be an important factor. Standardization of IVCM image analysis through the use of a reading center would be crucial for any future large, multi-centre clinical trials using IVCM.

  9. Application of hydrogen analysis by neutron imaging plate method to Zircaloy cladding tubes

    NASA Astrophysics Data System (ADS)

    Yasuda, Ryou; Nakata, Masahito; Matsubayashi, Masahito; Harada, Katsuya; Hatakeyama, Yuichi; Amano, Hidetoshi

    2003-08-01

    The effectiveness of the neutron imaging plate (NIP) method for hydrogen analysis is investigated using standard samples with known hydrogen concentrations. A relationship between the hydrogen concentration in Zircaloy tubes and the numerical data in the NIP images was obtained by an image analysis process. Using this relationship, local hydrogen concentrations in segregated tubes with heterogeneous hydrogen distributions were estimated over areas as small as 0.1 × 0.1 mm². The contribution of an oxide film on the tubes to the images was also investigated using oxidized samples with and without hydrides. In the NIP images of the oxidized samples, the oxide film was not discernible, and the numerical analysis likewise showed no effect of the oxide film. These results show that the effect of oxygen in the image can be neglected when hydrogen analysis is performed by the NIP method on Zircaloy tubes with oxide films and hydrides.
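    The calibration-and-inversion step described above amounts to a linear fit of NIP signal versus known hydrogen concentration; a minimal sketch with illustrative placeholder numbers:

```python
import numpy as np

known_h = np.array([0.0, 100.0, 250.0, 500.0])   # ppm, hypothetical standards
signal = np.array([0.02, 0.11, 0.25, 0.50])      # background-corrected NIP data

slope, intercept = np.polyfit(known_h, signal, 1)

# Invert the fit for one 0.1 x 0.1 mm^2 region of a segregated tube
local_h_ppm = (0.31 - intercept) / slope
```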

  10. Analysis of combined hydrogen, heat, and power as a bridge to a hydrogen transition.

    SciTech Connect

    Mahalik, M.; Stephan, C.

    2011-01-18

    Combined hydrogen, heat, and power (CHHP) technology is envisioned as a means of providing heat and electricity, generated on-site, to large end users, such as hospitals, hotels, and distribution centers, while simultaneously producing hydrogen as a by-product. The hydrogen can be stored for later conversion to electricity, used on-site (e.g., in forklifts), or dispensed to hydrogen-powered vehicles. Argonne has developed a complex-adaptive-system model, H2CAS, to simulate how vehicles and infrastructure can evolve in a transition to hydrogen. This study applies the H2CAS model to examine how CHHP technology can be used to aid the transition to hydrogen. It does not attempt to predict the future or provide a single forecast of system development; rather, the purpose of the model is to understand how the system works. The model uses a 50- by 100-mile rectangular grid of 1-square-mile cells centered on the Los Angeles metropolitan area. The major expressways are incorporated into the model, and local streets are considered to be ubiquitous, except where there are natural barriers. The model has two types of agents. Driver agents are characterized by a number of parameters: home and job locations, income, various types of 'personalities' reflective of marketing distinctions (e.g., innovators, early adopters), willingness to spend extra money on 'green' vehicles, etc. At the beginning of the simulations, almost all driver agents own conventional vehicles. They drive around the metropolitan area, commuting to and from work and traveling to various other destinations. As they do so, they observe the presence or absence of facilities selling hydrogen. If they find such facilities conveniently located along their routes, they are motivated to purchase a hydrogen-powered vehicle when it becomes time to replace their present vehicle. Conversely, if they find that they would be inconvenienced by having to purchase hydrogen earlier than necessary or if they become worried that they

  11. Quantitative Remote Laser-Induced Breakdown Spectroscopy by Multivariate Analysis

    NASA Astrophysics Data System (ADS)

    Clegg, S. M.; Sklute, E. C.; Dyar, M. D.; Barefield, J. E.; Wiens, R. C.

    2007-12-01

    The ChemCam instrument selected for the Mars Science Laboratory (MSL) rover includes a remote Laser-Induced Breakdown Spectrometer (LIBS) that will quantitatively probe samples up to 9 m from the rover mast. LIBS is fundamentally an elemental analysis technique: it involves focusing a Nd:YAG laser operating at 1064 nm onto the surface of the sample. The laser ablates material from the surface, generating an expanding plasma containing electronically excited ions, atoms, and small molecules. As these electronically excited species relax back to the ground state, they emit light at wavelengths characteristic of the species present in the sample. Some of this emission is directed into one of three dispersive spectrometers. In this paper, we studied a suite of 18 igneous and highly metamorphosed samples from a wide variety of parageneses for which chemical analyses by XRF were already available. Rocks were chosen to represent a range of chemical compositions from basalt to rhyolite, thus providing significant variations in all of the major element contents (Si, Fe, Al, Ca, Na, K, O, Ti, Mg, and Mn). These samples were probed at a 9 m standoff distance under experimental conditions similar to ChemCam. Extracting quantitative elemental concentrations from LIBS spectra is complicated by chemical matrix effects. Conventional methods for obtaining quantitative chemical data from LIBS analyses are compared with new multivariate analysis (MVA) techniques that appear to compensate for these chemical matrix effects. The traditional analyses use specific elemental peak heights or areas, which are compared with calibration curves for each element at one or more emission lines for a series of standard samples. Because of matrix effects, the calibration standards generally must have chemistries similar to the unknown samples, and thus this conventional approach imposes severe limitations on application of the technique to remote analyses. In this suite of samples, the use
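    The abstract names only 'multivariate analysis'; partial least squares regression is a common concrete choice for LIBS calibration, so the sketch below is an assumed stand-in, run here on random placeholder data:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

X = np.random.rand(18, 6144)        # 18 rock spectra x wavelength channels
y = np.random.rand(18) * 30.0       # e.g., SiO2 wt% from XRF reference analyses

pls = PLSRegression(n_components=8).fit(X, y)
predicted_wt = pls.predict(X[:3])   # calibration output for new spectra
```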

  12. A quantitative analysis of IRAS maps of molecular clouds

    NASA Technical Reports Server (NTRS)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-01-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In the present study we use the IRAS continuum maps at 100 and 60 μm to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.

  13. Binary imaging analysis for comprehensive quantitative histomorphometry of peripheral nerve.

    PubMed

    Hunter, Daniel A; Moradzadeh, Arash; Whitlock, Elizabeth L; Brenner, Michael J; Myckatyn, Terence M; Wei, Cindy H; Tung, Thomas H H; Mackinnon, Susan E

    2007-10-15

    Quantitative histomorphometry is the current gold standard for objective measurement of nerve architecture and its components. Many methods still in use rely heavily upon manual techniques that are prohibitively time consuming, predisposing to operator fatigue, sampling error, and overall limited reproducibility. More recently, investigators have attempted to combine the speed of automated morphometry with the accuracy of manual and semi-automated methods. Systematic refinements in binary imaging analysis techniques combined with an algorithmic approach allow for more exhaustive characterization of nerve parameters in the surgically relevant injury paradigms of regeneration following crush, transection, and nerve gap injuries. The binary imaging method introduced here uses multiple bitplanes to achieve reproducible, high throughput quantitative assessment of peripheral nerve. Number of myelinated axons, myelinated fiber diameter, myelin thickness, fiber distributions, myelinated fiber density, and neural debris can be quantitatively evaluated with stratification of raw data by nerve component. Results of this semi-automated method are validated by comparing values against those obtained with manual techniques. The use of this approach results in more rapid, accurate, and complete assessment of myelinated axons than manual techniques.
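    The counting-and-measuring core of such binary-image morphometry can be sketched with scikit-image labeling (a generic illustration on a stand-in mask, not the published multi-bitplane algorithm):

```python
import numpy as np
from skimage.measure import label, regionprops

binary = np.random.rand(512, 512) > 0.995     # stand-in axon mask (one bitplane)

regions = regionprops(label(binary))
n_axons = len(regions)
diameters = [2.0 * np.sqrt(r.area / np.pi) for r in regions]  # equivalent diameter
density = n_axons / binary.size               # per pixel; scale by pixel area
```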

  14. Variability in quantitative cardiac magnetic resonance perfusion analysis

    PubMed Central

    Bratis, K.

    2013-01-01

    By virtue of its high spatial resolution and noninvasive, nontoxic nature, first-pass perfusion cardiovascular magnetic resonance (CMR) has become an indispensable tool for the noninvasive detection of reversible myocardial ischemia. A potential advantage of perfusion CMR is its ability to quantitatively assess perfusion reserve within a myocardial segment, expressed semi-quantitatively by the myocardial perfusion reserve index (MPRI) and fully quantitatively by absolute myocardial blood flow (MBF). In contrast to the high accuracy and reliability of CMR in evaluating cardiac function and volumes, perfusion CMR is adversely affected by multiple potential sources of error during both data acquisition and post-processing. Different image acquisition techniques, contrast agents and doses, variable blood flow at rest, and variable reactions to stress all influence the acquired data. The mechanisms underlying variability in perfusion CMR post-processing, as well as their clinical significance, are yet to be fully elucidated. The development of a universal, reproducible, accurate and easily applicable tool for CMR perfusion analysis remains a challenge and would substantially strengthen the role of perfusion CMR in improving clinical care. PMID:23825774

  15. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
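    The pseudo-predator construction described above can be sketched as diet-weighted bootstrap means of prey signatures; the bootstrap sizes and data below are placeholders (the paper's contribution is choosing those sizes objectively):

```python
import numpy as np

rng = np.random.default_rng(1)

def pseudo_predator(prey_sigs, diet, n_boot):
    """prey_sigs: species -> (n_samples x n_fatty_acids) signature array;
    diet: species -> proportion (sums to 1); n_boot: bootstrap sample size."""
    parts = []
    for species, proportion in diet.items():
        sig = prey_sigs[species]
        idx = rng.integers(0, len(sig), n_boot)     # sample with replacement
        parts.append(proportion * sig[idx].mean(axis=0))
    s = np.sum(parts, axis=0)
    return s / s.sum()                              # renormalize the signature

prey = {"seal": np.random.dirichlet(np.ones(10), 40),
        "whale": np.random.dirichlet(np.ones(10), 25)}
print(pseudo_predator(prey, {"seal": 0.7, "whale": 0.3}, n_boot=30))
```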

  16. Quantitative option analysis for implementation and management of landfills.

    PubMed

    Kerestecioğlu, Merih

    2016-09-01

    The selection of the most feasible strategy for implementation of landfills is a challenging step. Potential implementation options for landfills cover a wide range, from conventional construction contracts to concessions. Montenegro, seeking to improve the efficiency of its public services while maintaining affordability, was considering privatisation as a way to reduce public spending on service provision. In this study, to determine the most feasible model for construction and operation of a regional landfill, a quantitative risk analysis was implemented in five steps: (i) development of a global risk matrix; (ii) assignment of qualitative probabilities of occurrence and magnitudes of impact; (iii) determination of the risks to be mitigated, monitored, controlled or ignored; (iv) reduction of the main risk elements; and (v) incorporation of quantitative estimates of probability of occurrence and expected impact for each risk element in the reduced risk matrix. The evaluated scenarios were: (i) construction and operation of the regional landfill by the public sector; (ii) construction and operation of the landfill by the private sector and transfer of ownership to the public sector after a pre-defined period; and (iii) operation of the landfill by the private sector, without ownership. The quantitative risk assessment concluded that introduction of a public-private partnership is not the most feasible option, contrary to the common belief in several public institutions in developing countries. A management contract for the first years of operation was advised, after which a long-term operating contract may follow. © The Author(s) 2016.

  17. Quantitative phosphoproteomic analysis using iTRAQ method.

    PubMed

    Asano, Tomoya; Nishiuchi, Takumi

    2014-01-01

    The MAPK (mitogen-activated protein kinase) cascade plays important roles in plant perception of, and reaction to, developmental and environmental cues. Phosphoproteomics is useful for identifying target proteins regulated by MAPK-dependent signaling pathways. Here, we introduce quantitative phosphoproteomic analysis using a chemical labeling method. The isobaric tag for relative and absolute quantitation (iTRAQ) method is an MS-based technique to quantify protein expression among up to eight different samples in one experiment. In this technique, peptides are labeled with stable isotope-coded covalent tags. We performed quantitative phosphoproteomics comparing Arabidopsis wild type and a stress-responsive mapkk mutant after phytotoxin treatment. To comprehensively identify the downstream phosphoproteins of the MAPKK, total proteins were extracted from phytotoxin-treated wild-type and mapkk mutant plants. The phosphoproteins were purified with a Pro-Q(®) Diamond Phosphoprotein Enrichment Kit and digested with trypsin. The resulting peptides were labeled with iTRAQ reagents and were quantified and identified on a MALDI TOF/TOF analyzer. We identified many phosphoproteins whose abundance was decreased in the mapkk mutant compared with wild type.

  18. Lipid biomarker analysis for the quantitative analysis of airborne microorganisms

    SciTech Connect

    Macnaughton, S.J.; Jenkins, T.L.; Cormier, M.R.

    1997-08-01

    There is ever-increasing concern regarding the presence of airborne microbial contaminants within indoor air environments. Exposure to such biocontaminants can give rise to a large number of different health effects, including infectious diseases, allergenic responses and respiratory problems. Biocontaminants typically found in indoor air environments include bacteria, fungi, algae, protozoa and dust mites. Mycotoxins, endotoxins, pollens and residues of organisms are also known to cause adverse health effects. A quantitative detection/identification technique independent of culturability, assaying both culturable and non-culturable biomass including endotoxin, is critical in defining the risks from indoor air biocontamination. Traditionally, methods employed for monitoring microorganism numbers in indoor air environments involve classical culture-based techniques and/or direct microscopic counting. It has been repeatedly documented that viable microorganism counts account for only 0.1-10% of the total community detectable by direct counting. The classic viable microbiological approach does not provide accurate estimates of microbial fragments or other indoor air components that can act as antigens and induce or potentiate allergic responses. Although bioaerosol samplers are designed to damage the microbes as little as possible, microbial stress has been shown to result from air sampling, aerosolization and microbial collection: higher collection efficiency results in greater cell damage, while less cell damage often results in lower collection efficiency. Filtration can collect particulates at almost 100% efficiency, but captured microorganisms may become dehydrated and damaged, resulting in non-culturability. The lipid biomarker assays described herein, however, do not rely on cell culture. Lipids are components that are universally distributed throughout cells, providing a means of assessment independent of culturability.

  1. Analysis of hydrogen-bond interaction potentials from the electron density: integration of noncovalent interaction regions.

    PubMed

    Contreras-García, Julia; Yang, Weitao; Johnson, Erin R

    2011-11-17

    Hydrogen bonds are of crucial relevance to many problems in chemistry, biology, and materials science. The recently developed NCI (noncovalent interactions) index enables real-space visualization of both attractive (van der Waals and hydrogen-bonding) and repulsive (steric) interactions based on properties of the electron density. It is thus an optimal index to describe the interplay of stabilizing and destabilizing contributions that determine stable minima on hydrogen-bonding potential-energy surfaces (PESs). In the framework of density-functional theory, energetics are completely determined by the electron density. Consequently, NCI will be shown to allow quantitative treatment of hydrogen-bond energetics. The evolution of NCI regions along a PES follows a well-behaved pattern which, upon integration of the electron density, is capable of mimicking conventional hydrogen-bond interatomic potentials.
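    For reference, the NCI index is built on the reduced density gradient of the electron density; the standard definition (quoted from the NCI literature generally, not reproduced from this abstract) is:

```latex
s(\mathbf{r}) \;=\; \frac{\lvert \nabla \rho(\mathbf{r}) \rvert}
                         {2\,(3\pi^{2})^{1/3}\,\rho(\mathbf{r})^{4/3}}
```

    Low-s, low-ρ regions mark noncovalent interactions; integrating ρ over those regions is what links NCI to the hydrogen-bond energetics discussed above.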

  2. Quantification of Hydrogen Concentrations in Surface and Interface Layers and Bulk Materials through Depth Profiling with Nuclear Reaction Analysis.

    PubMed

    Wilde, Markus; Ohno, Satoshi; Ogura, Shohei; Fukutani, Katsuyuki; Matsuzaki, Hiroyuki

    2016-03-29

    Nuclear reaction analysis (NRA) via the resonant (1)H((15)N,αγ)(12)C reaction is a highly effective method of depth profiling that quantitatively and non-destructively reveals the hydrogen density distribution at surfaces, at interfaces, and in the volume of solid materials with high depth resolution. The technique applies a (15)N ion beam of 6.385 MeV provided by an electrostatic accelerator and specifically detects the (1)H isotope at depths up to about 2 μm from the target surface. Surface H coverages are measured with a sensitivity on the order of ~10(13) cm(-2) (~1% of a typical atomic monolayer density) and H volume concentrations with a detection limit of ~10(18) cm(-3) (~100 at. ppm). The near-surface depth resolution is 2-5 nm for surface-normal (15)N ion incidence onto the target and can be enhanced to values below 1 nm for very flat targets by adopting a surface-grazing incidence geometry. The method is versatile and readily applied to any high-vacuum-compatible homogeneous material with a smooth surface (no pores). Electrically conductive targets usually tolerate the ion beam irradiation with negligible degradation. Hydrogen quantitation and correct depth analysis require knowledge of the elementary composition (besides hydrogen) and mass density of the target material. Especially in combination with ultra-high vacuum methods for in-situ target preparation and characterization, (1)H((15)N,αγ)(12)C NRA is ideally suited for hydrogen analysis at atomically controlled surfaces and nanostructured interfaces. As examples, we demonstrate the application of (15)N NRA at the MALT Tandem accelerator facility of the University of Tokyo (1) to quantitatively measure the surface coverage and the bulk concentration of hydrogen in the near-surface region of an H2-exposed Pd(110) single crystal, and (2) to determine the depth location and layer density of hydrogen near the interfaces of thin SiO2 films on Si(100).
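
    The depth scale in this kind of resonant profiling follows from beam slowing: the resonance is met at the depth where the (15)N ions have shed the excess energy E_beam − E_res. A minimal sketch of that conversion, assuming a hypothetical constant stopping power (a real analysis derives it from the target's composition and mass density):

```python
# Minimal sketch of the energy-to-depth conversion used in resonant NRA
# depth profiling. The stopping power below is a made-up placeholder;
# real analyses take it from the target's composition and mass density.
E_RES_MEV = 6.385             # 1H(15N,ag)12C resonance energy (from the text)
STOPPING_MEV_PER_NM = 3.0e-3  # hypothetical stopping power of the target

def probing_depth_nm(beam_energy_mev: float) -> float:
    """Depth at which the 15N ions slow down to the resonance energy."""
    if beam_energy_mev < E_RES_MEV:
        raise ValueError("beam energy below resonance: no depth probed")
    return (beam_energy_mev - E_RES_MEV) / STOPPING_MEV_PER_NM

# Raising the beam energy scans the resonance deeper into the sample.
for e in (6.385, 6.5, 7.0):
    print(f"E = {e:.3f} MeV -> depth = {probing_depth_nm(e):8.1f} nm")
```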

  3. Technical Analysis of Hydrogen Production: Evaluation of H2 Mini-Grids

    SciTech Connect

    Lasher, Stephen; Sinha, Jayanti

    2005-05-03

    We have assessed the transportation of hydrogen as a metal hydride slurry through pipelines over a short distance from a neighborhood hydrogen production facility to local points of use. The assessment was conducted in the context of a hydrogen "mini-grid" serving both vehicle fueling and stationary fuel cell power systems for local building heat and power. The concept was compared to a compressed gaseous hydrogen mini-grid option and to a stand-alone hydrogen fueling station. Based on our analysis, we conclude that the metal hydride slurry concept has the potential to provide significant reductions in overall energy use compared to liquid or chemical hydride delivery, but only modest reductions in overall energy use, hydrogen cost, and GHG emissions compared to compressed gaseous hydrogen delivery. However, given the inherent (and perceived) safety and reasonable cost/efficiency of metal hydride slurry systems, additional research and analysis are warranted. The concept could potentially overcome the public acceptance barrier associated with perceptions of hydrogen delivery (including liquid hydrogen tanker trucks and high-pressure gaseous hydrogen pipelines or tube trailers) and facilitate the development of a near-term hydrogen infrastructure.

  4. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for areas of the skin of a human foot and face. The full source code of the developed application is also provided as an attachment. (Figure: the main window of the program during dynamic analysis of the foot thermal image.) © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
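
    The paper's implementation is in Matlab; purely as an illustration of the kind of reproducible region-of-interest parameters involved, here is a short Python sketch on a synthetic thermal image (all values hypothetical):

```python
import numpy as np

# Synthetic "thermal image": temperatures in deg C with a warmer region.
rng = np.random.default_rng(0)
image = 30.0 + rng.normal(0.0, 0.3, size=(240, 320))
image[100:140, 150:210] += 2.5  # simulated warm area of skin

# Simple reproducible parameters over a rectangular region of interest.
roi = image[100:140, 150:210]
print(f"ROI mean  : {roi.mean():.2f} C")
print(f"ROI std   : {roi.std():.2f} C")
print(f"ROI range : {roi.min():.2f} .. {roi.max():.2f} C")
# A fully automated pipeline would locate the ROI itself, e.g. by
# thresholding against the surrounding skin temperature.
```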

  5. 3D visualization and quantitative analysis of human erythrocyte phagocytosis.

    PubMed

    Stachurska, Anna; Król, Teodora; Trybus, Wojciech; Szary, Karol; Fabijańska-Mitek, Jadwiga

    2016-11-01

    Since the erythrophagocytosis of opsonized erythrocytes is investigated mainly by calculating the phagocytic index using subjective light microscopy evaluation, we present methods for the quantitative and qualitative analysis of human cell erythrophagocytosis. Erythrocytes from two storage periods were used. Using Imaris software, we were able to create a three-dimensional model of erythrophagocytosis. The use of microscopy instead of cytometry revealed a significantly higher number of monocytes and erythrocytes that appeared active in phagocytosis. Spatial reconstruction allowed for detailed analysis of the process by precisely locating erythrocytes in phagocytes. Additionally, a technique of sequential image registration using Nis Elements software allowed for observation of the course of phagocytosis over a range of time intervals. This in vitro research may be helpful for understanding the cellular interactions between monocytes and erythrocytes. The cytometric method-being relatively rapid, sensitive, and specific-can serve as an alternative technique to microscopy in the quantitative analysis of erythrophagocytosis. This allows us to avoid counting the erythrocytes nonspecifically attached to monocytes and gives objective results. © 2016 International Federation for Cell Biology.

  6. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    PubMed Central

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in terms of quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deeper understanding of the pathological functions of glycoproteins and for new biomarker development. This review first describes the protein-glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable-isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews published by Wiley Periodicals, Inc. Mass Spectrom Rev 34:148–165, 2015. PMID:24889823

  7. Quantitative analysis of motion control in long term microgravity.

    PubMed

    Baroni, G; Ferrigno, G; Anolli, A; Andreoni, G; Pedotti, A

    1998-01-01

    As part of the 179-day EUROMIR '95 space mission, two in-flight experiments involved quantitative three-dimensional human movement analysis in microgravity. To this end, a space-qualified opto-electronic motion analyser based on passive markers was installed onboard the Russian Space Station MIR, and 8 in-flight sessions were performed. The technology and method for the collection of kinematic data are described, evaluating the accuracy of three-dimensional marker localisation. Results confirm the suitability of opto-electronic technology for quantitative human motion analysis on orbital modules and raise a set of "lessons learned" leading to improvement of motion analyser performance while streamlining on-board operations. From the T4 experimental program, the results of three voluntary posture perturbation protocols are described. The analysis suggests that a short-term reinterpretation of proprioceptive information and re-calibration of sensorimotor mechanisms seem to end within the first weeks of flight, while a continuous long-term adaptation process allows the refinement of motor performance, within terrestrial strategies that are never abandoned.

  8. Correlative SEM SERS for quantitative analysis of dimer nanoparticles.

    PubMed

    Timmermans, F J; Lenferink, A T M; van Wolferen, H A G M; Otto, C

    2016-11-14

    A Raman microscope integrated with a scanning electron microscope was used to investigate plasmonic structures by correlative SEM-SERS analysis. The integrated Raman-SEM microscope combines high-resolution electron microscopy information with SERS signal enhancement from selected nanostructures with adsorbed Raman reporter molecules. Correlative analysis is performed for dimers of two gold nanospheres. Dimers were selected on the basis of SEM images from multi-aggregate samples. The effects of the orientation of the dimer with respect to the polarization state of the laser light and of the particle gap size on the Raman signal intensity are observed. Additionally, calculations are performed to simulate the electric near-field enhancement. These simulations are based on the morphologies observed by electron microscopy. In this way the experiments are compared with the enhancement factor calculated from near-field simulations and are subsequently used to quantify the SERS enhancement factor. Large differences between experimentally observed and calculated enhancement factors are regularly detected, a phenomenon caused by nanoscale differences between the real and 'simplified' simulated structures. Quantitative SERS experiments reveal the structure-induced enhancement factor, ranging from ∼200 to ∼20 000, averaged over the full nanostructure surface. The results demonstrate correlative Raman-SEM microscopy for the quantitative analysis of plasmonic particles and structures, thus enabling a new analytical method in the field of SERS and plasmonics.
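
    The surface-averaged enhancement factor quoted above is conventionally estimated as EF = (I_SERS/N_SERS)/(I_ref/N_ref). A minimal sketch with hypothetical intensities and molecule counts, not data from the paper:

```python
# Hedged sketch of the standard SERS enhancement-factor estimate
# EF = (I_SERS / N_SERS) / (I_ref / N_ref); all numbers are hypothetical.

def enhancement_factor(i_sers: float, n_sers: float,
                       i_ref: float, n_ref: float) -> float:
    """Surface-averaged SERS enhancement factor.

    i_sers / i_ref : Raman intensities with and without the nanostructure.
    n_sers / n_ref : numbers of reporter molecules probed in each case.
    """
    return (i_sers / n_sers) / (i_ref / n_ref)

# A dimer probing far fewer molecules but still giving a strong signal:
print(f"EF = {enhancement_factor(4.0e3, 1.0e4, 2.0e4, 1.0e9):.0f}")
```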

  9. Computer compensation for NMR quantitative analysis of trace components

    SciTech Connect

    Nakayama, T.; Fujiwara, Y.

    1981-07-22

    A computer program has been written that determines trace components and separates overlapping components in multicomponent NMR spectra. The program uses the Lorentzian curve as the theoretical line shape of NMR spectra. The coefficients of the Lorentzian are determined by the method of least squares. Systematic errors such as baseline/phase distortion are compensated, and random errors are smoothed by taking moving averages, so that these processes contribute substantially to decreasing the accumulation time of spectral data. The accuracy of quantitative analysis of trace components has been improved by two significant figures. The program was applied to determining the abundance of 13C and the degree of saponification of PVA.
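
    As a present-day analogue of the fitting step (the original program long predates these tools), a Lorentzian line shape can be fitted by least squares in a few lines; the noisy trace below is synthetic:

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, amplitude, center, hwhm, baseline):
    """Lorentzian line shape with a constant baseline term."""
    return amplitude * hwhm**2 / ((x - center) ** 2 + hwhm**2) + baseline

# Synthetic trace standing in for an NMR spectrum with a weak peak.
x = np.linspace(-5, 5, 501)
rng = np.random.default_rng(1)
y = lorentzian(x, 0.05, 0.8, 0.3, 0.01) + rng.normal(0, 0.002, x.size)

# Least-squares determination of the Lorentzian coefficients.
p0 = [0.04, 1.0, 0.5, 0.0]  # rough initial guesses
popt, _ = curve_fit(lorentzian, x, y, p0=p0)
print("amplitude, center, HWHM, baseline:", np.round(popt, 4))
```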

  10. Quantitative analysis of sideband coupling in photoinduced force microscopy

    NASA Astrophysics Data System (ADS)

    Jahng, Junghoon; Kim, Bongsu; Lee, Eun Seong; Potma, Eric Olaf

    2016-11-01

    We present a theoretical and experimental analysis of the cantilever motions detected in photoinduced force microscopy (PiFM) using the sideband coupling detection scheme. In sideband coupling, the cantilever dynamics are probed at a combination frequency of a fundamental mechanical eigenmode and the modulation frequency of the laser beam. Using this detection mode, we develop a method for reconstructing the modulated photoinduced force gradient from experimental parameters in a quantitative manner. We show evidence, both theoretically and experimentally, that the sideband coupling detection mode provides PiFM images with superior contrast compared to images obtained when detecting the cantilever motions directly at the laser modulation frequency.
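
    The detection scheme can be illustrated with a toy signal: amplitude modulation of an eigenmode oscillation at f0 by the laser modulation at fm places sidebands at f0 ± fm, where the force gradient is read out. A hedged numerical sketch with made-up parameters:

```python
import numpy as np

# Illustrative only (not instrument data): an amplitude-modulated
# oscillation produces sidebands at f0 +/- fm.
fs = 1.0e6                      # sampling rate, Hz
t = np.arange(0, 0.05, 1 / fs)  # 50 ms record
f0, fm = 60_000.0, 2_000.0      # eigenmode and laser-modulation frequencies
signal = (1.0 + 0.1 * np.cos(2 * np.pi * fm * t)) * np.cos(2 * np.pi * f0 * t)

# Amplitude spectrum; peaks appear at f0 and at the sidebands f0 +/- fm.
spectrum = np.abs(np.fft.rfft(signal)) / (t.size / 2)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
for target in (f0 - fm, f0, f0 + fm):
    idx = np.argmin(np.abs(freqs - target))
    print(f"{freqs[idx]:9.0f} Hz : amplitude {spectrum[idx]:.3f}")
```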

  11. Flow quantitation by radio frequency analysis of contrast echocardiography.

    PubMed

    Rovai, D; Lombardi, M; Mazzarisi, A; Landini, L; Taddei, L; Distante, A; Benassi, A; L'Abbate, A

    1993-03-01

    Contrast echocardiography has the potential for measuring cardiac output and regional blood flow. However, accurate quantitation is limited both by the use of non-standard contrast agents and by the electronic signal distortion inherent to echocardiographic instruments. Thus, the aim of this study was to quantify flow by combining a stable contrast agent with modified echo equipment able to sample the radio frequency (RF) signal from a region of interest (ROI) in the echo image. The contrast agent SHU-454 (0.8 ml) was bolus-injected into an in vitro calf vein at 23 flow rates (ranging from 376 to 3620 ml/min) but constant volume and pressure. The ROI was placed in the centre of the vein, and the RF signal was processed in real time and transferred to a personal computer to generate time-intensity curves. In the absence of recirculation, the contrast washout slope and mean transit time (MTT) of the curves (1.11-8.52 seconds) yielded excellent correlations with flow: r = 0.93 and 0.95, respectively. To compare the accuracy of RF analysis with that of conventional image processing for flow quantitation, conventional images were collected in the same flow model by two different scanners: (a) the mechanical sector scanner used for RF analysis, and (b) a conventional electronic sector scanner. These images were digitized off-line, mean videodensity inside an identical ROI was measured, and time-intensity curves were built. MTT by RF was shorter than by videodensitometric analysis of the images generated by the same scanner (p < 0.001). In contrast, MTT by RF was longer than by the conventional scanner (p < 0.001). Significant differences in MTT were also found with changes in the gain settings of the conventional scanner. To study the stability of the contrast effect, 6 contrast injections (20 ml) were performed at a constant flow rate during recirculation: the spontaneous decay in RF signal intensity (t1/2 = 64 +/- 8 seconds) was too long to affect MTT significantly.
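
    The MTT used above is the first moment of the time-intensity curve, MTT = ∫t·I(t)dt / ∫I(t)dt. A minimal sketch on a synthetic washout curve, illustrative only:

```python
import numpy as np

t = np.linspace(0.0, 20.0, 401)          # seconds
intensity = t * np.exp(-t / 1.5)         # synthetic indicator-dilution curve

# First moment of the time-intensity curve (uniform sampling, so the
# time step cancels in the ratio).
mtt = (t * intensity).sum() / intensity.sum()
print(f"MTT = {mtt:.2f} s")              # shorter MTT => higher flow
```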

  12. Simulating the focal volume effect: a quantitative analysis

    NASA Astrophysics Data System (ADS)

    Scarborough, Timothy D.; Uiterwaal, Cornelis J. G. J.

    2013-12-01

    We present quantitative simulations of the focal volume effect. Intensity distributions in detection volumes with two- and three-dimensional spatial resolution are calculated. Results include an analysis of translations of these volumes in the focus along the direction of laser propagation as well as discussion of varying sizes of the spatially resolved volumes. We find that detection volumes less than half the 1/e full-width beam waist and less than half the Rayleigh length along the propagation direction offer an optimal compromise of maintaining intensity resolution without sacrificing peak intensity.
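
    For reference, the intensity distribution underlying such focal-volume simulations is the Gaussian-beam profile I(r,z) = I0·(w0/w(z))²·exp(−2r²/w(z)²) with w(z) = w0·sqrt(1+(z/zR)²). A short sketch with illustrative parameters, not the paper's values:

```python
import numpy as np

w0 = 10e-6                       # beam waist, m (illustrative)
wavelength = 800e-9              # m (illustrative)
zR = np.pi * w0**2 / wavelength  # Rayleigh length

def intensity(r, z, i0=1.0):
    """Gaussian-beam intensity at radius r and axial position z."""
    w = w0 * np.sqrt(1.0 + (z / zR) ** 2)
    return i0 * (w0 / w) ** 2 * np.exp(-2.0 * r**2 / w**2)

# Fraction of peak intensity seen by a detection volume displaced half a
# Rayleigh length along the propagation direction:
print(f"I(0, zR/2)/I0 = {intensity(0.0, zR / 2):.3f}")  # 1/(1+0.25) = 0.8
```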

  13. Neutron diffractometer INES for quantitative phase analysis of archaeological objects

    NASA Astrophysics Data System (ADS)

    Imberti, S.; Kockelmann, W.; Celli, M.; Grazzi, F.; Zoppi, M.; Botti, A.; Sodo, A.; Imperiale, M. Leo; de Vries-Melein, M.; Visser, D.; Postma, H.

    2008-03-01

    With the Italian Neutron Experimental Station (INES), a new general-purpose neutron powder diffractometer is available at ISIS, characterized by high resolution at low d-spacings and particularly suited to the quantitative phase analysis of a wide range of archaeological materials. Time-of-flight neutron diffraction is notable for being a non-destructive technique, allowing a reliable determination of the phase compositions of multiphase artefacts, with or without superficial corrosion layers. A selection of archaeometric studies carried out during the first year of the INES user programme is presented here to demonstrate the capabilities of the instrument.

  14. Quantum dots assisted laser desorption/ionization mass spectrometric detection of carbohydrates: qualitative and quantitative analysis.

    PubMed

    Bibi, Aisha; Ju, Huangxian

    2016-04-01

    A quantum dots (QDs) assisted laser desorption/ionization mass spectrometric (QDA-LDI-MS) strategy was proposed for the qualitative and quantitative analysis of a series of carbohydrates. The adsorption of carbohydrates on the modified surfaces of the different QDs serving as matrices depended mainly on the formation of hydrogen bonds, which led to higher MS intensities than those obtained with a conventional organic matrix. The effects of QD concentration and sample preparation method were explored to improve the selective ionization process and the detection sensitivity. The proposed approach offers a new dimension to the application of QDs as matrices for MALDI-MS research on carbohydrates. It could be used for the quantitative measurement of glucose concentration in human serum with good performance. The QDs serving as a matrix showed the advantages of low background, higher sensitivity, convenient sample preparation and excellent stability under vacuum. The QD-assisted LDI-MS approach has promising application to the analysis of carbohydrates in complex biological samples. Copyright © 2016 John Wiley & Sons, Ltd.

  15. First principles analysis of metal and oxide-metal interfacial catalysis for hydrogen production

    NASA Astrophysics Data System (ADS)

    Greeley, Jeffrey

    2014-03-01

    Current and growing interest in the development of new catalytic materials for complex chemistries has challenged the methods traditionally employed by practitioners of computational catalysis. Explicit Density Functional Theory (DFT) analysis of all possible reaction pathways in biomass reaction networks, for example, is computationally prohibitive, and to make progress at a reasonable rate, strategies to accelerate the predictions made by DFT-based methods must be developed. In this talk, we will review some recent work in our group focusing on first principles analyses of the production of hydrogen from the decomposition of biomass-derived oxygenated hydrocarbons on heterogeneous catalytic surfaces. We will discuss, in particular, the development of accelerated DFT-based strategies to map the complex reaction networks associated with biomass decomposition at metal and oxide-metal interfaces, and we will show how these strategies can efficiently produce semi-quantitative predictions of activity and selectivity trends in hydrogen production on these surfaces. We will also briefly describe the development of reactivity trends for another chemical process that is relevant to biomass chemistry, the water-gas shift reaction, at metal-oxide interfaces, and will describe how bifunctional properties of these interfaces may promote this important chemistry.

  16. A survey and analysis of commercially available hydrogen sensors

    NASA Technical Reports Server (NTRS)

    Hunter, Gary W.

    1992-01-01

    The performance requirements for hydrogen detection in aerospace applications often exceed those of more traditional applications. In order to ascertain the applicability of existing hydrogen sensors to aerospace applications, a survey was conducted of commercially available point-contact hydrogen sensors, and their operation was analyzed. The operation of the majority of commercial hydrogen sensors falls into four main categories: catalytic combustion, electrochemical, semiconducting oxide sensors, and thermal conductivity detectors. The physical mechanism involved in hydrogen detection for each main category is discussed in detail. From an understanding of the detection mechanism, each category of sensor is evaluated for use in a variety of space and propulsion environments. In order to meet the needs of aerospace applications, the development of point-contact hydrogen sensors that are based on concepts beyond those used in commercial sensors is necessary.

  17. Functional Regression Models for Epistasis Analysis of Multiple Quantitative Traits.

    PubMed

    Zhang, Futao; Xie, Dan; Liang, Meimei; Xiong, Momiao

    2016-04-01

    To date, most genetic analyses of phenotypes have focused on analyzing single traits or analyzing each phenotype independently. However, joint epistasis analysis of multiple complementary traits will increase statistical power and improve our understanding of the complicated genetic structure of complex diseases. Despite their importance in uncovering the genetic structure of complex traits, the statistical methods for identifying epistasis in multiple phenotypes remain fundamentally unexplored. To fill this gap, we formulate a test for interaction between two genes in multiple quantitative trait analysis as a multiple functional regression (MFRG), in which the genotype functions (genetic variant profiles) are defined as a function of the genomic position of the genetic variants. We use large-scale simulations to calculate Type I error rates for testing interaction between two genes with multiple phenotypes and to compare the power with multivariate pairwise interaction analysis and single-trait interaction analysis by a single-variate functional regression model. To further evaluate performance, the MFRG for epistasis analysis is applied to five phenotypes of exome sequence data from the NHLBI's Exome Sequencing Project (ESP) to detect pleiotropic epistasis. A total of 267 pairs of genes that formed a genetic interaction network showed significant evidence of epistasis influencing five traits. The results demonstrate that the joint interaction analysis of multiple phenotypes has much higher power to detect interaction than the interaction analysis of a single trait and may open a new direction to fully uncovering the genetic structure of multiple phenotypes.

  18. Cost Analysis of a Concentrator Photovoltaic Hydrogen Production System

    SciTech Connect

    Thompson, J. R.; McConnell, R. D.; Mosleh, M.

    2005-08-01

    The development of efficient, renewable methods of producing hydrogen is essential for the success of the hydrogen economy. Since the feedstock for electrolysis is water, no harmful pollutants are emitted during the use of the fuel. Furthermore, it has become evident that concentrator photovoltaic (CPV) systems have a number of unique attributes that could shortcut the development process and increase the efficiency of hydrogen production to a point where economics will drive commercial development to mass scale.

  19. Analysis of IUE Observations of Hydrogen in Comets

    NASA Technical Reports Server (NTRS)

    Combi, Michael R.; Feldman, Paul D.

    1998-01-01

    Fifteen years' worth of hydrogen Lyman-alpha observations of cometary comae obtained with the International Ultraviolet Explorer (IUE) satellite have gone largely unanalyzed because of two main modeling complications. First, the inner comae of many bright (gas-productive) comets are often optically thick to solar Lyman-alpha radiation. Second, even in the case of a small comet (low gas production), the large IUE aperture is still quite small compared with the immense size of the hydrogen coma, so an accurate model which properly accounts for the spatial distribution of the coma is required to invert the observed brightnesses to column densities and finally to H atom production rates. Our Monte Carlo particle trajectory model (MCPTM), which for the first time provides the realistic full phase space distribution of H atoms throughout the coma, has been used as the basis for the analysis of IUE observations of the inner coma. The MCPTM includes the effects of the vectorial ejection of the H atoms upon dissociation of their parent species (H2O and OH) and of their partial collisional thermalization. Both of these effects are crucial for characterizing the velocity distribution of the H atoms. This combination of the MCPTM and a spherical radiative transfer code had already been shown to be successful in understanding the moderately optically thick coma of comet P/Giacobini-Zinner and the coma of comet Halley, which varied from slightly to very optically thick. Both of these comets were observed during solar minimum conditions. Solar activity affects both the photochemistry of water and the solar Lyman-alpha radiation flux. The overall plan of this program was to concentrate on comets observed by IUE at other times during the solar cycle, most importantly during the two solar maxima of 1980 and 1990. Described herein are the work performed and the results obtained.

  20. Quantitative chemical analysis of ocular melanosomes in the TEM.

    PubMed

    Eibl, O; Schultheiss, S; Blitgen-Heinecke, P; Schraermeyer, U

    2006-01-01

    Melanosomes in retinal tissues of a human, monkey and rat were analyzed by EDX in the TEM. Samples were prepared by ultramicrotomy at different thicknesses. The material was mounted on Al grids and analyzed in a Zeiss 912 TEM equipped with an Omega filter and an EDX detector with ultrathin window. Melanosomes consist of C and O as main components, with mole fractions of about 90 and 3-10 at.%, respectively, and small mole fractions, between 2 and 0.1 at.%, of Na, Mg, K, Si, P, S, Cl and Ca. All elements were measured quantitatively by standardless EDX with high precision. Mole fractions of the transition metals Fe, Cu and Zn were also measured. For Fe a mole fraction of less than 0.1 at.% was found, which gives the melanin its paramagnetic properties. Its mole fraction is, however, close to or below the minimum detectable mass fraction of the equipment used. Only in the human eye, and only in the retinal pigment epithelium (RPE), were the mole fractions of Zn (0.1 at.%, or 5000 microg/g) and Cu clearly beyond the minimum detectable mass fraction. In the rat and monkey eyes the mole fraction of Zn was at or below the minimum detectable mass fraction and could not be measured quantitatively. The results yielded the chemical composition of the melanosomes in the choroidal tissue and the retinal pigment epithelium of the three species. The results of the chemical analysis are discussed using mole-fraction correlation diagrams. Similarities and differences between the species are outlined. Correlation behavior was found to hold across species, e.g. the Ca-O correlation, indicating that Ca is bound to oxygen-rich sites in the melanin. These are the first quantitative analyses of melanosomes by EDX reported so far. The quantitative chemical analysis should open a deeper understanding of the metabolic processes in the eye that are of central importance for understanding a large number of eye-related diseases. The chemical analysis also

  1. Analysis of hydrogen vehicles with cryogenic high pressure storage

    SciTech Connect

    Aceves, S. M.; Berry, G. D.

    1998-06-19

    Insulated pressure vessels are cryogenic-capable pressure vessels that can be fueled with liquid hydrogen (LH2) or ambient-temperature compressed hydrogen (CH2). Insulated pressure vessels offer the advantages of liquid hydrogen tanks (low weight and volume) with reduced disadvantages (lower energy requirement for hydrogen liquefaction and reduced evaporative losses). This paper evaluates the applicability of insulated pressure vessels to light-duty vehicles. It presents an evaluation of evaporative losses and insulation requirements and a description of the current experimental plans for testing insulated pressure vessels. The results show significant advantages to the use of insulated pressure vessels for light-duty vehicles.

  2. Quantitative sonographic image analysis for hepatic nodules: a pilot study.

    PubMed

    Matsumoto, Naoki; Ogawa, Masahiro; Takayasu, Kentaro; Hirayama, Midori; Miura, Takao; Shiozawa, Katsuhiko; Abe, Masahisa; Nakagawara, Hiroshi; Moriyama, Mitsuhiko; Udagawa, Seiichi

    2015-10-01

    The aim of this study was to investigate the feasibility of quantitative image analysis for differentiating hepatic nodules on gray-scale sonographic images. We retrospectively evaluated 35 nodules from 31 patients with hepatocellular carcinoma (HCC), 60 nodules from 58 patients with liver hemangioma, and 22 nodules from 22 patients with liver metastasis. Gray-scale sonographic images were evaluated with subjective judgment and image analysis using ImageJ software. Reviewers classified the shape of nodules as irregular or round, and the surface of nodules as rough or smooth. Circularity values were lower in the irregular group than in the round group (median 0.823, 0.892; range 0.641-0.915, 0.784-0.932, respectively; P = 3.21 × 10(-10)). Solidity values were lower in the rough group than in the smooth group (median 0.957, 0.968; range 0.894-0.986, 0.933-0.988, respectively; P = 1.53 × 10(-4)). The HCC group had higher circularity and solidity values than the hemangioma group. The HCC and liver metastasis groups had lower median, mean, modal, and minimum gray values than the hemangioma group. Multivariate analysis showed circularity [standardized odds ratio (OR) 2.077; 95% confidence interval (CI) = 1.295-3.331; P = 0.002] and minimum gray value (OR 0.482; 95% CI = 0.956-0.990; P = 0.001) as factors predictive of malignancy. The combination of subjective judgment and image analysis provided 58.3% sensitivity and 89.5% specificity with AUC = 0.739, representing an improvement over subjective judgment alone (68.4% sensitivity, 75.0% specificity, AUC = 0.701) (P = 0.008). Quantitative image analysis of ultrasonic images of hepatic nodules may correlate with subjective judgment in predicting malignancy.
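
    The two shape descriptors can be reproduced with standard tools: circularity is 4πA/P² and solidity is the region area divided by its convex-hull area. A hedged sketch on a synthetic elliptical mask rather than a sonographic image:

```python
import numpy as np
from skimage import draw, measure

# Synthetic binary "nodule" mask: a filled ellipse (illustrative only).
mask = np.zeros((200, 200), dtype=np.uint8)
rr, cc = draw.ellipse(100, 100, 60, 45)
mask[rr, cc] = 1

props = measure.regionprops(measure.label(mask))[0]
area, perimeter = props.area, props.perimeter

circularity = 4.0 * np.pi * area / perimeter**2   # 1.0 for a perfect circle
solidity = props.solidity                          # area / convex-hull area
print(f"circularity = {circularity:.3f}, solidity = {solidity:.3f}")
```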

  3. Quantitative Proteomic and Phosphoproteomic Analysis of Trypanosoma cruzi Amastigogenesis*

    PubMed Central

    Queiroz, Rayner M. L.; Charneau, Sébastien; Mandacaru, Samuel C.; Schwämmle, Veit; Lima, Beatriz D.; Roepstorff, Peter; Ricart, Carlos A. O.

    2014-01-01

    Chagas disease is a tropical neglected disease endemic in Latin America caused by the protozoan Trypanosoma cruzi. The parasite has four major life stages: epimastigote, metacyclic trypomastigote, bloodstream trypomastigote, and amastigote. The differentiation from infective trypomastigotes into replicative amastigotes, called amastigogenesis, takes place in vivo inside mammalian host cells after a period of incubation in an acidic phagolysosome. This differentiation process can be mimicked in vitro by incubating tissue-culture-derived trypomastigotes in acidic DMEM. Here we used this well-established differentiation protocol to perform a comprehensive quantitative proteomic and phosphoproteomic analysis of T. cruzi amastigogenesis. Samples from fully differentiated forms and two biologically relevant intermediate time points were Lys-C/trypsin digested, iTRAQ-labeled, and multiplexed. Subsequently, phosphopeptides were enriched using a TiO2 matrix. Non-phosphorylated peptides were fractionated via hydrophilic interaction liquid chromatography prior to LC-MS/MS analysis. LC-MS/MS and bioinformatics procedures were used for protein and phosphopeptide quantitation, identification, and phosphorylation site assignment. We were able to identify regulated proteins and pathways involved in coordinating amastigogenesis. We also observed that a significant proportion of the regulated proteins were membrane proteins. Modulated phosphorylation events coordinated by protein kinases and phosphatases that are part of the signaling cascade induced by incubation in acidic medium were also evinced. To our knowledge, this work is the most comprehensive quantitative proteomics study of T. cruzi amastigogenesis, and these data will serve as a trustworthy basis for future studies and possibly for the identification of new potential drug targets. PMID:25225356

  4. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    PubMed Central

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-01-01

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate an importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations. PMID:26306198

  5. Multivariate calibration applied to the quantitative analysis of infrared spectra

    SciTech Connect

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor-analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given, based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.
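
    As a schematic modern analogue of such a PLS calibration (not the author's code), the sketch below fits synthetic two-band spectra in which the mixing weight plays the role of analyte concentration:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Synthetic "spectra": noisy mixtures of two overlapping Gaussian bands;
# the mixing weight stands in for the analyte concentration.
rng = np.random.default_rng(42)
wavenumbers = np.linspace(0, 1, 200)
band1 = np.exp(-((wavenumbers - 0.4) ** 2) / 0.005)
band2 = np.exp(-((wavenumbers - 0.6) ** 2) / 0.005)

concentration = rng.uniform(0, 1, size=120)
spectra = (np.outer(concentration, band1)
           + np.outer(1 - concentration, band2)
           + rng.normal(0, 0.01, size=(120, 200)))

X_train, X_test, y_train, y_test = train_test_split(
    spectra, concentration, test_size=0.25, random_state=0)

pls = PLSRegression(n_components=3).fit(X_train, y_train)
print(f"prediction R^2 on held-out spectra: {pls.score(X_test, y_test):.3f}")
```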

  6. Quantitative analysis of live cells using digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Lewis, Tan Rongwei; Qu, Weijuan; Chee, Oi Choo; Singh, Vijay Raj; Asundi, Anand

    2010-03-01

    During the lifetime of a cell, it goes through changes to its plasma membrane as well as its internal structures, which are especially distinctive during processes like cell division and death. Different types of microscopes are used to observe these variations. In our experiment, Vero cells were investigated using phase contrast microscopy and digital holographic microscopy (DHM). A comparison of the images obtained for cell division is presented here. The conventional phase contrast microscope provided a good imaging method for the real-time analysis of cell division. The off-axis digital hologram recorded by the DHM system can be reconstructed to obtain both the intensity image and the phase contrast image of the test object. These can be used for live cell imaging to provide multiple results from a single equipment setup. The DHM system, besides being a qualitative tool, is able to provide quantitative results and 3D images of the cell division process. The ability of DHM to provide quantitative analysis makes it an ideal tool for life science applications.

  7. Quantitative analysis of live cells using digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Lewis, Tan Rongwei; Qu, Weijuan; Chee, Oi Choo; Singh, Vijay Raj; Asundi, Anand

    2009-12-01

    During the lifetime of a cell, it goes through changes to its plasma membrane as well as its internal structures, which are especially distinctive during processes like cell division and death. Different types of microscopes are used to observe these variations. In our experiment, Vero cells were investigated using phase contrast microscopy and digital holographic microscopy (DHM). A comparison of the images obtained for cell division is presented here. The conventional phase contrast microscope provided a good imaging method for the real-time analysis of cell division. The off-axis digital hologram recorded by the DHM system can be reconstructed to obtain both the intensity image and the phase contrast image of the test object. These can be used for live cell imaging to provide multiple results from a single equipment setup. The DHM system, besides being a qualitative tool, is able to provide quantitative results and 3D images of the cell division process. The ability of DHM to provide quantitative analysis makes it an ideal tool for life science applications.

  8. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    DOE PAGES

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; ...

    2015-08-14

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate an importance of these weak interactions in the stabilization of the crystal structure. Finally, the quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  9. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    SciTech Connect

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-08-14

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate an importance of these weak interactions in the stabilization of the crystal structure. Finally, the quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  10. Quantitative analysis in outcome assessment of instrumented lumbosacral arthrodesis.

    PubMed

    Champain, Sabina; Mazel, Christian; Mitulescu, Anca; Skalli, Wafa

    2007-08-01

    The outcome assessment in instrumented lumbosacral fusion mostly focuses on clinical criteria, complications and scores, with a high variability of imaging means, methods of fusion grading and parameters describing degenerative changes, making comparisons between studies difficult. The aim of this retrospective evaluation was to assess the value of quantified radiographic analysis of the lumbar spine in global outcome assessment and to highlight the key biomechanical factors involved. Clinical data and Beaujon-Lassale scores were collected for 49 patients who underwent lumbosacral arthrodesis after prior lumbar discectomy (mean follow-up: 5 years). Sagittal standing and lumbar flexion-extension X-ray films allowed quantification of vertebral, lumbar, pelvic and kinematic parameters of the lumbar spine, which were compared to reference values. Statistics were performed to assess the evolution of all variables. At long-term follow-up, 90% of patients presented satisfactory clinical outcomes, associated with normal sagittal alignment; vertebral parameters revealed adjacent-level degeneration in four cases (8%). Clinical outcome was correlated (r = 0.8) with fusion, which was confirmed in 80% of cases, doubtful in 16%, and pseudarthrosis seemed to occur in 4% (2 cases). In addition to clinical data (outcomes comparable to the literature), quantitative analysis accurately described lumbar spine geometry and kinematics, highlighting parameters related to adjacent-level degeneration and a significant correlation between clinical outcome and fusion. Furthermore, the criteria proposed to quantitatively evaluate fusion from lumbar dynamic radiographs seem to be appropriate and in agreement with the surgeon's qualitative grading in 87% of cases.

  11. Quantitative analysis of Caenorhabditis elegans chemotaxis using a microfluidic device.

    PubMed

    Hu, Liang; Ye, Jinjuan; Tan, Haowei; Ge, Anle; Tang, Lichun; Feng, Xiaojun; Du, Wei; Liu, Bi-Feng

    2015-08-05

    Caenorhabditis elegans, one of the most widely studied model organisms, senses external chemical cues and performs corresponding chemotaxis behaviors through its simple chemosensory neuronal system. To study the mechanisms underlying chemosensory behavior, a rapid and reliable method for quantitatively analyzing the worms' behaviors is essential. In this work, we demonstrate a microfluidic approach for investigating chemotaxis responses of worms to chemical gradients. The flow-based microfluidic chip consisted of circular tree-like microchannels, which were able to generate eight flow streams containing stepwise chemical concentrations without differences in flow velocity. Worms' upstream swimming into microchannels with various concentrations was monitored for quantitative analysis of the chemotaxis behavior. Using this microfluidic chip, the attractive and repellent responses of C. elegans to NaCl were successfully quantified within several minutes. The results demonstrated wild-type-like repellent responses and severely impaired attractive responses in grk-2 mutant animals with defects in calcium influx. In addition, the chemotaxis analysis of third-stage larvae revealed that their gustatory response differs from that in the adult stage. Thus, our microfluidic method provides a useful platform for studying the chemosensory behaviors of C. elegans and for screening chemosensation-related drugs.

  12. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  13. 3D quantitative analysis of brain SPECT images

    NASA Astrophysics Data System (ADS)

    Loncaric, Sven; Ceskovic, Ivan; Petrovic, Ratimir; Loncaric, Srecko

    2001-07-01

    The main purpose of this work is to develop a computer-based technique for the quantitative analysis of 3-D brain images obtained by single photon emission computed tomography (SPECT). In particular, the volume and location of the ischemic lesion and penumbra are important for early diagnosis and treatment of infarcted regions of the brain. SPECT imaging is typically used as a diagnostic tool to assess the size and location of the ischemic lesion. The segmentation method presented in this paper utilizes a 3-D deformable model in order to determine the size and location of the regions of interest. The evolution of the model is computed using a level-set implementation of the algorithm. In addition to the 3-D deformable model, the method utilizes edge detection and region growing for pre-processing. Initial experimental results have shown that the method is useful for SPECT image analysis.
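
    As a simplified stand-in for the segmentation step (plain thresholding plus connected-component labeling instead of a level-set deformable model), the sketch below shows how lesion volume and location fall out of a 3-D label map; the image and voxel size are synthetic:

```python
import numpy as np
from scipy import ndimage

# Synthetic perfusion volume with one hypoperfused "lesion" (illustrative).
rng = np.random.default_rng(3)
spect = rng.normal(100.0, 5.0, size=(64, 64, 64))
spect[20:30, 25:38, 30:40] = 40.0

lesion_mask = spect < 60.0                           # low-uptake voxels
labels, n = ndimage.label(lesion_mask)
sizes = ndimage.sum_labels(lesion_mask, labels, index=range(1, n + 1))
largest = int(np.argmax(sizes)) + 1                  # biggest component

voxel_volume_ml = 0.064                              # hypothetical 4x4x4 mm voxels
print(f"lesion volume  : {sizes[largest - 1] * voxel_volume_ml:.1f} ml")
print(f"lesion centroid: {ndimage.center_of_mass(lesion_mask, labels, largest)}")
```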

  14. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics

    PubMed Central

    Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999–2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  15. [Quantitative analysis of transformer oil dissolved gases using FTIR].

    PubMed

    Zhao, An-xin; Tang, Xiao-jun; Wang, Er-zhen; Zhang, Zhong-hua; Liu, Jun-hua

    2013-09-01

    Chromatography, as used for the online monitoring of transformer dissolved gases, has the drawbacks of requiring a carrier gas and regular calibration, and of limited safety. We therefore attempted to establish a dissolved gas analysis (DGA) system based on Fourier transform infrared (FTIR) spectroscopy. Taking into account the small amounts of the characteristic gases, the many components involved, the detection limit and safety requirements, and the difficulty for the degasser of excluding interfering gases, a quantitative analysis model was established based on sparse partial least squares, piecewise section correction, and a feature-variable extraction algorithm using improved Tikhonov (TR) regularization. For the characteristic gases CH4, C2H6, and CO2, the results show that FTIR meets DGA requirements with a spectral resolution of 1 cm(-1) and an optical path of 10 cm.

  16. Quantitative Phase Analysis by the Rietveld Method for Forensic Science.

    PubMed

    Deng, Fei; Lin, Xiaodong; He, Yonghong; Li, Shu; Zi, Run; Lai, Shijun

    2015-07-01

    Quantitative phase analysis (QPA) is helpful for determining the type attribute of an object because it presents the content of its constituents. QPA by the Rietveld method requires neither the measurement of calibration data nor the use of an internal standard; however, the approximate crystal structure of each phase in a mixture is necessary. In this study, 8 synthetic mixtures composed of potassium nitrate and sulfur were analyzed by the Rietveld QPA method. The Rietveld refinement was accomplished with the Material Analysis Using Diffraction (MAUD) program and evaluated by three agreement indices. Results showed that Rietveld QPA yielded precise results, with errors generally less than 2.0% absolute. In addition, a criminal case that was solved with the help of the Rietveld QPA method is also introduced. This method will allow forensic investigators to acquire detailed information about material evidence, which can point the way for case detection and court proceedings.
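
    In Rietveld QPA, phase weight fractions follow from the refined scale factors via the Hill-Howard relation W_i = S_i(ZMV)_i / Σ_j S_j(ZMV)_j, where S is the refined scale factor, Z the number of formula units per cell, M the formula mass, and V the unit-cell volume. A minimal sketch with hypothetical refined values for the two phases studied:

```python
# Hedged sketch of the Hill-Howard weight-fraction relation; all refined
# values below are made up for illustration, not from the paper.

def weight_fractions(phases):
    zmv_scaled = {name: s * z * m * v for name, (s, z, m, v) in phases.items()}
    total = sum(zmv_scaled.values())
    return {name: val / total for name, val in zmv_scaled.items()}

phases = {  # name: (scale factor, Z, formula mass g/mol, cell volume A^3)
    "KNO3":   (1.0e-3, 4, 101.10, 300.0),
    "sulfur": (2.0e-5, 128, 32.07, 3300.0),
}
for name, w in weight_fractions(phases).items():
    print(f"{name:7s}: {100 * w:.1f} wt%")
```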

  17. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    PubMed

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics.

  18. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  19. Quantitative morphometric analysis for the tectonic characterisation of northern Tunisia.

    NASA Astrophysics Data System (ADS)

    Camafort, Miquel; Pérez-Peña, José Vicente; Booth-Rea, Guillermo; Ranero, César R.; Gràcia, Eulàlia; Azañón, José Miguel; Melki, Fetheddine; Ouadday, Mohamed

    2016-04-01

    Northern Tunisia is characterized by low deformation rates and low to moderate seismicity. Although instrumental seismicity reaches maximum magnitudes of Mw 5.5, some historical earthquakes have occurred with catastrophic consequences in this region. Aiming to improve our knowledge of active tectonics in Tunisia, we carried out both a quantitative morphometric analysis and a field study in the north-western region. We applied different morphometric tools, such as river profiles, knickpoint analysis, hypsometric curves and integrals, and drainage-pattern anomalies, in order to differentiate between zones with high and low recent tectonic activity. This analysis helps identify uplift and subsidence zones, which we relate to fault activity. Several active faults in a sparse distribution were identified. A selected sector was studied in a field campaign to test the results obtained from the quantitative analysis. During the fieldwork we identified geological evidence of recent activity and a considerable seismogenic potential along the El Alia-Teboursouk (ETF) and Dkhila (DF) faults. The ETF fault could be responsible for one of the most devastating historical earthquakes in northern Tunisia, which destroyed Utique in 412 A.D. Geological evidence includes fluvial terraces folded by faults, striated and cracked pebbles, clastic dikes, sand volcanoes, coseismic cracks, etc. Although not reflected in the instrumental seismicity, our results support an important seismic hazard, evidenced by the several active tectonic structures identified and the two seismogenic faults described. After obtaining the current active tectonic framework of Tunisia, we discuss our results within the western Mediterranean context, trying to contribute to the understanding of western Mediterranean tectonics. With our results, we suggest that the main reason explaining the sparse and scarce seismicity of the area, in contrast with the adjacent parts of the Nubia-Eurasia boundary, is due to its extended
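
    One of the indices mentioned above has a particularly compact form: the hypsometric integral, commonly approximated by the elevation-relief ratio (mean − min)/(max − min). A hedged sketch on a synthetic DEM:

```python
import numpy as np

# Synthetic digital elevation model (DEM); a real analysis would clip the
# DEM to a catchment before computing the index.
rng = np.random.default_rng(7)
dem = 400.0 + 300.0 * rng.random((100, 100))   # elevations in metres

# Elevation-relief ratio approximation of the hypsometric integral.
hi = (dem.mean() - dem.min()) / (dem.max() - dem.min())
print(f"hypsometric integral ~ {hi:.2f}")
# High values flag youthful, recently uplifted topography; low values
# suggest subdued, long-eroded relief.
```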

  20. Quick 96FASP for high throughput quantitative proteome analysis.

    PubMed

    Yu, Yanbao; Bekele, Shiferaw; Pieper, Rembert

    2017-08-23

    Filter aided sample preparation (FASP) is becoming a central method for proteomic sample cleanup and peptide generation prior to LC-MS analysis. We previously adapted this method to a 96-well filter plate and applied it to prepare protein digests from cell lysate and body fluid samples in a high-throughput quantitative manner. While the 96FASP approach is scalable and can handle multiple samples simultaneously, two key advantages compared to single FASP, it is also time-consuming. The centrifugation-based liquid transfer on the filter plate takes 3-5 times longer than on a single filter. To address this limitation, we now present a quick 96FASP (named q96FASP) approach that, relying on the use of filter membranes with a large MWCO size (~30 kDa), significantly reduces centrifugation times. We show that q96FASP allows the generation of protein digests derived from whole cell lysates and body fluids with a quality similar to that of the single FASP method. Processing a sample in multiple wells in parallel, we observed excellent experimental repeatability by a label-free quantitation approach. We conclude that the q96FASP approach promises to be a cost- and time-effective method for shotgun proteomics and will be particularly useful in large-scale biomarker discovery studies. High-throughput sample processing is of particular interest for quantitative proteomics. The previously developed 96FASP is high throughput and appealing; however, it is time-consuming in the context of centrifugation-based liquid transfer (~1.5 h per spin). This study presents a truly high-throughput sample preparation method based on a large-cutoff 96-well filter plate, which shortens the spin time to ~20 min. To our knowledge, this is the first multi-well method that is entirely comparable to conventional FASP. This study thoroughly examined two types of filter plates and performed side-by-side comparisons with single FASP. Two types of samples, whole cell lysate of a UTI (urinary tract infection

  1. Advanced hydrogen/oxygen thrust chamber design analysis

    NASA Technical Reports Server (NTRS)

    Shoji, J. M.

    1973-01-01

    The results of the advanced hydrogen/oxygen thrust chamber design analysis program are reported. The primary objectives of this program were to: (1) provide an in-depth analytical investigation to develop the thrust chamber cooling and fatigue life limitations of an advanced, high pressure, high performance H2/O2 engine design of 20,000-pound (88,960 N) thrust; and (2) integrate the existing heat transfer analysis, thermal fatigue, and stress aspects for advanced chambers into a comprehensive computer program. Thrust chamber designs and analyses were performed to evaluate various combustor materials, coolant passage configurations (tubes and channels), and cooling circuits to define the nominal 1900 psia (1.31 x 10 to the 7th power N/sq m) chamber pressure, 300-cycle life thrust chamber. The cycle life capability of the selected configuration was then determined for three duty cycles. The influence of cycle life and chamber pressure on thrust chamber design was also investigated, by varying the cycle life requirement at the nominal chamber pressure and by varying the chamber pressure at the nominal cycle life requirement.

  2. Geographically Based Hydrogen Consumer Demand and Infrastructure Analysis: Final Report

    SciTech Connect

    Melendez, M.; Milbrandt, A.

    2006-10-01

    In FY 2004 and 2005, NREL developed a proposed minimal infrastructure to support nationwide deployment of hydrogen vehicles by offering infrastructure scenarios that facilitated interstate travel. This report identifies key metropolitan areas and regions on which to focus infrastructure efforts during the early hydrogen transition.

  3. Hydrogenated fullerenes in space: FT-IR spectra analysis

    SciTech Connect

    El-Barbary, A. A.

    2016-06-10

    Fullerenes and hydrogenated fullerenes are found in circumstellar and interstellar environments. However, the structures responsible for the bands detected in interstellar and circumstellar space are not yet completely understood. The aim of this article is therefore to provide all possible infrared spectra for C{sub 20} and C{sub 60} fullerenes and their hydrogenated counterparts. Density functional theory (DFT) is applied using the B3LYP exchange-correlation functional with the 6-31G(d,p) basis set. Fourier transform infrared spectroscopy (FT-IR) is found to be capable of distinguishing between fullerenes, mono-hydrogenated fullerenes, and fully hydrogenated fullerenes. In addition, deposition of one hydrogen atom outside a fully hydrogenated fullerene can be distinguished by the formation of an H{sub 2} molecule, with a peak around 4440 cm{sup −1}. However, deposition of one hydrogen atom inside a fully hydrogenated fullerene cannot be distinguished. The obtained spectral structures are analyzed and compared with available experimental results.

  4. Quantitative analysis of cyclic beta-turn models.

    PubMed Central

    Perczel, A.; Fasman, G. D.

    1992-01-01

    The beta-turn is a frequently found structural unit in the conformation of globular proteins. Although the circular dichroism (CD) spectra of the alpha-helix and beta-pleated sheet are well defined, there remains some ambiguity concerning the pure component CD spectra of the different types of beta-turns. Recently, it has been reported (Hollósi, M., Kövér, K.E., Holly, S., Radics, L., & Fasman, G.D., 1987, Biopolymers 26, 1527-1572; Perczel, A., Hollósi, M., Foxman, B.M., & Fasman, G.D., 1991a, J. Am. Chem. Soc. 113, 9772-9784) that some pseudohexapeptides (e.g., the cyclo[(delta)Ava-Gly-Pro-Aaa-Gly] where Aaa = Ser, Ser(OtBu), or Gly) in many solvents adopt a conformational mixture of type I and the type II beta-turns, although the X-ray-determined conformation was an ideal type I beta-turn. In addition to these pseudohexapeptides, conformational analysis was also carried out on three pseudotetrapeptides and three pseudooctapeptides. The target of the conformation analysis reported herein was to determine whether the ring stress of the above beta-turn models has an influence on their conformational properties. Quantitative nuclear Overhauser effect (NOE) measurements yielded interproton distances. The conformational average distances so obtained were interpreted utilizing molecular dynamics (MD) simulations to yield the conformational percentages. These conformational ratios were correlated with the conformational weights obtained by quantitative CD analysis of the same compounds. The pure component CD curves of type I and type II beta-turns were also obtained, using a recently developed algorithm (Perczel, A., Tusnády, G., Hollósi, M., & Fasman, G.D., 1991b, Protein Eng. 4(6), 669-679). For the first time the results of a CD deconvolution, based on the CD spectra of 14 beta-turn models, were assigned by quantitative NOE results. The NOE experiments confirmed the ratios of the component curves found for the two major beta-turns by CD analysis. These results

  5. Quantitative analysis of cyclic beta-turn models.

    PubMed

    Perczel, A; Fasman, G D

    1992-03-01

    The beta-turn is a frequently found structural unit in the conformation of globular proteins. Although the circular dichroism (CD) spectra of the alpha-helix and beta-pleated sheet are well defined, there remains some ambiguity concerning the pure component CD spectra of the different types of beta-turns. Recently, it has been reported (Hollósi, M., Kövér, K.E., Holly, S., Radics, L., & Fasman, G.D., 1987, Biopolymers 26, 1527-1572; Perczel, A., Hollósi, M., Foxman, B.M., & Fasman, G.D., 1991a, J. Am. Chem. Soc. 113, 9772-9784) that some pseudohexapeptides (e.g., the cyclo[(delta)Ava-Gly-Pro-Aaa-Gly] where Aaa = Ser, Ser(OtBu), or Gly) in many solvents adopt a conformational mixture of type I and the type II beta-turns, although the X-ray-determined conformation was an ideal type I beta-turn. In addition to these pseudohexapeptides, conformational analysis was also carried out on three pseudotetrapeptides and three pseudooctapeptides. The target of the conformation analysis reported herein was to determine whether the ring stress of the above beta-turn models has an influence on their conformational properties. Quantitative nuclear Overhauser effect (NOE) measurements yielded interproton distances. The conformational average distances so obtained were interpreted utilizing molecular dynamics (MD) simulations to yield the conformational percentages. These conformational ratios were correlated with the conformational weights obtained by quantitative CD analysis of the same compounds. The pure component CD curves of type I and type II beta-turns were also obtained, using a recently developed algorithm (Perczel, A., Tusnády, G., Hollósi, M., & Fasman, G.D., 1991b, Protein Eng. 4(6), 669-679). For the first time the results of a CD deconvolution, based on the CD spectra of 14 beta-turn models, were assigned by quantitative NOE results. The NOE experiments confirmed the ratios of the component curves found for the two major beta-turns by CD analysis. These results

  6. Quantitative image analysis in sonograms of the thyroid gland

    NASA Astrophysics Data System (ADS)

    Catherine, Skouroliakou; Maria, Lyra; Aristides, Antoniou; Lambros, Vlahos

    2006-12-01

    High-resolution, real-time ultrasound is a routine examination for assessing disorders of the thyroid gland. However, current diagnostic practice is based mainly on qualitative evaluation of the resulting sonograms, and therefore depends on the physician's experience. Computerized texture analysis is widely employed on sonographic images of various organs (liver, breast), and it has been proven to increase the sensitivity of diagnosis by providing better tissue characterization. The present study attempts to characterize thyroid tissue by automatic texture analysis. The texture features calculated are based on co-occurrence matrices as proposed by Haralick. The sample consists of 40 patients. For each patient two sonographic images (one for each lobe) are recorded in DICOM format. The lobe is manually delineated in each sonogram, and the co-occurrence matrices for 52 separation vectors are calculated. The texture features extracted from each of these matrices are: contrast, correlation, energy and homogeneity. Principal component analysis is used to select the optimal set of features. The statistical analysis resulted in the extraction of 21 optimal descriptors. The optimal descriptors are all co-occurrence parameters, as the first-order statistics did not prove to be representative of the image characteristics. The larger number of components depends mainly on correlation at very close or very far separation distances. The results indicate that quantitative analysis of thyroid sonograms can provide an objective characterization of thyroid tissue.
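
    A minimal sketch of the co-occurrence-based feature extraction described above, using scikit-image (which provides Haralick-style GLCM routines); the ROI here is random stand-in data, and only 12 separation vectors are used rather than the 52 of the study:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(0)
roi = rng.integers(0, 64, size=(128, 128), dtype=np.uint8)  # stand-in for a delineated lobe

# one co-occurrence matrix per separation vector (distance, angle)
glcm = graycomatrix(roi,
                    distances=[1, 2, 4],
                    angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                    levels=64, symmetric=True, normed=True)

# the four texture features used in the study, one value per separation vector
features = {p: graycoprops(glcm, p) for p in
            ("contrast", "correlation", "energy", "homogeneity")}
```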

  7. Phenotypic analysis of Arabidopsis mutants: quantitative analysis of root growth.

    PubMed

    Doerner, Peter

    2008-03-01

    INTRODUCTION: The growth of plant roots is very easy to measure and is particularly straightforward in Arabidopsis thaliana, because the increase in organ size is essentially restricted to one dimension. The precise measurement of root apical growth can be used to accurately determine growth activity (the rate of growth at a given time) during development in mutants, transgenic backgrounds, or in response to experimental treatments. Root growth is measured in a number of ways, the simplest of which is to grow the seedlings in a Petri dish and record the position of the advancing root tip at appropriate time points. The increase in root length is measured with a ruler and the data are entered into Microsoft Excel for analysis. When dealing with large numbers of seedlings, however, this procedure can be tedious, as well as inaccurate. An alternative approach, described in this protocol, uses "snapshots" of the growing plants, which are taken using gel-documentation equipment (i.e., a video camera with a frame-grabber unit, now commonly used to capture images from ethidium-bromide-stained electrophoresis gels). The images are analyzed using publicly available software (NIH-Image), which allows the user simply to cut and paste data into Microsoft Excel.
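
    For readers working in Python rather than NIH-Image, a rough equivalent of the snapshot-based measurement might look like the following; the thresholding assumption (dark roots on a light background) and the helper name are illustrative only:

```python
from skimage import filters, io, morphology

def root_length_mm(image_path, mm_per_pixel):
    """Estimate total root length in one plate snapshot (hypothetical helper)."""
    img = io.imread(image_path, as_gray=True)
    mask = img < filters.threshold_otsu(img)   # dark roots on light agar (assumed)
    skeleton = morphology.skeletonize(mask)    # reduce roots to 1-pixel centrelines
    return skeleton.sum() * mm_per_pixel       # pixel count approximates length
```

    Growth rate then follows from differencing the lengths measured in successive snapshots.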

  8. Hydrogen engine performance analysis project. Second annual report

    SciTech Connect

    Adt, Jr., R. R.; Swain, M. R.; Pappas, J. M.

    1980-01-01

    Progress in a 3 year research program to evaluate the performance and emission characteristics of hydrogen-fueled internal combustion engines is reported. Fifteen hydrogen engine configurations will be subjected to performance and emissions characterization tests. During the first two years, baseline data for throttled and unthrottled, carburetted and timed hydrogen induction, Pre IVC hydrogen-fueled engine configurations, with and without exhaust gas recirculation (EGR) and water injection, were obtained. These data, along with descriptions of the test engine and its components, the test apparatus, experimental techniques, experiments performed and the results obtained, are given. Analyses of other hydrogen-engine project data are also presented and compared with the results of the present effort. The unthrottled engine is found, in general, to exhibit higher brake thermal efficiency than the throttled engine. The unthrottled engine also yields lower NO/sub x/ emissions, which were found to be a strong function of fuel-air equivalence ratio. (LCL)
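
    The two quantities at the heart of this comparison are easy to state precisely. A sketch follows; the 120 MJ/kg lower heating value of hydrogen and the ~34.3:1 stoichiometric air/fuel mass ratio are standard textbook figures, not values taken from the report:

```python
def brake_thermal_efficiency(brake_power_w, h2_flow_kg_s, lhv_j_per_kg=120.0e6):
    # fraction of the fuel's chemical energy converted to brake work
    return brake_power_w / (h2_flow_kg_s * lhv_j_per_kg)

def equivalence_ratio(h2_flow_kg_s, air_flow_kg_s, afr_stoich=34.3):
    # phi = (fuel/air)_actual / (fuel/air)_stoichiometric
    return (h2_flow_kg_s / air_flow_kg_s) * afr_stoich

print(brake_thermal_efficiency(15e3, 0.5e-3))  # 15 kW on 0.5 g/s of H2 -> 0.25
```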

  9. Technoeconomic analysis of renewable hydrogen production, storage, and detection systems

    SciTech Connect

    Mann, M.K.; Spath, P.L.; Kadam, K.

    1996-10-01

    Technical and economic feasibility studies of different degrees of completeness and detail have been performed on several projects being funded by the Department of Energy's Hydrogen Program. Work this year focused on projects at the National Renewable Energy Laboratory, although analyses of projects at other institutions are underway or planned. Highly detailed analyses were completed on a fiber optic hydrogen leak detector and a process to produce hydrogen from biomass via pyrolysis followed by steam reforming of the pyrolysis oil. Less detailed economic assessments of solar and biologically-based hydrogen production processes have been performed and focused on the steps that need to be taken to improve the competitive position of these technologies. Sensitivity analyses were conducted on all analyses to reveal the degree to which the cost results are affected by market changes and technological advances. For hydrogen storage by carbon nanotubes, a survey of the competing storage technologies was made in order to set a baseline for cost goals. A determination of the likelihood of commercialization was made for nearly all systems examined. Hydrogen from biomass via pyrolysis and steam reforming was found to have significant economic potential if a coproduct option could be co-commercialized. Photoelectrochemical hydrogen production may have economic potential, but only if low-cost cells can be modified to split water and to avoid surface oxidation. The use of bacteria to convert the carbon monoxide in biomass syngas to hydrogen was found to be slightly more expensive than the high end of currently commercial hydrogen, although there are significant opportunities to reduce costs. Finally, the cost of installing a fiber-optic chemochromic hydrogen detection system in passenger vehicles was found to be very low and competitive with alternative sensor systems.

  10. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    PubMed

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). The current data analysis platform typically relies on protein-level ratios, which are obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework EBprot that directly models the peptide-protein hierarchy and rewards the proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study to show that the peptide-level analysis of EBprot provides better receiver-operating characteristic and more accurate estimation of the false discovery rates than the methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for the DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/).

  11. In vivo osteogenesis assay: a rapid method for quantitative analysis.

    PubMed

    Dennis, J E; Konstantakos, E K; Arm, D; Caplan, A I

    1998-08-01

    A quantitative in vivo osteogenesis assay is a useful tool for the analysis of cells and bioactive factors that affect the amount or rate of bone formation. There are currently two assays in general use for the in vivo assessment of osteogenesis by isolated cells: diffusion chambers and porous calcium phosphate ceramics. Due to the relative ease of specimen preparation and reproducibility of results, the porous ceramic assay was chosen for the development of a rapid method for quantitating in vivo bone formation. The ceramic cube implantation technique consists of combining osteogenic cells with 27-mm3 porous calcium phosphate ceramics, implanting the cell-ceramic composites subcutaneously into an immuno-tolerant host, and, after 2-6 weeks, harvesting and preparing the ceramic implants for histologic analysis. A drawback to the analysis of bone formation within these porous ceramics is that the entire cube must be examined to find small foci of bone present in some samples; a single cross-sectional area is not representative. For this reason, image analysis of serial sections from ceramics is often prohibitively time-consuming. Two alternative scoring methodologies were tested and compared to bone volume measurements obtained by image analysis. The two subjective scoring methods were: (1) Bone Scale: the amount of bone within pores of the ceramic implant is estimated on a scale of 0-4 based on the degree of bone fill (0=no bone, 1=up to 25%, 2=25 to 75%, 4=75 to 100% fill); and (2) Percentage Bone: the amount of bone is estimated by determining the percentage of ceramic pores which contain bone. Every tenth section of serially sectioned cubes was scored by each of these methods under double-blind conditions, and the Bone Scale and Percentage Bone results were directly compared to image analysis measurements from identical samples. Correlation coefficients indicate that the Percentage Bone method was more accurate than the Bone Scale scoring method. The Bone Scale

  12. Functional linear models for association analysis of quantitative traits.

    PubMed

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study.

  13. From screening to quantitative sensitivity analysis. A unified approach

    NASA Astrophysics Data System (ADS)

    Campolongo, Francesca; Saltelli, Andrea; Cariboni, Jessica

    2011-04-01

    The present work is a sequel to a recent one published in this journal where the superiority of the 'radial design' for computing the 'total sensitivity index' was ascertained. Both concepts belong to sensitivity analysis of model output. A radial design is one whereby, starting from a random point in the hyperspace of the input factors, one step in turn is taken for each factor. The procedure is iterated a number of times with a different starting random point so as to collect a sample of elementary shifts for each factor. The total sensitivity index is a powerful sensitivity measure which can be estimated based on such a sample. Given the similarity between the total sensitivity index and a screening test known as the method of the elementary effects (or method of Morris), we test the radial design on this method. Both methods are best practices: the total sensitivity index in the class of the quantitative measures and the elementary effects in that of the screening methods. We find that the radial design is indeed superior even for the computation of the elementary effects method. This opens the door to a sensitivity analysis strategy whereby the analyst can start with a small number of points (screening-wise) and then, depending on the results, possibly increase the number of points to compute a fully quantitative measure. Also of interest to practitioners is that a radial design is nothing other than an iterated 'One factor At a Time' (OAT) approach. OAT is a radial design of size one. While OAT is not a good practice, modelers in all domains keep using it for sensitivity analysis for reasons discussed elsewhere (Saltelli and Annoni, 2010) [23]. With the present approach modelers are offered a straightforward and economic upgrade of their OAT which maintains OAT's appeal of having just one factor moved at each step.
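
    A toy illustration of the idea, assuming unit-cube inputs and plain random base points (the actual radial design of the paper draws its points from quasi-random sequences); each radial block is exactly an OAT step set, and averaging absolute elementary effects gives the mu* screening measure:

```python
import numpy as np

def mu_star_radial(model, k, r, delta=0.1, seed=0):
    """Mean absolute elementary effect per factor from r radial blocks."""
    rng = np.random.default_rng(seed)
    ee = np.empty((r, k))
    for i in range(r):
        base = rng.uniform(0.0, 1.0 - delta, size=k)  # leave room for the +delta step
        f0 = model(base)
        for j in range(k):                            # one step per factor: an OAT block
            x = base.copy()
            x[j] += delta
            ee[i, j] = (model(x) - f0) / delta
    return np.abs(ee).mean(axis=0)

# toy model: factor 0 dominates, factor 2 is inert
print(mu_star_radial(lambda x: 4 * x[0] + x[1] ** 2 + 0 * x[2], k=3, r=100))
```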

  14. Functional Linear Models for Association Analysis of Quantitative Traits

    PubMed Central

    Fan, Ruzong; Wang, Yifan; Mills, James L.; Wilson, Alexander F.; Bailey-Wilson, Joan E.; Xiong, Momiao

    2014-01-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. PMID:24130119

  15. Quantitative Analysis of Single-Molecule RNA-Protein Interaction

    PubMed Central

    Fuhrmann, Alexander; Schoening, Jan C.; Anselmetti, Dario; Staiger, Dorothee; Ros, Robert

    2009-01-01

    RNA-binding proteins impact gene expression at the posttranscriptional level by interacting with cognate cis elements within the transcripts. Here, we apply dynamic single-molecule force spectroscopy to study the interaction of the Arabidopsis glycine-rich RNA-binding protein AtGRP8 with its RNA target. A dwell-time-dependent analysis of the single-molecule data in combination with competition assays and site-directed mutagenesis of both the RNA target and the RNA-binding domain of the protein allowed us to distinguish and quantify two different binding modes. For dwell times <0.21 s an unspecific complex with a lifetime of 0.56 s is observed, whereas dwell times >0.33 s result in a specific interaction with a lifetime of 208 s. The corresponding reaction lengths are 0.28 nm for the unspecific and 0.55 nm for the specific AtGRP8-RNA interactions, indicating formation of a tighter complex with increasing dwell time. These two binding modes cannot be dissected in ensemble experiments. Quantitative titration in RNA bandshift experiments yields an ensemble-averaged equilibrium constant of dissociation of KD = 2 × 10−7 M. Assuming comparable on-rates for the specific and nonspecific binding modes allows us to estimate their free energies as ΔG0 = −42 kJ/mol and ΔG0 = −28 kJ/mol for the specific and nonspecific binding modes, respectively. Thus, we show that single-molecule force spectroscopy with a refined statistical analysis is a potent tool for the analysis of protein-RNA interactions without the drawback of ensemble averaging. This makes it possible to discriminate between different binding modes or sites and to analyze them quantitatively. We propose that this method could be applied to complex interactions of biomolecules in general, and be of particular interest for the investigation of multivalent binding reactions. PMID:19527663
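
    The conversion from a measured dissociation constant to a binding free energy follows the standard relation Delta-G0 = RT ln KD (relative to a 1 M standard state); at an assumed temperature of 298 K this gives a value of the same order as those quoted above:

```python
import math

R, T = 8.314, 298.15        # J/(mol K); temperature assumed, not stated per experiment
KD = 2e-7                   # M, ensemble value from the RNA bandshift titration
dG0 = R * T * math.log(KD)  # standard binding free energy
print(f"{dG0 / 1000:.1f} kJ/mol")  # ~ -38 kJ/mol
```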

  16. QuASAR: quantitative allele-specific analysis of reads.

    PubMed

    Harvey, Chris T; Moyerbrailean, Gregory A; Davis, Gordon O; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-04-15

    Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. Availability and implementation: http://github.com/piquelab/QuASAR. Contact: fluca@wayne.edu or rpique@wayne.edu. Supplementary information: Supplementary Material is available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
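
    QuASAR's contribution is the joint genotype/ASE inference with error and over-dispersion parameters; the baseline it improves upon, the 1:1 allelic-ratio null described above, reduces to a simple binomial test:

```python
from scipy.stats import binomtest

def ase_pvalue(ref_reads, alt_reads):
    # null hypothesis: a heterozygous site yields reads from both alleles at 1:1
    return binomtest(ref_reads, ref_reads + alt_reads, p=0.5).pvalue

print(ase_pvalue(72, 38))  # read counts skewed toward the reference allele
```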

  17. QuASAR: quantitative allele-specific analysis of reads

    PubMed Central

    Harvey, Chris T.; Moyerbrailean, Gregory A.; Davis, Gordon O.; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-01-01

    Motivation: Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. Results: We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. Availability and implementation: http://github.com/piquelab/QuASAR. Contact: fluca@wayne.edu or rpique@wayne.edu Supplementary information: Supplementary Material is available at Bioinformatics online. PMID:25480375

  18. The Quantitative Analysis of Chennai Automotive Industry Cluster

    NASA Astrophysics Data System (ADS)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai is also called the Detroit of India due to the presence of an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai, faced problems in infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai automotive industry cluster before (2001-2002) and after (2008-2009) the Cluster Development Approach (CDA). The methodology adopted is the collection of primary data from 100 ACI using a quantitative questionnaire, analyzed using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables; the models proposed here reveal the approximate relationship in a closer form. The KWT shows no significant difference among the three location clusters with respect to net profit, production cost, marketing costs, procurement costs and gross output, supporting the view that each location has contributed uniformly to the development of the automobile component cluster. The FMT shows no significant difference between industrial units in respect of costs such as production, infrastructure, technology, marketing and net profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralised facilities by adopting the CDA and are now exporting their products to North America, South America, Europe, Australia, Africa and Asia. Value chain analysis models have been implemented in all the cluster units. This CDA model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity
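
    Both nonparametric tests used in the study are available in SciPy; the scores below are hypothetical stand-ins for the questionnaire-derived measures:

```python
from scipy.stats import friedmanchisquare, kruskal

# hypothetical performance scores for units in the three industrial estates
ambattur        = [7.1, 6.8, 7.4, 6.9, 7.2]
thirumalizai    = [6.9, 7.3, 7.0, 7.1, 6.7]
thirumudivakkam = [7.2, 6.6, 7.1, 7.0, 7.3]
H, p_locations = kruskal(ambattur, thirumalizai, thirumudivakkam)

# Friedman test: repeated cost measures on the same three units
production     = [5.2, 5.0, 5.4]
infrastructure = [5.1, 5.3, 5.2]
technology     = [5.0, 5.2, 5.3]
chi2, p_costs = friedmanchisquare(production, infrastructure, technology)
print(p_locations, p_costs)  # large p-values -> no significant difference
```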

  19. U.S. Geographic Analysis of the Cost of Hydrogen from Electrolysis

    SciTech Connect

    Saur, G.; Ainscough, C.

    2011-12-01

    This report summarizes a U.S. geographic analysis of the cost of hydrogen from electrolysis. Wind-based water electrolysis represents a viable path to renewably produced hydrogen. It might be used for hydrogen-based transportation fuels, for energy storage to augment electricity grid services, or as a supplement for other industrial hydrogen uses. This analysis focuses on the levelized cost of producing green hydrogen, rather than market prices, which would require more extensive knowledge of an hourly or daily hydrogen market. However, the costs of hydrogen presented here do include a small profit from an internal rate of return on the system. The cost of renewable wind-based hydrogen production is very sensitive to the cost of the wind electricity. Using differently priced grid electricity to supplement the system had only a small effect on the cost of hydrogen, because wind electricity was always used, either directly or indirectly, to fully generate the hydrogen. Wind classes 3-6 across the U.S. were examined, and the costs of hydrogen ranged from $3.74/kg to $5.86/kg. These costs do not quite meet the 2015 DOE targets for central or distributed hydrogen production ($3.10/kg and $3.70/kg, respectively), so more work is needed on reducing the cost of wind electricity and of the electrolyzers. If the PTC and ITC are claimed, however, many of the sites will meet both targets. For a subset of distributed refueling stations where there is also inexpensive, open space nearby, this could be an alternative to central hydrogen production and distribution.
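
    A levelized-cost calculation of the kind summarized above can be sketched in a few lines; every input below (capex, rate of return, O&M fraction, specific energy, wind price, output) is an illustrative assumption, not a figure from the report:

```python
def levelized_cost_of_h2(capex_usd, irr, years, opex_frac,
                         elec_usd_per_kwh, kwh_per_kg, kg_per_year):
    crf = irr * (1 + irr) ** years / ((1 + irr) ** years - 1)  # capital recovery factor
    annual_fixed = capex_usd * (crf + opex_frac)               # financing + O&M
    elec_per_kg = elec_usd_per_kwh * kwh_per_kg                # the dominant sensitivity
    return annual_fixed / kg_per_year + elec_per_kg

# 1 MW-class electrolyzer: $1.5M capex, 10% return over 20 y, 3% O&M,
# 55 kWh/kg at $0.05/kWh wind power, 145 t H2/y
print(levelized_cost_of_h2(1.5e6, 0.10, 20, 0.03, 0.05, 55.0, 145_000))  # ~ $4.3/kg
```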

  20. HBonanza: a computer algorithm for molecular-dynamics-trajectory hydrogen-bond analysis.

    PubMed

    Durrant, Jacob D; McCammon, J Andrew

    2011-11-01

    In the current work, we present a hydrogen-bond analysis of 2673 ligand-receptor complexes that suggests the total number of hydrogen bonds formed between a ligand and its receptor is a poor predictor of ligand potency; furthermore, even that poor prediction does not suggest a statistically significant correlation between hydrogen-bond formation and potency. While we are not the first to suggest that hydrogen bonds on average do not generally contribute to ligand binding affinities, this additional evidence is nevertheless interesting. The primary role of hydrogen bonds may instead be to ensure specificity, to correctly position the ligand within the active site, and to hold the protein active site in a ligand-friendly conformation. We also present a new computer program called HBonanza (hydrogen-bond analyzer) that aids the analysis and visualization of hydrogen-bond networks. HBonanza, which can be used to analyze single structures or the many structures of a molecular dynamics trajectory, is open source and python implemented, making it easily editable, customizable, and platform independent. Unlike many other freely available hydrogen-bond analysis tools, HBonanza provides not only a text-based table describing the hydrogen-bond network, but also a Tcl script to facilitate visualization in VMD, a popular molecular visualization program. Visualization in other programs is also possible. A copy of HBonanza can be obtained free of charge from http://www.nbcr.net/hbonanza. Copyright © 2011 Elsevier Inc. All rights reserved.
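
    Hydrogen-bond analyzers of this kind typically apply a geometric criterion frame by frame; the sketch below uses common distance and angle cutoffs, which are assumptions and not necessarily HBonanza's defaults:

```python
import numpy as np

def is_hbond(donor, hydrogen, acceptor, da_cutoff=3.5, dha_cutoff=120.0):
    """Donor-acceptor distance (Angstrom) plus D-H...A angle (degrees) criterion."""
    donor, hydrogen, acceptor = map(np.asarray, (donor, hydrogen, acceptor))
    if np.linalg.norm(acceptor - donor) > da_cutoff:
        return False
    v1, v2 = donor - hydrogen, acceptor - hydrogen
    cos_a = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))) >= dha_cutoff

print(is_hbond([0, 0, 0], [0.96, 0, 0], [2.8, 0.1, 0]))  # near-linear at 2.8 A -> True
```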

  1. HBonanza: A Computer Algorithm for Molecular-Dynamics-Trajectory Hydrogen-Bond Analysis

    PubMed Central

    Durrant, Jacob D.; McCammon, J. Andrew

    2011-01-01

    In the current work, we present a hydrogen-bond analysis of 2,673 ligand-receptor complexes that suggests the total number of hydrogen bonds formed between a ligand and its protein receptor is a poor predictor of ligand potency; furthermore, even that poor prediction does not suggest a statistically significant correlation between hydrogen-bond formation and potency. While we are not the first to suggest that hydrogen bonds on average do not generally contribute to ligand binding affinities, this additional evidence is nevertheless interesting. The primary role of hydrogen bonds may instead be to ensure specificity, to correctly position the ligand within the active site, and to hold the protein active site in a ligand-friendly conformation. We also present a new computer program called HBonanza (hydrogen-bond analyzer) that aids the analysis and visualization of hydrogen-bond networks. HBonanza, which can be used to analyze single structures or the many structures of a molecular dynamics trajectory, is open source and python implemented, making it easily editable, customizable, and platform independent. Unlike many other freely available hydrogen-bond analysis tools, HBonanza provides not only a text-based table describing the hydrogen-bond network, but also a Tcl script to facilitate visualization in VMD, a popular molecular visualization program. Visualization in other programs is also possible. A copy of HBonanza can be obtained free of charge from http://www.nbcr.net/hbonanza. PMID:21880522

  2. Quantitative Analysis of Peripheral Tissue Perfusion Using Spatiotemporal Molecular Dynamics

    PubMed Central

    Lee, Jungsul; Koh, Gou Young; Kwon, Kihwan; Choi, Chulhee

    2009-01-01

    Background: Accurate measurement of peripheral tissue perfusion is challenging but necessary to diagnose peripheral vascular insufficiency. Because near infrared (NIR) radiation can penetrate relatively deep into tissue, significant attention has been given to intravital NIR fluorescence imaging. Methodology/Principal Findings: We developed a new optical imaging-based strategy for quantitative measurement of peripheral tissue perfusion by time-series analysis of local pharmacokinetics of the NIR fluorophore, indocyanine green (ICG). Time-series NIR fluorescence images were obtained after injecting ICG intravenously in a murine hindlimb ischemia model. Mathematical modeling and computational simulations were used for translating time-series ICG images into quantitative pixel perfusion rates and a perfusion map. We could successfully predict the prognosis of ischemic hindlimbs based on the perfusion profiles obtained immediately after surgery, which were dependent on the preexisting collaterals. This method also reflected increases in perfusion and improvements in prognosis of ischemic hindlimbs induced by treatment with vascular endothelial growth factor and COMP-angiopoietin-1. Conclusions/Significance: We propose that this novel NIR-imaging-based strategy is a powerful tool for biomedical studies related to the evaluation of therapeutic interventions directed at stimulating angiogenesis. PMID:19169354
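
    The paper's pharmacokinetic model is more elaborate, but the core idea, fitting a perfusion rate to each pixel's ICG time series, can be sketched with a single-compartment uptake curve (all values below are synthetic):

```python
import numpy as np
from scipy.optimize import curve_fit

def icg_uptake(t, amplitude, k_perf):
    # single-compartment inflow: fluorescence rises as ICG perfuses the tissue
    return amplitude * (1.0 - np.exp(-k_perf * t))

t = np.arange(0.0, 60.0, 1.0)  # seconds after injection
rng = np.random.default_rng(1)
signal = icg_uptake(t, 1.0, 0.12) + 0.02 * rng.normal(size=t.size)

(amp, k_perf), _ = curve_fit(icg_uptake, t, signal, p0=(1.0, 0.05))
print(f"fitted per-pixel perfusion rate ~ {k_perf:.3f} 1/s")
```

    Repeating such a fit for every pixel yields a perfusion map of the kind described above.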

  3. Quantitative analysis of incipient mineral loss in hard tissues

    NASA Astrophysics Data System (ADS)

    Matvienko, Anna; Mandelis, Andreas; Hellen, Adam; Jeon, Raymond; Abrams, Stephen; Amaechi, Bennett

    2009-02-01

    A coupled diffuse-photon-density-wave and thermal-wave theoretical model was developed to describe the biothermophotonic phenomena in multi-layered hard tissue structures. Photothermal Radiometry was applied as a safe, non-destructive, and highly sensitive tool for the detection of early tooth enamel demineralization to test the theory. Extracted human tooth was treated sequentially with an artificial demineralization gel to simulate controlled mineral loss in the enamel. The experimental setup included a semiconductor laser (659 nm, 120 mW) as the source of the photothermal signal. Modulated laser light generated infrared blackbody radiation from teeth upon absorption and nonradiative energy conversion. The infrared flux emitted by the treated region of the tooth surface and sub-surface was monitored with an infrared detector, both before and after treatment. Frequency scans with a laser beam size of 3 mm were performed in order to guarantee one-dimensionality of the photothermal field. TMR images showed clear differences between sound and demineralized enamel, however this technique is destructive. Dental radiographs did not indicate any changes. The photothermal signal showed clear change even after 1 min of gel treatment. As a result of the fittings, thermal and optical properties of sound and demineralized enamel were obtained, which allowed for quantitative differentiation of healthy and non-healthy regions. In conclusion, the developed model was shown to be a promising tool for non-invasive quantitative analysis of early demineralization of hard tissues.

  4. Quantitative analysis of tumor burden in mouse lung via MRI.

    PubMed

    Tidwell, Vanessa K; Garbow, Joel R; Krupnick, Alexander S; Engelbach, John A; Nehorai, Arye

    2012-02-01

    Lung cancer is the leading cause of cancer death in the United States. Despite recent advances in screening protocols, the majority of patients still present with advanced or disseminated disease. Preclinical rodent models provide a unique opportunity to test novel therapeutic drugs for targeting lung cancer. Respiratory-gated MRI is a key tool for quantitatively measuring lung-tumor burden and monitoring the time-course progression of individual tumors in mouse models of primary and metastatic lung cancer. However, quantitative analysis of lung-tumor burden in mice by MRI presents significant challenges. Herein, a method for measuring tumor burden based upon average lung-image intensity is described and validated. The method requires accurate lung segmentation; its efficiency and throughput would be greatly aided by the ability to automatically segment the lungs. A technique for automated lung segmentation in the presence of varying tumor burden levels is presented. The method includes development of a new, two-dimensional parametric model of the mouse lungs and a multi-faceted cost function to optimally fit the model parameters to each image. Results demonstrate a strong correlation (0.93), comparable with that of fully manual expert segmentation, between the automated method's tumor-burden metric and the tumor burden measured by lung weight.
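
    The intensity-based burden metric itself is deliberately simple; as a sketch (synthetic image and mask stand in for the MRI slice and the automated lung segmentation):

```python
import numpy as np

def tumor_burden_metric(mri_slice, lung_mask):
    # air-filled lung is dark on MRI, so the mean intensity inside the segmented
    # lungs rises as bright soft-tissue tumor replaces aerated parenchyma
    return float(mri_slice[lung_mask].mean())

rng = np.random.default_rng(2)
img = rng.normal(100.0, 10.0, size=(128, 128))
mask = np.zeros((128, 128), dtype=bool)
mask[32:96, 20:108] = True                 # stand-in for the automated segmentation
print(tumor_burden_metric(img, mask))
```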

  5. Advance in orientation microscopy: quantitative analysis of nanocrystalline structures.

    PubMed

    Seyring, Martin; Song, Xiaoyan; Rettenmayr, Markus

    2011-04-26

    The special properties of nanocrystalline materials are generally accepted to be a consequence of the high density of planar defects (grain and twin boundaries) and their characteristics. However, until now, nanograin structures have not been characterized with similar detail and statistical relevance as coarse-grained materials, due to the lack of an appropriate method. In the present paper, a novel method based on quantitative nanobeam diffraction in transmission electron microscopy (TEM) is presented to determine the misorientation of adjacent nanograins and subgrains. Spatial resolution of <5 nm can be achieved. This method is applicable to characterize orientation relationships in wire, film, and bulk materials with nanocrystalline structures. As a model material, nanocrystalline Cu is used. Several important features of the nanograin structure are discovered utilizing quantitative analysis: the fraction of twin boundaries is substantially higher than that observed in bright-field images in the TEM; small angle grain boundaries are prominent; there is an obvious dependence of the grain boundary characteristics on grain size distribution and mean grain size.
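
    The quantity recovered from each nanobeam diffraction pair is a misorientation angle; for two orientation matrices it follows from the trace of the rotation connecting them. Crystal-symmetry reduction, needed for a full treatment of cubic Cu, is omitted in this sketch:

```python
import numpy as np

def misorientation_deg(g1, g2):
    dg = g1 @ g2.T                          # rotation relating the two orientations
    cos_theta = (np.trace(dg) - 1.0) / 2.0  # axis-angle relation: tr(R) = 1 + 2 cos(theta)
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

theta = np.radians(10.0)                    # a 10-degree rotation about z
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
print(misorientation_deg(np.eye(3), Rz))    # -> 10.0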

  6. Quantitative analysis of multiple sclerosis: a feasibility study

    NASA Astrophysics Data System (ADS)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, the measurements of white matter (WM), gray matter (GM), and cerebral spinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) The MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) A new partial volume (PV) model is built in establishing the maximum a posterior (MAP) segmentation scheme; 3) Noise artifacts are minimized by a priori Markov random field (MRF) penalty indicating neighborhood correlation from tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.
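
    As a much-reduced illustration of the mixture idea (the paper's model additionally handles intensity inhomogeneity, partial volume, and an MRF neighbourhood prior, none of which appears here), a plain three-class Gaussian mixture on synthetic voxel intensities:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# synthetic CSF/GM/WM-like intensity populations
intensities = np.concatenate([rng.normal(m, 8.0, 5000) for m in (60.0, 110.0, 160.0)])

gmm = GaussianMixture(n_components=3, random_state=0).fit(intensities.reshape(-1, 1))
labels = gmm.predict(intensities.reshape(-1, 1))
print(np.bincount(labels))  # voxel counts per tissue class -> tissue volumes
```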

  7. Quantitative colorimetric-imaging analysis of nickel in iron meteorites.

    PubMed

    Zamora, L Lahuerta; López, P Alemán; Fos, G M Antón; Algarra, R Martín; Romero, A M Mellado; Calatayud, J Martínez

    2011-02-15

    A quantitative analytical imaging approach for determining the nickel content of metallic meteorites is proposed. The approach uses a digital image of a series of standard solutions of the nickel-dimethylglyoxime coloured chelate and a meteorite sample solution subjected to the same treatment as the nickel standards for quantitation. The image is processed with suitable software to assign a colour-dependent numerical value (analytical signal) to each standard. Such a value is directly proportional to the analyte concentration, which facilitates construction of a calibration graph where the value for the unknown sample can be interpolated to calculate the nickel content of the meteorite. The results thus obtained were validated by comparison with the official, ISO-endorsed spectrophotometric method for nickel. The proposed method is fairly simple and inexpensive; in fact, it uses a commercially available digital camera as measuring instrument and the images it provides are processed with highly user-friendly public domain software (specifically, ImageJ, developed by the National Institutes of Health and freely available for download on the Internet). In a scenario dominated by increasingly sophisticated and expensive equipment, the proposed method provides a cost-effective alternative based on simple, robust hardware that is affordable and can be readily accessed worldwide. This can be especially advantageous for countries where available resources for analytical equipment investments are scant. The proposed method is essentially an adaptation of classical chemical analysis to current, straightforward, robust, cost-effective instrumentation. Copyright © 2010 Elsevier B.V. All rights reserved.
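
    The quantitation step is ordinary linear calibration; once the colour-derived signals are in hand (the numbers below are hypothetical), it reduces to a fit and an interpolation:

```python
import numpy as np

conc_ppm = np.array([0.0, 2.0, 4.0, 6.0, 8.0])          # Ni standards
signal   = np.array([2.8, 41.0, 80.3, 121.1, 159.7])    # image-derived analytical signal

slope, intercept = np.polyfit(conc_ppm, signal, 1)      # calibration graph
sample_signal = 96.4                                    # meteorite sample solution
print((sample_signal - intercept) / slope)              # interpolated Ni content, ppm
```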

  8. Quantitative three-dimensional holographic interferometry for flow field analysis

    NASA Astrophysics Data System (ADS)

    Holden, C. M. E.; Parker, S. C. J.; Bryanston-Cross, P. J.

    Holographic interferometry offers the potential for quantitative, whole-field analysis of three-dimensional compressible flows. The technique is non-intrusive, does not require the introduction of seeding particles, and records the entire flow information within the pulse duration of a Q-switched ruby laser (~30 ns). At present, however, holographic interferometry is mainly used qualitatively due to the practical restrictions of data recording, acquisition and processing. To address the potential of holographic flow analysis, a prototype multi-channel interferometer has been designed and preliminary wind tunnel results have been obtained. The proposed configuration uses specular illumination which, unlike comparable diffuse systems, does not suffer from fringe localisation and speckle noise. Beam collimation and steering through the flow field is achieved in a single operation by the use of holographic optical elements (HOEs). The resulting design is compact, light efficient, has aberration compensation, and the recorded data are conducive to both tomographic analysis and direct comparison to computational fluid dynamics (CFD) predictions. Holograms have been recorded of simple two-dimensional and axisymmetric compressible flows, to compare the accuracy of holographic density measurements with data from conventional pressure sensors and CFD codes. Data extraction from the holograms, and the elimination of rigid body motion, was achieved using digital Fourier transform fringe analysis. The introduction of phase errors by image processing has been investigated by analysing simulated fringe patterns generated from a combination of experimental amplitude information and computer generated phase data.

  9. Multipoint quantitative-trait linkage analysis in general pedigrees.

    PubMed Central

    Almasy, L; Blangero, J

    1998-01-01

    Multipoint linkage analysis of quantitative-trait loci (QTLs) has previously been restricted to sibships and small pedigrees. In this article, we show how variance-component linkage methods can be used in pedigrees of arbitrary size and complexity, and we develop a general framework for multipoint identity-by-descent (IBD) probability calculations. We extend the sib-pair multipoint mapping approach of Fulker et al. to general relative pairs. This multipoint IBD method uses the proportion of alleles shared identical by descent at genotyped loci to estimate IBD sharing at arbitrary points along a chromosome for each relative pair. We have derived correlations in IBD sharing as a function of chromosomal distance for relative pairs in general pedigrees and provide a simple framework whereby these correlations can be easily obtained for any relative pair related by a single line of descent or by multiple independent lines of descent. Once calculated, the multipoint relative-pair IBDs can be utilized in variance-component linkage analysis, which considers the likelihood of the entire pedigree jointly. Examples are given that use simulated data, demonstrating both the accuracy of QTL localization and the increase in power provided by multipoint analysis with 5-, 10-, and 20-cM marker maps. The general pedigree variance component and IBD estimation methods have been implemented in the SOLAR (Sequential Oligogenic Linkage Analysis Routines) computer package. PMID:9545414

  10. Quantitative multi-image analysis for biomedical Raman spectroscopic imaging.

    PubMed

    Hedegaard, Martin A B; Bergholt, Mads S; Stevens, Molly M

    2016-05-01

    Imaging by Raman spectroscopy enables unparalleled label-free insights into cell and tissue composition at the molecular level. With established approaches limited to single image analysis, there are currently no general guidelines or consensus on how to quantify biochemical components across multiple Raman images. Here, we describe a broadly applicable methodology for the combination of multiple Raman images into a single image for analysis. This is achieved by removing image specific background interference, unfolding the series of Raman images into a single dataset, and normalisation of each Raman spectrum to render comparable Raman images. Multivariate image analysis is finally applied to derive the contributing 'pure' biochemical spectra for relative quantification. We present our methodology using four independently measured Raman images of control cells and four images of cells treated with strontium ions from substituted bioactive glass. We show that the relative biochemical distribution per area of the cells can be quantified. In addition, using k-means clustering, we are able to discriminate between the two cell types over multiple Raman images. This study shows a streamlined quantitative multi-image analysis tool for improving cell/tissue characterisation and opens new avenues in biomedical Raman spectroscopic imaging. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
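
    The unfold-and-normalise step of the methodology can be sketched as follows (synthetic data; the published workflow removes image-specific background with more care before the multivariate analysis):

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
stack = rng.random((4, 500, 300))     # 4 Raman images x 500 pixels x 300 wavenumbers

spectra = stack.reshape(-1, stack.shape[-1])               # unfold into a single dataset
spectra -= spectra.min(axis=1, keepdims=True)              # crude per-spectrum offset removal
spectra /= np.linalg.norm(spectra, axis=1, keepdims=True)  # render spectra comparable

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(spectra)
```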

  11. Quantitative analysis of the reconstruction performance of interpolants

    NASA Technical Reports Server (NTRS)

    Lansing, Donald L.; Park, Stephen K.

    1987-01-01

    The analysis presented provides a quantitative measure of the reconstruction or interpolation performance of linear, shift-invariant interpolants. The performance criterion is the mean square error of the difference between the sampled and reconstructed functions. The analysis is applicable to reconstruction algorithms used in image processing and to many types of splines used in numerical analysis and computer graphics. When formulated in the frequency domain, the mean square error clearly separates the contribution of the interpolation method from the contribution of the sampled data. The equations provide a rational basis for selecting an optimal interpolant; that is, one which minimizes the mean square error. The analysis has been applied to a selection of frequently used data splines and reconstruction algorithms: parametric cubic and quintic Hermite splines, exponential and nu splines (including the special case of the cubic spline), parametric cubic convolution, Keys' fourth-order cubic, and a cubic with a discontinuous first derivative. The emphasis in this paper is on the image-dependent case in which no a priori knowledge of the frequency spectrum of the sampled function is assumed.

  12. Theoretical analysis of hydrogen spillover mechanism on carbon nanotubes

    PubMed Central

    Juarez-Mosqueda, Rosalba; Mavrandonakis, Andreas; Kuc, Agnieszka B.; Pettersson, Lars G. M.; Heine, Thomas

    2015-01-01

    The spillover mechanism of molecular hydrogen on carbon nanotubes in the presence of catalytically active platinum clusters was critically and systematically investigated by using density-functional theory. Our simulation model includes a Pt4 cluster for the catalyst nanoparticle and curved and planar circumcoronene for two exemplary single-walled carbon nanotubes (CNT), the (10,10) CNT and one of large diameter, respectively. Our results show that the H2 molecule dissociates spontaneously on the Pt4 cluster. However, the dissociated H atoms have to overcome a barrier of more than 2 eV to migrate from the catalyst to the CNT, even if the Pt4 cluster is at full saturation with six adsorbed and dissociated hydrogen molecules. Previous investigations have shown that the mobility of hydrogen atoms on the CNT surface is hindered by a barrier. We find that instead the Pt4 catalyst may move along the outer surface of the CNT with activation energy of only 0.16 eV, and that this effect offers the possibility of full hydrogenation of the CNT. Thus, although we have not found a low-energy pathway to spillover onto the CNT, we suggest, based on our calculations and calculated data reported in the literature, that in the hydrogen-spillover process the observed saturation of the CNT at hydrogen background pressure occurs through mobile Pt nanoclusters, which move on the substrate more easily than the substrate-chemisorbed hydrogens, and deposit or reattach hydrogens in the process. Initial hydrogenation of the carbon substrate, however, is thermodynamically unfavoured, suggesting that defects should play a significant role. PMID:25699250
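
    The contrast between the two barriers is easiest to appreciate as Arrhenius rates; the attempt frequency below is a generic assumed value, not one taken from the paper:

```python
import math

kB = 8.617e-5    # Boltzmann constant, eV/K
nu = 1e13        # attempt frequency, 1/s (assumed)

for label, Ea in (("Pt4 migration", 0.16), ("H migration onto CNT", 2.0)):
    rate = nu * math.exp(-Ea / (kB * 300.0))
    print(f"{label}: Ea = {Ea:.2f} eV -> ~{rate:.1e} events/s at 300 K")
# ~2e10/s for cluster migration versus ~3e-21/s for direct spillover
```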

  15. ISE Analysis of Hydrogen Sulfide in Cigarette Smoke

    NASA Astrophysics Data System (ADS)

    Li, Guofeng; Polk, Brian J.; Meazell, Liz A.; Hatchett, David W.

    2000-08-01

    Many advanced undergraduate analytical laboratory courses focus on exposing students to various modern instruments. However, students rarely have the opportunity to construct their own analytical tools for solving practical problems. We designed an experiment in which students are required to build their own analytical module, a potentiometric device composed of a Ag/AgCl reference electrode, a Ag/Ag2S ion selective electrode (ISE), and a pH meter used as a voltmeter, to determine the amount of hydrogen sulfide in cigarette smoke. Very simple techniques were developed for constructing these electrodes. Cigarette smoke is collected by a gas washing bottle into a 0.1 M NaOH solution. The amount of sulfide in the cigarette smoke solution is analyzed by standard addition of sulfide solution while monitoring the response of the Ag/Ag2S ISE. The collected data are further evaluated using the Gran plot technique to determine the concentration of sulfide in the cigarette smoke solution. The experiment has been successfully incorporated into the lab course Instrumental Analysis at Georgia Institute of Technology. Students enjoy the idea of constructing an analytical tool themselves and applying their classroom knowledge to solve real-life problems. While learning electrochemistry, they also get a chance to visualize the health hazard posed by cigarette smoking.
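
    As a rough illustration of the Gran plot evaluation described above, the Python sketch below recovers a sulfide concentration from a standard-addition series with an Ag/Ag2S ISE. It is a minimal sketch assuming ideal Nernstian behavior; the volumes, potentials, and electrode slope are invented for illustration, not taken from the experiment.

    ```python
    import numpy as np

    V0 = 50.0    # mL of the 0.1 M NaOH smoke-collection solution in the cell
    Cs = 1.0e-3  # mol/L sulfide standard used for the additions
    s = 29.6     # mV/decade, ideal Nernstian slope for the divalent sulfide ion

    # Added standard volumes (mL) and measured cell potentials (mV), illustrative
    Vs = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
    E = np.array([-610.0, -615.1, -618.7, -621.4, -623.6])

    # Gran function: G = (V0 + Vs) * 10^(-E/s) is proportional to the moles of
    # sulfide in the cell, so it is linear in Vs and its x-intercept is -V0*C0/Cs.
    G = (V0 + Vs) * 10 ** (-E / s)
    slope, intercept = np.polyfit(Vs, G, 1)
    Vx = -intercept / slope          # x-intercept, a negative volume
    C0 = -Vx * Cs / V0               # sulfide concentration of the sample
    print(f"Estimated sulfide concentration: {C0:.2e} mol/L")
    ```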

  16. Hydrogen Fueling Station in Honolulu, Hawaii Feasibility Analysis

    SciTech Connect

    Porter Hill; Michael Penev

    2014-08-01

    The Department of Energy Hydrogen & Fuel Cells Program Plan (September 2011) identifies the use of hydrogen for government and fleet electric vehicles as a key step for achieving “reduced greenhouse gas emissions; reduced oil consumption; expanded use of renewable power …; highly efficient energy conversion; fuel flexibility …; reduced air pollution; and highly reliable grid-support.” This report synthesizes several pieces of existing information that can inform a decision regarding the viability of deploying a hydrogen (H2) fueling station at the Fort Armstrong site in Honolulu, Hawaii.

  17. A survey and analysis of experimental hydrogen sensors

    NASA Technical Reports Server (NTRS)

    Hunter, Gary W.

    1992-01-01

    In order to ascertain the applicability of hydrogen sensors to aerospace applications, a survey was conducted of promising experimental point-contact hydrogen sensors and their operation was analyzed. The techniques discussed are metal-oxide-semiconductor (MOS) based sensors, catalytic resistor sensors, acoustic wave detectors, and pyroelectric detectors. All of these sensors depend on the interaction of hydrogen with Pd or a Pd-alloy. It is concluded that no single technique will meet the needs of aerospace applications, but a combination of approaches is necessary. The most promising combination is an MOS-based sensor with a catalytic resistor.

  18. Quantitative multielement analysis using high energy particle bombardment

    NASA Technical Reports Server (NTRS)

    Clark, P. J.; Neal, G. F.; Allen, R. O.

    1974-01-01

    Charged particles ranging in energy from 0.8 to 4.0 MeV are used to induce resonant nuclear reactions, Coulomb excitation (gamma-rays), and X-ray emission in both thick and thin targets. Quantitative analysis is possible for elements from Li to Pb in complex environmental samples, although the matrix can severely reduce the sensitivity. It is necessary to use a comparator technique for the gamma-rays, while for X-rays an internal standard can be used. A USGS standard rock is analyzed for a total of 28 elements. Water samples can be analyzed either by nebulizing the sample doped with Cs or Y onto a thin Formvar film or by extracting the sample (with or without an internal standard) onto ion exchange resin which is pressed into a pellet.

  19. Quantitative analysis of creatinine in urine by metalized nanostructured parylene

    NASA Astrophysics Data System (ADS)

    Wang, Hui; Malvadkar, Niranjan; Koytek, S.; Bylander, J.; Reeves, W. Brian; Demirel, Melik C.

    2010-03-01

    A highly accurate, real-time multisensor agent monitor for biomarker detection is required for early detection of kidney diseases. Urine creatinine level can provide useful information on the status of the kidney. We prepare nanostructured surface-enhanced Raman spectroscopy (SERS) substrates without template or lithography, which provides controllable, well-organized nanostructures on the surface, for the quantitative analysis of creatinine concentration in urine. We present our work on sensitivity of the SERS substrate to urine samples collected from diabetic patients and healthy persons. We report the preparation of a new type of SERS substrate, which provides fast (<10 s), highly sensitive (creatinine concentration <0.5 μg/mL) and reproducible (<5% variation) detection of urine. Our method to analyze the creatinine level in urine is in good agreement with the enzymatic method.

  20. Quantitative multielement analysis using high energy particle bombardment

    NASA Technical Reports Server (NTRS)

    Clark, P. J.; Neal, G. F.; Allen, R. O.

    1975-01-01

    Charged particles ranging in energy from 0.8 to 4.0 MeV are used to induce resonant nuclear reactions, Coulomb excitation (gamma-rays), and X-ray emission in both thick and thin targets. Quantitative analysis is possible for elements from Li to Pb in complex environmental samples, although the matrix can severely reduce the sensitivity. It is necessary to use a comparator technique for the gamma-rays, while for X-rays an internal standard can be used. A USGS standard rock is analyzed for a total of 28 elements. Water samples can be analyzed either by nebulizing the sample doped with Cs or Y onto a thin Formvar film or by extracting the sample onto ion exchange resin which is pressed into a pellet.

  1. Large-Scale Quantitative Analysis of Painting Arts

    NASA Astrophysics Data System (ADS)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase in the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances.
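
    The three measures named in the abstract can be prototyped in a few lines. The sketch below is one plausible reading of them, assuming a 16-level color quantization and a Hurst-like definition of the roughness exponent; the authors' exact definitions may differ.

    ```python
    import numpy as np

    def color_usage(rgb, bins=16):
        """Fraction of pixels in each quantized RGB color bin."""
        q = (np.asarray(rgb, dtype=int) // (256 // bins)).reshape(-1, 3)
        codes = (q[:, 0] * bins + q[:, 1]) * bins + q[:, 2]
        return np.bincount(codes, minlength=bins ** 3) / len(codes)

    def color_variety(usage):
        """Shannon entropy (bits) of the color distribution; higher = richer palette."""
        p = usage[usage > 0]
        return float(-(p * np.log2(p)).sum())

    def roughness_exponent(rgb, lags=(1, 2, 4, 8, 16, 32)):
        """Hurst-like exponent: scaling of brightness-increment spread with lag."""
        gray = np.asarray(rgb, dtype=float).mean(axis=2)
        w = [np.std(gray[:, lag:] - gray[:, :-lag]) for lag in lags]
        return float(np.polyfit(np.log(lags), np.log(w), 1)[0])

    # Demo on a synthetic 256x256 random "canvas"
    img = np.random.randint(0, 256, size=(256, 256, 3))
    print(color_variety(color_usage(img)), roughness_exponent(img))
    ```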

  3. qfasar: quantitative fatty acid signature analysis with R

    USGS Publications Warehouse

    Bromaghin, Jeffrey

    2017-01-01

    Knowledge of predator diets provides essential insights into their ecology, yet diet estimation is challenging and remains an active area of research. Quantitative fatty acid signature analysis (QFASA) is a popular method of estimating diet composition that continues to be investigated and extended. However, software to implement QFASA has only recently become publicly available. I summarize a new R package, qfasar, for diet estimation using QFASA methods. The package also provides functionality to evaluate and potentially improve the performance of a library of prey signature data, compute goodness-of-fit diagnostics, and support simulation-based research. Several procedures in the package have not previously been published. qfasar makes traditional and recently published QFASA diet estimation methods accessible to ecologists for the first time. Use of the package is illustrated with signature data from Chukchi Sea polar bears and potential prey species.
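
    The core QFASA estimation step can be sketched as a constrained optimization: find the non-negative, unit-sum mixture of prey signatures closest to the predator signature. qfasar offers several distance measures; plain least squares stands in for them here, and the three-fatty-acid signatures are invented for illustration.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    prey = np.array([[0.60, 0.30, 0.10],     # mean signature, prey species A
                     [0.20, 0.50, 0.30],     # prey species B
                     [0.10, 0.20, 0.70]])    # prey species C
    predator = np.array([0.33, 0.37, 0.30])  # observed predator signature

    def distance(d):
        """Squared distance between predator signature and mixed prey signatures."""
        return np.sum((predator - d @ prey) ** 2)

    n = prey.shape[0]
    res = minimize(distance, x0=np.full(n, 1.0 / n),
                   bounds=[(0.0, 1.0)] * n,
                   constraints={"type": "eq", "fun": lambda d: d.sum() - 1.0})
    print("Estimated diet proportions:", res.x.round(3))
    ```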

  4. Automatic analysis of quantitative NMR data of pharmaceutical compound libraries.

    PubMed

    Liu, Xuejun; Kolpak, Michael X; Wu, Jiejun; Leo, Gregory C

    2012-08-07

    In drug discovery, chemical library compounds are usually dissolved in DMSO at a certain concentration and then distributed to biologists for target screening. Quantitative (1)H NMR (qNMR) is the preferred method for the determination of the actual concentrations of compounds because the relative single proton peak areas of two chemical species represent the relative molar concentrations of the two compounds, that is, the compound of interest and a calibrant. Thus, an analyte concentration can be determined using a calibration compound at a known concentration. One particularly time-consuming step in the qNMR analysis of compound libraries is the manual integration of peaks. This report presents an automated method for performing this task without prior knowledge of compound structures, using an external calibration spectrum. The script for automated integration is fast and adaptable to large-scale data sets, eliminating the need for manual integration in ~80% of the cases.
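
    The quantitation rule the abstract relies on reduces to comparing per-proton peak areas against an external calibrant of known concentration. A minimal sketch, with invented numbers:

    ```python
    def qnmr_concentration(area_analyte, n_h_analyte, area_cal, n_h_cal, conc_cal):
        """Analyte concentration from per-proton integrals vs. a known calibrant."""
        return conc_cal * (area_analyte / n_h_analyte) / (area_cal / n_h_cal)

    # A 3-proton singlet integrating to 2.4 against a 10 mM calibrant's
    # one-proton peak of area 1.0 implies an 8 mM analyte.
    print(qnmr_concentration(2.4, 3, 1.0, 1, 10.0))  # -> 8.0
    ```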

  5. Quantitative Image Analysis of HIV-1 Infection in Lymphoid Tissue

    NASA Astrophysics Data System (ADS)

    Haase, Ashley T.; Henry, Keith; Zupancic, Mary; Sedgewick, Gerald; Faust, Russell A.; Melroe, Holly; Cavert, Winston; Gebhard, Kristin; Staskus, Katherine; Zhang, Zhi-Qiang; Dailey, Peter J.; Balfour, Henry H., Jr.; Erice, Alejo; Perelson, Alan S.

    1996-11-01

    Tracking human immunodeficiency virus-type 1 (HIV-1) infection at the cellular level in tissue reservoirs provides opportunities to better understand the pathogenesis of infection and to rationally design and monitor therapy. A quantitative technique was developed to determine viral burden in two important cellular compartments in lymphoid tissues. Image analysis and in situ hybridization were combined to show that in the presymptomatic stages of infection there is a large, relatively stable pool of virions on the surfaces of follicular dendritic cells and a smaller pool of productively infected cells. Despite evidence of constraints on HIV-1 replication in the infected cell population in lymphoid tissues, estimates of the numbers of these cells and the virus they could produce are consistent with the quantities of virus that have been detected in the bloodstream. The cellular sources of virus production and storage in lymphoid tissues can now be studied with this approach over the course of infection and treatment.

  6. Cerebellar dyssynergia in humans--a quantitative analysis.

    PubMed

    Miller, R G; Freund, H J

    1980-12-01

    Patients with cerebellar lesions and limb ataxia performed two types of continuous tracking tasks involving flexion and extension of the index finger. In both tasks, patients were provided cutaneous and proprioceptive cues, but visual feedback was given in the first task (visual tracking) and not in the second (arbitrarily termed proprioceptive tracking). Raw records and Fourier-analyzed power spectra were compared with results in normal controls. Harmonic distortion was determined for each task. In all patients, as well as normal subjects, tracking performance was markedly improved and harmonic distortion substantially reduced during proprioceptive tracking. This surprising finding may result from a much shorter feedback loop for proprioceptive stimuli compared to visual stimuli. The tracking records, power spectra analysis, and determination of harmonic distortion provide both qualitative and quantitative data in patients with dyssynergia.
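
    Harmonic distortion of a tracking record can be estimated directly from its Fourier power spectrum: power at integer multiples of the target frequency relative to power at the fundamental. The sketch below applies one common definition to a synthetic record; the 0.5 Hz target and the 20% third harmonic are illustrative, not the study's values.

    ```python
    import numpy as np

    fs = 100.0                     # sampling rate (Hz)
    f0 = 0.5                       # tracking target frequency (Hz)
    t = np.arange(0, 60, 1 / fs)   # 60 s record
    x = np.sin(2 * np.pi * f0 * t) + 0.2 * np.sin(2 * np.pi * 3 * f0 * t)

    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), 1 / fs)

    def band_power(f, half_width=0.05):
        """Total spectral power in a narrow band around frequency f."""
        return power[np.abs(freqs - f) < half_width].sum()

    fundamental = band_power(f0)
    harmonics = sum(band_power(k * f0) for k in range(2, 6))
    print(f"Harmonic distortion: {np.sqrt(harmonics / fundamental):.2f}")  # ~0.20
    ```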

  7. Quantitative image analysis for investigating cell-matrix interactions

    NASA Astrophysics Data System (ADS)

    Burkel, Brian; Notbohm, Jacob

    2017-07-01

    The extracellular matrix provides both chemical and physical cues that control cellular processes such as migration, division, differentiation, and cancer progression. Cells can mechanically alter the matrix by applying forces that result in matrix displacements, which in turn may localize to form dense bands along which cells may migrate. To quantify the displacements, we use confocal microscopy and fluorescent labeling to acquire high-contrast images of the fibrous material. Using a technique for quantitative image analysis called digital volume correlation, we then compute the matrix displacements. Our experimental technology offers a means to quantify matrix mechanics and cell-matrix interactions. We are now using these experimental tools to modulate mechanical properties of the matrix to study cell contraction and migration.

  8. Quantitative microstructure analysis of polymer-modified mortars.

    PubMed

    Jenni, A; Herwegh, M; Zurbriggen, R; Aberle, T; Holzer, L

    2003-11-01

    Digital light, fluorescence and electron microscopy in combination with wavelength-dispersive spectroscopy were used to visualize individual polymers, air voids, cement phases and filler minerals in a polymer-modified cementitious tile adhesive. In order to investigate the evolution and processes involved in formation of the mortar microstructure, quantifications of the phase distribution in the mortar were performed including phase-specific imaging and digital image analysis. The required sample preparation techniques and imaging related topics are discussed. As a form of case study, the different techniques were applied to obtain a quantitative characterization of a specific mortar mixture. The results indicate that the mortar fractionates during different stages ranging from the early fresh mortar until the final hardened mortar stage. This induces process-dependent enrichments of the phases at specific locations in the mortar. The approach presented provides important information for a comprehensive understanding of the functionality of polymer-modified mortars.

  9. Quantitative image analysis of WE43-T6 cracking behavior

    NASA Astrophysics Data System (ADS)

    Ahmad, A.; Yahya, Z.

    2013-06-01

    Environment-assisted cracking of WE43 cast magnesium (4.2 wt.% Y, 2.3 wt.% Nd, 0.7% Zr, 0.8% HRE) in the T6 peak-aged condition was induced in ambient air in notched specimens. The mechanism of fracture was studied using electron backscatter diffraction, serial sectioning and in situ observations of crack propagation. The intermetallic material (rare-earth-enriched divorced intermetallic retained at grain boundaries and predominantly at triple points) was found to play a significant role in initiating the cracks that lead to failure of this material. Quantitative measurements were required for this project. The populations of the intermetallic and clusters of intermetallic particles were analyzed using image analysis of metallographic images. This is part of the work to generate a theoretical model of the effect of notch geometry on the static fatigue strength of this material.

  10. Quantitative analysis of gallstones using laser-induced breakdown spectroscopy.

    PubMed

    Singh, Vivek K; Singh, Vinita; Rai, Awadhesh K; Thakur, Surya N; Rai, Pradeep K; Singh, Jagdish P

    2008-11-01

    The utility of laser-induced breakdown spectroscopy (LIBS) for categorizing different types of gallbladder stone has been demonstrated by analyzing their major and minor constituents. LIBS spectra of three types of gallstone have been recorded in the 200-900 nm spectral region. Calcium is found to be the major element in all types of gallbladder stone. The spectrophotometric method has been used to classify the stones. A calibration-free LIBS method has been used for the quantitative analysis of metal elements, and the results have been compared with those obtained from inductively coupled plasma atomic emission spectroscopy (ICP-AES) measurements. The single-shot LIBS spectra from different points on the cross section (in steps of 0.5 mm from one end to the other) of gallstones have also been recorded to study the variation of constituents from the center to the surface. The presence of different metal elements and their possible role in gallstone formation is discussed.

  12. Quantitative genetic analysis of injury liability in infants and toddlers

    SciTech Connect

    Phillips, K.; Matheny, A.P. Jr.

    1995-02-27

    A threshold model of latent liability was applied to infant and toddler twin data on total count of injuries sustained during the interval from birth to 36 months of age. A quantitative genetic analysis of estimated twin correlations in injury liability indicated strong genetic dominance effects, but no additive genetic variance was detected. Because interpretations involving overdominance have little research support, the results may be due to low order epistasis or other interaction effects. Boys had more injuries than girls, but this effect was found only for groups whose parents were prompted and questioned in detail about their children's injuries. Activity and impulsivity are two behavioral predictors of childhood injury, and the results are discussed in relation to animal research on infant and adult activity levels, and impulsivity in adult humans. Genetic epidemiological approaches to childhood injury should aid in targeting higher risk children for preventive intervention. 30 refs., 4 figs., 3 tabs.

  13. Energy efficiency quantitative analysis method of discrete manufacturing system

    NASA Astrophysics Data System (ADS)

    Wang, Yan; Ji, Zhicheng

    2017-07-01

    The difficulty in the energy efficiency analysis of discrete manufacturing systems is the lack of an evaluation index system. In this paper, a novel evaluation index system with three layers and 10 indexes was presented to analyze the overall energy consumption level of the discrete manufacturing system. Then, in view of the difficulty of directly obtaining machine energy efficiency, a prediction method based on recursive variable forgetting factor identification was put forward to calculate it. Furthermore, a comprehensive quantitative evaluation method combining rough set theory and an attribute hierarchical model was designed based on the index structure to evaluate the energy efficiency level. Finally, an experiment was used to illustrate the effectiveness of our evaluation index system and method.
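
    Recursive identification with a forgetting factor, the family of methods behind the energy-efficiency predictor above, can be sketched in a few lines. A fixed factor is used for brevity; the paper's variable-forgetting-factor scheme would additionally update lam at each step. All data below are synthetic.

    ```python
    import numpy as np

    def rls_update(theta, P, phi, y, lam=0.98):
        """One recursive least-squares step with forgetting factor lam."""
        K = P @ phi / (lam + phi @ P @ phi)    # gain vector
        theta = theta + K * (y - phi @ theta)  # parameter update
        P = (P - np.outer(K, phi @ P)) / lam   # covariance update
        return theta, P

    # Identify y = 2*u1 - 0.5*u2 from streaming (regressor, output) pairs
    rng = np.random.default_rng(0)
    theta, P = np.zeros(2), 100.0 * np.eye(2)
    for _ in range(200):
        phi = rng.normal(size=2)
        y = 2.0 * phi[0] - 0.5 * phi[1] + 0.01 * rng.normal()
        theta, P = rls_update(theta, P, phi, y)
    print(theta)  # approaches [2.0, -0.5]
    ```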

  14. Quantitative analysis of forest island pattern in selected Ohio landscapes

    SciTech Connect

    Bowen, G.W.; Burgess, R.L.

    1981-07-01

    The purpose of this study was to quantitatively describe the various aspects of regional distribution patterns of forest islands and relate those patterns to other landscape features. Several maps showing the forest cover of various counties in Ohio were selected as representative examples of forest patterns to be quantified. Ten thousand hectare study areas (landscapes) were delineated on each map. A total of 15 landscapes representing a wide variety of forest island patterns was chosen. Data were converted into a series of continuous variables which contained information pertinent to the sizes, shape, numbers, and spacing of woodlots within a landscape. The continuous variables were used in a factor analysis to describe the variation among landscapes in terms of forest island pattern. The results showed that forest island patterns are related to topography and other environmental features correlated with topography.

  15. Quantitative analysis in outcome assessment of instrumented lumbosacral arthrodesis

    PubMed Central

    Mazel, Christian; Mitulescu, Anca

    2007-01-01

    The outcome assessment in instrumented lumbosacral fusion mostly focuses on clinical criteria, complications and scores, with a high variability of imaging means, methods of fusion grading and parameters describing degenerative changes, making comparisons between studies difficult. The aim of this retrospective evaluation was to assess the value of quantified radiographic analysis of the lumbar spine in global outcome assessment and to highlight the key biomechanical factors involved. Clinical data and Beaujon–Lassale scores were collected for 49 patients who underwent lumbosacral arthrodesis after prior lumbar discectomy (mean follow-up: 5 years). Sagittal standing and lumbar flexion-extension X-ray films allowed quantifying vertebral, lumbar, pelvic and kinematic parameters of the lumbar spine, which were compared to reference values. Statistics were performed to assess evolution for all variables. At long-term follow-up, 90% of patients presented satisfactory clinical outcomes, associated to normal sagittal alignment; vertebral parameters objectified adjacent level degeneration in four cases (8%). Clinical outcome was correlated (r = 0.8) with fusion that was confirmed in 80% of cases, doubtful in 16% and pseudarthrosis seemed to occur in 4% (2) of cases. In addition to clinical data (outcomes comparable to the literature), quantitative analysis accurately described lumbar spine geometry and kinematics, highlighting parameters related to adjacent level’s degeneration and a significant correlation between clinical outcome and fusion. Furthermore, criteria proposed to quantitatively evaluate fusion from lumbar dynamic radiographs seem to be appropriate and in agreement with surgeon’s qualitative grading in 87% of cases. PMID:17216227

  16. Quantitative analysis of protein-ligand interactions by NMR.

    PubMed

    Furukawa, Ayako; Konuma, Tsuyoshi; Yanaka, Saeko; Sugase, Kenji

    2016-08-01

    Protein-ligand interactions have been commonly studied through static structures of the protein-ligand complex. Recently, however, there has been increasing interest in investigating the dynamics of protein-ligand interactions both for fundamental understanding of the underlying mechanisms and for drug development. NMR is a versatile and powerful tool, especially because it provides site-specific quantitative information. NMR has widely been used to determine the dissociation constant (KD), in particular, for relatively weak interactions. The simplest NMR method is a chemical-shift titration experiment, in which the chemical-shift changes of a protein in response to ligand titration are measured. There are other quantitative NMR methods, but they mostly apply only to interactions in the fast-exchange regime. These methods derive the dissociation constant from population-averaged NMR quantities of the free and bound states of a protein or ligand. In contrast, the recent advent of new relaxation-based experiments, including R2 relaxation dispersion and ZZ-exchange, has enabled us to obtain kinetic information on protein-ligand interactions in the intermediate- and slow-exchange regimes. Based on R2 dispersion or ZZ-exchange, methods that can determine the association rate, kon, dissociation rate, koff, and KD have been developed. In these approaches, R2 dispersion or ZZ-exchange curves are measured for multiple samples with different protein and/or ligand concentration ratios, and the relaxation data are fitted to theoretical kinetic models. It is critical to choose an appropriate kinetic model, such as the two- or three-state exchange model, to derive the correct kinetic information. The R2 dispersion and ZZ-exchange methods are suitable for the analysis of protein-ligand interactions with a micromolar or sub-micromolar dissociation constant but not for very weak interactions, which are typical in very fast exchange. This contrasts with the NMR methods that are used
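
    The chemical-shift titration mentioned above is the simplest of these experiments, and fitting it amounts to a nonlinear regression against the quadratic single-site binding isotherm. The sketch below assumes fast exchange and 1:1 binding; the concentrations and shift changes are synthetic, not from the review.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    P = 0.1  # total protein concentration (mM), held fixed during the titration

    def delta_obs(L, d_max, KD):
        """Fast-exchange shift change for 1:1 binding (quadratic isotherm)."""
        b = P + L + KD
        return d_max * (b - np.sqrt(b ** 2 - 4 * P * L)) / (2 * P)

    L = np.array([0.0, 0.05, 0.1, 0.2, 0.4, 0.8, 1.6])              # ligand (mM)
    dd = np.array([0.0, 0.018, 0.032, 0.053, 0.075, 0.094, 0.106])  # shift (ppm)

    (d_max, KD), _ = curve_fit(delta_obs, L, dd, p0=(0.1, 0.5), bounds=(0, np.inf))
    print(f"d_max = {d_max:.3f} ppm, KD = {KD:.3f} mM")  # ~0.12 ppm, ~0.2 mM
    ```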

  17. Expanded Capabilities for the Hydrogen Financial Analysis Scenario Tool (H2FAST)

    SciTech Connect

    Bush, Brian; Melaina, Marc; Penev, Michael

    2016-06-08

    This presentation describes how NREL expanded the capabilities for the Hydrogen Financial Analysis Scenario Tool (H2FAST) in FY16. It was presented at the U.S. Department of Energy Hydrogen and Fuel Cells Program 2016 Annual Merit Review and Peer Evaluation Meeting on June 8, 2016, in Washington, D.C.

  18. Technoeconomic analysis of different options for the production of hydrogen from sunlight, wind, and biomass

    SciTech Connect

    Mann, M.K.; Spath, P.L.; Amos, W.A.

    1998-08-01

    To determine their technical and economic viability and to provide insight into where each technology is in its development cycle, different options to produce hydrogen from sunlight, wind, and biomass were studied. Additionally, costs for storing and transporting hydrogen were determined for different hydrogen quantities and storage times. The analysis of hydrogen from sunlight examined the selling price of hydrogen from two technologies: direct photoelectrochemical (PEC) conversion of sunlight and photovoltaic (PV)-generated electricity production followed by electrolysis. The wind analysis was based on wind-generated electricity production followed by electrolysis. In addition to the base case analyses, which assume that hydrogen is the sole product, three alternative scenarios explore the economic impact of integrating the PV- and wind-based systems with the electric utility grid. Results show that PEC hydrogen production has the potential to be economically feasible. Additionally, the economics of the PV and wind electrolysis systems are improved by interaction with the grid. The analysis of hydrogen from biomass focused on three gasification technologies. The systems are: low pressure, indirectly-heated gasification followed by steam reforming; high pressure, oxygen-blown gasification followed by steam reforming; and pyrolysis followed by partial oxidation. For each of the systems studied, the downstream process steps include shift conversion followed by hydrogen purification. Only the low pressure system produces hydrogen within the range of the current industry selling prices (typically $0.7-$2/kg, or $5-$14/GJ on an HHV basis). A sensitivity analysis showed that, for the other two systems, in order to bring the hydrogen selling price down to $2/kg, negative-priced feedstocks would be required.
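
    The two price ranges quoted in parentheses are mutually consistent, as a quick conversion using hydrogen's higher heating value (roughly 0.1419 GJ/kg) shows:

    ```python
    HHV_GJ_PER_KG = 0.1419  # higher heating value of hydrogen, GJ per kg (approx.)

    for usd_per_kg in (0.7, 2.0):
        print(f"${usd_per_kg:.2f}/kg -> ${usd_per_kg / HHV_GJ_PER_KG:.1f}/GJ (HHV)")
    # -> $0.70/kg -> $4.9/GJ and $2.00/kg -> $14.1/GJ, matching the quoted range
    ```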

  19. Quantitative Medical Image Analysis for Clinical Development of Therapeutics

    NASA Astrophysics Data System (ADS)

    Analoui, Mostafa

    There has been significant progress in the development of therapeutics for prevention and management of several disease areas in recent years, leading to increased average life expectancy, as well as quality of life, globally. However, due to the complexity of addressing a number of medical needs and the financial burden of developing new classes of therapeutics, there is a need for better tools for decision making and validation of efficacy and safety of new compounds. Numerous biological markers (biomarkers) have been proposed either as adjuncts to current clinical endpoints or as surrogates. Imaging biomarkers are among the most rapidly growing of these, being examined to expedite effective and rational drug development. Clinical imaging often involves a complex set of multi-modality data sets that require rapid and objective analysis, independent of reviewer's bias and training. In this chapter, an overview of imaging biomarkers for drug development is offered, along with challenges that necessitate quantitative and objective image analysis. Examples of automated and semi-automated analysis approaches are provided, along with a technical review of such methods. These examples include the use of 3D MRI for osteoarthritis, ultrasound vascular imaging, and dynamic contrast enhanced MRI for oncology. Additionally, a brief overview of regulatory requirements is discussed. In conclusion, this chapter highlights key challenges and future directions in this area.

  20. In vitro quantitative chemical analysis of tattoo pigments.

    PubMed

    Timko, A L; Miller, C H; Johnson, F B; Ross, E

    2001-02-01

    The composition of cosmetic tattoos might prove relevant to their treatment by high-powered lasers. The objective was to test the accuracy and completeness of information supplied by the tattoo ink manufacturers and to perform an elemental assay of tattoo pigments using scanning electron microscopy with energy-dispersive x-ray analysis. Samples of 30 tattoo inks were examined using "standardless" energy-dispersive spectrometry. This technique uses quantitative electron x-ray microanalysis and reliably identifies all elements except those with atomic numbers less than 11. Analyses were performed at a major national referral laboratory for microscopic examination and biochemical analysis of tissue. The results were compared with ink compositions compiled from manufacturer-supplied material safety data sheets. The main outcome measures were (1) the percentage of any given element in whole tattoo pigments and (2) the presence or absence of elements and/or compounds as recorded in material safety data sheets supplied by the tattoo ink manufacturers. Of the 30 tattoo inks studied, the most commonly identified elements were aluminum (87% of the pigments), oxygen (73% of the pigments), titanium (67% of the pigments), and carbon (67% of the pigments). The relative contribution of elements to the tattoo ink compositions was highly variable between different compounds. Overall, the manufacturer-supplied data sheets were consistent with the elemental analysis, but there were important exceptions. The composition of elements in tattoo inks varies greatly, even among like-colored pigments. Knowledge of the chemical composition of popular tattoo inks might aid the clinician in effective laser removal.

  1. A Quantitative Analysis of the Solar Composition Problem

    NASA Astrophysics Data System (ADS)

    Villante, F. L.; Serenelli, A. M.

    We perform a quantitative analysis of the solar composition problem by using a statistical approach that allows us to combine the information provided by helioseismic and solar neutrino data in an effective way. We show that the opacity profile of the Sun is well constrained by the solar observational properties. In the context of a two parameter analysis in which elements are grouped as volatiles (i.e. C, N, O and Ne) and refractories (i.e. Mg, Si, S, Fe), the optimal surface composition is found by increasing the abundance of volatiles by (45 ± 4) % and that of refractories by (19 ± 3) % with respect to the values provided by Asplund et al., 2009. As an additional result of our analysis, we show that the best fit to the observational data is obtained with values of input parameters of the standard solar models (radiative opacities, gravitational settling rate, the astrophysical factors S34 and S17) that differ at the ∼1σ level from those presently adopted.

  2. [Quantitative analysis of drug expenditures variability in dermatology units].

    PubMed

    Moreno-Ramírez, David; Ferrándiz, Lara; Ramírez-Soto, Gabriel; Muñoyerro, M Dolores

    2013-01-01

    Variability in adjusted drug expenditures among clinical departments raises the possibility of difficult access to certain therapies, while avoidable expenditures may also exist. Nevertheless, drug expenditures are not usually included in clinical practice variability analysis. To identify and quantify variability in drug expenditures in comparable dermatology departments of the Servicio Andaluz de Salud. Comparative economic analysis of the drug expenditures adjusted to population and health care production in 18 dermatology departments of the Servicio Andaluz de Salud. The 2012 cost and production data (homogeneous production units, HPU) were provided by Inforcoan, the cost accounting information system of the Servicio Andaluz de Salud. The observed drug expenditure ratio ranged from 0.97 €/inh to 8.90 €/inh and from 208.45 €/HPU to 1,471.95 €/HPU. The Pearson correlation between drug expenditure and population was 0.25, and 0.35 for the correlation between expenditure and homogeneous production (p=0.32 and p=0.15, respectively), both Pearson coefficients confirming the lack of correlation and a relevant degree of variability in drug expenditures. The quantitative analysis of variability performed through Pearson correlation has confirmed the existence of drug expenditure variability among comparable dermatology departments. Copyright © 2013 SEFH. Published by AULA MEDICA. All rights reserved.

  3. QTL analysis for some quantitative traits in bread wheat*

    PubMed Central

    Pushpendra, Kumar Gupta; Harindra, Singh Balyan; Pawan, Laxminarayan Kulwal; Neeraj, Kumar; Ajay, Kumar; Reyazul, Rouf Mir; Amita, Mohan; Jitendra, Kumar

    2007-01-01

    Quantitative trait loci (QTL) analysis was conducted in bread wheat for 14 important traits utilizing data from four different mapping populations involving different approaches of QTL analysis. Analysis for grain protein content (GPC) suggested that the major part of genetic variation for this trait is due to environmental interactions. In contrast, pre-harvest sprouting tolerance (PHST) was controlled mainly by main effect QTL (M-QTL) with very little genetic variation due to environmental interactions; a major QTL for PHST was detected on chromosome arm 3AL. For grain weight, one QTL each was detected on chromosome arms 1AS, 2BS and 7AS. The number of QTL detected for four growth-related traits taken together ranged from 37 to 40 depending on the method; nine QTL that were detected by both single-locus and two-locus analyses were all M-QTL. Similarly, single-locus and two-locus QTL analyses for seven yield and yield contributing traits in two populations respectively allowed detection of 25 and 50 QTL by composite interval mapping (CIM), 16 and 25 QTL by multiple-trait composite interval mapping (MCIM) and 38 and 37 QTL by two-locus analyses. These studies should prove useful in QTL cloning and wheat improvement through marker aided selection. PMID:17973342

  4. Automatic quantitative analysis of cardiac MR perfusion images

    NASA Astrophysics Data System (ADS)

    Breeuwer, Marcel M.; Spreeuwers, Luuk J.; Quist, Marcel J.

    2001-07-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the myocardium (the heart muscle) from MR images, using contrast-enhanced ECG-triggered MRI. We have developed an automatic quantitative analysis method, which works as follows. First, image registration is used to compensate for translation and rotation of the myocardium over time. Next, the boundaries of the myocardium are detected and for each position within the myocardium a time-intensity profile is constructed. The time interval during which the contrast agent passes for the first time through the left ventricle and the myocardium is detected and various parameters are measured from the time-intensity profiles in this interval. The measured parameters are visualized as color overlays on the original images. Analysis results are stored, so that they can later on be compared for different stress levels of the heart. The method is described in detail in this paper and preliminary validation results are presented.

  5. Quantitative polymerase chain reaction analysis by deconvolution of internal standard.

    PubMed

    Hirakawa, Yasuko; Medh, Rheem D; Metzenberg, Stan

    2010-04-29

    Quantitative Polymerase Chain Reaction (qPCR) is a collection of methods for estimating the number of copies of a specific DNA template in a sample, but one that is not universally accepted because it can lead to highly inaccurate (albeit precise) results. The fundamental problem is that qPCR methods use mathematical models that explicitly or implicitly apply an estimate of amplification efficiency, the error of which is compounded in the analysis to unacceptable levels. We present a new method of qPCR analysis that is efficiency-independent and yields accurate and precise results in controlled experiments. The method depends on a computer-assisted deconvolution that finds the point of concordant amplification behavior between the "unknown" template and an admixed amplicon standard. We apply the method to demonstrate dexamethasone-induced changes in gene expression in lymphoblastic leukemia cell lines. This method of qPCR analysis does not use any explicit or implicit measure of efficiency, and may therefore be immune to problems inherent in other qPCR approaches. It yields an estimate of absolute initial copy number of template, and controlled tests show it generates accurate results.

  6. How resonance assists hydrogen bonding interactions: an energy decomposition analysis.

    PubMed

    Beck, John Frederick; Mo, Yirong

    2007-01-15

    The block-localized wave function (BLW) method, which is a variant of the ab initio valence bond (VB) theory, was employed to explore the nature of resonance-assisted hydrogen bonds (RAHBs) and to investigate the mechanism of synergistic interplay between π delocalization and hydrogen-bonding interactions. We examined the dimers of formic acid, formamide, 4-pyrimidinone, 2-pyridinone, 2-hydroxypyridine, and 2-hydroxycyclopenta-2,4-dien-1-one. In addition, we studied the interactions in β-diketone enols with a simplified model, namely the hydrogen bonds of 3-hydroxypropenal with both ethenol and formaldehyde. The intermolecular interaction energies, either with or without the involvement of π resonance, were decomposed into the Heitler-London energy (ΔEHL), polarization energy (ΔEpol), charge transfer energy (ΔECT), and electron correlation energy (ΔEcor) terms. This allows for the examination of the character of hydrogen bonds and the impact of π conjugation on hydrogen bonding interactions. Although it has been proposed that resonance-assisted hydrogen bonds are accompanied by an increase in covalent character, our analyses showed that the enhanced interactions mostly originate from the classical dipole-dipole (i.e., electrostatic) attraction, as resonance redistributes the electron density and increases the dipole moments in monomers. The covalency of hydrogen bonds, however, changes very little. This disputes the belief that RAHB is primarily covalent in nature. Accordingly, we recommend the term "resonance-assisted binding (RAB)" instead of "resonance-assisted hydrogen bonding (RAHB)" to highlight the electrostatic, which is a long-range effect, rather than the electron transfer nature of the enhanced stabilization in RAHBs. Copyright (c) 2006 Wiley Periodicals, Inc.

  7. Performance Analysis of Hybrid Solar-Hydrogen Energy System

    NASA Astrophysics Data System (ADS)

    Xiao, Jinsheng; Guan, Xuehua

    A hybrid solar thermoelectric-photovoltaic system for hydrogen production is designed in this paper. A mathematical model of the hybrid system is built in MATLAB/SIMULINK, and a logic control system is designed. The currents of the various sub-systems and the energy of the hydrogen storage tank are simulated and analyzed; the results show that the hybrid solar system can be reliable and effective.

  8. Engineering analysis of potential photosynthetic bacterial hydrogen-production systems

    NASA Astrophysics Data System (ADS)

    Herlevich, A.; Karpuk, M. E.

    1982-06-01

    Photosynthetic bacteria (PSB) are capable of generating hydrogen from organics in effluents from food processing, pulp and paper, and chemical and pharmaceutical industries. Hydrogen evolution takes place under light in the absence of air. The rate of hydrogen production is expected to range from 300 to 600 scf of hydrogen per 1000 gallons of waste stream treated per hour. This hydrogen production system has been demonstrated at a bench-scale level and is ready for engineering development. A conceptual design for a PSB hydrogen production system is described. The system is expected to be sited adjacent to a waste stream source which will be pretreated by fermentation and pH adjustment, innoculated with bacteria, and then passed into the reactor. The reactor effluent can either be discharged into a rapid infiltration system, an irrigation ditch, and/or recycled back into the reactor. Several potential reactor designs have been developed, analyzed, and costed. A large covered pond appears to be the most economical design approach.

  9. An approach for quantitative image quality analysis for CT

    NASA Astrophysics Data System (ADS)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and to objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB based image analysis tool kit to analyze CT generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates PCA components with sparse loadings, used in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.
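
    The final step, sparse PCA followed by Hotelling T2 screening, can be sketched with standard tooling. The snippet below substitutes scikit-learn's SparsePCA for the authors' modified method and uses random stand-in data for the ~88 metrics, so it shows the shape of the analysis rather than their exact procedure.

    ```python
    import numpy as np
    from sklearn.decomposition import SparsePCA

    rng = np.random.default_rng(1)
    X = rng.normal(size=(40, 88))             # 40 scans x ~88 image-quality metrics
    X = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize each metric

    spca = SparsePCA(n_components=5, alpha=0.5, random_state=0)
    scores = spca.fit_transform(X)            # component scores with sparse loadings

    # Hotelling T^2 per scan: squared scores scaled by each component's variance
    t2 = ((scores / (scores.std(axis=0) + 1e-12)) ** 2).sum(axis=1)
    print("Scans exceeding a simple T^2 threshold:", np.where(t2 > 15)[0])
    ```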

  10. Economic Analysis of a Nuclear Reactor Powered High-Temperature Electrolysis Hydrogen Production Plant

    SciTech Connect

    E. A. Harvego; M. G. McKellar; M. S. Sohal; J. E. O'Brien; J. S. Herring

    2008-08-01

    A reference design for a commercial-scale high-temperature electrolysis (HTE) plant for hydrogen production was developed to provide a basis for comparing the HTE concept with other hydrogen production concepts. The reference plant design is driven by a high-temperature helium-cooled nuclear reactor coupled to a direct Brayton power cycle. The reference design reactor power is 600 MWt, with a primary system pressure of 7.0 MPa, and reactor inlet and outlet fluid temperatures of 540°C and 900°C, respectively. The electrolysis unit used to produce hydrogen includes 4,009,177 cells with a per-cell active area of 225 cm2. The optimized design for the reference hydrogen production plant operates at a system pressure of 5.0 MPa, and utilizes an air-sweep system to remove the excess oxygen that is evolved on the anode (oxygen) side of the electrolyzer. The inlet air for the air-sweep system is compressed to the system operating pressure of 5.0 MPa in a four-stage compressor with intercooling. The alternating-current (AC) to direct-current (DC) conversion efficiency is 96%. The overall system thermal-to-hydrogen production efficiency (based on the lower heating value of the produced hydrogen) is 47.12% at a hydrogen production rate of 2.356 kg/s. An economic analysis of this plant was performed using the standardized H2A Analysis Methodology developed by the Department of Energy (DOE) Hydrogen Program, and using realistic financial and cost estimating assumptions. The results of the economic analysis demonstrated that the HTE hydrogen production plant driven by a high-temperature helium-cooled nuclear power plant can deliver hydrogen at a competitive cost. A cost of $3.23/kg of hydrogen was calculated assuming an internal rate of return of 10%.
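
    The quoted overall efficiency follows directly from the stated production rate and reactor power, taking hydrogen's lower heating value as roughly 120 MJ/kg:

    ```python
    LHV_MJ_PER_KG = 120.0     # lower heating value of hydrogen, MJ/kg (approx.)
    h2_rate_kg_s = 2.356      # hydrogen production rate from the reference design
    reactor_power_MW = 600.0  # reactor thermal power

    efficiency = h2_rate_kg_s * LHV_MJ_PER_KG / reactor_power_MW
    print(f"Thermal-to-hydrogen efficiency: {efficiency:.2%}")  # -> 47.12%
    ```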

  11. Quantitative determination of trace levels of hydrogen peroxide in crospovidone and a pharmaceutical product using high performance liquid chromatography with coulometric detection.

    PubMed

    Yue, Hongfei; Bu, Xin; Huang, Ming-Hsing; Young, Joel; Raglione, Thomas

    2009-06-22

    A reliable and reproducible high performance liquid chromatography method with coulometric detection was developed and validated for the quantitative determination of trace levels of hydrogen peroxide in crospovidone, a pharmaceutical excipient, and a capsule pharmaceutical product. The method conditions included: a reproducible extraction procedure with an aqueous extraction solvent to provide a concentrated extract; a simple HPLC mobile phase (aqueous 50 mM ammonium acetate) compatible with the coulometric detection; a reversed-phase HPLC column that did not collapse under 100% aqueous mobile phase conditions, providing sufficient retention and separation of hydrogen peroxide from interferences; and a coulometric detector with a multi-electrode array providing sensitive and selective detection. The method validation results, including those for specificity, linearity, accuracy, precision, and recovery, were acceptable for the determination of trace levels of hydrogen peroxide. The method was shown to be linear over the range of 0.6-4.5 ppm (microg/g) and 6-90 ppm (microg/g) for the pharmaceutical product and crospovidone, respectively. The described method was applied to the determination of trace levels of hydrogen peroxide in different batches of crospovidone and the corresponding pharmaceutical product batches manufactured from these batches of this excipient.

  12. Quantitative analysis and parametric display of regional myocardial mechanics

    NASA Astrophysics Data System (ADS)

    Eusemann, Christian D.; Bellemann, Matthias E.; Robb, Richard A.

    2000-04-01

    Quantitative assessment of regional heart motion has significant potential for more accurate diagnosis of heart disease and/or cardiac irregularities. Local heart motion may be studied from medical imaging sequences. Using functional parametric mapping, regional myocardial motion during a cardiac cycle can be color mapped onto a deformable heart model to obtain better understanding of the structure-to-function relationships in the myocardium, including regional patterns of akinesis or diskinesis associated with ischemia or infarction. In this study, 3D reconstructions were obtained from the Dynamic Spatial Reconstructor at 15 time points throughout one cardiac cycle of pre-infarct and post-infarct hearts. Deformable models were created from the 3D images for each time point of the cardiac cycles. From these polygonal models, regional excursions and velocities of each vertex representing a unit of myocardium were calculated for successive time-intervals. The calculated results were visualized through model animations and/or specially formatted static images. The time point of regional maximum velocity and excursion of myocardium through the cardiac cycle was displayed using color mapping. The absolute value of regional maximum velocity and maximum excursion were displayed in a similar manner. Using animations, the local myocardial velocity changes were visualized as color changes on the cardiac surface during the cardiac cycle. Moreover, the magnitude and direction of motion for individual segments of myocardium could be displayed. Comparison of these dynamic parametric displays suggest that the ability to encode quantitative functional information on dynamic cardiac anatomy enhances the diagnostic value of 4D images of the heart. Myocardial mechanics quantified this way adds a new dimension to the analysis of cardiac functional disease, including regional patterns of akinesis and diskinesis associated with ischemia and infarction. Similarly, disturbances in
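
    The per-vertex excursion and velocity computation described here is a simple operation on a time series of mesh vertices. A minimal sketch, assuming a (time, vertex, xyz) array and a toy pulsating mesh standing in for Dynamic Spatial Reconstructor data:

    ```python
    import numpy as np

    T, N = 15, 1000                          # 15 time points, 1000 mesh vertices
    t = np.linspace(0, 1, T)[:, None, None]  # one cardiac cycle, normalized time
    base = np.random.rand(1, N, 3)           # toy reference mesh
    verts = base + 0.1 * np.sin(2 * np.pi * t) * base  # pulsating toy motion

    dt = 1.0 / (T - 1)                                     # reconstruction interval
    step = np.linalg.norm(np.diff(verts, axis=0), axis=2)  # (T-1, N) excursions
    velocity = step / dt                                   # per-interval speeds

    peak_velocity = velocity.max(axis=0)     # regional maximum velocity per vertex
    peak_interval = velocity.argmax(axis=0)  # when in the cycle it occurs
    total_excursion = step.sum(axis=0)       # path length over the whole cycle
    print(peak_velocity.mean(), peak_interval.mean(), total_excursion.mean())
    ```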

  13. Quantitative 3-dimensional computed tomography analysis of olecranon fractures.

    PubMed

    Lubberts, Bart; Janssen, Stein; Mellema, Jos; Ring, David

    2016-05-01

    Olecranon fractures have variable size of the proximal fragment, patterns of fragmentation, and subluxation of the ulnohumeral joint that might be better understood and categorized on the basis of quantitative 3-dimensional computed tomography analysis. Mayo type I fractures are undisplaced, Mayo type II are displaced and stable, and Mayo type III are displaced and unstable. The last is categorized into anterior and posterior dislocations. The purpose of this study was to further clarify fracture morphology between Mayo type I, II, and III fractures. Three-dimensional models were created for a consecutive series of 78 patients with olecranon fractures that were evaluated with computed tomography. We determined the total number of fracture fragments, the volume and articular surface area of each fracture fragment, and the degree of displacement of the most proximal olecranon fracture fragment. Displaced olecranon fractures were more comminuted than nondisplaced fractures (P = .02). Displaced fractures without ulnohumeral subluxation were smallest in terms of both volume (P < .001) and articular surface involvement (P < .001) of the most proximal olecranon fracture fragment. There was no difference in average displacement of the proximal fragment between displaced fractures with and without ulnohumeral subluxation (P = .74). Anterior olecranon fracture-dislocations created more displaced (P = .04) and smaller proximal fragments than posterior fracture-dislocations (P = .005), with comparable fragmentation on average (P = .60). The ability to quantify volume, articular surface area, displacement, and fragmentation using quantitative 3-dimensional computed tomography should be considered when increased knowledge of fracture morphology and fracture patterns might be useful. Copyright © 2016 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.

  14. Towards quantitative analysis of retinal features in optical coherence tomography.

    PubMed

    Baroni, Maurizio; Fortunato, Pina; La Torre, Agostino

    2007-05-01

    The purpose of this paper was to propose a new computer method for quantitative evaluation of representative features of the retina using optical coherence tomography (OCT). A multi-step approach was devised and positively tested for segmentation of the three main retinal layers: the vitreo-retinal interface and the inner and outer retina. Following a preprocessing step, three regions of interest were delimited. Significant peaks corresponding to high and low intensity strips were located along the OCT A-scan lines and accurate boundaries between different layers were obtained by maximizing an edge likelihood function. For a quantitative description, thickness measurement, densitometry, texture and curvature analyses were performed. As a first application, the effect of intravitreal injection of triamcinolone acetonide (IVTA) for the treatment of vitreo-retinal interface syndrome was evaluated. Almost all the parameters, measured on a set of 16 pathologic OCT images, were statistically different before and after IVTA injection (p<0.05). Shape analysis of the internal limiting membrane confirmed the reduction of the pathological traction state. Other significant parameters, such as reflectivity and texture contrast, exhibited relevant changes both at the vitreo-retinal interface and in the inner retinal layers. Texture parameters in the inner and outer retinal layers significantly correlated with the visual acuity restoration. According to these findings an IVTA injection might be considered a possible alternative to surgery for selected patients. In conclusion, the proposed approach appeared to be a promising tool for the investigation of tissue changes produced by pathology and/or therapy.

  15. Quantitative analysis of drug-induced tremor in mice.

    PubMed

    Shinozaki, H

    1984-12-01

    A method of analyzing tremor in mice was developed using a power spectral analysis of the random current induced on a wire coil by the movement of a magnet attached to the mouse. The power spectral density function defined the frequency composition of the tremor, and the mean square value of the data in any frequency range of concern was determined. It was possible to determine qualitative differences in the tremor caused by various tremorgenic agents, and to differentiate the drug-induced tremor from spontaneous motor activity. The power spectral densities of the tremorine- and oxotremorine-induced tremors were tentatively expressed as the sum of 3 main components (Cauchy distribution) with different peak frequencies, consisting of the spontaneous motor activity component and two tremor components. On the other hand, the power spectral densities of the harmaline-induced tremor were expressed as the sum of two components with different peak frequencies, consisting of the spontaneous motor activity component and a tremor component. The frequency of the peak spectral density was almost independent of the dose of tremorgenic agents, but changed slightly with the lapse of time after their injection. The severity of the tremor was determined quantitatively in terms of the sum of the mean square value. The sum of the mean square value for a period of 45 min after the injection of tremorine changed in a dose-dependent manner. The severity of the tremor was different for each of the mouse strains. The method studied in the present paper is expected to be utilized for the quantitative examination of the fine motor movement of experimental animals, particularly for the screening of new anti-tremor drugs.
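
    The quantification described, a power spectral density plus the mean square value within a band of interest, maps directly onto a Welch estimate. The 8 Hz synthetic tremor, sampling rate, and 6-12 Hz band below are illustrative choices, not the paper's values.

    ```python
    import numpy as np
    from scipy.signal import welch

    fs = 100.0                         # sampling rate of the coil signal (Hz)
    t = np.arange(0, 45 * 60, 1 / fs)  # a 45 min recording
    x = 0.5 * np.sin(2 * np.pi * 8.0 * t) + 0.3 * np.random.randn(t.size)

    f, pxx = welch(x, fs=fs, nperseg=4096)  # power spectral density estimate
    band = (f >= 6.0) & (f <= 12.0)         # tremor band of interest
    mean_square = pxx[band].sum() * (f[1] - f[0])
    print(f"Mean square value in the 6-12 Hz band: {mean_square:.3f}")
    ```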

  16. Quantitative analysis with the optoacoustic/ultrasound system OPUS

    NASA Astrophysics Data System (ADS)

    Haisch, Christoph; Zell, Karin; Sperl, Jonathan; Vogel, Mika W.; Niessner, Reinhard

    2009-02-01

    The OPUS (Optoacoustic plus Ultrasound) system is a combination of a medical ultrasound scanner with a high-repetition-rate, wavelength-tunable laser system and a suitable triggering interface to synchronize the laser and the ultrasound system. The pulsed laser generates an optoacoustic (OA), or photoacoustic (PA), signal which is detected by the ultrasound system. Alternatively, imaging in conventional ultrasound mode can be performed, and both imaging modes can be superimposed. The laser light is coupled into the tissue laterally, parallel to the ultrasound transducer, which does not require any major modification to the transducer or the ultrasound beam forming. This was a basic requirement of the instrument, as the intention of the project was to establish the optoacoustic imaging modality as an add-on to a conventional standard ultrasound instrument. We believe that this approach may foster the introduction of OA imaging as a routine tool in medical diagnosis. Another key aspect of the project was to exploit the capabilities of OA imaging for quantitative analysis. The intention of the presented work is to summarize all steps necessary to extract from the PA raw data the significant information required for the quantification of local absorber distributions. We show results of spatially resolved absorption measurements in scattering samples and a comparison of four different image reconstruction algorithms, regarding their influence on lateral resolution as well as on the signal-to-noise ratio for different sample depths and absorption values. The reconstruction algorithms are based on Fourier transformation, on a generalized 2D Hough transformation, on circular back-projection, and on the classical delay-and-sum approach which is implemented in most ultrasound scanners. Furthermore, we discuss the influence of a newly developed laser source, combining diode and flash lamp pumping. Compared to all-flash-lamp pumped systems it features a significantly higher

  17. Quantitative DNA Methylation Analysis of Candidate Genes in Cervical Cancer

    PubMed Central

    Siegel, Erin M.; Riggs, Bridget M.; Delmas, Amber L.; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D.

    2015-01-01

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites) per sequence. A binary cut-point was defined at >15% methylation. The sensitivity, specificity and area under the ROC curve (AUC) of methylation in individual genes or a panel were examined. The median Methylation Index was significantly higher in cases than in controls for 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age alone, the combination of the DNA methylation levels of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97–1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer, and the DNA methylation level of four genes appears to increase specificity in identifying cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated. PMID:25826459

  18. Quantitative Analysis Of Acoustic Emission From Rock Fracture Experiments

    NASA Astrophysics Data System (ADS)

    Goodfellow, Sebastian David

    This thesis aims to advance the methods of quantitative acoustic emission (AE) analysis by calibrating sensors, characterizing sources, and applying the results to solve engineering problems. In the first part of this thesis, we built a calibration apparatus and successfully calibrated two commercial AE sensors. The ErgoTech sensor was found to have broadband velocity sensitivity, and the Panametrics V103 was sensitive to surface normal displacement. These calibration results were applied to two AE data sets from rock fracture experiments in order to characterize the sources of AE events. The first data set was from an in situ rock fracture experiment conducted at the Underground Research Laboratory (URL). The Mine-By experiment was a large-scale excavation response test where both AE (10 kHz - 1 MHz) and microseismicity (MS) (1 Hz - 10 kHz) were monitored. Using the calibration information, magnitude, stress drop, dimension and energy were successfully estimated for 21 AE events recorded in the tensile region of the tunnel wall. Magnitudes were in the range -7.5 < Mw < -6.8, which is consistent with other laboratory AE results, and stress drops were within the range commonly observed for induced seismicity in the field (0.1 - 10 MPa). The second data set was AE collected during a true-triaxial deformation experiment, where the objectives were to characterize laboratory AE sources and identify issues related to moving the analysis from ideal in situ conditions to more complex laboratory conditions in terms of the ability to conduct quantitative AE analysis. We found AE magnitudes in the range -7.8 < Mw < -6.7 and, as with the in situ data, stress release was within the expected range of 0.1 - 10 MPa. We identified four major challenges to quantitative analysis in the laboratory, which inhibited our ability to study parameter scaling (M0 ∝ fc^-3 scaling). These challenges were (1) limited knowledge of attenuation, which we proved was continuously evolving, (2

  19. Corrections for volume hydrogen content in coal analysis by prompt gamma neutron activation analysis

    NASA Astrophysics Data System (ADS)

    Salgado, J.; Oliveira, C.

    1992-05-01

    Prompt gamma neutron activation analysis, PGNAA, is a useful technique to determine the elemental composition of bulk samples in on-line measurements. Monte Carlo simulation studies performed on bulk coals of different compositions, for a given sample size and geometry, have shown that both the gamma count rate for hydrogen and the gamma count rate per percent by weight of an arbitrary element due to (n, γ) reactions depend on the volume hydrogen content while being independent of coal composition. Experimental results using a 252Cf neutron source surrounded by a lead cylinder were obtained for nine different coal types. These show that the γ-peak originating from (n, n' γ) reactions in the lead shield depends on the sample density. Assuming that the source intensity is constant, this result enables the measurement of the coal bulk density. Taking into account the results just described, the present paper shows how the γ-peak intensities can be corrected for volume hydrogen content in order to obtain the percent-by-weight contents of the coal. The density is necessary to convert the volume hydrogen into percent-by-weight content and to calculate the bulk sample weight.

  20. Quantitative analysis of cryptic splicing associated with TDP-43 depletion.

    PubMed

    Humphrey, Jack; Emmett, Warren; Fratta, Pietro; Isaacs, Adrian M; Plagnol, Vincent

    2017-05-26

    Reliable exon recognition is key to the splicing of pre-mRNAs into mature mRNAs. TDP-43 is an RNA-binding protein whose nuclear loss and cytoplasmic aggregation are a hallmark pathology in amyotrophic lateral sclerosis and frontotemporal dementia (ALS/FTD). TDP-43 depletion causes the aberrant inclusion of cryptic exons into a range of transcripts, but their extent, their relevance to disease pathogenesis, and whether they are caused by other RNA-binding proteins implicated in ALS/FTD are unknown. We developed an analysis pipeline to discover and quantify cryptic exon inclusion and applied it to publicly available human and murine RNA-sequencing data. We detected widespread cryptic splicing in TDP-43 depletion datasets but almost none for another ALS/FTD-linked protein, FUS. Sequence motif and iCLIP analysis of cryptic exons demonstrated that they are bound by TDP-43. Unlike the cryptic exons seen in hnRNP C depletion, those repressed by TDP-43 cannot be linked to transposable elements. Cryptic exons are poorly conserved, and their inclusion overwhelmingly leads to nonsense-mediated decay of the host transcript, with reduced transcript levels observed in differential expression analysis. RNA-protein interaction data on 73 different RNA-binding proteins showed that, in addition to TDP-43, 7 specifically bind TDP-43-linked cryptic exons. This suggests that TDP-43 competes with other splicing factors for binding to cryptic exons and can repress cryptic exon inclusion. Our quantitative analysis pipeline confirms the presence of cryptic exons during the depletion of TDP-43 but not FUS, providing new insight into RNA-processing dysfunction as a cause or consequence in ALS/FTD.

  1. A quantitative risk analysis approach to port hydrocarbon logistics.

    PubMed

    Ronza, A; Carol, S; Espejo, V; Vílchez, J A; Arnaldos, J

    2006-01-16

    A method is presented that allows quantitative risk analysis to be performed on marine hydrocarbon terminals sited in ports. A significant gap was identified in the technical literature on QRA for the handling of hazardous materials in harbours published prior to this work. The analysis is extended to tanker navigation through port waters and to loading and unloading facilities. The steps of the method are discussed, beginning with data collection. For accident scenario identification, an approach is proposed that takes into account minor and massive spills due to loading arm failures and tank rupture. Frequency estimation is thoroughly reviewed, and a shortcut approach is proposed for frequency calculation. This allows for the two-fold possibility of a tanker colliding or grounding at or near the berth, or while navigating to or from the berth. A number of probability data defining the possibility of a cargo spill after an external impact on a tanker are discussed. For consequence and vulnerability estimates, a scheme is proposed for the use of ratios between the numbers of fatalities, injured, and evacuated people. Finally, an example application is given, based on a pilot study conducted in the Port of Barcelona, where the method was tested.

  2. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    DOE PAGES

    Xu, Zhe; Wu, Chaochao; Xie, Fang; ...

    2014-10-28

    Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Additionally, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. In conclusion, peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.

  3. Optimal display conditions for quantitative analysis of stereoscopic cerebral angiograms

    SciTech Connect

    Charland, P.; Peters, T.

    1996-10-01

    For several years the authors have been using a stereoscopic display as a tool in the planning of stereotactic neurosurgical techniques. This PC-based workstation allows the surgeon to interact with and view vascular images in three dimensions, as well as to perform quantitative analysis of the three-dimensional (3-D) space. Some of the perceptual issues relevant to the presentation of medical images on this stereoscopic display were addressed in five experiments. The authors show that a number of parameters--namely the shape, color, and depth cue associated with a cursor--as well as the image filtering and observer position, have a role in improving the observer's perception of a 3-D image and his ability to localize points within the stereoscopically presented 3-D image. However, an analysis of the results indicates that while varying these parameters can lead to an effect on the performance of individual observers, the effects are not consistent across observers, and the mean accuracy remains relatively constant under the different experimental conditions.

  4. Comparative analysis of quantitative efficiency evaluation methods for transportation networks

    PubMed Central

    He, Yuxin; Hong, Jian

    2017-01-01

    An effective evaluation of transportation network efficiency could offer guidance for the optimal control of urban traffic. Based on the introduction and mathematical analysis of three quantitative evaluation methods for transportation network efficiency, this paper compares the information they measure, including network structure, traffic demand, travel choice behavior and other factors that affect network efficiency. Accordingly, the applicability of the various evaluation methods is discussed. Analysis of different example transportation networks shows that the Q-H method reflects well the influence of network structure, traffic demand and user route choice behavior on transportation network efficiency. In addition, the transportation network efficiency measured by this method and Braess's Paradox can be explained in terms of each other, which indicates a better evaluation of the real operating condition of a transportation network. Analysis of the network efficiency calculated by the Q-H method also shows that a specific appropriate demand exists for a given transportation network. Meanwhile, under fixed demand, both the critical network structure that guarantees the stability and basic operation of the network and the specific network structure yielding the largest value of transportation network efficiency can be identified. PMID:28399165

  5. Therapeutic electrical stimulation for spasticity: quantitative gait analysis.

    PubMed

    Pease, W S

    1998-01-01

    Improvement in motor function following electrical stimulation is related to strengthening of the stimulated spastic muscle and inhibition of the antagonist. A 26-year-old man with familial spastic paraparesis presented with gait dysfunction and bilateral lower limb spastic muscle tone. Clinically, muscle strength and sensation were normal. He was considered appropriate for a trial of therapeutic electrical stimulation following failed trials of physical therapy and baclofen. No other treatment was used concurrent with the electrical stimulation. Before treatment, quantitative gait analysis revealed 63% of normal velocity and a crouched gait pattern, associated with excessive electromyographic activity in the hamstrings and gastrocnemius muscles. Based on these findings, bilateral stimulation of the quadriceps and anterior compartment musculature was performed two to three times per week for three months. Repeat gait analysis was conducted three weeks after the cessation of stimulation treatment. A 27% increase in velocity was noted associated with an increase in both cadence and right step length. Right hip and bilateral knee stance motion returned to normal (rather than "crouched"). No change in the timing of dynamic electromyographic activity was seen. These findings suggest a role for the use of electrical stimulation for rehabilitation of spasticity. The specific mechanism of this improvement remains uncertain.

  6. Correlation between two methods of florbetapir PET quantitative analysis.

    PubMed

    Breault, Christopher; Piper, Jonathan; Joshi, Abhinay D; Pirozzi, Sara D; Nelson, Aaron S; Lu, Ming; Pontecorvo, Michael J; Mintun, Mark A; Devous, Michael D

    2017-01-01

    This study evaluated the performance of a commercially available standardized software program for calculation of florbetapir PET standard uptake value ratios (SUVr) in comparison with an established research method. Florbetapir PET images for 183 subjects clinically diagnosed as cognitively normal (CN), mild cognitive impairment (MCI), or probable Alzheimer's disease (AD) (45 AD, 60 MCI, and 78 CN) were evaluated using two software processing algorithms. The research method uses a single florbetapir PET template generated by averaging both amyloid-positive and amyloid-negative registered brains together. The commercial software simultaneously optimizes the registration between the florbetapir PET images and three templates: amyloid negative, amyloid positive, and an average. Cortical average SUVr values were calculated across six predefined anatomic regions with respect to the whole cerebellum reference region. SUVr values were well correlated between the two methods (r2 = 0.98). The relationship between the methods computed from the regression analysis is: Commercial method SUVr = (0.9757*Research SUVr) + 0.0299. A previously defined cutoff SUVr of 1.1 for distinguishing amyloid positivity by the research method corresponded to 1.1 (95% CI = 1.098, 1.11) for the commercial method. This study suggests that the commercial method is comparable to the published research method of SUVr analysis for florbetapir PET images, thus facilitating the potential use of standardized quantitative approaches to PET amyloid imaging.

  7. Quantitative Analysis of Intracellular Motility Based on Optical Flow Model

    PubMed Central

    Li, Heng

    2017-01-01

    Analysis of cell mobility is a key issue for abnormality identification and classification in cell biology research. However, since cell deformation induced by various biological processes is random and cell protrusion is irregular, it is difficult to measure cell morphology and motility in microscopic images. To address this dilemma, we propose an improved variational optical flow model for quantitative analysis of intracellular motility, which not only extracts intracellular motion fields effectively but also handles the optical flow computation problem at cell borders by taking advantage of formulations based on the L1 and L2 norms, respectively. In the energy functional of our proposed optical flow model, the data term is in the form of the L2 norm; the smoothness term adapts to regional features through an adaptive parameter, using the L1 norm near the edge of the cell and the L2 norm away from the edge. We further extract histograms of oriented optical flow (HOOF) after the optical flow field of intracellular motion is computed. Distances between different HOOFs are then calculated as the intracellular motion features used to grade the intracellular motion. Experimental results show that the features extracted from HOOFs provide new insights into the relationship between cell motility and particular pathological conditions.

  8. Inside Single Cells: Quantitative Analysis with Advanced Optics and Nanomaterials

    PubMed Central

    Cui, Yi; Irudayaraj, Joseph

    2014-01-01

    Single cell explorations offer a unique window to inspect molecules and events relevant to mechanisms and heterogeneity constituting the central dogma of biology. A large number of nucleic acids, proteins, metabolites and small molecules are involved in determining and fine-tuning the state and function of a single cell at a given time point. Advanced optical platforms and nanotools provide tremendous opportunities to probe intracellular components with single-molecule accuracy, as well as promising tools to adjust single cell activity. In order to obtain quantitative information (e.g. molecular quantity, kinetics and stoichiometry) within an intact cell, achieving the observation with comparable spatiotemporal resolution is a challenge. For single cell studies both the method of detection and the biocompatibility are critical factors as they determine the feasibility, especially when considering live cell analysis. Although a considerable proportion of single cell methodologies depend on specialized expertise and expensive instruments, it is our expectation that the information content and implication will outweigh the costs given the impact on life science enabled by single cell analysis. PMID:25430077

  9. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    SciTech Connect

    Xu, Zhe; Wu, Chaochao; Xie, Fang; Slysz, Gordon W.; Tolic, Nikola; Monroe, Matthew E.; Petyuk, Vladislav A.; Payne, Samuel H.; Fujimoto, Grant M.; Moore, Ronald J.; Fillmore, Thomas L.; Schepmoes, Athena A.; Levine, Douglas; Townsend, Reid; Davies, Sherri; Li, Shunqiang; Ellis, Matthew; Boja, Emily; Rivers, Robert; Rodriguez, Henry; Rodland, Karin D.; Liu, Tao; Smith, Richard D.

    2015-01-02

    Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Moreover, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. Peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.

  10. Quantitative evaluation of midpalatal suture maturation via fractal analysis

    PubMed Central

    Kwak, Kyoung Ho; Kim, Yong-Il; Kim, Yong-Deok

    2016-01-01

    Objective The purpose of this study was to determine whether the results of fractal analysis can be used as criteria for evaluating midpalatal suture maturation. Methods The study included 131 subjects older than 18 years of age (range 18.1–53.4 years) who underwent cone-beam computed tomography. Skeletonized images of the midpalatal suture were obtained via image processing software and used to calculate fractal dimensions. Correlations between maturation stage and fractal dimension were calculated using Spearman's correlation coefficient. Optimal fractal dimension cut-off values were determined using a receiver operating characteristic curve. Results The distribution of maturation stages of the midpalatal suture according to the cervical vertebrae maturation index was highly variable, and there was a strong negative correlation between maturation stage and fractal dimension (−0.623, p < 0.001). Fractal dimension was a statistically significant indicator of dichotomous results with regard to maturation stage (area under curve = 0.794, p < 0.001). A test in which fractal dimension was used to predict the variable splitting maturation stages into A-C versus D or E yielded an optimal fractal dimension cut-off value of 1.0235. Conclusions There was a strong negative correlation between fractal dimension and midpalatal suture maturation. Fractal analysis is an objective quantitative method, and therefore we suggest that it may be useful for the evaluation of midpalatal suture maturation. PMID:27668195

  11. A Computational Tool for Quantitative Analysis of Vascular Networks

    PubMed Central

    Zudaire, Enrique; Gambardella, Laure; Kurcz, Christopher; Vermeren, Sonja

    2011-01-01

    Angiogenesis is the generation of mature vascular networks from pre-existing vessels. Angiogenesis is crucial during the organism's development, for wound healing and for the female reproductive cycle. Several murine experimental systems are well suited for studying developmental and pathological angiogenesis. They include the embryonic hindbrain, the post-natal retina and allantois explants. In these systems vascular networks are visualised by appropriate staining procedures followed by microscopical analysis. Nevertheless, quantitative assessment of angiogenesis is hampered by the lack of readily available, standardized metrics and software analysis tools. Non-automated protocols are widely used and are, in general, time- and labour-intensive, prone to human error, and do not permit computation of complex spatial metrics. We have developed a lightweight, user-friendly software tool, AngioTool, which allows for quick, hands-off and reproducible quantification of vascular networks in microscopic images. AngioTool computes several morphological and spatial parameters including the area covered by a vascular network, the number of vessels, vessel length, vascular density and lacunarity. In addition, AngioTool calculates the so-called "branching index" (branch points / unit area), providing a measurement of the sprouting activity of a specimen of interest. We have validated AngioTool using images of embryonic murine hindbrains, post-natal retinas and allantois explants. AngioTool is open source and can be downloaded free of charge. PMID:22110636

  12. Quantitative analysis of triple-mutant genetic interactions.

    PubMed

    Braberg, Hannes; Alexander, Richard; Shales, Michael; Xu, Jiewei; Franks-Skiba, Kathleen E; Wu, Qiuqin; Haber, James E; Krogan, Nevan J

    2014-08-01

    The quantitative analysis of genetic interactions between pairs of gene mutations has proven to be effective for characterizing cellular functions, but it can miss important interactions for functionally redundant genes. To address this limitation, we have developed an approach termed triple-mutant analysis (TMA). The procedure relies on a query strain that contains two deletions in a pair of redundant or otherwise related genes, which is crossed against a panel of candidate deletion strains to isolate triple mutants and measure their growth. A central feature of TMA is to interrogate mutants that are synthetically sick when two other genes are deleted but interact minimally with either single deletion. This approach has been valuable for discovering genes that restore critical functions when the principal actors are deleted. TMA has also uncovered double-mutant combinations that produce severe defects because a third protein becomes deregulated and acts in a deleterious fashion, and it has revealed functional differences between proteins presumed to act together. The protocol is optimized for Singer ROTOR pinning robots, takes 3 weeks to complete and measures interactions for up to 30 double mutants against a library of 1,536 single mutants.

  13. Analysis of hydrogen cyanide in air in a case of attempted cyanide poisoning.

    PubMed

    Magnusson, R; Nyholm, S; Åstot, C

    2012-10-10

    A 32-year-old man attempted to poison his ex-girlfriend with hydrogen cyanide by hiding the pesticide Uragan D2 in her car. During the police investigation, chemical analysis of the air inside the car was performed. Hydrogen cyanide was detected through on-site air analysis using a portable Fourier transform infrared (FTIR) spectroscopy gas analyzer and colorimetric gas detection tubes. Furthermore, impinger air-sampling was performed for off-site sample preparation and analysis by gas chromatography-mass spectrometry (GC-MS). All three independent techniques demonstrated the presence of hydrogen cyanide, at concentrations of 14-20 ppm. Owing to the high volatility of hydrogen cyanide, the temperature and the time since exposure have a substantial effect on the likelihood of detecting hydrogen cyanide at a crime scene. The prevailing conditions (closed space, low temperature) must have supported the preservation of HCN in the car thus enabling the identification even though the analysis was performed several days after the hydrogen cyanide source was removed. This paper demonstrates the applicability of combining on-site FTIR measurements and off-site GC-MS analysis of a crime scene in order to ensure fast detection as well as unambiguous identification for forensic purposes of hydrogen cyanide in air.

  14. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and / or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  15. Application of diffusion theory to the analysis of hydrogen desorption data at 25 deg C

    SciTech Connect

    Danford, M.D.

    1985-10-01

    The application of diffusion theory to the analysis of hydrogen desorption data (coulombs of H/sub 2/ desorbed versus time) has been studied. From these analyses, important information concerning hydrogen solubilities and the nature of the hydrogen distributions in the metal has been obtained. Two nickel base alloys, Rene' 41 and Waspaloy, and one ferrous alloy, 4340 steel, are studied in this work. For the nickel base alloys, it is found that the hydrogen distributions after electrolytic charging conform closely to those predicted by diffusion theory. For Waspaloy samples charged at 5,000 psi, it is found that the hydrogen distributions are essentially the same as those obtained by electrolytic charging. The hydrogen distributions in electrolytically charged 4340 steel, on the other hand, are essentially uniform in nature, which would not be predicted by diffusion theory; a possible explanation has been proposed. Finally, it is found that the hydrogen desorption is completely explained by the nature of the hydrogen distribution in the metal, and that the fast hydrogen is not due to surface and sub-surface hydride formation, as was originally proposed.

  16. Thermal Desorption Analysis of Hydrogen in High Strength Martensitic Steels

    NASA Astrophysics Data System (ADS)

    Enomoto, M.; Hirakami, D.; Tarui, T.

    2012-02-01

    Thermal desorption analyses (TDA) were conducted on high strength martensitic steels containing carbon from 0.33 to 1.0 mass pct, which were charged with hydrogen at 1223 K (950 °C) under hydrogen at one atmosphere of pressure and quenched to room temperature. In the 0.33C steel, which had the highest Ms temperature, only one desorption peak was observed, around 373 K (100 °C), whereas two peaks were observed in the other steels, one at a similar temperature and the other at and above about 573 K (300 °C), with the height of the second peak increasing with carbon content. In the 0.82C steel, both peaks disappeared within 1 week of exposure at room temperature, whereas the peak heights decreased gradually over 2 weeks in specimens electrolytically charged with hydrogen and aged for varying times at room temperature. From computer simulations by means of the McNabb-Foster theory coupled with theories of carbon segregation, these peaks are likely due to the trapping of hydrogen in the strain fields and cores of dislocations and, presumably to a lesser extent, in prior austenite grain boundaries. The results also indicate that carbon atoms prevent hydrogen from occupying trap sites, and can even expel it from them, during quenching and aging in these steels.

  17. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    PubMed

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pair) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software packages (AmphoraNet, a webserver implementation of AMPHORA2; MG-RAST; and MEGAN5) for their capability of assigning quantitative phylogenetic information to the data, describing the frequency of appearance of microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some software assigns higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic software package to this genome length bias. Therefore, we have made a simple benchmark for the evaluation of taxon counting in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads of average length 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software package fails on that simple task, it will surely fail on most real metagenomes. We applied the three software packages to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We have found that AMPHORA2/AmphoraNet gave the most accurate results and the other two software were under

  18. Qualitative and quantitative analysis of atmospheric organosulfates in Centreville, Alabama

    NASA Astrophysics Data System (ADS)

    Hettiyadura, Anusha P. S.; Jayarathne, Thilina; Baumann, Karsten; Goldstein, Allen H.; de Gouw, Joost A.; Koss, Abigail; Keutsch, Frank N.; Skog, Kate; Stone, Elizabeth A.

    2017-01-01

    Organosulfates are components of secondary organic aerosols (SOA) that form from oxidation of volatile organic compounds (VOCs) in the presence of sulfate. In this study, the composition and abundance of organosulfates were determined in fine particulate matter (PM2.5) collected from Centreville, AL, during the Southern Oxidant and Aerosol Study (SOAS) in summer 2013. Six organosulfates were quantified using hydrophilic interaction liquid chromatography (HILIC) with triple quadrupole mass spectrometry (TQD) against authentic standards. Among these, the three most abundant species were glycolic acid sulfate (0.5-52.5 ng m-3), lactic acid sulfate (0.5-36.7 ng m-3), and hydroxyacetone sulfate (0.5-14.3 ng m-3). These three species were strongly inter-correlated, suggesting similar precursors and/or formation pathways. Further correlations with sulfate, isoprene, and isoprene oxidation products indicate important roles for these precursors in organosulfate formation in Centreville. Positive filter sampling artifacts associated with these organosulfates due to gas adsorption or reaction of gas phase precursors of organosulfates with sulfuric acid were assessed for a subset of samples and were less than 7.8 % of their PM2.5 concentrations. Together, the quantified organosulfates accounted for < 0.3 % of organic carbon mass in PM2.5. To gain insights into other organosulfates in PM2.5 collected from Centreville, semi-quantitative analysis was employed by way of monitoring characteristic product ions of organosulfates (HSO4- at m/z 97 and the radical anion SO4•- at m/z 96) and evaluating relative signal strength by HILIC-TQD. Molecular formulas of organosulfates were determined by high-resolution time-of-flight (TOF) mass spectrometry. The major organosulfate signal across all samples corresponded to 2-methyltetrol sulfates, which accounted for 42-62 % of the total bisulfate ion signal. Conversely, glycolic acid sulfate, the most abundant organosulfate quantified in this study, was 0

  19. The Measles Vaccination Narrative in Twitter: A Quantitative Analysis

    PubMed Central

    Radzikowski, Jacek; Jacobsen, Kathryn H; Croitoru, Arie; Crooks, Andrew; Delamater, Paul L

    2016-01-01

    Background The emergence of social media is providing an alternative avenue for information exchange and opinion formation on health-related issues. Collective discourse in such media leads to the formation of a complex narrative, conveying public views and perceptions. Objective This paper presents a study of Twitter narrative regarding vaccination in the aftermath of the 2015 measles outbreak, both in terms of its cyber and physical characteristics. We aimed to contribute to the analysis of the data, as well as presenting a quantitative interdisciplinary approach to analyze such open-source data in the context of health narratives. Methods We collected 669,136 tweets referring to vaccination from February 1 to March 9, 2015. These tweets were analyzed to identify key terms, connections among such terms, retweet patterns, the structure of the narrative, and connections to the geographical space. Results The data analysis captures the anatomy of the themes and relations that make up the discussion about vaccination in Twitter. The results highlight the higher impact of stories contributed by news organizations compared to direct tweets by health organizations in communicating health-related information. They also capture the structure of the antivaccination narrative and its terms of reference. Analysis also revealed the relationship between community engagement in Twitter and state policies regarding child vaccination. Residents of Vermont and Oregon, the two states with the highest rates of non-medical exemption from school-entry vaccines nationwide, are leading the social media discussion in terms of participation. Conclusions The interdisciplinary study of health-related debates in social media across the cyber-physical debate nexus leads to a greater understanding of public concerns, views, and responses to health-related issues. Further coalescing such capabilities shows promise towards advancing health communication, thus supporting the design of more

  1. The Measles Vaccination Narrative in Twitter: A Quantitative Analysis.

    PubMed

    Radzikowski, Jacek; Stefanidis, Anthony; Jacobsen, Kathryn H; Croitoru, Arie; Crooks, Andrew; Delamater, Paul L

    2016-01-01

    The emergence of social media is providing an alternative avenue for information exchange and opinion formation on health-related issues. Collective discourse in such media leads to the formation of a complex narrative, conveying public views and perceptions. This paper presents a study of Twitter narrative regarding vaccination in the aftermath of the 2015 measles outbreak, both in terms of its cyber and physical characteristics. We aimed to contribute to the analysis of the data, as well as presenting a quantitative interdisciplinary approach to analyze such open-source data in the context of health narratives. We collected 669,136 tweets referring to vaccination from February 1 to March 9, 2015. These tweets were analyzed to identify key terms, connections among such terms, retweet patterns, the structure of the narrative, and connections to the geographical space. The data analysis captures the anatomy of the themes and relations that make up the discussion about vaccination in Twitter. The results highlight the higher impact of stories contributed by news organizations compared to direct tweets by health organizations in communicating health-related information. They also capture the structure of the antivaccination narrative and its terms of reference. Analysis also revealed the relationship between community engagement in Twitter and state policies regarding child vaccination. Residents of Vermont and Oregon, the two states with the highest rates of non-medical exemption from school-entry vaccines nationwide, are leading the social media discussion in terms of participation. The interdisciplinary study of health-related debates in social media across the cyber-physical debate nexus leads to a greater understanding of public concerns, views, and responses to health-related issues. Further coalescing such capabilities shows promise towards advancing health communication, thus supporting the design of more effective strategies that take into account the complex

  2. Nanotechnology patents in the automotive industry (a quantitative & qualitative analysis).

    PubMed

    Prasad, Raghavendra; Bandyopadhyay, Tapas K

    2014-01-01

    The aim of the article is to present trends in patent filings for applications of nanotechnology in the automobile sector across the world, using keyword-based patent searches. An overview of patents related to nanotechnology in the automobile industry is provided. The work began with a worldwide patent search to find patents on nanotechnology in the automobile industry and to classify them according to the automobile parts to which they relate and the solutions they provide. Various graphs were then produced to give insight into trends, and the patents were analyzed within the various classifications. The trends shown in the graphs provide the quantitative analysis, whereas the qualitative analysis is presented in a separate section. Patents were classified by the solution they provide on the basis of their claims, titles, abstracts, and full texts, each read separately. The patentability of nanotechnology inventions is discussed with a view to outlining the requirements for, and statutory bars to, patenting such inventions. A further objective of the work is to suggest an appropriate framework for companies regarding the use of nanotechnology in the automobile industry, together with a suggested strategy for patenting the related inventions. For example, US patent US2008-019426A1 discusses an invention related to a lubricant composition. This patent was studied and classified as falling under the automobile-parts classification; on studying it, it was deduced that it solves the problem of friction in the engine. One classification is based on the automobile part, while the other is based on the problem being solved; hence two classes, reduction in friction and engine, were created. Similarly, after studying all the patents, a similar matrix was created.

  3. Quantitative assessment of hip osteoarthritis based on image texture analysis.

    PubMed

    Boniatis, I S; Costaridou, L I; Cavouras, D A; Panagiotopoulos, E C; Panayiotakis, G S

    2006-03-01

    A non-invasive method was developed to investigate the potential capacity of digital image texture analysis in evaluating the severity of hip osteoarthritis (OA) and in monitoring its progression. 19 textural features evaluating patterns of pixel intensity fluctuations were extracted from 64 images of radiographic hip joint spaces (HJS), corresponding to 32 patients with verified unilateral or bilateral OA. Images were enhanced employing custom developed software for the delineation of the articular margins on digitized pelvic radiographs. The severity of OA for each patient was assessed by expert orthopaedists employing the Kellgren and Lawrence (KL) scale. Additionally, an index expressing HJS-narrowing was computed considering patients from the unilateral OA-group. A textural feature that quantified pixel distribution non-uniformity (grey level non-uniformity, GLNU) demonstrated the strongest correlation with the HJS-narrowing index among all extracted features and was utilized in further analysis. Classification rules employing the GLNU feature were introduced to characterize a hip as normal or osteoarthritic and to assign it to one of three severity categories, formed in accordance with the KL scale. Application of the proposed rules resulted in relatively high classification accuracies in characterizing a hip as normal or osteoarthritic (90.6%) and in assigning it to the correct KL scale category (88.9%). Furthermore, the strong correlation between the HJS-narrowing index and the pathological GLNU (r = -0.9, p<0.001) was utilized to provide percentages quantifying hip OA-severity. Texture analysis may contribute to the quantitative assessment of OA severity, the monitoring of OA progression and the evaluation of chondroprotective therapy.

  4. Lifecycle Cost Analysis of Hydrogen Versus Other Technologies for Electrical Energy Storage

    SciTech Connect

    Steward, D.; Saur, G.; Penev, M.; Ramsden, T.

    2009-11-01

    This report presents the results of an analysis evaluating the economic viability of hydrogen for medium- to large-scale electrical energy storage applications compared with three other storage technologies: batteries, pumped hydro, and compressed air energy storage (CAES).

  5. Quantitative aspects of ESR and spin trapping of hydroxyl radicals and hydrogen atoms in gamma-irradiated aqueous solutions.

    PubMed

    Carmichael, A J; Makino, K; Riesz, P

    1984-11-01

    The efficiency of 5,5-dimethylpyrroline-1-N-oxide (DMPO) and alpha-(4-pyridyl-1-oxide)-N-tert.-butylnitrone (POBN) to spin trap hydroxyl radicals and hydrogen atoms, respectively, was studied in gamma-irradiated solutions where the radical yields are accurately known. The effects of dose, spin trap concentration, and pH and of the stability of the spin adducts on the spin-trapping efficiency were investigated. In degassed or N2-saturated solutions the spin-trapping efficiencies were 35% for DMPO and hydroxyl radicals and 14% for POBN and hydrogen atoms. The low spin-trapping efficiencies were shown not to be due to the instability of the DMPO-OH and POBN-H spin adducts or to the effects of H2O2 or O2. The low spin-trapping efficiency of DMPO may be explained by the reaction of hydroxyl radicals to abstract hydrogen from the DMPO molecule to produce carbon radicals as well as addition to the N = C double bond to form nitroxide radicals. For POBN the low spin-trapping efficiency for hydrogen atoms is explained in terms of addition reactions of hydrogen atoms to the aromatic ring and the pyridinium and nitrone oxygens.

  6. Technical Analysis of the Hydrogen Energy Station Concept, Phase I and Phase II

    SciTech Connect

    TIAX, LLC

    2005-05-04

    patterns would be most viable for an energy station, TIAX developed several criteria for selecting a representative set of technology configurations. TIAX applied these criteria to all possible technology configurations to determine an optimized set for further analysis, as shown in Table ES-1. This analysis also considered potential energy station operational scenarios and their impact upon hydrogen and power production. For example, an energy station with a 50-kWe reformer could generate enough hydrogen to serve up to 12 vehicles/day (at 5 kg/fill) or generate up to 1,200 kWh/day, as shown in Figure ES-1. Buildings that would be well suited for an energy station would utilize both the thermal and electrical output of the station. Optimizing the generation and utilization of thermal energy, hydrogen, and electricity requires a detailed look at the energy transfer within the energy station and the transfer between the station and nearby facilities. TIAX selected the Baseline configuration given in Table ES-1 for an initial analysis of the energy and mass transfer expected from an operating energy station. Phase II: The purpose of this technical analysis was to analyze the development of a hydrogen-dispensing infrastructure for transportation applications through the installation of a 50-75 kW stationary fuel cell-based energy station at federal building sites. The various scenarios, costs, designs and impacts of such a station were quantified for a hypothetical cost-shared program that utilizes a natural gas reformer to provide hydrogen fuel for both the stack(s) and a limited number of fuel cell powered vehicles, with the possibility of using cogeneration to support the building heat load.

  7. Development and evaluation of a source sampling and analysis method for hydrogen cyanide

    SciTech Connect

    Steger, J.L.; Merrill, R.G.; Fuerst, R.G.

    1997-12-31

    Laboratory studies were carried out to develop a method for the sampling and analysis of hydrogen cyanide from stationary source air emissions using a dilute NaOH solution as the collection medium. The evaluated method extracts stack gas from the emission source and stabilizes the reactive gas in dilute sodium hydroxide solution for subsequent analysis. A modified Method 0050 sampling train was evaluated by dynamically spiking hydrogen cyanide into the heated probe while sampling simulated or actual source gas.

  8. Quantitative texture analysis of talc in mantle hydrated mylonites

    NASA Astrophysics Data System (ADS)

    Benitez-Perez, J. M.; Gomez Barreiro, J.; Wenk, H. R.; Vogel, S. C.; Soda, Y.; Voltolini, M.; Martinez-Catalan, J. R.

    2014-12-01

    A quantitative texture analysis of talc-serpentinite mylonites developed in highly deformed ultramafic rocks from different orogenic contexts has been carried out with neutron diffraction at HIPPO (Los Alamos National Laboratory). The mineral assemblage, metamorphic evolution and deformation fabric of these samples can be correlated with those expected along the shallow levels (<100 km; <5 GPa) of a subduction zone. The hydration of mantle (ultramafic) rocks at those levels is likely to occur dynamically, with important implications for seismogenesis. Given the high anisotropy of the major phases in the samples (i.e., talc and antigorite), their texture is expected to influence the seismic anisotropy of the whole system. However, to date there have been no data on the crystallographic preferred orientation of talc, and examples of antigorite textures are very limited. We explore the contribution of talc texture to the seismic anisotropy of mantle hydrated mylonites. Acknowledgements: This work has been funded by research project CGL2011-22728 of the Spanish Ministry of Economy and Competitiveness. JGB and JMBP are grateful to the Ramón y Cajal and FPI funding programs. Access to HIPPO (LANSCE) to conduct diffraction experiments is kindly acknowledged.

  9. Quantitative analysis of dynamic association in live biological fluorescent samples.

    PubMed

    Ruusuvuori, Pekka; Paavolainen, Lassi; Rutanen, Kalle; Mäki, Anita; Huttunen, Heikki; Marjomäki, Varpu

    2014-01-01

    Determining vesicle localization and association in live microscopy may be challenging due to non-simultaneous imaging of rapidly moving objects with two excitation channels. Besides errors due to the movement of objects, imaging may also introduce shifts between the image channels, and traditional colocalization methods cannot handle such situations. Our approach to quantifying the association between tagged proteins is to use an object-based method in which an exact match of object locations is not assumed. Point-pattern matching provides a measure of correspondence between two point sets under various changes between the sets. Thus, it can be used for robust quantitative analysis of vesicle association between image channels. Results for a large set of synthetic images show that the novel association method based on point-pattern matching robustly detects the association of closely located vesicles in live-cell microscopy where traditional colocalization methods fail to produce results. In addition, the method outperforms the Iterated Closest Point registration method used for comparison. Results for fixed and live experimental data show that the association method performs comparably to traditional methods in colocalization studies of fixed cells and performs favorably in association studies of live cells.

  10. Quantitative analysis of brain magnetic resonance imaging for hepatic encephalopathy

    NASA Astrophysics Data System (ADS)

    Syh, Hon-Wei; Chu, Wei-Kom; Ong, Chin-Sing

    1992-06-01

    High intensity lesions around the ventricles have recently been observed in T1-weighted brain magnetic resonance images of patients suffering from hepatic encephalopathy. The exact etiology that causes the magnetic resonance imaging (MRI) gray scale changes has not been fully understood. The objective of our study was to investigate, through quantitative means, (1) the amount of change to brain white matter due to the disease process, and (2) the extent and distribution of these high intensity lesions, since it is believed that the abnormality may not be entirely limited to the white matter. Eleven patients with proven hepatic encephalopathy and three normal persons without any evidence of liver abnormality constituted our current database. Transaxial, sagittal, and coronal brain MRI were obtained on a 1.5 Tesla scanner. All processing was carried out on a microcomputer-based image analysis system in an off-line manner. Histograms were decomposed into regular brain tissues and lesions. Gray scale ranges coded as lesions were then mapped back onto the original images to identify the distribution of abnormality. Our results indicated that the disease process involved the pallidus, mesencephalon, and subthalamic regions.

  11. A quantitative analysis of cardiac myocyte relaxation: a simulation study.

    PubMed

    Niederer, S A; Hunter, P J; Smith, N P

    2006-03-01

    The determinants of relaxation in cardiac muscle are poorly understood, yet compromised relaxation accompanies various pathologies and impaired pump function. In this study, we develop a model of active contraction to elucidate the relative importance of the [Ca2+]i transient magnitude, the unbinding of Ca2+ from troponin C (TnC), and the length-dependence of tension and Ca2+ sensitivity on relaxation. Using the framework proposed by one of our researchers, we extensively reviewed the experimental literature to quantitatively characterize the binding of Ca2+ to TnC, the kinetics of tropomyosin, the availability of binding sites, and the kinetics of crossbridge binding after perturbations in sarcomere length. Model parameters were determined from multiple experimental results and modalities (skinned and intact preparations), and model results were validated against data from length step, caged Ca2+, isometric twitch, and half-time to relaxation with increasing sarcomere length experiments. A factorial analysis found that the [Ca2+]i transient and the unbinding of Ca2+ from TnC were the primary determinants of relaxation, with a fivefold greater effect than that of length-dependent maximum tension and twice the effect of tension-dependent binding of Ca2+ to TnC and length-dependent Ca2+ sensitivity. The effects of the [Ca2+]i transient and the unbinding rate of Ca2+ from TnC were tightly coupled, with the effect of increasing either factor depending on the reference [Ca2+]i transient and unbinding rate.

  12. Quantitative and graphic acoustic analysis of phonatory modulations: the modulogram.

    PubMed

    Buder, Eugene H; Strand, Edythe A

    2003-04-01

    A method is presented for analyzing phonatory instabilities that occur as modulations of fundamental frequency (f0) and sound pressure level (SPL) on the order of 0.2 to 20 cycles per second. Such long-term phonatory instabilities, including but not limited to traditional notions of tremor, are distinct from cycle-to-cycle perturbation such as jitter or shimmer. For each of the 2 parameters (f0, in Hz, and SPL, in dB), 3 frequency domains are proposed: (a) flutter (10-20 Hz), (b) tremor (2-10 Hz), and (c) wow (0.2-2.0 Hz), yielding 6 types of instability. Analyses were implemented using fast Fourier transforms (FFTs) with domain-specific analysis parameters. Outputs include a graphic display in the form of a set of low-frequency spectrograms (the "modulogram") and quantitative measures of the frequencies, magnitudes, durations, and sinusoidal form of the instabilities. An index of a given instability is developed by combining its duration and average modulation magnitude into a single quantity. Performance of the algorithms was assessed by analyzing test signals with known degrees of modulation, and a range of applications was reviewed to provide a rationale for use of modulograms in phonatory assessment.
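
    The band definitions translate directly into a simple spectral computation. The sketch below is a rough illustration, not the authors' implementation: it sums FFT magnitudes of a demeaned f0 contour within the three proposed modulation bands (the sampling rate and windowing choices here are assumptions).

```python
import numpy as np

def modulation_magnitudes(f0_track, fs=100.0):
    """Split the low-frequency modulation energy of an f0 contour into
    the three domains proposed in the paper. `f0_track` is a fundamental
    frequency contour in Hz, sampled at `fs` Hz (illustrative values)."""
    x = np.asarray(f0_track, dtype=float)
    x = x - x.mean()                          # modulation, not the carrier
    spec = np.abs(np.fft.rfft(x * np.hanning(len(x)))) / len(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    bands = {"wow": (0.2, 2.0), "tremor": (2.0, 10.0), "flutter": (10.0, 20.0)}
    return {name: spec[(freqs >= lo) & (freqs < hi)].sum()
            for name, (lo, hi) in bands.items()}

# 5 s contour: 200 Hz carrier with a 4 Hz, +/-6 Hz tremor
t = np.arange(0, 5, 0.01)
f0 = 200 + 6 * np.sin(2 * np.pi * 4 * t)
print(modulation_magnitudes(f0))              # energy concentrated in 'tremor'
```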

  13. Quantitative SERS sensors for environmental analysis of naphthalene.

    PubMed

    Péron, O; Rinnert, E; Toury, T; Lamy de la Chapelle, M; Compère, C

    2011-03-07

    In the investigation of chemical pollutants, such as PAHs (Polycyclic Aromatic Hydrocarbons), at low concentration in aqueous media, Surface-Enhanced Raman Scattering (SERS) offers an alternative that compensates for the inherently low cross-section of normal Raman scattering. Indeed, SERS is a very sensitive spectroscopic technique owing to the excitation of the surface plasmon modes of a nanostructured metallic film. The surface of quartz substrates was coated with a hydrophobic film obtained by silanization and subsequently reacted with polystyrene (PS) beads coated with gold nanoparticles. The hydrophobic surface of the SERS substrates pre-concentrates non-polar molecules such as naphthalene. Under laser excitation, the SERS-active substrates allow the detection and identification of target molecules localized close to the gold nanoparticles. The morphology of the SERS substrates based on polystyrene beads surrounded by gold nanoparticles was characterized by scanning electron microscopy (SEM). Furthermore, the Raman fingerprint of the polystyrene serves as an internal spectral reference. On this basis, an innovative method to detect and quantify organic molecules, such as naphthalene in the range of 1 to 20 ppm, in aqueous media was developed. Such SERS-active substrates point towards an application as quantitative SERS sensors for the environmental analysis of naphthalene.

  14. Quantitative Financial Analysis of Alternative Energy Efficiency Shareholder Incentive Mechanisms

    SciTech Connect

    Cappers, Peter; Goldman, Charles; Chait, Michele; Edgar, George; Schlegel, Jeff; Shirley, Wayne

    2008-08-03

    Rising energy prices and climate change are central issues in the debate about our nation's energy policy. Many are demanding increased energy efficiency as a way to help reduce greenhouse gas emissions and lower the total cost of electricity and energy services for consumers and businesses. Yet, as the National Action Plan on Energy Efficiency (NAPEE) pointed out, many utilities continue to shy away from seriously expanding their energy efficiency program offerings because they claim there is insufficient profit motivation, or even a financial disincentive, when compared to supply-side investments. With the recent introduction of Duke Energy's Save-a-Watt incentive mechanism and ongoing discussions about decoupling, regulators and policymakers are now faced with an expanded and diverse landscape of financial incentive mechanisms. Determining the 'right' way forward to promote deep and sustainable demand-side resource programs is challenging. Given the renaissance that energy efficiency is currently experiencing, many want to better understand the tradeoffs in stakeholder benefits between these alternative incentive structures before aggressively embarking on a path for which course corrections can be time-consuming and costly. Using a prototypical Southwest utility and a publicly available financial model, we show how various stakeholders (e.g. shareholders, ratepayers, etc.) are affected by these different types of shareholder incentive mechanisms under varying assumptions about program portfolios. This quantitative analysis compares the financial consequences associated with a wide range of alternative incentive structures. The results will help regulators and policymakers better understand the financial implications of DSR program incentive regulation.

  15. Quantitative analysis of plasma interleukin-6 by immunoassay on microchip

    NASA Astrophysics Data System (ADS)

    Abe, K.; Hashimoto, Y.; Yatsushiro, S.; Yamamura, S.; Tanaka, M.; Ooie, T.; Baba, Y.; Kataoka, M.

    2012-03-01

    Sandwich enzyme-linked immunoassay (ELISA) is one of the most frequently employed assays for clinical diagnosis, since it enables the investigator to identify specific protein biomarkers. However, the conventional assay using a 96-well microtitration plate is time- and sample-consuming, and is therefore not suitable for rapid diagnosis. To overcome these drawbacks, we performed a sandwich ELISA on a microchip. We employed piezoelectric inkjet printing for the deposition and fixation of the 1st antibody on the microchannel surface (300 μm width and 100 μm depth). The model analyte was interleukin-6 (IL-6), an inflammatory cytokine. After blocking the microchannel, the antigen, biotin-labeled 2nd antibody, and avidin-labeled peroxidase were infused into the microchannel and incubated for 20 min, 10 min, and 5 min, respectively. The assay could detect 2 pg/ml and quantitatively measure IL-6 over the range of 0-32 pg/ml. Linear regression analysis of plasma IL-6 concentrations obtained by the microchip and conventional methods exhibited a significant relationship (R2 = 0.9964). This assay reduced the time for the antigen-antibody reaction to 1/6, and the consumption of samples and reagents to 1/50, compared with the conventional method. It enables us to determine plasma IL-6 with accuracy, high sensitivity, speed, and low consumption of sample and reagents, and thus will be applicable to clinical diagnosis.
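
    The reported method comparison reduces to an ordinary least-squares fit between paired measurements. A minimal sketch with hypothetical paired IL-6 values (the actual data are not reproduced here):

```python
import numpy as np

# Hypothetical paired IL-6 measurements (pg/ml): microchip vs. 96-well ELISA
chip  = np.array([2.1, 4.0, 7.9, 16.2, 31.5])
plate = np.array([2.0, 4.2, 8.1, 15.8, 32.0])

slope, intercept = np.polyfit(plate, chip, 1)
pred = slope * plate + intercept
r2 = 1 - np.sum((chip - pred) ** 2) / np.sum((chip - chip.mean()) ** 2)
print(f"chip = {slope:.3f} * plate + {intercept:.3f}, R^2 = {r2:.4f}")
```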

  16. Separation and quantitative analysis of alkyl sulfate ethoxymers by HPLC.

    PubMed

    Morvan, Julien; Hubert-Roux, Marie; Agasse, Valérie; Cardinael, Pascal; Barbot, Florence; Decock, Gautier; Bouillon, Jean-Philippe

    2008-01-01

    The separation of alkyl sulfate ethoxymers is investigated on various high-performance liquid chromatography (HPLC) stationary phases: Acclaim C18 Surfactant, Surfactant C8, and Hypercarb. For a fixed alkyl chain length, ethoxymers are eluted in order of increasing number of ethoxylated units on Acclaim C18 Surfactant, whereas the reverse elution order is observed on Surfactant C8 and Hypercarb. Moreover, on the Acclaim C18 Surfactant column, non-ethoxylated compounds are eluted within the distribution of their ethoxymers, and the use of a sodium acetate additive in the mobile phase leads to co-elution of the ethoxymers. HPLC stationary phases dedicated to surfactant analysis are evaluated by means of the Tanaka test. Surfactant C8 exhibits high silanol activity, whereas Acclaim C18 Surfactant shows high steric selectivity. For alkyl sulfates, the linearity of the calibration curve and the limits of detection and quantitation are evaluated. The amount of sodium laureth sulfate raw material found in a commercial body product is in agreement with the manufacturer's specification.

  17. Quantitative analysis of regulatory flexibility under changing environmental conditions

    PubMed Central

    Edwards, Kieron D; Akman, Ozgur E; Knox, Kirsten; Lumsden, Peter J; Thomson, Adrian W; Brown, Paul E; Pokhilko, Alexandra; Kozma-Bognar, Laszlo; Nagy, Ferenc; Rand, David A; Millar, Andrew J

    2010-01-01

    The circadian clock controls 24-h rhythms in many biological processes, allowing appropriate timing of biological rhythms relative to dawn and dusk. Known clock circuits include multiple, interlocked feedback loops. Theory suggested that multiple loops contribute the flexibility for molecular rhythms to track multiple phases of the external cycle. Clear dawn- and dusk-tracking rhythms illustrate the flexibility of timing in Ipomoea nil. Molecular clock components in Arabidopsis thaliana showed complex, photoperiod-dependent regulation, which was analysed by comparison with three contrasting models. A simple, quantitative measure, Dusk Sensitivity, was introduced to compare the behaviour of clock models with varying loop complexity. Evening-expressed clock genes showed photoperiod-dependent dusk sensitivity, as predicted by the three-loop model, whereas the one- and two-loop models tracked dawn and dusk, respectively. Output genes for starch degradation achieved dusk-tracking expression through light regulation, rather than a dusk-tracking rhythm. Model analysis predicted which biochemical processes could be manipulated to extend dusk tracking. Our results reveal how an operating principle of biological regulators applies specifically to the plant circadian clock. PMID:21045818

  18. Quantitative analysis of 3-OH oxylipins in fermentation yeast.

    PubMed

    Potter, Greg; Xia, Wei; Budge, Suzanne M; Speers, R Alex

    2017-02-01

    Despite the ubiquitous distribution of oxylipins in plants, animals, and microbes, and the application of numerous analytical techniques to study these molecules, 3-OH oxylipins have never been quantitatively assayed in yeasts. The formation of heptafluorobutyrate methyl ester derivatives and subsequent analysis with gas chromatography - negative chemical ionization - mass spectrometry allowed for the first determination of yeast 3-OH oxylipins. The concentration of 3-OH 10:0 (0.68-4.82 ng/mg dry cell mass) in the SMA strain of Saccharomyces pastorianus grown in laboratory-scale beverage fermentations was elevated relative to oxylipin concentrations in plant tissues and macroalgae. In fermenting yeasts, the onset of 3-OH oxylipin formation has been related to fermentation progression and flocculation initiation. When the SMA strain was grown in laboratory-scale fermentations, the maximal sugar consumption rate preceded the lowest concentration of 3-OH 10:0 by ∼4.5 h and a distinct increase in 3-OH 10:0 concentration by ∼16.5 h.

  19. Quantitative analysis of cellular metabolic dissipative, self-organized structures.

    PubMed

    de la Fuente, Ildefonso Martínez

    2010-09-27

    One of the most important goals of the postgenomic era is understanding the metabolic dynamic processes and the functional structures generated by them. Extensive studies during the last three decades have shown that the dissipative self-organization of the functional enzymatic associations, the catalytic reactions produced during the metabolite channeling, the microcompartmentalization of these metabolic processes and the emergence of dissipative networks are the fundamental elements of the dynamical organization of cell metabolism. Here we present an overview of how mathematical models can be used to address the properties of dissipative metabolic structures at different organizational levels, both for individual enzymatic associations and for enzymatic networks. Recent analyses performed with dissipative metabolic networks have shown that unicellular organisms display a singular global enzymatic structure common to all living cellular organisms, which seems to be an intrinsic property of the functional metabolism as a whole. Mathematical models firmly based on experiments and their corresponding computational approaches are needed to fully grasp the molecular mechanisms of metabolic dynamical processes. They are necessary to enable the quantitative and qualitative analysis of the cellular catalytic reactions and also to help comprehend the conditions under which the structural dynamical phenomena and biological rhythms arise. Understanding the molecular mechanisms responsible for the metabolic dissipative structures is crucial for unraveling the dynamics of cellular life.

  20. Quantitative image analysis of HIV-1 infection in lymphoid tissue

    SciTech Connect

    Haase, A.T.; Zupancic, M.; Cavert, W.

    1996-11-08

    Tracking human immunodeficiency virus-type 1 (HIV-1) infection at the cellular level in tissue reservoirs provides opportunities to better understand the pathogenesis of infection and to rationally design and monitor therapy. A quantitative technique was developed to determine viral burden in two important cellular compartments in lymphoid tissues. Image analysis and in situ hybridization were combined to show that in the presymptomatic stages of infection there is a large, relatively stable pool of virions on the surfaces of follicular dendritic cells and a smaller pool of productively infected cells. Despite evidence of constraints on HIV-1 replication in the infected cell population in lymphoid tissues, estimates of the numbers of these cells and the virus they could produce are consistent with the quantities of virus that have been detected in the bloodstream. The cellular sources of virus production and storage in lymphoid tissues can now be studied with this approach over the course of infection and treatment. 22 refs., 2 figs., 2 tabs.

  1. Assessment of hair surface roughness using quantitative image analysis.

    PubMed

    Park, K H; Kim, H J; Oh, B; Lee, E; Ha, J

    2017-07-19

    Interest in the hair and hair cuticle is increasing. The hair cuticle is the first layer to be exposed to damage and the primary area of protection. For these reasons, hair product manufacturers consider cuticle protection important. However, previous studies used only visual assessment to examine the cuticle. This study aimed to observe changes in the cuticle and to measure hair roughness using a HIROX microscope. A total of 23 female subjects used the same products daily for 4 weeks. Three hair samples per subject were collected from three different areas of the head. Measurements were taken before and after 4 weeks of daily product use. The hair surface changes were clearly observed in the captured images. Moreover, hair surface roughness was quantified using various parameters in the HIROX software. After 4 weeks of daily product use, the roughness parameter value of the hair surface was significantly decreased. Our results suggest that this analytical method for hair roughness using HIROX can be a new paradigm for high-quality quantitative analysis of the hair cuticle. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
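
    Roughness parameters of this kind are typically simple statistics of the height profile. As an illustration of one standard parameter (whether the HIROX software computes exactly this one is not stated in the abstract), the arithmetic average roughness Ra:

```python
import numpy as np

def mean_roughness(profile):
    """Arithmetic average roughness Ra: mean absolute deviation of the
    height profile from its centre line. One standard roughness
    parameter; which parameters HIROX reports is not specified here."""
    z = np.asarray(profile, dtype=float)
    return np.abs(z - z.mean()).mean()

profile_um = [0.20, -0.10, 0.40, 0.00, -0.30, 0.10]   # heights in micrometres
print(mean_roughness(profile_um))
```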

  2. Quantitative Analysis of Cellular Metabolic Dissipative, Self-Organized Structures

    PubMed Central

    de la Fuente, Ildefonso Martínez

    2010-01-01

    One of the most important goals of the postgenomic era is understanding the metabolic dynamic processes and the functional structures generated by them. Extensive studies during the last three decades have shown that the dissipative self-organization of the functional enzymatic associations, the catalytic reactions produced during the metabolite channeling, the microcompartmentalization of these metabolic processes and the emergence of dissipative networks are the fundamental elements of the dynamical organization of cell metabolism. Here we present an overview of how mathematical models can be used to address the properties of dissipative metabolic structures at different organizational levels, both for individual enzymatic associations and for enzymatic networks. Recent analyses performed with dissipative metabolic networks have shown that unicellular organisms display a singular global enzymatic structure common to all living cellular organisms, which seems to be an intrinsic property of the functional metabolism as a whole. Mathematical models firmly based on experiments and their corresponding computational approaches are needed to fully grasp the molecular mechanisms of metabolic dynamical processes. They are necessary to enable the quantitative and qualitative analysis of the cellular catalytic reactions and also to help comprehend the conditions under which the structural dynamical phenomena and biological rhythms arise. Understanding the molecular mechanisms responsible for the metabolic dissipative structures is crucial for unraveling the dynamics of cellular life. PMID:20957111

  3. Quantitative analysis of biomedical samples using synchrotron radiation microbeams

    NASA Astrophysics Data System (ADS)

    Ektessabi, Ali; Shikine, Shunsuke; Yoshida, Sohei

    2001-07-01

    X-ray fluorescence (XRF) using a synchrotron radiation (SR) microbeam was applied to investigate the distributions and concentrations of elements in single neurons of patients with neurodegenerative diseases. In this paper we introduce a computer code that has been developed to quantify trace elements and matrix elements at the single-cell level. This computer code has been used in studies of several important neurodegenerative diseases such as Alzheimer's disease (AD), Parkinson's disease (PD) and parkinsonism-dementia complex (PDC), as well as in basic biological experiments to determine the elemental changes in cells due to the incorporation of foreign metal elements. Substantia nigra (SN) tissue obtained from autopsy specimens of patients with Guamanian parkinsonism-dementia complex (PDC) and from control cases was examined. Quantitative XRF analysis showed that neuromelanin granules of parkinsonian SN contained higher levels of Fe than those of the control, with concentrations in the ranges of 2300-3100 ppm and 2000-2400 ppm, respectively. In contrast, Zn and Ni in neuromelanin granules of SN tissue from the PDC case were lower than in the control; in particular, Zn was less than 40 ppm in SN tissue from the PDC case, compared with 560-810 ppm in the control. These changes are considered to be closely related to neurodegeneration and cell death.

  4. High throughput, quantitative analysis of human osteoclast differentiation and activity.

    PubMed

    Diepenhorst, Natalie A; Nowell, Cameron J; Rueda, Patricia; Henriksen, Kim; Pierce, Tracie; Cook, Anna E; Pastoureau, Philippe; Sabatini, Massimo; Charman, William N; Christopoulos, Arthur; Summers, Roger J; Sexton, Patrick M; Langmead, Christopher J

    2017-02-15

    Osteoclasts are multinuclear cells that degrade bone under both physiological and pathophysiological conditions. Osteoclasts are therefore a major target of osteoporosis therapeutics aimed at preserving bone. Consequently, analytical methods for osteoclast activity are useful for the development of novel biomarkers and/or pharmacological agents for the treatment of osteoporosis. The nucleation state of an osteoclast is indicative of its maturation and activity. To date, activity is routinely measured at the population level with only approximate consideration of the nucleation state (an 'osteoclast population' is typically defined as cells with ≥3 nuclei). Using a fluorescent substrate for tartrate-resistant acid phosphatase (TRAP), a routinely used marker of osteoclast activity, we developed a multi-labelled imaging method for quantitative measurement of osteoclast TRAP activity at the single cell level. Automated image analysis enables interrogation of large osteoclast populations in a high throughput manner using open source software. Using this methodology, we investigated the effects of receptor activator of nuclear factor kappa-B ligand (RANK-L) on osteoclast maturation and activity and demonstrated that TRAP activity directly correlates with osteoclast maturity (i.e. nuclei number). This method can be applied to high throughput screening of osteoclast-targeting compounds to determine changes in maturation and activity.

  5. Quantitative produced water analysis using mobile 1H NMR

    NASA Astrophysics Data System (ADS)

    Wagner, Lisabeth; Kalli, Chris; Fridjonsson, Einar O.; May, Eric F.; Stanwix, Paul L.; Graham, Brendan F.; Carroll, Matthew R. J.; Johns, Michael L.

    2016-10-01

    Measurement of the oil contamination of produced water is required in the oil and gas industry to the parts-per-million (ppm) level prior to discharge in order to meet typical environmental legislative requirements. Here we present the use of compact, mobile 1H nuclear magnetic resonance (NMR) spectroscopy, in combination with solid phase extraction (SPE), to meet this metrology need. The NMR hardware employed featured a sufficiently homogeneous magnetic field that chemical shift differences could be used to unambiguously differentiate, and hence quantitatively detect, the required oil and solvent NMR signals. A solvent system consisting of 1% v/v chloroform in tetrachloroethylene was deployed; this provided comparable 1H NMR signal intensities for the oil and the solvent (chloroform), with the chloroform supplying an internal reference 1H signal that renders the measurement effectively self-calibrating. The measurement process was applied to water contaminated with hexane or crude oil over the range 1-30 ppm. The results were validated against known solubility limits as well as infrared analysis and gas chromatography.
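
    Because the chloroform peak acts as an internal reference, quantification reduces to a peak-area ratio. A schematic sketch of that idea, with the calibration lumped into one assumed constant (the paper's actual calibration is not reproduced here):

```python
def oil_ppm(area_oil, area_ref, ref_ppm_equivalent):
    """Self-calibrating quantification via the internal chloroform 1H
    reference. `ref_ppm_equivalent` lumps the chloroform concentration
    and per-molecule proton counts into one assumed constant."""
    return area_oil / area_ref * ref_ppm_equivalent

print(oil_ppm(area_oil=0.42, area_ref=1.0, ref_ppm_equivalent=25.0))  # ~10.5 ppm
```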

  6. Global Assessment of Hydrogen Technologies – Tasks 3 & 4 Report Economic, Energy, and Environmental Analysis of Hydrogen Production and Delivery Options in Select Alabama Markets: Preliminary Case Studies

    SciTech Connect

    Fouad, Fouad H.; Peters, Robert W.; Sisiopiku, Virginia P.; Sullivan Andrew J.; Gillette, Jerry; Elgowainy, Amgad; Mintz, Marianne

    2007-12-01

    This report documents a set of case studies developed to estimate the cost of producing, storing, delivering, and dispensing hydrogen for light-duty vehicles for several scenarios involving metropolitan areas in Alabama. While the majority of the scenarios focused on centralized hydrogen production and pipeline delivery, alternative delivery modes were also examined. Although Alabama was used as the case study for this analysis, the results provide insights into the unique requirements for deploying hydrogen infrastructure in smaller urban and rural environments that lie outside the DOE’s high priority hydrogen deployment regions. Hydrogen production costs were estimated for three technologies – steam-methane reforming (SMR), coal gasification, and thermochemical water-splitting using advanced nuclear reactors. In all cases examined, SMR has the lowest production cost for the demands associated with metropolitan areas in Alabama. Although other production options may be less costly for larger hydrogen markets, these were not examined within the context of the case studies.

  7. Communication about vaccinations in Italian websites: a quantitative analysis.

    PubMed

    Tafuri, Silvio; Gallone, Maria S; Gallone, Maria F; Zorico, Ivan; Aiello, Valeria; Germinario, Cinzia

    2014-01-01

    Babies' parents and people who look for information about vaccination often visit anti-vaccine movement websites, blogs by naturopathic physicians, or blogs on natural and alternative medicine. The aim of this work is to provide a quantitative analysis of the type of information available to Italian people regarding vaccination and a quality analysis of the websites retrieved through our searches. A quality score was created to evaluate the technical level of websites. A search was performed through Yahoo, Google, and MSN using the keywords "vaccine" and "vaccination," combined with the function "OR," in order to identify the most frequently used websites. The two keywords were input in Italian, and the first 15 pages retrieved by each search engine were analyzed. 149 websites were selected through this methodology. Fifty-three percent of the websites belonged to associations, groups, or scientific societies, 32.2% (n = 48) were personal blogs, and 14.8% (n = 22) belonged to National Health System offices. Among all analyzed websites, 15.4% (n = 23) came from anti-vaccine movement groups. 37.6% reported the webmaster's name, 67.8% the webmaster's e-mail, 28.6% the date of the last update, and 46.6% the author's name. The quality score for government sites was on average higher than for anti-vaccine websites, although government sites do not use Web 2.0 functions such as forums. National Health System institutions that have to promote vaccination cannot avoid investing in web communication, which cannot be managed by private efforts alone but must be the result of synergy among Public Health institutions, private and scientific associations, and social movements.

  8. Quantitative Analysis of Intracellular Fluorescent Foci in Live Bacteria

    PubMed Central

    Moolman, M. Charl; Kerssemakers, Jacob W.J.; Dekker, Nynke H.

    2015-01-01

    Fluorescence microscopy has revolutionized in vivo cellular biology. Through the specific labeling of a protein of interest with a fluorescent protein, one is able to study movement and colocalization, and even count individual proteins in a live cell. Different algorithms exist to quantify the total intensity and position of a fluorescent focus. Although these algorithms have been rigorously studied for in vitro conditions, which differ greatly from the inhomogeneous and variable cellular environment, their exact limits and applicability in the context of a live cell have not been thoroughly and systematically evaluated. In this study, we quantitatively characterize the influence of different background subtraction algorithms on several focus analysis algorithms. We use, to our knowledge, a novel approach to assess the sensitivity of the focus analysis algorithms to background removal, in which simulated and experimental data are combined to maintain full control over the sensitivity of a focus within a realistic background of cellular fluorescence. We demonstrate that the choice of algorithm and the corresponding error are dependent on both the brightness of the focus and the cellular context. Expectedly, focus intensity estimation and localization accuracy suffer in all algorithms at low focus-to-background ratios, with the bacteroidal background subtraction in combination with the median excess algorithm, and the region of interest background subtraction in combination with a two-dimensional Gaussian fit algorithm, performing the best. We furthermore show that the choice of background subtraction algorithm is dependent on the expression level of the protein under investigation, and that the localization error is dependent on the distance of a focus from the bacterial edge and pole. Our results establish a set of guidelines for what signals can be analyzed to give a targeted spatial and intensity accuracy within a bacterial cell. PMID:26331246
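
    The region-of-interest variant can be sketched compactly: fit a symmetric 2D Gaussian whose constant offset doubles as the local background estimate. The code below illustrates that idea, not the authors' pipeline (the ROI size, the symmetric width, and the initial guesses are our assumptions).

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss2d(xy, amp, x0, y0, sigma, offset):
    x, y = xy
    return (amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))
            + offset).ravel()

def fit_focus(roi):
    """Fit a symmetric 2D Gaussian to a small ROI around a candidate
    focus; the `offset` term plays the role of a local (ROI) background
    estimate. A sketch, not the paper's exact implementation."""
    ny, nx = roi.shape
    y, x = np.mgrid[0:ny, 0:nx]
    p0 = (roi.max() - roi.min(), nx / 2, ny / 2, 2.0, roi.min())
    popt, _ = curve_fit(gauss2d, (x, y), roi.ravel(), p0=p0)
    amp, x0, y0, sigma, offset = popt
    intensity = 2 * np.pi * amp * sigma ** 2   # integrated focus intensity
    return (x0, y0), intensity

# Synthetic focus on a noisy cellular background
rng = np.random.default_rng(1)
y, x = np.mgrid[0:15, 0:15]
roi = 50 * np.exp(-((x - 7.3) ** 2 + (y - 6.8) ** 2) / (2 * 1.8 ** 2))
roi += 20 + rng.normal(0, 2, roi.shape)
print(fit_focus(roi))
```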

  9. Quantitative analysis of technological innovation in minimally invasive surgery.

    PubMed

    Hughes-Hallett, A; Mayer, E K; Pratt, P J; Vale, J A; Darzi, A W

    2015-01-01

    In the past 30 years surgical practice has changed considerably owing to the advent of minimally invasive surgery (MIS). This paper investigates the changing surgical landscape chronologically and quantitatively, examining the technologies that have played, and are forecast to play, the largest part in this shift in surgical practice. Electronic patent and publication databases were searched over the interval 1980-2011 for ('minimally invasive' OR laparoscopic OR laparoscopy OR 'minimal access' OR 'key hole') AND (surgery OR surgical OR surgeon). The resulting patent codes were allocated into technology clusters. Technology clusters referred to repeatedly in the contemporary surgical literature were also included in the analysis. Growth curves of patents and publications for the resulting technology clusters were then plotted. The initial search revealed 27,920 patents and 95,420 publications meeting the search criteria. The clusters meeting the criteria for in-depth analysis were: instruments, image guidance, surgical robotics, sutures, single-incision laparoscopic surgery (SILS) and natural-orifice transluminal endoscopic surgery (NOTES). Three patterns of growth were observed among these technology clusters: an S-shape (instruments and sutures), a gradual exponential rise (surgical robotics and image guidance), and a rapid contemporaneous exponential rise (NOTES and SILS). Technological innovation in MIS has been largely stagnant since its initial inception nearly 30 years ago, with few novel technologies emerging. The present study adds objective data to the previous claims that SILS, a surgical technique currently adopted by very few, represents an important part of the future of MIS. © 2015 BJS Society Ltd. Published by John Wiley & Sons Ltd.
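
    The S-shaped growth pattern reported for instruments and sutures is conventionally captured by a logistic curve. A hedged sketch of such a fit on hypothetical cumulative patent counts (the study's real counts are not reproduced here):

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, k, r, t0):
    """Classic S-curve: cumulative count saturating at carrying capacity k,
    growth rate r, inflection year t0."""
    return k / (1 + np.exp(-r * (t - t0)))

# Hypothetical cumulative patent counts for one technology cluster
years = np.arange(1980, 2012)
counts = logistic(years, 5000, 0.35, 1998)
counts += np.random.default_rng(2).normal(0, 50, years.size)

(k, r, t0), _ = curve_fit(logistic, years, counts, p0=(counts.max(), 0.1, 2000))
print(f"plateau ~{k:.0f} patents, inflection ~{t0:.0f}")
```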

  10. A techno-economic analysis of polyhydroxyalkanoate and hydrogen production from syngas fermentation of gasified biomass.

    PubMed

    Choi, DongWon; Chipman, David C; Bents, Scott C; Brown, Robert C

    2010-02-01

    A techno-economic analysis was conducted to investigate the feasibility of a gasification-based hybrid biorefinery producing both hydrogen gas and polyhydroxyalkanoates (PHA), biodegradable polymer materials that can be an attractive substitute for conventional petrochemical plastics. The biorefinery considered used switchgrass as a feedstock and converted that raw material through thermochemical methods into syngas, a gaseous mixture composed mainly of hydrogen and carbon monoxide. The syngas was then fermented using Rhodospirillum rubrum, a purple non-sulfur bacterium, to produce PHA and to enrich hydrogen in the syngas. Total daily production of the biorefinery was assumed to be 12 Mg of PHA and 50 Mg of hydrogen gas. Grassroots capital for the biorefinery was estimated to be $55 million, with annual operating costs at $6.7 million. With a market value of $2.00/kg assumed for the hydrogen, the cost of producing PHA was determined to be $1.65/kg.

  11. Milestone report TCTP application to the SSME hydrogen system analysis

    NASA Technical Reports Server (NTRS)

    Richards, J. S.

    1975-01-01

    The Transient Cryogen Transfer Computer Program (TCTP) developed and verified for LOX systems by analyses of Skylab S-1B stage loading data from John F. Kennedy Space Center launches was extended to include hydrogen as the working fluid. The feasibility of incorporating TCTP into the space shuttle main engine dynamic model was studied. The program applications are documented.

  12. Analysis of neutral hydrogenic emission spectra in a tokamak

    NASA Astrophysics Data System (ADS)

    Ko, J.; Chung, J.; Jaspers, R. J. E.

    2015-10-01

    Balmer-α radiation produced by the excitation of thermal and fast neutral hydrogenic particles has been investigated in a magnetically confined fusion device, or tokamak, at the Korea Superconducting Tokamak Advanced Research (KSTAR) facility. From the diagnostic point of view, the emission from thermal neutrals is associated with passive spectroscopy, while that from energetic neutrals, usually injected from outside the tokamak, is associated with active spectroscopy. The passive spectroscopic measurement of the thermal Balmer-α emission from deuterium and hydrogen estimates the relative concentration of hydrogen in a deuterium-fueled plasma and therefore provides a useful tool for monitoring the vacuum wall condition. The ratio of hydrogen to deuterium obtained from this measurement correlates qualitatively with the energy confinement of the plasma. The Doppler-shifted Balmer-α components from the fast neutrals feature the spectrum of the motional Stark effect (MSE), which is an essential principle for the measurement of the magnetic pitch angle profile. These active MSE spectra, especially with multiple neutral beam lines crossing the observation line of sight, have been characterized to provide guidelines for multi-ion-source heating beam operation and for the optimization of the narrow bandpass filters required for the polarimeter-based MSE diagnostic system under construction at KSTAR.

  13. Hydrogen isotope analysis of amino acids and whole cells reflects biosynthetic processing of nutrient- and water-derived hydrogen

    NASA Astrophysics Data System (ADS)

    Griffin, P.; Newsome, S.; Steele, A.; Fogel, M. L.

    2011-12-01

    Hydrogen (H) isotopes serve as sensitive tracers of biochemical processes that can be exploited to answer critical questions in biogeochemistry, ecology, and microbiology. Despite this apparent utility, relatively little is known about the specific mechanisms of H isotope fractionation involved in biosynthesis. In order to understand how organisms incorporate hydrogen from their chemical milieu into biomass, we have cultured the model bacterium E. coli MG1655 in a variety of media composed of deuterium-labeled nutrients and waters. Isotopic analysis of bulk cell mass reveals that the H fractionation between media water and cell material varies as a function of the nutrient source, with commonly used organic food sources (glucose and tryptone) leading to far smaller fractionation signals than non-standard ones (such as formamide, adenine, and urea). In addition, we have completed compound-specific isotope analysis of amino acids using combined GC-IRMS. Amino acids harvested from E. coli cultured on glucose in water of varied D/H composition possess an extraordinary range of isotopic compositions (400-600‰). Furthermore, these amino acids follow a systematic distribution of D/H in which proline is always heaviest and glycine is always lightest. However, when the short-chain peptide mixture tryptone is used in place of glucose, only the non-essential amino acids reflect media water D/H values, suggesting the direct incorporation of some media-borne amino acids into cellular protein. These observations provide a foundation for understanding the cellular routing of hydrogen obtained from food and water sources and indicate that D/H analysis can serve as a powerful probe of biological function.
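
    For reference, hydrogen isotope compositions of this kind are conventionally expressed in delta notation relative to the VSMOW standard; a one-line implementation (the sample ratio below is illustrative):

```python
R_VSMOW = 155.76e-6   # D/H ratio of the VSMOW standard

def delta_D(r_sample):
    """delta-D in per mil (per thousand) relative to VSMOW."""
    return (r_sample / R_VSMOW - 1) * 1000.0

print(delta_D(2.48e-4))   # a strongly deuterium-enriched sample: ~ +592 per mil
```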

  14. Software for quantitative analysis of radiotherapy: overview, requirement analysis and design solutions.

    PubMed

    Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O

    2013-06-01

    Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of treatment and support the prediction of outcome. In this paper, we first identify functional, conceptual and general requirements for a software system for quantitative analysis of radiotherapy. We then present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptual problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via a dose iterator pattern; analysis database design). As a proof of concept we developed a software library, "RTToolbox", following the presented design principles. The RTToolbox is available as an open source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles.
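
    The "dose iterator pattern" named above is a decoupling device: analysis algorithms consume a stream of dose values without knowing how they are stored. The sketch below is our illustrative reading of that idea, not the RTToolbox API.

```python
from typing import Iterable, Iterator

class DoseIterator:
    """Decouples analysis algorithms from how dose values are stored
    (regular grid, point cloud, compressed file, ...). Storage backends
    only need to yield a plain stream of dose values."""
    def __init__(self, values: Iterable[float]):
        self._values = values
    def __iter__(self) -> Iterator[float]:
        return iter(self._values)

def mean_dose(doses: DoseIterator) -> float:
    total, n = 0.0, 0
    for d in doses:          # no knowledge of the underlying geometry
        total += d
        n += 1
    return total / n

print(mean_dose(DoseIterator([1.8, 2.0, 2.1, 1.9])))   # Gy
```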

  15. Thermodynamic analysis of the efficiency of high-temperature steam electrolysis system for hydrogen production

    NASA Astrophysics Data System (ADS)

    Mingyi, Liu; Bo, Yu; Jingming, Xu; Jing, Chen

    High-temperature steam electrolysis (HTSE), in principle the reverse process of a solid oxide fuel cell (SOFC), is a promising method for highly efficient large-scale hydrogen production. In our study, the overall efficiency of the HTSE system was calculated through electrochemical and thermodynamic analysis. A thermodynamic model of the efficiency of the HTSE system was established, and the quantitative effects of three key parameters, the electrical efficiency (η_el), the electrolysis efficiency (η_es), and the thermal efficiency (η_th), on the overall efficiency (η_overall) of the HTSE system were investigated. Results showed that the contributions of η_el, η_es, and η_th to the overall efficiency were about 70%, 22%, and 8%, respectively. As the temperature increased from 500 °C to 1000 °C, the effect of η_el on η_overall decreased gradually and the η_es effect remained almost constant, while the η_th effect increased gradually. The overall efficiency of the high-temperature gas-cooled reactor (HTGR) coupled with the HTSE system under different conditions was also calculated. With the increase of electrical, electrolysis, and thermal efficiency, the overall efficiencies were anticipated to increase from 33% to a maximum of 59% at 1000 °C, which is over two times higher than that of conventional alkaline water electrolysis.
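
    To make the role of the three efficiencies concrete, the sketch below combines them under one plausible bookkeeping: the electrical demand ΔG is supplied through η_el and η_es, and the heat demand TΔS through η_th. Both the combination rule and the thermodynamic values are our assumptions for illustration, not the paper's model.

```python
# Rough energy bookkeeping for HTSE, per mole of H2 near 1000 degC.
DH = 249e3     # J/mol, total energy demand (approximate)
DG = 177e3     # J/mol, electrical part (approximate)
TDS = DH - DG  # J/mol, heat part

def overall_efficiency(eta_el, eta_es, eta_th):
    """Hydrogen energy out over primary energy in, under the assumed
    split of demand into electrical and thermal supply paths."""
    primary = DG / (eta_el * eta_es) + TDS / eta_th
    return DH / primary

print(overall_efficiency(eta_el=0.50, eta_es=0.90, eta_th=0.95))   # ~0.53
```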

  16. Column precipitation chromatography: an approach to quantitative analysis of eigencolloids.

    PubMed

    Breynaert, E; Maes, A

    2005-08-01

    A new column precipitation chromatography (CPC) technique, capable of quantitatively measuring technetium eigencolloids in aqueous solutions, is presented. The CPC technique is based on the destabilization and precipitation of eigencolloids by polycations in a confined matrix. Tc(IV) colloids can be quantitatively determined from their precipitation onto the CPC column (separation step) and their subsequent elution upon oxidation to pertechnetate by peroxide (elution step). A clean-bed particle removal model was used to explain the experimental results.

  17. Quantitative Proteomic Analysis of Differentially Expressed Protein Profiles Involved in Pancreatic Ductal Adenocarcinoma

    PubMed Central

    Kuo, Kung-Kai; Kuo, Chao-Jen; Chiu, Chiang-Yen; Liang, Shih-Shin; Huang, Chun-Hao; Chi, Shu-Wen; Tsai, Kun-Bow; Chen, Chiao-Yun; Hsi, Edward; Cheng, Kuang-Hung; Chiou, Shyh-Horng

    2016-01-01

    Objectives The aim of this study was to identify differentially expressed proteins among various stages of pancreatic ductal adenocarcinoma (PDAC) by shotgun proteomics using nano-liquid chromatography coupled with tandem mass spectrometry and stable isotope dimethyl labeling. Methods Differentially expressed proteins were identified and compared based on the mass spectral differences of their isotope-labeled peptide fragments generated from protease digestion. Results Our quantitative proteomic analysis of differentially expressed proteins with stable isotopes (deuterium/hydrogen ratio, ≥2) identified a total of 353 proteins, including at least 5 biomarker candidate proteins that were significantly differentially expressed between cancer and normal mice by at least a 2-fold alteration. These 5 biomarker candidates, identified with high confidence levels, include α-enolase, α-catenin, 14-3-3 β, VDAC1, and calmodulin. Their expression levels were also found to be in agreement with those examined by Western blot and histochemical staining. Conclusions The systematic decrease or increase of these identified marker proteins may potentially reflect the morphological aberrations and diseased stages of pancreatic carcinoma throughout the progressive developments leading to PDAC. The results form a firm foundation for future work concerning validation and clinical translation of some identified biomarkers into targeted diagnosis and therapy for various stages of PDAC. PMID:26262590

  18. Quantitative analysis of weak interactions by Lattice energy calculation, Hirshfeld surface and DFT studies of sulfamonomethoxine

    NASA Astrophysics Data System (ADS)

    Patel, Kinjal D.; Patel, Urmila H.

    2017-01-01

    Sulfamonomethoxine, 4-amino-N-(6-methoxy-4-pyrimidinyl)benzenesulfonamide (C11H12N4O3S), is investigated by the single-crystal X-ray diffraction technique. Pairs of N-H⋯N and C-H⋯O intermolecular interactions, along with π⋯π interactions, are responsible for the stability of the molecular packing of the structure. In order to understand the nature of the interactions and their quantitative contributions to the crystal packing, 3D Hirshfeld surface and 2D fingerprint plot analyses are carried out. PIXEL calculations are performed to determine the lattice energies corresponding to the intermolecular interactions in the crystal structure. Ab initio quantum chemical calculations of sulfamonomethoxine (SMM) have been performed by the B3LYP method, using the 6-31G** basis set, with the help of Schrodinger software. The computed geometrical parameters are in good agreement with the experimental data. The Mulliken charge distribution, calculated using the B3LYP method, confirms the presence of electron-acceptor and electron-donor atoms responsible for the intermolecular hydrogen bond interactions and hence for molecular stability.

  19. Quantitative analysis of flavanones and chalcones from willow bark.

    PubMed

    Freischmidt, A; Untergehrer, M; Ziegler, J; Knuth, S; Okpanyi, S; Müller, J; Kelber, O; Weiser, D; Jürgenliemk, G

    2015-09-01

    Willow bark extracts are used for the treatment of fever, pain and inflammation. Recent clinical and pharmacological research revealed that not only the salicylic alcohol derivatives but also the polyphenols contribute significantly to these effects. Quantitative analysis in the European Pharmacopoeia still focuses on the determination of the salicylic alcohol derivatives. The objective of the present study was the development of an effective quantification method for the determination of as many flavanone and chalcone glycosides as possible in Salix purpurea and other Salix species, as well as commercial preparations thereof. As Salix species contain a diverse spectrum of the glycosidated flavanones naringenin and eriodictyol and the chalcone chalconaringenin, a sequential acidic and enzymatic hydrolysis was developed to yield naringenin and eriodictyol as aglycones, which were quantified by HPLC. The 5-O-glucosides were cleaved with 11.5% TFA before subsequent hydrolysis of the 7-O-glucosides with an almond β-glucosidase at pH 6-7. The method was validated with regard to LOD, LOQ, intraday and interday precision, accuracy, stability, recovery, time of hydrolysis, robustness and applicability to extracts. All 5-O- and 7-O-glucosides of naringenin, eriodictyol and chalconaringenin were completely hydrolysed and converted to naringenin and eriodictyol. The LOD of the HPLC method was 0.77 μM for naringenin and 0.45 μM for eriodictyol. The LOQ was 2.34 μM for naringenin and 1.35 μM for eriodictyol. The method is robust with regard to sample weight, but sensitive to enzyme deterioration. The developed method is applicable to the determination of flavanone and chalcone glycosides in willow bark and corresponding preparations.
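
    The reported LOD:LOQ ratios (~1:3 for both analytes) are consistent with the common ICH-style formulas 3.3σ/S and 10σ/S; whether the authors used exactly these formulas is our assumption. A minimal sketch:

```python
def lod_loq(sigma, slope):
    """ICH-style estimates from calibration data: sigma is the standard
    deviation of the blank/response residuals, slope the calibration
    sensitivity (signal per uM)."""
    return 3.3 * sigma / slope, 10 * sigma / slope

lod, loq = lod_loq(sigma=0.07, slope=0.30)   # illustrative numbers
print(f"LOD = {lod:.2f} uM, LOQ = {loq:.2f} uM")   # 0.77 and 2.33
```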

  20. A qualitative and quantitative analysis of vegetable pricing in supermarket

    NASA Astrophysics Data System (ADS)

    Miranda, Suci

    2017-06-01

    The purpose of this study is to analyze, qualitatively and quantitatively, the variables affecting the determination of a vegetable sale price that is constant over time in a supermarket. It focuses on non-organic vegetables with a fixed selling price over time, such as spinach, beet, and parsley. In the qualitative analysis, the sale price determination is influenced by the vegetable characteristics: (1) vegetable segmentation (low to high daily consumption); and (2) vegetable age (how long it lasts, related to freshness); both characteristics relate to inventory management and ultimately to the sale price in the supermarket. Quantitatively, the vegetables are divided into two categories: the leafy vegetable group, whose leaves are eaten, with aging (a) = 0 and shelf life (t) = 0, and the non-leafy vegetable group with aging (a) = a+1 and shelf life (t) = t+1. A vegetable age of (a) = 0 means the vegetables last only one day after being ordered and must then be discarded, whereas a+1 means a life of more than one day, as for beet, white radish, and string beans. The shelf life refers to how long a vegetable is kept on a supermarket shelf, in line with its age. Following the cost-plus pricing method with a full-costing approach, production costs, non-production costs, and markup are adjusted differently for each category. A holding cost is added to the sale price of non-leafy vegetables, while a zero holding cost is assumed for the leafy vegetable category. The expected margin of each category is correlated with the vegetable characteristics.
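
    The pricing rule described above is easy to state as code. A minimal sketch with illustrative numbers (currency units arbitrary); only the non-leafy category carries a holding cost, per the paper's assumption:

```python
def sale_price(production, non_production, markup, holding=0.0):
    """Cost-plus price with full costing: all costs plus a markup.
    `holding` applies only to the non-leafy (longer shelf life)
    category; leafy vegetables (age a = 0) get holding = 0."""
    return (production + non_production + holding) * (1 + markup)

leafy     = sale_price(1000, 200, markup=0.25)               # spinach-like
non_leafy = sale_price(1000, 200, markup=0.25, holding=150)  # beet-like
print(leafy, non_leafy)
```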

  1. Quantitative analysis of harmonic convergence in mosquito auditory interactions

    PubMed Central

    Aldersley, Andrew; Champneys, Alan; Robert, Daniel

    2016-01-01

    This article analyses the hearing and behaviour of mosquitoes in the context of inter-individual acoustic interactions. The acoustic interactions of tethered live pairs of Aedes aegypti mosquitoes, from same and opposite sex mosquitoes of the species, are recorded on independent and unique audio channels, together with the response of tethered individual mosquitoes to playbacks of pre-recorded flight tones of lone or paired individuals. A time-dependent representation of each mosquito's non-stationary wing beat frequency signature is constructed, based on Hilbert spectral analysis. A range of algorithmic tools is developed to automatically analyse these data, and used to perform a robust quantitative identification of the ‘harmonic convergence’ phenomenon. The results suggest that harmonic convergence is an active phenomenon, which does not occur by chance. It occurs for live pairs, as well as for lone individuals responding to playback recordings, whether from the same or opposite sex. Male–female behaviour is dominated by frequency convergence at a wider range of harmonic combinations than previously reported, and requires participation from both partners in the duet. New evidence is found to show that male–male interactions are more varied than strict frequency avoidance. Rather, they can be divided into two groups: convergent pairs, typified by tightly bound wing beat frequencies, and divergent pairs, that remain widely spaced in the frequency domain. Overall, the results reveal that mosquito acoustic interaction is a delicate and intricate time-dependent active process that involves both individuals, takes place at many different frequencies, and which merits further enquiry. PMID:27053654
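
    The Hilbert-based step can be sketched compactly: the instantaneous frequency of a narrow-band signal is the derivative of the phase of its analytic signal. A minimal illustration (the paper's full pipeline adds filtering and its own frequency tracking on top of this):

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_frequency(signal, fs):
    """Instantaneous frequency (Hz) of a narrow-band signal via the
    analytic signal -- the core step of a Hilbert spectral analysis
    of a wing-beat recording."""
    phase = np.unwrap(np.angle(hilbert(signal)))
    return np.diff(phase) * fs / (2 * np.pi)

fs = 8000.0
t = np.arange(0, 1, 1 / fs)
tone = np.sin(2 * np.pi * 500 * t)                 # steady 500 Hz "wing beat"
print(instantaneous_frequency(tone, fs).mean())    # ~500.0
```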

  2. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Cruikshank, Dale P.; MoreauDalleOre, Cristina; Pendleton, Yvonne J.; Clark, Roger Nelson

    2012-01-01

    We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites: Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visible-Infrared Mapping Spectrometer on close flybys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at approximately 3.28 micrometers (approximately 3050 per centimeter), and four blended bands of aliphatic -CH2- and -CH3 in the range approximately 3.36-3.52 micrometers (approximately 2980-2840 per centimeter). The aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is N(aromatic):N(aliphatic) approximately 24; for Hyperion the value is approximately 12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundance of aliphatic -CH2- and -CH3- is an indication of the lengths of the molecular chain structures, and hence of the degree of modification of the original material. We derive CH2:CH3 approximately 2.2 in the spectrum of low-albedo material on Iapetus; this value is the same, within measurement errors, as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.

  3. A quantitative analysis of optimal treatment capacity for perinatal asphyxia.

    PubMed

    Geva, Alon; Gray, James

    2012-01-01

    In centers electing to offer therapeutic hypothermia for treating hypoxic-ischemic encephalopathy (HIE), determining the optimal number of cooling devices is not straightforward. The authors used computer-based modeling to determine the level of service as a function of local HIE caseload and number of cooling devices available. The authors used discrete event simulation to create a model that varied the number of HIE cases and number of cooling devices available. Outcomes of interest were percentage of HIE-affected infants not cooled, number of infants not cooled, and percentage of time that all cooling devices were in use. With 1 cooling device, even the smallest perinatal center did not achieve a cooling rate of 99% of eligible infants. In contrast, 2 devices ensured 99% service in centers treating as many as 20 infants annually. In centers averaging no more than 1 HIE infant monthly, the addition of a third cooling device did not result in a substantial reduction in the number of infants who would not be cooled. Centers electing to offer therapeutic hypothermia with only a single cooling device are at significant risk of being unable to provide treatment to eligible infants, whereas 2 devices appear to suffice for most institutions treating as many as 20 annual HIE cases. Three devices would rarely be needed given current caseloads seen at individual institutions. The quantitative nature of this analysis allows decision makers to determine the number of devices necessary to ensure adequate availability of therapeutic hypothermia given the HIE caseload of a particular institution.
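
    The service-level question maps onto a loss model: Poisson arrivals, a fixed 72 h cooling course, and no queueing (an infant who cannot be cooled at arrival is lost). The Monte Carlo sketch below is a simplification under these assumptions, not the authors' discrete event model:

```python
import random

def fraction_not_cooled(annual_cases, devices, days_per_course=3,
                        years=2000, seed=0):
    """Monte Carlo estimate of the share of HIE infants who find all
    cooling devices busy, under Erlang-loss-style assumptions."""
    rng = random.Random(seed)
    rate = annual_cases / 365.0              # arrivals per day
    lost = total = 0
    for _ in range(years):
        t, busy_until = 0.0, []
        while True:
            t += rng.expovariate(rate)
            if t > 365.0:
                break
            total += 1
            busy_until = [e for e in busy_until if e > t]   # free devices
            if len(busy_until) < devices:
                busy_until.append(t + days_per_course)
            else:
                lost += 1
    return lost / total

print(fraction_not_cooled(annual_cases=20, devices=2))   # ~1% missed
```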

  4. Quantitative analysis of harmonic convergence in mosquito auditory interactions.

    PubMed

    Aldersley, Andrew; Champneys, Alan; Homer, Martin; Robert, Daniel

    2016-04-01

    This article analyses the hearing and behaviour of mosquitoes in the context of inter-individual acoustic interactions. The acoustic interactions of tethered live pairs of Aedes aegypti mosquitoes, from same and opposite sex mosquitoes of the species, are recorded on independent and unique audio channels, together with the response of tethered individual mosquitoes to playbacks of pre-recorded flight tones of lone or paired individuals. A time-dependent representation of each mosquito's non-stationary wing beat frequency signature is constructed, based on Hilbert spectral analysis. A range of algorithmic tools is developed to automatically analyse these data, and used to perform a robust quantitative identification of the 'harmonic convergence' phenomenon. The results suggest that harmonic convergence is an active phenomenon, which does not occur by chance. It occurs for live pairs, as well as for lone individuals responding to playback recordings, whether from the same or opposite sex. Male-female behaviour is dominated by frequency convergence at a wider range of harmonic combinations than previously reported, and requires participation from both partners in the duet. New evidence is found to show that male-male interactions are more varied than strict frequency avoidance. Rather, they can be divided into two groups: convergent pairs, typified by tightly bound wing beat frequencies, and divergent pairs, that remain widely spaced in the frequency domain. Overall, the results reveal that mosquito acoustic interaction is a delicate and intricate time-dependent active process that involves both individuals, takes place at many different frequencies, and which merits further enquiry.

  5. Quantitative image analysis of cell colocalization in murine bone marrow.

    PubMed

    Mokhtari, Zeinab; Mech, Franziska; Zehentmeier, Sandra; Hauser, Anja E; Figge, Marc Thilo

    2015-06-01

    Long-term antibody production is a key property of humoral immunity and is accomplished by long-lived plasma cells. They mainly reside in the bone marrow, whose importance as an organ hosting immunological memory is becoming increasingly evident. Signals provided by stromal cells and eosinophils may play an important role for plasma cell maintenance, constituting a survival microenvironment. In this joint study of experiment and theory, we investigated the spatial colocalization of plasma cells, eosinophils and B cells by applying an image-based systems biology approach. To this end, we generated confocal fluorescence microscopy images of histological sections from murine bone marrow that were subsequently analyzed in an automated fashion. This quantitative analysis was combined with computer simulations of the experimental system for hypothesis testing. In particular, we tested the observed spatial colocalization of cells in the bone marrow against the hypothesis that cells are found within available areas at positions that were drawn from a uniform random number distribution. We find that B cells and plasma cells highly colocalize with stromal cells, to an extent larger than in the simulated random situation. While B cells are preferentially in contact with each other, i.e., form clusters among themselves, plasma cells seem to be solitary or organized in aggregates, i.e., loosely defined groups of cells that are not necessarily in direct contact. Our data suggest that the plasma cell bone marrow survival niche facilitates colocalization of plasma cells with stromal cells and eosinophils, respectively, promoting plasma cell longevity. © 2015 International Society for Advancement of Cytometry.
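
    The null hypothesis described above lends itself to a direct Monte Carlo test: re-draw one cell population uniformly at random and ask how often the observed proximity arises by chance. A sketch under a simplified geometry (a plain rectangle rather than the segmented tissue area used in the paper):

```python
import numpy as np
from scipy.spatial import cKDTree

def colocalization_p_value(cells_a, cells_b, area=(1000, 1000),
                           n_sim=1000, seed=0):
    """How often does random uniform repositioning of population A give
    a mean nearest-neighbour distance to B at least as small as that
    observed? Small p-value => colocalization beyond chance."""
    rng = np.random.default_rng(seed)
    tree_b = cKDTree(cells_b)
    observed = tree_b.query(cells_a)[0].mean()
    hits = 0
    for _ in range(n_sim):
        fake_a = rng.uniform([0, 0], area, size=cells_a.shape)
        if tree_b.query(fake_a)[0].mean() <= observed:
            hits += 1
    return hits / n_sim

rng = np.random.default_rng(1)
stromal = rng.uniform(0, 1000, (200, 2))
plasma = stromal[:50] + rng.normal(0, 10, (50, 2))   # plasma cells near stroma
print(colocalization_p_value(plasma, stromal))        # ~0.0
```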

  6. Quantitative Analysis of the Effective Functional Structure in Yeast Glycolysis

    PubMed Central

    De la Fuente, Ildefonso M.; Cortes, Jesus M.

    2012-01-01

    The understanding of the effective functionality that governs the enzymatic self-organized processes in cellular conditions is a crucial topic in the post-genomic era. In recent studies, Transfer Entropy has been proposed as a rigorous, robust and self-consistent method for the causal quantification of the functional information flow among nonlinear processes. Here, in order to quantify the functional connectivity for the glycolytic enzymes in dissipative conditions we have analyzed different catalytic patterns using the technique of Transfer Entropy. The data were obtained by means of a yeast glycolytic model formed by three delay differential equations where the enzymatic rate equations of the irreversible stages have been explicitly considered. These enzymatic activity functions were previously modeled and tested experimentally by other different groups. The results show the emergence of a new kind of dynamical functional structure, characterized by changing connectivity flows and a metabolic invariant that constrains the activity of the irreversible enzymes. In addition to the classical topological structure characterized by the specific location of enzymes, substrates, products and feedback-regulatory metabolites, an effective functional structure emerges in the modeled glycolytic system, which is dynamical and characterized by notable variations of the functional interactions. The dynamical structure also exhibits a metabolic invariant which constrains the functional attributes of the enzymes. Finally, in accordance with the classical biochemical studies, our numerical analysis reveals in a quantitative manner that the enzyme phosphofructokinase is the key-core of the metabolic system, behaving for all conditions as the main source of the effective causal flows in yeast glycolysis. PMID:22393350
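
    For orientation, a minimal histogram-based Transfer Entropy estimator is shown below. It conveys the quantity being computed; serious use, as in the paper, requires careful discretization and bias control, and the coupled test signals here are our own toy example rather than the glycolytic model.

```python
import numpy as np

def transfer_entropy(x, y, bins=8):
    """Histogram estimate of TE(Y -> X) = I(x_{t+1}; y_t | x_t) in bits:
    how much the past of Y reduces uncertainty about the next value of X
    beyond what X's own past already provides."""
    xf, xp, yp = x[1:], x[:-1], y[:-1]
    joint, _ = np.histogramdd((xf, xp, yp), bins=bins)
    p = joint / joint.sum()
    p_xp_yp = p.sum(axis=0)             # p(x_t, y_t)
    p_xf_xp = p.sum(axis=2)             # p(x_{t+1}, x_t)
    p_xp = p.sum(axis=(0, 2))           # p(x_t)
    te = 0.0
    for i in range(bins):
        for j in range(bins):
            for k in range(bins):
                if p[i, j, k] > 0:
                    num = p[i, j, k] * p_xp[j]
                    den = p_xf_xp[i, j] * p_xp_yp[j, k]
                    te += p[i, j, k] * np.log2(num / den)
    return te

# Coupled pair: x follows y with one step of lag
rng = np.random.default_rng(0)
y = rng.normal(size=5001)
x = np.roll(y, 1) + 0.3 * rng.normal(size=5001)
print(transfer_entropy(x, y), transfer_entropy(y, x))   # first >> second
```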

  7. A life cycle cost analysis framework for geologic storage of hydrogen : a scenario analysis.

    SciTech Connect

    Kobos, Peter Holmes; Lord, Anna Snider; Borns, David James

    2010-10-01

    The U.S. Department of Energy has an interest in large scale hydrogen geostorage, which would offer substantial buffer capacity to meet possible disruptions in supply. Geostorage options being considered are salt caverns, depleted oil/gas reservoirs, aquifers and potentially hard rock caverns. DOE has an interest in assessing the geological, geomechanical and economic viability for these types of hydrogen storage options. This study has developed an economic analysis methodology to address costs entailed in developing and operating an underground geologic storage facility. This year the tool was updated specifically to (1) produce a fully arrayed version such that all four types of geologic storage options can be assessed at the same time, (2) incorporate specific scenarios illustrating the model's capability, and (3) incorporate more accurate model input assumptions for the wells and storage site modules. Drawing from the knowledge gained from underground large scale geostorage of natural gas and petroleum in the U.S. and from the potential to store relatively large volumes of CO{sub 2} in geological formations, the hydrogen storage assessment modeling will continue to build on these strengths while maintaining modeling transparency such that other modeling efforts may draw from this project.
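
    As a rough illustration of what such a life-cycle cost framework computes, the sketch below levelizes capital and operating costs over delivered hydrogen using a standard capital recovery factor. All input numbers are placeholders, not outputs of the Sandia model.

```python
def levelized_cost(capex, opex_per_yr, kg_h2_per_yr, rate=0.10, years=30):
    """Levelized cost ($/kg) of hydrogen cycled through a storage facility."""
    # capital recovery factor annualizes the up-front investment
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
    return (capex * crf + opex_per_yr) / kg_h2_per_yr

# placeholder inputs: $50M capex, $2M/yr O&M, 5,000 t of H2 cycled per year
print(levelized_cost(capex=50e6, opex_per_yr=2e6, kg_h2_per_yr=5e6))
```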

  8. SPITZER IRAC DETECTION AND ANALYSIS OF SHOCKED MOLECULAR HYDROGEN EMISSION

    SciTech Connect

    Ybarra, Jason E.; Lada, Elizabeth A.

    2009-04-10

    We use statistical equilibrium equations to investigate the Infrared Array Camera (IRAC) color space of shocked molecular hydrogen. The location of shocked H{sub 2} in [3.6] - [4.5] versus [4.5] - [5.8] color is determined by the gas temperature and density of neutral atomic hydrogen. We find that high excitation H{sub 2} emission falls in a unique location in the color-color diagram and can unambiguously be distinguished from stellar sources. In addition to searching for outflows, we show that the IRAC data can be used to map the thermal structure of the shocked gas. We analyze archival Spitzer data of Herbig-Haro object HH 54 and create a temperature map, which is consistent with spectroscopically determined temperatures.
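
    For readers reproducing such color-color diagrams, IRAC magnitudes follow from flux densities via the usual relation m = -2.5 log10(F/F0). The sketch below assumes fluxes in Jy and uses the commonly quoted approximate IRAC zero-magnitude flux densities; it is not code from the paper.

```python
import math

# approximate IRAC zero-magnitude flux densities, Jy (commonly quoted values)
F0 = {"3.6": 280.9, "4.5": 179.7, "5.8": 115.0}

def irac_mag(flux_jy, band):
    """Vega magnitude in an IRAC band from a flux density in Jy."""
    return -2.5 * math.log10(flux_jy / F0[band])

def irac_colors(f36, f45, f58):
    """Return the ([3.6]-[4.5], [4.5]-[5.8]) colors used in the diagram."""
    return (irac_mag(f36, "3.6") - irac_mag(f45, "4.5"),
            irac_mag(f45, "4.5") - irac_mag(f58, "5.8"))

print(irac_colors(0.010, 0.012, 0.015))   # placeholder fluxes in Jy
```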

  9. Multi-criteria analysis on how to select solar radiation hydrogen production system

    NASA Astrophysics Data System (ADS)

    Badea, G.; Naghiu, G. S.; Felseghi, R.-A.; Rǎboacǎ, S.; Aşchilean, I.; Giurca, I.

    2015-12-01

    The purpose of this article is to present a method of selecting hydrogen-production systems using the electric power obtained in photovoltaic systems; as the selection method, we suggest the use of the Advanced Multi-Criteria Analysis based on the FRISCO formula. According to the case study on how to select the solar radiation hydrogen production system, the most convenient alternative is alternative A4, namely the technical solution involving a hydrogen production system based on the electrolysis of water vapor obtained with concentrated solar thermal systems and electrical power obtained using concentrating photovoltaic systems.

  10. Multi-criteria analysis on how to select solar radiation hydrogen production system

    SciTech Connect

    Badea, G.; Naghiu, G. S. Felseghi, R.-A.; Giurca, I.; Răboacă, S.; Aşchilean, I.

    2015-12-23

    The purpose of this article is to present a method of selecting hydrogen-production systems using the electric power obtained in photovoltaic systems; as the selection method, we suggest the use of the Advanced Multi-Criteria Analysis based on the FRISCO formula. According to the case study on how to select the solar radiation hydrogen production system, the most convenient alternative is alternative A4, namely the technical solution involving a hydrogen production system based on the electrolysis of water vapor obtained with concentrated solar thermal systems and electrical power obtained using concentrating photovoltaic systems.

  11. Thermodynamic analysis of alternate energy carriers, hydrogen and chemical heat pipes

    NASA Technical Reports Server (NTRS)

    Cox, K. E.; Carty, R. H.; Conger, W. L.; Soliman, M. A.; Funk, J. E.

    1976-01-01

    Hydrogen and chemical heat pipes were proposed as methods of transporting energy from a primary energy source (nuclear, solar) to the user. In the chemical heat pipe system, primary energy is transformed into the energy of a reversible chemical reaction; the chemical species are then transmitted or stored until the energy is required. Analysis of thermochemical hydrogen schemes and chemical heat pipe systems on a second-law efficiency or available-work basis shows that hydrogen is superior, especially if the end use of the chemical heat pipe is electrical power.

  12. Determination of hydrogen in niobium by cold neutron prompt gamma ray activation analysis and neutron incoherent scattering

    SciTech Connect

    R.L. Paul; H.H. Chen-Mayer; G.R. Myneni

    2002-11-01

    The presence of trace amounts of hydrogen in niobium is believed to have a detrimental effect on the mechanical and superconducting properties. Unfortunately, few techniques are capable of measuring hydrogen at these levels. We have developed two techniques for measuring hydrogen in materials. Cold neutron prompt gamma-ray activation analysis (PGAA) has proven useful for the determination of hydrogen and other elements in a wide variety of materials. Neutron incoherent scattering (NIS), a complementary tool to PGAA, has been used to measure trace hydrogen in titanium. Both techniques were used to study the effects of vacuum heating and chemical polishing on the hydrogen content of superconducting niobium.

  13. Quantitative PCR analysis of salivary pathogen burden in periodontitis

    PubMed Central

    Salminen, Aino; Kopra, K. A. Elisa; Hyvärinen, Kati; Paju, Susanna; Mäntylä, Päivi; Buhlin, Kåre; Nieminen, Markku S.; Sinisalo, Juha; Pussinen, Pirkko J.

    2015-01-01

    Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease (CAD) diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR (qPCR). Median salivary concentrations of Porphyromonas gingivalis, Tannerella forsythia, and Prevotella intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary Aggregatibacter actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4–5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39–4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51–4.52). The highest OR 3.59 (95% CI 1.94–6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and

  14. Quantitative PCR analysis of salivary pathogen burden in periodontitis.

    PubMed

    Salminen, Aino; Kopra, K A Elisa; Hyvärinen, Kati; Paju, Susanna; Mäntylä, Päivi; Buhlin, Kåre; Nieminen, Markku S; Sinisalo, Juha; Pussinen, Pirkko J

    2015-01-01

    Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease (CAD) diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR (qPCR). Median salivary concentrations of Porphyromonas gingivalis, Tannerella forsythia, and Prevotella intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary Aggregatibacter actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4-5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39-4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51-4.52). The highest OR 3.59 (95% CI 1.94-6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and T

  15. Universal platform for quantitative analysis of DNA transposition.

    PubMed

    Pajunen, Maria I; Rasila, Tiina S; Happonen, Lotta J; Lamberg, Arja; Haapa-Paananen, Saija; Kiljunen, Saija; Savilahti, Harri

    2010-11-26

    Completed genome projects have revealed an astonishing diversity of transposable genetic elements, implying the existence of novel element families yet to be discovered from diverse life forms. Concurrently, several better understood transposon systems have been exploited as efficient tools in molecular biology and genomics applications. Characterization of new mobile elements and improvement of the existing transposition technology platforms warrant easy-to-use assays for the quantitative analysis of DNA transposition. Here we developed a universal in vivo platform for the analysis of transposition frequency with class II mobile elements, i.e., DNA transposons. For each particular transposon system, cloning of the transposon ends and the cognate transposase gene, in three consecutive steps, generates a multifunctional plasmid, which drives inducible expression of the transposase gene and includes a mobilisable lacZ-containing reporter transposon. The assay scores transposition events as blue microcolonies, papillae, growing within otherwise whitish Escherichia coli colonies on indicator plates. We developed the assay using phage Mu transposition as a test model and validated the platform using various MuA transposase mutants. For further validation and to illustrate universality, we introduced IS903 transposition system components into the assay. The developed assay is adjustable to a desired level of initial transposition via the control of a plasmid-borne E. coli arabinose promoter. In practice, the transposition frequency is modulated by varying the concentration of arabinose or glucose in the growth medium. We show that variable levels of transpositional activity can be analysed, thus enabling straightforward screens for hyper- or hypoactive transposase mutants, regardless of the original wild-type activity level. The established universal papillation assay platform should be widely applicable to a variety of mobile elements. It can be used for mechanistic

  16. Numerical and experimental analysis of propane-hydrogen mixture ignition in air

    NASA Astrophysics Data System (ADS)

    Sevrouk, K. L.; Krivosheyev, P. N.; Penyazkov, O. G.; Torohov, S. A.; Titova, N. S.; Starik, A. M.

    2016-11-01

    The addition of hydrogen to various hydrocarbon fuels is being examined as a promising method for increasing engine efficiency while improving emission characteristics. This work is dedicated to experimental investigation of the ignition delay time of C3H8-H2 mixtures in air, and to analysis of the mechanisms responsible for the acceleration of chain reactions upon the addition of hydrogen to propane, based on numerical simulation.

  17. Fracture mechanics analysis of a high-pressure hydrogen facility compressor

    NASA Technical Reports Server (NTRS)

    Vroman, G. A.

    1974-01-01

    The investigation and analysis of a high-pressure hydrogen facility compressor is chronicled, and a life prediction based on fracture mechanics is presented. Crack growth rates in SA 105 Gr II steel are developed for the condition of sustained loading, using a hypothesis of hydrogen embrittlement associated with plastic zone reverse yielding. The resultant formula is compared with test data obtained from laboratory specimens.

  18. A quantitative scale for the degree of aromaticity and antiaromaticity: a comparison of theoretical and experimental enthalpies of hydrogenation.

    PubMed

    Mucsi, Zoltán; Viskolcz, Béla; Csizmadia, Imre G

    2007-02-15

    Chemical structures and transition states are often influenced by aromatic stabilization or antiaromatic destabilizing effects, which are not easy to characterize theoretically. The exact description and precise quantification of the aromatic characteristics of ring structures is difficult and requires special theoretical investigation. The present paper suggests a novel, yet simple, method to quantify both aromatic and antiaromatic qualities on the same linear scale, by using the experimentally measured or theoretically computed enthalpy of the hydrogenation reaction of the compound examined [ΔH_H2(examined)]. A reference hydrogenation reaction is also considered on a corresponding nonaromatic reference compound [ΔH_H2(reference)] to cancel all secondary structure destabilization factors, such as ring strain or double-bond strain. From these data the relative enthalpy of hydrogenation is easily calculated: ΔΔH_H2 = ΔH_H2(examined) - ΔH_H2(reference). In the present concept, the ΔΔH_H2 value of benzene defines completely aromatic character (+100%), and closed-shell singlet cyclobutadiene represents maximum antiaromaticity (-100%). The component ΔH_H2 values were computed at different levels of theory, offering a computationally "method-independent" measure of aromaticity. A total of 28 well-known aromatic, antiaromatic and nonaromatic, neutral and charged compounds were examined to demonstrate the efficiency of this methodology. Finally, a correlation was made between the calculated aromaticity percentage of the compounds examined and their popular Schleyer NICS values.
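
    In cleaner notation, the scale sketched in the abstract reads as follows. The piecewise normalization is a plausible reading of the +100%/-100% anchors, not necessarily the authors' exact formula.

```latex
\Delta\Delta H_{\mathrm{H_2}}
  = \Delta H_{\mathrm{H_2}}(\text{examined})
  - \Delta H_{\mathrm{H_2}}(\text{reference}),
\qquad
\text{Ar}\% \approx 100 \times
\begin{cases}
 \Delta\Delta H_{\mathrm{H_2}} \big/ \Delta\Delta H_{\mathrm{H_2}}(\text{benzene}),
   & \Delta\Delta H_{\mathrm{H_2}} \ge 0,\\[2pt]
 -\,\Delta\Delta H_{\mathrm{H_2}} \big/ \Delta\Delta H_{\mathrm{H_2}}(\text{cyclobutadiene}),
   & \Delta\Delta H_{\mathrm{H_2}} < 0,
\end{cases}
```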

  19. Quantitative proton nuclear magnetic resonance for the structural and quantitative analysis of atropine sulfate.

    PubMed

    Shen, Shi; Yao, Jing; Shi, Yaqin

    2014-02-01

    This study assessed a general method of quantitative nuclear magnetic resonance (qNMR) for the calibration of atropine sulfate (Active Pharmaceutical Ingredient, API) as a reference standard. The spectra were acquired in D2O using maleic acid as the internal standard. The conformational behavior of the tropane ring was observed and studied by means of NMR and ROESY experiments at different temperatures, which showed that the N-methyl group is at equilibrium between axial and equatorial conformations at room temperature. The relaxation delay and the signals monitored in the qNMR experiments were optimized for quantification. The study reported here validated the method's linearity, range, limit of quantification, stability and precision. The results were consistent with those obtained from the mass balance approach.
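
    The internal-standard relation underlying such a calibration is the standard qNMR purity equation. The sketch below applies it with placeholder integrals, proton counts and weighed masses (only the molar masses are real values); these are not the study's data.

```python
def qnmr_purity(I_s, I_ref, N_s, N_ref, M_s, M_ref, m_s, m_ref, P_ref):
    """Analyte purity from integrals (I), proton counts (N), molar masses
    (M, g/mol), weighed masses (m, mg) and internal-standard purity P_ref."""
    return (I_s / I_ref) * (N_ref / N_s) * (M_s / M_ref) * (m_ref / m_s) * P_ref

# placeholder example: one well-resolved proton each, equal 10 mg weighings,
# atropine sulfate monohydrate ~694.8 g/mol vs. maleic acid 116.07 g/mol
print(qnmr_purity(0.167, 1.00, 1, 1, 694.8, 116.07, 10.0, 10.0, 0.999))  # ~1.0
```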

  20. Analysis of Improved Reference Design for a Nuclear-Driven High Temperature Electrolysis Hydrogen Production Plant

    SciTech Connect

    Edwin A. Harvego; James E. O'Brien; Michael G. McKellar

    2010-06-01

    The use of High Temperature Electrolysis (HTE) for the efficient production of hydrogen without the greenhouse gas emissions associated with conventional fossil-fuel hydrogen production techniques has been under investigation at the Idaho National Laboratory (INL) for the last several years. The activities at the INL have included the development, testing and analysis of large numbers of solid oxide electrolysis cells, and analyses of potential plant designs for large scale production of hydrogen using an advanced Very-High Temperature Reactor (VHTR) to provide the process heat and electricity to drive the electrolysis process. The results of these system analyses, using the UniSim process analysis software, have shown that the HTE process, when coupled to a VHTR capable of operating at reactor outlet temperatures of 800 °C to 950 °C, has the potential to produce the large quantities of hydrogen needed to meet future energy and transportation needs, with hydrogen production efficiencies in excess of 50%. In addition, economic analyses performed on the INL reference plant design, optimized to maximize the hydrogen production rate for a 600 MWt VHTR, have shown that a large nuclear-driven HTE hydrogen production plant can be economically competitive with conventional hydrogen production processes, particularly when the penalties associated with greenhouse gas emissions are considered. The results of this research led to the selection in 2009 of HTE as the preferred concept in the U.S. Department of Energy (DOE) hydrogen technology down-selection process. However, the down-selection process, along with continued technical assessments at the INL, has resulted in a number of proposed modifications and refinements to improve the original INL reference HTE design. These modifications include changes in plant configuration, operating conditions and individual component designs. This paper describes the resulting new INL reference design and presents

  1. A Quantitative Analysis of the Behavioral Checklist of the Movement ABC Motor Test

    ERIC Educational Resources Information Center

    Ruiz, Luis Miguel; Gomez, Marta; Graupera, Jose Luis; Gutierrez, Melchor; Linaza, Jose Luis

    2007-01-01

    The fifth section of the Henderson and Sugden's Movement ABC Checklist is part of the general Checklist that accompanies The Movement ABC Battery. The authors maintain that the analysis of this section must be mainly qualitative instead of quantitative. The main objective of this study was to employ a quantitative analysis of this behavioural…

  2. The Utility of Quantitative Methods for Political Intelligence Analysis. A Case Study in Latin America

    DTIC Science & Technology

    1995-10-20

    The paper examines the suitability of current intelligence analysis developed during the Cold War era and finds a lack of quantitative techniques...that are prevalent in academic social science research. Several areas where quantitative research might be applied successfully to intelligence analysis are

  3. [Bibliometric analysis of bacterial quantitative proteomics in English literatures].

    PubMed

    Zhang, Xin; She, Danyang; Liu, Youning; Wang, Rui; Di, Xiuzhen; Liang, Beibei; Wang, Yue

    2014-07-01

    To analyze the worldwide advances in bacterial quantitative proteomics over the past fifteen years with a bibliometric approach. Literature retrieval was conducted throughout the databases of PubMed, Embase and Science Citation Index (SCI), using "bacterium" and "quantitative proteomics" as the key words, with a search cutoff of July 2013. We sorted and analyzed these articles with EndNote X6 by publication year, first author, journal name, publishing institution, citation frequency and publication type. 932 English articles were included in our research after deleting duplicates. The first article on bacterial quantitative proteomics was published in 1999, and annual output peaked at 163 articles in 2012. Up to July 2013, authors from more than 23 countries and regions had published articles in this field; China ranks fourth. The main publication type is original articles. The most frequently cited article is entitled "Absolute quantification of proteins by LCMSE: a virtue of parallel MS acquisition" by Silva JC, Gorenstein MV, Li GZ, et al. in Mol Cell Proteomics 2006. The most productive author is Smith RD from the Biological Sciences Division, Pacific Northwest National Laboratory. The top journal publishing bacterial quantitative proteomics is Proteomics. More and more researchers are paying attention to quantitative proteomics, which will be widely used in bacteriology.

  4. FUNDAMENTAL SAFETY TESTING AND ANALYSIS OF HYDROGEN STORAGE MATERIALS AND SYSTEMS

    SciTech Connect

    Anton, D

    2007-05-01

    Hydrogen is seen as a future automobile energy storage medium due to its inherent cleanliness upon oxidation and its ready utilization in fuel cell applications. Its physical storage in light-weight, low-volume systems is a key technical requirement. In searching for ever higher gravimetric and volumetric density hydrogen storage materials and systems, it is inevitable that higher energy density materials will be studied and used. To make safe and commercially acceptable systems, it is important to understand quantitatively the risks involved in using and handling these materials and to develop appropriate risk mitigation strategies to handle unforeseen accidental events. To evaluate these materials and systems, an IPHE-sanctioned program was initiated in 2006, partnering laboratories from Europe, North America and Japan. The objective of this international program is to understand the physical risks involved in the synthesis, handling and utilization of solid state hydrogen storage materials and to develop methods to mitigate these risks. This understanding will support ultimate acceptance of commercial high density hydrogen storage system designs. An overview of the approaches to be taken to achieve this objective is given. Initial experimental results are presented on environmental exposure of NaAlH{sub 4}, a candidate high density hydrogen storage compound. The tests shown are based on United Nations recommendations for the transport of hazardous materials and include air and water exposure of the hydride at three hydrogen charge levels in various physical configurations. Additional tests developed by the American Society for Testing and Materials were used to quantify the dust cloud ignition characteristics of this material, which may result from accidental high energy impacts and system breach. Results of these tests are shown along with necessary risk mitigation techniques used in the synthesis and fabrication of a prototype hydrogen storage

  5. Geographical classification of Epimedium based on HPLC fingerprint analysis combined with multi-ingredients quantitative analysis.

    PubMed

    Xu, Ning; Zhou, Guofu; Li, Xiaojuan; Lu, Heng; Meng, Fanyun; Zhai, Huaqiang

    2017-05-01

    A reliable and comprehensive method for identifying the origin and assessing the quality of Epimedium has been developed. The method is based on analysis of HPLC fingerprints, combined with similarity analysis, hierarchical cluster analysis (HCA), principal component analysis (PCA) and multi-ingredient quantitative analysis. Nineteen batches of Epimedium, collected from different areas in the western regions of China, were used to establish the fingerprints and 18 peaks were selected for the analysis. Similarity analysis, HCA and PCA all classified the 19 areas into three groups. Simultaneous quantification of the five major bioactive ingredients in the Epimedium samples was also carried out to confirm the consistency of the quality tests. These methods were successfully used to identify the geographical origin of the Epimedium samples and to evaluate their quality. Copyright © 2016 John Wiley & Sons, Ltd.
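
    A minimal sketch of the HCA/PCA step on a peak-area matrix is shown below; the 19 x 18 shape follows the abstract, while the data themselves are random placeholders.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.random((19, 18))          # 19 batches x 18 fingerprint peak areas

scores = PCA(n_components=2).fit_transform(X)   # 2-D projection for plotting
groups = fcluster(linkage(X, method="ward"), t=3, criterion="maxclust")
print(groups)                     # three groups, as reported in the study
```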

  6. Quantitative Analysis by Isotopic Dilution Using Mass Spectroscopy: The Determination of Caffeine by GC-MS.

    ERIC Educational Resources Information Center

    Hill, Devon W.; And Others

    1988-01-01

    Describes a laboratory technique for quantitative analysis of caffeine by an isotopic dilution method for coupled gas chromatography-mass spectroscopy. Discusses caffeine analysis and experimental methodology. Lists sample caffeine concentrations found in common products. (MVL)
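
    In its simplest form, isotope-dilution quantitation reduces to scaling the spiked amount by the unlabeled/labeled ion-abundance ratio. The sketch below assumes equal instrument response for both isotopologues and no spectral cross-contribution, which real methods calibrate for; all numbers are made up.

```python
def isotope_dilution_amount(area_analyte, area_label, spike_amount_ug):
    """Analyte amount (ug) from the unlabeled/labeled ion-area ratio."""
    return (area_analyte / area_label) * spike_amount_ug

# e.g. an area ratio of 1.8 against a 50 ug labeled spike -> 90 ug caffeine
print(isotope_dilution_amount(1.8e6, 1.0e6, 50.0))
```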

  8. Hydrogen Energy Storage (HES) and Power-to-Gas Economic Analysis; NREL (National Renewable Energy Laboratory)

    SciTech Connect

    Eichman, Joshua

    2015-07-30

    This presentation summarizes opportunities for hydrogen energy storage and power-to-gas and presents the results of a market analysis performed by the National Renewable Energy Laboratory to quantify the value of energy storage. Hydrogen energy storage and power-to-gas systems have the ability to integrate multiple energy sectors, including electricity, transportation, and industry. On account of the flexibility of hydrogen systems, there are a variety of potential system configurations. Each configuration will provide different value to the owner, customers and grid system operator. This presentation provides an economic comparison of hydrogen storage, power-to-gas and conventional storage systems. The total cost is compared to the revenue from participation in a variety of markets to assess economic competitiveness. It is found that the sale of hydrogen for transportation or industrial use greatly increases competitiveness. Electrolyzers operating as demand response devices (i.e., selling hydrogen and grid services) are economically competitive, while hydrogen storage systems that input electricity and output only electricity have an unfavorable business case. Additionally, tighter integration with the grid provides greater revenue (energy, ancillary service and capacity markets are explored). Lastly, additional hours of storage capacity are not necessarily more competitive in current energy and ancillary service markets, and electricity markets will require new mechanisms to appropriately compensate long-duration storage devices.

  9. Hydrogen Fuel Cell Analysis: Lessons Learned from Stationary Power Generation Final Report

    SciTech Connect

    Scott E. Grasman; John W. Sheffield; Fatih Dogan; Sunggyu Lee; Umit O. Koylu; Angie Rolufs

    2010-04-30

    This study considered opportunities for hydrogen in stationary applications in order to make recommendations related to RD&D strategies that incorporate lessons learned and best practices from relevant national and international stationary power efforts, as well as cost and environmental modeling of pathways. The study analyzed the different strategies utilized in power generation systems and identified the different challenges and opportunities for producing and using hydrogen as an energy carrier. Specific objectives included both a synopsis/critical analysis of lessons learned from previous stationary power programs and recommendations for a strategy for hydrogen infrastructure deployment. This strategy incorporates all hydrogen pathways and a combination of distributed power generating stations, and provides an overview of stationary power markets, benefits of hydrogen-based stationary power systems, and competitive and technological challenges. The motivation for this project was to identify the lessons learned from prior stationary power programs, including the most significant obstacles, how these obstacles have been approached, outcomes of the programs, and how this information can be used by the Hydrogen, Fuel Cells & Infrastructure Technologies Program to meet program objectives primarily related to hydrogen pathway technologies (production, storage, and delivery) and implementation of fuel cell technologies for distributed stationary power. In addition, the lessons learned address environmental and safety concerns, including codes and standards, and education of key stakeholders.

  10. Quantitative analysis of norfloxacin by 1H NMR and HPLC.

    PubMed

    Frackowiak, Anita; Kokot, Zenon J

    2012-01-01

    1H NMR and previously developed HPLC methods were applied to the quantitative determination of norfloxacin in a veterinary solution for pigeons. Changes in concentration can lead to significant changes in the 1H chemical shifts of non-exchangeable aromatic protons as a result of extensive self-association phenomena. This chemical shift variation of the protons was analyzed and applied in the quantitative determination of norfloxacin. The method is simple, rapid, precise and accurate, and can be used for quality control of this drug.

  11. Quantitative Analysis of Circular Symmetry of Venus Coronae and Craters

    NASA Astrophysics Data System (ADS)

    Stoddard, P. R.; Jurdy, D. M.

    2007-12-01

    The origin of craters has long been debated: Exogenic or endogenic? Impact or volcanic? While for the craters of the Earth and Moon the issue has been largely resolved, it has flared anew in recent papers by Hamilton (2005, 2007), Vita-Finzi et al. (2005), and Jurdy and Stoddard (2005, 2007). We weigh in with a quantitative technique to differentiate between these possible mechanisms. Craters by their nature are circular. They are excavated by a roughly hemispherical shock wave, and thus, almost regardless of impact angle, will be round rim-and-basin structures (Melosh, 1989). Although underlying structural features, such as faults, and later tectonic deformation can affect crater shape, we suggest that the strongest test of an impact origin for coronae is the circularity of these features. Here we introduce an approach for the assessment of a feature's circular symmetry. Using altimetry data we compare, by cross-correlation, multiple profiles across a single feature. Jurdy and Stoddard (2005) provided an example in which Mead crater and two coronae were analyzed. They found that for each corona, profiles cross-correlated at only 25-30% of perfect cross-correlation. Profiles for Mead crater, however, correlated at a much higher level, 80%. Here, we perform an expanded study of features generally classified as craters, and others whose classification as coronae has been questioned by Hamilton (2007). We choose only the largest craters, since altimetry data are too coarse to allow enough data points for analyses of smaller features, and also because they are of similar size to the coronae in our study. For each feature, 36 profiles are extracted from the altimetry data, de-sloped, and averaged together. The individual profiles are then correlated against the average, and the correlations themselves are averaged to give an assessment of circular symmetry. Results indicate accepted craters have the highest correlation averages (are most circular) and
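
    The circularity measure described above is straightforward to restate in code. The sketch below assumes profile extraction from the altimetry grids is done elsewhere and that each profile is sampled on a common radial grid.

```python
import numpy as np

def circularity(profiles):
    """Mean correlation of each radial profile against the average profile.
    profiles: (n_profiles, n_points) array sampled on a common radial grid."""
    profiles = np.asarray(profiles, dtype=float)
    x = np.arange(profiles.shape[1])
    # de-slope: remove a linear trend from each profile
    detrended = np.array([p - np.polyval(np.polyfit(x, p, 1), x)
                          for p in profiles])
    mean_prof = detrended.mean(axis=0)
    corrs = [np.corrcoef(p, mean_prof)[0, 1] for p in detrended]
    # ~0.8 reported for Mead crater vs. ~0.25-0.30 for the two coronae
    return float(np.mean(corrs))
```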

  12. Quantitative analysis of autophagy using advanced 3D fluorescence microscopy.

    PubMed

    Changou, Chun A; Wolfson, Deanna L; Ahluwalia, Balpreet Singh; Bold, Richard J; Kung, Hsing-Jien; Chuang, Frank Y S

    2013-05-03

    Prostate cancer is the leading form of malignancy among men in the U.S. While surgery carries a significant risk of impotence and incontinence, traditional chemotherapeutic approaches have been largely unsuccessful. Hormone therapy is effective at an early stage, but often fails with the eventual development of hormone-refractory tumors. We have been interested in developing therapeutics targeting a specific metabolic deficiency of tumor cells. We recently showed that prostate tumor cells specifically lack an enzyme (argininosuccinate synthase, or ASS) involved in the synthesis of the amino acid arginine(1). This condition causes the tumor cells to become dependent on exogenous arginine, and they undergo metabolic stress when free arginine is depleted by arginine deiminase (ADI)(1,10). Indeed, we have shown that human prostate cancer cells CWR22Rv1 are effectively killed by ADI with caspase-independent apoptosis and aggressive autophagy (or macroautophagy)(1,2,3). Autophagy is an evolutionarily conserved process that allows cells to metabolize unwanted proteins by lysosomal breakdown during nutritional starvation(4,5). Although the essential components of this pathway are well-characterized(6,7,8,9), many aspects of the molecular mechanism are still unclear - in particular, what is the role of autophagy in the death-response of prostate cancer cells after ADI treatment? In order to address this question, we required an experimental method to measure the level and extent of autophagic response in cells - and since there are no known molecular markers that can accurately track this process, we chose to develop an imaging-based approach, using quantitative 3D fluorescence microscopy(11,12). Using CWR22Rv1 cells specifically labeled with fluorescent probes for autophagosomes and lysosomes, we show that 3D image stacks acquired with either widefield deconvolution microscopy (and later, with super-resolution, structured-illumination microscopy) can clearly capture the early

  13. Microchromatography of hemoglobins. VIII. A general qualitative and quantitative method in plastic drinking straws and the quantitative analysis of Hb-F.

    PubMed

    Schroeder, W A; Pace, L A

    1978-03-01

    The microchromatographic procedure for the quantitative analysis of the hemoglobin components in a hemolysate uses columns of DEAE-cellulose in a plastic drinking straw with a glycine-KCN-NaCl developer. Not only may the method be used for the quantitative analysis of Hb-F but also for the analysis of the varied components in mixtures of hemoglobins.

  14. PNNL Development and Analysis of Material-Based Hydrogen Storage Systems for the Hydrogen Storage Engineering Center of Excellence

    SciTech Connect

    Brooks, Kriston P.; Alvine, Kyle J.; Johnson, Kenneth I.; Klymyshyn, Nicholas A.; Pires, Richard P.; Ronnebro, Ewa; Simmons, Kevin L.; Weimar, Mark R.; Westman, Matthew P.

    2016-02-29

    The Hydrogen Storage Engineering Center of Excellence is a team of universities, industrial corporations, and federal laboratories with the mandate to develop lower-pressure, materials-based hydrogen storage systems for hydrogen fuel cell light-duty vehicles. Although not engaged in the development of new hydrogen storage materials itself, the center addresses the engineering challenges associated with the currently available hydrogen storage materials. Three material-based approaches to hydrogen storage are being researched: 1) chemical hydrogen storage materials, 2) cryo-adsorbents, and 3) metal hydrides. As a member of this Center, Pacific Northwest National Laboratory (PNNL) has been involved in the design and evaluation of systems developed with each of these three hydrogen storage materials. This report is a compilation of the work performed by PNNL for this Center.

  15. Hydrogen peroxide agarose gels for electrophoretic analysis of RNA.

    PubMed

    Pandey, Renu; Saluja, Daman

    2017-10-01

    Efficient electrophoretic separation of isolated total RNA relies on chemicals and agents that help maintain a nuclease-free environment. However, cost, extensive pre-run processing protocols, and toxic byproducts limit the use of such protocols. Moreover, these treatments affect the overall electrophoretic results by altering the conductivity of the running buffer and weakening the gel strength. We here provide a protocol for RNA visualization that obviates these shortcomings by preparing the agarose gel with hydrogen peroxide in regular TAE buffer. The simple, inexpensive protocol exhibits superior results in horizontal agarose gel electrophoresis. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Resistance of constructional steel to fracture in hydrogen impregnation and hydrogen sulfide cracking

    SciTech Connect

    Savchenkov, E.A.

    1986-01-01

    This article discusses the resistance of constructional steel to fracture under hydrogen impregnation and hydrogen sulfide cracking. The greater the hydrogen concentration at the crack tip, the lower the failure stress intensity factor. The relationship between the new parameter of hydrogen resistance and the decohesion equations for hydrogen-impregnated metal is revealed through analysis of the threshold stresses of hydrogen sulfide cracking. The quantitative rules and fracture criteria established for hydrogen-impregnated steel show that the activity of hydrogen, which creates "hydrogen stresses" and adsorption effects, is of primary significance. The results show that the harmful influence of hydrogen may be neutralized by selecting appropriate alloying and steel treatment methods to reduce the activity of hydrogen, and that under certain conditions a positive effect of hydrogen impregnation may even be obtained. The author outlines several conclusions and shows that the failure criterion of constructional steels under hydrogen embrittlement may be expressed in explicit form through the probability of decohesion, the hydrogen activity and the original crack resistance.

  17. [Quantitative cartilage analysis with magnetic resonance tomography (qMRI)--a new era in arthrosis diagnosis?].

    PubMed

    Eckstein, F; Englmeier, K H; Reiser, M

    2002-06-01

    Magnetic resonance imaging (MRI) is a new and very powerful method for the diagnosis and monitoring of osteoarthritis. Its advantage is that all articular tissues can be visualized directly and are accessible for three-dimensional analysis. This article reviews qualitative, semi-quantitative, and quantitative studies of articular cartilage with MRI. In particular, we discuss pulse sequences and three-dimensional postprocessing methods for quantitative analysis of cartilage volume and thickness, along with their accuracy and precision in healthy volunteers and patients with osteoarthritis. In addition, we present approaches for quantitative analysis of structural/biochemical parameters and of the deformational behavior of cartilage in vivo.

  18. MOLD SPECIFIC QUANTITATIVE PCR: THE EMERGING STANDARD IN MOLD ANALYSIS

    EPA Science Inventory

    Today I will talk about the use of quantitative or Real time PCR for the standardized identification and quantification of molds. There are probably at least 100,000 species of molds or fungi. But there are actually about 100 typically found indoors. Some pose a threat to human...

  19. Teaching Quantitative Research Methods: A Quasi-Experimental Analysis.

    ERIC Educational Resources Information Center

    Bridges, George S.; Gillmore, Gerald M.; Pershing, Jana L.; Bates, Kristin A.

    1998-01-01

    Describes an experiment designed to introduce aspects of quantitative reasoning to a large, substantively-focused class in the social sciences. Reveals that participating students' abilities to interpret and manipulate empirical data increased significantly, independent of baseline SAT verbal and mathematics scores. Discusses implications for…

  20. Quantitative proteomic analysis by accurate mass retention time pairs.

    PubMed

    Silva, Jeffrey C; Denny, Richard; Dorschel, Craig A; Gorenstein, Marc; Kass, Ignatius J; Li, Guo-Zhong; McKenna, Therese; Nold, Michael J; Richardson, Keith; Young, Phillip; Geromanos, Scott

    2005-04-01

    Current methodologies for protein quantitation include 2-dimensional gel electrophoresis techniques, metabolic labeling, and stable isotope labeling methods to name only a few. The current literature illustrates both pros and cons for each of the previously mentioned methodologies. Keeping with the teachings of William of Ockham, "with all things being equal the simplest solution tends to be correct", a simple LC/MS based methodology is presented that allows relative changes in abundance of proteins in highly complex mixtures to be determined. Utilizing a reproducible chromatographic separations system along with the high mass resolution and mass accuracy of an orthogonal time-of-flight mass spectrometer, the quantitative comparison of tens of thousands of ions emanating from identically prepared control and experimental samples can be made. Using this configuration, we can determine the change in relative abundance of a small number of ions between the two conditions solely by accurate mass and retention time. Employing standard operating procedures for both sample preparation and ESI-mass spectrometry, one typically obtains under 5 ppm mass precision and quantitative variations between 10 and 15%. The principal focus of this paper will demonstrate the quantitative aspects of the methodology and continue with a discussion of the associated, complementary qualitative capabilities.
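
    A minimal sketch of the accurate mass-retention time matching idea follows; the 5 ppm and 0.2 min tolerances are illustrative, loosely guided by the precision quoted above, and the quadratic search is written for clarity rather than speed.

```python
def match_features(control, experiment, ppm_tol=5.0, rt_tol=0.2):
    """Match features (mz, rt_min, intensity) between two runs and return
    (mz, rt, experiment/control abundance ratio) for each match."""
    matches = []
    for mz_c, rt_c, int_c in control:
        for mz_e, rt_e, int_e in experiment:
            if (abs(mz_e - mz_c) / mz_c * 1e6 <= ppm_tol
                    and abs(rt_e - rt_c) <= rt_tol):
                matches.append((mz_c, rt_c, int_e / int_c))
                break
    return matches

# placeholder features: one peptide ion up-regulated 1.6-fold
print(match_features([(524.265, 31.4, 1.0e5)], [(524.266, 31.5, 1.6e5)]))
```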

  1. Teaching Quantitative Reasoning for Nonscience Majors through Carbon Footprint Analysis

    ERIC Educational Resources Information Center

    Boose, David L.

    2014-01-01

    Quantitative reasoning is a key intellectual skill, applicable across disciplines and best taught in the context of authentic, relevant problems. Here, I describe and assess a laboratory exercise that has students calculate their "carbon footprint" and evaluate the impacts of various behavior choices on that footprint. Students gather…

  3. Quantitative and Qualitative Analysis of Biomarkers in Fusarium verticillioides

    USDA-ARS?s Scientific Manuscript database

    In this study, a combination HPLC-DART-TOF-MS system was utilized to identify and quantitatively analyze carbohydrates in wild type and mutant strains of Fusarium verticillioides. Carbohydrate fractions were isolated from F. verticillioides cellular extracts by HPLC using a cation-exchange size-excl...

  5. Improvements in the gaseous hydrogen-water equilibration technique for hydrogen isotope ratio analysis

    USGS Publications Warehouse

    Coplen, T.B.; Wildman, J.D.; Chen, J.

    1991-01-01

    Improved precision in the H2-H2O equilibration method for δD analysis has been achieved in an automated system. Reduction of the 1-σ standard deviation of a single mass-spectrometer analysis to 1.3‰ is achieved by (1) bonding catalyst to glass rods and assigning use to specific equilibration chambers to monitor performance of the catalyst, (2) improving the apparatus design, and (3) reducing the H3+ contribution of the mass-spectrometer ion source. For replicate analysis of a water sample, the standard deviation improved to 0.8‰. H2S-bearing samples and samples as small as 0.1 mL can be analyzed routinely with this method.
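
    For reference, δD here is the conventional delta notation for the hydrogen isotope ratio; for waters the reference ratio is normally VSMOW, which the abstract does not state explicitly, so take that as an assumption:

```latex
\delta\mathrm{D} \;=\; \left(\frac{(\mathrm{D}/\mathrm{H})_{\text{sample}}}
{(\mathrm{D}/\mathrm{H})_{\text{reference}}} - 1\right)\times 1000\ \text{‰}
```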

  6. Shifts in metabolic hydrogen sinks in the methanogenesis-inhibited ruminal fermentation: a meta-analysis.

    PubMed

    Ungerfeld, Emilio M

    2015-01-01

    Maximizing the flow of metabolic hydrogen ([H]) in the rumen away from CH4 and toward volatile fatty acids (VFA) would increase the efficiency of ruminant production and decrease its environmental impact. The objectives of this meta-analysis were: (i) To quantify shifts in metabolic hydrogen sinks when inhibiting ruminal methanogenesis in vitro; and (ii) To understand the variation in shifts of metabolic hydrogen sinks among experiments and between batch and continuous cultures systems when methanogenesis is inhibited. Batch (28 experiments, N = 193) and continuous (16 experiments, N = 79) culture databases of experiments with at least 50% inhibition in CH4 production were compiled. Inhibiting methanogenesis generally resulted in less fermentation and digestion in most batch culture, but not in most continuous culture, experiments. Inhibiting CH4 production in batch cultures resulted in redirection of metabolic hydrogen toward propionate and H2 but not butyrate. In continuous cultures, there was no overall metabolic hydrogen redirection toward propionate or butyrate, and H2 as a proportion of metabolic hydrogen spared from CH4 production was numerically smaller compared to batch cultures. Dihydrogen accumulation was affected by type of substrate and methanogenesis inhibitor, with highly fermentable substrates resulting in greater redirection of metabolic hydrogen toward H2 when inhibiting methanogenesis, and some oils causing small or no H2 accumulation. In both batch and continuous culture, there was a decrease in metabolic hydrogen recovered as the sum of propionate, butyrate, CH4 and H2 when inhibiting methanogenesis, and it is speculated that as CH4 production decreases metabolic hydrogen could be increasingly incorporated into formate, microbial biomass, and perhaps, reductive acetogenesis in continuous cultures. Energetic benefits of inhibiting methanogenesis depended on the inhibitor and its concentration and on the in vitro system.
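
    The recovery bookkeeping behind such meta-analyses is usually the classical rumen fermentation stoichiometry (after Demeyer); the sketch below uses those standard coefficients, though the paper's exact accounting may differ.

```python
def h2_recovery(acetate, propionate, butyrate, ch4, h2):
    """Fraction of metabolic hydrogen produced that is recovered in sinks.
    All arguments in mol (or mmol) per unit of fermentation."""
    produced = 2 * acetate + propionate + 4 * butyrate        # gross [2H] generated
    consumed = 2 * propionate + 2 * butyrate + 4 * ch4 + h2   # [2H] in sinks
    return consumed / produced

# typical uninhibited fermentation recovers roughly 90% of [2H]
print(h2_recovery(acetate=60, propionate=20, butyrate=10, ch4=25, h2=1))
```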

  7. Shifts in metabolic hydrogen sinks in the methanogenesis-inhibited ruminal fermentation: a meta-analysis

    PubMed Central

    Ungerfeld, Emilio M.

    2015-01-01

    Maximizing the flow of metabolic hydrogen ([H]) in the rumen away from CH4 and toward volatile fatty acids (VFA) would increase the efficiency of ruminant production and decrease its environmental impact. The objectives of this meta-analysis were: (i) To quantify shifts in metabolic hydrogen sinks when inhibiting ruminal methanogenesis in vitro; and (ii) To understand the variation in shifts of metabolic hydrogen sinks among experiments and between batch and continuous cultures systems when methanogenesis is inhibited. Batch (28 experiments, N = 193) and continuous (16 experiments, N = 79) culture databases of experiments with at least 50% inhibition in CH4 production were compiled. Inhibiting methanogenesis generally resulted in less fermentation and digestion in most batch culture, but not in most continuous culture, experiments. Inhibiting CH4 production in batch cultures resulted in redirection of metabolic hydrogen toward propionate and H2 but not butyrate. In continuous cultures, there was no overall metabolic hydrogen redirection toward propionate or butyrate, and H2 as a proportion of metabolic hydrogen spared from CH4 production was numerically smaller compared to batch cultures. Dihydrogen accumulation was affected by type of substrate and methanogenesis inhibitor, with highly fermentable substrates resulting in greater redirection of metabolic hydrogen toward H2 when inhibiting methanogenesis, and some oils causing small or no H2 accumulation. In both batch and continuous culture, there was a decrease in metabolic hydrogen recovered as the sum of propionate, butyrate, CH4 and H2 when inhibiting methanogenesis, and it is speculated that as CH4 production decreases metabolic hydrogen could be increasingly incorporated into formate, microbial biomass, and perhaps, reductive acetogenesis in continuous cultures. Energetic benefits of inhibiting methanogenesis depended on the inhibitor and its concentration and on the in vitro system. PMID:25699029

  8. A new quantitative method for gunshot residue analysis by ion beam analysis.

    PubMed

    Christopher, Matthew E; Warmenhoeven, John-William; Romolo, Francesco S; Donghi, Matteo; Webb, Roger P; Jeynes, Christopher; Ward, Neil I; Kirkby, Karen J; Bailey, Melanie J

    2013-08-21

    Imaging and analyzing gunshot residue (GSR) particles using the scanning electron microscope equipped with an energy dispersive X-ray spectrometer (SEM-EDS) is a standard technique that can provide important forensic evidence, but the discrimination power of this technique is limited due to low sensitivity to trace elements and difficulties in obtaining quantitative results from small particles. A new, faster method using a scanning proton microbeam and Particle Induced X-ray Emission (μ-PIXE), together with Elastic Backscattering Spectrometry (EBS), is presented for the non-destructive, quantitative analysis of the elemental composition of single GSR particles. In this study, the GSR particles analyzed all contained Pb, Ba and Sb. The precision of the method is assessed. The grouping behaviour of different makes of ammunition is determined using multivariate analysis. The protocol correctly groups the cartridges studied here, with a confidence >99%, irrespective of the firearm or population of particles selected.

  9. System Evaluation and Economic Analysis of a HTGR Powered High-Temperature Electrolysis Hydrogen Production Plant

    SciTech Connect

    Michael G. McKellar; Edwin A. Harvego; Anastasia A. Gandrik

    2010-10-01

    A design for a commercial-scale high-temperature electrolysis (HTE) plant for hydrogen production has been developed. The HTE plant is powered by a high-temperature gas-cooled reactor (HTGR) whose configuration and operating conditions are based on the latest design parameters planned for the Next Generation Nuclear Plant (NGNP). The current HTGR reference design specifies a reactor power of 600 MWt, with a primary system pressure of 7.0 MPa, and reactor inlet and outlet fluid temperatures of 322°C and 750°C, respectively. The power conversion unit will be a Rankine steam cycle with a power conversion efficiency of 40%. The reference hydrogen production plant operates at a system pressure of 5.0 MPa, and utilizes a steam-sweep system to remove the excess oxygen that is evolved on the anode (oxygen) side of the electrolyzer. The overall system thermal-to-hydrogen production efficiency (based on the higher heating value of the produced hydrogen) is 40.4% at a hydrogen production rate of 1.75 kg/s and an oxygen production rate of 13.8 kg/s. An economic analysis of this plant was performed with realistic financial and cost estimating assumptions. The results of the economic analysis demonstrated that the HTE hydrogen production plant driven by a high-temperature helium-cooled nuclear power plant can deliver hydrogen at a cost of $3.67/kg of hydrogen assuming an internal rate of return, IRR, of 12% and a debt to equity ratio of 80%/20%. A second analysis shows that if the power cycle efficiency increases to 44.4%, the hydrogen production efficiency increases to 42.8% and the hydrogen and oxygen production rates are 1.85 kg/s and 14.6 kg/s respectively. At the higher power cycle efficiency and an IRR of 12% the cost of hydrogen production is $3.50/kg.
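
    A hedged arithmetic check of the headline efficiency, using the higher heating value of hydrogen (~141.9 MJ/kg, a standard figure not quoted in the abstract) and the stated 600 MWt reactor power:

```python
hhv = 141.9            # MJ/kg, higher heating value of H2 (assumed standard value)
h2_rate = 1.75         # kg/s, stated hydrogen production rate
reactor_power = 600.0  # MWt, stated reactor thermal power
print(h2_rate * hhv / reactor_power)   # ~0.414; near the reported 40.4%, with
                                       # the small gap plausibly from auxiliary loads
```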

  10. Thermal Stress Analysis for a Transfer Line of Hydrogen Moderator in J-Parc

    NASA Astrophysics Data System (ADS)

    Tatsumoto, H.; Teshigawara, M.; Aso, T.; Ohtsu, K.; Maekawa, F.; Kato, T.

    2008-03-01

    An intense spallation neutron source (JSNS) driven by a 1-MW proton beam was constructed as one of the main experimental facilities in J-PARC. In JSNS, supercritical hydrogen (1.5 MPa, 20 K) was selected as the moderator material. Three kinds of hydrogen moderator are installed (coupled, decoupled, and poisoned) to provide pulsed neutron beams with high neutronic performance. The moderator system includes cryogenic hydrogen transfer lines located in a radioactive area. Therefore, the transfer lines should be designed with minimum pipe size and elbow-type bend sections to reduce the potential radiation dose from radiation streaming. The design should also consider mechanical stress concentrations, deformation, and contact between the pipes due to thermal shrinkage at cryogenic hydrogen temperatures. A FEM analysis determined the appropriate locations of pipe-support spacers to keep the thermal stress below the allowable stress and to avoid contact between the pipes.

  11. Structural and Quantitative Analysis of Three C-Glycosylflavones by Variable Temperature Proton Quantitative Nuclear Magnetic Resonance

    PubMed Central

    Liu, Yang; Dai, Zhong

    2017-01-01

    Quantitative nuclear magnetic resonance is a powerful tool in drug analysis because of its speed, precision, and efficiency. In the present study, the application of variable temperature proton quantitative nuclear magnetic resonance (VT-1H-qNMR) to the calibration of three C-glycosylflavones, orientin, isoorientin, and schaftoside, as reference substances is reported. Since there is a conformational equilibrium due to restricted rotation around the C(sp3)-C(sp2) bond in C-glycosylflavones, the conformational behaviors were investigated by VT-NMR and verified by molecular mechanics (MM) calculations. The VT-1H-qNMR method was validated for linearity, limit of quantification, precision, and stability. The results were consistent with those obtained from the mass balance approach. VT-1H-qNMR can be deployed as an effective tool for analyzing C-glycosylflavones. PMID:28243484

  12. Mapping of the interaction sites of galanthamine: a quantitative analysis through pairwise potentials and quantum chemistry.

    PubMed

    Galland, Nicolas; Kone, Soleymane; Le Questel, Jean-Yves

    2012-10-01

    A quantitative analysis of the interaction sites of the anti-Alzheimer drug galanthamine with molecular probes (water and benzene molecules) representative of its surroundings in the binding site of acetylcholinesterase (AChE) has been realized through pairwise potential calculations and quantum chemistry. This strategy allows a full and accurate exploration of the galanthamine potential energy surface of interaction. Significantly different results are obtained according to the approach distances between the various molecular fragments and the conformation of the galanthamine N-methyl substituent. The geometry of the most relevant complexes has then been fully optimized through MPWB1K/6-31 + G(d,p) calculations, final energies being recomputed at the LMP2/aug-cc-pVTZ(-f) level of theory. Unexpectedly, galanthamine is found to interact mainly through its hydrogen-bond donor groups. Among those, CH groups in the vicinity of the ammonium group are prominent. The trends obtained provide rationales for the preference of the equatorial orientation of the galanthamine N-methyl substituent for binding to AChE. The analysis of the interaction energies pointed out the independence of the various interaction sites and the rigid character of galanthamine. The comparison between the cluster calculations and the crystallographic observations in galanthamine-AChE co-crystals allows validation of the theoretical methodology. In particular, the positions of several water molecules that appear strongly conserved in galanthamine-AChE co-crystals are predicted by the calculations. Moreover, the experimental positions and orientations of the side chains of functionally important amino acid residues are in close agreement with those predicted theoretically. Our study provides relevant information for the rational drug design of galanthamine-based AChE inhibitors.

  13. Mapping of the interaction sites of galanthamine: a quantitative analysis through pairwise potentials and quantum chemistry

    NASA Astrophysics Data System (ADS)

    Galland, Nicolas; Kone, Soleymane; Le Questel, Jean-Yves

    2012-10-01

    A quantitative analysis of the interaction sites of the anti-Alzheimer drug galanthamine with molecular probes (water and benzene molecules) representative of its surroundings in the binding site of acetylcholinesterase (AChE) has been realized through pairwise potential calculations and quantum chemistry. This strategy allows a full and accurate exploration of the galanthamine potential energy surface of interaction. Significantly different results are obtained according to the approach distances between the various molecular fragments and the conformation of the galanthamine N-methyl substituent. The geometry of the most relevant complexes has then been fully optimized through MPWB1K/6-31 + G(d,p) calculations, final energies being recomputed at the LMP2/aug-cc-pVTZ(-f) level of theory. Unexpectedly, galanthamine is found to interact mainly through its hydrogen-bond donor groups. Among those, CH groups in the vicinity of the ammonium group are prominent. The trends obtained provide rationales for the preference of the equatorial orientation of the galanthamine N-methyl substituent for binding to AChE. The analysis of the interaction energies pointed out the independence of the various interaction sites and the rigid character of galanthamine. The comparison between the cluster calculations and the crystallographic observations in galanthamine-AChE co-crystals allows validation of the theoretical methodology. In particular, the positions of several water molecules that appear strongly conserved in galanthamine-AChE co-crystals are predicted by the calculations. Moreover, the experimental positions and orientations of the side chains of functionally important amino acid residues are in close agreement with those predicted theoretically. Our study provides relevant information for the rational drug design of galanthamine-based AChE inhibitors.

  14. Characterization and high throughput analysis of metal hydrides for hydrogen storage

    NASA Astrophysics Data System (ADS)

    Barcelo, Steven James

    -Ni mixtures. Finally, another technique for improving hydrogen storage performance is presented, focusing on promising materials identified with the high-throughput technique. TiO2 powder was ball milled together with NaBH4, and gravimetric analysis shows a 50% improvement in the kinetics of the hydrogen desorption reaction and a reduction in desorption temperature of 60°C.

  15. Quantitative sectioning and noise analysis for structured illumination microscopy

    PubMed Central

    Hagen, Nathan; Gao, Liang; Tkaczyk, Tomasz S.

    2011-01-01

    Structured illumination (SI) has long been regarded as a nonquantitative technique for obtaining sectioned microscopic images. Its lack of quantitative results has restricted the use of SI sectioning to qualitative imaging experiments, and has also limited researchers’ ability to compare SI again