Sample records for query "include large amounts"

  1. Diet - liver disease

    MedlinePlus

    ... of toxic waste products. Increasing your intake of carbohydrates to be in proportion with the amount of ... severe liver disease include: Eat large amounts of carbohydrate foods. Carbohydrates should be the major source of ...

  2. 9 CFR 381.412 - Reference amounts customarily consumed per eating occasion.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...

  3. 9 CFR 381.412 - Reference amounts customarily consumed per eating occasion.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...

  4. 9 CFR 317.312 - Reference amounts customarily consumed per eating occasion.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...

  5. 9 CFR 317.312 - Reference amounts customarily consumed per eating occasion.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...

  6. 9 CFR 381.412 - Reference amounts customarily consumed per eating occasion.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...

  7. 9 CFR 317.312 - Reference amounts customarily consumed per eating occasion.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...

  8. Profiling of lipid and glycogen accumulations under different growth conditions in the sulfothermophilic red alga Galdieria sulphuraria.

    PubMed

    Sakurai, Toshihiro; Aoki, Motohide; Ju, Xiaohui; Ueda, Tatsuya; Nakamura, Yasunori; Fujiwara, Shoko; Umemura, Tomonari; Tsuzuki, Mikio; Minoda, Ayumi

    2016-01-01

    The unicellular red alga Galdieria sulphuraria grows efficiently and produces a large amount of biomass in acidic conditions at high temperatures. It has great potential to produce biofuels and other beneficial compounds without becoming contaminated with other organisms. In G. sulphuraria, biomass measurements and glycogen and lipid analyses demonstrated that the amounts and compositions of glycogen and lipids differed when cells were grown under autotrophic, mixotrophic, and heterotrophic conditions. Maximum biomass production was obtained in the mixotrophic culture. High amounts of glycogen were obtained in the mixotrophic cultures, while the amounts of neutral lipids were similar between mixotrophic and heterotrophic cultures. The amounts of neutral lipids were among the highest reported for red algae, including thermophiles. Glycogen structure and fatty acid compositions largely depended on the growth conditions. Copyright © 2015. Published by Elsevier Ltd.

  9. Method for large-scale fabrication of atomic-scale structures on material surfaces using surface vacancies

    DOEpatents

    Lim, Chong Wee; Ohmori, Kenji; Petrov, Ivan Georgiev; Greene, Joseph E.

    2004-07-13

    A method for forming atomic-scale structures on a surface of a substrate on a large-scale includes creating a predetermined amount of surface vacancies on the surface of the substrate by removing an amount of atoms on the surface of the material corresponding to the predetermined amount of the surface vacancies. Once the surface vacancies have been created, atoms of a desired structure material are deposited on the surface of the substrate to enable the surface vacancies and the atoms of the structure material to interact. The interaction causes the atoms of the structure material to form the atomic-scale structures.

  10. Toothpaste overdose

    MedlinePlus

    Poisonous ingredients include: Sodium fluoride Triclosan ... when swallowing a large amount of toothpaste containing fluoride: Convulsions Diarrhea Difficulty breathing Drooling Heart attack Salty ...

  11. Managing Materials and Wastes for Homeland Security Incidents

    EPA Pesticide Factsheets

    To provide information on waste management planning and preparedness before a homeland security incident, including preparing for the large amounts of waste that would need to be managed when an incident, such as a large-scale natural disaster, occurs.

  12. Highly Sensitive GMO Detection Using Real-Time PCR with a Large Amount of DNA Template: Single-Laboratory Validation.

    PubMed

    Mano, Junichi; Hatano, Shuko; Nagatomi, Yasuaki; Futo, Satoshi; Takabatake, Reona; Kitta, Kazumi

    2018-03-01

    Current genetically modified organism (GMO) detection methods allow for sensitive detection. However, a further increase in sensitivity will enable more efficient testing for large grain samples and reliable testing for processed foods. In this study, we investigated real-time PCR-based GMO detection methods using a large amount of DNA template. We selected target sequences that are commonly introduced into many kinds of GM crops, i.e., 35S promoter and nopaline synthase (NOS) terminator. This makes the newly developed method applicable to a wide range of GMOs, including some unauthorized ones. The estimated LOD of the new method was 0.005% of GM maize events; to the best of our knowledge, this method is the most sensitive among the GM maize detection methods for which the LOD was evaluated in terms of GMO content. A 10-fold increase in the DNA amount as compared with the amount used under common testing conditions gave an approximately 10-fold reduction in the LOD without PCR inhibition. Our method is applicable to various analytical samples, including processed foods. The use of other primers and fluorescence probes would permit highly sensitive detection of various recombinant DNA sequences besides the 35S promoter and NOS terminator.
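    The template-scaling relationship reported above (a 10-fold increase in DNA template giving roughly a 10-fold lower LOD, absent PCR inhibition) amounts to simple inverse proportionality. A minimal illustrative sketch follows; the baseline LOD value used here is hypothetical, chosen only to show the scaling:

    ```python
    # Illustrative arithmetic only: assume the limit of detection (LOD)
    # scales inversely with the amount of DNA template analyzed, as the
    # abstract reports in the absence of PCR inhibition.

    def scaled_lod(baseline_lod_percent: float, template_scale: float) -> float:
        """Expected LOD (in % GMO content) after multiplying the template amount."""
        return baseline_lod_percent / template_scale

    # Hypothetical baseline of 0.05% detectable with the common template amount;
    # a 10-fold larger template then gives roughly 0.005%.
    print(scaled_lod(0.05, 10))
    ```

    In practice the gain holds only while the larger template amount does not introduce PCR inhibitors, which is why the abstract notes the absence of inhibition explicitly.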

  13. 75 FR 54059 - Extension of Filing Accommodation for Static Pool Information in Filings With Respect to Asset...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-03

    ... information could include a significant amount of statistical information that would be difficult to file... required static pool information. Given the large amount of statistical information involved, commentators....; and 18 U.S.C. 1350. * * * * * 2. Amend Sec. 232.312 paragraph (a) introductory text by removing...

  14. Hydrocyclone/Filter for Concentrating Biomarkers from Soil

    NASA Technical Reports Server (NTRS)

    Ponce, Adrian; Obenhuber, Donald

    2008-01-01

    The hydrocyclone-filtration extractor (HFE), now undergoing development, is a simple, robust apparatus for processing large amounts of soil to extract trace amounts of microorganisms, soluble organic compounds, and other biomarkers from soil and to concentrate the extracts in amounts sufficient to enable such traditional assays as cell culturing, deoxyribonucleic acid (DNA) analysis, and isotope analysis. Originally intended for incorporation into a suite of instruments for detecting signs of life on Mars, the HFE could also be used on Earth for similar purposes, including detecting trace amounts of biomarkers or chemical wastes in soils.

  15. Novel Bioreactor Platform for Scalable Cardiomyogenic Differentiation from Pluripotent Stem Cell-Derived Embryoid Bodies.

    PubMed

    Rungarunlert, Sasitorn; Ferreira, Joao N; Dinnyes, Andras

    2016-01-01

    Generation of cardiomyocytes from pluripotent stem cells (PSCs) is a common and valuable approach to produce large amount of cells for various applications, including assays and models for drug development, cell-based therapies, and tissue engineering. All these applications would benefit from a reliable bioreactor-based methodology to consistently generate homogenous PSC-derived embryoid bodies (EBs) at a large scale, which can further undergo cardiomyogenic differentiation. The goal of this chapter is to describe a scalable method to consistently generate large amount of homogeneous and synchronized EBs from PSCs. This method utilizes a slow-turning lateral vessel bioreactor to direct the EB formation and their subsequent cardiomyogenic lineage differentiation.

  16. Species Profiles: Life Histories and Environmental Requirements of Coastal Fishes and Invertebrates (Mid-Atlantic): Atlantic Menhaden

    DTIC Science & Technology

    1989-08-01

    ... metabolism, and Atlantic menhaden growth. Low salinities decreased survival at temperatures below 5 °C ... Important predators include bluefish (Pomatomus saltatrix) ... large amounts of energy and materials. They are also important prey for large game fishes such as bluefish (Pomatomus saltatrix), striped bass

  17. Method and apparatus for displaying information

    NASA Technical Reports Server (NTRS)

    Huang, Sui (Inventor); Eichler, Gabriel (Inventor); Ingber, Donald E. (Inventor)

    2010-01-01

    A method for displaying large amounts of information. The method includes the steps of forming a spatial layout of tiles each corresponding to a representative reference element; mapping observed elements onto the spatial layout of tiles of representative reference elements; assigning a respective value to each respective tile of the spatial layout of the representative elements; and displaying an image of the spatial layout of tiles of representative elements. Each tile includes atomic attributes of representative elements. The invention also relates to an apparatus for displaying large amounts of information. The apparatus includes a tiler forming a spatial layout of tiles, each corresponding to a representative reference element; a comparator mapping observed elements onto said spatial layout of tiles of representative reference elements; an assigner assigning a respective value to each respective tile of said spatial layout of representative reference elements; and a display displaying an image of the spatial layout of tiles of representative reference elements.
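    The steps the patent recites (form a spatial layout of reference tiles, map observed elements onto it, assign each tile a value, display the result) can be sketched in miniature. The 1-D layout and nearest-reference mapping below are hypothetical simplifications for illustration, not the patented implementation:

    ```python
    # Minimal sketch: map observed elements onto a layout of reference tiles
    # and assign each tile a value (here, the count of elements mapped to it).

    def nearest_tile(value, references):
        """Index of the reference element closest to the observed value."""
        return min(range(len(references)), key=lambda i: abs(references[i] - value))

    def map_elements(observed, references):
        """Assign each tile a value: how many observed elements map onto it."""
        tile_values = [0] * len(references)
        for value in observed:
            tile_values[nearest_tile(value, references)] += 1
        return tile_values

    references = [0.0, 1.0, 2.0]          # spatial layout of reference tiles (1-D)
    observed = [0.1, 0.9, 1.2, 1.9, 2.3]  # observed elements to map
    print(map_elements(observed, references))  # → [1, 2, 2]
    ```

    Displaying the image then reduces to rendering `tile_values` over the tile layout, e.g. as a heat map.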

  18. Spontaneous, generalized lipidosis in captive greater horseshoe bats (Rhinolophus ferrumequinum).

    PubMed

    Gozalo, Alfonso S; Schwiebert, Rebecca S; Metzner, Walter; Lawson, Gregory W

    2005-11-01

    During a routine 6-month quarantine period, 3 of 34 greater horseshoe bats (Rhinolophus ferrumequinum) captured in mainland China and transported to the United States for use in echolocation studies were found dead with no prior history of illness. All animals were in good body condition at the time of death. At necropsy, a large amount of white fat was found within the subcutis, especially in the sacrolumbar region. The liver, kidneys, and heart were diffusely tan in color. Microscopic examination revealed that hepatocytes throughout the liver were filled with lipid, and in some areas, lipid granulomas were present. Renal lesions included moderate amounts of lipid in the cortical tubular epithelium and large amounts of protein and lipid within Bowman's capsules in the glomeruli. In addition, one bat had large lipid vacuoles diffusely distributed throughout the myocardium. The exact pathologic mechanism inducing the hepatic, renal, and cardiac lipidosis is unknown. The horseshoe bats were captured during hibernation and immediately transported to the United States. It is possible that the large amount of fat stored, coupled with changes in photoperiod, lack of exercise, and/or the stress of captivity, might have contributed to altering the normal metabolic processes, leading to anorexia and consequently lipidosis in these animals.

  19. Mucopolysaccharidosis type III

    MedlinePlus

    ... the enzymes needed to break down the heparan sulfate sugar chain are missing or defective. There are ... a large amount of a mucopolysaccharide called heparan sulfate in the urine. Other tests may include: Blood ...

  20. Large-Scale Machine Learning for Classification and Search

    ERIC Educational Resources Information Center

    Liu, Wei

    2012-01-01

    With the rapid development of the Internet, nowadays tremendous amounts of data including images and videos, up to millions or billions, can be collected for training machine learning models. Inspired by this trend, this thesis is dedicated to developing large-scale machine learning techniques for the purpose of making classification and nearest…

  1. Diet and Co-ecology of Pleistocene Short-Faced Bears and Brown Bears in Eastern Beringia

    NASA Astrophysics Data System (ADS)

    Matheus, Paul E.

    1995-11-01

    Carbon and nitrogen stable isotope analysis of fossil bone collagen reveals that Pleistocene short-faced bears (Arctodus simus) of Beringia were highly carnivorous, while contemporaneous brown bears (Ursus arctos) had highly variable diets that included varying amounts of terrestrial vegetation, salmon, and small amounts of terrestrial meat. A reconsideration of the short-faced bears' highly derived morphology indicates that they foraged as scavengers of widely dispersed large mammal carcasses and were simultaneously designed both for highly efficient locomotion and for intimidating other large carnivores. This allowed Arctodus to forage economically over a large home range and seek out, procure, and defend carcasses from other large carnivores. The isotope data and this reconstruction of Arctodus' foraging behavior refute the hypothesis that competition from brown bears was a significant factor in the extinction of short-faced bears.

  2. Information Management System Supporting a Multiple Property Survey Program with Legacy Radioactive Contamination.

    PubMed

    Stager, Ron; Chambers, Douglas; Wiatzka, Gerd; Dupre, Monica; Callough, Micah; Benson, John; Santiago, Erwin; van Veen, Walter

    2017-04-01

    The Port Hope Area Initiative is a project mandated and funded by the Government of Canada to remediate properties with legacy low-level radioactive waste contamination in the Town of Port Hope, Ontario. The management and use of large amounts of data from surveys of some 4800 properties is a significant task critical to the success of the project. A large amount of information is generated through the surveys, including scheduling individual field visits to the properties, capture of field data, laboratory sample tracking, QA/QC, property report generation and project management reporting. Web-mapping tools were used to track and display temporal progress of various tasks and facilitated consideration of spatial associations of contamination levels. The IM system facilitated the management and integrity of the large amounts of information collected, evaluation of spatial associations, automated report reproduction and consistent application and traceable execution for this project. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  3. Vitamins

    MedlinePlus

    ... about taking large amounts of fat-soluble vitamin supplements. These include vitamins A, D, E, and K. These vitamins are stored in fat cells, and they can build up in your body and may cause harmful effects.

  4. Next-to-leading order Balitsky-Kovchegov equation with resummation

    DOE PAGES

    Lappi, T.; Mantysaari, H.

    2016-05-03

    Here, we solve the Balitsky-Kovchegov evolution equation at next-to-leading order accuracy, including a resummation of large single and double transverse momentum logarithms to all orders. We numerically determine an optimal value for the constant under the large transverse momentum logarithm that enables including a maximal amount of the full NLO result in the resummation. When this value is used, the contribution from the α_s^2 terms without large logarithms is found to be small at large saturation scales and for small dipoles. Close to initial conditions relevant for phenomenological applications, these fixed-order corrections are shown to be numerically important.

  5. Microphysical, Macrophysical and Radiative Signatures of Volcanic Aerosols in Trade Wind Cumulus Observed by the A-Train

    NASA Technical Reports Server (NTRS)

    Yuan, T.; Remer, L. A.; Yu, H.

    2011-01-01

    Increased aerosol concentrations can raise planetary albedo not only by reflecting sunlight and increasing cloud albedo, but also by changing cloud amount. However, detecting the aerosol effect on cloud amount has been elusive to both observations and modeling due to potential buffering mechanisms and the convolution of meteorology. Here, through a natural experiment provided by long-term degassing of a low-lying volcano and the use of A-Train satellite observations, we show that modifications of trade cumulus cloud fields, including decreased droplet size, decreased precipitation efficiency and increased cloud amount, are associated with volcanic aerosols. In addition, we find significantly higher cloud tops for polluted clouds. We demonstrate that the observed microphysical and macrophysical changes cannot be explained by synoptic meteorology or the orographic effect of the Hawaiian Islands. The "total shortwave aerosol forcing," resulting from direct and indirect forcings including both cloud albedo and cloud amount, is almost an order of magnitude higher than aerosol direct forcing alone. Furthermore, the precipitation reduction associated with enhanced aerosol leads to large changes in the energetics of air-sea exchange and the trade wind boundary layer. Our results represent the first observational evidence of a large-scale increase of cloud amount due to aerosols in a trade cumulus regime, which can be used to constrain the representation of aerosol-cloud interactions in climate models. The findings also have implications for volcano-climate interactions and climate mitigation research.

  6. ODOT research news : fall 2006.

    DOT National Transportation Integrated Search

    2006-01-01

    ODOT Research News Fall 2006 includes: 1) calling for research unit. 2) Development of customized factors was possible because Oregon collects a large amount of high-quality weigh-in-motion (WIM) data from sites around the State. 3) The Mechanically...

  7. BFRS: TOXICOLOGY AND RISK

    EPA Science Inventory

    Brominated flame retardants are a large class of diverse chemicals which are being used in increasing amounts world wide to protect against fires. The major classes include the polybrominated diphenyl ethers (PBDEs), the brominated bisphenols (e.g., tetrabromobisphenol A, TBBPA)...

  8. Gigwa-Genotype investigator for genome-wide analyses.

    PubMed

    Sempéré, Guilhem; Philippe, Florian; Dereeper, Alexis; Ruiz, Manuel; Sarah, Gautier; Larmande, Pierre

    2016-06-06

    Exploring the structure of genomes and analyzing their evolution is essential to understanding the ecological adaptation of organisms. However, with the large amounts of data being produced by next-generation sequencing, computational challenges arise in terms of storage, search, sharing, analysis and visualization. This is particularly true with regards to studies of genomic variation, which are currently lacking scalable and user-friendly data exploration solutions. Here we present Gigwa, a web-based tool that provides an easy and intuitive way to explore large amounts of genotyping data by filtering it not only on the basis of variant features, including functional annotations, but also on genotype patterns. The data storage relies on MongoDB, which offers good scalability properties. Gigwa can handle multiple databases and may be deployed in either single- or multi-user mode. In addition, it provides a wide range of popular export formats. The Gigwa application is suitable for managing large amounts of genomic variation data. Its user-friendly web interface makes such processing widely accessible. It can either be simply deployed on a workstation or be used to provide a shared data portal for a given community of researchers.

  9. Large-scale retrieval for medical image analytics: A comprehensive review.

    PubMed

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics was greatly facilitated by the explosion of digital imaging techniques, where huge amounts of medical images were produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not able to tackle the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at a large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
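    In its simplest form, the retrieval pipeline this review describes (feature representation, indexing, then search) reduces to ranking a database of feature vectors by distance to a query. A minimal sketch under that simplification, with hypothetical 3-D feature vectors standing in for learned image descriptors; real large-scale systems use approximate indexes rather than the brute-force scan shown here:

    ```python
    import math

    # Minimal content-based retrieval sketch: each image is represented by a
    # feature vector; a query is answered by ranking the database by distance.

    def euclidean(a, b):
        """Euclidean distance between two equal-length feature vectors."""
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def retrieve(query, database, k=2):
        """Return the ids of the k database images nearest to the query."""
        ranked = sorted(database, key=lambda item: euclidean(query, item[1]))
        return [image_id for image_id, _ in ranked[:k]]

    # Hypothetical feature vectors for three database images.
    db = [("img_a", (0.1, 0.9, 0.3)),
          ("img_b", (0.8, 0.1, 0.7)),
          ("img_c", (0.2, 0.8, 0.1))]

    print(retrieve((0.15, 0.85, 0.15), db))  # → ['img_c', 'img_a']
    ```

    Feature indexing (e.g. hashing or tree structures) replaces the linear scan in `retrieve` when the database grows to millions of images, which is the scaling problem the review surveys.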

  10. Tanning Products

    MedlinePlus

    ... the FDA. Some tanning pills contain the color additive canthaxanthin. When large amounts of canthaxanthin are ingested, it can turn the skin a range of colors from orange to brown. It can also cause serious health problems including liver damage; hives; and an eye ...

  11. Job Prospects for Industrial Engineers.

    ERIC Educational Resources Information Center

    Basta, Nicholas

    1985-01-01

    Recent economic growth and improved manufacturing profitability are supporting increased employment for industrial engineers. Promising areas include modernizing manufacturing technology and productivity with large amounts of hiring in aerospace, electronics, and instrumentation. Percentages of women employed in these fields for 1982 and 1983 are…

  12. Analyzing large scale genomic data on the cloud with Sparkhit

    PubMed Central

    Huang, Liren; Krüger, Jan

    2018-01-01

    Motivation: The increasing amount of next-generation sequencing data poses a fundamental challenge for large-scale genomic analytics. Existing tools use different distributed computational platforms to scale out bioinformatics workloads. However, these tools do not scale efficiently, and they incur heavy runtime overheads when pre-processing large amounts of data. To address these limitations, we have developed Sparkhit: a distributed bioinformatics framework built on top of the Apache Spark platform. Results: Sparkhit integrates a variety of analytical methods. It is implemented in the Spark extended MapReduce model. It runs 92–157 times faster than MetaSpark on metagenomic fragment recruitment and 18–32 times faster than Crossbow on data pre-processing. We analyzed 100 terabytes of data across four genomic projects in the cloud in 21 h, including the run times of cluster deployment and data downloading. Furthermore, our application on the entire Human Microbiome Project shotgun sequencing data was completed in 2 h, presenting an approach to easily associate large amounts of public datasets with reference data. Availability and implementation: Sparkhit is freely available at: https://rhinempi.github.io/sparkhit/. Contact: asczyrba@cebitec.uni-bielefeld.de. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:29253074

  13. ERTS-1 data user investigation of wetlands ecology

    NASA Technical Reports Server (NTRS)

    Anderson, R. R. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results. ERTS-1 imagery (enlarged to 1:250,000) is an excellent tool by which large area coastal marshland mapping may be undertaken. If states can sacrifice some accuracy (amount unknown at this time) in placing of boundary lines, the technique may be used to do the following: (1) estimate extent of man's impact on marshes by ditching and lagooning and accelerated successional trends; (2) place boundaries between wetland and upland and hence estimate amount of coastal marshland remaining in the state; (3) distinguish among relatively large zones of various plant species including high and low growth S. alterniflora, J. roemerianus, and S. cynosuroides; and (4) estimate marsh plant species productivity when ground based information is available.

  14. Fourier transform infrared microspectroscopy for the analysis of the biochemical composition of C. elegans worms.

    PubMed

    Sheng, Ming; Gorzsás, András; Tuck, Simon

    2016-01-01

    Changes in intermediary metabolism have profound effects on many aspects of C. elegans biology including growth, development and behavior. However, many traditional biochemical techniques for analyzing chemical composition require relatively large amounts of starting material precluding the analysis of mutants that cannot be grown in large amounts as homozygotes. Here we describe a technique for detecting changes in the chemical compositions of C. elegans worms by Fourier transform infrared microspectroscopy. We demonstrate that the technique can be used to detect changes in the relative levels of carbohydrates, proteins and lipids in one and the same worm. We suggest that Fourier transform infrared microspectroscopy represents a useful addition to the arsenal of techniques for metabolic studies of C. elegans worms.

  15. Low flow fume hood

    DOEpatents

    Bell, Geoffrey C.; Feustel, Helmut E.; Dickerhoff, Darryl J.

    2002-01-01

    A fume hood is provided having an adequate level of safety while reducing the amount of air exhausted from the hood. A displacement flow fume hood works on the principle of a displacement flow which displaces the volume currently present in the hood using a push-pull system. The displacement flow includes a plurality of air supplies which provide fresh air, preferably having laminar flow, to the fume hood. The displacement flow fume hood also includes an air exhaust which pulls air from the work chamber in a minimally turbulent manner. As the displacement flow produces a substantially consistent and minimally turbulent flow in the hood, inconsistent flow patterns associated with contaminant escape from the hood are minimized. The displacement flow fume hood largely reduces the need to exhaust large amounts of air from the hood. It has been shown that exhaust air flow reductions of up to 70% are possible without a decrease in the hood's containment performance. The fume hood also includes a number of structural adaptations which facilitate consistent and minimally turbulent flow within a fume hood.

  16. Integrated sequencing of exome and mRNA of large-sized single cells.

    PubMed

    Wang, Lily Yan; Guo, Jiajie; Cao, Wei; Zhang, Meng; He, Jiankui; Li, Zhoufang

    2018-01-10

    With current approaches to integrated single-cell DNA-RNA sequencing, it is difficult to call SNPs, because a large amount of DNA and RNA is lost during DNA-RNA separation. Here, we performed simultaneous single-cell exome and transcriptome sequencing on individual mouse oocytes. Using microinjection, we kept the nuclei intact to avoid DNA loss, while retaining the cytoplasm inside the cell membrane, to maximize the amount of DNA and RNA captured from the single cell. We then conducted exome sequencing on the isolated nuclei and mRNA sequencing on the enucleated cytoplasm. For single oocytes, exome-seq can cover up to 92% of the exome region with an average sequencing depth of 10+, while mRNA sequencing reveals more than 10,000 expressed genes in the enucleated cytoplasm, with similar performance for intact oocytes. This approach provides unprecedented opportunities to study DNA-RNA regulation, such as RNA editing at the single-nucleotide level in oocytes. In the future, this method can also be applied to other large cells, including neurons, large dendritic cells and large tumour cells, for integrated exome and transcriptome sequencing.

  17. Effects of precipitation changes on switchgrass photosynthesis, growth, and biomass: A mesocosm experiment

    USDA-ARS?s Scientific Manuscript database

    Climate changes, including chronic changes in precipitation amounts, will influence plant physiology and growth. However, such precipitation effects on switchgrass, a major bioenergy crop, have not been well investigated. We conducted a two-year precipitation simulation experiment using large pots...

  18. Expression, purification, and characterization of almond (Prunus dulcis) allergen Pru du 4

    USDA-ARS?s Scientific Manuscript database

    Biochemical characterizations of food allergens are required for understanding the allergenicity of food allergens. Such studies require a relatively large amount of highly purified allergens. Profilins from numerous species are known to be allergens, including food allergens, such as almond (Prunus...

  19. Evaluation of a Viscosity-Molecular Weight Relationship.

    ERIC Educational Resources Information Center

    Mathias, Lon J.

    1983-01-01

    Background information, procedures, and results are provided for a series of graduate/undergraduate polymer experiments. These include synthesis of poly(methylmethacrylate), a viscosity experiment (indicating the large effect even small amounts of a polymer may have on solution properties), and measurement of weight-average molecular weight by light…

  20. What about the Bottle? Answers to Common Questions.

    ERIC Educational Resources Information Center

    Laird, Valerie

    2001-01-01

    Acknowledges the large amount of confusing information about bottle feeding in areas including nutrition, sanitation, dental health, psychology, and child development. Answers specific questions pertaining to choice of formula and formula preparation, supporting breastfeeding, bottle choice, solid food introduction, feeding position, spitting up,…

  1. Open-Ended Electric Motor

    ERIC Educational Resources Information Center

    Gould, Mauri

    1975-01-01

    Presents complete instructions for assembling an electric motor which does not require large amounts of power to operate and which is inexpensive as well as reliable. Several open-ended experiments with the motor are included as well as information for obtaining a kit of parts and instructions. (BR)

  2. Monovalent cation conductance in Xenopus laevis oocytes expressing hCAT-3.

    PubMed

    Gilles, Wolfgang; Vulcu, Sebastian D; Liewald, Jana F; Habermeier, Alice; Vékony, Nicole; Closs, Ellen I; Rupp, Johanna; Nawrath, Hermann

    2005-03-01

    hCAT-3 (human cationic amino acid transporter type three) was investigated with both the two-electrode voltage clamp method and tracer experiments. Oocytes expressing hCAT-3 displayed less negative membrane potentials and larger voltage-dependent currents than native or water-injected oocytes did. Ion substitution experiments in hCAT-3-expressing oocytes revealed a large conductance for Na+ and K+. In the presence of L-Arg, voltage-dependent inward and outward currents were observed. At symmetrical (inside/outside) concentrations of L-Arg, the conductance of the transporter increased monoexponentially with the L-Arg concentrations; the calculated Vmax and KM values amounted to 8.3 microS and 0.36 mM, respectively. The time constants of influx and efflux of [3H]L-Arg, at symmetrical inside/outside L-Arg concentrations (1 mM), amounted to 79 and 77 min, respectively. The flux data and electrophysiological experiments suggest that the transport of L-Arg through hCAT-3 is symmetric, when the steady state of L-Arg flux has been reached. It is concluded that hCAT-3 is a passive transport system that conducts monovalent cations including L-Arg. The particular role of hCAT-3 in diverse tissues remains to be elucidated.

  3. Phosphorus: a limiting nutrient for humanity?

    PubMed

    Elser, James J

    2012-12-01

    Phosphorus is a chemical element that is essential to life because of its role in numerous key molecules, including DNA and RNA; indeed, organisms require large amounts of P to grow rapidly. However, the supply of P from the environment is often limiting to production, including to crops. Thus, large amounts of P are mined annually to produce fertilizer that is applied in support of the 'Green Revolution.' However, much of this fertilizer eventually ends up in rivers, lakes and oceans where it causes costly eutrophication. Furthermore, given increasing human population, expanding meat consumption, and proliferating bioenergy pressures, concerns have recently been raised about the long-term geological, economic, and geopolitical viability of mined P for fertilizer production. Together, these issues highlight the non-sustainable nature of current human P use. To achieve P sustainability, farms need to become more efficient in how they use P while society as a whole must develop technologies and practices to recycle P from the food chain. Such large-scale changes will probably require a radical restructuring of the entire food system, highlighting the need for prompt but sustained action.

  4. Exploring the Amount and Type of Writing Instruction during Language Arts Instruction in Kindergarten Classrooms

    PubMed Central

    Puranik, Cynthia S.; Al Otaiba, Stephanie; Sidler, Jessica Folsom; Greulich, Luana

    2014-01-01

    The objective of this exploratory investigation was to examine the nature of writing instruction in kindergarten classrooms and to describe student writing outcomes at the end of the school year. Participants for this study included 21 teachers and 238 kindergarten children from nine schools. Classroom teachers were videotaped once each in the fall and winter during the 90-minute instructional block for reading and language arts to examine time allocation and the types of writing instructional practices taking place in the kindergarten classrooms. Classroom observation of writing was divided into student-practice variables (activities in which students were observed practicing writing or writing independently) and teacher-instruction variables (activities in which the teacher was observed providing direct writing instruction). In addition, participants completed handwriting fluency, spelling, and writing tasks. Large variability was observed in the amount of writing instruction occurring in the classroom, the amount of time kindergarten teachers spent on writing, and the amount of time students spent writing. Marked variability was also observed in classroom practices both within and across schools, which was reflected in the large variability in kindergartners’ writing performance. PMID:24578591

  5. Exploring the Amount and Type of Writing Instruction during Language Arts Instruction in Kindergarten Classrooms.

    PubMed

    Puranik, Cynthia S; Al Otaiba, Stephanie; Sidler, Jessica Folsom; Greulich, Luana

    2014-02-01

    The objective of this exploratory investigation was to examine the nature of writing instruction in kindergarten classrooms and to describe student writing outcomes at the end of the school year. Participants for this study included 21 teachers and 238 kindergarten children from nine schools. Classroom teachers were videotaped once each in the fall and winter during the 90-minute instructional block for reading and language arts to examine time allocation and the types of writing instructional practices taking place in the kindergarten classrooms. Classroom observation of writing was divided into student-practice variables (activities in which students were observed practicing writing or writing independently) and teacher-instruction variables (activities in which the teacher was observed providing direct writing instruction). In addition, participants completed handwriting fluency, spelling, and writing tasks. Large variability was observed in the amount of writing instruction occurring in the classroom, the amount of time kindergarten teachers spent on writing, and the amount of time students spent writing. Marked variability was also observed in classroom practices both within and across schools, which was reflected in the large variability in kindergartners' writing performance.

  6. Large-scale volcanism associated with coronae on Venus

    NASA Technical Reports Server (NTRS)

    Roberts, K. Magee; Head, James W.

    1993-01-01

    The formation and evolution of coronae on Venus are thought to be the result of mantle upwellings against the crust and lithosphere and subsequent gravitational relaxation. A variety of other features on Venus have been linked to processes associated with mantle upwelling, including shield volcanoes on large regional rises such as Beta, Atla and Western Eistla Regiones and extensive flow fields such as Mylitta and Kaiwan Fluctus near the Lada Terra/Lavinia Planitia boundary. Of these features, coronae appear to possess the smallest amounts of associated volcanism, although volcanism associated with coronae has only been qualitatively examined. An initial survey of coronae based on recent Magellan data indicated that only 9 percent of all coronae are associated with substantial amounts of volcanism, including interior calderas or edifices greater than 50 km in diameter and extensive, exterior radial flow fields. Sixty-eight percent of all coronae were found to have lesser amounts of volcanism, including interior flooding and associated volcanic domes and small shields; the remaining coronae were considered deficient in associated volcanism. It is possible that coronae are related to mantle plumes or diapirs that are lower in volume or in partial melt than those associated with the large shields or flow fields. Regional tectonics or variations in local crustal and thermal structure may also be significant in determining the amount of volcanism produced from an upwelling. It is also possible that flow fields associated with some coronae are sheet-like in nature and may not be readily identified. If coronae are associated with volcanic flow fields, then they may be a significant contributor to plains formation on Venus, as they number over 300 and are widely distributed across the planet. 
As a continuation of our analysis of large-scale volcanism on Venus, we have reexamined the known population of coronae and assessed quantitatively the scale of volcanism associated with them. In particular, we have examined the percentage of coronae associated with volcanic flow fields (i.e., a collection of digitate or sheet-like lava flows extending from the corona interior or annulus); the range in scale of these flow fields; the variations in diameter, structure and stratigraphy of coronae with flow fields; and the global distribution of coronae associated with flow fields.

  7. Accelerator infrastructure in Europe: EuCARD 2011

    NASA Astrophysics Data System (ADS)

    Romaniuk, Ryszard S.

    2011-10-01

    The paper presents a digest of research results in accelerator science and technology in Europe, shown during the annual meeting of EuCARD - the European Coordination of Accelerator Research and Development. The conference concerned the building of research infrastructure, including advanced photonic and electronic systems for servicing large high-energy physics experiments. A few basic groups of such systems were debated, including: measurement-control networks of large geometrical extent, multichannel systems for the acquisition of large amounts of metrological data, and precision photonic networks for the distribution of reference time, frequency and phase.

  8. Technology in Education: Research Says!!

    ERIC Educational Resources Information Center

    Canuel, Ron

    2011-01-01

    A large amount of research exists in the field of technology in the classroom; however, almost all of it has focused on the impact of desktop computers and the infamous "school computer room". Yet the activities in a classroom represent a multitude of behaviours and interventions, including personal dynamics, classroom management and…

  9. Aquarius/SAC-D soil moisture product using V3.0 observations

    USDA-ARS's Scientific Manuscript database

    Although Aquarius was designed for ocean salinity mapping, our objective in this investigation is to exploit the large amount of land observations that Aquarius acquires and extend the mission scope to include the retrieval of surface soil moisture. The soil moisture retrieval algorithm development ...

  10. Investigation of estimators of probability density functions

    NASA Technical Reports Server (NTRS)

    Speed, F. M.

    1972-01-01

    Four research projects are summarized which include: (1) the generation of random numbers on the IBM 360/44, (2) statistical tests used to check out random number generators, (3) Specht density estimators, and (4) use of estimators of probability density functions in analyzing large amounts of data.
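
    Specht's density estimator is, in essence, a Parzen-window estimate: an average of Gaussian kernels centered on the observed samples. A minimal one-dimensional sketch (the smoothing parameter sigma is an arbitrary illustrative choice, not a value from the report):

```python
import math

def specht_density(x, samples, sigma=0.5):
    """Parzen/Specht-style density estimate at point x: the average of
    Gaussian kernels of width sigma centered on the observed samples."""
    n = len(samples)
    norm = 1.0 / (n * sigma * math.sqrt(2.0 * math.pi))
    return norm * sum(math.exp(-((x - s) ** 2) / (2.0 * sigma ** 2))
                      for s in samples)

samples = [0.1, 0.4, 0.5, 0.9, 1.1]
print(specht_density(0.5, samples))
```

    With many samples and a well-chosen sigma, the estimate approaches the underlying density, which is what makes it useful for analyzing large amounts of data without assuming a parametric form.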

  11. Destruction of Navy Hazardous Wastes by Supercritical Water Oxidation

    DTIC Science & Technology

    1994-08-01

    cleaning and derusting (nitrite and citric acid solutions), electroplating (acids and metal-bearing solutions), electronics and refrigeration... Waste streams that contain a large amount of mineral-acid-forming chemical species or that contain a large amount of dissolved solids present a challenge to current SCWO technology. Approved for public...

  12. Stream measurement work: Chapter 8 in Seventeenth biennial report of the State Engineer to the governor of Utah: 1929-1930

    USGS Publications Warehouse

    Purton, A.B.

    1930-01-01

    General stream measurement work looking toward a comprehensive inventory of the water resources of the state has been continued during the biennium by the United States Geological Survey under the usual cooperative agreement with the State Engineer. Since 1909 Utah, in company with many other states, has made regular legislative appropriations for the purpose of assisting and hastening the determination of the water supply of the United States by the Geological Survey. Because of the comparatively small Federal appropriations, the scope of this work in the individual states has been largely influenced by the amount of the state cooperation. The funds contributed by each state have all been expended within that state and matched as far as possible by funds of the Geological Survey. Up to the present, however, the Federal funds have been insufficient to match the state contributions beyond a very limited amount, and in many localities the large amount of work done has been made possible only by correspondingly large unmatched state appropriations. During this period the regular stream gaging work in Utah has been practically limited to that possible with approximately ten thousand dollars annually, divided about equally between the state and the Geological Survey, with the government’s share including the cost at Washington of general supervision, and the review, editing, and publication of the records. This has been the maximum amount that it has been possible to allot any one state to meet state cooperation.

  13. 17 CFR Appendix B to Part 420 - Sample Large Position Report

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Memorandum 1 $ Memorandum 2: Report the gross par amount of fails to deliver. Included in the calculation of... millions at par value as of trade date] Security Being Reported Date For Which Information is Being... Principal Components of the Specific Security $________ Total Net Trading Position $ 2. Gross Financing...

  14. Flow cytometry of sputum: assessing inflammation and immune response elements in the bronchial airways**

    EPA Science Inventory

    Rationale: The evaluation of sputum leukocytes by flow cytometry is an opportunity to assess characteristics of cells residing in the central airways, yet it is hampered by certain inherent properties of sputum including mucus and large amounts of contaminating cells and debris. ...

  15. Virtual Games in Social Science Education

    ERIC Educational Resources Information Center

    Lopez, Jose M. Cuenca; Caceres, Myriam J. Martin

    2010-01-01

    The new technologies make possible the appearance of highly motivating and dynamic games with different levels of interaction, which incorporate large amounts of data, information, procedures and values intimately bound up with the social sciences. We start from the hypothesis that videogames may become interesting resources for their…

  16. Peanut fatty acids and their impact on human health

    USDA-ARS's Scientific Manuscript database

    Peanuts contain a large amount of fat. Much of it is unsaturated, giving peanuts a positive effect on human health. A number of positive health effects from consuming peanuts have been reported in the scientific literature. These include lowering blood pressure, decreasing the risk of heart disea...

  17. Your Guide to Smart Year-Round Fundraising.

    ERIC Educational Resources Information Center

    Gensheimer, Cynthia Francis

    1994-01-01

    Describes three seasonal fund-raising projects that can be linked with the curriculum and require only a few parent volunteers. The projects are profitable without requiring large amounts of effort. They include selling holiday cards, conducting read-a-thons, and participating in save-the-rainforest group sales. Tips for holding successful fund…

  18. Empirical relationships between tree fall and landscape-level amounts of logging and fire

    PubMed Central

    Lindenmayer, David B.; Blanchard, Wade; Blair, David; McBurney, Lachlan; Stein, John; Banks, Sam C.

    2018-01-01

    Large old trees are critically important keystone structures in forest ecosystems globally. Populations of these trees are also in rapid decline in many forest ecosystems, making it important to quantify the factors that influence their dynamics at different spatial scales. Large old trees often occur in forest landscapes also subject to fire and logging. However, the effects of the amount of logging and fire in the surrounding landscape on the risk of collapse of large old trees are not well understood. Using an 18-year study in the Mountain Ash (Eucalyptus regnans) forests of the Central Highlands of Victoria, we quantify relationships between the probability of collapse of large old hollow-bearing trees at a site and the amount of logging and the amount of fire in the surrounding landscape. We found the probability of collapse increased with an increasing amount of logged forest in the surrounding landscape. It also increased with a greater amount of burned area in the surrounding landscape, particularly for trees in highly advanced stages of decay. The most likely explanation for elevated tree fall with an increasing amount of logged or burned areas in the surrounding landscape is change in wind movement patterns associated with cutblocks or burned areas. Previous studies show that large old hollow-bearing trees are already at high risk of collapse in our study area. New analyses presented here indicate that additional logging operations in the surrounding landscape will further elevate that risk. Current logging prescriptions require the protection of large old hollow-bearing trees on cutblocks. We suggest that efforts to reduce the probability of collapse of large old hollow-bearing trees on unlogged sites will demand careful landscape planning to limit the amount of timber harvesting in the surrounding landscape. PMID:29474487

  19. Empirical relationships between tree fall and landscape-level amounts of logging and fire.

    PubMed

    Lindenmayer, David B; Blanchard, Wade; Blair, David; McBurney, Lachlan; Stein, John; Banks, Sam C

    2018-01-01

    Large old trees are critically important keystone structures in forest ecosystems globally. Populations of these trees are also in rapid decline in many forest ecosystems, making it important to quantify the factors that influence their dynamics at different spatial scales. Large old trees often occur in forest landscapes also subject to fire and logging. However, the effects of the amount of logging and fire in the surrounding landscape on the risk of collapse of large old trees are not well understood. Using an 18-year study in the Mountain Ash (Eucalyptus regnans) forests of the Central Highlands of Victoria, we quantify relationships between the probability of collapse of large old hollow-bearing trees at a site and the amount of logging and the amount of fire in the surrounding landscape. We found the probability of collapse increased with an increasing amount of logged forest in the surrounding landscape. It also increased with a greater amount of burned area in the surrounding landscape, particularly for trees in highly advanced stages of decay. The most likely explanation for elevated tree fall with an increasing amount of logged or burned areas in the surrounding landscape is change in wind movement patterns associated with cutblocks or burned areas. Previous studies show that large old hollow-bearing trees are already at high risk of collapse in our study area. New analyses presented here indicate that additional logging operations in the surrounding landscape will further elevate that risk. Current logging prescriptions require the protection of large old hollow-bearing trees on cutblocks. We suggest that efforts to reduce the probability of collapse of large old hollow-bearing trees on unlogged sites will demand careful landscape planning to limit the amount of timber harvesting in the surrounding landscape.
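
    A relationship of this kind, collapse probability rising with the surrounding logged and burned fractions, is naturally expressed as a logistic model. The sketch below uses made-up coefficients for illustration, not the study's fitted values:

```python
import math

def collapse_probability(logged_frac, burned_frac,
                         b0=-2.0, b_logged=2.5, b_burned=1.8):
    """Logistic model of collapse probability for a large old tree as a
    function of the fractions of logged and burned forest in the
    surrounding landscape; all coefficients are hypothetical."""
    z = b0 + b_logged * logged_frac + b_burned * burned_frac
    return 1.0 / (1.0 + math.exp(-z))

# probability rises with the amount of surrounding logging
print(collapse_probability(0.0, 0.0), collapse_probability(0.5, 0.0))
```

    Interaction terms (e.g. decay stage times burned fraction) could be added the same way to capture the stronger fire effect the study reports for highly decayed trees.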

  20. Switch: a planning tool for power systems with large shares of intermittent renewable energy.

    PubMed

    Fripp, Matthias

    2012-06-05

    Wind and solar power are highly variable, so it is unclear how large a role they can play in future power systems. This work introduces a new open-source electricity planning model--Switch--that identifies the least-cost strategy for using renewable and conventional generators and transmission in a large power system over a multidecade period. Switch includes an unprecedented amount of spatial and temporal detail, making it possible to address a new type of question about the optimal design and operation of power systems with large amounts of renewable power. A case study of California for 2012-2027 finds that there is no maximum possible penetration of wind and solar power--these resources could potentially be used to reduce emissions 90% or more below 1990 levels without reducing reliability or severely raising the cost of electricity. This work also finds that policies that encourage customers to shift electricity demand to times when renewable power is most abundant (e.g., well-timed charging of electric vehicles) could make it possible to achieve radical emission reductions at moderate costs.
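
    Switch itself is a large multidecade capacity-expansion optimization; as a toy illustration of the least-cost principle it embodies, here is a single-hour merit-order dispatch with hypothetical generators (zero-marginal-cost renewables are used first):

```python
def dispatch(demand_mw, generators):
    """Greedy merit-order dispatch: fill demand from the cheapest
    marginal cost upward. generators: list of (name, capacity_mw,
    marginal_cost). Returns the dispatch plan and any unserved demand."""
    plan, remaining = [], demand_mw
    for name, cap, cost in sorted(generators, key=lambda g: g[2]):
        used = min(cap, remaining)
        if used > 0:
            plan.append((name, used))
            remaining -= used
    return plan, remaining  # remaining > 0 would mean unserved energy

gens = [("wind", 300, 0.0), ("solar", 200, 0.0), ("gas", 500, 40.0)]
plan, unserved = dispatch(700, gens)
print(plan, unserved)
```

    A real planning model like Switch simultaneously chooses which capacities to build across decades and how to operate them in many sampled hours, subject to transmission and reliability constraints, rather than dispatching one hour greedily.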

  1. A model for prioritizing landfills for remediation and closure: A case study in Serbia.

    PubMed

    Ubavin, Dejan; Agarski, Boris; Maodus, Nikola; Stanisavljevic, Nemanja; Budak, Igor

    2018-01-01

    The existence of large numbers of landfills that do not fulfill sanitary prerequisites presents a serious hazard for the environment in lower income countries. One of the main hazards is landfill leachate that contains various pollutants and presents a threat to groundwater. Groundwater pollution from landfills depends on various mutually interconnected factors such as the waste type and amount, the amount of precipitation, the landfill location characteristics, and operational measures, among others. Considering these factors, lower income countries face a selection problem where landfills urgently requiring remediation and closure must be identified from among a large number of sites. The present paper proposes a model for prioritizing landfills for closure and remediation based on multicriteria decision making, in which the hazards of landfill groundwater pollution are evaluated. The parameters for the prioritization of landfills are the amount of waste disposed, the amount of precipitation, the vulnerability index, and the rate of increase of the amount of waste in the landfill. Verification was performed using a case study in Serbia where all municipal landfills were included and 128 landfills were selected for prioritization. The results of the evaluation of Serbian landfills, prioritizing sites for closure and remediation, are presented for the first time. Critical landfills are identified, and prioritization ranks for the selected landfills are provided. Integr Environ Assess Manag 2018;14:105-119. © 2017 SETAC.
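
    A common multicriteria scheme of the kind described is a weighted sum over normalized criteria. The weights, site names, and values below are hypothetical illustrations, not those derived in the paper:

```python
def prioritize(landfills, weights):
    """Weighted-sum multicriteria score per landfill; higher score =
    more urgent. landfills: {name: {criterion: value normalized to
    [0, 1]}}; weights: {criterion: weight}, summing to 1."""
    scores = {name: sum(weights[c] * vals[c] for c in weights)
              for name, vals in landfills.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

weights = {"waste_amount": 0.4, "precipitation": 0.2,
           "vulnerability": 0.3, "waste_growth": 0.1}
sites = {
    "A": {"waste_amount": 0.9, "precipitation": 0.5,
          "vulnerability": 0.8, "waste_growth": 0.3},
    "B": {"waste_amount": 0.4, "precipitation": 0.9,
          "vulnerability": 0.2, "waste_growth": 0.6},
}
ranking = prioritize(sites, weights)
print(ranking)
```

    In practice the weights themselves are usually elicited by a structured method (e.g. pairwise comparison), which is where most of the modeling effort in such studies goes.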

  2. CD-ROM technology at the EROS data center

    USGS Publications Warehouse

    Madigan, Michael E.; Weinheimer, Mary C.

    1993-01-01

    The vast amount of digital spatial data often required by a single user has created a demand for media alternatives to 1/2" magnetic tape. One such medium that has been recently adopted at the U.S. Geological Survey's EROS Data Center is the compact disc (CD). CD's are a versatile, dynamic, and low-cost method for providing a variety of data on a single media device and are compatible with various computer platforms. CD drives are available for personal computers, UNIX workstations, and mainframe systems, either directly connected, or through a network. This medium furnishes a quick method of reproducing and distributing large amounts of data on a single CD. Several data sets are already available on CD's, including collections of historical Landsat multispectral scanner data and biweekly composites of Advanced Very High Resolution Radiometer data for the conterminous United States. The EROS Data Center intends to provide even more data sets on CD's. Plans include specific data sets on a customized disc to fulfill individual requests, and mass production of unique data sets for large-scale distribution. Requests for a single compact disc-read only memory (CD-ROM) containing a large volume of data either for archiving or for one-time distribution can be addressed with a CD-write once (CD-WO) unit. Mass production and large-scale distribution will require CD-ROM replication and mastering.

  3. Evaluation of high-level clouds in cloud resolving model simulations with ARM and KWAJEX observations

    DOE PAGES

    Liu, Zheng; Muhlbauer, Andreas; Ackerman, Thomas

    2015-11-05

    In this paper, we evaluate high-level clouds in a cloud resolving model during two convective cases, ARM9707 and KWAJEX. The simulated joint histograms of cloud occurrence and radar reflectivity compare well with cloud radar and satellite observations when using a two-moment microphysics scheme. However, simulations performed with a single-moment microphysical scheme exhibit low biases of approximately 20 dB. During convective events, two-moment microphysics overestimate the amount of high-level cloud and one-moment microphysics precipitate too readily and underestimate the amount and height of high-level cloud. For ARM9707, persistent large positive biases in high-level cloud are found, which are not sensitive to changes in ice particle fall velocity and ice nuclei number concentration in the two-moment microphysics. These biases are caused by biases in large-scale forcing and maintained by the periodic lateral boundary conditions. The combined effects include significant biases in high-level cloud amount, radiation, and high sensitivity of cloud amount to nudging time scale in both convective cases. The high sensitivity of high-level cloud amount to the thermodynamic nudging time scale suggests that thermodynamic nudging can be a powerful "tuning" parameter for the simulated cloud and radiation but should be applied with caution. The role of the periodic lateral boundary conditions in reinforcing the biases in cloud and radiation suggests that reducing the uncertainty in the large-scale forcing in high levels is important for similar convective cases and has far reaching implications for simulating high-level clouds in super-parameterized global climate models such as the multiscale modeling framework.

  4. A simple biosynthetic pathway for large product generation from small substrate amounts

    NASA Astrophysics Data System (ADS)

    Djordjevic, Marko; Djordjevic, Magdalena

    2012-10-01

    A recently emerging discipline of synthetic biology has the aim of constructing new biosynthetic pathways with useful biological functions. A major application of these pathways is generating a large amount of the desired product. However, toxicity due to the possible presence of toxic precursors is one of the main problems for such production. We consider here the problem of generating a large amount of product from a potentially toxic substrate. To address this, we propose a simple biosynthetic pathway, which can be induced to produce a large number of product molecules while keeping the substrate amount at low levels. Surprisingly, we show that large product generation crucially depends on fast non-specific degradation of the substrate molecules. We derive an optimal induction strategy, which allows as much as a three-orders-of-magnitude increase in the product amount for biologically realistic parameter values. We point to a recently discovered bacterial immune system (CRISPR/Cas in E. coli) as a putative example of the pathway analysed here. We also argue that the scheme proposed here can be used not only as a stand-alone pathway, but also as a strategy to produce a large amount of the desired molecules with small perturbations of endogenous biosynthetic pathways.
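
    The pathway's behaviour can be sketched with a toy two-variable model: substrate S is produced at a constant induced rate and removed both by conversion to product P and by fast non-specific degradation. When degradation is much faster than conversion, S settles at a low steady state while P accumulates. All rates below are hypothetical, not the paper's parameters:

```python
def simulate(k_in=1.0, k_conv=0.1, k_deg=5.0, t_end=50.0, dt=0.01):
    """Euler integration of dS/dt = k_in - (k_conv + k_deg)*S and
    dP/dt = k_conv*S: fast non-specific degradation (k_deg >> k_conv)
    keeps the toxic substrate S low while product P accumulates."""
    S = P = 0.0
    t = 0.0
    while t < t_end:
        dS = k_in - (k_conv + k_deg) * S
        dP = k_conv * S
        S += dS * dt
        P += dP * dt
        t += dt
    return S, P

S, P = simulate()
print(S, P)  # S near its low steady state k_in/(k_conv + k_deg); P well above S
```

    The substrate steady state is k_in/(k_conv + k_deg), so raising the degradation rate lowers substrate exposure; the trade-off, visible by varying k_deg, is that degradation also diverts flux away from the product.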

  5. Estimation and change tendency of rape straw resource in Leshan

    NASA Astrophysics Data System (ADS)

    Guan, Qinlan; Gong, Mingfu

    2018-04-01

    Rape straw in the Leshan area consists of the rape stalks, including stems, leaves and pods, that remain after the rapeseed is removed. Leshan is one of the main rape-planting areas in Sichuan Province, with a large planting area that produces a great deal of rape straw each year. Based on the trends in rapeseed planting area and rapeseed yield from 2008 to 2014, the change in rape straw resources in Leshan over that period was analyzed, providing a decision-making reference for the resource utilization of rape straw. The results showed that the amount of rape straw resources in Leshan was very large, exceeding 100,000 tons per year and increasing year by year; by 2014 it was close to 200,000 tons.
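
    Straw resource estimates of this kind are typically computed as grain yield multiplied by a straw-to-grain ratio. The ratio and yield figure below are assumed illustrative values, not numbers taken from the paper:

```python
def straw_amount(grain_yield_tons, straw_to_grain_ratio=1.5):
    """Estimate crop residue as grain yield times a straw-to-grain
    ratio; the default ratio of 1.5 is a hypothetical illustration."""
    return grain_yield_tons * straw_to_grain_ratio

# e.g. ~130,000 t of rapeseed at a 1.5 ratio would imply ~195,000 t of straw
print(straw_amount(130_000))
```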

  6. Discovering Related Clinical Concepts Using Large Amounts of Clinical Notes

    PubMed Central

    Ganesan, Kavita; Lloyd, Shane; Sarkar, Vikren

    2016-01-01

    The ability to find highly related clinical concepts is essential for many applications, such as hypothesis generation, query expansion for medical literature search, search results filtering, and ICD-10 code filtering. While manually constructed medical terminologies such as SNOMED CT can surface certain related concepts, these terminologies are inadequate as they depend on the expertise of several subject matter experts, leaving the terminology curation process open to geographic and language bias. In addition, these terminologies provide no quantifiable evidence of how related the concepts are. In this work, we explore an unsupervised graphical approach to mining related concepts by leveraging the sheer volume of information within large amounts of clinical notes. Our evaluation shows that we are able to use a data-driven approach to discover highly related concepts for various search terms, including medications, symptoms and diseases. PMID:27656096
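
    An unsupervised, data-driven notion of relatedness can be as simple as counting how often concepts co-occur across notes; this is a crude stand-in for the paper's graphical approach, and the concept names are hypothetical:

```python
from collections import Counter

def related_concepts(notes, term):
    """For each note (a set of extracted concepts) containing `term`,
    count the other concepts in it; co-occurrence frequency serves as
    a quantifiable relatedness score."""
    cooc = Counter()
    for concepts in notes:
        if term in concepts:
            cooc.update(c for c in concepts if c != term)
    return cooc.most_common()

notes = [{"metformin", "diabetes", "insulin"},
         {"diabetes", "insulin", "neuropathy"},
         {"metformin", "diabetes"}]
print(related_concepts(notes, "diabetes"))
```

    Real systems normalize these counts (e.g. pointwise mutual information) so that very common concepts do not dominate the ranking.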

  7. Enhancing the use of waste activated sludge as bio-fuel through selectively reducing its heavy metal content.

    PubMed

    Dewil, Raf; Baeyens, Jan; Appels, Lise

    2007-06-18

    Co-incineration in power plants or cement kilns is an important disposal route for the large amounts of waste activated sludge (WAS) generated annually. The presence of significant amounts of heavy metals in the sludge, however, poses serious problems, since the metals are partly emitted with the flue gases (and collected during flue gas dedusting) and partly incorporated in the incinerator ashes: in both cases, the disposal or reuse of the fly ash and bottom ash can be jeopardized, since subsequent leaching can occur in landfill disposal, or their "pozzolanic" incorporation in cement cannot be applied. The present paper studies some physicochemical methods for reducing the heavy metal content of WAS. The techniques used include acid and alkaline thermal hydrolysis and Fenton's peroxidation. By degrading the extracellular polymeric substances, which provide binding sites for a large amount of heavy metals, these methods release the metals into the sludge water. The behaviour of several heavy metals (Cd, Cr, Cu, Hg, Pb, Ni, Zn) was assessed in laboratory tests, the results of which show a significant reduction of most heavy metals.

  8. Coping with Prescription Drug Cost Sharing: Knowledge, Adherence, and Financial Burden

    PubMed Central

    Reed, Mary; Brand, Richard; Newhouse, Joseph P; Selby, Joe V; Hsu, John

    2008-01-01

    Objective: Assess patient knowledge of and response to drug cost sharing. Study Setting: Adult members of a large prepaid, integrated delivery system. Study Design/Data Collection: Telephone interviews with 932 participants (72 percent response rate) who reported knowledge of the structures and amounts of their prescription drug cost sharing. Participants reported cost-related changes in their drug adherence, any financial burden, and other cost-coping behaviors. Actual cost sharing amounts came from administrative databases. Principal Findings: Overall, 27 percent of patients knew all of their drug cost sharing structures and amounts. After adjustment for individual characteristics, additional patient cost sharing structures (tiers and caps) and higher copayment amounts were associated with reporting decreased adherence, financial burden, or other cost-coping behaviors. Conclusions: Patient knowledge of their drug benefits is limited, especially for more complex cost sharing structures. Patients also report a range of responses to greater cost sharing, including decreasing adherence. PMID:18370979

  9. Annealing Increases Stability Of Iridium Thermocouples

    NASA Technical Reports Server (NTRS)

    Germain, Edward F.; Daryabeigi, Kamran; Alderfer, David W.; Wright, Robert E.; Ahmed, Shaffiq

    1989-01-01

    Metallurgical studies carried out on samples of iridium versus iridium/40-percent rhodium thermocouples in condition received from manufacturer. Metallurgical studies included x-ray, macroscopic, resistance, and metallographic studies. Revealed large amount of internal stress caused by cold-working during manufacturing, and large number of segregations and inhomogeneities. Samples annealed in furnace at temperatures from 1,000 to 2,000 degree C for intervals up to 1 h to study effects of heat treatment. Wire annealed by this procedure found to be ductile.

  10. Proxy system modeling of tree-ring isotope chronologies over the Common Era

    NASA Astrophysics Data System (ADS)

    Anchukaitis, K. J.; LeGrande, A. N.

    2017-12-01

    The Asian monsoon can be characterized in terms of both precipitation variability and atmospheric circulation across a range of spatial and temporal scales. While multicentury time series of tree-ring widths at hundreds of sites across Asia provide estimates of past rainfall, the oxygen isotope ratios of annual rings may reveal broader regional hydroclimate and atmosphere-ocean dynamics. Tree-ring oxygen isotope chronologies from Monsoon Asia have been interpreted to reflect a local 'amount effect', relative humidity, source water and seasonality, and winter snowfall. Here, we use an isotope-enabled general circulation model simulation from the NASA Goddard Institute for Space Studies (GISS) Model E and a proxy system model of the oxygen isotope composition of tree-ring cellulose to interpret the large-scale and local climate controls on δ18O chronologies. Broad-scale dominant signals are associated with a suite of covarying hydroclimate variables including growing season rainfall amounts, relative humidity, and vapor pressure deficit. Temperature and source water influences are region-dependent, as are the simulated tree-ring isotope signals associated with the El Niño-Southern Oscillation (ENSO) and large-scale indices of the Asian monsoon circulation. At some locations, including southern coastal Viet Nam, local precipitation isotope ratios and the resulting simulated δ18O tree-ring chronologies reflect upstream rainfall amounts and atmospheric circulation associated with monsoon strength and wind anomalies.

  11. 17 CFR Appendix B to Part 420 - Sample Large Position Report

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... $ Memorandum 2: Report the gross par amount of fails to deliver. Included in the calculation of line item 3... millions at par value as of trade date] Security Being Reported Date For Which Information is Being... Principal Components of the Specific Security $ Total Net Trading Position $ 2. Gross Financing Position...

  12. 17 CFR Appendix B to Part 420 - Sample Large Position Report

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... $ Memorandum 2: Report the gross par amount of fails to deliver. Included in the calculation of line item 3... millions at par value as of trade date] Security Being Reported Date For Which Information is Being... Principal Components of the Specific Security $ Total Net Trading Position $ 2. Gross Financing Position...

  13. 17 CFR Appendix B to Part 420 - Sample Large Position Report

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... $ Memorandum 2: Report the gross par amount of fails to deliver. Included in the calculation of line item 3... millions at par value as of trade date] Security Being Reported Date For Which Information is Being... Principal Components of the Specific Security $ Total Net Trading Position $ 2. Gross Financing Position...

  14. A Review of Large-Scale "How Much Information?" Inventories: Variations, Achievements and Challenges

    ERIC Educational Resources Information Center

    Hilbert, Martin

    2015-01-01

    Introduction: Pressed by the increasing social importance of digital information, including the current attention given to the "big data paradigm", several research projects have taken up the challenge to quantify the amount of technologically mediated information. Method: This meta-study reviews the eight most important inventories in a…

  15. 76 FR 66629 - Establishment of the Pine Mountain-Cloverdale Peak Viticultural Area

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-27

    ... explains. The petition states that local growers report that Pine Mountain vineyards are naturally free of.... Southern storms often stall over Pine Mountain and the Mayacmas range, dropping more rain than in other..., and very well to excessively well-drained. Also, these mountain soils include large amounts of sand...

  16. 75 FR 29686 - Proposed Establishment of the Pine Mountain-Mayacmas Viticultural Area

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-27

    ... states that local growers report that Pine Mountain vineyards are naturally free of mildew, a vineyard... often stall over Pine Mountain and the Mayacmas range, dropping more rain than in other areas. Pine..., these mountain soils include large amounts of sand and gravel. Pine Mountain soils are generally less...

  17. Light intensity and quality from sole-source light-emitting diodes impact growth, morphology, and nutrient content of Brassica microgreens

    USDA-ARS?s Scientific Manuscript database

    Multi-layer vertical production systems using sole-source (SS) lighting can be used for microgreen production; however, traditional SS lighting can consume large amounts of electrical energy. Light-emitting diodes (LEDs) offer many advantages over conventional light sources including: high photoelec...

  18. Helping Young Children Understand Graphs: A Demonstration Study.

    ERIC Educational Resources Information Center

    Freeland, Kent; Madden, Wendy

    1990-01-01

    Outlines a demonstration lesson showing third graders how to make and interpret graphs. Includes descriptions of purpose, vocabulary, and learning activities in which students graph numbers of students with dogs at home and analyze the contents of M&M candy packages by color. Argues process helps students understand large amounts of abstract…

  19. Response to Fenton and Fenton: evidence does not support the alkaline diet

    USDA-ARS?s Scientific Manuscript database

    In the space available in this broad review, we focused on large trials published since the 2011 Fenton meta-analysis. This included two trials published in 2013 and one in 2015. These trials found favorable effects of supplementation with alkaline salts of potassium, in amounts of 60 mmol/day and h...

  20. Bringing Text Display Digital Radio to Consumers with Hearing Loss

    ERIC Educational Resources Information Center

    Sheffield, Ellyn G.; Starling, Michael; Schwab, Daniel

    2011-01-01

    Radio is migrating to digital transmission, expanding its offerings to include captioning for individuals with hearing loss. Text display radio requires a large amount of word throughput with minimal screen display area, making good user interface design crucial to its success. In two experiments, we presented hearing, hard-of-hearing, and deaf…

  1. Ammonia losses from a southern high plains dairy during summer

    USDA-ARS?s Scientific Manuscript database

    Animal agriculture is a significant source of ammonia (NH3). Cattle excrete a large amount of nitrogen (N); most urinary N is converted to NH3, volatilized and lost to the atmosphere. Open lot dairies on the southern High Plains are a growing industry and face environmental challenges including repo...

  2. Looking for High Quality Accreditation in Higher Education in Colombia

    ERIC Educational Resources Information Center

    Pérez Gama, Jesús Alfonso; Vega Vega, Anselmo

    2017-01-01

    We look for the High Quality Accreditation of tertiary education in two ways: one, involving a large amount of information, including issues such as self-assessment, high quality, statistics, indicators, surveys, and field work (process engineering), during several periods of time; and the second, in relation to the information contained there about…

  3. Molecular Hydrogen as an Emerging Therapeutic Medical Gas for Neurodegenerative and Other Diseases

    PubMed Central

    Ohno, Kinji; Ito, Mikako; Ichihara, Masatoshi; Ito, Masafumi

    2012-01-01

    Effects of molecular hydrogen on various diseases have been documented for 63 disease models and human diseases in the past four and a half years. Most studies have been performed on rodents including two models of Parkinson's disease and three models of Alzheimer's disease. Prominent effects are observed especially in oxidative stress-mediated diseases including neonatal cerebral hypoxia; Parkinson's disease; ischemia/reperfusion of spinal cord, heart, lung, liver, kidney, and intestine; transplantation of lung, heart, kidney, and intestine. Six human diseases have been studied to date: diabetes mellitus type 2, metabolic syndrome, hemodialysis, inflammatory and mitochondrial myopathies, brain stem infarction, and radiation-induced adverse effects. Two enigmas, however, remain to be solved. First, no dose-response effect is observed. Rodents and humans are able to take in only a small amount of hydrogen by drinking hydrogen-rich water, yet marked effects are observed. Second, intestinal bacteria in humans and rodents produce a large amount of hydrogen, but the addition of a small amount of hydrogen exhibits marked effects. Further studies are required to elucidate the molecular bases of the prominent hydrogen effects and to determine the optimal frequency, amount, and method of hydrogen administration for each human disease. PMID:22720117

  4. Natural gas hydrate occurrence and issues

    USGS Publications Warehouse

    Kvenvolden, K.A.

    1994-01-01

    Naturally occurring gas hydrate is found in sediment of two regions: (1) continental, including continental shelves, at high latitudes where surface temperatures are very cold, and (2) submarine outer continental margins where pressures are very high and bottom-water temperatures are near 0 °C. Continental gas hydrate is found in association with onshore and offshore permafrost. Submarine gas hydrate is found in sediment of continental slopes and rises. The amount of methane present in gas hydrate is thought to be very large, but the estimates that have been made are more speculative than real. Nevertheless, at the present time there has been a convergence of ideas regarding the amount of methane in gas hydrate deposits worldwide at about 2 x 10^16 m^3 or 7 x 10^17 ft^3 = 7 x 10^5 Tcf [Tcf = trillion (10^12) ft^3]. The potentially large amount of methane in gas hydrate and the shallow depth of gas hydrate deposits are two of the principal factors driving research concerning this substance. Such a large amount of methane, if it could be commercially produced, provides a potential energy resource for the future. Because gas hydrate is metastable, changes of surface pressure and temperature affect its stability. Destabilized gas hydrate beneath the sea floor leads to geologic hazards such as submarine mass movements. Examples of submarine slope failures attributed to gas hydrate are found worldwide. The metastability of gas hydrate may also have an effect on climate. The release of methane, a 'greenhouse' gas, from destabilized gas hydrate may contribute to global warming and be a factor in global climate change.
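
    As a quick sanity check, the two figures quoted in the abstract above are mutually consistent. A minimal sketch of the unit conversion (the factor 1 m^3 ≈ 35.3147 ft^3 is our assumption, not taken from the record):

```python
# Back-of-the-envelope check of the worldwide gas-hydrate methane
# estimate: 2 x 10^16 m^3 should equal roughly 7 x 10^17 ft^3,
# i.e. about 7 x 10^5 Tcf.

M3_TO_FT3 = 35.3147          # cubic feet per cubic metre (assumed factor)
TCF = 1e12                   # one trillion cubic feet

methane_m3 = 2e16            # worldwide estimate from the abstract, m^3
methane_ft3 = methane_m3 * M3_TO_FT3
methane_tcf = methane_ft3 / TCF

print(f"{methane_ft3:.1e} ft^3")   # prints "7.1e+17 ft^3"
print(f"{methane_tcf:.1e} Tcf")    # prints "7.1e+05 Tcf"
```

    The product lands at ~7.06 x 10^17 ft^3, matching the rounded 7 x 10^17 ft^3 in the record.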

  5. Gas Production Within Stromatolites Across the Archean: Evidence For Ancient Microbial Metabolisms

    NASA Astrophysics Data System (ADS)

    Wilmeth, D.; Corsetti, F. A.; Berelson, W.; Beukes, N. J.; Awramik, S. M.; Petryshyn, V. A.

    2017-12-01

    Identifying the presence of specific microbial metabolisms in the Archean is a fundamental goal of deep-time geobiology. Certain fenestral textures within Archean stromatolites provide evidence for the presence of gas, and therefore gas-releasing metabolisms, within ancient microbial mats. Paleoenvironmental analysis indicates many of the stromatolites formed in shallow, agitated aqueous environments, with relatively rapid gas production and lithification of fenestrae. Proposed gases include oxygen, carbon dioxide, methane, hydrogen sulfide, and various nitrogen species, produced by appropriate metabolisms. This study charts the presence of gas-related fenestrae in Archean stromatolites over time, and examines the potential for various metabolisms to produce fenestral textures. Fenestral textures are present in Archean stromatolites on at least four separate cratons from 3.5 to 2.5 Ga. Fenestrae are preserved in carbonate and chert microbialites of various morphologies, including laminar, domal, and conical forms. Extensive fenestral textures, with dozens of fenestrae along individual laminae, are especially prevalent in Neoarchean stromatolites (2.8 -2.5 Ga). The volume of gas within Archean microbial mats was estimated by measuring fenestrae in ancient stromatolites and bubbles within modern mats. The time needed for metabolisms to produce appropriate gas volumes was calculated using modern rates obtained from the literature. Given the paleoenvironmental conditions, the longer a metabolism takes to make large amounts of gas, the less likely large bubbles will remain long enough to become preserved. Additionally, limiting reactants were estimated for each metabolism using previous Archean geochemical models. Metabolisms with limited reactants are less likely to produce large amounts of gas. 
Oxygenic photosynthesis can produce large amounts of gas within minutes, and the necessary reactants (carbon dioxide and water) were readily available in Archean environments. In the absence of clear sedimentary or geochemical evidence for abundant hydrogen or oxidized sulfur and nitrogen species during stromatolite morphogenesis, oxygenic photosynthesis is the metabolism with the highest potential for producing fenestrae before the Great Oxidation Event.

  6. Querying Large Biological Network Datasets

    ERIC Educational Resources Information Center

    Gulsoy, Gunhan

    2013-01-01

    New experimental methods have resulted in an increasing amount of genetic interaction data being generated every day. Biological networks are used to store the genetic interaction data gathered. The increasing amount of data available requires fast, large-scale analysis methods. Therefore, we address the problem of querying large biological network datasets.…

  7. The dust cloud of the century

    NASA Astrophysics Data System (ADS)

    Robock, A.

    1983-02-01

    The structure and composition of the dust cloud from the 4 April 1982 eruption of the El Chichon volcano in Chiapas state, Mexico, is examined and the possible effects of the dust cloud on the world's weather patterns are discussed. Observations of the cloud using a variety of methods are evaluated, including data from the GOES and NOAA-7 weather satellites, vertically pointing lidar measurements, the SME satellite, and the Nimbus-7 satellite. Studies of the gaseous and particulate composition of the cloud reveal the presence of large amounts of sulfuric acid particles, which have a long mean residence time in the atmosphere and have a large effect on the amount of solar radiation received at the earth's surface by scattering several percent of the radiation back to space. Estimates of the effect of this cloud on surface air temperature changes are presented based on findings from climate models.

  8. The industry of wildcrafting, gathering, and harvesting of NTFPs: an insider's perspective

    Treesearch

    Barb Letchworth

    2001-01-01

    The natural products industry has been undergoing a tremendous amount of change in the past few years. Large corporations including pharmaceutical companies, food and drug store chains, and even department stores have been adding medicinal herbs to their offerings. We can find anything from naturally raw bulk herbs, to standardized extracts, to time-released...

  9. 44 CFR 10.8 - Determination of requirement for environmental review.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... apply: (i) If an action will result in an extensive change in land use or the commitment of a large amount of land; (ii) If an action will result in a land use change which is incompatible with the... ecosystems, including endangered species; (vi) If an action will result in a major adverse impact upon air or...

  10. 44 CFR 10.8 - Determination of requirement for environmental review.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... apply: (i) If an action will result in an extensive change in land use or the commitment of a large amount of land; (ii) If an action will result in a land use change which is incompatible with the... ecosystems, including endangered species; (vi) If an action will result in a major adverse impact upon air or...

  11. 44 CFR 10.8 - Determination of requirement for environmental review.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... apply: (i) If an action will result in an extensive change in land use or the commitment of a large amount of land; (ii) If an action will result in a land use change which is incompatible with the... ecosystems, including endangered species; (vi) If an action will result in a major adverse impact upon air or...

  12. 44 CFR 10.8 - Determination of requirement for environmental review.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... apply: (i) If an action will result in an extensive change in land use or the commitment of a large amount of land; (ii) If an action will result in a land use change which is incompatible with the... ecosystems, including endangered species; (vi) If an action will result in a major adverse impact upon air or...

  13. 44 CFR 10.8 - Determination of requirement for environmental review.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... apply: (i) If an action will result in an extensive change in land use or the commitment of a large amount of land; (ii) If an action will result in a land use change which is incompatible with the... ecosystems, including endangered species; (vi) If an action will result in a major adverse impact upon air or...

  14. Retrospective Mining of Toxicology Data to Discover Multispecies and Chemical Class Effects: Anemia as a Case Study

    EPA Science Inventory

    Predictive toxicity models (in vitro to in vivo, QSAR, read-across) rely on large amounts of accurate in vivo data. Here, we analyze the quality of in vivo data from the Toxicity Reference Database (ToxRefDB), using chemical-induced anemia as an example. Considerations include v...

  15. Measuring Well-Being and Progress

    ERIC Educational Resources Information Center

    D'Acci, Luca

    2011-01-01

    Well-being is becoming a concept which is more and more involved in any world development consideration. A large amount of work is being carried out to study measurements of well-being, including a more holistic vision on the development and welfare of a country. This paper proposes an idea of well-being and progress being in equilibrium with each…

  16. Inventory: 26 Reasons for Doing One

    ERIC Educational Resources Information Center

    Braxton, Barbara

    2005-01-01

    A stocktake is a legal requirement that ensures that teacher-librarians are accountable for the money they have spent throughout the year. Including staff salaries, the library absorbs a large amount of the annual school budget, so it is essential that funds are spent wisely. Inventory is done at the end of each academic year as part of the…

  17. 26 CFR 301.6867-1 - Presumptions where owner of large amount of cash is not identified.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Revenue Code, relating to abatements, credits, and refunds, and may not institute a suit for refund in... 6532(c), relating to the 9-month statute of limitations for suits under section 7426. In addition, the...) Postage stamps; (F) Traveler's checks in any form; (G) Negotiable instruments (including personal checks...

  18. 26 CFR 301.6867-1 - Presumptions where owner of large amount of cash is not identified.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Revenue Code, relating to abatements, credits, and refunds, and may not institute a suit for refund in... 6532(c), relating to the 9-month statute of limitations for suits under section 7426. In addition, the...) Postage stamps; (F) Traveler's checks in any form; (G) Negotiable instruments (including personal checks...

  19. Engineering Design Handbook. Development Guide for Reliability. Part Two. Design for Reliability

    DTIC Science & Technology

    1976-01-01

    Component failure rates, however, have been recorded by many sources as a function of use and environment. Some of these sources are listed in Refs. 13-17...other systems capable of creating an explosive reaction. The second category is fairly obvious and includes many variations on methods for providing...about them. 4. Ability to detect signals (including patterns) in high noise environments. 5. Ability to store large amounts of information for long

  20. Handling Qualities of Large Flexible Aircraft. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Poopaka, S.

    1980-01-01

    The effects on handling qualities of elastic-mode interaction with the rigid-body dynamics of a large flexible aircraft are studied by mathematical computer simulation. An analytical method to predict pilot ratings when there are severe mode interactions is developed. This is done by extending the optimal control model of the human pilot response to include a mode decomposition mechanism. The handling qualities are determined for a longitudinal tracking task using a large flexible aircraft, with parametric variations in the undamped natural frequencies of the two lowest-frequency symmetric elastic modes made to induce varying amounts of mode interaction.

  1. Scaling up to address data science challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendelberger, Joanne R.

    Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. Here, this article will explore various challenges in Data Science and will highlight statistical approaches that can facilitate analysis of large-scale data including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.
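
    The sampling and data-reduction methods mentioned in the record above can be illustrated with a small sketch. Reservoir sampling is one standard single-pass technique for reducing a data stream of unknown length to a fixed-size uniform sample; the function below is our illustration, not code from the article:

```python
import random

def reservoir_sample(stream, k, rng=random.Random(0)):
    """Keep a uniform random sample of k items from a stream of
    unknown length, using O(k) memory -- one way to reduce a
    large dataset to a manageable subset in a single pass."""
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)
        else:
            # Replace an existing element with probability k/(i+1),
            # which keeps every item equally likely to survive.
            j = rng.randint(0, i)
            if j < k:
                reservoir[j] = item
    return reservoir

sample = reservoir_sample(range(1_000_000), 10)
print(len(sample))  # prints 10
```

    The appeal for large-scale data is that the stream is traversed exactly once and never held in memory, so the same code works whether the input is a list or a generator over terabytes of records.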

  2. Scaling up to address data science challenges

    DOE PAGES

    Wendelberger, Joanne R.

    2017-04-27

    Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. Here, this article will explore various challenges in Data Science and will highlight statistical approaches that can facilitate analysis of large-scale data including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.

  3. The NASA Lewis large wind turbine program

    NASA Technical Reports Server (NTRS)

    Thomas, R. L.; Baldwin, D. H.

    1981-01-01

    The program is directed toward development of the technology for safe, reliable, environmentally acceptable large wind turbines that have the potential to generate a significant amount of electricity at costs competitive with conventional electric generation systems. In addition, these large wind turbines must be fully compatible with electric utility operations and interface requirements. Advances are made by gaining a better understanding of the system design drivers, improvements in the analytical design tools, verification of design methods with operating field data, and the incorporation of new technology and innovative designs. An overview of the program activities is presented and includes results from the first and second generation field machines (Mod-0A, -1, and -2), the design phase of the third generation wind turbine (Mod-5) and the advanced technology projects. Also included is the status of the Department of Interior WTS-4 machine.

  4. Identifying and quantifying urban recharge: a review

    NASA Astrophysics Data System (ADS)

    Lerner, David N.

    2002-02-01

    The sources of and pathways for groundwater recharge in urban areas are more numerous and complex than in rural environments. Buildings, roads, and other surface infrastructure combine with man-made drainage networks to change the pathways for precipitation. Some direct recharge is lost, but additional recharge can occur from storm drainage systems. Large amounts of water are imported into most cities for supply, distributed through underground pipes, and collected again in sewers or septic tanks. The leaks from these pipe networks often provide substantial recharge. Sources of recharge in urban areas are identified through piezometry, chemical signatures, and water balances. All three approaches have problems. Recharge is quantified either by individual components (direct recharge, water-mains leakage, septic tanks, etc.) or holistically. Working with individual components requires large amounts of data, much of which is uncertain and is likely to lead to large uncertainties in the final result. Recommended holistic approaches include the use of groundwater modelling and solute balances, where various types of data are integrated. Urban recharge remains an under-researched topic, with few high-quality case studies reported in the literature.
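
    The record's point that component-by-component recharge estimates accumulate large uncertainties can be sketched numerically. All figures below are invented for illustration; only the quadrature rule for combining independent errors is standard:

```python
import math

# Illustrative component-wise urban recharge budget (mm/yr).
# Values and one-sigma uncertainties are hypothetical -- the point is
# that summing several uncertain components inflates the uncertainty
# of the total.
components = {                 # (estimate, one-sigma uncertainty)
    "direct recharge":   (40.0, 15.0),
    "mains leakage":     (60.0, 25.0),
    "sewer/septic loss": (30.0, 20.0),
    "storm drainage":    (20.0, 10.0),
}

total = sum(v for v, _ in components.values())
# Assuming the component errors are independent, they add in quadrature.
sigma = math.sqrt(sum(s**2 for _, s in components.values()))

print(f"total recharge ~ {total:.0f} +/- {sigma:.0f} mm/yr")
# prints "total recharge ~ 150 +/- 37 mm/yr"
```

    Even with modest uncertainty on each component, the combined error is roughly a quarter of the total here, which is why the record recommends holistic checks (groundwater modelling, solute balances) against the component sum.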

  5. Signal and image processing algorithm performance in a virtual and elastic computing environment

    NASA Astrophysics Data System (ADS)

    Bennett, Kelly W.; Robertson, James

    2013-05-01

    The U.S. Army Research Laboratory (ARL) supports the development of classification, detection, tracking, and localization algorithms using multiple sensing modalities including acoustic, seismic, E-field, magnetic field, PIR, and visual and IR imaging. Multimodal sensors collect large amounts of data in support of algorithm development. The resulting large amounts of data, and the associated high-performance computing needs, challenge existing computing infrastructures. Purchasing computer power as a commodity using a Cloud service offers low-cost, pay-as-you-go pricing models, scalability, and elasticity that may provide solutions for developing and optimizing algorithms without having to procure additional hardware and resources. This paper provides a detailed look at using a commercial cloud service provider, such as Amazon Web Services (AWS), to develop and deploy simple signal and image processing algorithms in a cloud and run the algorithms on a large set of data archived in the ARL Multimodal Signatures Database (MMSDB). Analytical results provide performance comparisons with existing infrastructure. A discussion of using cloud computing with government data covers best security practices that exist within cloud services, such as AWS.

  6. Environmental contaminants and the management of bat populations in the United States

    USGS Publications Warehouse

    Clark, D.R.

    1988-01-01

    Food-chain residues of organochlorine pesticides probably have been involved in declines of some U.S. bat populations; examples include free-tailed bats at Carlsbad Cavern, New Mexico, and the endangered gray bat at sites in Missouri and Alabama. If a long-lived contaminant has not been dispersed in large amounts over large areas, its impact may be controlled by administrative action that stops its use or other environmental discharge, or that results in physical isolation of localized contamination so that it no longer enters food chains.

  7. Automated Absorber Attachment for X-ray Microcalorimeter Arrays

    NASA Technical Reports Server (NTRS)

    Moseley, S.; Allen, Christine; Kilbourne, Caroline; Miller, Timothy M.; Costen, Nick; Schulte, Eric; Moseley, Samuel J.

    2007-01-01

    Our goal is to develop a method for the automated attachment of large numbers of absorber tiles to large format detector arrays. This development includes the fabrication of high quality, closely spaced HgTe absorber tiles that are properly positioned for pick-and-place by our FC150 flip chip bonder. The FC150 also transfers the appropriate minute amount of epoxy to the detectors for permanent attachment of the absorbers. The success of this development will replace an arduous, risky and highly manual task with a reliable, high-precision automated process.

  8. Analysis of volatile organic compounds. [trace amounts of organic volatiles in gas samples

    NASA Technical Reports Server (NTRS)

    Zlatkis, A. (Inventor)

    1977-01-01

    An apparatus and method are described for reproducibly analyzing trace amounts of a large number of organic volatiles existing in a gas sample. Direct injection of the trapped volatiles into a cryogenic precolumn provides a sharply defined plug. Applications of the method include: (1) analyzing the headspace gas of body fluids and comparing a profile of the organic volatiles with standard profiles for the detection and monitoring of disease; (2) analyzing the headspace gas of foods and beverages and comparing the profile with standard profiles to monitor and control flavor and aroma; and (3) analyses for determining the organic pollutants in air or water samples.

  9. INFLUENCE OF ANESTHESIA ON EXPERIMENTAL NEUROTROPIC VIRUS INFECTIONS

    PubMed Central

    Sulkin, S. Edward; Zarafonetis, Christine

    1947-01-01

    1. Experimental neurotropic virus infections previously shown to be altered by ether anesthesia are caused by viruses destroyed in vitro by anesthetic ether; this group includes the viruses of Eastern equine encephalomyelitis, Western equine encephalomyelitis, and St. Louis encephalitis. 2. Experimental neurotropic virus infections which were not altered by ether anesthesia are caused by viruses which are refractory to the in vitro virucidal activity of even large amounts of anesthetic ether; this group includes the viruses of poliomyelitis (Lansing) and rabies. 3. Quantitative studies of the in vitro virucidal activity of ether indicate that concentrations of this anesthetic within the range found in central nervous system tissues of anesthetized animals possess no virucidal activity. 4. The lowest concentration of ether possessing significant virucidal capacity is more than fifteen times the maximum concentration of the anesthetic tolerated by the experimental animal. 5. Concentrations of ether 50 to 100 times the maximum amount tolerated by the anesthetized animal are capable of destroying large amounts of susceptible viruses, the average lethal dose (LD50) being reduced more than 5 log units. 6. On the basis of the studies presented in this report, it cannot be concluded that direct virucidal activity of ether is not the underlying mechanism of the inhibition by anesthesia of certain experimental neurotropic virus infections. Indirect inhibition of the virus by the anesthetic through an alteration in the metabolism of either the host cell or the host animal as a whole appears at this point to be a more likely possibility. PMID:19871636

  10. Analysis of inorganic and organic constituents of myrrh resin by GC-MS and ICP-MS: An emphasis on medicinal assets.

    PubMed

    Ahamad, Syed Rizwan; Al-Ghadeer, Abdul Rahman; Ali, Raisuddin; Qamar, Wajhul; Aljarboa, Suliman

    2017-07-01

    The aim of the present investigation was to explore the constituents of the Arabian myrrh resin obtained from Commiphora myrrha. The organic and inorganic composition of the myrrh gum resin was investigated using gas chromatography-mass spectrometry (GC-MS) and inductively coupled plasma-mass spectrometry (ICP-MS). Analysis by ICP-MS revealed the presence of various inorganic elements in significant amounts in the myrrh resin. The elements found to be present in large amounts include calcium, magnesium, aluminum, phosphorus, chlorine, chromium, bromine and scandium. The important organic constituents identified in the myrrh ethanolic extract include limonene, curzerene, germacrene B, isocericenine, myrcenol, beta selinene, and spathulenol. The present work complements other myrrh-associated investigations done in the past and provides additional data for future research.

  11. A New Student Performance Analysing System Using Knowledge Discovery in Higher Educational Databases

    ERIC Educational Resources Information Center

    Guruler, Huseyin; Istanbullu, Ayhan; Karahasan, Mehmet

    2010-01-01

    Knowledge discovery is a wide-ranging process, including data mining, used to find meaningful and useful patterns in large amounts of data. In order to explore the factors having an impact on the success of university students, knowledge discovery software, called MUSKUP, has been developed and tested on student data. In this system a…

  12. Responding to Big Data in the Art Education Classroom: Affordances and Problematics

    ERIC Educational Resources Information Center

    Duncum, Paul

    2018-01-01

    The article raises questions about the use in art education classrooms of social networking sites like Facebook and image sharing sites like YouTube that rely upon the ability of Big Data to aggregate large amounts of data, including data on students. The article also offers suggestions for the responsible use of these sites. Many youth are using…

  13. Urbanization eases water crisis in China

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shu-Guang; Ji, Chen

    2012-01-01

    Socioeconomic development in China has resulted in rapid urbanization, including a large number of people making the transition from rural areas to cities. Many have speculated that this mass migration may have worsened the water crisis in many parts of the country. However, this study shows that the water crisis would be more severe if the rural-to-urban migration did not occur.

  14. Generation and Recovery of Solid Wood Waste in the U.S.

    Treesearch

    Bob Falk; David McKeever

    2012-01-01

    North America has a vast system of hardwood and softwood forests, and the wood harvested from this resource is widely used in many applications. These include lumber and other building materials, furniture, crating, containers, pallets and other consumer goods. This wide array of wood products generates not only a large amount of industrial wood by-product during the...

  15. Insulating Cryogenic Pipes With Frost

    NASA Technical Reports Server (NTRS)

    Stephenson, J. G.; Bova, J. A.

    1985-01-01

    Crystallized water vapor fills voids in pipe insulation. Small, carefully controlled amount of water vapor introduced into dry nitrogen gas before it enters aft fuselage. Vapor freezes on pipes, filling cracks in insulation. Ice prevents gaseous nitrogen from condensing on pipes and dripping on structure, in addition to helping to insulate all parts. Industrial applications include large refrigeration plants or facilities that use cryogenic liquids.

  16. Helioviewer.org: Browsing Very Large Image Archives Online Using JPEG 2000

    NASA Astrophysics Data System (ADS)

    Hughitt, V. K.; Ireland, J.; Mueller, D.; Dimitoglou, G.; Garcia Ortiz, J.; Schmidt, L.; Wamsler, B.; Beck, J.; Alexanderian, A.; Fleck, B.

    2009-12-01

    As the amount of solar data available to scientists continues to increase at faster and faster rates, it is important that there exist simple tools for navigating this data quickly with a minimal amount of effort. By combining heterogeneous solar physics data types such as full-disk images and coronagraphs, along with feature and event information, Helioviewer offers a simple and intuitive way to browse multiple datasets simultaneously. Images are stored in a repository using the JPEG 2000 format and tiled dynamically upon a client's request. By tiling images and serving only the portions of the image requested, it is possible for the client to work with very large images without having to fetch all of the data at once. In addition to a focus on intercommunication with other virtual observatories and browsers (VSO, HEK, etc.), Helioviewer will offer a number of externally-available application programming interfaces (APIs) to enable easy third-party use, adoption and extension. Recent efforts have resulted in increased performance, dynamic movie generation, and improved support for mobile web browsers. Future functionality will include: support for additional data sources including RHESSI, SDO, STEREO, and TRACE; a navigable timeline of recorded solar events; social annotation; and basic client-side image processing.
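    The tiling strategy described above can be sketched with a little arithmetic: given a viewport over a very large image, only the tiles that intersect the viewport need to be requested. A minimal Python illustration (the 512-pixel tile size and the coordinate convention are assumptions for illustration, not Helioviewer's actual API):

```python
def tiles_for_viewport(x, y, width, height, tile_size=512):
    """Return the (col, row) indices of every tile that intersects a
    viewport positioned at (x, y) with the given width and height.
    Only these tiles need to be fetched from the server."""
    first_col = x // tile_size
    first_row = y // tile_size
    last_col = (x + width - 1) // tile_size
    last_row = (y + height - 1) // tile_size
    return [(c, r)
            for r in range(first_row, last_row + 1)
            for c in range(first_col, last_col + 1)]

# A 1024x1024 viewport at the origin of an arbitrarily large image
# touches only four 512-pixel tiles.
print(tiles_for_viewport(0, 0, 1024, 1024))  # [(0, 0), (1, 0), (0, 1), (1, 1)]
```

    Panning the viewport simply changes which tile indices are returned, so a client only ever downloads a handful of tiles regardless of the full image size.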

  17. A sequential coalescent algorithm for chromosomal inversions

    PubMed Central

    Peischl, S; Koch, E; Guerrero, R F; Kirkpatrick, M

    2013-01-01

    Chromosomal inversions are common in natural populations and are believed to be involved in many important evolutionary phenomena, including speciation, the evolution of sex chromosomes and local adaptation. While recent advances in sequencing and genotyping methods are leading to rapidly increasing amounts of genome-wide sequence data that reveal interesting patterns of genetic variation within inverted regions, efficient simulation methods to study these patterns are largely missing. In this work, we extend the sequential Markovian coalescent, an approximation to the coalescent with recombination, to include the effects of polymorphic inversions on patterns of recombination. Results show that our algorithm is fast, memory-efficient and accurate, making it feasible to simulate large inversions in large populations for the first time. The SMC algorithm enables studies of patterns of genetic variation (for example, linkage disequilibria) and tests of hypotheses (using simulation-based approaches) that were previously intractable. PMID:23632894

  18. Large-scale deposition of weathered oil in the Gulf of Mexico following a deep-water oil spill.

    PubMed

    Romero, Isabel C; Toro-Farmer, Gerardo; Diercks, Arne-R; Schwing, Patrick; Muller-Karger, Frank; Murawski, Steven; Hollander, David J

    2017-09-01

    The blowout of the Deepwater Horizon (DWH) drilling rig in 2010 released an unprecedented amount of oil at depth (1,500 m) into the Gulf of Mexico (GoM). Sedimentary geochemical data from an extensive area (∼194,000 km²) were used to characterize the amount, chemical signature, distribution, and extent of the DWH oil deposited on the seafloor in 2010-2011 from coastal to deep-sea areas in the GoM. The analysis of numerous hydrocarbon compounds (N = 158) and sediment cores (N = 2,613) suggests that 1.9 ± 0.9 × 10⁴ metric tons of hydrocarbons (>C9 saturated and aromatic fractions) were deposited in 56% of the studied area, containing 21 ± 10% (up to 47%) of the total amount of oil discharged and not recovered from the DWH spill. Examination of the spatial trends and chemical diagnostic ratios indicates large deposition of weathered DWH oil in coastal and deep-sea areas and negligible deposition on the continental shelf (behaving as a transition zone in the northern GoM). The large-scale analysis of deposited hydrocarbons following the DWH spill helps in understanding the possible long-term fate of the oil released in 2010, including sedimentary transformation processes, redistribution of deposited hydrocarbons, and persistence in the environment as recycled petrocarbon. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. System simulation application for determining the size of daily raw material purchases at PT XY

    NASA Astrophysics Data System (ADS)

    Napitupulu, H. L.

    2018-02-01

    Every manufacturing company needs to implement green production, including PT XY, a marine-catch processing company in Sumatera Utara Province engaged in the processing of squid for export. The company’s problem relates to the absence of a decision rule for the daily purchase amount of squid. Purchasing daily raw materials in varying quantities has caused the company to face either an excess or a shortage of raw materials. Low purchases of raw materials reduce productivity, while large purchases lead to increased cooling costs for storing the excess, as well as possible losses from damaged raw material. It is therefore necessary to determine the optimal amount of raw material to purchase each day. This can be determined by applying simulation: a system simulation can provide the expected optimal amount of daily raw material purchases.

  20. Determination of Acreage Thermal Protection Foam Loss From Ice and Foam Impacts

    NASA Technical Reports Server (NTRS)

    Carney, Kelly S.; Lawrence, Charles

    2015-01-01

    A parametric study was conducted to establish Thermal Protection System (TPS) loss from foam and ice impact conditions similar to what might occur on the Space Launch System. This study was based upon the large amount of testing and analysis that was conducted with both ice and foam debris impacts on TPS acreage foam for the Space Shuttle Project External Tank. Test verified material models and modeling techniques that resulted from Space Shuttle related testing were utilized for this parametric study. Parameters varied include projectile mass, impact velocity and impact angle (5 degree and 10 degree impacts). The amount of TPS acreage foam loss as a result of the various impact conditions is presented.

  1. Transfer of interferon alfa into human breast milk.

    PubMed

    Kumar, A R; Hale, T W; Mock, R E

    2000-08-01

    Originally assumed to be purely antiviral substances, interferons are increasingly recognized as efficacious in a number of pathologies, including malignancies, multiple sclerosis, and other immune syndromes. This study provides data on the transfer of interferon alfa (2B) into the human milk of a patient receiving massive intravenous doses for the treatment of malignant melanoma. Following an intravenous dose of 30 million IU, the amount of interferon transferred into human milk was only slightly elevated (1551 IU/mL) when compared to control milk (1249 IU/mL). These data suggest that even following enormous doses, interferon is probably too large in molecular weight to transfer into human milk in clinically relevant amounts.

  2. Immunoinformatics: an integrated scenario

    PubMed Central

    Tomar, Namrata; De, Rajat K

    2010-01-01

    Genome sequencing of humans and other organisms has led to the accumulation of huge amounts of data, which include immunologically relevant data. A large volume of clinical data has been deposited in several immunological databases and as a result immunoinformatics has emerged as an important field which acts as an intersection between experimental immunology and computational approaches. It not only helps in dealing with the huge amount of data but also plays a role in defining new hypotheses related to immune responses. This article reviews classical immunology, different databases and prediction tools. It also describes applications of immunoinformatics in designing in silico vaccination and immune system modelling. All these efforts save time and reduce cost. PMID:20722763

  3. A hypermedia reference system to the Forest Ecosystem Management Assessment team report and some related publications.

    Treesearch

    K.M. Reynolds; H.M. Rauscher; C.V. Worth

    1995-01-01

    The hypermedia system, ForestEM, was developed in HyperWriter for use in Microsoft Windows. ForestEM version 1.0 includes text and figures from the FEMAT report and the Record of Decision and Standards and Guidelines. Hypermedia introduces two fundamental changes to knowledge management. The first is the capability to interactively store and retrieve large amounts of...

  4. Yield comparisons from floating blade and fixed arbor gang ripsaws when processing boards before and after crook removal

    Treesearch

    Charles J. Gatchell; Charles J. Gatchell

    1991-01-01

    Gang-ripping technology that uses a movable (floating) outer blade to eliminate unusable edgings is described, including new terminology for identifying preferred and minimally acceptable strip widths. Because of the large amount of salvage required to achieve total yields, floating blade gang ripping is not recommended for boards with crook. With crook removed by...

  5. Frequency-Modulated Microwave Photonic Links with Direct Detection: Review and Theory

    DTIC Science & Technology

    2010-12-15

    create large amounts of signal distortion. Alternatives to MZIs have been proposed, including Fabry-Perot interferometers and fiber Bragg gratings (FBGs)… multiplexed, analog signals for applications in cable television distribution. Experimental results for a Fabry-Perot discriminated, FM subcarrier-multiplexed system were presented by [17]. An array of optical frequency-modulated DFB lasers and a Fabry-Perot discriminator were used to transmit and…

  6. Generation M[superscript 2]: Media in the Lives of 8- to 18-Year-Olds

    ERIC Educational Resources Information Center

    Rideout, Victoria J.; Foehr, Ulla G.; Roberts, Donald F.

    2010-01-01

    This study is one of the largest and most comprehensive publicly available sources of information on the amount and nature of media use among American youth: (1) It includes a large national sample of more than 2,000 young people from across the country; (2) It covers children from ages 8 to 18, to track changes from childhood through the…

  7. Successful integration of ergonomics into continuous improvement initiatives.

    PubMed

    Monroe, Kimberly; Fick, Faye; Joshi, Madina

    2012-01-01

    Process improvement initiatives are receiving renewed attention by large corporations as they attempt to reduce manufacturing costs and stay competitive in the global marketplace. These initiatives include 5S, Six Sigma, and Lean. These programs often take up a large amount of available time and budget resources. More often than not, existing ergonomics processes are considered separate initiatives by upper management and struggle to gain a seat at the table. To effectively maintain their programs, ergonomics program managers need to overcome those obstacles and demonstrate how ergonomics initiatives are a natural fit with continuous improvement philosophies.

  8. Parallelization and visual analysis of multidimensional fields: Application to ozone production, destruction, and transport in three dimensions

    NASA Technical Reports Server (NTRS)

    Schwan, Karsten

    1994-01-01

    Atmospheric modeling is a grand challenge problem for several reasons, including its inordinate computational requirements and its generation of large amounts of data concurrent with its use of very large data sets derived from measurement instruments like satellites. In addition, atmospheric models are typically run several times, on new data sets or to reprocess existing data sets, to investigate or reinvestigate specific chemical or physical processes occurring in the earth's atmosphere, to understand model fidelity with respect to observational data, or simply to experiment with specific model parameters or components.

  9. A note on the microeconomics of migration.

    PubMed

    Stahl, K

    1983-11-01

    "The purpose of this note is to demonstrate in a simple model that an individual's migration from a small town to a large city may be rationalized purely by a consumption motive, rather than the motive of obtaining a higher income. More specifically, it is shown that in a large city an individual may derive a higher utility from spending a given amount of income than in a small town." A formal model is first developed that includes the principal forces at work and is then illustrated using a graphic example. The theoretical and empirical issues raised are considered in the concluding section. excerpt

  10. Lenticular card: a new method for denture identification.

    PubMed

    Colvenkar, Shreya S

    2010-01-01

    The need for denture marking is important for forensic and social reasons, in case patients need to be identified individually. The majority of surface marking and inclusion techniques are expensive, time consuming, and do not permit the incorporation of large amounts of information. The method described in this article, which includes a lenticular identification card, stands out from currently available denture marking methods in various ways. The lenticular card stores the patient's information and has two or more images that can be viewed by changing the angle of view. The maxillary denture was processed according to the manufacturer's instructions. The lenticular identification card was incorporated in the external posterior buccal surface of the maxillary denture using the salt and pepper technique. To test durability, the denture with the identifier was placed in water for up to 4 months. The proposed method is simple, cheap, and can store a large amount of information, thus allowing quick identification of the denture wearer. The labels showed no sign of fading or deterioration.

  11. A preliminary study of factors affecting the calibration stability of the iridium versus iridium-40 percent rhodium thermocouple

    NASA Technical Reports Server (NTRS)

    Ahmed, Shaffiq; Germain, Edward F.; Daryabeigi, Kamran; Alderfer, David W.; Wright, Robert E.

    1987-01-01

    An iridium versus iridium-40% rhodium thermocouple was studied. Problems associated with the use of this thermocouple for high temperature applications (up to 2000 C) were investigated. The metallurgical studies included X-ray, macroscopic, resistance, and metallographic studies. The thermocouples in the as-received condition from the manufacturer revealed large amounts of internal stress caused by cold working during manufacturing. The thermocouples also contained a large amount of inhomogeneities and segregations. No phase transformations were observed in the alloy up to 1100 C. It was found that annealing the thermocouple at 1800 C for two hours, and then at 1400 C for 2 to 3 hours yielded a fine grain structure, relieving some of the strains, and making the wire more ductile. It was also found that the above annealing procedure stabilized the thermal emf behavior of the thermocouple for application below 1800 C (an improvement from ±1% to ±0.02% within the range of the test parameters used).

  12. National Offshore Wind Energy Grid Interconnection Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daniel, John P.; Liu, Shu; Ibanez, Eduardo

    2014-07-30

    The National Offshore Wind Energy Grid Interconnection Study (NOWEGIS) considers the availability and potential impacts of interconnecting large amounts of offshore wind energy into the transmission system of the lower 48 contiguous United States. A total of 54 GW of offshore wind was assumed to be the target for the analyses conducted. A variety of issues are considered, including: the anticipated staging of offshore wind; the offshore wind resource availability; offshore wind energy power production profiles; offshore wind variability; present and potential technologies for collection and delivery of offshore wind energy to the onshore grid; potential impacts to existing utility systems most likely to receive large amounts of offshore wind; and regulatory influences on offshore wind development. The technology assessment considered the reliability of various high-voltage ac (HVAC) and high-voltage dc (HVDC) technology options and configurations. The utility system impacts of GW-scale integration of offshore wind are considered from an operational steady-state perspective and from a regional and national production cost perspective.

  13. Biogeographic patterns in below-ground diversity in New York City's Central Park are similar to those observed globally

    PubMed Central

    Ramirez, Kelly S.; Leff, Jonathan W.; Barberán, Albert; Bates, Scott Thomas; Betley, Jason; Crowther, Thomas W.; Kelly, Eugene F.; Oldfield, Emily E.; Shaw, E. Ashley; Steenbock, Christopher; Bradford, Mark A.; Wall, Diana H.; Fierer, Noah

    2014-01-01

    Soil biota play key roles in the functioning of terrestrial ecosystems; however, compared to our knowledge of above-ground plant and animal diversity, the biodiversity found in soils remains largely uncharacterized. Here, we present an assessment of soil biodiversity and biogeographic patterns across Central Park in New York City that spanned all three domains of life, demonstrating that even an urban, managed system harbours large amounts of undescribed soil biodiversity. Despite high variability across the Park, below-ground diversity patterns were predictable based on soil characteristics, with prokaryotic and eukaryotic communities exhibiting overlapping biogeographic patterns. Further, Central Park soils harboured nearly as many distinct soil microbial phylotypes and types of soil communities as we found in biomes across the globe (including arctic, tropical and desert soils). This integrated cross-domain investigation highlights that the amount and patterning of novel and uncharacterized diversity at a single urban location matches that observed across natural ecosystems spanning multiple biomes and continents. PMID:25274366

  14. Environmental impact evaluation of feeds prepared from food residues using life cycle assessment.

    PubMed

    Ogino, Akifumi; Hirooka, Hiroyuki; Ikeguchi, Atsuo; Tanaka, Yasuo; Waki, Miyoko; Yokoyama, Hiroshi; Kawashima, Tomoyuki

    2007-01-01

    There is increasing concern about feeds prepared from food residues (FFR) from an environmental viewpoint; however, various forms of energy are consumed in the production of FFR. Environmental impacts of three scenarios were therefore investigated and compared using life cycle assessment (LCA): production of liquid FFR by sterilization with heat (LQ), production of dehydrated FFR by dehydration (DH), and disposal of food residues by incineration (IC). The functional unit was defined as 1 kg dry matter of produced feed standardized to a fixed energy content. The system boundaries included collection of food residues and production of feed from food residues. In IC, food residues are incinerated as waste, and thus the impacts of production and transportation of commercial concentrate feeds equivalent to the FFR in the other scenarios are included in the analysis. Our results suggested that the average amounts of greenhouse gas (GHG) emissions from LQ, DH, and IC were 268, 1073, and 1066 g of CO₂ equivalent, respectively. The amount of GHG emissions from LQ was remarkably small, indicating that LQ was effective for reducing the environmental impact of animal production. Although the average amount of GHG emissions from DH was nearly equal to that from IC, a large variation of GHG emissions was observed among the DH units. The energy consumption of the three scenarios followed a pattern similar to that of GHG emissions. The water consumption of the FFR-producing units was remarkably smaller than that of IC due to the large volumes of water consumed in forage crop production.

  15. Impact of water quality on chlorine demand of corroding copper.

    PubMed

    Lytle, Darren A; Liggett, Jennifer

    2016-04-01

    Copper is widely used in drinking water premise plumbing system materials. In buildings such as hospitals, large and complicated plumbing networks make it difficult to maintain good water quality. Sustaining safe disinfectant residuals throughout a building to protect against waterborne pathogens such as Legionella is particularly challenging since copper and other reactive distribution system materials can exert considerable demands. The objective of this work was to evaluate the impact of pH and orthophosphate on the consumption of free chlorine associated with corroding copper pipes over time. A copper test-loop pilot system was used to control test conditions and systematically meet the study objectives. Chlorine consumption trends attributed to abiotic reactions with copper over time were different for each pH condition tested, and the total amount of chlorine consumed over the test runs increased with increasing pH. Orthophosphate eliminated chlorine consumption trends with elapsed time (i.e., chlorine demand was consistent across entire test runs). Orthophosphate also greatly reduced the total amount of chlorine consumed over the test runs. Interestingly, the total amount of chlorine consumed and the consumption rate were not pH dependent when orthophosphate was present. The findings reflect the complex and competing reactions at the copper pipe wall including corrosion, oxidation of Cu(I) minerals and ions, and possible oxidation of Cu(II) minerals, and the change in chlorine species all as a function of pH. The work has practical applications for maintaining chlorine residuals in premise plumbing drinking water systems including large buildings such as hospitals. Published by Elsevier Ltd.

  16. Use of tropical maize for bioethanol production

    USDA-ARS?s Scientific Manuscript database

    Tropical maize is an alternative energy crop being considered as a feedstock for bioethanol production in the North Central and Midwest United States. Tropical maize is advantageous because it produces large amounts of soluble sugars in its stalks, creates a large amount of biomass, and requires lo...

  17. Were Ocean Impacts an Important Mechanism to Deliver Meteoritic Organic Matter to the Early Earth? Some Inferences from Eltanin

    NASA Technical Reports Server (NTRS)

    Kyte, Frank T.; Gersonde, Rainer; Kuhn, Gerhard

    2002-01-01

    Several workers have addressed the potential for extraterrestrial delivery of volatiles, including water and complex organic compounds, to the early Earth. For example, Chyba and Sagan (1992) argued that since impacts would destroy organic matter, most extraterrestrial organics must be delivered in the fine fractions of interplanetary dust. More recent computer simulations (Pierazzo and Chyba, 1999), however, have shown that substantial amounts of amino acids may survive the impacts of large (km-sized) comets and that this may exceed the amounts derived from IDPs or Miller-Urey synthesis in the atmosphere. Once an ocean developed on the early Earth, impacts of small asteroids and comets into deep-ocean basins were potentially common and may have been the most likely events to deliver large amounts of organics. The deposits of the late Pliocene impact of the Eltanin asteroid into the Bellingshausen Sea provide the only record of a deep-ocean (approx. 5 km) impact that can be used to constrain models of these events. This impact was first discovered in 1981 as an Ir anomaly in sediment cores collected by the USNS Eltanin in 1965 (Kyte et al., 1981). In 1995, Polarstern expedition ANT XII/4 made the first geological survey of the suspected impact region. Three sediment cores sampled around the San Martin seamounts (approx. 57.5°S, 91°W) contained well-preserved impact deposits that include disturbed ocean sediments and meteoritic impact ejecta (Gersonde et al., 1997). The latter is composed of shock-melted asteroidal materials and unmelted meteorites. In 2001, the FS Polarstern returned to the impact area during expedition ANT XVIII/5a. At least 16 cores were recovered that contain ejecta deposits. These cores and geophysical data from the expedition can be used to map the effects of the impact over a large region of the ocean floor.

  18. THE SEGUE K GIANT SURVEY. III. QUANTIFYING GALACTIC HALO SUBSTRUCTURE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janesh, William; Morrison, Heather L.; Ma, Zhibo

    2016-01-10

    We statistically quantify the amount of substructure in the Milky Way stellar halo using a sample of 4568 halo K giant stars at Galactocentric distances ranging over 5–125 kpc. These stars have been selected photometrically and confirmed spectroscopically as K giants from the Sloan Digital Sky Survey’s Sloan Extension for Galactic Understanding and Exploration project. Using a position–velocity clustering estimator (the 4distance) and a model of a smooth stellar halo, we quantify the amount of substructure in the halo, divided by distance and metallicity. Overall, we find that the halo as a whole is highly structured. We also confirm earlier work using blue horizontal branch (BHB) stars which showed that there is an increasing amount of substructure with increasing Galactocentric radius, and additionally find that the amount of substructure in the halo increases with increasing metallicity. Comparing to resampled BHB stars, we find that K giants and BHBs have similar amounts of substructure over equivalent ranges of Galactocentric radius. Using a friends-of-friends algorithm to identify members of individual groups, we find that a large fraction (∼33%) of grouped stars are associated with Sgr, and identify stars belonging to other halo star streams: the Orphan Stream, the Cetus Polar Stream, and others, including previously unknown substructures. A large fraction of sample K giants (more than 50%) are not grouped into any substructure. We find also that the Sgr stream strongly dominates groups in the outer halo for all except the most metal-poor stars, and suggest that this is the source of the increase of substructure with Galactocentric radius and metallicity.
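    The friends-of-friends grouping used above can be illustrated generically: two stars join the same group whenever they lie closer than a linking length, either directly or through a chain of neighbours. The sketch below is a minimal union-find implementation over plain Euclidean distance, not the survey's actual position-velocity 4distance estimator:

```python
import numpy as np

def friends_of_friends(points, linking_length):
    """Group points so that any two points closer than `linking_length`
    (directly or through a chain of neighbours) share a group label."""
    n = len(points)
    labels = list(range(n))          # start with every point in its own group

    def find(i):                     # union-find root lookup with path compression
        while labels[i] != i:
            labels[i] = labels[labels[i]]
            i = labels[i]
        return i

    for i in range(n):               # O(n^2) pair scan; fine for a sketch
        for j in range(i + 1, n):
            if np.linalg.norm(points[i] - points[j]) < linking_length:
                labels[find(i)] = find(j)
    return [find(i) for i in range(n)]

pts = np.array([[0.0, 0.0], [0.5, 0.0], [10.0, 10.0]])
print(friends_of_friends(pts, 1.0))  # first two points grouped, third alone
```

    With a position-velocity metric substituted for the Euclidean distance, the same chaining logic links stream members, which is how grouped stars can be assigned to structures such as the Sgr stream.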

  19. VizieR Online Data Catalog: The SEGUE K giant survey. III. Galactic halo (Janesh+, 2016)

    NASA Astrophysics Data System (ADS)

    Janesh, W.; Morrison, H. L.; Ma, Z.; Rockosi, C.; Starkenburg, E.; Xue, X. X.; Rix, H.-W.; Harding, P.; Beers, T. C.; Johnson, J.; Lee, Y. S.; Schneider, D. P.

    2016-03-01

    We statistically quantify the amount of substructure in the Milky Way stellar halo using a sample of 4568 halo K giant stars at Galactocentric distances ranging over 5-125kpc. These stars have been selected photometrically and confirmed spectroscopically as K giants from the Sloan Digital Sky Survey's Sloan Extension for Galactic Understanding and Exploration (SEGUE) project. Using a position-velocity clustering estimator (the 4distance) and a model of a smooth stellar halo, we quantify the amount of substructure in the halo, divided by distance and metallicity. Overall, we find that the halo as a whole is highly structured. We also confirm earlier work using blue horizontal branch (BHB) stars which showed that there is an increasing amount of substructure with increasing Galactocentric radius, and additionally find that the amount of substructure in the halo increases with increasing metallicity. Comparing to resampled BHB stars, we find that K giants and BHBs have similar amounts of substructure over equivalent ranges of Galactocentric radius. Using a friends-of-friends algorithm to identify members of individual groups, we find that a large fraction (~33%) of grouped stars are associated with Sgr, and identify stars belonging to other halo star streams: the Orphan Stream, the Cetus Polar Stream, and others, including previously unknown substructures. A large fraction of sample K giants (more than 50%) are not grouped into any substructure. We find also that the Sgr stream strongly dominates groups in the outer halo for all except the most metal-poor stars, and suggest that this is the source of the increase of substructure with Galactocentric radius and metallicity. (2 data files).

  20. A Parallel Nonrigid Registration Algorithm Based on B-Spline for Medical Images.

    PubMed

    Du, Xiaogang; Dang, Jianwu; Wang, Yangping; Wang, Song; Lei, Tao

    2016-01-01

    The nonrigid registration algorithm based on B-spline Free-Form Deformation (FFD) is widely applied in medical image processing because of its flexibility and robustness. However, it requires a tremendous amount of computing time to obtain accurate registration results, especially for large amounts of medical image data. To address this issue, a parallel nonrigid registration algorithm based on B-spline is proposed in this paper. First, the Logarithm Squared Difference (LSD) is adopted as the similarity metric in the B-spline registration algorithm to improve registration precision. After that, we create a parallel computing strategy and lookup tables (LUTs) to reduce the complexity of the B-spline registration algorithm. As a result, the computing time of three time-consuming steps, B-spline interpolation, LSD computation, and the analytic gradient computation of LSD, is efficiently reduced; the registration algorithm employs the Nonlinear Conjugate Gradient (NCG) optimization method. Experimental results on registration quality and execution efficiency for large sets of medical images show that our algorithm achieves better registration accuracy, in terms of the differences between the computed deformation fields and ground truth, and a speedup of 17 times over the single-threaded CPU implementation, owing to the powerful parallel computing ability of the Graphics Processing Unit (GPU).
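
    Two of the ingredients named in this abstract, the LSD similarity metric and a lookup table of B-spline interpolation weights, can be sketched as follows. This is an illustrative reconstruction, not the authors' code; in particular, the exact normalization of the LSD metric in the paper may differ from the form assumed here.

```python
import math

def lsd(fixed, moving):
    """One plausible form of the Logarithm Squared Difference metric:
    the sum over voxels of log(1 + (f - m)^2). The +1 keeps the
    logarithm finite (and zero) where the two images agree."""
    return sum(math.log(1.0 + (f - m) ** 2) for f, m in zip(fixed, moving))

def bspline_weights(t):
    """Cubic B-spline basis functions B0..B3 at fractional offset t in [0, 1)."""
    return (
        (1 - t) ** 3 / 6.0,
        (3 * t ** 3 - 6 * t ** 2 + 4) / 6.0,
        (-3 * t ** 3 + 3 * t ** 2 + 3 * t + 1) / 6.0,
        t ** 3 / 6.0,
    )

# Lookup table: the weights are precomputed on a fixed grid of offsets so
# the inner deformation loop avoids re-evaluating the cubic polynomials,
# which is the kind of complexity reduction a LUT provides here.
LUT_SIZE = 256
LUT = [bspline_weights(i / LUT_SIZE) for i in range(LUT_SIZE)]

def weights_from_lut(t):
    return LUT[int(t * LUT_SIZE) % LUT_SIZE]
```

    The four basis weights always sum to 1 (the partition-of-unity property of cubic B-splines), which is a convenient sanity check on both the direct evaluation and the table.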

  1. Observational evidence for enhanced magnetic activity of superflare stars.

    PubMed

    Karoff, Christoffer; Knudsen, Mads Faurschou; De Cat, Peter; Bonanno, Alfio; Fogtmann-Schulz, Alexandra; Fu, Jianning; Frasca, Antonio; Inceoglu, Fadil; Olsen, Jesper; Zhang, Yong; Hou, Yonghui; Wang, Yuefei; Shi, Jianrong; Zhang, Wei

    2016-03-24

    Superflares are large explosive events on stellar surfaces one to six orders of magnitude larger than the largest flares observed on the Sun throughout the space age. Due to the huge amount of energy released in these superflares, it has been speculated whether the underlying mechanism is the same as for solar flares, which are caused by magnetic reconnection in the solar corona. Here, we analyse observations made with the LAMOST telescope of 5,648 solar-like stars, including 48 superflare stars. These observations show that superflare stars are generally characterized by larger chromospheric emissions than other stars, including the Sun. However, superflare stars with activity levels lower than, or comparable to, the Sun do exist, suggesting that solar flares and superflares most likely share the same origin. The very large ensemble of solar-like stars included in this study enables detailed and robust estimates of the relation between chromospheric activity and the occurrence of superflares.

  2. Observational evidence for enhanced magnetic activity of superflare stars

    PubMed Central

    Karoff, Christoffer; Knudsen, Mads Faurschou; De Cat, Peter; Bonanno, Alfio; Fogtmann-Schulz, Alexandra; Fu, Jianning; Frasca, Antonio; Inceoglu, Fadil; Olsen, Jesper; Zhang, Yong; Hou, Yonghui; Wang, Yuefei; Shi, Jianrong; Zhang, Wei

    2016-01-01

    Superflares are large explosive events on stellar surfaces one to six orders of magnitude larger than the largest flares observed on the Sun throughout the space age. Due to the huge amount of energy released in these superflares, it has been speculated whether the underlying mechanism is the same as for solar flares, which are caused by magnetic reconnection in the solar corona. Here, we analyse observations made with the LAMOST telescope of 5,648 solar-like stars, including 48 superflare stars. These observations show that superflare stars are generally characterized by larger chromospheric emissions than other stars, including the Sun. However, superflare stars with activity levels lower than, or comparable to, the Sun do exist, suggesting that solar flares and superflares most likely share the same origin. The very large ensemble of solar-like stars included in this study enables detailed and robust estimates of the relation between chromospheric activity and the occurrence of superflares. PMID:27009381

  3. Gene Expression Browser: Large-Scale and Cross-Experiment Microarray Data Management, Search & Visualization

    USDA-ARS?s Scientific Manuscript database

    The amount of microarray gene expression data in public repositories has been increasing exponentially for the last couple of decades. High-throughput microarray data integration and analysis has become a critical step in exploring the large amount of expression data for biological discovery. Howeve...

  4. Stratospheric Aerosols for Solar Radiation Management

    NASA Astrophysics Data System (ADS)

    Kravitz, Ben

    SRM in the context of this entry involves placing a large amount of aerosols in the stratosphere to reduce the amount of solar radiation reaching the surface, thereby cooling the surface and counteracting some of the warming from anthropogenic greenhouse gases. The way this is accomplished depends on the specific aerosol used, but the basic mechanism involves backscattering and absorbing certain amounts of solar radiation aloft. Since warming from greenhouse gases is due to longwave (thermal) emission, compensating for this warming by reduction of shortwave (solar) energy is inherently imperfect, meaning SRM will have climate effects that are different from the effects of climate change. This will likely manifest in the form of regional inequalities, in that, similarly to climate change, some regions will benefit from SRM, while some will be adversely affected, viewed both in the context of present climate and a climate with high CO2 concentrations. These effects are highly dependent upon the means of SRM, including the type of aerosol to be used, the particle size and other microphysical concerns, and the methods by which the aerosol is placed in the stratosphere. SRM has never been performed, nor has deployment been tested, so the research up to this point has serious gaps. The amount of aerosols required is large enough that SRM would require a major engineering endeavor, although SRM is potentially cheap enough that it could be conducted unilaterally. Methods of governance must be in place before deployment is attempted, should deployment even be desired. Research in public policy, ethics, and economics, as well as many other disciplines, will be essential to the decision-making process. SRM is only a palliative treatment for climate change, and it is best viewed as part of a portfolio of responses, including mitigation, adaptation, and possibly CDR. At most, SRM is insurance against dangerous consequences that are directly due to increased surface air temperatures.

  5. The impact of a windshield in a tipping bucket rain gauge on the reduction of losses in precipitation measurements during snowfall events

    NASA Astrophysics Data System (ADS)

    Buisan, Samuel T.; Collado, Jose Luis; Alastrue, Javier

    2016-04-01

    The amount of snow available controls the ecology and hydrological response of mountainous areas and cold regions and affects economic activities including winter tourism, hydropower generation, floods and water supply. Accurate measurement of accumulated snowfall is critical, and a significant source of error, in the evaluation and verification of numerical weather forecast, hydrological and climate models. It is well known that the undercatch of solid precipitation resulting from wind-induced updrafts at the gauge orifice is the main factor affecting the quality and accuracy of snowfall precipitation measurements. This effect can be reduced by the use of different windshields. Overall, Tipping Bucket Rain Gauges (TPBRGs) provide a large percentage of precipitation amount measurements in all climate regimes, estimated at about 80% of the total of observations by automatic instruments. In the frame of the WMO-SPICE project, we compared at the Formigal-Sarrios station (Spanish Pyrenees, 1800 m a.s.l.) the measured precipitation in two heated TPBRGs, one of them protected with a single Alter windshield in order to reduce the wind bias. Results were contrasted with precipitation measured using the SPICE reference gauge (OTT Pluvio2) in a Double Fence Intercomparison Reference (DFIR). Results showed that the shield reduces undercatch by up to 40% when wind speed exceeds 6 m/s. The differences when compared with the reference gauge reached values higher than 70%. The inaccuracy of these measurements has a significant impact on nowcasting operations and climatology in Spain, especially during some heavy snowfall episodes. Also, hydrological models showed better agreement with observed river flows when including the precipitation not accounted for during these snowfall events. The conclusions of this experiment will be used to make decisions on the suitability of installing windshields at stations characterized by large quantities of snowfall during the winter season, which are mainly located in northern Spain.

  6. Precision medicine for psychopharmacology: a general introduction.

    PubMed

    Shin, Cheolmin; Han, Changsu; Pae, Chi-Un; Patkar, Ashwin A

    2016-07-01

    Precision medicine is an emerging medical model that can provide accurate diagnoses and tailored therapeutic strategies for patients based on data pertaining to genes, microbiomes, environment, family history and lifestyle. Here, we provide basic information about precision medicine and newly introduced concepts, such as the precision medicine ecosystem and big data processing, and omics technologies including pharmacogenomics, pharmacometabolomics, pharmacoproteomics, pharmacoepigenomics, connectomics and exposomics. The authors review the current state of omics in psychiatry and the future direction of psychopharmacology as it moves towards precision medicine. Expert commentary: Advances in precision medicine have been facilitated by achievements in multiple fields, including large-scale biological databases, powerful methods for characterizing patients (such as genomics, proteomics, metabolomics, diverse cellular assays, and even social networks and mobile health technologies), and computer-based tools for analyzing large amounts of data.

  7. Consumption with Large Sip Sizes Increases Food Intake and Leads to Underestimation of the Amount Consumed

    PubMed Central

    Bolhuis, Dieuwerke P.; Lakemond, Catriona M. M.; de Wijk, Rene A.; Luning, Pieternel A.; de Graaf, Cees

    2013-01-01

    Background A number of studies have shown that bite and sip sizes influence the amount of food intake. Consuming with small sips instead of large sips means relatively more sips for the same amount of food to be consumed; people may believe that intake is higher, which leads to faster satiation. This effect may be disturbed when people are distracted. Objective The objective of the study is to assess the effects of sip size in a focused state and a distracted state on ad libitum intake and on the estimated amount consumed. Design In this 3×2 cross-over design, 53 healthy subjects consumed ad libitum soup with small sips (5 g, 60 g/min), large sips (15 g, 60 g/min), and free sips (where sip size was determined by subjects themselves), in both a distracted and focused state. Sips were administered via a pump. There were no visual cues toward consumption. Subjects then estimated how much they had consumed by filling soup in soup bowls. Results Intake in the small-sip condition was ∼30% lower than in both the large-sip and free-sip conditions (P<0.001). In addition, subjects underestimated how much they had consumed in the large-sip and free-sip conditions (P<0.03). Distraction led to a general increase in food intake (P = 0.003), independent of sip size. Distraction did not influence sip size or estimations. Conclusions Consumption with large sips led to higher food intake, as expected. Large sips, whether fixed or chosen by subjects themselves, led to underestimation of the amount consumed. This may be a risk factor for over-consumption. Reducing sip or bite sizes may successfully lower food intake, even in a distracted state. PMID:23372657

  8. Planform changes and large wood dynamics in two torrents during a severe flash flood in Braunsbach, Germany 2016.

    PubMed

    Lucía, Ana; Schwientek, Marc; Eberle, Joachim; Zarfl, Christiane

    2018-05-30

    This work presents a post-event survey study, addressing the geomorphic response and large wood budget of two torrents, Grimmbach and Orlacher Bach, in southwestern Germany that were affected by a flash flood on May 29, 2016. During the event, large amounts of wood clogged and damaged a bridge of a cycling path at the outlet of the Grimmbach, while the town of Braunsbach was devastated by discharge and material transported along the Orlacher Bach. The severity of the event in these two small catchments (30.0 km² and 5.95 km², respectively) is remarkable in basins with a relatively low average slope (10.7 and 12.0%, respectively). In order to gain a better understanding of the driving forces during this flood event an integrated approach was applied including (i) an estimate of peak discharges, (ii) an analysis of changes in channel width by comparing available aerial photographs before the flood with post-flood aerial surveys from an Unmanned Aerial Vehicle and validation with field observations, (iii) a detailed mapping of landslides and analysis of their connectivity with the channel network and finally (iv) an analysis of the amounts of large wood (LW) recruited and deposited in the channel. The morphological changes in the channels can be explained by hydraulic parameters, such as stream power and unit stream power, and by morphological parameters such as the valley confinement. The same holds for LW recruitment amounts and the volume of exported LW, since most of it comes from the erosion of the valley floor. The morphological changes and large wood recruitment and deposition are within the range reported for studied mountain rivers. Both factors thus need to be considered for mapping and mitigating flash flood hazards also in low mountain ranges such as these. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  9. Relationship Between Nutritional Knowledge and the Amount of Sugar-Sweetened Beverages Consumed in Los Angeles County.

    PubMed

    Gase, Lauren N; Robles, Brenda; Barragan, Noel C; Kuo, Tony

    2014-08-01

    Although consumption of sugar-sweetened beverages (SSBs) is associated with many negative health outcomes, including obesity, diabetes, and cardiovascular disease, the relationship between consumer nutritional knowledge and the amount consumed is poorly understood. The objective of this study was to examine the relationship between knowledge of daily calorie recommendations and the amount of SSBs consumed in a large, economically and racially diverse sample of adults recruited at selected Metro subway and bus shelters in Los Angeles County. In June 2012, the Los Angeles County Department of Public Health conducted street intercept surveys to assess food attitudes and consumption behaviors and public opinions related to a recent 8-week health marketing campaign targeting SSB consumption. Descriptive and comparative analyses were conducted, including a negative binomial regression model, to examine the relationship between knowledge of the daily calorie recommendations and the amount of SSBs consumed. Among survey respondents (n = 1,041), less than one third correctly identified the daily calorie recommendations for a typical adult. After controlling for sociodemographics and weight status, respondents who correctly identified recommended calorie needs reported, on average, drinking nine fewer SSBs per month than respondents who did not. Results suggest that efforts to reduce SSB consumption might benefit from the inclusion of educational interventions that empower consumers to make healthy choices. © 2014 Society for Public Health Education.

  10. Model for fluorescence quenching in light harvesting complex II in different aggregation states.

    PubMed

    Andreeva, Atanaska; Abarova, Silvia; Stoitchkova, Katerina; Busheva, Mira

    2009-02-01

    Low-temperature (77 K) steady-state fluorescence emission spectroscopy and dynamic light scattering were applied to the main chlorophyll a/b protein light harvesting complex of photosystem II (LHC II) in different aggregation states to elucidate the mechanism of fluorescence quenching within LHC II oligomers. Evidences presented that LHC II oligomers are heterogeneous and consist of large and small particles with different fluorescence yield. At intermediate detergent concentrations the mean size of the small particles is similar to that of trimers, while the size of large particles is comparable to that of aggregated trimers without added detergent. It is suggested that in small particles and trimers the emitter is monomeric chlorophyll, whereas in large aggregates there is also another emitter, which is a poorly fluorescing chlorophyll associate. A model, describing populations of antenna chlorophyll molecules in small and large aggregates in their ground and first singlet excited states, is considered. The model enables us to obtain the ratio of the singlet excited-state lifetimes in small and large particles, the relative amount of chlorophyll molecules in large particles, and the amount of quenchers as a function of the degree of aggregation. These dependencies reveal that the quenching of the chl a fluorescence upon aggregation is due to the formation of large aggregates and the increasing of the amount of chlorophyll molecules forming these aggregates. As a consequence, the amount of quenchers, located in large aggregates, is increased, and their singlet excited-state lifetimes steeply decrease.

  11. A case for automated tape in clinical imaging.

    PubMed

    Bookman, G; Baune, D

    1998-08-01

    Electronic archiving of radiology images over many years will require many terabytes of storage with a need for rapid retrieval of these images. As more large PACS installations are implemented, a data crisis emerges. Storing this large amount of data using the traditional method of optical jukeboxes or online disk alone becomes unworkable. The amount of floor space, number of optical jukeboxes, and off-line shelf storage required to store the images becomes unmanageable. With the recent advances in tape and tape drives, the use of tape for long-term storage of PACS data has become the preferred alternative. A PACS system consisting of a centrally managed system of RAID disk, software, and, at the heart of the system, tape presents a solution that for the first time solves the problems of multi-modality high-end PACS, non-DICOM image, electronic medical record, and ADT data storage. This paper will examine the installation of the University of Utah, Department of Radiology PACS system and the integration of an automated tape archive. The tape archive is also capable of storing data other than traditional PACS data. The implementation of an automated data archive to serve the many other needs of a large hospital will also be discussed. This will include the integration of a filmless cardiology department and the backup/archival needs of a traditional MIS department. The need for high bandwidth to tape with a large RAID cache will be examined, along with how, with an interface to a RIS pre-fetch engine, tape can be a superior solution to optical platters or other archival solutions. The data management software will be discussed in detail. The performance and cost of RAID disk cache and automated tape compared to a solution that includes optical will be examined.

  12. Accelerator science and technology in Europe: EuCARD 2012

    NASA Astrophysics Data System (ADS)

    Romaniuk, Ryszard S.

    2012-05-01

    Accelerator science and technology is one of the key enablers of developments in particle physics, photon physics, and applications in medicine and industry. The paper presents a digest of the research results in the domain of accelerator science and technology in Europe, shown during the third annual meeting of EuCARD - European Coordination of Accelerator Research and Development. The conference concerns the building of research infrastructure, including advanced photonic and electronic systems for servicing large high-energy physics experiments. A few basic groups of such systems are discussed: measurement and control networks of large geometrical extent, multichannel systems for acquisition of large amounts of metrological data, and precision photonic networks for distribution of reference time, frequency, and phase.

  13. Large, horizontal-axis wind turbines

    NASA Technical Reports Server (NTRS)

    Linscott, B. S.; Perkins, P.; Dennett, J. T.

    1984-01-01

    Development of the technology for safe, reliable, environmentally acceptable large wind turbines that have the potential to generate a significant amount of electricity at costs competitive with conventional electric generating systems is presented. In addition, these large wind turbines must be fully compatible with electric utility operations and interface requirements. There are several ongoing large wind system development projects and applied research efforts directed toward meeting the technology requirements for utility applications. Detailed information on these projects is provided. The Mod-O research facility and current applied research efforts in aerodynamics, structural dynamics and aeroelasticity, composite and hybrid composite materials, and multiple system interaction are described. A chronology of component research and technology development for large, horizontal-axis wind turbines is presented. Wind characteristics, wind turbine economics, and the impact of wind turbines on the environment are reported. The need for continued wind turbine research and technology development is explored. Over 40 references are cited and a bibliography is included.

  14. Transport Traffic Analysis for Abusive Infrastructure Characterization

    DTIC Science & Technology

    2012-12-14

    Introduction Abusive traffic abounds on the Internet, in the form of email, malware, vulnerability scanners, worms, denial-of-service, drive-by-downloads, scam ...insight is two-fold. First, attackers have a basic requirement to source large amounts of data, be it denial-of-service, scam -hosting, spam, or other...the network core. This paper explores the power of transport-layer traffic analysis to detect and characterize scam hosting infrastructure, including

  15. Adaptation of the Black Yeast Wangiella dermatitidis to Ionizing Radiation: Molecular and Cellular Mechanisms

    DTIC Science & Technology

    2012-11-01

    laboratory and in the damaged Chernobyl nuclear reactor suggest they have adapted the ability to survive or even benefit from exposure to ionizing...damaged nuclear reactor at Chernobyl, which are constantly exposed to ionizing radiation, harbor large amounts of microorganisms, including fungal...species [3,4]. Furthermore, Zhdanova et al. reported that beta and gamma radiation promoted directional growth of fungi isolated from the Chernobyl

  16. Atmospheric density models

    NASA Technical Reports Server (NTRS)

    Mueller, A. C.

    1977-01-01

    An atmospheric model developed by Jacchia, quite accurate but requiring a large amount of computer storage and execution time, was found to be ill-suited for the space shuttle onboard program. The development of a simple atmospheric density model to simulate the Jacchia model was studied. Required characteristics, including variation with solar activity, diurnal variation, variation with geomagnetic activity, semiannual variation, and variation with height, were met by the new atmospheric density model.
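
    The shape of such a simplified model can be sketched as an exponential fall-off with altitude modulated by a diurnal term. This is a toy illustration only: the constants, the mid-afternoon density peak, and the functional form are assumptions for the sketch, and the actual model fitted to Jacchia also carries solar-activity, geomagnetic, and semiannual terms not shown here.

```python
import math

def density(h_km, local_solar_hour, rho0=3.8e-6, scale_height_km=50.0,
            diurnal_amp=0.3):
    """Toy upper-atmosphere density (kg/m^3, illustrative constants only):
    exponential decrease with altitude, modulated by a diurnal cycle that
    peaks around 14:00 local solar time."""
    diurnal = 1.0 + diurnal_amp * math.cos(
        2 * math.pi * (local_solar_hour - 14) / 24)
    return rho0 * math.exp(-h_km / scale_height_km) * diurnal
```

    The qualitative behaviour matches the characteristics listed in the abstract: density falls with height and is higher in the afternoon than in the pre-dawn hours.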

  17. Total Hadron Cross Section, New Particles, and Muon Electron Events in e{sup +}e{sup -} Annihilation at SPEAR

    DOE R&D Accomplishments Database

    Richter, B.

    1976-01-01

    The review of total hadron electroproduction cross sections, the new states, and the muon-electron events includes a large amount of information on hadron structure, nine states with widths ranging from 10's of keV to many MeV, the principal decay modes and quantum numbers of some of the states, and limits on charm particle production. 13 references. (JFP)

  18. Fruit and vegetables and cancer risk.

    PubMed

    Key, T J

    2011-01-04

    The possibility that fruit and vegetables may help to reduce the risk of cancer has been studied for over 30 years, but no protective effects have been firmly established. For cancers of the upper gastrointestinal tract, epidemiological studies have generally observed that people with a relatively high intake of fruit and vegetables have a moderately reduced risk, but these observations must be interpreted cautiously because of potential confounding by smoking and alcohol. For lung cancer, recent large prospective analyses with detailed adjustment for smoking have not shown a convincing association between fruit and vegetable intake and reduced risk. For other common cancers, including colorectal, breast and prostate cancer, epidemiological studies suggest little or no association between total fruit and vegetable consumption and risk. It is still possible that there are benefits to be identified: there could be benefits in populations with low average intakes of fruit and vegetables, such that those eating moderate amounts have a lower cancer risk than those eating very low amounts, and there could also be effects of particular nutrients in certain fruits and vegetables, as fruit and vegetables have very varied composition. Nutritional principles indicate that healthy diets should include at least moderate amounts of fruit and vegetables, but the available data suggest that general increases in fruit and vegetable intake would not have much effect on cancer rates, at least in well-nourished populations. Current advice in relation to diet and cancer should include the recommendation to consume adequate amounts of fruit and vegetables, but should put most emphasis on the well-established adverse effects of obesity and high alcohol intakes.

  19. Fruit and vegetables and cancer risk

    PubMed Central

    Key, T J

    2011-01-01

    The possibility that fruit and vegetables may help to reduce the risk of cancer has been studied for over 30 years, but no protective effects have been firmly established. For cancers of the upper gastrointestinal tract, epidemiological studies have generally observed that people with a relatively high intake of fruit and vegetables have a moderately reduced risk, but these observations must be interpreted cautiously because of potential confounding by smoking and alcohol. For lung cancer, recent large prospective analyses with detailed adjustment for smoking have not shown a convincing association between fruit and vegetable intake and reduced risk. For other common cancers, including colorectal, breast and prostate cancer, epidemiological studies suggest little or no association between total fruit and vegetable consumption and risk. It is still possible that there are benefits to be identified: there could be benefits in populations with low average intakes of fruit and vegetables, such that those eating moderate amounts have a lower cancer risk than those eating very low amounts, and there could also be effects of particular nutrients in certain fruits and vegetables, as fruit and vegetables have very varied composition. Nutritional principles indicate that healthy diets should include at least moderate amounts of fruit and vegetables, but the available data suggest that general increases in fruit and vegetable intake would not have much effect on cancer rates, at least in well-nourished populations. Current advice in relation to diet and cancer should include the recommendation to consume adequate amounts of fruit and vegetables, but should put most emphasis on the well-established adverse effects of obesity and high alcohol intakes. PMID:21119663

  20. Large wood in the Snowy River estuary, Australia

    NASA Astrophysics Data System (ADS)

    Hinwood, Jon B.; McLean, Errol J.

    2017-02-01

    In this paper we report on 8 years of data collection and interpretation of large wood in the Snowy River estuary in southeastern Australia, providing quantitative data on the amount, sources, transport, decay, and geomorphic actions. No prior census data for an estuary is known to the authors despite their environmental and economic importance and the significant differences between a fluvial channel and an estuarine channel. Southeastern Australian estuaries contain a significant quantity of large wood that is derived from many sources, including river flood flows, local bank erosion, and anthropogenic sources. Wind and tide are shown to be as important as river flow in transporting and stranding large wood. Tidal action facilitates trapping of large wood on intertidal bars and shoals; but channels are wider and generally deeper, so log jams are less likely than in rivers. Estuarine large wood contributes to localised scour and accretion and hence to the modification of estuarine habitat, but in the study area it did not have large-scale impacts on the hydraulic gradients nor the geomorphology.

  1. Full-scale flammability test data for validation of aircraft fire mathematical models

    NASA Technical Reports Server (NTRS)

    Kuminecz, J. F.; Bricker, R. W.

    1982-01-01

    Twenty-five large scale aircraft flammability tests were conducted in a Boeing 737 fuselage at the NASA Johnson Space Center (JSC). The objective of this test program was to provide a data base on the propagation of large scale aircraft fires to support the validation of aircraft fire mathematical models. Variables in the test program included cabin volume, amount of fuel, fuel pan area, fire location, airflow rate, and cabin materials. A number of tests were conducted with jet A-1 fuel only, while others were conducted with various Boeing 747 type cabin materials. These included urethane foam seats, passenger service units, stowage bins, and wall and ceiling panels. Two tests were also included using special urethane foam and polyimide foam seats. Tests were conducted with each cabin material individually, with various combinations of these materials, and finally, with all materials in the cabin. The data include information obtained from approximately 160 locations inside the fuselage.

  2. TransAtlasDB: an integrated database connecting expression data, metadata and variants

    PubMed Central

    Adetunji, Modupeore O; Lamont, Susan J; Schmidt, Carl J

    2018-01-01

    Abstract High-throughput transcriptome sequencing (RNAseq) is the universally applied method for target-free transcript identification and gene expression quantification, generating huge amounts of data. The difficulty of accessing such data and interpreting results can be a major impediment in postulating suitable hypotheses; thus an innovative storage solution that addresses these limitations, such as hard disk storage requirements, efficiency and reproducibility, is paramount. By offering a uniform data storage and retrieval mechanism, various data can be compared and easily investigated. We present a sophisticated system, TransAtlasDB, which incorporates a hybrid architecture of both relational and NoSQL databases for fast and efficient data storage, processing and querying of large datasets from transcript expression analysis with corresponding metadata, as well as gene-associated variants (such as SNPs) and their predicted gene effects. TransAtlasDB provides a data model for accurate storage of the large amounts of data derived from RNAseq analysis, together with methods of interacting with the database: either via the command-line data management workflows, written in Perl, with useful functionalities that simplify the storage and manipulation of the massive amounts of data generated from RNAseq analysis, or through the web interface. The database application is currently modeled to handle analysis data from agricultural species, and will be expanded to include more species groups. Overall TransAtlasDB aims to serve as an accessible repository for the large complex results data files derived from RNAseq gene expression profiling and variant analysis. Database URL: https://modupeore.github.io/TransAtlasDB/ PMID:29688361
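
    The hybrid relational/document idea behind such a system can be illustrated with Python's standard library alone: tabular expression values go in a relational table, while free-form sample metadata is stored as a JSON document and joined back at query time. The table and field names below are invented for the sketch and are not TransAtlasDB's actual schema.

```python
import json
import sqlite3

# Relational side: one row per (sample, gene) expression measurement.
# Document side: one JSON blob of arbitrary metadata per sample.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE expression (sample TEXT, gene TEXT, fpkm REAL)")
conn.execute("CREATE TABLE metadata (sample TEXT PRIMARY KEY, doc TEXT)")

conn.execute("INSERT INTO expression VALUES (?, ?, ?)", ("S1", "GAPDH", 152.3))
conn.execute("INSERT INTO metadata VALUES (?, ?)",
             ("S1", json.dumps({"species": "chicken", "tissue": "liver"})))

# A single join recovers a measurement together with its metadata document.
row = conn.execute(
    "SELECT e.fpkm, m.doc FROM expression e "
    "JOIN metadata m ON e.sample = m.sample WHERE e.gene = ?",
    ("GAPDH",)).fetchone()
fpkm, meta = row[0], json.loads(row[1])
```

    The design choice this illustrates is the one the abstract names: rigid, queryable structure for the bulk expression matrix, schema-free documents for heterogeneous metadata, with a uniform retrieval path across both.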

  3. Branching habit and the allocation of reproductive resources in conifers.

    PubMed

    Leslie, Andrew B

    2012-09-01

    Correlated relationships between branch thickness, branch density, and twig and leaf size have been used extensively to study the evolution of plant canopy architecture, but fewer studies have explored the impact of these relationships on the allocation of reproductive resources. This study quantifies pollen cone production in conifers, which have similar basic reproductive biology but vary dramatically in branching habit, in order to test how differences in branch diameter influence pollen cone size and the density with which they are deployed in the canopy. Measurements of canopy branch density, the number of cones per branch and cone size were used to estimate the amount of pollen cone tissues produced by 16 species in three major conifer clades. The number of pollen grains produced was also estimated using direct counts from individual pollen cones. The total amount of pollen cone tissues in the conifer canopy varied little among species and clades, although vegetative traits such as branch thickness, branch density and pollen cone size varied over several orders of magnitude. However, branching habit controls the way these tissues are deployed: taxa with small branches produce small pollen cones at a high density, while taxa with large branches produce large cones relatively sparsely. Conifers appear to invest similar amounts of energy in pollen production independent of branching habit. However, similar associations between branch thickness, branch density and pollen cone size are seen across conifers, including members of living and extinct groups not directly studied here. This suggests that reproductive features relating to pollen cone size are in large part a function of the evolution of vegetative morphology and branching habit.

  4. Artificial maturation of an immature sulfur- and organic matter-rich limestone from the Ghareb Formation, Jordan

    USGS Publications Warehouse

    Koopmans, M.P.; Rijpstra, W.I.C.; De Leeuw, J. W.; Lewan, M.D.; Damste, J.S.S.

    1998-01-01

    An immature (Ro = 0.39%), S-rich (Sorg/C = 0.07), organic matter-rich (19.6 wt.% TOC) limestone from the Ghareb Formation (Upper Cretaceous) in Jordan was artificially matured by hydrous pyrolysis (200, 220, ..., 300 °C; 72 h) to study the effect of progressive diagenesis and early catagenesis on the amounts and distributions of hydrocarbons, organic sulfur compounds and S-rich geomacromolecules. The use of internal standards allowed the determination of absolute amounts. With increasing thermal maturation, large amounts of alkanes and alkylthiophenes with predominantly linear carbon skeletons are generated from the kerogen. The alkylthiophene isomer distributions do not change significantly with increasing thermal maturation, indicating the applicability of alkylthiophenes as biomarkers at relatively high levels of thermal maturity. For a given carbon skeleton, the saturated hydrocarbon, alkylthiophenes and alkylbenzo[b]thiophenes are stable forms at relatively high temperatures, whereas the alkylsulfides are not stable. The large amount of alkylthiophenes produced relative to the alkanes may be explained by the large number of monosulfide links per carbon skeleton. These results are in good agreement with those obtained previously for an artificial maturation series of an immature S-rich sample from the Gessoso-solfifera Formation.

  5. Differentiation of oligodendrocyte progenitor cells from dissociated monolayer and feeder-free cultured pluripotent stem cells.

    PubMed

    Yamashita, Tomoko; Miyamoto, Yuki; Bando, Yoshio; Ono, Takashi; Kobayashi, Sakurako; Doi, Ayano; Araki, Toshihiro; Kato, Yosuke; Shirakawa, Takayuki; Suzuki, Yutaka; Yamauchi, Junji; Yoshida, Shigetaka; Sato, Naoya

    2017-01-01

    Oligodendrocytes myelinate axons and form myelin sheaths in the central nervous system. The development of therapies for demyelinating diseases, including multiple sclerosis and leukodystrophies, is a challenge because the pathogenic mechanisms of disease remain poorly understood. Primate pluripotent stem cell-derived oligodendrocytes are expected to help elucidate the molecular pathogenesis of these diseases. Oligodendrocytes have been successfully differentiated from human pluripotent stem cells. However, it is challenging to prepare large amounts of oligodendrocytes over a short amount of time because of manipulation difficulties under conventional primate pluripotent stem cell culture methods. We developed a proprietary dissociated monolayer and feeder-free culture system to handle pluripotent stem cell cultures. Because the dissociated monolayer and feeder-free culture system improves the quality and growth of primate pluripotent stem cells, these cells could potentially be differentiated into any desired functional cells and consistently cultured in large-scale conditions. In the current study, oligodendrocyte progenitor cells and mature oligodendrocytes were generated within three months from monkey embryonic stem cells. The embryonic stem cell-derived oligodendrocytes exhibited in vitro myelinogenic potency with rat dorsal root ganglion neurons. Additionally, the transplanted oligodendrocyte progenitor cells differentiated into myelin basic protein-positive mature oligodendrocytes in the mouse corpus callosum. This preparative method was used for human induced pluripotent stem cells, which were also successfully differentiated into oligodendrocyte progenitor cells and mature oligodendrocytes that were capable of myelinating rat dorsal root ganglion neurons. Moreover, it was possible to freeze, thaw, and successfully re-culture the differentiating cells. 
These results showed that embryonic stem cells and human induced pluripotent stem cells maintained in a dissociated monolayer and feeder-free culture system have the potential to generate oligodendrocyte progenitor cells and mature oligodendrocytes in vitro and in vivo. This culture method could be applied to prepare large amounts of oligodendrocyte progenitor cells and mature oligodendrocytes in a relatively short amount of time.

  6. Identification of p53 unbound to T-antigen in human cells transformed by simian virus 40 T-antigen.

    PubMed

    O'Neill, F J; Hu, Y; Chen, T; Carney, H

    1997-02-27

    In several clones of SV40-transformed human cells, we investigated the relative amounts of large T-antigen (T-Ag) and p53 proteins, both unbound and associated within complexes, with the goal of identifying changes associated with transformation and immortalization. Cells were transformed by wild-type (wt) T-Ag, a functionally temperature-sensitive T-Ag (tsA58) and other T-Ag variants. Western analysis showed that while most of the T-Ag was ultimately bound by p53, most of the p53 remained unbound to T-Ag. Unbound p53 remained in the supernatant after a T-Ag immunoprecipitation, and p53 was present in a two- to fourfold excess over T-Ag. In one transformant there was five- to tenfold more p53 than T-Ag. p53 was present in transformants in amounts at least 200-fold greater than in untransformed human cells. In wt and variant T-Ag transformants, including those generated with tsA58 T-Ag, large amounts of unbound p53 were present in both pre-crisis and immortal cells and when the cells were grown at permissive or non-permissive temperatures. We also found that in transformants produced by tsA58, an SV40/JCV chimeric T-Ag and other variants, T-Ag appeared to form a complex with p53 slowly, perhaps because one or both proteins matured slowly. The presence in transformed human cells of large amounts of unbound p53, in excess of T-Ag, suggests that sequestration of p53 by T-Ag, resulting from complex formation, is required neither for morphological transformation nor for immortalization of human cells. Rather, these results support the proposal that high levels of p53, the T-Ag/p53 complexes, or other biochemical event(s) lead to transformation and immortalization of human cells by T-Ag.

  7. Biospheric effects of a large extraterrestrial impact: Case study of the Cretaceous/Tertiary boundary crater

    NASA Technical Reports Server (NTRS)

    Pope, Kevin O.

    1994-01-01

    The Chicxulub Crater in Yucatan, Mexico, is the primary candidate for the impact that caused mass extinctions at the Cretaceous/Tertiary boundary. The target rocks at Chicxulub contain 750 to 1500 m of anhydrite (CaSO4), which was vaporized upon impact, creating a large sulfuric acid aerosol cloud. In this study we apply a hydrocode model of asteroid impact to calculate the amount of sulfuric acid produced. We then apply a radiative transfer model to determine the atmospheric effects. Results include 6 to 9 month period of darkness followed by 12 to 26 years of cooling.

  8. SPARK GAP SWITCH

    DOEpatents

    Neal, R.B.

    1957-12-17

    An improved triggered spark gap switch is described, capable of precisely controllable firing time while switching very large amounts of power. The invention in general comprises three electrodes adjustably spaced and adapted to have a large potential impressed between the outer electrodes. The central electrode includes two separate elements electrically connected together and spaced apart to define a pair of spark gaps between the end electrodes. Means are provided to cause the gas flow in the switch to pass towards the central electrode, through a passage in each separate element, and out an exit disposed between the two separate central electrode elements in order to withdraw ions from the spark gap.

  9. End-to-End Assessment of a Large Aperture Segmented Ultraviolet Optical Infrared (UVOIR) Telescope Architecture

    NASA Technical Reports Server (NTRS)

    Feinberg, Lee; Bolcar, Matt; Liu, Alice; Guyon, Olivier; Stark, Chris; Arenberg, Jon

    2016-01-01

    Key challenges of a future large-aperture, segmented Ultraviolet Optical Infrared (UVOIR) telescope capable of performing a spectroscopic survey of hundreds of exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high-yield Exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end-to-end architecture including a high-throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage-based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an Exo-Earth yield assessment to evaluate potential performance.

  10. Massively parallel processor computer

    NASA Technical Reports Server (NTRS)

    Fung, L. W. (Inventor)

    1983-01-01

    An apparatus for processing multidimensional data with strong spatial characteristics, such as raw image data, characterized by a large number of parallel data streams in an ordered array is described. It comprises a large number (e.g., 16,384 in a 128 x 128 array) of parallel processing elements operating simultaneously and independently on single bit slices of a corresponding array of incoming data streams under control of a single set of instructions. Each of the processing elements comprises a bidirectional data bus in communication with a register for storing single bit slices together with a random access memory unit and associated circuitry, including a binary counter/shift register device, for performing logical and arithmetical computations on the bit slices, and an I/O unit for interfacing the bidirectional data bus with the data stream source. The massively parallel processor architecture enables very high speed processing of large amounts of ordered parallel data, including spatial translation by shifting or sliding of bits vertically or horizontally to neighboring processing elements.
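    The lock-step operation described above can be mimicked in a few lines of Python. This toy simulator only illustrates the single-instruction, many-processing-element idea (shrunk from 128 x 128 to a 4 x 4 grid), including the neighbor-shift used for spatial translation; it is not the patented hardware design.

```python
# Toy SIMD array: every processing element (PE) holds one bit and executes
# the same instruction in lock-step. Illustration only, not the MPP hardware.
def shift_east(grid):
    """Slide every bit one PE to the east; the west column fills with 0."""
    return [[0] + row[:-1] for row in grid]

def apply_all(grid, op):
    """Broadcast one single-bit ALU operation to all PEs simultaneously."""
    return [[op(bit) for bit in row] for row in grid]

grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 0]]

shifted = shift_east(grid)                   # spatial translation by one column
inverted = apply_all(grid, lambda b: 1 - b)  # same instruction on every PE
```

    The point of the architecture is that both operations cost one instruction cycle regardless of array size, since all PEs act at once.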

  11. Deep learning-based fine-grained car make/model classification for visual surveillance

    NASA Astrophysics Data System (ADS)

    Gundogdu, Erhan; Parıldı, Enes Sinan; Solmaz, Berkan; Yücesoy, Veysel; Koç, Aykut

    2017-10-01

    Fine-grained object recognition is a potential computer vision problem that has been recently addressed by utilizing deep Convolutional Neural Networks (CNNs). Nevertheless, the main disadvantage of classification methods relying on deep CNN models is the need for considerably large amount of data. In addition, there exists relatively less amount of annotated data for a real world application, such as the recognition of car models in a traffic surveillance system. To this end, we mainly concentrate on the classification of fine-grained car make and/or models for visual scenarios by the help of two different domains. First, a large-scale dataset including approximately 900K images is constructed from a website which includes fine-grained car models. According to their labels, a state-of-the-art CNN model is trained on the constructed dataset. The second domain that is dealt with is the set of images collected from a camera integrated to a traffic surveillance system. These images, which are over 260K, are gathered by a special license plate detection method on top of a motion detection algorithm. An appropriately selected size of the image is cropped from the region of interest provided by the detected license plate location. These sets of images and their provided labels for more than 30 classes are employed to fine-tune the CNN model which is already trained on the large scale dataset described above. To fine-tune the network, the last two fully-connected layers are randomly initialized and the remaining layers are fine-tuned in the second dataset. In this work, the transfer of a learned model on a large dataset to a smaller one has been successfully performed by utilizing both the limited annotated data of the traffic field and a large scale dataset with available annotations. Our experimental results both in the validation dataset and the real field show that the proposed methodology performs favorably against the training of the CNN model from scratch.

  12. The "strong" RNA world hypothesis: fifty years old.

    PubMed

    Neveu, Marc; Kim, Hyo-Joong; Benner, Steven A

    2013-04-01

    This year marks the 50th anniversary of a proposal by Alex Rich that RNA, as a single biopolymer acting in two capacities, might have supported both genetics and catalysis at the origin of life. We review here both published and previously unreported experimental data that provide new perspectives on this old proposal. The new data include evidence that, in the presence of borate, small amounts of carbohydrates can fix large amounts of formaldehyde that are expected in an environment rich in carbon dioxide. Further, we consider other species, including arsenate, arsenite, phosphite, and germanate, that might replace phosphate as linkers in genetic biopolymers. While linkages involving these oxyanions are judged to be too unstable to support genetics on Earth, we consider the possibility that they might do so in colder semi-aqueous environments more exotic than those found on Earth, where cosolvents such as ammonia might prevent freezing at temperatures well below 273 K. These include the ammonia-water environments that are possibly present at low temperatures beneath the surface of Titan, Saturn's largest moon.

  13. A MODIFIED METHOD OF OBTAINING LARGE AMOUNTS OF RICKETTSIA PROWAZEKI BY ROENTGEN IRRADIATION OF RATS

    PubMed Central

    Macchiavello, Atilio; Dresser, Richard

    1935-01-01

    The radiation method described by Zinsser and Castaneda for obtaining large amounts of Rickettsia has been carried out successfully with an ordinary radiographic machine. This allows the extension of the method to those communities which do not possess a high voltage Roentgen therapy unit as originally employed. PMID:19870416

  14. Distributed Processing of Projections of Large Datasets: A Preliminary Study

    USGS Publications Warehouse

    Maddox, Brian G.

    2004-01-01

    Modern information needs have resulted in very large amounts of data being used in geographic information systems. However, problems arise when trying to project these data with reasonable speed and accuracy. Current single-threaded methods suffer from a trade-off between two failings: fast projection with poor accuracy, or accurate projection with long processing times. A possible solution may be to combine accurate interpolation methods with distributed processing algorithms to quickly and accurately convert digital geospatial data between coordinate systems. Modern technology has made it possible to construct systems, such as Beowulf clusters, at low cost that provide access to supercomputer-class performance. Combining these techniques may make it possible to use large amounts of geographic data in time-critical situations.
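    The combination of parallelism and coordinate conversion proposed above can be sketched as follows. This example splits point reprojection across a thread pool using the standard Web Mercator forward formula; it is an illustration under those assumptions, not the actual USGS implementation (which targets raster data on Beowulf clusters).

```python
import math
from concurrent.futures import ThreadPoolExecutor

R = 6378137.0  # WGS 84 / Web Mercator earth radius, metres

def to_web_mercator(point):
    """Project one (lon, lat) pair in degrees to Web Mercator metres."""
    lon, lat = point
    x = R * math.radians(lon)
    y = R * math.log(math.tan(math.pi / 4 + math.radians(lat) / 2))
    return (x, y)

def project_parallel(points, workers=4):
    """Distribute the projection of a point list across worker threads."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(to_web_mercator, points))

pts = [(0.0, 0.0), (-77.0, 38.9), (180.0, 0.0)]
out = project_parallel(pts)
```

    In a real distributed system each worker would be a cluster node handling a tile of the raster rather than a thread handling individual points, but the decomposition idea is the same.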

  15. Synthesis and Labeling of RNA In Vitro

    PubMed Central

    Huang, Chao; Yu, Yi-Tao

    2013-01-01

    This unit discusses several methods for generating large amounts of uniformly labeled, end-labeled, and site-specifically labeled RNAs in vitro. The methods involve a number of experimental procedures, including RNA transcription, 5′ dephosphorylation and rephosphorylation, 3′ terminal nucleotide addition (via ligation), site-specific RNase H cleavage directed by 2′-O-methyl RNA-DNA chimeras, and 2-piece splint ligation. The applications of these RNA radiolabeling approaches are also discussed. PMID:23547015

  16. The ROK Army’s Role When North Korea Collapses Without a War with the ROK

    DTIC Science & Technology

    2001-02-01

    produced large amounts of biological and chemical weapons. In addition, North Korea continues to develop nuclear weapons and missile technology and export...process. 6. Security and safe disposal of WMD. This includes research, production and storage facilities for nuclear, biological and chemical weapons...Publishers, 1989. Naisbitt, John. Megatrends Asia: Eight Asian Megatrends That Are Reshaping Our World, New York: Simon and Schuster, 1996. The New

  17. Cultural Factors in Managing an FMS Case Program: Saudi Arabian Army Ordnance Corps (SOCP) Program

    DTIC Science & Technology

    1977-11-01

    which included the purchase of large amounts of US-produced current generation self-propelled artillery, personnel carriers, tanks, mortar carriers...expressed when attempting to discuss complex, sophisticated technical material with senior counterparts who possessed relative fluency in...ignored with impunity; they cannot be avoided; they can to a great extent be anticipated as critical management factors. By anticipating and preparing

  18. Openwebglobe 2: Visualization of Complex 3D-GEODATA in the (mobile) Webbrowser

    NASA Astrophysics Data System (ADS)

    Christen, M.

    2016-06-01

    Providing worldwide high-resolution data for virtual globes involves compute- and storage-intensive data processing tasks. Furthermore, rendering complex 3D geodata, such as 3D city models with an extremely high polygon count and a vast amount of textures, at interactive framerates is still a very challenging task, especially on mobile devices. This paper presents an approach for processing, caching and serving massive geospatial data in a cloud-based environment for large-scale, out-of-core, highly scalable 3D scene rendering on a web-based virtual globe. Cloud computing is used both for processing large amounts of geospatial data and for providing 2D and 3D map data to a large number of (mobile) web clients. The paper shows the approach for processing, rendering and caching very large datasets in the currently developed virtual globe "OpenWebGlobe 2", which displays 3D geodata on nearly every device.
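    Serving map data to many clients typically rests on a hierarchical tile addressing scheme. The sketch below computes standard slippy-map (z/x/y) tile keys, one common way such caches are indexed; it is an assumption for illustration, not necessarily OpenWebGlobe 2's internal scheme.

```python
import math

def tile_key(lon, lat, zoom):
    """Return the (zoom, x, y) slippy-map tile containing a lon/lat point."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return zoom, x, y

# A key like this can index a tile cache or form a CDN path.
z, x, y = tile_key(8.54, 47.37, 10)  # a point in Switzerland
key = f"{z}/{x}/{y}"
```

    Because each zoom level quadruples the tile count, the same scheme scales from a whole-planet overview tile to street-level detail while keeping cache lookups O(1) per tile.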

  19. Mouthwash overdose

    MedlinePlus

    ... are: Chlorhexidine gluconate Ethanol (ethyl alcohol) Hydrogen peroxide Methyl salicylate ... amounts of alcohol (drunkenness). Swallowing large amounts of methyl salicylate and hydrogen peroxide may also cause serious stomach ...

  20. Reverse microemulsion synthesis of layered gadolinium hydroxide nanoparticles

    NASA Astrophysics Data System (ADS)

    Xu, Yadong; Suthar, Jugal; Egbu, Raphael; Weston, Andrew J.; Fogg, Andrew M.; Williams, Gareth R.

    2018-02-01

    A reverse microemulsion approach has been explored for the synthesis of layered gadolinium hydroxide (LGdH) nanoparticles in this work. This method uses oleylamine as a multifunctional agent, acting as surfactant, oil phase and base. 1-butanol is additionally used as a co-surfactant. A systematic study of the key reaction parameters was undertaken, including the volume ratio of surfactant (oleylamine) to water, the reaction time, synthesis temperature, and the amount of co-surfactant (1-butanol) added. It proved possible to obtain pristine LGdH materials at temperatures of 120 °C or below with an oleylamine: water ratio of 1:4. Using larger amounts of surfactant or higher temperatures caused the formation of Gd(OH)3, either as the sole product or as a major impurity phase. The LGdH particles produced have sizes of ca. 200 nm, with this size being largely independent of temperature or reaction time. Adjusting the amount of 1-butanol co-surfactant added permits the size to be varied between 200 and 300 nm.

  1. Metalloporphyrin-based porous polymers prepared via click chemistry for size-selective adsorption of protein.

    PubMed

    Zhu, Dailian; Qin, Cunqi; Ao, Shanshi; Su, Qiuping; Sun, Xiying; Jiang, Tengfei; Pei, Kemei; Ni, Huagang; Ye, Peng

    2018-08-01

    Zinc porphyrin-based porous polymers (PPs-Zn) with different pore sizes were prepared by controlling the reaction conditions of click chemistry, and protein adsorption in the PPs-Zn and the catalytic activity of immobilized enzyme were investigated. PPs-Zn-1, with a pore size of 18 nm, and PPs-Zn-2, with a pore size of 90 nm, were characterized by FTIR, NMR and nitrogen absorption experiments. For small proteins, such as lysozyme, lipase and bovine serum albumin (BSA), the amount of protein adsorbed in PPs-Zn-1 was greater than that in PPs-Zn-2, while for large proteins, including myosin and human fibrinogen (HFg), the amount adsorbed in PPs-Zn-1 was less than that in PPs-Zn-2. These results indicate that protein adsorption in the PPs-Zn is size-selective: both the protein size and the pore size have a significant effect on the amount of protein adsorbed. Lipase and lysozyme immobilized in the PPs-Zn exhibited excellent reuse stability.

  2. Hurricane Isabel, Amount of Atmospheric Water Vapor Observed By AIRS

    NASA Image and Video Library

    2003-09-20

    This false-color image shows the amount of atmospheric water vapor observed by AIRS two weeks prior to the passage of Hurricane Isabel, and then when it was a Category 5 storm. The region shown includes parts of South America and the West Indies. Puerto Rico is the large island below the upper left corner. Total water vapor represents the depth of a layer if all the water vapor in the atmosphere were to condense and fall to the surface. The color bar on the right side of each plot gives the thickness of this layer in millimeters (mm). The first image, from August 28, shows typical tropical water vapor amounts over the ocean: between roughly 25 and 50 mm, or 1 to 2 inches. The highest values of roughly 80 mm, seen as a red blob over South America, correspond to intense thunderstorms. Thunderstorms pull in water vapor from surrounding regions and concentrate it, with much of it then falling as rain. http://photojournal.jpl.nasa.gov/catalog/PIA00430

  3. Nonparametric Density Estimation Based on Self-Organizing Incremental Neural Network for Large Noisy Data.

    PubMed

    Nakamura, Yoshihiro; Hasegawa, Osamu

    2017-01-01

    With the ongoing development and expansion of communication networks and sensors, massive amounts of data are continuously generated in real time from real environments. Predicting the distribution underlying such data in advance is difficult, and the data also include substantial amounts of noise. These factors make it difficult to estimate probability densities. To handle these issues and massive amounts of data, we propose a nonparametric density estimator that rapidly learns data online and has high robustness. Our approach is an extension of both kernel density estimation (KDE) and a self-organizing incremental neural network (SOINN); we therefore call it KDESOINN. An SOINN provides a clustering method that learns the given data as a network of prototypes; more specifically, an SOINN can learn the distribution underlying the given data. Using this information, KDESOINN estimates the probability density function. The results of our experiments show that KDESOINN outperforms or achieves performance comparable to the current state-of-the-art approaches in terms of robustness, learning time, and accuracy.
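    The core idea of combining prototype learning with KDE can be sketched in plain Python: first reduce a noisy sample to a few prototypes (here by simple per-bin averaging, a crude stand-in for the SOINN network), then place a Gaussian kernel on each prototype. This is an illustrative reduction of the idea, not the paper's actual KDESOINN algorithm.

```python
import math

def prototypes(data, bin_width=1.0):
    """Crude stand-in for SOINN: collapse points into per-bin mean prototypes."""
    bins = {}
    for x in data:
        bins.setdefault(round(x / bin_width), []).append(x)
    return [sum(v) / len(v) for v in bins.values()]

def kde(x, protos, h=0.5):
    """Gaussian kernel density estimate built on the prototype set."""
    norm = len(protos) * h * math.sqrt(2 * math.pi)
    return sum(math.exp(-0.5 * ((x - p) / h) ** 2) for p in protos) / norm

data = [0.1, -0.2, 0.05, 3.0, 3.1, 2.9, 0.15]  # two clusters
protos = prototypes(data)
density_near_cluster = kde(0.0, protos)
density_far_away = kde(10.0, protos)
```

    Estimating density from prototypes rather than from every raw point is what gives this family of methods its speed and noise robustness: outliers are averaged into (or excluded from) prototypes before they can distort the density.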

  4. A multi-landing pad DNA integration platform for mammalian cell engineering

    PubMed Central

    Gaidukov, Leonid; Wroblewska, Liliana; Teague, Brian; Nelson, Tom; Zhang, Xin; Liu, Yan; Jagtap, Kalpana; Mamo, Selamawit; Tseng, Wen Allen; Lowe, Alexis; Das, Jishnu; Bandara, Kalpanie; Baijuraj, Swetha; Summers, Nevin M; Zhang, Lin; Weiss, Ron

    2018-01-01

    Abstract Engineering mammalian cell lines that stably express many transgenes requires the precise insertion of large amounts of heterologous DNA into well-characterized genomic loci, but current methods are limited. To facilitate reliable large-scale engineering of CHO cells, we identified 21 novel genomic sites that supported stable long-term expression of transgenes, and then constructed cell lines containing one, two or three ‘landing pad’ recombination sites at selected loci. By using a highly efficient BxB1 recombinase along with different selection markers at each site, we directed recombinase-mediated insertion of heterologous DNA to selected sites, including targeting all three with a single transfection. We used this method to controllably integrate up to nine copies of a monoclonal antibody, representing about 100 kb of heterologous DNA in 21 transcriptional units. Because the integration was targeted to pre-validated loci, recombinant protein expression remained stable for weeks and additional copies of the antibody cassette in the integrated payload resulted in a linear increase in antibody expression. Overall, this multi-copy site-specific integration platform allows for controllable and reproducible insertion of large amounts of DNA into stable genomic sites, which has broad applications for mammalian synthetic biology, recombinant protein production and biomanufacturing. PMID:29617873

  5. Computational Results for the KTH-NASA Wind-Tunnel Model Used for Acquisition of Transonic Nonlinear Aeroelastic Data

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Chwalowski, Pawel; Wieseman, Carol D.; Eller, David; Ringertz, Ulf

    2017-01-01

    A status report is provided on the collaboration between the Royal Institute of Technology (KTH) in Sweden and the NASA Langley Research Center regarding the aeroelastic analyses of a full-span fighter configuration wind-tunnel model. This wind-tunnel model was tested in the Transonic Dynamics Tunnel (TDT) in the summer of 2016. Large amounts of data were acquired including steady/unsteady pressures, accelerations, strains, and measured dynamic deformations. The aeroelastic analyses presented include linear aeroelastic analyses, CFD steady analyses, and analyses using CFD-based reduced-order models (ROMs).

  6. Plasma fractionation issues.

    PubMed

    Farrugia, Albert; Evers, Theo; Falcou, Pierre-Francois; Burnouf, Thierry; Amorim, Luiz; Thomas, Sylvia

    2009-04-01

    Procurement and processing of human plasma for fractionation of therapeutic proteins or biological medicines used in clinical practice is a multi-billion dollar international trade. Together the private sector and public sector (non-profit) provide large amounts of safe and effective therapeutic plasma proteins needed worldwide. The principal therapeutic proteins produced by the dichotomous industry include gamma globulins or immunoglobulins (including pathogen-specific hyperimmune globulins, such as hepatitis B immune globulins) albumin, factor VIII and Factor IX concentrates. Viral inactivation, principally by solvent detergent and other processes, has proven highly effective in preventing transmission of enveloped viruses, viz. HBV, HIV, and HCV.

  7. An Evidence-Based Practical Approach to Pediatric Otolaryngology in the Developing World.

    PubMed

    Belcher, Ryan H; Molter, David W; Goudy, Steven L

    2018-06-01

    Despite humanitarian otolaryngology groups traveling in record numbers to resource-limited areas to treat pediatric otolaryngology disease processes and train local providers, a large burden of unmet need remains. There is a meager amount of published information from the developing world from an otolaryngology standpoint. As would be expected, the little information that does come involves some of the most common pediatric otolaryngology diseases and surgical burdens, including childhood hearing loss, otitis media, adenotonsillectomies, airway obstructions requiring tracheostomies, foreign body aspirations, and craniomaxillofacial surgeries, including cleft lip and palate. Copyright © 2018 Elsevier Inc. All rights reserved.

  8. Visual attention mitigates information loss in small- and large-scale neural codes

    PubMed Central

    Sprague, Thomas C; Saproo, Sameer; Serences, John T

    2015-01-01

    Summary The visual system transforms complex inputs into robust and parsimonious neural codes that efficiently guide behavior. Because neural communication is stochastic, the amount of encoded visual information necessarily decreases with each synapse. This constraint requires processing sensory signals in a manner that protects information about relevant stimuli from degradation. Such selective processing – or selective attention – is implemented via several mechanisms, including neural gain and changes in tuning properties. However, examining each of these effects in isolation obscures their joint impact on the fidelity of stimulus feature representations by large-scale population codes. Instead, large-scale activity patterns can be used to reconstruct representations of relevant and irrelevant stimuli, providing a holistic understanding about how neuron-level modulations collectively impact stimulus encoding. PMID:25769502

  9. Lightweight Integrated Solar Array (LISA): Providing Higher Power to Small Spacecraft

    NASA Technical Reports Server (NTRS)

    Johnson, Les; Carr, John; Fabisinski, Leo; Lockett, Tiffany Russell

    2015-01-01

Affordable and convenient access to electrical power is essential for all spacecraft and is a critical design driver for the next generation of smallsats, including CubeSats, which are currently extremely power limited. The Lightweight Integrated Solar Array (LISA), a concept designed, prototyped, and tested at the NASA Marshall Space Flight Center (MSFC) in Huntsville, Alabama, provides an affordable, lightweight, scalable, and easily manufactured approach for power generation in space. This flexible technology has many wide-ranging applications, from serving small satellites to providing abundant power to large spacecraft in GEO and beyond. By using very thin, ultraflexible solar arrays adhered to an inflatable or deployable structure, a large area (and thus a large amount of power) can be folded and packaged into a relatively small volume.

  10. Large Scale GW Calculations on the Cori System

    NASA Astrophysics Data System (ADS)

    Deslippe, Jack; Del Ben, Mauro; da Jornada, Felipe; Canning, Andrew; Louie, Steven

    The NERSC Cori system, powered by 9000+ Intel Xeon-Phi processors, represents one of the largest HPC systems for open-science in the United States and the world. We discuss the optimization of the GW methodology for this system, including both node level and system-scale optimizations. We highlight multiple large scale (thousands of atoms) case studies and discuss both absolute application performance and comparison to calculations on more traditional HPC architectures. We find that the GW method is particularly well suited for many-core architectures due to the ability to exploit a large amount of parallelism across many layers of the system. This work was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division, as part of the Computational Materials Sciences Program.

  11. Research on wastewater reuse planning in Beijing central region.

    PubMed

    Jia, H; Guo, R; Xin, K; Wang, J

    2005-01-01

The need to implement wastewater reuse in Beijing is discussed. Based on an investigation of existing wastewater reuse projects in Beijing, the differences between small and large wastewater reuse systems were analyzed with respect to technical, economic and social issues, and the advantages and disadvantages of each were identified. For wastewater reuse planning in the Beijing urban region, the large system was adopted. The quotas of reclaimed water for different land use types, including industrial reuse, municipal reuse, grass irrigation, and scenic water reuse, were determined. Then, according to the land use information for every block in central Beijing and using GIS techniques, the amounts of reclaimed water needed in every block were calculated, and the main reclaimed-water pipe system was planned.

  12. A Parallel Nonrigid Registration Algorithm Based on B-Spline for Medical Images

    PubMed Central

    Wang, Yangping; Wang, Song

    2016-01-01

The nonrigid registration algorithm based on B-spline Free-Form Deformation (FFD) plays a key role and is widely applied in medical image processing due to its flexibility and robustness. However, it requires a tremendous amount of computing time to obtain accurate registration results, especially for large amounts of medical image data. To address this issue, a parallel nonrigid registration algorithm based on B-splines is proposed in this paper. First, the Logarithm Squared Difference (LSD) is adopted as the similarity metric in the B-spline registration algorithm to improve registration precision. After that, we create a parallel computing strategy and lookup tables (LUTs) to reduce the complexity of the B-spline registration algorithm. As a result, the computing time of the three time-consuming steps, B-spline interpolation, LSD computation, and the analytic gradient computation of LSD, is efficiently reduced; the registration algorithm employs the Nonlinear Conjugate Gradient (NCG) optimization method. Experimental results on registration quality and execution efficiency for large amounts of medical images show that our algorithm achieves better registration accuracy, in terms of the differences between the best deformation fields and ground truth, and a speedup of 17 times over the single-threaded CPU implementation, due to the powerful parallel computing ability of the Graphics Processing Unit (GPU). PMID:28053653
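The abstract names LSD as its similarity metric but does not define it here, so this NumPy sketch assumes LSD is the sum of squared differences of log-transformed intensities; the `eps` offset (to avoid log(0)) and the function name are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def lsd_similarity(fixed, moving, eps=1.0):
    """Logarithm Squared Difference (LSD) between two images.

    Assumed formulation: squared difference of log-transformed
    intensities, summed over all pixels/voxels. `eps` keeps the
    logarithm defined for zero-valued intensities.
    """
    lf = np.log(fixed.astype(np.float64) + eps)
    lm = np.log(moving.astype(np.float64) + eps)
    return np.sum((lf - lm) ** 2)

# Identical images give zero dissimilarity; any intensity shift gives > 0.
a = np.random.default_rng(0).integers(0, 256, size=(64, 64))
print(lsd_similarity(a, a))  # -> 0.0
```

A log-domain metric of this kind weights relative rather than absolute intensity differences, which is one plausible reason for preferring it over a plain sum of squared differences.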

  13. Biogeographic patterns in below-ground diversity in New York City's Central Park are similar to those observed globally.

    PubMed

    Ramirez, Kelly S; Leff, Jonathan W; Barberán, Albert; Bates, Scott Thomas; Betley, Jason; Crowther, Thomas W; Kelly, Eugene F; Oldfield, Emily E; Shaw, E Ashley; Steenbock, Christopher; Bradford, Mark A; Wall, Diana H; Fierer, Noah

    2014-11-22

Soil biota play key roles in the functioning of terrestrial ecosystems; however, compared to our knowledge of above-ground plant and animal diversity, the biodiversity found in soils remains largely uncharacterized. Here, we present an assessment of soil biodiversity and biogeographic patterns across Central Park in New York City that spanned all three domains of life, demonstrating that even an urban, managed system harbours large amounts of undescribed soil biodiversity. Despite high variability across the Park, below-ground diversity patterns were predictable based on soil characteristics, with prokaryotic and eukaryotic communities exhibiting overlapping biogeographic patterns. Further, Central Park soils harboured nearly as many distinct soil microbial phylotypes and types of soil communities as we found in biomes across the globe (including arctic, tropical and desert soils). This integrated cross-domain investigation highlights that the amount and patterning of novel and uncharacterized diversity at a single urban location matches that observed across natural ecosystems spanning multiple biomes and continents. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  14. GCM simulations of volcanic aerosol forcing. I - Climate changes induced by steady-state perturbations

    NASA Technical Reports Server (NTRS)

    Pollack, James B.; Rind, David; Lacis, Andrew; Hansen, James E.; Sato, Makiko; Ruedy, Reto

    1993-01-01

    The response of the climate system to a temporally and spatially constant amount of volcanic particles is simulated using a general circulation model (GCM). The optical depth of the aerosols is chosen so as to produce approximately the same amount of forcing as results from doubling the present CO2 content of the atmosphere and from the boundary conditions associated with the peak of the last ice age. The climate changes produced by long-term volcanic aerosol forcing are obtained by differencing this simulation and one made for the present climate with no volcanic aerosol forcing. The simulations indicate that a significant cooling of the troposphere and surface can occur at times of closely spaced multiple sulfur-rich volcanic explosions that span time scales of decades to centuries. The steady-state climate response to volcanic forcing includes a large expansion of sea ice, especially in the Southern Hemisphere; a resultant large increase in surface and planetary albedo at high latitudes; and sizable changes in the annually and zonally averaged air temperature.

  15. Exploring Cloud Computing for Large-scale Scientific Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Guang; Han, Binh; Yin, Jian

This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications that often just require a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high-performance hardware with low-latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high-performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a system biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.

  16. Mitigation of Fluorosis - A Review

    PubMed Central

    Dodamani, Arun S.; Jadhav, Harish C.; Naik, Rahul G.; Deshmukh, Manjiri A.

    2015-01-01

Fluoride is required for normal development and growth of the body. It is found in plentiful quantities in the environment, and the fluoride content of drinking water is the largest contributor to daily fluoride intake. The behaviour of fluoride ions in the human organism can be regarded as that of a “double-edged sword”: fluoride is beneficial in small amounts but toxic in large amounts. Excessive consumption of fluorides in various forms leads to the development of fluorosis. Fluorosis is a major health problem in 24 countries, including India, which lies in the geographical fluoride belt. Various technologies are being used to remove fluoride from water, but the problem has still not been rooted out. The purpose of this paper is to review the available treatment modalities for fluorosis, available technologies for fluoride removal from water, and ongoing fluorosis mitigation programs, based on a literature survey. Medline was the primary database used in the literature search. Other databases included PubMed, Web of Science, Google Scholar, WHO, Ebscohost, Science Direct, and the Google search engine. PMID:26266235

  17. Minimising generation of acid whey during Greek yoghurt manufacturing.

    PubMed

    Uduwerella, Gangani; Chandrapala, Jayani; Vasiljevic, Todor

    2017-08-01

Greek yoghurt, a popular dairy product, generates large amounts of acid whey as a by-product during manufacturing. Post-processing treatment of this stream presents one of the main concerns for the industry. The objective of this study was to manipulate the initial milk total solids content (15, 20 or 23 g/100 g) by addition of milk protein concentrate, thus reducing whey expulsion. This adjustment was investigated from a technological standpoint, including starter culture performance and the chemical and physical properties of the manufactured Greek yoghurt and the generated acid whey, with a comparison made to commercially available products. Increasing the protein content in regular yoghurt reduced the amount of acid whey during whey draining. The protein fortification also enhanced the Lb. bulgaricus growth rate and proteolytic activity. The best structural properties, including higher gel strength and lower syneresis, were observed in the Greek yoghurt produced with 20 g/100 g initial milk total solids, compared to manufactured or commercially available products, while acid whey generation was lowered due to the reduced drainage requirement.

  18. Comprehensive characterization of atmospheric organic matter in Fresno, California fog water

    USGS Publications Warehouse

    Herckes, P.; Leenheer, J.A.; Collett, J.L.

    2007-01-01

    Fogwater collected during winter in Fresno (CA) was characterized by isolating several distinct fractions and characterizing them by infrared and nuclear magnetic resonance (NMR) spectroscopy. More than 80% of the organic matter in the fogwater was recovered and characterized. The most abundant isolated fractions were those comprised of volatile acids (24% of isolated carbon) and hydrophilic acids plus neutrals (28%). Volatile acids, including formic and acetic acid, have been previously identified as among the most abundant individual species in fogwater. Recovered hydrophobic acids exhibited some properties similar to aquatic fulvic acids. An insoluble particulate organic matter fraction contained a substantial amount of biological material, while hydrophilic and transphilic fractions also contained material suggestive of biotic origin. Together, these fractions illustrate the important contribution biological sources make to organic matter in atmospheric fog droplets. The fogwater also was notable for containing a large amount of organic nitrogen present in a variety of species, including amines, nitrate esters, peptides, and nitroso compounds. ?? 2007 American Chemical Society.

  19. Flexible horseshoe

    DOEpatents

    Ford, Donald F.

    1985-01-01

A screw-on horseshoe formed from a plastic material is disclosed. A flex joint is provided that allows the horseshoe to expand and contract as pressure is applied to the horse's hoof, thereby reducing friction between the hoof and the shoe. The horseshoe also provides a lip portion projecting upwardly from a horseshoe base portion to protect the horse hoof wall from obstacles encountered during the movement of the horse. A novel screw having a double helix thread pattern, including a high thread pattern and a low thread pattern, is used to fasten the horseshoe to the horse's hoof without piercing the hoof wall. The screw includes a keyed, recessed, self-holding head that is complementary to, and therefore readily driven by, a power drill. Thus described is a lightweight yet wear-resistant horseshoe that is readily attached to a horse's hoof with a minimum of labor and a minimum of damage to the hoof, and that can be constructed in many styles and sizes to match a large variety of horse uses.

  20. Comprehensive characterization of atmospheric organic matter in Fresno, California fog water.

    PubMed

    Herckes, Pierre; Leenheer, Jerry A; Collett, Jeffrey L

    2007-01-15

    Fogwater collected during winter in Fresno (CA) was characterized by isolating several distinct fractions and characterizing them by infrared and nuclear magnetic resonance (NMR) spectroscopy. More than 80% of the organic matter in the fogwater was recovered and characterized. The most abundant isolated fractions were those comprised of volatile acids (24% of isolated carbon) and hydrophilic acids plus neutrals (28%). Volatile acids, including formic and acetic acid, have been previously identified as among the most abundant individual species in fogwater. Recovered hydrophobic acids exhibited some properties similar to aquatic fulvic acids. An insoluble particulate organic matter fraction contained a substantial amount of biological material, while hydrophilic and transphilic fractions also contained material suggestive of biotic origin. Together, these fractions illustrate the important contribution biological sources make to organic matter in atmospheric fog droplets. The fogwater also was notable for containing a large amount of organic nitrogen present in a variety of species, including amines, nitrate esters, peptides, and nitroso compounds.

  1. Design of a “Digital Atlas Vme Electronics” (DAVE) module

    NASA Astrophysics Data System (ADS)

    Goodrick, M.; Robinson, D.; Shaw, R.; Postranecky, M.; Warren, M.

    2012-01-01

ATLAS-SCT has developed a new ATLAS trigger card, “Digital Atlas Vme Electronics” (“DAVE”). The unit is designed to provide a versatile array of interface and logic resources, including a large FPGA. It interfaces to both VME bus and USB hosts. DAVE aims to provide exact ATLAS CTP (ATLAS Central Trigger Processor) functionality, with random trigger, simple and complex deadtime, ECR (Event Counter Reset), BCR (Bunch Counter Reset) etc. being generated to give exactly the same conditions in standalone running as experienced in combined runs. DAVE provides additional hardware and a large amount of free firmware resource to allow users to add or change functionality. The combination of the large number of individually programmable inputs and outputs in various formats, with very large external RAM and other components all connected to the FPGA, also makes DAVE a powerful and versatile FPGA utility card.

  2. Unit-Dose Bags For Formulating Intravenous Solutions

    NASA Technical Reports Server (NTRS)

    Finley, Mike; Kipp, Jim; Scharf, Mike; Packard, Jeff; Owens, Jim

    1993-01-01

    Smaller unit-dose flowthrough bags devised for use with large-volume parenteral (LVP) bags in preparing sterile intravenous solutions. Premeasured amount of solute stored in such unit-dose bag flushed by predetermined amount of water into LVP bag. Relatively small number of LVP bags used in conjunction with smaller unit-dose bags to formulate large number of LVP intravenous solutions in nonsterile environment.

  3. Taking Energy to the Physics Classroom from the Large Hadron Collider at CERN

    ERIC Educational Resources Information Center

    Cid, Xabier; Cid, Ramon

    2009-01-01

    In 2008, the greatest experiment in history began. When in full operation, the Large Hadron Collider (LHC) at CERN will generate the greatest amount of information that has ever been produced in an experiment before. It will also reveal some of the most fundamental secrets of nature. Despite the enormous amount of information available on this…

  4. Patchy reaction-diffusion and population abundance: the relative importance of habitat amount and arrangement

    Treesearch

    Curtis H. Flather; Michael Bevers

    2002-01-01

    A discrete reaction-diffusion model was used to estimate long-term equilibrium populations of a hypothetical species inhabiting patchy landscapes to examine the relative importance of habitat amount and arrangement in explaining population size. When examined over a broad range of habitat amounts and arrangements, population size was largely determined by a pure amount...
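A minimal sketch of a discrete reaction-diffusion population model of the kind the abstract describes, comparing the same habitat amount in two arrangements. All specifics here are illustrative assumptions, not the study's model: logistic growth in habitat cells, decline in the matrix, nearest-neighbour diffusion on a wrapped (toroidal) grid, and arbitrary parameter values.

```python
import numpy as np

def equilibrium_population(habitat, r=0.2, K=100.0, d=0.1, steps=2000):
    """Iterate a discrete reaction-diffusion model toward equilibrium.

    habitat: boolean grid, True = habitat cell. Each step, a fraction d
    of every cell's population diffuses equally to its four neighbours
    (grid edges wrap); then populations grow logistically inside habitat
    and decline at rate r outside. Returns total population.
    """
    n = np.zeros(habitat.shape)
    n[habitat] = 1.0  # small initial population in habitat
    for _ in range(steps):
        out = d * n                       # emigrants from each cell
        n = n - out
        n += 0.25 * (np.roll(out, 1, 0) + np.roll(out, -1, 0) +
                     np.roll(out, 1, 1) + np.roll(out, -1, 1))
        growth = np.where(habitat, r * n * (1 - n / K), -r * n)
        n = np.maximum(n + growth, 0.0)
    return n.sum()

# Same habitat amount (25 cells), two arrangements: clumped vs scattered.
clumped = np.zeros((20, 20), bool)
clumped[5:10, 5:10] = True
rng = np.random.default_rng(1)
scattered = np.zeros((20, 20), bool)
scattered.flat[rng.choice(400, 25, replace=False)] = True
print(equilibrium_population(clumped), equilibrium_population(scattered))
```

Holding habitat amount fixed while varying arrangement, as above, is the kind of comparison the study uses to separate the two effects.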

  5. Estimate of the Potential Amount of Low-Level Waste from the Fukushima Prefecture - 12370

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hill, Carolyn; Olson, Eric A.J.; Elmer, John

    2012-07-01

The amount of waste generated by the cleanup of the Fukushima Prefecture (Fukushima-ken) following the releases from the Fukushima Daiichi nuclear power plant accident (March 2011) is dependent on many factors, including: - Contamination amounts; - Cleanup levels determined for the radioisotopes contaminating the area; - Future land use expectations and human exposure scenarios; - Groundwater contamination considerations; - Costs and availability of storage areas, and eventually disposal areas for the waste; and - Decontamination and volume reduction techniques and technologies used. For the purposes of estimating these waste volumes, Fukushima-ken is segregated into zones of similar contamination level and expected future use. Techniques for selecting the appropriate cleanup methods for each area are shown in a decision tree format. This approach is broadly applied to the 20 km evacuation zone and the total amounts and types of waste are estimated; waste resulting from cleanup efforts outside of the evacuation zone is not considered. Some of the limits of future use and potential zones where residents must be excluded within the prefecture are also described. The size and design of the proposed intermediate storage facility is also discussed and the current situation, cleanup, waste handling, and waste storage issues in Japan are described. The method for estimating waste amounts outlined above illustrates the large amount of waste that could potentially be generated by remediation of the 20 km evacuation zone (619 km² total) if the currently proposed cleanup goals are uniformly applied. The Japanese environment ministry estimated in early October that the 1 mSv/year exposure goal would make the government responsible for decontaminating about 8,000 km² within Fukushima-ken and roughly 4,900 km² in areas outside the prefecture.
The described waste volume estimation method also does not give any consideration to areas with localized hot spots. Land use and area dose rate estimates for the 20 km evacuation zone indicate there are large areas where doses to the public can be mitigated through methods other than removal and disposal of soil and other wastes. Several additional options for waste reduction can also be considered, including: - Recycling/reusing or disposing of as municipal waste material that can be unconditionally cleared; - Establishing additional precautionary (e.g., liners) and monitoring requirements for municipal landfills to dispose of some conditionally-cleared material; and - Using slightly-contaminated material in construction of reclamations, banks and roads. Waste estimates for cleanup will continue to evolve as decontamination plans are drafted and finalized. (authors)
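To give a feel for the scale of the waste-volume arithmetic, a toy calculation for the 619 km² evacuation zone cited above. The 5 cm scrape depth is an assumed illustrative value, not a figure from the paper:

```python
def soil_waste_m3(area_km2, depth_cm=5.0):
    """Rough soil-removal volume if topsoil is scraped over a whole zone.

    Assumption: a uniform scrape depth (default 5 cm, illustrative only);
    the area figure comes from the abstract.
    """
    return area_km2 * 1e6 * (depth_cm / 100.0)

# 20 km evacuation zone (619 km^2) at an assumed 5 cm scrape depth:
print(soil_waste_m3(619))  # -> 30950000.0 m^3 (about 31 million m^3)
```

Even this crude estimate shows why volume reduction and conditional-clearance options matter: uniform scraping of a single zone already yields tens of millions of cubic metres of soil.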

  6. Fibroblast responses and antibacterial activity of Cu and Zn co-doped TiO2 for percutaneous implants

    NASA Astrophysics Data System (ADS)

    Zhang, Lan; Guo, Jiaqi; Yan, Ting; Han, Yong

    2018-03-01

In order to enhance the skin integration and antibacterial activity of Ti percutaneous implants, microporous TiO2 coatings co-doped with different doses of Cu2+ and Zn2+ were fabricated directly on Ti via micro-arc oxidation (MAO). The structures of the coatings were investigated, and the behaviour of fibroblasts (L-929) as well as the response of Staphylococcus aureus (S. aureus) were evaluated. During the MAO process, a large number of micro-arc discharges formed on the Ti surface and acted as penetrating channels; O2-, Ca2+, Zn2+, Cu2+ and PO43- ions were delivered via these channels, giving rise to the formation of doped TiO2. Surface characteristics including phase composition, topography, roughness and wettability were almost the same for the different coatings, whereas the amount of Cu doped into the TiO2 decreased as the Zn amount increased. Compared with Cu single-doped TiO2 (0.77 wt% Cu), co-doping with appropriate amounts of Cu and Zn, for example 0.55 wt% Cu and 2.53 wt% Zn, further improved the proliferation of L-929 cells, facilitated the switch of fibroblasts to a fibrotic phenotype, and enhanced the synthesis of collagen I as well as extracellular collagen secretion; the antibacterial properties, including contact-killing and release-killing, were also enhanced. By analyzing the relationship between the Cu/Zn amounts in TiO2 and the behaviours of L-929 and S. aureus, it can be deduced that when the doped Zn is at a low dose (<1.79 wt%), the behaviours of L-929 and S. aureus are sensitive to the reduced amount of Cu2+, whereas Zn2+ plays the key role in accelerating fibroblast functions and reducing S. aureus when its dose increases markedly from 2.63 to 6.47 wt%.

  7. Anisotropic reflectance from turbid media. II. Measurements.

    PubMed

    Neuman, Magnus; Edström, Per

    2010-05-01

    The anisotropic reflectance from turbid media predicted using the radiative transfer based DORT2002 model is experimentally verified through goniophotometric measurements. A set of paper samples with varying amounts of dye and thickness is prepared, and their angle resolved reflectance is measured. An alleged perfect diffusor is also included. The corresponding simulations are performed. A complete agreement between the measurements and model predictions is seen regarding the characteristics of the anisotropy. They show that relatively more light is reflected at large polar angles when the absorption or illumination angle is increased or when the medium thickness is decreased. This is due to the relative amount of near-surface bulk scattering increasing in these cases. This affects the application of the Kubelka-Munk model as well as standards for reflectance measurements and calibration routines.
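For reference, the Kubelka-Munk model mentioned above links the reflectance of an opaque (infinitely thick) layer to the ratio of absorption to scattering. The relation below is the standard textbook form, included only to illustrate what anisotropic reflectance would perturb; it is not taken from this paper:

```python
def kubelka_munk_ks(r_inf):
    """K/S (absorption/scattering ratio) from the reflectance r_inf of an
    opaque layer, via the standard Kubelka-Munk relation
    K/S = (1 - R_inf)^2 / (2 R_inf), valid for 0 < r_inf <= 1."""
    return (1.0 - r_inf) ** 2 / (2.0 * r_inf)

print(kubelka_munk_ks(1.0))             # perfect reflector -> 0.0
print(round(kubelka_munk_ks(0.5), 3))   # -> 0.25
```

Because the relation assumes perfectly diffuse light inside the medium, angle-dependent (anisotropic) reflectance of the kind measured here biases the inferred K/S, which is the practical concern the abstract raises.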

  8. Using Planning, Scheduling and Execution for Autonomous Mars Rover Operations

    NASA Technical Reports Server (NTRS)

    Estlin, Tara A.; Gaines, Daniel M.; Chouinard, Caroline M.; Fisher, Forest W.; Castano, Rebecca; Judd, Michele J.; Nesnas, Issa A.

    2006-01-01

With each new rover mission to Mars, rovers are traveling significantly longer distances. This increase in distance not only raises the opportunities for science data collection, but also amplifies the amount of environment and rover state uncertainty that must be handled in rover operations. This paper describes how planning, scheduling and execution techniques can be used onboard a rover to autonomously generate and execute rover activities and in particular to handle new science opportunities that have been identified dynamically. We also discuss some of the particular challenges we face in supporting autonomous rover decision-making. These include interaction with rover navigation and path-planning software and handling large amounts of uncertainty in state and resource estimations. Finally, we describe our experiences in testing this work using several Mars rover prototypes in a realistic environment.

  9. The use of electrochemistry for the synthesis of 17 alpha-hydroxyprogesterone by a fusion protein containing P450c17.

    PubMed

    Estabrook, R W; Shet, M S; Faulkner, K; Fisher, C W

    1996-11-01

    A method has been developed for the commercial application of the unique oxygen chemistry catalyzed by various cytochrome P450s. This is illustrated here for the synthesis of hydroxylated steroids. This method requires the preparation of large amounts of enzymatically functional P450 proteins that can serve as catalysts and a technique for providing electrons at an economically acceptable cost. To generate large amounts of enzymatically active recombinant P450s we have engineered the cDNAs for various P450s, including bovine adrenal P450c17, by linking them to a modified cDNA for rat NADPH-P450 reductase and placing them in the plasmid pCWori+. Transformation of E. coli results in the high level expression of an enzymatically active protein that can be easily purified by affinity chromatography. Incubation of the purified enzyme with steroid in a reaction vessel containing a platinum electrode and a Ag/AgCl electrode couple poised at -650 mV, together with the electromotively active redox mediator, cobalt sepulchrate, results in the 17 alpha-hydroxylation of progesterone at rates as high as 25 nmoles of progesterone hydroxylated/min/nmole of P450. Thus, high concentrations of hydroxylated steroids can be produced with incubation conditions of hours duration without the use of costly NADPH. Similar experiments have been carried out for the generation of the 6 beta-hydroxylation product of testosterone (using a fusion protein containing human P450 3A4). It is apparent that this method is applicable to many other P450 catalyzed reactions for the synthesis of large amounts of hydroxylated steroid metabolites. The electrochemical system is also applicable to drug discovery studies for the characterization of drug metabolites.
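The quoted top rate implies substantial product yields over incubations lasting hours. A toy calculation, assuming the 25 nmol/min/nmol rate were sustained for the whole incubation (an idealization; real turnover would decline over time):

```python
def product_nmol(rate_per_min=25.0, p450_nmol=1.0, hours=4.0):
    """Hydroxylated steroid produced at a constant turnover rate.

    Default rate is the abstract's top figure (25 nmol product per min
    per nmol P450); the 4 h incubation and constant-rate assumption are
    illustrative only.
    """
    return rate_per_min * p450_nmol * hours * 60.0

print(product_nmol())  # -> 6000.0 nmol product per nmol enzyme over 4 h
```

That 6000-fold turnover per enzyme, driven electrochemically rather than by stoichiometric NADPH, is the economic point the abstract is making.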

  10. Branching habit and the allocation of reproductive resources in conifers

    PubMed Central

    Leslie, Andrew B.

    2012-01-01

Background and Aims: Correlated relationships between branch thickness, branch density, and twig and leaf size have been used extensively to study the evolution of plant canopy architecture, but fewer studies have explored the impact of these relationships on the allocation of reproductive resources. This study quantifies pollen cone production in conifers, which have similar basic reproductive biology but vary dramatically in branching habit, in order to test how differences in branch diameter influence pollen cone size and the density with which they are deployed in the canopy. Methods: Measurements of canopy branch density, the number of cones per branch and cone size were used to estimate the amount of pollen cone tissues produced by 16 species in three major conifer clades. The number of pollen grains produced was also estimated using direct counts from individual pollen cones. Key Results: The total amount of pollen cone tissues in the conifer canopy varied little among species and clades, although vegetative traits such as branch thickness, branch density and pollen cone size varied over several orders of magnitude. However, branching habit controls the way these tissues are deployed: taxa with small branches produce small pollen cones at a high density, while taxa with large branches produce large cones relatively sparsely. Conclusions: Conifers appear to invest similar amounts of energy in pollen production independent of branching habit. However, similar associations between branch thickness, branch density and pollen cone size are seen across conifers, including members of living and extinct groups not directly studied here. This suggests that reproductive features relating to pollen cone size are in large part a function of the evolution of vegetative morphology and branching habit. PMID:22782240

  11. Design of an energy conservation building

    NASA Astrophysics Data System (ADS)

    Jensen, R. N.

    1981-11-01

The concepts involved in designing and predicting the energy consumption of a low-energy-use building are summarized. The building will use less than 30,000 Btu/sq.ft./yr. of border energy. The building's primary energy conservation features include heavy concrete walls with external insulation, a highly insulated ceiling, and large amounts of glass for natural lighting. A solar collector air system is integrated into the south wall. Calculations for energy conservation features were performed using NASA's NECAP Energy Program.

  12. Design of an energy conservation building

    NASA Technical Reports Server (NTRS)

    Jensen, R. N.

    1981-01-01

The concepts involved in designing and predicting the energy consumption of a low-energy-use building are summarized. The building will use less than 30,000 Btu/sq.ft./yr. of border energy. The building's primary energy conservation features include heavy concrete walls with external insulation, a highly insulated ceiling, and large amounts of glass for natural lighting. A solar collector air system is integrated into the south wall. Calculations for energy conservation features were performed using NASA's NECAP Energy Program.

  13. NPS CubeSat Launcher Design, Process and Requirements

    DTIC Science & Technology

    2009-06-01

Soviet era ICBM. The first Dnepr launch in July 2006 consisted of fourteen CubeSats in five P-PODs, while the second in April 2007 consisted of...Regulations (ITAR). ITAR restricts the export of defense-related products and technology on the United States Munitions List. Although one might not...think that CubeSat technology would fall under ITAR, in fact a large amount of aerospace technology, including some that could be used on CubeSats, is

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karch, Andreas; Robinson, Brandon

Thermodynamic quantities associated with black holes in Anti-de Sitter space obey an interesting identity when the cosmological constant is included as one of the dynamical variables, the generalized Smarr relation. Here, we show that this relation can easily be understood from the point of view of the dual holographic field theory. It amounts to the simple statement that the extensive thermodynamic quantities of a large N gauge theory only depend on the number of colors, N, via an overall factor of N².
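The N² statement can be made concrete with standard large-N counting; the sketch below is a generic illustration of that counting, not the paper's own derivation:

```latex
% Leading large-N free energy of the dual gauge theory:
F(T, V, N) = N^2\, f(T, V),
% so every extensive quantity derived from F inherits the same factor:
S = -\frac{\partial F}{\partial T} = -N^2\,\frac{\partial f}{\partial T},
\qquad
E = F + TS = N^2\left(f - T\,\frac{\partial f}{\partial T}\right).
```

Since the overall N² factor cancels out of any relation that is homogeneous in the extensive quantities, identities like the generalized Smarr relation follow from the structure of f alone.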

  15. Development and Commissioning of an External Beam Facility in the Union College Ion Beam Analysis Laboratory

    NASA Astrophysics Data System (ADS)

    Yoskowitz, Joshua; Clark, Morgan; Labrake, Scott; Vineyard, Michael

    2015-10-01

    We have developed an external beam facility for the 1.1-MV tandem Pelletron accelerator in the Union College Ion Beam Analysis Laboratory. The beam is extracted from an aluminum pipe through a 1/4-inch diameter window with a 7.5-μm thick Kapton foil. This external beam facility allows us to perform ion beam analysis on samples that cannot be put under vacuum, including wet samples and samples too large to fit into the scattering chamber. We have commissioned the new facility by performing proton-induced X-ray emission (PIXE) analysis of several samples of environmental interest. These include samples of artificial turf, running tracks, and a human tooth with an amalgam filling. A 1.7-MeV external proton beam was incident on the samples positioned 2 cm from the window. The resulting X-rays were measured using a silicon drift detector and were analyzed using GUPIX software to determine the concentrations of elements in the samples. The results on the human tooth indicate that while significant concentrations of Hg, Ag, and Sn are present in the amalgam filling, only trace amounts of Hg appear to have leached into the tooth. The artificial turf and running tracks show rather large concentrations of a broad range of elements and trace amounts of Pb in the turf infill.

  16. Old age and underlying interstitial abnormalities are risk factors for development of ARDS after pleurodesis using limited amount of large particle size talc.

    PubMed

    Shinno, Yuki; Kage, Hidenori; Chino, Haruka; Inaba, Atsushi; Arakawa, Sayaka; Noguchi, Satoshi; Amano, Yosuke; Yamauchi, Yasuhiro; Tanaka, Goh; Nagase, Takahide

    2018-01-01

    Talc pleurodesis is commonly performed to manage refractory pleural effusion or pneumothorax. It is considered a safe procedure as long as a limited amount of large particle size talc is used. However, acute respiratory distress syndrome (ARDS) is a rare but serious complication after talc pleurodesis. We sought to determine the risk factors for the development of ARDS after pleurodesis using a limited amount of large particle size talc. We retrospectively reviewed patients who underwent pleurodesis with talc or OK-432 at the University of Tokyo Hospital. Twenty-seven and 35 patients underwent chemical pleurodesis using large particle size talc (4 g or less) or OK-432, respectively. Four of 27 (15%) patients developed ARDS after talc pleurodesis. Patients who developed ARDS were significantly older than those who did not (median 80 vs 66 years, P = 0.02) and had a higher prevalence of underlying interstitial abnormalities on chest computed tomography (CT; 2/4 vs 1/23, P < 0.05). No patient developed ARDS after pleurodesis with OK-432. This is the first case series of ARDS after pleurodesis using a limited amount of large particle size talc. Older age and underlying interstitial abnormalities on chest CT seem to be risk factors for developing ARDS after talc pleurodesis. © 2017 Asian Pacific Society of Respirology.

  17. Updated US Department of Agriculture Food Patterns meet goals of the 2010 dietary guidelines.

    PubMed

    Britten, Patricia; Cleveland, Linda E; Koegel, Kristin L; Kuczynski, Kevin J; Nickols-Richardson, Sharon M

    2012-10-01

    The US Department of Agriculture Food Patterns were updated for the 2010 Dietary Guidelines for Americans to meet new nutrition goals and incorporate results of food pattern modeling requested by the Dietary Guidelines Advisory Committee. The purpose of this article is to describe the process used and changes in the updated patterns. Changes include renaming the Meat and Beans and Milk Groups to the Protein Foods and Dairy Groups, respectively, to be more encompassing of foods in each. Vegetable subgroups now provide more achievable intake recommendations. Calcium-fortified soymilk is now included in the Dairy Group because of its similarity to foods in that group. Increased amounts of seafoods are recommended in the Protein Foods Group, balanced by decreased amounts of meat and poultry. A limit on calories from solid fats and added sugars is included, replacing the previous discretionary calorie allowance and emphasizing the need to choose nutrient-dense forms of foods. Lacto-ovo vegetarian and vegan patterns that meet nutrition goals were created by making substitutions in the Protein Foods Group, and for vegan patterns, in the Dairy Group. Patterns identify food choices that meet nutritional needs within energy allowances and encourage choosing a variety of foods. They rely on foods in nutrient-dense forms, including a limited amount of calories from solid fats and added sugars. The Food Patterns provide a useful template for educating consumers about healthful food choices while highlighting a large gap between choices many Americans make and healthy eating patterns. Copyright © 2012 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  18. Bag For Formulating And Dispensing Intravenous Solution

    NASA Technical Reports Server (NTRS)

    Kipp, Jim; Owens, Jim; Scharf, Mike; Finley, Mike; Dudar, Tom; Veillon, Joe; Ogle, Jim

    1993-01-01

    Large-volume parenteral (LVP) bag in which a predetermined amount of sterile solution is formulated by combining a premeasured, prepackaged amount of sterile solute with a predetermined amount of water. Bag designed to hold predetermined amount, typically 1 L, of sterile solution. Sterility of solution maintained during mixing by passing water into bag through sterilizing filter. System used in field or hospitals not having proper sterile facilities, and in field research.

  19. Low-authority control synthesis for large space structures

    NASA Technical Reports Server (NTRS)

    Aubrun, J. N.; Margulies, G.

    1982-01-01

    The control of vibrations of large space structures by distributed sensors and actuators is studied. A procedure is developed for calculating the feedback loop gains required to achieve specified amounts of damping. For moderate damping (Low Authority Control) the procedure is purely algebraic, but it can be applied iteratively when larger amounts of damping are required and is generalized for arbitrary time invariant systems.
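    The algebraic gain computation can be sketched for the simplest case. This is an illustrative sketch only, under strong assumptions not stated in the abstract (unit modal mass, a single collocated rate-feedback loop, and a mode shape value `phi` at the actuator/sensor location); the paper's multi-loop procedure is more general:

    ```python
    # Low-authority-control (LAC) sketch: with modal dynamics
    #   qdd + 2*zeta*omega*qd + omega**2 * q = phi * u
    # and collocated rate feedback u = -g * phi * qd, the closed loop becomes
    #   qdd + (2*zeta*omega + g*phi**2)*qd + omega**2 * q = 0,
    # so the feedback adds modal damping  dzeta = g*phi**2 / (2*omega),
    # and the gain for a specified added damping is purely algebraic.

    def lac_gain(omega, phi, target_zeta):
        """Gain g that adds damping ratio target_zeta to one mode."""
        return 2.0 * omega * target_zeta / phi**2

    omega, phi = 10.0, 0.5          # rad/s; illustrative mode shape value
    g = lac_gain(omega, phi, target_zeta=0.02)
    # sanity check: added damping term g*phi**2 equals 2*target_zeta*omega
    ```

    For larger damping targets, the small-gain approximation above degrades, which is why the paper applies the procedure iteratively.
    
    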

  20. Synthesis of Monolayer MoS2 by Chemical Vapor Deposition

    NASA Astrophysics Data System (ADS)

    Withanage, Sajeevi; Lopez, Mike; Dumas, Kenneth; Jung, Yeonwoong; Khondaker, Saiful

    Finite and layer-tunable band gaps of transition metal dichalcogenides (TMDs), including molybdenum disulfide (MoS2), are highlighted over the zero band gap of graphene in various semiconductor applications. Weak interlayer van der Waals bonding of bulk MoS2 allows few- to single-layer MoS2 to be cleaved using top-down methods such as mechanical and chemical exfoliation; however, the few-micron size of these flakes limits MoS2 applications to fundamental research. Bottom-up approaches, including the sulfurization of molybdenum (Mo) thin films and co-evaporation of Mo and sulfur precursors, have received attention due to their potential to synthesize large areas. We synthesized monolayer MoS2 on Si/SiO2 substrates by atmospheric pressure chemical vapor deposition (CVD) using sulfur and molybdenum trioxide (MoO3) as precursors. Several growth conditions were tested, including precursor amounts, growth temperature, growth time, and flow rate. Raman spectroscopy, photoluminescence (PL), and atomic force microscopy (AFM) confirmed monolayer growth; islands merging to create large areas were observed, with grain sizes up to 70 μm, without using any seeds or seeding promoters. These studies provide in-depth knowledge for synthesizing high quality large area MoS2 for prospective electronics applications.

  1. Expression and Purification of Rat Glucose Transporter 1 in Pichia pastoris.

    PubMed

    Venskutonytė, Raminta; Elbing, Karin; Lindkvist-Petersson, Karin

    2018-01-01

    Large amounts of pure and homogenous protein are a prerequisite for several biochemical and biophysical analyses, and in particular if aiming at resolving the three-dimensional protein structure. Here we describe the production of the rat glucose transporter 1 (GLUT1), a membrane protein facilitating the transport of glucose in cells. The protein is recombinantly expressed in the yeast Pichia pastoris. It is easily maintained and large-scale protein production in shaker flasks, as commonly performed in academic research laboratories, results in relatively high yields of membrane protein. The purification protocol describes all steps needed to obtain a pure and homogenous GLUT1 protein solution, including cell growth, membrane isolation, and chromatographic purification methods.

  2. A divide-and-conquer algorithm for large-scale de novo transcriptome assembly through combining small assemblies from existing algorithms.

    PubMed

    Sze, Sing-Hoi; Parrott, Jonathan J; Tarone, Aaron M

    2017-12-06

    While the continued development of high-throughput sequencing has facilitated studies of entire transcriptomes in non-model organisms, the incorporation of an increasing amount of RNA-Seq libraries has made de novo transcriptome assembly difficult. Although algorithms that can assemble a large amount of RNA-Seq data are available, they are generally very memory-intensive and can only be used to construct small assemblies. We develop a divide-and-conquer strategy that allows these algorithms to be utilized, by subdividing a large RNA-Seq data set into small libraries. Each individual library is assembled independently by an existing algorithm, and a merging algorithm is developed to combine these assemblies by picking a subset of high quality transcripts to form a large transcriptome. When compared to existing algorithms that return a single assembly directly, this strategy achieves comparable or better accuracy than memory-efficient algorithms that can be used to process a large amount of RNA-Seq data, and comparable or slightly lower accuracy than memory-intensive algorithms that can only be used to construct small assemblies. Our divide-and-conquer strategy allows memory-intensive de novo transcriptome assembly algorithms to be utilized to construct large assemblies.
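    The merging step can be sketched as follows. This is a hypothetical toy version, not the paper's merging algorithm: the clustering key (a shared sequence prefix) and the quality criterion (transcript length) are stand-ins for the real similarity clustering and quality scoring.

    ```python
    # Divide-and-conquer sketch: each small library has already been assembled
    # independently; "assemblies" is a list of {transcript_id: sequence} dicts.
    # Merge them by keeping one high-quality representative (here: the longest
    # sequence) per crude cluster of near-identical transcripts.

    def merge_assemblies(assemblies, min_len=50):
        """Combine per-library assemblies into one transcriptome."""
        merged = {}
        for asm in assemblies:
            for tid, seq in asm.items():
                if len(seq) < min_len:
                    continue                 # drop short, low-quality fragments
                key = seq[:25]               # toy clustering key: shared prefix
                if key not in merged or len(seq) > len(merged[key][1]):
                    merged[key] = (tid, seq)  # keep the longest representative
        return {tid: seq for tid, seq in merged.values()}
    ```

    Because each sub-library is assembled on its own, peak memory is bounded by the largest sub-assembly rather than the full data set, which is the point of the strategy.
    
    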

  3. Resolving the tips of the tree of life: How much mitochondrial data do we need?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonett, Ronald M.; Macey, J. Robert; Boore, Jeffrey L.

    2005-04-29

    Mitochondrial (mt) DNA sequences are used extensively to reconstruct evolutionary relationships among recently diverged animals, and have constituted the most widely used markers for species- and generic-level relationships for the last decade or more. However, most studies to date have employed relatively small portions of the mt-genome. In contrast, complete mt-genomes primarily have been used to investigate deep divergences, including several studies of the amount of mt sequence necessary to recover ancient relationships. We sequenced and analyzed 24 complete mt-genomes from a group of salamander species exhibiting divergences typical of those in many species-level studies. We present the first comprehensive investigation of the amount of mt sequence data necessary to consistently recover the mt-genome tree at this level, using parsimony and Bayesian methods. Both methods of phylogenetic analysis revealed extremely similar results. A surprising number of well supported, yet conflicting, relationships were found in trees based on fragments less than ~2000 nucleotides (nt), typical of the vast majority of the thousands of mt-based studies published to date. Large amounts of data (11,500+ nt) were necessary to consistently recover the whole mt-genome tree. Some relationships consistently were recovered with fragments of all sizes, but many nodes required the majority of the mt-genome to stabilize, particularly those associated with short internal branches. Although moderate amounts of data (2000-3000 nt) were adequate to recover mt-based relationships for which most nodes were congruent with the whole mt-genome tree, many thousands of nucleotides were necessary to resolve rapid bursts of evolution. Recent advances in genomics are making collection of large amounts of sequence data highly feasible, and our results provide the basis for comparative studies of other closely related groups to optimize mt sequence sampling and phylogenetic resolution at the "tips" of the Tree of Life.

  4. The Physician Payments Sunshine Act: Data Evaluation Regarding Payments to Ophthalmologists

    PubMed Central

    Chang, Jonathan S.

    2014-01-01

    Objective/Purpose: To review data for ophthalmologists published online from the Physician Payments Sunshine Act. Design: Retrospective data review using a publicly available electronic database. Methods/Main Outcome Measures: A database was downloaded from the Centers for Medicare and Medicaid Services (CMS) Website under Identified General Payments to Physicians and a primary specialty of ophthalmology. Basic statistical analysis was performed, including mean, median, and range of payments, both for single payments and per provider. Data were also summarized by category of payment and geographic region and compared with other surgical subspecialties. Results: From August 1, 2013 to December 31, 2013, a total of 55,996 individual payments were reported to 9,855 ophthalmologists, for a total of $10,926,447. The mean amount received in a single payment was $195.13 (range $0.04–$193,073). The mean amount received per physician ID was $1,108 (range $1–$397,849) and the median amount was $112.01. Consulting fees made up the largest percentage of fees. There was not a large difference in payments received by region. The mean payments for the subspecialties of dermatology, neurosurgery, orthopedic surgery, and urology ranged from $954–$6,980, and median payments in each field by provider identifier ranged from $88–$173. Conclusions: A large amount of data was released by CMS for the Physician Payments Sunshine Act. In ophthalmology, mean and median payments per physician did not vary greatly from other surgical subspecialties. Most single payments were under $100, and most physicians received less than $500 in total payments. Payments for consulting made up the largest category of spending. How this affects patient perception, patient care, and medical costs warrants further study. PMID:25578254
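    The per-payment and per-provider summaries described above can be sketched with standard-library statistics. The payment records below are hypothetical illustrations; the actual figures in the abstract come from the CMS Open Payments data.

    ```python
    from statistics import mean, median

    # Hypothetical records: payment amounts grouped by physician ID.
    payments = {"doc1": [50.0, 120.0], "doc2": [12.5], "doc3": [400.0, 5.0, 95.0]}

    # Flatten to single payments, and total per provider, mirroring the
    # "per single payment" vs. "per physician ID" summaries in the abstract.
    singles = [amt for amts in payments.values() for amt in amts]
    per_doc = [sum(amts) for amts in payments.values()]

    single_mean = mean(singles)           # mean single payment
    doc_median = median(per_doc)          # median total per physician
    rng = (min(singles), max(singles))    # range of single payments
    ```
    
    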

  5. The Influence of Cloud Field Uniformity on Observed Cloud Amount

    NASA Astrophysics Data System (ADS)

    Riley, E.; Kleiss, J.; Kassianov, E.; Long, C. N.; Riihimaki, L.; Berg, L. K.

    2017-12-01

    Two ground-based measurements of cloud amount include cloud fraction (CF) obtained from time series of zenith-pointing radar-lidar observations and fractional sky cover (FSC) acquired from a Total Sky Imager (TSI). In comparison with the radars and lidars, the TSI has a considerably larger field of view (FOV 100° vs. 0.2°) and therefore is expected to have a different sensitivity to inhomogeneity in a cloud field. Radiative transfer calculations based on cloud properties retrieved from narrow-FOV overhead cloud observations may differ from shortwave and longwave flux observations due to spatial variability in local cloud cover. This bias will impede radiative closure for sampling reasons rather than the accuracy of cloud microphysics retrievals or radiative transfer calculations. Furthermore, the comparison between observed and modeled cloud amount from large eddy simulation (LES) models may be affected by cloud field inhomogeneity. The main goal of our study is to estimate the anticipated impact of cloud field inhomogeneity on the level of agreement between CF and FSC. We focus on shallow cumulus clouds observed at the U.S. Department of Energy Atmospheric Radiation Measurement Facility's Southern Great Plains (SGP) site in Oklahoma, USA. Our analysis identifies cloud field inhomogeneity using a novel metric that quantifies the spatial and temporal uniformity of FSC over 100-degree FOV TSI images. We demonstrate that (1) large differences between CF and FSC are partly attributable to increases in inhomogeneity and (2) using the uniformity metric can provide a meaningful assessment of uncertainties in observed cloud amount to aid in comparing ground-based measurements to radiative transfer or LES model outputs at SGP.
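    The idea of an FSC uniformity metric can be sketched on a toy sky image. This is an illustrative stand-in, not the authors' metric: here "uniformity" is simply the spread of FSC across sub-regions of a binary cloud mask (0 for a perfectly uniform field, larger for patchy fields).

    ```python
    import numpy as np

    # FSC: fraction of cloudy pixels in a (boolean) sky-image cloud mask.
    def fsc(cloud_mask):
        return cloud_mask.mean()

    # Toy uniformity metric: standard deviation of FSC over a blocks x blocks
    # tiling of the image; uniform fields give 0, patchy fields give more.
    def uniformity(cloud_mask, blocks=4):
        rows = np.array_split(cloud_mask, blocks, axis=0)
        sub = [fsc(b) for r in rows for b in np.array_split(r, blocks, axis=1)]
        return np.std(sub)

    uniform = np.ones((8, 8), dtype=bool)   # fully overcast sky
    patchy = np.zeros((8, 8), dtype=bool)
    patchy[:4, :4] = True                   # one cloudy quadrant, same-ish FSC idea
    ```

    Note that `uniform` and `patchy` can have very different uniformity values even when a narrow-FOV instrument overhead would report the same cloudy/clear state, which is the sampling mismatch the study targets.
    
    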

  6. A MBD-seq protocol for large-scale methylome-wide studies with (very) low amounts of DNA.

    PubMed

    Aberg, Karolina A; Chan, Robin F; Shabalin, Andrey A; Zhao, Min; Turecki, Gustavo; Staunstrup, Nicklas Heine; Starnawska, Anna; Mors, Ole; Xie, Lin Y; van den Oord, Edwin Jcg

    2017-09-01

    We recently showed that, after optimization, our methyl-CpG binding domain sequencing (MBD-seq) application approximates the methylome-wide coverage obtained with whole-genome bisulfite sequencing (WGB-seq), but at a cost that enables adequately powered large-scale association studies. A prior drawback of MBD-seq is the relatively large amount of genomic DNA (ideally >1 µg) required to obtain high-quality data. Biomaterials are typically expensive to collect, provide a finite amount of DNA, and may simply not yield sufficient starting material. The ability to use low amounts of DNA will increase the breadth and number of studies that can be conducted. Therefore, we further optimized the enrichment step. With this low starting material protocol, MBD-seq performed as well as, or better than, the protocol requiring ample starting material (>1 µg). Using only 15 ng of DNA as input, there is minimal loss in data quality, achieving 93% of the coverage of WGB-seq (with standard amounts of input DNA) at similar false-positive rates. Furthermore, across a large number of genomic features, the MBD-seq methylation profiles closely tracked those observed for WGB-seq with even slightly larger effect sizes. This suggests that MBD-seq provides similar information about the methylome and classifies methylation status somewhat more accurately. Performance decreases with <15 ng DNA as starting material but, even with as little as 5 ng, MBD-seq still achieves 90% of the coverage of WGB-seq with comparable genome-wide methylation profiles. Thus, the proposed protocol is an attractive option for adequately powered and cost-effective methylome-wide investigations using (very) low amounts of DNA.

  7. Overview of Sea-Ice Properties, Distribution and Temporal Variations, for Application to Ice-Atmosphere Chemical Processes.

    NASA Astrophysics Data System (ADS)

    Moritz, R. E.

    2005-12-01

    The properties, distribution and temporal variation of sea-ice are reviewed for application to problems of ice-atmosphere chemical processes. Typical vertical structure of sea-ice is presented for different ice types, including young ice, first-year ice and multi-year ice, emphasizing factors relevant to surface chemistry and gas exchange. Time average annual cycles of large scale variables are presented, including ice concentration, ice extent, ice thickness and ice age. Spatial and temporal variability of these large scale quantities is considered on time scales of 1-50 years, emphasizing recent and projected changes in the Arctic pack ice. The amount and time evolution of open water and thin ice are important factors that influence ocean-ice-atmosphere chemical processes. Observations and modeling of the sea-ice thickness distribution function are presented to characterize the range of variability in open water and thin ice.

  8. Accumulation of immunoglobulin G against Dermatophagoides farinae tropomyosin in dorsal root ganglia of NC/Nga mice with atopic dermatitis-like symptoms.

    PubMed

    Otsu, Ayaka; Kawasaki, Hiroaki; Tominaga, Mitsutoshi; Shigenaga, Ayako; Matsuda, Hironori; Takahashi, Nobuaki; Nakajima, Tadaaki; Naito, Hisashi; Baba, Takeshi; Ogawa, Hideoki; Tomooka, Yasuhiro; Yamakura, Fumiyuki; Takamori, Kenji

    2017-04-15

    Atopic dermatitis (AD), a chronic inflammatory skin disease, manifests as intractable itch, but its underlying mechanisms are poorly understood. This study assessed the relationship between immunoglobulin G (IgG) and dorsal root ganglia (DRG) in NC/Nga mice, a model of AD that manifests AD-like symptoms including itch. Immunohistochemical analysis showed large amounts of IgG in DRG extracts of NC/Nga mice with AD-like dermatitis, with a large fraction of the IgG distributed in satellite glial cells of the DRG. Proteomic analysis showed that this IgG was reactive against tropomyosin of Dermatophagoides farinae. These findings indicate that the accumulation of anti-tropomyosin IgG in DRG of atopic NC/Nga mice may be associated with the pathogenesis of AD-like symptoms, including itch. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. THE CONTRIBUTION OF CORONAL JETS TO THE SOLAR WIND

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lionello, R.; Török, T.; Titov, V. S.

    Transient collimated plasma eruptions in the solar corona, commonly known as coronal (or X-ray) jets, are among the most interesting manifestations of solar activity. It has been suggested that these events contribute to the mass and energy content of the corona and solar wind, but the extent of these contributions remains uncertain. We have recently modeled the formation and evolution of coronal jets using a three-dimensional (3D) magnetohydrodynamic (MHD) code with thermodynamics in a large spherical domain that includes the solar wind. Our model is coupled to 3D MHD flux-emergence simulations, i.e., we use boundary conditions provided by such simulations to drive a time-dependent coronal evolution. The model includes parametric coronal heating, radiative losses, and thermal conduction, which enables us to simulate the dynamics and plasma properties of coronal jets in a more realistic manner than done so far. Here, we employ these simulations to calculate the amount of mass and energy transported by coronal jets into the outer corona and inner heliosphere. Based on observed jet-occurrence rates, we then estimate the total contribution of coronal jets to the mass and energy content of the solar wind to (0.4–3.0)% and (0.3–1.0)%, respectively. Our results are largely consistent with the few previous rough estimates obtained from observations, supporting the conjecture that coronal jets provide only a small amount of mass and energy to the solar wind. We emphasize, however, that more advanced observations and simulations (including parametric studies) are needed to substantiate this conjecture.

  10. Supply of large woody debris in a stream channel

    USGS Publications Warehouse

    Diehl, Timothy H.; Bryan, Bradley A.

    1993-01-01

    The amount of large woody debris that potentially could be transported to bridge sites was assessed in the basin of the West Harpeth River in Tennessee in the fall of 1992. The assessment was based on inspections of study sites at 12 bridges and examination of channel reaches between bridges. It involved estimating the amount of woody material at least 1.5 meters long, stored in the channel, and not rooted in soil. Study of multiple sites allowed estimation of the amount, characteristics, and sources of debris stored in the channel, and identification of geomorphic features of the channel associated with debris production. Woody debris is plentiful in the channel network, and much of the debris could be transported by a large flood. Tree trunks with attached root masses are the dominant large debris type. Death of these trees is primarily the result of bank erosion. Bank instability seems to be the basin characteristic most useful in identifying basins with a high potential for abundant production of debris.

  11. Delimitation of the Thoracosphaeraceae (Dinophyceae), including the calcareous dinoflagellates, based on large amounts of ribosomal RNA sequence data.

    PubMed

    Gottschling, Marc; Soehner, Sylvia; Zinssmeister, Carmen; John, Uwe; Plötner, Jörg; Schweikert, Michael; Aligizaki, Katerina; Elbrächter, Malte

    2012-01-01

    The phylogenetic relationships of the Dinophyceae (Alveolata) are not sufficiently resolved at present. The Thoracosphaeraceae (Peridiniales) are the only group of the Alveolata that include members with calcareous coccoid stages; this trait is considered apomorphic. Although the coccoid stage apparently is not calcareous, Bysmatrum has been assigned to the Thoracosphaeraceae based on thecal morphology. We tested the monophyly of the Thoracosphaeraceae using large sets of ribosomal RNA sequence data of the Alveolata including the Dinophyceae. Phylogenetic analyses were performed using Maximum Likelihood and Bayesian approaches. The Thoracosphaeraceae were monophyletic, but included also a number of non-calcareous dinophytes (such as Pentapharsodinium and Pfiesteria) and even parasites (such as Duboscquodinium and Tintinnophagus). Bysmatrum had an isolated and uncertain phylogenetic position outside the Thoracosphaeraceae. The phylogenetic relationships among calcareous dinophytes appear complex, and the assumption of the single origin of the potential to produce calcareous structures is challenged. The application of concatenated ribosomal RNA sequence data may prove promising for phylogenetic reconstructions of the Dinophyceae in future. Copyright © 2011 Elsevier GmbH. All rights reserved.

  12. Storage and Retrieval of Large RDF Graph Using Hadoop and MapReduce

    NASA Astrophysics Data System (ADS)

    Farhan Husain, Mohammad; Doshi, Pankil; Khan, Latifur; Thuraisingham, Bhavani

    Handling huge amounts of data scalably has long been a matter of concern, and the same is true for semantic web data. Current semantic web frameworks lack this ability. In this paper, we describe a framework that we built using Hadoop to store and retrieve large numbers of RDF triples. We describe our schema to store RDF data in the Hadoop Distributed File System. We also present our algorithms to answer a SPARQL query. We make use of Hadoop's MapReduce framework to actually answer the queries. Our results reveal that we can store huge amounts of semantic web data in Hadoop clusters built mostly from cheap commodity-class hardware and still answer queries fast enough. We conclude that ours is a scalable framework, able to handle large amounts of RDF data efficiently.
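    The MapReduce-style query answering can be sketched in miniature. This is a toy illustration of the general technique, not the paper's actual schema or algorithms: two triple patterns joined on a shared subject variable, with map emitting candidate bindings keyed by the join variable and reduce keeping subjects that satisfy every pattern.

    ```python
    from collections import defaultdict

    # Toy RDF triples as (subject, predicate, object) tuples.
    triples = [("alice", "worksAt", "acme"),
               ("alice", "knows", "bob"),
               ("bob", "worksAt", "acme")]

    # Toy query: SELECT ?x WHERE { ?x worksAt acme . ?x knows ?y }
    # Each pattern is (pattern_id, predicate); both bind the join variable ?x.
    patterns = [(0, "worksAt"), (1, "knows")]

    def map_phase(triples, patterns):
        # Emit (join-key, (pattern_id, object)) for every matching triple.
        for s, p, o in triples:
            for pid, pred in patterns:
                if p == pred:
                    yield s, (pid, o)

    def reduce_phase(mapped, n_patterns):
        # Group by join key; keep subjects matched by all patterns.
        groups = defaultdict(list)
        for key, val in mapped:
            groups[key].append(val)
        for s, vals in groups.items():
            if len({pid for pid, _ in vals}) == n_patterns:
                yield s

    result = sorted(reduce_phase(map_phase(triples, patterns), len(patterns)))
    ```

    In the real framework the map and reduce functions run distributed over HDFS-resident triple files; the join-on-key structure is the same.
    
    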

  13. Two cases of exenteration of the brain from Brenneke shotgun slugs.

    PubMed

    Karger, B; Banaschak, S

    1997-01-01

    A case of extended suicide resulted in two fatalities due to craniocerebral gunshots from a 12-gauge shotgun firing Brenneke shotgun slugs. In each case, the gunshot shattered the skull and the brain and in one case, large parts of the brain including a complete hemisphere were ejected similar to a "Krönlein shot". The location of the trajectory close to the base of the skull, the muzzle gases and the ballistic characteristics of the missile contributed to this rare form of head injury. The high mass and the large diameter of the lead missile do not necessitate a high muzzle velocity to crush large amounts of tissue or to produce an explosive type of head injury. The wadding material and the metal screw attached to the Brenneke slug can be of forensic significance.

  14. Survivable pulse power space radiator

    DOEpatents

    Mims, James; Buden, David; Williams, Kenneth

    1989-01-01

    A thermal radiator system is described for use on an outer space vehicle, which must survive a long period of nonuse and then radiate large amounts of heat for a limited period of time. The radiator includes groups of radiator panels that are pivotally connected in tandem, so that they can be moved to deployed configuration wherein the panels lie largely coplanar, and to a stowed configuration wherein the panels lie in a stack to resist micrometeorite damage. The panels are mounted on a boom which separates a hot power source from a payload. While the panels are stowed, warm fluid passes through their arteries to keep them warm enough to maintain the coolant in a liquid state and avoid embrittlement of material. The panels can be stored in a largely cylindrical shell, with panels progressively further from the boom being of progressively shorter length.

  15. Visual attention mitigates information loss in small- and large-scale neural codes.

    PubMed

    Sprague, Thomas C; Saproo, Sameer; Serences, John T

    2015-04-01

    The visual system transforms complex inputs into robust and parsimonious neural codes that efficiently guide behavior. Because neural communication is stochastic, the amount of encoded visual information necessarily decreases with each synapse. This constraint requires that sensory signals are processed in a manner that protects information about relevant stimuli from degradation. Such selective processing--or selective attention--is implemented via several mechanisms, including neural gain and changes in tuning properties. However, examining each of these effects in isolation obscures their joint impact on the fidelity of stimulus feature representations by large-scale population codes. Instead, large-scale activity patterns can be used to reconstruct representations of relevant and irrelevant stimuli, thereby providing a holistic understanding about how neuron-level modulations collectively impact stimulus encoding. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Assessing the Provenance of regolith components in the South Pole-Aitken Basin: Results from LRO, M3, GRAIL, and Ejecta Modeling

    NASA Astrophysics Data System (ADS)

    Petro, N. E.; Cohen, B. A.; Jolliff, B. L.; Moriarty, D. P.

    2016-12-01

    Results from recent lunar missions are reshaping our view of the lunar surface, the evolution of the Moon, and the scale of processes that have affected the Moon. From orbital remote sensing data we can investigate surface mineralogy at the 100s m scale as well as corresponding high-resolution images to evaluate the exposures of various compositions. Coupled with geophysical data from the GRAIL mission, we can now assess the effects of large impacts (>200 km in diameter). These data are essential for assessing the composition of the interior of the South Pole-Aitken Basin (SPA), a key destination for future sample return (Jolliff et al., this conference). Data from the Lunar Reconnaissance Orbiter (LRO) show that variations in surface roughness and morphology are broad and likely reflect both the ancient age of the basin floor and younger volcanic and impact-related resurfacing events. Data from the Moon Mineralogy Mapper also reveal compositional variations across the interior of the basin and reflect both ancient volcanic activity as well as surface exposures of deep-seated crustal (SPA substrate) materials. These datasets are critical for delineating variations in surface compositions, which indicate formation mechanisms (e.g., volcanic vs. impact-derived). We investigate the resurfacing history of SPA, focusing on integrating data from multiple instruments, as well as updated modeling of the origin of regolith components (in the form of ejecta from near and distant impact craters). Recent advances include determination of the inventory of large craters as well as improved estimates of the amount of ejecta from such craters. As with past estimates of basin ejecta distribution, the volume of ejecta introduced to SPA is relatively small and quickly becomes diluted within the regolith. In addition, the contribution of ejecta by smaller, local craters is shown to distribute a comparable amount of material within the basin.
Much of the material distributed by these local craters is SPA substrate, with a small amount of re-melted material. In most locations within SPA, the amount of reworked SPA substrate by ballistic ejecta emplacement and mixing from impacts within the presumed transient cavity greatly exceeds the amount of material contributed by ballistic sedimentation from large craters outside of SPA.

  17. Brackish groundwater in the United States

    USGS Publications Warehouse

    Stanton, Jennifer S.; Anning, David W.; Brown, Craig J.; Moore, Richard B.; McGuire, Virginia L.; Qi, Sharon L.; Harris, Alta C.; Dennehy, Kevin F.; McMahon, Peter B.; Degnan, James R.; Böhlke, John Karl

    2017-04-05

    For some parts of the Nation, large-scale development of groundwater has caused decreases in the amount of groundwater that is present in aquifer storage and that discharges to surface-water bodies. Water supply in some areas, particularly in arid and semiarid regions, is not adequate to meet demand, and severe drought is affecting large parts of the United States. Future water demand is projected to heighten the current stress on groundwater resources. This combination of factors has led to concerns about the availability of freshwater to meet domestic, agricultural, industrial, mining, and environmental needs. To ensure the water security of the Nation, currently [2016] untapped water sources may need to be developed. Brackish groundwater is an unconventional water source that may offer a partial solution to current and future water demands. In support of the national census of water resources, the U.S. Geological Survey completed the national brackish groundwater assessment to better understand the occurrence and characteristics of brackish groundwater in the United States as a potential water resource. Analyses completed as part of this assessment relied on previously collected data from multiple sources; no new data were collected. Compiled data included readily available information about groundwater chemistry, horizontal and vertical extents and hydrogeologic characteristics of principal aquifers (regionally extensive aquifers or aquifer systems that have the potential to be used as a source of potable water), and groundwater use. 
Although these data were obtained from a wide variety of sources, the compiled data are biased toward shallow and fresh groundwater resources; data representing groundwater that is at great depths and is saline were not as readily available. One of the most important contributions of this assessment is the creation of a database containing chemical characteristics and aquifer information for the known areas with brackish groundwater in the United States. Previously published digital data relating to brackish groundwater resources were limited to a small number of State- and regional-level studies. Data sources for this assessment ranged from single publications to large datasets and from local studies to national assessments. Geochemical data included concentrations of dissolved solids, major ions, trace elements, nutrients, and radionuclides as well as physical properties of the water (pH, temperature, and specific conductance). Additionally, the database provides selected well information (location, yield, depth, and contributing aquifer) necessary for evaluating the water resource. The assessment was divided into national-, regional-, and aquifer-scale analyses. National-scale analyses included evaluation of the three-dimensional distribution of observed dissolved-solids concentrations in groundwater, the three-dimensional probability of brackish groundwater occurrence, and the geochemical characteristics of saline (greater than or equal to 1,000 mg/L of dissolved solids) groundwater resources. Regional-scale analyses included a summary of the percentage of observed grid cell volume in the region that was occupied by brackish groundwater within the mixture of air, water, and rock for multiple depth intervals. 
Aquifer-scale analyses focused primarily on four regions that contained the largest amounts of observed brackish groundwater and included a generalized description of hydrogeologic characteristics from previously published work; the distribution of dissolved-solids concentrations; considerations for developing brackish groundwater resources, including a summary of other chemical characteristics that may limit the use of brackish groundwater and the ability of sampled wells producing brackish groundwater to yield useful amounts of water; and the amount of saline groundwater being used in 2010.
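
    The dissolved-solids categories used in the assessment can be illustrated with a small sketch. The abstract itself only defines "saline" as at least 1,000 mg/L of dissolved solids; the thresholds and category names below follow the common USGS convention (fresh below 1,000 mg/L, brackish roughly 1,000-10,000 mg/L) and are an assumption for illustration, not a quotation from the report.

```python
# Hedged sketch: classify groundwater samples by total-dissolved-solids
# concentration (mg/L). The bins follow a common USGS convention; the
# abstract only states that "saline" means >= 1,000 mg/L.

def classify_dissolved_solids(tds_mg_per_l):
    """Return a salinity class for a dissolved-solids value in mg/L."""
    if tds_mg_per_l < 1_000:
        return "fresh"
    elif tds_mg_per_l < 10_000:
        return "brackish"
    elif tds_mg_per_l < 35_000:
        return "moderately saline"
    else:
        return "very saline to brine"

samples = [350, 1_500, 8_200, 22_000, 90_000]
classes = [classify_dissolved_solids(s) for s in samples]
print(classes)
```

    A classification like this is the first step in the kind of volumetric summary the assessment describes (the percentage of grid cell volume occupied by brackish groundwater per depth interval).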

  18. Competitive adsorption in model charged protein mixtures: Equilibrium isotherms and kinetics behavior

    NASA Astrophysics Data System (ADS)

    Fang, F.; Szleifer, I.

    2003-07-01

    The competitive adsorption of proteins of different sizes and charges is studied using a molecular theory. The theory enables the study of charged systems explicitly including the size, shape, and charge distributions of all the molecular species in the mixture; thus, this approach goes beyond the commonly used Poisson-Boltzmann approximation. The adsorption isotherms are studied for mixtures of two proteins of different size and charge. The amount of protein adsorbed and the fraction of each protein are calculated as a function of the bulk composition of the solution and the amount of salt in the system. It is found that the total amount of protein adsorbed is a monotonically decreasing function of the fraction of large proteins in the bulk solution and, at fixed protein composition, of the salt concentration. However, the composition of the adsorbed layer is a complicated function of the bulk composition and solution ionic strength. The structure of the adsorbed layer depends upon the bulk composition and salt concentration. In general, multilayers adsorb due to the long-range character of the electrostatic interactions. When the large proteins are in very large excess in the bulk, the structure of the adsorbed multilayer is such that the layer in contact with the surface is composed of a mixture of large and small proteins, whereas the second and third layers are almost exclusively composed of large proteins. The theory is also generalized to study time-dependent adsorption. The approach is based on a separation of time scales into fast modes for the salt ions and the solvent and slow modes for the proteins. The dynamic equations are written for the slow modes, while the fast ones are obtained from the condition of equilibrium constrained to the distribution of proteins given by the slow modes. 
Two different processes are presented: adsorption from a homogeneous solution onto a charged surface at low salt concentration and with a large excess of the large proteins in the bulk, and the kinetics of structural and adsorption change upon changing the salt concentration of the bulk solution from low to high. The first process shows a large overshoot of the large proteins on the surface due to their excess in solution, followed by their replacement at the surface by the smaller molecules. The second process shows a very fast desorption of the large proteins followed by adsorption at later stages; this process is found to be driven by large electrostatic repulsions induced by the fast salt ions approaching the surface. The relevance of the theoretical predictions to experimental systems and possible directions for improvement of the theory are discussed.

  19. A thermal and chemical degradation approach to decipher pristane and phytane precursors in sedimentary organic matter

    USGS Publications Warehouse

    Koopmans, M.P.; Rijpstra, W.I.C.; Klapwijk, M.M.; De Leeuw, J. W.; Lewan, M.D.; Sinninghe, Damste J.S.

    1999-01-01

    A thermal and chemical degradation approach was followed to determine the precursors of pristane (Pr) and phytane (Ph) in samples from the Gessoso-solfifera, Ghareb and Green River Formations. Hydrous pyrolysis of these samples yields large amounts of Pr and Ph carbon skeletons, indicating that their precursors are predominantly sequestered in high-molecular-weight fractions. However, chemical degradation of the polar fraction and the kerogen of the unheated samples generally does not release large amounts of Pr and Ph. Additional information on the precursors of Pr and Ph is obtained from flash pyrolysis analyses of kerogens and residues after hydrous pyrolysis and after chemical degradation. Multiple precursors for Pr and Ph are recognised in these three samples. The main increase of the Pr/Ph ratio with increasing maturation temperature, which is associated with strongly increasing amounts of Pr and Ph, is probably due to the higher amount of precursors of Pr compared to Ph, and not to the different timing of generation of Pr and Ph.

  20. Effects of consumption of choline and lecithin on neurological and cardiovascular systems.

    PubMed

    Wood, J L; Allison, R G

    1982-12-01

    This report concerns possible adverse health effects and benefits that might result from consumption of large amounts of choline, lecithin, or phosphatidylcholine. Indications from preliminary investigations that administration of choline or lecithin might alleviate some neurological disturbances, prevent hypercholesteremia and atherosclerosis, and restore memory and cognition have resulted in much research and public interest. Symptoms of tardive dyskinesia and Alzheimer's disease have been ameliorated in some patients, and varied responses have been observed in the treatment of Gilles de la Tourette's disease, Friedreich's ataxia, levodopa-induced dyskinesia, mania, Huntington's disease, and myasthenic syndrome. Further clinical trials, especially in conjunction with cholinergic drugs, are considered worthwhile but will require sufficient amounts of pure phosphatidylcholine. The public has access to large amounts of commercial lecithin. Because high intakes of lecithin or choline produce acute gastrointestinal distress, sweating, salivation, and anorexia, it is improbable that individuals will incur lasting health hazards from self-administration of either compound. Development of depression or supersensitivity of dopamine receptors and disturbance of the cholinergic-dopaminergic-serotonergic balance are concerns with prolonged, repeated intakes of large amounts of lecithin.

  1. Holographic black hole chemistry

    DOE PAGES

    Karch, Andreas; Robinson, Brandon

    2015-12-14

    Thermodynamic quantities associated with black holes in Anti-de Sitter space obey an interesting identity, the generalized Smarr relation, when the cosmological constant is included as one of the dynamical variables. Here, we show that this relation can easily be understood from the point of view of the dual holographic field theory. It amounts to the simple statement that the extensive thermodynamic quantities of a large N gauge theory depend on the number of colors, N, only via an overall factor of N^2.
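
    For reference, the four-dimensional form of the generalized Smarr relation is quoted below; it is not written out in the abstract itself. This is the standard extended-thermodynamics form, with the cosmological constant entering as a pressure conjugate to a thermodynamic volume.

```latex
% Generalized Smarr relation for a four-dimensional AdS black hole in
% extended black hole thermodynamics, with the cosmological constant
% treated as a pressure P = -\Lambda/(8\pi G) conjugate to a
% thermodynamic volume V:
M = 2TS + 2\Omega_H J + \Phi_H Q - 2PV
```

    The -2PV term is the piece introduced by treating the cosmological constant as a dynamical variable; in the holographic reading described in the abstract, the corresponding statement in the dual gauge theory is the overall N^2 scaling of the extensive quantities.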

  2. Phoenix Missile Hypersonic Testbed (PMHT): Project Concept Overview

    NASA Technical Reports Server (NTRS)

    Jones, Thomas P.

    2007-01-01

    An overview is provided of research into a low-cost hypersonic flight test capability intended to increase the amount of hypersonic flight data and help bridge the large developmental gap between ground testing/analysis and major X-plane flight demonstrators. The major objectives included: developing an air-launched missile booster research testbed; accurately delivering research payloads to hypersonic test conditions through programmable guidance; low cost; a high flight rate (a minimum of two flights per year); and the use of surplus air-launched missiles and NASA aircraft.

  3. China’s Interests and Goals in the Arctic: Implications for the United States

    DTIC Science & Technology

    2017-03-01

    areas north of the Arctic Circle (lat. 66.56° N) amounting to 6 percent of the world’s landmass, including parts of Alaska—holds the world’s largest...September 4, 2015, available from barentsobserver.com/en/business/2015/09/russia-and-china-sign-agreement-belkomur-railroad-04-09. 143. Vitaly ...nance of China, Beijing: Foreign Language Press, p. 326; Camilla T. N. Sørensen, “The Significance of Xi Jinping’s ‘Chinese Dream’ for Chinese

  4. Technology Requirements for Information Management

    NASA Technical Reports Server (NTRS)

    Graves, Sara; Knoblock, Craig A.; Lannom, Larry

    2002-01-01

    This report provides the results of a panel study conducted into the technology requirements for information management in support of application domains of particular government interest, including digital libraries, mission operations, and scientific research. The panel concluded that it was desirable to have a coordinated program of R&D that pursues a science of information management focused on an environment typified by applications of government interest - highly distributed with very large amounts of data and a high degree of heterogeneity of sources, data, and users.

  5. Multiple collision effects on the antiproton production by high energy proton (100 GeV - 1000 GeV)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takahashi, Hiroshi; Powell, J.

    Antiproton production rates which take into account multiple collisions are calculated using a simple model. Methods to reduce capture of the produced antiprotons by the target are discussed, including the geometry of the target and the use of a high intensity laser. Antiproton production increases substantially above 150 GeV proton incident energy. The yield increases almost linearly with incident energy, alleviating space charge problems in the high current accelerator that produces large amounts of antiprotons.

  6. BREAD LOAF ROADLESS AREA, VERMONT.

    USGS Publications Warehouse

    Slack, John F.; Bitar, Richard F.

    1984-01-01

    On the basis of a mineral-resource survey, the Bread Loaf Roadless Area, Vermont, is considered to have probable resource potential for the occurrence of volcanogenic massive sulfide deposits of copper, zinc, and lead, particularly in the north and northeastern section of the roadless area. Nonmetallic commodities include minor deposits of sand and gravel, and abundant rock suitable for crushing. However, large amounts of these materials in more accessible locations are available outside the roadless area. A possibility exists that oil or natural gas resources may be present at great depth.

  7. Reflections on the early development of poxvirus vectors.

    PubMed

    Moss, Bernard

    2013-09-06

    Poxvirus expression vectors were described in 1982 and quickly became widely used for vaccine development as well as research in numerous fields. Advantages of the vectors include simple construction, ability to accommodate large amounts of foreign DNA and high expression levels. Numerous poxvirus-based veterinary vaccines are currently in use and many others are in human clinical trials. The early reports of poxvirus vectors paved the way for and stimulated the development of other viral vectors and recombinant DNA vaccines.

  8. Large dynamic range radiation detector and methods thereof

    DOEpatents

    Marrs, Roscoe E [Livermore, CA; Madden, Norman W [Sparks, NV

    2012-02-14

    According to one embodiment, a radiation detector comprises a scintillator and a photodiode optically coupled to the scintillator. The radiation detector also includes a bias voltage source electrically coupled to the photodiode, a first detector operatively electrically coupled to the photodiode for generating a signal indicative of a level of a charge at an output of the photodiode, and a second detector operatively electrically coupled to the bias voltage source for generating a signal indicative of an amount of current flowing through the photodiode.

  9. A FORTRAN program for determining aircraft stability and control derivatives from flight data

    NASA Technical Reports Server (NTRS)

    Maine, R. E.; Iliff, K. W.

    1975-01-01

    A digital computer program written in FORTRAN IV for the estimation of aircraft stability and control derivatives is presented. The program uses a maximum likelihood estimation method, and two associated programs for routine, related data handling are also included. The three programs form a package that can be used by relatively inexperienced personnel to process large amounts of data with a minimum of manpower. This package was used to successfully analyze 1500 maneuvers on 20 aircraft, and is designed to be used without modification on as many types of computers as feasible. Program listings and sample check cases are included.
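
    The core estimation idea can be illustrated with a toy sketch. The program described above implemented an iterative maximum likelihood (output-error) method for the full aircraft equations of motion; the fragment below shows only the simplest related idea: least-squares fitting of the coefficients of a one-state discrete linear model to noisy simulated maneuver data, which is the maximum likelihood estimate under Gaussian noise. All model names and numbers are hypothetical and are not taken from the report.

```python
# Hedged sketch: fit the coefficients of a one-state discrete linear model
#   x[k+1] = a*x[k] + b*u[k]
# to simulated "flight" data by least squares (the ML estimate under
# Gaussian noise). All parameter values are hypothetical.
import random

random.seed(0)
a_true, b_true = 0.9, 0.5

# Simulate a maneuver: a doublet-like control input with small noise.
u = [1.0 if k < 25 else -1.0 for k in range(50)]
x = [0.0]
for k in range(49):
    x.append(a_true * x[k] + b_true * u[k] + random.gauss(0.0, 0.01))

# Normal equations for the two-parameter least-squares problem.
sxx = sum(xi * xi for xi in x[:-1])
sxu = sum(xi * ui for xi, ui in zip(x[:-1], u))
suu = sum(ui * ui for ui in u[:-1])
sxy = sum(xi * yi for xi, yi in zip(x[:-1], x[1:]))
suy = sum(ui * yi for ui, yi in zip(u, x[1:]))
det = sxx * suu - sxu * sxu
a_hat = (suu * sxy - sxu * suy) / det   # estimated stability coefficient
b_hat = (sxx * suy - sxu * sxy) / det   # estimated control coefficient
print(a_hat, b_hat)
```

    The actual package iterated a likelihood cost over the full nonlinear measurement equations rather than solving a single linear system, but the regression structure above is the underlying building block.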

  10. Unprecedented Arctic Ozone Loss in 2011

    NASA Image and Video Library

    2011-10-02

    In mid-March 2011, NASA's Aura spacecraft observed ozone in Earth's stratosphere: low ozone amounts are shown in purple and grey colors, and large amounts of chlorine monoxide are shown in dark blue colors.

  11. Molecular species composition of plant cardiolipin determined by liquid chromatography mass spectrometry

    PubMed Central

    Zhou, Yonghong; Peisker, Helga

    2016-01-01

    Cardiolipin (CL), an anionic phospholipid of the inner mitochondrial membrane, provides essential functions for stabilizing respiratory complexes and is involved in mitochondrial morphogenesis and programmed cell death in animals. The role of CL and its metabolism in plants is less well understood. The measurement of CL in plants, including its molecular species composition, is hampered by the fact that CL is of extremely low abundance and that plants contain large amounts of interfering compounds, including galactolipids, neutral lipids, and pigments. We used solid phase extraction by anion exchange chromatography to purify CL from crude plant lipid extracts. LC/MS was used to determine the content and molecular species composition of CL. Thus, up to 23 different molecular species of CL were detected in different plant species, including Arabidopsis, mung bean, spinach, barley, and tobacco. Similar to animals, plant CL is dominated by highly unsaturated species, mostly containing linoleic and linolenic acid. During phosphate deprivation or exposure to an extended dark period, the amount of CL decreased in Arabidopsis, accompanied by an increased degree of unsaturation. The mechanism of CL remodeling during stress, and the function of highly unsaturated CL molecular species, remain to be defined. PMID:27179363

  12. Effects on H(-) production in a multicusp ion source by mixture of H2 with H2O, NH3, CH4, N2H4, and SF6

    NASA Technical Reports Server (NTRS)

    Orient, O. J.; Chutjian, A.; Leung, K. N.

    1987-01-01

    Effects on H(-) production in a multicusp ion source are measured by separately mixing with hydrogen small amounts (0.33-10 percent) of water, ammonia, methane, and hydrazine; these are molecules which produce large amounts of H(-) via dissociative attachment (DA) resonances at higher electron energies. The mixing was done in a separate reservoir, with careful measurement of individual pressures. Experimental enhancements of 1.4 and less were observed, whereas calculated enhancements, using accurate DA cross sections for ground-state H2, should have produced factors of 1.5, 3.0, 1.3, and 2.4 for water, ammonia, methane, and hydrazine, respectively, at a mean electron energy of 1.0 eV in the extraction region. The difference is accounted for by including, in the enhancement calculation, vibrationally and rotationally excited H2 molecules, with v'' = 5-11 and J'' = 0-5, and the large DA cross sections for the excited H2(v'', J''). The relative populations of H2(v'', J'') thus obtained are found to be substantially smaller than those predicted by theoretical calculations. The effect on H(-) current was also studied by mixing small amounts of SF6 with H2. A 1.5 percent mixture was found to reduce the H(-) output by one half.

  13. Rotor compound concept for designing an industrial HTS synchronous motor

    NASA Astrophysics Data System (ADS)

    Kashani, M.; Hosseina, M.; Sarrafan, K.; Darabi, A.

    2013-06-01

    Recently, producing power with smaller losses has become an everyday goal. Today, a large amount of energy is wasted in power networks all around the world, mainly in the resistive electric equipment of those networks. Since the early 1980s, along with the development of high temperature superconductive (HTS) technology, superconductors have gradually attracted attention. Using superconductive equipment instead of conventional resistive equipment results in a salient reduction of electric losses in power systems. Superconductive industrial rotating machines in particular can potentially play a significant role in reducing losses in power networks. Early in the current century, the first generation of HTS rotating machines was born, but these machines still have a long way to go before penetrating commercial markets. In HTS rotating machines the conventional copper windings are replaced with HTS superconductors. In this paper an industrial HTS synchronous motor with YBCO coated-conductor field windings was designed. As a new approach, the model was equipped with a compound rotor that includes both magnetic and non-magnetic materials, so a large, heavy iron part could be replaced by a light non-magnetic material such as G-10 fiberglass. Furthermore, in this structure the iron loss in the rotor could be reduced to its lowest value; less weight and higher air-gap energy density are additional advantages. Given zero electric loss in the field windings and lower iron loss in the rotor construction, this model is potentially more effective than other iron-based HTS motors.

  14. Woody debris along an upland chronosequence in boreal Manitoba and its impact on long-term carbon storage

    USGS Publications Warehouse

    Manies, K.L.; Harden, J.W.; Bond-Lamberty, B. P.; O'Neill, K. P.

    2005-01-01

    This study investigated the role of fire-killed woody debris as a source of soil carbon in black spruce (Picea mariana (Mill.) BSP) stands in Manitoba, Canada. We measured the amount of standing dead and downed woody debris along an upland chronosequence, including wood partially and completely covered by moss growth. Such woody debris is rarely included in measurement protocols and composed up to 26% of the total amount of woody debris in older stands, suggesting that it is important to measure all types of woody debris in ecosystems where burial by organic matter is possible. Based on these data and existing net primary production (NPP) values, we used a mass-balance model to assess the potential impact of fire-killed wood on long-term carbon storage at this site. The amount of carbon stored in deeper soil organic layers, which persists over millennia, was used to represent this long-term carbon. We estimate that between 10% and 60% of the deep-soil carbon is derived from wood biomass. Sensitivity analyses suggest that this estimate is most affected by the fire return interval, decay rate of wood, amount of NPP, and decay rate of the char (postfire) carbon pool. Landscape variations in these terms could account for large differences in deep-soil carbon. The model was less sensitive to fire consumption rates and to rates at which standing dead becomes woody debris. All model runs, however, suggest that woody debris plays an important role in long-term carbon storage for this area.
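
    The mass-balance reasoning can be sketched numerically. The toy model below tracks a woody-debris pool that is replenished by fire-killed wood at each fire and decays exponentially, with a fixed fraction of the decayed mass transferred to a deep-soil carbon pool. Every parameter value here is hypothetical, chosen only to illustrate the sensitivity to fire return interval that the study reports; it is not the authors' model.

```python
# Hedged toy mass-balance model (all parameter values hypothetical):
# fire-killed wood enters a debris pool at each fire, the pool decays
# exponentially, and a fixed fraction of the decayed mass is stabilized
# as long-term deep-soil carbon.

def deep_carbon(fire_interval_yr, wood_input, decay_rate, humify_frac, years):
    debris, deep = 0.0, 0.0
    for year in range(years):
        if year % fire_interval_yr == 0:      # stand-killing fire
            debris += wood_input              # pulse of fire-killed wood
        loss = decay_rate * debris            # annual decay flux
        debris -= loss
        deep += humify_frac * loss            # fraction stabilized at depth
    return deep

# Shorter fire return intervals mean more wood inputs over the same
# period, and therefore more accumulated deep-soil carbon.
short = deep_carbon(fire_interval_yr=100, wood_input=2.0,
                    decay_rate=0.03, humify_frac=0.2, years=5000)
long_ = deep_carbon(fire_interval_yr=200, wood_input=2.0,
                    decay_rate=0.03, humify_frac=0.2, years=5000)
print(short, long_)
```

    Varying `decay_rate` and `humify_frac` in the same way reproduces, qualitatively, the sensitivity ranking the abstract describes.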

  15. Serum Amylase in Bulimia Nervosa and Purging Disorder: Differentiating the Association with Binge Eating versus Purging Behavior

    PubMed Central

    Wolfe, Barbara E.; Jimerson, David C.; Smith, Adrian; Keel, Pamela K.

    2011-01-01

    Objective Elevated serum amylase levels in bulimia nervosa (BN), associated with increased salivary gland size and self-induced vomiting in some patients, provide a possible marker of symptom severity. The goal of this study was to assess whether serum hyperamylasemia in BN is more closely associated with binge eating episodes involving consumption of large amounts of food or with purging behavior. Method Participants included women with BN (n=26); women with "purging disorder" (PD), a subtype of EDNOS characterized by recurrent purging in the absence of objectively large binge eating episodes (n=14); and healthy non-eating disorder female controls (n=32). There were no significant differences in age or body mass index (BMI) across groups. The clinical groups reported similar frequencies of self-induced vomiting and were free of psychotropic medications. Serum samples were obtained after an overnight fast and were assayed for alpha-amylase by an enzymatic method. Results Serum amylase levels were significantly elevated in BN (60.7 ± 25.4 international units [IU]/liter, mean ± SD) in comparison to PD (44.7 ± 17.1 IU/L, p < .02) and to Controls (49.3 ± 15.8, p < .05). Conclusion These findings provide evidence to suggest that it is recurrent binge eating involving large amounts of food, rather than self-induced vomiting, that contributes to elevated serum amylase values in BN. PMID:21781981
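
    The direction of the reported BN vs. PD effect can be re-derived from the summary statistics in the abstract. The fragment below computes Welch's t statistic from the reported means, SDs, and group sizes; this is a consistency check on the reported comparison, and assumes an unpaired two-sample design, since the abstract does not state which test the authors actually used.

```python
# Welch's t statistic from the summary statistics reported in the
# abstract (BN: 60.7 +/- 25.4, n = 26; PD: 44.7 +/- 17.1, n = 14).
# Assumes an unpaired two-sample comparison (the exact test used by the
# authors is not stated in the abstract).
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    se = math.sqrt(s1**2 / n1 + s2**2 / n2)   # standard error of the difference
    return (m1 - m2) / se

t_bn_pd = welch_t(60.7, 25.4, 26, 44.7, 17.1, 14)
print(round(t_bn_pd, 2))
```

    A t value of this magnitude with roughly 30 effective degrees of freedom is consistent with the significance level reported in the abstract.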

  16. Detection of tiny amounts of fissile materials in large-sized containers with radioactive waste

    NASA Astrophysics Data System (ADS)

    Batyaev, V. F.; Skliarov, S. V.

    2018-01-01

    The paper is devoted to non-destructive control of tiny amounts of fissile materials in large-sized containers filled with radioactive waste (RAW). The aim of this work is to model an active neutron interrogation facility for detection of fissile materials inside NZK type containers with RAW and determine the minimal detectable mass of U-235 as a function of various parameters: matrix type, nonuniformity of container filling, neutron generator parameters (flux, pulse frequency, pulse duration), and measurement time. As a result the dependence of minimal detectable mass on the location of the fissile materials inside the container is shown. Nonuniformity of the thermal neutron flux inside a container is the main reason for the spatial heterogeneity of minimal detectable mass inside a large-sized container. Our experiments with tiny amounts of uranium-235 (<1 g) confirm the detection of fissile materials in NZK containers by using the active neutron interrogation technique.
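
    The notion of a minimal detectable mass can be made concrete with a standard counting-statistics sketch. Currie's widely used detection-limit formula gives the minimum number of net counts distinguishable from a background of B counts as L_D = 2.71 + 4.65*sqrt(B); dividing by a counts-per-gram sensitivity converts this to a mass. The sensitivity and background values below are invented for illustration and are not taken from the paper, where they depend on matrix type, source position, generator parameters, and measurement time.

```python
# Hedged sketch: minimal detectable mass from counting statistics using
# Currie's detection limit, L_D = 2.71 + 4.65*sqrt(B) net counts above a
# background of B counts. The counts-per-gram sensitivity is a
# hypothetical stand-in for the facility-dependent calibration factor.
import math

def minimal_detectable_mass(background_counts, counts_per_gram):
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)   # Currie (1968) limit
    return l_d / counts_per_gram                        # grams of U-235

# Example: 10,000 background counts and 5,000 counts registered per gram.
mdm = minimal_detectable_mass(10_000, 5_000)
print(round(mdm, 3))
```

    With these illustrative numbers the result lands well under 1 g, the same scale as the experiments described in the abstract; spatial variation of the thermal neutron flux enters through a position-dependent counts-per-gram sensitivity.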

  17. Sound Levels in East Texas Schools.

    ERIC Educational Resources Information Center

    Turner, Aaron Lynn

    A survey of sound levels was taken in several Texas schools to determine the amount of noise and sound present by class size, type of activity, building location, and the presence of air conditioning and large amounts of glass. The data indicate that class size and relative amounts of glass have no significant bearing on the production of…

  18. What Determines the Amount Students Borrow? Revisiting the Crisis-Convenience Debate

    ERIC Educational Resources Information Center

    Hart, Natala K.; Mustafa, Shoumi

    2008-01-01

    Recent studies have questioned the wisdom in blaming college costs for the escalation of student loans. It would appear that less affluent students borrow large amounts because inexpensive subsidized loans are available. This study attempted to verify the claim, estimating a model of the amount of loan received by students as a function of net…

  19. A Large-Scale Design Integration Approach Developed in Conjunction with the Ares Launch Vehicle Program

    NASA Technical Reports Server (NTRS)

    Redmon, John W.; Shirley, Michael C.; Kinard, Paul S.

    2012-01-01

    This paper presents a method for performing large-scale design integration, taking a classical 2D drawing envelope and interface approach and applying it to modern three-dimensional computer aided design (3D CAD) systems. Today, the paradigm often used when performing design integration with 3D models involves a digital mockup of an overall vehicle in the form of a massive, fully detailed CAD assembly, thereby adding unnecessary burden and overhead to design and product data management processes. While fully detailed data may yield a broad depth of design detail, pertinent integration features are often obscured under the excessive amounts of information, making them difficult to discern. In contrast, the envelope and interface method results in a reduction in both the amount and complexity of information necessary for design integration while yielding significant savings in time and effort when applied to today's complex design integration projects. This approach, combining classical and modern methods, proved advantageous during the complex design integration activities of the Ares I vehicle. Downstream processes benefiting from this approach, by reducing development and design cycle time, include: creation of analysis models for the aerodynamics discipline; vehicle-to-ground interface development; and documentation development for the vehicle assembly.

  20. Accumulation of vitamin A in the hepatic stellate cell of arctic top predators.

    PubMed

    Senoo, Haruki; Imai, Katsuyuki; Mezaki, Yoshihiro; Miura, Mitsutaka; Morii, Mayako; Fujiwara, Mutsunori; Blomhoff, Rune

    2012-10-01

    We performed a systematic characterization of the hepatic vitamin A storage in mammals and birds of the Svalbard Archipelago and Greenland. The liver of top predators, including polar bear, Arctic fox, bearded seal, and glaucous gull, contained about 10-20 times more vitamin A than the liver of all other arctic animals studied, as well as their genetically related continental top predators. The values are also high compared to normal humans and experimental animals like the mouse and rat. This massive amount of hepatic vitamin A was located in large autofluorescent lipid droplets in hepatic stellate cells (HSCs; also called vitamin A-storing cells, lipocytes, interstitial cells, fat-storing cells, or Ito cells). The droplets made up most of the cells' cytoplasm. The development of such an efficient vitamin A-storing mechanism in HSCs may have contributed to the survival of top predators in the extreme environment of the arctic. These animals demonstrated no signs of hypervitaminosis A. We suggest that HSCs have the capacity to take up and store large amounts of vitamin A, which may play a pivotal role in maintenance of the food web, food chain, biodiversity, and eventually the ecology of the arctic.

  1. Procedures of determining organic trace compounds in municipal sewage sludge-a review.

    PubMed

    Lindholm-Lehto, Petra C; Ahkola, Heidi S J; Knuutinen, Juha S

    2017-02-01

    Sewage sludge is the largest by-product generated during the wastewater treatment process. Since large amounts of sludge are being produced, different ways of disposal have been introduced. One tempting option is to use it as fertilizer in agricultural fields due to its high contents of inorganic nutrients. This, however, can be limited by the amount of trace contaminants in the sewage sludge, containing a variety of microbiological pollutants and pathogens but also inorganic and organic contaminants. The bioavailability and the effects of trace contaminants on the microorganisms of soil are still largely unknown as well as their mixture effects. Therefore, there is a need to analyze the sludge to test its suitability before further use. In this article, a variety of sampling, pretreatment, extraction, and analysis methods have been reviewed. Additionally, different organic trace compounds often found in the sewage sludge and their methods of analysis have been compiled. In addition to traditional Soxhlet extraction, the most common extraction methods of organic contaminants in sludge include ultrasonic extraction (USE), supercritical fluid extraction (SFE), microwave-assisted extraction (MAE), and pressurized liquid extraction (PLE) followed by instrumental analysis based on gas or liquid chromatography and mass spectrometry.

  2. Trends in autumn rain of West China from 1961 to 2014

    NASA Astrophysics Data System (ADS)

    Zhang, Chi; Wang, Zunya; Zhou, Botao; Li, Yonghua; Tang, Hongyu; Xiang, Bo

    2018-02-01

Autumn rain of West China is a typical climate phenomenon, characterized by continuous rainy days and large rainfall amounts, and it exerts profound impacts on the economy and society. Based on daily precipitation data from 524 observation stations for the period 1961-2014, this article comprehensively examines secular changes in autumn rain of West China, including its amount, frequency, intensity, and associated extremes. The results generally show a significant reduction of rainfall amount and rainy days and a significant enhancement of mean rainfall intensity averaged over West China during autumn (September-October) since 1961. Meanwhile, decreasing trends are consistently observed in the maximum daily rainfall, the longest consecutive rainy days, the greatest consecutive rainfall amount, and the frequencies of extreme daily rainfall, consecutive rainfall, and consecutive rainfall processes. Further analysis indicates that the decreases of autumn rainfall and related extremes in West China are associated with decreases in both water vapor content and atmospheric unstable stratification during the past decades. On the regional scale, some differences exist in the changes of autumn rainfall between the eastern and western parts of West China. Besides, it is found that the autumn rainy season tends to start later and terminate earlier, particularly in eastern West China.

  3. Nature and Properties of Lateritic Soils Derived from Different Parent Materials in Taiwan

    PubMed Central

    2014-01-01

The objective of this study was to investigate the physical, chemical, and mineralogical composition of lateritic soils in order to use these soils as potential commercial products for industrial application in the future. Five lateritic soils derived from various parent materials in Taiwan, including andesite, diluvium, shale stone, basalt, and Pleistocene deposit, were collected from the Bt1 horizon of the soil profiles. Based on the analyses, the Tungwei soil is an alfisol, whereas the other lateritic soils are ultisols. The higher pH value of the Tungwei soil is attributed to its large amounts of Ca2+ and Mg2+. The Loupi and Pingchen soils appear to be the older lateritic soils because of their lower active iron ratios. Among the iron minerals, major amounts of the magnetic iron oxides magnetite and maghemite were found in the Tamshui and Tungwei lateritic soils, respectively. Lepidocrocite was found only in the Soka soil, and intermediate amounts of goethite were detected in the Loupi and Pingchen soils. After the Mg-saturation and K-saturation treatments, major amounts of mixed-layer minerals were observed in the Loupi and Soka soils, whereas montmorillonite was detected only in the Tungwei soil. The results revealed that the parent materials play an important role in the soil weathering process and that the physical, chemical, and mineralogical compositions strongly affect the formation of lateritic soils. PMID:24883366

  4. Nature and properties of lateritic soils derived from different parent materials in Taiwan.

    PubMed

    Ko, Tzu-Hsing

    2014-01-01

The objective of this study was to investigate the physical, chemical, and mineralogical composition of lateritic soils in order to use these soils as potential commercial products for industrial application in the future. Five lateritic soils derived from various parent materials in Taiwan, including andesite, diluvium, shale stone, basalt, and Pleistocene deposit, were collected from the Bt1 horizon of the soil profiles. Based on the analyses, the Tungwei soil is an alfisol, whereas the other lateritic soils are ultisols. The higher pH value of the Tungwei soil is attributed to its large amounts of Ca(2+) and Mg(2+). The Loupi and Pingchen soils appear to be the older lateritic soils because of their lower active iron ratios. Among the iron minerals, major amounts of the magnetic iron oxides magnetite and maghemite were found in the Tamshui and Tungwei lateritic soils, respectively. Lepidocrocite was found only in the Soka soil, and intermediate amounts of goethite were detected in the Loupi and Pingchen soils. After the Mg-saturation and K-saturation treatments, major amounts of mixed-layer minerals were observed in the Loupi and Soka soils, whereas montmorillonite was detected only in the Tungwei soil. The results revealed that the parent materials play an important role in the soil weathering process and that the physical, chemical, and mineralogical compositions strongly affect the formation of lateritic soils.

  5. Earth Science Data Analytics: Preparing for Extracting Knowledge from Information

    NASA Technical Reports Server (NTRS)

    Kempler, Steven; Barbieri, Lindsay

    2016-01-01

Data analytics is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations, and other useful information. Data analytics is a broad term that includes data analysis, as well as an understanding of the cognitive processes an analyst uses to understand problems and explore data in meaningful ways. Analytics also includes data extraction, transformation, and reduction, utilizing specific tools, techniques, and methods. Turning to data science, definitions of data science sound very similar to those of data analytics (which leads to much of the confusion between the two). But the skills needed for both, co-analyzing large amounts of heterogeneous data, understanding and utilizing relevant tools and techniques, and subject matter expertise, although similar, serve different purposes. Data analytics takes a practitioner's approach, applying expertise and skills to solve issues and gain subject knowledge. Data science is more theoretical (research in itself) in nature, providing strategic actionable insights and new innovative methodologies. Earth Science Data Analytics (ESDA) is the process of examining, preparing, reducing, and analyzing large amounts of spatial (multi-dimensional), temporal, or spectral data using a variety of data types to uncover patterns, correlations, and other information, to better understand our Earth. The large variety of datasets (temporal and spatial differences, data types, formats, etc.) creates the need for data analytics skills that combine an understanding of the science domain with data preparation, reduction, and analysis techniques, from a practitioner's point of view. The application of these skills to ESDA is the focus of this presentation. The Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster was created in recognition of the practical need to facilitate the co-analysis of large amounts of data and information for Earth science.
Thus, from the point of view of advancing science: on the continuum of ever-evolving data management systems, we need to understand and develop ways that allow the variety of data relationships to be examined, and information to be manipulated, such that knowledge can be enhanced, to facilitate science. Recognizing the importance and potential impacts of the unlimited ways to co-analyze heterogeneous datasets, now and especially in the future, one of the objectives of the ESDA cluster is to facilitate the preparation of individuals to understand and apply the skills needed for Earth science data analytics. Pinpointing and communicating the needed skills and expertise is new, and not easy. Information technology is just beginning to provide the tools for advancing the analysis of heterogeneous datasets in a big way, thus providing the opportunity to discover unobvious scientific relationships previously invisible to the scientist's eye. And it is not easy: it takes individuals, or teams of individuals, with just the right combination of skills to understand the data and develop the methods to glean knowledge out of data and information. In addition, whereas definitions of data science and big data are (more or less) available (summarized in Reference 5), Earth science data analytics is virtually ignored in the literature, barring a few excellent sources.

  6. Biogas and Fuel Cells Workshop Summary Report: Proceedings from the Biogas and Fuel Cells Workshop, Golden, Colorado, June 11-13, 2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2013-01-01

The U.S. Department of Energy (DOE) National Renewable Energy Laboratory (NREL) held a Biogas and Fuel Cells Workshop June 11-13, 2012, in Golden, Colorado, to discuss biogas and waste-to-energy technologies for fuel cell applications. The overall objective was to identify opportunities for coupling renewable biomethane with highly efficient fuel cells to produce electricity; heat; combined heat and power (CHP); or combined heat, hydrogen and power (CHHP) for stationary or motive applications. The workshop focused on biogas sourced from wastewater treatment plants (WWTPs), landfills, and industrial facilities that generate or process large amounts of organic waste, including large biofuel production facilities (biorefineries).

  7. Efficient tiled calculation of over-10-gigapixel holograms using ray-wavefront conversion.

    PubMed

    Igarashi, Shunsuke; Nakamura, Tomoya; Matsushima, Kyoji; Yamaguchi, Masahiro

    2018-04-16

In the calculation of large-scale computer-generated holograms, an approach called "tiling," which divides the hologram plane into small rectangles, is often employed due to limitations on computational memory. However, the total computational cost increases severely with the number of divisions. In this paper, we propose an efficient method for calculating tiled large-scale holograms using ray-wavefront conversion. In experiments, the effectiveness of the proposed method was verified by comparing its calculation cost with that of the previous method. Additionally, a hologram of 128K × 128K pixels was calculated and fabricated by a laser-lithography system, and a high-quality 105 mm × 105 mm 3D image including complicated reflection and translucency was optically reconstructed.
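    The memory constraint that motivates tiling can be made concrete with a back-of-envelope sketch (our own illustration, not the authors' method): pick the smallest square tile grid whose per-tile buffer fits a given memory budget.

    ```python
    import math

    def tile_grid(width_px, height_px, bytes_per_px, mem_budget_bytes):
        """Return (tiles_x, tiles_y, tile_w, tile_h): the smallest n x n grid
        whose per-tile pixel buffer fits within mem_budget_bytes."""
        max_px_per_tile = mem_budget_bytes // bytes_per_px
        n = 1
        # Grow the grid until one tile's pixel count fits the budget.
        while math.ceil(width_px / n) * math.ceil(height_px / n) > max_px_per_tile:
            n += 1
        return n, n, math.ceil(width_px / n), math.ceil(height_px / n)
    ```

    For example, a 128K × 128K hologram (about 17 gigapixels) at 8 bytes per complex-valued sample under a hypothetical 4 GiB per-tile budget yields a 6 × 6 grid of 21846-pixel-square tiles; each extra division then adds per-tile overhead, which is the cost the paper's ray-wavefront conversion method reduces.
    
    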

  8. A successful strategy for the recovering of active P21, an insoluble recombinant protein of Trypanosoma cruzi

    NASA Astrophysics Data System (ADS)

    Santos, Marlus Alves Dos; Teixeira, Francesco Brugnera; Moreira, Heline Hellen Teixeira; Rodrigues, Adele Aud; Machado, Fabrício Castro; Clemente, Tatiana Mordente; Brigido, Paula Cristina; Silva, Rebecca Tavares E.; Purcino, Cecílio; Gomes, Rafael Gonçalves Barbosa; Bahia, Diana; Mortara, Renato Arruda; Munte, Claudia Elisabeth; Horjales, Eduardo; da Silva, Claudio Vieira

    2014-03-01

Structural studies of proteins normally require large quantities of pure material that can only be obtained through heterologous expression systems and recombinant techniques. In these procedures, large amounts of the expressed protein are often found in the insoluble fraction, making protein purification from the soluble fraction inefficient, laborious, and costly. Usually, protein refolding is avoided due to a lack of experimental assays that can validate correct folding and compare the conformational population to that of the soluble fraction. Herein, we propose a validation method using simple and rapid 1D 1H nuclear magnetic resonance (NMR) spectra that can efficiently compare protein samples, including information on the environment of each individual proton in the structure.

  9. The Spatial Distribution of Forest Biomass in the Brazilian Amazon: A Comparison of Estimates

    NASA Technical Reports Server (NTRS)

    Houghton, R. A.; Lawrence, J. L.; Hackler, J. L.; Brown, S.

    2001-01-01

The amount of carbon released to the atmosphere as a result of deforestation is determined, in part, by the amount of carbon held in the biomass of the forests converted to other uses. Uncertainty in forest biomass is responsible for much of the uncertainty in current estimates of the flux of carbon from land-use change. We compared several estimates of forest biomass for the Brazilian Amazon, based on spatial interpolations of direct measurements, relationships to climatic variables, and remote sensing data. We asked three questions. First, do the methods yield similar estimates? Second, do they yield similar spatial patterns of distribution of biomass? And, third, what factors need most attention if we are to predict more accurately the distribution of forest biomass over large areas? Estimates of the total biomass of Amazonian forests (including dead and below-ground biomass) vary by more than a factor of two, from a low of 39 PgC to a high of 93 PgC. Furthermore, the estimates disagree as to the regions of high and low biomass. The lack of agreement among estimates confirms the need for reliable determination of aboveground biomass over large areas. Potential methods include direct measurement of biomass through forest inventories with improved allometric regression equations, dynamic modeling of forest recovery following observed stand-replacing disturbances (the approach used in this research), and estimation of aboveground biomass from airborne or satellite-based instruments sensitive to the vertical structure of plant canopies.

  10. Low-Tech, Pilot Scale Purification of a Recombinant Spider Silk Protein Analog from Tobacco Leaves.

    PubMed

    Heppner, René; Weichert, Nicola; Schierhorn, Angelika; Conrad, Udo; Pietzsch, Markus

    2016-10-09

Spider dragline is used by many members of the Araneae family not only as a proteinogenic safety thread but also for web construction. Spider dragline has been shown to possess high tensile strength in combination with elastic behavior. This high tensile strength can be attributed to the presence of antiparallel β-sheets within the thread; these antiparallel β-sheets are why the protein is classified as a silk. Due to the properties of spider silk and its technical and medical uses, including its use as a suture material and as a scaffold for tissue regeneration, spider dragline is a focus of the biotechnology industry. The production of sufficient amounts of spider silk is challenging, as it is difficult to produce large quantities of fibers because of the cannibalistic behavior of spiders and their large spatial requirements. In recent years, the heterologous expression of genes coding for spider silk analogs in various hosts, including plants such as Nicotiana tabacum, has been established. We developed a simple and scalable method for the purification of a recombinant spider silk protein-elastin-like peptide fusion protein (Q-/K-MaSp1-100× ELP) after heterologous production in tobacco leaves, involving heat and acetone precipitation. Further purification was performed using centrifugal Inverse Transition Cycling (cITC). Up to 400 mg of highly pure spider silk protein derivatives can be isolated from six kilograms of tobacco leaves, which is the highest amount of silk protein derivatives purified from plants thus far.

  11. A Parallel Multiclassification Algorithm for Big Data Using an Extreme Learning Machine.

    PubMed

    Duan, Mingxing; Li, Kenli; Liao, Xiangke; Li, Keqin

    2018-06-01

As data sets become larger and more complicated, an extreme learning machine (ELM) that runs in a traditional serial environment cannot realize its ability to be fast and effective. Although a parallel ELM (PELM) based on MapReduce processes large-scale data with a faster learning speed than identical ELM algorithms in a serial environment, some operations, such as storing intermediate results on disk and keeping multiple copies for each task, are indispensable, and these operations create a large amount of extra overhead and degrade the learning speed and efficiency of the PELMs. In this paper, an efficient ELM based on the Spark framework (SELM), which includes three parallel subalgorithms, is proposed for big data classification. By partitioning the corresponding data sets reasonably, the hidden layer output matrix calculation algorithm and the two matrix decomposition algorithms perform most of the computations locally. At the same time, they retain the intermediate results in distributed memory and cache the diagonal matrix as a broadcast variable instead of keeping several copies for each task, which greatly reduces costs, and these actions strengthen the learning ability of the SELM. Finally, we implement our SELM algorithm to classify large data sets. Extensive experiments have been conducted to validate the effectiveness of the proposed algorithms. As shown, our SELM achieves speedups that grow as the cluster is scaled from 10 to 35 nodes.
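    The serial computation that SELM distributes is the standard ELM training step: random, untrained hidden weights produce a hidden layer output matrix H, and the output weights are solved in closed form via the Moore-Penrose pseudoinverse. A minimal NumPy sketch for illustration (our own function names; none of the Spark partitioning, caching, or broadcast machinery is shown):

    ```python
    import numpy as np

    def elm_train(X, T, n_hidden, rng):
        """Train an ELM: random input weights, closed-form output weights."""
        W = rng.standard_normal((X.shape[1], n_hidden))  # fixed random weights
        b = rng.standard_normal(n_hidden)                # fixed random biases
        H = np.tanh(X @ W + b)                           # hidden layer output matrix
        beta = np.linalg.pinv(H) @ T                     # Moore-Penrose solution
        return W, b, beta

    def elm_predict(X, W, b, beta):
        return np.tanh(X @ W + b) @ beta
    ```

    In the paper's setting, computing H and decomposing the matrices involved in the pseudoinverse are exactly the steps that are partitioned across Spark workers, with intermediate results kept in distributed memory rather than written to disk.
    
    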

  12. Seemingly unrelated intervention time series models for effectiveness evaluation of large scale environmental remediation.

    PubMed

    Ip, Ryan H L; Li, W K; Leung, Kenneth M Y

    2013-09-15

Large-scale environmental remediation projects applied to sea water always involve large amounts of capital investment. Rigorous effectiveness evaluations of such projects are therefore necessary and essential for policy review and future planning. This study investigates the effectiveness of environmental remediation using three different Seemingly Unrelated Regression (SUR) time series models with intervention effects: Model (1) assumes no correlation within or across variables; Model (2) assumes no correlation across variables but allows correlations within a variable across different sites; and Model (3) allows all possible correlations among variables (i.e., an unrestricted model). The results suggested that the unrestricted SUR model is the most reliable one, consistently having the smallest variations of the estimated model parameters. We discussed our results with reference to marine water quality management in Hong Kong while bringing managerial issues into consideration. Copyright © 2013 Elsevier Ltd. All rights reserved.
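    The unrestricted system of Model (3) is commonly estimated by two-stage feasible GLS: equation-by-equation OLS residuals estimate the cross-equation error covariance, which then weights a GLS fit of the stacked system. A minimal NumPy sketch under those textbook assumptions (our own illustration, omitting the intervention terms, not the authors' implementation):

    ```python
    import numpy as np

    def sur_fgls(X_list, y_list):
        """Two-stage feasible GLS for an unrestricted SUR system."""
        n, M = y_list[0].shape[0], len(y_list)
        # Stage 1: OLS residuals per equation -> cross-equation covariance
        resid = np.column_stack([
            y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
            for X, y in zip(X_list, y_list)
        ])
        sigma = resid.T @ resid / n                  # M x M error covariance
        # Stack the system with a block-diagonal design matrix
        ks = [X.shape[1] for X in X_list]
        X_big = np.zeros((n * M, sum(ks)))
        r = c = 0
        for X in X_list:
            X_big[r:r + n, c:c + X.shape[1]] = X
            r += n
            c += X.shape[1]
        y_big = np.concatenate(y_list)
        # Stage 2: GLS with Omega^{-1} = Sigma^{-1} (kron) I_n
        omega_inv = np.kron(np.linalg.inv(sigma), np.eye(n))
        A = X_big.T @ omega_inv
        beta = np.linalg.solve(A @ X_big, A @ y_big)
        return beta, sigma
    ```

    Setting the off-diagonal blocks of the estimated covariance to zero recovers the restricted structures of Models (1) and (2), which is why the unrestricted fit can only match or improve on their efficiency when the cross-equation correlations are real.
    
    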

  13. Humans use compression heuristics to improve the recall of social networks.

    PubMed

    Brashears, Matthew E

    2013-01-01

    The ability of primates, including humans, to maintain large social networks appears to depend on the ratio of the neocortex to the rest of the brain. However, observed human network size frequently exceeds predictions based on this ratio (e.g., "Dunbar's Number"), implying that human networks are too large to be cognitively managed. Here I show that humans adaptively use compression heuristics to allow larger amounts of social information to be stored in the same brain volume. I find that human adults can remember larger numbers of relationships in greater detail when a network exhibits triadic closure and kin labels than when it does not. These findings help to explain how humans manage large and complex social networks with finite cognitive resources and suggest that many of the unusual properties of human social networks are rooted in the strategies necessary to cope with cognitive limitations.

  14. Preliminary report on ground-water conditions in the Cloquet area, Carlton County, Minnesota

    USGS Publications Warehouse

    Akin, P.D.

    1951-01-01

A study of the geology and ground-water conditions in the area including Cloquet, Minn., was begun by the United States Geological Survey in 1948 in financial cooperation with the Minnesota State Department of Conservation, at the request of the city of Cloquet for assistance in locating large additional ground-water supplies for industrial and municipal use. The location of the area is shown on figure 1. Although the present municipal wells provide a fairly adequate supply for current municipal needs, which averaged about three-quarters of a million gallons a day in 1946, there is great need for large supplies of good water, on the order of 10 million gallons a day, for use by the paper mills and other industries there. At present the industries are using water from the St. Louis River, but the water is unsatisfactory and expensive to use because it contains a large amount of objectionable organic material.

  15. Survivable pulse power space radiator

    DOEpatents

    Mims, J.; Buden, D.; Williams, K.

    1988-03-11

A thermal radiator system is described for use on an outer space vehicle, which must survive a long period of nonuse and then radiate large amounts of heat for a limited period of time. The radiator includes groups of radiator panels that are pivotally connected in tandem, so that they can be moved to a deployed configuration wherein the panels lie largely coplanar, and to a stowed configuration wherein the panels lie in a stack to resist micrometeorite damage. The panels are mounted on a boom which separates a hot power source from a payload. While the panels are stowed, warm fluid passes through their arteries to keep them warm enough to maintain the coolant in a liquid state and avoid embrittlement of the material. The panels can be stored in a largely cylindrical shell, with panels progressively farther from the boom being of progressively shorter length. 5 figs.

  16. Supraventricular tachycardia induced by chocolate: is chocolate too sweet for the heart?

    PubMed

    Parasramka, Saurabh; Dufresne, Alix

    2012-09-01

Conflicting studies have been published concerning the association between chocolate and cardiovascular diseases. Fewer articles have described the potential arrhythmogenic risk related to chocolate intake. We present a case of paroxysmal supraventricular tachycardia in a woman after consumption of a large quantity of chocolate. A 53-year-old woman with no significant medical history presented to us with complaints of palpitations and shortness of breath after consuming large amounts of chocolate. Electrocardiogram showed supraventricular tachycardia at 165 beats per minute, which was restored to sinus rhythm after an adenosine bolus injection. Electrophysiology studies showed atrioventricular nodal reentry tachycardia, which was treated with radiofrequency ablation. Chocolate contains caffeine and theobromine, methylxanthines that are competitive antagonists of adenosine and can have arrhythmogenic potential. Our case describes an episode of tachycardia precipitated by consumption of a large amount of chocolate in a patient with an underlying substrate. There are occasional case reports describing an association between chocolate, caffeine, and arrhythmias. A large Danish study, however, did not find any association between the amount of daily caffeine consumption and the risk of arrhythmia.

  17. A Cost Benefit Analysis of Emerging LED Water Purification Systems in Expeditionary Environments

    DTIC Science & Technology

    2017-03-23

    the initial contingency response phase, ROWPUs are powered by large generators which require relatively large amounts of fossil fuels. The amount of...they attract and cling together forming a larger particle (Chem Treat, 2016). Flocculation is the addition of a polymer to water that clumps...smaller particles together to form larger particles. The idea for both methods is that larger particles will either settle out of or be removed from the

  18. Galaxy And Mass Assembly (GAMA): the connection between metals, specific SFR and H I gas in galaxies: the Z-SSFR relation

    NASA Astrophysics Data System (ADS)

    Lara-López, M. A.; Hopkins, A. M.; López-Sánchez, A. R.; Brough, S.; Colless, M.; Bland-Hawthorn, J.; Driver, S.; Foster, C.; Liske, J.; Loveday, J.; Robotham, A. S. G.; Sharp, R. G.; Steele, O.; Taylor, E. N.

    2013-06-01

    We study the interplay between gas phase metallicity (Z), specific star formation rate (SSFR) and neutral hydrogen gas (H I) for galaxies of different stellar masses. Our study uses spectroscopic data from Galaxy and Mass Assembly and Sloan Digital Sky Survey (SDSS) star-forming galaxies, as well as H I detection from the Arecibo Legacy Fast Arecibo L-band Feed Array (ALFALFA) and Galex Arecibo SDSS Survey (GASS) public catalogues. We present a model based on the Z-SSFR relation that shows that at a given stellar mass, depending on the amount of gas, galaxies will follow opposite behaviours. Low-mass galaxies with a large amount of gas will show high SSFR and low metallicities, while low-mass galaxies with small amounts of gas will show lower SSFR and high metallicities. In contrast, massive galaxies with a large amount of gas will show moderate SSFR and high metallicities, while massive galaxies with small amounts of gas will show low SSFR and low metallicities. Using ALFALFA and GASS counterparts, we find that the amount of gas is related to those drastic differences in Z and SSFR for galaxies of a similar stellar mass.

  19. Renewable energy recovery through selected industrial wastes

    NASA Astrophysics Data System (ADS)

    Zhang, Pengchong

Typically, industrial waste treatment costs a large amount of capital and creates environmental concerns as well. A sound alternative for treating these industrial wastes is anaerobic digestion. This technique reduces environmental pollution and recovers renewable energy from the organic fraction of selected industrial wastes, mostly in the form of biogas (methane). By applying anaerobic techniques, selected industrial wastes can be converted from cash-negative materials into economic energy feedstocks. In this study, three kinds of industrial wastes (paper mill wastes, brown grease, and corn-ethanol thin stillage) were selected, their performance in an anaerobic digestion system was studied, and their applicability was investigated as well. A pilot-scale system, including an anaerobic section (homogenization, pre-digestion, and anaerobic digestion) and an aerobic section (activated sludge), was applied to the selected waste streams. The selected waste streams were investigated in a gradually progressive order. For the paper mill effluents, since those effluents contain a large amount of recalcitrant or toxic compounds, the anaerobic-aerobic system was used to check their treatability, including organic removal efficiency, substrate utilization rate, and methane yield. The results showed the selected effluents were anaerobically treatable. For brown grease, as it is already well known as a treatable substrate, a high-rate anaerobic digester was applied to assess the economics of this substrate, including methane yield and substrate utilization rate. The data from this pilot-scale experiment have the potential to be applied to a full-scale plant. For thin stillage, an anaerobic digestion system was incorporated into the traditional ethanol-making process as a gate-to-gate process. The performance of the anaerobic digester was applied to the gate-to-gate life-cycle analysis to estimate the energy saving and industrial cost saving in a typical ethanol plant.

  20. Antibody recognition of the glycoprotein g of viral haemorrhagic septicemia virus (VHSV) purified in large amounts from insect larvae

    PubMed Central

    2011-01-01

Background There are currently no purification methods capable of producing the large amounts of fish rhabdoviral glycoprotein G (gpG) required for diagnosis and immunisation purposes or for studying the structure and molecular mechanisms of action of this molecule (i.e., pH-dependent membrane fusion). As a result of the unavailability of large amounts of the gpG from viral haemorrhagic septicaemia rhabdovirus (VHSV), one of the most dangerous viruses affecting cultured salmonid species, research interests in this field are severely hampered. Previous purification methods to obtain recombinant gpG from VHSV in E. coli, yeast and baculovirus grown in insect cells have not produced soluble conformations or acceptable yields. The development of large-scale purification methods for gpGs will also further research into other fish rhabdoviruses, such as infectious haematopoietic necrosis virus (IHNV), spring viraemia of carp virus (SVCV), hirame rhabdovirus (HIRRV) and snakehead rhabdovirus (SHRV). Findings Here we designed a method to produce milligram amounts of soluble VHSV gpG. Only the transmembrane and carboxy terminal-deleted (amino acid 21 to 465) gpG was efficiently expressed in insect larvae. Recognition of G21-465 by β-mercaptoethanol-dependent neutralizing monoclonal antibodies (N-MAbs) and pH-dependent recognition by sera from VHSV-hyperimmunized or VHSV-infected rainbow trout (Oncorhynchus mykiss) were demonstrated. Conclusions Given that the purified G21-465 conserved some of its most important properties, this method might be suitable for the large-scale production of fish rhabdoviral gpGs for use in diagnosis, fusion and antigenicity studies. PMID:21693048

  1. Antibody recognition of the glycoprotein g of viral haemorrhagic septicemia virus (VHSV) purified in large amounts from insect larvae.

    PubMed

    Encinas, Paloma; Gomez-Sebastian, Silvia; Nunez, Maria Carmen; Gomez-Casado, Eduardo; Escribano, Jose M; Estepa, Amparo; Coll, Julio

    2011-06-21

There are currently no purification methods capable of producing the large amounts of fish rhabdoviral glycoprotein G (gpG) required for diagnosis and immunisation purposes or for studying the structure and molecular mechanisms of action of this molecule (i.e., pH-dependent membrane fusion). As a result of the unavailability of large amounts of the gpG from viral haemorrhagic septicaemia rhabdovirus (VHSV), one of the most dangerous viruses affecting cultured salmonid species, research interests in this field are severely hampered. Previous purification methods to obtain recombinant gpG from VHSV in E. coli, yeast and baculovirus grown in insect cells have not produced soluble conformations or acceptable yields. The development of large-scale purification methods for gpGs will also further research into other fish rhabdoviruses, such as infectious haematopoietic necrosis virus (IHNV), spring viraemia of carp virus (SVCV), hirame rhabdovirus (HIRRV) and snakehead rhabdovirus (SHRV). Here we designed a method to produce milligram amounts of soluble VHSV gpG. Only the transmembrane and carboxy terminal-deleted (amino acid 21 to 465) gpG was efficiently expressed in insect larvae. Recognition of G21-465 by β-mercaptoethanol-dependent neutralizing monoclonal antibodies (N-MAbs) and pH-dependent recognition by sera from VHSV-hyperimmunized or VHSV-infected rainbow trout (Oncorhynchus mykiss) were demonstrated. Given that the purified G21-465 conserved some of its most important properties, this method might be suitable for the large-scale production of fish rhabdoviral gpGs for use in diagnosis, fusion and antigenicity studies.

  2. Summary of LaRC 2-inch Erectable Joint Hardware Heritage Test Data

    NASA Technical Reports Server (NTRS)

    Dorsey, John T.; Watson, Judith J.

    2016-01-01

    As the National Space Transportation System (STS, also known as the Space Shuttle) went into service during the early 1980's, NASA envisioned many missions of exploration and discovery that could take advantage of the STS capabilities. These missions included: large orbiting space stations, large space science telescopes and large spacecraft for manned missions to the Moon and Mars. The missions required structures that were significantly larger than the payload volume available on the STS. NASA Langley Research Center (LaRC) conducted studies to design and develop the technology needed to assemble the large space structures in orbit. LaRC focused on technology for erectable truss structures, in particular, the joint that connects the truss struts at the truss nodes. When the NASA research in large erectable space structures ended in the early 1990's, a significant amount of structural testing had been performed on the LaRC 2-inch erectable joint that was never published. An extensive set of historical information and data has been reviewed and the joint structural testing results from this historical data are compiled and summarized in this report.

  3. Socioeconomic status moderates genetic and environmental effects on the amount of alcohol use.

    PubMed

    Hamdi, Nayla R; Krueger, Robert F; South, Susan C

    2015-04-01

    Much is unknown about the relationship between socioeconomic status (SES) and alcohol use, including the means by which SES may influence risk for alcohol use. Using a sample of 672 twin pairs (aged 25 to 74) derived from the MacArthur Foundation Survey of Midlife Development in the United States, this study examined whether SES, measured by household income and educational attainment, moderates genetic and environmental influences on 3 indices of alcohol use: amount used, frequency of use, and problem use. We found significant moderation for amount of alcohol used. Specifically, genetic effects were greater in low-SES conditions, shared environmental effects (i.e., environmental effects that enhance the similarity of twins from the same families) tended to increase in high-SES conditions, and nonshared environmental effects (i.e., environmental effects that distinguish twins) tended to decrease with SES. This pattern of results was found for both income and education, and it largely replicated at a second wave of assessment spaced 9 years after the first. There was virtually no evidence of moderation for either frequency of alcohol use or alcohol problems. Our findings indicate that genetic and environmental influences on drinking amount vary as a function of the broader SES context, whereas the etiologies of other drinking phenomena are less affected by this context. Efforts to find the causes underlying the amount of alcohol used are likely to be more successful if such contextual information is taken into account. Copyright © 2015 by the Research Society on Alcoholism.

  4. Socioeconomic Status Moderates Genetic and Environmental Effects on the Amount of Alcohol Use

    PubMed Central

    Hamdi, Nayla R; Krueger, Robert F.; South, Susan C.

    2015-01-01

    Background Much is unknown about the relationship between socioeconomic status (SES) and alcohol use, including the means by which SES may influence risk for alcohol use. Methods Using a sample of 672 twin pairs (aged 25–74) derived from the MacArthur Foundation Survey of Midlife Development in the United States (MIDUS), the present study examined whether SES, measured by household income and educational attainment, moderates genetic and environmental influences on three indices of alcohol use: amount used, frequency of use, and problem use. Results We found significant moderation for amount of alcohol used. Specifically, genetic effects were greater in low-SES conditions, shared environmental effects (i.e., environmental effects that enhance the similarity of twins from the same families) tended to increase in high-SES conditions, and non-shared environmental effects (i.e., environmental effects that distinguish twins) tended to decrease with SES. This pattern of results was found for both income and education, and it largely replicated at a second wave of assessment spaced nine years after the first. There was virtually no evidence of moderation for either frequency of alcohol use or alcohol problems. Conclusions Our findings indicate that genetic and environmental influences on drinking amount vary as a function of the broader SES context, whereas the etiologies of other drinking phenomena are less affected by this context. Efforts to find the causes underlying the amount of alcohol used are likely to be more successful if such contextual information is taken into account. PMID:25778493

  5. Relationship between food waste, diet quality, and environmental sustainability

    PubMed Central

    Niles, Meredith T.; Neher, Deborah A.; Roy, Eric D.; Tichenor, Nicole E.; Jahns, Lisa

    2018-01-01

    Improving diet quality while simultaneously reducing environmental impact is a critical focus globally. Metrics linking diet quality and sustainability have typically focused on a limited suite of indicators, and have not included food waste. To address this important research gap, we examine the relationship between food waste, diet quality, nutrient waste, and multiple measures of sustainability: use of cropland, irrigation water, pesticides, and fertilizers. Data on food intake, food waste, and application rates of agricultural amendments were collected from diverse US government sources. Diet quality was assessed using the Healthy Eating Index-2015. A biophysical simulation model was used to estimate the amount of cropland associated with wasted food. This analysis finds that US consumers wasted 422 g of food per person daily, with 30 million acres of cropland used to produce this food every year. This accounts for 30% of daily calories available for consumption, one-quarter of daily food (by weight) available for consumption, and 7% of annual cropland acreage. Higher quality diets were associated with greater amounts of food waste and greater amounts of wasted irrigation water and pesticides, but less cropland waste. This is largely due to fruits and vegetables, which are health-promoting and require small amounts of cropland, but require substantial amounts of agricultural inputs. These results suggest that simultaneous efforts to improve diet quality and reduce food waste are necessary. Increasing consumers’ knowledge about how to prepare and store fruits and vegetables will be one of the practical solutions to reducing food waste. PMID:29668732

  6. Simplifying the negotiating process with physicians: critical elements in negotiating from private practice to employed physician.

    PubMed

    Gallucci, Armen; Deutsch, Thomas; Youngquist, Jaymie

    2013-01-01

    The authors attempt to simplify the key elements of the process of negotiating successfully with private physicians. From their experience, the business elements that have resulted in the most discussion center on compensation, including the incentive plan. Secondarily, how the issue of malpractice is handled will also consume a fair amount of time. What the authors have also learned is that the intangible issues can often be the reason for an unexpectedly large amount of discussion and therefore add time to the negotiation process. To assist with this process, they have derived a negotiation checklist, which seeks to help hospital leaders and administrators set the proper framework to ensure successful negotiation conversations. More importantly, being organized, recognizing these broad issues upfront, and remaining transparent throughout the process will help to ensure a successful negotiation.

  7. ECUT: Energy Conversion and Utilization Technologies program biocatalysis research activity. Potential membrane applications to biocatalyzed processes: Assessment of concentration polarization and membrane fouling

    NASA Technical Reports Server (NTRS)

    Ingham, J. D.

    1983-01-01

    Separation and purification of the products of biocatalyzed fermentation processes, such as ethanol or butanol, consumes most of the process energy required. Since membrane systems require substantially less energy for separation than most alternatives (e.g., distillation), they have been suggested for separation or concentration of fermentation products. This report is a review of the effects of concentration polarization and membrane fouling for the principal membrane processes: microfiltration, ultrafiltration, reverse osmosis, and electrodialysis, including a discussion of potential problems relevant to separation of fermentation products. It was concluded that advanced membrane systems may result in significantly decreased energy consumption. However, because of the need to separate large amounts of water from much smaller amounts of product that may be more volatile than water, it is not clear that membrane separations will necessarily be more efficient than alternative processes.

  8. Human capabilities in space. [man machine interaction

    NASA Technical Reports Server (NTRS)

    Nicogossian, A. E.

    1984-01-01

    Man's ability to live and perform useful work in space has been demonstrated throughout the history of manned space flight. Current planning envisions a multi-functional space station. Man's unique abilities to respond to the unforeseen and to operate at a level of complexity exceeding any reasonable amount of previous planning distinguish him from present-day machines. His limitations, however, include his inherent inability to survive without protection, his limited strength, and his propensity to make mistakes when performing repetitive and monotonous tasks. By contrast, an automated system does routine and delicate tasks, exerts force smoothly and precisely, stores and recalls large amounts of data, and performs deductive reasoning while maintaining a relative insensitivity to the environment. The establishment of a permanent presence of man in space demands that man and machines be appropriately combined in spaceborne systems. To achieve this optimal combination, research is needed in such diverse fields as artificial intelligence, robotics, behavioral psychology, economics, and human factors engineering.

  9. Simplified method for preparation of concentrated exoproteins produced by Staphylococcus aureus grown on surface of cellophane bag containing liquid medium.

    PubMed

    Ikigai, H; Seki, K; Nishihara, S; Masuda, S

    1988-01-01

    A simplified method for preparation of concentrated exoproteins, including protein A and alpha-toxin, produced by Staphylococcus aureus was successfully devised. The concentrated proteins were obtained by cultivating S. aureus organisms on the surface of a cellophane bag containing liquid medium, enclosed in a sterilized glass flask. With the same amount of medium, the total amount of proteins obtained by the method presented here was identical with that obtained by conventional liquid culture. The concentration of proteins obtained by the method, however, was high enough to observe their distinct bands stained on polyacrylamide gel electrophoresis. This method was considered quite useful not only for large-scale cultivation for the purification of staphylococcal proteins but also for small-scale studies using the proteins. A precise description of the method is presented and its possible usefulness discussed.

  10. ABS-FishCount: An Agent-Based Simulator of Underwater Sensors for Measuring the Amount of Fish.

    PubMed

    García-Magariño, Iván; Lacuesta, Raquel; Lloret, Jaime

    2017-11-13

    Underwater sensors provide one of the possibilities to explore oceans, seas, rivers, fish farms and dams, which all together cover most of our planet's area. Simulators can be helpful to test and discover some possible strategies before implementing these in real underwater sensors. This speeds up the development of research theories so that these can be implemented later. In this context, the current work presents an agent-based simulator for defining and testing strategies for measuring the amount of fish by means of underwater sensors. The current approach is illustrated with the definition and assessment of two strategies for measuring fish. One of these two corresponds to a simple control mechanism, while the other is an experimental strategy and includes an implicit coordination mechanism. The experimental strategy showed a statistically significant improvement over the control one in the reduction of errors with a large Cohen's d effect size of 2.55.
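The record above reports a large Cohen's d effect size of 2.55 for the reduction in measurement error. As a reminder of what that statistic measures, here is a minimal sketch of Cohen's d using the pooled standard deviation; the sample values below are invented for illustration and are not the study's data.

```python
from statistics import mean, stdev

def cohens_d(sample_a, sample_b):
    """Cohen's d: standardized difference between two sample means,
    scaled by the pooled (sample) standard deviation."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = stdev(sample_a) ** 2, stdev(sample_b) ** 2
    pooled_sd = (((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)) ** 0.5
    return (mean(sample_a) - mean(sample_b)) / pooled_sd

# Hypothetical per-run errors for a control and an experimental strategy.
control = [9.0, 10.0, 11.0, 10.5, 9.5]
experimental = [4.0, 5.0, 6.0, 5.5, 4.5]
print(round(cohens_d(control, experimental), 2))  # → 6.32
```

A d around 0.8 is conventionally called large, so the 2.55 reported above indicates a very clear separation between the two strategies.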

  11. Oil spill dispersants induce formation of marine snow by phytoplankton-associated bacteria.

    PubMed

    van Eenennaam, Justine S; Wei, Yuzhu; Grolle, Katja C F; Foekema, Edwin M; Murk, Albertinka J

    2016-03-15

    Unusually large amounts of marine snow, including Extracellular Polymeric Substances (EPS), were formed during the 2010 Deepwater Horizon oil spill. The marine snow settled with oil and clay minerals as an oily sludge layer on the deep sea floor. This study tested the hypothesis that the unprecedented amount of chemical dispersants applied during high phytoplankton densities in the Gulf of Mexico induced high EPS formation. Two marine phytoplankton species (Dunaliella tertiolecta and Phaeodactylum tricornutum) produced EPS within days when exposed to the dispersant Corexit 9500. Phytoplankton-associated bacteria were shown to be responsible for the formation. The EPS consisted of proteins and, to a lesser extent, polysaccharides. This study reveals an unexpected consequence of dispersant application in the presence of phytoplankton, and it emphasizes the need to test the action of dispersants under realistic field conditions, as they may seriously alter the fate of oil in the environment via increased marine snow formation. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Plasma reactor waste management systems

    NASA Technical Reports Server (NTRS)

    Ness, Robert O., Jr.; Rindt, John R.; Ness, Sumitra R.

    1992-01-01

    The University of North Dakota is developing a plasma reactor system for use in closed-loop processing that includes biological, materials, manufacturing, and waste processing. Direct-current, high-frequency, or microwave discharges will be used to produce plasmas for the treatment of materials. The plasma reactors offer several advantages over other systems, including low operating temperatures, low operating pressures, mechanical simplicity, and relatively safe operation. Human fecal material, sunflowers, oats, soybeans, and plastic were oxidized in a batch plasma reactor. Over 98 percent of the organic material was converted to gaseous products. The solids were then analyzed, and large amounts of water- and acid-soluble materials were detected. These materials could possibly be used as nutrients for biological systems.

  13. Effect of noble gases on an atmospheric greenhouse /Titan/.

    NASA Technical Reports Server (NTRS)

    Cess, R.; Owen, T.

    1973-01-01

    Several models for the atmosphere of Titan have been investigated, taking into account various combinations of neon and argon. The investigation shows that the addition of large amounts of Ne and/or Ar will substantially reduce the hydrogen abundance required for a given greenhouse effect. The fact that a large amount of neon should be present if the atmosphere is a relic of the solar nebula is an especially attractive feature of the models, because it is hard to justify appropriate abundances of other enhancing agents.

  14. Impact of the BALLOTS Shared Cataloging System on the Amount of Change in the Library Technical Processing Department.

    ERIC Educational Resources Information Center

    Kershner, Lois M.

    The amount of change resulting from the implementation of the Bibliographic Automation of Large Library Operations using a Time-sharing System (BALLOTS) is analyzed, in terms of (1) physical room arrangement, (2) work procedure, and (3) organizational structure. Also considered is the factor of amount of time the new system has been in use.…

  15. An Earth-System Approach to Understanding the Deepwater Horizon Oil Spill

    ERIC Educational Resources Information Center

    Robeck, Edward

    2011-01-01

    The Deepwater Horizon explosion on April 20, 2010, and the subsequent release of oil into the Gulf of Mexico created an ecological disaster of immense proportions. The estimates of the amounts of oil, whether for the amount released per day or the total amount of oil disgorged from the well, call on numbers so large they defy the capacity of most…

  16. Contribution of Organically Grown Crops to Human Health

    PubMed Central

    Johansson, Eva; Hussain, Abrar; Kuktaite, Ramune; Andersson, Staffan C.; Olsson, Marie E.

    2014-01-01

    An increasing interest in organic agriculture for food production is seen throughout the world and one key reason for this interest is the assumption that organic food consumption is beneficial to public health. The present paper focuses on the background of organic agriculture, important public health related compounds from crop food and variations in the amount of health related compounds in crops. In addition, influence of organic farming on health related compounds, on pesticide residues and heavy metals in crops, and relations between organic food and health biomarkers as well as in vitro studies are also the focus of the present paper. Nutritionally beneficial compounds of highest relevance for public health were micronutrients, especially Fe and Zn, and bioactive compounds such as carotenoids (including pro-vitamin A compounds), tocopherols (including vitamin E) and phenolic compounds. Extremely large variations in the contents of these compounds were seen, depending on genotype, climate, environment, farming conditions, harvest time, and part of the crop. Highest amounts seen were related to the choice of genotype and were also increased by genetic modification of the crop. Organic cultivation did not influence the content of most of the nutritional beneficial compounds, except the phenolic compounds that were increased with the amounts of pathogens. However, higher amounts of pesticide residues and in many cases also of heavy metals were seen in the conventionally produced crops compared to the organic ones. Animal studies as well as in vitro studies showed a clear indication of a beneficial effect of organic food/extracts as compared to conventional ones. Thus, consumption of organic food seems to be positive from a public health point of view, although the reasons are unclear, and synergistic effects between various constituents within the food are likely. PMID:24717360

  17. A convolutional neural network-based screening tool for X-ray serial crystallography

    PubMed Central

    Ke, Tsung-Wei; Brewster, Aaron S.; Yu, Stella X.; Ushizima, Daniela; Yang, Chao; Sauter, Nicholas K.

    2018-01-01

    A new tool is introduced for screening macromolecular X-ray crystallography diffraction images produced at an X-ray free-electron laser light source. Based on a data-driven deep learning approach, the proposed tool executes a convolutional neural network to detect Bragg spots. Automatic image processing algorithms described can enable the classification of large data sets, acquired under realistic conditions consisting of noisy data with experimental artifacts. Outcomes are compared for different data regimes, including samples from multiple instruments and differing amounts of training data for neural network optimization. PMID:29714177
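The network architecture itself is not described in this record, but the task it learns (flagging bright, localized Bragg spots against a noisy background) can be illustrated with a deliberately simple, non-CNN baseline. The threshold and the synthetic image below are assumptions for illustration only, not the paper's method or data.

```python
import numpy as np

def find_spots(image, threshold):
    """Toy spot finder: report pixels that exceed `threshold` and are
    the strict maximum of their 3x3 neighbourhood. This only sketches
    the labelling task a CNN would learn from real diffraction images."""
    h, w = image.shape
    spots = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = image[y - 1:y + 2, x - 1:x + 2]
            if (image[y, x] > threshold
                    and image[y, x] == patch.max()
                    and (patch == patch.max()).sum() == 1):
                spots.append((y, x))
    return spots

# Synthetic "diffraction image": flat background plus two bright peaks.
img = np.ones((16, 16))
img[4, 5] = 50.0
img[10, 12] = 80.0
print(find_spots(img, threshold=10.0))  # → [(4, 5), (10, 12)]
```

A learned classifier earns its keep precisely where this baseline fails: noisy backgrounds, detector artifacts, and overlapping or weak spots.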

  18. Wind-Tunnel Development of Ailerons for the Curtiss XP-60 Airplane, Special Report

    NASA Technical Reports Server (NTRS)

    Rogallo, F. M.; Lowry, John G.

    1942-01-01

    An investigation was made in the LMAL 7- by 10-foot tunnel of internally balanced, sealed ailerons for the Curtiss XP-60 airplane. Ailerons with tabs and with various amounts of balance were tested. Stick forces were estimated for several aileron arrangements, including an arrangement recommended for the airplane. Flight tests of the recommended arrangement are discussed briefly in an appendix. The results of the wind-tunnel and flight tests indicate that the ailerons of large or fast airplanes may be satisfactorily balanced by the method developed.

  19. Supernova explosions.

    NASA Technical Reports Server (NTRS)

    Cameron, A. G. W.

    1971-01-01

    The recent history of theoretical investigations of the supernova mechanism is considered, giving attention also to a number of nuclear physical problems which have yet to be solved in connection with the thermonuclear detonation. A variety of different processes of nucleo-synthesis are expected to occur in association with the supernova explosions. Aspects of the chemical evolution of the galaxy are discussed including the cosmic ray production of lithium, beryllium, and boron in the interstellar medium. Various hypotheses to account for the very large amount of light that comes from a supernova explosion are also examined.

  20. A convolutional neural network-based screening tool for X-ray serial crystallography.

    PubMed

    Ke, Tsung Wei; Brewster, Aaron S; Yu, Stella X; Ushizima, Daniela; Yang, Chao; Sauter, Nicholas K

    2018-05-01

    A new tool is introduced for screening macromolecular X-ray crystallography diffraction images produced at an X-ray free-electron laser light source. Based on a data-driven deep learning approach, the proposed tool executes a convolutional neural network to detect Bragg spots. Automatic image processing algorithms described can enable the classification of large data sets, acquired under realistic conditions consisting of noisy data with experimental artifacts. Outcomes are compared for different data regimes, including samples from multiple instruments and differing amounts of training data for neural network optimization.

  1. A convolutional neural network-based screening tool for X-ray serial crystallography

    DOE PAGES

    Ke, Tsung-Wei; Brewster, Aaron S.; Yu, Stella X.; ...

    2018-04-24

    A new tool is introduced for screening macromolecular X-ray crystallography diffraction images produced at an X-ray free-electron laser light source. Based on a data-driven deep learning approach, the proposed tool executes a convolutional neural network to detect Bragg spots. Automatic image processing algorithms described can enable the classification of large data sets, acquired under realistic conditions consisting of noisy data with experimental artifacts. Outcomes are compared for different data regimes, including samples from multiple instruments and differing amounts of training data for neural network optimization.

  2. A convolutional neural network-based screening tool for X-ray serial crystallography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ke, Tsung-Wei; Brewster, Aaron S.; Yu, Stella X.

    A new tool is introduced for screening macromolecular X-ray crystallography diffraction images produced at an X-ray free-electron laser light source. Based on a data-driven deep learning approach, the proposed tool executes a convolutional neural network to detect Bragg spots. Automatic image processing algorithms described can enable the classification of large data sets, acquired under realistic conditions consisting of noisy data with experimental artifacts. Outcomes are compared for different data regimes, including samples from multiple instruments and differing amounts of training data for neural network optimization.

  3. Data::Downloader

    NASA Technical Reports Server (NTRS)

    Duggan, Brian

    2012-01-01

    Downloading and organizing large amounts of files is challenging, and often done using ad hoc methods. This software is capable of downloading and organizing files as an OpenSearch client. It can subscribe to RSS (Really Simple Syndication) feeds and Atom feeds containing arbitrary metadata, and maintains a local content addressable data store. It uses existing standards for obtaining the files, and uses efficient techniques for storing the files. Novel features include symbolic links to maintain a sane directory structure, checksums for validating file integrity during transfer and storage, and flexible use of server-provided metadata.
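Data::Downloader is Perl software and its internals are not shown in this record; the following Python sketch only illustrates the general idea behind two of the features named above, checksums for file integrity and a content-addressable store keyed by digest. File names and contents are invented for the demonstration.

```python
import hashlib
import os
import tempfile

def sha256_of(path, chunk_size=8192):
    """Stream a file through SHA-256 so large downloads never need to
    fit in memory; the hex digest can double as a content address."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path, expected_digest):
    """Return True if the stored file still matches its recorded checksum."""
    return sha256_of(path) == expected_digest

# Demo with a temporary file standing in for a downloaded data granule.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"example granule bytes")
digest = sha256_of(path)        # recorded at download time
ok = verify(path, digest)       # re-checked at access time
print(ok)  # → True
os.remove(path)
```

Keying the store by digest also gives deduplication for free: identical files downloaded from different feeds hash to the same address, and symbolic links can then present them under human-readable paths.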

  4. Preparing for in situ processing on upcoming leading-edge supercomputers

    DOE PAGES

    Kress, James; Churchill, Randy Michael; Klasky, Scott; ...

    2016-10-01

    High performance computing applications are producing increasingly large amounts of data and placing enormous stress on current capabilities for traditional post-hoc visualization techniques. Because of the growing compute and I/O imbalance, data reductions, including in situ visualization, are required. These reduced data are used for analysis and visualization in a variety of different ways. Many of the visualization and analysis requirements are known a priori, but when they are not, scientists are dependent on the reduced data to accurately represent the simulation in post hoc analysis. The contribution of this paper is a description of the directions we are pursuing to help a large-scale fusion simulation code succeed on the next generation of supercomputers. These directions include the role of in situ processing for performing data reductions, as well as the tradeoffs between data size and data integrity within the context of complex operations in a typical scientific workflow.
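One common form of the data reduction discussed above is computing constant-size summaries in situ, while the simulation runs, instead of writing full state to disk for post-hoc analysis. The sketch below is an illustrative toy under that assumption, not the paper's workflow; the values stand in for per-timestep simulation output.

```python
class RunningStats:
    """Minimal in situ-style reduction: keep a constant-size summary
    (count, mean, min, max) of a stream instead of storing every value.
    Trades data integrity (raw values are gone) for tiny output size."""

    def __init__(self):
        self.count = 0
        self.mean = 0.0
        self.min = float("inf")
        self.max = float("-inf")

    def update(self, value):
        self.count += 1
        # Incremental mean avoids retaining the raw data.
        self.mean += (value - self.mean) / self.count
        self.min = min(self.min, value)
        self.max = max(self.max, value)

stats = RunningStats()
for step_value in [2.0, 4.0, 6.0, 8.0]:   # stand-in for simulation output
    stats.update(step_value)
print(stats.count, stats.mean, stats.min, stats.max)  # → 4 5.0 2.0 8.0
```

The tradeoff named in the abstract is visible here: the summary is a few numbers regardless of run length, but any analysis question not anticipated a priori can no longer be answered from it.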

  5. An Update on ToxCast™ | Science Inventory | US EPA

    EPA Pesticide Factsheets

    In its first phase, ToxCast™ is profiling over 300 well-characterized chemicals (primarily pesticides) in over 400 HTS endpoints. These endpoints include biochemical assays of protein function, cell-based transcriptional reporter assays, multi-cell interaction assays, transcriptomics on primary cell cultures, and developmental assays in zebrafish embryos. Almost all of the compounds being examined in Phase 1 of ToxCast™ have been tested in traditional toxicology tests, including developmental toxicity, multi-generation studies, and sub-chronic and chronic rodent bioassays. Lessons learned to date for ToxCast: large amounts of quality HTS data can be economically obtained; large-scale data sets will be required to understand potential for biological activity; there is value in having multiple assays with overlapping coverage of biological pathways and a variety of methodologies; concentration-response will be important for ultimate interpretation; data transparency will be important for acceptance; metabolic capabilities and coverage of developmental toxicity pathways will need additional attention; the gold standard needs to be defined; and partnerships are needed to bring critical mass and expertise.

  6. Marine envenomations.

    PubMed

    Berling, Ingrid; Isbister, Geoffrey

    2015-01-01

    Marine stings are common but most are minor and do not require medical intervention. Severe and systemic marine envenoming is uncommon, but includes box jellyfish stings, Irukandji syndrome, major stingray trauma and blue-ringed octopus envenoming. Almost all marine injuries are caused by jellyfish stings, and penetrating injuries from spiny fish, stingrays or sea urchins. This article describes the presentation and management of marine envenomations and injuries that may occur in Australia. First aid for jellyfish includes tentacle removal, application of vinegar for box jellyfish, and hot water immersion (45°C for 20 min) for bluebottle jellyfish stings. Basic life support is essential for severe marine envenomings that result in cardiac collapse or paralysis. Irukandji syndrome causes severe generalised pain, autonomic excess and minimal local pain, which may require large amounts of analgesia, and, uncommonly, myocardial depression and pulmonary oedema occur. Penetrating marine injuries can cause significant trauma depending on location of the injury. Large and unclean wounds may have delayed healing and secondary infection if not adequately irrigated, debrided and observed.

  7. Metagenomics of rumen bacteriophage from thirteen lactating dairy cattle

    PubMed Central

    2013-01-01

    Background The bovine rumen hosts a diverse and complex community of Eukarya, Bacteria, Archaea and viruses (including bacteriophage). The rumen viral population (the rumen virome) has received little attention compared to the rumen microbial population (the rumen microbiome). We used massively parallel sequencing of virus-like particles to investigate the diversity of the rumen virome in thirteen lactating Australian Holstein dairy cattle all housed in the same location, 12 of which were sampled on the same day. Results Fourteen putative viral sequence fragments over 30 kbp in length were assembled and annotated. Many of the putative genes in the assembled contigs showed no homology to previously annotated genes, highlighting the large amount of work still required to fully annotate the functions encoded in viral genomes. The abundance of the contig sequences varied widely between animals, even though the cattle were of the same age, stage of lactation and fed the same diets. Additionally, the twelve animals which were co-habited shared a number of their dominant viral contigs. We compared the functional characteristics of our bovine viromes with those of other viromes, as well as rumen microbiomes. At the functional level, we found strong similarities between all of the viral samples, which were highly distinct from the rumen microbiome samples. Conclusions Our findings suggest a large amount of between-animal variation in the bovine rumen virome and that co-habiting animals may have more similar viromes than non co-habited animals. We report the deepest sequencing to date of the rumen virome. This work highlights the enormous amount of novelty and variation present in the rumen virome. PMID:24180266

  8. The Coriolis Program.

    ERIC Educational Resources Information Center

    Lissaman, P. B. S.

    1979-01-01

    Detailed are the history, development, and future objectives of the Coriolis program, a project designed to place large turbine units in the Florida Current that would generate large amounts of electric power. (BT)

  9. Analytics to Better Interpret and Use Large Amounts of Heterogeneous Data

    NASA Astrophysics Data System (ADS)

    Mathews, T. J.; Baskin, W. E.; Rinsland, P. L.

    2014-12-01

    Data scientists at NASA's Atmospheric Science Data Center (ASDC) are seasoned software application developers who have worked with the creation, archival, and distribution of large datasets (multiple terabytes and larger). In order for ASDC data scientists to effectively implement the most efficient processes for cataloging and organizing data access applications, they must be intimately familiar with the data contained in the datasets with which they are working. Key technologies that are critical components of the background of ASDC data scientists include: large RDBMSs (relational database management systems) and NoSQL databases; web services; service-oriented architectures; structured and unstructured data access; and processing algorithms. However, as prices of data storage and processing decrease, sources of data increase, and technologies advance - granting more people access to data in real or near-real time - data scientists are being pressured to accelerate their ability to identify and analyze vast amounts of data. With existing tools this is becoming exceedingly challenging to accomplish. For example, NASA's Earth Science Data and Information System (ESDIS) alone grew from just over 4 PB of data in 2009 to nearly 6 PB in 2011, and then to roughly 10 PB in 2013. With data from at least ten new missions to be added to the ESDIS holdings by 2017, the current volume will continue to grow exponentially and drive the need to analyze more data even faster. Though there are many highly efficient, off-the-shelf analytics tools available, these tools mainly cater to business data, which is predominantly unstructured. As a result, there are very few known analytics tools that interface well with archived Earth science data, which is predominantly heterogeneous and structured. This presentation will identify use cases for data analytics from an Earth science perspective in order to begin to identify specific tools that may be able to address those challenges.

  10. Deconstructing calsequestrin. Complex buffering in the calcium store of skeletal muscle

    PubMed Central

    Royer, Leandro; Ríos, Eduardo

    2009-01-01

    Since its discovery in 1971, calsequestrin has been recognized as the main Ca2+ binding protein inside the sarcoplasmic reticulum (SR), the organelle that stores and upon demand mobilizes Ca2+ for contractile activation of muscle. This article reviews the potential roles of calsequestrin in excitation–contraction coupling of skeletal muscle. It first considers the quantitative demands for a structure that binds Ca2+ inside the SR in view of the amounts of the ion that must be mobilized to elicit muscle contraction. It briefly discusses existing evidence, largely gathered in cardiac muscle, of two roles for calsequestrin: as Ca2+ reservoir and as modulator of the activity of Ca2+ release channels, and then considers the results of an incipient body of work that manipulates the cellular endowment of calsequestrin. The observations include evidence that both the Ca2+ buffering capacity of calsequestrin in solution and that of the SR in intact cells decay as the free Ca2+ concentration is lowered. Together with puzzling observations of increase of Ca2+ inside the SR, in cells or vesicular fractions, upon activation of Ca2+ release, this is interpreted as evidence that the Ca2+ buffering in the SR is non-linear, and is optimized for support of Ca2+ release at the physiological levels of SR Ca2+ concentration. Such non-linearity of buffering is qualitatively explained by a speculation that puts together ideas first proposed by others. The speculation pictures calsequestrin polymers as ‘wires’ that both bind Ca2+ and efficiently deliver it near the release channels. In spite of the kinetic changes, the functional studies reveal that cells devoid of calsequestrin are still capable of releasing large amounts of Ca2+ into the myoplasm, consistent with the long-term viability and apparent good health of mice engineered for calsequestrin ablation. The experiments therefore suggest that other molecules are capable of providing sites for reversible binding of large amounts of Ca2+ inside the sarcoplasmic reticulum. PMID:19403601

  11. Cutting Edge: Protection by Antiviral Memory CD8 T Cells Requires Rapidly Produced Antigen in Large Amounts.

    PubMed

    Remakus, Sanda; Ma, Xueying; Tang, Lingjuan; Xu, Ren-Huan; Knudson, Cory; Melo-Silva, Carolina R; Rubio, Daniel; Kuo, Yin-Ming; Andrews, Andrew; Sigal, Luis J

    2018-05-15

    Numerous attempts to produce antiviral vaccines by harnessing memory CD8 T cells have failed. A barrier to progress is that we do not know what makes an Ag a viable target of protective CD8 T cell memory. We found that in mice susceptible to lethal mousepox (the mouse homolog of human smallpox), a dendritic cell vaccine that induced memory CD8 T cells fully protected mice when the infecting virus produced Ag in large quantities and with rapid kinetics. Protection did not occur when the Ag was produced in low amounts, even with rapid kinetics, and protection was only partial when the Ag was produced in large quantities but with slow kinetics. Hence, the amount and timing of Ag expression appear to be key determinants of memory CD8 T cell antiviral protective immunity. These findings may have important implications for vaccine design. Copyright © 2018 by The American Association of Immunologists, Inc.

  12. Profiling Oman education data using data mining approach

    NASA Astrophysics Data System (ADS)

    Alawi, Sultan Juma Sultan; Shaharanee, Izwan Nizal Mohd; Jamil, Jastini Mohd

    2017-10-01

    The large amounts of data now generated by application services across many learning settings have created new challenges for the education field. An education portal is an important system that can drive better development of the field. This paper presents data mining techniques for understanding and summarizing the Oman education data generated by the Ministry of Education of Oman "Educational Portal". The research performs student profiling on the Oman student database, using the k-means clustering technique to determine student profiles. A total of 42,484 student records from the Sultanate of Oman were extracted for this study. The findings show the practicality of clustering for investigating student profiles, allowing a better understanding of students' behavior and academic performance. The Oman Education Portal contains large amounts of user activity and interaction data; analysis of these data can help educators improve student performance and recognize students who need additional attention.
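    The clustering step described above can be sketched with a minimal Lloyd's k-means. The features (grade average, attendance) and the cluster count are illustrative assumptions, not taken from the paper, which does not specify its feature set here.

```python
# Minimal Lloyd's k-means sketch for student profiling.
# Features and k are hypothetical, chosen only to illustrate the technique.
import random

def kmeans(points, k, iters=50, seed=0):
    """Cluster a list of numeric tuples into k groups; returns (centers, clusters)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assignment step: each point joins its nearest center's cluster.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # Update step: move each center to its cluster's mean.
        centers = [tuple(sum(d) / len(cl) for d in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

# e.g. (grade average, attendance %) pairs for six hypothetical students
students = [(55, 60), (58, 62), (52, 58), (90, 95), (92, 97), (88, 93)]
centers, profiles = kmeans(students, k=2)
```

    On this toy input the two profiles separate cleanly into a low-performing and a high-performing group; on real portal data one would standardize features and choose k by a criterion such as the elbow method.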

  13. Challenges in Small Screening Laboratories: SaaS to the rescue

    PubMed Central

    Lemmon, Vance P.; Jia, Yuanyuan; Shi, Yan; Holbrook, S. Douglas; Bixby, John L; Buchser, William

    2012-01-01

    The Miami Project to Cure Paralysis, part of the University of Miami Miller School of Medicine, includes a laboratory devoted to High Content Analysis (HCA) of neurons. The goal of the laboratory is to uncover signalling pathways, genes, compounds, or drugs that can be used to promote nerve growth. HCA permits the quantification of neuronal morphology, including the lengths and numbers of axons. HCA screening of various libraries on primary neurons requires a team-based approach, a variety of process steps and complex manipulations of cells and libraries to obtain meaningful results. HCA itself produces vast amounts of information including images, well-based data and cell-based phenotypic measures. Managing experimental workflow and library data, along with the extensive amount of experimental results, is challenging. For academic laboratories generating large data sets from experiments using thousands of perturbagens, a laboratory information management system (LIMS) is the data tracking solution of choice. With both productivity and efficiency as driving rationales, the Miami Project has equipped its HCA laboratory with a Software as a Service (SaaS) LIMS to ensure the quality of its experiments and workflows. The article discusses this application in detail, and how the system was selected and integrated into the laboratory. The advantages of SaaS are described. PMID:21631415

  14. Light baryon spectroscopy

    NASA Astrophysics Data System (ADS)

    Crede, Volker

    2013-03-01

    The spectrum of excited baryons serves as an excellent probe of quantum chromodynamics (QCD). In particular, highly-excited baryon resonances are sensitive to the details of quark confinement which is only poorly understood within QCD. Facilities worldwide such as Jefferson Lab, ELSA, and MAMI, which study the systematics of hadron spectra in photo- and electroproduction experiments, have accumulated a large amount of data in recent years including unpolarized cross section and polarization data for a large variety of meson-production reactions. These are important steps toward complete experiments that will allow us to unambiguously determine the scattering amplitude in the underlying reactions and to identify the broad and overlapping baryon resonance contributions. Several new nucleon resonances have been proposed and changes to the baryon listing in the 2012 Review of Particle Physics reflect the progress in the field.

  15. Flux Calculation Using CARIBIC DOAS Aircraft Measurements: SO2 Emission of Norilsk

    NASA Technical Reports Server (NTRS)

    Walter, D.; Heue, K.-P.; Rauthe-Schoech, A.; Brenninkmeijer, C. A. M.; Lamsal, L. N.; Krotkov, N. A.; Platt, U.

    2012-01-01

    Based on a case-study of the nickel smelter in Norilsk (Siberia), the retrieval of trace gas fluxes using airborne remote sensing is discussed. A DOAS system onboard an Airbus 340 detected large amounts of SO2 and NO2 near Norilsk during a regular passenger flight within the CARIBIC project. The remote sensing data were combined with ECMWF wind data to estimate the SO2 output of the Norilsk industrial complex to be around 1 Mt per year, which is in agreement with independent estimates. This value is compared to results using data from satellite remote sensing (GOME, OMI). The validity of the assumptions underlying our estimate is discussed, including the adaptation of this method to other gases and sources like the NO2 emissions of large industries or cities.
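    The flux estimate outlined above amounts to integrating the product of vertical column density and the wind component perpendicular to the flight track along the track. The sketch below shows this arithmetic; all numbers and the function name are illustrative assumptions, not CARIBIC values or code.

```python
# Hedged sketch of a plume mass-flux estimate from airborne column measurements:
# flux = sum over track segments of (column density x perpendicular wind x length).
M_SO2 = 0.064  # molar mass of SO2, kg/mol

def emission_flux(columns_mol_m2, wind_perp_m_s, segment_lengths_m, molar_mass):
    """Mass flux (kg/s) of a trace gas through the flight-track cross-section."""
    return sum(col * wind * length * molar_mass
               for col, wind, length in zip(columns_mol_m2, wind_perp_m_s,
                                            segment_lengths_m))

# Three hypothetical 1-km track segments crossing the plume
flux_kg_s = emission_flux([2e-3, 2e-3, 2e-3], [5.0, 5.0, 5.0],
                          [1000.0, 1000.0, 1000.0], M_SO2)
annual_Mt = flux_kg_s * 3.156e7 / 1e9  # kg/s -> Mt/yr
```

    Scaling the instantaneous flux by seconds per year, as in the last line, is how a single overpass is converted into an annual emission estimate, under the strong assumption of steady emissions.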

  16. Mining Critical Metals and Elements from Seawater: Opportunities and Challenges.

    PubMed

    Diallo, Mamadou S; Kotte, Madhusudhana Rao; Cho, Manki

    2015-08-18

    The availability and sustainable supply of technology metals and valuable elements is critical to the global economy. There is a growing realization that the development and deployment of the clean energy technologies and sustainable products and manufacturing industries of the 21st century will require large amounts of critical metals and valuable elements including rare-earth elements (REEs), platinum group metals (PGMs), lithium, copper, cobalt, silver, and gold. Advances in industrial ecology, water purification, and resource recovery have established that seawater is an important and largely untapped source of technology metals and valuable elements. This feature article discusses the opportunities and challenges of mining critical metals and elements from seawater. We highlight recent advances and provide an outlook of the future of metal mining and resource recovery from seawater.

  17. Learning from Massive Distributed Data Sets (Invited)

    NASA Astrophysics Data System (ADS)

    Kang, E. L.; Braverman, A. J.

    2013-12-01

    Technologies for remote sensing and ever-expanding computer experiments in climate science are generating massive data sets. Meanwhile, it has become common in all areas of large-scale science to have these 'big data' distributed over multiple physical locations, and moving large amounts of data can be impractical. In this talk, we will discuss efficient ways to summarize and learn from distributed data. We formulate a graphical model to mimic the main characteristics of a distributed-data network, including the size of the data sets and the speed of moving data. With this nominal model, we investigate the trade-off between prediction accuracy and the cost of data movement, theoretically and through simulation experiments. We will also discuss new implementations of spatial and spatio-temporal statistical methods optimized for distributed data.

  18. The Influence of Porosity on Fatigue Crack Initiation in Additively Manufactured Titanium Components.

    PubMed

    Tammas-Williams, S; Withers, P J; Todd, I; Prangnell, P B

    2017-08-04

    Without post-manufacture HIPing, the fatigue life of electron beam melting (EBM) additively manufactured parts is currently dominated by the presence of porosity, exhibiting large amounts of scatter. Here we have shown that the size and location of these defects are crucial in determining the fatigue life of EBM Ti-6Al-4V samples. X-ray computed tomography has been used to characterise all the pores in fatigue samples prior to testing and to follow the initiation and growth of fatigue cracks. This shows that the initiation stage comprises a large fraction of life (>70%). In these samples the initiating defect was often some way from being the largest (merely within the top 35% of large defects). Using various ranking strategies including a range of parameters, we found that when the proximity to the surface and the pore aspect ratio were included the actual initiating defect was within the top 3% of defects ranked most harmful. This lays the basis for considering how the deposition parameters can be optimised to ensure that the distribution of pores is tailored to the distribution of applied stresses in additively manufactured parts to maximise the fatigue life for a given loading cycle.
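    The ranking idea above, scoring each pore on size, surface proximity, and aspect ratio rather than size alone, can be sketched as below. The score function and its weights are hypothetical illustrations; the paper does not publish this parameterization.

```python
# Illustrative defect-ranking sketch (assumed scoring, not the paper's model):
# larger, nearer-surface, more elongated pores rank as more harmful.
from dataclasses import dataclass

@dataclass
class Pore:
    diameter_um: float     # equivalent diameter
    depth_um: float        # distance to the nearest free surface
    aspect_ratio: float    # longest/shortest axis length

def harm_score(p, w_size=1.0, w_surface=1.0, w_shape=0.5):
    # Surface term grows as the pore approaches the surface (depth -> 0).
    return (w_size * p.diameter_um
            + w_surface * 100.0 / (1.0 + p.depth_um)
            + w_shape * 10.0 * p.aspect_ratio)

def rank_pores(pores):
    return sorted(pores, key=harm_score, reverse=True)

large_deep = Pore(diameter_um=120, depth_um=500, aspect_ratio=1.0)
small_deep = Pore(diameter_um=50, depth_um=500, aspect_ratio=2.0)
mid_surface = Pore(diameter_um=80, depth_um=1, aspect_ratio=3.0)
ranking = rank_pores([small_deep, large_deep, mid_surface])
```

    With these weights a mid-sized pore at the surface outranks a larger pore deep in the bulk, mirroring the paper's finding that the initiating defect is often not the largest one.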

  19. Aerodynamic Design of a Dual-Flow Mach 7 Hypersonic Inlet System for a Turbine-Based Combined-Cycle Hypersonic Propulsion System

    NASA Technical Reports Server (NTRS)

    Sanders, Bobby W.; Weir, Lois J.

    2008-01-01

    A new hypersonic inlet for a turbine-based combined-cycle (TBCC) engine has been designed. This split-flow inlet is designed to provide flow to an over-under propulsion system with turbofan and dual-mode scramjet engines for flight from takeoff to Mach 7. It utilizes a variable-geometry ramp, high-speed cowl lip rotation, and a rotating low-speed cowl that serves as a splitter to divide the flow between the low-speed turbofan and the high-speed scramjet and to isolate the turbofan at high Mach numbers. The low-speed inlet was designed for Mach 4, the maximum mode transition Mach number. Integration of the Mach 4 inlet into the Mach 7 inlet imposed significant constraints on the low-speed inlet design, including a large amount of internal compression. The inlet design was used to develop mechanical designs for two inlet mode transition test models: small-scale (IMX) and large-scale (LIMX) research models. The large-scale model is designed to facilitate multi-phase testing including inlet mode transition and inlet performance assessment, controls development, and integrated systems testing with turbofan and scramjet engines.

  20. Data Compression Algorithm Architecture for Large Depth-of-Field Particle Image Velocimeters

    NASA Technical Reports Server (NTRS)

    Bos, Brent; Memarsadeghi, Nargess; Kizhner, Semion; Antonille, Scott

    2013-01-01

    A large depth-of-field particle image velocimeter (PIV) is designed to characterize dynamic dust environments on planetary surfaces. This instrument detects lofted dust particles, and senses the number of particles per unit volume, measuring their sizes, velocities (both speed and direction), and shape factors when the particles are large. To measure these particle characteristics in-flight, the instrument gathers two-dimensional image data at a high frame rate, typically >4,000 Hz, generating large amounts of data for every second of operation, approximately 6 GB/s. To characterize a planetary dust environment that is dynamic, the instrument would have to operate for at least several minutes during an observation period, easily producing more than a terabyte of data per observation. Given current technology, this amount of data would be very difficult to store onboard a spacecraft, and downlink to Earth. Since 2007, innovators have been developing an autonomous image analysis algorithm architecture for the PIV instrument to greatly reduce the amount of data that it has to store and downlink. The algorithm analyzes PIV images and automatically reduces the image information down to only the particle measurement data that is of interest, reducing the amount of data that is handled by a factor of more than 10^3. The state of development for this innovation is now fairly mature, with a functional algorithm architecture, along with several key pieces of algorithm logic, that has been proven through field test data acquired with a proof-of-concept PIV instrument.
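    The data-reduction idea, collapsing each raw frame to a short list of per-particle measurements, can be illustrated with simple thresholding plus connected-component labeling. This is a stand-in for the flight algorithm, whose actual detection logic is not described here.

```python
# Conceptual sketch of onboard data reduction: replace raw pixels with a few
# (centroid, area) records per frame. Threshold + 4-connected BFS labeling
# stand in for the real (unpublished here) particle-detection algorithm.
from collections import deque

def particle_records(frame, threshold):
    """frame: 2D list of intensities. Returns [(row_centroid, col_centroid, area)]."""
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    records = []
    for r in range(h):
        for c in range(w):
            if frame[r][c] > threshold and not seen[r][c]:
                # BFS over one bright connected blob
                q, pixels = deque([(r, c)]), []
                seen[r][c] = True
                while q:
                    y, x = q.popleft()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and not seen[ny][nx] \
                                and frame[ny][nx] > threshold:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                area = len(pixels)
                records.append((sum(p[0] for p in pixels) / area,
                                sum(p[1] for p in pixels) / area, area))
    return records
```

    A megapixel frame reduces to a few tuples per detected particle, which is the source of the >10^3 reduction factor cited above when particle counts are modest.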

  1. Cloud radiative effects and changes simulated by the Coupled Model Intercomparison Project Phase 5 models

    NASA Astrophysics Data System (ADS)

    Shin, Sun-Hee; Kim, Ok-Yeon; Kim, Dongmin; Lee, Myong-In

    2017-07-01

    Using 32 CMIP5 (Coupled Model Intercomparison Project Phase 5) models, this study examines the fidelity of the simulated cloud amounts and their radiative effects (CREs) in the historical runs driven by observed external radiative forcing for 1850-2005, and their future changes in the RCP (Representative Concentration Pathway) 4.5 scenario runs for 2006-2100. Validation metrics for the historical run are designed to examine the accuracy of the representation of spatial patterns of the climatological mean and of the annual and interannual variations of clouds and CREs. The models show large spread in the simulation of cloud amounts, particularly low cloud amount. The observed relationships between cloud amount and the controlling large-scale environment are also reproduced with widely varying skill across models. Based on the validation metrics, four models (ACCESS1.0, ACCESS1.3, HadGEM2-CC, and HadGEM2-ES) are selected as the best models, and the average of the four models performs more skillfully than the multimodel ensemble average. All models project global-mean SST warming as greenhouse gases increase, but the magnitude varies between 1 and 2 K across the simulations, which is largely attributable to differences in the simulated change of cloud amount and distribution. The models that simulate more SST warming show a greater increase in the net CRE due to reduced low cloud and increased incoming shortwave radiation, particularly over marine boundary layer regions in the subtropics. The selected best-performing models project a significant reduction in global-mean cloud amount of about -0.99% K-1 and net radiative warming of 0.46 W m-2 K-1, suggesting a positive cloud feedback on global warming.

  2. Considering chance in quality and safety performance measures: an analysis of performance reports by boards in English NHS trusts.

    PubMed

    Schmidtke, Kelly Ann; Poots, Alan J; Carpio, Juan; Vlaev, Ivo; Kandala, Ngianga-Bakwin; Lilford, Richard J

    2017-01-01

    Hospital board members are asked to consider large amounts of quality and safety data with a duty to act on signals of poor performance. However, in order to do so it is necessary to distinguish signals from noise (chance). This article investigates whether data in English National Health Service (NHS) acute care hospital board papers are presented in a way that helps board members consider the role of chance in their decisions. Thirty English NHS trusts were selected at random and their board papers retrieved. Charts depicting quality and safety were identified. Categorical discriminations were then performed to document the methods used to present quality and safety data in board papers, with particular attention given to whether and how the charts depicted the role of chance, that is, by including control lines or error bars. Thirty board papers, containing a total of 1488 charts, were sampled. Only 88 (6%) of these charts depicted the role of chance, and only 17 of the 30 board papers included any charts depicting the role of chance. Of the 88 charts that attempted to represent the role of chance, 16 included error bars and 72 included control lines. Only 6 (8%) of the 72 control charts indicated where the control lines had been set (eg, 2 vs 3 SDs). Hospital board members are expected to consider large amounts of information. Control charts can help board members distinguish signals from noise, but often boards are not using them. We discuss demand-side and supply-side barriers that could be overcome to increase use of control charts in healthcare. Published by the BMJ Publishing Group Limited.
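    The control-chart idea the abstract advocates can be sketched in a few lines: place control lines at plus or minus three standard deviations around the historical mean and flag only points beyond them as signals. This is a minimal individuals-chart sketch; real board metrics would use chart types matched to the data (e.g. p-charts for proportions).

```python
# Minimal control-chart sketch: 3-SD control lines separate signal from noise.
# A simplification for illustration, not a full SPC implementation.
from statistics import mean, stdev

def control_limits(history, n_sd=3):
    m, s = mean(history), stdev(history)
    return m - n_sd * s, m + n_sd * s

def signals(history, new_points, n_sd=3):
    """Return only the points that fall outside the control lines."""
    lower, upper = control_limits(history, n_sd)
    return [x for x in new_points if x < lower or x > upper]

# e.g. ten months of incident counts, then four new monthly values
monthly_counts = [10, 12, 11, 9, 10, 11, 10, 12, 9, 11]
flagged = signals(monthly_counts, [10, 14, 7, 13])
```

    Here only 14 and 7 are flagged; 13 looks alarming but sits inside the control lines, which is exactly the signal-versus-chance distinction the article says board papers rarely make explicit.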

  3. POM Pulses: Characterizing the Physical and Chemical Properties of Particulate Organic Matter (POM) Mobilized by Large Storm Events and its Influence on Receiving Fluvial Systems

    NASA Astrophysics Data System (ADS)

    Johnson, E. R.; Rowland, R. D.; Protokowicz, J.; Inamdar, S. P.; Kan, J.; Vargas, R.

    2016-12-01

    Extreme storm events have tremendous erosive energy which is capable of mobilizing vast amounts of material from watershed sources into fluvial systems. This complex mixture of sediment and particulate organic matter (POM) is a nutrient source, and has the potential to impact downstream water quality. The impact of POM on receiving aquatic systems can vary not only by the total amount exported but also by the various sources involved and the particle sizes of POM. This study examines the composition of POM in potential sources and within-event POM by: (1) determining the amount and quality of dissolved organic matter (DOM) that can be leached from coarse, medium and fine particle classes; (2) assessing the C and N content and isotopic character of within-event POM; and (3) coupling physical and chemical properties to evaluate storm event POM influence on stream water. Storm event POM samples and source sediments were collected from a forested headwater catchment (second order stream) in the Piedmont region of Maryland. Samples were sieved into three particle classes - coarse (2mm-1mm), medium (1mm-250µm) and fine (<250µm). Extractions were performed for three particle class sizes and the resulting fluorescent organic matter was analyzed. Carbon (C) and Nitrogen (N) amount, C:N ratio, and isotopic analysis of 13C and 15N were performed on solid state event and source material. Future work will include examination of microbial communities associated with POM particle size classes. Physical size class separation of within-event POM exhibited differences in C:N ratios, δ15N composition, and extracted DOM lability. Smaller size classes exhibited lower C:N ratios, more enriched δ15N and more recalcitrant properties in leached DOM. Source material had varying C:N ratios and contributions to leached DOM. These results indicate that both source and size class strongly influence the POM contribution to fluvial systems during large storm events.

  4. Coupling a basin erosion and river sediment transport model into a large scale hydrological model: an application in the Amazon basin

    NASA Astrophysics Data System (ADS)

    Buarque, D. C.; Collischonn, W.; Paiva, R. C. D.

    2012-04-01

    This study presents the first application and preliminary results of the large-scale hydrodynamic/hydrological model MGB-IPH with a new module to predict the spatial distribution of basin erosion and river sediment transport at a daily time step. MGB-IPH is a large-scale, distributed, process-based hydrological model that uses a catchment-based discretization and the Hydrological Response Units (HRU) approach. It uses physically based equations to simulate the hydrological processes, such as the Penman-Monteith model for evapotranspiration, and uses the Muskingum-Cunge approach and a full 1D hydrodynamic model for river routing, including backwater effects and seasonal flooding. The sediment module of the MGB-IPH model is divided into two components: 1) prediction of erosion over the basin and sediment yield to the river network; 2) sediment transport along the river channels. Both MGB-IPH and the sediment module use GIS tools to display relevant maps and to extract parameters from the SRTM DEM (a 15" resolution was adopted). Using the catchment discretization, the sediment module applies the Modified Universal Soil Loss Equation to predict soil loss from each HRU, considering three sediment classes defined according to soil texture: sand, silt, and clay. The effects of topography on soil erosion are estimated by a two-dimensional slope-length (LS) factor, which uses the contributing-area approach, and a local slope steepness (S) factor, both estimated for each DEM pixel using GIS algorithms. The amount of sediment released to the catchment river reach each day is calculated using a linear reservoir. Once the sediment reaches the river, it is transported along the river channel using an advection equation for silt and clay and a sediment continuity equation for sand.
    A sediment balance based on the Yang sediment transport capacity, which computes the amount of erosion and deposition along the rivers, is performed for sand particles as bed load, whilst no erosion or deposition is allowed for silt and clay. The model was first applied to the Madeira River basin, one of the major tributaries of the Amazon River (~1.4×10^6 km2), which accounts for 35% of the suspended sediment transported annually by the Amazon River to the ocean. Model results agree with observed data, mainly at monthly and annual time scales. The spatial distribution of soil erosion within the basin showed a large amount of sediment being delivered from the Andean regions of Bolivia and Peru. The spatial distribution of mean annual sediment load along the river showed that the Madre de Dios, Mamoré and Beni rivers transport the largest amounts of sediment. Simulated daily suspended solid discharge agrees with observed data. The model is able to provide temporally and spatially distributed estimates of soil loss over the basin, to identify locations with a tendency for erosion or deposition along the rivers, and to reproduce long-term sediment yield at several locations. Although model results are encouraging, further effort is needed to validate the model given the scarcity of data at this large scale.
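    The Modified Universal Soil Loss Equation named above can be sketched as follows. The coefficients a=11.8 and b=0.56 are the commonly cited Williams (1975) constants; whether MGB-IPH uses exactly this parameterization, and these particular input values, are assumptions for illustration.

```python
# Hedged sketch of a MUSLE-style event soil-loss estimate:
#   Y = a * (Q * qp)**b * K * LS * C * P
# with runoff volume Q (m3), peak flow qp (m3/s), and the usual USLE-style
# erodibility (K), topographic (LS), cover (C) and practice (P) factors.
def musle_soil_loss(runoff_mm, peak_flow_m3_s, area_ha, K, LS, C, P,
                    a=11.8, b=0.56):
    """Event sediment yield (t) for one HRU."""
    runoff_volume_m3 = runoff_mm / 1000.0 * area_ha * 10_000.0  # mm over ha -> m3
    return a * (runoff_volume_m3 * peak_flow_m3_s) ** b * K * LS * C * P

# Illustrative HRU: 10 mm of runoff over 100 ha, peaking at 5 m3/s
yield_t = musle_soil_loss(10.0, 5.0, 100.0, K=0.3, LS=1.2, C=0.1, P=1.0)
```

    The per-event yields computed this way would then feed the linear reservoir that delays delivery to the river reach, as described in the abstract.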

  5. Partial androgen insensitivity syndrome caused by a deep intronic mutation creating an alternative splice acceptor site of the AR gene.

    PubMed

    Ono, Hiroyuki; Saitsu, Hirotomo; Horikawa, Reiko; Nakashima, Shinichi; Ohkubo, Yumiko; Yanagi, Kumiko; Nakabayashi, Kazuhiko; Fukami, Maki; Fujisawa, Yasuko; Ogata, Tsutomu

    2018-02-02

    Although partial androgen insensitivity syndrome (PAIS) is caused by attenuated responsiveness to androgens, androgen receptor gene (AR) mutations in the coding regions and their splice sites have been identified in only <25% of patients with a diagnosis of PAIS. We performed extensive molecular studies, including whole exome sequencing, in a Japanese family with PAIS, identifying a deep intronic variant beyond the branch site in intron 6 of AR (NM_000044.4:c.2450-42 G > A). This variant created a splice acceptor motif accompanied by a pyrimidine-rich sequence and two candidate branch sites. Consistent with this, reverse transcriptase (RT)-PCR experiments on cycloheximide-treated lymphoblastoid cell lines revealed a relatively large amount of aberrant mRNA produced from the newly created splice acceptor site and a relatively small amount of wildtype mRNA produced from the normal splice acceptor site. Furthermore, most of the aberrant mRNA was shown to undergo nonsense-mediated decay (NMD), and any small amount of aberrant mRNA escaping NMD was predicted to generate a truncated AR protein missing some functional domains. These findings imply that the deep intronic mutation creating an alternative splice acceptor site resulted in the production of a relatively small amount of wildtype AR mRNA, leading to PAIS.

  6. Interannual kinetics (2010-2013) of large wood in a river corridor exposed to a 50-year flood event and fluvial ice dynamics

    NASA Astrophysics Data System (ADS)

    Boivin, Maxime; Buffin-Bélanger, Thomas; Piégay, Hervé

    2017-02-01

    Semi-alluvial rivers of the Gaspé Peninsula, Québec, are prone to produce and transport vast quantities of large wood (LW). The high rate of lateral erosion owing to high-energy flows and noncohesive banks is the main process leading to the recruitment of large wood, which in turn initiates complex patterns of wood accumulation and reentrainment within the active channel. The delta of the Saint-Jean River (SJR) has accumulated large annual wood fluxes since 1960 that culminated in a wood raft of > 3 km in length in 2014. To document the kinetics of large wood in the main channel of the SJR, four annual surveys were carried out from 2010 to 2013 to locate and describe > 1000 large wood jams (LWJ) and 2000 large wood individuals (LWI) along a 60-km river section. Airborne and ground photo/video images were used to estimate the wood volume introduced by lateral erosion and to identify local geomorphic conditions that control wood mobility and deposits. Video camera analysis allowed the examination of transport rates from three hydrometeorological events for specific river sections. Results indicate that the volume of LW recruited between 2010 and 2013 represents 57% of the total LW production over the 2004-2013 period. Volumes of wood deposited along the 60-km section were four times higher in 2013 than in 2010. Increases in wood amount occurred mainly in upper alluvial sections of the river, whereas decreases were observed in the semi-alluvial middle sections. Observations suggest that the 50-year flood event of 2010 produced large amounts of LW that were only partly exported out of the basin, so that a significant amount was still available for subsequent floods. Large wood storage continued after this flood until a similar flood or an ice-breakup event could remobilise these LW accumulations within the river corridor. Ice-jam floods transport large amounts of wood during events with fairly low flow but do not contribute significantly to recruitment rates (ca. 10 to 30%). It is fairly probable that the wood export peak observed in 2012 at the river mouth, when no flood comparable to the 1-in-10-year flood of 2010 occurred, is mainly linked to such ice-breakup events in March 2012.

  7. Source-Receptor Relationship Analysis of the Atmospheric Deposition of PAHs Subject to Long-Range Transport in Northeast Asia.

    PubMed

    Inomata, Yayoi; Kajino, Mizuo; Sato, Keiichi; Kurokawa, Junichi; Tang, Ning; Ohara, Toshimasa; Hayakawa, Kazuichi; Ueda, Hiromasa

    2017-07-18

    The source-receptor relationship analysis of PAH deposition in Northeast Asia was investigated using an Eulerian regional-scale aerosol chemical transport model. Dry deposition (DD) of PAH was controlled by wind flow patterns, whereas wet deposition (WD) depended on precipitation in addition to wind flow patterns. The contribution of WD was approximately 50-90% of the total deposition, except during winter in Northern China (NCHN) and Eastern Russia (ERUS) because of the low amount of precipitation. The amount of PAH deposition showed clear seasonal variation and was high in winter and low in summer in downwind (South Korea, Japan) and oceanic-receptor regions. In the downwind region, the contributions from NCHN (WD 28-52%; DD 54-55%) and Central China (CCHN) (WD 43-65%; DD 33-38%) were large in winter, whereas self-contributions (WD 20-51%; DD 79-81%) were relatively high in summer. In the oceanic-receptor region, the deposition amount decreased with distance from the Asian continent. The amount of DD was strongly influenced by emissions from neighboring domains. The contributions of WD from NCHN (16-20%) and CCHN (28-35%) were large. The large contributions from China in summer to the downwind region were linked to vertical transport of PAHs over the Asian continent associated with convection.

  8. Dynamic travel information personalized and delivered to your cell phone : addendum.

    DOT National Transportation Integrated Search

    2011-03-01

    Real-time travel information must reach a significant number of travelers to create a large amount of travel behavior change. For this project, since the TRAC-IT mobile phone application is used to monitor user context in terms of location, the mobil...

  9. Toward high-throughput genotyping: dynamic and automatic software for manipulating large-scale genotype data using fluorescently labeled dinucleotide markers.

    PubMed

    Li, J L; Deng, H; Lai, D B; Xu, F; Chen, J; Gao, G; Recker, R R; Deng, H W

    2001-07-01

    To efficiently manipulate large amounts of genotype data generated with fluorescently labeled dinucleotide markers, we developed a Microsoft database management system, named. offers several advantages. First, it accommodates the dynamic nature of the accumulations of genotype data during the genotyping process; some data need to be confirmed or replaced by repeat lab procedures. By using, the raw genotype data can be imported easily and continuously and incorporated into the database during the genotyping process that may continue over an extended period of time in large projects. Second, almost all of the procedures are automatic, including autocomparison of the raw data read by different technicians from the same gel, autoadjustment among the allele fragment-size data from cross-runs or cross-platforms, autobinning of alleles, and autocompilation of genotype data for suitable programs to perform inheritance check in pedigrees. Third, provides functions to track electrophoresis gel files to locate gel or sample sources for any resultant genotype data, which is extremely helpful for double-checking consistency of raw and final data and for directing repeat experiments. In addition, the user-friendly graphic interface of renders processing of large amounts of data much less labor-intensive. Furthermore, has built-in mechanisms to detect some genotyping errors and to assess the quality of genotype data that then are summarized in the statistic reports automatically generated by. The can easily handle >500,000 genotype data entries, a number more than sufficient for typical whole-genome linkage studies. The modules and programs we developed for the can be extended to other database platforms, such as Microsoft SQL server, if the capability to handle still greater quantities of genotype data simultaneously is desired.
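    The "autobinning of alleles" mentioned above, snapping raw fragment-size reads from dinucleotide (2-bp repeat) markers onto a 2-bp grid, can be sketched as below. This is an assumed approach for illustration, not the paper's actual algorithm, and the tolerance value is hypothetical.

```python
# Illustrative allele auto-binning sketch for dinucleotide markers: snap raw
# fragment sizes to a 2-bp grid anchored at the smallest read; off-grid reads
# are flagged for manual review. Assumed approach, not the paper's algorithm.
def autobin(sizes, spacing=2.0, tol=0.6):
    anchor = min(sizes)
    bins = {}
    for s in sizes:
        offset = round((s - anchor) / spacing)
        binned = anchor + offset * spacing
        if abs(s - binned) <= tol:
            bins.setdefault(binned, []).append(s)    # accepted into a bin
        else:
            bins.setdefault(None, []).append(s)      # ambiguous read -> review
    return bins

# Hypothetical raw reads drifting slightly around three true alleles
reads = [100.1, 100.0, 102.2, 104.1, 99.9]
allele_bins = autobin(reads)
```

    Reads that fall between bins end up under the None key, mirroring the kind of genotyping-error flagging the abstract describes for quality reports.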

  10. New insights into liquid chromatography for more eco-friendly analysis of pharmaceuticals.

    PubMed

    Shaaban, Heba

    2016-10-01

    Greening the analytical methods used for analysis of pharmaceuticals has been receiving great interest, aimed at eliminating or minimizing the amount of organic solvents consumed daily worldwide without loss in chromatographic performance. Traditional analytical LC techniques employed in pharmaceutical analysis consume tremendous amounts of hazardous solvents and consequently generate large amounts of waste. The monetary and ecological impact of using large amounts of solvents and of waste disposal motivated the analytical community to search for alternatives to replace polluting analytical methodologies with clean ones. In this context, implementing the principles of green analytical chemistry (GAC) in analytical laboratories is highly desired. This review gives a comprehensive overview of different green LC pathways for implementing GAC principles in analytical laboratories and focuses on evaluating the greenness of LC analytical procedures. This review presents green LC approaches for eco-friendly analysis of pharmaceuticals in industrial, biological, and environmental matrices. Graphical Abstract: Green pathways of liquid chromatography for more eco-friendly analysis of pharmaceuticals.

  11. Does tinnitus distress depend on age of onset?

    PubMed

    Schlee, Winfried; Kleinjung, Tobias; Hiller, Wolfgang; Goebel, Gerhard; Kolassa, Iris-Tatjana; Langguth, Berthold

    2011-01-01

    Tinnitus is the perception of a sound in the absence of any physical source of it. About 5-15% of the population report hearing such a tinnitus, and about 1-2% suffer from their tinnitus to a degree that leads to anxiety, sleep disorders, or depression. It is currently not completely understood why some people feel distressed by their tinnitus while others do not. Several studies indicate that the amount of tinnitus distress is associated with many factors, including comorbid anxiety, comorbid depression, personality, the psychosocial situation, the amount of the related hearing loss, and the loudness of the tinnitus. Furthermore, theoretical considerations suggest that the age at tinnitus onset influences tinnitus distress. Based on a sample of 755 normal-hearing tinnitus patients, we tested this assumption. All participants answered a questionnaire on the amount of tinnitus distress and provided a large variety of clinical and demographic data. Patients with an earlier onset of tinnitus suffer significantly less than patients with an onset later in life. Furthermore, patients with a later onset of tinnitus describe their course of tinnitus distress as more abrupt and as distressing right from the beginning. We argue that a decline of compensatory brain plasticity in older age accounts for this age-dependent tinnitus decompensation.

  12. Inclusion bodies of aggregated hemosiderins in liver macrophages.

    PubMed

    Hayashi, Hisao; Tatsumi, Yasuaki; Wakusawa, Shinya; Shigemasa, Ryota; Koide, Ryoji; Tsuchida, Ken-Ichi; Morotomi, Natsuko; Yamashita, Tetsuji; Kumagai, Kotaro; Ono, Yukiya; Hayashi, Kazuhiko; Ishigami, Masatoshi; Goto, Hidemi; Kato, Ayako; Kato, Koichi

    2017-12-01

    Hemosiderin formation is a structural indication of iron overload. We investigated further adaptations of the liver to excess iron. Five patients whose livers showed iron-rich inclusions larger than 2 µm were selected from our database. The clinical features of the patients and the structures of the inclusions were compared with those of 2 controls with mild iron overload. All patients had severe iron overload, with more than 5000 ng/mL of serum ferritin. Etiologies were variable, from hemochromatosis to iatrogenic iron overload. Their histological stages were either portal fibrosis or cirrhosis. Inclusion bodies were ultra-structurally visualized as aggregated hemosiderins in the periportal macrophages. X-ray analysis always identified, in addition to a large amount of iron complexes including oxygen and phosphorus, a small amount of copper and sulfur in the mosaic matrices of the inclusions. There were no inclusions in the control livers. Inclusion bodies, when the liver is loaded with excess iron, may appear in the macrophages as isolated organelles of aggregated hemosiderins. Trace amounts of copper-sulfur complexes were always identified in the mosaic matrices of the inclusions, suggesting cuproprotein induction against excess iron. In conclusion, inclusion formation in macrophages may be an adaptation of the liver loaded with excess iron.

  13. [Food consumption and anthropometry related to the frailty syndrome in low-income community-living elderly in a large city].

    PubMed

    Mello, Amanda de Carvalho; Carvalho, Marilia Sá; Alves, Luciana Correia; Gomes, Viviane Pereira; Engstrom, Elyne Montenegro

    2017-08-21

    The aim of this study was to describe anthropometric and food intake data related to the frailty syndrome in the elderly. This was a cross-sectional study in individuals ≥ 60 years of age in a household survey in the Manguinhos neighborhood of Rio de Janeiro, Brazil (n = 137). Frailty syndrome was diagnosed according to Fried et al., anthropometric measures were taken, and a food frequency questionnaire was applied and the results compared to Brazilian Ministry of Health guidelines. In the pre-frail and frail groups, body mass index and measures of central adiposity showed higher levels, while lean muscle parameters showed lower values, proportional to the syndrome's gradation. Frail elderly consumed higher amounts of grains and lower amounts of beans and fruit; pre-frail elderly consumed more vegetables, dairy products, and high-sugar and high-fat foods; the two groups consumed similar amounts of meat. Thus, diagnosis of the syndrome, anthropometric evaluation, and dietary assessment should be included in health policies for the elderly, since they assist in early identification of risk and favor interventions for disease prevention and health and nutritional promotion.

  14. Kazakhstan In situ BioTransformation of Mercury ...

    EPA Pesticide Factsheets

    Our final international work on the biological decontamination of mercury-contaminated soils in the northern outskirts of Pavlodar, resulting from activity at the former PO “Khimprom” chemical plant, is reported here. The plant produced chlorine and alkali from the 1970s into the 1990s using the electrolytic amalgam method, which entailed the use of massive amounts of mercury. Ground water became contaminated with Hg, resulting in a plume 470 m wide and 1.9 km long, estimated to contain 2 million cubic meters of water. This plume could reach the River Irtysh, a source of drinking water for large cities in Kazakhstan and Russia. Significant amounts of mercuric compounds are deposited in the sediments of Lake Balkyldak, 1.5 km north of the factory, which occasionally received wastewater from the plant. Phase I of the PO “Khimprom” clean-up, which isolated the major sources of mercury at the site, was completed in 2004. However, significant amounts of mercury remain underground, including groundwater contaminated with Hg in the form of HgCl2 with little to no elemental or methyl mercury (MeHg). The objective is to develop biotechnology strategies to mitigate mercury contamination in groundwater.

  15. Test Data Monitor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosas, Joseph

    The National Security Campus (NSC) collects a large amount of test data used to accept high-value and high-rigor product. The data has historically been used to support root cause analysis when anomalies are detected in down-stream processes. The opportunity to use the data for predictive failure analysis, however, had never been exploited. The primary goal of the Test Data Monitor (TDM) software is to provide automated capabilities to analyze data in near-real-time and report trends that foreshadow actual product failures. To date, the aerospace industry as a whole is challenged at utilizing collected data to the degree that modern technology allows. As a result of the innovation behind TDM, Honeywell is able to monitor millions of data points through a multitude of SPC algorithms continuously and autonomously, so that our personnel resources can more efficiently and accurately direct their attention to suspect processes or features. TDM’s capabilities have been recognized by our U.S. Department of Energy National Nuclear Security Administration (NNSA) sponsor for potential use at other sites within the NNSA. This activity supports multiple initiatives, including expectations of the NNSA and broader corporate goals that center around data-based quality controls on production.
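    As a minimal sketch of the kind of SPC check such a monitor might run continuously, the function below applies two classic control-chart rules; the rule thresholds here are generic textbook conventions, not Honeywell's actual TDM algorithms.

```python
def spc_flags(values, mean, sigma, run_length=8):
    """Return indices that violate two classic control-chart rules:
    a point beyond 3-sigma, or the end of a run of `run_length` or more
    consecutive points on the same side of the mean."""
    flags = set()
    side_run = 0
    last_side = 0
    for i, v in enumerate(values):
        if abs(v - mean) > 3 * sigma:
            flags.add(i)  # rule 1: single point outside the control limits
        side = 1 if v > mean else (-1 if v < mean else 0)
        side_run = side_run + 1 if (side == last_side and side != 0) else 1
        last_side = side
        if side != 0 and side_run >= run_length:
            flags.add(i)  # rule 2: sustained shift to one side of the mean
    return sorted(flags)

data = [0.1, -0.2, 3.5, 0.2, 0.3, 0.1, 0.4, 0.2, 0.5, 0.1, 0.3]
print(spc_flags(data, mean=0.0, sigma=1.0))  # flags the spike and the long run
```

    In a production setting the mean and sigma would come from an in-control baseline rather than being supplied by hand.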

  16. High Reynolds Number Investigation of a Flush Mounted, S-Duct Inlet With Large Amounts of Boundary Layer Ingestion

    NASA Technical Reports Server (NTRS)

    Berrier, Bobby L.; Carter, Melissa B.; Allan, Brian G.

    2005-01-01

    An experimental investigation of a flush-mounted, S-duct inlet with large amounts of boundary layer ingestion has been conducted at Reynolds numbers up to full scale. The study was conducted in the NASA Langley Research Center 0.3-Meter Transonic Cryogenic Tunnel. In addition, a supplemental computational study on one of the inlet configurations was conducted using the Navier-Stokes flow solver, OVERFLOW. Tests were conducted at Mach numbers from 0.25 to 0.83, Reynolds numbers (based on aerodynamic interface plane diameter) from 5.1 million to 13.9 million (full-scale value), and inlet mass-flow ratios from 0.29 to 1.22, depending on Mach number. Results of the study indicated that increasing Mach number, increasing boundary layer thickness (relative to inlet height), or ingesting a boundary layer with a distorted profile decreased inlet performance. At Mach numbers above 0.4, increasing inlet airflow increased inlet pressure recovery but also increased distortion. Finally, inlet distortion was found to be relatively insensitive to Reynolds number, but pressure recovery increased slightly with increasing Reynolds number. This CD-ROM supplement contains inlet data including: Boundary layer data, Duct static pressure data, performance-AIP (fan face) data, Photos, Tunnel wall P-PTO data and definitions.

  17. Electroweak corrections to hadronic production of W bosons at large transverse momenta

    NASA Astrophysics Data System (ADS)

    Kühn, Johann H.; Kulesza, A.; Pozzorini, S.; Schulze, M.

    2008-07-01

    To match the precision of present and future measurements of W-boson production at hadron colliders, electroweak radiative corrections must be included in the theory predictions. In this paper we consider their effect on the transverse momentum (pT) distribution of W bosons, with emphasis on large pT. We evaluate the full electroweak O(α) corrections to the processes pp→W+jet and pp̄→W+jet, including virtual and real photonic contributions. We present explicit expressions in analytical form for the virtual corrections and provide results for the real corrections, discussing in detail the treatment of soft and collinear singularities. We also provide compact approximate expressions which are valid in the high-energy region, where the electroweak corrections are strongly enhanced by logarithms of ŝ/MW². These expressions describe the complete asymptotic behaviour at one loop as well as the leading and next-to-leading logarithms at two loops. Numerical results are presented for proton-proton collisions at 14 TeV and proton-antiproton collisions at 2 TeV. The corrections are negative and their size increases with pT. At the LHC, where transverse momenta of 2 TeV or more can be reached, the one- and two-loop corrections amount to up to -40% and +10%, respectively, and will be important for a precise analysis of W production. At the Tevatron, transverse momenta up to 300 GeV are within reach. In this case the electroweak corrections amount to up to -10% and are thus larger than the expected statistical error.
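    The logarithmic enhancement referred to above has, at one loop, the generic Sudakov form sketched below; the coefficients c2 and c1 are process-dependent placeholders, not the paper's explicit results.

```latex
% Schematic one-loop electroweak correction at high energy: the relative
% correction grows with double and single logarithms of \hat{s}/M_W^2.
% c_2 and c_1 are process-dependent coefficients (placeholders here).
\delta_{\mathrm{EW}}^{\text{1-loop}}
  \simeq -\frac{\alpha}{4\pi}
    \left[ c_2 \, \ln^2\!\frac{\hat{s}}{M_W^2}
         + c_1 \, \ln\frac{\hat{s}}{M_W^2} \right]
```

    At large pT one has ŝ ≫ MW², so these logarithms become large and negative corrections of the quoted size can arise.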

  18. Seawater strontium isotopes, acid rain, and the Cretaceous-Tertiary boundary

    NASA Technical Reports Server (NTRS)

    Macdougall, J. D.

    1988-01-01

    A large bolide impact at the end of the Cretaceous would have produced significant amounts of nitrogen oxides by shock heating of the atmosphere. The resulting acid precipitation would have increased continental weathering greatly and could be an explanation for the observed high ratio of strontium-87 to strontium-86 in seawater at about this time, due to the dissolution of large amounts of strontium from the continental crust. Spikes to high values in the seawater strontium isotope record at other times may reflect similar episodes.

  19. Inter-comparison of precipitable water among reanalyses and its effect on downscaling in the tropics

    NASA Astrophysics Data System (ADS)

    Takahashi, H. G.; Fujita, M.; Hara, M.

    2012-12-01

    This paper compared precipitable water (PW) among four major reanalyses. In addition, we investigated the effect of the boundary conditions on downscaling in the tropics using a regional climate model. The spatial pattern of PW in the reanalyses agreed closely with observations. However, the absolute amounts of PW in some reanalyses were very small compared to observations. The discrepancies in the 12-year mean PW in July over the Southeast Asian monsoon region exceeded the inter-annual standard deviation of the PW. There was also a discrepancy in tropical PW throughout the year, an indication that the problem is not regional but global. Downscaling experiments forced by the four different reanalyses were then conducted. Differences in the atmospheric circulation, including monsoon westerlies and various disturbances, were very small among the reanalyses. However, simulated precipitation was only 60% of observed precipitation, although the dry bias in the boundary conditions was only 6%. This result indicates that dry bias has large effects on precipitation in downscaling over the tropics. It also suggests that a regional climate downscaled from ensemble-mean boundary conditions can be quite different from the ensemble mean of the regional climates downscaled separately from the boundary conditions of each ensemble member. Downscaled models can provide realistic simulations of regional tropical climates only if the boundary conditions include realistic absolute amounts of PW; the use of such boundary conditions in tropical downscaling is imperative at the present time. This work was partly supported by the Global Environment Research Fund (RFa-1101) of the Ministry of the Environment, Japan.
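    For reference, precipitable water is obtained from a reanalysis column by integrating specific humidity over pressure, PW = (1/(ρw·g)) ∫ q dp. The sketch below applies this with simple trapezoidal integration; the profile values are invented for illustration.

```python
G = 9.81        # gravitational acceleration, m s^-2
RHO_W = 1000.0  # density of liquid water, kg m^-3

def precipitable_water(pressure_pa, q_kg_per_kg):
    """Trapezoidal integration of q dp through the column; returns PW in mm
    (1 kg m^-2 of column water equals 1 mm of precipitable water)."""
    pw = 0.0
    for i in range(len(pressure_pa) - 1):
        dp = abs(pressure_pa[i] - pressure_pa[i + 1])
        pw += 0.5 * (q_kg_per_kg[i] + q_kg_per_kg[i + 1]) * dp
    return pw / (RHO_W * G) * 1000.0  # kg m^-2 -> mm

# Idealized moist tropical column (pressure in Pa, specific humidity in kg/kg):
p = [100000.0, 85000.0, 70000.0, 50000.0, 30000.0]
q = [0.018, 0.012, 0.008, 0.003, 0.0005]
print(round(precipitable_water(p, q), 1))  # a plausible tropical value in mm
```

    A few-percent dry bias in q at every level translates directly into the kind of PW discrepancy the paper reports.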

  20. Hybrid energy storage systems utilizing redox active organic compounds

    DOEpatents

    Wang, Wei; Xu, Wu; Li, Liyu; Yang, Zhenguo

    2015-09-08

    Redox flow batteries (RFB) have attracted considerable interest due to their ability to store large amounts of power and energy. Non-aqueous energy storage systems that utilize at least some aspects of RFB systems are attractive because they can offer an expansion of the operating potential window, which can improve on the system energy and power densities. One example of such systems has a separator separating first and second electrodes. The first electrode includes a first current collector and volume containing a first active material. The second electrode includes a second current collector and volume containing a second active material. During operation, the first source provides a flow of first active material to the first volume. The first active material includes a redox active organic compound dissolved in a non-aqueous, liquid electrolyte and the second active material includes a redox active metal.

  1. The Development and Microstructure Analysis of High Strength Steel Plate NVE36 for Large Heat Input Welding

    NASA Astrophysics Data System (ADS)

    Peng, Zhang; Liangfa, Xie; Ming, Wei; Jianli, Li

    In the shipbuilding industry, the welding efficiency of the ship plate not only has a great effect on the construction cost of the ship, but also affects the construction speed and determines the delivery cycle. The steel plate used for large heat input welding has therefore been developed extensively. In this paper, the composition of the steel was designed along a micro-alloying route, with small amounts of Nb and Ti and a large amount of Mn. The C content and the carbon equivalent were also kept at a low level. The technology of oxide metallurgy was used during the smelting process. The TMCP rolling technology was applied at a low rolling temperature, and ultra-fast cooling was used, for the purpose of controlling the transformation of the microstructure. The microstructure of the steel plate was controlled to be a mixed microstructure of low-carbon bainite and ferrite. A large number of oxide particles were dispersed in the microstructure of the steel, which had a positive effect on the mechanical properties and welding performance of the steel. The mechanical properties of the steel plate were excellent, and the longitudinal Akv value at -60 °C was more than 200 J. The toughness of the WM and HAZ was excellent after the steel plate was welded with a large heat input of 100-250 kJ/cm. The steel plate processed as described above can meet the requirements of large heat input welding.

  2. CONSTRUCTED WETLANDS VS. RETENTION POND BMPS: MESOCOSM STUDIES FOR IMPROVED POLLUTANT MANAGEMENT IN URBAN STORMWATER TREATMENT

    EPA Science Inventory

    Increased urbanization has increased the amount of directly connected impervious area that results in large quantities of stormwater runoff. This runoff can contribute significant amounts of debris and pollutants to receiving waters. Urban watershed managers often incorporate b...

  3. Monitoring diffuse volcanic degassing during volcanic unrests: the case of Campi Flegrei (Italy)

    NASA Astrophysics Data System (ADS)

    Cardellini, Carlo; Chiodini, Giovanni; Avino, Rosario; Bagnato, Emanuela; Caliro, Stefano; Frondini, Francesco; Lelli, Matteo; Rosiello, Angelo

    2017-04-01

    Hydrothermal activity at Solfatara of Pozzuoli (Campi Flegrei caldera, Italy) results in a large area of hot soils, diffuse CO2 degassing, and numerous fumaroles, releasing large amounts of gases and thermal energy at the surface. Solfatara is one of the first sites in the world where the techniques for measuring and interpreting soil CO2 diffuse degassing were developed, during the 1990s, and, more recently, it has become a sort of natural laboratory for testing new types of measurements of CO2 fluxes from hydrothermal sites. The results of 30 diffuse CO2 flux surveys performed at Solfatara from 1998 to 2016 are presented and discussed. CO2 soil fluxes were measured over an area of about 1.2 × 1.2 km, including the Solfatara crater and the hydrothermal site of Pisciarelli, using the accumulation chamber technique. Each survey consisted of between 372 and 583 CO2 flux measurements, for a total of 13,158 measurements. This is one of the largest datasets ever collected on a single degassing volcanic-hydrothermal system. It is particularly relevant to the volcanological sciences because it was acquired during a long period of unrest at the Campi Flegrei caldera and because Solfatara releases an amount of CO2 comparable to that released by medium-large volcanic plumes. Statistical and geostatistical elaborations of the CO2 flux data made it possible to characterise the sources of soil diffuse degassing, to define the extent of the area interested by the release of hydrothermal CO2 (the Solfatara DDS), and to quantify the total amount of released CO2. During the last eighteen years, relevant variations have affected Solfatara degassing, in particular the "background" CO2 emission, the extent of the DDS, and the total CO2 output, which may reflect variations in the subterraneous gas plume feeding the Solfatara and Pisciarelli emissions. In fact, the most relevant variations in Solfatara diffuse degassing correlate well with the steam condensation and temperature increase affecting the Solfatara system, which result from repeated inputs of magmatic fluids into the hydrothermal system, as suggested by Chiodini et al. (2015; 2016; 2017), and show a long-term increase in the amount of released CO2 that accompanies the ongoing unrest of the Campi Flegrei caldera.
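    As a hypothetical zeroth-order illustration of how point flux measurements become a total CO2 output, one can multiply the mean accumulation-chamber flux by the surveyed area; the actual study uses geostatistical elaboration (which also yields uncertainty), and the values below are invented.

```python
def total_co2_output(fluxes_g_m2_day, area_m2):
    """Arithmetic-mean soil CO2 flux (g m^-2 d^-1) times surveyed area (m^2),
    converted from grams to tonnes of CO2 per day."""
    mean_flux = sum(fluxes_g_m2_day) / len(fluxes_g_m2_day)
    return mean_flux * area_m2 / 1e6  # g -> tonnes

fluxes = [120.0, 800.0, 45.0, 2300.0, 310.0]   # invented example fluxes
area = 1.0e6                                    # ~1 km^2 degassing area
print(round(total_co2_output(fluxes, area), 1))  # tonnes of CO2 per day
```

    Because soil CO2 fluxes are strongly skewed, the arithmetic mean is only a rough estimator; geostatistical simulation over the interpolated flux map is the standard refinement.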

  4. Inactive Hepatitis B Carrier and Pregnancy Outcomes: A Systematic Review and Meta-analysis.

    PubMed

    Keramat, Afsaneh; Younesian, Masud; Gholami Fesharaki, Mohammad; Hasani, Maryam; Mirzaei, Samaneh; Ebrahimi, Elham; Alavian, Seyed Moaed; Mohammadi, Fatemeh

    2017-04-01

    We aimed to explore whether maternal asymptomatic hepatitis B (HB) infection affects pre-term rupture of membranes (PROM), stillbirth, preeclampsia, eclampsia, gestational hypertension, or antepartum hemorrhage. We searched PubMed, Scopus, and the ISI Web of Science from 1990 to Feb 2015. These electronic literature searches were supplemented by searching the gray literature (e.g., conference abstracts, theses, and technical reports) and by scanning the reference lists of included studies and relevant systematic reviews. We explored statistical heterogeneity using the I2 and tau-squared (Tau2) statistics. Eighteen studies were included. Pre-term rupture of membranes (PROM), stillbirth, preeclampsia, eclampsia, gestational hypertension, and antepartum hemorrhage were the outcomes considered in this survey. The results showed no significant association between inactive HB and these complications in pregnancy. Small P values and large I2 values suggested probable heterogeneity, which we tried to address with statistical methods such as subgroup analysis. Inactive HB infection did not increase the risk of the adverse outcomes examined in this study. Further well-designed studies should be performed to confirm the results.
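    The heterogeneity statistics named above have simple closed forms: Cochran's Q, I2 = (Q − df)/Q, and the DerSimonian-Laird Tau2. The sketch below computes them for invented study effects and variances; it is a generic meta-analysis recipe, not this review's actual data.

```python
def heterogeneity(effects, variances):
    """Return (Q, I2 in %, Tau2) using fixed-effect weights w_i = 1/v_i and
    the DerSimonian-Laird moment estimator for Tau2."""
    w = [1.0 / v for v in variances]
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0
    return q, i2, tau2

effects = [0.10, 0.35, -0.05, 0.60]      # invented study effect sizes
variances = [0.02, 0.03, 0.025, 0.04]    # invented within-study variances
q, i2, tau2 = heterogeneity(effects, variances)
print(round(q, 2), round(i2, 1), round(tau2, 4))
```

    An I2 above roughly 50% is conventionally read as substantial heterogeneity, which is what motivates subgroup analysis of the kind the authors describe.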

  5. Analysis of sea ice dynamics

    NASA Technical Reports Server (NTRS)

    Zwally, J.

    1988-01-01

    The ongoing work has established the basis for using multiyear sea ice concentrations from SMMR passive microwave data for studies of large-scale advection and convergence/divergence of the Arctic sea ice pack. Comparisons were made with numerical model simulations and buoy data, showing qualitative agreement on daily to interannual time scales. Analysis of the 7-year SMMR data set shows significant interannual variations in the total area of multiyear ice. The scientific objective is to investigate the dynamics, mass balance, and interannual variability of the Arctic sea ice pack. The research emphasizes the direct application of sea ice parameters derived from passive microwave data (SMMR and SSMI) and collaborative studies using a sea ice dynamics model. The possible causes of observed interannual variations in the multiyear ice area are being examined. The relative effects of variations in the large-scale advection and convergence/divergence within the ice pack are investigated on a regional and seasonal basis. The effects of anomalous atmospheric forcings are being examined, including the long-lived effects of synoptic events and monthly variations in the mean geostrophic winds. Estimates to be made will include the amount of new ice production within the ice pack during winter and the amount of ice exported from the pack.

  6. Genome Data Exploration Using Correspondence Analysis

    PubMed Central

    Tekaia, Fredj

    2016-01-01

    Recent developments of sequencing technologies that allow the production of massive amounts of genomic and genotyping data have highlighted the need for synthetic data representation and pattern recognition methods that can mine and help discovering biologically meaningful knowledge included in such large data sets. Correspondence analysis (CA) is an exploratory descriptive method designed to analyze two-way data tables, including some measure of association between rows and columns. It constructs linear combinations of variables, known as factors. CA has been used for decades to study high-dimensional data, and remarkable inferences from large data tables were obtained by reducing the dimensionality to a few orthogonal factors that correspond to the largest amount of variability in the data. Herein, I review CA and highlight its use by considering examples in handling high-dimensional data that can be constructed from genomic and genetic studies. Examples in amino acid compositions of large sets of species (viruses, phages, yeast, and fungi) as well as an example related to pairwise shared orthologs in a set of yeast and fungal species, as obtained from their proteome comparisons, are considered. For the first time, results show striking segregations between yeasts and fungi as well as between viruses and phages. Distributions obtained from shared orthologs show clusters of yeast and fungal species corresponding to their phylogenetic relationships. A direct comparison with the principal component analysis method is discussed using a recently published example of genotyping data related to newly discovered traces of an ancient hominid that was compared to modern human populations in the search for ancestral similarities. CA offers more detailed results highlighting links between modern humans and the ancient hominid and their characterizations. 
Compared to the popular principal component analysis method, CA allows easier and more effective interpretation of results, particularly by the ability of relating individual patterns with their corresponding characteristic variables. PMID:27279736
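    The CA recipe the review describes (factors as linear combinations that capture the largest share of inertia) can be sketched compactly as an SVD of the standardized residuals of a contingency table. This is the standard textbook construction; the toy table below is invented, and numpy is assumed to be available.

```python
import numpy as np

def correspondence_analysis(table):
    """Return (row_coords, col_coords, inertias) for a 2-way contingency table,
    via SVD of the standardized residual matrix."""
    P = table / table.sum()                      # correspondence matrix
    r = P.sum(axis=1)                            # row masses
    c = P.sum(axis=0)                            # column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, s, Vt = np.linalg.svd(S, full_matrices=False)
    rows = (U * s) / np.sqrt(r)[:, None]         # principal row coordinates
    cols = (Vt.T * s) / np.sqrt(c)[:, None]      # principal column coordinates
    return rows, cols, s ** 2                    # s^2 are the principal inertias

counts = np.array([[30.0, 10.0,  5.0],
                   [10.0, 40.0, 10.0],
                   [ 5.0, 10.0, 30.0]])
rows, cols, inertias = correspondence_analysis(counts)
print(np.round(inertias, 3))  # inertia (variance) carried by each factor
```

    Plotting the first two columns of `rows` and `cols` together gives the usual CA biplot, where rows and columns with similar profiles appear close to each other.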

  7. Work extraction and thermodynamics for individual quantum systems

    NASA Astrophysics Data System (ADS)

    Skrzypczyk, Paul; Short, Anthony J.; Popescu, Sandu

    2014-06-01

    Thermodynamics is traditionally concerned with systems comprised of a large number of particles. Here we present a framework for extending thermodynamics to individual quantum systems, including explicitly a thermal bath and work-storage device (essentially a ‘weight’ that can be raised or lowered). We prove that the second law of thermodynamics holds in our framework, and give a simple protocol to extract the optimal amount of work from the system, equal to its change in free energy. Our results apply to any quantum system in an arbitrary initial state, in particular including non-equilibrium situations. The optimal protocol is essentially reversible, similar to classical Carnot cycles, and indeed, we show that it can be used to construct a quantum Carnot engine.
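    The "change in free energy" invoked above can be written out explicitly; this is the standard quantum free-energy expression, with S the von Neumann entropy and ρβ the thermal (Gibbs) state of the bath temperature T:

```latex
% Maximal average work extractable from a system in state \rho with
% Hamiltonian H, in contact with a bath at temperature T:
W_{\max} = F(\rho) - F(\rho_\beta), \qquad
F(\rho) = \mathrm{Tr}(H\rho) - k_B T \, S(\rho), \qquad
S(\rho) = -\mathrm{Tr}(\rho \ln \rho)
```

    For a system already in the thermal state, ρ = ρβ, the extractable work vanishes, consistent with the second law holding in the framework.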

  8. Work extraction and thermodynamics for individual quantum systems.

    PubMed

    Skrzypczyk, Paul; Short, Anthony J; Popescu, Sandu

    2014-06-27

    Thermodynamics is traditionally concerned with systems comprised of a large number of particles. Here we present a framework for extending thermodynamics to individual quantum systems, including explicitly a thermal bath and work-storage device (essentially a 'weight' that can be raised or lowered). We prove that the second law of thermodynamics holds in our framework, and give a simple protocol to extract the optimal amount of work from the system, equal to its change in free energy. Our results apply to any quantum system in an arbitrary initial state, in particular including non-equilibrium situations. The optimal protocol is essentially reversible, similar to classical Carnot cycles, and indeed, we show that it can be used to construct a quantum Carnot engine.

  9. Kombucha: a dubious "cure".

    PubMed

    Majchrowicz, M

    1995-05-01

    The kombucha (or Manchurian) mushroom has numerous claims of "significant" health improvements, yet there is no research or any basic evidence to back up the claims. According to folklore, the kombucha is a super immune booster that can fight many ailments, including AIDS, cancer, arthritis, constipation, and more. However, there is concern about the safety of kombucha, which is not really a mushroom but a yeast culture. Since the culture must grow at room temperature for seven to ten days, contamination and growth of other organisms can take place. The tea's original ingredients include caffeine and large amounts of sugar. These may account for the increased energy some individuals have claimed. Some stories state miraculous results. Other accounts mention no improvement in general well-being.

  10. Interactive Performance and Focus Groups with Adolescents: The Power of Play

    PubMed Central

    Norris, Anne E.; Aroian, Karen J.; Warren, Stefanie

    2012-01-01

    Conducting focus groups with adolescents can be challenging given their developmental needs, particularly with sensitive topics. These challenges include intense need for peer approval, declining social trust, short attention span, and reliance on concrete operations thinking. In this article we describe an adaptation of interactive performance as an alternative to traditional focus group method. We used this method in a study of discrimination experienced by Muslims (ages 13-17) and of peer pressure to engage in sexual behavior experienced by Hispanic girls (ages 10-14). Recommendations for use of this method include using an interdisciplinary team, planning for large amounts of disclosure towards the end of the focus group, and considering the fit of this method to the study topic. PMID:22949032

  11. Studies and analyses of the space shuttle main engine. Failure information propagation model data base and software

    NASA Technical Reports Server (NTRS)

    Tischer, A. E.

    1987-01-01

    The failure information propagation model (FIPM) data base was developed to store and manipulate the large amount of information anticipated for the various Space Shuttle Main Engine (SSME) FIPMs. The organization and structure of the FIPM data base is described, including a summary of the data fields and key attributes associated with each FIPM data file. The menu-driven software developed to facilitate and control the entry, modification, and listing of data base records is also discussed. The transfer of the FIPM data base and software to the NASA Marshall Space Flight Center is described. Complete listings of all of the data base definition commands and software procedures are included in the appendixes.

  12. Confocal laser scanning microscopy detection of chlorophylls and carotenoids in chloroplasts and chromoplasts of tomato fruit.

    PubMed

    D'Andrea, Lucio; Amenós, Montse; Rodríguez-Concepción, Manuel

    2014-01-01

    Plant cells are unique among eukaryotic cells because of the presence of plastids, including chloroplasts and chromoplasts. Chloroplasts are found in green tissues and harbor the photosynthetic machinery (including chlorophyll molecules), while chromoplasts are present in non-photosynthetic tissues and accumulate large amounts of carotenoids. During tomato fruit development, chloroplasts are converted into chromoplasts that accumulate high levels of lycopene, a linear carotenoid responsible for the characteristic red color of ripe fruit. Here, we describe a simple and fast method to detect both types of fully differentiated plastids (chloroplasts and chromoplasts), as well as intermediate stages, in fresh tomato fruits. The method is based on the differential autofluorescence of chlorophylls and carotenoids (lycopene) detected by Confocal Laser Scanning Microscopy.

  13. 38 CFR 17.254 - Applications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., assurances, and supporting documents: (a) To specify amount. Each application shall show the amount of the..., and amounts or expenses which will be borne by the applicant, and (e) To include assurance records will be kept. Each application shall include sufficient assurances that the applicant shall keep...

  14. Persistent Bovine Viral Diarrhea Virus Infection in Domestic and Wild Small Ruminants and Camelids Including the Mountain Goat (Oreamnos americanus)

    PubMed Central

    Nelson, Danielle D.; Duprau, Jennifer L.; Wolff, Peregrine L.; Evermann, James F.

    2016-01-01

    Bovine viral diarrhea virus (BVDV) is a pestivirus best known for causing a variety of disease syndromes in cattle, including gastrointestinal disease, reproductive insufficiency, immunosuppression, mucosal disease, and hemorrhagic syndrome. The virus can be spread by transiently infected individuals and by persistently infected animals that may be asymptomatic while shedding large amounts of virus throughout their lifetime. BVDV has been reported in over 40 domestic and free-ranging species, and persistent infection has been described in eight of those species: white-tailed deer, mule deer, eland, mousedeer, mountain goats, alpacas, sheep, and domestic swine. This paper reviews the various aspects of BVDV transmission, disease syndromes, diagnosis, control, and prevention, as well as examines BVDV infection in domestic and wild small ruminants and camelids including mountain goats (Oreamnos americanus). PMID:26779126

  15. Yolk embolism associated with trauma in vitellogenic sea turtles in Florida (USA): a review of 11 cases.

    PubMed

    Stacy, Brian A; Foley, Allen; Garner, Michael M; Mettee, Nancy

    2013-12-01

    Case information and postmortem examination findings are presented for 11 adult female sea turtles in reproductive form that died in Florida, USA. All had abundant, large vitellogenic follicles, and most were either gravid or had recently nested. Species included six loggerheads (Caretta caretta) and five green turtles (Chelonia mydas). Identified proximate causes of death included falls or entrapment by obstructions on nesting beaches, burial under collapsed dunes, and other traumatic injuries of different causes. Evidence of yolk embolization was found in 10 cases and suspected in an 11th turtle. Ten turtles also had various amounts of free intracoelomic yolk. Although the effects of yolk embolization are uncertain at this time, precedence of pathologic importance in other species suggests that embolism may complicate traumatic injuries, including seemingly minor events.

  16. An experimental test of the habitat-amount hypothesis for saproxylic beetles in a forested region.

    PubMed

    Seibold, Sebastian; Bässler, Claus; Brandl, Roland; Fahrig, Lenore; Förster, Bernhard; Heurich, Marco; Hothorn, Torsten; Scheipl, Fabian; Thorn, Simon; Müller, Jörg

    2017-06-01

The habitat-amount hypothesis challenges traditional concepts that explain species richness within habitats, such as the habitat-patch hypothesis, where species number is a function of patch size and patch isolation. It posits that effects of patch size and patch isolation are driven by effects of sample area, and thus that the number of species at a site is basically a function of the total habitat amount surrounding this site. We tested the habitat-amount hypothesis for saproxylic beetles and their habitat of dead wood by using an experiment comprising 190 plots with manipulated patch sizes situated in a forested region with a high variation in habitat amount (i.e., density of dead trees in the surrounding landscape). Although dead wood is a spatio-temporally dynamic habitat, saproxylic insects have life cycles shorter than the time needed for habitat turnover and they closely track their resource. Patch size was manipulated by adding various amounts of downed dead wood to the plots (~800 m³ in total); dead trees in the surrounding landscape (~240 km²) were identified using airborne laser scanning (light detection and ranging). Over 3 yr, 477 saproxylic species (101,416 individuals) were recorded. Considering 20-1,000 m radii around the patches, local landscapes were identified as having a radius of 40-120 m. Both patch size and habitat amount in the local landscapes independently affected species numbers without a significant interaction effect, hence refuting the island effect. Species accumulation curves relative to cumulative patch size were not consistent with either the habitat-patch hypothesis or the habitat-amount hypothesis: several small dead-wood patches held more species than a single large patch with an amount of dead wood equal to the sum of that of the small patches. Our results indicate that conservation of saproxylic beetles in forested regions should primarily focus on increasing the overall amount of dead wood without considering its spatial arrangement. This means dead wood should be added wherever possible, including in local landscapes with low or high dead-wood amounts. For species that have disappeared from most forests owing to anthropogenic habitat degradation, this should, however, be complemented by specific conservation measures pursued within their extant distributional ranges. © 2017 by the Ecological Society of America.

  17. Different Amounts of DNA in Newborn Cells of Escherichia coli Preclude a Role for the Chromosome in Size Control According to the "Adder" Model.

    PubMed

    Huls, Peter G; Vischer, Norbert O E; Woldringh, Conrad L

    2018-01-01

According to the recently revived adder model for cell size control, newborn cells of Escherichia coli will grow and divide after having added a constant size or length, ΔL, irrespective of their size at birth. Assuming exponential elongation, this implies that large newborns will divide earlier than small ones. The molecular basis for the constant size increment is still unknown. As DNA replication and cell growth are coordinated, the constant ΔL could be based on duplication of an equal amount of DNA, ΔG, present in newborn cells. To test this idea, we measured amounts of DNA and lengths of nucleoids in DAPI-stained cells growing in batch culture at slow and fast rates. Deeply-constricted cells were divided into two subpopulations of longer and shorter lengths than average; these were considered to represent large and small prospective daughter cells, respectively. While at slow growth, large and small prospective daughter cells contained similar amounts of DNA, fast-growing cells with multiforked replicating chromosomes showed a significantly higher amount of DNA (20%) in the larger cells. This observation precludes the hypothesis that ΔL is based on the synthesis of a constant ΔG. Growth curves were constructed for siblings generated by asymmetric division and growing according to the adder model. Under the assumption that all cells at the same growth rate exhibit the same time between initiation of DNA replication and cell division (i.e., a constant C+D period), the constructions predict that initiation occurs at different sizes (Li) and that, at fast growth, large newborn cells transiently contain more DNA than small newborns, in accordance with the observations. Because the state of segregation, measured as the distance between separated nucleoids, was found to be more advanced in larger deeply-constricted cells, we propose that in larger newborns nucleoid separation occurs faster and at a shorter length, allowing them to divide earlier. We propose a composite model in which both differential initiation and segregation lead to an adder-like behavior of large and small newborn cells.
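The size-homeostasis property of the adder model described in this abstract can be illustrated with a minimal toy simulation (illustrative Python, not the authors' code; the length units, noise level, and increment value are assumptions): when each cell divides after adding a fixed increment to its birth length, lineages born at very different sizes converge to the same newborn size.

```python
import random

def simulate_adder(l0, delta_l=2.0, noise=0.1, generations=30, seed=1):
    """Toy 'adder' lineage: each cell adds a constant increment delta_l
    (plus a little noise) to its birth length, then divides in half.
    All parameter values here are illustrative, not from the paper."""
    rng = random.Random(seed)
    lengths = [l0]          # newborn lengths, one per generation
    l = l0
    for _ in range(generations):
        division_length = l + delta_l + rng.gauss(0.0, noise)
        l = division_length / 2.0   # newborn of the next generation
        lengths.append(l)
    return lengths

# Lineages starting very small and very large both converge toward delta_l:
small = simulate_adder(0.5)
large = simulate_adder(4.0)
```

In the noiseless case the recurrence is l' = (l + ΔL)/2, whose fixed point is ΔL, so initial size differences are halved every generation; this is why the adder rule yields size control without any explicit size sensing.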

  18. Volcanic Aerosol Radiative Properties

    NASA Technical Reports Server (NTRS)

    Lacis, Andrew

    2015-01-01

Sporadic large volcanic eruptions inject large amounts of sulfur-bearing gases into the stratosphere, which are then photochemically converted to sulfuric acid aerosol droplets that exert a radiative cooling effect on the global climate system lasting for several years.

  19. The Origin of the Terra Meridiani Sediments: Volatile Transport and the Formation of Sulfate Bearing Layered Deposits on Mars

    NASA Technical Reports Server (NTRS)

    Niles, P.B.

    2008-01-01

The chemistry, sedimentology, and geology of the Meridiani sedimentary deposits are best explained by eolian reworking of the sublimation residue of a large-scale ice/dust deposit. This large ice deposit was located in close proximity to Terra Meridiani and incorporated large amounts of dust, sand, and SO2 aerosols generated by impacts and volcanism during early martian history. Sulfate formation and chemical weathering of the initial igneous material is hypothesized to have occurred inside the ice when the darker mineral grains were heated by solar radiant energy. This created conditions in which small films of liquid water were created in and around the mineral grains. This water dissolved the SO2 and reacted with the mineral grains, forming an acidic environment under low water/rock conditions. Subsequent sublimation of this ice deposit left behind large amounts of weathered sublimation residue, which became the source material for the eolian process that deposited the Terra Meridiani deposit. The following features of the Meridiani sediments are best explained by this model: the large scale of the deposit, its mineralogic similarity across large distances, the cation-conservative nature of the weathering processes, the presence of acidic groundwaters on a basaltic planet, the accumulation of a thick sedimentary sequence outside of a topographic basin, and the low water/rock ratio needed to explain the presence of very soluble minerals and elements in the deposit. Remote sensing studies have linked the Meridiani deposits to a number of other martian surface features through mineralogic similarities, geomorphic similarities, and regional associations. These include layered deposits in Arabia Terra, interior layered deposits in the Valles Marineris system, southern Elysium/Aeolis, Amazonis Planitia, and the Hellas basin, as well as Aram Chaos, Aureum Chaos, and Iani Chaos. 
The common properties shared by these deposits suggest that all of these deposits share a common formation process which must have acted over a large area of Mars. The results of this study suggest a mechanism for volatile transport on Mars without invoking an early greenhouse. They also imply a common formation mechanism for most of the sulfate minerals and layered deposits on Mars, which explains their common occurrence.

  20. Solute-Filled Syringe For Formulating Intravenous Solution

    NASA Technical Reports Server (NTRS)

    Owens, Jim; Bindokas, AL; Dudar, Tom; Finley, Mike; Scharf, Mike

    1993-01-01

    Prefilled syringe contains premeasured amount of solute in powder or concentrate form used to deliver solute to sterile interior of large-volume parenteral (LVP) bag. Predetermined amount of sterile water also added to LVP bag through sterilizing filter, and mixed with contents of syringe, yielding sterile intravenous solution of specified concentration.

  1. An Alkalophilic Bacillus sp. Produces 2-Phenylethylamine

    PubMed Central

    Hamasaki, Nobuko; Shirai, Shinji; Niitsu, Masaru; Kakinuma, Katsumi; Oshima, Tairo

    1993-01-01

A large amount of 2-phenylethylamine was produced in cells of the alkalophilic Bacillus sp. strain YN-2000. This amine was secreted into the medium during cell growth. The amounts of 2-phenylethylamine in both the cells and the medium changed with the pH of the medium. PMID:16349025

  2. Low Reynolds number numerical solutions of chaotic flow

    NASA Technical Reports Server (NTRS)

    Pulliam, Thomas H.

    1989-01-01

Numerical computations of two-dimensional flow past an airfoil at low Mach number, large angle of attack, and low Reynolds number are reported which show a sequence of flow states leading from single-period vortex shedding to chaos via the period-doubling mechanism. Analyses of the flow in terms of phase diagrams, Poincaré sections, and flowfield variables are used to substantiate these results. The critical Reynolds number for the period-doubling bifurcations is shown to be sensitive to mesh refinement and the influence of large amounts of numerical dissipation. In extreme cases, large amounts of added dissipation can delay or completely eliminate the chaotic response. The effect of artificial dissipation at these low Reynolds numbers is to produce a new effective Reynolds number for the computations.
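The period-doubling route to chaos reported for this flow is the same scenario exhibited by the classic logistic map. The sketch below (a standard textbook toy model, unrelated to the airfoil solver in the abstract) shows the attractor's period doubling 1 → 2 → 4 before becoming chaotic as the control parameter r increases:

```python
def logistic_orbit(r, x0=0.5, transient=500, keep=64):
    """Iterate x -> r*x*(1-x), discard the transient, sample the attractor."""
    x = x0
    for _ in range(transient):
        x = r * x * (1.0 - x)
    orbit = []
    for _ in range(keep):
        x = r * x * (1.0 - x)
        orbit.append(round(x, 6))   # round so a converged cycle collapses to few values
    return orbit

def period(r):
    """Approximate period: number of distinct values visited on the attractor."""
    return len(set(logistic_orbit(r)))

# r = 2.9 -> period 1; r = 3.2 -> period 2; r = 3.5 -> period 4;
# r = 3.9 -> chaotic (many distinct values, no finite period)
```

In the flow computations, the Reynolds number plays a role analogous to r, which is why added numerical dissipation, by lowering the effective Reynolds number, can push the solution back down the period-doubling cascade.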

  3. Social modeling effects on young women's breakfast intake.

    PubMed

    Hermans, Roel C J; Herman, C Peter; Larsen, Junilla K; Engels, Rutger C M E

    2010-12-01

    Numerous studies have shown that the presence of others influences young women's food intake. They eat more when the other eats more, and eat less when the other eats less. However, most of these studies have focused on snack situations. The present study assesses the degree to which young women model the breakfast intake of a same-sex peer in a semi-naturalistic setting. The study took place in a laboratory setting at the Radboud University Nijmegen, the Netherlands, during the period January to April 2009. After completing three cover tasks, normal-weight participants (n=57) spent a 20-minute break with a peer who ate a large amount or a small amount of breakfast or no breakfast at all. The participants' total amount of energy consumed (in kilocalories) during the break was measured. An analysis of variance was used to examine whether young women modeled the breakfast intake of same-sex peers. Results indicate a main effect of breakfast condition, F(2,54)=8.44; P<0.01. Participants exposed to a peer eating nothing ate less than did participants exposed to a peer eating a small amount (d=0.85) or large amount of breakfast (d=1.23). Intake in the Small-Breakfast condition did not differ substantially from intake in the Large-Breakfast condition. The findings from the present study provide evidence that modeling effects of food intake are weaker in eating contexts in which scripts or routines guide an individual's eating behavior. Copyright © 2010 American Dietetic Association. Published by Elsevier Inc. All rights reserved.

  4. Statistical Surrogate Modeling of Atmospheric Dispersion Events Using Bayesian Adaptive Splines

    NASA Astrophysics Data System (ADS)

    Francom, D.; Sansó, B.; Bulaevskaya, V.; Lucas, D. D.

    2016-12-01

    Uncertainty in the inputs of complex computer models, including atmospheric dispersion and transport codes, is often assessed via statistical surrogate models. Surrogate models are computationally efficient statistical approximations of expensive computer models that enable uncertainty analysis. We introduce Bayesian adaptive spline methods for producing surrogate models that capture the major spatiotemporal patterns of the parent model, while satisfying all the necessities of flexibility, accuracy and computational feasibility. We present novel methodological and computational approaches motivated by a controlled atmospheric tracer release experiment conducted at the Diablo Canyon nuclear power plant in California. Traditional methods for building statistical surrogate models often do not scale well to experiments with large amounts of data. Our approach is well suited to experiments involving large numbers of model inputs, large numbers of simulations, and functional output for each simulation. Our approach allows us to perform global sensitivity analysis with ease. We also present an approach to calibration of simulators using field data.

  5. Statistical Analysis of a Large Sample Size Pyroshock Test Data Set Including Post Flight Data Assessment. Revision 1

    NASA Technical Reports Server (NTRS)

    Hughes, William O.; McNelis, Anne M.

    2010-01-01

    The Earth Observing System (EOS) Terra spacecraft was launched on an Atlas IIAS launch vehicle on its mission to observe planet Earth in late 1999. Prior to launch, the new design of the spacecraft's pyroshock separation system was characterized by a series of 13 separation ground tests. The analysis methods used to evaluate this unusually large amount of shock data will be discussed in this paper, with particular emphasis on population distributions and finding statistically significant families of data, leading to an overall shock separation interface level. The wealth of ground test data also allowed a derivation of a Mission Assurance level for the flight. All of the flight shock measurements were below the EOS Terra Mission Assurance level thus contributing to the overall success of the EOS Terra mission. The effectiveness of the statistical methodology for characterizing the shock interface level and for developing a flight Mission Assurance level from a large sample size of shock data is demonstrated in this paper.

  6. The Strasbourg Large Refractor and Dome: Significant Improvements and Failed Attempts

    NASA Astrophysics Data System (ADS)

    Heck, Andre

    2009-01-01

Founded by the German Empire in the late 19th century, Strasbourg Astronomical Observatory featured several novelties from the start. According to Mueller (1978), the separation of observing buildings from the study area and from the astronomers' residence was a revolution in observatory construction. The instruments were, as much as possible, isolated from the vibrations of the buildings themselves. "Gas flames" and water were used to reduce temperature effects. Thus the Large Dome (ca 11m diameter), housing the Large Refractor (ca 49cm, then the largest in Germany) and covered by zinc over wood, could be cooled down by water running from the top. Reports (including those by the French, who took over the observatory after World War I) on the effective usage and actual efficiency of such a system (which must have generated a significant amount of humidity locally) are, however, essentially nonexistent. The paper will detail these technical attempts as well as the specificities of the instruments installed in that new observatory, intended as a showcase of German astronomy.

  7. The Flint Animal Cancer Center (FACC) Canine Tumour Cell Line Panel: a resource for veterinary drug discovery, comparative oncology and translational medicine.

    PubMed

    Fowles, J S; Dailey, D D; Gustafson, D L; Thamm, D H; Duval, D L

    2017-06-01

    Mammalian cell tissue culture has been a critical tool leading to our current understanding of cancer including many aspects of cellular transformation, growth and response to therapies. The current use of large panels of cell lines with associated phenotypic and genotypic information now allows for informatics approaches and in silico screens to rapidly test hypotheses based on simple as well as complex relationships. Current cell line panels with large amounts of associated drug sensitivity and genomics data are comprised of human cancer cell lines (i.e. NCI60 and GDSC). There is increased recognition of the contribution of canine cancer to comparative cancer research as a spontaneous large animal model with application in basic and translational studies. We have assembled a panel of canine cancer cell lines to facilitate studies in canine cancer and report here phenotypic and genotypic data associated with these cells. © 2016 John Wiley & Sons Ltd.

  8. Development and Application of the Collaborative Optimization Architecture in a Multidisciplinary Design Environment

    NASA Technical Reports Server (NTRS)

    Braun, R. D.; Kroo, I. M.

    1995-01-01

Collaborative optimization is a design architecture applicable in any multidisciplinary analysis environment but specifically intended for large-scale distributed analysis applications. In this approach, a complex problem is hierarchically decomposed along disciplinary boundaries into a number of subproblems which are brought into multidisciplinary agreement by a system-level coordination process. When applied to problems in a multidisciplinary design environment, this scheme has several advantages over traditional solution strategies. These advantageous features include a reduced amount of information transferred between disciplines, the removal of large iteration loops, the ability to use different subspace optimizers among the various analysis groups, an analysis framework which is easily parallelized and can operate on heterogeneous equipment, and a structural framework that is well-suited to conventional disciplinary organizations. In this article, the collaborative architecture is developed and its mathematical foundation is presented. An example application is also presented which highlights the potential of this method for use in large-scale design applications.

  9. Amounts and activity concentrations of radioactive wastes from the cleanup of large areas contaminated in nuclear accidents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehto, J.; Ikaeheimonen, T.K.; Salbu, B.

The fallout from a major accident at a nuclear plant may result in wide-scale contamination of the environment. Cleanup of contaminated areas is of special importance if these areas are populated or cultivated. All cleanup measures generate high amounts of radioactive waste, which have to be treated and disposed of in a safe manner. Scenarios assessing the amounts and activity concentrations of radioactive wastes for various cleanup measures after severe nuclear accidents have been worked out for urban, forest, and agricultural areas. These scenarios are based on contamination levels and areas of contaminated lands from a model accident, which simulates a worst-case accident at a nuclear power plant. Amounts and activity concentrations of cleanup wastes depend not only on the contamination levels and areas of affected lands, but also on the type of deposition, wet or dry, on the time between the deposition and the cleanup work, on the season at which the deposition took place, and finally on the level of cleanup work. In this study practically all types of cleanup wastes were considered, whether or not the corresponding cleanup measures are cost-effective or justified. All cleanup measures are shown to create large amounts of radioactive wastes, but the amounts, as well as the activity concentrations, vary widely from case to case.

  10. A diagnostic interface for the ICOsahedral Non-hydrostatic (ICON) modelling framework based on the Modular Earth Submodel System (MESSy v2.50)

    NASA Astrophysics Data System (ADS)

    Kern, Bastian; Jöckel, Patrick

    2016-10-01

Numerical climate and weather models have advanced to finer scales, accompanied by large amounts of output data. The model systems hit the input and output (I/O) bottleneck of modern high-performance computing (HPC) systems. We aim to apply diagnostic methods online during the model simulation instead of applying them as a post-processing step to written output data, to reduce the amount of I/O. To include diagnostic tools into the model system, we implemented a standardised, easy-to-use interface based on the Modular Earth Submodel System (MESSy) into the ICOsahedral Non-hydrostatic (ICON) modelling framework. The integration of the diagnostic interface into the model system is briefly described. Furthermore, we present a prototype implementation of an advanced online diagnostic tool for the aggregation of model data onto a user-defined regular coarse grid. This diagnostic tool will be used to reduce the amount of model output in future simulations. Performance tests of the interface and of two different diagnostic tools show that the interface itself introduces no overhead in the form of additional runtime to the model system. The diagnostic tools, however, have a significant impact on the model system's runtime. This overhead strongly depends on the characteristics and implementation of the diagnostic tool. A diagnostic tool with high inter-process communication introduces large overhead, whereas the additional runtime of a diagnostic tool without inter-process communication is low. We briefly describe our efforts to reduce the additional runtime of the diagnostic tools, and present a brief analysis of memory consumption. Future work will focus on optimisation of the memory footprint and the I/O operations of the diagnostic interface.
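The coarse-grid aggregation described in this abstract can be sketched in miniature as block averaging onto a coarser regular grid. This is a minimal illustrative example only, assuming a regular Cartesian grid for simplicity (the actual ICON grid is icosahedral, and the function name is hypothetical):

```python
import numpy as np

def block_average(field, factor):
    """Aggregate a 2-D field onto a coarser regular grid by averaging
    factor x factor blocks. Illustrative stand-in for an online
    coarse-grid diagnostic; not the MESSy/ICON implementation."""
    ny, nx = field.shape
    assert ny % factor == 0 and nx % factor == 0, "grid must tile evenly"
    return field.reshape(ny // factor, factor, nx // factor, factor).mean(axis=(1, 3))

fine = np.arange(16.0).reshape(4, 4)   # toy 4x4 "model output"
coarse = block_average(fine, 2)        # 2x2 field of block means
```

Performing this reduction online, before any data are written, is what shrinks the output volume: a factor-2 coarsening in each horizontal dimension cuts the written field size by four.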

  11. Source parameters controlling the generation and propagation of potential local tsunamis along the cascadia margin

    USGS Publications Warehouse

    Geist, E.; Yoshioka, S.

    1996-01-01

The largest uncertainty in assessing hazards from local tsunamis along the Cascadia margin is estimating the possible earthquake source parameters. We investigate which source parameters exert the largest influence on tsunami generation and determine how each parameter affects the amplitude of the local tsunami. The following source parameters were analyzed: (1) type of faulting characteristic of the Cascadia subduction zone, (2) amount of slip during rupture, (3) slip orientation, (4) duration of rupture, (5) physical properties of the accretionary wedge, and (6) influence of secondary faulting. The effect of each of these source parameters on the quasi-static displacement of the ocean floor is determined by using elastic three-dimensional, finite-element models. The propagation of the resulting tsunami is modeled both near the coastline, using the two-dimensional (x-t) Peregrine equations that include the effects of dispersion, and near the source, using the three-dimensional (x-y-t) linear long-wave equations. The source parameters that have the largest influence on local tsunami excitation are the shallowness of rupture and the amount of slip. In addition, the orientation of slip has a large effect on the directivity of the tsunami, especially for shallow dipping faults, which consequently has a direct influence on the length of coastline inundated by the tsunami. Duration of rupture, physical properties of the accretionary wedge, and secondary faulting all affect the excitation of tsunamis but to a lesser extent than the shallowness of rupture and the amount and orientation of slip. Assessment of the severity of the local tsunami hazard should take into account that relatively large tsunamis can be generated from anomalous 'tsunami earthquakes' that rupture within the accretionary wedge, in comparison to interplate thrust earthquakes of similar magnitude. © 1996 Kluwer Academic Publishers.

  12. Fire history reconstruction in grassland ecosystems: amount of charcoal reflects local area burned

    NASA Astrophysics Data System (ADS)

    Leys, Bérangère; Brewer, Simon C.; McConaghy, Scott; Mueller, Joshua; McLauchlan, Kendra K.

    2015-11-01

Fire is one of the most prevalent disturbances in the Earth system, and its past characteristics can be reconstructed using charcoal particles preserved in depositional environments. Although researchers know that fires produce charcoal particles, interpretation of the quantity or composition of charcoal particles in terms of fire source remains poorly understood. In this study, we used a unique four-year dataset of charcoal deposited in traps from a native tallgrass prairie in mid-North America to test which environmental factors were linked to charcoal measurements on three spatial scales. We investigated small and large charcoal particles commonly used as a proxy of fire activity at different spatial scales, and charcoal morphotypes representing different types of fuel. We found that small (125-250 μm) and large (250 μm-1 mm) particles of charcoal are well-correlated (Spearman correlation = 0.88) and likely reflect the same spatial scale of fire activity in a system with both herbaceous and woody fuels. There was no significant relationship between charcoal pieces and fire parameters <500 m from the traps. Moreover, local area burned (<5 km radius from the traps) explained the total charcoal amount, and regional burning (200 km radius from the traps) explained the ratio of non-arboreal to total charcoal (NA/T ratio). Charcoal variables, including total charcoal count and the NA/T ratio, did not correlate with other fire parameters, vegetation cover, landscape, or climate variables. Thus, in long-term studies that involve fire history reconstructions, total charcoal particles, even of a small size (125-250 μm), could be an indicator of local area burned. Further studies may determine relationships among the amount of charcoal recorded, fire intensity, vegetation cover, and climatic parameters.

  13. The effects of moderately high temperature on zeaxanthin accumulation and decay.

    PubMed

    Zhang, Ru; Kramer, David M; Cruz, Jeffrey A; Struck, Kimberly R; Sharkey, Thomas D

    2011-09-01

Moderately high temperature reduces the photosynthetic capacities of leaves, with large effects on thylakoid reactions of photosynthesis, including xanthophyll conversion in the lipid phase of the thylakoid membrane. In previous studies, we found that a leaf temperature of 40°C increased zeaxanthin accumulation in dark-adapted, intact tobacco leaves following a brief illumination, but did not change the amount of zeaxanthin in light-adapted leaves. To investigate heat effects on zeaxanthin accumulation and decay, zeaxanthin level was monitored optically in dark-adapted, intact tobacco and Arabidopsis thaliana leaves at either 23 or 40°C under 45-min illumination. Heated leaves had more zeaxanthin following 3 min of light but had lesser or comparable amounts of zeaxanthin by the end of 45 min of illumination. Zeaxanthin accumulated faster at light initiation and decayed faster upon darkening in leaves at 40°C than in leaves at 23°C, indicating that heat increased the activities of both violaxanthin de-epoxidase (VDE) and zeaxanthin epoxidase (ZE). In addition, our optical measurement demonstrated in vivo that weak light enhances zeaxanthin decay relative to darkness in intact leaves of tobacco and Arabidopsis, confirming previous observations in isolated spinach chloroplasts. However, the maximum rate of decay is similar for weak light and darkness, and we used the maximum rate of decay following darkness as a measure of the rate of ZE during steady-state light. A simulation indicated that high temperature should cause a large shift in the pH dependence of the amount of zeaxanthin in leaves because of differential effects on VDE and ZE. This allows the reduction in ΔpH caused by heat to be offset by increased VDE activity relative to ZE.

  14. Validation of a simplified food frequency questionnaire for the assessment of dietary habits in Iranian adults: Isfahan Healthy Heart Program, Iran.

    PubMed

    Mohammadifard, Noushin; Sajjadi, Firouzeh; Maghroun, Maryam; Alikhasi, Hassan; Nilforoushzadeh, Farzaneh; Sarrafzadegan, Nizal

    2015-03-01

Dietary assessment is the first step of dietary modification in community-based interventional programs. This study was performed to validate a simple food frequency questionnaire (SFFQ) for the assessment of selected food items in epidemiological studies with large sample sizes as well as community trials. This validation study was carried out on 264 healthy adults aged ≥ 41 years living in 3 central districts of Iran, including Isfahan, Najafabad, and Arak. Selected food intakes were assessed using a 48-item food frequency questionnaire (FFQ). The FFQ was interviewer-administered and was completed twice: at the beginning of the study and 2 weeks thereafter. The validity of this SFFQ was examined against the amounts estimated by a single 24 h dietary recall and a 2-day dietary record. Validity was evaluated using Spearman correlation coefficients between the daily frequency of food-group consumption as assessed by the FFQ and the daily food-group intake as assessed by the dietary reference methods. Intraclass correlation coefficients (ICC) were used to determine reproducibility. Spearman correlation coefficients between the amounts of food-group intake estimated by the examined and reference methods ranged from 0.105 (P = 0.378) for pickles to 0.48 (P < 0.001) for plant protein. ICCs for the reproducibility of the FFQ were between 0.47 and 0.69 for the different food groups (P < 0.001). The designed SFFQ has good relative validity and reproducibility for the assessment of selected food-group intakes. Thus, it can serve as a valid tool in epidemiological studies and clinical trials with large numbers of participants.

  15. Eleuthera Island, Bahamas seen from STS-66

    NASA Image and Video Library

    1994-11-14

The striking views provided by the Bahama Islands lend insights into the important problems of limestone (CaCO3) production and transport. This photograph includes the southern part of Eleuthera Island in the northern Bahamas. The hook-shaped island encloses a relatively shallow platform (light blue) which is surrounded by deep water (dark blue). The feathery patterns along the western edge of Eleuthera's platform are sand bars and sand channels created by tidal currents sweeping on and off the platform. The channels serve to funnel large amounts of CaCO3 off the platform and into the deeper water.

  16. MIT solar wind plasma data from Explorer 33 and Explorer 35: July 1966 to September 1970

    NASA Technical Reports Server (NTRS)

    Howe, H.; Binsack, J.; Wang, C.; Clapp, E.

    1971-01-01

    The plasma experiments on Explorer 33 and Explorer 35 have yielded large amounts of solar wind data. This report gives a brief review of the method used to obtain the data, provides a description of the plasma parameters, and describes in detail the format of the plots and tapes which are available from the Data Center. Hourly average plots of the data are included at the end of the report. From these plots, the availability and interest of the solar wind data for any period of time may be determined.

  17. Genetic Engineering of Alfalfa (Medicago sativa L.).

    PubMed

    Wang, Dan; Khurshid, Muhammad; Sun, Zhan Min; Tang, Yi Xiong; Zhou, Mei Liang; Wu, Yan Min

    2016-01-01

    Alfalfa is an excellent perennial legume forage, valued for its extensive ecological adaptability, high nutritional value, palatability, and biological nitrogen fixation. It plays a very important role in agriculture, animal husbandry, and ecological construction, and it is cultivated on all continents. With the development of modern plant breeding and genetic engineering techniques, a large amount of work has been carried out on alfalfa. Here we summarize recent research advances in the genetic engineering of alfalfa, including transformation, quality improvement, stress resistance, and use as a bioreactor. This review enables us to understand the research methods, directions, and achievements of genetic engineering in alfalfa.

  18. Conceptualizing recovery capital: expansion of a theoretical construct.

    PubMed

    Cloud, William; Granfield, Robert

    2008-01-01

    In order to capture key personal and social resources individuals are able to access in their efforts to overcome substance misuse, we introduced the construct of recovery capital into the literature. The purpose of this paper is to further explore the construct and include discussions of implications unexplored in our previous writings. In this paper we reveal the relationship between access to large amounts of recovery capital and substance misuse maintenance and introduce the concept of negative recovery capital. In doing so, we examine the relationships between negative recovery capital and gender, age, health, mental health, and incarceration.

  19. Nanomaterials-Based Optical Techniques for the Detection of Acetylcholinesterase and Pesticides

    PubMed Central

    Xia, Ning; Wang, Qinglong; Liu, Lin

    2015-01-01

    The large amount of pesticide residues in the environment is a threat to global health by inhibition of acetylcholinesterase (AChE). Biosensors for inhibition of AChE have been thus developed for the detection of pesticides. In line with the rapid development of nanotechnology, nanomaterials have attracted great attention and have been intensively studied in biological analysis due to their unique chemical, physical and size properties. The aim of this review is to provide insight into nanomaterial-based optical techniques for the determination of AChE and pesticides, including colorimetric and fluorescent assays and surface plasmon resonance. PMID:25558991

  20. Acculturation stress among Maya in the United States.

    PubMed

    Millender, Eugenia

    2012-01-01

    As health care disparities become more evident in our multicultural nation, culture-sensitive health research needs to be a priority in order for good health care to take place. This article explores the literature related to acculturation stress and mental health disparities among the Maya population. Literature on similar but distinct groups is included because of the limited amount of research on the Maya population. Using Leininger's transcultural nursing theory, these findings suggest that nurses have a large gap to fill in addressing the mental health disparities of specific cultural groups like the indigenous Maya, thereby satisfying their nursing obligations.

  1. The role of black holes in galaxy formation and evolution.

    PubMed

    Cattaneo, A; Faber, S M; Binney, J; Dekel, A; Kormendy, J; Mushotzky, R; Babul, A; Best, P N; Brüggen, M; Fabian, A C; Frenk, C S; Khalatyan, A; Netzer, H; Mahdavi, A; Silk, J; Steinmetz, M; Wisotzki, L

    2009-07-09

    Virtually all massive galaxies, including our own, host central black holes ranging in mass from millions to billions of solar masses. The growth of these black holes releases vast amounts of energy that powers quasars and other weaker active galactic nuclei. A tiny fraction of this energy, if absorbed by the host galaxy, could halt star formation by heating and ejecting ambient gas. A central question in galaxy evolution is the degree to which this process has caused the decline of star formation in large elliptical galaxies, which typically have little cold gas and few young stars, unlike spiral galaxies.

  2. Hevea Linamarase—A Nonspecific β-Glycosidase 1

    PubMed Central

    Selmar, Dirk; Lieberei, Reinhard; Biehl, Böle; Voigt, Jürgen

    1987-01-01

    In the leaf tissue of the cyanogenic plant Hevea brasiliensis, which contains large amounts of linamarin, there is no specific linamarase. In Hevea leaves only one β-glucosidase is detectable. It is responsible for the cleavage of all β-glucosides and β-galactosides occurring in Hevea leaf tissue, including the cyanogenic glucoside linamarin. Therefore, the enzyme is referred to as a β-glycosidase rather than a β-glucosidase. This β-glycosidase has a broad substrate spectrum and occurs in multiple forms. These homo-oligomeric forms are interconvertible by dissociation-association processes. The monomer is a single protein of 64 kilodaltons. PMID:16665288

  3. Use of optimization to predict the effect of selected parameters on commuter aircraft performance

    NASA Technical Reports Server (NTRS)

    Wells, V. L.; Shevell, R. S.

    1982-01-01

    An optimizing computer program determined the turboprop aircraft with lowest direct operating cost for various sets of cruise speed and field length constraints. External variables included wing area, wing aspect ratio and engine sea level static horsepower; tail sizes, climb speed and cruise altitude were varied within the function evaluation program. Direct operating cost was minimized for a 150 n.mi typical mission. Generally, DOC increased with increasing speed and decreasing field length but not by a large amount. Ride roughness, however, increased considerably as speed became higher and field length became shorter.

  4. High-temperature ductility of electro-deposited nickel

    NASA Technical Reports Server (NTRS)

    Dini, J. W.; Johnson, H. R.

    1977-01-01

    Work done during the past several months on high temperature ductility of electrodeposited nickel is summarized. Data are presented which show that earlier measurements made at NASA-Langley erred on the low side, that strain rate has a marked influence on high temperature ductility, and that codeposition of a small amount of manganese helps to improve high temperature ductility. Influences of a number of other factors on nickel properties were also investigated. They included plating solution temperature, current density, agitation, and elimination of the wetting agent from the plating solution. Repair of a large nozzle section by nickel plating is described.

  5. Realizing the financial benefits of capitation arbitrage.

    PubMed

    Sussman, A J; Fairchild, D G; Colling, M C; Brennan, T A

    1999-11-01

    By anticipating the arbitrage potential of cash flow under budgeted capitation, healthcare organizations can make the best use of cash flow as a revenue-generating resource. Factors that determine the magnitude of the benefits for providers and insurers include settlement interval, withhold amount, which party controls the withhold, and incurred-but-not-reported expenses. In choosing how to structure these factors in their contract negotiations, providers and insurers should carefully assess whether capitation surpluses or deficits can be expected from the provider. In both instances, the recipient and magnitude of capitation arbitrage benefits are dictated largely by the performance of the provider.

  6. Review of capital investment in economic growth cycle

    NASA Astrophysics Data System (ADS)

    Shaffie, Siti Salihah; Jaaman, Saiful Hafizah; Mohamad, Daud

    2016-11-01

    The study of linkages among macroeconomic factors is prominent in understanding how the elements of the economic cycle affect one another. These factors include the interest rate, growth rate, saving, and capital investment, which are mutually correlated to stabilize GDP. As part of this study, the impact of capital investment on economic growth is examined, with emphasis on the efficiency of capital investment. Capital investment is one form of investment appraisal that affects economic growth; it is a long-term investment and involves a large amount of capital to support the development of private and public capital investment.

  7. [A Successful Treatment of Locally Advanced Breast Cancer with Using Mohs' Paste and Chemotherapy - A Case Report].

    PubMed

    Tsubota, Yu; Yamamoto, Daigo; Ishizuka, Mariko; Yoshikawa, Katsuhiro; Sueoka, Noriko; Kon, Masanori

    2018-04-01

    Foul smell, large amounts of exudate, and bleeding are the most common and serious symptoms of locally advanced breast cancer (LABC). Mohs' paste is made of a mixture containing zinc chloride and is used for the treatment of malignant skin tumors. Recently, some reports have shown that Mohs' paste is useful for the treatment of malignant tumors, including unresectable breast cancer and skin metastases of cancer. Mohs' paste is useful for reducing symptoms such as foul smell, exudate, and bleeding. We report a successful case of treatment of LABC using Mohs' paste, chemotherapy, and surgery.

  8. Research on grid connection control technology of double fed wind generator

    NASA Astrophysics Data System (ADS)

    Ling, Li

    2017-01-01

    The composition and working principle of a variable-speed constant-frequency doubly fed wind power generation system are discussed in this thesis. On the basis of theoretical analysis and control modeling, a doubly fed wind power generation simulation control system was designed based on a TMS320F2407 digital signal processor (DSP), and a large amount of experimental research was carried out, mainly including variable-speed constant-frequency, constant-voltage, and grid-connected control experiments. The running results show that the design of the simulation control system is reasonable and can meet the needs of experimental research.

  9. The Kepler End-to-End Model: Creating High-Fidelity Simulations to Test Kepler Ground Processing

    NASA Technical Reports Server (NTRS)

    Bryson, Stephen T.; Jenkins, Jon M.; Peters, Dan J.; Tenenbaum, Peter P.; Klaus, Todd C.; Gunter, Jay P.; Cote, Miles T.; Caldwell, Douglas A.

    2010-01-01

    The Kepler mission is designed to detect the transit of Earth-like planets around Sun-like stars by observing 100,000 stellar targets. Developing and testing the Kepler ground-segment processing system, in particular the data analysis pipeline, requires high-fidelity simulated data. This simulated data is provided by the Kepler End-to-End Model (ETEM). ETEM simulates the astrophysics of planetary transits and other phenomena, properties of the Kepler spacecraft and the format of the downlinked data. Major challenges addressed by ETEM include the rapid production of large amounts of simulated data, extensibility and maintainability.

  10. Viral Hemorrhagic Fever Diagnostics

    PubMed Central

    Racsa, Lori D.; Kraft, Colleen S.; Olinger, Gene G.; Hensley, Lisa E.

    2016-01-01

    There are 4 families of viruses that cause viral hemorrhagic fever (VHF), including Filoviridae. Ebola virus is one virus within the family Filoviridae and the cause of the current outbreak of VHF in West Africa. VHF-endemic areas are found throughout the world, yet traditional diagnosis of VHF has been performed in large reference laboratories centered in Europe and the United States. The large amount of capital needed, as well as highly trained and skilled personnel, has limited the availability of diagnostics in endemic areas except in conjunction with governmental and nongovernmental entities. However, rapid diagnosis of VHF is essential to efforts that will limit outbreaks. In addition, increased global travel suggests VHF diagnoses may be made outside of the endemic areas. Thus, understanding how to diagnose VHF is imperative for laboratories worldwide. This article reviews traditional and current diagnostic modalities for VHF. PMID:26354968

  11. Sample presentation, sources of error and future perspectives on the application of vibrational spectroscopy in the wine industry.

    PubMed

    Cozzolino, Daniel

    2015-03-30

    Vibrational spectroscopy encompasses a number of techniques and methods, including ultraviolet, visible, Fourier transform infrared (mid infrared), near infrared, and Raman spectroscopy. The use of spectroscopy generates spectra containing hundreds of variables (absorbances at each wavenumber or wavelength), resulting in large data sets representing the chemical and biochemical wine fingerprint. Multivariate data analysis techniques are then required to handle the large amount of data generated, interpret the spectra in a meaningful way, and develop a specific application. This paper focuses on developments in sample presentation and the main sources of error when vibrational spectroscopy methods are applied in wine analysis. Recent and novel applications are discussed as examples of these developments. © 2014 Society of Chemical Industry.
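    As a toy illustration of the kind of multivariate analysis the review refers to, the first principal component of a small spectra-like matrix can be extracted by power iteration on its covariance matrix. The numbers are invented; real applications involve hundreds of wavelengths and dedicated chemometrics software:

```python
def first_principal_component(rows, iterations=200):
    """Leading principal component of a small data matrix (list of rows),
    found by power iteration on the sample covariance matrix."""
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    centered = [[r[j] - means[j] for j in range(d)] for r in rows]
    # Sample covariance matrix (d x d).
    cov = [[sum(c[i] * c[j] for c in centered) / (n - 1) for j in range(d)]
           for i in range(d)]
    v = [1.0] + [0.0] * (d - 1)          # arbitrary start vector
    for _ in range(iterations):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Two strongly correlated "absorbance" channels: the leading component
# should point close to the diagonal direction (1, 1)/sqrt(2).
data = [[1.0, 1.0], [2.0, 2.1], [3.0, 2.9], [4.0, 4.0]]
pc1 = first_principal_component(data)
```

    Projecting each spectrum onto a few such components is the dimensionality-reduction step that makes the subsequent calibration or classification tractable.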

  12. Biotin: From Nutrition to Therapeutics.

    PubMed

    Mock, Donald M

    2017-08-01

    Although frank symptomatic biotin deficiency is rare, some evidence suggests that marginal biotin deficiency occurs spontaneously in a substantial proportion of women during normal human pregnancy and might confer an increased risk of birth defects. Herein I review 1 ) advances in assessing biotin status, including the relation between acylcarnitine excretion and biotin status; 2 ) recent studies of biotin status in pregnancy; 3 ) advances in understanding the role of biotin in gene expression and the potential roles of biotinylated proteins that are neither histones nor carboxylases; and 4 ) novel large-dose biotin supplementation as therapy for multiple sclerosis. The review concludes with a summary of recent studies that have reported potentially dangerous erroneous results in individuals consuming large amounts of biotin for measurements of various plasma hormones for common clinical assays that use streptavidin-biotin technology. © 2017 American Society for Nutrition.

  13. Facilitating access to information in large documents with an intelligent hypertext system

    NASA Technical Reports Server (NTRS)

    Mathe, Nathalie

    1993-01-01

    Retrieving specific information from large amounts of documentation is not an easy task. It could be facilitated if information relevant in the current problem solving context could be automatically supplied to the user. As a first step towards this goal, we have developed an intelligent hypertext system called CID (Computer Integrated Documentation) and tested it on the Space Station Freedom requirement documents. The CID system enables integration of various technical documents in a hypertext framework and includes an intelligent context-sensitive indexing and retrieval mechanism. This mechanism utilizes on-line user information requirements and relevance feedback either to reinforce current indexing in case of success or to generate new knowledge in case of failure. This allows the CID system to provide helpful responses, based on previous usage of the documentation, and to improve its performance over time.

  14. Light baryon spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crede, Volker

    The spectrum of excited baryons serves as an excellent probe of quantum chromodynamics (QCD). In particular, highly-excited baryon resonances are sensitive to the details of quark confinement, which is only poorly understood within QCD. Facilities worldwide such as Jefferson Lab, ELSA, and MAMI, which study the systematics of hadron spectra in photo- and electroproduction experiments, have accumulated a large amount of data in recent years including unpolarized cross section and polarization data for a large variety of meson-production reactions. These are important steps toward complete experiments that will allow us to unambiguously determine the scattering amplitude in the underlying reactions and to identify the broad and overlapping baryon resonance contributions. Several new nucleon resonances have been proposed and changes to the baryon listing in the 2012 Review of Particle Physics reflect the progress in the field.

  15. Biofuels done right: land efficient animal feeds enable large environmental and energy benefits.

    PubMed

    Dale, Bruce E; Bals, Bryan D; Kim, Seungdo; Eranki, Pragnya

    2010-11-15

    There is an intense ongoing debate regarding the potential scale of biofuel production without creating adverse effects on food supply. We explore the possibility of three land-efficient technologies for producing food (actually animal feed), including leaf protein concentrates, pretreated forages, and double crops to increase the total amount of plant biomass available for biofuels. Using less than 30% of total U.S. cropland, pasture, and range, 400 billion liters of ethanol can be produced annually without decreasing domestic food production or agricultural exports. This approach also reduces U.S. greenhouse gas emissions by 670 Tg CO₂-equivalent per year, or over 10% of total U.S. annual emissions, while increasing soil fertility and promoting biodiversity. Thus we can replace a large fraction of U.S. petroleum consumption without indirect land use change.

  16. TIMESERIESSTREAMING.VI: LabVIEW program for reliable data streaming of large analog time series

    NASA Astrophysics Data System (ADS)

    Czerwinski, Fabian; Oddershede, Lene B.

    2011-02-01

    With modern data acquisition devices that work fast and very precisely, scientists often face the task of dealing with huge amounts of data. These need to be rapidly processed and stored onto a hard disk. We present a LabVIEW program which reliably streams analog time series at MHz sampling rates. Its run time has virtually no limitation. We explicitly show how to use the program to extract time series from two experiments: for a photodiode detection system that tracks the position of an optically trapped particle, and for a measurement of ionic current through a glass capillary. The program is easy to use and versatile, as the input can be any type of analog signal. Also, the data streaming software is simple, highly reliable, and can be easily customized to include, e.g., real-time power spectral analysis and Allan variance noise quantification.
    Program summary: Program title: TimeSeriesStreaming.VI. Catalogue identifier: AEHT_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHT_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 250. No. of bytes in distributed program, including test data, etc.: 63 259. Distribution format: tar.gz. Programming language: LabVIEW (http://www.ni.com/labview/). Computer: any machine running LabVIEW 8.6 or higher. Operating system: Windows XP and Windows 7. RAM: 60-360 Mbyte. Classification: 3.
    Nature of problem: For numerous scientific and engineering applications, it is highly desirable to have an efficient, reliable, and flexible program to perform data streaming of time series sampled at high frequencies, possibly over long time intervals. This type of data acquisition often produces very large amounts of data not easily streamed onto a computer hard disk using standard methods.
    Solution method: This LabVIEW program is developed to directly stream any kind of time series onto a hard disk. Owing to optimized timing and usage of computational resources, such as multiple cores and protocols for memory usage, the program provides extremely reliable data acquisition. In particular, it is optimized to deal with large amounts of data, e.g., taken at high sampling frequencies and over long time intervals, and can be easily customized for time-series analyses.
    Restrictions: Only tested in Windows-operating LabVIEW environments; must use the TDMS format; acquisition cards must be LabVIEW compatible, with the DAQmx driver installed.
    Running time: As desired: microseconds to hours.

  17. Faster growth in warmer winters for large trees in a Mediterranean-climate ecosystem

    Treesearch

    Seth W. Bigelow; Michael J. Papaik; Caroline Caum; Malcolm P. North

    2014-01-01

    Large trees (>76 cm breast-height diameter) are vital components of Sierra Nevada/Cascades mixed-conifer ecosystems because of their fire resistance, ability to sequester large amounts of carbon, and role as preferred habitat for sensitive species such as the California spotted owl. To investigate the likely performance of large trees in a rapidly changing...

  18. Support Vector Machines Trained with Evolutionary Algorithms Employing Kernel Adatron for Large Scale Classification of Protein Structures.

    PubMed

    Arana-Daniel, Nancy; Gallegos, Alberto A; López-Franco, Carlos; Alanís, Alma Y; Morales, Jacob; López-Franco, Adriana

    2016-01-01

    With the increasing power of computers, the amount of data that can be processed in short periods of time has grown exponentially, as has the importance of classifying large-scale data efficiently. Support vector machines have shown good results classifying large amounts of high-dimensional data, such as data generated by protein structure prediction, spam recognition, medical diagnosis, optical character recognition, text classification, etc. Most state-of-the-art approaches for large-scale learning use traditional optimization methods, such as quadratic programming or gradient descent, which makes the use of evolutionary algorithms for training support vector machines an area to be explored. The present paper proposes an approach, simple to implement, based on evolutionary algorithms and Kernel-Adatron for solving large-scale classification problems, focusing on protein structure prediction. The functional properties of proteins depend upon their three-dimensional structures. Knowing the structures of proteins is crucial for biology and can lead to improvements in areas such as medicine, agriculture, and biofuels.
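    The Kernel-Adatron scheme that the authors hybridize with evolutionary search can be sketched in its basic coordinate-ascent form. This is a minimal pure-Python version with an RBF kernel and invented toy data; the paper's actual evolutionary training loop is not reproduced here:

```python
import math

def rbf(a, b, sigma=1.0):
    """Gaussian (RBF) kernel between two points."""
    d2 = sum((x - y) ** 2 for x, y in zip(a, b))
    return math.exp(-d2 / (2.0 * sigma ** 2))

def kernel_adatron(xs, ys, eta=0.1, epochs=200):
    """Basic Kernel-Adatron: additive updates drive each dual variable
    alpha_i toward margin y_i * z_i = 1, clipped at alpha_i >= 0."""
    n = len(xs)
    k = [[rbf(xs[i], xs[j]) for j in range(n)] for i in range(n)]
    alpha = [0.0] * n
    for _ in range(epochs):
        for i in range(n):
            z = sum(alpha[j] * ys[j] * k[i][j] for j in range(n))
            alpha[i] = max(0.0, alpha[i] + eta * (1.0 - ys[i] * z))
    return alpha

def predict(x, xs, ys, alpha):
    s = sum(alpha[j] * ys[j] * rbf(x, xs[j]) for j in range(len(xs)))
    return 1 if s >= 0 else -1

# Two well-separated toy clusters with labels +1 / -1.
xs = [(0.0, 0.0), (0.2, 0.1), (-0.1, 0.2),
      (3.0, 3.0), (3.2, 2.9), (2.8, 3.1)]
ys = [1, 1, 1, -1, -1, -1]
alpha = kernel_adatron(xs, ys)
```

    The appeal of Kernel-Adatron over quadratic programming is exactly this simplicity: a perceptron-like update in the kernel-induced feature space, which is also what makes it amenable to evolutionary tuning.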

  19. Implementing Parquet equations using HPX

    NASA Astrophysics Data System (ADS)

    Kellar, Samuel; Wagle, Bibek; Yang, Shuxiang; Tam, Ka-Ming; Kaiser, Hartmut; Moreno, Juana; Jarrell, Mark

    A new C++ runtime system (HPX) enables simulations of complex systems to run more efficiently on parallel and heterogeneous systems. This increased efficiency allows for solutions to larger simulations of the parquet approximation for a system with impurities. The relevance of the parquet equations depends upon the ability to solve systems which require long runs and large amounts of memory. These limitations, in addition to numerical complications arising from the stability of the solutions, necessitate running on large distributed systems. As computational resources trend toward the exascale and the limitations they impose vanish, the efficiency of large-scale simulations becomes a focus. HPX facilitates efficient simulations through intelligent overlapping of computation and communication. Simulations such as the parquet equations, which require the transfer of large amounts of data, should benefit from HPX implementations. Supported by the NSF EPSCoR Cooperative Agreement No. EPS-1003897 with additional support from the Louisiana Board of Regents.

  20. A Cerebellar-model Associative Memory as a Generalized Random-access Memory

    NASA Technical Reports Server (NTRS)

    Kanerva, Pentti

    1989-01-01

    A versatile neural-net model is explained in terms familiar to computer scientists and engineers. It is called the sparse distributed memory, and it is a random-access memory for very long words (for patterns with thousands of bits). Its potential utility is the result of several factors: (1) a large pattern representing an object or a scene or a moment can encode a large amount of information about what it represents; (2) this information can serve as an address to the memory, and it can also serve as data; (3) the memory is noise tolerant--the information need not be exact; (4) the memory can be made arbitrarily large and hence an arbitrary amount of information can be stored in it; and (5) the architecture is inherently parallel, allowing large memories to be fast. Such memories can become important components of future computers.
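    Kanerva's sparse distributed memory can be sketched directly: a fixed set of random hard locations, each holding a vector of counters; a write increments or decrements the counters of every location within a Hamming radius of the write address, and a read thresholds the summed counters of the activated locations. The parameters below are illustrative and far smaller than the thousand-bit patterns the model targets:

```python
import random

# Minimal sparse distributed memory sketch (illustrative parameters).
random.seed(0)
N_BITS = 256      # pattern/address length in bits
N_LOC = 200       # number of random hard locations
RADIUS = 128      # activation radius in Hamming distance

addresses = [[random.getrandbits(1) for _ in range(N_BITS)]
             for _ in range(N_LOC)]
counters = [[0] * N_BITS for _ in range(N_LOC)]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def activated(addr):
    """Indices of hard locations within RADIUS of the given address."""
    return [i for i in range(N_LOC) if hamming(addresses[i], addr) <= RADIUS]

def write(addr, data):
    """Add the data pattern (as +1/-1) into every activated location."""
    for i in activated(addr):
        for j in range(N_BITS):
            counters[i][j] += 1 if data[j] else -1

def read(addr):
    """Majority vote over the counters of the activated locations."""
    act = activated(addr)
    return [1 if sum(counters[i][j] for i in act) > 0 else 0
            for j in range(N_BITS)]

# A pattern can serve as its own address (autoassociative use):
pattern = [random.getrandbits(1) for _ in range(N_BITS)]
write(pattern, pattern)
recalled = read(pattern)
```

    This directly mirrors the properties listed in the abstract: the pattern is both address and data, the majority vote gives noise tolerance, and each location operates independently, which is what makes the architecture inherently parallel.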

  1. Ion Heating During Local Helicity Injection Plasma Startup in the Pegasus ST

    NASA Astrophysics Data System (ADS)

    Burke, M. G.; Barr, J. L.; Bongard, M. W.; Fonck, R. J.; Hinson, E. T.; Perry, J. M.; Reusch, J. A.

    2015-11-01

    Plasmas in the Pegasus ST are initiated either through standard, MHD stable, inductive current drive or non-solenoidal local helicity injection (LHI) current drive with strong reconnection activity, providing a rich environment to study ion dynamics. During LHI discharges, a large amount of impurity ion heating has been observed, with the passively measured impurity Ti as high as 800 eV compared to Ti ~ 60 eV and Te ~ 175 eV during standard inductive current drive discharges. In addition, non-thermal ion velocity distributions are observed and appear to be strongest near the helicity injectors. The ion heating is hypothesized to be a result of large-scale magnetic reconnection activity, as the amount of heating scales with increasing fluctuation amplitude of the dominant, edge localized, n =1 MHD mode. An approximate temporal scaling of the heating with the amplitude of higher frequency magnetic fluctuations has also been observed, with large amounts of power spectral density present at several impurity ion cyclotron frequencies. Recent experiments have focused on investigating the impurity ion heating scaling with the ion charge to mass ratio as well as the reconnecting field strength. The ion charge to mass ratio was modified by observing different impurity charge states in similar LHI plasmas while the reconnecting field strength was modified by changing the amount of injected edge current. Work supported by US DOE grant DE-FG02-96ER54375.

  2. Large-Eddy Simulation of Shallow Cumulus over Land: A Composite Case Based on ARM Long-Term Observations at Its Southern Great Plains Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yunyan; Klein, Stephen A.; Fan, Jiwen

    Based on long-term observations by the Atmospheric Radiation Measurement program at its Southern Great Plains site, a new composite case of continental shallow cumulus (ShCu) convection is constructed for large-eddy simulations (LES) and single-column models. The case represents a typical daytime nonprecipitating ShCu whose formation and dissipation are driven by the local atmospheric conditions and land surface forcing and are not influenced by synoptic weather events. The case includes early morning initial profiles of temperature and moisture with a residual layer; diurnally varying sensible and latent heat fluxes, which represent a domain average over different land surface types; simplified large-scale horizontal advective tendencies and subsidence; and horizontal winds with prevailing direction and average speed. Observed composite cloud statistics are provided for model evaluation. The observed diurnal cycle is well reproduced by LES; however, the cloud amount, liquid water path, and shortwave radiative effect are generally underestimated. LES are compared between simulations with an all-or-nothing bulk microphysics and a spectral bin microphysics. The latter shows improved agreement with observations in the total cloud cover and the amount of clouds with depths greater than 300 m. When compared with radar retrievals of in-cloud air motion, LES produce comparable downdraft vertical velocities, but a larger updraft area, velocity, and updraft mass flux. Both observations and LES show a significantly larger in-cloud downdraft fraction and downdraft mass flux than marine ShCu.

  3. Large-Eddy Simulation of Shallow Cumulus over Land: A Composite Case Based on ARM Long-Term Observations at Its Southern Great Plains Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yunyan; Klein, Stephen A.; Fan, Jiwen

    Based on long-term observations by the Atmospheric Radiation Measurement program at its Southern Great Plains site, a new composite case of continental shallow cumulus (ShCu) convection is constructed for large-eddy simulations (LES) and single-column models. The case represents a typical daytime non-precipitating ShCu whose formation and dissipation are driven by the local atmospheric conditions and land-surface forcing, and are not influenced by synoptic weather events. The case includes: early-morning initial profiles of temperature and moisture with a residual layer; diurnally-varying sensible and latent heat fluxes which represent a domain average over different land-surface types; simplified large-scale horizontal advective tendencies and subsidence; and horizontal winds with prevailing direction and average speed. Observed composite cloud statistics are provided for model evaluation. The observed diurnal cycle is well-reproduced by LES, however the cloud amount, liquid water path, and shortwave radiative effect are generally underestimated. LES are compared between simulations with an all-or-nothing bulk microphysics and a spectral bin microphysics. The latter shows improved agreement with observations in the total cloud cover and the amount of clouds with depths greater than 300 meters. When compared with radar retrievals of in-cloud air motion, LES produce comparable downdraft vertical velocities, but a larger updraft area, velocity and updraft mass flux. Finally, both observation and LES show a significantly larger in-cloud downdraft fraction and downdraft mass flux than marine ShCu.

  4. A New Framework and Prototype Solution for Clinical Decision Support and Research in Genomics and Other Data-intensive Fields of Medicine.

    PubMed

    Evans, James P; Wilhelmsen, Kirk C; Berg, Jonathan; Schmitt, Charles P; Krishnamurthy, Ashok; Fecho, Karamarie; Ahalt, Stanley C

    2016-01-01

    In genomics and other fields, it is now possible to capture and store large amounts of data in electronic medical records (EMRs). However, it is not clear if the routine accumulation of massive amounts of (largely uninterpretable) data will yield any health benefits to patients. Nevertheless, the use of large-scale medical data is likely to grow. To meet emerging challenges and facilitate optimal use of genomic data, our institution initiated a comprehensive planning process that addresses the needs of all stakeholders (e.g., patients, families, healthcare providers, researchers, technical staff, administrators). Our experience with this process and a key genomics research project contributed to the proposed framework. We propose a two-pronged Genomic Clinical Decision Support System (CDSS) that encompasses the concept of the "Clinical Mendeliome" as a patient-centric list of genomic variants that are clinically actionable and introduces the concept of the "Archival Value Criterion" as a decision-making formalism that approximates the cost-effectiveness of capturing, storing, and curating genome-scale sequencing data. We describe a prototype Genomic CDSS that we developed as a first step toward implementation of the framework. The proposed framework and prototype solution are designed to address the perspectives of stakeholders, stimulate effective clinical use of genomic data, drive genomic research, and meet current and future needs. The framework also can be broadly applied to additional fields, including other '-omics' fields. We advocate for the creation of a Task Force on the Clinical Mendeliome, charged with defining Clinical Mendeliomes and drafting clinical guidelines for their use.

  5. Large-Eddy Simulation of Shallow Cumulus over Land: A Composite Case Based on ARM Long-Term Observations at Its Southern Great Plains Site

    DOE PAGES

    Zhang, Yunyan; Klein, Stephen A.; Fan, Jiwen; ...

    2017-09-19

Based on long-term observations by the Atmospheric Radiation Measurement program at its Southern Great Plains site, a new composite case of continental shallow cumulus (ShCu) convection is constructed for large-eddy simulations (LES) and single-column models. The case represents a typical daytime non-precipitating ShCu whose formation and dissipation are driven by the local atmospheric conditions and land-surface forcing, and are not influenced by synoptic weather events. The case includes: early-morning initial profiles of temperature and moisture with a residual layer; diurnally-varying sensible and latent heat fluxes which represent a domain average over different land-surface types; simplified large-scale horizontal advective tendencies and subsidence; and horizontal winds with prevailing direction and average speed. Observed composite cloud statistics are provided for model evaluation. The observed diurnal cycle is well-reproduced by LES; however, the cloud amount, liquid water path, and shortwave radiative effect are generally underestimated. LES are compared between simulations with an all-or-nothing bulk microphysics and a spectral bin microphysics. The latter shows improved agreement with observations in the total cloud cover and the amount of clouds with depths greater than 300 meters. When compared with radar retrievals of in-cloud air motion, LES produce comparable downdraft vertical velocities, but a larger updraft area, velocity and updraft mass flux. Finally, both observations and LES show a significantly larger in-cloud downdraft fraction and downdraft mass flux than marine ShCu.

  6. Evaluation of interpolation techniques for the creation of gridded daily precipitation (1 × 1 km2); Cyprus, 1980-2010

    NASA Astrophysics Data System (ADS)

    Camera, Corrado; Bruggeman, Adriana; Hadjinicolaou, Panos; Pashiardis, Stelios; Lange, Manfred A.

    2014-01-01

    High-resolution gridded daily data sets are essential for natural resource management and the analyses of climate changes and their effects. This study aims to evaluate the performance of 15 simple or complex interpolation techniques in reproducing daily precipitation at a resolution of 1 km2 over topographically complex areas. Methods are tested considering two different sets of observation densities and different rainfall amounts. We used rainfall data that were recorded at 74 and 145 observational stations, respectively, spread over the 5760 km2 of the Republic of Cyprus, in the Eastern Mediterranean. Regression analyses utilizing geographical copredictors and neighboring interpolation techniques were evaluated both in isolation and combined. Linear multiple regression (LMR) and geographically weighted regression methods (GWR) were tested. These included a step-wise selection of covariables, as well as inverse distance weighting (IDW), kriging, and 3D-thin plate splines (TPS). The relative rank of the different techniques changes with different station density and rainfall amounts. Our results indicate that TPS performs well for low station density and large-scale events and also when coupled with regression models. It performs poorly for high station density. The opposite is observed when using IDW. Simple IDW performs best for local events, while a combination of step-wise GWR and IDW proves to be the best method for large-scale events and high station density. This study indicates that the use of step-wise regression with a variable set of geographic parameters can improve the interpolation of large-scale events because it facilitates the representation of local climate dynamics.
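Of the neighboring interpolators compared in this abstract, inverse distance weighting is the simplest. A minimal sketch of IDW follows; the station coordinates and rainfall values are hypothetical illustrations, not data from the study:

```python
import numpy as np

def idw(xy_obs, values, xy_grid, power=2.0, eps=1e-12):
    """Inverse distance weighting: each grid estimate is a weighted
    mean of station values, with weights 1 / distance**power."""
    # Pairwise distances between grid points and stations via broadcasting
    d = np.linalg.norm(xy_grid[:, None, :] - xy_obs[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)  # eps avoids division by zero at a station
    return (w * values).sum(axis=1) / w.sum(axis=1)

# Hypothetical stations (x, y) in km and daily rainfall in mm
xy_obs = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
rain = np.array([5.0, 15.0, 10.0])
grid = np.array([[5.0, 5.0]])  # one target grid cell
print(idw(xy_obs, rain, grid))
```

Because the estimate is a convex combination of station values, IDW can never extrapolate beyond the observed range — one reason it behaves differently from splines for large-scale events.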

  7. 27 CFR 40.133 - Amount of individual bond.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... amount, the manufacturer shall immediately file a strengthening or superseding bond as required by this subpart. The amount of any such bond (or the total amount including strengthening bonds, if any) need not...

  8. 27 CFR 40.133 - Amount of individual bond.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... amount, the manufacturer shall immediately file a strengthening or superseding bond as required by this subpart. The amount of any such bond (or the total amount including strengthening bonds, if any) need not...

  9. Plasma issues associated with the use of electrodynamic tethers

    NASA Technical Reports Server (NTRS)

    Hastings, D. E.

    1986-01-01

The use of an electrodynamic tether to generate power or thrust on the space station raises important plasma issues associated with the current flow. In addition to the issue of current closure through the space station, high power tethers (equal to or greater than tens of kilowatts) require the use of plasma contactors to enhance the current flow. These contactors will generate large amounts of electrostatic turbulence in the vicinity of the space station, because they work best when a large amount of current-driven turbulence is excited. Current work is reviewed and future directions are suggested.

  10. Models of resource planning during formation of calendar construction plans for erection of high-rise buildings

    NASA Astrophysics Data System (ADS)

    Pocebneva, Irina; Belousov, Vadim; Fateeva, Irina

    2018-03-01

This article provides a methodical description of resource-time analysis for a wide range of requirements imposed on resource consumption processes in scheduling tasks during the construction of high-rise buildings and facilities. The core of the proposed approach is the determination of resource models. Generalized network models are the elements of those models, and their number can be too large for each element to be analyzed individually. The problem is therefore to approximate the original resource model by simpler time models whose number is not very large.

  11. Numerical experiments on short-term meteorological effects on solar variability

    NASA Technical Reports Server (NTRS)

    Somerville, R. C. J.; Hansen, J. E.; Stone, P. H.; Quirk, W. J.; Lacis, A. A.

    1975-01-01

    A set of numerical experiments was conducted to test the short-range sensitivity of a large atmospheric general circulation model to changes in solar constant and ozone amount. On the basis of the results of 12-day sets of integrations with very large variations in these parameters, it is concluded that realistic variations would produce insignificant meteorological effects. Any causal relationships between solar variability and weather, for time scales of two weeks or less, rely upon changes in parameters other than solar constant or ozone amounts, or upon mechanisms not yet incorporated in the model.

  12. Human-Machine Cooperation in Large-Scale Multimedia Retrieval: A Survey

    ERIC Educational Resources Information Center

    Shirahama, Kimiaki; Grzegorzek, Marcin; Indurkhya, Bipin

    2015-01-01

    "Large-Scale Multimedia Retrieval" (LSMR) is the task to fast analyze a large amount of multimedia data like images or videos and accurately find the ones relevant to a certain semantic meaning. Although LSMR has been investigated for more than two decades in the fields of multimedia processing and computer vision, a more…

  13. Large Groups in the Boundary Waters Canoe Area - Their Numbers, Characteristics, and Impact

    Treesearch

    David W. Lime

    1972-01-01

    The impact of "large" parties in the BWCA is discussed in terms of their effect on the resource and on the experience of other visitors. The amount of use by large groups and the visitors most likely to be affected by a reduction in party size limit are described.

  14. The Ethics of Paid Plasma Donation: A Plea for Patient Centeredness.

    PubMed

    Farrugia, Albert; Penrod, Joshua; Bult, Jan M

    2015-12-01

Plasma protein therapies (PPTs) are a group of essential medicines extracted from human plasma through processes of industrial scale fractionation. They are used primarily to treat a number of rare, chronic disorders ensuing from inherited or acquired deficiencies of a number of physiologically essential proteins. These disorders include hemophilia A and B, different immunodeficiencies and alpha 1-antitrypsin deficiency. In addition, acute blood loss, burns and sepsis are treated by PPTs. Hence, a population of vulnerable and very sick individuals is dependent on these products. In addition, the continued well-being of large sections of the community, including pregnant women and their children, travelers, and workers exposed to infectious risk, is also subject to the availability of these therapies. Their manufacture in adequate amounts requires large volumes of human plasma as the starting material of a complex purification process. Mainstream blood transfusion services run primarily by the not-for-profit sector have attempted to provide this plasma through the separation of blood donations, but have failed to provide sufficient amounts to meet the clinical demand. The collection of plasma from donors willing to commit to the process of plasmapheresis, which is not only time consuming but requires a long-term, continuing commitment, generates much higher amounts of plasma and has been an activity historically separate from the blood transfusion sector and run by commercial companies. These companies now supply two-thirds of the growing global need for these therapies, while the mainstream government-run blood sector continues to supply a shrinking proportion. The private-sector plasmapheresis activity that provides the bulk of treatment products has compensated donors in recognition of the time and effort required. Recent activities have reignited the debate regarding the ethical and medical aspects of such compensation.
In this work, we review the landscape; assess the contributions made by the compensated and non-compensated sectors and synthesize the outcomes on the relevant patient communities of perturbing the current paradigm of compensated plasma donation. We conclude that the current era of "Patient Centeredness" in health care demands the continuation and extension of paid plasma donation.

  15. Helioviewer.org: An Open-source Tool for Visualizing Solar Data

    NASA Astrophysics Data System (ADS)

    Hughitt, V. Keith; Ireland, J.; Schmiedel, P.; Dimitoglou, G.; Mueller, D.; Fleck, B.

    2009-05-01

As the amount of solar data available to scientists continues to increase at faster and faster rates, it is important that there exist simple tools for navigating this data quickly with a minimal amount of effort. By combining heterogeneous solar physics datatypes such as full-disk images and coronagraphs, along with feature and event information, Helioviewer offers a simple and intuitive way to browse multiple datasets simultaneously. Images are stored in a repository using the JPEG 2000 format and tiled dynamically upon a client's request. By tiling images and serving only the portions of the image requested, it is possible for the client to work with very large images without having to fetch all of the data at once. Currently, Helioviewer enables users to browse the entire SOHO data archive, updated hourly, as well as feature/event data from eight different catalogs, including active region, flare, coronal mass ejection, and type II radio burst catalogs. In addition to a focus on intercommunication with other virtual observatories and browsers (VSO, HEK, etc.), Helioviewer will offer a number of externally-available application programming interfaces (APIs) to enable easy third party use, adoption and extension. Future functionality will include: support for additional data sources including TRACE, SDO and STEREO, dynamic movie generation, a navigable timeline of recorded solar events, social annotation, and basic client-side image processing.

  16. Energy Storage Requirements for Achieving 50% Penetration of Solar Photovoltaic Energy in California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Denholm, Paul; Margolis, Robert

    2016-09-01

We estimate the storage required to enable PV penetration up to 50% in California (with renewable penetration over 66%), and we quantify the complex relationships among storage, PV penetration, grid flexibility, and PV costs due to increased curtailment. We find that the storage needed depends strongly on the amount of other flexibility resources deployed. With very low-cost PV (three cents per kilowatt-hour) and a highly flexible electric power system, about 19 gigawatts of energy storage could enable 50% PV penetration with a marginal net PV levelized cost of energy (LCOE) comparable to the variable costs of future combined-cycle gas generators under carbon constraints. This system requires extensive use of flexible generation, transmission, demand response, and electrifying one quarter of the vehicle fleet in California with largely optimized charging. A less flexible system, or more expensive PV, would require significantly greater amounts of storage. The amount of storage needed to support very large amounts of PV might fit within a least-cost framework driven by declining storage costs and reduced storage-duration needs due to high PV penetration.

  17. Energy Storage Requirements for Achieving 50% Solar Photovoltaic Energy Penetration in California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Denholm, Paul; Margolis, Robert

    2016-08-01

We estimate the storage required to enable PV penetration up to 50% in California (with renewable penetration over 66%), and we quantify the complex relationships among storage, PV penetration, grid flexibility, and PV costs due to increased curtailment. We find that the storage needed depends strongly on the amount of other flexibility resources deployed. With very low-cost PV (three cents per kilowatt-hour) and a highly flexible electric power system, about 19 gigawatts of energy storage could enable 50% PV penetration with a marginal net PV levelized cost of energy (LCOE) comparable to the variable costs of future combined-cycle gas generators under carbon constraints. This system requires extensive use of flexible generation, transmission, demand response, and electrifying one quarter of the vehicle fleet in California with largely optimized charging. A less flexible system, or more expensive PV, would require significantly greater amounts of storage. The amount of storage needed to support very large amounts of PV might fit within a least-cost framework driven by declining storage costs and reduced storage-duration needs due to high PV penetration.

  18. Growth process optimization of ZnO thin film using atomic layer deposition

    NASA Astrophysics Data System (ADS)

    Weng, Binbin; Wang, Jingyu; Larson, Preston; Liu, Yingtao

    2016-12-01

The work reports experimental studies of ZnO thin films grown on Si(100) wafers using a customized thermal atomic layer deposition. The impact of growth parameters including H2O/DiethylZinc (DEZn) dose ratio, background pressure, and temperature is investigated. The imaging results of scanning electron microscopy and atomic force microscopy reveal that the dose ratio is critical to the surface morphology. To achieve high uniformity, the H2O dose amount needs to be at least twice that of DEZn per cycle. If the background pressure drops below 400 mTorr, a large number of nanoflower-like ZnO grains emerges, increasing surface roughness significantly. In addition, the growth temperature range between 200 °C and 250 °C is found to be the optimal growth window. The crystal structures and orientations are also strongly correlated with temperature, as shown by electron back-scattering diffraction and x-ray diffraction results.

  19. The biochemical consequences of hypoxia.

    PubMed Central

    Alberti, K G

    1977-01-01

    The various phases of energy production have been described. These include glycolysis which is unique in its ability to produce ATP anaerobically, the tricarboxylic acid cycle with its major contribution to ATP production coming through the generation of NADH, and the cytochrome system at which reducing equivalents are converted to water, the released energy being incorporated into high-energy phosphates. The regulation of these pathways has been briefly described and the importance of the small amount of ATP generated anaerobically emphasized. The adaptation of muscle to periods of hypoxia through the presence of myoglobin, creatine phosphate and large amounts of glycogen is then discussed. The role of pH in limiting anaerobic glycolysis in muscle and the importance of the circulation in providing oxygen for exercising muscle are outlined. The effects of hypoxia on certain other tissues such as liver and brain have been detailed and finally methods for assessment of tissue hypoxia in man such as the measurement of the lactate:pyruvate ratio in blood are presented. PMID:198434

  20. Investigation of plastic debris ingestion by four species of sea turtles collected as bycatch in pelagic Pacific longline fisheries

    USGS Publications Warehouse

    Clukey, Katherine E.; Lepczyk, Christopher A.; Balazs, George H.; Work, Thierry M.; Lynch, Jennifer M.

    2017-01-01

    Ingestion of marine debris is an established threat to sea turtles. The amount, type, color and location of ingested plastics in the gastrointestinal tracts of 55 sea turtles from Pacific longline fisheries from 2012 to 2016 were quantified, and compared across species, turtle length, body condition, sex, capture location, season and year. Six approaches for quantifying amounts of ingested plastic strongly correlated with one another and included: number of pieces, mass, volume and surface area of plastics, ratio of plastic mass to body mass, and percentage of the mass of gut contents consisting of plastic. All olive ridley (n = 37), 90% of green (n = 10), 80% of loggerhead (n = 5) and 0% of leatherback (n = 3) turtles had ingested plastic; green turtles ingested significantly more than olive ridleys. Most debris was in the large intestines. No adverse health impacts (intestinal lesions, blockage, or poor body condition) due directly to plastic ingestion were noted.

  1. Dynamic microwave assisted extraction coupled with dispersive micro-solid-phase extraction of herbicides in soybeans.

    PubMed

    Li, Na; Wu, Lijie; Nian, Li; Song, Ying; Lei, Lei; Yang, Xiao; Wang, Kun; Wang, Zhibing; Zhang, Liyuan; Zhang, Hanqi; Yu, Aimin; Zhang, Ziwei

    2015-09-01

Non-polar solvent dynamic microwave assisted extraction was first applied to the treatment of high-fat soybean samples. In the dispersive micro-solid-phase extraction (D-µ-SPE), the herbicides in the high-fat extract were directly adsorbed on metal-organic frameworks MIL-101(Cr). The effects of several experimental parameters, including extraction solvent, microwave absorption medium, microwave power, volume and flow rate of extraction solvent, amount of MIL-101(Cr), and D-µ-SPE time, were investigated. Under the optimal conditions, the limits of detection for the herbicides ranged from 1.56 to 2.00 μg kg(-1). The relative recoveries of the herbicides were in the range of 91.1-106.7%, and relative standard deviations were equal to or lower than 6.7%. The present method was simple, rapid and effective. A large amount of fat was also removed. This method was demonstrated to be suitable for treatment of high-fat samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Debris ingestion by juvenile marine turtles: an underestimated problem.

    PubMed

    Santos, Robson Guimarães; Andrades, Ryan; Boldrini, Marcillo Altoé; Martins, Agnaldo Silva

    2015-04-15

Marine turtles are an iconic group of endangered animals threatened by debris ingestion. However, key aspects related to debris ingestion are still poorly known, including its effects on mortality and the original use of the ingested debris. Therefore, we analysed the impact of debris ingestion in 265 green turtles (Chelonia mydas) over a large geographical area and different habitats along the Brazilian coast. We determined the death rate due to debris ingestion and quantified the amount of debris that is sufficient to cause the death of juvenile green turtles. Additionally, we investigated the original use of the ingested debris. We found that a surprisingly small amount of debris was sufficient to block the digestive tract and cause death. We suggest that debris ingestion has a high death potential that may be masked by other causes of death. A substantial portion of the ingested debris comes from disposable and short-lived products. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Automatic Feature Extraction from Planetary Images

    NASA Technical Reports Server (NTRS)

    Troglio, Giulia; Le Moigne, Jacqueline; Benediktsson, Jon A.; Moser, Gabriele; Serpico, Sebastiano B.

    2010-01-01

With the launch of several planetary missions in the last decade, a large number of planetary images have already been acquired, and many more will become available for analysis in the coming years. The image data need to be analyzed, preferably by automatic processing techniques, because of the huge amount of data. Although many automatic feature extraction methods have been proposed and utilized for Earth remote sensing images, these methods are not always applicable to planetary data, which often present low contrast and uneven illumination characteristics. Different methods have already been presented for crater extraction from planetary images, but the detection of other types of planetary features has not been addressed yet. Here, we propose a new unsupervised method for the extraction of different features from the surface of the analyzed planet, based on the combination of several image processing techniques, including a watershed segmentation and the generalized Hough Transform. The method has many applications, including image registration, and can be applied to arbitrary planetary images.

  4. Obesity in dogs and cats: a metabolic and endocrine disorder.

    PubMed

    Zoran, Debra L

    2010-03-01

Obesity is defined as an accumulation of excessive amounts of adipose tissue in the body, and has been called the most common nutritional disease of dogs in Western countries. Most investigators agree that at least 33% of the dogs presented to veterinary clinics are obese, and that the incidence is increasing as human obesity increases in the overall population. Obesity is not just the accumulation of large amounts of adipose tissue, but is associated with important metabolic and hormonal changes in the body, which are the focus of this review. Obesity is associated with a variety of conditions, including osteoarthritis, respiratory distress, glucose intolerance and diabetes mellitus, hypertension, dystocia, decreased heat tolerance, some forms of cancer, and increased risk of anesthetic and surgical complications. Prevention and early recognition of obesity, as well as correcting obesity when it is present, are essential to appropriate health care, and increase both the quality and quantity of life for pets. Copyright 2010 Elsevier Inc. All rights reserved.

  5. Sensor Alerting Capability

    NASA Astrophysics Data System (ADS)

    Henriksson, Jakob; Bermudez, Luis; Satapathy, Goutam

    2013-04-01

There is a large amount of sensor data generated today by various sensors, from in-situ buoys to mobile underwater gliders. Providing sensor data to the users through standardized services, language and data model is the promise of OGC's Sensor Web Enablement (SWE) initiative. As the amount of data grows, it is becoming difficult for data providers, planners and managers to ensure reliability of data and services and to monitor critical data changes. Intelligent Automation Inc. (IAI) is developing a net-centric alerting capability to address these issues. The capability is built on Sensor Observation Services (SOSs), which are used to collect and monitor sensor data. The alerts can be configured at the service level and at the sensor data level. For example, it can alert for irregular data delivery events or a geo-temporal statistic of sensor data crossing a preset threshold. The capability provides multiple delivery mechanisms and protocols, including traditional techniques such as email and RSS. With this capability, decision makers can monitor their assets and data streams, correct failures or be alerted about approaching phenomena.
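The two alert levels described (service-level delivery monitoring and data-level thresholds) amount to simple rule checks over a time-stamped observation stream. A minimal sketch follows; the observation values, threshold, and gap limit are hypothetical, not details of the IAI system:

```python
def check_alerts(observations, threshold, max_gap_s):
    """Evaluate two alert rules over (timestamp_s, value) observations:
    a service-level rule (gap between deliveries exceeds max_gap_s) and
    a data-level rule (value crosses a preset threshold)."""
    alerts = []
    prev_t = None
    for t, v in observations:
        if prev_t is not None and t - prev_t > max_gap_s:
            alerts.append((t, "irregular delivery"))
        if v > threshold:
            alerts.append((t, "threshold exceeded"))
        prev_t = t
    return alerts

# Hypothetical sea-temperature readings (seconds, deg C)
obs = [(0, 18.2), (60, 18.4), (300, 25.1)]
print(check_alerts(obs, threshold=20.0, max_gap_s=120))
```

A real deployment would evaluate such rules server-side against SOS responses and route the resulting alerts to email, RSS, or other delivery channels.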

  6. ECoG sleep-waking rhythms and bodily activity in the cerveau isolé rat.

    PubMed

    Nakata, K; Kawamura, H

    1986-01-01

In rats with a high mesencephalic transection, isolating both the locus coeruleus and raphe nuclei from the forebrain, the electrocorticogram (ECoG) and the electromyogram (EMG) of the neck muscles were continuously recorded. Normal sleep-waking ECoG changes with a significant circadian rhythm reappeared 4 to 9 days after transection. Neck muscle EMG and bodily movements were independent of the ECoG changes and did not show any significant circadian rhythm. In these high mesencephalic rats with sleep-waking ECoG changes, large bilateral hypothalamic lesions were made by passing DC current either in the preoptic area or in the posterior hypothalamus. After the preoptic area lesions, the amount of low voltage fast ECoG per day markedly increased, whereas after the posterior hypothalamic lesions, the total amount of low voltage fast waves per day decreased, showing a long-lasting slow wave sleep pattern. These results support the idea that a mechanism inducing sleep-waking ECoG changes is localized in the forebrain, especially in the hypothalamus including the preoptic area.

  7. BANNER: an executable survey of advances in biomedical named entity recognition.

    PubMed

    Leaman, Robert; Gonzalez, Graciela

    2008-01-01

    There has been an increasing amount of research on biomedical named entity recognition, the most basic text extraction problem, resulting in significant progress by different research teams around the world. This has created a need for a freely-available, open source system implementing the advances described in the literature. In this paper we present BANNER, an open-source, executable survey of advances in biomedical named entity recognition, intended to serve as a benchmark for the field. BANNER is implemented in Java as a machine-learning system based on conditional random fields and includes a wide survey of the best techniques recently described in the literature. It is designed to maximize domain independence by not employing brittle semantic features or rule-based processing steps, and achieves significantly better performance than existing baseline systems. It is therefore useful to developers as an extensible NER implementation, to researchers as a standard for comparing innovative techniques, and to biologists requiring the ability to find novel entities in large amounts of text.

  8. Effects of exposure to ultraviolet light on the development of Rana pipiens, the northern leopard frog

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, J.J.; Wofford, H.W.

    1996-10-01

The increase in ultraviolet light intensity levels due to ozone depletion recently has been linked to the decline in amphibian populations. In this experiment, eggs and larvae of Rana pipiens were subjected to differing amounts of ultraviolet radiation to determine the effects of ultraviolet light on the development of amphibian tadpoles. The total length, length of body without tail, and maximum width of each specimen were recorded for a month of the tadpoles' development, including several measurements after the ultraviolet exposures were concluded. It was found that ultraviolet exposure significantly reduced the size of the organisms in comparison with the control group in all three measured areas. Ultraviolet radiation altered the health and appearance of the exposed organisms and was lethal in large amounts. This experiment showed that ultraviolet radiation could cause many problems in developing amphibians: by slowing their development and physically weakening them, it could increase predation, thus contributing to a decline in overall population levels.

  9. Two types of geomagnetic storms and relationship between Dst and AE indexes

    NASA Astrophysics Data System (ADS)

    Shadrina, Lyudmila P.

    2017-10-01

The relationship between the Dst and AE indices of the geomagnetic field and its manifestation in geomagnetic storms of solar cycle XXIII was studied. It is shown that geomagnetic storms divide into two groups according to the ratio of the amplitude of the Dst index decrease to the sum of the AE index during the main phase of the storm. The first group is characterized by small depressions of the Dst index accompanied by large sums of the AE index. Most often these are storms with a gradual beginning and a long main phase, associated with recurrent solar wind streams. Storms of the second group differ in large amplitudes of the Dst index decrease, a shorter main phase, and small sums of the AE index. Usually these are sporadic geomagnetic storms with a sudden commencement, caused by interplanetary disturbances of the CME type. The storms of these two types also differ in their geophysical effects, including the effect on human health.

  10. Technical report: mercury in the environment: implications for pediatricians.

    PubMed

    Goldman, L R; Shannon, M W

    2001-07-01

    Mercury is a ubiquitous environmental toxin that causes a wide range of adverse health effects in humans. Three forms of mercury (elemental, inorganic, and organic) exist, and each has its own profile of toxicity. Exposure to mercury typically occurs by inhalation or ingestion. Readily absorbed after its inhalation, mercury can be an indoor air pollutant, for example, after spills of elemental mercury in the home; however, industry emissions with resulting ambient air pollution remain the most important source of inhaled mercury. Because fresh-water and ocean fish may contain large amounts of mercury, children and pregnant women can have significant exposure if they consume excessive amounts of fish. The developing fetus and young children are thought to be disproportionately affected by mercury exposure, because many aspects of development, particularly brain maturation, can be disturbed by the presence of mercury. Minimizing mercury exposure is, therefore, essential to optimal child health. This review provides pediatricians with current information on mercury, including environmental sources, toxicity, and treatment and prevention of mercury exposure.

  11. ABS-FishCount: An Agent-Based Simulator of Underwater Sensors for Measuring the Amount of Fish

    PubMed Central

    2017-01-01

    Underwater sensors provide one of the possibilities to explore oceans, seas, rivers, fish farms and dams, which all together cover most of our planet’s area. Simulators can be helpful to test and discover some possible strategies before implementing these in real underwater sensors. This speeds up the development of research theories so that these can be implemented later. In this context, the current work presents an agent-based simulator for defining and testing strategies for measuring the amount of fish by means of underwater sensors. The current approach is illustrated with the definition and assessment of two strategies for measuring fish. One of these two corresponds to a simple control mechanism, while the other is an experimental strategy and includes an implicit coordination mechanism. The experimental strategy showed a statistically significant improvement over the control one in the reduction of errors with a large Cohen’s d effect size of 2.55. PMID:29137165
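The effect size reported above is Cohen's d, conventionally computed with a pooled standard deviation for two independent samples. A minimal sketch of that statistic follows; the error samples are hypothetical illustrations, not data from the study:

```python
import math

def cohens_d(a, b):
    """Cohen's d for two independent samples, using the pooled
    (Bessel-corrected) standard deviation."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    s_pooled = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / s_pooled

# Hypothetical fish-count errors: control vs. experimental strategy
control_err = [8.0, 9.0, 10.0, 11.0, 12.0]
experimental_err = [3.0, 4.0, 5.0, 6.0, 7.0]
print(cohens_d(control_err, experimental_err))
```

By the usual rule of thumb, d above 0.8 counts as a large effect, so the study's reported d of 2.55 indicates a very large reduction in counting error.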

  12. Lime kiln dust as a potential raw material in portland cement manufacturing

    USGS Publications Warehouse

    Miller, M. Michael; Callaghan, Robert M.

    2004-01-01

    In the United States, the manufacture of portland cement involves burning in a rotary kiln a finely ground proportional mix of raw materials. The raw material mix provides the required chemical combination of calcium, silicon, aluminum, iron, and small amounts of other ingredients. The majority of calcium is supplied in the form of calcium carbonate usually from limestone. Other sources including waste materials or byproducts from other industries can be used to supply calcium (or lime, CaO), provided they have sufficiently high CaO content, have low magnesia content (less than 5 percent), and are competitive with limestone in terms of cost and adequacy of supply. In the United States, the lime industry produces large amounts of lime kiln dust (LKD), which is collected by dust control systems. This LKD may be a supplemental source of calcium for cement plants, if the lime and cement plants are located near enough to each other to make the arrangement economical.

  13. Methods of Controlling the Loop Heat Pipe Operating Temperature

    NASA Technical Reports Server (NTRS)

    Ku, Jentung

    2008-01-01

    The operating temperature of a loop heat pipe (LHP) is governed by the saturation temperature of its compensation chamber (CC); the latter is in turn determined by the balance among the heat leak from the evaporator to the CC, the amount of subcooling carried by the liquid returning to the CC, and the amount of heat exchanged between the CC and ambient. The LHP operating temperature can be controlled at a desired set point by actively controlling the CC temperature. The most common method is to cold bias the CC and use electric heater power to maintain the CC set point temperature. The required electric heater power can be large when the condenser sink is very cold. Several methods have been developed to reduce the control heater power, including coupling block, heat exchanger and separate subcooler, variable conductance heat pipe, by-pass valve with pressure regulator, secondary evaporator, and thermoelectric converter. The paper discusses the operating principles, advantages and disadvantages of each method.

  14. Investigation of plastic debris ingestion by four species of sea turtles collected as bycatch in pelagic Pacific longline fisheries.

    PubMed

    Clukey, Katharine E; Lepczyk, Christopher A; Balazs, George H; Work, Thierry M; Lynch, Jennifer M

    2017-07-15

    Ingestion of marine debris is an established threat to sea turtles. The amount, type, color and location of ingested plastics in the gastrointestinal tracts of 55 sea turtles from Pacific longline fisheries from 2012 to 2016 were quantified, and compared across species, turtle length, body condition, sex, capture location, season and year. Six approaches for quantifying amounts of ingested plastic strongly correlated with one another and included: number of pieces, mass, volume and surface area of plastics, ratio of plastic mass to body mass, and percentage of the mass of gut contents consisting of plastic. All olive ridley (n=37), 90% of green (n=10), 80% of loggerhead (n=5) and 0% of leatherback (n=3) turtles had ingested plastic; green turtles ingested significantly more than olive ridleys. Most debris was in the large intestines. No adverse health impacts (intestinal lesions, blockage, or poor body condition) due directly to plastic ingestion were noted. Copyright © 2017. Published by Elsevier Ltd.

  15. Meat consumption and cancer risk: a critical review of published meta-analyses.

    PubMed

    Lippi, Giuseppe; Mattiuzzi, Camilla; Cervellin, Gianfranco

    2016-01-01

Dietary habits play a substantial role in increasing or reducing cancer risk. We performed a critical review of scientific literature, to describe the findings of meta-analyses that explored the association between meat consumption and cancer risk. Overall, 42 eligible meta-analyses were included in this review, in which meat consumption was assumed from sheer statistics. A convincing association was found between larger intake of red meat and cancer, especially with colorectal, lung, esophageal and gastric malignancies. Increased consumption of processed meat was also found to be associated with colorectal, esophageal, gastric and bladder cancers. Enhanced intake of white meat or poultry was found to be negatively associated with some types of cancers. Larger beef consumption was significantly associated with cancer, whereas the risk was not increased by consuming high amounts of pork. Our analysis suggests an increased risk of cancer in subjects consuming large amounts of red and processed meat, but not in those with high intake of white meat or poultry. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  16. EFFECT OF X RADIATION ON THE AMOUNT OF PROPERDINE SERUM IN THE RAT (in French)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verain, A.; Despaux, E.; Verain, A.

    1958-01-01

The effect of x radiation on the amount of properdine in rat serum was studied in vivo and in vitro. White rats were submitted to 1000 r, and the amount of properdine before and after irradiation was determined. Serum in vitro was irradiated with 1 to 2 Mr. The results showed a rapid, almost constant, decrease of the properdine in the irradiated rat. This effect was found in vitro only when large radiation doses were used. (J.S.R.)

  17. Reliability-based optimization design of geosynthetic reinforced road embankment.

    DOT National Transportation Integrated Search

    2014-07-01

Road embankments are typically large earth structures, the construction of which requires large amounts of competent fill soil. In order to limit costs, the utilization of geosynthetics in road embankments allows for construction of steep slopes ...

  18. The Era of the Large Databases: Outcomes After Gastroesophageal Surgery According to NSQIP, NIS, and NCDB Databases. Systematic Literature Review.

    PubMed

    Batista Rodríguez, Gabriela; Balla, Andrea; Fernández-Ananín, Sonia; Balagué, Carmen; Targarona, Eduard M

    2018-05-01

    The term big data refers to databases that include large amounts of information used in various areas of knowledge. Currently, there are large databases that allow the evaluation of postoperative evolution, such as the American College of Surgeons National Surgical Quality Improvement Program (ACS-NSQIP), the Healthcare Cost and Utilization Project (HCUP) National Inpatient Sample (NIS), and the National Cancer Database (NCDB). The aim of this review was to evaluate the clinical impact of information obtained from these registries regarding gastroesophageal surgery. A systematic review using the Meta-analysis of Observational Studies in Epidemiology guidelines was performed. The research was carried out using the PubMed database identifying 251 articles. All outcomes related to gastroesophageal surgery were analyzed. A total of 34 articles published between January 2007 and July 2017 were included, for a total of 345 697 patients. Studies were analyzed and divided according to the type of surgery and main theme in (1) esophageal surgery and (2) gastric surgery. The information provided by these databases is an effective way to obtain levels of evidence not obtainable by conventional methods. Furthermore, this information is useful for the external validation of previous studies, to establish benchmarks that allow comparisons between centers and have a positive impact on the quality of care.

  19. [Histomorphometric evaluation of ridge preservation after molar tooth extraction].

    PubMed

    Zhan, Y L; Hu, W J; Xu, T; Zhen, M; Lu, R F

    2017-02-18

To evaluate bone formation in human extraction sockets with resorbed surrounding walls augmented with Bio-Oss and Bio-Gide after a 6-month healing period by histologic and histomorphometric analyses. Six fresh molar tooth extraction sockets in 6 patients who required extraction of a periodontally compromised molar tooth were included in this study. The six fresh extraction sockets were grafted with Bio-Oss particles covered with Bio-Gide. The 2.8 mm×6.0 mm cylindric bone specimens were taken from the graft sites with the aid of a stent 6 months after the surgery. Histologic and histomorphometric analyses were performed. The histological results showed that Bio-Oss particles were easily distinguished from the newly formed bone; small amounts of new bone had formed among the Bio-Oss particles, and large amounts of connective tissue were found. Intimate contact between the newly formed bone and a small portion of the Bio-Oss particles was present. All the biopsy cylinders measurement demonstrated a high inter-individual variability in the percentage of the bone, connective tissues and Bio-Oss particles. The new bone occupied 11.54% (0-28.40%) of the total area; the connective tissues were 53.42% (34.08%-74.59%) and the Bio-Oss particles were 35.04% (13.92%-50.87%). The percentage of the particles, which were in contact with bone tissues, amounted to 20.13% (0-48.50%). Sites grafted with Bio-Oss particles covered with Bio-Gide were comprised of connective tissues and small amounts of newly formed bone surrounding the graft particles.

  20. A population-based, case–control study of green tea consumption and leukemia risk in southwestern Taiwan

    PubMed Central

    Yu, Chu-Ling; Liu, Chen-Yu; Wang, Su-Fen; Pan, Pi-Chen; Wu, Ming-Tsang; Ho, Chi-Kung; Lo, Yu-Shing; Li, Yi; Christiani, David C.

    2011-01-01

    Objective This study investigated the association between green tea consumption and leukemia. Methods A total of 252 cases (90.3% response) and 637 controls (53.4% response) were enrolled. Controls were matched for cases on age and gender. Information was collected on participants’ living habits, including tea consumption. Green tea was used as a standard to estimate the total amount of individual catechin consumption. We stratified individual consumption of catechins into four levels. Conditional logistic regression models were fit to subjects aged 0–15 and 16–29 years to evaluate separate associations between leukemia and catechin consumption. Results A significant inverse association between green tea consumption and leukemia risk was found in individuals aged 16–29 years, whereas no significant association was found in the younger age groups. For the older group with higher amounts of tea consumption (>550 units of catechins), the adjusted odds ratio (OR) compared with the group without tea consumption was 0.47 [95% confidence interval (CI) = 0.23–0.97]. After we adjusted for smoking status and medical irradiation exposure, the overall OR for all participants was 0.49 (95% CI = 0.27–0.91), indicating an inverse relation between large amounts of catechins and leukemia. Conclusion Drinking sufficient amounts of tea, especially green tea, which contains more catechins than oolong tea and black tea, may reduce the risk of leukemia. PMID:18752033

  1. Videometric Applications in Wind Tunnels

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Radeztsky, R. H.; Liu, Tian-Shu

    1997-01-01

    Videometric measurements in wind tunnels can be very challenging due to the limited optical access, model dynamics, optical path variability during testing, large range of temperature and pressure, hostile environment, and the requirements for high productivity and large amounts of data on a daily basis. Other complications for wind tunnel testing include the model support mechanism and stringent surface finish requirements for the models in order to maintain aerodynamic fidelity. For these reasons nontraditional photogrammetric techniques and procedures sometimes must be employed. In this paper several such applications are discussed for wind tunnels which include test conditions with Mach number from low speed to hypersonic, pressures from less than an atmosphere to nearly seven atmospheres, and temperatures from cryogenic to above room temperature. Several of the wind tunnel facilities are continuous flow while one is a short duration blowdown facility. Videometric techniques and calibration procedures developed to measure angle of attack, the change in wing twist and bending induced by aerodynamic load, and the effects of varying model injection rates are described. Some advantages and disadvantages of these techniques are given and comparisons are made with non-optical and more traditional video photogrammetric techniques.

  2. Acute thyrotoxicosis secondary to destructive thyroiditis associated with cardiac catheterization contrast dye.

    PubMed

    Calvi, Laura; Daniels, Gilbert H

    2011-04-01

Thyrotoxicosis caused by destructive thyroiditis is self-limited and results from the subacute release of preformed thyroid hormone. Common etiologies include painful subacute thyroiditis and silent (painless) subacute thyroiditis (including postpartum thyroiditis, amiodarone-associated destructive thyroiditis, and lithium-associated thyroiditis). Thyrotoxicosis commonly evolves slowly over a matter of weeks. We report a unique case of severe thyrotoxicosis caused by acute-onset painful destructive thyroiditis in a patient who received large amounts of nonionic contrast dye Hexabrix® for cardiac catheterization. The results of thyroid function and physical examination were normal before the catheterization. The acute onset of severe thyroid pain, rapid increase in serum Free Thyroxine Index, and thyroglobulin concentrations with a triiodothyronine to free thyroxine index ratio of < 20 to 1 were compatible with an acute onset destructive thyroiditis, likely related to direct toxicity from the iodinated contrast material. In light of the large number of patients who receive these contrast agents during cardiac catheterization, clinicians should be advised of this potentially serious complication, particularly in the setting of unstable cardiac disease.

  3. Early Teen Marriage and Future Poverty

    PubMed Central

    DAHL, GORDON B.

    2010-01-01

    Both early teen marriage and dropping out of high school have historically been associated with a variety of negative outcomes, including higher poverty rates throughout life. Are these negative outcomes due to preexisting differences, or do they represent the causal effect of marriage and schooling choices? To better understand the true personal and societal consequences, in this article, I use an instrumental variables (IV) approach that takes advantage of variation in state laws regulating the age at which individuals are allowed to marry, drop out of school, and begin work. The baseline IV estimate indicates that a woman who marries young is 31 percentage points more likely to live in poverty when she is older. Similarly, a woman who drops out of school is 11 percentage points more likely to be poor. The results are robust to a variety of alternative specifications and estimation methods, including limited information maximum likelihood (LIML) estimation and a control function approach. While grouped ordinary least squares (OLS) estimates for the early teen marriage variable are also large, OLS estimates based on individual-level data are small, consistent with a large amount of measurement error. PMID:20879684

  4. Early teen marriage and future poverty.

    PubMed

    Dahl, Gordon B

    2010-08-01

Both early teen marriage and dropping out of high school have historically been associated with a variety of negative outcomes, including higher poverty rates throughout life. Are these negative outcomes due to preexisting differences, or do they represent the causal effect of marriage and schooling choices? To better understand the true personal and societal consequences, in this article, I use an instrumental variables (IV) approach that takes advantage of variation in state laws regulating the age at which individuals are allowed to marry, drop out of school, and begin work. The baseline IV estimate indicates that a woman who marries young is 31 percentage points more likely to live in poverty when she is older. Similarly, a woman who drops out of school is 11 percentage points more likely to be poor. The results are robust to a variety of alternative specifications and estimation methods, including limited information maximum likelihood (LIML) estimation and a control function approach. While grouped ordinary least squares (OLS) estimates for the early teen marriage variable are also large, OLS estimates based on individual-level data are small, consistent with a large amount of measurement error.

  5. Carotenoids in floral parts of a narcissus, a daffodil and a tulip

    PubMed Central

    Valadon, L. R. G.; Mummery, Rosemary S.

    1968-01-01

    1. The qualitative and quantitative distribution of carotenoids of the floral parts of three monocotyledons, the narcissus Scarlet Elegance, the daffodil King Alfred and the tulip Golden Harvest, were studied. β-Carotene, lutein or epoxy-β-carotenes were usually the main pigments, depending on the floral part and on the flower. When β-carotene was the major pigment there were only small amounts of, or sometimes no, epoxycarotenes. 2. Anthers, stigmas and styles of the three flowers did not possess any specific carotenoids but in some cases contained appreciable amounts of epoxycarotenoids. The possibility that these take part in reproduction is discussed. 3. The generalization that yellow flowers contained large amounts of xanthophylls and only traces of carotenes, whereas deep-orange flowers seemed to be characterized by the presence of large amounts of one carotene, was not always the correct one. It is suggested that the floral parts are yellow or orange depending on what carotenoids are present, which is the major one and the amount of total carotenoids, and also on the presence of other non-carotenoid pigments. 4. Two new probable isomers of 5,6:5′,6′-diepoxy-β-carotene were isolated and found together in various floral parts of the tulip Golden Harvest. PMID:5637355

  6. Determining national greenhouse gas emissions from waste-to-energy using the Balance Method.

    PubMed

    Schwarzböck, Therese; Rechberger, Helmut; Cencic, Oliver; Fellner, Johann

    2016-03-01

Different directives of the European Union require operators of waste-to-energy (WTE) plants to report the amount of electricity that is produced from biomass in the waste feed, as well as the amount of fossil CO2 emissions generated by the combustion of fossil waste materials. This paper describes the application of the Balance Method for determining the overall amount of fossil and thus climate relevant CO2 emissions from waste incineration in Austria. The results of 10 Austrian WTE plants (annual waste throughput of around 2,300 kt) demonstrate large seasonal variations in the specific fossil CO2 emissions of the plants as well as large differences between the facilities (annual means range from 32±2 to 51±3 kg CO(2,foss)/GJ heating value). An overall amount of around 924 kt/yr of fossil CO2 for all 10 WTE plants is determined. In comparison, biogenic (climate neutral) CO2 emissions amount to 1,187 kt/yr, which corresponds to 56% of the total CO2 emissions from waste incineration. The total energy input via waste feed to the 10 facilities is about 22,500 TJ/yr, of which around 48% can be assigned to biogenic and thus renewable sources. Copyright © 2016 Elsevier Ltd. All rights reserved.
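
    The reported biogenic share follows directly from the two CO2 totals. A minimal sketch of the arithmetic, using only the figures quoted in the abstract:

    ```python
    # Fossil and biogenic CO2 emissions from the 10 Austrian WTE plants (kt/yr),
    # as reported in the abstract.
    fossil_kt = 924
    biogenic_kt = 1187

    total_kt = fossil_kt + biogenic_kt          # total CO2 from incineration
    biogenic_share = biogenic_kt / total_kt     # fraction that is climate neutral

    print(round(biogenic_share * 100))  # rounds to 56, matching the reported 56%
    ```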

  7. The Dynamics of Pheromone Gland Synthesis and Release: a Paradigm Shift for Understanding Sex Pheromone Quantity in Female Moths.

    PubMed

    Foster, Stephen P; Anderson, Karin G; Casas, Jérôme

    2018-05-10

    Moths are exemplars of chemical communication, especially with regard to specificity and the minute amounts they use. Yet, little is known about how females manage synthesis and storage of pheromone to maintain release rates attractive to conspecific males and why such small amounts are used. We developed, for the first time, a quantitative model, based on an extensive empirical data set, describing the dynamical relationship among synthesis, storage (titer) and release of pheromone over time in a moth (Heliothis virescens). The model is compartmental, with one major state variable (titer), one time-varying (synthesis), and two constant (catabolism and release) rates. The model was a good fit, suggesting it accounted for the major processes. Overall, we found the relatively small amounts of pheromone stored and released were largely a function of high catabolism rather than a low rate of synthesis. A paradigm shift may be necessary to understand the low amounts released by female moths, away from the small quantities synthesized to the (relatively) large amounts catabolized. Future research on pheromone quantity should focus on structural and physicochemical processes that limit storage and release rate quantities. To our knowledge, this is the first time that pheromone gland function has been modeled for any animal.
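
    The compartmental model described above amounts to a single balance equation: titer changes as synthesis minus first-order catabolism and release. The sketch below is illustrative only; the rate constants and synthesis function are hypothetical stand-ins, not the authors' fitted parameters.

    ```python
    # One-compartment sketch of the pheromone model:
    #   dT/dt = S(t) - (k_cat + k_rel) * T
    # where T is the gland titer, S(t) a time-varying synthesis rate, and
    # k_cat, k_rel constant catabolism and release rates. All numeric values
    # are hypothetical illustrations.
    def simulate(synthesis, k_cat=0.5, k_rel=0.05, dt=0.01, steps=10_000):
        titer = 0.0
        for step in range(steps):
            t = step * dt
            # Forward-Euler integration of the balance equation
            titer += (synthesis(t) - (k_cat + k_rel) * titer) * dt
        return titer

    # With constant synthesis S, the titer settles at S / (k_cat + k_rel):
    steady = simulate(lambda t: 1.0)
    print(steady)  # approaches 1.0 / 0.55
    ```

    With k_cat set much larger than k_rel, as in this sketch, the steady titer stays low even under sustained synthesis, which mirrors the paper's conclusion that catabolism, not slow synthesis, limits the amounts stored and released.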

  8. Effect of copper chloride on the emissions of PCDD/Fs and PAHs from PVC combustion.

    PubMed

    Wang, Dongli; Xu, Xiaobai; Zheng, Minghui; Chiu, Chung H

    2002-09-01

The influences of temperature, air flow and the amount of copper chloride upon the types and amount of the toxic emissions such as polychlorinated dibenzo-p-dioxins and polychlorinated dibenzofurans (PCDD/Fs) and polycyclic aromatic hydrocarbons (PAHs) during combustion of polyvinyl chloride (PVC) were investigated. The mechanism concerning the effect of temperature and copper chloride on PCDD/Fs and PAHs formation was discussed. The results showed that without copper chloride, trace amounts of PCDD/Fs and large amounts of PAHs were found in the emissions from the pure PVC combustion under various combustion conditions. The addition of copper chloride enhanced PCDD/Fs formation, but it seems that the formation of PAHs decreased with increasing amounts of copper chloride, and a greater total amount of PAHs was produced at the higher temperature under our experimental conditions.

  9. The evolution of the quasar continuum

    NASA Technical Reports Server (NTRS)

    Elvis, M.

    1992-01-01

We now have in hand a large data base of Roentgen Satellite (ROSAT), optical, and IR complementary data. We are in the process of obtaining a large amount of International Ultraviolet Explorer (IUE) data for the same quasar sample. For our complementary sample at high redshifts, where the UV was redshifted into the optical, we have just had approved large amounts of observing time to cover the quasar continuum in the near-IR using the new Near-Infrared Camera and Multi-Object Spectrometer (NICMOS) array spectrographs. Ten-micron, optical, and VLA radio data also have approved time. An ISO US key program was approved to extend this work into the far-IR, and the launch of ASTRO-D (early in 1993) promises to extend it to higher energy X-rays.

  10. Using Geomorphic Change Detection to Understand Restoration Project Success Relative to Stream Size

    NASA Astrophysics Data System (ADS)

    Yeager, A.; Segura, C.

    2017-12-01

    Large wood (LW) jams have long been utilized as a stream restoration strategy to create fish habitat, with a strong focus on Coho salmon in the Pacific Northwest. These projects continue to be implemented despite limited understanding of their success in streams of different size. In this study, we assessed the changes triggered by LW introductions in 10 alluvial plane bed reaches with varying drainage areas (3.9-22 km²) and bankfull widths (6.4-14.7 m) in one Oregon Coast Range basin. In this basin, LW was added in an effort to improve winter rearing habitat for Coho salmon. We used detailed topographic mapping (0.5 m² resolution) to describe the local stream and floodplain geometry. Pebble counts were used to monitor changes in average substrate size after the LW addition. Field surveys were conducted immediately after the LW were installed, in the summer of 2016, and one year after installation, in the summer of 2017. We used geomorphic change detection analysis to quantify the amount of scour and deposition at each site along with changes in average bankfull width. Then we determined the relative amount of change among all sites to identify which size stream changed the most. We also modeled fluctuations in water surface elevation at each site, correlating frequency and inundation of the LW with geomorphic changes detected from the topographic surveys. Preliminary results show an increase in channel width and floodplain connectivity at all sites, indicating an increase in off-channel habitat for juvenile Coho salmon. Bankfull widths increased up to 75% in small sites and up to 25% in large sites. Median grain size became coarser in large streams (increased up to 20%), while we saw a similar amount of fining at smaller sites. The overall increase in channel width is compensated by an overall decrease in bed elevation at both large and small sites, suggesting the maintenance of overall geomorphic equilibrium. 
Further work will include quantifying these geomorphic changes in the context of critical salmon habitat factors. By identifying which size stream changes the most after LW introduction, and linking this change to salmon habitat metrics, we will provide information to aid in optimizing future LW stream restoration efforts that focus on stream reaches likely to experience the greatest increase in fish habitat.
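
    Geomorphic change detection of the kind described rests on differencing two topographic surveys (a DEM of Difference, or DoD): positive cells indicate deposition, negative cells scour. A hedged sketch, with tiny synthetic elevation grids and an assumed 0.5 m cell size standing in for the real surveys:

    ```python
    import numpy as np

    # DoD sketch: subtract the post-installation survey from the one-year
    # survey; positive cells are deposition, negative cells are scour.
    # The grids and the 0.5 m cell size are synthetic stand-ins, not the
    # study's survey data.
    cell_area = 0.5 * 0.5  # m^2 per grid cell (assumed 0.5 m resolution)

    dem_2016 = np.array([[10.0, 10.2], [10.1, 10.3]])  # m, after LW install
    dem_2017 = np.array([[10.1, 10.0], [10.1, 10.5]])  # m, one year later

    dod = dem_2017 - dem_2016                      # DEM of Difference (m)
    deposition = dod[dod > 0].sum() * cell_area    # m^3 of net fill
    scour = -dod[dod < 0].sum() * cell_area        # m^3 of net cut

    print(deposition, scour)
    ```

    In practice a minimum level-of-detection threshold is usually applied to the DoD before summing, so that survey noise is not counted as real scour or deposition.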

  11. Age differences in the effect of framing on risky choice: A meta-analysis

    PubMed Central

    Best, Ryan; Charness, Neil

    2015-01-01

    The framing of decision scenarios in terms of potential gains versus losses has been shown to influence choice preferences between sure and risky options. Normative cognitive changes associated with aging have been known to affect decision-making, which has led to a number of studies investigating the influence of aging on the effect of framing. Mata, Josef, Samanez-Larkin, and Hertwig (2011) systematically reviewed the available literature using a meta-analytic approach, but did not include tests of homogeneity nor subsequent moderator variable analyses. The current review serves to extend the previous analysis to include such tests as well as update the pool of studies available for analysis. Results for both positively and negatively framed conditions were reviewed using two meta-analyses encompassing data collected from 3,232 subjects across 18 studies. Deviating from the previous results, the current analysis finds a tendency for younger adults to choose the risky option more often than older adults for positively framed items. Moderator variable analyses find this effect to likely be driven by the specific decision scenario, showing a significant effect with younger adults choosing the risky option more often in small-amount financial and large-amount mortality-based scenarios. For negatively framed items, the current review found no overall age difference in risky decision making, confirming the results from the prior meta-analysis. Moderator variable analyses conducted to address heterogeneity found younger adults to be more likely than older adults to choose the risky option for negatively framed high-amount mortality-based decision scenarios. Practical implications for older adults are discussed. PMID:26098168

  12. Age differences in the effect of framing on risky choice: A meta-analysis.

    PubMed

    Best, Ryan; Charness, Neil

    2015-09-01

    The framing of decision scenarios in terms of potential gains versus losses has been shown to influence choice preferences between sure and risky options. Normative cognitive changes associated with aging have been known to affect decision making, which has led to a number of studies investigating the influence of aging on the effect of framing. Mata, Josef, Samanez-Larkin, and Hertwig (2011) systematically reviewed the available literature using a meta-analytic approach, but did not include tests of homogeneity or subsequent moderator variable analyses. The current review serves to extend the previous analysis to include such tests as well as update the pool of studies available for analysis. Results for both positively and negatively framed conditions were reviewed using 2 meta-analyses encompassing data collected from 3,232 subjects across 18 studies. Deviating from the previous results, the current analysis found a tendency for younger adults to choose the risky option more often than older adults for positively framed items. Moderator variable analyses found this effect likely to be driven by the specific decision scenario, showing a significant effect, with younger adults choosing the risky option more often in small-amount financial and large-amount mortality-based scenarios. For negatively framed items, the current review found no overall age difference in risky decision making, confirming the results from the prior meta-analysis. Moderator variable analyses conducted to address heterogeneity found younger adults to be more likely than older adults to choose the risky option for negatively framed high-amount mortality-based decision scenarios. Practical implications for older adults are discussed. (c) 2015 APA, all rights reserved).

  13. A systematic review to assess comparative effectiveness studies in epidural steroid injections for lumbar spinal stenosis and to estimate reimbursement amounts.

    PubMed

    Bresnahan, Brian W; Rundell, Sean D; Dagadakis, Marissa C; Sullivan, Sean D; Jarvik, Jeffrey G; Nguyen, Hiep; Friedly, Janna L

    2013-08-01

    To systematically appraise published comparative effectiveness evidence (clinical and economic) of epidural steroid injections (ESI) for lumbar spinal stenosis and to estimate Medicare reimbursement amounts for ESI procedures. TYPE: Systematic review. PubMed, Embase, and CINAHL were searched through August 2012 for key words that pertain to low back pain, spinal stenosis or sciatica, and epidural steroid injection. We used institutional and Medicare reimbursement amounts for our cost estimation. Articles published in English that assessed ESIs for adults with lumbar spinal stenosis versus a comparison intervention were included. Our search identified 146 unique articles, and 138 were excluded due to noncomparative study design, not having a study population with lumbar spinal stenosis, not having an appropriate outcome, or not being in English. We fully summarized 6 randomized controlled trials and 2 large observational studies. Randomized controlled trial articles were reviewed, and the study population, sample size, treatment groups, ESI dosage, ESI approaches, concomitant interventions, outcomes, and follow-up time were reported. Descriptive resource use estimates for ESIs were calculated with use of data from our institution during 2010 and Medicare-based reimbursement amounts. ESIs or anesthetic injections alone resulted in better short-term improvement in walking distance compared with control injections. However, there were no longer-term differences. No differences between ESIs versus anesthetic in self-reported improvement in pain were reported. Transforaminal approaches had better improvement in pain scores (≤4 months) compared with interlaminar injections. Two observational studies indicated increased rates of lumbar ESI in Medicare beneficiaries. 
Our sample included 279 patients who received at least 1 ESI during 2010, with an estimated mean total outpatient reimbursement for one ESI procedure "event" to be $637, based on 2010 Medicare reimbursement amounts ($505 technical and $132 professional payments). This systematic review of ESI for treating lumbar spinal stenosis found a limited amount of data that suggest that ESI is effective in some patients for improving select short-term outcomes, but results differed depending on study design, outcome measures used, and comparison groups evaluated. Overall, there are relatively few comparative clinical or economic studies for ESI procedures for lumbar spinal stenosis in adults, which indicated a need for additional evidence. Copyright © 2013. Published by Elsevier Inc.

  14. Medical Malpractice Damage Caps and Provider Reimbursement.

    PubMed

    Friedson, Andrew I

    2017-01-01

    A common state legislative maneuver to combat rising healthcare costs is to reform the tort system by implementing caps on noneconomic damages awardable in medical malpractice cases. Using the implementation of caps in several states and a large database of private insurance claims, I estimate the effect of damage caps on the amount providers charge to insurance companies as well as the amount that insurance companies reimburse providers for medical services. The amount providers charge insurers is unresponsive to tort reform, but the amount that insurers reimburse providers decreases for some procedures. Copyright © 2015 John Wiley & Sons, Ltd.

  15. Consuming the daily recommended amounts of dairy products would reduce the prevalence of inadequate micronutrient intakes in the United States: diet modeling study based on NHANES 2007-2010.

    PubMed

    Quann, Erin E; Fulgoni, Victor L; Auestad, Nancy

    2015-09-04

    A large proportion of Americans are not meeting the Dietary Reference Intakes (DRI) for several essential vitamins and minerals due to poor dietary choices. Dairy products are a key source of many of the nutrients that are under-consumed, but children and adults do not consume the recommended amounts from this food group. This study modeled the impact of meeting daily recommended amounts of dairy products on population-based nutrient intakes. Two-day 24-h dietary recalls collected from participants ≥ 2 years (n = 8944) from the 2007-2010 What We Eat in America, National Health and Nutrition Examination Survey (NHANES) were analyzed. Databases available from the WWEIA/NHANES and the United States Department of Agriculture (USDA) were used to determine nutrient, food group, and dietary supplement intakes. Modeling was performed by adding the necessary number of dairy servings, using the dairy composite designed by USDA, to each participant's diet to meet the dairy recommendations outlined by the 2010 Dietary Guidelines for Americans. All analyses included sample weights to account for the NHANES survey design. The majority of children 4 years and older (67.4-88.8%) and nearly all adults (99.0-99.6%) fall below the recommended 2.5-3 daily servings of dairy products. Increasing dairy consumption to recommended amounts would result in a significant reduction in the percent of adults with calcium, magnesium, and vitamin A intakes below the Estimated Average Requirement (EAR) when considering food intake alone (0-2.0 vs. 9.9-91.1%; 17.3-75.0 vs. 44.7-88.5%; 0.1-15.1 vs. 15.3-48.0%, respectively), as well as food and dietary supplement intake. Minimal, but significant, improvements were observed for the percent of people below the EAR for vitamin D (91.7-99.9 vs. 91.8-99.9%), and little change was achieved for the large percentage of people below the Adequate Intake for potassium.
Increasing dairy food consumption to recommended amounts is one practical dietary change that could significantly improve the population's adequacy for certain vitamins and minerals that are currently under-consumed, as well as have a positive impact on health.

  16. Preparation of biochar from sewage sludge

    NASA Astrophysics Data System (ADS)

    Nieto, Aurora; María Méndez, Ana; Gascó, Gabriel

    2013-04-01

    Biomass waste materials appropriate for biochar production include crop residues (both field residues and processing residues such as nut shells, fruit pits, bagasse, etc), as well as yard, food and forestry wastes, and animal manures. Biochar can and should be made from biomass waste materials and must not contain unacceptable levels of toxins such as heavy metals, which can be found in sewage sludge and industrial or landfill waste. Making biochar from biomass waste materials should create no competition for land with any other land use option—such as food production or leaving the land in its pristine state. Large amounts of agricultural, municipal and forestry biomass are currently burned or left to decompose and release CO2 and methane back into the atmosphere. They also can pollute local ground and surface waters—a large issue for livestock wastes. Using these materials to make biochar not only removes them from a pollution cycle, but biochar can be obtained as a by-product of producing energy from this biomass. Sewage sludge is a by-product from wastewater treatment plants, and contains significant amounts of heavy metals, organic toxins and pathogenic microorganisms, which are considered to be harmful to the environment and all living organisms. Agricultural use, land filling and incineration are commonly used as disposal methods. It was, however, reported that sewage sludge applications in agriculture give rise to an accumulation of harmful components (heavy metals and organic compounds) in soil. For this reason, pyrolysis can be considered a promising technique for treating sewage sludge, with the added benefit of producing fuels. The objective of this work is to study the advantages of biochar prepared from sewage sludge.

  17. Chemical Contaminants as Stratigraphic Markers for the Anthropocene

    NASA Astrophysics Data System (ADS)

    Kruge, M. A.

    2012-12-01

    Thousands and even millions of years from now, widespread anthropogenic contaminants in sediments would likely persist, incorporated into the geological record. They would inadvertently preserve evidence of our present era (informally designated as the Anthropocene Epoch) characterized by large human populations engaged in intensive industrial and agricultural activities. Hypothetical geologists in the distant future would likely find unusually high concentrations of a wide variety of contaminants at stratigraphic levels corresponding to our present time, analogous to the iridium anomaly marking the bolide impact event at the close of the Cretaceous Period. These would include both organic and inorganic substances, such as industrially-derived heavy metals (e.g., Hg, Pb, Cr, Zn) and hydrocarbons, both petrogenic (derived directly from petroleum) and pyrogenic (combustion products). While there are natural sources for these materials, such as volcanic eruptions, wildfires, and oil seeps, their co-occurrence would provide a signature characteristic of human activity. Diagnostic assemblages of organic compounds would carry an anthropogenic imprint. The distribution of polycyclic aromatic hydrocarbons (PAHs) in a sediment sample could distinguish between natural and human sources. Stable isotopic signatures would provide additional evidence. Concentrations of contaminants in the sedimentary record would increase exponentially with increasing proximity to urban source areas, where at present billions of people are collectively consuming vast quantities of fossil fuels and generating large amounts of waste. Aeolian and marine transport prior to deposition is already observed to redistribute detectable amounts of contaminants, including Hg and PAHs, globally, even at great distances from principal source areas.
For organic contaminants, deposition in an anoxic sedimentary environment could ensure their preservation, increasing the likelihood of their inclusion in the long-term stratigraphic record, establishing markers of the Anthropocene Epoch for millions of years to come.

  18. Northern Forest DroughtNet: A New Framework to Understand Impacts of Precipitation Change on the Northern Forest Ecosystem

    NASA Astrophysics Data System (ADS)

    Asbjornsen, H.; Rustad, L.; Templer, P. H.; Jennings, K.; Phillips, R.; Smith, M.

    2014-12-01

    Recent trends and projections for future change for the U.S. northern forests suggest that the region's climate is becoming warmer, wetter, and, ironically, drier, with more precipitation occurring as large events, separated by longer periods with no precipitation. However, to date, precipitation manipulation experiments conducted in forest ecosystems represent only ~5% of all such experiments worldwide, and our understanding of how the mesic-adapted northern forest will respond to greater frequency and intensity of drought in the future is especially poor. Several important challenges have hampered previous research efforts to conduct forest drought experiments and draw robust conclusions, including difficulties in reducing water uptake by deep and lateral tree roots, logistical and financial constraints to establishing and maintaining large-scale field experiments, and the lack of standardized approaches for determining the appropriate precipitation manipulation treatment (e.g., amount and timing of throughfall displacement), designing and constructing the throughfall displacement infrastructure, identifying key response variables, and collecting and analyzing the field data. The overarching goal of this project is to establish a regional research coordination network - Northern Forest DroughtNet - to investigate the impacts of changes in the amount and distribution of precipitation on the hydrology, biogeochemistry, and carbon (C) cycling dynamics of northern temperate forests. Specific objectives include the development of a standard prototype for conducting precipitation manipulation studies in forest ecosystems (in collaboration with the international DroughtNet-RCN) and the implementation of this prototype drought experiment at the Hubbard Brook Experimental Forest. 
Here, we present the advances made thus far towards achieving the objectives of Northern Forest DroughtNet, plans for future work, and an invitation to the larger scientific community interested in precipitation manipulation experiments in forest ecosystems to participate in the network.

  19. Mapping patient safety: a large-scale literature review using bibliometric visualisation techniques.

    PubMed

    Rodrigues, S P; van Eck, N J; Waltman, L; Jansen, F W

    2014-03-13

    The amount of scientific literature available is often overwhelming, making it difficult for researchers to have a good overview of the literature and to see relations between different developments. Visualisation techniques based on bibliometric data are helpful in obtaining an overview of the literature on complex research topics, and have been applied here to the topic of patient safety (PS). On the basis of title words and citation relations, publications in the period 2000-2010 related to PS were identified in the Scopus bibliographic database. A visualisation of the most frequently cited PS publications was produced based on direct and indirect citation relations between publications. Terms were extracted from titles and abstracts of the publications, and a visualisation of the most important terms was created. The main PS-related topics studied in the literature were identified using a technique for clustering publications and terms. A total of 8480 publications were identified, of which the 1462 most frequently cited ones were included in the visualisation. The publications were clustered into 19 clusters, which were grouped into three categories: (1) magnitude of PS problems (42% of all included publications); (2) PS risk factors (31%) and (3) implementation of solutions (19%). In the visualisation of PS-related terms, five clusters were identified: (1) medication; (2) measuring harm; (3) PS culture; (4) physician; (5) training, education and communication. Analyses at both the publication and the term level indicate an increasing focus on risk factors. A bibliometric visualisation approach makes it possible to analyse large amounts of literature. This approach is very useful for improving one's understanding of a complex research topic such as PS and for suggesting new research directions or alternative research priorities. For PS research, the approach suggests that more research on implementing PS improvement initiatives might be needed.
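The idea of grouping publications by direct and indirect citation relations can be illustrated with a minimal sketch: treat citations as links and gather publications into connected components. The toy graph, publication IDs, and component-based grouping below are illustrative assumptions, not the clustering algorithm actually used in the study:

```python
# Toy sketch: group publications into clusters via connected
# components of a citation graph (illustrative only).
from collections import deque

# Hypothetical citation links between publication IDs.
edges = [(1, 2), (2, 3),        # one citation community
         (4, 5), (5, 6)]        # a second, unconnected community
nodes = {n for e in edges for n in e}

# Build an undirected adjacency map.
adj = {n: set() for n in nodes}
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)

clusters, seen = [], set()
for start in sorted(nodes):
    if start in seen:
        continue
    comp, queue = set(), deque([start])
    while queue:                 # breadth-first walk over citation links
        n = queue.popleft()
        if n in comp:
            continue
        comp.add(n)
        queue.extend(adj[n] - comp)
    seen |= comp
    clusters.append(sorted(comp))

print(clusters)  # → [[1, 2, 3], [4, 5, 6]]
```

The study's actual technique is more sophisticated (weighted clustering of 1462 publications into 19 clusters); the sketch only conveys how citation links induce groups.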

  20. A method for analysing small samples of floral pollen for free and protein-bound amino acids.

    PubMed

    Stabler, Daniel; Power, Eileen F; Borland, Anne M; Barnes, Jeremy D; Wright, Geraldine A

    2018-02-01

    Pollen provides floral visitors with essential nutrients including proteins, lipids, vitamins and minerals. As an important nutrient resource for pollinators, including honeybees and bumblebees, pollen quality is of growing interest in assessing available nutrition to foraging bees. To date, quantifying the protein-bound amino acids in pollen has been difficult, and methods rely on large amounts of pollen, typically more than 1 g. More usual is to estimate a crude protein value based on the nitrogen content of pollen; however, such methods provide no information on the distribution of essential and non-essential amino acids constituting the proteins. Here, we describe a method of microwave-assisted acid hydrolysis using small amounts of pollen that allows exploration of amino acid composition, quantified using ultra high performance liquid chromatography (UHPLC), and a back calculation to estimate the crude protein content of pollen. Reliable analysis of protein-bound and free amino acids as well as an estimation of crude protein concentration was obtained from pollen samples as small as 1 mg. Greater variation in both protein-bound and free amino acids was found in pollen sample sizes <1 mg. Due to the variability in recovery of amino acids in smaller sample sizes, we suggest a correction factor to apply to specific sample sizes of pollen in order to estimate total crude protein content. The method described in this paper will allow researchers to explore the composition of amino acids in pollen and will aid research assessing the available nutrition to pollinating animals. This method will be particularly useful in assaying the pollen of wild plants, from which it is difficult to obtain large sample weights.

  1. Comparison of oligosaccharides in milk specimens from humans and twelve other species.

    PubMed

    Warren, C D; Chaturvedi, P; Newburg, A R; Oftedal, O T; Tilden, C D; Newburg, D S

    2001-01-01

    Human milk contains large amounts of many oligosaccharides, most of which are fucosylated; several inhibit pathogenic bacteria, viruses, and toxins that cause disease in humans. Although bovine milk is known to have much smaller amounts and many fewer types of oligosaccharides, no studies heretofore have indicated whether the amount or complexity of human milk oligosaccharides is unique to our species. Toward this end, a comparison was made of the major individual oligosaccharides in milk specimens from a variety of species, including the great apes. The neutral compounds, which represent the bulk of oligosaccharides in human milk, were isolated, perbenzoylated, resolved by high performance liquid chromatography (HPLC), and detected at 229 nm. Ambiguous structures were determined by mass spectrometry. All milk specimens contained lactose, although levels were quite low in bear and kangaroo milk. The types of oligosaccharides in milk specimens from the primates resembled those of human milk, but the amounts, especially of the larger molecules, were markedly lower. The relative amounts of oligosaccharides in the bonobo changed over the course of lactation, as they do in humans. Marine mammals generally had few oligosaccharides in their milk other than 2'-fucosyllactose. Grizzly and black bear milk specimens contained a wide range of oligosaccharides, many of which had novel, fucosylated structures. Milk specimens from humans, bears, and marsupials had the greatest quantity of, and the most complex, neutral oligosaccharides. Although human milk contained more oligosaccharide than did milk specimens from the other species studied, the presence of appreciable amounts of complex oligosaccharides was not unique to humans. This finding suggests that in animal milk specimens, as in human milk, neutral fucosylated oligosaccharides potentially offer protection from pathogens to offspring with immature immune systems.

  2. General method and thermodynamic tables for computation of equilibrium composition and temperature of chemical reactions

    NASA Technical Reports Server (NTRS)

    Huff, Vearl N; Gordon, Sanford; Morrell, Virginia E

    1951-01-01

    A rapidly convergent successive approximation process is described that simultaneously determines both composition and temperature resulting from a chemical reaction. This method is suitable for use with any set of reactants over the complete range of mixture ratios as long as the products of reaction are ideal gases. An approximate treatment of limited amounts of liquids and solids is also included. This method is particularly suited to problems having a large number of products of reaction and to problems that require determination of such properties as specific heat or velocity of sound of a dissociating mixture. The method presented is applicable to a wide variety of problems that include (1) combustion at constant pressure or volume; and (2) isentropic expansion to an assigned pressure, temperature, or Mach number. Tables of thermodynamic functions needed with this method are included for 42 substances for convenience in numerical computations.
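The iterative idea can be made concrete on the simplest possible case: a single ideal-gas dissociation equilibrium solved numerically for its extent of reaction. The reaction chosen (N2O4 ⇌ 2 NO2), the Kp value, and the bisection solver below are illustrative assumptions, far simpler than the report's general multi-species scheme, which also iterates on temperature:

```python
# Toy sketch: solve one ideal-gas dissociation equilibrium,
# N2O4 <-> 2 NO2, by iteration (bisection on the extent of
# reaction). The Kp value and conditions are invented for
# illustration, not taken from the report.

def kp_residual(x, Kp, P=1.0):
    """Residual of Kp = (y_NO2**2 / y_N2O4) * P at extent x in (0, 1).

    Starting from 1 mol of N2O4: moles N2O4 = 1-x, NO2 = 2x,
    total = 1+x; y denotes a mole fraction.
    """
    total = 1.0 + x
    y_no2 = 2.0 * x / total
    y_n2o4 = (1.0 - x) / total
    return y_no2**2 / y_n2o4 * P - Kp

def solve_extent(Kp, P=1.0, tol=1e-10):
    lo, hi = 1e-12, 1.0 - 1e-12        # bracket the root in (0, 1)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if kp_residual(mid, Kp, P) > 0.0:
            hi = mid                   # too much dissociation
        else:
            lo = mid
    return 0.5 * (lo + hi)

x = solve_extent(Kp=0.15)              # hypothetical Kp at some fixed T
print(round(x, 4))
```

The residual is monotonic in x, so bisection converges reliably; the report's method replaces this scalar search with a simultaneous Newton-type update over all species and temperature.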

  3. Collisional evolution of rotating, non-identical particles. [in Saturn rings

    NASA Technical Reports Server (NTRS)

    Salo, H.

    1987-01-01

    Hameen-Anttila's (1984) theory of self-gravitating collisional particle disks is extended to include the effects of particle spin. Equations are derived for the coupled evolution of random velocities and spins, showing that friction and surface irregularity both reduce the local velocity dispersion and transfer significant amounts of random kinetic energy to rotational energy. Results for the equilibrium ratio of rotational energy to random kinetic energy are exact not only for identical nongravitating mass points, but also if finite size, self-gravitating forces, or size distribution are included. The model is applied to the dynamics of Saturn's rings, showing that the inclusion of rotation reduces the geometrical thickness of the layer of cm-sized particles to, at most, about one-half, with large particles being less affected.

  4. Field test report of the Department of Energy's 100-kW vertical axis wind turbine

    NASA Astrophysics Data System (ADS)

    Nellums, R. O.

    1985-02-01

    Three second generation Darrieus type vertical axis wind turbines of approximately 120 kW capacity per unit were installed in 1980-1981. Through March 1984, over 9000 hours of operation had been accumulated, including 6600 hours of operation on the unit installed in Bushland, Texas. The turbines were heavily instrumented and have yielded a large amount of test data. Test results of this program, including aerodynamic, structural, drive train, and economic data are presented. Among the most favorable results were an aerodynamic peak performance coefficient of 0.41; fundamental structural integrity requiring few repairs and no major component replacements as of March 1984; and an average prototype fabrication cost of approximately $970 per peak kilowatt of output. A review of potential design improvements is presented.

  5. Interactive performance and focus groups with adolescents: the power of play.

    PubMed

    Norris, Anne E; Aroian, Karen J; Warren, Stefanie; Wirth, Jeff

    2012-12-01

    Conducting focus groups with adolescents can be challenging given their developmental needs, particularly with sensitive topics. These challenges include intense need for peer approval, declining social trust, short attention span, and reliance on concrete operations thinking. In this article, we describe an adaptation of interactive performance as an alternative to traditional focus group method. We used this method in a study of discrimination experienced by Muslims (ages 13-17) and of peer pressure to engage in sexual behavior experienced by Hispanic girls (ages 10-14). Recommendations for use of this method include using an interdisciplinary team, planning for large amounts of disclosure towards the end of the focus group, and considering the fit of this method to the study topic. Copyright © 2012 Wiley Periodicals, Inc.

  6. Using Deep Learning to Analyze the Voices of Stars.

    NASA Astrophysics Data System (ADS)

    Boudreaux, Thomas Macaulay

    2018-01-01

    With several new large-scale surveys on the horizon, including LSST, TESS, ZTF, and Evryscope, faster and more accurate analysis methods will be required to adequately process the enormous amount of data produced. Deep learning, used in industry for years now, allows for advanced feature detection in minimally prepared datasets at very high speeds; however, despite the advantages of this method, its application to astrophysics has not yet been extensively explored. This dearth may be due to a lack of training data available to researchers. Here we generate synthetic data loosely mimicking the properties of acoustic mode pulsating stars and compare the performance of different deep learning algorithms, including Artificial Neural Networks and Convolutional Neural Networks, in classifying these synthetic data sets as either pulsators or stars not observed to vary.

  7. Acute toxicity in five dogs after ingestion of a commercial snail and slug bait containing iron EDTA.

    PubMed

    Haldane, S L; Davis, R M

    2009-07-01

    This case series of five dogs describes the effects of ingesting large amounts of an iron EDTA snail-bait product. In all cases signs of toxicity occurred between 6 and 24 h after ingestion and included abdominal pain and haemorrhagic gastroenteritis. Two of the dogs had pretreatment serum iron levels measured and in both cases the levels were above normal limits. All of the dogs were treated with iron chelation therapy and supportive care including intravenous fluids, analgesics, gastric protectants and antibiotics. Chelation therapy with desferrioxamine mesylate did not cause adverse effects in any of the dogs and all survived to discharge. The effects of iron EDTA snail bait in dogs require further study and minimum toxic doses need to be established.

  8. Probabilistic load simulation: Code development status

    NASA Astrophysics Data System (ADS)

    Newell, J. F.; Ho, H.

    1991-05-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are induced in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.
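The coupling of load information to a numerical simulation module can be caricatured as plain Monte Carlo sampling: draw uncertain engine parameters from assumed distributions and accumulate an empirical distribution of a component load. The load model, the distributions, and every parameter value below are invented for illustration and are not the CLS models:

```python
# Toy sketch of probabilistic load simulation: sample uncertain
# inputs and build an empirical load distribution. The load model
# and distributions are hypothetical, not the CLS knowledge base.
import random

random.seed(42)

def chamber_load(pressure, temp_factor):
    # Hypothetical load model: load scales with chamber pressure
    # and a thermal severity factor.
    return pressure * (1.0 + 0.3 * temp_factor)

samples = []
for _ in range(10_000):
    p = random.gauss(mu=700.0, sigma=35.0)   # chamber pressure (illustrative units)
    t = random.uniform(0.0, 1.0)             # thermal severity factor
    samples.append(chamber_load(p, t))

samples.sort()
p99 = samples[int(0.99 * len(samples))]      # 99th-percentile load
print(f"mean={sum(samples) / len(samples):.1f}  p99={p99:.1f}")
```

A design analysis would then feed percentiles like `p99` into probabilistic structural and reliability evaluations, which is the role the abstract assigns to the PDA stage.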

  9. "Women in Astronomy: an Essay Review"

    NASA Astrophysics Data System (ADS)

    Cox, M.

    2006-12-01

    Interest in the history of women in astronomy has increased dramatically in the last 30 years. This interest has come from the growing number of professional scientists, historians and feminists researching the lives and work of earlier generations, as well as from amateur astronomers. It is reflected in the vast amount of literature on the subject, both in books and journals, and on the internet. This Essay Review will focus on monographs published in the last 10 years (1996-2006), and will be restricted mainly to pre-20th century women. The scope includes researchers, translators, computers and astronomical assistants as well as observers. Where appropriate, it includes books that discuss the role of women scientists, as well as pure astronomy books. Part 2, to be published later, will consider encyclopaedias and large works of reference.

  10. Does Tinnitus Distress Depend on Age of Onset?

    PubMed Central

    Schlee, Winfried; Kleinjung, Tobias; Hiller, Wolfgang; Goebel, Gerhard; Kolassa, Iris-Tatjana; Langguth, Berthold

    2011-01-01

    Objectives Tinnitus is the perception of a sound in the absence of any physical source of it. About 5–15% of the population report hearing such a tinnitus and about 1–2% suffer from their tinnitus leading to anxiety, sleep disorders or depression. It is currently not completely understood why some people feel distressed by their tinnitus, while others don't. Several studies indicate that the amount of tinnitus distress is associated with many factors including comorbid anxiety, comorbid depression, personality, the psychosocial situation, the amount of the related hearing loss and the loudness of the tinnitus. Furthermore, theoretical considerations suggest an impact of the age at tinnitus onset influencing tinnitus distress. Methods Based on a sample of 755 normal hearing tinnitus patients we tested this assumption. All participants answered a questionnaire on the amount of tinnitus distress together with a large variety of clinical and demographic data. Results Patients with an earlier onset of tinnitus suffer significantly less than patients with an onset later in life. Furthermore, patients with a later onset of tinnitus describe their course of tinnitus distress as more abrupt and distressing right from the beginning. Conclusion We argue that a decline of compensatory brain plasticity in older age accounts for this age-dependent tinnitus decompensation. PMID:22125612

  11. Impact of water quality on chlorine demand of corroding copper

    EPA Pesticide Factsheets

    Copper is widely used in drinking water premise plumbing system materials. In buildings such as hospitals, large and complicated plumbing networks make it difficult to maintain good water quality. Sustaining safe disinfectant residuals throughout a building to protect against waterborne pathogens such as Legionella is particularly challenging since copper and other reactive distribution system materials can exert considerable demands. The objective of this work was to evaluate the impact of pH and orthophosphate on the consumption of free chlorine associated with corroding copper pipes over time. A copper test-loop pilot system was used to control test conditions and systematically meet the study objectives. Chlorine consumption trends attributed to abiotic reactions with copper over time were different for each pH condition tested, and the total amount of chlorine consumed over the test runs increased with increasing pH. Orthophosphate eliminated chlorine consumption trends with elapsed time (i.e., chlorine demand was consistent across entire test runs). Orthophosphate also greatly reduced the total amount of chlorine consumed over the test runs. Interestingly, the total amount of chlorine consumed and the consumption rate were not pH dependent when orthophosphate was present. The findings reflect the complex and competing reactions at the copper pipe wall including corrosion, oxidation of Cu(I) minerals and ions, and possible oxidation of Cu(II) minerals, and the change in

  12. Small- bowel mucosal changes and antibody responses after low- and moderate-dose gluten challenge in celiac disease

    PubMed Central

    2011-01-01

    Background Due to the restrictive nature of a gluten-free diet, celiac patients are looking for alternative therapies. While drug-development programs include gluten challenges, knowledge regarding the duration of gluten challenge and gluten dosage is insufficient. We challenged adult celiac patients with gluten with a view to assessing the amount needed to cause some small-bowel mucosal deterioration. Methods Twenty-five celiac disease adults were challenged with low (1-3 g) or moderate (3-5 g) doses of gluten daily for 12 weeks. Symptoms, small-bowel morphology, densities of CD3+ intraepithelial lymphocytes (IELs) and celiac serology were determined. Results Both moderate and low amounts of gluten induced small-bowel morphological damage in 67% of celiac patients. Moderate gluten doses also triggered mucosal inflammation and more gastrointestinal symptoms, leading to premature withdrawals in seven cases. In 22% of those who developed significant small-intestinal damage, symptoms remained absent. Celiac antibodies seroconverted in 43% of the patients. Conclusions Low amounts of gluten can also cause significant mucosal deterioration in the majority of the patients. As there are always some celiac disease patients who will not respond within these conditions, sample sizes must be sufficiently large to attain statistical power in the analysis. PMID:22115041

  13. Measurement of tritium penetration through concrete material covered by various paints coating

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edao, Y.; Kawamura, Y.; Kurata, R.

    The present study aims at obtaining fundamental data on tritium migration in porous materials, including the soaking effect, the interaction between tritium and cement paste coated with paints, and transient tritium sorption in porous cement. The amounts of tritium penetrating into or released from cement paste with epoxy and urethane paint coatings were measured. The tritium penetration amounts increased with the HTO (tritiated water) exposure time. Time to achieve a saturated value of tritium sorption was more than 60 days for cement paste coated with epoxy paint and with urethane paint, while cement paste without any paint coating took 2 days to achieve it. The effect of tritium permeation reduction by the epoxy paint was higher than that of the urethane. Although the paint coatings were effective for reducing tritium penetration through cement paste exposed to HTO for a short period, it was found that the amount of tritium trapped in the paints became large over a long period. Tritium penetration rates were estimated by an analysis of one-dimensional diffusion across the thickness of a sample. The data obtained are helpful for the evaluation of tritium contamination and decontamination. (authors)

  14. Futures of global urban expansion: uncertainties and implications for biodiversity conservation

    NASA Astrophysics Data System (ADS)

    Güneralp, B.; Seto, K. C.

    2013-03-01

    Urbanization will place significant pressures on biodiversity across the world. However, there are large uncertainties in the amount and location of future urbanization, particularly urban land expansion. Here, we present a global analysis of urban extent circa 2000 and probabilistic forecasts of urban expansion for 2030 near protected areas and in biodiversity hotspots. We estimate that the amount of urban land within 50 km of all protected area boundaries will increase from 450 000 km2 circa 2000 to 1 440 000 ± 65 000 km2 in 2030. Our analysis shows that protected areas around the world will experience significant increases in urban land within 50 km of their boundaries. China will experience the largest increase in urban land near protected areas with 304 000 ± 33 000 km2 of new urban land to be developed within 50 km of protected area boundaries. The largest urban expansion in biodiversity hotspots, over 100 000 ± 25 000 km2, is forecasted to occur in South America. Uncertainties in the forecasts of the amount and location of urban land expansion reflect uncertainties in their underlying drivers including urban population and economic growth. The forecasts point to the need to reconcile urban development and biodiversity conservation strategies.

  15. The use of waste materials for concrete production in construction applications

    NASA Astrophysics Data System (ADS)

    Teara, Ashraf; Shu Ing, Doh; Tam, Vivian WY

    2018-04-01

    To sustain the environment, it is crucial to find solutions for waste, pollution, and resource depletion and degradation. In construction, concrete from building demolition makes up 30-40% of total waste. Expensive dumping costs, landfill taxes and limited disposal sites create an opportunity to develop recycled concrete. Recycled aggregates were used to reconstruct damaged infrastructure and roads after World War II. However, recycled concrete, which incorporates fly ash, slag and recycled aggregate (RA), is not widely used because of its poor quality compared with ordinary concrete. This research investigates the possibility of using recycled concrete in construction applications as normal concrete. Methods include varying the proportion of natural aggregate replaced by recycled aggregate, and substituting cement with slag cement combined with fly ash. The study reveals that slag and fly ash are effective supplementary elements for improving the properties of concrete containing cement; without cement, however, these two elements do not play an important role in improving the properties. Also, slag is more useful than fly ash as long as its amount does not exceed 50%. Moreover, recycled aggregate contributes positively to the concrete mixture in terms of compressive strength. Finally, concrete strength increases as the amount of RA increases, attributable to the high quality of the RA, the method of mixing, or both.

  16. Mining moving object trajectories in location-based services for spatio-temporal database update

    NASA Astrophysics Data System (ADS)

    Guo, Danhuai; Cui, Weihong

    2008-10-01

    Advances in wireless transmission and mobile technology applied to LBS (Location-Based Services) flood us with moving-object data. Vast amounts of data gathered from the position sensors of mobile phones, PDAs, or vehicles hide interesting and valuable knowledge and describe the behavior of moving objects. The correlation between the temporal movement patterns of moving objects and the spatio-temporal attributes of geo-features has been ignored, and the value of spatio-temporal trajectory data has not been fully exploited. Urban expansion and frequent town-plan changes produce a large amount of outdated or imprecise data in the spatial databases of LBS, which cannot be updated promptly and efficiently by manual processing. In this paper we introduce a data mining approach to extracting the movement patterns of moving objects, build a model to describe the relationship between the movement patterns of LBS mobile objects and their environment, and propose a spatio-temporal database update strategy for LBS databases based on spatio-temporal trajectory mining. Experimental evaluation reveals excellent performance of the proposed model and strategy. Our original contributions include the formulation of a model of the interaction between a trajectory and its environment, the design of a spatio-temporal database update strategy based on moving-object data mining, and the experimental application of spatio-temporal database updating by mining moving-object trajectories.

  17. Corrosion of Pipeline and Wellbore Steel by Liquid CO2 Containing Trace Amounts of Water and SO2

    NASA Astrophysics Data System (ADS)

    McGrail, P.; Schaef, H. T.; Owen, A. T.

    2009-12-01

    Carbon dioxide capture and storage in deep saline formations is currently considered the most attractive option to reduce greenhouse gas emissions with continued use of fossil fuels for energy production. Transporting captured CO2 and injection into suitable formations for storage will necessarily involve pipeline systems and wellbores constructed of carbon steels. Industry standards currently require nearly complete dehydration of liquid CO2 to reduce corrosion in the pipeline transport system. However, it may be possible to establish a corrosion threshold based on H2O content in the CO2 that could allow for minor amounts of H2O to remain in the liquid CO2 and thereby eliminate a costly dehydration step. Similarly, trace amounts of sulfur and nitrogen compounds common in flue gas streams are currently removed through expensive desulfurization and catalytic reduction processes. Provided these contaminants could be safely and permanently transported and stored in the geologic reservoir, retrofits of existing fossil-fuel plants could address comprehensive emissions reductions, including CO2 at perhaps nearly the same capital and operating cost. Because CO2-SO2 mixtures have never been commercially transported or injected, both experimental and theoretical work is needed to understand corrosion mechanisms of various steels in these gas mixtures containing varying amounts of water. Experiments were conducted with common tool steel (AISI-01) and pipeline steel (X65) immersed in liquid CO2 at room temperature containing ~1% SO2 and varying amounts of H2O (0 to 2500 ppmw). A threshold concentration of H2O in the liquid CO2-SO2 mixture was established based on the absence of visible surface corrosion. For example, experiments exposing steel to liquid CO2-SO2 containing ~300 ppmw H2O showed a delay in onset of visible corrosion products and minimal surface corrosion was visible after five days of testing. 
However, increasing the water content to 760 ppmw produced extensive surface corrosion after 48 hours at room temperature. Surface characterization by SEM showed one type of morphology that included large circular features radiating outward from a central structure. Chemical analyses obtained by SEM-EDX indicate the phases contained mostly Fe and S with minor amounts of Mn. Corrosion products completely covering the metal coupon surface were identified by XRD as iron sulfite hydrate (FeSO3·3H2O), with lesser amounts of gravegliaite (MnSO3·3H2O) and rozenite (Fe(SO4)·(H2O)4).

  18. Sidewall-box airlift pump provides large flows for aeration, CO2 stripping, and water rotation in large dual-drain circular tanks

    USDA-ARS?s Scientific Manuscript database

    Conventional gas transfer technologies for aquaculture systems occupy a large amount of space, require a considerable capital investment, and can contribute to high electricity demand. In addition, diffused aeration in a circular culture tank can interfere with the hydrodynamics of water rotation a...

  19. Hapl-o-Mat: open-source software for HLA haplotype frequency estimation from ambiguous and heterogeneous data.

    PubMed

    Schäfer, Christian; Schmidt, Alexander H; Sauter, Jürgen

    2017-05-30

    Knowledge of HLA haplotypes is helpful in many settings, such as disease association studies, population genetics, and hematopoietic stem cell transplantation. In the recruitment of unrelated hematopoietic stem cell donors, HLA haplotype frequencies of specific populations are used to optimize both donor searches for individual patients and strategic donor registry planning. However, the estimation of haplotype frequencies from HLA genotyping data is challenged by the large amount of genotype data, the complex HLA nomenclature, and the heterogeneous and ambiguous nature of typing records. To meet these challenges, we have developed the open-source software Hapl-o-Mat. It estimates haplotype frequencies from population data including an arbitrary number of loci using an expectation-maximization algorithm. Its key features are the processing of different HLA typing resolutions within a given population sample and the handling of ambiguities recorded via multiple allele codes or genotype list strings. Implemented in C++, Hapl-o-Mat facilitates efficient haplotype frequency estimation from large amounts of genotype data. We demonstrate its accuracy and performance on the basis of artificial and real genotype data. Hapl-o-Mat is a versatile and efficient software package for HLA haplotype frequency estimation. Its capability of processing various forms of HLA genotype data allows for straightforward haplotype frequency estimation from the typing records usually found in stem cell donor registries.
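    The core of such haplotype frequency estimation is an expectation-maximization loop over the haplotype pairs consistent with each unphased genotype. The sketch below is a generic textbook-style EM for small multi-locus data, not Hapl-o-Mat's C++ implementation; it ignores HLA-specific features such as typing resolutions, multiple allele codes, and genotype list strings:

```python
from itertools import product
from collections import defaultdict

def consistent_pairs(genotype):
    # genotype: tuple of per-locus allele pairs, e.g. ((0, 1), (0, 0)).
    # Enumerate the unordered haplotype pairs consistent with it by
    # choosing an orientation at every heterozygous locus.
    het = [i for i, (a, b) in enumerate(genotype) if a != b]
    pairs = set()
    for bits in product([0, 1], repeat=len(het)):
        h1, h2, k = [], [], 0
        for (a, b) in genotype:
            if a == b:
                h1.append(a); h2.append(a)
            elif bits[k] == 0:
                h1.append(a); h2.append(b); k += 1
            else:
                h1.append(b); h2.append(a); k += 1
        pairs.add(tuple(sorted([tuple(h1), tuple(h2)])))
    return sorted(pairs)

def em_haplotype_freqs(genotypes, iters=100):
    # Start from a uniform distribution over all candidate haplotypes.
    haps = sorted({h for g in genotypes for p in consistent_pairs(g) for h in p})
    freq = {h: 1.0 / len(haps) for h in haps}
    for _ in range(iters):
        counts = defaultdict(float)
        for g in genotypes:
            pairs = consistent_pairs(g)
            # E-step: weight each consistent pair by its probability
            # under the current frequencies (heterozygous pairs have
            # two orderings, hence the factor 2).
            weights = [freq[h1] * freq[h2] * (2 if h1 != h2 else 1)
                       for (h1, h2) in pairs]
            total = sum(weights)
            for (h1, h2), w in zip(pairs, weights):
                counts[h1] += w / total
                counts[h2] += w / total
        # M-step: renormalize expected haplotype counts into frequencies.
        n = sum(counts.values())
        freq = {h: c / n for h, c in counts.items()}
    return freq
```

    On a toy sample with two homozygotes (00/00 and 11/11) plus a double heterozygote, the loop resolves the ambiguous individual toward the 00/11 phasing, as expected.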

  20. Influence of the Wenchuan earthquake on self-reported irregular menstrual cycles in surviving women.

    PubMed

    Li, Xiao-Hong; Qin, Lang; Hu, Han; Luo, Shan; Li, Lei; Fan, Wei; Xiao, Zhun; Li, Ying-Xing; Li, Shang-Wei

    2011-09-01

    To explore the influence of stress induced by the Wenchuan earthquake on the menstrual cycles of surviving women, self-reports of the menstrual cycles of 473 women who survived the Wenchuan earthquake were analyzed. Menstrual regularity was defined as a cycle length between 21 and 35 days. The death of a child or the loss of property and social resources was verified for all surviving women. The severity of these losses was assessed and graded as high, low, or none. About 21% of the study participants reported that their menstrual cycles became irregular after the Wenchuan earthquake, a percentage significantly higher than before the earthquake (6%, p < 0.05). About 30% of the surviving women with a high degree of loss in the earthquake reported menstrual irregularity after the earthquake. Association analyses showed that several stressors of the Wenchuan earthquake were strongly associated with self-reports of menstrual irregularity, including the loss of children (RR: 1.58; 95% CI: 1.09, 2.28), the loss of large amounts of property (RR: 1.49; 95% CI: 1.03, 2.15), the loss of social resources (RR: 1.34; 95% CI: 1.00, 1.80) and hormonal contraception use (RR: 1.62; 95% CI: 1.21, 1.83). Self-reported menstrual irregularity is common in women who survived the Wenchuan earthquake, especially those who lost children, large amounts of property or social resources.

  1. CMS Connect

    NASA Astrophysics Data System (ADS)

    Balcas, J.; Bockelman, B.; Gardner, R., Jr.; Hurtado Anampa, K.; Jayatilaka, B.; Aftab Khan, F.; Lannon, K.; Larson, K.; Letts, J.; Marra Da Silva, J.; Mascheroni, M.; Mason, D.; Perez-Calero Yzquierdo, A.; Tiradani, A.

    2017-10-01

    The CMS experiment collects and analyzes large amounts of data coming from high energy particle collisions produced by the Large Hadron Collider (LHC) at CERN. This involves a huge amount of real and simulated data processing that needs to be handled on batch-oriented platforms. The CMS Global Pool of computing resources provides over 100K dedicated CPU cores, and another 50K to 100K CPU cores from opportunistic resources, for these kinds of tasks. Even though production and event processing analysis workflows are already managed by existing tools, there is still a lack of support for submitting the final-stage, condor-like analysis jobs familiar to users of Tier-3 or local computing facilities into these distributed resources in a way that is user-friendly and integrated with other CMS services. CMS Connect is a set of computing tools and services designed to augment existing services in the CMS physics community, focusing on these condor analysis jobs. It is based on the CI-Connect platform developed by the Open Science Grid and uses the CMS GlideInWMS infrastructure to transparently plug CMS global grid resources into a virtual pool accessed via a single submission machine. This paper describes the specific developments and deployment of CMS Connect beyond the CI-Connect platform to integrate the service with CMS-specific needs, including site-specific submission, job accounting, and automated reporting to standard CMS monitoring resources in an effortless way for its users.

  2. Healthcare and the Roles of the Medical Profession in the Big Data Era

    PubMed Central

    YAMAMOTO, Yuji

    2016-01-01

    The accumulation of large amounts of healthcare information is in progress, and society is about to enter the Health Big Data era by linking such data. Medical professionals’ daily tasks in clinical practice have become more complicated due to information overload, accelerated technological development, and the expansion of conceptual frameworks for medical care. Further, their responsibilities are more challenging and their workload is consistently increasing. As medical professionals enter the Health Big Data era, we need to reevaluate the fundamental significance and role of medicine and investigate ways to utilize this available information and technology. For example, a data analysis on diabetes patients has already shed light on the status of accessibility to physicians and the treatment response rate. In time, large amounts of health data will help find solutions, including new effective treatments, that could not be discovered by conventional means. Despite the vastness of accumulated data and analyses, their interpretation is necessarily conducted by attending physicians who communicate these findings to patients face to face; this task cannot be replaced by technology. As medical professionals, we must take the initiative to evaluate the framework of medicine in the Health Big Data era, study the ideal approach for clinical practitioners within this framework, and spread awareness to the public about our framework and approach while implementing them. PMID:28299246

  3. Distribution of a pelagic tunicate, Salpa fusiformis in warm surface current of the eastern Korean waters and its impingement on cooling water intakes of Uljin nuclear power plant.

    PubMed

    Chae, Jinho; Choi, Hyun Woo; Lee, Woo Jin; Kim, Dongsung; Lee, Jae Hac

    2008-07-01

    Impingement of a large amount of gelatinous plankton, Salpa fusiformis, on the seawater intake screens of the nuclear power plant at Uljin was first recorded on 18 June 2003. The total mass of the clogging animals was estimated at approximately 295 tons, and the resulting shortage of cooling seawater caused a 38% decrease in the generation capability of the power plant. Zooplankton collected with a multiple towing net during the day and at night from 5 to 6 June 2003 included various gelatinous zooplankton known to be warm water species, such as salps and siphonophores. The comparatively large species Salpa fusiformis accounted for 25.4% of the individual density of the gelatinous plankton and was distributed in surface waters shallower than the thermocline, performing little diel vertical migration. Temperature, salinity and satellite data also showed that a warm surface current predominated over the southern coastal region near the power plant in June. The results suggest that a warm surface current occasionally extending into the neritic region may transfer S. fusiformis to the waters off the power plant. The environmental factors, and their relation to the ecobiology of the large salp populations being drawn into the intake channel of the power plant, are discussed.

  4. Processing and Analysis of Multichannel Extracellular Neuronal Signals: State-of-the-Art and Challenges

    PubMed Central

    Mahmud, Mufti; Vassanelli, Stefano

    2016-01-01

    In recent years multichannel neuronal signal acquisition systems have allowed scientists to focus on research questions which were otherwise impossible. They act as a powerful means to study brain (dys)functions in in-vivo and in-vitro animal models. Typically, each session of electrophysiological experiments with multichannel data acquisition systems generates a large amount of raw data. For example, a 128-channel signal acquisition system with 16-bit A/D conversion and a 20 kHz sampling rate will generate approximately 17 GB of data per hour (uncompressed). This poses the important and challenging problem of inferring conclusions from such large amounts of acquired data. Thus, automated signal processing and analysis tools are becoming a key component in neuroscience research, facilitating extraction of relevant information from neuronal recordings in a reasonable time. The purpose of this review is to introduce the reader to the current state-of-the-art of open-source packages for (semi)automated processing and analysis of multichannel extracellular neuronal signals (i.e., neuronal spikes, local field potentials, electroencephalogram, etc.), and the existing neuroinformatics infrastructure for tool and data sharing. The review concludes by pinpointing some major challenges that are being faced, which include the development of novel benchmarking techniques, cloud-based distributed processing and analysis tools, as well as defining novel means to share and standardize data. PMID:27313507
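    The quoted figure is easy to verify: 128 channels × 2 bytes per sample × 20,000 samples per second, sustained for an hour, comes to about 18.4 decimal GB, or roughly 17 GiB, which matches the abstract's estimate. A trivial helper makes the arithmetic explicit:

```python
def raw_rate_bytes_per_hour(channels, bits_per_sample, sampling_hz):
    # uncompressed acquisition rate: channels * bytes/sample * samples/s * s/h
    return channels * (bits_per_sample // 8) * sampling_hz * 3600

rate = raw_rate_bytes_per_hour(128, 16, 20_000)
print(rate / 1e9)    # ~18.4 decimal GB per hour
print(rate / 2**30)  # ~17.2 GiB per hour, the "17 GB" cited in the text
```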

  5. Recent advances quantifying the large wood dynamics in river basins: New methods and remaining challenges

    NASA Astrophysics Data System (ADS)

    Ruiz-Villanueva, Virginia; Piégay, Hervé; Gurnell, Angela A.; Marston, Richard A.; Stoffel, Markus

    2016-09-01

    Large wood is an important physical component of woodland rivers and significantly influences river morphology. It is also a key component of stream ecosystems. However, large wood is also a source of risk for human activities, as it may damage infrastructure, block river channels, and induce flooding. Therefore, the analysis and quantification of large wood and its mobility are crucial for understanding and managing wood in rivers. As the number of large-wood-related studies by researchers, river managers, and stakeholders increases, documentation of commonly used and newly available techniques and their effectiveness has also become increasingly relevant. Important data and knowledge have been obtained from the application of very different approaches and have generated a significant body of valuable information representative of different environments. This review provides a comprehensive qualitative and quantitative summary of recent advances regarding the different processes involved in large wood dynamics in fluvial systems, including wood budgeting and wood mechanics. First, some key definitions and concepts are introduced. Second, advances in quantifying large wood dynamics are reviewed; in particular, how measurements and modeling can be combined to integrate our understanding of how large wood moves through and is retained within river systems. Throughout, we present a quantitative and integrated meta-analysis compiled from different studies and geographical regions. Finally, we conclude by highlighting areas of particular research importance and their likely future trajectories, and we consider particularly underresearched areas so as to stress the future challenges for large wood research.

  6. Factors associated with the amount of public home care received by elderly and intellectually disabled individuals in a large Norwegian municipality.

    PubMed

    Døhl, Øystein; Garåsen, Helge; Kalseth, Jorid; Magnussen, Jon

    2016-05-01

    This study reports an analysis of factors associated with home care use in a setting in which long-term care services are provided within a publicly financed welfare system. We considered two groups of home care recipients: elderly individuals and intellectually disabled individuals. Routinely collected data on users of public home care in the municipality of Trondheim in October 2012, including 2493 people aged 67 years or older and 270 intellectually disabled people, were used. Multivariate regression analysis was used to analyse the relationship between the time spent in direct contact with recipients by public healthcare personnel and perceived individual determinants of home care use (i.e. physical disability, cognitive impairment, diagnoses, age and gender, as well as socioeconomic characteristics). Physical disability and cognitive impairment are routinely registered for long-term care users through a standardised instrument that is used in all Norwegian municipalities. Factor analysis was used to aggregate the individual items into composite variables that were included as need variables. Both physical disability and cognitive impairment were strong predictors of the amount of received care for both elderly and intellectually disabled individuals. Furthermore, we found a negative interaction effect between physical disability and cognitive impairment for elderly home care users. For elderly individuals, we also found significant positive associations between weekly hours of home care and having comorbidity, living alone, living in a service flat and having a safety alarm. The reduction in the amount of care for elderly individuals living with a cohabitant was substantially greater for males than for females. For intellectually disabled individuals, receiving services involuntarily due to severe behavioural problems was a strong predictor of the amount of care received. 
Our analysis showed that routinely collected data capture important predictors of home care use and thus facilitate both short-term budgeting and long-term planning of home care services. © 2015 John Wiley & Sons Ltd.

  7. Computerized breast cancer analysis system using three stage semi-supervised learning method.

    PubMed

    Sun, Wenqing; Tseng, Tzu-Liang Bill; Zhang, Jianying; Qian, Wei

    2016-10-01

    A large amount of labeled medical image data is usually required to train a well-performing computer-aided detection (CAD) system. But the process of data labeling is time consuming, and potential ethical and logistical problems may also present complications. As a result, incorporating unlabeled data into a CAD system can be a feasible way to combat these obstacles. In this study we developed a three-stage semi-supervised learning (SSL) scheme that combines a small amount of labeled data with a larger amount of unlabeled data. The scheme extends our existing CAD system through the following three stages: data weighing, feature selection, and a newly proposed dividing co-training data labeling algorithm. Global density asymmetry features were incorporated into the feature pool to reduce the false positive rate. Area under the curve (AUC) and accuracy were computed using 10-fold cross validation to evaluate the performance of our CAD system. The image dataset includes mammograms from 400 women who underwent routine screening examinations; each pair contains either two cranio-caudal (CC) or two mediolateral-oblique (MLO) view mammograms from the right and left breasts. From these mammograms 512 regions were extracted and used in this study; among them, 90 regions were treated as labeled and the rest as unlabeled. Using our proposed scheme, the highest AUC observed in our research was 0.841, obtained with the 90 labeled data and all the unlabeled data. It was 7.4% higher than using labeled data only. With an increasing amount of labeled data, the AUC difference between using mixed data and using labeled data only peaked when the amount of labeled data was around 60. This study demonstrated that our proposed three-stage semi-supervised learning scheme can improve CAD performance by incorporating unlabeled data. 
Using unlabeled data is promising in computerized cancer research and may have a significant impact on future CAD system applications. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
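    For readers unfamiliar with semi-supervised learning, the general idea of growing the labeled set with confidently pseudo-labeled examples can be shown with a deliberately simple self-training loop. This is a generic illustration only, using a toy nearest-centroid classifier; the paper's actual three-stage scheme (data weighing, feature selection, dividing co-training) is considerably more involved:

```python
import math

def centroid(points):
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(len(points[0]))]

def self_train(labeled, unlabeled, rounds=5):
    """labeled: list of (features, class); unlabeled: list of features.

    Each round, classify unlabeled points by nearest class centroid,
    then move the single most confident point (largest margin between
    the two nearest centroids) into the labeled set with its pseudo-label.
    """
    labeled, pool = list(labeled), list(unlabeled)
    for _ in range(rounds):
        if not pool:
            break
        cents = {c: centroid([x for x, y in labeled if y == c])
                 for c in {y for _, y in labeled}}
        scored = []
        for x in pool:
            ds = sorted((math.dist(x, m), c) for c, m in cents.items())
            margin = ds[1][0] - ds[0][0]  # confidence proxy
            scored.append((margin, x, ds[0][1]))
        scored.sort(reverse=True)
        _, x, c = scored[0]          # most confident unlabeled point
        labeled.append((x, c))       # pseudo-label it
        pool.remove(x)
    return labeled
```

    Real CAD pipelines replace the centroid classifier with stronger learners and add safeguards against reinforcing early labeling mistakes, which is precisely what schemes like co-training address.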

  8. Beverage consumption among European adolescents in the HELENA study.

    PubMed

    Duffey, K J; Huybrechts, I; Mouratidou, T; Libuda, L; Kersting, M; De Vriendt, T; Gottrand, F; Widhalm, K; Dallongeville, J; Hallström, L; González-Gross, M; De Henauw, S; Moreno, L A; Popkin, B M

    2012-02-01

    Our objective was to describe the fluid and energy consumption of beverages in a large sample of European adolescents. We used data from 2741 European adolescents residing in 8 countries participating in the Healthy Lifestyle in Europe by Nutrition in Adolescence Cross-Sectional Study (HELENA-CSS). We averaged two 24-h recalls, collected using the HELENA-dietary assessment tool. By gender and age subgroup (12.5-14.9 years and 15-17.5 years), we examined per capita and per consumer fluid (milliliters (ml)) and energy (kilojoules (kJ)) intake from beverages and percentage consuming 10 different beverage groups. Mean beverage consumption was 1611 ml/day in boys and 1316 ml/day in girls. Energy intake from beverages was about 1966 kJ/day and 1289 kJ/day in European boys and girls, respectively, with sugar-sweetened beverages (SSBs) (carbonated and non-carbonated beverages, including soft drinks, fruit drinks and powders/concentrates) contributing to daily energy intake more than other groups of beverages. Boys and older adolescents consumed the largest amounts of per capita total energy from beverages. Among all age and gender subgroups, SSBs, sweetened milk (including chocolate milk and flavored yogurt drinks all with added sugar), low-fat milk and fruit juice provided the highest amounts of per capita energy. Water was consumed by the largest percentage of adolescents followed by SSBs, fruit juice and sweetened milk. Among consumers, water provided the greatest fluid intake and sweetened milk accounted for the largest amount of energy intake followed by SSBs. Patterns of energy intake from each beverage varied between countries. European adolescents consume an average of 1455 ml/day of beverages, with the largest proportion of consumers and the largest fluid amount coming from water. Beverages provide 1609 kJ/day, of which 30.4%, 20.7% and 18.1% comes from SSBs, sweetened milk and fruit juice, respectively.

  9. Beverage consumption among European adolescents in the HELENA Study

    PubMed Central

    Duffey, K.J.; Huybrechts, I.; Mouratidou, T.; Libuda, L.; Kersting, M.; DeVriendt, T.; Gottrand, F.; Widhalm, K.; Dallongeville, J.; Hallström, L.; González-Gross, M.; DeHenauw, S.; Moreno, L.A.; Popkin, B.M.

    2012-01-01

    Background and Objective: Our objective was to describe the fluid and energy consumption of beverages in a large sample of European adolescents. Methods: We used data from 2,741 European adolescents residing in 8 countries participating in the Healthy Lifestyle in Europe by Nutrition in Adolescence Cross Sectional Study (HELENA-CSS). We averaged two 24-hour recalls, collected using the HELENA-dietary assessment tool. By gender and age subgroup (12.5–14.9 y and 15–17.5 y), we examined per capita and per consumer fluid (milliliters [mL]) and energy (kilojoules [kJ]) intake from beverages and percent consuming ten different beverage groups. Results: Mean beverage consumption was 1611 ml/d in boys and 1316 ml/d in girls. Energy intake from beverages was about 1966 kJ/d and 1289 kJ/d in European boys and girls respectively, with sugar-sweetened beverages (carbonated and non-carbonated beverages, including soft drinks, fruit drinks and powders/concentrates) contributing to daily energy intake more than other groups of beverages. Boys and older adolescents consumed the largest amounts of per capita total energy from beverages. Among all age and gender subgroups sugar-sweetened beverages, sweetened milk (including chocolate milk and flavored yogurt drinks all with added sugar), low-fat milk, and fruit juice provided the highest amounts of per capita energy. Water was consumed by the largest percent of adolescents followed by sugar-sweetened beverages, fruit juice, and sweetened milk. Among consumers, water provided the greatest fluid intake and sweetened milk accounted for the largest amount of energy intake followed by sugar-sweetened beverages. Patterns of energy intake from each beverage varied between countries. Conclusions: European adolescents consume an average of 1455 ml/d of beverages, with the largest proportion of consumers and the largest fluid amount coming from water. 
Beverages provide 1609 kJ/d, of which 30.4%, 20.7%, and 18.1% comes from sugar-sweetened beverages, sweetened milk, and fruit juice respectively. PMID:21952695

  10. Default values for assessment of potential dermal exposure of the hands to industrial chemicals in the scope of regulatory risk assessments.

    PubMed

    Marquart, Hans; Warren, Nicholas D; Laitinen, Juha; van Hemmen, Joop J

    2006-07-01

    Dermal exposure needs to be addressed in regulatory risk assessment of chemicals. The models used so far are based on very limited data. The EU project RISKOFDERM has gathered a large number of new measurements on dermal exposure to industrial chemicals in various work situations, together with information on possible determinants of exposure. These data and information, together with some non-RISKOFDERM data were used to derive default values for potential dermal exposure of the hands for so-called 'TGD exposure scenarios'. TGD exposure scenarios have similar values for some very important determinant(s) of dermal exposure, such as amount of substance used. They form narrower bands within the so-called 'RISKOFDERM scenarios', which cluster exposure situations according to the same purpose of use of the products. The RISKOFDERM scenarios in turn are narrower bands within the so-called Dermal Exposure Operation units (DEO units) that were defined in the RISKOFDERM project to cluster situations with similar exposure processes and exposure routes. Default values for both reasonable worst case situations and typical situations were derived, both for single datasets and, where possible, for combined datasets that fit the same TGD exposure scenario. 
The following reasonable worst case potential hand exposures were derived from combined datasets: (i) loading and filling of large containers (or mixers) with large amounts (many litres) of liquids: 11,500 mg per scenario (14 mg cm(-2) per scenario with surface of the hands assumed to be 820 cm(2)); (ii) careful mixing of small quantities (tens of grams in <1l): 4.1 mg per scenario (0.005 mg cm(-2) per scenario); (iii) spreading of (viscous) liquids with a comb on a large surface area: 130 mg per scenario (0.16 mg cm(-2) per scenario); (iv) brushing and rolling of (relatively viscous) liquid products on surfaces: 6500 mg per scenario (8 mg cm(-2) per scenario) and (v) spraying large amounts of liquids (paints, cleaning products) on large areas: 12,000 mg per scenario (14 mg cm(-2) per scenario). These default values are considered useful for estimating exposure for similar substances in similar situations with low uncertainty. Several other default values based on single datasets can also be used, but lead to estimates with a higher uncertainty, due to their more limited basis. Sufficient analogy in all described parameters of the scenario, including duration, is needed to enable proper use of the default values. The default values lead to similar estimates as the RISKOFDERM dermal exposure model that was based on the same datasets, but uses very different parameters. Both approaches are preferred over older general models, such as EASE, that are not based on data from actual dermal exposure situations.
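    The per-area figures in parentheses are simply the per-scenario masses divided by the assumed 820 cm² hand surface; for instance, 11,500 mg / 820 cm² ≈ 14 mg cm⁻². A one-line helper (illustrative only) reproduces the conversion:

```python
HAND_AREA_CM2 = 820  # hand surface area assumed in the source

def per_area(mg_per_scenario, area_cm2=HAND_AREA_CM2):
    # convert a whole-scenario mass loading (mg) to mg per cm2 of skin
    return mg_per_scenario / area_cm2

for mass in (11_500, 4.1, 130, 6_500, 12_000):
    print(mass, "mg ->", round(per_area(mass), 3), "mg/cm2")
```

    Note that the source rounds these quotients fairly coarsely (e.g. 6,500 mg / 820 cm² is closer to 7.9 than to the quoted 8 mg cm⁻²).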

  11. Solubilization of benomyl for xylem injection in vascular wilt disease control

    Treesearch

    Percy McWain; Garold F. Gregory; Garold F. Gregory

    1971-01-01

    Benomyl, in varying amounts, was solubilized in several solvents, thus allowing injection into trees for fungus disease prevention and therapy. A large amount of benomyl can be solubilized in diluted lactic acid. The resulting solution can be infinitely diluted with water without precipitation. These characteristics make it the current solution of choice for our tree...

  12. Pseudo-CFI for industrial forest industries

    Treesearch

    Francis A. Roesch

    2000-01-01

    Corporate inventory systems have historically had a greater spatial and temporal intensity than is common in the public sector. For many corporations, these inventory systems might be described as dynamic in that current estimates rely on a small amount of recent data and a large amount of information resulting from the imputation of older data that have been...

  14. Sleep and Delinquency: Does the Amount of Sleep Matter?

    ERIC Educational Resources Information Center

    Clinkinbeard, Samantha S.; Simi, Pete; Evans, Mary K.; Anderson, Amy L.

    2011-01-01

    Sleep, a key indicator of health, has been linked to a variety of indicators of well-being such that people who get an adequate amount generally experience greater well-being. Further, a lack of sleep has been linked to a wide range of negative developmental outcomes, yet sleep has been largely overlooked among researchers interested in adolescent…

  15. The effectiveness of flocculants on inorganic and metallic species removal during aerobic digestion of wastewater from poultry processing plant

    USDA-ARS?s Scientific Manuscript database

    A large amount of water is used in processing our food supplies, especially in meat processing plants. The resulting wastewater cannot be discharged freely back into natural settings due to regulatory mandates, whether the sinks are rivers, ponds, or other natural systems. These wa...

  17. Transfer of Learning from Management Development Programmes: Testing the Holton Model

    ERIC Educational Resources Information Center

    Kirwan, Cyril; Birchall, David

    2006-01-01

    Transfer of learning from management development programmes has been described as the effective and continuing application back at work of the knowledge and skills gained on those programmes. It is a very important issue for organizations today, given the large amounts of investment in these programmes and the small amounts of that investment that…

  18. Cultivation of marine sponges

    NASA Astrophysics Data System (ADS)

    Qu, Yi; Zhang, Wei; Li, Hua; Yu, Xingju; Jin, Meifang

    2005-06-01

    Sponges are the most primitive of multicellular animals and are major pharmaceutical sources of marine secondary metabolites. A wide variety of new compounds have been isolated from sponges. To produce sufficient amounts of the needed compounds, it is necessary to obtain large amounts of sponges. The production of sponge biomass has therefore become a focus of marine biotechnology.

  19. DBMap: a TreeMap-based framework for data navigation and visualization of brain research registry

    NASA Astrophysics Data System (ADS)

    Zhang, Ming; Zhang, Hong; Tjandra, Donny; Wong, Stephen T. C.

    2003-05-01

    The purpose of this study is to investigate and apply a new, intuitive, and space-conscious visualization framework to facilitate efficient data presentation and exploration of large-scale data warehouses. We have implemented the DBMap framework for the UCSF Brain Research Registry. Such a utility would help medical specialists and clinical researchers better explore and evaluate the many attributes organized in the brain research registry. The current UCSF Brain Research Registry consists of a federation of disease-oriented database modules, including Epilepsy, Brain Tumor, Intracerebral Hemorrhage, and CJD (Creutzfeldt-Jakob disease). These database modules organize large volumes of imaging and non-imaging data to support Web-based clinical research. While the data warehouse supports general information retrieval and analysis, it lacks an effective way to visualize and present the voluminous and complex data stored. This study investigates whether the TreeMap algorithm can be adapted to display and navigate a categorical biomedical data warehouse or registry. TreeMap is a space-constrained graphical representation of large hierarchical data sets, mapped to a matrix of rectangles whose size and color represent database fields of interest. It allows a large amount of numerical and categorical information to be displayed in the limited real estate of a computer screen with an intuitive user interface. The paper describes DBMap, the proposed data visualization framework for large biomedical databases. Built upon XML, Java, and JDBC technologies, the prototype system includes a set of software modules that reside in the application-server tier and interface with the back-end database tier and the front-end Web tier of the brain registry.
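The TreeMap layout the authors adapt can be illustrated with the classic slice-and-dice variant: each node's rectangle is divided among its children in proportion to their weights, alternating split direction per hierarchy level. A minimal sketch, with hypothetical data and function names (this is not DBMap code):

```python
# Minimal slice-and-dice treemap layout (hypothetical sketch, not DBMap code).
# A tree is either a number (leaf weight) or a list of subtrees; each node's
# rectangle is split among its children in proportion to their total weights,
# alternating the split direction at each level of the hierarchy.

def weight_of(tree):
    return tree if isinstance(tree, (int, float)) else sum(weight_of(t) for t in tree)

def slice_and_dice(tree, x, y, w, h, vertical=True):
    """Return a list of (leaf_weight, (x, y, w, h)) rectangles."""
    if isinstance(tree, (int, float)):
        return [(tree, (x, y, w, h))]
    total = sum(weight_of(t) for t in tree)
    rects, offset = [], 0.0
    for child in tree:
        frac = weight_of(child) / total
        if vertical:   # slice along the x axis
            rects += slice_and_dice(child, x + offset, y, w * frac, h, False)
            offset += w * frac
        else:          # slice along the y axis
            rects += slice_and_dice(child, x, y + offset, w, h * frac, True)
            offset += h * frac
    return rects

# Two hypothetical disease modules with record counts 6+2 and 4:
# leaf areas come out proportional to the counts.
layout = slice_and_dice([[6, 2], [4]], 0, 0, 120, 100)
```

Size can then encode, say, record counts per module, with color mapped to a second field of interest, which is the essence of the TreeMap representation described above.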

  20. Evaluating the Large-Scale Environment of Extreme Events Using Reanalyses

    NASA Astrophysics Data System (ADS)

    Bosilovich, M. G.; Schubert, S. D.; Koster, R. D.; da Silva, A. M., Jr.; Eichmann, A.

    2014-12-01

    Extreme conditions and events have long been a concern in weather forecasting and national security. While some evidence indicates extreme weather will increase in global-change scenarios, extremes are often related to the large-scale atmospheric circulation, yet occur infrequently. Reanalyses assimilate substantial amounts of weather data, and a primary strength of reanalysis data is their representation of the large-scale atmospheric environment. In this effort, we link the occurrences of extreme events or climate indicators to the underlying regional and global weather patterns. With more than 30 years of data, reanalyses can include multiple cases of extreme events, and thereby identify commonality among the weather patterns to better characterize the large-scale to global environment linked to the indicator or extreme event. Since these features are certainly regionally dependent, and the indicators of climate are continually being developed, we outline various methods to analyze the reanalysis data and the development of tools to support regional evaluation of the data. Here, we provide examples of both individual case studies and composite studies of similar events. For example, we compare the large-scale environment for Northeastern US extreme precipitation with that of the highest mean-precipitation seasons. Likewise, southerly winds can be shown to be a major contributor to very warm winter days in the Northeast. While most of our development has involved NASA's MERRA reanalysis, we are also looking forward to MERRA-2, which includes several new features that greatly improve the representation of weather and climate, especially for the regions and sectors involved in the National Climate Assessment.
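The composite approach described, averaging the large-scale fields over the dates on which an indicator flags an extreme event, can be sketched with NumPy. The array shapes, random stand-in data, and 95th-percentile threshold below are illustrative assumptions, not the MERRA analysis itself:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in for a reanalysis field: 400 daily maps on a 10 x 12 grid,
# plus a daily regional series used as the extreme-event indicator.
field = rng.normal(size=(400, 10, 12))   # e.g. 500 hPa height anomalies
indicator = rng.normal(size=400)         # e.g. Northeastern US daily precipitation

# Flag "extreme" days as those above the 95th percentile of the indicator.
extreme_days = indicator > np.percentile(indicator, 95)

# Composite: mean field over extreme days minus the all-day climatology,
# i.e. the large-scale anomaly pattern associated with the indicator.
composite = field[extreme_days].mean(axis=0) - field.mean(axis=0)

print(composite.shape, int(extreme_days.sum()))
```

Averaging over many such cases is what lets the common large-scale environment emerge from the day-to-day weather noise.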
