Sample records for analyze large amounts

  1. Impact of the BALLOTS Shared Cataloging System on the Amount of Change in the Library Technical Processing Department.

    ERIC Educational Resources Information Center

    Kershner, Lois M.

    The amount of change resulting from the implementation of the Bibliographic Automation of Large Library Operations using a Time-sharing System (BALLOTS) is analyzed, in terms of (1) physical room arrangement, (2) work procedure, and (3) organizational structure. Also considered is the factor of amount of time the new system has been in use.…

  2. Human-Machine Cooperation in Large-Scale Multimedia Retrieval: A Survey

    ERIC Educational Resources Information Center

    Shirahama, Kimiaki; Grzegorzek, Marcin; Indurkhya, Bipin

    2015-01-01

    "Large-Scale Multimedia Retrieval" (LSMR) is the task of quickly analyzing a large amount of multimedia data, such as images or videos, and accurately finding the items relevant to a certain semantic meaning. Although LSMR has been investigated for more than two decades in the fields of multimedia processing and computer vision, a more…

  3. Analysis of volatile organic compounds. [trace amounts of organic volatiles in gas samples

    NASA Technical Reports Server (NTRS)

    Zlatkis, A. (Inventor)

    1977-01-01

    An apparatus and method are described for reproducibly analyzing trace amounts of a large number of organic volatiles existing in a gas sample. Direct injection of the trapped volatiles into a cryogenic precolumn provides a sharply defined plug. Applications of the method include: (1) analyzing the headspace gas of body fluids and comparing a profile of the organic volatiles with standard profiles for the detection and monitoring of disease; (2) analyzing the headspace gas of foods and beverages and comparing the profile with standard profiles to monitor and control flavor and aroma; and (3) analyses for determining the organic pollutants in air or water samples.

  4. Recent progress and market analysis of anticoagulant drugs

    PubMed Central

    Fan, Ping; Gao, Yangyang; Zheng, Minglin; Xu, Ting; Schoenhagen, Paul

    2018-01-01

    This review describes epidemiology of thromboembolic disease in China and abroad, evaluates trends in the development of anticoagulant drugs, and analyzes the market situation based on large amounts of accumulated data. Specifically, we describe advances in clinical application of anticoagulants and analyze the most commonly used anticoagulants in the market systematically.

  5. Earbuds: A Method for Analyzing Nasality in the Field

    ERIC Educational Resources Information Center

    Stewart, Jesse; Kohlberger, Martin

    2017-01-01

    Existing methods for collecting and analyzing nasality data are problematic for linguistic fieldworkers: aerodynamic equipment can be expensive and difficult to transport, and acoustic analyses require large amounts of optimally-recorded data. In this paper, a highly mobile and low-cost method is proposed. By connecting low impedance earbuds into…

  6. Really big data: Processing and analysis of large datasets

    USDA-ARS?s Scientific Manuscript database

    Modern animal breeding datasets are large and getting larger, due in part to the recent availability of DNA data for many animals. Computational methods for efficiently storing and analyzing those data are under development. The amount of storage space required for such datasets is increasing rapidl...

  7. Analyzing large scale genomic data on the cloud with Sparkhit

    PubMed Central

    Huang, Liren; Krüger, Jan

    2018-01-01

    Motivation: The increasing amount of next-generation sequencing data poses a fundamental challenge on large scale genomic analytics. Existing tools use different distributed computational platforms to scale-out bioinformatics workloads. However, the scalability of these tools is not efficient. Moreover, they have heavy run time overheads when pre-processing large amounts of data. To address these limitations, we have developed Sparkhit: a distributed bioinformatics framework built on top of the Apache Spark platform. Results: Sparkhit integrates a variety of analytical methods. It is implemented in the Spark extended MapReduce model. It runs 92–157 times faster than MetaSpark on metagenomic fragment recruitment and 18–32 times faster than Crossbow on data pre-processing. We analyzed 100 terabytes of data across four genomic projects in the cloud in 21 h, which includes the run times of cluster deployment and data downloading. Furthermore, our application on the entire Human Microbiome Project shotgun sequencing data was completed in 2 h, presenting an approach to easily associate large amounts of public datasets with reference data. Availability and implementation: Sparkhit is freely available at: https://rhinempi.github.io/sparkhit/. Contact: asczyrba@cebitec.uni-bielefeld.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:29253074
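
    Sparkhit itself is a JVM framework built on Apache Spark, so the sketch below only illustrates the general scale-out pattern the abstract describes, mapping a simple read-filtering step over many sequence fragments, using PySpark. The input path, one-read-per-line layout, and GC-content filter are assumptions made for illustration, not Sparkhit's actual interface.

```python
# Minimal PySpark sketch of scaling a read-filtering step over many
# sequence fragments; illustrative only, not Sparkhit's own API.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("read-filter-sketch").getOrCreate()
sc = spark.sparkContext

# Hypothetical input: one read (sequence only) per line.
reads = sc.textFile("s3://example-bucket/reads.txt")  # assumed path

def gc_fraction(seq):
    """Fraction of G/C bases in a read."""
    if not seq:
        return 0.0
    return sum(base in "GCgc" for base in seq) / len(seq)

# Keep reads in a plausible GC range and count them in parallel.
kept = reads.filter(lambda seq: 0.3 <= gc_fraction(seq) <= 0.7)
print("reads kept:", kept.count())

spark.stop()
```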

  8. Applications of Data Mining Methods in the Integrative Medical Studies of Coronary Heart Disease: Progress and Prospect

    PubMed Central

    Wang, Yixin; Guo, Fang

    2014-01-01

    A large number of studies show that real-world studies have stronger external validity than traditional randomized controlled trials and can evaluate the effect of interventions in a real clinical setting, which opens up a new path for research on integrative medicine in coronary heart disease. However, clinical data on integrative medicine in coronary heart disease are large in amount and complex in data types, making the search for an appropriate methodology a hot topic. Data mining techniques analyze and extract useful information and knowledge from mass data to guide practice. The present review provides insights into the main features of data mining and its applications in integrative medical studies of coronary heart disease, aiming to analyze the progress and prospects in this field. PMID:25544853

  9. Pattern Recognition for a Flight Dynamics Monte Carlo Simulation

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; Hurtado, John E.

    2011-01-01

    The design, analysis, and verification and validation of a spacecraft relies heavily on Monte Carlo simulations. Modern computational techniques are able to generate large amounts of Monte Carlo data but flight dynamics engineers lack the time and resources to analyze it all. The growing amounts of data combined with the diminished available time of engineers motivates the need to automate the analysis process. Pattern recognition algorithms are an innovative way of analyzing flight dynamics data efficiently. They can search large data sets for specific patterns and highlight critical variables so analysts can focus their analysis efforts. This work combines a few tractable pattern recognition algorithms with basic flight dynamics concepts to build a practical analysis tool for Monte Carlo simulations. Current results show that this tool can quickly and automatically identify individual design parameters, and most importantly, specific combinations of parameters that should be avoided in order to prevent specific system failures. The current version uses a kernel density estimation algorithm and a sequential feature selection algorithm combined with a k-nearest neighbor classifier to find and rank important design parameters. This provides an increased level of confidence in the analysis and saves a significant amount of time.
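
    The abstract names three generic building blocks: kernel density estimation, sequential feature selection, and a k-nearest-neighbor classifier. The sketch below wires those same blocks together with scikit-learn on synthetic data to rank dispersed parameters that separate failed from passed runs; it illustrates the general technique only, and the dataset, failure rule, and parameter counts are invented, not the NASA tool's.

```python
# Sketch: rank dispersion parameters that separate failed from passed
# Monte Carlo runs, using the generic building blocks named in the
# abstract (KDE, sequential feature selection, k-NN). Not the NASA tool.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier, KernelDensity
from sklearn.feature_selection import SequentialFeatureSelector

rng = np.random.default_rng(0)
n_runs, n_params = 2000, 6
X = rng.normal(size=(n_runs, n_params))                 # dispersed parameters
y = ((X[:, 0] > 1.0) & (X[:, 3] < -0.5)).astype(int)    # synthetic "failure" flag

# Rank parameters by how well k-NN predicts failures from subsets of them.
knn = KNeighborsClassifier(n_neighbors=15)
selector = SequentialFeatureSelector(knn, n_features_to_select=2).fit(X, y)
print("influential parameters:", np.flatnonzero(selector.get_support()))

# KDE of one influential parameter, conditioned on failure, shows where
# in the dispersion space the failures concentrate.
fail_vals = X[y == 1, 0].reshape(-1, 1)
kde = KernelDensity(bandwidth=0.3).fit(fail_vals)
grid = np.linspace(-3, 3, 7).reshape(-1, 1)
print("log-density of failures along parameter 0:",
      kde.score_samples(grid).round(2))
```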

  10. Analyzing large-scale spiking neural data with HRLAnalysis™

    PubMed Central

    Thibeault, Corey M.; O'Brien, Michael J.; Srinivasa, Narayan

    2014-01-01

    The additional capabilities provided by high-performance neural simulation environments and modern computing hardware has allowed for the modeling of increasingly larger spiking neural networks. This is important for exploring more anatomically detailed networks but the corresponding accumulation in data can make analyzing the results of these simulations difficult. This is further compounded by the fact that many existing analysis packages were not developed with large spiking data sets in mind. Presented here is a software suite developed to not only process the increased amount of spike-train data in a reasonable amount of time, but also provide a user friendly Python interface. We describe the design considerations, implementation and features of the HRLAnalysis™ suite. In addition, performance benchmarks demonstrating the speedup of this design compared to a published Python implementation are also presented. The result is a high-performance analysis toolkit that is not only usable and readily extensible, but also straightforward to interface with existing Python modules. PMID:24634655
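
    HRLAnalysis(TM) itself is not shown in this record, but the kind of bulk spike-train reduction it performs can be sketched with plain NumPy: collapse a large set of (neuron id, spike time) pairs into per-neuron firing rates and a binned population rate. The array sizes and bin width below are illustrative assumptions.

```python
# Sketch of basic spike-train reductions (per-neuron firing rates and a
# population rate histogram) over many (neuron_id, spike_time) pairs.
# Illustrative NumPy only; not the HRLAnalysis(TM) API.
import numpy as np

rng = np.random.default_rng(1)
n_neurons, duration_s = 5000, 10.0
n_spikes = 2_000_000
neuron_ids = rng.integers(0, n_neurons, n_spikes)
spike_times = rng.uniform(0.0, duration_s, n_spikes)

# Mean firing rate per neuron (spikes / second).
counts = np.bincount(neuron_ids, minlength=n_neurons)
rates_hz = counts / duration_s
print("mean rate: %.1f Hz, max rate: %.1f Hz" % (rates_hz.mean(), rates_hz.max()))

# Population activity in 10 ms bins.
bin_width = 0.010
bins = np.arange(0.0, duration_s + bin_width, bin_width)
pop_hist, _ = np.histogram(spike_times, bins=bins)
print("busiest 10 ms bin had", pop_hist.max(), "spikes")
```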

  11. Amount of newspaper coverage of high school athletics for boys and girls on sports page and newspaper circulation.

    PubMed

    Pedersen, Paul M; Whisenant, Warren A

    2002-02-01

    This study analyzed the amount of coverage given to high school athletics in 43 newspapers. Newspapers with small circulation, which devoted 40% of their interscholastic athletics coverage to girls' athletics, printed significantly more articles about girls' athletics than did newspapers with medium (33%) or large (32%) circulation. Therefore, the smaller the newspaper circulation, the more equitable the coverage of athletics for girls and boys. This finding was consistent with some prior work but not all.

  12. Investigation of estimators of probability density functions

    NASA Technical Reports Server (NTRS)

    Speed, F. M.

    1972-01-01

    Four research projects are summarized which include: (1) the generation of random numbers on the IBM 360/44, (2) statistical tests used to check out random number generators, (3) Specht density estimators, and (4) use of estimators of probability density functions in analyzing large amounts of data.

  13. Feasibility of bridge structural health monitoring using short term data acquisition system.

    DOT National Transportation Integrated Search

    2015-01-01

    Long-term testing of bridges can be expensive and result in a large amount of data that is difficult to manage and analyze. The purpose of this study was to investigate the feasibility of a short-term data acquisition system that used a minimal numb...

  14. Time Series ARIMA Models of Undergraduate Grade Point Average.

    ERIC Educational Resources Information Center

    Rogers, Bruce G.

    The Auto-Regressive Integrated Moving Average (ARIMA) Models, often referred to as Box-Jenkins models, are regression methods for analyzing sequential dependent observations with large amounts of data. The Box-Jenkins approach, a three-stage procedure consisting of identification, estimation and diagnosis, was used to select the most appropriate…

  15. Gigwa-Genotype investigator for genome-wide analyses.

    PubMed

    Sempéré, Guilhem; Philippe, Florian; Dereeper, Alexis; Ruiz, Manuel; Sarah, Gautier; Larmande, Pierre

    2016-06-06

    Exploring the structure of genomes and analyzing their evolution is essential to understanding the ecological adaptation of organisms. However, with the large amounts of data being produced by next-generation sequencing, computational challenges arise in terms of storage, search, sharing, analysis and visualization. This is particularly true with regards to studies of genomic variation, which are currently lacking scalable and user-friendly data exploration solutions. Here we present Gigwa, a web-based tool that provides an easy and intuitive way to explore large amounts of genotyping data by filtering it not only on the basis of variant features, including functional annotations, but also on genotype patterns. The data storage relies on MongoDB, which offers good scalability properties. Gigwa can handle multiple databases and may be deployed in either single- or multi-user mode. In addition, it provides a wide range of popular export formats. The Gigwa application is suitable for managing large amounts of genomic variation data. Its user-friendly web interface makes such processing widely accessible. It can either be simply deployed on a workstation or be used to provide a shared data portal for a given community of researchers.
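
    Gigwa's web interface is not reproduced here, but the underlying idea of filtering variant documents in MongoDB on both variant features and genotype patterns can be sketched with a pymongo query. The collection name, document fields, and annotation values below are assumptions chosen for illustration, not Gigwa's actual schema.

```python
# Sketch: filter variants in MongoDB on both a variant annotation and a
# genotype pattern. Collection and field names are hypothetical, not
# Gigwa's real schema.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumed local instance
variants = client["genotyping_demo"]["variants"]

query = {
    "annotation.effect": "missense_variant",        # variant-feature filter
    "chrom": "1",
    "pos": {"$gte": 100_000, "$lte": 500_000},
    # genotype-pattern filter: at least one listed sample is homozygous alt
    "genotypes": {"$elemMatch": {"sample": {"$in": ["S1", "S2"]}, "gt": "1/1"}},
}

for doc in variants.find(query).limit(10):
    print(doc["chrom"], doc["pos"], doc["annotation"]["effect"])
```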

  16. An interactive web-based system using cloud for large-scale visual analytics

    NASA Astrophysics Data System (ADS)

    Kaseb, Ahmed S.; Berry, Everett; Rozolis, Erik; McNulty, Kyle; Bontrager, Seth; Koh, Youngsol; Lu, Yung-Hsiang; Delp, Edward J.

    2015-03-01

    The number of network cameras has grown rapidly in recent years. Thousands of public network cameras provide a tremendous amount of visual information about the environment. There is a need to analyze this valuable information for a better understanding of the world around us. This paper presents an interactive web-based system that enables users to execute image analysis and computer vision techniques on a large scale to analyze the data from more than 65,000 worldwide cameras. This paper focuses on how to use both the system's website and Application Programming Interface (API). Given a computer program that analyzes a single frame, the user needs to make only slight changes to the existing program and choose the cameras to analyze. The system handles the heterogeneity of the geographically distributed cameras, e.g. different brands and resolutions. The system allocates and manages Amazon EC2 and Windows Azure cloud resources to meet the analysis requirements.

  17. Automated information-analytical system for thunderstorm monitoring and early warning alarms using modern physical sensors and information technologies with elements of artificial intelligence

    NASA Astrophysics Data System (ADS)

    Boldyreff, Anton S.; Bespalov, Dmitry A.; Adzhiev, Anatoly Kh.

    2017-05-01

    Methods of artificial intelligence are a good solution for forecasting weather phenomena because they allow large amounts of diverse data to be processed. In this paper, Recirculation Neural Networks are implemented for a thunderstorm event prediction system. Large amounts of experimental data from lightning sensors and electric field mill networks are received and analyzed, and the average recognition accuracy for sensor signals is calculated. It is shown that Recirculation Neural Networks are a promising solution for forecasting thunderstorms and related weather phenomena: they recognize elements of the sensor signals with high efficiency and can compress images and highlight their characteristic features for subsequent recognition.

  18. Fourier transform infrared microspectroscopy for the analysis of the biochemical composition of C. elegans worms.

    PubMed

    Sheng, Ming; Gorzsás, András; Tuck, Simon

    2016-01-01

    Changes in intermediary metabolism have profound effects on many aspects of C. elegans biology including growth, development and behavior. However, many traditional biochemical techniques for analyzing chemical composition require relatively large amounts of starting material precluding the analysis of mutants that cannot be grown in large amounts as homozygotes. Here we describe a technique for detecting changes in the chemical compositions of C. elegans worms by Fourier transform infrared microspectroscopy. We demonstrate that the technique can be used to detect changes in the relative levels of carbohydrates, proteins and lipids in one and the same worm. We suggest that Fourier transform infrared microspectroscopy represents a useful addition to the arsenal of techniques for metabolic studies of C. elegans worms.

  19. Large-scale retrieval for medical image analytics: A comprehensive review.

    PubMed

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics was greatly facilitated by the explosion of digital imaging techniques, where huge amounts of medical images were produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable to tackle the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval, summarize the challenges/opportunities of medical image analytics on a large-scale. Then, we provide a comprehensive review of algorithms and techniques relevant to major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. An indoor positioning technology in the BLE mobile payment system

    NASA Astrophysics Data System (ADS)

    Han, Tiantian; Ding, Lei

    2017-05-01

    In a mobile payment system for large supermarkets, the core function of completing payments is implemented with BLE low-power Bluetooth technology, and an indoor positioning technology can provide value-added services. The technology collects Bluetooth RSSI values and builds a fingerprint database of the corresponding sampling points. The RSSI of a Bluetooth module is then obtained via the AP, and the k-Nearest Neighbor algorithm matches it against the values in the fingerprint database. This helps businesses locate customers within the mall and, combined with the settlement amount of the customer's purchases, analyze customer behavior. When the system collects signal strength, the distribution of the RSSI at the sampling points is analyzed and the values are filtered. A system deployed in the laboratory demonstrates the feasibility of the approach.
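
    The positioning step described above, matching a live RSSI reading against a fingerprint database of sampling points with k-Nearest Neighbor, can be sketched in a few lines. The beacon count, fingerprint values, and simple averaging of the k matched points are illustrative assumptions, not the paper's exact configuration.

```python
# Sketch of RSSI fingerprint positioning with k-nearest neighbors:
# average the coordinates of the k reference points whose stored RSSI
# vectors are closest to the live reading. All values are made up.
import numpy as np

# Fingerprint database: RSSI (dBm) from 4 beacons at 5 sampling points.
fingerprints = np.array([
    [-60, -75, -80, -90],
    [-65, -70, -78, -88],
    [-72, -66, -74, -85],
    [-80, -62, -70, -82],
    [-85, -60, -66, -78],
], dtype=float)
coords = np.array([[0, 0], [2, 0], [4, 1], [6, 2], [8, 3]], dtype=float)  # metres

def locate(rssi_reading, k=3):
    """Estimate (x, y) as the mean of the k closest fingerprint points."""
    dists = np.linalg.norm(fingerprints - rssi_reading, axis=1)
    nearest = np.argsort(dists)[:k]
    return coords[nearest].mean(axis=0)

print("estimated position:", locate(np.array([-70.0, -68.0, -75.0, -86.0])))
```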

  1. Chemical Characterization of an Envelope B/D Sample from Hanford Tank 241-AZ-102

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hay, M.S.

    2000-08-23

    A sample from Hanford waste tank 241-AZ-102 was received at the Savannah River Technology Center (SRTC) and chemically characterized. The sample, containing supernate and a small amount of sludge solids, was analyzed as-received. The filtered supernatant liquid, the total dried solids of the sample, and the washed insoluble solids obtained from filtration of the sample were analyzed. A mass balance calculation of the three fractions of the sample analyzed indicates that the analytical results are relatively self-consistent for major components of the sample. However, some inconsistency was observed between results where more than one method of determination was employed and for species present in low concentrations. The actinide isotopes plutonium, americium, and curium present analytical challenges due to the low concentrations of these species and the potential for introduction of small amounts of contamination during sample handling, resulting in large uncertainties. A direct comparison to previous analyses of material from tank 241-AZ-102 showed good agreement with the filtered supernatant liquid. However, the comparison of solids data showed poor agreement. The poor agreement between the current results for the solids samples and previous analyses most likely results from the uncertainties associated with obtaining small solids samples from a large non-homogenized waste tank.

  2. Morphological and genetic analysis of four color morphs of bean leaf beetle, Cerotoma trifurcata (Coleoptera: Chrysomelidae)

    USDA-ARS?s Scientific Manuscript database

    Bean leaf beetle (BLB) exhibits a relatively large amount of morphological variation in terms of color but little is known about the underlying genetic structure and gene flow. Genetic variation among four color phenotypes of the BLB was analyzed using amplified fragment length polymorphisms (AFLP) ...

  3. Helping Young Children Understand Graphs: A Demonstration Study.

    ERIC Educational Resources Information Center

    Freeland, Kent; Madden, Wendy

    1990-01-01

    Outlines a demonstration lesson showing third graders how to make and interpret graphs. Includes descriptions of purpose, vocabulary, and learning activities in which students graph numbers of students with dogs at home and analyze the contents of M&M candy packages by color. Argues process helps students understand large amounts of abstract…

  4. Applications of KHZ-CW Lidar in Ecological Entomology

    NASA Astrophysics Data System (ADS)

    Malmqvist, Elin; Brydegaard, Mikkel

    2016-06-01

    The benefits of kHz lidar in ecological entomology are explained. Results from kHz-measurements on insects, carried out with a CW-lidar system, employing the Scheimpflug principle to obtain range resolution, are presented. A method to extract insect events and analyze the large amount of lidar data is also described.

  5. Data Compression Algorithm Architecture for Large Depth-of-Field Particle Image Velocimeters

    NASA Technical Reports Server (NTRS)

    Bos, Brent; Memarsadeghi, Nargess; Kizhner, Semion; Antonille, Scott

    2013-01-01

    A large depth-of-field particle image velocimeter (PIV) is designed to characterize dynamic dust environments on planetary surfaces. This instrument detects lofted dust particles, and senses the number of particles per unit volume, measuring their sizes, velocities (both speed and direction), and shape factors when the particles are large. To measure these particle characteristics in-flight, the instrument gathers two-dimensional image data at a high frame rate, typically >4,000 Hz, generating large amounts of data for every second of operation, approximately 6 GB/s. To characterize a planetary dust environment that is dynamic, the instrument would have to operate for at least several minutes during an observation period, easily producing more than a terabyte of data per observation. Given current technology, this amount of data would be very difficult to store onboard a spacecraft, and downlink to Earth. Since 2007, innovators have been developing an autonomous image analysis algorithm architecture for the PIV instrument to greatly reduce the amount of data that it has to store and downlink. The algorithm analyzes PIV images and automatically reduces the image information down to only the particle measurement data that is of interest, reducing the amount of data that is handled by more than 10(exp 3). The state of development for this innovation is now fairly mature, with a functional algorithm architecture, along with several key pieces of algorithm logic, that has been proven through field test data acquired with a proof-of-concept PIV instrument.
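
    The data-reduction idea in this record, discarding raw frames and keeping only per-particle measurements, can be sketched with SciPy image labeling. The synthetic frame and intensity threshold below are assumptions for illustration; the flight algorithm architecture itself is not described in enough detail here to reproduce.

```python
# Sketch: reduce a raw particle image to a short list of per-particle
# measurements (centroid, size), the kind of onboard reduction the
# abstract describes. The frame and threshold are synthetic examples.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
frame = rng.normal(10.0, 2.0, size=(256, 256))        # background noise
for cy, cx in [(40, 60), (128, 130), (200, 50)]:       # three bright "particles"
    frame[cy - 2:cy + 3, cx - 2:cx + 3] += 50.0

mask = frame > 30.0                                     # assumed intensity threshold
labels, n_particles = ndimage.label(mask)
centroids = ndimage.center_of_mass(frame, labels, range(1, n_particles + 1))
sizes = ndimage.sum(mask, labels, range(1, n_particles + 1))

for (cy, cx), size in zip(centroids, sizes):
    print(f"particle at ({cy:.1f}, {cx:.1f}), area {int(size)} px")
```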

  6. The origin, type and hydrocarbon generation potential of organic matter in a marine-continental transitional facies shale succession (Qaidam Basin, China).

    PubMed

    Wang, Guo-Cang; Sun, Min-Zhuo; Gao, Shu-Fang; Tang, Li

    2018-04-26

    This organic-rich shale was analyzed to determine the type, origin, maturity and depositional environment of the organic matter and to evaluate the hydrocarbon generation potential of the shale. This study is based on geochemical (total carbon content, Rock-Eval pyrolysis and the molecular composition of hydrocarbons) and whole-rock petrographic (maceral composition) analyses. The petrographic analyses show that the shale penetrated by the Chaiye 2 well contains large amounts of vitrinite and sapropelinite and that the organic matter within these rocks is type III and highly mature. The geochemical analyses show that these rocks are characterized by high total organic carbon contents and that the organic matter is derived from a mix of terrestrial and marine sources and highly mature. These geochemical characteristics are consistent with the results of the petrographic analyses. The large amounts of organic matter in the Carboniferous shale succession penetrated by the Chaiye 2 well may be due to good preservation under hypersaline lacustrine and anoxic marine conditions. Consequently, the studied shale possesses very good hydrocarbon generation potential because of the presence of large amounts of highly mature type III organic matter.

  7. Analysis of effluent after anaerobic digestion of liquid phase separated from liquidized garbage.

    PubMed

    Inoue, Seiichi; Tsukahara, Kenichiro; Sawayama, Shigeki

    2002-01-01

    The organic compositions of the liquid phase separated from liquidized garbage as the influent and its effluent after anaerobic digestion at an overloading rate were analyzed. A large amount of organic acids was found in the effluent. The accumulation of organic acids suggests that the rate of methanogenesis is lower than that of acidogenesis.

  8. Computer Literacy for Life Sciences: Helping the Digital-Era Biology Undergraduates Face Today's Research

    ERIC Educational Resources Information Center

    Smolinski, Tomasz G.

    2010-01-01

    Computer literacy plays a critical role in today's life sciences research. Without the ability to use computers to efficiently manipulate and analyze large amounts of data resulting from biological experiments and simulations, many of the pressing questions in the life sciences could not be answered. Today's undergraduates, despite the ubiquity of…

  9. Retrospective Mining of Toxicology Data to Discover Multispecies and Chemical Class Effects: Anemia as a Case Study

    EPA Science Inventory

    Predictive toxicity models (in vitro to in vivo, QSAR, read-across) rely on large amounts of accurate in vivo data. Here, we analyze the quality of in vivo data from the Toxicity Reference Database (ToxRefDB), using chemical-induced anemia as an example. Considerations include v...

  10. A Comparison Study of Multivariate Fixed Models and Gene Association with Multiple Traits (GAMuT) for Next-Generation Sequencing

    PubMed Central

    Chiu, Chi-yang; Jung, Jeesun; Wang, Yifan; Weeks, Daniel E.; Wilson, Alexander F.; Bailey-Wilson, Joan E.; Amos, Christopher I.; Mills, James L.; Boehnke, Michael; Xiong, Momiao; Fan, Ruzong

    2016-01-01

    In this paper, extensive simulations are performed to compare two statistical methods to analyze multiple correlated quantitative phenotypes: (1) approximate F-distributed tests of multivariate functional linear models (MFLM) and additive models of multivariate analysis of variance (MANOVA), and (2) Gene Association with Multiple Traits (GAMuT) for association testing of high-dimensional genotype data. It is shown that approximate F-distributed tests of MFLM and MANOVA have higher power and are more appropriate for major gene association analysis (i.e., scenarios in which some genetic variants have relatively large effects on the phenotypes); GAMuT has higher power and is more appropriate for analyzing polygenic effects (i.e., effects from a large number of genetic variants each of which contributes a small amount to the phenotypes). MFLM and MANOVA are very flexible and can be used to perform association analysis for: (i) rare variants, (ii) common variants, and (iii) a combination of rare and common variants. Although GAMuT was designed to analyze rare variants, it can be applied to analyze a combination of rare and common variants and it performs well when (1) the number of genetic variants is large and (2) each variant contributes a small amount to the phenotypes (i.e., polygenes). MFLM and MANOVA are fixed effect models which perform well for major gene association analysis. GAMuT can be viewed as an extension of sequence kernel association tests (SKAT). Both GAMuT and SKAT are more appropriate for analyzing polygenic effects and they perform well not only in the rare variant case, but also in the case of a combination of rare and common variants. Data analyses of European cohorts and the Trinity Students Study are presented to compare the performance of the two methods. PMID:27917525

  11. Evaluating the Cassandra NoSQL Database Approach for Genomic Data Persistency.

    PubMed

    Aniceto, Rodrigo; Xavier, Rene; Guimarães, Valeria; Hondo, Fernanda; Holanda, Maristela; Walter, Maria Emilia; Lifschitz, Sérgio

    2015-01-01

    Rapid advances in high-throughput sequencing techniques have created interesting computational challenges in bioinformatics. One of them refers to management of massive amounts of data generated by automatic sequencers. We need to deal with the persistency of genomic data, particularly storing and analyzing these large-scale processed data. To find an alternative to the frequently considered relational database model becomes a compelling task. Other data models may be more effective when dealing with a very large amount of nonconventional data, especially for writing and retrieving operations. In this paper, we discuss the Cassandra NoSQL database approach for storing genomic data. We perform an analysis of persistency and I/O operations with real data, using the Cassandra database system. We also compare the results obtained with a classical relational database system and another NoSQL database approach, MongoDB.
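
    As a rough illustration of the write/read pattern being benchmarked, the sketch below stores and retrieves variant rows with the Cassandra Python driver. The keyspace, table layout, and replication settings are assumptions chosen for the example, not the schema used in the study.

```python
# Sketch of writing and reading genomic variant rows with the Cassandra
# Python driver. Keyspace and table names are hypothetical examples.
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])          # assumed local node
session = cluster.connect()

session.execute(
    "CREATE KEYSPACE IF NOT EXISTS genomics "
    "WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}"
)
session.execute(
    "CREATE TABLE IF NOT EXISTS genomics.variants "
    "(sample_id text, chrom text, pos int, ref text, alt text, "
    "PRIMARY KEY ((sample_id, chrom), pos))"
)

# Write one variant row, then read back that sample's chromosome 1 variants.
insert = session.prepare(
    "INSERT INTO genomics.variants (sample_id, chrom, pos, ref, alt) "
    "VALUES (?, ?, ?, ?, ?)"
)
session.execute(insert, ("S1", "1", 123456, "A", "G"))

rows = session.execute(
    "SELECT pos, ref, alt FROM genomics.variants "
    "WHERE sample_id = 'S1' AND chrom = '1'"
)
for row in rows:
    print(row.pos, row.ref, row.alt)

cluster.shutdown()
```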

  12. Estimation and change tendency of rape straw resource in Leshan

    NASA Astrophysics Data System (ADS)

    Guan, Qinlan; Gong, Mingfu

    2018-04-01

    Rape straw in the Leshan area consists of the rape stalks, including stems, leaves and pods, left after the rapeseed is removed. Leshan is one of the main rape planting areas in Sichuan Province, its rape planting area is large, and a large amount of rape straw is produced each year. Based on an analysis of the trends in rapeseed planting area and rapeseed yield from 2008 to 2014, the change in rape straw resources in Leshan over that period was analyzed to provide a decision-making reference for the resource utilization of rape straw. The results showed that the amount of rape straw resources in Leshan was very large, more than 100,000 tons per year, and increasing year by year; by 2014 it was close to 200,000 tons.

  13. Environmental status of livestock and poultry sectors in China under current transformation stage.

    PubMed

    Qian, Yi; Song, Kaihui; Hu, Tao; Ying, Tianyu

    2018-05-01

    Intensive animal husbandry has aroused great environmental concern in many developed countries, and some developing countries are still experiencing environmental pollution from the livestock and poultry sectors. Driven by large demand, China has experienced a remarkable increase in dairy and meat production, especially in the transformation stage from conventional household breeding to large-scale industrial breeding. At the same time, a large amount of manure from the livestock and poultry sector is released into waterbodies and soil, causing eutrophication and soil degradation. This problem is reinforced in large-scale cultivation when the amount of manure exceeds the soil nutrient capacity and is not treated or utilized properly. Our research aims to analyze whether the transformation of raising scale is beneficial to the environment and to present the latest status of the livestock and poultry sectors in China. Estimating the pollutants generated and discharged by the livestock and poultry sector in China will facilitate legislation on manure management. This paper analyzes the pollutants generated from the manure of the five principal commercial animals under different farming practices. The results show that fattening pigs contribute almost half of the pollutants released from manure. Moreover, beef cattle exert the largest environmental impact per unit of production, about 2-3 times that of pork and 5-20 times that of chicken. Animals raised in large-scale feedlots generate fewer pollutants than those raised in households. The shift towards industrial production of livestock and poultry is easier to manage from an environmental perspective, but adequate large-scale cultivation is encouraged. Regulatory control, manure treatment and financial subsidies for manure treatment and utilization are recommended to achieve ecological agriculture in China. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Using information theory to identify redundancy in common laboratory tests in the intensive care unit.

    PubMed

    Lee, Joon; Maslove, David M

    2015-07-31

    Clinical workflow is infused with large quantities of data, particularly in areas with enhanced monitoring such as the Intensive Care Unit (ICU). Information theory can quantify the expected amounts of total and redundant information contained in a given clinical data type, and as such has the potential to inform clinicians on how to manage the vast volumes of data they are required to analyze in their daily practice. The objective of this proof-of-concept study was to quantify the amounts of redundant information associated with common ICU lab tests. We analyzed the information content of 11 laboratory test results from 29,149 adult ICU admissions in the MIMIC II database. Information theory was applied to quantify the expected amount of redundant information both between lab values from the same ICU day, and between consecutive ICU days. Most lab values showed a decreasing trend over time in the expected amount of novel information they contained. Platelet, blood urea nitrogen (BUN), and creatinine measurements exhibited the most amount of redundant information on days 2 and 3 compared to the previous day. The creatinine-BUN and sodium-chloride pairs had the most redundancy. Information theory can help identify and discourage unnecessary testing and bloodwork, and can in general be a useful data analytic technique for many medical specialties that deal with information overload.
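
    The redundancy measure underlying this study is standard mutual information between two measurement series. The sketch below estimates it with a joint histogram on synthetic creatinine, BUN, and sodium values; the bin count and the synthetic data are illustrative assumptions, not the MIMIC II analysis itself.

```python
# Sketch: histogram-based mutual information between two lab series,
# the kind of redundancy estimate the study applies to ICU lab tests.
# Synthetic data and bin count are illustrative only.
import numpy as np

def mutual_information(x, y, bins=20):
    """I(X;Y) in bits, estimated from a joint histogram of two 1-D series."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nonzero = pxy > 0
    return float(np.sum(pxy[nonzero] * np.log2(pxy[nonzero] / (px @ py)[nonzero])))

rng = np.random.default_rng(3)
creatinine = rng.normal(1.0, 0.3, 5000)
bun = 15.0 * creatinine + rng.normal(0.0, 2.0, 5000)   # correlated, hence redundant
sodium = rng.normal(140.0, 3.0, 5000)                  # unrelated control

print("I(creatinine; BUN)    = %.2f bits" % mutual_information(creatinine, bun))
print("I(creatinine; sodium) = %.2f bits" % mutual_information(creatinine, sodium))
```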

  15. Tracing the trajectory of skill learning with a very large sample of online game players.

    PubMed

    Stafford, Tom; Dewar, Michael

    2014-02-01

    In the present study, we analyzed data from a very large sample (N = 854,064) of players of an online game involving rapid perception, decision making, and motor responding. Use of game data allowed us to connect, for the first time, rich details of training history with measures of performance from participants engaged for a sustained amount of time in effortful practice. We showed that lawful relations exist between practice amount and subsequent performance, and between practice spacing and subsequent performance. Our methodology allowed an in situ confirmation of results long established in the experimental literature on skill acquisition. Additionally, we showed that greater initial variation in performance is linked to higher subsequent performance, a result we link to the exploration/exploitation trade-off from the computational framework of reinforcement learning. We discuss the benefits and opportunities of behavioral data sets with very large sample sizes and suggest that this approach could be particularly fecund for studies of skill acquisition.

  16. Water demand-supply analysis in a large spatial area based on the processes of evapotranspiration and runoff

    PubMed Central

    Maruyama, Toshisuke

    2007-01-01

    To estimate the amount of evapotranspiration in a river basin, the “short period water balance method” was formulated. Then, by introducing the “complementary relationship method,” the amount of evapotranspiration was estimated seasonally, and with reasonable accuracy, for both small and large areas. Moreover, to accurately estimate river discharge in the low water season, the “weighted statistical unit hydrograph method” was proposed and a procedure for the calculation of the unit hydrograph was developed. Also, a new model, based on the “equivalent roughness method,” was successfully developed for the estimation of flood runoff from newly reclaimed farmlands. Based on the results of this research, a “composite reservoir model” was formulated to analyze the repeated use of irrigation water in large spatial areas. The application of this model to a number of watershed areas provided useful information with regard to the realities of water demand-supply systems in watersheds predominately dedicated to paddy fields, in Japan. PMID:24367144

  17. Institute for Brain and Neural Systems

    DTIC Science & Technology

    2009-10-06

    to deal with computational complexity when analyzing large amounts of information in visual scenes. It seems natural that in addition to exploring...algorithms using methods from statistical pattern recognition and machine learning. Over the last fifteen years, significant advances had been made in...recognition, robustness to noise and ability to cope with significant variations in lighting conditions. Identifying an occluded target adds another layer of

  18. Comparison of biodegradation of low-weight hydroentangled raw cotton nonwoven fabric and that of commonly used disposable nonwoven fabrics in the aerobic Captina silt loam soil

    USDA-ARS?s Scientific Manuscript database

    The increasing use of disposable nonwovens made of petroleum-based materials generates a large amount of non-biodegradable, solid waste in the environment. As an effort to enhance the usage of biodegradable cotton in nonwovens, this study analyzed the biodegradability of mechanically pre-cleaned gr...

  19. An Integrated Management Support and Production Control System for Hardwood Forest Products

    Treesearch

    Guillermo A. Mendoza; Roger J. Meimban; William Sprouse; William G. Luppold; Philip A. Araman

    1991-01-01

    Spreadsheet and simulation models are tools which enable users to analyze a large number of variables affecting hardwood material utilization and profit in a systematic fashion. This paper describes two spreadsheet models, SEASaw and SEAIn, and a hardwood sawmill simulator. SEASaw is designed to estimate the amount of conversion from timber to lumber, while SEAIn is a...

  20. Foundations of a query and simulation system for the modeling of biochemical and biological processes.

    PubMed

    Antoniotti, M; Park, F; Policriti, A; Ugel, N; Mishra, B

    2003-01-01

    The analysis of large amounts of data, produced as (numerical) traces of in vivo, in vitro and in silico experiments, has become a central activity for many biologists and biochemists. Recent advances in the mathematical modeling and computation of biochemical systems have moreover increased the prominence of in silico experiments; such experiments typically involve the simulation of sets of Differential Algebraic Equations (DAE), e.g., Generalized Mass Action systems (GMA) and S-systems. In this paper we reason about the necessary theoretical and pragmatic foundations for a query and simulation system capable of analyzing large amounts of such trace data. To this end, we propose to combine in a novel way several well-known tools from numerical analysis (approximation theory), temporal logic and verification, and visualization. The result is a preliminary prototype system: simpathica/xssys. When dealing with simulation data simpathica/xssys exploits the special structure of the underlying DAE, and reduces the search space in an efficient way so as to facilitate any queries about the traces. The proposed system is designed to give the user possibility to systematically analyze and simultaneously query different possible timed evolutions of the modeled system.

  1. Distributed memory parallel Markov random fields using graph partitioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heinemann, C.; Perciano, T.; Ushizima, D.

    Markov random fields (MRF) based algorithms have attracted a large amount of interest in image analysis due to their ability to exploit contextual information about data. Image data generated by experimental facilities, though, continues to grow larger and more complex, making it more difficult to analyze in a reasonable amount of time. Applying image processing algorithms to large datasets requires alternative approaches to circumvent performance problems. Aiming to provide scientists with a new tool to recover valuable information from such datasets, we developed a general purpose distributed memory parallel MRF-based image analysis framework (MPI-PMRF). MPI-PMRF overcomes performance and memory limitations by distributing data and computations across processors. The proposed approach was successfully tested with synthetic and experimental datasets. Additionally, the performance of the MPI-PMRF framework is analyzed through a detailed scalability study. We show that a performance increase is obtained while maintaining an accuracy of the segmentation results higher than 98%. The contributions of this paper are: (a) development of a distributed memory MRF framework; (b) measurement of the performance increase of the proposed approach; (c) verification of segmentation accuracy in both synthetic and experimental, real-world datasets.
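
    MPI-PMRF itself is not reproduced here, but the basic distribution pattern it relies on, scattering image blocks across MPI ranks, processing each block locally, and gathering the results, can be sketched with mpi4py. The per-block threshold below is only a placeholder for the MRF segmentation step.

```python
# Sketch of the scatter/process/gather pattern used to distribute image
# blocks across MPI ranks (run with e.g. `mpiexec -n 4 python sketch.py`).
# The per-block threshold is a placeholder, not the MRF model itself.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:
    image = np.random.default_rng(4).random((size * 256, 256))  # synthetic image
    blocks = np.array_split(image, size, axis=0)                 # one block per rank
else:
    blocks = None

block = comm.scatter(blocks, root=0)        # distribute rows of the image
labels = (block > 0.5).astype(np.uint8)     # placeholder "segmentation"
segments = comm.gather(labels, root=0)      # collect per-rank results

if rank == 0:
    result = np.concatenate(segments, axis=0)
    print("segmented image shape:", result.shape,
          "foreground fraction: %.3f" % result.mean())
```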

  2. Genomics Portals: integrative web-platform for mining genomics data.

    PubMed

    Shinde, Kaustubh; Phatak, Mukta; Johannes, Freudenberg M; Chen, Jing; Li, Qian; Vineet, Joshi K; Hu, Zhen; Ghosh, Krishnendu; Meller, Jaroslaw; Medvedovic, Mario

    2010-01-13

    A large amount of experimental data generated by modern high-throughput technologies is available through various public repositories. Our knowledge about molecular interaction networks, functional biological pathways and transcriptional regulatory modules is rapidly expanding, and is being organized in lists of functionally related genes. Jointly, these two sources of information hold a tremendous potential for gaining new insights into functioning of living systems. Genomics Portals platform integrates access to an extensive knowledge base and a large database of human, mouse, and rat genomics data with basic analytical visualization tools. It provides the context for analyzing and interpreting new experimental data and the tool for effective mining of a large number of publicly available genomics datasets stored in the back-end databases. The uniqueness of this platform lies in the volume and the diversity of genomics data that can be accessed and analyzed (gene expression, ChIP-chip, ChIP-seq, epigenomics, computationally predicted binding sites, etc), and the integration with an extensive knowledge base that can be used in such analysis. The integrated access to primary genomics data, functional knowledge and analytical tools makes Genomics Portals platform a unique tool for interpreting results of new genomics experiments and for mining the vast amount of data stored in the Genomics Portals backend databases. Genomics Portals can be accessed and used freely at http://GenomicsPortals.org.

  3. Genomics Portals: integrative web-platform for mining genomics data

    PubMed Central

    2010-01-01

    Background: A large amount of experimental data generated by modern high-throughput technologies is available through various public repositories. Our knowledge about molecular interaction networks, functional biological pathways and transcriptional regulatory modules is rapidly expanding, and is being organized in lists of functionally related genes. Jointly, these two sources of information hold a tremendous potential for gaining new insights into functioning of living systems. Results: Genomics Portals platform integrates access to an extensive knowledge base and a large database of human, mouse, and rat genomics data with basic analytical visualization tools. It provides the context for analyzing and interpreting new experimental data and the tool for effective mining of a large number of publicly available genomics datasets stored in the back-end databases. The uniqueness of this platform lies in the volume and the diversity of genomics data that can be accessed and analyzed (gene expression, ChIP-chip, ChIP-seq, epigenomics, computationally predicted binding sites, etc), and the integration with an extensive knowledge base that can be used in such analysis. Conclusion: The integrated access to primary genomics data, functional knowledge and analytical tools makes Genomics Portals platform a unique tool for interpreting results of new genomics experiments and for mining the vast amount of data stored in the Genomics Portals backend databases. Genomics Portals can be accessed and used freely at http://GenomicsPortals.org. PMID:20070909

  4. Advancement of Analysis Method for Electromagnetic Screening Effect of Mountain Tunnel

    NASA Astrophysics Data System (ADS)

    Okutani, Tamio; Nakamura, Nobuyuki; Terada, Natsuki; Fukuda, Mitsuyoshi; Tate, Yutaka; Inada, Satoshi; Itoh, Hidenori; Wakao, Shinji

    In this paper we report advancement of an analysis method for electromagnetic screening effect of mountain tunnel with a multiple conductor circuit model. On A.C. electrified railways it is a great issue to manage the influence of electromagnetic induction caused by feeding circuits. Tunnels are said to have a screening effect to reduce the electromagnetic induction because a large amount of steel is used in the tunnels. But recently the screening effect is less expected because New Austrian Tunneling Method (NATM), in which the amount of steel used is less than in conventional methods, is adopted as the standard tunneling method for constructing mountain tunnels. So we measured and analyzed the actual screening effect of mountain tunnels constructed with NATM. In the process of the analysis we have advanced a method to analyze the screening effect more precisely. In this method we can adequately model tunnel structure as a part of multiple conductor circuit.

  5. Reverse isotope dilution method for determining benzene and metabolites in tissues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bechtold, W.E.; Sabourin, P.J.; Henderson, R.F.

    1988-07-01

    A method utilizing reverse isotope dilution for the analysis of benzene and its organic soluble metabolites in tissues of rats and mice is presented. Tissues from rats and mice that had been exposed to radiolabeled benzene were extracted with ethyl acetate containing known, excess quantities of unlabeled benzene and metabolites. Butylated hydroxytoluene was added as an antioxidant. The ethyl acetate extracts were analyzed with semipreparative reversed-phase HPLC. Isolated peaks were collected and analyzed for radioactivity (by liquid scintillation spectrometry) and for mass (by UV absorption). The total amount of each compound present was calculated from the mass dilution of the radiolabeled isotope. This method has the advantages of high sensitivity, because of the high specific activity of benzene, and relative stability of the analyses, because of the addition of large amounts of unlabeled carrier analogue.

  6. Effect of oil on an electrowetting lenticular lens and related optical characteristics.

    PubMed

    Shin, Dooseub; Kim, Junoh; Kim, Cheoljoong; Koo, Gyo Hyun; Sim, Jee Hoon; Lee, Junsik; Won, Yong Hyub

    2017-03-01

    While there are many ways to realize autostereoscopic 2D/3D switchable displays, the electrowetting lenticular lens is superior due to the high optical efficiency and short response time. In this paper, we propose a more stable electrowetting lenticular lens by controlling the quantity of oil. With a large amount of oil, the oil layer was broken and the lenticular lens was damaged at relatively low voltage. Therefore, controlling the amount of oil is crucial to obtain the required dioptric power with stability. We proposed a new structure to evenly adjust the volume of oil and the dioptric power was measured by varying the volume of oil. Furthermore, the optical characteristics were finally analyzed in the electrowetting lenticular lens array with a proper amount of oil.

  7. Automating the Generation of the Cassini Tour Atlas Database

    NASA Technical Reports Server (NTRS)

    Grazier, Kevin R.; Roumeliotis, Chris; Lange, Robert D.

    2010-01-01

    The Tour Atlas is a large database of geometrical tables, plots, and graphics used by Cassini science planning engineers and scientists primarily for science observation planning. Over time, as the contents of the Tour Atlas grew, the amount of time it took to recreate the Tour Atlas similarly grew--to the point that it took one person a week of effort. When Cassini tour designers estimated that they were going to create approximately 30 candidate Extended Mission trajectories--which needed to be analyzed for science return in a short amount of time--it became a necessity to automate. We report on the automation methodology that reduced the amount of time it took one person to (re)generate a Tour Atlas from a week to, literally, one UNIX command.

  8. The application of waste fly ash and construction-waste in cement filling material in goaf

    NASA Astrophysics Data System (ADS)

    Chen, W. X.; Xiao, F. K.; Guan, X. H.; Cheng, Y.; Shi, X. P.; Liu, S. M.; Wang, W. W.

    2018-01-01

    As the process of urbanization has accelerated, large amounts of abandoned fly ash and construction waste have been produced, occupying farmland and polluting the environment. In this paper, large quantities of construction waste and abandoned fly ash are mixed into filling material for goafs; the best formula for a filling material containing a large amount of abandoned fly ash and construction waste is obtained, and the performance of the filling material is analyzed. The experimental results show that the cost of the filling material is very low while its performance is very good, giving it good prospects for use in goafs.

  9. The impact of air pollutants on rainwater chemistry during "urban-induced heavy rainfall" in downtown Tokyo, Japan

    NASA Astrophysics Data System (ADS)

    Uchiyama, Ryunosuke; Okochi, Hiroshi; Katsumi, Naoya; Ogata, Hiroko

    2017-06-01

    In order to clarify the impact of air pollution on the formation of sudden and locally distributed heavy rain in urban areas (hereafter UHR = urban-induced heavy rain), we analyzed inorganic ions in rainwater samples collected on an event basis over 5 years from October 2012 to December 2016 in Shinjuku, Tokyo. Hourly rainfall amounts and wet deposition fluxes of acidic components (the sum of H+, NH4+, NO3-, and nonsea-salt SO42-) in UHR were 13.1 and 17.8 times larger than those in normal rainfall, respectively, indicating that large amounts of air pollutants were scavenged and deposited by UHR with large amounts of rainfall. The level of air pollutants, such as NO2, SO2, and potential ozone, in the ambient air increased just before the formation of UHR and decreased sharply at the end of the UHR event. These results indicate that NO2, which was formed secondarily by oxidants, was further oxidized by HO radicals and formed HNO3 just before the formation of UHR, which was subsequently scavenged by UHR.

  10. Evaluating the Cassandra NoSQL Database Approach for Genomic Data Persistency

    PubMed Central

    Aniceto, Rodrigo; Xavier, Rene; Guimarães, Valeria; Hondo, Fernanda; Holanda, Maristela; Walter, Maria Emilia; Lifschitz, Sérgio

    2015-01-01

    Rapid advances in high-throughput sequencing techniques have created interesting computational challenges in bioinformatics. One of them refers to management of massive amounts of data generated by automatic sequencers. We need to deal with the persistency of genomic data, particularly storing and analyzing these large-scale processed data. To find an alternative to the frequently considered relational database model becomes a compelling task. Other data models may be more effective when dealing with a very large amount of nonconventional data, especially for writing and retrieving operations. In this paper, we discuss the Cassandra NoSQL database approach for storing genomic data. We perform an analysis of persistency and I/O operations with real data, using the Cassandra database system. We also compare the results obtained with a classical relational database system and another NoSQL database approach, MongoDB. PMID:26558254

  11. Evaluation and auger analysis of a zinc-dialkyl-dithiophosphate antiwear additive in several diester lubricants

    NASA Technical Reports Server (NTRS)

    Brainard, W. A.; Ferrante, J.

    1979-01-01

    The wear of pure iron in sliding contact with hardened M-2 tool steel was measured for a series of synthetic diester fluids, both with and without a zinc dialkyl dithiophosphate (ZDP) antiwear additive, as test lubricants. Selected wear scars were analyzed by an Auger emission spectroscopy (AES) depth profiling technique in order to assess the surface film elemental composition. The ZDP was an effective antiwear additive for all the diesters except dibutyl oxalate and dibutyl sebacate. The high wear measured for the additive-containing oxalate was related to corrosion; the higher wear measured for the additive-containing sebacate was due to an oxygen interaction. The AES of dibutyl sebacate surfaces run in dry air and in dry nitrogen showed large differences only in the amount of oxygen present. The AES of worn surfaces where the additive was effective showed no zinc, only a little phosphorus, and large amounts of sulfur.

  12. Data Prospecting Framework - a new approach to explore "big data" in Earth Science

    NASA Astrophysics Data System (ADS)

    Ramachandran, R.; Rushing, J.; Lin, A.; Kuo, K.

    2012-12-01

    Due to advances in sensors, computation and storage, the cost and effort required to produce large datasets have been significantly reduced. As a result, we are seeing a proliferation of large-scale data sets being assembled in almost every science field, especially in geosciences. Opportunities to exploit the "big data" are enormous as new hypotheses can be generated by combining and analyzing large amounts of data. However, such a data-driven approach to science discovery assumes that scientists can find and isolate relevant subsets from vast amounts of available data. Current Earth Science data systems only provide data discovery through simple metadata and keyword-based searches and are not designed to support data exploration capabilities based on the actual content. Consequently, scientists often find themselves downloading large volumes of data, struggling with large amounts of storage and learning new analysis technologies that will help them separate the wheat from the chaff. New mechanisms of data exploration are needed to help scientists discover the relevant subsets. We present data prospecting, a new content-based data analysis paradigm to support data-intensive science. Data prospecting allows the researchers to explore big data in determining and isolating data subsets for further analysis. This is akin to geo-prospecting in which mineral sites of interest are determined over the landscape through screening methods. The resulting "data prospects" only provide an interaction with and feel for the data through first-look analytics; the researchers would still have to download the relevant datasets and analyze them deeply using their favorite analytical tools to determine if the datasets will yield new hypotheses. Data prospecting combines two traditional categories of data analysis, data exploration and data mining, within the discovery step. Data exploration utilizes manual/interactive methods for data analysis such as standard statistical analysis and visualization, usually on small datasets. On the other hand, data mining utilizes automated algorithms to extract useful information. Humans guide these automated algorithms and specify algorithm parameters (training samples, clustering size, etc.). Data Prospecting combines these two approaches using high performance computing and the new techniques for efficient distributed file access.

  13. Both topography and climate affected forest and woodland burn severity in two regions of the western US, 1984 to 2006

    Treesearch

    Gregory K. Dillon; Zachery A. Holden; Penelope Morgan; Michael A. Crimmins; Emily K. Heyerdahl; Charles H. Luce

    2011-01-01

    Fire is a keystone process in many ecosystems of western North America. Severe fires kill and consume large amounts of above- and belowground biomass and affect soils, resulting in long-lasting consequences for vegetation, aquatic ecosystem productivity and diversity, and other ecosystem properties. We analyzed the occurrence of, and trends in, satellite-derived burn...

  14. The Arabic Hyperbolic Pattern "Fa??al" in Two Recent Translations of the Qur'an

    ERIC Educational Resources Information Center

    El-Zawawy, Amr M.

    2014-01-01

    The present study addresses the problem of rendering the 'fa??al' hyperbolic pattern into English in two recent translations of the Qur'an. Due to the variety of Qur'an translations and the large number of hyperbolic forms of Arabic verbs recorded in the Qur'an, only two translations of the Qur'an are consulted and analyzed: these two…

  15. An analysis of antioxidants, organoleptics and hedonics with variations of boiling time in Jasmine tea and Jasmine root tea a study on Kaliprau, Pemalang

    NASA Astrophysics Data System (ADS)

    Arifan, Fahmi; Winarni, Sri; Handoyo, Gentur; Nurdiana, Asri; Nabila Rahma H, Afkar; Risdiyanti, Sri

    2018-05-01

    There are many jasmine plantations in Kaliprau, Pemalang, without any preservation or post-production processing. The aims of this research are to analyze the antioxidant content and to perform organoleptic and hedonic testing. Antioxidant content was measured using DPPH. The organoleptic and hedonic tests were carried out on 25 respondents. The jasmine parts used in this research were the flower and the root. Jasmine tea samples boiled for 15 minutes contained about 55.0% antioxidant, and jasmine root tea contained 74.84%. For a boiling time of 30 minutes, jasmine tea contained 54.00% antioxidant and jasmine root tea 84.00%. Jasmine tea and jasmine root tea contain flavonoids. Although large amounts of antioxidant were found in jasmine tea and jasmine root tea (50-100%), the antioxidant content of the samples decreased with prolonged boiling time. 84% of respondents liked the scent, flavor, color, and texture of jasmine tea and jasmine root tea. These products are ultimately accepted by the people and have a large antioxidant content, particularly the jasmine tea.

  16. Nearly ideal binary communication in squeezed channels

    NASA Astrophysics Data System (ADS)

    Paris, Matteo G.

    2001-07-01

    We analyze the effect of squeezing the channel in binary communication based on Gaussian states. We show that for coding on pure states, squeezing increases the detection probability at fixed size of the strategy, actually saturating the optimal bound already for moderate signal energy. Using the Neyman-Pearson lemma for fuzzy hypothesis testing, we are able to analyze also the case of mixed states and to find the optimal amount of squeezing that can be effectively employed. We find that optimally squeezed channels are robust against signal mixing and largely improve the power of the strategy by comparison with coherent ones.
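
    The "size" of the strategy mentioned above is the false-alarm probability of the detection test. As background (a standard statement of the Neyman-Pearson criterion, not a result specific to this paper's squeezed-state analysis), the lemma says that at fixed size the likelihood-ratio test maximizes the detection probability:

        \Lambda(x) = \frac{p_1(x)}{p_0(x)} \ge \lambda_\alpha \;\Rightarrow\; \text{accept } H_1,
        \qquad
        \alpha = \Pr\!\left[\Lambda(x) \ge \lambda_\alpha \mid H_0\right],
        \qquad
        P_d = \Pr\!\left[\Lambda(x) \ge \lambda_\alpha \mid H_1\right].

    Any other test with the same size \alpha has detection probability at most P_d (randomization at the threshold handles discrete cases).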

  17. A parallel implementation of the network identification by multiple regression (NIR) algorithm to reverse-engineer regulatory gene networks.

    PubMed

    Gregoretti, Francesco; Belcastro, Vincenzo; di Bernardo, Diego; Oliva, Gennaro

    2010-04-21

    The reverse engineering of gene regulatory networks using gene expression profile data has become crucial to gain novel biological knowledge. Large amounts of data that need to be analyzed are currently being produced due to advances in microarray technologies. Using current reverse engineering algorithms to analyze large data sets can be very computationally intensive. These emerging computational requirements can be met using parallel computing techniques. It has been shown that the Network Identification by multiple Regression (NIR) algorithm performs better than other ready-to-use reverse engineering software. However, it cannot be used with large networks with thousands of nodes--as is the case in biological networks--due to its high time and space complexity. In this work we overcome this limitation by designing and developing a parallel version of the NIR algorithm. The new implementation of the algorithm reaches very good accuracy even for large gene networks, improving our understanding of the gene regulatory networks that is crucial for a wide range of biomedical applications.
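
    NIR restricts each gene to a small number of regulators and searches exhaustively for the regression that best explains its response across perturbation experiments, which is what makes a coarse-grained parallelization over genes natural. The following is a hedged Python sketch of that idea (it is not the authors' implementation; the data layout, exhaustive search, and function names are illustrative):

        # Regression-based network inference in the spirit of NIR (illustrative only):
        # for each gene, test small sets of candidate regulators and keep the
        # least-squares fit with the smallest residual.  Parallelism is per gene.
        import numpy as np
        from itertools import combinations
        from multiprocessing import Pool

        def best_regulators(args):
            i, X, y, k = args                      # X: genes x experiments, y: response of gene i
            candidates = [j for j in range(X.shape[0]) if j != i]
            best = (np.inf, None, None)
            for regs in combinations(candidates, k):
                A = X[list(regs)].T                # experiments x k design matrix
                coef, res, *_ = np.linalg.lstsq(A, y, rcond=None)
                sse = res[0] if res.size else float(np.sum((y - A @ coef) ** 2))
                if sse < best[0]:
                    best = (sse, regs, coef)
            return i, best

        def infer_network(X, responses, k=2, processes=4):
            """Return, for every gene i, its best k regulators and their weights."""
            tasks = [(i, X, responses[i], k) for i in range(X.shape[0])]
            with Pool(processes) as pool:          # coarse-grained parallelism over genes
                return dict(pool.map(best_regulators, tasks))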

  18. Mathematical Models to Determine Stable Behavior of Complex Systems

    NASA Astrophysics Data System (ADS)

    Sumin, V. I.; Dushkin, A. V.; Smolentseva, T. E.

    2018-05-01

    The paper analyzes the possibility of predicting the functioning of a complex dynamic system with a significant amount of circulating information and a large number of random factors impacting its functioning. The functioning of the complex dynamic system is described in terms of chaotic states, self-organized criticality, and bifurcation. This problem may be resolved by modeling such systems as dynamic ones, without applying stochastic models, while taking strange attractors into account.

  19. Reanalysis of 24 Nearby Open Clusters using Gaia data

    NASA Astrophysics Data System (ADS)

    Yen, Steffi X.; Reffert, Sabine; Röser, Siegfried; Schilbach, Elena; Kharchenko, Nina V.; Piskunov, Anatoly E.

    2018-04-01

    We have developed a fully automated cluster characterization pipeline, which simultaneously determines cluster membership and fits the fundamental cluster parameters: distance, reddening, and age. We present results for 24 established clusters and compare them to literature values. Given the large amount of stellar data for clusters available from Gaia DR2 in 2018, this pipeline will be beneficial to analyzing the parameters of open clusters in our Galaxy.

  20. An Exploratory Study of the United States Naval Academy Engineering Curriculum

    DTIC Science & Technology

    2007-06-01

    research was conducted in an entirely quantitative fashion. There is a large amount of qualitative data that was not analyzed. These data are from the... qualitative assessment of the Naval Academy's engineering program, and would be an excellent opportunity for future research. Overall, the results of...

  1. Characterization of Moving Dust Particles

    NASA Technical Reports Server (NTRS)

    Bos, Brent J.; Antonille, Scott R.; Memarsadeghi, Nargess

    2010-01-01

    A large depth-of-field Particle Image Velocimeter (PIV) has been developed at NASA GSFC to characterize dynamic dust environments on planetary surfaces. This instrument detects and senses lofted dust particles. We have been developing an autonomous image analysis algorithm architecture for the PIV instrument to greatly reduce the amount of data that it has to store and downlink. The algorithm analyzes PIV images and reduces the image information down to only the particle measurement data we are interested in receiving on the ground - typically reducing the amount of data to be handled by more than two orders of magnitude. We give a general description of PIV algorithms and describe only the algorithm for estimating the velocity of the traveling particles.
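
    As a rough illustration of the data-reduction idea (not the flight algorithm; the threshold and the matching scheme are assumptions), particle centroids can be extracted from each exposure and velocities estimated from centroid displacements, so that only a short list of measurements, rather than full images, needs to be stored and downlinked:

        # Detect bright particles in two successive frames and estimate velocities by
        # nearest-neighbour centroid matching; only the resulting vectors are kept.
        import numpy as np
        from scipy import ndimage

        def particle_centroids(frame, threshold):
            labels, n = ndimage.label(frame > threshold)
            return np.array(ndimage.center_of_mass(frame, labels, list(range(1, n + 1))))

        def estimate_velocities(frame_a, frame_b, dt, threshold=50.0):
            a = particle_centroids(frame_a, threshold)
            b = particle_centroids(frame_b, threshold)
            velocities = []
            for p in a:                                    # match each particle to its
                if len(b) == 0:                            # nearest neighbour in frame b
                    break
                j = int(np.argmin(np.linalg.norm(b - p, axis=1)))
                velocities.append((b[j] - p) / dt)         # pixels per second
            return np.array(velocities)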

  2. Automatic Feature Extraction from Planetary Images

    NASA Technical Reports Server (NTRS)

    Troglio, Giulia; Le Moigne, Jacqueline; Benediktsson, Jon A.; Moser, Gabriele; Serpico, Sebastiano B.

    2010-01-01

    With the launch of several planetary missions in the last decade, a large number of planetary images has already been acquired and many more will be available for analysis in the coming years. The image data need to be analyzed, preferably by automatic processing techniques, because of the huge amount of data. Although many automatic feature extraction methods have been proposed and utilized for Earth remote sensing images, these methods are not always applicable to planetary data that often present low contrast and uneven illumination characteristics. Different methods have already been presented for crater extraction from planetary images, but the detection of other types of planetary features has not been addressed yet. Here, we propose a new unsupervised method for the extraction of different features from the surface of the analyzed planet, based on the combination of several image processing techniques, including a watershed segmentation and the generalized Hough Transform. The method has many applications, among which is image registration, and it can be applied to arbitrary planetary images.
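
    One of the ingredients named above, marker-controlled watershed segmentation, can be sketched in a few lines of Python (a hedged illustration under assumed parameters, not the authors' pipeline; the generalized Hough transform stage is omitted here):

        # Watershed segmentation seeded from local maxima of a distance transform.
        import numpy as np
        from scipy import ndimage
        from skimage.filters import threshold_otsu
        from skimage.feature import peak_local_max
        from skimage.segmentation import watershed

        def segment_features(image):
            binary = image > threshold_otsu(image)              # coarse foreground mask
            distance = ndimage.distance_transform_edt(binary)
            blobs, _ = ndimage.label(binary)
            peaks = peak_local_max(distance, min_distance=10, labels=blobs)
            markers = np.zeros(image.shape, dtype=int)
            markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
            return watershed(-distance, markers, mask=binary)    # labelled regions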

  3. Shape-Memory Effect and Pseudoelasticity in Fe-Mn-Based Alloys

    NASA Astrophysics Data System (ADS)

    La Roca, P.; Baruj, A.; Sade, M.

    2017-03-01

    Several Fe-based alloys are being considered as potential candidates for applications which require shape-memory behavior or superelastic properties. The possibility of using fabrication methods which are well known in the steel industry is very attractive and encourages a large amount of research in the field. In the present article, Fe-Mn-based alloys are mainly addressed. On the one hand, attention is paid to the shape-memory effect where the alloys contain (a) a maximum amount of Mn up to around 30 wt%, (b) several possible substitutional elements like Si, Cr, Ni, Co, and Nb and (c) some possible interstitial elements like C. On the other hand, superelastic alloys are analyzed, mainly the Fe-Mn-Al-Ni system discovered a few years ago. The most noticeable properties resulting from the martensitic transformations which are responsible for the mentioned properties, i.e., the fcc-hcp in the first case and the bcc-fcc in the latter are discussed. Selected potential applications are also analyzed.

  4. Analytics to Better Interpret and Use Large Amounts of Heterogeneous Data

    NASA Astrophysics Data System (ADS)

    Mathews, T. J.; Baskin, W. E.; Rinsland, P. L.

    2014-12-01

    Data scientists at NASA's Atmospheric Science Data Center (ASDC) are seasoned software application developers who have worked with the creation, archival, and distribution of large datasets (multiple terabytes and larger). In order for ASDC data scientists to effectively implement the most efficient processes for cataloging and organizing data access applications, they must be intimately familiar with the data contained in the datasets with which they are working. Key technologies that are critical components of the background of ASDC data scientists include: large RDBMSs (relational database management systems) and NoSQL databases; web services; service-oriented architectures; structured and unstructured data access; as well as processing algorithms. However, as prices of data storage and processing decrease, sources of data increase, and technologies advance - granting more people access to data at real or near-real time - data scientists are being pressured to accelerate their ability to identify and analyze vast amounts of data. With existing tools this is becoming exceedingly more challenging to accomplish. For example, NASA Earth Science Data and Information System (ESDIS) alone grew from having just over 4 PBs of data in 2009 to nearly 6 PBs of data in 2011. This amount then increased to roughly 10 PBs of data in 2013. With data from at least ten new missions to be added to the ESDIS holdings by 2017, the current volume will continue to grow exponentially and drive the need to be able to analyze more data even faster. Though there are many highly efficient, off-the-shelf analytics tools available, these tools mainly cater to business data, which is predominantly unstructured. Consequently, there are very few known analytics tools that interface well to archived Earth science data, which is predominantly heterogeneous and structured. This presentation will identify use cases for data analytics from an Earth science perspective in order to begin to identify specific tools that may be able to address those challenges.

  5. Design Enhancements to Facilitate a Sustainable and Energy Efficient Dining Facility (DFAC) in a Contingency Environment

    DTIC Science & Technology

    2014-09-01

    resources, and generate large amounts of food and solid waste daily. Almost all Contingency Basecamp (CB) DFACs provide individual paper and plastic ware...which is costly in terms of purchase, transportation, and disposal. This work analyzed the effects of replacing paper and plastic ware with...reusable materials, and of adding industrial dishwashers to reduce the logistical burden of using paper and plastic ware. Additional enhancements

  6. LinkWinds: An Approach to Visual Data Analysis

    NASA Technical Reports Server (NTRS)

    Jacobson, Allan S.

    1992-01-01

    The Linked Windows Interactive Data System (LinkWinds) is a prototype visual data exploration and analysis system resulting from a NASA/JPL program of research into graphical methods for rapidly accessing, displaying and analyzing large multivariate multidisciplinary datasets. It is an integrated multi-application execution environment allowing the dynamic interconnection of multiple windows containing visual displays and/or controls through a data-linking paradigm. This paradigm, which results in a system much like a graphical spreadsheet, is not only a powerful method for organizing large amounts of data for analysis, but provides a highly intuitive, easy to learn user interface on top of the traditional graphical user interface.

  7. Nonlinear effects in the laser-assisted scattering of a positron by a muon

    NASA Astrophysics Data System (ADS)

    Du, Wen-Yuan; Wang, Bing-Hong; Li, Shu-Min

    2018-02-01

    The scattering of a positron by a muon in the presence of a linearly polarized laser field is investigated in the first Born approximation. The theoretical results reveal: (1) At large scattering angle, a number of multiphoton processes take place in the course of scattering. The photon emission processes predominate over the photon absorption ones. (2) Some nonlinear phenomena involving oscillations, dark angular windows, and asymmetry can be observed in angular distributions. We analyze the cause giving rise to the dark windows and the geometric asymmetry initially noted in potential scattering. (3) We also analyze the total differential cross-section; the result shows that the larger the incident energy is, the smaller the total differential cross-section is. The reasons for these new results are analyzed.

  8. Earth Science Data Analytics: Preparing for Extracting Knowledge from Information

    NASA Technical Reports Server (NTRS)

    Kempler, Steven; Barbieri, Lindsay

    2016-01-01

    Data analytics is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations and other useful information. Data analytics is a broad term that includes data analysis, as well as an understanding of the cognitive processes an analyst uses to understand problems and explore data in meaningful ways. Analytics also includes data extraction, transformation, and reduction, utilizing specific tools, techniques, and methods. Turning to data science, definitions of data science sound very similar to those of data analytics (which leads to a lot of the confusion between the two). But the skills needed for both - co-analyzing large amounts of heterogeneous data, understanding and utilizing relevant tools and techniques, and subject matter expertise - although similar, serve different purposes. Data Analytics takes a practitioner's approach, applying expertise and skills to solve issues and gain subject knowledge. Data Science is more theoretical (research in itself) in nature, providing strategic actionable insights and new innovative methodologies. Earth Science Data Analytics (ESDA) is the process of examining, preparing, reducing, and analyzing large amounts of spatial (multi-dimensional), temporal, or spectral data using a variety of data types to uncover patterns, correlations and other information, to better understand our Earth. The large variety of datasets (temporal and spatial differences, data types, formats, etc.) invites the need for data analytics skills that cover the science domain and data preparation, reduction, and analysis techniques from a practitioner's point of view. The application of these skills to ESDA is the focus of this presentation. The Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster was created in recognition of the practical need to facilitate the co-analysis of large amounts of data and information for Earth science. Thus, from the point of view of advancing science: on the continuum of ever-evolving data management systems, we need to understand and develop ways that allow the variety of data relationships to be examined and information to be manipulated, such that knowledge can be enhanced to facilitate science. Recognizing the importance and potential impacts of the unlimited ways to co-analyze heterogeneous datasets, now and especially in the future, one of the objectives of the ESDA cluster is to facilitate the preparation of individuals to understand and apply the skills needed for Earth science data analytics. Pinpointing and communicating the needed skills and expertise is new, and not easy. Information technology is just beginning to provide the tools for advancing the analysis of heterogeneous datasets in a big way, thus providing the opportunity to discover non-obvious scientific relationships previously invisible to the science eye. And it is not easy: it takes individuals, or teams of individuals, with just the right combination of skills to understand the data and develop the methods to glean knowledge out of data and information. In addition, whereas definitions of data science and big data are (more or less) available (summarized in Reference 5), Earth science data analytics is virtually ignored in the literature (barring a few excellent sources).

  9. Heavy Analysis and Light Virtualization of Water Use Data with Python

    NASA Astrophysics Data System (ADS)

    Kim, H.; Bijoor, N.; Famiglietti, J. S.

    2014-12-01

    Water utilities possess a large amount of water data that could be used to inform urban ecohydrology, management decisions, and conservation policies, but such data are rarely analyzed owing to the difficulty of analysis, visualization, and interpretation. We have developed a high performance computing resource for this purpose. We partnered with 6 water agencies in Orange County who provided 10 years of parcel-level monthly water use billing data for a pilot study. The first challenge that we overcame was to correct human errors and unify the many different data formats across all agencies. Second, we tested and applied experimental approaches to the data, including complex calculations, with high efficiency. Third, we developed a method to refine the data so they can be browsed along a time-series index and/or with geospatial queries with high efficiency, no matter how large the data. Python scientific libraries were the best match for handling arbitrary data sets in our environment. Further milestones include agency entry, sets of formulae, and maintaining 15M rows x 70 columns of data with high performance for CPU-bound processes. To deal with billions of rows, we built an analysis virtualization stack by leveraging IPython parallel computing. With this architecture, each agency can be considered one computing node or virtual machine that maintains its own data sets. For example, a big agency could use a large node, and a small agency could use a micro node. Under the minimum required raw data specs, more agencies could be analyzed. The program developed in this study simplifies data analysis, visualization, and interpretation of large water datasets, and can be used to analyze large data volumes from water agencies nationally or worldwide.
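
    A minimal sketch of the kind of unification and time-indexed browsing described above, assuming hypothetical file and column names (this is not the project's code):

        # Unify parcel-level monthly billing files from several agencies into one frame
        # indexed by (agency, parcel, month) so it can be sliced by time or joined to
        # geospatial queries.
        import pandas as pd

        def load_agency(path, agency):
            df = pd.read_csv(path, parse_dates=["bill_month"])
            return (df.rename(columns=str.lower)
                      .assign(agency=agency)[["agency", "parcel_id", "bill_month", "use_ccf"]])

        frames = [load_agency(f"{a}.csv", a) for a in ["agency_a", "agency_b"]]
        water = (pd.concat(frames)
                   .set_index(["agency", "parcel_id", "bill_month"])
                   .sort_index())

        # Example query: total monthly use per agency over the whole record.
        monthly = water.groupby(level=["agency", "bill_month"])["use_ccf"].sum()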

  10. webpic: A flexible web application for collecting distance and count measurements from images

    PubMed Central

    2018-01-01

    Despite increasing ability to store and analyze large amounts of data for organismal and ecological studies, the process of collecting distance and count measurements from images has largely remained time consuming and error-prone, particularly for tasks for which automation is difficult or impossible. Improving the efficiency of these tasks, which allows for more high quality data to be collected in a shorter amount of time, is therefore a high priority. The open-source web application, webpic, implements common web languages and widely available libraries and productivity apps to streamline the process of collecting distance and count measurements from images. In this paper, I introduce the framework of webpic and demonstrate one readily available feature of this application, linear measurements, using fossil leaf specimens. This application fills the gap between workflows accomplishable by individuals through existing software and those accomplishable by large, unmoderated crowds. It demonstrates that flexible web languages can be used to streamline time-intensive research tasks without the use of specialized equipment or proprietary software and highlights the potential for web resources to facilitate data collection in research tasks and outreach activities with improved efficiency. PMID:29608592

  11. Examining the social ecology of a bar-crawl: An exploratory pilot study.

    PubMed

    Clapp, John D; Madden, Danielle R; Mooney, Douglas D; Dahlquist, Kristin E

    2017-01-01

    Many of the problems associated with alcohol occur after a single drinking event (e.g. drink driving, assault). These acute alcohol problems have a huge global impact and account for a large percentage of unintentional and intentional injuries in the world. Nonetheless, alcohol research and preventive interventions rarely focus on drinking at the event-level since drinking events are complex, dynamic, and methodologically challenging to observe. This exploratory study provides an example of how event-level data may be collected, analyzed, and interpreted. The drinking behavior of twenty undergraduate students enrolled at a large Midwestern public university was observed during a single bar crawl event that is organized by students annually. Alcohol use was monitored with transdermal alcohol devices coupled with ecological momentary assessments and geospatial data. "Small N, Big Data" studies have the potential to advance health behavior theory and to guide real-time interventions. However, such studies generate large amounts of within subject data that can be challenging to analyze and present. This study examined how to visually display event-level data and also explored the relationship between some basic indicators and alcohol consumption.

  12. Security and Correctness Analysis on Privacy-Preserving k-Means Clustering Schemes

    NASA Astrophysics Data System (ADS)

    Su, Chunhua; Bao, Feng; Zhou, Jianying; Takagi, Tsuyoshi; Sakurai, Kouichi

    Due to the fast development of the Internet and related IT technologies, it has become easier and easier to access large amounts of data. k-means clustering is a powerful and frequently used technique in data mining. Many research papers on privacy-preserving k-means clustering have been published. In this paper, we analyze the existing privacy-preserving k-means clustering schemes based on cryptographic techniques. We show that those schemes can cause privacy breaches and cannot output correct results due to faults in the protocol construction. Furthermore, we analyze our proposal as an option to mitigate these problems, although it still involves an intermediate information breach during the computation.

  13. Morphology, composition, and mixing state of primary particles from combustion sources - crop residue, wood, and solid waste.

    PubMed

    Liu, Lei; Kong, Shaofei; Zhang, Yinxiao; Wang, Yuanyuan; Xu, Liang; Yan, Qin; Lingaswamy, A P; Shi, Zongbo; Lv, Senlin; Niu, Hongya; Shao, Longyi; Hu, Min; Zhang, Daizhou; Chen, Jianmin; Zhang, Xiaoye; Li, Weijun

    2017-07-11

    Morphology, composition, and mixing state of individual particles emitted from crop residue, wood, and solid waste combustion in a residential stove were analyzed using transmission electron microscopy (TEM). Our study showed that particles from crop residue and apple wood combustion were mainly organic matter (OM) in smoldering phase, whereas soot-OM internally mixed with K in flaming phase. Wild grass combustion in flaming phase released some Cl-rich-OM/soot particles and cardboard combustion released OM and S-rich particles. Interestingly, particles from hardwood (pear wood and bamboo) and softwood (cypress and pine wood) combustion were mainly soot and OM in the flaming phase, respectively. The combustion of foam boxes, rubber tires, and plastic bottles/bags in the flaming phase released large amounts of soot internally mixed with a small amount of OM, whereas the combustion of printed circuit boards and copper-core cables emitted large amounts of OM with Br-rich inclusions. In addition, the printed circuit board combustion released toxic metals containing Pb, Zn, Sn, and Sb. The results are important to document properties of primary particles from combustion sources, which can be used to trace the sources of ambient particles and to know their potential impacts in human health and radiative forcing in the air.

  14. Multiple Goals and Homework Involvement in Elementary School Students.

    PubMed

    Valle, Antonio; Pan, Irene; Núñez, José C; Rodríguez, Susana; Rosário, Pedro; Regueiro, Bibiana

    2015-10-27

    This work arises from the need to investigate the role of motivational variables in homework involvement and academic achievement of elementary school students. The aims of this study are twofold: identifying the different combinations of student academic goals and analyzing the differences in homework involvement and academic achievement. The sample was composed of 535 fourth-, fifth- and sixth-grade elementary school students, between the ages of 9 and 13 years old. Findings showed three groups with different motivational profiles: a group of students with high multiple goals, another group with a learning goal orientation and a third group defined by a low multiple goals profile. Focusing on the differences between groups, it was observed that the amount of time doing homework was not associated with any motivational profile. Nevertheless, the differences were statistically significant between the motivational groups in the amount of homework (F(2, 530) = 42.59; p < .001; ηp² = .138), in the management of time spent on homework (F(2, 530) = 33.08; p < .001; ηp² = .111), and in academic achievement (F(2, 530) = 33.99; p < .001; ηp² = .114). The effect size was large for the amount of homework performed and was also relatively large in the case of management of time and academic achievement.
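
    The reported partial eta-squared values can be recovered from the F statistics and their degrees of freedom using the standard ANOVA relation (a general identity, not taken from the paper):

        \eta_p^2 = \frac{SS_{effect}}{SS_{effect} + SS_{error}} = \frac{F \, df_1}{F \, df_1 + df_2}

    For example, for F(2, 530) = 42.59, \eta_p^2 = (42.59 \times 2)/(42.59 \times 2 + 530) \approx .138, which matches the value quoted for the amount of homework.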

  15. Improving Hierarchical Models Using Historical Data with Applications in High-Throughput Genomics Data Analysis.

    PubMed

    Li, Ben; Li, Yunxiao; Qin, Zhaohui S

    2017-06-01

    Modern high-throughput biotechnologies such as microarray and next generation sequencing produce a massive amount of information for each sample assayed. However, in a typical high-throughput experiment, only a limited amount of data is observed for each individual feature, thus the classical 'large p, small n' problem. The Bayesian hierarchical model, capable of borrowing strength across features within the same dataset, has been recognized as an effective tool in analyzing such data. However, the shrinkage effect, the most prominent feature of hierarchical models, can lead to undesirable over-correction for some features. In this work, we discuss possible causes of the over-correction problem and propose several alternative solutions. Our strategy is rooted in the fact that in the Big Data era, large amounts of historical data are available which should be taken advantage of. Our strategy presents a new framework to enhance the Bayesian hierarchical model. Through simulation and real data analysis, we demonstrated superior performance of the proposed strategy. Our new strategy also enables borrowing information across different platforms, which could be extremely useful with the emergence of new technologies and the accumulation of data from different platforms in the Big Data era. Our method has been implemented in the R package "adaptiveHM", which is freely available from https://github.com/benliemory/adaptiveHM.
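
    The shrinkage behavior discussed above can be illustrated with a minimal normal-normal empirical-Bayes sketch in Python (a generic illustration of hierarchical shrinkage, not the adaptiveHM method or its historical-data extension):

        # Each feature's noisy mean is pulled toward the grand mean by a weight set by
        # the ratio of between-feature to within-feature variance.
        import numpy as np

        def shrink_means(obs_mean, obs_var):
            """obs_mean, obs_var: per-feature sample means and their sampling variances."""
            grand = obs_mean.mean()
            tau2 = max(obs_mean.var(ddof=1) - obs_var.mean(), 1e-12)  # between-feature variance
            weight = tau2 / (tau2 + obs_var)          # weight on each feature's own data
            return weight * obs_mean + (1.0 - weight) * grand

        rng = np.random.default_rng(0)
        true_means = rng.normal(0.0, 1.0, size=1000)
        noisy = true_means + rng.normal(0.0, 2.0, size=1000)   # 'small n' noisy estimates
        shrunk = shrink_means(noisy, np.full(1000, 4.0))       # sampling variance = 4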

  16. A systems biology approach to predict and characterize human gut microbial metabolites in colorectal cancer.

    PubMed

    Wang, QuanQiu; Li, Li; Xu, Rong

    2018-04-18

    Colorectal cancer (CRC) is the second leading cause of cancer-related deaths. It is estimated that about half the cases of CRC occurring today are preventable. Recent studies showed that human gut microbiota and their collective metabolic outputs play important roles in CRC. However, the mechanisms by which human gut microbial metabolites interact with host genetics in contributing to CRC remain largely unknown. We hypothesize that computational approaches that integrate and analyze vast amounts of publicly available biomedical data have great potential in better understanding how human gut microbial metabolites are mechanistically involved in CRC. Leveraging vast amounts of publicly available data, we developed a computational algorithm to predict human gut microbial metabolites for CRC. We validated the prediction algorithm by showing that previously known CRC-associated gut microbial metabolites ranked highly (mean ranking: top 10.52%; median ranking: 6.29%; p-value: 3.85E-16). Moreover, we identified new gut microbial metabolites likely associated with CRC. Through computational analysis, we propose potential roles for tartaric acid, the top-ranked metabolite, in CRC etiology. In summary, our data-driven, computation-based study generated a large number of associations that could serve as a starting point for further experiments to refute or validate these microbial metabolite associations in CRC.

  17. Improving Hierarchical Models Using Historical Data with Applications in High-Throughput Genomics Data Analysis

    PubMed Central

    Li, Ben; Li, Yunxiao; Qin, Zhaohui S.

    2016-01-01

    Modern high-throughput biotechnologies such as microarray and next generation sequencing produce a massive amount of information for each sample assayed. However, in a typical high-throughput experiment, only a limited amount of data is observed for each individual feature, thus the classical ‘large p, small n’ problem. The Bayesian hierarchical model, capable of borrowing strength across features within the same dataset, has been recognized as an effective tool in analyzing such data. However, the shrinkage effect, the most prominent feature of hierarchical models, can lead to undesirable over-correction for some features. In this work, we discuss possible causes of the over-correction problem and propose several alternative solutions. Our strategy is rooted in the fact that in the Big Data era, large amounts of historical data are available which should be taken advantage of. Our strategy presents a new framework to enhance the Bayesian hierarchical model. Through simulation and real data analysis, we demonstrated superior performance of the proposed strategy. Our new strategy also enables borrowing information across different platforms, which could be extremely useful with the emergence of new technologies and the accumulation of data from different platforms in the Big Data era. Our method has been implemented in R package “adaptiveHM”, which is freely available from https://github.com/benliemory/adaptiveHM. PMID:28919931

  18. Research on wastewater reuse planning in Beijing central region.

    PubMed

    Jia, H; Guo, R; Xin, K; Wang, J

    2005-01-01

    The need to implement wastewater reuse in Beijing is discussed. Based on an investigation of the wastewater reuse projects already built in Beijing, the differences between small and large wastewater reuse systems were analyzed with respect to technical, economic and social issues. The advantages and disadvantages of the small system and the large system were then given. In wastewater reuse planning for the Beijing urban region, the large system was adopted. The rations of reclaimed water for different land use types, including industrial reuse, municipal reuse, grass irrigation, and scenic water reuse, were determined. Then, according to the land use information for every block in central Beijing and using GIS techniques, the amount of reclaimed water needed in every block was calculated, and the main reclaimed-water pipe system was planned.

  19. Comparison of fMRI data analysis by SPM99 on different operating systems.

    PubMed

    Shinagawa, Hideo; Honda, Ei-ichi; Ono, Takashi; Kurabayashi, Tohru; Ohyama, Kimie

    2004-09-01

    The hardware chosen for fMRI data analysis may depend on the platform already present in the laboratory or on the supporting software. In this study, we ran SPM99 software on multiple platforms to examine whether we could analyze fMRI data with SPM99, and to compare the platforms' differences and limitations in processing fMRI data, which can be attributed to hardware capabilities. Six normal right-handed volunteers participated in a study of hand-grasping to obtain fMRI data. Each subject performed a run that consisted of 98 images. The run was measured using a gradient echo-type echo planar imaging sequence on a 1.5T apparatus with a head coil. We used several personal computer (PC), Unix and Linux machines to analyze the fMRI data. There were no differences in the results obtained on the various PC, Unix and Linux machines. The only limitations in processing large amounts of fMRI data were found using PC machines. This suggests that the results obtained with different machines were not affected by differences in hardware components, such as the CPU, memory and hard drive. Rather, it is likely that the limitations in analyzing a huge amount of fMRI data were due to differences in the operating system (OS).

  20. 1H NMR quantitative determination of photosynthetic pigments from green beans (Phaseolus vulgaris L.).

    PubMed

    Valverde, Juan; This, Hervé

    2008-01-23

    Using 1H nuclear magnetic resonance spectroscopy (1D and 2D), the two types of photosynthetic pigments (chlorophylls and their derivatives, and carotenoids) of "green beans" (immature pods of Phaseolus vulgaris L.) were analyzed. Compared to other analytical methods (light spectroscopy or chromatography), 1H NMR spectroscopy is a fast analytical method that provides more information on chlorophyll derivatives (allomers and epimers) than ultraviolet-visible spectroscopy. Moreover, it gives a large amount of data without prior chromatographic separation.

  1. Techniques for increasing the efficiency of Earth gravity calculations for precision orbit determination

    NASA Technical Reports Server (NTRS)

    Smith, R. L.; Lyubomirsky, A. S.

    1981-01-01

    Two techniques were analyzed. The first is a representation using Chebyshev expansions in three-dimensional cells. The second technique employs a temporary file for storing the components of the nonspherical gravity force. Computer storage requirements and relative CPU time requirements are presented. The Chebyshev gravity representation can provide a significant reduction in CPU time in precision orbit calculations, but at the cost of a large amount of direct-access storage space, which is required for a global model.
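
    A one-dimensional Python sketch of the first technique, under illustrative assumptions (real implementations store three-dimensional expansions per cell, and the sample force model here is arbitrary):

        # Fit a Chebyshev expansion to a force component sampled inside a cell, then
        # evaluate it cheaply during orbit propagation instead of recomputing the full
        # nonspherical gravity model.
        import numpy as np
        from numpy.polynomial import chebyshev as C

        def fit_cell(force, a, b, degree=8, samples=64):
            x = np.linspace(a, b, samples)
            t = 2.0 * (x - a) / (b - a) - 1.0          # map the cell onto [-1, 1]
            return C.chebfit(t, force(x), degree)

        def eval_cell(coeffs, x, a, b):
            t = 2.0 * (x - a) / (b - a) - 1.0
            return C.chebval(t, coeffs)

        coeffs = fit_cell(lambda r: 1.0 / r**2, a=6500.0, b=6600.0)   # toy radial force
        approx = eval_cell(coeffs, np.array([6550.0]), 6500.0, 6600.0)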

  2. Quantitative mapping of rainfall rates over the oceans utilizing Nimbus-5 ESMR data

    NASA Technical Reports Server (NTRS)

    Rao, M. S. V.; Abbott, W. V.

    1976-01-01

    The electrically scanning microwave radiometer (ESMR) data from the Nimbus 5 satellite was used to deduce estimates of precipitation amount over the oceans. An atlas of the global oceanic rainfall was prepared and the global rainfall maps analyzed and related to available ground truth information as well as to large scale processes in the atmosphere. It was concluded that the ESMR system provides the most reliable and direct approach yet known for the estimation of rainfall over sparsely documented, wide oceanic regions.

  3. Understanding the economics of succeeding in disease management.

    PubMed

    Shulkin, D J

    1999-04-01

    If implemented with the proper resource commitment, disease management can have a significant effect on the health of an organization's patient population. However, it is unlikely that even the noblest of strategic initiatives will survive long without a compelling business imperative. After analyzing the business case, many organizations have committed large amounts of resources to building disease management programs. Yet these issues are still being formulated. The author discusses five issues that are key to understanding the economics of disease management.

  4. Performance Analysis Tool for HPC and Big Data Applications on Scientific Clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Wucherl; Koo, Michelle; Cao, Yu

    Big data is prevalent in HPC computing. Many HPC projects rely on complex workflows to analyze terabytes or petabytes of data. These workflows often require running over thousands of CPU cores and performing simultaneous data accesses, data movements, and computation. It is challenging to analyze the performance involving terabytes or petabytes of workflow data or measurement data of the executions, from complex workflows over a large number of nodes and multiple parallel task executions. To help identify performance bottlenecks or debug the performance issues in large-scale scientific applications and scientific clusters, we have developed a performance analysis framework, using state-of-the-art open-source big data processing tools. Our tool can ingest system logs and application performance measurements to extract key performance features, and apply the most sophisticated statistical tools and data mining methods on the performance data. It utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of the big data analysis framework, we conduct case studies on the workflows from an astronomy project known as the Palomar Transient Factory (PTF) and the job logs from the genome analysis scientific cluster. Our study processed many terabytes of system logs and application performance measurements collected on the HPC systems at NERSC. The implementation of our tool is generic enough to be used for analyzing the performance of other HPC systems and Big Data workflows.
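
    A small, hedged sketch of the ingestion step described above, with an assumed log format and field names chosen purely for illustration (the actual framework and its schemas are not reproduced here):

        # Parse timestamped job-log lines into a table and derive a simple per-task
        # feature (duration) that can feed later statistical analysis.
        import re
        import pandas as pd

        LINE = re.compile(r"(?P<ts>\S+ \S+) task=(?P<task>\S+) event=(?P<event>\S+)")

        def parse_log(path):
            with open(path) as fh:
                rows = [m.groupdict() for m in map(LINE.search, fh) if m]
            df = pd.DataFrame(rows)
            df["ts"] = pd.to_datetime(df["ts"])
            return df

        def task_durations(df):
            start = df[df.event == "start"].set_index("task")["ts"]
            end = df[df.event == "end"].set_index("task")["ts"]
            return (end - start).dt.total_seconds().rename("duration_s")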

  5. A Cost Effective Block Framing Scheme for Underwater Communication

    PubMed Central

    Shin, Soo-Young; Park, Soo-Hyun

    2011-01-01

    In this paper, the Selective Multiple Acknowledgement (SMA) method, based on Multiple Acknowledgement (MA), is proposed to efficiently reduce the amount of data transmission by redesigning the transmission frame structure and taking into consideration underwater transmission characteristics. The method is suited to integrated underwater system models, as the proposed method can handle the same amount of data in a much more compact frame structure without any appreciable loss of reliability. Herein, the performance of the proposed SMA method was analyzed and compared to those of the conventional Automatic Repeat-reQuest (ARQ), Block Acknowledgement (BA), block response, and MA methods. The efficiency of the underwater sensor network, which forms a large cluster and mostly contains uplink data, is expected to be improved by the proposed method. PMID:22247689
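
    The gain of block-oriented acknowledgement schemes comes from reporting the status of many frames in one response. The following Python sketch shows the general bitmap idea only; the paper's actual SMA frame layout and field sizes are not reproduced here:

        # Acknowledge a block of frames with a single bitmap so the receiver reports all
        # missing frames at once instead of sending one ACK per frame.
        def build_ack_bitmap(expected_count, received_seqs):
            """Bit i of the returned bytes is 1 iff frame i of the block was received."""
            bitmap = bytearray((expected_count + 7) // 8)
            for seq in received_seqs:
                if 0 <= seq < expected_count:
                    bitmap[seq // 8] |= 1 << (seq % 8)
            return bytes(bitmap)

        def missing_frames(expected_count, ack_bitmap):
            return [i for i in range(expected_count)
                    if not ack_bitmap[i // 8] & (1 << (i % 8))]

        ack = build_ack_bitmap(8, received_seqs=[0, 1, 2, 4, 5, 7])
        assert missing_frames(8, ack) == [3, 6]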

  6. Exploratory tests of two strut fuel injectors for supersonic combustion

    NASA Technical Reports Server (NTRS)

    Anderson, G. Y.; Gooderum, P. B.

    1974-01-01

    Results of supersonic mixing and combustion tests performed with two simple strut injector configurations, one with parallel injectors and one with perpendicular injectors, are presented and analyzed. Good agreement is obtained between static pressure measured on the duct wall downstream of the strut injectors and distributions obtained from one-dimensional calculations. Measured duct heat load agrees with results of the one-dimensional calculations for moderate amounts of reaction, but is underestimated when large separated regions occur near the injection location. For the parallel injection strut, good agreement is obtained between the shape of the injected fuel distribution inferred from gas sample measurements at the duct exit and the distribution calculated with a multiple-jet mixing theory. The overall fraction of injected fuel reacted in the multiple-jet calculation closely matches the amount of fuel reaction necessary to match static pressure with the one-dimensional calculation. Gas sample measurements with the perpendicular injection strut also give results consistent with the amount of fuel reaction in the one-dimensional calculation.

  7. Structural Characterization of Mannan Cell Wall Polysaccharides in Plants Using PACE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pidatala, Venkataramana R.; Mahboubi, Amir; Mortimer, Jenny C.

    Plant cell wall polysaccharides are notoriously difficult to analyze, and most methods require expensive equipment, skilled operators, and large amounts of purified material. Here, we describe a simple method for gaining detailed polysaccharide structural information, including resolution of structural isomers. For polysaccharide analysis by gel electrophoresis (PACE), plant cell wall material is hydrolyzed with glycosyl hydrolases specific to the polysaccharide of interest (e.g., mannanases for mannan). Large format polyacrylamide gels are then used to separate the released oligosaccharides, which have been fluorescently labeled. Gels can be visualized with a modified gel imaging system (see Table of Materials). The resulting oligosaccharidemore » fingerprint can either be compared qualitatively or, with replication, quantitatively. Linkage and branching information can be established using additional glycosyl hydrolases (e.g., mannosidases and galactosidases). Whilst this protocol describes a method for analyzing glucomannan structure, it can be applied to any polysaccharide for which characterized glycosyl hydrolases exist. Alternatively, it can be used to characterize novel glycosyl hydrolases using defined polysaccharide substrates.« less

  8. Structural Characterization of Mannan Cell Wall Polysaccharides in Plants Using PACE.

    PubMed

    Pidatala, Venkataramana R; Mahboubi, Amir; Mortimer, Jenny C

    2017-10-16

    Plant cell wall polysaccharides are notoriously difficult to analyze, and most methods require expensive equipment, skilled operators, and large amounts of purified material. Here, we describe a simple method for gaining detailed polysaccharide structural information, including resolution of structural isomers. For polysaccharide analysis by gel electrophoresis (PACE), plant cell wall material is hydrolyzed with glycosyl hydrolases specific to the polysaccharide of interest (e.g., mannanases for mannan). Large format polyacrylamide gels are then used to separate the released oligosaccharides, which have been fluorescently labeled. Gels can be visualized with a modified gel imaging system (see Table of Materials). The resulting oligosaccharide fingerprint can either be compared qualitatively or, with replication, quantitatively. Linkage and branching information can be established using additional glycosyl hydrolases (e.g., mannosidases and galactosidases). Whilst this protocol describes a method for analyzing glucomannan structure, it can be applied to any polysaccharide for which characterized glycosyl hydrolases exist. Alternatively, it can be used to characterize novel glycosyl hydrolases using defined polysaccharide substrates.

  9. Structural Characterization of Mannan Cell Wall Polysaccharides in Plants Using PACE

    DOE PAGES

    Pidatala, Venkataramana R.; Mahboubi, Amir; Mortimer, Jenny C.

    2017-10-16

    Plant cell wall polysaccharides are notoriously difficult to analyze, and most methods require expensive equipment, skilled operators, and large amounts of purified material. Here, we describe a simple method for gaining detailed polysaccharide structural information, including resolution of structural isomers. For polysaccharide analysis by gel electrophoresis (PACE), plant cell wall material is hydrolyzed with glycosyl hydrolases specific to the polysaccharide of interest (e.g., mannanases for mannan). Large format polyacrylamide gels are then used to separate the released oligosaccharides, which have been fluorescently labeled. Gels can be visualized with a modified gel imaging system (see Table of Materials). The resulting oligosaccharidemore » fingerprint can either be compared qualitatively or, with replication, quantitatively. Linkage and branching information can be established using additional glycosyl hydrolases (e.g., mannosidases and galactosidases). Whilst this protocol describes a method for analyzing glucomannan structure, it can be applied to any polysaccharide for which characterized glycosyl hydrolases exist. Alternatively, it can be used to characterize novel glycosyl hydrolases using defined polysaccharide substrates.« less

  10. Large-Scale Compute-Intensive Analysis via a Combined In-situ and Co-scheduling Workflow Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Messer, Bronson; Sewell, Christopher; Heitmann, Katrin

    2015-01-01

    Large-scale simulations can produce tens of terabytes of data per analysis cycle, complicating and limiting the efficiency of workflows. Traditionally, outputs are stored on the file system and analyzed in post-processing. With the rapidly increasing size and complexity of simulations, this approach faces an uncertain future. Trending techniques consist of performing the analysis in situ, utilizing the same resources as the simulation, and/or off-loading subsets of the data to a compute-intensive analysis system. We introduce an analysis framework developed for HACC, a cosmological N-body code, that uses both in situ and co-scheduling approaches for handling Petabyte-size outputs. An initial in situ step is used to reduce the amount of data to be analyzed, and to separate out the data-intensive tasks handled off-line. The analysis routines are implemented using the PISTON/VTK-m framework, allowing a single implementation of an algorithm that simultaneously targets a variety of GPU, multi-core, and many-core architectures.

  11. ESTEEM: A Novel Framework for Qualitatively Evaluating and Visualizing Spatiotemporal Embeddings in Social Media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arendt, Dustin L.; Volkova, Svitlana

    Analyzing and visualizing large amounts of social media communications and contrasting short-term conversation changes over time and geo-locations is extremely important for commercial and government applications. Earlier approaches for large-scale text stream summarization used dynamic topic models and trending words. Instead, we rely on text embeddings – low-dimensional word representations in a continuous vector space where similar words are embedded nearby each other. This paper presents ESTEEM, a novel tool for visualizing and evaluating spatiotemporal embeddings learned from streaming social media texts. Our tool allows users to monitor and analyze query words and their closest neighbors with an interactive interface. We used state-of-the-art techniques to learn embeddings and developed a visualization to represent dynamically changing relations between words in social media over time and other dimensions. This is the first interactive visualization of streaming text representations learned from social media texts that also allows users to contrast differences across multiple dimensions of the data.
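
    The kind of query the tool visualizes can be sketched with plain cosine similarity over two time-slice embedding matrices (a hedged illustration with assumed data structures, not the ESTEEM implementation):

        # Given embeddings learned from two time slices, list a query word's nearest
        # neighbours in each slice and report which neighbours appeared or disappeared.
        import numpy as np

        def nearest(vocab, emb, word, k=5):
            """vocab: word -> row index; emb: (V, d) matrix of word vectors."""
            v = emb[vocab[word]]
            sims = emb @ v / (np.linalg.norm(emb, axis=1) * np.linalg.norm(v) + 1e-12)
            words = sorted(vocab, key=vocab.get)               # words ordered by row index
            order = np.argsort(-sims)
            return [words[i] for i in order if words[i] != word][:k]

        def neighbour_drift(vocab, emb_t1, emb_t2, word, k=5):
            before = set(nearest(vocab, emb_t1, word, k))
            after = set(nearest(vocab, emb_t2, word, k))
            return {"gained": after - before, "lost": before - after}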

  12. Automated Text Analysis Based on Skip-Gram Model for Food Evaluation in Predicting Consumer Acceptance

    PubMed Central

    Kim, Augustine Yongwhi; Choi, Hoduk

    2018-01-01

    The purpose of this paper is to evaluate food taste, smell, and characteristics from consumers' online reviews. Several studies in food sensory evaluation have been presented for consumer acceptance. However, these studies need a taste-descriptive word lexicon, and they are not suitable for analyzing large numbers of evaluators to predict consumer acceptance. In this paper, an automated text analysis method for food evaluation is presented to analyze and compare two recently introduced jjampong ramen types (mixed seafood noodles). To avoid building a sensory word lexicon, consumers' reviews are collected from SNS. Then, by training a word embedding model with the acquired reviews, words in the large amount of review text are converted into vectors. Based on these words represented as vectors, inference is performed to evaluate the taste and smell of the two jjampong ramen types. Finally, the reliability and merits of the proposed food evaluation method are confirmed by a comparison with the results from an actual consumer preference taste evaluation. PMID:29606960
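
    A minimal sketch of the embedding step, assuming gensim's skip-gram Word2Vec with gensim 4.x parameter names and a placeholder toy corpus (the authors' review data and exact settings are not reproduced):

        # Train a skip-gram model on tokenized reviews and query contextual neighbours.
        from gensim.models import Word2Vec

        reviews = [
            ["the", "broth", "was", "spicy", "and", "rich"],
            ["too", "salty", "but", "the", "noodles", "were", "chewy"],
            ["mild", "seafood", "flavor", "with", "a", "smoky", "smell"],
        ]

        model = Word2Vec(sentences=reviews, vector_size=50, window=3,
                         min_count=1, sg=1, epochs=50, seed=1)   # sg=1 selects skip-gram
        print(model.wv.most_similar("spicy", topn=3))            # words used in similar contexts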

  13. Automated Text Analysis Based on Skip-Gram Model for Food Evaluation in Predicting Consumer Acceptance.

    PubMed

    Kim, Augustine Yongwhi; Ha, Jin Gwan; Choi, Hoduk; Moon, Hyeonjoon

    2018-01-01

    The purpose of this paper is to evaluate food taste, smell, and characteristics from consumers' online reviews. Several studies in food sensory evaluation have been presented for consumer acceptance. However, these studies need a taste-descriptive word lexicon, and they are not suitable for analyzing large numbers of evaluators to predict consumer acceptance. In this paper, an automated text analysis method for food evaluation is presented to analyze and compare two recently introduced jjampong ramen types (mixed seafood noodles). To avoid building a sensory word lexicon, consumers' reviews are collected from SNS. Then, by training a word embedding model with the acquired reviews, words in the large amount of review text are converted into vectors. Based on these words represented as vectors, inference is performed to evaluate the taste and smell of the two jjampong ramen types. Finally, the reliability and merits of the proposed food evaluation method are confirmed by a comparison with the results from an actual consumer preference taste evaluation.

  14. Effects of Childhood and Middle-Adulthood Family Conditions on Later-Life Mortality: Evidence from the Utah Population Database, 1850-2002

    PubMed Central

    Mineau, Mineau P; Gilda, Garibotti; Kerber, Richard

    2014-01-01

    We examine how key early family circumstances affect mortality risks decades later. Early life conditions are measured by parental mortality, parental fertility (e.g., offspring sibship size, parental age at offspring birth), religious upbringing, and parental socioeconomic status. Prior to these early life conditions are familial and genetic factors that affect life-span. Accordingly, we consider the role of parental and familial longevity on adult mortality risks. We analyze the large Utah Population Database which contains a vast amount of genealogical and other vital/health data that contain full life histories of individuals and hundreds of their relatives. To control for unobserved heterogeneity, we analyze sib-pair data for 12,000 sib-pairs using frailty models. We found modest effects of key childhood conditions (birth order, sibship size, parental religiosity, parental SES, and parental death in childhood). Our measures of familial aggregation of longevity were large and suggest an alternative view of early life conditions. PMID:19278766

  15. Nitric oxide: a physiologic messenger.

    PubMed

    Lowenstein, C J; Dinerman, J L; Snyder, S H

    1994-02-01

    To review the physiologic role of nitric oxide, an unusual messenger molecule that mediates blood vessel relaxation, neurotransmission, and pathogen suppression. A MEDLINE search of articles published from 1987 to 1993 that addressed nitric oxide and the enzyme that synthesizes it, nitric oxide synthase. Animal and human studies were selected from 3044 articles to analyze the clinical importance of nitric oxide. Descriptions of the structure and function of nitric oxide synthase were selected to show how nitric oxide acts as a biological messenger molecule. Biochemical and physiologic studies were analyzed if the same results were found by three or more independent observers. Two major classes of nitric oxide synthase enzymes produce nitric oxide. The constitutive isoforms found in endothelial cells and neurons release small amounts of nitric oxide for brief periods to signal adjacent cells, whereas the inducible isoform found in macrophages releases large amounts of nitric oxide continuously to eliminate bacteria and parasites. By diffusing into adjacent cells and binding to enzymes that contain iron, nitric oxide plays many important physiologic roles. It regulates blood pressure, transmits signals between neurons, and suppresses pathogens. Excess amounts, however, can damage host cells, causing neurotoxicity during strokes and causing the hypotension associated with sepsis. Nitric oxide is a simple molecule with many physiologic roles in the cardiovascular, neurologic, and immune systems. Although the general principles of nitric oxide synthesis are known, further research is necessary to determine what role it plays in causing disease.

  16. Over 4,100 protein identifications from a Xenopus laevis fertilized egg digest using reversed-phase chromatographic prefractionation followed by capillary zone electrophoresis - electrospray ionization - tandem mass spectrometry analysis

    PubMed Central

    Yan, Xiaojing; Sun, Liangliang; Zhu, Guijie; Cox, Olivia F.; Dovichi, Norman J.

    2016-01-01

    A tryptic digest generated from Xenopus laevis fertilized embryos was fractionated by reversed phase liquid chromatography. One set of 30 fractions was analyzed by 100-min CZE-ESI-MS/MS separations (50 hr total instrument time), and a second set of 15 fractions was analyzed by 3-hr UPLC-ESI-MS/MS separations (45 hr total instrument time). CZE-MS/MS produced 70% as many protein IDs (4,134 vs. 5,787) and 60% as many peptide IDs (22,535 vs. 36,848) as UPLC-MS/MS with similar instrument time (50 h vs. 45 h) but with 50 times smaller total consumed sample amount (1.5 μg vs. 75 μg). Surprisingly, CZE generated peaks that were 25% more intense than UPLC for peptides that were identified by both techniques, despite the 50-fold lower loading amount; this high sensitivity reflects the efficient ionization produced by the electrokinetically-pumped nanospray interface used in CZE. This report is the first comparison of CZE-MS/MS and UPLC-MS/MS for large-scale eukaryotic proteomic analysis. The numbers of protein and peptide identifications produced by CZE-ESI-MS/MS approach those produced by UPLC-MS/MS, but with nearly two orders of magnitude lower sample amounts. PMID:27723263

  17. PyNeb: a new tool for analyzing emission lines. I. Code description and validation of results

    NASA Astrophysics Data System (ADS)

    Luridiana, V.; Morisset, C.; Shaw, R. A.

    2015-01-01

    Analysis of emission lines in gaseous nebulae yields direct measures of physical conditions and chemical abundances and is the cornerstone of nebular astrophysics. Although the physical problem is conceptually simple, its practical complexity can be overwhelming since the amount of data to be analyzed steadily increases; furthermore, results depend crucially on the input atomic data, whose determination also improves each year. To address these challenges we created PyNeb, an innovative code for analyzing emission lines. PyNeb computes physical conditions and ionic and elemental abundances and produces both theoretical and observational diagnostic plots. It is designed to be portable, modular, and largely customizable in aspects such as the atomic data used, the format of the observational data to be analyzed, and the graphical output. It gives full access to the intermediate quantities of the calculation, making it possible to write scripts tailored to the specific type of analysis one wants to carry out. In the case of collisionally excited lines, PyNeb works by solving the equilibrium equations for an n-level atom; in the case of recombination lines, it works by interpolation in emissivity tables. The code offers a choice of extinction laws and ionization correction factors, which can be complemented by user-provided recipes. It is entirely written in the python programming language and uses standard python libraries. It is fully vectorized, making it apt for analyzing huge amounts of data. The code is stable and has been benchmarked against IRAF/NEBULAR. It is public, fully documented, and has already been satisfactorily used in a number of published papers.
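
    For collisionally excited lines, the core computation described above is the solution of the statistical-equilibrium equations. A generic toy version of that step (illustrative only, with made-up rates; it is not PyNeb's internal code or API) is:

        # Solve equilibrium populations of a 3-level ion from a matrix of total
        # transition rates (collisional + radiative); the populations can then be
        # combined with transition probabilities to form line emissivities.
        import numpy as np

        def level_populations(rates):
            """rates[i, j]: total transition rate from level i to level j (s^-1)."""
            n = rates.shape[0]
            A = rates.T - np.diag(rates.sum(axis=1))  # dN_j/dt = sum_i N_i R_ij - N_j sum_k R_jk
            A[-1, :] = 1.0                            # replace one equation by normalisation
            b = np.zeros(n)
            b[-1] = 1.0                               # populations sum to one
            return np.linalg.solve(A, b)

        rates = np.array([[0.0,  1e-2, 1e-4],         # illustrative excitation,
                          [2e-2, 0.0,  3e-3],         # de-excitation, and decay rates
                          [5e-3, 1e-1, 0.0]])
        pops = level_populations(rates)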

  18. Semantic orchestration of image processing services for environmental analysis

    NASA Astrophysics Data System (ADS)

    Ranisavljević, Élisabeth; Devin, Florent; Laffly, Dominique; Le Nir, Yannick

    2013-09-01

    In order to analyze environmental dynamics, a major step is the classification of the different phenomena at a site (e.g., ice and snow for a glacier). When in situ pictures are used, this classification requires data pre-processing, and not all pictures need the same sequence of processes, depending on the disturbances present. Until now, these sequences have been assembled manually, which restricts the processing of large amounts of data. In this paper, we present a semantic orchestration that automates the sequencing for the analysis. It combines two advantages: it solves the problem of the volume of processing, and it diversifies the possibilities in the data processing. We define a BPEL description to express the sequences. This BPEL uses web services to run the data processing, and each web service is semantically annotated using an ontology of image processing. The dynamic modification of the BPEL is done using SPARQL queries over these annotated web services. The results obtained by a prototype implementing this method validate the construction of the different workflows, which can be applied to a large number of pictures.

  19. Program Analyzes Radar Altimeter Data

    NASA Technical Reports Server (NTRS)

    Vandemark, Doug; Hancock, David; Tran, Ngan

    2004-01-01

    A computer program has been written to perform several analyses of radar altimeter data. The program was designed to improve on previous methods of analysis of altimeter engineering data by (1) facilitating and accelerating the analysis of large amounts of data in a more direct manner and (2) improving the ability to estimate performance of radar-altimeter instrumentation and provide data corrections. The data in question are openly available to the international scientific community and can be downloaded from anonymous file-transfer-protocol (FTP) locations that are accessible via links from altimetry Web sites. The software estimates noise in range measurements, estimates corrections for electromagnetic bias, and performs statistical analyses on various parameters for comparison of different altimeters. Whereas prior techniques used to perform similar analyses of altimeter range noise require comparison of data from repetitions of satellite ground tracks, the present software uses a high-pass filtering technique to obtain similar results from single satellite passes. Elimination of the requirement for repeat-track analysis facilitates the analysis of large amounts of satellite data to assess subtle variations in range noise.
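
    The single-pass noise estimate described above amounts to removing the slowly varying geophysical signal with a high-pass filter and taking the statistics of what remains. A minimal sketch of that idea follows; the sampling rate, cutoff frequency, and filter order are illustrative assumptions, not the program's actual settings.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def single_pass_range_noise(range_m, fs_hz=20.0, cutoff_hz=1.0, order=4):
        """Estimate altimeter range noise from one pass by high-pass filtering.

        range_m   : 1-D array of along-track range measurements (m)
        fs_hz     : measurement rate (Hz); 20 Hz is a typical altimeter rate (assumed here)
        cutoff_hz : cutoff separating geophysical signal (low f) from instrument noise (high f)
        Returns the standard deviation of the high-passed residual (m).
        """
        b, a = butter(order, cutoff_hz / (fs_hz / 2.0), btype="highpass")
        residual = filtfilt(b, a, range_m)   # zero-phase filtering keeps along-track alignment
        return float(np.std(residual))
    ```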

  20. Effects of Cooling Conditions on Tensile and Charpy Impact Properties of API X80 Linepipe Steels

    NASA Astrophysics Data System (ADS)

    Han, Seung Youb; Shin, Sang Yong; Lee, Sunghak; Kim, Nack J.; Bae, Jin-Ho; Kim, Kisoo

    2010-02-01

    In this study, four API X80 linepipe steel specimens were fabricated by varying the cooling rate and finish cooling temperature, and their microstructures and crystallographic orientations were analyzed to investigate the effects of the cooling conditions on the tensile and Charpy impact properties. All the specimens consisted of acicular ferrite (AF), granular bainite (GB), and martensite-austenite (MA) constituents. The volume fraction of MA increased with an increasing cooling rate, and the volume fraction and size of MA tended to decrease with an increasing finish cooling temperature. According to the crystallographic orientation analysis data, the effective grain size and unit crack path decreased as fine AFs having a large fraction of high-angle grain boundaries were homogeneously formed, thereby improving the Charpy impact properties. The specimen fabricated with the higher cooling rate and lower finish cooling temperature had the highest upper-shelf energy (USE) and the lowest energy transition temperature (ETT), because it contained a large amount of MA homogeneously distributed inside fine AFs, while its tensile properties remained excellent.

  1. Advantages of Parallel Processing and the Effects of Communications Time

    NASA Technical Reports Server (NTRS)

    Eddy, Wesley M.; Allman, Mark

    2000-01-01

    Many computing tasks involve heavy mathematical calculations, or analyzing large amounts of data. These operations can take a long time to complete using only one computer. Networks such as the Internet provide many computers with the ability to communicate with each other. Parallel or distributed computing takes advantage of these networked computers by arranging them to work together on a problem, thereby reducing the time needed to obtain the solution. The drawback to using a network of computers to solve a problem is the time wasted in communicating between the various hosts. The application of distributed computing techniques in a space environment or over a satellite network would therefore be limited by the amount of time needed to send data across the network, which would typically take much longer than on a terrestrial network. This experiment shows how much faster a large job can be performed by adding more computers to the task, what role communications time plays in the total execution time, and the impact a long-delay network has on a distributed computing system.
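
    The trade-off the experiment measures can be captured with a simple cost model: total time is the per-host share of the computation plus the time spent communicating, so adding hosts stops paying off once communication dominates. A rough sketch under an assumed linear communication model (the numbers are placeholders for illustration):

    ```python
    def distributed_time(work_s, n_hosts, comm_per_host_s):
        """Idealized run time: computation divides across hosts, communication does not."""
        return work_s / n_hosts + comm_per_host_s * n_hosts

    def speedup(work_s, n_hosts, comm_per_host_s):
        return work_s / distributed_time(work_s, n_hosts, comm_per_host_s)

    # Example: a 1000 s job with 0.5 s of communication overhead per host (terrestrial link)
    # versus 5 s per host (long-delay satellite link).
    for comm in (0.5, 5.0):
        best = max(range(1, 65), key=lambda n: speedup(1000.0, n, comm))
        print(comm, best, round(speedup(1000.0, best, comm), 1))
    ```

    With the larger per-host communication cost, the optimal number of hosts and the achievable speedup both drop sharply, which is the behavior the experiment set out to quantify.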

  2. Examining the Environmental Effects of Athletic Training: Perceptions of Waste and the Use of Green Techniques.

    PubMed

    Potteiger, Kelly; Pitney, William A; Cappaert, Thomas A; Wolfe, Angela

    2017-12-01

      Environmental sustainability is a critical concern in health care. Similar to other professions, the practice of athletic training necessitates the use of a large quantity of natural and manufactured resources.   To examine the perceptions of the waste produced by the practice of athletic training and the green practices currently used by athletic trainers (ATs) to combat this waste.   Mixed-methods study.   Field setting.   A total of 442 ATs completed the study. Sixteen individuals participated in the qualitative portion.   Data from sections 2 and 3 of the Athletic Training Environmental Impact Survey were analyzed. Focus groups and individual interviews were used to determine participants' views of waste and the efforts used to combat waste. Descriptive statistics were used to examine types of waste. Independent t tests, χ 2 tests, and 1-way analyses of variance were calculated to identify any differences between the knowledge and use of green techniques. Interviews and focus groups were transcribed verbatim and analyzed inductively.   Participants reported moderate knowledge of green techniques (3.18 ± 0.53 on a 5-point Likert scale). Fifty-eight percent (n = 260) of survey participants perceived that a substantial amount of waste was produced by the practice of athletic training. Ninety-two percent (n = 408) admitted they thought about the waste produced in their daily practice. The types of waste reported most frequently were plastics (n = 111, 29%), water (n = 88, 23%), and paper for administrative use (n = 81, 21%). Fifty-two percent (n = 234) agreed this waste directly affected the environment. The qualitative aspect of the study reinforced recognition of the large amount of waste produced by the practice of athletic training. Types of conservation practices used by ATs were also explored.   Participants reported concern regarding the waste produced by athletic training. The amount of waste varies depending on practice size and setting. Future researchers should use direct measures to determine the amount of waste created by the practice of athletic training.

  3. Shift work-related problems in 16-h night shift nurses (1): Development of an automated data processing system for questionnaires, heart rate, physical activity and posture.

    PubMed

    Fukuda, H; Takahashi, M; Miki, K; Haratani, T; Kurabayashi, L; Hisanaga, N; Arito, H; Takahashi, H; Egoshi, M; Sakurai, M

    1999-04-01

    To assess the shift work-related problems associated with a 16-h night shift in a two-shift system, we took the following important factors into consideration: the interaction between circadian rhythms and the longer night shift, the type of morningness and eveningness experienced, the subjective sleep feeling, the subjects' daily behavior, the effectiveness of taking a nap during the long night shift, and finally the effectiveness of using several different kinds of measuring devices. Included among the measuring devices used were a standard questionnaire, repetitive self-assessment of subjective symptoms and daily behavior at short intervals, and a continuous recording of such objective indices as physical activity and heart rate. A potential problem lies in the fact that field studies that use such measures tend to produce a mass of data, and are thus faced with the accompanying technical problem of analyzing such a large amount of data (time, effort, and cost). To solve the data analysis problem, we developed an automated data processing system. Through the use of an image scanner with a paper feeder, standard paper, an optical character recognition function and common application software, we were able to analyze a mass of data continuously and automatically within a short time. Our system should prove useful for field studies that produce a large amount of data collected with several different kinds of measuring devices.

  4. NASA/MSFC FY92 Earth Science and Applications Program Research Review

    NASA Technical Reports Server (NTRS)

    Arnold, James E. (Editor); Leslie, Fred W. (Editor)

    1993-01-01

    A large amount of attention has recently been given to global issues such as the ozone hole, tropospheric temperature variability, etc. A scientific challenge is to better understand atmospheric processes on a variety of spatial and temporal scales in order to predict environmental changes. Measurement of geophysical parameters such as wind, temperature, and moisture are needed to validate theories, provide analyzed data sets, and initialize or constrain numerical models. One of NASA's initiatives is the Mission to Planet Earth Program comprised of an Earth Observation System (EOS) and the scientific strategy to analyze these data. This work describes these efforts in the context of satellite data analysis and fundamental studies of atmospheric dynamics which examine selected processes important to the global circulation.

  5. A Brief Review of RNA–Protein Interaction Database Resources

    PubMed Central

    Yi, Ying; Zhao, Yue; Huang, Yan; Wang, Dong

    2017-01-01

    RNA–Protein interactions play critical roles in various biological processes. By collecting and analyzing the RNA–Protein interactions and binding sites from experiments and predictions, RNA–Protein interaction databases have become an essential resource for the exploration of the transcriptional and post-transcriptional regulatory network. Here, we briefly review several widely used RNA–Protein interaction database resources developed in recent years to provide a guide to these databases. The content and major functions of each database are presented. This brief description helps users quickly choose the database containing the information they are interested in. In short, these RNA–Protein interaction database resources are continually updated, and their current state reflects the ongoing effort to identify and analyze the large number of RNA–Protein interactions. PMID:29657278

  6. Effect of negative bias on TiAlSiN coating deposited on nitrided Zircaloy-4

    NASA Astrophysics Data System (ADS)

    Jun, Zhou; Zhendong, Feng; Xiangfang, Fan; Yanhong, Liu; Huanlin, Li

    2018-01-01

    TiAlSiN coatings were deposited on nitrided Zircaloy-4 by multi-arc ion plating at -100 V, -200 V and -300 V. In this study, the high temperature oxidation behavior of the coatings was tested in a box-type resistance furnace in air for 3 h at 800 °C; the macro-morphology of the coatings was observed and analyzed with a zoom-stereo microscope; the micro-morphology was analyzed by scanning electron microscopy (SEM), and the chemical elements of the samples were analyzed by energy dispersive spectroscopy (EDS); the adhesion strength of the coating to the substrate was measured with an automatic scratch tester; and the phases of the coatings were analyzed by X-ray diffractometry (XRD). Results show that the coating deposited at -100 V exhibits the best high temperature oxidation resistance, the highest Al content, and the highest adhesion strength to the substrate (33 N). As the bias increases, the high temperature oxidation resistance of the coating first weakens and then improves, the amount of large particles on the coating surface first increases and then decreases whereas the density of the coating first decreases and then increases, and the adhesion strength of the coating to the substrate first increases and then weakens. The coating quality is relatively poor when the bias is -200 V.

  7. Analyses of infrequent (quasi-decadal) large groundwater recharge events in the northern Great Basin: Their importance for groundwater availability, use, and management

    USGS Publications Warehouse

    Masbruch, Melissa D.; Rumsey, Christine; Gangopadhyay, Subhrendu; Susong, David D.; Pruitt, Tom

    2016-01-01

    There has been a considerable amount of research linking climatic variability to hydrologic responses in the western United States. Although much effort has been spent to assess and predict changes in surface water resources, little has been done to understand how climatic events and changes affect groundwater resources. This study focuses on characterizing and quantifying the effects of large, multiyear, quasi-decadal groundwater recharge events in the northern Utah portion of the Great Basin for the period 1960–2013. Annual groundwater level data were analyzed with climatic data to characterize climatic conditions and frequency of these large recharge events. Using observed water-level changes and multivariate analysis, five large groundwater recharge events were identified with a frequency of about 11–13 years. These events were generally characterized as having above-average annual precipitation and snow water equivalent and below-average seasonal temperatures, especially during the spring (April through June). Existing groundwater flow models for several basins within the study area were used to quantify changes in groundwater storage from these events. Simulated groundwater storage increases per basin from a single recharge event ranged from about 115 to 205 Mm3. Extrapolating these amounts over the entire northern Great Basin indicates that a single large quasi-decadal recharge event could result in billions of cubic meters of groundwater storage. Understanding the role of these large quasi-decadal recharge events in replenishing aquifers and sustaining water supplies is crucial for long-term groundwater management.

  8. Pilot-Induced Oscillation Prediction With Three Levels of Simulation Motion Displacement

    NASA Technical Reports Server (NTRS)

    Schroeder, Jeffery A.; Chung, William W. Y.; Tran, Duc T.; Laforce, Soren; Bengford, Norman J.

    2001-01-01

    Simulator motion platform characteristics were examined to determine if the amount of motion affects pilot-induced oscillation (PIO) prediction. Five test pilots evaluated how susceptible 18 different sets of pitch dynamics were to PIOs with three different levels of simulation motion platform displacement: large, small, and none. The pitch dynamics were those of a previous in-flight experiment, some of which elicited PIOs. These in-flight results served as truth data for the simulation. As such, the in-flight experiment was replicated as much as possible. Objective and subjective data were collected and analyzed. With large motion, PIO and handling qualities ratings matched the flight data more closely than did small motion or no motion. Also, regardless of the aircraft dynamics, large motion increased pilot confidence in assigning handling qualities ratings, reduced safety pilot trips, and lowered touchdown velocities. While both large and small motion provided a pitch rate cue of high fidelity, only large motion presented the pilot with a high fidelity vertical acceleration cue.

  9. PBxplore: a tool to analyze local protein structure and deformability with Protein Blocks

    PubMed Central

    Craveur, Pierrick; Joseph, Agnel Praveen; Jallu, Vincent

    2017-01-01

    This paper describes the development and application of a suite of tools, called PBxplore, to analyze the dynamics and deformability of protein structures using Protein Blocks (PBs). Proteins are highly dynamic macromolecules, and a classical way to analyze their inherent flexibility is to perform molecular dynamics simulations. The advantage of using small structural prototypes such as PBs is to give a good approximation of the local structure of the protein backbone. More importantly, by reducing the conformational complexity of protein structures, PBs allow analysis of local protein deformability which cannot be done with other methods and had been used efficiently in different applications. PBxplore is able to process large amounts of data such as those produced by molecular dynamics simulations. It produces frequencies, entropy and information logo outputs as text and graphics. PBxplore is available at https://github.com/pierrepo/PBxplore and is released under the open-source MIT license. PMID:29177113
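
    The per-position outputs described above (frequencies and entropy over the 16 Protein Blocks) follow directly from counting PB assignments across trajectory frames. The following is a schematic stand-alone calculation, not PBxplore's own API; the function and variable names are assumptions made for the sketch.

    ```python
    import numpy as np
    from collections import Counter

    PB_ALPHABET = "abcdefghijklmnop"  # the 16 Protein Blocks

    def pb_frequencies_and_entropy(pb_sequences):
        """pb_sequences: list of equal-length PB strings, one per trajectory frame.

        Returns (freq, entropy): freq is an (n_positions, 16) array of relative
        frequencies; entropy is the per-position Shannon entropy in bits
        (low = locally rigid backbone, high = locally deformable).
        """
        n_pos = len(pb_sequences[0])
        freq = np.zeros((n_pos, len(PB_ALPHABET)))
        for pos in range(n_pos):
            counts = Counter(seq[pos] for seq in pb_sequences)
            for j, pb in enumerate(PB_ALPHABET):
                freq[pos, j] = counts.get(pb, 0) / len(pb_sequences)
        p = np.where(freq > 0, freq, 1.0)            # avoid log(0) for absent blocks
        entropy = -(freq * np.log2(p)).sum(axis=1)
        return freq, entropy
    ```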

  10. PATHA: Performance Analysis Tool for HPC Applications

    DOE PAGES

    Yoo, Wucherl; Koo, Michelle; Cao, Yi; ...

    2016-02-18

    Large science projects rely on complex workflows to analyze terabytes or petabytes of data. These jobs are often running over thousands of CPU cores and simultaneously performing data accesses, data movements, and computation. It is difficult to identify bottlenecks or to debug the performance issues in these large workflows. In order to address these challenges, we have developed Performance Analysis Tool for HPC Applications (PATHA) using state-of-the-art open source big data processing tools. Our framework can ingest system logs to extract key performance measures, and apply the most sophisticated statistical tools and data mining methods on the performance data. Furthermore, it utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of PATHA, we conduct a case study on the workflows from an astronomy project known as the Palomar Transient Factory (PTF). This study processed 1.6 TB of system logs collected on the NERSC supercomputer Edison. Using PATHA, we were able to identify performance bottlenecks, which reside in three tasks of the PTF workflow with a dependency on the density of celestial objects.

  11. Influence of overconsolidated condition on permeability evolution in silica sand

    NASA Astrophysics Data System (ADS)

    Kimura, S.; Kaneko, H.; Ito, T.; Nishimura, O.; Minagawa, H.

    2013-12-01

    The permeability of sediments is an important factor for production of natural gas from natural gas hydrate-bearing layers. Methane hydrate is regarded as one of the potential resources of natural gas. Coring and logging results indicate the existence of a large amount of methane hydrate in the Nankai Trough, offshore central Japan, where many folds and faults have been observed. In the present study, we investigate the permeability of silica sand specimens forming an artificial fault zone after large-displacement shear in the ring-shear test under normally consolidated and overconsolidated conditions. No significant influence of the overconsolidation ratio (OCR) on permeability evolution is found. The permeability reduction is strongly influenced by the magnitude of normal stress during large-displacement shearing. The grain size distribution and structure in the shear zone of the specimen after shearing at each normal stress level are analyzed by a laser-scattering particle analyzer and a scanning electron microscope, respectively. The results indicate that grain size and porosity reduction due to particle crushing are factors in the permeability reduction. This study is financially supported by METI and the Research Consortium for Methane Hydrate Resources in Japan (the MH21 Research Consortium).

  12. Desiderata for Healthcare Integrated Data Repositories Based on Architectural Comparison of Three Public Repositories

    PubMed Central

    Huser, Vojtech; Cimino, James J.

    2013-01-01

    Integrated data repositories (IDRs) are indispensable tools for numerous biomedical research studies. We compare three large IDRs (Informatics for Integrating Biology and the Bedside (i2b2), HMO Research Network’s Virtual Data Warehouse (VDW) and Observational Medical Outcomes Partnership (OMOP) repository) in order to identify common architectural features that enable efficient storage and organization of large amounts of clinical data. We define three high-level classes of underlying data storage models and we analyze each repository using this classification. We look at how a set of sample facts is represented in each repository and conclude with a list of desiderata for IDRs that deal with the information storage model, terminology model, data integration and value-sets management. PMID:24551366

  13. Desiderata for healthcare integrated data repositories based on architectural comparison of three public repositories.

    PubMed

    Huser, Vojtech; Cimino, James J

    2013-01-01

    Integrated data repositories (IDRs) are indispensable tools for numerous biomedical research studies. We compare three large IDRs (Informatics for Integrating Biology and the Bedside (i2b2), HMO Research Network's Virtual Data Warehouse (VDW) and Observational Medical Outcomes Partnership (OMOP) repository) in order to identify common architectural features that enable efficient storage and organization of large amounts of clinical data. We define three high-level classes of underlying data storage models and we analyze each repository using this classification. We look at how a set of sample facts is represented in each repository and conclude with a list of desiderata for IDRs that deal with the information storage model, terminology model, data integration and value-sets management.

  14. Chemical characterization of seven Large Area Collector particles by SXRF. [cosmic dust composition

    NASA Technical Reports Server (NTRS)

    Flynn, G. J.; Sutton, S. R.

    1991-01-01

    Optical microscopy and synchrotron X-ray fluorescence (SXRF) are used to analyze the chemical composition of seven dark-appearing cosmic-dust particles obtained in the stratosphere during NASA Johnson Large Area Collector flights. The experimental setup and procedures are outlined, and the results are presented in extensive tables. Three of the particles had abundances similar to those of chondrites (except for low Ca values in one particle); two had a metallic appearance and spectra dominated by Fe and Zn; one contained Cu and Cr plus small amounts of Fe and Zn; and one had igneous-type abundances of minor and trace elements while containing all of the elements seen in chondritic particles, suggesting it may be of extraterrestrial origin.

  15. Comparing Cumberland With Other Samples Analyzed by Curiosity

    NASA Image and Video Library

    2014-12-16

    This graphic offers comparisons between the amount of an organic chemical named chlorobenzene detected in the Cumberland rock sample and amounts of it in samples from three other Martian surface targets analyzed by NASA Curiosity Mars rover.

  16. Examination of snowmelt over Western Himalayas using remote sensing data

    NASA Astrophysics Data System (ADS)

    Tiwari, Sarita; Kar, Sarat C.; Bhatla, R.

    2016-07-01

    Snowmelt variability in the Western Himalayas has been examined using remotely sensed snow water equivalent (SWE) and snow-covered area (SCA) datasets. Climatological snowfall and snowmelt amounts vary across the Himalayan region from west to east and from month to month. Maximum snowmelt occurs in the elevation zone between 4500 and 5000 m. As spring and summer approach and snowmelt begins, a large amount of snow melts in May. Strengths and weaknesses of temperature-based snowmelt models have been analyzed for this region by computing the snowmelt factor, or degree-day factor (DDF). The average DDF in the Himalayas is higher in April and lower in July. During the spring and summer months, the melting rate is higher in areas above 2500 m elevation. The region between the 4500 and 5000 m elevation zones contributes the most snowmelt, with a higher melting rate. Snowmelt models have been developed to estimate interannual variations of monthly snowmelt amount using the DDF, observed SWE, and surface air temperature from reanalysis datasets. To further improve the snowmelt estimate, regression between observed and modeled snowmelt has been carried out and revised DDF values have been computed. Both models fail to capture the interannual variability of snowmelt in April. The skill of the model is moderate in May and June, and relatively better in July. To explain this skill, the interannual variability (IAV) of surface air temperature has been examined. Compared to July, the IAV of temperature in April is large, indicating that a climatological value of the DDF is not sufficient to explain the snowmelt rate in April. Snow area and snow amount depletion curves over the Himalayas indicate that in a small area at high altitude, snow is still observed with large SWE, whereas over most of the region all the snow has melted.
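
    The temperature-index (degree-day) models evaluated above relate daily melt to the excess of air temperature over a threshold through the degree-day factor. A minimal sketch of that relation is given below; the threshold temperature and units are illustrative assumptions, not the values used in the study.

    ```python
    import numpy as np

    def degree_day_melt(temp_c, ddf_mm_per_degc_day, t_base_c=0.0, swe_mm=None):
        """Daily snowmelt from a temperature-index (degree-day) model.

        temp_c              : array of daily mean air temperatures (deg C)
        ddf_mm_per_degc_day : degree-day factor (mm water equivalent per deg C per day)
        t_base_c            : threshold temperature below which no melt occurs
        swe_mm              : optional available snow water equivalent; melt cannot exceed it
        """
        melt = ddf_mm_per_degc_day * np.maximum(temp_c - t_base_c, 0.0)
        if swe_mm is not None:
            melt = np.minimum(melt, swe_mm)   # cannot melt more snow than is present
        return melt
    ```

    Revising the DDF by regressing modeled against observed melt, as done in the study, amounts to rescaling the factor in this relation month by month.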

  17. Analysis of the Growth Process of Neural Cells in Culture Environment Using Image Processing Techniques

    NASA Astrophysics Data System (ADS)

    Mirsafianf, Atefeh S.; Isfahani, Shirin N.; Kasaei, Shohreh; Mobasheri, Hamid

    Here we present an approach for processing neural cell images to analyze their growth process in a culture environment. We have applied several image processing techniques for: 1- environmental noise reduction, 2- neural cell segmentation, 3- neural cell classification based on their dendrites' growth conditions, and 4- neuron feature extraction and measurement (e.g., cell body area, number of dendrites, axon length, and so on). Due to the large amount of noise in the images, we have used feed-forward artificial neural networks to detect edges more precisely.

  18. The NIH Roadmap Epigenomics Program data resource

    PubMed Central

    Chadwick, Lisa Helbling

    2012-01-01

    The NIH Roadmap Reference Epigenome Mapping Consortium is developing a community resource of genome-wide epigenetic maps in a broad range of human primary cells and tissues. There are large amounts of data already available, and a number of different options for viewing and analyzing the data. This report will describe key features of the websites where users will find data, protocols and analysis tools developed by the consortium, and provide a perspective on how this unique resource will facilitate and inform human disease research, both immediately and in the future. PMID:22690667

  19. The NIH Roadmap Epigenomics Program data resource.

    PubMed

    Chadwick, Lisa Helbling

    2012-06-01

    The NIH Roadmap Reference Epigenome Mapping Consortium is developing a community resource of genome-wide epigenetic maps in a broad range of human primary cells and tissues. There are large amounts of data already available, and a number of different options for viewing and analyzing the data. This report will describe key features of the websites where users will find data, protocols and analysis tools developed by the consortium, and provide a perspective on how this unique resource will facilitate and inform human disease research, both immediately and in the future.

  20. Analysis of a microstrip reflectarray antenna for microspacecraft applications

    NASA Technical Reports Server (NTRS)

    Huang, J.

    1995-01-01

    A microstrip reflectarray is a flat reflector antenna that can be mounted conformally onto a spacecraft's outside structure without consuming a significant amount of spacecraft volume and mass. For large apertures (2 m or larger), the antenna's reflecting surface, being flat, can be more easily and reliably deployed than a curved parabolic reflector. This article presents the study results on a microstrip reflect-array with circular polarization. Its efficiency and bandwidth characteristics are analyzed. Numerous advantages of this antenna system are discussed. Three new concepts using this microstrip reflectarray are also proposed.

  1. Cosmological cosmic strings

    NASA Technical Reports Server (NTRS)

    Gregory, Ruth

    1988-01-01

    The effect of an infinite cosmic string on a cosmological background is investigated. It is found that the metric is approximately a scaled version of the empty space string metric, i.e., conical in nature. Results are used to place bounds on the amount of cylindrical gravitational radiation currently emitted by such a string. The gravitational radiation equations are then analyzed explicitly and it is shown that even initially large disturbances are rapidly damped as the expansion proceeds. The implications of the gravitational radiation background and the limitations of the quadrupole formula are discussed.

  2. Gaining Insights on Nasopharyngeal Carcinoma Treatment Outcome Using Clinical Data Mining Techniques.

    PubMed

    Ghaibeh, A Ammar; Kasem, Asem; Ng, Xun Jin; Nair, Hema Latha Krishna; Hirose, Jun; Thiruchelvam, Vinesh

    2018-01-01

    The analysis of Electronic Health Records (EHRs) is attracting a lot of research attention in the medical informatics domain. Hospitals and medical institutes started to use data mining techniques to gain new insights from the massive amounts of data that can be made available through EHRs. Researchers in the medical field have often used descriptive statistics and classical statistical methods to prove assumed medical hypotheses. However, discovering new insights from large amounts of data solely based on experts' observations is difficult. Using data mining techniques and visualizations, practitioners can find hidden knowledge, identify interesting patterns, or formulate new hypotheses to be further investigated. This paper describes a work in progress on using data mining methods to analyze clinical data of Nasopharyngeal Carcinoma (NPC) cancer patients. NPC is the fifth most common cancer among Malaysians, and the data analyzed in this study was collected from three states in Malaysia (Kuala Lumpur, Sabah and Sarawak), and is considered to be the largest up-to-date dataset of its kind. This research is addressing the issue of cancer recurrence after the completion of radiotherapy and chemotherapy treatment. We describe the procedure, problems, and insights gained during the process.

  3. The impact of radioactive steel recycling on the public and professionals.

    PubMed

    Hrncir, Tomas; Panik, Michal; Ondra, Frantisek; Necas, Vladimir

    2013-06-15

    The decommissioning of nuclear power plants represents a complex process resulting in the generation of large amounts of waste materials, e.g. steel scrap containing various concentrations of radionuclides. Recycling some of these materials is highly desirable due to numerous reasons. Herein presented scenarios of recycling of radioactive steel within the nuclear as well as civil engineering industry are analyzed from the radiation protection point of view. An approach based on the dose constraints principle is chosen. The aim of the study is to derive conditional clearance levels (maximal specific mass activity of material allowing its recycling/clearance) for analyzed radionuclides ensuring that the detrimental impact on human health is kept on a negligible level. Determined conditional clearance levels, as the result of performed software calculations, are valid for the reuse of radioactive steel in four selected scenarios. Calculation results indicate that the increase of the amount of recyclable radioactive steel due to its reuse in specific applications may be feasible considering the radiation impact on the public and professionals. However, issues connected with public acceptance, technical difficulties and financing of potential realization are still open and they have to be examined in more detail. Copyright © 2013 Elsevier B.V. All rights reserved.

  4. An exploratory investigation of polar organic compounds in waters from a lead–zinc mine and mill complex

    USGS Publications Warehouse

    Rostad, Colleen E.; Schmitt, Christopher J.; Schumacher, John G.; Leiker, Thomas J.

    2011-01-01

    Surface water samples were collected in 2006 from a lead mine-mill complex in Missouri to investigate possible organic compounds coming from the milling process. Water samples contained relatively high concentrations of dissolved organic carbon (DOC; greater than 20 mg/l) for surface waters but were colorless, implying a lack of naturally occurring aquatic humic or fulvic acids. Samples were extracted by three different types of solid-phase extraction and analyzed by electrospray ionization/mass spectrometry. Because large amounts of xanthate complexation reagents are used in the milling process, techniques were developed to extract and analyze for sodium isopropyl xanthate and sodium ethyl xanthate. Although these xanthate reagents were not found, trace amounts of the degradates, isopropyl xanthyl thiosulfonate and isopropyl xanthyl sulfonate, were found in most locations sampled, including the tailings pond downstream. Dioctyl sulfosuccinate, a surfactant and process filtering aid, was found at concentrations estimated at 350 μg/l at one mill outlet, but not downstream. Release of these organic compounds downstream from lead-zinc mine and milling areas has not previously been reported. A majority of the DOC remains unidentified.

  5. Impact of Medicare on the Use of Medical Services by Disabled Beneficiaries, 1972-1974

    PubMed Central

    Deacon, Ronald W.

    1979-01-01

    The extension of Medicare coverage in 1973 to disabled persons receiving cash benefits under the Social Security Act provided an opportunity to examine the impact of health insurance coverage on utilization and expenses for Part B services. Data on medical services used both before and after coverage, collected through the Current Medicare Survey, were analyzed. Results indicate that access to care (as measured by the number of persons using services) increased slightly, while the rate of use did not. The large increase in the number of persons eligible for Medicare reflected the large increase in the number of cash beneficiaries. Significant increases also were found in the amount charged for medical services. The absence of large increases in access and service use may be attributed, in part, to the already existing source of third party payment available to disabled cash beneficiaries in 1972, before Medicare coverage. PMID:10316939

  6. Runaway electron dynamics in tokamak plasmas with high impurity content

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martín-Solís, J. R., E-mail: solis@fis.uc3m.es; Loarte, A.; Lehnen, M.

    2015-09-15

    The dynamics of high energy runaway electrons is analyzed for plasmas with high impurity content. It is shown that modified collision terms are required in order to account for the collisions of the relativistic runaway electrons with partially stripped impurity ions, including the effect of the collisions with free and bound electrons, as well as the scattering by the full nuclear and the electron-shielded ion charge. The effect of the impurities on the avalanche runaway growth rate is discussed. The results are applied, for illustration, to the interpretation of the runaway electron behavior during disruptions, where large amounts of impurities are expected, particularly during disruption mitigation by massive gas injection. The consequences for the electron synchrotron radiation losses and the resulting runaway electron dynamics are also analyzed.

  7. Distributed and parallel approach for handle and perform huge datasets

    NASA Astrophysics Data System (ADS)

    Konopko, Joanna

    2015-12-01

    Big Data refers to the dynamic, large, and disparate volumes of data that come from many different sources (tools, machines, sensors, mobile devices) and are uncorrelated with each other. It requires new, innovative, and scalable technology to collect, host, and analytically process this vast amount of data. A proper architecture for a system that processes huge data sets is needed. In this paper, distributed and parallel system architectures are compared using the example of the MapReduce (MR) Hadoop platform and a parallel database platform (DBMS). The paper also analyzes the problem of extracting and handling valuable information from petabytes of data. Both paradigms, MapReduce and parallel DBMS, are described and compared. A hybrid architecture approach is also proposed that could be used to solve the analyzed problem of storing and processing Big Data.
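
    The MapReduce side of the comparison can be illustrated with the canonical pattern the paradigm implements: a map phase that emits key-value pairs, a shuffle that groups them by key, and a reduce phase that aggregates each group. The following is a toy, single-machine sketch of those three steps (Hadoop distributes exactly this pattern across nodes); the word-count example data is a placeholder.

    ```python
    from collections import defaultdict

    def map_phase(records):
        """Emit (key, value) pairs; here: count words per record."""
        for record in records:
            for word in record.split():
                yield word, 1

    def shuffle(pairs):
        """Group values by key, as the framework does between map and reduce."""
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_phase(groups):
        """Aggregate each key's values; here: sum the counts."""
        return {key: sum(values) for key, values in groups.items()}

    records = ["big data big volume", "disparate data sources"]
    print(reduce_phase(shuffle(map_phase(records))))   # {'big': 2, 'data': 2, ...}
    ```

    A parallel DBMS, by contrast, would express the same aggregation declaratively (GROUP BY) and rely on the query optimizer to distribute the work, which is the core architectural difference the paper examines.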

  8. Efficiently modeling neural networks on massively parallel computers

    NASA Technical Reports Server (NTRS)

    Farber, Robert M.

    1993-01-01

    Neural networks are a very useful tool for analyzing and modeling complex real world systems. Applying neural network simulations to real world problems generally involves large amounts of data and massive amounts of computation. To efficiently handle the computational requirements of large problems, we have implemented at Los Alamos a highly efficient neural network compiler for serial computers, vector computers, vector parallel computers, and fine grain SIMD computers such as the CM-2 connection machine. This paper describes the mapping used by the compiler to implement feed-forward backpropagation neural networks for a SIMD (Single Instruction Multiple Data) architecture parallel computer. Thinking Machines Corporation has benchmarked our code at 1.3 billion interconnects per second (approximately 3 gigaflops) on a 64,000 processor CM-2 connection machine (Singer 1990). This mapping is applicable to other SIMD computers and can be implemented on MIMD computers such as the CM-5 connection machine. Our mapping has virtually no communications overhead, with the exception of the communications required for a global summation across the processors (which has a sub-linear runtime growth on the order of O(log(number of processors))). We can efficiently model very large neural networks with many neurons and interconnects, and our mapping can extend to arbitrarily large networks (within memory limitations) by merging the memory space of separate processors with fast adjacent-processor interprocessor communications. This paper considers the simulation of only feed-forward neural networks, although this method is extendable to recurrent networks.
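
    The only communication the mapping requires is a global summation across processors, which can be completed in O(log P) combine steps by pairwise exchange. Below is a small sketch of such a tree (recursive-doubling) reduction, with a plain Python list standing in for the processor array; it illustrates the technique, not the CM-2 implementation itself.

    ```python
    def global_sum(per_processor_values):
        """Tree reduction: O(log P) combine steps instead of P - 1 sequential additions.

        Each round, processor i (in the lower half) adds the value held by
        processor i + half; after about log2(P) rounds, element 0 holds the total.
        """
        values = list(per_processor_values)
        n = len(values)
        while n > 1:
            half = (n + 1) // 2
            for i in range(n - half):
                values[i] += values[i + half]   # one parallel exchange-and-add step
            n = half
        return values[0]

    print(global_sum([1.0, 2.0, 3.0, 4.0, 5.0]))  # 15.0 in ceil(log2(5)) = 3 steps
    ```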

  9. Changes in polysaccharide and protein composition of cell walls in grape berry skin (Cv. Shiraz) during ripening and over-ripening.

    PubMed

    Vicens, Anysia; Fournand, David; Williams, Pascale; Sidhoum, Louise; Moutounet, Michel; Doco, Thierry

    2009-04-08

    Polysaccharide modification is the most fundamental factor that affects firmness of fruit during ripening. In grape, because of the lack of information on the modifications occurring in cell wall polysaccharides in skins, but also because this tissue contains large amounts of organoleptic compounds for winemaking, a study was performed on the evolution and extractability of polysaccharides from grape skins of Shiraz cultivar throughout ripening. A HEPES/phenol extraction technique was used to analyze Shiraz grape cell wall material isolated from skins of berries harvested from one to ten weeks after veraison. Total amounts in cell wall polysaccharides remained constant during ripening (4.2 mg/berry). A slight decrease in galactose content of insoluble polysaccharides was observed, as well as a significant de-esterification of methoxylated uronic acids, indicating that some modifications occur in cell wall polysaccharides. The water-soluble fraction represented a very small fraction of the whole polysaccharides, but its amounts increased more than 2-fold between the first and the last sample. Isolated cell walls were also analyzed for their protein composition. Last, hydroalcoholic extractions in model-wine solution were also performed on fresh skins. This extracted fraction was very similar to the water-soluble one, and increased during the entire period. By comparison with polysaccharide modifications described in flesh cell wall in previous works, it can be assumed that the moderate skin polysaccharide degradation highlights the protective role of that tissue.

  10. Quercetin as colorimetric reagent for determination of zirconium

    USGS Publications Warehouse

    Grimaldi, F.S.; White, C.E.

    1953-01-01

    Methods described in the literature for the determination of zirconium are generally designed for relatively large amounts of this element. A good colorimetric procedure for the determination of trace amounts is desirable. Quercetin has been found to yield a sensitive color reaction with zirconium suitable for the determination of from 0.1 to 50 μg of zirconium dioxide. The procedure developed involves the separation of zirconium from interfering elements by precipitation with p-dimethylaminoazophenylarsonic acid prior to its estimation with quercetin. The quercetin reaction is carried out in 0.5 N hydrochloric acid solution. Under the operating conditions it is indicated that quercetin forms a 2 to 1 complex with zirconium; however, a 2 to 1 and a 1 to 1 complex can coexist under special conditions. Approximate values for the equilibrium constants of the complexes are K1 = 0.33 × 10⁻⁵ and K2 = 1.3 × 10⁻⁹. Seven Bureau of Standards samples of glass sands and refractories were analyzed with excellent results. The method described should find considerable application in the analysis of minerals and other materials for macro as well as micro amounts of zirconium.

  11. Destruction of Navy Hazardous Wastes by Supercritical Water Oxidation

    DTIC Science & Technology

    1994-08-01

    cleaning and derusting (nitrite and citric acid solutions), electroplating (acids and metal-bearing solutions), electronics and refrigeration... Waste streams that contain a large amount of mineral-acid-forming chemical species or that contain a large amount of dissolved solids present a challenge to current SCWO technology. Approved for public release.

  12. STAR FORMATION AND SUPERCLUSTER ENVIRONMENT OF 107 NEARBY GALAXY CLUSTERS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cohen, Seth A.; Hickox, Ryan C.; Wegner, Gary A.

    We analyze the relationship between star formation (SF), substructure, and supercluster environment in a sample of 107 nearby galaxy clusters using data from the Sloan Digital Sky Survey. Previous works have investigated the relationships between SF and cluster substructure, and cluster substructure and supercluster environment, but definitive conclusions relating all three of these variables have remained elusive. We find an inverse relationship between cluster SF fraction (f_SF) and supercluster environment density, calculated using the galaxy luminosity density field at a smoothing length of 8 h⁻¹ Mpc (D8). The slope of f_SF versus D8 is −0.008 ± 0.002. The f_SF of clusters located in low-density large-scale environments, 0.244 ± 0.011, is higher than for clusters located in high-density supercluster cores, 0.202 ± 0.014. We also divide superclusters, according to their morphology, into filament- and spider-type systems. The inverse relationship between cluster f_SF and large-scale density is dominated by filament- rather than spider-type superclusters. In high-density cores of superclusters, we find a higher f_SF in spider-type superclusters, 0.229 ± 0.016, than in filament-type superclusters, 0.166 ± 0.019. Using principal component analysis, we confirm these results and the direct correlation between cluster substructure and SF. These results indicate that cluster SF is affected by the dynamical age of the cluster (younger systems exhibit higher amounts of SF), the large-scale density of the supercluster environment (high-density core regions exhibit lower amounts of SF), and supercluster morphology (spider-type superclusters exhibit higher amounts of SF at high densities).

  13. Analysis of Infrequent (Quasi-Decadal) Large Groundwater Recharge Events: A Case Study for Northern Utah, United States

    NASA Astrophysics Data System (ADS)

    Masbruch, M.; Rumsey, C.; Gangopadhyay, S.; Susong, D.; Pruitt, T.

    2015-12-01

    There has been a considerable amount of research linking climatic variability to hydrologic responses in arid and semi-arid regions such as the western United States. Although much effort has been spent to assess and predict changes in surface-water resources, little has been done to understand how climatic events and changes affect groundwater resources. This study focuses on quantifying the effects of large quasi-decadal groundwater recharge events on groundwater in the northern Utah portion of the Great Basin for the period 1960 to 2013. Groundwater-level monitoring data were analyzed with climatic data to characterize climatic conditions and frequency of these large recharge events. Using observed water-level changes and multivariate analysis, five large groundwater recharge events were identified within the study area and period, with a frequency of about 11 to 13 years. These events were generally characterized as having above-average annual precipitation and snow water equivalent and below-average seasonal temperatures, especially during the spring (April through June). Existing groundwater flow models for several basins within the study area were used to quantify changes in groundwater storage from these events. Simulated groundwater storage increases per basin from a single event ranged from about 115 Mm3 (93,000 acre-feet) to 205 Mm3 (166,000 acre-ft). Extrapolating these amounts over the entire northern Great Basin indicates that even a single large quasi-decadal recharge event could result in billions of cubic meters (millions of acre-feet) of groundwater recharge. Understanding the role of these large quasi-decadal recharge events in replenishing aquifers and sustaining water supplies is crucial for making informed water management decisions.

  14. Empirical relationships between tree fall and landscape-level amounts of logging and fire

    PubMed Central

    Blanchard, Wade; Blair, David; McBurney, Lachlan; Stein, John; Banks, Sam C.

    2018-01-01

    Large old trees are critically important keystone structures in forest ecosystems globally. Populations of these trees are also in rapid decline in many forest ecosystems, making it important to quantify the factors that influence their dynamics at different spatial scales. Large old trees often occur in forest landscapes also subject to fire and logging. However, the effects on the risk of collapse of large old trees of the amount of logging and fire in the surrounding landscape are not well understood. Using an 18-year study in the Mountain Ash (Eucalyptus regnans) forests of the Central Highlands of Victoria, we quantify relationships between the probability of collapse of large old hollow-bearing trees at a site and the amount of logging and the amount of fire in the surrounding landscape. We found the probability of collapse increased with an increasing amount of logged forest in the surrounding landscape. It also increased with a greater amount of burned area in the surrounding landscape, particularly for trees in highly advanced stages of decay. The most likely explanation for elevated tree fall with an increasing amount of logged or burned areas in the surrounding landscape is change in wind movement patterns associated with cutblocks or burned areas. Previous studies show that large old hollow-bearing trees are already at high risk of collapse in our study area. New analyses presented here indicate that additional logging operations in the surrounding landscape will further elevate that risk. Current logging prescriptions require the protection of large old hollow-bearing trees on cutblocks. We suggest that efforts to reduce the probability of collapse of large old hollow-bearing trees on unlogged sites will demand careful landscape planning to limit the amount of timber harvesting in the surrounding landscape. PMID:29474487

  15. Empirical relationships between tree fall and landscape-level amounts of logging and fire.

    PubMed

    Lindenmayer, David B; Blanchard, Wade; Blair, David; McBurney, Lachlan; Stein, John; Banks, Sam C

    2018-01-01

    Large old trees are critically important keystone structures in forest ecosystems globally. Populations of these trees are also in rapid decline in many forest ecosystems, making it important to quantify the factors that influence their dynamics at different spatial scales. Large old trees often occur in forest landscapes also subject to fire and logging. However, the effects on the risk of collapse of large old trees of the amount of logging and fire in the surrounding landscape are not well understood. Using an 18-year study in the Mountain Ash (Eucalyptus regnans) forests of the Central Highlands of Victoria, we quantify relationships between the probability of collapse of large old hollow-bearing trees at a site and the amount of logging and the amount of fire in the surrounding landscape. We found the probability of collapse increased with an increasing amount of logged forest in the surrounding landscape. It also increased with a greater amount of burned area in the surrounding landscape, particularly for trees in highly advanced stages of decay. The most likely explanation for elevated tree fall with an increasing amount of logged or burned areas in the surrounding landscape is change in wind movement patterns associated with cutblocks or burned areas. Previous studies show that large old hollow-bearing trees are already at high risk of collapse in our study area. New analyses presented here indicate that additional logging operations in the surrounding landscape will further elevate that risk. Current logging prescriptions require the protection of large old hollow-bearing trees on cutblocks. We suggest that efforts to reduce the probability of collapse of large old hollow-bearing trees on unlogged sites will demand careful landscape planning to limit the amount of timber harvesting in the surrounding landscape.

  16. Multimedia content analysis and indexing: evaluation of a distributed and scalable architecture

    NASA Astrophysics Data System (ADS)

    Mandviwala, Hasnain; Blackwell, Scott; Weikart, Chris; Van Thong, Jean-Manuel

    2003-11-01

    Multimedia search engines facilitate the retrieval of documents from large media content archives now available via intranets and the Internet. Over the past several years, many research projects have focused on algorithms for analyzing and indexing media content efficiently. However, special system architectures are required to process large amounts of content from real-time feeds or existing archives. Possible solutions include dedicated distributed architectures for analyzing content rapidly and for making it searchable. The system architecture we propose implements such an approach: a highly distributed and reconfigurable batch media content analyzer that can process media streams and static media repositories. Our distributed media analysis application handles media acquisition, content processing, and document indexing. This collection of modules is orchestrated by a task flow management component, exploiting data and pipeline parallelism in the application. A scheduler manages load balancing and prioritizes the different tasks. Workers implement application-specific modules that can be deployed on an arbitrary number of nodes running different operating systems. Each application module is exposed as a web service, implemented with industry-standard interoperable middleware components such as Microsoft ASP.NET and Sun J2EE. Our system architecture is the next generation system for the multimedia indexing application demonstrated by www.speechbot.com. It can process large volumes of audio recordings with minimal support and maintenance, while running on low-cost commodity hardware. The system has been evaluated on a server farm running concurrent content analysis processes.

  17. TOFSIMS-P: a web-based platform for analysis of large-scale TOF-SIMS data.

    PubMed

    Yun, So Jeong; Park, Ji-Won; Choi, Il Ju; Kang, Byeongsoo; Kim, Hark Kyun; Moon, Dae Won; Lee, Tae Geol; Hwang, Daehee

    2011-12-15

    Time-of-flight secondary ion mass spectrometry (TOF-SIMS) has been a useful tool to profile secondary ions from the near surface region of specimens with its high molecular specificity and submicrometer spatial resolution. However, the TOF-SIMS analysis of even a moderately large size of samples has been hampered due to the lack of tools for automatically analyzing the huge amount of TOF-SIMS data. Here, we present a computational platform to automatically identify and align peaks, find discriminatory ions, build a classifier, and construct networks describing differential metabolic pathways. To demonstrate the utility of the platform, we analyzed 43 data sets generated from seven gastric cancer and eight normal tissues using TOF-SIMS. A total of 87 138 ions were detected from the 43 data sets by TOF-SIMS. We selected and then aligned 1286 ions. Among them, we found the 66 ions discriminating gastric cancer tissues from normal ones. Using these 66 ions, we then built a partial least square-discriminant analysis (PLS-DA) model resulting in a misclassification error rate of 0.024. Finally, network analysis of the 66 ions showed disregulation of amino acid metabolism in the gastric cancer tissues. The results show that the proposed framework was effective in analyzing TOF-SIMS data from a moderately large size of samples, resulting in discrimination of gastric cancer tissues from normal tissues and identification of biomarker candidates associated with the amino acid metabolism.
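
    PLS-DA of the kind used above is commonly implemented as partial least squares regression against a binary class label, followed by thresholding the predicted response. A hedged sketch with scikit-learn follows; the ion-intensity matrix, label vector, and threshold are placeholders for illustration, not the platform's own code or data.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    def plsda_fit_predict(X_train, y_train, X_test, n_components=2):
        """X: samples x aligned-ion intensity matrix; y: 0 = normal, 1 = cancer."""
        pls = PLSRegression(n_components=n_components)
        pls.fit(X_train, y_train.astype(float))
        scores = pls.predict(X_test).ravel()
        return (scores > 0.5).astype(int)          # threshold the continuous PLS response

    # Placeholder data: 15 tissues x 66 discriminatory ions
    rng = np.random.default_rng(0)
    X = rng.normal(size=(15, 66))
    y = np.array([1] * 7 + [0] * 8)
    print(plsda_fit_predict(X[:10], y[:10], X[10:]))
    ```

    In practice the misclassification error would be estimated by cross-validation over the samples rather than a single split, which is how error rates such as the 0.024 reported above are typically obtained.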

  18. Older adults' memory for medical information, effect of number and mode of presentation: An experimental study.

    PubMed

    Latorre-Postigo, José Miguel; Ros-Segura, Laura; Navarro-Bravo, Beatriz; Ricarte-Trives, Jorge Javier; Serrano-Selva, Juan Pedro; López-Torres-Hidalgo, Jesús

    2017-01-01

    The aim was to analyze different ways of presenting medical information to older adults, tailoring the information and its presentation to the characteristics of memory function in old age. Experimental study. We took into account the following variables: amount of information, type of information and mode of presentation, and time delay. The greater the number of recommendations, the lower the recall; visual presentation does not enhance verbal presentation; lifestyle information is recalled better than medication information; after ten minutes the percentage recalled decreases significantly; the first and last recommendations are better remembered. As a whole, these findings show that older adults remember more medical information when very few recommendations are provided in each session. It is inadvisable to overload older adults with a large amount of information: it is better to schedule more consultations and provide less information at each. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  19. High-speed cell recognition algorithm for ultrafast flow cytometer imaging system.

    PubMed

    Zhao, Wanyue; Wang, Chao; Chen, Hongwei; Chen, Minghua; Yang, Sigang

    2018-04-01

    An optical time-stretch flow imaging system enables high-throughput examination of cells/particles with unprecedented high speed and resolution. A significant amount of raw image data is produced. A high-speed cell recognition algorithm is, therefore, highly demanded to analyze large amounts of data efficiently. A high-speed cell recognition algorithm consisting of two-stage cascaded detection and Gaussian mixture model (GMM) classification is proposed. The first stage of detection extracts cell regions. The second stage integrates distance transform and the watershed algorithm to separate clustered cells. Finally, the cells detected are classified by GMM. We compared the performance of our algorithm with support vector machine. Results show that our algorithm increases the running speed by over 150% without sacrificing the recognition accuracy. This algorithm provides a promising solution for high-throughput and automated cell imaging and classification in the ultrafast flow cytometer imaging platform. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
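
    As a rough companion to the pipeline described above, the sketch below illustrates the same two-stage idea in Python: threshold a frame to extract cell regions, split clustered cells with a distance transform and watershed, and classify simple region features with a Gaussian mixture model. It is a minimal approximation, not the authors' implementation; the Otsu threshold, the feature set, and the number of classes are assumptions.

    ```python
    import numpy as np
    from scipy import ndimage as ndi
    from skimage.filters import threshold_otsu
    from skimage.feature import peak_local_max
    from skimage.segmentation import watershed
    from skimage.measure import regionprops
    from sklearn.mixture import GaussianMixture

    def detect_cells(frame: np.ndarray) -> np.ndarray:
        """Return one feature vector (area, eccentricity, mean intensity) per detected cell."""
        mask = frame > threshold_otsu(frame)             # stage 1: extract cell regions
        distance = ndi.distance_transform_edt(mask)      # stage 2: split clustered cells
        peaks = peak_local_max(distance, labels=mask, min_distance=5)
        markers = np.zeros(mask.shape, dtype=int)
        markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
        regions = watershed(-distance, markers, mask=mask)
        feats = [[r.area, r.eccentricity, r.mean_intensity]
                 for r in regionprops(regions, intensity_image=frame)]
        return np.asarray(feats, dtype=float)

    def classify_cells(features: np.ndarray, n_classes: int = 2) -> np.ndarray:
        """Unsupervised GMM classification of the detected cells."""
        gmm = GaussianMixture(n_components=n_classes, random_state=0).fit(features)
        return gmm.predict(features)
    ```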

  20. High-speed cell recognition algorithm for ultrafast flow cytometer imaging system

    NASA Astrophysics Data System (ADS)

    Zhao, Wanyue; Wang, Chao; Chen, Hongwei; Chen, Minghua; Yang, Sigang

    2018-04-01

    An optical time-stretch flow imaging system enables high-throughput examination of cells/particles with unprecedented high speed and resolution. A significant amount of raw image data is produced. A high-speed cell recognition algorithm is, therefore, highly demanded to analyze large amounts of data efficiently. A high-speed cell recognition algorithm consisting of two-stage cascaded detection and Gaussian mixture model (GMM) classification is proposed. The first stage of detection extracts cell regions. The second stage integrates distance transform and the watershed algorithm to separate clustered cells. Finally, the cells detected are classified by GMM. We compared the performance of our algorithm with support vector machine. Results show that our algorithm increases the running speed by over 150% without sacrificing the recognition accuracy. This algorithm provides a promising solution for high-throughput and automated cell imaging and classification in the ultrafast flow cytometer imaging platform.

  1. Charging and Discharging of Lichtenberg Electrets

    NASA Astrophysics Data System (ADS)

    Wood, Monika

    The research presented here describes a unique way to deposit a large amount of charge onto the surface of a thin dielectric sheet to create a Lichtenberg electret that can be discharged elsewhere to form spectacular Lichtenberg figures. This study examines how the amount of charge deposited onto the surface, the geometry of the probes, and the type of material used can all impact the formation of the Lichtenberg figures. Photographs of the Lichtenberg figures were taken and used to determine the voltage, current, and energy released during each discharge. It was found that a single discharge can release 0.49 J of energy in 1.24 μs for a Lichtenberg figure that covers approximately 500 cm2. Lichtenberg figures can be used to characterize high-voltage surges on power lines, to diagnose lightning strike victims, to analyze electrical breakdown of insulating materials, for artistic purposes, and for similar applications where pulsed capacitors are commonly used.

  2. Responses to Oxidative and Heavy Metal Stresses in Cyanobacteria: Recent Advances

    PubMed Central

    Cassier-Chauvat, Corinne; Chauvat, Franck

    2014-01-01

    Cyanobacteria, the only known prokaryotes that perform oxygen-evolving photosynthesis, are receiving strong attention in basic and applied research. In using solar energy, water, CO2 and mineral salts to produce a large amount of biomass for the food chain, cyanobacteria constitute the first biological barrier against the entry of toxics into the food chain. In addition, cyanobacteria have the potential for the solar-driven carbon-neutral production of biofuels. However, cyanobacteria are often challenged by toxic reactive oxygen species generated under intense illumination, i.e., when their production of photosynthetic electrons exceeds what they need for the assimilation of inorganic nutrients. Furthermore, in requiring high amounts of various metals for growth, cyanobacteria are also frequently affected by drastic changes in metal availabilities. They are often challenged by heavy metals, which are increasingly spread out in the environment through human activities, and constitute persistent pollutants because they cannot be degraded. Consequently, it is important to analyze the protection against oxidative and metal stresses in cyanobacteria because these ancient organisms have developed most of these processes, a large number of which have been conserved during evolution. This review summarizes what is known regarding these mechanisms, with emphasis on their crosstalk. PMID:25561236

  3. Procedures of determining organic trace compounds in municipal sewage sludge-a review.

    PubMed

    Lindholm-Lehto, Petra C; Ahkola, Heidi S J; Knuutinen, Juha S

    2017-02-01

    Sewage sludge is the largest by-product generated during the wastewater treatment process. Since large amounts of sludge are being produced, different ways of disposal have been introduced. One tempting option is to use it as fertilizer in agricultural fields due to its high contents of inorganic nutrients. This, however, can be limited by the amount of trace contaminants in the sewage sludge, containing a variety of microbiological pollutants and pathogens but also inorganic and organic contaminants. The bioavailability and the effects of trace contaminants on the microorganisms of soil are still largely unknown as well as their mixture effects. Therefore, there is a need to analyze the sludge to test its suitability before further use. In this article, a variety of sampling, pretreatment, extraction, and analysis methods have been reviewed. Additionally, different organic trace compounds often found in the sewage sludge and their methods of analysis have been compiled. In addition to traditional Soxhlet extraction, the most common extraction methods of organic contaminants in sludge include ultrasonic extraction (USE), supercritical fluid extraction (SFE), microwave-assisted extraction (MAE), and pressurized liquid extraction (PLE) followed by instrumental analysis based on gas or liquid chromatography and mass spectrometry.

  4. PRESENCE OF CITRININ IN GRAINS AND ITS POSSIBLE HEALTH EFFECTS.

    PubMed

    Čulig, Borna; Bevardi, Martina; Bošnir, Jasna; Serdar, Sonja; Lasić, Dario; Racz, Aleksandar; Galić, Antonija; Kuharić, Željka

    2017-01-01

    Citrinin is a mycotoxin produced by several species of the genera Aspergillus, Penicillium and Monascus and it occurs mainly in stored grain. Citrinin is generally formed after harvest and occurs mainly in stored grains; it also occurs in other plant products. Often, the co-occurrence with other mycotoxins is observed, especially ochratoxin A, which is usually associated with endemic nephropathy. At the European Union level, systematic monitoring of citrinin in grains began with the aim of determining its highest permissible amount in food. Thus far, the systematic monitoring of the above-mentioned mycotoxin in Croatia is yet to begin. The main goal of this study was to determine the presence of citrinin in grains sampled in the area of Međimurje, Osijek-Baranja, Vukovar-Srijem and Brod-Posavina County. For the purpose of identification and quantification of citrinin, high performance liquid chromatography (HPLC) with fluorescence detection was used (calibration curve k > 0.999; intra-assay CV = 2.1%; inter-assay CV = 4.3%; LOQ < 1 μg/kg). From the area of Međimurje County, 10 samples of corn and 10 samples of wheat were analyzed. None of the samples contained citrinin (<1 μg/kg). From the area of Osijek-Baranja and Vukovar-Srijem County, 15 samples from each County were analyzed. The mean value for the samples of Osijek-Baranja County was 19.63 μg/kg (median=15.8 μg/kg), while for Vukovar-Srijem County the mean value of citrinin was 14.6 μg/kg (median=1.23 μg/kg). From 5 analyzed samples from Brod-Posavina County, one of the samples contained citrinin in the amount of 23.8 μg/kg, while the registered amounts in the other samples were <1 μg/kg. The results show that grains from several Counties contain certain amounts of citrinin, possibly indicating a significant intake of citrinin in humans. It must be stated that grains and grain-based products are the basis of the everyday diet of all age groups, especially small children, where a higher intake of citrinin can occur. Consequently, we emphasize the need for systematic analysis of a larger number of samples, from both large grains and small grains, especially in the area of Brod-Posavina County, in order to obtain a more realistic picture of citrinin contamination of grains and to assess the health risk in humans.

  5. GPCR ontology: development and application of a G protein-coupled receptor pharmacology knowledge framework.

    PubMed

    Przydzial, Magdalena J; Bhhatarai, Barun; Koleti, Amar; Vempati, Uma; Schürer, Stephan C

    2013-12-15

    Novel tools need to be developed to help scientists analyze large amounts of available screening data with the goal to identify entry points for the development of novel chemical probes and drugs. As the largest class of drug targets, G protein-coupled receptors (GPCRs) remain of particular interest and are pursued by numerous academic and industrial research projects. We report the first GPCR ontology to facilitate integration and aggregation of GPCR-targeting drugs and demonstrate its application to classify and analyze a large subset of the PubChem database. The GPCR ontology, based on previously reported BioAssay Ontology, depicts available pharmacological, biochemical and physiological profiles of GPCRs and their ligands. The novelty of the GPCR ontology lies in the use of diverse experimental datasets linked by a model to formally define these concepts. Using a reasoning system, GPCR ontology offers potential for knowledge-based classification of individuals (such as small molecules) as a function of the data. The GPCR ontology is available at http://www.bioassayontology.org/bao_gpcr and the National Center for Biomedical Ontologies Web site.

  6. Mathematical model of simple spalling formation during coal cutting with extracting machine

    NASA Astrophysics Data System (ADS)

    Gabov, V. V.; Zadkov, D. A.

    2018-05-01

    A single-mass model of a rotor shearer is analyzed. It is shown that rotor mining machines have large inertia moments and load dynamics. An extraction module model with selective movement of the cutting tool is presented. The peculiar feature of such extracting machines is the fluid-power-driven cutter mechanism. They can operate steadily at large shear thickness, and locking modes are not an emergency for them. Compared with shearers they have less inertial mass but a slower average cutting speed, whose momentary values depend on load. Based on the equation of hydraulic fluid consumption balance, the operation of the fluid power drive of the extracting module cutter mechanism together with a hydro-pneumatic accumulator is analyzed. A spalling formation model for coal cutting with a fluid-power-driven cutter mechanism and potential energy stores is suggested. When the cutter speed is matched to the speed of main crack expansion and the amount of potential energy consumed, the cutter load is determined only by the ultimate stress at the crack pole and by friction. Tests of the extracting module cutter on a full-size model confirmed the stated theory.

  7. Efficient Location Uncertainty Treatment for Probabilistic Modelling of Portfolio Loss from Earthquake Events

    NASA Astrophysics Data System (ADS)

    Scheingraber, Christoph; Käser, Martin; Allmann, Alexander

    2017-04-01

    Probabilistic seismic risk analysis (PSRA) is a well-established method for modelling loss from earthquake events. In the insurance industry, it is widely employed for probabilistic modelling of loss to a distributed portfolio. In this context, precise exposure locations are often unknown, which results in considerable loss uncertainty. The treatment of exposure uncertainty has already been identified as an area where PSRA would benefit from increased research attention. However, so far, epistemic location uncertainty has not been in the focus of a large amount of research. We propose a new framework for efficient treatment of location uncertainty. To demonstrate the usefulness of this novel method, a large number of synthetic portfolios resembling real-world portfolios is systematically analyzed. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or proportion of risk items with unknown coordinates on loss variability. Several sampling criteria to increase the computational efficiency of the framework are proposed and put into the wider context of well-established Monte-Carlo variance reduction techniques. The performance of each of the proposed criteria is analyzed.
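
    The following sketch illustrates the general spirit of sampling-based location-uncertainty treatment, not the proposed framework itself: risk items with unknown coordinates are sampled within an assumed admissible region and propagated through a placeholder loss function to obtain loss variability. The loss model, the portfolio, and the uniform sampling scheme are assumptions for illustration only.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def loss_at(lat, lon, value):
        """Placeholder ground-up loss model: loss decays with distance to a fixed event."""
        event_lat, event_lon = 10.0, 20.0
        dist = np.hypot(lat - event_lat, lon - event_lon)
        return value * np.exp(-dist / 50.0)

    def portfolio_loss_samples(items, n_samples=1000):
        """items: dicts with 'value' and either exact coordinates or a bounding box."""
        losses = np.zeros(n_samples)
        for item in items:
            if "lat" in item:                                      # known location
                losses += loss_at(item["lat"], item["lon"], item["value"])
            else:                                                  # unknown: sample within region
                lo, hi = item["bbox_min"], item["bbox_max"]
                lats = rng.uniform(lo[0], hi[0], n_samples)
                lons = rng.uniform(lo[1], hi[1], n_samples)
                losses += loss_at(lats, lons, item["value"])
        return losses

    items = [{"value": 1e6, "lat": 10.5, "lon": 20.5},
             {"value": 2e6, "bbox_min": (9.0, 19.0), "bbox_max": (12.0, 22.0)}]
    samples = portfolio_loss_samples(items)
    print("mean loss:", round(samples.mean()), "std:", round(samples.std()))
    ```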

  8. Time-Shifted Boundary Conditions Used for Navier-Stokes Aeroelastic Solver

    NASA Technical Reports Server (NTRS)

    Srivastava, Rakesh

    1999-01-01

    Under the Advanced Subsonic Technology (AST) Program, an aeroelastic analysis code (TURBO-AE) based on Navier-Stokes equations is currently under development at NASA Lewis Research Center's Machine Dynamics Branch. For a blade row, aeroelastic instability can occur in any of the possible interblade phase angles (IBPAs). Analyzing small IBPAs is very computationally expensive because a large number of blade passages must be simulated. To reduce the computational cost of these analyses, we used time shifted, or phase-lagged, boundary conditions in the TURBO-AE code. These conditions can be used to reduce the computational domain to a single blade passage by requiring the boundary conditions across the passage to be lagged depending on the IBPA being analyzed. The time-shifted boundary conditions currently implemented are based on the direct-store method. This method requires large amounts of data to be stored over a period of the oscillation cycle. On CRAY computers this is not a major problem because solid-state devices can be used for fast input and output to read and write the data onto a disk instead of storing it in core memory.

  9. A simple biosynthetic pathway for large product generation from small substrate amounts

    NASA Astrophysics Data System (ADS)

    Djordjevic, Marko; Djordjevic, Magdalena

    2012-10-01

    A recently emerging discipline of synthetic biology has the aim of constructing new biosynthetic pathways with useful biological functions. A major application of these pathways is generating a large amount of the desired product. However, toxicity due to the possible presence of toxic precursors is one of the main problems for such production. We consider here the problem of generating a large amount of product from a potentially toxic substrate. To address this, we propose a simple biosynthetic pathway, which can be induced in order to produce a large number of the product molecules, by keeping the substrate amount at low levels. Surprisingly, we show that the large product generation crucially depends on fast non-specific degradation of the substrate molecules. We derive an optimal induction strategy, which allows as much as three orders of magnitude increase in the product amount through biologically realistic parameter values. We point to a recently discovered bacterial immune system (CRISPR/Cas in E. coli) as a putative example of the pathway analysed here. We also argue that the scheme proposed here can be used not only as a stand-alone pathway, but also as a strategy to produce a large amount of the desired molecules with small perturbations of endogenous biosynthetic pathways.
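
    A toy numerical illustration of the qualitative point, with assumed rate constants rather than the paper's model: when non-specific degradation of the substrate is fast, an induced pathway can still accumulate a large amount of product while the substrate itself stays at low levels.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def pathway(t, y, k_in, k_deg, k_cat):
        s, p = y
        ds = k_in - (k_deg + k_cat) * s   # substrate: influx, fast degradation, conversion
        dp = k_cat * s                    # product accumulates
        return [ds, dp]

    # assumed rates: influx 1.0, degradation 5.0, conversion 0.5 (arbitrary units)
    sol = solve_ivp(pathway, (0, 100), [0.0, 0.0], args=(1.0, 5.0, 0.5), dense_output=True)
    t = np.linspace(0, 100, 5)
    print("substrate:", np.round(sol.sol(t)[0], 3))   # stays near k_in / (k_deg + k_cat)
    print("product  :", np.round(sol.sol(t)[1], 2))   # grows roughly linearly with time
    ```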

  10. Cellular interface morphologies in directional solidification. III - The effects of heat transfer and solid diffusivity

    NASA Technical Reports Server (NTRS)

    Ungar, Lyle H.; Bennett, Mark J.; Brown, Robert A.

    1985-01-01

    The shape and stability of two-dimensional finite-amplitude cellular interfaces arising during directional solidification are compared for several solidification models that account differently for latent heat released at the interface, unequal thermal conductivities of melt and solid, and solute diffusivity in the solid. Finite-element analysis and computer-implemented perturbation methods are used to analyze the families of steadily growing cellular forms that evolve from the planar state. In all models a secondary bifurcation between different families of finite-amplitude cells exists that halves the spatial wavelength of the stable interface. The quantitative location of this transition is very dependent on the details of the model. Large amounts of solute diffusion in the solid retard the growth of large-amplitude cells.

  11. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2013-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code and the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.

  12. Precision medicine for psychopharmacology: a general introduction.

    PubMed

    Shin, Cheolmin; Han, Changsu; Pae, Chi-Un; Patkar, Ashwin A

    2016-07-01

    Precision medicine is an emerging medical model that can provide accurate diagnoses and tailored therapeutic strategies for patients based on data pertaining to genes, microbiomes, environment, family history and lifestyle. Here, we provide basic information about precision medicine and newly introduced concepts, such as the precision medicine ecosystem and big data processing, and omics technologies including pharmacogenomics, pharamacometabolomics, pharmacoproteomics, pharmacoepigenomics, connectomics and exposomics. The authors review the current state of omics in psychiatry and the future direction of psychopharmacology as it moves towards precision medicine. Expert commentary: Advances in precision medicine have been facilitated by achievements in multiple fields, including large-scale biological databases, powerful methods for characterizing patients (such as genomics, proteomics, metabolomics, diverse cellular assays, and even social networks and mobile health technologies), and computer-based tools for analyzing large amounts of data.

  13. Resolving the tips of the tree of life: How much mitochondrial data do we need?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonett, Ronald M.; Macey, J. Robert; Boore, Jeffrey L.

    2005-04-29

    Mitochondrial (mt) DNA sequences are used extensively to reconstruct evolutionary relationships among recently diverged animals, and have constituted the most widely used markers for species- and generic-level relationships for the last decade or more. However, most studies to date have employed relatively small portions of the mt-genome. In contrast, complete mt-genomes primarily have been used to investigate deep divergences, including several studies of the amount of mt sequence necessary to recover ancient relationships. We sequenced and analyzed 24 complete mt-genomes from a group of salamander species exhibiting divergences typical of those in many species-level studies. We present the first comprehensive investigation of the amount of mt sequence data necessary to consistently recover the mt-genome tree at this level, using parsimony and Bayesian methods. Both methods of phylogenetic analysis revealed extremely similar results. A surprising number of well supported, yet conflicting, relationships were found in trees based on fragments less than ~2000 nucleotides (nt), typical of the vast majority of the thousands of mt-based studies published to date. Large amounts of data (11,500+ nt) were necessary to consistently recover the whole mt-genome tree. Some relationships consistently were recovered with fragments of all sizes, but many nodes required the majority of the mt-genome to stabilize, particularly those associated with short internal branches. Although moderate amounts of data (2000-3000 nt) were adequate to recover mt-based relationships for which most nodes were congruent with the whole mt-genome tree, many thousands of nucleotides were necessary to resolve rapid bursts of evolution. Recent advances in genomics are making collection of large amounts of sequence data highly feasible, and our results provide the basis for comparative studies of other closely related groups to optimize mt sequence sampling and phylogenetic resolution at the "tips" of the Tree of Life.

  14. The Era of the Large Databases: Outcomes After Gastroesophageal Surgery According to NSQIP, NIS, and NCDB Databases. Systematic Literature Review.

    PubMed

    Batista Rodríguez, Gabriela; Balla, Andrea; Fernández-Ananín, Sonia; Balagué, Carmen; Targarona, Eduard M

    2018-05-01

    The term big data refers to databases that include large amounts of information used in various areas of knowledge. Currently, there are large databases that allow the evaluation of postoperative evolution, such as the American College of Surgeons National Surgical Quality Improvement Program (ACS-NSQIP), the Healthcare Cost and Utilization Project (HCUP) National Inpatient Sample (NIS), and the National Cancer Database (NCDB). The aim of this review was to evaluate the clinical impact of information obtained from these registries regarding gastroesophageal surgery. A systematic review using the Meta-analysis of Observational Studies in Epidemiology guidelines was performed. The research was carried out using the PubMed database identifying 251 articles. All outcomes related to gastroesophageal surgery were analyzed. A total of 34 articles published between January 2007 and July 2017 were included, for a total of 345 697 patients. Studies were analyzed and divided according to the type of surgery and main theme in (1) esophageal surgery and (2) gastric surgery. The information provided by these databases is an effective way to obtain levels of evidence not obtainable by conventional methods. Furthermore, this information is useful for the external validation of previous studies, to establish benchmarks that allow comparisons between centers and have a positive impact on the quality of care.

  15. Scalable Performance Measurement and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamblin, Todd

    2009-01-01

    Concurrency levels in large-scale, distributed-memory supercomputers are rising exponentially. Modern machines may contain 100,000 or more microprocessor cores, and the largest of these, IBM's Blue Gene/L, contains over 200,000 cores. Future systems are expected to support millions of concurrent tasks. In this dissertation, we focus on efficient techniques for measuring and analyzing the performance of applications running on very large parallel machines. Tuning the performance of large-scale applications can be a subtle and time-consuming task because application developers must measure and interpret data from many independent processes. While the volume of the raw data scales linearly with the number of tasks in the running system, the number of tasks is growing exponentially, and data for even small systems quickly becomes unmanageable. Transporting performance data from so many processes over a network can perturb application performance and make measurements inaccurate, and storing such data would require a prohibitive amount of space. Moreover, even if it were stored, analyzing the data would be extremely time-consuming. In this dissertation, we present novel methods for reducing performance data volume. The first draws on multi-scale wavelet techniques from signal processing to compress systemwide, time-varying load-balance data. The second uses statistical sampling to select a small subset of running processes to generate low-volume traces. A third approach combines sampling and wavelet compression to stratify performance data adaptively at run-time and to reduce further the cost of sampled tracing. We have integrated these approaches into Libra, a toolset for scalable load-balance analysis. We present Libra and show how it can be used to analyze data from large scientific applications scalably.
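
    The snippet below is an illustrative sketch of the wavelet-compression idea, not the Libra toolset: decompose a per-timestep load-balance signal, keep only the largest coefficients, and reconstruct an approximation. The wavelet family and the retention fraction are assumptions.

    ```python
    import numpy as np
    import pywt

    def compress_load_trace(trace: np.ndarray, keep: float = 0.05) -> np.ndarray:
        """Threshold a multi-level wavelet decomposition, keeping roughly `keep` of the coefficients."""
        coeffs = pywt.wavedec(trace, "db4", level=4)
        magnitudes = np.concatenate([np.abs(c) for c in coeffs])
        threshold = np.quantile(magnitudes, 1.0 - keep)
        compressed = [np.where(np.abs(c) >= threshold, c, 0.0) for c in coeffs]
        return pywt.waverec(compressed, "db4")[: len(trace)]
    ```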

  16. Classification Influence of Features on Given Emotions and Its Application in Feature Selection

    NASA Astrophysics Data System (ADS)

    Xing, Yin; Chen, Chuang; Liu, Li-Long

    2018-04-01

    In order to address the large amount of redundant data in high-dimensional speech emotion features, we analyze the extracted speech emotion features in depth and select the better ones. Firstly, a given emotion is classified by each feature individually. Secondly, the recognition rates are ranked in descending order. Then, the optimal feature threshold is determined by a rate criterion. Finally, the better features are obtained. When applied to the Berlin and Chinese emotional data sets, the experimental results show that the feature selection method outperforms the other traditional methods.
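
    A hedged sketch of the per-feature ranking step described above: score each speech-emotion feature by the cross-validated recognition rate it achieves on its own, sort the rates in descending order, and keep features above a threshold. The classifier and the threshold value are illustrative assumptions, not the paper's choices.

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    def rank_features(X: np.ndarray, y: np.ndarray, threshold: float = 0.3):
        """Return (feature index, recognition rate) pairs sorted in descending order."""
        scores = []
        for j in range(X.shape[1]):
            rate = cross_val_score(SVC(kernel="rbf"), X[:, [j]], y, cv=5).mean()
            scores.append((j, rate))
        scores.sort(key=lambda item: item[1], reverse=True)
        selected = [j for j, rate in scores if rate >= threshold]
        return scores, selected
    ```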

  17. FLIPing heterokaryons to analyze nucleo-cytoplasmic shuttling of yeast proteins.

    PubMed

    Belaya, Katsiaryna; Tollervey, David; Kos, Martin

    2006-05-01

    Nucleo-cytoplasmic shuttling is an important feature of proteins involved in nuclear export/import of RNAs, proteins, and also large ribonucleoprotein complexes such as ribosomes. The vast amount of proteomic data available shows that many of these processes are highly dynamic. Therefore, methods are needed to reliably assess whether a protein shuttles between nucleus and cytoplasm, and the kinetics with which it exchanges. Here we describe a combination of the classical heterokaryon assay with fluorescence recovery after photobleaching (FRAP) and fluorescence loss in photobleaching (FLIP) techniques, which allows an assessment of the kinetics of protein shuttling in the yeast Saccharomyces cerevisiae.

  18. Vlsi implementation of flexible architecture for decision tree classification in data mining

    NASA Astrophysics Data System (ADS)

    Sharma, K. Venkatesh; Shewandagn, Behailu; Bhukya, Shankar Nayak

    2017-07-01

    Data mining algorithms have become vital to researchers in science, engineering, medicine, business, search and security domains. In recent years, there has been a tremendous rise in the size of the data being collected and analyzed. Classification is the main difficulty faced in data mining. Among the solutions developed for this problem, the most widely accepted one is Decision Tree Classification (DTC), which gives high precision while handling very large amounts of data. This paper presents a VLSI implementation of a flexible architecture for Decision Tree Classification in data mining using the C4.5 algorithm.
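
    For readers without access to the hardware, a software-level counterpart of the classification step can be sketched with an entropy-based decision tree (scikit-learn's approximation of C4.5); the dataset and depth limit below are assumptions, and the VLSI architecture itself is of course not reproduced.

    ```python
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    tree = DecisionTreeClassifier(criterion="entropy", max_depth=4).fit(X_tr, y_tr)  # C4.5-style splits
    print("held-out accuracy:", tree.score(X_te, y_te))
    ```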

  19. Characterizing Marine Soundscapes.

    PubMed

    Erbe, Christine; McCauley, Robert; Gavrilov, Alexander

    2016-01-01

    The study of marine soundscapes is becoming widespread and the amount of data collected is increasing rapidly. Data owners (typically academia, industry, government, and defense) are negotiating data sharing and generating potential for data syntheses, comparative studies, analyses of trends, and large-scale and long-term acoustic ecology research. A problem is the lack of standards and commonly agreed protocols for the recording of marine soundscapes, data analysis, and reporting that make a synthesis and comparison of results difficult. We provide a brief overview of the components in a marine soundscape, the hard- and software tools for recording and analyzing marine soundscapes, and common reporting formats.

  20. Differential principal component analysis of ChIP-seq.

    PubMed

    Ji, Hongkai; Li, Xia; Wang, Qian-fei; Ning, Yang

    2013-04-23

    We propose differential principal component analysis (dPCA) for analyzing multiple ChIP-sequencing datasets to identify differential protein-DNA interactions between two biological conditions. dPCA integrates unsupervised pattern discovery, dimension reduction, and statistical inference into a single framework. It uses a small number of principal components to summarize concisely the major multiprotein synergistic differential patterns between the two conditions. For each pattern, it detects and prioritizes differential genomic loci by comparing the between-condition differences with the within-condition variation among replicate samples. dPCA provides a unique tool for efficiently analyzing large amounts of ChIP-sequencing data to study dynamic changes of gene regulation across different biological conditions. We demonstrate this approach through analyses of differential chromatin patterns at transcription factor binding sites and promoters as well as allele-specific protein-DNA interactions.
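
    A rough sketch of the dPCA idea as described above, under assumed array shapes and not using the published software: summarize between-condition differences at many genomic loci with a few principal components, then prioritize loci by comparing their scores with replicate-level variation.

    ```python
    import numpy as np

    def dpca_sketch(cond_a: np.ndarray, cond_b: np.ndarray, n_components: int = 2):
        """cond_a, cond_b: (replicates, loci, datasets) arrays of ChIP-seq signal."""
        diff = cond_a.mean(axis=0) - cond_b.mean(axis=0)          # loci x datasets
        diff_centered = diff - diff.mean(axis=0, keepdims=True)
        u, s, _ = np.linalg.svd(diff_centered, full_matrices=False)
        scores = u[:, :n_components] * s[:n_components]           # per-locus pattern scores
        within_sd = np.sqrt(cond_a.var(axis=0) + cond_b.var(axis=0)).mean(axis=1)
        priority = np.abs(scores[:, 0]) / (within_sd + 1e-9)      # crude differential ranking
        return scores, np.argsort(priority)[::-1]
    ```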

  1. One to One Recommendation System in Apparel On-Line Shopping

    NASA Astrophysics Data System (ADS)

    Sekozawa, Teruji; Mitsuhashi, Hiroyuki; Ozawa, Yukio

    We propose an apparel online shopping site on which a fashion adviser exists on the Internet. In a real shop, the fashion adviser, who has detailed knowledge about fashion, selects and coordinates clothes matching the customer's preference. On a conventional apparel shopping site, however, a customer without detailed knowledge about fashion was not able to choose clothes suited to their preference from among a large number of candidate items. We therefore compose a system that analyzes the customer's preference by the AHP technique, clusters clothes by their correlation, and performs market basket analysis. As a result, this system can coordinate clothes appropriate to the taste of an individual customer. Moreover, this system can recommend other clothes based on past sales data.
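
    The AHP step mentioned above can be illustrated with a minimal sketch: derive a priority vector from a reciprocal pairwise comparison matrix via its principal eigenvector. The criteria (price, color, style) and the example judgments are assumptions, not taken from the paper.

    ```python
    import numpy as np

    def ahp_priorities(pairwise: np.ndarray) -> np.ndarray:
        """Return normalized weights from a reciprocal pairwise comparison matrix."""
        eigvals, eigvecs = np.linalg.eig(pairwise)
        principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
        weights = np.abs(principal)
        return weights / weights.sum()

    # Example: a customer compares the criteria price, color and style pairwise.
    comparisons = np.array([[1.0, 3.0, 0.5],
                            [1 / 3, 1.0, 0.25],
                            [2.0, 4.0, 1.0]])
    print(ahp_priorities(comparisons))   # relative importance of the three criteria
    ```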

  2. Theoretical comparison of maser materials for a 32-GHz maser amplifier

    NASA Technical Reports Server (NTRS)

    Lyons, James R.

    1988-01-01

    The computational results of a comparison of maser materials for a 32 GHz maser amplifier are presented. The search for a better maser material is prompted by the relatively large amount of pump power required to sustain a population inversion in ruby at frequencies on the order of 30 GHz and above. The general requirements of a maser material and the specific problems with ruby are outlined. The spin Hamiltonian is used to calculate energy levels and transition probabilities for ruby and twelve other materials. A table is compiled of several attractive operating points for each of the materials analyzed. All the materials analyzed possess operating points that could be superior to ruby. To complete the evaluation of the materials, measurements of inversion ratio and pump power requirements must be made in the future.

  3. Effects of imputation on correlation: implications for analysis of mass spectrometry data from multiple biological matrices.

    PubMed

    Taylor, Sandra L; Ruhaak, L Renee; Kelly, Karen; Weiss, Robert H; Kim, Kyoungmi

    2017-03-01

    With expanded access to, and decreased costs of, mass spectrometry, investigators are collecting and analyzing multiple biological matrices from the same subject such as serum, plasma, tissue and urine to enhance biomarker discoveries, understanding of disease processes and identification of therapeutic targets. Commonly, each biological matrix is analyzed separately, but multivariate methods such as MANOVAs that combine information from multiple biological matrices are potentially more powerful. However, mass spectrometric data typically contain large amounts of missing values, and imputation is often used to create complete data sets for analysis. The effects of imputation on multiple biological matrix analyses have not been studied. We investigated the effects of seven imputation methods (half minimum substitution, mean substitution, k-nearest neighbors, local least squares regression, Bayesian principal components analysis, singular value decomposition and random forest), on the within-subject correlation of compounds between biological matrices and its consequences on MANOVA results. Through analysis of three real omics data sets and simulation studies, we found the amount of missing data and imputation method to substantially change the between-matrix correlation structure. The magnitude of the correlations was generally reduced in imputed data sets, and this effect increased with the amount of missing data. Significant results from MANOVA testing also were substantially affected. In particular, the number of false positives increased with the level of missing data for all imputation methods. No one imputation method was universally the best, but the simple substitution methods (Half Minimum and Mean) consistently performed poorly. © The Author 2016. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
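
    The kind of comparison described above can be sketched as follows: simulate two correlated biological matrices, censor low values in one of them, impute with two of the seven methods (half-minimum substitution and k-nearest neighbors), and observe how the between-matrix correlation changes. The simulated data and the missingness mechanism are assumptions, not the study's datasets.

    ```python
    import numpy as np
    from sklearn.impute import KNNImputer

    rng = np.random.default_rng(0)
    serum = rng.lognormal(mean=2.0, sigma=0.5, size=(100, 20))
    urine = serum * rng.normal(1.0, 0.2, size=serum.shape)        # correlated second matrix

    def mask_low_values(x, frac=0.3):
        out = x.copy()
        out[x < np.quantile(x, frac)] = np.nan                    # left-censored missingness
        return out

    def half_min_impute(x):
        out = x.copy()
        for j in range(x.shape[1]):
            col = out[:, j]
            col[np.isnan(col)] = np.nanmin(col) / 2.0
        return out

    def mean_corr(a, b):
        return np.mean([np.corrcoef(a[:, j], b[:, j])[0, 1] for j in range(a.shape[1])])

    serum_missing = mask_low_values(serum)
    print("complete data  :", round(mean_corr(serum, urine), 3))
    print("half-min impute:", round(mean_corr(half_min_impute(serum_missing), urine), 3))
    print("kNN impute     :", round(mean_corr(KNNImputer().fit_transform(serum_missing), urine), 3))
    ```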

  4. Querying Large Biological Network Datasets

    ERIC Educational Resources Information Center

    Gulsoy, Gunhan

    2013-01-01

    New experimental methods have resulted in increasing amounts of genetic interaction data being generated every day. Biological networks are used to store the genetic interaction data gathered. The increasing amount of data available requires fast, large-scale analysis methods. Therefore, we address the problem of querying large biological network datasets.…

  5. Analytical method for measuring cosmogenic 35S in natural waters

    DOE PAGES

    Uriostegui, Stephanie H.; Bibby, Richard K.; Esser, Bradley K.; ...

    2015-05-18

    Here, cosmogenic sulfur-35 in water as dissolved sulfate (35SO4) has successfully been used as an intrinsic hydrologic tracer in low-SO4, high-elevation basins. Its application in environmental waters containing high SO4 concentrations has been limited because only small amounts of SO4 can be analyzed using current liquid scintillation counting (LSC) techniques. We present a new analytical method for analyzing large amounts of BaSO4 for 35S. We quantify efficiency gains when suspending BaSO4 precipitate in Insta-Gel Plus cocktail, purify BaSO4 precipitate to remove dissolved organic matter, mitigate interference of radium-226 and its daughter products by selection of high purity barium chloride, and optimize LSC counting parameters for 35S determination in larger masses of BaSO4. Using this improved procedure, we achieved counting efficiencies that are comparable to published LSC techniques despite a 10-fold increase in the SO4 sample load. 35SO4 was successfully measured in high-SO4 surface waters and groundwaters containing low ratios of 35S activity to SO4 mass, demonstrating that this new analytical method expands the analytical range of 35SO4 and broadens the utility of 35SO4 as an intrinsic tracer in hydrologic settings.

  6. Glacier retreat and associated sediment dynamics in proglacial areas: a case study from the Silvretta Alps, Austria

    NASA Astrophysics Data System (ADS)

    Felbauer, Lucia; Pöppl, Ronald

    2016-04-01

    Global warming results in an ongoing retreat of glaciers in the Alps, leaving behind large amounts of easily erodible sediments. In addition, the debuttressing of rock-walls and the decay of permafrost in the high mountain regions facilitate mass movements with potentially disastrous consequences, such as rock falls, landslides and debris flows. Therefore, it is highly important to quantify the amount of sediments that are supplied from the different compartments and to investigate how glacial retreat influences sediment dynamics in proglacial areas. In the presented work glacier retreat and associated sediment dynamics were investigated in the Kromer valley (Silvretta Alps, Austria) by analyzing remote sensing data. Glacial retreat from the period of 1950 to 2012 was documented by interpreting aerial photographs. By digitizing the different stages of the glaciers for six time frames, changes in glacier area and length were mapped and quantified. In order to identify, characterize and quantify sediment dynamics in the proglacial areas, a high resolution DEM of difference (DoD) between 2007 and 2012 was created and analyzed, further differentiating between different zones (e.g. valley bottom, hillslope) and types of geomorphic processes (e.g. fluvial, gravitational). First results will be presented at the EGU General Assembly 2016.

  7. Mesenchymal stromal cells derived from cervical cancer produce high amounts of adenosine to suppress cytotoxic T lymphocyte functions.

    PubMed

    de Lourdes Mora-García, María; García-Rocha, Rosario; Morales-Ramírez, Omar; Montesinos, Juan José; Weiss-Steider, Benny; Hernández-Montes, Jorge; Ávila-Ibarra, Luis Roberto; Don-López, Christian Azucena; Velasco-Velázquez, Marco Antonio; Gutiérrez-Serrano, Vianey; Monroy-García, Alberto

    2016-10-26

    In recent years, immunomodulatory mechanisms of mesenchymal stem/stromal cells (MSCs) from bone marrow and other "classic" sources have been described. However, the phenotypic and functional properties of tumor MSCs are poorly understood. The aim of this study was to analyze the immunosuppressive capacity of cervical cancer-derived MSCs (CeCa-MSCs) on effector T lymphocytes through the purinergic pathway. We determined the expression and functional activity of the membrane-associated ectonucleotidases CD39 and CD73 on CeCa-MSCs and normal cervical tissue-derived MSCs (NCx-MSCs). We also analyzed their immunosuppressive capacity to decrease proliferation, activation and effector cytotoxic T (CD8+) lymphocyte function through the generation of adenosine (Ado). We detected that CeCa-MSCs express higher levels of CD39 and CD73 ectonucleotidases in cell membranes compared to NCx-MSCs, and that this feature was associated with the ability to strongly suppress the proliferation, activation and effector functions of cytotoxic T-cells through the generation of large amounts of Ado from the hydrolysis of ATP, ADP and AMP nucleotides. This study suggests that CeCa-MSCs play an important role in the suppression of the anti-tumor immune response in CeCa through the purinergic pathway.

  8. Primary discussion of a carbon sink in the oceans

    NASA Astrophysics Data System (ADS)

    Ma, Caihua; You, Kui; Ji, Dechun; Ma, Weiwei; Li, Fengqi

    2015-04-01

    As a consequence of global warming and rising sea levels, the oceans are becoming a matter of concern for more and more people because these changes will impact the growth of living organisms as well as people's living standards. In particular, it is extremely important that the oceans absorb massive amounts of carbon dioxide. This paper takes a pragmatic approach to analyzing the oceans with respect to the causes of discontinuities in oceanic variables of carbon dioxide sinks. We report on an application of chemical, physical and biological methods to analyze the changes of carbon dioxide in oceans. Based on the relationships among the oceans, land, atmosphere and sediment with respect to carbon dioxide, the foundation of carbon dioxide in shell-building and ocean acidification, the changes in carbon dioxide in the oceans and their impact on climate change, and so on, a vital conclusion can be drawn from this study. Specifically, under the condition that the oceans are not disturbed by external forces, the oceans are a large carbon dioxide sink. The result can also be inferred by the formula: C=A-B and G=E+F when the marine ecosystem can keep a natural balance and the amount of carbon dioxide emission is limited within the carrying capacity of the oceans.

  9. Sequence of Centromere Separation: Role of Centromeric Heterochromatin

    PubMed Central

    Vig, Baldev K.

    1982-01-01

    The late metaphase-early anaphase cells from various tissues of male Mus musculus, M. poschiavinus, M. spretus, M. castaneus, female and male Bos taurus (cattle) and female Myopus schisticolor (wood lemming) were analyzed for centromeres that showed separation into two daughter centromeres and those that did not show such separation. In all strains and species of mouse the Y chromosome is the first one to separate, as is the X or Y in the cattle. These sex chromosomes are devoid of constitutive heterochromatin, whereas all autosomes in these species carry detectable quantities. In cattle, the late replicating X chromosome appears to separate later than the active X. In the wood lemming the three pairs of autosomes with the least amount of centromeric constitutive heterochromatin separate first. These are followed by the separation of seven pairs of autosomes carrying medium amounts of constitutive heterochromatin. Five pairs of autosomes with the largest amounts of constitutive heterochromatin are the last in the sequence of separation. The sex chromosomes with medium amounts of constitutive heterochromatin around the centromere, and a very large amount of distal heterochromatin, separate among the very late ones but are not the last. These observations assign a specific role to centromeric constitutive heterochromatin and also indicate that nonproximal heterochromatin does not exert control over the sequence in which the centromeres in the genome separate. It appears that qualitative differences among various types of constitutive heterochromatin are as important as quantitative differences in controlling the separation of centromeres. PMID:6764903

  10. Handling qualities of large flexible control-configured aircraft

    NASA Technical Reports Server (NTRS)

    Swaim, R. L.

    1980-01-01

    The effects on handling qualities of low frequency symmetric elastic mode interaction with the rigid body dynamics of a large flexible aircraft was analyzed by use of a mathematical pilot modeling computer simulation. An extension of the optimal control model for a human pilot was made so that the mode interaction effects on the pilot's control task could be assessed. Pilot ratings were determined for a longitudinal tracking task with parametric variations in the undamped natural frequencies of the two lowest frequency symmetric elastic modes made to induce varying amounts of mode interaction. Relating numerical performance index values associated with the frequency variations used in several dynamic cases, to a numerical Cooper-Harper pilot rating has proved successful in discriminating when the mathematical pilot can or cannot separate rigid from elastic response in the tracking task.

  11. DebugIT for patient safety - improving the treatment with antibiotics through multimedia data mining of heterogeneous clinical data.

    PubMed

    Lovis, Christian; Colaert, Dirk; Stroetmann, Veli N

    2008-01-01

    The concepts and architecture underlying a large-scale integrating project funded within the 7th EU Framework Programme (FP7) are discussed. The main objective of the project is to build a tool that will have a significant impact on the monitoring and control of infectious diseases and antimicrobial resistances in Europe. This will be realized by building a technical and semantic infrastructure able to share heterogeneous clinical data sets from different hospitals in different countries, with different languages and legislations; to analyze large amounts of this clinical data with advanced multimedia data mining; and finally to apply the obtained knowledge to clinical decisions and outcome monitoring. There are numerous challenges in this project at all levels (technical, semantic, legal and ethical) that will have to be addressed.

  12. Fine structure of spectral properties for random correlation matrices: An application to financial markets

    NASA Astrophysics Data System (ADS)

    Livan, Giacomo; Alfarano, Simone; Scalas, Enrico

    2011-07-01

    We study some properties of eigenvalue spectra of financial correlation matrices. In particular, we investigate the nature of the large eigenvalue bulks which are observed empirically, and which have often been regarded as a consequence of the supposedly large amount of noise contained in financial data. We challenge this common knowledge by acting on the empirical correlation matrices of two data sets with a filtering procedure which highlights some of the cluster structure they contain, and we analyze the consequences of such filtering on eigenvalue spectra. We show that empirically observed eigenvalue bulks emerge as superpositions of smaller structures, which in turn emerge as a consequence of cross correlations between stocks. We interpret and corroborate these findings in terms of factor models, and we compare empirical spectra to those predicted by random matrix theory for such models.
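
    A short sketch of the kind of check discussed above: compute the eigenvalue spectrum of an empirical correlation matrix of stock returns and compare it with the Marchenko-Pastur bulk predicted by random matrix theory for pure noise. The synthetic one-factor returns below are an assumption used only to make the example self-contained.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    T, N = 1000, 100                                   # observations, stocks
    market = rng.normal(size=(T, 1))                   # common "market" factor
    returns = 0.3 * market + rng.normal(size=(T, N))   # toy one-factor returns

    corr = np.corrcoef(returns, rowvar=False)
    eigvals = np.linalg.eigvalsh(corr)

    q = N / T
    lam_min, lam_max = (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2
    outside_bulk = eigvals[(eigvals < lam_min) | (eigvals > lam_max)]
    print("Marchenko-Pastur bulk edges:", round(lam_min, 3), round(lam_max, 3))
    print("eigenvalues outside the noise bulk:", np.round(outside_bulk, 2))
    ```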

  13. Optimized Laplacian image sharpening algorithm based on graphic processing unit

    NASA Astrophysics Data System (ADS)

    Ma, Tinghuai; Li, Lu; Ji, Sai; Wang, Xin; Tian, Yuan; Al-Dhelaan, Abdullah; Al-Rodhaan, Mznah

    2014-12-01

    In classical Laplacian image sharpening, all pixels are processed one by one, which leads to a large amount of computation. Traditional Laplacian sharpening processed on the CPU is considerably time-consuming, especially for large pictures. In this paper, we propose a parallel implementation of Laplacian sharpening based on the Compute Unified Device Architecture (CUDA), a computing platform for Graphics Processing Units (GPUs), and analyze the impact of picture size on performance as well as the relationship between data transfer time and parallel computing time. Further, according to the different features of the different memories, an improved scheme of our method is developed, which exploits shared memory on the GPU instead of global memory and further increases efficiency. Experimental results show that the two novel algorithms outperform the traditional sequential method based on OpenCV in terms of computing speed.
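
    As a point of reference for the operation being parallelized, a CPU sketch of Laplacian sharpening is shown below; the kernel and the strength parameter are assumptions, and the CUDA version would simply map one output pixel to one GPU thread (with the shared-memory variant caching each image tile).

    ```python
    import numpy as np
    from scipy.ndimage import convolve

    def laplacian_sharpen(image: np.ndarray, strength: float = 1.0) -> np.ndarray:
        """Sharpen by adding the response of a negated 4-neighbour Laplacian kernel."""
        kernel = np.array([[0, -1, 0],
                           [-1, 4, -1],
                           [0, -1, 0]], dtype=float)
        response = convolve(image.astype(float), kernel, mode="reflect")
        sharpened = image + strength * response
        return np.clip(sharpened, 0, 255).astype(np.uint8)
    ```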

  14. A Primer on Infectious Disease Bacterial Genomics

    PubMed Central

    Petkau, Aaron; Knox, Natalie; Graham, Morag; Van Domselaar, Gary

    2016-01-01

    SUMMARY The number of large-scale genomics projects is increasing due to the availability of affordable high-throughput sequencing (HTS) technologies. The use of HTS for bacterial infectious disease research is attractive because one whole-genome sequencing (WGS) run can replace multiple assays for bacterial typing, molecular epidemiology investigations, and more in-depth pathogenomic studies. The computational resources and bioinformatics expertise required to accommodate and analyze the large amounts of data pose new challenges for researchers embarking on genomics projects for the first time. Here, we present a comprehensive overview of a bacterial genomics project from beginning to end, with a particular focus on the planning and computational requirements for HTS data, and provide a general understanding of the analytical concepts to develop a workflow that will meet the objectives and goals of HTS projects. PMID:28590251

  15. Compressive Network Analysis

    PubMed Central

    Jiang, Xiaoye; Yao, Yuan; Liu, Han; Guibas, Leonidas

    2014-01-01

    Modern data acquisition routinely produces massive amounts of network data. Though many methods and models have been proposed to analyze such data, the research of network data is largely disconnected from the classical theory of statistical learning and signal processing. In this paper, we present a new framework for modeling network data, which connects two seemingly different areas: network data analysis and compressed sensing. From a nonparametric perspective, we model an observed network using a large dictionary. In particular, we consider the network clique detection problem and show connections between our formulation and a new algebraic tool, namely Radon basis pursuit in homogeneous spaces. Such a connection allows us to identify rigorous recovery conditions for clique detection problems. Though this paper is mainly conceptual, we also develop practical approximation algorithms for solving empirical problems and demonstrate their usefulness on real-world datasets. PMID:25620806

  16. Challenges dealing with depleted uranium in Germany - Reuse or disposal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moeller, Kai D.

    2007-07-01

    During enrichment large amounts of depleted Uranium are produced. In Germany every year 2.800 tons of depleted uranium are generated. In Germany depleted uranium is not classified as radioactive waste but a resource for further enrichment. Therefore since 1996 depleted Uranium is sent to ROSATOM in Russia. However it still has to be dealt with the second generation of depleted Uranium. To evaluate the alternative actions in case a solution has to be found in Germany, several studies have been initiated by the Federal Ministry of the Environment. The work that has been carried out evaluated various possibilities to deal with depleted uranium. The international studies on this field and the situation in Germany have been analyzed. In case no further enrichment is planned the depleted uranium has to be stored. In the enrichment process UF6 is generated. It is an international consensus that for storage it should be converted to U3O8. The necessary technique is well established. If the depleted Uranium would have to be characterized as radioactive waste, a final disposal would become necessary. For the planned Konrad repository - a repository for non heat generating radioactive waste - the amount of Uranium is limited by the licensing authority. The existing license would not allow the final disposal of large amounts of depleted Uranium in the Konrad repository. The potential effect on the safety case has not been roughly analyzed. As a result it may be necessary to think about alternatives. Several possibilities for the use of depleted uranium in the industry have been identified. Studies indicate that the properties of Uranium would make it useful in some industrial fields. Nevertheless many practical and legal questions are open. One further option may be the use as shielding e.g. in casks for transport or disposal. Possible techniques for using depleted Uranium as shielding are the use of the metallic Uranium as well as the inclusion in concrete. Another possibility could be the use of depleted uranium for the blending of High enriched Uranium (HEU) or with Plutonium to MOX-elements. (authors)

  17. Technologies and Concepts for Reducing the Fuel Burn of Subsonic Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Nickol, Craig L.

    2012-01-01

    There are many technologies under development that have the potential to enable large fuel burn reductions in the 2025 timeframe for subsonic transport aircraft relative to the current fleet. This paper identifies a potential technology suite and analyzes the fuel burn reduction potential of these technologies when integrated into advanced subsonic transport concepts. Advanced tube-and-wing concepts are developed in the single aisle and large twin aisle class, and a hybrid-wing-body concept is developed for the large twin aisle class. The resulting fuel burn reductions for the advanced tube-and-wing concepts range from a 42% reduction relative to the 777-200 to a 44% reduction relative to the 737-800. In addition, the hybrid-wing-body design resulted in a 47% fuel burn reduction relative to the 777-200. Of course, to achieve these fuel burn reduction levels, a significant amount of technology and concept maturation is required between now and 2025. A methodology for capturing and tracking concept maturity is also developed and presented in this paper.

  18. Improving Service Management in the Internet of Things

    PubMed Central

    Sammarco, Chiara; Iera, Antonio

    2012-01-01

    In the Internet of Things (IoT) research arena, many efforts are devoted to adapt the existing IP standards to emerging IoT nodes. This is the direction followed by three Internet Engineering Task Force (IETF) Working Groups, which paved the way for research on IP-based constrained networks. Through a simplification of the whole TCP/IP stack, resource constrained nodes become direct interlocutors of application level entities in every point of the network. In this paper we analyze some side effects of this solution, when in the presence of large amounts of data to transmit. In particular, we conduct a performance analysis of the Constrained Application Protocol (CoAP), a widely accepted web transfer protocol for the Internet of Things, and propose a service management enhancement that improves the exploitation of the network and node resources. This is specifically thought for constrained nodes in the abovementioned conditions and proves to be able to significantly improve the node energetic performance when in the presence of large resource representations (hence, large data transmissions).

  19. Clustering analysis of line indices for LAMOST spectra with AstroStat

    NASA Astrophysics Data System (ADS)

    Chen, Shu-Xin; Sun, Wei-Min; Yan, Qi

    2018-06-01

    The application of data mining in astronomical surveys, such as the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST) survey, provides an effective approach to automatically analyze a large amount of complex survey data. Unsupervised clustering could help astronomers find the associations and outliers in a big data set. In this paper, we employ the k-means method to perform clustering for the line index of LAMOST spectra with the powerful software AstroStat. Implementing the line index approach for analyzing astronomical spectra is an effective way to extract spectral features for low resolution spectra, which can represent the main spectral characteristics of stars. A total of 144 340 line indices for A type stars are analyzed by calculating their intra- and inter-distances between pairs of stars. For intra distance, we use the definition of Mahalanobis distance to explore the degree of clustering for each class, while for outlier detection, we define a local outlier factor for each spectrum. AstroStat furnishes a set of visualization tools for illustrating the analysis results. Checking the spectra detected as outliers, we find that most of them are problematic data and only a few correspond to rare astronomical objects. We show two examples of these outliers: a spectrum with an abnormal continuum and a spectrum with emission lines. Our work demonstrates that line index clustering is a good method for examining data quality and identifying rare objects.
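
    A hedged sketch of the clustering-and-outlier workflow described above: k-means on the line-index vectors, a Mahalanobis-style intra-cluster distance to gauge the tightness of each class, and a local outlier factor to flag unusual spectra. The number of clusters, the neighbourhood size, and the use of scikit-learn's LocalOutlierFactor are assumptions rather than AstroStat's internals.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.neighbors import LocalOutlierFactor

    def cluster_and_flag(line_indices: np.ndarray, k: int = 5):
        km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(line_indices)
        intra = np.empty(len(line_indices))
        for c in range(k):                                # Mahalanobis distance to own cluster centre
            members = line_indices[km.labels_ == c]
            cov_inv = np.linalg.pinv(np.cov(members, rowvar=False))
            d = members - km.cluster_centers_[c]
            intra[km.labels_ == c] = np.sqrt(np.einsum("ij,jk,ik->i", d, cov_inv, d))
        lof = LocalOutlierFactor(n_neighbors=20).fit_predict(line_indices)   # -1 marks outliers
        return km.labels_, intra, np.where(lof == -1)[0]
    ```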

  20. Large repayments of premium subsidies may be owed to the IRS if family income changes are not promptly reported.

    PubMed

    Jacobs, Ken; Graham-Squire, Dave; Gould, Elise; Roby, Dylan

    2013-09-01

    Subsidies for health insurance premiums under the Affordable Care Act are refundable tax credits. They can be taken when taxes are filed or in advance, as reductions in monthly premiums that must be reconciled at tax filing. Recipients who take subsidies in advance will receive tax refunds if their subsidies were too small but will have to make repayments if their subsidies were too high. We analyzed predicted repayments and refunds for people receiving subsidies, using California as a case study. We found that many families could owe large repayments to the Internal Revenue Service at their next tax filing. If income changes were reported and credits adjusted in a timely manner throughout the tax year, the number of filers owing repayments would be reduced by 7-41 percent and the median size of repayments reduced by as much as 61 percent (depending on the level of changes reported and the method used to adjust the subsidy amounts). We recommend that the health insurance exchanges mandated by the Affordable Care Act educate consumers about how the subsidies work and the need to promptly report income changes. We also recommend that they provide tools and assistance to determine the amount of subsidies that enrollees should take in advance.

  1. Test Data Monitor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosas, Joseph

    The National Security Campus (NSC) collects a large amount of test data used to accept high value and high rigor product. The data has been used historically to support root cause analysis when anomalies are detected in down-stream processes. The opportunity to use the data for predictive failure analysis, however, had never been exploited. The primary goal of the Test Data Monitor (TDM) software is to provide automated capabilities to analyze data in near-real-time and report trends that foreshadow actual product failures. To date, the aerospace industry as a whole is challenged at utilizing collected data to the degree that modern technology allows. As a result of the innovation behind TDM, Honeywell is able to monitor millions of data points through a multitude of SPC algorithms continuously and autonomously so that our personnel resources can more efficiently and accurately direct their attention to suspect processes or features. TDM's capabilities have been recognized by our U.S. Department of Energy National Nuclear Security Administration (NNSA) sponsor for potential use at other sites within the NNSA. This activity supports multiple initiatives including expectations of the NNSA and broader corporate goals that center around data-based quality controls on production.

  2. A tool for optimization of the production and user analysis on the Grid, C. Grigoras for the ALICE Collaboration

    NASA Astrophysics Data System (ADS)

    Grigoras, Costin; Carminati, Federico; Vladimirovna Datskova, Olga; Schreiner, Steffen; Lee, Sehoon; Zhu, Jianlin; Gheata, Mihaela; Gheata, Andrei; Saiz, Pablo; Betev, Latchezar; Furano, Fabrizio; Mendez Lorenzo, Patricia; Grigoras, Alina Gabriela; Bagnasco, Stefano; Peters, Andreas Joachim; Saiz Santos, Maria Dolores

    2011-12-01

    With the LHC and ALICE entering full operation and production modes, the amount of simulation and RAW data processing and end-user analysis tasks is increasing. The efficient management of all these tasks, which differ greatly in lifecycle, amount of processed data, and methods of analyzing the end result, required the development and deployment of new tools in addition to the already existing Grid infrastructure. To facilitate the management of the large-scale simulation and raw data reconstruction tasks, ALICE has developed a production framework called the Lightweight Production Manager (LPM). LPM automatically submits jobs to the Grid based on triggers and conditions, for example after completion of a physics run. It follows the evolution of each job and publishes the results on the web for worldwide access by ALICE physicists. This framework is tightly integrated with the ALICE Grid framework AliEn. In addition to publishing job status, LPM also provides a fully authenticated interface to the AliEn Grid catalogue for browsing and downloading files, and in the near future will provide simple types of data analysis through ROOT plugins. The framework is also being extended to allow management of end-user jobs.

  3. Predicting vehicle fuel consumption patterns using floating vehicle data.

    PubMed

    Du, Yiman; Wu, Jianping; Yang, Senyan; Zhou, Liutong

    2017-09-01

    The status of energy consumption and air pollution in China is serious. It is important to analyze and predict the fuel consumption of different types of vehicles under different influencing factors. In order to fully describe the relationship between fuel consumption and its impact factors, massive amounts of floating vehicle data were used. The fuel consumption and congestion patterns were explored from large samples of historical floating vehicle data, drivers' information and vehicle parameters across different group classifications were examined, and the average velocity and average fuel consumption were analyzed in the temporal and spatial dimensions, respectively. The fuel consumption forecasting model was established using a back-propagation neural network. Part of the sample set was used to train the forecasting model and the remaining part was used as input to the forecasting model. Copyright © 2017. Published by Elsevier B.V.
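
    A back-propagation neural network trained on driving features, as the record describes, can be sketched with scikit-learn's MLPRegressor standing in for the authors' model. The feature set, the synthetic fuel-consumption relation, and the train/test split below are illustrative assumptions only.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Synthetic stand-in for floating-vehicle records: average speed (km/h),
    # hour of day, and a congestion index; the target is fuel use (L/100 km).
    rng = np.random.default_rng(1)
    n = 2000
    speed = rng.uniform(5, 90, n)
    hour = rng.integers(0, 24, n)
    congestion = rng.uniform(0, 1, n)
    fuel = 4 + 300 / (speed + 10) + 3 * congestion + rng.normal(0, 0.3, n)

    X = np.column_stack([speed, hour, congestion])
    X_train, X_test, y_train, y_test = train_test_split(X, fuel, test_size=0.3, random_state=0)

    # Back-propagation-trained multilayer perceptron with standardized inputs.
    model = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0),
    )
    model.fit(X_train, y_train)
    print(f"held-out R^2: {model.score(X_test, y_test):.3f}")
    ```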

  4. Panretinal, high-resolution color photography of the mouse fundus.

    PubMed

    Paques, Michel; Guyomard, Jean-Laurent; Simonutti, Manuel; Roux, Michel J; Picaud, Serge; Legargasson, Jean-François; Sahel, José-Alain

    2007-06-01

    To analyze high-resolution color photographs of the mouse fundus. A contact fundus camera based on topical endoscopy fundus imaging (TEFI) was built. Fundus photographs of C57 and Balb/c mice obtained by TEFI were qualitatively analyzed. High-resolution digital imaging of the fundus, including the ciliary body, was routinely obtained. The reflectance and contrast of retinal vessels varied significantly with the amount of incident and reflected light and, thus, with the degree of fundus pigmentation. The combination of chromatic and spherical aberration favored blue light imaging, in terms of both field and contrast. TEFI is a small, low-cost system that allows high-resolution color fundus imaging and fluorescein angiography in conscious mice. Panretinal imaging is facilitated by the presence of the large rounded lens. TEFI significantly improves the quality of in vivo photography of the retina and ciliary processes of mice. Resolution is, however, affected by chromatic aberration, and should be improved by monochromatic imaging.

  5. Ghosts in the machine: publication planning in the medical sciences.

    PubMed

    Sismondo, Sergio

    2009-04-01

    Publication of pharmaceutical company-sponsored research in medical journals, and its presentation at conferences and meetings, is mostly governed by 'publication plans' that extract the maximum amount of scientific and commercial value out of data and analyses through carefully constructed and placed papers. Clinical research is typically performed by contract research organizations, analyzed by company statisticians, written up by independent medical writers, approved and edited by academic researchers who then serve as authors, and the whole process organized and shepherded through to journal publication by publication planners. This paper reports on a conference of an international association of publication planners. It describes and analyzes their work in an ecological framework that relates it to marketing departments of pharmaceutical companies, medical journals and publishers, academic authors, and potential audiences. The medical research described here forms a new kind of corporate science, designed to look like traditional academic work, but performed largely to market products.

  6. Antioxidant White Grape Seed Phenolics: Pressurized Liquid Extracts from Different Varieties

    PubMed Central

    Garcia-Jares, Carmen; Vazquez, Alberto; Lamas, Juan P.; Pajaro, Marta; Alvarez-Casas, Marta; Lores, Marta

    2015-01-01

    Grape seeds represent a high percentage (20% to 26%) of the grape marc obtained as a byproduct from white winemaking and retain a large proportion of grape polyphenols. In this study, seeds obtained from 11 monovarietal white grape marcs cultivated in Northwestern Spain have been analyzed in order to characterize their polyphenolic content and antioxidant activity. Seeds of native (Albariño, Caiño, Godello, Loureiro, Torrontés, and Treixadura) and non-native (Chardonnay, Gewürztraminer, Pinot blanc, Pinot gris, and Riesling) grape varieties have been considered. Low weight phenolics have been extracted by means of pressurized liquid extraction (PLE) and further analyzed by LC-MS/MS. The results showed that PLE extracts, whatever the grape variety of origin, contained large amounts of polyphenols and high antioxidant activity. Differences in the varietal polyphenolic profiles were found, so a selective exploitation of seeds might be possible. PMID:26783956

  7. RE-PLAN: An Extensible Software Architecture to Facilitate Disaster Response Planning

    PubMed Central

    O’Neill, Martin; Mikler, Armin R.; Indrakanti, Saratchandra; Tiwari, Chetan; Jimenez, Tamara

    2014-01-01

    Computational tools are needed to make data-driven disaster mitigation planning accessible to planners and policymakers without the need for programming or GIS expertise. To address this problem, we have created modules to facilitate quantitative analyses pertinent to a variety of different disaster scenarios. These modules, which comprise the REsponse PLan ANalyzer (RE-PLAN) framework, may be used to create tools for specific disaster scenarios that allow planners to harness large amounts of disparate data and execute computational models through a point-and-click interface. Bio-E, a user-friendly tool built using this framework, was designed to develop and analyze the feasibility of ad hoc clinics for treating populations following a biological emergency event. In this article, the design and implementation of the RE-PLAN framework are described, and the functionality of the modules used in the Bio-E biological emergency mitigation tool are demonstrated. PMID:25419503

  8. Structural interaction fingerprints: a new approach to organizing, mining, analyzing, and designing protein-small molecule complexes.

    PubMed

    Singh, Juswinder; Deng, Zhan; Narale, Gaurav; Chuaqui, Claudio

    2006-01-01

    The combination of advances in structure-based drug design efforts in the pharmaceutical industry in parallel with structural genomics initiatives in the public domain has led to an explosion in the number of protein-small molecule complex structures. This information is of critical importance both to understanding the structural basis for molecular recognition in biological systems and to the design of better drugs. A significant challenge exists in managing this vast amount of data and fully leveraging it. Here, we review our work to develop a simple, fast way to store, organize, mine, and analyze large numbers of protein-small molecule complexes. We illustrate the utility of the approach with the management of inhibitor complexes from the protein kinase family. Finally, we describe our recent efforts in applying this method to the design of target-focused chemical libraries.
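
    One minimal way to picture a structural interaction fingerprint is as a bit vector over (binding-site residue, interaction type) pairs that can be compared with Tanimoto similarity, as sketched below. The residues, interaction types, and encoding are invented for illustration and do not reproduce the authors' exact fingerprint scheme.

    ```python
    import numpy as np

    # Toy interaction fingerprints: one bit per (binding-site residue, interaction type).
    residues = ["GLY13", "LYS33", "GLU51", "MET88", "ASP145"]
    interactions = ["contact", "hbond"]

    def fingerprint(observed):
        """observed: set of (residue, interaction) pairs seen in one complex."""
        bits = [(r, i) in observed for r in residues for i in interactions]
        return np.array(bits, dtype=bool)

    fp_a = fingerprint({("GLY13", "contact"), ("LYS33", "hbond"), ("MET88", "contact")})
    fp_b = fingerprint({("GLY13", "contact"), ("LYS33", "hbond"), ("ASP145", "hbond")})

    # Tanimoto similarity: shared on-bits over total on-bits, a common way to
    # mine and cluster large collections of such fingerprints.
    tanimoto = (fp_a & fp_b).sum() / (fp_a | fp_b).sum()
    print(f"Tanimoto similarity: {tanimoto:.2f}")
    ```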

  9. When the Sky Falls NASA's Response to Bright Bolide Events Over Continental USA

    NASA Technical Reports Server (NTRS)

    Blaauw, R. C.; Cooke, W. J.; Kingery, A. M.; Moser, D. E.

    2015-01-01

    Being the only U.S. Government entity charged with monitoring the meteor environment, the Meteoroid Environment Office (MEO) has deployed a network of all-sky and wide-field meteor cameras, along with the appropriate software tools to quickly analyze data from these systems. However, the coverage of this network is still quite limited, forcing the incorporation of data from other cameras posted to the internet in analyzing many of the fireballs reported by the public and media. Information on these bright events often needs to be reported to NASA Headquarters by noon the following day; thus a procedure has been developed that determines the analysis process for a given fireball event based on the types and amount of data available. The differences between these analysis processes are shown by looking at four meteor events that the MEO responded to, all of which were large enough to produce meteorites.

  10. Strengthening Mechanisms and Their Relative Contributions to the Yield Strength of Microalloyed Steels

    NASA Astrophysics Data System (ADS)

    Lu, Junfang; Omotoso, Oladipo; Wiskel, J. Barry; Ivey, Douglas G.; Henein, Hani

    2012-09-01

    Microalloyed steels are used widely in oil and gas pipelines. They are a class of high-strength, low-carbon steels that contain small additions (in amounts less than 0.1 wt pct) of Nb, Ti, and/or V. The steels may contain other alloying elements, such as Mo, in amounts exceeding 0.1 wt pct. Precipitation in these steels can be controlled through thermomechanical-controlled processing, leading to precipitates with sizes that range from several microns to a few nanometers. Microalloyed steels have good strength, good toughness, and excellent weldability, which are attributed in part to the presence of the nanosized carbide and carbonitride precipitates. Because of their fine sizes, wide particle size distribution, and low volume fractions, conventional microscopic methods are not satisfactory for quantifying these precipitates. Matrix dissolution is a promising alternative to extract the precipitates for quantification. Relatively large volumes of material can be analyzed so that statistically significant quantities of precipitates of different sizes are collected. In this article, the microstructure features of a series of microalloyed steels (X70, X80, and X100) as well as a Grade 100 steel are characterized using optical microscopy (OM) and scanning electron microscopy (SEM). A chemical dissolution technique is used to extract the precipitates from the steels. Transmission electron microscopy (TEM) and X-ray diffraction (XRD) are combined to analyze the chemical composition of these precipitates. Rietveld refinement of the XRD patterns is used to quantify fully the relative amounts of these precipitates. The size distribution of the nanosized precipitates is quantified using dark-field imaging (DF) in the TEM. The effects of microalloying content, finish rolling temperature (FRT), and coiling temperature (CT)/interrupted cooling temperature (ICT) on the grain size and the amount of nanoprecipitation are discussed. Individual strengthening contributions from grain size effects, solid-solution strengthening, and precipitation strengthening are quantified to understand fully the strengthening mechanisms for these steels.

  11. Onset of thermally induced gas convection in mine wastes

    USGS Publications Warehouse

    Lu, N.; Zhang, Y.

    1997-01-01

    A mine waste dump in which active oxidation of pyritic materials occurs can generate a large amount of heat to form convection cells. We analyze the onset of thermal convection in a two-dimensional, infinite horizontal layer of waste rock filled with moist gas, with the top surface of the waste dump open to the atmosphere and the bedrock beneath the waste dump forming a horizontal and impermeable boundary. Our analysis shows that the thermal regime of a waste rock system depends heavily on the atmospheric temperature, the strength of the heat source and the vapor pressure. © 1997 Elsevier Science Ltd. All rights reserved.

  12. Protein sequence comparison based on K-string dictionary.

    PubMed

    Yu, Chenglong; He, Rong L; Yau, Stephen S-T

    2013-10-25

    The current K-string-based protein sequence comparisons require large amounts of computer memory because the dimension of the protein vector representation grows exponentially with K. In this paper, we propose a novel concept, the "K-string dictionary", to solve this high-dimensional problem. It allows us to use a much lower dimensional K-string-based frequency or probability vector to represent a protein, and thus significantly reduce the computer memory requirements for their implementation. Furthermore, based on this new concept, we use Singular Value Decomposition to analyze real protein datasets, and the improved protein vector representation allows us to obtain accurate gene trees. © 2013.
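
    The K-string dictionary idea can be sketched in a few lines: collect only the K-strings that actually occur across the sequences, build frequency vectors over that shared dictionary, and compress them with SVD. The toy sequences below are assumptions; real proteins and larger K give much bigger, but still sparse, dictionaries.

    ```python
    import numpy as np

    K = 3
    seqs = ["MKVLAAGLLK", "MKVLSAGLLR", "GGHHEEKKLL"]   # toy protein sequences

    # Dictionary of K-strings that actually occur (this is what keeps it small).
    dictionary = sorted({s[i:i + K] for s in seqs for i in range(len(s) - K + 1)})
    index = {kmer: j for j, kmer in enumerate(dictionary)}

    # Frequency vector per sequence over the shared dictionary.
    M = np.zeros((len(seqs), len(dictionary)))
    for row, s in enumerate(seqs):
        for i in range(len(s) - K + 1):
            M[row, index[s[i:i + K]]] += 1
    M /= M.sum(axis=1, keepdims=True)

    # SVD gives a compact embedding that can feed distance-based tree building.
    U, S, Vt = np.linalg.svd(M, full_matrices=False)
    embedding = U * S
    print("dictionary size:", len(dictionary))
    print("pairwise distances:\n",
          np.linalg.norm(embedding[:, None] - embedding[None, :], axis=-1))
    ```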

  13. The use of inexpensive computer-based scanning survey technology to perform medical practice satisfaction surveys.

    PubMed

    Shumaker, L; Fetterolf, D E; Suhrie, J

    1998-01-01

    The recent availability of inexpensive document scanners and optical character recognition technology has created the ability to process surveys in large numbers with a minimum of operator time. Programs, which allow computer entry of such scanned questionnaire results directly into PC based relational databases, have further made it possible to quickly collect and analyze significant amounts of information. We have created an internal capability to easily generate survey data and conduct surveillance across a number of medical practice sites within a managed care/practice management organization. Patient satisfaction surveys, referring physician surveys and a variety of other evidence gathering tools have been deployed.

  14. ASCOT: A Collaborative Platform for the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Marcos, D.; Connolly, A. J.; Krughoff, K. S.; Smith, I.; Wallace, S. C.

    2012-09-01

    Digital networks are changing the way that knowledge is created, structured, curated, consumed, archived and referenced. Projects like Wikipedia, GitHub or Galaxy Zoo have shown the potential of online communities to develop and communicate ideas. ASCOT is a web-based framework that facilitates collaboration among astronomers, providing a simple way to share, explore, interact with and analyze large amounts of data from a broad range of sources available through the Virtual Observatories (VO). Designed with a strong emphasis on usability, ASCOT takes advantage of the latest generation of web standards and cloud technologies to implement an extendable and customizable stack of web tools and services.

  15. A FORTRAN program for determining aircraft stability and control derivatives from flight data

    NASA Technical Reports Server (NTRS)

    Maine, R. E.; Iliff, K. W.

    1975-01-01

    A digital computer program written in FORTRAN IV for the estimation of aircraft stability and control derivatives is presented. The program uses a maximum likelihood estimation method, and two associated programs for routine, related data handling are also included. The three programs form a package that can be used by relatively inexperienced personnel to process large amounts of data with a minimum of manpower. This package was used to successfully analyze 1500 maneuvers on 20 aircraft, and is designed to be used without modification on as many types of computers as feasible. Program listings and sample check cases are included.

  16. Publicly Releasing a Large Simulation Dataset with NDS Labs

    NASA Astrophysics Data System (ADS)

    Goldbaum, Nathan

    2016-03-01

    Optimally, all publicly funded research should be accompanied by the tools, code, and data necessary to fully reproduce the analysis performed in journal articles describing the research. This ideal can be difficult to attain, particularly when dealing with large (>10 TB) simulation datasets. In this lightning talk, we describe the process of publicly releasing a large simulation dataset to accompany the submission of a journal article. The simulation was performed using Enzo, an open source, community-developed N-body/hydrodynamics code and was analyzed using a wide range of community-developed tools in the scientific Python ecosystem. Although the simulation was performed and analyzed using an ecosystem of sustainably developed tools, we enable sustainable science using our data by making it publicly available. Combining the data release with the NDS Labs infrastructure allows a substantial amount of added value, including web-based access to analysis and visualization using the yt analysis package through an IPython notebook interface. In addition, we are able to accompany the paper submission to the arXiv preprint server with links to the raw simulation data as well as interactive real-time data visualizations that readers can explore on their own or share with colleagues during journal club discussions. It is our hope that the value added by these services will substantially increase the impact and readership of the paper.

  17. Local Cloudiness Development Forecast Based on Simulation of Solid Phase Formation Processes in the Atmosphere

    NASA Astrophysics Data System (ADS)

    Barodka, Siarhei; Kliutko, Yauhenia; Krasouski, Alexander; Papko, Iryna; Svetashev, Alexander; Turishev, Leonid

    2013-04-01

    Numerical simulation of thundercloud formation processes is currently of great practical interest. Thunderclouds significantly affect airplane flights, and mesoscale weather forecasting has much to contribute to aviation forecast procedures. An accurate forecast can certainly help to avoid aviation accidents due to weather conditions. The present study focuses on modelling convective cloud development and detecting thunderclouds on the basis of mesoscale atmospheric process simulation, aiming to significantly improve aeronautical forecasts. In the analysis, primary weather radar information has been used and further adapted for mesoscale forecast systems. Two types of domains have been selected for modelling: an internal one (with a radius of 8 km) and an external one (with a radius of 300 km). The internal domain has been applied directly to study local cloud development, and the external domain data have been treated as initial and final conditions for cloud cover formation. The domain height has been chosen according to the civil aviation forecast data (i.e. not exceeding 14 km). Simulations of weather conditions and local cloud development have been made within the selected domains with the WRF modelling system. In several cases, thunderclouds are detected within the convective clouds. To specify this category of clouds, we employ a simulation technique for solid phase formation processes in the atmosphere. Based on the modelling results, we construct vertical profiles indicating the amount of solid phase in the atmosphere. Furthermore, we obtain profiles showing the amounts of ice particles and large particles (hailstones). While simulating the processes of solid phase formation, we investigate vertical and horizontal air flows. Consequently, we attempt to separate the total amount of solid phase into categories of small ice particles, large ice particles and hailstones. Also, we strive to reveal and differentiate the basic atmospheric parameters of the sublimation and coagulation processes, aiming to predict ice particle precipitation. To analyze the modelling results we apply the VAPOR three-dimensional visualization package. For the chosen domains, a diurnal synoptic situation has been simulated, including rain, sleet, ice pellets, and hail. As a result, we have obtained a large scope of data describing various atmospheric parameters: cloud cover, major wind components, basic levels of isobaric surfaces, and precipitation rate. Based on these data, we show how precipitation formation differs with height and how the ice particles are differentiated. The relation between a particle's rise in the atmosphere and its size is analyzed: at 8-10 km altitude, large ice particles resulting from coagulation dominate, while at 6-7 km altitude one finds snow and small ice particles formed by condensation growth. Also, mechanical trajectories of solid precipitation particles for various ice formation processes have been calculated.

  18. Hydrophobic kenaf nanocrystalline cellulose for the binding of curcumin.

    PubMed

    Zainuddin, Norhidayu; Ahmad, Ishak; Kargarzadeh, Hanieh; Ramli, Suria

    2017-05-01

    Nanocrystalline cellulose (NCC) extracted from lignocellulosic materials has been actively investigated as a drug delivery excipient due to its large surface area, high aspect ratio, and biodegradability. In this study, hydrophobically modified NCC was used as a drug delivery excipient for the hydrophobic drug curcumin. Modification of NCC with a cationic surfactant, cetyl trimethylammonium bromide (CTAB), was used to modulate the loading of hydrophobic drugs that would not normally bind to NCC. FTIR, elemental analysis, XRD, TGA, and TEM were used to confirm the modification of NCC with CTAB. The effect of the concentration of CTAB on the binding efficiency of the hydrophobic drug curcumin was investigated. The amounts of curcumin bound onto the CTAB-NCC nanoparticles were analyzed by UV-vis spectrophotometry. The results showed that the modified CTAB-NCC bound a significant amount of curcumin, in a range from 80% to 96% of the curcumin added. Nevertheless, higher concentrations of CTAB resulted in lower binding efficiency. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Effects of endogenous small molecular compounds on the rheological properties, texture and microstructure of soymilk coagulum: Removal of phytate using ultrafiltration.

    PubMed

    Wang, Ruican; Guo, Shuntang

    2016-11-15

    This study aims to clarify the roles played by endogenous small molecular components in the soymilk coagulation process and the properties of the resulting gels. Soymilk samples with decreasing levels of small molecules were prepared by ultrafiltration to reduce the amount of phytate and salts. The CaSO4-induced coagulation process was analyzed using rheological methods. Results showed that removal of free small molecules decreased the activation energy of protein coagulation, resulting in an accelerated reaction and increased gel strength. However, too fast a reaction led to a drop in storage modulus (G'). Microscopic observation suggested that accelerated coagulation generated a coarse and non-uniform gel network with large pores. This network could not hold much water, leading to serious syneresis. Endogenous small molecules in soymilk were vital to a fine gel structure. The coagulation rate could be controlled by adjusting the amount of small molecules to obtain tofu products with the optimal texture. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. LoyalTracker: Visualizing Loyalty Dynamics in Search Engines.

    PubMed

    Shi, Conglei; Wu, Yingcai; Liu, Shixia; Zhou, Hong; Qu, Huamin

    2014-12-01

    The huge amount of user log data collected by search engine providers creates new opportunities to understand user loyalty and defection behavior at an unprecedented scale. However, this also poses a great challenge to analyze the behavior and glean insights into the complex, large data. In this paper, we introduce LoyalTracker, a visual analytics system to track user loyalty and switching behavior towards multiple search engines from the vast amount of user log data. We propose a new interactive visualization technique (flow view) based on a flow metaphor, which conveys a proper visual summary of the dynamics of user loyalty of thousands of users over time. Two other visualization techniques, a density map and a word cloud, are integrated to enable analysts to gain further insights into the patterns identified by the flow view. Case studies and the interview with domain experts are conducted to demonstrate the usefulness of our technique in understanding user loyalty and switching behavior in search engines.

  1. Assignment of channels and polarisations in a broadcasting satellite service environment

    NASA Astrophysics Data System (ADS)

    Fortes, J. M. P.

    1986-07-01

    In the process of synthesizing a satellite communications plan, a large number of possible configurations has to be analyzed in a short amount of time. An important part of the process concerns the allocation of channels and polarizations to the various systems. It is, of course, desirable to make these allocations based on the aggregate carrier/interference ratios, but this needs a considerable amount of time, and for this reason the single-entry carrier/interference criterion is usually employed. The paper presents an integer programming model based on an approximate evaluation of the aggregate carrier/interference ratios, which is fast enough to justify its application in the synthesis process. It was developed to help the elaboration of a downlink plan for the broadcasting satellite service (BSS) of North, Central, and South America. The official software package of the 1983 Administrative Radio Conference (RARC 83), responsible for the planning of the BSS in region 2, contains a routine based on this model.
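
    The record describes an integer programming model built on approximate aggregate carrier/interference ratios. The sketch below conveys only the flavor of that assignment problem: it brute-forces channel choices for a toy set of systems so as to maximize the worst-case approximate aggregate C/I. The coupling matrix and channel-isolation values are invented, and exhaustive search is a simple stand-in for the paper's integer programming formulation.

    ```python
    import itertools
    import numpy as np

    n_systems, n_channels = 4, 3
    rng = np.random.default_rng(2)
    coupling = rng.uniform(0.01, 0.2, (n_systems, n_systems))  # interference coupling (linear)
    np.fill_diagonal(coupling, 0.0)
    isolation = {0: 1.0, 1: 0.05, 2: 0.001}  # co-channel, adjacent, distant channel

    def worst_ci_db(assignment):
        """Worst approximate aggregate C/I (dB) over all systems for one assignment."""
        worst = np.inf
        for i in range(n_systems):
            interference = sum(
                coupling[i, j] * isolation[min(abs(assignment[i] - assignment[j]), 2)]
                for j in range(n_systems) if j != i
            )
            worst = min(worst, 10 * np.log10(1.0 / max(interference, 1e-12)))
        return worst

    best = max(itertools.product(range(n_channels), repeat=n_systems), key=worst_ci_db)
    print("best channel assignment:", best,
          "worst-case C/I (dB): %.1f" % worst_ci_db(best))
    ```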

  2. Finding Regions of Interest on Toroidal Meshes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Kesheng; Sinha, Rishi R; Jones, Chad

    2011-02-09

    Fusion promises to provide clean and safe energy, and a considerable amount of research effort is underway to turn this aspiration into reality. This work focuses on a building block for analyzing data produced from the simulation of microturbulence in magnetic confinement fusion devices: the task of efficiently extracting regions of interest. Like many other simulations where a large amount of data are produced, the careful study of "interesting" parts of the data is critical to gain understanding. In this paper, we present an efficient approach for finding these regions of interest. Our approach takes full advantage of the underlying mesh structure in magnetic coordinates to produce a compact representation of the mesh points inside the regions and an efficient connected component labeling algorithm for constructing regions from points. This approach scales linearly with the surface area of the regions of interest instead of the volume, as shown with both computational complexity analysis and experimental measurements. Furthermore, this new approach is 100s of times faster than a recently published method based on Cartesian coordinates.
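
    Constructing regions from a set of "interesting" mesh points is essentially connected component labeling, illustrated below on a small 2-D grid that wraps periodically in one direction (a crude nod to the toroidal geometry). The field, threshold, and grid are assumptions, and this flood fill visits every selected point; the paper's method additionally exploits the mesh structure so that its cost scales with region surfaces rather than volumes.

    ```python
    from collections import deque

    NX, NY = 8, 6
    field = [[(x * y) % 7 for y in range(NY)] for x in range(NX)]   # toy field
    points = {(x, y) for x in range(NX) for y in range(NY) if field[x][y] >= 5}

    def neighbors(p):
        x, y = p
        yield ((x + 1) % NX, y)   # periodic in x
        yield ((x - 1) % NX, y)
        if y + 1 < NY:
            yield (x, y + 1)
        if y - 1 >= 0:
            yield (x, y - 1)

    labels, next_label = {}, 0
    for start in points:
        if start in labels:
            continue
        queue = deque([start])
        labels[start] = next_label
        while queue:                      # breadth-first flood fill over selected points
            p = queue.popleft()
            for q in neighbors(p):
                if q in points and q not in labels:
                    labels[q] = next_label
                    queue.append(q)
        next_label += 1

    print(f"{len(points)} interesting points grouped into {next_label} regions")
    ```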

  3. Amplitude Variations in Pulsating Red Giants. II. Some Systematics

    NASA Astrophysics Data System (ADS)

    Percy, J. R.; Laing, J.

    2017-12-01

    In order to extend our previous studies of the unexplained phenomenon of cyclic amplitude variations in pulsating red giants, we have used the AAVSO time-series analysis package vstar to analyze long-term AAVSO visual observations of 50 such stars, mostly Mira stars. The relative amount of the variation, typically a factor of 1.5, and the time scale of the variation, typically 20-35 pulsation periods, are not significantly different in longer-period, shorter-period, and carbon stars in our sample, and they also occur in stars whose period is changing secularly, perhaps due to a thermal pulse. The time scale of the variations is similar to that in smaller-amplitude SR variables, but the relative amount of the variation appears to be larger in smaller-amplitude stars, and is therefore more conspicuous. The cause of the amplitude variations remains unclear, though they may be due to rotational modulation of a star whose pulsating surface is dominated by the effects of large convective cells.

  4. Increased plastic litter cover affects the foraging activity of the sandy intertidal gastropod Nassarius pullus.

    PubMed

    Aloy, Alexander B; Vallejo, Benjamin M; Juinio-Meñez, Marie Antonette

    2011-08-01

    This study analyzed the foraging behavior of the gastropod Nassarius pullus on garbage-impacted sandy shores of Talim Bay, Batangas, Philippines. The effect of different levels of plastic garbage cover on foraging efficiency was investigated. Controlled in situ baiting experiments were conducted to quantify aspects of foraging behavior as affected by the levels of plastic litter cover in the foraging area. The results of the study indicated that the gastropod's efficiency in locating and in moving towards a food item generally decreased as the level of plastic cover increased. Prolonged food searching time and increased self-burial in sand were highly correlated with increased plastic cover. The accuracy of orientation towards the actual position of the bait decreased significantly when the amount of plastic cover increased to 50%. These results are consistent with the significant decreases in the abundance of the gastropod observed during periods of deposition of large amounts of plastic and other debris on the shore. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. Fog collecting biomimetic surfaces: Influence of microstructure and wettability.

    PubMed

    Azad, M A K; Ellerbrok, D; Barthlott, W; Koch, K

    2015-01-19

    We analyzed the fog collection efficiency of three different sets of samples: replicas (with and without microstructures), copper wire (smooth and microgrooved) and polyolefin mesh (hydrophilic, superhydrophilic and hydrophobic). The collection efficiency of the samples was compared within each set separately to investigate the influence of microstructures and/or the wettability of the surfaces on fog collection. Under the controlled experimental conditions chosen here, large differences in efficiency were found. Microstructured plant replica samples collected 2-3 times more water than unstructured (smooth) samples. Copper wire samples showed similar results. Moreover, microgrooved wires shed water droplets faster than smooth wires. The superhydrophilic mesh tested here proved more efficient than the other mesh samples with different wettability. The amount of fog collected by the superhydrophilic mesh was about 5 times that of the hydrophilic (untreated) mesh and about 2 times that of the hydrophobic mesh.

  6. A Framework for Real-Time Collection, Analysis, and Classification of Ubiquitous Infrasound Data

    NASA Astrophysics Data System (ADS)

    Christe, A.; Garces, M. A.; Magana-Zook, S. A.; Schnurr, J. M.

    2015-12-01

    Traditional infrasound arrays are generally expensive to install and maintain. There are ~10^3 infrasound channels on Earth today. The amount of data currently provided by legacy architectures can be processed on a modest server. However, the growing availability of low-cost, ubiquitous, and dense infrasonic sensor networks presents a substantial increase in the volume, velocity, and variety of data flow. Initial data from a prototype ubiquitous global infrasound network is already pushing the boundaries of traditional research server and communication systems, in particular when serving data products over heterogeneous, international network topologies. We present a scalable, cloud-based approach for capturing and analyzing large amounts of dense infrasonic data (>10^6 channels). We utilize Akka actors with WebSockets to maintain data connections with infrasound sensors. Apache Spark provides streaming, batch, machine learning, and graph processing libraries which will permit signature classification, cross-correlation, and other analytics in near real time. This new framework and approach provide significant advantages in scalability and cost.

  7. A high throughput MATLAB program for automated force-curve processing using the AdG polymer model.

    PubMed

    O'Connor, Samantha; Gaddis, Rebecca; Anderson, Evan; Camesano, Terri A; Burnham, Nancy A

    2015-02-01

    Research in understanding biofilm formation depends on accurate and representative measurements of the steric forces related to the polymer brush on bacterial surfaces. A MATLAB program to analyze force curves from an AFM efficiently, accurately, and with minimal user bias has been developed. The analysis is based on a modified version of the Alexander and de Gennes (AdG) polymer model, which is a function of equilibrium polymer brush length, probe radius, temperature, separation distance, and a density variable. Automating the analysis reduces the amount of time required to process 100 force curves from several days to less than 2 min. The use of this program to crop and fit force curves to the AdG model will allow researchers to ensure proper processing of large amounts of experimental data and reduce the time required for analysis and comparison of data, thereby enabling higher quality results in a shorter period of time. Copyright © 2014 Elsevier B.V. All rights reserved.
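
    A commonly used exponential approximation of the AdG steric brush force can be fitted to a force curve with SciPy's curve_fit, as sketched below on synthetic data. The probe radius, brush parameters, and noise level are assumed values, and this is a Python illustration of the fitting idea rather than the authors' MATLAB implementation.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    kB, T = 1.380649e-23, 298.0          # J/K, K
    R = 20e-9                            # assumed probe radius (m)

    def adg_force(D_nm, L0_nm, gamma_nm2):
        """Exponential approximation of the AdG steric brush force, returned in nN.
        L0_nm: brush length (nm); gamma_nm2: grafting density (chains/nm^2)."""
        L0 = L0_nm * 1e-9
        gamma = gamma_nm2 * 1e18         # nm^-2 -> m^-2
        F = 50 * kB * T * R * L0 * gamma**1.5 * np.exp(-2 * np.pi * (D_nm * 1e-9) / L0)
        return F * 1e9                   # N -> nN

    # Synthetic "approach curve" standing in for cropped AFM data.
    rng = np.random.default_rng(3)
    D_nm = np.linspace(8, 40, 120)
    F_nN = adg_force(D_nm, 40.0, 0.2) + rng.normal(0, 0.05, D_nm.size)

    popt, _ = curve_fit(adg_force, D_nm, F_nN, p0=(30.0, 0.1))
    print(f"fitted brush length: {popt[0]:.1f} nm, grafting density: {popt[1]:.3f} nm^-2")
    ```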

  8. Development of an online analyzer of atmospheric H2O2 and several organic hydroperoxides for field campaigns

    NASA Astrophysics Data System (ADS)

    François, S.; Sowka, I.; Monod, A.; Temime-Roussel, B.; Laugier, J. M.; Wortham, H.

    2005-03-01

    An online automated instrument was developed for atmospheric measurements of hydroperoxides with separation and quantification of H2O2 and several organic hydroperoxides. Samples were trapped in aqueous solutions in a scrubbing glass coil. Analyses were performed on an HPLC column followed by para-hydroxyphenylacetic acid (POPHA) and peroxidase derivatization and fluorescence detection. Analytical and sampling tests were performed on different parameters to obtain optimum signal-to-noise ratios, high resolution and collection efficiencies higher than 95% for H2O2 and organic hydroperoxides. The obtained performance shows large improvements compared to previous studies. The sampling and analytical devices can be coupled, providing an online analyzer. The device was used during two field campaigns in the Marseilles area in June 2001 (offline analyzer) and in July 2002 (online analyzer) at rural sites at low and high altitudes, respectively, during the ESCOMPTE and BOND campaigns. During the ESCOMPTE campaign, H2O2 was detected occasionally, and no organic hydroperoxides were observed. During the BOND campaign, substantial amounts of H2O2 and 1-HEHP+MHP were often detected, and two other organic hydroperoxides were occasionally detected. These observations are discussed.

  9. System Learning via Exploratory Data Analysis: Seeing Both the Forest and the Trees

    NASA Astrophysics Data System (ADS)

    Habash Krause, L.

    2014-12-01

    As the amount of observational Earth and Space Science data grows, so does the need for learning and employing data analysis techniques that can extract meaningful information from those data. Space-based and ground-based data sources from all over the world are used to inform Earth and Space environment models. However, with such a large amount of data comes a need to organize those data in a way such that trends within the data are easily discernible. This can be tricky due to the interaction between physical processes that lead to partial correlation of variables or multiple interacting sources of causality. With the suite of Exploratory Data Analysis (EDA) data mining codes available at MSFC, we have the capability to analyze large, complex data sets and quantitatively identify fundamentally independent effects from consequential or derived effects. We have used these techniques to examine the accuracy of ionospheric climate models with respect to trends in ionospheric parameters and space weather effects. In particular, these codes have been used to 1) provide summary "at-a-glance" surveys of large data sets through categorization and/or evolution over time to identify trends, distribution shapes, and outliers, 2) discern the underlying "latent" variables which share common sources of causality, and 3) establish a new set of basis vectors by computing Empirical Orthogonal Functions (EOFs), which represent the maximum amount of variance for each principal component. Some of these techniques are easily implemented in the classroom using standard MATLAB functions, some of the more advanced applications require the statistical toolbox, and applications to unique situations require more sophisticated levels of programming. This paper will present an overview of the range of tools available and how they might be used for a variety of time series Earth and Space Science data sets. Examples of feature recognition from both 1D and 2D (e.g. imagery) time series data sets will be presented.
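
    Computing Empirical Orthogonal Functions amounts to an SVD (or PCA) of the mean-removed data matrix, with the squared singular values giving the variance captured by each component. The synthetic space-time field below is an assumption used only to make the sketch self-contained; it stands in for, say, a time series of ionospheric parameter maps.

    ```python
    import numpy as np

    # Rows are time samples, columns are spatial grid points. The field mixes
    # two spatial patterns with independent time series, plus noise.
    rng = np.random.default_rng(4)
    t = np.linspace(0, 10, 200)
    x = np.linspace(0, 1, 50)
    pattern1, pattern2 = np.sin(2 * np.pi * x), np.cos(4 * np.pi * x)
    data = (np.outer(np.sin(t), pattern1)
            + 0.5 * np.outer(np.cos(3 * t), pattern2)
            + rng.normal(0, 0.05, (t.size, x.size)))

    # EOFs are the right singular vectors of the mean-removed data matrix.
    anomalies = data - data.mean(axis=0)
    U, S, Vt = np.linalg.svd(anomalies, full_matrices=False)
    explained = S**2 / np.sum(S**2)
    print("variance captured by first two EOFs: %.1f%%, %.1f%%"
          % (100 * explained[0], 100 * explained[1]))
    ```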

  10. The effect of the 2011 flood on agricultural chemical and sediment movement in the lower Mississippi River Basin

    NASA Astrophysics Data System (ADS)

    Welch, H.; Coupe, R.; Aulenbach, B.

    2012-04-01

    Extreme hydrologic events, such as floods, can overwhelm a surface water system's ability to process chemicals and can move large amounts of material downstream to larger surface water bodies. The Mississippi River is the 3rd largest river in the world behind the Amazon in South America and the Congo in Africa. The Mississippi-Atchafalaya River basin grows much of the country's corn, soybean, rice, cotton, pigs, and chickens. This is large-scale modern day agriculture with large inputs of nutrients to increase yields and large applied amounts of crop protection chemicals, such as pesticides. The basin drains approximately 41% of the conterminous United States and is the largest contributor of nutrients to the Gulf of Mexico each spring. The amount of water and nutrients discharged from the Mississippi River has been related to the size of the low dissolved oxygen area that forms off of the coast of Louisiana and Texas each summer. From March through April 2011, the upper Mississippi River basin received more than five times more precipitation than normal, which combined with snow melt from the Missouri River basin, created a historic flood event that lasted from April through July. The U.S. Geological Survey, as part of the National Stream Quality Accounting Network (NASQAN), collected samples from six sites located in the lower Mississippi-Atchafalaya River basin, as well as samples from the three flow-diversion structures or floodways: the Birds Point-New Madrid in Missouri and the Morganza and Bonnet Carré in Louisiana, from April through July. Samples were analyzed for nutrients, pesticides, suspended sediments, and particle size; results were used to determine the water quality of the river during the 2011 flood. Monthly loads for nitrate, phosphorus, pesticides (atrazine, glyphosate, fluometuron, and metolachlor), and sediment were calculated to quantify the movement of agricultural chemicals and sediment into the Gulf of Mexico. Nutrient loads were compared to historic loads to assess the effect of the flood on the zone of hypoxia that formed in the Gulf of Mexico during the spring of 2011.
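
    At its simplest, a constituent load is concentration multiplied by discharge, summed over time, as in the sketch below. The discharge and concentration values are invented, and operational NASQAN load estimates rely on regression-based estimators rather than this direct bookkeeping, so treat this only as the basic unit arithmetic.

    ```python
    # Minimal monthly-load bookkeeping: load = concentration x discharge over days.
    SECONDS_PER_DAY = 86_400

    daily_discharge_m3s = [32000, 35000, 41000, 45000]   # mean daily discharge (m^3/s), invented
    daily_nitrate_mgL = [1.8, 1.7, 1.9, 2.1]             # nitrate concentration (mg/L), invented

    load_kg = sum(
        q * SECONDS_PER_DAY * 1000 * c / 1e6             # m^3 -> L, then mg -> kg
        for q, c in zip(daily_discharge_m3s, daily_nitrate_mgL)
    )
    print(f"nitrate load over {len(daily_discharge_m3s)} days: {load_kg / 1e6:.1f} kt")
    ```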

  11. Dexterity: A MATLAB-based analysis software suite for processing and visualizing data from tasks that measure arm or forelimb function.

    PubMed

    Butensky, Samuel D; Sloan, Andrew P; Meyers, Eric; Carmel, Jason B

    2017-07-15

    Hand function is critical for independence, and neurological injury often impairs dexterity. To measure hand function in people or forelimb function in animals, sensors are employed to quantify manipulation. These sensors make assessment easier and more quantitative and allow automation of these tasks. While automated tasks improve objectivity and throughput, they also produce large amounts of data that can be burdensome to analyze. We created software called Dexterity that simplifies data analysis of automated reaching tasks. Dexterity is MATLAB software that enables quick analysis of data from forelimb tasks. Through a graphical user interface, files are loaded and data are identified and analyzed. These data can be annotated or graphed directly. Analysis is saved, and the graph and corresponding data can be exported. For additional analysis, Dexterity provides access to custom scripts created by other users. To determine the utility of Dexterity, we performed a study to evaluate the effects of task difficulty on the degree of impairment after injury. Dexterity analyzed two months of data and allowed new users to annotate the experiment, visualize results, and save and export data easily. Previous analysis of tasks was performed with custom data analysis, requiring expertise with analysis software. Dexterity made the tools required to analyze, visualize and annotate data easy to use by investigators without data science experience. Dexterity increases accessibility to automated tasks that measure dexterity by making analysis of large data intuitive, robust, and efficient. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Conflict Misleads Large Carnivore Management and Conservation: Brown Bears and Wolves in Spain.

    PubMed

    Fernández-Gil, Alberto; Naves, Javier; Ordiz, Andrés; Quevedo, Mario; Revilla, Eloy; Delibes, Miguel

    2016-01-01

    Large carnivores inhabiting human-dominated landscapes often interact with people and their properties, leading to conflict scenarios that can mislead carnivore management and, ultimately, jeopardize conservation. In northwest Spain, brown bears Ursus arctos are strictly protected, whereas sympatric wolves Canis lupus are subject to lethal control. We explored ecological, economic and societal components of conflict scenarios involving large carnivores and damages to human properties. We analyzed the relation between complaints of depredations by bears and wolves on beehives and livestock, respectively, and bear and wolf abundance, livestock heads, number of culled wolves, amount of paid compensations, and media coverage. We also evaluated the efficiency of wolf culling to reduce depredations on livestock. Bear damages to beehives correlated positively to the number of female bears with cubs of the year. Complaints of wolf predation on livestock were unrelated to livestock numbers; instead, they correlated positively to the number of wild ungulates harvested during the previous season, the number of wolf packs, and to wolves culled during the previous season. Compensations for wolf complaints were fivefold higher than for bears, but media coverage of wolf damages was thirtyfold higher. Media coverage of wolf damages was unrelated to the actual costs of wolf damages, but the amount of news correlated positively to wolf culling. However, wolf culling was followed by an increase in compensated damages. Our results show that culling of the wolf population failed in its goal of reducing damages, and suggest that management decisions are at least partly mediated by press coverage. We suggest that our results provide insight to similar scenarios, where several species of large carnivores share the landscape with humans, and management may be reactive to perceived conflicts.

  13. Conflict Misleads Large Carnivore Management and Conservation: Brown Bears and Wolves in Spain

    PubMed Central

    Fernández-Gil, Alberto; Naves, Javier; Ordiz, Andrés; Quevedo, Mario; Revilla, Eloy; Delibes, Miguel

    2016-01-01

    Large carnivores inhabiting human-dominated landscapes often interact with people and their properties, leading to conflict scenarios that can mislead carnivore management and, ultimately, jeopardize conservation. In northwest Spain, brown bears Ursus arctos are strictly protected, whereas sympatric wolves Canis lupus are subject to lethal control. We explored ecological, economic and societal components of conflict scenarios involving large carnivores and damages to human properties. We analyzed the relation between complaints of depredations by bears and wolves on beehives and livestock, respectively, and bear and wolf abundance, livestock heads, number of culled wolves, amount of paid compensations, and media coverage. We also evaluated the efficiency of wolf culling to reduce depredations on livestock. Bear damages to beehives correlated positively to the number of female bears with cubs of the year. Complaints of wolf predation on livestock were unrelated to livestock numbers; instead, they correlated positively to the number of wild ungulates harvested during the previous season, the number of wolf packs, and to wolves culled during the previous season. Compensations for wolf complaints were fivefold higher than for bears, but media coverage of wolf damages was thirtyfold higher. Media coverage of wolf damages was unrelated to the actual costs of wolf damages, but the amount of news correlated positively to wolf culling. However, wolf culling was followed by an increase in compensated damages. Our results show that culling of the wolf population failed in its goal of reducing damages, and suggest that management decisions are at least partly mediated by press coverage. We suggest that our results provide insight to similar scenarios, where several species of large carnivores share the landscape with humans, and management may be reactive to perceived conflicts. PMID:26974962

  14. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2011-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new, parallel version of TRAM implemented on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.

  15. Large predispersion for reduction of intrachannel nonlinear impairments in strongly dispersion-managed transmissions

    NASA Astrophysics Data System (ADS)

    Cao, Wenhua

    2016-05-01

    Predispersion for reduction of intrachannel nonlinear impairments in a quasi-linear strongly dispersion-managed transmission system is analyzed in detail by numerical simulations. We show that for a moderate amount of predispersion there is an optimal value at which reduction of the nonlinear impairments can be obtained, which is consistent with previous well-known predictions. However, we found that much better transmission performance than that of the previous predictions can be obtained if predispersion is increased to some extent. For large predispersion, the nonlinear impairments reduce monotonically with increasing predispersion and then tend to stabilize when predispersion is further increased. Thus, transmission performance can be efficiently improved by inserting a highly dispersive element, such as a chirped fiber Bragg grating (CFBG), at the input end of the transmission link to broaden the signal pulses while, at the output end, using another CFBG with the opposite dispersion to recompress the signal.
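
    Predispersion can be pictured as a quadratic spectral phase applied to the launched pulse, which the FFT-based sketch below applies to a Gaussian pulse to show the resulting broadening. The pulse width and accumulated dispersion are illustrative values; this is only the linear predispersion step, not the full nonlinear link simulation reported in the record.

    ```python
    import numpy as np

    N, dt = 8192, 0.1e-12                        # samples, time step (s)
    t = (np.arange(N) - N // 2) * dt
    T0 = 5e-12                                   # initial pulse width (s)
    pulse = np.exp(-t**2 / (2 * T0**2))

    beta2_L = -5e-22                             # accumulated dispersion beta2*L (s^2), assumed
    omega = 2 * np.pi * np.fft.fftfreq(N, dt)
    spectrum = np.fft.fft(pulse)
    predispersed = np.fft.ifft(spectrum * np.exp(0.5j * beta2_L * omega**2))

    def rms_width(field):
        power = np.abs(field)**2
        mean_t = np.sum(power * t) / np.sum(power)
        return np.sqrt(np.sum(power * t**2) / np.sum(power) - mean_t**2)

    print(f"RMS width before: {rms_width(pulse)*1e12:.1f} ps, "
          f"after: {rms_width(predispersed)*1e12:.1f} ps")
    ```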

  16. Automated X-Ray Diffraction of Irradiated Materials

    DOE PAGES

    Rodman, John; Lin, Yuewei; Sprouster, David; ...

    2017-10-26

    Synchrotron-based X-ray diffraction (XRD) and small-angle X-ray scattering (SAXS) characterization techniques used on unirradiated and irradiated reactor pressure vessel steels yield large amounts of data. Machine learning techniques, including PCA, offer a novel method of analyzing and visualizing these large data sets in order to determine the effects of chemistry and irradiation conditions on the formation of radiation-induced precipitates. In order to run analysis on these data sets, preprocessing must be carried out to convert the data to a usable format and mask the 2-D detector images to account for experimental variations. Once the data have been preprocessed, they can be organized and visualized using principal component analysis (PCA), multi-dimensional scaling, and k-means clustering. In conclusion, from these techniques it is shown that sample chemistry has a notable effect on the formation of radiation-induced precipitates in reactor pressure vessel steels.
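
    The preprocess-then-PCA/k-means pipeline the record describes can be sketched on synthetic one-dimensional diffraction patterns, as below. The two "chemistries" (a shifted peak) and all parameter values are assumptions standing in for masked and integrated 2-D detector images.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(5)
    two_theta = np.linspace(10, 80, 500)

    def pattern(peak_center):
        """Toy azimuthally-integrated XRD pattern: one Gaussian peak plus noise."""
        return np.exp(-0.5 * ((two_theta - peak_center) / 0.8)**2) \
               + rng.normal(0, 0.02, two_theta.size)

    patterns = np.array([pattern(35.0) for _ in range(30)]
                        + [pattern(36.5) for _ in range(30)])

    # Reduce dimensionality with PCA, then group the patterns with k-means.
    scores = PCA(n_components=5).fit_transform(patterns)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
    print("cluster sizes:", np.bincount(labels))
    ```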

  17. Atmospheric deposition of mercury in central Poland: Sources and seasonal trends

    NASA Astrophysics Data System (ADS)

    Siudek, Patrycja; Kurzyca, Iwona; Siepak, Jerzy

    2016-03-01

    Atmospheric deposition of total mercury was studied at two sites in central Poland, between April 2013 and October 2014. Hg in rainwater (bulk deposition) was analyzed in relation to meteorological parameters and major ions (H+, NO3-, Cl-, SO42-) in order to investigate seasonal variation, identify sources and determine factors affecting atmospheric Hg chemistry and deposition. Total mercury concentrations varied between 1.24 and 22.1 ng L-1 at the urban sampling site (Poznań) and between 0.57 and 18.3 ng L-1 in the woodland protected area (Jeziory), with quite similar mean values of 6.96 and 6.37 ng L-1, respectively. Mercury in precipitation exhibited lower spatial variability within the study domain (urban/forest transect) than the concentrations determined during other similar observations, reflecting the predominant influence of the same local sources. In our study, a significant seasonal pattern of Hg deposition was observed at both sampling sites, with higher and more variable concentrations of Hg reported for the urban area. In particular, deposition values of Hg were higher in the samples attributed to relatively large precipitation amounts in the summer and in those collected during the winter season (the result of higher contributions from combustion sources, i.e. intensive combustion of fossil fuels in residential and commercial boilers, individual power/heat-generating plants). In addition, a significant relationship between Hg concentration and precipitation amount was found while considering different types of wintertime samples (i.e. rain, snow and mixed precipitation). The analysis of backward trajectories showed that air masses arriving from polluted regions of western Europe and southern Poland largely affected the amount of Hg in rainwater. A seasonal variation in Hg deposition fluxes was also observed, with the maximum value of Hg in spring and minimum in winter. Our results indicated that rainwater Hg and, consequently, the wet deposition flux of Hg are related to seasonal differences in precipitation (type, intensity, amount) and the emission source.

  18. POM Pulses: Characterizing the Physical and Chemical Properties of Particulate Organic Matter (POM) Mobilized by Large Storm Events and its Influence on Receiving Fluvial Systems

    NASA Astrophysics Data System (ADS)

    Johnson, E. R.; Rowland, R. D.; Protokowicz, J.; Inamdar, S. P.; Kan, J.; Vargas, R.

    2016-12-01

    Extreme storm events have tremendous erosive energy which is capable of mobilizing vast amounts of material from watershed sources into fluvial systems. This complex mixture of sediment and particulate organic matter (POM) is a nutrient source, and has the potential to impact downstream water quality. The impact of POM on receiving aquatic systems can vary not only by the total amount exported but also by the various sources involved and the particle sizes of POM. This study examines the composition of POM in potential sources and within-event POM by: (1) determining the amount and quality of dissolved organic matter (DOM) that can be leached from coarse, medium and fine particle classes; (2) assessing the C and N content and isotopic character of within-event POM; and (3) coupling physical and chemical properties to evaluate storm event POM influence on stream water. Storm event POM samples and source sediments were collected from a forested headwater catchment (second order stream) in the Piedmont region of Maryland. Samples were sieved into three particle classes - coarse (2 mm-1 mm), medium (1 mm-250 µm) and fine (<250 µm). Extractions were performed for the three particle size classes and the resulting fluorescent organic matter was analyzed. Carbon (C) and nitrogen (N) amounts, C:N ratio, and isotopic analysis of 13C and 15N were performed on solid state event and source material. Future work will include examination of microbial communities associated with POM particle size classes. Physical size class separation of within-event POM exhibited differences in C:N ratios, δ15N composition, and extracted DOM lability. Smaller size classes exhibited lower C:N ratios, more enriched δ15N and more recalcitrant properties in leached DOM. Source material had varying C:N ratios and contributions to leached DOM. These results indicate that both source and size class strongly influence the POM contribution to fluvial systems during large storm events.

  19. The Health Risk of Cd Released from Low-Cost Jewelry.

    PubMed

    Pouzar, Miloslav; Zvolská, Magdalena; Jarolím, Oldřich; Audrlická Vavrušová, Lenka

    2017-05-12

    The composition of the surface layer of 13 low-cost jewelry samples with a high Cd content was analyzed using an energy-dispersive X-ray fluorescence spectrometer (ED XRF). The analyzed jewels were obtained in cooperation with the Czech Environmental Inspectorate. The jewels were leached in two types of artificial sweat (acidic and alkaline) for 7 days. Twenty microliters of the resulting solution was subsequently placed on a paper carrier and, after drying, analyzed by a laser-induced breakdown spectrometry (LIBS) spectrometer. The Cd content in the jewelry surface layer detected by ED XRF ranged from 13.4% to 44.6% (weight per weight, w/w). The amount of Cd released into alkaline sweat ranged from 24.0 to 370 µg Cd per week, or 3.23-61.7 µg/cm²/week. The amount of Cd released into acidic sweat ranged from 16.4 to 1517 µg Cd per week, or 3.53-253 µg/cm²/week. The limit of Cd for dermal exposure is not unequivocally determined in the countries of the EU (European Union) or in the U.S. Based on the US EPA (United States Environmental Protection Agency) approach used to establish the reference dose (RfD) for Cd contained in food, and on information about the bioavailability of Cd after dermal exposure, we assessed our own value of the dermal RfD. This value was compared with the theoretical amount of Cd that can be absorbed into the organism from jewelry in contact with the skin, calculated from the amounts of Cd released into acidic and alkaline sweat. The highest amount of Cd released into acidic sweat represents 0.1% of the dermal RfD, and the highest amount released into alkaline sweat represents 0.5% of the dermal RfD. These results indicate that the analyzed jewelry contains Cd above the limit for the composition of jewelry available within the territory of the EU. The determined amount of Cd in the analyzed jewelry does not, however, pose a threat in terms of carcinogenic toxic effects.
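
    A screening comparison of dermal uptake against a reference dose is simple arithmetic, sketched below. Only the weekly release figure comes from the record; the absorption fraction, body weight, and RfD are placeholder assumptions and not the values the authors derived.

    ```python
    # Illustrative screening calculation only; all dose parameters are placeholders.
    released_ug_per_week = 1517.0       # highest weekly Cd release into acidic sweat (from the record)
    dermal_absorption_fraction = 0.005  # assumed fraction of released Cd absorbed through the skin
    body_weight_kg = 70.0               # assumed adult body weight
    rfd_ug_per_kg_day = 1.0             # assumed dermal reference dose (placeholder)

    absorbed_dose = released_ug_per_week * dermal_absorption_fraction / 7 / body_weight_kg
    print(f"estimated dose: {absorbed_dose:.4f} ug/kg/day "
          f"({100 * absorbed_dose / rfd_ug_per_kg_day:.2f}% of the assumed RfD)")
    ```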

  20. The battle for hearts and minds: who is communicating most effectively with the cosmetic marketplace?

    PubMed

    Camp, Matthew C; Wong, Wendy W; Mussman, Jason L; Gupta, Subhas C

    2010-01-01

    Cosmetic surgery, historically the purview of plastic surgeons, has in recent years seen an influx of practitioners from other fields of training. Many of these new providers are savvy in marketing and public relations and are beginning to control a surprisingly large amount of cosmetic patient care. The purpose of this study is to measure the amount of traffic being attracted to the Web sites of individual practitioners and organizations vying for cosmetic patients. This study investigates the trends of the past 12 months and identifies changes of special concern to plastic surgeons. The Web sites of 1307 cosmetic providers were monitored over a year's time. The Web activity of two million individuals whose computers were loaded with a self-reporting software package was recorded and analyzed. The Web sites were analyzed according to the specialty training of the site owner and total unique visits per month were tallied for the most prominent specialties. The dominant Web sites were closely scrutinized and the Web optimization strategies of each were also examined. There is a tremendous amount of Web activity surrounding cosmetic procedures and the amount of traffic on the most popular sites is continuing to grow. Also, a large sum of money is being expended to channel Web traffic, with sums in the thousands of dollars being spent daily by top Web sites. Overall in the past year, the private Web sites of plastic surgeons have increased their reach by 10%, growing from 200,000 to approximately 220,000 unique visitors monthly. Plastic surgery remains the specialty with the largest number of Web visitors per month. However, when combined, the private Web sites of all other providers of aesthetic services have significantly outpaced plastic surgery's growth. The traffic going to non-plastic surgeons has grown by 50% (200,000 visitors per month in September 2008 to 300,000 visitors monthly in September 2009). For providers of aesthetic services, communication with the public is of utmost importance. The Web has become the single most important information resource for consumers because of easy access. Plastic surgeons are facing significant competition for the attention of potential patients, with increasingly sophisticated Web sites and listing services being set up by independent parties. It is important for plastic surgeons to become familiar with the available Internet tools for communication with potential patients and to aggressively utilize these tools for effective practice building.

  1. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    NASA Astrophysics Data System (ADS)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

    Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for data storage, computing, and analysis. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark - a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user cloud computing infrastructure for hosting GISpark. Virtual storage systems such as HDFS, Ceph, and MongoDB are combined and adopted for spatiotemporal data storage and management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools that can serve other domains with a spatial dimension. We tested the performance of the platform based on taxi trajectory analysis. Results suggest that GISpark achieves excellent runtime performance in spatiotemporal big data applications.
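
    The platform's own API is not described here, but the style of workload it targets can be illustrated with plain PySpark. The sketch below, assuming a hypothetical taxi_trajectories.csv file with lon, lat, and timestamp columns, bins GPS points into a coarse grid and counts points per cell per hour; it is an illustration of this kind of trajectory aggregation, not GISpark's implementation.

        # Minimal PySpark sketch of a spatiotemporal aggregation similar in spirit to
        # the taxi trajectory test above. The file name and column names (lon, lat,
        # timestamp) are illustrative assumptions.
        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("taxi-grid-aggregation").getOrCreate()

        points = spark.read.csv("taxi_trajectories.csv", header=True, inferSchema=True)

        gridded = (points
                   .withColumn("cell_x", F.floor(F.col("lon") / 0.01))   # ~1 km grid cells
                   .withColumn("cell_y", F.floor(F.col("lat") / 0.01))
                   .withColumn("hour", F.hour(F.to_timestamp("timestamp"))))

        counts = gridded.groupBy("cell_x", "cell_y", "hour").count()
        counts.orderBy(F.desc("count")).show(10)   # densest cells per hour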

  2. Bioaccumulation and distribution of selenium in Enterococcus durans.

    PubMed

    Pieniz, Simone; Andreazza, Robson; Mann, Michele Bertoni; Camargo, Flávio; Brandelli, Adriano

    2017-03-01

    Selenium is an essential nutrient for all living organisms. Under appropriate conditions lactic acid bacteria (LAB) are capable for accumulating large amounts of trace elements, such as selenium, and incorporating them into organic compounds. In this study, the capacity of selenium bioaccumulation by Enterococcus durans LAB18s was evaluated. The distribution of organic selenium in selenium-enriched E. durans LAB18s biomass was analyzed, and the highest percentage of organic selenium was found in the fraction of total protein, followed by the fractions of polysaccharides and nucleic acids. When the protein fraction was obtained by different extractions (water, NaCl, ethanol and NaOH) it was demonstrated that alkali-soluble protein showed the higher Selenium content. Analysis of protein fractions by sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE) revealed that selenium was present in the proteins ranging from 23 to 100kDa. The cells were analyzed by scanning electron microscopy (SEM); scanning electron microscopy/energy dispersive spectrometry (SEM/EDS) and transmission electron microscopy (TEM). SEM, TEM and SEM/EDS showed the morphology, the selenium particles bioaccumulated into and on the cells and the amounts of selenium present into the cells, respectively. Thus, the isolate E. durans LAB18s can be a promising probiotic to be used as selenium-enriched biomass in feed trials. Copyright © 2016 Elsevier GmbH. All rights reserved.

  3. An Analysis of Earth Science Data Analytics Use Cases

    NASA Technical Reports Server (NTRS)

    Shie, Chung-Lin; Kempler, Steve

    2014-01-01

    The increase in the number, volume, and sources of globally available Earth science data measurements and datasets has afforded Earth scientists and applications researchers unprecedented opportunities to study our Earth in ever more sophisticated ways. In fact, the NASA Earth Observing System Data Information System (EOSDIS) archives doubled from 2007 to 2014, to 9.1 PB (Ramapriyan, 2009; and https://earthdata.nasa.gov/about/system-performance). In addition, other US agencies, international programs, field experiments, ground stations, and citizen scientists provide a plethora of additional sources for studying Earth. Co-analyzing huge amounts of heterogeneous data to glean non-obvious information is a daunting task. Earth science data analytics (ESDA) is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations, and other useful information. It can include Data Preparation, Data Reduction, and Data Analysis. Through work associated with the Earth Science Information Partners (ESIP) Federation, a collection of Earth science data analytics use cases has been collected and analyzed for the purpose of extracting the types of Earth science data analytics employed and the requirements for data analytics tools and techniques yet to be implemented, based on use case needs. The ESIP-generated use case template, ESDA use cases, use case types, and a preliminary use case analysis (a work in progress) will be presented.

  4. A new data format for the commissioning phase of the ATLAS detector

    NASA Astrophysics Data System (ADS)

    Köneke, Karsten; ATLAS Collaboration

    2010-04-01

    In the commissioning phase of the ATLAS experiment, low-level Event Summary Data (ESD) are analyzed to evaluate the performance of the individual subdetectors, the performance of the reconstruction and particle identification algorithms, and to obtain calibration coefficients. In the grid model of distributed analysis, these data must be transferred to Tier-1 and Tier-2 sites before they can be analyzed. However, the large size of ESD (≈1 MByte/event) constrains the amount of data that can be distributed on the grid and is available on disks. In order to overcome this constraint and make the data fully available, new data sets — collectively known as Derived Physics Data (DPD) — have been designed. Each DPD set contains a subset of the ESD data, tailored to the specific needs of the subdetector and object reconstruction and identification performance groups. Filtering algorithms perform a selection based on physics content and trigger response, further reducing the data volume. Thanks to these techniques, the total volume of DPD to be distributed on the grid amounts to 20% of the initial ESD data. An evolution of the tools developed in this context serves to produce another set of DPDs that are specifically tailored for physics analysis. All selection criteria and other relevant information are stored inside these DPDs as metadata, and a connection to external databases is also established.

  5. Development of a NIR-based blend uniformity method for a drug product containing multiple structurally similar actives by using the quality by design principles.

    PubMed

    Lin, Yiqing; Li, Weiyong; Xu, Jin; Boulas, Pierre

    2015-07-05

    The aim of this study is to develop an at-line near infrared (NIR) method for the rapid and simultaneous determination of four structurally similar active pharmaceutical ingredients (APIs) in powder blends intended for the manufacturing of tablets. Two of the four APIs in the formula are present in relatively small amounts, one at 0.95% and the other at 0.57%. Such small amounts, in addition to the similarity in structures, add significant complexity to the blend uniformity analysis. The NIR method is developed using spectra from six laboratory-created calibration samples augmented by a small set of spectra from a large-scale blending sample. Applying the quality by design (QbD) principles, the calibration design included concentration variations of the four APIs and a main excipient, microcrystalline cellulose. A bench-top FT-NIR instrument was used to acquire the spectra. The obtained NIR spectra were analyzed by applying principal component analysis (PCA) before calibration model development. Score patterns from the PCA were analyzed to reveal relationships between latent variables and concentration variations of the APIs. In calibration model development, both PLS-1 and PLS-2 models were created and evaluated for their effectiveness in predicting API concentrations in the blending samples. The final NIR method shows satisfactory specificity and accuracy. Copyright © 2015 Elsevier B.V. All rights reserved.
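
    As a rough illustration of the PCA-then-PLS workflow described above, the following scikit-learn sketch fits an exploratory PCA and a PLS-2 calibration on synthetic data standing in for the NIR spectra and the four API concentrations; the array shapes, component counts, and data are assumptions, not the study's actual calibration.

        # Minimal sketch of a PCA-then-PLS chemometric workflow, assuming X holds
        # calibration NIR spectra (samples x wavelengths) and Y holds the four API
        # concentrations (samples x 4). Data here are synthetic placeholders.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        X = rng.normal(size=(30, 700))           # 30 calibration blends, 700 wavelengths
        Y = rng.uniform(0.5, 5.0, size=(30, 4))  # % w/w of the four APIs (synthetic)

        # Exploratory PCA: inspect score patterns before building the calibration model
        scores = PCA(n_components=3).fit_transform(X)

        # PLS-2 model predicting all four APIs simultaneously
        pls = PLSRegression(n_components=5).fit(X, Y)
        predicted = pls.predict(X)               # in practice, validate on independent blends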

  6. Far from thunderstorm UV transient events in the atmosphere measured by Vernov satellite

    NASA Astrophysics Data System (ADS)

    Morozenko, Violetta; Klimov, Pavel; Khrenov, Boris; Gali, Garipov; Margarita, Kaznacheeva; Mikhail, Panasyuk; Sergei, Svertilov; Robert, Holzworth

    2016-04-01

    Over the period of transient luminous event (TLE) observation, a steady, self-contained classification of events such as sprites, elves, and blue jets has emerged. According to theories of TLE origin, a thunderstorm region in which lightning with large peak currents is generated is required. However, some events far from thunderstorm regions were also detected, pointing to other TLE-generating mechanisms. To investigate the nature of TLEs, the Universitetsky-Tatiana-2 and Vernov satellites were equipped with ultraviolet (240-400 nm) and red-infrared (>610 nm) detectors. In both detectors, events were recorded independently of lightning, triggered by flashes in the UV band, where lightning emission is quite faint. The lowered threshold on the Vernov satellite allowed a large number of TLEs to be selected, including numerous examples of events far from thunderstorm regions. Such events were not associated with lightning activity measured by the World Wide Lightning Location Network (WWLLN) over a large area of approximately 10⁷ km² for 30 minutes before and after the time of registration. The characteristic features of this type of event are the absence of a significant signal in the red-infrared detector channel and a relatively small number of photons (less than 5·10²¹). A large number of flashes without lightning were detected at high latitudes over the ocean (30°S - 60°S). Lightning activity at the magnetically conjugate point was also analyzed. A relationship between far-from-thunderstorm events and specific lightning discharges was not confirmed. Far-from-thunderstorm events are a new type of transient phenomenon in the upper atmosphere that is not associated with thunderstorm activity. The mechanism of such discharges is not clear, although a sufficient body of experimental evidence for the existence of such flashes has accumulated. Based on the Vernov satellite data, the temporal profile, duration, geographic location, and number of photons generated in far-from-thunderstorm atmospheric events have been analyzed, and discussion of the origin of these events is in progress.

  7. Source parameters controlling the generation and propagation of potential local tsunamis along the cascadia margin

    USGS Publications Warehouse

    Geist, E.; Yoshioka, S.

    1996-01-01

    The largest uncertainty in assessing hazards from local tsunamis along the Cascadia margin is estimating the possible earthquake source parameters. We investigate which source parameters exert the largest influence on tsunami generation and determine how each parameter affects the amplitude of the local tsunami. The following source parameters were analyzed: (1) type of faulting characteristic of the Cascadia subduction zone, (2) amount of slip during rupture, (3) slip orientation, (4) duration of rupture, (5) physical properties of the accretionary wedge, and (6) influence of secondary faulting. The effect of each of these source parameters on the quasi-static displacement of the ocean floor is determined by using elastic three-dimensional, finite-element models. The propagation of the resulting tsunami is modeled both near the coastline, using the two-dimensional (x-t) Peregrine equations that include the effects of dispersion, and near the source, using the three-dimensional (x-y-t) linear long-wave equations. The source parameters that have the largest influence on local tsunami excitation are the shallowness of rupture and the amount of slip. In addition, the orientation of slip has a large effect on the directivity of the tsunami, especially for shallow dipping faults, which consequently has a direct influence on the length of coastline inundated by the tsunami. Duration of rupture, physical properties of the accretionary wedge, and secondary faulting all affect the excitation of tsunamis, but to a lesser extent than the shallowness of rupture and the amount and orientation of slip. Assessment of the severity of the local tsunami hazard should take into account that relatively large tsunamis can be generated from anomalous 'tsunami earthquakes' that rupture within the accretionary wedge, in comparison to interplate thrust earthquakes of similar magnitude. © 1996 Kluwer Academic Publishers.

  8. Application of large volume injection GC-MS to analysis of organic compounds in the extracts and leachates of municipal solid waste incineration fly ash

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korenkova, Eva; Matisova, Eva; Slobodnik, Jaroslav

    2006-07-01

    Organic solvent and water extracts of fly ash from a Milan (Italy) municipal solid waste incinerator (MSWI) were analyzed by large volume injection-gas chromatography-mass spectrometry (LVI-GC-MS) with a programmable temperature vaporizer (PTV). Using injection volumes of 10-100 µl, typically over a hundred compounds were detected in organic solvent extracts and ca. 35% of them could be tentatively identified from their electron impact ionization mass spectra. A protocol for the determination of the maximum amount of a potential environmental pollutant available for leaching (availability test) was developed for four selected target compounds: pentachlorobenzene (PeCB), hexachlorobenzene (HxCB), o-terphenyl (o-TPH) and m-terphenyl (m-TPH). Key parameters, extraction time and liquid-to-solid ratio (L/S), were studied in more detail. Recoveries of PeCB, HxCB and o-TPH spiked into the fly ash samples at two concentration levels ranged from 38% to 53% for freshly spiked and from 14% to 40% for 40-day aged fly ash. Recoveries of m-TPH were 8% to 11% from freshly spiked and less than 3% from aged spiked fly ash. The native amounts in Milan MSWI fly ash, determined in an interlaboratory exercise using the developed protocol, were 31 ng/g PeCB, 34 ng/g HxCB, 72 ng/g o-TPH and 4.4 ng/g m-TPH. A separate methodology was developed for the determination of compounds extracted from fly ash by water (leaching test). Following 8-h sonication at L/S 20, the leached amounts of PeCB, HxCB and o-TPH were 1.1, 3.1 and 6.0 ng/g fly ash, respectively.

  9. Big Data Application in Biomedical Research and Health Care: A Literature Review.

    PubMed

    Luo, Jake; Wu, Min; Gopukumar, Deepika; Zhao, Yiqing

    2016-01-01

    Big data technologies are increasingly used for biomedical and health-care informatics research. Large amounts of biological and clinical data have been generated and collected at an unprecedented speed and scale. For example, the new generation of sequencing technologies enables the processing of billions of DNA sequence data per day, and the application of electronic health records (EHRs) is documenting large amounts of patient data. The cost of acquiring and analyzing biomedical data is expected to decrease dramatically with the help of technology upgrades, such as the emergence of new sequencing machines, the development of novel hardware and software for parallel computing, and the extensive expansion of EHRs. Big data applications present new opportunities to discover new knowledge and create novel methods to improve the quality of health care. The application of big data in health care is a fast-growing field, with many new discoveries and methodologies published in the last five years. In this paper, we review and discuss big data application in four major biomedical subdisciplines: (1) bioinformatics, (2) clinical informatics, (3) imaging informatics, and (4) public health informatics. Specifically, in bioinformatics, high-throughput experiments facilitate the research of new genome-wide association studies of diseases, and with clinical informatics, the clinical field benefits from the vast amount of collected patient data for making intelligent decisions. Imaging informatics is now more rapidly integrated with cloud platforms to share medical image data and workflows, and public health informatics leverages big data techniques for predicting and monitoring infectious disease outbreaks, such as Ebola. In this paper, we review the recent progress and breakthroughs of big data applications in these health-care domains and summarize the challenges, gaps, and opportunities to improve and advance big data applications in health care.

  10. Occurrence and sorption of fluoroquinolones in poultry litters and soils from São Paulo State, Brazil.

    PubMed

    Leal, Rafael Marques Pereira; Figueira, Rafael Fernandes; Tornisielo, Valdemar Luiz; Regitano, Jussara Borges

    2012-08-15

    Animal production is one of the most expressive sectors of Brazilian agro-economy. Although antibiotics are routinely used in this activity, their occurrence, fate, and potential impacts to the local environment are largely unknown. This research evaluated sorption-desorption and occurrence of four commonly used fluoroquinolones (norfloxacin, ciprofloxacin, danofloxacin, and enrofloxacin) in poultry litter and soil samples from São Paulo State, Brazil. The sorption-desorption studies involved batch equilibration technique and followed the OECD guideline for pesticides. All compounds were analyzed by HPLC, using fluorescence detector. Fluoroquinolones' sorption potential to the poultry litters (K(d) ≤65 L kg(-1)) was lower than to the soil (K(d) ~40,000 L kg(-1)), but was always high (≥69% of applied amount) indicating a higher specificity of fluoroquinolones interaction with soils. The addition of poultry litter (5%) to the soil had not affected sorption or desorption of these compounds. Desorption was negligible in the soil (≤0.5% of sorbed amount), but not in the poultry litters (up to 42% of sorbed amount). Fluoroquinolones' mean concentrations found in the poultry litters (1.37 to 6.68 mg kg(-1)) and soils (22.93 μg kg(-1)) were compatible to those found elsewhere (Austria, China, and Turkey). Enrofloxacin was the most often detected compound (30% of poultry litters and 27% of soils) at the highest mean concentrations (6.68 mg kg(-1) for poultry litters and 22.93 μg kg(-1) for soils). These results show that antibiotics are routinely used in poultry production and might represent one potential source of pollution to the environment that has been largely ignored and should be further investigated in Brazil. Copyright © 2012 Elsevier B.V. All rights reserved.

  11. Big Data Application in Biomedical Research and Health Care: A Literature Review

    PubMed Central

    Luo, Jake; Wu, Min; Gopukumar, Deepika; Zhao, Yiqing

    2016-01-01

    Big data technologies are increasingly used for biomedical and health-care informatics research. Large amounts of biological and clinical data have been generated and collected at an unprecedented speed and scale. For example, the new generation of sequencing technologies enables the processing of billions of DNA sequence data per day, and the application of electronic health records (EHRs) is documenting large amounts of patient data. The cost of acquiring and analyzing biomedical data is expected to decrease dramatically with the help of technology upgrades, such as the emergence of new sequencing machines, the development of novel hardware and software for parallel computing, and the extensive expansion of EHRs. Big data applications present new opportunities to discover new knowledge and create novel methods to improve the quality of health care. The application of big data in health care is a fast-growing field, with many new discoveries and methodologies published in the last five years. In this paper, we review and discuss big data application in four major biomedical subdisciplines: (1) bioinformatics, (2) clinical informatics, (3) imaging informatics, and (4) public health informatics. Specifically, in bioinformatics, high-throughput experiments facilitate the research of new genome-wide association studies of diseases, and with clinical informatics, the clinical field benefits from the vast amount of collected patient data for making intelligent decisions. Imaging informatics is now more rapidly integrated with cloud platforms to share medical image data and workflows, and public health informatics leverages big data techniques for predicting and monitoring infectious disease outbreaks, such as Ebola. In this paper, we review the recent progress and breakthroughs of big data applications in these health-care domains and summarize the challenges, gaps, and opportunities to improve and advance big data applications in health care. PMID:26843812

  12. An efficient and rapid transgenic pollen screening and detection method using flow cytometry.

    PubMed

    Moon, Hong S; Eda, Shigetoshi; Saxton, Arnold M; Ow, David W; Stewart, C Neal

    2011-01-01

    Assaying for transgenic pollen, a major vector of transgene flow, provides valuable information and essential data for the study of gene flow and for assessing the effectiveness of transgene containment. Most studies have employed microscopic screening methods or progeny analyses to estimate the frequency of transgenic pollen. However, these methods are time-consuming and laborious when large numbers of pollen grains must be analyzed to look for rare transgenic pollen grains. Thus, there is an urgent need for a simple, rapid, and high-throughput method for transgenic pollen analysis. In this study, our objective was to determine the accuracy of flow cytometry for transgenic pollen quantification in practical applications where transgenic pollen is infrequent. A suspension of non-transgenic tobacco pollen was spiked with a known amount of verified transgenic tobacco pollen synthesizing low or high amounts of green fluorescent protein (GFP). The flow cytometric method detected approximately 75% and 100% of pollen grains synthesizing low and high amounts of GFP, respectively. The method is rapid, as it is able to count 5000 pollen grains per one-minute run. Our data indicate that this flow cytometric method is useful for studying gene flow and assessing transgene containment.
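
    The gating logic behind such a screen can be sketched in a few lines of numpy: set a fluorescence gate from the non-transgenic population and count events above it. The intensities, population sizes, and gate percentile below are invented for illustration and do not correspond to the instrument settings used in the study.

        # Hypothetical sketch of fluorescence gating for rare-event detection:
        # count events whose GFP signal exceeds a gate set from non-transgenic pollen.
        # All intensities and the gate percentile are synthetic assumptions.
        import numpy as np

        rng = np.random.default_rng(1)
        background = rng.lognormal(mean=2.0, sigma=0.3, size=99_000)   # non-transgenic pollen
        gfp_events = rng.lognormal(mean=4.0, sigma=0.4, size=1_000)    # spiked GFP pollen
        intensities = np.concatenate([background, gfp_events])

        gate = np.percentile(background, 99.9)          # gate from the negative population
        detected = np.count_nonzero(intensities > gate)
        print(f"estimated transgenic fraction: {detected / intensities.size:.4%}")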

  13. High pressure liquid chromatographic method for the separation and quantitation of water-soluble radiolabeled benzene metabolites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sabourin, P.J.; Bechtold, W.E.; Henderson, R.F.

    1988-05-01

    The glucuronide and sulfate conjugates of benzene metabolites as well as muconic acid and pre-phenyl- and phenylmercapturic acids were separated by ion-pairing HPLC. The HPLC method developed was suitable for automated analysis of a large number of tissue or excreta samples. p-Nitrophenyl (¹⁴C)glucuronide was used as an internal standard for quantitation of these water-soluble metabolites. Quantitation was verified by spiking liver tissue with various amounts of phenylsulfate or glucuronides of phenol, catechol, or hydroquinone and analyzing by HPLC. Values determined by HPLC analysis were within 10% of the actual amount with which the liver was spiked. The amount of metabolite present in urine following exposure to (³H)benzene was determined using p-nitrophenyl (¹⁴C)glucuronide as an internal standard. Phenylsulfate was the major water-soluble metabolite in the urine of F344 rats exposed to 50 ppm (³H)benzene for 6 h. Muconic acid and an unknown metabolite which decomposed in acidic media to phenylmercapturic acid were also present. Liver, however, contained a different metabolic profile. This indicates that urinary metabolite profiles may not be a true reflection of what is seen in individual tissues.
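
    Internal-standard quantitation of the kind described, in which the metabolite amount is scaled by the ratio of the analyte signal to the internal-standard signal, reduces to a one-line calculation. The sketch below is a generic illustration with made-up peak areas, amounts, and response factor, not values from the study.

        # Illustrative internal-standard calculation for HPLC quantitation.
        # All peak areas, amounts and the response factor are made up for the example.
        def amount_from_internal_standard(area_analyte, area_istd, amount_istd_ug,
                                          response_factor=1.0):
            """Amount of analyte = (analyte area / internal-standard area)
               * amount of internal standard / relative response factor."""
            return (area_analyte / area_istd) * amount_istd_ug / response_factor

        # Example: analyte peak area vs. the internal-standard peak area
        print(amount_from_internal_standard(area_analyte=5.2e4, area_istd=2.6e4,
                                            amount_istd_ug=10.0, response_factor=0.9))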

  14. Use of tropical maize for bioethanol production

    USDA-ARS?s Scientific Manuscript database

    Tropical maize is an alternative energy crop being considered as a feedstock for bioethanol production in the North Central and Midwest United States. Tropical maize is advantageous because it produces large amounts of soluble sugars in its stalks, creates a large amount of biomass, and requires lo...

  15. Surface Operations Systems Improve Airport Efficiency

    NASA Technical Reports Server (NTRS)

    2009-01-01

    With Small Business Innovation Research (SBIR) contracts from Ames Research Center, Mosaic ATM of Leesburg, Virginia created software to analyze surface operations at airports. Surface surveillance systems, which report locations every second for thousands of air and ground vehicles, generate massive amounts of data, making gathering and analyzing this information difficult. Mosaic's Surface Operations Data Analysis and Adaptation (SODAA) tool is an off-line support tool that can analyze how well the airport surface operation is working and can help redesign procedures to improve operations. SODAA helps researchers pinpoint trends and correlations in vast amounts of recorded airport operations data.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kooperman, Gabriel J.; Pritchard, Michael S.; Burt, Melissa A.

    Changes in the character of rainfall are assessed using a holistic set of statistics based on rainfall frequency and amount distributions in climate change experiments with three conventional and superparameterized versions of the Community Atmosphere Model (CAM and SPCAM). Previous work has shown that high-order statistics of present-day rainfall intensity are significantly improved with superparameterization, especially in regions of tropical convection. Globally, the two modeling approaches project a similar future increase in mean rainfall, especially across the Inter-Tropical Convergence Zone (ITCZ) and at high latitudes, but over land, SPCAM predicts a smaller mean change than CAM. Changes in high-order statistics are similar at high latitudes in the two models but diverge at lower latitudes. In the tropics, SPCAM projects a large intensification of moderate and extreme rain rates in regions of organized convection associated with the Madden Julian Oscillation, ITCZ, monsoons, and tropical waves. In contrast, this signal is missing in all versions of CAM, which are found to be prone to predicting increases in the amount but not intensity of moderate rates. Predictions from SPCAM exhibit a scale-insensitive behavior with little dependence on horizontal resolution for extreme rates, while lower resolution (~2°) versions of CAM are not able to capture the response simulated with higher resolution (~1°). Furthermore, moderate rain rates analyzed by the “amount mode” and “amount median” are found to be especially telling as a diagnostic for evaluating climate model performance and tracing future changes in rainfall statistics to tropical wave modes in SPCAM.
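
    The "amount mode" and "amount median" diagnostics mentioned above can be illustrated with a short numpy sketch that bins rain rates, accumulates the rainfall amount contributed by each bin, and reads off the rate at which the amount distribution peaks and the rate that splits the total amount in half. The rain rates and bin choices here are synthetic assumptions, not model output.

        # Minimal sketch of a rainfall "amount distribution" diagnostic: the amount
        # mode is the rain rate at which the amount distribution peaks, and the amount
        # median is the rate below which half of the total rainfall falls.
        import numpy as np

        rng = np.random.default_rng(2)
        rates = rng.lognormal(mean=0.5, sigma=1.0, size=100_000)   # mm/day, synthetic

        bins = np.logspace(-1, 3, 60)                              # log-spaced rain-rate bins
        amount_per_bin, edges = np.histogram(rates, bins=bins, weights=rates)
        centers = np.sqrt(edges[:-1] * edges[1:])

        amount_mode = centers[np.argmax(amount_per_bin)]
        cumulative = np.cumsum(amount_per_bin) / amount_per_bin.sum()
        amount_median = centers[np.searchsorted(cumulative, 0.5)]
        print(amount_mode, amount_median)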

  17. Plasma reactor waste management systems

    NASA Technical Reports Server (NTRS)

    Ness, Robert O., Jr.; Rindt, John R.; Ness, Sumitra R.

    1992-01-01

    The University of North Dakota is developing a plasma reactor system for use in closed-loop processing that includes biological, materials, manufacturing, and waste processing. Direct-current, high-frequency, or microwave discharges will be used to produce plasmas for the treatment of materials. The plasma reactors offer several advantages over other systems, including low operating temperatures, low operating pressures, mechanical simplicity, and relatively safe operation. Human fecal material, sunflowers, oats, soybeans, and plastic were oxidized in a batch plasma reactor. Over 98 percent of the organic material was converted to gaseous products. The solids were then analyzed, and large amounts of water- and acid-soluble materials were detected. These materials could possibly be used as nutrients for biological systems.

  18. The research and development of water resources management information system based on ArcGIS

    NASA Astrophysics Data System (ADS)

    Cui, Weiqun; Gao, Xiaoli; Li, Yuzhi; Cui, Zhencai

    Given the large amount of data and the complexity of data types and formats in water resources management, we built a water resources calculation model and established a water resources management information system based on the ArcGIS and Visual Studio .NET development platforms. The system can integrate spatial data and attribute data organically and manage them uniformly. It can analyze spatial data, support bidirectional queries between maps and data, automatically generate various charts and report forms, link multimedia information, and manage databases. It can therefore provide spatial and static synthetic information services for the study, management, and decision-making of water resources, regional geology, the eco-environment, and related fields.

  19. [Hyperspectral remote sensing in monitoring the vegetation heavy metal pollution].

    PubMed

    Li, Na; Lü, Jian-sheng; Altemann, W

    2010-09-01

    Mine exploitation aggravates environmental pollution. The large amount of heavy metal elements in the drainage of slag from mines pollutes the soil seriously, harming vegetation growth and human health. Investigation of mining-related environmental pollution is urgent, and remote sensing, as a relatively new technique, can contribute substantially to it. In the present paper, a copper mine in Dexing was selected as the study area and China sumac as the study plant. Samples and field spectral data were gathered and analyzed in the laboratory. A regression model relating spectral characteristics to heavy metal content was built, and the feasibility of hyperspectral remote sensing for environmental pollution monitoring was demonstrated.

  20. Identifying relationships between unrelated pharmaceutical target proteins on the basis of shared active compounds.

    PubMed

    Miljković, Filip; Kunimoto, Ryo; Bajorath, Jürgen

    2017-08-01

    Computational exploration of small-molecule-based relationships between target proteins from different families. Target annotations of drugs and other bioactive compounds were systematically analyzed on the basis of high-confidence activity data. A total of 286 novel chemical links were established between distantly related or unrelated target proteins. These relationships involved a total of 1859 bioactive compounds including 147 drugs and 141 targets. Computational analysis of large amounts of compounds and activity data has revealed unexpected relationships between diverse target proteins on the basis of compounds they share. These relationships are relevant for drug discovery efforts. Target pairs that we have identified and associated compound information are made freely available.
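
    A minimal sketch of the underlying idea, grouping high-confidence activity annotations by compound and counting the compounds shared by each target pair, is shown below; the activity records are invented placeholders, and the published analysis of course involves additional filtering and confidence criteria.

        # Sketch of deriving compound-based links between targets: group activity
        # records by compound, then count how many compounds each target pair shares.
        # The (compound, target) records below are invented for illustration.
        from collections import defaultdict
        from itertools import combinations

        activities = [
            ("CHEMBL_A", "kinase_X"), ("CHEMBL_A", "GPCR_Y"),
            ("CHEMBL_B", "kinase_X"), ("CHEMBL_B", "GPCR_Y"),
            ("CHEMBL_C", "protease_Z"),
        ]

        targets_per_compound = defaultdict(set)
        for compound, target in activities:
            targets_per_compound[compound].add(target)

        shared = defaultdict(set)
        for compound, targets in targets_per_compound.items():
            for pair in combinations(sorted(targets), 2):
                shared[pair].add(compound)

        for (t1, t2), compounds in shared.items():
            print(t1, t2, "linked by", len(compounds), "compounds")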

  1. Determination of small quantities of fluoride in water: A modified zirconium-alizarin method

    USGS Publications Warehouse

    Lamar, W.L.; Seegmiller, C.G.

    1941-01-01

    The zirconium-alizarin method has been modified to facilitate the convenient and accurate determination of small amounts of fluoride in a large number of water samples. Sulfuric acid is used to acidify the samples to reduce the interference of sulfate. The pH is accurately controlled to give the most sensitive comparisons. Most natural waters can be analyzed by the modified procedure without resorting to correction curves. The fluoride content of waters containing less than 500 parts per million of sulfate, 500 parts per million of bicarbonate, and 1000 parts per million of chloride may be determined within a limit of about 0.1 part per million when a 100-ml. sample is used.

  2. Dynamic Creation of Social Networks for Syndromic Surveillance Using Information Fusion

    NASA Astrophysics Data System (ADS)

    Holsopple, Jared; Yang, Shanchieh; Sudit, Moises; Stotz, Adam

    To enhance the effectiveness of health care, many medical institutions have started transitioning to electronic health and medical records and sharing these records between institutions. The large amount of complex and diverse data makes it difficult to identify and track relationships and trends, such as disease outbreaks, from the data points. INFERD (Information Fusion Engine for Real-Time Decision-Making) is an information fusion tool that dynamically correlates and tracks event progressions. This paper presents a methodology that utilizes the efficient and flexible structure of INFERD to create social networks representing progressions of disease outbreaks. Individual symptoms are treated as features, allowing multiple hypotheses to be tracked and analyzed for effective and comprehensive syndromic surveillance.

  3. Personal Genomic Information Management and Personalized Medicine: Challenges, Current Solutions, and Roles of HIM Professionals

    PubMed Central

    Alzu'bi, Amal; Zhou, Leming; Watzlaf, Valerie

    2014-01-01

    In recent years, the term personalized medicine has received more and more attention in the field of healthcare. The increasing use of this term is closely related to the astonishing advancement in DNA sequencing technologies and other high-throughput biotechnologies. A large amount of personal genomic data can be generated by these technologies in a short time. Consequently, the needs for managing, analyzing, and interpreting these personal genomic data to facilitate personalized care are escalated. In this article, we discuss the challenges for implementing genomics-based personalized medicine in healthcare, current solutions to these challenges, and the roles of health information management (HIM) professionals in genomics-based personalized medicine. PMID:24808804

  4. Analyzing the security of an existing computer system

    NASA Technical Reports Server (NTRS)

    Bishop, M.

    1986-01-01

    Most work concerning secure computer systems has dealt with the design, verification, and implementation of provably secure computer systems, or has explored ways of making existing computer systems more secure. The problem of locating security holes in existing systems has received considerably less attention; methods generally rely on thought experiments as a critical step in the procedure. The difficulty is that such experiments require that a large amount of information be available in a format that makes correlating the details of various programs straightforward. This paper describes a method of providing such a basis for the thought experiment by writing a special manual for parts of the operating system, system programs, and library subroutines.

  5. Algorithms for synthesizing management solutions based on OLAP-technologies

    NASA Astrophysics Data System (ADS)

    Pishchukhin, A. M.; Akhmedyanova, G. F.

    2018-05-01

    OLAP technologies are a convenient means of analyzing large amounts of information. In this work, an attempt was made to improve the synthesis of optimal management decisions. The developed algorithms allow forecasting of needs and of the management decisions adopted for the main types of enterprise resources. Their advantage is efficiency, based on the simplicity of quadratic functions and first-order differential equations. At the same time, resources are optimally redistributed between the different product types in the enterprise's assortment, and the allocated resources are optimally distributed over time. The proposed solutions can be placed on additional, specially introduced coordinates of the hypercube representing the data warehouse.

  6. Big Data and Analytics in Healthcare.

    PubMed

    Tan, S S-L; Gao, G; Koch, S

    2015-01-01

    This editorial is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". The amount of data being generated in the healthcare industry is growing at a rapid rate. This has generated immense interest in leveraging the availability of healthcare data (and "big data") to improve health outcomes and reduce costs. However, the nature of healthcare data, and especially big data, presents unique challenges in processing and analyzing big data in healthcare. This Focus Theme aims to disseminate some novel approaches to address these challenges. More specifically, approaches ranging from efficient methods of processing large clinical data to predictive models that could generate better predictions from healthcare data are presented.

  7. Multivariate analysis for scanning tunneling spectroscopy data

    NASA Astrophysics Data System (ADS)

    Yamanishi, Junsuke; Iwase, Shigeru; Ishida, Nobuyuki; Fujita, Daisuke

    2018-01-01

    We applied principal component analysis (PCA) to two-dimensional tunneling spectroscopy (2DTS) data obtained on a Si(111)-(7 × 7) surface to explore the effectiveness of multivariate analysis for interpreting 2DTS data. We demonstrated that several components that originated mainly from specific atoms at the Si(111)-(7 × 7) surface can be extracted by PCA. Furthermore, we showed that hidden components in the tunneling spectra can be decomposed (peak separation), which is difficult to achieve with normal 2DTS analysis without the support of theoretical calculations. Our analysis showed that multivariate analysis can be an additional powerful way to analyze 2DTS data and extract hidden information from a large amount of spectroscopic data.
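
    A minimal sketch of this kind of multivariate treatment, assuming a synthetic 2DTS cube of dI/dV spectra on a 64 x 64 grid, is shown below; the shapes and the number of retained components are illustrative assumptions, not the parameters used in the study.

        # Minimal sketch of applying PCA to two-dimensional tunneling spectroscopy data:
        # treat each pixel's spectrum as one observation and decompose the whole stack.
        # The 64x64 grid and 256 bias points are assumed shapes; data are synthetic.
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(3)
        spectra = rng.normal(size=(64, 64, 256))          # (x, y, bias) synthetic 2DTS cube

        X = spectra.reshape(-1, spectra.shape[-1])        # pixels become rows
        pca = PCA(n_components=5).fit(X)

        components = pca.components_                      # spectral signatures of components
        score_maps = pca.transform(X).reshape(64, 64, 5)  # spatial maps of each component
        print(pca.explained_variance_ratio_)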

  8. Automatic Extraction of Planetary Image Features

    NASA Technical Reports Server (NTRS)

    Troglio, G.; LeMoigne, J.; Moser, G.; Serpico, S. B.; Benediktsson, J. A.

    2009-01-01

    With the launch of several Lunar missions such as the Lunar Reconnaissance Orbiter (LRO) and Chandrayaan-1, a large number of Lunar images will be acquired and will need to be analyzed. Although many automatic feature extraction methods have been proposed and utilized for Earth remote sensing images, these methods are not always applicable to Lunar data, which often present low contrast and uneven illumination characteristics. In this paper, we propose a new method for the extraction of Lunar features (that can be generalized to other planetary images), based on the combination of several image processing techniques, a watershed segmentation, and the generalized Hough Transform. This feature extraction has many applications, among which is image registration.
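
    The two ingredients named above can be sketched with scikit-image on a toy image; note that a circular Hough transform stands in here for the generalized Hough transform used in the paper, and the synthetic image and parameters are purely illustrative.

        # Sketch of watershed segmentation plus a (circular) Hough transform applied to
        # a synthetic crater-like image. Not the paper's method; parameters are toy values.
        import numpy as np
        from skimage.draw import disk
        from skimage.feature import canny
        from skimage.filters import sobel
        from skimage.segmentation import watershed
        from skimage.transform import hough_circle, hough_circle_peaks

        image = np.zeros((200, 200))
        rr, cc = disk((100, 100), 40)
        image[rr, cc] = 1.0                                  # one bright circular "crater"

        # Watershed on the gradient image, seeded with simple intensity markers
        markers = np.zeros_like(image, dtype=int)
        markers[image < 0.1] = 1
        markers[image > 0.9] = 2
        regions = watershed(sobel(image), markers)
        print("watershed labels:", np.unique(regions))

        # Circular Hough transform on edge pixels to recover the crater rim
        edges = canny(image, sigma=2.0)
        radii = np.arange(30, 50, 2)
        accum = hough_circle(edges, radii)
        _, cx, cy, r = hough_circle_peaks(accum, radii, total_num_peaks=1)
        print("crater center:", (cx[0], cy[0]), "radius:", r[0])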

  9. Development and Commissioning of an External Beam Facility in the Union College Ion Beam Analysis Laboratory

    NASA Astrophysics Data System (ADS)

    Yoskowitz, Joshua; Clark, Morgan; Labrake, Scott; Vineyard, Michael

    2015-10-01

    We have developed an external beam facility for the 1.1-MV tandem Pelletron accelerator in the Union College Ion Beam Analysis Laboratory. The beam is extracted from an aluminum pipe through a 1/4-inch diameter window with a 7.5-µm thick Kapton foil. This external beam facility allows us to perform ion beam analysis on samples that cannot be put under vacuum, including wet samples and samples too large to fit into the scattering chamber. We have commissioned the new facility by performing proton induced X-ray emission (PIXE) analysis of several samples of environmental interest. These include samples of artificial turf, running tracks, and a human tooth with an amalgam filling. A 1.7-MeV external proton beam was incident on the samples positioned 2 cm from the window. The resulting X-rays were measured using a silicon drift detector and were analyzed using GUPIX software to determine the concentrations of elements in the samples. The results on the human tooth indicate that while significant concentrations of Hg, Ag, and Sn are present in the amalgam filling, only trace amounts of Hg appear to have leached into the tooth. The artificial turf and running tracks show rather large concentrations of a broad range of elements and trace amounts of Pb in the turf infill.

  10. [Blue-light induced expression of S-adenosyl-L-homocysteine hydrolase-like gene in Mucor amphibiorum RCS1].

    PubMed

    Gao, Ya; Wang, Shu; Fu, Mingjia; Zhong, Guolin

    2013-09-04

    To determine the blue-light-induced expression of the S-adenosyl-L-homocysteine hydrolase-like (sahhl) gene in the fungus Mucor amphibiorum RCS1, a 555 bp sequence was obtained from M. amphibiorum RCS1 by random PCR. The 555 bp sequence was labeled with digoxin to prepare a probe for northern hybridization. By northern hybridization, the transcription of the sahhl gene was analyzed in M. amphibiorum RCS1 mycelia cultured from darkness to blue light to darkness; real-time PCR was used in parallel for sahhl gene expression analysis. Comparison with the sahh gene sequences from Homo sapiens, Mus musculus, and several fungal species confirmed high homology of the 555 bp sequence, preliminarily supporting that it is the sahhl gene of M. amphibiorum RCS1. After a 24 h dark pre-culture, large amounts of sahhl transcript in the mycelia could be detected by northern hybridization and real-time PCR under 24 h of blue light. However, large amounts of sahhl transcript were not found after a 48 h dark pre-culture, even when the M. amphibiorum RCS1 mycelia were induced by blue light. Blue light can therefore induce expression of the sahhl gene in vigorously growing M. amphibiorum RCS1 mycelia.

  11. Influence of the Wenchuan earthquake on self-reported irregular menstrual cycles in surviving women.

    PubMed

    Li, Xiao-Hong; Qin, Lang; Hu, Han; Luo, Shan; Li, Lei; Fan, Wei; Xiao, Zhun; Li, Ying-Xing; Li, Shang-Wei

    2011-09-01

    To explore the influence of stress induced by the Wenchuan earthquake on the menstrual cycles of surviving women, self-reports of the menstrual cycles of 473 women who survived the Wenchuan earthquake were analyzed. Menstrual regularity was defined as a cycle length between 21 and 35 days. The death of a child or the loss of property and social resources was verified for all surviving women. The severity of these losses was assessed and graded as high, little, or none. About 21% of the study participants reported that their menstrual cycles became irregular after the Wenchuan earthquake, a percentage significantly higher than before the earthquake (6%, p < 0.05). About 30% of the surviving women with a high degree of loss in the earthquake reported menstrual irregularity after the earthquake. Association analyses showed that some stressors of the Wenchuan earthquake were strongly associated with self-reports of menstrual irregularity, including the loss of children (RR: 1.58; 95% CI: 1.09, 2.28), the loss of large amounts of property (RR: 1.49; 95% CI: 1.03, 2.15), the loss of social resources (RR: 1.34; 95% CI: 1.00, 1.80), and hormonal contraceptive use (RR: 1.62; 95% CI: 1.21, 1.83). Self-reported menstrual irregularity is common in women who survived the Wenchuan earthquake, especially in those who lost children, large amounts of property, and social resources.

  12. CMS Connect

    NASA Astrophysics Data System (ADS)

    Balcas, J.; Bockelman, B.; Gardner, R., Jr.; Hurtado Anampa, K.; Jayatilaka, B.; Aftab Khan, F.; Lannon, K.; Larson, K.; Letts, J.; Marra Da Silva, J.; Mascheroni, M.; Mason, D.; Perez-Calero Yzquierdo, A.; Tiradani, A.

    2017-10-01

    The CMS experiment collects and analyzes large amounts of data coming from high energy particle collisions produced by the Large Hadron Collider (LHC) at CERN. This involves a huge amount of real and simulated data processing that needs to be handled in batch-oriented platforms. The CMS Global Pool of computing resources provides more than 100K dedicated CPU cores and another 50K to 100K CPU cores from opportunistic resources for these kinds of tasks, and even though production and event-processing analysis workflows are already managed by existing tools, there is still a lack of support for submitting final-stage, Condor-like analysis jobs familiar to Tier-3 or local computing facility users into these distributed resources in a way that is friendly and integrated with other CMS services. CMS Connect is a set of computing tools and services designed to augment existing services in the CMS Physics community, focusing on this kind of Condor analysis job. It is based on the CI-Connect platform developed by the Open Science Grid and uses the CMS GlideInWMS infrastructure to transparently plug CMS global grid resources into a virtual pool accessed via a single submission machine. This paper describes the specific developments and deployment of CMS Connect beyond the CI-Connect platform in order to integrate the service with CMS-specific needs, including site-specific submission, job accounting, and automated reporting to standard CMS monitoring resources in a way that is effortless for users.

  13. Long term real-time GB_InSAR monitoring of a large rock slide

    NASA Astrophysics Data System (ADS)

    Crosta, G. B.; Agliardi, F.; Sosio, R.; Rivolta, C.; Mannucci, G.

    2011-12-01

    We analyze a long term monitoring dataset collected for a deep-seated rockslide (Ruinon, Lombardy, Italy). The rockslide has been actively monitored since 1997 by means of an in situ monitoring network (topographic benchmarks, GPS, wire extensometers) and since 2006 by a ground based radar. Monitoring data have been used to set up and update the geological model, to identify the rockslide extent and geometry, and to analyse the sensitivity to seasonal changes and their impact on the reliability and early warning potential of monitoring data. GB-InSAR data allowed us to identify sectors characterized by different behaviours and associated with outcropping bedrock, thick debris cover, and major structures. GB-InSAR data have been used to set up a "virtual monitoring network" by a posteriori selection of critical locations. Displacement time series extracted from GB-InSAR data provide a large amount of information even in debris-covered areas, where ground-based instrumentation fails. Such spatially-distributed, improved information, validated by selected ground-based measurements, allowed us to establish new velocity and displacement thresholds for early warning purposes. The data are analysed to verify the dependency of the observed displacements on the line of sight orientation as well as on that of the framed resolution cell. Relationships with rainfall and morphological slope characteristics have been analysed to verify the sensitivity to rain intensity and amount and to distinguish among the different possible mechanisms.
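
    The early-warning use of such displacement time series can be illustrated with a short numpy sketch that converts cumulative line-of-sight displacement into daily velocity and flags exceedances of an alert threshold; the displacement values and the threshold are invented for illustration and are not the thresholds established for Ruinon.

        # Illustrative sketch of velocity-threshold early warning from a displacement
        # time series. All values (displacements, threshold) are hypothetical.
        import numpy as np

        days = np.arange(30)
        displacement_mm = np.concatenate([0.5 * days[:20],            # slow creep
                                          10 + 4.0 * (days[20:] - 19)])  # acceleration

        velocity_mm_per_day = np.gradient(displacement_mm, days)
        alert_threshold = 2.0                       # mm/day, hypothetical
        alarms = days[velocity_mm_per_day > alert_threshold]
        print("days above threshold:", alarms)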

  14. Submarine landslides

    USGS Publications Warehouse

    Hampton, M.A.; Lee, H.J.; Locat, J.

    1996-01-01

    Landslides are common on inclined areas of the seafloor, particularly in environments where weak geologic materials such as rapidly deposited, finegrained sediment or fractured rock are subjected to strong environmental stresses such as earthquakes, large storm waves, and high internal pore pressures. Submarine landslides can involve huge amounts of material and can move great distances: slide volumes as large as 20,000 km3 and runout distances in excess of 140 km have been reported. They occur at locations where the downslope component of stress exceeds the resisting stress, causing movement along one or several concave to planar rupture surfaces. Some recent slides that originated nearshore and retrogressed back across the shoreline were conspicuous by their direct impact on human life and activities. Most known slides, however, occurred far from land in prehistoric time and were discovered by noting distinct to subtle characteristics, such as headwall scarps and displaced sediment or rock masses, on acoustic-reflection profiles and side-scan sonar images. Submarine landslides can be analyzed using the same mechanics principles as are used for occurrences on land. However, some loading mechanisms are unique, for example, storm waves, and some, such as earthquakes, can have greater impact. The potential for limited-deformation landslides to transform into sediment flows that can travel exceedingly long distances is related to the density of the slope-forming material and the amount of shear strength that is lost when the slope fails.

  15. MaRaCluster: A Fragment Rarity Metric for Clustering Fragment Spectra in Shotgun Proteomics.

    PubMed

    The, Matthew; Käll, Lukas

    2016-03-04

    Shotgun proteomics experiments generate large amounts of fragment spectra as primary data, normally with high redundancy between and within experiments. Here, we have devised a clustering technique to identify fragment spectra stemming from the same species of peptide. This is a powerful alternative method to traditional search engines for analyzing spectra, specifically useful for larger scale mass spectrometry studies. As an aid in this process, we propose a distance calculation relying on the rarity of experimental fragment peaks, following the intuition that peaks shared by only a few spectra offer more evidence than peaks shared by a large number of spectra. We used this distance calculation and a complete-linkage scheme to cluster data from a recent large-scale mass spectrometry-based study. The clusterings produced by our method have up to 40% more identified peptides for their consensus spectra compared to those produced by the previous state-of-the-art method. We see that our method would advance the construction of spectral libraries as well as serve as a tool for mining large sets of fragment spectra. The source code and Ubuntu binary packages are available at https://github.com/statisticalbiotechnology/maracluster (under an Apache 2.0 license).
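
    The intuition can be sketched independently of MaRaCluster's implementation: weight shared peaks by their rarity and feed the resulting distances to complete-linkage clustering. In the toy example below, spectra are reduced to sets of binned peak masses and the rarity weight is a simple inverse-frequency term; both choices are assumptions for illustration only.

        # Conceptual sketch of rarity-weighted spectrum clustering (not MaRaCluster's code):
        # peaks shared by few spectra contribute more similarity than common peaks, and the
        # resulting distances feed a complete-linkage clustering. Spectra are toy peak sets.
        import numpy as np
        from collections import Counter
        from itertools import combinations
        from scipy.cluster.hierarchy import fcluster, linkage
        from scipy.spatial.distance import squareform

        spectra = [{100, 200, 300}, {100, 200, 310}, {500, 600, 700}, {500, 600, 710}]

        peak_counts = Counter(p for s in spectra for p in s)
        rarity = {p: np.log(len(spectra) / peak_counts[p]) for p in peak_counts}

        n = len(spectra)
        dist = np.zeros((n, n))
        for i, j in combinations(range(n), 2):
            shared = sum(rarity[p] for p in spectra[i] & spectra[j])
            dist[i, j] = dist[j, i] = 1.0 / (1.0 + shared)   # rare shared peaks -> closer

        tree = linkage(squareform(dist), method="complete")
        labels = fcluster(tree, t=0.9, criterion="distance")
        print(labels)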

  16. Gene Expression Browser: Large-Scale and Cross-Experiment Microarray Data Management, Search & Visualization

    USDA-ARS?s Scientific Manuscript database

    The amount of microarray gene expression data in public repositories has been increasing exponentially for the last couple of decades. High-throughput microarray data integration and analysis has become a critical step in exploring the large amount of expression data for biological discovery. Howeve...

  17. A Simple Tool for the Design and Analysis of Multiple-Reflector Antennas in a Multi-Disciplinary Environment

    NASA Technical Reports Server (NTRS)

    Katz, Daniel S.; Cwik, Tom; Fu, Chuigang; Imbriale, William A.; Jamnejad, Vahraz; Springer, Paul L.; Borgioli, Andrea

    2000-01-01

    The process of designing and analyzing a multiple-reflector system has traditionally been time-intensive, requiring large amounts of both computational and human time. At many frequencies, a discrete approximation of the radiation integral may be used to model the system. The code which implements this physical optics (PO) algorithm was developed at the Jet Propulsion Laboratory. It analyzes systems of antennas in pairs, and for each pair, the analysis can be computationally time-consuming. Additionally, the antennas must be described using a local coordinate system for each antenna, which makes it difficult to integrate the design into a multi-disciplinary framework in which there is traditionally one global coordinate system, even before considering deforming the antenna as prescribed by external structural and/or thermal factors. Finally, setting up the code to correctly analyze all the antenna pairs in the system can take a fair amount of time, and introduces possible human error. The use of parallel computing to reduce the computational time required for the analysis of a given pair of antennas has been previously discussed. This paper focuses on the other problems mentioned above. It will present a methodology and examples of use of an automated tool that performs the analysis of a complete multiple-reflector system in an integrated multi-disciplinary environment (including CAD modeling, and structural and thermal analysis) at the click of a button. This tool, named MOD Tool (Millimeter-wave Optics Design Tool), has been designed and implemented as a distributed tool, with a client that runs almost identically on Unix, Mac, and Windows platforms, and a server that runs primarily on a Unix workstation and can interact with parallel supercomputers with simple instruction from the user interacting with the client.

  18. Sign: large-scale gene network estimation environment for high performance computing.

    PubMed

    Tamada, Yoshinori; Shimamura, Teppei; Yamaguchi, Rui; Imoto, Seiya; Nagasaki, Masao; Miyano, Satoru

    2011-01-01

    Our research group is currently developing software for estimating large-scale gene networks from gene expression data. The software, called SiGN, is specifically designed for the Japanese flagship supercomputer "K computer" which is planned to achieve 10 petaflops in 2012, and other high performance computing environments including Human Genome Center (HGC) supercomputer system. SiGN is a collection of gene network estimation software with three different sub-programs: SiGN-BN, SiGN-SSM and SiGN-L1. In these three programs, five different models are available: static and dynamic nonparametric Bayesian networks, state space models, graphical Gaussian models, and vector autoregressive models. All these models require a huge amount of computational resources for estimating large-scale gene networks and therefore are designed to be able to exploit the speed of 10 petaflops. The software will be available freely for "K computer" and HGC supercomputer system users. The estimated networks can be viewed and analyzed by Cell Illustrator Online and SBiP (Systems Biology integrative Pipeline). The software project web site is available at http://sign.hgc.jp/ .
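
    Of the five model classes listed, the graphical Gaussian model is the easiest to illustrate compactly: estimate a sparse inverse covariance over genes and read edges off its non-zero entries. The scikit-learn sketch below uses synthetic expression data and an arbitrary regularization strength; it is not SiGN's implementation and says nothing about its parallelization.

        # Minimal sketch of a graphical Gaussian model for gene network estimation:
        # fit a sparse inverse covariance and treat non-zero entries as edges.
        # Expression matrix and regularization strength are synthetic assumptions.
        import numpy as np
        from sklearn.covariance import GraphicalLasso

        rng = np.random.default_rng(4)
        expression = rng.normal(size=(200, 30))        # 200 samples x 30 genes, synthetic

        model = GraphicalLasso(alpha=0.2).fit(expression)
        precision = model.precision_
        edges = np.argwhere(np.triu(np.abs(precision) > 1e-3, k=1))
        print(f"{len(edges)} edges among 30 genes")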

  19. Propellant combustion product analyses on an M16 rifle and a 105 mm caliber gun

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ase, P.; Eisenberg, W.; Gordon, S.

    1985-01-01

    Some of the propellant combustion products (particulates and gases) that are formed on firing an M16 rifle and a 105 mm caliber gun have been subjected to qualitative, and to a more limited extent, quantitative chemical analyses. For both weapons, large numbers of trace gas species, 90 and 70 respectively, were identified in the combustion effluents from the small and large bore weapons. Quantifiable data were obtained for 15 of these species in terms of mass of compound formed per unit mass of propellant burned. Polynuclear aromatic hydrocarbons, 11 and 4 respectively, were identified and quantified in the combustion products from the small and large bore weapons. Metal particulates in the respirable range in the combustion products from the M16 rifle were analyzed and quantified. Many of the chemical species identified in the study have known toxicological properties. Although the data base is limited, it appears that within the confines of the different propellants' stoichiometries, the amounts of combustion products formed are approximately directly proportional to the masses of propellant burned.

  20. Consumption with Large Sip Sizes Increases Food Intake and Leads to Underestimation of the Amount Consumed

    PubMed Central

    Bolhuis, Dieuwerke P.; Lakemond, Catriona M. M.; de Wijk, Rene A.; Luning, Pieternel A.; de Graaf, Cees

    2013-01-01

    Background A number of studies have shown that bite and sip sizes influence the amount of food intake. Consuming with small sips instead of large sips means relatively more sips for the same amount of food to be consumed; people may believe that intake is higher, which leads to faster satiation. This effect may be disturbed when people are distracted. Objective The objective of the study is to assess the effects of sip size in a focused state and a distracted state on ad libitum intake and on the estimated amount consumed. Design In this 3×2 cross-over design, 53 healthy subjects consumed ad libitum soup with small sips (5 g, 60 g/min), large sips (15 g, 60 g/min), and free sips (where sip size was determined by subjects themselves), in both a distracted and focused state. Sips were administered via a pump. There were no visual cues toward consumption. Subjects then estimated how much they had consumed by filling soup into soup bowls. Results Intake in the small-sip condition was ∼30% lower than in both the large-sip and free-sip conditions (P<0.001). In addition, subjects underestimated how much they had consumed in the large-sip and free-sip conditions (P<0.03). Distraction led to a general increase in food intake (P = 0.003), independent of sip size. Distraction did not influence sip size or estimations. Conclusions Consumption with large sips led to higher food intake, as expected. Large sips, whether fixed or chosen by subjects themselves, led to underestimation of the amount consumed. This may be a risk factor for over-consumption. Reducing sip or bite sizes may successfully lower food intake, even in a distracted state. PMID:23372657

  1. Sediment budget analysis from Landslide debris and river channel change during the extreme event - example of Typhoon Morakot at Laonong river, Taiwan

    NASA Astrophysics Data System (ADS)

    Chang, Kuo-Jen; Huang, Yu-Ting; Huang, Mei-Jen; Chiang, Yi-Lin; Yeh, En-Chao; Chao, Yu-Jui

    2014-05-01

    Taiwan, due to its high seismicity and high annual rainfall, experiences numerous landslides every year, and their impacts severely affect the island. Typhoon Morakot brought extreme, prolonged rainfall to Taiwan in August 2009 and caused huge losses of life and property in central and southern Taiwan. The Laonong River is the largest tributary of the Gaoping River; its length is 137 km and its basin area is 1,373 km2. During Typhoon Morakot the region received more than 2,000 mm of rainfall, with maximum intensities exceeding 100 mm/hr. The heavy rain triggered many landslides, and debris flowed into the river, causing accumulation and erosion on river banks in different areas and severe disasters within the Laonong River drainage. In the past, studies of sediment blockage of river channels usually relied on field investigation, but because of inconvenient transportation, topographic barriers, or remote locations, such surveys sometimes can hardly be completed. In recent years, the rapid development of remote sensing technology has improved image resolution and quality significantly; remote sensing can provide a wide range of image data and supply essential and precious information. Furthermore, although the amount of sediment transport can be estimated from data such as rainfall, river flux, and suspended loads, the migration of large debris cannot be studied with those data. However, landslides, debris flows, and river sediment transport within a catchment can be evaluated readily by analyzing digital terrain models (DTMs). The purpose of this study is to investigate the phenomenon of river migration and to evaluate the amount of migration along the Laonong River by analyzing DEMs acquired before and after Typhoon Morakot. The DEMs were built from aerial images taken by a digital mapping camera (DMC) and an airborne digital scanner 40 (ADS 40) before and after the typhoon event. The results show that the typhoon caused serious lateral erosion of the Laonong River, especially in Yushan National Park and the midstream region, whereas lateral erosion in the downstream region is not so obvious. Meanwhile, the siltation depth resulting from Typhoon Morakot is larger in the upstream region than in the midstream and downstream regions. The amount of landslide debris created by Typhoon Morakot was too large to be transported; material simply silted in place in the upstream reach, as it did in the midstream area. Because river slope erosion and sediment collapse in the downstream region are less than in the upstream and midstream regions, the amount of river erosion there is slightly larger than the amount of river siltation. The goals of this project are to decipher the sliding process and morphologic changes of large landslide areas, sediment transport and budgets, and to investigate the phenomenon of river migration. The results of this study provide not only a geomatics and GIS dataset of the hazards, but also essential geomorphologic information for other studies and for hazard mitigation and planning.
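
    A minimal sketch of the DEM-differencing step implied above: given two co-registered DEMs, erosion and deposition volumes follow from the per-cell elevation change multiplied by the cell area. The function and numbers below are illustrative assumptions, not the actual workflow or data of the study.

        import numpy as np

        def dod_volumes(dem_before, dem_after, cell_area, threshold=0.0):
            """Compute erosion and deposition volumes from two co-registered DEMs.

            dem_before, dem_after : 2-D elevation arrays on the same grid (m)
            cell_area             : area of one grid cell (m^2)
            threshold             : ignore |dz| below this value to suppress noise (m)
            """
            dz = dem_after - dem_before
            dz = np.where(np.abs(dz) < threshold, 0.0, dz)
            deposition = dz[dz > 0].sum() * cell_area
            erosion = -dz[dz < 0].sum() * cell_area
            return erosion, deposition

        # Toy 2 m grid (4 m^2 cells): returns (6.0, 3.2) m^3 of erosion and deposition.
        before = np.array([[100.0, 101.0], [102.0, 103.0]])
        after = np.array([[99.5, 101.0], [102.8, 102.0]])
        print(dod_volumes(before, after, cell_area=4.0))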

  2. SPIKY: a graphical user interface for monitoring spike train synchrony

    PubMed Central

    Mulansky, Mario; Bozanic, Nebojsa

    2015-01-01

    Techniques for recording large-scale neuronal spiking activity are developing very fast. This leads to an increasing demand for algorithms capable of analyzing large amounts of experimental spike train data. One of the most crucial and demanding tasks is the identification of similarity patterns with a very high temporal resolution and across different spatial scales. To address this task, in recent years three time-resolved measures of spike train synchrony have been proposed, the ISI-distance, the SPIKE-distance, and event synchronization. The Matlab source codes for calculating and visualizing these measures have been made publicly available. However, due to the many different possible representations of the results the use of these codes is rather complicated and their application requires some basic knowledge of Matlab. Thus it became desirable to provide a more user-friendly and interactive interface. Here we address this need and present SPIKY, a graphical user interface that facilitates the application of time-resolved measures of spike train synchrony to both simulated and real data. SPIKY includes implementations of the ISI-distance, the SPIKE-distance, and the SPIKE-synchronization (an improved and simplified extension of event synchronization) that have been optimized with respect to computation speed and memory demand. It also comprises a spike train generator and an event detector that makes it capable of analyzing continuous data. Finally, the SPIKY package includes additional complementary programs aimed at the analysis of large numbers of datasets and the estimation of significance levels. PMID:25744888

  3. SPIKY: a graphical user interface for monitoring spike train synchrony.

    PubMed

    Kreuz, Thomas; Mulansky, Mario; Bozanic, Nebojsa

    2015-05-01

    Techniques for recording large-scale neuronal spiking activity are developing very fast. This leads to an increasing demand for algorithms capable of analyzing large amounts of experimental spike train data. One of the most crucial and demanding tasks is the identification of similarity patterns with a very high temporal resolution and across different spatial scales. To address this task, in recent years three time-resolved measures of spike train synchrony have been proposed, the ISI-distance, the SPIKE-distance, and event synchronization. The Matlab source codes for calculating and visualizing these measures have been made publicly available. However, due to the many different possible representations of the results the use of these codes is rather complicated and their application requires some basic knowledge of Matlab. Thus it became desirable to provide a more user-friendly and interactive interface. Here we address this need and present SPIKY, a graphical user interface that facilitates the application of time-resolved measures of spike train synchrony to both simulated and real data. SPIKY includes implementations of the ISI-distance, the SPIKE-distance, and the SPIKE-synchronization (an improved and simplified extension of event synchronization) that have been optimized with respect to computation speed and memory demand. It also comprises a spike train generator and an event detector that makes it capable of analyzing continuous data. Finally, the SPIKY package includes additional complementary programs aimed at the analysis of large numbers of datasets and the estimation of significance levels. Copyright © 2015 the American Physiological Society.
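
    For orientation, the following is a simplified, time-sampled sketch of the ISI-distance idea: at each time point the dissimilarity is the normalized difference of the two trains' current interspike intervals, and the distance is its time average. SPIKY itself integrates the profile exactly and handles edge effects more carefully, so the function below is illustrative only and the spike trains are made up.

        import numpy as np

        def isi_profile(spikes1, spikes2, t_grid):
            """Sampled ISI-distance profile between two sorted spike-time arrays."""
            def current_isi(spikes, t):
                i = np.searchsorted(spikes, t, side='right')
                i = np.clip(i, 1, len(spikes) - 1)          # crude edge handling
                return spikes[i] - spikes[i - 1]

            x1 = np.array([current_isi(spikes1, t) for t in t_grid])
            x2 = np.array([current_isi(spikes2, t) for t in t_grid])
            return np.abs(x1 - x2) / np.maximum(x1, x2)     # dissimilarity in [0, 1)

        t = np.linspace(0.05, 0.95, 200)
        a = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
        b = np.array([0.0, 0.3, 0.5, 0.6, 0.9, 1.0])
        print(isi_profile(a, b, t).mean())                  # time-averaged estimate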

  4. Model for fluorescence quenching in light harvesting complex II in different aggregation states.

    PubMed

    Andreeva, Atanaska; Abarova, Silvia; Stoitchkova, Katerina; Busheva, Mira

    2009-02-01

    Low-temperature (77 K) steady-state fluorescence emission spectroscopy and dynamic light scattering were applied to the main chlorophyll a/b protein light harvesting complex of photosystem II (LHC II) in different aggregation states to elucidate the mechanism of fluorescence quenching within LHC II oligomers. Evidence is presented that LHC II oligomers are heterogeneous and consist of large and small particles with different fluorescence yields. At intermediate detergent concentrations the mean size of the small particles is similar to that of trimers, while the size of large particles is comparable to that of aggregated trimers without added detergent. It is suggested that in small particles and trimers the emitter is monomeric chlorophyll, whereas in large aggregates there is also another emitter, which is a poorly fluorescing chlorophyll associate. A model describing populations of antenna chlorophyll molecules in small and large aggregates in their ground and first singlet excited states is considered. The model enables us to obtain the ratio of the singlet excited-state lifetimes in small and large particles, the relative amount of chlorophyll molecules in large particles, and the amount of quenchers as a function of the degree of aggregation. These dependencies reveal that the quenching of the chl a fluorescence upon aggregation is due to the formation of large aggregates and the increasing amount of chlorophyll molecules forming these aggregates. As a consequence, the amount of quenchers located in large aggregates increases, and their singlet excited-state lifetimes steeply decrease.

  5. Human, vector and parasite Hsp90 proteins: A comparative bioinformatics analysis.

    PubMed

    Faya, Ngonidzashe; Penkler, David L; Tastan Bishop, Özlem

    2015-01-01

    The treatment of protozoan parasitic diseases is challenging, and thus identification and analysis of new drug targets is important. Parasites survive within host organisms, and some need intermediate hosts to complete their life cycle. Changing host environment puts stress on parasites, and often adaptation is accompanied by the expression of large amounts of heat shock proteins (Hsps). Among Hsps, Hsp90 proteins play an important role in stress environments. Yet, there has been little computational research on Hsp90 proteins to analyze them comparatively as potential parasitic drug targets. Here, an attempt was made to gain detailed insights into the differences between host, vector and parasitic Hsp90 proteins by large-scale bioinformatics analysis. A total of 104 Hsp90 sequences were divided into three groups based on their cellular localizations; namely cytosolic, mitochondrial and endoplasmic reticulum (ER). Further, the parasitic proteins were divided according to the type of parasite (protozoa, helminth and ectoparasite). Primary sequence analysis, phylogenetic tree calculations, motif analysis and physicochemical properties of Hsp90 proteins suggested that despite the overall structural conservation of these proteins, parasitic Hsp90 proteins have unique features which differentiate them from human ones, thus encouraging the idea that protozoan Hsp90 proteins should be further analyzed as potential drug targets.

  6. Agricultural Management Practices Explain Variation in Global Yield Gaps of Major Crops

    NASA Astrophysics Data System (ADS)

    Mueller, N. D.; Gerber, J. S.; Ray, D. K.; Ramankutty, N.; Foley, J. A.

    2010-12-01

    The continued expansion and intensification of agriculture are key drivers of global environmental change. Meeting a doubling of food demand in the next half-century will further induce environmental change, requiring either large cropland expansion into carbon- and biodiversity-rich tropical forests or increasing yields on existing croplands. Closing the “yield gaps” between the most and least productive farmers on current agricultural lands is a necessary and major step towards preserving natural ecosystems and meeting future food demand. Here we use global climate, soils, and cropland datasets to quantify yield gaps for major crops using equal-area climate analogs. Consistent with previous studies, we find large yield gaps for many crops in Eastern Europe, tropical Africa, and parts of Mexico. To analyze the drivers of yield gaps, we collected sub-national agricultural management data and built a global dataset of fertilizer application rates for over 160 crops. We constructed empirical crop yield models for each climate analog using the global management information for 17 major crops. We find that our climate-specific models explain a substantial amount of the global variation in yields. These models could be widely applied to identify management changes needed to close yield gaps, analyze the environmental impacts of agricultural intensification, and identify climate change adaptation techniques.

  7. Fibroblast responses and antibacterial activity of Cu and Zn co-doped TiO2 for percutaneous implants

    NASA Astrophysics Data System (ADS)

    Zhang, Lan; Guo, Jiaqi; Yan, Ting; Han, Yong

    2018-03-01

    In order to enhance the skin integration and antibacterial activity of Ti percutaneous implants, microporous TiO2 coatings co-doped with different doses of Cu2+ and Zn2+ were fabricated directly on Ti via micro-arc oxidation (MAO). The structures of the coatings were investigated, and the behaviors of fibroblasts (L-929) as well as the response of Staphylococcus aureus (S. aureus) were evaluated. During the MAO process, a large number of micro-arc discharges forming on Ti acted as penetrating channels; O2-, Ca2+, Zn2+, Cu2+ and PO43- were delivered via these channels, giving rise to the formation of doped TiO2. Surface characteristics including phase composition, topography, surface roughness and wettability were almost the same for the different coatings, whereas the amount of Cu doped in TiO2 decreased as the Zn amount increased. Compared with Cu single-doped TiO2 (0.77 wt% Cu), co-doping with appropriate amounts of Cu and Zn, for example 0.55 wt% Cu and 2.53 wt% Zn, further improved the proliferation of L-929, facilitated the switch of fibroblasts to a fibrotic phenotype, and enhanced the synthesis of collagen I as well as extracellular collagen secretion; the antibacterial properties, including contact-killing and release-killing, were also enhanced. By analyzing the relationship between the Cu/Zn amounts in TiO2 and the behaviors of L-929 and S. aureus, it can be deduced that when the doped Zn is at a low dose (<1.79 wt%), the behaviors of L-929 and S. aureus are sensitive to the reduced amount of Cu2+, whereas Zn2+ plays the key role in accelerating fibroblast functions and reducing S. aureus when its dose increases markedly from 2.63 to 6.47 wt%.

  8. River basin affected by rare perturbation events: the Chaiten volcanic eruption.

    NASA Astrophysics Data System (ADS)

    Picco, Lorenzo; Iroumé, Andrés; Oss-Cazzador, Daniele; Ulloa, Hector

    2017-04-01

    Natural disasters can strongly and rapidly affect a wide array of environments. Among these, volcanic eruptions can exert severe impacts on the dynamic equilibrium of riverine environments. The production and subsequent mobilization of large amounts of sediment across a river basin can strongly affect hydrology as well as sediment and large wood (LW) transport dynamics. The aim of this research is to quantify the impact of a volcanic eruption along the Blanco River basin (southern Chile), considering the geomorphic setting, sediment dynamics, and wood transport; an overview of possible management strategies to reduce the risks is also proposed. The research was carried out mainly along a 2.2 km-long reach of the fourth-order Blanco stream. Almost the entire river basin was affected by the volcanic eruption; several meters of tephra (up to 8 m) were deposited, affecting the evergreen forest and the fluvial corridor. Field surveys and remote sensing analysis were carried out to investigate the effects of this extreme event. A Terrestrial Laser Scanner (TLS) was used to detect morphological changes by computing DEMs of Difference (DoDs), field surveys were carried out to quantify the amount of in-channel wood, and aerial photos were analyzed to map the extent of the eruption's impact over the river basin. As expected, the DoD analysis revealed predominantly erosional processes along the channel network: erosion over 190,569 m2 mobilized about 362,999 m3 of sediment, while deposition occurred over just 58,715 m2 for a total of 23,957 m3. Regarding the LW recruited and transported downstream, a total of 113 m3/ha of wood was present along the active channel corridor. Moreover, analysis of aerial photographs taken before and after the eruption showed that a total area of about 2.19 km2 was affected by tephra deposition, of which 0.87 km2 has already been eroded by floods while 1.32 km2 remains. Assuming an average depth of 5 m, the volume of sediment that could be eroded and transported downstream in the near future is around 6.5 × 10^6 m3. Finally, another 7.3 × 10^4 m3 of LW could be recruited from the same area and transported towards the mouth. These results may help to better define management strategies to reduce the potential risks to sensitive structures and cross sections downstream. In particular, managing sediment and LW transport through the lower Chaiten village appears to be of fundamental importance to guarantee safer conditions. This research is funded by the Chilean research project FONDECYT 1141064 "Effects of vegetation on channel morphodynamics: a multiscale investigation in Chilean gravel-bed rivers".
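
    A quick check of the erodible-volume figure quoted above, using only the areas and the 5 m average depth stated in the abstract; the small difference from the reported 6.5 × 10^6 m3 is attributable to rounding.

        # Values taken from the abstract; illustrative bookkeeping only.
        area_tephra_km2 = 2.19        # area affected by tephra deposition
        area_eroded_km2 = 0.87        # already eroded by floods
        mean_depth_m = 5.0            # stated average depth

        remaining_m2 = (area_tephra_km2 - area_eroded_km2) * 1.0e6   # 1.32 km2
        erodible_volume_m3 = remaining_m2 * mean_depth_m             # ~6.6e6 m3
        print(erodible_volume_m3)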

  9. Method of photon spectral analysis

    DOEpatents

    Gehrke, Robert J.; Putnam, Marie H.; Killian, E. Wayne; Helmer, Richard G.; Kynaston, Ronnie L.; Goodwin, Scott G.; Johnson, Larry O.

    1993-01-01

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and γ-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2000 keV) as well as high-energy γ rays (>1 MeV). An 8192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear-least-squares spectral fitting technique. The γ-ray portion of each spectrum is analyzed by a standard Ge γ-ray analysis program. This method can be applied to any analysis involving x- and γ-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the γ-ray analysis and accommodated during the x-ray analysis.
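
    The core of the linear-least-squares fitting step can be sketched as an ordinary least-squares fit of the measured channel counts to a set of reference component shapes; the amplitudes then scale to activities once the references are normalized. The snippet below shows only that core idea with toy shapes; the patented method presumably adds statistical weighting, peak-tail modeling, and the interference handling described above.

        import numpy as np

        def fit_spectrum(measured, components):
            """Fit measured counts as a linear combination of reference shapes.

            measured   : (channels,) counts per analyzer channel
            components : (channels, k) reference shapes, e.g. one per x-ray line
                         plus a background shape
            """
            amplitudes, *_ = np.linalg.lstsq(components, measured, rcond=None)
            return amplitudes

        channels = np.arange(256)
        line = np.exp(-0.5 * ((channels - 100) / 3.0) ** 2)   # toy peak shape
        background = np.ones(256)
        components = np.column_stack([line, background])
        measured = 50.0 * line + 5.0 + np.random.default_rng(1).normal(0, 0.5, 256)
        print(fit_spectrum(measured, components))             # approximately [50, 5]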

  10. Method of photon spectral analysis

    DOEpatents

    Gehrke, R.J.; Putnam, M.H.; Killian, E.W.; Helmer, R.G.; Kynaston, R.L.; Goodwin, S.G.; Johnson, L.O.

    1993-04-27

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and γ-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2,000 keV) as well as high-energy γ rays (>1 MeV). An 8,192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear-least-squares spectral fitting technique. The γ-ray portion of each spectrum is analyzed by a standard Ge γ-ray analysis program. This method can be applied to any analysis involving x- and γ-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the γ-ray analysis and accommodated during the x-ray analysis.

  11. Assessment of Large Wood budget in the gravel-bed Piave River: first attempt

    NASA Astrophysics Data System (ADS)

    Tonon, Alessia; Picco, Lorenzo; Ravazzolo, Diego; Aristide Lenzi, Mario

    2015-04-01

    During the last decades, the dynamics of large wood (LW) in rivers have been analyzed in order to define the LW budget. The space-time variation of the LW amount results from the difference between inputs (e.g. fluvial transport, lateral recruitment) and outputs (e.g. fluvial transport, overbank deposition, natural chronic decay) of LW along a riverine environment. Different methodologies have been applied in several fluvial environments; however, in large river systems characterized by complex LW dynamics, the processes are still poorly quantified. The aim of this contribution is to estimate the LW budget over a short period, assessing the effect of an over-bankfull flood (Q = 1039 m3 s-1; R.I. = 3.5 years). The research was carried out along a 1 km-long reach (around 15 ha) located in the middle course of the large gravel-bed Piave River (northeast Italy). The LW budget was defined considering recruitment through bank erosion and the fluvial transport of LW into and out of the study reach. The former was assessed by integrating field data on riparian vegetation with monitoring of the riverbanks using a Differential Global Positioning System (DGPS). The latter was obtained by detecting all LW elements (diameter ≥ 0.10 m and/or length ≥ 1 m) stored along the study reach before and after the flood. For each LW element the GPS position was recorded and a numbered tag was installed, with the addition of colored paint to permit rapid post-event recovery. Preliminary results indicate that, along the study area, floating transport of LW is one of the most significant processes able to modify the amount of LW deposited along a riverine system. Considering the inputs of LW, 99.4% (102 m3 km-1) came from upstream by floating, whereas 0.6% (0.17 m3 km-1) was recruited through bank erosion. Analyzing the outputs, 94.3% (40.26 m3 km-1) of LW was transported downstream of the study area, whereas only 5.7% (2.43 m3 km-1) was involved in internal displacement. In this study, the amount of LW increased by about 60.29% in the number of LW elements and by 145% in volume, corresponding to 61.98 m3 km-1. The methodology presented here appears to be an easy and economical way to assess the LW budget at a small spatial scale. However, further improvements are needed to allow the construction of a comprehensive LW budget that also accounts for the loss of LW through overbank deposition and natural decay. This research is funded within both the University of Padova Research Project CPDA149091 "WoodAlp: linking large Wood and morphological dynamics of gravel bed rivers of Eastern Italian Alps" (2014-16) and the Project "SedAlp: sediment management in Alpine basins, integrating sediment continuum, risk mitigation and hydropower", 83-4-3-AT, in the framework of the European Territorial Cooperation Program "Alpine Space" 2007-13.
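
    The budget bookkeeping described above reduces to inputs minus outputs per kilometre of channel. The short calculation below uses only the figures quoted in the abstract; the small residual relative to the reported 61.98 m3 km-1 net change reflects rounding and terms not itemized here.

        # Reach-scale LW budget, values in m3 per km of channel (from the abstract).
        input_floating = 102.0        # transported in from upstream
        input_bank_erosion = 0.17     # recruited through bank erosion
        output_downstream = 40.26     # transported out of the reach

        net_change = input_floating + input_bank_erosion - output_downstream
        print(net_change)             # ~61.9 m3 km-1 of LW stored in the reach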

  12. Quantitative Determination of Citric and Ascorbic Acid in Powdered Drink Mixes

    ERIC Educational Resources Information Center

    Sigmann, Samuella B.; Wheeler, Dale E.

    2004-01-01

    A procedure by which the reactions are used to quantitatively determine the amount of total acid, the amount of total ascorbic acid, and the amount of citric acid in a given sample of powdered drink mix is described. A safe, reliable, and low-cost quantitative method to analyze a consumer product for acid content is provided.

  13. Advanced heat receiver conceptual design study

    NASA Technical Reports Server (NTRS)

    Kesseli, James; Saunders, Roger; Batchelder, Gary

    1988-01-01

    Solar Dynamic space power systems are candidate electrical power generating systems for future NASA missions. One of the key components of the solar dynamic power system is the solar receiver/thermal energy storage (TES) subsystem. Receiver development was conducted by NASA in the late 1960's and since then a very limited amount of work has been done in this area. Consequently the state of the art (SOA) receivers designed for the IOC space station are large and massive. The objective of the Advanced Heat Receiver Conceptual Design Study is to conceive and analyze advanced high temperature solar dynamic Brayton and Stirling receivers. The goal is to generate innovative receiver concepts that are half of the mass, smaller, and more efficient than the SOA. It is also necessary that these innovative receivers offer ease of manufacturing, less structural complexity and fewer thermal stress problems. Advanced Brayton and Stirling receiver storage units are proposed and analyzed in this study which can potentially meet these goals.

  14. A web server for mining Comparative Genomic Hybridization (CGH) data

    NASA Astrophysics Data System (ADS)

    Liu, Jun; Ranka, Sanjay; Kahveci, Tamer

    2007-11-01

    Advances in cytogenetics and molecular biology have established that chromosomal alterations are critical in the pathogenesis of human cancer. Recurrent chromosomal alterations provide cytological and molecular markers for the diagnosis and prognosis of disease. They also facilitate the identification of genes that are important in carcinogenesis, which in the future may help in the development of targeted therapy. A large and growing amount of cancer genetic data is now publicly available, and there is a need for public domain tools that allow users to analyze their data and visualize the results. This chapter describes a web-based software tool that allows researchers to analyze and visualize Comparative Genomic Hybridization (CGH) datasets. It employs novel data mining methodologies for clustering and classification of CGH datasets as well as algorithms for identifying important markers (small sets of genomic intervals with aberrations) that are potentially cancer signatures. The developed software will help in understanding the relationships between genomic aberrations and cancer types.

  15. On-the-fly machine-learning for high-throughput experiments: search for rare-earth-free permanent magnets

    PubMed Central

    Kusne, Aaron Gilad; Gao, Tieren; Mehta, Apurva; Ke, Liqin; Nguyen, Manh Cuong; Ho, Kai-Ming; Antropov, Vladimir; Wang, Cai-Zhuang; Kramer, Matthew J.; Long, Christian; Takeuchi, Ichiro

    2014-01-01

    Advanced materials characterization techniques with ever-growing data acquisition speed and storage capabilities represent a challenge in modern materials science, and new procedures to quickly assess and analyze the data are needed. Machine learning approaches are effective in reducing the complexity of data and rapidly homing in on the underlying trend in multi-dimensional data. Here, we show that by employing the mean shift algorithm on a large amount of diffraction data in high-throughput experimentation, one can streamline the process of delineating the structural evolution across compositional variations mapped on combinatorial libraries with minimal computational cost. Data collected at a synchrotron beamline are analyzed on the fly, and by integrating experimental data with the inorganic crystal structure database (ICSD), we can substantially enhance the accuracy in classifying the structural phases across ternary phase spaces. We have used this approach to identify a novel magnetic phase with enhanced magnetic anisotropy which is a candidate for a rare-earth-free permanent magnet. PMID:25220062
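
    As a hedged illustration of the clustering step named above, the following is a plain flat-kernel mean-shift routine applied to toy two-dimensional features; the actual pipeline operates on diffraction patterns and adds matching against the ICSD, so this sketches the algorithm only, with made-up data and bandwidth.

        import numpy as np

        def mean_shift(points, bandwidth, n_iter=50):
            """Flat-kernel mean shift: move each point toward the mean of its neighbours.

            Points that converge to the same mode belong to the same cluster.
            """
            modes = points.astype(float).copy()
            for _ in range(n_iter):
                for i, m in enumerate(modes):
                    neighbours = points[np.linalg.norm(points - m, axis=1) < bandwidth]
                    if len(neighbours):
                        modes[i] = neighbours.mean(axis=0)
            return modes

        rng = np.random.default_rng(2)
        pts = np.vstack([rng.normal(0, 0.2, (20, 2)), rng.normal(3, 0.2, (20, 2))])
        modes = mean_shift(pts, bandwidth=1.0)
        print(np.unique(np.round(modes, 1), axis=0))   # roughly two distinct modes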

  16. ImmuneDB: a system for the analysis and exploration of high-throughput adaptive immune receptor sequencing data.

    PubMed

    Rosenfeld, Aaron M; Meng, Wenzhao; Luning Prak, Eline T; Hershberg, Uri

    2017-01-15

    As high-throughput sequencing of B cells becomes more common, the need for tools to analyze the large quantity of data also increases. This article introduces ImmuneDB, a system for analyzing vast amounts of heavy chain variable region sequences and exploring the resulting data. It can take raw FASTA/FASTQ data as input, identify genes, determine clones, and construct lineages, as well as provide information such as selection pressure and mutation analysis. It uses an industry-leading database, MySQL, to provide fast analysis and avoid the complexities of using error-prone flat files. ImmuneDB is freely available at http://immunedb.com. A demo of the ImmuneDB web interface is available at http://immunedb.com/demo. Contact: Uh25@drexel.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  17. Minimal-assumption inference from population-genomic data

    NASA Astrophysics Data System (ADS)

    Weissman, Daniel; Hallatschek, Oskar

    Samples of multiple complete genome sequences contain vast amounts of information about the evolutionary history of populations, much of it in the associations among polymorphisms at different loci. Current methods that take advantage of this linkage information rely on models of recombination and coalescence, limiting the sample sizes and populations that they can analyze. We introduce a method, Minimal-Assumption Genomic Inference of Coalescence (MAGIC), that reconstructs key features of the evolutionary history, including the distribution of coalescence times, by integrating information across genomic length scales without using an explicit model of recombination, demography or selection. Using simulated data, we show that MAGIC's performance is comparable to PSMC' on single diploid samples generated with standard coalescent and recombination models. More importantly, MAGIC can also analyze arbitrarily large samples and is robust to changes in the coalescent and recombination processes. Using MAGIC, we show that the inferred coalescence time histories of samples of multiple human genomes exhibit inconsistencies with a description in terms of an effective population size based on single-genome data.

  18. Airborne system for detection and location of radio interference sources

    NASA Astrophysics Data System (ADS)

    Audone, Bruno; Pastore, Alberto

    1992-11-01

    The rapid expansion of telecommunications has practically saturated every band of the radio frequency spectrum; a similar expansion of electrical and electronic devices has affected all radio communications, which are influenced to some degree by a large amount of interference, produced either intentionally or unintentionally. The operational consequences of such interference, particularly in the frequency channels used for aeronautical services, can be extremely dangerous, making tight control of the electromagnetic spectrum mandatory. The present paper analyzes the requirements and problems related to the surveillance, for civil applications, of the electromagnetic spectrum between 20 and 1000 MHz, with particular attention to the detection and location of radio interference sources. After a brief introduction and an indication of the advantages of an airborne versus a ground installation, the airborne system designed by Alenia in cooperation with the Italian Ministry of Post and Telecommunication, its practical implementation, and its prototype installation on board a small twin-turboprop aircraft for experimental purposes are presented. The results of the flight tests are also analyzed and discussed.

  19. Development of an SPE/CE method for analyzing HAAs

    USGS Publications Warehouse

    Zhang, L.; Capel, P.D.; Hozalski, R.M.

    2007-01-01

    The haloacetic acid (HAA) analysis methods approved by the US Environmental Protection Agency involve extraction and derivatization of HAAs (typically to their methyl ester form) and analysis by gas chromatography (GC) with electron capture detection (ECD). Concerns associated with these methods include the time and effort of the derivatization process, use of potentially hazardous chemicals or conditions during methylation, poor recoveries because of low extraction efficiencies for some HAAs or matrix effects from sulfate, and loss of tribromoacetic acid because of decarboxylation. The HAA analysis method introduced here uses solid-phase extraction (SPE) followed by capillary electrophoresis (CE) analysis. The method is accurate, reproducible, sensitive, relatively safe, and easy to perform, and avoids the use of large amounts of solvent for liquid-liquid extraction and the potential hazards and hassles of derivatization. The cost of analyzing HAAs using this method should be lower than the currently approved methods, and utilities with a GC/ECD can perform the analysis in-house.

  20. Protein-Protein Interaction Network and Gene Ontology

    NASA Astrophysics Data System (ADS)

    Choi, Yunkyu; Kim, Seok; Yi, Gwan-Su; Park, Jinah

    The evolution of computer technologies makes it possible to access large amounts and various kinds of biological data via the Internet, such as DNA sequences, proteomics data, and information discovered about them. It is expected that combining various data could help researchers find further knowledge about them. The roles of a visualization system are to invoke human abilities to integrate information and to recognize certain patterns in the data. Thus, when various kinds of data are examined and analyzed manually, an effective visualization system is an essential part. One instance of such integrated visualization is the combination of protein-protein interaction (PPI) data and Gene Ontology (GO), which could help enhance the analysis of PPI networks. We introduce a simple but comprehensive visualization system that integrates GO and PPI data, where GO and PPI graphs are visualized side-by-side and quick cross-reference functions between them are supported. Furthermore, the proposed system provides several interactive visualization methods for efficiently analyzing the PPI network and the GO directed acyclic graph, such as context-based browsing and common-ancestor finding.

  1. Space radiation dosimetry on US and Soviet manned missions

    NASA Technical Reports Server (NTRS)

    Parnell, T. A.; Benton, E. V.

    1995-01-01

    Radiation measurements obtained on board U.S. and Soviet spacecraft are presented and discussed. A considerable amount of data has now been collected and analyzed from measurements with a variety of detector types in low-Earth orbit. The objectives of these measurements have been to investigate the dose and Linear Energy Transfer (LET) spectra within the complex shielding of large spacecraft. The shielding modifies the external radiation (trapped protons, electrons, cosmic ray nuclei) which, in turn, is quite dependent on orbital parameters (altitude, inclination). For manned flights, these measurements provide a crew exposure record and a data base for future spacecraft design and flight planning. For the scientific community they provide useful information for planning and analyzing data from experiments with high sensitivity to radiation. In this paper, results of measurements by both passive and active detectors are described. High-LET spectra measurements were obtained by means of plastic nuclear track detectors (PNTD's) while thermoluminescent dosimeters (TLD's) measured the dose.

  2. Iterative Integration of Visual Insights during Scalable Patent Search and Analysis.

    PubMed

    Koch, S; Bosch, H; Giereth, M; Ertl, T

    2011-05-01

    Patents are of growing importance in current economic markets. Analyzing patent information has, therefore, become a common task for many interest groups. As a prerequisite for patent analysis, extensive search for relevant patent information is essential. Unfortunately, the complexity of patent material inhibits a straightforward retrieval of all relevant patent documents and leads to iterative, time-consuming approaches in practice. Already the amount of patent data to be analyzed poses challenges with respect to scalability. Further scalability issues arise concerning the diversity of users and the large variety of analysis tasks. With "PatViz", a system for interactive analysis of patent information has been developed addressing scalability at various levels. PatViz provides a visual environment allowing for interactive reintegration of insights into subsequent search iterations, thereby bridging the gap between search and analytic processes. Because of its extensibility, we expect that the approach we have taken can be employed in different problem domains that require high quality of search results regarding their completeness.

  3. Mechanical analysis and force chain determination in granular materials using digital image correlation.

    PubMed

    Chen, Fanxiu; Zhuang, Qi; Zhang, Huixin

    2016-06-20

    The mechanical behaviors of granular materials are governed by the grain properties and microstructure of the materials. We conducted experiments to study force transmission in granular materials using plane strain tests. The large amount of nearly continuous displacement data provided by the advanced noncontact experimental technique of digital image correlation (DIC) offers a means to quantify local displacements and strains at the particle level. The average strain of each particle could be calculated based on the DIC method, and the average stress could be obtained using Hooke's law. The relationship between the stress and the particle force could be obtained based on basic Newtonian mechanics and the balance of linear momentum at the particle level. This methodology is introduced and validated. In the testing procedure, the system is tested in real 2D particle cases, and the contact forces and force chains are obtained and analyzed. The system has great potential for analyzing a real granular system and measuring contact forces and force chains.
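
    A minimal sketch of the strain-to-stress step described above, assuming plane-strain conditions and isotropic linear elasticity (Hooke's law in Lamé form); the material constants and strains are hypothetical, and the paper's further step from particle stresses to contact forces via the balance of linear momentum is not shown.

        import numpy as np

        def plane_strain_stress(exx, eyy, exy, E, nu):
            """Average particle stress tensor from DIC-averaged strains (plane strain)."""
            lam = E * nu / ((1 + nu) * (1 - 2 * nu))   # Lame's first parameter
            mu = E / (2 * (1 + nu))                    # shear modulus
            sxx = lam * (exx + eyy) + 2 * mu * exx
            syy = lam * (exx + eyy) + 2 * mu * eyy
            sxy = 2 * mu * exy
            return np.array([[sxx, sxy], [sxy, syy]])

        # Illustrative values only (soft particle material, small compressive strains).
        print(plane_strain_stress(exx=-1e-3, eyy=-2e-3, exy=5e-4, E=4.0e6, nu=0.45))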

  4. Artificial maturation of an immature sulfur- and organic matter-rich limestone from the Ghareb Formation, Jordan

    USGS Publications Warehouse

    Koopmans, M.P.; Rijpstra, W.I.C.; De Leeuw, J. W.; Lewan, M.D.; Damste, J.S.S.

    1998-01-01

    An immature (Ro = 0.39%), S-rich (Sorg/C = 0.07), organic matter-rich (19.6 wt.% TOC) limestone from the Ghareb Formation (Upper Cretaceous) in Jordan was artificially matured by hydrous pyrolysis (200, 220, ..., 300 °C; 72 h) to study the effect of progressive diagenesis and early catagenesis on the amounts and distributions of hydrocarbons, organic sulfur compounds and S-rich geomacromolecules. The use of internal standards allowed the determination of absolute amounts. With increasing thermal maturation, large amounts of alkanes and alkylthiophenes with predominantly linear carbon skeletons are generated from the kerogen. The alkylthiophene isomer distributions do not change significantly with increasing thermal maturation, indicating the applicability of alkylthiophenes as biomarkers at relatively high levels of thermal maturity. For a given carbon skeleton, the saturated hydrocarbon, alkylthiophenes and alkylbenzo[b]thiophenes are stable forms at relatively high temperatures, whereas the alkylsulfides are not stable. The large amount of alkylthiophenes produced relative to the alkanes may be explained by the large number of monosulfide links per carbon skeleton. These results are in good agreement with those obtained previously for an artificial maturation series of an immature S-rich sample from the Gessoso-solfifera Formation.

  5. CaSO4 Scale Inhibition by a Trace Amount of Zinc Ion in Piping System

    NASA Astrophysics Data System (ADS)

    Mangestiyono, W.; Sutrisno

    2017-05-01

    Usually, a small steam generator is not complemented by equipment such as demineralization and chlorination apparatus, since economics takes precedence. This situation was encountered in a case study of a green tea industrial process in which the boiler capacity was no more than 1 ton/hour. Operation of the small boiler promoted scaling in its piping system: within a year of operation, a large calcium scale had already attached to the inner surface of the pipe. This scale layer decreased the overall heat transfer coefficient, prolonged the process time, and decreased production. The aim of the current research was to address this problem through a laboratory study of CaSO4 scale inhibition by the addition of trace amounts of zinc ion. The research was conducted with an experimental rig built in-house, which consisted of a dosing pump for controlling the flow rate and a thermocouple to control the temperature. A synthetic solution was prepared with 3,500 ppm concentrations of CaCl2 and Na2SO4, and the concentration of zinc was set at 0.00, 5.00, and 10.00 ppm. The deposits were characterized by scanning electron microscopy (SEM) to analyze how zinc ion addition influenced the crystal polymorph. The induction time was also investigated to analyze nucleation; induction times of 9, 13, and 19 minutes were found for zinc ion additions of 0.00, 5.00, and 10.00 ppm, respectively. After a four-hour run, the scale growth rate was found to be 5.799, 5.501, and 4.950 × 10^-3 g/min for 0.00, 5.00, and 10.00 ppm of zinc addition at 50 °C.

  6. Modifications of the U.S. Geological Survey modular, finite-difference, ground-water flow model to read and write geographic information system files

    USGS Publications Warehouse

    Orzol, Leonard L.; McGrath, Timothy S.

    1992-01-01

    This report documents modifications to the U.S. Geological Survey modular, three-dimensional, finite-difference, ground-water flow model, commonly called MODFLOW, so that it can read and write files used by a geographic information system (GIS). The modified model program is called MODFLOWARC. Simulation programs such as MODFLOW generally require large amounts of input data and produce large amounts of output data. Viewing data graphically, generating head contours, and creating or editing model data arrays such as hydraulic conductivity are examples of tasks that currently are performed either by the use of independent software packages or by tedious manual editing, manipulating, and transferring of data. GIS programs are commonly used to facilitate preparation of the model input data and analyze model output data; however, auxiliary programs are frequently required to translate data between programs. Data translations are required when different programs use different data formats. Thus, the user might use GIS techniques to create model input data, run a translation program to convert input data into a format compatible with the ground-water flow model, run the model, run a translation program to convert the model output into the correct format for GIS, and use GIS to display and analyze this output. MODFLOWARC avoids the two translation steps and transfers data directly to and from the ground-water flow model. This report documents the design and use of MODFLOWARC and includes instructions for data input/output of the Basic, Block-centered flow, River, Recharge, Well, Drain, Evapotranspiration, General-head boundary, and Streamflow-routing packages. The modification to MODFLOW and the Streamflow-Routing package was minimized. Flow charts and computer-program code describe the modifications to the original computer codes for each of these packages. Appendix A contains a discussion on the operation of MODFLOWARC using a sample problem.
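
    As a hedged illustration of the kind of translation step MODFLOWARC was written to eliminate, the sketch below dumps a two-dimensional model array (for example, simulated heads) as an ESRI ASCII grid that a GIS can ingest; the function is hypothetical and is not part of MODFLOW or MODFLOWARC.

        import numpy as np

        def write_ascii_grid(path, array, xll, yll, cellsize, nodata=-9999.0):
            """Write a 2-D model array as an ESRI ASCII grid file."""
            nrows, ncols = array.shape
            header = (f"ncols {ncols}\nnrows {nrows}\n"
                      f"xllcorner {xll}\nyllcorner {yll}\n"
                      f"cellsize {cellsize}\nNODATA_value {nodata}\n")
            with open(path, "w") as f:
                f.write(header)
                np.savetxt(f, np.where(np.isnan(array), nodata, array), fmt="%.4f")

        heads = np.array([[10.2, 10.1], [9.8, np.nan]])    # toy head array
        write_ascii_grid("heads.asc", heads, xll=0.0, yll=0.0, cellsize=100.0)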

  7. Estimating Contrail Climate Effects from Satellite Data

    NASA Technical Reports Server (NTRS)

    Minnis, Patrick; Duda, David P.; Palikonda, Rabindra; Bedka, Sarah T.; Boeke, Robyn; Khlopenkov, Konstantin; Chee, Thad; Bedka, Kristopher T.

    2011-01-01

    An automated contrail detection algorithm (CDA) is developed to exploit six of the infrared channels on the 1-km MODerate-resolution Imaging Spectroradiometer (MODIS) on the Terra and Aqua satellites. The CDA is refined and balanced using visual error analysis. It is applied to MODIS data taken by Terra and Aqua over the United States during 2006 and 2008. The results are consistent with flight track data, but differ markedly from earlier analyses. Contrail coverage is a factor of 4 less than in other retrievals, and the retrieved contrail optical depths and radiative forcing are smaller by approximately 30%. The discrepancies appear to be due to the inability to detect wider, older contrails that comprise a significant amount of the contrail coverage. An example of applying the algorithm to MODIS data over the entire Northern Hemisphere is also presented. Overestimates of contrail coverage are apparent in some tropical regions. Methods for improving the algorithm are discussed and are to be implemented before analyzing large amounts of Northern Hemisphere data. The results should be valuable for guiding and validating climate models seeking to account for aviation effects on climate.

  8. A proposed method to minimize waste from institutional radiation safety surveillance programs through the application of expected value statistics.

    PubMed

    Emery, R J

    1997-03-01

    Institutional radiation safety programs routinely use wipe test sampling and liquid scintillation counting analysis to indicate the presence of removable radioactive contamination. Significant volumes of liquid waste can be generated by such surveillance activities, and the subsequent disposal of these materials can sometimes be difficult and costly. In settings where large numbers of negative results are regularly obtained, the limited grouping of samples for analysis based on expected value statistical techniques is possible. To demonstrate the plausibility of the approach, single wipe samples exposed to varying amounts of contamination were analyzed concurrently with nine non-contaminated samples. Although the sample grouping inevitably leads to increased quenching with liquid scintillation counting systems, the effect did not impact the ability to detect removable contamination in amounts well below recommended action levels. Opportunities to further improve this cost effective semi-quantitative screening procedure are described, including improvements in sample collection procedures, enhancing sample-counting media contact through mixing and extending elution periods, increasing sample counting times, and adjusting institutional action levels.
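
    The abstract does not spell out the statistics, but one standard instance of such expected-value reasoning is Dorfman-style group testing: pool n wipes per counting vial and re-analyze the wipes individually only when the pooled result is positive. The sketch below computes the expected number of analyses per wipe under that assumed scheme; the probability and pool size are illustrative.

        def expected_analyses_per_wipe(p, n):
            """Expected LSC analyses per wipe with pools of n and individual re-tests.

            p : probability that a single wipe is contaminated
            n : number of wipes pooled per vial
            """
            return 1.0 / n + 1.0 - (1.0 - p) ** n

        # With rare contamination (p = 0.01) and pools of 10 wipes, the expected
        # workload drops to about 0.20 analyses per wipe instead of 1.0.
        print(expected_analyses_per_wipe(0.01, 10))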

  9. Measurement and characterization of external oil in the fried waxy maize starch granules using ATR-FTIR and XRD.

    PubMed

    Chen, Long; Tian, Yaoqi; Sun, Binghua; Cai, Canxin; Ma, Rongrong; Jin, Zhengyu

    2018-03-01

    Concerns regarding increased dietary oil uptake have prompted efforts to investigate the oil absorption and distribution in fried starchy foods. In the present study, attenuated total reflection Fourier transform infrared (ATR-FTIR) spectroscopy, together with a chloroform-methanol method, was used to analyze the external and internal oil contents in fried starchy samples. The micromorphology of fried starchy samples was further investigated using scanning electron microscope (SEM), polarized light microscope (PLM) and confocal laser scanning microscopy (CLSM). The results indicated that large amounts of oil were absorbed in or within waxy maize starch, but the majority of oil was located near the surface layer of the starch granules. After defatting, the internal oil was thoroughly removed, while a small amount of external oil remained. As evidenced by the changes of the crystalline characteristics with the help of X-ray diffraction (XRD), the interaction between starch and lipids on the surface was confirmed to form V-type complex compounds during frying at high moisture. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. A system architecture for online data interpretation and reduction in fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Röder, Thorsten; Geisbauer, Matthias; Chen, Yang; Knoll, Alois; Uhl, Rainer

    2010-01-01

    In this paper we present a high-throughput sample screening system that enables real-time data analysis and reduction for live cell analysis using fluorescence microscopy. We propose a novel system architecture capable of analyzing a large number of samples during the experiment and thus greatly minimizing the post-analysis phase that is the common practice today. By utilizing data reduction algorithms, relevant information about the target cells is extracted from the online collected data stream and then used to adjust the experiment parameters in real time, allowing the system to react dynamically to changing sample properties and to control the microscope setup accordingly. The proposed system consists of an integrated DSP-FPGA hybrid solution to ensure the required real-time constraints, to execute efficiently the underlying computer vision algorithms and to close the perception-action loop. We demonstrate our approach by addressing the selective imaging of cells with a particular combination of markers. With this novel closed-loop system the amount of superfluous collected data is minimized, while at the same time the information entropy increases.

  11. Photogrammetric analysis of concrete specimens and structures for condition assessment

    NASA Astrophysics Data System (ADS)

    D'Amico, Nicolas; Yu, Tzuyang

    2016-04-01

    Deterioration of civil infrastructure in America demands routine inspection and maintenance to avoid catastrophic failures. Among many non-destructive evaluation (NDE) techniques, photogrammetry is an accessible and realistic approach for evaluating civil infrastructure systems. The objective of this paper is to explore the capabilities of photogrammetry for locating, sizing, and analyzing the remaining capacity of a specimen or system using point cloud data. Geometric reconstructions, composed from up to 70 photographs, are analyzed as mesh or point cloud models. In this case study, concrete, which exhibits a large amount of surface texture features, was thoroughly examined. The evaluative techniques discussed were applied to concrete cylinder models as well as portions of civil infrastructure including buildings, retaining walls, and bridge abutments. The aim of this paper is to demonstrate the basic analytical functionality of photogrammetry, as well as its applicability to in-situ civil infrastructure systems. In concrete specimens, defect length and location can be evaluated in a fully defined model (one with the maximum number of correctly acquired photographs) with less than 2% error. Error was found to be inversely proportional to the number of acceptable photographs acquired, remaining well under 10% for any model with enough data to render. Furthermore, volumetric stress evaluations were applied using a cross-sectional evaluation technique to locate the critical area and determine the severity of damage. Finally, the findings and the accuracy of the results are discussed.

  12. Novel duplex vapor-electrochemical method for silicon solar cells

    NASA Technical Reports Server (NTRS)

    Kapur, V. K.; Nanis, L.; Sanjurjo, A.

    1977-01-01

    Silicon was produced by alternate pulse feeding of the reactants SiF4 gas and liquid sodium. The average temperature in the reactor could be controlled by regulating the amount of reactant in each pulse. Silicon tetrafluoride gas was analyzed by mass spectrometry to determine the nature and amount of contained volatile impurities, which included silicon oxyfluorides, sulfur oxyfluorides, and sulfur dioxide. Sodium metal was analyzed by emission spectrography, and it was found to contain only calcium and copper as impurities.

  13. A MODIFIED METHOD OF OBTAINING LARGE AMOUNTS OF RICKETTSIA PROWAZEKI BY ROENTGEN IRRADIATION OF RATS

    PubMed Central

    Macchiavello, Atilio; Dresser, Richard

    1935-01-01

    The radiation method described by Zinsser and Castaneda for obtaining large amounts of Rickettsia has been carried out successfully with an ordinary radiographic machine. This allows the extension of the method to those communities which do not possess a high voltage Roentgen therapy unit as originally employed. PMID:19870416

  14. Distributed Processing of Projections of Large Datasets: A Preliminary Study

    USGS Publications Warehouse

    Maddox, Brian G.

    2004-01-01

    Modern information needs have resulted in very large amounts of data being used in geographic information systems. Problems arise, however, when trying to project these data with both reasonable speed and accuracy. Current single-threaded methods can suffer from two problems: fast projection with poor accuracy, or accurate projection with long processing time. A possible solution may be to combine accurate interpolation methods and distributed processing algorithms to quickly and accurately convert digital geospatial data between coordinate systems. Modern technology has made it possible to construct systems, such as Beowulf clusters, at low cost that provide access to supercomputer-class technology. Combining these techniques may result in the ability to use large amounts of geographic data in time-critical situations.
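
    As a minimal sketch of the general idea of splitting a projection job across workers (not the study's actual algorithm), the example below chunks a large point set and reprojects each chunk in a separate process with pyproj; the CRS codes, chunk size, and synthetic points are illustrative assumptions.

    ```python
    from multiprocessing import Pool
    from pyproj import Transformer
    import numpy as np

    SRC_CRS, DST_CRS = "EPSG:4326", "EPSG:5070"  # example: WGS84 -> CONUS Albers

    def reproject_chunk(chunk):
        """Reproject one chunk of (lon, lat) pairs; each worker builds its own Transformer."""
        transformer = Transformer.from_crs(SRC_CRS, DST_CRS, always_xy=True)
        x, y = transformer.transform(chunk[:, 0], chunk[:, 1])
        return np.column_stack([x, y])

    def reproject_distributed(points, n_workers=8, chunk_size=100_000):
        """Split a large point array into chunks and reproject them in parallel."""
        chunks = [points[i:i + chunk_size] for i in range(0, len(points), chunk_size)]
        with Pool(n_workers) as pool:
            results = pool.map(reproject_chunk, chunks)
        return np.vstack(results)

    if __name__ == "__main__":
        # Synthetic lon/lat points over the conterminous United States.
        pts = np.random.uniform([-125, 25], [-66, 49], size=(1_000_000, 2))
        projected = reproject_distributed(pts)
        print(projected.shape)
    ```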

  15. Profiling of lipid and glycogen accumulations under different growth conditions in the sulfothermophilic red alga Galdieria sulphuraria.

    PubMed

    Sakurai, Toshihiro; Aoki, Motohide; Ju, Xiaohui; Ueda, Tatsuya; Nakamura, Yasunori; Fujiwara, Shoko; Umemura, Tomonari; Tsuzuki, Mikio; Minoda, Ayumi

    2016-01-01

    The unicellular red alga Galdieria sulphuraria grows efficiently and produces a large amount of biomass in acidic conditions at high temperatures. It has great potential to produce biofuels and other beneficial compounds without becoming contaminated with other organisms. In G. sulphuraria, biomass measurements and glycogen and lipid analyses demonstrated that the amounts and compositions of glycogen and lipids differed when cells were grown under autotrophic, mixotrophic, and heterotrophic conditions. Maximum biomass production was obtained in the mixotrophic culture. High amounts of glycogen were obtained in the mixotrophic cultures, while the amounts of neutral lipids were similar between mixotrophic and heterotrophic cultures. The amounts of neutral lipids were highest in red algae, including thermophiles. Glycogen structure and fatty acid compositions depended largely on the growth conditions. Copyright © 2015. Published by Elsevier Ltd.

  16. A prospective survey of nutritional support practices in intensive care unit patients: what is prescribed? What is delivered?

    PubMed

    De Jonghe, B; Appere-De-Vechi, C; Fournier, M; Tran, B; Merrer, J; Melchior, J C; Outin, H

    2001-01-01

    To assess the amount of nutrients delivered, prescribed, and required for critically ill patients and to identify the reasons for discrepancies between prescriptions and requirements and between prescriptions and actual delivery of nutrition. Prospective cohort study. Twelve-bed medical intensive care unit in a university-affiliated general hospital. Fifty-one consecutive patients, receiving nutritional support either enterally or intravenously for ≥ 2 days. We followed patients for the first 14 days of nutritional delivery. The amount of calories prescribed and the amount actually delivered were recorded daily and compared with the theoretical energy requirements. A combined regimen of enteral and parenteral nutrition was administered on 58% of the 484 nutrition days analyzed, and 63.5% of total caloric intake was delivered enterally. Seventy-eight percent of the mean caloric amount required was prescribed, and 71% was effectively delivered. The amount of calories actually delivered compared with the amount prescribed was significantly lower in enteral than in parenteral administration (86.8% vs. 112.4%, p < .001). Discrepancies between prescription and delivery of enterally administered nutrients were attributable to interruptions caused by digestive intolerance (27.7%, mean daily wasted volume 641 mL), airway management (30.8%, wasted volume 745 mL), and diagnostic procedures (26.6%, wasted volume 567 mL). Factors significantly associated with a low prescription rate of nutritional support were the administration of vasoactive drugs, central venous catheterization, and the need for extrarenal replacement. An inadequate delivery of enteral nutrition and a low rate of nutrition prescription resulted in low caloric intake in our intensive care unit patients. A large volume of enterally administered nutrients was wasted because of inadequate timing in stopping and restarting enteral feeding. The inverse correlation between the prescription rate of nutrition and the intensity of care required suggests that physicians need to pay more attention to providing appropriate nutritional support for the most severely ill patients.

  17. OpenWebGlobe 2: Visualization of Complex 3D-Geodata in the (Mobile) Webbrowser

    NASA Astrophysics Data System (ADS)

    Christen, M.

    2016-06-01

    Providing worldwide high-resolution data for virtual globes involves compute- and storage-intensive data processing tasks. Furthermore, rendering complex 3D geodata, such as 3D city models with an extremely high polygon count and a vast number of textures, at interactive frame rates is still a very challenging task, especially on mobile devices. This paper presents an approach for processing, caching and serving massive geospatial data in a cloud-based environment for large-scale, out-of-core, highly scalable 3D scene rendering on a web-based virtual globe. Cloud computing is used for processing large amounts of geospatial data and also for providing 2D and 3D map data to a large number of (mobile) web clients. The paper shows the approach for processing, rendering and caching very large datasets in the currently developed virtual globe "OpenWebGlobe 2", which displays 3D geodata on nearly every device.

  18. Mouthwash overdose

    MedlinePlus

    ... are: Chlorhexidine gluconate Ethanol (ethyl alcohol) Hydrogen peroxide Methyl salicylate ... amounts of alcohol (drunkenness). Swallowing large amounts of methyl salicylate and hydrogen peroxide may also cause serious stomach ...

  19. Individual Differences in Temporal Selective Attention as Reflected in Pupil Dilation.

    PubMed

    Willems, Charlotte; Herdzin, Johannes; Martens, Sander

    2015-01-01

    Attention is restricted for the second of two targets when it is presented within 200-500 ms of the first target. This attentional blink (AB) phenomenon allows one to study the dynamics of temporal selective attention by varying the interval between the two targets (T1 and T2). Whereas the AB has long been considered as a robust and universal cognitive limitation, several studies have demonstrated that AB task performance greatly differs between individuals, with some individuals showing no AB whatsoever. Here, we studied these individual differences in AB task performance in relation to differences in attentional timing. Furthermore, we investigated whether AB magnitude is predictive for the amount of attention allocated to T1. For both these purposes pupil dilation was measured, and analyzed with our recently developed deconvolution method. We found that the dynamics of temporal attention in small versus large blinkers differ in a number of ways. Individuals with a relatively small AB magnitude seem better able to preserve temporal order information. In addition, they are quicker to allocate attention to both T1 and T2 than large blinkers. Although a popular explanation of the AB is that it is caused by an unnecessary overinvestment of attention allocated to T1, a more complex picture emerged from our data, suggesting that this may depend on whether one is a small or a large blinker. The use of pupil dilation deconvolution seems to be a powerful approach to study the temporal dynamics of attention, bringing us a step closer to understanding the elusive nature of the AB. We conclude that the timing of attention to targets may be more important than the amount of allocated attention in accounting for individual differences.

  20. Annotating novel genes by integrating synthetic lethals and genomic information

    PubMed Central

    Schöner, Daniel; Kalisch, Markus; Leisner, Christian; Meier, Lukas; Sohrmann, Marc; Faty, Mahamadou; Barral, Yves; Peter, Matthias; Gruissem, Wilhelm; Bühlmann, Peter

    2008-01-01

    Background: Large-scale screening for synthetic lethality serves as a common tool in yeast genetics to systematically search for genes that play a role in specific biological processes. Often the amounts of data resulting from a single large-scale screen far exceed the capacities of experimental characterization of every identified target. Thus, there is a need for computational tools that select promising candidate genes in order to reduce the number of follow-up experiments to a manageable size. Results: We analyze synthetic lethality data for arp1 and jnm1, two spindle migration genes, in order to identify novel members of this process. To this end, we use an unsupervised statistical method that integrates additional information from biological data sources, such as gene expression, phenotypic profiling, RNA degradation and sequence similarity. Unlike existing methods that require large amounts of synthetic lethality data, our method relies merely on synthetic lethality information from two single screens. Using a Multivariate Gaussian Mixture Model, we determine the best subset of features that assigns the target genes to two groups. The approach identifies a small group of genes as candidates involved in spindle migration. Experimental testing confirms the majority of our candidates, and we present she1 (YBL031W) as a novel gene involved in spindle migration. We also applied the statistical methodology to TOR2 signaling as another example. Conclusion: We demonstrate the general use of Multivariate Gaussian Mixture Modeling for selecting candidate genes for experimental characterization from synthetic lethality data sets. For the given example, integration of different data sources contributes to the identification of genetic interaction partners of arp1 and jnm1 that play a role in the same biological process. PMID:18194531
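
    A minimal sketch of the kind of two-component Gaussian mixture grouping described above, using scikit-learn on a toy feature matrix; the features, the toy data, and the rule for picking the candidate component are illustrative assumptions, not the paper's exact procedure.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # Toy feature matrix: rows = target genes, columns = integrated features
    # (e.g. expression correlation, phenotype score, sequence similarity, degradation).
    X = rng.normal(size=(200, 4))

    X_std = StandardScaler().fit_transform(X)
    gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
    labels = gmm.fit_predict(X_std)

    # Here the smaller component is taken as the candidate group (an assumption).
    candidate_component = int(np.argmin(np.bincount(labels)))
    candidates = np.where(labels == candidate_component)[0]
    print(f"{len(candidates)} candidate genes assigned to component {candidate_component}")
    ```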

  1. An Extensible Processing Framework for Eddy-covariance Data

    NASA Astrophysics Data System (ADS)

    Durden, D.; Fox, A. M.; Metzger, S.; Sturtevant, C.; Durden, N. P.; Luo, H.

    2016-12-01

    The evolution of large data-collecting networks has led not only to an increase in available information, but also to greater complexity in analyzing the observations. Timely dissemination of readily usable data products necessitates a streaming processing framework that is both automatable and flexible. Tower networks, such as ICOS, Ameriflux, and NEON, exemplify this issue by requiring large amounts of data to be processed from dispersed measurement sites. Eddy-covariance data from across the NEON network are expected to amount to 100 gigabytes per day. The complexity of the algorithmic processing necessary to produce high-quality data products, together with the continued development of new analysis techniques, led to the development of a modular R package, eddy4R. This allows algorithms provided by NEON and the larger community to be deployed in streaming processing and to be used by community members alike. In order to control the processing environment, provide a proficient parallel processing structure, and ensure dependencies are available during processing, we chose Docker as our "Development and Operations" (DevOps) platform. The Docker framework allows our processing algorithms to be developed, maintained and deployed at scale. Additionally, the eddy4R-Docker framework fosters community use and extensibility via pre-built Docker images and the Github distributed version control system. The capability to process large data sets relies upon efficient input and output of data, data compressibility to reduce compute resource loads, and the ability to easily package metadata. The Hierarchical Data Format (HDF5) is a file format that can meet these needs. A NEON standard HDF5 file structure and metadata attributes allow users to explore large data sets in an intuitive "directory-like" structure that adopts the NEON data product naming conventions.
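
    To illustrate the "directory-like" HDF5 access pattern mentioned above, here is a minimal Python/h5py sketch (not eddy4R itself) that writes a tiny example file, lists its structure, and streams one dataset in chunks; the group and dataset names are hypothetical, not NEON's actual product layout.

    ```python
    import h5py
    import numpy as np

    # Build a tiny example file so the sketch is self-contained.
    with h5py.File("eddy_example.h5", "w") as f:
        f.create_dataset("site01/dp04/fluxCo2",
                         data=np.random.normal(size=100_000),
                         compression="gzip")   # compression reduces storage load

    def walk(name, obj):
        """Print groups and datasets in a directory-like listing."""
        kind = "group" if isinstance(obj, h5py.Group) else f"dataset {obj.shape}"
        print(f"/{name}: {kind}")

    with h5py.File("eddy_example.h5", "r") as f:
        f.visititems(walk)
        dset = f["site01/dp04/fluxCo2"]
        total, n = 0.0, 0
        for start in range(0, dset.shape[0], 10_000):   # chunked streaming read
            block = dset[start:start + 10_000]
            total, n = total + block.sum(), n + block.size
        print("mean value:", total / n)
    ```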

  2. Consuming the daily recommended amounts of dairy products would reduce the prevalence of inadequate micronutrient intakes in the United States: diet modeling study based on NHANES 2007-2010.

    PubMed

    Quann, Erin E; Fulgoni, Victor L; Auestad, Nancy

    2015-09-04

    A large portion of Americans are not meeting the Dietary Reference Intakes (DRI) for several essential vitamins and minerals due to poor dietary choices. Dairy products are a key source of many of the nutrients that are under consumed, but children and adults do not consume the recommended amounts from this food group. This study modeled the impact of meeting daily recommended amounts of dairy products on population-based nutrient intakes. Two-day 24-h dietary recalls collected from participants ≥ 2 years (n = 8944) from the 2007-2010 What We Eat in America, National Health and Nutrition Examination Survey (NHANES) were analyzed. Databases available from the WWEIA/NHANES and the United States Department of Agriculture (USDA) were used to determine nutrient, food group, and dietary supplement intakes. Modeling was performed by adding the necessary number of dairy servings, using the dairy composite designed by USDA, to each participant's diet to meet the dairy recommendations outlined by the 2010 Dietary Guidelines for Americans. All analyses included sample weights to account for the NHANES survey design. The majority of children 4 years and older (67.4-88.8%) and nearly all adults (99.0-99.6%) fall below the recommended 2.5-3 daily servings of dairy products. Increasing dairy consumption to recommended amounts would result in a significant reduction in the percent of adults with calcium, magnesium, and vitamin A intakes below the Estimated Average Requirement (EAR) when considering food intake alone (0-2.0 vs. 9.9-91.1%; 17.3-75.0 vs. 44.7-88.5%; 0.1-15.1 vs. 15.3-48.0%, respectively), as well as food and dietary supplement intake. Minimal, but significant, improvements were observed for the percent of people below the EAR for vitamin D (91.7-99.9 vs. 91.8-99.9%), and little change was achieved for the large percentage of people below the Adequate Intake for potassium. Increasing dairy food consumption to recommended amounts is one practical dietary change that could significantly improve the population's adequacy for certain vitamins and minerals that are currently under-consumed, as well as have a positive impact on health.

  3. Influence of calcium carbonate and charcoal application on aggregation processes and organic matter retention at the silt-size scale

    NASA Astrophysics Data System (ADS)

    Asefaw Berhe, Asmeret; Kaiser, Michael; Ghezzehei, Teamrat; Myrold, David; Kleber, Markus

    2013-04-01

    The effectiveness of charcoal and calcium carbonate applications to improve soil conditions has been well documented. However, their influence on the formation of silt-sized aggregates and on the amount and protection of associated organic matter (OM) against microbial decomposition is still largely unknown. For sustainable management of agricultural soils, silt-sized aggregates (2-53 µm) are of particular importance because they store up to 60% of soil organic carbon with mean residence times between 70 and 400 years. The objectives are i) to analyze the ability of CaCO3 and/or charcoal application to increase the amount of silt-sized aggregates and associated OM, ii) to vary soil mineral conditions to establish relevant boundary conditions for amendment-induced aggregation processes, and iii) to determine how amendment-induced changes in the formation of silt-sized aggregates relate to microbial decomposition of OM. We set up artificial high-reactivity (HR, clay: 40%, sand: 57%, OM: 3%) and low-reactivity soils (LR, clay: 10%, sand: 89%, OM: 1%) and mixed them with charcoal (CC, 1%) and/or calcium carbonate (Ca, 0.2%). The samples were adjusted to a water potential of 0.3 bar and subsamples were incubated with microbial inoculum (MO). After a 16-week aggregation experiment, size fractions were separated by wet sieving and sedimentation. Since we did not use mineral compounds in the artificial mixtures within the size range of 2 to 53 µm, we consider material recovered in this fraction to be silt-sized aggregates, which was confirmed by SEM analyses. For the LR mixtures, we detected increasing N concentrations within the 2-53 µm fractions of the charcoal-amended samples (CC, CC+Ca, and CC+Ca+MO) as compared to the Control sample, with the strongest effect for the CC+Ca+MO sample. This indicates an association of N-containing, microbially derived OM with silt-sized aggregates. For the charcoal-amended LR and HR mixtures, the C concentrations of the 2-53 µm fractions are larger than those of the respective fractions of the Control samples, but the effect is several times stronger for the LR mixtures. The C concentrations of the 2-53 µm fractions relative to the total C amount of the LR and HR mixtures are between 30 and 50%. The charcoal-amended samples generally show larger relative C amounts associated with the 2-53 µm fractions than the Control samples. Benefits for aggregate formation and OM storage were larger for the sandy soil (LR) than for the clayey soil (HR). The data obtained are similar to respective data for natural soils. Consequently, the suggested microcosm experiments are suitable for analyzing mechanisms of soil aggregation processes.

  4. Long term effects of fire on carbon and nitrogen pools and fluxes in the arctic permafrost and subarctic forests (ARCTICFIRE)

    NASA Astrophysics Data System (ADS)

    Pumpanen, Jukka; Köster, Kajar; Aaltonen, Heidi; Köster, Egle; Zhou, Xuan; Zhang-Turpeinen, Huizhong; Heinonsalo, Jussi; Palviainen, Marjo; Sun, Hui; Biasi, Christina; Bruckman, Viktor; Prokushkin, Anatoly; Berninger, Frank

    2017-04-01

    Boreal forests, which are to a large extent located on permafrost soils, are a crucial part of the climate system because of their large soil carbon (C) pool. Even a small change in this pool may turn the terrestrial C sink in the arctic into a source, with a consequent increase in CO2 concentrations. About 1% of boreal forests are exposed to fire annually, which affects the soil and permafrost under them. Thawing of permafrost increases the depth of the active layer containing large C and N stocks. In addition to temperature, the decomposition of soil organic matter depends on its chemical composition, which may also be affected by fires. Part of the soil organic matter is turned into pyrogenic C and N resistant to decomposition. We studied the effect of forest fires on soil greenhouse gas fluxes (CO2, CH4 and N2O) and biogenic volatile organic compound fluxes using portable chambers. The amounts of easily decomposable and recalcitrant fractions in soil organic matter were determined with water, ethanol and acid extraction, and the natural 13C and 15N abundances as well as the chemical quality (by Fourier Transform Infrared Spectroscopy, FTIR) were studied. Also, changes in microbial community structure and composition were analyzed with next-generation pyrosequencing. Our preliminary results indicate that soil CO2 effluxes decreased significantly immediately after the fire, and recovery to the pre-fire level took several decades. Soils were a small sink of CH4 and a source of N2O in all age classes, and CH4 uptake was still increased and N2O fluxes still decreased 20 years after the fire. A clear vertical distribution was observed in the amount of extractable soil organic matter, which was highest in the surface layers and decreased with depth. The natural 13C and 15N abundances, the FTIR spectra and the changes in microbial community composition are still under analysis.

  5. Abundance gradients in cooling flow clusters: Ginga Large Area Counters and Einstein Solid State Spectrometer spectra of A496, A1795, A2142, and A2199

    NASA Technical Reports Server (NTRS)

    White, Raymond E., III; Day, C. S. R.; Hatsukade, Isamu; Hughes, John P.

    1994-01-01

    We analyze the Ginga Large Area Counters (LAC) and Einstein Solid State Spectrometer (SSS) spectra of four cooling flow clusters, A496, A1795, A2142, and A2199, each of which shows firm evidence of a relatively cool component. The inclusion of such cool spectral components in joint fits of SSS and LAC data leads to somewhat higher global temperatures than are derived from the high-energy LAC data alone. We find little evidence of cool emission outside the SSS field of view. Metal abundances appear to be centrally enhanced in all four clusters, with varying degrees of model dependence and statistical significance: the evidence is statistically strongest for A496 and A2142, somewhat weaker for A2199 and weakest for A1795. We also explore the model dependence in the amount of cold, X-ray-absorbing matter discovered in these clusters by White et al.

  6. Deciding to Decide: How Decisions Are Made and How Some Forces Affect the Process.

    PubMed

    McConnell, Charles R

    There is a decision-making pattern that applies in all situations, large or small, although in small decisions, the steps are not especially evident. The steps are gathering information, analyzing information and creating alternatives, selecting and implementing an alternative, and following up on implementation. The amount of effort applied in any decision situation should be consistent with the potential consequences of the decision. Essentially, all decisions are subject to certain limitations or constraints, forces, or circumstances that limit one's range of choices. Follow-up on implementation is the phase of decision making most often neglected, yet it is frequently the phase that determines success or failure. Risk and uncertainty are always present in a decision situation, and the application of human judgment is always necessary. In addition, there are often emotional forces at work that can at times unwittingly steer one away from that which is best or most workable under the circumstances and toward a suboptimal result based largely on the desires of the decision maker.

  7. DDS-Suite - A Dynamic Data Acquisition, Processing, and Analysis System for Wind Tunnel Testing

    NASA Technical Reports Server (NTRS)

    Burnside, Jathan J.

    2012-01-01

    Wind tunnels have optimized their steady-state data systems for acquisition and analysis and have even implemented large dynamic-data acquisition systems; however, development of near real-time processing and analysis tools for dynamic data has lagged. DDS-Suite is a set of tools used to acquire, process, and analyze large amounts of dynamic data. Each phase of the testing process (acquisition, processing, and analysis) is handled by a separate component so that bottlenecks in one phase do not affect the others, leading to a robust system. DDS-Suite is capable of acquiring 672 channels of dynamic data at a rate of 275 MB/s. More than 300 channels of the system use 24-bit analog-to-digital cards and are capable of producing data with less than 0.01 of phase difference at 1 kHz. System architecture, design philosophy, and examples of use during NASA Constellation and Fundamental Aerodynamic tests are discussed.

  8. Profiling Oman education data using data visualization technique

    NASA Astrophysics Data System (ADS)

    Alalawi, Sultan Juma Sultan; Shaharanee, Izwan Nizal Mohd; Jamil, Jastini Mohd

    2016-10-01

    This paper presents an innovative data visualization technique to understand and visualize the information in Oman's education data generated from the Ministry of Education Oman "Educational Portal". The Ministry of Education in the Sultanate of Oman maintains huge databases containing massive amounts of information. The volume of data in these databases increases yearly as many students, teachers and employees are entered into them. The task of discovering and analyzing these vast volumes of data becomes increasingly difficult. Information visualization and data mining offer better ways of dealing with large volumes of information. In this paper, an innovative information visualization technique is developed to visualize the complex multidimensional educational data. Microsoft Excel dashboards, Visual Basic for Applications (VBA) and pivot tables are utilized to visualize the data. Findings from the summarization of the data are presented, and it is argued that information visualization can help the related stakeholders become aware of hidden and interesting information in the large amounts of data residing in their educational portal.
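
    A small sketch of the pivot-table style summarization described above, done in pandas rather than Excel/VBA; the column names and synthetic records are assumptions for illustration only, not the ministry's actual schema.

    ```python
    import pandas as pd

    # Toy education records; column names are illustrative assumptions.
    records = pd.DataFrame({
        "governorate": ["Muscat", "Muscat", "Dhofar", "Dhofar", "Al Batinah"],
        "year":        [2014, 2015, 2014, 2015, 2015],
        "role":        ["student", "teacher", "student", "student", "teacher"],
        "count":       [52000, 3100, 21000, 22500, 1800],
    })

    # Pivot: rows = governorate, columns = year, values = total headcount.
    summary = records.pivot_table(index="governorate", columns="year",
                                  values="count", aggfunc="sum", fill_value=0)
    print(summary)
    ```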

  9. Application of multivariate statistical techniques in microbial ecology

    PubMed Central

    Paliy, O.; Shankar, V.

    2016-01-01

    Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large scale ecological datasets. Especially noticeable effect has been attained in the field of microbial ecology, where new experimental approaches provided in-depth assessments of the composition, functions, and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces large amounts of data, powerful statistical techniques of multivariate analysis are well suited to analyze and interpret these datasets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular dataset. In this review we describe and compare the most widely used multivariate statistical techniques including exploratory, interpretive, and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and dataset structure. PMID:26786791

  10. Statistical and clustering analysis for disturbances: A case study of voltage dips in wind farms

    DOE PAGES

    Garcia-Sanchez, Tania; Gomez-Lazaro, Emilio; Muljadi, Eduard; ...

    2016-01-28

    This study proposes and evaluates an alternative statistical methodology to analyze a large number of voltage dips. For a given voltage dip, a set of lengths is first identified to characterize the root mean square (rms) voltage evolution along the disturbance, deduced from partial linearized time intervals and trajectories. Principal component analysis and K-means clustering are then applied to identify rms-voltage patterns and propose a reduced number of representative rms-voltage profiles from the linearized trajectories. This reduced group of averaged rms-voltage profiles enables the representation of a large number of disturbances and offers a visual and graphical representation of their evolution along the events, aspects that were not previously considered in other contributions. The complete process is evaluated on real voltage dips collected in intense field-measurement campaigns carried out at a wind farm in Spain over several years. The results are included in this paper.
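
    The PCA-plus-K-means step described above can be sketched roughly as follows with scikit-learn; the synthetic rms-voltage profiles, the number of principal components, and the number of clusters are assumptions for illustration, not the study's actual settings.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)
    # Toy data: 500 linearized rms-voltage dip profiles sampled at 40 points each.
    depth = np.abs(rng.normal(0.2, 0.1, size=(500, 1)))
    center = rng.integers(5, 35, size=(500, 1))
    profiles = 1.0 - depth * np.exp(-((np.arange(40) - center) ** 2) / 30.0)

    scores = PCA(n_components=5).fit_transform(profiles)            # reduce dimensionality
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)

    # Representative profile per cluster = average of its members.
    representatives = np.vstack([profiles[labels == k].mean(axis=0) for k in range(4)])
    print(representatives.shape)   # (4, 40)
    ```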

  11. Timing Results Using an FPGA-Based TDC with Large Arrays of 144 SiPMs

    NASA Astrophysics Data System (ADS)

    Aguilar, A.; González, A. J.; Torres, J.; García-Olcina, R.; Martos, J.; Soret, J.; Conde, P.; Hernández, L.; Sánchez, F.; Benlloch, J. M.

    2015-02-01

    Silicon photomultipliers (SiPMs) have become an alternative to traditional tubes due to several features. However, their implementation to form large arrays is still a challenge especially due to their relatively high intrinsic noise, depending on the chosen readout. In this contribution, two modules composed of 12 ×12 SiPMs with an area of roughly 50 mm×50 mm are used in coincidence. Coincidence resolving time (CRT) results with a field-programmable gate array, in combination with a time to digital converter, are shown as a function of both the sensor bias voltage and the digitizer threshold. The dependence of the CRT on the sensor matrix temperature, the amount of SiPM active area and the crystal type is also analyzed. Measurements carried out with a crystal array of 2 mm pixel size and 10 mm height have shown time resolutions for the entire 288 SiPM two-detector set-up as good as 800 ps full width at half maximum (FWHM).

  12. Evaluating a variety of text-mined features for automatic protein function prediction with GOstruct.

    PubMed

    Funk, Christopher S; Kahanda, Indika; Ben-Hur, Asa; Verspoor, Karin M

    2015-01-01

    Most computational methods that predict protein function do not take advantage of the large amount of information contained in the biomedical literature. In this work we evaluate both ontology term co-mention and bag-of-words features mined from the biomedical literature and analyze their impact in the context of a structured output support vector machine model, GOstruct. We find that even simple literature based features are useful for predicting human protein function (F-max: Molecular Function =0.408, Biological Process =0.461, Cellular Component =0.608). One advantage of using literature features is their ability to offer easy verification of automated predictions. We find through manual inspection of misclassifications that some false positive predictions could be biologically valid predictions based upon support extracted from the literature. Additionally, we present a "medium-throughput" pipeline that was used to annotate a large subset of co-mentions; we suggest that this strategy could help to speed up the rate at which proteins are curated.
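
    As a much-simplified, hedged stand-in for the literature-feature idea (not the GOstruct structured-output model itself), the sketch below builds bag-of-words features from toy text snippets and trains one linear classifier for a single GO term; the sentences and labels are invented for illustration.

    ```python
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.svm import LinearSVC

    # Toy "literature" snippets mentioning proteins; labels = has the GO term or not.
    docs = [
        "kinase phosphorylates substrate in the nucleus",
        "membrane transporter moves ions across the bilayer",
        "kinase activity regulates the cell cycle",
        "structural protein of the extracellular matrix",
    ]
    has_kinase_activity = [1, 0, 1, 0]

    vec = CountVectorizer()
    X = vec.fit_transform(docs)                      # bag-of-words literature features
    clf = LinearSVC().fit(X, has_kinase_activity)    # one binary model per GO term

    new_text = ["this kinase phosphorylates a nuclear substrate"]
    print(clf.predict(vec.transform(new_text)))      # -> [1]
    ```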

  13. Numerical modeling of cracking pattern's influence on the dynamic response of thickened tailings disposals: a periodic approach

    NASA Astrophysics Data System (ADS)

    Ferrer, Gabriel; Sáez, Esteban; Ledezma, Christian

    2018-01-01

    Copper production is an essential component of the Chilean economy. During the extraction process of copper, large quantities of waste materials (tailings) are produced, which are typically stored in large tailing ponds. Thickened Tailings Disposal (TTD) is an alternative to conventional tailings ponds. In TTD, a considerable amount of water is extracted from the tailings before their deposition. Once a thickened tailings layer is deposited, it loses water and it shrinks, forming a relatively regular structure of tailings blocks with vertical cracks in between, which are then filled up with "fresh" tailings once the new upper layer is deposited. The dynamic response of a representative column of this complex structure made out of tailings blocks with softer material in between was analyzed using a periodic half-space finite element model. The tailings' behavior was modeled using an elasto-plastic multi-yielding constitutive model, and Chilean earthquake records were used for the seismic analyses. Special attention was given to the liquefaction potential evaluation of TTD.

  14. Social Media Visual Analytics for Events

    NASA Astrophysics Data System (ADS)

    Diakopoulos, Nicholas; Naaman, Mor; Yazdani, Tayebeh; Kivran-Swaine, Funda

    For large-scale multimedia events such as televised debates and speeches, the amount of content on social media channels such as Facebook or Twitter can easily become overwhelming, yet it still contains information that may aid and augment understanding of the multimedia content, via individual social media items or aggregate information from the crowd's response. In this work we discuss this opportunity in the context of a social media visual analytic tool, Vox Civitas, designed to help journalists, media professionals, or other researchers make sense of large-scale aggregations of social media content around multimedia broadcast events. We discuss the design of the tool, present and evaluate the text analysis techniques used to enable the presentation, and detail the visual and interaction design. We provide an exploratory evaluation based on a user study in which journalists interacted with the system to analyze and report on a dataset of over 100,000 Twitter messages collected during the broadcast of the U.S. State of the Union presidential address in 2010.

  15. The development of intelligent healthcare in China.

    PubMed

    Zheng, Xiaochen; Rodríguez-Monroy, Carlos

    2015-05-01

    Intelligent healthcare (IH) has been proposed in recent years with the rapid application of Internet of Things technology in the healthcare area. It is considered an expansion of e-health and telemedicine. As the largest developing country, China is investing large amounts of resources to push forward the development of IH. It is one of the centerpieces of the country's New Healthcare Reform, and great expectation is placed on it to help solve the conflict between limited healthcare resources and the large patient population. Essential policies, milestones, standards, and specifications from the Chinese government since the 1990s were reviewed to show the brief development history of IH in China. Some typical cases and products have been analyzed to present the current situation. The main problems and future development directions have been summarized. The IH industry in China has great potential and is growing very fast, but a lot of challenges also exist. In the future, both government support and the active participation of nongovernmental capital are needed to push forward the whole industry.

  16. Flood triggering in Switzerland: the role of daily to monthly preceding precipitation

    NASA Astrophysics Data System (ADS)

    Froidevaux, P.; Schwanbeck, J.; Weingartner, R.; Chevalier, C.; Martius, O.

    2015-03-01

    Determining the role of different precipitation periods for peak discharge generation is crucial both for projecting future changes in flood probability and for short- and medium-range flood forecasting. We analyze catchment-averaged daily precipitation time series prior to annual peak discharge events (floods) in Switzerland. The large number of floods considered (more than 4000 events from 101 catchments have been analyzed) makes it possible to derive significant information about the role of antecedent precipitation in peak discharge generation. Based on the analysis of precipitation time series, we propose a new separation of flood-related precipitation periods: (i) the period 0 to 1 day before flood days, when the maximum flood-triggering precipitation rates are generally observed, (ii) the period 2 to 3 days before flood days, when longer-lasting synoptic situations generate "significantly higher than normal" precipitation amounts, and (iii) the period from 4 days to one month before flood days, when previous wet episodes may have already preconditioned the catchment. The novelty of this study lies in the separation of antecedent precipitation into the precursor antecedent precipitation (4 days before floods or earlier, called PRE-AP) and the short-range precipitation (0 to 3 days before floods, a period when precipitation is often driven by one persistent weather situation such as a stationary low-pressure system). Because we consider a large number of events and because we work with daily precipitation values, we do not separate the "antecedent" and "peak-triggering" precipitation. The whole precipitation recorded during the flood day is included in the short-range antecedent precipitation. The precipitation accumulating 0 to 3 days before an event is the most relevant for floods in Switzerland. PRE-AP precipitation has only a weak and region-specific influence on flood probability. Floods were significantly more frequent after wet PRE-AP periods only in the Jura Mountains, in the western and eastern Swiss plateau, and at the exit of large lakes. As a general rule, wet PRE-AP periods enhance the flood probability in catchments with gentle topography, high infiltration rates, and large storage capacity (karstic cavities, deep soils, large reservoirs). In contrast, floods were significantly less frequent after wet PRE-AP periods in glacial catchments because of reduced melt. For the majority of catchments, however, no significant correlation between precipitation amounts and flood occurrences is found when the last three days before floods are omitted from the precipitation amounts. Moreover, the PRE-AP was not higher for extreme floods than for annual floods with a high frequency and was very close to climatology for all floods. The weak influence of PRE-AP is a clear indicator of a short discharge memory of Prealpine, Alpine and Southalpine Swiss catchments. Our study nevertheless poses the question of whether the impact of long-term precursory precipitation on floods in such catchments is overestimated in the general perception. We conclude that the consideration of a 3-4 day precipitation period should be sufficient to represent (understand, reconstruct, model, project) Swiss Alpine floods.
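
    The separation into short-range (0-3 days) and precursor (4 days to one month) precipitation described above can be expressed compactly; the sketch below sums catchment-averaged daily precipitation in the two windows before each flood date, using pandas with invented data (the dates and amounts are not from the study).

    ```python
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    days = pd.date_range("2000-01-01", "2001-12-31", freq="D")
    precip = pd.Series(rng.gamma(0.6, 4.0, len(days)), index=days)   # toy daily precip (mm)
    flood_days = pd.to_datetime(["2000-07-14", "2001-03-02"])        # toy flood dates

    def window_sum(series, day, first, last):
        """Sum precipitation from `last` to `first` days before (and including) `day`."""
        return series.loc[day - pd.Timedelta(days=last): day - pd.Timedelta(days=first)].sum()

    for d in flood_days:
        short_range = window_sum(precip, d, 0, 3)    # 0-3 days before, incl. the flood day
        pre_ap      = window_sum(precip, d, 4, 30)   # precursor antecedent precipitation
        print(d.date(), round(short_range, 1), round(pre_ap, 1))
    ```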

  17. Results of initial analyses of the salt (macro) batch 9 tank 21H qualification samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peters, T. B.

    2015-10-01

    Savannah River National Laboratory (SRNL) analyzed samples from Tank 21H in support of qualification of Interim Salt Disposition Project (ISDP) Salt (Macro) Batch 9 for processing through the Actinide Removal Process (ARP) and the Modular Caustic-Side Solvent Extraction Unit (MCU). This document reports the initial results of the analyses of samples of Tank 21H. Analysis of the Tank 21H Salt (Macro) Batch 9 composite sample indicates that the material does not display any unusual characteristics or observations, such as floating solids, the presence of a large amount of solids, or unusual colors. Further results on the chemistry and other tests will be issued in the future.

  18. Constructing DNA Barcode Sets Based on Particle Swarm Optimization.

    PubMed

    Wang, Bin; Zheng, Xuedong; Zhou, Shihua; Zhou, Changjun; Wei, Xiaopeng; Zhang, Qiang; Wei, Ziqi

    2018-01-01

    Following the completion of the human genome project, a large amount of high-throughput bio-data was generated. To analyze these data, massively parallel sequencing, namely next-generation sequencing, was rapidly developed. DNA barcodes are used to identify the ownership between sequences and samples when they are attached at the beginning or end of sequencing reads. Constructing DNA barcode sets provides the candidate DNA barcodes for this application. To increase the accuracy of DNA barcode sets, a particle swarm optimization (PSO) algorithm has been modified and used to construct the DNA barcode sets in this paper. Compared with the extant results, some lower bounds of DNA barcode sets are improved. The results show that the proposed algorithm is effective in constructing DNA barcode sets.
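
    The core constraint in barcode set construction is a minimum pairwise distance between codewords. The sketch below uses a simple greedy random filter instead of the paper's modified PSO, purely to illustrate the Hamming-distance constraint; the barcode length, distance bound, and number of trials are arbitrary assumptions.

    ```python
    import random

    BASES = "ACGT"

    def hamming(a: str, b: str) -> int:
        """Number of positions at which two equal-length barcodes differ."""
        return sum(x != y for x, y in zip(a, b))

    def greedy_barcode_set(length=8, min_dist=3, trials=20_000, seed=0):
        """Greedily keep random barcodes that stay at distance >= min_dist from all kept ones."""
        random.seed(seed)
        kept = []
        for _ in range(trials):
            candidate = "".join(random.choice(BASES) for _ in range(length))
            if all(hamming(candidate, b) >= min_dist for b in kept):
                kept.append(candidate)
        return kept

    codes = greedy_barcode_set()
    print(len(codes), codes[:5])   # size of the set and a few example barcodes
    ```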

  19. In situ high temperature MAS NMR study of the mechanisms of catalysis. Ethane aromatization on Zn-modified zeolite BEA.

    PubMed

    Arzumanov, Sergei S; Gabrienko, Anton A; Freude, Dieter; Stepanov, Alexander G

    2009-04-01

    Ethane conversion into aromatic hydrocarbons over Zn-modified zeolite BEA has been analyzed by high-temperature MAS NMR spectroscopy. Information about intermediates (Zn-ethyl species) and reaction products (mainly toluene and methane), which were formed under the conditions of a batch reactor, was obtained by (13)C MAS NMR. The kinetics of the reaction, monitored in situ by (1)H MAS NMR at a temperature of 573 K, provided information about the reaction mechanism. Simulation of the experimental kinetics within the framework of the possible kinetic schemes of the reaction demonstrates that a large amount of the methane evolved during ethane aromatization arises from the stage of direct ethane hydrogenolysis.

  20. The Abnormal vs. Normal ECG Classification Based on Key Features and Statistical Learning

    NASA Astrophysics Data System (ADS)

    Dong, Jun; Tong, Jia-Fei; Liu, Xia

    As cardiovascular diseases are increasingly frequent in modern society, medical and health systems must be adjusted to meet the new requirements. The Chinese government has planned to establish a basic community medical insurance system (BCMIS) before 2020, in which remote medical service is one of the core issues. Therefore, we have developed a "remote network hospital system" that includes a data server and diagnosis terminals, with a wireless detector used to sample the ECG. To improve the efficiency of ECG processing, in this paper an abnormal vs. normal ECG classification approach based on key features and statistical learning is presented, and the results are analyzed. Large amounts of normal ECGs can be filtered out automatically by computer, while abnormal ECGs are left to be diagnosed by physicians.
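
    A hedged, minimal sketch of the "key features plus statistical learning" idea: a handful of beat-level features feed a simple classifier that filters obviously normal recordings and leaves the rest for physicians. The features, synthetic data, and classifier choice are illustrative assumptions, not the paper's actual pipeline.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)
    n = 1000
    # Toy key features per recording: mean RR interval (s), RR std (s), QRS width (ms).
    normal   = np.column_stack([rng.normal(0.85, 0.05, n), rng.normal(0.04, 0.01, n), rng.normal(95, 8, n)])
    abnormal = np.column_stack([rng.normal(0.70, 0.15, n), rng.normal(0.15, 0.05, n), rng.normal(120, 20, n)])
    X = np.vstack([normal, abnormal])
    y = np.array([0] * n + [1] * n)   # 0 = normal, 1 = abnormal (needs physician review)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))
    ```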

  1. Demand Response and Energy Storage Integration Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Ookie; Cheung, Kerry; Olsen, Daniel J.

    2016-03-01

    Demand response and energy storage resources present potentially important sources of bulk power system services that can aid in integrating variable renewable generation. While renewable integration studies have evaluated many of the challenges associated with deploying large amounts of variable wind and solar generation technologies, integration analyses have not yet fully incorporated demand response and energy storage resources. This report represents an initial effort in analyzing the potential integration value of demand response and energy storage, focusing on the western United States. It evaluates two major aspects of increased deployment of demand response and energy storage: (1) their operational value in providing bulk power system services and (2) market and regulatory issues, including potential barriers to deployment.

  2. Storm Induced Injection of the Mississippi River Plume Into the Open Gulf of Mexico

    NASA Technical Reports Server (NTRS)

    Yuan, Jinchun; Miller, Richard L.; Powell, Rodney T.; Dagg, Michael J.

    2004-01-01

    The direct impact of the Mississippi River on the open Gulf of Mexico is typically considered to be limited due to the predominantly along-shore current pattern. Using satellite imagery, we analyzed chl a distributions in the northern Gulf of Mexico before and after the passage of two storms: Hurricane Lili and Tropical Storm Barry. Our analyses indicate that storm-induced eddies can rapidly inject large volumes of nutrient-rich Mississippi River water to the open gulf, and lead to phytoplankton blooms. Although these events last only a few weeks, they transport significant amounts of fluvial substances to the ocean. These river-ocean interactions are especially significant in tropical and subtropical regions because receiving waters are typically permanently stratified and oligotrophic.

  3. Variation of the channel temperature in the transmission of lightning leader

    NASA Astrophysics Data System (ADS)

    Chang, Xuan; Yuan, Ping; Cen, Jianyong; Wang, Xuejuan

    2017-06-01

    Based on the time-resolved spectra of the lightning stepped leader and dart leader processes, the channel temperature, its evolution with time, and its variation along the channel height during the transmission process were analyzed. The results show that the stepped leader tip has a slightly higher temperature than the trailing end, which is probably caused by the large amount of electric charge on the leader tip. In addition, both temperature and brightness are enhanced at the position of the channel node. The dart leader has a higher channel temperature than the stepped leader but a lower temperature than the return stroke. Meanwhile, the channel temperature of the dart leader increases markedly as the dart leader propagates to the ground.

  4. Visual mining geo-related data using pixel bar charts

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Keim, Daniel A.; Dayal, Umeshwar; Wright, Peter; Schneidewind, Joern

    2005-03-01

    A common approach to analyzing geo-related data is to use bar charts or x-y plots. They are intuitive and easy to use, but important information often gets lost. In this paper, we introduce a new interactive visualization technique called Geo Pixel Bar Charts, which combines the advantages of Pixel Bar Charts and interactive maps. This technique allows analysts to visualize large amounts of spatial data without aggregation and, at the same time, shows the geographical regions corresponding to the spatial data attribute. In this paper, we apply Geo Pixel Bar Charts to the visual mining of sales transactions and Internet usage from different locations. Our experimental results show the effectiveness of this technique for revealing data distributions and exceptions on the map.

  5. In silico gene expression profiling in Cannabis sativa.

    PubMed

    Massimino, Luca

    2017-01-01

    The cannabis plant and its active ingredients (i.e., cannabinoids and terpenoids) have been socially stigmatized for half a century. Luckily, with more than 430,000 published scientific papers and about 600 ongoing and completed clinical trials, nowadays cannabis is employed for the treatment of many different medical conditions. Nevertheless, even if a large amount of high-throughput functional genomic data exists, most researchers feature a strong background in molecular biology but lack advanced bioinformatics skills. In this work, publicly available gene expression datasets have been analyzed giving rise to a total of 40,224 gene expression profiles taken from cannabis plant tissue at different developmental stages. The resource presented here will provide researchers with a starting point for future investigations with Cannabis sativa .

  6. Use of parallel computing for analyzing big data in EEG studies of ambiguous perception

    NASA Astrophysics Data System (ADS)

    Maksimenko, Vladimir A.; Grubov, Vadim V.; Kirsanov, Daniil V.

    2018-02-01

    The problem of interaction between humans and machine systems through neuro-interfaces (or brain-computer interfaces) is an urgent task that requires the analysis of large amounts of neurophysiological EEG data. In the present paper we consider methods of parallel computing as one of the most powerful tools for processing experimental data in real time with respect to the multichannel structure of EEG. In this context we demonstrate the application of parallel computing for the estimation of the spectral properties of multichannel EEG signals associated with visual perception. Using the CUDA C library, we run a wavelet-based algorithm on GPUs and show the possibility of detecting specific patterns in a multichannel set of EEG data in real time.
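
    The paper runs wavelet estimates on GPUs with CUDA C; as a hedged CPU analogue of the same per-channel parallelism, the sketch below distributes Welch spectral estimates of a multichannel EEG-like array over worker processes. The channel count, sampling rate, and frequency band are assumptions, not the study's settings.

    ```python
    from multiprocessing import Pool
    import numpy as np
    from scipy.signal import welch

    FS = 250  # assumed sampling rate, Hz

    def channel_psd(signal):
        """Spectral estimate for one EEG channel."""
        freqs, psd = welch(signal, fs=FS, nperseg=512)
        return freqs, psd

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        eeg = rng.normal(size=(32, 60 * FS))            # 32 channels, 60 s of toy data
        with Pool() as pool:
            results = pool.map(channel_psd, list(eeg))  # one worker task per channel

        freqs, _ = results[0]
        alpha = (freqs >= 8) & (freqs <= 12)            # example band of interest
        alpha_power = [psd[alpha].sum() for _, psd in results]
        print(np.round(alpha_power, 3))
    ```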

  7. Analysis of Land Subsidence Monitoring in a Mining Area with Time-Series InSAR Technology

    NASA Astrophysics Data System (ADS)

    Sun, N.; Wang, Y. J.

    2018-04-01

    Time-series InSAR technology has become a popular land subsidence monitoring method in recent years because of its advantages, such as high accuracy, wide coverage, low cost, dense monitoring points and freedom from accessibility restrictions. In this paper, we applied two kinds of satellite data, ALOS PALSAR and RADARSAT-2, to obtain subsidence monitoring results for the study area in two time periods using time-series InSAR technology. By analyzing the deformation range, rate and amount, a time-series analysis of land subsidence in the mining area was realized. The results show that InSAR technology can be used to monitor land subsidence over large areas and meets the demands of subsidence monitoring in mining areas.

  8. Experiences with Deriva: An Asset Management Platform for Accelerating eScience.

    PubMed

    Bugacov, Alejandro; Czajkowski, Karl; Kesselman, Carl; Kumar, Anoop; Schuler, Robert E; Tangmunarunkit, Hongsuda

    2017-10-01

    The pace of discovery in eScience is increasingly dependent on a scientist's ability to acquire, curate, integrate, analyze, and share large and diverse collections of data. It is all too common for investigators to spend inordinate amounts of time developing ad hoc procedures to manage their data. In previous work, we presented Deriva, a Scientific Asset Management System, designed to accelerate data driven discovery. In this paper, we report on the use of Deriva in a number of substantial and diverse eScience applications. We describe the lessons we have learned, both from the perspective of the Deriva technology, as well as the ability and willingness of scientists to incorporate Scientific Asset Management into their daily workflows.

  9. Latest results on νμ → ντ oscillations from the OPERA experiment

    NASA Astrophysics Data System (ADS)

    Komatsu, Masahiro; OPERA Collaboration

    2016-04-01

    The OPERA experiment is designed to prove neutrino oscillations in the νμ to ντ channel through the direct observation of the tau lepton in tau neutrino charged current interactions. The experiment has accumulated data for five years, from 2008 to 2012, with the CERN Neutrinos to Gran Sasso (CNGS), an almost pure νμ beam. In the last two years, a very large amount of the data accumulated in the nuclear emulsions has been analyzed. The latest results on oscillations with the increased statistics, which include a fourth tau neutrino candidate event, will be presented. Given the extremely low expected background, this result corresponds to the observation of the oscillation process with a four sigma level significance.

  10. Piezoelectric and dielectric properties of nanoporous polyvinylidene fluoride (PVDF) films

    NASA Astrophysics Data System (ADS)

    Zhao, Ping; Wang, Shifa; Kadlec, Alec

    2016-04-01

    A nanoporous polyvinylidene fluoride (PVDF) thin film was developed for applications in energy harvesting, medical surgery, and industrial robotics. This sponge-like nanoporous PVDF structure dramatically enhances the piezoelectric effect because it yields considerable deformation under a small force. A casting-etching method was adopted to make the films, which is effective in controlling the porosity, flexibility, and thickness of the film. Films with various zinc oxide (ZnO) mass fractions ranging from 10 to 50% were fabricated to investigate the porosity effect. The piezoelectric coefficient d33, as well as the dielectric constant and loss of the films, were characterized. The results were analyzed, and the optimal design of the film, with the right amount of ZnO nanoparticles, was determined.

  11. Demand Response and Energy Storage Integration Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Ookie; Cheung, Kerry

    Demand response and energy storage resources present potentially important sources of bulk power system services that can aid in integrating variable renewable generation. While renewable integration studies have evaluated many of the challenges associated with deploying large amounts of variable wind and solar generation technologies, integration analyses have not yet fully incorporated demand response and energy storage resources. This report represents an initial effort in analyzing the potential integration value of demand response and energy storage, focusing on the western United States. It evaluates two major aspects of increased deployment of demand response and energy storage: (1) their operational value in providing bulk power system services and (2) market and regulatory issues, including potential barriers to deployment.

  12. A Spatiotemporal Aggregation Query Method Using a Multi-Thread Parallel Technique Based on Regional Division

    NASA Astrophysics Data System (ADS)

    Liao, S.; Chen, L.; Li, J.; Xiong, W.; Wu, Q.

    2015-07-01

    Existing spatiotemporal databases support spatiotemporal aggregation queries over massive moving-object datasets. Due to the large amounts of data and the single-threaded processing method, the query speed cannot meet application requirements. On the other hand, query efficiency is more sensitive to spatial variation than to temporal variation. In this paper, we propose a spatiotemporal aggregation query method using a multi-thread parallel technique based on regional division and implement it on the server. Concretely, we divide the spatiotemporal domain into several spatiotemporal cubes, compute the spatiotemporal aggregation on all cubes using multi-thread parallel processing, and then integrate the query results. Testing and analysis on real datasets show that this method improves the query speed significantly.
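
    A rough sketch of the regional-division idea: the spatiotemporal domain is cut into cubes, each cube is aggregated by its own worker, and the partial results are merged. The cube sizes, thread pool, and toy trajectory points are assumptions for illustration, not the paper's server implementation.

    ```python
    from concurrent.futures import ThreadPoolExecutor
    import numpy as np

    rng = np.random.default_rng(0)
    # Toy moving-object records: (x, y, t) in arbitrary units on a [0, 100) domain.
    points = rng.uniform(0, 100, size=(1_000_000, 3))

    CELL, T_SLICE = 25.0, 25.0   # spatial cell size and temporal slice length

    def aggregate_cube(cube_id):
        """Count points falling into one spatiotemporal cube."""
        ix, iy, it = cube_id
        mask = ((points[:, 0] // CELL == ix) &
                (points[:, 1] // CELL == iy) &
                (points[:, 2] // T_SLICE == it))
        return cube_id, int(mask.sum())

    cubes = [(ix, iy, it) for ix in range(4) for iy in range(4) for it in range(4)]
    with ThreadPoolExecutor(max_workers=8) as pool:
        counts = dict(pool.map(aggregate_cube, cubes))   # integrate partial results

    print(sum(counts.values()))   # total points covered by the cubes
    ```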

  13. Research on the relationships of the domestic mutual investment of China based on the cross-shareholding networks of the listed companies

    NASA Astrophysics Data System (ADS)

    Ma, Yuan-yuan; Zhuang, Xin-tian; Li, Ling-xuan

    2011-02-01

    Enterprises are the core force and the carriers that promote a country's sustained and rapid economic development, and listed enterprises are outstanding companies that represent the economic level of the places where they are located. We therefore establish the cross-shareholding networks of the listed companies between 2002 and 2009, and then analyze the mutual investment at the company, province and region levels. Based on the cross-shareholding networks, we study the overall trend of economic development and the overall tendency of capital flow in China over the recent 8 years, the influence of the global economic crisis on the stock markets and the overall economy of China in 2008, and the recovery of the economy after the economic crisis. Moreover, we analyze the variations of the cross-shareholding networks and the influence of the frequent listing of large and medium state-owned enterprises on the Chinese stock markets. We divide the provinces of China into 3 main categories according to their industrial situations. Through the analysis, we find that the wealth gap between the different areas is not significantly reduced even though the government has carried out strategies such as the Development of the West Regions and the Rejuvenation of Old Industrial Bases in Northeastern China. We analyze the cumulative distribution function of the degree of the vertices and use large amounts of data to perform empirical analysis. The methods used include hierarchical cluster analysis, regression analysis, etc.

  14. The Daily Consumption of Cola Can Determine Hypocalcemia: A Case Report of Postsurgical Hypoparathyroidism-Related Hypocalcemia Refractory to Supplemental Therapy with High Doses of Oral Calcium.

    PubMed

    Guarnotta, Valentina; Riela, Serena; Massaro, Marina; Bonventre, Sebastiano; Inviati, Angela; Ciresi, Alessandro; Pizzolanti, Giuseppe; Benvenga, Salvatore; Giordano, Carla

    2017-01-01

    The consumption of soft drinks is a crucial factor in determining persistent hypocalcemia. The aim of the study is to evaluate the biochemical mechanisms inducing hypocalcemia in a female patient with usual high consumption of cola drink and persistent hypocalcemia, who failed to respond to high doses of calcium and calcitriol supplementation. At baseline and after pentagastrin injection, gastric secretion (Gs) and duodenal secretion (Ds) samples were collected and calcium and total phosphorus (Ptot) concentrations were evaluated. At the same time, blood calcium, Ptot, sodium, potassium, chloride, magnesium concentrations, and vitamin D were sampled. After intake of cola (1 L) over 180 min, Gs and Ds and blood were collected and characterized in order to analyze the amount of calcium and Ptot or sodium, potassium, magnesium, and chloride ions, respectively. A strong pH decrease was observed after cola intake with an increase in phosphorus concentration. Consequently, a decrease in calcium concentration in Gs and Ds was observed. A decrease in calcium concentration was also observed in blood. In conclusion, we confirm that in patients with postsurgical hypoparathyroidism, the intake of large amounts of cola containing high amounts of phosphoric acid reduces calcium absorption efficiency despite the high doses of calcium therapy.

  15. Exploring Cloud Computing for Large-scale Scientific Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Guang; Han, Binh; Yin, Jian

    This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications that often just require a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high performance hardware with low latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a system biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.

  16. Cone Penetration Testing, a new approach to quantify coastal-deltaic land subsidence by peat consolidation

    NASA Astrophysics Data System (ADS)

    Koster, Kay; Erkens, Gilles; Zwanenburg, Cor

    2016-04-01

    It is undisputed that land subsidence threatens coastal-deltaic lowlands all over the world. Any loss of elevation (on top of sea level rise) increases flood risk in these lowlands, and differential subsidence may cause damage to infrastructure and constructions. Many of these settings embed substantial amounts of peat, which is, due to its mechanically weak organic composition, one of the main drivers of subsidence. Peat is very susceptible to volume reduction by loading- and drainage-induced consolidation, which dissipates pore water, resulting in a tighter packing of the organic components. Often, the current state of consolidation of peat embedded within coastal-deltaic subsidence hotspots (e.g. Venice lagoon, Mississippi delta, San Joaquin delta, Kalimantan peatlands) is somewhere between its initial (natural) and maximally compressed stage. Quantifying the current state of peat volume loss is of utmost importance to predict potential (near) future subsidence when draining or loading an area. The processes of subsidence often afflict large areas (>10^3 km^2), thus demanding large datasets to assess the current state of the subsurface. In contrast to data describing the vertical motions of the actual surface (geodesy, satellite imagery), subsurface information applicable to subsidence analysis is often lacking in subsiding deltas. This calls for new initiatives to bridge that gap. Here we introduce Cone Penetration Testing (CPT) to quantify the amount of volume loss experienced by peat layers embedded within the Holland coastal plain (the Netherlands). CPT measures soil mechanical strength, and hundreds of thousands of CPTs are conducted each year on all continents. We analyzed 28 coupled CPT-borehole observations and found strong empirical relations between volume loss and increased peat mechanical strength. The peat lost between ~20 - 95% of its initial thickness by dissipation of excess pore water. An increase of 0.1 - 0.4 MPa in peat strength corresponds to 20 - 75% volume loss, and 0.4 - 0.7 MPa to 75 - 95% volume loss. This indicates that a large amount of volume has to be lost through water dissipation before peat experiences a serious increase in strength, which subsequently continues to increase with only small additional volume loss. To demonstrate the robustness of our approach to the international field of land subsidence, we applied the obtained empirical relations to previously published CPT logs from the peat-rich San Joaquin-Sacramento delta and the Kalimantan peatlands, and found volume losses that correspond with previously published results. Furthermore, we used the obtained results to predict maximum surface lowering by consolidation for these areas. In conclusion, these promising results, together with the worldwide popularity of CPT and the large datasets it yields, open the door for CPT as a generic method to contribute to quantifying the imminent threat of coastal-deltaic land subsidence.
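
    A minimal sketch of how the empirical relation quoted above could be applied to a CPT reading, mapping an increase in peat cone resistance (MPa) to an approximate volume-loss percentage. The band boundaries come from the abstract; the linear interpolation inside each band is an assumption for illustration only.

    # Sketch: strength increase from CPT (MPa) -> approximate volume loss (%),
    # using the bands quoted above (0.1-0.4 MPa -> 20-75 %, 0.4-0.7 MPa -> 75-95 %).
    # Linear interpolation within each band is an illustrative assumption.
    import numpy as np

    strength_mpa = np.array([0.1, 0.4, 0.7])        # increase in cone resistance
    volume_loss_pct = np.array([20.0, 75.0, 95.0])

    def estimate_volume_loss(dq_mpa: float) -> float:
        """Interpolate volume loss (%) for a given strength increase in MPa."""
        return float(np.interp(dq_mpa, strength_mpa, volume_loss_pct))

    print(estimate_volume_loss(0.25))  # falls in the 20-75 % band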

  17. Unit-Dose Bags For Formulating Intravenous Solutions

    NASA Technical Reports Server (NTRS)

    Finley, Mike; Kipp, Jim; Scharf, Mike; Packard, Jeff; Owens, Jim

    1993-01-01

    Smaller unit-dose flowthrough bags devised for use with large-volume parenteral (LVP) bags in preparing sterile intravenous solutions. Premeasured amount of solute stored in such unit-dose bag flushed by predetermined amount of water into LVP bag. Relatively small number of LVP bags used in conjunction with smaller unit-dose bags to formulate large number of LVP intravenous solutions in nonsterile environment.

  18. Taking Energy to the Physics Classroom from the Large Hadron Collider at CERN

    ERIC Educational Resources Information Center

    Cid, Xabier; Cid, Ramon

    2009-01-01

    In 2008, the greatest experiment in history began. When in full operation, the Large Hadron Collider (LHC) at CERN will generate the greatest amount of information that has ever been produced in an experiment before. It will also reveal some of the most fundamental secrets of nature. Despite the enormous amount of information available on this…

  19. Patchy reaction-diffusion and population abundance: the relative importance of habitat amount and arrangement

    Treesearch

    Curtis H. Flather; Michael Bevers

    2002-01-01

    A discrete reaction-diffusion model was used to estimate long-term equilibrium populations of a hypothetical species inhabiting patchy landscapes to examine the relative importance of habitat amount and arrangement in explaining population size. When examined over a broad range of habitat amounts and arrangements, population size was largely determined by a pure amount...
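
    A minimal, hedged sketch of the general kind of discrete patchy reaction-diffusion model named above: logistic growth inside habitat cells, mortality in matrix cells, and nearest-neighbor dispersal. The grid, parameters, and update rule are illustrative assumptions, not the authors' model.

    # Toy discrete reaction-diffusion on a patchy 1-D landscape (placeholders only).
    import numpy as np

    habitat = np.array([1, 1, 0, 0, 1, 0, 1, 1, 0, 1])   # 1 = habitat, 0 = matrix
    n = np.full(habitat.size, 0.1)                        # initial abundance
    r, K, d, m = 0.4, 1.0, 0.2, 0.1                       # growth, capacity, dispersal, matrix mortality

    for _ in range(500):
        reaction = np.where(habitat == 1, r * n * (1 - n / K), -m * n)
        diffusion = d * (np.roll(n, 1) + np.roll(n, -1) - 2 * n)   # periodic boundary
        n = np.clip(n + reaction + diffusion, 0.0, None)

    print(n.round(3), "equilibrium total:", round(float(n.sum()), 3))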

  20. 2014 Mount Ontake eruption: characteristics of the phreatic eruption as inferred from aerial observations

    NASA Astrophysics Data System (ADS)

    Kaneko, Takayuki; Maeno, Fukashi; Nakada, Setsuya

    2016-05-01

    The sudden eruption of Mount Ontake on September 27, 2014, led to a tragedy that caused more than 60 fatalities, including missing persons. In order to mitigate the potential risks posed by similar volcano-related disasters, it is vital to have a clear understanding of the activity status and progression of eruptions. Because the erupted material was largely disturbed during the month in which access was strictly prohibited, we analyzed aerial photographs taken on September 28. The results showed that there were three large vents in the bottom of the Jigokudani valley on September 28. The vent in the center was considered to have been the main vent involved in the eruption, and the vents on either side were considered to have been formed by non-explosive processes. The pyroclastic flows extended approximately 2.5 km along the valley at an average speed of 32 km/h. The absence of burned or fallen trees in this area indicated that the temperatures and destructive forces associated with the pyroclastic flow were both low. The distribution of ballistics was categorized into four zones based on the number of impact craters per unit area, and the furthest impact crater was located 950 m from the vents. Based on ballistic models, the maximum initial velocity of the ejecta was estimated to be 111 m/s. Just after the beginning of the eruption, very few ballistic ejecta had arrived at the summit, even though the eruption plume had risen above the summit, which suggested that a large amount of ballistic ejecta was expelled from the volcano several tens of seconds after the beginning of the eruption. This initial period was characterized by the escape of a vapor phase from the vents, which then caused the explosive eruption phase that generated large amounts of ballistic ejecta via sudden decompression of a hydrothermal reservoir.
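
    A quick, hedged sanity check on the reported maximum initial velocity: for a drag-free trajectory launched at 45 degrees, the range R = v^2/g gives a lower bound on the launch velocity of the farthest ballistic block. Real ejecta models include drag and launch-angle effects, so the 111 m/s estimate quoted above plausibly sits above this simple bound. Only values quoted in the abstract are used.

    # Drag-free ballistic lower bound: R = v**2 * sin(2*theta) / g, maximal at
    # theta = 45 deg, so v_min = sqrt(g * R).  Drag raises the required velocity.
    import math

    g = 9.81    # m/s^2
    R = 950.0   # m, farthest impact crater quoted in the abstract

    v_min = math.sqrt(g * R)
    print(f"drag-free lower bound: {v_min:.0f} m/s (drag-inclusive estimate in the paper: 111 m/s)")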

  1. Intramolecular energy transfer and the driving mechanisms for large-amplitude collective motions of clusters

    NASA Astrophysics Data System (ADS)

    Yanao, Tomohiro; Koon, Wang Sang; Marsden, Jerrold E.

    2009-04-01

    This paper uncovers novel and specific dynamical mechanisms that initiate large-amplitude collective motions in polyatomic molecules. These mechanisms are understood in terms of intramolecular energy transfer between modes and driving forces. Structural transition dynamics of a six-atom cluster between a symmetric and an elongated isomer is highlighted as an illustrative example of what is a general message. First, we introduce a general method of hyperspherical mode analysis to analyze the energy transfer among internal modes of polyatomic molecules. In this method, the (3n-6) internal modes of an n-atom molecule are classified generally into three coarse level gyration-radius modes, three fine level twisting modes, and (3n-12) fine level shearing modes. We show that a large amount of kinetic energy flows into the gyration-radius modes when the cluster undergoes structural transitions by changing its mass distribution. Based on this fact, we construct a reactive mode as a linear combination of the three gyration-radius modes. It is shown that before the reactive mode acquires a large amount of kinetic energy, activation or inactivation of the twisting modes, depending on the geometry of the isomer, plays crucial roles for the onset of a structural transition. Specifically, in a symmetric isomer with a spherical mass distribution, activation of specific twisting modes drives the structural transition into an elongated isomer by inducing a strong internal centrifugal force, which has the effect of elongating the mass distribution of the system. On the other hand, in an elongated isomer, inactivation of specific twisting modes initiates the structural transition into a symmetric isomer with lower potential energy by suppressing the elongation effect of the internal centrifugal force and making the effects of the potential force dominant. This driving mechanism for reactions as well as the present method of hyperspherical mode analysis should be widely applicable to molecular reactions in which a system changes its overall mass distribution in a significant way.
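
    As a purely illustrative companion to the gyration-radius modes mentioned above (not the authors' code), the sketch below computes three gyration radii of an n-atom configuration as the singular values of the mass-weighted, center-of-mass-frame coordinate matrix, a standard construction; the coordinates and masses are placeholders.

    # Sketch: three gyration radii of an n-atom cluster from the SVD of the
    # mass-weighted coordinate matrix in the center-of-mass frame (illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 6
    masses = np.ones(n)                    # placeholder: equal masses
    coords = rng.normal(size=(n, 3))       # placeholder configuration

    com = np.average(coords, axis=0, weights=masses)
    weighted = np.sqrt(masses)[:, None] * (coords - com)

    gyration_radii = np.linalg.svd(weighted, compute_uv=False)   # three values
    print(gyration_radii)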

  2. Grandmothers' productivity and the HIV/AIDS pandemic in sub-Saharan Africa.

    PubMed

    Bock, John; Johnson, Sara E

    2008-06-01

    The human immunodeficiency virus (HIV)/acquired immune deficiency syndrome (AIDS) pandemic has left large numbers of orphans in sub-Saharan Africa. Botswana has an HIV prevalence rate of approximately 40% in adults. Morbidity and mortality are high, and in a population of 1.3 million there are nearly 50,000 children who have lost one or both parents to HIV/AIDS. The extended family, particularly grandparents, absorbs much of the childrearing responsibilities. This creates large amounts of additional work, especially for grandmothers. The embodied capital model and the grandmother hypothesis are both derived from life history theory within evolutionary ecology, and both predict that one important factor in the evolution of the human extended family structure is that postreproductive individuals such as grandmothers provide substantial support to their grandchildren's survival. Data collected in the pre-pandemic context in a traditional multi-ethnic community in the Okavango Delta of Botswana are analyzed to calculate the amount of work effort provided to a household by women of different ages. Results show that the contributions of older and younger women to the household, in terms of both productivity and childrearing, are qualitatively and quantitatively different. These results indicate that it is unrealistic to expect older women to be able to compensate for the loss of younger women's contributions to the household, and that interventions should be specifically designed to support older women based on the type of activities in which they engage that affect child survival, growth, and development.

  3. Development and application of an algorithm to compute weighted multiple glycan alignments.

    PubMed

    Hosoda, Masae; Akune, Yukie; Aoki-Kinoshita, Kiyoko F

    2017-05-01

    A glycan consists of monosaccharides linked by glycosidic bonds, has branches and forms complex molecular structures. Databases have been developed to store large amounts of glycan-binding experiments, including glycan arrays with glycan-binding proteins. However, there are few bioinformatics techniques to analyze large amounts of glycan data because there are few tools that can handle the complexity of glycan structures. Thus, we have developed the MCAW (Multiple Carbohydrate Alignment with Weights) tool that can align multiple glycan structures, to aid in the understanding of their function as binding recognition molecules. We describe in detail the first algorithm to perform multiple glycan alignments by modeling glycans as trees. To test our tool, we prepared several data sets, and as a result, we found that the glycan motif could be successfully aligned without any prior knowledge applied to the tool, and the known recognition binding sites of glycans could be aligned at a high rate across all the datasets tested. We thus claim that our tool is able to find meaningful glycan recognition and binding patterns using data obtained by glycan-binding experiments. The development and availability of an effective multiple glycan alignment tool opens possibilities for many other glycoinformatics analyses, making this work a big step towards furthering glycomics analysis. http://www.rings.t.soka.ac.jp. kkiyoko@soka.ac.jp. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.

  4. Precipitation chemistry in central Amazonia

    NASA Technical Reports Server (NTRS)

    Andreae, M. O.; Talbot, R. W.; Berresheim, H.; Beecher, K. M.

    1990-01-01

    Rain samples from three sites in central Amazonia were collected over a period of 6 weeks during the 1987 wet season and analyzed for ionic species and dissolved organic carbon. A continuous record of precipitation chemistry and amount was obtained at two of these sites, which were free from local or regional pollution, for a time period of over 1 month. The volume-weighted mean concentrations of most species were found to be about a factor of 5 lower during the wet season compared with previous results from the dry season. Only sodium, potassium, and chloride showed similar concentrations in both seasons. When the seasonal difference in rainfall amount is taken into consideration, the deposition fluxes are only slightly lower for most species during the wet season than during the dry season, again with the exception of chloride, potassium, and sodium. Sodium and chloride are present in the same ratio as in sea salt; rapid advection of air masses of marine origin to the central Amazon Basin during the wet season may be responsible for the observed higher deposition flux of these species. Statistical analysis suggests that sulfate is, to a large extent, of marine (sea salt and biogenic) origin, but that long-range transport of combustion-derived aerosols also makes a significant contribution to sulfate and nitrate levels in Amazonian rain. Organic acid concentrations in rain were responsible for a large fraction of the observed precipitation acidity; their concentration was strongly influenced by gas/liquid interactions.
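
    For readers unfamiliar with the volume-weighted mean concentrations referred to above, the short sketch below shows the calculation for a set of rain events; the event values are invented placeholders, and only the formula (sum of concentration times rainfall amount, divided by total rainfall) is standard.

    # Volume-weighted mean (VWM) concentration over rain events:
    #   VWM = sum(c_i * v_i) / sum(v_i)
    # where c_i is the concentration and v_i the rainfall amount of event i.
    import numpy as np

    rain_mm = np.array([3.0, 12.0, 7.5])      # placeholder event rainfall
    conc_ueq_l = np.array([8.0, 2.5, 4.0])    # placeholder concentrations

    vwm = np.sum(conc_ueq_l * rain_mm) / np.sum(rain_mm)
    print(f"volume-weighted mean: {vwm:.2f} ueq/L")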

  5. Biosynthesis of staphylococcal enterotoxin A by genetic engineering technology and determination of staphylococcal enterotoxin A in water by HPLC-ESI-TOF.

    PubMed

    Li, Hong-Na; Yuan, Fei; Luo, Yun-Jing; Wang, Jian-Feng; Zhang, Chuan-Bin; Zhou, Wei-E; Ren, Zhi-Qin; Wu, Wen-Jie; Zhang, Feng

    2017-08-01

    Staphylococcal enterotoxin A (SEA) is a major virulence factor of Staphylococcus aureus and a biomarker of S. aureus. To establish a fast, low-cost, highly accurate, reliable, and simple method for detecting S. aureus, SEA was analyzed by HPLC-ESI-TOF. Because SEA was not yet widely available commercially, SEA was prepared before it was analyzed by HPLC-ESI-TOF. The results showed that highly purified SEA was successfully prepared and that SEA showed a normal distribution in the mass spectra. A large amount of recombinant SEA (rSEA) was obtained by genetic engineering and purified on a Ni affinity chromatography column, and the expression and purity of rSEA and SEA were analyzed by SDS-PAGE. The factors affecting the ionization of SEA were studied for the qualitative analysis of SEA by HPLC-ESI-TOF. The results showed that a large amount of SEs was expressed within a short time at about 28 °C, and that there were no impurity bands in the electropherogram after rSEA was purified on the Ni affinity chromatography column. In addition, SEA with an amino acid sequence homologous to wild-type SEA was produced from rSEA. The retention of SEA on the column and the ionization of SEA in ESI-TOF were studied for the qualitative analysis of S. aureus. The results showed that the content of formic acid in the mobile phase was an important factor for the ionization of SEs in ESI-TOF, providing a theoretical foundation for the qualitative detection of S. aureus. [SEs + nH+ + mNH4+]^(n+m)+ ions were observed in the ESI-TOF spectra when SEA was detected in positive ion mode, and the value of n+m was less than or equal to the number of basic amino acids in SEs. This method was applied to a preliminary determination of SEA in water samples, and the detection limit of SEA in a spiked water sample was 3 mg/kg. A limit of detection of 3 mg/kg represents low sensitivity for low-molecular-weight compounds, but high sensitivity for SEA, which has a high molecular weight of 27 kDa; 3 mg/kg of SEA is equivalent to 10^-4 mmol/kg. This study can provide evidence for establishing a method to determine SEA in real samples.
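
    A small sketch of the mass-to-charge arithmetic implied by the [SEs + nH+ + mNH4+]^(n+m)+ ion assignment above: each proton adds about 1.007 Da, each ammonium about 18.034 Da, and the total charge is n+m. The molecular mass below is the approximate 27 kDa quoted in the abstract; the specific n and m values are illustrative.

    # m/z of a protein ion carrying n protons and m ammonium adducts:
    #   m/z = (M + n*m_H + m*m_NH4) / (n + m)
    M_SEA = 27000.0    # Da, approximate mass quoted in the abstract
    M_H = 1.00728      # Da, proton
    M_NH4 = 18.03437   # Da, ammonium

    def mz(n_protons: int, m_ammonium: int, mass: float = M_SEA) -> float:
        charge = n_protons + m_ammonium
        return (mass + n_protons * M_H + m_ammonium * M_NH4) / charge

    for n, m in [(20, 0), (18, 2), (15, 5)]:   # illustrative charge states
        print(n, m, round(mz(n, m), 1))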

  6. Link Winds: A visual data analysis system and its application to the atmospheric ozone depletion problem

    NASA Technical Reports Server (NTRS)

    Jacobson, Allan S.; Berkin, Andrew L.

    1995-01-01

    The Linked Windows Interactive Data System (LinkWinds) is a prototype visual data exploration system resulting from a NASA Jet Propulsion Laboratory (JPL) program of research into the application of graphical methods for rapidly accessing, displaying, and analyzing large multivariate multidisciplinary data sets. Running under UNIX, it is an integrated multi-application executing environment using a data-linking paradigm to dynamically interconnect and control multiple windows containing a variety of displays and manipulators. This paradigm, resulting in a system similar to a graphical spreadsheet, is not only a powerful method for organizing large amounts of data for analysis, but leads to a highly intuitive, easy-to-learn user interface. It provides great flexibility in rapidly interacting with large masses of complex data to detect trends, correlations, and anomalies. The system, containing an expanding suite of non-domain-specific applications, provides for the ingestion of a variety of database formats and hard-copy output of all displays. Remote networked workstations running LinkWinds may be interconnected, providing a multiuser science environment (MUSE) for collaborative data exploration by a distributed science team. The system is being developed in close collaboration with investigators in a variety of science disciplines using both archived and real-time data. It is currently being used to support the Microwave Limb Sounder (MLS) in orbit aboard the Upper Atmosphere Research Satellite (UARS). This paper describes the application of LinkWinds to these data to rapidly detect features, such as the ozone hole configuration, and to analyze correlations between chemical constituents of the atmosphere.

  7. Light-Dependent OCT Structure Changes in Photoreceptor Degenerative rd 10 Mouse Retina

    PubMed Central

    Li, Yichao; Zhang, Yikui; Chen, Sonia; Vernon, Gregory; Wong, Wai T.

    2018-01-01

    Purpose Using optical coherence tomography (OCT) to analyze the effects of light/dark adaptation in a mouse model of inherited photoreceptor degeneration (rd10), and to study dynamics of subretinal fluid during the progress of retinal degeneration. Methods rd10 and wild-type (WT) C57BL/6J mice were reared in cyclic light or darkness and imaged with Bioptigen UHR-OCT or Spectralis HRA+OCT after adaptation to either light or darkness. Results OCT images from rd10 mice were analyzed at three progressive stages of degeneration. After light-adaptation, stage I (postnatal age [P]26–29) eyes demonstrated no apparent subretinal fluid. At stage II (P32–38), subretinal fluid was present and restricted to parapapillary area, while at stage III (P44–45) extensive subretinal fluid was present across many retinal areas. Following overnight dark-adaptation, WT eyes showed a large reduction in outer retinal thickness (4.6 ± 1.4 μm, n = 16), whereas this change was significantly smaller in stage I rd10 eyes (1.5 ± 0.5 μm, n = 14). In stage II rd10 eyes, dark-adaptation significantly reduced the extent of subretinal fluid, with the amount of reduction correlating with the amount of fluid pre-existing in the light-adapted state. However, dark-adaptation did not significantly alter the amount of subretinal fluid observed in stage III rd10 mice. In addition, dark-rearing of rd10 mice from P6 to P30 slowed retinal degeneration. Conclusions Visual experience in the form of light/dark adaptation exerts a significant effect on outer retinal structure in the context of photoreceptor degeneration. This effect may arise from light-dependent alterations in fluid transport across the RPE monolayer, and promote photoreceptor survival as induced by dark-rearing. PMID:29490345

  8. Gravity modeling finds a large magma body in the deep crust below the Gulf of Naples, Italy.

    PubMed

    Fedi, M; Cella, F; D'Antonio, M; Florio, G; Paoletti, V; Morra, V

    2018-05-29

    We analyze a wide gravity low in the Campania Active Volcanic Area and interpret it by a large and deep source distribution of partially molten, low-density material from about 8 to 30 km depth. Given the complex spatial-temporal distribution of explosive volcanism in the area, we model the gravity data consistently with several volcanological and petrological constraints. We propose two possible models: one accounts for the coexistence, within the lower/intermediate crust, of large amounts of melts and cumulates besides country rocks. It implies a layered distribution of densities and, thus, a variation with depth of percentages of silicate liquids, cumulates and country rocks. The other reflects a fractal density distribution, based on the scaling exponent estimated from the gravity data. According to this model, the gravity low would be related to a distribution of melt pockets within solid rocks. Both density distributions account for the available volcanological and seismic constraints and can be considered as end-members of possible models compatible with gravity data. Such results agree with the general views about the roots of large areas of ignimbritic volcanism worldwide. Given the prolonged history of magmatism in the Campania area since Pliocene times, we interpret the detected low-density body as a developing batholith.

  9. River bed Elevation Changes and Increasing Flood Hazards in the Nisqually River at Mount Rainier National Park, Washington

    NASA Astrophysics Data System (ADS)

    Halmon, S.; Kennard, P.; Beason, S.; Beaulieu, E.; Mitchell, L.

    2006-12-01

    Mount Rainier, located in Southwestern Washington, is the most heavily glaciated volcano of the Cascade Mountain Range. Due to its large number of glaciers, Mount Rainier also has a large number of braided rivers, which are formed by the heavy sediment load released from the glaciers. As sediment builds up in the river, its bed rises, or aggrades, and its floodplain changes. Contributions to a river's increased sediment load include debris flows, erosion, and runoff, which tend to carry trees, boulders, and sediment downstream. Over time, the increased sediment load results in a rise in the river's elevation. The purpose of this study is to monitor aggradation rates (increases in the height of the river bed) in one of Mount Rainier's major rivers, the Nisqually. The studied location is near employee offices and visitor attractions in Longmire. The results of this study will also provide support to decision makers regarding geological hazard reduction in the area. The Nisqually glacier is located on the southern side of the volcano, which receives abundant sunlight and thus releases large amounts of snowmelt and sediment in the summer. Historical data indicate several current features that may contribute to future flooding, such as the unnatural uphill slope of the river, which is due to a major depositional event in the late 1700s in which 15 ft of material was deposited in this area. Other relevant features are the glaciers surrounding the Nisqually glacier, such as the Van Trump and Kautz glaciers, which produced large outbursts affecting the Nisqually River and the Longmire area in 2001, 2003, and 2005. To further explore these areas, the research team used a surveying instrument (total station) in the Nisqually River to measure elevation changes and angles at various positions within ten cross sections along the Longmire area. These data were then put into a GIS to analyze current sediment levels and to compare them with previous cross sections from 1993 and 2005. Results of the data analysis revealed changes in the altitude of the sediment, as well as new areas of built-up sediment. For example, a 7-foot increase in elevation, not present in the 2005 data, indicated an increased amount of debris transported from upstream. Further data will be obtained once all the cross sections are completed and the data are analyzed more closely.

  10. Single-wavelength based rice leaf color analyzer for nitrogen status estimation

    NASA Astrophysics Data System (ADS)

    Sumriddetchkajorn, Sarun; Intaravanne, Yuttana

    2014-02-01

    With the need of a tool for efficient nitrogen (N) fertilizer management in the rice field, this paper proposes a low-cost compact single-wavelength based colorimeter that can be used to indicate the specified six color levels of a rice leaf associated with the desired amount of N fertilizer for the rice field. Our key design is in a reflective optical architecture that allows us to investigate the amount of light scattered from only one side of the rice leaf. We also show how we implement this needed rice leaf color analyzer by integrating an off-the-shelf 562-nm wavelength light emitting diode (LED), a silicon photodiode, an 8-bit microcontroller, and a 6×1 LED panel in a compact plastic package. Field test results in rice fields confirm that leaf color levels of 1, 2, 3, 5, and 6 are effectively identified and their corresponding amount of N fertilizer can be determined. For the leaf color level of 4, our single-wavelength based rice leaf color analyzer sometimes indicates a higher color level of 5 whose suggested amount of N fertilizer is equal to that for the leaf color level of 4. Other key features include ease of use and upgradability for different color levels.
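
    A minimal sketch of the kind of decision logic such a single-wavelength analyzer might implement: a normalized photodiode reading is binned into one of six leaf color levels and mapped to a suggested N amount. The thresholds and fertilizer amounts below are placeholders, not the device's calibration.

    # Sketch: bin a normalized 562-nm reflectance reading into a leaf color level
    # (1 = lightest/most N-deficient, 6 = darkest) and look up a suggested N rate.
    # All thresholds and rates are illustrative placeholders.
    level_thresholds = [0.80, 0.65, 0.50, 0.38, 0.25]          # descending boundaries
    n_rate_kg_ha = {1: 45, 2: 35, 3: 25, 4: 15, 5: 15, 6: 0}   # levels 4 and 5 share a rate

    def leaf_color_level(reflectance: float) -> int:
        for level, threshold in enumerate(level_thresholds, start=1):
            if reflectance >= threshold:
                return level
        return 6

    reading = 0.58                      # placeholder normalized reading
    level = leaf_color_level(reading)
    print(level, n_rate_kg_ha[level], "kg N/ha suggested (placeholder)")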

  11. Old age and underlying interstitial abnormalities are risk factors for development of ARDS after pleurodesis using limited amount of large particle size talc.

    PubMed

    Shinno, Yuki; Kage, Hidenori; Chino, Haruka; Inaba, Atsushi; Arakawa, Sayaka; Noguchi, Satoshi; Amano, Yosuke; Yamauchi, Yasuhiro; Tanaka, Goh; Nagase, Takahide

    2018-01-01

    Talc pleurodesis is commonly performed to manage refractory pleural effusion or pneumothorax. It is considered as a safe procedure as long as a limited amount of large particle size talc is used. However, acute respiratory distress syndrome (ARDS) is a rare but serious complication after talc pleurodesis. We sought to determine the risk factors for the development of ARDS after pleurodesis using a limited amount of large particle size talc. We retrospectively reviewed patients who underwent pleurodesis with talc or OK-432 at the University of Tokyo Hospital. Twenty-seven and 35 patients underwent chemical pleurodesis using large particle size talc (4 g or less) or OK-432, respectively. Four of 27 (15%) patients developed ARDS after talc pleurodesis. Patients who developed ARDS were significantly older than those who did not (median 80 vs 66 years, P = 0.02) and had a higher prevalence of underlying interstitial abnormalities on chest computed tomography (CT; 2/4 vs 1/23, P < 0.05). No patient developed ARDS after pleurodesis with OK-432. This is the first case series of ARDS after pleurodesis using a limited amount of large particle size talc. Older age and underlying interstitial abnormalities on chest CT seem to be risk factors for developing ARDS after talc pleurodesis. © 2017 Asian Pacific Society of Respirology.

  12. 9 CFR 381.412 - Reference amounts customarily consumed per eating occasion.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...

  13. 9 CFR 381.412 - Reference amounts customarily consumed per eating occasion.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...

  14. 9 CFR 317.312 - Reference amounts customarily consumed per eating occasion.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...

  15. 9 CFR 317.312 - Reference amounts customarily consumed per eating occasion.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...

  16. 9 CFR 381.412 - Reference amounts customarily consumed per eating occasion.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...

  17. 9 CFR 317.312 - Reference amounts customarily consumed per eating occasion.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...

  18. Bag For Formulating And Dispersing Intravenous Solution

    NASA Technical Reports Server (NTRS)

    Kipp, Jim; Owens, Jim; Scharf, Mike; Finley, Mike; Dudar, Tom; Veillon, Joe; Ogle, Jim

    1993-01-01

    Large-volume parenteral (LVP) bag in which predetermined amount of sterile solution formulated by combining premeasured, prepackaged amount of sterile solute with predetermined amount of water. Bag designed to hold predetermined amount, typically 1 L, of sterile solution. Sterility of solution maintained during mixing by passing water into bag through sterilizing filter. System used in field or hospitals not having proper sterile facilities, and in field research.

  19. Ion Homeostasis in Chloroplasts under Salinity and Mineral Deficiency 1

    PubMed Central

    Schröppel-Meier, Gabriele; Kaiser, Werner M.

    1988-01-01

    Spinach (Spinacia oleracea var “Yates”) plants in hydroponic culture were exposed to stepwise increased concentrations of NaCl or NaNO3 up to a final concentration of 300 millimoles per liter, at constant Ca2+ concentration. Leaf cell sap and extracts from aqueously isolated spinach chloroplasts were analyzed for mineral cations, anions, amino acids, sugars, and quaternary ammonium compounds. Total osmolality of leaf sap and photosynthetic capacity of leaves were also measured. For comparison, leaf sap from salt-treated pea plants was also analyzed. Spinach plants under NaCl or NaNO3 salinity took up large amounts of sodium (up to 400 millimoles per liter); nitrate as the accompanying anion was taken up less (up to 90 millimoles per liter) than chloride (up to 450 millimoles per liter). Under chloride salinity, nitrate content in leaves decreased drastically, but total amino acid concentrations remained constant. This response was much more pronounced (and occurred at lower salt concentrations) in leaves from the glycophyte (pea, Pisum sativum var “Kleine Rheinländerin”) than from moderately salt-tolerant spinach. In spinach, sodium chloride or nitrate taken up into leaves was largely sequestered in the vacuoles; both salts induced synthesis of quaternary ammonium compounds, which were accumulated mainly in chloroplasts (and cytosol). This prevented impairment of metabolism, as indicated by an unchanged photosynthetic capacity of leaves. PMID:16666232

  20. Low-authority control synthesis for large space structures

    NASA Technical Reports Server (NTRS)

    Aubrun, J. N.; Margulies, G.

    1982-01-01

    The control of vibrations of large space structures by distributed sensors and actuators is studied. A procedure is developed for calculating the feedback loop gains required to achieve specified amounts of damping. For moderate damping (Low Authority Control) the procedure is purely algebraic, but it can be applied iteratively when larger amounts of damping are required and is generalized for arbitrary time-invariant systems.
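
    As a hedged illustration of the kind of algebraic gain calculation such a procedure involves (not the authors' actual algorithm), the sketch below uses the standard low-authority-control approximation for a collocated rate-feedback loop, in which the damping added to mode k is roughly g*phi_k^2/(2*omega_k) for unit modal mass, and solves for the gain that achieves a specified damping increment.

    # Sketch: low-authority control gain for a target modal damping increment,
    # using the collocated rate-feedback approximation
    #   delta_zeta_k ~= g * phi_k**2 / (2 * omega_k)    (unit modal mass).
    # All numerical values are illustrative placeholders.
    import numpy as np

    omega = np.array([2.0, 9.5, 31.4])   # rad/s, placeholder modal frequencies
    phi = np.array([0.8, 0.3, 0.05])     # placeholder mode shapes at the actuator
    target_zeta = 0.02                   # desired added damping ratio

    gains = 2.0 * target_zeta * omega / phi**2
    print(gains)   # small phi -> large required gain (little authority over that mode)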

  1. GeoNotebook: Browser based Interactive analysis and visualization workflow for very large climate and geospatial datasets

    NASA Astrophysics Data System (ADS)

    Ozturk, D.; Chaudhary, A.; Votava, P.; Kotfila, C.

    2016-12-01

    Jointly developed by Kitware and NASA Ames, GeoNotebook is an open source tool designed to give the maximum amount of flexibility to analysts, while dramatically simplifying the process of exploring geospatially indexed datasets. Packages like Fiona (backed by GDAL), Shapely, Descartes, Geopandas, and PySAL provide a stack of technologies for reading, transforming, and analyzing geospatial data. Combined with the Jupyter notebook and libraries like matplotlib/Basemap, it is possible to generate detailed geospatial visualizations. Unfortunately, the visualizations generated are either static or do not perform well for very large datasets. This setup also requires a great deal of boilerplate code to create and maintain. Other extensions exist to remedy these problems, but they provide a separate map for each input cell and do not support map interactions that feed back into the Python environment. To support interactive data exploration and visualization on large datasets, we have developed an extension to the Jupyter notebook that provides a single dynamic map that can be managed from the Python environment and that can communicate back with a server which can perform operations like data subsetting on a cloud-based cluster.
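
    A minimal sketch of the static, boilerplate-heavy plotting workflow that the notebook extension described above aims to replace with a single interactive map; the file name and attribute column are hypothetical placeholders, and the geopandas/matplotlib calls are ordinary library usage rather than anything specific to GeoNotebook.

    # Sketch: conventional static geospatial plotting in a notebook cell.
    # "regions.geojson" and the "area_km2" column are hypothetical placeholders.
    import geopandas as gpd
    import matplotlib.pyplot as plt

    gdf = gpd.read_file("regions.geojson")   # read vector data (GDAL/Fiona under the hood)
    gdf = gdf.to_crs(epsg=4326)              # reproject to lat/lon
    subset = gdf[gdf["area_km2"] > 100]      # placeholder attribute filter

    ax = subset.plot(column="area_km2", legend=True, figsize=(8, 6))
    ax.set_title("Static map; the cell must be re-run for every new region of interest")
    plt.show()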

  2. Dual Use of Cigarettes, Little Cigars, Cigarillos, and Large Cigars: Smoking Topography and Toxicant Exposure.

    PubMed

    Pickworth, Wallace B; Rosenberry, Zachary R; O'Grady, Kevin E; Koszowski, Bartosz

    2017-04-01

    Smoking topography variables and toxicant exposure (plasma nicotine and exhaled CO) were examined in 3 groups of study participants that smoked both cigarettes and either filtered little cigars (Winchester), cigarillos (Black & Mild), or large cigars (Phillies Blunt). Laboratory ad lib smoking of the cigar products was collected with a smoking puff analyzer; plasma levels of nicotine and exhaled CO were collected before and after smoking. Although there were no statistically significant differences in demographic and cigarette smoking topography among the groups, there were significant differences in how the different cigar products were smoked. Plasma nicotine boost was similar after all products but exhaled CO was greater after the cigarillo and large cigar than the little cigar. Some of the differences were due to the differences in article size but other differences were apparent even after adjustment for the amount of tobacco burned or the mouth intake (puff volume). The topography findings of differences among products challenge the practice of grouping cigars as a single entity in surveys, regulatory decisions, and discussions of toxicant exposure. The results add to the discussion of distinctions among products in the scientific assessment of public health risk and regulatory decisions.

  3. Student profiling on university co-curriculum activities using data visualization tools

    NASA Astrophysics Data System (ADS)

    Jamil, Jastini Mohd.; Shaharanee, Izwan Nizal Mohd

    2017-11-01

    Co-curricular activities play a vital role in the development of a holistic student. The co-curriculum can be described as an extension of the formal learning experiences in a course or academic program. There are many co-curricular activities, such as students' participation in sports, volunteerism, leadership, entrepreneurship, uniformed bodies, student council, and other social events. The number of students involved in co-curricular activities is large, creating an enormous volume of data including their demographic facts, academic performance, and co-curriculum types. The task of discovering and analyzing this information becomes increasingly difficult and hard to comprehend. Data visualization offers a better way of handling large volumes of information. An understanding of these various co-curricular activities and their effect on student performance is essential. Visualizing this information can help the related stakeholders become aware of hidden and interesting information otherwise drowning in the large amount of student data. The main objective of this study is to provide a clearer understanding of the different trends hidden in the student co-curricular activity data in relation to their activities and academic performances. Data visualization software was used to help visualize the data extracted from the database.

  4. Aqueous Two-Phase Systems at Large Scale: Challenges and Opportunities.

    PubMed

    Torres-Acosta, Mario A; Mayolo-Deloisa, Karla; González-Valdez, José; Rito-Palomares, Marco

    2018-06-07

    Aqueous two-phase systems (ATPS) have proved to be an efficient and integrative operation to enhance recovery of industrially relevant bioproducts. Since the discovery of ATPS, a variety of works have been published regarding their scaling from 10 to 1000 L. Although ATPS have achieved high recovery and purity yields, there is still a gap between their bench-scale use and potential industrial applications. In this context, this review paper critically analyzes ATPS scale-up strategies to enhance the potential for industrial adoption. In particular, large-scale operation considerations, different phase separation procedures, the available optimization techniques (univariate, response surface methodology, and genetic algorithms) to maximize recovery and purity, and economic modeling to predict large-scale costs are discussed. ATPS intensification to increase the amount of sample processed in each system, the development of recycling strategies, and the creation of highly efficient predictive models are still areas of great significance that can be further exploited with the use of high-throughput techniques. Moreover, the development of novel ATPS can maximize their specificity, increasing the possibilities for future industrial adoption of ATPS. This review attempts to present the areas of opportunity to increase the attractiveness of ATPS at industrial levels. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. A divide-and-conquer algorithm for large-scale de novo transcriptome assembly through combining small assemblies from existing algorithms.

    PubMed

    Sze, Sing-Hoi; Parrott, Jonathan J; Tarone, Aaron M

    2017-12-06

    While the continued development of high-throughput sequencing has facilitated studies of entire transcriptomes in non-model organisms, the incorporation of an increasing number of RNA-Seq libraries has made de novo transcriptome assembly difficult. Although algorithms that can assemble a large amount of RNA-Seq data are available, they are generally very memory-intensive and can only be used to construct small assemblies. We develop a divide-and-conquer strategy that allows these algorithms to be utilized, by subdividing a large RNA-Seq data set into small libraries. Each individual library is assembled independently by an existing algorithm, and a merging algorithm is developed to combine these assemblies by picking a subset of high quality transcripts to form a large transcriptome. Compared to existing algorithms that return a single assembly directly, this strategy achieves comparable or higher accuracy than memory-efficient algorithms that can process a large amount of RNA-Seq data, and comparable or slightly lower accuracy than memory-intensive algorithms that can only be used to construct small assemblies. Our divide-and-conquer strategy allows memory-intensive de novo transcriptome assembly algorithms to be utilized to construct large assemblies.
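
    A hedged orchestration sketch of the divide-and-conquer idea described above: split the RNA-Seq libraries into small groups, assemble each group independently with an existing assembler, and merge by keeping a high-quality subset of transcripts. The function names, the external assembler command, and the length-based quality filter are placeholders, not the published implementation.

    # Sketch of a divide-and-conquer assembly pipeline (placeholders throughout).
    import subprocess
    from pathlib import Path

    def parse_fasta(path):
        """Minimal FASTA reader yielding (header, sequence) pairs."""
        header, seq = None, []
        with open(path) as fh:
            for line in fh:
                line = line.rstrip()
                if line.startswith(">"):
                    if header is not None:
                        yield header, "".join(seq)
                    header, seq = line[1:], []
                else:
                    seq.append(line)
        if header is not None:
            yield header, "".join(seq)

    def assemble_library(fastq, outdir):
        """Assemble one small library with an existing assembler (placeholder command)."""
        Path(outdir).mkdir(parents=True, exist_ok=True)
        subprocess.run(["existing_assembler", "--reads", str(fastq), "--out", str(outdir)],
                       check=True)
        return Path(outdir) / "transcripts.fasta"

    def merge_assemblies(fasta_files, min_length=300):
        """Toy merge: keep transcripts above a length cutoff, standing in for the
        paper's selection of high-quality transcripts."""
        merged = []
        for f in fasta_files:
            merged.extend((h, s) for h, s in parse_fasta(f) if len(s) >= min_length)
        return merged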

  6. Catchment hydrological responses to forest harvest amount and spatial pattern - 2011

    EPA Science Inventory

    We used an ecohydrological model, Visualizing Ecosystems for Land Management Assessments (VELMA), to analyze the effects of forest harvest location and amount on ecosystem carbon (C) and nitrogen (N) dynamics in an intensively studied headwater catchment (WS10) in western Oregon,...

  7. Automated blood-sample handling in the clinical laboratory.

    PubMed

    Godolphin, W; Bodtker, K; Uyeno, D; Goh, L O

    1990-09-01

    The only significant advances in blood-taking in 25 years have been the disposable needle and evacuated blood-drawing tube. With the exception of a few isolated barcode experiments, most sample-tracking is performed through handwritten or computer-printed labels. Attempts to reduce the hazards of centrifugation have resulted in air-tight lids or chambers, the use of which is time-consuming and cumbersome. Most commonly used clinical analyzers require serum or plasma, distributed into specialized containers, unique to that analyzer. Aliquots for different tests are prepared by handpouring or pipetting. Moderate to large clinical laboratories perform so many different tests that even multi-analyzers performing multiple analyses on a single sample may account for only a portion of all tests ordered for a patient. Thus several aliquots of each specimen are usually required. We have developed a proprietary serial centrifuge and blood-collection tube suitable for incorporation into an automated or robotic sample-handling system. The system we propose is (a) safe--avoids or prevents biological danger to the many "handlers" of blood; (b) small--minimizes the amount of sample taken and space required to adapt to the needs of satellite and mobile testing, and direct interfacing with analyzers; (c) serial--permits each sample to be treated according to its own "merits," optimizes throughput, and facilitates flexible automation; and (d) smart--ensures quality results through monitoring and intelligent control of patient identification, sample characteristics, and separation process.

  8. A MBD-seq protocol for large-scale methylome-wide studies with (very) low amounts of DNA.

    PubMed

    Aberg, Karolina A; Chan, Robin F; Shabalin, Andrey A; Zhao, Min; Turecki, Gustavo; Staunstrup, Nicklas Heine; Starnawska, Anna; Mors, Ole; Xie, Lin Y; van den Oord, Edwin Jcg

    2017-09-01

    We recently showed that, after optimization, our methyl-CpG binding domain sequencing (MBD-seq) application approximates the methylome-wide coverage obtained with whole-genome bisulfite sequencing (WGB-seq), but at a cost that enables adequately powered large-scale association studies. A prior drawback of MBD-seq is the relatively large amount of genomic DNA (ideally >1 µg) required to obtain high-quality data. Biomaterials are typically expensive to collect, provide a finite amount of DNA, and may simply not yield sufficient starting material. The ability to use low amounts of DNA will increase the breadth and number of studies that can be conducted. Therefore, we further optimized the enrichment step. With this low starting material protocol, MBD-seq performed equally well, or better, than the protocol requiring ample starting material (>1 µg). Using only 15 ng of DNA as input, there is minimal loss in data quality, achieving 93% of the coverage of WGB-seq (with standard amounts of input DNA) at similar false/positive rates. Furthermore, across a large number of genomic features, the MBD-seq methylation profiles closely tracked those observed for WGB-seq with even slightly larger effect sizes. This suggests that MBD-seq provides similar information about the methylome and classifies methylation status somewhat more accurately. Performance decreases with <15 ng DNA as starting material but, even with as little as 5 ng, MBD-seq still achieves 90% of the coverage of WGB-seq with comparable genome-wide methylation profiles. Thus, the proposed protocol is an attractive option for adequately powered and cost-effective methylome-wide investigations using (very) low amounts of DNA.

  9. DNA amount of X and B chromosomes in the grasshoppers Eyprepocnemis plorans and Locusta migratoria.

    PubMed

    Ruiz-Ruano, F J; Ruiz-Estévez, M; Rodríguez-Pérez, J; López-Pino, J L; Cabrero, J; Camacho, J P M

    2011-01-01

    We analyzed the DNA amount in X and B chromosomes of 2 XX/X0 grasshopper species (Eyprepocnemis plorans and Locusta migratoria), by means of Feulgen image analysis densitometry (FIAD), using previous estimates in L. migratoria as standard (5.89 pg). We first analyzed spermatids of 0B males and found a bimodal distribution of integrated optical densities (IODs), suggesting that one peak corresponded to +X and the other to -X spermatids. The difference between the 2 peaks corresponded to the X chromosome DNA amount, which was 1.28 pg in E. plorans and 0.80 pg in L. migratoria. In addition, the +X peak in E. plorans gave an estimate of the C-value in this species (10.39 pg). We next analyzed diplotene cells from 1B males in E. plorans and +B males in L. migratoria (a species where Bs are mitotically unstable and no integer B number can be defined for an individual) and measured B chromosome IOD relative to X chromosome IOD, within the same cell, taking advantage of the similar degree of condensation for both positively heteropycnotic chromosomes at this meiotic stage. From this proportion, we estimated the DNA amount for 3 different B chromosome variants found in individuals from 3 E. plorans Spanish populations (0.54 pg for B1 from Saladares, 0.51 pg for B2 from Salobreña and 0.64 for B24 from Torrox). Likewise, we estimated the DNA amount of the B chromosome in L. migratoria to be 0.15 pg. To automate measurements, we wrote a GPL3 licensed Python program (pyFIA). We discuss the utility of the present approach for estimating X and B chromosome DNA amount in a variety of situations, and the meaning of the DNA amount estimates for X and B chromosomes in these 2 species. Copyright © 2011 S. Karger AG, Basel.
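
    As a purely illustrative companion to the bimodal-IOD analysis described above (not the pyFIA code), the sketch below fits a two-component Gaussian mixture to spermatid integrated optical densities, takes the difference between the two means as the X-chromosome signal, and converts it to picograms with a calibration factor from a standard of known DNA content; all numerical values are placeholders.

    # Sketch: X-chromosome DNA amount from a bimodal spermatid IOD distribution.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(1)
    # Placeholder IODs: -X spermatids near 100 a.u., +X spermatids near 122 a.u.
    iod = np.concatenate([rng.normal(100, 4, 300), rng.normal(122, 4, 300)])

    gm = GaussianMixture(n_components=2, random_state=0).fit(iod.reshape(-1, 1))
    mean_lo, mean_hi = np.sort(gm.means_.ravel())

    pg_per_iod = 0.036   # placeholder calibration (pg per IOD unit) from a known standard
    x_dna_pg = (mean_hi - mean_lo) * pg_per_iod
    print(f"X chromosome DNA amount ~ {x_dna_pg:.2f} pg (illustrative)")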

  10. Shallow and Deep Latent Heating Modes Over Tropical Oceans Observed with TRMM PR Spectral Latent Heating Data

    NASA Technical Reports Server (NTRS)

    Takayabu, Yukari N.; Shige, Shoichi; Tao, Wei-Kuo; Hirota, Nagio

    2010-01-01

    The global hydrological cycle is central to the Earth's climate system, with rainfall and the physics of its formation acting as the key links in the cycle. Two-thirds of global rainfall occurs in the Tropics. Associated with this rainfall is a vast amount of heat, known as latent heat. It arises mainly from the phase change of water vapor condensing into liquid droplets; three-fourths of the total heat energy available to the Earth's atmosphere comes from tropical rainfall. In addition, fresh water provided by tropical rainfall and its variability exerts a large impact upon the structure and motions of the upper ocean layer. Three-dimensional distributions of latent heating estimated from Tropical Rainfall Measuring Mission Precipitation Radar (TRMM PR) data using the Spectral Latent Heating (SLH) algorithm are analyzed. Mass-weighted and vertically integrated latent heating averaged over the tropical oceans is estimated as approximately 72.6 J/s (about 2.51 mm/day), and that over tropical land as approximately 73.7 J/s (about 2.55 mm/day), for 30°N-30°S. It is shown that non-drizzle precipitation over tropical and subtropical oceans consists of two dominant modes of rainfall systems: deep systems and congestus. A rough estimate of the shallow-mode contribution to the total heating is about 46.7% for the tropical oceans on average, substantially larger than the 23.7% over tropical land. While cumulus congestus heating correlates linearly with SST, the deep mode is dynamically bounded by large-scale subsidence. It is notable that a substantial amount of rain, as much as 2.38 mm/day on average, comes from congestus clouds under the large-scale subsiding circulation. It is also notable that even in regions with SST warmer than 28 °C, large-scale subsidence effectively suppresses deep convection, leaving the heating to congestus clouds. Our results support the view that the entrainment of mid-to-lower-tropospheric dry air, which accompanies the large-scale subsidence, is the major factor suppressing deep convection. Therefore, realistic representation of entrainment is very important for proper reproduction of the precipitation distribution and the resultant large-scale circulation.
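
    A small illustrative sketch of the mass-weighted vertical averaging mentioned above: heating on pressure layers is weighted by layer mass (proportional to the pressure thickness of each layer) and summed. The profile values are placeholders; only the weighting formula is standard.

    # Mass-weighted vertical average of a latent-heating profile on pressure layers:
    #   <Q> = sum(Q_k * dp_k) / sum(dp_k)      (layer mass ~ dp/g)
    # Profile values are placeholders.
    import numpy as np

    p_bounds_hpa = np.array([1000.0, 850.0, 700.0, 500.0, 300.0, 200.0])  # layer boundaries
    q_heating = np.array([1.5, 2.5, 2.0, 1.0, 0.3])                       # per-layer heating (placeholder units)

    dp = -np.diff(p_bounds_hpa)     # positive layer thicknesses (hPa)
    mass_weighted_mean = np.sum(q_heating * dp) / np.sum(dp)
    print(round(float(mass_weighted_mean), 3))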

  11. Supply of large woody debris in a stream channel

    USGS Publications Warehouse

    Diehl, Timothy H.; Bryan, Bradley A.

    1993-01-01

    The amount of large woody debris that potentially could be transported to bridge sites was assessed in the basin of the West Harpeth River in Tennessee in the fall of 1992. The assessment was based on inspections of study sites at 12 bridges and examination of channel reaches between bridges. It involved estimating the amount of woody material at least 1.5 meters long, stored in the channel, and not rooted in soil. Study of multiple sites allowed estimation of the amount, characteristics, and sources of debris stored in the channel, and identification of geomorphic features of the channel associated with debris production. Woody debris is plentiful in the channel network, and much of the debris could be transported by a large flood. Tree trunks with attached root masses are the dominant large debris type. Death of these trees is primarily the result of bank erosion. Bank instability seems to be the basin characteristic most useful in identifying basins with a high potential for abundant production of debris.

  12. Diet - liver disease

    MedlinePlus

    ... of toxic waste products. Increasing your intake of carbohydrates to be in proportion with the amount of ... severe liver disease include: Eat large amounts of carbohydrate foods. Carbohydrates should be the major source of ...

  13. Storage and Retrieval of Large RDF Graph Using Hadoop and MapReduce

    NASA Astrophysics Data System (ADS)

    Farhan Husain, Mohammad; Doshi, Pankil; Khan, Latifur; Thuraisingham, Bhavani

    Handling huge amounts of data scalably has been a matter of concern for a long time. The same is true for semantic web data. Current semantic web frameworks lack this ability. In this paper, we describe a framework that we built using Hadoop to store and retrieve large numbers of RDF triples. We describe our schema for storing RDF data in the Hadoop Distributed File System. We also present our algorithms to answer a SPARQL query. We make use of Hadoop's MapReduce framework to actually answer the queries. Our results reveal that we can store huge amounts of semantic web data in Hadoop clusters built mostly from cheap commodity-class hardware and still answer queries fast enough. We conclude that ours is a scalable framework, able to handle large amounts of RDF data efficiently.
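
    A conceptual, language-shifted sketch of the MapReduce-style join implied above: triples matching two SPARQL triple patterns are mapped to a shared join key (the common variable binding) and reduced by combining bindings that share the key. This is a toy in-memory illustration in Python, not the paper's Hadoop implementation; the data and patterns are placeholders.

    # Toy MapReduce-style join of two triple patterns over in-memory RDF triples
    # (illustrative only; the real system runs the same idea on Hadoop/HDFS).
    from collections import defaultdict

    triples = [  # placeholder (subject, predicate, object) data
        ("alice", "worksFor", "acme"),
        ("bob", "worksFor", "acme"),
        ("acme", "locatedIn", "berlin"),
    ]

    # SPARQL-like patterns sharing the variable ?org:
    #   ?person worksFor ?org .    ?org locatedIn ?city .
    def map_phase():
        for s, p, o in triples:
            if p == "worksFor":
                yield o, ("person", s)    # join key = ?org
            elif p == "locatedIn":
                yield s, ("city", o)      # join key = ?org

    def reduce_phase(mapped):
        groups = defaultdict(list)
        for key, value in mapped:
            groups[key].append(value)
        for org, values in groups.items():
            people = [v for tag, v in values if tag == "person"]
            cities = [v for tag, v in values if tag == "city"]
            for person in people:
                for city in cities:
                    yield {"person": person, "org": org, "city": city}

    print(list(reduce_phase(map_phase())))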

  14. Method and system for formation and withdrawal of a sample from a surface to be analyzed

    DOEpatents

    Van Berkel, Gary J.; Kertesz, Vilmos

    2017-10-03

    A method and system for formation and withdrawal of a sample from a surface to be analyzed utilizes a collection instrument having a port through which a liquid solution is conducted onto the surface to be analyzed. The port is positioned adjacent the surface to be analyzed, and the liquid solution is conducted onto the surface through the port so that the liquid solution conducted onto the surface interacts with material comprising the surface. An amount of material is thereafter withdrawn from the surface. Pressure control can be utilized to manipulate the solution balance at the surface to thereby control the withdrawal of the amount of material from the surface. Furthermore, such pressure control can be coordinated with the movement of the surface relative to the port of the collection instrument within the X-Y plane.

  15. A thermal and chemical degradation approach to decipher pristane and phytane precursors in sedimentary organic matter

    USGS Publications Warehouse

    Koopmans, M.P.; Rijpstra, W.I.C.; Klapwijk, M.M.; De Leeuw, J. W.; Lewan, M.D.; Sinninghe, Damste J.S.

    1999-01-01

    A thermal and chemical degradation approach was followed to determine the precursors of pristane (Pr) and phytane (Ph) in samples from the Gessoso-solfifera, Ghareb and Green River Formations. Hydrous pyrolysis of these samples yields large amounts of Pr and Ph carbon skeletons, indicating that their precursors are predominantly sequestered in high-molecular-weight fractions. However, chemical degradation of the polar fraction and the kerogen of the unheated samples generally does not release large amounts of Pr and Ph. Additional information on the precursors of Pr and Ph is obtained from flash pyrolysis analyses of kerogens and residues after hydrous pyrolysis and after chemical degradation. Multiple precursors for Pr and Ph are recognised in these three samples. The main increase of the Pr/Ph ratio with increasing maturation temperature, which is associated with strongly increasing amounts of Pr and Ph, is probably due to the higher amount of precursors of Pr compared to Ph, and not to the different timing of generation of Pr and Ph.

  16. Spontaneous, generalized lipidosis in captive greater horseshoe bats (Rhinolophus ferrumequinum).

    PubMed

    Gozalo, Alfonso S; Schwiebert, Rebecca S; Metzner, Walter; Lawson, Gregory W

    2005-11-01

    During a routine 6-month quarantine period, 3 of 34 greater horseshoe bats (Rhinolophus ferrumequinum) captured in mainland China and transported to the United States for use in echolocation studies were found dead with no prior history of illness. All animals were in good body condition at the time of death. At necropsy, a large amount of white fat was found within the subcutis, especially in the sacrolumbar region. The liver, kidneys, and heart were diffusely tan in color. Microscopic examination revealed that hepatocytes throughout the liver were filled with lipid, and in some areas, lipid granulomas were present. Renal lesions included moderate amounts of lipid in the cortical tubular epithelium and large amounts of protein and lipid within Bowman's capsules in the glomeruli. In addition, one bat had large lipid vacuoles diffusely distributed throughout the myocardium. The exact pathologic mechanism inducing the hepatic, renal, and cardiac lipidosis is unknown. The horseshoe bats were captured during hibernation and immediately transported to the United States. It is possible that the large amount of fat stored, coupled with changes in photoperiod, lack of exercise, and/or the stress of captivity, might have contributed to altering the normal metabolic processes, leading to anorexia and consequently lipidosis in these animals.

  17. Method for large-scale fabrication of atomic-scale structures on material surfaces using surface vacancies

    DOEpatents

    Lim, Chong Wee; Ohmori, Kenji; Petrov, Ivan Georgiev; Greene, Joseph E.

    2004-07-13

    A method for forming atomic-scale structures on a surface of a substrate on a large scale includes creating a predetermined amount of surface vacancies on the surface of the substrate by removing an amount of atoms on the surface of the material corresponding to the predetermined amount of the surface vacancies. Once the surface vacancies have been created, atoms of a desired structure material are deposited on the surface of the substrate to enable the surface vacancies and the atoms of the structure material to interact. The interaction causes the atoms of the structure material to form the atomic-scale structures.

  18. Effects of consumption of choline and lecithin on neurological and cardiovascular systems.

    PubMed

    Wood, J L; Allison, R G

    1982-12-01

    This report concerns possible adverse health effects and benefits that might result from consumption of large amounts of choline, lecithin, or phosphatidylcholine. Indications from preliminary investigations that administration of choline or lecithin might alleviate some neurological disturbances, prevent hypercholesteremia and atherosclerosis, and restore memory and cognition have resulted in much research and public interest. Symptoms of tardive dyskinesia and Alzheimer's disease have been ameliorated in some patients and varied responses have been observed in the treatment of Gilles de la Tourette's disease, Friedreich's ataxia, levodopa-induced dyskinesia, mania, Huntington's disease, and myasthenic syndrome. Further clinical trials, especially in conjunction with cholinergic drugs, are considered worthwhile but will require sufficient amounts of pure phosphatidylcholine. The public has access to large amounts of commercial lecithin. Because high intakes of lecithin or choline produce acute gastrointestinal distress, sweating, salivation, and anorexia, it is improbable that individuals will incur lasting health hazards from self-administration of either compound. Development of depression or supersensitivity of dopamine receptors and disturbance of the cholinergic-dopaminergic-serotonergic balance is a concern with prolonged, repeated intakes of large amounts of lecithin.

  19. Understanding the contribution of habitats and regional variation to long-term population trends in tricolored blackbirds

    PubMed Central

    Graves, Emily E; Holyoak, Marcel; Rodd Kelsey, T; Meese, Robert J

    2013-01-01

    Population trends represent a minimum amount of information required to assess the conservation status of a species. However, understanding and detecting trends can be complicated by variation among habitats and regions, and by dispersal connecting habitats through source-sink dynamics. We analyzed trends in breeding populations between habitats and regions to better understand the overall dynamics of a species' decline. Specifically, we analyzed historical trends in breeding populations of tricolored blackbirds (Agelaius tricolor) using breeding records from 1907 to 2009. The species breeds itinerantly and ephemerally uses multiple habitat types and breeding areas, which make interpretation of trends complex. We found overall abundance declines of 63% between 1935 and 1975. Since 1980 overall declines became nonsignificant and obscure despite large amounts of data from 1980 to 2009. Temporal trends differed between breeding habitat types and were associated with regional differences in population declines. A new habitat, triticale crops (a wheat-rye hybrid grain) produced colonies 40× larger, on average, than other breeding habitats, and contributed to a change in regional distribution since it primarily occurred in a single region. The mechanism for such an effect is not clear, but could represent the local availability of foodstuffs in the landscape rather than something specific to triticale crops. While variation in trends among habitats clearly occurred, they could not easily be ascribed to source-sink dynamics, ecological traps, habitat selection or other detailed ecological mechanisms. Nonetheless, such exchanges provide valuable information to guide management of dynamic systems. PMID:24101977

  20. Use Cases for Combining Web Services with ArcPython Tools for Enabling Quality Control of Land Remote Sensing Data Products.

    NASA Astrophysics Data System (ADS)

    Krehbiel, C.; Maiersperger, T.; Friesz, A.; Harriman, L.; Quenzer, R.; Impecoven, K.

    2016-12-01

    Three major obstacles facing big Earth data users include data storage, management, and analysis. As the amount of satellite remote sensing data increases, so does the need for better data storage and management strategies to exploit the plethora of data now available. Standard GIS tools can help big Earth data users who interact with and analyze increasingly large and diverse datasets. In this presentation we highlight how NASA's Land Processes Distributed Active Archive Center (LP DAAC) is tackling these big Earth data challenges. We provide a real life use case example to describe three tools and services provided by the LP DAAC to more efficiently exploit big Earth data in a GIS environment. First, we describe the Open-source Project for a Network Data Access Protocol (OPeNDAP), which allows requests for specific subsets of data, minimizing the amount of data that a user downloads and improving the efficiency of data downloading and processing. Next, we cover the LP DAAC's Application for Extracting and Exploring Analysis Ready Samples (AppEEARS), a web application interface for extracting and analyzing land remote sensing data. From there, we review an ArcPython toolbox that was developed to provide quality control services to land remote sensing data products. Locating and extracting specific subsets of larger big Earth datasets improves data storage and management efficiency for the end user, and quality control services provide a straightforward interpretation of big Earth data. These tools and services are beneficial to the GIS user community in terms of standardizing workflows and improving data storage, management, and analysis tactics.
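
    A minimal Python sketch of the OPeNDAP-style subsetting idea mentioned above, using xarray; the endpoint URL, variable name, and coordinate ranges are placeholders, not a real LP DAAC product. The point is that only the requested subset crosses the network rather than the full granule.

        import xarray as xr

        # Placeholder OPeNDAP endpoint and variable name; substitute a real dataset URL.
        URL = "https://example.gov/opendap/hypothetical_vegetation_index.nc"

        ds = xr.open_dataset(URL)                      # lazy open: only metadata is fetched here
        ndvi = ds["NDVI"].sel(lat=slice(44.0, 43.0),   # spatial subset (assumes descending latitude)
                              lon=slice(-104.0, -103.0))
        subset = ndvi.isel(time=slice(0, 12)).load()   # only this subset is actually downloaded
        print(float(subset.mean()))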

  1. A Rapid and Accurate Extraction Procedure for Analysing Free Amino Acids in Meat Samples by GC-MS

    PubMed Central

    Barroso, Miguel A.; Ruiz, Jorge; Antequera, Teresa

    2015-01-01

    This study evaluated the use of a mixer mill as the homogenization tool for the extraction of free amino acids in meat samples, with the main goal of analyzing a large number of samples in the shortest time and minimizing sample amount and solvent volume. Ground samples (0.2 g) were mixed with 1.5 mL HCl 0.1 M and homogenized in the mixer mill. The final biphasic system was separated by centrifugation. The supernatant was deproteinized, derivatized and analyzed by gas chromatography. This procedure showed a high extracting ability, especially in samples with high free amino acid content (recovery = 88.73–104.94%). It also showed low limits of detection and quantification (3.8 · 10−4–6.6 · 10−4 μg μL−1 and 1.3 · 10−3–2.2 · 10−2 μg μL−1, resp.) for most amino acids, an adequate precision (2.15–20.15% for run-to-run), and a linear response for all amino acids (R² = 0.741–0.998) in the range of 1–100 µg mL−1. Moreover, it takes less time and requires smaller amounts of sample and solvent than conventional techniques. Thus, this is a cost and time efficient tool for homogenizing in the extraction procedure of free amino acids from meat samples, being an adequate option for routine analysis. PMID:25873963

  2. Unprecedented Arctic Ozone Loss in 2011

    NASA Image and Video Library

    2011-10-02

    In mid-March 2011, NASA's Aura spacecraft observed ozone in Earth's stratosphere -- low ozone amounts are shown in purple and grey colors, and large amounts of chlorine monoxide are shown in dark blue colors.

  3. Results of initial analyses of the salt (macro) batch 10 tank 21H qualification samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peters, T. B.

    2017-01-01

    Savannah River National Laboratory (SRNL) analyzed samples from Tank 21H in support of qualification of Interim Salt Disposition Project (ISDP) Salt (Macro) Batch 10 for processing through the Actinide Removal Process (ARP) and the Modular Caustic-Side Solvent Extraction Unit (MCU). This document reports the initial results of the analyses of samples of Tank 21H. Analysis of the Tank 21H Salt (Macro) Batch 10 composite sample indicates that the material does not display any unusual characteristics or observations, such as floating solids, the presence of large amounts of solids, or unusual colors. Further sample results will be reported in a future document. This memo satisfies part of Deliverable 3 of the Technical Task Request (TTR).

  4. Automating Content Analysis of Open-Ended Responses: Wordscores and Affective Intonation

    PubMed Central

    Baek, Young Min; Cappella, Joseph N.; Bindman, Alyssa

    2014-01-01

    This study presents automated methods for predicting valence and quantifying valenced thoughts of a text. First, it examines whether Wordscores, developed by Laver, Benoit, and Garry (2003), can be adapted to reliably predict the valence of open-ended responses in a survey about bioethical issues in genetics research, and then tests a complementary and novel technique for coding the number of valenced thoughts in open-ended responses, termed Affective Intonation. Results show that Wordscores successfully predicts the valence of brief and grammatically imperfect open-ended responses, and Affective Intonation achieves performance comparable to human coders when estimating the number of valenced thoughts. Both Wordscores and Affective Intonation have promise as reliable, effective, and efficient methods when researchers content-analyze large amounts of textual data systematically. PMID:25558294
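
    A minimal Python sketch of the Wordscores idea described above (a simplified reading of Laver, Benoit, and Garry 2003, not the authors' implementation): words are scored from reference texts with known valence, and a new open-ended response is scored as the frequency-weighted average of its known word scores. The reference texts and scores here are invented.

        from collections import Counter

        # Reference texts with assigned valence scores (hypothetical examples).
        references = {
            "great benefits help people and improve health": +1.0,
            "serious risks harm people and threaten privacy": -1.0,
        }

        def word_freqs(text):
            counts = Counter(text.split())
            total = sum(counts.values())
            return {w: c / total for w, c in counts.items()}

        # Step 1: score each word from the reference texts.
        ref_freqs = {score: word_freqs(text) for text, score in references.items()}
        vocab = {w for freqs in ref_freqs.values() for w in freqs}
        wordscores = {}
        for w in vocab:
            by_ref = {score: freqs.get(w, 0.0) for score, freqs in ref_freqs.items()}
            total = sum(by_ref.values())
            if total > 0:
                # expected reference score given that the word was observed
                wordscores[w] = sum(score * f / total for score, f in by_ref.items())

        # Step 2: score a new ("virgin") open-ended response.
        def score_text(text):
            freqs = word_freqs(text)
            known = {w: f for w, f in freqs.items() if w in wordscores}
            norm = sum(known.values())
            return sum(wordscores[w] * f for w, f in known.items()) / norm if norm else 0.0

        print(score_text("genetic tests improve health and help people"))  # positive valence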

  5. InGaN pn-junctions grown by PA-MBE: Material characterization and fabrication of nanocolumn electroluminescent devices

    NASA Astrophysics Data System (ADS)

    Gherasoiu, I.; Yu, K. M.; Reichertz, L.; Walukiewicz, W.

    2015-09-01

    PN junctions are basic building blocks of many electronic devices, and their performance depends on the structural properties of the component layers and on the type and amount of doping impurities incorporated. Magnesium is the common p-type dopant for nitride semiconductors, while silicon and, more recently, germanium are the n-dopants of choice. In this paper we therefore analyze the quantitative limits for Mg and Ge incorporation in GaN and in InGaN with high In content. We also discuss the challenges posed by the growth and characterization of InGaN pn-junctions, as well as the properties of large-area, long-wavelength nanocolumn LEDs grown on silicon (1 1 1) by PA-MBE.

  6. Using Big Data to Discover Diagnostics and Therapeutics for Gastrointestinal and Liver Diseases

    PubMed Central

    Wooden, Benjamin; Goossens, Nicolas; Hoshida, Yujin; Friedman, Scott L.

    2016-01-01

    Technologies such as genome sequencing, gene expression profiling, proteomic and metabolomic analyses, electronic medical records, and patient-reported health information have produced large amounts of data, from various populations, cell types, and disorders (big data). However, these data must be integrated and analyzed if they are to produce models or concepts about physiologic function or mechanisms of pathogenesis. Many of these data are available to the public, allowing researchers anywhere to search for markers of specific biologic processes or therapeutic targets for specific diseases or patient types. We review recent advances in the fields of computational and systems biology, and highlight opportunities for researchers to use big data sets in the fields of gastroenterology and hepatology, to complement traditional means of diagnostic and therapeutic discovery. PMID:27773806

  7. scraps: An open-source Python-based analysis package for analyzing and plotting superconducting resonator data

    DOE PAGES

    Carter, Faustin Wirkus; Khaire, Trupti S.; Novosad, Valentyn; ...

    2016-11-07

    We present "scraps" (SuperConducting Analysis and Plotting Software), a Python package designed to aid in the analysis and visualization of large amounts of superconducting resonator data, specifically complex transmission as a function of frequency, acquired at many different temperatures and driving powers. The package includes a least-squares fitting engine as well as a Monte-Carlo Markov Chain sampler for sampling the posterior distribution given priors, marginalizing over nuisance parameters, and estimating covariances. A set of plotting tools for generating publication-quality figures is also provided in the package. Lastly, we discuss the functionality of the software and provide some examples of its utility on data collected from a niobium-nitride coplanar waveguide resonator fabricated at Argonne National Laboratory.
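
    For orientation only: the sketch below is not the scraps API, just a generic scipy example of the kind of fit the package automates, a least-squares fit of the transmission magnitude of an idealized notch-type resonator to synthetic data. The model and parameter values are illustrative.

        import numpy as np
        from scipy.optimize import curve_fit

        def s21_mag(f, f0, q_total, q_coupling):
            """|S21| of an idealized notch-type resonator (illustrative model only)."""
            x = 2.0 * q_total * (f - f0) / f0
            return np.abs(1.0 - (q_total / q_coupling) / (1.0 + 1j * x))

        # Synthetic data standing in for one temperature/power sweep.
        rng = np.random.default_rng(0)
        freq = np.linspace(4.9995e9, 5.0005e9, 401)
        data = s21_mag(freq, 5.0e9, 2.0e4, 3.0e4) + 0.002 * rng.standard_normal(freq.size)

        popt, pcov = curve_fit(s21_mag, freq, data, p0=(5.0e9, 1.0e4, 2.0e4))
        print("f0 = %.6g Hz, Q = %.3g, Qc = %.3g" % tuple(popt))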

  8. LANDSAT 4 investigations of Thematic Mapper and multispectral scanner applications. [Death Valley, California; Silver Bell Copper Mine, Arizona, and Dulles Airport near Washington, D.C.

    NASA Technical Reports Server (NTRS)

    Lauer, D. T. (Principal Investigator)

    1984-01-01

    The optimum index factor package was used to choose TM bands for color compositing. Processing techniques were also used on TM data over several sites to: (1) reduce the amount of data that needs to be processed and analyzed by using statistical methods or by combining full-resolution products with spatially compressed products; (2) digitally process small subareas to improve the visual appearance of large-scale products or to merge different-resolution image data; and (3) evaluate and compare the information content of the different three-band combinations that can be made using the TM data. Results indicate that for some applications the added spectral information over MSS is even more important than the TM's increased spatial resolution.

  9. Results of initial analyses of the salt (macro) batch 11 Tank 21H qualification samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peters, T. B.

    Savannah River National Laboratory (SRNL) analyzed samples from Tank 21H in support of qualification of Interim Salt Disposition Project (ISDP) Salt (Macro) Batch 11 for processing through the Actinide Removal Process (ARP) and the Modular Caustic-Side Solvent Extraction Unit (MCU). This document reports the initial results of the analyses of samples of Tank 21H. Analysis of the Tank 21H Salt (Macro) Batch 11 composite sample indicates that the material does not display any unusual characteristics or observations, such as floating solids, the presence of large amounts of solids, or unusual colors. Further sample results will be reported in a future document. This memo satisfies part of Deliverable 3 of the Technical Task Request (TTR).

  10. Spectrum image analysis tool - A flexible MATLAB solution to analyze EEL and CL spectrum images.

    PubMed

    Schmidt, Franz-Philipp; Hofer, Ferdinand; Krenn, Joachim R

    2017-02-01

    Spectrum imaging techniques, gaining simultaneously structural (image) and spectroscopic data, require appropriate and careful processing to extract information from the dataset. In this article we introduce a MATLAB based software that uses three dimensional data (EEL/CL spectrum image in dm3 format (Gatan Inc.'s DigitalMicrograph ® )) as input. A graphical user interface enables a fast and easy mapping of spectral dependent images and position dependent spectra. First, data processing such as background subtraction, deconvolution and denoising; second, multiple display options, including an EEL/CL movie maker; and third, applicability to a large number of data sets with a small workload make this program an interesting tool to visualize otherwise hidden details. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Using Geostationary Communications Satellites as a Sensor: Telemetry Search Algorithms

    NASA Astrophysics Data System (ADS)

    Cahoy, K.; Carlton, A.; Lohmeyer, W. Q.

    2014-12-01

    For decades, operators and manufacturers have collected large amounts of telemetry from geostationary (GEO) communications satellites to monitor system health and performance, yet this data is rarely mined for scientific purposes. The goal of this work is to mine data archives acquired from commercial operators using new algorithms that can detect when a space weather (or non-space weather) event of interest has occurred or is in progress. We have developed algorithms to statistically analyze power amplifier current and temperature telemetry and identify deviations from nominal operations or other trends of interest. We then examine space weather data to see what role, if any, it might have played. We also closely examine both long and short periods of time before an anomaly to determine whether or not the anomaly could have been predicted.
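
    A minimal Python sketch of one simple deviation-from-nominal detector of the kind described above (not the authors' algorithm): a rolling baseline and rolling standard deviation are computed from amplifier-current telemetry, and samples far outside the baseline are flagged. The telemetry here is synthetic, with an injected step.

        import numpy as np
        import pandas as pd

        # Synthetic hourly amplifier-current telemetry with an injected transient deviation.
        rng = np.random.default_rng(1)
        idx = pd.date_range("2014-01-01", periods=24 * 90, freq="h")
        current = 1.50 + 0.01 * rng.standard_normal(idx.size)
        current[1500:1530] += 0.08
        telemetry = pd.Series(current, index=idx)

        window = 24 * 7                                   # one-week baseline
        baseline = telemetry.rolling(window, min_periods=window).mean()
        spread = telemetry.rolling(window, min_periods=window).std()
        zscore = (telemetry - baseline) / spread

        events = telemetry[zscore.abs() > 5]              # strong departures from nominal
        print(events.index.min(), events.index.max())     # start/end of the flagged interval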

  12. Using Deep Learning to Analyze the Voices of Stars.

    NASA Astrophysics Data System (ADS)

    Boudreaux, Thomas Macaulay

    2018-01-01

    With several new large-scale surveys on the horizon, including LSST, TESS, ZTF, and Evryscope, faster and more accurate analysis methods will be required to adequately process the enormous amount of data produced. Deep learning, used in industry for years now, allows for advanced feature detection in minimally prepared datasets at very high speeds; however, despite the advantages of this method, its application to astrophysics has not yet been extensively explored. This dearth may be due to a lack of training data available to researchers. Here we generate synthetic data loosely mimicking the properties of acoustic mode pulsating stars and compare the performance of different deep learning algorithms, including Artificial Neural Networks and Convolutional Neural Networks, in classifying these synthetic data sets as either pulsators or stars not observed to vary.
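
    A minimal Keras sketch in the spirit of the comparison described above (not the authors' networks or data): a small 1D convolutional classifier trained on synthetic light curves, where pulsators are noisy sinusoids and non-variable stars are pure noise.

        import numpy as np
        import tensorflow as tf

        # Synthetic light curves: label 1 = pulsator (noisy sinusoid), label 0 = not observed to vary.
        rng = np.random.default_rng(2)
        n, length = 1000, 256
        t = np.linspace(0.0, 1.0, length)
        labels = rng.integers(0, 2, n)
        freqs = rng.uniform(5.0, 50.0, n)
        X = np.where(labels[:, None] == 1, np.sin(2 * np.pi * freqs[:, None] * t), 0.0)
        X = (X + 0.5 * rng.standard_normal((n, length)))[..., None]   # (samples, time, channels)

        model = tf.keras.Sequential([
            tf.keras.Input(shape=(length, 1)),
            tf.keras.layers.Conv1D(16, 7, activation="relu"),
            tf.keras.layers.MaxPooling1D(4),
            tf.keras.layers.Conv1D(32, 5, activation="relu"),
            tf.keras.layers.GlobalAveragePooling1D(),
            tf.keras.layers.Dense(1, activation="sigmoid"),
        ])
        model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
        model.fit(X, labels, epochs=5, validation_split=0.2, verbose=2)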

  13. Core-Shell Columns in High-Performance Liquid Chromatography: Food Analysis Applications

    PubMed Central

    Preti, Raffaella

    2016-01-01

    The increased separation efficiency provided by the new technology of columns packed with core-shell particles in high-performance liquid chromatography (HPLC) has resulted in their widespread diffusion in several analytical fields, from pharmaceutical and biological to environmental and toxicological. The present paper presents their most recent applications in food analysis. Their use has proved to be particularly advantageous for the determination of compounds at trace levels or when a large number of samples must be analyzed quickly using reliable and solvent-saving apparatus. The literature described here shows how the outstanding performance provided by core-shell particle columns on traditional HPLC instruments is comparable to that obtained with costly UHPLC instrumentation, making this novel column a promising key tool in food analysis. PMID:27143972

  14. The power and limits of a rule-based morpho-semantic parser.

    PubMed Central

    Baud, R. H.; Rassinoux, A. M.; Ruch, P.; Lovis, C.; Scherrer, J. R.

    1999-01-01

    The advent of the Electronic Patient Record (EPR) implies an increasing amount of medical text readily available for processing, as soon as convenient tools are made available. The chief application is text analysis, from which one can drive other disciplines like indexing for retrieval, knowledge representation, translation and inferencing for medical intelligent systems. Prerequisites for a convenient analyzer of medical texts are: building the lexicon, developing a semantic representation of the domain, having a large corpus of texts available for statistical analysis, and finally mastering robust and powerful parsing techniques in order to satisfy the constraints of the medical domain. This article aims at presenting an easy-to-use parser ready to be adapted in different settings. It describes its power together with its practical limitations as experienced by the authors. PMID:10566313

  15. The power and limits of a rule-based morpho-semantic parser.

    PubMed

    Baud, R H; Rassinoux, A M; Ruch, P; Lovis, C; Scherrer, J R

    1999-01-01

    The advent of the Electronic Patient Record (EPR) implies an increasing amount of medical text readily available for processing, as soon as convenient tools are made available. The chief application is text analysis, from which one can drive other disciplines like indexing for retrieval, knowledge representation, translation and inferencing for medical intelligent systems. Prerequisites for a convenient analyzer of medical texts are: building the lexicon, developing a semantic representation of the domain, having a large corpus of texts available for statistical analysis, and finally mastering robust and powerful parsing techniques in order to satisfy the constraints of the medical domain. This article aims at presenting an easy-to-use parser ready to be adapted in different settings. It describes its power together with its practical limitations as experienced by the authors.

  16. 174Yb 3P1 level relaxation found via weak magnetic field dependence of collision-induced stimulated photon echo

    NASA Astrophysics Data System (ADS)

    Rubtsova, N. N.; Gol’dort, V. G.; Khvorostov, E. B.; Kochubei, S. A.; Reshetov, V. A.

    2018-06-01

    Collision-induced stimulated photon echo generated at the intercombination transition of 174Yb was analyzed theoretically and investigated experimentally in a gaseous mixture of ytterbium vapour diluted with a large amount of the buffer gas xenon, in the presence of a weak longitudinal magnetic field. The intercombination transition (6s2) 1S0 – (6s6p) 3P1 of 174Yb was used; all experimental parameters were carefully controlled for their correspondence to the broad spectral line conditions. The curve representing the collision-induced stimulated photon echo variations versus a weak magnetic field strength showed very good agreement with the corresponding theoretical curve; this agreement made it possible to obtain the decay rates for orientation and alignment of the 174Yb 3P1 level in collisions with Xe.

  17. Simultaneous stack gas scrubbing wastewater purification

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Variations of a process for removing sulfur dioxide from stack gases and using it to treat municipal waste water are described. The once-through system lowers the pH of the scrubbing water from minor depressions to a pH of about 2.5 under certain conditions. A recycle system uses iron for catalytic oxidation of sulfurous acid to sulfuric acid allowing very large amounts of sulfur dioxide to be absorbed in a small portion of water. The partial recycle system uses municipal wastewater and iron as a scrubbing medium, followed by neutralization of the wastewater with lime to produce an iron hydroxide precipitation which, when removed, produces tertiary quality treated wastewater. The SO2 scrubber is described, test results are analyzed, and a preliminary capital cost estimate for the three processes is included.

  18. Neural Mirroring Systems: Exploring the EEG Mu Rhythm in Human Infancy

    PubMed Central

    Marshall, Peter J.; Meltzoff, Andrew N.

    2010-01-01

    How do human children come to understand the actions of other people? What neural systems are associated with the processing of others’ actions and how do these systems develop, starting in infancy? These questions span cognitive psychology and developmental cognitive neuroscience, and addressing them has important implications for the study of social cognition. A large amount of research has used behavioral measures to investigate infants’ imitation of the actions of other people; a related but smaller literature has begun to use neurobiological measures to study infants’ action representation. Here we focus on experiments employing electroencephalographic (EEG) techniques for assessing mu rhythm desynchronization in infancy, and analyze how this work illuminates the links between action perception and production prior to the onset of language. PMID:21528008

  19. The radial velocity, velocity dispersion, and mass-to-light ratio of the Sculptor dwarf galaxy

    NASA Technical Reports Server (NTRS)

    Armandroff, T. E.; Da Costa, G. S.

    1986-01-01

    The radial velocity, velocity dispersion, and mass-to-light ratio for 16 K giants in the Sculptor dwarf galaxy are calculated. Spectra at the Ca II triplet are analyzed using cross-correlation techniques in order to obtain the mean velocity of +107.4 ± 2.0 km/s. The one-dimensional velocity dispersion, estimated as 6.3 (+1.1, -1.3) km/s, is combined with the calculated core radius and observed central surface brightness to produce a mass-to-light ratio of 6.0 in solar units. It is noted that the data indicate that Sculptor contains a large amount of mass not found in globular clusters, and the mass is either in the form of remnant stars or low-mass dwarfs.

  20. An afterburner-powered methane/steam reformer for a solid oxide fuel cells application

    NASA Astrophysics Data System (ADS)

    Mozdzierz, Marcin; Chalusiak, Maciej; Kimijima, Shinji; Szmyd, Janusz S.; Brus, Grzegorz

    2018-04-01

    Solid oxide fuel cell (SOFC) systems can be fueled by natural gas when the reforming reaction is conducted in the stack. Due to its maturity and safety, indirect internal reforming is usually used. The strongly endothermic methane/steam reforming process needs a large amount of heat, and it is convenient to provide this thermal energy by burning the fuel remaining after the cell. In this work, a mathematical model of an afterburner-powered methane/steam reformer is proposed. To analyze the effect of fuel composition on SOFC performance, a zero-dimensional model of a fuel cell connected with a reformer is formulated. It is shown that the highest efficiency of a solid oxide fuel cell is achieved when the steam-to-methane ratio at the reforming reactor inlet is high.

  1. Impacts of cloud superparameterization on projected daily rainfall intensity climate changes in multiple versions of the Community Earth System Model

    DOE PAGES

    Kooperman, Gabriel J.; Pritchard, Michael S.; Burt, Melissa A.; ...

    2016-09-26

    Changes in the character of rainfall are assessed using a holistic set of statistics based on rainfall frequency and amount distributions in climate change experiments with three conventional and superparameterized versions of the Community Atmosphere Model (CAM and SPCAM). Previous work has shown that high-order statistics of present-day rainfall intensity are significantly improved with superparameterization, especially in regions of tropical convection. Globally, the two modeling approaches project a similar future increase in mean rainfall, especially across the Inter-Tropical Convergence Zone (ITCZ) and at high latitudes, but over land, SPCAM predicts a smaller mean change than CAM. Changes in high-order statistics are similar at high latitudes in the two models but diverge at lower latitudes. In the tropics, SPCAM projects a large intensification of moderate and extreme rain rates in regions of organized convection associated with the Madden Julian Oscillation, ITCZ, monsoons, and tropical waves. In contrast, this signal is missing in all versions of CAM, which are found to be prone to predicting increases in the amount but not intensity of moderate rates. Predictions from SPCAM exhibit a scale-insensitive behavior with little dependence on horizontal resolution for extreme rates, while lower resolution (~2°) versions of CAM are not able to capture the response simulated with higher resolution (~1°). Furthermore, moderate rain rates analyzed by the “amount mode” and “amount median” are found to be especially telling as a diagnostic for evaluating climate model performance and tracing future changes in rainfall statistics to tropical wave modes in SPCAM.
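
    A worked numpy example of the "amount mode" and "amount median" diagnostics mentioned above, under the usual definitions (the rain rate whose bin contributes the most accumulated rainfall, and the rate below which half of all rainfall accumulates); the daily rain rates are synthetic, not model output.

        import numpy as np

        # Synthetic daily rain rates in mm/day; zeros represent dry days.
        rng = np.random.default_rng(3)
        rates = rng.gamma(shape=0.5, scale=8.0, size=10000)
        rates[rng.random(rates.size) < 0.4] = 0.0
        wet = rates[rates > 0]

        # Amount distribution: total rainfall contributed by each (logarithmic) rain-rate bin.
        edges = np.logspace(-1, 2.5, 60)
        amount_per_bin, edges = np.histogram(wet, bins=edges, weights=wet)
        i = np.argmax(amount_per_bin)
        amount_mode = np.sqrt(edges[i] * edges[i + 1])        # geometric center of the peak bin

        # Amount median: rain rate below which half of all rainfall falls.
        s = np.sort(wet)
        amount_median = s[np.searchsorted(np.cumsum(s), 0.5 * s.sum())]

        print(f"amount mode ~ {amount_mode:.1f} mm/day, amount median ~ {amount_median:.1f} mm/day")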

  2. Comparison and optimization of detection methods for noroviruses in frozen strawberries containing different amounts of RT-PCR inhibitors.

    PubMed

    Bartsch, Christina; Szabo, Kathrin; Dinh-Thanh, Mai; Schrader, Christina; Trojnar, Eva; Johne, Reimar

    2016-12-01

    Frozen berries have been repeatedly identified as vehicles for norovirus (NoV) transmission causing large gastroenteritis outbreaks. However, virus detection in berries is often hampered by the presence of RT-PCR-inhibiting substances. Here, several virus extraction methods for subsequent real-time RT-PCR-based NoV-RNA detection in strawberries were compared and optimized. NoV recovery rates (RRs) between 0.21 ± 0.13% and 10.29 ± 6.03% were found when five different artificially contaminated strawberry batches were analyzed by the ISO/TS15216-2 method, indicating the presence of different amounts of RT-PCR inhibitors. A comparison of five different virus extraction methods using artificially contaminated strawberries containing high amounts of RT-PCR inhibitors revealed the best NoV RRs for the ISO/TS15216 method. Further improvement of NoV RRs from 2.83 ± 2.92% to 15.28 ± 9.73% was achieved by the additional use of Sephacryl(®)-based columns for RNA purification. Testing of 22 frozen strawberry samples from a batch involved in a gastroenteritis outbreak resulted in 5 vs. 13 NoV GI-positive and in 9 vs. 20 NoV GII-positive samples using the original ISO/TS15216 method vs. the extended protocol, respectively. It can be concluded that the inclusion of an additional RNA purification step can increase NoV detection by the ISO/TS15216-2 method in frozen berries containing high amounts of RT-PCR inhibitors. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. Historical changes in organic matter input to the muddy sediments along the Zhejiang-Fujian Coast, China over the past 160 years

    USGS Publications Warehouse

    Chen, Li-lei; Liu, Jian; Xing, Lei; Krauss, Ken W.; Wang, Jia-sheng; Xu, Gang; Li, Li

    2017-01-01

    The burial of sedimentary organic matter (SOM) in the large river-influenced estuarine-coastal regions is affected by hydrodynamic sorting, diagenesis and human activities. Typically, the inner shelf region of the East China Sea is a major carbon sink of the Yangtze River-derived fine-grained sediments. Most of the previous work concentrated on the studies of surface sediments or used a single proxy in this region. In this study, two cores from the Zhejiang-Fujian Coast were analyzed using bulk (TOC, TN and δ13CTOC) and molecular biomarker (n-alkane, brassicasterol, dinosterol and glycerol dialkyl glycerol tetraether lipids) techniques to clarify the sources, spatiotemporal distribution and fate of SOM in the Yangtze River Estuary and adjacent shelf. Results from this study indicated that the effects of diagenesis and diffusion on different sedimentary biomarkers resulted in overestimation of the relative contribution of terrestrial organic matter (%OMterr), compared with those based on δ13CTOC. The amounts of terrestrial plant organic matter (OMplant) and %OMterr in sediments decreased offshore. In contrast, the amounts of marine organic matter (OMmarine) increased offshore, but closer to the Yangtze River mouth, the amounts of soil organic matter (OMsoil) increased. Moreover, the amounts of TOC, OMplant and OMmarine biomarkers increased, but OMsoil and %OMterr decreased over time in recent decades. Our study suggests that spatial organic matter distribution patterns in marine shelf sediments were controlled primarily by hydrodynamic sorting and nutrient concentrations, and temporally diverse patterns were controlled predominantly by anthropogenic influence (e.g., dam construction and soil conservation, reclamation and agricultural plantations, anthropogenic nutrient input, dust storms, eutrophication, etc.) and climate events (e.g., interdecadal climatic jump and heavy rain events) in the geological period.

  4. A multilinear regression methodology to analyze the effect of atmospheric and surface forcing on Arctic clouds

    NASA Astrophysics Data System (ADS)

    Boeke, R.; Taylor, P. C.; Li, Y.

    2017-12-01

    Arctic cloud amount as simulated in CMIP5 models displays large intermodel spread: models disagree on the processes important for cloud formation as well as the radiative impact of clouds. The radiative response to cloud forcing can be better assessed when the drivers of Arctic cloud formation are known. Arctic cloud amount (CA) is a function of both atmospheric and surface conditions, and it is crucial to separate the influences of unique processes to understand why the models are different. This study uses a multilinear regression methodology to determine cloud changes using 3 variables as predictors: lower tropospheric stability (LTS), 500-hPa vertical velocity (ω500), and sea ice concentration (SIC). These three explanatory variables were chosen because their effects on clouds can be attributed to unique climate processes: LTS is a thermodynamic indicator of the relationship between clouds and atmospheric stability, SIC determines the interaction between clouds and the surface, and ω500 is a metric for dynamical change. Vertical, seasonal profiles of necessary variables are obtained from the Coupled Model Intercomparison Project 5 (CMIP5) historical simulation, an ocean-atmosphere coupled model forced with the best-estimate natural and anthropogenic radiative forcing from 1850-2005, and statistical significance tests are used to confirm the regression equation. A unique heuristic model will be constructed for each climate model and for observations, and models will be tested by their ability to capture the observed cloud amount and behavior. Lastly, the intermodel spread in Arctic cloud amount will be attributed to individual processes, ranking the relative contributions of each factor to shed light on emergent constraints in the Arctic cloud radiative effect.
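
    A minimal numpy sketch of the multilinear regression step described above (not the study's code or data): seasonal-mean cloud amount is regressed on LTS, ω500, and SIC, and the fitted coefficients give the sensitivity of cloud amount to each predictor. All values are synthetic.

        import numpy as np

        # Synthetic seasonal means standing in for one model's Arctic-average fields.
        rng = np.random.default_rng(4)
        n = 120
        lts = rng.normal(20.0, 3.0, n)         # lower tropospheric stability (K)
        omega500 = rng.normal(0.0, 0.02, n)    # 500-hPa vertical velocity (Pa/s)
        sic = rng.uniform(0.0, 1.0, n)         # sea ice concentration (fraction)
        cloud = 60.0 - 0.8 * lts - 150.0 * omega500 + 10.0 * sic + rng.normal(0.0, 2.0, n)

        X = np.column_stack([np.ones(n), lts, omega500, sic])
        coef, *_ = np.linalg.lstsq(X, cloud, rcond=None)
        r2 = 1.0 - np.sum((cloud - X @ coef) ** 2) / np.sum((cloud - cloud.mean()) ** 2)
        print("intercept and sensitivities to LTS, w500, SIC:", np.round(coef, 2), "R^2:", round(r2, 2))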

  5. Nutritional aspects of second generation soy foods.

    PubMed

    Alezandro, Marcela Roquim; Granato, Daniel; Lajolo, Franco Maria; Genovese, Maria Inés

    2011-05-25

    Samples of 15 second generation soy-based products (n = 3), commercially available, were analyzed for their protein and isoflavone contents and in vitro antioxidant activity, by means of the Folin-Ciocalteu reducing ability, DPPH radical scavenging capacity, and oxygen radical absorbance capacity. Isoflavone identification and quantification were performed by high-performance liquid chromatography. Products containing soy and/or soy-based ingredients represent important sources of protein in addition to the low fat amounts. However, a large variation in isoflavone content and in vitro antioxidant capacity was observed. The isoflavone content varied from 2.4 to 18.1 mg/100 g (FW), and soy kibe and soy sausage presented the highest amounts. Chocolate had the highest antioxidant capacity, but this fact was probably associated with the addition of cocoa liquor, a well-known source of polyphenolics. This study showed that the soy-based foods do not present a significant content of isoflavones when compared with the grain, and their in vitro antioxidant capacity is not related with these compounds but rather to the presence of other phenolics and synthetic antioxidants, such as sodium erythorbate. However, they may represent alternative sources and provide soy protein, isoflavones, and vegetable fat for those who are not ready to eat traditional soy foods.

  6. SEMIPARAMETRIC ZERO-INFLATED MODELING IN MULTI-ETHNIC STUDY OF ATHEROSCLEROSIS (MESA)

    PubMed Central

    Liu, Hai; Ma, Shuangge; Kronmal, Richard; Chan, Kung-Sik

    2013-01-01

    We analyze the Agatston score of coronary artery calcium (CAC) from the Multi-Ethnic Study of Atherosclerosis (MESA) using a semiparametric zero-inflated modeling approach, where the observed CAC scores from this cohort consist of a high frequency of zeroes and continuously distributed positive values. Both partially constrained and unconstrained models are considered to investigate the underlying biological processes of CAC development from zero to positive, and from small amounts to large amounts. Different from existing studies, a model selection procedure based on likelihood cross-validation is adopted to identify the optimal model, which is justified by comparative Monte Carlo studies. A shrinkage version of the cubic regression spline is used for model estimation and variable selection simultaneously. When applying the proposed methods to the MESA data analysis, we show that the two biological mechanisms influencing the initiation of CAC and the magnitude of CAC when it is positive are better characterized by an unconstrained zero-inflated normal model. Our results are significantly different from those in published studies, and may provide further insights into the biological mechanisms underlying CAC development in humans. This highly flexible statistical framework can be applied to zero-inflated data analyses in other areas. PMID:23805172
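
    A deliberately simplified Python sketch of the zero-inflated idea described above (a two-part, hurdle-style stand-in, not the paper's semiparametric model): a logistic model for whether CAC is positive and a linear model for log CAC among positives, combined to give an expected score. The covariate and data are synthetic.

        import numpy as np
        from sklearn.linear_model import LinearRegression, LogisticRegression

        # Synthetic MESA-like data: age as the single covariate, CAC as the zero-inflated outcome.
        rng = np.random.default_rng(5)
        n = 2000
        age = rng.uniform(45.0, 85.0, n)
        p_pos = 1.0 / (1.0 + np.exp(-(-6.0 + 0.09 * age)))
        positive = rng.random(n) < p_pos
        cac = np.where(positive, np.exp(0.5 + 0.06 * age + rng.normal(0.0, 1.0, n)), 0.0)

        X = age.reshape(-1, 1)
        part1 = LogisticRegression().fit(X, cac > 0)                       # initiation: zero vs. positive
        part2 = LinearRegression().fit(X[cac > 0], np.log(cac[cac > 0]))   # magnitude when positive

        grid = np.array([[55.0], [75.0]])
        expected = part1.predict_proba(grid)[:, 1] * np.exp(part2.predict(grid))
        print("rough E[CAC | age = 55, 75]:", np.round(expected, 1))  # ignores the lognormal correction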

  7. Fatty acids in energy metabolism of the central nervous system.

    PubMed

    Panov, Alexander; Orynbayeva, Zulfiya; Vavilin, Valentin; Lyakhovich, Vyacheslav

    2014-01-01

    In this review, we analyze the current hypotheses regarding energy metabolism in the neurons and astroglia. Recently, it was shown that up to 20% of the total brain's energy is provided by mitochondrial oxidation of fatty acids. However, the existing hypotheses consider glucose, or its derivative lactate, as the only main energy substrate for the brain. Astroglia metabolically supports the neurons by providing lactate as a substrate for neuronal mitochondria. In addition, a significant amount of neuromediators, glutamate and GABA, is transported into neurons and also serves as substrates for mitochondria. Thus, neuronal mitochondria may simultaneously oxidize several substrates. Astrocytes have to replenish the pool of neuromediators by synthesis de novo, which requires large amounts of energy. In this review, we made an attempt to reconcile β-oxidation of fatty acids by astrocytic mitochondria with the existing hypothesis on regulation of aerobic glycolysis. We suggest that, under conditions of neuronal excitation, both metabolic pathways may exist simultaneously. We provide experimental evidence that isolated neuronal mitochondria may oxidize palmitoyl carnitine in the presence of other mitochondrial substrates. We also suggest that variations in the brain mitochondrial metabolic phenotype may be associated with different mtDNA haplogroups.

  8. GPUmotif: An Ultra-Fast and Energy-Efficient Motif Analysis Program Using Graphics Processing Units

    PubMed Central

    Zandevakili, Pooya; Hu, Ming; Qin, Zhaohui

    2012-01-01

    Computational detection of TF binding patterns has become an indispensable tool in functional genomics research. With the rapid advance of new sequencing technologies, large amounts of protein-DNA interaction data have been produced. Analyzing this data can provide substantial insight into the mechanisms of transcriptional regulation. However, the massive amount of sequence data presents daunting challenges. In our previous work, we have developed a novel algorithm called Hybrid Motif Sampler (HMS) that enables more scalable and accurate motif analysis. Despite much improvement, HMS is still time-consuming due to the requirement to calculate matching probabilities position-by-position. Using the NVIDIA CUDA toolkit, we developed a graphics processing unit (GPU)-accelerated motif analysis program named GPUmotif. We proposed a "fragmentation" technique to hide data transfer time between memories. Performance comparison studies showed that commonly-used model-based motif scan and de novo motif finding procedures such as HMS can be dramatically accelerated when running GPUmotif on NVIDIA graphics cards. As a result, energy consumption can also be greatly reduced when running motif analysis using GPUmotif. The GPUmotif program is freely available at http://sourceforge.net/projects/gpumotif/ PMID:22662128
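
    A minimal numpy sketch of the underlying motif-scan computation that tools like the one above accelerate (not the GPUmotif or HMS code): every window of a DNA sequence is scored against a log-odds position weight matrix and the best-scoring position is reported. The matrix and sequence are randomly generated placeholders.

        import numpy as np

        BASES = "ACGT"

        def scan(sequence, pwm):
            """Score every window of a DNA sequence against a log-odds position weight matrix."""
            width = pwm.shape[1]
            idx = np.array([BASES.index(b) for b in sequence])
            return np.array([pwm[idx[i:i + width], np.arange(width)].sum()
                             for i in range(len(sequence) - width + 1)])

        # Tiny illustrative PWM (4 bases x 6 positions) and random sequence, not a real TF model.
        rng = np.random.default_rng(6)
        pwm = np.log((rng.dirichlet(np.ones(4), size=6).T + 1e-3) / 0.25)
        seq = "".join(rng.choice(list(BASES), size=200))

        scores = scan(seq, pwm)
        print("best hit at position", int(scores.argmax()), "with score", round(float(scores.max()), 2))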

  9. Science & Technology Review October 2005

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aufderheide III, M B

    This month's issue has the following articles: (1) Important Missions, Great Science, and Innovative Technology--Commentary by Cherry A. Murray; (2) NanoFoil® Solders with Less Heat--Soldering and brazing to join an array of materials are now possible without furnaces, torches, or lead; (3) Detecting Radiation on the Move--An award-winning technology can detect even small amounts of radioactive material in transit; (4) Identifying Airborne Pathogens in Time to Respond--A mass spectrometer identifies airborne spores in less than a minute with no false positives; (5) Picture Perfect with VisIt--The Livermore-developed software tool VisIt helps scientists visualize and analyze large data sets; (6) Revealing the Mysteries of Water--Scientists are using Livermore's Thunder supercomputer and new algorithms to understand the phases of water; and (7) Lightweight Target Generates Bright, Energetic X Rays--Livermore scientists are producing aerogel targets for use in inertial confinement fusion experiments and radiation-effects testing.

  10. Event-Based User Classification in Weibo Media

    PubMed Central

    Wang, Wendong; Cheng, Shiduan; Que, Xirong

    2014-01-01

    Weibo media, known as the real-time microblogging service, has attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and significantly changes the way people acquire and disseminate information. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media is related to some event. Users who post different content and exhibit different behaviors or attitudes may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events is a promising task. Under this circumstance, in order to effectively organize and manage the huge number of users, and thereby further manage their contents, this paper addresses the task of user classification using a more granular, event-based approach. By analyzing real data collected from Sina Weibo, we investigate the Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately. PMID:25133235

  11. Event-based user classification in Weibo media.

    PubMed

    Guo, Liang; Wang, Wendong; Cheng, Shiduan; Que, Xirong

    2014-01-01

    Weibo media, known as the real-time microblogging service, has attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and significantly changes the way people acquire and disseminate information. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media is related to some event. Users who post different content and exhibit different behaviors or attitudes may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events is a promising task. Under this circumstance, in order to effectively organize and manage the huge number of users, and thereby further manage their contents, this paper addresses the task of user classification using a more granular, event-based approach. By analyzing real data collected from Sina Weibo, we investigate the Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately.

  12. High Conduction Neutron Absorber to Simulate Fast Reactor Environment in an Existing Test Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guillen, Donna; Greenwood, Lawrence R.; Parry, James

    2014-06-22

    A need was determined for a thermal neutron absorbing material that could be cooled in a gas reactor environment without using large amounts of a coolant that would thermalize the neutron flux. A new neutron absorbing material was developed that provided high conduction so that a small amount of water would be sufficient for cooling, thereby thermalizing the flux as little as possible. An irradiation experiment was performed to assess the effects of radiation and the performance of the new neutron absorbing material. Neutron fluence monitors were placed inside specially fabricated holders within a set of drop-in capsules and irradiated for up to four cycles in the Advanced Test Reactor. Following irradiation, the neutron fluence monitor wires were analyzed by gamma and x-ray spectrometry to determine the activities of the activation products. The adjusted neutron fluences were calculated and grouped into three bins – thermal, epithermal, and fast – to evaluate the spectral shift created by the new material. Fluence monitors were evaluated after four different irradiation periods to evaluate the effects of burn-up in the absorbing material. Additionally, activities of the three highest activity isotopes present in the specimens are given.

  13. Fatty Acids in Energy Metabolism of the Central Nervous System

    PubMed Central

    Orynbayeva, Zulfiya; Vavilin, Valentin; Lyakhovich, Vyacheslav

    2014-01-01

    In this review, we analyze the current hypotheses regarding energy metabolism in the neurons and astroglia. Recently, it was shown that up to 20% of the total brain's energy is provided by mitochondrial oxidation of fatty acids. However, the existing hypotheses consider glucose, or its derivative lactate, as the only main energy substrate for the brain. Astroglia metabolically supports the neurons by providing lactate as a substrate for neuronal mitochondria. In addition, a significant amount of neuromediators, glutamate and GABA, is transported into neurons and also serves as substrates for mitochondria. Thus, neuronal mitochondria may simultaneously oxidize several substrates. Astrocytes have to replenish the pool of neuromediators by synthesis de novo, which requires large amounts of energy. In this review, we made an attempt to reconcile β-oxidation of fatty acids by astrocytic mitochondria with the existing hypothesis on regulation of aerobic glycolysis. We suggest that, under conditions of neuronal excitation, both metabolic pathways may exist simultaneously. We provide experimental evidence that isolated neuronal mitochondria may oxidize palmitoyl carnitine in the presence of other mitochondrial substrates. We also suggest that variations in the brain mitochondrial metabolic phenotype may be associated with different mtDNA haplogroups. PMID:24883315

  14. Paradoxical Effects of Fruit on Obesity

    PubMed Central

    Sharma, Satya P.; Chung, Hea J.; Kim, Hyeon J.; Hong, Seong T.

    2016-01-01

    Obesity is increasing exponentially despite being largely preventable. Current measures for preventing obesity have failed to address its severity and prevalence, so alternative approaches based on nutritional and dietary changes are attracting attention for the treatment of obesity. Fruit contains large amounts of simple sugars (glucose, fructose, sucrose, etc.), which are well known to induce obesity. Thus, considering the amount of simple sugars found in fruit, it is reasonable to expect that their consumption should contribute to obesity rather than weight reduction. However, epidemiological research has consistently shown that most types of fruit have anti-obesity effects. Thus, due to their anti-obesity effects as well as their vitamin and mineral contents, health organizations are suggesting the consumption of fruit for weight reduction purposes. These contradictory characteristics of fruit with respect to human body weight management motivated us to review previous research to understand the contribution of different types of fruit to weight management. In this review article, we analyze and discuss the relationships between fruit and their anti-obesity effects based on numerous possible underlying mechanisms, and we conclude that each type of fruit has different effects on body weight. PMID:27754404

  15. GPUmotif: an ultra-fast and energy-efficient motif analysis program using graphics processing units.

    PubMed

    Zandevakili, Pooya; Hu, Ming; Qin, Zhaohui

    2012-01-01

    Computational detection of TF binding patterns has become an indispensable tool in functional genomics research. With the rapid advance of new sequencing technologies, large amounts of protein-DNA interaction data have been produced. Analyzing this data can provide substantial insight into the mechanisms of transcriptional regulation. However, the massive amount of sequence data presents daunting challenges. In our previous work, we have developed a novel algorithm called Hybrid Motif Sampler (HMS) that enables more scalable and accurate motif analysis. Despite much improvement, HMS is still time-consuming due to the requirement to calculate matching probabilities position-by-position. Using the NVIDIA CUDA toolkit, we developed a graphics processing unit (GPU)-accelerated motif analysis program named GPUmotif. We proposed a "fragmentation" technique to hide data transfer time between memories. Performance comparison studies showed that commonly-used model-based motif scan and de novo motif finding procedures such as HMS can be dramatically accelerated when running GPUmotif on NVIDIA graphics cards. As a result, energy consumption can also be greatly reduced when running motif analysis using GPUmotif. The GPUmotif program is freely available at http://sourceforge.net/projects/gpumotif/

  16. Investigation of the role of the micro-porous layer in polymer electrolyte fuel cells with hydrogen deuterium contrast neutron radiography.

    PubMed

    Cho, Kyu Taek; Mench, Matthew M

    2012-03-28

    In this study, the high resolution hydrogen-deuterium contrast radiography method was applied to elucidate the impact of the micro-porous layer (MPL) on water distribution in the porous fuel cell media. At the steady state, deuterium replaced hydrogen in the anode stream, and the large difference in neutron attenuation of the D(2)O produced at the cathode was used to track the produced water. It was found that the water content peaked in the cathode-side diffusion media (DM) for the cell without MPL, but with an MPL on the anode and cathode DM, the peak water amount was pushed toward the anode, resulting in a relatively flattened water profile through components and demonstrating a liquid barrier effect. Additionally, the dynamic water behavior in diffusion media was analyzed to understand the effect of a MPL and operating conditions. The water content in the DM changed with applied current, although there is a significant amount of residual liquid content that does not appear to be part of capillary channels. The effect of the MPL on irreducible saturation in DM and cell performance was also investigated.

  17. GESearch: An Interactive GUI Tool for Identifying Gene Expression Signature.

    PubMed

    Ye, Ning; Yin, Hengfu; Liu, Jingjing; Dai, Xiaogang; Yin, Tongming

    2015-01-01

    The huge amount of gene expression data generated by microarray and next-generation sequencing technologies presents challenges for exploiting its biological meaning. When searching for coexpressed genes, the data mining process is largely affected by the selection of algorithms. Thus, it is highly desirable to provide multiple algorithm options in a user-friendly analytical toolkit for exploring gene expression signatures. For this purpose, we developed GESearch, an interactive graphical user interface (GUI) toolkit, which is written in MATLAB and supports a variety of gene expression data files. This analytical toolkit provides four models, including the mean, the regression, the delegate, and the ensemble models, to identify the coexpression genes, and enables the users to filter data and to select gene expression patterns by browsing the display window or by importing knowledge-based genes. Subsequently, the utility of this analytical toolkit is demonstrated by analyzing two sets of real-life microarray datasets from cell-cycle experiments. Overall, we have developed an interactive GUI toolkit that allows for choosing multiple algorithms for analyzing gene expression signatures.

  18. Matrix Dissolution Techniques Applied to Extract and Quantify Precipitates from a Microalloyed Steel

    NASA Astrophysics Data System (ADS)

    Lu, Junfang; Wiskel, J. Barry; Omotoso, Oladipo; Henein, Hani; Ivey, Douglas G.

    2011-07-01

    Microalloyed steels possess good strength and toughness, as well as excellent weldability; these attributes are necessary for oil and gas pipelines in northern climates. These properties are attributed in part to the presence of nanosized carbide and carbonitride precipitates. To understand the strengthening mechanisms and to optimize the strengthening effects, it is necessary to quantify the size distribution, volume fraction, and chemical speciation of these precipitates. However, characterization techniques suitable for quantifying fine precipitates are limited because of their fine sizes, wide particle size distributions, and low volume fractions. In this article, two matrix dissolution techniques have been developed to extract precipitates from a Grade 100 (yield strength of 690 MPa) microalloyed steel. Relatively large volumes of material can be analyzed, and statistically significant quantities of precipitates of different sizes are collected. Transmission electron microscopy (TEM) and X-ray diffraction (XRD) are combined to analyze the chemical speciation of these precipitates. Rietveld refinement of XRD patterns is used to quantify fully the relative amounts of the precipitates. The size distribution of the nanosized precipitates is quantified using dark-field imaging in the TEM.

  19. Multiple-Feature Extracting Modules Based Leak Mining System Design

    PubMed Central

    Cho, Ying-Chiang; Pan, Jen-Yi

    2013-01-01

    Over the years, human dependence on the Internet has increased dramatically. A large amount of information is placed on the Internet and retrieved from it daily, which makes web security in terms of online information a major concern. In recent years, the most problematic issues in web security have been e-mail address leakage and SQL injection attacks. There are many possible causes of information leakage, such as inadequate precautions during the programming process, which lead to the leakage of e-mail addresses entered online, or insufficient protection of database information, a loophole that enables malicious users to steal online content. In this paper, we implement a crawler mining system that is equipped with SQL injection vulnerability detection, by means of an algorithm developed for the web crawler. In addition, we analyze the portal sites of the governments of various countries or regions in order to investigate the information leakage status of each site. Subsequently, we analyze the database structure and content of each site, using the data collected. Thus, we make use of practical verification in order to focus on information security and privacy through black-box testing. PMID:24453892
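
    The following is a minimal, hypothetical sketch of one ingredient such a system needs: a breadth-first crawler that stays within a portal and flags e-mail addresses exposed in page source. It is not the authors' leak mining system; the regular expressions, crawl limits, and the use of the requests library are assumptions.

        # Minimal illustrative sketch of e-mail leakage scanning (not the paper's crawler).
        # Assumes the target site may be crawled legitimately; limits are toy values.
        import re
        from urllib.parse import urljoin
        import requests

        EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")
        LINK_RE = re.compile(r'href="([^"#]+)"')

        def scan_for_emails(start_url, max_pages=20):
            seen, queue, leaks = set(), [start_url], {}
            while queue and len(seen) < max_pages:
                url = queue.pop(0)
                if url in seen:
                    continue
                seen.add(url)
                try:
                    html = requests.get(url, timeout=10).text
                except requests.RequestException:
                    continue
                found = set(EMAIL_RE.findall(html))
                if found:
                    leaks[url] = found                      # addresses exposed on this page
                for link in LINK_RE.findall(html):
                    absolute = urljoin(url, link)
                    if absolute.startswith(start_url):      # stay within the portal
                        queue.append(absolute)
            return leaks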

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dumpala, Ravikumar; Nano Functional Materials Technology Centre, Department of Physics, Indian Institute of Technology Madras, Chennai 600036; Kumar, N.

    Tribo-layer formation and the frictional characteristics of a SiC ball were studied in sliding tests against a nanocrystalline diamond coating under atmospheric test conditions. Unsteady friction coefficients in the range of 0.04 to 0.1 were observed during the tribo-test. Friction and wear characteristics were found to be influenced by the formation of a cohesive tribo-layer (thickness ∼ 1.3 μm) in the wear track of the nanocrystalline diamond coating. The hardness of the tribo-layer was measured using the nanoindentation technique, and a low hardness of ∼ 1.2 GPa was observed. The presence of silicon and oxygen in the tribo-layer was detected by energy dispersive spectroscopy mapping, and the chemical states of the silicon were analyzed using X-ray photoelectron spectroscopy. The large amount of oxygen in the tribo-layer indicated a tribo-oxidation wear mechanism. - Highlights: • Sliding wear and friction characteristics of SiC were studied against NCD coating. • Silicon oxide tribo-layer formation was observed in the NCD coating wear track. • Low hardness 1.2 GPa of tribo-layer was measured using nanoindentation technique. • Chemical states of silicon were analyzed using X-ray photoelectron spectroscopy.

  1. Multiple-feature extracting modules based leak mining system design.

    PubMed

    Cho, Ying-Chiang; Pan, Jen-Yi

    2013-01-01

    Over the years, human dependence on the Internet has increased dramatically. A large amount of information is placed on the Internet and retrieved from it daily, which makes web security in terms of online information a major concern. In recent years, the most problematic issues in web security have been e-mail address leakage and SQL injection attacks. There are many possible causes of information leakage, such as inadequate precautions during the programming process, which lead to the leakage of e-mail addresses entered online, or insufficient protection of database information, a loophole that enables malicious users to steal online content. In this paper, we implement a crawler mining system that is equipped with SQL injection vulnerability detection, by means of an algorithm developed for the web crawler. In addition, we analyze the portal sites of the governments of various countries or regions in order to investigate the information leakage status of each site. Subsequently, we analyze the database structure and content of each site, using the data collected. Thus, we make use of practical verification in order to focus on information security and privacy through black-box testing.

  2. A small amount can make a difference: a prospective human study of the paradoxical coagulation characteristics of hemothorax.

    PubMed

    Smith, W Zachary; Harrison, Hannah B; Salhanick, Marc A; Higgins, Russell A; Ortiz, Alfonso; Olson, John D; Schwacha, Martin G; Harrison, Chantal R; Aydelotte, Jayson D; Stewart, Ronald M; Dent, Daniel L

    2013-12-01

    The evacuated hemothorax has been poorly described because it varies with time, it has been found to be incoagulable, and its potential effect on the coagulation cascade during autotransfusion is largely unknown. This is a prospective descriptive study of adult patients with traumatic chest injury necessitating tube thoracostomy. Pleural and venous samples were analyzed for coagulation, hematology, and electrolytes at 1 to 4 hours after drainage. Pleural samples were also analyzed for their effect on the coagulation cascade via mixing studies. Thirty-four subjects were enrolled with a traumatic hemothorax. The following measured coagulation factors were significantly depleted compared with venous blood: international normalized ratio (>9 vs 1.1) (P < .001) and activated partial thromboplastin time (aPTT) (>180 vs 24.5 seconds) (P < .001). Mixing studies showed a dose-dependent increase in coagulation dilutions through 1:8 (P < .05). An evacuated hemothorax does not vary in composition significantly with time and is incoagulable alone. Mixing studies with hemothorax plasma increased coagulation, raising safety concerns. Copyright © 2013 Elsevier Inc. All rights reserved.

  3. PanCoreGen - Profiling, detecting, annotating protein-coding genes in microbial genomes.

    PubMed

    Paul, Sandip; Bhardwaj, Archana; Bag, Sumit K; Sokurenko, Evgeni V; Chattopadhyay, Sujay

    2015-12-01

    A large amount of genomic data, especially from multiple isolates of a single species, has opened new vistas for microbial genomics analysis. Analyzing the pan-genome (i.e., the total genetic repertoire) of a microbial species is crucial in understanding the dynamics of molecular evolution, where virulence evolution is of major interest. Here we present PanCoreGen - a standalone application for pan- and core-genomic profiling of microbial protein-coding genes. PanCoreGen overcomes key limitations of the existing pan-genomic analysis tools and develops an integrated annotation structure for a species-specific pan-genomic profile. It provides important new features for annotating draft genomes/contigs and detecting unidentified genes in annotated genomes. It also generates user-defined group-specific datasets within the pan-genome. Interestingly, analyzing an example set of Salmonella genomes, we detect potential footprints of adaptive convergence of horizontally transferred genes in two human-restricted pathogenic serovars - Typhi and Paratyphi A. Overall, PanCoreGen represents a state-of-the-art tool for microbial phylogenomics and pathogenomics studies. Copyright © 2015 Elsevier Inc. All rights reserved.
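
    A minimal sketch of the underlying pan-/core-genome bookkeeping is given below: given an ortholog presence/absence table, gene families are classified as core (in every genome), unique (in one genome), or accessory (in a subset). This illustrates the concept only and is not PanCoreGen itself; the data structure and example names are invented.

        # Illustrative sketch (not PanCoreGen itself): classify gene families as core,
        # accessory, or unique from an ortholog presence/absence table.
        # `presence` maps gene-family name -> set of genome IDs carrying that family.

        def profile_pangenome(presence: dict, genomes: list):
            n = len(genomes)
            core, accessory, unique = [], [], []
            for family, carriers in presence.items():
                k = len(carriers)
                if k == n:
                    core.append(family)          # present in every genome
                elif k == 1:
                    unique.append(family)        # strain-specific
                else:
                    accessory.append(family)     # present in a subset of genomes
            return {"core": core, "accessory": accessory, "unique": unique}

        # Example (invented gene and genome names):
        # genomes = ["Typhi", "ParatyphiA", "Typhimurium"]
        # presence = {"geneA": {"Typhi", "ParatyphiA", "Typhimurium"}, "geneB": {"Typhi"}}
        # print(profile_pangenome(presence, genomes))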

  4. Research on the correlation between corona current spectrum and audible noise spectrum of HVDC transmission line

    NASA Astrophysics Data System (ADS)

    Liu, Yingyi; Zhou, Lijuan; Liu, Yuanqing; Yuan, Haiwen; Ji, Liang

    2017-11-01

    Audible noise is closely related to corona current on a high voltage direct current (HVDC) transmission line. In this paper, we measured a large number of audible noise and corona current waveforms simultaneously on the largest outdoor HVDC corona cage in the world. By analyzing the experimental data, statistical regularities relating the corona current spectrum and the audible noise spectrum were obtained. Furthermore, the generation mechanism of audible noise was analyzed theoretically, and a mathematical expression relating the audible noise spectrum to the corona current spectrum, applicable to all of the measuring points in the space, was established based on electro-acoustic conversion theory. Finally, combined with the obtained mathematical relation, the underlying reasons for the statistical regularities appearing in the measured corona current and audible noise data were explained. The results of this paper not only present the statistical association between the corona current spectrum and the audible noise spectrum on an HVDC transmission line, but also reveal the underlying reasons for these associations.

  5. Spectroscopic Factors from the Single Neutron Transfer Reaction 111Cd(d,p)112Cd

    NASA Astrophysics Data System (ADS)

    Jamieson, D. S.; Garrett, P. E.; Demand, G. A.; Finlay, P.; Green, K. L.; Leach, K. G.; Phillips, A. A.; Svensson, C. E.; Sumithrarachchi, C. S.; Triambak, S.; Wong, J.; Ball, G.; Faestermann, T.; Krücken, R.; Hertenberger, R.; Wirth, H.-F.

    2013-03-01

    The cadmium isotopes have been cited as excellent examples of vibrational nuclei for decades, with multi-phonon quadrupole, quadrupole-octupole, and mixed-symmetry states proposed. From a variety of experimental studies, a large amount of spectroscopic data has been obtained, recently focused on γ-ray studies. In the present work, the single-particle structure of 112Cd has been investigated using the 111Cd(d,p)112Cd reaction. The investigation was carried out using a 22 MeV beam of polarized deuterons obtained from the Maier-Leibnitz Laboratory at Garching, Germany. The reaction ejectiles were momentum analyzed using a Q3D spectrograph, and 115 levels have been identified up to 4.2 MeV of excitation energy. Spin-parity has been assigned to each analyzed level, and angular distributions for the reaction cross sections and analyzing powers were obtained. Many additional levels have been observed compared with the previous (d,p) study performed with 8 MeV deuterons, including strongly populated 5- and 6- states. The former was previously assigned as a member of the quadrupole-octupole quintuplet, based on a strongly enhanced B(E2) value to the 3- state, but is now re-assigned as being predominantly of s1/2 ⊗ h11/2 configuration.

  6. Modern Day Re-analysis of Pinatubo SO2 Injection, Cloud dispersion and Stratospheric Aerosols

    NASA Astrophysics Data System (ADS)

    Bhartia, P. K.; Krotkov, N. A.; Aquila, V.; Hughes, E. J.; Li, C.; Fisher, B. L.

    2016-12-01

    The cataclysmic June 15, 1991 eruption of Mt. Pinatubo injected the largest amount of SO2 into the lower stratosphere of the satellite era. The resulting volcanic clouds were tracked by NASA's Nimbus 7 TOMS sensor, which provided the first estimates of total SO2 gas emissions (~15+/-3 Mt). Over time, the SO2 converted to long-lasting sulfate aerosols affecting the radiation balance and composition of the stratosphere. The large number of articles and papers published in the past 25 years makes this the most well-studied volcanic eruption. Still, several unresolved scientific issues remain: the SO2 injection height, the subsequent lofting of SO2 and aerosols in the stratosphere, how much sulfate aerosol was produced in the eruption (i.e., the initial sulfate to SO2 ratio), and the impact on stratospheric ozone. To answer these questions we have re-analyzed past satellite measurements using modern day tools, such as re-analyzed wind data from the Goddard Modeling and Assimilation Office (GMAO), improved trajectory analysis tools, and a better radiative transfer model to process backscatter UV data from N7/TOMS and two NOAA SBUV/2 sensors, which provided measurements at shorter UV wavelengths that are sensitive to aerosols and SO2 in the mid stratosphere (~25 km). We have also re-analyzed aerosol data from SAGE, AVHRR, and several instruments on the UARS satellite. These data provide strong support for the recent assessment by modelers that the bulk of the SO2 mass injected by the volcano was well below the 25 km altitude, contrary to earlier estimates. We also find convincing evidence that there was a significant amount of sulfate aerosol embedded even in the day-old SO2 cloud. These results strongly support the hypothesis that SO2 gas self-lofted to ~25 km, as seen by UARS MLS several weeks after the eruption, and aerosols to ~35 km, as seen by the SAGE sensor several months later.

  7. Automatic Approach to Morphological Classification of Galaxies With Analysis of Galaxy Populations in Clusters

    NASA Astrophysics Data System (ADS)

    Sultanova, Madina; Barkhouse, Wayne; Rude, Cody

    2018-01-01

    The classification of galaxies based on their morphology is a field in astrophysics that aims to understand galaxy formation and evolution based on their physical differences. Whether structural differences are due to internal factors or a result of local environment, the dominant mechanism that determines galaxy type needs to be robustly quantified in order to have a thorough grasp of the origin of the different types of galaxies. The main subject of my Ph.D. dissertation is to explore the use of computers to automatically classify and analyze large numbers of galaxies according to their morphology, and to analyze sub-samples of galaxies selected by type to understand galaxy formation in various environments. I have developed a computer code to classify galaxies by measuring five parameters from their images in FITS format. The code was trained and tested using visually classified SDSS galaxies from Galaxy Zoo and the EFIGI data set. I apply my morphology software to numerous galaxies from diverse data sets. Among the data analyzed are the 15 Abell galaxy clusters (0.03 < z < 0.184) from Rude et al. 2017 (in preparation), which were observed by the Canada-France-Hawaii Telescope. Additionally, I studied 57 galaxy clusters from Barkhouse et al. (2007), 77 clusters from the WINGS survey (Fasano et al. 2006), and the six Hubble Space Telescope (HST) Frontier Field galaxy clusters. The high resolution of HST allows me to compare distant clusters with those nearby to look for evolutionary changes in the galaxy cluster population. I use the results from the software to examine the properties (e.g. luminosity functions, radial dependencies, star formation rates) of selected galaxies. Due to the large amount of data that will be available from wide-area surveys in the future, the use of computer software to classify and analyze the morphology of galaxies will be extremely important in terms of efficiency. This research aims to contribute to the solution of this problem.
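
    As a hypothetical illustration of measuring a single structural parameter from a FITS image, the sketch below computes a simple concentration index (flux within an inner radius divided by flux within an outer radius, around the brightest pixel). The dissertation's five parameters are not specified in the abstract, so the parameter choice, radii, and file name here are assumptions.

        # Hypothetical sketch of measuring one simple morphology parameter (a concentration
        # index) from a FITS image; not the dissertation's actual five-parameter code.
        import numpy as np
        from astropy.io import fits

        def concentration_index(path, r_inner=5.0, r_outer=20.0):
            """Ratio of flux inside r_inner to flux inside r_outer, around the brightest pixel."""
            data = fits.getdata(path).astype(float)
            cy, cx = np.unravel_index(np.argmax(data), data.shape)   # crude centre estimate
            yy, xx = np.indices(data.shape)
            r = np.hypot(yy - cy, xx - cx)
            flux_inner = data[r <= r_inner].sum()
            flux_outer = data[r <= r_outer].sum()
            return flux_inner / flux_outer if flux_outer > 0 else np.nan

        # print(concentration_index("galaxy_cutout.fits"))   # placeholder file name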

  8. Quantifying Fugitive Methane Emissions at an Underground Coal Fire Using Cavity Ring-Down Spectroscopy Technology

    NASA Astrophysics Data System (ADS)

    Fleck, D.; Gannon, L.; Kim-Hak, D.; Ide, T.

    2016-12-01

    Understanding methane emissions is of utmost importance due to methane's greenhouse warming potential. Methane emissions can occur from a variety of natural and anthropogenic sources, including wetlands, landfills, oil/gas/coal extraction activities, underground coal fires, and natural gas distribution systems. Locating and containing these emissions is critical to minimizing their environmental impacts and is economically beneficial when large fugitive amounts can be recovered. In order to design a way to mitigate these methane emissions, they must first be accurately quantified. One such quantification method is to measure methane fluxes, a technique in which the flux is calculated from the rate of gas accumulation in a chamber of known volume placed over methane seepages. This allows for quantification of greenhouse gas emissions at a localized level (sub one meter) that can complement remote sensing and other large-scale modeling techniques to further complete the picture of emission points. High-performance analyzers are required to provide both sufficient temporal resolution and precise concentration measurements in order to make these measurements in only minutes. A method of measuring methane fluxes was developed using the latest portable, battery-powered Cavity Ring-Down Spectroscopy analyzer from Picarro (G4301). In combination with a mobile accumulation chamber, the instrument allows for rapid measurement of methane and carbon dioxide fluxes over wide areas. For this study, methane fluxes measured at an underground coal fire near the Four Corners region using the Picarro analyzer are presented. The flux rates collected demonstrate the ability of the analyzer to detect methane fluxes across many orders of magnitude. Measurements were simultaneously geotagged with GPS to georeference the data. Methane flux data were instrumental in our ability to characterize the extent and the migration of the underground fire. In the future, examining the tradeoffs and dynamics between methane and carbon dioxide emissions will allow us to further understand the propagation and evolution of these large greenhouse gas emitters.
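
    A minimal sketch of the chamber-flux calculation described above is given below: the flux follows from the initial rate of concentration rise in the closed chamber, F = (dC/dt) · V / A, converted to molar units with the ideal gas law. The function, unit choices, and example values are illustrative assumptions, not Picarro's processing code.

        # Illustrative chamber-flux calculation (not the instrument's processing code).
        # Flux = (dC/dt) * V / A, where dC/dt is the slope of the chamber concentration
        # time series; values below are placeholders.
        import numpy as np

        def chamber_flux(times_s, conc_ppm, volume_m3, area_m2, p_pa=101325.0, t_k=293.15):
            """Return methane flux in micromol m^-2 s^-1 from a closed-chamber time series."""
            slope_ppm_per_s = np.polyfit(times_s, conc_ppm, 1)[0]     # linear fit: dC/dt
            molar_density = p_pa / (8.314 * t_k)                      # mol of air per m^3 (ideal gas)
            # ppm/s -> micromol CH4 per m^3 of air per s, then scale by V/A
            return slope_ppm_per_s * molar_density * volume_m3 / area_m2

        # Example with synthetic data:
        # t = np.arange(0, 120, 5.0)
        # c = 1.9 + 0.02 * t            # ppm rising at 0.02 ppm/s
        # print(chamber_flux(t, c, volume_m3=0.01, area_m2=0.05))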

  9. Identification of various cell culture models for the study of Zika virus

    PubMed Central

    Himmelsbach, Kiyoshi; Hildt, Eberhard

    2018-01-01

    AIM To identify cell culture models supportive of Zika virus (ZIKV) replication. METHODS Various human and non-human cell lines were infected with a defined amount of the ZIKV Polynesia strain. Cells were analyzed 48 h post infection for the amount of intracellular and extracellular viral genomes and infectious viral particles by quantitative real-time PCR and virus titration assay. The extent of replication was monitored by immunofluorescence and western blot analysis using Env and NS1 specific antibodies. Innate immunity was assayed by luciferase reporter assay and immunofluorescence analysis. RESULTS All investigated cell lines except CHO cells supported infection, replication and release of ZIKV. While a pronounced cytopathic effect was observed in infected A549 and Vero cells, COS7, 293T and Huh7.5 cells were most resistant. Although the analyzed cell lines released comparable amounts of viral genomes to the supernatant, significant differences were found in the number of infectious viral particles. The neuronal cell lines N29.1 and SH-SY5Y released 100 times fewer infectious viral particles than Vero, A549 or 293T cells. However, there was no strict correlation between the amount of produced viral particles and the induction of an interferon response in the analyzed cell lines. CONCLUSION The investigated cell lines, with their different tissue origins and diverging ZIKV susceptibility, provide a toolbox for ZIKV research. PMID:29468137

  10. Detection of tiny amounts of fissile materials in large-sized containers with radioactive waste

    NASA Astrophysics Data System (ADS)

    Batyaev, V. F.; Skliarov, S. V.

    2018-01-01

    The paper is devoted to the non-destructive control of tiny amounts of fissile materials in large-sized containers filled with radioactive waste (RAW). The aim of this work is to model an active neutron interrogation facility for the detection of fissile materials inside NZK type containers with RAW and to determine the minimal detectable mass of U-235 as a function of various parameters: matrix type, nonuniformity of container filling, neutron generator parameters (flux, pulse frequency, pulse duration), and measurement time. As a result, the dependence of the minimal detectable mass on the location of the fissile material inside the container is shown. Nonuniformity of the thermal neutron flux inside a container is the main reason for the spatial heterogeneity of the minimal detectable mass inside a large-sized container. Our experiments with tiny amounts of uranium-235 (<1 g) confirm the detection of fissile materials in NZK containers using the active neutron interrogation technique.

  11. Committed emissions from existing and planned power plants and asset stranding required to meet the Paris Agreement

    NASA Astrophysics Data System (ADS)

    Pfeiffer, Alexander; Hepburn, Cameron; Vogt-Schilb, Adrien; Caldecott, Ben

    2018-05-01

    Over the coming decade, the power sector is expected to invest ~7.2 trillion USD in power plants and grids globally, much of it into CO2-emitting coal and gas plants. These assets typically have long lifetimes and commit large amounts of (future) CO2 emissions. Here, we analyze the historic development of emission commitments from power plants and compare the emissions committed by current and planned plants with remaining carbon budgets. Based on this comparison we derive the likely amount of stranded assets that would be required to meet the 1.5 °C–2 °C global warming goal. We find that even though the growth of emission commitments has slowed down in recent years, currently operating generators still commit us to emissions (~300 GtCO2) above the levels compatible with the average 1.5 °C–2 °C scenario (~240 GtCO2). Furthermore, the current pipeline of power plants would add almost the same amount of additional commitments (~270 GtCO2). Even if the entire pipeline was cancelled, therefore, ~20% of global capacity would need to be stranded to meet the climate goals set out in the Paris Agreement. Our results can help companies and investors re-assess their investments in fossil-fuel power plants, and policymakers strengthen their policies to avoid further carbon lock-in.
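
    A back-of-envelope check of the ~20% figure, using only the rounded numbers quoted in the abstract and assuming stranded capacity scales linearly with committed emissions, is sketched below.

        # Back-of-envelope check of the ~20% stranding figure using the abstract's rounded numbers.
        # Assumes stranded capacity scales linearly with committed emissions.
        committed_existing = 300.0   # GtCO2 committed by currently operating generators
        pipeline          = 270.0    # GtCO2 committed by the current plant pipeline
        budget_1p5_2c     = 240.0    # GtCO2 compatible with the average 1.5-2 degC scenario

        excess_without_pipeline = committed_existing - budget_1p5_2c
        share_stranded = excess_without_pipeline / committed_existing
        print(f"Excess commitments: {excess_without_pipeline:.0f} GtCO2 "
              f"(~{share_stranded:.0%} of commitments from existing plants)")   # ~20%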

  12. Structural features and complement-fixing activity of pectin from three Brassica oleracea varieties: white cabbage, kale, and red kale.

    PubMed

    Samuelsen, Anne Berit; Westereng, Bjørge; Yousif, Osman; Holtekjølen, Ann Katrin; Michaelsen, Terje E; Knutsen, Svein H

    2007-02-01

    Leaves of different cabbage species are used both as food and as wound healing remedies in traditional medicine. This supposed wound healing activity might be connected to the presence of immunomodulating water-soluble polysaccharides. To study this, three different cabbage varieties, white cabbage (W), kale (K), and red kale (RK), were pretreated with 80% ethanol and then extracted with water at 50 degrees C and 100 degrees C for isolation of polysaccharide-containing fractions. The fractions were analyzed for monosaccharide composition, glycosidic linkages, Mw distribution, protein content, and phenolic compounds and then tested for complement-fixing activity. All fractions contained pectin type polysaccharides with linkages corresponding to homogalacturonan and hairy regions. Those extracted at 50 degrees C contained higher amounts of neutral side chains and were more active in the complement-fixation test than those extracted at 100 degrees C. The fractions can be ranked by decreasing activity: K-50 > RK-50 > W-50 ≈ K-100 > RK-100 ≈ W-100. Studies on structure-activity relationships (SAR) employing multivariate statistical analysis strongly suggest that the magnitude of the measured activity is influenced by the content of certain side chains in the polymers. High activity correlates with large neutral side chains with high amounts of (1→6)- and (1→3,6)-linked Gal and low amounts of (1→4)-linked GalA, but not with the molecular weight distribution of the polymers.

  13. The effect of the polymerization initiator and light source on the elution of residual Bis-GMA and TEGDMA monomers: A study using liquid chromatography with UV detection.

    PubMed

    Denis, Aline B; Diagone, Cristina A; Plepis, Ana M G; Viana, Rommel B

    2015-12-05

    A method for the extraction and quantification of two residual monomers, bisphenol glycidyl dimethacrylate (Bis-GMA) and triethylene glycol dimethacrylate (TEGDMA), by high-performance liquid chromatography with UV detection was developed and validated in this study. Three types of solvents were applied in the extraction of the monomers (methanol, ethanol and acetonitrile), with the highest extraction efficiency obtained using acetonitrile. The different resins were prepared by photoactivation of Bis-GMA and TEGDMA monomers. Additionally, the effects of the addition of two photoinitiators (camphorquinone (CQ) and phenyl propanedione (PPD)) and of a co-initiator (N,N-dimethyl-p-toluidine) were also analyzed. When only the CQ photoinitiator was used, a smaller amount of residual monomers was obtained, whereas a larger amount was obtained with PPD. When the two photoinitiators were used in the same matrix, however, no significant changes were observed in the amount of residual TEGDMA monomers. With the addition of the co-initiator, there were no large changes in the extraction of residual monomers. The two photoactivation sources (halogen lamp and LED) led to small differences in the elution of the two monomers, although all of the resins differed significantly when photoactivated with a LED. Quantum chemical calculations using Density Functional Theory were carried out to characterize several molecular properties of each monomer. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Aggregating available soil water holding capacity data for crop yield models

    NASA Technical Reports Server (NTRS)

    Seubert, C. E.; Daughtry, C. S. T.; Holt, D. A.; Baumgardner, M. F.

    1984-01-01

    The total amount of water available to plants that is held against gravity in a soil is usually estimated as the amount present at -0.03 MPa average water potential minus the amount present at -1.5 MPa water potential. This value, designated available water-holding capacity (AWHC), is a very important soil characteristic that is strongly and positively correlated with the inherent productivity of soils. In various applications, including assessing soil moisture status over large areas, it is necessary to group soil types or series as to their productivity. Current methods to classify the AWHC of soils consider only the total capacity of soil profiles and thus may group together soils which differ greatly in AWHC as a function of depth in the profile. A general approach for evaluating quantitatively the multidimensional nature of AWHC in soils is described. Data for 902 soil profiles, representing 184 soil series, in Indiana were obtained from the Soil Characterization Laboratory at Purdue University. The AWHC for each of ten 150-mm layers in each soil was established, based on soil texture and parent material. A multivariate clustering procedure was used to classify each soil profile into one of 4, 8, or 12 classes based upon the ten-dimensional AWHC values. The optimum number of classes depends on the range of AWHC in the population of soil profiles analyzed and on the sensitivity of a crop to differences in the distribution of water within the soil profile.
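
    A hedged sketch of this kind of multivariate grouping is shown below, using k-means from scikit-learn on ten-dimensional layer-wise AWHC vectors; the original study's clustering procedure is not specified here, and the synthetic data are placeholders.

        # Hedged sketch of grouping soil profiles by layer-wise AWHC with k-means;
        # the original study's multivariate clustering procedure is not reproduced here.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        # Synthetic stand-in: 902 profiles x 10 layers of available water (mm per 150-mm layer)
        awhc = rng.uniform(5, 35, size=(902, 10))

        for n_classes in (4, 8, 12):
            labels = KMeans(n_clusters=n_classes, n_init=10, random_state=0).fit_predict(awhc)
            counts = np.bincount(labels, minlength=n_classes)
            print(n_classes, "classes, profiles per class:", counts)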

  15. Seasonal variation in drifting eggs and larvae in the upper Yangtze, China.

    PubMed

    Jiang, Wei; Liu, Huan-Zhang; Duan, Zhong-Hua; Cao, Wen-Xuan

    2010-05-01

    From 5 March to 25 July 2008, ichthyoplankton drifting into the Three Gorges Reservoir from the upper reaches of the Yangtze River were sampled daily to investigate the species composition, abundance, and seasonal variation in early-stage fishes in this area. Twenty-eight species belonging to five orders and 17 families or subfamilies were identified by analyzing fish eggs and larvae, and a total of 14.16 billion individuals were estimated to have drifted through the sampling section during the investigation. Among the ichthyoplankton sampled, species in Cultrinae, Cobitidae, Gobioninae and Gobiidae, along with the common carp (Cyprinus carpio Linnaeus), comprised 89.6% of the total amount. Six peaks of drift density were identified during the sampling period, and a significant correlation was found between drift density and water discharge. The dominant species were different in each drift peak, indicating different spawning times for the major species. The total amount of the four major Chinese carps that drifted through the sampling section was estimated as 0.88 billion, indicating an increase in the population sizes of these species in the upper reaches of the Yangtze River after construction of the Three Gorges Dam. In fact, these reaches have become the largest spawning area for the four major Chinese carps in the Yangtze River. The large total amount of eggs and larvae drifting through this section demonstrated that the upper reaches of the Yangtze River provided important spawning sites for many fish species, and that conservation of this area should be of great concern.

  16. A Rational Approach for Discovering and Validating Cancer Markers in Very Small Samples Using Mass Spectrometry and ELISA Microarrays

    DOE PAGES

    Zangar, Richard C.; Varnum, Susan M.; Covington, Chandice Y.; ...

    2004-01-01

    Identifying useful markers of cancer can be problematic due to limited amounts of sample. Some samples such as nipple aspirate fluid (NAF) or early-stage tumors are inherently small. Other samples such as serum are collected in larger volumes but archives of these samples are very valuable and only small amounts of each sample may be available for a single study. Also, given the diverse nature of cancer and the inherent variability in individual protein levels, it seems likely that the best approach to screen for cancer will be to determine the profile of a battery of proteins. As a result, a major challenge in identifying protein markers of disease is the ability to screen many proteins using very small amounts of sample. In this review, we outline some technological advances in proteomics that greatly advance this capability. Specifically, we propose a strategy for identifying markers of breast cancer in NAF that utilizes mass spectrometry (MS) to simultaneously screen hundreds or thousands of proteins in each sample. The best potential markers identified by the MS analysis can then be extensively characterized using an ELISA microarray assay. Because the microarray analysis is quantitative and large numbers of samples can be efficiently analyzed, this approach offers the ability to rapidly assess a battery of selected proteins in a manner that is directly relevant to traditional clinical assays.

  17. Radiation therapy and internet - what can patients expect? homepage analysis of german radiotherapy institutions.

    PubMed

    Janssen, Stefan; Meyer, Andreas; Vordermark, Dirk; Steinmann, Diana

    2010-12-01

    The internet has emerged as a source of medical information in recent years. There is a confusing number of medical websites with a great diversity of quality. Websites of radiotherapy institutions could offer a safe and easy-to-control way to address patients' requests. A total of 205 internet presences of German radiotherapy institutions were analyzed in June 2009 (nonuniversity hospitals n = 108, medical practices n = 62, university hospitals n = 35). For the evaluation of each homepage, verifiable criteria concerning basic information, service, and medical issues were used. The quality of information published via the internet by different radiotherapy institutions showed a large variety. Basic information such as telephone numbers, operating hours, and directions was provided in 96.7%, 40%, and 50.7% of cases, respectively. Of the websites, 85% introduced the staff, 50.2% supplied photos, and 14% provided further information on the attending physicians. The mean number of links to other websites was 5.4, and the mean number of articles supplying medical information for patients was 4.6. Medical practices and university hospitals had statistically significantly more informative articles and links to other websites than nonuniversity hospitals. No statistically significant differences could be found in most other categories, such as service issues and basic information. Internet presences of radiotherapy institutions offer the chance to supply patients with professional and individualized medical information. While some websites are already using this opportunity, others show a lack of basic information or of user-friendliness.

  18. Spectral Characterization of the Wave Energy Resource for Puerto Rico (PR) and the United States Virgin Islands (USVI)

    NASA Astrophysics Data System (ADS)

    Garcia, C. G.; Canals, M.; Irizarry, A. A.

    2016-02-01

    In recent years, a significant number of wave energy assessments have been carried out, driven by the development of ocean energy markets worldwide. The energy contained in surface gravity waves is distributed across frequency components that can be described using wave spectra. Correspondingly, the characterization and quantification of harvestable wave energy is inherently dictated by the nature of the two-dimensional wave spectrum. The present study uses spectral wave data from the operational SWAN-based CariCOOS Nearshore Wave Model to evaluate the capture efficiency of multiple wave energy converters (WECs). The study revolves around accurately estimating available wave energy as a function of varying spectral distributions, providing detailed insight into local wave conditions for PR and the USVI and into the resulting available-energy to generated-power ratio. In particular, the results provide a comprehensive characterization of three years' worth of SWAN-based datasets by outlining where higher concentrations of wave energy are localized in the spectrum. Subsequently, these datasets were processed to quantify the amount of energy incident on two proposed sites located in PR and the USVI. Results were largely influenced by local trade wind activity, which drives the predominant sea states, and by the amount of North Atlantic swell that propagates towards the region. Each wave event was numerically analyzed in the frequency domain to evaluate the capacity of a WEC to perform under different spectral distribution scenarios, allowing a correlation between electrical power output and spectral energy distribution to be established.
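
    As a hedged illustration of turning a frequency spectrum into harvestable power, the sketch below evaluates the deep-water omnidirectional energy flux P = ρ g ∫ c_g(f) S(f) df with c_g(f) = g/(4πf). The toy spectrum is a placeholder, not CariCOOS/SWAN model output, and the deep-water assumption is a simplification.

        # Hedged sketch: deep-water wave energy flux from a 1-D frequency spectrum,
        # P = rho * g * integral( c_g(f) * S(f) df ), with c_g(f) = g / (4*pi*f).
        # The spectrum below is a placeholder, not CariCOOS/SWAN model output.
        import numpy as np

        def wave_power_per_metre(freq_hz, spectrum_m2_per_hz, rho=1025.0, g=9.81):
            cg = g / (4.0 * np.pi * freq_hz)                               # deep-water group velocity (m/s)
            return rho * g * np.trapz(cg * spectrum_m2_per_hz, freq_hz)   # W per metre of wave crest

        # freqs = np.linspace(0.05, 0.5, 50)
        # spec = np.exp(-((freqs - 0.1) / 0.03) ** 2)        # toy swell peak near a 10 s period
        # print(wave_power_per_metre(freqs, spec), "W/m")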

  19. Sound Levels in East Texas Schools.

    ERIC Educational Resources Information Center

    Turner, Aaron Lynn

    A survey of sound levels was taken in several Texas schools to determine the amount of noise and sound present by size of class, type of activity, location of building, and the presence of air conditioning and large amounts of glass. The data indicate that class size and relative amounts of glass have no significant bearing on the production of…

  20. What Determines the Amount Students Borrow? Revisiting the Crisis-Convenience Debate

    ERIC Educational Resources Information Center

    Hart, Natala K.; Mustafa, Shoumi

    2008-01-01

    Recent studies have questioned the wisdom in blaming college costs for the escalation of student loans. It would appear that less affluent students borrow large amounts because inexpensive subsidized loans are available. This study attempted to verify the claim, estimating a model of the amount of loan received by students as a function of net…

  1. The metabolism of testosterone by dermal papilla cells cultured from human pubic and axillary hair follicles concurs with hair growth in 5 alpha-reductase deficiency.

    PubMed

    Hamada, K; Thornton, M J; Laing, I; Messenger, A G; Randall, V A

    1996-05-01

    Androgens regulate the growth of many human hair follicles, but only pubic, axillary, and scalp hair growth occur in men with 5 alpha-reductase deficiency. This suggests that 5 alpha-dihydrotestosterone is the active intracellular androgen in androgen-dependent follicles, except in the axilla and pubis. Since the dermal papilla plays a major regulatory role in hair follicles and may be the site of androgen action, we have investigated androgen metabolism in six primary lines of cultured dermal papilla cells from pubic and axillary hair follicles; previous studies have shown that beard cells take up and metabolize testosterone, retaining and secreting 5 alpha-dihydrotestosterone. After 24 h preincubation in serum-free Eagle's medium 199, 100-mm dishes of confluent cells were incubated for 2 h with 5 nM [1,2,6,7-3H]testosterone. Media were collected and the cells washed with phosphate-buffered saline and extracted with chloroform: methanol (2:1). After the addition of unlabeled and 14C-labeled marker steroids, the extracts were analyzed by a two-step thin-layer chromatography system; steroid identity was confirmed by recrystallization to a constant 3H/14C ratio. Beard and pubic dermal papilla cells were also incubated for 24 h, and the medium was analyzed at various times. The results from pubic and axillary primary cell lines were similar. In both cells and media the major steroid identified was testosterone, but significant amounts of androstenedione were present, indicating 17 beta-hydroxysteroid dehydrogenase activity; androstenedione was also identified within the cells, but a small amount of 5 alpha-dihydrotestosterone was only identified in one pubic cell line. Beard dermal papilla cells secreted large amounts of 5 alpha-dihydrotestosterone into the medium over 24 h in contrast to pubic cells, which produced only very small amounts. The pubic and axillary cell results contrast with the observations of pronounced 5 alpha-dihydrotestosterone in beard cells and confirm that androgen metabolism in cultured dermal papilla cells reflects the parent follicle's ability to respond to androgen in the absence of 5 alpha-reductase type II in vivo. This supports our hypothesis that androgen acts on hair follicles via the dermal papilla and suggests that cultured dermal papilla cells may offer an important model system for studies of androgen action.

  2. Aspects regarding at 13C isotope separation column control using Petri nets system

    NASA Astrophysics Data System (ADS)

    Boca, M. L.; Ciortea, M. E.

    2015-11-01

    This paper is intended to show that Petri nets are also applicable in the chemical industry. Linear programming and Petri-net-based modeling, especially of discrete event systems, were used for isotopic separation, with the purpose of representing and controlling events in real time through graphical representations. In this paper, the control of a 13C isotope separation column is simulated using Petri nets. The major problem with 13C comes from the difficulty of obtaining it and of raising its natural fraction. Carbon isotopes can be obtained using many methods, one of them being the cryogenic distillation of carbon monoxide. Only a few aspects regarding the operating conditions and the construction of such cryogenic plants are known today, and even less information is available as far as the modeling and control of the separation process are concerned. In fact, efficient control of the carbon monoxide distillation process is a necessity for large-scale 13C production. For a classic distillation process, several models for carbon isotope separation have been proposed, some based on mass, component, and energy balance equations, others on nonlinear wave theory or the Cohen equations. Petri nets were used to model the system because it is a discrete event system. In the non-timed model and in the model with auxiliary times, the transport stream was divided into sections, and these sections were analyzed successively. Because of the complexity of the system and the large amount of computation required, it was not possible to analyze the system as a unitary whole; a first attempt to model the entire system led to the model blocking during simulation because of the long processing times.
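
    A minimal token-game sketch of the kind of Petri net model referred to above (places, transitions, and a firing rule) is given below; the places and the transition represent invented column sections, not the authors' actual model of the separation column.

        # Minimal Petri-net token game (places, transitions, firing rule); the section
        # names below are invented placeholders, not the authors' column model.
        def enabled(marking, pre):
            return all(marking[p] >= w for p, w in pre.items())

        def fire(marking, pre, post):
            m = dict(marking)
            for p, w in pre.items():
                m[p] -= w                      # consume tokens from input places
            for p, w in post.items():
                m[p] = m.get(p, 0) + w         # produce tokens in output places
            return m

        # Two column sections modelled as places; a transition moves a "batch" downstream.
        marking = {"section_1": 1, "section_2": 0}
        t_transfer = ({"section_1": 1}, {"section_2": 1})     # (pre-set, post-set)

        if enabled(marking, t_transfer[0]):
            marking = fire(marking, *t_transfer)
        print(marking)    # {'section_1': 0, 'section_2': 1}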

  3. Stress-induced microcrack density evolution in β-eucryptite ceramics: Experimental observations and possible route to strain hardening

    DOE PAGES

    Müller, B. R.; Cooper, R. C.; Lange, A.; ...

    2017-11-01

    In order to investigate their microcracking behaviour, the microstructures of several β-eucryptite ceramics, obtained from glass precursor and cerammed to yield different grain sizes and microcrack densities, were characterized by laboratory and synchrotron x-ray refraction and tomography. Here, results were compared with those obtained from scanning electron microscopy (SEM). In SEM images, the characterized materials appeared fully dense, but computed tomography showed the presence of pore clusters. Uniaxial tensile testing was performed on specimens while strain maps were recorded and analyzed by Digital Image Correlation (DIC). X-ray refraction techniques were applied on specimens before and after tensile testing to measure the amount of internal specific surface (i.e., area per unit volume). X-ray refraction revealed that (a) the small grain size (SGS) material contained a large specific surface, originating from the grain boundaries and the interfaces of TiO2 precipitates; (b) the medium (MGS) and large grain size (LGS) materials possessed higher amounts of specific surface compared to the SGS material due to microcracks, which decreased after tensile loading; (c) the precursor glass had negligible internal surface. The unexpected decrease in the internal surface of MGS and LGS after tensile testing is explained by the presence of compressive regions in the DIC strain maps and further by theoretical arguments. It is suggested that while some microcracks merge via propagation, more of them close mechanically, thereby explaining the observed X-ray refraction results. Lastly, the mechanisms proposed would allow the development of a strain hardening route in ceramics.

  4. Image processing for improved eye-tracking accuracy

    NASA Technical Reports Server (NTRS)

    Mulligan, J. B.; Watson, A. B. (Principal Investigator)

    1997-01-01

    Video cameras provide a simple, noninvasive method for monitoring a subject's eye movements. An important concept is that of the resolution of the system, which is the smallest eye movement that can be reliably detected. While hardware systems are available that estimate direction of gaze in real-time from a video image of the pupil, such systems must limit image processing to attain real-time performance and are limited to a resolution of about 10 arc minutes. Two ways to improve resolution are discussed. The first is to improve the image processing algorithms that are used to derive an estimate. Off-line analysis of the data can improve resolution by at least one order of magnitude for images of the pupil. A second avenue by which to improve resolution is to increase the optical gain of the imaging setup (i.e., the amount of image motion produced by a given eye rotation). Ophthalmoscopic imaging of retinal blood vessels provides increased optical gain and improved immunity to small head movements but requires a highly sensitive camera. The large number of images involved in a typical experiment imposes great demands on the storage, handling, and processing of data. A major bottleneck had been the real-time digitization and storage of large amounts of video imagery, but recent developments in video compression hardware have made this problem tractable at a reasonable cost. Images of both the retina and the pupil can be analyzed successfully using a basic toolbox of image-processing routines (filtering, correlation, thresholding, etc.), which are, for the most part, well suited to implementation on vectorizing supercomputers.
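
    As a hedged sketch of the kind of off-line estimate discussed above, the code below thresholds the dark pupil region of a grayscale frame and returns its weighted centroid; averaging over many pixels is what allows sub-pixel sensitivity. The threshold value and the way the frame is obtained are assumptions, not the paper's calibrated pipeline.

        # Hedged sketch of a basic off-line pupil-centre estimate: threshold the dark
        # pupil region and take its intensity-weighted centroid. The threshold and the
        # way the frame is obtained are assumptions, not the paper's calibrated method.
        import numpy as np

        def pupil_centre(frame, threshold=40):
            """frame: 2-D grayscale array (0-255); returns (row, col) of the pupil centroid."""
            mask = frame < threshold                      # pupil is the darkest region
            if not mask.any():
                return None
            rows, cols = np.nonzero(mask)
            weights = (threshold - frame[rows, cols]).astype(float)   # darker pixels weigh more
            return (np.average(rows, weights=weights), np.average(cols, weights=weights))

        # Sub-pixel resolution comes from averaging many pixels: a small eye rotation shifts
        # the centroid by a fraction of a pixel, which this estimate can still resolve.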

  5. FUn: a framework for interactive visualizations of large, high-dimensional datasets on the web.

    PubMed

    Probst, Daniel; Reymond, Jean-Louis

    2018-04-15

    During the past decade, big data have become a major tool in scientific endeavors. Although statistical methods and algorithms are well-suited for analyzing and summarizing enormous amounts of data, the results do not allow for a visual inspection of the entire dataset. Current scientific software, including R packages and Python libraries such as ggplot2, matplotlib and plot.ly, does not support interactive visualizations of datasets exceeding 100 000 data points on the web. Other solutions enable the web-based visualization of big data only through data reduction or statistical representations. However, recent hardware developments, especially advancements in graphical processing units, allow for the rendering of millions of data points on a wide range of consumer hardware such as laptops, tablets and mobile phones. Similar to the challenges and opportunities brought to virtually every scientific field by big data, the visualization of and interaction with copious amounts of data are both demanding and hold great promise. Here we present FUn, a framework consisting of a client (Faerun) and server (Underdark) module, facilitating the creation of web-based, interactive 3D visualizations of large datasets and enabling record-level visual inspection. We also introduce a reference implementation providing access to SureChEMBL, a database containing patent information on more than 17 million chemical compounds. The source code and the most recent builds of Faerun and Underdark, Lore.js and the data preprocessing toolchain used in the reference implementation are available on the project website (http://doc.gdb.tools/fun/). Contact: daniel.probst@dcb.unibe.ch or jean-louis.reymond@dcb.unibe.ch.

  6. Thermochemical pretreatment of lignocellulose residues: assessment of the effect on operational conditions and their interactions on the characteristics of leachable fraction.

    PubMed

    Vásquez, Denisse; Contreras, Elsa; Palma, Carolyn; Carvajal, Andrea

    2015-01-01

    Annually, large amounts of agricultural residues are produced in Chile, which can be turned into a good opportunity to diversify the energy matrix. These residues have a slow hydrolysis stage during anaerobic digestion; therefore, the application of a pretreatment seems to be an alternative to improve the process. This work focused on applying a thermochemical pretreatment with NaOH to two lignocellulosic residues. The experiments were performed according to a 2⁴ factorial design. The factors studied were: temperature (60 and 120 °C), pretreatment time (10 and 30 minutes), NaOH dose (2 and 4%), and residue size (<1 and 1-3 mm for wheat straw; 1-5 and 5-10 mm for corn stover). The analyzed response variables were the solubilization of organic matter and the biodegradability of the lignocellulose hydrolysate. The statistical analysis of the data allowed the identification of the experimental conditions that maximized the solubilization of organic matter and the biodegradability. The main results showed that more aggressive experimental conditions could increase the breakdown of the structure; in addition, the pretreatment time was not significant. Conversely, the less aggressive experimental conditions, regarding reagent dosage and particle size reduction, favored the release of biodegradable organic matter. The main conclusion of this study was the identification of the operational conditions of the thermochemical pretreatment that promote maximum biogas production, which was due to the solubilization of a large amount of organic matter rather than to an increase in the biodegradability of the released organic matter.
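
    The 2⁴ factorial design described above can be written out explicitly; the sketch below generates the 16 experimental conditions with itertools.product, using the wheat-straw levels quoted in the abstract.

        # Sketch of the 2^4 full factorial design described above, generated with
        # itertools.product; levels are taken from the abstract (wheat-straw sizes shown).
        from itertools import product

        factors = {
            "temperature_C":    [60, 120],
            "time_min":         [10, 30],
            "naoh_dose_pct":    [2, 4],
            "particle_size_mm": ["<1", "1-3"],
        }

        runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
        print(len(runs), "runs")          # 16 experimental conditions
        for run in runs[:3]:
            print(run)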

  7. Nonessential role of beta3 and beta5 integrin subunits for efficient clearance of cellular debris after light-induced photoreceptor degeneration.

    PubMed

    Joly, Sandrine; Samardzija, Marijana; Wenzel, Andreas; Thiersch, Markus; Grimm, Christian

    2009-03-01

    During light-induced photoreceptor degeneration, large amounts of cellular debris are formed that must be cleared from the subretinal space. The integrins alphavbeta5 and alphavbeta3 are involved in the normal physiological process of phagocytosis in the retina. This study was conducted to investigate the question of whether the lack of beta5 and/or beta3 integrin subunits might influence the course of retinal degeneration and/or clearance of photoreceptor debris induced by acute exposure to light. Wild-type, beta5(-/-) and beta3(-/-) single-knockout, and beta3(-/-)/beta5(-/-) and Ccl2(-/-)/beta5(-/-) double-knockout mice were exposed to 13,000 lux of white light for 2 hours to induce severe photoreceptor degeneration. Real-time PCR and Western blot analysis were used to analyze gene and protein expression, light- and electron microscopy to judge retinal morphology, and immunofluorescence to study retinal distribution of proteins. Individual or combined deletion of beta3 and beta5 integrin subunits did not affect the pattern of photoreceptor cell loss or the clearance of photoreceptor debris in mice compared with that in wild-type mice. Invading macrophages may contribute to efficient phagocytosis. However, ablation of the MCP-1 gene did not prevent macrophage recruitment. Several chemokines in addition to MCP-1 were induced after light-induced damage that may have compensated for the deletion of MCP-1. Acute clearance of a large amount of cellular debris from the subretinal space involves invading macrophages and does not depend on beta3 and beta5 integrins.

  8. Stress-induced microcrack density evolution in β-eucryptite ceramics: Experimental observations and possible route to strain hardening

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Müller, B. R.; Cooper, R. C.; Lange, A.

    In order to investigate their microcracking behaviour, the microstructures of several β-eucryptite ceramics, obtained from glass precursor and cerammed to yield different grain sizes and microcrack densities, were characterized by laboratory and synchrotron x-ray refraction and tomography. Here, results were compared with those obtained from scanning electron microscopy (SEM). In SEM images, the characterized materials appeared fully dense, but computed tomography showed the presence of pore clusters. Uniaxial tensile testing was performed on specimens while strain maps were recorded and analyzed by Digital Image Correlation (DIC). X-ray refraction techniques were applied on specimens before and after tensile testing to measure the amount of internal specific surface (i.e., area per unit volume). X-ray refraction revealed that (a) the small grain size (SGS) material contained a large specific surface, originating from the grain boundaries and the interfaces of TiO2 precipitates; (b) the medium (MGS) and large grain size (LGS) materials possessed higher amounts of specific surface compared to the SGS material due to microcracks, which decreased after tensile loading; (c) the precursor glass had negligible internal surface. The unexpected decrease in the internal surface of MGS and LGS after tensile testing is explained by the presence of compressive regions in the DIC strain maps and further by theoretical arguments. It is suggested that while some microcracks merge via propagation, more of them close mechanically, thereby explaining the observed X-ray refraction results. Lastly, the mechanisms proposed would allow the development of a strain hardening route in ceramics.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qian, Yun; Gong, Daoyi; Fan, Jiwen

    Long-term observational data reveal that both the frequency and amount of light rain have decreased in eastern China (EC) for 1956-2005 with high spatial coherency. This is different from the trend of total rainfall observed in EC, which decreases in northern EC and increases in southern EC. To examine the cause of the light rain trends, we analyzed the long-term variability of atmospheric water vapor and its correlation with light rain events. Results show very weak relationships between large-scale moisture transport and light rain in EC. This suggests that the light rain trend in EC is not driven by large-scale circulation changes. Because of human activities, pollutant emission has increased dramatically in China over the last few decades, leading to significant reductions in visibility between 1960 and 2000. Cloud-resolving model simulations show that aerosols corresponding to heavily polluted conditions can significantly increase the cloud droplet number concentration (CDNC) and reduce droplet sizes compared to pristine conditions. This can lead to a significant decline in raindrop concentration and delay raindrop formation because smaller cloud droplets are less efficient in the collision and coalescence processes. Together with weaker convection, the precipitation frequency and amount are significantly reduced in the polluted case. Satellite data also reveal higher CDNC and smaller droplet size over polluted land in EC relative to pristine regions, which is consistent with the model results. This evidence suggests that the significantly increased aerosol particles produced by air pollution are at least partly responsible for the decreased light rain events observed in China over the past fifty years.

  10. Effective and efficient analysis of spatio-temporal data

    NASA Astrophysics Data System (ADS)

    Zhang, Zhongnan

    Spatio-temporal data mining, i.e., mining knowledge from large amounts of spatio-temporal data, is a highly demanding field because huge amounts of spatio-temporal data have been collected in various applications, ranging from remote sensing to geographical information systems (GIS), computer cartography, and environmental assessment and planning. The collected data far exceed the human ability to analyze them, which makes it crucial to develop automated analysis tools. Recent studies have extended the scope of data mining from relational and transactional datasets to spatial and temporal datasets. Among the various forms of spatio-temporal data, remote sensing images play an important role, due to the growing number of Earth-observing satellites. In this dissertation, we propose two approaches to analyze remote sensing data. The first applies association rule mining to image processing. Each image was divided into a number of image blocks, and a spatial relationship was built among these blocks during the dividing process. Because the images were acquired as a time series, this turned a large number of images into a spatio-temporal dataset. The second approach discovers co-occurrence patterns from these images; the generated patterns represent subsets of spatial features that are located together in space and time. A weather analysis is composed of individual analyses of several meteorological variables. These variables include temperature, pressure, dew point, wind, clouds, visibility, and so on. Local-scale models provide detailed analysis and forecasts of meteorological phenomena ranging from a few kilometers to about 100 kilometers in size. When some of the above meteorological variables show particular change tendencies, severe weather will follow in most cases. Using association rule discovery, we found that changes in particular meteorological variables are tightly related to severe weather that occurs shortly afterwards. This dissertation is composed of three parts: an introduction, background and related work, and three contributions to the development of approaches for spatio-temporal data mining: the DYSTAL, STARSI, and COSTCOP+ algorithms.
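
    A hedged sketch of the basic support/confidence computation behind such association-rule mining is given below, treating the features observed in one image block at one time step as a transaction; the feature names are invented examples, not the dissertation's variables.

        # Hedged sketch of the support/confidence computation behind association-rule
        # mining on block-level transactions; the feature names are invented examples.
        def rule_stats(transactions, antecedent, consequent):
            antecedent, consequent = set(antecedent), set(consequent)
            n = len(transactions)
            n_ante = sum(1 for t in transactions if antecedent <= t)
            n_both = sum(1 for t in transactions if (antecedent | consequent) <= t)
            support = n_both / n
            confidence = n_both / n_ante if n_ante else 0.0
            return support, confidence

        # Each transaction = features observed in one image block at one time step.
        blocks = [
            {"pressure_drop", "humidity_rise", "storm"},
            {"pressure_drop", "humidity_rise"},
            {"pressure_drop", "storm"},
            {"humidity_rise"},
        ]
        print(rule_stats(blocks, {"pressure_drop", "humidity_rise"}, {"storm"}))  # (0.25, 0.5)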

  11. The Status of Pragmatics among Iranian EFL Learners

    ERIC Educational Resources Information Center

    Mohammad-Bagheri, Mehri

    2015-01-01

    The present study attempted to investigate the status of pragmatics among Iranian EFL learners. Status of pragmatics was analyzed in terms of the amount of pragmatic knowledge EFL learners believed to have and the amount of pragmatic knowledge they believed to receive from teachers, classmates, course books, and exams. Additionally, attempts were…

  12. [Monitoring of methyl methacrylate monomer released from autopolymerized denture base polymers during processing using time-of-flight mass spectrometer].

    PubMed

    Ma, Yu-juan; Cui, Hua-peng; Li, Hai-yang

    2011-04-01

    To analyze the amount and release tendency of methyl methacrylate (MMA) released from an autopolymerized denture base polymer (self-curing resin) during processing using a time-of-flight mass spectrometer (TOF-MS). Self-curing resin was mixed in a container using a ratio of 2 g of powder to 1 g of liquid, in accordance with the manufacturer's instructions, for 40 s as a specimen. The amount of MMA released from the specimen was continuously monitored and recorded every minute by TOF-MS, starting immediately after mixing. A total of five specimens were monitored. The amount of MMA increased dramatically at 11 min [(45.2 ± 3.5) mg/L] after mixing, reached the highest level at 13 min [(228.9 ± 22.6) mg/L], and then became stable at 23 min [(8.8 ± 2.3) mg/L] after mixing. The release tendency of MMA could be analyzed accurately with continuous monitoring during processing. The amount of MMA released from the self-curing resin changed rapidly, and the process was complex and variable.

  13. A Pipeline for Large Data Processing Using Regular Sampling for Unstructured Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berres, Anne Sabine; Adhinarayanan, Vignesh; Turton, Terece

    2017-05-12

    Large simulation data requires a lot of time and computational resources to compute, store, analyze, visualize, and run user studies on. Today, the largest cost of a supercomputer is not hardware but maintenance, in particular energy consumption. Our goal is to balance energy consumption and the cognitive value of visualizations of the resulting data. This requires us to go through the entire processing pipeline, from simulation to user studies. To reduce the amount of resources, data can be sampled or compressed. While this adds more computation time, the computational overhead is negligible compared to the simulation time. We built a processing pipeline using regular sampling as an example. The reasons for this choice are twofold: using a simple example reduces unnecessary complexity, as we know what to expect from the results; furthermore, it provides a good baseline for future, more elaborate sampling methods. We measured time and energy for each test we ran, and we conducted user studies on Amazon Mechanical Turk (AMT) for a range of different results we produced through sampling.
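
    A minimal sketch of the regular-sampling idea, assuming the unstructured data can be reduced to scattered 2-D node positions with a scalar value; scipy's griddata is used purely for illustration and is not the pipeline described in the abstract.

```python
import numpy as np
from scipy.interpolate import griddata

# Hypothetical unstructured data: scattered 2-D node positions with a scalar field.
rng = np.random.default_rng(0)
nodes = rng.uniform(0.0, 1.0, size=(5000, 2))
values = np.sin(4 * np.pi * nodes[:, 0]) * np.cos(4 * np.pi * nodes[:, 1])

# Regular sampling: evaluate the field on a coarse uniform grid, which is far smaller
# to store, analyze, and visualize than the original unstructured data.
nx = ny = 32
gx, gy = np.meshgrid(np.linspace(0.0, 1.0, nx), np.linspace(0.0, 1.0, ny))
sampled = griddata(nodes, values, (gx, gy), method="nearest")

print(sampled.shape)                                        # (32, 32) regular array
print(f"reduction: {values.size / sampled.size:.1f}x fewer samples")
```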

  14. Neutron-induced Backgrounds in 134Xe for Large-Scale Neutrinoless Double-Beta Decay Experiments

    NASA Astrophysics Data System (ADS)

    Moriguchi, Nina; Kidd, Mary; Tornow, Werner

    2016-09-01

    136Xe is used in large neutrinoless double-beta (0νββ) decay experiments, such as KamLAND-Zen and EXO 200. Though highly purified, 136Xe still contains a significant amount of 134Xe. Recently, a new nuclear energy level was found in 134Xe. If 134Xe decays from this proposed excited state, it will emit a 2485.7 keV gamma ray. Because this energy lies near the region of interest of 136Xe 0νββ decay experiments (Q value 2457.8 keV), it could make a significant contribution to the background. A purified gaseous sample of 134Xe will be irradiated with neutrons of an incident energy of 4.0 MeV at Triangle Universities Nuclear Laboratory and monitored with high-purity germanium detectors. The spectra obtained from these detectors will be analyzed for the presence of the 2581 keV gamma ray. We will report on the status of this experiment. Future plans include expanding this measurement to higher initial neutron energies. Tennessee Tech University CISE Grant program.

  15. Nuclear weapons tests and environmental consequences: a global perspective.

    PubMed

    Prăvălie, Remus

    2014-10-01

    The beginning of the atomic age marked the outset of nuclear weapons testing, which is responsible for the radioactive contamination of a large number of sites worldwide. The paper aims to analyze the nuclear weapons tests conducted in the second half of the twentieth century, highlighting the impact of radioactive pollution on the atmospheric, aquatic, and underground environments. Special attention was given to the concentrations of the main radioactive isotopes released, such as ¹⁴C, ¹³⁷Cs, and ⁹⁰Sr, which were generally stored in the atmosphere and the marine environment. In addition, an attempt was made to trace the spatial delimitation of the most heavily contaminated sites worldwide, and to note the human exposure that has caused a significantly increased incidence of thyroid cancer locally and regionally. The United States is an important example for assessing the correlation between the increase in the thyroid cancer incidence rate and continental-scale radioactive contamination with ¹³¹I, a radioactive isotope released in large amounts during the nuclear tests carried out at the main test site, in Nevada.

  16. Profiling and Improving I/O Performance of a Large-Scale Climate Scientific Application

    NASA Technical Reports Server (NTRS)

    Liu, Zhuo; Wang, Bin; Wang, Teng; Tian, Yuan; Xu, Cong; Wang, Yandong; Yu, Weikuan; Cruz, Carlos A.; Zhou, Shujia; Clune, Tom

    2013-01-01

    Exascale computing systems are soon to emerge, and they will pose great challenges due to the huge gap between computing and I/O performance. Many large-scale scientific applications play an important role in our daily life. The huge amounts of data generated by such applications require highly parallel and efficient I/O management policies. In this paper, we adopt a mission-critical scientific application, GEOS-5, as a case study to profile and analyze the communication and I/O issues that prevent applications from fully utilizing the underlying parallel storage systems. Through detailed architectural and experimental characterization, we observe that current legacy I/O schemes incur significant network communication overheads and are unable to fully parallelize the data access, thus degrading applications' I/O performance and scalability. To address these inefficiencies, we redesign its I/O framework along with a set of parallel I/O techniques to achieve high scalability and performance. Evaluation results on the NASA Discover cluster show that our optimization of GEOS-5 with ADIOS has led to significant performance improvements compared to the original GEOS-5 implementation.

  17. Study on Material Selection of Reactor Pressure Vessel of SCWR

    NASA Astrophysics Data System (ADS)

    Ma, Shuli; Luo, Ying; Yin, Qinwei; Li, Changxiang; Xie, Guofu

    This paper first analyzes the feasibility of SA-508 Grade 3 Class 1 steel as an alternative material for the Supercritical Water-Cooled Reactor (SCWR) Reactor Pressure Vessel (RPV). The application of this steel to the SCWR RPV is limited by its quenching (hardenability) properties, although the required large forgings could be accomplished by domestic manufacturers. Therefore, steels with higher strength and better quenching properties are needed for the SCWR RPV. The chemical composition of SA-508 Gr.3 Cl.2 steel is similar to that of SA-508 Gr.3 Cl.1 steel, and a more appropriate match of strength and toughness could be achieved by adjusting the element contents, as well as by proper control of tempering temperature and time. In light of the fact that Cl.2 steel has been successfully applied to steam generators, it could be an alternative material for the SCWR RPV. SA-508 Gr.4N steel, with high strength and good toughness, is another alternative material for the SCWR RPV. However, a large amount of research is still needed before its application, owing to the lack of data on welding, irradiation, etc.

  18. Blended Wing Body (BWB) Boundary Layer Ingestion (BLI) Inlet Configuration and System Studies

    NASA Technical Reports Server (NTRS)

    Kawai, Ronald T.; Friedman, Douglas M.; Serrano, Leonel

    2006-01-01

    A study was conducted to determine the potential reduction in fuel burned for BLI (boundary layer ingestion) inlets on a BWB (blended wing body) airplane employing AFC (active flow control). The BWB is a revolutionary airplane configuration with engines on the aft upper surface, where the thick boundary layer offers the greatest opportunity for ram drag reduction. AFC is an emerging technology for boundary layer control. Several BLI inlet configurations were analyzed with the NASA-developed RANS Overflow CFD code. The study determined that, while large reductions in ram drag result from BLI, lower inlet pressure recovery produces engine performance penalties that largely offset this ram drag reduction. AFC could, however, enable a short BLI inlet that allows surface mounting of the engine, which, when coupled with a short diffuser, would significantly reduce drag and weight for a potential 10% reduction in fuel burned. Continuing studies are therefore recommended to achieve this reduction in fuel burned, considering the use of more modest amounts of BLI coupled with both AFC and PFC (passive flow control) to produce a fail-operational system.

  19. Monitoring pathogens from irradiated agriculture products

    NASA Astrophysics Data System (ADS)

    Butterweck, Joseph S.

    The final food and environmental safety assessment of agriculture product irradiation can only be determined by product history. Product history will be used for future research and development, regulations, commercial practices and implementation of agriculture and food irradiation on a regional basis. The commercial irradiator treats large varieties and amounts of products that are used in various environments. It will, in time, generate a large database of product history. Field product monitoring begins when food irradiation progresses from the pilot/demonstration phase to the commercial phase. At that time, it is important that there be in place a monitoring system to collect and analyze field data. The systems managers, public health authorities and exotic disease specialists will use this information to assess the reduction of food pathogens on the populace and the environment. The common sources of monitoring data are as follows:
    1) Host Monitoring: a) Medical Diagnosis, b) Autopsy, c) Serology Surveys;
    2) Environmental Monitoring: a) Sentinel, b) Pest Surveys/Microbial Counts, c) Sanitary Inspections;
    3) Food Industries Quality Assurance Monitoring: a) End Product Inspection, b) Complaints, c) Continual Use of the Product.

  20. An evaluation of the suitability of ERTS data for the purposes of petroleum exploration. [Anadarko Basin of Texas and Oklahoma

    NASA Technical Reports Server (NTRS)

    Collins, R. J.; Mccown, F. P.; Stonis, L. P.; Petzel, G.; Everett, J. R.

    1974-01-01

    This experiment was designed to determine the types and amounts of information valuable to petroleum exploration extractable from ERTS data and the cost of obtaining the information using traditional or conventional means. It was desired that an evaluation of this new petroleum exploration tool be made in a geologically well known area in order to assess its usefulness in an unknown area. The Anadarko Basin lies in western Oklahoma and the panhandle of Texas. It was chosen as a test site because there is a great deal of published information available on the surface and subsurface geology of the area, and there are many known structures that act as traps for hydrocarbons. This basin is similar to several other large epicontinental sedimentary basins. It was found that ERTS imagery is an excellent tool for reconnaissance exploration of large sedimentary basins or new exploration provinces. For the first time, small and medium size oil companies can rapidly and effectively analyze exploration provinces as a whole.

  1. Computer literacy for life sciences: helping the digital-era biology undergraduates face today's research.

    PubMed

    Smolinski, Tomasz G

    2010-01-01

    Computer literacy plays a critical role in today's life sciences research. Without the ability to use computers to efficiently manipulate and analyze large amounts of data resulting from biological experiments and simulations, many of the pressing questions in the life sciences could not be answered. Today's undergraduates, despite the ubiquity of computers in their lives, seem to be largely unfamiliar with how computers are being used to pursue and answer such questions. This article describes an innovative undergraduate-level course, titled Computer Literacy for Life Sciences, that aims to teach students the basics of a computerized scientific research pursuit. The purpose of the course is for students to develop a hands-on working experience in using standard computer software tools as well as computer techniques and methodologies used in life sciences research. This paper provides a detailed description of the didactical tools and assessment methods used in and outside of the classroom as well as a discussion of the lessons learned during the first installment of the course taught at Emory University in fall semester 2009.

  2. The Diversity Present in 5140 Human Mitochondrial Genomes

    PubMed Central

    Pereira, Luísa; Freitas, Fernando; Fernandes, Verónica; Pereira, Joana B.; Costa, Marta D.; Costa, Stephanie; Máximo, Valdemar; Macaulay, Vincent; Rocha, Ricardo; Samuels, David C.

    2009-01-01

    We analyzed the current status (as of the end of August 2008) of human mitochondrial genomes deposited in GenBank, amounting to 5140 complete or coding-region sequences, in order to present an overall picture of the diversity present in the mitochondrial DNA of the global human population. To perform this task, we developed mtDNA-GeneSyn, a computer tool that identifies and exhaustively classifies the diversity present in large genetic data sets. The diversity observed in the 5140 human mitochondrial genomes was compared with all possible transitions and transversions from the standard human mitochondrial reference genome. This comparison showed that tRNA and rRNA secondary structures have a large effect in limiting the diversity of the human mitochondrial sequences, whereas for the protein-coding genes there is a bias toward less variation at the second codon positions. The analysis of the observed amino acid variations showed a tolerance of variations that convert between the amino acids V, I, A, M, and T. This defines a group of amino acids with similar chemical properties that can interconvert by a single transition. PMID:19426953

  3. Interactive and coordinated visualization approaches for biological data analysis.

    PubMed

    Cruz, António; Arrais, Joel P; Machado, Penousal

    2018-03-26

    The field of computational biology has become largely dependent on data visualization tools to analyze the increasing quantities of data gathered through the use of new and growing technologies. Aside from the volume, which often results in large amounts of noise and complex relationships with no clear structure, the visualization of biological data sets is hindered by their heterogeneity, as data are obtained from different sources and contain a wide variety of attributes, including spatial and temporal information. This requires visualization approaches that are able to not only represent various data structures simultaneously but also provide exploratory methods that allow the identification of meaningful relationships that would not be perceptible through data analysis algorithms alone. In this article, we present a survey of visualization approaches applied to the analysis of biological data. We focus on graph-based visualizations and tools that use coordinated multiple views to represent high-dimensional multivariate data, in particular time series gene expression, protein-protein interaction networks and biological pathways. We then discuss how these methods can be used to help solve the current challenges surrounding the visualization of complex biological data sets.

  4. Snow cover, snowmelt and runoff in the Himalayan River basins

    NASA Technical Reports Server (NTRS)

    Dey, B.; Sharma, V. K.; Goswami, D. C.; Rao, P. Subba

    1988-01-01

    Notwithstanding the seasonal vagaries of both rainfall amount and snowcover extent, the Himalayan rivers retain their basic perennial character. However, it is the snowmelt yield component that accounts for some 60 to 70 percent of the total annual flow volumes from Himalayan watersheds. The temporal performance of hydropower generation and major irrigation projects depends predominantly on this large hydropotential. The large-scale effects of Himalayan snowcover on the hydrologic responses of a few selected catchments in the western Himalayas were studied. The antecedent effects of snowcover area on long- and short-term meltwater yields can best be analyzed by developing appropriate hydrologic models that forecast the pattern of snowmelt as a function of variations in snowcover area. It is hoped that these models will be of practical value in the management of water resources. The predictability of meltwater for the entire snowmelt season was studied, as were the concurrent flow variation in adjacent watersheds and its hydrologic significance. The applicability of the Snowmelt-Runoff Model for real-time forecasting of daily discharges during the major part of the snowmelt season was also examined.

  5. Large Eddy Simulation study of the development of finite-channel lock-release currents at high Grashof numbers

    NASA Astrophysics Data System (ADS)

    Ooi, Seng-Keat

    2005-11-01

    Lock-exchange gravity current flows produced by the instantaneous release of a heavy fluid are investigated using well-resolved 3-D large eddy simulations (LES) at Grashof numbers up to 8×10^9. It is found that the 3-D simulations correctly predict a constant front velocity over the initial slumping phase and a front speed decrease proportional to t^(-1/3) (the time t is measured from the release) over the inviscid phase, in agreement with theory. The evolution of the current in the simulations is found to be similar to that observed experimentally by Hacker et al. (1996). The effect of the dynamic LES model on the solutions is discussed. The energy budget of the current is discussed, and the contribution of the turbulent dissipation to the total dissipation is analyzed. The limitations of less expensive 2-D simulations are discussed, in particular their failure to correctly predict the spatio-temporal distribution of the bed shear stress, which is important in determining the amount of sediment the gravity current can entrain when it advances over a loose bed.
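
    The t^(-1/3) front-speed decay in the inviscid phase can be recovered from a standard box-model scaling argument (a textbook result, not taken from this abstract): for a fixed-volume release of cross-sectional area A per unit width, a frontal Froude-number condition combined with volume conservation x_f h = A gives

```latex
% Box-model sketch: u_f = Fr * sqrt(g' h), with volume conservation x_f h = A.
\begin{aligned}
u_f = \frac{dx_f}{dt} &= \mathrm{Fr}\,\sqrt{g'h}
      = \mathrm{Fr}\,\sqrt{\frac{g'A}{x_f}} \\
\Rightarrow\; x_f^{1/2}\,dx_f &\propto dt
\;\Rightarrow\; x_f \propto t^{2/3}
\;\Rightarrow\; u_f \propto t^{-1/3}.
\end{aligned}
```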

  6. Lost in transportation: Information measures and cognitive limits in multilayer navigation.

    PubMed

    Gallotti, Riccardo; Porter, Mason A; Barthelemy, Marc

    2016-02-01

    Cities and their transportation systems become increasingly complex and multimodal as they grow, and it is natural to wonder whether it is possible to quantitatively characterize our difficulty navigating in them and whether such navigation exceeds our cognitive limits. A transition between different search strategies for navigating in metropolitan maps has been observed for large, complex metropolitan networks. This evidence suggests the existence of a limit associated with cognitive overload and caused by a large amount of information that needs to be processed. In this light, we analyzed the world's 15 largest metropolitan networks and estimated the information limit for determining a trip in a transportation system to be on the order of 8 bits. Similar to the "Dunbar number," which represents a limit to the size of an individual's friendship circle, our cognitive limit suggests that maps should not consist of more than 250 connection points to be easily readable. We also show that including connections with other transportation modes dramatically increases the information needed to navigate in multilayer transportation networks. In large cities such as New York, Paris, and Tokyo, more than 80% of the trips are above the 8-bit limit. Multimodal transportation systems in large cities have thus already exceeded human cognitive limits and, consequently, the traditional view of navigation in cities has to be revised substantially.
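
    A back-of-envelope check of the correspondence between the quoted 8-bit limit and roughly 250 connection points, under the simplifying assumption that all connection points are equally likely choices:

```python
import math

# Selecting one of C roughly equally likely connection points carries log2(C) bits.
for connections in (64, 128, 250, 256, 512):
    print(f"{connections:4d} connections -> {math.log2(connections):5.2f} bits")

# Inverse: an 8-bit cognitive budget corresponds to about 2**8 = 256 choices,
# which is where the "no more than ~250 connection points" rule of thumb comes from.
print("8-bit budget ->", 2 ** 8, "connection points")
```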

  7. Lost in transportation: Information measures and cognitive limits in multilayer navigation

    PubMed Central

    Gallotti, Riccardo; Porter, Mason A.; Barthelemy, Marc

    2016-01-01

    Cities and their transportation systems become increasingly complex and multimodal as they grow, and it is natural to wonder whether it is possible to quantitatively characterize our difficulty navigating in them and whether such navigation exceeds our cognitive limits. A transition between different search strategies for navigating in metropolitan maps has been observed for large, complex metropolitan networks. This evidence suggests the existence of a limit associated with cognitive overload and caused by a large amount of information that needs to be processed. In this light, we analyzed the world’s 15 largest metropolitan networks and estimated the information limit for determining a trip in a transportation system to be on the order of 8 bits. Similar to the “Dunbar number,” which represents a limit to the size of an individual’s friendship circle, our cognitive limit suggests that maps should not consist of more than 250 connection points to be easily readable. We also show that including connections with other transportation modes dramatically increases the information needed to navigate in multilayer transportation networks. In large cities such as New York, Paris, and Tokyo, more than 80% of the trips are above the 8-bit limit. Multimodal transportation systems in large cities have thus already exceeded human cognitive limits and, consequently, the traditional view of navigation in cities has to be revised substantially. PMID:26989769

  8. 4P: fast computing of population genetics statistics from large DNA polymorphism panels

    PubMed Central

    Benazzo, Andrea; Panziera, Alex; Bertorelle, Giorgio

    2015-01-01

    Massive DNA sequencing has significantly increased the amount of data available for population genetics and molecular ecology studies. However, the parallel computation of simple statistics within and between populations from large panels of polymorphic sites is not yet available, making the exploratory analyses of a set or subset of data a very laborious task. Here, we present 4P (parallel processing of polymorphism panels), a stand-alone software program for the rapid computation of genetic variation statistics (including the joint frequency spectrum) from millions of DNA variants in multiple individuals and multiple populations. It handles a standard input file format commonly used to store DNA variation from empirical or simulation experiments. The computational performance of 4P was evaluated using large SNP (single nucleotide polymorphism) datasets from human genomes or obtained by simulations. 4P was faster or much faster than other comparable programs, and the impact of parallel computing using multicore computers or servers was evident. 4P is a useful tool for biologists who need a simple and rapid computer program to run exploratory population genetics analyses in large panels of genomic data. It is also particularly suitable to analyze multiple data sets produced in simulation studies. Unix, Windows, and MacOs versions are provided, as well as the source code for easier pipeline implementations. PMID:25628874
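
    For illustration only, one statistic such tools report, the site frequency spectrum, can be computed from a biallelic 0/1 genotype matrix in a few lines; the matrix below is synthetic, and the code is unrelated to 4P's implementation.

```python
import numpy as np

# Hypothetical biallelic genotype matrix: rows = SNP sites, columns = haploid samples,
# entries 0 (ancestral) or 1 (derived). Real tools such as 4P read file-based panels.
rng = np.random.default_rng(1)
genotypes = rng.integers(0, 2, size=(10_000, 20))

derived_counts = genotypes.sum(axis=1)               # derived-allele count per site
n_samples = genotypes.shape[1]
sfs = np.bincount(derived_counts, minlength=n_samples + 1)

# sfs[k] = number of sites at which exactly k of the 20 samples carry the derived allele
print(sfs)
```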

  9. Segtor: Rapid Annotation of Genomic Coordinates and Single Nucleotide Variations Using Segment Trees

    PubMed Central

    Renaud, Gabriel; Neves, Pedro; Folador, Edson Luiz; Ferreira, Carlos Gil; Passetti, Fabio

    2011-01-01

    Various research projects often involve determining the relative position of genomic coordinates, intervals, single nucleotide variations (SNVs), insertions, deletions and translocations with respect to genes and their potential impact on protein translation. Due to the tremendous increase in throughput brought by the use of next-generation sequencing, investigators are routinely faced with the need to annotate very large datasets. We present Segtor, a tool to annotate large sets of genomic coordinates, intervals, SNVs, indels and translocations. Our tool uses segment trees built from the start and end coordinates of the genomic features the user wishes to use, instead of storing them in a database management system. The software also produces annotation statistics to allow users to visualize how many coordinates were found within various portions of genes. Our system can currently be made to work with any species available on the UCSC Genome Browser. Segtor is a suitable tool for groups, especially those with limited access to programmers or with an interest in analyzing large numbers of individual genomes, that wish to determine the relative position of very large sets of mapped reads and subsequently annotate observed mutations between the reads and the reference. Segtor (http://lbbc.inca.gov.br/segtor/) is an open-source tool that can be freely downloaded for non-profit use. We also provide a web interface for testing purposes. PMID:22069465
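
    The core operation described above, querying many point coordinates against a fixed set of genomic intervals, can be illustrated with a small centered interval tree; this is a sketch of the general data structure, not Segtor's implementation, and the gene coordinates below are invented.

```python
class IntervalNode:
    """Minimal centered interval tree (illustration only)."""

    def __init__(self, intervals):
        # intervals: list of (start, end, label) with start <= end, e.g. gene coordinates
        starts = sorted(s for s, _, _ in intervals)
        self.center = starts[len(starts) // 2]
        self.here = [iv for iv in intervals if iv[0] <= self.center <= iv[1]]
        left = [iv for iv in intervals if iv[1] < self.center]
        right = [iv for iv in intervals if iv[0] > self.center]
        self.left = IntervalNode(left) if left else None
        self.right = IntervalNode(right) if right else None

    def query(self, pos):
        """Return all intervals containing the coordinate pos."""
        hits = [iv for iv in self.here if iv[0] <= pos <= iv[1]]
        if pos < self.center and self.left:
            hits += self.left.query(pos)
        elif pos > self.center and self.right:
            hits += self.right.query(pos)
        return hits

# Hypothetical gene intervals on one chromosome and a few SNV positions to annotate.
genes = [(1_000, 5_000, "geneA"), (4_500, 9_000, "geneB"), (20_000, 25_000, "geneC")]
tree = IntervalNode(genes)
for snv in (4_700, 12_000, 21_500):
    print(snv, "->", [label for _, _, label in tree.query(snv)])
```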

  10. Studies of Coronae and Large Volcanoes on Venus: Constraining the Diverse Outcomes of Small-Scale Mantle Upwellings on Venus

    NASA Technical Reports Server (NTRS)

    Stofan, Ellen R.

    2005-01-01

    Proxemy Research had a grant from NASA to perform science research on upwelling and volcanism on Venus. This was a 3 year Planetary Geology and Geophysics grant to E. Stofan, entitled Coronae and Large volcanoes on Venus. This grant closes on 12/31/05. Here we summarize the scientific progress and accomplishments of this grant. Scientific publications and abstracts of presentations are indicated in the final section. This was a very productive grant and the progress that was made is summarized. Attention is drawn to the publications and abstracts published in each year. The proposal consisted of two tasks, one examining coronae and one studying large volcanoes. The corona task (Task 1) consisted of three parts: 1) a statistical study of the updated corona population, with Sue Smrekar, Lori Glaze, Paula Martin and Steve Baloga; 2) geologic analysis of several specific groups of coronae, with Sue Smrekar and others; and 3) determining the histories and significance of a number of coronae with extreme amounts of volcanism, with Sue Smrekar. Task 2, studies of large volcanoes, consisted of two subtasks. In the first, we studied the geologic history of several volcanoes, with John Guest, Peter Grindrod, Antony Brian and Steve Anderson. In the second subtask, I analyzed a number of Venusian volcanoes with evidence of summit diking along with Peter Grindrod and Francis Nimmo.

  11. Supraventricular tachycardia induced by chocolate: is chocolate too sweet for the heart?

    PubMed

    Parasramka, Saurabh; Dufresne, Alix

    2012-09-01

    Conflicting studies have been published concerning the association between chocolate and cardiovascular diseases. Fewer articles have described the potential arrhythmogenic risk related to chocolate intake. We present a case of paroxysmal supraventricular tachycardia in a woman after consumption of a large quantity of chocolate. A 53-year-old woman with no significant medical history presented to us with complaints of palpitations and shortness of breath after consuming large amounts of chocolate. Electrocardiogram showed supraventricular tachycardia at 165 beats per minute, which was restored to sinus rhythm after adenosine bolus injection. Electrophysiology studies showed atrioventricular nodal reentry tachycardia, which was treated with radiofrequency ablation. Chocolate contains caffeine and theobromine, which are methylxanthines and are competitive antagonists of adenosine and can have arrhythmogenic potential. Our case describes an episode of tachycardia precipitated by consumption of a large amount of chocolate in a patient with an underlying substrate. There are occasional case reports describing an association between chocolate, caffeine, and arrhythmias. A large Danish study, however, did not find any association between the amount of daily caffeine consumption and the risk of arrhythmia.

  12. North-South precipitation patterns in western North America on interannual-to-decadal timescales

    USGS Publications Warehouse

    Dettinger, M.D.; Cayan, D.R.; Diaz, Henry F.; Meko, D.M.

    1998-01-01

    The overall amount of precipitation deposited along the West Coast and western cordillera of North America from 25° to 55°N varies from year to year, and superimposed on this domain-average variability are varying north-south contrasts on timescales from at least interannual to interdecadal. In order to better understand the north-south precipitation contrasts, their interannual and decadal variations are studied in terms of how much they affect overall precipitation amounts and how they are related to large-scale climatic patterns. Spatial empirical orthogonal functions (EOFs) and spatial moments (domain average, central latitude, and latitudinal spread) of zonally averaged precipitation anomalies along the westernmost parts of North America are analyzed, and each is correlated with global sea level pressure (SLP) and sea surface temperature series, on interannual (defined here as 3-7 yr) and decadal (>7 yr) timescales. The interannual band considered here corresponds to timescales that are particularly strong in tropical climate variations and thus is expected to contain much precipitation variability that is related to El Nino-Southern Oscillation; the decadal scale is defined so as to capture the whole range of long-term climatic variations affecting western North America. Zonal EOFs of the interannual and decadal filtered versions of the zonal-precipitation series are remarkably similar. At both timescales, two leading EOFs describe 1) a north-south seesaw of precipitation pivoting near 40°N and 2) variations in precipitation near 40°N, respectively. The amount of overall precipitation variability is only about 10% of the mean and is largely determined by precipitation variations around 40°-45°N and most consistently influenced by nearby circulation patterns; in this sense, domain-average precipitation is closely related to the second EOF. The central latitude and latitudinal spread of precipitation distributions are strongly influenced by precipitation variations in the southern parts of western North America and are closely related to the first EOF. The central latitude of precipitation moves south (north) with tropical warming (cooling) in association with midlatitude western Pacific SLP variations, on both interannual and decadal timescales. Regional patterns and zonal averages of precipitation-sensitive tree-ring series are used to corroborate these patterns and to extend them into the past, and appear to share much long- and short-term information with the instrumentally based zonal precipitation EOFs and moments.
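
    EOF analysis as used in this study is essentially principal component analysis of the anomaly matrix; the minimal numpy sketch below uses synthetic data (not the study's precipitation records) only to show the mechanics.

```python
import numpy as np

# Synthetic zonally averaged precipitation anomalies: rows = years, columns = latitude bands.
rng = np.random.default_rng(2)
years, lat_bands = 80, 30
anomalies = rng.standard_normal((years, lat_bands))
anomalies -= anomalies.mean(axis=0)          # remove the time mean at each latitude

# EOFs are the right singular vectors of the anomaly matrix; PCs are the projections onto them.
u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
eofs = vt                                    # spatial patterns (EOF1 = eofs[0], ...)
pcs = u * s                                  # principal-component time series
explained = s**2 / np.sum(s**2)

print("variance explained by the first two EOFs:", np.round(explained[:2], 3))
```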

  13. Changes in size of deforested patches in the Brazilian Amazon.

    PubMed

    Rosa, Isabel M D; Souza, Carlos; Ewers, Robert M

    2012-10-01

    Different deforestation agents, such as small farmers and large agricultural businesses, create different spatial patterns of deforestation. We analyzed the proportion of deforestation associated with different-sized clearings in the Brazilian Amazon from 2002 through 2009. We used annual deforestation maps to determine total area deforested and the size distribution of deforested patches per year. The size distribution of deforested areas changed over time in a consistent, directional manner. Large clearings (>1000 ha) comprised progressively smaller amounts of total annual deforestation. The number of smaller clearings (6.25-50.00 ha) remained unchanged over time. Small clearings accounted for 73% of all deforestation in 2009, up from 30% in 2002, whereas the proportion of deforestation attributable to large clearings decreased from 13% to 3% between 2002 and 2009. Large clearings were concentrated in Mato Grosso, but also occurred in eastern Pará and in Rondônia. In 2002 large clearings accounted for 17%, 15%, and 10% of all deforestation in Mato Grosso, Pará, and Rondônia, respectively. Even in these states, where there is a highly developed agricultural business dominated by soybean production and cattle ranching, the proportional contribution of large clearings to total deforestation declined. By 2009 large clearings accounted for 2.5%, 3.5%, and 1% of all deforestation in Mato Grosso, Pará, and Rondônia, respectively. These changes in deforestation patch size are coincident with the implementation of new conservation policies by the Brazilian government, which suggests that these policies are not effectively reducing the number of small clearings in primary forest, whether these are caused by large landholders or smallholders, but have been more effective at reducing the frequency of larger clearings. ©2012 Society for Conservation Biology.

  14. A Cost Benefit Analysis of Emerging LED Water Purification Systems in Expeditionary Environments

    DTIC Science & Technology

    2017-03-23

    the initial contingency response phase, ROWPUs are powered by large generators which require relatively large amounts of fossil fuels. The amount of...they attract and cling together forming a larger particle (Chem Treat, 2016). Flocculation is the addition of a polymer to water that clumps...smaller particles together to form larger particles. The idea for both methods is that larger particles will either settle out of or be removed from the

  15. Galaxy And Mass Assembly (GAMA): the connection between metals, specific SFR and H I gas in galaxies: the Z-SSFR relation

    NASA Astrophysics Data System (ADS)

    Lara-López, M. A.; Hopkins, A. M.; López-Sánchez, A. R.; Brough, S.; Colless, M.; Bland-Hawthorn, J.; Driver, S.; Foster, C.; Liske, J.; Loveday, J.; Robotham, A. S. G.; Sharp, R. G.; Steele, O.; Taylor, E. N.

    2013-06-01

    We study the interplay between gas phase metallicity (Z), specific star formation rate (SSFR) and neutral hydrogen gas (H I) for galaxies of different stellar masses. Our study uses spectroscopic data from Galaxy and Mass Assembly and Sloan Digital Sky Survey (SDSS) star-forming galaxies, as well as H I detection from the Arecibo Legacy Fast Arecibo L-band Feed Array (ALFALFA) and Galex Arecibo SDSS Survey (GASS) public catalogues. We present a model based on the Z-SSFR relation that shows that at a given stellar mass, depending on the amount of gas, galaxies will follow opposite behaviours. Low-mass galaxies with a large amount of gas will show high SSFR and low metallicities, while low-mass galaxies with small amounts of gas will show lower SSFR and high metallicities. In contrast, massive galaxies with a large amount of gas will show moderate SSFR and high metallicities, while massive galaxies with small amounts of gas will show low SSFR and low metallicities. Using ALFALFA and GASS counterparts, we find that the amount of gas is related to those drastic differences in Z and SSFR for galaxies of a similar stellar mass.

  16. Detection of BrO plumes over various sources using OMI and GOME-2 measurements

    NASA Astrophysics Data System (ADS)

    Seo, Sora; Richter, Andreas; Blechschmidt, Anne-Marlene; Burrows, John P.

    2017-04-01

    Reactive halogen species (RHS) play important roles in the chemistry of the stratosphere and troposphere. They are responsible for ozone depletion through catalytic reaction cycles, changes in the OH/HO2 and NO/NO2 ratios, and oxidation of compounds such as gaseous elemental mercury (GEM) and dimethyl sulphide (DMS). Thus, monitoring of halogen oxides is important for understanding global atmospheric oxidation capacity and climate change. Bromine monoxide (BrO) is one of the most common active halogen oxides. In the troposphere, large amounts of bromine are detected in the Polar Regions in spring, over salt lakes, and in volcanic plumes. In this study, we analyse BrO column densities using OMI and GOME-2 observations. The measured spectra from both UV-visible nadir satellites were analyzed using the differential optical absorption spectroscopy (DOAS) method with different settings depending on the instrumental characteristics. Large amounts of volcanic BrO from the Kasatochi eruption in 2008 were detected for 6 days, from August 8 to August 13. Especially large BrO amounts were found in the plume center for 3 days, from August 9 to 11, with slant column densities (SCD) of up to ~1.6×10^15 molecules cm^-2 and ~5.5×10^14 molecules cm^-2 in the OMI and GOME-2 measurements, respectively. In addition to the volcanic sources, events of widespread BrO enhancements were also observed over the Arctic and Antarctic coastal regions during springtime by both satellites. As the overpass time of the two instruments is not the same, differences between the two data sets are expected. In this study, the agreement between OMI and GOME-2 BrO data is investigated using both the operational products and different DOAS fits. Systematic differences are found in BrO slant columns and fitting residuals, both being larger in the case of OMI data. In addition, results are sensitive to the choice of fitting window. From a monitoring point of view, due to the higher spatial resolution of OMI compared to GOME-2, OMI results are better suited for observing the shape variation and transport pattern of volcanic BrO. This will be further improved with the upcoming European Sentinel-5 Precursor satellite, which has an even higher spatial resolution (3.5/7 × 7 km²).
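
    In its simplest linear form, a DOAS fit regresses the measured differential optical depth onto trace-gas absorption cross sections plus a low-order polynomial; the sketch below uses entirely synthetic spectra and invented cross sections and omits the many refinements of operational OMI/GOME-2 retrievals.

```python
import numpy as np

rng = np.random.default_rng(3)
wl = np.linspace(336.0, 360.0, 200)                      # wavelength grid [nm]
sigma_bro = np.exp(-((wl - 345.0) / 2.0) ** 2)           # fake BrO cross-section shape
sigma_o3 = np.exp(-((wl - 350.0) / 4.0) ** 2)            # fake interfering absorber

true_scd = {"BrO": 1.6e15, "O3": 8.0e18}                 # "true" slant columns [molecules cm^-2]
scale = {"BrO": 1e-17, "O3": 1e-21}                      # fake cross-section magnitudes [cm^2]
optical_depth = (true_scd["BrO"] * scale["BrO"] * sigma_bro
                 + true_scd["O3"] * scale["O3"] * sigma_o3
                 + 0.01 * (wl - wl.mean()) / 10.0        # broadband ("polynomial") part
                 + 1e-4 * rng.standard_normal(wl.size))  # measurement noise

# Design matrix: scaled cross sections plus a low-order polynomial in wavelength.
design = np.column_stack([
    scale["BrO"] * sigma_bro,
    scale["O3"] * sigma_o3,
    np.ones_like(wl),
    wl - wl.mean(),
])
coef, *_ = np.linalg.lstsq(design, optical_depth, rcond=None)
print(f"retrieved BrO SCD ~ {coef[0]:.2e} molecules cm^-2 (true {true_scd['BrO']:.1e})")
```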

  17. The Matsu Wheel: A Cloud-Based Framework for Efficient Analysis and Reanalysis of Earth Satellite Imagery

    NASA Technical Reports Server (NTRS)

    Patterson, Maria T.; Anderson, Nicholas; Bennett, Collin; Bruggemann, Jacob; Grossman, Robert L.; Handy, Matthew; Ly, Vuong; Mandl, Daniel J.; Pederson, Shane; Pivarski, James

    2016-01-01

    Project Matsu is a collaboration between the Open Commons Consortium and NASA focused on developing open source technology for cloud-based processing of Earth satellite imagery, with practical applications to aid in natural disaster detection and relief. Project Matsu has developed an open source cloud-based infrastructure to process, analyze, and reanalyze large collections of hyperspectral satellite image data using OpenStack, Hadoop, MapReduce and related technologies. We describe a framework for efficient analysis of large amounts of data called the Matsu "Wheel." The Matsu Wheel is currently used to process incoming hyperspectral satellite data produced daily by NASA's Earth Observing-1 (EO-1) satellite. The framework allows batches of analytics, scanning for new data, to be applied to data as they flow in. In the Matsu Wheel, the data only need to be accessed and preprocessed once, regardless of the number or types of analytics, which can easily be slotted into the existing framework. The Matsu Wheel system provides a significantly more efficient use of computational resources than alternative methods when the data are large, have high-volume throughput, may require heavy preprocessing, and are typically used for many types of analysis. We also describe our preliminary Wheel analytics, including an anomaly detector for rare spectral signatures or thermal anomalies in hyperspectral data and a land cover classifier that can be used for water and flood detection. Each of these analytics can generate visual reports accessible via the web for the public and interested decision makers. The resulting products of the analytics are also made accessible through an Open Geospatial Consortium (OGC)-compliant Web Map Service (WMS) for further distribution. The Matsu Wheel allows many shared data services to be performed together to efficiently use resources for processing hyperspectral satellite image data and other large environmental datasets that may be analyzed for many purposes.
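
    A toy, file-based version of the "wheel" pattern described above, preprocessing each new data item once and then passing it to every registered analytic, is sketched below; the directory layout and analytic functions are invented, and the real Matsu Wheel runs on Hadoop/OpenStack rather than a local folder.

```python
from pathlib import Path

def preprocess(path: Path) -> str:
    return path.read_text()              # stand-in for radiometric correction, tiling, ...

def anomaly_detector(data: str) -> str:
    return f"anomaly report ({len(data)} bytes scanned)"

def land_cover_classifier(data: str) -> str:
    return f"land-cover report ({data.count('water')} 'water' tokens)"

ANALYTICS = [anomaly_detector, land_cover_classifier]
seen = set()

def run_wheel(incoming: Path) -> None:
    """Scan for new files; preprocess each once, then apply every analytic to it."""
    for path in sorted(incoming.glob("*.txt")):
        if path in seen:
            continue
        data = preprocess(path)          # data are accessed and preprocessed only once
        for analytic in ANALYTICS:       # any number of analytics can be slotted in here
            print(path.name, "->", analytic(data))
        seen.add(path)

# run_wheel(Path("/data/incoming"))      # would be invoked periodically, e.g. by a scheduler
```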

  18. [Value influence of different compatibilities of main active parts in yangyintongnao granule on pharmacokinetics parameters in rats with cerebral ischemia reperfusion injury by total amount statistic moment method].

    PubMed

    Guo, Ying; Yang, Jiehong; Znang, Hengyi; Fu, Xuchun; Zhnag, Yuyan; Wan, Haitong

    2010-02-01

    To study the influence of different combinations of the main active parts in Yangyintongnao granule on the pharmacokinetic parameters of its two active components, ligustrazine and puerarin, using the total amount statistic moment method for pharmacokinetics. Combinations were formed from the dosages of the four active parts (alkaloid, flavone, saponin, naphtha) according to an orthogonal L9 (3^4) design. Blood concentrations of ligustrazine and puerarin were determined by HPLC at different time points. The zero-order moment (AUC) and first-order moment (MRT, mean residence time) of ligustrazine and puerarin were calculated, and from them the total amount statistic moment parameters of Yangyintongnao granule were derived. The influence of the different combinations on the pharmacokinetic parameters was analyzed by orthogonal test. Flavone had a stronger effect than saponin on the total AUC. Ligustrazine had the strongest effect on the total MRT. Saponin had little effect on either parameter, whereas naphtha had a larger effect on both, indicating that naphtha may promote the metabolism of ligustrazine and puerarin in rats. Total amount statistic moment parameters can be used to guide the compatibility (combination) design of TCM.

  19. Threshold amounts of organic carbon needed to initiate reductive dechlorination in groundwater systems

    USGS Publications Warehouse

    Chapelle, Francis H.; Thomas, Lashun K.; Bradley, Paul M.; Rectanus, Heather V.; Widdowson, Mark A.

    2012-01-01

    Aquifer sediment and groundwater chemistry data from 15 Department of Defense facilities located throughout the United States were collected and analyzed with the goal of estimating the amount of natural organic carbon needed to initiate reductive dechlorination in groundwater systems. Aquifer sediments were analyzed for hydroxylamine and NaOH-extractable organic carbon, yielding a probable underestimate of potentially bioavailable organic carbon (PBOC). Aquifer sediments were also analyzed for total organic carbon (TOC) using an elemental combustion analyzer, yielding a probable overestimate of bioavailable carbon. Concentrations of PBOC correlated linearly with TOC with a slope near one. However, concentrations of PBOC were consistently five to ten times lower than TOC. When mean concentrations of dissolved oxygen observed at each site were plotted versus PBOC, it showed that anoxic conditions were initiated at approximately 200 mg/kg of PBOC. Similarly, the accumulation of reductive dechlorination daughter products relative to parent compounds increased at a PBOC concentration of approximately 200 mg/kg. Concentrations of total hydrolysable amino acids (THAA) in sediments also increased at approximately 200 mg/kg, and bioassays showed that sediment CO2 production correlated positively with THAA. The results of this study provide an estimate for threshold amounts of bioavailable carbon present in aquifer sediments (approximately 200 mg/kg of PBOC; approximately 1,000 to 2,000 mg/kg of TOC) needed to support reductive dechlorination in groundwater systems.

  20. Development of an automated analysis system for data from flow cytometric intracellular cytokine staining assays from clinical vaccine trials

    PubMed Central

    Shulman, Nick; Bellew, Matthew; Snelling, George; Carter, Donald; Huang, Yunda; Li, Hongli; Self, Steven G.; McElrath, M. Juliana; De Rosa, Stephen C.

    2008-01-01

    Background Intracellular cytokine staining (ICS) by multiparameter flow cytometry is one of the primary methods for determining T cell immunogenicity in HIV-1 clinical vaccine trials. Data analysis requires considerable expertise and time. The amount of data is quickly increasing as more and larger trials are performed, and thus there is a critical need for high-throughput methods of data analysis. Methods A web-based flow cytometric analysis system, LabKey Flow, was developed for analyses of data from standardized ICS assays. A gating template was created manually in commercially-available flow cytometric analysis software. Using this template, the system automatically compensated and analyzed all data sets. Quality control queries were designed to identify potentially incorrect sample collections. Results Comparison of the semi-automated analysis performed by LabKey Flow and the manual analysis performed using FlowJo software demonstrated excellent concordance (concordance correlation coefficient >0.990). Manual inspection of the analyses performed by LabKey Flow for 8-color ICS data files from several clinical vaccine trials indicates that template gates can appropriately be used for most data sets. Conclusions The semi-automated LabKey Flow analysis system can accurately analyze large ICS data files. Routine use of the system does not require specialized expertise. This high-throughput analysis will provide great utility for rapid evaluation of complex multiparameter flow cytometric measurements collected from large clinical trials. PMID:18615598
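
    The concordance measure quoted above (concordance correlation coefficient >0.990) is Lin's CCC; the small self-contained function below, applied to an invented pair of manual-versus-automated measurements, shows how it is computed.

```python
import numpy as np

def concordance_correlation(x, y):
    """Lin's concordance correlation coefficient between two sets of paired measurements."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    covariance = np.mean((x - x.mean()) * (y - y.mean()))
    return 2 * covariance / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

# Hypothetical paired results: % cytokine-positive T cells from manual FlowJo gating
# and from an automated template-based analysis of the same samples.
manual    = np.array([0.12, 0.45, 0.88, 1.30, 0.05, 2.10])
automated = np.array([0.11, 0.47, 0.85, 1.33, 0.06, 2.05])
print(f"CCC = {concordance_correlation(manual, automated):.3f}")
```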

  1. Skin secretion peptides: the molecular facet of the deimatic behavior of the four-eyed frog, Physalaemus nattereri (Anura, Leptodactylidae).

    PubMed

    Barbosa, Eder Alves; Iembo, Tatiane; Martins, Graciella Ribeiro; Silva, Luciano Paulino; Prates, Maura Vianna; Andrade, Alan Carvalho; Bloch, Carlos

    2015-11-15

    Amphibians can produce a large amount of bioactive peptides over the skin. In order to map the precise tissue localization of these compounds and evaluate their functions, mass spectrometry imaging (MSI) and gene expression studies were used to investigate a possible correlation between molecules involved in the antimicrobial defense mechanisms and anti-predatory behavior by Physalaemus nattereri. Total skin secretion of P. nattereri was analyzed by classical Protein Chemistry and proteomic techniques. Intact inguinal macroglands were dissected from the rest of the skin and both tissues were analyzed by MSI and real-time polymerase chain reaction (RT-PCR) experiments. Peptides were primarily identified by de novo sequencing, automatic Edman degradation and cDNA data. Fifteen bradykinin (BK)-related peptides and two antimicrobial peptides were sequenced and mapped by MSI on the inguinal macrogland and the rest of P. nattereri skin. RT-PCR results revealed that BK-related peptide levels of expression were about 30,000 times higher on the inguinal macroglands than on the any other region of the skin, whilst antimicrobial peptide ions appear to be evenly distributed in both investigated regions. The presence of antimicrobial peptides in all investigated tissue regions is in accordance with the defensive role against microorganisms thoroughly demonstrated in the literature, whereas BK-related molecules are largely found on the inguinal macroglands suggesting an intriguing link between their noxious activities against potential predators of P. nattereri and the frog's deimatic behavior. Copyright © 2015 John Wiley & Sons, Ltd.

  2. DWARF – a data warehouse system for analyzing protein families

    PubMed Central

    Fischer, Markus; Thai, Quan K; Grieb, Melanie; Pleiss, Jürgen

    2006-01-01

    Background The emerging field of integrative bioinformatics provides the tools to organize and systematically analyze vast amounts of highly diverse biological data and thus allows one to gain a novel understanding of complex biological systems. The data warehouse DWARF applies integrative bioinformatics approaches to the analysis of large protein families. Description The data warehouse system DWARF integrates data on sequence, structure, and functional annotation for protein fold families. The underlying relational data model consists of three major sections representing entities related to the protein (biochemical function, source organism, classification into homologous families and superfamilies), the protein sequence (position-specific annotation, mutant information), and the protein structure (secondary structure information, superimposed tertiary structure). Tools for extracting, transforming and loading data from publicly available resources (ExPDB, GenBank, DSSP) are provided to populate the database. The data can be accessed by an interface for searching and browsing, and by analysis tools that operate on annotation, sequence, or structure. We applied DWARF to the family of α/β-hydrolases to host the Lipase Engineering database. Release 2.3 contains 6138 sequences and 167 experimentally determined protein structures, which are assigned to 37 superfamilies and 103 homologous families. Conclusion DWARF has been designed for constructing databases of large structurally related protein families and for evaluating their sequence-structure-function relationships by a systematic analysis of sequence, structure and functional annotation. It has been applied to predict biochemical properties from sequence, and serves as a valuable tool for protein engineering. PMID:17094801
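
    A highly simplified sketch of the three-section relational model described above, using in-memory SQLite; the table and column names are illustrative only and are not DWARF's actual schema.

```python
import sqlite3

schema = """
CREATE TABLE protein (
    protein_id   INTEGER PRIMARY KEY,
    superfamily  TEXT,
    family       TEXT,
    organism     TEXT,
    function     TEXT
);
CREATE TABLE sequence_annotation (
    protein_id   INTEGER REFERENCES protein(protein_id),
    position     INTEGER,
    residue      TEXT,
    annotation   TEXT          -- e.g. 'catalytic residue', 'mutant: A -> V'
);
CREATE TABLE structure (
    protein_id   INTEGER REFERENCES protein(protein_id),
    pdb_id       TEXT,
    position     INTEGER,
    sec_struct   TEXT          -- e.g. 'H', 'E', 'C' as assigned by DSSP
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(schema)

# Example query in the spirit of "all annotated positions for one superfamily".
rows = conn.execute("""
    SELECT p.family, a.position, a.annotation
    FROM protein p JOIN sequence_annotation a USING (protein_id)
    WHERE p.superfamily = ?
""", ("alpha/beta-hydrolases",)).fetchall()
print(rows)   # empty here; the point is the schema and the join
```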

  3. Antibody recognition of the glycoprotein g of viral haemorrhagic septicemia virus (VHSV) purified in large amounts from insect larvae

    PubMed Central

    2011-01-01

    Background There are currently no purification methods capable of producing the large amounts of fish rhabdoviral glycoprotein G (gpG) required for diagnosis and immunisation purposes or for studying the structure and molecular mechanisms of action of this molecule (i.e., pH-dependent membrane fusion). As a result of the unavailability of large amounts of the gpG from viral haemorrhagic septicaemia rhabdovirus (VHSV), one of the most dangerous viruses affecting cultured salmonid species, research interests in this field are severely hampered. Previous purification methods to obtain recombinant gpG from VHSV in E. coli, yeast and baculovirus grown in insect cells have not produced soluble conformations or acceptable yields. The development of large-scale purification methods for gpGs will also further research into other fish rhabdoviruses, such as infectious haematopoietic necrosis virus (IHNV), spring carp viremia virus (SVCV), hirame rhabdovirus (HIRRV) and snakehead rhabdovirus (SHRV). Findings Here we designed a method to produce milligram amounts of soluble VHSV gpG. Only the transmembrane and carboxy terminal-deleted (amino acid 21 to 465) gpG was efficiently expressed in insect larvae. Recognition of G21-465 by β-mercaptoethanol-dependent neutralizing monoclonal antibodies (N-MAbs) and pH-dependent recognition by sera from VHSV-hyperimmunized or VHSV-infected rainbow trout (Oncorhynchus mykiss) were demonstrated. Conclusions Given that the purified G21-465 conserved some of its most important properties, this method might be suitable for the large-scale production of fish rhabdoviral gpGs for use in diagnosis, fusion and antigenicity studies. PMID:21693048

  4. Antibody recognition of the glycoprotein g of viral haemorrhagic septicemia virus (VHSV) purified in large amounts from insect larvae.

    PubMed

    Encinas, Paloma; Gomez-Sebastian, Silvia; Nunez, Maria Carmen; Gomez-Casado, Eduardo; Escribano, Jose M; Estepa, Amparo; Coll, Julio

    2011-06-21

    There are currently no purification methods capable of producing the large amounts of fish rhabdoviral glycoprotein G (gpG) required for diagnosis and immunisation purposes or for studying the structure and molecular mechanisms of action of this molecule (i.e., pH-dependent membrane fusion). As a result of the unavailability of large amounts of the gpG from viral haemorrhagic septicaemia rhabdovirus (VHSV), one of the most dangerous viruses affecting cultured salmonid species, research interests in this field are severely hampered. Previous purification methods to obtain recombinant gpG from VHSV in E. coli, yeast and baculovirus grown in insect cells have not produced soluble conformations or acceptable yields. The development of large-scale purification methods for gpGs will also further research into other fish rhabdoviruses, such as infectious haematopoietic necrosis virus (IHNV), spring carp viremia virus (SVCV), hirame rhabdovirus (HIRRV) and snakehead rhabdovirus (SHRV). Here we designed a method to produce milligram amounts of soluble VHSV gpG. Only the transmembrane and carboxy terminal-deleted (amino acid 21 to 465) gpG was efficiently expressed in insect larvae. Recognition of G21-465 by β-mercaptoethanol-dependent neutralizing monoclonal antibodies (N-MAbs) and pH-dependent recognition by sera from VHSV-hyperimmunized or VHSV-infected rainbow trout (Oncorhynchus mykiss) were demonstrated. Given that the purified G21-465 conserved some of its most important properties, this method might be suitable for the large-scale production of fish rhabdoviral gpGs for use in diagnosis, fusion and antigenicity studies.

  5. Analysis on the Intention to Purchase Weather Index Insurance and Development Agenda

    NASA Astrophysics Data System (ADS)

    Park, K.; Jung, J.; Shin, J.; Kim, B.

    2013-12-01

    The purpose of this paper is to analyze how to revitalize weather insurance. The current state of the weather insurance market is first described, and the necessity of insurance products and the intention to purchase them are analyzed based on a recognition survey regarding weather insurance, focusing on weather index insurance. The intention to purchase insurance products was examined with an ordered logit analysis (OLA), indicating that the amount of damages, the impacts of weather change, and the experience of damage and loss have a positive relationship with the intention to purchase weather insurance. In addition, the recognized acceptable payment for insurance (i.e., willingness to pay) was analyzed for both the group that wants to purchase insurance (Group 1) and the group that does not (Group 2). The results demonstrate that Group 1 shows a statistically significantly higher willingness to pay than Group 2. Based on these results, and given the increase in abnormal weather phenomena, the amount of damages and losses can be expected to increase rapidly, and the weather insurance market is also expected to develop and expand steadily. This study could be a cornerstone for drawing up a plan to revitalize weather insurance.
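
    A minimal sketch of an ordered logit model of purchase intention, assuming statsmodels' OrderedModel is available; the variables and data below are simulated and are not the paper's survey.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

# Simulated survey: ordered purchase intention (0 = none, 1 = maybe, 2 = yes) explained
# by a standardized damage amount and by whether weather damage was experienced before.
rng = np.random.default_rng(4)
n = 300
damage = rng.exponential(scale=1.0, size=n)
experienced = rng.integers(0, 2, size=n)
latent = 0.8 * damage + 0.6 * experienced + rng.logistic(size=n)
intention = pd.Series(np.digitize(latent, [0.8, 2.0])).astype(
    pd.CategoricalDtype(categories=[0, 1, 2], ordered=True)
)

X = pd.DataFrame({"damage": damage, "experienced": experienced})
result = OrderedModel(intention, X, distr="logit").fit(method="bfgs", disp=False)
print(result.summary())          # both coefficients should come out positive
```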

  6. Low-frequency noise from large wind turbines.

    PubMed

    Møller, Henrik; Pedersen, Christian Sejer

    2011-06-01

    As wind turbines get larger, worries have emerged that the turbine noise would move down in frequency and that the low-frequency noise would cause annoyance for the neighbors. The noise emission from 48 wind turbines with nominal electric power up to 3.6 MW is analyzed and discussed. The relative amount of low-frequency noise is higher for large turbines (2.3-3.6 MW) than for small turbines (≤ 2 MW), and the difference is statistically significant. The difference can also be expressed as a downward shift of the spectrum of approximately one-third of an octave. A further shift of similar size is suggested for future turbines in the 10-MW range. Due to the air absorption, the higher low-frequency content becomes even more pronounced, when sound pressure levels in relevant neighbor distances are considered. Even when A-weighted levels are considered, a substantial part of the noise is at low frequencies, and for several of the investigated large turbines, the one-third-octave band with the highest level is at or below 250 Hz. It is thus beyond any doubt that the low-frequency part of the spectrum plays an important role in the noise at the neighbors. © 2011 Acoustical Society of America

  7. Dual Use of Cigarettes, Little Cigars, Cigarillos, and Large Cigars: Smoking Topography and Toxicant Exposure

    PubMed Central

    Pickworth, Wallace B.; Rosenberry, Zachary R.; O’Grady, Kevin E.; Koszowski, Bartosz

    2017-01-01

    Objective Smoking topography variables and toxicant exposure (plasma nicotine and exhaled CO) were examined in 3 groups of study participants that smoked both cigarettes and either filtered little cigars (Winchester), cigarillos (Black & Mild), or large cigars (Phillies Blunt). Methods Laboratory ad lib smoking of the cigar products was collected with a smoking puff analyzer; plasma levels of nicotine and exhaled CO were collected before and after smoking. Results Although there were no statistically significant differences in demographic and cigarette smoking topography among the groups, there were significant differences in how the different cigar products were smoked. Plasma nicotine boost was similar after all products but exhaled CO was greater after the cigarillo and large cigar than the little cigar. Some of the differences were due to the differences in article size but other differences were apparent even after adjustment for the amount of tobacco burned or the mouth intake (puff volume). Conclusions The topography findings of differences among products challenge the practice of grouping cigars as a single entity in surveys, regulatory decisions, and discussions of toxicant exposure. The results add to the discussion of distinctions among products in the scientific assessment of public health risk and regulatory decisions. PMID:28966952

  8. Analyzing large-scale proteomics projects with latent semantic indexing.

    PubMed

    Klie, Sebastian; Martens, Lennart; Vizcaíno, Juan Antonio; Côté, Richard; Jones, Phil; Apweiler, Rolf; Hinneburg, Alexander; Hermjakob, Henning

    2008-01-01

    Since the advent of public data repositories for proteomics data, readily accessible results from high-throughput experiments have been accumulating steadily. Several large-scale projects in particular have contributed substantially to the amount of identifications available to the community. Despite the considerable body of information amassed, very few successful analyses have been performed and published on this data, leaving the ultimate value of these projects far below their potential. A prominent reason why published proteomics data are seldom reanalyzed lies in the heterogeneous nature of the original sample collection and the subsequent data recording and processing. To illustrate that at least part of this heterogeneity can be compensated for, we here apply a latent semantic analysis to the data contributed by the Human Proteome Organization's Plasma Proteome Project (HUPO PPP). Interestingly, despite the broad spectrum of instruments and methodologies applied in the HUPO PPP, our analysis reveals several obvious patterns that can be used to formulate concrete recommendations for optimizing proteomics project planning as well as the choice of technologies used in future experiments. It is clear from these results that the analysis of large bodies of publicly available proteomics data by noise-tolerant algorithms such as latent semantic analysis holds great promise and is currently underexploited.
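    For readers unfamiliar with the technique, the following is a generic latent semantic analysis sketch applied to an experiment-by-protein identification matrix; the input file, matrix layout, and number of components are assumptions for illustration and do not reproduce the analysis pipeline of the paper.

    ```python
    # Generic LSA sketch: TF-IDF weighting followed by truncated SVD.
    # The input file and its layout are hypothetical.
    import pandas as pd
    from sklearn.feature_extraction.text import TfidfTransformer
    from sklearn.decomposition import TruncatedSVD

    # Rows: experiments (the "documents"); columns: protein accessions (the "terms").
    counts = pd.read_csv("identification_counts.csv", index_col=0)

    # Down-weight ubiquitously identified proteins, analogous to TF-IDF in text mining.
    tfidf = TfidfTransformer().fit_transform(counts.values)

    # Project experiments into a low-dimensional latent semantic space.
    embedding = TruncatedSVD(n_components=10, random_state=0).fit_transform(tfidf)

    # Experiments close together in this space share identification patterns,
    # despite differences in instruments and protocols.
    print(embedding[:5])
    ```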

  9. IS/IT the prescription to enable medical group practices attain their goals.

    PubMed

    Wickramasinghe, Nilmini; Silvers, J B

    2003-05-01

    The US spends significantly more money as a percentage of GDP on health care than any other OECD country and, more importantly, this amount is anticipated to increase exponentially. In this high-cost environment, two important trends have occurred: (1) the movement to managed care, and (2) large investments in Information Systems/Information Technology (IS/IT). Managed care has emerged as an attempt to provide good-quality yet cost-effective health care treatment. Its implications are not well discussed in the literature, while its impact on different types of medical group practices is even less well understood. The repercussions of the large investments in IS/IT on the health care sector in general and on the medical group practice in particular, although clearly of importance, are also largely ignored by the literature. This study attempts to address this significant void in the literature. By analyzing three different types of group practices: an Independent Practice Association (IPA), a Faculty Practice, and a Multi-Specialty Group Practice in a managed care environment during their implementation of practice management/billing systems, we are able to draw some conclusions regarding the impacts of these two central trends on health care in general as well as on the medical group practice in particular.

  10. Asymmetric author-topic model for knowledge discovering of big data in toxicogenomics.

    PubMed

    Chung, Ming-Hua; Wang, Yuping; Tang, Hailin; Zou, Wen; Basinger, John; Xu, Xiaowei; Tong, Weida

    2015-01-01

    The advancement of high-throughput screening technologies facilitates the generation of massive amounts of biological data, a big data phenomenon in biomedical science. Yet researchers still rely heavily on keyword search and/or literature review to navigate the databases, and analyses are often done on a rather small scale. As a result, the rich information in a database has not been fully utilized, particularly the information embedded in the interactions between data points, which is largely ignored and buried. For the past 10 years, probabilistic topic modeling has been recognized as an effective machine learning algorithm for annotating the hidden thematic structure of massive collections of documents. The analogy between a text corpus and large-scale genomic data enables the application of text mining tools, such as probabilistic topic models, to explore hidden patterns in genomic data and their relation to altered biological functions. In this paper, we developed a generalized probabilistic topic model to analyze a toxicogenomics dataset consisting of a large number of gene expression profiles from rat livers treated with drugs at multiple doses and time points. We discovered hidden patterns in gene expression associated with the dose and time point of treatment. Finally, we illustrated the ability of our model to identify evidence supporting a potential reduction in animal use.
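    The paper's generalized (asymmetric author-topic) model is not reproduced here; as a rough analogue, standard latent Dirichlet allocation can be applied to a condition-by-gene count matrix, treating each treatment condition (drug, dose, time point) as a "document" and genes as "words". The file name, matrix layout, and topic count below are assumptions for illustration only.

    ```python
    # LDA sketch as a stand-in for the paper's generalized topic model.
    # Rows: treatment conditions (drug, dose, time point); columns: genes;
    # values: non-negative counts derived from expression. All names are hypothetical.
    import numpy as np
    import pandas as pd
    from sklearn.decomposition import LatentDirichletAllocation

    expr = pd.read_csv("liver_expression_counts.csv", index_col=0)

    lda = LatentDirichletAllocation(n_components=20, random_state=0)
    condition_topics = lda.fit_transform(expr.values)  # condition-by-topic weights
    topic_genes = lda.components_                       # topic-by-gene weights

    # Genes carrying the largest weight in a topic suggest a shared expression
    # pattern across doses and time points.
    top_genes = np.argsort(topic_genes[0])[::-1][:10]
    print(expr.columns[top_genes])
    ```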

  11. Tool Support for Parametric Analysis of Large Software Simulation Systems

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
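    As a rough illustration of the overall workflow (parameter sampling, batch simulation runs, unsupervised clustering of the outcomes), the sketch below uses plain Monte Carlo sampling and k-means; it does not reproduce the n-factor combinatorial generation, AutoBayes, or TAR3, and `simulate` is a hypothetical stand-in for a Trick simulation run.

    ```python
    # Parameter-space sweep and outcome clustering, heavily simplified.
    # Bounds, metrics, and the simulate() stand-in are illustrative assumptions.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)

    # Monte Carlo sampling of a 3-parameter envelope.
    bounds = np.array([[0.0, 1.0], [10.0, 50.0], [-5.0, 5.0]])
    samples = rng.uniform(bounds[:, 0], bounds[:, 1], size=(1000, 3))

    def simulate(params):
        # Hypothetical stand-in returning a few scalar outcome metrics per run.
        x, y, z = params
        return np.array([x * y + z, np.sin(x) * z, y - x * z])

    metrics = np.array([simulate(p) for p in samples])

    # Cluster the runs by outcome; the parameter ranges within each cluster point
    # at regions of the envelope that behave differently.
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(metrics)
    for k in range(4):
        members = samples[labels == k]
        print(k, members.min(axis=0), members.max(axis=0))
    ```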

  12. What Is Big Data and Why Is It Important?

    ERIC Educational Resources Information Center

    Pence, Harry E.

    2014-01-01

    Big Data Analytics is a topic fraught with both positive and negative potential. Big Data is defined not just by the amount of information involved but also its variety and complexity, as well as the speed with which it must be analyzed or delivered. The amount of data being produced is already incredibly great, and current developments suggest…

  13. Holocaust Studies in Austrian Elementary and Secondary Schools

    ERIC Educational Resources Information Center

    Mittnik, Philipp

    2016-01-01

    This article presents arguments in support of teaching about the Holocaust and Nazism in Austria at an early age. To accomplish this, Austrian and German elementary school textbooks were analyzed for the amount of content dealing with the Holocaust and Jews; the results showed that since 1980 the amount of content on the Holocaust increased in…

  14. Effect of noble gases on an atmospheric greenhouse /Titan/.

    NASA Technical Reports Server (NTRS)

    Cess, R.; Owen, T.

    1973-01-01

    Several models for the atmosphere of Titan have been investigated, taking into account various combinations of neon and argon. The investigation shows that the addition of large amounts of Ne and/or Ar will substantially reduce the hydrogen abundance required for a given greenhouse effect. The fact that a large amount of neon should be present if the atmosphere is a relic of the solar nebula is an especially attractive feature of the models, because it is hard to justify appropriate abundances of other enhancing agents.

  15. Characterization of multiblock copolymers by chromatographic techniques.

    PubMed

    N'Goma, Patrick Yoba; Radke, Wolfgang; Malz, Frank; Ziegler, Hans Jörg; Zierke, Michael; Behl, Marc; Lendlein, Andreas

    2011-02-01

    Multiblock copolymers (MBC) composed of blocks of poly(1,4-dioxanone) (PPDO) and poly(ε-caprolactone) (PCL) were investigated in order to gain information on the extent of chemical heterogeneity of the samples. A gradient chromatographic method was established that allows the separation of purely PPDO- from purely PCL-containing chains. Application of the gradient to MBC made of PPDO- and PCL-diols connected by trimethylhexamethylene diisocyanate (TMDI) resulted in two well separated peaks, which were analyzed by means of FTIR, 1H-NMR, and pyrolysis GC-MS. It was shown that the first peak was composed largely of PPDO, with only small amounts of PCL incorporated. Conversely, the second peak consisted predominantly of PCL with only a minor fraction of PPDO. Thus, the MBCs having PPDO and PCL segments show an unexpectedly broad chemical heterogeneity.

  16. Age dependency of base modification in rabbit liver DNA

    NASA Technical Reports Server (NTRS)

    Yamamoto, O.; Fuji, I.; Yoshida, T.; Cox, A. B.; Lett, J. T.

    1988-01-01

    Age-related modifications of DNA bases have been observed in the liver of the New Zealand white (NZW) rabbit (Oryctolagus cuniculus), a lagomorph with a median life span in captivity of 5-7 yr. The ages of the animals studied ranged from 6 wk to 9 yr. After the DNA had been extracted from the liver cell nuclei and hydrolyzed with acid, the bases were analyzed by column chromatography with Cellulofine gels (GC-15-m). Two peaks in the chromatogram, which eluted before the four DNA bases, contained modified bases. Those materials, which were obtained in relatively large amounts from old animals, were highly fluorescent, and were shown to be crosslinked base products by mass spectrometry. The yield of crosslinked products versus rabbit age (greater than 0.5 yr) can be fitted by an exponential function (correlation coefficient: 0.76 +/- 0.09).
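    As a hedged illustration of the exponential fit mentioned above, the sketch below fits y = a·exp(b·t) to yield-versus-age values with SciPy; the numbers are invented placeholders, not the authors' measurements.

    ```python
    # Exponential fit of crosslinked-product yield versus age; data are invented
    # placeholders used only to make the snippet runnable.
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import pearsonr

    age_yr = np.array([0.6, 1.0, 2.0, 3.5, 5.0, 7.0, 9.0])      # ages > 0.5 yr
    yield_au = np.array([0.4, 0.5, 0.8, 1.1, 1.9, 3.2, 4.8])    # crosslink yield (a.u.)

    def exponential(t, a, b):
        return a * np.exp(b * t)

    (a, b), _ = curve_fit(exponential, age_yr, yield_au, p0=(0.3, 0.3))
    r, _ = pearsonr(age_yr, np.log(yield_au))  # correlation on the log scale
    print(f"fit: {a:.2f} * exp({b:.2f} * t), r = {r:.2f}")
    ```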

  17. Senior health clinics: are they financially viable?

    PubMed

    McAtee, Robin E; Crandall, Debra; Wright, Larry D; Beverly, Claudia J

    2009-07-01

    Are hospital-based outpatient interdisciplinary clinics a financially viable alternative for caring for our burgeoning population of older adults in America? Although highly popular, with high patient satisfaction rates among older adults and their families, senior health clinics (SHCs) can be expensive to operate, with limited quantifiable health outcomes. This study analyzed three geriatric hospital-based interdisciplinary clinics in rural Arkansas by examining their patient profiles, revenues, and expenses. It closely examined the effects of the downstream revenue using the multiplier effect and acknowledged other factors that weigh heavily on the success of SHCs and the care of older adults. The findings highlight the similarities and differences in the three clinics' operating and financial structures in addition to the clinics' and providers' productivity. The analysis presents an evidence-based illustration that SHCs can break even or lose large amounts of money.

  18. The architectural form of Qikou Cave dwellings in Chinese "Earth" culture

    NASA Astrophysics Data System (ADS)

    Chen, Xuanchen; Feng, Xinqun

    2018-03-01

    Cave building is not only a kind of architecture with a unique style but also a manifestation of traditional Chinese culture, of which cave culture is an important part. The main purpose of this thesis, which studies the architectural form of Qikou Cave, is to analyze how cave building plays a positive role in promoting the development and application of modern resources and in cultural transmission. Drawing on a large amount of literature and taking Qikou Cave as an example, the paper studies the morphological characteristics of cave building and takes an optimistic outlook on its future development and on the sustainable development of the associated resources. It is expected that cave culture can be further explored to promote traditional Chinese culture and to drive the development of the modern construction industry and resource conservation.

  19. Wildfire smoke transport and impact on air quality observed by a multi-wavelength elastic-Raman lidar and ceilometer in New York City

    NASA Astrophysics Data System (ADS)

    Wu, Yonghua; Peña, Wilson; Gross, Barry.; Moshary, Fred

    2018-04-01

    The intense wildfires in western Canada in May 2016 injected a large amount of smoke into the atmosphere. This paper presents an integrated observation of the event by lidar, ceilometer, and satellite, together with models, and an assessment of smoke plume impacts on local air quality in the New York City (NYC) area. A dense aloft plume on May 20 and a boundary layer plume on May 25 are analyzed. The mixing of smoke into the planetary boundary layer (PBL) and the strong diurnal variation of the PBL top are shown. For the second case, ground PM2.5 measurements show a significant increase in both the urban and upwind non-urban areas of NYC. The smoke sources and transport paths are further verified by satellite observations and HYSPLIT model data.

  20. Response of middle-taiga permafrost landscapes of Central Siberia to global warming in the late 20th and early 21st centuries

    NASA Astrophysics Data System (ADS)

    Medvedkov, Alexey A.

    2016-11-01

    In this paper, regional features of a climatogenic response of the middle-taiga permafrost landscapes of Central Siberia, as well as corresponding transformations of the exodynamic processes, are considered. Lithological-geomorphologic and landscape-geocryological data are analyzed with large amounts of actual data and results of monitoring surveys. Specific features of an ecotone localization of middle-taiga permafrost landscapes and their typical physiognomic characteristics are described. A comprehensive investigation of representative key sites makes it possible to discover the response of different types of permafrost landscapes to regional climate warming. A rapid increase in the active layer depth, slower creep, transformations of the moving kurums, intensive solifluction, and a local replacement of solifluction by landslides-earthflows are revealed within ecotone landscapes of the cryolithozone.
