Sample records for process large amounts

  1. OpenWebGlobe 2: Visualization of Complex 3D-Geodata in the (Mobile) Web Browser

    NASA Astrophysics Data System (ADS)

    Christen, M.

    2016-06-01

    Providing worldwide high-resolution data for virtual globes involves compute- and storage-intensive data processing tasks. Furthermore, rendering complex 3D geodata, such as 3D city models with an extremely high polygon count and a vast amount of textures, at interactive frame rates is still a very challenging task, especially on mobile devices. This paper presents an approach for processing, caching and serving massive geospatial data in a cloud-based environment for large-scale, out-of-core, highly scalable 3D scene rendering on a web-based virtual globe. Cloud computing is used both for processing large amounts of geospatial data and for providing 2D and 3D map data to a large number of (mobile) web clients. The paper shows how very large datasets are processed, rendered and cached in the virtual globe currently under development, "OpenWebGlobe 2", which displays 3D geodata on nearly every device.
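
    The record above describes serving 2D and 3D map data to many web clients from a cloud-hosted cache. A minimal sketch of the standard Web Mercator ("slippy map") tile-addressing scheme that such tile caches commonly use is shown below; it is a generic illustration, not code from OpenWebGlobe 2, and the function name and example coordinates are invented.

        import math

        def lonlat_to_tile(lon_deg, lat_deg, zoom):
            """Map WGS84 longitude/latitude to slippy-map tile indices at a zoom level."""
            n = 2 ** zoom                                   # tiles per axis at this zoom
            x = int((lon_deg + 180.0) / 360.0 * n)
            lat = math.radians(lat_deg)
            y = int((1.0 - math.log(math.tan(lat) + 1.0 / math.cos(lat)) / math.pi) / 2.0 * n)
            return x, y

        # Example: the tile covering Basel at zoom level 12 (illustrative only).
        print(lonlat_to_tile(7.59, 47.56, 12))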

  2. Distributed Processing of Projections of Large Datasets: A Preliminary Study

    USGS Publications Warehouse

    Maddox, Brian G.

    2004-01-01

    Modern information needs have resulted in very large amounts of data being used in geographic information systems. Problems arise, however, when trying to project these data with reasonable speed and accuracy. Current single-threaded methods can suffer from one of two problems: fast projection with poor accuracy, or accurate projection with long processing times. A possible solution is to combine accurate interpolation methods with distributed processing algorithms to quickly and accurately convert digital geospatial data between coordinate systems. Modern technology has made it possible to construct systems, such as Beowulf clusters, at low cost that provide access to supercomputer-class capability. Combining these techniques may make it possible to use large amounts of geographic data in time-critical situations.
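
    The study above combines interpolation with distributed processing to reproject large datasets. A hedged sketch of the general pattern, splitting coordinate blocks across worker processes, is given below; it assumes the pyproj library is available and is not the USGS code itself.

        from multiprocessing import Pool
        from pyproj import Transformer

        def project_block(block):
            """Reproject one block of (lon, lat) pairs; each worker builds its own transformer."""
            transformer = Transformer.from_crs("EPSG:4326", "EPSG:3857", always_xy=True)
            return [transformer.transform(lon, lat) for lon, lat in block]

        if __name__ == "__main__":
            points = [(-77.0 + i * 1e-4, 38.9) for i in range(100_000)]      # synthetic data
            blocks = [points[i:i + 10_000] for i in range(0, len(points), 10_000)]
            with Pool(processes=4) as pool:
                projected = pool.map(project_block, blocks)                  # one block per worker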

  3. The effectiveness of flocculants on inorganic and metallic species removal during aerobic digestion of wastewater from poultry processing plant

    USDA-ARS's Scientific Manuscript database

    A large amount of water is used in processing our food supplies, especially in meat processing plants. The resulting wastewater cannot be discharged freely back into natural settings, due to regulatory mandates, whether the sinks are rivers, ponds, or other natural systems. These wa...

  4. The effectiveness of flocculants on inorganic and metallic species removal during aerobic digestion of wastewater from poultry processing plant

    USDA-ARS's Scientific Manuscript database

    A large amount of water is used in processing our food supplies, especially in meat processing plants. The resulting wastewater cannot be discharged freely back into natural settings, due to regulatory mandates, whether the sinks are rivers, ponds, or other natural systems. These wast...

  5. An algorithm of discovering signatures from DNA databases on a computer cluster.

    PubMed

    Lee, Hsiao Ping; Sheu, Tzu-Fang

    2014-10-05

    Signatures are short sequences that are unique and not similar to any other sequence in a database; they can be used as a basis for identifying different species. Even though several signature discovery algorithms have been proposed in the past, these algorithms require the entire database to be loaded into memory, restricting the amount of data they can process and making them unable to handle databases with large amounts of data. Those algorithms also use sequential models and have slow discovery speeds, so their efficiency can be improved. In this research, we introduce a divide-and-conquer strategy for signature discovery and propose a parallel signature discovery algorithm that runs on a computer cluster. The algorithm applies the divide-and-conquer strategy to overcome the existing algorithms' inability to process large databases, and uses a parallel computing mechanism to effectively improve the efficiency of signature discovery. Even when run with only the memory of regular personal computers, the algorithm can process large databases, such as the human whole-genome EST database, that previously could not be processed by the existing algorithms. The algorithm proposed in this research is not limited by the amount of usable memory and can rapidly find signatures in large databases, making it useful in applications such as next-generation sequencing and other large-database analysis and processing. The implementation of the proposed algorithm is available at http://www.cs.pu.edu.tw/~fang/DDCSDPrograms/DDCSD.htm.
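
    The abstract above outlines a divide-and-conquer, parallel approach to signature discovery. The sketch below illustrates only the general idea on a toy scale (exact-match uniqueness of k-mers, merged across partitions); it is not the DDCSD algorithm and it ignores the similarity criterion.

        from collections import Counter
        from multiprocessing import Pool

        K = 12  # signature length, chosen arbitrarily for illustration

        def count_kmers(sequences):
            """Divide step: count k-mer occurrences within one partition of the database."""
            counts = Counter()
            for seq in sequences:
                for i in range(len(seq) - K + 1):
                    counts[seq[i:i + K]] += 1
            return counts

        def find_signatures(database, n_workers=4):
            """Conquer step: merge partition counts and keep k-mers seen exactly once."""
            chunk = max(1, len(database) // n_workers)
            parts = [database[i:i + chunk] for i in range(0, len(database), chunk)]
            with Pool(n_workers) as pool:
                partial = pool.map(count_kmers, parts)
            total = Counter()
            for c in partial:
                total.update(c)
            return [kmer for kmer, n in total.items() if n == 1]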

  6. DEP : a computer program for evaluating lumber drying costs and investments

    Treesearch

    Stewart Holmes; George B. Harpole; Edward Bilek

    1983-01-01

    The DEP computer program is a modified discounted cash flow program designed for economic analysis of wood drying processes. Wood drying processes differ from other processes because of the large amounts of working capital required to finance inventories, and because of the relatively large share of costs charged to inventory...

  7. Moving zone Marangoni drying of wet objects using naturally evaporated solvent vapor

    DOEpatents

    Britten, Jerald A.

    1997-01-01

    A surface tension gradient driven flow (a Marangoni flow) is used to remove the thin film of water remaining on the surface of an object following rinsing. The process passively introduces by natural evaporation and diffusion of minute amounts of alcohol (or other suitable material) vapor in the immediate vicinity of a continuously refreshed meniscus of deionized water or another aqueous-based, nonsurfactant rinsing agent. Used in conjunction with cleaning, developing or wet etching application, rinsing coupled with Marangoni drying provides a single-step process for 1) cleaning, developing or etching, 2) rinsing, and 3) drying objects such as flat substrates or coatings on flat substrates without necessarily using heat, forced air flow, contact wiping, centrifugation or large amounts of flammable solvents. This process is useful in one-step cleaning and drying of large flat optical substrates, one-step developing/rinsing and drying or etching/rinsing/drying of large flat patterned substrates and flat panel displays during lithographic processing, and room-temperature rinsing/drying of other large parts, sheets or continuous rolls of material.

  8. Moving zone Marangoni drying of wet objects using naturally evaporated solvent vapor

    DOEpatents

    Britten, J.A.

    1997-08-26

    A surface tension gradient driven flow (a Marangoni flow) is used to remove the thin film of water remaining on the surface of an object following rinsing. The process passively introduces by natural evaporation and diffusion of minute amounts of alcohol (or other suitable material) vapor in the immediate vicinity of a continuously refreshed meniscus of deionized water or another aqueous-based, nonsurfactant rinsing agent. Used in conjunction with cleaning, developing or wet etching application, rinsing coupled with Marangoni drying provides a single-step process for (1) cleaning, developing or etching, (2) rinsing, and (3) drying objects such as flat substrates or coatings on flat substrates without necessarily using heat, forced air flow, contact wiping, centrifugation or large amounts of flammable solvents. This process is useful in one-step cleaning and drying of large flat optical substrates, one-step developing/rinsing and drying or etching/rinsing/drying of large flat patterned substrates and flat panel displays during lithographic processing, and room-temperature rinsing/drying of other large parts, sheets or continuous rolls of material. 5 figs.

  9. Impact of the BALLOTS Shared Cataloging System on the Amount of Change in the Library Technical Processing Department.

    ERIC Educational Resources Information Center

    Kershner, Lois M.

    The amount of change resulting from the implementation of the Bibliographic Automation of Large Library Operations using a Time-sharing System (BALLOTS) is analyzed, in terms of (1) physical room arrangement, (2) work procedure, and (3) organizational structure. Also considered is the factor of amount of time the new system has been in use.…

  10. Thermal behavior of copper processed by ECAP at elevated temperatures

    NASA Astrophysics Data System (ADS)

    Gonda, Viktor

    2018-05-01

    A large amount of strengthening can be achieved in copper by equal channel angular pressing (ECAP), through the severe plastic deformation applied during processing. For pure metals, this high strength is accompanied by low thermal stability due to the large activation energy for recrystallization. In the present paper, the chosen technological route was elevated-temperature single-pass ECAP processing of copper, and its effect on the thermal behavior of the deformed samples during restoration processes was studied.

  11. Highly Sensitive GMO Detection Using Real-Time PCR with a Large Amount of DNA Template: Single-Laboratory Validation.

    PubMed

    Mano, Junichi; Hatano, Shuko; Nagatomi, Yasuaki; Futo, Satoshi; Takabatake, Reona; Kitta, Kazumi

    2018-03-01

    Current genetically modified organism (GMO) detection methods allow for sensitive detection. However, a further increase in sensitivity will enable more efficient testing for large grain samples and reliable testing for processed foods. In this study, we investigated real-time PCR-based GMO detection methods using a large amount of DNA template. We selected target sequences that are commonly introduced into many kinds of GM crops, i.e., 35S promoter and nopaline synthase (NOS) terminator. This makes the newly developed method applicable to a wide range of GMOs, including some unauthorized ones. The estimated LOD of the new method was 0.005% of GM maize events; to the best of our knowledge, this method is the most sensitive among the GM maize detection methods for which the LOD was evaluated in terms of GMO content. A 10-fold increase in the DNA amount as compared with the amount used under common testing conditions gave an approximately 10-fold reduction in the LOD without PCR inhibition. Our method is applicable to various analytical samples, including processed foods. The use of other primers and fluorescence probes would permit highly sensitive detection of various recombinant DNA sequences besides the 35S promoter and NOS terminator.

  12. Semantic orchestration of image processing services for environmental analysis

    NASA Astrophysics Data System (ADS)

    Ranisavljević, Élisabeth; Devin, Florent; Laffly, Dominique; Le Nir, Yannick

    2013-09-01

    In order to analyze environmental dynamics, a major process is the classification of the different phenomena at a site (e.g., ice and snow for a glacier). When using in situ pictures, this classification requires data pre-processing. Not all pictures need the same sequence of processes; it depends on the disturbances. Until now, these sequences have been composed manually, which restricts the processing of large amounts of data. In this paper, we present how to realize a semantic orchestration to automate the sequencing for the analysis. It combines two advantages: solving the problem of the amount of processing, and diversifying the possibilities in the data processing. We define a BPEL description to express the sequences. This BPEL uses web services to run the data processing. Each web service is semantically annotated using an ontology of image processing. The dynamic modification of the BPEL is done using SPARQL queries on these annotated web services. The results obtained with a prototype implementing this method validate the construction of the different workflows that can be applied to a large number of pictures.
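
    The approach above selects semantically annotated image-processing services with SPARQL before building the BPEL sequence. A minimal sketch of that selection step, using rdflib with an invented mini-ontology (all terms and service names are hypothetical), is shown below.

        from rdflib import Graph, Literal, Namespace, RDF

        EX = Namespace("http://example.org/imgproc#")   # illustrative ontology namespace
        g = Graph()
        g.add((EX.deblurService, RDF.type, EX.ImageService))
        g.add((EX.deblurService, EX.corrects, Literal("blur")))
        g.add((EX.snowMaskService, RDF.type, EX.ImageService))
        g.add((EX.snowMaskService, EX.corrects, Literal("snow")))

        # Pick the services that handle the disturbance detected in a picture;
        # the resulting list would drive the dynamic modification of the BPEL.
        query = """
            SELECT ?service WHERE {
                ?service a ex:ImageService ;
                         ex:corrects "snow" .
            }
        """
        for row in g.query(query, initNs={"ex": EX}):
            print(row.service)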

  13. Human-Machine Cooperation in Large-Scale Multimedia Retrieval: A Survey

    ERIC Educational Resources Information Center

    Shirahama, Kimiaki; Grzegorzek, Marcin; Indurkhya, Bipin

    2015-01-01

    "Large-Scale Multimedia Retrieval" (LSMR) is the task to fast analyze a large amount of multimedia data like images or videos and accurately find the ones relevant to a certain semantic meaning. Although LSMR has been investigated for more than two decades in the fields of multimedia processing and computer vision, a more…

  14. Using mobile computers to automate the inspection process for highway construction projects.

    DOT National Transportation Integrated Search

    2013-12-01

    Highway construction projects are characterized by the large amount of data that needs to be collected, processed, and exchanged among the : different project participants. Collection of construction inspection data, in particular, allows field perso...

  15. Genetics Home Reference: lysinuric protein intolerance

    MedlinePlus

    ... abnormally large amount of these amino acids in urine. A shortage of lysine, arginine, and ornithine disrupts many vital functions. Arginine and ornithine are involved in a cellular process called the urea cycle, which processes excess nitrogen (in the form ...

  16. Exploring Cloud Computing for Large-scale Scientific Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Guang; Han, Binh; Yin, Jian

    This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications that often require only a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high-performance hardware with low-latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high-performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a systems biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.

  17. A review of concentrated flow erosion processes on rangelands: fundamental understanding and knowledge gaps

    USDA-ARS's Scientific Manuscript database

    Concentrated flow erosion processes are distinguished from splash and sheetflow processes in their enhanced ability to mobilize and transport large amounts of soil, water and dissolved elements. On rangelands, soil, nutrients and water are scarce and only narrow margins of resource losses are tolera...

  18. Data Processing Factory for the Sloan Digital Sky Survey

    NASA Astrophysics Data System (ADS)

    Stoughton, Christopher; Adelman, Jennifer; Annis, James T.; Hendry, John; Inkmann, John; Jester, Sebastian; Kent, Steven M.; Kuropatkin, Nickolai; Lee, Brian; Lin, Huan; Peoples, John, Jr.; Sparks, Robert; Tucker, Douglas; Vanden Berk, Dan; Yanny, Brian; Yocum, Dan

    2002-12-01

    The Sloan Digital Sky Survey (SDSS) data handling presents two challenges: large data volume and timely production of spectroscopic plates from imaging data. A data processing factory, using technologies both old and new, handles this flow. Distribution to end users is via disk farms, to serve corrected images and calibrated spectra, and a database, to efficiently process catalog queries. For distribution of modest amounts of data from Apache Point Observatory to Fermilab, scripts use rsync to update files, while larger data transfers are accomplished by shipping magnetic tapes commercially. All data processing pipelines are wrapped in scripts to address consecutive phases: preparation, submission, checking, and quality control. We constructed the factory by chaining these pipelines together while using an operational database to hold processed imaging catalogs. The science database catalogs all imaging and spectroscopic objects, with pointers to the various external files associated with them. Diverse computing systems address particular processing phases. UNIX computers handle tape reading and writing, as well as calibration steps that require access to a large amount of data with relatively modest computational demands. Commodity CPUs process steps that require access to a limited amount of data with more demanding computational requirements. Disk servers optimized for cost per Gbyte serve terabytes of processed data, while servers optimized for disk read speed run SQLServer software to process queries on the catalogs. This factory produced data for the SDSS Early Data Release in June 2001, and it is currently producing Data Release One, scheduled for January 2003.

  19. Hydrocyclone/Filter for Concentrating Biomarkers from Soil

    NASA Technical Reports Server (NTRS)

    Ponce, Adrian; Obenhuber, Donald

    2008-01-01

    The hydrocyclone-filtration extractor (HFE), now undergoing development, is a simple, robust apparatus for processing large amounts of soil to extract trace amounts of microorganisms, soluble organic compounds, and other biomarkers from soil and to concentrate the extracts in amounts sufficient to enable such traditional assays as cell culturing, deoxyribonucleic acid (DNA) analysis, and isotope analysis. Originally intended for incorporation into a suite of instruments for detecting signs of life on Mars, the HFE could also be used on Earth for similar purposes, including detecting trace amounts of biomarkers or chemical wastes in soils.

  20. Analyzing large scale genomic data on the cloud with Sparkhit

    PubMed Central

    Huang, Liren; Krüger, Jan

    2018-01-01

    Abstract Motivation The increasing amount of next-generation sequencing data poses a fundamental challenge for large-scale genomic analytics. Existing tools use different distributed computational platforms to scale out bioinformatics workloads. However, these tools do not scale efficiently and have heavy run-time overheads when pre-processing large amounts of data. To address these limitations, we have developed Sparkhit: a distributed bioinformatics framework built on top of the Apache Spark platform. Results Sparkhit integrates a variety of analytical methods. It is implemented in the Spark extended MapReduce model. It runs 92–157 times faster than MetaSpark on metagenomic fragment recruitment and 18–32 times faster than Crossbow on data pre-processing. We analyzed 100 terabytes of data across four genomic projects in the cloud in 21 h, which includes the run times of cluster deployment and data downloading. Furthermore, our application on the entire Human Microbiome Project shotgun sequencing data was completed in 2 h, presenting an approach to easily associate large amounts of public datasets with reference data. Availability and implementation Sparkhit is freely available at: https://rhinempi.github.io/sparkhit/. Contact asczyrba@cebitec.uni-bielefeld.de Supplementary information Supplementary data are available at Bioinformatics online. PMID:29253074
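
    Sparkhit builds on the Spark extended MapReduce model. The fragment below is a generic PySpark sketch of that style of pre-processing (counting k-mers across reads in parallel); it is not Sparkhit code, and the input and output paths are placeholders.

        from pyspark import SparkContext

        sc = SparkContext(appName="kmer-count-sketch")
        K = 21

        reads = sc.textFile("hdfs:///data/reads.txt")             # one read per line (placeholder path)
        kmer_counts = (reads
                       .flatMap(lambda r: [r[i:i + K] for i in range(len(r) - K + 1)])
                       .map(lambda kmer: (kmer, 1))
                       .reduceByKey(lambda a, b: a + b))           # MapReduce-style aggregation
        kmer_counts.saveAsTextFile("hdfs:///data/kmer_counts")     # placeholder output path
        sc.stop()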

  1. The Development and Microstructure Analysis of High Strength Steel Plate NVE36 for Large Heat Input Welding

    NASA Astrophysics Data System (ADS)

    Peng, Zhang; Liangfa, Xie; Ming, Wei; Jianli, Li

    In the shipbuilding industry, the welding efficiency of ship plate not only has a great effect on the construction cost of a ship, but also affects construction speed and determines the delivery cycle. Steel plate for large heat input welding has accordingly been extensively developed. In this paper, the composition of the steel was designed along a micro-alloying route, with small amounts of Nb and Ti and a large amount of Mn; the C content and the carbon equivalent were also kept at a low level. Oxide metallurgy technology was used during the smelting of the steel. The TMCP rolling schedule was controlled at a low rolling temperature and ultra-fast cooling was used, for the purpose of controlling the transformation of the microstructure. The microstructure of the steel plate was controlled to be a mixed microstructure of low-carbon bainite and ferrite. A large number of oxide particles were dispersed in the microstructure of the steel, which had a positive effect on the mechanical properties and welding performance. The mechanical properties of the steel plate were excellent, and the longitudinal Akv value at -60 °C is more than 200 J. The toughness of the WM and HAZ was excellent after the steel plate was welded with a large heat input of 100-250 kJ/cm. The steel plate processed as described above can meet the requirements of large heat input welding.

  2. LanzaTech- Capturing Carbon. Fueling Growth.

    ScienceCinema

    NONE

    2018-01-16

    LanzaTech will design a gas fermentation system that will significantly improve the rate at which methane gas is delivered to a biocatalyst. Current gas fermentation processes are not cost effective compared to other gas-to-liquid technologies because they are too slow for large-scale production. If successful, LanzaTech's system will process large amounts of methane at a high rate, reducing the energy inputs and costs associated with methane conversion.

  3. Really big data: Processing and analysis of large datasets

    USDA-ARS's Scientific Manuscript database

    Modern animal breeding datasets are large and getting larger, due in part to the recent availability of DNA data for many animals. Computational methods for efficiently storing and analyzing those data are under development. The amount of storage space required for such datasets is increasing rapidl...

  4. Automatic recognition of lactating sow behaviors through depth image processing

    USDA-ARS's Scientific Manuscript database

    Manual observation and classification of animal behaviors is laborious, time-consuming, and of limited ability to process large amounts of data. A computer vision-based system was developed that automatically recognizes sow behaviors (lying, sitting, standing, kneeling, feeding, drinking, and shiftin...

  5. The Dynamics of Pheromone Gland Synthesis and Release: a Paradigm Shift for Understanding Sex Pheromone Quantity in Female Moths.

    PubMed

    Foster, Stephen P; Anderson, Karin G; Casas, Jérôme

    2018-05-10

    Moths are exemplars of chemical communication, especially with regard to specificity and the minute amounts they use. Yet, little is known about how females manage synthesis and storage of pheromone to maintain release rates attractive to conspecific males and why such small amounts are used. We developed, for the first time, a quantitative model, based on an extensive empirical data set, describing the dynamical relationship among synthesis, storage (titer) and release of pheromone over time in a moth (Heliothis virescens). The model is compartmental, with one major state variable (titer), one time-varying (synthesis), and two constant (catabolism and release) rates. The model was a good fit, suggesting it accounted for the major processes. Overall, we found the relatively small amounts of pheromone stored and released were largely a function of high catabolism rather than a low rate of synthesis. A paradigm shift may be necessary to understand the low amounts released by female moths, away from the small quantities synthesized to the (relatively) large amounts catabolized. Future research on pheromone quantity should focus on structural and physicochemical processes that limit storage and release rate quantities. To our knowledge, this is the first time that pheromone gland function has been modeled for any animal.
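
    Written out, the compartmental description in the abstract (one state variable for titer, a time-varying synthesis rate, and constant catabolism and release rate constants) corresponds to a single linear ODE of the form below; the notation is ours, not necessarily the authors'.

        \frac{dT(t)}{dt} = S(t) - (k_c + k_r)\,T(t), \qquad R(t) = k_r\,T(t)

    where T(t) is the pheromone titer, S(t) the synthesis rate, k_c and k_r the catabolism and release rate constants, and R(t) the release rate.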

  6. Stock flow diagram analysis on solid waste management in Malaysia

    NASA Astrophysics Data System (ADS)

    Zulkipli, Faridah; Nopiah, Zulkifli Mohd; Basri, Noor Ezlin Ahmad; Kie, Cheng Jack

    2016-10-01

    The effectiveness of solid waste management is of major importance to society. The large amounts of solid waste generated by our daily activities pose risks to our communities, driven by rapid population growth and advances in economic development. Moreover, solid waste management is inherently large in scale, diverse, and subject to uncertainty, and it must serve stakeholders with differing objectives. In this paper, we propose a system dynamics simulation, developing a stock flow diagram to illustrate the solid waste generation process and the waste recycling process. The analysis highlights the impact of an increasing population on the amount of solid waste generated and the amount of recycled waste. The results show that an increase in population, together with an increase in the amount of recycled waste, will decrease the amount of waste generated. This positively reflects the government's aim of minimizing the amount of waste to be disposed of by 2020.
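
    The stock-flow structure described above (population driving waste generation, with a recycled fraction diverted from disposal) can be sketched with a simple Euler-integration loop; all coefficients below are invented for illustration and are not taken from the paper.

        population = 30.0e6          # people (assumed)
        waste_per_capita = 0.4       # tonnes generated per person per year (assumed)
        recycle_rate = 0.10          # fraction of generated waste recycled (assumed)
        pop_growth = 0.015           # annual population growth rate (assumed)

        landfill_stock = 0.0         # cumulative waste sent to disposal, tonnes
        for year in range(2016, 2021):
            generated = population * waste_per_capita       # inflow: waste generated this year
            recycled = recycle_rate * generated             # flow diverted to recycling
            landfill_stock += generated - recycled          # stock accumulates the remainder
            population *= 1.0 + pop_growth
            recycle_rate = min(recycle_rate + 0.02, 1.0)    # assumed gradual policy improvement
            print(year, round(generated / 1e6, 2), round(landfill_stock / 1e6, 2))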

  7. Large Faraday effect of borate glasses with high Tb3+ content prepared by containerless processing

    NASA Astrophysics Data System (ADS)

    Suzuki, Futoshi; Sato, Fumio; Oshita, Hiroyuki; Yao, Situ; Nakatsuka, Yuko; Tanaka, Katsuhisa

    2018-02-01

    Borate glasses containing a large amount of Tb3+ ions have been prepared by containerless processing. The content of Tb2O3 reached 60 mol%. The glass bearing the highest content of Tb3+ ions showed a large Faraday effect; the Verdet constant was 234 rad/(T·m). Annealing of the glasses in H2/N2 atmosphere resulted in a low optical absorption coefficient, leading to an extremely large magneto-optical figure of merit that was ∼1.7 times higher than that of a Tb3Ga5O12 single crystal.

  8. Beowulf Distributed Processing and the United States Geological Survey

    USGS Publications Warehouse

    Maddox, Brian G.

    2002-01-01

    Introduction In recent years, the United States Geological Survey's (USGS) National Mapping Discipline (NMD) has expanded its scientific and research activities. Work is being conducted in areas such as emergency response research, scientific visualization, urban prediction, and other simulation activities. Custom-produced digital data have become essential for these types of activities. High-resolution, remotely sensed datasets are also seeing increased use. Unfortunately, the NMD is also finding that it lacks the resources required to perform some of these activities. Many of these projects require large amounts of computer processing resources. Complex urban-prediction simulations, for example, involve large amounts of processor-intensive calculations on large amounts of input data. This project was undertaken to learn and understand the concepts of distributed processing. Experience was needed in developing these types of applications. The idea was that this type of technology could significantly aid the needs of the NMD scientific and research programs. Porting a numerically intensive application currently being used by an NMD science program to run in a distributed fashion would demonstrate the usefulness of this technology. There are several benefits that this type of technology can bring to the USGS's research programs. Projects can be performed that were previously impossible due to a lack of computing resources. Other projects can be performed on a larger scale than previously possible. For example, distributed processing can enable urban dynamics research to perform simulations on larger areas without making huge sacrifices in resolution. The processing can also be done in a more reasonable amount of time than with traditional single-threaded methods (a scaled version of Chester County, Pennsylvania, took about fifty days to finish its first calibration phase with a single-threaded program). This paper has several goals regarding distributed processing technology. It will describe the benefits of the technology. Real data about a distributed application will be presented as an example of the benefits that this technology can bring to USGS scientific programs. Finally, some of the issues with distributed processing that relate to USGS work will be discussed.

  9. Precision Measurement for the Grasp of Welding Deformation Amount in Time Series for Large-Scale Industrial Products

    NASA Astrophysics Data System (ADS)

    Abe, R.; Hamada, K.; Hirata, N.; Tamura, R.; Nishi, N.

    2015-05-01

    As with BIM-based quality management in the construction industry, there is strong demand for quality management of member manufacturing processes in the shipbuilding field. It is strongly demanded that the time series of three-dimensional deformation at each process step be grasped accurately. In this study, focusing on the shipbuilding field, a three-dimensional measurement method is examined. In a shipyard, large equipment and components are intricately arranged in a limited space, so the placement of measuring equipment and targets is constrained. In addition, the element to be measured moves between process steps, so establishing reference points for time-series comparison requires care. This paper discusses a method for measuring welding deformation in time series using a total station. In particular, multiple sets of measurement data obtained with this approach were used to evaluate the amount of deformation at each process step.

  10. Variable Stars in the Field of V729 Aql

    NASA Astrophysics Data System (ADS)

    Cagaš, P.

    2017-04-01

    Wide-field instruments can be used to acquire light curves of tens or even hundreds of variable stars per night, which increases the probability of new discoveries of interesting variable stars and generally increases the efficiency of observations. At the same time, wide-field instruments produce a large amount of data, which must be processed using advanced software. The traditional approach, typically used by amateur astronomers, requires an unacceptable amount of time to process each data set. New functionality built into the SIPS software package can shorten the time needed to obtain light curves by several orders of magnitude. The newly introduced SILICUPS software is intended for post-processing of stored light curves. It can be used to visualize observations from many nights, to find variable star periods, to evaluate types of variability, etc. This work provides an overview of the tools used to process data from the large field of view around the variable star V729 Aql and demonstrates the results.
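
    SILICUPS itself is not shown in the record; as a generic illustration of the period-finding step it performs, the sketch below runs a Lomb-Scargle period search on a synthetic, unevenly sampled light curve using astropy.

        import numpy as np
        from astropy.timeseries import LombScargle

        rng = np.random.default_rng(0)
        t = np.sort(rng.uniform(0.0, 30.0, 500))            # observation times in days (synthetic)
        true_period = 0.71
        mag = 12.0 + 0.3 * np.sin(2 * np.pi * t / true_period) + rng.normal(0, 0.02, t.size)

        frequency, power = LombScargle(t, mag).autopower()  # periodogram over an automatic grid
        best_period = 1.0 / frequency[np.argmax(power)]
        print(f"best period: {best_period:.3f} d")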

  11. Leveraging human oversight and intervention in large-scale parallel processing of open-source data

    NASA Astrophysics Data System (ADS)

    Casini, Enrico; Suri, Niranjan; Bradshaw, Jeffrey M.

    2015-05-01

    The popularity of cloud computing, along with the increased availability of cheap storage, has led to the need to elaborate and transform large volumes of open-source data in parallel. One way to handle such extensive volumes of information properly is to take advantage of distributed computing frameworks like MapReduce. Unfortunately, an entirely automated approach that excludes human intervention is often unpredictable and error prone. Highly accurate data processing and decision-making can be achieved by supporting an automatic process through human collaboration, in a variety of environments such as warfare, cyber security and threat monitoring. Although this mutual participation seems easily exploitable, human-machine collaboration in the field of data analysis presents several challenges. First, due to the asynchronous nature of human intervention, it is necessary to verify that once a correction is made, all the necessary reprocessing is carried out down the chain. Second, it is often necessary to minimize the amount of reprocessing in order to optimize the usage of resources of limited availability. To address these strict requirements, this paper introduces improvements to an innovative approach for human-machine collaboration in the processing of large amounts of open-source data in parallel.

  12. Autonomous Object Characterization with Large Datasets

    DTIC Science & Technology

    2015-10-18

    desk, where a substantial amount of effort is required to transform raw photometry into a data product, minimizing the amount of time the analyst has...were used to explore concepts in satellite characterization and satellite state change. The first algorithm provides real- time stability estimation... Timely and effective space object (SO) characterization is a challenge, and requires advanced data processing techniques. Detection and identification

  13. Integrated Data Capturing Requirements for 3d Semantic Modelling of Cultural Heritage: the Inception Protocol

    NASA Astrophysics Data System (ADS)

    Di Giulio, R.; Maietti, F.; Piaia, E.; Medici, M.; Ferrari, F.; Turillazzi, B.

    2017-02-01

    The generation of high-quality 3D models can still be very time-consuming and expensive, and the outcome of digital reconstructions is frequently provided in formats that are not interoperable and therefore cannot be easily accessed. This challenge is even more crucial for complex architectures and large heritage sites, which involve a large amount of data to be acquired, managed and enriched by metadata. In this framework, the ongoing EU-funded project INCEPTION - Inclusive Cultural Heritage in Europe through 3D semantic modelling - proposes a workflow aimed at efficient 3D digitization methods, post-processing tools for enriched semantic modelling, and web-based solutions and applications to ensure wide access for experts and non-experts. In order to face these challenges and to start solving the issue of the large amount of captured data and the time-consuming processes involved in producing 3D digital models, an Optimized Data Acquisition Protocol (DAP) has been set up. Its purpose is to guide the digitization of cultural heritage, respecting the needs, requirements and specificities of cultural assets.

  14. Stream computing for biomedical signal processing: A QRS complex detection case-study.

    PubMed

    Murphy, B M; O'Driscoll, C; Boylan, G B; Lightbody, G; Marnane, W P

    2015-01-01

    Recent developments in "Big Data" have brought significant gains in the ability to process large amounts of data on commodity server hardware. Stream computing is a relatively new paradigm in this area, addressing the need to process data in real time with very low latency. While this approach has been developed for dealing with large scale data from the world of business, security and finance, there is a natural overlap with clinical needs for physiological signal processing. In this work we present a case study of streams processing applied to a typical physiological signal processing problem: QRS detection from ECG data.
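
    The case study applies stream computing to QRS detection; the record does not include the detector itself. The toy sketch below processes one ECG sample at a time, the way a stream operator would, flagging a beat when a smoothed slope-energy estimate exceeds an adaptive baseline; all parameters are illustrative.

        class StreamingQRSDetector:
            """Toy per-sample QRS detector (not the detector used in the paper)."""

            def __init__(self, fs=250, refractory_s=0.25, factor=4.0):
                self.fs = fs
                self.refractory = int(refractory_s * fs)   # samples to skip after a detection
                self.factor = factor                       # detection threshold multiplier
                self.prev = 0.0
                self.energy = 0.0
                self.baseline = 1e-6
                self.since_last = 10**9
                self.n = 0

            def process(self, sample):
                """Return True if this incoming sample is judged to be a QRS complex."""
                slope = sample - self.prev
                self.prev = sample
                self.energy = 0.8 * self.energy + 0.2 * slope * slope        # fast slope energy
                self.baseline = 0.999 * self.baseline + 0.001 * self.energy  # slow noise estimate
                self.since_last += 1
                self.n += 1
                fired = (self.energy > self.factor * self.baseline
                         and self.since_last > self.refractory
                         and self.n > self.fs)                               # let estimates settle
                if fired:
                    self.since_last = 0
                return fired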

  15. GPU applications for data processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vladymyrov, Mykhailo, E-mail: mykhailo.vladymyrov@cern.ch; Aleksandrov, Andrey; INFN sezione di Napoli, I-80125 Napoli

    2015-12-31

    Modern experiments that use nuclear photoemulsion require fast and efficient data acquisition from the emulsion. New approaches to developing scanning systems require real-time processing of large amounts of data. Methods that use Graphical Processing Unit (GPU) computing power for emulsion data processing are presented here. It is shown how GPU-accelerated emulsion processing helped us raise the scanning speed by a factor of nine.
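
    The scanning-system code is not part of the record; as a generic illustration of offloading bulk image operations to a GPU from Python, the sketch below thresholds a stack of frames with CuPy (array shapes and the threshold are invented).

        import numpy as np
        import cupy as cp

        frames_cpu = np.random.randint(0, 256, size=(64, 1024, 1024), dtype=np.uint8)

        frames_gpu = cp.asarray(frames_cpu)              # host -> device copy
        bright = (frames_gpu > 200).sum(axis=(1, 2))     # per-frame bright-pixel count on the GPU
        print(cp.asnumpy(bright)[:5])                    # device -> host copy of the result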

  16. A divide-and-conquer algorithm for large-scale de novo transcriptome assembly through combining small assemblies from existing algorithms.

    PubMed

    Sze, Sing-Hoi; Parrott, Jonathan J; Tarone, Aaron M

    2017-12-06

    While the continued development of high-throughput sequencing has facilitated studies of entire transcriptomes in non-model organisms, the incorporation of an increasing number of RNA-Seq libraries has made de novo transcriptome assembly difficult. Although algorithms that can assemble a large amount of RNA-Seq data are available, they are generally very memory-intensive and can only be used to construct small assemblies. We develop a divide-and-conquer strategy that allows these algorithms to be utilized, by subdividing a large RNA-Seq data set into small libraries. Each individual library is assembled independently by an existing algorithm, and a merging algorithm is developed to combine these assemblies by picking a subset of high-quality transcripts to form a large transcriptome. When compared to existing algorithms that return a single assembly directly, this strategy achieves accuracy comparable to or higher than that of memory-efficient algorithms that can process a large amount of RNA-Seq data, and accuracy comparable to or lower than that of memory-intensive algorithms that can only be used to construct small assemblies. Our divide-and-conquer strategy allows memory-intensive de novo transcriptome assembly algorithms to be utilized to construct large assemblies.

  17. Models of resource planning during formation of calendar construction plans for erection of high-rise buildings

    NASA Astrophysics Data System (ADS)

    Pocebneva, Irina; Belousov, Vadim; Fateeva, Irina

    2018-03-01

    This article provides a methodical description of resource-time analysis for a wide range of requirements imposed on resource consumption processes in scheduling tasks during the construction of high-rise buildings and facilities. The core of the proposed approach is the resource models being determined. Generalized network models are the elements of those models, and their number can be too large for each element to be analyzed individually. Therefore, the problem is to approximate the original resource model by simpler time models whose number is not very large.

  18. Assimilation of granite by basaltic magma at Burnt Lava flow, Medicine Lake volcano, northern California: Decoupling of heat and mass transfer

    USGS Publications Warehouse

    Grove, T.L.; Kinzler, R.J.; Baker, M.B.; Donnelly-Nolan, J. M.; Lesher, C.E.

    1988-01-01

    At Medicine Lake volcano, California, andesite of the Holocene Burnt Lava flow has been produced by fractional crystallization of parental high alumina basalt (HAB) accompanied by assimilation of granitic crustal material. Burnt Lava contains inclusions of quenched HAB liquid, a potential parent magma of the andesite, highly melted granitic crustal xenoliths, and xenocryst assemblages which provide a record of the fractional crystallization and crustal assimilation process. Samples of granitic crustal material occur as xenoliths in other Holocene and Pleistocene lavas, and these xenoliths are used to constrain geochemical models of the assimilation process. A large amount of assimilation accompanied fractional crystallization to produce the contaminated Burnt lava andesites. Models which assume that assimilation and fractionation occurred simultaneously estimate the ratio of assimilation to fractional crystallization (R) to be >1 and best fits to all geochemical data are at an R value of 1.35 at F=0.68. Petrologic evidence, however, indicates that the assimilation process did not involve continuous addition of granitic crust as fractionation occurred. Instead, heat and mass transfer were separated in space and time. During the assimilation process, HAB magma underwent large amounts of fractional crystallization which was not accompanied by significant amounts of assimilation. This fractionation process supplied heat to melt granitic crust. The models proposed to explain the contamination process involve fractionation, replenishment by parental HAB, and mixing of evolved and parental magmas with melted granitic crust. ?? 1988 Springer-Verlag.

  19. User Oriented Techniques to Support Interaction and Decision Making with Large Educational Databases

    ERIC Educational Resources Information Center

    Hartley, Roger; Almuhaidib, Saud M. Y.

    2007-01-01

    Information Technology is developing rapidly and providing policy/decision makers with large amounts of information that require processing and analysis. Decision support systems (DSS) aim to provide tools that not only help such analyses, but enable the decision maker to experiment and simulate the effects of different policies and selection…

  20. Implicit and Explicit Cognitive Processes in Incidental Vocabulary Acquisition

    ERIC Educational Resources Information Center

    Ender, Andrea

    2016-01-01

    Studies on vocabulary acquisition in second language learning have revealed that a large amount of vocabulary is learned without an overt intention, in other words, incidentally. This article investigates the relevance of different lexical processing strategies for vocabulary acquisition when reading a text for comprehension among 24 advanced…

  1. A process-based emission model for volatile organic compounds from silage sources on farms

    USDA-ARS's Scientific Manuscript database

    Silage on dairy farms can emit large amounts of volatile organic compounds (VOCs), a precursor in the formation of tropospheric ozone. Because of the challenges associated with direct measurements, process-based modeling is another approach for estimating emissions of air pollutants from sources suc...

  2. Development of a Two-Stage Microalgae Dewatering Process – A Life Cycle Assessment Approach

    PubMed Central

    Soomro, Rizwan R.; Zeng, Xianhai; Lu, Yinghua; Lin, Lu; Danquah, Michael K.

    2016-01-01

    Even though microalgal biomass is leading the third generation biofuel research, significant effort is required to establish an economically viable commercial-scale microalgal biofuel production system. Whilst a significant amount of work has been reported on large-scale cultivation of microalgae using photo-bioreactors and pond systems, research focus on establishing high performance downstream dewatering operations for large-scale processing under optimal economy is limited. The enormous amount of energy and associated cost required for dewatering large-volume microalgal cultures has been the primary hindrance to the development of the needed biomass quantity for industrial-scale microalgal biofuels production. The extremely dilute nature of large-volume microalgal suspension and the small size of microalgae cells in suspension create a significant processing cost during dewatering and this has raised major concerns towards the economic success of commercial-scale microalgal biofuel production as an alternative to conventional petroleum fuels. This article reports an effective framework to assess the performance of different dewatering technologies as the basis to establish an effective two-stage dewatering system. Bioflocculation coupled with tangential flow filtration (TFF) emerged a promising technique with total energy input of 0.041 kWh, 0.05 kg CO2 emissions and a cost of $ 0.0043 for producing 1 kg of microalgae biomass. A streamlined process for operational analysis of two-stage microalgae dewatering technique, encompassing energy input, carbon dioxide emission, and process cost, is presented. PMID:26904075

  3. Pollution Prevention Guideline for Academic Laboratories.

    ERIC Educational Resources Information Center

    Li, Edwin; Barnett, Stanley M.; Ray, Barbara

    2003-01-01

    Explains how to manage waste after a classroom laboratory experiment which generally has the potential to generate large amounts of waste. Focuses on pollution prevention and the selection processes to eliminate or minimize waste. (YDS)

  4. Supercritical Fluid Extraction of Bioactive Compounds from Plant Materials.

    PubMed

    Wrona, Olga; Rafińska, Katarzyna; Możeński, Cezary; Buszewski, Bogusław

    2017-11-01

    There has been growing interest in the application of supercritical solvents over the last several years, with many of the applications industrial in nature. The purpose of plant material extraction is to obtain large amounts of extract rich in the desired active compounds in a time-sensitive and cost-effective manner. The productivity and profitability of a supercritical fluid extraction (SFE) process largely depend on the selection of process parameters, which are elaborated upon in this paper. Carbon dioxide (CO2) is the most desirable solvent for the supercritical extraction of natural products. Its near-ambient critical temperature makes it suitable for the extraction of thermolabile components without degradation. A new approach has been adopted for SFE in which the solubility of nonpolar supercritical CO2 can be enhanced by the addition of small amounts of cosolvent.

  5. Spontaneous, generalized lipidosis in captive greater horseshoe bats (Rhinolophus ferrumequinum).

    PubMed

    Gozalo, Alfonso S; Schwiebert, Rebecca S; Metzner, Walter; Lawson, Gregory W

    2005-11-01

    During a routine 6-month quarantine period, 3 of 34 greater horseshoe bats (Rhinolophus ferrumequinum) captured in mainland China and transported to the United States for use in echolocation studies were found dead with no prior history of illness. All animals were in good body condition at the time of death. At necropsy, a large amount of white fat was found within the subcutis, especially in the sacrolumbar region. The liver, kidneys, and heart were diffusely tan in color. Microscopic examination revealed that hepatocytes throughout the liver were filled with lipid, and in some areas, lipid granulomas were present. Renal lesions included moderate amounts of lipid in the cortical tubular epithelium and large amounts of protein and lipid within Bowman's capsules in the glomeruli. In addition, one bat had large lipid vacuoles diffusely distributed throughout the myocardium. The exact pathologic mechanism inducing the hepatic, renal, and cardiac lipidosis is unknown. The horseshoe bats were captured during hibernation and immediately transported to the United States. It is possible that the large amount of fat stored, coupled with changes in photoperiod, lack of exercise, and/or the stress of captivity, might have contributed to altering the normal metabolic processes, leading to anorexia and consequently lipidosis in these animals.

  6. Influences of large-scale convection and moisture source on monthly precipitation isotope ratios observed in Thailand, Southeast Asia

    NASA Astrophysics Data System (ADS)

    Wei, Zhongwang; Lee, Xuhui; Liu, Zhongfang; Seeboonruang, Uma; Koike, Masahiro; Yoshimura, Kei

    2018-04-01

    Many paleoclimatic records in Southeast Asia rely on rainfall isotope ratios as proxies for past hydroclimatic variability. However, the physical processes controlling modern rainfall isotopic behavior in the region are poorly constrained. Here, we combined isotopic measurements at six sites across Thailand with an isotope-incorporated atmospheric circulation model (IsoGSM) and the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model to investigate the factors that govern the variability of precipitation isotope ratios in this region. Results show that rainfall isotope ratios are correlated both with local rainfall amount and with regional outgoing longwave radiation, suggesting that rainfall isotope ratios in this region are controlled not only by local rain amount (the amount effect) but also by large-scale convection. As a transition zone between the Indian monsoon and the western North Pacific monsoon, the spatial differences in observed precipitation isotopes among sites are associated with moisture source. These results highlight the importance of regional processes in determining rainfall isotope ratios in the tropics and provide constraints on the interpretation of paleo-precipitation isotope records in the context of regional climate dynamics.

  7. Competitive adsorption in model charged protein mixtures: Equilibrium isotherms and kinetics behavior

    NASA Astrophysics Data System (ADS)

    Fang, F.; Szleifer, I.

    2003-07-01

    The competitive adsorption of proteins of different sizes and charges is studied using a molecular theory. The theory enables the study of charged systems while explicitly including the size, shape, and charge distributions of all the molecular species in the mixture; this approach therefore goes beyond the commonly used Poisson-Boltzmann approximation. The adsorption isotherms are studied for mixtures of two proteins of different size and charge. The amount of protein adsorbed and the fraction of each protein are calculated as functions of the bulk composition of the solution and the amount of salt in the system. It is found that the total amount of protein adsorbed is a monotonically decreasing function of the fraction of large proteins in the bulk solution and, for fixed protein composition, of the salt concentration. However, the composition of the adsorbed layer is a complicated function of the bulk composition and solution ionic strength. The structure of the adsorbed layer depends upon the bulk composition and salt concentration. In general, multilayers are adsorbed due to the long-range character of the electrostatic interactions. When large proteins are in very large excess in the bulk, the structure of the adsorbed multilayer is such that the layer in contact with the surface is composed of a mixture of large and small proteins, while the second and third layers are almost exclusively composed of large proteins. The theory is also generalized to study time-dependent adsorption. The approach is based on a separation of time scales into fast modes for the salt ions and the solvent and slow modes for the proteins. The dynamic equations are written for the slow modes, while the fast ones are obtained from the condition of equilibrium constrained to the protein distribution given by the slow modes. Two different processes are presented: adsorption from a homogeneous solution onto a charged surface at low salt concentration with a large excess of the large proteins in the bulk, and the kinetics of structural and adsorption changes when the salt concentration of the bulk solution is changed from low to high. The first process shows a large overshoot of the large proteins on the surface due to their excess in solution, followed by their replacement at the surface by the smaller molecules. The second process shows very fast desorption of the large proteins followed by adsorption at later stages; it is found to be driven by large electrostatic repulsions induced by the fast salt ions approaching the surface. The relevance of the theoretical predictions to experimental systems and possible directions for improvement of the theory are discussed.

  8. Computational Performance of Intel MIC, Sandy Bridge, and GPU Architectures: Implementation of a 1D c++/OpenMP Electrostatic Particle-In-Cell Code

    DTIC Science & Technology

    2014-05-01

    fusion, space and astrophysical plasmas, but still the general picture can be presented quite well with the fluid approach [6, 7]. The microscopic...purpose computing CPU for algorithms where processing of large blocks of data is done in parallel. The reason for that is the GPU’s highly effective...parallel structure. Most of the image and video processing computations involve heavy matrix and vector op- erations over large amounts of data and

  9. A Statistical Ontology-Based Approach to Ranking for Multiword Search

    ERIC Educational Resources Information Center

    Kim, Jinwoo

    2013-01-01

    Keyword search is a prominent data retrieval method for the Web, largely because the simple and efficient nature of keyword processing allows a large amount of information to be searched with fast response. However, keyword search approaches do not formally capture the clear meaning of a keyword query and fail to address the semantic relationships…

  10. Criminal Intent with Property: A Study of Real Estate Fraud Prediction and Detection

    ERIC Educational Resources Information Center

    Blackman, David H.

    2013-01-01

    The large number of real estate transactions across the United States, combined with closing process complexity, creates extremely large data sets that conceal anomalies indicative of fraud. The quantitative amount of damage due to fraud is immeasurable to the lives of individuals who are victims, not to mention the financial impact to…

  11. Big Data and Chemical Education

    ERIC Educational Resources Information Center

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  12. Growth, ethanol production, and inulinase activity on various inulin substrates by mutant kluyveromyces marxianus strains NRRL Y-50798 and NRRL Y-50799

    USDA-ARS's Scientific Manuscript database

    Economically important plants contain large amounts of inulin. Disposal of waste resulting from their processing presents environmental issues. Finding microorganisms capable of converting inulin waste to biofuel and valuable co-products in a biorefinery at the processing site would have significant...

  13. The Convergence of Information Technology, Data, and Management in a Library Imaging Program

    ERIC Educational Resources Information Center

    France, Fenella G.; Emery, Doug; Toth, Michael B.

    2010-01-01

    Integrating advanced imaging and processing capabilities in libraries, archives, and museums requires effective systems and information management to ensure that the large amounts of digital data about cultural artifacts can be readily acquired, stored, archived, accessed, processed, and linked to other data. The Library of Congress is developing…

  14. Alternatives for the intermediate recovery of plasmid DNA: performance, economic viability and environmental impact.

    PubMed

    Freitas, Sindelia; Canário, Sónia; Santos, José A L; Prazeres, Duarte M F

    2009-02-01

    Robust cGMP manufacturing is required to produce high-quality plasmid DNA (pDNA). Three established techniques, isopropanol and ammonium sulfate (AS) precipitation (PP), tangential flow filtration (TFF) and aqueous two-phase systems (ATPS) with PEG600/AS, were tested as alternatives to recover pDNA from alkaline lysates. Yield and purity data were used to evaluate the economic and environmental impact of each option. Although pDNA yields ≥90% were always obtained, ATPS delivered the highest HPLC purity (59%), followed by PP (48%) and TFF (18%). However, the ability of ATPS to concentrate pDNA was very poor when compared with PP or TFF. Processes were also implemented by coupling TFF with ATPS or AS-PP. Process simulations indicate that all options require large amounts of water (100-200 tons/kg pDNA) and that the ATPS process uses large amounts of mass separating agents (65 tons/kg pDNA). Estimates indicate that operating costs of the ATPS process are 2.5-fold larger when compared with the PP and TFF processes. The most significant contributions to the costs in the PP, TFF and ATPS processes came from operators (59%), consumables (75%) and raw materials (84%), respectively. The ATPS process presented the highest environmental impact, whereas the impact of the TFF process was negligible.

  15. The evolution of the quasar continuum

    NASA Technical Reports Server (NTRS)

    Elvis, M.

    1992-01-01

    We now have in hand a large database of Roentgen Satellite (ROSAT), optical, and IR complementary data. We are in the process of obtaining a large amount of International Ultraviolet Explorer (IUE) data for the same quasar sample. For our complementary sample at high redshifts, where the UV is redshifted into the optical, we have just had large amounts of observing time approved to cover the quasar continuum in the near-IR using the new Near-Infrared Camera and Multi-Object Spectrometer (NICMOS) array spectrographs. Ten-micron, optical, and VLA radio data also have approved time. An ISO US key program was approved to extend this work into the far-IR, and the launch of ASTRO-D (early in 1993) promises to extend it to higher-energy X-rays.

  16. Integrated process development-a robust, rapid method for inclusion body harvesting and processing at the microscale level.

    PubMed

    Walther, Cornelia; Kellner, Martin; Berkemeyer, Matthias; Brocard, Cécile; Dürauer, Astrid

    2017-10-21

    Escherichia coli stores large amounts of highly pure product within inclusion bodies (IBs). To take advantage of this beneficial feature, after cell disintegration, the first step to optimal product recovery is efficient IB preparation. This step is also important in evaluating upstream optimization and process development, due to the potential impact of bioprocessing conditions on product quality and on the nanoscale properties of IBs. Proper IB preparation is often neglected, due to laboratory-scale methods requiring large amounts of materials and labor. Miniaturization and parallelization can accelerate analyses of individual processing steps and provide a deeper understanding of up- and downstream processing interdependencies. Consequently, reproducible, predictive microscale methods are in demand. In the present study, we complemented a recently established high-throughput cell disruption method with a microscale method for preparing purified IBs. This preparation provided results comparable to laboratory-scale IB processing, regarding impurity depletion, and product loss. Furthermore, with this method, we performed a "design of experiments" study to demonstrate the influence of fermentation conditions on the performance of subsequent downstream steps and product quality. We showed that this approach provided a 300-fold reduction in material consumption for each fermentation condition and a 24-fold reduction in processing time for 24 samples.

  17. Use of cloud computing in biomedicine.

    PubMed

    Sobeslav, Vladimir; Maresova, Petra; Krejcar, Ondrej; Franca, Tanos C C; Kuca, Kamil

    2016-12-01

    Nowadays, biomedicine is characterised by a growing need for processing of large amounts of data in real time. This leads to new requirements for information and communication technologies (ICT). Cloud computing offers a solution to these requirements and provides many advantages, such as cost savings, elasticity and scalability of using ICT. The aim of this paper is to explore the concept of cloud computing and the related use of this concept in the area of biomedicine. Authors offer a comprehensive analysis of the implementation of the cloud computing approach in biomedical research, decomposed into infrastructure, platform and service layer, and a recommendation for processing large amounts of data in biomedicine. Firstly, the paper describes the appropriate forms and technological solutions of cloud computing. Secondly, the high-end computing paradigm of cloud computing aspects is analysed. Finally, the potential and current use of applications in scientific research of this technology in biomedicine is discussed.

  18. Development of Telecommunications of Prao ASC Lpi RAS

    NASA Astrophysics Data System (ADS)

    Isaev, E. A.; Dumskiy, D. V.; Likhachev, S. F.; Shatskaya, M. V.; Pugachev, V. D.; Samodurov, V. A.

    A new, modern and reliable data storage system was acquired in 2010 to develop the internal telecommunication resources of the Observatory. The system is designed to store large amounts of observation data obtained from the three radio-astronomy complexes (PT-22, DKR-1000 and BSA). The digital switching system "Elcom" is installed in the Pushchino Radio Astronomy Observatory to provide the observatory with telephone communications. Phone communication between the buildings of the observatory is carried out over fiber-optic data links using IP telephony. A direct optical channel from the RT-22 tracking station in Pushchino to the Moscow processing center has been created and put into operation to transfer large amounts of data in the final stage of establishing the ground infrastructure for the international space project "Radioastron". A separate backup system for processing and storing data is organized in the Pushchino Radio Astronomy Observatory to eliminate data loss during communication sessions with the Space Telescope.

  19. Theoretical implications for the estimation of dinitrogen fixation by large perennial plant species using isotope dilution

    Treesearch

    Dwight D. Baker; Maurice Fried; John A. Parrotta

    1995-01-01

    Estimation of symbiotic N2 fixation associated with large perennial plant species, especially trees, poses special problems because the process must be followed over a potentially long period of time to integrate the total amount of fixation. Estimations using isotope dilution methodology have begun to be used for trees in field studies. Because...

  20. Solution blow spinning: parameters optimization and effects on the properties of nanofibers from poly(lactic) acid/dimethyl carbonate solutions

    USDA-ARS?s Scientific Manuscript database

    Solution blow spinning (SBS) is a process to produce non-woven fiber sheets with high porosity and an extremely large amount of surface area. In this study, a Box-Behnken experimental design (BBD) was used to optimize the processing parameters for the production of nanofibers from polymer solutions ...

  1. An Extensible Processing Framework for Eddy-covariance Data

    NASA Astrophysics Data System (ADS)

    Durden, D.; Fox, A. M.; Metzger, S.; Sturtevant, C.; Durden, N. P.; Luo, H.

    2016-12-01

    The evolution of large data-collecting networks has led not only to an increase in available information, but also to greater complexity in analyzing the observations. Timely dissemination of readily usable data products necessitates a streaming processing framework that is both automatable and flexible. Tower networks, such as ICOS, Ameriflux, and NEON, exemplify this issue by requiring large amounts of data to be processed from dispersed measurement sites. Eddy-covariance data from across the NEON network are expected to amount to 100 Gigabytes per day. The complexity of the algorithmic processing necessary to produce high-quality data products, together with the continued development of new analysis techniques, led to the development of a modular R-package, eddy4R. This allows algorithms provided by NEON and the larger community to be deployed in streaming processing, and to be used by community members alike. In order to control the processing environment, provide an efficient parallel processing structure, and ensure that dependencies are available during processing, we chose Docker as our "Development and Operations" (DevOps) platform. The Docker framework allows our processing algorithms to be developed, maintained and deployed at scale. Additionally, the eddy4R-Docker framework fosters community use and extensibility via pre-built Docker images and the Github distributed version control system. The capability to process large data sets is reliant upon efficient input and output of data, data compressibility to reduce compute resource loads, and the ability to easily package metadata. The Hierarchical Data Format (HDF5) is a file format that can meet these needs. A NEON standard HDF5 file structure and metadata attributes allow users to explore larger data sets in an intuitive "directory-like" structure adopting the NEON data product naming conventions.
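
    To make the "directory-like" HDF5 layout concrete, here is a minimal sketch of walking such a file with the h5py library; the file name, group paths and attribute names below are hypothetical placeholders, not the actual NEON product structure.

```python
# Minimal sketch of walking a hierarchical HDF5 file with h5py.
# The file name and group layout below are hypothetical examples,
# not the real NEON HDF5 product structure.
import h5py

def walk(name, obj):
    """Print each group/dataset path, mimicking a directory listing."""
    kind = "group" if isinstance(obj, h5py.Group) else "dataset"
    print(f"{kind}: /{name}")

with h5py.File("tower_site_example.h5", "r") as f:
    f.visititems(walk)  # explore the "directory-like" layout
    # Read one (hypothetical) dataset and one of its metadata attributes
    co2 = f["/SITE/dp0p/irgaCo2/rtioMoleDryCo2"][:]
    units = f["/SITE/dp0p/irgaCo2/rtioMoleDryCo2"].attrs.get("unit", "unknown")
    print(co2.shape, units)
```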

  2. Surface-induced dissociation of methanol cations: A non-ergodic process

    DOE PAGES

    Shukla, Anil K.

    2017-09-01

    Here, dissociation of methanol molecular cations, CH3OH+, to CH2OH+ on collision with a self-assembled monolayer surface of fluorinated alkyl thiol on a gold (111) crystal has been studied at 12.5 eV collision energy. Two energetically and spatially distinct processes contribute to the dissociation: one involves loss of a very large amount of energy, approaching the initial kinetic energy of the primary ions, with fragment ions scattered over a broad angular range between the surface normal and the surface parallel, while the second results from a small amount of energy loss, with fragment ions scattered over a narrow angular range close to the surface parallel. There is a third process, with a relatively small contribution to total dissociation, whose characteristics are very similar to those of the low-energy-loss process. Finally, these results demonstrate that surface-induced dissociation of methanol cations via hydrogen loss is non-ergodic.

  3. Surface-induced dissociation of methanol cations: A non-ergodic process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shukla, Anil K.

    Here, dissociation of methanol molecular cations, CH3OH+, to CH2OH+ on collision with a self-assembled monolayer surface of fluorinated alkyl thiol on a gold (111) crystal has been studied at 12.5 eV collision energy. Two energetically and spatially distinct processes contribute to the dissociation: one involves loss of a very large amount of energy, approaching the initial kinetic energy of the primary ions, with fragment ions scattered over a broad angular range between the surface normal and the surface parallel, while the second results from a small amount of energy loss, with fragment ions scattered over a narrow angular range close to the surface parallel. There is a third process, with a relatively small contribution to total dissociation, whose characteristics are very similar to those of the low-energy-loss process. Finally, these results demonstrate that surface-induced dissociation of methanol cations via hydrogen loss is non-ergodic.

  4. Signal and image processing algorithm performance in a virtual and elastic computing environment

    NASA Astrophysics Data System (ADS)

    Bennett, Kelly W.; Robertson, James

    2013-05-01

    The U.S. Army Research Laboratory (ARL) supports the development of classification, detection, tracking, and localization algorithms using multiple sensing modalities including acoustic, seismic, E-field, magnetic field, PIR, and visual and IR imaging. Multimodal sensors collect large amounts of data in support of algorithm development. The resulting large amount of data, and the associated high-performance computing needs, strain existing computing infrastructures. Purchasing computer power as a commodity using a Cloud service offers low-cost, pay-as-you-go pricing models, scalability, and elasticity that may provide solutions to develop and optimize algorithms without having to procure additional hardware and resources. This paper provides a detailed look at using a commercial cloud service provider, such as Amazon Web Services (AWS), to develop and deploy simple signal and image processing algorithms in a cloud and run the algorithms on a large set of data archived in the ARL Multimodal Signatures Database (MMSDB). Analytical results provide performance comparisons with existing infrastructure. A discussion of using cloud computing with government data covers the best security practices available within cloud services such as AWS.

  5. Production of Low Cost Carbon-Fiber through Energy Optimization of Stabilization Process.

    PubMed

    Golkarnarenji, Gelayol; Naebe, Minoo; Badii, Khashayar; Milani, Abbas S; Jazar, Reza N; Khayyam, Hamid

    2018-03-05

    To produce high-quality, low-cost carbon fiber-based composites, optimizing the carbon fiber production process and the resulting fiber properties is one of the main keys. The stabilization process is the most important step in carbon fiber production; it consumes a large amount of energy, and its optimization can reduce the cost to a large extent. In this study, two intelligent optimization techniques, namely Support Vector Regression (SVR) and Artificial Neural Network (ANN), were studied and compared on a limited dataset to predict a physical property (density) of oxidatively stabilized PAN fiber (OPF) in the second zone of a stabilization oven within a carbon fiber production line. The results were then used to optimize the energy consumption of the process. The case study can benefit chemical industries involved in carbon fiber manufacturing, for assessing and optimizing different stabilization process conditions at large.

  6. Production of Low Cost Carbon-Fiber through Energy Optimization of Stabilization Process

    PubMed Central

    Golkarnarenji, Gelayol; Naebe, Minoo; Badii, Khashayar; Milani, Abbas S.; Jazar, Reza N.; Khayyam, Hamid

    2018-01-01

    To produce high-quality, low-cost carbon fiber-based composites, optimizing the carbon fiber production process and the resulting fiber properties is one of the main keys. The stabilization process is the most important step in carbon fiber production; it consumes a large amount of energy, and its optimization can reduce the cost to a large extent. In this study, two intelligent optimization techniques, namely Support Vector Regression (SVR) and Artificial Neural Network (ANN), were studied and compared on a limited dataset to predict a physical property (density) of oxidatively stabilized PAN fiber (OPF) in the second zone of a stabilization oven within a carbon fiber production line. The results were then used to optimize the energy consumption of the process. The case study can benefit chemical industries involved in carbon fiber manufacturing, for assessing and optimizing different stabilization process conditions at large. PMID:29510592

  7. Sequential control of step-bunching during graphene growth on SiC (0001)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bao, Jianfeng; Kusunoki, Michiko; Yasui, Osamu

    2016-08-22

    We have investigated the relation between step-bunching and graphene growth phenomena on an SiC substrate. We found that only a minimal amount of step-bunching occurred during graphene growth with a high heating rate, whereas a large amount of step-bunching occurred with a slow heating process. These results indicate that the degree of step-bunching during graphene growth can be controlled by controlling the heating rate. We also found that graphene coverage suppresses step-bunching, an effective methodology not only in graphene technology but also in SiC-based power electronics.

  8. Support Vector Machines Trained with Evolutionary Algorithms Employing Kernel Adatron for Large Scale Classification of Protein Structures.

    PubMed

    Arana-Daniel, Nancy; Gallegos, Alberto A; López-Franco, Carlos; Alanís, Alma Y; Morales, Jacob; López-Franco, Adriana

    2016-01-01

    With the increasing power of computers, the amount of data that can be processed in small periods of time has grown exponentially, as has the importance of classifying large-scale data efficiently. Support vector machines have shown good results classifying large amounts of high-dimensional data, such as data generated by protein structure prediction, spam recognition, medical diagnosis, optical character recognition and text classification, etc. Most state-of-the-art approaches for large-scale learning use traditional optimization methods, such as quadratic programming or gradient descent, which makes the use of evolutionary algorithms for training support vector machines an area to be explored. The present paper proposes a simple-to-implement approach based on evolutionary algorithms and the Kernel-Adatron for solving large-scale classification problems, focusing on protein structure prediction. The functional properties of proteins depend upon their three-dimensional structures. Knowing the structures of proteins is crucial for biology and can lead to improvements in areas such as medicine, agriculture and biofuels.
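
    As an illustration of the Kernel-Adatron component mentioned above, the following is a minimal sketch of its classic update rule on a toy problem; the evolutionary-algorithm part of the paper's method is omitted, and the RBF kernel width, learning rate and epoch count are arbitrary choices.

```python
# Minimal sketch of the Kernel-Adatron update rule; the evolutionary
# training layer described in the abstract is not reproduced here.
import numpy as np

def rbf_kernel(X, gamma=0.5):
    sq = np.sum(X**2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

def kernel_adatron(K, y, eta=0.1, epochs=200):
    """Learn multipliers alpha for a binary problem with labels y in {-1, +1}."""
    alpha = np.zeros(len(y))
    for _ in range(epochs):
        margins = y * (K @ (alpha * y))            # y_i * sum_j alpha_j y_j K_ij
        alpha = np.maximum(0.0, alpha + eta * (1.0 - margins))
    return alpha

# Toy usage on random data (illustrative only)
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))
y = np.where(X[:, 0] > 0, 1.0, -1.0)
K = rbf_kernel(X)
alpha = kernel_adatron(K, y)
pred = np.sign(K @ (alpha * y))
print("training accuracy:", np.mean(pred == y))
```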

  9. Understanding the nature of atmospheric acid processing of mineral dusts in supplying bioavailable phosphorus to the oceans.

    PubMed

    Stockdale, Anthony; Krom, Michael D; Mortimer, Robert J G; Benning, Liane G; Carslaw, Kenneth S; Herbert, Ross J; Shi, Zongbo; Myriokefalitakis, Stelios; Kanakidou, Maria; Nenes, Athanasios

    2016-12-20

    Acidification of airborne dust particles can dramatically increase the amount of bioavailable phosphorus (P) deposited on the surface ocean. Experiments were conducted to simulate atmospheric processes and determine the dissolution behavior of P compounds in dust and dust precursor soils. Acid dissolution occurs rapidly (seconds to minutes) and is controlled by the amount of H⁺ ions present. For H⁺ < 10⁻⁴ mol/g of dust, 1-10% of the total P is dissolved, largely as a result of dissolution of surface-bound forms. At H⁺ > 10⁻⁴ mol/g of dust, the amount of P (and calcium) released has a direct proportionality to the amount of H⁺ consumed until all inorganic P minerals are exhausted and the final pH remains acidic. Once dissolved, P will stay in solution due to slow precipitation kinetics. Dissolution of apatite-P (Ap-P), the major mineral phase in dust (79-96%), occurs whether calcium carbonate (calcite) is present or not, although the increase in dissolved P is greater if calcite is absent or if the particles are externally mixed. The system was modeled adequately as a simple mixture of Ap-P and calcite. P dissolves readily by acid processes in the atmosphere in contrast to iron, which dissolves more slowly and is subject to reprecipitation at cloud water pH. We show that acidification can increase bioavailable P deposition over large areas of the globe, and may explain much of the previously observed patterns of variability in leachable P in oceanic areas where primary productivity is limited by this nutrient (e.g., Mediterranean).
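
    The two dissolution regimes described above can be restated as a toy piecewise model; the threshold is taken from the abstract, while the low-acid fraction and proportionality constant below are illustrative placeholders rather than fitted values.

```python
# Toy restatement of the two dissolution regimes; the low-acid fraction and
# proportionality constant are illustrative placeholders, not fitted values.
def fraction_p_dissolved(h_plus_mol_per_g, total_inorganic_p=1.0,
                         low_acid_fraction=0.05, k_proportional=500.0,
                         threshold=1e-4):
    """Return the fraction of total P released for a given H+ load (mol per g dust)."""
    if h_plus_mol_per_g < threshold:
        # Only surface-bound P dissolves (roughly 1-10% of total P)
        return low_acid_fraction
    # Above the threshold, released P grows in proportion to H+ consumed,
    # capped once all inorganic P minerals are exhausted.
    return min(total_inorganic_p, k_proportional * h_plus_mol_per_g)

for h in (1e-5, 2e-4, 5e-3):
    print(h, fraction_p_dissolved(h))
```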

  10. Ambient-aware continuous care through semantic context dissemination.

    PubMed

    Ongenae, Femke; Famaey, Jeroen; Verstichel, Stijn; De Zutter, Saar; Latré, Steven; Ackaert, Ann; Verhoeve, Piet; De Turck, Filip

    2014-12-04

    The ultimate ambient-intelligent care room contains numerous sensors and devices to monitor the patient, sense and adjust the environment and support the staff. This sensor-based approach results in a large amount of data, which can be processed by current and future applications, e.g., task management and alerting systems. Today, nurses are responsible for coordinating all these applications and supplied information, which reduces the added value and slows down the adoption rate. The aim of the presented research is the design of a pervasive and scalable framework that is able to optimize continuous care processes by intelligently reasoning on the large amount of heterogeneous care data. The developed Ontology-based Care Platform (OCarePlatform) consists of modular components, each performing a specific reasoning task. Consequently, they can easily be replicated and distributed. Complex reasoning is achieved by combining the results of different components. To ensure that the components only receive information that is of interest to them at that time, they are able to dynamically generate and register filter rules with a Semantic Communication Bus (SCB). This SCB semantically filters all the heterogeneous care data according to the registered rules by using a continuous care ontology. The SCB can be distributed and a cache can be employed to ensure scalability. A prototype implementation is presented consisting of a new-generation nurse call system supported by a localization and a home automation component. The amount of data that is filtered and the performance of the SCB are evaluated by testing the prototype in a living lab. The delay introduced by processing the filter rules is negligible when 10 or fewer rules are registered. The OCarePlatform allows disseminating relevant care data for the different applications and additionally supports composing complex applications from a set of smaller independent components. This way, the platform significantly reduces the amount of information that needs to be processed by the nurses. The delay resulting from processing the filter rules is linear in the number of rules. Distributed deployment of the SCB and using a cache allows further improvement of these performance results.
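
    A toy sketch of the filter-rule idea behind the Semantic Communication Bus follows; the real SCB filters semantically using a continuous care ontology, whereas plain Python predicates and invented observation fields stand in for that here.

```python
# Toy sketch of rule-based message filtering: components register predicates
# and receive only matching observations. The real SCB reasons over an
# ontology; plain predicates and invented fields are used here for illustration.
class FilterBus:
    def __init__(self):
        self.rules = []  # (component_name, predicate) pairs

    def register(self, component, predicate):
        self.rules.append((component, predicate))

    def publish(self, observation):
        for component, predicate in self.rules:
            if predicate(observation):
                print(f"-> {component}: {observation}")

bus = FilterBus()
bus.register("nurse_call", lambda o: o["type"] == "heart_rate" and o["value"] > 120)
bus.register("hvac", lambda o: o["type"] == "room_temp")
bus.publish({"type": "heart_rate", "value": 130, "patient": "bed-7"})  # routed to nurse_call
bus.publish({"type": "heart_rate", "value": 80, "patient": "bed-7"})   # filtered out
bus.publish({"type": "room_temp", "value": 22.5, "room": "112"})       # routed to hvac
```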

  11. AVIRIS ground data-processing system

    NASA Technical Reports Server (NTRS)

    Reimer, John H.; Heyada, Jan R.; Carpenter, Steve C.; Deich, William T. S.; Lee, Meemong

    1987-01-01

    The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) has been under development at JPL for the past four years. During this time, a dedicated ground data-processing system has been designed and implemented to store and process the large amounts of data expected. This paper reviews the objectives of this ground data-processing system and describes the hardware. An outline of the data flow through the system is given, and the software and incorporated algorithms developed specifically for the systematic processing of AVIRIS data are described.

  12. Advances in Proteomics Data Analysis and Display Using an Accurate Mass and Time Tag Approach

    PubMed Central

    Zimmer, Jennifer S.D.; Monroe, Matthew E.; Qian, Wei-Jun; Smith, Richard D.

    2007-01-01

    Proteomics has recently demonstrated utility in understanding cellular processes on the molecular level as a component of systems biology approaches and for identifying potential biomarkers of various disease states. The large amount of data generated by utilizing high efficiency (e.g., chromatographic) separations coupled to high mass accuracy mass spectrometry for high-throughput proteomics analyses presents challenges related to data processing, analysis, and display. This review focuses on recent advances in nanoLC-FTICR-MS-based proteomics approaches and the accompanying data processing tools that have been developed to display and interpret the large volumes of data being produced. PMID:16429408

  13. Research on photodiode detector-based spatial transient light detection and processing system

    NASA Astrophysics Data System (ADS)

    Liu, Meiying; Wang, Hu; Liu, Yang; Zhao, Hui; Nan, Meng

    2016-10-01

    In order to realize real-time identification and processing of spatial transient light signals, the features and the energy of the captured target light signal are first described and quantitatively calculated. Considering that the transient light signal occurs randomly, has a short duration, and has an evident beginning and ending, a photodiode detector based spatial transient light detection and processing system is proposed and designed in this paper. The system has a large field of view and is used to realize non-imaging energy detection of random, transient, and weak point targets against the complex background of the space environment. Weak-signal extraction under a strong background is difficult. In this paper, considering that the background signal changes slowly and the target signal changes quickly, a filter is adopted for background subtraction. Variable-speed sampling is realized by sampling data points at gradually increasing intervals. This resolves two difficulties: real-time processing of a large amount of data, and the power consumption required to store that data. Test results with a self-made simulated signal demonstrate the effectiveness of the design scheme. The practical system could be operated reliably, and detection and processing of the target signal against a strong sunlight background was realized. The results indicate that the system can detect the target signal's characteristic waveform in real time and monitor the system's working parameters. The prototype design could be used in a variety of engineering applications.

  14. A multiresolution approach to iterative reconstruction algorithms in X-ray computed tomography.

    PubMed

    De Witte, Yoni; Vlassenbroeck, Jelle; Van Hoorebeke, Luc

    2010-09-01

    In computed tomography, the application of iterative reconstruction methods in practical situations is impeded by their high computational demands. Especially in high resolution X-ray computed tomography, where reconstruction volumes contain a high number of volume elements (several giga voxels), this computational burden prevents their actual breakthrough. Besides the large amount of calculations, iterative algorithms require the entire volume to be kept in memory during reconstruction, which quickly becomes cumbersome for large data sets. To overcome this obstacle, we present a novel multiresolution reconstruction, which greatly reduces the required amount of memory without significantly affecting the reconstructed image quality. It is shown that, combined with an efficient implementation on a graphical processing unit, the multiresolution approach enables the application of iterative algorithms in the reconstruction of large volumes at an acceptable speed using only limited resources.

  15. Gigwa-Genotype investigator for genome-wide analyses.

    PubMed

    Sempéré, Guilhem; Philippe, Florian; Dereeper, Alexis; Ruiz, Manuel; Sarah, Gautier; Larmande, Pierre

    2016-06-06

    Exploring the structure of genomes and analyzing their evolution is essential to understanding the ecological adaptation of organisms. However, with the large amounts of data being produced by next-generation sequencing, computational challenges arise in terms of storage, search, sharing, analysis and visualization. This is particularly true with regards to studies of genomic variation, which are currently lacking scalable and user-friendly data exploration solutions. Here we present Gigwa, a web-based tool that provides an easy and intuitive way to explore large amounts of genotyping data by filtering it not only on the basis of variant features, including functional annotations, but also on genotype patterns. The data storage relies on MongoDB, which offers good scalability properties. Gigwa can handle multiple databases and may be deployed in either single- or multi-user mode. In addition, it provides a wide range of popular export formats. The Gigwa application is suitable for managing large amounts of genomic variation data. Its user-friendly web interface makes such processing widely accessible. It can either be simply deployed on a workstation or be used to provide a shared data portal for a given community of researchers.

  16. Simplifying the negotiating process with physicians: critical elements in negotiating from private practice to employed physician.

    PubMed

    Gallucci, Armen; Deutsch, Thomas; Youngquist, Jaymie

    2013-01-01

    The authors attempt to simplify the key elements to the process of negotiating successfully with private physicians. From their experience, the business elements that have resulted in the most discussion center on the compensation including the incentive plan. Secondarily, how the issue of malpractice is handled will also consume a fair amount of time. What the authors have also learned is that the intangible issues can often be the reason for an unexpectedly large amount of discussion and therefore add time to the negotiation process. To assist with this process, they have derived a negotiation checklist, which seeks to help hospital leaders and administrators set the proper framework to ensure successful negotiation conversations. More importantly, being organized and recognizing these broad issues upfront and remaining transparent throughout the process will help to ensure a successful negotiation.

  17. ECUT: Energy Conversion and Utilization Technologies program biocatalysis research activity. Potential membrane applications to biocatalyzed processes: Assessment of concentration polarization and membrane fouling

    NASA Technical Reports Server (NTRS)

    Ingham, J. D.

    1983-01-01

    Separation and purification of the products of biocatalyzed fermentation processes, such as ethanol or butanol, consumes most of the process energy required. Since membrane systems require substantially less energy for separation than most alternatives (e.g., distillation), they have been suggested for separation or concentration of fermentation products. This report is a review of the effects of concentration polarization and membrane fouling for the principal membrane processes: microfiltration, ultrafiltration, reverse osmosis, and electrodialysis, including a discussion of potential problems relevant to separation of fermentation products. It was concluded that advanced membrane systems may result in significantly decreased energy consumption. However, because of the need to separate large amounts of water from much smaller amounts of product that may be more volatile than water, it is not clear that membrane separations will necessarily be more efficient than alternative processes.

  18. In Situ High Temperature High Pressure MAS NMR Study on the Crystallization of AlPO 4 -5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Zhenchao; Xu, Suochang; Hu, Mary Y.

    2016-01-28

    A damped, oscillating crystallization process of AlPO4-5 in the presence of a small amount of water is demonstrated by in situ high-temperature, high-pressure multinuclear MAS NMR. Crystalline AlPO4-5 is formed from an intermediate semicrystalline phase via continuous rearrangement of the local structure of the amorphous precursor gel. Activated water catalyzes the rearrangement via repeated hydrolysis and condensation reactions. Strong interactions between the organic template and inorganic species facilitate the ordered rearrangement. During the crystallization process, excess water, phosphate, and aluminum are expelled from the precursor. The oscillating crystallization reflects mass transport between the solid and liquid phases during the crystallization process. This crystallization process is also applicable to AlPO4-5 crystallized in the presence of a relatively large amount of water.

  19. Large-scale retrieval for medical image analytics: A comprehensive review.

    PubMed

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, where huge amounts of medical images were produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at a large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. General Recommendations on Fatigue Risk Management for the Canadian Forces

    DTIC Science & Technology

    2010-04-01

    missions performed in aviation require an individual(s) to process large amount of information in a short period of time and to do this on a continuous...information processing required during sustained operations can deteriorate an individual’s ability to perform a task. Given the high operational tempo...memory, which, in turn, is utilized to perform human thought processes (Baddeley, 2003). While various versions of this theory exist, they all share

  1. Dairy manure biochar as a phosphorus fertilizer

    USDA-ARS?s Scientific Manuscript database

    Future manure management practices will need to remove large amounts of organic waste as well as harness energy to generate value-added products. Manures can be processed using thermochemical conversion technologies to generate a solid product called biochar. Dairy manure biochars contain sufficient...

  2. Sourcing Life Cycle Inventory Data

    EPA Science Inventory

    The collection and validation of quality lifecycle inventory (LCI) data can be the most difficult and time-consuming aspect of developing a life cycle assessment (LCA). Large amounts of process and production data are needed to complete the LCI. For many studies, the LCA analyst ...

  3. A Decision-Oriented Investigation of Air Force Civil Engineering’s Operations Branch and the Implications for a Decision Support System.

    DTIC Science & Technology

    1984-09-01

    information when making a decision [Szilagyi and Wallace, 1983:320]. Driver and Mock used cognitive complexity ideas to develop this two dimensional...flexible [Figure 6. Cognitive Complexity Model (Szilagyi and Wallace, 1983:321)] Decisive Style. The...large amount of information. However, he processes this information with a multiple focus approach (Szilagyi and Wallace, 1983:320-321). ...McKenney

  4. Automated Selection Of Pictures In Sequences

    NASA Technical Reports Server (NTRS)

    Rorvig, Mark E.; Shelton, Robert O.

    1995-01-01

    Method of automated selection of film or video motion-picture frames for storage or examination developed. Beneficial in situations in which quantity of visual information available exceeds amount stored or examined by humans in reasonable amount of time, and/or necessary to reduce large number of motion-picture frames to few conveying significantly different information in manner intermediate between movie and comic book or storyboard. For example, computerized vision system monitoring industrial process programmed to sound alarm when changes in scene exceed normal limits.

  5. Lessons learnt on the analysis of large sequence data in animal genomics.

    PubMed

    Biscarini, F; Cozzi, P; Orozco-Ter Wengel, P

    2018-04-06

    The 'omics revolution has made a large amount of sequence data available to researchers and the industry. This has had a profound impact on the field of bioinformatics, stimulating unprecedented advancements in this discipline. This is usually looked at from the perspective of human 'omics, in particular human genomics. Plant and animal genomics, however, have also been deeply influenced by next-generation sequencing technologies, with several genomics applications now popular among researchers and the breeding industry. Genomics tends to generate huge amounts of data, and genomic sequence data account for an increasing proportion of big data in biological sciences, due largely to decreasing sequencing and genotyping costs and to large-scale sequencing and resequencing projects. The analysis of big data poses a challenge to scientists, as data gathering currently takes place at a faster pace than does data processing and analysis, and the associated computational burden is increasingly taxing, making even simple manipulation, visualization and transferring of data a cumbersome operation. The time consumed by the processing and analysing of huge data sets may be at the expense of data quality assessment and critical interpretation. Additionally, when analysing lots of data, something is likely to go awry (the software may crash or stop), and it can be very frustrating to track the error. We herein review the most relevant issues related to tackling these challenges and problems, from the perspective of animal genomics, and provide researchers who lack extensive computing experience with guidelines that will help when processing large genomic data sets. © 2018 Stichting International Foundation for Animal Genetics.

  6. System simulation application for determining the size of daily raw material purchases at PT XY

    NASA Astrophysics Data System (ADS)

    Napitupulu, H. L.

    2018-02-01

    Every manufacturing company needs to implement green production, including PT XY, a marine-catch processing company in Sumatera Utara Province. The company is engaged in the processing of squid for export purposes. The company's problem relates to the absence of a decision rule for the daily purchase amount of squid. Purchasing daily raw materials in varying quantities has caused the company to face either an excess or a shortage of raw materials. Low raw material purchases result in reduced productivity, while large purchases lead to increased cooling costs for storing excess raw materials, as well as possible losses from damaged raw material. It is therefore necessary to determine the optimal amount of raw material to purchase each day. This can be determined by applying simulation: a system simulation can provide the expected optimal amount of daily raw material purchases.

  7. The microbial fermentation characteristics depend on both carbohydrate source and heat processing: a model experiment with ileo-cannulated pigs.

    PubMed

    Nielsen, Tina Skau; Jørgensen, Henry; Knudsen, Knud Erik Bach; Lærke, Helle Nygaard

    2017-11-01

    The effects of carbohydrate (CHO) source and processing (extrusion cooking) on large intestinal fermentation products were studied in ileo-cannulated pigs as a model for humans. Pigs were fed diets containing barley, pea or a mixture of potato starch:wheat bran (PSWB) either raw or extrusion cooked. Extrusion cooking reduced the amount of starch fermented in the large intestine by 52-96% depending on the CHO source and the total pool of butyrate in the distal small intestine + large intestine by on average 60% across diets. Overall, extrusion cooking caused a shift in the composition of short-chain fatty acids (SCFA) produced towards more acetate and less propionate and butyrate. The CHO source and processing highly affected the fermentation characteristics and extrusion cooking generally reduced large intestinal fermentation and resulted in a less desirable composition of the fermentation products. The latter outcome is non-conducive to a healthy large intestinal environment and its resulting metabolic health.

  8. Sources for Leads: Natural Products and Libraries.

    PubMed

    van Herwerden, Eric F; Süssmuth, Roderich D

    2016-01-01

    Natural products have traditionally been a major source of leads in the drug discovery process. However, the development of high-throughput screening led to an increased interest in synthetic methods that enabled the rapid construction of large libraries of molecules. This resulted in the termination or downscaling of many natural product research programs, but the chemical libraries did not necessarily produce a larger number of drug leads. On one hand, this chapter explores the current state of natural product research within the drug discovery process. On the other hand, it evaluates the efforts made to increase the number of leads generated from chemical libraries and considers what role natural products could play here.

  9. Automated information-analytical system for thunderstorm monitoring and early warning alarms using modern physical sensors and information technologies with elements of artificial intelligence

    NASA Astrophysics Data System (ADS)

    Boldyreff, Anton S.; Bespalov, Dmitry A.; Adzhiev, Anatoly Kh.

    2017-05-01

    Methods of artificial intelligence are a good solution for forecasting weather phenomena, as they make it possible to process a large amount of diverse data. Recirculation neural networks are implemented in this paper for a thunderstorm event prediction system. Large amounts of experimental data from networks of lightning sensors and electric field mills are received and analyzed, and the average recognition accuracy of the sensor signals is calculated. It is shown that recirculation neural networks are a promising solution for forecasting thunderstorms and related weather phenomena: they recognize characteristic elements of the sensor signals with high efficiency and can compress images and highlight their characteristic features for subsequent recognition.

  10. Successful integration of ergonomics into continuous improvement initiatives.

    PubMed

    Monroe, Kimberly; Fick, Faye; Joshi, Madina

    2012-01-01

    Process improvement initiatives are receiving renewed attention by large corporations as they attempt to reduce manufacturing costs and stay competitive in the global marketplace. These initiatives include 5S, Six Sigma, and Lean. These programs often take up a large amount of available time and budget resources. More often than not, existing ergonomics processes are considered separate initiatives by upper management and struggle to gain a seat at the table. To effectively maintain their programs, ergonomics program managers need to overcome those obstacles and demonstrate how ergonomics initiatives are a natural fit with continuous improvement philosophies.

  11. Understanding the nature of atmospheric acid processing of mineral dusts in supplying bioavailable phosphorus to the oceans

    PubMed Central

    Krom, Michael D.; Mortimer, Robert J. G.; Benning, Liane G.; Herbert, Ross J.; Shi, Zongbo; Kanakidou, Maria; Nenes, Athanasios

    2016-01-01

    Acidification of airborne dust particles can dramatically increase the amount of bioavailable phosphorus (P) deposited on the surface ocean. Experiments were conducted to simulate atmospheric processes and determine the dissolution behavior of P compounds in dust and dust precursor soils. Acid dissolution occurs rapidly (seconds to minutes) and is controlled by the amount of H+ ions present. For H+ < 10−4 mol/g of dust, 1–10% of the total P is dissolved, largely as a result of dissolution of surface-bound forms. At H+ > 10−4 mol/g of dust, the amount of P (and calcium) released has a direct proportionality to the amount of H+ consumed until all inorganic P minerals are exhausted and the final pH remains acidic. Once dissolved, P will stay in solution due to slow precipitation kinetics. Dissolution of apatite-P (Ap-P), the major mineral phase in dust (79–96%), occurs whether calcium carbonate (calcite) is present or not, although the increase in dissolved P is greater if calcite is absent or if the particles are externally mixed. The system was modeled adequately as a simple mixture of Ap-P and calcite. P dissolves readily by acid processes in the atmosphere in contrast to iron, which dissolves more slowly and is subject to reprecipitation at cloud water pH. We show that acidification can increase bioavailable P deposition over large areas of the globe, and may explain much of the previously observed patterns of variability in leachable P in oceanic areas where primary productivity is limited by this nutrient (e.g., Mediterranean). PMID:27930294

  12. Sharing Digital Data

    ERIC Educational Resources Information Center

    Benedis-Grab, Gregory

    2011-01-01

    Computers have changed the landscape of scientific research in profound ways. Technology has always played an important role in scientific experimentation--through the development of increasingly sophisticated tools, the measurement of elusive quantities, and the processing of large amounts of data. However, the advent of social networking and the…

  13. Introduction to the JEEG Agricultural Geophysics special issue

    USDA-ARS?s Scientific Manuscript database

    Recent advancements such as the availability of personal computers, technologies to store/process large amounts of data, the GPS, and GIS have now made geophysical methods practical for agricultural use. Consequently, there has been a rapid expansion of agricultural geophysics research just over the...

  14. What is a MISR block?

    Atmospheric Science Data Center

    2016-02-21

    The generic data file for MISR is a swath, i.e., a set of measurements for the entire area observed during the day part of the orbit. This is a very large amount of data. To simplify the storing and processing of these data, swathes are broken...

  15. Composing Data Parallel Code for a SPARQL Graph Engine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castellana, Vito G.; Tumeo, Antonino; Villa, Oreste

    Big data analytics processes large amounts of data to extract knowledge from them. Semantic databases are big data applications that adopt the Resource Description Framework (RDF) to structure metadata through a graph-based representation. The graph-based representation provides several benefits, such as the possibility of performing in-memory processing with large amounts of parallelism. SPARQL is a language used to perform queries on RDF-structured data through graph matching. In this paper we present a tool that automatically translates SPARQL queries to parallel graph crawling and graph matching operations. The tool also supports complex SPARQL constructs, which require more than basic graph matching for their implementation. The tool generates parallel code annotated with OpenMP pragmas for x86 shared-memory multiprocessors (SMPs). With respect to commercial database systems such as Virtuoso, our approach reduces the memory occupation due to join operations and provides higher performance. We show the scaling of the automatically generated graph-matching code on a 48-core SMP.
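
    To illustrate the graph-matching core that such a tool parallelizes, here is a minimal sketch of basic SPARQL-style triple pattern matching over an in-memory triple set; this is not the OpenMP code the tool actually generates, and the example data are invented.

```python
# Minimal sketch of basic graph-pattern matching over (subject, predicate, object)
# triples; '?x' terms are variables. Illustration only, not the tool's generated code.
def match_pattern(triples, pattern, binding=None):
    """Yield variable bindings satisfying a list of triple patterns."""
    binding = binding or {}
    if not pattern:
        yield dict(binding)
        return
    s, p, o = pattern[0]
    for ts, tp, to in triples:
        new = dict(binding)
        ok = True
        for term, value in ((s, ts), (p, tp), (o, to)):
            if term.startswith("?"):
                if term in new and new[term] != value:
                    ok = False
                    break
                new[term] = value
            elif term != value:
                ok = False
                break
        if ok:
            yield from match_pattern(triples, pattern[1:], new)

triples = [("alice", "knows", "bob"), ("bob", "knows", "carol"), ("alice", "worksAt", "lab")]
# SELECT ?a ?c WHERE { ?a knows ?b . ?b knows ?c }
query = [("?a", "knows", "?b"), ("?b", "knows", "?c")]
print(list(match_pattern(triples, query)))
```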

  16. URANIUM PURIFICATION PROCESS

    DOEpatents

    Ruhoff, J.R.; Winters, C.E.

    1957-11-12

    A process is described for the purification of uranyl nitrate by an extraction process. A solution is formed consisting of uranyl nitrate, together with the associated impurities arising from the HNO3 leaching of the ore, in an organic solvent such as ether. If this were back-extracted with water to remove the impurities, large quantities of uranyl nitrate would also be extracted and lost. To prevent this, the impure organic solution is extracted with small amounts of saturated aqueous solutions of uranyl nitrate, thereby effectively accomplishing the removal of impurities while not allowing any further extraction of the uranyl nitrate from the organic solvent. After the impurities have been removed, the uranium values are extracted with large quantities of water.

  17. A MapReduce approach to diminish imbalance parameters for big deoxyribonucleic acid dataset.

    PubMed

    Kamal, Sarwar; Ripon, Shamim Hasnat; Dey, Nilanjan; Ashour, Amira S; Santhi, V

    2016-07-01

    In the age of the information superhighway, big data play a significant role in information processing, extraction, retrieval and management. In computational biology, the continuous challenge is to manage the biological data. Data mining techniques are sometimes insufficient for new space and time requirements. Thus, it is critical to process massive amounts of data to retrieve knowledge. The existing software and automated tools to handle big data sets are not sufficient. As a result, an expandable mining technique that exploits the large storage and processing capability of distributed or parallel processing platforms is essential. In this analysis, a contemporary distributed clustering methodology for imbalanced data reduction using a k-nearest neighbor (K-NN) classification approach is introduced. The pivotal objective of this work is to represent real training data sets with a reduced number of elements or instances. These reduced data sets will ensure faster data classification and standard storage management with less sensitivity. However, general data reduction methods cannot manage very big data sets. To minimize these difficulties, a MapReduce-oriented framework is designed using various clusters of automated contents, comprising multiple algorithmic approaches. To test the proposed approach, a real DNA (deoxyribonucleic acid) dataset consisting of 90 million pairs has been used. The proposed model reduces the imbalanced data sets from large-scale data sets without loss of accuracy. The obtained results show that the MapReduce-based K-NN classifier provided accurate results for big DNA data. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
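
    The following is an illustrative map/reduce-style sketch of reducing an imbalanced training set before K-NN classification; the per-chunk undersampling rule and all parameters are simplifications for illustration, not the paper's exact reduction scheme.

```python
# Illustrative map/reduce-style reduction of an imbalanced training set before
# K-NN classification; the per-chunk undersampling rule is a simplification.
from collections import Counter
import numpy as np

def map_reduce_chunk(X, y, per_class=20, seed=0):
    """Mapper: keep at most `per_class` examples of each class from one chunk."""
    rng = np.random.default_rng(seed)
    kept = []
    for label in np.unique(y):
        idx = rng.permutation(np.flatnonzero(y == label))[:per_class]
        kept.extend(idx)
    return X[kept], y[kept]

def reduce_merge(parts):
    """Reducer: concatenate the prototypes produced by every mapper."""
    Xs, ys = zip(*parts)
    return np.vstack(Xs), np.concatenate(ys)

def knn_predict(X_train, y_train, X_query, k=3):
    d = np.linalg.norm(X_train[None, :, :] - X_query[:, None, :], axis=2)
    nearest = np.argsort(d, axis=1)[:, :k]
    return np.array([Counter(y_train[row]).most_common(1)[0][0] for row in nearest])

# Toy usage: two imbalanced chunks, merged and then queried
rng = np.random.default_rng(1)
chunks = [(rng.normal(size=(200, 2)), np.r_[np.zeros(180), np.ones(20)]) for _ in range(2)]
parts = [map_reduce_chunk(X, y, seed=i) for i, (X, y) in enumerate(chunks)]
X_red, y_red = reduce_merge(parts)
print(X_red.shape, knn_predict(X_red, y_red, rng.normal(size=(3, 2))))
```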

  18. Analysis of the Growth Process of Neural Cells in Culture Environment Using Image Processing Techniques

    NASA Astrophysics Data System (ADS)

    Mirsafianf, Atefeh S.; Isfahani, Shirin N.; Kasaei, Shohreh; Mobasheri, Hamid

    Here we present an approach for processing neural cell images to analyze their growth process in a culture environment. We have applied several image processing techniques for: (1) environmental noise reduction, (2) neural cell segmentation, (3) neural cell classification based on their dendrites' growth conditions, and (4) neuron feature extraction and measurement (e.g., cell body area, number of dendrites, axon length, and so on). Due to the large amount of noise in the images, we have used feed-forward artificial neural networks to detect edges more precisely.
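
    A minimal sketch of this kind of pipeline is shown below using classical scikit-image operations in place of the paper's ANN-based edge detection; the input file name is a placeholder.

```python
# Minimal noise-reduction / segmentation / measurement pipeline sketch using
# classical operations; the ANN edge detector of the paper is not reproduced,
# and the input file name is a placeholder.
from scipy import ndimage as ndi
from skimage import filters, io, measure

def analyze_cells(path):
    img = io.imread(path, as_gray=True)
    denoised = ndi.median_filter(img, size=3)           # 1. noise reduction
    mask = denoised > filters.threshold_otsu(denoised)  # 2. segmentation
    labels = measure.label(mask)                        # connected components
    props = measure.regionprops(labels)                 # 4. feature measurement
    return [{"cell_id": p.label,
             "body_area_px": p.area,
             "perimeter_px": p.perimeter} for p in props]

# Example call (placeholder file name):
# print(analyze_cells("neural_culture_frame.png"))
```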

  19. webpic: A flexible web application for collecting distance and count measurements from images

    PubMed Central

    2018-01-01

    Despite increasing ability to store and analyze large amounts of data for organismal and ecological studies, the process of collecting distance and count measurements from images has largely remained time-consuming and error-prone, particularly for tasks for which automation is difficult or impossible. Improving the efficiency of these tasks, which allows for more high-quality data to be collected in a shorter amount of time, is therefore a high priority. The open-source web application webpic implements common web languages and widely available libraries and productivity apps to streamline the process of collecting distance and count measurements from images. In this paper, I introduce the framework of webpic and demonstrate one readily available feature of this application, linear measurements, using fossil leaf specimens. This application fills the gap between workflows accomplishable by individuals through existing software and those accomplishable by large, unmoderated crowds. It demonstrates that flexible web languages can be used to streamline time-intensive research tasks without the use of specialized equipment or proprietary software and highlights the potential for web resources to facilitate data collection in research tasks and outreach activities with improved efficiency. PMID:29608592

  20. Research on key technologies of data processing in internet of things

    NASA Astrophysics Data System (ADS)

    Zhu, Yangqing; Liang, Peiying

    2017-08-01

    Data in the Internet of Things (IoT) are polymorphic, heterogeneous, large in volume, and must be processed in real time. Traditional structured, static batch processing methods do not meet the data processing requirements of the IoT. This paper studies a middleware that can integrate heterogeneous IoT data, converting different data formats into a unified format. A data processing model for the IoT based on the Storm stream computing architecture is designed, and existing Internet security technologies are integrated to build a security system for IoT data processing, providing a reference for the efficient transmission and processing of IoT data.
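
    As a sketch of the format-unification idea, the snippet below normalizes two hypothetical device payload formats (JSON and CSV) into one record layout before stream processing; all field names are invented for illustration.

```python
# Toy sketch of normalizing heterogeneous device payloads into one record
# layout before stream processing; payload formats and field names are invented.
import csv
import io
import json

def normalize_json(payload: str) -> dict:
    d = json.loads(payload)
    return {"device": d["id"], "ts": d["timestamp"], "metric": d["type"], "value": float(d["val"])}

def normalize_csv(payload: str) -> dict:
    device, ts, metric, value = next(csv.reader(io.StringIO(payload)))
    return {"device": device, "ts": ts, "metric": metric, "value": float(value)}

readings = [
    normalize_json('{"id": "therm-01", "timestamp": "2017-08-01T12:00:00Z", "type": "temp", "val": "21.5"}'),
    normalize_csv("hygro-02,2017-08-01T12:00:00Z,humidity,48.0"),
]
print(readings)  # both records now share one schema for downstream stream-processing topologies
```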

  1. Massively parallel processor computer

    NASA Technical Reports Server (NTRS)

    Fung, L. W. (Inventor)

    1983-01-01

    An apparatus for processing multidimensional data with strong spatial characteristics, such as raw image data, characterized by a large number of parallel data streams in an ordered array is described. It comprises a large number (e.g., 16,384 in a 128 x 128 array) of parallel processing elements operating simultaneously and independently on single bit slices of a corresponding array of incoming data streams under control of a single set of instructions. Each of the processing elements comprises a bidirectional data bus in communication with a register for storing single bit slices together with a random access memory unit and associated circuitry, including a binary counter/shift register device, for performing logical and arithmetical computations on the bit slices, and an I/O unit for interfacing the bidirectional data bus with the data stream source. The massively parallel processor architecture enables very high speed processing of large amounts of ordered parallel data, including spatial translation by shifting or sliding of bits vertically or horizontally to neighboring processing elements.

  2. A Parallel Nonrigid Registration Algorithm Based on B-Spline for Medical Images.

    PubMed

    Du, Xiaogang; Dang, Jianwu; Wang, Yangping; Wang, Song; Lei, Tao

    2016-01-01

    The nonrigid registration algorithm based on B-spline Free-Form Deformation (FFD) plays a key role and is widely applied in medical image processing due to its flexibility and robustness. However, it requires a tremendous amount of computing time to obtain accurate registration results, especially for large amounts of medical image data. To address this issue, a parallel nonrigid registration algorithm based on B-splines is proposed in this paper. First, the Logarithm Squared Difference (LSD) is used as the similarity metric in the B-spline registration algorithm to improve registration precision. After that, we create a parallel computing strategy and lookup tables (LUTs) to reduce the complexity of the B-spline registration algorithm. As a result, the computing time of three time-consuming steps, B-spline interpolation, LSD computation, and the analytic gradient computation of LSD, is efficiently reduced, since the B-spline registration algorithm employs the Nonlinear Conjugate Gradient (NCG) optimization method. Experimental results on registration quality and execution efficiency for a large amount of medical images show that our algorithm achieves better registration accuracy, in terms of the differences between the best deformation fields and ground truth, and a speedup of 17 times over the single-threaded CPU implementation due to the powerful parallel computing ability of the Graphics Processing Unit (GPU).
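
    To illustrate the lookup-table idea, the sketch below tabulates the standard cubic B-spline basis weights once so that interpolation only needs an index lookup; the table resolution is an arbitrary choice, and this is not the paper's full registration code.

```python
# Sketch of precomputing cubic B-spline basis weights into a lookup table so
# they are not re-evaluated at every voxel; the number of bins is arbitrary.
import numpy as np

def cubic_bspline_weights(u):
    """Weights of the four control points influencing a point at fraction u in [0, 1)."""
    return np.array([(1 - u) ** 3 / 6.0,
                     (3 * u ** 3 - 6 * u ** 2 + 4) / 6.0,
                     (-3 * u ** 3 + 3 * u ** 2 + 3 * u + 1) / 6.0,
                     u ** 3 / 6.0])

def build_lut(bins=1000):
    """Tabulate the weights once so interpolation only needs an index lookup."""
    return np.stack([cubic_bspline_weights(u)
                     for u in np.linspace(0.0, 1.0, bins, endpoint=False)])

LUT = build_lut()
u = 0.37
weights = LUT[int(u * len(LUT))]  # table lookup instead of recomputation
print(weights, weights.sum())     # the four weights always sum to 1
```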

  3. Numerical analysis of the primary processes controlling oxygen dynamics on the Louisiana shelf

    EPA Science Inventory

    The Louisiana shelf, in the northern Gulf of Mexico, receives large amounts of freshwater and nutrients from the Mississippi–Atchafalaya river system. These river inputs contribute to widespread bottom-water hypoxia every summer. In this study, we use a physical–biogeochemical mo...

  4. Treatment of wastewater containing a large amount of suspended solids by a novel multi-staged UASB reactor.

    PubMed

    Uemura, S; Harada, H; Ohashi, A; Torimura, S

    2005-12-01

    Treatment of artificial wastewater containing a large amount of suspended solids composed of soybean processing waste and pig fodder was studied using a novel multi-staged upflow anaerobic sludge blanket reactor. The reactor consisted of three compartments, each containing a gas-solid separator. The wastewater had a chemical oxygen demand of approximately 21600 mg l⁻¹, suspended solids of 12800 mg l⁻¹, and an ammonia concentration of 945 mg l⁻¹. A continuous experiment without effluent circulation showed that the multi-staged reactor was not that effective for the treatment of wastewater containing a large amount of suspended solids. However, operating the reactor with effluent circulation enabled it to achieve 85% organic removal and approximately 70% methane conversion at loading rates between 4.0 and 5.4 kg chemical oxygen demand per cubic meter per day, meaning that the reactor was more effective when effluent was circulated. Morphological investigation revealed that the crude fiber in the sludge was partially degraded and had many small depressions on its surface. Evolved biogas may have become caught in these depressions and caused washout of the sludge.

  5. Additional studies of forest classification accuracy as influenced by multispectral scanner spatial resolution

    NASA Technical Reports Server (NTRS)

    Sadowski, F. E.; Sarno, J. E.

    1976-01-01

    First, an analysis of forest feature signatures was used to help explain the large variation in classification accuracy that can occur among individual forest features for any one case of spatial resolution and the inconsistent changes in classification accuracy that were demonstrated among features as spatial resolution was degraded. Second, the classification rejection threshold was varied in an effort to reduce the large proportion of unclassified resolution elements that previously appeared in the processing of coarse resolution data when a constant rejection threshold was used for all cases of spatial resolution. For the signature analysis, two-channel ellipse plots showing the feature signature distributions for several cases of spatial resolution indicated that the capability of signatures to correctly identify their respective features is dependent on the amount of statistical overlap among signatures. Reductions in signature variance that occur in data of degraded spatial resolution may not necessarily decrease the amount of statistical overlap among signatures having large variance and small mean separations. Features classified by such signatures may thus continue to have similar amounts of misclassified elements in coarser resolution data, and thus, not necessarily improve in classification accuracy.

  6. A mass storage system for supercomputers based on Unix

    NASA Technical Reports Server (NTRS)

    Richards, J.; Kummell, T.; Zarlengo, D. G.

    1988-01-01

    The authors present the design, implementation, and utilization of a large mass storage subsystem (MSS) for the numerical aerodynamics simulation. The MSS supports a large networked, multivendor Unix-based supercomputing facility. The MSS at Ames Research Center provides all processors on the numerical aerodynamics system processing network, from workstations to supercomputers, the ability to store large amounts of data in a highly accessible, long-term repository. The MSS uses Unix System V and is capable of storing hundreds of thousands of files ranging from a few bytes to 2 Gb in size.

  7. Process-in-Network: A Comprehensive Network Processing Approach

    PubMed Central

    Urzaiz, Gabriel; Villa, David; Villanueva, Felix; Lopez, Juan Carlos

    2012-01-01

    A solid and versatile communications platform is very important in modern Ambient Intelligence (AmI) applications, which usually require the transmission of large amounts of multimedia information over a highly heterogeneous network. This article focuses on the concept of Process-in-Network (PIN), which is defined as the possibility that the network processes information as it is being transmitted, and introduces a more comprehensive approach than current network processing technologies. PIN can take advantage of waiting times in queues of routers, idle processing capacity in intermediate nodes, and the information that passes through the network. PMID:22969390

  8. A link between eumelanism and calcium physiology in the barn owl

    NASA Astrophysics Data System (ADS)

    Roulin, Alexandre; Dauwe, Tom; Blust, Ronny; Eens, Marcel; Beaud, Michel

    2006-09-01

    In many animals, melanin-based coloration is strongly heritable and is largely insensitive to the environment and body condition. According to the handicap principle, such a trait may not reveal individual quality because the production of different melanin-based colorations often entails similar costs. However, a recent study showed that the production of eumelanin pigments requires relatively large amounts of calcium, potentially implying that melanin-based coloration is associated with physiological processes requiring calcium. If this is the case, eumelanism may be traded-off against other metabolic processes that require the same elements. We used a correlative approach to examine, for the first time, this proposition in the barn owl, a species in which individuals vary in the amount, size, and blackness of eumelanic spots. For this purpose, we measured calcium concentration in the left humerus of 85 dead owls. Results showed that the humeri of heavily spotted individuals had a higher concentration of calcium. This suggests either that plumage spottiness signals the ability to absorb calcium from the diet for both eumelanin production and storage in bones, or that lightly spotted individuals use more calcium for metabolic processes at the expense of calcium storage in bones. Our study supports the idea that eumelanin-based coloration is associated with a number of physiological processes requiring calcium.

  9. Rare behavior of growth processes via umbrella sampling of trajectories

    NASA Astrophysics Data System (ADS)

    Klymko, Katherine; Geissler, Phillip L.; Garrahan, Juan P.; Whitelam, Stephen

    2018-03-01

    We compute probability distributions of trajectory observables for reversible and irreversible growth processes. These results reveal a correspondence between reversible and irreversible processes, at particular points in parameter space, in terms of their typical and atypical trajectories. Thus key features of growth processes can be insensitive to the precise form of the rate constants used to generate them, recalling the insensitivity to microscopic details of certain equilibrium behavior. We obtained these results using a sampling method, inspired by the "s-ensemble" large-deviation formalism, that amounts to umbrella sampling in trajectory space. The method is a simple variant of existing approaches, and applies to ensembles of trajectories controlled by the total number of events. It can be used to determine large-deviation rate functions for trajectory observables in or out of equilibrium.

  10. Optimized Laplacian image sharpening algorithm based on graphic processing unit

    NASA Astrophysics Data System (ADS)

    Ma, Tinghuai; Li, Lu; Ji, Sai; Wang, Xin; Tian, Yuan; Al-Dhelaan, Abdullah; Al-Rodhaan, Mznah

    2014-12-01

    In classical Laplacian image sharpening, all pixels are processed one by one, which leads to a large amount of computation. Traditional Laplacian sharpening on the CPU is considerably time-consuming, especially for large pictures. In this paper, we propose a parallel implementation of Laplacian sharpening based on Compute Unified Device Architecture (CUDA), a computing platform for Graphics Processing Units (GPUs), and analyze the impact of picture size on performance as well as the relationship between data transfer time and parallel computing time. Further, according to the characteristics of the different GPU memory types, an improved scheme of our method is developed, which exploits shared memory instead of global memory and further increases the efficiency. Experimental results show that the two novel algorithms outperform the traditional sequential method based on OpenCV in terms of computing speed.
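
    For reference, the CPU baseline that such GPU implementations are compared against is typically a direct convolution. The sketch below is a minimal NumPy/SciPy version of classical Laplacian sharpening using the textbook kernel; it is not the authors' CUDA code, and the clipping range assumes an 8-bit grayscale image.

      # Minimal CPU sketch of classical Laplacian sharpening.
      import numpy as np
      from scipy.ndimage import convolve

      # combined "identity + negative Laplacian" kernel: sharpened = f - lap(f)
      SHARPEN_KERNEL = np.array([[ 0, -1,  0],
                                 [-1,  5, -1],
                                 [ 0, -1,  0]], dtype=float)

      def laplacian_sharpen(image):
          """Sharpen a grayscale image (2-D float array with values in [0, 255])."""
          out = convolve(image.astype(float), SHARPEN_KERNEL, mode='nearest')
          return np.clip(out, 0, 255)

      # toy usage on a synthetic ramp image
      img = np.tile(np.linspace(0, 255, 256), (256, 1))
      sharp = laplacian_sharpen(img)
      print(sharp.shape)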

  11. Tripropellant combustion process

    NASA Technical Reports Server (NTRS)

    Kmiec, T. D.; Carroll, R. G.

    1988-01-01

    The addition of small amounts of hydrogen to the combustion of LOX/hydrocarbon propellants in large rocket booster engines has the potential to enhance the system stability. Programs being conducted to evaluate the effects of hydrogen on the combustion of LOX/hydrocarbon propellants at supercritical pressures are described. Combustion instability has been a problem during the development of large hydrocarbon fueled rocket engines. At the higher combustion chamber pressures expected for the next generation of booster engines, the effect of unstable combustion could be even more destructive. The tripropellant engine cycle takes advantage of the superior cooling characteristics of hydrogen to cool the combustion chamber and a small amount of the hydrogen coolant can be used in the combustion process to enhance the system stability. Three aspects of work that will be accomplished to evaluate tripropellant combustion are described. The first is laboratory demonstration of the benefits through the evaluation of drop size, ignition delay and burning rate. The second is analytical modeling of the combustion process using the empirical relationship determined in the laboratory. The third is a subscale demonstration in which the system stability will be evaluated. The approach for each aspect is described and the analytical models that will be used are presented.

  12. Processes to improve energy efficiency during pumping and aeration of recirculating water in circular tank systems

    USDA-ARS?s Scientific Manuscript database

    Conventional gas transfer technologies for aquaculture systems occupy a large amount of space, require considerable capital investment, and can contribute to high electricity demand. In addition, diffused aeration in a circular tank can interfere with the hydrodynamics of water rotation and the spee...

  13. Reactive N emissions from beef cattle feedlots

    USDA-ARS?s Scientific Manuscript database

    Large amounts of nitrogen (N) are fed to meet the nutritional needs of beef cattle in feedlots. However, only from 10 to 15% of fed N is retained in animals. Most N is excreted. Chemical and biological processes transform manure N into ammonia, nitrous oxide and nitrate. These reactive forms of ...

  14. Early stage litter decomposition across biomes

    Treesearch

    Ika Djukic; Sebastian Kepfer-Rojas; Inger Kappel Schmidt; Klaus Steenberg Larsen; Claus Beier; Björn Berg; Kris Verheyen; Adriano Caliman; Alain Paquette; Alba Gutiérrez-Girón; Alberto Humber; Alejandro Valdecantos; Alessandro Petraglia; Heather Alexander; Algirdas Augustaitis; Amélie Saillard; Ana Carolina Ruiz Fernández; Ana I. Sousa; Ana I. Lillebø; Anderson da Rocha Gripp; André-Jean Francez; Andrea Fischer; Andreas Bohner; Andrey Malyshev; Andrijana Andrić; Andy Smith; Angela Stanisci; Anikó Seres; Anja Schmidt; Anna Avila; Anne Probst; Annie Ouin; Anzar A. Khuroo; Arne Verstraeten; Arely N. Palabral-Aguilera; Artur Stefanski; Aurora Gaxiola; Bart Muys; Bernard Bosman; Bernd Ahrends; Bill Parker; Birgit Sattler; Bo Yang; Bohdan Juráni; Brigitta Erschbamer; Carmen Eugenia Rodriguez Ortiz; Casper T. Christiansen; E. Carol Adair; Céline Meredieu; Cendrine Mony; Charles A. Nock; Chi-Ling Chen; Chiao-Ping Wang; Christel Baum; Christian Rixen; Christine Delire; Christophe Piscart; Christopher Andrews; Corinna Rebmann; Cristina Branquinho; Dana Polyanskaya; David Fuentes Delgado; Dirk Wundram; Diyaa Radeideh; Eduardo Ordóñez-Regil; Edward Crawford; Elena Preda; Elena Tropina; Elli Groner; Eric Lucot; Erzsébet Hornung; Esperança Gacia; Esther Lévesque; Evanilde Benedito; Evgeny A. Davydov; Evy Ampoorter; Fabio Padilha Bolzan; Felipe Varela; Ferdinand Kristöfel; Fernando T. Maestre; Florence Maunoury-Danger; Florian Hofhansl; Florian Kitz; Flurin Sutter; Francisco Cuesta; Francisco de Almeida Lobo; Franco Leandro de Souza; Frank Berninger; Franz Zehetner; Georg Wohlfahrt; George Vourlitis; Geovana Carreño-Rocabado; Gina Arena; Gisele Daiane Pinha; Grizelle González; Guylaine Canut; Hanna Lee; Hans Verbeeck; Harald Auge; Harald Pauli; Hassan Bismarck Nacro; Héctor A. Bahamonde; Heike Feldhaar; Heinke Jäger; Helena C. Serrano; Hélène Verheyden; Helge Bruelheide; Henning Meesenburg; Hermann Jungkunst; Hervé Jactel; Hideaki Shibata; Hiroko Kurokawa; Hugo López Rosas; Hugo L. Rojas Villalobos; Ian Yesilonis; Inara Melece; Inge Van Halder; Inmaculada García Quirós; Isaac Makelele; Issaka Senou; István Fekete; Ivan Mihal; Ivika Ostonen; Jana Borovská; Javier Roales; Jawad Shoqeir; Jean-Christophe Lata; Jean-Paul Theurillat; Jean-Luc Probst; Jess Zimmerman; Jeyanny Vijayanathan; Jianwu Tang; Jill Thompson; Jiří Doležal; Joan-Albert Sanchez-Cabeza; Joël Merlet; Joh Henschel; Johan Neirynck; Johannes Knops; John Loehr; Jonathan von Oppen; Jónína Sigríður Þorláksdóttir; Jörg Löffler; José-Gilberto Cardoso-Mohedano; José-Luis Benito-Alonso; Jose Marcelo Torezan; Joseph C. Morina; Juan J. Jiménez; Juan Dario Quinde; Juha Alatalo; Julia Seeber; Jutta Stadler; Kaie Kriiska; Kalifa Coulibaly; Karibu Fukuzawa; Katalin Szlavecz; Katarína Gerhátová; Kate Lajtha; Kathrin Käppeler; Katie A. Jennings; Katja Tielbörger; Kazuhiko Hoshizaki; Ken Green; Lambiénou Yé; Laryssa Helena Ribeiro Pazianoto; Laura Dienstbach; Laura Williams; Laura Yahdjian; Laurel M. Brigham; Liesbeth van den Brink; Lindsey Rustad; al. et

    2018-01-01

    Through litter decomposition, enormous amounts of carbon are emitted to the atmosphere. Numerous large-scale decomposition experiments have been conducted focusing on this fundamental soil process in order to understand the controls on the terrestrial carbon transfer to the atmosphere. However, previous studies were mostly based on site-specific litter and methodologies...

  15. Separation methods and chemical and nutritional characteristics of tomato pomace

    USDA-ARS?s Scientific Manuscript database

    Tomato processing generates a large amount of pomace as a low value by-product primarily used as livestock feed or disposed. The objectives of this research were to investigate the chemical and nutritional characteristics and determine effective separation methods of peel and seed of commercial toma...

  16. MERCURY IN STAMP SAND DISCHARGES: IMPLICATIONS FOR LAKE SUPERIOR MERCURY CYCLING

    EPA Science Inventory

    Approximately a half billion tons of waste rock from the extraction of native copper and silver ores was discharged into the Lake Superior basin. Stamping was the method of choice to recover these metals from the surrounding poor rock. This process created large amounts of extre...

  17. Integrating Computational Science Tools into a Thermodynamics Course

    ERIC Educational Resources Information Center

    Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew

    2018-01-01

    Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and to model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of…

  18. Evaluation of pressurized water cleaning systems for hardware refurbishment

    NASA Technical Reports Server (NTRS)

    Dillard, Terry W.; Deweese, Charles D.; Hoppe, David T.; Vickers, John H.; Swenson, Gary J.; Hutchens, Dale E.

    1995-01-01

    Historically, refurbishment processes for RSRM motor cases and components have employed environmentally harmful materials. Specifically, vapor degreasing processes consume and emit large amounts of ozone depleting compounds. This program evaluates the use of pressurized water cleaning systems as a replacement for the vapor degreasing process. Tests have been conducted to determine if high pressure water washing, without any form of additive cleaner, is a viable candidate for replacing vapor degreasing processes. This paper discusses the findings thus far of Engineering Test Plan - 1168 (ETP-1168), 'Evaluation of Pressurized Water Cleaning Systems for Hardware Refurbishment.'

  19. Image acquisition in the Pi-of-the-Sky project

    NASA Astrophysics Data System (ADS)

    Jegier, M.; Nawrocki, K.; Poźniak, K.; Sokołowski, M.

    2006-10-01

    Modern astronomical image acquisition systems dedicated to sky surveys provide a large amount of data in a single measurement session. During one session that lasts a few hours it is possible to get as much as 100 GB of data. This large amount of data needs to be transferred from the camera and processed. This paper presents some aspects of image acquisition in a sky survey image acquisition system. It describes a dedicated USB Linux driver for the first version of the "Pi of The Sky" CCD camera (later versions also have an Ethernet interface) and the test program for the camera, together with a driver-wrapper providing core device functionality. Finally, the paper contains a description of an algorithm for matching several images based on image features, i.e. star positions and their brightness.
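
    The image-matching step described at the end of the abstract can be illustrated, under assumptions, by matching extracted star lists on position and brightness. The sketch below uses a KD-tree nearest-neighbour query with invented tolerance values; it is not the project's actual matching algorithm.

      # Hedged sketch: match two star lists by position and brightness.
      import numpy as np
      from scipy.spatial import cKDTree

      def match_stars(ref, new, max_dist=2.0, max_dmag=0.5):
          """Return index pairs (i_ref, i_new) of stars that agree in position and magnitude.

          ref, new: arrays of shape (n, 3) holding (x, y, magnitude).
          """
          tree = cKDTree(ref[:, :2])
          dist, idx = tree.query(new[:, :2], k=1)
          pairs = []
          for i_new, (d, i_ref) in enumerate(zip(dist, idx)):
              if d <= max_dist and abs(ref[i_ref, 2] - new[i_new, 2]) <= max_dmag:
                  pairs.append((i_ref, i_new))
          return pairs

      # toy usage: the second list is the first one shifted by a fraction of a pixel
      ref = np.array([[10.0, 12.0, 8.1], [55.3, 40.2, 9.4], [90.0, 15.5, 10.2]])
      new = ref + np.array([0.3, -0.2, 0.05])
      print(match_stars(ref, new))   # -> [(0, 0), (1, 1), (2, 2)]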

  20. A Comparative Analysis of Extract, Transformation and Loading (ETL) Process

    NASA Astrophysics Data System (ADS)

    Runtuwene, J. P. A.; Tangkawarow, I. R. H. T.; Manoppo, C. T. M.; Salaki, R. J.

    2018-02-01

    Data and information are currently growing rapidly in both volume and variety of media. This growth eventually produces very large collections of data, better known as Big Data. Business Intelligence (BI) utilizes large amounts of data and information for analysis so that important information can be obtained and used to support the decision-making process. In practice, a process that integrates existing data and information into a data warehouse is needed. This data integration process is known as Extract, Transformation and Loading (ETL). Many applications have been developed to carry out the ETL process, but selecting which application is more time-, cost- and power-efficient can be a challenge. Therefore, the objective of this study was to provide a comparative analysis of the ETL process using Microsoft SQL Server Integration Service (SSIS) and using Pentaho Data Integration (PDI).
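
    To make the three ETL stages concrete, the following is a minimal, tool-agnostic sketch (extract from a CSV file, transform, load into SQLite). The file name, column names, and cleaning rule are invented for illustration and are unrelated to the SSIS/PDI comparison in the study.

      # Minimal ETL sketch: extract -> transform -> load.
      import csv
      import sqlite3

      def extract(path):
          with open(path, newline='') as f:
              return list(csv.DictReader(f))

      def transform(rows):
          # example rule: keep valid rows and normalise the amount to a float
          cleaned = []
          for row in rows:
              try:
                  cleaned.append((row['customer_id'], float(row['amount'])))
              except (KeyError, ValueError):
                  continue   # drop malformed rows
          return cleaned

      def load(rows, db_path='warehouse.db'):
          con = sqlite3.connect(db_path)
          con.execute('CREATE TABLE IF NOT EXISTS sales (customer_id TEXT, amount REAL)')
          con.executemany('INSERT INTO sales VALUES (?, ?)', rows)
          con.commit()
          con.close()

      if __name__ == '__main__':
          load(transform(extract('sales.csv')))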

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Donald F.; Schulz, Carl; Konijnenburg, Marco

    High-resolution Fourier transform ion cyclotron resonance (FT-ICR) mass spectrometry imaging enables the spatial mapping and identification of biomolecules from complex surfaces. The need for long time-domain transients, and thus large raw file sizes, results in a large amount of raw data (“big data”) that must be processed efficiently and rapidly. This can be compounded by large-area imaging and/or high spatial resolution imaging. For FT-ICR, data processing and data reduction must not compromise the high mass resolution afforded by the mass spectrometer. The continuous mode “Mosaic Datacube” approach allows high mass resolution visualization (0.001 Da) of mass spectrometry imaging data, but requires additional processing as compared to feature-based processing. We describe the use of distributed computing for processing of FT-ICR MS imaging datasets with generation of continuous mode Mosaic Datacubes for high mass resolution visualization. An eight-fold improvement in processing time is demonstrated using a Dutch nationally available cloud service.

  2. The Gist of Juries: Testing a Model of Damage Award Decision Making

    PubMed Central

    Reyna, Valerie F.; Hans, Valerie P.; Corbin, Jonathan C.; Yeh, Ryan; Lin, Kelvin; Royer, Caisa

    2017-01-01

    Despite the importance of damage awards, juries are often at sea about the amounts that should be awarded, with widely differing awards for cases that seem comparable. We tested a new model of damage award decision making by systematically varying the size, context, and meaningfulness of numerical comparisons or anchors. As a result, we were able to elicit large differences in award amounts that replicated for 2 different cases. Although even arbitrary dollar amounts (unrelated to the cases) influenced the size of award judgments, the most consistent effects of numerical anchors were achieved when the amounts were meaningful in the sense that they conveyed the gist of numbers as small or large. Consistent with the model, the ordinal gist of the severity of plaintiff’s damages and defendant’s liability predicted damage awards, controlling for other factors such as motivation for the award-judgment task and perceived economic damages. Contrary to traditional dual-process approaches, numeracy and cognitive style (e.g., need for cognition and cognitive reflection) were not significant predictors of these numerical judgments, but they were associated with lower levels of variability once the gist of the judgments was taken into account. Implications for theory and policy are discussed. PMID:29075092

  3. Enzymatic lignocellulose hydrolysis: Improved cellulase productivity by insoluble solids recycling

    PubMed Central

    2013-01-01

    Background It is necessary to develop efficient methods to produce renewable fuels from lignocellulosic biomass. One of the main challenges to the industrialization of lignocellulose conversion processes is the large amount of cellulase enzymes used for the hydrolysis of cellulose. One method for decreasing the amount of enzyme used is to recycle the enzymes. In this study, the recycling of enzymes associated with the insoluble solid fraction after the enzymatic hydrolysis of cellulose was investigated for pretreated corn stover under a variety of recycling conditions. Results It was found that a significant amount of cellulase activity could be recovered by recycling the insoluble biomass fraction, and the enzyme dosage could be decreased by 30% to achieve the same glucose yields under the most favorable conditions. Enzyme productivity (g glucose produced/g enzyme applied) increased between 30 and 50% by the recycling, depending on the reaction conditions. While increasing the amount of solids recycled increased process performance, the method's applicability was limited by its positive correlation with increasing total solids concentrations, reaction volumes, and lignin content of the insoluble residue. However, increasing amounts of lignin-rich residue during the recycling did not negatively impact glucose yields. Conclusions To take advantage of this effect, the amount of solids recycled should be maximized, based on a given process's ability to deal with higher solids concentrations and volumes. Recycling of enzymes by recycling the insoluble solids fraction was thus shown to be an effective method to decrease enzyme usage, and research should be continued for its industrial application. PMID:23336604

  4. pyPcazip: A PCA-based toolkit for compression and analysis of molecular simulation data

    NASA Astrophysics Data System (ADS)

    Shkurti, Ardita; Goni, Ramon; Andrio, Pau; Breitmoser, Elena; Bethune, Iain; Orozco, Modesto; Laughton, Charles A.

    The biomolecular simulation community is currently in need of novel and optimised software tools that can analyse and process, in reasonable timescales, the large generated amounts of molecular simulation data. In light of this, we have developed and present here pyPcazip: a suite of software tools for compression and analysis of molecular dynamics (MD) simulation data. The software is compatible with trajectory file formats generated by most contemporary MD engines such as AMBER, CHARMM, GROMACS and NAMD, and is MPI parallelised to permit the efficient processing of very large datasets. pyPcazip is a Unix based open-source software (BSD licenced) written in Python.
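
    The compression idea behind such PCA-based tools can be sketched, under assumptions, as keeping only the first few principal components of the aligned atomic coordinates. The example below is plain NumPy and does not use the pyPcazip API or its file format.

      # Hedged illustration of PCA-based trajectory compression.
      import numpy as np

      def pca_compress(traj, n_components):
          """traj: array of shape (n_frames, n_atoms * 3), already aligned.

          Returns (mean, components, projections) from which an approximate
          trajectory can be rebuilt.
          """
          mean = traj.mean(axis=0)
          centered = traj - mean
          # SVD of the centered data; rows of vt are the principal components
          u, s, vt = np.linalg.svd(centered, full_matrices=False)
          components = vt[:n_components]
          projections = centered @ components.T        # (n_frames, n_components)
          return mean, components, projections

      def pca_decompress(mean, components, projections):
          return projections @ components + mean

      # toy usage: 200 frames of 30 "atoms"
      rng = np.random.default_rng(0)
      traj = rng.normal(size=(200, 90))
      mean, comps, proj = pca_compress(traj, n_components=10)
      approx = pca_decompress(mean, comps, proj)
      print(np.sqrt(((traj - approx) ** 2).mean()))    # reconstruction RMS error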

  5. Male Kirtland's Warblers' patch-level response to landscape structure during periods of varying population size and habitat amounts

    USGS Publications Warehouse

    Donner, D.M.; Ribic, C.A.; Probst, J.R.

    2009-01-01

    Forest planners must evaluate how spatiotemporal changes in habitat amount and configuration across the landscape as a result of timber management will affect species' persistence. However, there are few long-term programs available for evaluation. We investigated the response of male Kirtland's Warbler (Dendroica kirtlandii) to 26 years of changing patch and landscape structure during a large, 26-year forestry-habitat restoration program within the warbler's primary breeding range. We found that the average density of male Kirtland's Warblers was related to a different combination of patch and landscape attributes depending on the species' regional population level and habitat amounts on the landscape (early succession jack pine (Pinus banksiana) forests; 15-42% habitat cover). Specifically, patch age and habitat regeneration type were important at low male population and total habitat amounts, while patch age and distance to an occupied patch were important at relatively high population and habitat amounts. Patch age and size were more important at increasing population levels and an intermediate amount of habitat. The importance of patch age to average male density during all periods reflects the temporal buildup and decline of male numbers as habitat suitability within the patch changed with succession. Habitat selection (i.e., preference for wildfire-regenerated habitat) and availability may explain the importance of habitat type and patch size during lower population and habitat levels. The relationship between male density and distance when there was the most habitat on the landscape and the male population was large and still increasing may be explained by the widening spatial dispersion of the increasing male population at the regional scale. Because creating or preserving habitat is not a random process, management efforts would benefit from more investigations of managed population responses to changes in spatial structure that occur through habitat gain rather than habitat loss to further our empirical understanding of general principles of the fragmentation process and habitat cover threshold effects within dynamic landscapes.

  6. Matrix Determination of Reflectance of Hidden Object via Indirect Photography

    DTIC Science & Technology

    2012-03-01

    ...the hidden object. This thesis provides an alternative method of processing the camera images by modeling the system as a set of transport and ...Distribution Function (BRDF). [Figure 1: Indirect photography with camera field of view dictated by point of illumination.] ...would need to be modeled using radiometric principles. A large amount of the improvement in this process was due to the use of a blind...

  7. Characteristics of the microwave pyrolysis and microwave CO2-assisted gasification of dewatered sewage sludge.

    PubMed

    Chun, Young Nam; Jeong, Byeo Ri

    2017-07-28

    Microwave drying-pyrolysis or drying-gasification characteristics were examined to convert sewage sludge into energy and resources. The gasification was carried out with carbon dioxide as a gasifying agent. The results were compared with those of a conventional heating-type electric furnace to contrast the product characteristics of both methods. Through the pyrolysis or gasification, gas, tar, and char were generated as products. The produced gas was the largest component of each process, followed by the sludge char and the tar. During the pyrolysis process, the main components of the produced gas were hydrogen and carbon monoxide, with a small amount of hydrocarbons such as methane and ethylene. In the gasification process, however, the amount of carbon monoxide was greater than the amount of hydrogen. In microwave gasification, a large amount of heavy tar was produced. The largest amount of benzene in light tar was generated from the pyrolysis or gasification. Ammonia and hydrogen cyanide, which are precursors of NOx, were also generated. In the microwave heating method, the sludge char produced by pyrolysis and gasification had pores in the mesopore range. These results indicate that the gas obtained from the microwave pyrolysis or gasification of wet sewage sludge can be used as an alternative fuel, provided that the tar and NOx precursors in the produced gas are treated. Sludge char can be used as a biomass solid fuel or as a tar removal adsorbent if necessary.

  8. Smelting Magnesium Metal using a Microwave Pidgeon Method

    PubMed Central

    Wada, Yuji; Fujii, Satoshi; Suzuki, Eiichi; Maitani, Masato M.; Tsubaki, Shuntaro; Chonan, Satoshi; Fukui, Miho; Inazu, Naomi

    2017-01-01

    Magnesium (Mg) is a lightweight metal with applications in transportation and sustainable battery technologies, but its current production through ore reduction using the conventional Pidgeon process emits large amounts of CO2 and particulate matter (PM2.5). In this work, a novel Pidgeon process driven by microwaves has been developed to produce Mg metal with less energy consumption and no direct CO2 emission. An antenna structure consisting of dolomite as the Mg source and a ferrosilicon antenna as the reducing material was used to confine microwave energy emitted from a magnetron installed in a microwave oven to produce a practical amount of pure Mg metal. This microwave Pidgeon process with an antenna configuration made it possible to produce Mg with an energy consumption of 58.6 GJ/t, corresponding to a 68.6% reduction when compared to the conventional method. PMID:28401910

  9. Verification Study of Buoyancy-Driven Turbulent Nuclear Combustion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-01-01

    Buoyancy-driven turbulent nuclear combustion determines the rate of nuclear burning during the deflagration phase (i.e., the ordinary nuclear flame phase) of Type Ia supernovae, and hence the amount of nuclear energy released during this phase. It therefore determines the amount the white dwarf star expands prior to initiation of a detonation wave, and so the amount of radioactive nickel and thus the peak luminosity of the explosion. However, this key physical process is not fully understood. To better understand this process, the Flash Center has conducted an extensive series of large-scale 3D simulations of buoyancy-driven turbulent nuclear combustion for three different physical situations. This movie shows the results for some of these simulations. Credits: Science: Ray Bair, Katherine Riley, Argonne National Laboratory; Anshu Dubey, Don Lamb, Dongwook Lee, University of Chicago; Robert Fisher, University of Massachusetts at Dartmouth; and Dean Townsley, University of Alabama. Visualization: Jonathan Gallagher, University of Chicago; Randy Hudson, John Norris and Michael E. Papka, Argonne National Laboratory/University of Chicago.

  10. Visual attention mitigates information loss in small- and large-scale neural codes

    PubMed Central

    Sprague, Thomas C; Saproo, Sameer; Serences, John T

    2015-01-01

    Summary The visual system transforms complex inputs into robust and parsimonious neural codes that efficiently guide behavior. Because neural communication is stochastic, the amount of encoded visual information necessarily decreases with each synapse. This constraint requires processing sensory signals in a manner that protects information about relevant stimuli from degradation. Such selective processing – or selective attention – is implemented via several mechanisms, including neural gain and changes in tuning properties. However, examining each of these effects in isolation obscures their joint impact on the fidelity of stimulus feature representations by large-scale population codes. Instead, large-scale activity patterns can be used to reconstruct representations of relevant and irrelevant stimuli, providing a holistic understanding about how neuron-level modulations collectively impact stimulus encoding. PMID:25769502

  11. Microcomputer Referents in Elementary Mathematics: A Sample Approach.

    ERIC Educational Resources Information Center

    Connell, Michael L.

    The mathematical experiences of elementary students often focus on memorizing facts and rules as opposed to making sense of the subject and developing problem solving skills. Students spend large amounts of time processing, memorizing and sorting collections of data which are tasks well performed by computer technology. To correct this situation,…

  12. Removal of Heavy Metal Contamination from Peanut Skin Extracts by Waste Biomass Adsorbents

    USDA-ARS?s Scientific Manuscript database

    Each year, 3.6 million pounds of peanuts are harvested in the United States. Consequent processing, however, generates large amounts of waste biomass as only the seed portion of the fruit is consumed. The under-utilization of waste biomass is a lost economic opportunity to the industry. In particula...

  13. A laboratory study of channel sidewall expansion in upland concentrated flows

    USDA-ARS?s Scientific Manuscript database

    Gully erosion contributes large amounts of sediment within watersheds and gullies incise landscape into fractured patches on the Loess Plateau of China. As one of the main processes of channel development, gully widening occupies as much as 80% of total soil loss, especially in the presence of a les...

  14. An Analysis of Informal Reasoning Fallacy and Critical Thinking Dispositions among Malaysian Undergraduates

    ERIC Educational Resources Information Center

    Ramasamy, Shamala

    2011-01-01

    In this information age, the amount of complex information available due to technological advancement would require undergraduates to be extremely competent in processing information systematically. Critical thinking ability of undergraduates has been the focal point among educators, employers and the public at large. One of the dimensions of…

  15. How Should I Study for the Exam? Self-Regulated Learning Strategies and Achievement in Introductory Biology

    ERIC Educational Resources Information Center

    Sebesta, Amanda J.; Speth, Elena Bray

    2017-01-01

    In college introductory science courses, students are challenged with mastering large amounts of disciplinary content while developing as autonomous and effective learners. Self-regulated learning (SRL) is the process of setting learning goals, monitoring progress toward them, and applying appropriate study strategies. SRL characterizes…

  16. Redox Flow Batteries, a Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knoxville, U. Tennessee; U. Texas Austin; U, McGill

    2011-07-15

    Redox flow batteries are enjoying a renaissance due to their ability to store large amounts of electrical energy relatively cheaply and efficiently. In this review, we examine the components of redox flow batteries with a focus on understanding the underlying physical processes. The various transport and kinetic phenomena are discussed along with the most common redox couples.

  17. Helping Young Children Understand Graphs: A Demonstration Study.

    ERIC Educational Resources Information Center

    Freeland, Kent; Madden, Wendy

    1990-01-01

    Outlines a demonstration lesson showing third graders how to make and interpret graphs. Includes descriptions of purpose, vocabulary, and learning activities in which students graph numbers of students with dogs at home and analyze the contents of M&M candy packages by color. Argues process helps students understand large amounts of abstract…

  18. Students' Explanations in Complex Learning of Disciplinary Programming

    ERIC Educational Resources Information Center

    Vieira, Camilo

    2016-01-01

    Computational Science and Engineering (CSE) has been denominated as the third pillar of science and as a set of important skills to solve the problems of a global society. Along with the theoretical and the experimental approaches, computation offers a third alternative to solve complex problems that require processing large amounts of data, or…

  19. Looking for High Quality Accreditation in Higher Education in Colombia

    ERIC Educational Resources Information Center

    Pérez Gama, Jesús Alfonso; Vega Vega, Anselmo

    2017-01-01

    We look for the High Quality Accreditation of tertiary education in two ways: one, involving large amount of information, including issues such as self-assessment, high quality, statistics, indicators, surveys, and field work (process engineering), during several periods of time; and the second, in relation to the information contained there about…

  20. Human-Level Natural Language Understanding: False Progress and Real Challenges

    ERIC Educational Resources Information Center

    Bignoli, Perrin G.

    2013-01-01

    The field of Natural Language Processing (NLP) focuses on the study of how utterances composed of human-level languages can be understood and generated. Typically, there are considered to be three intertwined levels of structure that interact to create meaning in language: syntax, semantics, and pragmatics. Not only is a large amount of…

  1. Chocolate and cardiovascular health: is it too good to be true?

    PubMed

    Ariefdjohan, Merlin W; Savaiano, Dennis A

    2005-12-01

    Recent findings indicate that cocoa and chocolate, when processed appropriately, may contain relatively large amounts of flavonoids, particularly catechin and epicatechin. We review the benefits of these flavonoids, specifically with regard to cardiovascular health, and raise several unresolved issues that suggest the need for additional research and product development in this area.

  2. Preparation of a novel hyperbranched carbosilane-silica hybrid coating for trace amount detection by solid phase microextraction/gas chromatography.

    PubMed

    Chen, Guowen; Li, Wenjie; Zhang, Chen; Zhou, Chuanjian; Feng, Shengyu

    2012-09-21

    Phenyl-ended hyperbranched carbosilane (HBC) is synthesized and immobilized onto the inner wall of a fused silica capillary column using a sol-gel process. The hybrid coating layer formed is used as a stationary phase for gas chromatography (GC) and as an adsorption medium for solid phase microextraction (SPME). Trifluoroacetic acid, as a catalyst in this process, helps produce a homogeneous hybrid coating layer. This result is beneficial for better column chromatographic performances, such as high efficiency and high resolution. Extraction tests using the novel hybrid layer show an extraordinarily large adsorption capacity and specific adsorption behavior for aromatic compounds. A 1 ppm trace level detectability is obtained with the SPME/GC work model when both of the stationary phase and adsorption layer bear a hyperbranched structure. A large amount of phenyl groups and a low viscosity of hyperbranched polymers contribute to these valuable properties, which are important to environment and safety control, wherein detection sensitivity and special adsorption behavior are usually required. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Investigating the Direct Meltwater Effect in Terrestrial Oxygen-Isotope Paleoclimate Records Using an Isotope-Enabled Earth System Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Jiang; Liu, Zhengyu; Brady, Esther C.

    Variations in terrestrial oxygen-isotope reconstructions from ice cores and speleothems have been primarily attributed to climatic changes of surface air temperature, precipitation amount, or atmospheric circulation. In this work, we demonstrate with the fully coupled isotope-enabled Community Earth System Model an additional process contributing to the oxygen-isotope variations during glacial meltwater events. This process, termed “the direct meltwater effect,” involves propagating large amounts of isotopically depleted meltwater throughout the hydrological cycle and is independent of climatic changes. We find that the direct meltwater effect can make up 15–35% of the δ18O signals in precipitation over Greenland and eastern Brazil for large freshwater forcings (0.25–0.50 sverdrup (10^6 m^3/s)). Model simulations further demonstrate that the direct meltwater effect increases with the magnitude and duration of the freshwater forcing and is sensitive to both the location and shape of the meltwater. These new modeling results have important implications for past climate interpretations of δ18O.

  4. Investigating the Direct Meltwater Effect in Terrestrial Oxygen-Isotope Paleoclimate Records Using an Isotope-Enabled Earth System Model

    DOE PAGES

    Zhu, Jiang; Liu, Zhengyu; Brady, Esther C.; ...

    2017-12-28

    Variations in terrestrial oxygen-isotope reconstructions from ice cores and speleothems have been primarily attributed to climatic changes of surface air temperature, precipitation amount, or atmospheric circulation. In this work, we demonstrate with the fully coupled isotope-enabled Community Earth System Model an additional process contributing to the oxygen-isotope variations during glacial meltwater events. This process, termed “the direct meltwater effect,” involves propagating large amounts of isotopically depleted meltwater throughout the hydrological cycle and is independent of climatic changes. We find that the direct meltwater effect can make up 15–35% of the δ18O signals in precipitation over Greenland and eastern Brazil for large freshwater forcings (0.25–0.50 sverdrup (10^6 m^3/s)). Model simulations further demonstrate that the direct meltwater effect increases with the magnitude and duration of the freshwater forcing and is sensitive to both the location and shape of the meltwater. These new modeling results have important implications for past climate interpretations of δ18O.

  5. Sol-gel antireflective spin-coating process for large-size shielding windows

    NASA Astrophysics Data System (ADS)

    Belleville, Philippe F.; Prene, Philippe; Mennechez, Francoise; Bouigeon, Christian

    2002-10-01

    Interest in antireflective coatings applied onto large-area glass components grows every day for potential applications such as building or shop windows. Today, because of the use of large-size components, the sol-gel process is a competitive route for antireflective coating mass production. The dip-coating technique commonly used for liquid deposition implies a safety hazard due to the handling and storage of coating solutions containing large amounts of highly flammable solvent. Spin-coating, on the other hand, is a low-consumption liquid technique. Although spin-coating is mainly devoted to coating circular small-size substrates, we have developed a spin-coating machine able to coat large-size rectangular windows (up to 1 x 1.7 m2). Both the solutions and the coating conditions have been optimized to deposit optical layers with accurate and uniform thickness and to strongly limit edge effects. An experimental single-layer antireflective coating deposition process onto large-area shielding windows (1000 x 1700 x 20 mm3) is described. Results show that the as-developed process can produce a low specular reflection value (down to 1% on one side) on white-glass windows over the visible range (460-750 nm). A low-temperature curing process (120°C) used after sol-gel deposition gives the antireflective coating abrasion resistance in compliance with the US-MIL-C-0675C moderate test.

  6. Informatics in neurocritical care: new ideas for Big Data.

    PubMed

    Flechet, Marine; Grandas, Fabian Güiza; Meyfroidt, Geert

    2016-04-01

    Big data is the new hype in business and healthcare. Data storage and processing has become cheap, fast, and easy. Business analysts and scientists are trying to design methods to mine these data for hidden knowledge. Neurocritical care is a field that typically produces large amounts of patient-related data, and these data are increasingly being digitized and stored. This review will try to look beyond the hype, and focus on possible applications in neurointensive care amenable to Big Data research that can potentially improve patient care. The first challenge in Big Data research will be the development of large, multicenter, and high-quality databases. These databases could be used to further investigate recent findings from mathematical models, developed in smaller datasets. Randomized clinical trials and Big Data research are complementary. Big Data research might be used to identify subgroups of patients that could benefit most from a certain intervention, or can be an alternative in areas where randomized clinical trials are not possible. The processing and the analysis of the large amount of patient-related information stored in clinical databases is beyond normal human cognitive ability. Big Data research applications have the potential to discover new medical knowledge, and improve care in the neurointensive care unit.

  7. Water demand-supply analysis in a large spatial area based on the processes of evapotranspiration and runoff

    PubMed Central

    Maruyama, Toshisuke

    2007-01-01

    To estimate the amount of evapotranspiration in a river basin, the “short period water balance method” was formulated. Then, by introducing the “complementary relationship method,” the amount of evapotranspiration was estimated seasonally, and with reasonable accuracy, for both small and large areas. Moreover, to accurately estimate river discharge in the low water season, the “weighted statistical unit hydrograph method” was proposed and a procedure for the calculation of the unit hydrograph was developed. Also, a new model, based on the “equivalent roughness method,” was successfully developed for the estimation of flood runoff from newly reclaimed farmlands. Based on the results of this research, a “composite reservoir model” was formulated to analyze the repeated use of irrigation water in large spatial areas. The application of this model to a number of watershed areas provided useful information with regard to the realities of water demand-supply systems in watersheds predominately dedicated to paddy fields, in Japan. PMID:24367144

  8. A Parallel Nonrigid Registration Algorithm Based on B-Spline for Medical Images

    PubMed Central

    Wang, Yangping; Wang, Song

    2016-01-01

    The nonrigid registration algorithm based on B-spline Free-Form Deformation (FFD) plays a key role and is widely applied in medical image processing due to its good flexibility and robustness. However, it requires a tremendous amount of computing time to obtain accurate registration results, especially for large amounts of medical image data. To address this issue, a parallel nonrigid registration algorithm based on B-splines is proposed in this paper. First, the Logarithm Squared Difference (LSD) is used as the similarity metric in the B-spline registration algorithm to improve registration precision. After that, we create a parallel computing strategy and lookup tables (LUTs) to reduce the complexity of the B-spline registration algorithm. As a result, the computing time of the three most time-consuming steps of the Nonlinear Conjugate Gradient (NCG) optimization employed by the registration algorithm, namely B-spline interpolation, LSD computation, and the analytic gradient computation of LSD, is efficiently reduced. Experimental results on registration quality and execution efficiency for a large set of medical images show that our algorithm achieves better registration accuracy, in terms of the differences between the best deformation fields and ground truth, and a speedup of 17 times over the single-threaded CPU implementation thanks to the powerful parallel computing ability of the Graphics Processing Unit (GPU). PMID:28053653

  9. Use of boron waste as an additive in red bricks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uslu, T.; Arol, A. I.

    2004-07-01

    In boron mining and processing operations, large amounts of clay containing tailings have to be discarded. Being rich in boron, the tailings do not only cause economical loss but also pose serious environmental problems. Large areas have to be allocated for waste disposal. In order to alleviate this problem, the possibility of using clayey tailings from a borax concentrator in red brick manufacturing was investigated. Up to 30% by weight tailings addition was found to improve the brick quality.

  10. Evolution of chemically processed air parcels in the lower stratosphere

    NASA Technical Reports Server (NTRS)

    Stolarski, Richard S.; Douglass, Anne R.; Schoeberl, Mark R.

    1994-01-01

    Aircraft, ground-based, and satellite measurements indicate large concentrations of ClO in the lower stratosphere in and near the polar vortex. The amount of local ozone depletion caused by these large ClO concentrations will depend on the relative rates of ozone loss and ClO recovery. ClO recovery occurs when NO(x), from HNO3 photolysis, reacts with ClO to form ClONO2. We show that air parcels with large amounts of ClO will experience a subsequent ozone depletion that depends on the solar zenith angle. When the solar zenith angle is large in the middle of winter, the recovery of the ClO concentration in the parcel is slow relative to ozone depletion. In the spring, when the solar zenith angle is smaller, the ClO recovery is much faster. After ClO recovery, the chlorine chemistry has not returned to normal. The ClO has been converted to ClONO2. ClO production from further encounters with PSCs will be limited by the heterogeneous reaction of ClONO2 with water. Large ozone depletions, of the type seen in the Antarctic, occur only if there is significant irreversible denitrification in the air parcel.

  11. Assessing and Projecting Greenhouse Gas Release due to Abrupt Permafrost Degradation

    NASA Astrophysics Data System (ADS)

    Saito, K.; Ohno, H.; Yokohata, T.; Iwahana, G.; Machiya, H.

    2017-12-01

    Permafrost is a large reservoir of frozen soil organic carbon (SOC; about half of all the terrestrial storage). Therefore, its degradation (i.e., thawing) under global warming may lead to a substantial amount of additional greenhouse gas (GHG) release. However, understanding of the processes, the geographical distribution of such hazards, and the implementation of the relevant processes in advanced climate models are still insufficient, so variations in permafrost remain one of the largest sources of uncertainty in climatic and biogeochemical assessments and projections. Thermokarst, induced by melting of ground ice in ice-rich permafrost, leads to dynamic surface subsidence of up to 60 m, which further affects local and regional societies and ecosystems in the Arctic. It can also accelerate a large-scale warming process through a positive feedback between released GHGs (especially methane), atmospheric warming, and permafrost degradation. This three-year research project (2-1605, Environment Research and Technology Development Fund of the Ministry of the Environment, Japan) aims to assess and project the impacts of GHG release through dynamic permafrost degradation using in-situ and remote (e.g., satellite and airborne) observations, laboratory analysis of sampled ice and soil cores, and numerical modeling, by demonstrating the vulnerability distribution and the relative impacts of large-scale degradation versus such dynamic degradation. Our preliminary laboratory analysis of ice and soil cores sampled in 2016 at Alaskan and Siberian sites largely underlain by ice-rich permafrost shows that, although the gas volumes trapped per unit mass are more or less homogeneous among sites for both ice and soil cores, large variations are found in the methane concentration of the trapped gases, ranging from a few ppm (similar to that of the atmosphere) to hundreds of thousands of ppm. We will also present our numerical approach to evaluating the relative impacts of GHGs released through dynamic permafrost degradation, by implementing conceptual modeling to assess and project the distribution and affected amounts of ground ice and SOC.

  12. Evaluating the Cassandra NoSQL Database Approach for Genomic Data Persistency.

    PubMed

    Aniceto, Rodrigo; Xavier, Rene; Guimarães, Valeria; Hondo, Fernanda; Holanda, Maristela; Walter, Maria Emilia; Lifschitz, Sérgio

    2015-01-01

    Rapid advances in high-throughput sequencing techniques have created interesting computational challenges in bioinformatics. One of them refers to management of massive amounts of data generated by automatic sequencers. We need to deal with the persistency of genomic data, particularly storing and analyzing these large-scale processed data. To find an alternative to the frequently considered relational database model becomes a compelling task. Other data models may be more effective when dealing with a very large amount of nonconventional data, especially for writing and retrieving operations. In this paper, we discuss the Cassandra NoSQL database approach for storing genomic data. We perform an analysis of persistency and I/O operations with real data, using the Cassandra database system. We also compare the results obtained with a classical relational database system and another NoSQL database approach, MongoDB.
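
    A minimal sketch of writing and reading records with the DataStax Python driver (cassandra-driver) is shown below; the keyspace, table layout, and local contact point are illustrative assumptions and are not the schema evaluated in the paper.

      # Hedged sketch: store and query processed sequence records in Cassandra.
      from cassandra.cluster import Cluster

      cluster = Cluster(['127.0.0.1'])
      session = cluster.connect()

      session.execute("""
          CREATE KEYSPACE IF NOT EXISTS genomics
          WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
      """)
      session.set_keyspace('genomics')
      session.execute("""
          CREATE TABLE IF NOT EXISTS reads (
              sample_id text,
              read_id   text,
              sequence  text,
              quality   text,
              PRIMARY KEY (sample_id, read_id)
          )
      """)

      insert = session.prepare(
          'INSERT INTO reads (sample_id, read_id, sequence, quality) VALUES (?, ?, ?, ?)')
      session.execute(insert, ('sampleA', 'r0001', 'ACGTACGT', 'IIIIHHHH'))

      for row in session.execute("SELECT read_id, sequence FROM reads WHERE sample_id = 'sampleA'"):
          print(row.read_id, row.sequence)

      cluster.shutdown()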

  13. Semantic interoperability challenges to process large amount of data perspectives in forensic and legal medicine.

    PubMed

    Jaulent, Marie-Christine; Leprovost, Damien; Charlet, Jean; Choquet, Remy

    2018-07-01

    This article is a position paper dealing with semantic interoperability challenges. It addresses the Variety and Veracity dimensions when integrating, sharing and reusing large amount of heterogeneous data for data analysis and decision making applications in the healthcare domain. Many issues are raised by the necessity to conform Big Data to interoperability standards. We discuss how semantics can contribute to the improvement of information sharing and address the problem of data mediation with domain ontologies. We then introduce the main steps for building domain ontologies as they could be implemented in the context of Forensic and Legal medicine. We conclude with a particular emphasis on the current limitations in standardisation and the importance of knowledge formalization. Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  14. Discovering Related Clinical Concepts Using Large Amounts of Clinical Notes

    PubMed Central

    Ganesan, Kavita; Lloyd, Shane; Sarkar, Vikren

    2016-01-01

    The ability to find highly related clinical concepts is essential for many applications such as for hypothesis generation, query expansion for medical literature search, search results filtering, ICD-10 code filtering and many other applications. While manually constructed medical terminologies such as SNOMED CT can surface certain related concepts, these terminologies are inadequate as they depend on expertise of several subject matter experts making the terminology curation process open to geographic and language bias. In addition, these terminologies also provide no quantifiable evidence on how related the concepts are. In this work, we explore an unsupervised graphical approach to mine related concepts by leveraging the volume within large amounts of clinical notes. Our evaluation shows that we are able to use a data driven approach to discovering highly related concepts for various search terms including medications, symptoms and diseases. PMID:27656096
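
    A toy version of the general data-driven idea, counting how often two concepts co-occur in the same note and ranking neighbours by that count, is sketched below. The notes and concepts are invented, and this simple counting is not the graphical method proposed in the paper.

      # Toy co-occurrence-based relatedness over (invented) clinical notes.
      from collections import defaultdict
      from itertools import combinations

      notes = [
          {'metformin', 'type 2 diabetes', 'hba1c'},
          {'metformin', 'type 2 diabetes', 'nausea'},
          {'lisinopril', 'hypertension'},
          {'type 2 diabetes', 'hba1c', 'insulin'},
      ]

      cooc = defaultdict(int)
      for concepts in notes:
          for a, b in combinations(sorted(concepts), 2):
              cooc[(a, b)] += 1

      def related(term, top_n=3):
          """Rank concepts by how often they co-occur with `term`."""
          scores = {}
          for (a, b), n in cooc.items():
              if a == term:
                  scores[b] = scores.get(b, 0) + n
              elif b == term:
                  scores[a] = scores.get(a, 0) + n
          return sorted(scores.items(), key=lambda kv: -kv[1])[:top_n]

      print(related('type 2 diabetes'))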

  15. Minimization of energy and surface roughness of the products machined by milling

    NASA Astrophysics Data System (ADS)

    Belloufi, A.; Abdelkrim, M.; Bouakba, M.; Rezgui, I.

    2017-08-01

    Metal cutting represents a large portion of the manufacturing industries, which makes this process the largest consumer of energy. Energy consumption is an indirect source of carbon footprint, since CO2 emissions come from the production of energy. High energy consumption therefore leads to high cost and a large amount of CO2 emissions. To date, a great deal of research has been done on metal cutting, but the environmental problems of the process are rarely discussed. The right selection of cutting parameters is an effective method to reduce energy consumption because of the direct relationship between energy consumption and cutting parameters in machining processes. Therefore, one of the objectives of this research is to propose an optimization strategy suitable for machining processes (milling) to achieve the optimum cutting conditions based on the criterion of the energy consumed during milling. In this paper, the problem of the energy consumed in milling is solved with a chosen optimization method. The optimization is done according to the different requirements of the roughing and finishing processes, under various technological constraints.
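
    The kind of cutting-parameter optimization described here can be sketched, under assumptions, as minimizing an energy estimate over feed rate and depth of cut subject to a roughness bound. In the example below, the energy model, the roughness proxy, and every constant are generic placeholders rather than the authors' formulation.

      # Hedged sketch of cutting-parameter optimisation for milling energy.
      import numpy as np
      from scipy.optimize import minimize

      VOLUME = 50e3        # mm^3 of material to remove (assumed)
      WIDTH = 10.0         # radial width of cut, mm (assumed)
      P_IDLE = 1500.0      # machine idle power, W (assumed)
      K_C = 2.0            # specific cutting energy, J/mm^3 (assumed)
      R_MAX = 1.6          # allowed roughness proxy (assumed)

      def energy(x):
          feed, depth = x                       # mm/min, mm
          mrr = feed * depth * WIDTH            # material removal rate, mm^3/min
          t = VOLUME / mrr * 60.0               # machining time, s
          return K_C * VOLUME + P_IDLE * t      # cutting energy + idle energy, J

      def roughness(x):
          feed, depth = x
          return 0.01 * feed * np.sqrt(depth)   # toy roughness proxy

      res = minimize(
          energy,
          x0=[100.0, 1.0],
          bounds=[(50.0, 400.0), (0.5, 3.0)],
          constraints=[{'type': 'ineq', 'fun': lambda x: R_MAX - roughness(x)}],
      )
      print(res.x, res.fun)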

  16. Analytics to Better Interpret and Use Large Amounts of Heterogeneous Data

    NASA Astrophysics Data System (ADS)

    Mathews, T. J.; Baskin, W. E.; Rinsland, P. L.

    2014-12-01

    Data scientists at NASA's Atmospheric Science Data Center (ASDC) are seasoned software application developers who have worked with the creation, archival, and distribution of large datasets (multiple terabytes and larger). In order for ASDC data scientists to effectively implement the most efficient processes for cataloging and organizing data access applications, they must be intimately familiar with the data contained in the datasets with which they are working. Key technologies that are critical components of the background of ASDC data scientists include: large RDBMSs (relational database management systems) and NoSQL databases; web services; service-oriented architectures; structured and unstructured data access; as well as processing algorithms. However, as prices of data storage and processing decrease, sources of data increase, and technologies advance - granting more people access to data in real or near-real time - data scientists are being pressured to accelerate their ability to identify and analyze vast amounts of data. With existing tools this is becoming exceedingly more challenging to accomplish. For example, the NASA Earth Science Data and Information System (ESDIS) alone grew from having just over 4 PBs of data in 2009 to nearly 6 PBs of data in 2011. This amount then increased to roughly 10 PBs of data in 2013. With data from at least ten new missions to be added to the ESDIS holdings by 2017, the current volume will continue to grow exponentially and drive the need to be able to analyze more data even faster. Though there are many highly efficient, off-the-shelf analytics tools available, these tools mainly cater towards business data, which is predominantly unstructured. Consequently, there are very few known analytics tools that interface well to archived Earth science data, which is predominantly heterogeneous and structured. This presentation will identify use cases for data analytics from an Earth science perspective in order to begin to identify specific tools that may be able to address those challenges.

  17. Automated Solvent Seaming of Large Polyimide Membranes

    NASA Technical Reports Server (NTRS)

    Rood, Robert; Moore, James D.; Talley, Chris; Gierow, Paul A.

    2006-01-01

    A solvent-based welding process enables the joining of precise, cast polyimide membranes at their edges to form larger precise membranes. The process creates a homogeneous, optical-quality seam between abutting membranes, with no overlap and with only a very localized area of figure disturbance. The seam retains 90 percent of the strength of the parent material. The process was developed for original use in the fabrication of wide-aperture membrane optics, with areal densities of less than 1 kg/m2, for lightweight telescopes, solar concentrators, antennas, and the like to be deployed in outer space. The process is just as well applicable to the fabrication of large precise polyimide membranes for flat or inflatable solar concentrators and antenna reflectors for terrestrial applications. The process is applicable to cast membranes made of CP1 (or equivalent) polyimide. The process begins with the precise fitting together and fixturing of two membrane segments. The seam is formed by applying a metered amount of a doped solution of the same polyimide along the abutting edges of the membrane segments. After the solution has been applied, the fixtured films are allowed to dry and are then cured by convective heating. The weld material is the same as the parent material, so that what is formed is a homogeneous, strong joint that is almost indistinguishable from the parent material. The success of the process is highly dependent on formulation of the seaming solution from the correct proportion of the polyimide in a suitable solvent. In addition, the formation of reliable seams depends on the deposition of a precise amount of the seaming solution along the seam line. To ensure the required precision, deposition is performed by use of an automated apparatus comprising a modified commercially available, large-format, ink-jet print head on an automated positioning table. The printing head jets the seaming solution into the seam area at a rate controlled in coordination with the movement of the positioning table.

  18. An Estimation of Construction and Demolition Debris in Seoul, Korea: Waste Amount, Type, and Estimating Model.

    PubMed

    Seo, Seongwon; Hwang, Yongwoo

    1999-08-01

    Construction and demolition (C&D) debris is generated at the site of various construction activities. However, the amount of the debris is usually so large that it is necessary to estimate the amount of C&D debris as accurately as possible for effective waste management and control in urban areas. In this paper, an effective estimation method using a statistical model was proposed. The estimation process was composed of five steps: estimation of the life span of buildings; estimation of the floor area of buildings to be constructed and demolished; calculation of individual intensity units of C&D debris; and estimation of the future C&D debris production. This method was also applied in the city of Seoul as an actual case, and the estimated amount of C&D debris in Seoul in 2021 was approximately 24 million tons. Of this total amount, 98% was generated by demolition, and the main components of debris were concrete and brick.
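
    As a rough illustration of the intensity-unit step described above (not the authors' statistical model), debris amounts can be estimated by multiplying demolished floor area by per-unit-area debris intensities and summing over categories; every figure below is a hypothetical placeholder.

```python
# Minimal sketch of the intensity-unit estimation step: debris = demolished floor
# area x per-unit-area debris intensity, summed over building/material categories.
# All figures are hypothetical placeholders, not values from the paper.
demolished_floor_area_m2 = {"residential": 1_200_000, "commercial": 800_000}

# kg of debris generated per m^2 of demolished floor area, by material
intensity_kg_per_m2 = {
    "residential": {"concrete": 650, "brick": 180, "wood": 40},
    "commercial":  {"concrete": 900, "brick": 60,  "wood": 15},
}

totals = {}
for use, area in demolished_floor_area_m2.items():
    for material, intensity in intensity_kg_per_m2[use].items():
        totals[material] = totals.get(material, 0) + area * intensity

for material, kg in sorted(totals.items()):
    print(f"{material}: {kg / 1e9:.2f} million tons")   # 1e9 kg = 1 million tons
```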

  19. SCP -- A Simple CCD Processing Package

    NASA Astrophysics Data System (ADS)

    Lewis, J. R.

    This note describes a small set of programs, written at RGO, which deal with basic CCD frame processing (e.g. bias subtraction, flat fielding, trimming etc.). The need to process large numbers of CCD frames from devices such as FOS or ISIS in order to extract spectra has prompted the writing of routines which will do the basic hack-work with a minimal amount of interaction from the user. Although they were written with spectral data in mind, there are no "spectrum-specific" features in the software which means they can be applied to any CCD data.
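
    The following is not SCP itself, only a minimal numpy sketch of the basic reduction steps the note lists (bias subtraction, flat fielding, trimming), with toy arrays standing in for real frames.

```python
# Not the SCP package: just a minimal numpy sketch of basic CCD reduction steps.
import numpy as np

def reduce_frame(raw, bias_frames, flat_frames, trim=((0, None), (0, None))):
    """Return a bias-subtracted, flat-fielded, trimmed science frame."""
    master_bias = np.median(np.stack(bias_frames), axis=0)
    master_flat = np.median(np.stack(flat_frames), axis=0) - master_bias
    master_flat /= np.mean(master_flat)          # normalise flat to unit mean
    science = (raw - master_bias) / master_flat  # remove pixel-to-pixel gain
    (r0, r1), (c0, c1) = trim
    return science[r0:r1, c0:c1]                 # cut away overscan regions

# Toy usage with random arrays standing in for real frames
rng = np.random.default_rng(0)
raw = rng.normal(1000.0, 10.0, (512, 512))
bias = [rng.normal(300.0, 2.0, (512, 512)) for _ in range(5)]
flat = [rng.normal(800.0, 5.0, (512, 512)) for _ in range(5)]
clean = reduce_frame(raw, bias, flat, trim=((0, 500), (10, 500)))
```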

  20. Extending Beowulf Clusters

    USGS Publications Warehouse

    Steinwand, Daniel R.; Maddox, Brian; Beckmann, Tim; Hamer, George

    2003-01-01

    Beowulf clusters can provide a cost-effective way to compute numerical models and process large amounts of remote sensing image data. Usually a Beowulf cluster is designed to accomplish a specific set of processing goals, and processing is very efficient when the problem remains inside the constraints of the original design. There are cases, however, when one might wish to compute a problem that is beyond the capacity of the local Beowulf system. In these cases, spreading the problem to multiple clusters or to other machines on the network may provide a cost-effective solution.

  1. Subsurface damage in precision ground ULE(R) and Zerodur(R) surfaces.

    PubMed

    Tonnellier, X; Morantz, P; Shore, P; Baldwin, A; Evans, R; Walker, D D

    2007-09-17

    The total process cycle time for large ULE(R) and Zerodur(R) optics can be improved using a precise and rapid grinding process, with low levels of surface waviness and subsurface damage. In this paper, the amounts of defects beneath ULE(R) and Zerodur(R) surfaces ground using a selected grinding mode were compared. The grinding response was characterised by measuring: surface roughness, surface profile and subsurface damage. The observed subsurface damage can be separated into two distinct depth zones, which are: 'process' and 'machine dynamics' related.

  2. New regulations for radiation protection for work involving radioactive fallout emitted by the TEPCO Fukushima Daiichi APP accident--disposal of contaminated soil and wastes.

    PubMed

    Yasui, Shojiro

    2014-01-01

    The accident at the Fukushima Daiichi Atomic Power Plant that accompanied the Great East Japan Earthquake on March 11, 2011, released a large amount of radioactive material. To rehabilitate the contaminated areas, the government of Japan decided to carry out decontamination work and manage the waste resulting from decontamination. In the summer of 2013, the Ministry of the Environment planned to begin a full-scale process for waste disposal of contaminated soil and wastes removed as part of the decontamination work. The existing regulations were not developed to address such a large amount of contaminated wastes. The Ministry of Health, Labour and Welfare (MHLW), therefore, had to amend the existing regulations for waste disposal workers. The amendment of the general regulation targeted the areas where the existing exposure situation overlaps the planned exposure situation. The MHLW established the demarcation lines between the two regulations to be applied in each situation. The amendment was also intended to establish provisions for the operation of waste disposal facilities that handle large amounts of contaminated materials. Deliberation concerning the regulation was conducted when the facilities were under design; hence, necessary adjustments should be made as needed during the operation of the facilities.

  3. Using Mosix for Wide-Area Computational Resources

    USGS Publications Warehouse

    Maddox, Brian G.

    2004-01-01

    One of the problems with using traditional Beowulf-type distributed processing clusters is that they require an investment in dedicated computer resources. These resources are usually needed in addition to pre-existing ones such as desktop computers and file servers. Mosix is a series of modifications to the Linux kernel that creates a virtual computer, featuring automatic load balancing by migrating processes from heavily loaded nodes to less used ones. An extension of the Beowulf concept is to run a Mosix-enabled Linux kernel on a large number of computer resources in an organization. This configuration would provide a very large amount of computational resources based on pre-existing equipment. The advantage of this method is that it provides much more processing power than a traditional Beowulf cluster without the added costs of dedicating resources.

  4. Narrowing the Gap in Quantification of Aerosol-Cloud Radiative Effects

    NASA Astrophysics Data System (ADS)

    Feingold, G.; McComiskey, A. C.; Yamaguchi, T.; Kazil, J.; Johnson, J. S.; Carslaw, K. S.

    2016-12-01

    Despite large advances in our understanding of aerosol and cloud processes over the past years, uncertainty in the aerosol-cloud radiative effect/forcing is still of major concern. In this talk we will advocate a methodology for quantifying the aerosol-cloud radiative effect that considers the primacy of fundamental cloud properties such as cloud amount and albedo alongside the need for process level understanding of aerosol-cloud interactions. We will present a framework for quantifying the aerosol-cloud radiative effect, regime-by-regime, through process-based modelling and observations at the large eddy scale. We will argue that understanding the co-variability between meteorological and aerosol drivers of the radiative properties of the cloud system may be as important an endeavour as attempting to untangle these drivers.

  5. Direct nuclear-powered lasers

    NASA Technical Reports Server (NTRS)

    Jalufka, N. W.

    1983-01-01

    The development of direct nuclear pumped lasers is reviewed. Theoretical and experimental investigations of various methods of converting the energy of nuclear fission fragments to laser power are summarized. The development of direct nuclear pumped lasers was achieved. The basic processes involved in the production of a plasma by nuclear radiation were studied. Significant progress was accomplished in this area and a large amount of basic data on plasma formation and atomic and molecular processes leading to population inversions is available.

  6. The purchase decision process and involvement of the elderly regarding nonprescription products.

    PubMed

    Reisenwitz, T H; Wimbish, G J

    1997-01-01

    The elderly or senior citizen is a large and growing market segment that purchases a disproportionate amount of health care products, particularly nonprescription products. This study attempts to examine the elderly's level of involvement (high versus low) and their purchase decision process regarding nonprescription or over-the-counter (OTC) products. Frequencies and percentages are calculated to indicate level of involvement as well as purchase decision behavior. Previous research is critiqued and managerial implications are discussed.

  7. Roles and applications of biomedical ontologies in experimental animal science.

    PubMed

    Masuya, Hiroshi

    2012-01-01

    A huge amount of experimental data from past studies has played a vital role in the development of new knowledge and technologies in biomedical science. The importance of computational technologies for the reuse of data, data integration, and knowledge discoveries has also increased, providing means of processing large amounts of data. In recent years, information technologies related to "ontologies" have played more significant roles in the standardization, integration, and knowledge representation of biomedical information. This review paper outlines the history of data integration in biomedical science and its recent trends in relation to the field of experimental animal science.

  8. Optical smart packaging to reduce transmitted information.

    PubMed

    Cabezas, Luisa; Tebaldi, Myrian; Barrera, John Fredy; Bolognini, Néstor; Torroba, Roberto

    2012-01-02

    We demonstrate a smart image-packaging optical technique that uses what we believe is a new concept to save byte space when transmitting data. The technique supports a large set of images mapped into modulated speckle patterns. Then, they are multiplexed into a single package. This operation results in a substantial decrease in the final number of bytes in the package compared with the amount resulting from adding the images together without the method. Moreover, there are no requirements on the type of images to be processed. We present results that prove the potential of the technique.

  9. Processing and Analysis of Multichannel Extracellular Neuronal Signals: State-of-the-Art and Challenges

    PubMed Central

    Mahmud, Mufti; Vassanelli, Stefano

    2016-01-01

    In recent years multichannel neuronal signal acquisition systems have allowed scientists to focus on research questions which were otherwise impossible. They act as a powerful means to study brain (dys)functions in in-vivo and in-vitro animal models. Typically, each session of electrophysiological experiments with multichannel data acquisition systems generates a large amount of raw data. For example, a 128-channel signal acquisition system with 16-bit A/D conversion and 20 kHz sampling rate will generate approximately 17 GB of data per hour (uncompressed). This poses an important and challenging problem of inferring conclusions from the large amounts of acquired data. Thus, automated signal processing and analysis tools are becoming a key component in neuroscience research, facilitating extraction of relevant information from neuronal recordings in a reasonable time. The purpose of this review is to introduce the reader to the current state-of-the-art of open-source packages for (semi)automated processing and analysis of multichannel extracellular neuronal signals (i.e., neuronal spikes, local field potentials, electroencephalogram, etc.), and the existing Neuroinformatics infrastructure for tool and data sharing. The review is concluded by pinpointing some major challenges that are being faced, which include the development of novel benchmarking techniques, cloud-based distributed processing and analysis tools, as well as defining novel means to share and standardize data. PMID:27313507
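
    A quick back-of-the-envelope check of the quoted figure (assuming 16-bit samples stored without overhead) reproduces the ~17 GB per hour estimate:

```python
# Back-of-the-envelope check of the data rate quoted in the abstract:
# 128 channels x 16-bit samples x 20 kHz, accumulated over one hour.
channels, bits_per_sample, sampling_rate_hz = 128, 16, 20_000
bytes_per_second = channels * (bits_per_sample / 8) * sampling_rate_hz
gib_per_hour = bytes_per_second * 3600 / 2**30
print(f"{gib_per_hour:.1f} GiB per hour")   # ~17.2 GiB, matching the ~17 GB figure
```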

  10. Big Data Analysis of Manufacturing Processes

    NASA Astrophysics Data System (ADS)

    Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert

    2015-11-01

    The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results.
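
    The assistance system itself is not described in enough detail to reproduce here; the sketch below only illustrates the general idea of automatic anomaly detection on a process signal, using a simple rolling mean/standard-deviation threshold with invented data.

```python
# Not the assistance system described above: a minimal illustration of flagging
# anomalies in a process signal with a rolling mean/std threshold.
import numpy as np

def rolling_zscore_anomalies(signal, window=50, threshold=5.0):
    """Return indices where the signal deviates strongly from its recent history."""
    anomalies = []
    for i in range(window, len(signal)):
        history = signal[i - window:i]
        mu, sigma = history.mean(), history.std()
        if sigma > 0 and abs(signal[i] - mu) > threshold * sigma:
            anomalies.append(i)
    return anomalies

rng = np.random.default_rng(1)
temperature = rng.normal(80.0, 0.5, 2000)     # synthetic sensor trace
temperature[1500] += 10.0                     # injected fault
print(rolling_zscore_anomalies(temperature))  # typically just [1500]
```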

  11. Local Cloudiness Development Forecast Based on Simulation of Solid Phase Formation Processes in the Atmosphere

    NASA Astrophysics Data System (ADS)

    Barodka, Siarhei; Kliutko, Yauhenia; Krasouski, Alexander; Papko, Iryna; Svetashev, Alexander; Turishev, Leonid

    2013-04-01

    Numerical simulation of thundercloud formation processes is of great practical interest. Thunderclouds significantly affect airplane flights, and mesoscale weather forecasting has much to contribute to aviation forecast procedures; an accurate forecast can help avoid aviation accidents caused by weather conditions. The present study focuses on modelling convective cloud development and detecting thunderclouds on the basis of mesoscale atmospheric process simulation, with the aim of significantly improving aeronautical forecasts. In the analysis, primary weather radar information has been used and adapted for mesoscale forecast systems. Two types of domains have been selected for modelling: an internal one (with a radius of 8 km) and an external one (with a radius of 300 km). The internal domain has been applied directly to study local cloud development, and the external domain data have been treated as initial and final conditions for cloud cover formation. The domain height has been chosen according to civil aviation forecast data (i.e. not exceeding 14 km). Simulations of weather conditions and local cloud development have been made within the selected domains with the WRF modelling system. In several cases, thunderclouds are detected within the convective clouds. To identify this category of clouds, we employ a simulation technique for solid phase formation processes in the atmosphere. Based on the modelling results, we construct vertical profiles indicating the amount of solid phase in the atmosphere. Furthermore, we obtain profiles demonstrating the amounts of ice particles and large particles (hailstones). While simulating the processes of solid phase formation, we investigate vertical and horizontal air flows. Consequently, we attempt to separate the total amount of solid phase into categories of small ice particles, large ice particles and hailstones. We also strive to identify and differentiate the basic atmospheric parameters of the sublimation and coagulation processes, aiming to predict ice particle precipitation. To analyze the modelling results we apply the VAPOR three-dimensional visualization package. For the chosen domains, a diurnal synoptic situation has been simulated, including rain, sleet, ice pellets, and hail. As a result, we have obtained a large set of data describing various atmospheric parameters: cloud cover, major wind components, basic levels of isobaric surfaces, and precipitation rate. Based on these data, we show the distinction in precipitation formation at various heights and its differentiation by ice particle type. The relation between the rise of a particle in the atmosphere and its size is analyzed: at 8-10 km altitude large ice particles resulting from coagulation dominate, while at 6-7 km altitude one finds snow and small ice particles formed by condensation growth. Mechanical trajectories of solid precipitation particles for various ice formation processes have also been calculated.

  12. Spaceport Command and Control System Software Development

    NASA Technical Reports Server (NTRS)

    Glasser, Abraham

    2017-01-01

    The Spaceport Command and Control System (SCCS) is the National Aeronautics and Space Administration's (NASA) launch control system for the Orion capsule and Space Launch System, the next generation manned rocket currently in development. This large system requires a large amount of intensive testing that will properly measure the capabilities of the system. Automating the test procedures would save the project money from human labor costs, as well as making the testing process more efficient. Therefore, the Exploration Systems Division (formerly the Electrical Engineering Division) at Kennedy Space Center (KSC) has recruited interns for the past two years to work alongside full-time engineers to develop these automated tests, as well as innovate upon the current automation process.

  13. Interannual kinetics (2010-2013) of large wood in a river corridor exposed to a 50-year flood event and fluvial ice dynamics

    NASA Astrophysics Data System (ADS)

    Boivin, Maxime; Buffin-Bélanger, Thomas; Piégay, Hervé

    2017-02-01

    Semi-alluvial rivers of the Gaspé Peninsula, Québec, are prone to produce and transport vast quantities of large wood (LW). The high rate of lateral erosion owing to high energy flows and noncohesive banks is the main process leading to the recruitment of large wood, which in turn initiates complex patterns of wood accumulation and reentrainment within the active channel. The delta of the Saint-Jean River (SJR) has accumulated large annual wood fluxes since 1960 that culminated in a wood raft of > 3-km in length in 2014. To document the kinetics of large wood on the main channel of SJR, four annual surveys were carried out from 2010 to 2013 to locate and describe > 1000 large wood jams (LWJ) and 2000 large wood individuals (LWI) along a 60-km river section. Airborne and ground photo/video images were used to estimate the wood volume introduced by lateral erosion and to identify local geomorphic conditions that control wood mobility and deposits. Video camera analysis allowed the examination of transport rates from three hydrometeorological events for specific river sections. Results indicate that the volume of LW recruited between 2010 and 2013 represents 57% of the total LW production over the 2004-2013 period. Volumes of wood deposited along the 60-km section were four times higher in 2013 than in 2010. Increases in wood amount occurred mainly in upper alluvial sections of the river, whereas decreases were observed in the semi-alluvial middle sections. Observations suggest that the 50-year flood event of 2010 produced large amounts of LW that were only partly exported out of the basin so that a significant amount was still available for subsequent floods. Large wood storage continued after this flood until a similar flood or an ice-breakup event could remobilise these LW accumulations into the river corridor. Ice-jam floods transport large amounts of wood during events with fairly low flow but do not contribute significantly to recruitment rates (ca. 10 to 30% early). It is fairly probable that the wood export peak observed in 2012 at the river mouth, where no flood occurred and which is similar to the 1-in 10-year flood of 2010, is mainly linked to such ice-break events that occurred in March 2012.

  14. The review of recent carbonate minerals processing technology

    NASA Astrophysics Data System (ADS)

    Solihin

    2018-02-01

    Carbonates are a group of minerals found in relatively large amounts in the Earth's crust. The common carbonate minerals are calcium carbonate (calcite or aragonite, depending on crystal structure), magnesium carbonate (magnesite), calcium-magnesium carbonate (dolomite), and barium carbonate (witherite). Large amounts of calcite can be found in many places in Indonesia such as Padalarang, Sukabumi, and Tasikmalaya (West Java Province). Dolomite can be found in large amounts in Gresik, Lamongan, and Tuban (East Java Province). Magnesite is quite rare in Indonesia, and until recent years it could only be found on Padamarang Island (Southeast Sulawesi Province). These carbonates have been exploited through open-pit mining. Traditionally, calcite can be ground to produce material for brick production, carved into craft products, or roasted to produce lime for many applications, such as a raw material for cement or a flux for metal smelting. Dolomite, meanwhile, has traditionally been used as a raw material to make bricks for local buildings and to make fertilizer for coconut oil plantations. Carbonate minerals in fact contain important elements needed by modern applications. Calcium is one of the elements needed in artificial bone formation, slow-release fertilizer synthesis, dielectric material production, etc. Magnesium is an important material in the automotive industry for producing alloys for main vehicle parts; it is also used as an alloying element in the production of special-purpose steels. Magnesium oxide can be used to produce slow-release fertilizer, catalysts and other modern products. The aim of this review article is to present in brief the recent technology for processing carbonate minerals. The review covers both technology that has been industrially proven and technology that is still at the research and development stage. One of the industrially proven technologies for processing carbonate minerals is the production of magnesium metal from dolomite; the discussion emphasizes the requirements that must be met before this technology can be applied in Indonesia. Other technologies that are still at the research and development stage are also presented and discussed, with the aim of identifying further possible research and development in carbonate processing.

  15. Toward high-throughput genotyping: dynamic and automatic software for manipulating large-scale genotype data using fluorescently labeled dinucleotide markers.

    PubMed

    Li, J L; Deng, H; Lai, D B; Xu, F; Chen, J; Gao, G; Recker, R R; Deng, H W

    2001-07-01

    To efficiently manipulate large amounts of genotype data generated with fluorescently labeled dinucleotide markers, we developed a Microsoft database management system, which offers several advantages. First, it accommodates the dynamic nature of the accumulation of genotype data during the genotyping process, in which some data need to be confirmed or replaced by repeat lab procedures. Raw genotype data can be imported easily and continuously and incorporated into the database during a genotyping process that may continue over an extended period of time in large projects. Second, almost all of the procedures are automatic, including autocomparison of the raw data read by different technicians from the same gel, autoadjustment among the allele fragment-size data from cross-runs or cross-platforms, autobinning of alleles, and autocompilation of genotype data for suitable programs to perform inheritance checks in pedigrees. Third, the system provides functions to track electrophoresis gel files to locate gel or sample sources for any resultant genotype data, which is extremely helpful for double-checking consistency of raw and final data and for directing repeat experiments. In addition, the user-friendly graphic interface renders processing of large amounts of data much less labor-intensive. Furthermore, the system has built-in mechanisms to detect some genotyping errors and to assess the quality of genotype data, which are then summarized in automatically generated statistical reports. The system can easily handle >500,000 genotype data entries, a number more than sufficient for typical whole-genome linkage studies. The modules and programs we developed can be extended to other database platforms, such as Microsoft SQL Server, if the capability to handle still greater quantities of genotype data simultaneously is desired.
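
    As a purely illustrative aside (not the software described above), "autobinning" for a dinucleotide marker can be pictured as snapping raw fragment sizes to a 2 bp grid anchored on a reference allele:

```python
# Purely illustrative allele "autobinning" for a dinucleotide (2 bp repeat) marker:
# raw fragment-size reads are snapped to a 2 bp grid anchored on a reference allele.
# This is not the system described in the abstract, only a sketch of the idea.
def bin_alleles(raw_sizes_bp, anchor_bp, repeat_bp=2):
    binned = []
    for size in raw_sizes_bp:
        repeats = round((size - anchor_bp) / repeat_bp)
        binned.append(anchor_bp + repeats * repeat_bp)
    return binned

reads = [151.8, 152.1, 153.9, 156.2, 154.05]   # raw sizes from the sequencer
print(bin_alleles(reads, anchor_bp=152))        # -> [152, 152, 154, 156, 154]
```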

  16. Offset Initial Sodium Loss To Improve Coulombic Efficiency and Stability of Sodium Dual-Ion Batteries.

    PubMed

    Ma, Ruifang; Fan, Ling; Chen, Suhua; Wei, Zengxi; Yang, Yuhua; Yang, Hongguan; Qin, Yong; Lu, Bingan

    2018-05-09

    Sodium dual-ion batteries (NDIBs) have recently attracted extensive attention because of their low cost and abundant sodium resources. However, the low capacity of the carbonaceous anode reduces the energy density, and the formation of the solid-electrolyte interphase (SEI) on the anode during the initial cycles consumes a large amount of Na+ from the electrolyte, which results in low Coulombic efficiency and inferior stability of the NDIBs. To address these issues, a phosphorus-doped soft carbon (P-SC) anode combined with a presodiation process is developed to enhance the performance of the NDIBs. The phosphorus doping enhances the electric conductivity and further improves the sodium storage properties. In addition, an SEI can be pre-formed on the anode during the presodiation process, so the anode does not need to consume large amounts of Na+ to form the SEI during the cycling of the NDIBs. Consequently, the NDIBs with the P-SC anode after the presodiation process exhibit high Coulombic efficiency (over 90%) and long cycle stability (81 mA h g-1 at 1000 mA g-1 after 900 cycles with capacity retention of 81.8%), far superior to the unsodiated NDIBs. This work may provide guidance for developing high performance NDIBs in the future.

  17. Pulsed Electron Beam Water Radiolysis for Sub-Microsecond Hydroxyl Radical Protein Footprinting

    PubMed Central

    Watson, Caroline; Janik, Ireneusz; Zhuang, Tiandi; Charvátová, Olga; Woods, Robert J.; Sharp, Joshua S.

    2009-01-01

    Hydroxyl radical footprinting is a valuable technique for studying protein structure, but care must be taken to ensure that the protein does not unfold during the labeling process due to oxidative damage. Footprinting methods based on sub-microsecond laser photolysis of peroxide that complete the labeling process faster than the protein can unfold have been recently described; however, the mere presence of large amounts of hydrogen peroxide can also cause uncontrolled oxidation and minor conformational changes. We have developed a novel method for sub-microsecond hydroxyl radical protein footprinting using a pulsed electron beam from a 2 MeV Van de Graaff electron accelerator to generate a high concentration of hydroxyl radicals by radiolysis of water. The amount of oxidation can be controlled by buffer composition, pulsewidth, dose, and dissolved nitrous oxide gas in the sample. Our results with ubiquitin and β-lactoglobulin A demonstrate that one sub-microsecond electron beam pulse produces extensive protein surface modifications. Highly reactive residues that are buried within the protein structure are not oxidized, indicating that the protein retains its folded structure during the labeling process. Time-resolved spectroscopy indicates that the major part of protein oxidation is complete in a timescale shorter than that of large scale protein motions. PMID:19265387

  18. Immunological and biochemical characterization of processing products from the neurotensin/neuromedin N precursor in the rat medullary thyroid carcinoma 6-23 cell line.

    PubMed Central

    Bidard, J N; de Nadai, F; Rovere, C; Moinier, D; Laur, J; Martinez, J; Cuber, J C; Kitabgi, P

    1993-01-01

    Neurotensin (NT) and neuromedin N (NN) are two related biologically active peptides that are encoded in the same precursor molecule. In the rat, the precursor consists of a 169-residue polypeptide starting with an N-terminal signal peptide and containing in its C-terminal region one copy each of NT and NN. NN precedes NT and is separated from it by a Lys-Arg sequence. Two other Lys-Arg sequences flank the N-terminus of NN and the C-terminus of NT. A fourth Lys-Arg sequence occurs near the middle of the precursor and is followed by an NN-like sequence. Finally, an Arg-Arg pair is present within the NT moiety. The four Lys-Arg doublets represent putative processing sites in the precursor molecule. The present study was designed to investigate the post-translational processing of the NT/NN precursor in the rat medullary thyroid carcinoma (rMTC) 6-23 cell line, which synthesizes large amounts of NT upon dexamethasone treatment. Five region-specific antisera recognizing the free N- or C-termini of sequences adjacent to the basic doublets were produced, characterized and used for immunoblotting and radioimmunoassay studies in combination with gel filtration, reverse-phase h.p.l.c. and trypsin digestion of rMTC 6-23 cell extracts. Because two of the antigenic sequences, i.e. NN and the NN-like sequence, start with a lysine residue that is essential for recognition by their respective antisera, a micromethod by which trypsin specifically cleaves at arginine residues was developed. The results show that dexamethasone-treated rMTC 6-23 cells produced comparable amounts of NT, NN and a peptide corresponding to a large N-terminal precursor fragment lacking the NN and NT moieties. This large fragment was purified. N-Terminal sequencing revealed that it started at residue Ser23 of the prepro-NT/NN sequence, and thus established the Cys22-Ser23 bond as the cleavage site of the signal peptide. Two other large N-terminal fragments bearing respectively the NN and NT sequences at their C-termini were present in lower amounts. The NN-like sequence was internal to all the large fragments. There was no evidence for the presence of peptides with the NN-like sequence at their N-termini. This shows that, in rMTC 6-23 cells, the precursor is readily processed at the three Lys-Arg doublets that flank and separate the NT and NN sequences. In contrast, the Lys-Arg doublet that precedes the NN-like sequence is not processed in this system. (ABSTRACT TRUNCATED AT 400 WORDS) PMID:8471039

  19. Review of enhanced processes for anaerobic digestion treatment of sewage sludge

    NASA Astrophysics Data System (ADS)

    Liu, Xinyuan; Han, Zeyu; Yang, Jie; Ye, Tianyi; Yang, Fang; Wu, Nan; Bao, Zhenbo

    2018-02-01

    A great amount of sewage sludge is produced each year, leading to serious environmental pollution. Many new technologies have been developed recently, but they are difficult to apply at large scale. As one of the traditional technologies, the anaerobic fermentation process can recover bioenergy through biogas production by the action of microbes. However, the anaerobic process faces new challenges due to the low fermentation efficiency caused by the characteristics of sewage sludge itself. To improve the energy yield, enhancement technologies including sewage sludge pretreatment, co-digestion, high-solid digestion and two-stage fermentation have been widely studied in the literature; these are introduced in this article.

  20. The role of crystallization-driven exsolution on the sulfur mass balance in volcanic arc magmas

    USGS Publications Warehouse

    Su, Yanqing; Huber, Christian; Bachmann, Olivier; Zajacz, Zoltán; Wright, Heather M.; Vazquez, Jorge A.

    2016-01-01

    The release of large amounts of sulfur to the stratosphere during explosive eruptions affects the radiative balance in the atmosphere and consequently impacts climate for up to several years after the event. Quantitative estimates of the processes that control the mass balance of sulfur between melt, crystals, and vapor bubbles are needed to better understand the potential sulfur yield of individual eruption events and the conditions that favor large sulfur outputs to the atmosphere. The processes that control sulfur partitioning in magmas are (1) exsolution of volatiles (dominantly H2O) during decompression (first boiling) and during isobaric crystallization (second boiling), (2) the crystallization and breakdown of sulfide or sulfate phases in the magma, and (3) the transport of sulfur-rich vapor (gas influx) from deeper unerupted regions of the magma reservoir. Vapor exsolution and the formation/breakdown of sulfur-rich phases can all be considered as closed-system processes where mass balance arguments are generally easier to constrain, whereas the contribution of sulfur by vapor transport (open system process) is more difficult to quantify. The ubiquitous "excess sulfur" problem, which refers to the much higher sulfur mass released during eruptions than can be accounted for by the amount of sulfur originally dissolved in the erupted melt, as estimated from melt inclusion sulfur concentrations (the "petrologic estimate"), reflects the challenges in closing the sulfur mass balance between crystals, melt, and vapor before and during a volcanic eruption. In this work, we try to quantify the relative importance of closed- and open-system processes for silicic arc volcanoes using kinetic models of sulfur partitioning during exsolution. Our calculations show that crystallization-induced exsolution (second boiling) can generate a significant fraction of the excess sulfur observed in crystal-rich arc magmas. This result does not negate the important role of vapor migration in sulfur mass balance but rather points out that second boiling (in situ exsolution) can provide the necessary yield to drive the excess sulfur to the levels observed for crystal-rich systems. In contrast, in crystal-poor systems, magma recharge that releases sulfur-rich bubbles is necessary and most likely the primary contributor to sulfur mass balance. Finally, we apply our model to account for the effect of sulfur partitioning during second boiling and its impact on sulfur released during the Cerro Galan supereruption in Argentina (2.08 Ma) and show the potential importance of second boiling in releasing a large amount of sulfur to the atmosphere during the eruption of large crystal-rich ignimbrites.

  1. Duration of extinction trials as a determinant of instrumental extinction in terrestrial toads (Rhinella arenarum).

    PubMed

    Puddington, Martín M; Papini, Mauricio R; Muzio, Rubén N

    2018-01-01

    Instrumental learning guides behavior toward resources. When such resources are no longer available, approach to previously reinforced locations is reduced, a process called extinction. The present experiments are concerned with factors affecting the extinction of acquired behaviors in toads. In previous experiments, total reward magnitude in acquisition and duration of extinction trials were confounded. The present experiments were designed to test the effects of these factors in factorial designs. Experiment 1 varied reward magnitude (900, 300, or 100 s of water access per trial) and amount of acquisition training (5 or 15 daily trials). With total amount of water access equated in acquisition, extinction with large rewards was faster (longer latencies in 900/5 than 300/15), but with total amount of training equated, extinction with small rewards was faster (longer latencies in 100/15 than 300/15). Experiment 2 varied reward magnitude (1200 or 120 s of water access per trial) while holding constant the number of acquisition trials (5 daily trials) and the duration of extinction trials (300 s). Extinction performance was lower with small, rather than large reward magnitude (longer latencies in 120/300 than in 1200/300). Thus, instrumental extinction depends upon the amount of time toads are exposed to the empty goal compartment during extinction trials.

  2. Big Data in industry

    NASA Astrophysics Data System (ADS)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon, we need new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. The growth of data creates a situation in which the classic systems for the collection, storage, processing, and visualization of data are losing the battle with the volume, speed, and variety of data that is generated continuously. Much of this data is created by the Internet of Things (IoT): cameras, satellites, cars, GPS navigation, etc. Our challenge is to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years; it is also recognized in the business world, and increasingly in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. The paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence, as well as intelligent agents.

  3. A workload model and measures for computer performance evaluation

    NASA Technical Reports Server (NTRS)

    Kerner, H.; Kuemmerle, K.

    1972-01-01

    A generalized workload definition is presented which constructs measurable workloads of unit size from workload elements, called elementary processes. An elementary process makes almost exclusive use of one of the processors, CPU, I/O processor, etc., and is measured by the cost of its execution. Various kinds of user programs can be simulated by quantitative composition of elementary processes into a type. The character of the type is defined by the weights of its elementary processes and its structure by the amount and sequence of transitions between its elementary processes. A set of types is batched to a mix. Mixes of identical cost are considered as equivalent amounts of workload. These formalized descriptions of workloads allow investigators to compare the results of different studies quantitatively. Since workloads of different composition are assigned a unit of cost, these descriptions enable determination of cost effectiveness of different workloads on a machine. Subsequently performance parameters such as throughput rate, gain factor, internal and external delay factors are defined and used to demonstrate the effects of various workload attributes on the performance of a selected large scale computer system.
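
    As a minimal sketch of the composition idea (elementary processes measured by cost, types defined by weights over them, and mixes batched from types), with all costs and weights invented for illustration:

```python
# A small sketch of the workload composition idea: elementary processes are measured
# by execution cost, a "type" weights them, and a mix batches types together; mixes
# of identical total cost represent equivalent amounts of workload.
# All numbers are invented for illustration.
elementary_cost = {"cpu": 1.0, "io": 2.5}          # cost per elementary process

cpu_bound_type = {"cpu": 0.9, "io": 0.1}           # weights of elementary processes
io_bound_type  = {"cpu": 0.2, "io": 0.8}

def type_cost(weights):
    return sum(w * elementary_cost[p] for p, w in weights.items())

def mix_cost(mix):
    """mix: list of (type_weights, count) pairs batched together."""
    return sum(count * type_cost(weights) for weights, count in mix)

mix_a = [(cpu_bound_type, 40), (io_bound_type, 10)]
mix_b = [(cpu_bound_type, 10), (io_bound_type, 25)]
print(mix_cost(mix_a), mix_cost(mix_b))   # mixes can be scaled to equal (unit) cost
```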

  4. Voyager at Uranus: 1986

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The Voyager 2 spacecraft begins its initial observations of Uranus on November 4, 1985, and makes its final observation on February 25, 1986. The data from the infrared interferometer spectrometer, photopolarimeters, plasma wave instrument, plasma detector, and ultraviolet spectrometer will be processed to add a large block of information to the small amount already known. The trajectory of Voyager 2 is also discussed.

  5. Wildfire and post-fire erosion impacts on forest ecosystem carbon and nitrogen: An analysis

    Treesearch

    D. G. Neary; S. T. Overby

    2006-01-01

    Many ecosystem processes occurring in soils depend upon the presence of organic matter. Soil organic matter is particularly important for nutrient supply, cation exchange capacity, and water retention, hence its importance in long-term site productivity. However, wildfires consume large amounts of aboveground organic material, and soil heating can consume soil organic...

  6. Improving the Compensation Process in Higher Education: Fostering a High Performing Organization

    ERIC Educational Resources Information Center

    Cockerham, Thomas

    2016-01-01

    While there is a large amount of research regarding employee satisfaction and turnover, less attention has been paid to the role of compensation, especially in the context of for-profit institutions of higher education. This capstone project conducted a review of budgetary records and human resources files at a for-profit institution of higher…

  7. Deep feature representation with stacked sparse auto-encoder and convolutional neural network for hyperspectral imaging-based detection of cucumber defects

    USDA-ARS?s Scientific Manuscript database

    It is challenging to achieve rapid and accurate processing of large amounts of hyperspectral image data. This research was aimed to develop a novel classification method by employing deep feature representation with the stacked sparse auto-encoder (SSAE) and the SSAE combined with convolutional neur...
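
    The abstract is truncated, so the following is not the authors' network, only a minimal PyTorch sketch of a single sparse auto-encoder layer of the kind stacked in an SSAE, trained with an L1 penalty on the hidden activations:

```python
# Not the authors' model: a minimal sparse auto-encoder layer of the kind stacked
# in an SSAE, trained with an L1 penalty on the hidden activations.
import torch
import torch.nn as nn

class SparseAutoencoder(nn.Module):
    def __init__(self, n_bands=200, n_hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_bands, n_hidden), nn.Sigmoid())
        self.decoder = nn.Linear(n_hidden, n_bands)

    def forward(self, x):
        h = self.encoder(x)
        return self.decoder(h), h

model = SparseAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(256, 200)                      # toy stand-in for pixel spectra

for _ in range(100):
    recon, hidden = model(x)
    loss = nn.functional.mse_loss(recon, x) + 1e-3 * hidden.abs().mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
# The learned hidden features would then feed a downstream classifier (e.g. a CNN).
```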

  8. Ignorance- versus Evidence-Based Decision Making: A Decision Time Analysis of the Recognition Heuristic

    ERIC Educational Resources Information Center

    Hilbig, Benjamin E.; Pohl, Rudiger F.

    2009-01-01

    According to part of the adaptive toolbox notion of decision making known as the recognition heuristic (RH), the decision process in comparative judgments--and its duration--is determined by whether recognition discriminates between objects. By contrast, some recently proposed alternative models predict that choices largely depend on the amount of…

  9. Autoclave Meltout of Cast Explosives

    DTIC Science & Technology

    1996-08-22

    various tanks, kettles, and pelletizing equipment a usable product was recovered. This process creates large amounts of pink water requiring ... vacuum treatment melt kettles, flaker belts, and improved material handling equipment in an integrated system. During the 1976/1977 period, AED ... McAlester Army Ammo Plant, Oklahoma, to discuss proposed workload and inspect available facilities and equipment. Pilot model production and testing

  10. Impacts of pinyon-juniper treatments on water yields: A historical perspective

    Treesearch

    Peter F. Ffolliott; Cody Stropki

    2008-01-01

    Pinyon-juniper woodlands are not normally considered a high water-yielding type largely because of the low precipitation amounts and high evapotranspiration rates encountered. Nevertheless, a recommendation was made in the 1950s to evaluate the effectiveness of increasing water yields by converting pinyon-juniper overstories to herbaceous covers. A series of process...

  11. Landsat continuity: issues and opportunities for land cover monitoring

    Treesearch

    Michael A. Wulder; Joanne C. White; Samuel N. Goward; Jeffrey G. Masek; James R. Irons; Martin Herold; Warren B. Cohen; Thomas R. Loveland; Curtis E. Woodcock

    2008-01-01

    Initiated in 1972, the Landsat program has provided a continuous record of Earth observation for 35 years. The assemblage of Landsat spatial, spectral, and temporal resolutions, over a reasonably sized image extent, results in imagery that can be processed to represent land cover over large areas with an amount of spatial detail that is absolutely unique and...

  12. Using a remote sensing-based, percent tree cover map to enhance forest inventory estimation

    Treesearch

    Ronald E. McRoberts; Greg C. Liknes; Grant M. Domke

    2014-01-01

    For most national forest inventories, the variables of primary interest to users are forest area and growing stock volume. The precision of estimates of parameters related to these variables can be increased using remotely sensed auxiliary variables, often in combination with stratified estimators. However, acquisition and processing of large amounts of remotely sensed...

  13. Actual Drawing of Histological Images Improves Knowledge Retention

    ERIC Educational Resources Information Center

    Balemans, Monique C. M.; Kooloos, Jan G. M.; Donders, A. Rogier T.; Van der Zee, Catharina E. E. M.

    2016-01-01

    Medical students have to process a large amount of information during the first years of their study, which has to be retained over long periods of nonuse. Therefore, it would be beneficial when knowledge is gained in a way that promotes long-term retention. Paper-and-pencil drawings for the uptake of form-function relationships of basic tissues…

  14. Review of the harvesting and extraction of advanced biofuels and bioproducts

    Treesearch

    Babette L. Marrone;  Ronald E.  Lacey;  Daniel B. Anderson;  James Bonner;  Jim Coons;  Taraka Dale;  Cara Meghan Downes;  Sandun Fernando;  Christopher  Fuller;  Brian Goodall;  Johnathan E. Holladay;  Kiran Kadam;  Daniel  Kalb;  Wei  Liu;  John B. Mott;  Zivko Nikolov;  Kimberly L. Ogden;  Richard T. Sayre;  Brian G. Trewyn;  José A. Olivares

    2017-01-01

    Energy-efficient and scalable harvesting and lipid extraction processes must be developed in order for the algal biofuels and bioproducts industry to thrive. The major challenge for harvesting is the handling of large volumes of cultivation water to concentrate low amounts of biomass. For lipid extraction, the major energy and cost drivers are associated with...

  15. Data archiving and network system of Bisei Spaceguard center

    NASA Astrophysics Data System (ADS)

    Terazono, J.-Y.; Asami, A.; Asher, D.; Hashimoto, N.; Nakano, S.; Nishiyama, K.; Oshima, Y.; Umehara, H.; Urata, T.; Yoshikawa, M.; Isobe, S.

    2002-09-01

    Bisei Spaceguard Center, Japan's first facility for observations of space debris and Near-Earth Objects (NEOs), will produce large amounts of data. In this paper, we describe details of the data transfer and processing system we are now developing. Also we present a software system devoted to the discovery of asteroids mainly by high school students.

  16. An evaluation of multi-probe locality sensitive hashing for computing similarities over web-scale query logs.

    PubMed

    Cormode, Graham; Dasgupta, Anirban; Goyal, Amit; Lee, Chi Hoon

    2018-01-01

    Many modern applications of AI such as web search, mobile browsing, image processing, and natural language processing rely on finding similar items from a large database of complex objects. Due to the very large scale of data involved (e.g., users' queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (a.k.a. LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations that improve performance, suitable for deployment in very large scale settings. The experimental results demonstrate that our variants of LSH achieve robust performance with better recall compared with "vanilla" LSH, even when using the same amount of space.
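
    The paper's multi-probe variants are not reproduced here; the sketch below shows only the simplest relative of them, signed-random-projection LSH for cosine similarity, to make the bucketing idea concrete (dimensions and bit counts are arbitrary):

```python
# A minimal signed-random-projection LSH sketch for cosine similarity.
import numpy as np

rng = np.random.default_rng(42)

def lsh_signature(vectors, hyperplanes):
    """Map each vector to a tuple of sign bits (its hash bucket key)."""
    bits = (vectors @ hyperplanes.T) > 0
    return [tuple(row) for row in bits.astype(int)]

dim, n_bits = 64, 16
hyperplanes = rng.normal(size=(n_bits, dim))

queries = rng.normal(size=(1000, dim))
buckets = {}
for idx, key in enumerate(lsh_signature(queries, hyperplanes)):
    buckets.setdefault(key, []).append(idx)

# Candidate near-neighbours of query 0 are the items sharing its bucket;
# multi-probe LSH additionally probes "nearby" buckets to boost recall.
key0 = lsh_signature(queries[:1], hyperplanes)[0]
print(len(buckets[key0]) - 1, "candidates share query 0's bucket")
```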

  17. Visual attention mitigates information loss in small- and large-scale neural codes.

    PubMed

    Sprague, Thomas C; Saproo, Sameer; Serences, John T

    2015-04-01

    The visual system transforms complex inputs into robust and parsimonious neural codes that efficiently guide behavior. Because neural communication is stochastic, the amount of encoded visual information necessarily decreases with each synapse. This constraint requires that sensory signals are processed in a manner that protects information about relevant stimuli from degradation. Such selective processing--or selective attention--is implemented via several mechanisms, including neural gain and changes in tuning properties. However, examining each of these effects in isolation obscures their joint impact on the fidelity of stimulus feature representations by large-scale population codes. Instead, large-scale activity patterns can be used to reconstruct representations of relevant and irrelevant stimuli, thereby providing a holistic understanding about how neuron-level modulations collectively impact stimulus encoding.

  18. How to improve patient satisfaction when patients are already satisfied: a continuous process-improvement approach.

    PubMed

    Friesner, Dan; Neufelder, Donna; Raisor, Janet; Bozman, Carl S

    2009-01-01

    The authors present a methodology that measures improvement in customer satisfaction scores when those scores are already high and the production process is slow and thus does not generate a large amount of useful data in any given time period. The authors used these techniques with data from a midsized rehabilitation institute affiliated with a regional, nonprofit medical center. Thus, this article functions as a case study, the findings of which may be applicable to a large number of other healthcare providers that share both the mission and challenges faced by this facility. The methodology focused on 2 factors: use of the unique characteristics of panel data to overcome the paucity of observations and a dynamic benchmarking approach to track process variability over time. By focusing on these factors, the authors identify some additional areas for process improvement despite the institute's past operational success.

  19. DDS-Suite - A Dynamic Data Acquisition, Processing, and Analysis System for Wind Tunnel Testing

    NASA Technical Reports Server (NTRS)

    Burnside, Jathan J.

    2012-01-01

    Wind tunnels have optimized their steady-state data systems for acquisition and analysis and have even implemented large dynamic-data acquisition systems; however, development of near real-time processing and analysis tools for dynamic data has lagged. DDS-Suite is a set of tools used to acquire, process, and analyze large amounts of dynamic data. Each phase of the testing process (acquisition, processing, and analysis) is handled by a separate component, so that bottlenecks in one phase do not affect the others, leading to a robust system. DDS-Suite is capable of acquiring 672 channels of dynamic data at a rate of 275 MB/s. More than 300 channels of the system use 24-bit analog-to-digital cards and are capable of producing data with less than 0.01 of phase difference at 1 kHz. System architecture, design philosophy, and examples of use during NASA Constellation and Fundamental Aerodynamic tests are discussed.
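
    A rough per-channel figure implied by the quoted aggregate rate, under the simplifying (and unverified) assumptions that the 275 MB/s is spread uniformly over all 672 channels and that samples are stored as 3 bytes (24 bit):

```python
# Rough per-channel sample rate implied by the quoted aggregate rate, assuming the
# 275 MB/s is spread uniformly over all 672 channels of 24-bit (3-byte) samples.
aggregate_bytes_per_s = 275e6
channels, bytes_per_sample = 672, 3
samples_per_s_per_channel = aggregate_bytes_per_s / channels / bytes_per_sample
print(f"~{samples_per_s_per_channel / 1e3:.0f} kS/s per channel")   # ~136 kS/s
```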

  20. A new multiple air beam approach for in-process form error optical measurement

    NASA Astrophysics Data System (ADS)

    Gao, Y.; Li, R.

    2018-07-01

    In-process measurement can provide feedback for the control of workpiece precision in terms of size, roughness and, in particular, mid-spatial frequency form error. Optical measurement methods are of the non-contact type and possess high precision, as required for in-process form error measurement. In precision machining, coolant is commonly used to reduce heat generation and thermal deformation on the workpiece surface. However, the use of coolant will induce an opaque coolant barrier if optical measurement methods are used. In this paper, a new multiple air beam approach is proposed. The new approach permits the displacement of coolant from any direction and with a large thickness, i.e. with a large amount of coolant. The model, the working principle, and the key features of the new approach are presented. Based on the proposed new approach, a new in-process form error optical measurement system is developed. The coolant removal capability and the performance of this new multiple air beam approach are assessed. The experimental results show that the workpiece surface y(x, z) can be measured successfully with a standard deviation of up to 0.3011 µm even under a large amount of coolant, with a coolant thickness of 15 mm. This corresponds to a relative uncertainty (2σ) of up to 4.35%, with the workpiece surface deeply immersed in the opaque coolant. The results also show that, in terms of coolant removal capability, air supply and air velocity, the proposed new approach improves on the previous single air beam approach by factors of 3.3, 1.3 and 5.3, respectively. The results demonstrate the significant improvements brought by the new multiple air beam method together with the developed measurement system.

  1. [A medical consumable material management information system].

    PubMed

    Tang, Guoping; Hu, Liang

    2014-05-01

    Medical consumable materials are essential supplies for carrying out medical work; they come in a wide range of varieties and are used in large amounts. How to manage them feasibly and efficiently has been a topic of general concern. This article discusses how to design a medical consumable material management information system with a set of standardized processes that brings together medical supplies administrators, suppliers and clinical departments. An advanced management mode, enterprise resource planning (ERP), was applied to the whole system design process.

  2. Big Data Analytics for a Smart Green Infrastructure Strategy

    NASA Astrophysics Data System (ADS)

    Barrile, Vincenzo; Bonfa, Stefano; Bilotta, Giuliana

    2017-08-01

    As is well known, Big Data is a term for data sets so large or complex that traditional data processing applications are not sufficient to process them. The term "Big Data" often refers to the use of predictive analytics, user behavior analytics, or other advanced data analytics methods that extract value from data, and rarely to a particular size of data set. This is especially true for the huge amount of Earth Observation data that satellites constantly orbiting the Earth transmit daily.

  3. The influence of polarization on millimeter wave propagation through rain. [radio signals

    NASA Technical Reports Server (NTRS)

    Bostian, C. W.; Stutzman, W. L.; Wiley, P. H.; Marshall, R. E.

    1973-01-01

    The measurement and analysis of the depolarization and attenuation that occur when millimeter wave radio signals propagate through rain are described. Progress was made in three major areas: the processing of recorded 1972 data, acquisition and processing of a large amount of 1973 data, and the development of a new theoretical model to predict rain cross polarization and attenuation. Each of these topics is described in detail along with radio frequency system design for cross polarization measurements.

  4. Sensible use of antisense: how to use oligonucleotides as research tools.

    PubMed

    Myers, K J; Dean, N M

    2000-01-01

    In the past decade, there has been a vast increase in the amount of gene sequence information that has the potential to revolutionize the way diseases are both categorized and treated. Old diagnoses, largely anatomical or descriptive in nature, are likely to be superseded by the molecular characterization of the disease. The recognition that certain genes drive key disease processes will also enable the rational design of gene-specific therapeutics. Antisense oligonucleotides represent a technology that should play multiple roles in this process.

  5. New Engineering Solutions in Creation of Mini-BOF for Metallic Waste Recycling

    NASA Astrophysics Data System (ADS)

    Eronko, S. P.; Gorbatyuk, S. M.; Oshovskaya, E. V.; Starodubtsev, B. I.

    2017-12-01

    New engineering solutions used in the design of the mini melting unit capable of recycling industrial and domestic metallic waste with a high content of harmful impurities are provided. High efficiency of the process technology is achieved through intensified heat and mass transfer in the molten metal bath, controlled charging of large amounts of lump and fine reagents into the bath, and cut-off of the remaining process slag during metal tapping into the teeming ladle.

  6. Theoretical bases for conducting certain technological processes in space

    NASA Technical Reports Server (NTRS)

    Okhotin, A. S.

    1979-01-01

    Dimensionless conservation equations are presented, along with the theoretical bases of fluid behavior aboard orbiting satellites, with application to the processes of manufacturing crystals in weightlessness. The small residual gravitational acceleration is shown to increase the separation of bands of varying concentration. Natural convection is shown to have no practical effect on crystallization from a liquid melt. Barodiffusion is also negligibly small in realistic conditions of weightlessness. The effects of surface tension become increasingly large, and suggestions are made for further research.

  7. Software for rapid prototyping in the pharmaceutical and biotechnology industries.

    PubMed

    Kappler, Michael A

    2008-05-01

    The automation of drug discovery methods continues to develop, especially techniques that process information, represent workflow and facilitate decision-making. The magnitude of data and the plethora of questions in pharmaceutical and biotechnology research give rise to the need for rapid prototyping software. This review describes the advantages and disadvantages of three solutions: Competitive Workflow, Taverna and Pipeline Pilot. Each of these systems processes large amounts of data, integrates diverse systems and assists novice programmers and human experts in critical decision-making steps.

  8. ms_lims, a simple yet powerful open source laboratory information management system for MS-driven proteomics.

    PubMed

    Helsens, Kenny; Colaert, Niklaas; Barsnes, Harald; Muth, Thilo; Flikka, Kristian; Staes, An; Timmerman, Evy; Wortelkamp, Steffi; Sickmann, Albert; Vandekerckhove, Joël; Gevaert, Kris; Martens, Lennart

    2010-03-01

    MS-based proteomics produces large amounts of mass spectra that require processing, identification and possibly quantification before interpretation can be undertaken. High-throughput studies require automation of these various steps, and management of the data in association with the results obtained. We here present ms_lims (http://genesis.UGent.be/ms_lims), a freely available, open-source system based on a central database to automate data management and processing in MS-driven proteomics analyses.

  9. Gene regulation knowledge commons: community action takes care of DNA binding transcription factors

    PubMed Central

    Tripathi, Sushil; Vercruysse, Steven; Chawla, Konika; Christie, Karen R.; Blake, Judith A.; Huntley, Rachael P.; Orchard, Sandra; Hermjakob, Henning; Thommesen, Liv; Lægreid, Astrid; Kuiper, Martin

    2016-01-01

    A large gap remains between the amount of knowledge in scientific literature and the fraction that gets curated into standardized databases, despite many curation initiatives. Yet the availability of comprehensive knowledge in databases is crucial for exploiting existing background knowledge, both for designing follow-up experiments and for interpreting new experimental data. Structured resources also underpin the computational integration and modeling of regulatory pathways, which further aids our understanding of regulatory dynamics. We argue how cooperation between the scientific community and professional curators can increase the capacity of capturing precise knowledge from literature. We demonstrate this with a project in which we mobilize biological domain experts who curate large amounts of DNA binding transcription factors, and show that they, although new to the field of curation, can make valuable contributions by harvesting reported knowledge from scientific papers. Such community curation can enhance the scientific epistemic process. Database URL: http://www.tfcheckpoint.org PMID:27270715

  10. Evaluating the Cassandra NoSQL Database Approach for Genomic Data Persistency

    PubMed Central

    Aniceto, Rodrigo; Xavier, Rene; Guimarães, Valeria; Hondo, Fernanda; Holanda, Maristela; Walter, Maria Emilia; Lifschitz, Sérgio

    2015-01-01

    Rapid advances in high-throughput sequencing techniques have created interesting computational challenges in bioinformatics. One of them refers to management of massive amounts of data generated by automatic sequencers. We need to deal with the persistency of genomic data, particularly storing and analyzing these large-scale processed data. To find an alternative to the frequently considered relational database model becomes a compelling task. Other data models may be more effective when dealing with a very large amount of nonconventional data, especially for writing and retrieving operations. In this paper, we discuss the Cassandra NoSQL database approach for storing genomic data. We perform an analysis of persistency and I/O operations with real data, using the Cassandra database system. We also compare the results obtained with a classical relational database system and another NoSQL database approach, MongoDB. PMID:26558254
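
    To make the data model discussed above concrete, the following minimal sketch (hypothetical keyspace, table and column names, not the schema evaluated by the authors) shows how sequencing reads might be written to and read back from Cassandra with the DataStax Python driver.

```python
# Minimal sketch of storing genomic reads in Cassandra (hypothetical schema,
# not the one evaluated in the paper). Requires the DataStax Python driver
# (pip install cassandra-driver) and a running Cassandra node.
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])          # contact point is an assumption
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS genomics
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
""")
session.set_keyspace("genomics")

# Reads are partitioned by sample so all reads of one sample live together.
session.execute("""
    CREATE TABLE IF NOT EXISTS reads (
        sample_id text,
        read_id   text,
        sequence  text,
        quality   text,
        PRIMARY KEY (sample_id, read_id)
    )
""")

insert = session.prepare(
    "INSERT INTO reads (sample_id, read_id, sequence, quality) VALUES (?, ?, ?, ?)"
)
session.execute(insert, ("sampleA", "r0001", "ACGTACGT", "IIIIHHHH"))

# Retrieve all reads for one sample (a single-partition query).
for row in session.execute(
        "SELECT read_id, sequence FROM reads WHERE sample_id = %s", ("sampleA",)):
    print(row.read_id, row.sequence)

cluster.shutdown()
```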

  11. Fuzzy Document Clustering Approach using WordNet Lexical Categories

    NASA Astrophysics Data System (ADS)

    Gharib, Tarek F.; Fouad, Mohammed M.; Aref, Mostafa M.

    Text mining refers generally to the process of extracting interesting information and knowledge from unstructured text. This area is growing rapidly, mainly because of the strong need to analyse the huge amount of textual data that resides on internal file systems and the Web. Text document clustering provides an effective navigation mechanism to organize this large amount of data by grouping documents into a small number of meaningful classes. In this paper we propose a fuzzy text document clustering approach using WordNet lexical categories and the Fuzzy c-Means algorithm. Experiments are performed to compare the efficiency of the proposed approach with recently reported approaches. Experimental results show that fuzzy clustering leads to strong performance, with the Fuzzy c-Means algorithm outperforming classical clustering algorithms such as k-means and bisecting k-means in both clustering quality and running time efficiency.
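
    As a concrete illustration of the clustering step, the sketch below implements the standard fuzzy c-means updates on TF-IDF document vectors in NumPy; it is only a minimal example and omits the WordNet lexical-category feature extraction that the paper combines with the algorithm.

```python
# Minimal fuzzy c-means on TF-IDF document vectors (illustrative only; the
# WordNet lexical-category features used in the paper are not reproduced).
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, eps=1e-9, seed=0):
    """Return (cluster centers, membership matrix) for X of shape (n_docs, n_features)."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)            # memberships of each doc sum to 1
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / (Um.sum(axis=0)[:, None] + eps)
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + eps
        ratio = (d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0))
        U = 1.0 / ratio.sum(axis=2)              # standard FCM membership update
    return centers, U

docs = ["stocks fall as markets react to rates",
        "team wins the championship game",
        "new vaccine trial shows promising results",
        "players traded before the new season"]
X = TfidfVectorizer().fit_transform(docs).toarray()
centers, U = fuzzy_c_means(X, c=2)
print(U.round(2))    # soft membership of each document in each of the 2 clusters
```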

  12. The application of waste fly ash and construction-waste in cement filling material in goaf

    NASA Astrophysics Data System (ADS)

    Chen, W. X.; Xiao, F. K.; Guan, X. H.; Cheng, Y.; Shi, X. P.; Liu, S. M.; Wang, W. W.

    2018-01-01

    As urbanization has accelerated, large amounts of abandoned fly ash and construction waste have been produced, occupying farmland and polluting the environment. In this paper, large amounts of construction waste and abandoned fly ash are mixed into the filling material for goaf; the best formula of the filling material containing a large amount of abandoned fly ash and construction waste is obtained, and the performance of the filling material is analyzed. The experimental results show that the cost of the filling material is very low while its performance is very good, giving it good prospects for goaf filling.

  13. Scaling up to address data science challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendelberger, Joanne R.

    Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. Here, this article will explore various challenges in Data Science and will highlight statistical approaches that can facilitate analysis of large-scale data including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.

  14. Scaling up to address data science challenges

    DOE PAGES

    Wendelberger, Joanne R.

    2017-04-27

    Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. Here, this article will explore various challenges in Data Science and will highlight statistical approaches that can facilitate analysis of large-scale data including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.
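
    One of the sampling and data-reduction techniques alluded to above can be made concrete with a short sketch: reservoir sampling keeps a fixed-size uniform random sample from a stream too large to hold in memory (an illustrative example, not code from the article).

```python
# Reservoir sampling (Algorithm R): keep a uniform random sample of k items
# from a stream of unknown length without storing the whole stream.
import random

def reservoir_sample(stream, k, seed=0):
    random.seed(seed)
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)
        else:
            j = random.randint(0, i)      # inclusive on both ends
            if j < k:
                reservoir[j] = item       # replace with decreasing probability
    return reservoir

# Example: sample 5 records from a simulated stream of one million.
print(reservoir_sample(range(1_000_000), k=5))
```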

  15. The Origin of the Terra Meridiani Sediments: Volatile Transport and the Formation of Sulfate Bearing Layered Deposits on Mars

    NASA Technical Reports Server (NTRS)

    Niles, P.B.

    2008-01-01

    The chemistry, sedimentology, and geology of the Meridiani sedimentary deposits are best explained by eolian reworking of the sublimation residue of a large scale ice/dust deposit. This large ice deposit was located in close proximity to Terra Meridiani and incorporated large amounts of dust, sand, and SO2 aerosols generated by impacts and volcanism during early martian history. Sulfate formation and chemical weathering of the initial igneous material is hypothesized to have occurred inside of the ice when the darker mineral grains were heated by solar radiant energy. This created conditions in which small films of liquid water were created in and around the mineral grains. This water dissolved the SO2 and reacted with the mineral grains forming an acidic environment under low water/rock conditions. Subsequent sublimation of this ice deposit left behind large amounts of weathered sublimation residue which became the source material for the eolian process that deposited the Terra Meridiani deposit. The following features of the Meridiani sediments are best explained by this model: The large scale of the deposit, its mineralogic similarity across large distances, the cation-conservative nature of the weathering processes, the presence of acidic groundwaters on a basaltic planet, the accumulation of a thick sedimentary sequence outside of a topographic basin, and the low water/rock ratio needed to explain the presence of very soluble minerals and elements in the deposit. Remote sensing studies have linked the Meridiani deposits to a number of other martian surface features through mineralogic similarities, geomorphic similarities, and regional associations. These include layered deposits in Arabia Terra, interior layered deposits in the Valles Marineris system, southern Elysium/Aeolis, Amazonis Planitia, and the Hellas basin, Aram Chaos, Aureum Chaos, and Ioni Chaos. The common properties shared by these deposits suggest that all of these deposits share a common formation process which must have acted over a large area of Mars. The results of this study suggest a mechanism for volatile transport on Mars without invoking an early greenhouse. They also imply a common formation mechanism for most of the sulfate minerals and layered deposits on Mars, which explains their common occurrence.

  16. A diagnostic interface for the ICOsahedral Non-hydrostatic (ICON) modelling framework based on the Modular Earth Submodel System (MESSy v2.50)

    NASA Astrophysics Data System (ADS)

    Kern, Bastian; Jöckel, Patrick

    2016-10-01

    Numerical climate and weather models have advanced to finer scales, accompanied by large amounts of output data. The model systems hit the input and output (I/O) bottleneck of modern high-performance computing (HPC) systems. We aim to apply diagnostic methods online during the model simulation instead of applying them as a post-processing step to written output data, to reduce the amount of I/O. To include diagnostic tools into the model system, we implemented a standardised, easy-to-use interface based on the Modular Earth Submodel System (MESSy) into the ICOsahedral Non-hydrostatic (ICON) modelling framework. The integration of the diagnostic interface into the model system is briefly described. Furthermore, we present a prototype implementation of an advanced online diagnostic tool for the aggregation of model data onto a user-defined regular coarse grid. This diagnostic tool will be used to reduce the amount of model output in future simulations. Performance tests of the interface and of two different diagnostic tools show that the interface itself introduces no overhead in the form of additional runtime to the model system. The diagnostic tools, however, have a significant impact on the model system's runtime. This overhead strongly depends on the characteristics and implementation of the diagnostic tool. A diagnostic tool with heavy inter-process communication introduces large overhead, whereas the additional runtime of a diagnostic tool without inter-process communication is low. We briefly describe our efforts to reduce the additional runtime from the diagnostic tools, and present a brief analysis of memory consumption. Future work will focus on optimisation of the memory footprint and the I/O operations of the diagnostic interface.
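
    The coarse-grid aggregation described above can be pictured with a small block-averaging sketch on a regular grid; the actual MESSy diagnostic operates on ICON's icosahedral grid with distributed data, so this is only an illustration of the idea.

```python
# Illustrative block-averaging of a fine regular grid onto a coarser one.
# This only sketches the idea of online aggregation; the MESSy diagnostic
# works on ICON's icosahedral grid with MPI-distributed data.
import numpy as np

def block_mean(field, factor):
    """Average a 2-D field over non-overlapping factor x factor blocks."""
    ny, nx = field.shape
    assert ny % factor == 0 and nx % factor == 0, "grid must divide evenly"
    return field.reshape(ny // factor, factor, nx // factor, factor).mean(axis=(1, 3))

fine = np.random.rand(180, 360)        # e.g. 1-degree model output field
coarse = block_mean(fine, factor=6)    # aggregate to 6-degree cells
print(fine.shape, "->", coarse.shape)  # (180, 360) -> (30, 60)
```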

  17. Recycling of metal bearing electronic scrap in a plasma furnace

    NASA Astrophysics Data System (ADS)

    Jarosz, Piotr; Małecki, Stanisław; Gargul, Krzysztof

    2011-12-01

    The recycling of electronic waste and the recovery of valuable components are large problems in the modern world economy. This paper presents the effects of melting sorted electronic scrap in a plasma furnace. Printed circuit boards, cables, and windings were processed separately. The characteristics of the obtained products (i.e., alloy metal, slag, dust, and gases) are presented. A method of their further processing in order to obtain commercial products is proposed. Because of the chemical composition and physical properties, the waste slag is environmentally inert and can be used for the production of abrasives. Process dusts containing large amounts of carbon and its compounds have a high calorific value. That makes it possible to use them for energy generation. The gas has a high calorific value, and its afterburning combined with energy recovery is necessary.

  18. New pathway of stratocumulus to cumulus transition via aerosol-cloud-precipitation interaction

    NASA Astrophysics Data System (ADS)

    Yamaguchi, T.; Feingold, G.; Kazil, J.

    2017-12-01

    The stratocumulus to cumulus transition (SCT) is typically considered to be a slow, multi-day process, caused primarily by dry air entrainment associated with overshooting cumulus rising under stratocumulus, with only a minor influence of precipitation. In this presentation, we use Lagrangian large eddy simulations of the SCT to show that strong precipitation can induce a rapid transition. The large eddy model is coupled with a two-moment bulk microphysics scheme that predicts aerosol and droplet number concentrations. Moderate aerosol concentrations (100-250 cm-3) produce little to no drizzle from the stratocumulus deck. Large amounts of rain eventually form and wash out the stratocumulus and much of the aerosol, and a cumulus state appears for approximately 10 hours. The initiation of strong rain formation is identified in penetrative cumulus clouds, which are much deeper than the stratocumulus and able to condense large amounts of water. We show that prediction of the cloud droplet number is necessary to capture this fast SCT, since it results from a positive feedback in which collision-coalescence-induced aerosol depletion enhances drizzle formation. Simulations with fixed droplet concentrations that bracket the time-varying aerosol/drop concentrations are therefore not representative of the role of drizzle in the SCT.

  19. Chemical Vapor Synthesis of Titanium Aluminides by Reaction of Aluminum Subchloride and Titanium Tetrachloride

    NASA Astrophysics Data System (ADS)

    Zakirov, Roman A.; Parfenov, Oleg G.; Solovyov, Leonid A.

    2018-02-01

    A new process for developing titanium aluminides (TiAls) using chemical vapor synthesis was investigated in a laboratory experiment. Aluminum subchloride (AlCl) was used as the reducing agent in the reaction with TiCl4 and as the source of aluminum for the Ti-Al alloy. Two types of products, large crystals and fine particles, were fabricated. The large crystals were determined to be TiAl with small amounts of Ti and Ti3Al phases. The composition of the fine particles, on the other hand, varied over a wide range.

  20. Automated Absorber Attachment for X-ray Microcalorimeter Arrays

    NASA Technical Reports Server (NTRS)

    Moseley, S.; Allen, Christine; Kilbourne, Caroline; Miller, Timothy M.; Costen, Nick; Schulte, Eric; Moseley, Samuel J.

    2007-01-01

    Our goal is to develop a method for the automated attachment of large numbers of absorber tiles to large format detector arrays. This development includes the fabrication of high quality, closely spaced HgTe absorber tiles that are properly positioned for pick-and-place by our FC150 flip chip bonder. The FC150 also transfers the appropriate minute amount of epoxy to the detectors for permanent attachment of the absorbers. The success of this development will replace an arduous, risky and highly manual task with a reliable, high-precision automated process.

  1. Synthesis of rose-like boron nitride particles with a high specific surface area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Hongming; Huang, Xiaoxiao; Wen, Guangwu, E-mail: wgw@hitwh.edu.cn

    2010-08-15

    Novel rose-like BN nanostructures were synthesized on a large scale via a two-step procedure. The products were characterized by X-ray diffraction, scanning electron microscopy, transmission electron microscopy, Fourier transform infrared spectroscopy, X-ray photoelectron spectroscopy and nitrogen porosimetry. The results show that the obtained rose-like nanostructures are composed of a large number of h-BN crystalline flakes and have a surface area of 90.31 m²/g. A mechanism was proposed to explain the formation process of the rose-like BN nanostructures.

  2. High-performance computing in image registration

    NASA Astrophysics Data System (ADS)

    Zanin, Michele; Remondino, Fabio; Dalla Mura, Mauro

    2012-10-01

    Thanks to recent technological advances, a large variety of image data with variable geometric, radiometric and temporal resolution is at our disposal. In many applications the processing of such images needs high-performance computing techniques in order to deliver timely responses, e.g. for rapid decisions or real-time actions. Thus, parallel or distributed computing methods, Digital Signal Processor (DSP) architectures, Graphical Processing Unit (GPU) programming and Field-Programmable Gate Array (FPGA) devices have become essential tools for the challenging task of processing large amounts of geo-data. The article focuses on the processing and registration of large datasets of terrestrial and aerial images for 3D reconstruction, diagnostic purposes and monitoring of the environment. For the image alignment procedure, sets of corresponding feature points need to be automatically extracted in order to subsequently compute the geometric transformation that aligns the data. Feature extraction and matching are among the most computationally demanding operations in the processing chain; thus, a great degree of automation and speed is mandatory. The details of the implemented operations (named LARES), which exploit parallel architectures and the GPU, are presented. The innovative aspects of the implementation are (i) its effectiveness on a large variety of unorganized and complex datasets, (ii) its capability to work with high-resolution images and (iii) the speed of the computations. Examples and comparisons with standard CPU processing are also reported and commented on.
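
    As an illustration of the feature extraction and matching step (a generic OpenCV sketch, not the parallel GPU-based LARES implementation described in the paper), the following code extracts ORB features from two overlapping images and estimates the aligning homography with RANSAC; the file names are hypothetical.

```python
# Illustrative feature-based alignment of two overlapping images with OpenCV.
# Plain CPU sketch, not the GPU/parallel LARES pipeline from the paper.
import cv2
import numpy as np

img1 = cv2.imread("view1.png", cv2.IMREAD_GRAYSCALE)   # hypothetical file names
img2 = cv2.imread("view2.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=5000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Brute-force Hamming matching with cross-checking for robustness.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

# RANSAC rejects outlier correspondences while estimating the homography.
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, ransacReprojThreshold=5.0)
print("inlier ratio:", inliers.mean())
```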

  3. Trend and current practices of palm oil mill effluent polishing: Application of advanced oxidation processes and their future perspectives.

    PubMed

    Bello, Mustapha Mohammed; Abdul Raman, Abdul Aziz

    2017-08-01

    Palm oil processing is a multi-stage operation which generates a large amount of effluent. On average, palm oil mill effluent (POME) may contain up to 51,000 mg/L COD, 25,000 mg/L BOD, 40,000 mg/L total solids (TS) and 6,000 mg/L oil and grease. Due to its potential to cause environmental pollution, palm oil mills are required to treat the effluent prior to discharge. Biological treatments using open ponding systems are widely used for POME treatment. Although these processes are capable of reducing the pollutant concentrations, they require long hydraulic retention times and large space, and the effluent frequently fails to satisfy the discharge regulations. Due to more stringent environmental regulations, research interest has recently shifted to the development of polishing technologies for biologically-treated POME. Various technologies such as advanced oxidation processes, membrane technology, adsorption and coagulation have been investigated. Among these, advanced oxidation processes have shown potential as polishing technologies for POME. This paper offers an overview of POME polishing technologies, with particular emphasis on advanced oxidation processes and their prospects for large scale applications. Although there are some challenges in large scale applications of these technologies, this review offers some perspectives that could help in overcoming them. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Examining the Impact of Culture and Human Elements on OLAP Tools Usefulness

    ERIC Educational Resources Information Center

    Sharoupim, Magdy S.

    2010-01-01

    The purpose of the present study was to examine the impact of culture and human-related elements on the On-line Analytical Processing (OLAP) usability in generating decision-making information. The use of OLAP technology has evolved rapidly and gained momentum, mainly due to the ability of OLAP tools to examine and query large amounts of data sets…

  5. Introducing Students to Feedstock Recycling of End-of-Life Silicones via a Low-Temperature, Iron-Catalyzed Depolymerization Process

    ERIC Educational Resources Information Center

    Do¨hlert, Peter; Weidauer, Maik; Peifer, Raphael; Kohl, Stephan; Enthaler, Stephan

    2015-01-01

    The straightforward large-scale synthesis and the ability to adjust the properties of polymers make polymers very attractive materials. Polymers have been used in numerous applications and an increased demand is foreseeable. However, a serious issue is the accumulation of enormous amounts of end-of-life polymers, which are currently recycled by…

  6. Dynamic Database. Efficiently Convert Massive Quantities of Sensor Data into Actionable Information for Tactical Commanders

    DTIC Science & Technology

    2000-06-01

    As the number of sensors, platforms, exploitation sites, and command and control nodes continues to grow in response to Joint Vision 2010 information dominance requirements, Commanders and analysts will have an ever increasing need to collect and process vast amounts of data over wide areas using a large number of disparate sensors and information gathering sources.

  7. A New Student Performance Analysing System Using Knowledge Discovery in Higher Educational Databases

    ERIC Educational Resources Information Center

    Guruler, Huseyin; Istanbullu, Ayhan; Karahasan, Mehmet

    2010-01-01

    Knowledge discovery is a wide ranged process including data mining, which is used to find out meaningful and useful patterns in large amounts of data. In order to explore the factors having impact on the success of university students, knowledge discovery software, called MUSKUP, has been developed and tested on student data. In this system a…

  8. Stopping of protons - Improved accuracy of the UCA model

    NASA Astrophysics Data System (ADS)

    Schiwietz, G.; Grande, P. L.

    2012-02-01

    Recent theoretical developments in the unitary convolution approximation (UCA) for electronic energy losses of bare and screened ions are presented. Examples are given for proton beams and rare-gas targets. For gas targets there exists a sufficient amount of experimental data on charge exchange, for pinpointing the largely unknown stopping-power contribution of electron-capture processes at low and intermediate energies.

  9. STREAMLINED LIFE CYCLE ASSESSMENT: A FINAL REPORT FROM THE SETAC-NORTH AMERICA STREAMLINED LCA WORKGROUP

    EPA Science Inventory

    The original goal of the Streamlined LCA workgroup was to define and document a process for a shortened form of LCA. At the time, because of the large amount of data needed to do a cradle-to-grave evaluation, it was believed that in addition to such a "full" LCA approach there w...

  10. Quantum cutting in nanoparticles producing two green photons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lorbeer, C; Mudring, Anja -V

    2014-01-01

    A synthetic route to nanoscale NaGdF4:Ln is presented which allows for quantum cutting based on the Gd-Er-Tb system. This shows that cross-relaxation and other energy transfer processes necessary for multiphoton emission can be achieved in nanoparticles, even though the large surface and the potentially huge number of killer traps would suggest a lack of subsequent emission.

  11. Emotional Intelligence: A Quantitative Study of the Relationship among Academic Success Factors and Emotional Intelligence

    ERIC Educational Resources Information Center

    Iannucci, Brian A.

    2013-01-01

    Researchers have found a correlation between emotional intelligence (EI) and success in the workplace. As a result, many companies have invested a large amount of resources into EI testing during their hiring process. In the United States, corporations are spending over $33 billion on hiring, training, and development. In addition to the increase…

  12. Ecological foundations for fire management in North American forest and shrubland ecosystems

    Treesearch

    J.E. Keeley; G.H. Aplet; N.L. Christensen; S.G. Conard; E.A. Johnson; P.N. Omi; D.L. Peterson; T.W. Swetnam

    2009-01-01

    This synthesis provides an ecological foundation for management of the diverse ecosystems and fire regimes of North America based on scientific principles of fire interactions with vegetation, fuels, and biophysical processes. Although a large amount of scientific data on fire exists, most of those data have been collected at small spatial and temporal scales. Thus, it...

  13. A Large-Scale Design Integration Approach Developed in Conjunction with the Ares Launch Vehicle Program

    NASA Technical Reports Server (NTRS)

    Redmon, John W.; Shirley, Michael C.; Kinard, Paul S.

    2012-01-01

    This paper presents a method for performing large-scale design integration, taking a classical 2D drawing envelope and interface approach and applying it to modern three dimensional computer aided design (3D CAD) systems. Today, the paradigm often used when performing design integration with 3D models involves a digital mockup of an overall vehicle, in the form of a massive, fully detailed, CAD assembly; therefore, adding unnecessary burden and overhead to design and product data management processes. While fully detailed data may yield a broad depth of design detail, pertinent integration features are often obscured under the excessive amounts of information, making them difficult to discern. In contrast, the envelope and interface method results in a reduction in both the amount and complexity of information necessary for design integration while yielding significant savings in time and effort when applied to today's complex design integration projects. This approach, combining classical and modern methods, proved advantageous during the complex design integration activities of the Ares I vehicle. Downstream processes, benefiting from this approach by reducing development and design cycle time, include: Creation of analysis models for the Aerodynamic discipline; Vehicle to ground interface development; Documentation development for the vehicle assembly.

  14. Parallel hyperspectral compressive sensing method on GPU

    NASA Astrophysics Data System (ADS)

    Bernabé, Sergio; Martín, Gabriel; Nascimento, José M. P.

    2015-10-01

    Remote hyperspectral sensors collect large amounts of data per flight, usually with low spatial resolution. It is known that the bandwidth of the connection between the satellite/airborne platform and the ground station is limited, so an onboard compression method is desirable to reduce the amount of data to be transmitted. This paper presents a parallel implementation of a compressive sensing method, called parallel hyperspectral coded aperture (P-HYCA), for graphics processing units (GPU) using the compute unified device architecture (CUDA). This method takes into account two main properties of hyperspectral datasets, namely the high correlation existing among the spectral bands and the generally low number of endmembers needed to explain the data, which largely reduces the number of measurements necessary to correctly reconstruct the original data. Experimental results conducted using synthetic and real hyperspectral datasets on two different GPU architectures by NVIDIA, GeForce GTX 590 and GeForce GTX TITAN, reveal that the use of GPUs can provide real-time compressive sensing performance. The achieved speedup is up to 20 times when compared with the processing time of HYCA running on one core of the Intel i7-2600 CPU (3.4 GHz) with 16 GB of memory.
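
    To make the compressive-sensing idea concrete, the sketch below applies random Gaussian projections along the spectral dimension of a synthetic hyperspectral cube, reducing the number of values to transmit. It only illustrates compressive measurement in plain NumPy; the coded-aperture P-HYCA method and its CUDA implementation are not reproduced here.

```python
# Illustrative compressive measurement of a hyperspectral cube by random
# spectral projections (NumPy on CPU). P-HYCA itself uses a coded aperture
# and CUDA kernels; this only shows why few measurements can suffice when
# the data are spectrally redundant.
import numpy as np

rng = np.random.default_rng(0)
bands, rows, cols = 200, 64, 64
cube = rng.random((bands, rows, cols))            # synthetic stand-in for real data

X = cube.reshape(bands, rows * cols)              # one spectrum per column
m = 40                                            # measurements per pixel (m << bands)
Phi = rng.normal(size=(m, bands)) / np.sqrt(m)    # random Gaussian measurement matrix

Y = Phi @ X                                       # compressed data to transmit
print("compression ratio:", X.size / Y.size)      # here 200/40 = 5x
```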

  15. Assignment of channels and polarisations in a broadcasting satellite service environment

    NASA Astrophysics Data System (ADS)

    Fortes, J. M. P.

    1986-07-01

    In the process of synthesizing a satellite communications plan, a large number of possible configurations has to be analyzed in a short amount of time. An important part of the process concerns the allocation of channels and polarizations to the various systems. It is, of course, desirable to make these allocations based on the aggregate carrier/interference ratios, but this needs a considerable amount of time, and for this reason the single-entry carrier/interference criterion is usually employed. The paper presents an integer programming model based on an approximate evaluation of the aggregate carrier/interference ratios, which is fast enough to justify its application in the synthesis process. It was developed to help the elaboration of a downlink plan for the broadcasting satellite service (BSS) of North, Central, and South America. The official software package of the 1983 Administrative Radio Conference (RARC 83), responsible for the planning of the BSS in region 2, contains a routine based on this model.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blennow, Mattias; Clementz, Stefan, E-mail: emb@kth.se, E-mail: scl@kth.se

    Current problems with the solar model may be alleviated if a significant amount of dark matter from the galactic halo is captured in the Sun. We discuss the capture process in the case where the dark matter is a Dirac fermion and the background halo consists of equal amounts of dark matter and anti-dark matter. By considering the case where dark matter and anti-dark matter have different cross sections on solar nuclei, as well as the case where the capture process is considered to be a Poisson process, we find that a significant asymmetry between the captured dark particles and anti-particles is possible even for an annihilation cross section in the range expected for thermal relic dark matter. Since the captured numbers of particles are competitive with asymmetric dark matter models in a large range of parameter space, one may expect solar physics to be altered by the capture of Dirac dark matter. It is thus possible that solutions to the solar composition problem may be searched for in these types of models.

  17. A high throughput MATLAB program for automated force-curve processing using the AdG polymer model.

    PubMed

    O'Connor, Samantha; Gaddis, Rebecca; Anderson, Evan; Camesano, Terri A; Burnham, Nancy A

    2015-02-01

    Research in understanding biofilm formation depends on accurate and representative measurements of the steric forces related to the polymer brush on bacterial surfaces. A MATLAB program to analyze force curves from an AFM efficiently, accurately, and with minimal user bias has been developed. The analysis is based on a modified version of the Alexander and de Gennes (AdG) polymer model, which is a function of equilibrium polymer brush length, probe radius, temperature, separation distance, and a density variable. Automating the analysis reduces the amount of time required to process 100 force curves from several days to less than 2 min. The use of this program to crop and fit force curves to the AdG model will allow researchers to ensure proper processing of large amounts of experimental data and reduce the time required for analysis and comparison of data, thereby enabling higher quality results in a shorter period of time. Copyright © 2014 Elsevier B.V. All rights reserved.
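
    Since the analysis hinges on fitting each cropped force curve to the AdG model, the sketch below shows what such a fit can look like in SciPy (Python rather than MATLAB). The exponential form used is a commonly cited approximation of the AdG steric-brush force, assumed here for illustration; the program described above uses a modified AdG model, and all numbers in the sketch are invented.

```python
# Illustrative fit of an AFM approach curve with a commonly cited exponential
# approximation of the AdG steric-brush force,
#   F(D) ~ 50 * kB*T*R*L0*Gamma**1.5 * exp(-2*pi*D / L0),
# refit here as F(D) = A * exp(-2*pi*D / L0) for numerical stability.
# The paper's modified AdG model and automated cropping are not reproduced,
# and every number below is made up for the example.
import numpy as np
from scipy.optimize import curve_fit

def adg_force(D_nm, A_nN, L0_nm):
    """Steric force (nN) versus separation D (nm)."""
    return A_nN * np.exp(-2.0 * np.pi * D_nm / L0_nm)

# Synthetic "measured" approach curve standing in for real AFM data.
rng = np.random.default_rng(1)
D = np.linspace(1.0, 60.0, 200)                       # separation in nm
F = adg_force(D, A_nN=3.0, L0_nm=25.0) + rng.normal(0.0, 0.02, D.size)

popt, pcov = curve_fit(adg_force, D, F, p0=(1.0, 20.0))
A_fit, L0_fit = popt
print(f"equilibrium brush length L0 ~ {L0_fit:.1f} nm, amplitude A ~ {A_fit:.2f} nN")
# The density variable follows from A = 50*kB*T*R*L0*Gamma**1.5 once the
# probe radius R and temperature T are known.
```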

  18. Effect of hydrothermal liquefaction aqueous phase recycling on bio-crude yields and composition.

    PubMed

    Biller, Patrick; Madsen, René B; Klemmer, Maika; Becker, Jacob; Iversen, Bo B; Glasius, Marianne

    2016-11-01

    Hydrothermal liquefaction (HTL) is a promising thermo-chemical processing technology for the production of biofuels but produces large amounts of process water. Therefore recirculation of process water from HTL of dried distillers grains with solubles (DDGS) is investigated. Two sets of recirculation on a continuous reactor system using K2CO3 as catalyst were carried out. Following this, the process water was recirculated in batch experiments for a total of 10 rounds. To assess the effect of alkali catalyst, non-catalytic HTL process water recycling was performed with 9 recycle rounds. Both sets of experiments showed a large increase in bio-crude yields from approximately 35 to 55wt%. The water phase and bio-crude samples from all experiments were analysed via quantitative gas chromatography-mass spectrometry (GC-MS) to investigate their composition and build-up of organic compounds. Overall the results show an increase in HTL conversion efficiency and a lower volume, more concentrated aqueous by-product following recycling. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Advanced oxidation process using hydrogen peroxide/microwave system for solubilization of phosphate.

    PubMed

    Liao, Ping Huang; Wong, Wayne T; Lo, Kwang Victor

    2005-01-01

    An advanced oxidation process (AOP) combining hydrogen peroxide and microwave heating was used for the solubilization of phosphate from secondary municipal sludge from an enhanced biological phosphorus removal process. The microwave irradiation is used as a generator of oxidizing radicals as well as a heating source in the process. This AOP could facilitate the release of a large amount of the sludge-bound phosphorus from the sewage sludge. More than 84% of the total phosphorus could be released at a microwave heating time of 5 min at 170 degrees C. This innovative process has the potential of being applied to simple sludge treatment processes in domestic wastewater treatment and to the recovery of phosphorus from the wastewater.

  20. Room-temperature solution-processed and metal oxide-free nano-composite for the flexible transparent bottom electrode of perovskite solar cells

    NASA Astrophysics Data System (ADS)

    Lu, Haifei; Sun, Jingsong; Zhang, Hong; Lu, Shunmian; Choy, Wallace C. H.

    2016-03-01

    The exploration of low-temperature and solution-processed charge transporting and collecting layers can promote the development of low-cost and large-scale perovskite solar cells (PVSCs) through an all solution process. Here, we propose a room-temperature solution-processed and metal oxide-free nano-composite composed of a silver nano-network and graphene oxide (GO) flawless film for the transparent bottom electrode of a PVSC. Our experimental results show that the amount of GO flakes play a critical role in forming the flawless anti-corrosive barrier in the silver nano-network through a self-assembly approach under ambient atmosphere, which can effectively prevent the penetration of liquid or gaseous halides and their corrosion against the silver nano-network underneath. Importantly, we simultaneously achieve good work function alignment and surface wetting properties for a practical bottom electrode by controlling the degree of reduction of GO flakes. Finally, flexible PVSC adopting the room-temperature and solution-processed nano-composite as the flexible transparent bottom electrode has been demonstrated on a polyethylene terephthalate (PET) substrate. As a consequence, the demonstration of our room-temperature solution-processed and metal oxide-free flexible transparent bottom electrode will contribute to the emerging large-area flexible PVSC technologies.

  1. Automated manufacturing process for DEAP stack-actuators

    NASA Astrophysics Data System (ADS)

    Tepel, Dominik; Hoffstadt, Thorben; Maas, Jürgen

    2014-03-01

    Dielectric elastomers (DE) are thin polymer films belonging to the class of electroactive polymers (EAP) that are coated with compliant, conductive electrodes on each side. Under the influence of an electric field, dielectric elastomers undergo large deformations. In this contribution, an automated manufacturing process for stack-actuators based on dielectric electroactive polymers (DEAP) is presented. First, the specific design of the considered stack-actuator is explained; afterwards, the development, construction and realization of an automated manufacturing process are presented in detail. By applying this automated process, stack-actuators with reproducible and homogeneous properties can be manufactured. Finally, the first DEAP actuator modules fabricated by this process are validated experimentally.

  2. Hot working behavior of selective laser melted and laser metal deposited Inconel 718

    NASA Astrophysics Data System (ADS)

    Bambach, Markus; Sizova, Irina

    2018-05-01

    The production of nickel-based high-temperature components is of great importance for the transport and energy sectors. Forging of high-temperature alloys often requires expensive dies and multiple forming steps, and it yields forged parts whose tolerances demand machining to the final shape, generating a large amount of scrap. Additive manufacturing offers the possibility to print the desired shapes directly as net-shape components, requiring only little additional effort in machining. Especially for high-temperature alloys, which carry a large amount of energy per unit mass, additive manufacturing could be more energy-efficient than forging if the energy contained in the machining scrap exceeds the energy needed for powder production and laser processing. However, the microstructure and performance of 3D-printed parts will not reach the level of forged material unless further expensive processes such as hot isostatic pressing are used. Using the design freedom and the possibilities to locally engineer material, additive manufacturing could be combined with forging operations into novel process chains, offering the possibility to reduce the number of forging steps and to create near-net-shape forgings with desired local properties. Some innovative process chains combining additive manufacturing and forging have been patented recently, but almost no scientific knowledge on the workability of 3D-printed preforms exists. The present study investigates the flow stress and microstructure evolution during hot working of preforms produced by laser powder deposition and selective laser melting, and puts forward a model for the flow stress.

  3. The Impact of Attention on Judgments of Frequency and Duration

    PubMed Central

    Winkler, Isabell; Glauer, Madlen; Betsch, Tilmann; Sedlmeier, Peter

    2015-01-01

    Previous studies that examined human judgments of frequency and duration found an asymmetrical relationship: While frequency judgments were quite accurate and independent of stimulus duration, duration judgments were highly dependent upon stimulus frequency. A potential explanation for these findings is that the asymmetry is moderated by the amount of attention directed to the stimuli. In the current experiment, participants' attention was manipulated in two ways: (a) intrinsically, by varying the type and arousal potential of the stimuli (names, low-arousal and high-arousal pictures), and (b) extrinsically, by varying the physical effort participants expended during the stimulus presentation (by lifting a dumbbell vs. relaxing the arm). Participants processed stimuli with varying presentation frequencies and durations and were subsequently asked to estimate the frequency and duration of each stimulus. Sensitivity to duration increased for pictures in general, especially when processed under physical effort. A large effect of stimulus frequency on duration judgments was obtained for all experimental conditions, but a similar large effect of presentation duration on frequency judgments emerged only in the conditions that could be expected to draw high amounts of attention to the stimuli: when pictures were judged under high physical effort. Almost no difference in the mutual impact of frequency and duration was obtained for low-arousal or high-arousal pictures. The mechanisms underlying the simultaneous processing of frequency and duration are discussed with respect to existing models derived from animal research. Options for the extension of such models to human processing of frequency and duration are suggested. PMID:26000712

  4. The impact of attention on judgments of frequency and duration.

    PubMed

    Winkler, Isabell; Glauer, Madlen; Betsch, Tilmann; Sedlmeier, Peter

    2015-01-01

    Previous studies that examined human judgments of frequency and duration found an asymmetrical relationship: While frequency judgments were quite accurate and independent of stimulus duration, duration judgments were highly dependent upon stimulus frequency. A potential explanation for these findings is that the asymmetry is moderated by the amount of attention directed to the stimuli. In the current experiment, participants' attention was manipulated in two ways: (a) intrinsically, by varying the type and arousal potential of the stimuli (names, low-arousal and high-arousal pictures), and (b) extrinsically, by varying the physical effort participants expended during the stimulus presentation (by lifting a dumbbell vs. relaxing the arm). Participants processed stimuli with varying presentation frequencies and durations and were subsequently asked to estimate the frequency and duration of each stimulus. Sensitivity to duration increased for pictures in general, especially when processed under physical effort. A large effect of stimulus frequency on duration judgments was obtained for all experimental conditions, but a similar large effect of presentation duration on frequency judgments emerged only in the conditions that could be expected to draw high amounts of attention to the stimuli: when pictures were judged under high physical effort. Almost no difference in the mutual impact of frequency and duration was obtained for low-arousal or high-arousal pictures. The mechanisms underlying the simultaneous processing of frequency and duration are discussed with respect to existing models derived from animal research. Options for the extension of such models to human processing of frequency and duration are suggested.

  5. Elastin in large artery stiffness and hypertension

    PubMed Central

    Wagenseil, Jessica E.; Mecham, Robert P.

    2012-01-01

    Large artery stiffness, as measured by pulse wave velocity (PWV), is correlated with high blood pressure and may be a causative factor in essential hypertension. The extracellular matrix components, specifically the mix of elastin and collagen in the vessel wall, determine the passive mechanical properties of the large arteries. Elastin is organized into elastic fibers in the wall during arterial development in a complex process that requires spatial and temporal coordination of numerous proteins. The elastic fibers last the lifetime of the organism, but are subject to proteolytic degradation and chemical alterations that change their mechanical properties. This review discusses how alterations in the amount, assembly, organization or chemical properties of the elastic fibers affect arterial stiffness and blood pressure. Strategies for encouraging or reversing alterations to the elastic fibers are addressed. Methods for determining the efficacy of these strategies, by measuring elastin amounts and arterial stiffness, are summarized. Therapies that have a direct effect on arterial stiffness through alterations to the elastic fibers in the wall may be an effective treatment for essential hypertension. PMID:22290157

  6. Quality and loudness judgments for music subjected to compression limiting.

    PubMed

    Croghan, Naomi B H; Arehart, Kathryn H; Kates, James M

    2012-08-01

    Dynamic-range compression (DRC) is used in the music industry to maximize loudness. The amount of compression applied to commercial recordings has increased over time due to a motivating perspective that louder music is always preferred. In contrast to this viewpoint, artists and consumers have argued that using large amounts of DRC negatively affects the quality of music. However, little research evidence has supported the claims of either position. The present study investigated how DRC affects the perceived loudness and sound quality of recorded music. Rock and classical music samples were peak-normalized and then processed using different amounts of DRC. Normal-hearing listeners rated the processed and unprocessed samples on overall loudness, dynamic range, pleasantness, and preference, using a scaled paired-comparison procedure in two conditions: un-equalized, in which the loudness of the music samples varied, and loudness-equalized, in which loudness differences were minimized. Results indicated that a small amount of compression was preferred in the un-equalized condition, but the highest levels of compression were generally detrimental to quality, whether loudness was equalized or varied. These findings are contrary to the "louder is better" mentality in the music industry and suggest that more conservative use of DRC may be preferred for commercial music.

  7. Nature and Properties of Lateritic Soils Derived from Different Parent Materials in Taiwan

    PubMed Central

    2014-01-01

    The objective of this study was to investigate the physical, chemical, and mineralogical composition of lateritic soils in order to use these soils as potential commercial products for industrial application in the future. Five lateritic soils derived from various parent materials in Taiwan, including andesite, diluvium, shale stone, basalt, and Pleistocene deposit, were collected from the Bt1 level of soil samples. Based on the analyses, the Tungwei soil is an alfisol, whereas other lateritic soils are ultisol. Higher pH value of Tungwei is attributed to the large amounts of Ca2+ and Mg2+. Loupi and Pingchen soils would be the older lateritic soils because of the lower active iron ratio. For the iron minerals, the magnetic iron oxides such as major amounts of magnetite and maghemite were found for Tamshui and Tungwei lateritic soils, respectively. Lepidocrocite was only found in Soka soil and intermediate amounts of goethite were detected for Loupi and Pingchen soils. After Mg-saturated and K-saturated processes, major amounts of mixed layer were observed in Loupi and Soka soils, whereas the montmorillonite was only detected in Tungwei soil. The investigation results revealed that the parent materials would play an important role during soil weathering process and physical, chemical, and mineralogy compositions strongly affect the formation of lateritic soils. PMID:24883366

  8. Nature and properties of lateritic soils derived from different parent materials in Taiwan.

    PubMed

    Ko, Tzu-Hsing

    2014-01-01

    The objective of this study was to investigate the physical, chemical, and mineralogical composition of lateritic soils in order to use these soils as potential commercial products for industrial application in the future. Five lateritic soils derived from various parent materials in Taiwan, including andesite, diluvium, shale stone, basalt, and Pleistocene deposit, were collected from the Bt1 level of soil samples. Based on the analyses, the Tungwei soil is an alfisol, whereas other lateritic soils are ultisol. Higher pH value of Tungwei is attributed to the large amounts of Ca(2+) and Mg(2+). Loupi and Pingchen soils would be the older lateritic soils because of the lower active iron ratio. For the iron minerals, the magnetic iron oxides such as major amounts of magnetite and maghemite were found for Tamshui and Tungwei lateritic soils, respectively. Lepidocrocite was only found in Soka soil and intermediate amounts of goethite were detected for Loupi and Pingchen soils. After Mg-saturated and K-saturated processes, major amounts of mixed layer were observed in Loupi and Soka soils, whereas the montmorillonite was only detected in Tungwei soil. The investigation results revealed that the parent materials would play an important role during soil weathering process and physical, chemical, and mineralogy compositions strongly affect the formation of lateritic soils.

  9. Development of an interactive data base management system for capturing large volumes of data.

    PubMed

    Moritz, T E; Ellis, N K; VillaNueva, C B; Steeger, J E; Ludwig, S T; Deegan, N I; Shroyer, A L; Henderson, W G; Sethi, G K; Grover, F L

    1995-10-01

    Accurate collection and successful management of data are problems common to all scientific studies. For studies in which large quantities of data are collected by means of questionnaires and/or forms, data base management becomes quite laborious and time consuming. Data base management comprises data collection, data entry, data editing, and data base maintenance. In this article, the authors describe the development of an interactive data base management (IDM) system for the collection of more than 1,400 variables from a targeted population of 6,000 patients undergoing heart surgery requiring cardiopulmonary bypass. The goals of the IDM system are to increase the accuracy and efficiency with which this large amount of data is collected and processed, to reduce research nurse work load through automation of certain administrative and clerical activities, and to improve the process for implementing a uniform study protocol, standardized forms, and definitions across sites.

  10. An evaluation of multi-probe locality sensitive hashing for computing similarities over web-scale query logs

    PubMed Central

    2018-01-01

    Many modern applications of AI such as web search, mobile browsing, image processing, and natural language processing rely on finding similar items from a large database of complex objects. Due to the very large scale of data involved (e.g., users' queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (a.k.a. LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance and are suitable for deployment in very large scale settings. The experimental results demonstrate that our variants of LSH achieve robust performance with better recall compared with "vanilla" LSH, even when using the same amount of space. PMID:29346410
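
    A basic signed-random-projection (SimHash-style) LSH index is sketched below to make the idea concrete; the multi-probe variants and Hadoop deployment evaluated in the paper are not reproduced, and the vectors are random stand-ins for real query embeddings.

```python
# Minimal signed-random-projection LSH for cosine similarity (illustrative;
# the paper evaluates multi-probe LSH variants on Hadoop, not reproduced here).
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(0)

def build_index(vectors, n_bits=16):
    """Hash each vector to a bucket keyed by the signs of random projections."""
    planes = rng.normal(size=(n_bits, vectors.shape[1]))
    keys = (vectors @ planes.T > 0).astype(np.uint8)
    buckets = defaultdict(list)
    for i, key in enumerate(map(bytes, keys)):
        buckets[key].append(i)
    return planes, buckets

def query(v, planes, buckets):
    """Return candidate neighbours sharing the query's bucket."""
    key = bytes((planes @ v > 0).astype(np.uint8))
    return buckets.get(key, [])

# Toy query-log embeddings (random stand-ins for real query vectors).
vectors = rng.normal(size=(10_000, 64))
planes, buckets = build_index(vectors)
print(query(vectors[42], planes, buckets))   # candidates include item 42 itself
```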

  11. Destruction of Navy Hazardous Wastes by Supercritical Water Oxidation

    DTIC Science & Technology

    1994-08-01

    cleaning and derusting (nitrite and citric acid solutions), electroplating (acids and metal bearing solutions), electronics and refrigeration... Waste streams that contain a large amount of mineral-acid forming chemical species or that contain a large amount of dissolved solids present a challenge to current SCWO technology. Approved for public release.

  12. Empirical relationships between tree fall and landscape-level amounts of logging and fire

    PubMed Central

    Blanchard, Wade; Blair, David; McBurney, Lachlan; Stein, John; Banks, Sam C.

    2018-01-01

    Large old trees are critically important keystone structures in forest ecosystems globally. Populations of these trees are also in rapid decline in many forest ecosystems, making it important to quantify the factors that influence their dynamics at different spatial scales. Large old trees often occur in forest landscapes also subject to fire and logging. However, the effects on the risk of collapse of large old trees of the amount of logging and fire in the surrounding landscape are not well understood. Using an 18-year study in the Mountain Ash (Eucalyptus regnans) forests of the Central Highlands of Victoria, we quantify relationships between the probability of collapse of large old hollow-bearing trees at a site and the amount of logging and the amount of fire in the surrounding landscape. We found the probability of collapse increased with an increasing amount of logged forest in the surrounding landscape. It also increased with a greater amount of burned area in the surrounding landscape, particularly for trees in highly advanced stages of decay. The most likely explanation for elevated tree fall with an increasing amount of logged or burned areas in the surrounding landscape is change in wind movement patterns associated with cutblocks or burned areas. Previous studies show that large old hollow-bearing trees are already at high risk of collapse in our study area. New analyses presented here indicate that additional logging operations in the surrounding landscape will further elevate that risk. Current logging prescriptions require the protection of large old hollow-bearing trees on cutblocks. We suggest that efforts to reduce the probability of collapse of large old hollow-bearing trees on unlogged sites will demand careful landscape planning to limit the amount of timber harvesting in the surrounding landscape. PMID:29474487

  13. Empirical relationships between tree fall and landscape-level amounts of logging and fire.

    PubMed

    Lindenmayer, David B; Blanchard, Wade; Blair, David; McBurney, Lachlan; Stein, John; Banks, Sam C

    2018-01-01

    Large old trees are critically important keystone structures in forest ecosystems globally. Populations of these trees are also in rapid decline in many forest ecosystems, making it important to quantify the factors that influence their dynamics at different spatial scales. Large old trees often occur in forest landscapes also subject to fire and logging. However, the effects on the risk of collapse of large old trees of the amount of logging and fire in the surrounding landscape are not well understood. Using an 18-year study in the Mountain Ash (Eucalyptus regnans) forests of the Central Highlands of Victoria, we quantify relationships between the probability of collapse of large old hollow-bearing trees at a site and the amount of logging and the amount of fire in the surrounding landscape. We found the probability of collapse increased with an increasing amount of logged forest in the surrounding landscape. It also increased with a greater amount of burned area in the surrounding landscape, particularly for trees in highly advanced stages of decay. The most likely explanation for elevated tree fall with an increasing amount of logged or burned areas in the surrounding landscape is change in wind movement patterns associated with cutblocks or burned areas. Previous studies show that large old hollow-bearing trees are already at high risk of collapse in our study area. New analyses presented here indicate that additional logging operations in the surrounding landscape will further elevate that risk. Current logging prescriptions require the protection of large old hollow-bearing trees on cutblocks. We suggest that efforts to reduce the probability of collapse of large old hollow-bearing trees on unlogged sites will demand careful landscape planning to limit the amount of timber harvesting in the surrounding landscape.

  14. Image processing for improved eye-tracking accuracy

    NASA Technical Reports Server (NTRS)

    Mulligan, J. B.; Watson, A. B. (Principal Investigator)

    1997-01-01

    Video cameras provide a simple, noninvasive method for monitoring a subject's eye movements. An important concept is that of the resolution of the system, which is the smallest eye movement that can be reliably detected. While hardware systems are available that estimate direction of gaze in real-time from a video image of the pupil, such systems must limit image processing to attain real-time performance and are limited to a resolution of about 10 arc minutes. Two ways to improve resolution are discussed. The first is to improve the image processing algorithms that are used to derive an estimate. Off-line analysis of the data can improve resolution by at least one order of magnitude for images of the pupil. A second avenue by which to improve resolution is to increase the optical gain of the imaging setup (i.e., the amount of image motion produced by a given eye rotation). Ophthalmoscopic imaging of retinal blood vessels provides increased optical gain and improved immunity to small head movements but requires a highly sensitive camera. The large number of images involved in a typical experiment imposes great demands on the storage, handling, and processing of data. A major bottleneck had been the real-time digitization and storage of large amounts of video imagery, but recent developments in video compression hardware have made this problem tractable at a reasonable cost. Images of both the retina and the pupil can be analyzed successfully using a basic toolbox of image-processing routines (filtering, correlation, thresholding, etc.), which are, for the most part, well suited to implementation on vectorizing supercomputers.
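    As a hedged example of one routine from the kind of basic image-processing toolbox mentioned above (not the authors' code), the Python sketch below estimates the frame-to-frame displacement of a pupil or retinal image by FFT-based cross-correlation; array shapes and function names are assumptions.

        import numpy as np

        def estimate_shift(ref, frame):
            # Integer-pixel (dy, dx) displacement of `frame` relative to `ref`,
            # read off the peak of the circular cross-correlation surface.
            ref = ref - ref.mean()
            frame = frame - frame.mean()
            corr = np.fft.ifft2(np.fft.fft2(frame) * np.conj(np.fft.fft2(ref))).real
            dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
            if dy > ref.shape[0] // 2:   # wrap large positive shifts to negative ones
                dy -= ref.shape[0]
            if dx > ref.shape[1] // 2:
                dx -= ref.shape[1]
            return dy, dx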

  15. Enabling Graph Appliance for Genome Assembly

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Rina; Graves, Jeffrey A; Lee, Sangkeun

    2015-01-01

    In recent years, there has been a huge growth in the amount of genomic data available as reads generated from various genome sequencers. The number of reads generated can be huge, ranging from hundreds to billions of nucleotides, each varying in size. Assembling such large amounts of data is one of the challenging computational problems for both biomedical and data scientists. Most of the genome assemblers developed have used de Bruijn graph techniques. A de Bruijn graph represents a collection of read sequences by billions of vertices and edges, which require large amounts of memory and computational power to store and process. This is the major drawback to de Bruijn graph assembly. Massively parallel, multi-threaded, shared memory systems can be leveraged to overcome some of these issues. The objective of our research is to investigate the feasibility and scalability issues of de Bruijn graph assembly on Cray's Urika-GD system; Urika-GD is a high performance graph appliance with a large shared memory and massively multithreaded custom processor designed for executing SPARQL queries over large-scale RDF data sets. However, to the best of our knowledge, there is no research on representing a de Bruijn graph as an RDF graph or finding Eulerian paths in RDF graphs using SPARQL for potential genome discovery. In this paper, we address the issues involved in representing de Bruijn graphs as RDF graphs and propose an iterative querying approach for finding Eulerian paths in large RDF graphs. We evaluate the performance of our implementation on real world Ebola genome datasets and illustrate how genome assembly can be accomplished with Urika-GD using iterative SPARQL queries.
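    For orientation only, here is a minimal Python sketch of the two ingredients named above, building a de Bruijn graph from reads and walking an Eulerian path, using plain in-memory data structures rather than the RDF/SPARQL encoding the paper proposes; the reads and k value are invented.

        from collections import defaultdict

        def de_bruijn(reads, k):
            # Each unique k-mer contributes an edge from its (k-1)-mer prefix to its suffix.
            graph = defaultdict(list)
            kmers = {read[i:i + k] for read in reads for i in range(len(read) - k + 1)}
            for kmer in sorted(kmers):
                graph[kmer[:-1]].append(kmer[1:])
            return graph

        def eulerian_path(graph, start):
            # Hierholzer-style traversal; assumes an Eulerian path from `start` exists.
            graph = {node: list(nbrs) for node, nbrs in graph.items()}
            stack, path = [start], []
            while stack:
                node = stack[-1]
                if graph.get(node):
                    stack.append(graph[node].pop())
                else:
                    path.append(stack.pop())
            return path[::-1]

        g = de_bruijn(["ATGGCGT", "GCGTGCA"], k=4)
        nodes = eulerian_path(g, "ATG")
        assembly = nodes[0] + "".join(n[-1] for n in nodes[1:])  # "ATGGCGTGCA"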

  16. Pattern Recognition for a Flight Dynamics Monte Carlo Simulation

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; Hurtado, John E.

    2011-01-01

    The design, analysis, and verification and validation of a spacecraft relies heavily on Monte Carlo simulations. Modern computational techniques are able to generate large amounts of Monte Carlo data but flight dynamics engineers lack the time and resources to analyze it all. The growing amounts of data combined with the diminished available time of engineers motivates the need to automate the analysis process. Pattern recognition algorithms are an innovative way of analyzing flight dynamics data efficiently. They can search large data sets for specific patterns and highlight critical variables so analysts can focus their analysis efforts. This work combines a few tractable pattern recognition algorithms with basic flight dynamics concepts to build a practical analysis tool for Monte Carlo simulations. Current results show that this tool can quickly and automatically identify individual design parameters, and most importantly, specific combinations of parameters that should be avoided in order to prevent specific system failures. The current version uses a kernel density estimation algorithm and a sequential feature selection algorithm combined with a k-nearest neighbor classifier to find and rank important design parameters. This provides an increased level of confidence in the analysis and saves a significant amount of time.
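    A hedged sketch of the kind of pipeline described, sequential feature selection wrapped around a k-nearest-neighbour classifier over Monte Carlo dispersions labelled pass/fail, is shown below in Python with scikit-learn; the data and all names are synthetic placeholders, not the tool's actual interface.

        import numpy as np
        from sklearn.feature_selection import SequentialFeatureSelector
        from sklearn.neighbors import KNeighborsClassifier

        rng = np.random.default_rng(1)
        X = rng.standard_normal((500, 10))               # 10 dispersed design parameters
        y = (X[:, 2] + 0.5 * X[:, 7] > 1).astype(int)    # synthetic failure label

        knn = KNeighborsClassifier(n_neighbors=5)
        selector = SequentialFeatureSelector(knn, n_features_to_select=3, direction="forward")
        selector.fit(X, y)
        print("parameters flagged as important:", np.flatnonzero(selector.get_support()))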

  17. Evaluation of Iodine Bioavailability in Seaweed Using in Vitro Methods.

    PubMed

    Domínguez-González, M Raquel; Chiocchetti, Gabriela M; Herbello-Hermelo, Paloma; Vélez, Dinoraz; Devesa, Vicenta; Bermejo-Barrera, Pilar

    2017-09-27

    Due to the high levels of iodine present in seaweed, the ingestion of a large amount of this type of food can produce an excessive intake of iodine. However, after ingestion the food undergoes different chemical and physical processes that can modify the amount of iodine that reaches the systemic circulation (bioavailability). Studies on the bioavailability of iodine from food are scarce and indicate that the bioavailable amount is generally lower than the amount ingested. The in vitro bioavailability of iodine from different commercialized seaweeds has been estimated using several approaches (solubility, dialyzability, and transport and uptake by intestinal cells). The results indicate that iodine is available for absorption after gastrointestinal digestion (bioaccessibility: 49-82%), kombu being the seaweed with the highest bioaccessibility. The incorporation of dialysis and cell culture assays to estimate bioavailability modifies the estimate of the amount of iodine that may reach the systemic circulation (dialysis, 5-28%; cell culture, ≤3%). The paper discusses the advantages and drawbacks of these methodologies for assessing iodine bioavailability in seaweed.

  18. Study of residue type defect formation mechanism and the effect of advanced defect reduction (ADR) rinse process

    NASA Astrophysics Data System (ADS)

    Arima, Hiroshi; Yoshida, Yuichi; Yoshihara, Kosuke; Shibata, Tsuyoshi; Kushida, Yuki; Nakagawa, Hiroki; Nishimura, Yukio; Yamaguchi, Yoshikazu

    2009-03-01

    Residue-type defects are one of the yield detractors in the lithography process. It is known that the occurrence of residue-type defects depends on the resist development process and that the defects are reduced by optimized rinsing conditions. However, defect formation is also affected by resist materials and substrate conditions, so the development process conditions must be optimized for each mask level. Those optimization steps require a large amount of time and effort. Here the formation mechanism is investigated from the viewpoint of both material and process. Defect formation is affected by resist material type, substrate condition, and development process condition (the D.I.W. rinse step). An optimized resist formulation and a new rinse technology significantly reduce residue-type defects.

  19. A simple biosynthetic pathway for large product generation from small substrate amounts

    NASA Astrophysics Data System (ADS)

    Djordjevic, Marko; Djordjevic, Magdalena

    2012-10-01

    A recently emerging discipline of synthetic biology has the aim of constructing new biosynthetic pathways with useful biological functions. A major application of these pathways is generating a large amount of the desired product. However, toxicity due to the possible presence of toxic precursors is one of the main problems for such production. We consider here the problem of generating a large amount of product from a potentially toxic substrate. To address this, we propose a simple biosynthetic pathway, which can be induced in order to produce a large number of product molecules while keeping the substrate amount at low levels. Surprisingly, we show that large product generation crucially depends on fast non-specific degradation of the substrate molecules. We derive an optimal induction strategy, which allows as much as a three orders of magnitude increase in the product amount for biologically realistic parameter values. We point to a recently discovered bacterial immune system (CRISPR/Cas in E. coli) as a putative example of the pathway analysed here. We also argue that the scheme proposed here can be used not only as a stand-alone pathway, but also as a strategy to produce a large amount of the desired molecules with small perturbations of endogenous biosynthetic pathways.
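    To make the trade-off concrete, the Python sketch below integrates a generic two-equation surrogate (not the paper's actual model): substrate S is induced at rate alpha, converted to product P at rate k, and degraded non-specifically at rate d, so a large d keeps S low while P still accumulates. All rate values are illustrative.

        from scipy.integrate import solve_ivp

        alpha, k, d = 10.0, 0.5, 5.0        # induction, conversion, degradation rates

        def pathway(t, y):
            S, P = y
            dS = alpha - (k + d) * S        # substrate: induced, converted, degraded
            dP = k * S                      # product: made from substrate
            return [dS, dP]

        sol = solve_ivp(pathway, (0, 50), [0.0, 0.0])
        S_end, P_end = sol.y[:, -1]
        print(f"steady substrate ~ {S_end:.2f}, accumulated product ~ {P_end:.1f}")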

  20. Molecular Properties of Red Wine Compounds and Cardiometabolic Benefits

    PubMed Central

    Markoski, Melissa M.; Garavaglia, Juliano; Oliveira, Aline; Olivaes, Jessica; Marcadenti, Aline

    2016-01-01

    Wine has been used since the dawn of human civilization. Despite many health benefits, there is still much discussion about the real properties of its components and their actions on cells and molecular interactions. A large part of these issues concerns the fine line between the amount of alcohol that causes problems to organ systems and the amount that could be beneficial for health. However, even after the process of fermentation, wine conserves different organic compounds from grapes, such as polysaccharides, acids, and phenolic compounds, including flavonoids and nonflavonoids. These substances have known anti-inflammatory and antioxidant capacities, and are considered regulatory agents in cardiometabolic processes. In this study, the main chemical components present in wine, their interactions with molecules and biological mechanisms, and their interference with intra- and extracellular signaling are reviewed. Finally, the properties of wine that may benefit the cardiovascular system are also reviewed. PMID:27512338

  1. Overview of Sea-Ice Properties, Distribution and Temporal Variations, for Application to Ice-Atmosphere Chemical Processes.

    NASA Astrophysics Data System (ADS)

    Moritz, R. E.

    2005-12-01

    The properties, distribution and temporal variation of sea-ice are reviewed for application to problems of ice-atmosphere chemical processes. Typical vertical structure of sea-ice is presented for different ice types, including young ice, first-year ice and multi-year ice, emphasizing factors relevant to surface chemistry and gas exchange. Time average annual cycles of large scale variables are presented, including ice concentration, ice extent, ice thickness and ice age. Spatial and temporal variability of these large scale quantities is considered on time scales of 1-50 years, emphasizing recent and projected changes in the Arctic pack ice. The amount and time evolution of open water and thin ice are important factors that influence ocean-ice-atmosphere chemical processes. Observations and modeling of the sea-ice thickness distribution function are presented to characterize the range of variability in open water and thin ice.

  2. Cultured 3T3L1 adipocytes dispose of excess medium glucose as lactate under abundant oxygen availability

    NASA Astrophysics Data System (ADS)

    Sabater, David; Arriarán, Sofía; Romero, María Del Mar; Agnelli, Silvia; Remesar, Xavier; Fernández-López, José Antonio; Alemany, Marià

    2014-01-01

    White adipose tissue (WAT) produces lactate in significant amounts from circulating glucose, especially in obesity. Under normoxia, 3T3L1 cells secrete large quantities of lactate into the medium, again at the expense of glucose and in proportion to its levels. Most of the glucose was converted to lactate, with only part of it being used to synthesize fat. Cultured adipocytes were largely anaerobic, but this was not a Warburg-like process. It is speculated that the massive production of lactate is a defense process of the adipocyte, used to dispose of excess glucose. In this way, the adipocyte exports glucose carbon to the liver (and reduces the problem of excess substrate availability), but the process may also be a mechanism of short-term control of hyperglycemia. The in vivo data obtained from adipose tissue of male rats agree with this interpretation.

  3. Alternatives to Antibiotics in Semen Extenders: A Review

    PubMed Central

    Morrell, Jane M.; Wallgren, Margareta

    2014-01-01

    Antibiotics are added to semen extenders to be used for artificial insemination (AI) in livestock breeding to control bacterial contamination in semen arising during collection and processing. The antibiotics to be added and their concentrations for semen for international trade are specified by government directives. Since the animal production industry uses large quantities of semen for artificial insemination, large amounts of antibiotics are currently used in semen extenders. Possible alternatives to antibiotics are discussed, including physical removal of the bacteria during semen processing, as well as the development of novel antimicrobials. Colloid centrifugation, particularly Single Layer Centrifugation, when carried out with a strict aseptic technique, offers a feasible method for reducing bacterial contamination in semen and is a practical method for semen processing laboratories to adopt. However, none of these alternatives to antibiotics should replace strict attention to hygiene during semen collection and handling. PMID:25517429

  4. Modeling of fugitive dust emission for construction sand and gravel processing plant.

    PubMed

    Lee, C H; Tang, L W; Chang, C T

    2001-05-15

    Due to rapid economic development in Taiwan, a large quantity of construction sand and gravel is needed to support domestic civil construction projects. However, a construction sand and gravel processing plant is often a major source of air pollution, due to its associated fugitive dust emission. To predict the amount of fugitive dust emitted from this kind of processing plant, a semiempirical model was developed in this study. This model was developed on the basis of the actual dust emission data (i.e., total suspended particulate, TSP) and four on-site operating parameters (i.e., wind speed (u), soil moisture (M), soil silt content (s), and number (N) of trucks) measured at a construction sand and gravel processing plant. On the basis of the on-site measured data and an SAS nonlinear regression program, the expression of this model is E = 0.011 u^2.653 M^-1.875 s^0.060 N^0.896, where E is the amount (kg/ton) of dust emitted during the production of each ton of gravel and sand. This model can serve as a facile tool for predicting the fugitive dust emission from a construction sand and gravel processing plant.
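    The fitted expression transcribes directly into code. The Python sketch below evaluates it for made-up operating values of wind speed (u), soil moisture (M), soil silt content (s), and truck count (N), returning E in kg of dust per ton of sand and gravel produced.

        def fugitive_dust_emission(u, M, s, N):
            # Semiempirical model quoted in the abstract above.
            return 0.011 * u**2.653 * M**-1.875 * s**0.060 * N**0.896

        print(fugitive_dust_emission(u=3.0, M=2.0, s=10.0, N=5))  # example values only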

  5. Conversion of rice straw to bio-based chemicals: an integrated process using Lactobacillus brevis.

    PubMed

    Kim, Jae-Han; Block, David E; Shoemaker, Sharon P; Mills, David A

    2010-05-01

    Commercialization of lignocellulosic biomass as a feedstock for bio-based chemical production is problematic due to the high processing costs of pretreatment and saccharifying enzymes combined with low product yields. Such low product yield can be attributed, in large part, to the incomplete utilization of the various carbohydrate sugars found in the lignocellulosic biomass. In this study, we demonstrate that Lactobacillus brevis is able to simultaneously metabolize all fermentable carbohydrates in acid pre-processed rice straw hydrolysate, thereby allowing complete utilization of all released sugars. Inhibitors present in rice straw hydrolysate did not affect lactic acid production. Moreover, the activity of exogenously added cellulases was not reduced in the presence of growing cultures of L. brevis. These factors enabled the use of L. brevis in a process termed simultaneous saccharification and mixed sugar fermentation (SSMSF). In SSMSF with L. brevis, sugars present in rice straw hydrolysate were completely utilized while the cellulase maintained its maximum activity due to the lack of feedback inhibition from glucose and/or cellobiose. By comparison to a sequential hydrolysis and fermentation process, SSMSF reduced operation time and the amount of cellulase enzyme necessary to produce the same amount of lactic acid.

  6. A multilinear regression methodology to analyze the effect of atmospheric and surface forcing on Arctic clouds

    NASA Astrophysics Data System (ADS)

    Boeke, R.; Taylor, P. C.; Li, Y.

    2017-12-01

    Arctic cloud amount as simulated in CMIP5 models displays large intermodel spread: models disagree on the processes important for cloud formation as well as on the radiative impact of clouds. The radiative response to cloud forcing can be better assessed when the drivers of Arctic cloud formation are known. Arctic cloud amount (CA) is a function of both atmospheric and surface conditions, and it is crucial to separate the influences of distinct processes to understand why the models differ. This study uses a multilinear regression methodology to determine cloud changes using three variables as predictors: lower tropospheric stability (LTS), 500-hPa vertical velocity (ω500), and sea ice concentration (SIC). These three explanatory variables were chosen because their effects on clouds can be attributed to distinct climate processes: LTS is a thermodynamic indicator of the relationship between clouds and atmospheric stability, SIC determines the interaction between clouds and the surface, and ω500 is a metric of dynamical change. Vertical, seasonal profiles of the necessary variables are obtained from the Coupled Model Intercomparison Project 5 (CMIP5) historical simulation, an ocean-atmosphere coupled model experiment forced with the best-estimate natural and anthropogenic radiative forcing from 1850 to 2005, and statistical significance tests are used to confirm the regression equation. A unique heuristic model will be constructed for each climate model and for observations, and models will be tested by their ability to capture the observed cloud amount and behavior. Lastly, the intermodel spread in Arctic cloud amount will be attributed to individual processes, ranking the relative contributions of each factor to shed light on emergent constraints in the Arctic cloud radiative effect.
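    A minimal Python sketch of the regression setup described, cloud amount regressed on LTS, ω500, and SIC, is given below with synthetic placeholder data; it is not the study's analysis code.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 200
        LTS, w500, SIC = rng.standard_normal((3, n))
        CA = 0.4 * LTS - 0.2 * w500 + 0.3 * SIC + 0.1 * rng.standard_normal(n)

        X = np.column_stack([np.ones(n), LTS, w500, SIC])          # intercept + predictors
        coeffs, *_ = np.linalg.lstsq(X, CA, rcond=None)
        b0, b_lts, b_w500, b_sic = coeffs
        print(f"CA ~ {b0:.2f} + {b_lts:.2f}*LTS + {b_w500:.2f}*w500 + {b_sic:.2f}*SIC")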

  7. The big data processing platform for intelligent agriculture

    NASA Astrophysics Data System (ADS)

    Huang, Jintao; Zhang, Lichen

    2017-08-01

    Big data technology is another popular technology after the Internet of Things and cloud computing. Big data is widely used in many fields, such as social platforms, e-commerce, and financial analysis. Intelligent agriculture produces large amounts of data with complex structure in the course of its operation, and fully mining the value of these data will be very meaningful for the development of agriculture. This paper proposes an intelligent data processing platform based on Storm and Cassandra to realize the storage and management of big data for intelligent agriculture.

  8. Cellulase producing microorganism ATCC 55702

    DOEpatents

    Dees, H. Craig

    1997-01-01

    Bacteria which produce large amounts of cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase producing bacterium (ATCC 55702), which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic waste materials for fuel production, food processing, textile processing, and other industrial applications. ATCC 55702 is an improved bacterial host for genetic manipulations using recombinant DNA techniques, and is less likely to destroy genetic manipulations using standard mutagenesis techniques.

  9. Cellulase-containing cell-free fermentate produced from microorganism ATCC 55702

    DOEpatents

    Dees, H. Craig

    1997-12-16

    Bacteria which produce large amounts of cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase producing bacterium (ATCC 55702), which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic waste materials for fuel production, food processing, textile processing, and other industrial applications. ATCC 55702 is an improved bacterial host for genetic manipulations using recombinant DNA techniques, and is less likely to destroy genetic manipulations using standard mutagenesis techniques.

  10. Regenerative adsorbent heat pump

    NASA Technical Reports Server (NTRS)

    Jones, Jack A. (Inventor)

    1991-01-01

    A regenerative adsorbent heat pump process and system is provided which can regenerate a high percentage of the sensible heat of the system and at least a portion of the heat of adsorption. A series of at least four compressors containing an adsorbent is provided. A large amount of heat is transferred from compressor to compressor so that heat is regenerated. The process and system are useful for air conditioning rooms, providing room heat in the winter or for hot water heating throughout the year, and, in general, for pumping heat from a lower temperature to a higher temperature.

  11. Cellulase producing microorganism ATCC 55702

    DOEpatents

    Dees, H.C.

    1997-12-30

    Bacteria which produce large amounts of cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase producing bacterium (ATCC 55702), which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic waste materials for fuel production, food processing, textile processing, and other industrial applications. ATCC 55702 is an improved bacterial host for genetic manipulations using recombinant DNA techniques, and is less likely to destroy genetic manipulations using standard mutagenesis techniques. 5 figs.

  12. Application of symbolic/numeric matrix solution techniques to the NASTRAN program

    NASA Technical Reports Server (NTRS)

    Buturla, E. M.; Burroughs, S. H.

    1977-01-01

    The matrix solving algorithm of any finite element program is extremely important, since solution of the matrix equations requires a large amount of elapsed time due to null calculations and excessive input/output operations. An alternate method of solving the matrix equations is presented. A symbolic processing step followed by numeric solution yields the solution very rapidly and is especially useful for nonlinear problems.
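    The same idea survives in modern sparse solvers: analyse and factor the system matrix once, then reuse the factorization for repeated numeric solves. The Python/SciPy sketch below is an analogy under assumed sizes, not the NASTRAN implementation.

        import numpy as np
        from scipy.sparse import random as sparse_random, identity
        from scipy.sparse.linalg import splu

        n = 1000
        A = (sparse_random(n, n, density=0.001, random_state=0) + 10 * identity(n)).tocsc()
        lu = splu(A)                        # symbolic analysis + numeric factorization, once

        for step in range(5):               # e.g. nonlinear iterations or multiple load cases
            b = np.random.default_rng(step).standard_normal(n)
            x = lu.solve(b)                 # cheap numeric solves reuse the factorization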

  13. Yield comparisons from floating blade and fixed arbor gang ripsaws when processing boards before and after crook removal

    Treesearch

    Charles J. Gatchell; Charles J. Gatchell

    1991-01-01

    Gang-ripping technology that uses a movable (floating) outer blade to eliminate unusable edgings is described, including new terminology for identifying preferred and minimally acceptable strip widths. Because of the large amount of salvage required to achieve total yields, floating blade gang ripping is not recommended for boards with crook. With crook removed by...

  14. A Bayesian Analysis of Scale-Invariant Processes

    DTIC Science & Technology

    2012-01-01

    Earth Grid (EASE-Grid). The NED raster elevation data of one arc-second resolution (30 m) over the continental US are derived from multiple satellites... empirical and ME distributions, yet ensuring computational efficiency. Instead of computing empirical histograms from large amounts of data, only some

  15. Both topography and climate affected forest and woodland burn severity in two regions of the western US, 1984 to 2006

    Treesearch

    Gregory K. Dillon; Zachery A. Holden; Penelope Morgan; Michael A. Crimmins; Emily K. Heyerdahl; Charles H. Luce

    2011-01-01

    Fire is a keystone process in many ecosystems of western North America. Severe fires kill and consume large amounts of above- and belowground biomass and affect soils, resulting in long-lasting consequences for vegetation, aquatic ecosystem productivity and diversity, and other ecosystem properties. We analyzed the occurrence of, and trends in, satellite-derived burn...

  16. Abu Ghraib Dairy, Abu Ghraib, Iraq

    DTIC Science & Technology

    2010-01-14

    products, especially milk. Traditionally, a young population consumes a large amount of dairy products, such as milk, yogurt, and processed cheese...security situation and electrical capacity in Iraq continue to improve, there will be a further increase in the demand for milk, yogurt, and cheese. Dairy...based products, such as bottled milk, yogurt, cheese, cream, and butter. The State Company for Dairy Products is a holding company with three

  17. Abu Ghraib Dairy, Abu Ghraib, Iraq

    DTIC Science & Technology

    2009-01-14

    especially milk. Traditionally, a young population consumes a large amount of dairy products, such as milk, yogurt, and processed cheese. However...and electrical capacity in Iraq continue to improve, there will be a further increase in the demand for milk, yogurt, and cheese. Dairy products...such as bottled milk, yogurt, cheese, cream, and butter. The State Company for Dairy Products is a holding company with three factories/plants

  18. Integrative Genomics and Computational Systems Medicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDermott, Jason E.; Huang, Yufei; Zhang, Bing

    The exponential growth in generation of large amounts of genomic data from biological samples has driven the emerging field of systems medicine. This field is promising because it improves our understanding of disease processes at the systems level. However, the field is still in its early stages. There is a great need for novel computational methods and approaches to effectively utilize and integrate various omics data.

  19. Eliciting Dyslexic Symptoms in Proficient Readers by Simulating Deficits in Grapheme-to-Phoneme Conversion and Visuo-Magnocellular Processing

    ERIC Educational Resources Information Center

    Tholen, Nicole; Weidner, Ralph; Grande, Marion; Amunts, Katrin; Heim, Stefan

    2011-01-01

    Among the cognitive causes of dyslexia, phonological and magnocellular deficits have attracted a substantial amount of research. Their role and their exact impact on reading ability are still a matter of debate, partly also because large samples of dyslexics are hard to recruit. Here, we report a new technique to simulate dyslexic symptoms in…

  20. Processing of cellulosic material by a cellulase-containing cell-free fermentate produced from cellulase-producing bacteria, ATCC 55702

    DOEpatents

    Dees, H.C.

    1998-08-04

    Bacteria which produce large amounts of a cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase degrading bacterium ATCC 55702, which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic materials. 5 figs.

  1. Processing of cellulosic material by a cellulase-containing cell-free fermentate produced from cellulase-producing bacteria, ATCC 55702

    DOEpatents

    Dees, H. Craig

    1998-01-01

    Bacteria which produce large amounts of a cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase degrading bacterium ATCC 55702, which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic materials.

  2. Inactivation of low pathogenicity notifiable avian influenza virus and lentogenic Newcastle disease virus following pasteurization in liquid egg products

    USDA-ARS?s Scientific Manuscript database

    Sixty-seven million cases of shell eggs produced per year in the U.S. are processed as liquid egg product. The U.S. also exports a large amount of egg products. Although the U.S. is normally free of avian influenza, concern about contamination of egg product with these viruses has in the past result...

  3. A smart multisensor approach to assist blind people in specific urban navigation tasks.

    PubMed

    Ando, B

    2008-12-01

    Visually impaired people are often discouraged from using electronic aids due to complexity of operation, the large amount of training required, a nonoptimized degree of information provided to the user, and high cost. In this paper, a new multisensor architecture is discussed, which would help blind people to perform urban mobility tasks. The device is based on a multisensor strategy and adopts smart signal processing.

  4. Woody residues and solid waste wood available for recovery in the United States, 2002

    Treesearch

    David B. McKeever; Robert H. Falk

    2004-01-01

    Large amounts of woody residues and solid wood waste are generated annually in the United States from the extraction of timber from forests, from forestry cultural operations, in the conversion of forest land to nonforest uses, in the initial processing of roundwood timber into usable products, in the construction and demolition of buildings and structures, and in the...

  5. Inventories of woody residues and solid wood waste in the United States, 2002

    Treesearch

    David B. McKeever

    2004-01-01

    Large amounts of woody residues and wood waste are generated annually in the United States. In 2002, an estimated 240 million metric tons was generated during the extraction of timber from the Nation’s forests, from forestry cultural operations, in the conversion of forest land to nonforest uses, in the initial processing of roundwood timber into usable products, in...

  6. A Cross-Cultural Perspective about the Implementation and Adaptation Process of the Schoolwide Enrichment Model: The Importance of Talent Development in a Global World

    ERIC Educational Resources Information Center

    Hernández-Torrano, Daniel; Saranli, Adile Gulsah

    2015-01-01

    Gifted education and talent development are considered today as key elements for developing human capital and increasing competitiveness within education and the economy. Within this framework, a growing number of countries have begun to invest large amounts of resources to discover and nurture their most able students. As boundaries and…

  7. Parallel computing method for simulating hydrological processes of large rivers under climate change

    NASA Astrophysics Data System (ADS)

    Wang, H.; Chen, Y.

    2016-12-01

    Climate change is one of the well-known global environmental problems in the world. It has altered watershed hydrological processes in their distribution over time and space, especially in the world's large rivers. Watershed hydrological process simulation based on physically based distributed hydrological models can give better results than lumped models. However, watershed hydrological process simulation involves a large amount of calculation, especially for large rivers, and thus needs huge computing resources that may not be steadily available to researchers, or only at high expense; this has seriously restricted research and application. To solve this problem, current parallel methods mostly parallelize the computation in the space and time dimensions: they calculate the natural features in order, based on a distributed hydrological model, by grid (unit or sub-basin) from upstream to downstream. This article proposes a high-performance computing method for hydrological process simulation with a high speedup ratio and parallel efficiency. It combines the runoff characteristics of the distributed hydrological model in time and space with methods adopting distributed data storage, an in-memory database, distributed computing, and parallel computing based on computing power units. The method has strong adaptability and extensibility, which means it can make full use of computing and storage resources under the condition of limited computing resources, and the computing efficiency improves linearly as computing resources increase. This method can satisfy the parallel computing requirements of hydrological process simulation in small, medium, and large rivers.
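    As a loose illustration of the upstream-to-downstream parallel pattern (not the article's system, which also involves distributed storage and an in-memory database), the Python sketch below runs sub-basins of the same topological level concurrently; the basin topology and the sub-basin model are stand-ins.

        from concurrent.futures import ProcessPoolExecutor

        def simulate_subbasin(subbasin_id, upstream_inflow):
            # Placeholder for a distributed-model run over one sub-basin.
            local_runoff = 1.0
            return upstream_inflow + local_runoff

        if __name__ == "__main__":
            # Each level lists sub-basins whose parents all sit in earlier levels.
            levels = [["h1", "h2", "h3"], ["m1", "m2"], ["outlet"]]
            parents = {"m1": ["h1", "h2"], "m2": ["h3"], "outlet": ["m1", "m2"]}
            outflow = {}
            with ProcessPoolExecutor() as pool:
                for level in levels:
                    inflows = [sum(outflow.get(p, 0.0) for p in parents.get(s, []))
                               for s in level]
                    for s, q in zip(level, pool.map(simulate_subbasin, level, inflows)):
                        outflow[s] = q
            print(outflow["outlet"])        # 6.0 routed units in this toy example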

  8. Study on Mosaic and Uniform Color Method of Satellite Image Fusion in Large Area

    NASA Astrophysics Data System (ADS)

    Liu, S.; Li, H.; Wang, X.; Guo, L.; Wang, R.

    2018-04-01

    Due to the improvement of satellite radiometric resolution, the color differences among multi-temporal satellite remote sensing images, and the large amount of satellite image data, how to complete the mosaic and uniform color process for satellite images is an important problem in image processing. First, using the bundle uniform color method and least squares mosaic method of GXL together with a dodging function, a uniform transition of color and brightness can be realized across large-area, multi-temporal satellite images. Second, Color Mapping software is used to convert 16-bit mosaic images to 8-bit mosaic images based on a uniform color method with low resolution reference images. Finally, qualitative and quantitative analytical methods are used to analyse and evaluate the satellite imagery after mosaicking and uniform coloring. The tests show that the correlation between mosaic images before and after coloring is higher than 95%, the image information entropy increases, and texture features are enhanced, as verified by the calculation of quantitative indexes such as the correlation coefficient and information entropy. Satellite image mosaic and color processing over a large area has thus been well implemented.
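    The two quantitative indexes mentioned, correlation coefficient and information entropy, are straightforward to compute; the Python sketch below does so for a synthetic 8-bit image before and after a stand-in colour adjustment. Image loading and the GXL/Color Mapping steps themselves are not reproduced here.

        import numpy as np

        def correlation_coefficient(img_a, img_b):
            return np.corrcoef(img_a.ravel(), img_b.ravel())[0, 1]

        def information_entropy(img, bins=256):
            hist, _ = np.histogram(img, bins=bins, range=(0, 256))
            p = hist[hist > 0] / hist.sum()
            return float(-(p * np.log2(p)).sum())

        rng = np.random.default_rng(0)
        before = rng.integers(0, 256, (512, 512)).astype(np.uint8)
        after = np.clip(before * 1.05 + 3, 0, 255).astype(np.uint8)   # stand-in colour fix
        print(correlation_coefficient(before, after), information_entropy(after))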

  9. D/H on Mars: Effects of floods, volcanism, impacts, and polar processes

    USGS Publications Warehouse

    Carr, M.H.

    1990-01-01

    Water in the Martian atmosphere is 5.1 times more enriched in deuterium than terrestrial water. The enrichment has been previously attributed to either a massive loss of water early in the planet's history or the presence of only a very small reservoir of water that has exchanged with the atmosphere over geologic time. Both these interpretations appear inconsistent with geologic evidence of large floods and sustained volcanism. Large floods are believed to have episodically introduced large amounts of water onto the surface. During a large flood roughly 10^17 g of water would almost immediately sublime into the atmosphere and be frozen out on polar terrain, to form a new layer several centimeters thick. The long-term effect of a flood would depend on where the water pooled after the flood. If the water pooled at low latitudes, all the water would slowly sublime into the atmosphere and ultimately be frozen out at the poles, thereby adding several meters to the polar deposits for each flood. If the water pooled at high latitude, it would form a permanent ice deposit, largely isolated from further interchange with the atmosphere. Volcanism has also episodically introduced water into the atmosphere. Most of this water has become incorporated into the polar deposits. That released over the last 3.5 Ga could have added a few kilometers to the polar deposits, depending on the amount of dust incorporated along with the ice. Large cometary impacts would have introduced additional large amounts of water into the atmosphere. The long-term evolution of D/H in the atmosphere depends on the rate of exchange of water between the atmosphere and the polar deposits. If exchange is active, then loss rates of hydrogen from the upper atmosphere are substantially higher than those estimated by Y. L. Yung, J. Wen, J. P. Pinto, M. Allen, K. K. Pierce, and S. Paulsen [Icarus 76, 146-159 (1988)]. More plausibly, exchange of water between the atmosphere and the polar deposits is limited, so that after eruptions, floods, and cometary impacts, the atmosphere soon becomes enriched in deuterium. According to this scenario, the atmospheric D/H is different from the bulk of the planet's water and so reveals little about the amount of water outgassed. The scenario implies, however, that the polar deposits are older and more stable than formerly thought. © 1990.

  10. Scrap tyre recycling process with molten zinc as direct heat transfer and solids separation fluid: A new reactor concept.

    PubMed

    Riedewald, Frank; Goode, Kieran; Sexton, Aidan; Sousa-Gallagher, Maria J

    2016-01-01

    Every year about 1.5 billion tyres are discarded worldwide, representing a large amount of solid waste, but also a largely untapped source of raw materials. The objective of the method was to prove the concept of a novel scrap tyre recycling process which uses molten zinc as the direct heat transfer fluid and, simultaneously, uses this medium to separate the solid products (i.e. steel and rCB) in a sink-float separation at an operating temperature of 450-470 °C. This methodology involved: construction of the laboratory scale batch reactor; separation of floating rCB from the zinc; and recovery of the steel from the bottom of the reactor following pyrolysis.

  11. Design and characterization of microporous zeolitic hydroceramic waste forms for the solidification and stabilization of sodium bearing wastes

    NASA Astrophysics Data System (ADS)

    Bao, Yun

    During the production of nuclear weapons by the DOE, large amounts of liquid waste were generated and stored in tanks holding millions of gallons at the Savannah River, Hanford, and INEEL sites. Typically, the waste contains large amounts of soluble NaOH, NaNO2 and NaNO3 and small amounts of soluble fission products, cladding materials, and cleaning solution. Due to its high sodium content, it has been called sodium bearing waste (SBW). We have formulated, tested and evaluated a new type of hydroceramic waste form specifically designed to solidify SBW. Hydroceramics can be made from an aluminosilicate source such as metakaolin and NaOH solutions or the SBW itself. Under mild hydrothermal conditions, the mixture is transformed into a solid consisting of zeolites. This process leads to the incorporation of radionuclides into lattice sites and the cage structures of the zeolites. Hydroceramics have high strength and inherent stability in realistic geologic settings. The process of making hydroceramics from a series of SBWs was optimized, and the results are reported in this thesis. Some SBWs containing relatively small amounts of NaNO3 and NaNO2 (ΣNOx/ΣNa < 25 mol%) can be directly solidified with metakaolin. The remaining SBWs having high concentrations of nitrate and nitrite (ΣNOx/ΣNa > 25 mol%) require pretreatment, since a zeolitic matrix such as cancrinite is unable to host more than 25 mol% nitrate/nitrite. Two procedures to denitrate/denitrite followed by solidification were developed. One is based on calcination, in which a reducing agent such as sucrose and metakaolin have been chosen as a way of reducing nitrate and nitrite to an acceptable level. The resulting calcine can be solidified using additional metakaolin and NaOH to form a hydroceramic. As an alternative, a chemical denitration/denitrition process using Si and Al powders as the reducing agents, followed by adding metakaolin to the solution to prepare a hydroceramic, was also investigated. Si and Al are not only the reducing agents, but they also provide Si and Al species to make zeolites during the reducing process. Performance of the hydroceramics was documented using SEM microstructure and X-ray diffraction phase analysis, mechanical property tests, and leaching tests (Product Consistency Test and ANSI/ANS-16.1 leaching test).

  12. Parallelization and visual analysis of multidimensional fields: Application to ozone production, destruction, and transport in three dimensions

    NASA Technical Reports Server (NTRS)

    Schwan, Karsten

    1994-01-01

    Atmospheric modeling is a grand challenge problem for several reasons, including its inordinate computational requirements and its generation of large amounts of data concurrent with its use of very large data sets derived from measurement instruments like satellites. In addition, atmospheric models are typically run several times, on new data sets or to reprocess existing data sets, to investigate or reinvestigate specific chemical or physical processes occurring in the earth's atmosphere, to understand model fidelity with respect to observational data, or simply to experiment with specific model parameters or components.

  13. Mantle plumes and continental tectonics.

    PubMed

    Hill, R I; Campbell, I H; Davies, G F; Griffiths, R W

    1992-04-10

    Mantle plumes and plate tectonics, the result of two distinct modes of convection within the Earth, operate largely independently. Although plumes are secondary in terms of heat transport, they have probably played an important role in continental geology. A new plume starts with a large spherical head that can cause uplift and flood basalt volcanism, and may be responsible for regional-scale metamorphism or crustal melting and varying amounts of crustal extension. Plume heads are followed by narrow tails that give rise to the familiar hot-spot tracks. The cumulative effect of processes associated with tail volcanism may also significantly affect continental crust.

  14. Super enrichments of Fe-group nuclei in solar flares and their association with large He-3 enrichments

    NASA Technical Reports Server (NTRS)

    Anglin, J. D.; Dietrich, W. F.; Simpson, J. A.

    1978-01-01

    Data on solar flares and periodic particle intensity enhancements in the energy range from 1 to 20 MeV/n are examined. It is found that: (1) Fe/He-4 ratios range from about 1 to 1000 times the solar ratio of 0.0004; (2) these high ratios militate against extended storage and large amounts of nuclear processing; (3) the CNO/He-4 ratio has a much smaller range of variability and a mean value of 0.02; (4) large He-3 and Fe enrichments are strongly associated, but not on a one-to-one basis; (5) large Fe enhancements sometimes occur without correspondingly large He-3 enrichments; and (6) none of the models so far advanced adequately explains the observed He-3 and heavy-nucleus enrichments.

  15. Using Visualization in Cockpit Decision Support Systems

    NASA Technical Reports Server (NTRS)

    Aragon, Cecilia R.

    2005-01-01

    In order to safely operate their aircraft, pilots must make rapid decisions based on integrating and processing large amounts of heterogeneous information. Visual displays are often the most efficient method of presenting safety-critical data to pilots in real time. However, care must be taken to ensure the pilot is provided with the appropriate amount of information to make effective decisions and not become cognitively overloaded. The results of two usability studies of a prototype airflow hazard visualization cockpit decision support system are summarized. The studies demonstrate that such a system significantly improves the performance of helicopter pilots landing under turbulent conditions. Based on these results, design principles and implications for cockpit decision support systems using visualization are presented.

  16. Optimization of additive compositions for anode in Ni-MH secondary battery using the response surface method

    NASA Astrophysics Data System (ADS)

    Yang, Dong-Cheol; Jang, In-Su; Jang, Min-Ho; Park, Choong-Nyeon; Park, Chan-Jin; Choi, Jeon

    2009-06-01

    We optimized the composition of additives for the anode in a Ni-MH battery using the response surface method (RSM) to improve the electrode discharge capacities. When the amount of additives was small, the discharge characteristics of the electrode were degraded by charge-discharge cycling due to the low binding strength among the alloy powders and the resultant separation of the powder from the electrode surface. In contrast, the addition of a large amount of the additives increased the electrical impedance of the electrode. Through a response optimization process, we found an optimum composition range of additives to exhibit the greatest discharge capacity of the electrode.
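    For readers unfamiliar with the response surface method, the Python sketch below fits a quadratic surface to capacity measurements over two hypothetical additive fractions and locates the maximising composition; the design points, values, and variable names are invented for illustration and are not the study's data.

        import numpy as np
        from scipy.optimize import minimize

        # Design points: (binder fraction, conductive-additive fraction) -> measured capacity.
        X = np.array([[0.5, 0.5], [1.0, 0.5], [1.5, 0.5], [0.5, 1.0], [1.0, 1.0],
                      [1.5, 1.0], [0.5, 1.5], [1.0, 1.5], [1.5, 1.5]])
        y = np.array([250., 265., 255., 262., 280., 268., 258., 266., 252.])

        def design_matrix(X):
            x1, x2 = X[:, 0], X[:, 1]
            return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

        beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
        surface = lambda x: (design_matrix(np.atleast_2d(x)) @ beta)[0]
        opt = minimize(lambda x: -surface(x), x0=[1.0, 1.0], bounds=[(0.5, 1.5)] * 2)
        print("optimum composition:", opt.x, "predicted capacity:", -opt.fun)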

  17. Annotating novel genes by integrating synthetic lethals and genomic information

    PubMed Central

    Schöner, Daniel; Kalisch, Markus; Leisner, Christian; Meier, Lukas; Sohrmann, Marc; Faty, Mahamadou; Barral, Yves; Peter, Matthias; Gruissem, Wilhelm; Bühlmann, Peter

    2008-01-01

    Background Large scale screening for synthetic lethality serves as a common tool in yeast genetics to systematically search for genes that play a role in specific biological processes. Often the amounts of data resulting from a single large scale screen far exceed the capacities of experimental characterization of every identified target. Thus, there is need for computational tools that select promising candidate genes in order to reduce the number of follow-up experiments to a manageable size. Results We analyze synthetic lethality data for arp1 and jnm1, two spindle migration genes, in order to identify novel members in this process. To this end, we use an unsupervised statistical method that integrates additional information from biological data sources, such as gene expression, phenotypic profiling, RNA degradation and sequence similarity. Different from existing methods that require large amounts of synthetic lethal data, our method merely relies on synthetic lethality information from two single screens. Using a Multivariate Gaussian Mixture Model, we determine the best subset of features that assign the target genes to two groups. The approach identifies a small group of genes as candidates involved in spindle migration. Experimental testing confirms the majority of our candidates and we present she1 (YBL031W) as a novel gene involved in spindle migration. We applied the statistical methodology also to TOR2 signaling as another example. Conclusion We demonstrate the general use of Multivariate Gaussian Mixture Modeling for selecting candidate genes for experimental characterization from synthetic lethality data sets. For the given example, integration of different data sources contributes to the identification of genetic interaction partners of arp1 and jnm1 that play a role in the same biological process. PMID:18194531
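    A hedged sketch of the unsupervised step described, fitting a multivariate Gaussian mixture to integrated per-gene features and taking one component as the candidate group, is shown below in Python with scikit-learn; the feature columns and data are placeholders, not the authors' integrated data sources.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)
        # Rows are genes hit in the screen; columns are integrated features
        # (e.g. expression correlation, phenotype-profile similarity, sequence similarity).
        features = np.vstack([rng.normal(0.0, 1.0, (80, 3)),    # background genes
                              rng.normal(2.0, 0.7, (20, 3))])   # putative pathway members

        gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
        labels = gmm.fit_predict(features)
        candidate_group = int(np.argmax(gmm.means_.mean(axis=1)))  # higher-scoring component
        candidates = np.flatnonzero(labels == candidate_group)
        print(f"{len(candidates)} genes flagged for follow-up experiments")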

  18. The electrical properties of zero-gravity processed immiscibles

    NASA Technical Reports Server (NTRS)

    Lacy, L. L.; Otto, G. H.

    1974-01-01

    When dispersed or mixed immiscibles are solidified on earth, a large amount of separation of the constituents takes place due to differences in densities. However, when the immiscibles are dispersed and solidified in zero-gravity, density separation does not occur, and unique composite solids can be formed with many new and promising electrical properties. By measuring the electrical resistivity and superconducting critical temperature, Tc, of zero-g processed Ga-Bi samples, it has been found that the electrical properties of such materials are entirely different from the basic constituents and the ground control samples. Our results indicate that space processed immiscible materials may form an entirely new class of electronic materials.

  19. Exocytosis of macrophage lysosomes leads to digestion of apoptotic adipocytes and foam cell formation

    PubMed Central

    Haka, Abigail S.; Barbosa-Lorenzi, Valéria C.; Lee, Hyuek Jong; Falcone, Domenick J.; Hudis, Clifford A.; Dannenberg, Andrew J.

    2016-01-01

    Many types of apoptotic cells are phagocytosed and digested by macrophages. Adipocytes can be hundreds of times larger than macrophages, so they are too large to be digested by conventional phagocytic processes. The nature of the interaction between macrophages and apoptotic adipocytes has not been studied in detail. We describe a cellular process, termed exophagy, that is important for macrophage clearance of dead adipocytes and adipose tissue homeostasis. Using mouse models of obesity, human tissue, and a cell culture model, we show that macrophages form hydrolytic extracellular compartments at points of contact with dead adipocytes using local actin polymerization. These compartments are acidic and contain lysosomal enzymes delivered by exocytosis. Uptake and complete degradation of adipocyte fragments, which are released by extracellular hydrolysis, leads to macrophage foam cell formation. Exophagy-mediated foam cell formation is a highly efficient means by which macrophages internalize large amounts of lipid, which may ultimately overwhelm the metabolic capacity of the macrophage. This process provides a mechanism for degradation of objects, such as dead adipocytes, that are too large to be phagocytosed by macrophages. PMID:27044658

  20. Human Growth Hormone Adsorption Kinetics and Conformation on Self-Assembled Monolayers

    PubMed Central

    Buijs, Jos; Britt, David W.; Hlady, Vladimir

    2012-01-01

    The adsorption process of the recombinant human growth hormone on organic films, created by self-assembly of octadecyltrichlorosilane, arachidic acid, and dipalmitoylphosphatidylcholine, is investigated and compared to adsorption on silica and methylated silica substrates. Information on the adsorption process of human growth hormone (hGH) is obtained by using total internal reflection fluorescence (TIRF). The intensity, spectra, and quenching of the intrinsic fluorescence emitted by the growth hormone’s single tryptophan are monitored and related to adsorption kinetics and protein conformation. For the various alkylated hydrophobic surfaces with differences in surface density and conformational freedom it is observed that the adsorbed amount of growth hormone is relatively large if the alkyl chains are in an ordered structure while the amounts adsorbed are considerably lower for adsorption onto less ordered alkyl chains of fatty acid and phospholipid layers. Adsorption on methylated surfaces results in a relatively large conformational change in the growth hormone’s structure, as displayed by a 7 nm blue shift in emission wavelength and a large increase in the effectiveness of fluorescence quenching. Conformational changes are less evident for hGH adsorption onto the fatty acid and phospholipid alkyl chains. Adsorption kinetics on the hydrophilic head groups of the self-assembled monolayers are similar to those on solid hydrophilic surfaces. The relatively small conformational changes in the hGH structure observed for adsorption on silica are even further reduced for adsorption on fatty acid head groups. PMID:25125795

  1. Reversible Parallel Discrete-Event Execution of Large-scale Epidemic Outbreak Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perumalla, Kalyan S; Seal, Sudip K

    2010-01-01

    The spatial scale, runtime speed and behavioral detail of epidemic outbreak simulations together require the use of large-scale parallel processing. In this paper, an optimistic parallel discrete event execution of a reaction-diffusion simulation model of epidemic outbreaks is presented, with an implementation over the μsik simulator. Rollback support is achieved with the development of a novel reversible model that combines reverse computation with a small amount of incremental state saving. Parallel speedup and other runtime performance metrics of the simulation are tested on a small (8,192-core) Blue Gene/P system, while scalability is demonstrated on 65,536 cores of a large Cray XT5 system. Scenarios representing large population sizes (up to several hundred million individuals in the largest case) are exercised.

  2. Unsupervised Detection of Planetary Craters by a Marked Point Process

    NASA Technical Reports Server (NTRS)

    Troglio, G.; Benediktsson, J. A.; Le Moigne, J.; Moser, G.; Serpico, S. B.

    2011-01-01

    With the launch of several planetary missions in the last decade, a large amount of planetary images is being acquired. Preferably, automatic and robust processing techniques need to be used for data analysis because of the huge amount of the acquired data. Here, the aim is to achieve a robust and general methodology for crater detection. A novel technique based on a marked point process is proposed. First, the contours in the image are extracted. The object boundaries are modeled as a configuration of an unknown number of random ellipses, i.e., the contour image is considered as a realization of a marked point process. Then, an energy function is defined, containing both an a priori energy and a likelihood term. The global minimum of this function is estimated by using reversible jump Monte-Carlo Markov chain dynamics and a simulated annealing scheme. The main idea behind marked point processes is to model objects within a stochastic framework: Marked point processes represent a very promising current approach in the stochastic image modeling and provide a powerful and methodologically rigorous framework to efficiently map and detect objects and structures in an image with an excellent robustness to noise. The proposed method for crater detection has several feasible applications. One such application area is image registration by matching the extracted features.
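    The Python sketch below is a heavily simplified caricature of this family of methods (circles instead of marked ellipses, plain birth/death/perturb Metropolis moves instead of reversible-jump dynamics): it anneals an energy that rewards objects lying on extracted edge pixels. Every constant and helper name is an assumption made for illustration.

        import math, random

        def circle_score(edges, cx, cy, r, n_samples=60):
            # Fraction of sampled perimeter points that land on edge pixels.
            h, w = len(edges), len(edges[0])
            hits = 0
            for i in range(n_samples):
                a = 2 * math.pi * i / n_samples
                x, y = int(cx + r * math.cos(a)), int(cy + r * math.sin(a))
                if 0 <= y < h and 0 <= x < w and edges[y][x]:
                    hits += 1
            return hits / n_samples

        def energy(edges, circles, prior_penalty=0.3):
            # Prior term penalises each object; likelihood term rewards edge support.
            return sum(prior_penalty - circle_score(edges, *c) for c in circles)

        def anneal(edges, steps=5000, t0=1.0):
            # Simulated annealing over configurations of circles (cx, cy, r).
            h, w = len(edges), len(edges[0])
            circles, e = [], 0.0
            for step in range(steps):
                temp = t0 * (1 - step / steps) + 1e-3
                proposal = list(circles)
                move = random.choice(["birth", "death", "perturb"])
                if move == "birth" or not proposal:
                    proposal.append((random.uniform(0, w), random.uniform(0, h),
                                     random.uniform(3, min(h, w) / 4)))
                elif move == "death":
                    proposal.pop(random.randrange(len(proposal)))
                else:
                    i = random.randrange(len(proposal))
                    cx, cy, r = proposal[i]
                    proposal[i] = (cx + random.gauss(0, 1), cy + random.gauss(0, 1),
                                   max(2.0, r + random.gauss(0, 1)))
                e_new = energy(edges, proposal)
                if e_new < e or random.random() < math.exp((e - e_new) / temp):
                    circles, e = proposal, e_new
            return circles   # usage: anneal(edge_map) with a 2-D 0/1 edge image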

  3. Optimization of Refining Craft for Vegetable Insulating Oil

    NASA Astrophysics Data System (ADS)

    Zhou, Zhu-Jun; Hu, Ting; Cheng, Lin; Tian, Kai; Wang, Xuan; Yang, Jun; Kong, Hai-Yang; Fang, Fu-Xin; Qian, Hang; Fu, Guang-Pan

    2016-05-01

    Vegetable insulating oil, because of its environmental friendliness, is considered an ideal substitute for mineral oil in the insulation and cooling of transformers. The main steps of the traditional refining process are alkali refining, bleaching and distillation. This refining process gives satisfactory results for small batches of insulating oil, but it cannot be applied directly to a large-capacity reaction kettle. In this paper, rapeseed oil is used as the crude oil, and the refining process is optimized for a large-capacity reaction kettle. The optimized refining process adds an acid degumming step. Sodium silicate is added to the alkali compound in the alkali refining process, and the ratio of each component is optimized. Activated clay and activated carbon are added in a 10:1 ratio in the decolorization process, which effectively reduces the acid value and dielectric loss of the oil. Using vacuum degassing instead of distillation further reduces the acid value. Compared with mineral insulating oil on some performance parameters, the dielectric loss of the refined vegetable insulating oil is still high, and further optimization measures will be needed in the future.

  4. Heavy metal solubility in podzolic soils exposed to the alkalizing effect of air pollutants.

    PubMed

    Haapala, H; Goltsova, N; Lodenius, M

    2001-01-01

    The heavy metal content of pine forest soil was studied near the boundary between Russia and Estonia, an area characterized by large amounts of acidic and basic air pollutants, mainly sulfur dioxide and calcium. Alkalization dominates the processes in soil, since sulfur is adsorbed only in small quantities, and calcium is much better adsorbed. In addition to Ca, great amounts of Al, Fe, K, and Mg are accumulated in the humus layer due to air pollution. The heavy metal content has increased. The exchangeable content of heavy metals was in many cases much higher in polluted alkaline soils than in non-polluted acidic soils, even the ratio of exchangeable to total metal content being higher in alkaline plots. To avoid a dangerous increase in soluble heavy metal content, it is important to decrease not only the large sulfur emissions of local pollutant sources, but also the alkaline pollutants. A similar concern must be taken into account when liming of acidic forest soils is planned.

  5. Lenticular card: a new method for denture identification.

    PubMed

    Colvenkar, Shreya S

    2010-01-01

    The need for denture marking is important for forensic and social reasons in case patients need to be identified individually. The majority of surface marking and inclusion techniques are expensive, time consuming, and do not permit the incorporation of large amounts of information. In this article, a method of including a lenticular identification card is described that stands out from the currently available denture marking methods in several ways. The lenticular card storing the patient's information has two or more images that can be viewed by changing the angle of view. The maxillary denture was processed according to the manufacturer's instructions. The lenticular identification card was incorporated in the external posterior buccal surface of the maxillary denture using the salt-and-pepper technique. For durability testing, the denture with the identifier was placed in water for up to 4 months. The proposed method is simple, cheap, and can store a large amount of information, thus allowing quick identification of the denture wearer. The labels showed no sign of fading or deterioration.

  6. Analyzing large-scale spiking neural data with HRLAnalysis™

    PubMed Central

    Thibeault, Corey M.; O'Brien, Michael J.; Srinivasa, Narayan

    2014-01-01

    The additional capabilities provided by high-performance neural simulation environments and modern computing hardware have allowed for the modeling of increasingly larger spiking neural networks. This is important for exploring more anatomically detailed networks, but the corresponding accumulation of data can make analyzing the results of these simulations difficult. This is further compounded by the fact that many existing analysis packages were not developed with large spiking data sets in mind. Presented here is a software suite developed to not only process the increased amount of spike-train data in a reasonable amount of time, but also provide a user friendly Python interface. We describe the design considerations, implementation and features of the HRLAnalysis™ suite. In addition, performance benchmarks demonstrating the speedup of this design compared to a published Python implementation are also presented. The result is a high-performance analysis toolkit that is not only usable and readily extensible, but also straightforward to interface with existing Python modules. PMID:24634655
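
    As a rough illustration of the kind of spike-train post-processing such a suite automates, the sketch below bins spike times into population firing rates with plain NumPy. It is not the HRLAnalysis™ API; function and variable names are illustrative.

    ```python
    # Minimal sketch of binned population firing-rate computation from spike
    # trains, the kind of post-processing an analysis suite automates.
    # Plain NumPy for illustration; not the HRLAnalysis API.

    import numpy as np

    def binned_rates(spike_times, spike_ids, n_neurons, t_stop, bin_size=0.01):
        """Return an (n_neurons x n_bins) array of firing rates in Hz."""
        n_bins = int(np.ceil(t_stop / bin_size))
        counts = np.zeros((n_neurons, n_bins))
        bins = np.minimum((np.asarray(spike_times) / bin_size).astype(int), n_bins - 1)
        np.add.at(counts, (np.asarray(spike_ids), bins), 1)
        return counts / bin_size

    # Example: three spikes from two neurons over one second of simulated time.
    rates = binned_rates([0.005, 0.012, 0.530], [0, 1, 0], n_neurons=2, t_stop=1.0)
    print(rates.shape, rates[:, 0])
    ```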

  7. A biomedical information system for retrieval and manipulation of NHANES data.

    PubMed

    Mukherjee, Sukrit; Martins, David; Norris, Keith C; Jenders, Robert A

    2013-01-01

    The retrieval and manipulation of data from large public databases like the U.S. National Health and Nutrition Examination Survey (NHANES) may require sophisticated statistical software and significant expertise that may be unavailable in the university setting. In response, we have developed the Data Retrieval And Manipulation System (DReAMS), an automated information system to handle all processes of data extraction and cleaning and then joining different subsets to produce analysis-ready output. The system is a browser-based data warehouse application in which the input data from flat files or operational systems are aggregated in a structured way so that the desired data can be read, recoded, queried and extracted efficiently. The current pilot implementation of the system provides access to a limited subset of the NHANES database. We plan to increase the amount of data available through the system in the near future and to extend the techniques to other large databases from the CDU archive, which currently holds about 53 databases.
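
    The extract-recode-join step that such a system automates can be sketched in a few lines of pandas. The tiny inline tables, the RIAGENDR coding and the LBXGLU column are illustrative stand-ins for real NHANES extracts, not part of the DReAMS implementation.

    ```python
    # Sketch of the extract-recode-join step a system like DReAMS automates:
    # two NHANES-style subsets are recoded and merged on the respondent
    # identifier (SEQN). Inline tables and variable codings are illustrative.

    import pandas as pd

    demo = pd.DataFrame({"SEQN": [1, 2, 3], "RIAGENDR": [1, 2, 2]})
    labs = pd.DataFrame({"SEQN": [1, 2, 3], "LBXGLU": [99.0, 110.0, 87.0]})

    # Recode the coded gender variable into readable labels (assumed coding).
    demo["gender"] = demo["RIAGENDR"].map({1: "male", 2: "female"})

    # Join the subsets on the respondent sequence number: analysis-ready rows.
    merged = demo.merge(labs, on="SEQN", how="inner")
    print(merged)
    ```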

  8. An Efficient Method for the Isolation of Highly Purified RNA from Seeds for Use in Quantitative Transcriptome Analysis.

    PubMed

    Kanai, Masatake; Mano, Shoji; Nishimura, Mikio

    2017-01-11

    Plant seeds accumulate large amounts of storage reserves comprising biodegradable organic matter. Humans rely on seed storage reserves for food and as industrial materials. Gene expression profiles are powerful tools for investigating metabolic regulation in plant cells. Therefore, detailed, accurate gene expression profiles during seed development are required for crop breeding. Acquiring highly purified RNA is essential for producing these profiles. Efficient methods are needed to isolate highly purified RNA from seeds. Here, we describe a method for isolating RNA from seeds containing large amounts of oils, proteins, and polyphenols, which have inhibitory effects on high-purity RNA isolation. Our method enables highly purified RNA to be obtained from seeds without the use of phenol, chloroform, or additional processes for RNA purification. This method is applicable to Arabidopsis, rapeseed, and soybean seeds. Our method will be useful for monitoring the expression patterns of low level transcripts in developing and mature seeds.

  9. Large-scale deposition of weathered oil in the Gulf of Mexico following a deep-water oil spill.

    PubMed

    Romero, Isabel C; Toro-Farmer, Gerardo; Diercks, Arne-R; Schwing, Patrick; Muller-Karger, Frank; Murawski, Steven; Hollander, David J

    2017-09-01

    The blowout of the Deepwater Horizon (DWH) drilling rig in 2010 released an unprecedented amount of oil at depth (1,500 m) into the Gulf of Mexico (GoM). Sedimentary geochemical data from an extensive area (∼194,000 km²) were used to characterize the amount, chemical signature, distribution, and extent of the DWH oil deposited on the seafloor in 2010-2011 from coastal to deep-sea areas in the GoM. The analysis of numerous hydrocarbon compounds (N = 158) and sediment cores (N = 2,613) suggests that 1.9 ± 0.9 × 10⁴ metric tons of hydrocarbons (>C9 saturated and aromatic fractions) were deposited in 56% of the studied area, containing 21 ± 10% (up to 47%) of the total amount of oil discharged and not recovered from the DWH spill. Examination of the spatial trends and chemical diagnostic ratios indicates large deposition of weathered DWH oil in coastal and deep-sea areas and negligible deposition on the continental shelf (behaving as a transition zone in the northern GoM). The large-scale analysis of deposited hydrocarbons following the DWH spill helps in understanding the possible long-term fate of the released oil in 2010, including sedimentary transformation processes, redistribution of deposited hydrocarbons, and persistence in the environment as recycled petrocarbon. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  10. Real-time face and gesture analysis for human-robot interaction

    NASA Astrophysics Data System (ADS)

    Wallhoff, Frank; Rehrl, Tobias; Mayer, Christoph; Radig, Bernd

    2010-05-01

    Human communication relies on a large number of different communication mechanisms like spoken language, facial expressions, or gestures. Facial expressions and gestures are among the main nonverbal communication mechanisms and pass large amounts of information between human dialog partners. Therefore, to allow for intuitive human-machine interaction, real-time capable processing and recognition of facial expressions, hand and head gestures are of great importance. We present a system that is tackling these challenges. The input features for the dynamic head gestures and facial expressions are obtained from a sophisticated three-dimensional model, which is fitted to the user in a real-time capable manner. Applying this model, different kinds of information are extracted from the image data and afterwards handed over to a real-time capable data-transferring framework, the so-called Real-Time DataBase (RTDB). In addition to the head and facial-related features, low-level image features regarding the human hand (optical flow, Hu moments) are also stored in the RTDB for the evaluation of hand gestures. In general, the input of a single camera is sufficient for the parallel evaluation of the different gestures and facial expressions. The real-time capable recognition of the dynamic hand and head gestures is performed via different Hidden Markov Models, which have proven to be a quick and real-time capable classification method. For the facial expressions, on the other hand, classical decision trees or more sophisticated support vector machines are used for the classification process. The results of these classification processes are again handed over to the RTDB, where other processes (like a Dialog Management Unit) can easily access them without any blocking effects. In addition, an adjustable amount of history can be stored by the RTDB buffer unit.
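
    One of the low-level hand features mentioned above, Hu moments, can be computed from a segmented hand mask with OpenCV as sketched below. This only illustrates the feature-extraction step; the RTDB framework and the HMM classifiers are not shown, and the toy mask is a stand-in for a real segmentation.

    ```python
    # Sketch: Hu-moment features from a binary hand mask using OpenCV.
    # Illustrative only; the system's RTDB and HMM stages are not reproduced.

    import cv2
    import numpy as np

    def hand_hu_moments(mask):
        """mask: binary uint8 image of the segmented hand region."""
        m = cv2.moments(mask, binaryImage=True)
        hu = cv2.HuMoments(m).flatten()
        # Log-scale the moments, a common normalization before classification.
        return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

    # Toy example: a filled rectangle standing in for a segmented hand.
    mask = np.zeros((120, 160), dtype=np.uint8)
    cv2.rectangle(mask, (40, 30), (110, 100), 255, thickness=-1)
    print(hand_hu_moments(mask))
    ```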

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bakel, Allen J.; Conner, Cliff; Quigley, Kevin

    One of the missions of the Reduced Enrichment for Research and Test Reactors (RERTR) program (and now the National Nuclear Security Administration's Material Management and Minimization program) is to facilitate the use of low enriched uranium (LEU) targets for 99Mo production. The conversion from highly enriched uranium (HEU) to LEU targets will require five to six times more uranium to produce an equivalent amount of 99Mo. The work discussed here addresses the technical challenges encountered in the treatment of uranyl nitrate hexahydrate (UNH)/nitric acid solutions remaining after the dissolution of LEU targets. Specifically, the focus of this work is the calcination of the uranium waste from 99Mo production using LEU foil targets and the Modified Cintichem Process. Work with our calciner system showed that high furnace temperature, a large vent tube, and a mechanical shield are beneficial for calciner operation. One- and two-step direct calcination processes were evaluated. The high-temperature one-step process led to contamination of the calciner system. The two-step direct calcination process operated stably and resulted in a relatively large amount of material in the calciner cup. Chemically assisted calcination using peroxide was rejected for further work due to the difficulty in handling the products. Chemically assisted calcination using formic acid was rejected due to unstable operation. Chemically assisted calcination using oxalic acid was recommended, although a better understanding of its chemistry is needed. Overall, this work showed that the two-step direct calcination and the in-cup oxalic acid processes are the best approaches for the treatment of the UNH/nitric acid waste solutions remaining from dissolution of LEU targets for 99Mo production.

  12. Querying Large Biological Network Datasets

    ERIC Educational Resources Information Center

    Gulsoy, Gunhan

    2013-01-01

    New experimental methods have resulted in increasing amounts of genetic interaction data being generated every day. Biological networks are used to store the genetic interaction data gathered. The increasing amount of available data requires fast, large-scale analysis methods. Therefore, we address the problem of querying large biological network datasets.…

  13. Why the United States Underestimated the Soviet BW Threat

    DTIC Science & Technology

    2006-09-01

    ...air sampling. For example, the nuclear power plant at Yongbyon in the Democratic People's Republic of Korea was detected operating shortly after it... (Cirincione, Wolfsthal, and Rajkumar, Deadly Arsenals, 435-437) ...short amount of time. Chemical weapons plants also have large footprints that can... chemical processing plant for industry or agriculture is possible. For example, phosgene was a chemical weapon used extensively in World War I. This...

  14. Man-Machine Interaction: Operator.

    DTIC Science & Technology

    1984-06-01

    Master of Science in Computer Science, Naval Postgraduate School, June 1984. ...Few people, if any, remember everything they see or hear, but an amazingly large amount of material can be recalled years after it has been acquired... and skill, learning takes time. The time required for the learning process will generally vary with the complexity of the material or task he is...

  15. Artificial intelligence applications concepts for the remote sensing and earth science community

    NASA Technical Reports Server (NTRS)

    Campbell, W. J.; Roelofs, L. H.

    1984-01-01

    The following potential applications of AI to the study of earth science are described: (1) intelligent data management systems; (2) intelligent processing and understanding of spatial data; and (3) automated systems which perform tasks that currently require large amounts of time by scientists and engineers to complete. An example is provided of how an intelligent information system might operate to support an earth science project.

  16. Ruby lidar observations and trajectory analysis of stratospheric aerosols injected by the volcanic eruptions of El Chichon

    NASA Technical Reports Server (NTRS)

    Uchino, O.; Tabata, T.; Akita, I.; Okada, Y.; Naito, K.

    1985-01-01

    Large amounts of aerosol particles and gases were injected into the lower stratosphere by the violent volcanic eruptions of El Chichon on March 28, and April 3 and 4, 1982. Observational results obtained by a ruby lidar at Tsukuba (36.1 deg N, 140.1 deg E) are shown, and some aspects of the latitudinal dispersion processes of the aerosols are discussed.

  17. Information Fusion of Conflicting Input Data.

    PubMed

    Mönks, Uwe; Dörksen, Helene; Lohweg, Volker; Hübner, Michael

    2016-10-29

    Sensors, and also actuators or external sources such as databases, serve as data sources in order to realise condition monitoring of industrial applications or the acquisition of characteristic parameters like production speed or reject rate. Modern facilities create such a large amount of complex data that a machine operator is unable to comprehend and process the information contained in the data. Thus, information fusion mechanisms gain increasing importance. Besides the management of large amounts of data, further challenges towards the fusion algorithms arise from epistemic uncertainties (incomplete knowledge) in the input signals as well as conflicts between them. These aspects must be considered during information processing to obtain reliable results, which are in accordance with the real world. The analysis of the scientific state of the art shows that current solutions fulfil said requirements at most only partly. This article proposes the multilayered information fusion system MACRO (multilayer attribute-based conflict-reducing observation) employing the μ BalTLCS (fuzzified balanced two-layer conflict solving) fusion algorithm to reduce the impact of conflicts on the fusion result. The performance of the contribution is shown by its evaluation in the scope of a machine condition monitoring application under laboratory conditions. Here, the MACRO system yields the best results compared to state-of-the-art fusion mechanisms. The utilised data is published and freely accessible.

  18. FISH Oracle 2: a web server for integrative visualization of genomic data in cancer research.

    PubMed

    Mader, Malte; Simon, Ronald; Kurtz, Stefan

    2014-03-31

    A comprehensive view of all relevant genomic data is instrumental for understanding the complex patterns of molecular alterations typically found in cancer cells. One of the most effective ways to rapidly obtain an overview of genomic alterations in large amounts of genomic data is the integrative visualization of genomic events. We developed FISH Oracle 2, a web server for the interactive visualization of different kinds of downstream processed genomics data typically available in cancer research. A powerful search interface and a fast visualization engine provide a highly interactive visualization for such data. High quality image export enables life scientists to easily communicate their results. A comprehensive data administration facility allows users to keep track of the available data sets. We applied FISH Oracle 2 to published data and found evidence that, in colorectal cancer cells, the gene TTC28 may be inactivated in two different ways, a fact that has not been published before. The interactive nature of FISH Oracle 2 and the possibility to store, select and visualize large amounts of downstream processed data support life scientists in generating hypotheses. The export of high quality images supports explanatory data visualization, simplifying the communication of new biological findings. A FISH Oracle 2 demo server and the software is available at http://www.zbh.uni-hamburg.de/fishoracle.

  19. Information Fusion of Conflicting Input Data

    PubMed Central

    Mönks, Uwe; Dörksen, Helene; Lohweg, Volker; Hübner, Michael

    2016-01-01

    Sensors, and also actuators or external sources such as databases, serve as data sources in order to realise condition monitoring of industrial applications or the acquisition of characteristic parameters like production speed or reject rate. Modern facilities create such a large amount of complex data that a machine operator is unable to comprehend and process the information contained in the data. Thus, information fusion mechanisms gain increasing importance. Besides the management of large amounts of data, further challenges towards the fusion algorithms arise from epistemic uncertainties (incomplete knowledge) in the input signals as well as conflicts between them. These aspects must be considered during information processing to obtain reliable results, which are in accordance with the real world. The analysis of the scientific state of the art shows that current solutions fulfil said requirements at most only partly. This article proposes the multilayered information fusion system MACRO (multilayer attribute-based conflict-reducing observation) employing the μBalTLCS (fuzzified balanced two-layer conflict solving) fusion algorithm to reduce the impact of conflicts on the fusion result. The performance of the contribution is shown by its evaluation in the scope of a machine condition monitoring application under laboratory conditions. Here, the MACRO system yields the best results compared to state-of-the-art fusion mechanisms. The utilised data is published and freely accessible. PMID:27801874

  20. Test Data Monitor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosas, Joseph

    The National Security Campus (NSC) collects a large amount of test data used to accept high-value and high-rigor product. The data has been used historically to support root cause analysis when anomalies are detected in down-stream processes. The opportunity to use the data for predictive failure analysis, however, had never been exploited. The primary goal of the Test Data Monitor (TDM) software is to provide automated capabilities to analyze data in near-real-time and report trends that foreshadow actual product failures. To date, the aerospace industry as a whole is challenged at utilizing collected data to the degree that modern technology allows. As a result of the innovation behind TDM, Honeywell is able to monitor millions of data points through a multitude of SPC algorithms continuously and autonomously so that our personnel resources can more efficiently and accurately direct their attention to suspect processes or features. TDM's capabilities have been recognized by our U.S. Department of Energy National Nuclear Security Administration (NNSA) sponsor for potential use at other sites within the NNSA. This activity supports multiple initiatives including expectations of the NNSA and broader corporate goals that center around data-based quality controls on production.
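
    The record does not describe TDM's specific SPC algorithms, but one classic rule of the kind such a monitor can run continuously is flagging points outside the 3-sigma control limits of a baseline sample. The sketch below illustrates that rule only and is not TDM code.

    ```python
    # Minimal sketch of one classic SPC check (a point outside the 3-sigma
    # control limits of a baseline sample), illustrating the kind of rule a
    # monitor can run continuously over streaming test data. Not TDM code.

    import statistics

    def out_of_control(baseline, new_points, k=3.0):
        """Yield (index, value) for points beyond mean +/- k*sigma of baseline."""
        mean = statistics.mean(baseline)
        sigma = statistics.stdev(baseline)
        lo, hi = mean - k * sigma, mean + k * sigma
        for i, x in enumerate(new_points):
            if x < lo or x > hi:
                yield i, x

    baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.05, 9.95, 10.1]
    stream = [10.0, 10.3, 11.9, 9.9]     # the third value should be flagged
    print(list(out_of_control(baseline, stream)))
    ```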

  1. A tool for optimization of the production and user analysis on the Grid, C. Grigoras for the ALICE Collaboration

    NASA Astrophysics Data System (ADS)

    Grigoras, Costin; Carminati, Federico; Vladimirovna Datskova, Olga; Schreiner, Steffen; Lee, Sehoon; Zhu, Jianlin; Gheata, Mihaela; Gheata, Andrei; Saiz, Pablo; Betev, Latchezar; Furano, Fabrizio; Mendez Lorenzo, Patricia; Grigoras, Alina Gabriela; Bagnasco, Stefano; Peters, Andreas Joachim; Saiz Santos, Maria Dolores

    2011-12-01

    With the LHC and ALICE entering full operation and production modes, the amount of simulation, RAW data processing and end-user analysis computational tasks is increasing. The efficient management of all these tasks, which differ widely in lifecycle, amount of processed data and methods of analyzing the end result, required the development and deployment of new tools in addition to the already existing Grid infrastructure. To facilitate the management of the large-scale simulation and raw data reconstruction tasks, ALICE has developed a production framework called the Lightweight Production Manager (LPM). The LPM automatically submits jobs to the Grid based on triggers and conditions, for example after the completion of a physics run. It follows the evolution of each job and publishes the results on the web for worldwide access by the ALICE physicists. This framework is tightly integrated with the ALICE Grid framework AliEn. In addition to publishing the job status, LPM also provides a fully authenticated interface to the AliEn Grid catalogue, to browse and download files, and in the near future will provide simple types of data analysis through ROOT plugins. The framework is also being extended to allow management of end-user jobs.

  2. Kinetics of methane hydrate replacement with carbon dioxide and nitrogen gas mixture using in situ NMR spectroscopy.

    PubMed

    Cha, Minjun; Shin, Kyuchul; Lee, Huen; Moudrakovski, Igor L; Ripmeester, John A; Seo, Yutaek

    2015-02-03

    In this study, the kinetics of methane replacement with carbon dioxide and nitrogen gas in methane gas hydrate prepared in porous silica gel matrices has been studied by in situ (1)H and (13)C NMR spectroscopy. The replacement process was monitored by in situ (1)H NMR spectra, where about 42 mol % of the methane in the hydrate cages was replaced in 65 h. Large amounts of free water were not observed during the replacement process, indicating a spontaneous replacement reaction upon exposing methane hydrate to carbon dioxide and nitrogen gas mixture. From in situ (13)C NMR spectra, we confirmed that the replacement ratio was slightly higher in small cages, but due to the composition of structure I hydrate, the amount of methane evolved from the large cages was larger than that of the small cages. Compositional analysis of vapor and hydrate phases was also carried out after the replacement reaction ceased. Notably, the composition changes in hydrate phases after the replacement reaction would be affected by the difference in the chemical potential between the vapor phase and hydrate surface rather than a pore size effect. These results suggest that the replacement technique provides methane recovery as well as stabilization of the resulting carbon dioxide hydrate phase without melting.

  3. Present status of recycling waste mobile phones in China: a review.

    PubMed

    Li, Jingying; Ge, Zhongying; Liang, Changjin; An, Ni

    2017-07-01

    A large number of waste mobile phones have already been generated, and more are being generated. Countries around the world have been actively exploring ways of recycling and reuse to deal with such a large number of waste mobile phones. In some countries, the processing of waste mobile phones has formed a complete industrial chain, which not only recycles waste mobile phones to reduce their negative influence on the environment but also turns waste into a resource that yields considerable economic benefits. However, the situation of recycling waste mobile phones in China is not going well. Waste mobile phones are not formally covered by the existing regulations and policies for waste electrical and electronic equipment in China. In order to explore an appropriate system for recovering waste mobile phones, the mobile phone production and the amount of waste mobile phones are introduced in this paper, and the status of waste mobile phone recycling is described; then, the electronic waste disposal technologies most likely to be used in industrial applications in the near future are reviewed. Finally, rationalization proposals are put forward based on the current recovery status of waste mobile phones for the purpose of promoting the development of recycling waste mobile phones in developing countries, with a special emphasis on China.

  4. The Effect of Traditional Treatments on Heavy Metal Toxicity of Armenian Bole

    PubMed Central

    Hosamo, Ammar; Zarshenas, Mohammad Mehdi; Mehdizadeh, Alireza; Zomorodian, Kamiar; Khani, Ayda Hossein

    2016-01-01

    Background: Clay has been used for its nutrition, cosmetic, and antibacterial properties for thousands of years. Its small particle size, large surface area, and high concentration of ions have made it an interesting subject for pharmaceutical research. There have been studies on scavenging foreign substances and antibacterial properties of clay minerals. The main problem with the medical use of these agents, today, is their heavy metal toxicity. This includes arsenic, cadmium, lead, nickel, zinc, and iron. Iranian traditional medicine (ITM) introduces different clays as medicaments. In this system, there are specific processes for these agents, which might reduce the chance of heavy metal toxicity. Armenian bole is a type of clay that has been used to treat a wound. Before in vivo studies of this clay, its safety should be confirmed. Methods: In this work, we investigated the effect of washing process as mentioned in ITM books regarding the presence of Pb, As, and Cd in 5 samples using atomic absorption spectrometry. We washed each sample (50 g) with 500 cc of distilled water. The samples were filtered and dried at room temperature for 24 hours. Results: In all studied samples, the amount of Pb and Cd was reduced after the ITM washing process. The amount of As was reduced in 3 samples and increased in 2 other samples. Conclusion: In ITM books, there are general considerations for the use of medicinal clay. These agents should not be used before special treatments such as the washing process. In this study, we observed the effect of washing process on reducing the amount of heavy metals in Armenian bole samples. In two samples, washing caused an increase in the amount of As. As these heavy metals sediment according to their density in different layers, the sample layer on which the spectrometry is performed could have an effect on the results. PMID:27840531

  5. The Effect of Traditional Treatments on Heavy Metal Toxicity of Armenian Bole

    PubMed Central

    Hosamo, Ammar; Zarshenas, Mohammad Mehdi; Mehdizadeh, Alireza; Zomorodian, Kamiar; Khani, Ayda Hossein

    2016-01-01

    Background: Clay has been used for its nutrition, cosmetic, and antibacterial properties for thousands of years. Its small particle size, large surface area, and high concentration of ions have made it an interesting subject for pharmaceutical research. There have been studies on scavenging foreign substances and antibacterial properties of clay minerals. The main problem with the medical use of these agents, today, is their heavy metal toxicity. This includes arsenic, cadmium, lead, nickel, zinc, and iron. Iranian traditional medicine (ITM) introduces different clays as medicaments. In this system, there are specific processes for these agents, which might reduce the chance of heavy metal toxicity. Armenian bole is a type of clay that has been used to treat a wound. Before in vivo studies of this clay, its safety should be confirmed. Methods: In this work, we investigated the effect of washing process as mentioned in ITM books regarding the presence of Pb, As, and Cd in 5 samples using atomic absorption spectrometry. We washed each sample (50 g) with 500 cc of distilled water. The samples were filtered and dried at room temperature for 24 hours. Results: In all studied samples, the amount of Pb and Cd was reduced after the ITM washing process. The amount of As was reduced in 3 samples and increased in 2 other samples. Conclusion: In ITM books, there are general considerations for the use of medicinal clay. These agents should not be used before special treatments such as the washing process. In this study, we observed the effect of washing process on reducing the amount of heavy metals in Armenian bole samples. In two samples, washing caused an increase in the amount of As. As these heavy metals sediment according to their density in different layers, the sample layer on which the spectrometry is performed could have an effect on the results. PMID:27516695

  6. The Effect of Traditional Treatments on Heavy Metal Toxicity of Armenian Bole.

    PubMed

    Hosamo, Ammar; Zarshenas, Mohammad Mehdi; Mehdizadeh, Alireza; Zomorodian, Kamiar; Khani, Ayda Hossein

    2016-05-01

    Clay has been used for its nutrition, cosmetic, and antibacterial properties for thousands of years. Its small particle size, large surface area, and high concentration of ions have made it an interesting subject for pharmaceutical research. There have been studies on scavenging foreign substances and antibacterial properties of clay minerals. The main problem with the medical use of these agents, today, is their heavy metal toxicity. This includes arsenic, cadmium, lead, nickel, zinc, and iron. Iranian traditional medicine (ITM) introduces different clays as medicaments. In this system, there are specific processes for these agents, which might reduce the chance of heavy metal toxicity. Armenian bole is a type of clay that has been used to treat a wound. Before in vivo studies of this clay, its safety should be confirmed. In this work, we investigated the effect of washing process as mentioned in ITM books regarding the presence of Pb, As, and Cd in 5 samples using atomic absorption spectrometry. We washed each sample (50 g) with 500 cc of distilled water. The samples were filtered and dried at room temperature for 24 hours. In all studied samples, the amount of Pb and Cd was reduced after the ITM washing process. The amount of As was reduced in 3 samples and increased in 2 other samples. In ITM books, there are general considerations for the use of medicinal clay. These agents should not be used before special treatments such as the washing process. In this study, we observed the effect of washing process on reducing the amount of heavy metals in Armenian bole samples. In two samples, washing caused an increase in the amount of As. As these heavy metals sediment according to their density in different layers, the sample layer on which the spectrometry is performed could have an effect on the results.

  7. Large-scale neural circuit mapping data analysis accelerated with the graphical processing unit (GPU).

    PubMed

    Shi, Yulin; Veidenbaum, Alexander V; Nicolau, Alex; Xu, Xiangmin

    2015-01-15

    Modern neuroscience research demands computing power. Neural circuit mapping studies such as those using laser scanning photostimulation (LSPS) produce large amounts of data and require intensive computation for post hoc processing and analysis. Here we report on the design and implementation of a cost-effective desktop computer system for accelerated experimental data processing with recent GPU computing technology. A new version of Matlab software with GPU enabled functions is used to develop programs that run on Nvidia GPUs to harness their parallel computing power. We evaluated both the central processing unit (CPU) and GPU-enabled computational performance of our system in benchmark testing and practical applications. The experimental results show that the GPU-CPU co-processing of simulated data and actual LSPS experimental data clearly outperformed the multi-core CPU with up to a 22× speedup, depending on computational tasks. Further, we present a comparison of numerical accuracy between GPU and CPU computation to verify the precision of GPU computation. In addition, we show how GPUs can be effectively adapted to improve the performance of commercial image processing software such as Adobe Photoshop. To our best knowledge, this is the first demonstration of GPU application in neural circuit mapping and electrophysiology-based data processing. Together, GPU enabled computation enhances our ability to process large-scale data sets derived from neural circuit mapping studies, allowing for increased processing speeds while retaining data precision. Copyright © 2014 Elsevier B.V. All rights reserved.
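
    The paper itself uses Matlab's GPU-enabled functions; the same offload-to-GPU pattern can be sketched in Python with CuPy (this requires an NVIDIA GPU and the cupy package, and is a substitution for illustration, not the authors' implementation). The workload below is a generic stand-in, not the LSPS analysis.

    ```python
    # Offload a stand-in array workload to the GPU when CuPy is available,
    # falling back to NumPy otherwise. Illustrative substitute for the paper's
    # Matlab GPU-enabled processing; not the LSPS analysis itself.

    import numpy as np

    try:
        import cupy as cp
        xp = cp          # use the GPU when available
    except ImportError:
        cp = None
        xp = np          # fall back to the CPU

    # Stand-in workload: crude frequency-domain filtering of many traces.
    traces = xp.asarray(np.random.rand(512, 4096).astype(np.float32))
    spectra = xp.fft.rfft(traces, axis=1)
    spectra[:, 200:] = 0                  # crude low-pass
    filtered = xp.fft.irfft(spectra, axis=1)

    # Bring the result back to host memory for further CPU-side analysis.
    result = cp.asnumpy(filtered) if cp is not None else filtered
    print(result.shape)
    ```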

  8. Large scale neural circuit mapping data analysis accelerated with the graphical processing unit (GPU)

    PubMed Central

    Shi, Yulin; Veidenbaum, Alexander V.; Nicolau, Alex; Xu, Xiangmin

    2014-01-01

    Background Modern neuroscience research demands computing power. Neural circuit mapping studies such as those using laser scanning photostimulation (LSPS) produce large amounts of data and require intensive computation for post-hoc processing and analysis. New Method Here we report on the design and implementation of a cost-effective desktop computer system for accelerated experimental data processing with recent GPU computing technology. A new version of Matlab software with GPU enabled functions is used to develop programs that run on Nvidia GPUs to harness their parallel computing power. Results We evaluated both the central processing unit (CPU) and GPU-enabled computational performance of our system in benchmark testing and practical applications. The experimental results show that the GPU-CPU co-processing of simulated data and actual LSPS experimental data clearly outperformed the multi-core CPU with up to a 22x speedup, depending on computational tasks. Further, we present a comparison of numerical accuracy between GPU and CPU computation to verify the precision of GPU computation. In addition, we show how GPUs can be effectively adapted to improve the performance of commercial image processing software such as Adobe Photoshop. Comparison with Existing Method(s) To our best knowledge, this is the first demonstration of GPU application in neural circuit mapping and electrophysiology-based data processing. Conclusions Together, GPU enabled computation enhances our ability to process large-scale data sets derived from neural circuit mapping studies, allowing for increased processing speeds while retaining data precision. PMID:25277633

  9. Selective sequential precipitation of dissolved metals in mine drainage from coal mine

    NASA Astrophysics Data System (ADS)

    Yim, Giljae; Bok, Songmin; Ji, Sangwoo; Oh, Chamteut; Cheong, Youngwook; Han, Youngsoo; Ahn, Joosung

    2017-04-01

    In abandoned mines in Korea, a large amount of mine drainage continues to flow out and spread pollution. In the purification of the mine drainage, a massive amount of sludge is generated as waste. Since this metal sludge contains high levels of Fe, Al and Mn oxides, developing a treatment method to recover homogeneous individual metals with high purity may be beneficial for recycling waste metals as useful resources and reducing the amount of sludge produced. In this regard, we established a selective precipitation process for dissolved metals to treat Waryong Industry's mine drainage. The process that selectively precipitates metals dissolved in mine drainage is a continuous Fe-buffer-Al process, and each stage consists of a neutralization tank, a coagulation tank, and a settling tank. Based on this process, this study verified the operational applicability of the selective precipitation of Fe and Al. Our previous study revealed that high-purity Fe and Al precipitates could be recovered at a flow rate of 1.5 ton/day, while lower purity was achieved when the rate was increased to about 3 ton/day due to the difficulty in reagent dosage control. The current study was conducted to increase the capacity of the system to recover Fe and Al as high-purity precipitates at a flow rate of 10 ton/day, with continuous operation ensured by introducing an automatic reagent injection system. The previous study had difficulty in controlling the pH and operating the system continuously due to the manually controlled reagent injection. To upgrade this and ensure the optimal pH in a stable way, a continuous reagent injection system was installed. The result of operating the 10 ton/day system confirmed that the scaled-up process could maintain stable recovery rates and purities of precipitates on site.

  10. Episodic, generalized, and semantic memory tests: switching and strength effects.

    PubMed

    Humphreys, Michael S; Murray, Krista L

    2011-09-01

    We continue the process of investigating the probabilistic paired associate paradigm in an effort to understand the memory access control processes involved and to determine whether the memory structure produced is in transition between episodic and semantic memory. In this paradigm two targets are probabilistically paired with a cue across a large number of short lists. Participants can recall the target paired with the cue in the most recent list (list specific test), produce the first of the two targets that have been paired with that cue to come to mind (generalised test), and produce a free association response (semantic test). Switching between a generalised test and a list specific test did not produce a switching cost indicating a general similarity in the control processes involved. In addition, there was evidence for a dissociation between two different strength manipulations (amount of study time and number of cue-target pairings) such that number of pairings influenced the list specific, generalised and the semantic test but amount of study time only influenced the list specific and generalised test. © 2011 Canadian Psychological Association

  11. The leaching kinetics of cadmium from hazardous Cu-Cd zinc plant residues.

    PubMed

    Li, Meng; Zheng, Shili; Liu, Biao; Du, Hao; Dreisinger, David Bruce; Tafaghodi, Leili; Zhang, Yi

    2017-07-01

    A large amount of Cu-Cd zinc plant residues (CZPR) is produced from hydrometallurgical zinc plant operations. Since these residues contain substantial amounts of heavy metals, including Cd, Zn and Cu, they are considered hazardous wastes. In order to realize decontamination treatment and efficient extraction of the valuable metals from the CZPR, a comprehensive recovery process using sulfuric acid as the leaching reagent and air as the oxidizing reagent has been proposed. The effect of temperature, sulfuric acid concentration, particle size, solid/liquid ratio and stirring speed on the cadmium extraction efficiency was investigated. The leaching kinetics of cadmium was also studied. It was concluded that the cadmium leaching process was controlled by solid film diffusion. Moreover, the order of the reaction rate constant with respect to H2SO4 concentration, particle size, solid/liquid ratio and stirring speed was calculated. The XRD and SEM-EDS analysis results showed that the main phases of the secondary sulfuric acid leaching residues were lead sulfate and calcium sulfate. Copyright © 2017 Elsevier Ltd. All rights reserved.
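
    The record states diffusion control but does not give the rate expression that was fitted. For orientation, one common diffusion-controlled shrinking-core form (the Ginstling-Brounshtein expression) is shown below; this is an assumption about the kind of model typically used, not necessarily the authors' equation.

    ```latex
    % One common diffusion-controlled shrinking-core rate expression
    % (Ginstling-Brounshtein); shown for orientation only -- the record does
    % not state which kinetic model the authors fitted.
    \[
      1 - \tfrac{2}{3}\,x - (1 - x)^{2/3} = k\,t ,
    \]
    % where x is the fraction of cadmium leached at time t and k is the
    % apparent rate constant, whose dependence on H2SO4 concentration,
    % particle size, solid/liquid ratio and stirring speed can then be fitted.
    ```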

  12. Asymmetric capture of Dirac dark matter by the Sun

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blennow, Mattias; Clementz, Stefan

    2015-08-18

    Current problems with the solar model may be alleviated if a significant amount of dark matter from the galactic halo is captured in the Sun. We discuss the capture process in the case where the dark matter is a Dirac fermion and the background halo consists of equal amounts of dark matter and anti-dark matter. By considering the case where dark matter and anti-dark matter have different cross sections on solar nuclei, as well as the case where the capture process is treated as a Poisson process, we find that a significant asymmetry between the captured dark particles and anti-particles is possible even for an annihilation cross section in the range expected for thermal relic dark matter. Since the captured numbers of particles are competitive with asymmetric dark matter models in a large range of parameter space, one may expect solar physics to be altered by the capture of Dirac dark matter. It is thus possible that solutions to the solar composition problem may be searched for in these types of models.

  13. Microstructure and corrosion behavior of laser processed NiTi alloy.

    PubMed

    Marattukalam, Jithin J; Singh, Amit Kumar; Datta, Susmit; Das, Mitun; Balla, Vamsi Krishna; Bontha, Srikanth; Kalpathy, Sreeram K

    2015-12-01

    Laser Engineered Net Shaping (LENS™), a commercially available additive manufacturing technology, has been used to fabricate dense equiatomic NiTi alloy components. The primary aim of this work is to study the effect of laser power and scan speed on the microstructure, phase constituents, hardness and corrosion behavior of laser processed NiTi alloy. The results showed retention of a large amount of high-temperature austenite phase at room temperature due to the high cooling rates associated with laser processing. The high amount of austenite in these samples increased the hardness. The grain size and corrosion resistance were found to increase with laser power. The surface energy of the NiTi alloy, calculated using contact angles, decreased from 61 mN/m to 56 mN/m with an increase in laser energy density from 20 J/mm(2) to 80 J/mm(2). The decrease in surface energy shifted the corrosion potentials in the nobler direction and decreased the corrosion current. Under the present experimental conditions, the laser power was found to have a strong influence on the microstructure, phase constituents and corrosion resistance of the NiTi alloy. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Acoustic Sample Deposition MALDI-MS (ASD-MALDI-MS): A Novel Process Flow for Quality Control Screening of Compound Libraries.

    PubMed

    Chin, Jefferson; Wood, Elizabeth; Peters, Grace S; Drexler, Dieter M

    2016-02-01

    In the early stages of drug discovery, high-throughput screening (HTS) of compound libraries against pharmaceutical targets is a common method to identify potential lead molecules. For these HTS campaigns to be efficient and successful, continuous quality control of the compound collection is necessary and crucial. However, the large number of compound samples and the limited sample amount pose unique challenges. Presented here is a proof-of-concept study for a novel process flow for the quality control screening of small-molecule compound libraries that consumes only minimal amounts of samples and affords compound-specific molecular data. This process employs an acoustic sample deposition (ASD) technique for the offline sample preparation by depositing nanoliter volumes in an array format onto microscope glass slides followed by matrix-assisted laser desorption/ionization mass spectrometric (MALDI-MS) analysis. An initial study of a 384-compound array employing the ASD-MALDI-MS workflow resulted in a 75% first-pass positive identification rate with an analysis time of <1 s per sample. © 2015 Society for Laboratory Automation and Screening.

  15. Robust watermarking scheme for binary images using a slice-based large-cluster algorithm with a Hamming Code

    NASA Astrophysics Data System (ADS)

    Chen, Wen-Yuan; Liu, Chen-Chung

    2006-01-01

    The problems with binary watermarking schemes are that they have only a small amount of embeddable space and are not robust enough. We develop a slice-based large-cluster algorithm (SBLCA) to construct a robust watermarking scheme for binary images. In SBLCA, a small-amount cluster selection (SACS) strategy is used to search for a feasible slice in a large-cluster flappable-pixel decision (LCFPD) method, which is used to search for the best location for concealing a secret bit from a selected slice. This method has four major advantages over the others: (a) SBLCA has a simple and effective decision function to select appropriate concealment locations, (b) SBLCA utilizes a blind watermarking scheme without the original image in the watermark extracting process, (c) SBLCA uses slice-based shuffling capability to transfer the regular image into a hash state without remembering the state before shuffling, and finally, (d) SBLCA has enough embeddable space that every 64 pixels could accommodate a secret bit of the binary image. Furthermore, empirical results on test images reveal that our approach is a robust watermarking scheme for binary images.

  16. Effects of mass loading on dayside solar wind-magnetosphere interactions

    NASA Astrophysics Data System (ADS)

    Zhang, B.; Brambles, O.; Wiltberger, M. J.; Lyon, J.; Lotko, W.

    2016-12-01

    Satellite observations have shown that terrestrial-sourced plasmas mass load the dayside magnetopause and cause reductions in local reconnection rates. Whether the integrated dayside reconnection rate is affected by these local mass-loading processes is still an open question. Several mechanisms have been proposed to describe the control of dayside reconnection, including the local-control and global-control hypotheses. We have conducted a series of controlled numerical simulations to investigate the response of dayside solar wind-magnetosphere (SW-M) coupling to mass loading processes. Our simulation results show that the coupled SW-M system may exhibit both local and global control behaviors depending on the amount of mass loading. With a small amount of mass loading, the changes in the local reconnection rate do not affect magnetosheath properties and the geoeffective length in the upstream solar wind, resulting in the same integrated dayside reconnection rate. With a large amount of mass loading, the magnetosheath properties and the geoeffective length are significantly modified by the slowing of the local reconnection rate, resulting in a significant reduction in the integrated dayside reconnection rate. The response of magnetosheath properties to mass loading is expected from the Cassak-Shay asymmetric reconnection theory through conservation of energy. The physical origin of the transition regime between local and global control is qualitatively explained. The parameters that determine the transition regime depend on the location, spatial extension and density of the mass loading process.

  17. Simultaneous analysis of large INTEGRAL/SPI datasets: Optimizing the computation of the solution and its variance using sparse matrix algorithms

    NASA Astrophysics Data System (ADS)

    Bouchet, L.; Amestoy, P.; Buttari, A.; Rouet, F.-H.; Chauvin, M.

    2013-02-01

    Nowadays, analyzing and reducing the ever larger astronomical datasets is becoming a crucial challenge, especially for long cumulated observation times. The INTEGRAL/SPI X/γ-ray spectrometer is an instrument for which it is essential to process many exposures at the same time in order to increase the low signal-to-noise ratio of the weakest sources. In this context, the conventional methods for data reduction are inefficient and sometimes not feasible at all. Processing several years of data simultaneously requires computing not only the solution of a large system of equations, but also the associated uncertainties. We aim at reducing the computation time and the memory usage. Since the SPI transfer function is sparse, we have used some popular methods for the solution of large sparse linear systems; we briefly review these methods. We use the Multifrontal Massively Parallel Solver (MUMPS) to compute the solution of the system of equations. We also need to compute the variance of the solution, which amounts to computing selected entries of the inverse of the sparse matrix corresponding to our linear system. This can be achieved through one of the latest features of the MUMPS software that has been partly motivated by this work. In this paper we provide a brief presentation of this feature and evaluate its effectiveness on astrophysical problems requiring the processing of large datasets simultaneously, such as the study of the entire emission of the Galaxy. We used these algorithms to solve the large sparse systems arising from SPI data processing and to obtain both their solutions and the associated variances. In conclusion, thanks to these newly developed tools, processing large datasets arising from SPI is now feasible with both a reasonable execution time and a low memory usage.
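
    The paper relies on the MUMPS multifrontal solver; the key pattern (factor a large sparse system once and reuse the factorization for many right-hand sides) can be sketched with SciPy's sparse LU as below. This is an illustrative substitute only and does not reproduce MUMPS's selected-inverse feature used to obtain the variances; the system itself is a synthetic stand-in for the SPI transfer function.

    ```python
    # "Factor once, solve many right-hand sides" with SciPy's sparse LU,
    # illustrating the sparse-solve step (not the MUMPS selected-inverse feature).

    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    n = 1000
    # Stand-in sparse, diagonally dominant system (the real matrix comes from
    # the SPI transfer function).
    A = sp.diags([1.0, 4.0, 1.0], offsets=[-1, 0, 1], shape=(n, n), format="csc")
    lu = spla.splu(A)                 # factor once

    b1 = np.ones(n)
    b2 = np.arange(n, dtype=float)
    x1 = lu.solve(b1)                 # reuse the factorization for many RHS
    x2 = lu.solve(b2)
    print(np.allclose(A @ x1, b1), np.allclose(A @ x2, b2))
    ```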

  18. Degradation of municipal solid waste in simulated landfill bioreactors under aerobic conditions.

    PubMed

    Slezak, Radoslaw; Krzystek, Liliana; Ledakowicz, Stanislaw

    2015-09-01

    In this study, the municipal solid waste degradation processes in simulated landfill bioreactors under aerobic and anaerobic conditions are investigated. The effect of waste aeration on the dynamics of the aerobic degradation processes in lysimeters, as well as during the anaerobic processes after completion of aeration, is presented. The results are compared with the anaerobic degradation process to determine the stabilization stage of the waste in both experimental modes. The experiments in aerobic lysimeters were carried out at a small aeration rate (4.41·10⁻³ l min⁻¹ kg⁻¹) and for two recirculation rates (24.9 and 1.58 l m⁻³ d⁻¹). The changes in leachate composition and in the composition of the gases formed showed that the application of even a small aeration rate favored the degradation of organic matter. The amount of CO2 and CH4 released from the anaerobic lysimeter was about 5 times lower than that from the aerobic lysimeters. Better stabilization of the waste was obtained in the aerobic lysimeter with small recirculation, from which the amount of CO2 produced was larger by about 19% in comparison with that from the aerobic lysimeter with large leachate recirculation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. International online support to process optimisation and operation decisions.

    PubMed

    Onnerth, T B; Eriksson, J

    2002-01-01

    The information level at technical facilities has developed from almost nothing 30-40 years ago to advanced IT (Information Technology) systems based on both chemical and mechanical on-line sensors for process and equipment. Still, the basic requirement is to get the right data at the right time for the decision to be made. Today a large amount of operational data is available at almost any European wastewater treatment plant, from the laboratory and from SCADA. The difficult part is to determine which data to keep, which to use in calculations, and how and where to make data available. With the STARcontrol system it is possible to select only process-relevant data for on-line control and for reporting at the engineering level, in order to optimise operation. Furthermore, the use of IT makes it possible to communicate internationally, with full access to all the data of a single plant. In this way, expert supervision can be both very local, in the local language (e.g. Polish), and at the same time very professional, with Danish experts advising on Danish processes in Poland or Sweden, where some of the 12 STARcontrol systems are running.

  20. Liquid by-products from fish canning industry as sustainable sources of ω3 lipids.

    PubMed

    Monteiro, Ana; Paquincha, Diogo; Martins, Florinda; Queirós, Rui P; Saraiva, Jorge A; Švarc-Gajić, Jaroslava; Nastić, Nataša; Delerue-Matos, Cristina; Carvalho, Ana P

    2018-08-01

    The fish canning industry generates large amounts of liquid wastes, which are discarded after proper treatment to remove the organic load. However, alternative treatment processes may also be designed to target the recovery of valuable compounds; with this procedure, these wastewaters are converted into liquid by-products, becoming an additional source of revenue for the company. This study evaluated green and economically sustainable methodologies for the extraction of ω3 lipids from fish canning liquid by-products. Lipids were extracted by processes combining physical and chemical parameters (conventional and pressurized extraction processes), as well as chemical and biological parameters. Furthermore, LCA was applied to evaluate the environmental performance and cost indicators for each process. Results indicated that extraction with high hydrostatic pressure provides the highest amounts of ω3 polyunsaturated fatty acids (3331.5 mg L⁻¹ effluent), apart from presenting the lowest environmental impact and costs. The studied procedures make it possible to obtain alternative, sustainable and traceable sources of ω3 lipids for further applications in the food, pharmaceutical and cosmetic industries. Additionally, such an approach contributes towards the organic depuration of canning liquid effluents, therefore reducing the overall waste treatment costs. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. AsterAnts: A Concept for Large-Scale Meteoroid Return and Processing using the International Space Station

    NASA Technical Reports Server (NTRS)

    Globus, Al; Biegel, Bryan A.; Traugott, Steve

    2004-01-01

    AsterAnts is a concept calling for a fleet of solar sail powered spacecraft to retrieve large numbers of small (1/2-1 meter diameter) Near Earth Objects (NEOs) for orbital processing. AsterAnts could use the International Space Station (ISS) for NEO processing, solar sail construction, and to test NEO capture hardware. Solar sails constructed on orbit are expected to have substantially better performance than their ground built counterparts [Wright 1992]. Furthermore, solar sails may be used to hold geosynchronous communication satellites out-of-plane [Forward 1981], increasing the total number of slots by at least a factor of three, potentially generating $2 billion worth of orbital real estate over North America alone. NEOs are believed to contain large quantities of water, carbon, other life-support materials and metals. Thus, with proper processing, NEO materials could in principle be used to resupply the ISS, produce rocket propellant, manufacture tools, and build additional ISS working space. Unlike proposals requiring massive facilities, such as lunar bases, before returning any extraterrestrial material, AsterAnts would require nothing larger than a typical inter-planetary mission. Furthermore, AsterAnts could be scaled up to deliver large amounts of material by building many copies of the same spacecraft, thereby achieving manufacturing economies of scale. Because AsterAnts would capture NEOs whole, NEO composition details, which are generally poorly characterized, are relatively unimportant and no complex extraction equipment is necessary. In combination with a materials processing facility at the ISS, AsterAnts might inaugurate an era of large-scale orbital construction using extraterrestrial materials.

  2. Volume Computation of a Stockpile - a Study Case Comparing GPS and Uav Measurements in AN Open Pit Quarry

    NASA Astrophysics Data System (ADS)

    Raeva, P. L.; Filipova, S. L.; Filipov, D. G.

    2016-06-01

    The following paper aims to test and evaluate the accuracy of UAV data for volumetric measurements compared to conventional GNSS techniques. For this purpose, an appropriate open pit quarry was chosen. Two sets of measurements were performed. Firstly, a stockpile was measured by GNSS technologies, and later other terrestrial GNSS measurements were taken to model the berms of the quarry. Secondly, the area of the whole quarry, including the stockpile site, was mapped by a UAV flight. Given how dynamic our world is, new techniques and methods should be introduced in numerous fields. For instance, the management of an open pit quarry requires acquiring, processing and storing a large amount of information which is constantly changing with time. Fast and precise acquisition of measurements regarding the processes taking place in a quarry is the key to effective and stable maintenance. In other words, this means obtaining objective evaluations of the processes, using up-to-date technologies, and achieving reliable accuracy of the results. Legislation concerning mine engineering often states that volumetric calculations must be accurate to within ±3% of the whole amount. On one hand, extremely precise measurements can be performed by GNSS technologies; however, this can be very time consuming. On the other hand, UAV photogrammetry presents a fast, accurate method for mapping large areas and calculating stockpile volumes. The study case was performed as part of a master thesis.
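
    As an illustration of the volumetric computation being compared above (a minimal sketch, not the workflow used in the study; the gridding, cell size, base-surface construction and the example volumes are assumptions), a stockpile volume can be estimated by summing the height difference between the surveyed surface and a base surface over a regular grid, and two such estimates can then be checked against the ±3% tolerance mentioned above:

    ```python
    import numpy as np

    def stockpile_volume(surface, base, cell_size):
        """Estimate stockpile volume (m^3) from two gridded surfaces.

        surface, base : 2-D arrays of elevations (m) on the same regular grid,
                        e.g. a UAV-derived DEM of the pile and an interpolated base surface.
        cell_size     : grid spacing in metres.
        """
        thickness = np.clip(surface - base, 0.0, None)   # ignore cells below the base surface
        return float(thickness.sum() * cell_size ** 2)

    # Hypothetical example: compare two survey-derived volumes against a +/-3 % tolerance
    v_gnss, v_uav = 10250.0, 10480.0                     # m^3, illustrative numbers only
    relative_difference = abs(v_uav - v_gnss) / v_gnss
    print(f"difference: {relative_difference:.1%}")      # ~2.2 % -> within the +/-3 % requirement
    ```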

  3. Coastal erosion as a source of mercury into the marine environment along the Polish Baltic shore.

    PubMed

    Bełdowska, Magdalena; Jędruch, Agnieszka; Łęczyński, Leszek; Saniewska, Dominika; Kwasigroch, Urszula

    2016-08-01

    Climate change in recent years in the southern Baltic has resulted in an increased frequency of extreme natural phenomena (i.e. storms, floods) and an intensification of abrasion processes, which leads to the introduction of large amounts of sedimentary deposits into the marine environment. The aim of this study was to determine the mercury load introduced to the Baltic Sea with deposits crumbling off the cliffs, the parts of the coast that are the most exposed to abrasion. The studies were carried out close to five cliffs located on the Polish coast in the years 2011-2014. The results show that coastal erosion could be an important source of Hg in the marine environment. This process is the third most important route, after riverine and precipitation input, by which Hg may enter the Gulf of Gdańsk. In the Hg budget of the gulf, the load caused by erosion (14.3 kg a(-1)) accounted for 80 % of the wet deposition and was 50 % higher than the amount of mercury introduced with dry deposition. Although the Hg concentration in the cliff deposits was similar to the natural background, due to their large mass this contribution could be significant. In addition, preliminary studies on the impact of coastal erosion on the Hg level in the marine ecosystem have shown that this process may be one of the sources of Hg entering the trophic chain.
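
    The budget figures quoted above imply approximate values for the other deposition routes; a quick back-of-the-envelope check (only the 14.3 kg a-1 erosion load and the two percentages come from the abstract, the derived numbers are ours):

    ```python
    erosion_load = 14.3                    # kg/year, Hg load from cliff erosion (from the abstract)

    # erosion accounted for 80 % of wet deposition  ->  wet deposition ~ erosion / 0.8
    wet_deposition = erosion_load / 0.8    # ~17.9 kg/year (derived here, not stated in the abstract)

    # erosion was 50 % higher than dry deposition   ->  dry deposition ~ erosion / 1.5
    dry_deposition = erosion_load / 1.5    # ~9.5 kg/year (derived here, not stated in the abstract)

    print(f"wet ~ {wet_deposition:.1f} kg/a, dry ~ {dry_deposition:.1f} kg/a")
    ```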

  4. An automated workflow for parallel processing of large multiview SPIM recordings

    PubMed Central

    Schmied, Christopher; Steinbach, Peter; Pietzsch, Tobias; Preibisch, Stephan; Tomancak, Pavel

    2016-01-01

    Summary: Selective Plane Illumination Microscopy (SPIM) makes it possible to image developing organisms in 3D at unprecedented temporal resolution over long periods of time. The resulting massive amounts of raw image data require extensive processing, performed interactively via dedicated graphical user interface (GUI) applications. The consecutive processing steps can be easily automated and the individual time points can be processed independently, which lends itself to trivial parallelization on a high performance computing (HPC) cluster. Here, we introduce an automated workflow for processing large multiview, multichannel, multiillumination time-lapse SPIM data on a single workstation or in parallel on an HPC cluster. The pipeline relies on snakemake to resolve dependencies among consecutive processing steps and can be easily adapted to any cluster environment for processing SPIM data in a fraction of the time required to collect it. Availability and implementation: The code is distributed free and open source under the MIT license http://opensource.org/licenses/MIT. The source code can be downloaded from github: https://github.com/mpicbg-scicomp/snakemake-workflows. Documentation can be found here: http://fiji.sc/Automated_workflow_for_parallel_Multiview_Reconstruction. Contact: schmied@mpi-cbg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26628585
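
    The "trivial parallelization" over independent time points described above can be sketched in a few lines (an illustrative sketch, not the snakemake workflow itself; the per-time-point function and the file layout are hypothetical placeholders):

    ```python
    from multiprocessing import Pool
    from pathlib import Path

    def process_timepoint(tp_path: Path) -> Path:
        """Placeholder for one independent per-time-point step (registration, fusion, ...)."""
        out = tp_path.with_suffix(".processed")
        # ... run the actual per-time-point pipeline here ...
        out.touch()
        return out

    if __name__ == "__main__":
        timepoints = sorted(Path("spim_data").glob("tp_*.tif"))   # hypothetical layout
        with Pool() as pool:                                      # one worker per core
            results = pool.map(process_timepoint, timepoints)     # time points are independent
        print(f"processed {len(results)} time points")
    ```

    On an HPC cluster the same per-time-point independence is expressed as one workflow rule instance per time point, with the cluster scheduler playing the role of the process pool.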

  5. An automated workflow for parallel processing of large multiview SPIM recordings.

    PubMed

    Schmied, Christopher; Steinbach, Peter; Pietzsch, Tobias; Preibisch, Stephan; Tomancak, Pavel

    2016-04-01

    Selective Plane Illumination Microscopy (SPIM) makes it possible to image developing organisms in 3D at unprecedented temporal resolution over long periods of time. The resulting massive amounts of raw image data require extensive processing, performed interactively via dedicated graphical user interface (GUI) applications. The consecutive processing steps can be easily automated and the individual time points can be processed independently, which lends itself to trivial parallelization on a high performance computing (HPC) cluster. Here, we introduce an automated workflow for processing large multiview, multichannel, multiillumination time-lapse SPIM data on a single workstation or in parallel on an HPC cluster. The pipeline relies on snakemake to resolve dependencies among consecutive processing steps and can be easily adapted to any cluster environment for processing SPIM data in a fraction of the time required to collect it. The code is distributed free and open source under the MIT license http://opensource.org/licenses/MIT. The source code can be downloaded from github: https://github.com/mpicbg-scicomp/snakemake-workflows. Documentation can be found here: http://fiji.sc/Automated_workflow_for_parallel_Multiview_Reconstruction. Contact: schmied@mpi-cbg.de Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  6. Minor-element composition and organic carbon content of marine and nonmarine shales of Late Cretaceous age in the western interior of the United States

    USGS Publications Warehouse

    Tourtelot, H.A.

    1964-01-01

    The composition of nonmarine shales of Cretaceous age that contain less than 1 per cent organic carbon is assumed to represent the inherited minor-element composition of clayey sediments delivered to the Cretaceous sea that occupied the western interior region of North America. Differences in minor-element content between these samples and samples of (a) nonmarine carbonaceous shales (1 to 17 per cent organic carbon), (b) nearshore marine shales (less than 1 per cent organic carbon), and (c) offshore marine shales (as much as 8 per cent organic carbon), all of the same age, reveal certain aspects of the role played by clay minerals and organic materials in affecting the minor-element composition of the rocks. The organic carbon in the nonmarine rocks occurs in disseminated coaly plant remains. The organic carbon in the marine rocks occurs predominantly in humic material derived from terrestrial plants. The close similarity in composition between the organic isolates from the marine samples and low-rank coal suggests that the amount of marine organic material in these rocks is small. The minor-element content of the two kinds of nonmarine shales is the same despite the relatively large amount of organic carbon in the carbonaceous shales. The nearshore marine shales, however, contain larger median amounts of arsenic, boron, chromium, vanadium and zinc than do the nonmarine rocks; and the offshore marine shales contain even larger amounts of these elements. Cobalt, molybdenum, lead and zirconium show insignificant differences in median content between the nonmarine and marine rocks, although as much as 25 ppm molybdenum is present in some offshore marine samples. The gallium content is lower in the marine than in the nonmarine samples. Copper and selenium contents of the two kinds of nonmarine rocks and the nearshore marine samples are the same, but those of the offshore samples are larger. In general, arsenic, chromium, copper, molybdenum, selenium, vanadium and zinc are concentrated in those offshore marine samples having the largest amounts of organic carbon, but samples with equal amounts of vanadium, for instance, may differ by a factor of 3 in their amount of organic carbon. Arsenic and molybdenum occur in some samples chiefly in syngenetic pyrite but also are present in relatively large amounts in samples that contain little pyrite. The data on nonmarine carbonaceous shales indicate that organic matter of terrestrial origin in marine shales contributes little to the minor-element content of such rocks. It is possible that marine organic matter, even though seemingly small in amount in marine shales, contributes to the minor-element composition of the shales. In addition to any such contribution, however, the great effectiveness in sorption processes of humic materials in conjunction with clay minerals suggests that such processes must have played an important role as these materials moved from the relatively dilute solutions of the nonmarine environment to the relatively concentrated solution of sea water. The volumes of sea water sufficient to supply for sorption the amounts of most minor elements in the offshore marine samples are insignificant compared to the volumes of water with which the clay and organic matter were in contact during their transportation and sedimentation. Consequently, the chemical characteristics of the environment in which the clay minerals and organic matter accumulated and underwent diagenesis probably were the most important factors in controlling the degree to which sorption processes and the formation of syngenetic minerals affected the final composition of the rocks. © 1969.

  7. Shielding materials for highly penetrating space radiations

    NASA Technical Reports Server (NTRS)

    Kiefer, Richard L.; Orwoll, Robert A.

    1995-01-01

    Interplanetary travel involves the transfer from an Earth orbit to a solar orbit. Once outside the Earth's magnetosphere, the major sources of particulate radiation are solar cosmic rays (SCR's) and galactic cosmic rays (GCR's). Intense fluxes of SCR's come from solar flares and consist primarily of protons with energies up to 1 GeV. The GCR consists of a low flux of nuclei with energies up to 10^10 GeV. About 70 percent of the GCR are protons, but a small fraction (0.6 percent) are nuclei with atomic numbers greater than 10. High energy charged particles (HZE) interact with matter by transferring energy to atomic electrons in a Coulomb process and by reacting with an atomic nucleus. Energy transferred in the first process increases with the square of the atomic number, so particles with high atomic numbers would be expected to lose large amounts of energy by this process. Nuclear reactions produced by HZE particles produce high-energy secondary particles which in turn lose energy to the material. The HZE nuclei are a major concern for radiation protection of humans during interplanetary missions because of the very high specific ionization of both primary and secondary particles. Computer codes have been developed to calculate the deposition of energy by very energetic charged particles in various materials. Calculations show that there is a significant buildup of secondary particles from nuclear fragmentation and Coulomb dissociation processes. A large portion of these particles are neutrons. Since neutrons carry no charge, they only lose energy by collision or reaction with a nucleus. Neutrons with high energies transfer large amounts of energy by inelastic collisions with nuclei. However, as the neutron energy decreases, elastic collisions become much more effective for energy loss. The lighter the nucleus, the greater the fraction of the neutron's kinetic energy that can be lost in an elastic collision. Thus, hydrogen-containing materials such as polymers are most effective in reducing the energy of neutrons. Once neutrons are reduced to very low energies, the probability for undergoing a reaction with a nucleus (the cross section) becomes very high. The product of such a reaction is often radioactive and can involve the release of a significant amount of energy. Thus, it is important to provide protection from low energy neutrons during a long duration space flight. Among the light elements, lithium and boron each have an isotope with a large thermal neutron capture cross section, Li-6 and B-10. However, B-10 is more abundant in the naturally-occurring element than Li-6, has a thermal neutron capture cross section four times that of Li-6, and produces the stable products He-4 and Li-7 in the interaction, while Li-6 produces radioactive tritium (H-3). Thus, boron is the best light-weight material for thermal neutron absorption in spacecraft. The work on this project was focused on two areas: computer design, where existing computer codes were used, and in some cases modified, to calculate the propagation and interactions of high energy charged particles through various media; and materials development, where boron was incorporated into high performance materials.
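
    The statement above that lighter nuclei remove a larger fraction of a neutron's kinetic energy per elastic collision can be quantified with the standard two-body kinematics result (a worked sketch; the formula is textbook kinematics, not taken from the report): the maximum fractional energy transfer to a nucleus of mass number A is 4A/(A+1)^2.

    ```python
    def max_fractional_energy_loss(A: float) -> float:
        """Maximum fraction of a neutron's kinetic energy transferred in a single
        elastic, head-on collision with a nucleus of mass number A (classical kinematics)."""
        return 4.0 * A / (A + 1.0) ** 2

    for name, A in [("hydrogen", 1), ("lithium-6", 6), ("boron-10", 10), ("carbon-12", 12)]:
        print(f"{name:10s}: up to {max_fractional_energy_loss(A):.0%} per collision")
    # hydrogen ~100 %, boron-10 ~33 % -- hence hydrogen-rich polymers moderate neutrons best,
    # while boron-10 is added to capture the neutrons once they have been slowed down.
    ```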

  8. Semantics-based distributed I/O with the ParaMEDIC framework.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balaji, P.; Feng, W.; Lin, H.

    2008-01-01

    Many large-scale applications simultaneously rely on multiple resources for efficient execution. For example, such applications may require both large compute and storage resources; however, very few supercomputing centers can provide large quantities of both. Thus, data generated at the compute site oftentimes has to be moved to a remote storage site for either storage or visualization and analysis. Clearly, this is not an efficient model, especially when the two sites are distributed over a wide-area network. Thus, we present a framework called 'ParaMEDIC: Parallel Metadata Environment for Distributed I/O and Computing' which uses application-specific semantic information to convert the generated data to orders-of-magnitude smaller metadata at the compute site, transfer the metadata to the storage site, and re-process the metadata at the storage site to regenerate the output. Specifically, ParaMEDIC trades a small amount of additional computation (in the form of data post-processing) for a potentially significant reduction in data that needs to be transferred in distributed environments.

  9. Complete physico-chemical treatment for coke plant effluents.

    PubMed

    Ghose, M K

    2002-03-01

    Naturally found coal is converted to coke, which is suitable for metallurgical industries. The large quantities of liquid effluents produced contain large amounts of suspended solids, high COD, BOD, phenols, ammonia and other toxic substances, which cause serious pollution problems in the receiving waters to which they are discharged. There are a large number of coke plants in the vicinity of the Jharia Coal Field (JCF). Characteristics of the effluents have been evaluated. The present effluent treatment systems were found to be inadequate. Physico-chemical treatment has been considered a suitable option for the treatment of coke plant effluents. Ammonia removal by synthetic zeolite and activated carbon for the removal of bacteria, viruses, refractory organics, etc. were utilized and the results are discussed. A scheme has been proposed for complete physico-chemical treatment, which can be suitably adopted for the recycling, reuse and safe disposal of the treated effluent. The various unit processes and unit operations involved in the treatment system are discussed. The process may be useful on an industrial scale at various sites.

  10. Formation of Large-scale Coronal Loops Interconnecting Two Active Regions through Gradual Magnetic Reconnection and an Associated Heating Process

    NASA Astrophysics Data System (ADS)

    Du, Guohui; Chen, Yao; Zhu, Chunming; Liu, Chang; Ge, Lili; Wang, Bing; Li, Chuanyang; Wang, Haimin

    2018-06-01

    Coronal loops interconnecting two active regions (ARs), called interconnecting loops (ILs), are prominent large-scale structures in the solar atmosphere. They carry a significant amount of magnetic flux and therefore are considered to be an important element of the solar dynamo process. Earlier observations showed that eruptions of ILs are an important source of CMEs. It is generally believed that ILs are formed through magnetic reconnection in the high corona (>150″–200″), and several scenarios have been proposed to explain their brightening in soft X-rays (SXRs). However, the detailed IL formation process has not been fully explored, and the associated energy release in the corona still remains unresolved. Here, we report the complete formation process of a set of ILs connecting two nearby ARs, with successive observations by STEREO-A on the far side of the Sun and by SDO and Hinode on the Earth side. We conclude that ILs are formed by gradual reconnection high in the corona, in line with earlier postulations. In addition, we show evidence that ILs brighten in SXRs and EUVs through heating at or close to the reconnection site in the corona (i.e., through the direct heating process of reconnection), a process that has been largely overlooked in earlier studies of ILs.

  11. A report on the ST ScI optical disk workstation

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The STScI optical disk project was designed to explore the options, opportunities and problems presented by optical disk technology, and to see if optical disks are a viable and inexpensive means of storing the large amounts of data found in astronomical digital imagery. A separate workstation was purchased on which the development can be done; it also serves as an astronomical image processing computer, incorporating the optical disks into the solution of standard image processing tasks. It is indicated that small workstations can be powerful tools for image processing, and that astronomical image processing may be more conveniently and cost-effectively performed on microcomputers than on mainframe and super-minicomputers. The optical disks provide unique capabilities in data storage.

  12. Framework for Service Composition in G-Lite

    NASA Astrophysics Data System (ADS)

    Goranova, R.

    2011-11-01

    G-Lite is a Grid middleware, currently the main middleware installed on all clusters in Bulgaria. The middleware is used by scientists for solving problems which require a large amount of storage and computational resources. On the other hand, the scientists work with complex processes, where job execution in Grid is just one step of the process. That is why it is strategically important for g-Lite to provide a mechanism for service composition and business process management. Such a mechanism has not yet been specified. In this article we propose a framework for service composition in g-Lite. We discuss business process modeling, deployment and execution in this Grid environment. The examples used to demonstrate the concept are based on some IBM products.

  13. DIMENSION STABILIZED FIXED PHOTOGRAPHIC TYPE EMULSION AND A METHOD FOR PRODUCING SAME

    DOEpatents

    Gilbert, F.C.

    1962-03-13

    A process is given for stabilizing the dimensions of fixed gelatin-base photographic type emulsions containing silver halide, and particularly to such emulsions containing large amounts of silver chloride for use as nuclear track emulsions, so that the dimensions of the final product are the same as or in a predetermined fixed ratio to the dimensions of the emulsions prior to exposure. The process comprises contacting an exposed, fixed emulsion with a solution of wood rosin dissolved in ethyl alcohol for times corresponding to the dimensions desired, and thereafter permitting the alcohol to evaporate. (AEC)

  14. Method of producing a cellulase-containing cell-free fermentate produced from microorganism ATCC 55702

    DOEpatents

    Dees, H. Craig

    1998-01-01

    Bacteria which produce large amounts of cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase producing bacterium (ATCC 55702), which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic waste materials for fuel production, food processing, textile processing, and other industrial applications. ATCC 55702 is an improved bacterial host for genetic manipulations using recombinant DNA techniques, and is less likely to destroy genetic manipulations using standard mutagenesis techniques.

  15. Detergent composition comprising a cellulase containing cell-free fermentate produced from microorganism ATCC 55702 or mutant thereof

    DOEpatents

    Dees, H. Craig

    1998-01-01

    Bacteria which produce large amounts of a cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase producing bacterium (ATCC 55702), which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic waste materials for fuel production, food processing, textile processing, and other industrial applications. ATCC 55702 is an improved bacterial host for genetic manipulations using recombinant DNA techniques, and is less likely to destroy genetic manipulations using standard mutagenesis techniques.

  16. Plasma fractionation issues.

    PubMed

    Farrugia, Albert; Evers, Theo; Falcou, Pierre-Francois; Burnouf, Thierry; Amorim, Luiz; Thomas, Sylvia

    2009-04-01

    Procurement and processing of human plasma for fractionation of therapeutic proteins or biological medicines used in clinical practice is a multi-billion dollar international trade. Together the private sector and public sector (non-profit) provide large amounts of safe and effective therapeutic plasma proteins needed worldwide. The principal therapeutic proteins produced by the dichotomous industry include gamma globulins or immunoglobulins (including pathogen-specific hyperimmune globulins, such as hepatitis B immune globulins) albumin, factor VIII and Factor IX concentrates. Viral inactivation, principally by solvent detergent and other processes, has proven highly effective in preventing transmission of enveloped viruses, viz. HBV, HIV, and HCV.

  17. Production of a raw material for energy production in agriculture

    NASA Astrophysics Data System (ADS)

    Hellstroem, G.

    1980-04-01

    The total amount of energy in products produced by Swedish agriculture was estimated at 80 TWh: 30 TWh for cereals, 15 TWh for grass and leguminosae, and 35 TWh for straw and other agricultural wastes. A large part of this production will be used as food even in the future. New plants that would produce more energy than the ones traditionally grown in Sweden are discussed. Other types of energy from agriculture are also discussed, such as methane from manure, methanol from gasification processes, and ethanol from fermentative processes. Costs were estimated for the different alternatives.

  18. Cellulase-containing cell-free fermentate produced from microorganism ATCC 55702

    DOEpatents

    Dees, H.C.

    1997-12-16

    Bacteria which produce large amounts of cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase producing bacterium (ATCC 55702), which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic waste materials for fuel production, food processing, textile processing, and other industrial applications. ATCC 55702 is an improved bacterial host for genetic manipulations using recombinant DNA techniques, and is less likely to destroy genetic manipulations using standard mutagenesis techniques. 5 figs.

  19. Method of producing a cellulase-containing cell-free fermentate produced from microorganism ATCC 55702

    DOEpatents

    Dees, H.C.

    1998-05-26

    Bacteria which produce large amounts of cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase producing bacterium (ATCC 55702), which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic waste materials for fuel production, food processing, textile processing, and other industrial applications. ATCC 55702 is an improved bacterial host for genetic manipulations using recombinant DNA techniques, and is less likely to destroy genetic manipulations using standard mutagenesis techniques. 5 figs.

  20. Automatic building identification under bomb damage conditions

    NASA Astrophysics Data System (ADS)

    Woodley, Robert; Noll, Warren; Barker, Joseph; Wunsch, Donald C., II

    2009-05-01

    Given the vast amount of image intelligence utilized in support of planning and executing military operations, a passive automated image processing capability for target identification is urgently required. Furthermore, transmitting large image streams from remote locations would quickly use the available bandwidth (BW), precipitating the need for processing to occur at the sensor location. This paper addresses the problem of automatic target recognition for battle damage assessment (BDA). We utilize an Adaptive Resonance Theory approach to cluster templates of target buildings. The results show that the network successfully classifies targets from non-targets in a virtual test bed environment.

  1. The Kepler End-to-End Model: Creating High-Fidelity Simulations to Test Kepler Ground Processing

    NASA Technical Reports Server (NTRS)

    Bryson, Stephen T.; Jenkins, Jon M.; Peters, Dan J.; Tenenbaum, Peter P.; Klaus, Todd C.; Gunter, Jay P.; Cote, Miles T.; Caldwell, Douglas A.

    2010-01-01

    The Kepler mission is designed to detect the transit of Earth-like planets around Sun-like stars by observing 100,000 stellar targets. Developing and testing the Kepler ground-segment processing system, in particular the data analysis pipeline, requires high-fidelity simulated data. This simulated data is provided by the Kepler End-to-End Model (ETEM). ETEM simulates the astrophysics of planetary transits and other phenomena, properties of the Kepler spacecraft and the format of the downlinked data. Major challenges addressed by ETEM include the rapid production of large amounts of simulated data, extensibility and maintainability.

  2. Biochar for composting improvement and contaminants reduction. A review.

    PubMed

    Godlewska, Paulina; Schmidt, Hans Peter; Ok, Yong Sik; Oleszczuk, Patryk

    2017-12-01

    Biochar is characterised by a large specific surface area, porosity, and a large amount of functional groups. All of these features make biochar a potentially good material for optimising the composting process and the quality of the final compost. The objective of this study was to compile the current knowledge on the possibility of biochar application in the composting process and on the effect of biochar on compost properties and on the content of contaminants in compost. The paper presents the effect of biochar on compost maturity indices, composting temperature and moisture, and also on the content and bioavailability of nutrients and of organic and inorganic contaminants. The paper also notes the effect of biochar added to composted material on plants, microorganisms and soil invertebrates. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Runaway electrons as a source of impurity and reduced fusion yield in the dense plasma focus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lerner, Eric J.; Yousefi, Hamid R.

    2014-10-15

    Impurities produced by the vaporization of metals in the electrodes may be a major cause of reduced fusion yields in high-current dense plasma focus devices. We propose here that a major, but hitherto-overlooked, cause of such impurities is vaporization by runaway electrons during the breakdown process at the beginning of the current pulse. This process is sufficient to account for the large amount of erosion observed in many dense plasma focus devices on the anode very near to the insulator. The erosion is expected to become worse with lower pressures, typical of machines with large electrode radii, and would explain the plateauing of fusion yield observed in such machines at higher peak currents. Such runaway electron vaporization can be eliminated by the proper choice of electrode material, by reducing electrode radii and thus increasing fill gas pressure, or by using pre-ionization to eliminate the large fields that create runaway electrons. If these steps are combined with monolithic electrodes to eliminate arcing erosion, large reductions in impurities and large increases in fusion yield may be obtained, as the I^4 scaling is extended to higher currents.

  4. Improving the scalability of hyperspectral imaging applications on heterogeneous platforms using adaptive run-time data compression

    NASA Astrophysics Data System (ADS)

    Plaza, Antonio; Plaza, Javier; Paz, Abel

    2010-10-01

    Latest generation remote sensing instruments (called hyperspectral imagers) are now able to generate hundreds of images, corresponding to different wavelength channels, for the same area on the surface of the Earth. In previous work, we have reported that the scalability of parallel processing algorithms dealing with these high-dimensional data volumes is affected by the amount of data to be exchanged through the communication network of the system. However, large messages are common in hyperspectral imaging applications since processing algorithms are pixel-based, and each pixel vector to be exchanged through the communication network is made up of hundreds of spectral values. Thus, decreasing the amount of data to be exchanged could improve the scalability and parallel performance. In this paper, we propose a new framework based on intelligent utilization of wavelet-based data compression techniques for improving the scalability of a standard hyperspectral image processing chain on heterogeneous networks of workstations. This type of parallel platform is quickly becoming a standard in hyperspectral image processing due to the distributed nature of collected hyperspectral data as well as its flexibility and low cost. Our experimental results indicate that adaptive lossy compression can lead to improvements in the scalability of the hyperspectral processing chain without sacrificing analysis accuracy, even at sub-pixel precision levels.
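
    A minimal sketch of the idea of lossy-compressing pixel vectors before exchanging them (assuming the PyWavelets package; the wavelet, decomposition level and the fraction of retained coefficients are illustrative choices, not the parameters used by the authors):

    ```python
    import numpy as np
    import pywt

    def compress_pixel_vector(spectrum, wavelet="db4", keep=0.2):
        """Lossy-compress one hyperspectral pixel vector by keeping only the largest
        `keep` fraction of wavelet coefficients (the rest are zeroed before transfer)."""
        coeffs = pywt.wavedec(spectrum, wavelet)
        flat = np.concatenate(coeffs)
        thresh = np.quantile(np.abs(flat), 1.0 - keep)
        return [np.where(np.abs(c) >= thresh, c, 0.0) for c in coeffs]

    def decompress_pixel_vector(coeffs, wavelet="db4", n=None):
        rec = pywt.waverec(coeffs, wavelet)
        return rec if n is None else rec[:n]

    spectrum = np.random.rand(224)                 # e.g. a 224-band pixel vector (illustrative)
    c = compress_pixel_vector(spectrum)
    approx = decompress_pixel_vector(c, n=spectrum.size)
    ```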

  5. Earth Science Data Analytics: Preparing for Extracting Knowledge from Information

    NASA Technical Reports Server (NTRS)

    Kempler, Steven; Barbieri, Lindsay

    2016-01-01

    Data analytics is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations and other useful information. Data analytics is a broad term that includes data analysis, as well as an understanding of the cognitive processes an analyst uses to understand problems and explore data in meaningful ways. Analytics also includes data extraction, transformation, and reduction, utilizing specific tools, techniques, and methods. Turning to data science, definitions of data science sound very similar to those of data analytics (which leads to a lot of the confusion between the two). But the skills needed for both, co-analyzing large amounts of heterogeneous data, understanding and utilizing relevant tools and techniques, and subject matter expertise, although similar, serve different purposes. Data analytics takes a practitioner's approach, applying expertise and skills to solve issues and gain subject knowledge. Data science is more theoretical (research in itself) in nature, providing strategic actionable insights and new innovative methodologies. Earth Science Data Analytics (ESDA) is the process of examining, preparing, reducing, and analyzing large amounts of spatial (multi-dimensional), temporal, or spectral data using a variety of data types to uncover patterns, correlations and other information, to better understand our Earth. The large variety of datasets (temporal and spatial differences, data types, formats, etc.) invites the need for data analytics skills that combine an understanding of the science domain with data preparation, reduction, and analysis techniques, from a practitioner's point of view. The application of these skills to ESDA is the focus of this presentation. The Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster was created in recognition of the practical need to facilitate the co-analysis of large amounts of data and information for Earth science. Thus, from the point of view of advancing science: on the continuum of ever evolving data management systems, we need to understand and develop ways that allow the variety of data relationships to be examined, and information to be manipulated, such that knowledge can be enhanced, to facilitate science. Recognizing the importance and potential impacts of the unlimited ways to co-analyze heterogeneous datasets, now and especially in the future, one of the objectives of the ESDA cluster is to facilitate the preparation of individuals to understand and apply the needed skills to Earth science data analytics. Pinpointing and communicating the needed skills and expertise is new, and not easy. Information technology is just beginning to provide the tools for advancing the analysis of heterogeneous datasets in a big way, thus providing the opportunity to discover unobvious scientific relationships previously invisible to the science eye. And it is not easy: it takes individuals, or teams of individuals, with just the right combination of skills to understand the data and develop the methods to glean knowledge out of data and information. In addition, whereas definitions of data science and big data are (more or less) available (summarized in Reference 5), Earth science data analytics is virtually ignored in the literature (barring a few excellent sources).

  6. Brine flow in heated geologic salt.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuhlman, Kristopher L.; Malama, Bwalya

    This report is a summary of the physical processes, primary governing equations, solution approaches, and historic testing related to brine migration in geologic salt. Although most information presented in this report is not new, we synthesize a large amount of material scattered across dozens of laboratory reports, journal papers, conference proceedings, and textbooks. We present a mathematical description of the governing brine flow mechanisms in geologic salt. We outline the general coupled thermal, multi-phase hydrologic, and mechanical processes. We derive these processes' governing equations, which can be used to predict brine flow. These equations are valid under a wide variety of conditions applicable to radioactive waste disposal in rooms and boreholes excavated into geologic salt.

  7. Information retrieval from holographic interferograms: Fundamentals and problems

    NASA Technical Reports Server (NTRS)

    Vest, Charles M.

    1987-01-01

    Holographic interferograms can contain large amounts of information about flow and temperature fields. Their information content can be very high because they can be viewed from many different directions. This multidirectionality and fringe localization add to the information contained in the fringe pattern if diffuse illumination is used. Additional information and increased accuracy can be obtained through the use of dual reference wave holography to add reference fringes or to effect discrete phase shift or heterodyne interferometry. Automated analysis of fringes is possible if interferograms are of simple structure and good quality. However, in practice a large number of practical problems can arise, so that a difficult image processing task results.

  8. Cryogenic and radiation hard ASIC design for large format NIR/SWIR detector

    NASA Astrophysics Data System (ADS)

    Gao, Peng; Dupont, Benoit; Dierickx, Bart; Müller, Eric; Verbruggen, Geert; Gielis, Stijn; Valvekens, Ramses

    2014-10-01

    An ASIC has been developed for the control and data quantization of large format NIR/SWIR detector arrays. Both cryogenic operation and the space radiation environment were considered during the design. The ASIC can therefore be integrated inside the cryogenic chamber, which significantly reduces the large number of long wires going in and out of the cryogenic chamber, i.e. it benefits EMI and noise performance, as well as the power consumption of the cooling system and interfacing circuits. In this paper, we describe the development of this prototype ASIC for image sensor driving and signal processing, as well as its testing at both room and cryogenic temperatures.

  9. A new method of edge detection for object recognition

    USGS Publications Warehouse

    Maddox, Brian G.; Rhew, Benjamin

    2004-01-01

    Traditional edge detection systems function by returning every edge in an input image. This can result in a large amount of clutter and make certain vectorization algorithms less accurate. Accuracy problems can then have a large impact on automated object recognition systems that depend on edge information. A new method of directed edge detection can be used to limit the number of edges returned based on a particular feature. This results in a cleaner image that is easier for vectorization. Vectorized edges from this process could then feed an object recognition system where the edge data would also contain information as to what type of feature it bordered.

  10. Optimal Full Information Synthesis for Flexible Structures Implemented on Cray Supercomputers

    NASA Technical Reports Server (NTRS)

    Lind, Rick; Balas, Gary J.

    1995-01-01

    This paper considers an algorithm for synthesis of optimal controllers for full information feedback. The synthesis procedure reduces to a single linear matrix inequality which may be solved via established convex optimization algorithms. The computational cost of the optimization is investigated. It is demonstrated that the problem dimension and corresponding matrices can become large for practical engineering problems. For large-order systems, this process is impractical on standard workstations. A flexible structure is presented as a design example. Control synthesis requires several days on a workstation but may be solved in a reasonable amount of time using a Cray supercomputer.
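
    As an illustration of solving a single linear matrix inequality with an off-the-shelf convex solver (a sketch only; this is a plain Lyapunov-stability LMI, not the full-information synthesis LMI of the paper, and it assumes the cvxpy package with an SDP-capable solver such as SCS):

    ```python
    import numpy as np
    import cvxpy as cp

    # A small stable system matrix (illustrative only)
    A = np.array([[0.0, 1.0],
                  [-2.0, -0.5]])
    n = A.shape[0]

    P = cp.Variable((n, n), symmetric=True)
    constraints = [P >> np.eye(n),                        # P positive definite
                   A.T @ P + P @ A << -1e-6 * np.eye(n)]  # Lyapunov inequality
    prob = cp.Problem(cp.Minimize(0), constraints)        # pure feasibility problem
    prob.solve()
    print(prob.status, P.value)
    ```

    The problem dimension grows with the state dimension of the plant, which is why the matrices become large for flexible structures and motivated the supercomputer implementation discussed above.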

  11. Mission to Planet Earth

    NASA Technical Reports Server (NTRS)

    Wilson, Gregory S.; Huntress, Wesley T.

    1990-01-01

    The rationale behind Mission to Planet Earth is presented, and the program plan is described in detail. NASA and its interagency and international partners will place satellites carrying advanced sensors in strategic earth orbits to collect multidisciplinary data. A sophisticated data system will process and archive an unprecedentedly large amount of information about the earth and how it functions as a system. Attention is given to the space observatories, the data and information systems, and the interdisciplinary research.

  12. The future of consumer cameras

    NASA Astrophysics Data System (ADS)

    Battiato, Sebastiano; Moltisanti, Marco

    2015-03-01

    In the last two decades multimedia, and in particular imaging devices (camcorders, tablets, mobile phones, etc.), have become dramatically widespread. Moreover, their increasing computational performance, combined with higher storage capacity, allows them to process large amounts of data. In this paper an overview of the current trends in the consumer camera market and technology is given, along with some details about the recent past (from the Digital Still Camera up to today) and forthcoming key issues.

  13. The ROK Army’s Role When North Korea Collapses Without a War with the ROK

    DTIC Science & Technology

    2001-02-01

    produced large amounts of biological and chemical weapons. In addition, North Korea continues to develop nuclear weapons and missile technology and export...process. 6. Security and safe disposal of WMD. This includes research, production and storage facilities for nuclear, biological and chemical weapons...Publishers, 1989. Naisbitt, John. Megatrends Asia: Eight Asian Megatrends That Are Reshaping Our World, New York: Simon and Schuster. 1996. The New

  14. Rational calculation accuracy in acousto-optical matrix-vector processor

    NASA Astrophysics Data System (ADS)

    Oparin, V. V.; Tigin, Dmitry V.

    1994-01-01

    The high speed of parallel computations for a comparatively small-size processor and acceptable power consumption makes the usage of acousto-optic matrix-vector multiplier (AOMVM) attractive for processing of large amounts of information in real time. The limited accuracy of computations is an essential disadvantage of such a processor. The reduced accuracy requirements allow for considerable simplification of the AOMVM architecture and the reduction of the demands on its components.

  15. Balancing Information Analysis and Decision Value: A Model to Exploit the Decision Process

    DTIC Science & Technology

    2011-12-01

    technical intelligence e.g. signals and sensors (SIGINT and MASINT), imagery (IMINT), as well as human and open source intelligence (HUMINT and OSINT ...Clark 2006). The ability to capture large amounts of data and the plenitude of modern intelligence information sources provides a rich cache of...many techniques for managing information collected and derived from these sources, the exploitation of intelligence assets for decision-making

  16. Weymouth Fore River, Weymouth, Braintree, Massachusetts, Small Navigation Project. Detailed Project Report and Environmental Assessment.

    DTIC Science & Technology

    1981-02-01

    and all considered sites were beyond normal hydraulic pumping system capability. Marsh creation requires a large amount of land area which is...boating. Mechanical dredging is planned (as opposed to hydraulic) because open water disposal of dredged sediments is the preferred alternative...river the above process would be complicated in several ways. First, because the material would have to be hydraulically disposed of at the temporary

  17. Metal-assisted exfoliation (MAE): green, roll-to-roll compatible method for transferring graphene to flexible substrates

    NASA Astrophysics Data System (ADS)

    Zaretski, Aliaksandr V.; Moetazedi, Herad; Kong, Casey; Sawyer, Eric J.; Savagatrup, Suchol; Valle, Eduardo; O'Connor, Timothy F.; Printz, Adam D.; Lipomi, Darren J.

    2015-01-01

    Graphene is expected to play a significant role in future technologies that span a range from consumer electronics, to devices for the conversion and storage of energy, to conformable biomedical devices for healthcare. To realize these applications, however, a low-cost method of synthesizing large areas of high-quality graphene is required. Currently, the only method to generate large-area single-layer graphene that is compatible with roll-to-roll manufacturing destroys approximately 300 kg of copper foil (thickness = 25 μm) for every 1 g of graphene produced. This paper describes a new environmentally benign and scalable process of transferring graphene to flexible substrates. The process is based on the preferential adhesion of certain thin metallic films to graphene; separation of the graphene from the catalytic copper foil is followed by lamination to a flexible target substrate in a process that is compatible with roll-to-roll manufacturing. The copper substrate is indefinitely reusable and the method is substantially greener than the current process that uses relatively large amounts of corrosive etchants to remove the copper. The sheet resistance of the graphene produced by this new process is unoptimized but should be comparable in principle to that produced by the standard method, given the defects observable by Raman spectroscopy and the presence of process-induced cracks. With further improvements, this green, inexpensive synthesis of single-layer graphene could enable applications in flexible, stretchable, and disposable electronics, low-profile and lightweight barrier materials, and in large-area displays and photovoltaic modules.

  18. Coupling a basin erosion and river sediment transport model into a large scale hydrological model: an application in the Amazon basin

    NASA Astrophysics Data System (ADS)

    Buarque, D. C.; Collischonn, W.; Paiva, R. C. D.

    2012-04-01

    This study presents the first application and preliminary results of the large scale hydrodynamic/hydrological model MGB-IPH with a new module to predict the spatial distribution of basin erosion and river sediment transport at a daily time step. MGB-IPH is a large-scale, distributed and process-based hydrological model that uses a catchment-based discretization and the Hydrological Response Units (HRU) approach. It uses physically based equations to simulate the hydrological processes, such as the Penman-Monteith model for evapotranspiration, and uses the Muskingum-Cunge approach and a full 1D hydrodynamic model for river routing, including backwater effects and seasonal flooding. The sediment module of the MGB-IPH model is divided into two components: 1) prediction of erosion over the basin and sediment yield to the river network; 2) sediment transport along the river channels. Both MGB-IPH and the sediment module use GIS tools to display relevant maps and to extract parameters from the SRTM DEM (a 15" resolution was adopted). Using the catchment discretization, the sediment module applies the Modified Universal Soil Loss Equation to predict soil loss from each HRU, considering three sediment classes defined according to the soil texture: sand, silt and clay. The effects of topography on soil erosion are estimated by a two-dimensional slope length (LS) factor using the contributing area approach and a local slope steepness (S), both estimated for each DEM pixel using GIS algorithms. The amount of sediment released to the catchment river reach each day is calculated using a linear reservoir. Once the sediments reach the river, they are transported along the river channel using an advection equation for silt and clay and a sediment continuity equation for sand. A sediment balance based on the Yang sediment transport capacity, which allows computing the amount of erosion and deposition along the rivers, is performed for sand particles as bed load, whilst no erosion or deposition is allowed for silt and clay. The model was first applied to the Madeira River basin, one of the major tributaries of the Amazon River (~1.4*10^6 km2), which accounts for 35% of the suspended sediment transported annually by the Amazon River to the ocean. Model results agree with observed data, mainly at monthly and annual time scales. The spatial distribution of soil erosion within the basin showed a large amount of sediment being delivered from the Andean regions of Bolivia and Peru. The spatial distribution of mean annual sediment along the rivers showed that the Madre de Dios, Mamoré and Beni rivers transport the major amount of sediment. Simulated daily suspended solid discharge agrees with observed data. The model is able to provide temporally and spatially distributed estimates of soil loss over the basin, locations with a tendency for erosion or deposition along the rivers, and long-term sediment yield at several locations. Although model results are encouraging, further effort is needed to validate the model considering the scarcity of data at large scale.
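
    A minimal sketch of the pixel-wise two-dimensional LS factor mentioned above, using the common contributing-area formulation (the exponents and the 22.13 m reference length follow the widely used Moore & Burch form and are assumptions here, not the exact coefficients of the model):

    ```python
    import numpy as np

    def ls_factor(specific_area, slope_rad, m=0.4, n=1.3):
        """Two-dimensional LS factor per DEM pixel (Moore & Burch style formulation).

        specific_area : upslope contributing area per unit contour width (m)
        slope_rad     : local slope steepness (radians)
        """
        return ((specific_area / 22.13) ** m) * ((np.sin(slope_rad) / 0.0896) ** n)

    # Example: one pixel with 500 m of specific contributing area on a 5-degree slope
    print(ls_factor(500.0, np.deg2rad(5.0)))
    ```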

  19. Bayesian cloud detection for MERIS, AATSR, and their combination

    NASA Astrophysics Data System (ADS)

    Hollstein, A.; Fischer, J.; Carbajal Henken, C.; Preusker, R.

    2014-11-01

    A broad range of different Bayesian cloud detection schemes is applied to measurements from the Medium Resolution Imaging Spectrometer (MERIS), the Advanced Along-Track Scanning Radiometer (AATSR), and their combination. The cloud masks were designed to be numerically efficient and suited for the processing of large amounts of data. Results from the classical and naive approaches to Bayesian cloud masking are discussed for MERIS and AATSR as well as for their combination. A sensitivity study on the resolution of multidimensional histograms, which were post-processed by Gaussian smoothing, shows how theoretically insufficient amounts of truth data can be used to set up accurate classical Bayesian cloud masks. Sets of exploited features from single and derived channels are numerically optimized and results for naive and classical Bayesian cloud masks are presented. The application of the Bayesian approach is discussed in terms of reproducing existing algorithms, enhancing existing algorithms, increasing the robustness of existing algorithms, and setting up new classification schemes based on manually classified scenes.
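
    A minimal sketch of the naive Bayesian formulation referred to above (per-feature histograms stand in for the smoothed multidimensional histograms of the paper; the bin edges, priors, class labels and feature set are illustrative, not those used by the authors):

    ```python
    import numpy as np

    class NaiveBayesCloudMask:
        """Naive Bayesian cloud mask: per-feature histograms approximate p(feature | class)."""

        def __init__(self, bin_edges):
            self.bin_edges = bin_edges          # list of 1-D edge arrays, one per feature
            self.hist = {}                      # (class, feature index) -> normalised histogram
            self.prior = {}

        def fit(self, X, y):
            classes, counts = np.unique(y, return_counts=True)
            for c, n_c in zip(classes, counts):
                self.prior[c] = n_c / len(y)
                for j, edges in enumerate(self.bin_edges):
                    h, _ = np.histogram(X[y == c, j], bins=edges)
                    self.hist[(c, j)] = (h + 1) / (h.sum() + len(h))   # Laplace smoothing

        def predict_proba_cloud(self, X, cloud=1, clear=0):
            log_post = {}
            for c in (cloud, clear):
                lp = np.log(self.prior[c]) * np.ones(len(X))
                for j, edges in enumerate(self.bin_edges):
                    idx = np.clip(np.digitize(X[:, j], edges) - 1, 0, len(edges) - 2)
                    lp += np.log(self.hist[(c, j)][idx])
                log_post[c] = lp
            m = np.maximum(log_post[cloud], log_post[clear])
            pc = np.exp(log_post[cloud] - m)
            return pc / (pc + np.exp(log_post[clear] - m))
    ```

    The classical (non-naive) variant replaces the product of per-feature histograms with a single multidimensional histogram over all features, which is where the resolution and smoothing trade-offs studied in the paper come in.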

  20. Effects of endogenous small molecular compounds on the rheological properties, texture and microstructure of soymilk coagulum: Removal of phytate using ultrafiltration.

    PubMed

    Wang, Ruican; Guo, Shuntang

    2016-11-15

    This study aims to clarify the roles played by endogenous small molecular components in the soymilk coagulation process and in the properties of the resulting gels. Soymilk samples with decreasing levels of small molecules were prepared by ultrafiltration to reduce the amount of phytate and salts. The CaSO4-induced coagulation process was analyzed using rheological methods. Results showed that removal of free small molecules decreased the activation energy of protein coagulation, resulting in an accelerated reaction and increased gel strength. However, too fast a reaction led to a drop in storage modulus (G'). Microscopic observation suggested that accelerated coagulation generated a coarse and non-uniform gel network with large pores. This network could not hold much water, leading to serious syneresis. Endogenous small molecules in soymilk are thus vital to a fine gel structure. The coagulation rate could be controlled by adjusting the amount of small molecules to obtain tofu products with the optimal texture. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Automatic Feature Extraction from Planetary Images

    NASA Technical Reports Server (NTRS)

    Troglio, Giulia; Le Moigne, Jacqueline; Benediktsson, Jon A.; Moser, Gabriele; Serpico, Sebastiano B.

    2010-01-01

    With the launch of several planetary missions in the last decade, a large number of planetary images have already been acquired and many more will become available for analysis in the coming years. The image data need to be analyzed, preferably by automatic processing techniques, because of the huge amount of data. Although many automatic feature extraction methods have been proposed and utilized for Earth remote sensing images, these methods are not always applicable to planetary data, which often present low contrast and uneven illumination characteristics. Different methods have already been presented for crater extraction from planetary images, but the detection of other types of planetary features has not been addressed yet. Here, we propose a new unsupervised method for the extraction of different features from the surface of the analyzed planet, based on the combination of several image processing techniques, including watershed segmentation and the generalized Hough Transform. The method has many applications, among which is image registration, and it can be applied to arbitrary planetary images.
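
    One of the building blocks named above, watershed segmentation, can be sketched in a few lines with scikit-image (a generic marker-based watershed sketch under assumed thresholds, not the authors' pipeline, which combines it with the generalized Hough Transform):

    ```python
    import numpy as np
    from skimage.filters import sobel
    from skimage.segmentation import watershed

    def segment_surface(image, low=0.2, high=0.8):
        """Very small sketch of marker-based watershed segmentation.

        image     : 2-D float array scaled to [0, 1] (e.g. a normalized planetary image).
        low, high : illustrative intensity thresholds used to seed background/feature markers.
        """
        elevation = sobel(image)                 # gradient magnitude as the 'elevation' map
        markers = np.zeros(image.shape, dtype=int)
        markers[image < low] = 1                 # assumed background seeds
        markers[image > high] = 2                # assumed feature seeds
        return watershed(elevation, markers)     # label image: 1 = background, 2 = feature

    labels = segment_surface(np.random.rand(128, 128))
    ```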

  2. Swarm motility inhibitory and antioxidant activities of pomegranate peel processed under three drying conditions.

    PubMed

    John, K M Maria; Bhagwat, Arvind A; Luthria, Devanand L

    2017-11-15

    During processing of ready-to-eat fresh fruits, large amounts of peel and seeds are discarded as waste. Pomegranate (Punica granatum) peels contain high amounts of bioactive compounds which inhibit migration of Salmonella on wet surfaces. The metabolic distribution of bioactives in pomegranate peel, inner membrane, and edible aril portion was investigated under three different drying conditions, along with the anti-swarming activity against Citrobacter rodentium. Based on the multivariate analysis, 29 metabolites discriminated the pomegranate peel, inner membrane, and edible aril portion, as well as the three different drying methods. Punicalagins (∼38.6-50.3 mg/g) were detected in higher quantities in all fractions as compared to ellagic acid (∼0.1-3.2 mg/g) and punicalins (∼0-2.4 mg/g). The bioactivity (antioxidant, anti-swarming) and phenolics content were significantly higher in peels than in the edible aril portion. Natural anti-swarming agents from food waste may have promising potential for controlling food-borne pathogens. Published by Elsevier Ltd.

  3. A Framework for Real-Time Collection, Analysis, and Classification of Ubiquitous Infrasound Data

    NASA Astrophysics Data System (ADS)

    Christe, A.; Garces, M. A.; Magana-Zook, S. A.; Schnurr, J. M.

    2015-12-01

    Traditional infrasound arrays are generally expensive to install and maintain. There are ~10^3 infrasound channels on Earth today. The amount of data currently provided by legacy architectures can be processed on a modest server. However, the growing availability of low-cost, ubiquitous, and dense infrasonic sensor networks presents a substantial increase in the volume, velocity, and variety of data flow. Initial data from a prototype ubiquitous global infrasound network is already pushing the boundaries of traditional research server and communication systems, in particular when serving data products over heterogeneous, international network topologies. We present a scalable, cloud-based approach for capturing and analyzing large amounts of dense infrasonic data (>10^6 channels). We utilize Akka actors with WebSockets to maintain data connections with infrasound sensors. Apache Spark provides streaming, batch, machine learning, and graph processing libraries which will permit signature classification, cross-correlation, and other analytics in near real time. This new framework and approach provide significant advantages in scalability and cost.
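
    One of the analytics mentioned above, cross-correlation between channels, reduces to a short computation (a sketch with synthetic signals; the 80 Hz sampling rate and the signal model are assumptions, and in the described system the same operation would run inside the streaming framework rather than on a single machine):

    ```python
    import numpy as np
    from scipy.signal import correlate, correlation_lags

    def peak_lag_seconds(a, b, fs):
        """Lag (s) at which the cross-correlation of two channels peaks.

        The sign convention follows scipy.signal.correlation_lags for correlate(a, b).
        """
        a = (a - a.mean()) / a.std()
        b = (b - b.mean()) / b.std()
        xcorr = correlate(a, b, mode="full")
        lags = correlation_lags(len(a), len(b), mode="full")
        return lags[np.argmax(xcorr)] / fs

    fs = 80.0                                   # Hz, assumed sampling rate
    t = np.arange(0.0, 60.0, 1.0 / fs)
    ch1 = np.sin(2 * np.pi * 0.5 * t) + 0.1 * np.random.randn(t.size)
    ch2 = np.roll(ch1, 40)                      # second channel shifted by 40 samples (0.5 s)
    print(peak_lag_seconds(ch1, ch2, fs))
    ```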

  4. Meat consumption and cancer risk: a critical review of published meta-analyses.

    PubMed

    Lippi, Giuseppe; Mattiuzzi, Camilla; Cervellin, Gianfranco

    2016-01-01

    Dietary habits play a substantial role in increasing or reducing cancer risk. We performed a critical review of the scientific literature to describe the findings of meta-analyses that explored the association between meat consumption and cancer risk. Overall, 42 eligible meta-analyses were included in this review, in which meat consumption was assumed from sheer statistics. A convincing association was found between larger intake of red meat and cancer, especially colorectal, lung, esophageal and gastric malignancies. Increased consumption of processed meat was also found to be associated with colorectal, esophageal, gastric and bladder cancers. Enhanced intake of white meat or poultry was found to be negatively associated with some types of cancers. Larger beef consumption was significantly associated with cancer, whereas the risk was not increased by consuming high amounts of pork. Our analysis suggests an increased risk of cancer in subjects consuming large amounts of red and processed meat, but not in those with a high intake of white meat or poultry. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  5. Hadoop-based implementation of processing medical diagnostic records for visual patient system

    NASA Astrophysics Data System (ADS)

    Yang, Yuanyuan; Shi, Liehang; Xie, Zhe; Zhang, Jianguo

    2018-03-01

    We previously introduced the Visual Patient (VP) concept and method for visually representing and indexing patient imaging diagnostic records (IDR) at last year's SPIE Medical Imaging conference (SPIE MI 2017); the approach enables a doctor to review a large amount of IDR for a patient in a limited appointed time slot. Here we present a new approach to designing the data processing architecture of the VP system (VPS) to acquire, process, and store various kinds of IDR and to build a VP instance for each patient in a hospital environment, based on a Hadoop distributed processing structure. We designed this system architecture, called the Medical Information Processing System (MIPS), as a combination of the Hadoop batch processing architecture and the Storm stream processing architecture. The MIPS implements parallel processing of various kinds of clinical data with high efficiency, drawing from disparate hospital information systems such as PACS, RIS, LIS, and HIS.

  6. Formation of ammonia from dinitrogen under primordial conditions

    NASA Astrophysics Data System (ADS)

    Weigand, W.; Dörr, M.; Robl, C.; Kreisel, G.; Grunert, R.; Käßbohrer, J.; Brand, W.; Werner, R.; Popp, J.; Tarcea, N.

    2002-11-01

    Ammonia is one of the basic compounds produced industrially in the largest amounts, leading to a variety of important secondary products. In the chemical industry, ammonia is produced in large amounts via the Haber-Bosch process. In contrast to the industrial process, the nitrogenase enzyme operates in organisms under very mild conditions at atmospheric pressure and ambient temperature. In this article, we describe a method for the synthesis of ammonia from molecular nitrogen using H2S and freshly precipitated iron sulfide as a mediator, thus serving as a primordial inorganic substitute for the enzyme nitrogenase. The reductant, as well as the reaction conditions (atmospheric nitrogen pressure and temperatures on the order of 70-80 °C), are rather mild and therefore comparable to the biological processes. The driving force of the overall reaction is believed to be the oxidation of iron sulfide to iron disulfide and the formation of hydrogen from H2S. The reactions reported in this article may support the theory of an archaic nitrogen-fixing Fe-S cluster.

  7. A New Framework and Prototype Solution for Clinical Decision Support and Research in Genomics and Other Data-intensive Fields of Medicine.

    PubMed

    Evans, James P; Wilhelmsen, Kirk C; Berg, Jonathan; Schmitt, Charles P; Krishnamurthy, Ashok; Fecho, Karamarie; Ahalt, Stanley C

    2016-01-01

    In genomics and other fields, it is now possible to capture and store large amounts of data in electronic medical records (EMRs). However, it is not clear if the routine accumulation of massive amounts of (largely uninterpretable) data will yield any health benefits to patients. Nevertheless, the use of large-scale medical data is likely to grow. To meet emerging challenges and facilitate optimal use of genomic data, our institution initiated a comprehensive planning process that addresses the needs of all stakeholders (e.g., patients, families, healthcare providers, researchers, technical staff, administrators). Our experience with this process and a key genomics research project contributed to the proposed framework. We propose a two-pronged Genomic Clinical Decision Support System (CDSS) that encompasses the concept of the "Clinical Mendeliome" as a patient-centric list of genomic variants that are clinically actionable and introduces the concept of the "Archival Value Criterion" as a decision-making formalism that approximates the cost-effectiveness of capturing, storing, and curating genome-scale sequencing data. We describe a prototype Genomic CDSS that we developed as a first step toward implementation of the framework. The proposed framework and prototype solution are designed to address the perspectives of stakeholders, stimulate effective clinical use of genomic data, drive genomic research, and meet current and future needs. The framework also can be broadly applied to additional fields, including other '-omics' fields. We advocate for the creation of a Task Force on the Clinical Mendeliome, charged with defining Clinical Mendeliomes and drafting clinical guidelines for their use.

  8. Acoustic emission signal processing for rolling bearing running state assessment using compressive sensing

    NASA Astrophysics Data System (ADS)

    Liu, Chang; Wu, Xing; Mao, Jianlin; Liu, Xiaoqin

    2017-07-01

    In the signal processing domain, there has been growing interest in using acoustic emission (AE) signals instead of vibration signals for fault diagnosis and condition assessment, which has been advocated as an effective technique for identifying fracture, crack, or damage. The AE signal has frequency content up to several MHz, which helps avoid interference from other signal sources such as the bearing parts (i.e., rolling elements, rings, and so on) and other rotating parts of the machine. However, AE signals necessitate advanced sampling capabilities and the ability to deal with large amounts of sampled data. In this paper, compressive sensing (CS) is introduced as a processing framework, and a compressive feature extraction method is proposed. We use it to extract compressive features directly from compressively sensed data, and we also prove its energy preservation properties. First, we study AE signals under the CS framework: the sparsity of the AE signal of the rolling bearing is checked, and the observation and reconstruction of the signal are studied. Second, we present a method for extracting the AE compressive feature (AECF) directly from compressively sensed data. We demonstrate the energy preservation properties and the processing of the extracted AECF feature, and we assess the running state of the bearing using the AECF trend. The AECF trend of the running state of rolling bearings is consistent with the trend of traditional features; thus, the method is an effective way to evaluate the running trend of rolling bearings. The experimental results verify that signal processing and condition assessment based on the AECF are simpler, the amount of data required is smaller, and the amount of computation is greatly reduced.
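
    The energy-preservation property mentioned above rests on random projections approximately preserving signal norms. As a hedged sketch (not the authors' exact AECF definition), the code below compresses a synthetic AE-like burst with a random Gaussian measurement matrix and compares an energy feature computed from the compressed measurements with the energy of the raw signal; the burst model, measurement count, and matrix scaling are assumptions for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic AE-like burst: a decaying high-frequency oscillation in noise.
    n = 4096
    t = np.arange(n)
    signal = np.exp(-t / 400.0) * np.sin(2 * np.pi * t / 8.0) + 0.05 * rng.standard_normal(n)

    # Compressive sensing: y = Phi @ x with m << n random Gaussian measurements.
    m = 256
    Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # scaling keeps E[||y||^2] close to ||x||^2
    y = Phi @ signal

    # An energy-style feature computed from the compressed data alone.
    raw_energy = float(np.sum(signal ** 2))
    compressed_energy = float(np.sum(y ** 2))
    print(raw_energy, compressed_energy)  # similar in expectation despite 16x fewer samples
    ```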

  9. Development and Technology of Large Thickness TMCP Steel Plate with 390MPA Grade Used for Engineering Machinery

    NASA Astrophysics Data System (ADS)

    Wang, Xiaoshu; Zhang, Zhijun; Zhang, Peng

    Recently, with the rapid upgrading of equipment in steel companies, TMCP rolling technology has been rapidly developed and widely applied, and a large amount of steel plate has been produced using it. TMCP processes are used more and more widely and are replacing heat treatment processes such as normalizing, quenching, and tempering. In this paper, low production cost is a design constraint, and the steel composition is designed with a low C content, a limited Nb alloying addition, and a certain amount of Mn. During continuous casting, the slab section size is 300 mm × 2400 mm. The TMCP rolling is controlled at lower rolling and red temperatures to control the transformation of the microstructure. Four different rolling treatments are chosen to test their effects on the microstructure and properties of the 390 MPa grade low-carbon bainitic steel. The trials produce a steel plate that fulfills the standard mechanical property requirements. Specifically, low-carbon bainite is observed in the microstructure of the steel plate, and the maximum plate thickness achievable with this TMCP technology is up to 80 mm. The mechanical properties of the steel plate are excellent, and the impact energy KV2 at -40 °C exceeds 200 J. Moreover, production costs are greatly reduced when the steel plate is produced by this TMCP technology in place of the current quenching-and-tempering production process. The low-cost steel plate can well meet the requirements of engineering machinery production in the steel market.

  10. Industrial applications of hot dry rock geothermal energy

    NASA Astrophysics Data System (ADS)

    Duchane, D. V.

    1992-07-01

    Geothermal resources in the form of naturally occurring hot water or steam have been utilized for many years. While these hydrothermal resources are found in many places, the general case is that the rock at depth is hot but does not contain significant amounts of mobile fluid. An extremely large amount of geothermal energy is found around the world in this hot dry rock (HDR). Technology has been under development for more than twenty years at the Los Alamos National Laboratory in the United States and elsewhere to extract the geothermal energy from HDR in a form useful for electricity generation, space heating, or industrial processing. HDR technology is especially attractive for industrial applications because of the ubiquitous distribution of the HDR resource and the unique aspects of the process developed to recover it. In the HDR process, as developed at Los Alamos, water is pumped down a well under high pressure to open up natural joints in hot rock and create an artificial geothermal reservoir. Energy is extracted by circulating water through the reservoir. Pressurized hot water is returned to the surface through the production well, and its thermal energy is extracted for practical use. The same water is then recirculated through the system to mine more geothermal heat. Construction of a pilot HDR facility at Fenton Hill, NM, USA, has recently been completed by the Los Alamos National Laboratory. It consists of a large underground reservoir, a surface plant, and the connecting wellbores. This paper describes HDR technology and the current status of the development program. Novel industrial applications of geothermal energy based on the unique characteristics of the HDR energy extraction process are discussed.

  11. Algorithm for Aligning an Array of Receiving Radio Antennas

    NASA Technical Reports Server (NTRS)

    Rogstad, David

    2006-01-01

    A digital-signal-processing algorithm (somewhat arbitrarily) called SUMPLE has been devised as a means of aligning the outputs of multiple receiving radio antennas in a large array for the purpose of receiving a weak signal transmitted by a single distant source. As used here, aligning signifies adjusting the delays and phases of the outputs from the various antennas so that their relatively weak replicas of the desired signal can be added coherently to increase the signal-to-noise ratio (SNR) for improved reception, as though one had a single larger antenna. The method was devised to enhance spacecraft-tracking and telemetry operations in NASA's Deep Space Network (DSN); the method could also be useful in such other applications as both satellite and terrestrial radio communications and radio astronomy. Heretofore, most commonly, alignment has been effected by a process that involves correlation of signals in pairs. This approach necessitates the use of a large amount of hardware, most notably the N(N - 1)/2 correlators needed to process signals from all possible pairs of N antennas. Moreover, because the incoming signals typically have low SNRs, the delay and phase adjustments are poorly determined from the pairwise correlations. SUMPLE also involves correlations, but the correlations are not performed in pairs. Instead, in a partly iterative process, each signal is appropriately weighted and then correlated with a composite signal equal to the sum of the other signals (see Figure 1). One benefit of this approach is that only N correlators are needed; in an array of N antennas with N much greater than 1, this results in a significant reduction in the amount of hardware. Another benefit is that once the array achieves coherence, the correlation SNR is N - 1 times that of a pair of antennas.
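
    As a hedged illustration of the sum-and-correlate idea (not the DSN implementation), the sketch below iteratively estimates per-antenna phase weights by correlating each channel against the weighted sum of the other channels; the complex-baseband signal model, unit-magnitude weight update, and iteration count are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    n_ant, n_samp = 8, 20_000
    true_phases = rng.uniform(0, 2 * np.pi, n_ant)
    tone = np.exp(1j * 2 * np.pi * 0.01 * np.arange(n_samp))            # weak common signal
    channels = (0.2 * tone[None, :] * np.exp(1j * true_phases)[:, None]
                + (rng.standard_normal((n_ant, n_samp))
                   + 1j * rng.standard_normal((n_ant, n_samp))) / np.sqrt(2))

    weights = np.ones(n_ant, dtype=complex)
    for _ in range(20):
        combined = weights.conj()[:, None] * channels        # channels after phase correction
        total = combined.sum(axis=0)
        for k in range(n_ant):
            reference = total - combined[k]                  # composite sum of the *other* antennas
            corr = np.vdot(reference, channels[k]) / n_samp  # correlate channel k with the composite
            weights[k] = corr / (abs(corr) + 1e-12)          # keep the phase, unit magnitude

    aligned = (weights.conj()[:, None] * channels).sum(axis=0)
    print(abs(np.vdot(tone, aligned)) / n_samp)              # approaches n_ant * 0.2 as channels cohere
    ```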

  12. Recovery of ammonia in digestates of calf manure through a struvite precipitation process using unconventional reagents.

    PubMed

    Siciliano, A; De Rosa, S

    2014-01-01

    Land spreading of digestates causes the discharge of large quantities of nutrients into the environment, which contributes to eutrophication and depletion of dissolved oxygen in water bodies. For the removal of ammonia nitrogen, there is increasing interest in the chemical precipitation of struvite, a mineral that can be reused as a slow-release fertilizer. However, this process is an expensive treatment for digestate because large amounts of magnesium and phosphorus reagents are required. In this paper, a struvite precipitation-based process is proposed for the efficient recovery of digestate nutrients using low-cost reagents. In particular, seawater bittern, a by-product of marine salt manufacturing, and bone meal, a by-product of the thermal treatment of meat waste, have been used as low-cost sources of magnesium and phosphorus, respectively. Once the operating conditions are defined, the process enables the removal of more than 90% of the ammonia load, the almost complete recovery of magnesium and phosphorus, and the production of a potentially valuable precipitate containing struvite crystals.
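
    For reference, the underlying chemistry is the precipitation of struvite (magnesium ammonium phosphate hexahydrate) from roughly equimolar magnesium, ammonium, and phosphate, which is why the bittern (Mg source) and bone meal (P source) doses are set against the ammonia load; the reaction below is the standard textbook form rather than a stoichiometry reported in this particular study.

    ```latex
    \mathrm{Mg^{2+} + NH_4^{+} + PO_4^{3-} + 6\,H_2O \;\longrightarrow\; MgNH_4PO_4 \cdot 6\,H_2O \downarrow}
    ```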

  13. Drell-Yan process as an avenue to test a noncommutative standard model at the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    J, Selvaganapathy; Das, Prasanta Kumar; Konar, Partha

    2016-06-01

    We study the Drell-Yan process at the Large Hadron Collider in the presence of the noncommutative extension of the standard model. Using the Seiberg-Witten map, we calculate the production cross section to first order in the noncommutative parameter Θ_{μν}. Although this idea has been evolving for a long time, only a limited amount of phenomenological analysis has been completed, mostly in the context of the linear collider. A notable feature of this nonminimal noncommutative standard model is that it not only modifies the couplings of the SM production channel but also allows additional nonstandard vertices, which can play a significant role. Hence, in the Drell-Yan process, as studied in the present analysis, one also needs to account for the gluon fusion process at tree level. Some of the characteristic signatures, such as oscillatory azimuthal distributions, are an outcome of the momentum-dependent effective couplings. We explore the noncommutative scale Λ_NC ≥ 0.4 TeV, considering machine energies ranging from 7 to 13 TeV.

  14. TransAtlasDB: an integrated database connecting expression data, metadata and variants

    PubMed Central

    Adetunji, Modupeore O; Lamont, Susan J; Schmidt, Carl J

    2018-01-01

    High-throughput transcriptome sequencing (RNAseq) is the universally applied method for target-free transcript identification and gene expression quantification, generating huge amounts of data. The difficulty of accessing such data and interpreting results can be a major impediment to postulating suitable hypotheses, so an innovative storage solution that addresses limitations such as hard disk storage requirements, efficiency, and reproducibility is paramount. By offering a uniform data storage and retrieval mechanism, various data can be compared and easily investigated. We present a sophisticated system, TransAtlasDB, which incorporates a hybrid architecture of both relational and NoSQL databases for fast and efficient storage, processing, and querying of large datasets from transcript expression analysis with corresponding metadata, as well as gene-associated variants (such as SNPs) and their predicted gene effects. TransAtlasDB provides a data model for accurate storage of the large amount of data derived from RNAseq analysis, together with methods of interacting with the database either through command-line data management workflows, written in Perl, that simplify the storage and manipulation of the massive amounts of data generated from RNAseq analysis, or through the web interface. The database application is currently modeled to handle analysis data from agricultural species and will be expanded to include more species groups. Overall, TransAtlasDB aims to serve as an accessible repository for the large, complex results files derived from RNAseq gene expression profiling and variant analysis. Database URL: https://modupeore.github.io/TransAtlasDB/ PMID:29688361

  15. An effective online data monitoring and saving strategy for large-scale climate simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xian, Xiaochen; Archibald, Rick; Mayer, Benjamin

    Large-scale climate simulation models have been developed and widely used to generate historical data and study future climate scenarios. These simulation models often have to run for a couple of months to understand the changes in the global climate over the course of decades. This long-duration simulation process creates a huge amount of data with both high temporal and spatial resolution information; however, how to effectively monitor and record the climate changes based on these large-scale simulation results that are continuously produced in real time still remains to be resolved. Due to the slow process of writing data to disk, the current practice is to save a snapshot of the simulation results at a constant, slow rate although the data generation process runs at a very high speed. This study proposes an effective online data monitoring and saving strategy over the temporal and spatial domains with the consideration of practical storage and memory capacity constraints. Finally, our proposed method is able to intelligently select and record the most informative extreme values in the raw data generated from real-time simulations in the context of better monitoring climate changes.
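
    The abstract does not give the selection rule itself; as a hedged sketch of one simple way to keep the most informative extreme values under a fixed memory budget, the code below maintains a bounded min-heap of the largest values seen in a simulated output stream, so memory use stays constant no matter how long the simulation runs. The budget and the synthetic data source are illustrative assumptions, not the authors' method.

    ```python
    import heapq
    import numpy as np

    def stream_extremes(stream, budget=100):
        """Keep the `budget` largest values (with their time indices) from a data stream."""
        heap = []                                   # min-heap of (value, index)
        for i, value in enumerate(stream):
            if len(heap) < budget:
                heapq.heappush(heap, (value, i))
            elif value > heap[0][0]:                # new value beats the smallest kept extreme
                heapq.heapreplace(heap, (value, i))
        return sorted(heap, reverse=True)

    # Simulated high-rate model output: mostly mild values with occasional spikes.
    rng = np.random.default_rng(3)
    simulated = rng.standard_normal(1_000_000)
    top = stream_extremes(simulated, budget=10)
    print(top[:3])   # the three most extreme values and the time steps at which they occurred
    ```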

  16. An effective online data monitoring and saving strategy for large-scale climate simulations

    DOE PAGES

    Xian, Xiaochen; Archibald, Rick; Mayer, Benjamin; ...

    2018-01-22

    Large-scale climate simulation models have been developed and widely used to generate historical data and study future climate scenarios. These simulation models often have to run for a couple of months to understand the changes in the global climate over the course of decades. This long-duration simulation process creates a huge amount of data with both high temporal and spatial resolution information; however, how to effectively monitor and record the climate changes based on these large-scale simulation results that are continuously produced in real time still remains to be resolved. Due to the slow process of writing data to disk, the current practice is to save a snapshot of the simulation results at a constant, slow rate although the data generation process runs at a very high speed. This study proposes an effective online data monitoring and saving strategy over the temporal and spatial domains with the consideration of practical storage and memory capacity constraints. Finally, our proposed method is able to intelligently select and record the most informative extreme values in the raw data generated from real-time simulations in the context of better monitoring climate changes.

  17. PATHA: Performance Analysis Tool for HPC Applications

    DOE PAGES

    Yoo, Wucherl; Koo, Michelle; Cao, Yi; ...

    2016-02-18

    Large science projects rely on complex workflows to analyze terabytes or petabytes of data. These jobs are often running over thousands of CPU cores and simultaneously performing data accesses, data movements, and computation. It is difficult to identify bottlenecks or to debug the performance issues in these large workflows. In order to address these challenges, we have developed the Performance Analysis Tool for HPC Applications (PATHA) using state-of-the-art open source big data processing tools. Our framework can ingest system logs to extract key performance measures, and apply the most sophisticated statistical tools and data mining methods on the performance data. Furthermore, it utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of PATHA, we conduct a case study on the workflows from an astronomy project known as the Palomar Transient Factory (PTF). This study processed 1.6 TB of system logs collected on the NERSC supercomputer Edison. Using PATHA, we were able to identify performance bottlenecks, which reside in three tasks of the PTF workflow with a dependency on the density of celestial objects.
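
    As a hedged illustration of the kind of analysis such a tool performs (not PATHA's actual implementation, which sits on top of big data engines), the sketch below parses a few hypothetical task-timing log records into a table and ranks workflow tasks by total and mean runtime, which is one straightforward way to point at bottleneck candidates; the task names, hosts, and timings are invented for the example.

    ```python
    import io
    import pandas as pd

    # Hypothetical log excerpt: task name, host, and elapsed seconds per execution.
    log_text = """task,host,elapsed_s
    source_extraction,edison01,412.0
    source_extraction,edison02,398.5
    image_subtraction,edison01,1250.3
    image_subtraction,edison03,1311.8
    candidate_classify,edison02,77.2
    candidate_classify,edison03,81.9
    """

    df = pd.read_csv(io.StringIO(log_text), skipinitialspace=True)
    summary = (df.groupby("task")["elapsed_s"]
                 .agg(total="sum", mean="mean", runs="count")
                 .sort_values("total", ascending=False))
    print(summary)   # tasks at the top dominate the workflow's wall-clock time
    ```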

  18. Magnetic thin-film insulator with ultra-low spin wave damping for coherent nanomagnonics

    NASA Astrophysics Data System (ADS)

    Yu, Haiming; Kelly, O. D'allivy; Cros, V.; Bernard, R.; Bortolotti, P.; Anane, A.; Brandl, F.; Huber, R.; Stasinopoulos, I.; Grundler, D.

    2014-10-01

    Wave control in the solid state has opened new avenues in modern information technology. Surface-acoustic-wave-based devices are found as mass-market products in hundreds of millions of cellular phones. Spin waves (magnons) would offer a boost in today's data handling and security implementations, i.e., image processing and speech recognition. However, nanomagnonic devices realized so far suffer from the relatively short damping length in metallic ferromagnets, typically amounting to a few tens of micrometers. Here we demonstrate that nm-thick YIG films overcome the damping chasm. Using a conventional coplanar waveguide we excite a large series of short-wavelength spin waves (SWs). From the data we estimate a macroscopic damping length of about 600 micrometers. The intrinsic damping parameter suggests even a record value of about 1 mm, allowing for magnonics-based nanotechnology with ultra-low damping. In addition, SWs at large wave vector are found to exhibit the non-reciprocal properties relevant for new concepts in nanoscale SW-based logic. We expect our results to provide the basis for coherent data processing with SWs at GHz rates and in large cellular magnetic arrays, thereby boosting the envisioned image processing and speech recognition.

  19. Magnetic thin-film insulator with ultra-low spin wave damping for coherent nanomagnonics

    PubMed Central

    Yu, Haiming; Kelly, O. d'Allivy; Cros, V.; Bernard, R.; Bortolotti, P.; Anane, A.; Brandl, F.; Huber, R.; Stasinopoulos, I.; Grundler, D.

    2014-01-01

    Wave control in the solid state has opened new avenues in modern information technology. Surface-acoustic-wave-based devices are found as mass-market products in hundreds of millions of cellular phones. Spin waves (magnons) would offer a boost in today's data handling and security implementations, i.e., image processing and speech recognition. However, nanomagnonic devices realized so far suffer from the relatively short damping length in metallic ferromagnets, typically amounting to a few tens of micrometers. Here we demonstrate that nm-thick YIG films overcome the damping chasm. Using a conventional coplanar waveguide we excite a large series of short-wavelength spin waves (SWs). From the data we estimate a macroscopic damping length of about 600 micrometers. The intrinsic damping parameter suggests even a record value of about 1 mm, allowing for magnonics-based nanotechnology with ultra-low damping. In addition, SWs at large wave vector are found to exhibit the non-reciprocal properties relevant for new concepts in nanoscale SW-based logic. We expect our results to provide the basis for coherent data processing with SWs at GHz rates and in large cellular magnetic arrays, thereby boosting the envisioned image processing and speech recognition. PMID:25355200

  20. Apply creative thinking of decision support in electrical nursing record.

    PubMed

    Hao, Angelica Te-Hui; Hsu, Chien-Yeh; Li-Fang, Huang; Jian, Wen-Shan; Wu, Li-Bin; Kao, Ching-Chiu; Lu, Mei-Show; Chang, Her-Kung

    2006-01-01

    The nursing process consists of five interrelated steps: assessment, diagnosis, planning, intervention, and evaluation. In the nursing process, the nurse collects a great deal of data and information, and the amount may exceed what the nurse can process efficiently and correctly. Thus, the nurse needs assistance to become proficient in the planning of nursing care, due to the difficulty of simultaneously processing a large set of information. Computer systems are viewed as tools to expand the capabilities of the nurse's mind. Using computer technology to support clinicians' decision making may provide high-quality, patient-centered, and efficient healthcare. Although some existing nursing information systems aid in the nursing process, they only provide the most fundamental decision support, i.e., standard care plans associated with common nursing diagnoses. Such a computerized decision support system helps the nurse develop a care plan step by step, but it does not assist the nurse in the decision-making process: the decisions about how to generate nursing diagnoses from data and how to individualize the care plans still remain with the nurse. The purpose of this study is to develop a pilot structure for an electronic nursing record system, integrated with international nursing standards, to improve the proficiency and accuracy of care planning in the clinical pathway process. The proposed pilot system assists not only student nurses and nurses who are novices in nursing practice, but also experts who need to work in a practice area with which they are not familiar.

  1. Development of a 2-stage shear-cutting-process to reduce cut-edge-sensitivity of steels

    NASA Astrophysics Data System (ADS)

    Gläsner, T.; Sunderkötter, C.; Hoffmann, H.; Volk, W.; Golle, R.

    2017-09-01

    The edge cracking sensitivity of AHSS and UHSS is a challenging factor in the cold forming process. Expanding cut holes during flanging operations is rather common in automotive components. During these flanging operations the pierced hole is stretched so that its diameter is increased. These flanging operations stretch material that has already been subjected to large amounts of plastic deformation, so forming problems may occur. An innovative cutting process decreases micro cracks in the cutting surface and facilitates the subsequent cold forming process. This cutting process consists of two stages, which produce close dimensional tolerances and smooth edges. As a result, the hole expansion ratio was increased by nearly 100% when using thick high-strength steels for suspension components. The paper describes the mechanisms of the trimming process at the cut edge, and the positive effect of the 2-stage shear-cutting process on the hole expansion capability of multiphase steels.
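
    For context, the hole expansion ratio quoted above is conventionally defined (for example in the ISO 16630 hole expansion test) from the initial punched-hole diameter d_0 and the diameter d_f at which a through-thickness edge crack appears; the expression below is that standard definition, not a formula specific to this paper.

    ```latex
    \lambda = \frac{d_f - d_0}{d_0} \times 100\,\%
    ```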

  2. Coal Producer's Rubber Waste Processing Development

    NASA Astrophysics Data System (ADS)

    Makarevich, Evgeniya; Papin, Andrey; Nevedrov, Alexander; Cherkasova, Tatyana; Ignatova, Alla

    2017-11-01

    A large amount of rubber-containing waste, the bulk of which consists of worn automobile tires and conveyor belts, is produced at coal mining and coal processing enterprises. The volume of waste generated increases every year and reaches enormous proportions. The methods for processing rubber waste can be divided into three categories: grinding, pyrolysis (high and low temperature), and decomposition by means of chemical solvents. One known technique for processing worn-out tires is regeneration, aimed at producing a new rubber substitute used in the production of rubber goods. However, the number of worn tires used for the production of regenerate does not exceed 20% of their total quantity. A new method for processing rubber waste through pyrolysis is considered in this article. Experimental data on the upgrading of the carbon residue of pyrolysis by the methods of heavy media separation, magnetic and vibroseparation, and thermal processing are presented.

  3. Bandgap-customizable germanium using lithographically determined biaxial tensile strain for silicon-compatible optoelectronics.

    PubMed

    Sukhdeo, David S; Nam, Donguk; Kang, Ju-Hyung; Brongersma, Mark L; Saraswat, Krishna C

    2015-06-29

    Strain engineering has proven to be vital for germanium-based photonics, in particular light emission. However, applying a large permanent biaxial tensile strain to germanium has been a challenge. We present a simple, CMOS-compatible technique to conveniently induce a large, spatially homogenous strain in circular structures patterned within germanium nanomembranes. Our technique works by concentrating and amplifying a pre-existing small strain into a circular region. Biaxial tensile strains as large as 1.11% are observed by Raman spectroscopy and are further confirmed by photoluminescence measurements, which show enhanced and redshifted light emission from the strained germanium. Our technique allows the amount of biaxial strain to be customized lithographically, allowing the bandgaps of different germanium structures to be independently customized in a single mask process.

  4. Oxygen isotopes in tree rings record variation in precipitation δ18O and amount effects in the south of Mexico

    NASA Astrophysics Data System (ADS)

    Brienen, Roel J. W.; Hietz, Peter; Wanek, Wolfgang; Gloor, Manuel

    2013-12-01

    Natural archives of oxygen isotopes in precipitation may be used to study changes in the hydrological cycle in the tropics, but their interpretation is not straightforward. We studied to which degree tree rings of Mimosa acantholoba from southern Mexico record variation in isotopic composition of precipitation and which climatic processes influence oxygen isotopes in tree rings (δ18Otr). Interannual variation in δ18Otr was highly synchronized between trees and closely related to isotopic composition of rain measured at San Salvador, 710 km to the southwest. Correlations with δ13C, growth, or local climate variables (temperature, cloud cover, vapor pressure deficit (VPD)) were relatively low, indicating weak plant physiological influences. Interannual variation in δ18Otr correlated negatively with local rainfall amount and intensity. Correlations with the amount of precipitation extended along a 1000 km long stretch of the Pacific Central American coast, probably as a result of organized storm systems uniformly affecting rainfall in the region and its isotope signal; episodic heavy precipitation events, of which some are related to cyclones, deposit strongly 18O-depleted rain in the region and seem to have affected the δ18Otr signal. Large-scale controls on the isotope signature include variation in sea surface temperatures of tropical north Atlantic and Pacific Ocean. In conclusion, we show that δ18Otr of M. acantholoba can be used as a proxy for source water δ18O and that interannual variation in δ18Oprec is caused by a regional amount effect. This contrasts with δ18O signatures at continental sites where cumulative rainout processes dominate and thus provide a proxy for precipitation integrated over a much larger scale. Our results confirm that processes influencing climate-isotope relations differ between sites located, e.g., in the western Amazon versus coastal Mexico, and that tree ring isotope records can help in disentangling the processes influencing precipitation δ18O.

  5. Oxygen isotopes in tree rings record variation in precipitation δ18O and amount effects in the south of Mexico

    PubMed Central

    Brienen, Roel J W; Hietz, Peter; Wanek, Wolfgang; Gloor, Manuel

    2013-01-01

    Natural archives of oxygen isotopes in precipitation may be used to study changes in the hydrological cycle in the tropics, but their interpretation is not straightforward. We studied to which degree tree rings of Mimosa acantholoba from southern Mexico record variation in isotopic composition of precipitation and which climatic processes influence oxygen isotopes in tree rings (δ18Otr). Interannual variation in δ18Otr was highly synchronized between trees and closely related to isotopic composition of rain measured at San Salvador, 710 km to the southwest. Correlations with δ13C, growth, or local climate variables (temperature, cloud cover, vapor pressure deficit (VPD)) were relatively low, indicating weak plant physiological influences. Interannual variation in δ18Otr correlated negatively with local rainfall amount and intensity. Correlations with the amount of precipitation extended along a 1000 km long stretch of the Pacific Central American coast, probably as a result of organized storm systems uniformly affecting rainfall in the region and its isotope signal; episodic heavy precipitation events, of which some are related to cyclones, deposit strongly 18O-depleted rain in the region and seem to have affected the δ18Otr signal. Large-scale controls on the isotope signature include variation in sea surface temperatures of tropical north Atlantic and Pacific Ocean. In conclusion, we show that δ18Otr of M. acantholoba can be used as a proxy for source water δ18O and that interannual variation in δ18Oprec is caused by a regional amount effect. This contrasts with δ18O signatures at continental sites where cumulative rainout processes dominate and thus provide a proxy for precipitation integrated over a much larger scale. Our results confirm that processes influencing climate-isotope relations differ between sites located, e.g., in the western Amazon versus coastal Mexico, and that tree ring isotope records can help in disentangling the processes influencing precipitation δ18O. PMID:26213660

  6. Pre-treatment step with Leuconostoc mesenteroides or L. pseudomesenteroides strains removes furfural from Zymomonas mobilis ethanolic fermentation broth.

    PubMed

    Hunter, William J; Manter, Daniel K

    2014-10-01

    Furfural is an inhibitor of growth and ethanol production by Zymomonas mobilis. This study used a naturally occurring (not GMO) biological pre-treatment to reduce the amount of furfural in a model fermentation broth. Pre-treatment involved inoculating and incubating the fermentation broth with strains of Leuconostoc mesenteroides or Leuconostoc pseudomesenteroides. The Leuconostoc strains converted furfural to furfuryl alcohol without consuming large amounts of dextrose in the process. Coupling this pre-treatment to ethanolic fermentation reduced furfural in the broth and improved growth, dextrose uptake and ethanol formation. Pre-treatment permitted ethanol formation in the presence of 5.2 g L(-1) furfural, which was otherwise inhibitive. The pre-treatment and presence of the Leuconostoc strains in the fermentation broth did not interfere with Z. mobilis ethanolic fermentation or the amounts of ethanol produced. The method suggests a possible technique for reducing the effect that furfural has on the production of ethanol for use as a biofuel. Published by Elsevier Ltd.

  7. Determination of solid mass fraction in partially frozen hydrocarbon fuels

    NASA Technical Reports Server (NTRS)

    Cotterell, E. M.; Mossadegh, R.; Bruce, A. J.; Moynihan, C. T.

    1986-01-01

    Filtration procedures alone are insufficient to determine the amount of crystalline solid in a partially frozen hydrocarbon distillate fraction. This is due to the nature of the solidification process, by which a large amount of liquid becomes entrapped within an interconnected crystalline structure. A technique has been developed to supplement filtration methods with an independent determination of the amount of liquid in the precipitate, thereby revealing the actual value of mass percent crystalline solid, %S. A non-crystallizing dye is injected into the fuel and used as a tracer during the filtration. The relative concentrations of the dye in the filtrate and precipitate fractions are subsequently determined by spectrophotometric comparison. The filtration apparatus was assembled so that the temperature of the sample is recorded immediately above the filter. Also, a second method of calculation has been established which allows a significant reduction in test time while retaining acceptable accuracy of results. Data have been obtained for eight different kerosene-range hydrocarbon fuels.
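
    The abstract does not spell the calculation out; one plausible mass balance, assuming the dye remains entirely in the liquid phase and that the liquid entrapped in the precipitate has the same dye concentration as the filtrate, recovers the mass percent solid as shown below, where m_p is the precipitate mass, m_{dye,p} the dye mass measured in the precipitate, c_f the dye concentration per unit liquid mass in the filtrate, and m_{tot} the total sample mass.

    ```latex
    m_{L,p} = \frac{m_{dye,p}}{c_f}, \qquad \%S = 100 \times \frac{m_p - m_{L,p}}{m_{tot}}
    ```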

  8. Software design for analysis of multichannel intracardial and body surface electrocardiograms.

    PubMed

    Potse, Mark; Linnenbank, André C; Grimbergen, Cornelis A

    2002-11-01

    Analysis of multichannel ECG recordings (body surface maps (BSMs) and intracardial maps) requires special software. We created a software package and a user interface on top of a commercial data analysis package (MATLAB) by a combination of high-level and low-level programming. Our software was created to satisfy the needs of a diverse group of researchers. It can handle a large variety of recording configurations. It allows for interactive usage through a fast and robust user interface, and batch processing for the analysis of large amounts of data. The package is user-extensible, includes routines for both common and experimental data processing tasks, and works on several computer platforms. The source code is made intelligible using software for structured documentation and is available to the users. The package is currently used by more than ten research groups analysing ECG data worldwide.

  9. Advanced Video Analysis Needs for Human Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Campbell, Paul D.

    1994-01-01

    Evaluators of human task performance in space missions make use of video as a primary source of data. Extraction of relevant human performance information from video is often a labor-intensive process requiring a large amount of time on the part of the evaluator. Based on the experiences of several human performance evaluators, needs were defined for advanced tools which could aid in the analysis of video data from space missions. Such tools should increase the efficiency with which useful information is retrieved from large quantities of raw video. They should also provide the evaluator with new analytical functions which are not present in currently used methods. Video analysis tools based on the needs defined by this study would also have uses in U.S. industry and education. Evaluation of human performance from video data can be a valuable technique in many industrial and institutional settings where humans are involved in operational systems and processes.

  10. Application of fluorescence spectroscopy for on-line bioprocess monitoring and control

    NASA Astrophysics Data System (ADS)

    Boehl, Daniela; Solle, D.; Toussaint, Hans J.; Menge, M.; Renemann, G.; Lindemann, Carsten; Hitzmann, Bernd; Scheper, Thomas-Helmut

    2001-02-01

    Modern bioprocess control requires fast data acquisition and in-time evaluation of bioprocess variables. On-line fluorescence spectroscopy for data acquisition and the use of chemometric methods accomplish these requirements. The presented investigations were performed with fluorescence spectrometers with wide ranges of excitation and emission wavelengths. By detecting several biogenic fluorophores (amino acids, coenzymes, and vitamins), a large amount of information about the state of the bioprocess is obtained. For the evaluation of the process variables, partial least squares regression is used. This technique was applied to several bioprocesses: the production of ergotamine by Claviceps purpurea, the production of t-PA (tissue plasminogen activator) by animal cells, and brewing processes. The main point of monitoring the brewing processes was to determine the process variables cell count and extract concentration.
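
    As a hedged sketch of the chemometric step described above (synthetic data, not the authors' spectra), the code below fits a partial least squares regression mapping simulated fluorescence intensity vectors to a process variable such as biomass, using scikit-learn's PLSRegression; the data dimensions, latent-factor structure, and noise levels are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(4)

    # Synthetic data set: 120 "spectra" of 300 excitation/emission intensities each,
    # generated so that a few latent fluorophore signals drive the target variable.
    n_samples, n_channels, n_latent = 120, 300, 3
    latent = rng.standard_normal((n_samples, n_latent))          # hidden fluorophore levels
    loadings = rng.standard_normal((n_latent, n_channels))
    spectra = latent @ loadings + 0.1 * rng.standard_normal((n_samples, n_channels))
    biomass = latent @ np.array([2.0, -1.0, 0.5]) + 0.05 * rng.standard_normal(n_samples)

    X_train, X_test, y_train, y_test = train_test_split(spectra, biomass, random_state=0)
    pls = PLSRegression(n_components=3)
    pls.fit(X_train, y_train)
    print("R^2 on held-out samples:", round(pls.score(X_test, y_test), 3))
    ```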

  11. Strategies for responding to RAC requests electronically.

    PubMed

    Schramm, Michael

    2012-04-01

    Providers that would like to respond to complex RAC reviews electronically should consider three strategies: Invest in an EHR software package or a high-powered scanner that can quickly scan large amounts of paper. Implement an audit software platform that will allow providers to manage the entire audit process in one place. Use a CONNECT-compatible gateway capable of accessing the Nationwide Health Information Network (the network on which the electronic submission of medical documentation program runs).

  12. NPS CubeSat Launcher Design, Process and Requirements

    DTIC Science & Technology

    2009-06-01

    Soviet era ICBM. The first Dnepr launch in July 2006 consisted of fourteen CubeSats in five P-PODs, while the second in April 2007 consisted of... Regulations (ITAR). ITAR restricts the export of defense-related products and technology on the United States Munitions List. Although one might not... think that CubeSat technology would fall under ITAR, in fact a large amount of aerospace technology, including some that could be used on CubeSats, is

  13. The Significance of Lichens and Their Metabolites

    NASA Astrophysics Data System (ADS)

    Huneck, S.

    Lichens, symbiontic organisms of fungi and algae, synthesize numerous metabolites, the "lichen substances," which comprise aliphatic, cycloaliphatic, aromatic, and terpenic compounds. Lichens and their metabolites have a manifold biological activity: antiviral, antibiotic, antitumor, allergenic, plant growth inhibitory, antiherbivore, and enzyme inhibitory. Usnic acid, a very active lichen substance is used in pharmaceutical preparations. Large amounts of Pseudevernia furfuracea and Evernia prunastri are processed in the perfume industry, and some lichens are sensitive reagents for the evaluation of air pollution.

  14. The Computer: An Effective Research Assistant

    PubMed Central

    Gancher, Wendy

    1984-01-01

    The development of software packages such as data management systems and statistical packages has made it possible to process large amounts of research data. Data management systems make the organization and manipulation of such data easier. Floppy disks ease the problem of storing and retrieving records. Patient information can be kept confidential by limiting access to computer passwords linked with research files, or by using floppy disks. These attributes make the microcomputer essential to modern primary care research. PMID:21279042

  15. Parallel processing implementations of a contextual classifier for multispectral remote sensing data

    NASA Technical Reports Server (NTRS)

    Siegel, H. J.; Swain, P. H.; Smith, B. W.

    1980-01-01

    Contextual classifiers are being developed as a method to exploit the spatial/spectral context of a pixel to achieve accurate classification. Classification algorithms such as the contextual classifier typically require large amounts of computation time. One way to reduce the execution time of these tasks is through the use of parallelism. The applicability of the CDC flexible processor system and of a proposed multimicroprocessor system (PASM) for implementing contextual classifiers is examined.

  16. Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal Phase III

    DTIC Science & Technology

    2015-04-30

    It is a supervised learning method but best for Big Data with low dimensions. It is an approximate inference good for Big Data and Hadoop... Each process produces large amounts of information (Big Data). There is a critical need for automation, validation, and discovery to help acquisition... can inform managers where areas might have higher program risk and how resource and big data management might affect the desired return on investment

  17. Planetary Analogs in Antarctica: Icy Satellites

    NASA Technical Reports Server (NTRS)

    Malin, M. C.

    1985-01-01

    As part of a study to provide semi-quantitative techniques to date past Antarctic glaciations, sponsored by the Antarctic Research Program, field observations pertinent to other planets were also acquired. The extremely diverse surface conditions, marked by extreme cold and large amounts of ice, provide potential terrain and process analogs to the icy satellites of Jupiter and Saturn. Thin ice tectonic features and explosion craters (on sea ice) and deformation features on thicker ice (glaciers) are specifically addressed.

  18. One-step process of hydrothermal and alkaline treatment of wheat straw for improving the enzymatic saccharification.

    PubMed

    Sun, Shaolong; Zhang, Lidan; Liu, Fang; Fan, Xiaolin; Sun, Run-Cang

    2018-01-01

    To increase the production of bioethanol, a two-step process based on hydrothermal and dilute alkaline treatment was applied to reduce the natural resistance of biomass. However, the process required a large amount of water and a long operation time due to the solid/liquid separation before the alkaline treatment, which decreased the net economic profit of bioethanol production. Therefore, four one-step processes, differing in the order of hydrothermal and alkaline treatment, have been developed to enhance the glucose concentration obtained from wheat straw by enzymatic saccharification. The aim of the present study was to systematically evaluate the effects of the different one-step processes by analyzing the physicochemical properties (composition, structural change, crystallinity, surface morphology, and BET surface area) and enzymatic saccharification of the treated substrates. In this study, hemicelluloses and lignins were removed from wheat straw and the morphological structures were destroyed to various extents during the four one-step processes, which was favorable for cellulase adsorption on cellulose. A positive correlation was also observed between the crystallinity and enzymatic saccharification rate of the substrate under the conditions given, and the surface area of the substrate was positively related to the concentration of glucose. Compared with the control (3.0 g/L) and the substrates treated by the other three one-step processes (11.2-14.6 g/L), the substrate treated by the one-step process based on successive hydrothermal and alkaline treatment gave a maximum glucose concentration of 18.6 g/L, owing to its high cellulose content and surface area, accompanied by the removal of large amounts of lignins and hemicelluloses. The present study demonstrated that the order of hydrothermal and alkaline treatment has significant effects on the physicochemical properties and enzymatic saccharification of wheat straw. The one-step process based on successive hydrothermal and alkaline treatment is a simple and economically feasible method for the production of glucose, which will be further converted into bioethanol.

  19. [Blue-light induced expression of S-adenosy-L-homocysteine hydrolase-like gene in Mucor amphibiorum RCS1].

    PubMed

    Gao, Ya; Wang, Shu; Fu, Mingjia; Zhong, Guolin

    2013-09-04

    To determine blue-light induced expression of the S-adenosyl-L-homocysteine hydrolase-like (sahhl) gene in the fungus Mucor amphibiorum RCS1, a 555 bp sequence was obtained from M. amphibiorum RCS1 in a random PCR process. The 555 bp sequence was labeled with digoxigenin to prepare the probe for northern hybridization. By northern hybridization, the transcription of the sahhl gene was analyzed during culture of M. amphibiorum RCS1 mycelia from darkness to blue light to darkness; real-time PCR was used in parallel for sahhl gene expression analysis. The 555 bp sequence showed high homology to sahh gene sequences from Homo sapiens, Mus musculus, and some fungal species, which preliminarily confirms that the 555 bp sequence is the sahhl gene of M. amphibiorum RCS1. After a 24 h dark pre-culture, large amounts of sahhl transcript were detected in the mycelia by northern hybridization and real-time PCR under 24 h of blue light. However, large amounts of sahhl transcript were not found after a 48 h dark pre-culture, even though the M. amphibiorum RCS1 mycelia were induced by blue light. Blue light can induce the expression of the sahhl gene in vigorously growing M. amphibiorum RCS1 mycelia.

  20. Characterizing variable biogeochemical changes during the treatment of produced oilfield waste.

    PubMed

    Hildenbrand, Zacariah L; Santos, Inês C; Liden, Tiffany; Carlton, Doug D; Varona-Torres, Emmanuel; Martin, Misty S; Reyes, Michelle L; Mulla, Safwan R; Schug, Kevin A

    2018-09-01

    At the forefront of the discussions about climate change and energy independence has been the process of hydraulic fracturing, which utilizes large amounts of water, proppants, and chemical additives to stimulate sequestered hydrocarbons from impermeable subsurface strata. This process also produces large amounts of heterogeneous flowback and formation waters, the subsurface disposal of which has most recently been linked to the induction of anthropogenic earthquakes. As such, the management of these waste streams has provided a newfound impetus to explore recycling alternatives to reduce the reliance on subsurface disposal and fresh water resources. However, the biogeochemical characteristics of produced oilfield waste render its recycling and reutilization for production well stimulation a substantial challenge. Here we present a comprehensive analysis of produced waste from the Eagle Ford shale region before, during, and after treatment through adjustable separation, flocculation, and disinfection technologies. The collection of bulk measurements revealed significant reductions in suspended and dissolved constituents that could otherwise preclude untreated produced water from being utilized for production well stimulation. Additionally, a significant step-wise reduction in pertinent scaling and well-fouling elements was observed, in conjunction with notable fluctuations in the microbiomes of highly variable produced waters. Collectively, these data provide insight into the efficacies of available water treatment modalities within the shale energy sector, which is currently challenged with improving the environmental stewardship of produced water management. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. FISH Oracle 2: a web server for integrative visualization of genomic data in cancer research

    PubMed Central

    2014-01-01

    Background A comprehensive view on all relevant genomic data is instrumental for understanding the complex patterns of molecular alterations typically found in cancer cells. One of the most effective ways to rapidly obtain an overview of genomic alterations in large amounts of genomic data is the integrative visualization of genomic events. Results We developed FISH Oracle 2, a web server for the interactive visualization of different kinds of downstream processed genomics data typically available in cancer research. A powerful search interface and a fast visualization engine provide a highly interactive visualization for such data. High quality image export enables the life scientist to easily communicate their results. A comprehensive data administration allows to keep track of the available data sets. We applied FISH Oracle 2 to published data and found evidence that, in colorectal cancer cells, the gene TTC28 may be inactivated in two different ways, a fact that has not been published before. Conclusions The interactive nature of FISH Oracle 2 and the possibility to store, select and visualize large amounts of downstream processed data support life scientists in generating hypotheses. The export of high quality images supports explanatory data visualization, simplifying the communication of new biological findings. A FISH Oracle 2 demo server and the software is available at http://www.zbh.uni-hamburg.de/fishoracle. PMID:24684958

  2. Hapl-o-Mat: open-source software for HLA haplotype frequency estimation from ambiguous and heterogeneous data.

    PubMed

    Schäfer, Christian; Schmidt, Alexander H; Sauter, Jürgen

    2017-05-30

    Knowledge of HLA haplotypes is helpful in many settings, such as disease association studies, population genetics, and hematopoietic stem cell transplantation. Regarding the recruitment of unrelated hematopoietic stem cell donors, HLA haplotype frequencies of specific populations are used to optimize both donor searches for individual patients and strategic donor registry planning. However, the estimation of haplotype frequencies from HLA genotyping data is challenged by the large amount of genotype data, the complex HLA nomenclature, and the heterogeneous and ambiguous nature of typing records. To meet these challenges, we have developed the open-source software Hapl-o-Mat. It estimates haplotype frequencies from population data including an arbitrary number of loci using an expectation-maximization algorithm. Its key features are the processing of different HLA typing resolutions within a given population sample and the handling of ambiguities recorded via multiple allele codes or genotype list strings. Implemented in C++, Hapl-o-Mat facilitates efficient haplotype frequency estimation from large amounts of genotype data. We demonstrate its accuracy and performance on the basis of artificial and real genotype data. Hapl-o-Mat is a versatile and efficient software for HLA haplotype frequency estimation. Its capability of processing various forms of HLA genotype data allows for straightforward haplotype frequency estimation from typing records usually found in stem cell donor registries.
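
    As a greatly simplified, hedged illustration of the expectation-maximization idea that Hapl-o-Mat applies (here: two loci, unambiguous typings, no multiple allele codes or genotype list strings), the sketch below estimates haplotype frequencies from unphased two-locus genotypes by alternating between weighting each genotype's compatible haplotype pairs under the current frequencies (E step) and re-estimating the frequencies from those weights (M step).

    ```python
    from collections import defaultdict

    def em_haplotype_freqs(genotypes, n_iter=50):
        """EM haplotype frequency estimation for unphased two-locus genotypes.

        genotypes : list of ((a1, a2), (b1, b2)) allele pairs, one per individual and locus.
        Returns a dict mapping (allele_locus1, allele_locus2) haplotypes to frequencies.
        """
        # Enumerate the haplotype pairs compatible with each genotype (the two phasings).
        expansions = []
        for (a1, a2), (b1, b2) in genotypes:
            pairs = {tuple(sorted((h1, h2)))
                     for h1, h2 in [((a1, b1), (a2, b2)), ((a1, b2), (a2, b1))]}
            expansions.append(list(pairs))

        # Start from a uniform frequency over all haplotypes seen in any expansion.
        haplotypes = {h for pairs in expansions for pair in pairs for h in pair}
        freqs = {h: 1.0 / len(haplotypes) for h in haplotypes}

        for _ in range(n_iter):
            counts = defaultdict(float)
            for pairs in expansions:
                # E step: posterior weight of each compatible pair under current frequencies.
                weights = [freqs[h1] * freqs[h2] * (1 if h1 == h2 else 2) for h1, h2 in pairs]
                total = sum(weights) or 1.0
                for (h1, h2), w in zip(pairs, weights):
                    counts[h1] += w / total
                    counts[h2] += w / total
            # M step: renormalize the expected haplotype counts into frequencies.
            norm = sum(counts.values())
            freqs = {h: c / norm for h, c in counts.items()}
        return freqs

    # Toy example with alleles A/a at locus 1 and B/b at locus 2.
    sample = [(("A", "A"), ("B", "B")), (("A", "a"), ("B", "b")),
              (("A", "a"), ("B", "b")), (("a", "a"), ("b", "b"))]
    for hap, f in sorted(em_haplotype_freqs(sample).items(), key=lambda x: -x[1]):
        print(hap, round(f, 3))
    ```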

  3. CMS Connect

    NASA Astrophysics Data System (ADS)

    Balcas, J.; Bockelman, B.; Gardner, R., Jr.; Hurtado Anampa, K.; Jayatilaka, B.; Aftab Khan, F.; Lannon, K.; Larson, K.; Letts, J.; Marra Da Silva, J.; Mascheroni, M.; Mason, D.; Perez-Calero Yzquierdo, A.; Tiradani, A.

    2017-10-01

    The CMS experiment collects and analyzes large amounts of data coming from high energy particle collisions produced by the Large Hadron Collider (LHC) at CERN. This involves a huge amount of real and simulated data processing that needs to be handled in batch-oriented platforms. The CMS Global Pool of computing resources provides more than 100K dedicated CPU cores and another 50K to 100K CPU cores from opportunistic resources for these kinds of tasks. Even though production and event-processing analysis workflows are already managed by existing tools, there is still a lack of support for submitting the final-stage, condor-like analysis jobs familiar to Tier-3 or local computing facility users into these distributed resources in an integrated (with other CMS services) and friendly way. CMS Connect is a set of computing tools and services designed to augment existing services in the CMS physics community, focusing on this kind of condor analysis job. It is based on the CI-Connect platform developed by the Open Science Grid and uses the CMS GlideInWMS infrastructure to transparently plug CMS global grid resources into a virtual pool accessed via a single submission machine. This paper describes the specific developments and deployment of CMS Connect beyond the CI-Connect platform in order to integrate the service with CMS-specific needs, including site-specific submission, accounting of jobs, and automated reporting to standard CMS monitoring resources in an effortless way for their users.

  4. Affordable multisensor digital video architecture for 360° situational awareness displays

    NASA Astrophysics Data System (ADS)

    Scheiner, Steven P.; Khan, Dina A.; Marecki, Alexander L.; Berman, David A.; Carberry, Dana

    2011-06-01

    One of the major challenges facing today's military ground combat vehicle operations is the ability to achieve and maintain full-spectrum situational awareness while under armor (i.e., closed hatch). Thus, the ability to perform basic tasks such as driving, maintaining local situational awareness, surveillance, and targeting will require a high-density array of real-time information to be processed, distributed, and presented to the vehicle operators and crew in near real time (i.e., with low latency). Advances in display and sensor technologies are providing never before seen opportunities to supply large amounts of high-fidelity imagery and video to the vehicle operators and crew in real time. To fully realize the advantages of these emerging display and sensor technologies, an underlying digital architecture must be developed that is capable of processing these large amounts of video and data from separate sensor systems and distributing it simultaneously within the vehicle to multiple vehicle operators and crew. This paper will examine the systems and software engineering efforts required to overcome these challenges and will address development of an affordable, integrated digital video architecture. The approaches evaluated will give both current and future ground combat vehicle systems the flexibility to readily adopt emerging display and sensor technologies, while optimizing the Warfighter Machine Interface (WMI), minimizing lifecycle costs, and improving the survivability of the vehicle crew working in closed-hatch systems during complex ground combat operations.

  5. A paradigm shift towards low-nitrifying production systems: the role of biological nitrification inhibition (BNI).

    PubMed

    Subbarao, G V; Sahrawat, K L; Nakahara, K; Rao, I M; Ishitani, M; Hash, C T; Kishii, M; Bonnett, D G; Berry, W L; Lata, J C

    2013-07-01

    Agriculture is the single largest geo-engineering initiative that humans have initiated on planet Earth, largely through the introduction of unprecedented amounts of reactive nitrogen (N) into ecosystems. A major portion of this reactive N applied as fertilizer leaks into the environment in massive amounts, with cascading negative effects on ecosystem health and function. Natural ecosystems utilize many of the multiple pathways in the N cycle to regulate N flow. In contrast, the massive amounts of N currently applied to agricultural systems cycle primarily through the nitrification pathway, a single inefficient route that channels much of this reactive N into the environment. This is largely due to the rapid nitrifying soil environment of present-day agricultural systems. In this Viewpoint paper, the importance of regulating nitrification as a strategy to minimize N leakage and to improve N-use efficiency (NUE) in agricultural systems is highlighted. The ability to suppress soil nitrification by the release of nitrification inhibitors from plant roots is termed 'biological nitrification inhibition' (BNI), an active plant-mediated natural function that can limit the amount of N cycling via the nitrification pathway. The development of a bioassay using luminescent Nitrosomonas to quantify nitrification inhibitory activity from roots has facilitated the characterization of BNI function. Release of BNIs from roots is a tightly regulated physiological process, with extensive genetic variability found in selected crops and pasture grasses. Here, the current status of understanding of the BNI function is reviewed using Brachiaria forage grasses, wheat and sorghum to illustrate how BNI function can be utilized for achieving low-nitrifying agricultural systems. A fundamental shift towards ammonium (NH4(+))-dominated agricultural systems could be achieved by using crops and pastures with high BNI capacities. When viewed from an agricultural and environmental perspective, the BNI function in plants could potentially have a large influence on biogeochemical cycling and closure of the N loop in crop-livestock systems.

  6. A paradigm shift towards low-nitrifying production systems: the role of biological nitrification inhibition (BNI)

    PubMed Central

    Subbarao, G. V.; Sahrawat, K. L.; Nakahara, K.; Rao, I. M.; Ishitani, M.; Hash, C. T.; Kishii, M.; Bonnett, D. G.; Berry, W. L.; Lata, J. C.

    2013-01-01

    Background: Agriculture is the single largest geo-engineering initiative that humans have initiated on planet Earth, largely through the introduction of unprecedented amounts of reactive nitrogen (N) into ecosystems. A major portion of this reactive N applied as fertilizer leaks into the environment in massive amounts, with cascading negative effects on ecosystem health and function. Natural ecosystems utilize many of the multiple pathways in the N cycle to regulate N flow. In contrast, the massive amounts of N currently applied to agricultural systems cycle primarily through the nitrification pathway, a single inefficient route that channels much of this reactive N into the environment. This is largely due to the rapid nitrifying soil environment of present-day agricultural systems. Scope: In this Viewpoint paper, the importance of regulating nitrification as a strategy to minimize N leakage and to improve N-use efficiency (NUE) in agricultural systems is highlighted. The ability to suppress soil nitrification by the release of nitrification inhibitors from plant roots is termed ‘biological nitrification inhibition’ (BNI), an active plant-mediated natural function that can limit the amount of N cycling via the nitrification pathway. The development of a bioassay using luminescent Nitrosomonas to quantify nitrification inhibitory activity from roots has facilitated the characterization of BNI function. Release of BNIs from roots is a tightly regulated physiological process, with extensive genetic variability found in selected crops and pasture grasses. Here, the current status of understanding of the BNI function is reviewed using Brachiaria forage grasses, wheat and sorghum to illustrate how BNI function can be utilized for achieving low-nitrifying agricultural systems. A fundamental shift towards ammonium (NH4+)-dominated agricultural systems could be achieved by using crops and pastures with high BNI capacities. When viewed from an agricultural and environmental perspective, the BNI function in plants could potentially have a large influence on biogeochemical cycling and closure of the N loop in crop–livestock systems. PMID:23118123

  7. The Role of Fresh Water in Fish Processing in Antiquity

    NASA Astrophysics Data System (ADS)

    Sánchez López, Elena H.

    2018-04-01

    Water has traditionally been highlighted (together with fish and salt) as one of the essential elements in fish processing. Indeed, the need for large quantities of fresh water for the production of salted fish and fish sauces in Roman times is commonly asserted. This paper analyses water-related structures within Roman halieutic installations, arguing that their common presence in the best-known fish processing installations in the Western Roman world should be taken as evidence of the use of fresh water during the production processes, even if its role in the activities carried out in those installations is not clear. In addition, the text proposes some initial estimates of the amount of water that those fish processing complexes could have needed for their functioning, concluding that water needs to be taken into account when reconstructing fish-salting recipes.

  8. Impact of data base structure in a successful in vitro-in vivo correlation for pharmaceutical products.

    PubMed

    Roudier, B; Davit, B; Schütz, H; Cardot, J-M

    2015-01-01

    The in vitro-in vivo correlation (IVIVC) (Food and Drug Administration 1997) aims to predict the in vivo performance of a pharmaceutical formulation based on its in vitro characteristics. It is a complex process that (i) incorporates a large amount of information in a gradual and incremental way and (ii) requires information on different properties (formulation, analytical, clinical) and the associated dedicated treatments (statistics, modeling, simulation). This results in many studies being initiated and integrated into the specifications (quality target product profile, QTPP). The latter defines the appropriate experimental designs (quality by design, QbD) (Food and Drug Administration 2011, 2012), whose main objectives are the determination of (i) the key factors of development and manufacturing (critical process parameters, CPPs) and (ii) the critical points of a physicochemical nature relating to the active ingredients (APIs) and critical quality attributes (CQAs), whose omission may have implications in terms of efficacy and safety for the patient. These processes generate a very large amount of data that must be structured. In this context, the storage of information in a database (DB) and the management of this database (database management system, DBMS) become an important issue for the management of IVIVC projects and, more generally, for the development of new pharmaceutical forms. This article describes the implementation of a prototype object-oriented database (OODB), a decision-support tool that responds in a structured and consistent way to the project-management issues of IVIVC (including bioequivalence and bioavailability) (Food and Drug Administration 2003) necessary for the implementation of the QTPP.

  9. CO2 and CH4 exchanges between land ecosystems and the atmosphere in northern high latitudes over the 21st century

    USGS Publications Warehouse

    Zhuang, Q.; Melillo, J.M.; Sarofim, M.C.; Kicklighter, D.W.; McGuire, A.D.; Felzer, B.S.; Sokolov, A.; Prinn, R.G.; Steudler, P.A.; Hu, S.

    2006-01-01

    Terrestrial ecosystems of the northern high latitudes (above 50°N) exchange large amounts of CO2 and CH4 with the atmosphere each year. Here we use a process-based model to estimate the budget of CO2 and CH4 of the region for current climate conditions and for future scenarios by considering the effects of permafrost dynamics, CO2 fertilization of photosynthesis, and fire. We find that currently the region is a net source of carbon to the atmosphere at 276 Tg C yr-1. We project that throughout the 21st century the region will most likely continue as a net source of carbon, and that the source will increase by up to 473 Tg C yr-1 by the end of the century compared to current emissions. However, our coupled carbon and climate model simulations show that these emissions will exert a relatively small radiative forcing on the global climate system compared with the large anthropogenic emissions. Copyright 2006 by the American Geophysical Union.

  10. STEPP--Search Tool for Exploration of Petri net Paths: a new tool for Petri net-based path analysis in biochemical networks.

    PubMed

    Koch, Ina; Schueler, Markus; Heiner, Monika

    2005-01-01

    To understand biochemical processes caused by, e.g., mutations or deletions in the genome, knowledge of possible alternative paths between two arbitrary chemical compounds is of increasing interest for biotechnology, pharmacology, medicine, and drug design. With the steadily increasing amount of data from high-throughput experiments, new biochemical networks can be constructed and existing ones can be extended, which results in many large metabolic, signal transduction, and gene regulatory networks. The search for alternative paths within these complex and large networks can yield a huge number of solutions, which cannot be handled manually. Moreover, not all of the alternative paths are generally of interest. Therefore, we have developed and implemented a method that allows us to define constraints to reduce the set of all structurally possible paths to the truly interesting path set. The paper describes the search algorithm and the constraint definition language. We give examples of path searches using this dedicated special language for a Petri net model of the sucrose-to-starch breakdown in the potato tuber.
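
    For intuition, the following is a minimal sketch of a constrained path search between two compounds in a toy reaction network; the single "forbidden compound" constraint and the small sucrose-to-starch network are illustrative stand-ins for STEPP's dedicated constraint language and Petri net model.

      # A tiny compound graph for the sucrose-to-starch pathway and a breadth-first
      # enumeration of simple paths that respects one constraint (a forbidden node).
      from collections import deque

      def constrained_paths(graph, start, goal, forbidden=frozenset(), max_len=6):
          """Enumerate simple paths from start to goal avoiding forbidden compounds."""
          queue = deque([(start, [start])])
          while queue:
              node, path = queue.popleft()
              if node == goal:
                  yield path
                  continue
              if len(path) >= max_len:
                  continue
              for nxt in graph.get(node, ()):
                  if nxt not in path and nxt not in forbidden:
                      queue.append((nxt, path + [nxt]))

      network = {"sucrose": ["glucose", "fructose"],
                 "glucose": ["G6P"], "fructose": ["F6P"],
                 "G6P": ["G1P", "F6P"], "F6P": ["G6P"],
                 "G1P": ["ADP-glucose"], "ADP-glucose": ["starch"]}
      print(list(constrained_paths(network, "sucrose", "starch", forbidden={"F6P"})))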

  11. STEPP - Search Tool for Exploration of Petri net Paths: A New Tool for Petri Net-Based Path Analysis in Biochemical Networks.

    PubMed

    Koch, Ina; Schüler, Markus; Heiner, Monika

    2011-01-01

    To understand biochemical processes caused by, e.g., mutations or deletions in the genome, knowledge of possible alternative paths between two arbitrary chemical compounds is of increasing interest for biotechnology, pharmacology, medicine, and drug design. With the steadily increasing amount of data from high-throughput experiments, new biochemical networks can be constructed and existing ones can be extended, which results in many large metabolic, signal transduction, and gene regulatory networks. The search for alternative paths within these complex and large networks can yield a huge number of solutions, which cannot be handled manually. Moreover, not all of the alternative paths are generally of interest. Therefore, we have developed and implemented a method that allows us to define constraints to reduce the set of all structurally possible paths to the truly interesting path set. The paper describes the search algorithm and the constraint definition language. We give examples of path searches using this dedicated special language for a Petri net model of the sucrose-to-starch breakdown in the potato tuber. http://sanaga.tfh-berlin.de/~stepp/

  12. Advantages of Parallel Processing and the Effects of Communications Time

    NASA Technical Reports Server (NTRS)

    Eddy, Wesley M.; Allman, Mark

    2000-01-01

    Many computing tasks involve heavy mathematical calculations, or analyzing large amounts of data. These operations can take a long time to complete using only one computer. Networks such as the Internet provide many computers with the ability to communicate with each other. Parallel or distributed computing takes advantage of these networked computers by arranging them to work together on a problem, thereby reducing the time needed to obtain the solution. The drawback to using a network of computers to solve a problem is the time wasted in communicating between the various hosts. The application of distributed computing techniques to a space environment or to use over a satellite network would therefore be limited by the amount of time needed to send data across the network, which would typically take much longer than on a terrestrial network. This experiment shows how much faster a large job can be performed by adding more computers to the task, what role communications time plays in the total execution time, and the impact a long-delay network has on a distributed computing system.
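
    As a rough illustration of the trade-off described above, the sketch below models total job time as compute work divided across hosts plus a per-host communication cost; the numbers are purely illustrative and are not taken from the experiment.

      # Illustrative model (not from the original experiment): total job time when a
      # fixed amount of work is split across N hosts, with a per-host communication cost.
      def total_time(work_seconds, n_hosts, comm_seconds_per_host):
          """Compute time shrinks with more hosts; communication time grows with them."""
          return work_seconds / n_hosts + comm_seconds_per_host * n_hosts

      # Compare a terrestrial network (low delay) with a long-delay satellite link.
      for n in (1, 2, 4, 8, 16):
          lan = total_time(3600, n, comm_seconds_per_host=2)
          sat = total_time(3600, n, comm_seconds_per_host=30)
          print(f"{n:2d} hosts: terrestrial {lan:7.1f} s, satellite {sat:7.1f} s")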

  13. Emulation and Sobol' sensitivity analysis of an atmospheric dispersion model applied to the Fukushima nuclear accident

    NASA Astrophysics Data System (ADS)

    Girard, Sylvain; Mallet, Vivien; Korsakissok, Irène; Mathieu, Anne

    2016-04-01

    Simulations of the atmospheric dispersion of radionuclides involve large uncertainties originating from the limited knowledge of meteorological input data, composition, amount and timing of emissions, and some model parameters. The estimation of these uncertainties is an essential complement to modeling for decision making in case of an accidental release. We have studied the relative influence of a set of uncertain inputs on several outputs from the Eulerian model Polyphemus/Polair3D on the Fukushima case. We chose to use the variance-based sensitivity analysis method of Sobol'. This method requires a large number of model evaluations which was not achievable directly due to the high computational cost of Polyphemus/Polair3D. To circumvent this issue, we built a mathematical approximation of the model using Gaussian process emulation. We observed that aggregated outputs are mainly driven by the amount of emitted radionuclides, while local outputs are mostly sensitive to wind perturbations. The release height is notably influential, but only in the vicinity of the source. Finally, averaging either spatially or temporally tends to cancel out interactions between uncertain inputs.
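
    A minimal sketch of the emulation-plus-Sobol' workflow is given below, assuming the scikit-learn and SALib packages as stand-ins for the Gaussian process emulator and the Sobol' estimator; the toy function replaces the far more expensive Polyphemus/Polair3D model, and the input names and bounds are illustrative.

      # A toy stand-in for the expensive dispersion model: output depends on the
      # emitted amount, a wind perturbation, and the release height.
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from SALib.sample import saltelli
      from SALib.analyze import sobol

      def expensive_model(x):
          emission, wind, height = x
          return emission * np.exp(-0.1 * wind) + 0.05 * height

      problem = {"num_vars": 3,
                 "names": ["emission", "wind", "height"],
                 "bounds": [[0.5, 2.0], [0.0, 10.0], [20.0, 120.0]]}

      # 1) Train the emulator on a small design of "expensive" model runs.
      X_train = saltelli.sample(problem, 64)
      y_train = np.array([expensive_model(x) for x in X_train])
      emulator = GaussianProcessRegressor().fit(X_train, y_train)

      # 2) Run the many evaluations Sobol' analysis needs on the cheap emulator.
      X_big = saltelli.sample(problem, 1024)
      Si = sobol.analyze(problem, emulator.predict(X_big))
      print(dict(zip(problem["names"], Si["S1"])))  # first-order sensitivity indices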

  14. Automation of processing and photometric data analysis for transiting exoplanets observed with ESO NIR instrument HAWK-I

    NASA Astrophysics Data System (ADS)

    Blažek, M.; Kabáth, P.; Klocová, T.; Skarka, M.

    2018-04-01

    Nowadays, as the amount of data continues to increase, it is necessary to automate its processing. State-of-the-art instruments are capable of producing tens of thousands of images during a single night. One of them is HAWK-I, a near-infrared instrument that is part of the Very Large Telescope of the European Southern Observatory. In my Master's thesis, I developed a pipeline to process data obtained by this instrument. It is written in the Python programming language using commands of the IRAF astronomical software and is developed directly for the "Fast Photometry Mode" of HAWK-I. In this mode, a large amount of data has been obtained during secondary eclipses of exoplanets by their host stars. The pipeline was tested on a data set, from sorting the images through to producing a light curve. The data of the WASP-18 system contained almost 40 000 images observed using a filter centered at 2.09 μm, and there is a plan to process other data sets. The goal of processing WASP-18 and the other data sets is the subsequent analysis of the exoplanetary atmospheres of the observed systems.

  15. Development of a Real-Time Pulse Processing Algorithm for TES-Based X-Ray Microcalorimeters

    NASA Technical Reports Server (NTRS)

    Tan, Hui; Hennig, Wolfgang; Warburton, William K.; Doriese, W. Bertrand; Kilbourne, Caroline A.

    2011-01-01

    We report here a real-time pulse processing algorithm for superconducting transition-edge sensor (TES) based x-ray microcalorimeters. TES-based microcalorimeters offer ultra-high energy resolutions, but the small volume of each pixel requires that large arrays of identical microcalorimeter pixels be built to achieve sufficient detection efficiency. That in turn requires that as much pulse processing as possible be performed at the front end of the readout electronics to avoid transferring large amounts of data to a host computer for post-processing. Therefore, a real-time pulse processing algorithm that not only can be implemented in the readout electronics but also achieves satisfactory energy resolutions is desired. We have developed an algorithm that can be easily implemented in hardware. We then tested the algorithm offline using several data sets acquired with an 8 x 8 Goddard TES x-ray calorimeter array and a 2 x 16 NIST time-division SQUID multiplexer. We obtained an average energy resolution of close to 3.0 eV at 6 keV for the multiplexed pixels while preserving over 99% of the events in the data sets.

  16. Design of a novel automated methanol feed system for pilot-scale fermentation of Pichia pastoris.

    PubMed

    Hamaker, Kent H; Johnson, Daniel C; Bellucci, Joseph J; Apgar, Kristie R; Soslow, Sherry; Gercke, John C; Menzo, Darrin J; Ton, Christopher

    2011-01-01

    Large-scale fermentation of Pichia pastoris requires a large volume of methanol feed during the induction phase. However, a large volume of methanol feed is difficult to use in the processing suite because of the inconvenience of constant monitoring, manual manipulation steps, and fire and explosion hazards. To optimize and improve safety of the methanol feed process, a novel automated methanol feed system has been designed and implemented for industrial fermentation of P. pastoris. Details of the design of the methanol feed system are described. The main goals of the design were to automate the methanol feed process and to minimize the hazardous risks associated with storing and handling large quantities of methanol in the processing area. The methanol feed system is composed of two main components: a bulk feed (BF) system and up to three portable process feed (PF) systems. The BF system automatically delivers methanol from a central location to the portable PF system. The PF system provides precise flow control of linear, step, or exponential feed of methanol to the fermenter. Pilot-scale fermentations with linear and exponential methanol feeds were conducted using two Mut(+) (methanol utilization plus) strains, one expressing a recombinant therapeutic protein and the other a monoclonal antibody. Results show that the methanol feed system is accurate, safe, and efficient. The feed rates for both linear and exponential feed methods were within ± 5% of the set points, and the total amount of methanol fed was within 1% of the targeted volume. Copyright © 2011 American Institute of Chemical Engineers (AIChE).

  17. Removal characteristics of pharmaceuticals and personal care products: Comparison between membrane bioreactor and various biological treatment processes.

    PubMed

    Park, Junwon; Yamashita, Naoyuki; Park, Chulhwi; Shimono, Tatsumi; Takeuchi, Daniel M; Tanaka, Hiroaki

    2017-07-01

    We investigated the concentrations of 57 target compounds in the different treatment units of various biological treatment processes in South Korea, including modified biological nutrient removal (BNR), anaerobic-anoxic-aerobic (A2O), and membrane bioreactor (MBR) systems, to elucidate the occurrence and removal fates of PPCPs in WWTPs. Biological treatment processes appeared to be most effective in eliminating most PPCPs, whereas some PPCPs were additionally removed by post-treatment. With the exception of the MBR process, the A2O system was effective for PPCPs removal. As a result, removal mechanisms were evaluated by calculating the mass balances in A2O and a lab-scale MBR process. The comparative study demonstrated that biodegradation was largely responsible for the improved removal performance found in lab-scale MBR (e.g., in removing bezafibrate, ketoprofen, and atenolol). Triclocarban, ciprofloxacin, levofloxacin and tetracycline were adsorbed in large amounts to MBR sludge. Increased biodegradability was also observed in lab-scale MBR, despite the highly adsorbable characteristics. The enhanced biodegradation potential seen in the MBR process thus likely plays a key role in eliminating highly adsorbable compounds as well as non-degradable or persistent PPCPs in other biological treatment processes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Real Time Search Algorithm for Observation Outliers During Monitoring Engineering Constructions

    NASA Astrophysics Data System (ADS)

    Latos, Dorota; Kolanowski, Bogdan; Pachelski, Wojciech; Sołoducha, Ryszard

    2017-12-01

    Real-time monitoring of engineering structures in case of an emergency or disaster requires the collection of a large amount of data to be processed by specific analytical techniques. A quick and accurate assessment of the state of the object is crucial for a probable rescue action. One of the more significant methods for evaluating large sets of data, collected either during a specified interval of time or permanently, is time series analysis. In this paper a search algorithm is presented for those time series elements which deviate from the values expected during monitoring. Quick and proper detection of observations indicating anomalous behavior of the structure allows a variety of preventive actions to be taken. The mathematical formulae used in the algorithm provide maximal sensitivity to detect even minimal changes in the object's behavior. Sensitivity analyses were conducted for the moving-average algorithm as well as for the Douglas-Peucker algorithm used in the generalization of linear objects in GIS. In addition to determining the size of deviations from the average, the so-called Hausdorff distance was used. The simulations carried out and the verification against laboratory survey data showed that the approach provides sufficient sensitivity for automatic real-time analysis of large amounts of data obtained from various sensors (total stations, leveling, cameras, radar).
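
    In the spirit of the moving-average approach described above, the sketch below flags readings that deviate from a windowed mean by more than a multiple of the running standard deviation; the window size and threshold are illustrative and are not the authors' formulae.

      # A moving-average outlier check; window size and deviation threshold are
      # illustrative, not the values used in the study.
      import statistics
      from collections import deque

      def flag_outliers(observations, window=10, max_dev=3.0):
          """Yield (index, value) for readings that deviate from the moving mean of
          the previous `window` readings by more than `max_dev` running deviations."""
          recent = deque(maxlen=window)
          for i, x in enumerate(observations):
              if len(recent) == window:
                  mean = statistics.fmean(recent)
                  std = statistics.pstdev(recent) or 1e-9
                  if abs(x - mean) > max_dev * std:
                      yield i, x
              recent.append(x)

      readings = [10.0, 10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0, 9.9, 10.1, 13.5, 10.0]
      print(list(flag_outliers(readings)))  # -> [(10, 13.5)]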

  19. Calderas produced by hydromagmatic eruptions through permafrost in northwest Alaska

    NASA Technical Reports Server (NTRS)

    Beget, J. E.

    1993-01-01

    Most hydromagmatic eruptions on Earth are generated by interactions of lava and ground or surface water. This eruptive process typically produces craters 0.1-1 km in diameter, although a few as large as 1-2 km were described. In contrast, a series of Pleistocene hydromagmatic eruptions through 80-100-m-thick permafrost on the Seward Peninsula of Alaska produced four craters 3-8 km in diameter. These craters, called the Espenberg maars, are the four largest maars known on Earth. The thermodynamic properties of ground ice influence the rate and amount of water melted during the course of the eruption. Large quantities of water are present, but only small amounts can be melted at any time to interact with magma. This would tend to produce sustained and highly explosive low water/magma (fuel-coolant) ratios during the eruptions. An area of 400 km² around the Alaskan maars shows strong reductions in the density of thaw lakes, ground ice, and other surface manifestations of permafrost because of deep burial by coeval tephra falls. The unusually large Espenberg maars are the first examples of calderas produced by hydromagmatic eruptions. These distinctive landforms can apparently be used as an indicator of the presence of permafrost at the time of eruption.

  20. Consolidated View on Space Software Engineering Problems - An Empirical Study

    NASA Astrophysics Data System (ADS)

    Silva, N.; Vieira, M.; Ricci, D.; Cotroneo, D.

    2015-09-01

    Independent software verification and validation (ISVV) has been a key process for engineering quality assessment for decades, and is considered in several international standards. The “European Space Agency (ESA) ISVV Guide” is used in the European space market to drive the ISVV tasks and plans, and to select applicable tasks and techniques. Software artefacts have room for improvement, judging by the number of issues found during ISVV tasks. This article presents the analysis of the results of a large set of ISVV issues originating from three different ESA missions, amounting to more than 1000 issues. The study presents the main types, triggers and impacts related to the ISVV issues found, and sets the path for a global software engineering improvement based on the most common deficiencies identified for space projects.

  1. Stochastic theory of log-periodic patterns

    NASA Astrophysics Data System (ADS)

    Canessa, Enrique

    2000-12-01

    We introduce an analytical model based on birth-death clustering processes to help in understanding the empirical log-periodic corrections to power law scaling and the finite-time singularity as reported in several domains including rupture, earthquakes, world population and financial systems. In our stochastic theory log-periodicities are a consequence of transient clusters induced by an entropy-like term that may reflect the amount of co-operative information carried by the state of a large system of different species. The clustering completion rates for the system are assumed to be given by a simple linear death process. The singularity at t0 is derived in terms of birth-death clustering coefficients.
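
    For orientation only, the generic empirical form of a log-periodic correction to power-law scaling near the finite-time singularity at t0 is commonly written as below (a textbook parameterization with amplitudes A, B, C, exponent m, angular log-frequency ω and phase φ; this is not the authors' birth-death derivation).

      y(t) \simeq A + B\,(t_0 - t)^{m}\left[1 + C\cos\left(\omega \ln(t_0 - t) + \phi\right)\right], \qquad t \to t_0^{-}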

  2. Approaches to advance scientific understanding of macrosystems ecology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levy, Ofir; Ball, Becky; Bond-Lamberty, Benjamin

    Macrosystem ecological studies inherently investigate processes that interact across multiple spatial and temporal scales, requiring intensive sampling and massive amounts of data from diverse sources to incorporate complex cross-scale and hierarchical interactions. Inherent challenges associated with these characteristics include high computational demands, data standardization and assimilation, identification of important processes and scales without prior knowledge, and the need for large, cross-disciplinary research teams that conduct long-term studies. Therefore, macrosystem ecology studies must utilize a unique set of approaches that are capable of encompassing these methodological characteristics and associated challenges. Several case studies demonstrate innovative methods used in current macrosystem ecology studies.

  3. Program Helps Decompose Complex Design Systems

    NASA Technical Reports Server (NTRS)

    Rogers, James L., Jr.; Hall, Laura E.

    1995-01-01

    DeMAID (Design Manager's Aid for Intelligent Decomposition) computer program is knowledge-based software system for ordering sequence of modules and identifying possible multilevel structure for design problems such as large platforms in outer space. Groups modular subsystems on basis of interactions among them. Saves considerable amount of money and time in total design process, particularly in new design problem in which order of modules has not been defined. Originally written for design problems, also applicable to problems containing modules (processes) that take inputs and generate outputs. Available in three machine versions: Macintosh written in Symantec's Think C 3.01, Sun, and SGI IRIS in C language.

  4. Detergent composition comprising a cellulase containing cell-free fermentate produced from microorganism ATCC 55702 or mutant thereof

    DOEpatents

    Dees, H.C.

    1998-07-14

    Bacteria which produce large amounts of a cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase producing bacterium (ATCC 55702), which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic waste materials for fuel production, food processing, textile processing, and other industrial applications. ATCC 55702 is an improved bacterial host for genetic manipulations using recombinant DNA techniques, and is less likely to destroy genetic manipulations using standard mutagenesis techniques. 5 figs.

  5. Continuous process for singlet oxygenation of hydrophobic substrates in microemulsion using a pervaporation membrane.

    PubMed

    Caron, Laurent; Nardello, Véronique; Mugge, José; Hoving, Erik; Alsters, Paul L; Aubry, Jean-Marie

    2005-02-15

    Chemically generated singlet oxygen (1O2, 1Deltag), produced by molybdate-catalyzed hydrogen peroxide decomposition, is able to oxidize a wide range of hydrophobic substrates, provided a suitable reaction medium such as a microemulsion system is used. However, high substrate concentrations or poorly reactive organics require large amounts of H2O2, which generate large amounts of water and thus destabilize the system. We report results obtained on combining dark singlet oxygenation of hydrophobic substrates in microemulsions with a pervaporation membrane process. To avoid composition alterations after addition of H2O2 during the peroxidation, the reaction mixture circulates through a ceramic membrane module that enables a partial and selective dewatering of the microemulsion. Optimization phase diagrams of sodium molybdate/water/alcohol/anionic surfactant/organic solvent have been elaborated to maximize the catalyst concentration and therefore the reaction rate. The membrane selectivity towards the mixture constituents has been investigated, showing that a high retention is observed for the catalyst, the organic solvents and the hydrophobic substrates, but not for n-propanol (cosurfactant) and water. The efficiency of such a process is illustrated with the peroxidation of a poorly reactive substrate, viz., beta-pinene.

  6. An exploratory investigation of polar organic compounds in waters from a lead–zinc mine and mill complex

    USGS Publications Warehouse

    Rostad, Colleen E.; Schmitt, Christopher J.; Schumacher, John G.; Leiker, Thomas J.

    2011-01-01

    Surface water samples were collected in 2006 from a lead mine-mill complex in Missouri to investigate possible organic compounds coming from the milling process. Water samples contained relatively high concentrations of dissolved organic carbon (DOC; greater than 20 mg/l) for surface waters but were colorless, implying a lack of naturally occurring aquatic humic or fulvic acids. Samples were extracted by three different types of solid-phase extraction and analyzed by electrospray ionization/mass spectrometry. Because large amounts of xanthate complexation reagents are used in the milling process, techniques were developed to extract and analyze for sodium isopropyl xanthate and sodium ethyl xanthate. Although these xanthate reagents were not found, trace amounts of the degradates, isopropyl xanthyl thiosulfonate and isopropyl xanthyl sulfonate, were found in most locations sampled, including the tailings pond downstream. Dioctyl sulfosuccinate, a surfactant and process filtering aid, was found at concentrations estimated at 350 μg/l at one mill outlet, but not downstream. Release of these organic compounds downstream from lead-zinc mine and milling areas has not previously been reported. A majority of the DOC remains unidentified.

  7. Mining manufacturing data for discovery of high productivity process characteristics.

    PubMed

    Charaniya, Salim; Le, Huong; Rangwala, Huzefa; Mills, Keri; Johnson, Kevin; Karypis, George; Hu, Wei-Shou

    2010-06-01

    Modern manufacturing facilities for bioproducts are highly automated with advanced process monitoring and data archiving systems. The time dynamics of hundreds of process parameters and outcome variables over a large number of production runs are archived in the data warehouse. This vast amount of data is a vital resource to comprehend the complex characteristics of bioprocesses and enhance production robustness. Cell culture process data from 108 'trains' comprising production as well as inoculum bioreactors from Genentech's manufacturing facility were investigated. Each run constitutes over one-hundred on-line and off-line temporal parameters. A kernel-based approach combined with a maximum margin-based support vector regression algorithm was used to integrate all the process parameters and develop predictive models for a key cell culture performance parameter. The model was also used to identify and rank process parameters according to their relevance in predicting process outcome. Evaluation of cell culture stage-specific models indicates that production performance can be reliably predicted days prior to harvest. Strong associations between several temporal parameters at various manufacturing stages and final process outcome were uncovered. This model-based data mining represents an important step forward in establishing a process data-driven knowledge discovery in bioprocesses. Implementation of this methodology on the manufacturing floor can facilitate a real-time decision making process and thereby improve the robustness of large scale bioprocesses. 2010 Elsevier B.V. All rights reserved.
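
    A minimal sketch of margin-based support vector regression on process data is shown below, assuming scikit-learn; the synthetic features stand in for aggregated temporal cell culture parameters and the target for the final performance parameter, so neither the data nor the kernel choice reflects the actual Genentech study.

      # Synthetic stand-in data: 108 runs x 20 aggregated process parameters, with a
      # target "performance" value driven by two of them plus noise.
      import numpy as np
      from sklearn.svm import SVR
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      X = rng.normal(size=(108, 20))
      y = 0.8 * X[:, 3] - 0.5 * X[:, 7] + rng.normal(scale=0.1, size=108)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.05))
      model.fit(X_tr, y_tr)
      print("held-out R^2:", round(model.score(X_te, y_te), 3))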

  8. Processing of waste material of radix physochlainae for preparation of fine chemicals after extraction

    NASA Astrophysics Data System (ADS)

    He, A.; Yohannes, A.; Feng, X. T.; Yao, S.

    2017-02-01

    Waste residues of the traditional Chinese medicine radix physochlainae (Huashanshen) contain a large amount of hemicelluloses after extraction. After removal of the cellulose and lignin, the main components of the solution are hydrolysis products of hemicelluloses at different degrees of hydrolysis. In the degradation process, the hemicelluloses first become pentoses, and each pentose then loses three molecules of water and turns into furfural. This study explored a series of conditions for the method; the yield of furfural reaches 8.5% (calculated on the weight of the raw residues) at a pH of 0.2-0.3, a temperature of 104-106°C, and a hydrolysis duration of 10 minutes. Furfural can be further processed into resin materials.

  9. Simulation and analysis of main steam control system based on heat transfer calculation

    NASA Astrophysics Data System (ADS)

    Huang, Zhenqun; Li, Ruyan; Feng, Zhongbao; Wang, Songhan; Li, Wenbo; Cheng, Jiwei; Jin, Yingai

    2018-05-01

    In this paper, a 300 MW boiler of a thermal power plant was studied. MATLAB was used to write a calculation program for the heat transfer process between the main steam and the boiler flue gas, and the amount of water required to keep the main steam at the target temperature was calculated. The heat transfer calculation program was then introduced into the Simulink simulation platform as part of a multiple-model switching control system based on the heat transfer calculation. The results show that the multiple-model switching control system based on heat transfer calculation not only overcomes the large inertia and large hysteresis characteristic of the main steam temperature, but also adapts to changes in boiler load.

  10. GraphReduce: Processing Large-Scale Graphs on Accelerator-Based Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sengupta, Dipanjan; Song, Shuaiwen; Agarwal, Kapil

    2015-11-15

    Recent work on real-world graph analytics has sought to leverage the massive amount of parallelism offered by GPU devices, but challenges remain due to the inherent irregularity of graph algorithms and limitations in GPU-resident memory for storing large graphs. We present GraphReduce, a highly efficient and scalable GPU-based framework that operates on graphs that exceed the device's internal memory capacity. GraphReduce adopts a combination of edge- and vertex-centric implementations of the Gather-Apply-Scatter programming model and operates on multiple asynchronous GPU streams to fully exploit the high degrees of parallelism in GPUs with efficient graph data movement between the host and device.
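
    For readers unfamiliar with the programming model, the following is a minimal, CPU-only sketch of a Gather-Apply-Scatter superstep (here a PageRank-style update); GraphReduce's GPU streaming and out-of-core host-device data movement are not reproduced.

      # One Gather-Apply-Scatter superstep on a tiny directed graph; repeated
      # application converges to PageRank-like vertex values.
      def gas_step(edges, values, out_degree, damping=0.85):
          gathered = {v: 0.0 for v in values}
          for src, dst in edges:                       # Gather: sum over in-edges
              gathered[dst] += values[src] / out_degree[src]
          n = len(values)
          return {v: (1 - damping) / n + damping * g   # Apply: update vertex value
                  for v, g in gathered.items()}        # (Scatter feeds the next Gather)

      edges = [(0, 1), (1, 2), (2, 0), (0, 2)]
      out_degree = {0: 2, 1: 1, 2: 1}
      values = {0: 1 / 3, 1: 1 / 3, 2: 1 / 3}
      for _ in range(20):
          values = gas_step(edges, values, out_degree)
      print({v: round(x, 3) for v, x in values.items()})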

  11. A human factors approach to range scheduling for satellite control

    NASA Technical Reports Server (NTRS)

    Wright, Cameron H. G.; Aitken, Donald J.

    1991-01-01

    Range scheduling for satellite control presents a classical problem: supervisory control of a large-scale dynamic system, with unwieldy amounts of interrelated data used as inputs to the decision process. Increased automation of the task, with the appropriate human-computer interface, is highly desirable. The development and user evaluation of a semi-automated network range scheduling system is described. The system incorporates a synergistic human-computer interface consisting of a large screen color display, voice input/output, a 'sonic pen' pointing device, a touchscreen color CRT, and a standard keyboard. From a human factors standpoint, this development represents the first major improvement in almost 30 years to the satellite control network scheduling task.

  12. Comparing forest fragmentation and its drivers in China and the USA with Globcover v2.2

    USGS Publications Warehouse

    Chen, Mingshi; Mao, Lijun; Zhou, Chunguo; Vogelmann, James E.; Zhu, Zhiliang

    2010-01-01

    Forest loss and fragmentation are of major concern to the international community, in large part because they impact so many important environmental processes. The main objective of this study was to assess the differences in forest fragmentation patterns and drivers between China and the conterminous United States (USA). Using the latest 300-m resolution global land cover product, Globcover v2.2, a comparative analysis of forest fragmentation patterns and drivers was made. The fragmentation patterns were characterized by using a forest fragmentation model built on the sliding window analysis technique in association with landscape indices. Results showed that China’s forests were substantially more fragmented than those of the USA. This was evidenced by a large difference in the amount of interior forest area share, with China having 48% interior forest versus the 66% for the USA. China’s forest fragmentation was primarily attributed to anthropogenic disturbances, driven particularly by agricultural expansion from an increasing and large population, as well as poor forest management practices. In contrast, USA forests were principally fragmented by natural land cover types. However, USA urban sprawl contributed more to forest fragmentation than in China. This is closely tied to the USA’s economy, lifestyle and institutional processes. Fragmentation maps were generated from this study, which provide valuable insights and implications regarding habitat planning for rare and endangered species. Such maps enable development of strategic plans for sustainable forest management by identifying areas with high amounts of human-induced fragmentation, which improve risk assessments and enable better targeting for protection and remediation efforts. Because forest fragmentation is a long-term, complex process that is highly related to political, institutional, economic and philosophical arenas, both nations need to take effective and comprehensive measures to mitigate the negative effects of forest loss and fragmentation on the existing forest ecosystems.

  13. A novel method to reduce time investment when processing videos from camera trap studies.

    PubMed

    Swinnen, Kristijn R R; Reijniers, Jonas; Breno, Matteo; Leirs, Herwig

    2014-01-01

    Camera traps have proven very useful in ecological, conservation and behavioral research. Camera traps non-invasively record the presence and behavior of animals in their natural environment. Since the introduction of digital cameras, large amounts of data can be stored. Unfortunately, processing protocols have not evolved as fast as the technical capabilities of the cameras. We used camera traps to record videos of Eurasian beavers (Castor fiber). However, a large number of recordings did not contain the target species, consisting instead of empty recordings or recordings of other species (together, non-target recordings), and the removal of these recordings was unacceptably time consuming. In this paper we propose a method to partially eliminate non-target recordings without having to watch them, in order to reduce the workload. Discrimination between recordings of the target species and non-target recordings was based on detecting variation (changes in pixel values from frame to frame) in the recordings. Because of the size of the target species, we assumed that recordings with the target species contain, on average, much more movement than non-target recordings. Two different filter methods were tested and compared. We show that a partial discrimination can be made between target and non-target recordings based on variation in pixel values, and that environmental conditions and filter methods influence the amount of non-target recordings that can be identified and discarded. By allowing a loss of 5% to 20% of recordings containing the target species, in ideal circumstances, 53% to 76% of non-target recordings can be identified and discarded. We conclude that adding an extra processing step to the camera trap protocol can result in large time savings. Since we are convinced that the use of camera traps will become increasingly important in the future, this filter method can benefit many researchers, using it in different contexts across the globe, on both videos and photographs.
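
    To make the pixel-variation idea concrete, the sketch below keeps a clip for review when the fraction of notably changed pixels between consecutive frames is high; frames are assumed to be already decoded into grayscale NumPy arrays (e.g., with OpenCV), and both thresholds are illustrative rather than the values used in the study.

      # Frames are assumed to be grayscale arrays of shape (n_frames, height, width);
      # both thresholds below are illustrative.
      import numpy as np

      def fraction_changed_pixels(frames, pixel_threshold=15):
          """Mean fraction of pixels that change notably between consecutive frames."""
          diffs = np.abs(np.diff(frames.astype(np.int16), axis=0))
          return float((diffs > pixel_threshold).mean())

      def is_probably_target(frames, motion_threshold=0.02):
          """Keep a recording for manual review only if enough pixels vary over time."""
          return fraction_changed_pixels(frames) > motion_threshold

      rng = np.random.default_rng(1)
      static_clip = rng.integers(0, 5, size=(30, 120, 160))   # near-empty scene
      moving_clip = static_clip.copy()
      moving_clip[1::2, 40:80, 60:100] += 100                 # a large moving "animal"
      print(is_probably_target(static_clip), is_probably_target(moving_clip))  # False True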

  14. Comparing forest fragmentation and its drivers in China and the USA with Globcover v2.2.

    PubMed

    Li, Mingshi; Mao, Lijun; Zhou, Chunguo; Vogelmann, James E; Zhu, Zhiliang

    2010-12-01

    Forest loss and fragmentation are of major concern to the international community, in large part because they impact so many important environmental processes. The main objective of this study was to assess the differences in forest fragmentation patterns and drivers between China and the conterminous United States (USA). Using the latest 300-m resolution global land cover product, Globcover v2.2, a comparative analysis of forest fragmentation patterns and drivers was made. The fragmentation patterns were characterized by using a forest fragmentation model built on the sliding window analysis technique in association with landscape indices. Results showed that China's forests were substantially more fragmented than those of the USA. This was evidenced by a large difference in the amount of interior forest area share, with China having 48% interior forest versus the 66% for the USA. China's forest fragmentation was primarily attributed to anthropogenic disturbances, driven particularly by agricultural expansion from an increasing and large population, as well as poor forest management practices. In contrast, USA forests were principally fragmented by natural land cover types. However, USA urban sprawl contributed more to forest fragmentation than in China. This is closely tied to the USA's economy, lifestyle and institutional processes. Fragmentation maps were generated from this study, which provide valuable insights and implications regarding habitat planning for rare and endangered species. Such maps enable development of strategic plans for sustainable forest management by identifying areas with high amounts of human-induced fragmentation, which improve risk assessments and enable better targeting for protection and remediation efforts. Because forest fragmentation is a long-term, complex process that is highly related to political, institutional, economic and philosophical arenas, both nations need to take effective and comprehensive measures to mitigate the negative effects of forest loss and fragmentation on the existing forest ecosystems. Copyright © 2010 Elsevier Ltd. All rights reserved.

  15. Angular Momentum Transport in Thin Magnetically Arrested Disks

    NASA Astrophysics Data System (ADS)

    Marshall, Megan D.; Avara, Mark J.; McKinney, Jonathan C.

    2018-05-01

    In accretion disks with large-scale ordered magnetic fields, the magnetorotational instability (MRI) is marginally suppressed, so other processes may drive angular momentum transport leading to accretion. Accretion could then be driven by large-scale magnetic fields via magnetic braking, and large-scale magnetic flux can build up on the black hole and within the disk, leading to a magnetically-arrested disk (MAD). Such a MAD state is unstable to the magnetic Rayleigh-Taylor (RT) instability, which itself leads to vigorous turbulence and the emergence of low-density highly-magnetized bubbles. This instability was studied in a thin (ratio of half-height H to radius R, H/R ≈ 0.1) MAD simulation, where it has a more dramatic effect on the dynamics of the disk than for thicker disks. Large amounts of flux are pushed off the black hole into the disk, leading to temporary decreases in stress, then this flux is reprocessed as the stress increases again. Throughout this process, we find that the dominant component of the stress is due to turbulent magnetic fields, despite the suppression of the axisymmetric MRI and the dominant presence of large-scale magnetic fields. This suggests that the magnetic RT instability plays a significant role in driving angular momentum transport in MADs.

  16. Large optical glass blanks for the ELT generation

    NASA Astrophysics Data System (ADS)

    Jedamzik, Ralf; Petzold, Uwe; Dietrich, Volker; Wittmer, Volker; Rexius, Olga

    2016-07-01

    The upcoming extremely large telescope projects such as the E-ELT, TMT and GMT require not only large amounts of mirror blank substrates but also sophisticated instrument setups. Common instrument components are atmospheric dispersion correctors that compensate for the varying atmospheric path length depending on the telescope inclination angle. These elements usually consist of optical glass blanks that have to be large because of the increased size of the focal beam of the extremely large telescopes. SCHOTT has long experience in producing and delivering large optical glass blanks for astronomical applications, up to 1 m in size and in homogeneity grades up to H3 quality. The most common optical glass available in large formats is SCHOTT N-BK7, but other glass types such as F2 or LLF1 can also be produced in formats up to 1 m. The extremely large telescope projects partly demand atmospheric dispersion components in sizes beyond 1 m, up to about 1.5 m in diameter. The production of such large homogeneous optical glass blanks requires tight control of all process steps. To cover this demand, SCHOTT initiated a research project to improve the production process steps for large optical blanks, from melting to annealing and measurement. Large optical glass blanks are measured in several sub-apertures that cover the total clear aperture of the application. With SCHOTT's new stitching software it is now possible to combine individual sub-aperture measurements into a total homogeneity map of the blank. In this presentation first results will be demonstrated.

  17. Cold Trap Dismantling and Sodium Removal at a Fast Breeder Reactor - 12327

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graf, A.; Petrick, H.; Stutz, U.

    2012-07-01

    The first German prototype Fast Breeder Nuclear Reactor (KNK) is currently being dismantled after being the only operating Fast Breeder-type reactor in Germany. As this reactor type used sodium as a coolant in its primary and secondary circuits, seven cold traps containing various amounts of partially activated sodium needed to be disposed of as part of the dismantling. The resulting combined difficulties of radioactive contamination and high chemical reactivity were handled by treating the cold traps differently depending on their size and the amount of sodium contained inside. Six small cold traps were processed on site by cutting them up into small parts using a band saw under a protective atmosphere. The sodium was then converted to sodium hydroxide by using water. The remaining large cold trap could not be handled in the same way due to its dimensions (2.9 m x 1.1 m) and the declared amount of sodium inside (1,700 kg). It was therefore manually dismantled inside a large box filled with a protective atmosphere, while the resulting pieces were packaged for later burning in a special facility. The experience gained by KNK during this process may be advantageous for future dismantling projects in similar sodium-cooled reactors worldwide. The dismantling of a prototype fast breeder reactor provides the challenge not only of dismantling radioactive materials but also of handling sodium-contaminated or sodium-containing components. The treatment of sodium requires additional equipment and installations to ensure safe handling. Since it is not permitted to bring sodium into a repository, all sodium has to be neutralized, either through a controlled reaction with water or by incineration. The resulting components can be disposed of as normal radioactive waste with no further conditions. The handling of sodium needs skilled and experienced workers to minimize the inherent risks. The example of the disposal of the large KNK cold trap shows that interaction with other, including foreign, decommissioning projects can provide solutions which were unknown before. (authors)

  18. The distribution of soil phosphorus for global biogeochemical modeling

    DOE PAGES

    Yang, Xiaojuan; Post, Wilfred M.; Thornton, Peter E.; ...

    2013-04-16

    Phosphorus (P) is a major element required for biological activity in terrestrial ecosystems. Although the total P content in most soils can be large, only a small fraction is available or in an organic form for biological utilization, because it is bound either in incompletely weathered mineral particles, adsorbed on mineral surfaces, or, over the time of soil formation, made unavailable by secondary mineral formation (occluded). In order to adequately represent phosphorus availability in global biogeochemistry–climate models, a representation of the amount and form of P in soils globally is required. We develop an approach that builds on existing knowledge of soil P processes and databases of parent material and soil P measurements to provide spatially explicit estimates of different forms of naturally occurring soil P on the global scale. We assembled data on the various forms of phosphorus in soils globally, chronosequence information, and several global spatial databases to develop a map of total soil P and its distribution among mineral-bound, labile, organic, occluded, and secondary P forms in soils globally. The amount of P, to 50 cm soil depth, in the soil labile, organic, occluded, and secondary pools is 3.6 ± 3, 8.6 ± 6, 12.2 ± 8, and 3.2 ± 2 Pg P (petagrams of P, 1 Pg = 1 × 10^15 g), respectively. The amount in soil mineral particles to the same depth is estimated at 13.0 ± 8 Pg P, for a global soil total of 40.6 ± 18 Pg P. The large uncertainty in our estimates reflects our limited understanding of the processes controlling soil P transformations during pedogenesis and a deficiency in the number of soil P measurements. In spite of the large uncertainty, the estimated global spatial variation and distribution of different soil P forms presented in this study will be useful for global biogeochemistry models that include P as a limiting element in biological production, by providing initial estimates of the available soil P for plant uptake and microbial utilization.

  19. Use of tropical maize for bioethanol production

    USDA-ARS?s Scientific Manuscript database

    Tropical maize is an alternative energy crop being considered as a feedstock for bioethanol production in the North Central and Midwest United States. Tropical maize is advantageous because it produces large amounts of soluble sugars in its stalks, creates a large amount of biomass, and requires lo...

  20. Variability of mass-wasting processes in the tectonically-controlled Calabro Tyrrhenian continental margin (Southern Italy)

    NASA Astrophysics Data System (ADS)

    Casalbore, D.; Bosman, A.; Casas, D.; Chiocci, F. L.; Martorelli, E.; Ridente, D.

    2017-12-01

    The recent collection of multibeam bathymetry and single-channel seismic profiles on the Calabro-Tyrrhenian continental margin in Southern Italy allowed us to document a large suite of mass-wasting processes, ranging from gullies up to a shelf-indenting canyon system, along with over 400 landslide scars affecting 52% of the entire area. In detail, slide scars occur from the coast down to -1700 m, with mobilized volumes ranging from some hundreds of m3 up to tens of millions of m3. On the whole, they affect an area of >85 km2 and are able to mobilize approximately 1.4 km3. These slides also show a large variability of features in the headwall, translational and toe domains, thus providing useful insights for a better understanding of their failure and post-failure behavior. The aim of this study is to show the magnitude-frequency relationship of this large population of slides in order to quantify a range of probabilities for the occurrence of new landslide events, as well as to illustrate the main mechanisms that control their development and emplacement.

  1. QUAL-NET, a high temporal-resolution eutrophication model for large hydrographic networks

    NASA Astrophysics Data System (ADS)

    Minaudo, Camille; Curie, Florence; Jullian, Yann; Gassama, Nathalie; Moatar, Florentina

    2018-04-01

    To allow climate change impact assessment of water quality in river systems, the scientific community lacks efficient deterministic models able to simulate hydrological and biogeochemical processes in drainage networks at the regional scale, with high temporal resolution and water temperature explicitly determined. The model QUALity-NETwork (QUAL-NET) was developed and tested on the Middle Loire River Corridor, a sub-catchment of the Loire River in France, prone to eutrophication. Hourly variations computed efficiently by the model helped disentangle the complex interactions existing between hydrological and biological processes across different timescales. Phosphorus (P) availability was the most constraining factor for phytoplankton development in the Loire River, but simulating bacterial dynamics in QUAL-NET surprisingly evidenced large amounts of organic matter recycled within the water column through the microbial loop, which delivered significant fluxes of available P and enhanced phytoplankton growth. This explained why severe blooms still occur in the Loire River despite large P input reductions since 1990. QUAL-NET could be used to study past evolutions or predict future trajectories under climate change and land use scenarios.

  2. Making Activated Carbon by Wet Pressurized Pyrolysis

    NASA Technical Reports Server (NTRS)

    Fisher, John W.; Pisharody, Suresh; Wignarajah, K.; Moran, Mark

    2006-01-01

    A wet pressurized pyrolysis (wet carbonization) process has been invented as a means of producing activated carbon from a wide variety of inedible biomass consisting principally of plant wastes. The principal intended use of this activated carbon is room-temperature adsorption of pollutant gases from cooled incinerator exhaust streams. Activated carbon is highly porous and has a large surface area. The surface area depends strongly on the raw material and the production process. Coconut shells and bituminous coal are the primary raw materials that, until now, were converted into activated carbon of commercially acceptable quality by use of traditional production processes that involve activation by use of steam or carbon dioxide. In the wet pressurized pyrolysis process, the plant material is subjected to high pressure and temperature in an aqueous medium in the absence of oxygen for a specified amount of time to break carbon-oxygen bonds in the organic material and modify the structure of the material to obtain large surface area. Plant materials that have been used in demonstrations of the process include inedible parts of wheat, rice, potato, soybean, and tomato plants. The raw plant material is ground and mixed with a specified proportion of water. The mixture is placed in a stirred autoclave, wherein it is pyrolized at a temperature between 450 and 590 F (approximately between 230 and 310 C) and a pressure between 1 and 1.4 kpsi (approximately between 7 and 10 MPa) for a time between 5 minutes and 1 hour. The solid fraction remaining after wet carbonization is dried, then activated at a temperature of 500 F (260 C) in nitrogen gas. The activated carbon thus produced is comparable to commercial activated carbon. It can be used to adsorb oxides of sulfur, oxides of nitrogen, and trace amounts of hydrocarbons, any or all of which can be present in flue gas. Alternatively, the dried solid fraction can be used, even without the activation treatment, to absorb oxides of nitrogen.

  3. Cloud Based Metalearning System for Predictive Modeling of Biomedical Data

    PubMed Central

    Vukićević, Milan

    2014-01-01

    Rapid growth in the collection and storage of biomedical data has created many opportunities for predictive modeling and the improvement of healthcare processes. On the other hand, the analysis of such large amounts of data is a difficult and computationally intensive task for most existing data mining algorithms. This problem is addressed by proposing a cloud-based system that integrates a metalearning framework for ranking and selecting the best predictive algorithms for the data at hand with open-source big data technologies for the analysis of biomedical data. PMID:24892101

  4. Uric Acid and Antioxidant Effects of Wine

    PubMed Central

    Boban, Mladen; Modun, Darko

    2010-01-01

    The aim of this article is to review the role of uric acid in the context of the antioxidant effects of wine and its potential implications for human health. We described and discussed the mechanisms of the increase in plasma antioxidant capacity after consumption of moderate amounts of wine. Because this effect is largely attributable to an acute elevation in plasma uric acid, we paid special attention to wine constituents and metabolic processes that are likely to be involved in uric acid elevation. PMID:20162741

  5. Retraining Attentional Bias to Unhealthy Food Cues

    DTIC Science & Technology

    2013-06-26

    Obesity 'Overweight' and 'Obese' describe levels of body fat (i.e., adiposity) that exceed ranges of weight that are considered healthy for a given... to an accumulation of excess body fat. Obesity is further divided into Class I (30.0-34.9 kg/m2) and Class II obesity (35.0-39.9 kg/m2) and Class... weight gain through excessive consumption of palatable foods that contain large amounts of fat and sugar. Processed foods such as those from fast food

  6. Mutual information based feature selection for medical image retrieval

    NASA Astrophysics Data System (ADS)

    Zhi, Lijia; Zhang, Shaomin; Li, Yan

    2018-04-01

    In this paper, the authors propose a mutual information based method for lung CT image retrieval. The method is designed to adapt to different datasets and different retrieval tasks. For practical applicability, it avoids relying on a large amount of training data; instead, with a well-designed training process and robust fundamental features and measurements, it achieves promising performance while keeping training computation economical. Experimental results show that the method has potential practical value for routine clinical application.
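    As a minimal sketch of the general technique named here (not the authors' actual pipeline), the snippet below ranks candidate image features by their mutual information with relevance labels and keeps the top-scoring ones; the feature matrix, labels, and the choice of k are placeholders.

    # Sketch: rank candidate image features by mutual information with
    # retrieval-relevant class labels, then keep the top-k features.
    # X (n_images x n_features) and y are placeholders for the output of an
    # upstream CT feature-extraction step (not shown).
    import numpy as np
    from sklearn.feature_selection import mutual_info_classif

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 32))          # placeholder feature matrix
    y = rng.integers(0, 2, size=200)        # placeholder relevance labels

    mi = mutual_info_classif(X, y, random_state=0)   # MI estimate per feature
    top_k = 8
    selected = np.argsort(mi)[::-1][:top_k]          # most informative features
    X_reduced = X[:, selected]
    print("selected feature indices:", selected)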

  7. The evolution of educational information systems and nurse faculty roles.

    PubMed

    Nelson, Ramona; Meyers, Linda; Rizzolo, Mary Anne; Rutar, Pamela; Proto, Marcia B; Newbold, Susan

    2006-01-01

    Institutions of higher education are purchasing and/or designing sophisticated administrative information systems to manage such functions as the application, admissions, and registration process, grants management, student records, and classroom scheduling. Although faculty also manage large amounts of data, few automated systems have been created to help faculty improve teaching and learning through the management of information related to individual students, the curriculum, educational programs, and program evaluation. This article highlights the potential benefits that comprehensive educational information systems offer nurse faculty.

  8. Energy carries information

    NASA Astrophysics Data System (ADS)

    Ilgin, Irfan; Yang, I.-Sheng

    2014-08-01

    We show that for every qubit of quantum information there is a well-defined notion of "the amount of energy that carries it," because it is a conserved quantity. This generalizes to larger systems and to any conserved quantity: the eigenvalue spectrum of conserved charges has to be preserved while transferring quantum information. It is possible to "apparently" violate these conservation laws by losing a small fraction of the information, but that must invoke a specific process which requires large-scale coherence. We discuss the implications for the black hole information paradox.

  9. Quantitative mapping of rainfall rates over the oceans utilizing Nimbus-5 ESMR data

    NASA Technical Reports Server (NTRS)

    Rao, M. S. V.; Abbott, W. V.

    1976-01-01

    The electrically scanning microwave radiometer (ESMR) data from the Nimbus 5 satellite was used to deduce estimates of precipitation amount over the oceans. An atlas of the global oceanic rainfall was prepared and the global rainfall maps analyzed and related to available ground truth information as well as to large scale processes in the atmosphere. It was concluded that the ESMR system provides the most reliable and direct approach yet known for the estimation of rainfall over sparsely documented, wide oceanic regions.

  10. 1H-NMR and HPLC studies of the changes involved in volume regulation in the muscle fibres of the crab, Hemigrapsus edwardsi.

    PubMed

    Bedford, J J; Smith, R A; Thomas, M; Leader, J P

    1991-01-01

    1. The process of cell volume readjustment during adaptation to salinity changes in muscle fibres of the euryhaline New Zealand shore crab, Hemigrapsus edwardsi, involves large changes in the amounts of free amino acids. 2. These are taurine, proline, alanine, arginine, glutamic acid, glycine and serine. 3. These changes may be quantified by high-performance liquid chromatography and qualitatively demonstrated by proton nuclear magnetic resonance spectroscopy.

  11. Universal SaaS platform of internet of things for real-time monitoring

    NASA Astrophysics Data System (ADS)

    Liu, Tongke; Wu, Gang

    2018-04-01

    Real-time monitoring, as one class of IoT (Internet of Things) service, has a wide range of application scenarios. To support the rapid construction and deployment of applications and to avoid repetitive development work, this paper designs and develops a universal SaaS platform of IoT for real-time monitoring. Evaluation shows that the platform can provide SaaS service to multiple tenants and achieve high real-time performance even with a large number of connected devices.

  12. Experience in highly parallel processing using DAP

    NASA Technical Reports Server (NTRS)

    Parkinson, D.

    1987-01-01

    Distributed Array Processors (DAP) have been in day-to-day use for ten years, and a large amount of user experience has been gained. The profile of user applications is similar to that of the Massively Parallel Processor (MPP) working group. Experience has shown that, contrary to expectations, highly parallel systems provide excellent performance on so-called dirty problems such as the physics part of meteorological codes. The reasons for this observation are discussed. The arguments against replacing bit processors with floating point processors are also discussed.

  13. The dynamic ejecta of compact object mergers and eccentric collisions.

    PubMed

    Rosswog, Stephan

    2013-06-13

    Compact object mergers eject neutron-rich matter in a number of ways: by the dynamical ejection mediated by gravitational torques, as neutrino-driven winds, and probably also a good fraction of the resulting accretion disc finally becomes unbound by a combination of viscous and nuclear processes. If compact binary mergers indeed produce gamma-ray bursts, there should also be an interaction region where an ultra-relativistic outflow interacts with the neutrino-driven wind and produces moderately relativistic ejecta. Each type of ejecta has different physical properties, and therefore plays a different role for nucleosynthesis and for the electromagnetic (EM) transients that go along with compact object encounters. Here, we focus on the dynamic ejecta and present results for over 30 hydrodynamical simulations of both gravitational wave-driven mergers and parabolic encounters as they may occur in globular clusters. We find that mergers eject approximately 1 per cent of a Solar mass of extremely neutron-rich material. The exact amount, as well as the ejection velocity, depends on the involved masses with asymmetric systems ejecting more material at higher velocities. This material undergoes a robust r-process and both ejecta amount and abundance pattern are consistent with neutron star mergers being a major source of the 'heavy' (A>130) r-process isotopes. Parabolic collisions, especially those between neutron stars and black holes, eject substantially larger amounts of mass, and therefore cannot occur frequently without overproducing galactic r-process matter. We also discuss the EM transients that are powered by radioactive decays within the ejecta ('macronovae'), and the radio flares that emerge when the ejecta dissipate their large kinetic energies in the ambient medium.

  14. A New Framework and Prototype Solution for Clinical Decision Support and Research in Genomics and Other Data-intensive Fields of Medicine

    PubMed Central

    Evans, James P.; Wilhelmsen, Kirk C.; Berg, Jonathan; Schmitt, Charles P.; Krishnamurthy, Ashok; Fecho, Karamarie; Ahalt, Stanley C.

    2016-01-01

    Introduction: In genomics and other fields, it is now possible to capture and store large amounts of data in electronic medical records (EMRs). However, it is not clear if the routine accumulation of massive amounts of (largely uninterpretable) data will yield any health benefits to patients. Nevertheless, the use of large-scale medical data is likely to grow. To meet emerging challenges and facilitate optimal use of genomic data, our institution initiated a comprehensive planning process that addresses the needs of all stakeholders (e.g., patients, families, healthcare providers, researchers, technical staff, administrators). Our experience with this process and a key genomics research project contributed to the proposed framework. Framework: We propose a two-pronged Genomic Clinical Decision Support System (CDSS) that encompasses the concept of the “Clinical Mendeliome” as a patient-centric list of genomic variants that are clinically actionable and introduces the concept of the “Archival Value Criterion” as a decision-making formalism that approximates the cost-effectiveness of capturing, storing, and curating genome-scale sequencing data. We describe a prototype Genomic CDSS that we developed as a first step toward implementation of the framework. Conclusion: The proposed framework and prototype solution are designed to address the perspectives of stakeholders, stimulate effective clinical use of genomic data, drive genomic research, and meet current and future needs. The framework also can be broadly applied to additional fields, including other ‘-omics’ fields. We advocate for the creation of a Task Force on the Clinical Mendeliome, charged with defining Clinical Mendeliomes and drafting clinical guidelines for their use. PMID:27195307

  15. Interactive Exploration on Large Genomic Datasets.

    PubMed

    Tu, Eric

    2016-01-01

    The prevalence of large genomics datasets has made the need to explore this data more important. Large sequencing projects like the 1000 Genomes Project [1], which reconstructed the genomes of 2,504 individuals sampled from 26 populations, have produced over 200TB of publicly available data. Meanwhile, existing genomic visualization tools have been unable to scale with the growing amount of larger, more complex data. This difficulty is acute when viewing large regions (over 1 megabase, or 1,000,000 bases of DNA), or when concurrently viewing multiple samples of data. While genomic processing pipelines have shifted towards using distributed computing techniques, such as with ADAM [4], genomic visualization tools have not. In this work we present Mango, a scalable genome browser built on top of ADAM that can run both locally and on a cluster. Mango combines different optimizations in a single application to drive novel genomic visualization techniques over terabytes of genomic data. By building visualization on top of a distributed processing pipeline, we can perform visualization queries over large regions that are not possible with current tools, and decrease the time needed to view large datasets. Mango is part of the Big Data Genomics project at the University of California, Berkeley [25] and is published under the Apache 2 license. Mango is available at https://github.com/bigdatagenomics/mango.

  16. Gene Expression Browser: Large-Scale and Cross-Experiment Microarray Data Management, Search & Visualization

    USDA-ARS?s Scientific Manuscript database

    The amount of microarray gene expression data in public repositories has been increasing exponentially for the last couple of decades. High-throughput microarray data integration and analysis has become a critical step in exploring the large amount of expression data for biological discovery. Howeve...

  17. New Insights into Handling Missing Values in Environmental Epidemiological Studies

    PubMed Central

    Roda, Célina; Nicolis, Ioannis; Momas, Isabelle; Guihenneuc, Chantal

    2014-01-01

    Missing data are unavoidable in environmental epidemiologic surveys. The aim of this study was to compare methods for handling large amounts of missing values: omission of missing values, single and multiple imputations (through linear regression or partial least squares regression), and a fully Bayesian approach. These methods were applied to the PARIS birth cohort, where indoor domestic pollutant measurements were performed in a random sample of babies' dwellings. A simulation study was conducted to assess the performance of the different approaches with a high proportion of missing values (from 50% to 95%). Different simulation scenarios were carried out, controlling the true value of the association (odds ratio of 1.0, 1.2, and 1.4), and varying the health outcome prevalence. When a large amount of data is missing, omitting these missing data reduced statistical power and inflated standard errors, which affected the significance of the association. Single imputation underestimated the variability and considerably increased the risk of type I error. All approaches were conservative, except the Bayesian joint model. In the case of a common health outcome, the fully Bayesian approach is the most efficient approach (low root mean square error, reasonable type I error, and high statistical power). Nevertheless, for a less prevalent event, the type I error is increased and the statistical power is reduced. The estimated posterior distribution of the odds ratio is useful to refine the conclusion. Among the methods for handling missing values, no approach is absolutely the best, but when the usual approaches (e.g., single imputation) are not sufficient, jointly modelling the missingness process and the health association is more efficient when large amounts of data are missing. PMID:25226278
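    The sketch below illustrates only one of the compared strategies, multiple imputation, using scikit-learn's chained-equations imputer with posterior sampling and a simple average of the estimates across imputed datasets; the exposure, covariate, and outcome names are illustrative, and the study's fully Bayesian joint model is not reproduced.

    # Sketch: multiple imputation of a pollutant exposure with many missing
    # values, followed by a pooled logistic-regression estimate across the
    # imputed datasets. All variable names and values are placeholders.
    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 500
    covariate = rng.normal(size=n)
    exposure = 0.5 * covariate + rng.normal(size=n)
    outcome = (rng.random(n) < 1 / (1 + np.exp(-0.3 * exposure))).astype(int)
    exposure[rng.random(n) < 0.6] = np.nan      # 60% missing exposure values

    X = np.column_stack([exposure, covariate])
    coefs = []
    for m in range(20):                          # 20 imputed datasets
        imp = IterativeImputer(sample_posterior=True, random_state=m)
        X_imp = imp.fit_transform(X)
        model = LogisticRegression().fit(X_imp, outcome)
        coefs.append(model.coef_[0, 0])

    # Rubin's rules would also pool the variances; only the point estimate is shown.
    print("pooled exposure log-odds ratio:", np.mean(coefs))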

  18. Distribution and biokinetic analysis of 210Pb and 210Po in poultry due to ingestion of dicalcium phosphate.

    PubMed

    Casacuberta, N; Traversa, F L; Masqué, P; Garcia-Orellana, J; Anguita, M; Gasa, J; Garcia-Tenorio, R

    2010-09-15

    Dicalcium phosphate (DCP) is used as a calcium supplement for food producing animals (i.e., cattle, poultry and pig). When DCP is produced via wet acid digestion of the phosphate rock and depending on the acid used in the industrial process, the final product can result in enhanced (210)Pb and (210)Po specific activities (approximately 2000 Bq.kg(-1)). Both (210)Pb and (210)Po are of great interest because their contribution to the dose received by ingestion is potentially large. The aims of this work are to examine the accumulation of (210)Pb and (210)Po in chicken tissues during the first 42 days of life and to build a suitable single-compartment biokinetic model to understand the behavior of both radionuclides within the entire animal using the experimental results. Three commercial corn-soybean-based diets containing different amounts and sources of DCP were fed to broilers during a period of 42 days. The results show that diets containing enhanced concentrations of (210)Pb and (210)Po lead to larger specific accumulation in broiler tissues compared to the blank diet. Radionuclides do not accumulate homogeneously within the animal body: (210)Pb follows the calcium pathways to some extent and accumulates largely in bones, while (210)Po accumulates to a large extent in liver and kidneys. However, the total amount of radionuclide accumulation in tissues is small compared to the amounts excreted in feces. The single-compartment non-linear biokinetic model proposed here for (210)Pb and (210)Po in the whole animal takes into account the size evolution and is self-consistent in that no fitting parameterization of intake and excretions rates is required. Copyright 2010 Elsevier B.V. All rights reserved.

  19. Data acquisition system issues for large experiments

    NASA Astrophysics Data System (ADS)

    Siskind, E. J.

    2007-09-01

    This talk consists of personal observations on two classes of data acquisition ("DAQ") systems for Silicon trackers in large experiments with which the author has been concerned over the last three or more years. The first half is a classic "lessons learned" recital based on experience with the high-level debug and configuration of the DAQ system for the GLAST LAT detector. The second half is concerned with a discussion of the promises and pitfalls of using modern (and future) generations of "system-on-a-chip" ("SOC") or "platform" field-programmable gate arrays ("FPGAs") in future large DAQ systems. The DAQ system pipeline for the 864k channels of Si tracker in the GLAST LAT consists of five tiers of hardware buffers which ultimately feed into the main memory of the (two-active-node) level-3 trigger processor farm. The data formats and buffer volumes of these tiers are briefly described, as well as the flow control employed between successive tiers. Lessons learned regarding data formats, buffer volumes, and flow control/data discard policy are discussed. The continued development of platform FPGAs containing large amounts of configurable logic fabric, embedded PowerPC hard processor cores, digital signal processing components, large volumes of on-chip buffer memory, and multi-gigabit serial I/O capability permits DAQ system designers to vastly increase the amount of data preprocessing that can be performed in parallel within the DAQ pipeline for detector systems in large experiments. The capabilities of some currently available FPGA families are reviewed, along with the prospects for next-generation families of announced, but not yet available, platform FPGAs. Some experience with an actual implementation is presented, and reconciliation between advertised and achievable specifications is attempted. The prospects for applying these components to space-borne Si tracker detectors are briefly discussed.

  20. Large-scale circulation patterns, instability factors and global precipitation modeling as influenced by external forcing

    NASA Astrophysics Data System (ADS)

    Bundel, A.; Kulikova, I.; Kruglova, E.; Muravev, A.

    2003-04-01

    The scope of the study is to estimate the relationship between large-scale circulation regimes, various instability indices and global precipitation under different boundary conditions, considered as external forcing. The experiments were carried out in the ensemble-prediction framework of the dynamic-statistical monthly forecast scheme run in the Hydrometeorological Research Center of Russia every ten days. The extension to seasonal intervals makes it necessary to investigate the role of slowly changing boundary conditions, among which the sea surface temperature (SST) is probably the most effective factor. Continuous integrations of the global spectral T41L15 model for the whole year 2000 (starting from January 1) were performed with the climatic SST and the Reynolds Archive SSTs. Monthly SST values were interpolated to daily values using a spline technique. First, the global precipitation values in the experiments were compared to the GPCP (Global Precipitation Climatology Project) daily observation data. Although the global mean precipitation is underestimated by the model, some large-scale regional amounts (e.g. for Europe) correspond to the real ones fairly well. On the whole, however, anomaly phases failed to be reproduced. Precipitation averaged over land revealed a greater sensitivity to the SSTs than that over the oceans. The wavelet analysis was applied to separate the low- and high-frequency signal of the SST influence on the large-scale circulation and precipitation. A derivative of the Wallace-Gutzler teleconnection index for the East-Atlantic oscillation was taken as the circulation characteristic. The daily oscillation index values and precipitation amounts averaged over Europe were decomposed using a wavelet approach with different “mother wavelets” up to approximation level 3. It was demonstrated that an increase in the precipitation amount over Europe was associated with intensification of the zonal flow over the North Atlantic when the real SSTs were used. Blocking structures in the circulation led to decreased precipitation amounts. The wavelet approach discriminated the modeled circulation and precipitation patterns under different external forcing more clearly than a number of other statistical techniques. Several atmospheric instability indices (e.g., Phillips-like parameters, the Richardson number, etc.) were additionally used in post-processing for a more detailed validation of the modeled large-scale and total precipitation amounts. It was shown that a reasonable variety of instability indices must be used for such validations and for precipitation output corrections. Their statistical stability may be substantiated only on an ensemble modeling basis. This work was performed with the financial support of the Russian Foundation for Basic Research (02-05-64655).
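    As an illustration of the decomposition step described above (assuming the PyWavelets library and an arbitrary 'db4' mother wavelet rather than the ones actually used in the study), the sketch below separates a daily series into a level-3 approximation and higher-frequency detail.

    # Sketch: level-3 discrete wavelet decomposition of a daily time series
    # (e.g., a circulation index or area-averaged precipitation), separating a
    # slowly varying approximation from higher-frequency detail components.
    import numpy as np
    import pywt

    rng = np.random.default_rng(2)
    t = np.arange(365)
    series = np.sin(2 * np.pi * t / 90) + 0.5 * rng.normal(size=t.size)  # synthetic daily signal

    coeffs = pywt.wavedec(series, 'db4', level=3)        # [cA3, cD3, cD2, cD1]

    # Reconstruct the low-frequency (level-3 approximation) part only.
    low_freq_coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    low_freq = pywt.waverec(low_freq_coeffs, 'db4')[:series.size]

    print("variance explained by level-3 approximation:",
          np.var(low_freq) / np.var(series))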

  1. Consumption with Large Sip Sizes Increases Food Intake and Leads to Underestimation of the Amount Consumed

    PubMed Central

    Bolhuis, Dieuwerke P.; Lakemond, Catriona M. M.; de Wijk, Rene A.; Luning, Pieternel A.; de Graaf, Cees

    2013-01-01

    Background A number of studies have shown that bite and sip sizes influence the amount of food intake. Consuming with small sips instead of large sips means relatively more sips for the same amount of food to be consumed; people may believe that intake is higher which leads to faster satiation. This effect may be disturbed when people are distracted. Objective The objective of the study is to assess the effects of sip size in a focused state and a distracted state on ad libitum intake and on the estimated amount consumed. Design In this 3×2 cross-over design, 53 healthy subjects consumed ad libitum soup with small sips (5 g, 60 g/min), large sips (15 g, 60 g/min), and free sips (where sip size was determined by subjects themselves), in both a distracted and focused state. Sips were administered via a pump. There were no visual cues toward consumption. Subjects then estimated how much they had consumed by filling soup in soup bowls. Results Intake in the small-sip condition was ∼30% lower than in both the large-sip and free-sip conditions (P<0.001). In addition, subjects underestimated how much they had consumed in the large-sip and free-sip conditions (P<0.03). Distraction led to a general increase in food intake (P = 0.003), independent of sip size. Distraction did not influence sip size or estimations. Conclusions Consumption with large sips led to higher food intake, as expected. Large sips, that were either fixed or chosen by subjects themselves led to underestimations of the amount consumed. This may be a risk factor for over-consumption. Reducing sip or bite sizes may successfully lower food intake, even in a distracted state. PMID:23372657

  2. Model for fluorescence quenching in light harvesting complex II in different aggregation states.

    PubMed

    Andreeva, Atanaska; Abarova, Silvia; Stoitchkova, Katerina; Busheva, Mira

    2009-02-01

    Low-temperature (77 K) steady-state fluorescence emission spectroscopy and dynamic light scattering were applied to the main chlorophyll a/b protein light harvesting complex of photosystem II (LHC II) in different aggregation states to elucidate the mechanism of fluorescence quenching within LHC II oligomers. Evidence is presented that LHC II oligomers are heterogeneous and consist of large and small particles with different fluorescence yield. At intermediate detergent concentrations the mean size of the small particles is similar to that of trimers, while the size of large particles is comparable to that of aggregated trimers without added detergent. It is suggested that in small particles and trimers the emitter is monomeric chlorophyll, whereas in large aggregates there is also another emitter, which is a poorly fluorescing chlorophyll associate. A model, describing populations of antenna chlorophyll molecules in small and large aggregates in their ground and first singlet excited states, is considered. The model enables us to obtain the ratio of the singlet excited-state lifetimes in small and large particles, the relative amount of chlorophyll molecules in large particles, and the amount of quenchers as a function of the degree of aggregation. These dependencies reveal that the quenching of the chlorophyll a fluorescence upon aggregation is due to the formation of large aggregates and the increase in the amount of chlorophyll molecules forming these aggregates. As a consequence, the amount of quenchers located in large aggregates increases, and their singlet excited-state lifetimes decrease steeply.

  3. Development of Ti microalloyed high strength steel plate by controlling thermo-mechanical control process schedule

    NASA Astrophysics Data System (ADS)

    Xia, Jinian; Huo, Xiangdong; Li, Liejun; Peng, Zhengwu; Chen, Songjun

    2017-12-01

    In this study, the TMCP parameters, including the non-recrystallization temperature (Tnr) and the optimal isothermal temperature, were determined by thermal simulation experiments, and a new Ti microalloyed high strength steel plate was developed by controlling the thermo-mechanical control process (TMCP) schedule. The effects of the TMCP process on the microstructural features, precipitation behavior and mechanical properties of Ti microalloyed high strength steel plate were investigated. The results revealed that the double-stage rolling process, consisting of rolling in the γ recrystallization region and the γ non-recrystallization region, was beneficial to improving the mechanical properties of the Ti microalloyed steel by achieving grain refinement. It was also found that large amounts of fine TiC (<10 nm) particles precipitated during the isothermal treatment at 600 °C, which generated a 215 MPa precipitation strengthening effect.

  4. Processing of oats and the impact of processing operations on nutrition and health benefits.

    PubMed

    Decker, Eric A; Rose, Devin J; Stewart, Derek

    2014-10-01

    Oats are a uniquely nutritious food as they contain an excellent lipid profile and high amounts of soluble fibre. However, an oat kernel is largely non-digestible and thus must be utilised in milled form to reap its nutritional benefits. Milling is made up of numerous steps, the most important being dehulling to expose the digestible groat, heat processing to inactivate enzymes that cause rancidity, and cutting, rolling or grinding to convert the groat into a product that can be used directly in oatmeal or can be used as a food ingredient in products such as bread, ready-to-eat breakfast cereals and snack bars. Oats can also be processed into oat bran and fibre to obtain high-fibre-containing fractions that can be used in a variety of food products.

  5. Highly Efficient Erythritol Recovery from Waste Erythritol Mother Liquor by a Yeast-Mediated Biorefinery Process.

    PubMed

    Wang, Siqi; Wang, Hengwei; Lv, Jiyang; Deng, Zixin; Cheng, Hairong

    2017-12-20

    Erythritol, a natural sugar alcohol, is produced industrially by fermentation and crystallization, but this process leaves a large amount of waste erythritol mother liquor (WEML) which contains more than 200 g/L erythritol as well as other polyol byproducts. These impurities make it very difficult to crystallize more erythritol. In our study, an efficient process for the recovery of erythritol from the WEML is described. The polyol impurities were first identified by high-performance liquid chromatography and gas chromatography-mass spectrometry, and a yeast strain Candida maltosa CGMCC 7323 was then isolated to metabolize those impurities to purify erythritol. Our results demonstrated that the process could remarkably improve the purity of erythritol and thus make the subsequent crystallization easier. This newly developed strategy is expected to have advantages in WEML treatment and provide helpful information with regard to green cell factories and zero-waste processing.

  6. Superior memory efficiency of quantum devices for the simulation of continuous-time stochastic processes

    NASA Astrophysics Data System (ADS)

    Elliott, Thomas J.; Gu, Mile

    2018-03-01

    Continuous-time stochastic processes pervade everyday experience, and the simulation of models of these processes is of great utility. Classical models of systems operating in continuous-time must typically track an unbounded amount of information about past behaviour, even for relatively simple models, enforcing limits on precision due to the finite memory of the machine. However, quantum machines can require less information about the past than even their optimal classical counterparts to simulate the future of discrete-time processes, and we demonstrate that this advantage extends to the continuous-time regime. Moreover, we show that this reduction in the memory requirement can be unboundedly large, allowing for arbitrary precision even with a finite quantum memory. We provide a systematic method for finding superior quantum constructions, and a protocol for analogue simulation of continuous-time renewal processes with a quantum machine.

  7. Diffuse CO2 degassing at Vesuvio, Italy

    NASA Astrophysics Data System (ADS)

    Frondini, Francesco; Chiodini, Giovanni; Caliro, Stefano; Cardellini, Carlo; Granieri, Domenico; Ventura, Guido

    2004-10-01

    At Vesuvio, a significant fraction of the rising hydrothermal-volcanic fluids is subjected to a condensation and separation process producing a CO2-rich gas phase, mainly expelled through soil diffuse degassing from well defined areas called diffuse degassing structures (DDS), and a liquid phase that flows towards the outer part of the volcanic cone. A large amount of thermal energy is associated with the steam condensation process and subsequent cooling of the liquid phase. The total amount of volcanic-hydrothermal CO2 discharged through diffuse degassing has been computed through a sequential Gaussian simulation (sGs) approach based on several hundred accumulation chamber measurements and, at the time of the survey, amounted to 151 t d⁻¹. The steam associated with the CO2 output, computed assuming that the original H2O/CO2 ratio of hydrothermal fluids is preserved in fumarolic effluents, is 553 t d⁻¹, and the energy produced by the steam condensation and cooling of the liquid phase is 1.47×10¹² J d⁻¹ (17 MW). The location of the CO2 and temperature anomalies shows that most of the gas is discharged from the inner part of the crater and suggests that crater morphology and local stratigraphy exert strong control on CO2 degassing and subsurface steam condensation. The amounts of gas and energy released by Vesuvio are comparable to those released by other volcanic degassing areas of the world and their estimates, through periodic surveys of soil CO2 flux, can constitute a useful tool to monitor volcanic activity.
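    The geostatistical simulation itself is not reproduced here; the sketch below only shows, under stated assumptions, how a total discharge and an uncertainty range could be aggregated from already-simulated flux realizations on a regular grid. All grid dimensions, cell sizes, and flux values are placeholders.

    # Sketch: aggregating simulated soil CO2 flux maps into a total discharge
    # with an uncertainty range. `realizations` stands in for the gridded
    # output of a sequential Gaussian simulation (n_realizations x ny x nx,
    # in g m^-2 d^-1); all numbers are placeholders.
    import numpy as np

    rng = np.random.default_rng(3)
    n_real, ny, nx = 100, 50, 50
    cell_area_m2 = 20.0 * 20.0                           # 20 m x 20 m grid cells
    realizations = rng.lognormal(mean=3.0, sigma=0.8, size=(n_real, ny, nx))

    totals_t_per_day = realizations.sum(axis=(1, 2)) * cell_area_m2 / 1e6  # g -> t
    print("total CO2 output: %.0f t/d (5th-95th percentile: %.0f-%.0f)" % (
        totals_t_per_day.mean(),
        np.percentile(totals_t_per_day, 5),
        np.percentile(totals_t_per_day, 95)))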

  8. Exploring Google Earth Engine platform for big data processing: classification of multi-temporal satellite imagery for crop mapping

    NASA Astrophysics Data System (ADS)

    Shelestov, Andrii; Lavreniuk, Mykola; Kussul, Nataliia; Novikov, Alexei; Skakun, Sergii

    2017-02-01

    Many applied problems arising in agricultural monitoring and food security require reliable crop maps at national or global scale. Large scale crop mapping requires processing and management of large amounts of heterogeneous satellite imagery acquired by various sensors, which consequently leads to a “Big Data” problem. The main objective of this study is to explore the efficiency of using the Google Earth Engine (GEE) platform when classifying multi-temporal satellite imagery, with the potential to apply the platform at a larger scale (e.g. country level) and to multiple sensors (e.g. Landsat-8 and Sentinel-2). In particular, multiple state-of-the-art classifiers available in the GEE platform are compared to produce a high resolution (30 m) crop classification map for a large territory (28,100 km² and 1.0 Mha of cropland). Though this study does not involve large volumes of data, it does address the efficiency of the GEE platform in executing the complex satellite data processing workflows required by large scale applications such as crop mapping. The study discusses strengths and weaknesses of the classifiers, assesses the accuracies that can be achieved with different classifiers for the Ukrainian landscape, and compares them to a benchmark classifier using a neural network approach that was developed in our previous studies. The study is carried out for the Joint Experiment of Crop Assessment and Monitoring (JECAM) test site in Ukraine covering the Kyiv region (north of Ukraine) in 2013. We found that GEE provides very good performance in terms of enabling access to the remote sensing products through the cloud platform and providing pre-processing; however, in terms of classification accuracy, the neural network based approach outperformed the support vector machine (SVM), decision tree and random forest classifiers available in GEE.
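    For illustration only, the sketch below reproduces the classifier-comparison workflow outside GEE using scikit-learn rather than the GEE API; the per-pixel multi-temporal feature matrix, labels, and class count are placeholders, and the neural-network benchmark is omitted.

    # Sketch: comparing classifiers on per-pixel multi-temporal spectral
    # features for crop mapping. Rows of X are pixels; columns are band values
    # stacked across acquisition dates. X and y are synthetic placeholders.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.svm import SVC
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(4)
    n_pixels, n_features, n_classes = 3000, 6 * 4, 5     # 6 bands x 4 dates, 5 crop classes
    X = rng.normal(size=(n_pixels, n_features))
    y = rng.integers(0, n_classes, size=n_pixels)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    classifiers = {
        "random_forest": RandomForestClassifier(n_estimators=100, random_state=0),
        "svm": SVC(kernel="rbf"),
        "decision_tree": DecisionTreeClassifier(random_state=0),
    }
    for name, clf in classifiers.items():
        clf.fit(X_tr, y_tr)
        print(name, "overall accuracy:", accuracy_score(y_te, clf.predict(X_te)))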

  9. Interactive Scripting for Analysis and Visualization of Arbitrarily Large, Disparately Located Climate Data Ensembles Using a Progressive Runtime Server

    NASA Astrophysics Data System (ADS)

    Christensen, C.; Summa, B.; Scorzelli, G.; Lee, J. W.; Venkat, A.; Bremer, P. T.; Pascucci, V.

    2017-12-01

    Massive datasets are becoming more common due to increasingly detailed simulations and higher resolution acquisition devices. Yet accessing and processing these huge data collections for scientific analysis is still a significant challenge. Solutions that rely on extensive data transfers are increasingly untenable and often impossible due to lack of sufficient storage at the client side as well as insufficient bandwidth to conduct such large transfers, which in some cases could entail petabytes of data. Large-scale remote computing resources can be useful, but utilizing such systems typically entails some form of offline batch processing with long delays, data replication, and substantial cost for any mistakes. Both types of workflows can severely limit the flexible exploration and rapid evaluation of new hypotheses that are crucial to the scientific process and thereby impede scientific discovery. In order to facilitate interactivity in both analysis and visualization of these massive data ensembles, we introduce a dynamic runtime system suitable for progressive computation and interactive visualization of arbitrarily large, disparately located spatiotemporal datasets. Our system includes an embedded domain-specific language (EDSL) that allows users to express a wide range of data analysis operations in a simple and abstract manner. The underlying runtime system transparently resolves issues such as remote data access and resampling while at the same time maintaining interactivity through progressive and interruptible processing. Computations involving large amounts of data can be performed remotely in an incremental fashion that dramatically reduces data movement, while the client receives updates progressively, thereby remaining robust to fluctuating network latency or limited bandwidth. This system facilitates interactive, incremental analysis and visualization of massive remote datasets up to petabytes in size. Our system is now available for general use in the community through both Docker and Anaconda.
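    A toy sketch of the progressive-computation idea follows: an estimate is refined chunk by chunk so a client can show an early coarse answer and update it as more data arrives. No real remote access, EDSL, or resampling is involved; the chunk generator merely stands in for the runtime's data delivery.

    # Toy sketch of progressive computation: a running mean over a large field
    # is refined chunk by chunk; the loop can be interrupted at any point and
    # still yields a usable (if coarser) answer.
    import numpy as np

    def stream_chunks(data, chunk_size):
        """Yield successive chunks, mimicking progressive delivery of a remote array."""
        for start in range(0, data.size, chunk_size):
            yield data[start:start + chunk_size]

    def progressive_mean(chunks):
        """Yield an updated estimate after each chunk."""
        total, count = 0.0, 0
        for chunk in chunks:
            total += chunk.sum()
            count += chunk.size
            yield total / count

    rng = np.random.default_rng(5)
    field = rng.normal(loc=2.0, size=1_000_000)          # placeholder climate variable
    for i, estimate in enumerate(progressive_mean(stream_chunks(field, 100_000))):
        print(f"after chunk {i + 1}: mean estimate {estimate:.4f}")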

  10. Multimedia content analysis and indexing: evaluation of a distributed and scalable architecture

    NASA Astrophysics Data System (ADS)

    Mandviwala, Hasnain; Blackwell, Scott; Weikart, Chris; Van Thong, Jean-Manuel

    2003-11-01

    Multimedia search engines facilitate the retrieval of documents from large media content archives now available via intranets and the Internet. Over the past several years, many research projects have focused on algorithms for analyzing and indexing media content efficiently. However, special system architectures are required to process large amounts of content from real-time feeds or existing archives. Possible solutions include dedicated distributed architectures for analyzing content rapidly and for making it searchable. The system architecture we propose implements such an approach: a highly distributed and reconfigurable batch media content analyzer that can process media streams and static media repositories. Our distributed media analysis application handles media acquisition, content processing, and document indexing. This collection of modules is orchestrated by a task flow management component, exploiting data and pipeline parallelism in the application. A scheduler manages load balancing and prioritizes the different tasks. Workers implement application-specific modules that can be deployed on an arbitrary number of nodes running different operating systems. Each application module is exposed as a web service, implemented with industry-standard interoperable middleware components such as Microsoft ASP.NET and Sun J2EE. Our system architecture is the next generation system for the multimedia indexing application demonstrated by www.speechbot.com. It can process large volumes of audio recordings with minimal support and maintenance, while running on low-cost commodity hardware. The system has been evaluated on a server farm running concurrent content analysis processes.

  11. Aligning observed and modelled behaviour based on workflow decomposition

    NASA Astrophysics Data System (ADS)

    Wang, Lu; Du, YuYue; Liu, Wei

    2017-09-01

    When business processes are mostly supported by information systems, the availability of event logs generated from these systems, as well as the requirement for appropriate process models, is increasing. Business processes can be discovered, monitored and enhanced by extracting process-related information. However, some events cannot be correctly identified because of the explosion in the volume of event logs. Therefore, a new process mining technique is proposed in this paper based on a workflow decomposition method. Petri nets (PNs) are used to describe business processes, and conformance checking of event logs against process models is investigated. A decomposition approach is proposed to divide large process models and event logs into several separate parts that can be analysed independently, while an alignment approach based on a state equation method in PN theory enhances the performance of conformance checking. Both approaches are implemented in the process mining framework ProM. The correctness and effectiveness of the proposed methods are illustrated through experiments.

  12. Chemical weathering on the North Island of New Zealand: CO2 consumption and fluxes of Sr and Os

    NASA Astrophysics Data System (ADS)

    Blazina, Tim; Sharma, Mukul

    2013-09-01

    We present Os and Sr isotope ratios and Os, Sr and major/trace element concentrations for river waters, spring waters and rains on the North Island of New Zealand (NINZ). The Os and Sr data are used to examine whether the NINZ is a significant contributor of unradiogenic Os and Sr to the oceans. Major element chemistry is used to quantify weathering and CO2 consumption rates on the island to investigate relationships between these processes and Os and Sr behavior. Chemical erosion rates and CO2 consumption rates across the island range from 44 to 555 km⁻² yr⁻¹ and 95 to 1900 × 10³ mol CO2 km⁻² yr⁻¹, respectively. Strontium fluxes for the island range from 177 to 16,100 mol km⁻² yr⁻¹, and the rivers have an average flux-normalized 87Sr/86Sr ratio of 0.7075. In agreement with previous studies, these findings provide further evidence that weathering of arc terrains contributes a disproportionately large amount of Sr to the oceans and consumes very large amounts of CO2 annually compared to their areal extent. However, the 87Sr/86Sr of the NINZ rivers is not particularly unradiogenic, and the island is likely not contributing significant amounts of unradiogenic Sr to the oceans. Repeated Os analyses and bottle leaching experiments revealed extensive and variable sample contamination by Os leaching from rigorously precleaned LDPE bottles. An upper bound on the flux of Os from the NINZ can nevertheless be assessed and indicates that island arcs cannot provide significant amounts of unradiogenic Os to the oceans.

  13. A new model for the origin of Type-B and Fluffy Type-A CAIs: Analogies to remelted compound chondrules

    NASA Astrophysics Data System (ADS)

    Rubin, Alan E.

    2012-06-01

    In the scenario developed here, most types of calcium-aluminum-rich inclusions (CAIs) formed near the Sun where they developed Wark-Lovering rims before being transported by aerodynamic forces throughout the nebula. The amount of ambient dust in the nebula varied with heliocentric distance, peaking in the CV-CK formation location. Literature data show that accretionary rims (which occur outside the Wark-Lovering rims) around CAIs contain substantial 16O-rich forsterite, suggesting that, at this time, the ambient dust in the nebula consisted largely of 16O-rich forsterite. Individual sub-millimeter-size Compact Type-A CAIs (each surrounded by a Wark-Lovering rim) collided in the CV-CK region and stuck together (in a manner similar to that of sibling compound chondrules); the CTAs were mixed with small amounts of 16O-rich mafic dust and formed centimeter-size compound objects (large Fluffy Type-A CAIs) after experiencing minor melting. In contrast to other types of CAIs, centimeter-size Type-B CAIs formed directly in the CV-CK region after gehlenite-rich Compact Type-A CAIs collided and stuck together, incorporated significant amounts of 16O-rich forsteritic dust (on the order of 10-15%) and probably some anorthite, and experienced extensive melting and partial evaporation. (Enveloping compound chondrules formed in an analogous manner.) In those cases where appreciably higher amounts of 16O-rich forsterite (on the order of 25%) (and perhaps minor anorthite and pyroxene) were incorporated into compound Type-A objects prior to melting, centimeter-size forsterite-bearing Type-B CAIs (B3 inclusions) were produced. Type-B1 inclusions formed from B2 inclusions that collided with and stuck to melilite-rich Compact Type-A CAIs and experienced high-temperature processing.

  14. A Pipeline for Large Data Processing Using Regular Sampling for Unstructured Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berres, Anne Sabine; Adhinarayanan, Vignesh; Turton, Terece

    2017-05-12

    Large simulation data requires a lot of time and computational resources to compute, store, analyze, visualize, and run user studies on. Today, the largest cost of a supercomputer is not hardware but maintenance, in particular energy consumption. Our goal is to balance energy consumption and the cognitive value of visualizations of the resulting data. This requires us to go through the entire processing pipeline, from simulation to user studies. To reduce the amount of resources, data can be sampled or compressed. While this adds more computation time, the computational overhead is negligible compared to the simulation time. We built a processing pipeline around the example of regular sampling. The reasons for this choice are two-fold: using a simple example reduces unnecessary complexity, as we know what to expect from the results; furthermore, it provides a good baseline for future, more elaborate sampling methods. We measured time and energy for each test, and we conducted user studies on Amazon Mechanical Turk (AMT) for a range of different results produced through sampling.
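    As a minimal illustration of regular sampling (not the authors' pipeline), the sketch below averages scattered, unstructured sample values onto a fixed regular grid; the point cloud and grid resolution are placeholders, and the real pipeline additionally handles I/O, timing/energy measurement, and rendering for the user study.

    # Minimal sketch of regular sampling: average unstructured sample values
    # onto a fixed regular grid. `points` and `values` are placeholders for an
    # unstructured-grid dataset.
    import numpy as np

    rng = np.random.default_rng(6)
    points = rng.random((50_000, 2))                      # scattered x, y in [0, 1)^2
    values = np.sin(6 * points[:, 0]) * np.cos(6 * points[:, 1])

    nx = ny = 64
    ix = np.minimum((points[:, 0] * nx).astype(int), nx - 1)
    iy = np.minimum((points[:, 1] * ny).astype(int), ny - 1)
    flat = iy * nx + ix

    sums = np.bincount(flat, weights=values, minlength=nx * ny)
    counts = np.bincount(flat, minlength=nx * ny)
    grid = np.divide(sums, counts, out=np.full(nx * ny, np.nan), where=counts > 0)
    grid = grid.reshape(ny, nx)                           # regularly sampled field
    print("grid cells filled:", np.count_nonzero(counts), "of", nx * ny)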

  15. Development of potentiometric equipment for the identification of altered dry-cured hams: A preliminary study.

    PubMed

    Girón, Joel; Gil-Sánchez, Luís; García-Breijo, Eduardo; Pagán, M Jesús; Barat, José M; Grau, Raúl

    2015-08-01

    Microbiological contamination of dry-cured ham can occur in the early stages of the process, and a large number of the microorganisms involved in spoilage can produce alterations in the product. These include uncommon odours, which are detected at the end of the process by a procedure called "cala": a sharp instrument is punctured into every ham and then smelled by an expert taster, who classifies the hams as good or altered. Given the large number of hams, an electronic device would be suitable for this task. The present research aims to develop objective equipment based on the potentiometry technique that identifies altered hams. A probe containing silver, nickel and copper electrodes was developed and employed to classify altered and unaltered hams prior to classification by a taster. The results showed lower Ag and higher Cu potential values for altered hams. The differences in potentiometric response suggest a classification model, although further studies are required to obtain a reliable one. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Mesoporous systems for poorly soluble drugs.

    PubMed

    Xu, Wujun; Riikonen, Joakim; Lehto, Vesa-Pekka

    2013-08-30

    Utilization of inorganic mesoporous materials in formulations of poorly water-soluble drugs to enhance their dissolution and permeation behavior is a rapidly growing area in pharmaceutical materials research. The benefits of mesoporous materials in drug delivery applications stem from their large surface area and pore volume. These properties enable the materials to accommodate large amounts of payload molecules, protect them from premature degradation, and promote controlled and fast release. As carriers with various morphologies and chemical surface properties can be produced, these materials may even promote adsorption from the gastrointestinal tract to the systemic circulation. The main concern regarding their clinical applications is still the safety aspect even though most of them have been reported to be safely excreted, and a rather extensive toxicity screening has already been conducted with the most frequently studied mesoporous materials. In addition, the production of the materials on a large scale and at a reasonable cost may be a challenge when considering the utilization of the materials in industrial processes. However, if mesoporous materials could be employed in the industrial crystallization processes to produce hybrid materials with poorly soluble compounds, and hence to enhance their oral bioavailability, this might open new avenues for the pharmaceutical industry to employ nanotechnology in their processes. Copyright © 2012 Elsevier B.V. All rights reserved.

  17. Estimation of pollutant loads considering dam operation in Han River Basin by BASINS/Hydrological Simulation Program-FORTRAN.

    PubMed

    Jung, Kwang-Wook; Yoon, Choon-G; Jang, Jae-Ho; Kong, Dong-Soo

    2008-01-01

    Effective watershed management often demands qualitative and quantitative predictions of the effects of future management activities as arguments for policy makers and administrations. The BASINS geographic information system was developed to compute total maximum daily loads, which is helpful for establishing hydrological process and water quality modeling systems. In this paper, the BASINS toolkit HSPF model is applied to the large, 20,271 km² watershed of the Han River Basin to evaluate the applicability of HSPF and BMP scenarios. For proper evaluation of watershed and stream water quality, comprehensive estimation methods are necessary to assess large amounts of point-source and nonpoint-source (NPS) pollution based on the total watershed area. In this study, the Hydrological Simulation Program-FORTRAN (HSPF) was used to simulate watershed pollutant loads accounting for dam operation, and BMP scenarios were applied to control NPS pollution. The 8-day monitoring data (about three years) were used in the calibration and verification processes. Model performance was in the range of "very good" and "good" based on percent difference. The water-quality simulation results were encouraging for this sizable watershed with dam operation practice and mixed land uses; HSPF proved adequate, and its application is recommended to simulate watershed processes and to evaluate BMPs. IWA Publishing 2008.

  18. A Gap-Filling Procedure for Hydrologic Data Based on Kalman Filtering and Expectation Maximization: Application to Data from the Wireless Sensor Networks of the Sierra Nevada

    NASA Astrophysics Data System (ADS)

    Coogan, A.; Avanzi, F.; Akella, R.; Conklin, M. H.; Bales, R. C.; Glaser, S. D.

    2017-12-01

    Automatic meteorological and snow stations provide large amounts of information at dense temporal resolution, but data quality is often compromised by noise and missing values. We present a new gap-filling and cleaning procedure for networks of these stations based on Kalman filtering and expectation maximization. Our method utilizes a multi-sensor, regime-switching Kalman filter to learn a latent process that captures dependencies between nearby stations and handles sharp changes in snowfall rate. Since the latent process is inferred using observations across working stations in the network, it can be used to fill in large data gaps for a malfunctioning station. The procedure was tested on meteorological and snow data from Wireless Sensor Networks (WSN) in the American River basin of the Sierra Nevada. Data include air temperature, relative humidity, and snow depth from dense networks of 10 to 12 stations within 1 km² swaths. Both wet and dry water years have similar data issues. Data with artificially created gaps were used to quantify the method's performance. Our multi-sensor approach performs better than a single-sensor one, especially with large data gaps, as it learns and exploits the dominant underlying processes in snowpack at each site.
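    A much-simplified sketch of the underlying idea follows: a single-sensor, fixed-parameter random-walk Kalman filter that predicts through missing observations (NaNs) and uses the state estimate to fill the gap. The multi-sensor coupling, regime switching, and EM parameter learning described above are not reproduced; the noise parameters and the synthetic snow-depth series are assumptions.

    # Simplified sketch: a 1-D random-walk Kalman filter that skips the update
    # step at missing observations (NaN) and fills gaps with the state estimate.
    import numpy as np

    def kalman_fill(obs, q=0.05, r=0.5):
        """Return a gap-filled series for a 1-D observation array with NaN gaps."""
        x, p = obs[~np.isnan(obs)][0], 1.0   # initialize from first valid observation
        filled = np.empty_like(obs)
        for t, z in enumerate(obs):
            p = p + q                         # predict (random-walk state model)
            if not np.isnan(z):               # update only when a measurement exists
                k = p / (p + r)               # Kalman gain
                x = x + k * (z - x)
                p = (1.0 - k) * p
            filled[t] = x                     # state estimate fills the gap
        return filled

    rng = np.random.default_rng(7)
    truth = np.cumsum(rng.normal(scale=0.1, size=300)) + 10.0   # synthetic snow depth
    obs = truth + rng.normal(scale=0.3, size=truth.size)
    obs[120:180] = np.nan                                        # a long sensor outage
    gap_rmse = np.sqrt(np.mean((kalman_fill(obs)[120:180] - truth[120:180]) ** 2))
    print("RMSE over the artificial gap:", gap_rmse)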

  19. Waste Management Options for Long-Duration Space Missions: When to Reject, Reuse, or Recycle

    NASA Technical Reports Server (NTRS)

    Linne, Diane L.; Palaszewski, Bryan A.; Gokoglu, Suleyman; Gallo, Christopher A.; Balasubramaniam, Ramaswamy; Hegde, Uday G.

    2014-01-01

    The amount of waste generated on long-duration space missions away from Earth orbit creates the daunting challenge of how to manage the waste through reuse, rejection, or recycling. The option to merely dispose of the solid waste through an airlock to space was studied for both Earth-moon libration point missions and crewed Mars missions. Although the unique dynamic characteristics of an orbit around L2 might allow some discarded waste to intersect the lunar surface before re-impacting the spacecraft, the large amount of waste that must be managed and the potential hazards associated with volatiles recondensing on the spacecraft surfaces make this option problematic. A second option evaluated is to process the waste into useful gases to be either vented to space or used in various propulsion systems. These propellants could then be used to provide the yearly station-keeping needs in an L2 orbit, or, if processed into oxygen and methane propellants, could be used to augment science exploration by enabling lunar mini landers to the far side of the moon.

  20. Blast investigation by fast multispectral radiometric analysis

    NASA Astrophysics Data System (ADS)

    Devir, A. D.; Bushlin, Y.; Mendelewicz, I.; Lessin, A. B.; Engel, M.

    2011-06-01

    Knowledge of the processes involved in blasts and detonations is required in various applications, e.g., missile interception, blasts of high-explosive materials, terminal ballistics and IED identification. Blasts release a large amount of energy over a short duration. Part of this energy is released as intense radiation in the optical spectral bands. This paper proposes to measure the blast radiation with a fast multispectral radiometer. The measurement is made simultaneously in appropriately chosen spectral bands. These spectral bands provide extensive information on the physical and chemical processes that govern the blast through the time dependence of the molecular and aerosol contributions to the detonation products. Multispectral blast measurements are performed in the visible, SWIR and MWIR spectral bands. Analysis of the cross-correlation between the measured multispectral signals gives the time dependence of the temperature, aerosol and gas composition of the blast. Further analysis of how these quantities develop in time may indicate the order of the detonation and the amount and type of explosive material. Examples of analysis of measured explosions are presented to demonstrate the power of the suggested fast multispectral radiometric analysis approach.
