Sample records for processing large volumes

  1. An interfaces approach to TES ground data system processing design with the Science Investigator-led Processing System (SIPS)

    NASA Technical Reports Server (NTRS)

    Kurian, R.; Griffin, A.

    2002-01-01

    Developing production-quality software to process the large volumes of scientific data is the responsibility of the TES Ground Data System, which is being developed at the Jet Propulsion Laboratory together with support contractor Raytheon/ITSS. The large data volume and processing requirements of the TES pose significant challenges to the design.

  2. High-Volume Production of Lightweight Multijunction Solar Cells

    NASA Technical Reports Server (NTRS)

    Youtsey, Christopher

    2015-01-01

    MicroLink Devices, Inc., has transitioned its 6-inch epitaxial lift-off (ELO) solar cell fabrication process into a manufacturing platform capable of sustaining large-volume production. This Phase II project improves the ELO process by reducing cycle time and increasing the yield of large-area devices. In addition, all critical device fabrication processes have transitioned to 6-inch production tool sets designed for volume production. An emphasis on automated cassette-to-cassette and batch processes minimizes operator dependence and cell performance variability. MicroLink Devices established a pilot production line capable of at least 1,500 6-inch wafers per month at greater than 80 percent yield. The company also increased the yield and manufacturability of the 6-inch reclaim process, which is crucial to reducing the cost of the cells.

  3. Effects of large volume injection of aliphatic alcohols as sample diluents on the retention of low hydrophobic solutes in reversed-phase liquid chromatography.

    PubMed

    David, Victor; Galaon, Toma; Aboul-Enein, Hassan Y

    2014-01-03

    Recent studies showed that the injection of large volumes of hydrophobic solvents used as sample diluents could be applied in reversed-phase liquid chromatography (RP-LC). This study reports a systematic investigation of the influence of a series of aliphatic alcohols (from methanol to 1-octanol) on the retention process in RP-LC when large volumes of sample are injected on the column. Several model analytes with low hydrophobic character were studied by the RP-LC process, for mobile phases containing methanol or acetonitrile as organic modifiers in different proportions with the aqueous component. It was found that, starting with 1-butanol, the aliphatic alcohols can be used as sample solvents and can be injected in high volumes, but they may influence the retention factor and peak shape of the dissolved solutes. The dependence of the retention factor of the studied analytes on the injection volume of these alcohols is linear, with a decrease of its value as the sample volume is increased. The retention process in the case of injecting up to 200 μL of upper alcohols also depends on the content of the organic modifier (methanol or acetonitrile) in the mobile phase. Copyright © 2013 Elsevier B.V. All rights reserved.
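
    The reported linear decrease of the retention factor with injection volume can be summarized by a simple straight-line fit. The sketch below is purely illustrative, using made-up retention-factor values rather than the study's measurements:

    ```python
    import numpy as np

    # Hypothetical (injection volume, retention factor) pairs illustrating the
    # reported trend: k decreases roughly linearly as the injected volume of an
    # upper-alcohol diluent (e.g., 1-octanol) grows toward ~200 uL.
    injection_volume_uL = np.array([20, 50, 100, 150, 200])
    retention_factor_k = np.array([2.10, 2.02, 1.88, 1.75, 1.61])  # made-up values

    # Ordinary least-squares fit of k = slope * V + intercept
    slope, intercept = np.polyfit(injection_volume_uL, retention_factor_k, deg=1)
    print(f"k ≈ {slope:.4f} * V(uL) + {intercept:.3f}")
    # A negative slope reproduces the reported decrease of k with injection volume.
    ```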

  4. Radiation from Large Gas Volumes and Heat Exchange in Steam Boiler Furnaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makarov, A. N., E-mail: tgtu-kafedra-ese@mail.ru

    2015-09-15

    Radiation from large cylindrical gas volumes is studied as a means of simulating the flare in steam boiler furnaces. Calculations of heat exchange in a furnace by the zonal method and by simulation of the flare with cylindrical gas volumes are described. The latter method is more accurate and yields more reliable information on heat transfer processes taking place in furnaces.

  5. Very Large Area/Volume Microwave ECR Plasma and Ion Source

    NASA Technical Reports Server (NTRS)

    Foster, John E. (Inventor); Patterson, Michael J. (Inventor)

    2009-01-01

    The present invention is an apparatus and method for producing very large area and large volume plasmas. The invention utilizes electron cyclotron resonances in conjunction with permanent magnets to produce dense, uniform plasmas for long-life ion thruster applications or for plasma processing applications such as etching, deposition, ion milling and ion implantation. The large area source is at least five times larger than the 12-inch wafers being processed to date. Its rectangular shape makes it easier to adapt to materials processing than circular sources. The source itself represents the largest ECR ion source built to date. It is electrodeless and does not utilize electromagnets to generate the ECR magnetic circuit, nor does it make use of windows.

  6. Flat-plate solar array project. Volume 5: Process development

    NASA Technical Reports Server (NTRS)

    Gallagher, B.; Alexander, P.; Burger, D.

    1986-01-01

    The goal of the Process Development Area, as part of the Flat-Plate Solar Array (FSA) Project, was to develop and demonstrate solar cell fabrication and module assembly process technologies required to meet the cost, lifetime, production capacity, and performance goals of the FSA Project. R&D efforts expended by Government, Industry, and Universities in developing processes capable of meeting the project's goals under volume-production conditions are summarized. The cost goals allocated for processing were demonstrated on small-volume quantities that were extrapolated by cost analysis to large-volume production. To provide proper focus and coverage of the process development effort, four separate technology sections are discussed: surface preparation, junction formation, metallization, and module assembly.

  7. Technologies for imaging neural activity in large volumes

    PubMed Central

    Ji, Na; Freeman, Jeremy; Smith, Spencer L.

    2017-01-01

    Neural circuitry has evolved to form distributed networks that act dynamically across large volumes. Collecting data from individual planes, conventional microscopy cannot sample circuitry across large volumes at the temporal resolution relevant to neural circuit function and behaviors. Here, we review emerging technologies for rapid volume imaging of neural circuitry. We focus on two critical challenges: the inertia of optical systems, which limits image speed, and aberrations, which restrict the image volume. Optical sampling time must be long enough to ensure high-fidelity measurements, but optimized sampling strategies and point spread function engineering can facilitate rapid volume imaging of neural activity within this constraint. We also discuss new computational strategies for the processing and analysis of volume imaging data of increasing size and complexity. Together, optical and computational advances are providing a broader view of neural circuit dynamics, and help elucidate how brain regions work in concert to support behavior. PMID:27571194

  8. Framework Programmable Platform for the Advanced Software Development Workstation (FPP/ASDW). Demonstration framework document. Volume 2: Framework process description

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paula S.; Crump, John W.; Ackley, Keith A.

    1992-01-01

    In the second volume of the Demonstration Framework Document, the graphical representation of the demonstration framework is given. This second document was created to facilitate the reading and comprehension of the demonstration framework. It is designed to be viewed in parallel with Section 4.2 of the first volume to help give a picture of the relationships between the UOBs (Units of Behavior) of the model. The model is quite large, and the design team felt that this form of presentation would make it easier for the reader to get a feel for the processes described in this document. The IDEF3 (Process Description Capture Method) diagrams of the processes of an Information System Development are presented. Volume 1 describes the processes and the agents involved with each process, while this volume graphically shows the precedence relationships among the processes.

  9. Improving IT Portfolio Management Decision Confidence Using Multi-Criteria Decision Making and Hypervariate Display Techniques

    ERIC Educational Resources Information Center

    Landmesser, John Andrew

    2014-01-01

    Information technology (IT) investment decision makers are required to process large volumes of complex data. An existing body of knowledge relevant to IT portfolio management (PfM), decision analysis, visual comprehension of large volumes of information, and IT investment decision making suggests Multi-Criteria Decision Making (MCDM) and…

  10. Melt Electrospinning Writing of Highly Ordered Large Volume Scaffold Architectures.

    PubMed

    Wunner, Felix M; Wille, Marie-Luise; Noonan, Thomas G; Bas, Onur; Dalton, Paul D; De-Juan-Pardo, Elena M; Hutmacher, Dietmar W

    2018-05-01

    The additive manufacturing of highly ordered, micrometer-scale scaffolds is at the forefront of tissue engineering and regenerative medicine research. The fabrication of scaffolds for the regeneration of larger tissue volumes, in particular, remains a major challenge. A technology at the convergence of additive manufacturing and electrospinning, melt electrospinning writing (MEW), is also limited in thickness/volume due to the accumulation of excess charge from the deposited material, which repels and hence distorts scaffold architectures. The underlying physical principles that constrain MEW of thick, large volume scaffolds are studied. Through computational modeling, numerical values for variable working distances are established that maintain the electrostatic force at a constant level during the printing process. Based on the computational simulations, three voltage profiles are applied to determine the maximum height (exceeding 7 mm) of a highly ordered large volume scaffold. These thick MEW scaffolds have fully interconnected pores and allow cells to migrate and proliferate. To the best of the authors' knowledge, this is the first study to report that z-axis adjustment and increasing the voltage during the MEW process allow for the fabrication of high-volume scaffolds with uniform morphologies and fiber diameters. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Enhanced FIB-SEM systems for large-volume 3D imaging.

    PubMed

    Xu, C Shan; Hayworth, Kenneth J; Lu, Zhiyuan; Grob, Patricia; Hassan, Ahmed M; García-Cerdán, José G; Niyogi, Krishna K; Nogales, Eva; Weinberg, Richard J; Hess, Harald F

    2017-05-13

    Focused Ion Beam Scanning Electron Microscopy (FIB-SEM) can automatically generate 3D images with superior z-axis resolution, yielding data that needs minimal image registration and related post-processing. Obstacles blocking wider adoption of FIB-SEM include slow imaging speed and lack of long-term system stability, which caps the maximum possible acquisition volume. Here, we present techniques that accelerate image acquisition while greatly improving FIB-SEM reliability, allowing the system to operate for months and generating continuously imaged volumes > 10⁶ µm³. These volumes are large enough for connectomics, where the excellent z resolution can help in tracing of small neuronal processes and accelerate the tedious and time-consuming human proofreading effort. Even higher resolution can be achieved on smaller volumes. We present example data sets from mammalian neural tissue, Drosophila brain, and Chlamydomonas reinhardtii to illustrate the power of this novel high-resolution technique to address questions in both connectomics and cell biology.

  12. Antarctic icebergs distributions 1992-2014

    NASA Astrophysics Data System (ADS)

    Tournadre, J.; Bouhier, N.; Girard-Ardhuin, F.; Rémy, F.

    2016-01-01

    Basal melting of floating ice shelves and iceberg calving constitute the two almost equal paths of freshwater flux between the Antarctic ice cap and the Southern Ocean. The largest icebergs (>100 km²) transport most of the ice volume, but their basal melting is small compared with their breaking into smaller icebergs, which thus constitute the major vector of freshwater. The archives of nine altimeters have been processed to create a database of small icebergs (<8 km²) within open water, containing the positions, sizes, and volumes spanning the 1992-2014 period. The intercalibrated monthly ice volumes from the different altimeters have been merged into a homogeneous 23-year climatology. The iceberg size distribution, covering the 0.1-10,000 km² range and estimated by combining small and large iceberg size measurements, follows well a power law of slope -1.52 ± 0.32, close to the -3/2 law observed and modeled for brittle fragmentation. The global volume of ice and its distribution between the ocean basins present a very strong interannual variability only partially explained by the number of large icebergs. Indeed, vast zones of the Southern Ocean free of large icebergs are largely populated by small icebergs drifting over thousands of kilometers. The correlation between the global small and large iceberg volumes shows that small icebergs are mainly generated by the breaking of large ones. Drifting and trapping by sea ice can transport small icebergs over long periods and distances. Small icebergs act as a diffuse source of ice along large iceberg trajectories, while sea ice trapping acts as a buffer that delays melting.
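
    A power-law size distribution of this kind is typically estimated with a straight-line fit in log-log space. The sketch below uses synthetic areas drawn from a -3/2 power law to illustrate the fitting step only; it is not the authors' altimeter processing chain:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic iceberg areas (km^2) drawn from a power law p(A) ~ A^-1.5
    # (Pareto shape 0.5), roughly spanning the 0.1-10,000 km^2 range.
    a_min = 0.1
    areas = a_min * (1.0 - rng.random(50_000)) ** -2.0
    areas = areas[areas <= 1.0e4]

    # Log-binned number density, then a straight-line fit in log-log space.
    bins = np.logspace(np.log10(a_min), 4, 30)
    counts, edges = np.histogram(areas, bins=bins)
    centers = np.sqrt(edges[:-1] * edges[1:])
    density = counts / np.diff(edges)
    ok = counts > 0
    slope, _ = np.polyfit(np.log10(centers[ok]), np.log10(density[ok]), 1)
    print(f"fitted power-law slope ≈ {slope:.2f}")  # expect a value near -1.5
    ```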

  13. Composite media for ion processing

    DOEpatents

    Mann, Nick R [Blackfoot, ID]; Wood, Donald J [Peshastin, WA]; Todd, Terry A [Aberdeen, ID]; Sebesta, Ferdinand [Prague, CZ]

    2009-12-08

    Composite media, systems, and devices for substantially removing, or otherwise processing, one or more constituents of a fluid stream. The composite media comprise a plurality of beads, each having a matrix substantially comprising polyacrylonitrile (PAN) and supporting one or more active components which are effective in removing, by various mechanisms, one or more constituents from a fluid stream. Due to the porosity and large surface area of the beads, a high level of contact is achieved between composite media of the present invention and the fluid stream being processed. Further, the homogeneity of the beads facilitates use of the beads in high volume applications where it is desired to effectively process a large volume of flow per unit of time.

  14. Evaluation of Bacillus oleronius as a Biological Indicator for Terminal Sterilization of Large-Volume Parenterals.

    PubMed

    Izumi, Masamitsu; Fujifuru, Masato; Okada, Aki; Takai, Katsuya; Takahashi, Kazuhiro; Udagawa, Takeshi; Miyake, Makoto; Naruyama, Shintaro; Tokuda, Hiroshi; Nishioka, Goro; Yoden, Hikaru; Aoki, Mitsuo

    2016-01-01

    In the production of large-volume parenterals in Japan, equipment and devices such as tanks, pipework, and filters used in production processes are exhaustively cleaned and sterilized, and the cleanliness of water for injection, drug materials, packaging materials, and manufacturing areas is well controlled. In this environment, the bioburden is relatively low and less heat resistant than microorganisms frequently used as biological indicators, such as Geobacillus stearothermophilus (ATCC 7953) and Bacillus subtilis 5230 (ATCC 35021). Consequently, the majority of large-volume parenteral solutions in Japan are manufactured under low-heat sterilization conditions of F0 < 2 min, so that loss of clarity of solutions and formation of degradation products of constituents are minimized. Bacillus oleronius (ATCC 700005) is listed as a biological indicator in "Guidance on the Manufacture of Sterile Pharmaceutical Products Produced by Terminal Sterilization" (guidance in Japan, issued in 2012). In this study, we investigated whether B. oleronius is an appropriate biological indicator of the efficacy of low-heat, moist-heat sterilization of large-volume parenterals. Specifically, we investigated the spore-forming ability of this microorganism in various cultivation media and measured the D-values and z-values as parameters of heat resistance. The D-values and z-values changed depending on the constituents of large-volume parenteral products. Also, the spores from B. oleronius showed a moist-heat resistance that was similar to or greater than that of many of the spore-forming organisms isolated from Japanese parenteral manufacturing processes. Taken together, these results indicate that B. oleronius is suitable as a biological indicator for sterility assurance of large-volume parenteral solutions subjected to low-heat, moist-heat terminal sterilization. © PDA, Inc. 2016.
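
    For readers unfamiliar with the heat-resistance parameters mentioned here, the standard textbook relations are, in general form (not specific to this study):

    ```latex
    % Survivor curve at constant temperature T; D_T is the time for a 1-log (90%) reduction.
    N(t) = N_0 \cdot 10^{-t/D_T}

    % z-value: the temperature change that shifts the D-value by a factor of 10.
    D_T = D_{T_{\mathrm{ref}}} \cdot 10^{(T_{\mathrm{ref}} - T)/z}

    % F_0: equivalent sterilization time at 121.1 °C (z = 10 °C) for a time-varying profile T(t).
    F_0 = \int_0^{t} 10^{\,\left(T(\tau) - 121.1\,^{\circ}\mathrm{C}\right)/z} \, d\tau
    ```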

  15. Strain heating in process zones; implications for metamorphism and partial melting in the lithosphere

    NASA Astrophysics Data System (ADS)

    Devès, Maud H.; Tait, Stephen R.; King, Geoffrey C. P.; Grandin, Raphaël

    2014-05-01

    Since the late 1970s, most earth scientists have discounted the plausibility of melting by shear-strain heating because temperature-dependent creep rheology leads to negative feedback and self-regulation. This paper presents a new model of distributed shear-strain heating that can account for the genesis of large volumes of magmas in both the crust and the mantle of the lithosphere. The kinematic frustration (in geometry and rates) associated with incompatible fault junctions (e.g. triple junctions) prevents localisation of all strain on the major faults. Instead, deformation distributes off the main faults, forming a large process zone that still deforms at high rates under both brittle and ductile conditions. The increased size of the shear-heated region minimises conductive heat loss, compared with that commonly associated with narrow shear zones, thus promoting strong heating and melting under reasonable rheological assumptions. Given the large volume of the heated zone, large volumes of melt can be generated even at small melt fractions.
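
    As general background (not taken from the paper), the volumetric shear-heating rate and the temperature-dependent viscosity responsible for the negative feedback mentioned above can be written as:

    ```latex
    % Heating rate per unit volume from dissipation at shear stress \tau and strain rate \dot{\gamma}.
    H = \tau \, \dot{\gamma} = \eta(T) \, \dot{\gamma}^{2}

    % Arrhenius-type effective viscosity: heating lowers \eta, which lowers H,
    % producing the self-regulation that narrow shear zones are subject to.
    \eta(T) = A \exp\!\left(\frac{Q}{RT}\right)
    ```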

  16. Enhanced FIB-SEM systems for large-volume 3D imaging

    PubMed Central

    Xu, C Shan; Hayworth, Kenneth J; Lu, Zhiyuan; Grob, Patricia; Hassan, Ahmed M; García-Cerdán, José G; Niyogi, Krishna K; Nogales, Eva; Weinberg, Richard J; Hess, Harald F

    2017-01-01

    Focused Ion Beam Scanning Electron Microscopy (FIB-SEM) can automatically generate 3D images with superior z-axis resolution, yielding data that needs minimal image registration and related post-processing. Obstacles blocking wider adoption of FIB-SEM include slow imaging speed and lack of long-term system stability, which caps the maximum possible acquisition volume. Here, we present techniques that accelerate image acquisition while greatly improving FIB-SEM reliability, allowing the system to operate for months and generating continuously imaged volumes > 10⁶ µm³. These volumes are large enough for connectomics, where the excellent z resolution can help in tracing of small neuronal processes and accelerate the tedious and time-consuming human proofreading effort. Even higher resolution can be achieved on smaller volumes. We present example data sets from mammalian neural tissue, Drosophila brain, and Chlamydomonas reinhardtii to illustrate the power of this novel high-resolution technique to address questions in both connectomics and cell biology. DOI: http://dx.doi.org/10.7554/eLife.25916.001 PMID:28500755

  17. A simple method for the production of large volume 3D macroporous hydrogels for advanced biotechnological, medical and environmental applications

    NASA Astrophysics Data System (ADS)

    Savina, Irina N.; Ingavle, Ganesh C.; Cundy, Andrew B.; Mikhalovsky, Sergey V.

    2016-02-01

    The development of bulk, three-dimensional (3D), macroporous polymers with high permeability, large surface area and large volume is highly desirable for a range of applications in the biomedical, biotechnological and environmental areas. The experimental techniques currently used are limited to the production of small size and volume cryogel material. In this work we propose a novel, versatile, simple and reproducible method for the synthesis of large volume porous polymer hydrogels by cryogelation. By controlling the freezing process of the reagent/polymer solution, large-scale 3D macroporous gels with wide interconnected pores (up to 200 μm in diameter) and large accessible surface area have been synthesized. For the first time, macroporous gels (of up to 400 ml bulk volume) with controlled porous structure were manufactured, with potential for scale up to much larger gel dimensions. This method can be used for production of novel 3D multi-component macroporous composite materials with a uniform distribution of embedded particles. The proposed method provides better control of freezing conditions and thus overcomes existing drawbacks limiting production of large gel-based devices and matrices. The proposed method could serve as a new design concept for functional 3D macroporous gels and composites preparation for biomedical, biotechnological and environmental applications.

  18. Determination of dissolved-phase pesticides in surface water from the Yakima River basin, Washington, using the Goulden large-sample extractor and gas chromatography/mass spectrometer

    USGS Publications Warehouse

    Foster, Gregory D.; Gates, Paul M.; Foreman, William T.; McKenzie, Stuart W.; Rinella, Frank A.

    1993-01-01

    Concentrations of pesticides in the dissolved phase of surface water samples from the Yakima River basin, WA, were determined using preconcentration in the Goulden large-sample extractor (GLSE) and gas chromatography/mass spectrometry (GC/MS) analysis. Sample volumes ranging from 10 to 120 L were processed with the GLSE, and the results from the large-sample analyses were compared to those derived from 1-L continuous liquid-liquid extractions. Few of the 40 target pesticides were detected in 1-L samples, whereas large-sample preconcentration in the GLSE provided detectable levels for many of the target pesticides. The number of pesticides detected in GLSE-processed samples was usually directly proportional to sample volume, although the measured concentrations of the pesticides were generally lower at the larger sample volumes for the same water source. The GLSE can be used to provide lower detection levels relative to conventional liquid-liquid extraction in GC/MS analysis of pesticides in samples of surface water.

  19. A multiresolution approach to iterative reconstruction algorithms in X-ray computed tomography.

    PubMed

    De Witte, Yoni; Vlassenbroeck, Jelle; Van Hoorebeke, Luc

    2010-09-01

    In computed tomography, the application of iterative reconstruction methods in practical situations is impeded by their high computational demands. Especially in high resolution X-ray computed tomography, where reconstruction volumes contain a high number of volume elements (several gigavoxels), this computational burden prevents their breakthrough in practice. Besides the large amount of calculations, iterative algorithms require the entire volume to be kept in memory during reconstruction, which quickly becomes cumbersome for large data sets. To overcome this obstacle, we present a novel multiresolution reconstruction, which greatly reduces the required amount of memory without significantly affecting the reconstructed image quality. It is shown that, combined with an efficient implementation on a graphical processing unit, the multiresolution approach enables the application of iterative algorithms in the reconstruction of large volumes at an acceptable speed using only limited resources.
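
    A minimal coarse-to-fine sketch of the multiresolution idea in Python/NumPy: solve on a downsampled grid first and use the upsampled result to initialize the full-resolution iterations, so only one resolution level is held in memory at a time. The toy two-view projector and the SIRT-style update are illustrative stand-ins, not the authors' GPU implementation:

    ```python
    import numpy as np
    from scipy.ndimage import zoom

    def forward_project(image):
        """Toy two-view 'CT' operator: column sums and row sums (illustrative only)."""
        return np.concatenate([image.sum(axis=0), image.sum(axis=1)])

    def back_project(sino, shape):
        """Adjoint of the toy operator: spread each projection value back over the image."""
        ny, nx = shape
        cols, rows = sino[:nx], sino[nx:]
        return np.tile(cols, (ny, 1)) + np.tile(rows[:, None], (1, nx))

    def sirt(sino, shape, x0=None, iterations=200, step=1e-3):
        """Plain SIRT-style iteration: x <- x + step * A^T (b - A x)."""
        x = np.zeros(shape) if x0 is None else x0
        for _ in range(iterations):
            x = x + step * back_project(sino - forward_project(x), shape)
        return x

    def multires_reconstruct(full_shape, sino, levels=2):
        """Coarse-to-fine reconstruction: each level's result, upsampled,
        initializes the next, so only one resolution is in memory at a time."""
        ny, nx = full_shape
        x = None
        for level in range(levels, -1, -1):
            f = 2 ** level
            shape = (ny // f, nx // f)
            if f == 1:
                sino_l = sino
            else:
                # Downscale the data consistently with the coarser grid (illustrative).
                sino_l = np.concatenate([zoom(sino[:nx], 1 / f) / f,
                                         zoom(sino[nx:], 1 / f) / f])
            x = sirt(sino_l, shape, x0=None if x is None else zoom(x, 2.0))
        return x

    # Tiny demo: reconstruct a 32x32 phantom from its toy two-view projections.
    truth = np.zeros((32, 32))
    truth[8:24, 8:24] = 1.0
    recon = multires_reconstruct(truth.shape, forward_project(truth))
    print(recon.shape, float(recon.sum()))
    ```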

  20. High-throughput, automated extraction of DNA and RNA from clinical samples using TruTip technology on common liquid handling robots.

    PubMed

    Holmberg, Rebecca C; Gindlesperger, Alissa; Stokes, Tinsley; Brady, Dane; Thakore, Nitu; Belgrader, Philip; Cooney, Christopher G; Chandler, Darrell P

    2013-06-11

    TruTip is a simple nucleic acid extraction technology whereby a porous, monolithic binding matrix is inserted into a pipette tip. The geometry of the monolith can be adapted for specific pipette tips ranging in volume from 1.0 to 5.0 ml. The large porosity of the monolith enables viscous or complex samples to readily pass through it with minimal fluidic backpressure. Bi-directional flow maximizes residence time between the monolith and sample, and enables large sample volumes to be processed within a single TruTip. The fundamental steps, irrespective of sample volume or TruTip geometry, include cell lysis, nucleic acid binding to the inner pores of the TruTip monolith, washing away unbound sample components and lysis buffers, and eluting purified and concentrated nucleic acids into an appropriate buffer. The attributes and adaptability of TruTip are demonstrated in three automated clinical sample processing protocols using an Eppendorf epMotion 5070, Hamilton STAR and STARplus liquid handling robots, including RNA isolation from nasopharyngeal aspirate, genomic DNA isolation from whole blood, and fetal DNA extraction and enrichment from large volumes of maternal plasma (respectively).

  1. A Parallel Pipelined Renderer for the Time-Varying Volume Data

    NASA Technical Reports Server (NTRS)

    Chiueh, Tzi-Cker; Ma, Kwan-Liu

    1997-01-01

    This paper presents a strategy for efficiently rendering time-varying volume data sets on a distributed-memory parallel computer. Time-varying volume data take large storage space and visualizing them requires reading large files continuously or periodically throughout the course of the visualization process. Instead of using all the processors to collectively render one volume at a time, a pipelined rendering process is formed by partitioning processors into groups to render multiple volumes concurrently. In this way, the overall rendering time may be greatly reduced because the pipelined rendering tasks are overlapped with the I/O required to load each volume into a group of processors; moreover, parallelization overhead may be reduced as a result of partitioning the processors. We modify an existing parallel volume renderer to exploit various levels of rendering parallelism and to study how the partitioning of processors may lead to optimal rendering performance. Two factors which are important to the overall execution time are resource utilization efficiency and pipeline startup latency. The optimal partitioning configuration is the one that balances these two factors. Tests on Intel Paragon computers show that in general optimal partitionings do exist for a given rendering task and result in a 40-50% saving in overall rendering time.

  2. Volume server: A scalable high speed and high capacity magnetic tape archive architecture with concurrent multi-host access

    NASA Technical Reports Server (NTRS)

    Rybczynski, Fred

    1993-01-01

    A major challenge facing data processing centers today is data management. This includes the storage of large volumes of data and access to it. Current media storage for large data volumes is typically off line and frequently off site in warehouses. Access to data archived in this fashion can be subject to long delays, errors in media selection and retrieval, and even loss of data through misplacement or damage to the media. Similarly, designers responsible for architecting systems capable of continuous high-speed recording of large volumes of digital data are faced with the challenge of identifying technologies and configurations that meet their requirements. Past approaches have tended to evaluate the combination of the fastest tape recorders with the highest capacity tape media and then to compromise technology selection as a consequence of cost. This paper discusses an architecture that addresses both of these challenges and proposes a cost effective solution based on robots, high speed helical scan tape drives, and large-capacity media.

  3. Protein Folding Using a Vortex Fluidic Device.

    PubMed

    Britton, Joshua; Smith, Joshua N; Raston, Colin L; Weiss, Gregory A

    2017-01-01

    Essentially all biochemistry and most molecular biology experiments require recombinant proteins. However, large, hydrophobic proteins typically aggregate into insoluble and misfolded species, and are directed into inclusion bodies. Current techniques to fold proteins recovered from inclusion bodies rely on denaturation followed by dialysis or rapid dilution. Such approaches can be time consuming, wasteful, and inefficient. Here, we describe rapid protein folding using a vortex fluidic device (VFD). This process uses mechanical energy introduced into thin films to rapidly and efficiently fold proteins. With the VFD in continuous flow mode, large volumes of protein solution can be processed per day with 100-fold reductions in both folding times and buffer volumes.

  4. A simple method for the production of large volume 3D macroporous hydrogels for advanced biotechnological, medical and environmental applications

    PubMed Central

    Savina, Irina N.; Ingavle, Ganesh C.; Cundy, Andrew B.; Mikhalovsky, Sergey V.

    2016-01-01

    The development of bulk, three-dimensional (3D), macroporous polymers with high permeability, large surface area and large volume is highly desirable for a range of applications in the biomedical, biotechnological and environmental areas. The experimental techniques currently used are limited to the production of small size and volume cryogel material. In this work we propose a novel, versatile, simple and reproducible method for the synthesis of large volume porous polymer hydrogels by cryogelation. By controlling the freezing process of the reagent/polymer solution, large-scale 3D macroporous gels with wide interconnected pores (up to 200 μm in diameter) and large accessible surface area have been synthesized. For the first time, macroporous gels (of up to 400 ml bulk volume) with controlled porous structure were manufactured, with potential for scale up to much larger gel dimensions. This method can be used for production of novel 3D multi-component macroporous composite materials with a uniform distribution of embedded particles. The proposed method provides better control of freezing conditions and thus overcomes existing drawbacks limiting production of large gel-based devices and matrices. The proposed method could serve as a new design concept for functional 3D macroporous gels and composites preparation for biomedical, biotechnological and environmental applications. PMID:26883390

  5. Mars Observer data production, transfer, and archival: The data production assembly line

    NASA Technical Reports Server (NTRS)

    Childs, David B.

    1993-01-01

    This paper describes the data production, transfer, and archival process designed for the Mars Observer Flight Project. It addresses the developmental and operational aspects of the archive collection production process. The developmental aspects cover the design and packaging of data products for archival and distribution to the planetary community. Also discussed is the design and development of a data transfer and volume production process capable of handling the large throughput and complexity of the Mars Observer data products. The operational aspects cover the main functions of the process: creating data and engineering products, collecting the data products and ancillary products in a central repository, producing archive volumes, validating volumes, archiving, and distributing the data to the planetary community.

  6. Design of a novel automated methanol feed system for pilot-scale fermentation of Pichia pastoris.

    PubMed

    Hamaker, Kent H; Johnson, Daniel C; Bellucci, Joseph J; Apgar, Kristie R; Soslow, Sherry; Gercke, John C; Menzo, Darrin J; Ton, Christopher

    2011-01-01

    Large-scale fermentation of Pichia pastoris requires a large volume of methanol feed during the induction phase. However, a large volume of methanol feed is difficult to use in the processing suite because of the inconvenience of constant monitoring, manual manipulation steps, and fire and explosion hazards. To optimize and improve safety of the methanol feed process, a novel automated methanol feed system has been designed and implemented for industrial fermentation of P. pastoris. Details of the design of the methanol feed system are described. The main goals of the design were to automate the methanol feed process and to minimize the hazardous risks associated with storing and handling large quantities of methanol in the processing area. The methanol feed system is composed of two main components: a bulk feed (BF) system and up to three portable process feed (PF) systems. The BF system automatically delivers methanol from a central location to the portable PF system. The PF system provides precise flow control of linear, step, or exponential feed of methanol to the fermenter. Pilot-scale fermentations with linear and exponential methanol feeds were conducted using two Mut(+) (methanol utilization plus) strains, one expressing a recombinant therapeutic protein and the other a monoclonal antibody. Results show that the methanol feed system is accurate, safe, and efficient. The feed rates for both linear and exponential feed methods were within ± 5% of the set points, and the total amount of methanol fed was within 1% of the targeted volume. Copyright © 2011 American Institute of Chemical Engineers (AIChE).

  7. Accessibility and Analysis to NASA's New Large Volume Missions

    NASA Astrophysics Data System (ADS)

    Hausman, J.; Gangl, M.; McAuley, J.; Toaz, R., Jr.

    2016-12-01

    Each new satellite mission continues to measure larger volumes of data than the last. This is especially true for the new NASA satellite missions NISAR and SWOT, launching in 2020 and 2021, which will produce petabytes of data a year. A major concern is how users will be able to analyze such volumes. This presentation will show how cloud storage and analysis can help meet multiple users' needs. While users may only need gigabytes of data for their research, the data center will need to leverage the processing power of the cloud to perform search and subsetting capabilities over the large volume of data. There is also a vast array of user types that require different tools and services to access and analyze the data. Some users need global data to run climate models, while others require small, dynamic regions with lots of analysis and transformations. There will also be a need to generate data that have different inputs or correction algorithms that the project may not be able to provide, as those will be very specialized for specific regions or will evolve more quickly than the project can reprocess. By having the data and tools side by side, users will be able to access the data they require and analyze it all in one place. By placing data in the cloud, users can analyze the data there, shifting the current "download and analyze" paradigm to "log-in and analyze". The cloud will provide the processing power needed to analyze large volumes of data, subset small regions over large volumes of data, and regenerate/reformat data to the specificity each user requires.

  8. Numerical simulation of seismic wave propagation from land-excited large volume air-gun source

    NASA Astrophysics Data System (ADS)

    Cao, W.; Zhang, W.

    2017-12-01

    The land-excited large volume air-gun source can be used to study regional underground structures and to detect temporal velocity changes. The air-gun source is characterized by rich low frequency energy (from bubble oscillation, 2-8 Hz) and high repeatability. It can be excited in rivers, reservoirs, or man-made pools. Numerical simulation of the seismic wave propagation from the air-gun source helps to understand the energy partitioning and the characteristics of the waveform records at stations. However, the effective energy recorded at a distant station comes from the process of bubble oscillation, which cannot be approximated by a single point source. We propose a method to simulate the seismic wave propagation from the land-excited large volume air-gun source by the finite difference method. The process can be divided into three parts: bubble oscillation and source coupling, solid-fluid coupling, and propagation in the solid medium. For the first part, the wavelet of the bubble oscillation can be simulated by a bubble model. We use a wave injection method, combining the bubble wavelet with the elastic wave equation, to achieve the source coupling. Then, the solid-fluid boundary condition is implemented along the water bottom. The last part is the seismic wave propagation in the solid medium, which can be readily implemented by the finite difference method. Our method yields accurate waveforms for the land-excited large volume air-gun source. Based on the above forward modeling technology, we analyze the effect of the excited P wave and the energy of the converted S wave for different water-body shapes. We study two land-excited large volume air-gun sites, one in Binchuan, Yunnan, and the other in Hutubi, Xinjiang. The Binchuan source is located in a large irregular reservoir, and the waveform records show a clear S wave. The Hutubi source is located in a small man-made pool, and the waveform records show a very weak S wave. A better understanding of the characteristics of the land-excited large volume air-gun can help make better use of the air-gun source.

  9. Meta-control of combustion performance with a data mining approach

    NASA Astrophysics Data System (ADS)

    Song, Zhe

    Large-scale combustion processes are complex and pose challenges for optimizing their performance. Traditional approaches based on thermal dynamics have limitations in finding optimal operational regions due to the time-shifting nature of the process. Recent advances in information technology enable people to collect large volumes of process data easily and continuously. The collected process data contain rich information about the process and, to some extent, represent a digital copy of the process over time. Although large volumes of data exist in industrial combustion processes, they are not fully utilized to the level where the process can be optimized. Data mining is an emerging science which finds patterns or models in large data sets. It has found many successful applications in business marketing, medical, and manufacturing domains. The focus of this dissertation is on applying data mining to industrial combustion processes, and ultimately optimizing the combustion performance. However, the philosophy, methods, and frameworks discussed in this research can also be applied to other industrial processes. Optimizing an industrial combustion process has two major challenges. One is that the underlying process model changes over time, and obtaining an accurate process model is nontrivial. The other is that a process model with high fidelity is usually highly nonlinear, so solving the optimization problem requires efficient heuristics. This dissertation sets out to solve these two major challenges. The major contribution of this 4-year research is a data-driven solution to optimize the combustion process, in which a process model or knowledge is identified from the process data, and optimization is then executed by evolutionary algorithms to search for optimal operating regions.

  10. Electronic Business Transaction Infrastructure Analysis Using Petri Nets and Simulation

    ERIC Educational Resources Information Center

    Feller, Andrew Lee

    2010-01-01

    Rapid growth in eBusiness has made industry and commerce increasingly dependent on the hardware and software infrastructure that enables high-volume transaction processing across the Internet. Large transaction volumes at major industrial-firm data centers rely on robust transaction protocols and adequately provisioned hardware capacity to ensure…

  11. An improved filter elution and cell culture assay procedure for evaluating public groundwater systems for culturable enteroviruses.

    PubMed

    Dahling, Daniel R

    2002-01-01

    Large-scale virus studies of groundwater systems require practical and sensitive procedures for both sample processing and viral assay. Filter adsorption-elution procedures have traditionally been used to process large-volume water samples for viruses. In this study, five filter elution procedures using cartridge filters were evaluated for their effectiveness in processing samples. Of the five procedures tested, the third method, which incorporated two separate beef extract elutions (one being an overnight filter immersion in beef extract), recovered 95% of seeded poliovirus compared with recoveries of 36 to 70% for the other methods. For viral enumeration, an expanded roller bottle quantal assay was evaluated using seeded poliovirus. This cytopathic-based method was considerably more sensitive than the standard plaque assay method. The roller bottle system was more economical than the plaque assay for the evaluation of comparable samples. Using roller bottles required less time and manipulation than the plaque procedure and greatly facilitated the examination of large numbers of samples. The combination of the improved filter elution procedure and the roller bottle assay for viral analysis makes large-scale virus studies of groundwater systems practical. This procedure was subsequently field tested during a groundwater study in which large-volume samples (exceeding 800 L) were processed through the filters.

  12. Apparatus and process for microbial detection and enumeration

    NASA Technical Reports Server (NTRS)

    Wilkins, J. R.; Grana, D. (Inventor)

    1982-01-01

    An apparatus and process for detecting and enumerating specific microorganisms from large volume samples containing small numbers of the microorganisms are presented. The large volume samples are filtered through a membrane filter to concentrate the microorganisms. The filter is positioned between two absorbent pads previously moistened with a growth medium for the microorganisms. A pair of electrodes is disposed against the filter, and the pad-electrode-filter assembly is retained within a petri dish by a retainer ring. The cover is positioned on the base of the petri dish and sealed at the edges with a parafilm seal prior to the assembly being electrically connected via connectors to a strip chart recorder for detecting and enumerating the microorganisms collected on the filter.

  13. Extraterrestrial processing and manufacturing of large space systems. Volume 3: Executive summary

    NASA Technical Reports Server (NTRS)

    Miller, R. H.; Smith, D. B. S.

    1979-01-01

    Facilities and equipment are defined for processes that refine to commercial grade the lunar material delivered to a 'space manufacturing facility' in beneficiated, primary-processed form. The manufacturing facilities and equipment for producing elements of large space systems from these materials are also defined, and programmatic assessments of the concepts are provided. In-space production processes for solar cells (by vapor deposition) and arrays, structures and joints, conduits, waveguides, RF equipment, radiators, wire cables, converters, and other elements are described.

  14. Stormbow: A Cloud-Based Tool for Reads Mapping and Expression Quantification in Large-Scale RNA-Seq Studies

    PubMed Central

    Zhao, Shanrong; Prenger, Kurt; Smith, Lance

    2013-01-01

    RNA-Seq is becoming a promising replacement for microarrays in transcriptome profiling and differential gene expression studies. Technical improvements have decreased sequencing costs and, as a result, the size and number of RNA-Seq datasets have increased rapidly. However, the increasing volume of data from large-scale RNA-Seq studies poses a practical challenge for data analysis in a local environment. To meet this challenge, we developed Stormbow, a cloud-based software package, to process large volumes of RNA-Seq data in parallel. The performance of Stormbow has been tested by practically applying it to analyse 178 RNA-Seq samples in the cloud. In our test, it took 6 to 8 hours to process an RNA-Seq sample with 100 million reads, and the average cost was $3.50 per sample. Utilizing Amazon Web Services as the infrastructure for Stormbow allows us to easily scale up to handle large datasets with on-demand computational resources. Stormbow is a scalable, cost-effective, open-source tool for large-scale RNA-Seq data analysis. Stormbow can be freely downloaded and can be used out of the box to process Illumina RNA-Seq datasets. PMID:25937948

  15. Stormbow: A Cloud-Based Tool for Reads Mapping and Expression Quantification in Large-Scale RNA-Seq Studies.

    PubMed

    Zhao, Shanrong; Prenger, Kurt; Smith, Lance

    2013-01-01

    RNA-Seq is becoming a promising replacement for microarrays in transcriptome profiling and differential gene expression studies. Technical improvements have decreased sequencing costs and, as a result, the size and number of RNA-Seq datasets have increased rapidly. However, the increasing volume of data from large-scale RNA-Seq studies poses a practical challenge for data analysis in a local environment. To meet this challenge, we developed Stormbow, a cloud-based software package, to process large volumes of RNA-Seq data in parallel. The performance of Stormbow has been tested by practically applying it to analyse 178 RNA-Seq samples in the cloud. In our test, it took 6 to 8 hours to process an RNA-Seq sample with 100 million reads, and the average cost was $3.50 per sample. Utilizing Amazon Web Services as the infrastructure for Stormbow allows us to easily scale up to handle large datasets with on-demand computational resources. Stormbow is a scalable, cost-effective, open-source tool for large-scale RNA-Seq data analysis. Stormbow can be freely downloaded and can be used out of the box to process Illumina RNA-Seq datasets.

  16. Public-Private Partnership: Joint recommendations to improve downloads of large Earth observation data

    NASA Astrophysics Data System (ADS)

    Ramachandran, R.; Murphy, K. J.; Baynes, K.; Lynnes, C.

    2016-12-01

    With the volume of Earth observation data expanding rapidly, cloud computing is quickly changing the way Earth observation data are processed, analyzed, and visualized. The cloud infrastructure provides the flexibility to scale up to large volumes of data and handle high velocity data streams efficiently. Having freely available Earth observation data collocated on a cloud infrastructure creates opportunities for innovation and value-added data re-use in ways unforeseen by the original data provider. These innovations spur new industries and applications and spawn new scientific pathways that were previously limited by data volume and computational infrastructure issues. NASA, in collaboration with Amazon, Google, and Microsoft, has jointly developed a set of recommendations to enable efficient transfer of Earth observation data from existing data systems to a cloud computing infrastructure. The purpose of these recommendations is to provide guidelines against which data providers can evaluate existing data systems and address any issues uncovered, enabling efficient search, access, and use of large volumes of data. Additionally, these guidelines ensure that all cloud providers utilize a common methodology for bulk-downloading data from data providers, avoiding the need for data providers to build custom capabilities to meet the needs of individual cloud providers. The intent is to share these recommendations with other Federal agencies and organizations that serve Earth observation data to enable efficient search, access, and use of large volumes of data. Additionally, the adoption of these recommendations will benefit data users interested in moving large volumes of data from data systems to any other location. These data users include the cloud providers, cloud users such as scientists, and other users working in a high performance computing environment who need to move large volumes of data.

  17. Assessment of the Study of Army Logistics 1981. Volume II. Analysis of Recommendations.

    DTIC Science & Technology

    1983-02-01

    conceived. This third generation equipment, because of its size, cost and processing characteristics, demands large scale integrated processing with a... generated by DS4. Three systems changes to SAILS ABX have been implemented which reduce the volume of supply status provided to the DS4 system. 15... generated by the wholesale system by 50 percent or nearly 1,000,000 transactions per month. Additional reductions will be generated by selected status

  18. Computerized Production Process Planning. Volume 2. Benefit Analysis.

    DTIC Science & Technology

    1976-11-01

    advantage, in the long term, Systems 2 and 3 will return greater economic benefits. Plots of the cumulative present value of the cash flow by year are...is economically viable for large parts manufacturers and does offer significant advantages over Systems 1 and 2 in terms of intangible benefits...AD-RI51 996 COMPUTERIZED PRODUCTION PROCESS PLANNING VOLUME 2 i/1.. BENEFIT ANALYSIS(U) IIT RESEARCH INST CHICAGO IL SH H HU ET AL. NOV 76 DAAHNi-76

  19. Research on volume metrology method of large vertical energy storage tank based on internal electro-optical distance-ranging method

    NASA Astrophysics Data System (ADS)

    Hao, Huadong; Shi, Haolei; Yi, Pengju; Liu, Ying; Li, Cunjun; Li, Shuguang

    2018-01-01

    A volume metrology method based on the internal electro-optical distance-ranging method is established for large vertical energy storage tanks. After analyzing the mathematical model for vertical tank volume calculation, the key processing algorithms for the point cloud data, such as gross error elimination, filtering, streamlining, and radius calculation, are studied. The corresponding volume values at different liquid levels are automatically calculated by computing the cross-sectional area along the horizontal direction and integrating along the vertical direction. To design the comparison system, a vertical tank with a nominal capacity of 20,000 m³ is selected as the research object, and the method is shown to have good repeatability and reproducibility. Using the conventional capacity measurement method as a reference, the relative deviation of the calculated volume is less than 0.1%, meeting the measurement requirements and demonstrating the feasibility and effectiveness of the method.
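
    A minimal sketch of the slice-and-integrate idea described here: estimate a radius and cross-sectional area for each horizontal band of the point cloud, then accumulate area times height to build a capacity table. The crude mean-radius estimate and the synthetic cylinder are illustrative only; they are not the authors' algorithms:

    ```python
    import numpy as np

    def capacity_table(points, n_slices=100):
        """points: (N, 3) array of x, y, z scanner coordinates on the tank shell.
        Returns (levels, volumes): cumulative volume below each height level."""
        z = points[:, 2]
        edges = np.linspace(z.min(), z.max(), n_slices + 1)
        centers = 0.5 * (edges[:-1] + edges[1:])
        areas = np.empty(n_slices)
        for i in range(n_slices):
            band = points[(z >= edges[i]) & (z < edges[i + 1])]
            # Mean distance from the slice centroid is a crude radius estimate
            # (a least-squares circle fit would be used in practice).
            cx, cy = band[:, 0].mean(), band[:, 1].mean()
            r = np.hypot(band[:, 0] - cx, band[:, 1] - cy).mean()
            areas[i] = np.pi * r ** 2
        volumes = np.cumsum(areas * np.diff(edges))   # integrate cross-sections up the tank
        return centers, volumes

    # Synthetic demo: a 10 m radius, 20 m tall cylinder should give ~6283 m^3 when full.
    rng = np.random.default_rng(1)
    theta = rng.uniform(0, 2 * np.pi, 200_000)
    zs = rng.uniform(0, 20, 200_000)
    cloud = np.column_stack([10 * np.cos(theta), 10 * np.sin(theta), zs])
    levels, vols = capacity_table(cloud)
    print(f"volume at top level ≈ {vols[-1]:.0f} m^3")
    ```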

  20. High performance concentration method for viruses in drinking water.

    PubMed

    Kunze, Andreas; Pei, Lu; Elsässer, Dennis; Niessner, Reinhard; Seidel, Michael

    2015-09-15

    According to the risk assessment of the WHO, highly infectious pathogenic viruses like rotaviruses should not be present in large-volume drinking water samples of up to 90 m³. On the other hand, quantification methods for viruses are only operable in small volumes, and presently no concentration procedure for processing such large volumes has been reported. Therefore, the aim of this study was to demonstrate a procedure for processing viruses in-line in a drinking water pipeline by ultrafiltration (UF), with consecutive further concentration of viruses by monolithic filtration (MF) and centrifugal ultrafiltration (CeUF) to a final 1-mL sample. For testing this concept, the model virus bacteriophage MS2 was spiked continuously into the UF instrumentation. Tap water was processed in volumes between 32.4 m³ (22 h) and 97.7 m³ (72 h) continuously, either in dead-end (DE) or cross-flow (CF) mode. The best results were found for DE-UF over 22 h. The concentration of MS2 was increased from 4.2×10⁴ GU/mL (genomic units per milliliter) to 3.2×10¹⁰ GU/mL and from 71 PFU/mL to 2×10⁸ PFU/mL as determined by qRT-PCR and plaque assay, respectively. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering

    NASA Technical Reports Server (NTRS)

    Maly, K.

    1998-01-01

    Monitoring is an essential process to observe and improve the reliability and the performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events is generated by the system components during their execution or interaction with external objects (e.g. users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing status information required for debugging, tuning and managing such applications. However, correlated events are generated concurrently and could be distributed in various locations in the application environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable high-performance monitoring architecture for LSD systems to detect and classify interesting local and global events and disseminate the monitoring information to the corresponding end-point management applications, such as debugging and reactive control tools, to improve the application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture employs a high-performance event filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. Our architecture also supports dynamic and flexible reconfiguration of the monitoring mechanism via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and the performance of the Interactive Remote Instruction (IRI) system, which is a large-scale distributed system for collaborative distance learning. The filtering mechanism represents an intrinsic component integrated with the monitoring architecture to reduce the volume of event traffic flow in the system, and thereby reduce the intrusiveness of the monitoring process. We are developing an event filtering architecture to efficiently process the large volume of event traffic generated by LSD systems (such as distributed interactive applications). This filtering architecture is used to monitor a collaborative distance learning application to obtain debugging and feedback information. Our architecture supports the dynamic (re)configuration and optimization of event filters in large-scale distributed systems. Our work represents a major contribution by (1) surveying and evaluating existing event filtering mechanisms for monitoring LSD systems and (2) devising an integrated, scalable, high-performance event filtering architecture that spans several key application domains, presenting techniques to improve functionality, performance and scalability. This paper describes the primary characteristics and challenges of developing high-performance event filtering for monitoring LSD systems. We survey existing event filtering mechanisms and explain the key characteristics of each technique. In addition, we discuss limitations of existing event filtering mechanisms and outline how our architecture will improve key aspects of event filtering.
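
    A minimal sketch of predicate-based event filtering with subscriptions, as described above; the event fields and the filter API are hypothetical, not the paper's actual IRI instrumentation:

    ```python
    from dataclasses import dataclass, field
    from typing import Callable, Dict, List

    @dataclass
    class Event:
        source: str          # component that generated the event
        kind: str            # e.g. "error", "latency", "heartbeat"
        payload: dict = field(default_factory=dict)

    Predicate = Callable[[Event], bool]

    class EventFilter:
        """Forwards only events matching a subscriber's predicates, reducing the
        volume of monitoring traffic that reaches each management endpoint."""

        def __init__(self) -> None:
            self._subs: Dict[str, List[Predicate]] = {}

        def subscribe(self, endpoint: str, predicate: Predicate) -> None:
            self._subs.setdefault(endpoint, []).append(predicate)

        def dispatch(self, event: Event) -> List[str]:
            """Return the endpoints this event should be forwarded to."""
            return [ep for ep, preds in self._subs.items()
                    if any(p(event) for p in preds)]

    # Example: a debugger only wants error events, a tuner only slow responses.
    f = EventFilter()
    f.subscribe("debugger", lambda e: e.kind == "error")
    f.subscribe("tuner", lambda e: e.kind == "latency" and e.payload.get("ms", 0) > 200)

    print(f.dispatch(Event("audio-server", "latency", {"ms": 350})))   # ['tuner']
    print(f.dispatch(Event("whiteboard", "heartbeat")))                # []
    ```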

  2. Semi-automated Neuron Boundary Detection and Nonbranching Process Segmentation in Electron Microscopy Images

    PubMed Central

    Jurrus, Elizabeth; Watanabe, Shigeki; Giuly, Richard J.; Paiva, Antonio R. C.; Ellisman, Mark H.; Jorgensen, Erik M.; Tasdizen, Tolga

    2013-01-01

    Neuroscientists are developing new imaging techniques and generating large volumes of data in an effort to understand the complex structure of the nervous system. The complexity and size of this data makes human interpretation a labor-intensive task. To aid in the analysis, new segmentation techniques for identifying neurons in these feature-rich datasets are required. This paper presents a method for neuron boundary detection and nonbranching process segmentation in electron microscopy images, and for visualizing them in three dimensions. It combines automated segmentation techniques with a graphical user interface for correction of mistakes in the automated process. The automated process first uses machine learning and image processing techniques to identify neuron membranes that delineate the cells in each two-dimensional section. To segment nonbranching processes, the cell regions in each two-dimensional section are connected in 3D using correlation of regions between sections. The combination of this method with a graphical user interface specially designed for this purpose enables users to quickly segment cellular processes in large volumes. PMID:22644867
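
    A minimal sketch of the section-to-section linking step described here: label the cell regions in each 2D section and connect a region to the region in the next section with which it overlaps most (a simple stand-in for the correlation-based matching the authors describe):

    ```python
    import numpy as np
    from scipy.ndimage import label

    def link_sections(mask_a, mask_b):
        """mask_a, mask_b: binary 2D masks of cell interiors in adjacent sections.
        Returns {label_in_a: best matching label_in_b} based on pixel overlap."""
        lab_a, n_a = label(mask_a)
        lab_b, _ = label(mask_b)
        links = {}
        for ra in range(1, n_a + 1):
            overlaps = lab_b[lab_a == ra]            # labels of b under region ra
            overlaps = overlaps[overlaps > 0]
            if overlaps.size:
                links[ra] = int(np.bincount(overlaps).argmax())
        return links

    # Tiny demo: one square region that shifts slightly between sections.
    a = np.zeros((20, 20), bool); a[4:10, 4:10] = True
    b = np.zeros((20, 20), bool); b[5:11, 5:11] = True
    print(link_sections(a, b))   # {1: 1}
    ```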

  3. Onboard Classification of Hyperspectral Data on the Earth Observing One Mission

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Tran, Daniel; Schaffer, Steve; Rabideau, Gregg; Davies, Ashley Gerard; Doggett, Thomas; Greeley, Ronald; Ip, Felipe; Baker, Victor; Doubleday, Joshua

    2009-01-01

    Remotely sensed hyperspectral data present significant challenges for downlink due to their large data volumes. This paper describes a research program designed to process hyperspectral data products onboard spacecraft to (a) reduce data downlink volumes and (b) decrease latency in providing key data products (often by enabling use of lower data rate communications systems). We describe efforts to develop onboard processing to study volcanoes, floods, and the cryosphere, using the Hyperion hyperspectral imager and onboard processing for the Earth Observing One (EO-1) mission, as well as preliminary work targeting the Hyperspectral Infrared Imager (HyspIRI) mission.

  4. Interactive Computing and Processing of NASA Land Surface Observations Using Google Earth Engine

    NASA Technical Reports Server (NTRS)

    Molthan, Andrew; Burks, Jason; Bell, Jordan

    2016-01-01

    Google's Earth Engine offers a "big data" approach to processing large volumes of NASA and other remote sensing products (https://earthengine.google.com/). Interfaces include JavaScript- and Python-based APIs, useful for accessing and processing long periods of record of Landsat and MODIS observations. Other data sets are frequently added, including weather and climate model data sets. The demonstrations here focus on exploratory efforts to perform land surface change detection related to severe weather and other disaster events.
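
    A minimal, hedged sketch of the kind of Earth Engine Python API workflow referred to above is shown below; it assumes the public earthengine-api client and an authenticated account, and the dataset ID, date ranges, and point location are placeholders rather than values from the presentation:

```python
# Sketch of a before/after composite for land-surface change detection with the
# Earth Engine Python API. Dataset ID, dates, and coordinates are placeholders.
import ee

ee.Initialize()  # assumes ee.Authenticate() has already been run once

collection = ee.ImageCollection('MODIS/006/MOD13Q1').select('NDVI')
before = collection.filterDate('2016-03-01', '2016-04-01').median()
after  = collection.filterDate('2016-05-01', '2016-06-01').median()
change = after.subtract(before)

# Sample the NDVI change at an illustrative point of interest (lon, lat).
point = ee.Geometry.Point([-86.58, 34.73])
value = change.reduceRegion(reducer=ee.Reducer.mean(),
                            geometry=point, scale=250).getInfo()
print(value)   # dictionary of band name -> change value in scaled MODIS NDVI units
```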

  5. Monitoring landscape level processes using remote sensing of large plots

    Treesearch

    Raymond L. Czaplewski

    1991-01-01

    Global and regional assessments require timely information on landscape level status (e.g., areal extent of different ecosystems) and processes (e.g., changes in land use and land cover). To measure and understand these processes at the regional level, and to model their impacts, remote sensing is often necessary. However, processing massive volumes of remotely sensed...

  6. EEG based zero-phase phase-locking value (PLV) and effects of spatial filtering during actual movement.

    PubMed

    Jian, Wenjuan; Chen, Minyou; McFarland, Dennis J

    2017-04-01

    Phase-locking value (PLV) is a well-known feature in sensorimotor rhythm (SMR) based BCI. Zero-phase PLV has not been explored because it is generally regarded as the result of volume conduction. Because spatial filters are often used to enhance the amplitude (square root of band power (BP)) feature and attenuate volume conduction, they are frequently applied as pre-processing methods when computing PLV. However, the effects of spatial filtering on PLV are ambiguous. Therefore, this article aims to explore whether zero-phase PLV is meaningful and how this is influenced by spatial filtering. Based on archival EEG data of left and right hand movement tasks for 32 subjects, we compared BP and PLV feature using data with and without pre-processing by a large Laplacian. Results showed that using ear-referenced data, zero-phase PLV provided unique information independent of BP for task prediction which was not explained by volume conduction and was significantly decreased when a large Laplacian was applied. In other words, the large Laplacian eliminated the useful information in zero-phase PLV for task prediction suggesting that it contains effects of both amplitude and phase. Therefore, zero-phase PLV may have functional significance beyond volume conduction. The interpretation of spatial filtering may be complicated by effects of phase. Copyright © 2017 Elsevier Inc. All rights reserved.
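
    For reference, the phase-locking value itself is straightforward to compute. The sketch below is a generic implementation; the band edges, filter order, and the averaging over time rather than over trials are simplifying assumptions made for the example, not details taken from this study:

```python
# Generic zero-lag phase-locking value (PLV) between two EEG channels:
# PLV = |mean(exp(i * (phase_x - phase_y)))|.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def plv(x: np.ndarray, y: np.ndarray, fs: float, band=(8.0, 12.0)) -> float:
    """PLV between two equal-length channels, band-pass filtered first."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype='band')
    phase_x = np.angle(hilbert(filtfilt(b, a, x)))
    phase_y = np.angle(hilbert(filtfilt(b, a, y)))
    return float(np.abs(np.mean(np.exp(1j * (phase_x - phase_y)))))

# Toy demo: two noisy 10 Hz signals sharing a common phase are strongly locked.
fs = 256.0
t = np.arange(0, 4, 1 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)
y = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)
print(round(plv(x, y, fs), 3))   # close to 1.0
```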

  7. Coupling of RF antennas to large volume helicon plasma

    NASA Astrophysics Data System (ADS)

    Chang, Lei; Hu, Xinyue; Gao, Lei; Chen, Wei; Wu, Xianming; Sun, Xinfeng; Hu, Ning; Huang, Chongxiang

    2018-04-01

    Large volume helicon plasma sources are of particular interest for large scale semiconductor processing, high power plasma propulsion and, recently, plasma-material interaction under fusion conditions. This work is devoted to studying the coupling of four typical RF antennas to helicon plasma with infinite length and a diameter of 0.5 m, and to exploring its frequency dependence in the range of 13.56-70 MHz for coupling optimization. It is found that the loop antenna is more efficient than the half helix, Boswell and Nagoya III antennas for power absorption; a radially parabolic density profile outperforms a Gaussian density profile in terms of antenna coupling for low-density plasma, but the superiority reverses for high-density plasma. Increasing the driving frequency shifts power absorption toward the plasma edge, but the overall power absorption increases with frequency. Perpendicular stream plots of the wave magnetic field, wave electric field and perturbed current are also presented. This work can serve as an important reference for the experimental design of large volume helicon plasma sources with high RF power.

  8. Scalable Performance Measurement and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamblin, Todd

    2009-01-01

    Concurrency levels in large-scale, distributed-memory supercomputers are rising exponentially. Modern machines may contain 100,000 or more microprocessor cores, and the largest of these, IBM's Blue Gene/L, contains over 200,000 cores. Future systems are expected to support millions of concurrent tasks. In this dissertation, we focus on efficient techniques for measuring and analyzing the performance of applications running on very large parallel machines. Tuning the performance of large-scale applications can be a subtle and time-consuming task because application developers must measure and interpret data from many independent processes. While the volume of the raw data scales linearly with the number of tasks in the running system, the number of tasks is growing exponentially, and data for even small systems quickly becomes unmanageable. Transporting performance data from so many processes over a network can perturb application performance and make measurements inaccurate, and storing such data would require a prohibitive amount of space. Moreover, even if it were stored, analyzing the data would be extremely time-consuming. In this dissertation, we present novel methods for reducing performance data volume. The first draws on multi-scale wavelet techniques from signal processing to compress systemwide, time-varying load-balance data. The second uses statistical sampling to select a small subset of running processes to generate low-volume traces. A third approach combines sampling and wavelet compression to stratify performance data adaptively at run-time and to reduce further the cost of sampled tracing. We have integrated these approaches into Libra, a toolset for scalable load-balance analysis. We present Libra and show how it can be used to analyze data from large scientific applications scalably.
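
    The wavelet-compression idea can be illustrated with a short sketch. This is not the Libra implementation; the wavelet family, decomposition level, and the 5% coefficient-retention rate are arbitrary choices made only for the example:

```python
# Compress a per-process load-balance trace by keeping only the largest wavelet
# coefficients, then reconstruct and report the relative error.
import numpy as np
import pywt

def compress(load: np.ndarray, keep: float = 0.05):
    coeffs = pywt.wavedec(load, 'haar')
    flat = np.concatenate(coeffs)
    cutoff = np.quantile(np.abs(flat), 1.0 - keep)      # keep top 5% by magnitude
    return [np.where(np.abs(c) >= cutoff, c, 0.0) for c in coeffs]

def decompress(coeffs) -> np.ndarray:
    return pywt.waverec(coeffs, 'haar')

# Toy demo: 1024 "processes" with a smooth load imbalance plus noise.
x = np.sin(np.linspace(0, 6 * np.pi, 1024)) + 0.05 * np.random.randn(1024)
xc = decompress(compress(x))
err = np.linalg.norm(x - xc[:x.size]) / np.linalg.norm(x)
print("relative reconstruction error:", round(err, 3))
```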

  9. Semi-Automated Neuron Boundary Detection and Nonbranching Process Segmentation in Electron Microscopy Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jurrus, Elizabeth R.; Watanabe, Shigeki; Giuly, Richard J.

    2013-01-01

    Neuroscientists are developing new imaging techniques and generating large volumes of data in an effort to understand the complex structure of the nervous system. The complexity and size of this data makes human interpretation a labor-intensive task. To aid in the analysis, new segmentation techniques for identifying neurons in these feature-rich datasets are required. This paper presents a method for neuron boundary detection and nonbranching process segmentation in electron microscopy images, and for visualizing them in three dimensions. It combines automated segmentation techniques with a graphical user interface for correcting mistakes in the automated process. The automated process first uses machine learning and image processing techniques to identify neuron membranes that delineate the cells in each two-dimensional section. To segment nonbranching processes, the cell regions in each two-dimensional section are connected in 3D using correlation of regions between sections. The combination of this method with a graphical user interface specially designed for this purpose enables users to quickly segment cellular processes in large volumes.

  10. Atmospheric energetics in regions of intense convective activity

    NASA Technical Reports Server (NTRS)

    Fuelberg, H. E.

    1977-01-01

    Synoptic-scale budgets of kinetic and total potential energy are computed using 3- and 6-h data at nine times from NASA's fourth Atmospheric Variability Experiment (AVE IV). Two intense squall lines occurred during the period. Energy budgets for areas that enclose regions of intense convection are shown to have systematic changes that relate to the life cycles of the convection. Some of the synoptic-scale energy processes associated with the convection are found to be larger than those observed in the vicinity of mature cyclones. Volumes enclosing intense convection are found to have large values of cross-contour conversion of potential to kinetic energy and large horizontal export of kinetic energy. Although small net vertical transport of kinetic energy is observed, values at individual layers indicate large upward transport. Transfer of kinetic energy from grid to subgrid scales of motion occurs in the volumes. Latent heat release is large in the middle and upper troposphere and is thought to be the cause of the observed cyclic changes in the budget terms. Total potential energy is found to be imported horizontally in the lower half of the atmosphere, transported aloft, and then exported horizontally. Although local changes of kinetic energy and total potential energy are small, interaction between volumes enclosing convection with surrounding larger volumes is quite large.

  11. Gully development processes in the Ethiopian Highlands

    USDA-ARS?s Scientific Manuscript database

    Gully erosion is an important soil degradation process in a range of environments, causing considerable soil losses and producing large volumes of sediment. In Ethiopia, gully erosion is a major problem, expanding at an alarming rate and devastating cultivated and grazing lands. The objective of the stu...

  12. INTELLIGENT DECISION SUPPORT FOR WASTE MINIMIZATION IN ELECTROPLATING PLANTS. (R824732)

    EPA Science Inventory

    Abstract

    Wastewater, spent solvent, spent process solutions, and sludge are the major waste streams generated in large volumes daily in electroplating plants. These waste streams can be significantly minimized through process modification and operational improvement. I...

  13. Practical Applications of Data Processing to School Purchasing.

    ERIC Educational Resources Information Center

    California Association of School Business Officials, San Diego. Imperial Section.

    Electronic data processing provides a fast and accurate system for handling large volumes of routine data. If properly employed, computers can perform myriad functions for purchasing operations, including purchase order writing; equipment inventory control; vendor inventory; and equipment acquisition, transfer, and retirement. The advantages of…

  14. Ultra-large scale AFM of lipid droplet arrays: investigating the ink transfer volume in dip pen nanolithography.

    PubMed

    Förste, Alexander; Pfirrmann, Marco; Sachs, Johannes; Gröger, Roland; Walheim, Stefan; Brinkmann, Falko; Hirtz, Michael; Fuchs, Harald; Schimmel, Thomas

    2015-05-01

    There are only a few quantitative studies on the writing process in dip-pen nanolithography with lipids. Lipids are important carrier ink molecules for the delivery of bio-functional patterns in bio-nanotechnology. In order to better understand and control the writing process, more information on the transfer of lipid material from the tip to the substrate is needed. The dependence of the transferred ink volume on the dwell time of the tip on the substrate was investigated by topography measurements with an atomic force microscope (AFM) characterized by an ultra-large scan range of 800 × 800 μm². For this purpose, arrays of dots of the phospholipid 1,2-dioleoyl-sn-glycero-3-phosphocholine were written onto planar glass substrates and the resulting pattern was imaged by large scan area AFM. Two writing regimes were identified, characterized by either a steady decline or a constant ink volume transfer per dot feature. For the steady-state ink transfer, a linear relationship between the dwell time and the dot volume was determined, characterized by a flow rate of about 16 femtoliters per second. A dependence of the ink transport on the length of pauses before and in between writing the structures was observed and should be taken into account during pattern design when aiming at the best writing homogeneity. The ultra-large scan range of the AFM allowed for a simultaneous study of the entire preparation area of almost 1 mm², yielding good statistical results.
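
    The reported steady-state behaviour amounts to a simple linear model, volume = flow rate x dwell time. The toy calculation below uses the ~16 fL/s figure from the abstract; the dwell times and noise level are illustrative:

```python
# Worked example of the steady-state ink transfer: dot volume grows roughly
# linearly with dwell time at about 16 femtoliters per second.
import numpy as np

RATE_FL_PER_S = 16.0                                   # flow rate from the abstract, fL/s

dwell_times = np.array([0.5, 1.0, 2.0, 4.0])           # seconds the tip rests per dot
volumes = RATE_FL_PER_S * dwell_times                  # expected dot volumes in fL
for t, v in zip(dwell_times, volumes):
    print(f"dwell {t:4.1f} s  ->  ~{v:5.1f} fL transferred")

# Recovering the rate from (noisy) measured volumes with a linear fit:
measured = volumes + np.random.normal(0, 1.0, volumes.size)
slope, intercept = np.polyfit(dwell_times, measured, 1)
print(f"fitted flow rate ~ {slope:.1f} fL/s")
```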

  15. Ultra-large scale AFM of lipid droplet arrays: investigating the ink transfer volume in dip pen nanolithography

    NASA Astrophysics Data System (ADS)

    Förste, Alexander; Pfirrmann, Marco; Sachs, Johannes; Gröger, Roland; Walheim, Stefan; Brinkmann, Falko; Hirtz, Michael; Fuchs, Harald; Schimmel, Thomas

    2015-05-01

    There are only a few quantitative studies on the writing process in dip-pen nanolithography with lipids. Lipids are important carrier ink molecules for the delivery of bio-functional patterns in bio-nanotechnology. In order to better understand and control the writing process, more information on the transfer of lipid material from the tip to the substrate is needed. The dependence of the transferred ink volume on the dwell time of the tip on the substrate was investigated by topography measurements with an atomic force microscope (AFM) characterized by an ultra-large scan range of 800 × 800 μm². For this purpose, arrays of dots of the phospholipid 1,2-dioleoyl-sn-glycero-3-phosphocholine were written onto planar glass substrates and the resulting pattern was imaged by large scan area AFM. Two writing regimes were identified, characterized by either a steady decline or a constant ink volume transfer per dot feature. For the steady-state ink transfer, a linear relationship between the dwell time and the dot volume was determined, characterized by a flow rate of about 16 femtoliters per second. A dependence of the ink transport on the length of pauses before and in between writing the structures was observed and should be taken into account during pattern design when aiming at the best writing homogeneity. The ultra-large scan range of the AFM allowed for a simultaneous study of the entire preparation area of almost 1 mm², yielding good statistical results.

  16. Biosynthesis of indigo using recombinant E. coli: Development of a biological system for the cost-effective production of a large volume chemical

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berry, A.; Battist, S.; Chotani, G.

    1995-11-01

    Cost-effective production of any large-volume chemical by fermentation requires extensive manipulation of both the production organism and the fermentation and recovery processes. We have developed a recombinant E. coli system for the production of tryptophan and several other products derived from the aromatic amino acid pathway. By linking our technology for low-cost production of tryptophan from glucose with the enzyme naphthalene dioxygenase (NDO), we have achieved an overall process for the production of indigo dye from glucose. To successfully join these two technologies, both the tryptophan pathway and NDO were extensively modified via genetic engineering. In addition, systems were developed to remove deleterious by-products generated during the chemical oxidations leading to indigo formation. Low-cost fermentation processes were developed that utilized minimal-salts media containing glucose as the sole carbon source. Finally, economical recovery processes were used that preserved the environmental friendliness of the biosynthetic route to indigo.

  17. Leveraging human oversight and intervention in large-scale parallel processing of open-source data

    NASA Astrophysics Data System (ADS)

    Casini, Enrico; Suri, Niranjan; Bradshaw, Jeffrey M.

    2015-05-01

    The popularity of cloud computing, along with the increased availability of cheap storage, has led to the necessity of elaborating and transforming large volumes of open-source data, all in parallel. One way to handle such extensive volumes of information properly is to take advantage of distributed computing frameworks like Map-Reduce. Unfortunately, an entirely automated approach that excludes human intervention is often unpredictable and error prone. Highly accurate data processing and decision-making can be achieved by supporting an automatic process through human collaboration, in a variety of environments such as warfare, cyber security and threat monitoring. Although this mutual participation seems easily exploitable, human-machine collaboration in the field of data analysis presents several challenges. First, due to the asynchronous nature of human intervention, it is necessary to verify that once a correction is made, all the necessary reprocessing is propagated down the processing chain. Second, it is often necessary to minimize the amount of reprocessing in order to optimize the usage of resources due to limited availability. In order to improve on these strict requirements, this paper introduces improvements to an innovative approach for human-machine collaboration in the processing of large amounts of open-source data in parallel.

  18. OPTIMIZED DETERMINATION OF TRACE JET FUEL VOLATILE ORGANIC COMPOUNDS IN HUMAN BLOOD USING IN-FIELD LIQUID-LIQUID EXTRACTION WITH SUBSEQUENT LABORATORY GAS CHROMATOGRAPHIC-MASS SPECTROMETRIC ANALYSIS AND ON-COLUMN LARGE VOLUME INJECTION

    EPA Science Inventory

    A practical and sensitive method to assess volatile organic compounds (VOCs) from JP-8 jet fuel in human whole blood was developed by modifying previously established liquid-liquid extraction procedures, optimizing extraction times, solvent volume, specific sample processing te...

  19. Lodgepole pine bole wood density 1 and 11 years after felling in central Montana

    Treesearch

    Duncan C. Lutes; Colin C. Hardy

    2013-01-01

    Estimates of large dead and down woody material biomass are used for evaluating ecological processes and making ecological assessments, such as for nutrient cycling, wildlife habitat, fire effects, and climate change science. Many methods are used to assess the abundance (volume) of woody material, which ultimately require an estimate of wood density to convert volume...

  20. Enhanced FIB-SEM systems for large-volume 3D imaging

    DOE PAGES

    Xu, C. Shan; Hayworth, Kenneth J.; Lu, Zhiyuan; ...

    2017-05-13

    Focused Ion Beam Scanning Electron Microscopy (FIB-SEM) can automatically generate 3D images with superior z-axis resolution, yielding data that needs minimal image registration and related post-processing. Obstacles blocking wider adoption of FIB-SEM include slow imaging speed and lack of long-term system stability, which caps the maximum possible acquisition volume. Here, we present techniques that accelerate image acquisition while greatly improving FIB-SEM reliability, allowing the system to operate for months and generating continuously imaged volumes > 10⁶ μm³. These volumes are large enough for connectomics, where the excellent z resolution can help in tracing of small neuronal processes and accelerate the tedious and time-consuming human proofreading effort. Even higher resolution can be achieved on smaller volumes. We present example data sets from mammalian neural tissue, Drosophila brain, and Chlamydomonas reinhardtii to illustrate the power of this novel high-resolution technique to address questions in both connectomics and cell biology.

  1. NeuroPigPen: A Scalable Toolkit for Processing Electrophysiological Signal Data in Neuroscience Applications Using Apache Pig

    PubMed Central

    Sahoo, Satya S.; Wei, Annan; Valdez, Joshua; Wang, Li; Zonjy, Bilal; Tatsuoka, Curtis; Loparo, Kenneth A.; Lhatoo, Samden D.

    2016-01-01

    The recent advances in neurological imaging and sensing technologies have led to rapid increase in the volume, rate of data generation, and variety of neuroscience data. This “neuroscience Big data” represents a significant opportunity for the biomedical research community to design experiments using data with greater timescale, large number of attributes, and statistically significant data size. The results from these new data-driven research techniques can advance our understanding of complex neurological disorders, help model long-term effects of brain injuries, and provide new insights into dynamics of brain networks. However, many existing neuroinformatics data processing and analysis tools were not built to manage large volume of data, which makes it difficult for researchers to effectively leverage this available data to advance their research. We introduce a new toolkit called NeuroPigPen that was developed using Apache Hadoop and Pig data flow language to address the challenges posed by large-scale electrophysiological signal data. NeuroPigPen is a modular toolkit that can process large volumes of electrophysiological signal data, such as Electroencephalogram (EEG), Electrocardiogram (ECG), and blood oxygen levels (SpO2), using a new distributed storage model called Cloudwave Signal Format (CSF) that supports easy partitioning and storage of signal data on commodity hardware. NeuroPigPen was developed with three design principles: (a) Scalability—the ability to efficiently process increasing volumes of data; (b) Adaptability—the toolkit can be deployed across different computing configurations; and (c) Ease of programming—the toolkit can be easily used to compose multi-step data processing pipelines using high-level programming constructs. The NeuroPigPen toolkit was evaluated using 750 GB of electrophysiological signal data over a variety of Hadoop cluster configurations ranging from 3 to 30 Data nodes. The evaluation results demonstrate that the toolkit is highly scalable and adaptable, which makes it suitable for use in neuroscience applications as a scalable data processing toolkit. As part of the ongoing extension of NeuroPigPen, we are developing new modules to support statistical functions to analyze signal data for brain connectivity research. In addition, the toolkit is being extended to allow integration with scientific workflow systems. NeuroPigPen is released under BSD license at: https://sites.google.com/a/case.edu/neuropigpen/. PMID:27375472

  2. NeuroPigPen: A Scalable Toolkit for Processing Electrophysiological Signal Data in Neuroscience Applications Using Apache Pig.

    PubMed

    Sahoo, Satya S; Wei, Annan; Valdez, Joshua; Wang, Li; Zonjy, Bilal; Tatsuoka, Curtis; Loparo, Kenneth A; Lhatoo, Samden D

    2016-01-01

    The recent advances in neurological imaging and sensing technologies have led to rapid increase in the volume, rate of data generation, and variety of neuroscience data. This "neuroscience Big data" represents a significant opportunity for the biomedical research community to design experiments using data with greater timescale, large number of attributes, and statistically significant data size. The results from these new data-driven research techniques can advance our understanding of complex neurological disorders, help model long-term effects of brain injuries, and provide new insights into dynamics of brain networks. However, many existing neuroinformatics data processing and analysis tools were not built to manage large volume of data, which makes it difficult for researchers to effectively leverage this available data to advance their research. We introduce a new toolkit called NeuroPigPen that was developed using Apache Hadoop and Pig data flow language to address the challenges posed by large-scale electrophysiological signal data. NeuroPigPen is a modular toolkit that can process large volumes of electrophysiological signal data, such as Electroencephalogram (EEG), Electrocardiogram (ECG), and blood oxygen levels (SpO2), using a new distributed storage model called Cloudwave Signal Format (CSF) that supports easy partitioning and storage of signal data on commodity hardware. NeuroPigPen was developed with three design principles: (a) Scalability-the ability to efficiently process increasing volumes of data; (b) Adaptability-the toolkit can be deployed across different computing configurations; and (c) Ease of programming-the toolkit can be easily used to compose multi-step data processing pipelines using high-level programming constructs. The NeuroPigPen toolkit was evaluated using 750 GB of electrophysiological signal data over a variety of Hadoop cluster configurations ranging from 3 to 30 Data nodes. The evaluation results demonstrate that the toolkit is highly scalable and adaptable, which makes it suitable for use in neuroscience applications as a scalable data processing toolkit. As part of the ongoing extension of NeuroPigPen, we are developing new modules to support statistical functions to analyze signal data for brain connectivity research. In addition, the toolkit is being extended to allow integration with scientific workflow systems. NeuroPigPen is released under BSD license at: https://sites.google.com/a/case.edu/neuropigpen/.

  3. Epoxidized Natural Rubber/Chitosan Network Binder for Silicon Anode in Lithium-Ion Battery.

    PubMed

    Lee, Sang Ha; Lee, Jeong Hun; Nam, Dong Ho; Cho, Misuk; Kim, Jaehoon; Chanthad, Chalathorn; Lee, Youngkwan

    2018-05-16

    A polymeric binder is extremely important for Si-based anodes in lithium-ion batteries due to the large volume variation during the charging/discharging process. Here, natural rubber-incorporated chitosan networks were designed as a binder material to obtain both adhesion and elasticity. Chitosan could strongly anchor Si particles through hydrogen bonding, while the natural rubber could stretch reversibly during the volume variation of the Si particles, resulting in high cyclic performance. The prepared electrode exhibited specific capacities of 1350 mAh/g after 1600 cycles at a current density of 8 A/g and 2310 mAh/g after 500 cycles at a current density of 1 A/g. Furthermore, a cycle test with limited lithiation capacity was conducted to study the optimal binder properties at varying degrees of volume expansion of silicon, and it was found that the elastic property of the binder material was strongly required when large volume expansion of Si occurred.

  4. Using Multiple Big Datasets and Machine Learning to Produce a New Global Particulate Dataset: A Technology Challenge Case Study

    NASA Astrophysics Data System (ADS)

    Lary, D. J.

    2013-12-01

    A BigData case study is described where multiple datasets from several satellites, high-resolution global meteorological data, social media and in-situ observations are combined using machine learning on a distributed cluster using an automated workflow. The global particulate dataset is relevant to global public health studies and would not be possible to produce without the use of the multiple big datasets, in-situ data and machine learning. To greatly reduce the development time and enhance the functionality, a high-level language capable of parallel processing has been used (Matlab). A key consideration for the system is high speed access due to the large data volume, persistence of the large data volumes and a precise process time scheduling capability.

  5. GMP Cryopreservation of Large Volumes of Cells for Regenerative Medicine: Active Control of the Freezing Process

    PubMed Central

    Massie, Isobel; Selden, Clare; Hodgson, Humphrey; Gibbons, Stephanie; Morris, G. John

    2014-01-01

    Cryopreservation protocols are increasingly required in regenerative medicine applications but must deliver functional products at clinical scale and comply with Good Manufacturing Process (GMP). While GMP cryopreservation is achievable on a small scale using a Stirling cryocooler-based controlled rate freezer (CRF) (EF600), successful large-scale GMP cryopreservation is more challenging due to heat transfer issues and control of ice nucleation, both complex events that impact success. We have developed a large-scale cryocooler-based CRF (VIA Freeze) that can process larger volumes and have evaluated it using alginate-encapsulated liver cell (HepG2) spheroids (ELS). It is anticipated that ELS will comprise the cellular component of a bioartificial liver and will be required in volumes of ∼2 L for clinical use. Sample temperatures and Stirling cryocooler power consumption was recorded throughout cooling runs for both small (500 μL) and large (200 mL) volume samples. ELS recoveries were assessed using viability (FDA/PI staining with image analysis), cell number (nuclei count), and function (protein secretion), along with cryoscanning electron microscopy and freeze substitution techniques to identify possible injury mechanisms. Slow cooling profiles were successfully applied to samples in both the EF600 and the VIA Freeze, and a number of cooling and warming profiles were evaluated. An optimized cooling protocol with a nonlinear cooling profile from ice nucleation to −60°C was implemented in both the EF600 and VIA Freeze. In the VIA Freeze the nucleation of ice is detected by the control software, allowing both noninvasive detection of the nucleation event for quality control purposes and the potential to modify the cooling profile following ice nucleation in an active manner. When processing 200 mL of ELS in the VIA Freeze—viabilities at 93.4%±7.4%, viable cell numbers at 14.3±1.7 million nuclei/mL alginate, and protein secretion at 10.5±1.7 μg/mL/24 h were obtained which, compared well with control ELS (viability −98.1%±0.9%; viable cell numbers −18.3±1.0 million nuclei/mL alginate; and protein secretion −18.7±1.8 μg/mL/24 h). Large volume GMP cryopreservation of ELS is possible with good functional recovery using the VIA Freeze and may also be applied to other regenerative medicine applications. PMID:24410575

  6. Near Real-time Scientific Data Analysis and Visualization with the ArcGIS Platform

    NASA Astrophysics Data System (ADS)

    Shrestha, S. R.; Viswambharan, V.; Doshi, A.

    2017-12-01

    Scientific multidimensional data are generated from a variety of sources and platforms. These datasets are mostly produced by earth observation and/or modeling systems. Agencies like NASA, NOAA, USGS, and ESA produce large volumes of near real-time observation, forecast, and historical data that drive fundamental research and its applications in larger aspects of humanity, from basic decision making to disaster response. A common big data challenge for organizations working with multidimensional scientific data and imagery collections is the time and resources required to manage and process such large volumes and varieties of data. The challenge of adopting data-driven real-time visualization and analysis, as well as the need to share these large datasets, workflows, and information products with wider and more diverse communities, brings an opportunity to use the ArcGIS platform to handle such demand. In recent years, a significant effort has been put into expanding the capabilities of ArcGIS to support multidimensional scientific data across the platform. New capabilities in ArcGIS to support scientific data management, processing, and analysis, as well as creating information products from large volumes of data using the image server technology, are becoming widely used in earth science and across other domains. We will discuss and share the challenges associated with big data by the geospatial science community and how we have addressed these challenges in the ArcGIS platform. We will share a few use cases, such as NOAA High-Resolution Rapid Refresh (HRRR) data, that demonstrate how we access large collections of near real-time data (that are stored on-premise or on the cloud), disseminate them dynamically, process and analyze them on-the-fly, and serve them to a variety of geospatial applications. We will also share how on-the-fly processing using raster functions capabilities can be extended to create persisted data and information products using raster analytics capabilities that exploit distributed computing in an enterprise environment.

  7. Process Mining Online Assessment Data

    ERIC Educational Resources Information Center

    Pechenizkiy, Mykola; Trcka, Nikola; Vasilyeva, Ekaterina; van der Aalst, Wil; De Bra, Paul

    2009-01-01

    Traditional data mining techniques have been extensively applied to find interesting patterns, build descriptive and predictive models from large volumes of data accumulated through the use of different information systems. The results of data mining can be used for getting a better understanding of the underlying educational processes, for…

  8. Distributed shared memory for roaming large volumes.

    PubMed

    Castanié, Laurent; Mion, Christophe; Cavin, Xavier; Lévy, Bruno

    2006-01-01

    We present a cluster-based volume rendering system for roaming very large volumes. This system allows a gigabyte-sized probe to be moved inside a total volume of several tens or hundreds of gigabytes in real time. While the size of the probe is limited by the total amount of texture memory on the cluster, the size of the total data set has no theoretical limit. The cluster is used as a distributed graphics processing unit that aggregates both graphics power and graphics memory. A hardware-accelerated volume renderer runs in parallel on the cluster nodes and the final image compositing is implemented using a pipelined sort-last rendering algorithm. Meanwhile, volume bricking and volume paging allow efficient data caching. On each rendering node, a distributed hierarchical cache system implements a global software-based distributed shared memory on the cluster. In the case of a cache miss, this system first checks page residency on the other cluster nodes instead of directly accessing local disks. Using two Gigabit Ethernet network interfaces per node, we accelerate data fetching by a factor of 4 compared to directly accessing local disks. The system also implements asynchronous disk access and texture loading, which makes it possible to overlap data loading, volume slicing and rendering for optimal volume roaming.
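
    The hierarchical cache lookup order described above (local memory, then remote memory over the network, then local disk) can be sketched in a few lines. The classes and data below are hypothetical and greatly simplified relative to the actual system:

```python
# Toy hierarchical cache: on a miss, ask peer nodes for the volume brick before
# falling back to the (slow) local disk store.
from typing import Dict, Optional

class Node:
    def __init__(self, name: str, disk: Dict[str, bytes]):
        self.name = name
        self.local_cache: Dict[str, bytes] = {}
        self.disk = disk                      # stand-in for the local disk store
        self.peers = []                       # other Node instances on the cluster

    def peek(self, brick_id: str) -> Optional[bytes]:
        """Cheap check used by peers: only the in-memory cache is consulted."""
        return self.local_cache.get(brick_id)

    def fetch(self, brick_id: str) -> bytes:
        if brick_id in self.local_cache:                 # 1. local memory hit
            return self.local_cache[brick_id]
        for peer in self.peers:                          # 2. remote memory over the network
            data = peer.peek(brick_id)
            if data is not None:
                self.local_cache[brick_id] = data
                return data
        data = self.disk[brick_id]                       # 3. local disk (slowest path)
        self.local_cache[brick_id] = data
        return data

# Usage: node B pulls a brick that only node A currently holds in memory.
a = Node("A", disk={"brick-42": b"voxels..."})
a.local_cache["brick-42"] = b"voxels..."
b = Node("B", disk={})
b.peers = [a]
print(b.fetch("brick-42"))
```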

  9. Proton Radiography Peers into Metal Solidification

    DOE PAGES

    Clarke, Amy J.; Imhoff, Seth D.; Gibbs, Paul J.; ...

    2013-06-19

    Historically, metals are cut up and polished to see the structure and to infer how processing influences its evolution. We can now peer into a metal during processing without destroying it using proton radiography. Understanding the link between processing and structure is important because structure profoundly affects the properties of engineering materials. Synchrotron x-ray radiography has enabled real-time glimpses into metal solidification. However, x-ray energies favor the examination of small volumes and low density metals. In this study, we use high energy proton radiography for the first time to image a large metal volume (>10,000 mm³) during melting and solidification. We also show complementary x-ray results from a small volume (<1 mm³), bridging four orders of magnitude. In conclusion, real-time imaging will enable efficient process development and the control of the structure evolution to make materials with intended properties; it will also permit the development of experimentally informed, predictive structure and process models.

  10. Vivaldi: A Domain-Specific Language for Volume Processing and Visualization on Distributed Heterogeneous Systems.

    PubMed

    Choi, Hyungsuk; Choi, Woohyuk; Quan, Tran Minh; Hildebrand, David G C; Pfister, Hanspeter; Jeong, Won-Ki

    2014-12-01

    As the size of image data from microscopes and telescopes increases, the need for high-throughput processing and visualization of large volumetric data has become more pressing. At the same time, many-core processors and GPU accelerators are commonplace, making high-performance distributed heterogeneous computing systems affordable. However, effectively utilizing GPU clusters is difficult for novice programmers, and even experienced programmers often fail to fully leverage the computing power of new parallel architectures due to their steep learning curve and programming complexity. In this paper, we propose Vivaldi, a new domain-specific language for volume processing and visualization on distributed heterogeneous computing systems. Vivaldi's Python-like grammar and parallel processing abstractions provide flexible programming tools for non-experts to easily write high-performance parallel computing code. Vivaldi provides commonly used functions and numerical operators for customized visualization and high-throughput image processing applications. We demonstrate the performance and usability of Vivaldi on several examples ranging from volume rendering to image segmentation.

  11. EBIC/TEM investigations of process-induced defects in EFG silicon ribbon

    NASA Technical Reports Server (NTRS)

    Cunningham, B.; Ast, D. G.

    1981-01-01

    Electron bombardment induced conductivity and scanning transmission electron microscopy observations on unprocessed and processed edge-defined film-fed growth ribbon show that the phosphorus diffused junction depth is not uniform, and that a variety of chemical impurities precipitate out during processing. Two kinds of precipitates are found: (1) precipitates 10 nm or less in size, located at the dislocation nodes in sub-boundary-like dislocation arrangements formed during processing, and (2) large precipitates, the chemical composition of which has been partially identified. These large precipitates emit dense dislocation tangles into the adjacent crystal volume.

  12. Changes in Composition and Phosphorus Profile during Dry Grind Process of Corn into Ethanol and DDGS

    USDA-ARS?s Scientific Manuscript database

    Demand for alternatives to fossil fuels has resulted in a dramatic increase in ethanol production from corn. The dry grind method has been a major process, resulting in a large volume of dried distillers grains with solubles (DDGS) as a co-product. The process consists of grinding, cooking, liquefactio...

  13. Does lake size matter? Combining morphology and process modeling to examine the contribution of lake classes to population-scale processes

    USGS Publications Warehouse

    Winslow, Luke A.; Read, Jordan S.; Hanson, Paul C.; Stanley, Emily H.

    2014-01-01

    With lake abundances in the thousands to millions, creating an intuitive understanding of the distribution of morphology and processes in lakes is challenging. To improve researchers’ understanding of large-scale lake processes, we developed a parsimonious mathematical model based on the Pareto distribution to describe the distribution of lake morphology (area, perimeter and volume). While debate continues over which mathematical representation best fits any one distribution of lake morphometric characteristics, we recognize the need for a simple, flexible model to advance understanding of how the interaction between morphometry and function dictates scaling across large populations of lakes. These models make clear the relative contribution of lakes to the total amount of lake surface area, volume, and perimeter. They also highlight the critical thresholds (Pareto slopes of 0.63, 1 and 1.12, respectively) at which total perimeter, area and volume would be evenly distributed across lake size classes. These models of morphology can be used in combination with models of process to create overarching “lake population” level models of process. To illustrate this potential, we combine the model of surface area distribution with a model of carbon mass accumulation rate. We found that even if smaller lakes contribute relatively less to total surface area than larger lakes, the increasing carbon accumulation rate with decreasing lake size is strong enough to bias the distribution of carbon mass accumulation towards smaller lakes. This analytical framework provides a relatively simple approach to upscaling morphology and process that is easily generalizable to other ecosystem processes.
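
    The role of the Pareto slope can be made concrete with a small simulation: draw lake areas from a Pareto distribution and tally how total surface area splits across logarithmic size classes. The shape parameter, minimum area, and class edges below are illustrative, not fitted values from the paper:

```python
# Toy "lake population" simulation: sample lake areas from a classical Pareto
# distribution and report each size class's share of total surface area.
import numpy as np

rng = np.random.default_rng(0)
shape, a_min, n_lakes = 1.0, 0.01, 100_000           # slope ~1: area spread across classes
areas = a_min * (1.0 + rng.pareto(shape, n_lakes))   # lake surface areas, km^2

edges = np.logspace(np.log10(a_min), np.log10(areas.max()), 8)   # 7 log size classes
for lo, hi in zip(edges[:-1], edges[1:]):
    in_class = areas[(areas >= lo) & (areas < hi)]
    share = in_class.sum() / areas.sum()
    print(f"{lo:9.3f}-{hi:9.3f} km^2 : {in_class.size:6d} lakes, {share:5.1%} of total area")
```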

  14. Characterizing and Optimizing the Performance of the MAESTRO 49-Core Processor

    DTIC Science & Technology

    2014-03-27

    process large volumes of data, it is necessary during testing to vary the dimensions of the inbound data matrix to determine what effect this has on the...needed that can process the extra data these systems seek to collect. However, the space environment presents a number of threats, such as ambient or...induced faults, and that also have sufficient computational power to handle the large flow of data they encounter. This research investigates one

  15. Extraterrestrial processing and manufacturing of large space systems, volume 1, chapters 1-6

    NASA Technical Reports Server (NTRS)

    Miller, R. H.; Smith, D. B. S.

    1979-01-01

    Space program scenarios for production of large space structures from lunar materials are defined. The concept of the space manufacturing facility (SMF) is presented. The manufacturing processes and equipment for the SMF are defined and the conceptual layouts are described for the production of solar cells and arrays, structures and joints, conduits, waveguides, RF equipment radiators, wire cables, and converters. A 'reference' SMF was designed and its operation requirements are described.

  16. Quantification of Protozoa and Viruses from Small Water Volumes

    PubMed Central

    Bonilla, J. Alfredo; Bonilla, Tonya D.; Abdelzaher, Amir M.; Scott, Troy M.; Lukasik, Jerzy; Solo-Gabriele, Helena M.; Palmer, Carol J.

    2015-01-01

    Large sample volumes are traditionally required for the analysis of waterborne pathogens. The need for large volumes greatly limits the number of samples that can be processed. The goals of this study were to compare extraction and detection procedures for quantifying protozoan parasites and viruses from small volumes of marine water. The intent was to evaluate a logistically simpler method of sample collection and processing that would facilitate direct pathogen measures as part of routine monitoring programs. Samples were collected simultaneously using a bilayer device with protozoa capture by size (top filter) and viruses capture by charge (bottom filter). Protozoan detection technologies utilized for recovery of Cryptosporidium spp. and Giardia spp. were qPCR and the more traditional immunomagnetic separation—IFA-microscopy, while virus (poliovirus) detection was based upon qPCR versus plaque assay. Filters were eluted using reagents consistent with the downstream detection technologies. Results showed higher mean recoveries using traditional detection methods over qPCR for Cryptosporidium (91% vs. 45%) and poliovirus (67% vs. 55%) whereas for Giardia the qPCR-based methods were characterized by higher mean recoveries (41% vs. 28%). Overall mean recoveries are considered high for all detection technologies. Results suggest that simultaneous filtration may be suitable for isolating different classes of pathogens from small marine water volumes. More research is needed to evaluate the suitability of this method for detecting pathogens at low ambient concentration levels. PMID:26114244

  17. Quantification of Protozoa and Viruses from Small Water Volumes.

    PubMed

    Bonilla, J Alfredo; Bonilla, Tonya D; Abdelzaher, Amir M; Scott, Troy M; Lukasik, Jerzy; Solo-Gabriele, Helena M; Palmer, Carol J

    2015-06-24

    Large sample volumes are traditionally required for the analysis of waterborne pathogens. The need for large volumes greatly limits the number of samples that can be processed. The aims of this study were to compare extraction and detection procedures for quantifying protozoan parasites and viruses from small volumes of marine water. The intent was to evaluate a logistically simpler method of sample collection and processing that would facilitate direct pathogen measures as part of routine monitoring programs. Samples were collected simultaneously using a bilayer device with protozoa capture by size (top filter) and viruses capture by charge (bottom filter). Protozoan detection technologies utilized for recovery of Cryptosporidium spp. and Giardia spp. were qPCR and the more traditional immunomagnetic separation-IFA-microscopy, while virus (poliovirus) detection was based upon qPCR versus plaque assay. Filters were eluted using reagents consistent with the downstream detection technologies. Results showed higher mean recoveries using traditional detection methods over qPCR for Cryptosporidium (91% vs. 45%) and poliovirus (67% vs. 55%) whereas for Giardia the qPCR-based methods were characterized by higher mean recoveries (41% vs. 28%). Overall mean recoveries are considered high for all detection technologies. Results suggest that simultaneous filtration may be suitable for isolating different classes of pathogens from small marine water volumes. More research is needed to evaluate the suitability of this method for detecting pathogens at low ambient concentration levels.

  18. A Comparison of Two Fat Grafting Methods on Operating Room Efficiency and Costs.

    PubMed

    Gabriel, Allen; Maxwell, G Patrick; Griffin, Leah; Champaneria, Manish C; Parekh, Mousam; Macarios, David

    2017-02-01

    Centrifugation (Cf) is a common method of fat processing but may be time consuming, especially when processing large volumes. To determine the effects on fat grafting time, volume efficiency, reoperations, and complication rates of Cf vs an autologous fat processing system (Rv) that incorporates fat harvesting and processing in a single unit. We performed a retrospective cohort study of consecutive patients who underwent autologous fat grafting during reconstructive breast surgery with Rv or Cf. Endpoints measured were volume of fat harvested (lipoaspirate) and volume injected after processing, time to complete processing, reoperations, and complications. A budget impact model was used to estimate cost of Rv vs Cf. Ninety-eight patients underwent fat grafting with Rv, and 96 patients received Cf. Mean volumes of lipoaspirate (506.0 vs 126.1 mL) and fat injected (177.3 vs 79.2 mL) were significantly higher (P < .0001) in the Rv vs Cf group, respectively. Mean time to complete fat grafting was significantly shorter in the Rv vs Cf group (34.6 vs 90.1 minutes, respectively; P < .0001). Proportions of patients with nodule and cyst formation and/or who received reoperations were significantly less in the Rv vs Cf group. Based on these outcomes and an assumed per minute operating room cost, an average per patient cost savings of $2,870.08 was estimated with Rv vs Cf. Compared to Cf, the Rv fat processing system allowed for a larger volume of fat to be processed for injection and decreased operative time in these patients, potentially translating to cost savings. LEVEL OF EVIDENCE 3. © 2016 The American Society for Aesthetic Plastic Surgery, Inc.

  19. Development of a Two-Stage Microalgae Dewatering Process – A Life Cycle Assessment Approach

    PubMed Central

    Soomro, Rizwan R.; Zeng, Xianhai; Lu, Yinghua; Lin, Lu; Danquah, Michael K.

    2016-01-01

    Even though microalgal biomass is leading the third generation biofuel research, significant effort is required to establish an economically viable commercial-scale microalgal biofuel production system. Whilst a significant amount of work has been reported on large-scale cultivation of microalgae using photo-bioreactors and pond systems, research focus on establishing high performance downstream dewatering operations for large-scale processing under optimal economy is limited. The enormous amount of energy and associated cost required for dewatering large-volume microalgal cultures has been the primary hindrance to the development of the needed biomass quantity for industrial-scale microalgal biofuels production. The extremely dilute nature of large-volume microalgal suspension and the small size of microalgae cells in suspension create a significant processing cost during dewatering, and this has raised major concerns towards the economic success of commercial-scale microalgal biofuel production as an alternative to conventional petroleum fuels. This article reports an effective framework to assess the performance of different dewatering technologies as the basis to establish an effective two-stage dewatering system. Bioflocculation coupled with tangential flow filtration (TFF) emerged as a promising technique, with a total energy input of 0.041 kWh, 0.05 kg CO2 emissions and a cost of $0.0043 for producing 1 kg of microalgae biomass. A streamlined process for operational analysis of the two-stage microalgae dewatering technique, encompassing energy input, carbon dioxide emission, and process cost, is presented. PMID:26904075

  20. Low Cost, Structurally Advanced Novel Electrode and Cell Manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodford, William

    This document is the final technical report from 24M Technologies on the project titled: Low Cost, Structurally Advanced Novel Electrode and Cell Manufacturing. All of the program milestones and deliverables were completed during the performance of the award. Specific accomplishments are: 1) 24M demonstrated the processability and electrochemical performance of semi-solid electrodes with active volume contents increased by 10% relative to the program baseline; 2) electrode-level metrics, quality, and yield were demonstrated at an 80 cm² electrode footprint; 3) these electrodes were integrated into cells with consistent capacities and impedances, including cells delivered to Argonne National Laboratory for independent testing; 4) those processes were scaled to a large-format (> 260 cm²) electrode footprint and quality and yield were demonstrated; 5) a high-volume manufacturing approach for large-format electrode fabrication was demonstrated; and 6) large-format cells (> 100 Ah capacity) were prototyped with consistent capacity and impedance, including cells which were delivered to Argonne National Laboratory for independent testing.

  1. Process cost and facility considerations in the selection of primary cell culture clarification technology.

    PubMed

    Felo, Michael; Christensen, Brandon; Higgins, John

    2013-01-01

    The bioreactor volume delineating the selection of primary clarification technology is not always easily defined. Development of a commercial scale process for the manufacture of therapeutic proteins requires scale-up from a few liters to thousands of liters. While the separation techniques used for protein purification are largely conserved across scales, the separation techniques for primary cell culture clarification vary with scale. Process models were developed to compare monoclonal antibody production costs using two cell culture clarification technologies. One process model was created for cell culture clarification by disc stack centrifugation with depth filtration. A second process model was created for clarification by multi-stage depth filtration. Analyses were performed to examine the influence of bioreactor volume, product titer, depth filter capacity, and facility utilization on overall operating costs. At bioreactor volumes <1,000 L, clarification using multi-stage depth filtration offers cost savings compared to clarification using centrifugation. For bioreactor volumes >5,000 L, clarification using centrifugation followed by depth filtration offers significant cost savings. For bioreactor volumes of ∼ 2,000 L, clarification costs are similar between depth filtration and centrifugation. At this scale, factors including facility utilization, available capital, ease of process development, implementation timelines, and process performance characterization play an important role in clarification technology selection. In the case study presented, a multi-product facility selected multi-stage depth filtration for cell culture clarification at the 500 and 2,000 L scales of operation. Facility implementation timelines, process development activities, equipment commissioning and validation, scale-up effects, and process robustness are examined. © 2013 American Institute of Chemical Engineers.
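
    The scale-dependent crossover described above can be mimicked with a toy cost model. Every number below is a made-up placeholder chosen only so that the qualitative behaviour (depth filtration cheaper at small scale, centrifugation cheaper at large scale) is visible; it is not the authors' model:

```python
# Toy per-batch clarification cost comparison as a function of bioreactor volume.
def depth_filtration_cost(volume_l: float) -> float:
    # consumable depth filters scale roughly linearly with harvest volume
    return 2_000 + 6.0 * volume_l

def centrifuge_cost(volume_l: float) -> float:
    # large fixed cost (equipment, cleaning validation) plus a smaller per-litre cost
    return 15_000 + 1.5 * volume_l

for v in (500, 1_000, 2_000, 5_000, 10_000):
    df, cf = depth_filtration_cost(v), centrifuge_cost(v)
    better = "depth filtration" if df < cf else "centrifugation"
    print(f"{v:6d} L  depth filtration ${df:9,.0f}  centrifugation ${cf:9,.0f}  -> {better}")
```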

  2. Ion processing element with composite media

    DOEpatents

    Mann, Nick R.; Tranter, Troy J.; Todd, Terry A.; Sebesta, Ferdinand

    2003-02-04

    An ion processing element employing composite media disposed in a porous substrate, for facilitating removal of selected chemical species from a fluid stream. The ion processing element includes a porous fibrous glass substrate impregnated by composite media having one or more active components supported by a matrix material of polyacrylonitrile. The active components are effective in removing, by various mechanisms, one or more constituents from a fluid stream passing through the ion processing element. Due to the porosity and large surface area of both the composite medium and the substrate in which it is disposed, a high degree of contact is achieved between the active component and the fluid stream being processed. Further, the porosity of the matrix material and the substrate facilitates use of the ion processing element in high volume applications where it is desired to effectively process high volume flows.

  3. Ion processing element with composite media

    DOEpatents

    Mann, Nick R [Blackfoot, ID; Tranter, Troy J [Idaho Falls, ID; Todd, Terry A [Aberdeen, ID; Sebesta, Ferdinand [Prague, CZ

    2009-03-24

    An ion processing element employing composite media disposed in a porous substrate, for facilitating removal of selected chemical species from a fluid stream. The ion processing element includes a porous fibrous glass substrate impregnated by composite media having one or more active components supported by a matrix material of polyacrylonitrile. The active components are effective in removing, by various mechanisms, one or more constituents from a fluid stream passing through the ion processing element. Due to the porosity and large surface area of both the composite medium and the substrate in which it is disposed, a high degree of contact is achieved between the active component and the fluid stream being processed. Further, the porosity of the matrix material and the substrate facilitates use of the ion processing element in high volume applications where it is desired to effectively process high volume flows.

  4. A finite-volume ELLAM for three-dimensional solute-transport modeling

    USGS Publications Warehouse

    Russell, T.F.; Heberton, C.I.; Konikow, Leonard F.; Hornberger, G.Z.

    2003-01-01

    A three-dimensional finite-volume ELLAM method has been developed, tested, and successfully implemented as part of the U.S. Geological Survey (USGS) MODFLOW-2000 ground water modeling package. It is included as a solver option for the Ground Water Transport process. The FVELLAM uses space-time finite volumes oriented along the streamlines of the flow field to solve an integral form of the solute-transport equation, thus combining local and global mass conservation with the advantages of Eulerian-Lagrangian characteristic methods. The USGS FVELLAM code simulates solute transport in flowing ground water for a single dissolved solute constituent and represents the processes of advective transport, hydrodynamic dispersion, mixing from fluid sources, retardation, and decay. Implicit time discretization of the dispersive and source/sink terms is combined with a Lagrangian treatment of advection, in which forward tracking moves mass to the new time level, distributing mass among destination cells using approximate indicator functions. This allows the use of large transport time increments (large Courant numbers) with accurate results, even for advection-dominated systems (large Peclet numbers). Four test cases, including comparisons with analytical solutions and benchmarking against other numerical codes, are presented that indicate that the FVELLAM can usually yield excellent results, even if relatively few transport time steps are used, although the quality of the results is problem-dependent.
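
    The Lagrangian treatment of advection that allows large Courant numbers can be illustrated in one dimension: forward-track the mass in each cell and split it between the destination cells. This sketch only illustrates the idea and is not the USGS FVELLAM code:

```python
# 1-D forward-tracked advection step: each cell's mass is moved along the flow and
# redistributed to the two cells it lands between, which stays stable for Courant > 1.
import numpy as np

def advect(mass: np.ndarray, velocity: float, dx: float, dt: float) -> np.ndarray:
    n = mass.size
    new_mass = np.zeros(n)
    shift = velocity * dt / dx                 # displacement in cell widths (Courant number)
    for i in range(n):
        dest = i + shift
        j = int(np.floor(dest))
        frac = dest - j                        # split mass between the two destination cells
        if 0 <= j < n:
            new_mass[j] += (1.0 - frac) * mass[i]
        if 0 <= j + 1 < n:
            new_mass[j + 1] += frac * mass[i]
    return new_mass

# A unit slug of solute moved 2.5 cells in a single step (Courant number = 2.5).
m = np.zeros(20)
m[5] = 1.0
print(np.round(advect(m, velocity=2.5, dx=1.0, dt=1.0), 2))   # mass split over cells 7 and 8
```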

  5. Bigger is better! Hippocampal volume and declarative memory performance in healthy young men.

    PubMed

    Pohlack, Sebastian T; Meyer, Patric; Cacciaglia, Raffaele; Liebscher, Claudia; Ridder, Stephanie; Flor, Herta

    2014-01-01

    The importance of the hippocampus for declarative memory processes is firmly established. Nevertheless, the issue of a correlation between declarative memory performance and hippocampal volume in healthy subjects still remains controversial. The aim of the present study was to investigate this relationship in more detail. For this purpose, 50 healthy young male participants performed the California Verbal Learning Test. Hippocampal volume was assessed by manual segmentation of high-resolution 3D magnetic resonance images. We found a significant positive correlation between putatively hippocampus-dependent memory measures like short-delay retention, long-delay retention and discriminability and percent hippocampal volume. No significant correlation with measures related to executive processes was found. In addition, percent amygdala volume was not related to any of these measures. Our data advance previous findings reported in studies of brain-damaged individuals in a large and homogeneous young healthy sample and are important for theories on the neural basis of episodic memory.

  6. Design and Analysis of A Beacon-Less Routing Protocol for Large Volume Content Dissemination in Vehicular Ad Hoc Networks.

    PubMed

    Hu, Miao; Zhong, Zhangdui; Ni, Minming; Baiocchi, Andrea

    2016-11-01

    Large volume content dissemination is pursued by the growing number of high quality applications for Vehicular Ad hoc NETworks(VANETs), e.g., the live road surveillance service and the video-based overtaking assistant service. For the highly dynamical vehicular network topology, beacon-less routing protocols have been proven to be efficient in achieving a balance between the system performance and the control overhead. However, to the authors' best knowledge, the routing design for large volume content has not been well considered in the previous work, which will introduce new challenges, e.g., the enhanced connectivity requirement for a radio link. In this paper, a link Lifetime-aware Beacon-less Routing Protocol (LBRP) is designed for large volume content delivery in VANETs. Each vehicle makes the forwarding decision based on the message header information and its current state, including the speed and position information. A semi-Markov process analytical model is proposed to evaluate the expected delay in constructing one routing path for LBRP. Simulations show that the proposed LBRP scheme outperforms the traditional dissemination protocols in providing a low end-to-end delay. The analytical model is shown to exhibit a good match on the delay estimation with Monte Carlo simulations, as well.
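
    The abstract does not give the model's states or parameters, so the sketch below only illustrates the general semi-Markov machinery it relies on: with Q the transition matrix among transient states and h the mean holding times, the expected time to absorption solves t = (I - Q)^{-1} h. All states and numbers are placeholders, not LBRP values.

```python
# Illustrative only: the expected-delay calculation for a small semi-Markov
# model with one absorbing "path constructed" state. States, transition
# probabilities and holding times are invented, not the LBRP parameters.
import numpy as np

# Transient states: 0 = waiting for a forwarder, 1 = forwarding one hop.
Q = np.array([[0.30, 0.65],    # transitions among transient states; the
              [0.20, 0.40]])   # missing probability mass leads to absorption
h = np.array([0.050, 0.010])   # mean holding time in each state (seconds)

# Expected time to absorption t satisfies t = h + Q t, i.e. t = (I - Q)^-1 h.
t = np.linalg.solve(np.eye(len(h)) - Q, h)
print("expected path-construction delay from each start state (s):", t)
```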

  7. Design and Analysis of A Beacon-Less Routing Protocol for Large Volume Content Dissemination in Vehicular Ad Hoc Networks

    PubMed Central

    Hu, Miao; Zhong, Zhangdui; Ni, Minming; Baiocchi, Andrea

    2016-01-01

    Large volume content dissemination is pursued by the growing number of high quality applications for Vehicular Ad hoc NETworks(VANETs), e.g., the live road surveillance service and the video-based overtaking assistant service. For the highly dynamical vehicular network topology, beacon-less routing protocols have been proven to be efficient in achieving a balance between the system performance and the control overhead. However, to the authors’ best knowledge, the routing design for large volume content has not been well considered in the previous work, which will introduce new challenges, e.g., the enhanced connectivity requirement for a radio link. In this paper, a link Lifetime-aware Beacon-less Routing Protocol (LBRP) is designed for large volume content delivery in VANETs. Each vehicle makes the forwarding decision based on the message header information and its current state, including the speed and position information. A semi-Markov process analytical model is proposed to evaluate the expected delay in constructing one routing path for LBRP. Simulations show that the proposed LBRP scheme outperforms the traditional dissemination protocols in providing a low end-to-end delay. The analytical model is shown to exhibit a good match on the delay estimation with Monte Carlo simulations, as well. PMID:27809285

  8. High axial resolution imaging system for large volume tissues using combination of inclined selective plane illumination and mechanical sectioning

    PubMed Central

    Zhang, Qi; Yang, Xiong; Hu, Qinglei; Bai, Ke; Yin, Fangfang; Li, Ning; Gang, Yadong; Wang, Xiaojun; Zeng, Shaoqun

    2017-01-01

    To resolve fine structures of biological systems like neurons, microscopic imaging with sufficient spatial resolution in three dimensions is required. With regular optical imaging systems, high lateral resolution is accessible while high axial resolution is hard to achieve in a large volume. We introduce an imaging system for high 3D resolution fluorescence imaging of large volume tissues. Selective plane illumination was adopted to provide high axial resolution. A scientific CMOS working in sub-array mode kept the imaging area in the sample surface, which restrained the adverse effect of aberrations caused by inclined illumination. Plastic embedding and precise mechanical sectioning extended the axial range and eliminated distortion during the whole imaging process. The combination of these techniques enabled 3D high resolution imaging of large tissues. Fluorescent bead imaging showed resolutions of 0.59 μm, 0.47 μm, and 0.59 μm in the x, y, and z directions, respectively. Data acquired from a volume sample of brain tissue demonstrated the applicability of this imaging system. Imaging of different depths showed uniform performance where details could be recognized in either the near-soma area or terminal area, and fine structures of neurons could be seen in both the xy and xz sections. PMID:29296503

  9. Cascading and Parallelising Curvilinear Inertial Focusing Systems for High Volume, Wide Size Distribution, Separation and Concentration of Particles

    PubMed Central

    Miller, B.; Jimenez, M.; Bridle, H.

    2016-01-01

    Inertial focusing is a microfluidic based separation and concentration technology that has expanded rapidly in the last few years. Throughput is high compared to other microfluidic approaches although sample volumes have typically remained in the millilitre range. Here we present a strategy for achieving rapid high volume processing with stacked and cascaded inertial focusing systems, allowing for separation and concentration of particles with a large size range, demonstrated here from 30 μm–300 μm. The system is based on curved channels, in a novel toroidal configuration and a stack of 20 devices has been shown to operate at 1 L/min. Recirculation allows for efficient removal of large particles whereas a cascading strategy enables sequential removal of particles down to a final stage where the target particle size can be concentrated. The demonstration of curved stacked channels operating in a cascaded manner allows for high throughput applications, potentially replacing filtration in applications such as environmental monitoring, industrial cleaning processes, biomedical and bioprocessing and many more. PMID:27808244

  10. Using Process Redesign and Information Technology to Improve Procurement

    DTIC Science & Technology

    1994-04-01

    contractor. Many large-volume contractors have automated order processing tied to accounting, manufacturing, and shipping subsystems. Currently...the contractor must receive the mailed order, analyze it, extract pertinent information, and enter that information into the automated order...processing system. Almost all orders for small purchases are unilateral documents that do not require acceptance or acknowledgment by the contractor. For

  11. Chromatographic hydrogen isotope separation

    DOEpatents

    Aldridge, Frederick T.

    1981-01-01

    Intermetallic compounds with the CaCu5 type of crystal structure, particularly LaNiCo4 and CaNi5, exhibit high separation factors and fast equilibrium times and therefore are useful for packing a chromatographic hydrogen isotope separation column. The addition of an inert metal to dilute the hydride improves performance of the column. A large scale multi-stage chromatographic separation process run as a secondary process off a hydrogen feedstream from an industrial plant which uses large volumes of hydrogen can produce large quantities of heavy water at an effective cost for use in heavy water reactors.

  12. Chromatographic hydrogen isotope separation

    DOEpatents

    Aldridge, F.T.

    Intermetallic compounds with the CaCu5 type of crystal structure, particularly LaNiCo4 and CaNi5, exhibit high separation factors and fast equilibrium times and therefore are useful for packing a chromatographic hydrogen isotope separation column. The addition of an inert metal to dilute the hydride improves performance of the column. A large scale multi-stage chromatographic separation process run as a secondary process off a hydrogen feedstream from an industrial plant which uses large volumes of hydrogen can produce large quantities of heavy water at an effective cost for use in heavy water reactors.

  13. Parallel volume ray-casting for unstructured-grid data on distributed-memory architectures

    NASA Technical Reports Server (NTRS)

    Ma, Kwan-Liu

    1995-01-01

    As computing technology continues to advance, computational modeling of scientific and engineering problems produces data of increasing complexity: large in size and unstructured in shape. Volume visualization of such data is a challenging problem. This paper proposes a distributed parallel solution that makes ray-casting volume rendering of unstructured-grid data practical. Both the data and the rendering process are distributed among processors. At each processor, ray-casting of local data is performed independent of the other processors. The global image composing processes, which require inter-processor communication, are overlapped with the local ray-casting processes to achieve maximum parallel efficiency. This algorithm differs from previous ones in four ways: it is completely distributed, less view-dependent, reasonably scalable, and flexible. Without using dynamic load balancing, test results on the Intel Paragon using from two to 128 processors show, on average, about 60% parallel efficiency.
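
    A toy sketch of the distribution strategy described above, using a local process pool in place of Paragon nodes: each worker ray-casts only its own slab of the volume and returns a partial colour/opacity image, and the partials are composited front to back afterwards. A structured grid stands in for the paper's unstructured data; the volume, transfer function and slab size are invented.

```python
# Toy sketch of the distributed idea above (not the paper's Paragon code): each
# worker ray-casts its local slab and returns a partial colour/opacity image;
# partials are composited front to back. A structured grid stands in for the
# paper's unstructured data, and the opacity transfer function is invented.
import numpy as np
from multiprocessing import Pool

NX, NY, NZ = 64, 64, 64
np.random.seed(0)                                # same volume in every worker process
VOLUME = np.random.rand(NZ, NY, NX).astype(np.float32)

def render_slab(z_range):
    """Emission-absorption integration of one slab along +z (the view axis)."""
    z0, z1 = z_range
    color = np.zeros((NY, NX), dtype=np.float32)
    alpha = np.zeros((NY, NX), dtype=np.float32)
    for z in range(z0, z1):                      # front to back within the slab
        sample = VOLUME[z]
        a = 0.05 * sample                        # toy opacity transfer function
        color += (1.0 - alpha) * a * sample
        alpha += (1.0 - alpha) * a
    return z0, color, alpha

def composite(partials):
    """Front-to-back 'over' compositing of the per-slab partial images."""
    color = np.zeros((NY, NX), dtype=np.float32)
    alpha = np.zeros((NY, NX), dtype=np.float32)
    for _, c, a in sorted(partials, key=lambda p: p[0]):
        color += (1.0 - alpha) * c
        alpha += (1.0 - alpha) * a
    return color

if __name__ == "__main__":
    slabs = [(z, min(z + 16, NZ)) for z in range(0, NZ, 16)]
    with Pool(4) as pool:                        # one worker per slab, standing in for nodes
        partials = pool.map(render_slab, slabs)
    image = composite(partials)
    print("final image:", image.shape, float(image.max()))
```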

  14. Big Data and Chemical Education

    ERIC Educational Resources Information Center

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  15. Comparing Latent Dirichlet Allocation and Latent Semantic Analysis as Classifiers

    ERIC Educational Resources Information Center

    Anaya, Leticia H.

    2011-01-01

    In the Information Age, a proliferation of unstructured text electronic documents exists. Processing these documents by humans is a daunting task as humans have limited cognitive abilities for processing large volumes of documents that can often be extremely lengthy. To address this problem, text data computer algorithms are being developed.…
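
    As a concrete, purely illustrative sketch of the comparison described above: topic features from LDA and from LSA (a truncated SVD of a TF-IDF matrix) can feed the same downstream classifier. The toy corpus and labels are invented, and scikit-learn is assumed to be available; a real experiment would use a large document collection and proper evaluation.

```python
# Hedged sketch: LDA-derived versus LSA-derived document features feeding the
# same classifier. The corpus and labels below are toy placeholders.
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.decomposition import LatentDirichletAllocation, TruncatedSVD
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

docs = ["solar cells and photovoltaic efficiency", "deep space probe instruments",
        "battery storage for solar power", "telescope optics for deep space",
        "grid integration of solar energy", "spacecraft navigation and optics"] * 5
labels = [0, 1, 0, 1, 0, 1] * 5    # 0 = energy, 1 = space (toy labels)

lda_clf = make_pipeline(CountVectorizer(),
                        LatentDirichletAllocation(n_components=4, random_state=0),
                        LogisticRegression(max_iter=1000))
lsa_clf = make_pipeline(TfidfVectorizer(),
                        TruncatedSVD(n_components=4, random_state=0),
                        LogisticRegression(max_iter=1000))

for name, clf in [("LDA", lda_clf), ("LSA", lsa_clf)]:
    scores = cross_val_score(clf, docs, labels, cv=3)
    print(name, "mean accuracy:", scores.mean())
```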

  16. Focusing analytes from 50 μL into 500 pL: On-chip focusing from large sample volumes using isotachophoresis.

    PubMed

    van Kooten, Xander F; Truman-Rosentsvit, Marianna; Kaigala, Govind V; Bercovici, Moran

    2017-09-05

    The use of on-chip isotachophoresis assays for diagnostic applications is often limited by the small volumes of standard microfluidic channels. Overcoming this limitation is particularly important for detection of 'discrete' biological targets (such as bacteria) at low concentrations, where the volume of processed liquid in a standard microchannel might not contain any targets. We present a novel microfluidic chip that enables ITP focusing of target analytes from initial sample volumes of 50 μL into a concentrated zone with a volume of 500 pL, corresponding to a 100,000-fold increase in mean concentration, and a 300,000-fold increase in peak concentration. We present design considerations for limiting sample dispersion in such large-volume focusing (LVF) chips and discuss the trade-off between assay time and Joule heating, which ultimately governs the scalability of LVF designs. Finally, we demonstrate a 100-fold improvement of ITP focusing performance in the LVF chip as compared to conventional microchannels, and apply this enhancement to achieve highly sensitive detection of both molecular targets (DNA, down to 10 fM) and whole bacteria (down to 100 cfu/mL).

  17. Memory Network For Distributed Data Processors

    NASA Technical Reports Server (NTRS)

    Bolen, David; Jensen, Dean; Millard, ED; Robinson, Dave; Scanlon, George

    1992-01-01

    Universal Memory Network (UMN) is a modular, digital data-communication system enabling computers with differing bus architectures to share 32-bit-wide data between locations up to 3 km apart with less than one millisecond of latency. It makes it possible to design sophisticated real-time and near-real-time data-processing systems without data-transfer "bottlenecks". This enterprise network permits transmission of a volume of data equivalent to an encyclopedia each second. Facilities benefiting from the Universal Memory Network include telemetry stations, simulation facilities, power plants, large laboratories, and any other facility sharing very large volumes of data. The main hub of the UMN is a reflection center including smaller hubs called Shared Memory Interfaces.

  18. Challenges of microtome‐based serial block‐face scanning electron microscopy in neuroscience

    PubMed Central

    WANNER, A. A.; KIRSCHMANN, M. A.

    2015-01-01

    Summary Serial block-face scanning electron microscopy (SBEM) is becoming increasingly popular for a wide range of applications in many disciplines from biology to material sciences. This review focuses on applications for circuit reconstruction in neuroscience, which is one of the major driving forces advancing SBEM. Neuronal circuit reconstruction poses exceptional challenges to volume EM in terms of resolution, field of view, acquisition time and sample preparation. Mapping the connections between neurons in the brain is crucial for understanding information flow and information processing in the brain. However, information on the connectivity between hundreds or even thousands of neurons densely packed in neuronal microcircuits is still largely missing. Volume EM techniques such as serial section TEM, the automated tape-collecting ultramicrotome, focused ion-beam scanning electron microscopy and SBEM (microtome-based serial block-face scanning electron microscopy) are the techniques that provide sufficient resolution to resolve ultrastructural details such as synapses and a sufficient field of view for dense reconstruction of neuronal circuits. While volume EM techniques are advancing, they are generating large data sets on the terabyte scale that require new image processing workflows and analysis tools. In this review, we present the recent advances in SBEM for circuit reconstruction in neuroscience and an overview of existing image processing and analysis pipelines. PMID:25907464

  19. Metastable Prepores in Tension-Free Lipid Bilayers

    NASA Astrophysics Data System (ADS)

    Ting, Christina L.; Awasthi, Neha; Müller, Marcus; Hub, Jochen S.

    2018-03-01

    The formation and closure of aqueous pores in lipid bilayers is a key step in various biophysical processes. Large pores are well described by classical nucleation theory, but the free-energy landscape of small, biologically relevant pores has remained largely unexplored. The existence of small and metastable "prepores" was hypothesized decades ago from electroporation experiments, but resolving metastable prepores from theoretical models remained challenging. Using two complementary methods—atomistic simulations and self-consistent field theory of a minimal lipid model—we determine the parameters for which metastable prepores occur in lipid membranes. Both methods consistently suggest that pore metastability depends on the relative volume ratio between the lipid head group and lipid tails: lipids with a larger head-group volume fraction (or shorter saturated tails) form metastable prepores, whereas lipids with a smaller head-group volume fraction (or longer unsaturated tails) form unstable prepores.

  20. Quantifying Standing Dead Tree Volume and Structural Loss with Voxelized Terrestrial Lidar Data

    NASA Astrophysics Data System (ADS)

    Popescu, S. C.; Putman, E.

    2017-12-01

    Standing dead trees (SDTs) are an important forest component and impact a variety of ecosystem processes, yet the carbon pool dynamics of SDTs are poorly constrained in terrestrial carbon cycling models. The ability to model wood decay and carbon cycling in relation to detectable changes in tree structure and volume over time would greatly improve such models. The overall objective of this study was to provide automated aboveground volume estimates of SDTs and automated procedures to detect, quantify, and characterize structural losses over time with terrestrial lidar data. The specific objectives of this study were: 1) develop an automated SDT volume estimation algorithm providing accurate volume estimates for trees scanned in dense forests; 2) develop an automated change detection methodology to accurately detect and quantify SDT structural loss between subsequent terrestrial lidar observations; and 3) characterize the structural loss rates of pine and oak SDTs in southeastern Texas. A voxel-based volume estimation algorithm, "TreeVolX", was developed and incorporates several methods designed to robustly process point clouds of varying quality levels. The algorithm operates on horizontal voxel slices by segmenting the slice into distinct branch or stem sections then applying an adaptive contour interpolation and interior filling process to create solid reconstructed tree models (RTMs). TreeVolX estimated large and small branch volume with an RMSE of 7.3% and 13.8%, respectively. A voxel-based change detection methodology was developed to accurately detect and quantify structural losses and incorporated several methods to mitigate the challenges presented by shifting tree and branch positions as SDT decay progresses. The volume and structural loss of 29 SDTs, composed of Pinus taeda and Quercus stellata, were successfully estimated using multitemporal terrestrial lidar observations over elapsed times ranging from 71 - 753 days. Pine and oak structural loss rates were characterized by estimating the amount of volumetric loss occurring in 20 equal-interval height bins of each SDT. Results showed that large pine snags exhibited more rapid structural loss in comparison to medium-sized oak snags in this study.
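
    A hedged sketch of the voxel-based bookkeeping described above, not the TreeVolX algorithm itself (which also performs contour interpolation and interior filling): the point cloud is voxelized, volume is approximated from occupied-voxel counts, and structural loss between two scan dates is the set of voxels occupied at the first date but empty at the second. The "scans" and voxel size below are synthetic, and NumPy is assumed.

```python
# Illustrative voxel bookkeeping only, not TreeVolX: volume from occupied-voxel
# counts; structural loss as voxels occupied before but empty now.
import numpy as np

def voxelize(points, voxel_size):
    """Return the set of occupied voxel indices for an (N, 3) point array (metres)."""
    idx = np.floor(points / voxel_size).astype(int)
    return set(map(tuple, idx))

def voxel_volume(voxels, voxel_size):
    return len(voxels) * voxel_size ** 3          # cubic metres

def structural_loss(voxels_t0, voxels_t1, voxel_size):
    lost = voxels_t0 - voxels_t1                  # occupied before, empty now
    return voxel_volume(lost, voxel_size), lost

rng = np.random.default_rng(0)
scan_t0 = rng.uniform(0.0, 2.0, size=(20000, 3))          # "tree" at the first date
scan_t1 = scan_t0[scan_t0[:, 2] < 1.8]                     # top 0.2 m lost by the second date
v0, v1 = voxelize(scan_t0, 0.05), voxelize(scan_t1, 0.05)
loss_vol, _ = structural_loss(v0, v1, 0.05)
print("initial volume estimate (m^3):", voxel_volume(v0, 0.05))
print("volumetric loss estimate (m^3):", loss_vol)
```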

  1. Building high-performance system for processing a daily large volume of Chinese satellites imagery

    NASA Astrophysics Data System (ADS)

    Deng, Huawu; Huang, Shicun; Wang, Qi; Pan, Zhiqiang; Xin, Yubin

    2014-10-01

    The number of Earth observation satellites from China has increased dramatically in recent years, and those satellites are acquiring a large volume of imagery daily. As the main portal of image processing and distribution from those Chinese satellites, the China Centre for Resources Satellite Data and Application (CRESDA) has been working with PCI Geomatics during the last three years to solve two issues in this regard: processing the large volume of data (about 1,500 scenes or 1 TB per day) in a timely manner and generating geometrically accurate orthorectified products. After three years of research and development, a high performance system has been built and successfully delivered. The high performance system has a service-oriented architecture and can be deployed to a cluster of computers that may be configured with high-end computing power. The high performance is gained through, first, parallelizing image processing algorithms with high performance graphic processing unit (GPU) cards and multiple cores from multiple CPUs, and, second, distributing processing tasks to a cluster of computing nodes. While achieving performance up to thirty times (and more) faster than the traditional practice, a particular methodology was developed to improve the geometric accuracy of images acquired from Chinese satellites (including HJ-1 A/B, ZY-1-02C, ZY-3, GF-1, etc.). The methodology consists of fully automatic collection of dense ground control points (GCP) from various sources and then application of those points to improve the photogrammetric model of the images. The delivered system is up and running at CRESDA for pre-operational production and has been generating a good return on investment by eliminating a great amount of manual labor and increasing daily data throughput more than tenfold with fewer operators. Future work, such as development of more performance-optimized algorithms, robust image matching methods and application workflows, is identified to improve the system in the coming years.

  2. Large scale extraction of poly(3-hydroxybutyrate) from Ralstonia eutropha H16 using sodium hypochlorite

    PubMed Central

    2012-01-01

    Isolation of polyhydroxyalkanoates (PHAs) from bacterial cell matter is a critical step in achieving profitable production of the polymer. Therefore, an extraction method must lead to a high recovery of a pure product at low costs. This study presents a simplified method for large-scale poly(3-hydroxybutyrate), poly(3HB), extraction using sodium hypochlorite. Poly(3HB) was extracted from cells of Ralstonia eutropha H16 at almost 96% purity. At different extraction volumes, a maximum recovery rate of 91.32% was obtained. At the largest extraction volume of 50 L, poly(3HB) with an average purity of 93.32% ± 4.62% was extracted with a maximum recovery of 87.03% of the initial poly(3HB) content. This process is easy to handle and requires less effort than previously described processes. PMID:23164136

  3. MIDAS prototype Multispectral Interactive Digital Analysis System for large area earth resources surveys. Volume 2: Charge coupled device investigation

    NASA Technical Reports Server (NTRS)

    Kriegler, F.; Marshall, R.; Sternberg, S.

    1976-01-01

    MIDAS is a third-generation, fast, low cost, multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from large regions with present and projected sensors. MIDAS, for example, can process a complete ERTS frame in forty seconds and provide a color map of sixteen constituent categories in a few minutes. A principal objective of the MIDAS Program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turn-around time and significant gains in throughput. The need for advanced onboard spacecraft processing of remotely sensed data is stated and approaches to this problem are described which are feasible through the use of charge coupled devices. Tentative mechanizations for the required processing operations are given in large block form. These initial designs can serve as a guide to circuit/system designers.

  4. Spatial considerations during cryopreservation of a large volume sample.

    PubMed

    Kilbride, Peter; Lamb, Stephen; Milne, Stuart; Gibbons, Stephanie; Erro, Eloy; Bundy, James; Selden, Clare; Fuller, Barry; Morris, John

    2016-08-01

    There have been relatively few studies on the implications of the physical conditions experienced by cells during large volume (litres) cryopreservation - most studies have focused on the problem of cryopreservation of smaller volumes, typically up to 2 ml. This study explores the effects of ice growth by progressive solidification, generally seen during larger scale cryopreservation, on encapsulated liver hepatocyte spheroids, and it develops a method to reliably sample different regions across the frozen cores of samples experiencing progressive solidification. These issues are examined in the context of a Bioartificial Liver Device which requires cryopreservation of a 2 L volume in a strict cylindrical geometry for optimal clinical delivery. Progressive solidification cannot be avoided in this arrangement. In such a system optimal cryoprotectant concentrations and cooling rates are known. However, applying these parameters to a large volume is challenging due to the thermal mass and subsequent thermal lag. The specific impact of this to the cryopreservation outcome is required. Under conditions of progressive solidification, the spatial location of Encapsulated Liver Spheroids had a strong impact on post-thaw recovery. Cells in areas first and last to solidify demonstrated significantly impaired post-thaw function, whereas areas solidifying through the majority of the process exhibited higher post-thaw outcome. It was also found that samples where the ice thawed more rapidly had greater post-thaw viability 24 h post-thaw (75.7 ± 3.9% and 62.0 ± 7.2% respectively). These findings have implications for the cryopreservation of large volumes with a rigid shape and for the cryopreservation of a Bioartificial Liver Device. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  5. Brain Structure in Young and Old East Asians and Westerners: Comparisons of Structural Volume and Cortical Thickness

    ERIC Educational Resources Information Center

    Chee, Michael Wei Liang; Zheng, Hui; Goh, Joshua Oon Soo; Park, Denise; Sutton, Bradley P.

    2011-01-01

    There is an emergent literature suggesting that East Asians and Westerners differ in cognitive processes because of cultural biases to process information holistically (East Asians) or analytically (Westerners). To evaluate the possibility that such differences are accompanied by differences in brain structure, we conducted a large comparative…

  6. Three-Dimensional Cell Printing of Large-Volume Tissues: Application to Ear Regeneration.

    PubMed

    Lee, Jung-Seob; Kim, Byoung Soo; Seo, Donghwan; Park, Jeong Hun; Cho, Dong-Woo

    2017-03-01

    The three-dimensional (3D) printing of large-volume cells, printed in a clinically relevant size, is one of the most important challenges in the field of tissue engineering. However, few studies have reported the fabrication of large-volume cell-printed constructs (LCCs). To create LCCs, appropriate fabrication conditions should be established: Factors involved include fabrication time, residence time, and temperature control of the cell-laden hydrogel in the syringe to ensure high cell viability and functionality. The prolonged time required for 3D printing of LCCs can reduce cell viability and result in insufficient functionality of the construct, because the cells are exposed to a harsh environment during the printing process. In this regard, we present an advanced 3D cell-printing system composed of a clean air workstation, a humidifier, and a Peltier system, which provides a suitable printing environment for the production of LCCs with high cell viability. We confirmed that the advanced 3D cell-printing system was capable of providing enhanced printability of hydrogels and fabricating an ear-shaped LCC with high cell viability. In vivo results for the ear-shaped LCC also showed that printed chondrocytes proliferated sufficiently and differentiated into cartilage tissue. Thus, we conclude that the advanced 3D cell-printing system is a versatile tool to create cell-printed constructs for the generation of large-volume tissues.

  7. Low-Cost and Large-Area Electronics, Roll-to-Roll Processing and Beyond

    NASA Astrophysics Data System (ADS)

    Wiesenhütter, Katarzyna; Skorupa, Wolfgang

    In the following chapter, the authors conduct a literature survey of current advances in state-of-the-art low-cost, flexible electronics. A new emerging trend in the design of modern semiconductor devices dedicated to scaling-up, rather than reducing, their dimensions is presented. To realize volume manufacturing, alternative semiconductor materials with superior performance, fabricated by innovative processing methods, are essential. This review provides readers with a general overview of the material and technology evolution in the area of macroelectronics. Herein, the term macroelectronics (MEs) refers to electronic systems that can cover a large area of flexible media. In stark contrast to well-established micro- and nano-scale semiconductor devices, where property improvement is associated with downscaling the dimensions of the functional elements, in macroelectronic systems their overall size defines the ultimate performance (Sun and Rogers in Adv. Mater. 19:1897-1916, 2007). The major challenges of large-scale production are discussed. Particular attention has been focused on describing advanced, short-term heat treatment approaches, which offer a range of advantages compared to conventional annealing methods. There is no doubt that large-area, flexible electronic systems constitute an important research topic for the semiconductor industry. The ability to fabricate highly efficient macroelectronics by inexpensive processes will have a significant impact on a range of diverse technology sectors. A new era "towards semiconductor volume manufacturing…" has begun.

  8. Free volumes and gas transport in polymers: amine-modified epoxy resins as a case study.

    PubMed

    Patil, Pushkar N; Roilo, David; Brusa, Roberto S; Miotello, Antonio; Aghion, Stefano; Ferragut, Rafael; Checchetto, Riccardo

    2016-02-07

    The CO2 transport process was studied in a series of amine-modified epoxy resins having different cross-linking densities but the same chemical environment for the penetrant molecules. Positron Annihilation Lifetime Spectroscopy (PALS) was used to monitor the free volume structure of the samples and experimentally evaluate their fractional free volume fh(T) and its temperature evolution. The analysis of the free volume hole size distribution showed that all the holes have a size large enough to accommodate the penetrant molecules at temperatures T above the glass transition temperature Tg. The measured gas diffusion constants at T > Tg have been reproduced in the framework of the free volume theory of diffusion using a novel procedure based on the use of fh(T) as an input experimental parameter.
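
    A small sketch of the free-volume treatment referred to above: measured diffusion constants can be fitted to D = D0 * exp(-B / f_h(T)), with the fractional free volume f_h(T) taken from PALS. The numbers below are synthetic placeholders rather than the paper's data, and SciPy is assumed to be available.

```python
# Synthetic illustration of the free-volume description of diffusion: fit
# D = D0 * exp(-B / f_h) to "measured" diffusivities. Placeholder data only.
import numpy as np
from scipy.optimize import curve_fit

def free_volume_model(f_h, D0, B):
    return D0 * np.exp(-B / f_h)

f_h = np.array([0.030, 0.034, 0.038, 0.042, 0.046])        # fractional free volume from PALS
rng = np.random.default_rng(7)
D_meas = free_volume_model(f_h, 2.0e-6, 0.20) * (1.0 + 0.05 * rng.standard_normal(f_h.size))

(D0_fit, B_fit), _ = curve_fit(free_volume_model, f_h, D_meas, p0=(1.0e-6, 0.3))
print(f"fitted D0 = {D0_fit:.3e} cm^2/s, B = {B_fit:.3f}")
```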

  9. Evaluation of counting methods for oceanic radium-228

    NASA Astrophysics Data System (ADS)

    Orr, James C.

    1988-07-01

    Measurement of open ocean 228Ra is difficult, typically requiring at least 200 L of seawater. The burden of collecting and processing these large-volume samples severely limits the widespread use of this promising tracer. To use smaller-volume samples, a more sensitive means of analysis is required. To seek out new and improved counting method(s), conventional 228Ra counting methods have been compared with some promising techniques which are currently used for other radionuclides. Of the conventional methods, α spectrometry possesses the highest efficiency (3-9%) and lowest background (0.0015 cpm), but it suffers from the need for complex chemical processing after sampling and the need to allow about 1 year for adequate ingrowth of 228Th granddaughter. The other two conventional counting methods measure the short-lived 228Ac daughter while it remains supported by 228Ra, thereby avoiding the complex sample processing and the long delay before counting. The first of these, high-resolution γ spectrometry, offers the simplest processing and an efficiency (4.8%) comparable to α spectrometry; yet its high background (0.16 cpm) and substantial equipment cost (˜30,000) limit its widespread use. The second no-wait method, β-γ coincidence spectrometry, also offers comparable efficiency (5.3%), but it possesses both lower background (0.0054 cpm) and lower initial cost (˜12,000). Three new (i.e., untried for 228Ra) techniques all seem to promise about a fivefold increase in efficiency over conventional methods. By employing liquid scintillation methods, both α spectrometry and β-γ coincidence spectrometry can improve their counter efficiency while retaining low background. The third new 228Ra counting method could be adapted from a technique which measures 224Ra by 220Rn emanation. After allowing for ingrowth and then counting for the 224Ra great-granddaughter, 228Ra could be back calculated, thereby yielding a method with high efficiency, where no sample processing is required. The efficiency and background of each of the three new methods have been estimated and are compared with those of the three methods currently employed to measure oceanic 228Ra. From efficiency and background, the relative figure of merit and the detection limit have been determined for each of the six counters. These data suggest that the new counting methods have the potential to measure most 228Ra samples with just 30 L of seawater, to better than 5% precision. Not only would this reduce the time, effort, and expense involved in sample collection, but 228Ra could then be measured on many small-volume samples (20-30 L) previously collected with only 226Ra in mind. By measuring 228Ra quantitatively on such small-volume samples, three analyses (large-volume 228Ra, large-volume 226Ra, and small-volume 226Ra) could be reduced to one, thereby dramatically improving analytical precision.
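
    To make the comparison concrete, the sketch below computes a relative figure of merit (efficiency squared over background) and a Currie-style detection limit from efficiency and background alone. Efficiencies and backgrounds follow the abstract where quoted and are otherwise assumed; the counting time and the use of the Currie formula are assumptions, not necessarily the paper's procedure.

```python
# Illustrative counter comparison from efficiency and background alone.
import math

counters = {                            # (efficiency fraction, background in cpm)
    "alpha spectrometry":      (0.06,  0.0015),
    "high-resolution gamma":   (0.048, 0.16),
    "beta-gamma coincidence":  (0.053, 0.0054),
}
count_time_min = 1000.0                 # assumed counting time (minutes)

for name, (eff, bkg) in counters.items():
    fom = eff ** 2 / bkg                                   # relative figure of merit
    b_counts = bkg * count_time_min                        # expected background counts
    ld_counts = 2.71 + 4.65 * math.sqrt(b_counts)          # Currie detection limit (counts)
    mda_dpm = ld_counts / (eff * count_time_min)           # minimum detectable activity
    print(f"{name:>23s}: FOM = {fom:7.2f}  MDA = {mda_dpm:.4f} dpm")
```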

  10. Metastable Prepores in Tension-Free Lipid Bilayers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ting, Christina L.; Awasthi, Neha; Muller, Marcus

    The formation and closure of aqueous pores in lipid bilayers is a key step in various biophysical processes. Large pores are well described by classical nucleation theory, but the free-energy landscape of small, biologically relevant pores has remained largely unexplored. The existence of small and metastable “prepores” was hypothesized decades ago from electroporation experiments, but resolving metastable prepores from theoretical models remained challenging. Using two complementary methods—atomistic simulations and self-consistent field theory of a minimal lipid model—we determine the parameters for which metastable prepores occur in lipid membranes. Here, both methods consistently suggest that pore metastability depends on the relative volume ratio between the lipid head group and lipid tails: lipids with a larger head-group volume fraction (or shorter saturated tails) form metastable prepores, whereas lipids with a smaller head-group volume fraction (or longer unsaturated tails) form unstable prepores.

  11. Metastable Prepores in Tension-Free Lipid Bilayers

    DOE PAGES

    Ting, Christina L.; Awasthi, Neha; Muller, Marcus; ...

    2018-03-23

    The formation and closure of aqueous pores in lipid bilayers is a key step in various biophysical processes. Large pores are well described by classical nucleation theory, but the free-energy landscape of small, biologically relevant pores has remained largely unexplored. The existence of small and metastable “prepores” was hypothesized decades ago from electroporation experiments, but resolving metastable prepores from theoretical models remained challenging. Using two complementary methods—atomistic simulations and self-consistent field theory of a minimal lipid model—we determine the parameters for which metastable prepores occur in lipid membranes. Here, both methods consistently suggest that pore metastability depends on the relative volume ratio between the lipid head group and lipid tails: lipids with a larger head-group volume fraction (or shorter saturated tails) form metastable prepores, whereas lipids with a smaller head-group volume fraction (or longer unsaturated tails) form unstable prepores.

  12. Joint Department of Defense/Department of Energy/Federal Emergency Management Agency Nuclear Weapon Accident Exercise (NUWAX 83) After Action Report. Volume 2

    DTIC Science & Technology

    1983-12-30

    support among the scientific community. In the absence of some agreed criteria, the economic impact and legal aspects could be overwhelming. The...processing large numbers of people. Guidance on CCS operations needs to include release limits and procedures for receipting for articles held for...contaminated articles and the re-clothing of personnel. Better procedures and equipment with which to rapidly process large numbers of

  13. Digital image processing for the earth resources technology satellite data.

    NASA Technical Reports Server (NTRS)

    Will, P. M.; Bakis, R.; Wesley, M. A.

    1972-01-01

    This paper discusses the problems of digital processing of the large volumes of multispectral image data that are expected to be received from the ERTS program. Correction of geometric and radiometric distortions are discussed and a byte oriented implementation is proposed. CPU timing estimates are given for a System/360 Model 67, and show that a processing throughput of 1000 image sets per week is feasible.

  14. Magic angle spinning nuclear magnetic resonance apparatus and process for high-resolution in situ investigations

    DOEpatents

    Hu, Jian Zhi; Sears, Jr., Jesse A.; Hoyt, David W.; Mehta, Hardeep S.; Peden, Charles H. F.

    2015-11-24

    A continuous-flow (CF) magic angle sample spinning (CF-MAS) NMR rotor and probe are described for investigating reaction dynamics, stable intermediates/transition states, and mechanisms of catalytic reactions in situ. The rotor includes a sample chamber of a flow-through design with a large sample volume that delivers a flow of reactants through a catalyst bed contained within the sample cell allowing in-situ investigations of reactants and products. Flow through the sample chamber improves diffusion of reactants and products through the catalyst. The large volume of the sample chamber enhances sensitivity permitting in situ 13C CF-MAS studies at natural abundance.

  15. Population attribute compression

    DOEpatents

    White, James M.; Faber, Vance; Saltzman, Jeffrey S.

    1995-01-01

    An image population having a large number of attributes is processed to form a display population with a predetermined smaller number of attributes that represent the larger number of attributes. In a particular application, the color values in an image are compressed for storage in a discrete look-up table (LUT). Color space containing the LUT color values is successively subdivided into smaller volumes until a plurality of volumes are formed, each having no more than a preselected maximum number of color values. Image pixel color values can then be rapidly placed in a volume with only a relatively few LUT values from which a nearest neighbor is selected. Image color values are assigned 8 bit pointers to their closest LUT value whereby data processing requires only the 8 bit pointer value to provide 24 bit color values from the LUT.
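
    A hedged sketch of the scheme described above, not the patented algorithm: colour space is recursively subdivided until each box holds no more than a preselected number of LUT colours, and each image pixel is then matched only against the LUT colours in its own box and stored as an 8-bit pointer. The splitting rule, box lookup and empty-box fallback here are simplified guesses.

```python
# Illustrative colour-space subdivision for fast nearest-LUT-colour lookup.
import numpy as np

MAX_PER_BOX = 8

def subdivide(lut_idx, lo, hi, lut, boxes):
    """Recursively split the box [lo, hi) until it holds few LUT colours."""
    if len(lut_idx) <= MAX_PER_BOX or np.all(hi - lo <= 1):
        boxes.append((lo.copy(), hi.copy(), lut_idx))
        return
    axis = int(np.argmax(hi - lo))                  # split the longest axis in half
    mid = (lo[axis] + hi[axis]) // 2
    left = lut_idx[lut[lut_idx, axis] < mid]
    right = lut_idx[lut[lut_idx, axis] >= mid]
    hi_l, lo_r = hi.copy(), lo.copy()
    hi_l[axis], lo_r[axis] = mid, mid
    subdivide(left, lo, hi_l, lut, boxes)
    subdivide(right, lo_r, hi, lut, boxes)

def build_boxes(lut):
    boxes = []
    subdivide(np.arange(len(lut)), np.zeros(3, int), np.full(3, 256), lut, boxes)
    return boxes

def quantize(pixel, lut, boxes):
    """Return the 8-bit pointer of the nearest LUT colour, searching one box only."""
    for lo, hi, idx in boxes:
        if np.all(pixel >= lo) and np.all(pixel < hi):
            if len(idx) == 0:                       # empty box: fall back to a full search
                idx = np.arange(len(lut))
            d = np.sum((lut[idx].astype(int) - pixel) ** 2, axis=1)
            return int(idx[int(np.argmin(d))])
    return 0

rng = np.random.default_rng(1)
lut = rng.integers(0, 256, size=(256, 3))           # 256-entry LUT of 24-bit colours
boxes = build_boxes(lut)
pointer = quantize(np.array([200, 30, 90]), lut, boxes)
print("8-bit pointer:", pointer, "-> 24-bit colour:", lut[pointer])
```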

  16. Temporal dynamics of online petitions.

    PubMed

    Böttcher, Lucas; Woolley-Meza, Olivia; Brockmann, Dirk

    2017-01-01

    Online petitions are an important avenue for direct political action, yet the dynamics that determine when a petition will be successful are not well understood. Here we analyze the temporal characteristics of online-petition signing behavior in order to identify systematic differences between popular petitions, which receive a high volume of signatures, and unpopular ones. We find that, in line with other temporal characterizations of human activity, the signing process is typically non-Poissonian and non-homogeneous in time. However, this process exhibits anomalously high memory for human activity, possibly indicating that synchronized external influence or contagion play an important role. More interestingly, we find clear differences in the characteristics of the inter-event time distributions depending on the total number of signatures that petitions receive, independently of the total duration of the petitions. Specifically, popular petitions that attract a large volume of signatures exhibit more variance in the distribution of inter-event times than unpopular petitions with only a few signatures, which could be considered an indication that the former are more bursty. However, petitions with large signature volume are less bursty according to measures that consider the time ordering of inter-event times. Our results, therefore, emphasize the importance of accounting for time ordering to characterize human activity.
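
    A short sketch of the two standard inter-event-time measures alluded to above: the burstiness parameter B = (sigma - mu)/(sigma + mu) and the memory coefficient M, taken here as the correlation between consecutive inter-event times. The timestamp gaps below are synthetic, not petition data.

```python
# Illustrative burstiness and memory measures on synthetic inter-event times.
import numpy as np

def burstiness(inter_event):
    mu, sigma = inter_event.mean(), inter_event.std()
    return (sigma - mu) / (sigma + mu)

def memory_coefficient(inter_event):
    a, b = inter_event[:-1], inter_event[1:]      # consecutive gaps
    return float(np.corrcoef(a, b)[0, 1])

rng = np.random.default_rng(42)
poisson_gaps = rng.exponential(scale=60.0, size=5000)       # Poisson-like signing
bursty_gaps = rng.pareto(a=1.5, size=5000) * 60.0            # heavy-tailed ("bursty") signing

for name, gaps in [("Poisson-like", poisson_gaps), ("heavy-tailed", bursty_gaps)]:
    print(f"{name:>12s}: B = {burstiness(gaps):+.3f}, M = {memory_coefficient(gaps):+.3f}")
```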

  17. Compact 3D Camera for Shake-the-Box Particle Tracking

    NASA Astrophysics Data System (ADS)

    Hesseling, Christina; Michaelis, Dirk; Schneiders, Jan

    2017-11-01

    Time-resolved 3D-particle tracking usually requires the time-consuming optical setup and calibration of 3 to 4 cameras. Here, a compact four-camera housing has been developed. The performance of the system using Shake-the-Box processing (Schanz et al. 2016) is characterized. It is shown that the stereo-base is large enough for sensible 3D velocity measurements. Results from successful experiments in water flows using LED illumination are presented. For large-scale wind tunnel measurements, an even more compact version of the system is mounted on a robotic arm. Once calibrated for a specific measurement volume, the necessity for recalibration is eliminated even when the system moves around. Co-axial illumination is provided through an optical fiber in the middle of the housing, illuminating the full measurement volume from one viewing direction. Helium-filled soap bubbles are used to ensure sufficient particle image intensity. This way, the measurement probe can be moved around complex 3D-objects. By automatic scanning and stitching of recorded particle tracks, the detailed time-averaged flow field of a full volume of cubic meters in size is recorded and processed. Results from an experiment at TU-Delft of the flow field around a cyclist are shown.

  18. Automatic partitioning of head CTA for enabling segmentation

    NASA Astrophysics Data System (ADS)

    Suryanarayanan, Srikanth; Mullick, Rakesh; Mallya, Yogish; Kamath, Vidya; Nagaraj, Nithin

    2004-05-01

    Radiologists perform a CT Angiography procedure to examine vascular structures and associated pathologies such as aneurysms. Volume rendering is used to exploit the volumetric capabilities of CT and provides complete interactive 3-D visualization. However, bone forms an occluding structure and must be segmented out. The anatomical complexity of the head creates a major challenge in the segmentation of bone and vessel. An analysis of the head volume reveals varying spatial relationships between vessel and bone that can be separated into three sub-volumes: "proximal", "middle", and "distal". The "proximal" and "distal" sub-volumes contain good spatial separation between bone and vessel (carotid referenced here). Bone and vessel appear contiguous in the "middle" partition, which remains the most challenging region for segmentation. The partition algorithm is used to automatically identify these partition locations so that different segmentation methods can be developed for each sub-volume. The partition locations are computed using bone, image entropy, and sinus profiles along with a rule-based method. The algorithm is validated on 21 cases (varying volume sizes, resolution, clinical sites, pathologies) using ground truth identified visually. The algorithm is also computationally efficient, processing a 500+ slice volume in 6 seconds (an impressive 0.01 seconds per slice), which makes it an attractive algorithm for pre-processing large volumes. The partition algorithm is integrated into the segmentation workflow. Fast and simple algorithms are implemented for processing the "proximal" and "distal" partitions. Complex methods are restricted to only the "middle" partition. The partition-enabled segmentation has been successfully tested and results are shown from multiple cases.
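
    The abstract does not spell out the rule set, so the sketch below only illustrates the general profile-driven idea: per-slice bone area and intensity entropy are computed along the scan axis and a toy rule picks the cut slices. The thresholds, the rule and the synthetic volume are invented placeholders, not the authors' algorithm (which also uses sinus profiles).

```python
# Illustrative profile-driven partition of a head CT volume along the scan axis.
import numpy as np

def slice_profiles(volume_hu, bone_threshold=300):
    """Per-slice bone-pixel counts and intensity entropies for a (slices, rows, cols) volume."""
    bone = (volume_hu > bone_threshold).sum(axis=(1, 2))
    entropies = []
    for sl in volume_hu:
        hist, _ = np.histogram(sl, bins=64)
        p = hist / hist.sum()
        p = p[p > 0]
        entropies.append(float(-(p * np.log2(p)).sum()))
    return bone, np.array(entropies)

def partition(bone):
    """Toy rule: the 'middle' region starts where bone area first exceeds its 75th
    percentile and ends where it drops back below the median."""
    start = int(np.argmax(bone > np.percentile(bone, 75)))
    after = bone[start:] < np.median(bone)
    end = start + int(np.argmax(after)) if after.any() else len(bone) - 1
    return start, end

rng = np.random.default_rng(3)
vol = rng.normal(loc=0.0, scale=200.0, size=(120, 64, 64))   # stand-in "HU" volume
vol[40:80] += 400.0                                          # pretend bone-rich skull-base slices
bone, entropy = slice_profiles(vol)
print("middle partition between slices:", partition(bone))
```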

  19. A negative association between brainstem pontine grey-matter volume, well-being and resilience in healthy twins.

    PubMed

    Gatt, Justine M; Burton, Karen L O; Routledge, Kylie M; Grasby, Katrina L; Korgaonkar, Mayuresh S; Grieve, Stuart M; Schofield, Peter R; Harris, Anthony W F; Clark, C Richard; Williams, Leanne M

    2018-06-20

    Associations between well-being, resilience to trauma and the volume of grey-matter regions involved in affective processing (e.g., threat/reward circuits) are largely unexplored, as are the roles of shared genetic and environmental factors derived from multivariate twin modelling. This study presents, to our knowledge, the first exploration of well-being and volumes of grey-matter regions involved in affective processing using a region-of-interest, voxel-based approach in 263 healthy adult twins (60% monozygotic pairs, 61% females, mean age 39.69 yr). To examine patterns for resilience (i.e., positive adaptation following adversity), we evaluated associations between the same brain regions and well-being in a trauma-exposed subgroup. We found a correlated effect between increased well-being and reduced grey-matter volume of the pontine nuclei. This association was strongest for individuals with higher resilience to trauma. Multivariate twin modelling suggested that the common variance between the pons volume and well-being scores was due to environmental factors. We used a cross-sectional sample; results need to be replicated longitudinally and in a larger sample. Associations with altered grey matter of the pontine nuclei suggest that basic sensory processes, such as arousal, startle, memory consolidation and/or emotional conditioning, may have a role in well-being and resilience.

  20. Advances in Proteomics Data Analysis and Display Using an Accurate Mass and Time Tag Approach

    PubMed Central

    Zimmer, Jennifer S.D.; Monroe, Matthew E.; Qian, Wei-Jun; Smith, Richard D.

    2007-01-01

    Proteomics has recently demonstrated utility in understanding cellular processes on the molecular level as a component of systems biology approaches and for identifying potential biomarkers of various disease states. The large amount of data generated by utilizing high efficiency (e.g., chromatographic) separations coupled to high mass accuracy mass spectrometry for high-throughput proteomics analyses presents challenges related to data processing, analysis, and display. This review focuses on recent advances in nanoLC-FTICR-MS-based proteomics approaches and the accompanying data processing tools that have been developed to display and interpret the large volumes of data being produced. PMID:16429408

  1. Modern Sorters for Soil Segregation on Large Scale Remediation Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shonka, J.J.; Kelley, J.E.; O'Brien, J.M.

    2008-01-15

    In the mid-1940s, Dr. C. Lapointe developed a Geiger tube based uranium ore scanner and picker to replace hand-cobbing. In the 1990s, a modern version of the Lapointe Picker for soil sorting was developed around the need to clean the Johnston Atoll of plutonium. It worked well with sand, but these systems are ineffective with soil, especially with wet conditions. Additionally, several other constraints limited throughput. Slow-moving belts and thin layers of material on the belt, coupled with the use of multiple small detectors and small sorting gates, make these systems ineffective for high throughput. Soil sorting of clay-bearing soils and building debris requires a new look at both the material handling equipment and the radiation detection methodology. A new class of Super-Sorters has attained throughput of one hundred times that of the old designs. Higher throughput means shorter schedules, which reduce costs substantially. The planning, cost, implementation, and other site considerations for these new Super-Sorters are discussed. Modern soil segregation was developed by Ed Bramlitt of the Defense Nuclear Agency for cleanup at Johnston Atoll. The process eventually became the Segmented Gate System (SGS). This system uses an array of small sodium iodide (NaI) detectors, each viewing a small volume (segment) and controlling a gate. The volume in the gate is approximately one kg. This system works well when the material to be processed is sand; however, when the material is wet and sticky (soils with clays) the system has difficulty moving the material through the gates. Super-Sorters are a new class of machine designed to take advantage of high-throughput aggregate processing conveyors, large acquisition volumes, and large NaI detectors using gamma spectroscopy. By using commercially available material handling equipment, the system can attain processing rates of up to 400 metric tons/hr with spectrum acquisition approximately every 0.5 sec, so the acquisition volume is 50 kilograms or less. Smaller sorting volumes can be obtained with lower throughput or by re-sorting the diverted material. This equipment can also handle large objects. The use of spectroscopy systems allows several regions of interest to be set. Super-Sorters can bring waste processing charges down to less than $30/metric ton on smaller jobs and can save hundreds of dollars per metric ton in disposal charges. The largest effect on the overall project cost occurs during planning and implementation. The overall goal is reduction of the length of the project, which dictates the most efficient soil processing. With all sorting systems the parameters that need to be accounted for are matrix type, soil feed rate, soil pre-processing, site conditions, and regulatory issues. The soil matrix and its ability to flow are extremely crucial to operations. It is also important to consider that as conditions change (i.e., moisture), the flowability of the soil matrix will change. Many soil parameters have to be considered: cohesive strength, internal and wall friction, permeability, and bulk density as a function of consolidating pressure. Clay-bearing soils have very low permeability and high cohesive strength, which makes them difficult to process, especially when wet. Soil feed speed is dependent on the equipment present and the ability to move the soil in the Super-Sorter processing area. When a Super-Sorter is running at 400 metric tons per hour it is difficult to feed the system.
As an example, front-end loaders with large buckets would move approximately 5-10 metric tons of material, and 400 metric tons per hour would require 50-100 bucket-loads per hour to attain. Because the flowability of the soil matrix is important, poor material is often pre-processed before it is added to the feed hopper of the 'survey' conveyor. This pre-processing can consist of a 'grizzly' to remove large objects from the soil matrix, followed by a screening plant to prepare the soil so that it feeds well. Hydrated lime can be added to improve material properties. Site conditions (site area, typical weather conditions, etc.) also play a large part in project planning. Downtime lengthens the project schedule and increases costs. The system must be configured to handle weather conditions or other variables that affect throughput. The largest single factor that plays into the project design is the regulatory environment. Before a sorter can be utilized, an averaging mass must be established by the regulator(s). There currently are no standards or guidelines in this area. The differences between acquisition mass and averaging mass are very important. The acquisition mass is defined based on the acquisition time and the geometry of the detectors. The averaging mass can then be as small as the acquisition mass or as large as several hundred tons (the averaging mass is simply the sum of a number of acquisitions). It is important to define volumetric limits and any required point-source limits. Super-Sorters handle both of these types of limits simultaneously. The minimum detectable activity for Super-Sorters is a function of speed. The chart in the original report illustrates the detection confidence level for a 0.1 μCi point source of Ra-226 vs alarm point for three different sorter process rates. The minimum detectable activity and diversion volume for a Super-Sorter are also functions of the acquisition mass. The curves were collected using a 0-15 kg acquisition mass. Diversion volumes ranged from 20-30 kg for a point source diversion. Soil Super-Sorters should be considered for every D and D project where it is desirable to reduce the waste stream. A volume reduction of 1:1000 can be gained for each pass through a modern sorter, resulting in significant savings in disposal costs.

  2. Distance-limited perpendicular distance sampling for coarse woody debris: theory and field results

    Treesearch

    Mark J. Ducey; Micheal S. Williams; Jeffrey H. Gove; Steven Roberge; Robert S. Kenning

    2013-01-01

    Coarse woody debris (CWD) has been identified as an important component in many forest ecosystem processes. Perpendicular distance sampling (PDS) is one of the several efficient new methods that have been proposed for CWD inventory. One drawback of PDS is that the maximum search distance can be very large, especially if CWD diameters are large or the volume factor...

  3. Newly invented biobased materials from low-carbon, diverted waste fibers: research methods, testing, and full-scale application in a case study structure

    Treesearch

    Julee A Herdt; John Hunt; Kellen Schauermann

    2016-01-01

    This project demonstrates newly invented, biobased construction materials developed by applying low-carbon biomass waste sources through the authors’ engineered fiber processes and technology. If manufactured and applied at large scale, the project inventions can divert large volumes of cellulose waste into high-performance, low embodied energy, environmental construction...

  4. Definition of satellite servicing technology development missions for early space stations. Volume 2: Technical

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Early space station accommodation, build-up of space station manipulator capability, on-orbit spacecraft assembly test and launch, large antenna structure deployment, service/refurbish satellite, and servicing of free-flying materials processing platform are discussed.

  5. Computational modelling of large deformations in layered-silicate/PET nanocomposites near the glass transition

    NASA Astrophysics Data System (ADS)

    Figiel, Łukasz; Dunne, Fionn P. E.; Buckley, C. Paul

    2010-01-01

    Layered-silicate nanoparticles offer a cost-effective reinforcement for thermoplastics. Computational modelling has been employed to study large deformations in layered-silicate/poly(ethylene terephthalate) (PET) nanocomposites near the glass transition, as would be experienced during industrial forming processes such as thermoforming or injection stretch blow moulding. Non-linear numerical modelling was applied, to predict the macroscopic large deformation behaviour, with morphology evolution and deformation occurring at the microscopic level, using the representative volume element (RVE) approach. A physically based elasto-viscoplastic constitutive model, describing the behaviour of the PET matrix within the RVE, was numerically implemented into a finite element solver (ABAQUS) using an UMAT subroutine. The implementation was designed to be robust, for accommodating large rotations and stretches of the matrix local to, and between, the nanoparticles. The nanocomposite morphology was reconstructed at the RVE level using a Monte-Carlo-based algorithm that placed straight, high-aspect ratio particles according to the specified orientation and volume fraction, with the assumption of periodicity. Computational experiments using this methodology enabled prediction of the strain-stiffening behaviour of the nanocomposite, observed experimentally, as functions of strain, strain rate, temperature and particle volume fraction. These results revealed the probable origins of the enhanced strain stiffening observed: (a) evolution of the morphology (through particle re-orientation) and (b) early onset of stress-induced pre-crystallization (and hence lock-up of viscous flow), triggered by the presence of particles. The computational model enabled prediction of the effects of process parameters (strain rate, temperature) on evolution of the morphology, and hence on the end-use properties.

  6. Photoacoustic projection imaging using an all-optical detector array

    NASA Astrophysics Data System (ADS)

    Bauer-Marschallinger, J.; Felbermayer, K.; Berer, T.

    2018-02-01

    We present a prototype for all-optical photoacoustic projection imaging. By generating projection images, photoacoustic information of large volumes can be retrieved with less effort compared to common photoacoustic computed tomography where many detectors and/or multiple measurements are required. In our approach, an array of 60 integrating line detectors is used to acquire photoacoustic waves. The line detector array consists of fiber-optic Mach-Zehnder interferometers, distributed on a cylindrical surface. From the measured variation of the optical path lengths of the interferometers, induced by photoacoustic waves, a photoacoustic projection image can be reconstructed. The resulting images represent the projection of the three-dimensional spatial light absorbance within the imaged object onto a two-dimensional plane, perpendicular to the line detector array. The fiber-optic detectors achieve a noise-equivalent pressure of 24 Pascal at a 10 MHz bandwidth. We present the operational principle, the structure of the array, and resulting images. The system can acquire high-resolution projection images of large volumes within a short period of time. Imaging large volumes at high frame rates facilitates monitoring of dynamic processes.

  7. Drizzle formation in stratocumulus clouds: Effects of turbulent mixing

    DOE PAGES

    Magaritz-Ronen, L.; Pinsky, M.; Khain, A.

    2016-02-17

    The mechanism of drizzle formation in shallow stratocumulus clouds and the effect of turbulent mixing on this process are investigated. A Lagrangian–Eulerian model of the cloud-topped boundary layer is used to simulate the cloud measured during flight RF07 of the DYCOMS-II field experiment. The model contains ~ 2000 air parcels that are advected in a turbulence-like velocity field. In the model all microphysical processes are described for each Lagrangian air volume, and turbulent mixing between the parcels is also taken into account. It was found that the first large drops form in air volumes that are closest to adiabatic and characterized by high humidity, extended residence near cloud top, and maximum values of liquid water content, allowing the formation of drops as a result of efficient collisions. The first large drops form near cloud top and initiate drizzle formation in the cloud. Drizzle is developed only when turbulent mixing of parcels is included in the model. Without mixing, the cloud structure is extremely inhomogeneous and the few large drops that do form in the cloud evaporate during their sedimentation. Lastly, it was found that turbulent mixing can delay the process of drizzle initiation but is essential for the further development of drizzle in the cloud.

  8. a Novel Approach of Indexing and Retrieving Spatial Polygons for Efficient Spatial Region Queries

    NASA Astrophysics Data System (ADS)

    Zhao, J. H.; Wang, X. Z.; Wang, F. Y.; Shen, Z. H.; Zhou, Y. C.; Wang, Y. L.

    2017-10-01

    Spatial region queries are more and more widely used in web-based applications. Mechanisms to provide efficient query processing over geospatial data are essential. However, due to the massive geospatial data volume, heavy geometric computation, and high access concurrency, it is difficult to get responses in real time. Spatial indexes are usually used in this situation. In this paper, based on the k-d tree, we introduce a distributed KD-Tree (DKD-Tree) suitable for polygon data, and a two-step query algorithm. The spatial index construction is recursive and iterative, and the query is an in-memory process. Both the index and query methods can be processed in parallel, and are implemented based on HDFS, Spark and Redis. Experiments on a large volume of remote sensing image metadata have been carried out, and the advantages of our method are investigated by comparing with spatial region queries executed on PostgreSQL and PostGIS. Results show that our approach not only greatly improves the efficiency of spatial region queries, but also has good scalability. Moreover, the two-step spatial range query algorithm can also save cluster resources to support a large number of concurrent queries. Therefore, this method is very useful when building large geographic information systems.
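
    As a rough, single-machine illustration of the two-step (filter-then-refine) region query described above, the sketch below indexes polygon bounding-box centres in a k-d tree for the coarse filtering step and refines the candidates with exact geometry tests. It assumes scipy and shapely are available; all names and data are illustrative, and the distributed HDFS/Spark/Redis machinery of the DKD-Tree itself is not reproduced here.

        import numpy as np
        from scipy.spatial import cKDTree
        from shapely.geometry import Polygon, box

        def build_index(polygons):
            """Index polygons by the centre of their bounding box (coarse index)."""
            centres, radii = [], []
            for p in polygons:
                minx, miny, maxx, maxy = p.bounds
                centres.append([(minx + maxx) / 2.0, (miny + maxy) / 2.0])
                radii.append(0.5 * np.hypot(maxx - minx, maxy - miny))  # MBR half-diagonal
            return cKDTree(np.asarray(centres)), max(radii)

        def region_query(polygons, tree, max_radius, qminx, qminy, qmaxx, qmaxy):
            """Two-step spatial region query: (1) coarse filter with the k-d tree,
            (2) exact geometric refinement on the candidate set only."""
            qbox = box(qminx, qminy, qmaxx, qmaxy)
            qcentre = [(qminx + qmaxx) / 2.0, (qminy + qmaxy) / 2.0]
            qradius = 0.5 * np.hypot(qmaxx - qminx, qmaxy - qminy)
            # Step 1: any polygon that can intersect the query box lies within this radius.
            candidates = tree.query_ball_point(qcentre, r=qradius + max_radius)
            # Step 2: exact intersection test, only on the (much smaller) candidate set.
            return [i for i in candidates if polygons[i].intersects(qbox)]

        # toy usage
        polys = [Polygon([(i, 0), (i + 0.8, 0), (i + 0.4, 1)]) for i in range(1000)]
        tree, rmax = build_index(polys)
        print(region_query(polys, tree, rmax, 2.5, 0.1, 5.5, 0.9))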

  9. Drizzle formation in stratocumulus clouds: Effects of turbulent mixing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Magaritz-Ronen, L.; Pinsky, M.; Khain, A.

    The mechanism of drizzle formation in shallow stratocumulus clouds and the effect of turbulent mixing on this process are investigated. A Lagrangian–Eularian model of the cloud-topped boundary layer is used to simulate the cloud measured during flight RF07 of the DYCOMS-II field experiment. The model contains ~ 2000 air parcels that are advected in a turbulence-like velocity field. In the model all microphysical processes are described for each Lagrangian air volume, and turbulent mixing between the parcels is also taken into account. It was found that the first large drops form in air volumes that are closest to adiabatic andmore » characterized by high humidity, extended residence near cloud top, and maximum values of liquid water content, allowing the formation of drops as a result of efficient collisions. The first large drops form near cloud top and initiate drizzle formation in the cloud. Drizzle is developed only when turbulent mixing of parcels is included in the model. Without mixing, the cloud structure is extremely inhomogeneous and the few large drops that do form in the cloud evaporate during their sedimentation. Lastly, it was found that turbulent mixing can delay the process of drizzle initiation but is essential for the further development of drizzle in the cloud.« less

  10. Boron Nitride Nanoribbons from Exfoliation of Boron Nitride Nanotubes

    NASA Technical Reports Server (NTRS)

    Hung, Ching-Cheh; Hurst, Janet; Santiago, Diana

    2017-01-01

    Two types of boron nitride nanotubes (BNNTs) were exfoliated into boron nitride nanoribbons (BNNR), which were identified using transmission electron microscopy: (1) commercial BNNTs with thin tube walls and small diameters. Tube unzipping was indicated by a large decrease of the sample's surface area and volume for pores less than 2 nm in diameter. (2) BNNTs with large diameters and thick walls synthesized at NASA Glenn Research Center. Here, tube unraveling was indicated by a large increase in external surface area and pore volume. For both, the exfoliation process was similar to the previously reported method to exfoliate commercial hexagonal boron nitride (hBN): Mixtures of BNNT, FeCl3, and NaF (or KF) were sequentially treated in 250 to 350 °C nitrogen for intercalation, 500 to 750 °C air for exfoliation, and finally HCl for purification. Property changes of the nanosized boron nitride throughout this process were also similar to the previously observed changes of commercial hBN during the exfoliation process: Both crystal structure (x-ray diffraction data) and chemical properties (Fourier-transform infrared spectroscopy data) of the original reactant changed after intercalation and exfoliation, but most (not all) of these changes revert back to those of the reactant once the final, purified products are obtained.

  11. Precipitation, landsliding, and erosion across the Olympic Mountains, Washington State, USA

    NASA Astrophysics Data System (ADS)

    Smith, Stephen G.; Wegmann, Karl W.

    2018-01-01

    In the Olympic Mountains of Washington State, landsliding is the primary surface process by which bedrock and hillslope regolith are delivered to river networks. However, the relative importance of large earthquakes versus high magnitude precipitation events to the total volume of landslide material transported to valley bottoms remains unknown in part due to the absence of large historical earthquakes. To test the hypothesis that erosion is linked to precipitation, approximately 1000 landslides were mapped from Google Earth imagery between 1990 and 2015 along a 15 km-wide × 85 km-long (1250 km²) swath across the range. The volume of hillslope material moved by each slide was calculated using previously published area-volume scaling relationships, and the spatial distribution of landslide volume was compared to mean annual precipitation data acquired from the PRISM climate group for the period 1981-2010. Statistical analysis reveals a significant correlation (r = 0.55; p < 0.001) between total landslide volume and mean annual precipitation, with 98% of landslide volume occurring along the windward, high-precipitation side of the range during the 25-year interval. Normalized to area, this volume yields a basin-wide erosion rate of 0.28 ± 0.11 mm yr⁻¹, which is similar to previous time-variable estimates of erosion throughout the Olympic Mountains, including those from river sediment yield, cosmogenic ¹⁰Be, fluvial terrace incision, and thermochronometry. The lack of large historic earthquakes makes it difficult to assess the relative contributions of precipitation and seismic shaking to total erosion, but our results suggest that climate, and more specifically a sharp precipitation gradient, plays an important role in controlling erosion and landscape evolution over both short and long timescales across the Olympic Mountains.
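
    A minimal sketch of the volume-estimation step described above: mapped landslide areas are converted to volumes with an area-volume scaling law of the form V = αA^γ and the catchment totals are then correlated with mean annual precipitation. The scaling coefficients and data below are placeholders, not the values used in the study.

        import numpy as np
        from scipy.stats import pearsonr

        # Placeholder scaling coefficients (the study used previously published values).
        ALPHA, GAMMA = 0.15, 1.33          # V = ALPHA * A**GAMMA, A in m^2, V in m^3

        def landslide_volume(area_m2):
            return ALPHA * area_m2 ** GAMMA

        # Hypothetical mapped landslide areas (m^2) and mean annual precipitation (mm)
        # for a few sub-areas of the swath.
        areas = [np.array([1.2e3, 5.0e3, 8.0e2]), np.array([2.0e4, 3.5e3]),
                 np.array([6.0e2]), np.array([1.0e4, 7.5e3, 2.2e3, 9.0e2])]
        precip = np.array([3500.0, 3100.0, 1200.0, 2800.0])

        total_volume = np.array([landslide_volume(a).sum() for a in areas])
        r, p = pearsonr(total_volume, precip)
        print(f"total landslide volume per area: {total_volume}")
        print(f"Pearson r = {r:.2f}, p = {p:.3f}")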

  12. The raft of the Saint-Jean River, Gaspé (Québec, Canada): A dynamic feature trapping most of the wood transported from the catchment

    NASA Astrophysics Data System (ADS)

    Boivin, Maxime; Buffin-Bélanger, Thomas; Piégay, Hervé

    2015-02-01

    The rivers of the Gaspé Peninsula, Québec (Canada), a coastal drainage system of the St. Lawrence River, receive and transport vast quantities of large wood. The rapid rate of channel shifting caused by high-energy flows and noncohesive banks allows wood recruitment that in turn greatly influences river dynamics. The delta of the Saint-Jean River has accumulated wood since 1960, leading to frequent avulsions over that time period. The wood raft there is now more than 3-km in length, which is unusual but natural. This jam configuration allows a unique opportunity to estimate a wood budget at the scale of a long river corridor and to better understand the dynamics of large wood (LW) in rivers. A wood budget includes the evaluation of wood volumes (i) produced by bank erosion (input), (ii) still in transit in the river corridor (deposited on sand bars or channel edges), and (iii) accumulated in the delta (output). The budget is based on an analysis of aerial photos dating back to 1963 as well as surveys carried out in 2010, all of which were used to locate and describe large wood accumulations along a 60-km river section. The main results of this paper show that the raft formation in the delta is dynamic and can be massive, but it is a natural process. Considering the estimated wood volume trapped in the delta from 1963 to 2013 (≈ 25,000 m3), two important points are revealed by the quantification of the wood recruitment volume from 1963 to 2004 (≈ 27,000 m3 ± 400 m3) and of the wood volume stored on the bars in 2010 (≈ 5950 m3). First, the recruitment of large wood from lateral migration for the 40-year period can account for the volume of large wood in the delta and in transit. Second, the excess wood volume produced by lateral migration and avulsion represents a minimum estimation of the large wood trapped on the floodplain owing to wood volume that has decomposed and large wood that exited the river system. Rafts are major trapping structures that provide good potential sites to monitor wood delivery from the catchment through time and allow estimations of LW residence time while in transit. These results contribute to understanding the interannual large wood dynamics in the Saint-Jean River and can assist river managers in determining sustainable solutions for coping with the issue of wood rafts in rivers.

  13. Non-diffusive ignition of a gaseous reactive mixture following time-resolved, spatially distributed energy deposition

    NASA Astrophysics Data System (ADS)

    Kassoy, D. R.

    2014-01-01

    Systematic asymptotic methods are applied to the compressible conservation and state equations for a reactive gas, including transport terms, to develop a rational thermomechanical formulation for the ignition of a chemical reaction following time-resolved, spatially distributed thermal energy addition from an external source into a finite volume of gas. A multi-parameter asymptotic analysis is developed for a wide range of energy deposition levels relative to the initial internal energy in the volume when the heating timescale is short compared to the characteristic acoustic timescale of the volume. Below a quantitatively defined threshold for energy addition, a nearly constant volume heating process occurs, with a small but finite internal gas expansion Mach number. Very little added thermal energy is converted to kinetic energy. The gas expelled from the boundary of the hot, high-pressure spot is the source of mechanical disturbances (acoustic and shock waves) that propagate away into the neighbouring unheated gas. When the energy addition reaches the threshold value, the heating process is fully compressible with a substantial internal gas expansion Mach number, the source of blast waves propagating into the unheated environmental gas. This case corresponds to an extremely large non-dimensional hot-spot temperature and pressure. If the former is sufficiently large, a high activation energy chemical reaction is initiated on the short heating timescale. This phenomenon is in contrast to that for more modest levels of energy addition, where a thermal explosion occurs only after the familiar extended ignition delay period for a classical high activation reaction. Transport effects, modulated by an asymptotically small Knudsen number, are shown to be negligible unless a local gradient in temperature, concentration or velocity is exceptionally large.

  14. An estimation of vehicle kilometer traveled and on-road emissions using the traffic volume and travel speed on road links in Incheon City.

    PubMed

    Jung, Sungwoon; Kim, Jounghwa; Kim, Jeongsoo; Hong, Dahee; Park, Dongjoo

    2017-04-01

    The objective of this study is to estimate vehicle kilometers traveled (VKT) and on-road emissions from traffic volumes in an urban area. We estimated VKT in two ways: one based on registered vehicles and the other based on observed traffic volumes. VKT for registered vehicles was 2.11 times greater than VKT from the applied traffic volumes because the two estimation methods differ; to compare the two values we therefore defined an inner VKT, the VKT actually driven within the urban area. We also focused on freight modes because they discharge a large share of air pollutant emissions. The analysis showed that middle and large trucks registered in other regions travel to the target city to carry freight, since the city contains many industrial and logistics areas. Freight is transferred through harbors, large logistics centers, or intermediate locations before being moved to the final destination, and during this process most freight is moved by middle and large trucks and trailers rather than small trucks for import and export. Consequently, the inflow of such trucks from other areas exceeds the registered vehicles, and most emissions from diesel trucks had been overestimated in comparison to VKT from the applied traffic volumes in the target city. These findings indicate that VKT based on traffic volume and travel speed on road links is essential for accurately estimating diesel-truck emissions in the target city, and they support estimating the effect of on-road emissions on urban air quality in Korea. Copyright © 2016. Published by Elsevier B.V.
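
    As a minimal illustration of link-based estimation of VKT and emissions from traffic volume and travel speed, the sketch below sums volume × length over road links and applies a speed-dependent emission factor. The traffic volumes, link lengths, and emission factors are placeholders, not values from the study.

        # Minimal link-based VKT and emission sketch (all numbers are placeholders).
        links = [
            # (daily traffic volume [veh/day], link length [km], travel speed [km/h])
            (12_000, 1.8, 45.0),
            (35_000, 3.2, 70.0),
            ( 8_500, 0.9, 25.0),
        ]

        def nox_emission_factor(speed_kmh):
            """Hypothetical speed-dependent NOx emission factor [g/km] for diesel trucks."""
            return 2.5 if speed_kmh < 30 else 1.6 if speed_kmh < 60 else 1.2

        vkt = sum(volume * length for volume, length, _ in links)          # veh-km/day
        nox = sum(volume * length * nox_emission_factor(speed)             # g/day
                  for volume, length, speed in links)
        print(f"VKT = {vkt:,.0f} veh-km/day, NOx = {nox / 1e3:,.1f} kg/day")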

  15. Estimating the system price of redox flow batteries for grid storage

    NASA Astrophysics Data System (ADS)

    Ha, Seungbum; Gallagher, Kevin G.

    2015-11-01

    Low-cost energy storage systems are required to support extensive deployment of intermittent renewable energy on the electricity grid. Redox flow batteries have potential advantages in meeting the stringent cost target for grid applications as compared to more traditional batteries based on an enclosed architecture. However, the manufacturing process, and therefore the potential high-volume production price, of redox flow batteries is largely unquantified. We present a comprehensive assessment of a prospective production process for an aqueous all-vanadium flow battery and a nonaqueous lithium polysulfide flow battery. The estimated investment and variable costs are translated to fixed expenses, profit, and warranty as a function of production volume. When compared to lithium-ion batteries, redox flow batteries are estimated to exhibit lower costs of manufacture, here calculated as the unit price less materials costs, owing to their simpler reactor (cell) design, lower required area, and thus simpler manufacturing process. Redox flow batteries are also projected to achieve the majority of manufacturing scale benefits at lower production volumes as compared to lithium-ion. However, this advantage is offset by the dramatically lower present production volume of flow batteries compared to competitive technologies such as lithium-ion.
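
    A toy unit-price model in the spirit of the assessment described above, showing how fixed expenses spread over annual production volume drive the scale benefits. All cost figures and parameter names are assumptions for illustration, not the paper's estimates.

        def unit_price_usd_per_kwh(annual_volume_kwh,
                                   materials=120.0,          # $/kWh, variable (materials) cost
                                   capital_investment=50e6,  # $ plant investment
                                   annual_fixed=10e6,        # $ labor, overhead, etc.
                                   depreciation_years=10,
                                   margin=0.10):             # profit + warranty fraction
            """Toy estimate of system price vs. production volume (all values assumed)."""
            fixed_per_kwh = (capital_investment / depreciation_years + annual_fixed) / annual_volume_kwh
            return (materials + fixed_per_kwh) * (1.0 + margin)

        for volume in (5e6, 50e6, 500e6):   # kWh shipped per year
            print(f"{volume:>12,.0f} kWh/yr -> {unit_price_usd_per_kwh(volume):6.1f} $/kWh")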

  16. Image registration and analysis for quantitative myocardial perfusion: application to dynamic circular cardiac CT.

    PubMed

    Isola, A A; Schmitt, H; van Stevendaal, U; Begemann, P G; Coulon, P; Boussel, L; Grass, M

    2011-09-21

    Large area detector computed tomography systems with fast rotating gantries enable volumetric dynamic cardiac perfusion studies. Prospectively ECG-triggered acquisitions limit the data acquisition to a predefined cardiac phase and thereby reduce x-ray dose and limit motion artefacts. Even in the case of highly accurate prospective triggering and a stable heart rate, spatial misalignment of the cardiac volumes acquired and reconstructed per cardiac cycle may occur due to small motion pattern variations from cycle to cycle. These misalignments reduce the accuracy of the quantitative analysis of myocardial perfusion parameters on a per-voxel basis. An image-based solution to this problem is elastic 3D image registration of dynamic volume sequences with variable contrast, as introduced in this contribution. After circular cone-beam CT reconstruction of cardiac volumes covering large areas of the myocardial tissue, the complete series is aligned with respect to a chosen reference volume. The results of the registration process and the perfusion analysis with and without registration are evaluated quantitatively in this paper. The spatial alignment leads to improved quantification of myocardial perfusion for three different pig data sets.

  17. Survey of Knowledge Representation and Reasoning Systems

    DTIC Science & Technology

    2009-07-01

    processing large volumes of unstructured information such as natural language documents, email, audio, images and video [Ferrucci et al. 2006]. Using this...information we hope to obtain improved estimation and prediction, data-mining, social network analysis, and semantic search and visualisation. Knowledge...

  18. IN SITU AND SOIL DECONTAMINATION BY RADIO FREQUENCY HEATING

    EPA Science Inventory

    In situ radio frequency heating is performed by applying electromagnetic energy in the radio frequency band to an array of electrodes placed in bore holes drilled through the contaminated soil. he process removes organic contaminants from large volumes of soil by volatilization, ...

  19. Total Quality Management Guide. A Two Volume Guide for Defense Organizations. Volume 1. Key Features of the DoD Implementation

    DTIC Science & Technology

    1990-02-15

    DoD 5000.51-G, Final Draft, 2/15/90. [Title page and foreword: Total Quality Management Guide - Key Features of the DoD Implementation.] Government and industry...away with all government inspectors. Rather, government oversight will change from the large-scale product inspection and specifying the "how to"..."best in class" - Set the course for the future, and - Provide a baseline for measuring progress. Benchmarking is a continuous process of comparing an...

  20. 1H-NMR and HPLC studies of the changes involved in volume regulation in the muscle fibres of the crab, Hemigrapsus edwardsi.

    PubMed

    Bedford, J J; Smith, R A; Thomas, M; Leader, J P

    1991-01-01

    1. The process of cell volume readjustment, during adaptation to salinity changes, in muscle fibres of the euryhaline New Zealand shore crab, Hemigrapsus edwardsi, involves large changes in the amounts of free amino acid. 2. These are taurine, proline, alanine, arginine, glutamic acid, glycine and serine. 3. These changes may be quantified by High Performance Liquid Chromatography, and qualitatively demonstrated by proton nuclear magnetic resonance spectroscopy.

  1. Two stage hydrolysis of corn stover at high solids content for mixing power saving and scale-up applications.

    PubMed

    Liu, Ke; Zhang, Jian; Bao, Jie

    2015-11-01

    A two-stage hydrolysis of corn stover was designed to resolve the conflict between sufficient mixing at high solids content and the high power input encountered in large-scale bioreactors. The process starts with a quick liquefaction to convert solid cellulose to a liquid slurry under strong mixing in small reactors, followed by a comprehensive hydrolysis to complete saccharification into fermentable sugars in large reactors without agitation apparatus. 60% of the mixing energy consumption was saved by removing the mixing apparatus in large-scale vessels. The scale-up ratio was small for the first-step hydrolysis reactors because of the reduced reactor volume. For the large saccharification reactors in the second step, the scale-up was easy because no mixing mechanism was involved. This two-stage hydrolysis is applicable to either simple hydrolysis or combined fermentation processes. The method provides a practical process option for industrial-scale biorefinery processing of lignocellulosic biomass. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Large-volume constant-concentration sampling technique coupling with surface-enhanced Raman spectroscopy for rapid on-site gas analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke

    2017-08-01

    In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell, including a rigid sample container and a flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC sampling technique allowed a large, alterable and well-controlled sampling volume, which kept the concentration of the gas target in the headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged in one step using bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2 respectively, which made the LVCC sampling technique conveniently adaptable to subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied for rapid analysis of trace ethylene and SO2 from fruits. Trace ethylene and SO2 from real fruit samples could be accurately quantified by this method. The minor concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process were proved to be < 4.3% and 2.1% respectively. Good recoveries for ethylene and sulfur dioxide from fruit samples were achieved in the range of 95.0-101% and 97.0-104% respectively. It is expected that the portable LVCC sampling technique will pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS.

  3. Large-volume constant-concentration sampling technique coupling with surface-enhanced Raman spectroscopy for rapid on-site gas analysis.

    PubMed

    Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke

    2017-08-05

    In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell, including a rigid sample container and a flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC sampling technique allowed a large, alterable and well-controlled sampling volume, which kept the concentration of the gas target in the headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged in one step using bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2 respectively, which made the LVCC sampling technique conveniently adaptable to subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied for rapid analysis of trace ethylene and SO2 from fruits. Trace ethylene and SO2 from real fruit samples could be accurately quantified by this method. The minor concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process were proved to be <4.3% and 2.1% respectively. Good recoveries for ethylene and sulfur dioxide from fruit samples were achieved in the range of 95.0-101% and 97.0-104% respectively. It is expected that the portable LVCC sampling technique will pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Atmospheric Gaseous Plasma with Large Dimensions

    NASA Astrophysics Data System (ADS)

    Korenev, Sergey

    2012-10-01

    Atmospheric plasma with large dimensions is typically formed by electrical discharge using the dielectric barrier discharge (DBD). Studies of atmospheric DBD have shown some problems related to homogeneous volume plasma, whose volume is determined by the cross section and the gas gap between the electrode and the dielectric. The use of CW relativistic electron beams for volume ionization of air molecules has shown the high efficiency of this process [1, 2]; the main advantage of this approach is that gas molecules are ionized by electrons in the longitudinal direction, determined by their kinetic energy. A novel method for forming atmospheric homogeneous plasma with large volume dimensions, using ionization of gas molecules by pulsed non-relativistic electron beams, is presented in the paper. The results of computer modeling of delivered electron-beam doses in gases and of the resulting ionization are discussed. The structure of the experimental bench with plasma diagnostics is considered. Preliminary results of forming atmospheric plasma by ionizing gas molecules with a pulsed nanosecond non-relativistic electron beam are given, and an analysis of potential applications for atmospheric volume plasma is presented. References: [1] S. Korenev, "The ionization of air by scanning relativistic high power CW electron beam," 2002 IEEE International Conference on Plasma Science, May 2002, Alberta, Canada. [2] S. Korenev, I. Korenev, "The propagation of high power CW scanning electron beam in air," BEAMS 2002: 14th International Conference on High-Power Particle Beams, Albuquerque, New Mexico (USA), June 2002, AIP Conference Proceedings Vol. 650(1), pp. 373-376.

  5. Serial Section Scanning Electron Microscopy (S3EM) on Silicon Wafers for Ultra-Structural Volume Imaging of Cells and Tissues

    PubMed Central

    Horstmann, Heinz; Körber, Christoph; Sätzler, Kurt; Aydin, Daniel; Kuner, Thomas

    2012-01-01

    High resolution, three-dimensional (3D) representations of cellular ultrastructure are essential for structure function studies in all areas of cell biology. While limited subcellular volumes have been routinely examined using serial section transmission electron microscopy (ssTEM), complete ultrastructural reconstructions of large volumes, entire cells or even tissue are difficult to achieve using ssTEM. Here, we introduce a novel approach combining serial sectioning of tissue with scanning electron microscopy (SEM) using a conductive silicon wafer as a support. Ribbons containing hundreds of 35 nm thick sections can be generated and imaged on the wafer at a lateral pixel resolution of 3.7 nm by recording the backscattered electrons with the in-lens detector of the SEM. The resulting electron micrographs are qualitatively comparable to those obtained by conventional TEM. S3EM images of the same region of interest in consecutive sections can be used for 3D reconstructions of large structures. We demonstrate the potential of this approach by reconstructing a 31.7 µm3 volume of a calyx of Held presynaptic terminal. The approach introduced here, Serial Section SEM (S3EM), for the first time provides the possibility to obtain 3D ultrastructure of large volumes with high resolution and to selectively and repetitively home in on structures of interest. S3EM accelerates process duration, is amenable to full automation and can be implemented with standard instrumentation. PMID:22523574

  6. Serial section scanning electron microscopy (S3EM) on silicon wafers for ultra-structural volume imaging of cells and tissues.

    PubMed

    Horstmann, Heinz; Körber, Christoph; Sätzler, Kurt; Aydin, Daniel; Kuner, Thomas

    2012-01-01

    High resolution, three-dimensional (3D) representations of cellular ultrastructure are essential for structure function studies in all areas of cell biology. While limited subcellular volumes have been routinely examined using serial section transmission electron microscopy (ssTEM), complete ultrastructural reconstructions of large volumes, entire cells or even tissue are difficult to achieve using ssTEM. Here, we introduce a novel approach combining serial sectioning of tissue with scanning electron microscopy (SEM) using a conductive silicon wafer as a support. Ribbons containing hundreds of 35 nm thick sections can be generated and imaged on the wafer at a lateral pixel resolution of 3.7 nm by recording the backscattered electrons with the in-lens detector of the SEM. The resulting electron micrographs are qualitatively comparable to those obtained by conventional TEM. S(3)EM images of the same region of interest in consecutive sections can be used for 3D reconstructions of large structures. We demonstrate the potential of this approach by reconstructing a 31.7 µm(3) volume of a calyx of Held presynaptic terminal. The approach introduced here, Serial Section SEM (S(3)EM), for the first time provides the possibility to obtain 3D ultrastructure of large volumes with high resolution and to selectively and repetitively home in on structures of interest. S(3)EM accelerates process duration, is amenable to full automation and can be implemented with standard instrumentation.

  7. High-frequency strontium vapor laser for biomedical applications

    NASA Astrophysics Data System (ADS)

    Hvorostovsky, A.; Kolmakov, E.; Kudashev, I.; Redka, D.; Kancer, A.; Kustikova, M.; Bykovskaya, E.; Mayurova, A.; Stupnikov, A.; Ruzankina, J.; Tsvetkov, K.; Lukyanov, N.; Paklinov, N.

    2018-02-01

    The Sr-laser, with its high pulse repetition rate and high peak radiation power, is a unique tool for studying rapidly occurring processes (plasma diagnostics, photoablation, etc.). In addition, studying the frequency characteristics of the laser's active medium helps to reveal the physics of the formation of an inverted medium in metal vapor lasers. In this paper, an experimental study of an Sr-laser with an active volume of 5.8 cm³ is carried out in the pulse repetition frequency range from 25 to 200 kHz, and a comparison with the frequency characteristics of media with large active volumes is given. We considered the frequency characteristics of the active medium in two modes: at a constant excitation pulse energy CU²/2 and at a constant average power consumed by the rectifier. In this work, lasing in strontium vapor was obtained at a pulse repetition frequency of 200 kHz with a small-volume GRT using the TASITR-5/12 tasitron switch. The behavior of the 6.456 μm, 1 μm, and 3 μm generation lines at increased repetition frequencies is considered. Using large-volume GRTs as an example, it is shown that tubes with a large active volume improve their energy characteristics as the pulse repetition frequency grows. The possibility of laser operation at pulse repetition rates above 200 kHz is shown.

  8. Digital tissue and what it may reveal about the brain.

    PubMed

    Morgan, Josh L; Lichtman, Jeff W

    2017-10-30

    Imaging as a means of scientific data storage has evolved rapidly over the past century from hand drawings, to photography, to digital images. Only recently can sufficiently large datasets be acquired, stored, and processed such that tissue digitization can actually reveal more than direct observation of tissue. One field where this transformation is occurring is connectomics: the mapping of neural connections in large volumes of digitized brain tissue.

  9. Effects of solution volume on hydrogen production by pulsed spark discharge in ethanol solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xin, Y. B.; Sun, B., E-mail: sunb88@dlmu.edu.cn; Zhu, X. M.

    2016-07-15

    Hydrogen production from ethanol solution (ethanol/water) by pulsed spark discharge was optimized by varying the volume of ethanol solution (liquid volume). Hydrogen yield initially increased and then decreased with increasing solution volume, reaching 1.5 l/min at a solution volume of 500 ml. The characteristics of the pulsed spark discharge were studied in this work; the results showed that the intensity of the peak current, the rate of current rise, and the energy efficiency of hydrogen production can be changed by varying the volume of ethanol solution. Meanwhile, a mechanism analysis of hydrogen production was accomplished by monitoring the process of hydrogen production and the state of free radicals. The analysis showed that decreasing the retention time of gas production and properly increasing the volume of ethanol solution can enhance the hydrogen yield. Through this research, a high-yield and large-scale method of hydrogen production can be achieved, which is more suitable for industrial application.

  10. Isolation of organic acids from large volumes of water by adsorption on macroporous resins

    USGS Publications Warehouse

    Aiken, George R.; Suffet, I.H.; Malaiyandi, Murugan

    1987-01-01

    Adsorption on synthetic macroporous resins, such as the Amberlite XAD series and Duolite A-7, is routinely used to isolate and concentrate organic acids from large volumes of water. Samples as large as 24,500 L have been processed on site by using these resins. Two established extraction schemes using XAD-8 and Duolite A-7 resins are described. The choice of the appropriate resin and extraction scheme is dependent on the organic solutes of interest. The factors that affect resin performance, selectivity, and capacity for a particular solute are solution pH, resin surface area and pore size, and resin composition. The logistical problems of sample handling, filtration, and preservation are also discussed.

  11. Construction-grade plywood from grade 3 Appalachian oak

    Treesearch

    E. Paul Craft; E. Paul Craft

    1970-01-01

    In an effort to find a use for a large volume of the low-grade oak logs of Appalachia, we processed a sample of these logs in a typical southern-pine sheathing plant and determined the feasibility of converting them into construction grade plywood.

  12. Preliminary Evaluation of a Diagnostic Tool for Prosthetics

    DTIC Science & Technology

    2017-10-01

    volume change. Processing algorithms for data from the activity monitors were modified to run more efficiently so that large datasets could be... [Figure captions: cylindrical (left) and blade style (right) prostheses; Figure 4: Ankle ActiGraph correct position demonstrated for a left-leg below-knee amputee.]

  13. MOXIE, ISRU, and the History of In Situ Studies of the Hazards of Dust in Human Exploration of Mars

    NASA Astrophysics Data System (ADS)

    Hecht, M. H.; McClean, J. B.; Pike, W. T.; Smith, P. H.; Madsen, M. B.; Rapp, D.; Moxie Team

    2017-06-01

    The upcoming MOXIE experiment will be the first to ingest large volumes of dust-laden martian atmosphere for processing, and will serve as a test case for translating our understanding into mitigation practices.

  14. EVALUATION OF SOLID ADSORBENTS FOR THE COLLECTION AND ANALYSES OF AMBIENT BIOGENIC VOLATILE ORGANICS

    EPA Science Inventory

    Micrometeorological flux measurements of biogenic volatile organic compounds (BVOCs) usually require that large volumes of air be collected (whole air samples) or focused during the sampling process (cryogenic trapping or gas-solid partitioning on adsorbents) in order to achiev...

  15. Bootstrapping Least Squares Estimates in Biochemical Reaction Networks

    PubMed Central

    Linder, Daniel F.

    2015-01-01

    The paper proposes new computational methods of computing confidence bounds for the least squares estimates (LSEs) of rate constants in mass-action biochemical reaction networks and stochastic epidemic models. Such LSEs are obtained by fitting the set of deterministic ordinary differential equations (ODEs), corresponding to the large volume limit of a reaction network, to the network's partially observed trajectory treated as a continuous-time, pure jump Markov process. In the large volume limit the LSEs are asymptotically Gaussian, but their limiting covariance structure is complicated since it is described by a set of nonlinear ODEs which are often ill-conditioned and numerically unstable. The current paper considers two bootstrap Monte-Carlo procedures, based on the diffusion and linear noise approximations for pure jump processes, which allow one to avoid solving the limiting covariance ODEs. The results are illustrated with both in-silico and real data examples from the LINE 1 gene retrotranscription model and compared with those obtained using other methods. PMID:25898769
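
    A minimal sketch of the bootstrap idea for a single mass-action reaction (first-order decay): the rate constant is fitted by least squares to the deterministic ODE solution, and confidence bounds are obtained by refitting trajectories resampled from a diffusion (Euler-Maruyama) approximation. This illustrates the general approach under assumed parameter values; it is not the paper's implementation or its LNA-based variant.

        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(1)
        VOLUME, K_TRUE, X0 = 100.0, 0.3, 50.0          # system size, rate, initial conc.
        t_obs = np.linspace(0.0, 10.0, 40)

        def ode_solution(t, k):                         # deterministic (large-volume) limit
            return X0 * np.exp(-k * t)

        def simulate_diffusion(k, dt=0.01):
            """Euler-Maruyama simulation of dx = -k x dt + sqrt(k x / V) dW."""
            t_grid = np.arange(0.0, t_obs[-1] + dt, dt)
            x = np.empty_like(t_grid)
            x[0] = X0
            for i in range(1, len(t_grid)):
                drift = -k * x[i - 1]
                noise = np.sqrt(max(k * x[i - 1], 0.0) / VOLUME * dt) * rng.normal()
                x[i] = max(x[i - 1] + drift * dt + noise, 0.0)
            return np.interp(t_obs, t_grid, x)

        def lse(trajectory):
            (k_hat,), _ = curve_fit(ode_solution, t_obs, trajectory, p0=[0.1])
            return k_hat

        observed = simulate_diffusion(K_TRUE)           # stand-in for observed data
        k_hat = lse(observed)
        boot = [lse(simulate_diffusion(k_hat)) for _ in range(200)]
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"LSE k = {k_hat:.3f}, 95% bootstrap CI = ({lo:.3f}, {hi:.3f})")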

  16. Controllable construction of flower-like FeS/Fe2O3 composite for lithium storage

    NASA Astrophysics Data System (ADS)

    Wang, Jie; He, Huan; Wu, Zexing; Liang, Jianing; Han, Lili; Xin, Huolin L.; Guo, Xuyun; Zhu, Ye; Wang, Deli

    2018-07-01

    Transition metal sulfides/oxides have been considered as promising anode candidates for next-generation lithium-ion batteries (LIBs) due to their high theoretical capacities. However, the large volume change during the lithiation/delithiation process and poor electronic conductivity often result in poor charging/discharging performance. Herein, we design a flower-like FeS/Fe2O3 composite via a simple "solvothermal-oxidation" method, in which the Fe2O3 is mostly distributed on the surface of the flower. The unique porous structure and synergistic effect between FeS and Fe2O3 not only accommodate the large volume expansion, but also facilitate Li-ion and electron transport. The Fe2O3 shell effectively reduces the dissolution of Li2Sx during the discharge/charge process. When serving as the anode material in a lithium-ion battery, FeS/Fe2O3 exhibits superior specific capacity, rate capability and cycling stability compared with pure FeS and Fe2O3.

  17. Nanocrystalline Iron-Ore-Based Catalysts for Fischer-Tropsch Synthesis.

    PubMed

    Yong, Seok; Park, Ji Chan; Lee, Ho-Tae; Yang, Jung-Il; Hong, SungJun; Jung, Heon; Chun, Dong Hyun

    2016-02-01

    Nanocrystalline iron ore particles were fabricated by a wet-milling process using an Ultra Apex Mill, after which they were used as raw materials of iron-based catalysts for low-temperature Fischer-Tropsch synthesis (FTS) below 280 °C, which usually requires catalysts with a high surface area, a large pore volume, and a small crystallite size. The wet-milling process using the Ultra Apex Mill effectively destroyed the initial crystallite structure of the natural iron ores of several tens to hundreds of nanometers in size, resulting in the generation of nanocrystalline iron ore particles with a high surface area and a large pore volume. The iron-ore-based catalysts prepared from the nanocrystalline iron ore particles effectively catalyzed the low-temperature FTS, displaying a high CO conversion (about 90%) and good C5+ hydrocarbon productivity (about 0.22 g/(gcat·h)). This demonstrates the feasibility of using the iron-ore-based catalysts as inexpensive and disposable catalysts for the low-temperature FTS.

  18. Different brains process numbers differently: structural bases of individual differences in spatial and nonspatial number representations.

    PubMed

    Krause, Florian; Lindemann, Oliver; Toni, Ivan; Bekkering, Harold

    2014-04-01

    A dominant hypothesis on how the brain processes numerical size proposes a spatial representation of numbers as positions on a "mental number line." An alternative hypothesis considers numbers as elements of a generalized representation of sensorimotor-related magnitude, which is not obligatorily spatial. Here we show that individuals' relative use of spatial and nonspatial representations has a cerebral counterpart in the structural organization of the posterior parietal cortex. Interindividual variability in the linkage between numbers and spatial responses (faster left responses to small numbers and right responses to large numbers; spatial-numerical association of response codes effect) correlated with variations in gray matter volume around the right precuneus. Conversely, differences in the disposition to link numbers to force production (faster soft responses to small numbers and hard responses to large numbers) were related to gray matter volume in the left angular gyrus. This finding suggests that numerical cognition relies on multiple mental representations of analogue magnitude using different neural implementations that are linked to individual traits.

  19. Manufactured caverns in carbonate rock

    DOEpatents

    Bruce, David A.; Falta, Ronald W.; Castle, James W.; Murdoch, Lawrence C.

    2007-01-02

    Disclosed is a process for manufacturing underground caverns suitable in one embodiment for storage of large volumes of gaseous or liquid materials. The method is an acid dissolution process that can be utilized to form caverns in carbonate rock formations. The caverns can be used to store large quantities of materials near transportation facilities or destination markets. The caverns can be used for storage of materials including fossil fuels, such as natural gas, refined products formed from fossil fuels, or waste materials, such as hazardous waste materials. The caverns can also be utilized for applications involving human access such as recreation or research. The method can also be utilized to form calcium chloride as a by-product of the cavern formation process.

  20. Quantifying large-scale historical formation of accommodation in the Mississippi Delta

    USGS Publications Warehouse

    Morton, Robert A.; Bernier, Julie C.; Kelso, Kyle W.; Barras, John A.

    2010-01-01

    Large volumes of new accommodation have formed within the Mississippi Delta plain since the mid-1950s in association with rapid conversion of coastal wetlands to open water. The three-dimensional aspects and processes responsible for accommodation formation were quantified by comparing surface elevations, water depths, and vertical displacements of stratigraphic contacts that were correlated between short sediment cores. Integration of data from remotely sensed images, sediment cores, and water-depth surveys at 10 geologically diverse areas in the delta plain provided a basis for estimating the total volume of accommodation formed by interior-wetland subsidence and subsequent erosion. Results indicate that at most of the study areas subsidence was a greater contributor than erosion to the formation of accommodation associated with wetland loss. Tens of millions of cubic meters of accommodation formed rapidly at each of the large open-water bodies that were formerly continuous interior delta-plain marsh. Together the individual study areas account for more than 440 × 10⁶ m³ of new accommodation that formed as holes in the Mississippi River delta-plain fabric between 1956 and 2004. This large volume provides an estimate of the new sediment that would be needed just at the study areas to restore the delta-plain wetlands to their pre-1956 areal extent and elevations.

  1. CLARA: CLAS12 Reconstruction and Analysis Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gyurjyan, Vardan; Matta, Sebastian Mancilla; Oyarzun, Ricardo

    2016-11-01

    In this paper we present the SOA-based CLAS12 event Reconstruction and Analyses (CLARA) framework. The CLARA design focuses on two main traits: real-time data stream processing, and service-oriented architecture (SOA) in a flow-based programming (FBP) paradigm. The data-driven and data-centric architecture of CLARA provides an environment for developing agile, elastic, multilingual data processing applications. The CLARA framework presents solutions capable of processing large volumes of data interactively and substantially faster than batch systems.
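
    CLARA itself is a multilingual SOA framework; purely as a loose illustration of the flow-based, data-centric idea (independent services that each transform an event and are chained into an application over a data stream), a toy sketch is given below. All service names and data are hypothetical.

        from typing import Callable, Iterable, Iterator

        Service = Callable[[dict], dict]     # each service transforms one event (data-centric)

        def compose(*services: Service) -> Service:
            """Chain services into a single flow-based application."""
            def chained(event: dict) -> dict:
                for service in services:
                    event = service(event)
                return event
            return chained

        def stream(events: Iterable[dict], application: Service) -> Iterator[dict]:
            """Push each event of the data stream through the composed application."""
            for event in events:
                yield application(event)

        # toy "reconstruction" services (hypothetical)
        decode = lambda ev: {**ev, "hits": ev["raw"] * 2}
        track = lambda ev: {**ev, "tracks": ev["hits"] // 3}
        app = compose(decode, track)

        for out in stream(({"raw": n} for n in range(3)), app):
            print(out)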

  2. A Distributed GPU-Based Framework for Real-Time 3D Volume Rendering of Large Astronomical Data Cubes

    NASA Astrophysics Data System (ADS)

    Hassan, A. H.; Fluke, C. J.; Barnes, D. G.

    2012-05-01

    We present a framework to volume-render three-dimensional data cubes interactively using distributed ray-casting and volume-bricking over a cluster of workstations powered by one or more graphics processing units (GPUs) and a multi-core central processing unit (CPU). The main design target for this framework is to provide an in-core visualization solution able to provide three-dimensional interactive views of terabyte-sized data cubes. We tested the presented framework using a computing cluster comprising 64 nodes with a total of 128 GPUs. The framework proved to be scalable to render a 204 GB data cube with an average of 30 frames per second. Our performance analyses also compare the use of NVIDIA Tesla 1060 and 2050 GPU architectures and the effect of increasing the visualization output resolution on the rendering performance. Although our initial focus, as shown in the examples presented in this work, is volume rendering of spectral data cubes from radio astronomy, we contend that our approach has applicability to other disciplines where close to real-time volume rendering of terabyte-order three-dimensional data sets is a requirement.
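
    The sketch below illustrates the volume-bricking idea in a serial NumPy form: the data cube is split into bricks along the viewing axis, each brick is ray-cast with a simple orthographic front-to-back emission-absorption integration, and the per-brick partial results are composited in depth order. In the framework described above this per-brick work is what gets distributed across GPUs; the GPU kernels, perspective camera, and transfer functions are not reproduced here, and all parameters are illustrative.

        import numpy as np

        def render_brick(brick, absorption=0.02):
            """Front-to-back compositing of one brick along axis 0 (orthographic rays).
            Returns premultiplied intensity C and accumulated opacity A per pixel."""
            C = np.zeros(brick.shape[1:])
            A = np.zeros(brick.shape[1:])
            for z in range(brick.shape[0]):
                a = 1.0 - np.exp(-absorption * brick[z])   # slice opacity
                C += (1.0 - A) * brick[z] * a
                A += (1.0 - A) * a
            return C, A

        def render_bricked_volume(volume, n_bricks=4):
            """Split the cube into bricks along the view axis and composite the
            per-brick results front to back; each brick could run on its own GPU/node."""
            C = np.zeros(volume.shape[1:])
            A = np.zeros(volume.shape[1:])
            for brick in np.array_split(volume, n_bricks, axis=0):
                Cb, Ab = render_brick(brick)
                C += (1.0 - A) * Cb
                A += (1.0 - A) * Ab
            return C

        cube = np.random.default_rng(0).random((64, 32, 32)).astype(np.float32)
        image = render_bricked_volume(cube)
        print(image.shape, float(image.max()))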

  3. The Cold Gas History of the Universe as seen by the ngVLA

    NASA Astrophysics Data System (ADS)

    Riechers, Dominik A.; Carilli, Chris Luke; Casey, Caitlin; da Cunha, Elisabete; Hodge, Jacqueline; Ivison, Rob; Murphy, Eric J.; Narayanan, Desika; Sargent, Mark T.; Scoville, Nicholas; Walter, Fabian

    2017-01-01

    The Next Generation Very Large Array (ngVLA) will fundamentally advance our understanding of the formation processes that lead to the assembly of galaxies throughout cosmic history. The combination of large bandwidth with unprecedented sensitivity to the critical low-level CO lines over virtually the entire redshift range will open up the opportunity to conduct large-scale, deep cold molecular gas surveys, mapping the fuel for star formation in galaxies over substantial cosmic volumes. Informed by the first efforts with the Karl G. Jansky Very Large Array (COLDz survey) and the Atacama Large (sub)Millimeter Array (ASPECS survey), we here present initial predictions and possible survey strategies for such "molecular deep field" observations with the ngVLA. These investigations will provide a detailed measurement of the volume density of molecular gas in galaxies as a function of redshift, the "cold gas history of the universe". This will crucially complement studies of the neutral gas, star formation and stellar mass histories with large low-frequency arrays, the Large UV/Optical/Infrared Surveyor, and the Origins Space Telescope, providing the means to obtain a comprehensive picture of galaxy evolution through cosmic times.

  4. How large is the typical subarachnoid hemorrhage? A review of current neurosurgical knowledge.

    PubMed

    Whitmore, Robert G; Grant, Ryan A; LeRoux, Peter; El-Falaki, Omar; Stein, Sherman C

    2012-01-01

    Despite the morbidity and mortality of subarachnoid hemorrhage (SAH), the average volume of a typical hemorrhage is not well defined. Animal models of SAH often do not accurately mimic the human disease process. The purpose of this study is to estimate the average SAH volume, allowing standardization of animal models of the disease. We performed a MEDLINE search of SAH volume and erythrocyte counts in human cerebrospinal fluid as well as for volumes of blood used in animal injection models of SAH, from 1956 to 2010. We polled members of the American Association of Neurological Surgeons (AANS) for estimates of typical SAH volume. Using quantitative data from the literature, we calculated the total volume of SAH as equal to the volume of blood clotted in basal cisterns plus the volume of dispersed blood in cerebrospinal fluid. The results of the AANS poll confirmed our estimates. The human literature yielded 322 publications and animal literature, 237 studies. Four quantitative human studies reported blood clot volumes ranging from 0.2 to 170 mL, with a mean of ∼20 mL. There was only one quantitative study reporting cerebrospinal fluid red blood cell counts from serial lumbar puncture after SAH. Dispersed blood volume ranged from 2.9 to 45.9 mL, and we used the mean of 15 mL for our calculation. Therefore, total volume of SAH equals 35 mL. The AANS poll yielded 176 responses, ranging from 2 to 350 mL, with a mean of 33.9 ± 4.4 mL. Based on our estimate of total SAH volume of 35 mL, animal injection models may now become standardized for more accurate portrayal of the human disease process. Copyright © 2012 Elsevier Inc. All rights reserved.
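
    Written out as a single sum, using the literature means quoted above, the estimate is:

        V_SAH = V_clot + V_dispersed ≈ 20 mL + 15 mL = 35 mL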

  5. D and D: Dismantling and Release of Large Components at the GNS Premises in Duisburg on the Example of a CASTOR S1 Container - 13536

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oehmigen, Steffen; Ambos, Frank

    In Germany there are many large metallic components used for the transport of radioactive waste. Some of these large components, such as the 82 Mg Castor S1, are so old that transport by road is no longer possible because the permit is no longer valid, and applying for a new permit is not economically reasonable. For this reason the large components need to be decontaminated and recycled so that they can be used again in the economic cycle. Decontamination of large components by cleaning/removing the surface, for example with beam technology, is a very time-consuming release procedure. The intention of this project was to manufacture a specialized machine for decontamination and creation of a new surface. The objective was to save interim storage and final repository volume and costs, as well as to develop a process that is nationally and internationally usable. 90% of the volume/mass of waste could be released and therefore possibly re-used. (authors)

  6. Temporal dynamics of online petitions

    PubMed Central

    Woolley-Meza, Olivia; Brockmann, Dirk

    2017-01-01

    Online petitions are an important avenue for direct political action, yet the dynamics that determine when a petition will be successful are not well understood. Here we analyze the temporal characteristics of online-petition signing behavior in order to identify systematic differences between popular petitions, which receive a high volume of signatures, and unpopular ones. We find that, in line with other temporal characterizations of human activity, the signing process is typically non-Poissonian and non-homogeneous in time. However, this process exhibits anomalously high memory for human activity, possibly indicating that synchronized external influence or contagion play an important role. More interestingly, we find clear differences in the characteristics of the inter-event time distributions depending on the total number of signatures that petitions receive, independently of the total duration of the petitions. Specifically, popular petitions that attract a large volume of signatures exhibit more variance in the distribution of inter-event times than unpopular petitions with only a few signatures, which could be considered an indication that the former are more bursty. However, petitions with large signature volume are less bursty according to measures that consider the time ordering of inter-event times. Our results, therefore, emphasize the importance of accounting for time ordering to characterize human activity. PMID:28542492
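
    As a small illustration of the temporal measures referred to above, the sketch below computes the standard burstiness parameter B = (σ − μ)/(σ + μ) of the inter-signature times and a memory coefficient M (the correlation between consecutive inter-event times), one common way to capture the time-ordering effects mentioned in the abstract. The timestamps are synthetic placeholders.

        import numpy as np

        def burstiness(inter_event):
            """B = (sigma - mu) / (sigma + mu); -1 regular, 0 Poissonian, +1 bursty."""
            mu, sigma = inter_event.mean(), inter_event.std()
            return (sigma - mu) / (sigma + mu)

        def memory(inter_event):
            """Correlation between consecutive inter-event times (time ordering matters)."""
            return np.corrcoef(inter_event[:-1], inter_event[1:])[0, 1]

        # placeholder signature timestamps (hours since the petition opened)
        timestamps = np.sort(np.random.default_rng(2).exponential(1.0, 500).cumsum())
        tau = np.diff(timestamps)
        print(f"B = {burstiness(tau):.2f}, M = {memory(tau):.2f}")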

  7. IN-SITU CR(VI) SOURCE AND PLUME TREATMENT USING A FERROUS IRON BASED REDUCTANT

    EPA Science Inventory

    A large volume of chromite ore processing residue (COPR) generated from ferrochrome production operations is present at the Macalloy Corporation Superfund site in Charleston, S.C. Groundwater hexavalent chromium (Cr(VI)) concentrations in the approximately 20 acre-foot COPR satu...

  8. How to Study History: The View from Sociology.

    ERIC Educational Resources Information Center

    Goldstone, Jack A.

    1986-01-01

    Reviews two recent books: Charles Tilly's 1985 work, "Big Structures, Large Processes, Huge Comparisons," and the 1984 volume edited by Theda Skocpol, "Vision and Method in Historical Sociology." Concludes that historians who still harbor negative images of historical sociologists would benefit by gaining a more accurate…

  9. CICS Region Virtualization for Cost Effective Application Development

    ERIC Educational Resources Information Center

    Khan, Kamal Waris

    2012-01-01

    Mainframe is used for hosting large commercial databases, transaction servers and applications that require a greater degree of reliability, scalability and security. Customer Information Control System (CICS) is a mainframe software framework for implementing transaction services. It is designed for rapid, high-volume online processing. In order…

  10. IN-SITU CR(VI) SOURCE AND PLUME TREATMENT USING A FERROUS IRON-BASED REDUCTANT

    EPA Science Inventory

    A large volume of chromite ore processing residue (COPR) generated from ferrochrome production operations is present at the Macalloy Corporation Superfund site in Charleston, S.C. Groundwater hexavalent chromium (Cr(VI)) concentrations in the approximately 20 acre-foot COPR sat...

  11. Quality of cucumbers commercially fermented in calcium chloride brine without sodium salts

    USDA-ARS?s Scientific Manuscript database

    Commercial cucumber fermentation produces large volumes of salty wastewater. This study evaluated the quality of fermented cucumbers produced commercially using an alternative calcium chloride brining process. Fermentation conducted in calcium brines (0.1M calcium chloride, 6mM potassium sorbate, eq...

  12. Novel Technology for Enrichment of Biomolecules from Cell-Free Body Fluids and Subsequent DNA Sizing.

    PubMed

    Patel, Vipulkumar; Celec, Peter; Grunt, Magdalena; Schwarzenbach, Heidi; Jenneckens, Ingo; Hillebrand, Timo

    2016-01-01

    Circulating cell-free DNA (ccfDNA) is a promising diagnostic tool and its size fractionation is of interest. However, kits for isolation of ccfDNA available on the market are designed for small volumes, hence processing large sample volumes is laborious. We have tested a new method that enables enrichment of ccfDNA from large volumes of plasma and subsequently allows size-fractionation of the isolated ccfDNA into two fractions with individually established cut-off levels of ccfDNA length. This method allows isolation of low-abundance DNA as well as separation of long and short DNA molecules. This procedure may be important, e.g., in prenatal diagnostics and cancer research, as already confirmed by our preliminary experiments. Here, we report the results of selective separation of 200- and 500-bp long synthetic DNA fragments spiked in plasma samples. Furthermore, we size-fractionated ccfDNA from the plasma of pregnant women and verified the prevalence of fetal ccfDNA in all fractions.

  13. Computational study of noise in a large signal transduction network.

    PubMed

    Intosalmi, Jukka; Manninen, Tiina; Ruohonen, Keijo; Linne, Marja-Leena

    2011-06-21

    Biochemical systems are inherently noisy due to the discrete reaction events that occur in a random manner. Although noise is often perceived as a disturbing factor, the system might actually benefit from it. In order to understand the role of noise better, its quality must be studied in a quantitative manner. Computational analysis and modeling play an essential role in this demanding endeavor. We implemented a large nonlinear signal transduction network combining protein kinase C, mitogen-activated protein kinase, phospholipase A2, and β isoform of phospholipase C networks. We simulated the network in 300 different cellular volumes using the exact Gillespie stochastic simulation algorithm and analyzed the results in both the time and frequency domain. In order to perform simulations in a reasonable time, we used modern parallel computing techniques. The analysis revealed that time and frequency domain characteristics depend on the system volume. The simulation results also indicated that there are several kinds of noise processes in the network, all of them representing different kinds of low-frequency fluctuations. In the simulations, the power of noise decreased on all frequencies when the system volume was increased. We concluded that basic frequency domain techniques can be applied to the analysis of simulation results produced by the Gillespie stochastic simulation algorithm. This approach is suited not only to the study of fluctuations but also to the study of pure noise processes. Noise seems to have an important role in biochemical systems and its properties can be numerically studied by simulating the reacting system in different cellular volumes. Parallel computing techniques make it possible to run massive simulations in hundreds of volumes and, as a result, accurate statistics can be obtained from computational studies. © 2011 Intosalmi et al; licensee BioMed Central Ltd.
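
    The simulations described above use the exact Gillespie stochastic simulation algorithm; the sketch below shows the direct method for a toy birth-death species with hypothetical rate constants (the actual network combining several signalling pathways is far larger).

    import numpy as np

    def gillespie_birth_death(k_prod, k_deg, x0, t_end, seed=None):
        # Direct-method SSA for one species X: 0 -> X (rate k_prod), X -> 0 (rate k_deg * X).
        rng = np.random.default_rng(seed)
        t, x = 0.0, x0
        times, counts = [t], [x]
        while t < t_end:
            a1, a2 = k_prod, k_deg * x      # reaction propensities
            a0 = a1 + a2
            if a0 == 0.0:
                break
            t += rng.exponential(1.0 / a0)  # exponentially distributed waiting time
            if rng.random() < a1 / a0:
                x += 1                      # production event
            else:
                x -= 1                      # degradation event
            times.append(t)
            counts.append(x)
        return np.array(times), np.array(counts)

    # Larger system volumes mean more molecules and relatively weaker fluctuations.
    t, x = gillespie_birth_death(k_prod=50.0, k_deg=0.1, x0=0, t_end=200.0, seed=0)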

  14. High-volume optical vortex multiplexing and de-multiplexing for free-space optical communication.

    PubMed

    Wang, Zhongxi; Zhang, N; Yuan, X-C

    2011-01-17

    We report an approach to increasing the number of signal channels in free-space optical communication based on composed optical vortices (OVs). In the encoding process, a conventional algorithm employed for the generation of collinearly superimposed OVs is combined with a genetic algorithm to achieve high-volume OV multiplexing. At the receiver end, a novel Dammann vortex grating is used to analyze the multihelix beams with a large number of OVs. We experimentally demonstrate a digitized system which is capable of transmitting and receiving 16 OV channels simultaneously. This system is expected to be compatible with a high-speed OV multiplexing technique, with the potential for extremely high-volume information density in OV communication.

  15. Design of a lamella settler for biomass recycling in continuous ethanol fermentation process.

    PubMed

    Tabera, J; Iznaola, M A

    1989-04-20

    The design and application of a settler to a continuous fermentation process with yeast recycle were studied. The compact lamella-type settler was chosen to avoid the large volumes associated with conventional settling tanks. A rationale of the design method is covered. The sedimentation area was determined by classical batch settling rate tests and sedimentation capacity calculation. Limitations on the residence time of the microorganisms in the settler, rather than sludge-thickening considerations, were the basis for the volume calculation. Fermentation rate tests with yeast after different sedimentation periods were carried out to define a suitable residence time. Continuous cell recycle fermentation runs, performed with the old and new sedimentation devices, show that the lamella settler improves biomass recycling efficiency, enabling the process to operate at higher sugar concentrations and faster dilution rates.

  16. Three-dimensional hollow-structured binary oxide particles as an advanced anode material for high-rate and long cycle life lithium-ion batteries

    DOE PAGES

    Wang, Deli; Wang, Jie; He, Huan; ...

    2015-12-30

    Transition metal oxides are among the most promising anode candidates for next-generation lithium-ion batteries because of their high theoretical capacity. However, large volume expansion and low lithium-ion diffusivity lead to poor charging/discharging performance. In this study, we developed a surfactant- and template-free strategy for the synthesis of a composite of CoxFe3-xO4 hollow spheres supported by carbon nanotubes via an impregnation–reduction–oxidation process. The synergy of the composite, as well as the hollow structures in the electrode materials, not only facilitates Li ion and electron transport, but also accommodates large volume expansion. Using state-of-the-art electron tomography, we directly visualize the particles in 3-D, where the voids in the hollow structures serve to buffer the volume expansion of the material. These improvements result in a high reversible capacity as well as an outstanding rate performance for lithium-ion battery applications. As a result, this study sheds light on large-scale production of hollow structured metal oxides for commercial applications in energy storage and conversion.

  17. High Productivity DRIE solutions for 3D-SiP and MEMS Volume Manufacturing

    NASA Astrophysics Data System (ADS)

    Puech, M.; Thevenoud, JM; Launay, N.; Arnal, N.; Godinat, P.; Andrieu, B.; Gruffat, JM

    2006-04-01

    Emerging 3D-SiP technologies and high volume MEMS applications require high productivity mass production DRIE systems. The Alcatel DRIE product range has recently been optimised to reach the highest process and hardware production performances. A study based on sub-micron high aspect ratio structures encountered in the most stringent 3D-SiP has been carried out. The optimization of the Bosch process parameters has resulted in ultra-high silicon etch rates, with unrivalled uniformity and repeatability, leading to excellent process performance. In parallel, recent hardware and proprietary design optimizations, including the vacuum pumping lines, process chamber, wafer chucks, pressure control system, and gas delivery, are discussed. These improvements have been monitored in a mass production environment for a mobile phone application. Field data analysis shows a significant reduction of cost of ownership thanks to increased throughput and much lower running costs. These benefits are now available for all 3D-SiP and high volume MEMS applications. The typical etched patterns include tapered trenches for CMOS imagers, through silicon via holes for die stacking, well controlled profile angles for 3D high precision inertial sensors, and large exposed area features for inkjet printer heads and silicon microphones.

  18. Scalable and Interactive Segmentation and Visualization of Neural Processes in EM Datasets

    PubMed Central

    Jeong, Won-Ki; Beyer, Johanna; Hadwiger, Markus; Vazquez, Amelio; Pfister, Hanspeter; Whitaker, Ross T.

    2011-01-01

    Recent advances in scanning technology provide high resolution EM (Electron Microscopy) datasets that allow neuroscientists to reconstruct complex neural connections in a nervous system. However, due to the enormous size and complexity of the resulting data, segmentation and visualization of neural processes in EM data is usually a difficult and very time-consuming task. In this paper, we present NeuroTrace, a novel EM volume segmentation and visualization system that consists of two parts: a semi-automatic multiphase level set segmentation with 3D tracking for reconstruction of neural processes, and a specialized volume rendering approach for visualization of EM volumes. It employs view-dependent on-demand filtering and evaluation of a local histogram edge metric, as well as on-the-fly interpolation and ray-casting of implicit surfaces for segmented neural structures. Both methods are implemented on the GPU for interactive performance. NeuroTrace is designed to be scalable to large datasets and data-parallel hardware architectures. A comparison of NeuroTrace with a commonly used manual EM segmentation tool shows that our interactive workflow is faster and easier to use for the reconstruction of complex neural processes. PMID:19834227

  19. Rapid and automated processing of bone marrow grafts without Ficoll density gradient for transplantation of cryopreserved autologous or ABO-incompatible allogeneic bone marrow.

    PubMed

    Schanz, U; Gmür, J

    1992-12-01

    The growing number of BMTs has increased interest in safe and standardized in vitro bone marrow processing techniques. We describe our experience with a rapid automated method for the isolation of mononuclear cells (MNC) from large volumes of bone marrow using a Fenwal CS-3000 cell separator without employing density gradient materials. Forty bone marrow harvests with a mean volume of 1650 +/- 307 ml were processed. A mean of 75 +/- 34% (50 percentile range 54-94%) of the original MNCs were recovered in a volume of 200 ml with only 4 +/- 2% of the starting red blood cells (RBC). Removal of granulocytes, immature myeloid precursors and platelets proved to be sufficient to permit safe cryopreservation and successful autologous BMT (n = 25). Allogeneic BMT (n = 14, including three major ABO-incompatible) could be performed without additional manipulation. In both groups of patients timely and stable engraftment comparable to historical controls receiving Ficoll gradient processed autologous (n = 17) or unprocessed allogeneic BMT (n = 54) was observed. Moreover, 70 +/- 14% of the RBC could be recovered from the grafts. They were used for autologous RBC support of donors, rendering unnecessary autologous blood pre-donations.

  20. Discrete element modeling of the mass movement and loose material supplying the gully process of a debris avalanche in the Bayi Gully, Southwest China

    NASA Astrophysics Data System (ADS)

    Zhou, Jia-wen; Huang, Kang-xin; Shi, Chong; Hao, Ming-hui; Guo, Chao-xu

    2015-03-01

    The dynamic process of a debris avalanche in mountainous areas is influenced by the landslide volume, topographical conditions, mass-material composition, mechanical properties and other factors. A good understanding of the mass movement and loose material supplying the gully process is very important for understanding the dynamic properties of debris avalanches. Three-dimensional particle flow code (PFC3D) was used to simulate a debris avalanche in Quaternary deposits at the Bayi Gully, Southwest China. FORTRAN and AutoCAD were used for the secondary development to display the mass movement process and to quantitatively describe the mass movement and loose material supplying the gully process. The simulated results show that after the landslide is initiated, the gravitational potential energy is converted into kinetic energy with a varying velocity for the sliding masses. Two stages exist for the average-movement velocity: the acceleration stage and the slowdown stage, which are influenced by the topographical conditions. For the loose materials supplying the gully process, the cumulative volume of the sliding masses into the gully gradually increases over time. When the landslide volume is not large enough, the increasing landslide volume does not obviously influence the movement process of the sliding masses. The travel distance and movement velocity increase with decreasing numerical parameters, and the mass-movement process finishes more quickly using low-value parameters. The deposition area of the sliding masses decreases with increasing numerical parameters and the corresponding deposition thickness increases. The mass movement of the debris avalanche is not only influenced by the mechanical parameters but is also controlled by the topographical conditions.

  1. Military Compensation: Past, Present and Future. Volume 1. Executive Summary.

    DTIC Science & Technology

    1976-01-01

    Chapter 3 provides an overview of the current military compensation system -- i.e., the military pay and allowances system. The major subsystems to...research efforts produced processes for control of shrinkage of wool fabrics. In the US textile industry, wool items are now treated by these processes...led to development of continuous dyeing. Modern dyeing facilities of large textile factories throughout the world trace their basic technology

  2. Digitally programmable microfluidic automaton for multiscale combinatorial mixing and sample processing†

    PubMed Central

    Jensen, Erik C.; Stockton, Amanda M.; Chiesl, Thomas N.; Kim, Jungkyu; Bera, Abhisek; Mathies, Richard A.

    2013-01-01

    A digitally programmable microfluidic Automaton consisting of a 2-dimensional array of pneumatically actuated microvalves is programmed to perform new multiscale mixing and sample processing operations. Large (µL-scale) volume processing operations are enabled by precise metering of multiple reagents within individual nL-scale valves followed by serial repetitive transfer to programmed locations in the array. A novel process exploiting new combining valve concepts is developed for continuous rapid and complete mixing of reagents in less than 800 ms. Mixing, transfer, storage, and rinsing operations are implemented combinatorially to achieve complex assay automation protocols. The practical utility of this technology is demonstrated by performing automated serial dilution for quantitative analysis as well as the first demonstration of on-chip fluorescent derivatization of biomarker targets (carboxylic acids) for microchip capillary electrophoresis on the Mars Organic Analyzer. A language is developed to describe how unit operations are combined to form a microfluidic program. Finally, this technology is used to develop a novel microfluidic 6-sample processor for combinatorial mixing of large sets (>26 unique combinations) of reagents. The digitally programmable microfluidic Automaton is a versatile programmable sample processor for a wide range of process volumes, for multiple samples, and for different types of analyses. PMID:23172232
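
    A microfluidic program of the kind described above can be represented as an ordered list of unit operations; the sketch below is a hypothetical encoding (operation names, valve coordinates, and the serial-dilution routine are illustrative assumptions, not the authors' actual language).

    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class Op:
        # One unit operation on the 2-D valve array (names are illustrative only).
        kind: str                # "meter", "mix", "transfer", or "rinse"
        source: Tuple[int, int]  # (row, col) of the source valve
        dest: Tuple[int, int]    # (row, col) of the destination valve
        cycles: int = 1          # repetitions, e.g. for serial transfer of larger volumes

    def serial_dilution(steps: int) -> List[Op]:
        # Build a program that repeatedly meters diluent, mixes, and passes the
        # dilution one column to the right -- an automated serial-dilution protocol.
        program: List[Op] = []
        for i in range(steps):
            program.append(Op("meter", source=(0, i), dest=(1, i)))
            program.append(Op("mix", source=(1, i), dest=(1, i), cycles=4))
            program.append(Op("transfer", source=(1, i), dest=(0, i + 1)))
        return program

    program = serial_dilution(steps=6)   # 6-step dilution series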

  3. Efficient and scalable ionization of neutral atoms by an orderly array of gold-doped silicon nanowires

    NASA Astrophysics Data System (ADS)

    Bucay, Igal; Helal, Ahmed; Dunsky, David; Leviyev, Alex; Mallavarapu, Akhila; Sreenivasan, S. V.; Raizen, Mark

    2017-04-01

    Ionization of atoms and molecules is an important step in many applications and processes, such as mass spectrometry. Ionization is typically accomplished by electron bombardment, and while it is scalable to large volumes, it is also very inefficient due to the small cross-section of electron-atom collisions. Photoionization methods can be highly efficient, but are not scalable due to the small ionization volume. Electric field ionization is accomplished using ultra-sharp conducting tips biased to a few kilovolts, but suffers from a low ionization volume and tip fabrication limitations. We report on our progress towards an efficient, robust, and scalable method of atomic and molecular ionization using orderly arrays of sharp, gold-doped silicon nanowires. As demonstrated in earlier work, the presence of the gold greatly enhances the ionization probability, which was attributed to an increase in available acceptor surface states. We present here a novel process used to fabricate the nanowire array, results of simulations aimed at optimizing the configuration of the array, and our progress towards demonstrating efficient and scalable ionization.

  4. Development of Solvent Extraction Approach to Recycle Enriched Molybdenum Material

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tkac, Peter; Brown, M. Alex; Sen, Sujat

    2016-06-01

    Argonne National Laboratory, in cooperation with Oak Ridge National Laboratory and NorthStar Medical Technologies, LLC, is developing a recycling process for a solution containing valuable Mo-100 or Mo-98 enriched material. Previously, Argonne had developed a recycle process using a precipitation technique. However, this process is labor intensive and can lead to production of large volumes of highly corrosive waste. This report discusses an alternative process to recover enriched Mo in the form of ammonium heptamolybdate by using solvent extraction. Small-scale experiments determined the optimal conditions for effective extraction of high Mo concentrations. Methods were developed for removal of ammonium chloride from the molybdenum product of the solvent extraction process. In large-scale experiments, very good purification from potassium and other elements was observed with very high recovery yields (~98%).

  5. Fast surface-based travel depth estimation algorithm for macromolecule surface shape description.

    PubMed

    Giard, Joachim; Alface, Patrice Rondao; Gala, Jean-Luc; Macq, Benoît

    2011-01-01

    Travel Depth, introduced by Coleman and Sharp in 2006, is a physical interpretation of molecular depth, a term frequently used to describe the shape of a molecular active site or binding site. Travel Depth can be seen as the physical distance a solvent molecule would have to travel from a point of the surface, i.e., the Solvent-Excluded Surface (SES), to its convex hull. Existing algorithms providing an estimation of the Travel Depth are based on a regular sampling of the molecule volume and the use of the Dijkstra's shortest path algorithm. Since Travel Depth is only defined on the molecular surface, this volume-based approach is characterized by a large computational complexity due to the processing of unnecessary samples lying inside or outside the molecule. In this paper, we propose a surface-based approach that restricts the processing to data defined on the SES. This algorithm significantly reduces the complexity of Travel Depth estimation and makes possible the analysis of large macromolecule surface shape description with high resolution. Experimental results show that compared to existing methods, the proposed algorithm achieves accurate estimations with considerably reduced processing times.
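
    The surface-based approach restricts the shortest-path computation to vertices of the solvent-excluded surface; the sketch below shows a multi-source Dijkstra over a mesh graph, assuming the convex-hull contact vertices are known (a simplification of the published algorithm).

    import heapq
    from collections import defaultdict

    def surface_travel_depth(vertices, edges, seed_vertices):
        # vertices: dict vertex_id -> (x, y, z) on the solvent-excluded surface
        # edges: iterable of (u, v) mesh edges
        # seed_vertices: vertices lying on (or touching) the convex hull, depth 0
        def dist(u, v):
            (x1, y1, z1), (x2, y2, z2) = vertices[u], vertices[v]
            return ((x1 - x2) ** 2 + (y1 - y2) ** 2 + (z1 - z2) ** 2) ** 0.5

        adjacency = defaultdict(list)
        for u, v in edges:
            adjacency[u].append(v)
            adjacency[v].append(u)

        depth = {v: float("inf") for v in vertices}
        for s in seed_vertices:
            depth[s] = 0.0
        heap = [(0.0, s) for s in seed_vertices]
        heapq.heapify(heap)
        while heap:
            d, u = heapq.heappop(heap)
            if d > depth[u]:
                continue                      # stale heap entry
            for w in adjacency[u]:
                nd = d + dist(u, w)
                if nd < depth[w]:
                    depth[w] = nd
                    heapq.heappush(heap, (nd, w))
        return depth                          # approximate travel depth per vertex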

  6. Medical University of South Carolina Environmental Hazards Assessment Program. Volume 6: Annual report, July 1, 1993--June 30, 1994 deliverables

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The Medical University of South Carolina's vision is to become the premier national resource for medical information and for environmental/health risk assessment. A key component to the success of the many missions of the Environmental Hazards Assessment Program (EHAP) is timely access to large volumes of data. This study documents the results of the needs assessment effort conducted to determine the information access and processing requirements of EHAP. This report addresses the Department of Environmental Health Science, education and training initiative.

  7. Knowledge Discovery as an Aid to Organizational Creativity.

    ERIC Educational Resources Information Center

    Siau, Keng

    2000-01-01

    This article presents the concept of knowledge discovery, a process of searching for associations in large volumes of computer data, as an aid to creativity. It then discusses the various techniques in knowledge discovery. Mednick's associative theory of creative thought serves as the theoretical foundation for this research. (Contains…

  8. Waste Controls at Base Metal Mines

    ERIC Educational Resources Information Center

    Bell, Alan V.

    1976-01-01

    Mining and milling of copper, lead, zinc and nickel in Canada involves an accumulation of a half-million tons of waste material each day and requires 250 million gallons of process water daily. Waste management considerations for handling large volumes of wastes in an economically and environmentally safe manner are discussed. (BT)

  9. Ease into Writing. Volume 2.

    ERIC Educational Resources Information Center

    Lott, Carolyn, Ed.; Stone, Janet, Ed.

    Addressing the expressed needs of the writing community, this book presents writing lessons for intermediate, middle school, and secondary school students that incorporate the 5-step writing process into content areas as a natural part of the curriculum. The 30 lessons in this book involve students in large and small groups and in individual…

  10. Magmatic evolution of a Cordilleran flare-up and its role in the creation of silicic crust.

    PubMed

    Ward, Kevin M; Delph, Jonathan R; Zandt, George; Beck, Susan L; Ducea, Mihai N

    2017-08-22

    The role of magmatic processes as a significant mechanism for the generation of voluminous silicic crust and the development of Cordilleran plateaus remains a lingering question in part because of the inherent difficulty in quantifying plutonic volumes. Despite this difficulty, a growing body of independently measured plutonic-to-volcanic ratios suggests the volume of plutonic material in the crust related to Cordilleran magmatic systems is much larger than previously expected. To better examine the role of crustal magmatic processes and its relationship to erupted material in Cordilleran systems, we present a continuous high-resolution crustal seismic velocity model for an ~800 km section of the active South American Cordillera (Puna Plateau). Although the plutonic-to-volcanic ratios we estimate vary along the length of the Puna Plateau, all ratios are larger than those previously reported (~30:1 compared to 5:1), implying that a significant volume of intermediate to silicic plutonic material is generated in the crust of the central South American Cordillera. Furthermore, as Cordilleran-type margins have been common since the onset of modern plate tectonics, our findings suggest that similar processes may have played a significant role in generating and/or modifying large volumes of continental crust, as observed in the continents today.

  11. Membrane processes in biotechnology: an overview.

    PubMed

    Charcosset, Catherine

    2006-01-01

    Membrane processes are increasingly reported for various applications in both upstream and downstream technology, such as the established ultrafiltration and microfiltration, and emerging processes as membrane bioreactors, membrane chromatography, and membrane contactors for the preparation of emulsions and particles. Membrane systems exploit the inherent properties of high selectivity, high surface-area-per-unit-volume, and their potential for controlling the level of contact and/or mixing between two phases. This review presents these various membrane processes by focusing more precisely on membrane materials, module design, operating parameters and the large range of possible applications.

  12. Lysine production from methanol at 50 degrees C using Bacillus methanolicus: Modeling volume control, lysine concentration, and productivity using a three-phase continuous simulation.

    PubMed

    Lee, G H; Hur, W; Bremmon, C E; Flickinger, M C

    1996-03-20

    A simulation was developed based on experimental data obtained in a 14-L reactor to predict the growth and L-lysine accumulation kinetics, and change in volume of a large-scale (250-m(3)) Bacillus methanolicus methanol-based process. Homoserine auxotrophs of B. methanolicus MGA3 are unique methylotrophs because of the ability to secrete lysine during aerobic growth and threonine starvation at 50 degrees C. Dissolved methanol (100 mM), pH, dissolved oxygen tension (0.063 atm), and threonine levels were controlled to obtain threonine-limited conditions and high-cell density (25 g dry cell weight/L) in a 14-L reactor. As a fed-batch process, the additions of neat methanol (fed on demand), threonine, and other nutrients cause the volume of the fermentation to increase and the final lysine concentration to decrease. In addition, water produced as a result of methanol metabolism contributes to the increase in the volume of the reactor. A three-phase approach was used to predict the rate of change of culture volume based on carbon dioxide production and methanol consumption. This model was used for the evaluation of volume control strategies to optimize lysine productivity. A constant volume reactor process with variable feeding and continuous removal of broth and cells (VF(cstr)) resulted in higher lysine productivity than a fed-batch process without volume control. This model predicts the variation in productivity of lysine with changes in growth and in specific lysine productivity. Simple modifications of the model allow one to investigate other high-lysine-secreting strains with different growth and lysine productivity characteristics. Strain NOA2#13A5-2, which secretes lysine and other end-products, was modeled using both growth- and non-growth-associated lysine productivity. A modified version of this model was used to simulate the change in culture volume of another L-lysine producing mutant (NOA2#13A52-8A66) with reduced secretion of end-products. The modified simulation indicated that growth-associated production dominates in strain NOA2#13A52-8A66. (c) 1996 John Wiley & Sons, Inc.
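
    As a rough illustration of the volume-balance idea, the sketch below integrates a toy reactor-volume equation in which metabolic water production is taken proportional to the CO2 evolution rate; all terms, coefficients, and feed rates are hypothetical and do not reproduce the authors' three-phase model.

    import numpy as np

    def simulate_volume(t_end, dt, v0, feed_methanol, feed_nutrient, q_co2, y_water):
        # Toy reactor-volume balance (illustrative only):
        #   dV/dt = F_methanol + F_nutrient + y_water * q_CO2(t)
        # where metabolic water production is assumed proportional to CO2 evolution.
        n = int(t_end / dt)
        t = np.linspace(0.0, t_end, n)
        v = np.empty(n)
        v[0] = v0
        for i in range(1, n):
            dvdt = feed_methanol + feed_nutrient + y_water * q_co2(t[i])
            v[i] = v[i - 1] + dvdt * dt
        return t, v

    # Constant CO2 evolution for illustration; real profiles come from off-gas data.
    t, v = simulate_volume(t_end=60.0, dt=0.1, v0=150.0,
                           feed_methanol=0.4, feed_nutrient=0.1,
                           q_co2=lambda t: 2.0, y_water=0.05)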

  13. Dynamic Shape Capture of Free-Swimming Aquatic Life using Multi-view Stereo

    NASA Astrophysics Data System (ADS)

    Daily, David

    2017-11-01

    The reconstruction and tracking of swimming fish has in the past been restricted either to flumes and small volumes or to sparse point tracking in large tanks. The purpose of this research is to use an array of cameras to automatically track 50-100 points on the surface of a fish using the multi-view stereo computer vision technique. The method is non-invasive, thus allowing the fish to swim freely in a large volume and to perform more advanced maneuvers such as rolling, darting, stopping, and reversing, which have not been studied. The techniques for obtaining and processing the 3D kinematics and maneuvers of tuna, sharks, stingrays, and other species will be presented and compared. This work involves collaboration with the National Aquarium and the Naval Undersea Warfare Center.

  14. Open Data, Jupyter Notebooks and Geospatial Data Standards Combined - Opening up large volumes of marine and climate data to other communities

    NASA Astrophysics Data System (ADS)

    Clements, O.; Siemen, S.; Wagemann, J.

    2017-12-01

    The EU-funded Earthserver-2 project aims to offer on-demand access to large volumes of environmental data (Earth Observation, Marine, Climate data and Planetary data) via the interface standard Web Coverage Service defined by the Open Geospatial Consortium. Providing access to data via OGC web services (e.g. WCS and WMS) has the potential to open up services to a wider audience, especially to users outside the respective communities. WCS 2.0 in particular, with its processing extension Web Coverage Processing Service (WCPS), is highly beneficial for making large volumes accessible to non-expert communities. Users do not have to deal with custom community data formats, such as GRIB for the meteorological community, but can directly access the data in a format they are more familiar with, such as NetCDF, JSON or CSV. Data requests can also be integrated directly into custom processing routines, and users are no longer required to download gigabytes of data. WCS supports trim (reduction of data extent) and slice (reduction of data dimension) operations on multi-dimensional data, providing users very flexible on-demand access to the data. WCPS allows the user to craft queries to run on the data using a text-based query language, similar to SQL. These queries can be very powerful, e.g. condensing a three-dimensional data cube into its two-dimensional mean. However, the more complex the query, the more processing-intensive it becomes. As part of the EarthServer-2 project, we developed a Python library that helps users generate complex WCPS queries in Python, a programming language they are more familiar with. The interactive presentation aims to give practical examples of how users can benefit from two specific WCS services from the Marine and Climate community. Use-cases from the two communities will show different approaches to take advantage of a Web Coverage (Processing) Service. The entire content is available with Jupyter Notebooks, as they prove to be a highly beneficial tool to generate reproducible workflows for environmental data analysis.
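
    A trim request of the kind described above can be issued directly over HTTP; the sketch below submits a WCPS query with the Python requests library, where the endpoint URL, coverage id, axis names, and output format are placeholders to be replaced by a real server's values.

    import requests

    ENDPOINT = "https://example.org/rasdaman/ows"   # placeholder WCS/WCPS endpoint

    # Trim a 3-D (time, Lat, Long) coverage to one year over a regional window and
    # return it as NetCDF instead of the community-specific source format.
    wcps_query = """
    for $c in (SEA_SURFACE_TEMPERATURE)
    return encode(
        $c[ansi("2015-01-01T00:00:00Z":"2015-12-31T00:00:00Z"), Lat(30:60), Long(-30:10)],
        "application/netcdf")
    """

    resp = requests.get(ENDPOINT, params={
        "service": "WCS",
        "version": "2.0.1",
        "request": "ProcessCoverages",
        "query": wcps_query,
    }, timeout=120)
    resp.raise_for_status()
    with open("sst_subset.nc", "wb") as f:
        f.write(resp.content)   # only the requested subset is transferred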

  15. Fokker-Planck description for the queue dynamics of large tick stocks.

    PubMed

    Garèche, A; Disdier, G; Kockelkoren, J; Bouchaud, J-P

    2013-09-01

    Motivated by empirical data, we develop a statistical description of the queue dynamics for large tick assets based on a two-dimensional Fokker-Planck (diffusion) equation. Our description explicitly includes state dependence, i.e., the fact that the drift and diffusion depend on the volume present on both sides of the spread. "Jump" events, corresponding to sudden changes of the best limit price, must also be included as birth-death terms in the Fokker-Planck equation. All quantities involved in the equation can be calibrated using high-frequency data on the best quotes. One of our central findings is that the dynamical process is approximately scale invariant, i.e., the only relevant variable is the ratio of the current volume in the queue to its average value. While the latter shows intraday seasonalities and strong variability across stocks and time periods, the dynamics of the rescaled volumes is universal. In terms of rescaled volumes, we found that the drift has a complex two-dimensional structure, which is a sum of a gradient contribution and a rotational contribution, both stable across stocks and time. This drift term is entirely responsible for the dynamical correlations between the ask queue and the bid queue.
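
    The drift and diffusion terms of such a Fokker-Planck description can be calibrated from high-frequency queue data by binning volume increments on the current rescaled volume; the one-dimensional sketch below (a simplification of the paper's two-dimensional bid/ask treatment, with hypothetical bin counts) illustrates the idea.

    import numpy as np

    def calibrate_drift_diffusion(volumes, dt, n_bins=20):
        # drift(v)     ~ E[dV | V = v] / dt
        # diffusion(v) ~ Var[dV | V = v] / (2 * dt)
        # Volumes are rescaled by their mean, following the observed scale invariance.
        v = np.asarray(volumes, dtype=float)
        v_rescaled = v / v.mean()
        dv = np.diff(v_rescaled)
        state = v_rescaled[:-1]

        bins = np.linspace(state.min(), state.max(), n_bins + 1)
        idx = np.digitize(state, bins) - 1
        centers, drift, diffusion = [], [], []
        for b in range(n_bins):
            mask = idx == b
            if mask.sum() < 10:              # skip poorly populated bins
                continue
            centers.append(0.5 * (bins[b] + bins[b + 1]))
            drift.append(dv[mask].mean() / dt)
            diffusion.append(dv[mask].var() / (2.0 * dt))
        return np.array(centers), np.array(drift), np.array(diffusion)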

  16. Fokker-Planck description for the queue dynamics of large tick stocks

    NASA Astrophysics Data System (ADS)

    Garèche, A.; Disdier, G.; Kockelkoren, J.; Bouchaud, J.-P.

    2013-09-01

    Motivated by empirical data, we develop a statistical description of the queue dynamics for large tick assets based on a two-dimensional Fokker-Planck (diffusion) equation. Our description explicitly includes state dependence, i.e., the fact that the drift and diffusion depend on the volume present on both sides of the spread. “Jump” events, corresponding to sudden changes of the best limit price, must also be included as birth-death terms in the Fokker-Planck equation. All quantities involved in the equation can be calibrated using high-frequency data on the best quotes. One of our central findings is that the dynamical process is approximately scale invariant, i.e., the only relevant variable is the ratio of the current volume in the queue to its average value. While the latter shows intraday seasonalities and strong variability across stocks and time periods, the dynamics of the rescaled volumes is universal. In terms of rescaled volumes, we found that the drift has a complex two-dimensional structure, which is a sum of a gradient contribution and a rotational contribution, both stable across stocks and time. This drift term is entirely responsible for the dynamical correlations between the ask queue and the bid queue.

  17. Invariance in the recurrence of large returns and the validation of models of price dynamics

    NASA Astrophysics Data System (ADS)

    Chang, Lo-Bin; Geman, Stuart; Hsieh, Fushing; Hwang, Chii-Ruey

    2013-08-01

    Starting from a robust, nonparametric definition of large returns (“excursions”), we study the statistics of their occurrences, focusing on the recurrence process. The empirical waiting-time distribution between excursions is remarkably invariant to year, stock, and scale (return interval). This invariance is related to self-similarity of the marginal distributions of returns, but the excursion waiting-time distribution is a function of the entire return process and not just its univariate probabilities. Generalized autoregressive conditional heteroskedasticity (GARCH) models, market-time transformations based on volume or trades, and generalized (Lévy) random-walk models all fail to fit the statistical structure of excursions.
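
    A simple proxy for the excursion recurrence process can be computed by thresholding return magnitudes at an empirical quantile, as sketched below; the paper's nonparametric excursion definition is more involved, so this is only an illustrative approximation.

    import numpy as np

    def excursion_waiting_times(returns, quantile=0.95):
        # A return counts as an "excursion" when |r| exceeds the given empirical
        # quantile; waiting times are measured in return intervals (steps).
        r = np.asarray(returns, dtype=float)
        threshold = np.quantile(np.abs(r), quantile)
        excursion_idx = np.flatnonzero(np.abs(r) >= threshold)
        return np.diff(excursion_idx)

    # Illustrative usage with synthetic heavy-tailed returns.
    rng = np.random.default_rng(0)
    waits = excursion_waiting_times(rng.standard_t(df=3, size=10_000))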

  18. Production and Distribution of NASA MODIS Remote Sensing Products

    NASA Technical Reports Server (NTRS)

    Wolfe, Robert

    2007-01-01

    The two Moderate Resolution Imaging Spectroradiometer (MODIS) instruments on-board NASA's Earth Observing System (EOS) Terra and Aqua satellites make key measurements for understanding the Earth's terrestrial ecosystems. Global time-series of terrestrial geophysical parameters have been produced from MODIS/Terra for over 7 years and for MODIS/Aqua for more than 4 1/2 years. These well-calibrated instruments, a team of scientists, and large data production, archive, and distribution systems have allowed for the development of a new suite of high-quality product variables at spatial resolutions as fine as 250 m in support of global change research and natural resource applications. This talk describes the MODIS Science Team's products, with a focus on the terrestrial (land) products, the data processing approach, and the process for monitoring and improving product quality. The original MODIS science team was formed in 1989. The team's primary role is the development and implementation of the geophysical algorithms. In addition, the team provided feedback on the design and pre-launch testing of the instrument and helped guide the development of the data processing system. The key challenges the science team dealt with before launch were developing algorithms for a new instrument and providing guidance for the large and complex multi-discipline processing system. Land, Ocean, and Atmosphere discipline teams drove the processing system requirements, particularly in the area of the processing loads and volumes needed to produce daily geophysical maps of the Earth at resolutions as fine as 250 m. The processing system had to handle a large number of data products, large data volumes and processing loads, and complex processing requirements. Prior to MODIS, daily global maps from heritage instruments, such as the Advanced Very High Resolution Radiometer (AVHRR), were not produced at resolutions finer than 5 km. The processing solution evolved into a combination of processing the lower-level (Level 1) products and the higher-level discipline-specific Land and Atmosphere products in the MODIS Science Investigator-led Processing System (SIPS), the MODIS Adaptive Processing System (MODAPS), and archive and distribution of the Land products to the user community by two of NASA's EOS Distributed Active Archive Centers (DAACs). Recently, a part of MODAPS, the Level 1 and Atmosphere Archive and Distribution System (LAADS), took over the role of archiving and distributing the Level 1 and Atmosphere products to the user community.

  19. Space Construction Automated Fabrication Experiment Definition Study (SCAFEDS), part 3. Volume 3: Requirements

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The performance, design and verification requirements for the space Construction Automated Fabrication Experiment (SCAFE) are defined. The SCAFE program defines, develops, and demonstrates the techniques, processes, and equipment required for the automatic fabrication of structural elements in space and for the assembly of such elements into a large, lightweight structure. The program defines a large structural platform to be constructed in orbit using the space shuttle as a launch vehicle and construction base.

  20. Rapid concentration of Bacillus and Clostridium spores from large volumes of milk, using continuous flow centrifugation.

    PubMed

    Agoston, Réka; Soni, Kamlesh A; McElhany, Katherine; Cepeda, Martha L; Zuckerman, Udi; Tzipori, Saul; Mohácsi-Farkas, Csilla; Pillai, Suresh D

    2009-03-01

    Deliberate or accidental contamination of foods such as milk, soft drinks, and drinking water with infectious agents or toxins is a major concern to health authorities. There is a critical need to develop technologies that can rapidly and efficiently separate and concentrate biothreat agents from food matrices. A key limitation of current centrifugation and filtration technologies is that they are batch processes with extensive hands-on involvement and processing times. The objective of our studies was to evaluate the continuous flow centrifugation (CFC) technique for the rapid separation and concentration of bacterial spores from large volumes of milk. We determined the effectiveness of the CFC technology for concentrating approximately 10(3) bacterial spores in 3.7 liters (1 gal) of whole milk and skim milk, using Bacillus subtilis, Bacillus atrophaeus, and Clostridium sporogenes spores as surrogates for biothreat agents. The spores in the concentrated samples were enumerated by using standard plating techniques. Three independent experiments were performed at 10,000 rpm and 0.7 liters/min flow rate. The mean B. subtilis spore recoveries were 71.3 and 56.5% in skim and whole milk, respectively, and those for B. atrophaeus were 55 and 59.3% in skim and whole milk, respectively. In contrast, mean C. sporogenes spore recoveries were 88.2 and 78.6% in skim and whole milk, respectively. The successful use of CFC to concentrate these bacterial spores from 3.7 liters of milk in 10 min shows promise for rapidly concentrating other spores from large volumes of milk.

  1. Switch-like reprogramming of gene expression after fusion of multinucleate plasmodial cells of two Physarum polycephalum sporulation mutants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walter, Pauline; Hoffmann, Xenia-Katharina; Ebeling, Britta

    2013-05-24

    Highlights: • We investigate reprogramming of gene expression in multinucleate single cells. • Cells of two differentiation control mutants are fused. • Fused cells proceed to alternative gene expression patterns. • The population of nuclei damps stochastic fluctuations in gene expression. • Dynamic processes of cellular reprogramming can be observed by repeated sampling of a cell. -- Abstract: Nonlinear dynamic processes involving the differential regulation of transcription factors are considered to impact the reprogramming of stem cells, germ cells, and somatic cells. Here, we fused two multinucleate plasmodial cells of Physarum polycephalum mutants defective in different sporulation control genes while being in different physiological states. The resulting heterokaryons established one of two significantly different expression patterns of marker genes while the plasmodial halves that were fused to each other synchronized spontaneously. Spontaneous synchronization suggests that switch-like control mechanisms spread over and finally control the entire plasmodium as a result of cytoplasmic mixing. Because of the large volume of the vigorously streaming cytoplasm, regulatory molecules act at defined concentrations on the population of nuclei and on the global setting of switches. Mixing of a large cytoplasmic volume is expected to damp stochasticity when individual nuclei deliver certain RNAs at low copy number into the cytoplasm. We conclude that spontaneous synchronization, the damping of molecular noise in gene expression by the large cytoplasmic volume, and the option to take multiple macroscopic samples from the same plasmodium provide unique options for studying the dynamics of cellular reprogramming at the single cell level.

  2. Local bone graft harvesting and volumes in posterolateral lumbar fusion: a technical report.

    PubMed

    Carragee, Eugene J; Comer, Garet C; Smith, Micah W

    2011-06-01

    In lumbar surgery, local bone graft is often harvested and used in posterolateral fusion procedures. The volume of local bone graft available for posterolateral fusion has not been determined in North American patients. Some authors have described this as minimal, but others have suggested the volume was sufficient to be reliably used as a stand-alone bone graft substitute for single-level fusion. To describe the technique used and determine the volume of local bone graft available in a cohort of patients undergoing single-level primary posterolateral fusion by the authors' harvesting technique. Technical description and cohort report. Consecutive patients undergoing lumbar posterolateral fusion with or without instrumentation for degenerative processes. Local bone graft volume. Consecutive patients undergoing lumbar posterolateral fusion with or without instrumentation for degenerative processes were studied. Local bone graft was harvested by a standard method in each patient and the volume measured by a standard procedure. Twenty-five patients were studied, and of these 11 (44%) had a previous decompression. The mean volume of local bone graft harvested was measured to be 25 cc (range, 12-36 cc). Local bone graft was augmented by iliac crest bone in six of 25 patients (24%) if the posterolateral fusion bed was not well packed with local bone alone. There was a trend toward greater local bone graft volumes in men and in patients without previous decompression. Large volumes of local bone can be harvested during posterolateral lumbar fusion surgery. Even in patients with previous decompression the volume harvested is similar to that reported harvested from the posterior iliac crest for single-level fusion. Copyright © 2011 Elsevier Inc. All rights reserved.

  3. Manufacturing process scale-up of optical grade transparent spinel ceramic at ArmorLine Corporation

    NASA Astrophysics Data System (ADS)

    Spilman, Joseph; Voyles, John; Nick, Joseph; Shaffer, Lawrence

    2013-06-01

    While transparent Spinel ceramic's mechanical and optical characteristics are ideal for many Ultraviolet (UV), visible, Short-Wave Infrared (SWIR), Mid-Wave Infrared (MWIR), and multispectral sensor window applications, commercial adoption of the material has been hampered because the material has historically been available only in relatively small sizes (one square foot per window or less), in low volumes, with unreliable supply, and with unreliable quality. Recent efforts, most notably by Technology Assessment and Transfer (TA and T), have scaled up manufacturing processes and demonstrated the capability to produce larger windows on the order of two square feet, but with limited output not suitable for production-type programs. ArmorLine Corporation licensed the hot-pressed Spinel manufacturing know-how of TA and T in 2009 with the goal of building the world's first dedicated full-scale Spinel production facility, enabling the supply of a reliable and sufficient volume of large Transparent Armor and Optical Grade Spinel plates. With over $20 million of private investment by J.F. Lehman and Company, ArmorLine has installed and commissioned the largest vacuum hot press in the world, the largest high-temperature/high-pressure hot isostatic press in the world, and supporting manufacturing processes within 75,000 square feet of manufacturing space. ArmorLine's equipment is capable of producing window blanks as large as 50" x 30", and the facility is capable of producing substantial volumes of material with its Lean configuration and 24/7 operation. Initial production capability was achieved in 2012. ArmorLine will discuss the challenges encountered during scale-up of the manufacturing processes and the optical performance of ArmorLine Optical Grade Spinel, and provide an overview of the facility and its capabilities.

  4. Quantitative three-dimensional transrectal ultrasound (TRUS) for prostate imaging

    NASA Astrophysics Data System (ADS)

    Pathak, Sayan D.; Aarnink, Rene G.; de la Rosette, Jean J.; Chalana, Vikram; Wijkstra, Hessel; Haynor, David R.; Debruyne, Frans M. J.; Kim, Yongmin

    1998-06-01

    With the number of men seeking medical care for prostate diseases rising steadily, the need for a fast and accurate prostate boundary detection and volume estimation tool is increasingly felt by clinicians. Currently, these measurements are made manually, which results in long examination times. A possible solution is to improve the efficiency by automating the boundary detection and volume estimation process with minimal involvement from the human experts. In this paper, we present an algorithm based on SNAKES to detect the boundaries. Our approach is to selectively enhance the contrast along the edges using an algorithm called sticks and integrate it with a SNAKES model. This integrated algorithm requires an initial curve for each ultrasound image to initiate the boundary detection process. We have used different schemes to generate the curves with a varying degree of automation and evaluated their effects on the algorithm performance. After the boundaries are identified, the prostate volume is calculated using planimetric volumetry. We have tested our algorithm on 6 different prostate volumes and compared the performance against the volumes manually measured by 3 experts. With the increase in the user inputs, the algorithm performance improved as expected. The results demonstrate that given an initial contour reasonably close to the prostate boundaries, the algorithm successfully delineates the prostate boundaries in an image, and the resulting volume measurements are in close agreement with those made by the human experts.
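
    Once boundaries are delineated on each slice, planimetric volumetry reduces to summing contour areas times the slice spacing; a minimal sketch follows, using the shoelace formula for contour area (the circular test contours are purely illustrative).

    import numpy as np

    def polygon_area(points):
        # Shoelace formula for the area of a closed planar contour (N x 2 array).
        x, y = points[:, 0], points[:, 1]
        return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

    def planimetric_volume(contours, slice_spacing_mm):
        # Planimetric volumetry: sum of per-slice contour areas times slice spacing.
        return sum(polygon_area(c) for c in contours) * slice_spacing_mm

    # Illustrative usage: two roughly circular cross-sections, 5 mm apart.
    theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
    circle = lambda r: np.column_stack([r * np.cos(theta), r * np.sin(theta)])
    volume_mm3 = planimetric_volume([circle(20.0), circle(18.0)], slice_spacing_mm=5.0)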

  5. Planning, Designing, Building, and Moving a Large Volume Maternity Service to a New Labor and Birth Unit.

    PubMed

    Thompson, Heather; Legorreta, Kimberly; Maher, Mary Ann; Lavin, Melanie M

    Our health system recognized the need to update facility space and associated technology for the labor and birth unit within our large volume perinatal service to improve the patient experience and enhance safety, quality of care, and staff satisfaction. When an organization decides to invest $30 million in a construction project such as a new labor and birth unit, many factors and considerations are involved. Financial support, planning, design, and construction phases of building a new unit are complex and therefore require strong interdisciplinary collaboration, leadership, and project management. The new labor and birth unit required nearly 3 years of planning, designing, and construction. Patient and family preferences were elicited through consumer focus groups. Multiple meetings with the administrative and nursing leadership teams, staff nurses, nurse midwives, and physicians were held to generate ideas for improvement in the new space. Involving frontline clinicians and childbearing women in the process was critical to success. The labor and birth unit moved to a new patient tower in a space that doubled in square footage and is now spread across three separate floors. In the 6 months prior to the move, many efforts were made in our community to share our new space. The marketing strategy was very detailed and creative with ongoing input from the nursing leadership team. The nursing staff was involved in every step along the way. It was critical to have champions as workflow teams emerged. We hosted simulation drills and tested scenarios with new workflows. Move day was rehearsed with representatives of all members of the perinatal team participating. These efforts ultimately resulted in a move time of ~5 hours. Birth volumes increased 7% within the first 6 months. After 3 years in our new space, our birth volumes have risen nearly 15% and are still growing. Key processes and roles responsible for a successful build, an efficient and safe move day, and the anticipated operational utility of a new labor and birth unit in a large-volume perinatal service are detailed.

  6. Monitoring the process of pulmonary melanoma metastasis using large area and label-free nonlinear optical microscopy

    NASA Astrophysics Data System (ADS)

    Hua, Daozhu; Qi, Shuhong; Li, Hui; Zhang, Zhihong; Fu, Ling

    2012-06-01

    We performed large area nonlinear optical microscopy (NOM) for label-free monitoring of the process of pulmonary melanoma metastasis ex vivo with subcellular resolution in C57BL/6 mice. Multiphoton autofluorescence (MAF) and second harmonic generation (SHG) images of lung tissue are obtained in a volume of ~2.2 mm×2.2 mm×30 μm. Qualitative differences in morphologic features and quantitative measurements of pathological lung tissues at different time points are characterized. We find that, combined with morphological features, quantitative parameters such as the intensity ratio of MAF and SHG between pathological and normal tissue and the MAF-to-SHG index versus depth clearly show the physiological changes of the tissue during the process of pulmonary melanoma metastasis. Our results demonstrate that large area NOM succeeds in monitoring the process of pulmonary melanoma metastasis, which can provide a powerful tool for research in tumor pathophysiology and therapy evaluation.
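
    The quantitative parameters mentioned above can be computed directly from registered MAF and SHG image stacks; the sketch below assumes a simple normalized-difference index and mean-intensity ratio, which may differ from the exact definitions used in the study.

    import numpy as np

    def maf_to_shg_index(maf_stack, shg_stack):
        # Per-depth index from two image stacks shaped (depth, height, width).
        # Index used here: (MAF - SHG) / (MAF + SHG) per optical section (illustrative).
        maf = maf_stack.reshape(maf_stack.shape[0], -1).mean(axis=1)
        shg = shg_stack.reshape(shg_stack.shape[0], -1).mean(axis=1)
        return (maf - shg) / (maf + shg)

    def intensity_ratio(pathological, normal):
        # Mean-intensity ratio of pathological to normal tissue for one channel.
        return pathological.mean() / normal.mean()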

  7. Big data - smart health strategies. Findings from the yearbook 2014 special theme.

    PubMed

    Koutkias, V; Thiessard, F

    2014-08-15

    To select best papers published in 2013 in the field of big data and smart health strategies, and summarize outstanding research efforts. A systematic search was performed using two major bibliographic databases for relevant journal papers. The references obtained were reviewed in a two-stage process, starting with a blinded review performed by the two section editors, and followed by a peer review process operated by external reviewers recognized as experts in the field. The complete review process selected four best papers, illustrating various aspects of the special theme, among them: (a) using large volumes of unstructured data and, specifically, clinical notes from Electronic Health Records (EHRs) for pharmacovigilance; (b) knowledge discovery via querying large volumes of complex (both structured and unstructured) biological data using big data technologies and relevant tools; (c) methodologies for applying cloud computing and big data technologies in the field of genomics, and (d) system architectures enabling high-performance access to and processing of large datasets extracted from EHRs. The potential of big data in biomedicine has been pinpointed in various viewpoint papers and editorials. The review of current scientific literature illustrated a variety of interesting methods and applications in the field, but still the promises exceed the current outcomes. As we are getting closer towards a solid foundation with respect to common understanding of relevant concepts and technical aspects, and the use of standardized technologies and tools, we can anticipate to reach the potential that big data offer for personalized medicine and smart health strategies in the near future.

  8. Big Data - Smart Health Strategies

    PubMed Central

    2014-01-01

    Summary Objectives To select best papers published in 2013 in the field of big data and smart health strategies, and summarize outstanding research efforts. Methods A systematic search was performed using two major bibliographic databases for relevant journal papers. The references obtained were reviewed in a two-stage process, starting with a blinded review performed by the two section editors, and followed by a peer review process operated by external reviewers recognized as experts in the field. Results The complete review process selected four best papers, illustrating various aspects of the special theme, among them: (a) using large volumes of unstructured data and, specifically, clinical notes from Electronic Health Records (EHRs) for pharmacovigilance; (b) knowledge discovery via querying large volumes of complex (both structured and unstructured) biological data using big data technologies and relevant tools; (c) methodologies for applying cloud computing and big data technologies in the field of genomics, and (d) system architectures enabling high-performance access to and processing of large datasets extracted from EHRs. Conclusions The potential of big data in biomedicine has been pinpointed in various viewpoint papers and editorials. The review of current scientific literature illustrated a variety of interesting methods and applications in the field, but still the promises exceed the current outcomes. As we are getting closer towards a solid foundation with respect to common understanding of relevant concepts and technical aspects, and the use of standardized technologies and tools, we can anticipate to reach the potential that big data offer for personalized medicine and smart health strategies in the near future. PMID:25123721

  9. Effects of voxelization on dose volume histogram accuracy

    NASA Astrophysics Data System (ADS)

    Sunderland, Kyle; Pinter, Csaba; Lasso, Andras; Fichtinger, Gabor

    2016-03-01

    PURPOSE: In radiotherapy treatment planning systems, structures of interest such as targets and organs at risk are stored as 2D contours on evenly spaced planes. In order to be used in various algorithms, contours must be converted into binary labelmap volumes using voxelization. The voxelization process results in lost information, which has little effect on the volume of large structures, but has significant impact on small structures, which contain few voxels. Volume differences for segmented structures affect metrics such as dose volume histograms (DVH), which are used for treatment planning. Our goal is to evaluate the impact of voxelization on segmented structures, as well as how factors like voxel size affect metrics such as DVH. METHODS: We create a series of implicit functions, which represent simulated structures. These structures are sampled at varying resolutions, and compared to labelmaps with high sub-millimeter resolutions. We generate DVH and evaluate voxelization error for the same structures at different resolutions by calculating the agreement acceptance percentage between the DVH. RESULTS: We implemented tools for analysis as modules in the SlicerRT toolkit based on the 3D Slicer platform. We found that there were large DVH variations from the baseline for small structures or for structures located in regions with a high dose gradient, potentially leading to the creation of suboptimal treatment plans. CONCLUSION: This work demonstrates that labelmap and dose volume voxel size is an important factor in DVH accuracy, which must be accounted for in order to ensure the development of accurate treatment plans.
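
    A cumulative DVH of the kind compared in this work can be computed from a dose grid and a voxelized structure mask as sketched below; resampling the same structure on progressively coarser grids and re-running the computation exposes the voxelization error discussed above (array names and bin count are illustrative).

    import numpy as np

    def cumulative_dvh(dose, mask, n_bins=200):
        # Cumulative dose-volume histogram for the voxels selected by a binary mask.
        # Returns dose bin edges and the fraction of the structure volume receiving
        # at least that dose; both arrays must share the same voxel grid.
        d = dose[mask > 0]
        edges = np.linspace(0.0, d.max(), n_bins)
        fraction = np.array([(d >= e).mean() for e in edges])
        return edges, fraction

    # Comparing the DVH of the same structure rasterized on fine vs. coarse grids
    # shows how voxel size changes the apparent volume and the resulting histogram.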

  10. Anatomical analysis of an aye-aye brain (Daubentonia madagascariensis, primates: Prosimii) combining histology, structural magnetic resonance imaging, and diffusion-tensor imaging.

    PubMed

    Kaufman, Jason A; Ahrens, Eric T; Laidlaw, David H; Zhang, Song; Allman, John M

    2005-11-01

    This report presents initial results of a multimodal analysis of tissue volume and microstructure in the brain of an aye-aye (Daubentonia madagascariensis). The left hemisphere of an aye-aye brain was scanned using T2-weighted structural magnetic resonance imaging (MRI) and diffusion-tensor imaging (DTI) prior to histological processing and staining for Nissl substance and myelinated fibers. The objectives of the experiment were to estimate the volume of gross brain regions for comparison with published data on other prosimians and to validate DTI data on fiber anisotropy with histological measurements of fiber spread. Measurements of brain structure volumes in the specimen are consistent with those reported in the literature: the aye-aye has a very large brain for its body size, a reduced volume of visual structures (V1 and LGN), and an increased volume of the olfactory lobe. This trade-off between visual and olfactory reliance is likely a reflection of the nocturnal extractive foraging behavior practiced by Daubentonia. Additionally, frontal cortex volume is large in the aye-aye, a feature that may also be related to its complex foraging behavior and sensorimotor demands. Analysis of DTI data in the anterior cingulum bundle demonstrates a strong correlation between fiber spread as measured from histological sections and fiber spread as measured from DTI. These results represent the first quantitative comparison of DTI data and fiber-stained histology in the brain. (c) 2005 Wiley-Liss, Inc.

  11. The Long-Term Effects of Large Wood Placement on Salmonid Habitat in East Fork Mill Creek, Redwood National and State Park, California

    NASA Astrophysics Data System (ADS)

    Rodriguez, D. L.; Stubblefield, A. P.

    2017-12-01

    The conservation and recovery of anadromous salmonids (Oncorhynchus sp.) depend on stream restoration and protection of freshwater habitats. Instream large wood dictates channel morphology, increases retention of terrestrial inputs such as organic matter, nutrients, and sediment, and enhances the quality of fish habitat. Historic land use/land cover changes have resulted in aquatic systems devoid of this component. Restoration by placement of large wood jams is intended to restore physical and biological processes. An important question for scientists and managers, in addition to the initial effectiveness of restoration, is the persistence and fate of this type of project. In this study we compare channel change and large wood attributes on the East Fork of Mill Creek, a tributary of the Smith River in northern California, eight years after a major instream wood placement effort took place. Our results are compared with previously published data from before and one year after the restoration. Preliminary results suggest the dramatic increase in spawning gravel abundance and large wood accumulation observed in the earlier study have persisted. From 2008 to 2016 a reduction in median sediment size, ranging from 103-136 percent, has been observed in a majority of the sites. The sites have continued to grow in size and influence by racking floating wood from upstream and destabilizing proximate banks of riparian alder, increasing both instream large wood volume (5-196%) and floodplain connectivity. Preliminary results also show a decrease in residual pool depth and an increase in pool length, which may be attributed to floodplain connectivity. Changes to the following attributes are evaluated: 1) wood loading (total site wood volume, total wood volume in active channel, and wood piece count); 2) percent pool cover by large wood; 3) residual pool depth; 4) upstream sediment aggradation; 5) floodplain connectivity; and 6) mean sediment size directly above and below large wood. We present these results and statistical comparisons of total site wood volume with response factors.

  12. Use of a Modern Polymerization Pilot-Plant for Undergraduate Control Projects.

    ERIC Educational Resources Information Center

    Mendoza-Bustos, S. A.; And Others

    1991-01-01

    Described is a project where students gain experience in handling large volumes of hazardous materials, process start up and shut down, equipment failures, operational variations, scaling up, equipment cleaning, and run-time scheduling while working in a modern pilot plant. Included are the system design, experimental procedures, and results. (KR)

  13. Light Helicopter Family Trade-Off Analysis. Volume 7. Appendix R

    DTIC Science & Technology

    1985-05-15

    required. There was less agreement among the pilots regarding FOV effects on the anti-armor mission. Although two pilots felt they needed as large... Wickens, C. D. (1984). Processing resources in attention. In Varieties of Attention. New York: Academic Press, 63-102. Wickens, C. D. & Derrick W

  14. YouEDU: Addressing Confusion in MOOC Discussion Forums by Recommending Instructional Video Clips

    ERIC Educational Resources Information Center

    Agrawal, Akshay; Venkatraman, Jagadish; Leonard, Shane; Paepcke, Andreas

    2015-01-01

    In Massive Open Online Courses (MOOCs), struggling learners often seek help by posting questions in discussion forums. Unfortunately, given the large volume of discussion in MOOCs, instructors may overlook these learners' posts, detrimentally impacting the learning process and exacerbating attrition. In this paper, we present YouEDU, an…

  15. Naval EarthMap Observer (NEMO) science and naval products

    NASA Astrophysics Data System (ADS)

    Davis, Curtiss O.; Kappus, Mary E.; Gao, Bo-Cai; Bissett, W. Paul; Snyder, William A.

    1998-11-01

    A wide variety of applications of imaging spectrometry have been demonstrated using data from aircraft systems. Based on this experience the Navy is pursuing the Hyperspectral Remote Sensing Technology (HRST) Program to use hyperspectral imagery to characterize the littoral environment, for scientific and environmental studies and to meet Naval needs. To obtain the required space-based hyperspectral imagery the Navy has joined in a partnership with industry to build and fly the Naval EarthMap Observer (NEMO). The NEMO spacecraft has the Coastal Ocean Imaging Spectrometer (COIS), a hyperspectral imager with adequate spectral and spatial resolution and a high signal-to-noise ratio to provide long-term monitoring and real-time characterization of the coastal environment. It includes on-board processing for rapid data analysis and data compression, a large volume recorder, and a high-speed downlink to handle the required large volumes of data. This paper describes the algorithms for processing the COIS data to provide at-launch ocean data products and the research and modeling that are planned to use COIS data to advance our understanding of the dynamics of the coastal ocean.

  16. Pore water sampling in acid sulfate soils: a new peeper method.

    PubMed

    Johnston, Scott G; Burton, Edward D; Keene, Annabelle F; Bush, Richard T; Sullivan, Leigh A; Isaacson, Lloyd

    2009-01-01

    This study describes the design, deployment, and application of a modified equilibration dialysis device (peeper) optimized for sampling pore waters in acid sulfate soils (ASS). The modified design overcomes the limitations of traditional-style peepers when sampling firm ASS materials over relatively large depth intervals. The new peeper device uses removable, individual cells of 25 mL volume housed in a 1.5 m long rigid, high-density polyethylene rod. The rigid housing structure allows the device to be inserted directly into relatively firm soils without requiring a supporting frame. The use of removable cells eliminates the need for a large glove-box after peeper retrieval, thus simplifying physical handling. Removable cells are easily maintained in an inert atmosphere during sample processing and the 25-mL sample volume is sufficient for undertaking multiple analyses. A field evaluation of equilibration times indicates that 32 to 38 d of deployment was necessary. Overall, the modified method is simple and effective and well suited to acquisition and processing of redox-sensitive pore water profiles >1 m deep in acid sulfate soils or any other firm wetland soils.

  17. Precise segmentation of multiple organs in CT volumes using learning-based approach and information theory.

    PubMed

    Lu, Chao; Zheng, Yefeng; Birkbeck, Neil; Zhang, Jingdan; Kohlberger, Timo; Tietjen, Christian; Boettger, Thomas; Duncan, James S; Zhou, S Kevin

    2012-01-01

    In this paper, we present a novel method by incorporating information theory into the learning-based approach for automatic and accurate pelvic organ segmentation (including the prostate, bladder, and rectum). We target 3D CT volumes that are generated using different scanning protocols (e.g., contrast and non-contrast, with and without an implant in the prostate, various resolutions and positions), and the volumes come from largely diverse sources (e.g., disease in different organs). Three key ingredients are combined to solve this challenging segmentation problem. First, marginal space learning (MSL) is applied to efficiently and effectively localize the multiple organs in the largely diverse CT volumes. Second, learning techniques based on steerable features are applied for robust boundary detection. This enables handling of highly heterogeneous texture patterns. Third, a novel information theoretic scheme is incorporated into the boundary inference process. The incorporation of the Jensen-Shannon divergence further drives the mesh to the best fit of the image, thus improving the segmentation performance. The proposed approach is tested on a challenging dataset containing 188 volumes from diverse sources. Our approach not only produces excellent segmentation accuracy, but also runs about eighty times faster than previous state-of-the-art solutions. The proposed method can be applied to CT images to provide visual guidance to physicians during computer-aided diagnosis, treatment planning, and image-guided radiotherapy to treat cancers in the pelvic region.
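
    A minimal standalone sketch of the Jensen-Shannon divergence that, per the abstract, drives the mesh toward the best image fit; this NumPy version is illustrative and not the paper's implementation.

```python
import numpy as np

def jensen_shannon(p, q, eps=1e-12):
    """JS divergence between two discrete distributions (e.g., intensity
    histograms inside vs. outside a candidate organ boundary)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log(a / b))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Well-separated histograms give a larger divergence, signalling a boundary
# position that better separates the organ from the background.
print(jensen_shannon([0.8, 0.2, 0.0], [0.1, 0.2, 0.7]))
```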

  18. Integrated circuits for volumetric ultrasound imaging with 2-D CMUT arrays.

    PubMed

    Bhuyan, Anshuman; Choe, Jung Woo; Lee, Byung Chul; Wygant, Ira O; Nikoozadeh, Amin; Oralkan, Ömer; Khuri-Yakub, Butrus T

    2013-12-01

    Real-time volumetric ultrasound imaging systems require transmit and receive circuitry to generate ultrasound beams and process received echo signals. The complexity of building such a system is high due to the requirement that the front-end electronics be very close to the transducer. A large number of elements also need to be interfaced to the back-end system, and image processing of a large dataset could affect the imaging volume rate. In this work, we present a 3-D imaging system using capacitive micromachined ultrasonic transducer (CMUT) technology that addresses many of the challenges in building such a system. We demonstrate two approaches to integrating the transducer and the front-end electronics. The transducer is a 5-MHz CMUT array with an 8 mm × 8 mm aperture size. The aperture consists of 1024 elements (32 × 32) with an element pitch of 250 μm. An integrated circuit (IC) provides a transmit beamformer and receive circuitry to improve the noise performance of the overall system. The assembly was interfaced with an FPGA and a back-end system (comprising a data acquisition system and a PC). The FPGA provided the digital I/O signals for the IC, and the back-end system was used to process the received RF echo data (from the IC) and reconstruct the volume image using a phased array imaging approach. Imaging experiments were performed using wire and spring targets, a ventricle model, and a human prostate. Real-time volumetric images were captured at 5 volumes per second and are presented in this paper.
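
    An illustrative sketch of the receive-delay calculation behind phased-array volume reconstruction with a 2-D array. The element pitch and array size follow the abstract; the focal point and sound speed are example values, not the authors' parameters.

```python
import numpy as np

pitch = 250e-6            # element pitch [m]
n = 32                    # 32 x 32 elements
c = 1540.0                # assumed speed of sound in tissue [m/s]

# Element coordinates centred on the aperture.
xs = (np.arange(n) - (n - 1) / 2) * pitch
ex, ey = np.meshgrid(xs, xs)

def receive_delays(focus):
    """Per-element delays [s] that align echoes arriving from a focal point."""
    fx, fy, fz = focus
    dist = np.sqrt((ex - fx) ** 2 + (ey - fy) ** 2 + fz ** 2)
    return (dist - dist.min()) / c

delays = receive_delays((0.0, 0.0, 30e-3))   # focus 30 mm straight ahead
print(delays.max() * 1e9, "ns maximum delay across the aperture")
```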

  19. KAM (Knowledge Acquisition Module): A tool to simplify the knowledge acquisition process

    NASA Technical Reports Server (NTRS)

    Gettig, Gary A.

    1988-01-01

    Analysts, knowledge engineers and information specialists are faced with increasing volumes of time-sensitive data in text form, either as free text or highly structured text records. Rapid access to the relevant data in these sources is essential. However, due to the volume and organization of the contents, and limitations of human memory and association, frequently: (1) important information is not located in time; (2) reams of irrelevant data are searched; and (3) interesting or critical associations are missed due to physical or temporal gaps involved in working with large files. The Knowledge Acquisition Module (KAM) is a microcomputer-based expert system designed to assist knowledge engineers, analysts, and other specialists in extracting useful knowledge from large volumes of digitized text and text-based files. KAM formulates non-explicit, ambiguous, or vague relations, rules, and facts into a manageable and consistent formal code. A library of system rules or heuristics is maintained to control the extraction of rules, relations, assertions, and other patterns from the text. These heuristics can be added, deleted or customized by the user. The user can further control the extraction process with optional topic specifications. This allows the user to cluster extracts based on specific topics. Because KAM formalizes diverse knowledge, it can be used by a variety of expert systems and automated reasoning applications. KAM can also perform important roles in computer-assisted training and skill development. Current research efforts include the applicability of neural networks to aid in the extraction process and the conversion of these extracts into standard formats.

  20. Studies investigate effects of hydraulic fracturing

    NASA Astrophysics Data System (ADS)

    Balcerak, Ernie

    2012-11-01

    The use of hydraulic fracturing, also known as fracking, to enhance the retrieval of natural gas from shale has been increasing dramatically—the number of natural gas wells has risen about 50% since 2000. Shale gas has been hailed as a relatively low-cost, abundant energy source that is cleaner than coal. However, fracking involves injecting large volumes of water, sand, and chemicals into deep shale gas reservoirs under high pressure to open fractures through which the gas can travel, and the process has generated much controversy. The popular press, advocacy organizations, and the documentary film Gasland by Josh Fox have helped bring this issue to a broad audience. Many have suggested that fracking has resulted in contaminated drinking water supplies, enhanced seismic activity, demands for large quantities of water that compete with other uses, and challenges in managing large volumes of resulting wastewater. As demand for expanded domestic energy production intensifies, there is potential for substantially increased use of fracking together with other recovery techniques for "unconventional gas resources," like extended horizontal drilling.

  1. An improved plating process

    NASA Technical Reports Server (NTRS)

    Askew, John C.

    1994-01-01

    An alternative to the immersion process for the electrodeposition of chromium from aqueous solutions on the inside diameter (ID) of long tubes is described. The Vessel Plating Process eliminates the need for deep processing tanks, large volumes of solutions, and associated safety and environmental concerns. Vessel Plating allows the process to be monitored and controlled by computer, thus increasing reliability, flexibility, and quality. Elimination of the trivalent chromium accumulation normally associated with ID plating is intrinsic to the Vessel Plating Process. The construction and operation of a prototype Vessel Plating Facility, with emphasis on materials of construction, engineered and operational safety, and a unique system for rinse water recovery, are described.

  2. Cotectic proportions of olivine and spinel in olivine-tholeiitic basalt and evaluation of pre-eruptive processes

    USGS Publications Warehouse

    Roeder, Peter; Gofton, Emma; Thornber, Carl

    2006-01-01

    The volume %, distribution, texture, and composition of coexisting olivine, Cr-spinel, and glass have been determined in quenched lava samples from Hawaii, Iceland, and mid-oceanic ridges. The volume ratio of olivine to spinel varies from 60 to 2800 and samples with >0·02% spinel have a volume ratio of olivine to spinel of approximately 100. A plot of wt % MgO vs ppm Cr for natural and experimental basaltic glasses suggests that the general trend of the glasses can be explained by the crystallization of a cotectic ratio of olivine to spinel of about 100. One group of samples has an olivine to spinel ratio of approximately 100, with skeletal olivine phenocrysts and small (<50 μm) spinel crystals that tend to be spatially associated with the olivine phenocrysts. The large number of spinel crystals included within olivine phenocrysts is thought to be due to skeletal olivine phenocrysts coming into physical contact with spinel by synneusis during the chaotic conditions of ascent and extrusion. A second group of samples tends to have large olivine phenocrysts relatively free of included spinel, a few large (>100 μm) spinel crystals that show evidence of two stages of growth, and a volume ratio of olivine to spinel of 100 to well over 1000. The olivine and spinel in this group have crystallized more slowly with little physical interaction, and show evidence that they have accumulated in a magma chamber.

  3. Rapid Decimation for Direct Volume Rendering

    NASA Technical Reports Server (NTRS)

    Gibbs, Jonathan; VanGelder, Allen; Verma, Vivek; Wilhelms, Jane

    1997-01-01

    An approach for eliminating unnecessary portions of a volume when producing a direct volume rendering is described. This reduction in volume size sacrifices some image quality in the interest of rendering speed. Since volume visualization is often used as an exploratory visualization technique, it is important to reduce rendering times, so the user can effectively explore the volume. The methods presented can speed up rendering by factors of 2 to 3 with minor image degradation. A family of decimation algorithms to reduce the number of primitives in the volume without altering the volume's grid in any way is introduced. This allows the decimation to be computed rapidly, making it easier to change decimation levels on the fly. Further, because very little extra space is required, this method is suitable for the very large volumes that are becoming common. The method is also grid-independent, so it is suitable for multiple overlapping curvilinear and unstructured, as well as regular, grids. The decimation process can proceed automatically, or can be guided by the user so that important regions of the volume are decimated less than unimportant regions. A formal error measure is described based on a three-dimensional analog of the Radon transform. Decimation methods are evaluated based on this metric and on direct comparison with reference images.

  4. Evaluation of Sampling Methods for Bacillus Spore ...

    EPA Pesticide Factsheets

    Following a wide area release of biological materials, mapping the extent of contamination is essential for orderly response and decontamination operations. HVAC filters process large volumes of air and therefore collect highly representative particulate samples in buildings. HVAC filter extraction may have great utility in rapidly estimating the extent of building contamination following a large-scale incident. However, until now, no studies have been conducted comparing the two most appropriate sampling approaches for HVAC filter materials: direct extraction and vacuum-based sampling.

  5. Data Streams: An Overview and Scientific Applications

    NASA Astrophysics Data System (ADS)

    Aggarwal, Charu C.

    In recent years, advances in hardware technology have facilitated the ability to collect data continuously. Simple transactions of everyday life such as using a credit card, a phone, or browsing the web lead to automated data storage. Similarly, advances in information technology have led to large flows of data across IP networks. In many cases, these large volumes of data can be mined for interesting and relevant information in a wide variety of applications. When the volume of the underlying data is very large, it leads to a number of computational and mining challenges: With increasing volume of the data, it is no longer possible to process the data efficiently by using multiple passes. Rather, one can process a data item at most once. This leads to constraints on the implementation of the underlying algorithms. Therefore, stream mining algorithms typically need to be designed so that the algorithms work with one pass of the data. In most cases, there is an inherent temporal component to the stream mining process. This is because the data may evolve over time. This behavior of data streams is referred to as temporal locality. Therefore, a straightforward adaptation of one-pass mining algorithms may not be an effective solution to the task. Stream mining algorithms need to be carefully designed with a clear focus on the evolution of the underlying data. Another important characteristic of data streams is that they are often mined in a distributed fashion. Furthermore, the individual processors may have limited processing and memory. Examples of such cases include sensor networks, in which it may be desirable to perform in-network processing of data streams with limited processing and memory [1, 2]. This chapter will provide an overview of the key challenges in stream mining algorithms which arise from the unique setup in which these problems are encountered. This chapter is organized as follows. In the next section, we will discuss the generic challenges that stream mining poses to a variety of data management and data mining problems. The next section also deals with several issues which arise in the context of data stream management. In Sect. 3, we discuss several mining algorithms on the data stream model. Section 4 discusses various scientific applications of data streams. Section 5 discusses the research directions and conclusions.
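
    One-pass processing is the defining constraint described above. A classic illustration of the idea is reservoir sampling, which maintains a uniform random sample of a stream in a single pass with O(k) memory; this generic sketch illustrates the one-pass principle and is not an algorithm taken from the chapter.

```python
import random

def reservoir_sample(stream, k):
    """Uniformly sample k items from a stream of unknown length in one pass."""
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)
        else:
            j = random.randint(0, i)      # replace with decreasing probability
            if j < k:
                reservoir[j] = item
    return reservoir

print(reservoir_sample(range(1_000_000), 5))
```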

  6. Production of recombinant adeno-associated vectors using two bioreactor configurations at different scales

    PubMed Central

    Negrete, Alejandro; Kotin, Robert M.

    2007-01-01

    The conventional methods for producing recombinant adeno-associated virus (rAAV) rely on transient transfection of adherent mammalian cells. To gain acceptance and achieve current good manufacturing process (cGMP) compliance, a clinical grade rAAV production process should have the following qualities: simplicity, consistency, cost effectiveness, and scalability. Currently, the only viable method for producing rAAV in large scale, e.g., ≥1e+16 particles per production run, utilizes Baculovirus Expression Vectors (BEVs) and insect cell suspension cultures. The previously described rAAV production in 40 L culture using a stirred tank bioreactor requires special conditions for implementation and operation not available in all laboratories. Alternatives to producing rAAV in stirred-tank bioreactors are single-use, disposable bioreactors, e.g. Wave™. The disposable bags are purchased pre-sterilized, thereby eliminating the need for end-user sterilization and also avoiding cleaning steps between production runs, thus facilitating the production process. In this study, rAAV production in stirred tank and Wave™ bioreactors was compared. The working volumes were 10 L and 40 L for the stirred tank bioreactors and 5 L and 20 L for the Wave™ bioreactors. Comparable yields of rAAV, ~2e+13 particles per liter of cell culture, were obtained in all volumes and configurations. These results demonstrate that producing rAAV in large scale using BEVs is reproducible, scalable, and independent of the bioreactor configuration. Keywords: adeno-associated vectors; large-scale production; stirred tank bioreactor; wave bioreactor; gene therapy. PMID:17606302

  7. Influence of fibre distribution and grain size on the mechanical behaviour of friction stir processed Mg–C composites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mertens, A., E-mail: anne.mertens@ulg.ac.be; Simar, A.; Adrien, J.

    Short C fibres–Mg matrix composites have been produced by friction stir processing sandwiches made of a layer of C fabric stacked between two sheets of Mg alloy AZ31B or AZ91D. This novel processing technique can allow the easy production of large-scale metal matrix composites. The paper investigates the microstructure of FSPed C fibre–Mg composites in relation to the fragmentation of the C fibres during FSP and their influence on the tensile properties. 3D X-ray tomography reveals that the fibres orient like onion rings and are more or less fragmented depending on the local shear stress during the process. The fibre volume fraction can be increased from 2.3% to 7.1% by reducing the nugget volume, i.e. by using a higher advancing speed in AZ31B alloy or a stronger matrix alloy, like AZ91D alloy. A higher fibre volume fraction leads to a smaller grain size which brings about an increase of the composite yield strength by 15 to 25%. However, a higher fibre volume fraction also leads to a lower fracture strain. Fracture surface observations reveal that damage occurs by fibre/matrix decohesion along fibres oriented perpendicularly to the loading direction. - Highlights: • C–Mg MMCs were produced by FSP sandwiches made of a C fabric between Mg sheets. • Fibre fragmentation and erosion is larger when the temperature reached during FSP is lower. • A lower advancing speed brings a lower fibre volume fraction and a lower grain size. • X-ray tomography reveals that fibres orient along the FSP material flow. • The fibres and grain size reduction increase the yield strength by 15 to 25%.

  8. AGRI Grain Power ethanol-for-fuel project feasibility-study report. Volume I. Project conceptual design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1981-04-01

    The AGRI GRAIN POWER (AGP) Project, hereafter referred to as the Project, was formed to evaluate the commercial viability and assess the desirability of implementing a large grain-based grass-roots anhydrous ethanol fuel project to be sited near Des Moines, Iowa. This report presents the results of a Project feasibility evaluation. The Project concept is based on involving a very strong managerial, financial and technical joint venture that is extremely expert in all facets of planning and implementing a large ethanol project; on locating the ethanol project at a highly desirable site; on utilizing a proven ethanol process; and on developing a Project that is well suited to market requirements, resource availability and competitive factors. The Project conceptual design is presented in this volume.

  9. Porous silicon micro-resonator implemented by standard photolithography process for sensing application

    NASA Astrophysics Data System (ADS)

    Girault, P.; Azuelos, P.; Lorrain, N.; Poffo, L.; Lemaitre, J.; Pirasteh, P.; Hardy, I.; Thual, M.; Guendouz, M.; Charrier, J.

    2017-10-01

    A micro-resonator based on porous silicon ridge waveguides is implemented by a large-scale standard photolithography process to obtain a low-cost and sensitive sensor based on the volume detection principle instead of the evanescent one usually used. The porous nature of the ridge waveguides allows the target molecules to be infiltrated into the core and to be detected by direct interaction with the propagated light. A racetrack resonator with a radius of 100 μm and a coupling length of 70 μm is optically characterized for the volume detection of different concentrations of glucose. A high sensitivity of 560 nm/RIU is reached with only one micro-resonator, and a limit of detection of 8×10-5 RIU, equivalent to a glucose concentration of 0.7 g/L, is obtained.
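
    A quick check of how the reported sensitivity and refractive-index limit of detection (LOD) relate to the concentration LOD quoted in the abstract. The glucose refractive-index increment used here (about 1.2e-4 RIU per g/L) is an assumed literature-typical value, not taken from the paper.

```python
sensitivity = 560.0        # nm per RIU (from the abstract)
lod_riu = 8e-5             # RIU (from the abstract)
dn_dc_glucose = 1.2e-4     # RIU per (g/L), assumed

lod_wavelength = sensitivity * lod_riu          # smallest resolvable shift, nm
lod_concentration = lod_riu / dn_dc_glucose     # g/L

print(f"wavelength LOD ≈ {lod_wavelength:.3f} nm")
print(f"glucose LOD ≈ {lod_concentration:.2f} g/L")   # ≈ 0.7 g/L, as reported
```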

  10. Analysis of changes in leg volume parameters, and orthostatic tolerance in response to lower body negative pressure during 28-days exposure to zero gravity Skylab 2

    NASA Technical Reports Server (NTRS)

    Barnett, R. D.; Gowen, R. J.; Carroll, D. R.

    1975-01-01

    The design of the leg volume measuring system employed for the M092 portion of the Skylab missions required the development of a system sensitive to large and small volume changes at the calf of the leg. These changes in volume were produced in response to the orthostatic stress of a Lower Body Negative Pressure Device (LBNPD) or by venous occlusion. The cardiovascular responses of the Apollo crewman associated with the postflight evaluations indicate varying decrements of orthostatic tolerance. The postflight changes indicate a slightly diminished ability of the cardiovascular system to function effectively against gravity following exposure to weightlessness. The objective of the Skylab LBNP experiments (M092) was to provide information about the magnitude and time course of the cardiovascular changes associated with prolonged periods of exposure to weightlessness. The equipment, signal processing, and analysis of the leg volume data obtained from the M092 experiment of the Skylab 2 Mission are described.

  11. Comparison of spectroscopy technologies for improved monitoring of cell culture processes in miniature bioreactors.

    PubMed

    Rowland-Jones, Ruth C; van den Berg, Frans; Racher, Andrew J; Martin, Elaine B; Jaques, Colin

    2017-03-01

    Cell culture process development requires the screening of large numbers of cell lines and process conditions. The development of miniature bioreactor systems has increased the throughput of such studies; however, there are limitations with their use. One important constraint is the limited number of offline samples that can be taken compared to those taken for monitoring cultures in large-scale bioreactors. The small volume of miniature bioreactor cultures (15 mL) is incompatible with the large sample volume (600 µL) required for bioanalysers routinely used. Spectroscopy technologies may be used to resolve this limitation. The purpose of this study was to compare the use of NIR, Raman, and 2D-fluorescence to measure multiple analytes simultaneously in volumes suitable for daily monitoring of a miniature bioreactor system. A novel design-of-experiment approach is described that utilizes previously analyzed cell culture supernatant to assess metabolite concentrations under various conditions while providing optimal coverage of the desired design space. Multivariate data analysis techniques were used to develop predictive models. Model performance was compared to determine which technology is more suitable for this application. 2D-fluorescence could more accurately measure ammonium concentration (RMSECV 0.031 g L-1) than Raman and NIR. Raman spectroscopy, however, was more robust at measuring lactate and glucose concentrations (RMSECV 1.11 and 0.92 g L-1, respectively) than the other two techniques. The findings suggest that Raman spectroscopy is more suited for this application than NIR and 2D-fluorescence. The implementation of Raman spectroscopy increases at-line measuring capabilities, enabling daily monitoring of key cell culture components within miniature bioreactor cultures. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 33:337-346, 2017. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers.
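
    The abstract compares spectroscopies through multivariate calibration models scored by cross-validated RMSE. Below is a minimal sketch of that workflow using partial least squares regression, a common choice for spectral data; the synthetic spectra and component count are placeholders, not the study's data or model.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 60, 200
concentration = rng.uniform(0.0, 5.0, n_samples)             # e.g. glucose, g/L
pure_spectrum = rng.normal(size=n_wavelengths)                # synthetic component
spectra = (np.outer(concentration, pure_spectrum)
           + rng.normal(scale=0.5, size=(n_samples, n_wavelengths)))

# Cross-validated predictions give the RMSECV used to rank the technologies.
model = PLSRegression(n_components=5)
predicted = cross_val_predict(model, spectra, concentration, cv=10).ravel()
rmsecv = float(np.sqrt(np.mean((predicted - concentration) ** 2)))
print(f"RMSECV = {rmsecv:.3f} g/L")
```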

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, C. Shan; Hayworth, Kenneth J.; Lu, Zhiyuan

    Focused Ion Beam Scanning Electron Microscopy (FIB-SEM) can automatically generate 3D images with superior z-axis resolution, yielding data that needs minimal image registration and related post-processing. Obstacles blocking wider adoption of FIB-SEM include slow imaging speed and lack of long-term system stability, which caps the maximum possible acquisition volume. Here, we present techniques that accelerate image acquisition while greatly improving FIB-SEM reliability, allowing the system to operate for months and generating continuously imaged volumes > 10^6 µm^3. These volumes are large enough for connectomics, where the excellent z resolution can help in tracing of small neuronal processes and accelerate the tedious and time-consuming human proofreading effort. Even higher resolution can be achieved on smaller volumes. We present example data sets from mammalian neural tissue, Drosophila brain, and Chlamydomonas reinhardtii to illustrate the power of this novel high-resolution technique to address questions in both connectomics and cell biology.

  13. The BREAST-V: a unifying predictive formula for volume assessment in small, medium, and large breasts.

    PubMed

    Longo, Benedetto; Farcomeni, Alessio; Ferri, Germano; Campanale, Antonella; Sorotos, Micheal; Santanelli, Fabio

    2013-07-01

    Breast volume assessment enhances preoperative planning of both aesthetic and reconstructive procedures, helping the surgeon in the decision-making process of shaping the breast. Numerous methods of breast size determination are currently reported but are limited by methodologic flaws and variable estimations. The authors aimed to develop a unifying predictive formula for volume assessment in small to large breasts based on anthropomorphic values. Ten anthropomorphic breast measurements and direct volumes of 108 mastectomy specimens from 88 women were collected prospectively. The authors performed a multivariate regression to build the optimal model for development of the predictive formula. The final model was then internally validated. A previously published formula was used as a reference. Mean (±SD) breast weight was 527.9 ± 227.6 g (range, 150 to 1250 g). After model selection, sternal notch-to-nipple, inframammary fold-to-nipple, and inframammary fold-to-fold projection distances emerged as the most important predictors. The resulting formula (the BREAST-V) showed an adjusted R of 0.73. The estimated expected absolute error on new breasts is 89.7 g (95 percent CI, 62.4 to 119.1 g) and the expected relative error is 18.4 percent (95 percent CI, 12.9 to 24.3 percent). Application of the reference formula to the sample yielded worse predictions than those derived from the new formula, showing an R of 0.55. The BREAST-V is a reliable tool for predicting small to large breast volumes accurately, for use as a complementary tool in surgeon evaluation. An app entitled BREAST-V for both iOS and Android devices is currently available for free download in the Apple App Store and Google Play Store. Diagnostic, II.
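
    A sketch of fitting a predictive volume formula from anthropomorphic distances, in the spirit of the abstract. The three predictors mirror those named above (sternal notch-to-nipple, inframammary fold-to-nipple, and fold-to-fold projection distances); the data and fitted coefficients are synthetic placeholders, not the published BREAST-V coefficients.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 108
X = rng.uniform([15, 5, 5], [35, 15, 15], size=(n, 3))    # distances, cm
true_w = np.array([20.0, 25.0, 15.0])
volume = X @ true_w + rng.normal(scale=80.0, size=n)       # grams (synthetic)

# Ordinary least squares with an intercept term.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, volume, rcond=None)
residuals = volume - A @ coef
print("intercept and coefficients:", np.round(coef, 2))
print("mean absolute error:", np.round(np.mean(np.abs(residuals)), 1), "g")
```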

  14. Measurement and genetics of human subcortical and hippocampal asymmetries in large datasets.

    PubMed

    Guadalupe, Tulio; Zwiers, Marcel P; Teumer, Alexander; Wittfeld, Katharina; Vasquez, Alejandro Arias; Hoogman, Martine; Hagoort, Peter; Fernandez, Guillen; Buitelaar, Jan; Hegenscheid, Katrin; Völzke, Henry; Franke, Barbara; Fisher, Simon E; Grabe, Hans J; Francks, Clyde

    2014-07-01

    Functional and anatomical asymmetries are prevalent features of the human brain, linked to gender, handedness, and cognition. However, little is known about the neurodevelopmental processes involved. In zebrafish, asymmetries arise in the diencephalon before extending within the central nervous system. We aimed to identify genes involved in the development of subtle, left-right volumetric asymmetries of human subcortical structures using large datasets. We first tested the feasibility of measuring left-right volume differences in such large-scale samples, as assessed by two automated methods of subcortical segmentation (FSL|FIRST and FreeSurfer), using data from 235 subjects who had undergone MRI twice. We tested the agreement between the first and second scan, and the agreement between the segmentation methods, for measures of bilateral volumes of six subcortical structures and the hippocampus, and their volumetric asymmetries. We also tested whether there were biases introduced by left-right differences in the regional atlases used by the methods, by analyzing left-right flipped images. While many bilateral volumes were measured well (scan-rescan r = 0.6-0.8), most asymmetries, with the exception of the caudate nucleus, showed lower repeatabilities. We meta-analyzed genome-wide association scan results for caudate nucleus asymmetry in a combined sample of 3,028 adult subjects but did not detect associations at genome-wide significance (P < 5 × 10(-8)). There was no enrichment of genetic association in genes involved in left-right patterning of the viscera. Our results provide important information for researchers who are currently aiming to carry out large-scale genome-wide studies of subcortical and hippocampal volumes, and their asymmetries. Copyright © 2013 Wiley Periodicals, Inc.
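
    A minimal sketch of the kind of repeatability check described above: compute a left-right asymmetry index for a structure from two scans of the same subjects and correlate them. The data are synthetic, and the asymmetry index form (L - R) / (L + R) is one common convention assumed here rather than taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
n_subjects = 235
left = rng.normal(3600.0, 300.0, n_subjects)              # e.g. caudate, mm^3
right = left * rng.normal(1.02, 0.02, n_subjects)          # subtle asymmetry

def asymmetry_index(l, r):
    return (l - r) / (l + r)

# Second scan: same anatomy plus measurement noise from re-segmentation.
noise = lambda: rng.normal(0.0, 60.0, n_subjects)
ai_scan1 = asymmetry_index(left + noise(), right + noise())
ai_scan2 = asymmetry_index(left + noise(), right + noise())

r = np.corrcoef(ai_scan1, ai_scan2)[0, 1]
print(f"scan-rescan correlation of the asymmetry index: r = {r:.2f}")
```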

  15. High-productivity DRIE solutions for 3D-SiP and MEMS volume manufacturing

    NASA Astrophysics Data System (ADS)

    Puech, M.; Thevenoud, J. M.; Launay, N.; Arnal, N.; Godinat, P.; Andrieu, B.; Gruffat, J. M.

    2006-12-01

    Emerging 3D-SiP technologies and high volume MEMS applications require high productivity mass production DRIE systems. The Alcatel DRIE product range has recently been optimized to reach the highest process and hardware production performances. A study based on sub-micron high aspect ratio structures encountered in the most stringent 3D-SiP has been carried out. The optimization of the Bosch process parameters have shown ultra high silicon etch rate, with unrivaled uniformity and repeatability leading to excellent process yields. In parallel, most recent hardware and proprietary design optimization including vacuum pumping lines, process chamber, wafer chucks, pressure control system, gas delivery are discussed. A key factor for achieving the highest performances was the recognized expertise of Alcatel vacuum and plasma science technologies. These improvements have been monitored in a mass production environment for a mobile phone application. Field data analysis shows a significant reduction of cost of ownership thanks to increased throughput and much lower running costs. These benefits are now available for all 3D-SiP and high volume MEMS applications. The typical etched patterns include tapered trenches for CMOS imagers, through silicon via holes for die stacking, well controlled profile angle for 3D high precision inertial sensors, and large exposed area features for inkjet printer head and Silicon microphones.

  16. Espina: A Tool for the Automated Segmentation and Counting of Synapses in Large Stacks of Electron Microscopy Images

    PubMed Central

    Morales, Juan; Alonso-Nanclares, Lidia; Rodríguez, José-Rodrigo; DeFelipe, Javier; Rodríguez, Ángel; Merchán-Pérez, Ángel

    2011-01-01

    The synapses in the cerebral cortex can be classified into two main types, Gray's type I and type II, which correspond to asymmetric (mostly glutamatergic excitatory) and symmetric (inhibitory GABAergic) synapses, respectively. Hence, the quantification and identification of their different types and the proportions in which they are found is extraordinarily important in terms of brain function. The ideal approach to calculate the number of synapses per unit volume is to analyze 3D samples reconstructed from serial sections. However, obtaining serial sections by transmission electron microscopy is an extremely time consuming and technically demanding task. Using focused ion beam/scanning electron microscopy, we recently showed that virtually all synapses can be accurately identified as asymmetric or symmetric synapses when they are visualized, reconstructed, and quantified from large 3D tissue samples obtained in an automated manner. Nevertheless, the analysis, segmentation, and quantification of synapses is still a labor intensive procedure. Thus, novel solutions are currently necessary to deal with the large volume of data that is being generated by automated 3D electron microscopy. Accordingly, we have developed ESPINA, a software tool that performs the automated segmentation and counting of synapses in a reconstructed 3D volume of the cerebral cortex, and that greatly facilitates and accelerates these processes. PMID:21633491

  17. Comparison of semipermeable membrane device (SPMD) and large-volume solid-phase extraction techniques to measure water concentrations of 4,4'-DDT, 4,4'-DDE, and 4,4'-DDD in Lake Chelan, Washington.

    PubMed

    Ellis, Steven G; Booij, Kees; Kaputa, Mike

    2008-07-01

    Semipermeable membrane devices (SPMDs) spiked with the performance reference compound PCB29 were deployed 6.1 m above the sediments of Lake Chelan, Washington, for a period of 27 d, to estimate the dissolved concentrations of 4,4'-DDT, 4,4'-DDE, and 4,4'-DDD. Water concentrations were estimated using methods proposed in 2002 and newer equations published in 2006 to determine how the application of the newer equations affects historical SPMD data that used the older method. The estimated concentrations of DDT, DDE, and DDD calculated using the older method were 1.5-2.9 times higher than with the newer method. SPMD estimates from both methods were also compared to dissolved and particulate DDT concentrations measured directly by processing large volumes of water through a large-volume solid-phase extraction device (Infiltrex 300). SPMD estimates of DDD+DDE+DDT (ΣDDT) using the older and newer methods were lower than Infiltrex concentrations by factors of 1.1 and 2.3, respectively. All measurements of DDT were below the Washington State water quality standards for the protection of human health (0.59 ng l(-1)) and aquatic life (1.0 ng l(-1)).

  18. Data management in pattern recognition and image processing systems

    NASA Technical Reports Server (NTRS)

    Zobrist, A. L.; Bryant, N. A.

    1976-01-01

    Data management considerations are important to any system which handles large volumes of data or where the manipulation of data is technically sophisticated. A particular problem is the introduction of image-formatted files into the mainstream of data processing application. This report describes a comprehensive system for the manipulation of image, tabular, and graphical data sets which involve conversions between the various data types. A key characteristic is the use of image processing technology to accomplish data management tasks. Because of this, the term 'image-based information system' has been adopted.

  19. GPU-based multi-volume ray casting within VTK for medical applications.

    PubMed

    Bozorgi, Mohammadmehdi; Lindseth, Frank

    2015-03-01

    Multi-volume visualization is important for displaying relevant information in multimodal or multitemporal medical imaging studies. The main objective with the current study was to develop an efficient GPU-based multi-volume ray caster (MVRC) and validate the proposed visualization system in the context of image-guided surgical navigation. Ray casting can produce high-quality 2D images from 3D volume data but the method is computationally demanding, especially when multiple volumes are involved, so a parallel GPU version has been implemented. In the proposed MVRC, imaginary rays are sent through the volumes (one ray for each pixel in the view), and at equal and short intervals along the rays, samples are collected from each volume. Samples from all the volumes are composited using front to back α-blending. Since all the rays can be processed simultaneously, the MVRC was implemented in parallel on the GPU to achieve acceptable interactive frame rates. The method is fully integrated within the visualization toolkit (VTK) pipeline with the ability to apply different operations (e.g., transformations, clipping, and cropping) on each volume separately. The implemented method is cross-platform (Windows, Linux and Mac OSX) and runs on different graphics card (NVidia and AMD). The speed of the MVRC was tested with one to five volumes of varying sizes: 128(3), 256(3), and 512(3). A Tesla C2070 GPU was used, and the output image size was 600 × 600 pixels. The original VTK single-volume ray caster and the MVRC were compared when rendering only one volume. The multi-volume rendering system achieved an interactive frame rate (> 15 fps) when rendering five small volumes (128 (3) voxels), four medium-sized volumes (256(3) voxels), and two large volumes (512(3) voxels). When rendering single volumes, the frame rate of the MVRC was comparable to the original VTK ray caster for small and medium-sized datasets but was approximately 3 frames per second slower for large datasets. The MVRC was successfully integrated in an existing surgical navigation system and was shown to be clinically useful during an ultrasound-guided neurosurgical tumor resection. A GPU-based MVRC for VTK is a useful tool in medical visualization. The proposed multi-volume GPU-based ray caster for VTK provided high-quality images at reasonable frame rates. The MVRC was effective when used in a neurosurgical navigation application.
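
    An illustrative sketch of the front-to-back α-blending used by the multi-volume ray caster: at each sample position along a ray, samples from all volumes are composited into the accumulated colour and opacity. This is a CPU/NumPy toy for a single ray, not the GPU implementation.

```python
import numpy as np

def composite_ray(samples_per_volume):
    """samples_per_volume: list of arrays of (r, g, b, a) samples along one ray,
    one array per volume, all taken at the same positions."""
    acc_rgb = np.zeros(3)
    acc_alpha = 0.0
    n_steps = samples_per_volume[0].shape[0]
    for i in range(n_steps):
        for vol in samples_per_volume:            # blend every volume at this step
            r, g, b, a = vol[i]
            weight = (1.0 - acc_alpha) * a        # front-to-back "under" operator
            acc_rgb += weight * np.array([r, g, b])
            acc_alpha += weight
            if acc_alpha >= 0.99:                 # early ray termination
                return acc_rgb, acc_alpha
    return acc_rgb, acc_alpha
```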

  20. Manufacture of multi-layer woven preforms

    NASA Technical Reports Server (NTRS)

    Mohamed, M. H.; Zhang, Z.; Dickinson, L.

    1988-01-01

    This paper reviews current three-dimensional weaving processes and discusses a process developed at the Mars Mission Research Center of North Carolina State University to weave three-dimensional multilayer fabrics. The fabrics may vary in size and complexity from simple panels to T-section or I-section beams to large stiffened panels. Parameters such as fiber orientation, volume fraction of the fiber required in each direction, yarn spacings or density, etc., which determine the physical properties of the composites are discussed.

  1. USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process. Volume 1.

    DTIC Science & Technology

    1998-12-31

    solely to have a record that could be matched with the CMOS receipt data. (This problem is caused by DLA systems that currently do not populate CMOS with...unable to obtain passwords to the Depot D035 systems. Figure 16 shows daily savings as of 30 September 1998 (current time frame ) and projects savings...Engineering, modeling, and systems/software development company LAN Local Area Network LFA Large Frame Aircraft LMA Logistics Management Agency LMR

  2. Large-Scale Production of Nanographite by Tube-Shear Exfoliation in Water

    PubMed Central

    Engström, Ann-Christine; Hummelgård, Magnus; Andres, Britta; Forsberg, Sven; Olin, Håkan

    2016-01-01

    The number of applications based on graphene, few-layer graphene, and nanographite is rapidly increasing. A large-scale process for production of these materials is critically needed to achieve cost-effective commercial products. Here, we present a novel process to mechanically exfoliate industrial quantities of nanographite from graphite in an aqueous environment with low energy consumption and at controlled shear conditions. This process, based on hydrodynamic tube shearing, produced nanometer-thick and micrometer-wide flakes of nanographite with a production rate exceeding 500 g h-1 and an energy consumption of about 10 Wh g-1. In addition, to facilitate large-area coating, we show that the nanographite can be mixed with nanofibrillated cellulose in the process to form highly conductive, robust and environmentally friendly composites. This composite has a sheet resistance below 1.75 Ω/sq and an electrical resistivity of 1.39×10-4 Ω m and may find use in several applications, from supercapacitors and batteries to printed electronics and solar cells. A batch of 100 liters was processed in less than 4 hours. The design of the process allows scaling to even larger volumes and the low energy consumption indicates a low-cost process. PMID:27128841

  3. Basic principles of flight test instrumentation engineering, volume 1, issue 2

    NASA Technical Reports Server (NTRS)

    Borek, Robert W., Sr. (Editor); Pool, A. (Editor)

    1994-01-01

    Volume 1 of the AG 300 series on 'Flight Test Instrumentation' gives a general introduction to the basic principles of flight test instrumentation. The other volumes in the series provide more detailed treatments of selected topics on flight test instrumentation. Volume 1, first published in 1974, has been used extensively as an introduction for instrumentation courses and symposia, as well as being a reference work on the desk of most flight test and instrumentation engineers. It is hoped that this second edition, fully revised, will be used with as much enthusiasm as the first edition. In this edition a flight test system is considered to include both the data collection and data processing systems. In order to obtain an optimal data flow, the overall design of these two subsystems must be carefully matched; the detail development and the operation may have to be done by separate groups of specialists. The main emphasis is on the large automated instrumentation systems used for the initial flight testing of modern military and civil aircraft. This is done because there, many of the problems, which are discussed here, are more critical. It does not imply, however, that smaller systems with manual data processing are no longer used. In general, the systems should be designed to provide the required results at the lowest possible cost. For many tests which require only a few parameters, relatively simple systems are justified, especially if no complex equipment is available to the user. Although many of the aspects discussed in this volume apply to both small and large systems, aspects of the smaller systems are mentioned only when they are of special interest. The volume has been divided into three main parts. Part 1 defines the main starting points for the design of a flight test instrumentation system, as seen from the points of view of the flight test engineer and the instrumentation engineer. In Part 2 the discussion is concentrated on those aspects which apply to each individual measuring channel, and in Part 3 the main emphasis is on the integration of the individual data channels into one data collection system and on those aspects of the data processing which apply to the complete system.

  4. Recurrent eruption and subsidence at the Platoro caldera complex, southeastern San Juan volcanic field, Colorado: New tales from old tuffs

    USGS Publications Warehouse

    Lipman, P.W.; Dungan, M.A.; Brown, L.L.; Deino, A.

    1996-01-01

    Reinterpretation of a voluminous regional ash-flow sheet (Masonic Park Tuff) as two separate tuff sheets of similar phenocryst-rich dacite erupted from separate source calderas has important implications for evolution of the multicyclic Platoro caldera complex and for caldera-forming processes generally. Masonic Park Tuff in central parts of the San Juan field, including the type area, was erupted from a concealed source at 28.6 Ma, but widespread tuff previously mapped as Masonic Park Tuff in the southeastern San Juan Mountains is the product of the youngest large-volume eruption of the Platoro caldera complex at 28.4 Ma. This large unit, newly named the "Chiquito Peak Tuff," is the last-erupted tuff of the Treasure Mountain Group, which consists of at least 20 separate ash-flow sheets of dacite to low-silica rhyolite erupted from the Platoro complex during a 1 m.y. interval (29.5-28.4 Ma). Two Treasure Mountain tuff sheets have volumes in excess of 1000 km3 each, and five more have volumes of 50-150 km3. The total volume of ash-flow tuff exceeds 2500 km3, and caldera-related lavas of dominantly andesitic composition make up 250-500 km3 more. A much greater volume of intermediate-composition magma must have solidified in subcaldera magma chambers. Most preserved features of the Platoro complex - including postcollapse asymmetrical trap-door resurgent uplift of the ponded intracaldera tuff and concurrent infilling by andesitic lava flows - postdate eruption of the Chiquito Peak Tuff. The numerous large-volume pre-Chiquito Peak ash-flow tuffs document multiple eruptions accompanied by recurrent subsidence; early-formed caldera walls nearly coincide with margins of the later Chiquito Peak collapse. Repeated syneruptive collapse at the Platoro complex requires cumulative subsidence of at least 10 km. The rapid regeneration of silicic magmas requires the sustained presence of an andesitic subcaldera magma reservoir, or its rapid replenishment, during the 1 m.y. life span of the Platoro complex. Either case implies large-scale stoping and assimilative recycling of the Tertiary section, including intracaldera tuffs.

  5. Middleware for big data processing: test results

    NASA Astrophysics Data System (ADS)

    Gankevich, I.; Gaiduchok, V.; Korkhov, V.; Degtyarev, A.; Bogdanov, A.

    2017-12-01

    Dealing with large volumes of data is resource-consuming work which is more and more often delegated not only to a single computer but also to a whole distributed computing system at once. As the number of computers in a distributed system increases, the amount of effort put into effective management of the system grows. When the system reaches some critical size, much effort should be put into improving its fault tolerance. It is difficult to estimate when some particular distributed system needs such facilities for a given workload, so instead they should be implemented in a middleware which works efficiently with a distributed system of any size. It is also difficult to estimate whether a volume of data is large or not, so the middleware should also work with data of any volume. In other words, the purpose of the middleware is to provide facilities that adapt a distributed computing system for a given workload. In this paper we introduce such a middleware appliance. Tests show that this middleware is well-suited for typical HPC and big data workloads and its performance is comparable with well-known alternatives.

  6. a Hadoop-Based Distributed Framework for Efficient Managing and Processing Big Remote Sensing Images

    NASA Astrophysics Data System (ADS)

    Wang, C.; Hu, F.; Hu, X.; Zhao, S.; Wen, W.; Yang, C.

    2015-07-01

    Various sensors from airborne and satellite platforms are producing large volumes of remote sensing images for mapping, environmental monitoring, disaster management, military intelligence, and others. However, it is challenging to efficiently store, query, and process such big data due to data- and computing-intensive issues. In this paper, a Hadoop-based framework is proposed to manage and process the big remote sensing data in a distributed and parallel manner. Especially, remote sensing data can be directly fetched from other data platforms into the Hadoop Distributed File System (HDFS). The Orfeo toolbox, a ready-to-use tool for large image processing, is integrated into MapReduce to provide affluent image processing operations. With the integration of HDFS, the Orfeo toolbox and MapReduce, these remote sensing images can be directly processed in parallel in a scalable computing environment. The experiment results show that the proposed framework can efficiently manage and process such big remote sensing data.
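
    Hadoop streaming pairs a mapper and a reducer that communicate over stdin/stdout. The minimal word-count-style sketch below illustrates that MapReduce programming model with a hypothetical tile key emitted by an upstream step; it is not the paper's Orfeo-integrated pipeline.

```python
import sys

def mapper():
    # Input lines are assumed to look like "<tile_id>\t<image_path>".
    for line in sys.stdin:
        tile_id, _path = line.rstrip("\n").split("\t", 1)
        print(f"{tile_id}\t1")

def reducer():
    # Hadoop delivers mapper output sorted by key, so equal keys are adjacent.
    current, count = None, 0
    for line in sys.stdin:
        key, value = line.rstrip("\n").split("\t", 1)
        if key != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = key, 0
        count += int(value)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    (mapper if sys.argv[1] == "map" else reducer)()
```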

  7. Competition between reaction-induced expansion and creep compaction during gypsum formation: Experimental and numerical investigation

    NASA Astrophysics Data System (ADS)

    Skarbek, R. M.; Savage, H. M.; Spiegelman, M. W.; Kelemen, P. B.; Yancopoulos, D.

    2017-12-01

    Deformation and cracking caused by reaction-driven volume increase are important processes in many geological settings; however, the conditions controlling these processes are poorly understood. The interaction of rocks with reactive fluids can change permeability and reactive surface area, leading to a large variety of feedbacks. Gypsum is an ideal material to study these processes. It forms rapidly at room temperature via bassanite hydration, and is commonly used as an analogue for rocks under high-temperature, high-pressure conditions. We conducted uniaxial strain experiments to study the effects of applied axial load on deformation and fluid flow during the formation of gypsum from bassanite. While hydration of bassanite to gypsum involves a solid volume increase, gypsum exhibits significant creep compaction when in contact with water. These two volume-changing processes occur simultaneously during fluid flow through bassanite. We cold-pressed bassanite powder to form cylinders 2.5 cm in height and 1.2 cm in diameter. Samples were compressed with a static axial load of 0.01 to 4 MPa. Water infiltrated initially unsaturated samples through the bottom face, and the height of the samples was recorded as a measure of the total volume change. We also performed experiments on pure gypsum samples to constrain the amount of creep observed in tests on bassanite hydration. At axial loads < 0.15 MPa, volume increase due to the reaction dominates and samples exhibit monotonic expansion. At loads > 1 MPa, creep in the gypsum dominates and samples exhibit monotonic compaction. At intermediate loads, samples exhibit alternating phases of compaction and expansion due to the interplay of the two volume-changing processes. We observed a change from net compaction to net expansion at an axial load of 0.250 MPa. We explain this behavior with a simple model that predicts the strain evolution but does not take fluid flow into account. We also implement a 1D poro-visco-elastic model of the imbibition process that includes the reaction and gypsum creep. We use the results of these models, together with models of the creep rate in gypsum, to estimate the temperature dependence of the axial load at which total strain transitions from compaction to expansion. Our results have implications for the depth dependence of reaction-induced volume changes in the Earth.
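
    A toy model of the competition described above: total axial strain as the sum of reaction-driven expansion (saturating as bassanite is consumed) and load-dependent creep compaction. The rate constants and load sensitivity are illustrative placeholders, not values fitted to the experiments.

```python
import numpy as np

def strain_history(axial_load_mpa, t_hours,
                   k_reaction=0.5, expansion_total=0.05,
                   creep_coeff=0.02):
    """Return net axial strain (+ = expansion) versus time in hours."""
    t = np.asarray(t_hours, dtype=float)
    # Reaction extent follows first-order kinetics toward completion.
    reaction_extent = 1.0 - np.exp(-k_reaction * t)
    expansion = expansion_total * reaction_extent
    # Creep compaction grows with load and (logarithmically) with time.
    compaction = creep_coeff * axial_load_mpa * np.log1p(t)
    return expansion - compaction

t = np.linspace(0.0, 24.0, 7)
for load in (0.01, 0.25, 4.0):       # low, intermediate, high load [MPa]
    print(load, np.round(strain_history(load, t), 4))
```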

  8. LiDAR monitoring of retrogressive processes on the steep rockslope of a large landslide in the Japanese Alps

    NASA Astrophysics Data System (ADS)

    Nishii, R.; Imaizumi, F.; Murakami, W.; Daimaru, H.; Miyamae, T.; Ogawa, Y.

    2012-04-01

    The Akakuzure landslide in the Japanese Alps is located on a steep mountain slope that has experienced deep-seated gravitational slope deformation. The landslide is 700 m high (1200-1900 m a.s.l.), 700 m wide and 400,000 m^2 in area, with post-collapse sediment of ca. 27 million m^3 in volume. The steep rockslope (>40°) in the landslide shows an anaclinal structure consisting of sandstone interbedded with shale. A large volume of sediment produced from the landslide has actively formed an alluvial fan at the outlet of the landslide. The volume and processes of sediment production in the upper part (ca. 40,000 m^2) of the landslide were evaluated by geodetic surveys using airborne and ground-based LiDAR (Light Detection and Ranging). The airborne and ground-based LiDAR surveys were performed twice (2003 and 2007) and three times (2010-2011), respectively. Ground surface temperatures were monitored at three locations within the landslide from 2010 to 2011. Precipitation and air temperature have also been observed at a meteorological station near the study site. The average erosion depths on the observed rockslope reached 0.89 m (0.22 m/yr) during the first 4 years (2003-2007) and 0.55 m (0.18 m/yr) during the latter 3 years (2007-2010). The erosion mainly occurred within the landslide rather than on its edge (i.e. no significant retreat of the main scarp). Such large sediment production can be divided into three processes based on the depth of detachment. Deep detachment (>5 m in depth), which contributed significantly to the retreat of the rockslope, occurred in large blocks located just above knick lines. During the observation period, at least five large blocks fell, which appear to have originated from sliding along detachment zones steeper than 30°. Second, anaclinal, bedding-parallel blocks (1-2 m in depth) fell, mainly around sandstone layers. Finally, thin detachment (<1 m in depth) occurred widely on the rockslope. On one part of the shale layers, the erosion depth reached 0.35 m from 2010 to 2011. In the Akakuzure landslide, numerous fractures in the bedrock, probably produced by gravitational deformation, play an important role in promoting rapid erosion, in addition to external triggers such as heavy rainfall and frost action.

  9. Partial volume correction and image analysis methods for intersubject comparison of FDG-PET studies

    NASA Astrophysics Data System (ADS)

    Yang, Jun

    2000-12-01

    The partial volume effect is an artifact caused mainly by limited imaging sensor resolution. It creates bias in the measured activity in small structures and around tissue boundaries. In brain FDG-PET studies, especially Alzheimer's disease studies where there is serious gray matter atrophy, accurate estimation of the cerebral metabolic rate of glucose is even more problematic because of the large partial volume effect. In this dissertation, we developed a framework enabling inter-subject comparison of partial volume corrected brain FDG-PET studies. The framework is composed of the following image processing steps: (1) MRI segmentation, (2) MR-PET registration, (3) MR-based PVE correction, and (4) MR 3D inter-subject elastic mapping. Through simulation studies, we showed that the newly developed partial volume correction methods, either pixel-based or ROI-based, performed better than previous methods. By applying this framework to a real Alzheimer's disease study, we demonstrated that the partial volume corrected glucose rates vary significantly among the control, at-risk and disease patient groups, and that this framework is a promising tool for assisting early identification of Alzheimer's patients.
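
    As a generic illustration of MR-based partial volume correction (a simple two-compartment, Meltzer-style correction rather than the dissertation's specific pixel- or ROI-based algorithms), the measured PET image can be divided by the scanner-resolution-blurred gray-matter fraction obtained from the MRI segmentation; the array names and threshold below are assumptions.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def pvc_two_compartment(pet, gm_mask, fwhm_vox, min_fraction=0.1):
            """Divide the PET image by the PSF-blurred gray-matter fraction.
            Voxels with a very small tissue fraction are left uncorrected to
            avoid amplifying noise."""
            sigma = fwhm_vox / 2.3548                  # Gaussian FWHM -> sigma
            tissue_fraction = gaussian_filter(gm_mask.astype(float), sigma)
            corrected = pet.copy()
            ok = tissue_fraction > min_fraction
            corrected[ok] = pet[ok] / tissue_fraction[ok]
            return corrected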

  10. A summary of the history of the development of automated remote sensing for agricultural applications

    NASA Technical Reports Server (NTRS)

    Macdonald, R. B.

    1984-01-01

    An historical account is given of the development of technology for the processing of satellite-acquired multispectral data aimed at the identification of the type, condition, and ontogenic stages of agricultural areas. During 1972 and 1973, research established the feasibility of automating digital classification for the processing of large volumes of Landsat MSS data. This capability was successfully demonstrated during the Large Area Crop Inventory Experiment, which estimated wheat crop production on a global basis. This achievement in turn led to the Agriculture and Resources Inventory Surveys Through Aerospace Remote Sensing, which investigated other portions of the electromagnetic spectrum and expanded the study of key commercial crops in important agricultural areas.

  11. SEAPAK user's guide, version 2.0. Volume 2: Descriptions of programs

    NASA Technical Reports Server (NTRS)

    Mcclain, Charles R.; Darzi, Michael; Firestone, James K.; Fu, Gary; Yeh, Eueng-Nan; Endres, Daniel L.

    1991-01-01

    The SEAPAK is a user-interactive satellite data analysis package that was developed for the processing and interpretation of Nimbus-7/Coastal Zone Color Scanner (CZCS) and the NOAA Advanced Very High Resolution Radiometer (AVHRR) data. Significant revisions were made since version 1.0, and the ancillary environmental data analysis module was greatly expanded. The package continues to be user friendly and user interactive. Also, because the scientific goals of the ocean color research being conducted have shifted to large space and time scales, batch processing capabilities for both satellite and ancillary environmental data analyses were enhanced, thus allowing for large quantities of data to be ingested and analyzed.

  12. SEAPAK user's guide, version 2.0. Volume 1: System description

    NASA Technical Reports Server (NTRS)

    Mcclain, Charles R.; Darzi, Michael; Firestone, James K.; Fu, Gary; Yeh, Eueng-Nan; Endres, Daniel L.

    1991-01-01

    The SEAPAK is a user interactive satellite data analysis package that was developed for the processing and interpretation of Nimbus-7/Coastal Zone Color Scanner (CZCS) and the NOAA Advanced Very High Resolution Radiometer (AVHRR) data. Significant revisions were made to version 1.0 of the guide, and the ancillary environmental data analysis module was expanded. The package continues to emphasize user friendliness and user interactive data analyses. Additionally, because the scientific goals of the ocean color research being conducted have shifted to large space and time scales, batch processing capabilities for both satellite and ancillary environmental data analyses were enhanced, thus allowing large quantities of data to be ingested and analyzed in background.

  13. ArcGIS Framework for Scientific Data Analysis and Serving

    NASA Astrophysics Data System (ADS)

    Xu, H.; Ju, W.; Zhang, J.

    2015-12-01

    ArcGIS is a platform for managing, visualizing, analyzing, and serving geospatial data. Scientific data, as part of geospatial data, feature multiple dimensions (X, Y, time, and depth) and large volumes. The multidimensional mosaic dataset (MDMD), a newly enhanced data model in ArcGIS, models multidimensional gridded data (e.g. raster or image) as a hypercube and enables ArcGIS to handle large-volume and near-real-time scientific data. Built on top of the geodatabase, the MDMD stores the dimension values and the variables (2D arrays) in a geodatabase table, which allows accessing a slice or slices of the hypercube through a simple query and supports animating changes along the time or vertical dimension using ArcGIS desktop or web clients. Through raster types, the MDMD can manage not only netCDF, GRIB, and HDF formats but also many other formats and satellite data. It is scalable and can handle large data volumes, and the parallel geo-processing engine makes data ingestion fast and easy. A raster function, the definition of a raster processing algorithm, is a very important component of the ArcGIS platform for on-demand raster processing and analysis. Scientific data analytics is achieved through the MDMD and raster function templates, which perform on-demand scientific computation on variables ingested into the MDMD: for example, aggregating monthly averages from daily data, computing the total rainfall of a year, calculating the heat index for forecast data, and identifying fishing habitat zones. Additionally, the MDMD with its associated raster function templates can be served through ArcGIS Server as image services, which provide a framework for on-demand server-side computation and analysis, and the published services can be accessed by multiple clients such as ArcMap, ArcGIS Online, JavaScript, REST, WCS, and WMS. This presentation will focus on the MDMD model and raster processing templates. In addition, MODIS land cover, the NDFD weather service, and the HYCOM ocean model will be used to illustrate how the ArcGIS platform and the MDMD model can facilitate scientific data visualization and analytics, and how the analysis results can be shared with a wider audience through ArcGIS Online and Portal.
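
    The storage idea behind the MDMD, as described above (dimension values and 2D variable slices kept as rows of a table, so that a time or depth slice is a simple query), can be sketched in a platform-neutral way. The snippet below is not the ArcGIS API; the field names and data are hypothetical and serve only to illustrate the hypercube-as-table data model.

        import numpy as np

        # Each "row" holds one 2D slice of the hypercube plus its dimension values.
        rows = [
            {"variable": "sst", "time": "2015-07-01", "depth": 0.0,
             "array": np.random.rand(4, 4)},
            {"variable": "sst", "time": "2015-07-02", "depth": 0.0,
             "array": np.random.rand(4, 4)},
        ]

        def query_slices(rows, **criteria):
            """Return the 2D arrays whose dimension values match the query,
            e.g. query_slices(rows, variable="sst", time="2015-07-02")."""
            return [r["array"] for r in rows
                    if all(r.get(k) == v for k, v in criteria.items())]

        # An aggregation along the time dimension, analogous to deriving a
        # temporal average on demand from the ingested slices.
        stack = np.stack(query_slices(rows, variable="sst"))
        print(stack.mean(axis=0))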

  14. Use of high-volume outdoor smog chamber photo-reactors for studying physical and chemical atmospheric aerosol formation and composition

    NASA Astrophysics Data System (ADS)

    Borrás, E.; Ródenas, M.; Vera, T.; Muñoz, A.

    2015-12-01

    Atmospheric particulate matter has a large impact on climate, biosphere behaviour and human health. Its study is complex because a large number of species are present at low concentrations and evolve continuously in time, and because these processes are not easily separated from meteorology and transport. Closed systems have been proposed to isolate specific reactions, pollutants or products and to control the oxidizing environment. High-volume simulation chambers, such as the EUropean PHOtoREactor (EUPHORE), are an essential tool used to simulate atmospheric photochemical reactions. This communication describes the latest results on the reactivity of prominent atmospheric pollutants and the subsequent particulate matter formation. Specific experiments focused on organic aerosols have been carried out at the EUPHORE photo-reactor. The use of on-line instrumentation, supported by off-line techniques, has provided well-defined reaction profiles and physical properties, and up to 300 different species have been determined in the particulate matter. The application fields include the degradation of anthropogenic and biogenic pollutants and pesticides under several atmospheric conditions, and the study of their contribution to the formation of secondary organic aerosols (SOA). The studies performed at the EUPHORE have improved the mechanistic understanding of atmospheric degradation processes and the knowledge of the chemical and physical properties of the atmospheric particulate matter formed during these processes.

  15. Biorefinery process for protein extraction from oriental mustard (Brassica juncea (L.) Czern.) using ethanol stillage.

    PubMed

    Ratanapariyanuch, Kornsulee; Tyler, Robert T; Shim, Youn Young; Reaney, Martin Jt

    2012-01-12

    Large volumes of treated process water are required for protein extraction. Evaporation of this water contributes greatly to the energy consumed in enriching protein products. Thin stillage remaining from ethanol production is available in large volumes and may be suitable for extracting protein-rich materials. In this work, protein was extracted from ground defatted oriental mustard (Brassica juncea (L.) Czern.) meal using thin stillage. Protein extraction efficiency was studied at pHs between 7.6 and 10.4 and salt concentrations between 3.4 × 10^-2 and 1.2 M. The optimum extraction efficiency was obtained at pH 10.0 and 1.0 M NaCl. Napin and cruciferin were the most prevalent proteins in the isolate. The isolate exhibited high in vitro digestibility (74.9 ± 0.80%) and lysine content (5.2 ± 0.2 g/100 g of protein). No differences in the efficiency of extraction, SDS-PAGE profile, digestibility, lysine availability, or amino acid composition were observed between protein extracted with thin stillage and that extracted with NaCl solution. The use of thin stillage, in lieu of water, for protein extraction would decrease the energy requirements and waste disposal costs of the protein isolation and biofuel production processes.

  16. Biorefinery process for protein extraction from oriental mustard (Brassica juncea (L.) Czern.) using ethanol stillage

    PubMed Central

    2012-01-01

    Large volumes of treated process water are required for protein extraction. Evaporation of this water contributes greatly to the energy consumed in enriching protein products. Thin stillage remaining from ethanol production is available in large volumes and may be suitable for extracting protein-rich materials. In this work, protein was extracted from ground defatted oriental mustard (Brassica juncea (L.) Czern.) meal using thin stillage. Protein extraction efficiency was studied at pHs between 7.6 and 10.4 and salt concentrations between 3.4 × 10^-2 and 1.2 M. The optimum extraction efficiency was obtained at pH 10.0 and 1.0 M NaCl. Napin and cruciferin were the most prevalent proteins in the isolate. The isolate exhibited high in vitro digestibility (74.9 ± 0.80%) and lysine content (5.2 ± 0.2 g/100 g of protein). No differences in the efficiency of extraction, SDS-PAGE profile, digestibility, lysine availability, or amino acid composition were observed between protein extracted with thin stillage and that extracted with NaCl solution. The use of thin stillage, in lieu of water, for protein extraction would decrease the energy requirements and waste disposal costs of the protein isolation and biofuel production processes. PMID:22239856

  17. Efficient visibility-driven medical image visualisation via adaptive binned visibility histogram.

    PubMed

    Jung, Younhyun; Kim, Jinman; Kumar, Ashnil; Feng, David Dagan; Fulham, Michael

    2016-07-01

    'Visibility' is a fundamental optical property that represents the proportion of the voxels in a volume that is observable by users during interactive volume rendering. The manipulation of this 'visibility' improves the volume rendering processes; for instance by ensuring the visibility of regions of interest (ROIs) or by guiding the identification of an optimal rendering view-point. The construction of visibility histograms (VHs), which represent the distribution of the visibility of all voxels in the rendered volume, enables users to explore the volume with real-time feedback about occlusion patterns among spatially related structures during volume rendering manipulations. Volume-rendered medical images have been a primary beneficiary of VHs, given the need to ensure that specific ROIs are visible relative to the surrounding structures, e.g. the visualisation of tumours that may otherwise be occluded by neighbouring structures. VH construction and its subsequent manipulations, however, are computationally expensive due to the histogram binning of the visibilities. This limits the real-time application of VHs to medical images that have large intensity ranges and volume dimensions and require a large number of histogram bins. In this study, we introduce an efficient adaptive binned visibility histogram (AB-VH) in which a smaller number of histogram bins are used to represent the visibility distribution of the full VH. We adaptively bin medical images by using a cluster analysis algorithm that groups the voxels according to their intensity similarities into a smaller subset of bins while preserving the distribution of the intensity range of the original images. We increase efficiency by exploiting the parallel computation and multiple render targets (MRT) extension of modern graphics processing units (GPUs), which enables efficient computation of the histogram. We show the application of our method to single-modality computed tomography (CT), magnetic resonance (MR) imaging and multi-modality positron emission tomography-CT (PET-CT). In our experiments, the AB-VH markedly improved the computational efficiency of VH construction and thus improved the subsequent VH-driven volume manipulations. This efficiency was achieved without major visual or numerical differences between the AB-VH and its full-bin counterpart. We applied several variants of the K-means clustering algorithm with varying K (the number of clusters) and found that higher values of K resulted in better performance at a lower computational gain. The AB-VH also showed improved performance compared to the conventional method of down-sampling the histogram bins (equal binning) for volume rendering visualisation. Copyright © 2016 Elsevier Ltd. All rights reserved.
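
    A minimal CPU-side sketch of the adaptive binning idea (cluster the voxel intensities with K-means, then accumulate per-voxel visibility into the resulting bins) is given below. It follows the description in the abstract rather than the authors' GPU implementation, and it assumes the per-voxel visibility values have already been produced by the renderer.

        import numpy as np
        from sklearn.cluster import KMeans

        def adaptive_binned_vh(intensities, visibilities, n_bins=32, seed=0):
            """Cluster voxel intensities into n_bins adaptive bins and sum the
            per-voxel visibility within each bin (the AB-VH idea, CPU sketch)."""
            x = intensities.reshape(-1, 1).astype(float)
            labels = KMeans(n_clusters=n_bins, n_init=10,
                            random_state=seed).fit_predict(x)
            histogram = np.zeros(n_bins)
            np.add.at(histogram, labels, visibilities.ravel())
            # Order bins by their mean intensity so the histogram reads like a
            # conventional intensity axis.
            order = np.argsort([x[labels == b].mean() for b in range(n_bins)])
            return histogram[order]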

  18. A simplified method to recover urinary vesicles for clinical applications, and sample banking.

    PubMed

    Musante, Luca; Tataruch, Dorota; Gu, Dongfeng; Benito-Martin, Alberto; Calzaferri, Giulio; Aherne, Sinead; Holthofer, Harry

    2014-12-23

    Urinary extracellular vesicles provide a novel source of valuable biomarkers for kidney and urogenital diseases. Current isolation protocols include laborious, sequential centrifugation steps, which hamper their widespread research and clinical use. Furthermore, when large individual urine sample volumes or sizable target cohorts are to be processed (e.g. for biobanking), storage capacity is an additional problem. Thus, alternative methods are necessary to overcome such limitations. We have developed a practical vesicle isolation technique that yields easily manageable sample volumes in an exceptionally cost-efficient way, to facilitate their full utilization in less privileged environments and maximize the benefit of biobanking. Urinary vesicles were isolated by hydrostatic dialysis with minimal interference from soluble proteins and minimal vesicle loss. Large volumes of urine were concentrated to up to 1/100 of the original volume, and the dialysis step allowed equalization of the urine's physico-chemical characteristics. Vesicle fractions were found suitable for all applications, including RNA analysis. In terms of yield, our hydrostatic filtration dialysis system outperforms conventional ultracentrifugation-based methods, and the labour-intensive and potentially hazardous ultracentrifugation steps are eliminated. Likewise, the need for trained laboratory personnel and heavy initial investment is avoided. Thus, our method is well suited to laboratories working with urinary vesicles and to biobanking.

  19. A Simplified Method to Recover Urinary Vesicles for Clinical Applications, and Sample Banking

    PubMed Central

    Musante, Luca; Tataruch, Dorota; Gu, Dongfeng; Benito-Martin, Alberto; Calzaferri, Giulio; Aherne, Sinead; Holthofer, Harry

    2014-01-01

    Urinary extracellular vesicles provide a novel source of valuable biomarkers for kidney and urogenital diseases. Current isolation protocols include laborious, sequential centrifugation steps, which hamper their widespread research and clinical use. Furthermore, when large individual urine sample volumes or sizable target cohorts are to be processed (e.g. for biobanking), storage capacity is an additional problem. Thus, alternative methods are necessary to overcome such limitations. We have developed a practical vesicle isolation technique that yields easily manageable sample volumes in an exceptionally cost-efficient way, to facilitate their full utilization in less privileged environments and maximize the benefit of biobanking. Urinary vesicles were isolated by hydrostatic dialysis with minimal interference from soluble proteins and minimal vesicle loss. Large volumes of urine were concentrated to up to 1/100 of the original volume, and the dialysis step allowed equalization of the urine's physico-chemical characteristics. Vesicle fractions were found suitable for all applications, including RNA analysis. In terms of yield, our hydrostatic filtration dialysis system outperforms conventional ultracentrifugation-based methods, and the labour-intensive and potentially hazardous ultracentrifugation steps are eliminated. Likewise, the need for trained laboratory personnel and heavy initial investment is avoided. Thus, our method is well suited to laboratories working with urinary vesicles and to biobanking. PMID:25532487

  20. Dry etching of metallization

    NASA Technical Reports Server (NTRS)

    Bollinger, D.

    1983-01-01

    The production dry etch processes are reviewed from the perspective of microelectronic fabrication applications. The major dry etch processes used in the fabrication of microelectronic devices can be divided into two categories: plasma processes, in which samples are directly exposed to an electrical discharge, and ion beam processes, in which samples are etched by a beam of ions extracted from a discharge. The plasma etch processes can be distinguished by the degree to which ion bombardment contributes to the etch process. This, in turn, is related to the capability for anisotropic etching. Reactive Ion Etching (RIE) and Ion Beam Etching are of most interest for etching of thin-film metals. RIE is generally considered the best process for large-volume, anisotropic aluminum etching.

  1. Membrane-based, sedimentation-assisted plasma separator for point-of-care applications.

    PubMed

    Liu, Changchun; Mauk, Michael; Gross, Robert; Bushman, Frederic D; Edelstein, Paul H; Collman, Ronald G; Bau, Haim H

    2013-11-05

    Often, high-sensitivity, point-of-care (POC) clinical tests, such as HIV viral load, require large volumes of plasma. Although centrifuges are ubiquitously used in clinical laboratories to separate plasma from whole blood, centrifugation is generally inappropriate for on-site testing. Suitable alternatives are not readily available to separate the relatively large volumes of plasma from milliliters of blood that may be needed to meet stringent limit-of-detection specifications for low-abundance target molecules. We report on a simple-to-use, low-cost, pump-free, membrane-based, sedimentation-assisted plasma separator capable of separating a relatively large volume of plasma from undiluted whole blood within minutes. This plasma separator consists of an asymmetric, porous, polysulfone membrane housed in a disposable chamber. The separation process takes advantage of both gravitational sedimentation of blood cells and size exclusion-based filtration. The plasma separator demonstrated a "blood in-plasma out" capability, consistently extracting 275 ± 33.5 μL of plasma from 1.8 mL of undiluted whole blood within less than 7 min. The device was used to separate plasma laden with HIV viruses from HIV virus-spiked whole blood with recovery efficiencies of 95.5% ± 3.5%, 88.0% ± 9.5%, and 81.5% ± 12.1% for viral loads of 35,000, 3500, and 350 copies/mL, respectively. The separation process is self-terminating to prevent excessive hemolysis. The HIV-laden plasma was then injected into our custom-made microfluidic chip for nucleic acid testing and was successfully subjected to reverse-transcriptase loop-mediated isothermal amplification (RT-LAMP), demonstrating that the plasma is sufficiently pure to support high-efficiency nucleic acid amplification.

  2. Membrane-based, sedimentation-assisted plasma separator for point-of-care applications

    PubMed Central

    Liu, Changchun; Mauk, Michael; Gross, Robert; Bushman, Frederic D.; Edelstein, Paul H.; Collman, Ronald G.; Bau, Haim H.

    2014-01-01

    Often, high sensitivity, point of care, clinical tests, such as HIV viral load, require large volumes of plasma. Although centrifuges are ubiquitously used in clinical laboratories to separate plasma from whole blood, centrifugation is generally inappropriate for on-site testing. Suitable alternatives are not readily available to separate the relatively large volumes of plasma from milliliters of blood that may be needed to meet stringent limit-of-detection specifications for low abundance target molecules. We report on a simple to use, low-cost, pump-free, membrane-based, sedimentation-assisted plasma separator capable of separating a relatively large volume of plasma from undiluted whole blood within minutes. This plasma separator consists of an asymmetric, porous, polysulfone membrane housed in a disposable chamber. The separation process takes advantage of both gravitational sedimentation of blood cells and size exclusion-based filtration. The plasma separator demonstrated a “blood in-plasma out” capability, consistently extracting 275 ±33.5 μL of plasma from 1.8 mL of undiluted whole blood in less than 7 min. The device was used to separate plasma laden with HIV viruses from HIV virus-spiked whole blood with recovery efficiencies of 95.5% ± 3.5%, 88.0% ± 9.5%, and 81.5% ± 12.1% for viral loads of 35,000, 3,500 and 350 copies/mL, respectively. The separation process is self-terminating to prevent excessive hemolysis. The HIV-laden plasma was then injected into our custom-made microfluidic chip for nucleic acid testing and was successfully subjected to reverse-transcriptase loop-mediated isothermal amplification (RT-LAMP), demonstrating that the plasma is sufficiently pure to support high efficiency nucleic acid amplification. PMID:24099566

  3. Groundwater drainage from fissures as a source for lahars

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, P. J.; Valentine, G. A.; Stauffer, P. H.

    One mechanism for generating lahars at volcanoes experiencing unrest is the disruption of internal aquifers. These disruptions can trigger releases of large quantities of groundwater. An example of such aquifer disruption occurred at Nevado del Huila Volcano, Colombia, during February and April 2007 when large fractures formed across the summit area of the volcano and lahars were emitted from them. Previous work interpreted that lahar volumes could not be accounted for by melted glacial snow or precipitation, and by elimination suggested that the primary water source was groundwater. Conceptual models have been developed for perched, confined aquifers that have been heated and pressurized by magma intrusions, followed by sudden pressure release and water emission during fracture formation. In this paper, we consider an alternative end member wherein water release from large fissures at volcanoes is driven by simple gravity drainage. We apply numerical modeling to quantify water discharge from the porous medium surrounding a fissure with a low-elevation free exit. If a long fracture with high vertical extent (on the order of hundreds of meters) intersects a highly connected saturated porous medium, large volumes (on the order of 10^3 m^3/m of crack length) of water may be released within tens of minutes. The drainage rates from the model may be adequate to account for the Nevado del Huila events if the medium surrounding the crack contains a large volume of water and has high horizontal permeability. Finally, this simple but poorly understood mechanism can present a hazard on its own or compound other processes releasing water from volcanoes.

  4. Groundwater drainage from fissures as a source for lahars

    NASA Astrophysics Data System (ADS)

    Johnson, P. J.; Valentine, G. A.; Stauffer, P. H.; Lowry, C. S.; Sonder, I.; Pulgarín, B. A.; Santacoloma, C. C.; Agudelo, A.

    2018-04-01

    One mechanism for generating lahars at volcanoes experiencing unrest is the disruption of internal aquifers. These disruptions can trigger releases of large quantities of groundwater. An example of such aquifer disruption occurred at Nevado del Huila Volcano, Colombia, during February and April 2007 when large fractures formed across the summit area of the volcano and lahars were emitted from them. Previous work interpreted that lahar volumes could not be accounted for by melted glacial snow or precipitation, and by elimination suggested that the primary water source was groundwater. Conceptual models have been developed for perched, confined aquifers that have been heated and pressurized by magma intrusions, followed by sudden pressure release and water emission during fracture formation. We consider an alternative end member wherein water release from large fissures at volcanoes is driven by simple gravity drainage. We apply numerical modeling to quantify water discharge from the porous medium surrounding a fissure with a low-elevation free exit. If a long fracture with high vertical extent (on the order of hundreds of meters) intersects a highly connected saturated porous medium, large volumes (on the order of 10^3 m^3/m of crack length) of water may be released within tens of minutes. The drainage rates from the model may be adequate to account for the Nevado del Huila events if the medium surrounding the crack contains a large volume of water and has high horizontal permeability. This simple but poorly understood mechanism can present a hazard on its own or compound other processes releasing water from volcanoes.
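
    A back-of-the-envelope Darcy calculation (not the paper's numerical model; every parameter below is an assumed, order-of-magnitude value) indicates why a tall fracture face in a highly permeable, saturated medium can release volumes of this order within tens of minutes.

        # Gravity-driven Darcy flux toward a free face, order of magnitude only.
        k = 1.0e-10           # permeability, m^2 (assumed high)
        rho_g = 1.0e3 * 9.81  # water density times gravity, Pa/m
        mu = 1.0e-3           # water viscosity, Pa.s
        grad_h = 1.0          # assumed unit hydraulic gradient near the face
        height = 300.0        # saturated fracture face height, m (assumed)

        q = k * rho_g * grad_h / mu     # Darcy flux, m/s (about 1e-3 m/s here)
        rate_per_m = q * height         # m^3/s per metre of crack length
        minutes_for_1000_m3 = 1000.0 / rate_per_m / 60.0
        print(q, rate_per_m, minutes_for_1000_m3)   # roughly an hour in this case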

  5. Groundwater drainage from fissures as a source for lahars

    DOE PAGES

    Johnson, P. J.; Valentine, G. A.; Stauffer, P. H.; ...

    2018-03-22

    One mechanism for generating lahars at volcanoes experiencing unrest is the disruption of internal aquifers. These disruptions can trigger releases of large quantities of groundwater. An example of such aquifer disruption occurred at Nevado del Huila Volcano, Colombia, during February and April 2007 when large fractures formed across the summit area of the volcano and lahars were emitted from them. Previous work interpreted that lahar volumes could not be accounted for by melted glacial snow or precipitation, and by elimination suggested that the primary water source was groundwater. Conceptual models have been developed for perched, confined aquifers that have been heated and pressurized by magma intrusions, followed by sudden pressure release and water emission during fracture formation. In this paper, we consider an alternative end member wherein water release from large fissures at volcanoes is driven by simple gravity drainage. We apply numerical modeling to quantify water discharge from the porous medium surrounding a fissure with a low-elevation free exit. If a long fracture with high vertical extent (on the order of hundreds of meters) intersects a highly connected saturated porous medium, large volumes (on the order of 10^3 m^3/m of crack length) of water may be released within tens of minutes. The drainage rates from the model may be adequate to account for the Nevado del Huila events if the medium surrounding the crack contains a large volume of water and has high horizontal permeability. Finally, this simple but poorly understood mechanism can present a hazard on its own or compound other processes releasing water from volcanoes.

  6. Large Volume, Behaviorally-relevant Illumination for Optogenetics in Non-human Primates.

    PubMed

    Acker, Leah C; Pino, Erica N; Boyden, Edward S; Desimone, Robert

    2017-10-03

    This protocol describes a large-volume illuminator, which was developed for optogenetic manipulations in the non-human primate brain. The illuminator is a modified plastic optical fiber with etched tip, such that the light emitting surface area is > 100x that of a conventional fiber. In addition to describing the construction of the large-volume illuminator, this protocol details the quality-control calibration used to ensure even light distribution. Further, this protocol describes techniques for inserting and removing the large volume illuminator. Both superficial and deep structures may be illuminated. This large volume illuminator does not need to be physically coupled to an electrode, and because the illuminator is made of plastic, not glass, it will simply bend in circumstances when traditional optical fibers would shatter. Because this illuminator delivers light over behaviorally-relevant tissue volumes (≈ 10 mm^3) with no greater penetration damage than a conventional optical fiber, it facilitates behavioral studies using optogenetics in non-human primates.

  7. Data acquisition system issues for large experiments

    NASA Astrophysics Data System (ADS)

    Siskind, E. J.

    2007-09-01

    This talk consists of personal observations on two classes of data acquisition ("DAQ") systems for Silicon trackers in large experiments with which the author has been concerned over the last three or more years. The first half is a classic "lessons learned" recital based on experience with the high-level debug and configuration of the DAQ system for the GLAST LAT detector. The second half is concerned with a discussion of the promises and pitfalls of using modern (and future) generations of "system-on-a-chip" ("SOC") or "platform" field-programmable gate arrays ("FPGAs") in future large DAQ systems. The DAQ system pipeline for the 864k channels of Si tracker in the GLAST LAT consists of five tiers of hardware buffers which ultimately feed into the main memory of the (two-active-node) level-3 trigger processor farm. The data formats and buffer volumes of these tiers are briefly described, as well as the flow control employed between successive tiers. Lessons learned regarding data formats, buffer volumes, and flow control/data discard policy are discussed. The continued development of platform FPGAs containing large amounts of configurable logic fabric, embedded PowerPC hard processor cores, digital signal processing components, large volumes of on-chip buffer memory, and multi-gigabit serial I/O capability permits DAQ system designers to vastly increase the amount of data preprocessing that can be performed in parallel within the DAQ pipeline for detector systems in large experiments. The capabilities of some currently available FPGA families are reviewed, along with the prospects for next-generation families of announced, but not yet available, platform FPGAs. Some experience with an actual implementation is presented, and reconciliation between advertised and achievable specifications is attempted. The prospects for applying these components to space-borne Si tracker detectors are briefly discussed.
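
    The flow-control and data-discard trade-offs mentioned above can be illustrated with a toy two-stage pipeline in which a bounded buffer either blocks the producer (backpressure) or drops events when full. This is a generic sketch, not the GLAST LAT design, and the buffer size and event count are arbitrary.

        import queue
        import threading

        def producer(buf, n_events, drop_when_full, stats):
            for i in range(n_events):
                try:
                    # Backpressure: block until space; discard policy: drop instead.
                    buf.put(i, block=not drop_when_full)
                except queue.Full:
                    stats["dropped"] += 1
            buf.put(None)                          # end-of-run marker

        def consumer(buf, stats):
            while (event := buf.get()) is not None:
                stats["processed"] += 1            # stand-in for event building

        stats = {"dropped": 0, "processed": 0}
        buf = queue.Queue(maxsize=64)              # a tier with finite buffer volume
        threads = [threading.Thread(target=producer, args=(buf, 10_000, True, stats)),
                   threading.Thread(target=consumer, args=(buf, stats))]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        print(stats)                               # processed + dropped = 10_000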

  8. A general method for assessing the effects of uncertainty in individual-tree volume model predictions on large-area volume estimates with a subtropical forest illustration

    Treesearch

    Ronald E. McRoberts; Paolo Moser; Laio Zimermann Oliveira; Alexander C. Vibrans

    2015-01-01

    Forest inventory estimates of tree volume for large areas are typically calculated by adding the model predictions of volumes for individual trees at the plot level, calculating the mean over plots, and expressing the result on a per unit area basis. The uncertainty in the model predictions is generally ignored, with the result that the precision of the large-area...
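
    A schematic Monte Carlo treatment of the issue raised above (propagating individual-tree volume model error into the large-area estimate rather than ignoring it) might look like the sketch below. The plot structure, tree-level predictions, and the assumption of independent relative model errors are all invented for illustration; real analyses must also account for correlated model-parameter error.

        import numpy as np

        rng = np.random.default_rng(42)

        def large_area_volume(tree_volumes_by_plot, plot_area_ha, rel_model_sd,
                              n_sim=2000):
            """Perturb every tree-level model prediction, re-aggregate to plots,
            and return the mean and spread of the per-hectare estimate."""
            estimates = []
            for _ in range(n_sim):
                plot_totals = [np.sum(v * (1 + rng.normal(0, rel_model_sd, v.size)))
                               for v in tree_volumes_by_plot]
                estimates.append(np.mean(plot_totals) / plot_area_ha)
            return np.mean(estimates), np.std(estimates)

        # Three hypothetical plots with individual-tree volume predictions (m^3).
        plots = [np.array([0.2, 0.5, 1.1]), np.array([0.3, 0.9]),
                 np.array([0.1, 0.4, 0.6, 2.0])]
        print(large_area_volume(plots, plot_area_ha=0.1, rel_model_sd=0.15))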

  9. The persistence of the large volumes in black holes

    NASA Astrophysics Data System (ADS)

    Ong, Yen Chin

    2015-08-01

    Classically, black holes admit maximal interior volumes that grow asymptotically linearly in time. We show that such volumes remain large when Hawking evaporation is taken into account. Even if a charged black hole approaches the extremal limit during this evolution, its volume continues to grow; although an exactly extremal black hole does not have a "large interior". We clarify this point and discuss the implications of our results to the information loss and firewall paradoxes.

  10. Changes in oil content, fatty acid composition, and functional lipid profiles during dry grind ethanol production from corn.

    USDA-ARS?s Scientific Manuscript database

    Demand for alternatives to fossil fuels has resulted in a dramatic increase in ethanol production from corn. The dry grind method has been the major process, resulting in a large volume of dried distiller grains with solubles (DDGS) as a co-product. This presentation reports our study to monitor ...

  11. Ultra-high performance fiber-reinforced concrete (UHPFRC) for infrastructure rehabilitation : volume 1 : evaluation of ultra high strength concrete (UHSC) in joints of bridge girders.

    DOT National Transportation Integrated Search

    2017-03-01

    Joints are often considered the weak link in a structure, and deterioration of the structure often initiates from the joints. Joints transfer the stresses from super-structure to sub-structure and in this process are subjected to large transfer...

  12. Fission meter

    DOEpatents

    Rowland, Mark S [Alamo, CA; Snyderman, Neal J [Berkeley, CA

    2012-04-10

    A neutron detector system for discriminating fissile material from non-fissile material wherein a digital data acquisition unit collects data at high rate, and in real-time processes large volumes of data directly into information that a first responder can use to discriminate materials. The system comprises counting neutrons from the unknown source and detecting excess grouped neutrons to identify fission in the unknown source.
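
    The patent abstract states only that excess grouped neutrons are detected; one common statistic for that purpose, offered here as a stand-in rather than the patented method itself, is the Feynman variance-to-mean excess, which is zero for a random (Poisson) source and positive when neutrons arrive in correlated fission bursts.

        import numpy as np

        def feynman_y(event_times, gate_width):
            """Bin neutron arrival times into gates of fixed width and return
            Y = variance/mean - 1 of the per-gate counts (0 for a Poisson
            source, > 0 for correlated fission chains)."""
            t = np.asarray(event_times)
            n_gates = int(np.ceil((t.max() - t.min()) / gate_width))
            counts, _ = np.histogram(t, bins=n_gates)
            return counts.var() / counts.mean() - 1.0

        # Hypothetical data: a Poisson background should give Y close to 0.
        rng = np.random.default_rng(1)
        background = np.cumsum(rng.exponential(1e-4, 100_000))   # ~10 kHz source
        print(feynman_y(background, gate_width=1e-3))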

  13. Using a remote sensing-based, percent tree cover map to enhance forest inventory estimation

    Treesearch

    Ronald E. McRoberts; Greg C. Liknes; Grant M. Domke

    2014-01-01

    For most national forest inventories, the variables of primary interest to users are forest area and growing stock volume. The precision of estimates of parameters related to these variables can be increased using remotely sensed auxiliary variables, often in combination with stratified estimators. However, acquisition and processing of large amounts of remotely sensed...
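
    For readers unfamiliar with the stratified estimators mentioned, a minimal post-stratified mean-and-standard-error calculation (with strata defined, for example, by percent tree cover classes from the map) is sketched below; the plot values and area weights are made up, and the variance formula omits finite-population corrections.

        import numpy as np

        def post_stratified_estimate(plot_values, plot_strata, stratum_weights):
            """Weight within-stratum plot means and variances by the strata's
            known area proportions."""
            mean, var = 0.0, 0.0
            for stratum, weight in stratum_weights.items():
                y = plot_values[plot_strata == stratum]
                mean += weight * y.mean()
                var += weight ** 2 * y.var(ddof=1) / y.size
            return mean, np.sqrt(var)

        values = np.array([120.0, 90.0, 150.0, 30.0, 10.0, 25.0])  # m^3/ha per plot
        strata = np.array(["high", "high", "high", "low", "low", "low"])
        weights = {"high": 0.4, "low": 0.6}    # area proportions from the cover map
        print(post_stratified_estimate(values, strata, weights))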

  14. Review of the harvesting and extraction of advanced biofuels and bioproducts

    Treesearch

    Babette L. Marrone;  Ronald E.  Lacey;  Daniel B. Anderson;  James Bonner;  Jim Coons;  Taraka Dale;  Cara Meghan Downes;  Sandun Fernando;  Christopher  Fuller;  Brian Goodall;  Johnathan E. Holladay;  Kiran Kadam;  Daniel  Kalb;  Wei  Liu;  John B. Mott;  Zivko Nikolov;  Kimberly L. Ogden;  Richard T. Sayre;  Brian G. Trewyn;  José A. Olivares

    2017-01-01

    Energy-efficient and scalable harvesting and lipid extraction processes must be developed in order for the algal biofuels and bioproducts industry to thrive. The major challenge for harvesting is the handling of large volumes of cultivation water to concentrate low amounts of biomass. For lipid extraction, the major energy and cost drivers are associated with...

  15. Catalyst for coal liquefaction process

    DOEpatents

    Huibers, Derk T. A.; Kang, Chia-Chen C.

    1984-01-01

    An improved catalyst for a coal liquefaction process; e.g., the H-Coal Process, for converting coal into liquid fuels, and where the conversion is carried out in an ebullated-catalyst-bed reactor wherein the coal contacts catalyst particles and is converted, in addition to liquid fuels, to gas and residual oil which includes preasphaltenes and asphaltenes. The improvement comprises a catalyst selected from the group consisting of the oxides of nickel molybdenum, cobalt molybdenum, cobalt tungsten, and nickel tungsten on a carrier of alumina, silica, or a combination of alumina and silica. The catalyst has a total pore volume of about 0.500 to about 0.900 cc/g and the pore volume comprises micropores, intermediate pores and macropores, the surface of the intermediate pores being sufficiently large to convert the preasphaltenes to asphaltenes and lighter molecules. The conversion of the asphaltenes takes place on the surface of micropores. The macropores are for metal deposition and to prevent catalyst agglomeration. The micropores have diameters between about 50 and about 200 angstroms (Å) and comprise from about 50 to about 80% of the pore volume, whereas the intermediate pores have diameters between about 200 and 2000 angstroms (Å) and comprise from about 10 to about 25% of the pore volume, and the macropores have diameters between about 2000 and about 10,000 angstroms (Å) and comprise from about 10 to about 25% of the pore volume. The catalysts are further improved where they contain promoters. Such promoters include the oxides of vanadium, tungsten, copper, iron and barium, tin chloride, tin fluoride and rare earth metals.

  16. Big data challenges for large radio arrays

    NASA Astrophysics Data System (ADS)

    Jones, D. L.; Wagstaff, K.; Thompson, D. R.; D'Addario, L.; Navarro, R.; Mattmann, C.; Majid, W.; Lazio, J.; Preston, J.; Rebbapragada, U.

    2012-03-01

    Future large radio astronomy arrays, particularly the Square Kilometre Array (SKA), will be able to generate data at rates far higher than can be analyzed or stored affordably with current practices. This is, by definition, a "big data" problem, and requires an end-to-end solution if future radio arrays are to reach their full scientific potential. Similar data processing, transport, storage, and management challenges face next-generation facilities in many other fields. The Jet Propulsion Laboratory is developing technologies to address big data issues, with an emphasis in three areas: 1) Lower-power digital processing architectures to make high-volume data generation operationally affordable, 2) Data-adaptive machine learning algorithms for real-time analysis (or "data triage") of large data volumes, and 3) Scalable data archive systems that allow efficient data mining and remote user code to run locally where the data are stored.

  17. Short-Term Forecasting of Taiwanese Earthquakes Using a Universal Model of Fusion-Fission Processes

    PubMed Central

    Cheong, Siew Ann; Tan, Teck Liang; Chen, Chien-Chih; Chang, Wu-Lung; Liu, Zheng; Chew, Lock Yue; Sloot, Peter M. A.; Johnson, Neil F.

    2014-01-01

    Predicting how large an earthquake can be, where and when it will strike remains an elusive goal in spite of the ever-increasing volume of data collected by earth scientists. In this paper, we introduce a universal model of fusion-fission processes that can be used to predict earthquakes starting from catalog data. We show how the equilibrium dynamics of this model very naturally explains the Gutenberg-Richter law. Using the high-resolution earthquake catalog of Taiwan between Jan 1994 and Feb 2009, we illustrate how out-of-equilibrium spatio-temporal signatures in the time interval between earthquakes and the integrated energy released by earthquakes can be used to reliably determine the times, magnitudes, and locations of large earthquakes, as well as the maximum numbers of large aftershocks that would follow. PMID:24406467
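
    For reference, the Gutenberg-Richter law that the model reproduces is conventionally written (standard form, independent of this paper's notation) as

        \log_{10} N(\ge M) = a - b\,M

    where N(>=M) is the number of earthquakes with magnitude at least M and a, b are region-dependent constants, with b typically close to 1.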

  18. What will the future of cloud-based astronomical data processing look like?

    NASA Astrophysics Data System (ADS)

    Green, Andrew W.; Mannering, Elizabeth; Harischandra, Lloyd; Vuong, Minh; O'Toole, Simon; Sealey, Katrina; Hopkins, Andrew M.

    2017-06-01

    Astronomy is rapidly approaching an impasse: very large datasets require remote or cloud-based parallel processing, yet many astronomers still try to download the data and develop serial code locally. Astronomers understand the need for change, but the hurdles remain high. We are developing a data archive designed from the ground up to simplify and encourage cloud-based parallel processing. While the volume of data we host remains modest by some standards, it is still large enough that download and processing times are measured in days and even weeks. We plan to implement a Python-based, notebook-like interface that automatically parallelises execution. Our goal is to provide an interface sufficiently familiar and user-friendly that it encourages the astronomer to run their analysis on our system in the cloud: astroinformatics as a service. We describe how our system addresses the approaching impasse in astronomy using the SAMI Galaxy Survey as an example.

  19. Resin Film Infusion (RFI) Process Modeling for Large Transport Aircraft Wing Structures

    NASA Technical Reports Server (NTRS)

    Knott, Tamara W.; Loos, Alfred C.

    2000-01-01

    Resin film infusion (RFI) is a cost-effective method for fabricating stiffened aircraft wing structures. The RFI process lends itself to the use of near net shape textile preforms manufactured through a variety of automated textile processes such as knitting and braiding. Often, these advanced fiber architecture preforms have through-the-thickness stitching for improved damage tolerance and delamination resistance. The challenge presently facing RFI is to refine the process to ensure complete infiltration and cure of a geometrically complex shape preform with the high fiber volume fraction needed for structural applications. An accurate measurement of preform permeability is critical for successful modeling of the RFI resin infiltration process. Small changes in the permeability can result in very different infiltration behavior and times. Therefore, it is important to accurately measure the permeabilities of the textile preforms used in the RFI process. The objective of this investigation was to develop test methods that can be used to measure the compaction behavior and permeabilities of high fiber volume fraction, advanced fiber architecture textile preforms. These preforms are often highly compacted due to through-the-thickness stitching used to improve damage tolerance. Test fixtures were designed and fabricated and used to measure both transverse and in-plane permeabilities. The fixtures were used to measure the permeabilities of multiaxial warp knit and triaxial braided preforms at fiber volume fractions from 55% to 65%. In addition, the effects of stitching characteristics, thickness, and batch variability on permeability and compaction behavior were investigated.
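
    Preform permeability measurements of the kind described are commonly reduced with Darcy's law for one-dimensional saturated flow; the sketch below shows that reduction with hypothetical test values rather than data from this study.

        def darcy_permeability(flow_rate_m3_s, viscosity_pa_s, length_m,
                               area_m2, delta_p_pa):
            """1D Darcy's law: K = Q * mu * L / (A * dP)."""
            return (flow_rate_m3_s * viscosity_pa_s * length_m
                    / (area_m2 * delta_p_pa))

        # Hypothetical in-plane flow test on a 0.1 m long preform section.
        K = darcy_permeability(flow_rate_m3_s=2.0e-8, viscosity_pa_s=0.1,
                               length_m=0.1, area_m2=5.0e-4, delta_p_pa=2.0e5)
        print(f"permeability = {K:.2e} m^2")       # ~2e-12 m^2 for these inputs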

  20. Voronoi Tessellation for reducing the processing time of correlation functions

    NASA Astrophysics Data System (ADS)

    Cárdenas-Montes, Miguel; Sevilla-Noarbe, Ignacio

    2018-01-01

    The increase of data volume in cosmology is motivating the search for new solutions to the difficulties associated with large processing times and the precision of calculations. This is especially true for several relevant statistics of the galaxy distribution of the Large Scale Structure of the Universe, namely the two- and three-point angular correlation functions. For these, the processing time has grown critically with the size of the data sample. Beyond parallel implementations to overcome the barrier of processing time, space partitioning algorithms are necessary to reduce the computational load. These can delimit the elements involved in the correlation function estimation to those that can potentially contribute to the final result. In this work, Voronoi Tessellation is used to reduce the processing time of the two-point and three-point angular correlation functions. The results of this proof-of-concept show a significant reduction of the processing time when preprocessing the galaxy positions with Voronoi Tessellation.
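
    As a small illustration of why spatial partitioning pays off in correlation-function estimation, the sketch below restricts pair counting to separations below the largest histogram bin using a k-d tree. This is a stand-in for the paper's Voronoi Tessellation preprocessing, chosen only because it fits in a few lines of SciPy; the point counts and bin edges are arbitrary.

        import numpy as np
        from scipy.spatial import cKDTree

        def pair_counts(points, bin_edges):
            """Count pairs per separation bin, visiting only pairs closer than
            the largest bin edge (the partitioning step that cuts the O(N^2) cost)."""
            tree = cKDTree(points)
            pairs = np.array(list(tree.query_pairs(r=bin_edges[-1])))
            d = np.linalg.norm(points[pairs[:, 0]] - points[pairs[:, 1]], axis=1)
            counts, _ = np.histogram(d, bins=bin_edges)
            return counts

        rng = np.random.default_rng(0)
        galaxies = rng.random((20_000, 2))     # toy positions on the unit square
        print(pair_counts(galaxies, np.linspace(0.0, 0.05, 11)))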

  1. Large-aperture MOEMS Fabry-Perot interferometer for miniaturized spectral imagers

    NASA Astrophysics Data System (ADS)

    Rissanen, Anna; Langner, Andreas; Viherkanto, Kai; Mannila, Rami

    2015-02-01

    VTT's optical MEMS Fabry-Perot interferometers (FPIs) are tunable optical filters, which enable miniaturization of spectral imagers into small, mass-producible hand-held sensors with versatile optical measurement capabilities. FPI technology has also created a basis for various hyperspectral imaging instruments, ranging from nanosatellites, environmental sensing and precision agriculture with UAVs to instruments for skin cancer detection. Until now, these application demonstrations have mostly been realized with piezo-actuated FPIs fabricated by a non-monolithic assembly method, suitable for achieving very large optical apertures and for small-to-medium production volumes; however, MEMS manufacturing supports large-volume production and thus the potential for emerging spectral imaging applications in high-volume markets, such as consumer/mobile products. Previously reported optical apertures of MEMS FPIs in the visible range have been up to 2 mm in size; this paper presents the design, successful fabrication and characterization of MEMS FPIs for central wavelengths of λ = 500 nm and λ = 650 nm with optical apertures up to 4 mm in diameter. The mirror membranes of the FPI structures consist of ALD (atomic layer deposited) TiO2-Al2O3 λ/4 thin-film Bragg reflectors, with the air gap formed by sacrificial polymer etching in O2 plasma. The entire fabrication process is conducted below 150 °C, which makes it possible to monolithically integrate the filter structures on other IC devices such as detectors. The realized MEMS devices are aimed at a nanosatellite space application as breadboard hyperspectral imager demonstrators.

  2. Quantifying sediment connectivity in an actively eroding gully complex, Waipaoa catchment, New Zealand

    NASA Astrophysics Data System (ADS)

    Taylor, Richard J.; Massey, Chris; Fuller, Ian C.; Marden, Mike; Archibald, Garth; Ries, William

    2018-04-01

    Using a combination of airborne LiDAR (2005) and terrestrial laser scanning (2007, 2008, 2010, 2011), sediment delivery processes and sediment connectivity in a 20-ha gully complex, which contributes significantly to the Waipaoa sediment cascade, are quantified over a 6-year period. The acquisition of terrain data from high-resolution surveys of the whole gully-fan system provides new insights into slope processes and slope-channel linkages operating in the complex. Raw terrain data from the airborne and ground-based laser scans were converted into raster DEMs with a vertical accuracy between surveys of <±0.1 m. Grid elevations in each successive DEM were subtracted from the previous DEM to provide models of change across the gully and fan complex. In these models deposition equates to positive and erosion to negative vertical change. Debris flows, slumping, and erosion by surface runoff (gullying in the conventional sense) generated on average 95,232 m^3 of sediment annually, with a standard deviation of ±20,806 m^3. The volumes of debris eroded from areas dominated by surface erosion processes were higher than in areas dominated by landslide processes. Over the six-year study period, sediment delivery from the source zones to the fan was a factor of 1.4 times larger than the volume of debris exported from the fan into Te Weraroa Stream. The average annual volume of sediment exported to Te Weraroa Stream varies widely, from 23,195 to 102,796 m^3. Fluctuations in the volume of stored sediment within the fan, rather than external forcing by rainstorms or earthquakes, account for this annual variation. No large rainfall events occurred during the monitoring period; therefore, sediment volumes and transfer processes captured by this study are representative of the background conditions that operate in this geomorphic system.
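
    The DEM-of-difference computation described above (subtracting successive grids and converting vertical change into eroded and deposited volumes) reduces to a few array operations. The sketch below assumes two co-registered DEMs held as NumPy arrays with a uniform cell size, with the change-detection threshold standing in for the ±0.1 m survey accuracy; the example grids are synthetic.

        import numpy as np

        def dod_volumes(dem_old, dem_new, cell_size_m, threshold_m=0.1):
            """DEM of difference: positive change is deposition, negative is
            erosion; changes smaller than the survey accuracy are ignored."""
            dz = dem_new - dem_old
            dz[np.abs(dz) < threshold_m] = 0.0
            cell_area = cell_size_m ** 2
            erosion = -dz[dz < 0].sum() * cell_area      # m^3 removed
            deposition = dz[dz > 0].sum() * cell_area    # m^3 added
            return erosion, deposition

        # Synthetic 1 m grids; in practice these come from the LiDAR surveys.
        rng = np.random.default_rng(3)
        dem_2005 = rng.random((100, 100)) * 5 + 1200
        dem_2007 = dem_2005 - rng.exponential(0.2, (100, 100))
        print(dod_volumes(dem_2005, dem_2007, cell_size_m=1.0))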

  3. Three-dimensional structural analysis using interactive graphics

    NASA Technical Reports Server (NTRS)

    Biffle, J.; Sumlin, H. A.

    1975-01-01

    The application of computer interactive graphics to three-dimensional structural analysis was described, with emphasis on the following aspects: (1) structural analysis, and (2) generation and checking of input data and examination of the large volume of output data (stresses, displacements, velocities, accelerations). Handling of three-dimensional input processing with a special MESH3D computer program was explained. Similarly, a special code PLTZ may be used to perform all the needed tasks for output processing from a finite element code. Examples were illustrated.

  4. Application of computer generated color graphic techniques to the processing and display of three dimensional fluid dynamic data

    NASA Technical Reports Server (NTRS)

    Anderson, B. H.; Putt, C. W.; Giamati, C. C.

    1981-01-01

    Color coding techniques used in the processing of remote sensing imagery were adapted and applied to the fluid dynamics problems associated with turbofan mixer nozzles. The computer generated color graphics were found to be useful in reconstructing the measured flow field from low resolution experimental data to give more physical meaning to this information and in scanning and interpreting the large volume of computer generated data from the three dimensional viscous computer code used in the analysis.

  5. Draft Environmental Impact Statement. MX Deployment Area Selection and Land Withdrawal/Acquisition DEIS. Volume IV. Part I. Environmental Consequences to the Study Regions and Operating Base Vicinities.

    DTIC Science & Technology

    1980-12-01

    ...consequences such that the ecosystem will not recover at all, (7) are the consequences such that the impact may be large but the recovery process... Impact Analysis Process. Deployment Area Selection and Land Withdrawal/Acquisition DEIS, Department of the Air Force. Draft Environmental Impact Statement - MX Deployment Area Selection - Environmental Consequences, Draft, December 1980.

  6. Enhanced catalytic activity through the tuning of micropore environment and supercritical CO2 processing: Al(porphyrin)-based porous organic polymers for the degradation of a nerve agent simulant.

    PubMed

    Totten, Ryan K; Kim, Ye-Seong; Weston, Mitchell H; Farha, Omar K; Hupp, Joseph T; Nguyen, SonBinh T

    2013-08-14

    An Al(porphyrin) functionalized with a large axial ligand was incorporated into a porous organic polymer (POP) using a cobalt-catalyzed acetylene trimerization strategy. Removal of the axial ligand afforded a microporous POP that is catalytically active in the methanolysis of a nerve agent simulant. Supercritical CO2 processing of the POP dramatically increased the pore size and volume, allowing for significantly higher catalytic activities.

  7. Architectures and algorithms for digital image processing; Proceedings of the Meeting, Cannes, France, December 5, 6, 1985

    NASA Technical Reports Server (NTRS)

    Duff, Michael J. B. (Editor); Siegel, Howard J. (Editor); Corbett, Francis J. (Editor)

    1986-01-01

    The conference presents papers on the architectures, algorithms, and applications of image processing. Particular attention is given to a very large scale integration system for image reconstruction from projections, a prebuffer algorithm for instant display of volume data, and an adaptive image sequence filtering scheme based on motion detection. Papers are also presented on a simple, direct practical method of sensing local motion and analyzing local optical flow, image matching techniques, and an automated biological dosimetry system.

  8. ONR (Office of Naval Research) Far East Scientific Bulletin. Volume 9, Number 3, July to September 1984,

    DTIC Science & Technology

    1984-09-01

    C-033-82 (1982). "Development of the Narrow Gap Submerged Arc Welding Process - NSA Process," Hirai, Y. et al., Kawasaki Steel Technical Report, 5, 81... upsurge in the resources committed to research in the neurosciences in general, and to membrane phenomena specifically. Because of this large... reader a review of most of the current research being conducted in Japan in the neuroscience and membrane physiology areas. The presentation of the...

  9. Automated Array Assembly Task In-depth Study of Silicon Wafer Surface Texturizing

    NASA Technical Reports Server (NTRS)

    Jones, G. T.; Chitre, S.; Rhee, S. S.; Allison, K. L.

    1979-01-01

    A low cost wafer surface texturizing process was studied. An investigation of low cost cleaning operations to clean residual wax and organics from the surface of silicon wafers was made. The feasibility of replacing dry nitrogen with clean dry air for drying silicon wafers was examined. The two stage texturizing process was studied for the purpose of characterizing relevant parameters in large volume applications. The effect of gettering solar cells on photovoltaic energy conversion efficiency is described.

  10. Big Data Analytics in Healthcare

    PubMed Central

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S. M. Reza; Beard, Daniel A.

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957

  11. Circuit engineering principles for construction of bipolar large-scale integrated circuit storage devices and very large-scale main memory

    NASA Astrophysics Data System (ADS)

    Neklyudov, A. A.; Savenkov, V. N.; Sergeyez, A. G.

    1984-06-01

    Memories are improved by increasing speed or the memory volume on a single chip. The most effective means for increasing speeds in bipolar memories are current control circuits with the lowest extraction times for a specific power consumption (1/4 pJ/bit). The control current circuitry involves multistage current switches and circuits accelerating transient processes in storage elements and links. Circuit principles for the design of bipolar memories with maximum speeds for an assigned minimum of circuit topology are analyzed. Two main classes of storage with current control are considered: the ECL type and super-integrated injection type storage with data capacities of N = 1/4 and N 4/16, respectively. The circuits reduce logic voltage differentials and the volumes of lexical and discharge buses and control circuit buses. The limiting speed is determined by the anti-interference requirements of the memory in storage and extraction modes.

  12. AGRI Grain Power ethanol-for-fuel project feasibility-study report. Volume II. Project marketing/economic/financial/ and organization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1981-04-01

    The AGRI GRAIN POWER (AGP) project, hereafter referred to as the Project, was formed to evaluate the commercial viability and assess the desirability of implementing a large grain-based grass-roots anhydrous ethanol fuel project to be sited near Des Moines, Iowa. This report presents the results of a Project feasibility evaluation. The Project concept is based on involving a very strong managerial, financial and technical joint venture that is extremely expert in all facets of planning and implementing a large ethanol project; on locating the ethanol project at a highly desirable site; on utilizing a proven ethanol process; and on developing a Project that is well suited to market requirements, resource availability and competitive factors. The results of marketing, economic, and financial studies are reported in this volume.

  13. AGRI Grain Power ethanol-for-fuel project feasibility-study report. Volume III. Project environmental/health/safety/ and socioeconomic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1981-04-01

    The AGRI GRAIN POWER (AGP) project, hereafter referred to as the Project, was formed to evaluate the commercial viability and assess the desirability of implementing a large grain-based grass-roots anhydrous ethanol fuel project to be sited near Des Moines, Iowa. This report presents the results of a Project feasibility evaluation. The Project concept is based on involving a very strong managerial, financial and technical joint venture that is extremely expert in all facets of planning and implementing a large ethanol project; on locating the ethanol project at a highly desirable site; on utilizing a proven ethanol process; and on developing a Project that is well suited to market requirements, resource availability and competitive factors. This volume contains the results of the environmental, health, safety, and socio-economic studies.

  14. Usage of FTIR-ATR as Non-Destructive Analysis of Selected Toxic Dyes

    NASA Astrophysics Data System (ADS)

    Bartošová, Alica; Blinová, Lenka; Sirotiak, Maroš; Michalíková, Anna

    2017-06-01

    Environmental degradation due to the discharge of polluting wastewater from industrial sources poses a real problem in several countries. Textile industries use large volumes of water in their operations, thus discharging large volumes of wastewater into the environment, most of which is untreated. The wastewater contains a variety of chemicals from various stages of process operations, including desizing, scouring, bleaching and dyeing. The main purpose of this paper is to introduce Infrared Spectrometry with Fourier transformation as a non-destructive method for the study, identification and rapid determination of selected representatives of cationic (Methylene Blue), azo (Congo Red, Eriochrome Black T) and nitroso (Naphthol Green B) dyes. In conjunction with the ATR technique, FTIR offers a reliable detection method for dyes without extraction with other dangerous substances. Spectral interpretation of the dye spectra revealed valuable information about the identification and characterization of each group of dyes.

  15. Framework for cognitive analysis of dynamic perfusion computed tomography with visualization of large volumetric data

    NASA Astrophysics Data System (ADS)

    Hachaj, Tomasz; Ogiela, Marek R.

    2012-10-01

    The proposed framework for cognitive analysis of perfusion computed tomography images is a fusion of image processing, pattern recognition, and image analysis procedures. The output data of the algorithm consist of regions of perfusion abnormalities, an anatomy atlas description of brain tissues, measures of perfusion parameters, and a prognosis for infarcted tissues. That information is superimposed onto volumetric computed tomography data and displayed to radiologists. Our rendering algorithm enables rendering large volumes on off-the-shelf hardware. This portability of the rendering solution is very important because our framework can be run without expensive dedicated hardware. Other important factors are the theoretically unlimited size of the rendered volume and the possibility of trading image quality for rendering speed. Such high-quality rendered visualizations may be further used for intelligent brain perfusion abnormality identification and computer-aided diagnosis of selected types of pathologies.

  16. Big Data Analytics in Healthcare.

    PubMed

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  17. New large volume hydrothermal reaction cell for studying chemical processes under supercritical hydrothermal conditions using time-resolved in situ neutron diffraction.

    PubMed

    Ok, Kang Min; O'Hare, Dermot; Smith, Ronald I; Chowdhury, Mohammed; Fikremariam, Hanna

    2010-12-01

    The design and testing of a new large volume Inconel pressure cell for the in situ study of supercritical hydrothermal syntheses using time-resolved neutron diffraction is introduced for the first time. The commissioning of this new cell is demonstrated by the measurement of the time-of-flight neutron diffraction pattern for TiO2 (Anatase) in supercritical D2O on the POLARIS diffractometer at the United Kingdom's pulsed spallation neutron source, ISIS, Rutherford Appleton Laboratory. The sample can be studied over a wide range of temperatures (25-450 °C) and pressures (1-355 bar). This novel apparatus will now enable us to study the kinetics and mechanisms of chemical syntheses under extreme environments such as supercritical water, and in particular to study the crystallization of a variety of technologically important inorganic materials.

  18. High-Resiliency and Auto-Scaling of Large-Scale Cloud Computing for OCO-2 L2 Full Physics Processing

    NASA Astrophysics Data System (ADS)

    Hua, H.; Manipon, G.; Starch, M.; Dang, L. B.; Southam, P.; Wilson, B. D.; Avis, C.; Chang, A.; Cheng, C.; Smyth, M.; McDuffie, J. L.; Ramirez, P.

    2015-12-01

    Next generation science data systems are needed to address the incoming flood of data from new missions such as SWOT and NISAR, where data volumes and data throughput rates are orders of magnitude larger than in present day missions. Additionally, traditional means of procuring hardware on-premise are already limited due to facilities capacity constraints for these new missions. Existing missions, such as OCO-2, may also require fast turn-around for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the high processing needs. We present our experiences deploying a hybrid-cloud computing science data system (HySDS) for the OCO-2 Science Computing Facility to support large-scale processing of their Level-2 full physics data products. We will explore optimization approaches for getting the best performance out of hybrid-cloud computing as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer ~10X cost savings but with an unpredictable computing environment driven by market forces. We will present how we enabled high-tolerance computing in order to achieve large-scale computing as well as operational cost savings.
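
    A hedged illustration of the fault-tolerance idea described above (not the HySDS implementation; all names, file paths, and the granule loop below are hypothetical): work on a preemptible spot-market worker is checkpointed after every completed unit so that an interrupted job can be resumed by another worker without repeating finished work.

      # Illustrative sketch only: checkpoint/resume logic for granule processing on
      # preemptible (spot) workers. Names and the checkpoint file are hypothetical;
      # a real system would persist progress to durable shared storage.
      import json
      import os

      CHECKPOINT = "completed_granules.json"

      def load_checkpoint():
          if os.path.exists(CHECKPOINT):
              with open(CHECKPOINT) as f:
                  return set(json.load(f))
          return set()

      def save_checkpoint(done):
          with open(CHECKPOINT, "w") as f:
              json.dump(sorted(done), f)

      def process_granule(granule_id):
          # Placeholder for the actual Level-2 full-physics retrieval on one granule.
          return f"product-{granule_id}"

      def run(granules):
          done = load_checkpoint()
          for gid in granules:
              if gid in done:
                  continue  # finished before a previous spot interruption; skip
              process_granule(gid)
              done.add(gid)
              save_checkpoint(done)  # persist progress so a restarted worker resumes here

      if __name__ == "__main__":
          run([f"granule-{i:04d}" for i in range(10)])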

  19. The physics of large eruptions

    NASA Astrophysics Data System (ADS)

    Gudmundsson, Agust

    2015-04-01

    Based on eruptive volumes, eruptions can be classified as follows: small if the volumes are from less than 0.001 km3 to 0.1 km3, moderate if the volumes are from 0.1 to 10 km3, and large if the volumes are from 10 km3 to 1000 km3 or larger. The largest known explosive and effusive eruptions have eruptive volumes of 4000-5000 km3. The physics of small to moderate eruptions is reasonably well understood. For a typical mafic magma chamber in a crust that behaves as elastic, about 0.1% of the magma leaves the chamber (erupted and injected as a dyke) during rupture and eruption. Similarly, for a typical felsic magma chamber, the eruptive/injected volume during rupture and eruption is about 4%. To provide small to moderate eruptions, chamber volumes of the order of several tens to several hundred cubic kilometres would be needed. Shallow crustal chambers of these sizes are common, and deep-crustal and upper-mantle reservoirs of thousands of cubic kilometres exist. Thus, elastic and poro-elastic chambers of typical volumes can account for small to moderate eruptive volumes. When the eruptions become large, with volumes of tens or hundreds of cubic kilometres or more, an ordinary poro-elastic mechanism can no longer explain the eruptive volumes. The required sizes of the magma chambers and reservoirs to explain such volumes are simply too large to be plausible. Here I propose that the mechanics of large eruptions is fundamentally different from that of small to moderate eruptions. More specifically, I suggest that all large eruptions derive their magmas from chambers and reservoirs whose total cavity-volumes are mechanically reduced very much during the eruption. There are two mechanisms by which chamber/reservoir cavity-volumes can be reduced rapidly so as to squeeze out much of, or all, their magmas. One is piston-like caldera collapse. The other is graben subsidence. During large slip on the ring-faults/graben-faults the associated chamber/reservoir shrinks in volume, thereby maintaining the excess magmatic pressure much longer than is possible in the ordinary poro-elastic mechanism. Here the physics of caldera subsidence and graben subsidence is regarded as basically the same. The geometric difference in the surface expression is simply a reflection of the horizontal cross-sectional shape of the underlying magma body. In this new mechanism, the large eruption is the consequence -- not the cause -- of the caldera/graben subsidence. Thus, once the conditions for large-scale subsidence of a caldera/graben during an unrest period are established, then the likelihood of large to very large eruptions can be assessed and used in reliable forecasting. Gudmundsson, A., 2012. Strengths and strain energies of volcanic edifices: implications for eruptions, collapse calderas and landslides. Nat. Hazards Earth Syst. Sci., 12, 2241-2258. Gudmundsson, A., 2014. Energy release in great earthquakes and eruptions. Front. Earth Science 2:10. doi: 10.3389/feart.2014.00010 Gudmundsson, A., Acocella, V., 2015.Volcanotectonics: Understanding the Structure, Deformation, and Dynamics of Volcanoes. Cambridge University Press (published 2015).
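
    A quick order-of-magnitude check of the eruptible-fraction argument above, assuming the simple elastic-chamber proportionality V_e = f V_c with the fractions quoted in the abstract:

      % f is the fraction of chamber magma that leaves during rupture and eruption
      \[
      V_c \approx \frac{V_e}{f}, \qquad f_{\text{felsic}} \approx 0.04, \quad f_{\text{mafic}} \approx 0.001 .
      \]
      \[
      V_e = 1\ \mathrm{km^3} \;\Rightarrow\; V_c \approx 25\ \mathrm{km^3} \quad \text{(a plausible shallow chamber)},
      \]
      \[
      V_e = 100\ \mathrm{km^3} \;\Rightarrow\; V_c \approx 2500\ \mathrm{km^3} \quad \text{(implausibly large, hence the proposed caldera/graben collapse mechanism)}.
      \]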

  20. Strong ion exchange in centrifugal partition extraction (SIX-CPE): effect of partition cell design and dimensions on purification process efficiency.

    PubMed

    Hamzaoui, Mahmoud; Hubert, Jane; Reynaud, Romain; Marchal, Luc; Foucault, Alain; Renault, Jean-Hugues

    2012-07-20

    The aim of this article was to evaluate the influence of the column design of a hydrostatic support-free liquid-liquid chromatography device on the process efficiency when the strong ion-exchange (SIX) development mode is used. The purification of p-hydroxybenzylglucosinolate (sinalbin) from a crude aqueous extract of white mustard seeds (Sinapis alba L.) was achieved on two types of devices: a centrifugal partition chromatograph (CPC) and a centrifugal partition extractor (CPE). They differ in the number, volume and geometry of their partition cells. The SIX-CPE process was evaluated in terms of productivity and sinalbin purification capability as compared to previously optimized SIX-CPC protocols that were carried out on columns of 200 mL and 5700 mL inner volume, respectively. The objective was to determine whether the decrease in partition cell number, the increase in their volume and the use of a "twin cell" design would induce a significant increase in productivity by applying a higher mobile phase flow rate while maintaining a constant separation quality. 4.6 g of sinalbin (92% recovery) were isolated from 25 g of a crude white mustard seed extract, in only 32 min and with a purity of 94.7%, thus corresponding to a productivity of 28 g per hour and per liter of column volume (g/h/L Vc). Therefore, the SIX-CPE process demonstrates promising industrial technology transfer perspectives for the large-scale isolation of ionized natural products. Copyright © 2012 Elsevier B.V. All rights reserved.
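
    As a back-of-envelope check of the quoted productivity (the extractor volume is not stated in the abstract and is only inferred here from the reported mass, run time, and productivity):

      \[
      P = \frac{m}{t\,V_c}
      \;\Rightarrow\;
      V_c = \frac{m}{t\,P}
          = \frac{4.6\ \mathrm{g}}{(32/60)\ \mathrm{h}\times 28\ \mathrm{g\,h^{-1}\,L^{-1}}}
          \approx 0.31\ \mathrm{L}.
      \]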

  1. Seismic reflection images of the accretionary wedge of Costa Rica

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shipley, T.H.; Stoffa, P.L.; McIntosh, K.

    The large-scale structure of modern accretionary wedges is known almost entirely from seismic reflection investigations using single or grids of two-dimensional profiles. The authors will report on the first three-dimensional seismic reflection data volume collected of a wedge. This data set covers a 9-km-wide × 22-km-long × 6-km-thick volume of the accretionary wedge just arcward of the Middle America Trench off Costa Rica. The three-dimensional processing has improved the imaging ability of the multichannel data, and the data volume allows mapping of structures from a few hundred meters to kilometers in size. These data illustrate the relationships between the basement, the wedge shape, and overlying slope sedimentary deposits. Reflections from within the wedge define the gross structural features and tectonic processes active along this particular convergent margin. So far, the analysis shows that the subdued basement relief (horst and graben structures seldom have relief of more than a few hundred meters off Costa Rica) does affect the larger scale through-going structural features within the wedge. The distribution of mud volcanoes and amplitude anomalies associated with the large-scale wedge structures suggests that efficient fluid migration paths may extend from the top of the downgoing slab at the shelf edge out into the lower and middle slope region at a distance of 50-100 km. Offscraping of the uppermost (about 45 m) sediment occurs within 4 km of the trench, creating a small pile of sediments near the trench lower slope. Underplating of parts of the 400-m-thick subducted sedimentary section begins at a very shallow structural level, 4-10 km arcward of the trench. Volumetrically, the most important accretionary process is underplating.

  2. Comparison of spectroscopy technologies for improved monitoring of cell culture processes in miniature bioreactors

    PubMed Central

    van den Berg, Frans; Racher, Andrew J.; Martin, Elaine B.; Jaques, Colin

    2017-01-01

    Cell culture process development requires the screening of large numbers of cell lines and process conditions. The development of miniature bioreactor systems has increased the throughput of such studies; however, there are limitations with their use. One important constraint is the limited number of offline samples that can be taken compared to those taken for monitoring cultures in large‐scale bioreactors. The small volume of miniature bioreactor cultures (15 mL) is incompatible with the large sample volume (600 µL) required for bioanalysers routinely used. Spectroscopy technologies may be used to resolve this limitation. The purpose of this study was to compare the use of NIR, Raman, and 2D‐fluorescence to measure multiple analytes simultaneously in volumes suitable for daily monitoring of a miniature bioreactor system. A novel design‐of‐experiment approach is described that utilizes previously analyzed cell culture supernatant to assess metabolite concentrations under various conditions while providing optimal coverage of the desired design space. Multivariate data analysis techniques were used to develop predictive models. Model performance was compared to determine which technology is more suitable for this application. 2D‐fluorescence could more accurately measure ammonium concentration (RMSECV 0.031 g L−1) than Raman and NIR. Raman spectroscopy, however, was more robust at measuring lactate and glucose concentrations (RMSECV 1.11 and 0.92 g L−1, respectively) than the other two techniques. The findings suggest that Raman spectroscopy is more suited for this application than NIR and 2D‐fluorescence. The implementation of Raman spectroscopy increases at‐line measuring capabilities, enabling daily monitoring of key cell culture components within miniature bioreactor cultures. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 33:337–346, 2017 PMID:28271638
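
    A minimal sketch of how an RMSECV figure such as those quoted above is typically obtained: cross-validated multivariate regression of spectra against reference concentrations. The abstract does not name the regression method; partial least squares is used here as a common choice, and the spectra and concentrations are synthetic placeholders.

      # Hedged sketch: cross-validated PLS regression and the RMSECV metric.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(0)
      n_samples, n_wavelengths = 60, 200
      concentration = rng.uniform(0.0, 5.0, n_samples)             # e.g. glucose in g/L
      spectra = np.outer(concentration, rng.normal(size=n_wavelengths))
      spectra += 0.05 * rng.normal(size=spectra.shape)             # measurement noise

      model = PLSRegression(n_components=5)
      predicted = cross_val_predict(model, spectra, concentration, cv=10).ravel()
      rmsecv = float(np.sqrt(np.mean((predicted - concentration) ** 2)))
      print(f"RMSECV: {rmsecv:.3f} g/L")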

  3. Bin recycling strategy for improving the histogram precision on GPU

    NASA Astrophysics Data System (ADS)

    Cárdenas-Montes, Miguel; Rodríguez-Vázquez, Juan José; Vega-Rodríguez, Miguel A.

    2016-07-01

    Histograms are an easily comprehensible way to present data and analyses. In the current scientific context, with access to large volumes of data, the processing time for building histograms has dramatically increased. For this reason, parallel construction is necessary to alleviate the impact of the processing time on analysis activities. In this scenario, GPU computing is becoming widely used for reducing the processing time of histogram construction to affordable levels. Along with the reduction of processing time, the implementations are stressed on bin-count accuracy. Accuracy aspects due to the particularities of the implementations are not usually taken into consideration when building histograms with very large data sets. In this work, a bin recycling strategy to create an accuracy-aware implementation for building histograms on GPU is presented. In order to evaluate the approach, this strategy was applied to the computation of the three-point angular correlation function, which is a relevant function in Cosmology for the study of the Large Scale Structure of the Universe. As a consequence of the study, a high-accuracy implementation for histogram construction on GPU is proposed.
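
    For context, a generic sketch of parallel histogram construction by merging per-block partial histograms, the standard pattern that GPU implementations build on; the paper's bin recycling strategy itself is not detailed in the abstract. Keeping integer bin counts throughout makes the merged result exact regardless of how the data are chunked.

      # Generic partial-histogram merge (illustrative, CPU-side stand-in for the GPU pattern).
      import numpy as np

      def partial_histograms(data, n_bins, lo, hi, n_blocks=8):
          chunks = np.array_split(data, n_blocks)
          # Each "block" builds a private histogram (on a GPU this would live in shared memory).
          partials = [np.histogram(c, bins=n_bins, range=(lo, hi))[0] for c in chunks]
          # Merging is an integer sum, so the result matches a single-pass histogram exactly.
          return np.sum(partials, axis=0)

      data = np.random.default_rng(1).normal(size=1_000_000)
      merged = partial_histograms(data, n_bins=64, lo=-5.0, hi=5.0)
      reference = np.histogram(data, bins=64, range=(-5.0, 5.0))[0]
      assert np.array_equal(merged, reference)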

  4. Lightning activity and severe storm structure

    NASA Technical Reports Server (NTRS)

    Taylor, W. L.; Brandes, E. A.; Rust, W. D.; Macgorman, D. R.

    1984-01-01

    Space-time mapping of VHF sources from four severe storms on June 19, 1980 reveals that lightning processes for cloud-to-ground (CG) and large intracloud (IC) flashes are confined to an altitude below about 10 km and closely associated with the central regions of high reflectivity. Another class of IC flashes produces a splattering of sources within the storms' main electrically active volumes and also within the large divergent wind canopy aloft. There is no apparent temporal association between the small high altitude IC flashes that occur almost continuously and the large IC and CG flashes that occur sporadically in the lower portions of storms.

  5. Cycle time and cost reduction in large-size optics production

    NASA Astrophysics Data System (ADS)

    Hallock, Bob; Shorey, Aric; Courtney, Tom

    2005-09-01

    Optical fabrication process steps have remained largely unchanged for decades. Raw glass blanks have been rough-machined, generated to near net shape, loose abrasive or fine bound diamond ground and then polished. This set of processes is sequential and each subsequent operation removes the damage and micro cracking induced by the prior operational step. One of the long-lead aspects of this process has been the glass polishing. Primarily, this has been driven by the need to remove relatively large volumes of glass material compared to the polishing removal rate to ensure complete damage removal. The secondary time driver has been poor convergence to final figure and the corresponding polish-metrology cycles. The overall cycle time and resultant cost due to labor, equipment utilization and shop efficiency is increased, often significantly, when the optical prescription is aspheric. In addition to the long polishing cycle times, the duration of the polishing time is often very difficult to predict given that current polishing processes are not deterministic processes. This paper will describe a novel approach to large optics finishing, relying on several innovative technologies to be presented and illustrated through a variety of examples. The cycle time reductions enabled by this approach promise to result in significant cost and lead-time reductions for large-size optics. In addition, corresponding increases in throughput will provide for less capital expenditure per square meter of optic produced. This process, comparative cycle time estimates and preliminary results will be discussed.

  6. Integrated processes for desalination and salt production: A mini-review

    NASA Astrophysics Data System (ADS)

    Wenten, I. Gede; Ariono, Danu; Purwasasmita, Mubiar; Khoirudin

    2017-03-01

    The scarcity of fresh water due to the rapid growth of population and industrial activities has increased attention on desalination process as an alternative freshwater supply. In desalination process, a large volume of saline water is treated to produce freshwater while a concentrated brine is discharged back into the environment. The concentrated brine contains a high concentration of salt and also chemicals used during desalination operations. Due to environmental impacts arising from improper treatment of the brine and more rigorous regulations of the pollution control, many efforts have been devoted to minimize, treat, or reuse the rejected brine. One of the most promising alternatives for brine handling is reusing the brine which can reduce pollution, minimize waste volume, and recover valuable salt. Integration of desalination and salt production can be implemented to reuse the brine by recovering water and the valuable salts. The integrated processes can achieve zero liquid discharge, increase water recovery, and produce the profitable salt which can reduce the overall desalination cost. This paper gives an overview of desalination processes and the brine impacts. The integrated processes, including their progress and advantages in dual-purpose desalination and salt production are discussed.

  7. Design and optimization of hot-filling pasteurization conditions: Cupuaçu (Theobroma grandiflorum) fruit pulp case study.

    PubMed

    Silva, Filipa V M; Martins, Rui C; Silva, Cristina L M

    2003-01-01

    Cupuaçu (Theobroma grandiflorum) is an Amazonian tropical fruit with a great economic potential. Pasteurization, by a hot-filling technique, was suggested for the preservation of this fruit pulp at room temperature. The process was implemented with local communities in Brazil. The process was modeled, and a computer program was written in Turbo Pascal. The relative importance of the pasteurization process variables (initial product temperature, heating rate, holding temperature and time, container volume and shape, cooling medium type and temperature) for the microbial target and quality was investigated by performing simulations according to a screening factorial design. Afterward, simulations of the different processing conditions were carried out. The holding temperature (T_F) and time (t_hold) affected the pasteurization value (P), and the container volume (V) largely influenced the quality parameters. The process was optimized for retail (1 L) and industrial (100 L) size containers, by maximizing volume average quality in terms of color lightness and sensory "fresh notes" and minimizing volume average total color difference and sensory "cooked notes". Equivalent processes were designed and simulated (P_91°C = 4.6 min on Alicyclobacillus acidoterrestris spores) and final quality (color, flavor, and aroma attributes) was evaluated. Color was slightly affected by the pasteurization processes, and few differences were observed between the six equivalent treatments designed (T_F between 80 and 97 °C). T_F ≥ 91 °C minimized "cooked notes" and maximized "fresh notes" of cupuaçu pulp aroma and flavor for the 1 L container. Concerning the 100 L size, the "cooked notes" development can be minimized with T_F ≥ 91 °C, but overall the quality was greatly degraded as a result of the long cooling times. A more efficient method to speed up the cooling phase was recommended, especially for the industrial size of containers.
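
    The pasteurization value quoted above (P_91°C = 4.6 min) is an integral of the lethal rate over the temperature history. A minimal sketch follows; the reference temperature comes from the abstract, while the z-value and the hot-fill temperature profile are illustrative assumptions, not parameters reported by the study.

      # Hedged sketch of a pasteurization-value calculation: P = integral of 10**((T - Tref)/z) dt.
      import numpy as np

      def pasteurization_value(times_min, temps_c, t_ref=91.0, z=10.0):
          times = np.asarray(times_min, dtype=float)
          lethality = 10.0 ** ((np.asarray(temps_c, dtype=float) - t_ref) / z)
          # Trapezoidal integration of the lethal-rate curve: equivalent minutes at t_ref.
          return float(np.sum(0.5 * (lethality[1:] + lethality[:-1]) * np.diff(times)))

      t = [0, 2, 4, 10, 20, 40, 60]        # minutes (assumed hot-fill profile)
      T = [25, 70, 93, 93, 80, 55, 35]     # deg C: heat-up, hold, slow cooling in the container
      print(f"P_91C ~ {pasteurization_value(t, T):.1f} min")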

  8. Defect-induced solid state amorphization of molecular crystals

    NASA Astrophysics Data System (ADS)

    Lei, Lei; Carvajal, Teresa; Koslowski, Marisol

    2012-04-01

    We investigate the process of mechanically induced amorphization in small molecule organic crystals under extensive deformation. In this work, we develop a model that describes the amorphization of molecular crystals, in which the plastic response is calculated with a phase field dislocation dynamics theory in four materials: acetaminophen, sucrose, γ-indomethacin, and aspirin. The model is able to predict the fraction of amorphous material generated in single crystals for a given applied stress. Our results show that γ-indomethacin and sucrose demonstrate large volume fractions of amorphous material after sufficient plastic deformation, while smaller amorphous volume fractions are predicted in acetaminophen and aspirin, in agreement with experimental observation.

  9. Large volume continuous counterflow dialyzer has high efficiency

    NASA Technical Reports Server (NTRS)

    Mandeles, S.; Woods, E. C.

    1967-01-01

    Dialyzer separates macromolecules from small molecules in large volumes of solution. It takes advantage of the high area/volume ratio in commercially available 1/4-inch dialysis tubing and maintains a high concentration gradient at the dialyzing surface by counterflow.

  10. Daily Planet Redesign: eZ Publish Web Content Management Implementation

    NASA Technical Reports Server (NTRS)

    Dutra, Jayne E.

    2006-01-01

    This viewgraph presentation reviews the redesign of the Daily Planet newsletter as a content management implementation project. The site is an internal news site that acts as a communication vehicle for a large volume of content. The objectives for the site redesign were: (1) Clean visual design, (2) Facilitation of publication processes, (3) More efficient maintenance mode, (4) Automated publishing to internal portal, (5) Better navigation through improved site IA, (6) Archiving and retrieval functionality, (7) Back to basics on fundamental business goals. CM is a process, not a software package.

  11. Chemical P recovery from dairy manure using the Quick Wash process and use of low-P washed manure solids as soil amendments.

    USDA-ARS?s Scientific Manuscript database

    Large volumes of manure generated by intensive dairy production and their final land disposal are a significant environmental problem. Due to the imbalance of nitrogen (N) and phosphorus (P) (4:1), amendment of soils with dairy manure entails a rise in available soil P levels beyond the crops' capa...

  12. Phalanx. Volume 47, Number 4

    DTIC Science & Technology

    2014-12-01

    the official channels and processes of the national defense community. The principal venue for such discussions is the ...going to shape the Department for the next couple decades and determine in large part on whether or not we have a future that is defined more by...bar for analytic excellence, you have the opportunity to ensure a bright future for yourself and the MORS

  13. Enhancing the Professionalism of Purchasing Agents (GS 1105s) within the Department of the Army.

    DTIC Science & Technology

    1987-09-01

    automated small purchase process offers distinct advantages, as well as cost savings in both time and money, for activities with a large volume of small...resident) Massasoit Northeast $1,888 $6,400 Boston, MA Broward Southeast 1,327 2,765 Ft. Lauderdale, FL Rock Valley Midwest 1,728 7,552 Rockford, IL De

  14. Grid Computing in K-12 Schools. Soapbox Digest. Volume 3, Number 2, Fall 2004

    ERIC Educational Resources Information Center

    AEL, 2004

    2004-01-01

    Grid computing allows large groups of computers (either in a lab, or remote and connected only by the Internet) to extend extra processing power to each individual computer to work on components of a complex request. Grid middleware, recognizing priorities set by systems administrators, allows the grid to identify and use this power without…

  15. An Exploratory Study of the Effects of Online Course Efficiency Perceptions on Student Evaluation of Teaching (SET) Measures

    ERIC Educational Resources Information Center

    Estelami, Hooman

    2016-01-01

    One of the fundamental drivers of the growing use of distance learning methods in modern business education has been the efficiency gains associated with this method of educational delivery. Distance methods benefit both students and educational institutions as they facilitate the processing of large volumes of learning material to overcome…

  16. Systematic identification of latent disease-gene associations from PubMed articles.

    PubMed

    Zhang, Yuji; Shen, Feichen; Mojarad, Majid Rastegar; Li, Dingcheng; Liu, Sijia; Tao, Cui; Yu, Yue; Liu, Hongfang

    2018-01-01

    Recent scientific advances have accumulated a tremendous amount of biomedical knowledge, providing novel insights into the relationship between molecular and cellular processes and diseases. Literature mining is one of the commonly used methods to retrieve and extract information from scientific publications for understanding these associations. However, due to the large data volume and complicated, noisy associations, the interpretability of such association data for semantic knowledge discovery is challenging. In this study, we describe an integrative computational framework aiming to expedite the discovery of latent disease mechanisms by dissecting 146,245 disease-gene associations from over 25 million PubMed-indexed articles. We take advantage of both Latent Dirichlet Allocation (LDA) modeling and network-based analysis for their capabilities of detecting latent associations and reducing noise in large-volume data, respectively. Our results demonstrate that (1) the LDA-based modeling is able to group similar diseases into disease topics; (2) the disease-specific association networks follow the scale-free network property; (3) certain subnetwork patterns were enriched in the disease-specific association networks; and (4) genes were enriched in topic-specific biological processes. Our approach offers promising opportunities for latent disease-gene knowledge discovery in biomedical research.
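
    A minimal sketch of the LDA step described above, grouping diseases into topics according to the genes they are associated with; the four toy "documents" are placeholders, whereas the study mined 146,245 disease-gene associations from PubMed.

      # Hedged sketch: topic modeling over disease-gene association "documents".
      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.decomposition import LatentDirichletAllocation

      # Each document is one disease; its "words" are associated gene symbols (toy data).
      disease_gene_docs = {
          "type 2 diabetes":   "TCF7L2 PPARG KCNJ11 IRS1",
          "obesity":           "FTO MC4R PPARG LEP",
          "alzheimer disease": "APOE APP PSEN1 PSEN2",
          "parkinson disease": "SNCA LRRK2 PARK7 PINK1",
      }

      vectorizer = CountVectorizer(lowercase=False)
      X = vectorizer.fit_transform(disease_gene_docs.values())

      lda = LatentDirichletAllocation(n_components=2, random_state=0)
      topic_mix = lda.fit_transform(X)   # rows: diseases, columns: topic weights

      for disease, weights in zip(disease_gene_docs, topic_mix):
          print(disease, "-> dominant topic", int(weights.argmax()))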

  17. Comparing memory-efficient genome assemblers on stand-alone and cloud infrastructures.

    PubMed

    Kleftogiannis, Dimitrios; Kalnis, Panos; Bajic, Vladimir B

    2013-01-01

    A fundamental problem in bioinformatics is genome assembly. Next-generation sequencing (NGS) technologies produce large volumes of fragmented genome reads, which require large amounts of memory to assemble the complete genome efficiently. With recent improvements in DNA sequencing technologies, it is expected that the memory footprint required for the assembly process will increase dramatically and will emerge as a limiting factor in processing widely available NGS-generated reads. In this report, we compare current memory-efficient techniques for genome assembly with respect to quality, memory consumption and execution time. Our experiments prove that it is possible to generate draft assemblies of reasonable quality on conventional multi-purpose computers with very limited available memory by choosing suitable assembly methods. Our study reveals the minimum memory requirements for different assembly programs even when data volume exceeds memory capacity by orders of magnitude. By combining existing methodologies, we propose two general assembly strategies that can improve short-read assembly approaches and result in reduction of the memory footprint. Finally, we discuss the possibility of utilizing cloud infrastructures for genome assembly and we comment on some findings regarding suitable computational resources for assembly.

  18. Systematic identification of latent disease-gene associations from PubMed articles

    PubMed Central

    Mojarad, Majid Rastegar; Li, Dingcheng; Liu, Sijia; Tao, Cui; Yu, Yue; Liu, Hongfang

    2018-01-01

    Recent scientific advances have accumulated a tremendous amount of biomedical knowledge, providing novel insights into the relationship between molecular and cellular processes and diseases. Literature mining is one of the commonly used methods to retrieve and extract information from scientific publications for understanding these associations. However, due to the large data volume and complicated, noisy associations, the interpretability of such association data for semantic knowledge discovery is challenging. In this study, we describe an integrative computational framework aiming to expedite the discovery of latent disease mechanisms by dissecting 146,245 disease-gene associations from over 25 million PubMed-indexed articles. We take advantage of both Latent Dirichlet Allocation (LDA) modeling and network-based analysis for their capabilities of detecting latent associations and reducing noise in large-volume data, respectively. Our results demonstrate that (1) the LDA-based modeling is able to group similar diseases into disease topics; (2) the disease-specific association networks follow the scale-free network property; (3) certain subnetwork patterns were enriched in the disease-specific association networks; and (4) genes were enriched in topic-specific biological processes. Our approach offers promising opportunities for latent disease-gene knowledge discovery in biomedical research. PMID:29373609

  19. Development testing of large volume water sprays for warm fog dispersal

    NASA Technical Reports Server (NTRS)

    Keller, V. W.; Anderson, B. J.; Burns, R. A.; Lala, G. G.; Meyer, M. B.; Beard, K. V.

    1986-01-01

    A new brute-force method of warm fog dispersal is described. The method uses large volume recycled water sprays to create curtains of falling drops through which the fog is processed by the ambient wind and spray induced air flow. Fog droplets are removed by coalescence/rainout. The efficiency of the technique depends upon the drop size spectra in the spray, the height to which the spray can be projected, the efficiency with which fog laden air is processed through the curtain of spray, and the rate at which new fog may be formed due to temperature differences between the air and spray water. Results of a field test program, implemented to develop the data base necessary to assess the proposed method, are presented. Analytical calculations based upon the field test results indicate that this proposed method of warm fog dispersal is feasible. Even more convincingly, the technique was successfully demonstrated in the one natural fog event which occurred during the test program. Energy requirements for this technique are an order of magnitude less than those to operate a thermokinetic system. An important side benefit is the considerable emergency fire extinguishing capability it provides along the runway.

  20. Commercial aspects of epitaxial thin film growth in outer space

    NASA Technical Reports Server (NTRS)

    Ignatiev, Alex; Chu, C. W.

    1988-01-01

    A new concept for materials processing in space exploits the ultra-vacuum component of space for thin film epitaxial growth. The unique low earth orbit space environment is expected to yield pressures of 10^-14 torr or better, semi-infinite pumping speeds, and a large ultra-vacuum volume (about 100 cu m) without walls. These space ultra-vacuum properties promise major improvements in the quality, unique nature, and throughput of epitaxially grown materials, especially in the area of semiconductors for microelectronics use. For such thin film materials, a very large value added is expected from space ultra-vacuum processing, and as a result the application of epitaxial thin film growth technology to space could lead to major commercial efforts in space.

  1. Concepts for on-board satellite image registration. Volume 3: Impact of VLSI/VHSIC on satellite on-board signal processing

    NASA Technical Reports Server (NTRS)

    Aanstoos, J. V.; Snyder, W. E.

    1981-01-01

    Anticipated major advances in integrated circuit technology in the near future are described, as well as their impact on satellite onboard signal processing systems. Dramatic improvements in chip density, speed, power consumption, and system reliability are expected from very large scale integration. These improvements will enable more intelligence to be placed on remote sensing platforms in space, meeting the goals of NASA's information adaptive system concept, a major component of the NASA End-to-End Data System program. A forecast of VLSI technological advances is presented, including a description of the Defense Department's very high speed integrated circuit program, a seven-year research and development effort.

  2. Big Data in the Industry - Overview of Selected Issues

    NASA Astrophysics Data System (ADS)

    Gierej, Sylwia

    2017-12-01

    This article reviews selected issues related to the use of Big Data in the industry. The aim is to define the potential scope and forms of using large data sets in manufacturing companies. By systematically reviewing scientific and professional literature, selected issues related to the use of mass data analytics in production were analyzed. A definition of Big Data was presented, detailing its main attributes. The importance of mass data processing technology in the development of the Industry 4.0 concept has been highlighted. Subsequently, attention was paid to issues such as production process optimization, decision making and mass production individualisation, and the potential of large volumes of data in these areas was indicated. As a result, conclusions were drawn regarding the potential of using Big Data in the industry.

  3. Update on Bio-Refining and Nanocellulose Composite Materials Manufacturing.

    PubMed

    Postek, Michael T; Poster, Dianne L

    2017-01-01

    Nanocellulose is a high value material that has gained increasing attention because of its high strength, stiffness, unique photonic and piezoelectric properties, high stability and uniform structure. One of the factors limiting the potential of nanocellulose and the vast array of potential new products is the ability to produce high-volume quantities of this nano-material. However, recent research has demonstrated that nanocellulose can be efficiently produced in large volumes from wood at relatively low cost by the incorporation of ionizing radiation in the process stream. Ionizing radiation causes significant breakdown of the polysaccharides and leads to the production of potentially useful gaseous products such as H2 and CO. Ionizing radiation processing remains an open field, ripe for innovation and application. This presentation will review the strong collaboration between the National Institute of Standards and Technology (NIST) and its academic partners pursuing the demonstration of applied ionizing radiation processing to plant materials for the manufacturing and characterization of novel nanomaterials.

  4. Update on Bio-Refining and Nanocellulose Composite Materials Manufacturing

    PubMed Central

    Postek, Michael T.; Poster, Dianne L.

    2017-01-01

    Nanocellulose is a high value material that has gained increasing attention because of its high strength, stiffness, unique photonic and piezoelectric properties, high stability and uniform structure. One of the factors limiting the potential of nanocellulose and the vast array of potential new products is the ability to produce high-volume quantities of this nano-material. However, recent research has demonstrated that nanocellulose can be efficiently produced in large volumes from wood at relatively low cost by the incorporation of ionizing radiation in the process stream. Ionizing radiation causes significant breakdown of the polysaccharides and leads to the production of potentially useful gaseous products such as H2 and CO. Ionizing radiation processing remains an open field, ripe for innovation and application. This presentation will review the strong collaboration between the National Institute of Standards and Technology (NIST) and its academic partners pursuing the demonstration of applied ionizing radiation processing to plant materials for the manufacturing and characterization of novel nanomaterials. PMID:29225398

  5. Update on bio-refining and nanocellulose composite materials manufacturing

    NASA Astrophysics Data System (ADS)

    Postek, Michael T.; Poster, Dianne L.

    2017-08-01

    Nanocellulose is a high value material that has gained increasing attention because of its high strength, stiffness, unique photonic and piezoelectric properties, high stability and uniform structure. One of the factors limiting the potential of nanocellulose and the vast array of potential new products is the ability to produce high-volume quantities of this nano-material. However, recent research has demonstrated that nanocellulose can be efficiently produced in large volumes from wood at relatively low cost by the incorporation of ionizing radiation in the process stream. Ionizing radiation causes significant breakdown of the polysaccharides and leads to the production of potentially useful gaseous products such as H2 and CO. Ionizing radiation processing remains an open field, ripe for innovation and application. This presentation will review the strong collaboration between the National Institute of Standards and Technology (NIST) and its academic partners pursuing the demonstration of applied ionizing radiation processing to plant materials for the manufacturing and characterization of novel nanomaterials.

  6. Effect of hydrothermal liquefaction aqueous phase recycling on bio-crude yields and composition.

    PubMed

    Biller, Patrick; Madsen, René B; Klemmer, Maika; Becker, Jacob; Iversen, Bo B; Glasius, Marianne

    2016-11-01

    Hydrothermal liquefaction (HTL) is a promising thermo-chemical processing technology for the production of biofuels but produces large amounts of process water. Therefore recirculation of process water from HTL of dried distillers grains with solubles (DDGS) is investigated. Two sets of recirculation on a continuous reactor system using K2CO3 as catalyst were carried out. Following this, the process water was recirculated in batch experiments for a total of 10 rounds. To assess the effect of alkali catalyst, non-catalytic HTL process water recycling was performed with 9 recycle rounds. Both sets of experiments showed a large increase in bio-crude yields from approximately 35 to 55wt%. The water phase and bio-crude samples from all experiments were analysed via quantitative gas chromatography-mass spectrometry (GC-MS) to investigate their composition and build-up of organic compounds. Overall the results show an increase in HTL conversion efficiency and a lower volume, more concentrated aqueous by-product following recycling. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Cerebellum and personality traits.

    PubMed

    Petrosini, Laura; Cutuli, Debora; Picerni, Eleonora; Laricchiuta, Daniela

    2015-02-01

    Personality traits are multidimensional traits comprising cognitive, emotional, and behavioral characteristics, and a wide array of cerebral structures mediates individual variability. Differences in personality traits covary with brain morphometry in specific brain regions. A cerebellar role in emotional and affective processing and in personality characteristics has been suggested. In a large sample of healthy subjects of both sexes and of different ages, the macro- and micro-structural variations of the cerebellum were correlated with the scores obtained on the Temperament and Character Inventory (TCI) by Cloninger. Cerebellar volumes were associated positively with Novelty Seeking scores and negatively with Harm Avoidance scores. Given the cerebellar contribution to personality traits and emotional processing, we also investigated cerebellar involvement in alexithymia, a personality construct characterized by impairment in cognitive, emotional, and affective processing. Interestingly, the subjects with high alexithymic traits had larger volumes in the bilateral Crus 1. The cerebellar substrate for some personality dimensions extends the relationship between personality and brain areas to a structure up to now thought to be involved mainly in motor and cognitive functions, much less in emotional processes, and even less in individual differences in personality. The enlarged volumes of Crus 1 in novelty seekers and alexithymics support the tendency to action featuring both personality constructs. In fact, Novelty Seeking and alexithymia are rooted in behavior and inescapably have a strong action component, resulting in stronger responses in the structures more focused on action and embodiment, as the cerebellum is.

  8. Cone Penetration Testing, a new approach to quantify coastal-deltaic land subsidence by peat consolidation

    NASA Astrophysics Data System (ADS)

    Koster, Kay; Erkens, Gilles; Zwanenburg, Cor

    2016-04-01

    It is undisputed that land subsidence threatens coastal-deltaic lowlands all over the world. Any loss of elevation (on top of sea level rise) increases flood risk in these lowlands, and differential subsidence may cause damage to infrastructure and constructions. Many of these settings embed substantial amounts of peat, which is, due to its mechanically weak organic composition, one of the main drivers of subsidence. Peat is very susceptible to volume reduction by loading and drainage-induced consolidation, which dissipates pore water, resulting in a tighter packing of the organic components. Often, the current state of consolidation of peat embedded within coastal-deltaic subsidence hotspots (e.g. Venice lagoon, Mississippi delta, San Joaquin delta, Kalimantan peatlands) is somewhere between its initial (natural) and maximum compressed stage. Quantifying the current state regarding peat volume loss is of utmost importance to predict potential (near) future subsidence when draining or loading an area. The processes of subsidence often afflict large areas (>10^3 km2), thus demanding large datasets to assess the current state of the subsurface. In contrast to data describing the vertical motions of the actual surface (geodesy, satellite imagery), subsurface information applicable for subsidence analysis is often lacking in subsiding deltas. This calls for new initiatives to bridge that gap. Here we introduce Cone Penetration Testing (CPT) to quantify the amount of volume loss experienced by peat layers embedded within the Holland coastal plain (the Netherlands). CPT measures soil mechanical strength, and hundreds of thousands of CPTs are conducted each year on all continents. We analyzed 28 coupled CPT-borehole observations, and found strong empirical relations between volume loss and increased peat mechanical strength. The peat lost between ~20 - 95% of its initial thickness by dissipation of excess pore water. An increase of 0.1 - 0.4 MPa in peat strength accounts for 20 - 75 % of the volume loss, and 0.4 - 0.7 MPa for 75 - 95 % volume loss. This indicates that a large amount of volume has to be lost through water dissipation before peat experiences a serious increase in strength, which subsequently continues to increase with only small amounts of additional volume loss. To demonstrate the robustness of our approach to the international field of land subsidence, we applied the obtained empirical relations to previously published CPT logs deriving from the peat-rich San Joaquin-Sacramento delta and the Kalimantan peatlands, and found volume losses that correspond with previously published results. Furthermore, we used the obtained results to predict maximum surface lowering for these areas by consolidation. In conclusion, these promising results, together with the worldwide popularity of CPT and the large datasets it yields, open the door for CPT as a generic method to contribute to quantifying the imminent threat of coastal-deltaic land subsidence.
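
    An illustrative reading of the empirical ranges quoted above, treating them as a piecewise-linear mapping from the increase in peat strength to volume loss; the actual regression fitted to the 28 CPT-borehole pairs is not given in the abstract.

      # Illustrative only: interpolating the reported (strength increase -> volume loss) ranges.
      import numpy as np

      strength_increase_mpa = np.array([0.1, 0.4, 0.7])   # increase in peat strength from CPT
      volume_loss_percent   = np.array([20.0, 75.0, 95.0])

      def estimated_volume_loss(delta_q_mpa):
          return np.interp(delta_q_mpa, strength_increase_mpa, volume_loss_percent)

      for dq in (0.2, 0.5):
          print(f"strength increase {dq:.1f} MPa -> ~{estimated_volume_loss(dq):.0f}% volume loss")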

  9. Sedimentation as a Control for Large Submarine Landslides: Mechanical Modeling and Analysis of the Santa Barbara Basin

    NASA Astrophysics Data System (ADS)

    Stoecklin, A.; Friedli, B.; Puzrin, A. M.

    2017-11-01

    The volume of submarine landslides is a key controlling factor for their damage potential. Particularly large landslides are found in active sedimentary regions. However, the mechanism controlling their volume, and in particular their thickness, remains unclear. Here we present a mechanism that explains how rapid sedimentation can lead to localized slope failure at a preferential depth and set the conditions for the emergence of large-scale slope-parallel landslides. We account for the contractive shearing behavior of the sediments, which locally accelerates the development of overpressures in the pore fluid, even on very mild slopes. When applied to the Santa Barbara basin, the mechanism offers an explanation for the regional variation in landslide thickness and their sedimentation-controlled recurrence. Although earthquakes are the most likely trigger for these mass movements, our results suggest that the sedimentation process controls the geometry of their source region. The mechanism introduced here is generally applicable and can provide initial conditions for subsequent landslide triggering, runout, and tsunami-source analyses in sedimentary regions.

  10. Application of computer vision to automatic prescription verification in pharmaceutical mail order

    NASA Astrophysics Data System (ADS)

    Alouani, Ali T.

    2005-05-01

    In large volume pharmaceutical mail order, before shipping out prescriptions, licensed pharmacists ensure that the drug in the bottle matches the information provided in the patient prescription. Typically, the pharmacist has about 2 sec to complete the verification of one prescription. Performing about 1800 prescription verifications per hour is tedious and can generate human errors as a result of visual and brain fatigue. Available automatic drug verification systems are limited to a single pill at a time. This is not suitable for large volume pharmaceutical mail order, where a prescription can have as many as 60 pills and where thousands of prescriptions are filled every day. In an attempt to reduce human fatigue and cost and to limit human error, the automatic prescription verification system (APVS) was invented to meet the needs of large scale pharmaceutical mail order. This paper deals with the design and implementation of the first prototype online automatic prescription verification machine to perform the same task currently done by a pharmacist. The emphasis here is on the visual aspects of the machine. The system has been successfully tested on 43,000 prescriptions.

  11. Finding the Cold Needle in a Warm Haystack: Infrared Imaging Applied to Locating Cryo-cooled Crystals in Loops

    NASA Technical Reports Server (NTRS)

    Snell, Edward; vanderWoerd, Mark

    2003-01-01

    Thermal imaging of the cryocooling process of crystals has been demonstrated, showing the progression of a cold wave through a crystal from the face closest to the origin of the cold stream to the point furthest away. During these studies, large volume crystals were clearly distinguished from the loop holding them. Large volume crystals, used for neutron studies, were chosen deliberately to enhance the imaging. The different infrared transmission and reflectance properties of the crystal in comparison to the cryo-protectant are thought to be the parameter that produces the contrast making the crystal visible. As an application of the technology to locating crystals, small crystals of lysozyme and a bFGF/DNA complex were then cryo-protected and imaged in large loops. The crystals were clearly distinguished from the vitrified solution. In the case of the bFGF/DNA complex, the illumination had to be carefully manipulated to enable the crystal to be seen in the visible spectrum. These preliminary results will be presented along with advantages and disadvantages of the technique and a discussion of how it might be applied.

  12. Evaluating non-relational storage technology for HEP metadata and meta-data catalog

    NASA Astrophysics Data System (ADS)

    Grigorieva, M. A.; Golosova, M. V.; Gubin, M. Y.; Klimentov, A. A.; Osipova, V. V.; Ryabinkin, E. A.

    2016-10-01

    Large-scale scientific experiments produce vast volumes of data. These data are stored, processed and analyzed in a distributed computing environment. The life cycle of an experiment is managed by specialized software such as Distributed Data Management and Workload Management Systems. In order to be interpreted and mined, experimental data must be accompanied by auxiliary metadata, which are recorded at each data processing step. Metadata describe scientific data and represent scientific objects or results of scientific experiments, allowing them to be shared by various applications, recorded in databases, or published via the Web. Processing and analysis of the constantly growing volume of auxiliary metadata is a challenging task, no simpler than the management and processing of the experimental data itself. Furthermore, metadata sources are often loosely coupled and may potentially lead to end-user inconsistency in combined information queries. To aggregate and synthesize a range of primary metadata sources, and to enhance them with flexible schema-less addition of aggregated data, we are developing the Data Knowledge Base architecture serving as the intelligence behind GUIs and APIs.

  13. Development of an interactive data base management system for capturing large volumes of data.

    PubMed

    Moritz, T E; Ellis, N K; VillaNueva, C B; Steeger, J E; Ludwig, S T; Deegan, N I; Shroyer, A L; Henderson, W G; Sethi, G K; Grover, F L

    1995-10-01

    Accurate collection and successful management of data are problems common to all scientific studies. For studies in which large quantities of data are collected by means of questionnaires and/or forms, data base management becomes quite laborious and time consuming. Data base management comprises data collection, data entry, data editing, and data base maintenance. In this article, the authors describe the development of an interactive data base management (IDM) system for the collection of more than 1,400 variables from a targeted population of 6,000 patients undergoing heart surgery requiring cardiopulmonary bypass. The goals of the IDM system are to increase the accuracy and efficiency with which this large amount of data is collected and processed, to reduce research nurse work load through automation of certain administrative and clerical activities, and to improve the process for implementing a uniform study protocol, standardized forms, and definitions across sites.

  14. Driftcretions: The legacy impacts of driftwood on shoreline morphology

    NASA Astrophysics Data System (ADS)

    Kramer, Natalie; Wohl, Ellen

    2015-07-01

    This research demonstrates how vegetation interacts with physical processes to govern landscape development. We quantify and describe interactions among driftwood, sedimentation, and vegetation for Great Slave Lake, which is used as a proxy for shoreline dynamics and landforms before deforestation and wood removal along major waterways. We introduce the term driftcretion to describe large, persistent concentrations of driftwood that interact with vegetation and sedimentation to influence shoreline evolution. We report the volume and distribution of driftwood along shorelines, the morphological impacts of driftwood delivery throughout the Holocene, and rates of driftwood accretion. Driftcretions facilitate the formation of complex, diverse morphologies that increase biological productivity and organic carbon capture and buffer against erosion. Driftcretions should be common on shorelines that receive a large wood supply and have processes that store wood permanently. We encourage others to work in these depositional zones to understand the physical and biological impacts of large wood export from river basins.

  15. Decadal to monthly timescales of magma transfer and reservoir growth at a caldera volcano.

    PubMed

    Druitt, T H; Costa, F; Deloule, E; Dungan, M; Scaillet, B

    2012-02-01

    Caldera-forming volcanic eruptions are low-frequency, high-impact events capable of discharging tens to thousands of cubic kilometres of magma explosively on timescales of hours to days, with devastating effects on local and global scales. Because no such eruption has been monitored during its long build-up phase, the precursor phenomena are not well understood. Geophysical signals obtained during recent episodes of unrest at calderas such as Yellowstone, USA, and Campi Flegrei, Italy, are difficult to interpret, and the conditions necessary for large eruptions are poorly constrained. Here we present a study of pre-eruptive magmatic processes and their timescales using chemically zoned crystals from the 'Minoan' caldera-forming eruption of Santorini volcano, Greece, which occurred in the late 1600s BC. The results provide insights into how rapidly large silicic systems may pass from a quiescent state to one on the edge of eruption. Despite the large volume of erupted magma (40-60 cubic kilometres), and the 18,000-year gestation period between the Minoan eruption and the previous major eruption, most crystals in the Minoan magma record processes that occurred less than about 100 years before the eruption. Recharge of the magma reservoir by large volumes of silicic magma (and some mafic magma) occurred during the century before eruption, and mixing between different silicic magma batches was still taking place during the final months. Final assembly of large silicic magma reservoirs may occur on timescales that are geologically very short by comparison with the preceding repose period, with major growth phases immediately before eruption. These observations have implications for the monitoring of long-dormant, but potentially active, caldera systems.

  16. Sodium content of processed foods in the United Kingdom: analysis of 44,000 foods purchased by 21,000 households

    PubMed Central

    Capelin, Cathy; Dunford, Elizabeth K; Webster, Jacqueline L; Neal, Bruce C; Jebb, Susan A

    2011-01-01

    Background: In the United Kingdom, sodium reduction targets have been set for a large number of processed food categories. Assessment and monitoring are essential to evaluate progress. Objectives: Our aim was to determine whether household consumer panel food-purchasing data could be used to assess the sodium content of processed foods. Our further objectives were to estimate the mean sodium content of UK foods by category and undertake analyses weighted by food-purchasing volumes. Design: Data were obtained for 21,108 British households between October 2008 and September 2009. Purchasing data (product description, product weight, annual purchases) and sodium values (mg/100 g) were collated for all food categories known to be major contributors to sodium intake. Unweighted and weighted mean sodium values were calculated. Results: Data were available for 44,372 food products. The largest contributors to sodium purchases were table salt (23%), processed meat (18%), bread and bakery products (13%), dairy products (12%), and sauces and spreads (11%). More than one-third of sodium purchased (37%) was accounted for by 5 food categories: bacon, bread, milk, cheese, and sauces. For some food groups (bread and bakery, cereals and cereal products, processed meat), purchase-weighted means were 18–35% higher than unweighted means, suggesting that market leaders have higher sodium contents than the category mean. Conclusion: The targeting of sodium reduction in a small number of food categories and focusing on products sold in the highest volumes could lead to large decreases in sodium available for consumption and therefore to gains in public health. PMID:21191142

  17. Sodium content of processed foods in the United Kingdom: analysis of 44,000 foods purchased by 21,000 households.

    PubMed

    Ni Mhurchu, Cliona; Capelin, Cathy; Dunford, Elizabeth K; Webster, Jacqueline L; Neal, Bruce C; Jebb, Susan A

    2011-03-01

    In the United Kingdom, sodium reduction targets have been set for a large number of processed food categories. Assessment and monitoring are essential to evaluate progress. Our aim was to determine whether household consumer panel food-purchasing data could be used to assess the sodium content of processed foods. Our further objectives were to estimate the mean sodium content of UK foods by category and undertake analyses weighted by food-purchasing volumes. Data were obtained for 21,108 British households between October 2008 and September 2009. Purchasing data (product description, product weight, annual purchases) and sodium values (mg/100 g) were collated for all food categories known to be major contributors to sodium intake. Unweighted and weighted mean sodium values were calculated. Data were available for 44,372 food products. The largest contributors to sodium purchases were table salt (23%), processed meat (18%), bread and bakery products (13%), dairy products (12%), and sauces and spreads (11%). More than one-third of sodium purchased (37%) was accounted for by 5 food categories: bacon, bread, milk, cheese, and sauces. For some food groups (bread and bakery, cereals and cereal products, processed meat), purchase-weighted means were 18-35% higher than unweighted means, suggesting that market leaders have higher sodium contents than the category mean. The targeting of sodium reduction in a small number of food categories and focusing on products sold in the highest volumes could lead to large decreases in sodium available for consumption and therefore to gains in public health.
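    The difference between unweighted and purchase-weighted category means described in these two records can be made concrete with a short calculation. The product names, sodium values, and purchase volumes below are invented for illustration and are not taken from the study.

```python
# Hypothetical illustration of unweighted vs purchase-weighted mean sodium
# for one food category.
products = [
    {"name": "bread_a", "sodium_mg_per_100g": 380, "kg_purchased": 120_000},
    {"name": "bread_b", "sodium_mg_per_100g": 450, "kg_purchased": 310_000},  # market leader
    {"name": "bread_c", "sodium_mg_per_100g": 300, "kg_purchased": 40_000},
]

# Unweighted mean: every product counts equally.
unweighted = sum(p["sodium_mg_per_100g"] for p in products) / len(products)

# Purchase-weighted mean: products sold in larger volumes count more, so a
# high-sodium market leader pulls the category mean upward.
total_kg = sum(p["kg_purchased"] for p in products)
weighted = sum(p["sodium_mg_per_100g"] * p["kg_purchased"] for p in products) / total_kg

print(f"unweighted mean:        {unweighted:.0f} mg/100 g")
print(f"purchase-weighted mean: {weighted:.0f} mg/100 g")
```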

  18. The GRIDView Visualization Package

    NASA Astrophysics Data System (ADS)

    Kent, B. R.

    2011-07-01

    Large three-dimensional data cubes, catalogs, and spectral line archives are increasingly important elements of the data discovery process in astronomy. Visualization of large data volumes is of vital importance for the success of large spectral line surveys. Examples of data reduction utilizing the GRIDView software package are shown. The package allows users to manipulate data cubes, extract spectral profiles, and measure line properties. The package and included graphical user interfaces (GUIs) are designed with pipeline infrastructure in mind. The software has been used with great success analyzing spectral line and continuum data sets obtained from large radio survey collaborations. The tools are also important for multi-wavelength cross-correlation studies and incorporate Virtual Observatory client applications for overlaying database information in real time as cubes are examined by users.

  19. Earth observing system. Output data products and input requirements, version 2.0. Volume 1: Instrument data product characteristics

    NASA Technical Reports Server (NTRS)

    Lu, Yun-Chi; Chang, Hyo Duck; Krupp, Brian; Kumar, Ravindra; Swaroop, Anand

    1992-01-01

    Information on Earth Observing System (EOS) output data products and input data requirements that has been compiled by the Science Processing Support Office (SPSO) at GSFC is presented. Since Version 1.0 of the SPSO Report was released in August 1991, there have been significant changes in the EOS program. In anticipation of a likely budget cut for the EOS Project, NASA HQ restructured the EOS program. An initial program consisting of two large platforms was replaced by plans for multiple, smaller platforms, and some EOS instruments were either deselected or descoped. Updated payload information reflecting the restructured EOS program superseding the August 1991 version of the SPSO report is included. This report has been expanded to cover information on non-EOS data products, and consists of three volumes (Volumes 1, 2, and 3). Volume 1 provides information on instrument outputs and input requirements. Volume 2 is devoted to Interdisciplinary Science (IDS) outputs and input requirements, including the 'best' and 'alternative' match analysis. Volume 3 provides information about retrieval algorithms, non-EOS input requirements of instrument teams and IDS investigators, and availability of non-EOS data products at seven primary Distributed Active Archive Centers (DAAC's).

  20. A new model of reaction-driven cracking: fluid volume consumption and tensile failure during serpentinization

    NASA Astrophysics Data System (ADS)

    Eichenbaum-Pikser, J. M.; Spiegelman, M. W.; Kelemen, P. B.; Wilson, C. R.

    2013-12-01

    Reactive fluid flow plays an important role in a wide range of geodynamic processes, such as melt migration, formation of hydrous minerals on fault surfaces, and chemical weathering. These processes are governed by the complex coupling between fluid transport, reaction, and solid deformation. Reaction-driven cracking is a potentially critical feedback mechanism, by which volume change associated with chemical reaction drives fracture in the surrounding rock. It has been proposed to play a role in both serpentinization and carbonation of peridotite, motivating consideration of its application to mineral carbon sequestration. Previous studies of reactive cracking have focused on the increase in solid volume, and as such, have considered failure in compression. However, if the consumption of fluid is considered in the overall volume budget, the reaction can be net volume reducing, potentially leading to failure in tension. To explore these problems, we have formulated and solved a 2-D model of coupled porous flow, reaction kinetics, and elastic deformation using the finite element model assembler TerraFERMA (Wilson et al, G3 2013 submitted). The model is applied to the serpentinization of peridotite, which can be reasonably approximated as the transfer of a single reactive component (H2O) between fluid and solid phases, making it a simple test case to explore the process. The behavior of the system is controlled by the competition between the rate of volume consumption by the reaction, and the rate of volume replacement by fluid transport, as characterized by a nondimensional parameter χ, which depends on permeability, reaction rate, and the bulk modulus of the solid. Large values of χ correspond to fast fluid transport relative to reaction rate, resulting in a low stress, volume replacing regime. At smaller values of χ, fluid transport cannot keep up with the reaction, resulting in pore fluid under-pressure and tensile solid stresses. For the range of χ relevant to the serpentinization of peridotite, these stresses can reach hundreds of MPa, exceeding the tensile strength of peridotite.
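    The abstract does not give the definition of χ, so the grouping below is only a hedged schematic consistent with the stated dependencies (permeability k, reaction rate Γ, and solid bulk modulus K, plus a fluid viscosity μ and a length scale L needed to close the dimensions); it should not be read as the authors' actual parameter.

```latex
\chi \;\sim\;
\frac{\text{rate of pore-fluid resupply by Darcy flow}}
     {\text{rate of fluid-volume consumption by reaction}}
\;\propto\; \frac{k\,K}{\mu\,\Gamma\,L^{2}}
```

    Large χ then corresponds to fast transport relative to reaction (the low-stress, volume-replacing regime), while small χ corresponds to under-pressured pores and tensile solid stress, as described above.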

  1. Metalorganic chemical vapor deposition of AlGaAs and InGaP heterojunction bipolar transistors

    NASA Astrophysics Data System (ADS)

    Pan, N.; Welser, R. E.; Lutz, C. R.; DeLuca, P. M.; Han, B.; Hong, K.

    2001-05-01

    Heterojunction bipolar transistors (HBT) are now beginning to be widely incorporated as power amplifiers, laser drivers, multiplexers, clock data recovery circuits, as well as transimpedance and broadband amplifiers in high performance millimeter wave circuits (MMICs). The increasing acceptance of this device is principally due to advancements in metalorganic chemical vapor deposition (MOCVD), device processing, and circuit design technologies. Many of the DC electrical characteristics of large area devices can be directly correlated to the DC performance of small area RF devices. A precise understanding of the growth parameters and their relationship to device characteristics is critical for ensuring the high degree of reproducibility required for low cost high-yield volume manufacturing. Significant improvements in the understanding of the MOCVD growth process have been realized through the implementation of statistical process control on the key HBT device parameters. This tool has been successfully used to maintain the high quality of the device characteristics in high-volume production of 4″ GaAs-based HBTs. There is a growing demand to migrate towards 6″ diameter wafer size due to the potential cost reductions and increased volume production that can be realized. Preliminary results, indicating good heterostructure layer characteristics, demonstrate the feasibility of 6″ InGaP-based HBT devices.
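    The statistical process control mentioned above can be sketched as a simple control-limit check: limits are set from an in-control baseline of lots, and later lots are flagged when the monitored parameter drifts outside them. The parameter (DC current gain beta) and all numbers below are invented; this illustrates the SPC idea, not the fab's actual monitoring scheme.

```python
# Minimal sketch of a 3-sigma control-limit check on a key HBT device parameter.
import statistics

baseline = [102, 98, 101, 97, 105, 99, 100, 96]   # in-control lots used to set limits
new_lots = {9: 103, 10: 121, 11: 99}               # lot number -> measured beta

center = statistics.fmean(baseline)
sigma = statistics.stdev(baseline)
ucl, lcl = center + 3 * sigma, center - 3 * sigma  # upper/lower control limits

for lot, beta in new_lots.items():
    status = "ok" if lcl <= beta <= ucl else "OUT OF CONTROL"
    print(f"lot {lot}: beta={beta}  (limits {lcl:.1f}-{ucl:.1f})  {status}")
```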

  2. Classifying magnetic resonance image modalities with convolutional neural networks

    NASA Astrophysics Data System (ADS)

    Remedios, Samuel; Pham, Dzung L.; Butman, John A.; Roy, Snehashis

    2018-02-01

    Magnetic Resonance (MR) imaging allows the acquisition of images with different contrast properties depending on the acquisition protocol and the magnetic properties of tissues. Many MR brain image processing techniques, such as tissue segmentation, require multiple MR contrasts as inputs, and each contrast is treated differently. Thus it is advantageous to automate the identification of image contrasts for various purposes, such as facilitating image processing pipelines, and managing and maintaining large databases via content-based image retrieval (CBIR). Most automated CBIR techniques focus on a two-step process: extracting features from data and classifying the image based on these features. We present a novel 3D deep convolutional neural network (CNN)-based method for MR image contrast classification. The proposed CNN automatically identifies the MR contrast of an input brain image volume. Specifically, we explored three classification problems: (1) identify T1-weighted (T1-w), T2-weighted (T2-w), and fluid-attenuated inversion recovery (FLAIR) contrasts; (2) identify pre- vs. post-contrast T1; (3) identify pre- vs. post-contrast FLAIR. A total of 3418 image volumes acquired from multiple sites and multiple scanners were used. To evaluate each task, the proposed model was trained on 2137 images and tested on the remaining 1281 images. Results showed that image volumes were correctly classified with 97.57% accuracy.
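    As a rough illustration of what a 3D CNN contrast classifier of this kind looks like, the sketch below builds a small volumetric network in PyTorch. The architecture, input size, and hyperparameters are assumptions chosen for brevity; they are not the network described in the record above.

```python
# Minimal sketch of a 3D CNN that maps a single-channel brain volume to one of
# three contrast classes (e.g., T1-w, T2-w, FLAIR).
import torch
import torch.nn as nn

class ContrastClassifier(nn.Module):
    def __init__(self, n_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),      # global average pooling -> one value per channel
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: batch of single-channel volumes with shape (N, 1, D, H, W)
        h = self.features(x).flatten(1)
        return self.classifier(h)

model = ContrastClassifier(n_classes=3)
dummy = torch.randn(2, 1, 64, 64, 64)     # two fake image volumes
print(model(dummy).shape)                 # torch.Size([2, 3])
```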

  3. Biases in measuring the brain: the trouble with the telencephalon.

    PubMed

    LaDage, Lara D; Roth, Timothy C; Pravosudov, Vladimir V

    2009-01-01

    When correlating behavior with particular brain regions thought responsible for the behavior, a different region of the brain is usually measured as a control region. This technique is often used to relate spatial processes with the hippocampus, while concomitantly controlling for overall brain changes by measuring the remainder of the telencephalon. We have identified two methods in the literature (the HOM and TTM) that estimate the volume of the telencephalon, although the majority of studies are ambiguous regarding the method employed in measuring the telencephalon. Of these two methods, the HOM might produce an artificial correlation between the telencephalon and the hippocampus, and this bias could result in a significant overestimation of the relative hippocampal volume and a significant underestimation of the telencephalon volume, both of which are regularly used in large comparative analyses. We suggest that future studies should avoid this method and all studies should explicitly delineate the procedures used when estimating brain volumes. Copyright 2009 S. Karger AG, Basel.

  4. A Vertically Lagrangian Finite-Volume Dynamical Core for Global Models

    NASA Technical Reports Server (NTRS)

    Lin, Shian-Jiann

    2003-01-01

    A finite-volume dynamical core with a terrain-following Lagrangian control-volume discretization is described. The vertically Lagrangian discretization reduces the dimensionality of the physical problem from three to two, with the resulting dynamical system closely resembling that of the shallow-water dynamical system. The 2D horizontal-to-Lagrangian-surface transport and dynamical processes are then discretized using the genuinely conservative flux-form semi-Lagrangian algorithm. Time marching is split-explicit, with a large time step for scalar transport and a small fractional time step for the Lagrangian dynamics, which permits the accurate propagation of fast waves. A mass, momentum, and total energy conserving algorithm is developed for mapping the state variables periodically from the floating Lagrangian control-volume to an Eulerian terrain-following coordinate for dealing with physical parameterizations and to prevent severe distortion of the Lagrangian surfaces. Deterministic baroclinic wave growth tests and long-term integrations using the Held-Suarez forcing are presented. Impact of the monotonicity constraint is discussed.

  5. FluoRender: joint freehand segmentation and visualization for many-channel fluorescence data analysis.

    PubMed

    Wan, Yong; Otsuna, Hideo; Holman, Holly A; Bagley, Brig; Ito, Masayoshi; Lewis, A Kelsey; Colasanto, Mary; Kardon, Gabrielle; Ito, Kei; Hansen, Charles

    2017-05-26

    Image segmentation and registration techniques have enabled biologists to place large amounts of volume data from fluorescence microscopy, morphed three-dimensionally, onto a common spatial frame. Existing tools built on volume visualization pipelines for single channel or red-green-blue (RGB) channels have become inadequate for the new challenges of fluorescence microscopy. For a three-dimensional atlas of the insect nervous system, hundreds of volume channels are rendered simultaneously, whereas fluorescence intensity values from each channel need to be preserved for versatile adjustment and analysis. Although several existing tools have incorporated support of multichannel data using various strategies, the lack of a flexible design has made true many-channel visualization and analysis unavailable. The most common practice for many-channel volume data presentation is still converting and rendering pseudosurfaces, which are inaccurate for both qualitative and quantitative evaluations. Here, we present an alternative design strategy that accommodates the visualization and analysis of about 100 volume channels, each of which can be interactively adjusted, selected, and segmented using freehand tools. Our multichannel visualization includes a multilevel streaming pipeline plus a triple-buffer compositing technique. Our method also preserves original fluorescence intensity values on graphics hardware, a crucial feature that allows graphics-processing-unit (GPU)-based processing for interactive data analysis, such as freehand segmentation. We have implemented the design strategies as a thorough restructuring of our original tool, FluoRender. The redesign of FluoRender not only maintains the existing multichannel capabilities for a greatly extended number of volume channels, but also enables new analysis functions for many-channel data from emerging biomedical-imaging techniques.

  6. A flexible piezoresistive carbon black network in silicone rubber for wide range deformation and strain sensing

    NASA Astrophysics Data System (ADS)

    Zhu, Jianxiong; Wang, Hai; Zhu, Yali

    2018-01-01

    This work presents the design, fabrication, and measurement of a piezoresistive device with a carbon black (CB) particle network in a highly flexible silicone rubber for large deformation and wide-range strain sensing. The piezoresistive composite film was fabricated from a mixture of silicone rubber and CB filler particles. The test results showed that the CB particle network in the silicone rubber strongly affected the resistance of the device during the drawing (stretching) process and its recovery. We found that the 50% volume ratio of CB filler particles showed a lower relative resistance than the 33.3% volume ratio, but with the advantage of good resistance recovery stability and a smaller perturbation error (smaller resistance change) during the periodic back-and-forth linear motor test. With CB filler volume ratios of 50% and 33.3%, strains of up to 200% can be reached, with resistances of 18 kΩ and 110 kΩ, respectively. We also found that the relative resistance increased approximately linearly with the step-increased instantaneous length of the reported device. Moreover, an application test through hand drawing was used to demonstrate the piezoresistive performance of the device, which showed that the reported device was capable of measuring the instantaneous length under large deformation.

  7. An HTML5-Based Pure Website Solution for Rapidly Viewing and Processing Large-Scale 3D Medical Volume Reconstruction on Mobile Internet

    PubMed Central

    Chen, Xin; Zhang, Ye; Zhang, Jingna; Li, Ying; Mo, Xuemei; Chen, Wei

    2017-01-01

    This study aimed to propose a pure web-based solution that lets users access large-scale 3D medical volumes anywhere with a good user experience and complete details. A novel solution based on a Master-Slave interaction mode was proposed, which combines the advantages of remote volume rendering and surface rendering. On the server side, we designed a message-responding mechanism to listen to interactive requests from clients (Slave model) and to guide Master volume rendering. On the client side, we used HTML5 to normalize user-interactive behaviors on the Slave model and to enhance the accuracy of behavior requests and the user experience. The results showed that more than four independent tasks (each with a data size of 249.4 MB) could be carried out simultaneously with a 100 KBps client bandwidth (extreme test); the first loading time was <12 s, and the response time of each behavior request for the final high-quality image remained at approximately 1 s, while the peak bandwidth was <50 KBps. Meanwhile, the FPS value for each client was ≥40. This solution lets users rapidly access the application via one URL hyperlink without special software or hardware requirements in a diversified network environment and can be easily integrated into other telemedicine systems seamlessly. PMID:28638406

  8. An HTML5-Based Pure Website Solution for Rapidly Viewing and Processing Large-Scale 3D Medical Volume Reconstruction on Mobile Internet.

    PubMed

    Qiao, Liang; Chen, Xin; Zhang, Ye; Zhang, Jingna; Wu, Yi; Li, Ying; Mo, Xuemei; Chen, Wei; Xie, Bing; Qiu, Mingguo

    2017-01-01

    This study aimed to propose a pure web-based solution that lets users access large-scale 3D medical volumes anywhere with a good user experience and complete details. A novel solution based on a Master-Slave interaction mode was proposed, which combines the advantages of remote volume rendering and surface rendering. On the server side, we designed a message-responding mechanism to listen to interactive requests from clients (Slave model) and to guide Master volume rendering. On the client side, we used HTML5 to normalize user-interactive behaviors on the Slave model and to enhance the accuracy of behavior requests and the user experience. The results showed that more than four independent tasks (each with a data size of 249.4 MB) could be carried out simultaneously with a 100 KBps client bandwidth (extreme test); the first loading time was <12 s, and the response time of each behavior request for the final high-quality image remained at approximately 1 s, while the peak bandwidth was <50 KBps. Meanwhile, the FPS value for each client was ≥40. This solution lets users rapidly access the application via one URL hyperlink without special software or hardware requirements in a diversified network environment and can be easily integrated into other telemedicine systems seamlessly.
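    The Master-Slave interaction described in these two records reduces to a simple request/response loop: the browser-side Slave sends lightweight interaction messages, and the server-side Master renders and returns only the resulting frame. The sketch below is a hypothetical illustration of that message-responding idea; message names, fields, and the placeholder renderer are assumptions, not the paper's actual protocol or code.

```python
# Hypothetical sketch of a server-side message-responding loop for remote
# volume rendering: interaction messages in, rendered frames out.
import json

class MasterRenderer:
    """Stand-in for the volume renderer running on the server (Master)."""
    def __init__(self):
        self.camera = {"azimuth": 0.0, "elevation": 0.0, "zoom": 1.0}

    def render(self) -> bytes:
        # A real implementation would return an encoded image of the volume;
        # here we return a placeholder payload describing the current view.
        return json.dumps({"frame_for": self.camera}).encode()

def handle_message(renderer: MasterRenderer, message: str) -> bytes:
    """Dispatch one Slave-side interaction request and return the new frame."""
    request = json.loads(message)
    if request["type"] == "rotate":
        renderer.camera["azimuth"] += request["dx"]
        renderer.camera["elevation"] += request["dy"]
    elif request["type"] == "zoom":
        renderer.camera["zoom"] *= request["factor"]
    return renderer.render()   # only the final image travels back to the client

renderer = MasterRenderer()
print(handle_message(renderer, json.dumps({"type": "rotate", "dx": 15, "dy": 5})))
print(handle_message(renderer, json.dumps({"type": "zoom", "factor": 1.5})))
```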

  9. Perspective: Aerosol microphysics: From molecules to the chemical physics of aerosols

    NASA Astrophysics Data System (ADS)

    Bzdek, Bryan R.; Reid, Jonathan P.

    2017-12-01

    Aerosols are found in a wide diversity of contexts and applications, including the atmosphere, pharmaceutics, and industry. Aerosols are dispersions of particles in a gas, and the coupling of the two phases results in highly dynamic systems where chemical and physical properties like size, composition, phase, and refractive index change rapidly in response to environmental perturbations. Aerosol particles span a wide range of sizes from 1 nm to tens of micrometres or from small molecular clusters that may more closely resemble gas phase molecules to large particles that can have similar qualities to bulk materials. However, even large particles with finite volumes exhibit distinct properties from the bulk condensed phase, due in part to their higher surface-to-volume ratio and their ability to easily access supersaturated solute states inaccessible in the bulk. Aerosols represent a major challenge for study because of the facile coupling between the particle and gas, the small amounts of sample available for analysis, and the sheer breadth of operative processes. Time scales of aerosol processes can be as short as nanoseconds or as long as years. Despite their very different impacts and applications, fundamental chemical physics processes serve as a common theme that underpins our understanding of aerosols. This perspective article discusses challenges in the study of aerosols and highlights recent chemical physics advancements that have enabled improved understanding of these complex systems.

  10. Development of a high-throughput microscale cell disruption platform for Pichia pastoris in rapid bioprocess design.

    PubMed

    Bláha, Benjamin A F; Morris, Stephen A; Ogonah, Olotu W; Maucourant, Sophie; Crescente, Vincenzo; Rosenberg, William; Mukhopadhyay, Tarit K

    2018-01-01

    The time and cost benefits of miniaturized fermentation platforms can only be gained by employing complementary techniques that facilitate high throughput at small sample volumes. Microbial cell disruption is a major bottleneck in experimental throughput and is often restricted to large processing volumes. Moreover, for rigid yeast species, such as Pichia pastoris, no effective high-throughput disruption methods exist. The development of an automated, miniaturized, high-throughput, noncontact, scalable platform based on adaptive focused acoustics (AFA) to disrupt P. pastoris and recover intracellular heterologous protein is described. Augmented modes of AFA were established by investigating vessel designs and a novel enzymatic pretreatment step. Three different modes of AFA were studied and compared to the performance of high-pressure homogenization. For each of these modes of cell disruption, response models were developed to account for five different performance criteria. Using multiple responses not only demonstrated that different operating parameters are required for different response optima, with the highest product purity requiring suboptimal values for other criteria, but also allowed AFA-based methods to mimic large-scale homogenization processes. These results demonstrate that AFA-mediated cell disruption can be used for a wide range of applications including buffer development, strain selection, fermentation process development, and whole bioprocess integration. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 34:130-140, 2018. © 2017 American Institute of Chemical Engineers.

  11. Assessing the variability of glacier lake bathymetries and potential peak discharge based on large-scale measurements in the Cordillera Blanca, Peru

    NASA Astrophysics Data System (ADS)

    Cochachin, Alejo; Huggel, Christian; Salazar, Cesar; Haeberli, Wilfried; Frey, Holger

    2015-04-01

    Over timescales of hundreds to thousands of years, ice masses in mountains have eroded bedrock and subglacial sediment, forming overdeepenings and large moraine dams that now serve as basins for glacial lakes. Satellite-based studies found a total of 8355 glacial lakes in Peru, of which 830 were observed in the Cordillera Blanca. Some of them have caused major disasters due to glacial lake outburst floods in the past decades. On the other hand, in view of shrinking glaciers, changing water resources, and the formation of new lakes, glacial lakes could serve as water reservoirs in the future. Here we present unprecedented bathymetric studies of 124 glacial lakes in the Cordillera Blanca, Huallanca, Huayhuash, and Raura in the regions of Ancash, Huanuco, and Lima. Measurements were carried out using a boat equipped with GPS, a total station, and an echo sounder to measure the depth of the lakes. AutoCAD Civil 3D Land and ArcGIS were used to process the data, generate digital topographies of the lake bathymetries, and analyze parameters such as lake area, length, width, depth, and volume. Based on these data, we derived empirical equations relating mean depth to (1) area, (2) maximum length, and (3) maximum width. We then applied these three equations to all 830 glacial lakes of the Cordillera Blanca to estimate their volumes. Finally, we used three relations from the literature to assess the peak discharge of potential lake outburst floods based on lake volume, resulting in 3 x 3 peak discharge estimates. In terms of lake topography and geomorphology, results indicate that the maximum depth is located in the central part for bedrock lakes and in the back part for lakes in moraine material. The best correlations are found between mean depth and maximum width; however, all three empirical relations show a large spread, reflecting the wide range of natural lake bathymetries. The volumes of the 124 lakes with bathymetries amount to 0.9 km3, while the volume of all glacial lakes of the Cordillera Blanca ranges between 1.15 and 1.29 km3. The small difference in volume between the full lake sample and the smaller, bathymetrically surveyed sample is due to the large size of the measured lakes. The different distributions of lake volume and peak discharge indicate the range of variability in such estimates and provide valuable first-order information for management and adaptation efforts in the fields of water resources and flood prevention.
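    The estimation chain described above (empirical depth relation, then volume, then potential peak discharge) can be sketched in a few lines. All coefficients below are placeholders of the same general form as published regressions; they are not the relations fitted to the 124 surveyed lakes.

```python
# Hypothetical sketch: area -> mean depth -> volume -> outburst peak discharge.
def mean_depth_from_area(area_m2: float, a: float = 0.1, b: float = 0.4) -> float:
    """Placeholder power-law relation: mean depth (m) = a * area**b."""
    return a * area_m2 ** b

def lake_volume(area_m2: float) -> float:
    """Volume estimate (m^3) = area * estimated mean depth."""
    return area_m2 * mean_depth_from_area(area_m2)

def peak_discharge(volume_m3: float, c: float = 75.0, d: float = 0.67) -> float:
    """Placeholder regression: Q_peak (m^3/s) = c * (V in millions of m^3)**d."""
    return c * (volume_m3 / 1e6) ** d

area = 250_000.0                              # a 0.25 km^2 lake
v = lake_volume(area)
print(f"estimated volume:         {v / 1e6:.2f} million m^3")
print(f"estimated peak discharge: {peak_discharge(v):.0f} m^3/s")
```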

  12. Next Generation Cloud-based Science Data Systems and Their Implications on Data and Software Stewardship, Preservation, and Provenance

    NASA Astrophysics Data System (ADS)

    Hua, H.; Manipon, G.; Starch, M.

    2017-12-01

    NASA's upcoming missions are expected to generate data volumes at least an order of magnitude larger than those of current missions. A significant increase in data processing, data rates, data volumes, and long-term data archive capabilities is needed. Consequently, new challenges are emerging that impact traditional data and software management approaches. At large scales, next-generation science data systems are exploring the move onto cloud computing paradigms to support these increased needs. New implications, such as costs, data movement, collocation of data systems and archives, and moving processing closer to the data, may result in changes to the stewardship, preservation, and provenance of science data and software. With more science data systems being on-boarded onto cloud computing facilities, we can expect more Earth science data records to be both generated and kept in the cloud. But at large scales, the cost of processing and storing global data may impact architectural and system designs. Data systems will trade the cost of keeping data in the cloud against data life-cycle approaches that move "colder" data back to traditional on-premise facilities. How will this impact data citation and processing-software stewardship? What are the impacts of cloud-based on-demand processing on reproducibility and provenance? Similarly, with more science processing software being moved onto cloud, virtual machine, and container-based approaches, more opportunities arise for improved stewardship and preservation. But will the science community trust data reprocessed years or decades later? We will also explore emerging questions about the stewardship of the science data system software that generates the science data records, both during and after the life of a mission.

  13. Nondestructive Evaluation Techniques for Development and Characterization of Carbon Nanotube Based Superstructures

    NASA Technical Reports Server (NTRS)

    Wincheski, Buzz; Kim, Jae-Woo; Sauti, Godfrey; Wainwright, Elliot; Williams, Phillip; Siochi, Emile J.

    2014-01-01

    Recently, multiple commercial vendors have developed the capability to produce large-scale quantities of high-quality carbon nanotube sheets and yarns. While the materials have found use in electrical shielding applications, development of structural systems composed of a high volume fraction of carbon nanotubes is still lacking. A recent NASA program seeks to address this by prototyping a structural nanotube composite with a strength-to-weight ratio exceeding that of current state-of-the-art carbon fiber composites. Commercially available carbon nanotube sheets, tapes, and yarns are being processed into high-volume-fraction carbon nanotube-polymer nanocomposites. Nondestructive evaluation techniques have been applied throughout this development effort for material characterization and process control. This paper will report on the progress of these efforts, including magnetic characterization of residual catalyst content; Raman scattering characterization of nanotube diameter, defect ratio, and nanotube strain; and polarized Raman scattering for characterization of nanotube alignment.

  14. Analysis of peptides using an integrated microchip HPLC-MS/MS system.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirby, Brian J.; Chirica, Gabriela S.; Reichmuth, David S.

    Hyphenated LC-MS techniques are quickly becoming the standard tool for proteomic analyses. For large homogeneous samples, bulk processing methods and capillary injection and separation techniques are suitable. However, for analysis of small or heterogeneous samples, techniques that can manipulate picoliter samples without dilution are required, or samples will be lost or corrupted; further, static nanospray-type flow rates are required to maximize SNR. Microchip-level integration of sample injection with separation and mass spectrometry allows small-volume analytes to be processed on chip and immediately injected without dilution for analysis. An on-chip HPLC was fabricated using in situ polymerization of both fixed and mobile polymer monoliths. Integration of the chip with a nanospray MS emitter enables identification of peptides by the use of tandem MS. The chip is capable of analyzing very small sample volumes (< 200 pl) in short times (< 3 min).

  15. Stroke volume variation as a guide for fluid resuscitation in patients undergoing large-volume liposuction.

    PubMed

    Jain, Anil Kumar; Khan, Asma M

    2012-09-01

    The potential for fluid overload in large-volume liposuction is a source of serious concern. Fluid management in these patients is controversial and governed by various formulas that have been advanced by many authors. Basically, it is the ratio of what goes into the patient to what comes out. Central venous pressure has been used to monitor fluid therapy. Dynamic parameters, such as stroke volume and pulse pressure variation, are better predictors of volume responsiveness and are superior to static indicators, such as central venous pressure and pulmonary capillary wedge pressure. Stroke volume variation was used in this study to guide fluid resuscitation and compared with resuscitation guided by an intraoperative fluid ratio of 1.2 (i.e., the Rohrich formula). Stroke volume variation was used as a guide for intraoperative fluid administration in 15 patients undergoing large-volume liposuction. In another 15 patients, fluid resuscitation was guided by an intraoperative fluid ratio of 1.2. The amounts of intravenous fluid administered in the two groups were compared. The mean amount of fluid infused was 561 ± 181 ml in the stroke volume variation group and 2383 ± 1208 ml in the intraoperative fluid ratio group. The intraoperative fluid ratio calculated for the stroke volume variation group was 0.936 ± 0.084. All patients maintained hemodynamic parameters (heart rate and systolic, diastolic, and mean blood pressure). Renal and metabolic indices remained within normal limits. Stroke volume variation-guided fluid administration could result in an appropriate amount of intravenous fluid use in patients undergoing large-volume liposuction. Level of evidence: Therapeutic, II.
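    The two guidance rules being compared can be summarized in a short sketch: a dynamic rule that gives fluid only while stroke volume variation (SVV) indicates fluid responsiveness, versus a fixed intraoperative fluid ratio target. The SVV trigger, bolus size, and the exact definition of the ratio (intravenous fluid plus infiltrate divided by total aspirate) are assumptions for illustration, not the study protocol.

```python
# Hypothetical sketch of SVV-guided vs ratio-guided intraoperative fluid dosing.
def svv_guided_bolus(svv_percent: float, bolus_ml: float = 100.0) -> float:
    """Give a small bolus only while SVV suggests the patient is fluid-responsive
    (the 13% threshold is an illustrative assumption)."""
    return bolus_ml if svv_percent > 13.0 else 0.0

def ratio_guided_iv_total(infiltrate_ml: float, aspirate_ml: float,
                          target_ratio: float = 1.2) -> float:
    """Total IV fluid so that (IV + infiltrate) / aspirate reaches the target ratio
    (assumed definition of the intraoperative fluid ratio)."""
    return max(target_ratio * aspirate_ml - infiltrate_ml, 0.0)

print(svv_guided_bolus(16.0))                                    # bolus indicated
print(svv_guided_bolus(8.0))                                     # no bolus
print(ratio_guided_iv_total(infiltrate_ml=3000, aspirate_ml=4000))
```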

  16. On the Large-Scaling Issues of Cloud-based Applications for Earth Science Data

    NASA Astrophysics Data System (ADS)

    Hua, H.

    2016-12-01

    Next-generation science data systems are needed to address the incoming flood of data from new missions such as NASA's SWOT and NISAR, whose SAR data volumes and data throughput rates are an order of magnitude larger than those of present-day missions. Existing missions, such as OCO-2, may also require fast turn-around times for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the processing needs. Additionally, traditional means of procuring hardware on-premise are already limited by facility capacity constraints for these new missions. Experience has shown that embracing efficient cloud computing approaches for large-scale science data systems requires more than just moving existing code to cloud environments. At large cloud scales, we need to deal with scaling and cost issues. We present our experiences deploying multiple instances of our hybrid-cloud computing science data system (HySDS) to support large-scale processing of Earth science data products. We will explore optimization approaches for getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were used to run processing on Amazon's spot market, which can potentially offer 75%-90% cost savings but provides an unpredictable computing environment driven by market forces.

  17. Process Performance of Optima XEx Single Wafer High Energy Implanter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, J. H.; Yoon, Jongyoon; Kondratenko, S.

    2011-01-07

    To meet the process requirements for well formation in future CMOS memory production, high energy implanters require more robust angle, dose, and energy control while maintaining high productivity. The Optima XEx high energy implanter meets these requirements by integrating a traditional LINAC beamline with a robust single wafer handling system. To achieve beam angle control, Optima XEx can control both the horizontal and vertical beam angles to within 0.1 degrees using advanced beam angle measurement and correction. Accurate energy calibration and energy trim functions accelerate process matching by eliminating energy calibration errors. The large volume process chamber and UDC (upstream dose control) using Faraday cups outside of the process chamber precisely control implant dose regardless of any chamber pressure increase due to PR (photoresist) outgassing. An optimized RF LINAC accelerator improves reliability and enables singly charged phosphorus and boron energies up to 1200 keV and 1500 keV, respectively, with higher beam currents. A new single wafer endstation combined with increased beam performance leads to overall increased productivity. We report on the advanced performance of Optima XEx observed during tool installation and volume production at an advanced memory fab.

  18. High Volume Manufacturing and Field Stability of MEMS Products

    NASA Astrophysics Data System (ADS)

    Martin, Jack

    Low volume MEMS/NEMS production is practical when an attractive concept is implemented with business, manufacturing, packaging, and test support. Moving beyond this to high volume production adds requirements on design, process control, quality, product stability, market size, market maturity, capital investment, and business systems. In a broad sense, this chapter uses a case study approach: It describes and compares the silicon-based MEMS accelerometers, pressure sensors, image projection systems, and gyroscopes that are in high volume production. Although they serve several markets, these businesses have common characteristics. For example, the manufacturing lines use automated semiconductor equipment and standard material sets to make consistent products in large quantities. Standard, well controlled processes are sometimes modified for a MEMS product. However, novel processes that cannot run with standard equipment and material sets are avoided when possible. This reliance on semiconductor tools, as well as the organizational practices required to manufacture clean, particle-free products partially explains why the MEMS market leaders are integrated circuit manufacturers. There are other factors. MEMS and NEMS are enabling technologies, so it can take several years for high volume applications to develop. Indeed, market size is usually a strong function of price. This becomes a vicious circle, because low price requires low cost - a result that is normally achieved only after a product is in high volume production. During the early years, IC companies reduced cost and financial risk by using existing facilities for low volume MEMS production. As a result, product architectures are partially determined by capabilities developed for previous products. This chapter includes a discussion of MEMS product architecture with particular attention to the impact of electronic integration, packaging, and surfaces. Packaging and testing are critical, because they are significant factors in MEMS product cost. These devices have extremely high surface/volume ratios, so performance and stability may depend on the control of surface characteristics after packaging. Looking into the future, the competitive advantage of IC suppliers will decrease as small companies learn to integrate MEMS/NEMS devices on CMOS foundry wafers. Packaging challenges still remain, because most MEMS/NEMS products must interact with the environment without degrading stability or reliability. Generic packaging solutions are unlikely. However, packaging subcontractors recognize that MEMS/NEMS is a growth opportunity. They will spread the overhead burden of high-capital-cost-facilities by developing flexible processes in order to package several types of moderate volume integrated MEMS/NEMS products on the same equipment.

  19. High Volume Manufacturing and Field Stability of MEMS Products

    NASA Astrophysics Data System (ADS)

    Martin, Jack

    Low volume MEMS/NEMS production is practical when an attractive concept is implemented with business, manufacturing, packaging, and test support. Moving beyond this to high volume production adds requirements on design, process control, quality, product stability, market size, market maturity, capital investment, and business systems. In a broad sense, this chapter uses a case study approach: It describes and compares the silicon-based MEMS accelerometers, pressure sensors, image projection systems, and gyroscopes that are in high volume production. Although they serve several markets, these businesses have common characteristics. For example, the manufacturing lines use automated semiconductor equipment and standard material sets to make consistent products in large quantities. Standard, well controlled processes are sometimes modified for a MEMS product. However, novel processes that cannot run with standard equipment and material sets are avoided when possible. This reliance on semiconductor tools, as well as the organizational practices required to manufacture clean, particle-free products partially explains why the MEMS market leaders are integrated circuit manufacturers. There are other factors. MEMS and NEMS are enabling technologies, so it can take several years for high volume applications to develop. Indeed, market size is usually a strong function of price. This becomes a vicious circle, because low price requires low cost - a result that is normally achieved only after a product is in high volume production. During the early years, IC companies reduced cost and financial risk by using existing facilities for low volume MEMS production. As a result, product architectures are partially determined by capabilities developed for previous products. This chapter includes a discussion of MEMS product architecture with particular attention to the impact of electronic integration, packaging, and surfaces. Packaging and testing are critical, because they are significant factors in MEMS product cost. These devices have extremely high surface/volume ratios, so performance and stability may depend on the control of surface characteristics after packaging. Looking into the future, the competitive advantage of IC suppliers will decrease as small companies learn to integrate MEMS/NEMS devices on CMOS foundry wafers. Packaging challenges still remain, because most MEMS/NEMS products must interact with the environment without degrading stability or reliability. Generic packaging solutions are unlikely. However, packaging subcontractors recognize that MEMS/NEMS is a growth opportunity. They will spread the overhead burden of high-capital-cost-facilities by developing flexible processes in order to package several types of moderate volume integrated MEMS/NEMS products on the same equipment.

  20. Surgical volume-to-outcome relationship and monitoring of technical performance in pediatric cardiac surgery.

    PubMed

    Kalfa, David; Chai, Paul; Bacha, Emile

    2014-08-01

    A significant inverse relationship of surgical institutional and surgeon volumes to outcome has been demonstrated in many high-stakes surgical specialties. By and large, the same results were found in pediatric cardiac surgery, for which a more thorough analysis has shown that this relationship depends on case complexity and type of surgical procedures. Lower-volume programs tend to underperform larger-volume programs as case complexity increases. High-volume pediatric cardiac surgeons also tend to have better results than low-volume surgeons, especially at the more complex end of the surgery spectrum (e.g., the Norwood procedure). Nevertheless, this trend for lower mortality rates at larger centers is not universal. All larger programs do not perform better than all smaller programs. Moreover, surgical volume seems to account for only a small proportion of the overall between-center variation in outcome. Intraoperative technical performance is one of the most important parts, if not the most important part, of the therapeutic process and a critical component of postoperative outcome. Thus, the use of center-specific, risk-adjusted outcome as a tool for quality assessment together with monitoring of technical performance using a specific score may be more reliable than relying on volume alone. However, the relationship between surgical volume and outcome in pediatric cardiac surgery is strong enough that it ought to support adapted and well-balanced health care strategies that take advantage of the positive influence that higher center and surgeon volumes have on outcome.

  1. Synthetic carbohydrate: An aid to nutrition in the future

    NASA Technical Reports Server (NTRS)

    Berman, G. A. (Editor); Murashige, K. H. (Editor)

    1973-01-01

    The synthetic production of carbohydrate on a large scale is discussed. Three possible nonagricultural methods of making starch are presented in detail and discussed. The simplest of these, the hydrolysis of cellulose wastes to glucose followed by polymerization to starch, appears a reasonable and economic supplement to agriculture at the present time. The conversion of fossil fuels to starch was found to be not competitive with agriculture at the present time, but tractable enough to allow a reasonable plant design to be made. A reconstruction of the photosynthetic process using isolated enzyme systems proved technically much more difficult than either of the other two processes. Particular difficulties relate to the replacement of expensive energy carrying compounds, separation of similar materials, and processing of large reactant volumes. Problem areas were pinpointed, and technological progress necessary to permit such a system to become practical is described.

  2. Vectorization with SIMD extensions speeds up reconstruction in electron tomography.

    PubMed

    Agulleiro, J I; Garzón, E M; García, I; Fernández, J J

    2010-06-01

    Electron tomography allows structural studies of cellular structures at molecular detail. Large 3D reconstructions are needed to meet the resolution requirements. The processing time to compute these large volumes may be considerable, and so high-performance computing techniques have traditionally been used. This work presents a vector approach to tomographic reconstruction that relies on the exploitation of the SIMD extensions available in modern processors in combination with other single-processor optimization techniques. This approach succeeds in producing full-resolution tomograms with a significant reduction in processing time, as evaluated with the most common reconstruction algorithms, namely WBP and SIRT. The main advantage stems from the fact that this approach runs on standard computers without the need for specialized hardware, which facilitates the development, use, and management of programs. Future trends in processor design open excellent opportunities for vector processing with processors' SIMD extensions in the field of 3D electron microscopy.
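    The paper's speed-ups come from CPU SIMD extensions inside a compiled reconstruction code; as a loose, language-level analogy of why vectorizing the per-voxel inner loop pays off, the NumPy sketch below contrasts a scalar accumulation with a vectorized one for a backprojection-style update. It is not the authors' implementation.

```python
# Analogy only: scalar vs vectorized accumulation of one projection into a volume.
import time
import numpy as np

slices, rows, cols = 64, 256, 256
volume = np.zeros((slices, rows, cols), dtype=np.float32)
projection = np.random.rand(rows, cols).astype(np.float32)
weight = 0.5

# Scalar accumulation: one voxel at a time (slow in any interpreted inner loop).
t0 = time.perf_counter()
for s in range(slices):
    for r in range(rows):
        for c in range(cols):
            volume[s, r, c] += weight * projection[r, c]
scalar_t = time.perf_counter() - t0

# Vectorized accumulation: whole planes at once via broadcasting.
volume[:] = 0.0
t0 = time.perf_counter()
volume += weight * projection            # broadcast over the slice axis
vector_t = time.perf_counter() - t0

print(f"scalar: {scalar_t:.2f} s, vectorized: {vector_t:.4f} s")
```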

  3. The Use of Cryogenically Cooled 5A Molecular Sieves for Large Volume Reduction of Tritiated Hydrogen Gas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Antoniazzi, A.B.; Bartoszek, F.E.; Sherlock, A.M.

    2006-07-01

    A commercial hydrogen isotope separation system based on gas chromatography (AGC-ISS) has been built. The system operates in two modes: stripping and volume reduction. The purpose of the stripping mode is to reduce a large volume of tritiated hydrogen gas to a small volume of tritium-rich hydrogen gas. The results here illustrate the effectiveness of the AGC-ISS in the stripping and volume reduction phases. Column readiness for hydrogen isotope separation is confirmed by room temperature air separation tests. Production runs were initially carried out using natural levels of deuterium (110-160 ppm) in high purity hydrogen. After completion of the deuterium/hydrogen runs, the system began operations with tritiated hydrogen. The paper presents details of the AGC-ISS design and results of tritium tests. The heart of the AGC-ISS consists of two packed columns (9 m long, 3.8 cm OD) containing 5A molecular sieve material of 40/60 mesh size. Each column has 5 individually controlled heaters along its length and is coiled around an inverted inner dewar. The coiled column and inner dewar are both contained within an outer dewar. In this arrangement liquid nitrogen, used to cryogenically cool the columns, flows into and out of the annular space defined by the two dewars, allowing for alternate heating and cooling cycles. Tritiated hydrogen feed is injected in batch quantities. The batch size is variable, with the maximum quantity restricted by the tritium concentration in the exhausted hydrogen. The stripping operations can be carried out in fully automated mode or in fully manual mode. The average cycle time between injections is about 75 minutes. To date, the maximum throughput achieved is 10.5 m³/day. A total of 37.8 m³ of tritiated hydrogen has been processed during commissioning. The system has demonstrated that venting of >99.95% of the feed gas is possible while retaining 99.98% of the tritium. At a maximum tritium concentration of ~7 GBq/m³ (190 mCi/m³), processing tritiated hydrogen gas at a rate of 8.1 m³ (NTP)/day results in an average tritium concentration in the process effluent line of 1.4 MBq/m³ (37 µCi/m³). The average process exhaust flow, split between helium and hydrogen, is 10.6 litre/min. Product from the stripping phase is stored on a 5 kg depleted uranium bed. A 250 g depleted uranium bed is available for storage of enriched product. Several ionization-type tritium sensors are located throughout the process to control emissions, control valve switching, and monitor the evolution of tritiated species from the columns. (authors)

  4. Space Construction Automated Fabrication Experiment Definition Study (SCAFEDS). Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The techniques, processes, and equipment required for automatic fabrication and assembly of structural elements in space using the space shuttle as a launch vehicle and construction base were investigated. Additional construction/systems/operational techniques, processes, and equipment which can be developed/demonstrated in the same program to provide further risk reduction benefits to future large space systems were included. Results in the areas of structure/materials, fabrication systems (beam builder, assembly jig, and avionics/controls), mission integration, and programmatics are summarized. Conclusions and recommendations are given.

  5. Polyglot Programming in Applications Used for Genetic Data Analysis

    PubMed Central

    Nowak, Robert M.

    2014-01-01

    Applications used for the analysis of genetic data process large volumes of data with complex algorithms. High performance, flexibility, and a user interface with a web browser are required by these solutions, which can be achieved by using multiple programming languages. In this study, I developed a freely available framework for building software to analyze genetic data, which uses C++, Python, JavaScript, and several libraries. This system was used to build a number of genetic data processing applications and it reduced the time and costs of development. PMID:25197633

  6. Polyglot programming in applications used for genetic data analysis.

    PubMed

    Nowak, Robert M

    2014-01-01

    Applications used for the analysis of genetic data process large volumes of data with complex algorithms. High performance, flexibility, and a user interface with a web browser are required by these solutions, which can be achieved by using multiple programming languages. In this study, I developed a freely available framework for building software to analyze genetic data, which uses C++, Python, JavaScript, and several libraries. This system was used to build a number of genetic data processing applications and it reduced the time and costs of development.

  7. System of Programmed Modules for Measuring Photographs with a Gamma-Telescope

    NASA Technical Reports Server (NTRS)

    Averin, S. A.; Veselova, G. V.; Navasardyan, G. V.

    1978-01-01

    Physical experiments using tracking cameras resulted in hundreds of thousands of stereo photographs of events being received. To process such a large volume of information, automatic and semiautomatic measuring systems are required. At the Institute of Space Research of the Academy of Science of the USSR, a system for processing film information from the spark gamma-telescope was developed. The system is based on a BPS-75 projector in line with the minicomputer Elektronika 1001. The report describes this system. The various computer programs available to the operators are discussed.

  8. Study for identification of beneficial uses of Space (BUS). Volume 2: Technical report. Book 1: Development and business analysis of space processed isoenzymes

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A separation method to provide reasonable yields of high-specificity isoenzymes for the purpose of large-scale, early clinical diagnosis of diseases and organic damage such as myocardial infarction, hepatoma, muscular dystrophy, and infectious disorders is presented. Preliminary development plans are summarized. An analysis of required research and development and production resources is included. The costs of such resources and the potential profitability of a commercial space processing opportunity for electrophoretic separation of high-specificity isoenzymes are reviewed.

  9. Application of evolutionary games to modeling carcinogenesis.

    PubMed

    Swierniak, Andrzej; Krzeslak, Michal

    2013-06-01

    We review a quite large volume of literature concerning mathematical modelling of processes related to carcinogenesis and the growth of cancer cell populations based on the theory of evolutionary games. This review, although partly idiosyncratic, covers such major areas of cancer-related phenomena as production of cytotoxins, avoidance of apoptosis, production of growth factors, motility and invasion, and intra- and extracellular signaling. We discuss the results of other authors and append to them some additional results of our own simulations dealing with the possible dynamics and/or spatial distribution of the processes discussed.

  10. Trends in Segregation of Hispanic Students in Major School Districts Having Large Hispanic Enrollment. Ethnographic Case Studies, Volume II. Final Report.

    ERIC Educational Resources Information Center

    Aspira, Inc., New York, NY.

    School desegregation did not lead to greater understanding of the Hispanic community by white educational personnel in two school districts analyzed to document the desegregation process and the impact of school desegregation on the Hispanic community. Each district was in a white-controlled, tri-ethnic community in its second year of successful…

  11. Effect of Hardwood Sawmill Edging and Trimming Practices on Furniture Part Production

    Treesearch

    D. Earl Kline; Carmen Regalado; Eugene M. Wengert; Fred M. Lamb; Philip A. Araman

    1993-01-01

    In a recent edging and trimming study at three hardwood sawmills, it was observed that the lumber volume produced was approximately 10 percent less than would be necessary to make the most valuable lumber. Furthermore, the excess portion of wood that was removed from the edging and trimming process contained a large percentage of clear wood. In light of rising costs...

  12. Concepts for on-board satellite image registration. Volume 2: IAS prototype performance evaluation standard definition

    NASA Astrophysics Data System (ADS)

    Daluge, D. R.; Ruedger, W. H.

    1981-06-01

    Problems encountered in testing onboard signal processing hardware designed to achieve radiometric and geometric correction of satellite imaging data are considered. These include obtaining representative image and ancillary data for simulation and the transfer and storage of a large quantity of image data at very high speed. The high resolution, high speed preprocessing of LANDSAT-D imagery is considered.

  13. Hydrodynamics of the Fluid Filtrate on Drilling-In

    NASA Astrophysics Data System (ADS)

    Abbasov, É. M.; Agaeva, N. A.

    2014-01-01

    The volume of the liquid penetrating into the formation after drilling-in has been determined on the basis of theoretical investigations. The dynamics of change in the bottom-hole pressure has been determined in this process. It has been shown that because of the water hammer, the bottom-hole pressure can be doubled in the presence of large fractures and pores closer to the well-bottom zone.
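
    The water-hammer surge referred to here is commonly estimated with the Joukowsky relation Δp = ρ·c·Δv. The sketch below is a generic illustration of that relation only; the fluid density, wave speed and velocity change are assumed values, not figures from the paper.

```python
# Generic Joukowsky water-hammer estimate, dp = rho * c * dv.
# All three input values are assumptions chosen only for illustration.
rho = 1200.0   # drilling-fluid density, kg/m^3 (assumed)
c = 1200.0     # pressure-wave speed in the fluid column, m/s (assumed)
dv = 1.5       # abrupt change in flow velocity, m/s (assumed)

dp = rho * c * dv   # pressure surge in Pa
print(f"Joukowsky surge ~ {dp / 1e6:.1f} MPa")
```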

  14. Landslides and Landscape Evolution

    NASA Astrophysics Data System (ADS)

    Densmore, A. L.; Hovius, N.

    2017-12-01

    Landslides have long been recognised as a major hazard, and are a common product of both large earthquakes and rainstorms. Our appreciation for landslides as agents of erosion and land surface evolution, however, is much more recent. Only in the last twenty years have we come to understand the critical role that landslides play at the landscape scale: in allowing hillslopes to keep pace with fluvial incision, in supplying sediment to channel networks and sedimentary basins, in divide migration, and in setting the basic structure of the landscape. This perspective has been made possible in part by repeat remote sensing and new ways of visualising the land surface, and by extending our understanding of failure processes to the landscape scale; but it is also true that the big jumps in our knowledge have been triggered by large events, such as the 1999 Chi-Chi and 2008 Wenchuan earthquakes. Thanks in part to a relative handful of such case studies, we now have a better idea of the spatial distribution of landslides that are triggered in large events, the volume of sediment that they mobilise, the time scales over which that sediment is mobilised and evacuated, and the overall volume balance between erosion and tectonic processes in the growth of mountainous topography. There remain, however, some major challenges that must still be overcome. Estimates of landslide volume remain highly uncertain, as does our ability to predict the evolution of hillslope propensity to failure after a major triggering event, the movement of landslide sediment (especially the coarse fraction that is transported as bedload), and the impact of landslides on both long-term erosion rates and tectonic processes. The limited range of case studies also means that we struggle to predict outcomes for triggering events in different geological settings, such as loess landscapes or massive lithologies. And the perspective afforded by taking a landscape-scale view has yet to be fully reflected in our approach to landslide hazard. We close by outlining some promising future research directions by which these challenges might be overcome.

  15. Simplification and validation of a large volume polyurethane foam sampler for the analysis of persistent hydrophobic compounds in drinking water.

    PubMed

    Choi, J W; Lee, J H; Moon, B S; Kannan, K

    2008-08-01

    The use of a large-volume polyurethane foam (PUF) sampler was validated for rapid extraction of persistent organic pollutants (POPs), such as polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs), in raw water and treated water from drinking water plants. To validate the recovery of target compounds in the sampling process, a (37)Cl-labeled standard was spiked into the 1st PUF plug prior to filtration. An accelerated solvent extraction method, used as a pressurized liquid extractor (PLE), was optimized to extract the PUF plug. For sample preparation, tandem column chromatography (TCC) clean-up was used for rapid analysis. The recoveries of labeled compounds in the analytical method were 80-110% (n = 9). The optimized PUF-PLE-TCC method was applied in the analysis of raw water and treated potable water from seven drinking water plants in South Korea. The sample volume used was between 18 and 102 L for raw water at a flow rate of 0.4-2 L min(-1), and between 95 and 107 L for treated water at a flow rate of 1.5-2.2 L min(-1). The limit of quantitation (LOQ) was a function of sample volume and decreased with increasing sample volume. The LOQ of PCDD/Fs in raw waters analyzed by this method was 3-11 times lower than that described for the large-size disk-type solid phase extraction (SPE) method. The LOQs of PCDD/F congeners in raw water and treated water were 0.022-3.9 ng L(-1) and 0.018-0.74 ng L(-1), respectively. Octachlorinated dibenzo-p-dioxin (OCDD) was found in some raw water samples, although its concentrations were well below the tentative criterion set by the Japanese Environmental Ministry for drinking water. OCDD was below the LOQ in the treated drinking water.
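
    The statement that the LOQ decreases with increasing sample volume follows from dividing a fixed quantifiable mass by the number of litres sampled. The sketch below illustrates only that inverse relationship; the assumed detection mass is arbitrary and not a value from the study.

```python
# Illustration of LOQ scaling inversely with sampled volume: a fixed smallest
# quantifiable mass spread over more litres gives a lower concentration LOQ.
# The detection mass below is an arbitrary assumption, not the study's value.
detection_mass_ng = 2.0                      # assumed smallest quantifiable mass (ng)
for volume_l in (18, 50, 102):               # volumes in the range reported for raw water
    print(f"{volume_l:>4} L sampled -> LOQ ~ {detection_mass_ng / volume_l:.3f} ng/L")
```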

  16. SU-G-IeP1-12: Size Selective Arterial Cerebral Blood Volume Mapping Using Multiple Inversion Time Arterial Spin Labeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jung, Y; Johnston, M; Whitlow, C

    Purpose: To demonstrate the feasibility of a novel method for size-specific arterial cerebral blood volume (aCBV) mapping using pseudo-continuous arterial spin labeling (PCASL) with multiple TIs. Methods: Multiple PCASL images were obtained from a subject with TIs of [300, 400, 500, 600, 700, 800, 900, 1000, 1500, 2000, 2500, 3000, 3500, 4000] ms. Each TI pair was averaged six times. Two scans were performed: one without a flow crusher gradient and the other with a crusher gradient (10 cm/s in three directions) to remove signals from large arteries. Scan times were 5 min without a crusher gradient and 5.5 min with a crusher gradient. A non-linear fitting algorithm finds the minimum mean squared solution of per-voxel aCBV, cerebral blood flow, and arterial transit time, and fits the data to a hemodynamic model that represents the superposition of blood volume and flow components within a single voxel. Results: aCBV maps with a crusher gradient represent signals from medium and small sized arteries, while those without a crusher gradient represent signals from all sized arteries, indicating that flow crusher gradients can be effectively employed to achieve size-specific aCBV mapping. Regardless of flow crusher, the CBF and ATT maps are very similar in appearance. Conclusion: Quantitative size-selective blood volume mapping controlled by a flow crusher is feasible without additional information because the ASL quantification process does not require an arterial input function measured from a large artery. Signals from large arteries do not interfere with size-specific aCBV mapping for applications in which only medium or small arteries are of interest.
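
    The per-voxel fit described above can be pictured as a non-linear least-squares problem over three parameters (aCBV, CBF, ATT). The sketch below fits a deliberately simplified two-component signal model with SciPy; the functional forms, blood T1 and parameter scales are assumptions for illustration and do not reproduce the authors' hemodynamic model.

```python
# Schematic multi-TI ASL fit: an arterial (blood-volume-like) term plus a
# tissue (flow-like) term delivered after the arterial transit time (ATT).
# The model, T1 value and parameter scales are simplifying assumptions.
import numpy as np
from scipy.optimize import curve_fit

TIS = np.array([0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0,
                1.5, 2.0, 2.5, 3.0, 3.5, 4.0])    # inversion times, s
T1B = 1.65                                        # assumed blood T1, s

def asl_signal(ti, acbv, cbf, att):
    arterial = acbv * np.exp(-ti / T1B)                              # blood-volume component
    tissue = cbf * np.clip(ti - att, 0.0, None) * np.exp(-ti / T1B)  # delivered after ATT
    return arterial + tissue

rng = np.random.default_rng(0)
truth = (0.02, 0.01, 0.8)                         # aCBV-like, CBF-like, ATT (s)
signal = asl_signal(TIS, *truth) + rng.normal(0.0, 5e-4, TIS.size)

fit, _ = curve_fit(asl_signal, TIS, signal, p0=(0.01, 0.005, 1.0),
                   bounds=([0.0, 0.0, 0.2], [0.1, 0.1, 3.0]))
print("fitted aCBV-like, CBF-like, ATT:", np.round(fit, 4))
```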

  17. The geomorphic function and characteristics of large woody debris in low gradient rivers, coastal Maine, USA

    NASA Astrophysics Data System (ADS)

    Magilligan, F. J.; Nislow, K. H.; Fisher, G. B.; Wright, J.; Mackey, G.; Laser, M.

    2008-05-01

    The role, function, and importance of large woody debris (LWD) in rivers depend strongly on environmental context and land use history. The coastal watersheds of central and northern Maine, northeastern U.S., are characterized by low gradients, moderate topography, and minimal influence of mass wasting processes, along with a history of intensive commercial timber harvest. In spite of the ecological importance of these rivers, which contain the last wild populations of Atlantic salmon ( Salmo salar) in the U.S., we know little about LWD distribution, dynamics, and function in these systems. We conducted a cross-basin analysis in seven coastal Maine watersheds, documenting the size, frequency, volume, position, and orientation of LWD, as well as the association between LWD, pool formation, and sediment storage. In conjunction with these LWD surveys, we conducted extensive riparian vegetation surveys. We observed very low LWD frequencies and volumes across the 60 km of rivers surveyed. Frequency of LWD ≥ 20 cm diameter ranged from 15-50 pieces km - 1 and wood volumes were commonly < 10-20 m 3 km - 1 . Moreover, most of this wood was located in the immediate low-flow channel zone, was oriented parallel to flow, and failed to span the stream channel. As a result, pool formation associated with LWD is generally lacking and < 20% of the wood was associated with sediment storage. Low LWD volumes are consistent with the relatively young riparian stands we observed, with the large majority of trees < 20 cm DBH. These results strongly reflect the legacy of intensive timber harvest and land clearing and suggest that the frequency and distribution of LWD may be considerably less than presettlement and/or future desired conditions.

  18. Submarine pipeline on-bottom stability. Volume 2: Software and manuals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-12-01

    The state-of-the-art in pipeline stability design has been changing very rapidly in recent years. The physics governing on-bottom stability are much better understood now than they were eight years ago, due largely to research and large-scale model tests sponsored by PRCI. Analysis tools utilizing this new knowledge have been developed. These tools provide the design engineer with a rational approach for weight coating design, which he can use with confidence because the tools have been developed based on full scale and near full scale model tests. These tools represent the state-of-the-art in stability design and model the complex behavior of pipes subjected to both wave and current loads. These include: hydrodynamic forces which account for the effect of the wake (generated by flow over the pipe) washing back and forth over the pipe in oscillatory flow; and the embedment (digging) which occurs as a pipe resting on the seabed is exposed to oscillatory loadings and small oscillatory deflections. This report has been developed as a reference handbook for use in on-bottom pipeline stability analysis. It consists of two volumes. Volume one is devoted to descriptions of the various aspects of the problem: the pipeline design process; ocean physics, wave mechanics, hydrodynamic forces, and meteorological data determination; geotechnical data collection and soil mechanics; and stability design procedures. Volume two describes, lists, and illustrates the analysis software. Diskettes containing the software and examples of the software are also included in Volume two.

  19. Automatic segmentation of tumor-laden lung volumes from the LIDC database

    NASA Astrophysics Data System (ADS)

    O'Dell, Walter G.

    2012-03-01

    The segmentation of the lung parenchyma is often a critical pre-processing step prior to application of computer-aided detection of lung nodules. Segmentation of the lung volume can dramatically decrease computation time and reduce the number of false positive detections by excluding extra-pulmonary tissue from consideration. However, while many algorithms are capable of adequately segmenting the healthy lung, none have been demonstrated to work reliably well on tumor-laden lungs. Of particular challenge is to preserve tumorous masses attached to the chest wall, mediastinum or major vessels. In this role, lung volume segmentation comprises an important computational step that can adversely affect the performance of the overall CAD algorithm. An automated lung volume segmentation algorithm has been developed with the goals of maximally excluding extra-pulmonary tissue while retaining all true nodules. The algorithm comprises a series of tasks including intensity thresholding, 2-D and 3-D morphological operations, 2-D and 3-D floodfilling, and snake-based clipping of nodules attached to the chest wall. It features the ability to (1) exclude trachea and bowels, (2) snip large attached nodules using snakes, (3) snip small attached nodules using dilation, (4) preserve large masses fully internal to the lung volume, (5) account for basal aspects of the lung, where in a 2-D slice the lower sections appear to be disconnected from the main lung, and (6) achieve separation of the right and left hemi-lungs. The algorithm was developed and trained on the first 100 datasets of the LIDC image database.
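
    As a rough illustration of the thresholding-plus-morphology stages listed above, the sketch below builds a coarse lung mask from a synthetic CT-like volume with scipy.ndimage. The threshold, structuring-element size and border heuristic are assumptions for illustration; the snake-based clipping and the other published steps are not reproduced.

```python
# Minimal threshold + morphology lung-mask sketch (illustrative assumptions
# throughout; this is not the published LIDC segmentation algorithm).
import numpy as np
from scipy import ndimage

def rough_lung_mask(ct_volume, air_threshold=-400):
    """Return a coarse binary lung mask from a CT volume in Hounsfield units."""
    mask = ct_volume < air_threshold                      # 1. intensity thresholding
    labels, _ = ndimage.label(mask)                       # 2. drop air touching the border
    border_labels = np.unique(np.concatenate([
        labels[0].ravel(), labels[-1].ravel(),
        labels[:, 0].ravel(), labels[:, -1].ravel(),
        labels[:, :, 0].ravel(), labels[:, :, -1].ravel()]))
    for lab in border_labels:
        mask[labels == lab] = False
    mask = ndimage.binary_closing(mask, structure=np.ones((5, 5, 5)))  # 3. re-attach nodules
    return ndimage.binary_fill_holes(mask)                # 4. fill internal structures

# Synthetic example: a soft-tissue "body" with two air-filled lung regions.
vol = np.full((64, 64, 64), 40, dtype=np.int16)   # soft tissue ~40 HU
vol[10:54, 8:28, 10:54] = -800                    # left lung
vol[10:54, 36:56, 10:54] = -800                   # right lung
print(rough_lung_mask(vol).sum(), "lung voxels kept")
```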

  20. Liposuction: Anaesthesia challenges

    PubMed Central

    Sood, Jayashree; Jayaraman, Lakshmi; Sethi, Nitin

    2011-01-01

    Liposuction is one of the most popular treatment modalities in aesthetic surgery with certain unique anaesthetic considerations. Liposuction is often performed as an office procedure. There are four main types of liposuction techniques based on the volume of infiltration or wetting solution injected, viz dry, wet, superwet, and tumescent technique. The tumescent technique is one of the most common liposuction techniques in which large volumes of dilute local anaesthetic (wetting solution) are injected into the fat to facilitate anaesthesia and decrease blood loss. The amount of lignocaine injected may be very large, approximately 35-55 mg/kg, raising concerns regarding local anaesthetic toxicity. Liposuction can be of two types according to the volume of solution aspirated: High volume (>4,000 ml aspirated) or low volume (<4,000 ml aspirated). While small volume liposuction may be done under local/monitored anaesthesia care, large-volume liposuction requires general anaesthesia. As a large volume of wetting solution is injected into the subcutaneous tissue, the intraoperative fluid management has to be carefully titrated along with haemodynamic monitoring and temperature control. Assessment of blood loss is difficult, as it is mixed with the aspirated fat. Since most obese patients opt for liposuction as a quick method to lose weight, all concerns related to obesity need to be addressed in a preoperative evaluation. PMID:21808392

  1. Strategies for Interactive Visualization of Large Scale Climate Simulations

    NASA Astrophysics Data System (ADS)

    Xie, J.; Chen, C.; Ma, K.; Parvis

    2011-12-01

    With the advances in computational methods and supercomputing technology, climate scientists are able to perform large-scale simulations at unprecedented resolutions. These simulations produce data that are time-varying, multivariate, and volumetric, and the data may contain thousands of time steps with each time step having billions of voxels and each voxel recording dozens of variables. Visualizing such time-varying 3D data to examine correlations between different variables thus becomes a daunting task. We have been developing strategies for interactive visualization and correlation analysis of multivariate data. The primary task is to find connection and correlation among data. Given the many complex interactions among the Earth's oceans, atmosphere, land, ice and biogeochemistry, and the sheer size of observational and climate model data sets, interactive exploration helps identify which processes matter most for a particular climate phenomenon. We may consider time-varying data as a set of samples (e.g., voxels or blocks), each of which is associated with a vector of representative or collective values over time. We refer to such a vector as a temporal curve. Correlation analysis thus operates on temporal curves of data samples. A temporal curve can be treated as a two-dimensional function where the two dimensions are time and data value. It can also be treated as a point in the high-dimensional space. In this case, to facilitate effective analysis, it is often necessary to transform temporal curve data from the original space to a space of lower dimensionality. Clustering and segmentation of temporal curve data in the original or transformed space provides us with a way to categorize and visualize data of different patterns, which reveals connection or correlation of data among different variables or at different spatial locations. We have employed the power of GPUs to enable interactive correlation visualization for studying the variability and correlations of a single or a pair of variables. It is desired to create a succinct volume classification that summarizes the connection among all correlation volumes with respect to various reference locations. Because a reference location must correspond to a voxel position, the number of correlation volumes equals the total number of voxels. A brute-force solution takes all correlation volumes as the input and classifies their corresponding voxels according to their correlation volumes' distance. For large-scale time-varying multivariate data, calculating all these correlation volumes on-the-fly and analyzing the relationships among them is not feasible. We have developed a sampling-based approach for volume classification in order to reduce the cost of computing the correlation volumes. Users are able to employ their domain knowledge in selecting important samples. The result is a static view that captures the essence of correlation relationships; i.e., for all voxels in the same cluster, their corresponding correlation volumes are similar. This sampling-based approach enables us to obtain an approximation of correlation relations in a cost-effective manner, thus leading to a scalable solution to investigate large-scale data sets. These techniques empower climate scientists to study large data sets from their simulations.
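
    A minimal numerical sketch of the sampling-based idea described above: give every voxel a temporal curve, compute its correlation with a handful of sampled reference curves, and cluster voxels by that correlation signature. The synthetic data, the sample count and the use of k-means are assumptions for illustration, not the authors' GPU implementation.

```python
# Sampling-based correlation-volume classification, sketched with random data.
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(0)
T, n_voxels = 50, 16 * 16 * 16
data = rng.standard_normal((T, n_voxels))        # one temporal curve per voxel

refs = rng.choice(n_voxels, size=8, replace=False)   # sampled reference voxels

# Pearson correlation of every voxel's curve with each reference curve.
z = (data - data.mean(axis=0)) / data.std(axis=0)
corr = z.T @ z[:, refs] / T                      # shape: (n_voxels, n_refs)

# Voxels in one cluster have similar correlation volumes w.r.t. the samples.
_, labels = kmeans2(corr, k=5, minit='++', seed=0)
print("cluster sizes:", np.bincount(labels))
```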

  2. Knowledge Discovery and Data Mining in Iran's Climatic Researches

    NASA Astrophysics Data System (ADS)

    Karimi, Mostafa

    2013-04-01

    Advances in measurement technology and data collection mean that databases are getting larger, and large databases require powerful tools for data analysis. The iterative process of acquiring knowledge from information obtained by data processing is carried out in various forms in all scientific fields. However, when the data volume is large, traditional methods cannot cope with many of the problems. In recent years, the use of databases in various scientific fields, especially atmospheric databases in climatology, has expanded. In addition, the increasing amount of data generated by climate models is a challenge for analysis aimed at extracting hidden patterns and knowledge. The approach taken to this problem in recent years uses the process of knowledge discovery and data mining techniques, drawing on concepts from machine learning, artificial intelligence, and expert systems. Data mining is an analytical process for mining massive volumes of data. The ultimate goal of data mining is access to information and, finally, knowledge. Climatology is a science that uses a large variety and volume of data, and the goal of climate data mining is to obtain information from varied and massive atmospheric and non-atmospheric data. In fact, knowledge discovery performs these activities in a logical, predetermined, and almost automatic process. The goal of this research is to study the use of knowledge discovery and data mining techniques in Iranian climate research. To achieve this goal, a descriptive content analysis was carried out, classifying studies by method and topic. The results show that in Iranian climatic research, clustering methods such as k-means and Ward's method are applied most often, and that precipitation and atmospheric circulation patterns are the most frequently addressed topics. Although several studies on geographic and climatic issues have used statistical techniques such as clustering and pattern extraction, given the difference between statistics and data mining one cannot yet say that domestic climate studies have truly employed data mining and knowledge discovery techniques. However, it is necessary to use the KDD approach and data mining techniques in climatic studies, in particular for the interpretation of climate modelling results.
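
    Since the review singles out k-means and Ward's method as the most common techniques, a toy example of the latter is sketched below on a random stations-by-months precipitation matrix. The data, the cluster count and the choice of SciPy are assumptions for illustration only.

```python
# Toy Ward's-method clustering of a stations x months precipitation matrix.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
precip = rng.gamma(shape=2.0, scale=20.0, size=(30, 12))   # 30 stations, 12 monthly means

Z = linkage(precip, method='ward')               # Ward's hierarchical clustering
groups = fcluster(Z, t=3, criterion='maxclust')  # cut into three climatic groups
print("stations per group:", np.bincount(groups)[1:])
```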

  3. Systems aspects of COBE science data compression

    NASA Technical Reports Server (NTRS)

    Freedman, I.; Boggess, E.; Seiler, E.

    1993-01-01

    A general approach to compression of diverse data from large scientific projects has been developed and this paper addresses the appropriate system and scientific constraints together with the algorithm development and test strategy. This framework has been implemented for the COsmic Background Explorer spacecraft (COBE) by retrofitting the existing VAS-based data management system with high-performance compression software permitting random access to the data. Algorithms which incorporate scientific knowledge and consume relatively few system resources are preferred over ad hoc methods. COBE exceeded its planned storage by a large and growing factor and the retrieval of data significantly affects the processing, delaying the availability of data for scientific usage and software test. Embedded compression software is planned to make the project tractable by reducing the data storage volume to an acceptable level during normal processing.

  4. Magnetite-doped polydimethylsiloxane (PDMS) for phosphopeptide enrichment.

    PubMed

    Sandison, Mairi E; Jensen, K Tveen; Gesellchen, F; Cooper, J M; Pitt, A R

    2014-10-07

    Reversible phosphorylation plays a key role in numerous biological processes. Mass spectrometry-based approaches are commonly used to analyze protein phosphorylation, but such analysis is challenging, largely due to the low phosphorylation stoichiometry. Hence, a number of phosphopeptide enrichment strategies have been developed, including metal oxide affinity chromatography (MOAC). Here, we describe a new material for performing MOAC that employs a magnetite-doped polydimethylsiloxane (PDMS), that is suitable for the creation of microwell array and microfluidic systems to enable low volume, high throughput analysis. Incubation time and sample loading were explored and optimized and demonstrate that the embedded magnetite is able to enrich phosphopeptides. This substrate-based approach is rapid, straightforward and suitable for simultaneously performing multiple, low volume enrichments.

  5. Industrial Catalysis: A Practical Guide

    NASA Astrophysics Data System (ADS)

    Farrauto, Robert J.

    Every student of chemistry, material science, and chemical engineering should be schooled in catalysis and catalytic reactions. The reason is quite simple; most products produced in the chemical and petroleum industry utilize catalysts to enhance the rate of reaction and selectivity to desired products. Catalysts are also extensively used to minimize harmful byproduct pollutants in environmental applications. Enhanced reaction rates translate to higher production volumes at lower temperatures with smaller and less exotic materials of construction necessary. When a highly selective catalyst is used, large volumes of desired products are produced with virtually no undesirable byproducts. Gasoline, diesel, home heating oil, and aviation fuels owe their performance quality to catalytic processing used to upgrade crude oil.

  6. Storage and computationally efficient permutations of factorized covariance and square-root information arrays

    NASA Technical Reports Server (NTRS)

    Muellerschoen, R. J.

    1988-01-01

    A unified method to permute vector stored Upper triangular Diagonal factorized covariance and vector stored upper triangular Square Root Information arrays is presented. The method involves cyclic permutation of the rows and columns of the arrays and retriangularization with fast (slow) Givens rotations (reflections). Minimal computation is performed, and a one dimensional scratch array is required. To make the method efficient for large arrays on a virtual memory machine, computations are arranged so as to avoid expensive paging faults. This method is potentially important for processing large volumes of radio metric data in the Deep Space Network.
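
    The core numerical step described above, re-triangularizing an array after its rows and columns are cyclically permuted, can be illustrated with plain Givens rotations. The sketch below works on a dense upper-triangular array with numpy; the vector-stored UD/SRIF layouts and the paging-aware ordering of the paper are not reproduced.

```python
# Re-triangularize a square-root-information-style array after a column
# permutation using Givens rotations (dense numpy illustration only).
import numpy as np

def givens(a, b):
    """Return c, s such that [[c, s], [-s, c]] @ [a, b]^T = [r, 0]^T."""
    r = np.hypot(a, b)
    return (1.0, 0.0) if r == 0 else (a / r, b / r)

def retriangularize(R):
    """Zero the sub-diagonal entries of R in place, column by column."""
    m, n = R.shape
    for j in range(n):
        for i in range(m - 1, j, -1):
            c, s = givens(R[i - 1, j], R[i, j])
            G = np.array([[c, s], [-s, c]])
            R[i - 1:i + 1, :] = G @ R[i - 1:i + 1, :]
    return R

rng = np.random.default_rng(2)
R = np.triu(rng.standard_normal((5, 5)))   # upper-triangular SRIF-style array
perm = [1, 2, 3, 4, 0]                     # cyclic permutation of the states
R_new = retriangularize(R[:, perm].copy())
print(np.round(R_new, 3))                  # upper triangular again
```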

  7. Acoustic Profiling of Bottom Sediments in Large Oil Storage Tanks

    NASA Astrophysics Data System (ADS)

    Svet, V. D.; Tsysar', S. A.

    2018-01-01

    Characteristic features of acoustic profiling of bottom sediments in large oil storage tanks are considered. Basic acoustic parameters of crude oil and bottom sediments are presented. It is shown that, because of the presence of both transition layers in crude oil and strong reverberation effects in oil tanks, the volume of bottom sediments that is calculated from an acoustic surface image is generally overestimated. To reduce the error, additional post-processing of acoustic profilometry data is proposed in combination with additional measurements of viscosity and of the vertical density distribution at several points of the tank.

  8. Enhancing surveillance for hepatitis C through public health informatics.

    PubMed

    Heisey-Grove, Dawn M; Church, Daniel R; Haney, Gillian A; Demaria, Alfred

    2011-01-01

    Disease surveillance for hepatitis C in the United States is limited by the occult nature of many of these infections, the large volume of cases, and limited public health resources. Through a series of discrete processes, the Massachusetts Department of Public Health modified its surveillance system in an attempt to improve timeliness and completeness of reporting and case follow-up of hepatitis C. These processes included clinician-based reporting, electronic laboratory reporting, deployment of a Web-based disease surveillance system, automated triage of pertinent data, and automated character recognition software for case-report processing. These changes have resulted in an increase in the timeliness of reporting.

  9. Material flows generated by pyromet copper smelting

    USGS Publications Warehouse

    Goonan, T.G.

    2005-01-01

    Copper production through smelting generates large volumes of material flows. As copper contained in ore becomes copper contained in concentrate to be fed into the smelting process, it leaves behind an altered landscape, sometimes mine waste, and always mill tailings. Copper concentrate, fluxing materials, fuels, oxygen, recyclables, scrap and water are inputs to the process. Dust (recycled), gases - containing carbon dioxide (CO2) (dissipated) and sulfur dioxide (SO2) (mostly collected, transformed and sold) and slag (discarded or sold) - are among the significant process outputs. This article reports estimates of the flows of these input/output materials for a particular set of smelters studied in some countries.

  10. Cogeneration technology alternatives study. Volume 6: Computer data

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The potential technical capabilities of energy conversion systems in the 1985 - 2000 time period were defined with emphasis on systems using coal, coal-derived fuels or alternate fuels. Industrial process data developed for the large energy consuming industries serve as a framework for the cogeneration applications. Ground rules for the study were established and other necessary equipment (balance-of-plant) was defined. This combination of technical information, energy conversion system data ground rules, industrial process information and balance-of-plant characteristics was analyzed to evaluate energy consumption, capital and operating costs and emissions. Data in the form of computer printouts developed for 3000 energy conversion system-industrial process combinations are presented.

  11. An Approach to Data Center-Based KDD of Remote Sensing Datasets

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Mack, Robert; Wharton, Stephen W. (Technical Monitor)

    2001-01-01

    The data explosion in remote sensing is straining the ability of data centers to deliver the data to the user community, yet many large-volume users actually seek a relatively small information component within the data, which they extract at their sites using Knowledge Discovery in Databases (KDD) techniques. To improve the efficiency of this process, the Goddard Earth Sciences Distributed Active Archive Center (GES DAAC) has implemented a KDD subsystem that supports execution of the user's KDD algorithm at the data center, dramatically reducing the volume that is sent to the user. The data are extracted from the archive in a planned, organized "campaign"; the algorithms are executed, and the output products sent to the users over the network. The first campaign, now complete, has resulted in overall reductions in shipped volume from 3.3 TB to 0.4 TB.

  12. Recent progress in simulating galaxy formation from the largest to the smallest scales

    NASA Astrophysics Data System (ADS)

    Faucher-Giguère, Claude-André

    2018-05-01

    Galaxy formation simulations are an essential part of the modern toolkit of astrophysicists and cosmologists alike. Astrophysicists use the simulations to study the emergence of galaxy populations from the Big Bang, as well as the formation of stars and supermassive black holes. For cosmologists, galaxy formation simulations are needed to understand how baryonic processes affect measurements of dark matter and dark energy. Owing to the extreme dynamic range of galaxy formation, advances are driven by novel approaches using simulations with different tradeoffs between volume and resolution. Large-volume but low-resolution simulations provide the best statistics, while higher-resolution simulations of smaller cosmic volumes can be evolved with self-consistent physics and reveal important emergent phenomena. I summarize recent progress in galaxy formation simulations, including major developments in the past five years, and highlight some key areas likely to drive further advances over the next decade.

  13. Glass Property Data and Models for Estimating High-Level Waste Glass Volume

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vienna, John D.; Fluegel, Alexander; Kim, Dong-Sang

    2009-10-05

    This report describes recent efforts to develop glass property models that can be used to help estimate the volume of high-level waste (HLW) glass that will result from vitrification of Hanford tank waste. The compositions of acceptable and processable HLW glasses need to be optimized to minimize the waste-form volume and, hence, to save cost. A database of properties and associated compositions for simulated waste glasses was collected for developing property-composition models. This database, although not comprehensive, represents a large fraction of data on waste-glass compositions and properties that were available at the time of this report. Glass property-composition models were fit to subsets of the database for several key glass properties. These models apply to a significantly broader composition space than those previously published. These models should be considered for interim use in calculating properties of Hanford waste glasses.

  14. The origin and crust/mantle mass balance of Central Andean ignimbrite magmatism constrained by oxygen and strontium isotopes and erupted volumes

    NASA Astrophysics Data System (ADS)

    Freymuth, Heye; Brandmeier, Melanie; Wörner, Gerhard

    2015-06-01

    Volcanism during the Neogene in the Central Volcanic Zone (CVZ) of the Andes produced (1) stratovolcanoes, (2) rhyodacitic to rhyolitic ignimbrites which reach volumes of generally less than 300 km3 and (3) large-volume monotonous dacitic ignimbrites of up to several thousand cubic kilometres. We present models for the origin of these magma types using O and Sr isotopes to constrain crust/mantle proportions for the large-volume ignimbrites and explore the relationship to the evolution of the Andean crust. Oxygen isotope ratios were measured on phenocrysts in order to avoid the effects of secondary alteration. Our results show a complete overlap in the Sr-O isotope compositions of lavas from stratovolcanoes and low-volume rhyolitic ignimbrites as well as older (>9 Ma) large-volume dacitic ignimbrites. This suggests that the mass balance of crustal and mantle components are largely similar. By contrast, younger (<10 Ma) large-volume dacitic ignimbrites from the southern portion of the Central Andes have distinctly more radiogenic Sr and heavier O isotopes and thus contrast with older dacitic ignimbrites in northernmost Chile and southern Peru. Results of assimilation and fractional crystallization (AFC) models show that the largest chemical changes occur in the lower crust where magmas acquire a base-level geochemical signature that is later modified by middle to upper crustal AFC. Using geospatial analysis, we estimated the volume of these ignimbrite deposits throughout the Central Andes during the Neogene and examined the spatiotemporal pattern of so-called ignimbrite flare-ups. We observe a N-S migration of maximum ages of the onset of large-volume "ignimbrite pulses" through time: Major pulses occurred at 19-24 Ma (e.g. Oxaya, Nazca Group), 13-14 Ma (e.g. Huaylillas and Altos de Pica ignimbrites) and <10 Ma (Altiplano and Puna ignimbrites). Such "flare-ups" represent magmatic production rates of 25 to >70 km3 Ma-1 km-1 (assuming plutonic/volcanic ratios of 1:5) which are additional to, but within the order of, the arc background magmatic flux. Comparing our results to average shortening rates observed in the Andes, we observe a "lag-time" with large-volume eruptions occurring after accelerated shortening. A similar delay exists between the ignimbrite pulses and the subduction of the Juan Fernandez ridge. This is consistent with the idea that large-volume ignimbrite eruptions occurred in the wake of the N-S passage of the ridge after slab steepening has allowed hot asthenospheric mantle to ascend into and cause the melting of the mantle wedge. In our model, the older large-volume dacitic ignimbrites in the northern part of the CVZ have lower (15-37 %) crustal contributions because they were produced at times when the Central Andean crust was thinner and colder, and large-scale melting in the middle crust could not be achieved. Younger ignimbrite flare-ups further south (<10 Ma, >22°S) formed with a significantly higher crustal contribution (22-68 %) because at that time the Andean crust was thicker and hotter and, therefore primed for more extensive crustal melting. The rhyolitic lower-volume ignimbrites are more equally distributed in the CVZ in time and space and are produced by mechanisms similar to those operating below large stratovolcanoes, but at times of higher melt fluxes from the mantle wedge.
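
    The crustal proportions quoted above rest on isotopic mass balance between mantle- and crust-derived end members. The two-component oxygen-isotope mixing sketch below shows the arithmetic; the end-member values are generic textbook assumptions, not the values adopted in the study.

```python
# Two-component delta-18O mixing: fraction of crustal melt needed to shift a
# mantle-derived magma to an observed value. End members are assumed, generic
# values, not those used in the paper.
delta_mantle = 5.7        # per mil, typical mantle-derived melt (assumed)
delta_crust = 12.0        # per mil, assumed crustal melt
delta_magma = 7.5         # per mil, hypothetical measured ignimbrite value

f_crust = (delta_magma - delta_mantle) / (delta_crust - delta_mantle)
print(f"crustal contribution ~ {100 * f_crust:.0f}%")
```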

  15. How far does the CO2 travel beyond a leaky point?

    NASA Astrophysics Data System (ADS)

    Kong, X.; Delshad, M.; Wheeler, M.

    2012-12-01

    Numerous research studies have been carried out to investigate the long-term feasibility of safe storage of large volumes of CO2 in subsurface saline aquifers. The injected CO2 will undergo complex petrophysical and geochemical processes. During these processes, part of the CO2 will be trapped while some will remain as a mobile phase, posing a leakage risk. Comprehensive and accurate characterization of the trapping and leakage mechanisms is critical for assessing the safety of sequestration, and is a challenge in this research area. We have studied different leakage scenarios using realistic aquifer properties, including heterogeneity, and put forward a comprehensive trapping model for CO2 in deep saline aquifers. The reservoir models include several geological layers and caprocks up to the near surface. Leakage scenarios, such as fractures, high-permeability pathways, and abandoned wells, are studied. In order to accurately model the fractures, very fine grids are needed near the fracture. Considering that the aquifer usually has a large volume and the reservoir model needs a large number of grid blocks, simulation is computationally expensive. To deal with this challenge, we carried out the simulations using our in-house parallel reservoir simulator. Our study shows the significance of capillary pressure and permeability-porosity variations on CO2 trapping and leakage. The improved understanding of trapping and leakage will provide confidence in future implementation of sequestration projects.

  16. Record of massive upwellings from the Pacific large low shear velocity province

    NASA Astrophysics Data System (ADS)

    Madrigal, Pilar; Gazel, Esteban; Flores, Kennet E.; Bizimis, Michael; Jicha, Brian

    2016-11-01

    Large igneous provinces, as the surface expression of deep mantle processes, play a key role in the evolution of the planet. Here we analyse the geochemical record and timing of the Pacific Ocean Large Igneous Provinces and preserved accreted terranes to reconstruct the history of pulses of mantle plume upwellings and their relation with a deep-rooted source like the Pacific large low-shear velocity Province during the Mid-Jurassic to Upper Cretaceous. Petrological modelling and geochemical data suggest the need of interaction between these deep-rooted upwellings and mid-ocean ridges in pulses separated by ~10-20 Ma, to generate the massive volumes of melt preserved today as oceanic plateaus. These pulses impacted the marine biota resulting in episodes of anoxia and mass extinctions shortly after their eruption.

  17. Recurrence interval analysis of trading volumes

    NASA Astrophysics Data System (ADS)

    Ren, Fei; Zhou, Wei-Xing

    2010-06-01

    We study the statistical properties of the recurrence intervals τ between successive trading volumes exceeding a certain threshold q . The recurrence interval analysis is carried out for the 20 liquid Chinese stocks covering a period from January 2000 to May 2009, and two Chinese indices from January 2003 to April 2009. Similar to the recurrence interval distribution of the price returns, the tail of the recurrence interval distribution of the trading volumes follows a power-law scaling, and the results are verified by the goodness-of-fit tests using the Kolmogorov-Smirnov (KS) statistic, the weighted KS statistic and the Cramér-von Mises criterion. The measurements of the conditional probability distribution and the detrended fluctuation function show that both short-term and long-term memory effects exist in the recurrence intervals between trading volumes. We further study the relationship between trading volumes and price returns based on the recurrence interval analysis method. It is found that large trading volumes are more likely to occur following large price returns, and the comovement between trading volumes and price returns is more pronounced for large trading volumes.

  18. Recurrence interval analysis of trading volumes.

    PubMed

    Ren, Fei; Zhou, Wei-Xing

    2010-06-01

    We study the statistical properties of the recurrence intervals τ between successive trading volumes exceeding a certain threshold q. The recurrence interval analysis is carried out for the 20 liquid Chinese stocks covering a period from January 2000 to May 2009, and two Chinese indices from January 2003 to April 2009. Similar to the recurrence interval distribution of the price returns, the tail of the recurrence interval distribution of the trading volumes follows a power-law scaling, and the results are verified by the goodness-of-fit tests using the Kolmogorov-Smirnov (KS) statistic, the weighted KS statistic and the Cramér-von Mises criterion. The measurements of the conditional probability distribution and the detrended fluctuation function show that both short-term and long-term memory effects exist in the recurrence intervals between trading volumes. We further study the relationship between trading volumes and price returns based on the recurrence interval analysis method. It is found that large trading volumes are more likely to occur following large price returns, and the comovement between trading volumes and price returns is more pronounced for large trading volumes.
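
    The mechanics of a recurrence-interval analysis like the one above are compact enough to sketch: threshold the volume series at a quantile q, take the waiting times between exceedances, and test the scaled intervals against a reference distribution. The synthetic log-normal volumes, the 95% threshold and the exponential null used below are assumptions for illustration, not the paper's data or fitted form.

```python
# Recurrence intervals between "large" trading volumes, on synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
volumes = rng.lognormal(mean=0.0, sigma=1.0, size=20_000)   # stand-in volume series

q = np.quantile(volumes, 0.95)              # threshold defining large volumes
tau = np.diff(np.flatnonzero(volumes > q))  # recurrence intervals, in ticks
tau_scaled = tau / tau.mean()               # scaled intervals used in scaling analyses

# Kolmogorov-Smirnov comparison against an exponential null (i.i.d. exceedances);
# the paper tests against its fitted interval distribution instead.
d_stat, p_value = stats.kstest(tau_scaled, 'expon')
print(f"mean interval = {tau.mean():.1f} ticks, KS D = {d_stat:.3f}, p = {p_value:.3g}")
```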

  19. Intranasal volume increases with age: Computed tomography volumetric analysis in adults.

    PubMed

    Loftus, Patricia A; Wise, Sarah K; Nieto, Daniel; Panella, Nicholas; Aiken, Ashley; DelGaudio, John M

    2016-10-01

    It is theorized that intranasal cavity volumes change throughout the aging process, possibly secondary to hormonal changes and atrophy of the sinonasal mucosa. Our objective is to compare intranasal volumes from different age groups to test the hypothesis that intranasal cavity volume increases with age. Case series. An analysis of computed tomography (CT) scans performed for reasons other than sinonasal complaints. Intranasal volumes of three groups (age 20-30 years, 40-50 years, and 70 years and above) were calculated using Vitrea software. The total intranasal volume was measured from the nasal vestibule anteriorly, the nasopharynx posteriorly, the olfactory cleft superiorly, and the nasal floor inferiorly. The total volume included the sum of the right and left sides. Sixty-two CT scans were analyzed. There was a progressive, relatively linear, increase in intranasal volume with increasing age: 20 to 30 years = 15.73 mL, 40 to 50 years = 17.30 mL, and 70 years and above = 18.38 mL. Mean intranasal volume for males was 19.07 mL, and for females was 15.23 mL. Analysis of variance demonstrated significant group differences in mean intranasal volume for age (P = .003) and gender (P < .001), with moderate-to-large effect size of 0.206 and 0.289 (partial η(2) ), respectively. Post hoc testing revealed a significant difference between the 20 to 30-year and >70-year age groups (P = .006). There was no significant difference in intranasal volume dependent upon body mass index. Intranasal volume increases with age and is larger in males. Specific etiologies responsible for increased intranasal cavity volume with age are actively being evaluated. 4 Laryngoscope, 126:2212-2215, 2016. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.

  20. Endoclips vs large or small-volume epinephrine in peptic ulcer recurrent bleeding

    PubMed Central

    Ljubicic, Neven; Budimir, Ivan; Biscanin, Alen; Nikolic, Marko; Supanc, Vladimir; Hrabar, Davor; Pavic, Tajana

    2012-01-01

    AIM: To compare the recurrent bleeding after endoscopic injection of different epinephrine volumes with hemoclips in patients with bleeding peptic ulcer. METHODS: Between January 2005 and December 2009, 150 patients with gastric or duodenal bleeding ulcer with major stigmata of hemorrhage and nonbleeding visible vessel in an ulcer bed (Forrest IIa) were included in the study. Patients were randomized to receive a small-volume epinephrine group (15 to 25 mL injection group; Group 1, n = 50), a large-volume epinephrine group (30 to 40 mL injection group; Group 2, n = 50) and a hemoclip group (Group 3, n = 50). The rate of recurrent bleeding, as the primary outcome, was compared between the groups of patients included in the study. Secondary outcomes compared between the groups were primary hemostasis rate, permanent hemostasis, need for emergency surgery, 30 d mortality, bleeding-related deaths, length of hospital stay and transfusion requirements. RESULTS: Initial hemostasis was obtained in all patients. The rate of early recurrent bleeding was 30% (15/50) in the small-volume epinephrine group (Group 1) and 16% (8/50) in the large-volume epinephrine group (Group 2) (P = 0.09). The rate of recurrent bleeding was 4% (2/50) in the hemoclip group (Group 3); the difference was statistically significant with regard to patients treated with either small-volume or large-volume epinephrine solution (P = 0.0005 and P = 0.045, respectively). Duration of hospital stay was significantly shorter among patients treated with hemoclips than among patients treated with epinephrine whereas there were no differences in transfusion requirement or even 30 d mortality between the groups. CONCLUSION: Endoclip is superior to both small and large volume injection of epinephrine in the prevention of recurrent bleeding in patients with peptic ulcer. PMID:22611315
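
    The headline comparison (2/50 rebleeds with hemoclips versus 15/50 and 8/50 with the two epinephrine volumes) can be reproduced as simple 2x2 contingency tests. The sketch below uses Fisher's exact test purely to show the arithmetic; the abstract does not state which test the authors applied, so the p-values need not match exactly.

```python
# 2x2 comparisons of the rebleeding counts reported in the abstract.
from scipy import stats

hemoclip = [2, 48]      # [rebled, did not rebleed]
small_epi = [15, 35]    # small-volume epinephrine group
large_epi = [8, 42]     # large-volume epinephrine group

for name, group in [("small-volume epinephrine", small_epi),
                    ("large-volume epinephrine", large_epi)]:
    _, p = stats.fisher_exact([hemoclip, group])
    print(f"hemoclip vs {name}: p = {p:.4f}")
```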

  1. A scalable neuroinformatics data flow for electrophysiological signals using MapReduce.

    PubMed

    Jayapandian, Catherine; Wei, Annan; Ramesh, Priya; Zonjy, Bilal; Lhatoo, Samden D; Loparo, Kenneth; Zhang, Guo-Qiang; Sahoo, Satya S

    2015-01-01

    Data-driven neuroscience research is providing new insights in progression of neurological disorders and supporting the development of improved treatment approaches. However, the volume, velocity, and variety of neuroscience data generated from sophisticated recording instruments and acquisition methods have exacerbated the limited scalability of existing neuroinformatics tools. This makes it difficult for neuroscience researchers to effectively leverage the growing multi-modal neuroscience data to advance research in serious neurological disorders, such as epilepsy. We describe the development of the Cloudwave data flow that uses new data partitioning techniques to store and analyze electrophysiological signal in distributed computing infrastructure. The Cloudwave data flow uses MapReduce parallel programming algorithm to implement an integrated signal data processing pipeline that scales with large volume of data generated at high velocity. Using an epilepsy domain ontology together with an epilepsy focused extensible data representation format called Cloudwave Signal Format (CSF), the data flow addresses the challenge of data heterogeneity and is interoperable with existing neuroinformatics data representation formats, such as HDF5. The scalability of the Cloudwave data flow is evaluated using a 30-node cluster installed with the open source Hadoop software stack. The results demonstrate that the Cloudwave data flow can process increasing volume of signal data by leveraging Hadoop Data Nodes to reduce the total data processing time. The Cloudwave data flow is a template for developing highly scalable neuroscience data processing pipelines using MapReduce algorithms to support a variety of user applications.

  2. A scalable neuroinformatics data flow for electrophysiological signals using MapReduce

    PubMed Central

    Jayapandian, Catherine; Wei, Annan; Ramesh, Priya; Zonjy, Bilal; Lhatoo, Samden D.; Loparo, Kenneth; Zhang, Guo-Qiang; Sahoo, Satya S.

    2015-01-01

    Data-driven neuroscience research is providing new insights in progression of neurological disorders and supporting the development of improved treatment approaches. However, the volume, velocity, and variety of neuroscience data generated from sophisticated recording instruments and acquisition methods have exacerbated the limited scalability of existing neuroinformatics tools. This makes it difficult for neuroscience researchers to effectively leverage the growing multi-modal neuroscience data to advance research in serious neurological disorders, such as epilepsy. We describe the development of the Cloudwave data flow that uses new data partitioning techniques to store and analyze electrophysiological signal in distributed computing infrastructure. The Cloudwave data flow uses MapReduce parallel programming algorithm to implement an integrated signal data processing pipeline that scales with large volume of data generated at high velocity. Using an epilepsy domain ontology together with an epilepsy focused extensible data representation format called Cloudwave Signal Format (CSF), the data flow addresses the challenge of data heterogeneity and is interoperable with existing neuroinformatics data representation formats, such as HDF5. The scalability of the Cloudwave data flow is evaluated using a 30-node cluster installed with the open source Hadoop software stack. The results demonstrate that the Cloudwave data flow can process increasing volume of signal data by leveraging Hadoop Data Nodes to reduce the total data processing time. The Cloudwave data flow is a template for developing highly scalable neuroscience data processing pipelines using MapReduce algorithms to support a variety of user applications. PMID:25852536
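
    The map-and-reduce structure described above can be reduced to a few lines: the map step turns each independent signal partition into (channel, feature) pairs and the reduce step aggregates them per channel. The sketch below is plain Python on random data; Hadoop, the Cloudwave Signal Format and the real pipeline are not reproduced, and all names are illustrative.

```python
# Toy MapReduce-style aggregation of per-channel signal power.
from collections import defaultdict
import numpy as np

def map_partition(partition):
    """Map: one data partition -> (channel, mean power) pairs."""
    for channel, samples in partition.items():
        yield channel, float(np.mean(np.square(samples)))

def reduce_by_channel(pairs):
    """Reduce: average the per-partition power values of each channel."""
    acc = defaultdict(list)
    for channel, power in pairs:
        acc[channel].append(power)
    return {ch: sum(v) / len(v) for ch, v in acc.items()}

rng = np.random.default_rng(4)
partitions = [{f"ch{i}": rng.standard_normal(1024) for i in range(3)}
              for _ in range(2)]            # two partitions of a 3-channel recording
mapped = (pair for part in partitions for pair in map_partition(part))
print(reduce_by_channel(mapped))
```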

  3. A new large-volume metal reference standard for radioactive waste management.

    PubMed

    Tzika, F; Hult, M; Stroh, H; Marissens, G; Arnold, D; Burda, O; Kovář, P; Suran, J; Listkowska, A; Tyminski, Z

    2016-03-01

    A new large-volume metal reference standard has been developed. The intended use is for calibration of free-release radioactivity measurement systems and is made up of cast iron tubes placed inside a box of the size of a Euro-pallet (80 × 120 cm). The tubes contain certified activity concentrations of (60)Co (0.290 ± 0.006 Bq g(-1)) and (110m)Ag (3.05 ± 0.09 Bq g(-1)) (reference date: 30 September 2013). They were produced using centrifugal casting from a smelt into which (60)Co was first added and then one piece of neutron irradiated silver wire was progressively diluted. The iron castings were machined to the desirable dimensions. The final material consists of 12 iron tubes of 20 cm outer diameter, 17.6 cm inner diameter, 40 cm length/height and 245.9 kg total mass. This paper describes the reference standard and the process of determining the reference activity values. © The Author 2015. Published by Oxford University Press.

  4. Dynamic illumination of spatially restricted or large brain volumes via a single tapered optical fiber

    PubMed Central

    Pisanello, Marco; Oldenburg, Ian A.; Sileo, Leonardo; Markowitz, Jeffrey E.; Peterson, Ralph E.; Della Patria, Andrea; Haynes, Trevor M.; Emara, Mohamed S.; Spagnolo, Barbara; Datta, Sandeep Robert; De Vittorio, Massimo; Sabatini, Bernardo L.

    2017-01-01

    Optogenetics promises spatiotemporal precise control of neural processes using light. However, the spatial extent of illumination within the brain is difficult to control and cannot be adjusted using standard fiber optics. We demonstrate that optical fibers with tapered tips can be used to illuminate either spatially restricted or large brain volumes. Remotely adjusting the light input angle to the fiber varies the light-emitting portion of the taper over several millimeters without movement of the implant. We use this mode to activate dorsal versus ventral striatum of individual mice and reveal different effects of each manipulation on motor behavior. Conversely, injecting light over the full numerical aperture of the fiber results in light emission from the entire taper surface, achieving broader and more efficient optogenetic activation of neurons when compared to the standard flat-faced fiber stimulation. Thus, tapered fibers permit focal or broad illumination that can be precisely and dynamically matched to experimental needs. PMID:28628101

  5. Zircon Age Distributions Provide Magma Fluxes in the Earth's Crust

    NASA Astrophysics Data System (ADS)

    Caricchi, L.; Simpson, G.; Schaltegger, U.

    2014-12-01

    Magma fluxes control the growth of continents and the frequency and magnitude of volcanic eruptions, and are important for the genesis of magmatic ore deposits. A significant part of the magma produced in the Earth's mantle solidifies at depth, and this limits our capability of determining magma fluxes, which, in turn, compromises our ability to establish a link between global heat transfer and large-scale geological processes. Using thermal modelling in combination with high-precision zircon dating, we show that populations of zircon ages provide an accurate means to retrieve magma fluxes. The characteristics of zircon age populations vary significantly and systematically as a function of the flux and total volume of magma accumulated at depth. This new approach provides results that are identical to independent determinations of magma fluxes and volumes of magmatic systems. The analysis of existing age population datasets by our method highlights that porphyry-type deposits, plutons and large eruptions each require magma input over different timescales at characteristic average fluxes.

  6. Highly stable carbon coated Mg2Si intermetallic nanoparticles for lithium-ion battery anode

    NASA Astrophysics Data System (ADS)

    Tamirat, Andebet Gedamu; Hou, Mengyan; Liu, Yao; Bin, Duan; Sun, Yunhe; Fan, Long; Wang, Yonggang; Xia, Yongyao

    2018-04-01

    Silicon is an ideal candidate anode material for Li-ion batteries (LIBs). However, it suffers from rapid capacity fading due to large volume expansion upon lithium insertion. Herein, we design and fabricate a highly stable, carbon-coated porous Mg2Si intermetallic anode material using a facile mechano-thermal technique followed by carbon coating via thermal vapour deposition (TVD) with toluene as the carbon source. The electrode exhibits an excellent first reversible capacity of 726 mAh g-1 at a rate of 100 mA g-1. More importantly, the electrode demonstrates high rate capability (380 mAh g-1 at a high rate of 2 A g-1) as well as high cycle stability, with capacity retention of 65% over 500 cycles. These improvements are attributable to both the Mg supporting medium and the uniform carbon coating, which effectively increase the conductivity and electronic contact of the active material and protect against the large volume changes during electrochemical cycling.

  7. Neural networks within multi-core optic fibers

    PubMed Central

    Cohen, Eyal; Malka, Dror; Shemer, Amir; Shahmoon, Asaf; Zalevsky, Zeev; London, Michael

    2016-01-01

    Hardware implementation of artificial neural networks facilitates real-time parallel processing of massive data sets. Optical neural networks offer low-volume 3D connectivity together with large bandwidth and minimal heat production in contrast to electronic implementation. Here, we present a conceptual design for in-fiber optical neural networks. Neurons and synapses are realized as individual silica cores in a multi-core fiber. Optical signals are transferred transversely between cores by means of optical coupling. Pump driven amplification in erbium-doped cores mimics synaptic interactions. We simulated three-layered feed-forward neural networks and explored their capabilities. Simulations suggest that networks can differentiate between given inputs depending on specific configurations of amplification; this implies classification and learning capabilities. Finally, we tested experimentally our basic neuronal elements using fibers, couplers, and amplifiers, and demonstrated that this configuration implements a neuron-like function. Therefore, devices similar to our proposed multi-core fiber could potentially serve as building blocks for future large-scale small-volume optical artificial neural networks. PMID:27383911

  8. Neural networks within multi-core optic fibers.

    PubMed

    Cohen, Eyal; Malka, Dror; Shemer, Amir; Shahmoon, Asaf; Zalevsky, Zeev; London, Michael

    2016-07-07

    Hardware implementation of artificial neural networks facilitates real-time parallel processing of massive data sets. Optical neural networks offer low-volume 3D connectivity together with large bandwidth and minimal heat production in contrast to electronic implementation. Here, we present a conceptual design for in-fiber optical neural networks. Neurons and synapses are realized as individual silica cores in a multi-core fiber. Optical signals are transferred transversely between cores by means of optical coupling. Pump driven amplification in erbium-doped cores mimics synaptic interactions. We simulated three-layered feed-forward neural networks and explored their capabilities. Simulations suggest that networks can differentiate between given inputs depending on specific configurations of amplification; this implies classification and learning capabilities. Finally, we tested experimentally our basic neuronal elements using fibers, couplers, and amplifiers, and demonstrated that this configuration implements a neuron-like function. Therefore, devices similar to our proposed multi-core fiber could potentially serve as building blocks for future large-scale small-volume optical artificial neural networks.

  9. Dynamic illumination of spatially restricted or large brain volumes via a single tapered optical fiber.

    PubMed

    Pisanello, Ferruccio; Mandelbaum, Gil; Pisanello, Marco; Oldenburg, Ian A; Sileo, Leonardo; Markowitz, Jeffrey E; Peterson, Ralph E; Della Patria, Andrea; Haynes, Trevor M; Emara, Mohamed S; Spagnolo, Barbara; Datta, Sandeep Robert; De Vittorio, Massimo; Sabatini, Bernardo L

    2017-08-01

    Optogenetics promises precise spatiotemporal control of neural processes using light. However, the spatial extent of illumination within the brain is difficult to control and cannot be adjusted using standard fiber optics. We demonstrate that optical fibers with tapered tips can be used to illuminate either spatially restricted or large brain volumes. Remotely adjusting the light input angle to the fiber varies the light-emitting portion of the taper over several millimeters without movement of the implant. We use this mode to activate dorsal versus ventral striatum of individual mice and reveal different effects of each manipulation on motor behavior. Conversely, injecting light over the full numerical aperture of the fiber results in light emission from the entire taper surface, achieving broader and more efficient optogenetic activation of neurons, compared to standard flat-faced fiber stimulation. Thus, tapered fibers permit focal or broad illumination that can be precisely and dynamically matched to experimental needs.

  10. Next Generation Non-Vacuum, Maskless, Low Temperature Nanoparticle Ink Laser Digital Direct Metal Patterning for a Large Area Flexible Electronics

    PubMed Central

    Yeo, Junyeob; Hong, Sukjoon; Lee, Daehoo; Hotz, Nico; Lee, Ming-Tsang; Grigoropoulos, Costas P.; Ko, Seung Hwan

    2012-01-01

    Flexible electronics has opened a new class of future electronics. The foldable, light and durable nature of flexible electronics allows vast flexibility in applications such as display, energy devices and mobile electronics. Even though conventional electronics fabrication methods are well developed for rigid substrates, they cannot simply be applied, or only slightly modified, for flexible electronics fabrication. Future flexible electronics fabrication requires entirely new low-temperature processes optimized for flexible substrates and based on new materials. Here we present a simple approach to flexible electronics fabrication that avoids conventional vacuum deposition and photolithography. We found that direct metal patterning based on laser-induced local melting of metal nanoparticle ink is a promising low-temperature alternative to vacuum deposition– and photolithography-based conventional metal patterning processes. The “digital” nature of the proposed direct metal patterning process removes the need for an expensive photomask and allows easy design modification and short turnaround time. This new process can be extremely useful for current small-volume, large-variety manufacturing paradigms. Besides, simple, scalable, fast and low-temperature processes can lead to cost-effective fabrication methods on a large-area polymer substrate. The developed process was successfully applied to demonstrate high-quality Ag patterning (2.1 µΩ·cm) and high-performance flexible organic field effect transistor arrays. PMID:22900011

  11. Next generation non-vacuum, maskless, low temperature nanoparticle ink laser digital direct metal patterning for a large area flexible electronics.

    PubMed

    Yeo, Junyeob; Hong, Sukjoon; Lee, Daehoo; Hotz, Nico; Lee, Ming-Tsang; Grigoropoulos, Costas P; Ko, Seung Hwan

    2012-01-01

    Flexible electronics has opened a new class of future electronics. The foldable, light and durable nature of flexible electronics allows vast flexibility in applications such as display, energy devices and mobile electronics. Even though conventional electronics fabrication methods are well developed for rigid substrates, they cannot simply be applied, or only slightly modified, for flexible electronics fabrication. Future flexible electronics fabrication requires entirely new low-temperature processes optimized for flexible substrates and based on new materials. Here we present a simple approach to flexible electronics fabrication that avoids conventional vacuum deposition and photolithography. We found that direct metal patterning based on laser-induced local melting of metal nanoparticle ink is a promising low-temperature alternative to vacuum deposition- and photolithography-based conventional metal patterning processes. The "digital" nature of the proposed direct metal patterning process removes the need for an expensive photomask and allows easy design modification and short turnaround time. This new process can be extremely useful for current small-volume, large-variety manufacturing paradigms. Besides, simple, scalable, fast and low-temperature processes can lead to cost-effective fabrication methods on a large-area polymer substrate. The developed process was successfully applied to demonstrate high-quality Ag patterning (2.1 µΩ·cm) and high-performance flexible organic field effect transistor arrays.

  12. The role of porous matrix in water flow regulation within a karst unsaturated zone: an integrated hydrogeophysical approach

    NASA Astrophysics Data System (ADS)

    Carrière, Simon D.; Chalikakis, Konstantinos; Danquigny, Charles; Davi, Hendrik; Mazzilli, Naomi; Ollivier, Chloé; Emblanch, Christophe

    2016-11-01

    Some portions of the porous rock matrix in the karst unsaturated zone (UZ) can contain large volumes of water and play a major role in water flow regulation. The essential results of a local-scale study conducted in 2011 and 2012 above the Low Noise Underground Laboratory (LSBB - Laboratoire Souterrain à Bas Bruit) at Rustrel, southeastern France, are presented. Previous research revealed the geological structure and water-related features of the study site and illustrated the feasibility of specific hydrogeophysical measurements. In this study, the focus is on hydrodynamics at the seasonal and event timescales. Magnetic resonance sounding (MRS) measured a high water content (more than 10 %) in a large volume of rock. This large volume of water cannot be stored in fractures and conduits within the UZ. MRS was also used to measure the seasonal variation of water stored in the karst UZ. A process-based model was developed to simulate the effect of vegetation on groundwater recharge dynamics. In addition, electrical resistivity tomography (ERT) monitoring was used to assess preferential water pathways during a rain event. This study demonstrates the major influence of water flow within the porous rock matrix on the UZ hydrogeological functioning at both the local (LSBB) and regional (Fontaine de Vaucluse) scales. By taking into account the role of the porous matrix in water flow regulation, these findings may significantly improve karst groundwater hydrodynamic modelling, exploitation, and sustainable management.

  13. Three-dimensional digital holographic aperture synthesis for rapid and highly-accurate large-volume metrology

    NASA Astrophysics Data System (ADS)

    Crouch, Stephen; Kaylor, Brant M.; Barber, Zeb W.; Reibel, Randy R.

    2015-09-01

    Currently large volume, high accuracy three-dimensional (3D) metrology is dominated by laser trackers, which typically utilize a laser scanner and cooperative reflector to estimate points on a given surface. The dependency upon the placement of cooperative targets dramatically inhibits the speed at which metrology can be conducted. To increase speed, laser scanners or structured illumination systems can be used directly on the surface of interest. Both approaches are restricted in their axial and lateral resolution at longer stand-off distances due to the diffraction limit of the optics used. Holographic aperture ladar (HAL) and synthetic aperture ladar (SAL) can enhance the lateral resolution of an imaging system by synthesizing much larger apertures by digitally combining measurements from multiple smaller apertures. Both of these approaches only produce two-dimensional imagery and are therefore not suitable for large volume 3D metrology. We combined the SAL and HAL approaches to create a swept frequency digital holographic 3D imaging system that provides rapid measurement speed for surface coverage with unprecedented axial and lateral resolution at longer standoff ranges. The technique yields a "data cube" of Fourier domain data, which can be processed with a 3D Fourier transform to reveal a 3D estimate of the surface. In this paper, we provide the theoretical background for the technique and show experimental results based on an ultra-wideband frequency modulated continuous wave (FMCW) chirped heterodyne ranging system showing ~100 micron lateral and axial precisions at >2 m standoff distances.
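
    As an illustrative aside, the final reconstruction step described above (a 3D Fourier transform of the Fourier-domain data cube followed by a surface estimate) can be sketched in a few lines. This is a minimal sketch under assumed array sizes and synthetic data, not the authors' processing chain; the peak-picking surface estimate is one simple choice among several.

        # Minimal sketch: 3D FFT of a Fourier-domain data cube and a simple
        # peak-based surface estimate. Shapes and the random cube are assumptions.
        import numpy as np

        nx, ny, nf = 64, 64, 256                       # aperture samples x frequency samples (assumed)
        cube = np.random.randn(nx, ny, nf) + 1j * np.random.randn(nx, ny, nf)

        image = np.fft.fftshift(np.fft.ifftn(cube))    # cross-range x cross-range x range image
        intensity = np.abs(image) ** 2

        # Take the range bin with peak intensity at each lateral position as the surface.
        surface_bins = np.argmax(intensity, axis=2)
        print(surface_bins.shape)                      # (64, 64) height map in range-bin units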

  14. Commercialization of a novel fermentation concept.

    PubMed

    Mazumdar-Shaw, Kiran; Suryanarayan, Shrikumar

    2003-01-01

    Fermentation is the core of biotechnology where current methodologies span technologies based on the use of either solid or liquid substrates. Traditionally, solid substrate fermentation technologies have been widely practiced in the Far East to manufacture fermented foods such as soya sauce, sake etc. The Western World briefly used solid substrate fermentation for the manufacture of antibiotics and enzymes but rapidly replaced this technology with submerged fermentation, which proved to be a superior technology in terms of automation, containment and large volume fermentation. Biocon India developed its enzyme technology based on solid substrate fermentation as a low-cost, low-energy option for the production of specialty enzymes. However, the limitations of applying solid substrate fermentation to more sophisticated biotechnology products as well as large volume fermentations were recognized by Biocon India as early as 1990, and the company embarked on an 8-year research and development program to develop a novel bioreactor capable of conducting solid substrate fermentation with levels of automation and containment comparable to those practiced by submerged fermentation. In addition, the novel technology enabled fed-batch fermentation, in situ extraction and other features that will be discussed in this article. The novel bioreactor was christened the "PlaFractor" (pronounced play-fractor). The next level of research on this novel technology is now focused on addressing large volume fermentation. This article traces the evolution of Biocon India's original solid substrate fermentation to the PlaFractor technology and provides details of the scale-up and commercialization processes that were involved therein. Also apparent in the article are Biocon India's commercially focused research programs and the perceived need to be globally competitive through low costs of innovation that address, at all times, processes and technologies conforming to international regulatory and good manufacturing practice standards.

  15. Sound-level-dependent representation of frequency modulations in human auditory cortex: a low-noise fMRI study.

    PubMed

    Brechmann, André; Baumgart, Frank; Scheich, Henning

    2002-01-01

    Recognition of sound patterns must be largely independent of level and of masking or jamming background sounds. Auditory patterns of relevance in numerous environmental sounds, species-specific vocalizations and speech are frequency modulations (FM). Level-dependent activation of the human auditory cortex (AC) in response to a large set of upward and downward FM tones was studied with low-noise (48 dB) functional magnetic resonance imaging at 3 Tesla. Separate analysis in four territories of AC was performed in each individual brain using a combination of anatomical landmarks and spatial activation criteria for their distinction. Activation of territory T1b (including primary AC) showed the most robust level dependence over the large range of 48-102 dB in terms of activated volume and blood oxygen level dependent contrast (BOLD) signal intensity. The left nonprimary territory T2 also showed a good correlation of level with activated volume but, in contrast to T1b, not with BOLD signal intensity. These findings are compatible with level coding mechanisms observed in animal AC. A systematic increase of activation with level was not observed for T1a (anterior of Heschl's gyrus) and T3 (on the planum temporale). Thus these areas might not be specifically involved in processing of the overall intensity of FM. The rostral territory T1a of the left hemisphere exhibited highest activation when the FM sound level fell 12 dB below scanner noise. This supports the previously suggested special involvement of this territory in foreground-background decomposition tasks. Overall, AC of the left hemisphere showed a stronger level-dependence of signal intensity and activated volume than the right hemisphere. But any side differences of signal intensity at given levels were lateralized to right AC. This might point to an involvement of the right hemisphere in more specific aspects of FM processing than level coding.

  16. Allometric Analysis Detects Brain Size-Independent Effects of Sex and Sex Chromosome Complement on Human Cerebellar Organization

    PubMed Central

    Mankiw, Catherine; Park, Min Tae M.; Reardon, P.K.; Fish, Ari M.; Clasen, Liv S.; Greenstein, Deanna; Blumenthal, Jonathan D.; Lerch, Jason P.; Chakravarty, M. Mallar

    2017-01-01

    The cerebellum is a large hindbrain structure that is increasingly recognized for its contribution to diverse domains of cognitive and affective processing in human health and disease. Although several of these domains are sex biased, our fundamental understanding of cerebellar sex differences—including their spatial distribution, potential biological determinants, and independence from brain volume variation—lags far behind that for the cerebrum. Here, we harness automated neuroimaging methods for cerebellar morphometrics in 417 individuals to (1) localize normative male–female differences in raw cerebellar volume, (2) compare these to sex chromosome effects estimated across five rare sex (X/Y) chromosome aneuploidy (SCA) syndromes, and (3) clarify brain size-independent effects of sex and SCA on cerebellar anatomy using a generalizable allometric approach that considers scaling relationships between regional cerebellar volume and brain volume in health. The integration of these approaches shows that (1) sex and SCA effects on raw cerebellar volume are large and distributed, but regionally heterogeneous, (2) human cerebellar volume scales with brain volume in a highly nonlinear and regionally heterogeneous fashion that departs from documented patterns of cerebellar scaling in phylogeny, and (3) cerebellar organization is modified in a brain size-independent manner by sex (relative expansion of total cerebellum, flocculus, and Crus II-lobule VIIIB volumes in males) and SCA (contraction of total cerebellar, lobule IV, and Crus I volumes with additional X- or Y-chromosomes; X-specific contraction of Crus II-lobule VIIIB). Our methods and results clarify the shifts in human cerebellar organization that accompany interwoven variations in sex, sex chromosome complement, and brain size. SIGNIFICANCE STATEMENT Cerebellar systems are implicated in diverse domains of sex-biased behavior and pathology, but we lack a basic understanding of how sex differences in the human cerebellum are distributed and determined. We leverage a rare neuroimaging dataset to deconvolve the interwoven effects of sex, sex chromosome complement, and brain size on human cerebellar organization. We reveal topographically variegated scaling relationships between regional cerebellar volume and brain size in humans, which (1) are distinct from those observed in phylogeny, (2) invalidate a traditional neuroimaging method for brain volume correction, and (3) allow more valid and accurate resolution of which cerebellar subcomponents are sensitive to sex and sex chromosome complement. These findings advance understanding of cerebellar organization in health and sex chromosome aneuploidy. PMID:28314818
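
    The brain size-independent ("allometric") comparison referred to above is commonly formulated as a log-log scaling regression of regional volume against total brain volume, with group effects assessed relative to the fitted scaling; the exact parameterisation used in the study is not given in the abstract, so the form below is a standard assumption:

        \log V_{\mathrm{region}} \;=\; \beta_0 + \beta_1 \,\log V_{\mathrm{brain}} + \varepsilon,
        \qquad V_{\mathrm{region}} \;\propto\; V_{\mathrm{brain}}^{\beta_1},

    where \beta_1 = 1 indicates isometric scaling and \beta_1 \neq 1 the nonlinear (allometric) scaling the study reports.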

  17. Seismic signal processing on heterogeneous supercomputers

    NASA Astrophysics Data System (ADS)

    Gokhberg, Alexey; Ermert, Laura; Fichtner, Andreas

    2015-04-01

    The processing of seismic signals - including the correlation of massive ambient noise data sets - represents an important part of a wide range of seismological applications. It is characterized by large data volumes as well as high computational input/output intensity. Development of efficient approaches towards seismic signal processing on emerging high performance computing systems is therefore essential. Heterogeneous supercomputing systems introduced in recent years provide numerous computing nodes interconnected via high throughput networks, every node containing a mix of processing elements of different architectures, like several sequential processor cores and one or a few graphical processing units (GPU) serving as accelerators. A typical representative of such computing systems is "Piz Daint", a supercomputer of the Cray XC 30 family operated by the Swiss National Supercomputing Center (CSCS), which we used in this research. Heterogeneous supercomputers offer the potential for manifold increases in application performance and are more energy-efficient; however, they have much higher hardware complexity and are therefore much more difficult to program. The programming effort may be substantially reduced by the introduction of modular libraries of software components that can be reused for a wide class of seismology applications. The ultimate goal of this research is the design of a prototype of such a library, suitable for implementing various seismic signal processing applications on heterogeneous systems. As a representative use case we have chosen an ambient noise correlation application. Ambient noise interferometry has developed into one of the most powerful tools to image and monitor the Earth's interior. Future applications will require the extraction of increasingly small details from noise recordings. To meet this demand, more advanced correlation techniques combined with very large data volumes are needed. This poses new computational problems that require dedicated HPC solutions. The chosen application uses a wide range of common signal processing methods, which include various IIR filter designs, amplitude and phase correlation, computing the analytic signal, and discrete Fourier transforms. Furthermore, various processing methods specific to seismology, like rotation of seismic traces, are used. Efficient implementation of all these methods on GPU-accelerated systems presents several challenges. In particular, it requires a careful distribution of work between the sequential processors and accelerators. Furthermore, since the application is designed to process very large volumes of data, special attention had to be paid to the efficient use of the available memory and networking hardware resources in order to reduce the intensity of data input and output. In our contribution we will explain the software architecture as well as principal engineering decisions used to address these challenges. We will also describe the programming model based on C++ and CUDA that we used to develop the software. Finally, we will demonstrate performance improvements achieved by using the heterogeneous computing architecture. This work was supported by a grant from the Swiss National Supercomputing Centre (CSCS) under project ID d26.
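
    The core correlation step named above (frequency-domain cross-correlation of two pre-processed noise traces) can be sketched as follows; the synthetic traces, sampling rate and single-window treatment are assumptions, and a real workflow adds filtering, spectral whitening and stacking over many windows.

        # Minimal sketch of ambient-noise cross-correlation via the FFT.
        import numpy as np

        fs = 20.0                            # sampling rate in Hz (assumed)
        n = 4096
        trace_a = np.random.randn(n)         # stand-ins for pre-processed noise records
        trace_b = np.random.randn(n)

        nfft = 2 * n                         # zero-pad to avoid wrap-around
        spec_a = np.fft.rfft(trace_a, nfft)
        spec_b = np.fft.rfft(trace_b, nfft)
        xcorr = np.fft.fftshift(np.fft.irfft(spec_a * np.conj(spec_b), nfft))

        lags = (np.arange(nfft) - nfft // 2) / fs
        print(lags[np.argmax(np.abs(xcorr))])   # lag of the strongest correlation, in seconds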

  18. A Comparative Study of Point Cloud Data Collection and Processing

    NASA Astrophysics Data System (ADS)

    Pippin, J. E.; Matheney, M.; Gentle, J. N., Jr.; Pierce, S. A.; Fuentes-Pineda, G.

    2016-12-01

    Over the past decade, there has been dramatic growth in the acquisition of publicly funded high-resolution topographic data for scientific, environmental, engineering and planning purposes. These data sets are valuable for applications of interest across a large and varied user community. However, because of the large volumes of data produced by high-resolution mapping technologies and the expense of aerial data collection, it is often difficult to collect and distribute these datasets. Furthermore, the data can be technically challenging to process, requiring software and computing resources not readily available to many users. This study presents a comparison of advanced computing hardware and software used to collect and process point cloud datasets, such as LIDAR scans. Activities included implementation and testing of open source libraries and applications for point cloud data processing, such as MeshLab, Blender, PDAL, and PCL. Additionally, commercial-scale applications, Skanect and CloudCompare, were applied to raw datasets. Handheld hardware solutions, a Structure Scanner and an Xbox 360 Kinect V1, were tested for their ability to scan at three field locations. The resulting projects successfully scanned and processed subsurface karst features ranging from small stalactites to large rooms, as well as a surface waterfall feature. Outcomes support the feasibility of rapid sensing in 3D at field scales.
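
    As a small illustration of the kind of point-cloud reduction the libraries named above provide, the sketch below performs voxel-grid down-sampling in plain NumPy; the random cloud and voxel size are assumptions standing in for a LIDAR or handheld scan.

        # Minimal voxel-grid down-sampling sketch (one representative point per voxel).
        import numpy as np

        points = np.random.rand(100_000, 3) * 10.0          # x, y, z in metres (assumed)
        voxel = 0.25                                         # voxel edge length in metres

        keys = np.floor(points / voxel).astype(np.int64)     # voxel index of each point
        _, first_idx = np.unique(keys, axis=0, return_index=True)
        downsampled = points[np.sort(first_idx)]
        print(len(points), "->", len(downsampled), "points")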

  19. Assessing Stress Responses in Beaked and Sperm Whales in the Bahamas

    DTIC Science & Technology

    2012-09-30

    acceptable extraction efficiency for steroids (Hayward et al. 2010; Wasser et al. 2010). The "small sample size" effect on hormone concentration was...efficiency (Wasser pers. comm., Hunt et al. unpub. data). 4) Pilot test of hormone content in seawater removed from samples. The large volume of...2006), and Wasser et al. (2010), with extraction modifications discussed above. RESULTS Sample processing Using a consistent fecal:solvent

  20. Microbial stabilization and mass reduction of wastes containing radionuclides and toxic metals

    DOEpatents

    Francis, A.J.; Dodge, C.J.; Gillow, J.B.

    1991-09-10

    A process is provided to treat wastes containing radionuclides and toxic metals with Clostridium sp. BFGl to release a large fraction of the waste solids into solution and convert the radionuclides and toxic metals to a more concentrated and stable form with concurrent volume and mass reduction. The radionuclides and toxic metals being in a more stable form are available for recovery, recycling and disposal. 18 figures.

  1. Manufacturing Diamond Under Very High Pressure

    NASA Technical Reports Server (NTRS)

    Voronov, Oleg

    2007-01-01

    A process for manufacturing bulk diamond has been made practical by the invention of the High Pressure and Temperature Apparatus capable of applying the combination of very high temperature and high pressure needed to melt carbon in a sufficiently large volume. The apparatus includes a reaction cell wherein a controlled static pressure as high as 20 GPa and a controlled temperature as high as 5,000 C can be maintained.

  2. Microbial stabilization and mass reduction of wastes containing radionuclides and toxic metals

    DOEpatents

    Francis, Arokiasamy J.; Dodge, Cleveland J.; Gillow, Jeffrey B.

    1991-01-01

    A process is provided to treat wastes containing radionuclides and toxic metals with Clostridium sp. BFGl to release a large fraction of the waste solids into solution and convert the radionuclides and toxic metals to a more concentrated and stable form with concurrent volume and mass reduction. The radionuclides and toxic metals being in a more stable form are available for recovery, recycling and disposal.

  3. Research@ARL. Imaging & Image Processing. Volume 3, Issue 1

    DTIC Science & Technology

    2014-01-01

    goal, the focal plane arrays (FPAs) the Army deploys must excel in all areas of performance including thermal sensitivity, image resolution, speed of...are available only in relatively small sizes. Further, the difference in thermal expansion coefficients between a CZT substrate and its silicon (Si...read-out integrated circuitry reduces the reliability of large format FPAs due to repeated thermal cycling. Some in the community believed this

  4. Concepts for on-board satellite image registration. Volume 2: IAS prototype performance evaluation standard definition. [NEEDS Information Adaptive System

    NASA Technical Reports Server (NTRS)

    Daluge, D. R.; Ruedger, W. H.

    1981-01-01

    Problems encountered in testing onboard signal processing hardware designed to achieve radiometric and geometric correction of satellite imaging data are considered. These include obtaining representative image and ancillary data for simulation and the transfer and storage of a large quantity of image data at very high speed. The high resolution, high speed preprocessing of LANDSAT-D imagery is considered.

  5. Root induced changes of effective 1D hydraulic properties in a soil column.

    PubMed

    Scholl, P; Leitner, D; Kammerer, G; Loiskandl, W; Kaul, H-P; Bodner, G

    Roots are essential drivers of soil structure and pore formation. This study aimed at quantifying root induced changes of the pore size distribution (PSD). The focus was on the extent of clogging vs. formation of pores during active root growth. Parameters of Kosugi's lognormal PSD model were determined by inverse estimation in a column experiment with two cover crops (mustard, rye) and an unplanted control. Pore dynamics were described using a convection-dispersion like pore evolution model. Rooted treatments showed a wider range of pore radii with increasing volumes of large macropores >500 μm and micropores <2.5 μm, while fine macropores, mesopores and larger micropores decreased. The non-rooted control showed narrowing of the PSD and reduced porosity over all radius classes. The pore evolution model accurately described root induced changes, while structure degradation in the non-rooted control was not captured properly. Our study demonstrated significant short term root effects with heterogenization of the pore system as dominant process of root induced structure formation. Pore clogging is suggested as a partial cause for reduced pore volume. The important change in micro- and large macropores however indicates that multiple mechanic and biochemical processes are involved in root-pore interactions.
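
    For reference, a commonly cited form of Kosugi's lognormal pore-size distribution model, on which the inverse estimation above rests, is stated below; this is an assumption of the standard formulation rather than the exact parameterisation used in the study:

        f(r) \;=\; \frac{\theta_s - \theta_r}{\sqrt{2\pi}\,\sigma\, r}\,
                   \exp\!\left[-\,\frac{\bigl(\ln(r/r_m)\bigr)^{2}}{2\sigma^{2}}\right],

    where r_m is the median pore radius, \sigma the spread of the distribution, and \theta_s, \theta_r the saturated and residual water contents.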

  6. GlycoExtractor: a web-based interface for high throughput processing of HPLC-glycan data.

    PubMed

    Artemenko, Natalia V; Campbell, Matthew P; Rudd, Pauline M

    2010-04-05

    Recently, an automated high-throughput HPLC platform has been developed that can be used to fully sequence and quantify low concentrations of N-linked sugars released from glycoproteins, supported by an experimental database (GlycoBase) and analytical tools (autoGU). However, commercial packages that support the operation of HPLC instruments and data storage lack platforms for the extraction of large volumes of data. The lack of resources and agreed formats in glycomics is now a major limiting factor that restricts the development of bioinformatic tools and automated workflows for high-throughput HPLC data analysis. GlycoExtractor is a web-based tool that interfaces with a commercial HPLC database/software solution to facilitate the extraction of large volumes of processed glycan profile data (peak number, peak areas, and glucose unit values). The tool allows the user to export a series of sample sets to a set of file formats (XML, JSON, and CSV) rather than a collection of disconnected files. This approach not only reduces the amount of manual refinement required to export data into a suitable format for data analysis but also opens the field to new approaches for high-throughput data interpretation and storage, including biomarker discovery and validation and monitoring of online bioprocessing conditions for next generation biotherapeutics.
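
    A minimal sketch of the kind of export the tool performs (writing processed peak data to JSON and CSV) is given below; the record structure and field names are illustrative assumptions, not the tool's actual schema.

        # Hypothetical export of processed glycan peaks to JSON and CSV.
        import csv
        import json

        peaks = [
            {"peak": 1, "area": 1520.4, "glucose_units": 5.12},
            {"peak": 2, "area": 880.1,  "glucose_units": 6.03},
        ]

        with open("sample_profile.json", "w") as fh:
            json.dump({"sample": "S001", "peaks": peaks}, fh, indent=2)

        with open("sample_profile.csv", "w", newline="") as fh:
            writer = csv.DictWriter(fh, fieldnames=["peak", "area", "glucose_units"])
            writer.writeheader()
            writer.writerows(peaks)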

  7. Visualization for Molecular Dynamics Simulation of Gas and Metal Surface Interaction

    NASA Astrophysics Data System (ADS)

    Puzyrkov, D.; Polyakov, S.; Podryga, V.

    2016-02-01

    The development of methods, algorithms and applications for visualization of molecular dynamics simulation outputs is discussed. The visual analysis of the results of such calculations is a complex and pressing problem, especially in the case of large-scale simulations. To solve this challenging task it is necessary to decide on: 1) what data parameters to render, 2) what type of visualization to choose, 3) what development tools to use. In the present work an attempt to answer these questions was made. For visualization, we propose drawing particles at their 3D coordinates together with their velocity vectors, trajectories and volume density in the form of isosurfaces or fog. We tested a post-processing and visualization approach based on the Python language and additional libraries. Parallel software was also developed that allows processing of large volumes of data in the 3D regions of the examined system. This software produces results in parallel with the calculations and, at the end, assembles the discrete rendered frames into a video file. The software package "Enthought Mayavi2" was used as the visualization tool. This application enabled us to study the interaction of a gas with a metal surface and to closely observe the adsorption effect.
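
    The rendering step described above can be sketched with Mayavi's mlab interface as follows; the random coordinates and velocities stand in for one frame of molecular dynamics output, and the scale factors are arbitrary assumptions.

        # Minimal sketch: particles as spheres plus velocity vectors, via Mayavi.
        import numpy as np
        from mayavi import mlab

        n = 500
        x, y, z = np.random.rand(3, n) * 10.0              # particle positions (assumed units)
        u, v, w = np.random.randn(3, n)                     # particle velocities

        mlab.points3d(x, y, z, scale_factor=0.15)           # particles
        mlab.quiver3d(x, y, z, u, v, w, scale_factor=0.3)   # velocity vectors
        mlab.show()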

  8. Finite Volume Scheme for Double Convection-Diffusion Exchange of Solutes in Bicarbonate High-Flux Hollow-Fiber Dialyzer Therapy

    PubMed Central

    Annan, Kodwo

    2012-01-01

    The efficiency of a high-flux dialyzer in terms of buffering and toxic solute removal largely depends on the ability to use the convection-diffusion mechanism inside the membrane. A two-dimensional transient convection-diffusion model coupled with an acid-base correction term was developed. A finite volume technique was used to discretize the model and to numerically simulate it using the MATLAB software tool. We observed that small solute concentration gradients peaked and were large enough to activate the solute diffusion process in the membrane. While CO2 concentration gradients diminished from their maxima and shifted toward the end of the membrane, HCO3− concentration gradients peaked at the same position. Also, CO2 concentration decreased rapidly within the first 47 minutes while optimal HCO3− concentration was achieved within 30 minutes of the therapy. Abnormally high diffusion fluxes were observed near the blood-membrane interface that increased the diffusion driving force and enhanced the overall diffusive process. While convective flux dominated the total flux during the dialysis session, there was continuous interference between the convective and diffusive fluxes, which calls for minimizing the interference between these two mechanisms. This is critical for the effective design and operation of high-flux dialyzers. PMID:23197994
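
    A generic statement of the transient convection-diffusion balance and a finite-volume discretization of it is given below; this is a standard textbook form stated as an assumption, not the exact scheme of the paper:

        \frac{\partial C}{\partial t} + \nabla\!\cdot\!(\mathbf{u}\,C)
          \;=\; \nabla\!\cdot\!\bigl(D\,\nabla C\bigr) + S,
        \qquad
        \frac{C_P^{\,n+1}-C_P^{\,n}}{\Delta t}\,V_P
          + \sum_{f}\Bigl(u_f\,C_f - D_f\,\frac{\partial C}{\partial n}\Big|_{f}\Bigr)A_f
          \;=\; S_P\,V_P,

    where the sum runs over the faces f of control volume P, with face areas A_f, face-normal velocities u_f, diffusivity D and source term S.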

  9. Sewage sludge pasteurization by gamma radiation: Financial viability case studies

    NASA Astrophysics Data System (ADS)

    Swinwood, Jean F.; Kotler, Jiri

    This paper examines the financial viability of sewage sludge pasteurization by gamma radiation through three North American scenarios: 1) a small volume sewage treatment plant experiencing high sludge disposal costs; 2) a large volume sewage treatment plant experiencing low sludge disposal costs; 3) a large volume sewage treatment plant experiencing high sludge disposal costs.

  10. Fully automated, real-time 3D ultrasound segmentation to estimate first trimester placental volume using deep learning.

    PubMed

    Looney, Pádraig; Stevenson, Gordon N; Nicolaides, Kypros H; Plasencia, Walter; Molloholli, Malid; Natsis, Stavros; Collins, Sally L

    2018-06-07

    We present a new technique to fully automate the segmentation of an organ from 3D ultrasound (3D-US) volumes, using the placenta as the target organ. Image analysis tools to estimate organ volume do exist but are too time-consuming and operator-dependent. Fully automating the segmentation process would potentially allow the use of placental volume to screen for increased risk of pregnancy complications. The placenta was segmented from 2,393 first trimester 3D-US volumes using a semiautomated technique. This was quality controlled by three operators to produce the "ground-truth" data set. A fully convolutional neural network (OxNNet) was trained using this ground-truth data set to automatically segment the placenta. OxNNet delivered state-of-the-art automatic segmentation. The effect of training set size on the performance of OxNNet demonstrated the need for large data sets. The clinical utility of placental volume was tested by looking at predictions of small-for-gestational-age (SGA) babies at term. The receiver-operating characteristic curves demonstrated almost identical results between OxNNet and the ground-truth. Our results demonstrated good similarity to the ground-truth and almost identical clinical results for the prediction of SGA.
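
    Once a binary placental mask has been predicted for a 3D-US volume, the volumetry itself reduces to counting voxels; the sketch below illustrates this final step with an assumed mask and voxel spacing, not the study's data or network.

        # Minimal sketch: organ volume from a predicted binary segmentation mask.
        import numpy as np

        mask = np.zeros((128, 128, 128), dtype=bool)    # stand-in for a predicted mask
        mask[40:90, 30:100, 50:110] = True

        voxel_spacing_mm = (0.6, 0.6, 0.6)              # assumed voxel size
        voxel_volume_ml = np.prod(voxel_spacing_mm) / 1000.0   # mm^3 -> ml

        placental_volume_ml = mask.sum() * voxel_volume_ml
        print(round(placental_volume_ml, 1), "ml")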

  11. Effect of cold drawing ratio on γ′ precipitation in Inconel X-750

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ha, Jeong Won; Research and Development Center, KOS Limited, Yangsan 626-230; Seong, Baek Seok

    2014-10-15

    Inconel X-750 is a Ni-based precipitation-hardened superalloy with high tensile and fracture strengths. In this study, X-750 wires were cold drawn to different extents. Small angle neutron scattering was employed to quantitatively measure the size and volume fraction of the γ′ phase as a function of the cold drawing ratio (DR) and aging temperature. The presence and size of γ′ precipitates were confirmed by transmission electron microscopy. The drawing ratio had an important effect on the volume fraction of the γ′ precipitates. However, the size of the precipitates was independent of the drawing ratio. The specimen with the minimum drawing ratio (DR0) produced the largest volume fraction of γ′ as compared with large drawing ratio specimens such as DR17 and DR42. The small volume fraction of the γ′ phase at a sizeable drawing ratio was associated with the large number of nucleation sites for secondary carbides, M23C6, and the fast diffusion paths, i.e., dislocations, needed to form M23C6. A Cr depletion zone around the secondary carbides raised the solubility of γ′. Therefore, a significant drawing ratio, contributing to a large volume fraction of the secondary carbides, decreased the volume fraction of the γ′ precipitates in Inconel X-750. - Highlights: • The volume fraction of secondary carbides increased with the drawing ratio. • The volume fraction of γ′ decreased as the drawing ratio increased. • The drawing ratio affected the γ′ volume fraction with no variation of the γ′ size. • The volume fraction of γ′ was affected by the secondary carbide volume fraction.

  12. The evolution of self-control

    PubMed Central

    MacLean, Evan L.; Hare, Brian; Nunn, Charles L.; Addessi, Elsa; Amici, Federica; Anderson, Rindy C.; Aureli, Filippo; Baker, Joseph M.; Bania, Amanda E.; Barnard, Allison M.; Boogert, Neeltje J.; Brannon, Elizabeth M.; Bray, Emily E.; Bray, Joel; Brent, Lauren J. N.; Burkart, Judith M.; Call, Josep; Cantlon, Jessica F.; Cheke, Lucy G.; Clayton, Nicola S.; Delgado, Mikel M.; DiVincenti, Louis J.; Fujita, Kazuo; Herrmann, Esther; Hiramatsu, Chihiro; Jacobs, Lucia F.; Jordan, Kerry E.; Laude, Jennifer R.; Leimgruber, Kristin L.; Messer, Emily J. E.; de A. Moura, Antonio C.; Ostojić, Ljerka; Picard, Alejandra; Platt, Michael L.; Plotnik, Joshua M.; Range, Friederike; Reader, Simon M.; Reddy, Rachna B.; Sandel, Aaron A.; Santos, Laurie R.; Schumann, Katrin; Seed, Amanda M.; Sewall, Kendra B.; Shaw, Rachael C.; Slocombe, Katie E.; Su, Yanjie; Takimoto, Ayaka; Tan, Jingzhi; Tao, Ruoting; van Schaik, Carel P.; Virányi, Zsófia; Visalberghi, Elisabetta; Wade, Jordan C.; Watanabe, Arii; Widness, Jane; Young, Julie K.; Zentall, Thomas R.; Zhao, Yini

    2014-01-01

    Cognition presents evolutionary research with one of its greatest challenges. Cognitive evolution has been explained at the proximate level by shifts in absolute and relative brain volume and at the ultimate level by differences in social and dietary complexity. However, no study has integrated the experimental and phylogenetic approach at the scale required to rigorously test these explanations. Instead, previous research has largely relied on various measures of brain size as proxies for cognitive abilities. We experimentally evaluated these major evolutionary explanations by quantitatively comparing the cognitive performance of 567 individuals representing 36 species on two problem-solving tasks measuring self-control. Phylogenetic analysis revealed that absolute brain volume best predicted performance across species and accounted for considerably more variance than brain volume controlling for body mass. This result corroborates recent advances in evolutionary neurobiology and illustrates the cognitive consequences of cortical reorganization through increases in brain volume. Within primates, dietary breadth but not social group size was a strong predictor of species differences in self-control. Our results implicate robust evolutionary relationships between dietary breadth, absolute brain volume, and self-control. These findings provide a significant first step toward quantifying the primate cognitive phenome and explaining the process of cognitive evolution. PMID:24753565

  13. Validation of a rapid, semiautomatic image analysis tool for measurement of gastric accommodation and emptying by magnetic resonance imaging

    PubMed Central

    Dixit, Sudeepa; Fox, Mark; Pal, Anupam

    2014-01-01

    Magnetic resonance imaging (MRI) has advantages for the assessment of gastrointestinal structures and functions; however, processing MRI data is time-consuming and this has limited uptake to a few specialist centers. This study introduces a semiautomatic image processing system for rapid analysis of gastrointestinal MRI. For assessment of simpler regions of interest (ROI) such as the stomach, the system generates virtual images along arbitrary planes that intersect the ROI edges in the original images. This generates seed points that are joined automatically to form contours on each adjacent two-dimensional image and reconstructed in three dimensions (3D). An alternative thresholding approach is available for rapid assessment of complex structures like the small intestine. For assessment of dynamic gastrointestinal function, such as gastric accommodation and emptying, the initial 3D reconstruction is used as a reference to process adjacent image stacks automatically. This generates four-dimensional (4D) reconstructions of dynamic volume change over time. Compared with manual processing, this semiautomatic system reduced the user input required to analyze an MRI gastric emptying study (estimated 100 vs. 10,000 mouse clicks). The analysis was not subject to the variation in volume measurements seen between three human observers. In conclusion, the image processing platform presented here processed large volumes of MRI data, such as that produced by gastric accommodation and emptying studies, with minimal user input. 3D and 4D reconstructions of the stomach and, potentially, other gastrointestinal organs are produced faster and more accurately than with manual methods. This system will facilitate the application of MRI in gastrointestinal research and clinical practice. PMID:25540229

  14. Improving Butanol Fermentation To Enter the Advanced Biofuel Market

    PubMed Central

    Tracy, Bryan P.

    2012-01-01

    1-Butanol is a large-volume, intermediate chemical with favorable physical and chemical properties for blending with or directly substituting for gasoline. The per-volume value of butanol, as a chemical, is sufficient to justify investing in the recommercialization of the classical acetone-butanol-ethanol (ABE) fermentation process (E. M. Green, Curr. Opin. Biotechnol. 22:337–343, 2011). Furthermore, with modest improvements in three areas of the ABE process, operating costs can be sufficiently decreased to make butanol an economically viable advanced biofuel. The three areas of greatest interest are (i) maximizing yields of butanol on any particular substrate, (ii) expanding the substrate utilization capabilities of the host microorganism, and (iii) reducing the energy consumption of the overall production process, in particular the separation and purification operations. In the September/October 2012 issue of mBio, Jang et al. [mBio 3(5):e00314-12, 2012] describe a comprehensive study on driving glucose metabolism in Clostridium acetobutylicum to the production of butanol. Moreover, they execute a metabolic engineering strategy to achieve the highest yet reported yields of butanol on glucose. PMID:23232720

  15. Bulk properties and near-critical behaviour of SiO2 fluid

    NASA Astrophysics Data System (ADS)

    Green, Eleanor C. R.; Artacho, Emilio; Connolly, James A. D.

    2018-06-01

    Rocky planets and satellites form through impact and accretion processes that often involve silicate fluids at extreme temperatures. First-principles molecular dynamics (FPMD) simulations have been used to investigate the bulk thermodynamic properties of SiO2 fluid at high temperatures (4000-6000 K) and low densities (500-2240 kg m-3), conditions which are relevant to protoplanetary disc condensation. Liquid SiO2 is highly networked at the upper end of this density range, but depolymerises with increasing temperature and volume, in a process characterised by the formation of oxygen-oxygen (O=O) pairs. The onset of vaporisation is closely associated with the depolymerisation process, and is likely to be non-stoichiometric at high temperature, initiated via the exsolution of O2 molecules to leave a Si-enriched fluid. By 6000 K the simulated fluid is supercritical. A large anomaly in the constant-volume heat capacity occurs near the critical temperature. We present tabulated thermodynamic properties for silica fluid that reconcile observations from FPMD simulations with current knowledge of the SiO2 melting curve and experimental Hugoniot curves.

  16. Advanced Water Purification System for In Situ Resource Utilization

    NASA Technical Reports Server (NTRS)

    Anthony, Stephen M.; Jolley, Scott T.; Captain, James G.

    2013-01-01

    A main goal in the field of In Situ Resource Utilization is to develop technologies that produce oxygen from regolith to provide consumables to an extraterrestrial outpost. The processes developed reduce metal oxides in the regolith to produce water, which is then electrolyzed to produce oxygen. Hydrochloric and hydrofluoric acids are byproducts of the reduction processes, which must be removed to meet electrolysis purity standards. We previously characterized Nafion, a highly water-selective polymeric proton-exchange membrane, as a filtration material to recover pure water from the contaminated solution. While the membranes successfully removed both acid contaminants, the removal efficiency of and water flow rate through the membranes were not sufficient to produce large volumes of electrolysis-grade water. In the present study, we investigated electrodialysis as a potential acid removal technique. Our studies have shown a rapid and significant reduction in chloride and fluoride concentrations in the feed solution, while generating a relatively small volume of concentrated waste water. Electrodialysis has shown significant promise as the primary separation technique in ISRU water purification processes.

  17. Advanced Water Purification System for In Situ Resource Utilization Project

    NASA Technical Reports Server (NTRS)

    Anthony, Stephen M.

    2014-01-01

    A main goal in the field of In Situ Resource Utilization is to develop technologies that produce oxygen from regolith to provide consumables to an extraterrestrial outpost. The processes developed reduce metal oxides in the regolith to produce water, which is then electrolyzed to produce oxygen. Hydrochloric and hydrofluoric acids are byproducts of the reduction processes, which must be removed to meet electrolysis purity standards. We previously characterized Nafion, a highly water-selective polymeric proton-exchange membrane, as a filtration material to recover pure water from the contaminated solution. While the membranes successfully removed both acid contaminants, the removal efficiency of and water flow rate through the membranes were not sufficient to produce large volumes of electrolysis-grade water. In the present study, we investigated electrodialysis as a potential acid removal technique. Our studies have shown a rapid and significant reduction in chloride and fluoride concentrations in the feed solution, while generating a relatively small volume of concentrated waste water. Electrodialysis has shown significant promise as the primary separation technique in ISRU water purification processes.

  18. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    NASA Astrophysics Data System (ADS)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.
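
    The Fourier shape descriptors mentioned above can be illustrated with a short sketch: a closed 2D contour is encoded as complex numbers, Fourier-transformed, and the scale-normalised magnitudes of the low-order coefficients form a compact feature vector for the SVM. The elliptical test contour and the number of coefficients kept are assumptions, not the framework's actual settings.

        # Minimal Fourier-descriptor sketch for a closed 2D contour.
        import numpy as np

        t = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
        contour = 40.0 * np.cos(t) + 1j * 25.0 * np.sin(t)   # stand-in organ outline

        coeffs = np.fft.fft(contour)
        coeffs[0] = 0.0                                 # drop centroid term (translation invariance)
        magnitudes = np.abs(coeffs)
        descriptors = magnitudes[1:11] / magnitudes[1]  # scale-normalised low-order descriptors
        print(np.round(descriptors, 3))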

  19. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data.

    PubMed

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-21

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.

  20. Atmospheric Pressure Glow Discharge for Point-of-Use Water Treatment

    NASA Astrophysics Data System (ADS)

    Lindsay, Alexander; Byrns, Brandon; Shannon, Steven; Knappe, Detlef

    2012-10-01

    Treatment of biological and chemical contaminants is an area of growing global interest where atmospheric pressure plasmas can make a significant contribution. Addressing key challenges of volume processing and operational cost, a large volume 162 MHz coaxial air-plasma source has been developed [Byrns (2012) J. Phys. D: Appl. Phys. 45, 195204]. Because of VHF ballasting effects, the electric discharge is maintained at a steady glow, allowing formation of critical non-equilibrium chemistry. High densities, ne = 10^11-10^12, have been recorded. The atmospheric nature of the device permits straightforward and efficient treatment of water samples. [H^+] concentrations in 150 milliliter tap water samples have been shown to increase by 10^5 after five minutes of discharge exposure. Recent literature has demonstrated that increasing acidity is strongly correlated with a solution's ability to deactivate microbial contaminants [Traylor (2011) J. Phys. D: Appl. Phys. 44, 472001]. The work presented here will explore the impact of treatment gas, system configuration, and power density on water disinfection and PFC abatement. An array of plasma diagnostics, including OES and electrical measurements, are combined with post-process water analysis, including GC-MS and QT analysis of coliform and E.coli bacteria. Development of volume processing atmospheric plasma disinfection methods offers promise for point-of-use treatments in developing areas of the world, potentially supplementing or replacing supply- and weather-dependent disinfection methods.

  1. Volume Computation of a Stockpile - a Study Case Comparing GPS and Uav Measurements in AN Open Pit Quarry

    NASA Astrophysics Data System (ADS)

    Raeva, P. L.; Filipova, S. L.; Filipov, D. G.

    2016-06-01

    This paper aims to test and evaluate the accuracy of UAV data for volumetric measurements relative to conventional GNSS techniques. For this purpose, an appropriate open pit quarry was chosen. Two sets of measurements were performed. Firstly, a stockpile was measured by GNSS technologies and, later, further terrestrial GNSS measurements were taken to model the berms of the quarry. Secondly, the area of the whole quarry, including the stockpile site, was mapped by a UAV flight. In a rapidly changing world, new techniques and methods are needed in many fields. For instance, the management of an open pit quarry requires acquiring, processing and storing a large amount of information that changes constantly with time. Fast and precise acquisition of measurements of the processes taking place in a quarry is the key to effective and stable management; in other words, it requires objective evaluation of the processes, up-to-date technologies and reliable accuracy of the results. Mining legislation often requires volumetric calculations to be accurate to within ±3% of the total amount. On the one hand, extremely precise measurements can be performed with GNSS technologies, but they can be very time-consuming. On the other hand, UAV photogrammetry offers a fast, accurate method for mapping large areas and calculating stockpile volumes. The case study was performed as part of a master's thesis.
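
    Whichever survey method supplies the surface model, the volumetric calculation itself can be sketched simply: sum the height of the stockpile surface above a base level over all grid cells and multiply by the cell area. The synthetic grid, cell size and base elevation below are assumptions for illustration only.

        # Minimal sketch: stockpile volume from a gridded surface model.
        import numpy as np

        cell = 0.5                                                     # grid cell size in metres (assumed)
        x, y = np.meshgrid(np.arange(-20, 20, cell), np.arange(-20, 20, cell))
        surface = 100.0 + np.maximum(8.0 - 0.03 * (x**2 + y**2), 0.0)  # stand-in stockpile DEM
        base = 100.0                                                   # pre-stockpile reference elevation

        heights = np.clip(surface - base, 0.0, None)
        volume_m3 = heights.sum() * cell * cell
        print(round(volume_m3, 1), "m^3")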

  2. Novel diamond cells for neutron diffraction using multi-carat CVD anvils.

    PubMed

    Boehler, R; Molaison, J J; Haberl, B

    2017-08-01

    Traditionally, neutron diffraction at high pressure has been severely limited in pressure because low neutron flux required large sample volumes and therefore large volume presses. At the high-flux Spallation Neutron Source at the Oak Ridge National Laboratory, we have developed new, large-volume diamond anvil cells for neutron diffraction. The main features of these cells are multi-carat, single crystal chemical vapor deposition diamonds, very large diffraction apertures, and gas membranes to accommodate pressure stability, especially upon cooling. A new cell has been tested for diffraction up to 40 GPa with an unprecedented sample volume of ∼0.15 mm³. High-quality spectra were obtained in 1 h for crystalline Ni and in ∼8 h for disordered glassy carbon. These new techniques will open the way for routine megabar neutron diffraction experiments.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parthipun, A. A., E-mail: aneeta@hotmail.co.uk; Taylor, J.; Manyonda, I.

    The purpose of this study was to determine whether there is a correlation between large uterine fibroid diameter, uterine volume, number of vials of embolic agent used and risk of complications from uterine artery embolisation (UAE). This was a prospective study involving 121 patients undergoing UAE for symptomatic uterine fibroids at a single institution. Patients were grouped according to the diameter of the largest fibroid and uterine volume. Results were also stratified according to the number of vials of embolic agent used and the rate of complications. No statistical difference in complication rate was demonstrated between the two groups according to diameter of the largest fibroid (large fibroids were classified as ≥10 cm; Fisher's exact test P = 1.00), and no statistical difference in complication rate was demonstrated according to uterine volume (large uterine volume was defined as ≥750 cm³; Fisher's exact test P = 0.70). Eighty-four of the 121 patients had documentation of the number of vials used during the procedure. Patients were divided into two groups, with ≥4 vials defined as a large quantity of embolic agent. There was no statistical difference between these two groups and no associated increased risk of developing complications. This study showed no increased incidence of complications in women with large-diameter fibroids or large uterine volumes as defined. In addition, there was no evidence of increased complications according to the quantity of embolic material used. Therefore, UAE should be offered to women with large fibroids and large uterine volumes.

  4. Large-scale fabrication of micro-lens array by novel end-fly-cutting-servo diamond machining.

    PubMed

    Zhu, Zhiwei; To, Suet; Zhang, Shaojian

    2015-08-10

    Fast/slow tool servo (FTS/STS) diamond turning is a very promising technique for the generation of micro-lens arrays (MLA). However, it remains a challenge to process MLA at large scale due to certain inherent limitations of this technique. In the present study, a novel ultra-precision diamond cutting method, the end-fly-cutting-servo (EFCS) system, is adopted and investigated for large-scale generation of MLA. After a detailed discussion of its characteristic advantages for processing MLA, the optimal toolpath generation strategy for the EFCS is developed with consideration of the geometry and installation pose of the diamond tool. A typical aspheric MLA over a large area is experimentally fabricated, and the resulting form accuracy, surface micro-topography and machining efficiency are critically investigated. The results indicate that an MLA with homogeneous quality over the whole area is obtained. Besides, high machining efficiency, an extremely small volume of control points for the toolpath, and optimal usage of the system dynamics of the machine tool during the whole cutting process are simultaneously achieved.

  5. MIDAS, prototype Multivariate Interactive Digital Analysis System for large area earth resources surveys. Volume 1: System description

    NASA Technical Reports Server (NTRS)

    Christenson, D.; Gordon, M.; Kistler, R.; Kriegler, F.; Lampert, S.; Marshall, R.; Mclaughlin, R.

    1977-01-01

    A third-generation, fast, low-cost, multispectral recognition system (MIDAS) able to keep pace with the large quantity and high rates of data acquisition from large regions with present and projected sensors is described. The program can process a complete ERTS frame in forty seconds and provide a color map of sixteen constituent categories in a few minutes. A principal objective of the MIDAS program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turn-around time and significant gains in throughput. The hardware and software generated in the overall program are described. The system contains a midi-computer to control the various high-speed processing elements in the data path, a preprocessor to condition data, and a classifier which implements an all-digital prototype multivariate Gaussian maximum likelihood or a Bayesian decision algorithm. Sufficient software was developed to perform signature extraction, control the preprocessor, compute classifier coefficients, control the classifier operation, operate the color display and printer, and diagnose operation.

  6. Heterogeneous seepage at the Nopal I natural analogue site, Chihuahua, Mexico

    NASA Astrophysics Data System (ADS)

    Dobson, Patrick F.; Ghezzehei, Teamrat A.; Cook, Paul J.; Rodríguez-Pineda, J. Alfredo; Villalba, Lourdes; de La Garza, Rodrigo

    2012-02-01

    A study of seepage occurring in an adit at the Nopal I uranium mine in Chihuahua, Mexico, was conducted as part of an integrated natural analogue study to evaluate the effects of infiltration and seepage on the mobilization and transport of radionuclides. An instrumented seepage collection system and local automated weather station permit direct correlation between local precipitation events and seepage. Field observations recorded between April 2005 and December 2006 indicate that seepage is highly heterogeneous with respect to time, location, and quantity. Seepage, precipitation, and fracture data were used to test two hypotheses: (1) that fast flow seepage is triggered by large precipitation events, and (2) that an increased abundance of fractures and/or fracture intersections leads to higher seepage volumes. A few zones in the back adit recorded elevated seepage volumes immediately following large (>20 mm/day) precipitation events, with transit times of less than 4 h through the 8-m thick rock mass. In most locations, there is a 1-6 month time lag between the onset of the rainy season and seepage, with longer times observed for the front adit. There is a less clear-cut relation between fracture abundance and seepage volume; processes such as evaporation and surface flow along the ceiling may also influence seepage.

  7. Lightning activity observed in upper and lower portions of storms and its relationship to storm structure from VHF mapping and Doppler radar

    NASA Technical Reports Server (NTRS)

    Taylor, W. L.; Rust, W. D.; Macgorman, D. R.; Brandes, E. A.

    1983-01-01

    Space-time mapping of very-high-frequency (VHF) sources reveals that lightning processes for cloud-to-ground (CG) and large intracloud (IC) flashes are confined to altitudes below about 10 km and closely associated with the central high-reflectivity region of a storm. Another class of IC flashes was identified that produces a scattering of small sources within the main electrically active volume of a storm and also within a large divergent wind canopy at the top of a storm. There is no apparent temporal association between the small high-altitude IC flashes, which occur almost continuously, and the large IC and CG flashes, which occur sporadically in the lower portions of storms.

  8. Supplement a to compilation of air pollutant emission factors. Volume 1. Stationary point and area sources. Fifth edition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-02-01

    This Supplement to AP-42 addresses pollutant-generating activity from Bituminous and Subbituminous Coal Combustion; Anthracite Coal Combustion; Fuel Oil Combustion; Natural Gas Combustion; Wood Waste Combustion in Boilers; Lignite Combustion; Waste Oil Combustion; Stationary Gas Turbines for Electricity Generation; Heavy-duty Natural Gas-fired Pipeline Compressor Engines; Large Stationary Diesel and all Stationary Dual-fuel Engines; Natural Gas Processing; Organic Liquid Storage Tanks; Meat Smokehouses; Meat Rendering Plants; Canned Fruits and Vegetables; Dehydrated Fruits and Vegetables; Pickles, Sauces and Salad Dressing; Grain Elevators and Processes; Cereal Breakfast Foods; Pasta Manufacturing; Vegetable Oil Processing; Wines and Brandy; Coffee Roasting; Charcoal; Coal Cleaning; Frit Manufacturing; Sand and Gravel Processing; Diatomite Processing; Talc Processing; Vermiculite Processing; Paved Roads; and Unpaved Roads. Also included is information on Generalized Particle Size Distributions.

  9. Regional flux analysis for discovering and quantifying anatomical changes: An application to the brain morphometry in Alzheimer's disease.

    PubMed

    Lorenzi, M; Ayache, N; Pennec, X

    2015-07-15

    In this study we introduce regional flux analysis, a novel approach to deformation-based morphometry based on the Helmholtz decomposition of deformations parameterized by stationary velocity fields. We use the scalar pressure map associated with the irrotational component of the deformation to discover the critical regions of volume change. These regions are used to consistently quantify the associated volume change by probabilistic integration of the flux of the longitudinal deformations across the region boundaries. The presented framework unifies voxel-based and regional approaches, and robustly describes the volume changes at both the group-wise and the subject-specific level as a spatial process governed by consistently defined regions. Our experiments on the large cohorts of the ADNI dataset show that regional flux analysis is a powerful and flexible instrument for the study of Alzheimer's disease in a wide range of scenarios: cross-sectional deformation-based morphometry, longitudinal discovery and quantification of group-wise volume changes, and statistically powered and robust quantification of hippocampal and ventricular atrophy. Copyright © 2015 Elsevier Inc. All rights reserved.
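
    The core quantity, the flux of a deformation field across a region boundary as a measure of volume change, can be illustrated with a few lines of NumPy via the divergence theorem. This sketch is not the authors' method (no Helmholtz decomposition or probabilistic boundary integration); the velocity field, region mask, and voxel spacing are synthetic assumptions.

    ```python
    # Volume change of a region measured as the flux of a stationary velocity
    # field across its boundary, computed via the divergence theorem on a grid.
    import numpy as np

    def region_flux(v, mask, spacing=(1.0, 1.0, 1.0)):
        """v: (3, X, Y, Z) velocity field; mask: boolean region; returns net outward flux."""
        div = sum(np.gradient(v[i], spacing[i], axis=i) for i in range(3))  # div(v)
        voxel_vol = np.prod(spacing)
        return float(div[mask].sum() * voxel_vol)    # surface flux = volume integral of div(v)

    # Synthetic contracting field v = -0.05 * r around the grid centre (atrophy-like).
    shape = (32, 32, 32)
    grid = np.stack(np.meshgrid(*[np.arange(s) - s / 2 for s in shape], indexing="ij"))
    v = -0.05 * grid
    mask = (grid ** 2).sum(axis=0) < 10 ** 2         # a spherical "structure"

    flux = region_flux(v, mask)
    volume = mask.sum()
    print(f"net flux {flux:.1f} voxels^3  (~{100 * flux / volume:.1f}% volume change)")
    ```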

  10. Differences in regional brain volume related to the extraversion-introversion dimension--a voxel based morphometry study.

    PubMed

    Forsman, Lea J; de Manzano, Orjan; Karabanov, Anke; Madison, Guy; Ullén, Fredrik

    2012-01-01

    Extraverted individuals are sociable, behaviorally active, and happy. We report data from a voxel based morphometry study investigating, for the first time, if regional volume in gray and white matter brain regions is related to extraversion. For both gray and white matter, all correlations between extraversion and regional brain volume were negative, i.e. the regions were larger in introverts. Gray matter correlations were found in regions that included the right prefrontal cortex and the cortex around the right temporo-parietal junction--regions that are known to be involved in behavioral inhibition, introspection, and social-emotional processing, e.g. evaluation of social stimuli and reasoning about the mental states of others. White matter correlations extended from the brainstem to widespread cortical regions, and were largely due to global effects, i.e. a larger total white matter volume in introverts. We speculate that these white matter findings may reflect differences in ascending modulatory projections affecting cortical regions involved in behavioral regulation. Copyright © 2011 Elsevier Ireland Ltd and the Japan Neuroscience Society. All rights reserved.

  11. Assessing the Feasibility of Large-Scale Countercyclical Public Job-Creation. Final Report, Volume III. Selected Implications of Public Job-Creation.

    ERIC Educational Resources Information Center

    Urban Inst., Washington, DC.

    This final volume of a three-volume report on a study assessing the feasibility of large-scale, countercyclical public job creation covers the findings regarding the priorities among projects, indirect employment effects, skill imbalances, and administrative issues; and summarizes the overall findings, conclusions, and recommendations. (Volume 1,…

  12. Ray Casting of Large Multi-Resolution Volume Datasets

    NASA Astrophysics Data System (ADS)

    Lux, C.; Fröhlich, B.

    2009-04-01

    High-quality volume visualization through ray casting on graphics processing units (GPUs) has become an important approach for many application domains. We present a GPU-based, multi-resolution ray casting technique for the interactive visualization of massive volume data sets commonly found in the oil and gas industry. Large volume data sets are represented as a multi-resolution hierarchy based on an octree data structure. The original volume data is decomposed into small bricks of a fixed size acting as the leaf nodes of the octree. These nodes are the highest resolution of the volume. Coarser resolutions are represented through inner nodes of the hierarchy, which are generated by downsampling eight neighboring nodes on a finer level. Due to the limited memory resources of current desktop workstations and graphics hardware, only a limited working set of bricks can be locally maintained for a frame to be displayed. This working set is chosen to represent the whole volume at different local resolution levels depending on the current viewer position, transfer function and distinct areas of interest. During runtime the working set of bricks is maintained in CPU and GPU memory and is adaptively updated by asynchronously fetching data from external sources like hard drives or a network. The CPU memory hereby acts as a second-level cache for these sources, from which the GPU representation is updated. Our volume ray casting algorithm is based on a 3D texture atlas in GPU memory. This texture atlas contains the complete working set of bricks of the current multi-resolution representation of the volume. This enables the volume ray casting algorithm to access the whole working set of bricks through only a single 3D texture. For traversing rays through the volume, information about the locations and resolution levels of visited bricks is required for correct compositing computations. We encode this information into a small 3D index texture which represents the current octree subdivision on its finest level and spatially organizes the bricked data. This approach allows us to render a bricked multi-resolution volume data set in only a single rendering pass with no loss of compositing precision. In contrast, most state-of-the-art volume rendering systems handle the bricked data as individual 3D textures, which are rendered one at a time while the results are composited into a lower-precision frame buffer. Furthermore, our method enables us to integrate advanced volume rendering techniques like empty-space skipping, adaptive sampling and preintegrated transfer functions in a very straightforward manner with virtually no extra cost. Our interactive volume ray casting implementation allows high-quality visualizations of massive volume data sets of tens of gigabytes in size on standard desktop workstations.
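
    A small illustration of one ingredient of such a scheme, choosing a per-brick octree level from the viewer distance so that the working set stays bounded. This is an assumption-laden sketch, not the authors' renderer: the distance-based heuristic, brick grid, and all parameter values are made up for demonstration.

    ```python
    # Distance-based level-of-detail selection for octree bricks (illustrative only).
    import numpy as np

    def select_level(brick_centers, viewer_pos, brick_size, n_levels, k=2.0):
        """Return one octree level per brick: 0 = finest leaves, n_levels-1 = coarsest."""
        dist = np.linalg.norm(brick_centers - viewer_pos, axis=1)
        level = np.floor(np.log2(np.maximum(dist / (k * brick_size), 1.0))).astype(int)
        return np.clip(level, 0, n_levels - 1)

    # Synthetic brick grid: 8 x 8 x 8 bricks of 32^3 voxels each, viewer near one corner.
    centers = np.stack(np.meshgrid(*[np.arange(8) * 32 + 16] * 3, indexing="ij"), -1).reshape(-1, 3)
    levels = select_level(centers, viewer_pos=np.array([0.0, 0.0, 0.0]),
                          brick_size=32.0, n_levels=4)
    print(np.bincount(levels, minlength=4))   # how many bricks land on each level
    ```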

  13. Optimization of the physical properties of a carbon-epoxy composite manufactured by the RFI process

    NASA Astrophysics Data System (ADS)

    Koanda, Mahamat Mamadou Lamine

    The RFI (Resin Film Infusion) process is a composite materials manufacturing process. Known especially for the small investment it requires, RFI processes are more and more widely used in the aeronautical industry. However, a number of aspects of this process are still not well controlled. The quality of the final part depends on which process is used. In the case of RFI, controlling physical characteristics such as thickness, fiber volume fraction or void content remains a major challenge. This dissertation deals with the optimization of the physical properties of a carbon composite manufactured with the RFI process. The ASTM D3171 and ASTM D792 standards were used to measure the void content and fiber volume fraction. First, we introduced different layup sequences in the RFI process and evaluated their impact on the physical properties of the final product. The experiments show that the primary mode A, with the resin film at the bottom, results in much better quality, with controlled fiber volume fraction and void content. Mode B (film in the symmetry plane) yields results identical to mode A except for more irregular thicknesses. Mode C (symmetrical films in the laminate) produces locally unacceptable void contents. Mode D (resin film on top of the laminate) yields much better results than mode A, with the exception of more irregular thicknesses. Introducing gaps and overlaps in the resin film has negative effects beyond 2.54 cm (one inch) and should be avoided. Several C-scan observations of the manufactured samples showed a large accumulation of porosity in resin-rich areas, as well as surface defects. Finally, we analyzed the cure cycle in light of thermodynamic porosity models. It is evident that the diffusion phenomenon is essential in this process; therefore, better conditioning of the resin film made by Cytec is required. A design with a cycle stop and a pressure lag yields the optimal cure cycle for the RFI process.

  14. Foundering and Exhumation of UHP Terranes: Race Car or School Bus?

    NASA Astrophysics Data System (ADS)

    Kylander-Clark, A. R.; Hacker, B. R.

    2008-12-01

    Recent geochronologic data from the giant ultrahigh-pressure (UHP) terrane, in the Western Gneiss Region of Norway, indicate that subduction and exhumation were relatively slow (a few mm/yr), and that the terrane was exhumed to the surface as a relatively thick, coherent body. These conclusions are in stark contrast to those reached in previous studies of some of the best-studied, smaller UHP terranes and suggest that the processes that form and/or exhume small UHP terranes are fundamentally different from the processes that affect large UHP terranes. These differences may be the result of variations in the buoyancy forces of different proportions of subducted felsic crust, mafic crust, and mantle lithosphere. Initial collision occurs via the subduction of smaller portions of continental material, such as microcontinents or ribbon continents. Because the proportion of continental crust is small, the processes involved in early UHP terrane formation are dominated by the oceanic slab; subduction rates are fast because average plate densities are high, and, as a result, subduction angles are steep. Because these smaller, thinner portions of crust are weak, they deform easily and mix readily with the mantle. As the collision matures, thicker and larger portions of continental material, such as a continental margin, are subducted, and the subduction regime changes from one that was ocean dominated to one that is continent dominated. The increased buoyancy of the larger volume of continental crust resists the pull of the leading oceanic lithosphere; subduction shallows and plate rates slow. Because the downgoing continent is thick, it is strong, remains cohesive and has limited interaction with the mantle. Although the subduction regime during early orogenesis is distinct from that during late orogenesis, the degree of mountain building and crustal thickening may be similar in both stages as small volumes and fast flow rates of buoyant material give way to large volumes and slow flow rates.

  15. A very efficient RCS data compression and reconstruction technique, volume 4

    NASA Technical Reports Server (NTRS)

    Tseng, N. Y.; Burnside, W. D.

    1992-01-01

    A very efficient compression and reconstruction scheme for RCS measurement data was developed. The compression is done by isolating the scattering mechanisms on the target and recording their individual responses in the frequency and azimuth scans, respectively. The reconstruction, which is the inverse of the compression process, is guaranteed by the sampling theorem. Two sets of data, the corner reflectors and the F-117 fighter model, were processed and the results were shown to be convincing. The compression ratio can be as large as several hundred, depending on the target's geometry and scattering characteristics.

  16. Idaho Nuclear Technology and Engineering Center Low-Activity Waste Process Technology Program, FY-98 Status Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herbst, A.K.; Rogers, A.Z.; McCray, J.A.

    The Low-Activity Waste Process Technology Program at the Idaho Nuclear Technology and Engineering Center (INTEC) anticipates that large volumes of low-level/low-activity wastes will need to be grouted prior to near-surface disposal. During fiscal year 1998, three grout formulations were studied for low-activity wastes derived from INTEC liquid sodium-bearing waste. Compressive strength and leach results are presented for phosphate bonding cement, acidic grout, and alkaline grout formulations. In an additional study, grout formulations are recommended for stabilization of the INTEC underground storage tank residual heels.

  17. Advances in roll to roll processing of optics

    NASA Astrophysics Data System (ADS)

    Watts, Michael P. C.

    2008-02-01

    Today, there are a number of successful commercial applications that utilize roll-to-roll processing, and almost all involve optics: unpatterned film, patterned film, and devices on film. The largest applications today are in holograms and brightness enhancement film (BEF) for LCDs. Solar cells are rapidly growing. These are mostly made in large captive facilities with their own proprietary equipment, materials and pattern generation capability. Worldwide roll-to-roll volume is > 100 million m² per year and generates sales of > $5B. The vast majority of the sales are in BEF film by 3M.

  18. Increasing the speed of medical image processing in MatLab®

    PubMed Central

    Bister, M; Yap, CS; Ng, KH; Tok, CH

    2007-01-01

    MatLab® has often been considered an excellent environment for fast algorithm development but is generally perceived as slow and hence not fit for routine medical image processing, where large data sets are now available, e.g., high-resolution CT image sets with typically hundreds of 512 × 512 slices. Yet, with proper programming practices – vectorization, pre-allocation and specialization – applications in MatLab® can run as fast as in the C language. In this article, this point is illustrated with fast implementations of bilinear interpolation, watershed segmentation and volume rendering. PMID:21614269
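
    The vectorization and pre-allocation practices named above are not specific to MatLab®; the same effect can be shown in any array language. Below is an illustrative NumPy stand-in (the paper's own examples are MatLab code and are not reproduced here); the slice count and the toy intensity rescaling are arbitrary.

    ```python
    # Loop-based vs vectorised processing of a small stack of CT-like slices.
    import time
    import numpy as np

    slices = np.random.rand(20, 512, 512)           # a stack of 20 CT-like slices

    # Naive element-wise loop with pre-allocated output (still slow in pure Python).
    t0 = time.perf_counter()
    out_loop = np.empty_like(slices)                # pre-allocation
    for k in range(slices.shape[0]):
        for i in range(slices.shape[1]):
            for j in range(slices.shape[2]):
                out_loop[k, i, j] = 0.5 * slices[k, i, j] + 0.25
    t_loop = time.perf_counter() - t0

    # Vectorised form: one array expression, same result.
    t0 = time.perf_counter()
    out_vec = 0.5 * slices + 0.25
    t_vec = time.perf_counter() - t0

    print(f"loop {t_loop:.2f}s vs vectorised {t_vec:.4f}s, "
          f"equal: {np.allclose(out_loop, out_vec)}")
    ```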

  19. Combined low-volume polyethylene glycol solution plus stimulant laxatives versus standard-volume polyethylene glycol solution: A prospective, randomized study of colon cleansing before colonoscopy

    PubMed Central

    Hookey, Lawrence C; Depew, William T; Vanner, Stephen J

    2006-01-01

    INTRODUCTION The effectiveness of polyethylene glycol solutions (PEG) for colon cleansing is often limited by the inability of patients to drink adequate portions of the 4 L solution. The aim of the present study was to determine whether a reduced volume of PEG combined with stimulant laxatives would be better tolerated and as or more effective than the standard dose. METHODS Patients undergoing outpatient colonoscopy were randomly assigned to receive either low-volume PEG plus sennosides (120 mg oral sennosides syrup followed by 2 L PEG) or the standard volume preparation (4 L PEG). The subjects rated the tolerability of the preparations and their symptoms. Colonoscopists were blind to the colonic cleansing preparation and graded the cleansing efficacy using a validated tool (the Ottawa scale). RESULTS The low-volume PEG plus sennosides preparation was significantly better tolerated than the standard large volume PEG (P<0.001) but was less efficacious (P=0.03). Thirty-eight per cent of patients in the large volume PEG group were unable to finish the preparation, compared with only 6% in the reduced volume group. There were no adverse events reported. CONCLUSIONS Although the low-volume PEG plus sennosides preparation was better tolerated, it was not as effective as standard large-volume PEG. However, in view of the significant difference in tolerance, further research investigating possible improvements in the reduced-volume regimen seems warranted. PMID:16482236

  20. User Interactive Software for Analysis of Human Physiological Data

    NASA Technical Reports Server (NTRS)

    Cowings, Patricia S.; Toscano, William; Taylor, Bruce C.; Acharya, Soumydipta

    2006-01-01

    Ambulatory physiological monitoring has been used to study human health and performance in space and in a variety of Earth-based environments (e.g., military aircraft, armored vehicles, small groups in isolation, and patients). Large, multi-channel data files are typically recorded in these environments, and these files often require the removal of contaminated data prior to processing and analyses. Physiological data processing can now be performed with user-friendly, interactive software developed by the Ames Psychophysiology Research Laboratory. This software, which runs on a Windows platform, contains various signal-processing routines for both time- and frequency-domain data analyses (e.g., peak detection, differentiation and integration, digital filtering, adaptive thresholds, Fast Fourier Transform power spectrum, auto-correlation, etc.). Data acquired with any ambulatory monitoring system that provides a text or binary file format are easily imported into the processing software. The application provides a graphical user interface where one can manually select and correct data artifacts utilizing linear and zero interpolation and add trigger points for missed peaks. Block and moving average routines are also provided for data reduction. Processed data in numeric and graphic format can be exported to Excel. This software, PostProc (for post-processing), requires the DADiSP engineering spreadsheet (DSP Development Corp), or equivalent, for implementation. Specific processing routines were written for electrocardiography, electroencephalography, electromyography, blood pressure, skin conductance level, impedance cardiography (cardiac output, stroke volume, thoracic fluid volume), temperature, and respiration.
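
    Two of the routine types listed above, peak detection and a power spectrum, are easy to illustrate with standard scientific Python. The sketch below is a generic stand-in, not the PostProc/DADiSP routines themselves; the sampling rate, thresholds, and the synthetic "heartbeat-like" signal are assumptions.

    ```python
    # Peak (beat) detection and a Welch power spectrum on a synthetic signal.
    import numpy as np
    from scipy.signal import find_peaks, welch

    fs = 250.0                                        # sampling rate, Hz (assumed)
    t = np.arange(0, 30, 1 / fs)
    # Synthetic heartbeat-like signal: narrow 1.2 Hz pulses plus noise.
    signal = np.sin(2 * np.pi * 1.2 * t) ** 21 + \
             0.1 * np.random.default_rng(0).normal(size=t.size)

    peaks, _ = find_peaks(signal, height=0.5, distance=int(0.4 * fs))  # ~0.4 s refractory
    rr = np.diff(peaks) / fs                          # inter-beat intervals, s
    print(f"{peaks.size} beats, mean rate {60 / rr.mean():.1f} bpm")

    freqs, psd = welch(signal, fs=fs, nperseg=2048)   # power spectrum
    print(f"dominant frequency {freqs[np.argmax(psd)]:.2f} Hz")
    ```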

  1. Linear Approximation SAR Azimuth Processing Study

    NASA Technical Reports Server (NTRS)

    Lindquist, R. B.; Masnaghetti, R. K.; Belland, E.; Hance, H. V.; Weis, W. G.

    1979-01-01

    A segmented linear approximation of the quadratic phase function that is used to focus the synthetic antenna of a SAR was studied. Ideal focusing, using a quadratically varying phase focusing function during the time radar target histories are gathered, requires a large number of complex multiplications. These can be largely eliminated by using linear approximation techniques. The result is a reduced processor size and chip count relative to ideally focused processing and a correspondingly increased feasibility for spaceworthy implementation. A preliminary design and sizing for a spaceworthy linear approximation SAR azimuth processor meeting requirements similar to those of the SEASAT-A SAR was developed. The study resulted in a design with approximately 1500 ICs, 1.2 cubic feet of volume, and 350 watts of power for a single-look, 4000-range-cell azimuth processor with 25 meters resolution.
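
    The idea of replacing the quadratic azimuth phase with a few linear segments can be shown numerically. This is an illustration only, not the flight design: the normalised aperture, the quadratic coefficient, and the segment counts are made-up values chosen to show how the residual phase error shrinks as segments are added.

    ```python
    # Piecewise-linear approximation of a quadratic phase history and its error.
    import numpy as np

    n = 1024                                   # azimuth samples in the aperture
    t = np.linspace(-0.5, 0.5, n)              # normalised slow time
    k = 200.0                                  # assumed quadratic phase coefficient (rad)
    phase = k * t ** 2                         # ideal quadratic focusing phase

    def piecewise_linear(x, y, n_seg):
        """Approximate the curve y(x) by n_seg straight segments joined at knots."""
        knots = np.linspace(x[0], x[-1], n_seg + 1)
        return np.interp(x, knots, np.interp(knots, x, y))

    for n_seg in (4, 8, 16):
        approx = piecewise_linear(t, phase, n_seg)
        err = np.max(np.abs(phase - approx))
        print(f"{n_seg:2d} segments: max phase error {err:.2f} rad")
    ```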

  2. The largest volcanic eruptions on Earth

    NASA Astrophysics Data System (ADS)

    Bryan, Scott E.; Peate, Ingrid Ukstins; Peate, David W.; Self, Stephen; Jerram, Dougal A.; Mawby, Michael R.; Marsh, J. S. (Goonie); Miller, Jodie A.

    2010-10-01

    Large igneous provinces (LIPs) are sites of the most frequently recurring, largest-volume basaltic and silicic eruptions in Earth history. These large-volume (> 1000 km³ dense rock equivalent) and large-magnitude (> M8) eruptions produce areally extensive (10^4-10^5 km²) basaltic lava flow fields and silicic ignimbrites that are the main building blocks of LIPs. Available information on the largest eruptive units comes primarily from the Columbia River and Deccan provinces for the dimensions of flood basalt eruptions, and the Paraná-Etendeka and Afro-Arabian provinces for the silicic ignimbrite eruptions. In addition, three large-volume (675-2000 km³) silicic lava flows have also been mapped out in the Proterozoic Gawler Range province (Australia), an interpreted LIP remnant. Magma volumes of > 1000 km³ have also been emplaced as high-level basaltic and rhyolitic sills in LIPs. The data sets indicate comparable eruption magnitudes between the basaltic and silicic eruptions, but because considerable volumes reside as co-ignimbrite ash deposits, the current volume constraints for the silicic ignimbrite eruptions may be considerably underestimated. Magma composition thus appears to be no barrier to the volume of magma emitted during an individual eruption. Despite this general similarity in magnitude, flood basaltic and silicic eruptions are very different in terms of eruption style, duration, intensity, vent configuration, and emplacement style. Flood basaltic eruptions are dominantly effusive and Hawaiian-Strombolian in style, with magma discharge rates of ~10^6-10^8 kg s⁻¹ and eruption durations estimated at years to tens of years, and they emplace dominantly compound pahoehoe lava flow fields. Effusive and fissural eruptions have also emplaced some large-volume silicic lavas, but discharge rates are unknown and may be up to an order of magnitude greater than those of flood basalt lava eruptions for emplacement to be on realistic time scales (< 10 years). Most silicic eruptions, however, are moderately to highly explosive, producing co-current pyroclastic fountains (rarely Plinian) with discharge rates of 10^9-10^11 kg s⁻¹ that emplace welded to rheomorphic ignimbrites. At present, durations for the large-magnitude silicic eruptions are unconstrained; at discharge rates of 10^9 kg s⁻¹, equivalent to the peak of the 1991 Mt Pinatubo eruption, the largest silicic eruptions would take many months to evacuate > 5000 km³ of magma. The generally simple deposit structure is more suggestive of short-duration (hours to days) and high-intensity (~10^11 kg s⁻¹) eruptions, perhaps with hiatuses in some cases. These extreme discharge rates would be facilitated by multiple point, fissure and/or ring-fracture venting of magma. Eruption frequencies are much elevated for large-magnitude eruptions of both magma types during LIP-forming episodes. However, in basalt-dominated provinces (continental and ocean basin flood basalt provinces, oceanic plateaus, volcanic rifted margins), large-magnitude (> M8) basaltic eruptions have much shorter recurrence intervals of 10^3-10^4 years, whereas similar-magnitude silicic eruptions may have recurrence intervals of up to 10^5 years. The Paraná-Etendeka province was the site of at least nine > M8 silicic eruptions over an ~1 Myr period at ~132 Ma; a similar eruption frequency, although with fewer silicic eruptions, is also observed for the Afro-Arabian Province.
    The huge volumes of basaltic and silicic magma erupted in quick succession during LIP events raise several unresolved issues in terms of the locus of magma generation and storage (if any) in the crust prior to eruption, and the paths and rates of ascent from magma reservoirs to the surface. Available data indicate four end-member magma petrogenetic pathways in LIPs: 1) flood basalt magmas with primitive, mantle-dominated geochemical signatures (often high-Ti basalt magma types) that were either transferred directly from melting regions in the upper mantle to fissure vents at the surface, or resided temporarily in reservoirs in the upper mantle or in mafic underplate, thereby preventing extensive crustal contamination or crystallisation; 2) flood basalt magmas (often low-Ti types) that have undergone storage at lower ± upper crustal depths resulting in crustal assimilation, crystallisation, and degassing; 3) generation of high-temperature anhydrous, crystal-poor silicic magmas (e.g., Paraná-Etendeka quartz latites) by large-scale AFC processes involving lower crustal granulite melting and/or basaltic underplate remelting; and 4) rejuvenation of upper-crustal batholiths (mainly near-solidus crystal mush) by shallow intrusion and underplating by mafic magma, providing the thermal and volatile input to produce large volumes of crystal-rich (30-50%) dacitic to rhyolitic magma and, for ignimbrite-producing eruptions, well-defined calderas up to 80 km in diameter (e.g., Fish Canyon Tuff model), which characterise some silicic eruptions in silicic LIPs.
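
    A quick back-of-envelope check of the "many months" figure quoted above, treating the dense-rock-equivalent magma density of 2,400 kg/m³ as an assumed nominal value:

    ```python
    # Duration to evacuate > 5000 km^3 of magma at a peak-Pinatubo-like rate.
    volume_km3 = 5000.0
    density = 2400.0                       # kg per m^3, assumed nominal value
    discharge = 1e9                        # kg per s, as quoted in the abstract
    mass = volume_km3 * 1e9 * density      # 1 km^3 = 1e9 m^3
    seconds = mass / discharge
    print(f"{seconds / 86400:.0f} days  (~{seconds / (86400 * 30):.1f} months)")
    # -> roughly 139 days, i.e. "many months", consistent with the text.
    ```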

  3. Availability and temporal heterogeneity of water supply affect the vertical distribution and mortality of a belowground herbivore and consequently plant growth.

    PubMed

    Tsunoda, Tomonori; Kachi, Naoki; Suzuki, Jun-Ichirou

    2014-01-01

    We examined how the volume and temporal heterogeneity of water supply changed the vertical distribution and mortality of a belowground herbivore, and consequently affected plant biomass. Plantago lanceolata (Plantaginaceae) seedlings were grown one per pot under different combinations of water volume (large or small volume) and heterogeneity (homogeneous water conditions, watered every day; heterogeneous conditions, watered every 4 days) in the presence or absence of a larva of the belowground herbivorous insect Anomala cuprea (Coleoptera: Scarabaeidae). The larva was confined to different vertical positions: the top feeding zone (top treatment), the middle feeding zone (middle treatment), or the bottom feeding zone (bottom treatment); alternatively, no larva was introduced (control treatment) or larval movement was not confined (free treatment). A three-way interaction between water volume, heterogeneity, and the herbivore significantly affected plant biomass. With a large water volume, plant biomass was lower in the free treatment than in the control treatment regardless of heterogeneity, and biomass in the free treatment was as low as in the top treatment. With a small water volume and in the free treatment, plant biomass was low (similar to that under the top treatment) under homogeneous water conditions but high under heterogeneous ones (similar to that under the middle or bottom treatment). Therefore, there was little effect of belowground herbivory on plant growth under heterogeneous water conditions. In the other watering regimes, herbivores were distributed in the shallow soil and reduced root biomass. Herbivore mortality was high with homogeneous application of a large water volume or heterogeneous application of a small water volume. Under the large water volume, plant biomass was high in pots in which the herbivore had died. Thus, combinations of water volume and heterogeneity affected plant growth via changes in the distribution and mortality of a belowground herbivore.

  4. Improving conversion yield of fermentable sugars into fuel ethanol in 1st generation yeast-based production processes.

    PubMed

    Gombert, Andreas K; van Maris, Antonius J A

    2015-06-01

    Current fuel ethanol production using yeasts and starch or sucrose-based feedstocks is referred to as 1st generation (1G) ethanol production. These processes are characterized by the high contribution of sugar prices to the final production costs, by high production volumes, and by low profit margins. In this context, small improvements in the ethanol yield on sugars have a large impact on process economy. Three types of strategies used to achieve this goal are discussed: engineering free-energy conservation, engineering redox-metabolism, and decreasing sugar losses in the process. Whereas the two former strategies lead to decreased biomass and/or glycerol formation, the latter requires increased process and/or yeast robustness. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Development and validation of automatic tools for interactive recurrence analysis in radiation therapy: optimization of treatment algorithms for locally advanced pancreatic cancer.

    PubMed

    Kessel, Kerstin A; Habermehl, Daniel; Jäger, Andreas; Floca, Ralf O; Zhang, Lanlan; Bendl, Rolf; Debus, Jürgen; Combs, Stephanie E

    2013-06-07

    In radiation oncology, recurrence analysis is an important part of the evaluation process and of clinical quality assurance of treatment concepts. Using the example of 9 patients with locally advanced pancreatic cancer, we developed and validated interactive analysis tools to support the evaluation workflow. After an automatic registration of the radiation planning CTs with the follow-up images, the recurrence volumes are segmented manually. Based on these volumes the DVH (dose-volume histogram) statistic is calculated, followed by determination of the dose applied to the region of recurrence and the distance between the boost and recurrence volumes. We calculated the percentage of the recurrence volume within the 80%-isodose volume and compared it to the location of the recurrence within the boost volume, boost + 1 cm, boost + 1.5 cm and boost + 2 cm volumes. Recurrence analysis of the 9 patients demonstrated that all recurrences except one occurred within the defined GTV/boost volume; one recurrence developed beyond the field border (out-of-field). With the defined distance volumes in relation to the recurrences, we could show that 7 recurrent lesions were within a 2 cm radius of the primary tumor. Two large recurrences extended beyond 2 cm; however, this might be due to very rapid growth and/or late detection of the tumor progression. The main goal of using automatic analysis tools is to reduce the time and effort of conducting clinical analyses. We showed a first approach and use of a semi-automated workflow for recurrence analysis, which will be continuously optimized. In conclusion, despite the limitations of the automatic calculations, we contributed to in-house optimization of subsequent study concepts based on an improved and validated target volume definition.
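
    The DVH statistic and the 80%-isodose coverage mentioned above reduce to simple array operations once dose grid and recurrence mask are available. The sketch below is generic, not the authors' tool: the dose grid, recurrence mask, prescription dose, and bin count are synthetic assumptions.

    ```python
    # Cumulative dose-volume histogram and 80%-isodose coverage for a masked volume.
    import numpy as np

    def cumulative_dvh(dose, mask, bins=100):
        """Return (dose levels, % of masked volume receiving at least that dose)."""
        d = dose[mask]
        levels = np.linspace(0.0, d.max(), bins)
        volume_pct = np.array([(d >= lv).mean() * 100.0 for lv in levels])
        return levels, volume_pct

    rng = np.random.default_rng(0)
    dose = rng.normal(45.0, 8.0, size=(64, 64, 64)).clip(0)   # synthetic dose grid, Gy
    mask = np.zeros(dose.shape, dtype=bool)
    mask[20:35, 20:35, 20:35] = True                          # synthetic recurrence volume

    levels, vol = cumulative_dvh(dose, mask)
    prescription = 50.0                                       # Gy, assumed
    covered = (dose[mask] >= 0.8 * prescription).mean() * 100.0
    print(f"V(80% of {prescription:.0f} Gy) = {covered:.1f}% of the recurrence volume")
    print(f"D50 (dose to 50% of the volume): {levels[np.argmin(np.abs(vol - 50))]:.1f} Gy")
    ```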

  6. A Multi-Cohort Study of ApoE ɛ4 and Amyloid-β Effects on the Hippocampus in Alzheimer’s Disease

    PubMed Central

    Khan, Wasim; Giampietro, Vincent; Banaschewski, Tobias; Barker, Gareth J.; Bokde, Arun L.W.; Büchel, Christian; Conrod, Patricia; Flor, Herta; Frouin, Vincent; Garavan, Hugh; Gowland, Penny; Heinz, Andreas; Ittermann, Bernd; Lemaître, Hervé; Nees, Frauke; Paus, Tomas; Pausova, Zdenka; Rietschel, Marcella; Smolka, Michael N.; Ströhle, Andreas; Gallinat, Jürgen; Vellas, Bruno; Soininen, Hilkka; Kloszewska, Iwona; Tsolaki, Magda; Mecocci, Patrizia; Spenger, Christian; Villemagne, Victor L.; Masters, Colin L.; Muehlboeck, J-Sebastian; Bäckman, Lars; Fratiglioni, Laura; Kalpouzos, Grégoria; Wahlund, Lars-Olof; Schumann, Gunther; Lovestone, Simon; Williams, Steven C.R.; Westman, Eric; Simmons, Andrew

    2017-01-01

    The apolipoprotein E (APOE) gene has been consistently shown to modulate the risk of Alzheimer’s disease (AD). Here, using an AD and normal aging dataset primarily consisting of three AD multi-center studies (n = 1,781), we compared the effect of APOE and amyloid-β (Aβ) on baseline hippocampal volumes in AD patients, mild cognitive impairment (MCI) subjects, and healthy controls. A large sample of healthy adolescents (n = 1,387) was also used to compare hippocampal volumes between APOE groups. Subjects had undergone a magnetic resonance imaging (MRI) scan and APOE genotyping. Hippocampal volumes were processed using FreeSurfer. In the AD and normal aging dataset, hippocampal comparisons were performed in each APOE group and in ɛ4 carriers with positron emission tomography (PET) Aβ who were dichotomized (Aβ+/Aβ–) using previous cut-offs. We found a linear reduction in hippocampal volumes with ɛ4 carriers possessing the smallest volumes, ɛ3 carriers possessing intermediate volumes, and ɛ2 carriers possessing the largest volumes. Moreover, AD and MCI ɛ4 carriers possessed the smallest hippocampal volumes and control ɛ2 carriers possessed the largest hippocampal volumes. Subjects with both APOE ɛ4 and Aβ positivity had the lowest hippocampal volumes when compared to Aβ- ɛ4 carriers, suggesting a synergistic relationship between APOE ɛ4 and Aβ. However, we found no hippocampal volume differences between APOE groups in healthy 14-year-old adolescents. Our findings suggest that the strongest neuroanatomic effect of APOE ɛ4 on the hippocampus is observed in AD and groups most at risk of developing the disease, whereas hippocampi of old and young healthy individuals remain unaffected. PMID:28157104

  7. Ultra-High Density Holographic Memory Module with Solid-State Architecture

    NASA Technical Reports Server (NTRS)

    Markov, Vladimir B.

    2000-01-01

    NASA's terrestrial, space, and deep-space missions require technology that allows storing, retrieving, and processing a large volume of information. Holographic memory offers high-density data storage with parallel access and high throughput. Several methods exist for data multiplexing based on the fundamental principles of volume hologram selectivity. We recently demonstrated that spatial (amplitude-phase) encoding of the reference wave (SERW) is a promising way to increase the storage density. The SERW hologram offers a method of selectivity other than the traditional ones, such as spatial de-correlation between the recorded and reconstruction fields. In this report we present the experimental results of the SERW-hologram memory module with solid-state architecture, which is of particular interest for space operations.

  8. LLE review. Volume 61, Quarterly report, October--December 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1994-12-31

    This volume of the LLE Review, covering the period of October--December 1994, contains articles on a diagnostic method employing krypton spectroscopy for measurement of temperature and shell-fuel mixing in high-temperature implosions; the first direct assessment of the ion-acoustic decay instability in a large-scale-length, hot plasma; measurements of polarization mode dispersion and group-velocity walkaway in birefringent media using a frequency-domain interferometer; an evaluation of the magnetic flux dynamics occurring in an optically triggered, thin-film superconducting switch; the effect of slurry fluid chemistry on particle size distribution during aqueous polishing of optical glass; and the influence of thermal and mechanical processing history in the preparation of well-ordered liquid crystal elastomer systems.

  9. Interannual kinetics (2010-2013) of large wood in a river corridor exposed to a 50-year flood event and fluvial ice dynamics

    NASA Astrophysics Data System (ADS)

    Boivin, Maxime; Buffin-Bélanger, Thomas; Piégay, Hervé

    2017-02-01

    Semi-alluvial rivers of the Gaspé Peninsula, Québec, are prone to produce and transport vast quantities of large wood (LW). The high rate of lateral erosion owing to high energy flows and noncohesive banks is the main process leading to the recruitment of large wood, which in turn initiates complex patterns of wood accumulation and reentrainment within the active channel. The delta of the Saint-Jean River (SJR) has accumulated large annual wood fluxes since 1960 that culminated in a wood raft of > 3-km in length in 2014. To document the kinetics of large wood on the main channel of SJR, four annual surveys were carried out from 2010 to 2013 to locate and describe > 1000 large wood jams (LWJ) and 2000 large wood individuals (LWI) along a 60-km river section. Airborne and ground photo/video images were used to estimate the wood volume introduced by lateral erosion and to identify local geomorphic conditions that control wood mobility and deposits. Video camera analysis allowed the examination of transport rates from three hydrometeorological events for specific river sections. Results indicate that the volume of LW recruited between 2010 and 2013 represents 57% of the total LW production over the 2004-2013 period. Volumes of wood deposited along the 60-km section were four times higher in 2013 than in 2010. Increases in wood amount occurred mainly in upper alluvial sections of the river, whereas decreases were observed in the semi-alluvial middle sections. Observations suggest that the 50-year flood event of 2010 produced large amounts of LW that were only partly exported out of the basin so that a significant amount was still available for subsequent floods. Large wood storage continued after this flood until a similar flood or an ice-breakup event could remobilise these LW accumulations into the river corridor. Ice-jam floods transport large amounts of wood during events with fairly low flow but do not contribute significantly to recruitment rates (ca. 10 to 30% early). It is fairly probable that the wood export peak observed in 2012 at the river mouth, where no flood occurred and which is similar to the 1-in 10-year flood of 2010, is mainly linked to such ice-break events that occurred in March 2012.

  10. Record of massive upwellings from the Pacific large low shear velocity province

    PubMed Central

    Madrigal, Pilar; Gazel, Esteban; Flores, Kennet E.; Bizimis, Michael; Jicha, Brian

    2016-01-01

    Large igneous provinces, as the surface expression of deep mantle processes, play a key role in the evolution of the planet. Here we analyse the geochemical record and timing of the Pacific Ocean Large Igneous Provinces and preserved accreted terranes to reconstruct the history of pulses of mantle plume upwellings and their relation with a deep-rooted source like the Pacific large low-shear velocity Province during the Mid-Jurassic to Upper Cretaceous. Petrological modelling and geochemical data suggest the need of interaction between these deep-rooted upwellings and mid-ocean ridges in pulses separated by ∼10–20 Ma, to generate the massive volumes of melt preserved today as oceanic plateaus. These pulses impacted the marine biota resulting in episodes of anoxia and mass extinctions shortly after their eruption. PMID:27824054

  11. Experimental replacement of calcium carbonates by fluorite: high volume changes and porosity generation

    NASA Astrophysics Data System (ADS)

    Trindade Pedrosa, Elisabete; Putnis, Andrew

    2015-04-01

    Pseudomorphic mineral replacement reactions are a common phenomenon in nature and are often described as interface-coupled dissolution-reprecipitation processes. The generation of porosity is a key factor for their progression, since it creates the pathway for fluid infiltration towards an ongoing reaction front. The generation of porosity depends on two key factors: the molar volume difference between parent and product phase, and the relative solubilities of the parent and product in the fluid at the mineral-fluid interface (Pollok et al., 2011). Jamtveit et al. (2009) demonstrated that the permeability of the parent rock may also be enhanced by the development of fractures as a response to stresses generated by local volume changes at the reaction interface, which in turn increases the reaction rate. The replacement of calcite (CaCO3) by fluorite (CaF2) involves a molar volume decrease of 33.5 %. If high volume changes indeed generate high local stresses, a fragmentation process is expected to be driven by this replacement reaction. To test this hypothesis, a number of hydrothermal experiments were performed. Small cubes of calcite rock (Carrara marble) and single crystals of calcite were used as parent materials. Two fluoride solutions (ammonium fluoride and sodium fluoride) were used as reactants. Samples were reacted at temperatures up to 200°C for various times and quenched to room temperature. After drying, samples were mounted in epoxy holders, cross sections through the centre of the samples were cut and polished, and analysed using scanning electron microscopy (SEM), X-ray diffraction (XRD), and electron microprobe analysis (EMP). The replacement end product of all experiments was confirmed to be fluorite. In every case the external shape of the samples was perfectly maintained. No reaction-induced fracturing was visible in any of the samples (rock or single crystals), although the texture of the replaced material was quite complex, often with a 'V'-shaped reaction front. The main difference between single crystals and rock was that in the latter, grain boundaries were rapid transport pathways for fluid infiltration, resulting in the precipitation of fluorite within the sample at locations further from the main reaction front. The porosity formed was very high and complex, its texture depending on the shape and orientation of the replaced material. Very large hollow spaces with diameters >30 μm formed in several samples. In this system the large volume decrease is accommodated by a high porosity rather than fracturing. Jamtveit B., Putnis C.V. & Malthe-Sørenssen A. (2009). Reaction induced fracturing during replacement processes. Contrib. Min. Pet., 157, 127-133. Pollok K., Putnis C.V. & Putnis A. (2011). Mineral replacement reactions in solid solution-aqueous solution systems: Volume changes, reaction paths and end points using the example of model salt systems. Am. J. Sci., 311, 211-236.
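
    The 33.5% molar-volume decrease quoted above can be verified from standard molar masses and densities; the density values used here are textbook nominal figures, not measurements from this study.

    ```python
    # Check of the calcite -> fluorite molar-volume decrease.
    M_calcite, rho_calcite = 100.09, 2.71      # g/mol, g/cm^3 (nominal)
    M_fluorite, rho_fluorite = 78.07, 3.18     # g/mol, g/cm^3 (nominal)

    V_calcite = M_calcite / rho_calcite        # ~36.9 cm^3/mol
    V_fluorite = M_fluorite / rho_fluorite     # ~24.6 cm^3/mol
    change = (V_calcite - V_fluorite) / V_calcite * 100.0
    print(f"molar volume change: -{change:.1f}%")   # ~ -33.5%
    ```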

  12. Towards Portable Large-Scale Image Processing with High-Performance Computing.

    PubMed

    Huo, Yuankai; Blaber, Justin; Damon, Stephen M; Boyd, Brian D; Bao, Shunxing; Parvathaneni, Prasanna; Noguera, Camilo Bermudez; Chaganti, Shikha; Nath, Vishwesh; Greer, Jasmine M; Lyu, Ilwoo; French, William R; Newton, Allen T; Rogers, Baxter P; Landman, Bennett A

    2018-05-03

    High-throughput, large-scale medical image computing demands tight integration of high-performance computing (HPC) infrastructure for data storage, job distribution, and image processing. The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has constructed a large-scale image storage and processing infrastructure that is composed of (1) a large-scale image database using the eXtensible Neuroimaging Archive Toolkit (XNAT), (2) a content-aware job scheduling platform using the Distributed Automation for XNAT pipeline automation tool (DAX), and (3) a wide variety of encapsulated image processing pipelines called "spiders." The VUIIS CCI medical image data storage and processing infrastructure has housed and processed nearly half a million medical image volumes with the Vanderbilt Advanced Computing Center for Research and Education (ACCRE), which is the HPC facility at Vanderbilt University. The initial deployment was natively deployed (i.e., direct installations on a bare-metal server) within the ACCRE hardware and software environments, which led to issues of portability and sustainability. First, it could be laborious to deploy the entire VUIIS CCI medical image data storage and processing infrastructure to another HPC center with varying hardware infrastructure, library availability, and software permission policies. Second, the spiders were not developed in an isolated manner, which has led to software dependency issues during system upgrades or remote software installation. To address such issues, herein, we describe recent innovations using containerization techniques with XNAT/DAX which are used to isolate the VUIIS CCI medical image data storage and processing infrastructure from the underlying hardware and software environments. The newly presented XNAT/DAX solution has the following new features: (1) multi-level portability from the system level to the application level, (2) flexible and dynamic software development and expansion, and (3) scalable spider deployment compatible with HPC clusters and local workstations.

  13. Sentiment analysis of feature ranking methods for classification accuracy

    NASA Astrophysics Data System (ADS)

    Joseph, Shashank; Mugauri, Calvin; Sumathy, S.

    2017-11-01

    Text pre-processing and feature selection are important and critical steps in text mining. Pre-processing of large volumes of text data is a difficult task, as unstructured raw data are converted into a structured format. Traditional methods of processing and weighting took much time and were less accurate. To overcome this challenge, feature ranking techniques have been devised. The feature set from text pre-processing is fed as input to feature selection, and feature selection helps improve text classification accuracy. Of the three feature selection categories available, the filter category is the focus here. Five feature ranking methods, namely document frequency, standard deviation, information gain, chi-square, and weighted log-likelihood ratio, are analyzed.
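
    As an illustration of one of the filter methods named above, the sketch below ranks bag-of-words features by the chi-square statistic using scikit-learn. It is a generic example, not the paper's pipeline; the toy documents, labels, and vectorizer settings are assumptions.

    ```python
    # Chi-square feature ranking for a tiny sentiment-labelled corpus.
    import numpy as np
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.feature_selection import chi2

    docs = ["great phone, love the battery",
            "terrible battery, waste of money",
            "love this product, great value",
            "awful screen, terrible product"]
    labels = np.array([1, 0, 1, 0])                   # 1 = positive, 0 = negative

    vec = CountVectorizer(stop_words="english").fit(docs)
    counts = vec.transform(docs)
    scores, _ = chi2(counts, labels)                  # chi-square score per term

    ranked = sorted(zip(vec.get_feature_names_out(), scores), key=lambda p: -p[1])
    print(ranked[:5])                                 # top-ranked discriminative terms
    ```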

  14. [Automated identification, interpretation and classification of focal changes in the lungs on the images obtained at computed tomography for lung cancer screening].

    PubMed

    Barchuk, A A; Podolsky, M D; Tarakanov, S A; Kotsyuba, I Yu; Gaidukov, V S; Kuznetsov, V I; Merabishvili, V M; Barchuk, A S; Levchenko, E V; Filochkina, A V; Arseniev, A I

    2015-01-01

    This review article analyzes the literature devoted to the description, interpretation and classification of focal (nodular) changes in the lungs detected by computed tomography of the chest. Possible criteria are discussed for determining their most likely character: primary and metastatic tumor processes, inflammation, scarring, autoimmune changes, tuberculosis and others. Identification of the most characteristic, reliable and statistically significant signs of the various pathological processes in the lungs, including through the use of modern systems for computer-aided detection and diagnosis, will optimize the diagnostic workup and enable processing of a large volume of medical data in a short time.

  15. Nonlinear, non-stationary image processing technique for eddy current NDE

    NASA Astrophysics Data System (ADS)

    Yang, Guang; Dib, Gerges; Kim, Jaejoon; Zhang, Lu; Xin, Junjun; Udpa, Lalita

    2012-05-01

    Automatic analysis of eddy current (EC) data has facilitated the analysis of the large volumes of data generated in the inspection of steam generator tubes in nuclear power plants. The traditional procedure for analysis of EC data includes data calibration, pre-processing, region of interest (ROI) detection, feature extraction and classification. Accurate ROI detection has been enhanced by pre-processing, which involves reducing noise and other undesirable components as well as enhancing defect indications in the raw measurement. This paper presents the Hilbert-Huang Transform (HHT) for feature extraction and a support vector machine (SVM) for classification. The performance is shown to be significantly better than that of the existing rule-based classification approach used in industry.
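
    A simplified illustration of the feature-extraction plus SVM idea is given below. Note the simplification: SciPy's Hilbert transform is applied directly to obtain instantaneous amplitude and frequency, whereas the full Hilbert-Huang Transform first performs empirical mode decomposition, which is not reproduced here; the synthetic probe signal and defect model are assumptions.

    ```python
    # Hilbert-based instantaneous features from an eddy-current-like signal + SVM.
    import numpy as np
    from scipy.signal import hilbert
    from sklearn.svm import SVC

    fs = 1000.0
    t = np.arange(0, 1, 1 / fs)
    rng = np.random.default_rng(0)

    def ec_signal(defect):
        """Synthetic probe signal; a 'defect' adds a localised amplitude bump."""
        s = np.sin(2 * np.pi * 50 * t) + 0.05 * rng.normal(size=t.size)
        if defect:
            s += 0.4 * np.exp(-((t - 0.5) ** 2) / 0.001) * np.sin(2 * np.pi * 50 * t)
        return s

    def features(s):
        analytic = hilbert(s)
        amp = np.abs(analytic)                                    # instantaneous amplitude
        inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)
        return [amp.mean(), amp.max(), amp.std(), inst_freq.std()]

    X = np.array([features(ec_signal(defect)) for defect in [0, 1] * 20])
    y = np.array([0, 1] * 20)
    clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
    print(clf.predict([features(ec_signal(1)), features(ec_signal(0))]))  # expect [1 0]
    ```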

  16. Process modeling of a HLA research lab

    NASA Astrophysics Data System (ADS)

    Ribeiro, Bruna G. C.; Sena, Alexandre C.; Silva, Dilson; Marzulo, Leandro A. J.

    2017-11-01

    Bioinformatics has provided tremendous breakthroughs in the field of molecular biology. All this evolution has generated a large volume of biological data that increasingly requires the use of computing for analysis and storage. The identification of human leukocyte antigen (HLA) genotypes is critical to the success of organ transplants in humans. HLA typing involves not only laboratory tests but also DNA sequencing, with the participation of several professionals responsible for different stages of the process. Thus, the objective of this paper is to map the main steps of HLA typing in a laboratory specialized in performing such procedures, analyzing each process and proposing solutions to speed up these steps while avoiding mistakes.

  17. Staging memory for massively parallel processor

    NASA Technical Reports Server (NTRS)

    Batcher, Kenneth E. (Inventor)

    1988-01-01

    The invention herein relates to a computer organization capable of rapidly processing extremely large volumes of data. A staging memory is provided having a main stager portion consisting of a large number of memory banks which are accessed in parallel to receive, store, and transfer data words simultaneous with each other. Substager portions interconnect with the main stager portion to match input and output data formats with the data format of the main stager portion. An address generator is coded for accessing the data banks for receiving or transferring the appropriate words. Input and output permutation networks arrange the lineal order of data into and out of the memory banks.

  18. Optical scattering lengths in large liquid-scintillator neutrino detectors.

    PubMed

    Wurm, M; von Feilitzsch, F; Göger-Neff, M; Hofmann, M; Lachenmaier, T; Lewke, T; Marrodán Undagoitia, T; Meindl, Q; Möllenberg, R; Oberauer, L; Potzel, W; Tippmann, M; Todor, S; Traunsteiner, C; Winter, J

    2010-05-01

    For liquid-scintillator neutrino detectors of kiloton scale, the transparency of the organic solvent is of central importance. The present paper reports on laboratory measurements of the optical scattering lengths of the organic solvents phenylxylylethane, linear alkylbenzene (LAB), and dodecane, which are under discussion for next-generation experiments such as SNO+ (Sudbury Neutrino Observatory), HanoHano, or LENA (Low Energy Neutrino Astronomy). Results comprise the wavelength range of 415-440 nm. The contributions from Rayleigh and Mie scattering as well as from absorption/re-emission processes are discussed. Based on the present results, LAB seems to be the preferred solvent for a large-volume detector.
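
    As a hedged aid to reading such measurements (this is a common convention, not a formula quoted from the paper), the contributions of independent extinction processes are usually combined through their inverse lengths, so an effective attenuation length follows from the Rayleigh, Mie, and absorption terms as:

```latex
% Common combination of independent extinction processes (assumed convention):
\frac{1}{\Lambda_{\mathrm{att}}} \;=\;
\frac{1}{\Lambda_{\mathrm{Rayleigh}}} +
\frac{1}{\Lambda_{\mathrm{Mie}}} +
\frac{1}{\Lambda_{\mathrm{abs}}}
```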

  19. Optical scattering lengths in large liquid-scintillator neutrino detectors

    NASA Astrophysics Data System (ADS)

    Wurm, M.; von Feilitzsch, F.; Göger-Neff, M.; Hofmann, M.; Lachenmaier, T.; Lewke, T.; Undagoitia, T. Marrodán; Meindl, Q.; Möllenberg, R.; Oberauer, L.; Potzel, W.; Tippmann, M.; Todor, S.; Traunsteiner, C.; Winter, J.

    2010-05-01

    For liquid-scintillator neutrino detectors of kiloton scale, the transparency of the organic solvent is of central importance. The present paper reports on laboratory measurements of the optical scattering lengths of the organic solvents phenylxylylethane, linear alkylbenzene (LAB), and dodecane, which are under discussion for next-generation experiments such as SNO+ (Sudbury Neutrino Observatory), HanoHano, or LENA (Low Energy Neutrino Astronomy). Results comprise the wavelength range of 415-440 nm. The contributions from Rayleigh and Mie scattering as well as from absorption/re-emission processes are discussed. Based on the present results, LAB seems to be the preferred solvent for a large-volume detector.

  20. Rapid, economical qualitative method for separation of aflatoxins B-1, B-2 & G-1, G-2 by dry column chromatography.

    PubMed

    Megalla, S E

    1983-12-01

    A good separation of the four aflatoxin components was accomplished using the dry column chromatography (DCC) method. Decolorizing interfering substances with 0.01 N KOH and defatting the extract with petroleum ether yield a clean residue for DCC separation. Dry column chromatography is a very simple and time-saving procedure for separating aflatoxins. DCC columns are more economical than precoated 'thick layer' preparative plates, no large developing tanks are needed, and the hazards associated with the use of large volumes of flammable solvents are greatly reduced.

  1. CrossTalk. The Journal of Defense Software Engineering. Volume 13, Number 6, June 2000

    DTIC Science & Technology

    2000-06-01

    Techniques for Efficiently Generating and Testing Software: This paper presents a proven process that uses advanced tools to design, develop and test... optimal software. by Keith R. Wegner. Large Software Systems—Back to Basics: Development methods that work on small problems seem to not scale well to... Ability Requirements for Teamwork: Implications for Human Resource Management, Journal of Management, Vol. 20, No. 2, 1994. 11. Ferguson, Pat, Watts S

  2. Epitaxial thin film growth in outer space

    NASA Technical Reports Server (NTRS)

    Ignatiev, Alex; Chu, C. W.

    1988-01-01

    A new concept for materials processing in space exploits the ultravacuum component of space for thin-film epitaxial growth. The unique LEO space environment is expected to yield pressures of 10 ftorr (about 10^-14 torr) or better, semi-infinite pumping speeds, and a large ultravacuum volume (about 100 cu m) without walls. These space ultravacuum properties promise major improvement in the quality, unique nature, and throughput of epitaxially grown materials, including semiconductors, magnetic materials, and thin-film high-temperature superconductors.

  3. Snohomish Estuary Wetlands Study. Volume I. Summary Report

    DTIC Science & Technology

    1979-05-01

    Large marine facilities are structures used for energy development (oil rigs and platforms), raw material processing, and marine terminals. Such... Wetlands; Land Use... The study underlines the importance of wetlands... function of a habitat. This study was conducted using information on these and all other subjects. Additional data will provide important refinements

  4. Merging Surface Reconstructions of Terrestrial and Airborne LIDAR Range Data

    DTIC Science & Technology

    2009-05-19

    Mangan and R. Whitaker. Partitioning 3D surface meshes using watershed segmentation. IEEE Trans. on Visualization and Computer Graphics, 5(4), pp... Jain, and A. Zakhor. Data Processing Algorithms for Generating Textured 3D Building Facade Meshes from Laser Scans and Camera Images. International... acquired set of overlapping range images into a single mesh [2,9,10]. However, due to the volume of data involved in large scale urban modeling, data

  5. European Science Notes. Volume 41, Number 5.

    DTIC Science & Technology

    1987-05-01

    Lisbon, Portugal... from the food and beverage industry. The absorption process is fast--it does not require long contact between the fungal mass and the... thousand square meters of buildings were consumed by a fire which, at its height,... large, the Norwegian authorities established the Center for Disaster... riculture and Food Research Council, Department of Zoology, University of Cam-... gests that components of both the D-loop endoribonuclease and the DNA primase

  6. Northeast Artificial Intelligence Consortium (NAIC). Volume 12. Computer Architecture for Very Large Knowledge Bases

    DTIC Science & Technology

    1990-12-01

    data rate to the electronics would be much lower on the average and the data much "richer" in information. Intelligent use of... system bottleneck, a high data rate should be provided by I/O systems. 2. machines with intelligent storage management specially designed for logic... management information processing, surveillance sensors, intelligence data collection and handling, solid state sciences, electromagnetics and propagation, and electronic reliability/maintainability and compatibility.

  7. Electronic Structure Methods Based on Density Functional Theory

    DTIC Science & Technology

    2010-01-01

    ...chapter in the ASM Handbook, Volume 22A: Fundamentals of Modeling for Metals Processing, 2010. PAO Case Number: 88ABW-2009-3258; Clearance Date: 16 Jul... are represented using a linear combination, or basis, of plane waves. Over time several methods were developed to avoid the large number of planewaves

  8. Study for identification of beneficial uses of space, phase 1. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The technological effects of the Space Shuttle Program are considered in terms of the development of improved products, processes, and services aimed at benefitting the public from economic and sociological points of view. As such, an outline is provided for a large number of private organizations to suggest and identify specific areas of research and development which can most effectively be exploited in an extraterrestrial environment.

  9. Novel diamond cells for neutron diffraction using multi-carat CVD anvils

    DOE PAGES

    Boehler, R.; Molaison, J. J.; Haberl, B.

    2017-08-17

    Traditionally, neutron diffraction at high pressure has been severely limited in pressure because low neutron flux required large sample volumes and therefore large-volume presses. In this paper we describe new, large-volume diamond anvil cells for neutron diffraction developed at the high-flux Spallation Neutron Source at Oak Ridge National Laboratory. The main features of these cells are multi-carat, single-crystal chemical vapor deposition diamonds, very large diffraction apertures, and gas membranes that maintain pressure stability, especially upon cooling. A new cell has been tested for diffraction up to 40 GPa with an unprecedented sample volume of ~0.15 mm³. High-quality spectra were obtained in 1 h for crystalline Ni and in ~8 h for disordered glassy carbon. These new techniques will open the way for routine megabar neutron diffraction experiments.

  10. Novel analysis of 4DCT imaging quantifies progressive increases in anatomic dead space during mechanical ventilation in mice.

    PubMed

    Kim, Elizabeth H; Preissner, Melissa; Carnibella, Richard P; Samarage, Chaminda R; Bennett, Ellen; Diniz, Marcio A; Fouras, Andreas; Zosky, Graeme R; Jones, Heather D

    2017-09-01

    Increased dead space is an important prognostic marker in early acute respiratory distress syndrome (ARDS) that correlates with mortality. The cause of increased dead space in ARDS has largely been attributed to increased alveolar dead space due to ventilation/perfusion mismatching and shunt. We sought to determine whether anatomic dead space also increases in response to mechanical ventilation. Mice received intratracheal lipopolysaccharide (LPS) or saline and mechanical ventilation (MV). Four-dimensional computed tomography (4DCT) scans were performed at onset of MV and after 5 h of MV. Detailed measurements of airway volumes and lung tidal volumes were performed using image analysis software. The forced oscillation technique was used to obtain measures of airway resistance, tissue damping, and tissue elastance. The ratio of airway volumes to total tidal volume increased significantly in response to 5 h of mechanical ventilation, regardless of LPS exposure, and airways demonstrated significant variation in volumes over the respiratory cycle. These findings were associated with an increase in tissue elastance (decreased lung compliance) but without changes in tidal volumes. Airway volumes increased over time with exposure to mechanical ventilation without a concomitant increase in tidal volumes. These findings suggest that anatomic dead space fraction increases progressively with exposure to positive pressure ventilation and may represent a pathological process. NEW & NOTEWORTHY We demonstrate that anatomic dead space ventilation increases significantly over time in mice in response to mechanical ventilation. The novel functional lung-imaging techniques applied here yield sensitive measures of airway volumes that may have wide applications. Copyright © 2017 the American Physiological Society.

  11. Nd, Sr, and O isotopic variations in metaluminous ash-flow tuffs and related volcanic rocks at the Timber Mountain/Oasis Valley Caldera Complex, SW Nevada: implications for the origin and evolution of large-volume silicic magma bodies

    USGS Publications Warehouse

    Farmer, G.L.; Broxton, D.E.; Warren, R.G.; Pickthorn, W.

    1991-01-01

    Nd, Sr and O isotopic data were obtained from silicic ash-flow tuffs and lavas at the Tertiary age (16-9 Ma) Timber Mountain/Oasis Valley volcanic center (TMOV) in southern Nevada, to assess models for the origin and evolution of the large-volume silicic magma bodies generated in this region. The large-volume (>900 km³), chemically zoned Topopah Spring (TS) and Tiva Canyon (TC) members of the Paintbrush Tuff, and the Rainier Mesa (RM) and Ammonia Tanks (AT) members of the younger Timber Mountain Tuff, all have internal Nd and Sr isotopic zonations. In each tuff, high-silica rhyolites have lower initial εNd values (~1 εNd unit), higher 87Sr/86Sr, and lower Nd and Sr contents than coerupted trachytes. The TS, TC, and RM members have similar εNd values for high-silica rhyolites (-11.7 to -11.2) and trachytes (-10.5 to -10.7), but the younger AT member has a higher εNd for both compositional types (-10.3 and -9.4). Oxygen isotope data confirm that the TC and AT members were derived from low εNd magmas. The internal Sr and Nd isotopic variations in each tuff are interpreted to be the result of the incorporation of 20-40% (by mass) wall rock into magmas that were injected into the upper crust. The low εNd magmas most likely formed via the incorporation of low δ18O, hydrothermally altered wall rock. Small-volume rhyolite lavas and ash-flow tuffs have isotopic characteristics similar to the large-volume ash-flow tuffs, but lavas erupted from extracaldera vents may have interacted with higher δ18O crustal rocks peripheral to the main magma chamber(s). Andesitic lavas from the 13-14 Ma Wahmonie/Salyer volcanic center southeast of the TMOV have low εNd (-13.2 to -13.8) and are considered, on the basis of textural evidence, to be mixtures of basaltic composition magmas and large proportions (70-80%) of anatectic crustal melts. A similar process may have occurred early in the magmatic history of the TMOV. The large-volume rhyolites may represent a mature stage of magmatism after repeated injection of basaltic magmas, crustal melting, and volcanism cleared sufficient space in the upper crust for large magma bodies to accumulate and differentiate. The TMOV rhyolites and 0-10 Ma basalts that erupted in southern Nevada all have similar Nd and Sr isotopic compositions, which suggests that silicic and mafic magmatism at the TMOV were genetically related. The distinctive isotopic compositions of the AT member may reflect temporal changes in the isotopic compositions of basaltic magmas entering the upper crust, possibly as a result of increasing "basification" of a lower crustal magma source by repeated injection of mantle-derived mafic magmas. © 1991 Springer-Verlag.
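
    The 20-40% wall-rock incorporation inferred above is the kind of estimate that comes from a standard two-component isotopic mixing balance. The generic form below is not an equation quoted from the paper; it simply gives the εNd of a magma that assimilates a mass fraction f of wall rock, weighted by the Nd concentrations C of the two end members:

```latex
% Generic two-component mixing relation for Nd isotopes (assumed end-member notation):
\varepsilon_{\mathrm{Nd}}^{\mathrm{mix}} \;=\;
\frac{f\, C_{\mathrm{Nd}}^{\mathrm{wr}}\, \varepsilon_{\mathrm{Nd}}^{\mathrm{wr}}
      + (1-f)\, C_{\mathrm{Nd}}^{\mathrm{magma}}\, \varepsilon_{\mathrm{Nd}}^{\mathrm{magma}}}
     {f\, C_{\mathrm{Nd}}^{\mathrm{wr}} + (1-f)\, C_{\mathrm{Nd}}^{\mathrm{magma}}}
```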

  12. Processing, structure, and characterization of nickel-alumina composites obtained by the partial reduction of zirconia-doped nickel-aluminum oxide and application to the tempering of ceramics

    NASA Astrophysics Data System (ADS)

    Barbieri, Thomas John

    1999-11-01

    Partial reduction of the spinel compound NiAl2O4 results in a two phase composite mixture of Ni + Al2O3. The reduction reaction has a volume decrease associated with it, which theoretically could generate large residual stresses, which have the potential to "temper" a ceramic, i.e. to place the surface of a ceramic component into a state of residual compression. As the first step towards tempering a ceramic, it is necessary to demonstrate that appreciable stresses can be generated by this volume change, since they may be relieved by either cracking or diffusional relaxation processes at the high temperature of the reduction reaction. It was necessary to determine the best processing methods to use for producing the tempered specimens. Results are presented from a systematic study on the effect of the variation of processing parameters on the reduction behavior of NiAl2O4 doped with ZrO2. Specimen characteristics of interest were time required for reduction, microstructural development, volume contraction achieved and porosity generated during reduction, and the ability to survive the reduction process without fracturing. These results were applied to the tempering process. A simple specimen geometry was used for tempering which involved an Al2O3 cylinder bonded to an outer NiAl2O4 ring. Finite element calculations were performed to predict the residual stresses generated by the volume contraction of the ring and the coefficient of thermal expansion (CTE) mismatch between the Al2O3 core and the reduced composite ring. Stress measurements performed on the Al2O3 core of each specimen using the "d vs. sin²ψ" method of X-ray diffraction indicate that only the CTE-induced stresses remain in the specimens after completion of the tempering process. Microstructural analysis of the tempered specimens was performed to determine if residual stresses were developed during reduction, and what processes occurred to relieve these stresses. The results indicate that stresses are generated during the reduction process, but they are dissipated through catastrophic fracture, cation rearrangement in the lattice, and creep. Further evidence of the presence of residual stresses during reduction was found in a decrease in coarsening rate in tempered specimens.
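
    For context, the "d vs. sin²ψ" X-ray method mentioned above extracts the in-plane stress from the slope of the measured lattice spacing plotted against sin²ψ. The sketch below is the textbook biaxial-stress form (with Young's modulus E and Poisson's ratio ν), not values or notation taken from the thesis:

```latex
% Standard sin^2(psi) relation for a biaxial surface stress state (textbook form):
d_{\phi\psi} \;=\; d_0\!\left[\,1 + \frac{1+\nu}{E}\,\sigma_{\phi}\,\sin^{2}\psi
                               - \frac{\nu}{E}\left(\sigma_{1}+\sigma_{2}\right)\right]
\qquad\Longrightarrow\qquad
\sigma_{\phi} \;=\; \frac{E}{(1+\nu)\,d_0}\,
\frac{\partial d_{\phi\psi}}{\partial\!\left(\sin^{2}\psi\right)}
```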

  13. Development of polymer nanocomposite patterns using fused deposition modeling for rapid investment casting process

    NASA Astrophysics Data System (ADS)

    Vivek, Tiwary; Arunkumar, P.; Deshpande, A. S.; Vinayak, Malik; Kulkarni, R. M.; Asif, Angadi

    2018-04-01

    Conventional investment casting is one of the oldest and most economical manufacturing techniques for producing intricate and complex part geometries. However, investment casting is considered economical only when the production volume is large: design iterations and optimisations are very costly because of the time and tooling cost of making dies for producing wax patterns. With the advent of additive manufacturing technology, plastic patterns show very good potential to replace wax patterns. This approach can be very useful for low-volume production and lab requirements, since the cost and time required to incorporate design changes are very low. This paper discusses the steps involved in developing polymer nanocomposite filaments and checking their suitability for investment casting. The process parameters of the 3D printer are also optimized using a design-of-experiments (DOE) technique to obtain mechanically stronger plastic patterns. The study develops a framework for rapid investment casting for lab as well as industrial requirements.
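
    As a rough illustration of the kind of DOE-based parameter screening described (not the authors' actual factors or data), the sketch below enumerates a small full-factorial design over hypothetical FDM parameters and picks the setting with the highest measured tensile strength; the parameter names, levels, and the `measure_strength` stand-in are assumptions.

```python
# Minimal full-factorial DOE sketch for FDM process parameters (illustrative only).
from itertools import product

# Hypothetical factors and levels; a real study would use the printer's actual ranges.
layer_heights_mm = [0.1, 0.2, 0.3]
infill_percent   = [50, 75, 100]
print_speeds_mms = [30, 50, 70]

def measure_strength(layer_height, infill, speed):
    """Stand-in for a measured tensile strength (MPa); replace with real test data."""
    return 40.0 - 20.0 * layer_height + 0.1 * infill - 0.05 * speed

runs = []
for lh, inf, sp in product(layer_heights_mm, infill_percent, print_speeds_mms):
    runs.append({"layer_height": lh, "infill": inf, "speed": sp,
                 "strength": measure_strength(lh, inf, sp)})

best = max(runs, key=lambda r: r["strength"])
print(f"{len(runs)} runs evaluated; strongest setting: {best}")
```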

  14. Large deformation of self-oscillating polymer gel

    NASA Astrophysics Data System (ADS)

    Maeda, Shingo; Kato, Terukazu; Otsuka, Yuji; Hosoya, Naoki; Cianchetti, Matteo; Laschi, Cecilia

    2016-01-01

    A self-oscillating gel is a system that generates an autonomous volume oscillation. This oscillation is powered by the chemical energy of the Belousov-Zhabotinsky (BZ) reaction, which demonstrates metal ion redox oscillation. A self-oscillating gel is composed of poly(N-isopropylacrylamide) (PNIPAAm) with a metal ion. In this study, we found that the displacement of the volume oscillation in a self-oscillating gel could be controlled by its being subjected to a prestraining process. We also revealed the driving mechanism of the self-oscillating gel from the point of view of thermodynamics. We observed that the polymer-solvent interaction parameter χ is altered by the redox changes to the metal ion incorporated in the self-oscillating gel. The prestraining process leads to changes in χ and changes in enthalpy and entropy when the self-oscillating gel is in a reduced and oxidized state. We found that nonprestrained gel samples oscillate in a poor solution (χ > 0.5) and prestrained gel samples oscillate in a good solution (χ < 0.5).

  15. Silicon algae with carbon topping as thin-film anodes for lithium-ion microbatteries by a two-step facile method

    NASA Astrophysics Data System (ADS)

    Biserni, E.; Xie, M.; Brescia, R.; Scarpellini, A.; Hashempour, M.; Movahed, P.; George, S. M.; Bestetti, M.; Li Bassi, A.; Bruno, P.

    2015-01-01

    Silicon-based electrodes for Li-ion batteries (LIB) attract much attention because of their high theoretical capacity. However, their large volume change during lithiation results in poor cycling due to mechanical cracking. Moreover, silicon can hardly form a stable solid electrolyte interphase (SEI) layer with common electrolytes. We present a safe, innovative strategy to prepare nanostructured silicon-carbon anodes in a two-step process. The nanoporosity of the Si films accommodates the volume expansion, while a disordered graphitic C layer on top promotes the formation of a stable SEI. This approach shows its promise: carbon-coated porous silicon anodes perform in a very stable way, reaching an areal capacity of ∼175 μAh cm⁻² and showing no decay for at least 1000 cycles. Requiring only a two-step deposition process at moderate temperatures, this simple cell concept offers a promising route toward viable up-scaled production of next-generation nanostructured Si anodes for lithium-ion microbatteries.

  16. Demonstration of a robust magnonic spin wave interferometer.

    PubMed

    Kanazawa, Naoki; Goto, Taichi; Sekiguchi, Koji; Granovsky, Alexander B; Ross, Caroline A; Takagi, Hiroyuki; Nakamura, Yuichi; Inoue, Mitsuteru

    2016-07-22

    Magnonics is an emerging field dealing with ultralow power consumption logic circuits, in which the flow of spin waves, rather than electric charges, transmits and processes information. Waves, including spin waves, excel at encoding information via their phase using interference. This enables a number of inputs to be processed in one device, which offers the promise of multi-input multi-output logic gates. To realize such an integrated device, it is essential to demonstrate spin wave interferometers using spatially isotropic spin waves with high operational stability. However, spin wave reflection at the waveguide edge has previously limited the stability of interfering waves, precluding the use of isotropic spin waves, i.e., forward volume waves. Here, a spin wave absorber is demonstrated comprising a yttrium iron garnet waveguide partially covered by gold. This device is shown experimentally to be a robust spin wave interferometer using the forward volume mode, with a large ON/OFF isolation value of 13.7 dB even in magnetic fields over 30 Oe.

  17. Demonstration of a robust magnonic spin wave interferometer

    PubMed Central

    Kanazawa, Naoki; Goto, Taichi; Sekiguchi, Koji; Granovsky, Alexander B.; Ross, Caroline A.; Takagi, Hiroyuki; Nakamura, Yuichi; Inoue, Mitsuteru

    2016-01-01

    Magnonics is an emerging field dealing with ultralow power consumption logic circuits, in which the flow of spin waves, rather than electric charges, transmits and processes information. Waves, including spin waves, excel at encoding information via their phase using interference. This enables a number of inputs to be processed in one device, which offers the promise of multi-input multi-output logic gates. To realize such an integrated device, it is essential to demonstrate spin wave interferometers using spatially isotropic spin waves with high operational stability. However, spin wave reflection at the waveguide edge has previously limited the stability of interfering waves, precluding the use of isotropic spin waves, i.e., forward volume waves. Here, a spin wave absorber is demonstrated comprising a yttrium iron garnet waveguide partially covered by gold. This device is shown experimentally to be a robust spin wave interferometer using the forward volume mode, with a large ON/OFF isolation value of 13.7 dB even in magnetic fields over 30 Oe. PMID:27443989

  18. Development of the Electromagnetic Continuous Casting Technology for Magnesium Alloys

    NASA Astrophysics Data System (ADS)

    Park, Joon-Pyo; Kim, Myoung-Gyun; Kim, Jong-Ho; Lee, Gyu-Chang

    Magnesium billets currently produced by ingot casting or direct chill casting suffer from low-quality surfaces and low productivity. Continuous casting technology addresses these problems, yielding billets with high-quality surfaces and a fine-grained, homogeneous microstructure at lower cost. The latent heat of fusion per unit weight (J/g) of magnesium is similar to that of other metals; however, when the heat released to the mold surface in the meniscus region during continuous casting is considered in terms of the latent heat of fusion per unit volume, magnesium solidifies rapidly in the mold, which induces surface defect formation. In this study, electromagnetic casting and stirring (EMC and EMS) techniques are proposed to control the solidification process by compensating for magnesium's low latent heat of solidification per unit volume and to fabricate magnesium billets with high-quality surfaces. The technique was extended to large-scale billets up to 300 mm in diameter, and continuous casting was successfully conducted. The resulting magnesium billets were used to fabricate a prototype automobile pulley.
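
    To make the per-weight versus per-volume argument concrete (these are approximate handbook values, not numbers from the paper): multiplying the specific latent heat of fusion by density shows that magnesium stores far less solidification heat per unit volume than, for example, aluminum, which is why it freezes so quickly against the mold:

```latex
% Approximate handbook values (assumed, for illustration only):
L_{v} = L_{m}\,\rho \quad\Rightarrow\quad
\begin{aligned}
\text{Mg:}\; & L_{v} \approx 350~\mathrm{J\,g^{-1}} \times 1.74~\mathrm{g\,cm^{-3}} \approx 6.1\times 10^{2}~\mathrm{J\,cm^{-3}}\\
\text{Al:}\; & L_{v} \approx 400~\mathrm{J\,g^{-1}} \times 2.70~\mathrm{g\,cm^{-3}} \approx 1.1\times 10^{3}~\mathrm{J\,cm^{-3}}
\end{aligned}
```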

  19. A neuronal morphologic type unique to humans and great apes

    PubMed Central

    Nimchinsky, Esther A.; Gilissen, Emmanuel; Allman, John M.; Perl, Daniel P.; Erwin, Joseph M.; Hof, Patrick R.

    1999-01-01

    We report the existence and distribution of an unusual type of projection neuron, a large, spindle-shaped cell, in layer Vb of the anterior cingulate cortex of pongids and hominids. These spindle cells were not observed in any other primate species or any other mammalian taxa, and their volume was correlated with brain volume residuals, a measure of encephalization in higher primates. These observations are of particular interest when considering primate neocortical evolution, as they reveal possible adaptive changes and functional modifications over the last 15–20 million years in the anterior cingulate cortex, a region that plays a major role in the regulation of many aspects of autonomic function and of certain cognitive processes. That in humans these unique neurons have been shown previously to be severely affected in the degenerative process of Alzheimer’s disease suggests that some of the differential neuronal susceptibility that occurs in the human brain in the course of age-related dementing illnesses may have appeared only recently during primate evolution. PMID:10220455

  20. Animated analysis of geoscientific datasets: An interactive graphical application

    NASA Astrophysics Data System (ADS)

    Morse, Peter; Reading, Anya; Lueg, Christopher

    2017-12-01

    Geoscientists are required to analyze and draw conclusions from increasingly large volumes of data. There is a need to recognise and characterise features and changing patterns of Earth observables within such large datasets. It is also necessary to identify significant subsets of the data for more detailed analysis. We present an innovative, interactive software tool and workflow to visualise, characterise, sample and tag large geoscientific datasets from both local and cloud-based repositories. It uses an animated interface and human-computer interaction to utilise the capacity of human expert observers to identify features via enhanced visual analytics. 'Tagger' enables users to analyze datasets that are too large in volume to be drawn legibly on a reasonable number of single static plots. Users interact with the moving graphical display, tagging data ranges of interest for subsequent attention. The tool provides a rapid pre-pass process using fast GPU-based OpenGL graphics and data handling and is coded in the Quartz Composer visual programming language (VPL) on Mac OS X. It makes use of interoperable data formats, and cloud-based (or local) data storage and compute. In a case study, Tagger was used to characterise a decade (2000-2009) of data recorded by the Cape Sorell Waverider Buoy, located approximately 10 km off the west coast of Tasmania, Australia. These data serve as a proxy for the understanding of Southern Ocean storminess, which has both local and global implications. This example shows use of the tool to identify and characterise four different types of storm and non-storm events during this time. Events characterised in this way are compared with conventional analysis, noting advantages and limitations of data analysis using animation and human interaction. Tagger provides a new ability to make use of humans as feature detectors in computer-based analysis of large-volume geoscience and other data.
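
    The core interaction, playing a long time series as an animation and letting a human observer tag windows of interest, can be illustrated with a small generic sketch. This is not Tagger itself (which is a Quartz Composer/OpenGL application); the matplotlib-based stand-in below, including the synthetic buoy-like series and the 't' key binding, is an assumption for illustration.

```python
# Minimal stand-in for animated tag-as-you-watch analysis (not the Tagger tool itself).
# A long synthetic series scrolls past; pressing 't' tags the currently displayed window.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

rng = np.random.default_rng(1)
n, window = 100_000, 2_000
series = np.cumsum(rng.standard_normal(n)) * 0.01 + rng.standard_normal(n)  # fake buoy data

tags = []                      # (start_index, end_index) ranges flagged by the user
fig, ax = plt.subplots()
line, = ax.plot(series[:window])
ax.set_xlabel("sample within window")
ax.set_ylabel("signal (arbitrary units)")
state = {"start": 0}

def update(frame):
    """Advance the displayed window by a fixed stride each animation frame."""
    start = (frame * window // 4) % (n - window)
    state["start"] = start
    line.set_ydata(series[start:start + window])
    ax.set_title(f"samples {start}-{start + window}   tags: {len(tags)}")
    return line,

def on_key(event):
    """Tag the window currently on screen when the user presses 't'."""
    if event.key == "t":
        tags.append((state["start"], state["start"] + window))
        print("tagged", tags[-1])

fig.canvas.mpl_connect("key_press_event", on_key)
anim = FuncAnimation(fig, update, frames=range(10_000), interval=50, blit=False)
plt.show()
print("tagged ranges for follow-up analysis:", tags)
```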
