Sample records for "extraction process requires"

  1. Techno-economic analysis of extraction-based separation systems for acetone, butanol, and ethanol recovery and purification.

    PubMed

    Grisales Díaz, Víctor Hugo; Olivar Tost, Gerard

    2017-01-01

    Dual extraction, high-temperature extraction, mixture extraction, and oleyl alcohol extraction have been proposed in the literature for acetone, butanol, and ethanol (ABE) production. However, an energy and economic evaluation of extraction-based separation systems under similar assumptions is necessary. Hence, the new process proposed in this work, direct steam distillation (DSD), for the regeneration of high-boiling extractants was compared with several extraction-based separation systems. The evaluation was performed under similar assumptions through simulation in Aspen Plus V7.3® software. Two end-distillation systems (with between 70 and 80 non-ideal stages) were studied. Heat integration and vacuum operation of some units were proposed, reducing the energy requirements. The energy requirement of the hybrid processes, at a substrate concentration of 200 g/l, was between 6.4 and 8.3 MJ-fuel/kg-ABE. The minimum energy requirements of extraction-based separation systems, feeding a water concentration in the substrate equivalent to the extractant selectivity and under ideal assumptions, were between 2.6 and 3.5 MJ-fuel/kg-ABE. The efficiencies of the recovery systems for the baseline case and the ideal evaluation were 0.53-0.57 and 0.81-0.84, respectively. The main advantages of DSD were the operation of the regeneration column at atmospheric pressure, the utilization of low-pressure steam, and the low energy requirement for preheating. The in situ recovery processes, DSD, and mixture extraction with conventional regeneration were the approaches with the lowest energy requirements and total annualized costs.
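To put the MJ-fuel/kg-ABE figures above in perspective, a rough sanity check is to express the separation energy as a fraction of the product's fuel value. The ~30 MJ/kg lower heating value for the ABE mixture below is an assumed round number for illustration, not a figure from the study.

```python
# Assumed average lower heating value of the ABE mixture (MJ/kg); the
# abstract does not state this value, so treat it as an illustration only.
ABE_LHV = 30.0

def energy_fraction(separation_energy_mj_per_kg, lhv=ABE_LHV):
    """Separation energy expressed as a fraction of the product's fuel value."""
    return separation_energy_mj_per_kg / lhv

# Hybrid processes: 6.4-8.3 MJ-fuel/kg-ABE (from the abstract)
hybrid = [energy_fraction(e) for e in (6.4, 8.3)]
# Idealized extraction-based systems: 2.6-3.5 MJ-fuel/kg-ABE
ideal = [energy_fraction(e) for e in (2.6, 3.5)]
```

Under this assumption the hybrid processes spend roughly a fifth to a quarter of the product's heating value on separation, while the idealized systems spend about a tenth, which is consistent with the reported efficiency gap (0.53-0.57 vs. 0.81-0.84).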

  2. Enhancement of Lipid Extraction from Marine Microalga, Scenedesmus Associated with High-Pressure Homogenization Process

    PubMed Central

    Cho, Seok-Cheol; Choi, Woon-Yong; Oh, Sung-Ho; Lee, Choon-Geun; Seo, Yong-Chang; Kim, Ji-Seon; Song, Chi-Ho; Kim, Ga-Vin; Lee, Shin-Young; Kang, Do-Hyung; Lee, Hyeon-Yong

    2012-01-01

    Marine microalga, Scenedesmus sp., which is known to be suitable for biodiesel production because of its high lipid content, was subjected to the conventional Folch method of lipid extraction combined with high-pressure homogenization pretreatment process at 1200 psi and 35°C. Algal lipid yield was about 24.9% through this process, whereas only 19.8% lipid can be obtained by following a conventional lipid extraction procedure using the solvent, chloroform : methanol (2 : 1, v/v). Present approach requires 30 min process time and a moderate working temperature of 35°C as compared to the conventional extraction method which usually requires >5 hrs and 65°C temperature. It was found that this combined extraction process followed second-order reaction kinetics, which means most of the cellular lipids were extracted during initial periods of extraction, mostly within 30 min. In contrast, during the conventional extraction process, the cellular lipids were slowly and continuously extracted for >5 hrs by following first-order kinetics. Confocal and scanning electron microscopy revealed altered texture of algal biomass pretreated with high-pressure homogenization. These results clearly demonstrate that the Folch method coupled with high-pressure homogenization pretreatment can easily destruct the rigid cell walls of microalgae and release the intact lipids, with minimized extraction time and temperature, both of which are essential for maintaining good quality of the lipids for biodiesel production. PMID:22969270
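The kinetic contrast described above can be sketched numerically. The rate constants below are arbitrary choices for the illustration, not fitted values from the paper; yields are expressed as a fraction of the extractable lipid.

```python
import math

def first_order(t, k=0.010):
    """Fraction extracted after t minutes under first-order kinetics:
    dC/dt = k*(Cs - C)  =>  C(t)/Cs = 1 - exp(-k*t)."""
    return 1.0 - math.exp(-k * t)

def second_order(t, k=0.50):
    """Fraction extracted under second-order kinetics:
    dC/dt = k*(Cs - C)^2  =>  C(t)/Cs = k*t / (1 + k*t) (with Cs folded into k)."""
    return (k * t) / (1.0 + k * t)

# Second-order (homogenization-assisted) extraction is nearly complete at 30 min,
# while first-order (conventional) extraction still needs hours.
fast_30 = second_order(30)   # most lipid already recovered at 30 min
slow_30 = first_order(30)    # conventional route far from complete at 30 min
slow_300 = first_order(300)  # approaches completion only after ~5 h
```

The shapes, not the exact numbers, are the point: second-order kinetics front-loads the recovery, which is why the combined process finishes within the initial 30 min.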

  3. Assessment of critical-fluid extractions in the process industries

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The potential of critical-fluid extraction as a separation process for improving the productive use of energy in the process industries is assessed. Critical-fluid extraction involves the use of fluids, normally gaseous at ambient conditions, as extraction solvents at temperatures and pressures around the critical point. Equilibrium and kinetic properties in this regime are very favorable for solvent applications and generally allow major reductions in the energy required to separate and purify the chemical components of a mixture.

  4. Multichannel Doppler Processing for an Experimental Low-Angle Tracking System

    DTIC Science & Technology

    1990-05-01

    estimation techniques at sea. Because of clutter and noise, it is necessary to use a number of different processing algorithms to extract the required information. Consequently, the ELAT radar system is composed of multiple...corresponding to RF frequencies f1 and f2. For mode 3, the ambiguities occur at vb1 = 15.186 knots and vb2 = 16.96 knots. The sea clutter, with a spectrum

  5. Sequential microfluidic droplet processing for rapid DNA extraction.

    PubMed

    Pan, Xiaoyan; Zeng, Shaojiang; Zhang, Qingquan; Lin, Bingcheng; Qin, Jianhua

    2011-11-01

    This work describes a novel droplet-based microfluidic device that enables sequential droplet processing for rapid DNA extraction. The microdevice consists of a droplet generation unit, two reagent addition units, and three droplet splitting units. The loading/washing/elution steps required for DNA extraction were carried out by sequential microfluidic droplet processing. The movement of the superparamagnetic beads, which were used as extraction supports, was controlled with a magnetic field. The microdevice could generate about 100 droplets per minute, and each droplet took about 1 min to complete the whole extraction process. The extraction efficiency was measured to be 46% for λ-DNA, and the extracted DNA could be used in subsequent genetic analysis such as PCR, demonstrating the potential of the device for fast DNA extraction. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Biorefinery process for protein extraction from oriental mustard (Brassica juncea (L.) Czern.) using ethanol stillage.

    PubMed

    Ratanapariyanuch, Kornsulee; Tyler, Robert T; Shim, Youn Young; Reaney, Martin Jt

    2012-01-12

    Large volumes of treated process water are required for protein extraction. Evaporation of this water contributes greatly to the energy consumed in enriching protein products. Thin stillage remaining from ethanol production is available in large volumes and may be suitable for extracting protein-rich materials. In this work, protein was extracted from ground defatted oriental mustard (Brassica juncea (L.) Czern.) meal using thin stillage. Protein extraction efficiency was studied at pHs between 7.6 and 10.4 and salt concentrations between 3.4 × 10⁻² and 1.2 M. The optimum extraction conditions were pH 10.0 and 1.0 M NaCl. Napin and cruciferin were the most prevalent proteins in the isolate. The isolate exhibited high in vitro digestibility (74.9 ± 0.80%) and lysine content (5.2 ± 0.2 g/100 g of protein). No differences in extraction efficiency, SDS-PAGE profile, digestibility, lysine availability, or amino acid composition were observed between protein extracted with thin stillage and that extracted with NaCl solution. The use of thin stillage, in lieu of water, for protein extraction would decrease the energy requirements and waste disposal costs of the protein isolation and biofuel production processes.

  7. Biorefinery process for protein extraction from oriental mustard (Brassica juncea (L.) Czern.) using ethanol stillage

    PubMed Central

    2012-01-01

    Large volumes of treated process water are required for protein extraction. Evaporation of this water contributes greatly to the energy consumed in enriching protein products. Thin stillage remaining from ethanol production is available in large volumes and may be suitable for extracting protein-rich materials. In this work, protein was extracted from ground defatted oriental mustard (Brassica juncea (L.) Czern.) meal using thin stillage. Protein extraction efficiency was studied at pHs between 7.6 and 10.4 and salt concentrations between 3.4 × 10⁻² and 1.2 M. The optimum extraction conditions were pH 10.0 and 1.0 M NaCl. Napin and cruciferin were the most prevalent proteins in the isolate. The isolate exhibited high in vitro digestibility (74.9 ± 0.80%) and lysine content (5.2 ± 0.2 g/100 g of protein). No differences in extraction efficiency, SDS-PAGE profile, digestibility, lysine availability, or amino acid composition were observed between protein extracted with thin stillage and that extracted with NaCl solution. The use of thin stillage, in lieu of water, for protein extraction would decrease the energy requirements and waste disposal costs of the protein isolation and biofuel production processes. PMID:22239856

  8. Optimization of an innovative approach involving mechanical activation and acid digestion for the extraction of lithium from lepidolite

    NASA Astrophysics Data System (ADS)

    Vieceli, Nathália; Nogueira, Carlos A.; Pereira, Manuel F. C.; Durão, Fernando O.; Guimarães, Carlos; Margarido, Fernanda

    2018-01-01

    The recovery of lithium from hard-rock minerals has received increased attention given the high demand for this element. This study therefore optimized an innovative process, which does not require a high-temperature calcination step, for lithium extraction from lepidolite. Mechanical activation and acid digestion were identified as crucial process parameters, and experimental design and response-surface methodology were applied to model and optimize the proposed lithium extraction process. The promoting effects of amorphization and of lithium sulfate hydrate formation on the lithium extraction yield were assessed. Several factor combinations led to extraction yields exceeding 90%, indicating that the proposed process is an effective approach for lithium recovery.
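Response-surface methodology, as used above, fits a quadratic model to designed experiments and solves for the stationary point. A minimal sketch, on synthetic two-factor data (the factor names and the response surface are invented for illustration, not the paper's data):

```python
import numpy as np

# Synthetic "experiments": two coded factors (e.g. milling time, acid dosage)
# and a noiseless quadratic yield surface with a known optimum at (0.3, -0.2).
rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 30)
x2 = rng.uniform(-1, 1, 30)
y = 92 - 10 * (x1 - 0.3) ** 2 - 8 * (x2 + 0.2) ** 2

# Fit y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2 by least squares.
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
b = np.linalg.lstsq(X, y, rcond=None)[0]

# Stationary point: set the gradient of the fitted quadratic to zero.
A = np.array([[2 * b[3], b[5]],
              [b[5], 2 * b[4]]])
opt = np.linalg.solve(A, -b[1:3])  # recovers the optimum coded factor settings
```

With real data one would also check the Hessian's sign (maximum vs. saddle) and the model's lack-of-fit before trusting the predicted optimum.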

  9. Comparison of lignin extraction processes: Economic and environmental assessment.

    PubMed

    Carvajal, Juan C; Gómez, Álvaro; Cardona, Carlos A

    2016-08-01

    This paper presents a technical-economic and environmental assessment of four lignin extraction processes applied to two different raw materials (sugarcane bagasse and rice husks). The processes are divided into two categories: in the first, lignin extraction is evaluated with a prior acid hydrolysis step, while in the second the extraction processes are evaluated standalone, for a total of 16 scenarios. Profitability indicators such as the net present value (NPV) and environmental indicators such as the potential environmental impact (PEI) are used within a process engineering approach to understand and select the best lignin extraction process. The results show that, both economically and environmentally, the processes with sulfites and soda applied to rice husks present the best results; however, the quality of the lignin obtained with sulfites is not suitable for high value-added products. Soda is therefore an interesting option for lignin extraction when high-quality lignin is required for high value-added products at low cost. Copyright © 2016 Elsevier Ltd. All rights reserved.
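The NPV metric used above discounts a project's cash flows back to the present. A minimal sketch, with an illustrative cash-flow series that is not from the paper:

```python
def npv(rate, cashflows):
    """Net present value: cashflows[0] is the year-0 (investment) flow,
    subsequent entries are end-of-year net cash flows."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

# Illustrative scenario: 10 M$ invested in year 0, then 2.5 M$/year of net
# revenue for 8 years, discounted at 10%.
project = [-10.0] + [2.5] * 8
value = npv(0.10, project)  # positive NPV => the scenario is deemed profitable
```

A scenario with NPV > 0 creates value at the chosen discount rate; comparing NPVs across the 16 scenarios is what lets the authors rank the extraction options economically.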

  10. Studies for determining thermal ion extraction potential for aluminium plasma generated by electron beam evaporator

    NASA Astrophysics Data System (ADS)

    Dileep Kumar, V.; Barnwal, Tripti A.; Mukherjee, Jaya; Gantayet, L. M.

    2010-02-01

    For the effective evaporation of refractory metals, an electron beam is found to be the most suitable vapour-generation source. Using an electron beam, high-throughput laser-based purification processes are carried out. However, because the electron beam is highly concentrated, the vapour becomes ionised, and these ions lead to dilution of the pure product of the laser-based separation process. To estimate the concentration of these ions and the extraction potential required to remove them from the vapour stream, experiments were conducted using aluminium as the evaporant. The aluminium ingots were placed in a water-cooled copper crucible. Inserts were used to hold the evaporant, in order to attain a higher number density in the vapour processing zone and also to confine the liquid metal. Parametric studies with beam power, number density, and extraction potential were conducted. In this paper we discuss the trend of thermal ion generation and the electrostatic field required for extraction.

  11. Spatial resolution requirements for automated cartographic road extraction

    USGS Publications Warehouse

    Benjamin, S.; Gaydos, L.

    1990-01-01

    Ground resolution requirements for detection and extraction of road locations in a digitized large-scale photographic database were investigated. A color infrared photograph of Sunnyvale, California was scanned, registered to a map grid, and spatially degraded to 1- to 5-metre resolution pixels. Road locations in each data set were extracted using a combination of image processing and CAD programs. These locations were compared to a photointerpretation of road locations to determine a preferred pixel size for the extraction method. Based on road pixel omission error computations, a 3-metre pixel resolution appears to be the best choice for this extraction method. -Authors
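The omission-error computation implied above compares the automated extraction against the photointerpreted reference and counts the reference road pixels the algorithm missed. A toy sketch (the two masks below are invented, not the Sunnyvale imagery):

```python
import numpy as np

# Reference mask: road pixels from photointerpretation (ground truth).
reference = np.array([1, 1, 1, 1, 0, 0, 1, 1], dtype=bool)
# Extracted mask: road pixels found by the automated image-processing/CAD chain.
extracted = np.array([1, 1, 0, 1, 0, 1, 1, 0], dtype=bool)

# Omission error: fraction of true road pixels absent from the extraction.
missed = reference & ~extracted
omission_error = missed.sum() / reference.sum()
```

Computing this error for each degraded pixel size (1 m through 5 m) and picking the resolution that minimizes it is the selection logic behind the 3-metre recommendation.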

  12. Thinking graphically: Connecting vision and cognition during graph comprehension.

    PubMed

    Ratwani, Raj M; Trafton, J Gregory; Boehm-Davis, Deborah A

    2008-03-01

    Task analytic theories of graph comprehension account for the perceptual and conceptual processes required to extract specific information from graphs. Comparatively, the processes underlying information integration have received less attention. We propose a new framework for information integration that highlights visual integration and cognitive integration. During visual integration, pattern recognition processes are used to form visual clusters of information; these visual clusters are then used to reason about the graph during cognitive integration. In 3 experiments, the processes required to extract specific information and to integrate information were examined by collecting verbal protocol and eye movement data. Results supported the task analytic theories for specific information extraction and the processes of visual and cognitive integration for integrative questions. Further, the integrative processes scaled up as graph complexity increased, highlighting the importance of these processes for integration in more complex graphs. Finally, based on this framework, design principles to improve both visual and cognitive integration are described. PsycINFO Database Record (c) 2008 APA, all rights reserved

  13. Brain Dynamics Sustaining Rapid Rule Extraction from Speech

    ERIC Educational Resources Information Center

    de Diego-Balaguer, Ruth; Fuentemilla, Lluis; Rodriguez-Fornells, Antoni

    2011-01-01

    Language acquisition is a complex process that requires the synergic involvement of different cognitive functions, which include extracting and storing the words of the language and their embedded rules for progressive acquisition of grammatical information. As has been shown in other fields that study learning processes, synchronization…

  14. Low-energy route for alcohol/gasohol recovery from fermentor beer. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mix, T.W.

    1982-03-01

    The production of gasohol directly from fermentor beer and gasoline is feasible and will enable a major reduction in the energy requirements for gasohol production. The fermentor beer is first enriched in a beer still to a 69 mol % ethanol, 31 mol % water product, which is then dehydrated by extractive distillation with gasoline as the extractive agent. Gasohol is produced directly. In one version of the process, a heavy cut of gasoline, presumed available at a refinery before blending in of light components, is used as the extractive agent. The enriching column overhead vapors are used to reboil the extractive distillation and steam stripping columns and to contribute to the preheating of the fermentor beer feed. Light components are blended into the heavy-cut/ethanol bottom product from the extractive distillation column to form the desired gasohol. Energy requirements, including feed preheat, are 11,000 Btu per gallon of ethanol in the product gasohol. Steam at 150 psi is required. In a second version, full-range gasoline is used as the extractive agent. The enriching column overhead vapors are again used to reboil the extractive distillation and steam stripping columns and to contribute to the preheating of the fermentor beer feed. Light gasoline components recovered from the decanter following the overhead condenser of the extractive distillation column are blended in with the gasoline-ethanol product leaving the bottom of the extractive distillation column to form the desired gasohol. Energy requirements in this case are 13,000 Btu per gallon of ethanol in the product gasohol. In both of the above cases it is energy-conservative and desirable from a process standpoint to feed the enriched alcohol to the extractive distillation column as a liquid rather than as a vapor.

  15. Accelerating sample preparation through enzyme-assisted microfiltration of Salmonella in chicken extract

    USDA-ARS?s Scientific Manuscript database

    Microfiltration of chicken extracts has the potential to significantly decrease the time required to detect Salmonella, as long as the extract can be efficiently filtered and the pathogenic microorganisms kept in a viable state during this process. We present conditions that enable microfiltration ...

  16. Enhanced extraction of butyric acid under high-pressure CO2 conditions to integrate chemical catalysis for value-added chemicals and biofuels.

    PubMed

    Chun, Jaesung; Choi, Okkyoung; Sang, Byoung-In

    2018-01-01

    Extractive fermentation with removal of a carboxylic acid requires low-pH conditions, because acids partition better into the solvent phase at low pH values. However, this requirement conflicts with the optimal near-neutral pH conditions for microbial growth. CO2 pressurization was used, instead of the addition of chemicals, to decrease the pH for the extraction of butyric acid, a fermentation product of Clostridium tyrobutyricum, and butyl butyrate was selected as the extractant. CO2 pressurization (50 bar) improved the extraction efficiency of butyric acid from a solution at pH 6, yielding a distribution coefficient (D) of 0.42. In situ removal of butyric acid during fermentation increased butyric acid productivity to up to 4.10 g/L·h, an almost twofold increase over the control without an extraction process. In situ extraction of butyric acid using temporary CO2 pressurization may be applied to an integrated downstream catalytic process for upgrading butyric acid to value-added chemicals in an organic solvent.
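The pH effect described above follows from the fact that only the undissociated acid (HA) partitions into the organic phase; the Henderson-Hasselbalch relation gives its fraction. Butyric acid's pKa is about 4.82; the intrinsic distribution coefficient below is an assumed illustrative value, not the paper's D = 0.42.

```python
PKA = 4.82  # pKa of butyric acid

def undissociated_fraction(ph, pka=PKA):
    """Fraction of total butyric acid present as extractable HA at a given pH:
    f_HA = 1 / (1 + 10**(pH - pKa))."""
    return 1.0 / (1.0 + 10.0 ** (ph - pka))

def effective_d(ph, d_ha=5.0):
    """Apparent distribution coefficient D_eff = D_HA * f_HA.
    d_ha = 5.0 is an assumed value for illustration only."""
    return d_ha * undissociated_fraction(ph)

f6 = undissociated_fraction(6.0)  # only a few percent extractable at pH 6
f4 = undissociated_fraction(4.0)  # most of the acid extractable near pH 4
```

This is why dropping the broth pH, here via reversible CO2 pressurization rather than added acid, multiplies the extractable fraction and hence the apparent D.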

  17. The extraction of bitumen from western oil sands. Annual report, July 1991--July 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oblad, A.G.; Bunger, J.W.; Dahlstrom, D.A.

    1992-08-01

    The University of Utah tar sand research and development program is concerned with research and development on Utah's extensive oil sands deposits. The program is intended to develop the scientific and technological base required for eventual commercial recovery of the heavy oils from oil sands and for processing these oils to produce synthetic crude oil and other products such as asphalt. The overall program is based on mining the oil sand, processing the mined sand to recover the heavy oils, and upgrading them to products. Multiple deposits are being investigated, since it is believed that a large-scale (approximately 20,000 bbl/day) plant would require the use of resources from more than one deposit. The tasks or projects in the program are organized according to the following classification: recovery technologies, which include thermal recovery methods, water extraction methods, and solvent extraction methods; upgrading and processing technologies, which cover hydrotreating, hydrocracking, and hydropyrolysis; solvent extraction; production of specialty products; and environmental aspects of the production and processing technologies. These tasks are covered in this report.

  18. Reactive extraction at liquid-liquid systems

    NASA Astrophysics Data System (ADS)

    Wieszczycka, Karolina

    2018-01-01

    The chapter summarizes the state of knowledge about metal transport in two-phase systems. The first part of this review focuses on the distribution law and the determination of the main factors in classical solvent extraction (solubility and polarity of the solute, as well as inter- and intramolecular interactions). The next part of the chapter is devoted to reactive solvent extraction and to molecular modeling, which requires knowledge of the type of extractant, the complexation mechanisms, metal-ion speciation and oxidation during complex formation, and other parameters that make it possible to understand the extraction process. The kinetic data needed for proper modeling, simulation, and design of the processes required for critical separations are also discussed. Extraction in liquid-solid systems using solvent-impregnated resins is largely analogous to the corresponding solvent extraction, so this subject is also presented across all aspects of the separation process (equilibrium, mechanism, kinetics).
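The distribution-law bookkeeping discussed above has a compact form for cross-current contacting: with a constant distribution coefficient D = C_org/C_aq, the solute fraction left in the aqueous phase after n contacts with fresh solvent is (1 + D·V_org/V_aq)^(-n). A sketch, with illustrative numbers:

```python
def fraction_remaining(d, phase_ratio, stages):
    """Aqueous solute fraction left after `stages` cross-current contacts.

    d: distribution coefficient C_org/C_aq (assumed constant, i.e. ideal
       distribution-law behavior with no reactive complications).
    phase_ratio: V_org/V_aq used in each contact.
    """
    return (1.0 / (1.0 + d * phase_ratio)) ** stages

# One large contact vs. three smaller ones using the same total solvent volume:
single = fraction_remaining(d=4.0, phase_ratio=1.5, stages=1)  # 1/7 left
staged = fraction_remaining(d=4.0, phase_ratio=0.5, stages=3)  # 1/27 left
```

Splitting the same solvent volume across several contacts removes substantially more solute, one reason staged contactors dominate industrial practice; reactive extraction then modifies D itself through the complexation equilibria the chapter describes.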

  19. Biosafety evaluation of the DNA extraction protocol for Mycobacterium tuberculosis complex species, as implemented at the Instituto Nacional de Salud, Colombia.

    PubMed

    Castro, Claudia; González, Liliana; Rozo, Juan Carlos; Puerto, Gloria; Ribón, Wellman

    2009-12-01

    Manipulating Mycobacterium tuberculosis clinical specimens and cultures represents a risk factor for laboratory personnel. One of the processes that requires high concentrations of microorganisms is DNA extraction for molecular procedures. Pulmonary tuberculosis cases have occurred among professionals in charge of molecular procedures that require manipulation of massive quantities of microorganisms. This has prompted research studies on the biosafety aspects of extraction protocols; however, no consensus has yet been reached regarding the risks associated with the process. Biosafety was evaluated for the DNA extraction protocol of van Soolingen et al. (2002) by determining M. tuberculosis viability at each stage of the process. Eight hundred eighty cultures were grown from 220 M. tuberculosis clinical isolates that had been processed through the first three DNA extraction stages. Molecular identification of positive cultures used PCR amplification of a fragment of the heat shock protein PRA-hsp65 and examination of its restriction enzyme profile (spoligotyping). Growth was seen in one culture with one of the procedures used. Its molecular characterization did not correspond to the initially analyzed isolate and was therefore deduced to be the product of cross-contamination. The DNA extraction protocol described by van Soolingen et al. (2002), as implemented at the Instituto Nacional de Salud, was established to be safe for laboratory personnel as well as for the environment.

  20. Progress on lipid extraction from wet algal biomass for biodiesel production.

    PubMed

    Ghasemi Naghdi, Forough; González González, Lina M; Chan, William; Schenk, Peer M

    2016-11-01

    Lipid recovery and purification from microalgal cells continues to be a significant bottleneck in biodiesel production due to high costs involved and a high energy demand. Therefore, there is a considerable necessity to develop an extraction method which meets the essential requirements of being safe, cost-effective, robust, efficient, selective, environmentally friendly, feasible for large-scale production and free of product contamination. The use of wet concentrated algal biomass as a feedstock for oil extraction is especially desirable as it would avoid the requirement for further concentration and/or drying. This would save considerable costs and circumvent at least two lengthy processes during algae-based oil production. This article provides an overview on recent progress that has been made on the extraction of lipids from wet algal biomass. The biggest contributing factors appear to be the composition of algal cell walls, pre-treatments of biomass and the use of solvents (e.g. a solvent mixture or solvent-free lipid extraction). We compare recently developed wet extraction processes for oleaginous microalgae and make recommendations towards future research to improve lipid extraction from wet algal biomass. © 2016 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.

  1. Diatom Milking: A Review and New Approaches

    PubMed Central

    Vinayak, Vandana; Manoylov, Kalina M.; Gateau, Hélène; Blanckaert, Vincent; Hérault, Josiane; Pencréac’h, Gaëlle; Marchand, Justine; Gordon, Richard; Schoefs, Benoît

    2015-01-01

    The rise of human populations and the growth of cities contribute to the depletion of natural resources, increase their cost, and create potential climatic changes. To overcome difficulties in supplying populations and reducing the resource cost, a search for alternative pharmaceutical, nanotechnology, and energy sources has begun. Among the alternative sources, microalgae are the most promising because they use carbon dioxide (CO2) to produce biomass and/or valuable compounds. Once produced, the biomass is ordinarily harvested and processed (downstream program). Drying, grinding, and extraction steps are destructive to the microalgal biomass that then needs to be renewed. The extraction and purification processes generate organic wastes and require substantial energy inputs. Altogether, it is urgent to develop alternative downstream processes. Among the possibilities, milking invokes the concept that the extraction should not kill the algal cells. Therefore, it does not require growing the algae anew. In this review, we discuss research on milking of diatoms. The main themes are (a) development of alternative methods to extract and harvest high added value compounds; (b) design of photobioreactors; (c) biodiversity and (d) stress physiology, illustrated with original results dealing with oleaginous diatoms. PMID:25939034

  2. System for high throughput water extraction from soil material for stable isotope analysis of water

    USDA-ARS?s Scientific Manuscript database

    A major limitation in the use of stable isotope of water in ecological studies is the time that is required to extract water from soil and plant samples. Using vacuum distillation the extraction time can be less than one hour per sample. Therefore, assembling a distillation system that can process m...

  3. Processes for metal extraction

    NASA Technical Reports Server (NTRS)

    Bowersox, David F.

    1992-01-01

    This report describes the processing of plutonium at Los Alamos National Laboratory (LANL), an operation illustrating concepts that may be applicable to the processing of lunar materials. The toxic nature of plutonium requires a highly closed processing system, a constraint also relevant to the processing of lunar surface materials.

  4. Analysis of the production process of optically pure D-lactic acid from raw glycerol using engineered Escherichia coli strains.

    PubMed

    Posada, John A; Cardona, Carlos A; Gonzalez, Ramon

    2012-02-01

    Glycerol has become an ideal feedstock for producing fuels and chemicals. Here, five technological schemes for optically pure D-lactic acid production from raw glycerol were designed, simulated, and economically assessed based on five fermentative scenarios using engineered Escherichia coli strains. The fermentative scenarios considered different qualities of glycerol (pure, 98 wt.%, and crude, 85 wt.%) at concentrations ranging from 20 to 60 g/l in the fermentation media, and two fermentation stages were also analyzed. Raw glycerol (60 wt.%) was considered as the feedstock for the production process in all cases; a purification step to bring the raw glycerol up to the required quality was therefore included. Process simulations were carried out using Aspen Plus, while economic assessments were performed using Aspen Icarus Process Evaluator. D-Lactic acid recovery and purification were based on reactive extraction with tri-n-octylamine, using dichloromethane as the active extraction agent. The use of raw glycerol represents only between 2.4% and 7.8% of the total production costs. Moreover, the total production cost of D-lactic acid was in all cases lower than its sale price, indicating that these processes are potentially profitable. The best process configuration uses crude glycerol diluted to 40 g/l, with total glycerol consumption and with D-lactic acid recovered by reactive extraction. The lowest total production cost obtained was 1.015 US$/kg, with a sale price/production cost ratio of 1.53.

  5. Autism, Context/Noncontext Information Processing, and Atypical Development

    PubMed Central

    Skoyles, John R.

    2011-01-01

    Autism has been attributed to a deficit in contextual information processing. Attempts to understand autism in terms of such a defect, however, do not include more recent computational work on context. This work has identified that context information processing depends upon the extraction and use of the information hidden in higher-order (or indirect) associations. Higher-order associations underlie the cognition of context rather than that of situations. This paper starts by examining the differences between higher-order and first-order (or direct) associations. Higher-order associations link entities not directly (as with first-order ones) but indirectly through all the connections they have via other entities. Extracting this information requires the processing of past episodes as a totality. As a result, this extraction depends upon specialised extraction processes separate from cognition. This information is then consolidated. Due to this difference, the extraction/consolidation of higher-order information can be impaired whilst cognition remains intact. Although not directly impaired, cognition will be indirectly impaired by knock-on effects such as cognition compensating for absent higher-order information with information extracted from first-order associations. This paper discusses the implications of this for the inflexible, literal/immediate, and inappropriate information processing of autistic individuals. PMID:22937255

  6. Extracting DNA from FFPE Tissue Biospecimens Using User-Friendly Automated Technology: Is There an Impact on Yield or Quality?

    PubMed

    Mathieson, William; Guljar, Nafia; Sanchez, Ignacio; Sroya, Manveer; Thomas, Gerry A

    2018-05-03

    DNA extracted from formalin-fixed, paraffin-embedded (FFPE) tissue blocks is amenable to analytical techniques, including sequencing. DNA extraction protocols are typically long and complex, often involving an overnight proteinase K digest. Automated platforms that shorten and simplify the process are therefore an attractive proposition for users wanting a faster turn-around or to process large numbers of biospecimens. It is, however, unclear whether automated extraction systems return poorer DNA yields or quality than manual extractions performed by experienced technicians. We extracted DNA from 42 FFPE clinical tissue biospecimens using the QiaCube (Qiagen) and ExScale (ExScale Biospecimen Solutions) automated platforms, comparing DNA yields and integrities with those from manual extractions. The QIAamp DNA FFPE Spin Column Kit was used for manual and QiaCube DNA extractions and the ExScale extractions were performed using two of the manufacturer's magnetic bead kits: one extracting DNA only and the other simultaneously extracting DNA and RNA. In all automated extraction methods, DNA yields and integrities (assayed using DNA Integrity Numbers from a 4200 TapeStation and the qPCR-based Illumina FFPE QC Assay) were poorer than in the manual method, with the QiaCube system performing better than the ExScale system. However, ExScale was fastest, offered the highest reproducibility when extracting DNA only, and required the least intervention or technician experience. Thus, the extraction methods have different strengths and weaknesses, would appeal to different users with different requirements, and therefore, we cannot recommend one method over another.

  7. Coupling alkaline pre-extraction with alkaline-oxidative post-treatment of corn stover to enhance enzymatic hydrolysis and fermentability.

    PubMed

    Liu, Tongjun; Williams, Daniel L; Pattathil, Sivakumar; Li, Muyang; Hahn, Michael G; Hodge, David B

    2014-04-03

    A two-stage chemical pretreatment of corn stover is investigated comprising an NaOH pre-extraction followed by an alkaline hydrogen peroxide (AHP) post-treatment. We propose that conventional one-stage AHP pretreatment can be improved using alkaline pre-extraction, which requires significantly less H2O2 and NaOH. To better understand the potential of this approach, this study investigates several components of this process including alkaline pre-extraction, alkaline and alkaline-oxidative post-treatment, fermentation, and the composition of alkali extracts. Mild NaOH pre-extraction of corn stover uses less than 0.1 g NaOH per g corn stover at 80°C. The resulting substrates were highly digestible by cellulolytic enzymes at relatively low enzyme loadings and had a strong susceptibility to drying-induced hydrolysis yield losses. Alkaline pre-extraction was highly selective for lignin removal over xylan removal; xylan removal was relatively minimal (~20%). During alkaline pre-extraction, up to 0.10 g of alkali was consumed per g of corn stover. AHP post-treatment at low oxidant loading (25 mg H2O2 per g pre-extracted biomass) increased glucose hydrolysis yields by 5%, which approached near-theoretical yields. ELISA screening of alkali pre-extraction liquors and the AHP post-treatment liquors demonstrated that xyloglucan and β-glucans likely remained tightly bound in the biomass whereas the majority of the soluble polymeric xylans were glucurono (arabino) xylans and potentially homoxylans. Pectic polysaccharides were depleted in the AHP post-treatment liquor relative to the alkaline pre-extraction liquor. Because the already-low inhibitor content was further decreased in the alkaline pre-extraction, the hydrolysates generated by this two-stage pretreatment were highly fermentable by Saccharomyces cerevisiae strains that were metabolically engineered and evolved for xylose fermentation. 
This work demonstrates that this two-stage pretreatment process is well suited for converting lignocellulose to fermentable sugars and biofuels, such as ethanol. This approach achieved high enzymatic sugars yields from pretreated corn stover using substantially lower oxidant loadings than have been reported previously in the literature. This pretreatment approach allows for many possible process configurations involving novel alkali recovery approaches and novel uses of alkaline pre-extraction liquors. Further work is required to identify the most economical configuration, including process designs using techno-economic analysis and investigating processing strategies that economize water use.
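    The reported chemical loadings lend themselves to a quick mass-balance sketch. In the Python snippet below, the batch size and the pre-extraction mass-loss fraction are assumed for illustration (not from the paper); only the two loading figures come from the abstract.

    ```python
    # Loadings from the abstract: <= 0.1 g NaOH per g raw corn stover
    # (pre-extraction) and 25 mg H2O2 per g PRE-EXTRACTED biomass
    # (post-treatment). Batch size and the solubilized fraction are
    # hypothetical illustrative values.
    stover_kg = 1000.0            # assumed batch of raw corn stover
    naoh_per_g = 0.10             # g NaOH / g raw stover
    h2o2_per_g = 0.025            # g H2O2 / g pre-extracted biomass (25 mg/g)
    solubilized_fraction = 0.30   # assumed mass loss during pre-extraction

    naoh_kg = stover_kg * naoh_per_g
    pre_extracted_kg = stover_kg * (1 - solubilized_fraction)
    h2o2_kg = pre_extracted_kg * h2o2_per_g
    ```

    Even with a generous assumed mass loss, the peroxide demand (here 17.5 kg per 1000 kg batch) is an order of magnitude below the alkali demand, consistent with the abstract's point that the two-stage scheme requires significantly less H2O2.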

  8. The Solid Phase Curing Time Effect of Asbuton with Texapon Emulsifier at the Optimum Bitumen Content

    NASA Astrophysics Data System (ADS)

    Sarwono, D.; Surya D, R.; Setyawan, A.; Djumari

    2017-07-01

    Buton asphalt (asbuton) has not been utilized optimally in Indonesia. Its utilization rate remains low because processed asbuton products are still impractical to use and expensive to produce. This research aimed to obtain, through an extraction process, an asphalt product from asbuton that is practical to use and does not require expensive processing. The research was conducted experimentally in the laboratory. The emulsified asbuton was composed of 5/20-grain asbuton, premium, Texapon, HCl, and distilled water. The solid phase was a mixture of 5/20-grain asbuton and premium with a 3-minute mixing time; the liquid phase consisted of Texapon, HCl, and distilled water. After mixing, the solid phase was cured (aged) so that the reaction and binding within the solid phase became more complete, yielding asphalt with a higher solubility level. The curing times studied were 30, 60, 90, 120, and 150 minutes. The solid and liquid phases were then mixed to produce emulsified asbuton, which was extracted for 25 minutes. Asphalt solubility, water content, and asphalt characteristics were tested on the extract with the highest asphalt content. The highest asphalt solubility, 94.77%, was obtained at a curing time of 120 minutes. Water-content testing showed that the water content of the emulsified asbuton decreased with longer solid-phase curing. Characterization of the extract with the optimum asphalt solubility yielded specimens with a rigid, strong texture, so the ductility and penetration values did not meet requirements.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damtie, Fikeraddis A., E-mail: Fikeraddis.Damtie@teorfys.lu.se; Wacker, Andreas, E-mail: Andreas.Wacker@fysik.lu.se; Karki, Khadga J., E-mail: Khadga.Karki@chemphys.lu.se

    Multiple exciton generation (MEG) is a process in which more than one electron-hole pair is generated per absorbed photon, allowing an increase in the efficiency of solar energy harvesting. Experimental studies have shown a multiple exciton generation yield of 1.2 in isolated colloidal quantum dots. However, real photoelectric devices require the extraction of electron-hole pairs to electric contacts. We provide a systematic study of the corresponding quantum coherent processes, including extraction and injection, and show that a proper design of extraction and injection rates enhances the yield significantly, up to values around 1.6.
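    The competition between extraction and loss can be illustrated with a toy branching-rate model. This is a deliberate simplification (the paper treats quantum coherent dynamics, not classical rate branching), and all rate values below are hypothetical.

    ```python
    # Toy classical branching model, NOT the paper's quantum-coherent
    # treatment: each absorbed photon yields (1 + p_meg) excitons, and
    # each exciton is extracted with probability k_ext / (k_ext + k_rec).
    def extraction_yield(p_meg, k_ext, k_rec):
        """Carriers extracted per absorbed photon."""
        excitons_per_photon = 1.0 + p_meg
        p_extract = k_ext / (k_ext + k_rec)
        return excitons_per_photon * p_extract

    # Illustrative rates (arbitrary units): fast extraction preserves
    # most of the MEG benefit; slow extraction loses it to recombination.
    fast = extraction_yield(p_meg=0.8, k_ext=10.0, k_rec=1.0)  # ~1.64
    slow = extraction_yield(p_meg=0.8, k_ext=1.0, k_rec=10.0)  # ~0.16
    ```

    Even this crude model reproduces the qualitative message: the achievable yield depends as much on the design of the extraction and injection rates as on the MEG probability itself.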

  10. Solvent Extraction for Vegetable Oil Production: National Emission Standards for Hazardous Air Pollutants (NESHAP)

    EPA Pesticide Factsheets

    The EPA has identified solvent extraction for vegetable oil production processes as major sources of a single hazardous air pollutant (HAP), n-hexane. Learn more about the rule requirements and regulations, as well as find compliance help.

  11. Downstream processing of hyperforin from Hypericum perforatum root cultures.

    PubMed

    Haas, Paul; Gaid, Mariam; Zarinwall, Ajmal; Beerhues, Ludger; Scholl, Stephan

    2018-05-01

    Hyperforin is a major metabolite of the medicinal plant Hypericum perforatum (St. John's Wort) and has recently been found in hormone-induced root cultures. The objective of this study is to identify a downstream process for the production of a hyperforin-rich extract with maximum extraction efficiency and minimal decomposition. The maximum extraction time was found to be 60 min. Two equipment concepts for extraction and solvent evaporation were compared using two different solvents. While the rotary mixer showed better extraction efficiency than a stirred vessel, the latter set-up could handle larger volumes but did not meet all process requirements. For solvent removal, prompt evaporation of the extraction agent by nitrogen stripping led to only minor decomposition. In a 5 L stirred vessel, the highest specific extraction of hyperforin was 4.3 mg hyperforin/g dry-weight biomaterial. Parameters for the equipment design for extraction and solvent evaporation were determined based on the experimental data. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Large scale extraction of poly(3-hydroxybutyrate) from Ralstonia eutropha H16 using sodium hypochlorite

    PubMed Central

    2012-01-01

    Isolation of polyhydroxyalkanoates (PHAs) from bacterial cell matter is a critical step in achieving profitable production of the polymer. An extraction method must therefore lead to high recovery of a pure product at low cost. This study presents a simplified method for large-scale poly(3-hydroxybutyrate), poly(3HB), extraction using sodium hypochlorite. Poly(3HB) was extracted from cells of Ralstonia eutropha H16 at almost 96% purity. Across different extraction volumes, a maximum recovery rate of 91.32% was obtained. At the largest extraction volume of 50 L, poly(3HB) with an average purity of 93.32% ± 4.62% was extracted with a maximum recovery of 87.03% of the initial poly(3HB) content. This process is easy to handle and requires less effort than previously described processes. PMID:23164136

  13. Reductive stripping process for uranium recovery from organic extracts

    DOEpatents

    Hurst, F.J. Jr.

    1983-06-16

    In the reductive stripping of uranium from an organic extractant in a uranium recovery process, the use of phosphoric acid having a molarity in the range of 8 to 10 increases the efficiency of the reductive stripping and allows the strip step to operate with lower aqueous to organic recycle ratios and shorter retention time in the mixer stages. Under these operating conditions, less solvent is required in the process, and smaller, less expensive process equipment can be utilized. The high-strength H3PO4 is available from the evaporator stage of the process.

  14. Reductive stripping process for uranium recovery from organic extracts

    DOEpatents

    Hurst, Jr., Fred J.

    1985-01-01

    In the reductive stripping of uranium from an organic extractant in a uranium recovery process, the use of phosphoric acid having a molarity in the range of 8 to 10 increases the efficiency of the reductive stripping and allows the strip step to operate with lower aqueous to organic recycle ratios and shorter retention time in the mixer stages. Under these operating conditions, less solvent is required in the process, and smaller, less expensive process equipment can be utilized. The high-strength H3PO4 is available from the evaporator stage of the process.

  15. The atmosphere of Mars - Resources for the exploration and settlement of Mars

    NASA Technical Reports Server (NTRS)

    Meyer, T. R.; Mckay, C. P.

    1984-01-01

    This paper describes methods of processing the Mars atmosphere to supply water, oxygen, and buffer gas for a Mars base. Existing life-support-system technology is combined with innovative methods of water extraction and buffer gas processing. The design may also be extended to incorporate an integrated greenhouse to supply food, oxygen, and water recycling. It is found that the work required to supply one kilogram of an argon/nitrogen buffer gas is 9.4 kW-hr. Extracting water from the dry Martian atmosphere can require up to 102.8 kW-hr per kilogram of water, depending on the relative humidity of the air.
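    The two specific-energy figures in the abstract allow a rough power budget to be sketched. In the Python snippet below, the daily makeup quantities of buffer gas and water are assumed for illustration only; the per-kilogram energies are the abstract's values.

    ```python
    # Specific energies from the abstract.
    E_BUFFER = 9.4     # kW-hr per kg argon/nitrogen buffer gas
    E_WATER = 102.8    # kW-hr per kg water (worst case, dry atmosphere)

    # Hypothetical daily demands for a small base (assumed values).
    buffer_kg_per_day = 1.0   # leakage makeup
    water_kg_per_day = 3.0    # net water demand after recycling

    daily_kwh = buffer_kg_per_day * E_BUFFER + water_kg_per_day * E_WATER
    avg_power_kw = daily_kwh / 24.0   # continuous power needed, ~13.2 kW
    ```

    Under these assumptions, water extraction dominates the budget (over 300 kW-hr/day versus under 10 for buffer gas), which is why the abstract flags atmospheric humidity as the critical variable.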

  16. Extraction, scrub, and strip test results for the salt waste processing facility caustic side solvent extraction solvent sample

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peters, T. B.

    An Extraction, Scrub, and Strip (ESS) test was performed on a sample of Salt Waste Processing Facility (SWPF) Caustic-Side Solvent Extraction (CSSX) solvent and salt simulant to determine cesium distribution ratios (D(Cs)) and the cesium concentration in the strip effluent (SE) and decontaminated salt solution (DSS) streams; these data will be used by Parsons to help determine whether the solvent is qualified for use at the SWPF. The ESS test showed acceptable performance of the solvent for extraction, scrub, and strip operations. The extraction D(Cs) measured 12.9, exceeding the required value of 8. This value is consistent with results from previous ESS tests using similar solvent formulations. Similarly, scrub and strip cesium distribution ratios fell within acceptable ranges.
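    The distribution ratio is simply the equilibrium concentration ratio between phases. The Python sketch below uses hypothetical equilibrium concentrations chosen to reproduce the reported value; only the measured D(Cs) of 12.9 and the ≥ 8 acceptance limit come from the abstract.

    ```python
    # Cesium distribution ratio: D(Cs) = [Cs]_organic / [Cs]_aqueous
    # at equilibrium after contacting the phases.
    def distribution_ratio(cs_org, cs_aq):
        """Concentration in organic phase over aqueous phase."""
        return cs_org / cs_aq

    # Hypothetical equilibrium concentrations (arbitrary units), chosen
    # to reproduce the reported extraction D(Cs) of 12.9.
    d_cs = distribution_ratio(cs_org=6.45e-5, cs_aq=5.0e-6)

    REQUIRED_D_CS = 8.0   # acceptance limit from the abstract
    passes = d_cs >= REQUIRED_D_CS
    ```

    A higher D(Cs) means more cesium partitions into the solvent per contact stage, which is why the acceptance test is stated as a lower bound.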

  17. An Environmental Impact Analysis of Semi-Mechanical Extraction Process of Sago Starch: Life Cycle Assessment (LCA) Perspective

    NASA Astrophysics Data System (ADS)

    Yusuf, M. A.; Romli, M.; Suprihatin; Wiloso, E. I.

    2018-05-01

    Industrial activities use material, energy, and water resources and generate greenhouse gases (GHGs). Various regulations now require industry to measure and quantify the emissions generated by its process activities. LCA is a method that can be used to analyze and report the environmental impact of an activity that uses resources and generates waste. In this work, LCA is used to determine the environmental impact of a semi-mechanical extraction process in the sago industry. Data were collected from the sago industry in Cimahpar, Bogor. The extraction of sago starch consists of stem cutting, rasping, mixing, filtration, starch sedimentation, washing, and drying. The scope of the LCA study covers the harvesting of sago stems, transportation to the extraction site, and the starch extraction process. Assuming an average transportation distance of 200 km from sago stem source to extraction site, the GHG emission is estimated at 325 kg CO2 eq/ton of sun-dried sago starch. This figure is lower than those reported for maize starch (1120 kg CO2 eq), potato starch (2232 kg CO2 eq), and cassava starch (4310 kg CO2 eq), most likely because the impact of electrical energy use in the extraction process has not yet been counted; that accounting is currently being conducted. A follow-up study is also underway to formulate several process-improvement scenarios and derive a sago starch processing design that generates minimum emissions.

  18. Coupling alkaline pre-extraction with alkaline-oxidative post-treatment of corn stover to enhance enzymatic hydrolysis and fermentability

    PubMed Central

    2014-01-01

    Background A two-stage chemical pretreatment of corn stover is investigated comprising an NaOH pre-extraction followed by an alkaline hydrogen peroxide (AHP) post-treatment. We propose that conventional one-stage AHP pretreatment can be improved using alkaline pre-extraction, which requires significantly less H2O2 and NaOH. To better understand the potential of this approach, this study investigates several components of this process including alkaline pre-extraction, alkaline and alkaline-oxidative post-treatment, fermentation, and the composition of alkali extracts. Results Mild NaOH pre-extraction of corn stover uses less than 0.1 g NaOH per g corn stover at 80°C. The resulting substrates were highly digestible by cellulolytic enzymes at relatively low enzyme loadings and had a strong susceptibility to drying-induced hydrolysis yield losses. Alkaline pre-extraction was highly selective for lignin removal over xylan removal; xylan removal was relatively minimal (~20%). During alkaline pre-extraction, up to 0.10 g of alkali was consumed per g of corn stover. AHP post-treatment at low oxidant loading (25 mg H2O2 per g pre-extracted biomass) increased glucose hydrolysis yields by 5%, which approached near-theoretical yields. ELISA screening of alkali pre-extraction liquors and the AHP post-treatment liquors demonstrated that xyloglucan and β-glucans likely remained tightly bound in the biomass whereas the majority of the soluble polymeric xylans were glucurono (arabino) xylans and potentially homoxylans. Pectic polysaccharides were depleted in the AHP post-treatment liquor relative to the alkaline pre-extraction liquor. Because the already-low inhibitor content was further decreased in the alkaline pre-extraction, the hydrolysates generated by this two-stage pretreatment were highly fermentable by Saccharomyces cerevisiae strains that were metabolically engineered and evolved for xylose fermentation. 
Conclusions This work demonstrates that this two-stage pretreatment process is well suited for converting lignocellulose to fermentable sugars and biofuels, such as ethanol. This approach achieved high enzymatic sugars yields from pretreated corn stover using substantially lower oxidant loadings than have been reported previously in the literature. This pretreatment approach allows for many possible process configurations involving novel alkali recovery approaches and novel uses of alkaline pre-extraction liquors. Further work is required to identify the most economical configuration, including process designs using techno-economic analysis and investigating processing strategies that economize water use. PMID:24693882

  19. The Effect of Salts on Electrospray Ionization of Amino Acids in the Negative Mode

    NASA Technical Reports Server (NTRS)

    Kim, H. I.; Johnson, P. V.; Beegle, L. W.; Kanik, I.

    2004-01-01

    The continued search for organics on Mars will require the development of simplified procedures for handling and processing of soil or rock core samples prior to analysis by onboard instrumentation. Extraction of certain organic molecules such as amino acids from rock and soil samples using a liquid solvent (H2O) has been shown to be more efficient (by approximately an order of magnitude) than heat extraction methods. As such, liquid extraction (using H2O) of amino acid molecules from rock cores or regolith material is a prime candidate for the required processing. In this scenario, electrospray ionization (ESI) of the liquid extract would be a natural choice for ionization of the analyte prior to interrogation by one of a variety of potential analytical separation techniques (mass spectrometry, ion mobility spectrometry, etc.). Aside from the obvious compatibility of ESI and liquid samples, ESI offers simplicity and a soft ionization capability. In order to demonstrate that liquid extraction and ESI can work as part of an in situ instrument on Mars, we must better understand and quantify the effect salts have on the ESI process. In the current work, we have endeavored to investigate the feasibility and limitations of negative mode ESI of Martian surface samples in the context of sample salt content using ion mobility spectrometry (IMS).

  20. Low-power coprocessor for Haar-like feature extraction with pixel-based pipelined architecture

    NASA Astrophysics Data System (ADS)

    Luo, Aiwen; An, Fengwei; Fujita, Yuki; Zhang, Xiangyu; Chen, Lei; Jürgen Mattausch, Hans

    2017-04-01

    Intelligent analysis of image and video data requires image-feature extraction as an important processing capability for machine-vision realization. A coprocessor with pixel-based pipeline (CFEPP) architecture is developed for real-time Haar-like cell-based feature extraction. Synchronization with the image sensor's pixel frequency and immediate usage of each input pixel for the feature-construction process avoids the dependence on memory-intensive conventional strategies like integral-image construction or frame buffers. One 180 nm CMOS prototype can extract the 1680-dimensional Haar-like feature vectors, applied in the speeded up robust features (SURF) scheme, using an on-chip memory of only 96 kb (kilobit). Additionally, a low power dissipation of only 43.45 mW at a 1.8 V supply voltage is achieved during VGA video processing at 120 MHz, with more than 325 fps. The Haar-like feature-extraction coprocessor is further evaluated in the practical application of vehicle recognition, achieving the expected high accuracy, comparable to previous work.
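    The core idea of the pixel-based pipeline can be sketched in software: each pixel arriving in raster order immediately updates the feature accumulators, so no integral image or frame buffer is ever built. The Python sketch below uses a single two-rectangle Haar-like feature over a synthetic frame; this is a drastic simplification of the chip's 1680-dimensional SURF-style feature set and is illustrative only.

    ```python
    # One two-rectangle (left minus right) Haar-like feature, updated
    # pixel by pixel in raster order -- the streaming alternative to
    # integral-image construction.
    def stream_haar_feature(frame, width, height):
        acc_left = acc_right = 0
        for y in range(height):
            for x in range(width):
                p = frame[y][x]        # one pixel per "clock cycle"
                if x < width // 2:
                    acc_left += p      # left rectangle sum
                else:
                    acc_right += p     # right rectangle sum
        return acc_left - acc_right    # Haar-like response

    # Synthetic 8x8 frame: left half darker (1), right half brighter (3).
    frame = [[1] * 4 + [3] * 4 for _ in range(8)]
    feature = stream_haar_feature(frame, 8, 8)   # negative: right is brighter
    ```

    In hardware, the per-pixel branch becomes a bank of parallel accumulators selected by the pixel's coordinates, which is what lets the coprocessor keep only the accumulators (96 kb) instead of a whole frame.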

  1. Food allergen extracts to diagnose food-induced allergic diseases: How they are made.

    PubMed

    David, Natalie A; Penumarti, Anusha; Burks, A Wesley; Slater, Jay E

    2017-08-01

    To review the manufacturing procedures of food allergen extracts and applicable regulatory requirements from government agencies, potential approaches to standardization, and clinical application of these products. The effects of thermal processing on allergenicity of common food allergens are also considered. A broad literature review was conducted on the natural history of food allergy, the manufacture of allergen extracts, and the allergenicity of heated food. Regulations, guidance documents, and pharmacopoeias related to food allergen extracts from the United States and Europe were also reviewed. Authoritative and peer-reviewed research articles relevant to the topic were chosen for review. Selected regulations and guidance documents are current and relevant to food allergen extracts. Preparation of a food allergen extract may require careful selection and identification of source materials, grinding, defatting, extraction, clarification, sterilization, and product testing. Although extractions for all products licensed in the United States are performed using raw source materials, many foods are not consumed in their raw form. Heating foods may change their allergenicity, and doing so before extraction may change their allergenicity and the composition of the final product. The manufacture of food allergen extracts requires many considerations to achieve the maximal quality of the final product. Allergen extracts for a select number of foods may be inconsistent between manufacturers or unreliable in a clinical setting, indicating a potential area for future improvement. Copyright © 2016 American College of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.

  2. Design optimization of highly asymmetrical layouts by 2D contour metrology

    NASA Astrophysics Data System (ADS)

    Hu, C. M.; Lo, Fred; Yang, Elvis; Yang, T. H.; Chen, K. C.

    2018-03-01

    As the design pitch shrinks to the resolution limit of up-to-date optical lithography technology, the Critical Dimension (CD) variation tolerance has decreased dramatically to ensure device functionality. One of the critical challenges associated with the narrower CD tolerance over the whole chip area is proximity-effect control in asymmetrical layout environments. For the tight CD control of complex features, Critical Dimension Scanning Electron Microscope (CD-SEM) measurements have become insufficient for qualifying the process window and establishing the Optical Proximity Correction (OPC) model, so 2D contour extraction [1-5] has become an increasingly important approach for complementing the limitations of traditional CD measurement algorithms. To alleviate the long cycle time and high cost of product verification, manufacturing requirements are best handled at the design stage to improve the quality and yield of ICs. In this work, an in-house 2D contour extraction platform was established for layout design optimization of a 39 nm half-pitch Self-Aligned Double Patterning (SADP) process layer. Combined with the adoption of the Process Variation Band Index (PVBI), the contour extraction platform speeds up layout optimization compared with traditional methods. The platform's ability to identify and handle lithography hotspots in complex layout environments allows process-window-aware layout optimization to meet manufacturing requirements.

  3. Sensitivity analysis of coupled processes and parameters on the performance of enhanced geothermal systems.

    PubMed

    Pandey, S N; Vishal, Vikram

    2017-12-06

    3-D modeling of coupled thermo-hydro-mechanical (THM) processes in enhanced geothermal systems was performed using a control-volume finite-element code. For the first time, a comparative analysis of the effects of coupled processes, operational parameters, and reservoir parameters on heat extraction was conducted. We found that a significant temperature drop and fluid overpressure occurred inside the reservoir/fracture, affecting the transport behavior of the fracture. The spatio-temporal variations of fracture aperture greatly impacted the thermal drawdown and consequently the net energy output. The results showed that maximum aperture evolution occurred near the injection zone rather than the production zone. Opening of the fracture reduced the injection pressure required to circulate a fixed mass of water. The thermal breakthrough and heat extraction depend strongly on the injection mass flow rate, well spacing, reservoir permeability, and geothermal gradient. High permeability caused higher water loss, leading to reduced heat extraction. From the results of TH versus THM process simulations, we conclude that appropriate coupling is vital and can affect estimates of net heat extraction. This study can help in identifying critical operational parameters and in process optimization for enhanced energy extraction from a geothermal system.

  4. Droplet-Based Segregation and Extraction of Concentrated Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buie, C R; Buckley, P; Hamilton, J

    2007-02-23

    Microfluidic analysis often requires sample concentration and separation techniques to isolate and detect analytes of interest. Complex or scarce samples may also require an orthogonal separation and detection method or off-chip analysis to confirm results. To perform these additional steps, the concentrated sample plug must be extracted from the primary microfluidic channel with minimal sample loss and dilution. We investigated two extraction techniques: injection of immiscible fluid droplets into the sample stream ("capping") and injection of the sample into an immiscible fluid stream ("extraction"). From our results we conclude that capping is the more effective partitioning technique. Furthermore, this functionality enables additional off-chip post-processing procedures such as DNA/RNA microarray analysis, real-time polymerase chain reaction (RT-PCR), and culture growth to validate chip performance.

  5. Parallel RNA extraction using magnetic beads and a droplet array.

    PubMed

    Shi, Xu; Chen, Chun-Hong; Gao, Weimin; Chao, Shih-Hui; Meldrum, Deirdre R

    2015-02-21

    Nucleic acid extraction is a necessary step for most genomic/transcriptomic analyses, but it often requires complicated mechanisms to be integrated into a lab-on-a-chip device. Here, we present a simple, effective configuration for rapidly obtaining purified RNA from low concentration cell medium. This Total RNA Extraction Droplet Array (TREDA) utilizes an array of surface-adhering droplets to facilitate the transportation of magnetic purification beads seamlessly through individual buffer solutions without solid structures. The fabrication of TREDA chips is rapid and does not require a microfabrication facility or expertise. The process takes less than 5 minutes. When purifying mRNA from bulk marine diatom samples, its repeatability and extraction efficiency are comparable to conventional tube-based operations. We demonstrate that TREDA can extract the total mRNA of about 10 marine diatom cells, indicating that the sensitivity of TREDA approaches single-digit cell numbers.

  6. Parallel RNA extraction using magnetic beads and a droplet array

    PubMed Central

    Shi, Xu; Chen, Chun-Hong; Gao, Weimin; Meldrum, Deirdre R.

    2015-01-01

    Nucleic acid extraction is a necessary step for most genomic/transcriptomic analyses, but it often requires complicated mechanisms to be integrated into a lab-on-a-chip device. Here, we present a simple, effective configuration for rapidly obtaining purified RNA from low concentration cell medium. This Total RNA Extraction Droplet Array (TREDA) utilizes an array of surface-adhering droplets to facilitate the transportation of magnetic purification beads seamlessly through individual buffer solutions without solid structures. The fabrication of TREDA chips is rapid and does not require a microfabrication facility or expertise. The process takes less than 5 minutes. When purifying mRNA from bulk marine diatom samples, its repeatability and extraction efficiency are comparable to conventional tube-based operations. We demonstrate that TREDA can extract the total mRNA of about 10 marine diatom cells, indicating that the sensitivity of TREDA approaches single-digit cell numbers. PMID:25519439

  7. Automated Extraction of Flow Features

    NASA Technical Reports Server (NTRS)

    Dorney, Suzanne (Technical Monitor); Haimes, Robert

    2005-01-01

    Computational Fluid Dynamics (CFD) simulations are routinely performed as part of the design process of most fluid handling devices. In order to efficiently and effectively use the results of a CFD simulation, visualization tools are often used. These tools are used in all stages of the CFD simulation including pre-processing, interim-processing, and post-processing, to interpret the results. Each of these stages requires visualization tools that allow one to examine the geometry of the device, as well as the partial or final results of the simulation. An engineer will typically generate a series of contour and vector plots to better understand the physics of how the fluid is interacting with the physical device. Of particular interest are detecting features such as shocks, re-circulation zones, and vortices (which will highlight areas of stress and loss). As the demand for CFD analyses continues to increase, the need for automated feature extraction capabilities has become vital. In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the results of the more traditional tools, like iso-surfaces, cuts, and streamlines, were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of a great deal of interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments). Methods must be developed to abstract the feature of interest and display it in a manner that physically makes sense.

  8. Automated Extraction of Flow Features

    NASA Technical Reports Server (NTRS)

    Dorney, Suzanne (Technical Monitor); Haimes, Robert

    2004-01-01

    Computational Fluid Dynamics (CFD) simulations are routinely performed as part of the design process of most fluid handling devices. In order to efficiently and effectively use the results of a CFD simulation, visualization tools are often used. These tools are used in all stages of the CFD simulation including pre-processing, interim-processing, and post-processing, to interpret the results. Each of these stages requires visualization tools that allow one to examine the geometry of the device, as well as the partial or final results of the simulation. An engineer will typically generate a series of contour and vector plots to better understand the physics of how the fluid is interacting with the physical device. Of particular interest are detecting features such as shocks, recirculation zones, and vortices (which will highlight areas of stress and loss). As the demand for CFD analyses continues to increase, the need for automated feature extraction capabilities has become vital. In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the results of the more traditional tools, like iso-surfaces, cuts, and streamlines, were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of a great deal of interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments). Methods must be developed to abstract the feature of interest and display it in a manner that physically makes sense.
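    A minimal form of the automated vortex detection described above is a vorticity threshold on the velocity field. The Python/NumPy sketch below (synthetic swirling test field and threshold value are illustrative assumptions, not the paper's method) flags high-vorticity cells as candidate vortex cores without any interactive inspection.

    ```python
    import numpy as np

    # Synthetic 2D velocity field: a compact swirling patch at the origin.
    n = 64
    y, x = np.mgrid[-1:1:complex(0, n), -1:1:complex(0, n)]
    f = np.exp(-(x**2 + y**2) / 0.1)   # localizes the swirl near the center
    u, v = -y * f, x * f               # rotational velocity components

    # Vorticity omega = dv/dx - du/dy via finite differences.
    dx = 2.0 / (n - 1)
    vorticity = np.gradient(v, dx, axis=1) - np.gradient(u, dx, axis=0)

    # Automated feature extraction: threshold picks out the vortex core
    # with no human in the loop (threshold chosen for illustration).
    vortex_mask = vorticity > 1.0
    ```

    The same thresholding applied frame-by-frame to an unsteady simulation is what replaces manually scanning snapshots; production feature extractors use more robust criteria (e.g. critical-point or lambda2 methods) but follow this shape.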

  9. Structuring and extracting knowledge for the support of hypothesis generation in molecular biology

    PubMed Central

    Roos, Marco; Marshall, M Scott; Gibson, Andrew P; Schuemie, Martijn; Meij, Edgar; Katrenko, Sophia; van Hage, Willem Robert; Krommydas, Konstantinos; Adriaans, Pieter W

    2009-01-01

    Background Hypothesis generation in molecular and cellular biology is an empirical process in which knowledge derived from prior experiments is distilled into a comprehensible model. The requirement for automated support is exemplified by the difficulty of considering all relevant facts contained in the millions of documents available from PubMed. The Semantic Web provides tools for sharing prior knowledge, while information retrieval and information extraction techniques enable its extraction from the literature. Their combination makes prior knowledge available for computational analysis and inference. While some tools provide complete solutions that limit control over the modeling and extraction processes, we seek a methodology that gives the experimenter control over these critical processes. Results We describe progress towards automated support for the generation of biomolecular hypotheses. Semantic Web technologies are used to structure and store knowledge, while a workflow extracts knowledge from text. We designed minimal proto-ontologies in OWL for capturing different aspects of a text mining experiment: the biological hypothesis, text and documents, text mining, and workflow provenance. The models fit a methodology that allows focus on the requirements of a single experiment while supporting reuse and posterior analysis of extracted knowledge from multiple experiments. Our workflow is composed of services from the 'Adaptive Information Disclosure Application' (AIDA) toolkit as well as a few others. The output is a semantic model with putative biological relations, each relation linked to its corresponding evidence. Conclusion We demonstrated a 'do-it-yourself' approach for structuring and extracting knowledge in the context of experimental research on biomolecular mechanisms. The methodology can be used to bootstrap the construction of semantically rich biological models using the results of knowledge extraction processes. Models specific to particular experiments can be constructed that, in turn, link with other semantic models, creating a web of knowledge that spans experiments. Mapping mechanisms can link to other knowledge resources such as OBO ontologies or SKOS vocabularies. AIDA Web Services can be used to design personalized knowledge extraction procedures. In our example experiment, we found three proteins (NF-Kappa B, p21, and Bax) potentially playing a role in the interplay between nutrients and epigenetic gene regulation. PMID:19796406

  10. Extractive Fermentation of Sugarcane Juice to Produce High Yield and Productivity of Bioethanol

    NASA Astrophysics Data System (ADS)

    Rofiqah, U.; Widjaja, T.; Altway, A.; Bramantyo, A.

    2017-04-01

    Ethanol production by batch fermentation requires a simple process and is widely used. However, batch fermentation produces ethanol with low yield and productivity, because the ethanol that accumulates in the fermenter poisons the microorganisms. The extractive fermentation technique is applied to solve this ethanol-inhibition problem and can produce ethanol with high yield and productivity. In this process, the raffinate still contains much sugar, because conversion in the fermentation step is incomplete. Thus, to enhance ethanol yield and productivity, a recycle system is applied by returning the raffinate from the extraction process to the fermentation process. This raffinate also contains ethanol, which would inhibit the performance of the microorganisms during fermentation. Therefore, this study aims to find the optimum solvent-to-broth ratio (S:B) and recycle-to-fresh-feed ratio (R:F) entering the fermenter to produce high yield and productivity. The research was carried out experimentally: sugarcane juice was fermented using a Zymomonas mobilis mutant, the fermentation broth was extracted using amyl alcohol, and the process was integrated with the recycle system at varying recycle ratios. The highest yield and productivity, 22.3901% and 103.115 g/L·h respectively, were obtained in a process using a recycle-to-fresh-feed ratio (R:F) of 50:50 and a solvent-to-broth ratio of 1.
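The two figures of merit quoted in this record, yield (%) and volumetric productivity (g/L·h), follow from simple ratios. The helper below illustrates those definitions with hypothetical numbers (not values from the paper); the function name and inputs are invented for illustration.

```python
def fermentation_metrics(ethanol_g, sugar_fed_g, ethanol_g_per_l, hours):
    """Illustrative definitions (hypothetical inputs):
    yield (%)   = ethanol produced / sugar fed * 100
    productivity = final ethanol concentration / fermentation time"""
    yield_pct = 100.0 * ethanol_g / sugar_fed_g
    productivity = ethanol_g_per_l / hours  # g/L.h
    return yield_pct, productivity

# Hypothetical example values, for illustration only
y, p = fermentation_metrics(ethanol_g=45.0, sugar_fed_g=100.0,
                            ethanol_g_per_l=50.0, hours=10.0)
print(y, p)  # 45.0 5.0
```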

  11. A simple rapid process for semi-automated brain extraction from magnetic resonance images of the whole mouse head.

    PubMed

    Delora, Adam; Gonzales, Aaron; Medina, Christopher S; Mitchell, Adam; Mohed, Abdul Faheem; Jacobs, Russell E; Bearer, Elaine L

    2016-01-15

    Magnetic resonance imaging (MRI) is a well-developed technique in neuroscience. Limitations in applying MRI to rodent models of neuropsychiatric disorders include the large number of animals required to achieve statistical significance and the paucity of automation tools for a critical early processing step, brain extraction, which prepares brain images for alignment and voxel-wise statistics. This novel, time-saving automation of template-based brain extraction ("skull-stripping") quickly and reliably extracts the brain from large numbers of whole-head images in a single step. The method is simple to install, requires minimal user interaction, and is equally applicable to different types of MR images. Results were evaluated with Dice and Jaccard similarity indices and compared in 3D surface projections with other stripping approaches. Statistical comparisons demonstrate that individual variation in brain volumes is preserved. A downloadable software package not otherwise available for extracting brains from whole-head images is included here. This software tool increases speed, can be used with an atlas or with a template from within the dataset, and produces masks that need little further refinement. Our new automation can be applied to any MR dataset, since the starting point is a template mask generated specifically for that dataset. The method reliably and rapidly extracts brain images from whole-head images, rendering them usable for subsequent analytical processing. This software tool will accelerate the exploitation of mouse models for the investigation of human brain disorders by MRI. Copyright © 2015 Elsevier B.V. All rights reserved.
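The Dice and Jaccard indices used to evaluate the extracted masks are standard overlap measures on binary volumes. A minimal NumPy sketch of both (not the authors' code) on toy 2D masks:

```python
import numpy as np

def dice(a, b):
    """Dice similarity of two binary masks: 2|A∩B| / (|A| + |B|)."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def jaccard(a, b):
    """Jaccard index of two binary masks: |A∩B| / |A∪B|."""
    a, b = a.astype(bool), b.astype(bool)
    return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()

# Two toy "brain masks" that overlap in 2 of 4 labeled voxels
m1 = np.array([[1, 1, 0], [1, 0, 0]])
m2 = np.array([[1, 0, 0], [1, 1, 0]])
print(dice(m1, m2), jaccard(m1, m2))  # 0.6666666666666666 0.5
```

Both indices range from 0 (no overlap) to 1 (identical masks); Dice weights the intersection more heavily, which is why both are often reported together.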

  12. Extracting biomedical events from pairs of text entities

    PubMed Central

    2015-01-01

    Background Huge numbers of electronic biomedical documents, such as molecular biology reports or genomic papers, are generated daily. These documents are mainly available as unstructured free text, which requires heavy processing before registration into organized databases. This organization is instrumental for information retrieval, enabling answers to the advanced queries of researchers and practitioners in biology, medicine, and related fields. Hence, the massive data flow calls for efficient automatic text-mining methods that extract high-level information, such as biomedical events, from biomedical text. The usual computational tools of Natural Language Processing cannot be readily applied to extract these biomedical events, owing to the peculiarities of the domain: biomedical documents contain highly domain-specific jargon and syntax and describe distinctive dependencies, making text mining in molecular biology a discipline of its own. Results We address biomedical event extraction as the classification of pairs of text entities into classes corresponding to event types. The candidate pairs of text entities are provided recursively to a multiclass classifier relying on Support Vector Machines; this recursive process extracts events that involve other events as arguments. Compared with joint models based on Markov Random Fields, our model simplifies inference and hence requires shorter training and prediction times along with lower memory capacity. Compared with usual pipeline approaches, our model bypasses a complex intermediate problem while making more extensive use of sophisticated joint features between text entities. Our method focuses on the core event extraction of the Genia task of the BioNLP challenges, yielding the best result reported so far on the 2013 edition. PMID:26201478
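The core formulation, classifying each candidate entity pair into an event type (or "no event"), can be sketched with scikit-learn. The features, labels, and class names below are invented toy data, not the paper's feature set; a real system would use rich lexical and syntactic features of each pair.

```python
# Sketch: multiclass SVM over candidate entity pairs (toy data).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))  # one (toy) feature vector per entity pair
# Hypothetical event-type labels, including the "no event" class NONE
y = np.repeat(["NONE", "Binding", "Regulation"], 20)

clf = SVC(kernel="linear").fit(X, y)  # one-vs-one multiclass under the hood
pred = clf.predict(X[:3])             # event type for three candidate pairs
print(len(pred))  # 3
```

The recursion described in the abstract would then re-run prediction with already-detected events offered as candidate arguments, which this sketch omits.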

  13. Supercritical CO2 Extraction of Rice Bran Oil - the Technology, Manufacture, and Applications.

    PubMed

    Sookwong, Phumon; Mahatheeranont, Sugunya

    2017-06-01

    Rice bran is a good source of nutrients, containing large amounts of phytochemicals and antioxidants. Conventional rice bran oil production involves many processes that may deteriorate and degrade these valuable substances. Supercritical CO2 extraction is a green alternative method for producing rice bran oil. This work reviews the production of rice bran oil by supercritical carbon dioxide (SC-CO2) extraction. In addition, the usefulness and advantages of SC-CO2-extracted rice bran oil for edible-oil and health purposes are described.

  14. Requirements management: A CSR's perspective

    NASA Technical Reports Server (NTRS)

    Thompson, Joanie

    1991-01-01

    The following subject areas are covered: customer service overview of network service request processing; Customer Service Representative (CSR) responsibility matrix; extract from a sample Memorandum of Understanding; Network Service Request Form and its instructions sample notification of receipt; and requirements management in the NASA Science Internet.

  15. A highly efficient bead extraction technique with low bead number for digital microfluidic immunoassay

    PubMed Central

    Tsai, Po-Yen; Lee, I-Chin; Hsu, Hsin-Yun; Huang, Hong-Yuan; Fan, Shih-Kang; Liu, Cheng-Hsien

    2016-01-01

    Here, we describe a technique for manipulating a low number of beads to achieve high washing efficiency with zero bead loss during the washing step of a digital microfluidic (DMF) immunoassay. Two magnetic-bead extraction methods have previously been reported on the DMF platform: (1) the single-side electrowetting method and (2) the double-side electrowetting method. The first approach provides high washing efficiency but requires a large number of beads; the second reduces the required number of beads but is inefficient when multiple washes are required. More importantly, bead loss during washing was unavoidable in both methods. Here, an improved double-side electrowetting method for bead extraction is proposed that utilizes a series of unequal electrodes. We show that, with a proper electrode size ratio, only one wash step is required to achieve 98% washing efficiency without any bead loss when a droplet contains fewer than 100 beads. This allows a DMF immunoassay to use only about 25 magnetic beads, effectively increasing the number of captured analytes on each bead. In our model immunoassay for human soluble tumor necrosis factor receptor I (sTNF-RI), the experimental results show that, compared with our previous results obtained without the proposed bead extraction technique, the low-bead-number immunoassay significantly enhances the fluorescence signal, providing a better limit of detection (3.14 pg/ml) with smaller reagent volumes (200 nl) and shorter analysis time (<1 h). This improved bead extraction technique can be used not only in DMF immunoassays but also, potentially, in any other bead-based DMF system for different applications. PMID:26858807

  16. Information Fusion for Feature Extraction and the Development of Geospatial Information

    DTIC Science & Technology

    2004-07-01

    of automated processing. 2. Requirements for Geospatial Information: Accurate, timely geospatial information is critical for many military... this evaluation illustrates some of the difficulties in comparing manual and automated processing results (figure 5). The automated delineation of

  17. Downstream processing of stevioside and its potential applications.

    PubMed

    Puri, Munish; Sharma, Deepika; Tiwari, Ashok K

    2011-01-01

    Stevioside is a natural sweetener extracted from the leaves of Stevia rebaudiana Bertoni, which is commercially produced by conventional (chemical/physical) processes. This article gives an overview of the stevioside structure, the various analysis techniques, the new technologies required, and the advances achieved in recent years. An enzymatic process has been established by which the maximum efficacy and benefit of the process can be achieved; its efficiency is quite comparable to that of the other physical and chemical methods. Finally, we believe that in the future, enzyme-based extraction will ensure more cost-effective availability of stevioside, thus assisting in the development of more food-based applications. Copyright © 2011 Elsevier Inc. All rights reserved.

  18. Extraction, scrub, and strip test results for the solvent transfer to salt waste processing facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peters, T.

    The Savannah River National Laboratory (SRNL) prepared approximately 240 gallons of Caustic-Side Solvent Extraction (CSSX) solvent for use at the Salt Waste Processing Facility (SWPF). An Extraction, Scrub, and Strip (ESS) test was performed on a sample of the prepared solvent using a salt solution prepared by Parsons to determine cesium distribution ratios (D(Cs)), and cesium concentration in the strip effluent (SE) and decontaminated salt solution (DSS) streams. This data will be used by Parsons to help qualify the solvent for use at the SWPF. The ESS test showed acceptable performance of the solvent for extraction, scrub, and strip operations. The extraction D(Cs) measured 15.5, exceeding the required value of 8. This value is consistent with results from previous ESS tests using similar solvent formulations. Similarly, scrub and strip cesium distribution ratios fell within acceptable ranges.

  19. Determination of Extraction Process Conditions of Gambier Catechin (Uncaria Gambier Roxb) from Solok Bio Bio Lima Puluh Kota District – West Sumatera

    NASA Astrophysics Data System (ADS)

    Desni Rahman, Elly; Sari, Ellyta; Burmawi; Frizka; Endah

    2018-03-01

    Catechin content is the key determinant of quality in the gambier trade. The required catechin content of gambier extracts used as a herbal medicinal ingredient is greater than 90%. Local gambier produced by the community is mostly non-uniform and of low quality, which lowers its price in export markets. The quality of gambier can be improved by extraction and purification processes. This study aims to determine the best extraction process for catechin from gambier (Uncaria gambier Roxb) derived from Solok Bio Bio, Lima Puluh Kota, West Sumatra. The research methodology comprises pre-purification (raw-material preparation), washing, filtration, extraction, drying, and testing. Washing was performed on 100 g of gambier with water volumes of 500, 600, 700, and 800 ml; the mixture was heated for one hour at a temperature of 70°C, screened, filtered, allowed to stand until a precipitate formed, washed repeatedly, filtered, and dried. The material was then extracted with one of two solvents (water or ethyl acetate), heated at 70°C for 1 hour, and filtered. The filtrate was concentrated in a rotary evaporator, dried at 50°C for 48 hours, and analyzed. The results showed that the best extraction conditions used ethyl acetate as the solvent at a temperature of 70°C, yielding a grade of 97.40% catechins.

  20. Influence of process parameters on the extraction of soluble substances from OFMSW and methane production.

    PubMed

    Campuzano, Rosalinda; González-Martínez, Simón

    2017-04-01

    Microorganisms involved in anaerobic digestion require dissolved substrates, which are transported through the cell wall to different processing units and finally disposed of as waste products such as methane and carbon dioxide. To increase methane production, this work proposes separating the soluble substances from OFMSW and analysing methane production from both the extracts and the OFMSW itself. Using water as the solvent, four extraction parameters were examined: (1) the number of consecutive extractions, (2) the duration of mixing for each consecutive extraction, (3) OFMSW-to-water mass ratios of 1:1, 1:2, and 1:3, and (4) the influence of temperature on the extraction process. The results indicated that it is possible to separate 40% of the VS from OFMSW with only three consecutive extractions, each with 30 min of mixing, using ambient-temperature water. For every OFMSW-to-water combination, the first three consecutive extracts were subjected to the biochemical methane potential test for 21 days at 35°C; OFMSW was also tested as a reference. Methane production from all substrates is highest during the first day, then slowly decreases before increasing again in a second stage; this was identified as diauxic behaviour. Specific methane production at day 21 increased with increasing water content of the extracts, with OFMSW itself the lowest of all at 535 NL/kg VS. These results indicate that it is feasible to rapidly produce methane from the extracted substances. Copyright © 2017 Elsevier Ltd. All rights reserved.
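The effect of consecutive extractions can be pictured with a toy mass balance: if each water-extraction stage captures a fixed fraction of the soluble VS still remaining, the cumulative recovery saturates over stages. The model and numbers below are purely illustrative assumptions, not the paper's data.

```python
def cumulative_extraction(soluble_fraction, stage_efficiency, stages):
    """Toy mass balance: fraction of total VS recovered after `stages`
    consecutive water extractions, assuming each stage captures a fixed
    fraction of the *remaining* soluble VS. (Illustrative only.)"""
    remaining = soluble_fraction
    recovered = 0.0
    for _ in range(stages):
        pulled = remaining * stage_efficiency
        recovered += pulled
        remaining -= pulled
    return recovered

# e.g. 50% of VS soluble, 60% of that captured per stage, 3 stages
print(round(cumulative_extraction(0.5, 0.6, 3), 3))  # 0.468
```

The diminishing return per stage is why a small number of consecutive extractions (three, in the record above) recovers most of what water extraction can reach.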

  1. Integrated microwave processing system for the extraction of organophosphorus pesticides in fresh vegetables.

    PubMed

    Wu, Lijie; Song, Ying; Hu, Mingzhu; Xu, Xu; Zhang, Hanqi; Yu, Aimin; Ma, Qiang; Wang, Ziming

    2015-03-01

    A simple and efficient integrated microwave processing system (IMPS) was assembled and validated for the first time for the extraction of organophosphorus pesticides from fresh vegetables. Two processes under microwave irradiation, dynamic microwave-assisted extraction (DMAE) and microwave-accelerated solvent elution (MASE), were integrated to simplify sample pretreatment: extraction, separation, enrichment, and elution are accomplished in a single step. The organophosphorus pesticides were extracted from the fresh vegetables into hexane by DMAE, and the extract was introduced directly into an enrichment column packed with activated carbon fiber (ACF). Subsequently, the pesticides trapped on the ACF were eluted with ethyl acetate under microwave irradiation. No further filtration or cleanup was required before analysis of the eluate by gas chromatography-mass spectrometry. Experimental parameters affecting extraction efficiency were investigated and optimized, including microwave output power, kind and volume of extraction solvent, extraction time, amount of sorbent, elution microwave power, kind and volume of elution solvent, and elution solvent flow rate. Under the optimized conditions, recoveries were in the range of 71.5-105.2%, and relative standard deviations were lower than 11.6%. The results show that the present method is a simple and effective sample preparation method for the determination of pesticides in solid samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Local and global aspects of biological motion perception in children born at very low birth weight

    PubMed Central

    Williamson, K. E.; Jakobson, L. S.; Saunders, D. R.; Troje, N. F.

    2015-01-01

    Biological motion perception can be assessed using a variety of tasks. In the present study, 8- to 11-year-old children born prematurely at very low birth weight (<1500 g) and matched full-term controls completed tasks that required the extraction of local motion cues, the ability to perceptually group these cues to extract information about body structure, and the ability to carry out the higher-order processes required for action recognition and person identification. Preterm children exhibited difficulties in all four aspects of biological motion perception. However, intercorrelations between test scores were weak in both full-term and preterm children, a finding that supports the view that these processes are relatively independent. Preterm children also displayed more autistic-like traits than full-term peers. In preterm (but not full-term) children, these traits were negatively correlated with performance in the task requiring structure-from-motion processing (r(30) = -.36, p < .05) but positively correlated with the ability to extract identity (r(30) = .45, p < .05). These findings extend previous reports of vulnerability in the systems involved in processing dynamic cues in preterm children and suggest that a core deficit in social perception/cognition may contribute to the development of social and behavioral difficulties even in members of this population who are functioning within the normal range intellectually. The results could inform the development of screening, diagnostic, and intervention tools. PMID:25103588

  3. Extraction, Scrub, and Strip Test Results for the Salt Waste Processing Facility Caustic Side Solvent Extraction Solvent Sample

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peters, T. B.

    An Extraction, Scrub, and Strip (ESS) test was performed on a sample of Salt Waste Processing Facility (SWPF) Caustic-Side Solvent Extraction (CSSX) solvent and salt simulant to determine cesium distribution ratios (D(Cs)), and cesium concentration in the strip effluent (SE) and decontaminated salt solution (DSS) streams; this data will be used by Parsons to help determine if the solvent is qualified for use at the SWPF. The ESS test showed acceptable performance of the solvent for extraction, scrub, and strip operations. The extraction D(Cs) measured 12.5, exceeding the required value of 8. This value is consistent with results from previous ESS tests using similar solvent formulations. Similarly, scrub and strip cesium distribution ratios fell within acceptable ranges. This revision was created to correct an error: the previous revision used an incorrect set of temperature correction coefficients, which resulted in slight deviations from the correct D(Cs) results.

  4. ARRAY OPTIMIZATION FOR TIDAL ENERGY EXTRACTION IN A TIDAL CHANNEL – A NUMERICAL MODELING ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Zhaoqing; Wang, Taiping; Copping, Andrea

    This paper presents an application of a hydrodynamic model to simulate tidal energy extraction in a tidally dominated estuary on the Pacific Northwest coast. A series of numerical experiments was carried out to simulate tidal energy extraction with different turbine array configurations, including location, spacing, and array size. Preliminary model results suggest that array optimization for tidal energy extraction at a real-world site is a very complex process that requires consideration of multiple factors. Numerical models can be used effectively to assist turbine siting and array arrangement in a tidal turbine farm for tidal energy extraction.

  5. Refining of metallurgical-grade silicon

    NASA Technical Reports Server (NTRS)

    Dietl, J.

    1986-01-01

    A basic requirement of large-scale solar cell fabrication is to provide low-cost base material. Unconventional refining of metallurgical-grade silicon represents one of the most promising routes for silicon meltstock processing. The refining concept is based on an optimized combination of metallurgical treatments. Commercially available crude silicon first requires a pyrometallurgical step, either slagging or, alternatively, solvent extraction with aluminum. After grinding and leaching, high purity is attained as an advanced stage of refinement. To reach solar-grade quality, a final pyrometallurgical step is needed: liquid-gas extraction.

  6. Robust watermark technique using masking and Hermite transform.

    PubMed

    Coronel, Sandra L Gomez; Ramírez, Boris Escalante; Mosqueda, Marco A Acevedo

    2016-01-01

    This paper evaluates a watermarking algorithm for digital images that uses a perceptual mask and a normalization process, preventing detection by the human eye while ensuring robustness against common processing and geometric attacks. The Hermite transform is employed because it allows perfect reconstruction of the image while incorporating properties of the human visual system; moreover, it is based on derivatives of Gaussian functions. The embedded watermark carries information identifying the image's owner. The extraction process is blind, because it does not require the original image. The following techniques were utilized in evaluating the algorithm: the peak signal-to-noise ratio, the average structural similarity index, the normalized cross-correlation, and the bit error rate. Several watermark-extraction tests were performed against geometric and common processing attacks, which allowed us to identify how many bits of the watermark can be modified while still permitting adequate extraction.
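Two of the evaluation measures named above have simple closed forms: PSNR compares an attacked image against the original in dB, and the bit error rate counts flipped watermark bits. A minimal NumPy sketch (not the authors' code; the toy images and bit strings are invented):

```python
import numpy as np

def psnr(original, distorted, peak=255.0):
    """Peak signal-to-noise ratio in dB between two images."""
    mse = np.mean((original.astype(float) - distorted.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak**2 / mse)

def bit_error_rate(sent_bits, recovered_bits):
    """Fraction of watermark bits flipped during extraction."""
    sent = np.asarray(sent_bits)
    rec = np.asarray(recovered_bits)
    return float(np.mean(sent != rec))

# Toy example: one pixel of an 8x8 gray image perturbed by 10 levels
img = np.full((8, 8), 128, dtype=np.uint8)
noisy = img.copy()
noisy[0, 0] += 10
print(round(psnr(img, noisy), 2))           # high PSNR: tiny distortion
print(bit_error_rate([1, 0, 1, 1], [1, 1, 1, 1]))  # 0.25
```

SSIM and normalized cross-correlation follow the same pattern but involve local windowed statistics, so in practice they are usually taken from an image-processing library rather than reimplemented.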

  7. Integrated experimental and technoeconomic evaluation of two-stage Cu-catalyzed alkaline–oxidative pretreatment of hybrid poplar

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhalla, Aditya; Fasahati, Peyman; Particka, Chrislyn A.

    2018-05-17

    When applied to recalcitrant lignocellulosic feedstocks, multi-stage pretreatments can provide more processing flexibility to optimize or balance process outcomes such as increasing delignification, preserving hemicellulose, and maximizing enzymatic hydrolysis yields. We previously reported that adding an alkaline pre-extraction step to a copper-catalyzed alkaline hydrogen peroxide (Cu-AHP) pretreatment process resulted in improved sugar yields, but the process still utilized relatively high chemical inputs (catalyst and H2O2) and enzyme loadings. We hypothesized that by increasing the temperature of the alkaline pre-extraction step in water or ethanol, we could reduce the inputs required during Cu-AHP pretreatment and enzymatic hydrolysis without significant loss in sugar yield. We also performed technoeconomic analysis to determine if ethanol or water was the more cost-effective solvent during alkaline pre-extraction and if the expense associated with increasing the temperature was economically justified.

  8. Extraction of valuable compounds from mangosteen pericarps by hydrothermal assisted sonication

    NASA Astrophysics Data System (ADS)

    Machmudah, Siti; Lestari, Sarah Duta; Shiddiqi, Qifni Yasa'Ash; Widiyastuti, Winardi, Sugeng; Wahyudiono, Kanda, Hideki; Goto, Motonobu

    2015-12-01

    Valuable compounds such as xanthone and phenolic compounds were extracted from mangosteen pericarps by hydrothermal treatment at temperatures of 120-160 °C and a pressure of 5 MPa using batch and semi-batch extractors. This is a simple and environmentally friendly extraction method requiring no chemicals other than water. Under these conditions, phenolic compounds can form from mangosteen pericarps through the decomposition of bonds between lignin, cellulose, and hemicellulose via autohydrolysis. To increase the amount of valuable compounds extracted, a sonication pre-treatment was performed prior to the hydrothermal extraction. A 30 min sonication pre-treatment significantly increased the amounts of xanthone and phenolic compounds extracted from mangosteen pericarps. In the batch system, xanthone recovery approached 100% at 160 °C with a 30 min sonication pre-treatment and 150 min of extraction time. In the semi-batch process, the total phenolic compounds in the extract reached 217 mg/g sample at 160 °C with a 30 min sonication pre-treatment and 150 min of total extraction time. The results reveal that sonication-assisted hydrothermal extraction is an applicable method for isolating polyphenolic compounds from other types of biomass and may lead to an advanced technology for extracting plant biomass components.

  9. Thinking Graphically: Connecting Vision and Cognition during Graph Comprehension

    ERIC Educational Resources Information Center

    Ratwani, Raj M.; Trafton, J. Gregory; Boehm-Davis, Deborah A.

    2008-01-01

    Task analytic theories of graph comprehension account for the perceptual and conceptual processes required to extract specific information from graphs. Comparatively, the processes underlying information integration have received less attention. We propose a new framework for information integration that highlights visual integration and cognitive…

  10. The Agent of extracting Internet Information with Lead Order

    NASA Astrophysics Data System (ADS)

    Mo, Zan; Huang, Chuliang; Liu, Aijun

    To carry out e-commerce better, advanced technologies for accessing business information are urgently needed. An agent is described that addresses the problems of extracting internet information caused by the non-standard and inconsistent structure of Chinese websites. The agent comprises three modules, each handling a separate stage of the extraction process. An HTTP-tree method and a Lead algorithm are proposed to generate a lead order, with which the required web pages can be retrieved easily. How to transform the extracted natural-language information into structured form is also discussed.

  11. Direct LiT Electrolysis in a Metallic Fusion Blanket

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olson, Luke

    2016-09-30

    A process that simplifies the extraction of tritium from molten lithium-based breeding blankets was developed. The process is based on the direct electrolysis of lithium tritide using a ceramic Li ion conductor that replaces the molten salt extraction step. Extraction of tritium in the form of lithium tritide in the blankets/targets of fusion/fission reactors is critical in order to maintain low concentrations. This is needed to decrease the potential tritium permeation to the surroundings and large releases from unforeseen accident scenarios. Extraction is complicated due to required low tritium concentration limits and because of the high affinity of tritium for the blanket. This work identified, developed and tested the use of ceramic lithium ion conductors capable of recovering hydrogen and deuterium through an electrolysis step at high temperatures.

  12. Direct Lit Electrolysis In A Metallic Lithium Fusion Blanket

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colon-Mercado, H.; Babineau, D.; Elvington, M.

    2015-10-13

    A process that simplifies the extraction of tritium from molten lithium-based breeding blankets was developed. The process is based on the direct electrolysis of lithium tritide using a ceramic Li ion conductor that replaces the molten salt extraction step. Extraction of tritium in the form of lithium tritide in the blankets/targets of fission/fusion reactors is critical in order to maintain low concentrations. This is needed to decrease the potential tritium permeation to the surroundings and large releases from unforeseen accident scenarios. Because of the high affinity of tritium for the blanket, extraction at the required low levels is complicated. This work identified, developed, and tested the use of ceramic lithium ion conductors capable of recovering hydrogen and deuterium through an electrolysis step at high temperatures.

  13. Automatic differential analysis of NMR experiments in complex samples.

    PubMed

    Margueritte, Laure; Markov, Petar; Chiron, Lionel; Starck, Jean-Philippe; Vonthron-Sénécheau, Catherine; Bourjot, Mélanie; Delsuc, Marc-André

    2018-06-01

    Liquid-state nuclear magnetic resonance (NMR) is a powerful tool for the analysis of complex mixtures of unknown molecules. This capacity has been used in many analytical approaches, including metabolomics, identification of active compounds in natural extracts, and characterization of species; such studies require the acquisition of many diverse NMR measurements on series of samples. Although acquisition can easily be performed automatically, the number of NMR experiments involved in these studies increases very rapidly, and this data avalanche requires automatic processing and analysis. We present here a program that allows the autonomous, unsupervised processing of a large corpus of 1D, 2D, and diffusion-ordered spectroscopy experiments from a series of samples acquired under different conditions. The program provides all the signal processing steps, as well as peak-picking and bucketing of 1D and 2D spectra; the program and its components are fully available. In an experiment mimicking the search for a bioactive species in a natural extract, we use it for the automatic detection of small amounts of artemisinin added to a series of plant extracts and for the generation of the spectral fingerprint of this molecule. This program, called Plasmodesma, is a novel tool that should be useful for deciphering complex mixtures, particularly in the discovery of biologically active natural products from plant extracts, but also in drug discovery or metabolomics studies. Copyright © 2017 John Wiley & Sons, Ltd.

  14. Recent developments of downstream processing for microbial lipids and conversion to biodiesel.

    PubMed

    Yellapu, Sravan Kumar; Bharti; Kaur, Rajwinder; Kumar, Lalit R; Tiwari, Bhagyashree; Zhang, Xiaolei; Tyagi, Rajeshwar D

    2018-05-01

    With an increasing global population and depleting resources, there is an apparent demand for radical, unprecedented innovation to satisfy the basic needs of life. Hence, non-conventional renewable energy resources like biodiesel have been pursued over the past few decades. Biofuel (e.g., biodiesel) serves as the most sustainable answer to the "food vs. fuel" crisis. In the biorefinery process, lipid extraction from oleaginous microorganisms is an integral step, as it facilitates the release of fatty acids. Direct lipid extraction from wet cell biomass is favorable in comparison to dry cell biomass because it eliminates the expensive dehydration step. However, this process is not commercialized yet; instead, it requires intensive research and development in order to establish robust approaches for lipid extraction that can be practically applied on an industrial scale. This review aims for a critical presentation of cell disruption, lipid recovery, and purification to support extraction from wet cell biomass for efficient transesterification. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. On the hydrophilicity of electrodes for capacitive energy extraction

    NASA Astrophysics Data System (ADS)

    Lian, Cheng; Kong, Xian; Liu, Honglai; Wu, Jianzhong

    2016-11-01

    The so-called Capmix technique for energy extraction is based on the cyclic expansion of electrical double layers to harvest dissipative energy arising from the salinity difference between freshwater and seawater. Its optimal performance requires a careful selection of the electrical potentials for the charging and discharging processes, which must be matched with the pore characteristics of the electrode materials. While a number of recent studies have examined the effects of the electrode pore size and geometry on the capacitive energy extraction processes, there is little knowledge on how the surface properties of the electrodes affect the thermodynamic efficiency. In this work, we investigate the Capmix processes using the classical density functional theory for a realistic model of electrolyte solutions. The theoretical predictions allow us to identify optimal operation parameters for capacitive energy extraction with porous electrodes of different surface hydrophobicity. In agreement with recent experiments, we find that the thermodynamic efficiency can be much improved by using the most hydrophilic electrodes.
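    The energy harvested per Capmix cycle is the area enclosed by the charge-voltage loop, W = ∮ V dQ. A minimal sketch of that bookkeeping (the four cycle states and their numerical values are illustrative assumptions, not results from the paper):

```python
# Energy extracted per Capmix cycle as the area of the charge-voltage
# loop, W = closed-loop integral of V dQ, evaluated here with the
# shoelace formula. The (charge, voltage) states are illustrative only.

def cycle_energy(states):
    """Enclosed area of a closed (Q, V) cycle via the shoelace formula."""
    n = len(states)
    area = 0.0
    for i in range(n):
        q1, v1 = states[i]
        q2, v2 = states[(i + 1) % n]
        area += q1 * v2 - q2 * v1
    return abs(area) / 2.0

# Idealized rectangular cycle: charge at low cell voltage in seawater,
# voltage rises at constant charge on switching to freshwater,
# discharge at high voltage, then return to the starting state.
states = [(0.0, 0.3), (1.0, 0.3), (1.0, 0.5), (0.0, 0.5)]
energy = cycle_energy(states)  # joules per cycle for this toy loop
```

    In this idealized rectangle the harvested energy is simply the charge swing times the voltage rise induced by the salinity change; real cycles are distorted by the double-layer physics the paper models.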

  16. On the hydrophilicity of electrodes for capacitive energy extraction

    DOE PAGES

    Lian, Cheng; East China Univ. of Science and Technology, Shanghai; Kong, Xian; ...

    2016-09-14

    The so-called Capmix technique for energy extraction is based on the cyclic expansion of electrical double layers to harvest dissipative energy arising from the salinity difference between freshwater and seawater. Its optimal performance requires a careful selection of the electrical potentials for the charging and discharging processes, which must be matched with the pore characteristics of the electrode materials. While a number of recent studies have examined the effects of the electrode pore size and geometry on the capacitive energy extraction processes, there is little knowledge on how the surface properties of the electrodes affect the thermodynamic efficiency. In this paper, we investigate the Capmix processes using the classical density functional theory for a realistic model of electrolyte solutions. The theoretical predictions allow us to identify optimal operation parameters for capacitive energy extraction with porous electrodes of different surface hydrophobicity. Finally, in agreement with recent experiments, we find that the thermodynamic efficiency can be much improved by using the most hydrophilic electrodes.

  17. Automation for System Safety Analysis

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul

    2009-01-01

    This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  18. Atypical Balance between Occipital and Fronto-Parietal Activation for Visual Shape Extraction in Dyslexia

    PubMed Central

    Zhang, Ying; Whitfield-Gabrieli, Susan; Christodoulou, Joanna A.; Gabrieli, John D. E.

    2013-01-01

    Reading requires the extraction of letter shapes from a complex background of text, and an impairment in visual shape extraction would cause difficulty in reading. To investigate the neural mechanisms of visual shape extraction in dyslexia, we used functional magnetic resonance imaging (fMRI) to examine brain activation while adults with or without dyslexia responded to the change of an arrow’s direction in a complex, relative to a simple, visual background. In comparison to adults with typical reading ability, adults with dyslexia exhibited opposite patterns of atypical activation: decreased activation in occipital visual areas associated with visual perception, and increased activation in frontal and parietal regions associated with visual attention. These findings indicate that dyslexia involves atypical brain organization for fundamental processes of visual shape extraction even when reading is not involved. Overengagement in higher-order association cortices, required to compensate for underengagement in lower-order visual cortices, may result in competition for top-down attentional resources helpful for fluent reading. PMID:23825653

  19. Methods of downstream processing for the production of biodiesel from microalgae.

    PubMed

    Kim, Jungmin; Yoo, Gursong; Lee, Hansol; Lim, Juntaek; Kim, Kyochan; Kim, Chul Woong; Park, Min S; Yang, Ji-Won

    2013-11-01

    Despite receiving increasing attention during the last few decades, the production of microalgal biofuels is not yet sufficiently cost-effective to compete with that of petroleum-based conventional fuels. Among the steps required for the production of microalgal biofuels, the harvest of the microalgal biomass and the extraction of lipids from microalgae are two of the most expensive. In this review article, we surveyed a substantial amount of previous work in microalgal harvesting and lipid extraction to highlight recent progress in these areas. We also discuss new developments in the biodiesel conversion technology due to the importance of the connectivity of this step with the lipid extraction process. Furthermore, we propose possible future directions for technological or process improvements that will directly affect the final production costs of microalgal biomass-based biofuels. Copyright © 2013 Elsevier Inc. All rights reserved.

  20. A Sustainable and Selective Roasting and Water-Leaching Process to Simultaneously Extract Valuable Metals from Low-Grade Ni-Cu Matte

    NASA Astrophysics Data System (ADS)

    Cui, Fuhui; Mu, Wenning; Wang, Shuai; Xin, Haixia; Xu, Qian; Zhai, Yuchun

    2018-03-01

    Due to stringent environmental requirements and the complex occurrence of valuable metals, traditional pyrometallurgical methods are unsuitable for treating low-grade nickel-copper matte. A clean and sustainable two-stage sulfating roasting and water-leaching process was used to simultaneously extract valuable metals from low-grade nickel-copper matte. Ammonium and sodium sulfate were used as sulfating agents. The first roasting temperature, mass ratio of ammonium sulfate to matte, roasting time, dosage of sodium sulfate, second roasting temperature and leaching temperature were studied. Under optimal conditions, 98.89% of Ni, 97.48% of Cu and 95.82% of Co, but only 1.34% of Fe, were extracted. X-ray diffraction (XRD) and scanning electron microscopy (SEM) were used to reveal the sulfating mechanism during the roasting process.

  1. Information Extraction for System-Software Safety Analysis: Calendar Year 2008 Year-End Report

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2009-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  2. Digital image processing for information extraction.

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.

    1973-01-01

    The modern digital computer has made practical image processing techniques for handling nonlinear operations in both the geometrical and the intensity domains, various types of nonuniform noise cleanup, and the numerical analysis of pictures. An initial requirement is that a number of anomalies caused by the camera (e.g., geometric distortion, MTF roll-off, vignetting, and nonuniform intensity response) must be taken into account or removed to avoid their interference with the information extraction process. Examples illustrating these operations are discussed along with computer techniques used to emphasize details, perform analyses, classify materials by multivariate analysis, detect temporal differences, and aid in human interpretation of photos.

  3. Techno-economic assessment of hybrid extraction and distillation processes for furfural production from lignocellulosic biomass.

    PubMed

    Nhien, Le Cao; Long, Nguyen Van Duc; Kim, Sangyong; Lee, Moonyong

    2017-01-01

    Lignocellulosic biomass is one of the most promising alternatives for replacing mineral resources to overcome global warming, which has become the most important environmental issue in recent years. Furfural was listed by the National Renewable Energy Laboratory as one of the top 30 potential chemicals arising from biomass. However, the current production of furfural is energy intensive and uses inefficient technology. Thus, a hybrid purification process that combines extraction and distillation to produce furfural from lignocellulosic biomass was considered and investigated in detail to improve the process efficiency. This effective hybrid process depends on the extracting solvent, which was selected based on a comprehensive procedure that ranged from solvent screening to complete process design. Various solvents were first evaluated in terms of their extraction ability. Then, the most promising solvents were selected to study the separation feasibility. Eventually, processes that used the three best solvents (toluene, benzene, and butyl chloride) were designed and optimized in detail using Aspen Plus. Sustainability analysis was performed to evaluate these processes in terms of their energy requirements, total annual costs (TAC), and carbon dioxide (CO2) emissions. The results showed that butyl chloride was the most suitable solvent for the hybrid furfural process because it could save 44.7% of the TAC while reducing the CO2 emissions by 45.5% compared to the toluene process. In comparison with the traditional purification process using distillation, the suggested hybrid extraction/distillation process can save up to 19.2% of the TAC and reduce total annual CO2 emissions by 58.3%. Furthermore, a sensitivity analysis of the feed composition and its effect on the performance of the proposed hybrid system was conducted. Butyl chloride was found to be the most suitable solvent for the hybrid extraction/distillation process of furfural production.
The proposed hybrid sequence was more favorable than the traditional distillation process when the methanol fraction of the feed stream was <3% and more benefit could be obtained when that fraction decreased.

  4. Data Processing and Text Mining Technologies on Electronic Medical Records: A Review

    PubMed Central

    Sun, Wencheng; Li, Yangyang; Liu, Fang; Fang, Shengqun; Wang, Guoyan

    2018-01-01

    Currently, medical institutes generally use electronic medical records (EMR) to record patients' conditions, including diagnostic information, procedures performed, and treatment results. EMR has been recognized as a valuable resource for large-scale analysis. However, EMR has the characteristics of diversity, incompleteness, redundancy, and privacy, which make it difficult to carry out data mining and analysis directly. Therefore, it is necessary to preprocess the source data in order to improve data quality and, in turn, the data mining results. Different types of data require different processing technologies. Most structured data commonly needs classic preprocessing technologies, including data cleansing, data integration, data transformation, and data reduction. For semistructured or unstructured data, such as medical text, which contains richer health information, more complex and challenging processing methods are required. The task of information extraction for medical texts mainly includes NER (named-entity recognition) and RE (relation extraction). This paper focuses on the process of EMR processing and emphatically analyzes the key techniques. In addition, we make an in-depth study of the applications developed based on text mining, together with the open challenges and research issues for future work. PMID:29849998

  5. Ontology-Based Information Extraction for Business Intelligence

    NASA Astrophysics Data System (ADS)

    Saggion, Horacio; Funk, Adam; Maynard, Diana; Bontcheva, Kalina

    Business Intelligence (BI) requires the acquisition and aggregation of key pieces of knowledge from multiple sources in order to provide valuable information to customers or feed statistical BI models and tools. The massive amount of information available to business analysts makes information extraction and other natural language processing tools key enablers for the acquisition and use of that semantic information. We describe the application of ontology-based extraction and merging in the context of a practical e-business application for the EU MUSING Project where the goal is to gather international company intelligence and country/region information. The results of our experiments so far are very promising and we are now in the process of building a complete end-to-end solution.

  6. Accelerating sample preparation through enzyme-assisted microfiltration of Salmonella in chicken extract.

    PubMed

    Vibbert, Hunter B; Ku, Seockmo; Li, Xuan; Liu, Xingya; Ximenes, Eduardo; Kreke, Thomas; Ladisch, Michael R; Deering, Amanda J; Gehring, Andrew G

    2015-01-01

    Microfiltration of chicken extracts has the potential to significantly decrease the time required to detect Salmonella, as long as the extract can be efficiently filtered and the pathogenic microorganisms kept in a viable state during this process. We present conditions that enable microfiltration by adding endopeptidase from Bacillus amyloliquefaciens to chicken extracts or chicken rinse, prior to microfiltration with fluid flow on both retentate and permeate sides of 0.2 μm cutoff polysulfone and polyethersulfone hollow fiber membranes. After treatment with this protease, the distribution of micron, submicron, and nanometer particles in chicken extracts changes so that the size of the remaining particles corresponds to 0.4-1 μm. Together with alteration of dissolved proteins, this change helps to explain how membrane fouling might be minimized, because the potential foulants are significantly smaller or larger than the membrane pore size. At the same time, we found that the presence of protein protects Salmonella from protease action, thus maintaining cell viability. Concentration and recovery of 1-10 CFU Salmonella/mL from 400 mL chicken rinse is possible in less than 4 h, with the microfiltration step requiring less than 25 min at fluxes of 0.028-0.32 mL/cm(2) min. The entire procedure, from sample processing to detection by polymerase chain reaction, is completed in 8 h. © 2015 American Institute of Chemical Engineers.
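    The throughput figures above imply a membrane area through the simple relation A = V / (J · t). A quick sketch using the reported 400 mL volume and 25 min time limit (the single flux value chosen from the reported 0.028-0.32 mL/cm²·min range is an assumption):

```python
# Membrane area needed to filter a given volume at a given flux:
# A = V / (J * t). The 400 mL volume and <25 min target are from the
# abstract; the 0.1 mL/cm^2/min flux is one assumed point within the
# reported 0.028-0.32 mL/cm^2/min range.

def required_area_cm2(volume_ml, flux_ml_per_cm2_min, time_min):
    return volume_ml / (flux_ml_per_cm2_min * time_min)

area = required_area_cm2(400.0, 0.1, 25.0)  # membrane area in cm^2
```

    At the low end of the reported flux range the required area grows proportionally, which is why protease pretreatment to limit fouling matters for keeping the step under 25 minutes.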

  7. Energy Implications of Materials Processing

    ERIC Educational Resources Information Center

    Hayes, Earl T.

    1976-01-01

    Processing of materials could become energy-limited rather than resource-limited. Methods to extract metals, industrial minerals, and energy materials and convert them to useful states require more than one-fifth of the United States energy budget. Energy accounting by industries must include a total systems analysis of costs to ensure net energy…

  8. Memory for Context becomes Less Specific with Time

    ERIC Educational Resources Information Center

    Wiltgen, Brian J.; Silva, Alcino J.

    2007-01-01

    Context memories initially require the hippocampus, but over time become independent of this structure. This shift reflects a consolidation process whereby memories are gradually stored in distributed regions of the cortex. The function of this process is thought to be the extraction of statistical regularities and general knowledge from specific…

  9. 40 CFR 63.2840 - What emission requirements must I meet?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... National Emission Standards for Hazardous Air Pollutants: Solvent Extraction for Vegetable Oil Production... month. (e) Low-HAP solvent option. For all vegetable oil production processes subject to this subpart... paragraphs (e)(1) through (5) of this section. Your vegetable oil production process is not subject to the...

  10. 40 CFR 63.2840 - What emission requirements must I meet?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... National Emission Standards for Hazardous Air Pollutants: Solvent Extraction for Vegetable Oil Production... month. (e) Low-HAP solvent option. For all vegetable oil production processes subject to this subpart... paragraphs (e)(1) through (5) of this section. Your vegetable oil production process is not subject to the...

  11. High-power ultrasonic system for the enhancement of mass transfer in supercritical CO2 extraction processes

    NASA Astrophysics Data System (ADS)

    Riera, Enrique; Blanco, Alfonso; García, José; Benedito, José; Mulet, Antonio; Gallego-Juárez, Juan A.; Blasco, Miguel

    2010-01-01

    Oil is an important component of almonds and other vegetable substrates and can influence human health. In this work, the development and validation of an innovative, robust, stable, reliable, and efficient pilot-scale ultrasonic system to assist supercritical CO2 extraction of oils from different substrates is presented. In the extraction procedure, ultrasonic energy represents an efficient way of producing deep agitation that enhances mass transfer through several mechanisms (radiation pressure, streaming, agitation, high-amplitude vibrations, etc.). Previous work had demonstrated the feasibility of integrating an ultrasonic field inside a supercritical extractor without losing a significant volume fraction. This pioneering method accelerated mass transfer, thereby improving supercritical extraction times. To develop the new procedure commercially and fulfill industrial requirements, a new device configuration has been designed, implemented, tested, and successfully validated for supercritical fluid extraction of oil from different vegetable substrates.

  12. Synthesis of microspheres of triuranium octaoxide by simultaneous water and nitrate extraction from ascorbate-uranyl sols.

    PubMed

    Brykala, M; Deptula, A; Rogowski, M; Lada, W; Olczak, T; Wawszczak, D; Smolinski, T; Wojtowicz, P; Modolo, G

    A new method for the synthesis of uranium oxide microspheres (diameter <100 μm) has been developed. It is a variant of our patented Complex Sol-Gel Process, which has been used to synthesize high-quality powders of a wide variety of complex oxides. Starting uranyl-nitrate-ascorbate sols were prepared by the addition of ascorbic acid to uranyl nitrate hexahydrate solution, alkalized with aqueous ammonium hydroxide, and then emulsified in 2-ethylhexanol-1 containing 1 v/o SPAN-80. Drops of the emulsion were first gelled by extraction of water by the solvent. Destruction of the microspheres during thermal treatment, owing to highly reactive components in the gels, required modification of the gelation step by a Double Extraction Process, the simultaneous extraction of water and nitrates using Primene JMT, which completely eliminates this problem. The final step was calcination in air of the obtained gel microspheres to triuranium octaoxide.

  13. A novel microalgal lipid extraction method using biodiesel (fatty acid methyl esters) as an extractant.

    PubMed

    Huang, Wen-Can; Park, Chan Woo; Kim, Jong-Duk

    2017-02-01

    Although microalgae are considered promising renewable sources of biodiesel, the high cost of the downstream process is a significant obstacle in large-scale biodiesel production. In this study, a novel approach for microalgal biodiesel production was developed by using the biodiesel itself as an extractant. First, wet microalgae with 70% water content were incubated with a mixture of biodiesel/methanol, and penetration of the mixture through the cell membrane and swelling of the lipids contained in the microalgae were confirmed. Significant increases in lipid droplets were observed by confocal microscopy. Second, the swelled lipid droplets in the microalgae were squeezed out using mechanical stress across the cell membrane and washed with methanol. The lipid extraction efficiency reached 68%. This process does not require drying of the microalgae or solvent recovery, which is the most energy-intensive step in solvent-based biodiesel production. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. A Simplified Method for Sampling and Analysis of High Volume Surface Water for Organic Contaminants Using XAD-2

    USGS Publications Warehouse

    Datta, S.; Do, L.V.; Young, T.M.

    2004-01-01

    A simple compressed-gas driven system for field processing and extracting water for subsequent analyses of hydrophobic organic compounds is presented. The pumping device is a pneumatically driven pump and filtration system that can easily clarify at 4 L/min. The extraction device uses compressed gas to drive filtered water through two parallel XAD-2 resin columns, at about 200 mL/min. No batteries or inverters are required for water collection or processing. Solvent extractions were performed directly in the XAD-2 glass columns. Final extracts are cleaned up on Florisil cartridges without fractionation, and contaminants are analyzed by GC-MS. Method detection limits (MDLs) and recoveries for dissolved organic contaminants, polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs) and pesticides are reported along with results of surface water analysis for the San Francisco Bay, CA.
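    Method detection limits of the kind reported here are conventionally computed from replicate low-level spikes as MDL = t(n-1, 0.99) × s, where s is the standard deviation of the replicates (the standard single-laboratory procedure). A sketch with invented replicate concentrations:

```python
# Method detection limit from replicate spiked samples, following the
# conventional formula MDL = t(n-1, 0.99) * s. The seven replicate
# concentrations (in ng/L) below are invented for illustration.
import statistics

def mdl(replicates, t_value):
    """t_value is the one-tailed 99% Student's t for n-1 degrees of freedom."""
    return t_value * statistics.stdev(replicates)

reps = [2.1, 1.8, 2.4, 2.0, 1.9, 2.2, 2.3]  # n = 7 replicate spikes
T_6DF_99 = 3.143  # Student's t for 6 degrees of freedom, 99% one-tailed
limit = mdl(reps, T_6DF_99)  # detection limit in the same units as reps
```

    With seven replicates the multiplier is roughly three standard deviations, so tight replicate agreement translates directly into a lower reportable MDL.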

  15. Accurate Modeling Method for Cu Interconnect

    NASA Astrophysics Data System (ADS)

    Yamada, Kenta; Kitahara, Hiroshi; Asai, Yoshihiko; Sakamoto, Hideo; Okada, Norio; Yasuda, Makoto; Oda, Noriaki; Sakurai, Michio; Hiroi, Masayuki; Takewaki, Toshiyuki; Ohnishi, Sadayuki; Iguchi, Manabu; Minda, Hiroyasu; Suzuki, Mieko

    This paper proposes an accurate modeling method for the copper interconnect cross-section in which the width and thickness dependence on layout patterns and density caused by processes (CMP, etching, sputtering, lithography, and so on) is fully incorporated and universally expressed. In addition, we have developed specific test patterns for model parameter extraction, and an efficient extraction flow. We have extracted the model parameters for 0.15 μm CMOS using this method and confirmed that the 10% τpd error normally observed with conventional LPE (Layout Parameters Extraction) was completely eliminated. Moreover, it is verified that the model can be applied to more advanced technologies (90 nm, 65 nm and 55 nm CMOS). Since the interconnect delay variations due to the processes constitute a significant part of what have conventionally been treated as random variations, use of the proposed model could enable one to greatly narrow the guardbands required to guarantee a desired yield, thereby facilitating design closure.
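    The kind of geometry correction such a model captures can be sketched as follows. Note that the linear bias forms and every coefficient below are invented placeholders, not the paper's extracted parameters: effective width and thickness are biased as functions of local pattern density before computing line resistance.

```python
# Resistance estimate for a Cu line whose effective width and thickness
# depend on local pattern density, mimicking etch bias and CMP erosion.
# The linear bias models and all coefficients are illustrative
# assumptions, not the extracted parameters from the paper.

RHO_CU = 1.7e-8  # ohm*m, bulk copper resistivity

def line_resistance(length_m, drawn_width_m, nominal_thickness_m, density):
    """Line resistance after density-dependent geometry bias.

    density: local metal pattern density in [0, 1]. Width bias (etch)
    and thickness loss (CMP erosion) are simple linear functions of
    density here, purely for illustration.
    """
    width = drawn_width_m + 5e-9 * (density - 0.5)          # etch bias
    thickness = nominal_thickness_m * (1.0 - 0.1 * density)  # CMP erosion
    return RHO_CU * length_m / (width * thickness)

# A 100 um line at 0.15 um drawn width, 0.3 um nominal thickness,
# evaluated in a sparse and a dense neighborhood.
r_sparse = line_resistance(100e-6, 150e-9, 300e-9, 0.2)
r_dense = line_resistance(100e-6, 150e-9, 300e-9, 0.8)
# Under these assumed coefficients, CMP erosion dominates, so the
# dense-layout line ends up with higher resistance.
```

    A layout-aware extractor applies corrections of this shape per segment, which is how the pattern-dependent part of the delay variation stops being treated as random.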

  16. New procedure for extraction of algal lipids from wet biomass: a green clean and scalable process.

    PubMed

    Dejoye Tanzi, Celine; Abert Vian, Maryline; Chemat, Farid

    2013-04-01

    A new procedure, called the Simultaneous Distillation and Extraction Process (SDEP), for lipid extraction from wet microalgae (Nannochloropsis oculata and Dunaliella salina) is reported. This method does not require pre-drying of the biomass and employs alternative solvents such as d-limonene, α-pinene and p-cymene. The procedure has been compared with Soxhlet extraction (Sox) and the Bligh & Dyer method (B&D). For N. oculata, results showed that SDEP-cymene provided lipid yields similar to B&D (21.45% and 23.78%, respectively), while SDEP-limonene and SDEP-pinene provided lower yields (18.73% and 18.75%, respectively). For D. salina, SDEP-pinene provided the maximum lipid yield (3.29%) compared to the other solvents, which is quite close to the B&D result (4.03%). No significant differences in the distribution of lipid classes or fatty acid composition were observed among the different techniques. Evaluation of energy consumption indicates a substantial saving in extraction cost by SDEP compared to the conventional Soxhlet extraction technique. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Deep liquid-chromatographic purification of uranium extract from technetium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Volk, V.; Dvoeglazov, K; Podrezova, L.

    The recycling of uranium in the nuclear fuel cycle requires the removal of a number of radioactive and stable impurities, such as {sup 99}Tc, from spent fuels. In order to improve the purification of uranium extract from technetium, a liquid chromatography method and the apparatus for its performance have been developed. The process of technetium extraction and concentration in an aqueous solution containing a reducing agent has been studied on simulated solutions (U-Tc-HNO{sub 3}-30% TBP-isoparM). Dynamic tests of the method have been carried out on a laboratory unit. A solution of diformyl-hydrazine in nitric acid was used as the stationary phase. Silica gel with a specific surface of 186 m{sup 2}/g was used as the carrier of the stationary phase. It is shown that the volume of purified extract increases as the solution temperature increases, the concentration of reducing agent increases, and the extract flow rate decreases. It is established that the technetium content in uranium achieved by this method can be below 0.3 ppm. Several variants of the loading and composition of the stationary phase containing the extracted technetium have been proposed and tested. The method reduces the volume of medium-active wastes from the final refining process by more than 10 times. (authors)

  18. Sequential electrokinetic treatment and oxalic acid extraction for the removal of Cu, Cr and As from wood.

    PubMed

    Isosaari, Pirjo; Marjavaara, Pieti; Lehmus, Eila

    2010-10-15

    Removal of Cu, Cr and As from utility poles treated with chromated copper arsenate (CCA) was investigated using different one- to three-step combinations of oxalic acid extraction and electrokinetic treatment. The experiments were carried out at room temperature, using 0.8% oxalic acid and 30 V (200 V/m) of direct current (DC) or alternating current in combination (DC/AC). A six-hour extraction removed only 15%, 11% and 28% of Cu, Cr and As from wood chips, respectively, while a 7-day electrokinetic treatment removed 57%, 0% and 17%. The best combination for all the metals was a three-step process consisting of pre-extraction, electrokinetic and post-extraction steps, yielding removals of 67% for Cu, 64% for Cr and 81% for As. Oxalic acid extraction prior to electrokinetic treatment was deleterious to further removal of Cu, but it was necessary for Cr and As removal. Chemical equilibrium modelling was used to explain the differences in the behaviour of Cu, Cr and As. Due to the dissimilar nature of these metals, it appeared that even more process sequences and/or stricter control of the process conditions would be needed to obtain the >99% removals required for safe recycling of the purified wood material. 2010 Elsevier B.V. All rights reserved.
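    If sequential steps acted independently, each removing a fixed fraction of what remains, the combined removal would be 1 - Π(1 - r_i). A quick sketch using the abstract's two reported Cu figures shows the idealized two-step prediction lands close to, but below, the observed three-step result, consistent with the steps not being fully independent:

```python
# Idealized combined removal across sequential treatment steps, under
# the assumption that each step removes a fixed fraction of whatever
# metal remains: overall = 1 - product(1 - r_i).

def combined_removal(fractions):
    remaining = 1.0
    for r in fractions:
        remaining *= (1.0 - r)
    return 1.0 - remaining

# Cu: 15% removed by the 6-h oxalic acid extraction and 57% by the
# 7-day electrokinetic treatment (both values from the abstract).
cu_two_step = combined_removal([0.15, 0.57])  # ~0.63
```

    The observed three-step Cu removal was 67%, so under this independence assumption the post-extraction step contributes only a few extra points, which matches the abstract's note that pre-extraction hindered further Cu removal.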

  19. Microwave-Assisted γ-Valerolactone Production for Biomass Lignin Extraction: A Cascade Protocol.

    PubMed

    Tabasso, Silvia; Grillo, Giorgio; Carnaroglio, Diego; Calcio Gaudino, Emanuela; Cravotto, Giancarlo

    2016-03-26

    The general need to slow the depletion of fossil resources and reduce carbon footprints has led to tremendous effort being invested in creating "greener" industrial processes and developing alternative means to produce fuels and synthesize platform chemicals. This work aims to design a microwave-assisted cascade process for a full biomass valorisation cycle. GVL (γ-valerolactone), a renewable green solvent, has been used in aqueous acidic solution to achieve complete biomass lignin extraction. After lignin precipitation, the levulinic acid (LA)-rich organic fraction was hydrogenated, which regenerated the starting solvent for further biomass delignification. This process does not require a purification step because GVL plays the dual role of solvent and product, while the reagent (LA) is a product of biomass delignification. In summary, this bio-refinery approach to lignin extraction is a cascade protocol in which solvent loss is integrated into the conversion cycle, leading to simplified methods for biomass valorisation.

  20. Analysis of edible oil processing options for the BIO-Plex advanced life support system

    NASA Technical Reports Server (NTRS)

    Greenwalt, C. J.; Hunter, J.

    2000-01-01

    Edible oil is a critical component of the proposed plant-based Advanced Life Support (ALS) diet. Soybean, peanut, and single-cell oil are the oil source options to date. In terrestrial manufacture, oil is ordinarily extracted with hexane, an organic solvent. However, exposed solvents are not permitted in the spacecraft environment or in enclosed human tests by the National Aeronautics and Space Administration because of their potential danger and handling difficulty. As a result, alternative oil-processing methods will need to be used. Preparation and recovery options include traditional dehulling, crushing, conditioning, and flaking; extrusion; pressing; water extraction; and supercritical extraction. These processing options were evaluated on criteria appropriate to the Advanced Life Support System and BIO-Plex application, including product quality, product stability, waste production, risk, energy needs, labor requirements, use of nonrenewable resources, usefulness of by-products, and versatility and mass of equipment, to determine the most appropriate ALS edible oil-processing operation.

  1. Technical difficulties and solutions of direct transesterification process of microbial oil for biodiesel synthesis.

    PubMed

    Yousuf, Abu; Khan, Maksudur Rahman; Islam, M Amirul; Wahid, Zularisam Ab; Pirozzi, Domenico

    2017-01-01

    Microbial oils are considered an alternative to vegetable oils or animal fats as biodiesel feedstock. Microalgae and oleaginous yeasts are the main candidate microbial oil producers. However, biodiesel synthesis from these sources is associated with high cost and process complexity. The traditional transesterification method includes several steps such as biomass drying, cell disruption, oil extraction and solvent recovery. Therefore, direct transesterification, or in situ transesterification, which combines all the steps in a single reactor, has been suggested to make the process cost effective. Nevertheless, the process is not yet applicable to large-scale biodiesel production owing to several difficulties: the high water content of the biomass slows the reaction rate, and the hurdles of cell disruption lower the efficiency of oil extraction. Additionally, it requires substantial heating energy in the solvent extraction and recovery stage. To resolve these difficulties, this review suggests the application of antimicrobial peptides and high electric fields to foster microbial cell wall disruption.

  2. Surrogate safety measures from traffic simulation models

    DOT National Transportation Integrated Search

    2003-01-01

    This project investigates the potential for deriving surrogate measures of safety from existing microscopic traffic simulation models for intersections. The process of computing the measures in the simulation, extracting the required data, and summar...

  3. Isothermal separation processes

    NASA Technical Reports Server (NTRS)

    England, C.

    1982-01-01

    The isothermal processes of membrane separation, supercritical extraction and chromatography were examined using availability analysis. The general approach was to derive equations that identified where energy is consumed in these processes and how they compare with conventional separation methods. These separation methods are characterized by pure work inputs, chiefly in the form of a pressure drop which supplies the required energy. Equations were derived for the energy requirement in terms of regular solution theory. This approach is believed to accurately predict the work of separation in terms of the heat of solution and the entropy of mixing. It can form the basis of a convenient calculation method for optimizing membrane and solvent properties for particular applications. Calculations were made on the energy requirements for a membrane process separating air into its components.

  4. 40 CFR Appendix F to Subpart B of... - Standard for Recover-Only Equipment That Extracts a Single, Specific Refrigerant Other Than CFC...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... refrigerant, which are either (1) to be returned to a refrigerant reclamation facility that will process the... capability is required which shall process contaminated refrigerant samples at specific temperatures. 6.2The... the recovery process to ±2% of the original manufacturer's formulation submitted to, and accepted by...

  5. 40 CFR Appendix F to Subpart B of... - Standard for Recover-Only Equipment That Extracts a Single, Specific Refrigerant Other Than CFC...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... refrigerant, which are either (1) to be returned to a refrigerant reclamation facility that will process the... capability is required which shall process contaminated refrigerant samples at specific temperatures. 6.2The... the recovery process to ±2% of the original manufacturer's formulation submitted to, and accepted by...

  6. 40 CFR Appendix F to Subpart B of... - Standard for Recover-Only Equipment That Extracts a Single, Specific Refrigerant Other Than CFC...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... refrigerant, which are either (1) to be returned to a refrigerant reclamation facility that will process the... capability is required which shall process contaminated refrigerant samples at specific temperatures. 6.2The... the recovery process to ±2% of the original manufacturer's formulation submitted to, and accepted by...

  7. 40 CFR Appendix F to Subpart B of... - Standard for Recover-Only Equipment That Extracts a Single, Specific Refrigerant Other Than CFC...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... refrigerant, which are either (1) to be returned to a refrigerant reclamation facility that will process the... capability is required which shall process contaminated refrigerant samples at specific temperatures. 6.2The... the recovery process to ±2% of the original manufacturer's formulation submitted to, and accepted by...

  8. 40 CFR Appendix F to Subpart B of... - Standard for Recover-Only Equipment That Extracts a Single, Specific Refrigerant Other Than CFC...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... refrigerant, which are either (1) to be returned to a refrigerant reclamation facility that will process the... capability is required which shall process contaminated refrigerant samples at specific temperatures. 6.2The... the recovery process to ±2% of the original manufacturer's formulation submitted to, and accepted by...

  9. Using Process Redesign and Information Technology to Improve Procurement

    DTIC Science & Technology

    1994-04-01

    contractor. Many large-volume contractors have automated order processing tied to accounting, manufacturing, and shipping subsystems. Currently...the contractor must receive the mailed order, analyze it, extract pertinent information, and enter that information into the automated order ... processing system. Almost all orders for small purchases are unilateral documents that do not require acceptance or acknowledgment by the contractor. For

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaplan, Ruth; Mamrosh, Darryl; Salih, Hafiz H.

    Brine extraction is a promising strategy for the management of increased reservoir pressure resulting from carbon dioxide (CO2) injection in deep saline reservoirs. The extracted brines usually have high concentrations of total dissolved solids (TDS) and various contaminants, and require proper disposal or treatment. In this article, first by conducting a critical review, we evaluate the applicability, limits, and advantages or challenges of various commercially available and emerging desalination technologies that can potentially be employed to treat the highly saline brine (with TDS values >70,000 ppm) and those that are applicable to a ~200,000 ppm TDS brine extracted from the Mt. Simon Sandstone, a potential CO2 storage site in Illinois, USA. Based on the side-by-side comparison of technologies, evaporators are selected as the most suitable existing technology for treating Mt. Simon brine. Process simulations are then conducted for a conceptual design for desalination of 454 m3/h (2000 gpm) of pretreated brine for near-zero liquid discharge by multi-effect evaporators. In conclusion, the thermal energy demand is estimated at 246 kWh per m3 of recovered water, of which 212 kWh/m3 is required for multiple-effect evaporation and the remainder for salt drying. The process also requires additional electrical power of ~2 kWh/m3.
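    The quoted energy split can be sanity-checked with a short script. The flow rate and per-m3 demands are taken from the abstract above; the water recovery fraction is an illustrative assumption (the design targets near-zero liquid discharge, but the abstract does not state a single recovery figure):

```python
# Back-of-the-envelope check of the evaporator energy figures quoted above.
thermal_total = 246.0        # kWh(t) per m3 of recovered water, total
thermal_evap = 212.0         # kWh(t) per m3, multiple-effect evaporation
thermal_drying = thermal_total - thermal_evap  # remainder: salt drying
electric = 2.0               # kWh(e) per m3, additional electrical power

feed = 454.0                 # m3/h of pretreated brine (2000 gpm)
recovery = 0.8               # ASSUMED recovery fraction, not from the abstract
recovered = feed * recovery  # m3/h of product water

print(f"salt-drying share: {thermal_drying:.0f} kWh/m3")
print(f"thermal duty at {recovery:.0%} assumed recovery: "
      f"{recovered * thermal_total / 1000:.1f} MWh(t)/h")
```

Varying the assumed recovery scales the plant duty linearly but leaves the per-m3 figures untouched, which is why the abstract reports them on a recovered-water basis.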

  11. Thermodynamic and energy efficiency analysis of power generation from natural salinity gradients by pressure retarded osmosis.

    PubMed

    Yip, Ngai Yin; Elimelech, Menachem

    2012-05-01

    The Gibbs free energy of mixing dissipated when fresh river water flows into the sea can be harnessed for sustainable power generation. Pressure retarded osmosis (PRO) is one of the methods proposed to generate power from natural salinity gradients. In this study, we carry out a thermodynamic and energy efficiency analysis of PRO work extraction. First, we present a reversible thermodynamic model for PRO and verify that the theoretical maximum extractable work in a reversible PRO process is identical to the Gibbs free energy of mixing. Work extraction in an irreversible constant-pressure PRO process is then examined. We derive an expression for the maximum extractable work in a constant-pressure PRO process and show that it is less than the ideal work (i.e., Gibbs free energy of mixing) due to inefficiencies intrinsic to the process. These inherent inefficiencies are attributed to (i) frictional losses required to overcome hydraulic resistance and drive water permeation and (ii) unutilized energy due to the discontinuation of water permeation when the osmotic pressure difference becomes equal to the applied hydraulic pressure. The highest extractable work in constant-pressure PRO with a seawater draw solution and river water feed solution is 0.75 kWh/m3 while the free energy of mixing is 0.81 kWh/m3, a thermodynamic extraction efficiency of 91.1%. Our analysis further reveals that the operational objective to achieve high power density in a practical PRO process is inconsistent with the goal of maximum energy extraction. This study demonstrates thermodynamic and energetic approaches for PRO and offers insights on the actual energy accessible for utilization in PRO power generation through salinity gradients. © 2012 American Chemical Society
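    The constant-pressure inefficiency described in (ii) can be illustrated with a deliberately simplified toy model: van 't Hoff osmotic pressure, an ideal membrane, a fixed draw volume, and an effectively unlimited fresh-water feed. These are not the paper's assumptions, so the numbers below differ from the 0.75 and 0.81 kWh/m3 figures quoted above:

```python
import numpy as np

R, T = 8.314, 298.0           # J/(mol K), K
c_draw, c_feed = 600.0, 0.0   # mol/m3 NaCl: seawater ~0.6 M, river ~0 (assumed)
i = 2                         # van 't Hoff factor for NaCl

pi0 = i * (c_draw - c_feed) * R * T   # initial osmotic pressure difference, Pa

# Constant-pressure PRO with a fixed draw volume and abundant fresh feed:
# permeation stops once dilution brings the osmotic difference down to dP,
# giving work per m3 of total processed volume W(dP) = dP * (pi0 - dP) / pi0.
dP = np.linspace(1.0, pi0, 10000)     # candidate hydraulic pressures, Pa
w = dP * (pi0 - dP) / pi0             # J per m3 of total volume
best = np.argmax(w)

print(f"initial osmotic pressure  ~ {pi0 / 1e5:.1f} bar")
print(f"optimal hydraulic pressure ~ {dP[best] / 1e5:.1f} bar (= pi0/2)")
print(f"max specific energy        ~ {w[best] / 3.6e6:.2f} kWh per m3 processed")
```

In this idealized setting the optimum hydraulic pressure lands at half the initial osmotic pressure difference, and only a quarter of the initial osmotic pressure is recoverable per unit of total processed volume, a simple picture of why constant-pressure operation cannot reach the full Gibbs free energy of mixing.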

  12. Space Research Data Management in the National Aeronautics and Space Administration

    NASA Technical Reports Server (NTRS)

    Ludwig, G. H.

    1986-01-01

    Space-related scientific research has passed through a natural evolutionary process. The task of extracting the meaningful information from the raw data is highly involved and will require data processing capabilities that do not exist today. The results of a three-year examination of this subject are presented, using an earlier report as a starting point. The general conclusion is that there are areas in which NASA's data management practices can be improved, and specific actions are recommended. These actions will enhance NASA's ability to extract more information from the data and to capitalize on future opportunities.

  13. Individual Learning Route as a Way of Highly Qualified Specialists Training for Extraction of Solid Commercial Minerals Enterprises

    NASA Astrophysics Data System (ADS)

    Oschepkova, Elena; Vasinskaya, Irina; Sockoluck, Irina

    2017-11-01

    In view of the changing educational paradigm (adoption of the two-tier undergraduate/graduate concept of higher education), modern learning and information and communications technologies are needed to put learner-centered approaches into practice when training highly qualified specialists for enterprises that extract and process solid commercial minerals. Facing unstable market demand and a changeable institutional environment on one side, and the need to balance work, supply conditions and product quality as mining-and-geological parameters change on the other, mining enterprises have to introduce and develop integrated management of product, information and logistic flows under a unified management system. One of the main limitations holding back this development at Russian mining enterprises is staff incompetence at all levels of logistics management. Under present-day conditions, such enterprises need highly qualified specialists who can conduct self-directed research and develop new, or improve existing, technologies for organizing, planning and managing the technical operation and commercial exploitation of transport and processing facilities based on logistics. A learner-centered approach and individualization of the learning process call for the design of an individual learning route (ILR), which can help students realize their professional potential in line with the requirements placed on specialists by enterprises extracting and processing solid commercial minerals.

  14. Map Design for Computer Processing: Literature Review and DMA Product Critique.

    DTIC Science & Technology

    1985-01-01

    requirements can be separated from user preference ... contour lines (vegetation shown by iconic symbols) versus extracting relief information using only contour lines (vegetation shown by tints) ... extracting vegetation information using iconic symbols (relief shown by elevation tints) ... in the case of point symbols, iconic forms ... extrapolating the symbols on a white background ... timing the performance of tasks

  15. Volume and Value of Big Healthcare Data.

    PubMed

    Dinov, Ivo D

    Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. The Moore's and Kryder's laws of exponential increase of computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions like: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions.

  16. Volume and Value of Big Healthcare Data

    PubMed Central

    Dinov, Ivo D.

    2016-01-01

    Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. The Moore's and Kryder's laws of exponential increase of computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions like: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions. PMID:26998309

  17. The Metaproteome of "Park Grass" soil - a reference for EU soil science

    NASA Astrophysics Data System (ADS)

    Quinn, Gerry; Dudley, Ed; Doerr, Stefan; Matthews, Peter; Halen, Ingrid; Walley, Richard; Ashton, Rhys; Delmont, Tom; Francis, Lewis; Gazze, Salvatore Andrea; Van Keulen, Geertje

    2016-04-01

    Soil metaproteomics, the systematic extraction and identification of proteins from a soil, is key to understanding the biological and physical processes that occur within the soil at a molecular level. Until recently, direct extraction of proteins from complex soils has yielded only dozens of protein identifications due to interfering substances, such as humic acids and clay, which co-extract and/or strongly adsorb protein, often causing problems in downstream processing, e.g. mass spectrometry. Furthermore, the currently most successful direct proteomic extraction protocol favours larger molecular weight and/or heat-stable proteins. We have now developed a novel, faster, direct soil protein extraction protocol which also addresses the problem of interfering substances, while requiring less than 1 gram of material per extraction. We extracted protein from the 'Genomic Observatory' Park Grass at Rothamsted Research (UK), an ideally suited geographic site as it is the longest (>150 years) continually studied experiment on ungrazed permanent grassland in the world, for which a rich history of environmental/ecological data has been collected, including high-quality, publicly available metagenome DNA sequences. Using this improved methodology, in conjunction with the creation of high-quality, curated metagenomic sequence databases, we were able to significantly improve protein identifications from a single soil, extracting a similar number of proteins of which >90% differed from those obtained with the best current direct protocol. This optimised metaproteomics protocol has now enabled identification of thousands of proteins from one soil, leading to a deeper insight into soil system processes at the molecular scale.

  18. Application of enzymes in the production of RTD black tea beverages: a review.

    PubMed

    Kumar, Chandini S; Subramanian, R; Rao, L Jaganmohan

    2013-01-01

    Ready-to-drink (RTD) tea is a popular beverage in many countries. Instability due to the development of haze and the formation of tea cream is a common problem in the production of RTD black tea beverages. Decreaming is thus an important step in the process to meet the cold stability requirements of the product. Enzymatic decreaming approaches overcome some of the disadvantages associated with conventional decreaming methods such as cold water extraction, chill decreaming, chemical stabilization, and chemical solubilization. Enzyme treatments have been attempted at three stages of black tea processing, namely, enzymatic treatment of green tea and conversion to black tea, enzymatic treatment of black tea followed by extraction, and enzymatic clarification of the extract. Tannase (tannin acyl hydrolase, EC 3.1.1.20) is the most commonly employed enzyme, aimed at improving cold water extractability/solubility and decreasing tea cream formation as well as improving clarity. The major enzymatic methods proposed for processing black tea that have a direct or indirect bearing on RTD tea production are discussed, along with their relative advantages and limitations.

  19. Interactive access to LP DAAC satellite data archives through a combination of open-source and custom middleware web services

    USGS Publications Warehouse

    Davis, Brian N.; Werpy, Jason; Friesz, Aaron M.; Impecoven, Kevin; Quenzer, Robert; Maiersperger, Tom; Meyer, David J.

    2015-01-01

    Current methods of searching for and retrieving data from satellite land remote sensing archives do not allow for interactive information extraction. Instead, Earth science data users are required to download files over low-bandwidth networks to local workstations and process data before science questions can be addressed. New methods of extracting information from data archives need to become more interactive to meet user demands for deriving increasingly complex information from rapidly expanding archives. Moving the tools required for processing data to computer systems of data providers, and away from systems of the data consumer, can improve turnaround times for data processing workflows. The implementation of middleware services was used to provide interactive access to archive data. The goal of this middleware services development is to enable Earth science data users to access remote sensing archives for immediate answers to science questions instead of links to large volumes of data to download and process. Exposing data and metadata to web-based services enables machine-driven queries and data interaction. Also, product quality information can be integrated to enable additional filtering and sub-setting. Only the reduced content required to complete an analysis is then transferred to the user.
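    The machine-driven query pattern described above can be sketched in a few lines. The endpoint, parameter names, and response shape below are invented for illustration and are not the LP DAAC's actual API; only the product name is a real example:

```python
import json
import urllib.parse
import urllib.request

# Hypothetical middleware subsetting service: the archive-side service applies
# spatial, temporal, and quality filtering, so only reduced content is returned.
BASE = "https://example.org/api/v1/subset"   # placeholder endpoint
params = {
    "product": "MOD13Q1",                    # real LP DAAC product, as an example
    "variable": "NDVI",
    "bbox": "-103.5,44.0,-103.0,44.5",       # spatial subset
    "start": "2014-01-01",
    "end": "2014-12-31",
    "quality": "good",                       # server-side quality-flag filtering
}
url = BASE + "?" + urllib.parse.urlencode(params)
print(url)

# A client would then fetch just the reduced content, e.g.:
# with urllib.request.urlopen(url) as resp:
#     subset = json.load(resp)
```

The point of the design is visible in the URL itself: the subsetting criteria travel to the data provider, and only the filtered result travels back to the user.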

  20. Low solvent, low temperature method for extracting biodiesel lipids from concentrated microalgal biomass.

    PubMed

    Olmstead, Ian L D; Kentish, Sandra E; Scales, Peter J; Martin, Gregory J O

    2013-11-01

    An industrially relevant method for disrupting microalgal cells and preferentially extracting neutral lipids for large-scale biodiesel production was demonstrated on pastes (20-25% solids) of Nannochloropsis sp. The highly resistant Nannochloropsis sp. cells were disrupted by incubation for 15 h at 37°C followed by high pressure homogenization at 1200 ± 100 bar. Lipid extraction was performed by twice contacting concentrated algal paste with minimal hexane (solvent:biomass ratios (w/w) of <2:1 and <1.3:1) in a stirred vessel at 35°C. Cell disruption prior to extraction increased lipid recovery 100-fold, with yields of 30-50% w/w obtained in the first hexane contact, and a further 6.5-20% in the second contact. The hexane preferentially extracted neutral lipids over glyco- and phospholipids, with up to 86% w/w of the neutral lipids recovered. The process was effective on wet concentrated paste, required minimal solvent and moderate temperature, and did not require difficult-to-recover polar solvents. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Method of Moments Applied to the Analysis of Precision Spectra from the Neutron Time-of- flight Diagnostics at the National Ignition Facility

    NASA Astrophysics Data System (ADS)

    Hatarik, Robert; Caggiano, J. A.; Callahan, D.; Casey, D.; Clark, D.; Doeppner, T.; Eckart, M.; Field, J.; Frenje, J.; Gatu Johnson, M.; Grim, G.; Hartouni, E.; Hurricane, O.; Kilkenny, J.; Knauer, J.; Ma, T.; Mannion, O.; Munro, D.; Sayre, D.; Spears, B.

    2015-11-01

    The method of moments was introduced by Pearson as a process for estimating the population distributions from which a set of "random variables" are measured. These moments are compared with a parameterization of the distributions, or with the same quantities generated by simulations of the process. Most diagnostic processes extract scalar parameters depending on the moments of spectra derived from analytic solutions to the fusion rate, necessarily based on simplifying assumptions about the confined plasma. The precision of the TOF spectra and the nature of the implosions at the NIF require the inclusion of factors beyond the traditional analysis and the addition of higher-order moments to describe the data. This talk will present a diagnostic process for extracting the moments of the neutron energy spectrum for comparison with theoretical considerations as well as simulations of the implosions. Work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under contract DE-AC52-07NA27344.
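    The basic idea, matching sample moments of a measured spectrum to a parameterized distribution, can be sketched on synthetic data. The Gaussian line below is an illustrative stand-in, not NIF data; its location and width are arbitrary round numbers:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for a neutron spectrum: a Gaussian line near the DT peak.
E = rng.normal(loc=14.1, scale=0.35, size=100_000)  # MeV; values illustrative

# Method of moments: equate sample moments to those of the parameterized model.
m1 = E.mean()                            # first moment -> location parameter
m2 = ((E - m1) ** 2).mean()              # second central moment -> variance
skew = ((E - m1) ** 3).mean() / m2**1.5  # third: departure from Gaussian shape

print(f"mean = {m1:.3f} MeV, width = {np.sqrt(m2):.3f} MeV, skew = {skew:+.3f}")
```

For a pure Gaussian the first two moments recover the parameters and the skewness sits near zero; a significantly nonzero higher-order moment is exactly the kind of signal that motivates going beyond the traditional two-parameter analysis.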

  2. Resist Parameter Extraction from Line-and-Space Patterns of Chemically Amplified Resist for Extreme Ultraviolet Lithography

    NASA Astrophysics Data System (ADS)

    Kozawa, Takahiro; Oizumi, Hiroaki; Itani, Toshiro; Tagawa, Seiichi

    2010-11-01

    The development of extreme ultraviolet (EUV) lithography has progressed owing to worldwide effort. As the development status of EUV lithography approaches the requirements for the high-volume production of semiconductor devices with a minimum line width of 22 nm, the extraction of resist parameters becomes increasingly important from the viewpoints of the accurate evaluation of resist materials for resist screening and accurate process simulation for process and mask designs. In this study, we demonstrated that resist parameters (namely, quencher concentration, acid diffusion constant, proportionality constant of line edge roughness, and dissolution point) can be extracted from scanning electron microscopy (SEM) images of patterned resists, without knowledge of the details of the resist composition, using two types of the latest EUV resists.

  3. Vision-based weld pool boundary extraction and width measurement during keyhole fiber laser welding

    NASA Astrophysics Data System (ADS)

    Luo, Masiyang; Shin, Yung C.

    2015-01-01

    In keyhole fiber laser welding processes, the weld pool behavior is essential to determining welding quality. To better observe and control the welding process, accurate extraction of the weld pool boundary as well as its width is required. This work presents a weld pool edge detection technique based on an off-axial green illumination laser and a coaxial image capturing system that consists of a CMOS camera and optical filters. To accommodate differences in image quality, a fully developed edge detection algorithm is proposed, based on a search for the local maximum of the greyness gradient together with linear interpolation. The extracted weld pool geometry and width are validated against actual weld width measurements and predictions by a numerical multi-phase model.
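    A minimal sketch of the maximum-greyness-gradient idea, run on a synthetic frame rather than real camera images (the image, edge positions, and noise level are all assumptions for illustration, and the sub-pixel interpolation step is omitted):

```python
import numpy as np

# Synthetic grey-level frame: a bright molten pool on a darker background.
h, w = 64, 96
img = np.zeros((h, w))
left, right = 30, 70                  # true pool edges on every row (assumed)
img[:, left:right] = 1.0
img += 0.02 * np.random.default_rng(1).normal(size=img.shape)  # sensor noise

# On each row, take the position of the largest greyness gradient as the edge:
# the strongest rise in the left half, the strongest fall in the right half.
grad = np.abs(np.diff(img, axis=1))   # horizontal greyness gradient
half = w // 2
edge_l = grad[:, :half].argmax(axis=1)
edge_r = half + grad[:, half:].argmax(axis=1)
width_px = edge_r - edge_l            # weld pool width per scanline, pixels

print("median width:", int(np.median(width_px)), "px")
```

A calibration factor (mm per pixel) would then convert the pixel width into the physical weld pool width that is compared against measurements.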

  4. Unlocking echocardiogram measurements for heart disease research through natural language processing.

    PubMed

    Patterson, Olga V; Freiberg, Matthew S; Skanderson, Melissa; J Fodeh, Samah; Brandt, Cynthia A; DuVall, Scott L

    2017-06-12

    In order to investigate the mechanisms of cardiovascular disease in HIV infected and uninfected patients, an analysis of echocardiogram reports is required for a large longitudinal multi-center study. A natural language processing system using a dictionary lookup, rules, and patterns was developed to extract heart function measurements that are typically recorded in echocardiogram reports as measurement-value pairs. Curated semantic bootstrapping was used to create a custom dictionary that extends existing terminologies based on terms that actually appear in the medical record. A novel disambiguation method based on semantic constraints was created to identify and discard erroneous alternative definitions of the measurement terms. The system was built utilizing a scalable framework, making it available for processing large datasets. The system was developed for and validated on notes from three sources: general clinic notes, echocardiogram reports, and radiology reports. The system achieved F-scores of 0.872, 0.844, and 0.877 with precision of 0.936, 0.982, and 0.969 for each dataset respectively averaged across all extracted values. Left ventricular ejection fraction (LVEF) is the most frequently extracted measurement. The precision of extraction of the LVEF measure ranged from 0.968 to 1.0 across different document types. This system illustrates the feasibility and effectiveness of a large-scale information extraction on clinical data. New clinical questions can be addressed in the domain of heart failure using retrospective clinical data analysis because key heart function measurements can be successfully extracted using natural language processing.
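    The measurement-value pair idea can be sketched with a few lines of rule-based matching. The alias dictionary, pattern, and sample note below are illustrative assumptions, not the study's actual lexicon, rules, or data:

```python
import re

# Toy dictionary lookup: surface terms mapped to canonical measurement names.
ALIASES = {
    "lvef": "LVEF", "ejection fraction": "LVEF", "ef": "LVEF",
    "lvedd": "LVEDD",
}

# Measurement term, optional connector, numeric value, optional unit.
pattern = re.compile(
    r"\b(?P<term>lvef|ejection fraction|ef|lvedd)\s*(?:is|of|[:=])?\s*"
    r"(?P<value>\d+(?:\.\d+)?)\s*(?P<unit>%|cm|mm)?",
    re.IGNORECASE,
)

note = "Echo today. LVEF: 55 %. LVEDD 4.8 cm. Ejection fraction of 55."
pairs = [(ALIASES[m["term"].lower()], float(m["value"]), m["unit"])
         for m in pattern.finditer(note)]
print(pairs)
```

A real system layers disambiguation on top of this, for example rejecting a candidate "EF" whose surrounding context violates semantic constraints, which is the role the custom dictionary and disambiguation method play in the work described above.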

  5. Water-based gas purge microsyringe extraction coupled with liquid chromatography for determination of alkylphenols from sea food Laminaria japonica Aresh.

    PubMed

    Yang, Cui; Zhao, Jinhua; Wang, Juan; Yu, Hongling; Piao, Xiangfan; Li, Donghao

    2013-07-26

    A novel organic solvent-free mode of gas purge microsyringe extraction, termed water-based gas purge microsyringe extraction, was developed. This technique can directly extract target compounds from wet samples without any drying process. Parameters affecting the extraction efficiency were investigated. Under optimal extraction conditions, the recoveries of alkylphenols were between 87.6 and 105.8%, and reproducibility was between 5.2 and 12.1%. The technique was also used to determine six kinds of alkylphenols (APs) in samples of Laminaria japonica Aresh. OP and NP were detected in all the samples, at concentrations ranging from 26.0 to 54.5 ng/g and from 45.0 to 180.4 ng/g, respectively. 4-n-Butylphenol was detected in only one sample, at a very low concentration. Other APs were not detected in the L. japonica Aresh samples. The experimental results demonstrated that the technique is fast, simple, non-polluting, allows quantitative extraction, and requires no drying process for wet samples. Since only aqueous solution and a conventional microsyringe were used, the technique proved affordable, efficient, and convenient for the extraction of volatile and semivolatile ionizable compounds. Copyright © 2013 Elsevier B.V. All rights reserved.

  6. MOEX: Solvent extraction approach for recycling enriched 98Mo/ 100Mo material

    DOE PAGES

    Tkac, Peter; Brown, M. Alex; Momen, Abdul; ...

    2017-03-20

    Several promising pathways exist for the production of 99Mo/99mTc using enriched 98Mo or 100Mo. Use of Mo targets requires a major change in current generator technology and an efficient recycling pathway to recover the valuable enriched Mo material. High recovery yields, purity, suitable chemical form and particle size are required. Results on the development of the MOEX (molybdenum solvent extraction) approach to recycling enriched Mo material are presented. Furthermore, the advantages of the MOEX process include very high decontamination factors from potassium and other elements, high throughput, easy scalability, automation, and minimal waste generation.

  7. MOEX: Solvent extraction approach for recycling enriched 98Mo/ 100Mo material

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tkac, Peter; Brown, M. Alex; Momen, Abdul

    Several promising pathways exist for the production of 99Mo/99mTc using enriched 98Mo or 100Mo. Use of Mo targets requires a major change in current generator technology and an efficient recycling pathway to recover the valuable enriched Mo material. High recovery yields, purity, suitable chemical form and particle size are required. Results on the development of the MOEX (molybdenum solvent extraction) approach to recycling enriched Mo material are presented. Furthermore, the advantages of the MOEX process include very high decontamination factors from potassium and other elements, high throughput, easy scalability, automation, and minimal waste generation.

  8. Measures of the environmental footprint of the front end of the nuclear fuel cycle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. Schneider; B. Carlsen; E. Tavrides

    2013-11-01

    Previous estimates of environmental impacts associated with the front end of the nuclear fuel cycle (FEFC) have focused primarily on energy consumption and CO2 emissions, and results have varied widely. This work builds upon reports from operating facilities and other primary data sources to build a database of front-end environmental impacts. It also addresses land transformation and water withdrawals associated with the processes of the FEFC, which include uranium extraction, conversion, enrichment, fuel fabrication, depleted uranium disposition, and transportation. To allow summing the impacts across processes, all impacts were normalized per tonne of natural uranium mined as well as per MWh(e) of electricity produced, a more conventional unit for measuring environmental impacts that facilitates comparison with other studies. This conversion was based on mass balances and process efficiencies associated with the current once-through LWR fuel cycle. Total energy input is calculated at 8.7 x 10^-3 GJ(e)/MWh(e) of electricity and 5.9 x 10^-3 GJ(t)/MWh(e) of thermal energy; it is dominated by the energy required for uranium extraction, conversion to a fluoride compound for subsequent enrichment, and enrichment itself. An estimate of the carbon footprint, made from the direct energy consumption, is 1.7 kg CO2/MWh(e). Water use is likewise dominated by the requirements of uranium extraction, totaling 154 L/MWh(e). Land use is calculated at 8 x 10^-3 m2/MWh(e), over 90% of which is due to uranium extraction. Quantified impacts are limited to those resulting from activities performed within the FEFC process facilities (i.e., within the plant gates). Energy embodied in material inputs such as process chemicals and fuel cladding is identified but not explicitly quantified in this study. Inclusion of indirect energy associated with embodied energy, as well as construction and decommissioning of facilities, could increase the FEFC energy intensity estimate by a factor of up to 2.
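The normalization step described above (summing per-process impacts per tonne of natural uranium, then converting to per-MWh(e) via fuel-cycle mass balances) can be sketched as follows. The conversion factor and per-process figures below are illustrative assumptions, not values from the study.

```python
# Hypothetical water withdrawals per tonne of natural uranium, by FEFC process (L/tU)
water_per_tU = {
    "uranium_extraction": 6.0e6,
    "conversion": 0.4e6,
    "enrichment": 0.2e6,
    "fuel_fabrication": 0.1e6,
}

# Assumed electricity generated per tonne of natural uranium mined (MWh(e)/tU),
# fixed by once-through LWR mass balances (tails assay, burnup, thermal efficiency).
# This value is a placeholder, not the study's factor.
MWH_E_PER_TONNE_U = 45_000.0

def normalize_per_mwh(impacts_per_tU, mwh_per_tU):
    """Convert per-tonne-U impacts into per-MWh(e) impacts."""
    return {proc: v / mwh_per_tU for proc, v in impacts_per_tU.items()}

water_per_mwh = normalize_per_mwh(water_per_tU, MWH_E_PER_TONNE_U)
total = sum(water_per_mwh.values())                       # total L/MWh(e)
share_mining = water_per_mwh["uranium_extraction"] / total
```

With these placeholder inputs, uranium extraction dominates the per-MWh water total, mirroring the pattern reported in the abstract.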

  9. Challenges for automatically extracting molecular interactions from full-text articles.

    PubMed

    McIntosh, Tara; Curran, James R

    2009-09-24

    The increasing availability of full-text biomedical articles will allow more biomedical knowledge to be extracted automatically and with greater reliability. However, most Information Retrieval (IR) and Information Extraction (IE) tools currently process only abstracts. The lack of corpora has limited the development of tools that are capable of exploiting the knowledge in full-text articles. As a result, there has been little investigation into the advantages of full-text document structure and the challenges developers will face in processing full-text articles. We manually annotated passages from full-text articles that describe interactions summarised in a Molecular Interaction Map (MIM). Our corpus tracks the process of identifying facts to form the MIM summaries and captures any factual dependencies that must be resolved to extract the fact completely. For example, a fact in the results section may require a synonym defined in the introduction. The passages are also annotated with negated and coreference expressions that must be resolved. We describe the guidelines for identifying relevant passages and possible dependencies. The corpus includes 2162 sentences from 78 full-text articles. Our corpus analysis demonstrates the necessity of full-text processing; identifies the article sections where interactions are most commonly stated; and quantifies the proportion of interaction statements requiring coherent dependencies. Further, it allows us to report on the relative importance of identifying synonyms and resolving negated expressions. We also experiment with an oracle sentence retrieval system using the corpus as a gold-standard evaluation set. We introduce the MIM corpus, a unique resource that maps interaction facts in a MIM to annotated passages within full-text articles. It is an invaluable case study providing guidance to developers of biomedical IR and IE systems, and can be used as a gold-standard evaluation set for full-text IR tasks.

  10. Modern Techniques in Acoustical Signal and Image Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Candy, J V

    2002-04-04

    Acoustical signal processing problems can lead to some complex and intricate techniques to extract the desired information from noisy, sometimes inadequate, measurements. The challenge is to formulate a meaningful strategy that is aimed at performing the processing required even in the face of uncertainties. This strategy can be as simple as a transformation of the measured data to another domain for analysis or as complex as embedding a full-scale propagation model into the processor. The aims of both approaches are the same: to extract the desired information and reject the extraneous, that is, to develop a signal processing scheme that achieves this goal. In this paper, we briefly discuss this underlying philosophy from a "bottom-up" approach, enabling the problem to dictate the solution rather than vice versa.

  11. Extraction of kiwi seed oil: Soxhlet versus four different non-conventional techniques.

    PubMed

    Cravotto, Giancarlo; Bicchi, Carlo; Mantegna, Stefano; Binello, Arianna; Tomao, Valerie; Chemat, Farid

    2011-06-01

    Kiwi seed oil has a nutritionally interesting fatty acid profile but a rather low oxidative stability, which requires careful extraction procedures and adequate packaging and storage. For these reasons, and with the aim of achieving process intensification with shorter extraction time, lower energy consumption, and higher yields, four different non-conventional techniques were evaluated. Kiwi seeds were extracted in hexane using classic Soxhlet as well as under power ultrasound (US), microwaves (MWs; closed vessel), and MW-integrated Soxhlet. Supercritical CO₂ was also employed and compared to the other techniques in terms of yield, extraction time, fatty acid profile, and organoleptic properties. All these non-conventional techniques are fast, effective, and safe. A sensory evaluation test showed the presence of off-flavours in oil samples extracted by Soxhlet and US, an indicator of partial degradation.

  12. An algorithm to extract more accurate stream longitudinal profiles from unfilled DEMs

    NASA Astrophysics Data System (ADS)

    Byun, Jongmin; Seong, Yeong Bae

    2015-08-01

    Morphometric features observed from a stream longitudinal profile (SLP) reflect channel responses to lithological variation and changes in uplift or climate; therefore, they constitute essential indicators in studies of the dynamics between tectonics, climate, and surface processes. The widespread availability of digital elevation models (DEMs) and their processing enable semi-automatic extraction of SLPs as well as additional stream profile parameters, thus reducing the time spent extracting them and simultaneously allowing regional-scale studies of SLPs. However, careful consideration is required to extract SLPs directly from a DEM, because the DEM must first be altered by a depression-filling process to ensure the continuity of flows across it. Such alteration inevitably introduces distortions to the SLP, such as stair steps, bias of elevation values, and inaccurate stream paths. This paper proposes a new algorithm, called the maximum depth tracing algorithm (MDTA), to extract more accurate SLPs using depression-unfilled DEMs. The MDTA supposes that depressions in DEMs are not necessarily artifacts to be removed, and that elevation values within them are useful for representing the real landscape more accurately. To ensure the continuity of flows even across the unfilled DEM, the MDTA first determines the outlet of each depression and then reverses the flow directions of the cells on the line of maximum depth within each depression, beginning from the outlet and proceeding toward the sink. It also calculates flow accumulation without disruption across the unfilled DEM. Comparative analysis with the profiles extracted by the hydrologic functions implemented in ArcGIS™ was performed to illustrate the benefits of the MDTA. This comparison shows that the MDTA provides more accurate stream paths in depression areas and consequently reduces distortions of the SLPs derived from those paths, such as the exaggerated elevation values and negatively biased slopes commonly observed in SLPs built using ArcGIS™. The algorithm proposed here could therefore aid any study requiring more reliable stream paths and SLPs from DEMs.
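The core idea of the MDTA, reversing flow directions along the line of maximum depth so that flow exits a depression through its outlet rather than terminating at the sink, can be illustrated with a toy one-dimensional profile. The 2-D D8 bookkeeping of the real algorithm is omitted, and the profile below is invented.

```python
def flow_directions(elev):
    """Naive downhill flow on a 1-D profile: each cell points to its lower
    neighbour (+1 or -1); 0 marks a sink (no lower neighbour)."""
    dirs = []
    for i, z in enumerate(elev):
        nbrs = [(elev[j], j - i) for j in (i - 1, i + 1) if 0 <= j < len(elev)]
        zmin, step = min(nbrs)
        dirs.append(step if zmin < z else 0)
    return dirs

def mdta_redirect(elev, dirs, sink, outlet):
    """Toy MDTA step: reverse directions on the sink-to-outlet line so that
    flow leaves the depression via its outlet, with no depression filling."""
    dirs = dirs[:]
    step = 1 if outlet > sink else -1
    for i in range(sink, outlet + step, step):
        dirs[i] = step          # cells from sink to outlet now point outward
    return dirs

elev = [5, 3, 1, 2, 4, 2, 0]    # depression: sink at index 2, outlet (rim) at index 4
dirs = flow_directions(elev)     # flow terminates at the sink (direction 0 at index 2)
fixed = mdta_redirect(elev, dirs, sink=2, outlet=4)  # flow now continues to index 6
```

After the redirect, tracing the flow from any cell reaches the profile's true base level at index 6 instead of stalling in the depression, while the depression's elevation values remain unaltered.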

  13. Compound Separation

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Jet Propulsion Laboratory developed a new one-step liquid-liquid extraction technique which cuts processing time, reduces costs and eliminates much of the equipment required. Technique employs disposable extraction columns, originally developed as an aid to the Los Angeles Police Department, which allow more rapid detection of drugs as part of the department's drug abuse program. Applications include medical treatment, pharmaceutical preparation and forensic chemistry. NASA waived title to Caltech, and Analytichem International is producing Extubes under Caltech license.

  14. A microfluidic study of liquid-liquid extraction mediated by carbon dioxide.

    PubMed

    Lestari, Gabriella; Salari, Alinaghi; Abolhasani, Milad; Kumacheva, Eugenia

    2016-07-05

    Liquid-liquid extraction is an important separation and purification method; however, it faces a challenge in reducing the energy consumption and the environmental impact of solvent (extractant) recovery. The reversible chemical reactions of switchable solvents (nitrogenous bases) with carbon dioxide (CO2) can be implemented in reactive liquid-liquid extraction to significantly reduce the cost and energy requirements of solvent recovery. The development of new effective switchable solvents reacting with CO2 and the optimization of extraction conditions rely on the ability to evaluate and screen the performance of switchable solvents in extraction processes. We report a microfluidic strategy for time- and labour-efficient studies of CO2-mediated solvent extraction. The platform utilizes a liquid segment containing an aqueous extractant droplet and a droplet of a solution of a switchable solvent in a non-polar liquid, with gaseous CO2 supplied to the segment from both sides. Following the reaction of the switchable solvent with CO2, the solvent becomes hydrophilic and transfers from the non-polar solvent to the aqueous droplet. By monitoring the time-dependent variation in droplet volumes, we determined the efficiency and extraction time for the CO2-mediated extraction of different nitrogenous bases in a broad experimental parameter space. The platform enables a significant reduction in the amount of switchable solvents used in these studies, provides accurate temporal characterization of the liquid-liquid extraction process, and offers the capability of high-throughput screening of switchable solvents.
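The droplet-volume readout described above can be sketched as a small calculation: extraction efficiency is inferred from the plateau growth of the aqueous droplet relative to the initial switchable-solvent volume, and extraction time is taken as the time to reach 95% of that plateau. The 95% threshold and all data below are illustrative assumptions, not values from the study.

```python
def extraction_metrics(times, aq_volumes, v_solvent):
    """times in s; aq_volumes = aqueous droplet volumes in nL over time;
    v_solvent = initial switchable-solvent volume (nL) in the non-polar droplet."""
    v0 = aq_volumes[0]
    deltas = [v - v0 for v in aq_volumes]       # volume transferred so far
    plateau = deltas[-1]                        # assume last point is the plateau
    efficiency = plateau / v_solvent            # fraction of solvent transferred
    t95 = next(t for t, d in zip(times, deltas) if d >= 0.95 * plateau)
    return efficiency, t95

times = [0, 30, 60, 90, 120, 150]                   # s (hypothetical)
aq_volumes = [10.0, 13.0, 15.5, 17.2, 17.9, 18.0]   # nL (hypothetical)
eff, t95 = extraction_metrics(times, aq_volumes, v_solvent=10.0)
```

For this invented time series, 80% of the switchable solvent transfers to the aqueous droplet, with 95% of the transfer complete by 120 s.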

  15. A Technological Comparison of Six Processes for the Production of Reduction-Grade Alumina from Non-Bauxitic Raw Materials

    NASA Astrophysics Data System (ADS)

    Bengtson, K. B.

    The U. S. Bureau of Mines, by means of a contract with Kaiser Engineers and with Kaiser Aluminum & Chemical Corporation as a subcontractor, has sponsored a technological and an economic evaluation of six candidate processes for the manufacture of alumina from certain U. S. raw materials other than bauxite. This paper describes each process. Flow diagrams and the total energy requirement for each process are included. Important characteristics affecting the economics of producing alumina by each process are discussed, and some presently unsolved technical problems are identified. The extraction of alumina from clay via hydrochloric acid with iron separation by solvent extraction, and the crystallization of intermediate AlCl3·6H2O through the introduction of HCl gas into the pregnant mother liquor, appears to be technically feasible and the most attractive of the six raw material/process combinations.

  16. Service Contract Compliance Management in Business Process Management

    NASA Astrophysics Data System (ADS)

    El Kharbili, Marwane; Pulvermueller, Elke

    Compliance management is a critical concern for corporations, which are required to respect contracts. This concern is particularly relevant in the context of business process management (BPM), as this paradigm is being adopted more widely for designing and building IT systems. Enforcing contractual compliance needs to be modeled at the different levels of a BPM framework, which also include the service layer. In this paper, we discuss requirements and methods for modeling contractual compliance for an SOA-supported BPM. We also show how business rule management integrated into an industry BPM tool allows modeling and processing of functional and non-functional-property constraints which may be extracted from business process contracts. This work proposes a framework that responds to the requirements identified and an architecture implementing it. Our approach is also illustrated by an example.

  17. Ultrasonication aided in-situ transesterification of microbial lipids to biodiesel.

    PubMed

    Zhang, Xiaolei; Yan, Song; Tyagi, Rajeshwar Dayal; Surampalli, Rao Y; Valéro, Jose R

    2014-10-01

    In-situ transesterification of microbial lipids to biodiesel has received substantial attention because lipid extraction and transesterification can be conducted in a one-stage process. To improve the feasibility of in-situ transesterification, ultrasonication was employed to reduce the methanol requirement and reaction time. The results showed that the use of ultrasonication could achieve high conversion of lipid to FAMEs (92.1% w lipid conversion/w total lipids) with a methanol-to-lipid molar ratio of 60:1 and NaOH addition of 1% w/w lipid in 20 min, whereas a methanol-to-lipid molar ratio of 360:1, NaOH addition of 1% w/w lipid, and a reaction time of 12 h were required to obtain a similar yield without ultrasonication. The compositions of the FAMEs obtained with ultrasonication-aided in-situ transesterification were similar to those of the two-stage process of extraction followed by transesterification. Copyright © 2014. Published by Elsevier Ltd.

  18. Applying Intelligent Algorithms to Automate the Identification of Error Factors.

    PubMed

    Jin, Haizhe; Qu, Qingxing; Munechika, Masahiko; Sano, Masataka; Kajihara, Chisato; Duffy, Vincent G; Chen, Han

    2018-05-03

    Medical errors are the manifestation of defects occurring in medical processes, and extracting and identifying these defects as medical error factors is an effective approach to preventing medical errors. However, this is a difficult and time-consuming task that requires an analyst with a professional medical background; methods are needed to extract medical error factors and to reduce the difficulty of extraction. In this research, a systematic methodology to extract and identify error factors in the medical administration process was proposed. The design of the error report, extraction of the error factors, and identification of the error factors were analyzed. Based on 624 medical error cases across four medical institutes in Japan and China, 19 error-related items and their levels were extracted and then mapped to 12 error factors. The relational model between the error-related items and error factors was established using a genetic algorithm (GA)-back-propagation neural network (BPNN) model. Compared to BPNN, partial least squares regression, and support vector regression, GA-BPNN exhibited a higher overall prediction accuracy and could promptly identify the error factors from the error-related items. The combination of error-related items, their levels, and the GA-BPNN model is proposed as an error-factor identification technology that can automatically identify medical error factors.

  19. Xenopus egg extract: A powerful tool to study genome maintenance mechanisms.

    PubMed

    Hoogenboom, Wouter S; Klein Douwel, Daisy; Knipscheer, Puck

    2017-08-15

    DNA repair pathways are crucial to maintain the integrity of our genome and prevent genetic diseases such as cancer. There are many different types of DNA damage and specific DNA repair mechanisms have evolved to deal with these lesions. In addition to these repair pathways there is an extensive signaling network that regulates processes important for repair, such as cell cycle control and transcription. Despite extensive research, DNA damage repair and signaling are not fully understood. In vitro systems such as the Xenopus egg extract system, have played, and still play, an important role in deciphering the molecular details of these processes. Xenopus laevis egg extracts contain all factors required to efficiently perform DNA repair outside a cell, using mechanisms conserved in humans. These extracts have been used to study several genome maintenance pathways, including mismatch repair, non-homologous end joining, ICL repair, DNA damage checkpoint activation, and replication fork stability. Here we describe how the Xenopus egg extract system, in combination with specifically designed DNA templates, contributed to our detailed understanding of these pathways. Copyright © 2017. Published by Elsevier Inc.

  20. Detergent assisted lipid extraction from wet yeast biomass for biodiesel: A response surface methodology approach.

    PubMed

    Yellapu, Sravan Kumar; Bezawada, Jyothi; Kaur, Rajwinder; Kuttiraja, Mathiazhakan; Tyagi, Rajeshwar D

    2016-10-01

    Lipid extraction from microbial biomass is a tedious and costly process. In the present study, detergent-assisted lipid extraction from a culture of the yeast Yarrowia lipolytica SKY-7 was carried out. Response surface methodology (RSM) was used to investigate the effect of three principal parameters (N-LS concentration, time, and temperature) on microbial lipid extraction efficiency (% w/w). Statistical analysis showed that the quadratic model fits in all cases. Maximum lipid recovery of 95.3±0.3% w/w was obtained at the optimum level of the process variables [N-LS concentration 24.42 mg (equal to 48 mg N-LS/g dry biomass), treatment time 8.8 min, and reaction temperature 30.2°C], whereas conventional chloroform-methanol extraction required 12 h at 60°C to achieve total lipid recovery. The study confirmed that treating oleaginous yeast biomass with N-lauroyl sarcosine would be a promising approach for industrial-scale microbial lipid recovery. Copyright © 2016 Elsevier Ltd. All rights reserved.
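As a minimal sketch of the response-surface idea, a quadratic response in a single factor can be fitted through three design points and its stationary point located; the actual study fits a three-factor quadratic model, and the recovery-versus-time data below are invented for illustration.

```python
def quadratic_fit(pts):
    """Exact quadratic y = a + b*x + c*x**2 through three (x, y) points,
    built from Newton divided differences."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    d1 = (y2 - y1) / (x2 - x1)
    d2 = ((y3 - y2) / (x3 - x2) - d1) / (x3 - x1)
    c = d2
    b = d1 - d2 * (x1 + x2)
    a = y1 - b * x1 - c * x1 * x1
    return a, b, c

# Hypothetical lipid recovery (% w/w) vs. treatment time (min) at fixed N-LS dose
a, b, c = quadratic_fit([(4, 88.0), (8, 95.0), (12, 93.0)])
t_opt = -b / (2 * c)   # stationary point of the fitted parabola (optimum time)
```

For these invented points the fitted parabola opens downward (c < 0), so the stationary point is a maximum, which is the role the quadratic model plays in locating the RSM optimum.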

  1. Using decision-tree classifier systems to extract knowledge from databases

    NASA Technical Reports Server (NTRS)

    St.clair, D. C.; Sabharwal, C. L.; Hacke, Keith; Bond, W. E.

    1990-01-01

    One difficulty in applying artificial intelligence techniques to the solution of real world problems is that the development and maintenance of many AI systems, such as those used in diagnostics, require large amounts of human resources. At the same time, databases frequently exist which contain information about the process(es) of interest. Recently, efforts to reduce development and maintenance costs of AI systems have focused on using machine learning techniques to extract knowledge from existing databases. Research is described in the area of knowledge extraction using a class of machine learning techniques called decision-tree classifier systems. Results of this research suggest ways of performing knowledge extraction which may be applied in numerous situations. In addition, a measurement called the concept strength metric (CSM) is described which can be used to determine how well the resulting decision tree can differentiate between the concepts it has learned. The CSM can be used to determine whether or not additional knowledge needs to be extracted from the database. An experiment involving real world data is presented to illustrate the concepts described.
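A minimal sketch of decision-tree-style knowledge extraction from a database: choose the attribute whose split yields the highest information gain over the target concept, the standard splitting criterion in decision-tree classifier systems. The diagnostic records and attribute names below are hypothetical.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(records, attr, target):
    """Reduction in target entropy from splitting the records on attr."""
    base = entropy([r[target] for r in records])
    remainder = 0.0
    for value in {r[attr] for r in records}:
        subset = [r[target] for r in records if r[attr] == value]
        remainder += len(subset) / len(records) * entropy(subset)
    return base - remainder

# Hypothetical diagnostic database
records = [
    {"vibration": "high", "temp": "hot",  "fault": "yes"},
    {"vibration": "high", "temp": "cold", "fault": "yes"},
    {"vibration": "low",  "temp": "hot",  "fault": "no"},
    {"vibration": "low",  "temp": "cold", "fault": "no"},
]
best = max(["vibration", "temp"], key=lambda a: information_gain(records, a, "fault"))
```

Here "vibration" separates the fault classes perfectly (gain of 1 bit) while "temp" carries no information, so a tree built from this table would branch on "vibration" first; repeating the choice recursively on each subset yields the full decision tree.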

  2. Grinding and classification of pine bark for use as plywood adhesive filler

    Treesearch

    Thomas L. Eberhardt; Karen G. Reed

    2005-01-01

    Prior efforts to incorporate bark or bark extracts into composites have met with only limited success because of poor performance relative to existing products and/or economic barriers stemming from high levels of processing. We are currently investigating applications for southern yellow pine (SYP) bark that require intermediate levels of processing, one being the use...

  3. Lunar oxygen and metal for use in near-earth space - Magma electrolysis

    NASA Technical Reports Server (NTRS)

    Colson, Russell O.; Haskin, Larry A.

    1990-01-01

    The unique conditions on the moon, such as vacuum, absence of many reagents common on the earth, and presence of very nontraditional 'ores', suggest that a unique and nontraditional process for extracting materials from the ores may prove the most practical. An investigation has begun into unfluxed silicate electrolysis as a method for extracting oxygen, Fe, and Si from lunar regolith. The advantages of the process include simplicity of concept, absence of need to supply reagents from the earth, and low power and mass requirements for the processing plant. Disadvantages include the need for uninterrupted high temperature and the highly corrosive nature of the high-temperature silicate melts, which has made identifying suitable electrode and container materials difficult.

  4. Research activities on supercritical fluid science in food biotechnology.

    PubMed

    Khosravi-Darani, Kianoush

    2010-06-01

    This article serves as an overview, introducing the currently popular area of supercritical fluids and their uses in food biotechnology. Within each application, and wherever possible, the basic principles of the technique, as well as a description of the history, instrumentation, methodology, uses, problems encountered, and advantages over the traditional, non-supercritical methods are given. Most current commercial applications of supercritical extraction involve biologically-produced materials; the technique may be particularly relevant to the extraction of biological compounds in cases where there is a requirement for low-temperature processing, high mass-transfer rates, and negligible carry-over of solvent into the final product. Special applications to food processing include the decaffeination of green coffee beans, the production of hops extracts, the recovery of aromas and flavors from herbs and spices, the extraction and fractionation of edible oils, and the removal of contaminants, among others. New advances, in which the extraction is combined with reaction or crystallization steps, may further increase the attractiveness of supercritical fluids in the bioprocess industries. To develop and establish a novel and effective alternative to heat treatment, the lethal action of high hydrostatic pressure CO(2) on microorganisms, with no or only minimal heating, has recently received a great deal of attention.

  5. Integrated experimental and technoeconomic evaluation of two-stage Cu-catalyzed alkaline-oxidative pretreatment of hybrid poplar.

    PubMed

    Bhalla, Aditya; Fasahati, Peyman; Particka, Chrislyn A; Assad, Aline E; Stoklosa, Ryan J; Bansal, Namita; Semaan, Rachel; Saffron, Christopher M; Hodge, David B; Hegg, Eric L

    2018-01-01

    When applied to recalcitrant lignocellulosic feedstocks, multi-stage pretreatments can provide more processing flexibility to optimize or balance process outcomes such as increasing delignification, preserving hemicellulose, and maximizing enzymatic hydrolysis yields. We previously reported that adding an alkaline pre-extraction step to a copper-catalyzed alkaline hydrogen peroxide (Cu-AHP) pretreatment process resulted in improved sugar yields, but the process still utilized relatively high chemical inputs (catalyst and H2O2) and enzyme loadings. We hypothesized that by increasing the temperature of the alkaline pre-extraction step in water or ethanol, we could reduce the inputs required during Cu-AHP pretreatment and enzymatic hydrolysis without significant loss in sugar yield. We also performed technoeconomic analysis to determine if ethanol or water was the more cost-effective solvent during alkaline pre-extraction and if the expense associated with increasing the temperature was economically justified. After Cu-AHP pretreatment of 120 °C NaOH-H2O pre-extracted and 120 °C NaOH-EtOH pre-extracted biomass, approximately 1.4-fold more total lignin was solubilized (78% and 74%, respectively) compared to the 30 °C NaOH-H2O pre-extraction (55%) carried out in a previous study. Consequently, increasing the temperature of the alkaline pre-extraction step to 120 °C in both ethanol and water allowed us to decrease bipyridine and H2O2 during Cu-AHP and enzymes during hydrolysis with only a small reduction in sugar yields compared to 30 °C alkaline pre-extraction. Technoeconomic analysis indicated that 120 °C NaOH-H2O pre-extraction has the lowest installed ($246 million) and raw material ($175 million) costs compared to the other process configurations. We found that by increasing the temperature of the alkaline pre-extraction step, we could successfully lower the inputs for pretreatment and enzymatic hydrolysis. Based on sugar yields as well as capital, feedstock, and operating costs, 120 °C NaOH-H2O pre-extraction was superior to both 120 °C NaOH-EtOH and 30 °C NaOH-H2O pre-extraction.

  8. Kernel-Based Learning for Domain-Specific Relation Extraction

    NASA Astrophysics Data System (ADS)

    Basili, Roberto; Giannone, Cristina; Del Vescovo, Chiara; Moschitti, Alessandro; Naggar, Paolo

    In a specific process of business intelligence, i.e., investigation of organized crime, empirical language processing technologies can play a crucial role. The analysis of transcriptions of investigative activities, such as police interrogations, for the recognition and storage of complex relations among people and locations is a very difficult and time-consuming task, ultimately relying on pools of experts. We discuss here an inductive relation extraction platform that opens the way to much cheaper and more consistent workflows. The empirical investigation presented shows that accurate results, comparable to those of the expert teams, can be achieved, and that parametrization allows the system behavior to be fine-tuned to fit domain-specific requirements.

  9. Process development for elemental recovery from PGM tailings by thermochemical treatment: Preliminary major element extraction studies using ammonium sulphate as extracting agent.

    PubMed

    Mohamed, Sameera; van der Merwe, Elizabet M; Altermann, Wladyslaw; Doucet, Frédéric J

    2016-04-01

    Mine tailings can represent untapped secondary resources of non-ferrous, ferrous, precious, rare and trace metals. Continuous research is conducted to identify opportunities for the utilisation of these materials. This preliminary study investigated the possibility of extracting major elements from South African tailings associated with the mining of Platinum Group Metals (PGM) at the Two Rivers mine operations. These PGM tailings typically contain four major elements (11% Al2O3; 12% MgO; 22% Fe2O3; 34% Cr2O3), with lesser amounts of SiO2 (18%) and CaO (2%). Extraction was achieved via thermochemical treatment followed by aqueous dissolution, as an alternative to conventional hydrometallurgical processes. The thermochemical treatment step used ammonium sulphate, a widely available, low-cost, recyclable chemical agent. Quantification of the efficiency of the thermochemical process required the development and optimisation of the dissolution technique. Dissolution in water promoted the formation of secondary iron precipitates, which could be prevented by leaching thermochemically-treated tailings in 0.6 M HNO3 solution. The best extraction efficiencies were achieved for aluminium (ca. 60%) and calcium (ca. 80%). Iron (35%) and silicon (32%) were also extracted, alongside chromium (27%) and magnesium (25%). Thermochemical treatment using ammonium sulphate may therefore represent a promising technology for extracting valuable elements from PGM tailings, which could subsequently be converted to value-added products. However, it is not element-selective, and major elements were found to compete for the reagent to form water-soluble sulphate-metal species. Further development of this integrated process, which aims to realise the full utilisation potential of PGM tailings, is currently underway. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Effects of herbal ointment containing the leaf extracts of Madeira vine (Anredera cordifolia (Ten.) Steenis) for burn wound healing process on albino rats.

    PubMed

    Yuniarti, Wiwik Misaco; Lukiswanto, Bambang Sektiari

    2017-07-01

    Skin burns are a health problem requiring fast and accurate treatment; if not well treated, a burn will cause various damaging conditions for the patient. The leaf extract of Madeira vine (Anredera cordifolia (Ten.) Steenis), popularly known as Binahong in Indonesia, has been used to treat various diseases. The purpose of this research was to determine the effects of leaf extracts of Madeira vine (A. cordifolia (Ten.) Steenis) on the skin burn healing process in rats as an animal model. In this research, there were four treatment groups, G0, G1, G2, and G3, each consisting of five rats. All rats were given skin burns using hot metal plates. Sulfadiazine was then given to G0, 2.5% leaf extract of Madeira vine to G1, 5% extract to G2, and 10% extract to G3, topically, 3 times a day, for 14 consecutive days. At the end of the treatment period, skin excisions were conducted and histopathological examination was carried out. Microscopic observation of the wound healing process with respect to collagen deposition, polymorphonuclear infiltration, angiogenesis, and fibrosis showed that G2 differed significantly from G0, G1, and G3 (p<0.05), while group G0 was significantly different from G1 and G3 (p<0.05). The better burn healing in G2 is likely due to the activity of the flavonoids, saponins, and tannins contained in Madeira vine, which have antioxidant, anti-inflammatory, and antibacterial effects. The ointment prepared from the 5% leaf extract of Madeira vine (A. cordifolia (Ten.) Steenis) thus proved effective for topical burn therapy.

  11. The Combination Process for Preparative Separation and Purification of Paclitaxel and 10-Deacetylbaccatin III Using Diaion® Hp-20 Followed by Hydrophilic Interaction Based Solid Phase Extraction.

    PubMed

    Shirshekan, Mahsa; Rezadoost, Hassan; Javanbakht, Mehran; Ghassempour, Ali Reza

    2017-01-01

    No other naturally occurring defense agent against cancer has a stronger effect than paclitaxel, commonly known under the brand name Taxol®. The major drawback to more widespread use of paclitaxel and its precious precursor, 10-deacetylbaccatin III (10-DAB III), is that they require large-scale extraction from different parts of yew trees (Taxus species), cell cultures, taxane-producing endophytic fungi, and Corylus species. In our previous work, a novel online two-dimensional heart-cut liquid chromatography process using hydrophilic interaction/reversed-phase chromatography was used to introduce a semi-preparative treatment for the separation of polar (10-deacetylbaccatin III) and non-polar (paclitaxel) taxanes from Taxus baccata L. In this work, a combination of the absorbent Diaion® HP-20 and a silica-based solid phase extraction is utilized as a new, efficient, and cost-effective method for large-scale production of taxanes. This process avoids the technical problems of two-dimensional preparative liquid chromatography. The first stage of the process involves discarding co-extractive polar compounds including chlorophylls and pigments using a non-polar synthetic hydrophobic absorbent, Diaion® HP-20. The extract was then loaded onto a silica-based hydrophilic interaction solid phase extraction cartridge (silica, 40-60 micron). Taxanes were eluted using a mixture of water and methanol at the optimized ratio of 70:30. Finally, the fraction containing taxanes was applied to semi-preparative reversed-phase HPLC. The results revealed that, using this procedure, paclitaxel and 10-DAB III could be obtained in yields 8 and 3 times higher, respectively, than with the traditional method of extraction.

  12. Extraction of SelectSecure leads compared to conventional pacing leads in patients with congenital heart disease and congenital atrioventricular block.

    PubMed

    Shepherd, Emma; Stuart, Graham; Martin, Rob; Walsh, Mark A

    2015-06-01

    SelectSecure™ pacing leads (Medtronic Inc) are increasingly being used in pediatric patients and adults with structural congenital heart disease. The 4Fr lead is ideal for patients who may require lifelong pacing and can be advantageous for patients with complex anatomy. The purpose of this study was to compare the extraction of SelectSecure leads with conventional (stylet-driven) pacing leads in patients with structural congenital heart disease and congenital atrioventricular block. The data on lead extractions from pediatric and adult congenital heart disease (ACHD) patients from August 2004 to July 2014 at Bristol Royal Hospital for Children and the Bristol Heart Institute were reviewed. Multivariable regression analysis was used to determine whether conventional pacing leads were associated with a more difficult extraction process. A total of 57 patients underwent pacemaker lead extractions (22 SelectSecure, 35 conventional). No deaths occurred. Mean age at the time of extraction was 17.6 ± 10.5 years, mean weight was 47 ± 18 kg, and mean lead age was 5.6 ± 2.6 years (range 1-11 years). Complex extraction (partial extraction/femoral extraction) was more common in patients with conventional pacing leads at both univariate (P < .01) and multivariate (P = .04) levels. Lead age was also a significant predictor of complex extraction (P < .01). SelectSecure leads can be successfully extracted using the techniques used for conventional pacing leads. They are less likely to be partially extracted and less likely to require extraction via a femoral approach compared with conventional pacing leads. Copyright © 2015 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.
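The multivariable analysis above can be sketched as a logistic regression relating lead type and lead age to the odds of a complex extraction. The data below are hypothetical, invented purely to illustrate the technique; they are not the study's dataset:

```python
import math

# Hypothetical records: ((is_conventional_lead, lead_age_years), complex_extraction)
data = [
    ((1, 9.0), 1), ((1, 7.0), 1), ((1, 3.0), 0), ((1, 8.0), 1),
    ((0, 9.0), 1), ((0, 2.0), 0), ((0, 4.0), 0), ((0, 6.0), 0),
]

# Coefficients: intercept, lead-type effect, lead-age effect
w = [0.0, 0.0, 0.0]
lr = 0.05
for _ in range(5000):  # plain gradient-ascent fit of the log-likelihood
    for (conv, age), y in data:
        z = w[0] + w[1] * conv + w[2] * age
        p = 1.0 / (1.0 + math.exp(-z))  # predicted probability of complex extraction
        for j, xj in enumerate((1.0, conv, age)):
            w[j] += lr * (y - p) * xj

def risk(conv, age):
    """Predicted probability of a complex extraction for a given lead."""
    z = w[0] + w[1] * conv + w[2] * age
    return 1.0 / (1.0 + math.exp(-z))
```

On data shaped like the study's findings, both coefficients come out positive: an old conventional lead is predicted riskier than an old SelectSecure lead, and risk rises with lead age.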

  13. Transparent DNA/RNA Co-extraction Workflow Protocol Suitable for Inhibitor-Rich Environmental Samples That Focuses on Complete DNA Removal for Transcriptomic Analyses

    PubMed Central

    Lim, Natalie Y. N.; Roco, Constance A.; Frostegård, Åsa

    2016-01-01

    Adequate comparisons of DNA and cDNA libraries from complex environments require methods for co-extraction of DNA and RNA due to the inherent heterogeneity of such samples, or risk bias caused by variations in lysis and extraction efficiencies. Still, there are few methods and kits allowing simultaneous extraction of DNA and RNA from the same sample, and the existing ones generally require optimization. The proprietary nature of kit components, however, makes modifications of individual steps in the manufacturer’s recommended procedure difficult. Surprisingly, enzymatic treatments are often performed before purification procedures are complete, which we have identified here as a major problem when seeking efficient genomic DNA removal from RNA extracts. Here, we tested several DNA/RNA co-extraction commercial kits on inhibitor-rich soils, and compared them to a commonly used phenol-chloroform co-extraction method. Since none of the kits/methods co-extracted high-quality nucleic acid material, we optimized the extraction workflow by introducing small but important improvements. In particular, we illustrate the need for extensive purification prior to all enzymatic procedures, with special focus on the DNase digestion step in RNA extraction. These adjustments led to the removal of enzymatic inhibition in RNA extracts and made it possible to reduce genomic DNA to below detectable levels as determined by quantitative PCR. Notably, we confirmed that DNase digestion may not be uniform in replicate extraction reactions, thus the analysis of “representative samples” is insufficient. The modular nature of our workflow protocol allows optimization of individual steps. It also increases focus on additional purification procedures prior to enzymatic processes, in particular DNases, yielding genomic DNA-free RNA extracts suitable for metatranscriptomic analysis. PMID:27803690

  14. Modeling In-stream Tidal Energy Extraction and Its Potential Environmental Impacts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Zhaoqing; Wang, Taiping; Copping, Andrea

    In recent years, there has been growing interest in harnessing in-stream tidal energy in response to concerns of increasing energy demand and to mitigate climate change impacts. While many studies have been conducted to assess and map tidal energy resources, efforts for quantifying the associated potential environmental impacts have been limited. This paper presents the development of a tidal turbine module within a three-dimensional unstructured-grid coastal ocean model and its application for assessing the potential environmental impacts associated with tidal energy extraction. The model is used to investigate in-stream tidal energy extraction and associated impacts on estuarine hydrodynamic and biological processes in a tidally dominant estuary. A series of numerical experiments with varying numbers and configurations of turbines installed in an idealized estuary were carried out to assess the changes in the hydrodynamics and biological processes due to tidal energy extraction. Model results indicated that a large number of turbines are required to extract the maximum tidal energy and cause significant reduction of the volume flux. Preliminary model results also indicate that extraction of tidal energy increases vertical mixing and decreases flushing rate in a stratified estuary. The tidal turbine model was applied to simulate tidal energy extraction in Puget Sound, a large fjord-like estuary in the Pacific Northwest coast.
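In models of this kind, each turbine is commonly represented as a momentum sink, with extracted power scaling with the cube of the current speed. A back-of-envelope sketch of that scaling (the turbine diameter and power coefficient below are assumed illustrative values, not parameters from this study):

```python
import math

RHO = 1025.0  # nominal seawater density, kg/m^3

def turbine_power_w(u, diameter_m=20.0, cp=0.35):
    """Kinetic power extracted by one turbine at current speed u (m/s):
    P = 0.5 * rho * Cp * A * u^3 (actuator-disc approximation)."""
    area = math.pi * (diameter_m / 2.0) ** 2  # swept area, m^2
    return 0.5 * RHO * cp * area * u ** 3

def farm_power_mw(n_turbines, u):
    # Naive sum over turbines; ignores wake interactions and the flow
    # reduction that the abstract notes becomes important at large n.
    return n_turbines * turbine_power_w(u) / 1e6
```

Under these assumptions a single 20 m turbine at 2 m/s yields roughly 0.45 MW; the cubic dependence on u is why feedback of extraction on the flow field (and hence on flushing and mixing) matters in such simulations.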

  15. Optimization of a wet microalgal lipid extraction procedure for improved lipid recovery for biofuel and bioproduct production.

    PubMed

    Sathish, Ashik; Marlar, Tyler; Sims, Ronald C

    2015-10-01

    Methods to convert microalgal biomass to bio-based fuels and chemicals are limited by several processing and economic hurdles. Research conducted in this study modified and optimized a previously published procedure capable of extracting transesterifiable lipids from wet algal biomass. This optimization resulted in the extraction of 77% of the total transesterifiable lipids, while reducing the amount of materials and the temperature required in the procedure. In addition, characterization of the side streams generated demonstrated that: (1) the C/N ratio of the residual, lipid-extracted (LE) biomass increased to 54.6 versus 10.1 for the original biomass, (2) the aqueous phase generated contains nitrogen, phosphorous, and carbon, and (3) the solid precipitate phase was composed of up to 11.2 wt% nitrogen (70% protein). The ability to isolate algal lipids and the possibility of utilizing the generated side streams as products and/or feedstock material for downstream processes helps promote the algal biorefinery concept. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Optimization of microwave-assisted extraction of polyphenols from Myrtus communis L. leaves.

    PubMed

    Dahmoune, Farid; Nayak, Balunkeswar; Moussi, Kamal; Remini, Hocine; Madani, Khodir

    2015-01-01

    Phytochemicals, such as phenolic compounds, are of great interest due to their health-benefitting antioxidant properties and possible protection against inflammation, cardiovascular diseases and certain types of cancer. Maximum retention of these phytochemicals during extraction requires optimised process parameter conditions. A microwave-assisted extraction (MAE) method was investigated for extraction of total phenolics from Myrtus communis leaves. The total phenolic capacity (TPC) of leaf extracts at optimised MAE conditions was compared with ultrasound-assisted extraction (UAE) and conventional solvent extraction (CSE). The influence of extraction parameters, including ethanol concentration, microwave power, irradiation time and solvent-to-solid ratio, on the extraction of TPC was modeled using a second-order regression equation. The optimal MAE conditions were 42% ethanol concentration, 500 W microwave power, 62 s irradiation time and a 32 mL/g solvent-to-material ratio. Ethanol concentration and liquid-to-solid ratio were the significant parameters for the extraction process (p<0.01). Under the optimised MAE conditions, the recovery of TPC was 162.49 ± 16.95 mg gallic acid equivalent/g dry weight (GAE/g DW), approximating the predicted content (166.13 mg GAE/g DW). When bioactive phytochemicals extracted from Myrtus leaves using MAE were compared with those from UAE and CSE, tannins (32.65 ± 0.01 mg/g), total flavonoids (5.02 ± 0.05 mg QE/g) and antioxidant activities (38.20 ± 1.08 μg GAE/mL) were higher in the MAE extracts than in the other two. These findings further illustrate that extraction of bioactive phytochemicals from plant materials using the MAE method consumes less extraction solvent and saves time. Copyright © 2014 Elsevier Ltd. All rights reserved.
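The second-order regression approach above can be illustrated with a small response-surface fit. The quadratic toy response below is invented so that its maximum sits at the reported optimum (42% ethanol, 62 s irradiation); the point is how a fitted second-order model yields a stationary point, not the real data:

```python
import numpy as np

def toy_tpc(ethanol, time_s):
    """Invented response surface with a known maximum at 42 % ethanol, 62 s."""
    return 166.0 - 0.02 * (ethanol - 42.0) ** 2 - 0.005 * (time_s - 62.0) ** 2

# Factorial-style design points over the two factors
pts = [(e, t) for e in (20, 30, 42, 50, 60) for t in (30, 45, 62, 80, 95)]
X = np.array([[1, e, t, e * e, t * t, e * t] for e, t in pts], float)
y = np.array([toy_tpc(e, t) for e, t in pts])

# Least-squares fit of y = b0 + b1*e + b2*t + b11*e^2 + b22*t^2 + b12*e*t
b = np.linalg.lstsq(X, y, rcond=None)[0]

# Stationary point: solve dy/de = 0 and dy/dt = 0
A = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
opt = np.linalg.solve(A, [-b[1], -b[2]])  # recovers (42, 62) on this toy surface
```

In a real study the design points come from a central composite or Box-Behnken design and the response is measured, but the algebra for locating the optimum is the same.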

  17. 10 CFR 51.22 - Criterion for categorical exclusion; identification of licensing and regulatory actions eligible...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... requirements. (11) Issuance of amendments to licenses for fuel cycle plants and radioactive waste disposal... licensees, except processing of source material for extraction of rare earth and other metals. (xiv) Nuclear...

  18. Use of a commercial wind SODAR for measuring wake vortices

    DOT National Transportation Integrated Search

    2006-05-01

    This paper describes the application of a commercial wind SODAR (SOnic Detection : And Ranging) to the measurement of aircraft wake vortices. Changes in data collection and : processing were required to extract vortex location and circulation from th...

  19. Assessment of desalination technologies for treatment of a highly saline brine from a potential CO2 storage site

    DOE PAGES

    Kaplan, Ruth; Mamrosh, Darryl; Salih, Hafiz H.; ...

    2016-11-12

    Brine extraction is a promising strategy for the management of increased reservoir pressure resulting from carbon dioxide (CO2) injection in deep saline reservoirs. The extracted brines usually have high concentrations of total dissolved solids (TDS) and various contaminants, and require proper disposal or treatment. In this article, by first conducting a critical review, we evaluate the applicability, limits, and advantages or challenges of various commercially available and emerging desalination technologies that can potentially be employed to treat highly saline brines (with TDS values >70,000 ppm) and those that are applicable to the ~200,000 ppm TDS brine extracted from the Mt. Simon Sandstone, a potential CO2 storage site in Illinois, USA. Based on a side-by-side comparison of technologies, evaporators are selected as the most suitable existing technology for treating Mt. Simon brine. Process simulations are then conducted for a conceptual design for desalination of 454 m3/h (2000 gpm) of pretreated brine to near-zero liquid discharge by multi-effect evaporators. The thermal energy demand is estimated at 246 kWh per m3 of recovered water, of which 212 kWh/m3 is required for multiple-effect evaporation and the remainder for salt drying. The process also requires additional electrical power of ~2 kWh/m3.
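The reported energy figures can be cross-checked with simple arithmetic. The hourly-sizing number below rests on a hypothetical simplification (that the full 454 m3/h stream is recovered as water, which overstates the actual recovery fraction):

```python
# Per-m^3-of-recovered-water figures quoted in the abstract
thermal_total_kwh = 246.0   # total thermal demand
evaporation_kwh = 212.0     # multiple-effect evaporation share
electrical_kwh = 2.0        # additional electrical demand

# Remainder of the thermal demand attributed to salt drying
salt_drying_kwh = thermal_total_kwh - evaporation_kwh  # 34 kWh/m^3

# Hypothetical upper-bound thermal sizing at the simulated 454 m^3/h throughput
thermal_demand_mw = 454.0 * thermal_total_kwh / 1000.0  # ~112 MW thermal
```

The roughly 100:1 ratio of thermal to electrical demand per cubic meter is why evaporator-based treatment of such brines is dominated by heat supply rather than pumping power.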

  20. Improved tandem mass spectrometry (MS/MS) derivatized method for the detection of tyrosinemia type I, amino acids and acylcarnitine disorders using a single extraction process.

    PubMed

    Dhillon, Kuldeep S; Bhandal, Ajit S; Aznar, Constantino P; Lorey, Fred W; Neogi, Partha

    2011-05-12

    Succinylacetone (SUAC), a specific marker for tyrosinemia type I (Tyr I), cannot be detected by the routine LC-MS/MS screening of amino acids (AA) and acylcarnitines (AC) in newborns. The current derivatized methods require double extraction of newborn dried blood spots (DBS): one for AA and AC, and a second for SUAC from the blood spot left after the first extraction. We have developed a method in which AA, AC and SUAC are extracted in a single extraction, resulting in a significant reduction in labor and assay time. The 3.2 mm DBS were extracted by incubating at 45 °C for 45 min with 100 μl of an acetonitrile (ACN)-water-formic acid mixture containing hydrazine and stable-isotope labeled internal standards of AA, AC and SUAC. The extract was derivatized with n-butanolic HCl and analyzed by LC-MS/MS. The average inter-assay CVs for AA, AC and SUAC were 10.1, 10.8 and 7.1%, respectively. The extraction of analytes with the ACN-water mixture showed no significant difference in recovery compared to the commonly used solvent MeOH. The concentration of hydrazine had a considerable impact on SUAC extraction. We developed a new MS/MS derivatized method to detect AA/AC/SUAC in a single extraction process for screening Tyr I along with disorders of AA and AC. Published by Elsevier B.V.
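The inter-assay CVs quoted above are coefficient-of-variation figures: the standard deviation expressed as a percentage of the mean across repeated runs. A minimal sketch, with made-up replicate values:

```python
import statistics

def inter_assay_cv(values):
    """Coefficient of variation (%) across repeated assay runs."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

# Hypothetical SUAC results (umol/L) from five independent assay runs
suac_runs = [4.8, 5.1, 4.9, 5.3, 4.9]
cv_percent = inter_assay_cv(suac_runs)  # ~4 %
```

A CV in the 7-11% range, as reported, is a typical acceptance level for screening assays of this type.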

  1. The LabTube - a novel microfluidic platform for assay automation in laboratory centrifuges.

    PubMed

    Kloke, A; Fiebach, A R; Zhang, S; Drechsel, L; Niekrawietz, S; Hoehl, M M; Kneusel, R; Panthel, K; Steigert, J; von Stetten, F; Zengerle, R; Paust, N

    2014-05-07

    Assay automation is the key to successful transformation of modern biotechnology into routine workflows. Yet, it requires considerable investment in processing devices and auxiliary infrastructure, which is not cost-efficient for laboratories with low or medium sample throughput or for point-of-care testing. To close this gap, we present the LabTube platform, which is based on assay-specific disposable cartridges for processing in laboratory centrifuges. LabTube cartridges comprise interfaces for sample loading and downstream applications and fluidic unit operations for release of prestored reagents, mixing, and solid phase extraction. Process control is achieved by a centrifugally-actuated ballpen mechanism. To demonstrate the workflow and functionality of the LabTube platform, we show two LabTube-automated sample preparation assays from laboratory routines: DNA extraction from whole blood and purification of His-tagged proteins. Equal DNA and protein yields were observed compared to manual reference runs, while LabTube automation significantly reduced the hands-on time to one minute per extraction.

  2. Combined Enzymatic and Mechanical Cell Disruption and Lipid Extraction of Green Alga Neochloris oleoabundans

    PubMed Central

    Wang, Dongqin; Li, Yanqun; Hu, Xueqiong; Su, Weimin; Zhong, Min

    2015-01-01

    Microalgal biodiesel is one of the most promising renewable fuels. The wet technique for lipid extraction has advantages over the dry method, such as energy savings and a shorter procedure. Cell disruption is a key factor in wet oil extraction, facilitating the release of intracellular oil. Ultrasonication, high-pressure homogenization, enzymatic hydrolysis, and the combination of enzymatic hydrolysis with high-pressure homogenization and ultrasonication were employed in this study to disrupt the cells of the microalga Neochloris oleoabundans. The degree of cell disruption was investigated. The cell morphology before and after disruption was assessed with scanning and transmission electron microscopy. The energy requirements and the operating cost for wet cell disruption were also estimated. The highest disruption degree, up to 95.41% as assessed by the counting method, was achieved by the combination of enzymatic hydrolysis and high-pressure homogenization. A lipid recovery of 92.6% was also obtained by the combined process. The combined process was found to be more efficient and economical compared with the individual processes. PMID:25853267

  3. Mobil lube dewaxing technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, C.L.; McGuiness, M.P.

    1995-09-01

    Currently, the lube refining industry is in a period of transition, with both hydroprocessing and catalytic dewaxing gathering momentum as replacements for solvent extraction and solvent dewaxing. In addition, lube product quality requirements have been increasing, both in the US and abroad. Mobil has developed a broad array of catalytic dewaxing technologies which can serve refiners throughout the stages of this transition. In the future, lube feedstocks which vary in source and wax content will become increasingly important, requiring an optimized system for highest performance. The Mobil Lube Dewaxing (MLDW) process is the workhorse of the catalytic dewaxing technologies, being a robust, low-cost technology suitable for both solvent-extracted and hydrocracked feeds. The Mobil Selective Dewaxing (MSDW) process has been recently introduced in response to the growth of hydroprocessing. MSDW requires either severely hydrotreated or hydrocracked feeds and provides improved lube yields and VI. For refiners with hydrocrackers and solvent dewaxing units, Mobil Wax Isomerization (MWI) technology can make higher-VI base stocks to meet the growing demand for very high quality lube products. A review of these three technologies is presented in this paper.

  4. [Selected adjuvants as carriers of a dry extract of common ivy (Hedera helix L.)].

    PubMed

    Marczyński, Zbigniew; Zgoda, Marian Mikołaj; Bodek, Kazimiera Henryka

    2011-01-01

    The usefulness of selected adjuvants (Vivapur 112, Carmellose calcium, Calcium carbonate CA 740, Calcium carbonate CA 800, Hypromellose) as carriers of a dry extract of common ivy (Hedera helix L.) leaves was tested in the process of direct tableting. The quality of the produced tablets was determined by examining their appearance, diameter, thickness, mass, resistance to abrasion and crushing, and disintegration time. Furthermore, the rate of release of biologically active components from the produced drug form into acceptor fluid was tested in accordance with the requirements of Polish Pharmacopoeia VII (PP VII). An attempt was made to estimate the effect of the adjuvants used on the course of this process. The applied adjuvants and the acceptor fluid osmolarity significantly determine the pharmaceutical availability of the therapeutic agents contained in the extract. The obtained model tablets are characterized by controlled release of biologically active substances and in the majority of batches fulfil the physicochemical requirements. The formulation composition of the first batch (Extr. Hederae helices e fol. spir. sicc., Vivapur 112, Carmellose calcium, Sodium Stearyl Fumarate) appeared to be the most effective. The worked-out method is optimal and provides technological reproducibility and high durability of the drug form.

  5. Natural Language Processing in Radiology: A Systematic Review.

    PubMed

    Pons, Ewoud; Braun, Loes M M; Hunink, M G Myriam; Kors, Jan A

    2016-05-01

    Radiological reporting has generated large quantities of digital content within the electronic health record, which is potentially a valuable source of information for improving clinical care and supporting research. Although radiology reports are stored for communication and documentation of diagnostic imaging, harnessing their potential requires efficient and automated information extraction: they exist mainly as free-text clinical narrative, from which it is a major challenge to obtain structured data. Natural language processing (NLP) provides techniques that aid the conversion of text into a structured representation, and thus enables computers to derive meaning from human (ie, natural language) input. Used on radiology reports, NLP techniques enable automatic identification and extraction of information. By exploring the various purposes for their use, this review examines how radiology benefits from NLP. A systematic literature search identified 67 relevant publications describing NLP methods that support practical applications in radiology. This review takes a close look at the individual studies in terms of tasks (ie, the extracted information), the NLP methodology and tools used, and their application purpose and performance results. Additionally, limitations, future challenges, and requirements for advancing NLP in radiology will be discussed. © RSNA, 2016. Online supplemental material is available for this article.

  6. Optimization of acidic extraction of astaxanthin from Phaffia rhodozyma *

    PubMed Central

    Ni, Hui; Chen, Qi-he; He, Guo-qing; Wu, Guang-bin; Yang, Yuan-fan

    2008-01-01

    Optimization of a process for extracting astaxanthin from Phaffia rhodozyma by an acidic method was investigated with regard to several extraction factors such as acids, organic solvents, temperature and time. Fractional factorial design, central composite design and response surface methodology were used to derive a statistically optimal model, which corresponded to the following optimal conditions: concentration of lactic acid of 5.55 mol/L, ratio of ethanol to yeast dry weight of 20.25 ml/g, temperature for cell disruption of 30 °C, and extraction time of 3 min. Under these conditions, astaxanthin and total carotenoids could be extracted in amounts of 1294.7 μg/g and 1516.0 μg/g, respectively. This acidic method has advantages such as high extraction efficiency, low chemical toxicity and no special instrument requirements. Therefore, it might be a more feasible and practical method for industrial practice. PMID:18196613

  7. Method and Apparatus for Automated Isolation of Nucleic Acids from Small Cell Samples

    NASA Technical Reports Server (NTRS)

    Sundaram, Shivshankar; Prabhakarpandian, Balabhaskar; Pant, Kapil; Wang, Yi

    2014-01-01

    RNA isolation is a ubiquitous need, driven by current emphasis on microarrays and miniaturization. With commercial systems requiring 100,000 to 1,000,000 cells for successful isolation, there is a growing need for a small-footprint, easy-to-use device that can harvest nucleic acids from much smaller cell samples (1,000 to 10,000 cells). The process of extraction of RNA from cell cultures is a complex, multi-step one, and requires timed, asynchronous operations with multiple reagents/buffers. An added complexity is the fragility of RNA (subject to degradation) and its reactivity to surfaces. A novel, microfluidics-based, integrated cartridge has been developed that can fully automate the complex process of RNA isolation (lysing, capturing, and eluting RNA) from small cell culture samples. On-cartridge cell lysis is achieved using either reagents or high-strength electric fields made possible by the miniaturized format. Traditionally, silica-based, porous-membrane formats have been used for RNA capture, requiring slow perfusion for effective capture. In this design, high-efficiency capture and elution are achieved using a microsphere-based "microfluidized" format. Electrokinetic phenomena are harnessed to actively mix microspheres with the cell lysate and capture/elution buffer, providing important advantages in extraction efficiency, processing time, and operational flexibility. Successful RNA isolation was demonstrated using both suspension (HL-60) and adherent (BHK-21) cells. Novel features associated with this development are twofold. First, novel designs that execute needed processes with improved speed and efficiency were developed. These primarily encompass electric-field-driven lysis of cells. 
The configurations include electrode-containing constructs, or an "electrode-less" chip design, which is easy to fabricate and mitigates fouling at the electrode surface; and the "fluidized" extraction format based on electrokinetically assisted mixing and contacting of microbeads in a shape-optimized chamber. A second proprietary feature is the particular layout integrating these components to perform the desired operation of RNA isolation. Apart from a novel functional capability, advantages of the innovation include reduced or eliminated use of toxic reagents, and operator-independent extraction of RNA.

  8. Lipid recovery from wet oleaginous microbial biomass for biofuel production: A critical review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong, Tao; Knoshaug, Eric P.; Pienkos, Philip T.

    Biological lipids derived from oleaginous microorganisms are promising precursors for renewable biofuel production. Direct lipid extraction from wet cell biomass is favored because it eliminates the need for costly dehydration. However, the development of a practical and scalable process for extracting lipids from wet cell biomass is far from ready to be commercialized, and instead requires intensive research and development to understand lipid accessibility and mass-transfer mechanisms and to establish robust lipid extraction approaches that are practical for industrial applications. This paper presents a critical review of lipid recovery in the context of biofuel production, with special attention to cell disruption and lipid mass transfer to support extraction from wet biomass.

  10. Microencapsulation by solvent extraction/evaporation: reviewing the state of the art of microsphere preparation process technology.

    PubMed

    Freitas, Sergio; Merkle, Hans P; Gander, Bruno

    2005-02-02

    The therapeutic benefit of microencapsulated drugs and vaccines brought forth the need to prepare such particles in larger quantities and in a quality suitable for clinical trials and commercialisation. Very commonly, microencapsulation processes are based on the principle of so-called "solvent extraction/evaporation". While initial lab-scale experiments are frequently performed in simple beaker/stirrer setups, clinical trials and market introduction require more sophisticated technologies allowing for economic, robust, well-controllable and aseptic production of microspheres. To this end, various technologies have been examined for microsphere preparation, among them static mixing, extrusion through needles, membranes and microfabricated microchannel devices, dripping using electrostatic forces, and ultrasonic jet excitation. This article reviews the current state of the art in solvent extraction/evaporation-based microencapsulation technologies. Its focus is on process-related aspects as described in the scientific and patent literature. Our findings are outlined according to the four major substeps of microsphere preparation by solvent extraction/evaporation, namely (i) incorporation of the bioactive compound, (ii) formation of the microdroplets, (iii) solvent removal, and (iv) harvesting and drying the particles. Both well-established and more advanced technologies are reviewed.

  11. The chemistry of TALSPEAK: A review of the science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nash, Kenneth L.

    Here, the TALSPEAK Process (Trivalent Actinide Lanthanide Separation with Phosphorus-reagent Extraction from Aqueous Komplexes) was originally developed at Oak Ridge National Laboratory by B. Weaver and F.A. Kappelmann in the 1960s. It was envisioned initially as an alternative to the TRAMEX process (selective extraction of trivalent actinides by tertiary or quaternary amines over fission product lanthanides from concentrated LiCl solutions). TALSPEAK proposed the selective extraction of trivalent lanthanides away from the actinides, which are retained in the aqueous phase as aminopolycarboxylate complexes. After several decades of research and development, the conventional TALSPEAK process (based on di-(2-ethylhexyl) phosphoric acid (extractant) in 1,4-di-isopropylbenzene (diluent) and a concentrated lactate buffer containing diethylenetriamine-N,N,N',N",N"-pentaacetic acid (actinide-selective holdback reagent)) has become a widely recognized benchmark for advanced aqueous partitioning of the trivalent 4f/5f elements. TALSPEAK chemistry has also been utilized as an actinide-selective stripping agent (Reverse TALSPEAK) with some notable success. Under ideal conditions, conventional TALSPEAK separates Am3+ from Nd3+ (the usual limiting pair) with a single-stage separation factor of about 100; both lighter and heavier lanthanides are more completely separated from Am3+. Despite this apparent efficiency, TALSPEAK has not seen enthusiastic adoption for advanced partitioning of nuclear fuels at process scale for two principal reasons: 1) all adaptations of TALSPEAK chemistry to process-scale applications require rigid pH control within a narrow range of pH, and 2) phase transfer kinetics are often slower than ideal. To compensate for these effects, high concentrations of the buffer (0.5-2 M H/Na lactate) are required.
Acknowledgement of these complications in TALSPEAK process development has inspired significant research activities dedicated to improving understanding of the basic chemistry that controls TALSPEAK (and related processes based on the application of actinide-selective holdback reagents). In the following report, advances in understanding of the fundamental chemistry of TALSPEAK that have occurred during the past decade will be reviewed and discussed.
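The single-stage separation factor quoted above is the ratio of the two metals' distribution ratios, SF(Nd/Am) = D_Nd / D_Am, where D = [M]organic / [M]aqueous at equilibrium. A minimal numeric sketch; the D values below are hypothetical, chosen only to reproduce SF of about 100:

```python
def distribution_ratio(org_conc, aq_conc):
    """D = [M]_organic / [M]_aqueous at equilibrium (any consistent units)."""
    return org_conc / aq_conc

# Hypothetical equilibrium concentrations, not data from the paper:
D_Nd = distribution_ratio(10.0, 1.0)   # lanthanide, preferentially extracted
D_Am = distribution_ratio(0.1, 1.0)    # actinide, held back in the aqueous phase

sf = D_Nd / D_Am                       # single-stage separation factor, ~100
```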

  12. The chemistry of TALSPEAK: A review of the science

    DOE PAGES

    Nash, Kenneth L.

    2014-11-13

    Here, the TALSPEAK Process (Trivalent Actinide Lanthanide Separation with Phosphorus-reagent Extraction from Aqueous Komplexes) was originally developed at Oak Ridge National Laboratory by B. Weaver and F.A. Kappelmann in the 1960s. It was envisioned initially as an alternative to the TRAMEX process (selective extraction of trivalent actinides by tertiary or quaternary amines over fission product lanthanides from concentrated LiCl solutions). TALSPEAK proposed the selective extraction of trivalent lanthanides away from the actinides, which are retained in the aqueous phase as aminopolycarboxylate complexes. After several decades of research and development, the conventional TALSPEAK process (based on di-(2-ethylhexyl) phosphoric acid (extractant) in 1,4-di-isopropylbenzene (diluent) and a concentrated lactate buffer containing diethylenetriamine-N,N,N',N",N"-pentaacetic acid (actinide-selective holdback reagent)) has become a widely recognized benchmark for advanced aqueous partitioning of the trivalent 4f/5f elements. TALSPEAK chemistry has also been utilized as an actinide-selective stripping agent (Reverse TALSPEAK) with some notable success. Under ideal conditions, conventional TALSPEAK separates Am3+ from Nd3+ (the usual limiting pair) with a single-stage separation factor of about 100; both lighter and heavier lanthanides are more completely separated from Am3+. Despite this apparent efficiency, TALSPEAK has not seen enthusiastic adoption for advanced partitioning of nuclear fuels at process scale for two principal reasons: 1) all adaptations of TALSPEAK chemistry to process-scale applications require rigid pH control within a narrow range of pH, and 2) phase transfer kinetics are often slower than ideal. To compensate for these effects, high concentrations of the buffer (0.5-2 M H/Na lactate) are required.
Acknowledgement of these complications in TALSPEAK process development has inspired significant research activities dedicated to improving understanding of the basic chemistry that controls TALSPEAK (and related processes based on the application of actinide-selective holdback reagents). In the following report, advances in understanding of the fundamental chemistry of TALSPEAK that have occurred during the past decade will be reviewed and discussed.

  13. Event-driven processing for hardware-efficient neural spike sorting

    NASA Astrophysics Data System (ADS)

    Liu, Yan; Pereira, João L.; Constandinou, Timothy G.

    2018-02-01

    Objective. The prospect of real-time, on-node spike sorting provides a genuine opportunity to push the envelope of large-scale integrated neural recording systems, in which hardware resources, power requirements and data bandwidth increase linearly with channel count. Event-based (or data-driven) processing provides an efficient new means for hardware implementation that is completely activity-dependent. In this work, we investigate using continuous-time level-crossing sampling for efficient data representation and subsequent spike processing. Approach. (1) We first compare signals (synthetic neural datasets) encoded with this technique against conventional sampling. (2) We then show how such a representation can be directly exploited by extracting simple time-domain features from the bitstream to perform neural spike sorting. (3) The proposed method is implemented on a low-power FPGA platform to demonstrate its hardware viability. Main results. Considerably lower data rates are achievable when using 7 bits or fewer to represent the signals, whilst maintaining signal fidelity. Results obtained using both MATLAB and reconfigurable logic hardware (FPGA) indicate that feature extraction and spike sorting can be performed with accuracy comparable to or better than reference methods whilst requiring relatively few hardware resources. Significance. By effectively exploiting continuous-time data representation, neural signal processing can be achieved in a completely event-driven manner, reducing both the required resources (memory, complexity) and computations (operations). This will allow future large-scale neural systems to integrate on-node processing in real-time hardware.
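Level-crossing sampling, the encoding investigated above, emits an event only when the signal crosses one of a set of fixed amplitude levels, so the data rate tracks neural activity rather than elapsed time. A rough sketch on already-discretized data; the function name and the `delta` step are illustrative, not taken from the paper:

```python
import numpy as np

def level_crossing_encode(signal, delta):
    """Encode a signal as (sample_index, direction) events, one event per
    crossing of an amplitude level spaced `delta` apart. A sketch of
    level-crossing sampling applied to discrete samples."""
    events = []
    last_level = signal[0]
    for i, x in enumerate(signal):
        while x - last_level >= delta:       # upward crossing(s)
            last_level += delta
            events.append((i, +1))
        while last_level - x >= delta:       # downward crossing(s)
            last_level -= delta
            events.append((i, -1))
    return events

# Event count scales with activity, not duration: a flat trace yields no events,
# while a brief spike-like excursion produces a short burst of events.
spike = np.concatenate([np.zeros(5), np.array([0.0, 1.0, 2.0, 1.0, 0.0]), np.zeros(5)])
ev = level_crossing_encode(spike, delta=1.0)   # four events around the spike
```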

  14. FPGA-based real time processing of the Plenoptic Wavefront Sensor

    NASA Astrophysics Data System (ADS)

    Rodríguez-Ramos, L. F.; Marín, Y.; Díaz, J. J.; Piqueras, J.; García-Jiménez, J.; Rodríguez-Ramos, J. M.

    The plenoptic wavefront sensor combines measurements at the pupil and image planes in order to obtain wavefront information simultaneously from different points of view, and is capable of sampling the volume above the telescope to extract tomographic information about the atmospheric turbulence. The advantages of this sensor are presented elsewhere at this conference (José M. Rodríguez-Ramos et al.). This paper concentrates on the processing required for pupil-plane phase recovery and its computation in real time using FPGAs (Field Programmable Gate Arrays). This technology eases the implementation of massively parallel processing and allows tailoring the system to the requirements, maintaining flexibility, speed and cost figures.

  15. Maximizing carotenoid extraction from microalgae used as food additives and determined by liquid chromatography (HPLC).

    PubMed

    Cerón-García, M C; González-López, C V; Camacho-Rodríguez, J; López-Rosales, L; García-Camacho, F; Molina-Grima, E

    2018-08-15

    Microalgae are an interesting source of natural pigments that have valuable applications. However, further research is necessary to develop processes that allow us to achieve high levels of carotenoid recovery while avoiding degradation. This work presents a comprehensive study on the recovery of carotenoids from several microalgae genera, optimizing carotenoid extraction using alkaline saponification at various temperatures and KOH concentrations. Results show that I. galbana requires a temperature of 60 °C and <10% KOH, N. gaditana and K. veneficum require 60 °C and no saponification, P. reticulatum requires 40 °C and 10% KOH, T. suecica and H. pluvialis require 25 °C and 40% KOH while C. sp. and S. almeriensis require 80 °C and 40% KOH. The influence of the solvent on carotenoid recovery was also studied. In general terms, an ethanol:hexane:water (77:17:6 v/v/v) mixture results in good yields. Copyright © 2018 Elsevier Ltd. All rights reserved.

  16. Representation control increases task efficiency in complex graphical representations.

    PubMed

    Moritz, Julia; Meyerhoff, Hauke S; Meyer-Dernbecher, Claudia; Schwan, Stephan

    2018-01-01

    In complex graphical representations, the relevant information for a specific task is often distributed across multiple spatial locations. In such situations, understanding the representation requires internal transformation processes in order to extract the relevant information. However, digital technology enables observers to alter the spatial arrangement of depicted information and therefore to offload the transformation processes. The objective of this study was to investigate the use of such representation control (i.e. the users' option to decide how information should be displayed) in accomplishing an information extraction task, in terms of solution time and accuracy. In the representation control condition, the participants were allowed to reorganize the graphical representation and reduce information density. In the control condition, no interactive features were offered. We observed that participants in the representation control condition solved tasks that required reorganization of the maps faster and more accurately than participants without representation control. The present findings demonstrate how processes of cognitive offloading, spatial contiguity, and information coherence interact in knowledge media intended for broad and diverse groups of recipients.

  17. A Viable Scheme for Elemental Extraction and Purification Using In-Situ Planetary Resources

    NASA Technical Reports Server (NTRS)

    Sen, S.; Schofield, E.; ODell, S.; Ray, C. S.

    2005-01-01

    NASA's new strategic direction includes establishing a self-sufficient, affordable and safe human and robotic presence outside low earth orbit. Some of the items required for a self-sufficient extra-terrestrial habitat will include materials for power generation (e.g. Si for solar cells) and habitat construction (e.g. Al, Fe, and Ti). In this paper we present a viable elemental extraction and refining process from in-situ regolith that would be optimally continuous, robotically automated, and require a minimum of astronaut supervision and containment facilities. The approach is based on using a concentrated heat source and translating sample geometry to enable simultaneous oxide reduction and elemental refining. Preliminary results will be presented to demonstrate that the proposed zone refining process is capable of segregating or refining important elements such as Si (for solar cell fabrication) and Fe (for habitat construction). A conceptual scheme will be presented whereby such a process could be supported by use of solar energy and a precursor robotic mission on the surface of the moon.

  18. Representation control increases task efficiency in complex graphical representations

    PubMed Central

    Meyerhoff, Hauke S.; Meyer-Dernbecher, Claudia; Schwan, Stephan

    2018-01-01

    In complex graphical representations, the relevant information for a specific task is often distributed across multiple spatial locations. In such situations, understanding the representation requires internal transformation processes in order to extract the relevant information. However, digital technology enables observers to alter the spatial arrangement of depicted information and therefore to offload the transformation processes. The objective of this study was to investigate the use of such representation control (i.e. the users' option to decide how information should be displayed) in accomplishing an information extraction task, in terms of solution time and accuracy. In the representation control condition, the participants were allowed to reorganize the graphical representation and reduce information density. In the control condition, no interactive features were offered. We observed that participants in the representation control condition solved tasks that required reorganization of the maps faster and more accurately than participants without representation control. The present findings demonstrate how processes of cognitive offloading, spatial contiguity, and information coherence interact in knowledge media intended for broad and diverse groups of recipients. PMID:29698443

  19. Mass transfer coefficient in ginger oil extraction by microwave hydrotropic solution

    NASA Astrophysics Data System (ADS)

    Handayani, Dwi; Ikhsan, Diyono; Yulianto, Mohamad Endy; Dwisukma, Mandy Ayulia

    2015-12-01

    This research aims to obtain mass transfer coefficient data for the extraction of ginger oil using a microwave-heated hydrotropic solvent as an alternative route to increasing the zingiberene content. The innovation of this study is extraction with microwave heating and a hydrotropic solvent, which can shift the phase equilibrium, increase the rate of the extraction process and improve the zingiberene content of the ginger oil. The experiment was conducted at the Laboratory of Separation Techniques of the Chemical Engineering Department of Diponegoro University. The research was carried out in two stages, namely experimental and modeling work. A model was postulated and then derived to obtain equations that were tested and validated against the experimental data. Experimental measurements were performed at a microwave power of 300 W and an extraction temperature of 90 °C, with the independent variables (type of hydrotrope, solvent volume and concentration) varied to obtain zingiberene levels as a function of time. The measured data were used to validate the postulated model, yielding validated models and empirical equations. The results showed that the mass transfer coefficient (Kla) in the zingiberene mass transfer model of ginger oil extraction with the various hydrotropic solutions was 14 ± 2 times larger than that reported for extraction with electric heating. The larger the value of Kla, the faster the rate of mass transfer in the extraction process. To obtain the same yields, the microwave-assisted extraction required one-twelfth the time.
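The role of Kla can be illustrated with the standard lumped film model for a batch extraction, dC/dt = Kla (C_eq - C): the time to reach any given fraction of equilibrium scales as 1/Kla, which is why a larger Kla means a faster extraction. A sketch under that assumed model; the numbers below are illustrative, not the paper's data:

```python
import math

def batch_extraction_conc(c_eq, kla, t):
    """Extract concentration at time t for the first-order film model
    dC/dt = kLa*(C_eq - C) with C(0) = 0.  c_eq is the equilibrium
    concentration, kla the volumetric mass transfer coefficient (1/time)."""
    return c_eq * (1.0 - math.exp(-kla * t))

def time_to_fraction(frac, kla):
    """Time to reach a given fraction of equilibrium: t = -ln(1 - frac) / kLa."""
    return -math.log(1.0 - frac) / kla

# Illustrative comparison: if microwave heating raised kLa 14-fold, the time
# to reach 95% of equilibrium would shrink by the same factor of 14.
t_conv = time_to_fraction(0.95, kla=0.1)   # hypothetical conventional kLa
t_mw   = time_to_fraction(0.95, kla=1.4)   # hypothetical microwave kLa
```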

  20. Rigorous assessment of patterning solution of metal layer in 7 nm technology node

    NASA Astrophysics Data System (ADS)

    Gao, Weimin; Ciofi, Ivan; Saad, Yves; Matagne, Philippe; Bachmann, Michael; Gillijns, Werner; Lucas, Kevin; Demmerle, Wolfgang; Schmoeller, Thomas

    2016-01-01

    In the 7 nm node (N7), logic design requires a critical poly pitch of 42 to 45 nm and a metal 1 (M1) pitch of 28 to 32 nm. Such high pattern density pushes the 193 nm immersion lithography solution toward its limit and also brings extremely complex patterning scenarios. The N7 M1 layer may require self-aligned quadruple patterning (SAQP) with a triple litho-etch (LE3) block process. The whole patterning process flow therefore requires multiple exposure, etch and deposition steps, and each step introduces a particular impact on the pattern profiles and the topography. In this study, we have successfully integrated a simulation tool that enables emulation of the whole patterning flow with realistic, process-dependent three-dimensional (3-D) profiles and topography. We use this tool to study the patterning process variations of the N7 M1 layer, including the overlay control, the critical dimension uniformity budget, and the lithographic process window (PW). The resulting 3-D pattern structure can be used to optimize the process flow, verify design rules, extract parasitics and, most importantly, simulate the electric field and identify hot spots for dielectric reliability. As an example application, the maximum electric field at the M1 tip-to-tip, one of the most critical patterning locations, has been simulated and extracted. The approach helps to investigate the impact of process variations on dielectric reliability. We have also assessed an alternative M1 patterning flow with a single-exposure block using extreme ultraviolet lithography (EUVL) and analyzed its advantages compared to the LE3 block approach.

  1. Symbolic dynamic filtering and language measure for behavior identification of mobile robots.

    PubMed

    Mallapragada, Goutham; Ray, Asok; Jin, Xin

    2012-06-01

    This paper presents a procedure for behavior identification of mobile robots, which requires limited or no domain knowledge of the underlying process. While the features of robot behavior are extracted by symbolic dynamic filtering of the observed time series, the behavior patterns are classified based on language measure theory. The behavior identification procedure has been experimentally validated on a networked robotic test bed by comparison with commonly used tools, namely, principal component analysis for feature extraction and Bayesian risk analysis for pattern classification.

  2. EXTRACT: interactive extraction of environment metadata and term suggestion for metagenomic sample annotation.

    PubMed

    Pafilis, Evangelos; Buttigieg, Pier Luigi; Ferrell, Barbra; Pereira, Emiliano; Schnetzer, Julia; Arvanitidis, Christos; Jensen, Lars Juhl

    2016-01-01

    The microbial and molecular ecology research communities have made substantial progress on developing standards for annotating samples with environment metadata. However, manual annotation of samples is a highly labor-intensive process and requires familiarity with the terminologies used. We have therefore developed an interactive annotation tool, EXTRACT, which helps curators identify and extract standard-compliant terms for annotation of metagenomic records and other samples. Behind its web-based user interface, the system combines published methods for named entity recognition of environment, organism, tissue and disease terms. The evaluators in the BioCreative V Interactive Annotation Task found the system to be intuitive, useful, well documented and sufficiently accurate to be helpful in spotting relevant text passages and extracting organism and environment terms. Comparison of fully manual and text-mining-assisted curation revealed that EXTRACT speeds up annotation by 15-25% and helps curators to detect terms that would otherwise have been missed. Database URL: https://extract.hcmr.gr/. © The Author(s) 2016. Published by Oxford University Press.

  3. Pumpkin (Cucurbita maxima) seed proteins: sequential extraction processing and fraction characterization.

    PubMed

    Rezig, Leila; Chibani, Farhat; Chouaibi, Moncef; Dalgalarrondo, Michèle; Hessini, Kamel; Guéguen, Jacques; Hamdi, Salem

    2013-08-14

    Seed proteins extracted from Tunisian pumpkin seeds (Cucurbita maxima) were investigated for their solubility properties and sequentially extracted according to the Osborne procedure. The solubility of pumpkin proteins from seed flour was greatly influenced by pH changes and ionic strength, with higher values in the alkaline pH regions. It also depends on the seed defatting solvent. Protein solubility was decreased by using chloroform/methanol (CM) for lipid extraction instead of pentane (P). On the basis of differential solubility fractionation and depending on the defatting method, the alkali extract (AE) was the major fraction (42.1% (P), 22.3% (CM)) compared to the salt extract (8.6% (P), 7.5% (CM)). In salt, alkali, and isopropanol extracts, all essential amino acids with the exceptions of threonine and lysine met the minimum requirements for preschool children (FAO/WHO/UNU). The denaturation temperatures were 96.6 and 93.4 °C for salt and alkali extracts, respectively. Pumpkin protein extracts with unique protein profiles and higher denaturation temperatures could impart novel characteristics when used as food ingredients.

  4. SPECTROSCOPIC ONLINE MONITORING FOR PROCESS CONTROL AND SAFEGUARDING OF RADIOCHEMICAL STREAMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryan, Samuel A.; Levitskaia, Tatiana G.

    2013-09-29

    There is a renewed interest worldwide to promote the use of nuclear power and close the nuclear fuel cycle. The long-term successful use of nuclear power is critically dependent upon adequate and safe processing and disposition of the used nuclear fuel. Liquid-liquid extraction is a separation technique commonly employed for the processing of the dissolved used nuclear fuel. The instrumentation used to monitor these processes must be robust, require little or no maintenance, and be able to withstand harsh environments such as high radiation fields and aggressive chemical matrices. This paper summarizes the application of absorption and vibrational spectroscopic techniques, supplemented by physicochemical measurements, for radiochemical process monitoring. In this context, our team experimentally assessed the potential of Raman and spectrophotometric techniques for online real-time monitoring of U(VI)/nitrate ion/nitric acid and Pu(IV)/Np(V)/Nd(III), respectively, in solutions relevant to spent fuel reprocessing. These techniques demonstrate robust performance in repetitive batch measurements of each analyte over a wide concentration range using simulant and commercial dissolved spent fuel solutions. Spectroscopic measurements served as training sets for multivariate data analysis to obtain partial least squares predictive models, which were validated using on-line centrifugal contactor extraction tests. Satisfactory prediction of the analyte concentrations in these preliminary experiments warrants further development of the spectroscopy-based methods for radiochemical process control and safeguarding. Additionally, the ability to identify material intentionally diverted from a liquid-liquid extraction contactor system was successfully tested using on-line process monitoring as a means to detect the amount of material diverted.
A chemical diversion and its detection in a liquid-liquid extraction scheme were demonstrated using a centrifugal contactor system operating with the simulant PUREX extraction system of a Nd(NO3)3/nitric acid aqueous phase and a TBP/n-dodecane organic phase. During a continuous extraction experiment, a portion of the feed from a counter-current extraction system was diverted while the spectroscopic on-line process monitoring system simultaneously measured the feed, raffinate and organic product streams. The amount observed to be diverted by on-line spectroscopic process monitoring was in excellent agreement with values based on the known mass of sample directly taken (diverted) from the system feed solution.
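The chemometric step described above, training partial least squares (PLS) models on spectroscopic training sets to predict analyte concentrations, can be sketched with a one-component PLS (NIPALS) calibration on synthetic single-band spectra. Everything below (the data, noise level, band shape and function names) is invented for illustration:

```python
import numpy as np

def pls1_fit(X, y):
    """One-component PLS regression (NIPALS), a minimal stand-in for the
    multivariate calibration used in the paper."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    w = Xc.T @ yc
    w /= np.linalg.norm(w)           # weight vector (dominant spectral direction)
    t = Xc @ w                       # scores
    b = (yc @ t) / (t @ t)           # inner regression coefficient
    return x_mean, y_mean, w, b

def pls1_predict(model, X):
    x_mean, y_mean, w, b = model
    return y_mean + b * ((X - x_mean) @ w)

rng = np.random.default_rng(0)
grid = np.linspace(0, 1, 200)
band = np.exp(-((grid - 0.5) ** 2) / 0.002)          # one synthetic spectral band
conc = rng.uniform(0.1, 2.0, size=40)                # training-set concentrations
X = conc[:, None] * band + 0.01 * rng.standard_normal((40, 200))  # noisy "spectra"

model = pls1_fit(X, conc)
pred = pls1_predict(model, np.array([0.5, 1.5])[:, None] * band)  # close to [0.5, 1.5]
```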

  5. Gas Chromatograph/Mass Spectrometer

    NASA Technical Reports Server (NTRS)

    Wey, Chowen

    1995-01-01

    Gas chromatograph/mass spectrometer (GC/MS) used to measure and identify combustion species present in trace concentrations. Advanced extractive diagnostic method measures to parts per billion (ppb), as well as differentiates between different types of hydrocarbons. Applicable for petrochemical, waste incinerator, diesel transportation, and electric utility companies in accurately monitoring types of hydrocarbon emissions generated by fuel combustion, in order to meet stricter environmental requirements. Other potential applications include manufacturing processes requiring precise detection of toxic gaseous chemicals, biomedical applications requiring precise identification of accumulative gaseous species, and gas utility operations requiring high-sensitivity leak detection.

  6. NOVEL BINDERS AND METHODS FOR AGGLOMERATION OF ORE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S.K. Kawatra; T.C. Eisele; J.A. Gurtler

    2005-04-01

    Many metal extraction operations, such as leaching of copper, leaching of precious metals, and reduction of metal oxides to metal in high-temperature furnaces, require agglomeration of ore to ensure that reactive liquids or gases are evenly distributed throughout the ore being processed. Agglomeration of ore into coarse, porous masses achieves this even distribution of fluids by preventing fine particles from migrating and clogging the spaces and channels between the larger ore particles. Binders are critically necessary to produce agglomerates that will not break down during processing. However, for many important metal extraction processes there are no binders known that will work satisfactorily. A primary example is copper heap leaching, where no binders are known to work in the acidic environment encountered in this process. As a result, operators of many facilities see a large loss of process efficiency due to their inability to take advantage of agglomeration. The large quantities of ore that must be handled in metal extraction processes also mean that the binder must be inexpensive and useful at low dosages to be economical. The acid-resistant binders and agglomeration procedures developed in this project will also be adapted for use in improving the energy efficiency and performance of a broad range of mineral agglomeration applications, particularly heap leaching.

  7. Novel Binders and Methods for Agglomeration of Ore

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S. K. Kawatra; T. C. Eisele; J. A. Gurtler

    2004-03-31

    Many metal extraction operations, such as leaching of copper, leaching of precious metals, and reduction of metal oxides to metal in high-temperature furnaces, require agglomeration of ore to ensure that reactive liquids or gases are evenly distributed throughout the ore being processed. Agglomeration of ore into coarse, porous masses achieves this even distribution of fluids by preventing fine particles from migrating and clogging the spaces and channels between the larger ore particles. Binders are critically necessary to produce agglomerates that will not break down during processing. However, for many important metal extraction processes there are no binders known that will work satisfactorily. A primary example of this is copper heap leaching, where there are no binders that will work in the acidic environment encountered in this process. As a result, operators of acidic heap-leach facilities see a large loss of process efficiency due to their inability to take advantage of agglomeration. The large quantities of ore that must be handled in metal extraction processes also mean that the binder must be inexpensive and useful at low dosages to be economical. The acid-resistant binders and agglomeration procedures developed in this project will also be adapted for use in improving the energy efficiency and performance of other agglomeration applications, particularly advanced primary ironmaking.

  8. Extracting microRNA-gene relations from biomedical literature using distant supervision

    PubMed Central

    Clarke, Luka A.; Couto, Francisco M.

    2017-01-01

    Many biomedical relation extraction approaches are based on supervised machine learning, requiring an annotated corpus. Distant supervision aims at training a classifier by combining a knowledge base with a corpus, reducing the amount of manual effort necessary. This is particularly useful for biomedicine because many databases and ontologies have been made available for many biological processes, while the availability of annotated corpora is still limited. We studied the extraction of microRNA-gene relations from text. MicroRNA regulation is an important biological process due to its close association with human diseases. The proposed method, IBRel, is based on distantly supervised multi-instance learning. We evaluated IBRel on three datasets, and the results were compared with a co-occurrence approach as well as a supervised machine learning algorithm. While supervised learning outperformed on two of those datasets, IBRel obtained an F-score 28.3 percentage points higher on the dataset for which there was no training set developed specifically. To demonstrate the applicability of IBRel, we used it to extract 27 miRNA-gene relations from recently published papers about cystic fibrosis. Our results demonstrate that our method can be successfully used to extract relations from literature about a biological process without an annotated corpus. The source code and data used in this study are available at https://github.com/AndreLamurias/IBRel. PMID:28263989
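The distant-supervision idea underlying IBRel can be reduced to its labeling rule: a sentence co-mentioning a (miRNA, gene) pair is treated as a positive training instance if and only if the pair appears in the knowledge base, so no manually annotated corpus is needed. A toy sketch; the corpus, knowledge base and function below are invented for illustration and are not IBRel's actual code:

```python
# Hypothetical knowledge base of known miRNA-gene regulation pairs.
knowledge_base = {("miR-155", "SOCS1")}

sentences = [
    "miR-155 directly represses SOCS1 in T cells.",
    "Expression of miR-21 was unrelated to PTEN here.",
]

def distant_label(sentence, mirna, gene, kb):
    """Distant-supervision labeling rule: a co-mention is a positive
    instance iff the (miRNA, gene) pair exists in the knowledge base;
    sentences without a co-mention produce no instance (None)."""
    if mirna in sentence and gene in sentence:
        return (mirna, gene) in kb
    return None

labels = [
    distant_label(sentences[0], "miR-155", "SOCS1", knowledge_base),  # positive
    distant_label(sentences[1], "miR-21", "PTEN", knowledge_base),    # negative
]
```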

  9. Extracting microRNA-gene relations from biomedical literature using distant supervision.

    PubMed

    Lamurias, Andre; Clarke, Luka A; Couto, Francisco M

    2017-01-01

    Many biomedical relation extraction approaches are based on supervised machine learning, requiring an annotated corpus. Distant supervision aims at training a classifier by combining a knowledge base with a corpus, reducing the amount of manual effort necessary. This is particularly useful for biomedicine because many databases and ontologies have been made available for many biological processes, while the availability of annotated corpora is still limited. We studied the extraction of microRNA-gene relations from text. MicroRNA regulation is an important biological process due to its close association with human diseases. The proposed method, IBRel, is based on distantly supervised multi-instance learning. We evaluated IBRel on three datasets, and the results were compared with a co-occurrence approach as well as a supervised machine learning algorithm. While supervised learning outperformed on two of those datasets, IBRel obtained an F-score 28.3 percentage points higher on the dataset for which there was no training set developed specifically. To demonstrate the applicability of IBRel, we used it to extract 27 miRNA-gene relations from recently published papers about cystic fibrosis. Our results demonstrate that our method can be successfully used to extract relations from literature about a biological process without an annotated corpus. The source code and data used in this study are available at https://github.com/AndreLamurias/IBRel.

  10. Rapid extraction and assay of uranium from environmental surface samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barrett, Christopher A.; Chouyyok, Wilaiwan; Speakman, Robert J.

    Extraction methods enabling faster removal and concentration of uranium compounds for improved trace and low-level assay are demonstrated for standard surface sampling material in support of nuclear safeguards efforts, health monitoring, and other nuclear analysis applications. A key problem with the existing surface sampling swipes is the requirement for complete digestion of sample and sampling matrix. This is a time-consuming and labour-intensive process that limits laboratory throughput, elevates costs, and increases background levels. Various extraction methods are explored for their potential to quickly and efficiently remove different chemical forms of uranium from standard surface sampling material. A combination of carbonate and peroxide solutions is shown to give the most rapid and complete form of uranyl compound extraction and dissolution. This rapid extraction process is demonstrated to be compatible with standard inductively coupled plasma mass spectrometry methods for uranium isotopic assay as well as screening techniques such as x-ray fluorescence. The general approach described has application beyond uranium to other analytes of nuclear forensic interest (e.g., rare earth elements and plutonium) as well as heavy metals for environmental and industrial hygiene monitoring.

  11. Solar production of intermediate temperature process heat. Phase I design. Final report. [For sugarcane processing plant in Hawaii

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1980-08-01

    This report is the final effort in the Phase I design of a solar industrial process heat system for the Hilo Coast Processing Company (HCPC) in Pepeekeo, Hawaii. The facility is used to wash, grind and extract sugar from the locally grown sugarcane and it operates 24 hours a day, 305 days per year. The major steam requirements in the industrial process are for the prime movers (mill turbines) in the milling process and heat for evaporating water from the extracted juices. Bagasse (the fibrous residue of milled sugarcane) supplied 84% of the fuel requirement for steam generation in 1979, while 65,000 barrels of No. 6 industrial fuel oil made up the remaining 16%. These fuels are burned in the power plant complex which produces 825°F, 1,250 psi superheated steam to power a turbogenerator set which, in addition to serving the factory, generates from 7 to 16 megawatts of electricity that is exported to the local utility company. Extracted steam from the turbo-generator set supplies the plant's process steam needs. The system consists of 42,420 ft² of parabolic trough, single-axis tracking, concentrating solar collectors. The collectors will be oriented in a North-South configuration and will track East-West. A heat transfer fluid (Gulf Synfluid 4cs) will be circulated in a closed loop through the solar collectors and a series of heat exchangers. The inlet and outlet fluid temperatures for the collectors are 370°F and 450°F, respectively. It is estimated that the net useable energy delivered to the industrial process will be 7.2 x 10^9 Btu per year. With an HCPC boiler efficiency of 78% and 6.2 x 10^6 Btu per barrel of oil, the solar energy system will displace 1489 barrels of oil per year. (WHK)
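The displacement figure quoted above follows directly from the other stated numbers, as a quick check shows:

```python
# Reproduce the reported oil displacement from the abstract's own figures:
# net solar energy delivered, boiler efficiency, and energy content per barrel.
net_solar_btu = 7.2e9        # Btu/year delivered to the industrial process
boiler_efficiency = 0.78     # HCPC boiler efficiency
btu_per_barrel = 6.2e6       # Btu per barrel of No. 6 fuel oil

# Each barrel burned yields only efficiency * btu_per_barrel of useful heat, so
# the solar heat displaces net_solar / (efficiency * btu_per_barrel) barrels.
barrels_displaced = net_solar_btu / (boiler_efficiency * btu_per_barrel)
print(round(barrels_displaced))  # 1489, matching the report
```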

  12. Image feature detection and extraction techniques performance evaluation for development of panorama under different light conditions

    NASA Astrophysics Data System (ADS)

    Patil, Venkat P.; Gohatre, Umakant B.

    2018-04-01

    The technique of obtaining a wider field of view of an image to produce a high-resolution integrated image is normally required for developing a panorama of photographic images or a scene from a sequence of multiple partial views. Various image stitching methods have been developed recently. For image stitching, five basic steps are adopted: feature detection and extraction, image registration, homography computation, image warping, and blending. This paper reviews some of the existing image feature detection and extraction techniques and image stitching algorithms by categorizing them into several methods. For each category, the basic concepts are first described, and the modifications made to the fundamental concepts by different researchers are then elaborated. The paper also highlights some of the fundamental techniques for photographic image feature detection and extraction under various illumination conditions. Image stitching is applicable in various fields such as medical imaging, astrophotography, and computer vision. To evaluate the performance of the feature detection techniques, three methods are considered, i.e., ORB, SURF, and HESSIAN, and the time required for feature detection in the input images is measured. The results conclude that under daylight conditions the ORB algorithm performs better, since it requires less time while extracting more features, whereas for images under night-light conditions the SURF detector performs better than the ORB and HESSIAN detectors.

  13. Automated Fluid Feature Extraction from Transient Simulations

    NASA Technical Reports Server (NTRS)

    Haimes, Robert; Lovely, David

    1999-01-01

    In the past, feature extraction and identification were interesting concepts, but not required to understand the underlying physics of a steady flow field. This is because the results of the more traditional tools like iso-surfaces, cuts and streamlines were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of much interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one "snap-shot" of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments like pV3). And methods must be developed to abstract the feature and display it in a manner that physically makes sense. The following is a list of the important physical phenomena found in transient (and steady-state) fluid flow: (1) Shocks, (2) Vortex cores, (3) Regions of recirculation, (4) Boundary layers, (5) Wakes. Three papers and an initial specification for the (The Fluid eXtraction tool kit) FX Programmer's guide were included. The papers, submitted to the AIAA Computational Fluid Dynamics Conference, are entitled : (1) Using Residence Time for the Extraction of Recirculation Regions, (2) Shock Detection from Computational Fluid Dynamics results and (3) On the Velocity Gradient Tensor and Fluid Feature Extraction.
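One of the listed phenomena, vortex cores, is commonly detected from the velocity gradient tensor via the Q-criterion. The sketch below illustrates that general idea on two analytic 2-D flows; it is a generic illustration, not code from the FX toolkit.

```python
# Q-criterion vortex detection from the velocity gradient tensor (2-D sketch).
# Q = 0.5 * (||Omega||^2 - ||S||^2); Q > 0 marks rotation-dominated regions.

def q_criterion(J):
    """J is the 2x2 velocity gradient tensor [[du/dx, du/dy], [dv/dx, dv/dy]]."""
    S = [[0.5 * (J[i][j] + J[j][i]) for j in range(2)] for i in range(2)]  # strain rate
    O = [[0.5 * (J[i][j] - J[j][i]) for j in range(2)] for i in range(2)]  # rotation
    norm2 = lambda M: sum(M[i][j] ** 2 for i in range(2) for j in range(2))
    return 0.5 * (norm2(O) - norm2(S))

rotation = [[0.0, -1.0], [1.0, 0.0]]  # solid-body rotation: u = -y, v = x
shear    = [[0.0,  1.0], [0.0, 0.0]]  # simple shear: u = y, v = 0
print(q_criterion(rotation), q_criterion(shear))  # 1.0 (vortex), 0.0 (no vortex)
```

Pure rotation gives a positive Q while pure shear gives zero, which is exactly the discrimination a vortex-core extractor needs.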

  14. Extraction and Capture of Water from Martian Regolith Experimental Proof-of-Concept

    NASA Technical Reports Server (NTRS)

    Linne, Diane; Kleinhenz, Julie; Bauman, Steve; Johnson, Kyle

    2016-01-01

    Mars Design Reference Architecture 5.0 lists in-situ resource utilization (ISRU) as enabling for robust human Mars missions: LO2/LCH4 ascent propulsion requires 25,000 kg of oxygen from the atmosphere for ascent and life support. Atmosphere-based ISRU processes are less operationally complex than surface-based ones, and limited concept evaluation to date, together with uncertainty in the properties and distribution of Mars surface water, would not allow Mars soil water processing to be baselined at this time. Lunar regolith O2 extraction offers relevant processing experience: lunar regolith is fluidized and heated to high temperatures with H2 to produce H2O from iron-bearing minerals. In the analogous Mars concept, soil is placed in a fluidized-bed reactor, heated to moderate temperatures, and fluidized with an inert gas flow that also assists water desorption. Challenges include high-temperature dusty seals, a working gas that requires downstream separation and recycling to reduce consumables loss, and thermally inefficient batch-process heating.

  15. Determination of free sulfites (SO3-2) in dried fruits processed with sulfur dioxide by ion chromatography through anion exchange column and conductivity detection.

    PubMed

    Liao, Benjamin S; Sram, Jacqueline C; Files, Darin J

    2013-01-01

    A simple and effective anion ion chromatography (IC) method with anion exchange column and conductivity detector has been developed to determine free sulfites (SO3-2) in dried fruits processed with sulfur dioxide. No oxidation agent, such as hydrogen peroxide, is used to convert sulfites to sulfates for IC analysis. In addition, no stabilizing agent, such as formaldehyde, fructose or EDTA, is required during the sample extraction. This method uses aqueous 0.2 N NaOH as the solvent for standard preparation and sample extraction. The sulfites, either prepared from standard sodium sulfite powder or extracted from food samples, are presumed to be unbound SO3-2 in aqueous 0.2 N NaOH (pH > 13), because the bound sulfites in the sample matrix are released at pH > 10. In this study, sulfites in the standard solutions were stable at room temperature (i.e., 15-25 degrees C) for up to 12 days. The lowest standard of the linear calibration curve is set at 1.59 microg/mL SO3-2 (equivalent to 6.36 microg/g sample with no dilution) for analysis of processed dried fruits that would contain high levels (>1000 microg/g) of sulfites. As a consequence, this method typically requires significant dilution of the sample extract. Samples are prepared with a simple procedure of sample compositing, extraction with aqueous 0.2 N NaOH, centrifugation, dilution as needed, and filtration prior to IC. The sulfites in these sample extracts are stable at room temperature for up to 20 h. Using anion IC, the sulfites are eluted under isocratic conditions with 10 mM aqueous sodium carbonate solution as the mobile phase passing through an anion exchange column. The sulfites are easily separated, with an analysis run time of 18 min, regardless of the dried fruit matrix. Recoveries from samples spiked with sodium sulfites were demonstrated to be between 81 and 105% for five different fruit matrixes (apricot, golden grape, white peach, fig, and mango). 
Overall, this method is simple to perform and effective for the determination of high levels of sulfites in dried fruits.
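As a worked example of the dilution arithmetic, the abstract equates a 1.59 microg/mL standard with 6.36 microg/g of sample at no dilution, which implies an extract-to-sample ratio of 4 mL per g (an inference from those two figures, not a stated method parameter):

```python
# Convert an IC reading (ug SO3^2- per mL of extract) to ug per g of dried
# fruit. The 4 mL/g extract-to-sample ratio is inferred from the abstract's
# equivalence of 1.59 ug/mL and 6.36 ug/g, not a stated method parameter.

ML_EXTRACT_PER_G_SAMPLE = 4.0  # inferred: 6.36 / 1.59

def sulfite_ug_per_g(reading_ug_per_ml, dilution_factor=1.0):
    return reading_ug_per_ml * dilution_factor * ML_EXTRACT_PER_G_SAMPLE

print(sulfite_ug_per_g(1.59))       # 6.36, the stated no-dilution equivalent
print(sulfite_ug_per_g(1.59, 200))  # a 200x-diluted high-sulfite sample
```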

  16. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    PubMed Central

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917
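Metadata extraction from a structured file organization, as mentioned above, can be as simple as parsing a naming convention. The scheme below is hypothetical; Image Harvest parses whatever convention the phenotyping platform uses.

```python
# Parse phenotyping metadata out of image filenames. The naming scheme
# "<genotype>_day<n>_<view>.png" is a hypothetical example, not IH's format.
import re

PATTERN = re.compile(r"(?P<genotype>[A-Za-z0-9]+)_day(?P<day>\d+)_(?P<view>rgb|fluor|nir)\.png$")

def extract_metadata(filename):
    m = PATTERN.search(filename)
    if m is None:
        return None
    return {"genotype": m.group("genotype"),
            "day": int(m.group("day")),
            "view": m.group("view")}

print(extract_metadata("images/IR64_day14_rgb.png"))
```

A batch pipeline would map this function over the file listing and attach the resulting records to each image's extracted traits.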

  17. Life-Cycle Analysis of Energy Use, Greenhouse Gas Emissions, and Water Consumption in the 2016 MYPP Algal Biofuel Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frank, Edward; Pegallapati, Ambica; Davis, Ryan

    2016-06-16

    The Department of Energy (DOE) Bioenergy Technologies Office (BETO) Multi-year Program Plan (MYPP) describes the bioenergy objectives pursued by BETO, the strategies for achieving those objectives, the current state of technology (SOT), and a number of design cases that explore cost and operational performance required to advance the SOT towards middle and long term goals (MYPP, 2016). Two options for converting algae to biofuel intermediates were considered in the MYPP, namely algal biofuel production via lipid extraction and algal biofuel production by thermal processing. The first option, lipid extraction, is represented by the Combined Algae Processing (CAP) pathway in which algae are hydrolyzed in a weak acid pretreatment step. The treated slurry is fermented for ethanol production from sugars. The fermentation stillage contains most of the lipids from the original biomass, which are recovered through wet solvent extraction. The process residuals after lipid extraction, which contain much of the original mass of amino acids and proteins, are directed to anaerobic digestion (AD) for biogas production and recycle of N and P nutrients. The second option, thermal processing, comprises direct hydrothermal liquefaction (HTL) of the wet biomass, separation of aqueous, gas, and oil phases, and treatment of the aqueous phase with catalytic hydrothermal gasification (CHG) to produce biogas and to recover N and P nutrients.

  18. Improvement of lipid yield from microalgae Spirulina platensis using ultrasound assisted osmotic shock extraction method

    NASA Astrophysics Data System (ADS)

    Adetya, NP; Hadiyanto, H.

    2018-01-01

    Microalgae Spirulina sp. has been identified as a potential source of natural food supplements and food colorants. The high water content of microalgae (70-90%) makes biomass dehydration an obstacle that requires large amounts of energy and eventually damages the lipid in the microalgae. Therefore, the lipid must be extracted using a method suited to wet biomass, one of which is osmotic shock. This study aimed to investigate the influence of osmotic agent (NaCl) concentration (10-30%) and extraction time (20-50 min) on lipid yield, and to determine the optimal extraction conditions through response surface methodology. The extraction was conducted at a temperature of 40°C under an ultrasound frequency of 40 kHz. The results showed that the optimum lipid yield of 6.39% was obtained at 16.98% NaCl concentration for an extraction time of 36 minutes and 10 seconds.

  19. Biological network extraction from scientific literature: state of the art and challenges.

    PubMed

    Li, Chen; Liakata, Maria; Rebholz-Schuhmann, Dietrich

    2014-09-01

    Networks of molecular interactions explain complex biological processes, and all known information on molecular events is contained in a number of public repositories including the scientific literature. Metabolic and signalling pathways are often viewed separately, even though both types are composed of interactions involving proteins and other chemical entities. It is necessary to be able to combine data from all available resources to judge the functionality, complexity and completeness of any given network overall, but especially the full integration of relevant information from the scientific literature is still an ongoing and complex task. Currently, the text-mining research community is steadily moving towards processing the full body of the scientific literature by making use of rich linguistic features such as full text parsing, to extract biological interactions. The next step will be to combine these with information from scientific databases to support hypothesis generation for the discovery of new knowledge and the extension of biological networks. The generation of comprehensive networks requires technologies such as entity grounding, coordination resolution and co-reference resolution, which are not fully solved and are required to further improve the quality of results. Here, we analyse the state of the art for the extraction of network information from the scientific literature and the evaluation of extraction methods against reference corpora, discuss challenges involved and identify directions for future research. © The Author 2013. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  20. Enabling Low-Power, Multi-Modal Neural Interfaces Through a Common, Low-Bandwidth Feature Space.

    PubMed

    Irwin, Zachary T; Thompson, David E; Schroeder, Karen E; Tat, Derek M; Hassani, Ali; Bullard, Autumn J; Woo, Shoshana L; Urbanchek, Melanie G; Sachs, Adam J; Cederna, Paul S; Stacey, William C; Patil, Parag G; Chestek, Cynthia A

    2016-05-01

    Brain-Machine Interfaces (BMIs) have shown great potential for generating prosthetic control signals. Translating BMIs into the clinic requires fully implantable, wireless systems; however, current solutions have high power requirements which limit their usability. Lowering this power consumption typically limits the system to a single neural modality, or signal type, and thus to a relatively small clinical market. Here, we address both of these issues by investigating the use of signal power in a single narrow frequency band as a decoding feature for extracting information from electrocorticographic (ECoG), electromyographic (EMG), and intracortical neural data. We have designed and tested the Multi-modal Implantable Neural Interface (MINI), a wireless recording system which extracts and transmits signal power in a single, configurable frequency band. In prerecorded datasets, we used the MINI to explore low frequency signal features and any resulting tradeoff between power savings and decoding performance losses. When processing intracortical data, the MINI achieved a power consumption 89.7% less than a more typical system designed to extract action potential waveforms. When processing ECoG and EMG data, the MINI achieved similar power reductions of 62.7% and 78.8%. At the same time, using the single signal feature extracted by the MINI, we were able to decode all three modalities with less than a 9% drop in accuracy relative to using high-bandwidth, modality-specific signal features. We believe this system architecture can be used to produce a viable, cost-effective, clinical BMI.
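Signal power in a single narrow frequency band can be computed cheaply with the Goertzel algorithm; the sketch below illustrates that general technique, not the MINI's actual hardware implementation.

```python
# Single-frequency band power via the Goertzel algorithm: a low-compute way to
# extract power in one configurable band, in the spirit of the MINI's single
# feature. Generic illustration, not the paper's implementation.
import math

def goertzel_power(samples, target_hz, sample_rate):
    n = len(samples)
    k = round(n * target_hz / sample_rate)  # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:                        # one multiply-accumulate per sample
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2  # |X(k)|^2

fs = 1000.0
sig = [math.sin(2 * math.pi * 50 * t / fs) for t in range(1000)]  # 50 Hz tone
print(goertzel_power(sig, 50, fs) > goertzel_power(sig, 200, fs))  # True
```

Unlike a full FFT, Goertzel touches only the one bin of interest, which is why band-limited feature extraction can run at a small fraction of the power of broadband waveform capture.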

  1. Characterization of an in vitro system for the synthesis of mRNA from human parainfluenza virus type 3.

    PubMed

    De, B P; Galinski, M S; Banerjee, A K

    1990-03-01

    A cell extract derived from human parainfluenza virus type 3-infected human lung carcinoma (HLC) cells synthesized mRNA in vitro. Under optimal conditions, the extract was able to support transcription of all virus-encoded genes as determined by hybridization analyses. The RNA products contained full-length poly(A)-containing mRNA species similar to those observed in acutely infected cells. Further purification of the viral nucleocapsids from the infected HLC cell extract resulted in total loss of the capacity of the extract to synthesize mRNA in vitro. However, the addition of cytoplasmic extracts from uninfected HLC cells to the nucleocapsid preparations restored transcription to levels observed in the infected cell lysates, indicating requirement of a host factor(s) in the human parainfluenza virus type 3 transcription process. In distinction to the abundant transcription observed in the cell extract from HLC cells, cell extract prepared from CV-1 cells failed to support transcription in vitro. High levels of RNase activity in the cell extract from CV-1 cells appears to be the principal reason for this difference.

  2. Hemimorphite Ores: A Review of Processing Technologies for Zinc Extraction

    NASA Astrophysics Data System (ADS)

    Chen, Ailiang; Li, Mengchun; Qian, Zhen; Ma, Yutian; Che, Jianyong; Ma, Yalin

    2016-10-01

    With the gradual depletion of zinc sulfide ores, exploration of zinc oxide ores is becoming more and more important. Hemimorphite is a major zinc oxide ore, attracting much attention in the field of zinc metallurgy although it is not the major zinc mineral. This paper presents a critical review of the treatment for extraction of zinc with emphasis on flotation, pyrometallurgical and hydrometallurgical methods based on the properties of hemimorphite. The three-dimensional framework structure of hemimorphite with complex linkage of its structural units lead to difficult desilicification before extracting zinc in the many metallurgical technologies. It is found that the flotation method is generally effective in enriching zinc minerals from hemimorphite ores into a high-grade concentrate for recovery of zinc. Pure zinc can be produced from hemimorphite or/and willemite with a reducing reagent, like methane or carbon. Leaching reagents, such as acid and alkali, can break the complex structure of hemimorphite to release zinc in the leached solution without generation of silica gel in the hydrometallurgical process. For optimal zinc extraction, combing flotation with pyrometallurgical or hydrometallurgical methods may be required.

  3. Extraction of Oxygen from the Martian Atmosphere

    NASA Technical Reports Server (NTRS)

    England, C.

    2004-01-01

    A mechanical process was designed for direct extraction of molecular oxygen from the martian atmosphere based on liquefaction of the majority component, CO2, followed by separation of the lower-boiling components. The atmospheric gases are compressed from about 0.007 bar to 13 bar and then cooled to liquefy most of the CO2. The uncondensed gases are further compressed to 30 bar or more, and then cooled again to recover water as ice and to remove much of the remaining CO2. The final gaseous products consisting mostly of nitrogen, oxygen, and carbon monoxide are liquefied and purified by cryogenic distillation. The liquefied CO2 is expanded back to the low-pressure atmosphere with the addition of heat to recover a majority of the compression energy and to produce the needed mechanical work. Energy for the process is needed primarily as heat to drive the CO2-based expansion power system. When properly configured, the extraction process can be a net producer of electricity. The conceptual design, termed 'MARRS' for Mars Atmosphere Resource Recovery System, was based on the NASA/JSC Mars Reference Mission (MRM) requirement for oxygen. This mission requires both liquid oxygen for propellant, and gaseous oxygen as a component of air for the mission crew. With single redundancy both for propellant and crew air, the oxygen requirement for the MRM is estimated at 5.8 kg/hr. The process thermal power needed is about 120 kW, which can be provided at 300-500 C. A lower-cost nuclear reactor made largely of stainless steel could serve as the heat source. The chief development needed for MARRS is an efficient atmospheric compression technology, all other steps being derived from conventional chemical engineering separations. The conceptual design describes an exceptionally low-mass compression system that can be made from ultra-lightweight and deployable structures. 
This system adapts to the rapidly changing martian environment to supply the atmospheric resource to MARRS at constant conditions.
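As a rough plausibility check on the compression step, the ideal-gas isothermal minimum work for raising gas from 0.007 bar to 13 bar can be estimated. The temperature and single-stage isothermal assumption below are illustrative, not MARRS design values.

```python
# Ideal-gas isothermal lower bound on specific compression work for raising
# martian atmosphere (mostly CO2) from 0.007 bar to 13 bar. The 250 K
# compression temperature is an assumed illustrative value.
import math

R = 8.314                  # J/(mol K), gas constant
M_CO2 = 0.044              # kg/mol, molar mass of CO2
T = 250.0                  # K, assumed compression temperature
p1, p2 = 0.007e5, 13.0e5   # Pa, inlet and outlet pressures

w_per_mol = R * T * math.log(p2 / p1)  # J/mol, isothermal reversible minimum
w_per_kg = w_per_mol / M_CO2 / 1e3     # kJ per kg of atmosphere
print(round(w_per_kg), "kJ/kg")        # a few hundred kJ/kg
```

The size of this figure is why the design recovers most of the compression energy by re-expanding the liquefied CO2 back to ambient pressure.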

  4. Centrifugal contactor operations for UREX process flowsheet. An update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pereira, Candido; Vandegrift, George F.

    2014-08-01

    The uranium extraction (UREX) process separates uranium, technetium, and a fraction of the iodine from the other components of the irradiated fuel in nitric acid solution. In May 2012, the time, material, and footprint requirements for treatment of 260 L batches of a solution containing 130 g-U/L were evaluated for two commercial annular centrifugal contactors from CINC Industries. These calculated values were based on the expected volume and concentration of fuel arising from treatment of a single target solution vessel (TSV). The general conclusions of that report were that a CINC V-2 contactor would occupy a footprint of 3.2 m² (0.25 m x 15 m) if each stage required twice the nominal footprint of an individual stage, and approximately 1,131 minutes or nearly 19 hours is required to process all of the feed solution. A CINC V-5 would require approximately 9.9 m² (0.4 m x 25 m) of floor space but would require only 182 minutes or ~ 3 hours to process the spent target solution. Subsequent comparison with the Modular Caustic Side Solvent Extraction Unit (MCU) at Savannah River Site (SRS) in October 2013 suggested that a more compact arrangement is feasible, and the linear dimension for the CINC V-5 may be reduced to about 8 m; a comparable reduction for the CINC V-2 yields a length of 5 m. That report also described an intermediate-scale (10 cm) contactor design developed by Argonne in the early 1980s that would better align with the SHINE operations as they stood in May 2012. In this report, we revisit the previous evaluation of contactor operations after discussions with CINC Industries and analysis of the SHINE process flow diagrams for the cleanup of the TSV, which were not available at the time of the first assessment.
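The batch figures above imply the following average throughputs:

```python
# Average throughput implied by the abstract's batch figures:
# 260 L processed in 1131 min (CINC V-2) versus 182 min (CINC V-5).
batch_l = 260.0

def throughput_l_per_min(batch_liters, minutes):
    return batch_liters / minutes

v2 = throughput_l_per_min(batch_l, 1131)  # CINC V-2
v5 = throughput_l_per_min(batch_l, 182)   # CINC V-5
print(round(v2, 2), round(v5, 2))         # 0.23 and 1.43 L/min
```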

  5. Automated Fluid Feature Extraction from Transient Simulations

    NASA Technical Reports Server (NTRS)

    Haimes, Robert

    2000-01-01

    In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the results of the more traditional tools like iso-surfaces, cuts and streamlines, were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of a great deal of interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one 'snap-shot' of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments like pV3). And methods must be developed to abstract the feature and display it in a manner that physically makes sense.

  6. Critical comparison of the on-line and off-line molecularly imprinted solid-phase extraction of patulin coupled with liquid chromatography.

    PubMed

    Lhotská, Ivona; Holznerová, Anežka; Solich, Petr; Šatínský, Dalibor

    2017-12-01

    Determining trace amounts of mycotoxin contamination requires sensitive and selective analytical tools. Improving the selectivity of sample pretreatment steps through new and modern extraction techniques is one way to achieve this, and molecularly imprinted polymers as selective extraction sorbents undoubtedly meet these criteria. The presented work focuses on the hyphenation of on-line molecularly imprinted solid-phase extraction with a chromatography system using a column-switching approach. A critical comparison with a simultaneously developed off-line extraction procedure, an evaluation of the pros and cons of each method, and an assessment of the reliability of both methods on real sample analysis were carried out. Both high-performance liquid chromatography methods, using off-line extraction on molecularly imprinted polymer and an on-line column-switching approach, were validated, and the validation results were compared against each other. Although automation leads to significant time savings, fewer human errors, and no handling of toxic solvents, it reached a worse detection limit (15 versus 6 μg/L), worse recovery values (68.3-123.5 versus 81.2-109.9%), and worse efficiency throughout the entire clean-up process in comparison with the off-line extraction method. The difficulties encountered, the compromises made during the optimization of on-line coupling, and their critical evaluation are presented in detail. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Numerical framework for the modeling of electrokinetic flows

    NASA Astrophysics Data System (ADS)

    Deshpande, Manish; Ghaddar, Chahid; Gilbert, John R.; St. John, Pamela M.; Woudenberg, Timothy M.; Connell, Charles R.; Molho, Joshua; Herr, Amy; Mungal, Godfrey; Kenny, Thomas W.

    1998-09-01

    This paper presents a numerical framework for design-based analyses of electrokinetic flow in interconnects. Electrokinetic effects, which can be broadly divided into electrophoresis and electroosmosis, are of importance in providing a transport mechanism in microfluidic devices for both pumping and separation. Models for the electrokinetic effects can be derived and coupled to the fluid dynamic equations through appropriate source terms. In the design of practical microdevices, however, accurate coupling of the electrokinetic effects requires the knowledge of several material and physical parameters, such as the diffusivity and the mobility of the solute in the solvent. Additionally wall-based effects such as chemical binding sites might exist that affect the flow patterns. In this paper, we address some of these issues by describing a synergistic numerical/experimental process to extract the parameters required. Experiments were conducted to provide the numerical simulations with a mechanism to extract these parameters based on quantitative comparisons with each other. These parameters were then applied in predicting further experiments to validate the process. As part of this research, we have created NetFlow, a tool for micro-fluid analyses. The tool can be validated and applied in existing technologies by first creating test structures to extract representations of the physical phenomena in the device, and then applying them in the design analyses to predict correct behavior.
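The electrokinetic transport the framework models combines electrophoresis and electroosmosis; to first order, the apparent solute velocity is the sum of the two mobilities times the applied field. The mobility and field values below are assumed order-of-magnitude numbers, not parameters extracted in the paper.

```python
# First-order electrokinetic transport: electrophoretic and electroosmotic
# mobilities add, so the apparent solute velocity is v = (mu_ep + mu_eo) * E.
# All numbers below are assumed, order-of-magnitude illustrative values.
mu_ep = 3.0e-8  # m^2/(V s), electrophoretic mobility of the solute (assumed)
mu_eo = 5.0e-8  # m^2/(V s), electroosmotic mobility of the channel (assumed)
E = 2.0e4       # V/m, applied electric field (assumed)

v = (mu_ep + mu_eo) * E  # apparent velocity in m/s
print(v * 1e3, "mm/s")
```

Parameter extraction in the paper works the other way around: measured velocities at known fields are fit to recover the mobilities and diffusivity that the simulations then use.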

  8. A UWB Radar Signal Processing Platform for Real-Time Human Respiratory Feature Extraction Based on Four-Segment Linear Waveform Model.

    PubMed

    Hsieh, Chi-Hsuan; Chiu, Yu-Fang; Shen, Yi-Hsiang; Chu, Ta-Shun; Huang, Yuan-Hao

    2016-02-01

    This paper presents an ultra-wideband (UWB) impulse-radio radar signal processing platform used to analyze human respiratory features. Conventional radar systems used in human detection only analyze human respiration rates or the response of a target. However, additional respiratory signal information is available that has not been explored using radar detection. The authors previously proposed a modified raised cosine waveform (MRCW) respiration model and an iterative correlation search algorithm that could acquire additional respiratory features such as the inspiration and expiration speeds, respiration intensity, and respiration holding ratio. To realize real-time respiratory feature extraction by using the proposed UWB signal processing platform, this paper proposes a new four-segment linear waveform (FSLW) respiration model. This model offers a superior fit to the measured respiration signal compared with the MRCW model and decreases the computational complexity of feature extraction. In addition, an early-terminated iterative correlation search algorithm is presented, substantially decreasing the computational complexity and yielding negligible performance degradation. These extracted features can be considered the compressed signals used to decrease the amount of data storage required for use in long-term medical monitoring systems and can also be used in clinical diagnosis. The proposed respiratory feature extraction algorithm was designed and implemented using the proposed UWB radar signal processing platform including a radar front-end chip and an FPGA chip. The proposed radar system can detect human respiration rates at 0.1 to 1 Hz and facilitates the real-time analysis of the respiratory features of each respiration period.
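A four-segment piecewise-linear respiration cycle, in the spirit of the FSLW model, can be written as a simple function; the segment durations and amplitude below are illustrative, not the paper's fitted values.

```python
# A four-segment piecewise-linear respiration waveform in the spirit of the
# FSLW model: linear inspiration, breath hold, linear expiration, then rest.
# Segment durations and amplitude are illustrative, not the paper's fit.

def fslw(t, period, t_insp, t_hold, t_exp, amplitude=1.0):
    """Chest displacement at time t for a periodic four-segment cycle."""
    t = t % period
    if t < t_insp:                           # segment 1: inspiration (rising)
        return amplitude * t / t_insp
    if t < t_insp + t_hold:                  # segment 2: hold (flat top)
        return amplitude
    if t < t_insp + t_hold + t_exp:          # segment 3: expiration (falling)
        return amplitude * (1 - (t - t_insp - t_hold) / t_exp)
    return 0.0                               # segment 4: rest (flat bottom)

samples = [fslw(t / 10, period=4.0, t_insp=1.5, t_hold=0.5, t_exp=1.0) for t in range(40)]
print(max(samples), samples[0])
```

The slopes of segments 1 and 3 give the inspiration and expiration speeds, the amplitude gives respiration intensity, and the segment-2 fraction of the period gives the holding ratio, i.e. the features the radar platform extracts.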

  9. Liquefaction process wherein solvents derived from the material liquefied and containing increased concentrations of donor species are employed

    DOEpatents

    Fant, B. T.; Miller, John D.; Ryan, D. F.

    1982-01-01

    An improved process for the liquefaction of solid carbonaceous materials, wherein a solvent or diluent derived from the solid carbonaceous material being liquefied is used to form a slurry of that material, and wherein the solvent or diluent comprises from about 65 to about 85 wt.% hydroaromatic components. The solvent is prepared by first separating a solvent or diluent distillate fraction from the liquefaction product, subjecting this distillate fraction to hydrogenation, and then extracting the naphthenic components from the hydrogenated product. The extracted naphthenic components are then dehydrogenated and hydrotreated to produce additional hydroaromatic components, which are combined with the solvent or diluent distillate fraction. The solvent may also contain hydroaromatic constituents prepared by extracting naphthenic components from a heavy naphtha, dehydrogenating these, and then hydrotreating the dehydrogenated product. When the amount of solvent produced in this manner exceeds that required for steady-state operation of the liquefaction process, a portion of the solvent or diluent distillate fraction is withdrawn as product.

  10. Effective Information Extraction Framework for Heterogeneous Clinical Reports Using Online Machine Learning and Controlled Vocabularies

    PubMed Central

    Zheng, Shuai; Ghasemzadeh, Nima; Hayek, Salim S; Quyyumi, Arshed A

    2017-01-01

    Background Extracting structured data from narrated medical reports is challenging because of heterogeneous structures and vocabularies and often requires significant manual effort. Traditional machine-based approaches lack the capability to incorporate user feedback for improving the extraction algorithm in real time. Objective Our goal was to provide a generic information extraction framework that can support diverse clinical reports and enables a dynamic interaction between a human and a machine to produce highly accurate results. Methods A clinical information extraction system, IDEAL-X, has been built on top of online machine learning. It processes one document at a time, and user interactions are recorded as feedback to update the learning model in real time. The updated model is used to predict values for extraction in subsequent documents. Once prediction accuracy reaches a user-acceptable threshold, the remaining documents may be batch processed. A customizable controlled vocabulary may be used to support extraction. Results Three datasets were used for experiments based on report styles: 100 cardiac catheterization procedure reports, 100 coronary angiographic reports, and 100 integrated reports, each combining a history and physical report, discharge summary, outpatient clinic notes, an outpatient clinic letter, and an inpatient discharge medication report. Data extraction was performed by 3 methods: online machine learning, controlled vocabularies, and a combination of these. The system delivers results with F1 scores greater than 95%. Conclusions IDEAL-X adopts a unique online machine learning-based approach combined with controlled vocabularies to support data extraction for clinical reports. The system can quickly learn and improve, and is thus highly adaptable. PMID:28487265
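The document-at-a-time feedback loop described above can be sketched as follows; the toy cue-word "model", the accuracy window, and the threshold logic are hypothetical stand-ins for IDEAL-X's actual learner.

```python
# Minimal sketch of an online learning loop with user feedback: process one
# document at a time, let an oracle (the user) correct predictions, update the
# model in real time, and batch-process the rest once accuracy is acceptable.

def online_extract(documents, oracle, threshold=0.9, window=5):
    model = {}                       # learned mapping: cue word -> value
    recent, results = [], []
    for i, doc in enumerate(documents):
        cue = doc.split()[0]         # toy feature: first token is the cue
        pred = model.get(cue)
        truth = oracle(doc)          # user-provided correct value
        recent = (recent + [pred == truth])[-window:]
        model[cue] = truth           # feedback updates the model immediately
        results.append(truth)
        if len(recent) == window and sum(recent) / window >= threshold:
            # accuracy acceptable: batch-process remaining documents
            results.extend(model.get(d.split()[0]) for d in documents[i + 1:])
            break
    return results
```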

  11. Investigations regarding the wet decontamination of fluorescent lamp waste using iodine in potassium iodide solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tunsu, Cristian, E-mail: tunsu@chalmers.se; Ekberg, Christian; Foreman, Mark

    Highlights: • A wet-based decontamination process for fluorescent lamp waste is proposed. • Mercury can be leached using iodine in potassium iodide solution. • The efficiency of the process increases with an increase in leachant concentration. • Selective leaching of mercury from rare earth elements is achieved. • Mercury is further recovered using ion exchange, reduction or solvent extraction. - Abstract: With the rising popularity of fluorescent lighting, simple and efficient methods for the decontamination of discarded lamps are needed. Due to their mercury content, end-of-life fluorescent lamps are classified as hazardous waste, requiring special treatment for disposal. A simple wet-based decontamination process is required, especially for streams where thermal desorption, a commonly used but energy-demanding method, cannot be applied. In this study the potential of a wet-based process using iodine in potassium iodide solution was studied for the recovery of mercury from fluorescent lamp waste. The influence of the leaching agent's concentration and solid/liquid ratio on the decontamination efficiency was investigated. The leaching behaviour of mercury was studied over time, as well as its recovery from the obtained leachates by means of anion exchange, reduction, and solvent extraction. Dissolution of more than 90% of the contained mercury was achieved using 0.025/0.05 M I₂/KI solution at 21 °C for two hours. The efficiency of the process increased with an increase in leachant concentration. 97.3 ± 0.6% of the mercury contained was dissolved at 21 °C, in two hours, using a 0.25/0.5 M I₂/KI solution and a solid-to-liquid ratio of 10% w/v. Iodine and mercury can be efficiently removed from the leachates using Dowex 1X8 anion exchange resin or reducing agents such as sodium hydrosulphite, allowing the disposal of the obtained solution as non-hazardous industrial wastewater. The extractant CyMe₄BTBP showed good removal of mercury, with an extraction efficiency of 97.5 ± 0.7% achieved in a single stage. Even better single-stage removal was achieved with the extractants Cyanex 302 and Cyanex 923 in kerosene.

  12. Monitoring of the secondary drying in freeze-drying of pharmaceuticals.

    PubMed

    Fissore, Davide; Pisano, Roberto; Barresi, Antonello A

    2011-02-01

    This paper focuses on the in-line monitoring of the secondary drying phase of a lyophilization process. An innovative software sensor is presented to reliably estimate the residual moisture in the product and the time required to complete secondary drying, that is, to reach the target value of the residual moisture or of the desorption rate. These results are obtained by coupling a mathematical model of the process with the in-line measurement of the solvent desorption rate, by means of the pressure rise test or other sensors (e.g., windmills, laser sensors) that can measure the vapor flux in the drying chamber. The proposed method does not require extracting any vial during the operation or using expensive sensors to measure the residual moisture off-line. Moreover, it does not require any preliminary experiment to determine the relationship between the desorption rate and the residual moisture in the product. The effectiveness of the proposed approach is demonstrated by means of experiments carried out in a pilot-scale apparatus: in this case, some vials were extracted from the drying chamber and their moisture content was measured to validate the estimates provided by the soft sensor. Copyright © 2010 Wiley-Liss, Inc.
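As a hedged illustration of the soft-sensor idea, one can couple a simple first-order desorption model to an in-line rate measurement to obtain the residual moisture and the time left to reach a target rate. The model form, rate constant, and function names below are assumptions of this sketch, not the authors' equations.

```python
# Assumed first-order desorption: dm/dt = -k*m, so the measured desorption
# rate r = k*m yields the current residual moisture, and r(t) = r0*exp(-k*t)
# yields the time remaining to reach a target rate.
import math

def estimate_moisture(rate_now, k):
    """Residual moisture m = r/k from the measured desorption rate."""
    return rate_now / k

def time_to_target(rate_now, rate_target, k):
    """Solve r0*exp(-k*t) = rate_target for t."""
    return math.log(rate_now / rate_target) / k

m = estimate_moisture(rate_now=0.02, k=0.5)               # arbitrary units
t = time_to_target(rate_now=0.02, rate_target=0.002, k=0.5)
```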

  13. Text feature extraction based on deep learning: a review.

    PubMed

    Liang, Hong; Sun, Xiao; Sun, Yunlei; Gao, Yuan

    2017-01-01

    Selection of text feature items is a basic and important task for text mining and information retrieval. Traditional methods of feature extraction require handcrafted features, and hand-designing an effective feature is a lengthy process; for new applications, deep learning can instead acquire effective feature representations from training data. As a new feature extraction method, deep learning has made achievements in text mining. The major difference between deep learning and conventional methods is that deep learning automatically learns features from big data instead of adopting handcrafted features, which depend mainly on the designers' prior knowledge and cannot exploit the advantages of big data. Deep learning can automatically learn feature representations from big data, with models involving millions of parameters. This review first outlines the common methods used in text feature extraction, then expands on frequently used deep learning methods in text feature extraction and its applications, and forecasts the application of deep learning in feature extraction.
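The contrast drawn above can be made concrete on the handcrafted side: a bag-of-words extractor must be designed by hand, term by term, whereas a deep model would learn its representation from data. A minimal sketch of the handcrafted side:

```python
# Handcrafted feature extraction: term frequencies over a hand-chosen
# vocabulary. Every design decision (tokenization, vocabulary) is manual.
from collections import Counter

def handcrafted_features(doc, vocabulary):
    """Bag-of-words term-frequency vector over a fixed vocabulary."""
    counts = Counter(doc.lower().split())
    return [counts[term] for term in vocabulary]

vocab = ["extraction", "feature", "learning"]
vec = handcrafted_features("Feature extraction and feature learning", vocab)
```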

  14. EXTRACT: Interactive extraction of environment metadata and term suggestion for metagenomic sample annotation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pafilis, Evangelos; Buttigieg, Pier Luigi; Ferrell, Barbra

    The microbial and molecular ecology research communities have made substantial progress on developing standards for annotating samples with environment metadata. However, manual annotation of samples is a highly labor-intensive process and requires familiarity with the terminologies used. We have therefore developed an interactive annotation tool, EXTRACT, which helps curators identify and extract standard-compliant terms for annotation of metagenomic records and other samples. Behind its web-based user interface, the system combines published methods for named entity recognition of environment, organism, tissue and disease terms. The evaluators in the BioCreative V Interactive Annotation Task found the system to be intuitive, useful, well documented and sufficiently accurate to be helpful in spotting relevant text passages and extracting organism and environment terms. Here, comparison of fully manual and text-mining-assisted curation revealed that EXTRACT speeds up annotation by 15–25% and helps curators detect terms that would otherwise have been missed.

  15. Oil extraction from sheanut (Vitellaria paradoxa Gaertn C.F.) kernels assisted by microwaves.

    PubMed

    Nde, Divine B; Boldor, Dorin; Astete, Carlos; Muley, Pranjali; Xu, Zhimin

    2016-03-01

    Shea butter is highly sought after in cosmetics, pharmaceuticals, chocolate and biodiesel formulations. Microwave-assisted extraction (MAE) of butter from sheanut kernels was carried out using Doehlert's experimental design. The factors studied were microwave heating time, temperature and solvent/solute ratio, while the responses were the quantity of oil extracted and the acid number. Second-order models were established to describe the influence of the experimental parameters on the responses studied. Under optimum MAE conditions (heating time 23 min, temperature 75 °C and solvent/solute ratio 4:1), more than 88% of the oil, with a free fatty acid (FFA) value of less than 2, was extracted, compared with the 10 h and solvent/solute ratio of 10:1 required for Soxhlet extraction. Scanning electron microscopy was used to elucidate the effect of microwave heating on the kernels' microstructure. Substantial reductions in extraction time and solvent volume, and oil of suitable quality, are the main benefits of the MAE process.
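A second-order (quadratic) response-surface model of the kind fitted above, written out for the three factors studied (heating time, temperature, solvent/solute ratio). The coefficients passed below are placeholders for illustration, not the paper's fitted values.

```python
# Second-order response surface for three factors:
#   y = b0 + sum(bi*xi) + sum(bii*xi^2) + sum(bij*xi*xj)

def second_order_model(x, b0, b_lin, b_quad, b_int):
    t, T, R = x                                   # time, temperature, ratio
    linear = b_lin[0]*t + b_lin[1]*T + b_lin[2]*R
    quad   = b_quad[0]*t*t + b_quad[1]*T*T + b_quad[2]*R*R
    inter  = b_int[0]*t*T + b_int[1]*t*R + b_int[2]*T*R
    return b0 + linear + quad + inter

# Placeholder coefficients; a real fit would come from the Doehlert design.
y = second_order_model((23, 75, 4), b0=0.0,
                       b_lin=(1.0, 0.5, 2.0),
                       b_quad=(-0.01, -0.002, -0.05),
                       b_int=(0.001, 0.01, 0.005))
```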

  16. EXTRACT: Interactive extraction of environment metadata and term suggestion for metagenomic sample annotation

    DOE PAGES

    Pafilis, Evangelos; Buttigieg, Pier Luigi; Ferrell, Barbra; ...

    2016-01-01

    The microbial and molecular ecology research communities have made substantial progress on developing standards for annotating samples with environment metadata. However, manual annotation of samples is a highly labor-intensive process and requires familiarity with the terminologies used. We have therefore developed an interactive annotation tool, EXTRACT, which helps curators identify and extract standard-compliant terms for annotation of metagenomic records and other samples. Behind its web-based user interface, the system combines published methods for named entity recognition of environment, organism, tissue and disease terms. The evaluators in the BioCreative V Interactive Annotation Task found the system to be intuitive, useful, well documented and sufficiently accurate to be helpful in spotting relevant text passages and extracting organism and environment terms. Here, comparison of fully manual and text-mining-assisted curation revealed that EXTRACT speeds up annotation by 15–25% and helps curators detect terms that would otherwise have been missed.

  17. Silk Fibroin Degradation Related to Rheological and Mechanical Properties.

    PubMed

    Partlow, Benjamin P; Tabatabai, A Pasha; Leisk, Gary G; Cebe, Peggy; Blair, Daniel L; Kaplan, David L

    2016-05-01

    Regenerated silk fibroin has been proposed as a material substrate for biomedical, optical, and electronic applications. Preparation of the silk fibroin solution requires extraction (degumming) to remove contaminants, but this results in degradation of the fibroin protein. Here, a mechanism of fibroin degradation is proposed and the molecular weight and polydispersity are characterized as a function of extraction time. Rheological analysis reveals significant changes in the viscosity of samples, while mechanical characterization of cast and drawn films shows increased moduli, extensibility, and strength upon drawing. A 15 min extraction time results in degraded fibroin that generates the strongest films. Structural analysis by wide-angle X-ray scattering (WAXS) and Fourier transform infrared spectroscopy (FTIR) indicates molecular alignment in the drawn films and shows that the drawing process converts amorphous films into the crystalline, β-sheet secondary structure. Most interestingly, by using selected extraction times, films with near-native crystallinity, alignment, and molecular weight can be achieved; yet maximal mechanical properties for films from regenerated silk fibroin solutions are found with solutions subjected to some degree of degradation. These results suggest that the regenerated solutions and the film casting and drawing processes introduce more complexity than native spinning processes. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Automobile shredded residue valorisation by hydrometallurgical metal recovery.

    PubMed

    Granata, Giuseppe; Moscardini, Emanuela; Furlani, Giuliana; Pagnanelli, Francesca; Toro, Luigi

    2011-01-15

    The aim of this work was to develop a hydrometallurgical process to recover metals from automobile shredded residue (or car fluff). Automobile shredded residue (ASR) was characterised by particle size distribution, total metal content and metal speciation in order to guide the choice of target metals and the operating conditions of leaching. Characterisation results showed that Fe is the most abundant metal in the waste, while Zn was the second most abundant metal in the fraction with diameter lower than 500 μm. Sequential extractions showed that Zn was easily extractable by weak acid attack, while Fe and Al required a strong acid attack to be removed. In order to recover zinc from the <500 μm fraction, leaching tests were performed using acetic acid, sulphuric acid and sodium hydroxide at different concentrations. Sulphuric acid gave the highest zinc extraction yield, while acetic acid gave the highest extraction selectivity for zinc; sodium hydroxide was intermediate between the two. Zn recovery by electrowinning from the acetic leach liquor achieved a 95% electrodeposition yield in 1 h, while the sulphuric leach liquor gave yields of 40% in 1 h and 50% in 2 h. Simulation results showed that the sulphuric leaching process was more attractive than the acetic leaching process. Copyright © 2010 Elsevier B.V. All rights reserved.

  19. Establishing Priorities for Postsecondary Energy-Related Technology Programs

    ERIC Educational Resources Information Center

    Brooking, Walter J.

    1977-01-01

    Data from a Shell Oil Company forecast of national energy requirements through 1990 and from a national invitational conference on energy-related postsecondary programs are presented under the following headings: Coal mining beneficiation and processing, petroleum extraction and refining, nuclear power production, solar energy, and energy…

  20. Effects of herbal ointment containing the leaf extracts of Madeira vine (Anredera cordifolia (Ten.) Steenis) for burn wound healing process on albino rats

    PubMed Central

    Yuniarti, Wiwik Misaco; Lukiswanto, Bambang Sektiari

    2017-01-01

    Aim: Skin burns are a health problem that requires fast and accurate treatment; if not well treated, a burn will cause various damaging conditions for the patient. The leaf extract of Madeira vine (Anredera cordifolia (Ten.) Steenis), popularly known as Binahong in Indonesia, has been used to treat various diseases. The purpose of this research was to determine the effects of leaf extracts of Madeira vine (A. cordifolia (Ten.) Steenis) on the skin burn healing process in rats as an animal model. Materials and Methods: There were four treatment groups, G0, G1, G2, and G3, each consisting of five rats. All rats were given skin burns using hot metal plates. Sulfadiazine was then given to G0, 2.5% leaf extract of Madeira vine to G1, 5% extract to G2, and 10% extract to G3, topically, 3 times a day, for 14 consecutive days. At the end of the treatment period, skin excisions were conducted and histopathological examination was carried out. Result: Microscopic observation of the wound healing process with respect to collagen deposition, polymorphonuclear infiltration, angiogenesis, and fibrosis showed that G2 differed significantly from G0, G1, and G3 (p<0.05), while G0 was significantly different from G1 and G3 (p<0.05). The better burn healing in G2 was presumably due to the activity of the flavonoids, saponins, and tannins contained in the Madeira vine, which have antioxidant, anti-inflammatory, and antibacterial effects. Conclusion: The ointment from the 5% leaf extract of Madeira vine (A. cordifolia (Ten.) Steenis) proved effective for topical burn therapy. PMID:28831227

  1. EliXR-TIME: A Temporal Knowledge Representation for Clinical Research Eligibility Criteria.

    PubMed

    Boland, Mary Regina; Tu, Samson W; Carini, Simona; Sim, Ida; Weng, Chunhua

    2012-01-01

    Effective clinical text processing requires accurate extraction and representation of temporal expressions. Multiple temporal information extraction models have been developed, but a similar need to extract temporal expressions from eligibility criteria (e.g., for eligibility determination) remains. We identified the temporal knowledge representation requirements of eligibility criteria by reviewing 100 temporal criteria. We developed EliXR-TIME, a frame-based representation designed to support semantic annotation of temporal expressions in eligibility criteria by reusing applicable classes from well-known clinical temporal knowledge representations. We used EliXR-TIME to analyze a training set of 50 new temporal eligibility criteria. We evaluated EliXR-TIME using an additional random sample of 20 eligibility criteria with temporal expressions that have no overlap with the training data, yielding 92.7% (76/82) inter-coder agreement on sentence chunking and 72% (72/100) agreement on semantic annotation. We conclude that this knowledge representation can facilitate semantic annotation of the temporal expressions in eligibility criteria.
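The inter-coder agreement figures quoted above are simple proportions of matching annotations, computed directly from the reported counts:

```python
# Percent agreement from raw counts, using the figures reported above
# (76 of 82 sentence chunks; 72 of 100 semantic annotations).

def percent_agreement(agree, total):
    return 100.0 * agree / total

chunking = percent_agreement(76, 82)     # sentence chunking, ~92.7%
semantic = percent_agreement(72, 100)    # semantic annotation, 72%
```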

  2. Efficient microplastics extraction from sand. A cost effective methodology based on sodium iodide recycling.

    PubMed

    Kedzierski, Mikaël; Le Tilly, Véronique; César, Guy; Sire, Olivier; Bruzaud, Stéphane

    2017-02-15

    Evaluating microplastics pollution on shores requires overcoming the technological and economic challenge of efficiently extracting plastics from sand. The recovery of dense microplastics requires the use of NaI solutions, a costly process. The aim of this study was to decrease this cost by recycling the NaI solutions and to determine the impact of NaI storage. To study the recyclability of NaI, the solution density and the salt mass were monitored over ten life cycles. Density, pH and salt mass were measured for 40 days to assess the storage effect. The results show that NaI solutions are recyclable without any density alteration, with a total loss of 35.9% after the 10 cycles of use. During storage, chemical reactions may appear but are reversible. Consequently, the use of recycling methods allows for a significant cost reduction. The extent to which plastic extraction by dense solutions is representative is also discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.
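A back-of-envelope sketch of the cost argument: the study reports a 35.9% total NaI loss over ten use cycles. Assuming the lost salt is simply replaced by makeup NaI (an assumption of this sketch, not a calculation from the paper) and comparing against preparing a fresh batch per extraction, the relative salt consumption is:

```python
# Salt consumed with recycling (one batch plus makeup for the reported loss)
# relative to using a fresh batch for every extraction.

def salt_cost_fraction(total_loss, cycles):
    return (1.0 + total_loss) / cycles

frac = salt_cost_fraction(total_loss=0.359, cycles=10)   # roughly 14% of the salt
```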

  3. A review on methods of regeneration of spent pickling solutions from steel processing.

    PubMed

    Regel-Rosocka, Magdalena

    2010-05-15

    The review presents various techniques for the regeneration of spent pickling solutions, including methods with acid recovery, such as diffusion dialysis, electrodialysis, membrane electrolysis and membrane distillation, evaporation, precipitation and spray roasting, as well as those with both acid and metal recovery: ion exchange, retardation, crystallization, and solvent and membrane extraction. Advantages and disadvantages of the techniques are presented, discussed and confronted with the best available techniques (BAT) requirements. Most of the methods presented meet the BAT requirements. The best available techniques are electrodialysis, diffusion dialysis and crystallization; however, in practice spray roasting and retardation/ion exchange are applied most frequently for spent pickling solution regeneration. Solvent extraction, non-dispersive solvent extraction and membrane distillation should be singled out as techniques "waiting for their chance", because they are well investigated and developed. The environmental and economic benefits of the methods presented in the review depend on the cost of chemicals and wastewater treatment, legislative regulations, and the cost of modernizing existing technologies or implementing new ones. Copyright (c) 2009 Elsevier B.V. All rights reserved.

  4. Extraction of Water from Polar Lunar Permafrost with Microwaves - Dielectric Property Measurements

    NASA Technical Reports Server (NTRS)

    Ethridge, Edwin C.; Kaukler, William

    2009-01-01

    Remote sensing indicates the presence of hydrogen-rich regions associated with the lunar poles. The logical hypothesis is that there is cryogenically trapped water ice located in craters at the lunar poles. Some of the craters have been in permanent darkness for a billion years. The presence of water at the poles, as well as other scientific advantages of a polar base, has influenced NASA plans for the lunar outpost. The lunar outpost has water and oxygen requirements on the order of 1 ton per year, scaling up to as much as 10 tons per year. Microwave heating of the frozen permafrost has unique advantages for water extraction. Proof-of-principle experiments have successfully demonstrated that microwaves will couple to cryogenic soil in a vacuum and that the sublimed water vapor can be captured on a cold trap. The dielectric properties of lunar soil will determine the hardware requirements for extraction processes. Microwave-frequency dielectric properties of lunar soil simulant have been measured.

  5. Extraction methods and food uses of a natural red colorant from dye sorghum.

    PubMed

    Akogou, Folachodé Ug; Kayodé, Ap Polycarpe; den Besten, Heidy Mw; Linnemann, Anita R

    2018-01-01

    The interest in stable natural colorants for food applications continues to grow. A red pigment extracted from the leaf sheaths of a sorghum variety (Sorghum bicolor) with a high content of apigeninidin is widely used as a biocolorant in processed foods in West Africa. This study compared the colour and anthocyanin composition from traditional extraction methods to determine options for improvement and use of the red biocolorant from dye sorghum in the food sector. Sorghum biocolorant was commonly applied in fermented and heated foods. Traditional extraction methods predominantly differed in two aspects, namely the use of an alkaline rock salt (locally known as kanwu) and the temperature of the extraction water. Cool extraction using the alkaline ingredient was more efficient than hot alkaline and hot aqueous extractions in extracting anthocyanins. The apigeninidin content was three times higher in the cool and hot alkaline extracts than in the aqueous extract. Cool and hot alkaline extractions at pH 8-9 were the most efficient methods for extracting apigeninidin from dye sorghum leaf sheaths. Broader use of the sorghum biocolorant in foods requires further research on its effects on nutrient bioavailability and antioxidant activity. © 2017 Society of Chemical Industry.

  6. Ambiguity in the processing of Mandarin Chinese relative clauses: One factor cannot explain it all

    PubMed Central

    Mansbridge, Michael P.; Tamaoka, Katsuo; Xiong, Kexin; Verdonschot, Rinus G.

    2017-01-01

    This study addresses the question of whether native Mandarin Chinese speakers process and comprehend subject-extracted relative clauses (SRC) more readily than object-extracted relative clauses (ORC) in Mandarin Chinese. Presently, this has been a hotly debated issue, with various studies producing contrasting results. Using two eye-tracking experiments with ambiguous and unambiguous RCs, this study shows that both ORCs and SRCs have different processing requirements depending on the locus and time course during reading. The results reveal that ORC reading was possibly facilitated by linear/temporal integration and canonicity. On the other hand, similarity-based interference made ORCs more difficult, and expectation-based processing was more prominent for unambiguous ORCs. Overall, RC processing in Mandarin should not be broken down to a single ORC (dis)advantage, but understood as multiple interdependent factors influencing whether ORCs are either more difficult or easier to parse depending on the task and context at hand. PMID:28594939

  7. The Physics and Chemistry of Marine Aerosols

    NASA Astrophysics Data System (ADS)

    Russell, Lynn M.

    Understanding the physics and chemistry of the marine atmosphere requires both predicting the evolution of its gas and aerosol phases and making observations that reflect the processes in that evolution. This work presents a model of the most fundamental physical and chemical processes important in the marine atmosphere, and discusses the current uncertainties in our theoretical understanding of those processes. Backing up these predictions with observations requires improved instrumentation for field measurements of aerosols. One important advance in this instrumentation, which accelerates the speed of size distribution measurements, is described. Observations of aerosols in the marine boundary layer during the Atlantic Stratocumulus Transition Experiment (ASTEX) illustrate the impact of cloud processing in marine stratus. More advanced measurements aboard aircraft were enabled by redesigning the system for separating particles by differential mobility and counting them by condensational growth. With this instrumentation, observations made during the Monterey Area Ship Tracks (MAST) Experiment illustrated the role of ships' aerosol emissions in forming tracks in clouds. High-resolution gas chromatography and mass spectrometry were used with samples obtained by supercritical fluid extraction to identify the role of combustion organics in forming ship tracks. The results illustrate the need both for more sophisticated models incorporating organic species in cloud activation and for more extensive boundary layer observations.

  8. An improved filter elution and cell culture assay procedure for evaluating public groundwater systems for culturable enteroviruses.

    PubMed

    Dahling, Daniel R

    2002-01-01

    Large-scale virus studies of groundwater systems require practical and sensitive procedures for both sample processing and viral assay. Filter adsorption-elution procedures have traditionally been used to process large-volume water samples for viruses. In this study, five filter elution procedures using cartridge filters were evaluated for their effectiveness in processing samples. Of the five procedures tested, the third method, which incorporated two separate beef extract elutions (one being an overnight filter immersion in beef extract), recovered 95% of seeded poliovirus compared with recoveries of 36 to 70% for the other methods. For viral enumeration, an expanded roller bottle quantal assay was evaluated using seeded poliovirus. This cytopathic-based method was considerably more sensitive than the standard plaque assay method. The roller bottle system was more economical than the plaque assay for the evaluation of comparable samples. Using roller bottles required less time and manipulation than the plaque procedure and greatly facilitated the examination of large numbers of samples. The combination of the improved filter elution procedure and the roller bottle assay for viral analysis makes large-scale virus studies of groundwater systems practical. This procedure was subsequently field tested during a groundwater study in which large-volume samples (exceeding 800 L) were processed through the filters.

  9. Wide coverage biomedical event extraction using multiple partially overlapping corpora

    PubMed Central

    2013-01-01

    Background Biomedical events are key to understanding physiological processes and disease, and wide coverage extraction is required for comprehensive automatic analysis of statements describing biomedical systems in the literature. In turn, the training and evaluation of extraction methods requires manually annotated corpora. However, as manual annotation is time-consuming and expensive, any single event-annotated corpus can only cover a limited number of semantic types. Although combined use of several such corpora could potentially allow an extraction system to achieve broad semantic coverage, there has been little research into learning from multiple corpora with partially overlapping semantic annotation scopes. Results We propose a method for learning from multiple corpora with partial semantic annotation overlap, and implement this method to improve our existing event extraction system, EventMine. An evaluation using seven event annotated corpora, including 65 event types in total, shows that learning from overlapping corpora can produce a single, corpus-independent, wide coverage extraction system that outperforms systems trained on single corpora and exceeds previously reported results on two established event extraction tasks from the BioNLP Shared Task 2011. Conclusions The proposed method allows the training of a wide-coverage, state-of-the-art event extraction system from multiple corpora with partial semantic annotation overlap. The resulting single model makes broad-coverage extraction straightforward in practice by removing the need to either select a subset of compatible corpora or semantic types, or to merge results from several models trained on different individual corpora. Multi-corpus learning also allows annotation efforts to focus on covering additional semantic types, rather than aiming for exhaustive coverage in any single annotation effort, or extending the coverage of semantic types annotated in existing corpora. PMID:23731785
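One common way to learn from corpora whose annotation scopes only partially overlap, in the spirit of the method above, is to mask supervision for event types a corpus never annotated, so that absence of a label is not treated as a negative example. The function below is a hypothetical illustration of that masking idea, not EventMine's actual implementation.

```python
# Supervision signal for one (example, event_type) pair under partial
# annotation overlap: +1 positive, 0 negative, None when the corpus never
# annotated that type (masked, i.e. no evidence either way).

def training_signal(example_labels, corpus_scope, event_type):
    if event_type not in corpus_scope:
        return None                 # masked: outside this corpus's scope
    return 1 if event_type in example_labels else 0
```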

  10. Retrieval of radiology reports citing critical findings with disease-specific customization.

    PubMed

    Lacson, Ronilda; Sugarbaker, Nathanael; Prevedello, Luciano M; Ivan, Ip; Mar, Wendy; Andriole, Katherine P; Khorasani, Ramin

    2012-01-01

    Communication of critical results from diagnostic procedures between caregivers is a Joint Commission national patient safety goal. Evaluating critical result communication often requires manual analysis of voluminous data, especially when reviewing unstructured textual results of radiologic findings. Information retrieval (IR) tools can facilitate this process by enabling automated retrieval of radiology reports that cite critical imaging findings. However, IR tools that have been developed for one disease or imaging modality often need substantial reconfiguration before they can be utilized for another disease entity. This paper: 1) describes the process of customizing two Natural Language Processing (NLP) and Information Retrieval/Extraction applications - an open-source toolkit, A Nearly New Information Extraction system (ANNIE), and an application developed in-house, Information for Searching Content with an Ontology-Utilizing Toolkit (iSCOUT) - to illustrate the varying levels of customization required for different disease entities; and 2) evaluates each application's performance in identifying and retrieving radiology reports citing critical imaging findings for three distinct diseases: pulmonary nodule, pneumothorax, and pulmonary embolus. Both applications can be utilized for retrieval. iSCOUT and ANNIE had precision values between 0.90 and 0.98 and recall values between 0.79 and 0.94. ANNIE had consistently higher precision but required more customization. Understanding the customizations involved in utilizing NLP applications for various diseases will enable users to select the most suitable tool for specific tasks.
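
    The reported precision and recall figures follow the standard IR definitions; a minimal sketch (a hypothetical helper, not part of either toolkit) shows how they would be computed from retrieved versus truly relevant report identifiers:

```python
def precision_recall(retrieved, relevant):
    """Precision = fraction of retrieved reports that are relevant;
    recall = fraction of relevant reports that were retrieved."""
    retrieved, relevant = set(retrieved), set(relevant)
    tp = len(retrieved & relevant)  # true positives
    precision = tp / len(retrieved) if retrieved else 0.0
    recall = tp / len(relevant) if relevant else 0.0
    return precision, recall
```

For example, retrieving reports {r1, r2, r3} when the relevant set is {r1, r2, r4} yields precision and recall of 2/3 each.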

  11. Retrieval of Radiology Reports Citing Critical Findings with Disease-Specific Customization

    PubMed Central

    Lacson, Ronilda; Sugarbaker, Nathanael; Prevedello, Luciano M; Ivan, IP; Mar, Wendy; Andriole, Katherine P; Khorasani, Ramin

    2012-01-01

    Background: Communication of critical results from diagnostic procedures between caregivers is a Joint Commission national patient safety goal. Evaluating critical result communication often requires manual analysis of voluminous data, especially when reviewing unstructured textual results of radiologic findings. Information retrieval (IR) tools can facilitate this process by enabling automated retrieval of radiology reports that cite critical imaging findings. However, IR tools that have been developed for one disease or imaging modality often need substantial reconfiguration before they can be utilized for another disease entity. Purpose: This paper: 1) describes the process of customizing two Natural Language Processing (NLP) and Information Retrieval/Extraction applications – an open-source toolkit, A Nearly New Information Extraction system (ANNIE), and an application developed in-house, Information for Searching Content with an Ontology-Utilizing Toolkit (iSCOUT) – to illustrate the varying levels of customization required for different disease entities; and 2) evaluates each application's performance in identifying and retrieving radiology reports citing critical imaging findings for three distinct diseases: pulmonary nodule, pneumothorax, and pulmonary embolus. Results: Both applications can be utilized for retrieval. iSCOUT and ANNIE had precision values between 0.90 and 0.98 and recall values between 0.79 and 0.94. ANNIE had consistently higher precision but required more customization. Conclusion: Understanding the customizations involved in utilizing NLP applications for various diseases will enable users to select the most suitable tool for specific tasks. PMID:22934127

  12. An introduction to the Marshall information retrieval and display system

    NASA Technical Reports Server (NTRS)

    1974-01-01

    An on-line terminal oriented data storage and retrieval system is presented which allows a user to extract and process information from stored data bases. The use of on-line terminals for extracting and displaying data from the data bases provides a fast and responsive method for obtaining needed information. The system consists of general purpose computer programs that provide the overall capabilities of the total system. The system can process any number of data files via a Dictionary (one for each file) which describes the data format to the system. New files may be added to the system at any time, and reprogramming is not required. Illustrations of the system are shown, and sample inquiries and responses are given.

  13. Crosslinked Remote-Doped Hole-Extracting Contacts Enhance Stability under Accelerated Lifetime Testing in Perovskite Solar Cells.

    PubMed

    Xu, Jixian; Voznyy, Oleksandr; Comin, Riccardo; Gong, Xiwen; Walters, Grant; Liu, Min; Kanjanaboos, Pongsakorn; Lan, Xinzheng; Sargent, Edward H

    2016-04-13

    A crosslinked hole-extracting electrical contact is reported, which simultaneously improves the stability and lowers the hysteresis of perovskite solar cells. Polymerizable monomers and crosslinking processes are developed to obviate in situ degradation of the underlying perovskite. The crosslinked material is band-aligned with perovskite. The required free carrier density is induced by a high-work-function metal oxide layer atop the device, following a remote-doping strategy. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsoupas, Nicholaos

    The acceleration of a charged particle beam often requires the use of several acceleration stages to provide the beam with the desired energy. The extraction of the beam from one acceleration stage and its injection into the next both require a special type of magnet known as a septum magnet. Such a magnet generates a strong field in one region of space and a very low field in another, with the two regions separated by a very thin piece of material (the septum).

  15. Reductive stripping process for the recovery of uranium from wet-process phosphoric acid

    DOEpatents

    Hurst, Fred J.; Crouse, David J.

    1984-01-01

    A reductive stripping flow sheet for recovery of uranium from wet-process phosphoric acid is described. Uranium is stripped from a uranium-loaded organic phase by a redox reaction converting the uranyl to uranous ion. The uranous ion is reoxidized to the uranyl oxidation state to form an aqueous feed solution highly concentrated in uranium. Processing of this feed through a second solvent extraction cycle requires far less stripping reagent as compared to a flow sheet which does not include the reductive stripping reaction.

  16. Parallel processing of real-time dynamic systems simulation on OSCAR (Optimally SCheduled Advanced multiprocessoR)

    NASA Technical Reports Server (NTRS)

    Kasahara, Hironori; Honda, Hiroki; Narita, Seinosuke

    1989-01-01

    Parallel processing of real-time dynamic systems simulation on a multiprocessor system named OSCAR is presented. In the simulation of dynamic systems, the same calculations are generally repeated every time step. However, the Do-all and Do-across techniques cannot be applied to parallel processing of the simulation, since there are data dependencies from the end of one iteration to the beginning of the next, and data input and output are required every sampling period. Therefore, parallelism inside the calculation required for a single time step, that is, within a large basic block consisting of arithmetic assignment statements, must be exploited. In the proposed method, near-fine-grain tasks, each consisting of one or more floating point operations, are generated to extract the parallelism from the calculation and are assigned to processors by optimal static scheduling at compile time, in order to reduce the large run-time overhead that the use of near-fine-grain tasks would otherwise incur. The practicality of the scheme is demonstrated on OSCAR (Optimally SCheduled Advanced multiprocessoR), which has been developed to exploit the advantages of static scheduling algorithms to the maximum extent.
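
    The flavor of compile-time task assignment can be conveyed with a simple list-scheduling sketch (an illustrative greedy heuristic, not OSCAR's actual scheduling algorithm): tasks with known costs and precedence constraints are statically assigned to whichever processor can start them earliest.

```python
# Hypothetical static list scheduler for near-fine-grain tasks.
def list_schedule(tasks, deps, cost, n_procs):
    """tasks: task ids in topological order; deps: task -> set of predecessor
    ids; cost: task -> execution time. Returns task -> (proc, start, finish)."""
    proc_free = [0.0] * n_procs  # time at which each processor becomes idle
    sched = {}
    for t in tasks:
        # A task is ready once all of its predecessors have finished.
        ready = max((sched[d][2] for d in deps.get(t, ())), default=0.0)
        # Pick the processor offering the earliest feasible start time.
        p = min(range(n_procs), key=lambda i: max(proc_free[i], ready))
        start = max(proc_free[p], ready)
        sched[t] = (p, start, start + cost[t])
        proc_free[p] = start + cost[t]
    return sched
```

Because the assignment is fixed at compile time, no run-time scheduling decisions are needed, which is the point of the static approach described above.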

  17. Process for producing fluid fuel from coal

    DOEpatents

    Hyde, Richard W.; Reber, Stephen A.; Schutte, August H.; Nadkarni, Ravindra M.

    1977-01-01

    Process for producing fluid fuel from coal. Moisture-free coal in particulate form is slurried with a hydrogen-donor solvent and the heated slurry is charged into a drum wherein the pressure is so regulated as to maintain a portion of the solvent in liquid form. During extraction of the hydrocarbons from the coal, additional solvent is added to agitate the drum mass and keep it up to temperature. Subsequently, the pressure is released to vaporize the solvent and at least a portion of the hydrocarbons extracted. The temperature of the mass in the drum is then raised under conditions required to crack the hydrocarbons in the drum and to produce, after subsequent stripping, a solid coke residue. The hydrocarbon products are removed and fractionated into several cuts, one of which is hydrotreated to form the required hydrogen-donor solvent while other fractions can be hydrotreated or hydrocracked to produce a synthetic crude product. The heaviest fraction can be used to produce ash-free coke especially adapted for hydrogen manufacture. The process can be made self-sufficient in hydrogen and furnishes as a by-product a solid carbonaceous material with a useful heating value.

  18. Development of Novel Method for Rapid Extract of Radionuclides from Solution Using Polymer Ligand Film

    NASA Astrophysics Data System (ADS)

    Rim, Jung H.

    Accurate and fast determination of the activity of radionuclides in a sample is critical for nuclear forensics and emergency response. Radioanalytical techniques are well established for radionuclide measurement; however, they are slow and labor intensive, requiring extensive radiochemical separations and purification prior to analysis. Given these limitations of current methods, there is great interest in a new technique for rapidly processing samples. This dissertation describes a new analyte extraction medium called Polymer Ligand Film (PLF) developed to rapidly extract radionuclides. A Polymer Ligand Film is a polymer medium with ligands incorporated in its matrix that selectively and rapidly extract analytes from a solution. The main focus of the new technique is to shorten and simplify the procedure necessary to chemically isolate radionuclides for determination by alpha spectrometry or beta counting. Five different ligands were tested for plutonium extraction: bis(2-ethylhexyl) methanediphosphonic acid (H2DEH[MDP]), di(2-ethyl hexyl) phosphoric acid (HDEHP), trialkyl methylammonium chloride (Aliquat-336), 4,4'(5')-di-t-butylcyclohexano 18-crown-6 (DtBuCH18C6), and 2-ethylhexyl 2-ethylhexylphosphonic acid (HEH[EHP]). The ligands that were effective for plutonium extraction were further studied for uranium extraction. Plutonium recovery by PLFs showed a dependency on nitric acid concentration and the ligand-to-total-mass ratio. H2DEH[MDP] PLFs performed best at the 1:10 and 1:20 ratios: 50.44% and 47.61% of plutonium were extracted on the surface of the PLFs with 1M nitric acid for the 1:10 and 1:20 PLF, respectively. HDEHP PLF provided the best combination of alpha spectroscopy resolution and plutonium recovery with the 1:5 PLF when used with 0.1M nitric acid. The overall analyte recovery was lower than for electrodeposited samples, which typically have recoveries above 80%. 
However, PLF is designed to be a rapid, field-deployable screening technique, and consistency is more important than recovery. PLFs were also tested using blind quality control samples, and the activities were accurately measured. It is important to point out that PLFs were consistently susceptible to analytes penetrating and depositing below the surface. The internal radiation within the body of the PLF is mostly contained and did not cause excessive self-attenuation or peak broadening in alpha spectroscopy. The analyte penetration issue was beneficial in destructive analysis. H2DEH[MDP] PLF was tested with environmental samples to fully understand the capabilities and limitations of the PLF in relevant environments. The extraction system was very effective in extracting plutonium from environmental water collected from Mortandad Canyon at Los Alamos National Laboratory with minimal sample processing. Soil samples were more difficult to process than the water samples. Analytes were first leached from the soil matrices using nitric acid before processing with PLF. This approach had limitations for extracting plutonium using PLF. The soil samples from Mortandad Canyon, which are about 1% iron by weight, were effectively processed with the PLF system. Even with certain limitations of the PLF extraction system, this technique was able to considerably decrease the sample analysis time. An entire environmental sample was analyzed within one to two days. The decrease in time can be attributed to the fact that PLF replaces column chromatography and electrodeposition with a single step for preparing alpha spectrometry samples. The two-step process of column chromatography and electrodeposition takes a couple of days to a week to complete, depending on the sample. The decrease in time and the simplified procedure make this technique a unique solution for application to nuclear forensics and emergency response. 
A large number of samples can be quickly analyzed, and selected samples can be further analyzed with more sensitive techniques based on the initial data. The deployment of a PLF system as a screening method will greatly reduce the total analysis time required to obtain meaningful isotopic data for nuclear forensics applications. (Abstract shortened by UMI.)

  19. Application of fluorescence spectroscopy for on-line bioprocess monitoring and control

    NASA Astrophysics Data System (ADS)

    Boehl, Daniela; Solle, D.; Toussaint, Hans J.; Menge, M.; Renemann, G.; Lindemann, Carsten; Hitzmann, Bernd; Scheper, Thomas-Helmut

    2001-02-01

    Modern bioprocess control requires fast data acquisition and in-time evaluation of bioprocess variables. On-line fluorescence spectroscopy for data acquisition, combined with chemometric methods, meets these requirements. The presented investigations were performed with fluorescence spectrometers covering wide ranges of excitation and emission wavelengths. By detecting several biogenic fluorophores (amino acids, coenzymes, and vitamins), a large amount of information about the state of the bioprocess is obtained. For the evaluation of the process variables, partial least squares regression is used. This technique was applied to several bioprocesses: the production of ergotamine by Claviceps purpurea, the production of t-PA (tissue plasminogen activator) by animal cells, and brewing processes. The main point of monitoring the brewing processes was to determine the process variables cell count and extract concentration.
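
    Partial least squares regression, the chemometric workhorse mentioned above, can be sketched in its simplest one-component PLS1 (NIPALS) form; this is a generic textbook sketch, not the authors' calibration code, and the function name is hypothetical:

```python
# One-component PLS1 via NIPALS: project X onto the direction most covariant
# with y, then regress y on that single latent score.
def pls1_one_component(X, y):
    n, p = len(X), len(X[0])
    # Center predictors and response.
    xm = [sum(row[j] for row in X) / n for j in range(p)]
    ym = sum(y) / n
    Xc = [[row[j] - xm[j] for j in range(p)] for row in X]
    yc = [v - ym for v in y]
    # Weight vector w proportional to X^T y, normalized.
    w = [sum(Xc[i][j] * yc[i] for i in range(n)) for j in range(p)]
    norm = sum(v * v for v in w) ** 0.5
    w = [v / norm for v in w]
    # Latent scores t = Xc w, and regression of y on t.
    t = [sum(Xc[i][j] * w[j] for j in range(p)) for i in range(n)]
    tt = sum(v * v for v in t)
    q = sum(t[i] * yc[i] for i in range(n)) / tt
    # For one component, the coefficient vector is b = q * w.
    b = [q * wj for wj in w]
    intercept = ym - sum(b[j] * xm[j] for j in range(p))
    return b, intercept
```

In practice, full fluorescence spectra (many correlated excitation/emission channels) would be the X matrix and a process variable such as extract concentration the y, with several latent components rather than one.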

  20. Coupled Reactions "versus" Connected Reactions: Coupling Concepts with Terms

    ERIC Educational Resources Information Center

    Aledo, Juan Carlos

    2007-01-01

    A hallmark of living matter is its ability to extract and transform energy from the environment. Not surprisingly, biology students are required to take thermodynamics. The necessity of coupling exergonic reactions to endergonic processes is easily grasped by most undergraduate students. However, when addressing the thermodynamic concept of…

  1. Prosodic Encoding in Silent Reading.

    ERIC Educational Resources Information Center

    Wilkenfeld, Deborah

    In silent reading, short-memory tasks, such as semantic and syntactic processing, require a stage of phonetic encoding between visual representation and the actual extraction of meaning, and this encoding includes prosodic as well as segmental features. To test for this suprasegmental coding, an experiment was conducted in which subjects were…

  2. Antimicrobial and Antibiofilm Activities of Citrus Water-Extracts Obtained by Microwave-Assisted and Conventional Methods.

    PubMed

    Caputo, Leonardo; Quintieri, Laura; Cavalluzzi, Maria Maddalena; Lentini, Giovanni; Habtemariam, Solomon

    2018-06-17

    Citrus pomace is a huge agro-food industrial waste mostly composed of peels and traditionally used as compost or animal feed. Owing to its high content of compounds beneficial to humans (e.g., flavonoids, phenol-like acids, and terpenoids), citrus waste is increasingly used to produce valuable supplements, fragrances, or antimicrobials. However, such processes require sustainable and efficient extraction strategies based on solvent-free techniques for environmentally friendly good practices. In this work, we evaluated the antimicrobial and antibiofilm activity of water extracts of three citrus peels (orange, lemon, and citron) against ten different bacteria of sanitary relevance. Both conventional extraction with hot water (HWE) and microwave-assisted extraction (MAE) were used. Even though no extract fully inhibited the growth of the target bacteria, the latter (mostly pseudomonads) showed a significant reduction in biofilm biomass. The most active extracts were obtained from orange and lemon peel by using MAE at 100 °C for 8 min. These results showed that citrus peel water infusions obtained by MAE may reduce biofilm formation, possibly enhancing the susceptibility of sanitary-relevant bacteria to disinfection procedures.

  3. Microwave-Assisted Extraction for Microalgae: From Biofuels to Biorefinery

    PubMed Central

    Pandhal, Jagroop

    2018-01-01

    The commercial reality of bioactive compounds and oil production from microalgal species is constrained by the high cost of production. Downstream processing, which includes harvesting and extraction, can account for 70–80% of the total cost of production. Consequently, from an economic perspective extraction technologies need to be improved. Microalgal cells are difficult to disrupt due to polymers within their cell wall such as algaenan and sporopollenin. Consequently, solvents and disruption devices are required to obtain products of interest from within the cells. Conventional techniques used for cell disruption and extraction are expensive and are often hindered by low efficiencies. Microwave-assisted extraction offers a possibility for extraction of biochemical components including lipids, pigments, carbohydrates, vitamins and proteins, individually and as part of a biorefinery. Microwave technology has advanced since its use in the 1970s. It can cut down working times and result in higher yields and purity of products. In this review, the ability and challenges in using microwave technology are discussed for the extraction of bioactive products individually and as part of a biorefinery approach. PMID:29462888

  4. Characterizing DebriSat Fragments: So Many Fragments, So Much Data, and So Little Time

    NASA Technical Reports Server (NTRS)

    Shiotani, B.; Rivero, M.; Carrasquilla, M.; Allen, S.; Fitz-Coy, N.; Liou, J.-C.; Huynh, T.; Sorge, M.; Cowardin, H.; Opiela, J.; hide

    2017-01-01

    To improve prediction accuracy, the DebriSat project was conceived by NASA and DoD to update existing standard break-up models. Updating standard break-up models requires detailed fragment characteristics such as physical size, material properties, bulk density, and ballistic coefficient. For the DebriSat project, a representative modern LEO spacecraft was developed and subjected to a laboratory hypervelocity impact test, and all generated fragments with at least one dimension greater than 2 mm are collected, characterized, and archived. Since the beginning of the characterization phase of the DebriSat project, over 130,000 fragments have been collected, and approximately 250,000 fragments are expected to be collected in total, a three-fold increase over the 85,000 fragments predicted by the current break-up model. The challenge throughout the project has been to ensure the integrity and accuracy of the characteristics of each fragment. To this end, the post-hypervelocity-impact test activities, which include fragment collection, extraction, and characterization, have been designed to minimize handling of the fragments. The procedures for fragment collection, extraction, and characterization were painstakingly designed and implemented to maintain the post-impact state of the fragments, thus ensuring the integrity and accuracy of the characterization data. Each process is designed to expedite the accumulation of data; however, the need for speed is restrained by the need to protect the fragments. Methods to expedite the process, such as parallel processing, have been explored and implemented while continuing to maintain the highest integrity and value of the data. To minimize fragment handling, automated systems have been developed and implemented. Errors due to human input are also minimized by the use of these automated systems. 
This paper discusses the processes and challenges involved in the collection, extraction, and characterization of the fragments, as well as the time required to complete these processes. The objective is to provide the orbital debris community with an understanding of the scale of the effort required to generate and archive high-quality data and metadata for each debris fragment 2 mm or larger generated by the DebriSat project.

  5. Cryogenics free production of hyperpolarized 129Xe and 83Kr for biomedical MRI applications

    NASA Astrophysics Data System (ADS)

    Hughes-Riley, Theodore; Six, Joseph S.; Lilburn, David M. L.; Stupic, Karl F.; Dorkes, Alan C.; Shaw, Dominick E.; Pavlovskaya, Galina E.; Meersmann, Thomas

    2013-12-01

    As an alternative to cryogenic gas handling, hyperpolarized (hp) gas mixtures were extracted directly from the spin exchange optical pumping (SEOP) process through expansion followed by compression to ambient pressure for biomedical MRI applications. The omission of cryogenic gas separation generally requires the usage of high xenon or krypton concentrations at low SEOP gas pressures to generate hp 129Xe or hp 83Kr with sufficient MR signal intensity for imaging applications. Two different extraction schemes for the hp gasses were explored with focus on the preservation of the nuclear spin polarization. It was found that an extraction scheme based on an inflatable, pressure controlled balloon is sufficient for hp 129Xe handling, while 83Kr can efficiently be extracted through a single cycle piston pump. The extraction methods were tested for ex vivo MRI applications with excised rat lungs. Precise mixing of the hp gases with oxygen, which may be of interest for potential in vivo applications, was accomplished during the extraction process using a piston pump. The 83Kr bulk gas phase T1 relaxation in the mixtures containing more than approximately 1% O2 was found to be slower than that of 129Xe in corresponding mixtures. The experimental setup also facilitated 129Xe T1 relaxation measurements as a function of O2 concentration within excised lungs.

  6. Cryogenics free production of hyperpolarized 129Xe and 83Kr for biomedical MRI applications☆

    PubMed Central

    Hughes-Riley, Theodore; Six, Joseph S.; Lilburn, David M.L.; Stupic, Karl F.; Dorkes, Alan C.; Shaw, Dominick E.; Pavlovskaya, Galina E.; Meersmann, Thomas

    2013-01-01

    As an alternative to cryogenic gas handling, hyperpolarized (hp) gas mixtures were extracted directly from the spin exchange optical pumping (SEOP) process through expansion followed by compression to ambient pressure for biomedical MRI applications. The omission of cryogenic gas separation generally requires the usage of high xenon or krypton concentrations at low SEOP gas pressures to generate hp 129Xe or hp 83Kr with sufficient MR signal intensity for imaging applications. Two different extraction schemes for the hp gasses were explored with focus on the preservation of the nuclear spin polarization. It was found that an extraction scheme based on an inflatable, pressure controlled balloon is sufficient for hp 129Xe handling, while 83Kr can efficiently be extracted through a single cycle piston pump. The extraction methods were tested for ex vivo MRI applications with excised rat lungs. Precise mixing of the hp gases with oxygen, which may be of interest for potential in vivo applications, was accomplished during the extraction process using a piston pump. The 83Kr bulk gas phase T1 relaxation in the mixtures containing more than approximately 1% O2 was found to be slower than that of 129Xe in corresponding mixtures. The experimental setup also facilitated 129Xe T1 relaxation measurements as a function of O2 concentration within excised lungs. PMID:24135800

  7. Cryogenics free production of hyperpolarized 129Xe and 83Kr for biomedical MRI applications.

    PubMed

    Hughes-Riley, Theodore; Six, Joseph S; Lilburn, David M L; Stupic, Karl F; Dorkes, Alan C; Shaw, Dominick E; Pavlovskaya, Galina E; Meersmann, Thomas

    2013-12-01

    As an alternative to cryogenic gas handling, hyperpolarized (hp) gas mixtures were extracted directly from the spin exchange optical pumping (SEOP) process through expansion followed by compression to ambient pressure for biomedical MRI applications. The omission of cryogenic gas separation generally requires the usage of high xenon or krypton concentrations at low SEOP gas pressures to generate hp (129)Xe or hp (83)Kr with sufficient MR signal intensity for imaging applications. Two different extraction schemes for the hp gasses were explored with focus on the preservation of the nuclear spin polarization. It was found that an extraction scheme based on an inflatable, pressure controlled balloon is sufficient for hp (129)Xe handling, while (83)Kr can efficiently be extracted through a single cycle piston pump. The extraction methods were tested for ex vivo MRI applications with excised rat lungs. Precise mixing of the hp gases with oxygen, which may be of interest for potential in vivo applications, was accomplished during the extraction process using a piston pump. The (83)Kr bulk gas phase T1 relaxation in the mixtures containing more than approximately 1% O2 was found to be slower than that of (129)Xe in corresponding mixtures. The experimental setup also facilitated (129)Xe T1 relaxation measurements as a function of O2 concentration within excised lungs. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.

  8. Extraction of brewer's yeasts using different methods of cell disruption for practical biodiesel production.

    PubMed

    Řezanka, Tomáš; Matoulková, Dagmar; Kolouchová, Irena; Masák, Jan; Viden, Ivan; Sigler, Karel

    2015-05-01

    The methods of preparation of fatty acids from brewer's yeast and their use in the production of biofuels and in different branches of industry are described. Isolation of fatty acids from cell lipids includes cell disintegration (e.g., with liquid nitrogen, KOH, NaOH, petroleum ether, nitrogenous basic compounds, etc.) and subsequent processing of the extracted lipids, including analysis of fatty acids and computation of biodiesel properties such as viscosity, density, cloud point, and cetane number. Methyl esters obtained from brewer's waste yeast are well suited for the production of biodiesel. All 49 samples (7 breweries and 7 methods) meet the requirements for biodiesel quality in both the composition of fatty acids and the fuel properties required by the US and EU standards.

  9. Modelling elderly cardiac patients decision making using Cognitive Work Analysis: identifying requirements for patient decision aids.

    PubMed

    Dhukaram, Anandhi Vivekanandan; Baber, Chris

    2015-06-01

    Patients make various healthcare decisions on a daily basis. Such day-to-day decision making can have significant consequences for their own health, treatment, care, and costs. While decision aids (DAs) provide effective support in enhancing patients' decision making, to date there have been few studies examining patients' decision-making processes or exploring how an understanding of such decision processes can aid in extracting requirements for the design of DAs. This paper applies Cognitive Work Analysis (CWA) to analyse patients' decision making in order to inform requirements for supporting self-care decision making. This study uses focus groups to elicit information from elderly cardiovascular disease (CVD) patients concerning a range of decision situations they face on a daily basis. Specifically, the focus groups addressed issues related to decision making in CVD in terms of medication compliance, pain, diet, and exercise. The results of these focus groups are used to develop high-level views using CWA. The CWA framework decomposes the complex decision-making problem to inform three approaches to DA design: one based on high-level requirements; one based on a normative model of decision making for patients; and a third based on a range of heuristics that patients seem to use. CWA helps in extracting and synthesising decision making from different perspectives: decision processes, work organisation, patient competencies, and strategies used in decision making. As decision making can be influenced by human behaviour such as skills, rules, and knowledge, it is argued that patients require support for different types of decision making. This paper also provides insights for designers in using the CWA framework for the design of effective DAs to support patients in self-management. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  10. Breckinridge Project, initial effort

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1982-01-01

    The project cogeneration plant supplies electric power, process steam and treated boiler feedwater for use by the project plants. The plant consists of multiple turbine generators and steam generators connected to a common main steam header. The major plant systems which are required to produce steam, electrical power and treated feedwater are discussed individually. The systems are: steam, steam generator, steam generator fuel, condensate and feedwater deaeration, condensate and blowdown collection, cooling water, boiler feedwater treatment, coal handling, ash handling (fly ash and bottom ash), electrical, and control system. The plant description is based on the Phase Zero design basis established for Plant 31 in July of 1980 and the steam/condensate balance as presented on Drawing 31-E-B-1. Updating of steam requirements as more refined process information becomes available has generated some changes in the steam balance. Boiler operation with these updated requirements is reflected on Drawing 31-D-B-1A. The major impact of updating has been that less 600 psig steam generated within the process units requires more extraction steam from the turbine generators to close the 600 psig steam balance. Since the 900 psig steam generation from the boilers was fixed at 1,200,000 lb/hr, the additional extraction steam required to close the 600 psig steam balance decreased the quantity of electrical power available from the turbine generators. In the next phase of engineering work, the production of 600 psig steam will be augmented by increasing convection bank steam generation in the Plant 3 fired heaters by 140,000 to 150,000 lb/hr. This modification will allow full rated power generation from the turbine generators.
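
    The steam-balance logic described above (less process-generated 600 psig steam means more turbine extraction steam, and hence less electrical output) can be sketched as a trivial balance relation; the function and the example figures are hypothetical illustrations, not values from the record beyond the relation itself:

```python
# Minimal sketch of closing a steam-header balance: the turbine generators
# must supply whatever 600 psig demand the process units do not generate.
def extraction_steam_required(demand_600, process_generation_600):
    """Extraction steam (lb/hr) needed to close the 600 psig balance."""
    return max(0.0, demand_600 - process_generation_600)
```

Increasing convection-bank generation in the fired heaters raises `process_generation_600`, reducing the extraction draw and freeing the turbines for full rated power generation.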

  11. Advanced processing for high-bandwidth sensor systems

    NASA Astrophysics Data System (ADS)

    Szymanski, John J.; Blain, Phil C.; Bloch, Jeffrey J.; Brislawn, Christopher M.; Brumby, Steven P.; Cafferty, Maureen M.; Dunham, Mark E.; Frigo, Janette R.; Gokhale, Maya; Harvey, Neal R.; Kenyon, Garrett; Kim, Won-Ha; Layne, J.; Lavenier, Dominique D.; McCabe, Kevin P.; Mitchell, Melanie; Moore, Kurt R.; Perkins, Simon J.; Porter, Reid B.; Robinson, S.; Salazar, Alfonso; Theiler, James P.; Young, Aaron C.

    2000-11-01

    Compute performance and algorithm design are key problems of image processing and scientific computing in general. For example, imaging spectrometers are capable of producing data in hundreds of spectral bands with millions of pixels. These data sets show great promise for remote sensing applications, but require new and computationally intensive processing. The goal of the Deployable Adaptive Processing Systems (DAPS) project at Los Alamos National Laboratory is to develop advanced processing hardware and algorithms for high-bandwidth sensor applications. The project has produced electronics for processing multi- and hyper-spectral sensor data, as well as LIDAR data, while employing processing elements using a variety of technologies. The project team is currently working on reconfigurable computing technology and advanced feature extraction techniques, with an emphasis on their application to image and RF signal processing. This paper presents reconfigurable computing technology and advanced feature extraction algorithm work and their application to multi- and hyperspectral image processing. Related projects on genetic algorithms as applied to image processing will be introduced, as will the collaboration between the DAPS project and the DARPA Adaptive Computing Systems program. Further details are presented in other talks during this conference and in other conferences taking place during this symposium.

  12. Production of Oxygen from Lunar Regolith by Molten Oxide Electrolysis

    NASA Technical Reports Server (NTRS)

    Curreri, Peter A.

    2009-01-01

    This paper describes the use of the molten oxide electrolysis (MOE) process for the extraction of oxygen for life support and propellant, and of silicon and metallic elements for use in fabrication on the Moon. The Moon is rich in mineral resources, but it is almost devoid of chemical reducing agents; therefore, molten oxide electrolysis is ideal for extraction, since the electron is the only practical reducing agent. MOE has several advantages over other extraction methods. First, electrolytic processing offers uncommon versatility in its insensitivity to feedstock composition. Secondly, oxide melts boast the twin key attributes of the highest solubilizing capacity for regolith and the lowest volatility of any candidate electrolytes. The former is critical in ensuring high productivity, since cell current is limited by reactant solubility, while the latter simplifies cell design by obviating the need for a gas-tight reactor to contain evaporation losses, as would be the case with a gas- or liquid-phase fluoride reagent operating at such high temperatures. Furthermore, MOE requires no import of consumable reagents (e.g. fluorine and carbon) as other processes do, and does not rely on interfacing multiple processes to obtain refined products. Electrolytic processing has the advantage of selectivity of reaction in the presence of a multi-component feed. Products from lunar regolith can be extracted in sequence according to the stabilities of their oxides, as expressed by the values of the free energy of oxide formation (e.g. chromium, manganese, Fe, Si, Ti, Al, magnesium, and calcium). Previous work has demonstrated the viability of producing Fe and oxygen from oxide mixtures similar in composition to lunar regolith by molten oxide electrolysis (electrowinning), also called magma electrolysis, and has shown electrolytic extraction of Si from regolith simulant.
This paper describes recent advances in demonstrating the MOE process in a joint project with participation by NASA KSC and MSFC, Ohio State University, and MIT. Progress in measuring cell efficiency for oxygen production, in developing non-reacting electrodes, and in cell feeding and withdrawal will be discussed.
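
    The oxide-stability ordering mentioned above (Cr, Mn, Fe, Si, Ti, Al, Mg, Ca) can be expressed as a simple sort on free energies of oxide formation. The numeric values below are illustrative placeholders chosen only to reproduce that ordering; real values come from thermochemical tables at the melt temperature.

```python
# Sketch: the least stable oxide (least negative free energy of
# formation) is reduced first during sequential electrolysis.
# These dG values are illustrative placeholders, not measured data.
gibbs_oxide_formation_kj = {
    "Cr": -490.0, "Mn": -500.0, "Fe": -510.0, "Si": -640.0,
    "Ti": -710.0, "Al": -850.0, "Mg": -940.0, "Ca": -1000.0,
}

def extraction_sequence(dg):
    """Order elements from least stable oxide to most stable."""
    return sorted(dg, key=lambda element: dg[element], reverse=True)

sequence = extraction_sequence(gibbs_oxide_formation_kj)
```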

  13. A rapid and efficient assay for extracting DNA from fungi

    USGS Publications Warehouse

    Griffin, Dale W.; Kellogg, C.A.; Peak, K.K.; Shinn, E.A.

    2002-01-01

    Aims: A method for the rapid extraction of fungal DNA from small quantities of tissue in a batch-processing format was investigated. Methods and Results: Tissue (< 3.0 mg) was scraped from freshly-grown fungal isolates. The tissue was suspended in buffer AP1 and subjected to seven rounds of freeze/thaw using a crushed dry ice/ethanol bath and a boiling water bath. After a 30 min boiling step, the tissue was quickly ground against the wall of the microfuge tube using a sterile pipette tip. The Qiagen DNeasy Plant Tissue Kit protocol was then used to purify the DNA for PCR/sequencing applications. Conclusions: The method allowed batch DNA extraction from multiple fungal isolates using a simple yet rapid and reliable assay. Significance and Impact of the Study: Use of this assay will allow researchers to obtain DNA from fungi quickly for use in molecular assays that previously required specialized instrumentation, were time-consuming or were not conducive to batch processing.

  14. Scenes from the past: initial investigation of early jurassic vertebrate fossils with multidetector CT.

    PubMed

    Bolliger, Stephan A; Ross, Steffen; Thali, Michael J; Hostettler, Bernhard; Menkveld-Gfeller, Ursula

    2012-01-01

    The study of fossils permits the reconstruction of past life on our planet and enhances our understanding of evolutionary processes. However, many fossils are difficult to recognize, being encased in a lithified matrix whose tedious removal is required before examination is possible. The authors describe the use of multidetector computed tomography (CT) in locating, identifying, and examining fossil remains of crocodilians (Mesosuchia) embedded in hard shale, all without removing the matrix. In addition, they describe how three-dimensional (3D) reformatted CT images provided details that were helpful for extraction and preparation. Multidetector CT can help experienced paleontologists localize and characterize fossils in the matrix of a promising rock specimen in a nondestructive manner. Moreover, with its capacity to generate highly accurate 3D images, multidetector CT can help determine whether the fossils warrant extraction and can assist in planning the extraction process. Thus, multidetector CT may well become an invaluable tool in the field of paleoradiology.

  15. Concurrent evolution of feature extractors and modular artificial neural networks

    NASA Astrophysics Data System (ADS)

    Hannak, Victor; Savakis, Andreas; Yang, Shanchieh Jay; Anderson, Peter

    2009-05-01

    This paper presents a new approach for the design of feature-extracting recognition networks that do not require expert knowledge in the application domain. Feature-Extracting Recognition Networks (FERNs) are composed of interconnected functional nodes (feurons), which serve as feature extractors, and are followed by a subnetwork of traditional neural nodes (neurons) that act as classifiers. A concurrent evolutionary process (CEP) is used to search the space of feature extractors and neural networks in order to obtain an optimal recognition network that simultaneously performs feature extraction and recognition. By constraining the hill-climbing search functionality of the CEP on specific parts of the solution space, i.e., individually limiting the evolution of feature extractors and neural networks, it was demonstrated that concurrent evolution is a necessary component of the system. Application of this approach to a handwritten digit recognition task illustrates that the proposed methodology is capable of producing recognition networks that perform in line with other methods without the need for expert knowledge in image processing.
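
    The idea of mutating the feature-extraction stage and the classification stage together can be shown with a toy hill climb. This is a deliberately minimal stand-in, assuming a one-parameter "feuron" (a threshold) and a one-parameter "neuron" (a decision boundary); it is not the FERN architecture itself.

```python
import random

# Toy concurrent evolution: mutate the extractor threshold and the
# classifier boundary together, keeping any joint mutation that does
# not lower accuracy on the training set.
random.seed(0)
samples = [[random.random() for _ in range(8)] for _ in range(60)]
labels = [1 if sum(x) / len(x) > 0.5 else 0 for x in samples]

def accuracy(threshold, boundary):
    correct = 0
    for x, y in zip(samples, labels):
        feature = sum(1 for v in x if v > threshold)      # "feuron"
        correct += (1 if feature > boundary else 0) == y  # "neuron"
    return correct / len(samples)

threshold, boundary = 0.9, 1.0          # deliberately poor start
best = accuracy(threshold, boundary)
for _ in range(500):
    t = threshold + random.gauss(0, 0.05)
    b = boundary + random.gauss(0, 0.5)
    if (a := accuracy(t, b)) >= best:
        threshold, boundary, best = t, b, a
```

    Freezing either parameter and mutating only the other is the "constrained" search the authors use to show that evolving both stages concurrently is necessary.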

  16. Remote measurement methods for 3-D modeling purposes using BAE Systems' Software

    NASA Astrophysics Data System (ADS)

    Walker, Stewart; Pietrzak, Arleta

    2015-06-01

    Efficient, accurate data collection from imagery is the key to an economical generation of useful geospatial products. Incremental developments of traditional geospatial data collection and the arrival of new image data sources cause new software packages to be created and existing ones to be adjusted to enable such data to be processed. In the past, BAE Systems' digital photogrammetric workstation, SOCET SET®, met fin de siècle expectations in data processing and feature extraction. Its successor, SOCET GXP®, addresses today's photogrammetric requirements and new data sources. SOCET GXP is an advanced workstation for mapping and photogrammetric tasks, with automated functionality for triangulation, Digital Elevation Model (DEM) extraction, orthorectification and mosaicking, feature extraction and creation of 3-D models with texturing. BAE Systems continues to add sensor models to accommodate new image sources, in response to customer demand. New capabilities added in the latest version of SOCET GXP facilitate modeling, visualization and analysis of 3-D features.

  17. a Geographic Data Gathering System for Image Geolocalization Refining

    NASA Astrophysics Data System (ADS)

    Semaan, B.; Servières, M.; Moreau, G.; Chebaro, B.

    2017-09-01

    Image geolocalization has become an important research field during the last decade. This field is divided into two main sections. The first is image geolocalization, used to find out which country, region or city an image belongs to. The second is refining image localization for uses that require more accuracy, such as augmented reality and three-dimensional environment reconstruction from images. In this paper we present a processing chain that gathers geographic data from several sources in order to deliver a better geolocalization for an image than its GPS tag provides, along with precise camera pose parameters. To do so, we use multiple types of data. Some of this information is visible in the image and is extracted using image processing; other data can be extracted from image file headers or from related information on online image-sharing platforms. These extracted information elements are not expressive enough if they remain disconnected. We show that grouping them helps find the best geolocalization of the image.
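
    One simple way to group location cues once they are extracted is inverse-variance weighting, sketched below. The cue types and uncertainty figures are illustrative assumptions, not the paper's actual processing chain.

```python
# Fuse several (lat, lon, sigma_in_meters) estimates by weighting
# each cue by 1/sigma^2, so precise cues dominate coarse ones.

def fuse_estimates(estimates):
    weights = [1.0 / (sigma * sigma) for _, _, sigma in estimates]
    total = sum(weights)
    lat = sum(w * e[0] for w, e in zip(weights, estimates)) / total
    lon = sum(w * e[1] for w, e in zip(weights, estimates)) / total
    return lat, lon

cues = [
    (47.2170, -1.5530, 30.0),   # GPS tag from EXIF, ~30 m
    (47.2185, -1.5542, 10.0),   # landmark match, ~10 m
    (47.2100, -1.5600, 200.0),  # sharing-platform metadata, coarse
]
lat, lon = fuse_estimates(cues)
```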

  18. A multi-approach feature extractions for iris recognition

    NASA Astrophysics Data System (ADS)

    Sanpachai, H.; Settapong, M.

    2014-04-01

    Biometrics is a promising technique used to identify individual traits and characteristics. Iris recognition is one of the most reliable biometric methods. As iris texture and color are fully developed within a year of birth, they remain unchanged throughout a person's life, unlike fingerprints, which can be altered by several factors including accidental damage, dry or oily skin and dust. Although iris recognition has been studied for more than a decade, there are limited commercial products available due to its demanding requirements, such as camera resolution, hardware size, expensive equipment and computational complexity. However, at the present time, technology has overcome these obstacles. Iris recognition proceeds through several sequential steps, which include pre-processing, feature extraction, post-processing, and a matching stage. In this paper, we adopted the directional high-low pass filter for feature extraction. A box-counting fractal dimension and iris code have been proposed as feature representations. Our approach has been tested on the CASIA Iris Image database and the results are considered successful.
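
    Of the two feature representations, the box-counting fractal dimension is easy to sketch: cover a binary image with boxes of decreasing size and fit the slope of log N(s) against log(1/s). The code below is a generic implementation of that estimator, not the paper's exact pipeline.

```python
import numpy as np

def box_counting_dimension(img, sizes=(1, 2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a 2-D boolean array as the
    slope of log N(s) versus log(1/s), where N(s) counts s-by-s boxes
    containing at least one foreground pixel."""
    counts = []
    for s in sizes:
        h, w = img.shape
        # reduce each s x s block to "occupied or not"
        blocks = img[:h - h % s, :w - w % s].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

# Sanity check: a filled square is 2-dimensional.
dim = box_counting_dimension(np.ones((64, 64), dtype=bool))
```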

  19. Atomic vapor laser isotope separation of lead-210 isotope

    DOEpatents

    Scheibner, K.F.; Haynam, C.A.; Johnson, M.A.; Worden, E.F.

    1999-08-31

    An isotopically selective laser process and apparatus for removal of Pb-210 from natural lead that involves a one-photon near-resonant, two-photon resonant excitation of one or more Rydberg levels, followed by field ionization and then electrostatic extraction. The wavelength to the near-resonant intermediate state is counter-propagated with respect to the second wavelength required to populate the final Rydberg state. This scheme takes advantage of the large first excited state cross section, and only modest laser fluences are required. The non-resonant process helps to avoid two problems: first, stimulated Raman gain due to the nearby F=3/2 hyperfine component of Pb-207 and, second, direct absorption of the first transition process light by Pb-207. 5 figs.

  20. Atomic vapor laser isotope separation of lead-210 isotope

    DOEpatents

    Scheibner, Karl F.; Haynam, Christopher A.; Johnson, Michael A.; Worden, Earl F.

    1999-01-01

    An isotopically selective laser process and apparatus for removal of Pb-210 from natural lead that involves a one-photon near-resonant, two-photon resonant excitation of one or more Rydberg levels, followed by field ionization and then electrostatic extraction. The wavelength to the near-resonant intermediate state is counter-propagated with respect to the second wavelength required to populate the final Rydberg state. This scheme takes advantage of the large first excited state cross section, and only modest laser fluences are required. The non-resonant process helps to avoid two problems: first, stimulated Raman gain due to the nearby F=3/2 hyperfine component of Pb-207 and, second, direct absorption of the first transition process light by Pb-207.

  1. Striatal degeneration impairs language learning: evidence from Huntington's disease.

    PubMed

    De Diego-Balaguer, R; Couette, M; Dolbeau, G; Dürr, A; Youssov, K; Bachoud-Lévi, A-C

    2008-11-01

    Although the role of the striatum in language processing is still largely unclear, a number of recent proposals have outlined its specific contribution. Different studies report evidence converging on a picture in which the striatum may be involved in those aspects of rule application requiring non-automatized behaviour. This is the main characteristic of the earliest phases of language acquisition, which require the online detection of distant dependencies and the creation of syntactic categories by means of rule learning. Learning of sequences and categorization processes in non-language domains is known to require striatal recruitment. Thus, we hypothesized that the striatum should play a prominent role in the extraction of rules when learning a language. We studied 13 pre-symptomatic gene-carriers (pre-HD) and 22 early-stage Huntington's disease patients, both groups characterized by progressive degeneration of the striatum, and 21 late-stage Huntington's disease patients (18 stage II, two stage III and one stage IV), in whom cortical degeneration accompanies striatal degeneration. When presented with a simplified artificial language from which words and rules could be extracted, early-stage Huntington's disease patients (stage I) were impaired on the learning test, demonstrating a greater impairment in rule than in word learning compared with the 20 age- and education-matched controls. Huntington's disease patients at later stages were impaired on both word and rule learning. While spared in their overall performance, gene-carriers who had learned a set of abstract artificial language rules were impaired in the transfer of those rules to similar artificial language structures. Correlation analyses among several neuropsychological tests assessing executive function showed that rule learning correlated with tests requiring working memory and attentional control, while word learning correlated with a test involving episodic memory.
These learning impairments significantly correlated with the bicaudate ratio. The overall results support striatal involvement in rule extraction from speech and suggest that language acquisition requires several aspects of memory and executive functions for word and rule learning.

  2. An energy-saving glutathione production method from low-temperature cooked rice using amylase-expressing Saccharomyces cerevisiae.

    PubMed

    Hara, Kiyotaka Y; Kim, Songhee; Kiriyama, Kentaro; Yoshida, Hideyo; Arai, Shogo; Ishii, Jun; Ogino, Chiaki; Fukuda, Hideki; Kondo, Akihiko

    2012-05-01

    Glutathione is a valuable tripeptide that is widely used in the pharmaceutical, food, and cosmetic industries. Glutathione is industrially produced by fermentation using Saccharomyces cerevisiae. Before the glutathione fermentation process with S. cerevisiae, a glucose extraction process from starchy materials is required. This glucose extraction is usually carried out by converting starchy materials to starch using high-temperature cooking and subsequent hydrolysis by amylases to convert starch to glucose. In this study, to develop an energy-saving glutathione production process by reducing energy consumption during the cooking step, we efficiently produced glutathione from low-temperature cooked rice using amylase-expressing S. cerevisiae. The combination of the amylase-expressing yeast with low-temperature cooking is potentially applicable to a variety of energy-saving bio-production methods of chemicals from starchy bio-resources. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Perceiving age and gender in unfamiliar faces: brain potential evidence for implicit and explicit person categorization.

    PubMed

    Wiese, Holger; Schweinberger, Stefan R; Neumann, Markus F

    2008-11-01

    We used repetition priming to investigate implicit and explicit processes of unfamiliar face categorization. During prime and test phases, participants categorized unfamiliar faces according to either age or gender. Faces presented at test were either new or primed in a task-congruent (same task during priming and test) or incongruent (different tasks) condition. During age categorization, reaction times revealed significant priming for both priming conditions, and event-related potentials yielded an increased N170 over the left hemisphere as a result of priming. During gender categorization, congruent faces elicited priming and a latency decrease in the right N170. Accordingly, information about age is extracted irrespective of processing demands, and priming facilitates the extraction of feature information reflected in the left N170 effect. By contrast, priming of gender categorization may depend on whether the task at initial presentation requires configural processing.

  4. Pneumatic Regolith Transfer Systems for In-Situ Resource Utilization

    NASA Technical Reports Server (NTRS)

    Mueller, Robert P.; Townsend, Ivan I., III; Mantovani, James G.

    2010-01-01

    One aspect of In-Situ Resource Utilization (ISRU) in a lunar environment is to extract oxygen and other elements from the minerals that make up the lunar regolith. Typical ISRU oxygen production processes include but are not limited to hydrogen reduction, carbothermal processing and molten oxide electrolysis. All of these processes require the transfer of regolith from a supply hopper into a reactor for chemical reaction processing, and the subsequent extraction of the reacted regolith from the reactor. This paper will discuss recent activities in the NASA ISRU project involved with developing pneumatic conveying methods to achieve lunar regolith simulant transfer under 1-g and 1/6-g gravitational environments. Examples will be given of hardware that has been developed and tested by NASA on reduced gravity flights. Lessons learned and details of pneumatic regolith transfer systems will be examined, as well as the relative performance in a 1/6-g environment.

  5. [Extraction of artemisinin and synthesis of its derivates artesunate and artemether].

    PubMed

    Chekem, L; Wierucki, S

    2006-12-01

    Artemisinin is extracted from Artemisia annua, a shrub also known as sweet wormwood that was used in traditional medicine in Asia for more than 1500 years. Recent studies in numerous malarious zones have demonstrated the effectiveness of artemisinin and have reported no evidence of the resistance now associated with almost all other antimalarials on the market. Despite its remarkable activity, artemisinin is not accessible to many patients due to high cost. This situation confronts all players in the fight against malaria with the urgent need to develop a simple process to produce massive supplies of artemisinin and its derivatives at an affordable price. The purpose of the study described here was to develop a simple, cost-effective method that could be used by all professionals to extract artemisinin and transform it into artesunate or artemether. Artemisinin was extracted with dichloromethane and purified on the basis of variations in polarity and in the hydrophile/lipophile balance of solvents. Transformation into artesunate was a two-step process involving reduction to dihydroartemisinin using diisobutylaluminium hydride (DIBAL) followed by esterification using succinic anhydride. Artemether was obtained from dihydroartemisinin using boron trifluoride. Extraction using dichloromethane presents several advantages. Since dichloromethane is not explosive it can be safely transported and used for extraction on farms where Artemisia annua is grown. Evaporation and recovery of dichloromethane is relatively easy so that it can be re-used. These advantages result in a significant decrease in purchasing and shipping costs. Extraction on the farm eliminates the expense and facilities that would otherwise be required to transport and store leaves at the laboratory (250 kg of leaves yield 4 to 5 kg of raw artemisinin extract that yields approximately 1 kg of pure artemisinin). 
The low-cost process described here is feasible for any pharmaceutical laboratory including those in developing countries.

  6. Top Value Added Chemicals From Biomass. Volume 1 - Results of Screening for Potential Candidates From Sugars and Synthesis Gas

    DTIC Science & Technology

    2004-08-01

    Hydrogenation of sugars or extraction from biomass pretreatment processes. Very few if any commercial processes. Non-nutritive sweeteners ... and no commercial production of arabinitol. Xylitol is used as a non-nutritive sweetener. The technology required to convert the five carbon sugars ... Top Value Added Chemicals from Biomass Volume I—Results of Screening for Potential Candidates from Sugars and Synthesis Gas Produced by

  7. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    PubMed

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.
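
    A digital trait in this setting is just a number computed per image. The sketch below extracts one such trait (projected shoot area in pixels) with a green-dominance rule; the rule and the synthetic image are illustrative, not Image Harvest's actual code.

```python
import numpy as np

def projected_area(rgb):
    """Count pixels where the green channel dominates red and blue,
    a common rough segmentation heuristic for plant material."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return int(((g > r) & (g > b)).sum())

# Synthetic 10x10 image: a 4x4 "plant" patch on a gray background.
img = np.full((10, 10, 3), 120, dtype=np.uint8)
img[3:7, 3:7] = (40, 180, 40)
area = projected_area(img)
```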

  8. Statistical Design Model (SDM) of satellite thermal control subsystem

    NASA Astrophysics Data System (ADS)

    Mirshams, Mehran; Zabihian, Ehsan; Aarabi Chamalishahi, Mahdi

    2016-07-01

    A satellite's thermal control subsystem has the main task of keeping the satellite's components at their survival and operating temperatures. The performance of thermal control plays a key role in satisfying a satellite's operational requirements, and designing this subsystem is part of satellite design. On the other hand, owing to the lack of information published by companies, designers still do not have a specific design process for this subsystem, although it is one of the fundamental ones. The aim of this paper is to identify and extract statistical design models of the spacecraft thermal control subsystem by using the SDM design method, which analyses statistical data with a particular procedure. To implement the SDM method, a complete database is required. Therefore, we first collect spacecraft data and create a database, then extract statistical graphs using Microsoft Excel, from which we further extract mathematical models. The input parameters of the method are the mass, mission, and lifetime of the satellite. To this end, the thermal control subsystem is first introduced, and the hardware used in this subsystem and its variants is investigated. Next, different statistical models are presented and briefly compared. Finally, a particular statistical model is extracted from the collected statistical data. The accuracy of the method is verified by a case study: comparison between the specifications of the thermal control subsystem of a fabricated satellite and the analysis results proved the methodology effective. Key Words: Thermal control subsystem design, Statistical design model (SDM), Satellite conceptual design, Thermal hardware
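
    The core SDM step, extracting a mathematical model from a statistical graph, amounts to regression over the database. The sketch below fits an assumed power law, subsystem mass = a * (satellite mass)^b, in log space; the data points are synthetic, generated only to demonstrate the fitting step, not real spacecraft data.

```python
import numpy as np

# Synthetic "database": thermal-subsystem mass versus satellite mass,
# generated from an assumed power law plus 2% multiplicative noise.
rng = np.random.default_rng(1)
sat_mass = np.array([50, 120, 300, 700, 1500, 3000], dtype=float)  # kg
thermal_mass = 0.08 * sat_mass**0.9 * rng.normal(1.0, 0.02, sat_mass.size)

# Fit thermal_mass = a * sat_mass**b by least squares in log space.
b, log_a = np.polyfit(np.log(sat_mass), np.log(thermal_mass), 1)
a = float(np.exp(log_a))
```

    The fitted (a, b) pair is the "statistical design model": given only a new satellite's mass, it predicts the thermal subsystem's mass budget at the conceptual design stage.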

  9. An integrated paper-based sample-to-answer biosensor for nucleic acid testing at the point of care.

    PubMed

    Choi, Jane Ru; Hu, Jie; Tang, Ruihua; Gong, Yan; Feng, Shangsheng; Ren, Hui; Wen, Ting; Li, XiuJun; Wan Abas, Wan Abu Bakar; Pingguan-Murphy, Belinda; Xu, Feng

    2016-02-07

    With advances in point-of-care testing (POCT), lateral flow assays (LFAs) have been explored for nucleic acid detection. However, biological samples generally contain complex compositions and low amounts of target nucleic acids, and currently require laborious off-chip nucleic acid extraction and amplification processes (e.g., tube-based extraction and polymerase chain reaction (PCR)) prior to detection. To the best of our knowledge, even though the integration of DNA extraction and amplification into a paper-based biosensor has been reported, a combination of LFA with the aforementioned steps for simple colorimetric readout has not yet been demonstrated. Here, we demonstrate for the first time an integrated paper-based biosensor incorporating nucleic acid extraction, amplification and visual detection or quantification using a smartphone. A handheld battery-powered heating device was specially developed for nucleic acid amplification in POC settings, which is coupled with this simple assay for rapid target detection. The biosensor can successfully detect Escherichia coli (as a model analyte) in spiked drinking water, milk, blood, and spinach with a detection limit of as low as 10-1000 CFU mL⁻¹, and Streptococcus pneumoniae in clinical blood samples, highlighting its potential use in medical diagnostics, food safety analysis and environmental monitoring. As compared to the lengthy conventional assay, which requires more than 5 hours for the entire sample-to-answer process, our integrated biosensor takes about 1 hour. The integrated biosensor holds great potential for detection of various target analytes in wide applications in the near future.

  10. Effective Information Extraction Framework for Heterogeneous Clinical Reports Using Online Machine Learning and Controlled Vocabularies.

    PubMed

    Zheng, Shuai; Lu, James J; Ghasemzadeh, Nima; Hayek, Salim S; Quyyumi, Arshed A; Wang, Fusheng

    2017-05-09

    Extracting structured data from narrated medical reports is challenged by the complexity of heterogeneous structures and vocabularies and often requires significant manual effort. Traditional machine-based approaches lack the capability to incorporate user feedback to improve the extraction algorithm in real time. Our goal was to provide a generic information extraction framework that can support diverse clinical reports and enables a dynamic interaction between a human and a machine that produces highly accurate results. A clinical information extraction system, IDEAL-X, has been built on top of online machine learning. It processes one document at a time, and user interactions are recorded as feedback to update the learning model in real time. The updated model is used to predict values for extraction in subsequent documents. Once prediction accuracy reaches a user-acceptable threshold, the remaining documents may be batch processed. A customizable controlled vocabulary may be used to support extraction. Three datasets were used for experiments based on report styles: 100 cardiac catheterization procedure reports, 100 coronary angiographic reports, and 100 integrated reports, each combining a history and physical report, discharge summary, outpatient clinic notes, outpatient clinic letter, and inpatient discharge medication report. Data extraction was performed by 3 methods: online machine learning, controlled vocabularies, and a combination of these. The system delivers results with F1 scores greater than 95%. IDEAL-X adopts a unique online machine learning-based approach combined with controlled vocabularies to support data extraction for clinical reports. The system can quickly learn and improve, thus it is highly adaptable. ©Shuai Zheng, James J Lu, Nima Ghasemzadeh, Salim S Hayek, Arshed A Quyyumi, Fusheng Wang. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 09.05.2017.
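
    The document-at-a-time loop with immediate feedback can be sketched as follows. The memorize-the-correction "model" below is a deliberately trivial stand-in for IDEAL-X's learner; the field patterns and values are invented.

```python
# Online loop: predict a value per report, apply the (simulated)
# user's correction immediately, and reuse the updated model on the
# next report.

def online_extract(reports, truth):
    model = {}                                  # pattern -> learned value
    accepted = 0
    for doc, gold in zip(reports, truth):
        if model.get(doc["pattern"]) == gold:
            accepted += 1                       # prediction accepted as-is
        else:
            model[doc["pattern"]] = gold        # feedback updates the model
    return accepted / len(reports)

# Two report styles, five documents each (all values hypothetical):
docs = [{"pattern": "EF:"}] * 5 + [{"pattern": "LVEF ="}] * 5
gold = ["55%"] * 5 + ["40%"] * 5
acc = online_extract(docs, gold)
```

    Each new report style costs exactly one correction here, after which every later document of that style is accepted, which is the sense in which such a system "quickly learns."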

  11. Analyte stability during the total testing process: studies of vitamins A, D and E by LC-MS/MS.

    PubMed

    Albahrani, Ali A; Rotarou, Victor; Roche, Peter J; Greaves, Ronda F

    2016-10-01

    There are limited evidence-based studies demonstrating the stability of fat-soluble vitamins (FSV) measured in blood. This study aimed to examine the effects of light, temperature and time on vitamins A, D and E throughout the total testing process. Four experiments were conducted. Three investigated the sample matrix (whole blood, serum and the extracted sample) against the variables of temperature and light, and the fourth investigated the sample during the extraction process against the variable of light. All samples were analysed via our simultaneous FSV method using liquid chromatography-tandem mass spectrometry technology. The allowable clinical percentage change was calculated based on biological variation and desirable method imprecision for each analyte. The total change limit was ±7.3% for 25-OH-vitamin D3, ±11.8% for retinol and ±10.8% for α-tocopherol. Vitamins D and E were stable under the investigated conditions (concentration changes <4%) in the pre-analytical and analytical stages. Vitamin A showed photosensitivity at exposure times >48 h, with concentration changes of -6.8% (blood) and -6.5% (serum), both within the allowable clinical percentage change. By contrast, the extracted retinol sample demonstrated a concentration change of -18.4% after 48 h of light exposure. However, vitamin A in the serum and extracted solution was stable for one month when stored at -20°C. Blood samples for vitamins D and E analyses can be processed under normal laboratory conditions of lighting and temperature. The required conditions for vitamin A analysis are similar when analysis is performed within 48 h. For longer-term storage, serum and vitamin A extracts should be stored at -20°C.
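
    One common way to turn biological variation and analytical imprecision into an allowable percentage change is the reference change value (RCV). The paper's exact derivation of its ±7.3%, ±11.8% and ±10.8% limits may differ from this, and the CV inputs below are purely illustrative.

```python
import math

def reference_change_value(cv_analytical, cv_within_subject, z=1.96):
    """Two-sided 95% RCV (%): sqrt(2) * z * sqrt(CVa^2 + CVi^2)."""
    return math.sqrt(2) * z * math.hypot(cv_analytical, cv_within_subject)

# Illustrative CVs (percent), not the study's measured values:
rcv = reference_change_value(cv_analytical=3.0, cv_within_subject=5.0)
```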

  12. Optimized ultrasound-assisted extraction procedure for the analysis of opium alkaloids in papaver plants by cyclodextrin-modified capillary electrophoresis.

    PubMed

    Fakhari, Ali Reza; Nojavan, Saeed; Ebrahimi, Samad Nejad; Evenhuis, Christopher John

    2010-07-01

    This study investigated the use of ultrasound-assisted extraction to improve the extraction efficiency of morphine, codeine and thebaine from papaver plants. Extraction conditions, including solvent type, temperature, duration, and ultrasonic frequency and power level, were optimized, and the influence of different parameters on the resolution of the alkaloids in CE was studied. The optimized conditions for CE separation were a sodium phosphate buffer (100 mM, pH 3.0) containing 5 mM alpha-CD. The optimized ultrasound-assisted extraction conditions were an extraction time of 1 h and an ultrasonic frequency of 60 kHz, with water-methanol (80:20) at 40 degrees C as the extraction solvent. The LOD for the alkaloids was found to be 0.1 microg/mL at a signal-to-noise ratio of 3:1. The RSDs for peak areas were in the range of 1.4-4.4%. The amounts of opium alkaloids (mg/100 g dried sample) in four Iranian papaver plants were found to be in the range of 7.8-8.7 (morphine), 5.5-9.5 (codeine) and 1.4-10.4 (thebaine). It should be emphasized that no cleanup of the filtered extract was required; hence, direct determination after extraction drastically simplifies the analytical process.

  13. Pilot Plant Testing of Hot Gas Building Decontamination Process

    DTIC Science & Technology

    1987-10-30

    last hours of the cooldown (after water traps in the line were installed) showed no detectable contamination from this station. ... Since we will not require refrigeration, additional generators probably will not be required. Water is trucked to the site. Agent contaminated water ... surface. The gauze was handled by forceps during all of the sampling steps to prevent contamination after the solvent extraction clean-up of the gauze pads

  14. Remediating pesticide contaminated soils using solvent extraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sahle-Demessie, E.; Meckes, M.C.; Richardson, T.L.

    Bench-scale solvent extraction studies were performed on soil samples obtained from a Superfund site contaminated with high levels of p,p′-DDT, p,p′-DDE and toxaphene. The effectiveness of the solvent extraction process was assessed using methanol and 2-propanol as solvents over a wide range of operating conditions. It was demonstrated that a six-stage methanol extraction using a solvent-to-soil ratio of 1.6 can decrease pesticide levels in the soil by more than 99% and reduce the volume of material requiring further treatment by 25 times or more. The high solubility of the pesticides in methanol resulted in rapid extraction rates, with the system reaching a quasi-equilibrium state in 30 minutes. The extraction efficiency was influenced by the number of extraction stages, the solvent-to-soil ratio, and the soil moisture content. Various methods were investigated to regenerate and recycle the solvent. Evaporation and solvent stripping are low cost and reliable methods for removing high pesticide concentrations from the solvent. For low concentrations, GAC adsorption may be used. Precipitating and filtering pesticides by adding water to the methanol/pesticide solution was not successful when tested with soil extracts. 26 refs., 10 figs., 6 tabs.

  15. NOVEL BINDERS AND METHODS FOR AGGLOMERATION OF ORE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S.K. Kawatra; T.C. Eisele; J.A. Gurtler

    2004-04-01

    Many metal extraction operations, such as leaching of copper, leaching of precious metals, and reduction of metal oxides to metal in high-temperature furnaces, require agglomeration of ore to ensure that reactive liquids or gases are evenly distributed throughout the ore being processed. Agglomeration of ore into coarse, porous masses achieves this even distribution of fluids by preventing fine particles from migrating and clogging the spaces and channels between the larger ore particles. Binders are critically necessary to produce agglomerates that will not break down during processing. However, for many important metal extraction processes there are no binders known that will work satisfactorily. Primary examples of this are copper heap leaching, where there are no binders that will work in the acidic environment encountered in this process, and advanced ironmaking processes, where binders must function satisfactorily over an extraordinarily large range of temperatures (from room temperature up to over 1200°C). As a result, operators of many facilities see a large loss of process efficiency due to their inability to take advantage of agglomeration. The large quantities of ore that must be handled in metal extraction processes also mean that the binder must be inexpensive and useful at low dosages to be economical. The acid-resistant binders and agglomeration procedures developed in this project will also be adapted for use in improving the energy efficiency and performance of a broad range of mineral agglomeration applications, particularly heap leaching and advanced primary ironmaking.

  16. Soluble organic substances extracted from compost as amendments for Fenton-like oxidation of contaminated sites.

    PubMed

    Zingaretti, Daniela; Lombardi, Francesco; Baciocchi, Renato

    2018-04-01

    The Fenton process is a well-known treatment that has proved effective for the remediation of sites contaminated by a wide range of organic pollutants. Its application to soil-water systems typically requires the addition of a stabilizer, to increase the H2O2 lifetime and thus the radius of influence of the treatment, and of a chelating agent, to extract and keep in solution the iron present in the soil. However, as the use of these compounds has been debated because of their environmental impact, efforts have been made to test new "greener" amendments. In line with the concept of circular economy introduced by the European Council, in this study we tested humic acids extracted from compost as an amendment in a Fenton-like process. These substances are of potential interest because they can form complexes with metal ions and act as sorbents for hydrophobic organic compounds. Fenton-like lab-scale tests with the extracted humic acids were performed on a soil-water system artificially contaminated with chlorophenol. The results were compared with those achieved using commercial humic acids or traditional amendments (i.e. KH2PO4 or EDTA) as references. The humic acids extracted from compost achieved an H2O2 lifetime close to that obtained with a traditional stabilizing agent; they also proved effective in removing chlorophenol, with performance close to that achieved using a traditional chelating agent. These findings suggest that using humic acids extracted from wastes in a Fenton-like process could replace both the H2O2 stabilizer and the chelating agent at the same time. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Immunochemical-based method for detection of hazelnut proteins in processed foods.

    PubMed

    Ben Rejeb, Samuel; Abbott, Michael; Davies, David; Querry, Jessica; Cléroux, Chantal; Streng, Christine; Delahaut, Philippe; Yeung, Jupiter M

    2003-01-01

    A competitive enzyme-linked immunosorbent assay (ELISA) was developed to detect hazelnut by using polyclonal antibodies generated against a protein extract of roasted hazelnut. No cross-reactivity was observed in tests against 39 commodities, including many common allergens, tree nuts, and legumes. Hazelnut protein standard solutions at 0.45 ng/mL [inhibition concentration (IC80) of the competitive test] were clearly identified by the ELISA. An extraction and quantification method was developed and optimized for chocolate, cookies, breakfast cereals, and ice cream, major food commodities likely to be cross-contaminated with undeclared hazelnut during food processing. No sample cleanup was required when extracts were diluted 10-fold. Recovery results were generated with blank matrixes spiked at 4 levels from 1 to 10 microg/g hazelnut protein. With the developed extraction and sample handling procedure, hazelnut proteins were recovered at 64-83% from chocolate and at 78-97% from other matrixes. A confirmatory technique was developed with sodium dodecyl sulfate-polyacrylamide gel electrophoresis and Western transfer. The developed methods were applied to a small market survey of chocolate products and allowed the identification of undeclared hazelnut in these products.

  18. Metabolic engineering of Cyanobacteria and microalgae for enhanced production of biofuels and high-value products.

    PubMed

    Gomaa, M A; Al-Haj, L; Abed, R M M

    2016-10-01

    Extensive research has been performed on Cyanobacteria and microalgae with the aim of producing numerous biotechnological products. However, native strains have several shortcomings, such as limitations in cultivation, harvesting and product extraction, which prevent reaching optimal production value at the lowest cost. Such limitations require the intervention of genetic engineering to produce strains with superior properties. Promising advancements in the cultivation of Cyanobacteria and microalgae have been achieved by improving photosynthetic efficiency through increased RuBisCO activity and truncation of light-harvesting antennae. Genetic engineering has also contributed to final product extraction by inducing autolysis and product secretory systems, enabling direct product recovery without costly extraction steps. In this review, we summarize the different enzymes and pathways that have been targeted thus far for improving cultivation, harvesting and product extraction in Cyanobacteria and microalgae. With advances in synthetic biology, genetically engineered strains can be generated to resolve demanding process issues and achieve economic practicality. This comprehensive overview of gene modifications will be useful to researchers in the field who wish to apply them to their strains to increase yields and improve the economic feasibility of the production process. © 2016 The Society for Applied Microbiology.

  19. Use of Dimethyl Pimelimidate with Microfluidic System for Nucleic Acids Extraction without Electricity.

    PubMed

    Jin, Choong Eun; Lee, Tae Yoon; Koo, Bonhan; Choi, Kyung-Chul; Chang, Suhwan; Park, Se Yoon; Kim, Ji Yeun; Kim, Sung-Han; Shin, Yong

    2017-07-18

    The isolation of nucleic acids in a lab-on-a-chip setting is crucial to achieving the maximal effectiveness of point-of-care testing for detection in clinical applications. Here, we report on the use of a simple and versatile single-channel microfluidic platform that combines dimethyl pimelimidate (DMP) for nucleic acid (both RNA and DNA) extraction without electricity using a thin-film system. The system is based on the adaptation of DMP as a nonchaotropic nucleic acid capture reagent within a low-cost thin-film platform for use as a microfluidic total analysis system, which can be utilized for sample processing in clinical diagnostics. Moreover, we assessed the use of the DMP system for the extraction of nucleic acids from various samples, including mammalian cells, bacterial cells, and viruses from human disease, and confirmed that the quality and quantity of the extracted nucleic acids were sufficient to allow robust detection of biomarkers and/or pathogens in downstream analysis. Furthermore, the DMP system does not require any instruments or electricity, and offers improved time efficiency, portability, and affordability. Thus, we believe that the DMP system may change the paradigm of sample processing in clinical diagnostics.

  20. Naphthenic acids speciation and removal during petroleum-coke adsorption and ozonation of oil sands process-affected water.

    PubMed

    Gamal El-Din, Mohamed; Fu, Hongjing; Wang, Nan; Chelme-Ayala, Pamela; Pérez-Estrada, Leonidas; Drzewicz, Przemysław; Martin, Jonathan W; Zubot, Warren; Smith, Daniel W

    2011-11-01

    The Athabasca Oil Sands industry produces large volumes of oil sands process-affected water (OSPW) as a result of bitumen extraction and upgrading processes. Constituents of OSPW include chloride, naphthenic acids (NAs), aromatic hydrocarbons, and trace heavy metals, among other inorganic and organic compounds. To address the environmental issues associated with the recycling and/or safe return of OSPW into the environment, water treatment technologies are required. This study examined, for the first time, the impacts of pretreatment steps, including filtration and petroleum-coke adsorption, on ozonation requirements and performance. The effect of the initial OSPW pH on treatment performance, and the evolution of ozonation and its impact on OSPW toxicity and biodegradability were also examined. The degradation of more than 76% of total acid-extractable organics was achieved using a semi-batch ozonation system at a utilized ozone dose of 150 mg/L. With a utilized ozone dose of 100 mg/L, the treated OSPW became more biodegradable and showed no toxicity towards Vibrio fischeri. Changes in the NA profiles in terms of carbon number and number of rings were observed after ozonation. The filtration of the OSPW did not improve the ozonation performance. Petroleum-coke adsorption was found to be effective in reducing total acid-extractable organics by 91%, NA content by 84%, and OSPW toxicity from 4.3 to 1.1 toxicity units. The results of this study indicate that the combination of petroleum-coke adsorption and ozonation is a promising treatment approach for OSPW. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. Exploration and practice of methods and processes of evidence-based rapid review on peer review of WHO EML application.

    PubMed

    Li, Youping; Yu, Jiajie; Du, Liang; Sun, Xin; Kwong, Joey S W; Wu, Bin; Hu, Zhiqiang; Lu, Jing; Xu, Ting; Zhang, Lingli

    2015-11-01

    After 38 years of development, the procedure for selection and evaluation of the World Health Organization Essential Medicine List (WHO EML) has become increasingly scientific and formal. However, peer review of applications to the World Health Organization Essential Medicine List is always required within a short period. It is therefore necessary to build a set of methods and processes for rapid review. We identified the process of evidence-based rapid review of WHO EML applications for peer reviewers according to 11 items required when reporting the peer review results of the proposals. The most important items for the rapid review by World Health Organization Essential Medicine List peer reviewers are (1) to confirm the requirements and identify the purposes; (2) to establish the research questions and translate the questions into the 'Participants, Interventions, Comparators, Outcomes, Study design' (PICOS) format; (3) to search and screen available evidence, for which high-level evidence is preferred, such as systematic reviews or meta-analyses, health technology assessments, and clinical guidelines; (4) to extract data, extracting the primary information relevant to the purposes; (5) to synthesize data by qualitative methods, assess the quality of evidence, and compare the results; (6) to provide the answers to the applications, the quality of evidence and the strength of recommendations. Our study established a set of methods and processes for the rapid review of World Health Organization Essential Medicine List applications, and our findings were used to guide the reviewers in fulfilling the 19th World Health Organization Essential Medicine List peer review. The methods and processes were feasible and met the necessary requirements in terms of time and quality. Continuous improvement and evaluation in practice are warranted. © 2015 Chinese Cochrane Center, West China Hospital of Sichuan University and Wiley Publishing Asia Pty Ltd.

  2. Application of the remote-sensing communication model to a time-sensitive wildfire remote-sensing system

    Treesearch

    Christopher D. Lippitt; Douglas A. Stow; Philip J. Riggan

    2016-01-01

    Remote sensing for hazard response requires a priori identification of sensor, transmission, processing, and distribution methods to permit the extraction of relevant information in timescales sufficient to allow managers to make a given time-sensitive decision. This study applies and demonstrates the utility of the Remote Sensing Communication...

  3. Fast fringe pattern phase demodulation using FIR Hilbert transformers

    NASA Astrophysics Data System (ADS)

    Gdeisat, Munther; Burton, David; Lilley, Francis; Arevalillo-Herráez, Miguel

    2016-01-01

    This paper suggests the use of FIR Hilbert transformers to extract the phase of fringe patterns. This method is computationally faster than any known spatial method that produces wrapped phase maps. Also, the algorithm does not require any parameters to be adjusted that depend on the specific fringe pattern being processed, or on the particular setup of the optical fringe projection system being used. It is therefore particularly suitable for full algorithmic automation. The accuracy and validity of the suggested method have been tested using both computer-generated and real fringe patterns. The algorithm is proposed for its advantages in computational processing speed, as it is the fastest available method for extracting the wrapped phase information from a fringe pattern.
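
    The underlying idea can be sketched in a few lines: a windowed FIR Hilbert transformer produces the quadrature component of the fringe signal, and atan2 of quadrature against the original fringe yields the wrapped phase. This is a minimal one-dimensional illustration under assumed filter length and test-signal parameters, not the authors' implementation:

```python
import math

def fir_hilbert_taps(m):
    # Type-III FIR Hilbert transformer: ideal taps 2/(pi*k) for odd k,
    # zero for even k, tapered with a Hamming window of length 2*m + 1.
    taps = []
    for k in range(-m, m + 1):
        if k % 2 == 0:
            taps.append(0.0)
        else:
            window = 0.54 + 0.46 * math.cos(math.pi * k / m)
            taps.append(window * 2.0 / (math.pi * k))
    return taps

def wrapped_phase(fringe, m=40):
    # Convolve to get the quadrature signal, then atan2 against the
    # original fringe gives the wrapped phase (valid away from the edges,
    # and for carrier frequencies not too close to 0 or Nyquist).
    taps = fir_hilbert_taps(m)
    n = len(fringe)
    phase = [None] * n
    for i in range(m, n - m):
        quad = sum(taps[k + m] * fringe[i - k] for k in range(-m, m + 1))
        phase[i] = math.atan2(quad, fringe[i])
    return phase
```

For a synthetic fringe cos(2πf0·n + φ(n)) with a slowly varying φ, the recovered values agree with the true phase modulo 2π away from the filter edges.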

  5. Establishing lunar resource viability

    NASA Astrophysics Data System (ADS)

    Carpenter, J.; Fisackerly, R.; Houdou, B.

    2016-11-01

    Recent research has highlighted the potential of lunar resources as an important element of space exploration but their viability has not been demonstrated. Establishing whether or not they can be considered in future plans is a multidisciplinary effort, requiring scientific expertise and delivering scientific results. To this end various space agencies and private entities are looking to lunar resources, extracted and processed in situ, as a potentially game changing element in future space architectures, with the potential to increase scale and reduce cost. However, before any decisions can be made on the inclusion of resources in exploration roadmaps or future scenarios some big questions need to be answered about the viability of different resource deposits and the processes for extraction and utilisation. The missions and measurements that will be required to answer these questions, and which are being prepared by agencies and others, can only be performed through the engagement and support of the science community. In answering questions about resources, data and knowledge will be generated that is of fundamental scientific importance. In supporting resource prospecting missions the science community will de facto generate new scientific knowledge. Science enables exploration and exploration enables science.

  6. Practical End-to-End Performance Testing Tool for High Speed 3G-Based Networks

    NASA Astrophysics Data System (ADS)

    Shinbo, Hiroyuki; Tagami, Atsushi; Ano, Shigehiro; Hasegawa, Toru; Suzuki, Kenji

    High speed IP communication is a killer application for 3rd generation (3G) mobile systems. Thus 3G network operators should perform extensive tests to check whether the expected end-to-end performance is provided to customers under various environments. An important objective of such tests is to check whether network nodes fulfill requirements on packet-processing durations, because a long processing duration causes performance degradation. This requires testers (the persons performing the tests) to know precisely how long a packet is held by various network nodes. Without a tool's help, this task is time-consuming and error prone. We therefore propose a multi-point packet header analysis tool which extracts and records packet headers with synchronized timestamps at multiple observation points. The recorded packet headers enable testers to calculate these holding durations. The notable feature of this tool is that it is implemented on off-the-shelf hardware platforms, i.e., laptop personal computers. The key challenges of the implementation are precise clock synchronization without any special hardware and a sophisticated header extraction algorithm without any packet drops.
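
    The holding-duration calculation described above amounts to joining timestamped header records from two observation points on the packet identifier. A hedged sketch (the `(packet_id, timestamp)` record format is assumed for illustration, not the tool's actual capture schema):

```python
def holding_durations(ingress, egress):
    # ingress/egress: lists of (packet_id, timestamp_seconds) records
    # captured with synchronized clocks on either side of a network node.
    # Returns the time each packet was held inside the node.
    arrival = {pid: ts for pid, ts in ingress}
    return {pid: ts - arrival[pid]
            for pid, ts in egress if pid in arrival}
```

Packets seen only at one point (dropped or reordered beyond the capture window) are simply omitted from the result.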

  7. Ultrasonic power measurement system based on acousto-optic interaction.

    PubMed

    He, Liping; Zhu, Fulong; Chen, Yanming; Duan, Ke; Lin, Xinxin; Pan, Yongjun; Tao, Jiaquan

    2016-05-01

    Ultrasonic waves are widely used, with applications including the medical, military, and chemical fields. However, there are currently no effective methods for ultrasonic power measurement. Previously, ultrasonic power measurement has been reliant on mechanical methods such as hydrophones and radiation force balances. This paper deals with ultrasonic power measurement based on an unconventional method: acousto-optic interaction. Compared with mechanical methods, the optical method has a greater ability to resist interference and also has reduced environmental requirements. Therefore, this paper begins with an experimental determination of the acoustic power in water contained in a glass tank using a set of optical devices. Because the light intensity of the diffraction image generated by acousto-optic interaction contains the required ultrasonic power information, specific software was written to extract the light intensity information from the image through a combination of filtering, binarization, contour extraction, and other image processing operations. The power value can then be obtained rapidly by processing the diffraction image using a computer. The results of this work show that the optical method offers advantages that include accuracy, speed, and a noncontact measurement method.
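
    The image-processing chain described above (filtering, binarization, contour extraction, intensity summation) can be approximated in miniature: binarize the diffraction image, group bright pixels into connected regions, and sum each region's intensity. This is a simplified pure-Python stand-in, not the authors' software:

```python
def bright_regions(image, threshold):
    # image: 2-D list of grayscale values in [0, 1].
    # Binarize at `threshold`, find 4-connected bright regions (a crude
    # substitute for contour extraction), and return each region's summed
    # intensity, brightest first; the totals are the quantities from which
    # the diffracted optical power would be estimated.
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    regions = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                total, stack = 0.0, [(r, c)]
                seen[r][c] = True
                while stack:          # flood fill one bright region
                    y, x = stack.pop()
                    total += image[y][x]
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                regions.append(total)
    return sorted(regions, reverse=True)
```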

  9. Automated Text Markup for Information Retrieval from an Electronic Textbook of Infectious Disease

    PubMed Central

    Berrios, Daniel C.; Kehler, Andrew; Kim, David K.; Yu, Victor L.; Fagan, Lawrence M.

    1998-01-01

    The information needs of practicing clinicians frequently require textbook or journal searches. Making these sources available in electronic form improves the speed of these searches, but precision (i.e., the fraction of relevant to total documents retrieved) remains low. Improving the traditional keyword search by transforming search terms into canonical concepts does not greatly improve search precision. Kim et al. have designed and built a prototype system (MYCIN II) for computer-based information retrieval from a forthcoming electronic textbook of infectious disease. The system requires manual indexing by experts in the form of complex text markup. However, this markup process is time consuming (about 3 person-hours to generate, review, and transcribe the index for each of 218 chapters). We have designed and implemented a system to semiautomate the markup process. The system, information extraction for semiautomated indexing of documents (ISAID), uses query models and existing information-extraction tools to provide support for any user, including the author of the source material, to mark up tertiary information sources quickly and accurately.

  10. Parallel optimization of signal detection in active magnetospheric signal injection experiments

    NASA Astrophysics Data System (ADS)

    Gowanlock, Michael; Li, Justin D.; Rude, Cody M.; Pankratius, Victor

    2018-05-01

    Signal detection and extraction requires substantial manual parameter tuning at different stages in the processing pipeline. Time-series data depends on domain-specific signal properties, necessitating unique parameter selection for a given problem. The large potential search space makes this parameter selection process time-consuming and subject to variability. We introduce a technique to search and prune such parameter search spaces in parallel and select parameters for time series filters using breadth- and depth-first search strategies to increase the likelihood of detecting signals of interest in the field of magnetospheric physics. We focus on studying geomagnetic activity in the extremely and very low frequency ranges (ELF/VLF) using ELF/VLF transmissions from Siple Station, Antarctica, received at Québec, Canada. Our technique successfully detects amplified transmissions and achieves substantial speedup performance gains as compared to an exhaustive parameter search. We present examples where our algorithmic approach reduces the search from hundreds of seconds down to less than 1 s, with a ranked signal detection in the top 99th percentile, thus making it valuable for real-time monitoring. We also present empirical performance models quantifying the trade-off between the quality of signal recovered and the algorithm response time required for signal extraction. In the future, improved signal extraction in scenarios like the Siple experiment will enable better real-time diagnostics of conditions of the Earth's magnetosphere for monitoring space weather activity.
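
    The pruning idea above can be illustrated with a toy version: instead of scoring every combination in the parameter grid, a depth-first strategy tunes one parameter axis at a time while holding the others at their current best. The parameter names and score function below are hypothetical, not the filters used in the Siple experiment:

```python
import itertools

def exhaustive_search(grid, score):
    # Evaluate every combination in the parameter grid (the baseline
    # against which the pruned search is compared).
    combos = itertools.product(*grid.values())
    best = max(combos, key=lambda c: score(dict(zip(grid, c))))
    return dict(zip(grid, best))

def greedy_search(grid, score):
    # Depth-first pruning: tune one parameter at a time, holding the
    # others fixed at their current best. This needs sum(len(axis))
    # evaluations instead of the full cross product, and is exact only
    # when the score is (near-)separable across parameters.
    current = {k: v[0] for k, v in grid.items()}
    for key, values in grid.items():
        current[key] = max(values, key=lambda v: score({**current, key: v}))
    return current
```

On a separable score the greedy pass finds the same optimum as the exhaustive sweep with far fewer evaluations, which is the kind of speedup the parallel search strategies above exploit.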

  11. A modified temporal criterion to meta-optimize the extended Kalman filter for land cover classification of remotely sensed time series

    NASA Astrophysics Data System (ADS)

    Salmon, B. P.; Kleynhans, W.; Olivier, J. C.; van den Bergh, F.; Wessels, K. J.

    2018-05-01

    Humans are transforming land cover at an ever-increasing rate. Accurate geographical maps of land cover, especially rural and urban settlements, are essential to planning sustainable development. Time series extracted from MODerate resolution Imaging Spectroradiometer (MODIS) land surface reflectance products have been used to differentiate land cover classes by analyzing the seasonal patterns in reflectance values. Properly fitting a parametric model to these time series usually requires several adjustments to the regression method. To reduce the workload, the regression method's parameters are usually set globally for a whole geographical area. In this work we modified a meta-optimization approach so that the regression method's parameters are set on a per-time-series basis. The standard deviation of the model parameters and the magnitude of the residuals are used as the scoring function. We successfully fitted a triply modulated model to the seasonal patterns of our study area using a non-linear extended Kalman filter (EKF). The approach uses temporal information, which significantly reduces the processing time and storage requirements for each time series. It also derives reliability metrics for each time series individually. The features extracted using the proposed method are classified with a support vector machine, and the performance of the method is compared to the original approach on our ground truth data.
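
    A minimal sketch of the kind of EKF parameter fit described above, for a single-sinusoid seasonal model (the actual triply modulated model, scoring function, and MODIS data handling are more involved; all names, initial values, and noise settings here are illustrative assumptions):

```python
import math

def ekf_fit(obs, omega, q=1e-6, r=0.01):
    # State x = [mu, alpha, phi] for the observation model
    #   h(x, t) = mu + alpha * cos(omega * t + phi),
    # a simplified stand-in for a triply modulated seasonal model.
    x = [0.0, 1.0, 0.0]
    P = [[float(i == j) for j in range(3)] for i in range(3)]
    innovations = []
    for t, z in enumerate(obs):
        for i in range(3):            # predict: random-walk state model
            P[i][i] += q
        theta = omega * t + x[2]
        H = [1.0, math.cos(theta), -x[1] * math.sin(theta)]  # Jacobian of h
        nu = z - (x[0] + x[1] * math.cos(theta))             # innovation
        PHt = [sum(P[i][j] * H[j] for j in range(3)) for i in range(3)]
        S = sum(H[i] * PHt[i] for i in range(3)) + r
        K = [v / S for v in PHt]      # Kalman gain (scalar measurement)
        for i in range(3):
            x[i] += K[i] * nu
        P = [[P[i][j] - K[i] * sum(H[k] * P[k][j] for k in range(3))
              for j in range(3)] for i in range(3)]
        innovations.append(nu)
    return x, innovations
```

The innovation sequence doubles as a per-time-series reliability signal: if the filter has locked onto the seasonal pattern, the residuals shrink, mirroring the residual-magnitude term in the scoring function above.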

  12. Process-induced compositional changes of flaxseed.

    PubMed

    Wanasundara, P K; Shahidi, F

    1998-01-01

    Flaxseed has been used as an edible grain in different parts of the world since ancient times. However, use of flaxseed oil has been limited due to its high content of polyunsaturated fatty acids. Nonetheless, the alpha-linolenic acid, dietary fiber and lignans of flaxseed have regained attention. New varieties of flaxseed containing low levels of alpha-linolenic acid are available for edible oil extraction. Use of whole flaxseed in foods provides a means to utilise all of its nutrients and requires minimal processing. However, the presence of cyanogenic glucosides and diglucosides in the seeds is a concern, as they may release cyanide upon hydrolysis. In addition, the polyunsaturated fatty acids may undergo thermal oxidation or autooxidation when exposed to air or the high temperatures used in food preparation. Studies to date on oxidation products of intact flaxseed lipids have not shown any harmful effects when flaxseed is included, up to 28%, in baked products. Furthermore, cyanide levels produced as a result of autolysis are below the limits harmful to humans. However, the meals left after oil extraction require detoxification, by solvent extraction, to reduce the harmful effects of cyanide when used in animal rations. Flaxseed meal is a good source of proteins; these could be isolated by complexation with sodium hexametaphosphate without changing their nutritional value or composition. In addition, the effect of germination on proteins, lipids, cyanogenic glycosides, and other minor constituents of flaxseed is discussed.

  13. Uraniferous Phosphates: Resource, Security Risk, or Contaminant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LeMone, D.V.; Goodell, Ph.C.; Gibbs, S.G.

    2008-07-01

    The escalation of the price of uranium (U) yellow cake (a summer high of $130 per 0.454 kg (1 lb)) has called into question the continuing availability of sufficient stockpiles and ores to process. As in the years following World War II, the establishment and maintenance of a strategic inventory is a reasonable consideration today. It therefore becomes critical to look at potential secondary resources beyond the classical ore suites now being utilized. The most economically viable future secondary source appears to be the byproducts of the beneficiation of phosphoric acids derived from phosphate ores. Phosphorus (P) is an essential nutrient for plants; its deficiency can result in highly restrictive limitations in crop productivity. Acidic soils in tropical and subtropical regions of the world are often P deficient with high P-sorption (fixation) capacities. To correct this deficiency, efficient water-soluble P fertilizers are required. The use of raw phosphate rocks not only adds phosphate but also its contained contaminants, including uranium, to the treated land. Another immediate difficulty is phosphogypsum, the standard byproduct of simple extraction. It has, for practical purposes, been selectively classified as TENORM by regulators. The imposition of these standards presents major current and future disposal and re-utilization problems. Therefore, establishing an economically viable system that allows for uranium byproduct extraction from phosphoric acids is desirable. Such a system would depend on yellow cake base price stability, reserve estimates, political conditions, nation-state commitment, and dependence on nuclear energy. The accumulation of yellow cake from the additional extraction process provides a valuable commodity and allows the end acid to be a more environmentally acceptable product. 
The phosphogypsum already accumulated, as well as that which is in process, will not make a viable component for a radiological dispersal device (RDD). Concern about weapon proliferation by rogue nation states from the byproduct production of yellowcake is an unlikely scenario. Extracting the fissile U-235 isotope (0.7%) from the yellowcake (99.3% U-238) requires the construction of a costly gaseous diffusion or cascading centrifuge facility, which would be extremely difficult to mask. Therefore, from the viewpoints of diminished security risk and of positive economics and environmental impact, the utilization of a phosphoric acid beneficiation process extracting uranium is desirable. (authors)

  14. Isolation of Mitochondrial DNA from Single, Short Hairs without Roots Using Pressure Cycling Technology.

    PubMed

    Harper, Kathryn A; Meiklejohn, Kelly A; Merritt, Richard T; Walker, Jessica; Fisher, Constance L; Robertson, James M

    2018-02-01

    Hairs are commonly submitted as evidence to forensic laboratories, but standard nuclear DNA analysis is not always possible. Mitochondria (mt) provide another source of genetic material; however, manual isolation is laborious. In a proof-of-concept study, we assessed pressure cycling technology (PCT; an automated approach that subjects samples to varying cycles of high and low pressure) for extracting mtDNA from single, short hairs without roots. Using three microscopically similar donors, we determined the ideal PCT conditions and compared those yields to those obtained using the traditional manual micro-tissue grinder method. Higher yields were recovered from grinder extracts, but yields from PCT extracts exceeded the requirements for forensic analysis, with the DNA quality confirmed through sequencing. Automated extraction of mtDNA from hairs without roots using PCT could be useful for forensic laboratories processing numerous samples.

  15. Structural health monitoring feature design by genetic programming

    NASA Astrophysics Data System (ADS)

    Harvey, Dustin Y.; Todd, Michael D.

    2014-09-01

    Structural health monitoring (SHM) systems provide real-time damage and performance information for civil, aerospace, and other high-capital or life-safety critical structures. Conventional data processing involves pre-processing and extraction of low-dimensional features from in situ time series measurements. The features are then input to a statistical pattern recognition algorithm to perform the relevant classification or regression task necessary to facilitate decisions by the SHM system. Traditional design of signal processing and feature extraction algorithms can be an expensive and time-consuming process requiring extensive system knowledge and domain expertise. Genetic programming, a heuristic program search method from evolutionary computation, was recently adapted by the authors to perform automated, data-driven design of signal processing and feature extraction algorithms for statistical pattern recognition applications. The proposed method, called Autofead, is particularly suitable to handle the challenges inherent in algorithm design for SHM problems where the manifestation of damage in structural response measurements is often unclear or unknown. Autofead mines a training database of response measurements to discover information-rich features specific to the problem at hand. This study provides experimental validation on three SHM applications including ultrasonic damage detection, bearing damage classification for rotating machinery, and vibration-based structural health monitoring. Performance comparisons with common feature choices for each problem area are provided demonstrating the versatility of Autofead to produce significant algorithm improvements on a wide range of problems.
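    Autofead's genetic-programming search is not detailed in the abstract. Purely to convey the flavor of data-driven feature design, the following toy sketch exhaustively searches short pipelines of signal-processing operators scored by a Fisher-style class separability; every name and operator choice here is illustrative, not the authors' method:

```python
import itertools
import numpy as np

OPS = {
    "abs":  np.abs,
    "sq":   np.square,
    "diff": lambda x: np.diff(x, append=x[..., -1:]),
}
POOLS = {"mean": lambda x: x.mean(axis=-1), "std": lambda x: x.std(axis=-1)}

def separability(feature, labels):
    # Fisher-style criterion: between-class distance over within-class spread
    a, b = feature[labels == 0], feature[labels == 1]
    return (a.mean() - b.mean()) ** 2 / (a.var() + b.var() + 1e-12)

def search_features(signals, labels, depth=2):
    # Exhaustive search over short operator pipelines -- a toy stand-in
    # for a genetic-programming search over feature algorithms.
    best_score, best_pipe = -1.0, None
    for chain in itertools.product(OPS, repeat=depth):
        x = signals
        for op in chain:
            x = OPS[op](x)
        for pname, pool in POOLS.items():
            s = separability(pool(x), labels)
            if s > best_score:
                best_score, best_pipe = s, (*chain, pname)
    return best_score, best_pipe

# Two synthetic classes of time series that differ in variance
rng = np.random.default_rng(1)
signals = np.vstack([rng.normal(0, 1, (10, 64)), rng.normal(0, 3, (10, 64))])
labels = np.repeat([0, 1], 10)
best_score, best_pipe = search_features(signals, labels)
```

    Genetic programming replaces the exhaustive loop with evolutionary search (mutation and crossover over operator trees), which scales to much larger operator vocabularies than brute force can.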

  16. CLAMP - a toolkit for efficiently building customized clinical natural language processing pipelines.

    PubMed

    Soysal, Ergin; Wang, Jingqi; Jiang, Min; Wu, Yonghui; Pakhomov, Serguei; Liu, Hongfang; Xu, Hua

    2017-11-24

    Existing general clinical natural language processing (NLP) systems such as MetaMap and Clinical Text Analysis and Knowledge Extraction System have been successfully applied to information extraction from clinical text. However, end users often have to customize existing systems for their individual tasks, which can require substantial NLP skills. Here we present CLAMP (Clinical Language Annotation, Modeling, and Processing), a newly developed clinical NLP toolkit that provides not only state-of-the-art NLP components, but also a user-friendly graphic user interface that can help users quickly build customized NLP pipelines for their individual applications. Our evaluation shows that the CLAMP default pipeline achieved good performance on named entity recognition and concept encoding. We also demonstrate the efficiency of the CLAMP graphic user interface in building customized, high-performance NLP pipelines with 2 use cases, extracting smoking status and lab test values. CLAMP is publicly available for research use, and we believe it is a unique asset for the clinical NLP community. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  17. Removal of chromium(III) from aqueous waste solution by liquid-liquid extraction in a circular microchannel.

    PubMed

    Luo, Jian Hong; Li, Jun; Guo, Lei; Zhu, Xin Hua; Dai, Shuang; Li, Xing

    2017-11-01

    A new circular microchannel device has been proposed for the removal of chromium(III) from aqueous waste solution by using kerosene as a diluent and (2-ethylhexyl) 2-ethylhexyl phosphonate as an extractant. The proposed device has several advantages such as a flexible and easily adaptable design, easy maintenance, and cheap setup without the requirement of microfabrication. To study the extraction efficiency and advantages of the circular microchannel device in the removal of chromium(III), the effects of various operating conditions such as the inner diameter of the channel, the total flow velocity, the phase ratio, the initial pH of aqueous waste solution, the reaction temperature and the initial concentration of extractant on the extraction efficiency are investigated and the optimal process conditions are obtained. The results show that chromium(III) in aqueous waste solution can be effectively removed with (2-ethylhexyl) 2-ethylhexyl phosphonate in the circular microchannel. Under optimized conditions, an extraction efficiency of chromium(III) of more than 99% can be attained and the aqueous waste solution can be discharged directly, which can meet the Chinese national emission standards.
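    For reference, the reported figure of merit reduces to simple arithmetic on inlet and outlet aqueous concentrations; a minimal sketch (function names are illustrative, not from the paper):

```python
def extraction_efficiency(c_in, c_out):
    # Percent of solute removed from the aqueous phase, from inlet and
    # outlet aqueous-phase concentrations (e.g. mg/L of Cr(III)).
    return 100.0 * (c_in - c_out) / c_in

def distribution_ratio(c_org, c_aq):
    # Equilibrium ratio of solute concentration, organic vs aqueous phase.
    return c_org / c_aq

# Hypothetical numbers: 100 mg/L in, 0.8 mg/L remaining after extraction
e = extraction_efficiency(100.0, 0.8)
d_ratio = distribution_ratio(99.2, 0.8)
```

    An efficiency above 99%, as the abstract reports under optimized conditions, corresponds to a residual aqueous concentration below 1% of the feed.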

  18. Conception of Self-Construction Production Scheduling System

    NASA Astrophysics Data System (ADS)

    Xue, Hai; Zhang, Xuerui; Shimizu, Yasuhiro; Fujimura, Shigeru

    With the rapid innovation of information technology, many production scheduling systems have been developed. However, they require extensive customization for each individual production environment, making a large investment in development and maintenance indispensable. The direction in which scheduling systems are constructed should therefore change. The final objective of this research is to develop a system that builds itself by automatically extracting scheduling techniques from daily production scheduling work, so that the required investment is reduced. This extraction mechanism should apply to various production processes for interoperability. Using the master information extracted by the system, production scheduling operators can be supported in performing the scheduling work easily and accurately, without any restriction on scheduling operations. With this extraction mechanism in place, a scheduling system can be introduced without great expense for customization. In this paper, a model for expressing a scheduling problem is first proposed. Then a guideline for extracting and using the scheduling information is presented, and some applied functions based on it are also proposed.

  19. Extraction of incident irradiance from LWIR hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Lahaie, Pierre

    2014-10-01

    The atmospheric correction of thermal hyperspectral imagery can be separated into two distinct processes: atmospheric compensation (AC) and temperature and emissivity separation (TES). TES requires as input, at each pixel, the ground-leaving radiance and the atmospheric downwelling irradiance, which are the outputs of the AC process. Extracting the downwelling irradiance from imagery requires assumptions about the nature of some of the pixels, the sensor, and the atmosphere. Another difficulty is that the sensor's spectral response is often not well characterized. To deal with this unknown, we defined a spectral mean operator that is used to filter the ground-leaving radiance, together with a computation of the downwelling irradiance from MODTRAN. A user selects a number of pixels in the image for which the emissivity is assumed to be known. The emissivity of these pixels is assumed to be smooth, so that the only spectrally fast-varying quantity is the downwelling irradiance. Using these assumptions we built an algorithm to estimate the downwelling irradiance, applied to all the selected pixels. The estimated irradiance is the average over the spectral channels of the resulting computation. The algorithm performs well in simulation, and results are shown for errors in the assumed emissivity and in the atmospheric profiles. Sensor noise mainly influences the required number of pixels.

  20. On-board data management study for EOPAP

    NASA Technical Reports Server (NTRS)

    Davisson, L. D.

    1975-01-01

    The requirements, implementation techniques, and mission analysis associated with on-board data management for EOPAP were studied. SEASAT-A was used as a baseline, and the storage requirements, data rates, and information extraction requirements were investigated for each of the following proposed SEASAT sensors: a short pulse 13.9 GHz radar, a long pulse 13.9 GHz radar, a synthetic aperture radar, a multispectral passive microwave radiometer facility, and an infrared/visible very high resolution radiometer (VHRR). Rate distortion theory was applied to determine theoretical minimum data rates and compared with the rates required by practical techniques. It was concluded that practical techniques can be used which approach the theoretically optimum based upon an empirically determined source random process model. The results of the preceding investigations were used to recommend an on-board data management system for (1) data compression through information extraction, optimal noiseless coding, source coding with distortion, data buffering, and data selection under command or as a function of data activity, (2) for command handling, (3) for spacecraft operation and control, and (4) for experiment operation and monitoring.

  1. Applying high resolution remote sensing image and DEM to falling boulder hazard assessment

    NASA Astrophysics Data System (ADS)

    Huang, Changqing; Shi, Wenzhong; Ng, K. C.

    2005-10-01

    Assessing boulder fall hazard generally requires obtaining boulder information. Conventional extensive mapping and surveying fieldwork is time-consuming, laborious, and dangerous. This paper therefore proposes applying image processing technology to extract boulders and assess boulder fall hazard from high-resolution remote sensing imagery. The method can replace the conventional approach and extract boulder information with high accuracy, including boulder size, shape, and height, and the slope and aspect of each boulder's position. With this information, the assessment, prevention, and mitigation of boulder fall hazards can be supported.

  2. HTP-NLP: A New NLP System for High Throughput Phenotyping.

    PubMed

    Schlegel, Daniel R; Crowner, Chris; Lehoullier, Frank; Elkin, Peter L

    2017-01-01

    Secondary use of clinical data for research requires a method to process the data quickly so that researchers can extract cohorts. We present two advances in the High Throughput Phenotyping NLP system which support the aim of truly high-throughput processing of clinical data, inspired by a characterization of the linguistic properties of such data. Semantic indexing to store and generalize partially processed results and the use of compositional expressions for ungrammatical text are discussed, along with a set of initial timing results for the system.

  3. A Robust Gradient Based Method for Building Extraction from LiDAR and Photogrammetric Imagery.

    PubMed

    Siddiqui, Fasahat Ullah; Teng, Shyh Wei; Awrangjeb, Mohammad; Lu, Guojun

    2016-07-19

    Existing automatic building extraction methods are not effective in extracting buildings which are small in size and have transparent roofs. The application of a large area threshold prohibits detection of small buildings, and the use of ground points in generating the building mask prevents detection of transparent buildings. In addition, the existing methods use numerous parameters to extract buildings in complex environments, e.g., hilly areas and high vegetation. However, the empirical tuning of a large number of parameters reduces the robustness of building extraction methods. This paper proposes a novel Gradient-based Building Extraction (GBE) method to address these limitations. The proposed method transforms the Light Detection And Ranging (LiDAR) height information into an intensity image without interpolation of point heights and then analyses the gradient information in the image. Generally, building roof planes have a constant height change along the slope of a roof plane whereas trees have a random height change. With such an analysis, buildings of a greater range of sizes with a transparent or opaque roof can be extracted. In addition, a local colour matching approach is introduced as a post-processing stage to eliminate trees. This stage of our proposed method does not require any manual setting and all parameters are set automatically from the data. The other post-processing stages including variance, point density and shadow elimination are also applied to verify the extracted buildings, where comparatively fewer empirically set parameters are used. The performance of the proposed GBE method is evaluated on two benchmark data sets by using the object and pixel based metrics (completeness, correctness and quality). Our experimental results show the effectiveness of the proposed method in eliminating trees, extracting buildings of all sizes, and extracting buildings with and without transparent roofs. 
When compared with current state-of-the-art building extraction methods, the proposed method outperforms the existing methods in various evaluation metrics.
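    The roof-vs-tree intuition above (near-constant height change along a roof slope versus erratic height change in a canopy) can be sketched as a gradient-constancy check; the function name and threshold below are illustrative assumptions, not the authors' parameters:

```python
import numpy as np

def roof_like(height_patch, grad_std_max=0.05):
    # Classify a patch of a LiDAR-derived height image as roof-like
    # (near-constant gradient) vs tree-like (erratic gradient).
    # grad_std_max is an illustrative threshold in height units per pixel.
    gy, gx = np.gradient(height_patch.astype(float))
    return float(np.hypot(gx, gy).std()) < grad_std_max

# A planar roof patch has a constant slope, hence a constant gradient;
# a noisy canopy patch does not.
plane = np.fromfunction(lambda r, c: 0.1 * c, (8, 8))
canopy = np.random.default_rng(0).normal(0.0, 1.0, (8, 8))
```

    The full GBE method additionally verifies candidates with colour matching, variance, point density, and shadow checks, as described above.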

  4. A Robust Gradient Based Method for Building Extraction from LiDAR and Photogrammetric Imagery

    PubMed Central

    Siddiqui, Fasahat Ullah; Teng, Shyh Wei; Awrangjeb, Mohammad; Lu, Guojun

    2016-01-01

    Existing automatic building extraction methods are not effective in extracting buildings which are small in size and have transparent roofs. The application of a large area threshold prohibits detection of small buildings, and the use of ground points in generating the building mask prevents detection of transparent buildings. In addition, the existing methods use numerous parameters to extract buildings in complex environments, e.g., hilly areas and high vegetation. However, the empirical tuning of a large number of parameters reduces the robustness of building extraction methods. This paper proposes a novel Gradient-based Building Extraction (GBE) method to address these limitations. The proposed method transforms the Light Detection And Ranging (LiDAR) height information into an intensity image without interpolation of point heights and then analyses the gradient information in the image. Generally, building roof planes have a constant height change along the slope of a roof plane whereas trees have a random height change. With such an analysis, buildings of a greater range of sizes with a transparent or opaque roof can be extracted. In addition, a local colour matching approach is introduced as a post-processing stage to eliminate trees. This stage of our proposed method does not require any manual setting and all parameters are set automatically from the data. The other post-processing stages including variance, point density and shadow elimination are also applied to verify the extracted buildings, where comparatively fewer empirically set parameters are used. The performance of the proposed GBE method is evaluated on two benchmark data sets by using the object and pixel based metrics (completeness, correctness and quality). Our experimental results show the effectiveness of the proposed method in eliminating trees, extracting buildings of all sizes, and extracting buildings with and without transparent roofs. 
When compared with current state-of-the-art building extraction methods, the proposed method outperforms the existing methods in various evaluation metrics. PMID:27447631

  5. Functional requirements regarding medical registries--preliminary results.

    PubMed

    Oberbichler, Stefan; Hörbst, Alexander

    2013-01-01

    The term medical registry is used to reference tools and processes that support clinical or epidemiologic research or provide a data basis for decisions regarding health care policies. In spite of this wide range of applications, neither the term registry nor the functional requirements which a registry should support are clearly defined. This work presents preliminary results of a literature review to discover the functional requirements that form a registry. To extract these requirements, a set of peer-reviewed articles was collected and screened using methods from qualitative research. Most of the functional requirements discovered so far focus on data quality (e.g., preventing transcription errors by conducting automatic domain checks).

  6. Tropical Timber Identification using Backpropagation Neural Network

    NASA Astrophysics Data System (ADS)

    Siregar, B.; Andayani, U.; Fatihah, N.; Hakim, L.; Fahmi, F.

    2017-01-01

    Each and every type of wood has different characteristics. Identifying the type of wood properly is important, especially for industries that need to know the timber type specifically. However, identification requires expertise, and only a limited number of experts are available. In addition, manual identification, even by experts, is rather inefficient because it requires a lot of time and carries a possibility of human error. To overcome these problems, a digital-image-based method to identify the type of timber automatically is needed. In this study, a backpropagation neural network is used as the artificial intelligence component. Several stages were developed: microscope image acquisition, pre-processing, feature extraction using the gray level co-occurrence matrix, and normalization of the extracted features using decimal scaling. The results showed that the proposed method was able to identify the timber with an accuracy of 94%.
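    A minimal sketch of the texture-feature stage named above, a gray-level co-occurrence matrix (GLCM) plus decimal-scaling normalization, simplified to one pixel offset and three common GLCM statistics (the study's actual offsets, gray levels, and feature set are not given):

```python
import numpy as np

def glcm(img, levels=8, dx=1, dy=0):
    # Gray-level co-occurrence matrix for a single (dx, dy) offset.
    q = (img * levels // (img.max() + 1)).astype(int)   # quantize gray levels
    m = np.zeros((levels, levels))
    h, w = q.shape
    for r in range(h - dy):
        for c in range(w - dx):
            m[q[r, c], q[r + dy, c + dx]] += 1
    return m / m.sum()

def glcm_features(p):
    # Three classic Haralick-style statistics of the co-occurrence matrix.
    i, j = np.indices(p.shape)
    contrast = float(((i - j) ** 2 * p).sum())
    energy = float((p ** 2).sum())
    homogeneity = float((p / (1.0 + np.abs(i - j))).sum())
    return np.array([contrast, energy, homogeneity])

def decimal_scaling(x):
    # Normalize by the smallest power of ten exceeding max |x|,
    # so every value falls in (-1, 1).
    k = np.ceil(np.log10(np.abs(x).max() + 1e-12))
    return x / 10.0 ** k

# A uniform image: all co-occurrences in one cell -> zero contrast, energy 1
img = np.full((4, 4), 3)
feats = glcm_features(glcm(img))
scaled = decimal_scaling(np.array([250.0, 3.0, 7.0]))
```

    The normalized feature vectors would then be fed to the backpropagation network for classification.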

  7. RNA isolation from mammalian cells using porous polymer monoliths: an approach for high-throughput automation.

    PubMed

    Chatterjee, Anirban; Mirer, Paul L; Zaldivar Santamaria, Elvira; Klapperich, Catherine; Sharon, Andre; Sauer-Budge, Alexis F

    2010-06-01

    The life science and healthcare communities have been redefining the importance of ribonucleic acid (RNA) through the study of small molecule RNA (in RNAi/siRNA technologies), micro RNA (in cancer research and stem cell research), and mRNA (gene expression analysis for biologic drug targets). Research in this field increasingly requires efficient and high-throughput isolation techniques for RNA. Currently, several commercial kits are available for isolating RNA from cells. Although the quality and quantity of RNA yielded from these kits is sufficiently good for many purposes, limitations exist in terms of extraction efficiency from small cell populations and the ability to automate the extraction process. Traditionally, automating a process decreases the cost and personnel time while simultaneously increasing the throughput and reproducibility. As the RNA field matures, new methods for automating its extraction, especially from low cell numbers and in high throughput, are needed to achieve these improvements. The technology presented in this article is a step toward this goal. The method is based on a solid-phase extraction technology using a porous polymer monolith (PPM). A novel cell lysis approach and a larger binding surface throughout the PPM extraction column ensure a high yield from small starting samples, increasing sensitivity and reducing indirect costs in cell culture and sample storage. The method ensures a fast and simple procedure for RNA isolation from eukaryotic cells, with a high yield both in terms of quality and quantity. The technique is amenable to automation and streamlined workflow integration, with possible miniaturization of the sample handling process making it suitable for high-throughput applications.

  8. The life cycle of a mineral deposit: a teacher's guide for hands-on mineral education activities

    USGS Publications Warehouse

    Frank, Dave; Galloway, John; Assmus, Ken

    2005-01-01

    This teacher's guide defines what a mineral deposit is and how a mineral deposit is identified and measured, how the mineral resources are extracted, and how the mining site is reclaimed; how minerals and mineral resources are processed; and how we use mineral resources in our everyday lives. Included are 10 activity-based learning exercises that educate students on basic geologic concepts; the processes of finding, identifying, and extracting the resources from a mineral deposit; and the uses of minerals. The guide is intended for K through 12 Earth science teachers and students and is designed to meet the National Science Content Standards as defined by the National Research Council (1996). To assist in the understanding of some of the geology and mineral terms, see the Glossary (appendix 1) and Minerals and Their Uses (appendix 2). The process of finding or exploring for a mineral deposit, extracting or mining the resource, recovering the resource, also known as beneficiation, and reclaiming the land mined can be described as the “life cycle” of a mineral deposit. The complete process is time consuming and expensive, requiring the use of modern technology and equipment, and may take many years to complete. Sometimes one entity or company completes the entire process from discovery to reclamation, but often it requires multiple groups with specialized experience working together. Mineral deposits are the source of many important commodities, such as copper and gold, used by our society, but it is important to realize that mineral deposits are a nonrenewable resource. Once mined, they are exhausted, and another source must be found. New mineral deposits are being continuously created by the Earth but may take millions of years to form. Mineral deposits differ from renewable resources, such as agricultural and timber products, which may be replenished within a few months to several years.

  9. Economical and Environmentally Benign Extraction of Rare Earth Elements (REES) from Coal & Coal Byproducts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlson, Gary

    This final report provides a complete summary of the activities, results, analytical discussion, and overall evaluation of the project titled “Economical and Environmentally Benign Extraction of Rare Earth Elements (REES) from Coal & Coal Byproducts” under DOE Award Number DE-FE-0027155 that started in March 2016 and ended December 2017. Fly ash was selected as the coal-byproduct source material due to the fact that it is readily available with no need for extensive methods to obtain the material, it is produced in large quantities (>50 million tons per year) and had REE concentrations similar to other coal-byproducts. The selected fly ash used throughout this project was from the Mill Creek power generating facility operated by Louisville Gas and Electric located in Louisville, KY and was subjected to a variety of physical and chemical characterization tests. Results from fusion extractions showed that the selected fly-ash had a TREE+Y concentration of 480 ppm with critical REEs concentration of 200 ppm. The fly ash had an outlook ratio of 1.25 and an estimated value of $16-$18 worth of salable REEs per 1-tonne of fly ash. Additional characterizations by optical evaluation, QEMSCAN, XRD, size fractionation, and SEM analysis showed the fly ash consisted of small glassy spherules with a size range between 1 to 110 µm (ave. diam. of 13 um), was heterogeneous in chemical composition (main crystalline phases: aluminum oxides and iron oxides) and was primarily an amorphous material (75 to 80%). A simple stepped approach was completed to estimate the total REE resource quantity. The approach included REE characterization of the representative samples, evaluation of fly-ash availability, and final determination estimated resource availability with regards to REE grade on a regional and national scale. 
This data represents the best available information and is based upon the assumptions that the power generating facility where the fly-ash was obtained will use the same coal sources (actual mines were identified), the coal materials will have relatively consistent REE concentrations, and the REE extraction process developed during this project can achieve 42% REE recovery (validated and confirmed). Calculations indicated that the estimated REE resource is approximately 175,000 tonnes with a current estimated value of $3,330MM. The proposed REE extraction and production process developed during this project used four fundamental steps; 1) fly-ash pretreatment to enhance REE extraction, 2) REE extraction by acid digestion, 3) REE separation/concentration by carbon adsorption and column chromatography, and 4) REE oxide production. Secondary processing steps to manage process residuals and additional processing techniques to produce value-added products were incorporated into the process during the project. These secondary steps were not only necessary to manage residuals, but also provided additional revenue streams that offset operational and capital expenditures. The process produces one value product stream (production of zeolite Na-P1), a solids waste stream, and one liquid stream that met RCRA discharge requirements. Based upon final design criteria and operational parameters, the proposed system could produce approximately 200 grams of REOs from 1-tonne of fly-ash, thereby representing a TREE+Y recovery of 42% (project target of > 25%). A detailed economic model was developed to evaluate both CAPEX and OPEX estimates for systems with varying capacities between 100 kg to 200 tonnes of fly ash processed per day. Using a standard system capacity of 10 tonne/day system, capital costs were estimated at $88/kg fly ash while operating costs were estimated at approximately $450/kg fly ash. 
This operating cost estimate includes a revenue of $495/tonne of fly ash processed from the value-added product produced from the system (zeolite Na-P1). Although operating cost savings due to zeolite production were significant, the capital + operating cost for a 10 tonne system was more expensive than the total dollar value of REEs present in the fly ash material. Specifically, the estimated cost per 1-tonne of fly ash treated is approximately $540 while the estimated value of REEs in the fly ash is $18-$20/tonne. This is an excessive difference showing that the proposed process is not economically feasible strictly on the basis of REE revenue compared to extraction costs. Although the current proposed system does not produce sufficient quantities of REEs or additional revenue sources to offset operational and capital costs, supplementary factors including US strategic concerns, commercial demands, and defense department requirements must be factored. At this time, the process developed during this project provides foundational information for future development of simple processes that require low capital investment and one that will extract a valuable quality and quantity of REE oxides from industrial waste.

  10. Feature extraction algorithm for space targets based on fractal theory

    NASA Astrophysics Data System (ADS)

    Tian, Balin; Yuan, Jianping; Yue, Xiaokui; Ning, Xin

    2007-11-01

    To extend the life of satellites and reduce launch and operating costs, satellite servicing, including conducting repairs, upgrading, and refueling spacecraft on-orbit, will become much more frequent. Future space operations can be executed more economically and reliably using machine vision systems, which can meet the real-time and tracking-reliability requirements of image tracking for a space surveillance system. Machine vision has been applied to research on the relative pose of spacecraft, for which feature extraction is the basis. In this paper, a fractal-geometry-based edge extraction algorithm is presented that can be used in determining and tracking the relative pose of an observed satellite during proximity operations in a machine vision system. The method computes a gray-level image of local fractal dimension using the Differential Box-Counting (DBC) approach of fractal theory to restrain noise. After this, consecutive edges are detected using mathematical morphology. The validity of the proposed method is examined by processing and analyzing images of space targets. The edge extraction method not only extracts the outline of the target but also keeps the inner details. Meanwhile, edge extraction is processed only in the moving area, greatly reducing computation. Simulation results compare edge detection using the presented method with other detection methods and indicate that the presented algorithm is a valid way to solve relative pose problems for spacecraft.
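    The DBC computation referenced above estimates a fractal dimension by counting, at several box sizes, how many gray-level boxes each image block spans; a minimal sketch (box sizes are illustrative, and the paper's exact variant may differ):

```python
import numpy as np

def fractal_dimension_dbc(img, sizes=(2, 4, 8, 16)):
    # Differential box-counting estimate of the fractal dimension of a
    # gray-level image: fit log N(s) against log(1/s).
    img = img.astype(float)
    M = img.shape[0]
    G = img.max() + 1                       # gray-level range
    counts = []
    for s in sizes:
        h = s * G / M                       # box height in gray levels
        n = 0
        for r in range(0, M, s):
            for c in range(0, M, s):
                block = img[r:r + s, c:c + s]
                n += int(block.max() // h) - int(block.min() // h) + 1
        counts.append(n)
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Sanity check: a perfectly flat image behaves like a 2-D surface
flat = np.zeros((32, 32))
d = fractal_dimension_dbc(flat)
```

    Applying such an estimator over a sliding window yields the fractal-dimension image in which rough (high-dimension) and smooth (low-dimension) regions separate, supporting the noise-restrained edge detection described in the abstract.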

  11. Terrain Extraction by Integrating Terrestrial Laser Scanner Data and Spectral Information

    NASA Astrophysics Data System (ADS)

    Lau, C. L.; Halim, S.; Zulkepli, M.; Azwan, A. M.; Tang, W. L.; Chong, A. K.

    2015-10-01

    The extraction of true terrain points from unstructured laser point cloud data is an important process for producing an accurate digital terrain model (DTM). However, most spatial filtering methods utilize only geometrical data to discriminate terrain points from non-terrain points. Point cloud filtering can be improved by using the spectral information available with some scanners. The objective of this study is therefore to investigate the effectiveness of using the three channels (red, green, and blue) of the colour image captured by the built-in digital camera available in some terrestrial laser scanners (TLS) for terrain extraction. Data acquisition was conducted at a mini replica landscape at Universiti Teknologi Malaysia (UTM), Skudai campus, using a Leica ScanStation C10. The spectral information of the coloured point clouds from selected sample classes was extracted for spectral analysis. Coloured points falling within the corresponding preset spectral threshold were identified as belonging to that specific feature class. This terrain extraction process was implemented in Matlab. Results demonstrate that a passive image of higher spectral resolution is required to improve the output, because the low quality of the colour images captured by the sensor leads to low separability in spectral reflectance. In conclusion, this study shows that spectral information can be used as a parameter for terrain extraction.
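The thresholding step lends itself to a short sketch. The study's processing was done in Matlab; the Python fragment below is a hypothetical equivalent, and the RGB threshold window is purely illustrative, not taken from the study:

```python
import numpy as np

def classify_terrain(points_rgb, lo=(80, 60, 40), hi=(160, 130, 110)):
    """Label points whose RGB colour falls inside a preset window as
    terrain. `points_rgb` is an (N, 6) array of x, y, z, r, g, b rows;
    the (lo, hi) window is an illustrative stand-in for class thresholds
    derived from sample-class spectral analysis."""
    rgb = points_rgb[:, 3:6]
    mask = np.all((rgb >= lo) & (rgb <= hi), axis=1)
    return points_rgb[mask], points_rgb[~mask]  # terrain, non-terrain
```

In practice the window would be set per feature class from the spectral statistics of labelled sample patches, exactly as the study does before applying the threshold to the whole cloud.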

  12. Supercritical Nitrogen Processing for the Purification of Reactive Porous Materials

    PubMed Central

    Stadie, Nicholas P.; Callini, Elsa; Mauron, Philippe; Borgschulte, Andreas; Züttel, Andreas

    2015-01-01

    Supercritical fluid extraction and drying methods are well established in numerous applications for the synthesis and processing of porous materials. Herein, nitrogen is presented as a novel supercritical drying fluid for specialized applications such as in the processing of reactive porous materials, where carbon dioxide and other fluids are not appropriate due to their higher chemical reactivity. Nitrogen exhibits similar physical properties in the near-critical region of its phase diagram as compared to carbon dioxide: a widely tunable density up to ~1 g ml⁻¹, modest critical pressure (3.4 MPa), and small molecular diameter of ~3.6 Å. The key to achieving a high solvation power of nitrogen is to apply a processing temperature in the range of 80-150 K, where the density of nitrogen is an order of magnitude higher than at similar pressures near ambient temperature. The detailed solvation properties of nitrogen, and especially its selectivity, across a wide range of common target species of extraction still require further investigation. Herein we describe a protocol for the supercritical nitrogen processing of porous magnesium borohydride. PMID:26066492

  13. Wireless AE Event and Environmental Monitoring for Wind Turbine Blades at Low Sampling Rates

    NASA Astrophysics Data System (ADS)

    Bouzid, Omar M.; Tian, Gui Y.; Cumanan, K.; Neasham, J.

    Integration of acoustic wireless technology in structural health monitoring (SHM) applications introduces new challenges due to the requirements for high sampling rates, additional communication bandwidth, memory space, and power resources. To circumvent these challenges, this chapter proposes a novel solution: a wireless SHM technique combined with acoustic emission (AE), with field deployment on the structure of a wind turbine. This solution requires a sampling rate lower than the Nyquist rate. In addition, features extracted from the aliased AE signals, rather than signals reconstructed on board the wireless nodes, are exploited to monitor AE events such as wind, rain, strong hail, and bird strikes under different environmental conditions, in conjunction with artificial AE sources. A time-domain feature extraction algorithm, together with principal component analysis (PCA), is used to extract and classify the relevant information, which in turn is used to recognise the testing condition represented by the response signals. The proposed technique yields a significant data reduction during the monitoring of wind turbine blades.
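The processing chain, time-domain features from aliased frames followed by PCA, can be sketched as below. The feature set (RMS, peak, crest factor, kurtosis) is a representative choice, not necessarily the one used in the chapter:

```python
import numpy as np

def time_features(frame):
    """A few common time-domain AE features computed per signal frame;
    these work on aliased (sub-Nyquist) data because none of them needs
    the original spectrum."""
    rms = np.sqrt(np.mean(frame ** 2))
    peak = np.max(np.abs(frame))
    crest = peak / rms if rms > 0 else 0.0
    kurt = np.mean((frame - frame.mean()) ** 4) / (frame.var() ** 2 + 1e-12)
    return np.array([rms, peak, crest, kurt])

def pca_project(feature_matrix, n_components=2):
    """Project feature vectors onto their leading principal components
    (via SVD of the centred feature matrix) for classification."""
    X = feature_matrix - feature_matrix.mean(axis=0)
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[:n_components].T
```

A classifier on the low-dimensional projection then assigns each frame to an event class (wind, rain, hail, strike), which is the data-reduction payoff: only features, not raw AE records, leave the node.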

  14. The ISES: A non-intrusive medium for in-space experiments in on-board information extraction

    NASA Technical Reports Server (NTRS)

    Murray, Nicholas D.; Katzberg, Stephen J.; Nealy, Mike

    1990-01-01

    The Information Science Experiment System (ISES) represents a new approach to applying advanced systems technology and techniques to on-board information extraction in the space environment. Basically, what is proposed is a 'black box' attached to the spacecraft data bus or local area network. To the spacecraft, the 'black box' appears to be just another payload requiring power, heat rejection, and interfaces, adding weight, and requiring time on the data management and communication system. In reality, the 'black box' is a programmable computational resource that eavesdrops on the data network, taking selectable real-time science data and producing results back on the network. This paper presents a brief overview of the ISES concept and discusses issues related to applying the ISES to the polar platform and Space Station Freedom. Critical to the operation of ISES is the viability of a payload-like interface to the spacecraft data bus or local area network. Study results that address this question are reviewed vis-a-vis the polar platform and the core space station. Initial results of processing science and other requirements for on-board, real-time information extraction are also presented, with particular emphasis on the polar platform. Opportunities for a broader range of applications on the core space station are also discussed.

  15. Novel Binders and Methods for Agglomeration of Ore

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S. K. Kawatra; T. C. Eisele; K. A. Lewandowski

    2006-03-31

    Many metal extraction operations, such as leaching of copper, leaching of precious metals, and reduction of metal oxides to metal in high-temperature furnaces, require agglomeration of ore to ensure that reactive liquids or gases are evenly distributed throughout the ore being processed. Agglomeration of ore into coarse, porous masses achieves this even distribution of fluids by preventing fine particles from migrating and clogging the spaces and channels between the larger ore particles. Binders are critically necessary to produce agglomerates that will not break down during processing. However, for many important metal extraction processes there are no known binders that will work satisfactorily at a reasonable cost. A primary example is copper heap leaching, for which no binders are currently available that can withstand the acidic process environment. As a result, operators of many facilities see a large loss of process efficiency due to their inability to take advantage of agglomeration. The large quantities of ore that must be handled in metal extraction processes also mean that a binder must be inexpensive and useful at low dosages to be economical. The acid-resistant binders and agglomeration procedures developed in this project will also be adapted to improve the energy efficiency and performance of a broad range of mineral agglomeration applications, particularly heap leaching. The active involvement of our industrial partners will help to ensure rapid commercialization of any agglomeration technologies developed by this project.

  16. Novel Binders and Methods for Agglomeration of Ore

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S. K. Kawatra; T. C. Eisele; J. A. Gurtler

    2005-09-30

    Many metal extraction operations, such as leaching of copper, leaching of precious metals, and reduction of metal oxides to metal in high-temperature furnaces, require agglomeration of ore to ensure that reactive liquids or gases are evenly distributed throughout the ore being processed. Agglomeration of ore into coarse, porous masses achieves this even distribution of fluids by preventing fine particles from migrating and clogging the spaces and channels between the larger ore particles. Binders are critically necessary to produce agglomerates that will not break down during processing. However, for many important metal extraction processes there are no known binders that will work satisfactorily at a reasonable cost. A primary example is copper heap leaching, for which no binders are currently available that can withstand the acidic process environment. As a result, operators of many facilities see a large loss of process efficiency due to their inability to take advantage of agglomeration. The large quantities of ore that must be handled in metal extraction processes also mean that a binder must be inexpensive and useful at low dosages to be economical. The acid-resistant binders and agglomeration procedures developed in this project will also be adapted to improve the energy efficiency and performance of a broad range of mineral agglomeration applications, particularly heap leaching. The active involvement of our industrial partners will help to ensure rapid commercialization of any agglomeration technologies developed by this project.

  17. Selective aqueous extraction of organics coupled with trapping by membrane separation

    DOEpatents

    van Eikeren, Paul; Brose, Daniel J.; Ray, Roderick J.

    1991-01-01

    An improvement to processes for the selective extraction of organic solutes from organic solvents by water-based extractants is disclosed, the improvement comprising coupling various membrane separation processes with the organic extraction process; the membrane separation process is utilized to continuously recycle the water-based extractant while selectively removing or concentrating the organic solute from it.

  18. Comparative study on Ce (III) and La (III) solvent extraction and separation from a nitric acid medium by D2EHPA and Cyanex272

    NASA Astrophysics Data System (ADS)

    Habibpour, R.; Dargahi, M.; Kashi, E.; Bagherpour, M.

    2018-01-01

    The solvent extraction of cerium(III) and lanthanum(III) from nitric acid solution using the organophosphorus extractants di-(2-ethylhexyl) phosphoric acid (D2EHPA) and di-2,4,4-trimethylpentyl phosphinic acid (Cyanex272) in kerosene was investigated. In this study, the extraction of Ce(III) was found to be more significant with Cyanex272 than with D2EHPA, whereas D2EHPA was the better extractant for La(III). Of the two extractants, Cyanex272 was used to separate Ce from La in three stages, with an extraction efficiency of 90.2% for Ce. A 556 mg/L Ce solution was used for scrubbing La with an efficiency of ≈34%, which required multi-stage scrubbing. Thermodynamic parameters such as enthalpy, entropy, and Gibbs free energy indicate that the process is exothermic and non-spontaneous. Chemical speciation curves for lanthanum and cerium in the aqueous phase as a function of pH showed that the free La(III) and Ce(III) metal ion species largely predominate between pH = 0 and pH = 7.
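As a rough consistency check on the reported numbers: if each cross-current stage extracts the same fraction of the cerium remaining in the aqueous phase, a per-stage fraction of about 0.539 (an inferred value, not stated in the abstract) reproduces the 90.2% overall efficiency over three stages:

```python
def overall_extraction(per_stage_fraction, n_stages):
    """Cumulative extraction after n cross-current stages, assuming the
    same fraction E of the remaining metal is extracted at each stage:
    overall = 1 - (1 - E)^n."""
    return 1.0 - (1.0 - per_stage_fraction) ** n_stages
```

This kind of staging arithmetic is why a modest single-stage distribution ratio can still deliver >90% recovery in a few contacts.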

  19. How do you assign persistent identifiers to extracts from large, complex, dynamic data sets that underpin scholarly publications?

    NASA Astrophysics Data System (ADS)

    Wyborn, Lesley; Car, Nicholas; Evans, Benjamin; Klump, Jens

    2016-04-01

    Persistent identifiers in the form of a Digital Object Identifier (DOI) are becoming more mainstream, assigned at both the collection and dataset level. For static datasets, this is a relatively straightforward matter. However, many new data collections are dynamic, with new data being appended, models and derivative products being revised with new data, or the data itself revised as processing methods are improved. Further, because data collections are becoming accessible as services, researchers can log in and dynamically create user-defined subsets for specific research projects; they can also easily mix and match data from multiple collections, each of which can have a complex history. Inevitably, extracts from such dynamic data sets underpin scholarly publications, and this presents new challenges. The National Computational Infrastructure (NCI) has been experiencing and making progress towards addressing these issues. The NCI is a large node of the Research Data Services initiative (RDS) of the Australian Government's research infrastructure, which currently makes available over 10 PBytes of priority research collections, ranging from geosciences, geophysics, environment, and climate through to astronomy, bioinformatics, and social sciences. Data are replicated to, or are produced at, NCI and then processed there to higher-level data products or directly analysed. Individual datasets range from multi-petabyte computational models and large-volume raster arrays down to gigabyte-size, ultra-high-resolution datasets. To facilitate access, maximise reuse, and enable integration across the disciplines, datasets have been organized on a platform called the National Environmental Research Data Interoperability Platform (NERDIP). Combined, the NERDIP data collections form a rich and diverse asset for researchers: their co-location and standardization optimises the value of existing data and forms a new resource to underpin data-intensive science.
New publication procedures require that a persistent identifier (DOI) be provided for the dataset that underpins a publication. Producing these for data extracts from the NCI data node using only DOIs is proving difficult: preserving a copy of each data extract is not possible due to data scale. One proposal is for researchers to use workflows that capture the provenance of each data extraction, including metadata (e.g., the version of the dataset used, the query, and the time of extraction). In parallel, NCI is now working with the NERDIP dataset providers to ensure that the provenance of data publication is also captured in provenance systems, including references to previous versions and a history of data appended or modified. This proposed solution would require an enhancement to scholarly publication procedures whereby the reference to the dataset underlying a scholarly publication would be the persistent identifier of the provenance workflow that created the data extract. In turn, the provenance workflow would itself link to a series of persistent identifiers that, at a minimum, provide complete dataset production transparency and, if required, would facilitate reconstruction of the dataset. Such a solution will require strict adherence to design patterns for provenance representation, to ensure that the provenance representation of the workflow does indeed contain the information required to deliver dataset generation transparency and a pathway to reconstruction.
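The proposed workflow-provenance approach might be sketched as follows; the record fields and the hash-derived identifier are illustrative stand-ins for a registered PID and are not NCI's actual schema:

```python
import hashlib
import json
from datetime import datetime, timezone

def extract_provenance_record(dataset_doi, dataset_version, query):
    """Build a minimal provenance record for a data extract and derive a
    content-based identifier from it. A real system would register a
    DOI/PID for this record; the field names here are hypothetical."""
    record = {
        "dataset_doi": dataset_doi,          # PID of the parent collection
        "dataset_version": dataset_version,  # version at extraction time
        "query": query,                      # the subsetting request
        "extracted_at": datetime.now(timezone.utc).isoformat(),
    }
    # Hashing the canonical JSON gives a stable handle for this exact
    # extraction event; each event gets its own identifier because the
    # timestamp is part of the record.
    digest = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()[:16]
    return record, digest
```

A publication would then cite the identifier of this record rather than a frozen copy of the extract, and the record in turn points at the versioned dataset, which is the transparency-plus-reconstruction pathway the abstract describes.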

  20. Acoustic emission signal processing for rolling bearing running state assessment using compressive sensing

    NASA Astrophysics Data System (ADS)

    Liu, Chang; Wu, Xing; Mao, Jianlin; Liu, Xiaoqin

    2017-07-01

    In the signal processing domain, there has been growing interest in using acoustic emission (AE) signals rather than vibration signals for fault diagnosis and condition assessment; AE has been advocated as an effective technique for identifying fracture, crack, or damage. AE signals have frequencies up to several MHz, which avoids interference from the parts of the bearing (rolling elements, rings, and so on) and other rotating parts of the machine. However, acoustic emission necessitates advanced signal sampling capabilities and the ability to deal with large amounts of sampled data. In this paper, compressive sensing (CS) is introduced as a processing framework, and a compressive feature extraction method is proposed. We use it to extract compressive features directly from compressively-sensed data and also prove its energy preservation properties. First, we study AE signals under the CS framework: the sparsity of the AE signal of the rolling bearing is checked, and the observation and reconstruction of the signal are studied. Second, we present a method for extracting the AE compressive feature (AECF) directly from compressively-sensed data, demonstrate its energy preservation properties, and describe the processing of the extracted feature. We assess the running state of the bearing using the AECF trend, which is consistent with the trend of traditional features; the method is thus an effective way to evaluate the running trend of rolling bearings. Experiments have verified that signal processing and condition assessment based on the AECF are simpler, require a smaller amount of data, and greatly reduce the amount of computation.
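The energy-preservation idea behind extracting features directly from compressed measurements can be illustrated with a minimal sketch (a generic Gaussian sensing matrix, not the paper's exact AECF construction): for Φ with i.i.d. N(0, 1/m) entries, the expected energy of y = Φx equals the energy of x, so an energy-trend feature can be tracked without ever reconstructing the signal:

```python
import numpy as np

def compressive_energy_feature(x, m, rng):
    """Sense x with an m x n Gaussian matrix (entries N(0, 1/m)) and
    return the energy of the measurements. In expectation this equals
    the energy of x, so the feature can be computed directly from the
    compressed data -- the idea behind an AECF-style trend feature."""
    n = x.size
    phi = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, n))  # sensing matrix
    y = phi @ x                                           # compressed data
    return float(np.sum(y ** 2))
```

Because only the m-dimensional measurement vector is stored and processed, both the data volume and the computation scale with m rather than the full Nyquist-rate length n.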

  1. Recent Advances in the Development and Application of Power Plate Transducers in Dense Gas Extraction and Aerosol Agglomeration Processes

    NASA Astrophysics Data System (ADS)

    Riera, E.; Cardoni, A.; Gallego-Juárez, J. A.; Acosta, V. M.; Blanco, A.; Rodríguez, G.; Blasco, M.; Herranz, L. E.

    Power ultrasound (PU) is an emerging, innovative, energy-saving and environmentally friendly technology that is generating great interest in sectors such as the food and pharmaceutical industries, green chemistry, and environmental pollution control, where sustainable and energy-efficient methods are required to improve processes and/or produce specific effects. Two typical effects of PU are the enhancement of mass transfer in gases and liquids and the induction of particle agglomeration in aerosols. These effects are activated by a variety of mechanisms associated with the nonlinear propagation of high-amplitude ultrasonic waves, such as diffusion, agitation, entrainment, and turbulence. In recent years a great effort has been made jointly by the Spanish National Research Council (CSIC) and the company Pusonics to bring to market novel processes based on airborne ultrasonic plate transducers. This technology was specifically developed for the treatment of gaseous and multiphasic media characterized by low specific acoustic impedance and high acoustic absorption. Different strategies have been developed to mitigate the effects of the nonlinear dynamic behavior of such ultrasonic piezoelectric transducers in order to enhance and stabilize their response under operational power conditions. This work addresses the latest advances in mitigating nonlinear problems in power transducers; it also describes two ultrasound-assisted applications, developed at semi-industrial and laboratory scales, consisting of extraction with dense gases and particle agglomeration. Dense gas extraction (DGE) assisted by PU is a new process with the potential to enhance extraction kinetics with supercritical CO2. Acoustic agglomeration of fine aerosol particles has great potential for treating air pollution problems caused by particulate materials. Experimental and numerical results for both processes are shown and discussed.

  2. Towards a Video Passive Content Fingerprinting Method for Partial-Copy Detection Robust against Non-Simulated Attacks

    PubMed Central

    2016-01-01

    Passive content fingerprinting is widely used for video content identification and monitoring. However, many challenges remain unsolved, especially for partial-copy detection. The main challenge is to find the right balance between the computational cost of fingerprint extraction and the fingerprint dimension, without compromising detection performance against various attacks (robustness). Fast video detection performance is desirable in several modern applications, for instance those involving large video databases or requiring real-time detection of partial copies, a process whose difficulty increases when videos suffer severe transformations. In this context, conventional fingerprinting methods are not fully suitable to cope with the attacks and transformations mentioned above, either because their robustness is insufficient or because their execution time is very high, with the time bottleneck commonly found in the fingerprint extraction and matching operations. Motivated by these issues, in this work we propose a content fingerprinting method based on the extraction of a set of independent binary global and local fingerprints. Although these features are robust against common video transformations, their combination is more discriminant against severe video transformations such as signal processing attacks, geometric transformations, and temporal and spatial desynchronization. Additionally, we use an efficient multilevel filtering system to accelerate fingerprint extraction and matching. This filtering system rapidly identifies potential similar video copies, on which alone the full fingerprint process is carried out, thus saving computational time. We tested the method on datasets of real copied videos, and the results show that it outperforms state-of-the-art methods in detection scores. Furthermore, the granularity of our method makes it suitable for partial-copy detection, that is, processing only short segments of 1-second length. PMID:27861492

  3. Analysis of entropy extraction efficiencies in random number generation systems

    NASA Astrophysics Data System (ADS)

    Wang, Chao; Wang, Shuang; Chen, Wei; Yin, Zhen-Qiang; Han, Zheng-Fu

    2016-05-01

    Random numbers (RNs) have applications in many areas: lottery games, gambling, computer simulation, and, most importantly, cryptography [N. Gisin et al., Rev. Mod. Phys. 74 (2002) 145]. In cryptography theory, the theoretical security of the system calls for high quality RNs. Therefore, developing methods for producing unpredictable RNs with adequate speed is an attractive topic. Early on, despite the lack of theoretical support, pseudo RNs generated by algorithmic methods performed well and satisfied reasonable statistical requirements. However, as implemented, those pseudorandom sequences were completely determined by mathematical formulas and initial seeds, which cannot introduce extra entropy or information. In these cases, “random” bits are generated that are not at all random. Physical random number generators (RNGs), which, in contrast to algorithmic methods, are based on unpredictable physical random phenomena, have attracted considerable research interest. However, the way that we extract random bits from those physical entropy sources has a large influence on the efficiency and performance of the system. In this manuscript, we will review and discuss several randomness extraction schemes that are based on radiation or photon arrival times. We analyze the robustness, post-processing requirements and, in particular, the extraction efficiency of those methods to aid in the construction of efficient, compact and robust physical RNG systems.
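One representative arrival-time extraction scheme of the kind the review discusses compares consecutive inter-arrival times: because the two times are i.i.d., P(t1 > t2) = 1/2 for any detection rate, giving unbiased bits without knowing the source distribution. This is a generic sketch, not any specific system's extractor:

```python
import random

def bits_from_arrival_times(intervals):
    """Turn exponentially distributed inter-arrival times into unbiased
    bits by comparing consecutive pairs: (t1, t2) -> 1 if t1 > t2 else 0.
    Since the two times are i.i.d., each outcome has probability 1/2
    regardless of the source rate; ties are discarded (von Neumann style).
    Extraction efficiency is at most 0.5 bits per detection event."""
    bits = []
    for t1, t2 in zip(intervals[0::2], intervals[1::2]):
        if t1 == t2:
            continue  # discard ties
        bits.append(1 if t1 > t2 else 0)
    return bits
```

Schemes of this family trade efficiency (half a bit per event at best) for robustness to rate drift; the review's comparison of extraction efficiencies covers more elaborate time-binning variants that squeeze more bits from each event.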

  4. Recommendations for Development of New Standardized Forms of Cocoa Breeds and Cocoa Extract Processing for the Prevention of Alzheimer's Disease: Role of Cocoa in Promotion of Cognitive Resilience and Healthy Brain Aging.

    PubMed

    Dubner, Lauren; Wang, Jun; Ho, Lap; Ward, Libby; Pasinetti, Giulio M

    2015-01-01

    It is currently thought that the lackluster performance of translational paradigms in the prevention of age-related cognitive deteriorative disorders, such as Alzheimer's disease (AD), may be due to the inadequacy of the prevailing approach of targeting only a single mechanism. Age-related cognitive deterioration and certain neurodegenerative disorders, including AD, are characterized by complex relationships between interrelated biological phenotypes. Thus, alternative strategies that simultaneously target multiple underlying mechanisms may represent a more effective approach to prevention, which is a strategic priority of the National Alzheimer's Project Act and the National Institute on Aging. In this review article, we discuss recent strategies designed to clarify the mechanisms by which certain brain-bioavailable, bioactive polyphenols, in particular the flavan-3-ols (also known as flavanols) that are highly represented in cocoa extracts, may beneficially influence cognitive deterioration, as in AD, while promoting healthy brain aging. However, we note that improving consistency and reproducibility in the development of cocoa extracts as a potential future therapeutic agent requires a better understanding of cocoa extract sources and their processing, as well as more standardized testing, including brain bioavailability of bioactive metabolites and brain target engagement studies. The ultimate goal of this review is to provide recommendations for the future development of cocoa extracts as a therapeutic agent in AD.

  5. Application of Sequential Extractions and X-ray Absorption Spectroscopy to Determine the Speciation of Chromium in Northern New Jersey Marsh Soils Developed in Chromite ore Processing Residue (COPR)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elzinga, E.; Cirmo, A

    2010-01-01

    The Cr speciation in marsh soils developed in weathering chromite ore processing residue (COPR) was characterized using sequential extractions and synchrotron microbeam and bulk X-ray absorption spectroscopic (XAS) analyses. The sequential extractions suggested substantial Cr associated with reducible and oxidizable soil components, and significant non-extractable residual Cr. Notable differences in the Cr speciation estimates from three extraction schemes underscore the operationally defined nature of the Cr speciation provided by these methods. Micro X-ray fluorescence maps and µ-XAS data indicated the presence of µm-sized chromite particles scattered throughout the weathered COPR matrix. These particles derive from the original COPR material and have relatively high resistance to weathering, and therefore persist even after prolonged leaching. Bulk XAS data further indicated Cr(III) incorporated in Fe(OH)₃ and Cr(III) associated with organic matter. The low Cr contents of the weathered material (200-850 ppm) compared to unweathered COPR (20,000-60,000 ppm) point to substantial Cr leaching during COPR weathering, with partial repartitioning of released Cr into secondary Fe(OH)₃ phases and organics. The effects of anoxia on Cr speciation and the potential for active COPR weathering to release Cr(VI) deeper in the profile require further study.

  6. Sensitive determination of total particulate phosphorus and particulate inorganic phosphorus in seawater using liquid waveguide spectrophotometry.

    PubMed

    Ehama, Makoto; Hashihama, Fuminori; Kinouchi, Shinko; Kanda, Jota; Saito, Hiroaki

    2016-06-01

    Determining the total particulate phosphorus (TPP) and particulate inorganic phosphorus (PIP) in oligotrophic oceanic water generally requires the filtration of a large volume of water sample. This paper describes methods that require small filtration volumes for determining TPP and PIP concentrations. The methods were devised by validating or improving conventional sample processing and by applying highly sensitive liquid waveguide spectrophotometry to the measurement of oxidized or acid-extracted phosphate from TPP and PIP, respectively. The oxidation of TPP was performed by chemical wet oxidation using 3% potassium persulfate. The acid extraction of PIP was initially carried out following the conventional extraction methodology, which requires 1 M HCl, followed by a procedure for decreasing acidity. While the conventional procedure for acid removal requires a ten-fold dilution of the 1 M HCl extract with purified water, the improved procedure proposed in this study uses 8 M NaOH solution to neutralize the 1 M HCl extract, in order to reduce the dilution effect. An experiment comparing the absorbances of the phosphate standard dissolved in 0.1 M HCl and in a neutralized solution [1 M HCl : 8 M NaOH = 8:1 (v:v)] showed a higher absorbance in the neutralized solution, indicating that the improved procedure completely removed the acid effect, which otherwise reduces the sensitivity of the phosphate measurement. Application to an ultraoligotrophic water sample gave a TPP concentration of 8.4 nM in a 1075 mL-filtered sample, with a coefficient of variation (CV) of 4.3%, and a PIP concentration of 1.3 nM in a 2300 mL-filtered sample, with a CV of 6.1%. Based on the detection limit (3 nM) of the sensitive phosphate measurement and the ambient TPP and PIP concentrations of the ultraoligotrophic water, the minimum filtration volumes required for the detection of TPP and PIP were estimated to be 15 and 52 mL, respectively.
Copyright © 2016 Elsevier B.V. All rights reserved.
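The minimum-filtration-volume estimate is simple scaling: the filtered volume must supply enough particulate P that, once redissolved into the analysis extract, the concentration reaches the 3 nM detection limit. A sketch, where the extract volume is a hypothetical method parameter not given in the abstract:

```python
def min_filtration_volume_ml(ambient_nM, lod_nM, extract_ml):
    """Minimum seawater volume (mL) to filter so that the particulate P
    collected, redissolved into `extract_ml` of analysis solution,
    reaches the spectrophotometric detection limit:
    V_min * ambient = LOD * extract  =>  V_min = LOD * extract / ambient.
    `extract_ml` is a hypothetical parameter, not taken from the paper."""
    return lod_nM * extract_ml / ambient_nM
```

The scaling makes the paper's central point explicit: the lower the ambient particulate concentration, the larger the seawater volume that must pass the filter for a fixed detection limit, which is why highly sensitive waveguide spectrophotometry shrinks the required filtration volumes.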

  7. A two-dimensional contaminant fate and transport model for the lower Athabasca River

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brownlee, B.G.; Booty, W.G.; MacInnis, G.A.

    1995-12-31

    The lower Athabasca River flows through the Athabasca Oil Sands deposits in northeastern Alberta. Two oil sands mining/extraction/upgrading plants operate near the river downstream from Fort McMurray. Process water is stored in large tailings ponds. One of the plants (Suncor) has a licensed discharge (mostly cooling water) to the river. This effluent contains low concentrations (≤ 1 µg/L) of various polycyclic aromatic compounds (PACs). Several tributary streams that cut through oil sands deposits are potential sources of hydrocarbons to the Athabasca. The authors have found that river suspended sediments give positive responses in a number of toxicity tests, using both direct and indirect (organic-solvent extract) methods. Several environmental impact assessments are required as a result of industry expansion. To provide an assessment tool for PACs, the authors are developing a two-dimensional contaminant fate and transport model for a 120-km portion of the Athabasca River downstream from Fort McMurray. Hydraulic calibration of the model was done using sodium and chloride from a major tributary as tracers. Two groups of compounds are being modelled: (1) PACs from the Suncor effluent, and (2) PACs from natural/background sources. PAC concentrations in the river were typically < 1 ng/L, requiring large-volume extractions and highly sensitive analysis. Processes such as sediment-water partitioning and biodegradation are being estimated from field experiments using river water and suspended sediment. Photodegradation is likely unimportant in this turbid river due to low penetration of 280-350 nm light. Initially, volatilization will be modelled using estimated or literature values for Henry's constants, but may require more refined estimates from laboratory experiments.

  8. Development of a real-time microchip PCR system for portable plant disease diagnosis.

    PubMed

    Koo, Chiwan; Malapi-Wight, Martha; Kim, Hyun Soo; Cifci, Osman S; Vaughn-Diaz, Vanessa L; Ma, Bo; Kim, Sungman; Abdel-Raziq, Haron; Ong, Kevin; Jo, Young-Ki; Gross, Dennis C; Shim, Won-Bo; Han, Arum

    2013-01-01

    Rapid and accurate detection of plant pathogens in the field is crucial to prevent the proliferation of infected crops. The polymerase chain reaction (PCR) is the most reliable and widely accepted method for plant pathogen diagnosis; however, conventional PCR machines are not portable and require additional post-processing steps to detect the amplified DNA (amplicon) of pathogens. Real-time PCR can quantify the amplicon directly during DNA amplification without post-processing and is thus more suitable for field operations, but it still takes time and requires large instruments that are costly and not portable. Microchip PCR systems have emerged in the past decade to miniaturize conventional PCR systems and to reduce operation time and cost. Real-time microchip PCR systems have also emerged, but all reported portable real-time microchip PCR systems require various auxiliary instruments. Here we present a stand-alone real-time microchip PCR system composed of a PCR reaction chamber microchip with an integrated thin-film heater, a compact fluorescence detector to detect amplified DNA, a microcontroller that controls the entire thermocycling operation and acquires data, and a battery. The entire system is 25 × 16 × 8 cm(3) in size and 843 g in weight. The disposable microchip requires only an 8-µl sample volume, and a single PCR run consumes 110 mAh of power. A DNA extraction protocol, notably without the use of liquid nitrogen, chemicals, or other large lab equipment, was developed for field operations. The developed real-time microchip PCR system and the DNA extraction protocol were used to successfully detect six different fungal and bacterial plant pathogens with a 100% success rate at a detection limit of 5 ng/8 µl sample.

  9. Development of a Real-Time Microchip PCR System for Portable Plant Disease Diagnosis

    PubMed Central

    Kim, Hyun Soo; Cifci, Osman S.; Vaughn-Diaz, Vanessa L.; Ma, Bo; Kim, Sungman; Abdel-Raziq, Haron; Ong, Kevin; Jo, Young-Ki; Gross, Dennis C.; Shim, Won-Bo; Han, Arum

    2013-01-01

    Rapid and accurate detection of plant pathogens in the field is crucial to prevent the proliferation of infected crops. The polymerase chain reaction (PCR) is the most reliable and widely accepted method for plant pathogen diagnosis; however, conventional PCR machines are not portable and require additional post-processing steps to detect the amplified DNA (amplicon) of pathogens. Real-time PCR can quantify the amplicon directly during DNA amplification without post-processing and is thus more suitable for field operations, but it still takes time and requires large instruments that are costly and not portable. Microchip PCR systems have emerged in the past decade to miniaturize conventional PCR systems and to reduce operation time and cost. Real-time microchip PCR systems have also emerged, but all reported portable real-time microchip PCR systems require various auxiliary instruments. Here we present a stand-alone real-time microchip PCR system composed of a PCR reaction chamber microchip with an integrated thin-film heater, a compact fluorescence detector to detect amplified DNA, a microcontroller that controls the entire thermocycling operation and acquires data, and a battery. The entire system is 25×16×8 cm3 in size and 843 g in weight. The disposable microchip requires only an 8-µl sample volume, and a single PCR run consumes 110 mAh of power. A DNA extraction protocol, notably without the use of liquid nitrogen, chemicals, or other large lab equipment, was developed for field operations. The developed real-time microchip PCR system and the DNA extraction protocol were used to successfully detect six different fungal and bacterial plant pathogens with a 100% success rate at a detection limit of 5 ng/8 µl sample. PMID:24349341
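The reported 110 mAh per PCR run translates directly into field endurance for a given battery. A minimal sketch of that arithmetic, with a hypothetical 2200 mAh battery pack (the per-run figure is from the abstract; the pack capacity is assumed):

```python
def runs_per_battery(battery_capacity_mah, run_consumption_mah=110):
    """Number of complete PCR runs a battery can support (whole runs only)."""
    return battery_capacity_mah // run_consumption_mah

# A hypothetical 2200 mAh pack at the abstract's 110 mAh per run:
n = runs_per_battery(2200)  # 20 runs
```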

  10. Enhanced separation of rare earth elements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyon, K.; Greenhalgh, M.; Herbst, R. S.

    2016-09-01

    Industrial rare earth separation processes utilize PC88A, a phosphonic acid ligand, for solvent extraction separations. The separation factors of the individual rare earths, the equipment requirements, and the chemical usage for these flowsheets are well characterized. Alternative ligands such as Cyanex® 572 and the associated flowsheets are being investigated at the pilot scale to determine if significant improvements to the current separation processes can be realized. These improvements are identified as higher separation factors, reduced stage requirements, or reduced chemical consumption. Any of these improvements can significantly affect the costs associated with these challenging separation processes. A mid/heavy rare earth element (REE) separations flowsheet was developed and tested for each ligand in a 30-stage mixer-settler circuit to compare the separation performance of PC88A and Cyanex® 572. The ligand-metal complex strength of Cyanex® 572 provides efficient extraction of REEs while significantly reducing the strip acid requirements. Reductions in chemical consumption have a significant impact on process economics for REE separations. The partitioning results summarized in Table 1 indicate that Cyanex® 572 offers the same separation performance as PC88A while reducing acid consumption by 30% in the strip section for the mid/heavy REE separation.

    Table 1 - Flowsheet effluent compositions comparing PC88A and Cyanex® 572 for a mid/heavy REE separation:

                                     PC88A     Cyanex® 572
      Raffinate      Mid REE        99.40%        99.40%
                     Heavy REE       0.60%         0.60%
      Rich Liquor    Mid REE         2.20%         0.80%
                     Heavy REE      97.80%        99.20%
      Strip Acid Required            3.4 M         2.3 M
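The headline metric in such flowsheet comparisons is the separation factor, the ratio of the two elements' distribution ratios between organic and aqueous phases. A minimal sketch of the definition, with hypothetical distribution ratios (not values from this study):

```python
def distribution_ratio(c_org, c_aq):
    """D = concentration in the organic phase / concentration in the aqueous phase."""
    return c_org / c_aq

def separation_factor(d_a, d_b):
    """Separation factor beta = D_A / D_B, conventionally reported as >= 1."""
    return d_a / d_b if d_a >= d_b else d_b / d_a

# Hypothetical distribution ratios for a heavy REE (well extracted)
# and a mid REE (poorly extracted) under the same conditions:
beta = separation_factor(distribution_ratio(4.0, 1.0), distribution_ratio(0.8, 1.0))  # 5.0
```

Higher separation factors shorten the mixer-settler cascade, which is one of the improvements the pilot-scale comparison looks for.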

  11. Fractionation and characterization of semi polar and polar compounds from leaf extract Nicotiana tabaccum L. reflux ethanol extraction results

    NASA Astrophysics Data System (ADS)

    Rahardjo, Andhika Priotomo; Fauzantoro, Ahmad; Gozan, Misri

    2018-02-01

    The decline in cigarette production as a response to health problems can interfere with the welfare of tobacco farmers in Indonesia. It is therefore necessary to find alternative uses for tobacco, using the chemical compounds it contains as raw material for alternative products. One efficient method for separating chemical compounds from plant extracts is fractionation followed by characterization. This method had not previously been applied to Nicotiana tabaccum L. extract using semi-polar and polar solvents. The study begins by preparing Nicotiana tabaccum L. extract obtained through a reflux ethanol extraction process. The extract is analyzed by HPLC to determine the chemical compounds in the tobacco extract qualitatively. The analyzed extract is then fractionated using column chromatography with semi-polar (ethyl acetate) and polar (ethanol) solvents sequentially. Chemical compounds from the tobacco extract dissolve according to the polarity of each solvent. The compounds are then characterized using HPLC quantitatively and qualitatively. The resulting data are used to find the partition coefficient of the main component of Nicotiana tabaccum L., nicotine (kN), in the Virginia 1 (ethyl acetate) fraction at 0.075, the Virginia 2 (ethyl acetate) fraction at 0.037, and the Virginia 3 (ethyl acetate) fraction at 0.043.
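A partition coefficient of the kind reported here is simply the ratio of solute concentrations in the two phases. The sketch below shows that calculation for HPLC-quantified masses; the masses and volumes are hypothetical, chosen only so the result lands near the reported kN of 0.075:

```python
def partition_coefficient(mass_org_mg, vol_org_ml, mass_aq_mg, vol_aq_ml):
    """k = (solute concentration in organic phase) / (concentration in aqueous phase)."""
    return (mass_org_mg / vol_org_ml) / (mass_aq_mg / vol_aq_ml)

# Hypothetical nicotine masses partitioned between equal 10 mL phases:
k = partition_coefficient(0.70, 10.0, 9.30, 10.0)  # ~0.075
```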

  12. Low Cost Extraction and Isothermal Amplification of DNA for Infectious Diarrhea Diagnosis

    PubMed Central

    Huang, Shichu; Do, Jaephil; Mahalanabis, Madhumita; Fan, Andy; Zhao, Lei; Jepeal, Lisa; Singh, Satish K.; Klapperich, Catherine M.

    2013-01-01

    In order to counter the common perception that molecular diagnostics are too complicated to work in low resource settings, we have performed a difficult sample preparation and DNA amplification protocol using instrumentation designed to be operated without wall or battery power. In this work we have combined a nearly electricity-free nucleic acid extraction process with an electricity-free isothermal amplification assay to detect the presence of Clostridium difficile (C. difficile) DNA in the stool of infected patients. We used helicase-dependent isothermal amplification (HDA) to amplify the DNA in a low-cost, thermoplastic reaction chip heated with a pair of commercially available toe warmers, while using a simple Styrofoam insulator. DNA was extracted from known positive and negative stool samples. The DNA extraction protocol utilized an air pressure driven solid phase extraction device run using a standard bicycle pump. The simple heater setup required no electricity or battery and was capable of maintaining the temperature at 65°C±2°C for 55 min, suitable for repeatable HDA amplification. Experiments were performed to explore the adaptability of the system for use in a range of ambient conditions. When compared to a traditional centrifuge extraction protocol and a laboratory thermocycler, this disposable, no power platform achieved approximately the same lower limit of detection (1.25×10−2 pg of C. difficile DNA) while requiring much less raw material and a fraction of the lab infrastructure and cost. This proof of concept study could greatly impact the accessibility of molecular assays for applications in global health. PMID:23555883

  13. Fundamental Chemical Kinetic And Thermodynamic Data For Purex Process Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, R.J.; Fox, O.D.; Sarsfield, M.J.

    2007-07-01

    To support either the continued operation of current reprocessing plants or the development of future fuel processing using hydrometallurgical processes, such as Advanced Purex or UREX type flowsheets, accurate simulation of Purex solvent extraction is required. In recent years we have developed advanced process modeling capabilities that utilize modern software platforms such as Aspen Custom Modeler and can be run in steady-state and dynamic simulations. However, such advanced models of the Purex process require a wide range of fundamental data, including all relevant basic chemical kinetic and thermodynamic data for the major species present in the process. This paper will summarize some of these recent process chemistry studies that underpin our simulation, design and testing of Purex solvent extraction flowsheets. Whilst much kinetic data for actinide redox reactions in nitric acid exists in the literature, data on reactions in the diluted TBP solvent phase are much rarer. This inhibits accurate modeling of the Purex process, particularly when species show significant extractability into the solvent phase or when cycling between solvent and aqueous phases occurs, for example in the reductive stripping of Pu(IV) by ferrous sulfamate in the Magnox reprocessing plant. To support current oxide reprocessing, we have investigated a range of solvent phase reactions: U(IV) + HNO3; U(IV) + HNO2; U(IV) + HNO3 (Pu catalysis); U(IV) + HNO3 (Tc catalysis); U(IV) + Np(VI); U(IV) + Np(V); Np(IV) + HNO3; Np(V) + Np(V). Rate equations have been determined for all these reactions, and kinetic rate constants and activation energies are now available. Specific features of these reactions in the TBP phase include the roles of water and hydrolyzed intermediates in the reaction mechanisms. In reactions involving Np(V), cation-cation complex formation, which is much more favourable in TBP than in HNO3, also occurs and complicates the redox chemistry. Whilst some features of the redox chemistry in TBP appear similar to the corresponding reactions in aqueous HNO3, there are notable differences in the rates, the forms of the rate equations and the mechanisms. Secondly, to underpin the development of advanced single-cycle flowsheets using the complexant acetohydroxamic acid (AHA), we have also characterised in some detail its redox chemistry and solvent extraction behaviour with both Np and Pu ions. We find that simple hydroxamic acids are remarkably rapid reducing agents for Np(VI). They also reduce Pu(VI) and cause a much slower reduction of Pu(IV) through a complex mechanism involving acid hydrolysis of the ligand. AHA is a strongly hydrophilic and selective complexant for the tetravalent actinide ions, as evidenced by stability constant and solvent extraction data for An(IV), M(III) and U(VI) ions. This has allowed the successful design of U/Pu+Np separation flowsheets suitable for advanced fuel cycles. (authors)
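Kinetic rate constants and activation energies of the kind collected here are conventionally related through the Arrhenius equation, k = A·exp(-Ea/RT). A minimal sketch of that relationship and of recovering Ea from rate constants measured at two temperatures; the pre-exponential factor and activation energy below are hypothetical, not data from this work:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def rate_constant(a_factor, ea_j_per_mol, temp_k):
    """Arrhenius form: k = A * exp(-Ea / (R * T))."""
    return a_factor * math.exp(-ea_j_per_mol / (R * temp_k))

def activation_energy(k1, t1_k, k2, t2_k):
    """Recover Ea from rate constants measured at two temperatures."""
    return R * math.log(k2 / k1) / (1.0 / t1_k - 1.0 / t2_k)

# Hypothetical solvent-phase reaction: A = 1e10 s^-1, Ea = 60 kJ/mol
k25 = rate_constant(1e10, 60_000.0, 298.15)
k45 = rate_constant(1e10, 60_000.0, 318.15)
```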

  14. CER Hub: An informatics platform for conducting comparative effectiveness research using multi-institutional, heterogeneous, electronic clinical data.

    PubMed

    Hazlehurst, Brian L; Kurtz, Stephen E; Masica, Andrew; Stevens, Victor J; McBurnie, Mary Ann; Puro, Jon E; Vijayadeva, Vinutha; Au, David H; Brannon, Elissa D; Sittig, Dean F

    2015-10-01

    Comparative effectiveness research (CER) requires the capture and analysis of data from disparate sources, often from a variety of institutions with diverse electronic health record (EHR) implementations. In this paper we describe the CER Hub, a web-based informatics platform for developing and conducting research studies that combine comprehensive electronic clinical data from multiple health care organizations. The CER Hub platform implements a data processing pipeline that employs informatics standards for data representation and web-based tools for developing study-specific data processing applications, providing standardized access to the patient-centric electronic health record (EHR) across organizations. The CER Hub is being used to conduct two CER studies utilizing data from six geographically distributed and demographically diverse health systems. These foundational studies address the effectiveness of medications for controlling asthma and the effectiveness of smoking cessation services delivered in primary care. The CER Hub includes four key capabilities: the ability to process and analyze both free-text and coded clinical data in the EHR; a data processing environment supported by distributed data and study governance processes; a clinical data-interchange format for facilitating standardized extraction of clinical data from EHRs; and a library of shareable clinical data processing applications. CER requires coordinated and scalable methods for extracting, aggregating, and analyzing complex, multi-institutional clinical data. By offering a range of informatics tools integrated into a framework for conducting studies using EHR data, the CER Hub provides a solution to the challenges of multi-institutional research using electronic medical record data. Copyright © 2015. Published by Elsevier Ireland Ltd.

  15. [Optimization of extraction process for tannins from Geranium orientali-tibeticum by supercritical CO2 method].

    PubMed

    Xie, Song; Tong, Zhi-Ping; Tan, Rui; Liu, Xiao-Zhen

    2014-08-01

    In order to optimize the supercritical-CO2 extraction conditions for tannins from Geranium orientali-tibeticum, the tannin content was determined by the phosphomolybdenum tungsten acid-casein reaction. With extraction pressure, extraction temperature and extraction time as factors, and the tannin content of the G. orientali-tibeticum extract as the index, the process conditions were optimized by an orthogonal test. The optimum conditions were as follows: extraction pressure 25 MPa, extraction temperature 50 °C, and extraction time 1.5 h. The tannin content of the extract was 12.91 mg x g(-1), and the extraction rate was 3.67%. The established method can be used to assay the tannin content of G. orientali-tibeticum. The circulated extraction was an effective, stable and feasible extraction process, and it provides a basis for establishing the extraction process conditions for tannins from G. orientali-tibeticum.
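Orthogonal (Taguchi-style) tests like this one are usually analyzed by averaging the response at each level of each factor and picking the best level. A minimal sketch of that range analysis for a three-factor, three-level L9 design; the array layout and response values are hypothetical, not the study's data:

```python
# Hypothetical L9 orthogonal array: each trial records the level (1-3) of
# pressure, temperature and time, plus the measured tannin content (mg/g).
trials = [
    (1, 1, 1, 10.2), (1, 2, 2, 11.5), (1, 3, 3, 11.0),
    (2, 1, 2, 12.0), (2, 2, 3, 12.9), (2, 3, 1, 12.3),
    (3, 1, 3, 11.8), (3, 2, 1, 12.1), (3, 3, 2, 11.6),
]

def level_means(trials, factor_index):
    """Mean response at each level of one factor (range analysis)."""
    sums, counts = {}, {}
    for trial in trials:
        level, response = trial[factor_index], trial[-1]
        sums[level] = sums.get(level, 0.0) + response
        counts[level] = counts.get(level, 0) + 1
    return {level: sums[level] / counts[level] for level in sums}

pressure_means = level_means(trials, 0)          # factor 0 = pressure
best_pressure_level = max(pressure_means, key=pressure_means.get)
```

Repeating the same analysis for factor indices 1 and 2 yields the optimum level of each factor, which is how conditions like 25 MPa / 50 °C / 1.5 h are selected.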

  16. A quality score for coronary artery tree extraction results

    NASA Astrophysics Data System (ADS)

    Cao, Qing; Broersen, Alexander; Kitslaar, Pieter H.; Lelieveldt, Boudewijn P. F.; Dijkstra, Jouke

    2018-02-01

    Coronary artery trees (CATs) are often extracted to aid the fully automatic analysis of coronary artery disease on coronary computed tomography angiography (CCTA) images. Automatically extracted CATs often miss some arteries or include wrong extractions which require manual corrections before performing successive steps. For analyzing a large number of datasets, a manual quality check of the extraction results is time-consuming. This paper presents a method to automatically calculate quality scores for extracted CATs in terms of clinical significance of the extracted arteries and the completeness of the extracted CAT. Both right dominant (RD) and left dominant (LD) anatomical statistical models are generated and exploited in developing the quality score. To automatically determine which model should be used, a dominance type detection method is also designed. Experiments are performed on the automatically extracted and manually refined CATs from 42 datasets to evaluate the proposed quality score. In 39 (92.9%) cases, the proposed method is able to measure the quality of the manually refined CATs with higher scores than the automatically extracted CATs. In a 100-point scale system, the average scores for automatically and manually refined CATs are 82.0 (+/-15.8) and 88.9 (+/-5.4) respectively. The proposed quality score will assist the automatic processing of the CAT extractions for large cohorts which contain both RD and LD cases. To the best of our knowledge, this is the first time that a general quality score for an extracted CAT is presented.
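A quality score combining clinical significance with completeness can be sketched as importance-weighted coverage of the segments the anatomical model expects. This is only an illustration of the idea on a 100-point scale; the segment names and weights below are hypothetical, not the paper's statistical model:

```python
def cat_quality_score(extracted_segments, model_segments):
    """Importance-weighted completeness of an extracted coronary artery tree.

    model_segments maps segment name -> clinical importance weight taken
    from a (hypothetical) dominance-specific anatomical model.
    """
    total_weight = sum(model_segments.values())
    found_weight = sum(w for name, w in model_segments.items()
                       if name in extracted_segments)
    return 100.0 * found_weight / total_weight

# Hypothetical right-dominant model weights:
model = {"RCA": 3.0, "LAD": 3.0, "LCX": 2.0, "PDA": 1.0, "OM1": 1.0}
score = cat_quality_score({"RCA", "LAD", "LCX"}, model)  # 80.0
```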

  17. Focused microwave-assisted extraction combined with solid-phase microextraction and gas chromatography-mass spectrometry for the selective analysis of cocaine from coca leaves.

    PubMed

    Bieri, Stefan; Ilias, Yara; Bicchi, Carlo; Veuthey, Jean-Luc; Christen, Philippe

    2006-04-21

    An effective combination of focused microwave-assisted extraction (FMAE) with solid-phase microextraction (SPME) prior to gas chromatography (GC) is described for the selective extraction and quantitative analysis of cocaine from coca leaves (Erythroxylum coca). This approach required switching from an organic extraction solvent to an aqueous medium more compatible with SPME liquid sampling. SPME was performed in the direct immersion mode with a universal 100 µm polydimethylsiloxane (PDMS)-coated fibre. Parameters influencing this extraction step, such as solution pH, sampling time and temperature, are discussed. Furthermore, the overall extraction process takes into account the stability of cocaine in alkaline aqueous solutions at different temperatures. The cocaine degradation rate was determined by capillary electrophoresis using the short-end injection procedure. Under the selected extraction conditions, less than 5% of the cocaine was degraded after 60 min. From a qualitative point of view, a significant gain in selectivity was obtained with the incorporation of SPME in the extraction procedure. As a consequence of SPME clean-up, shorter columns could be used and analysis time was reduced to 6 min compared to 35 min with conventional GC. Quantitative results led to a cocaine content of 0.70 +/- 0.04% in dry leaves (RSD <5%), which agreed with previous investigations.
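The stability figure (less than 5% degraded after 60 min) implies a first-order rate constant via C/C0 = exp(-k·t). A minimal sketch of that back-calculation, taking the 5%/60 min bound at face value:

```python
import math

def first_order_rate_constant(frac_remaining, time_min):
    """k from the first-order decay law C/C0 = exp(-k * t)."""
    return -math.log(frac_remaining) / time_min

# 5% degraded after 60 min means 95% remaining:
k = first_order_rate_constant(0.95, 60.0)          # per minute
half_life_min = math.log(2) / k                    # roughly 13.5 hours
```

A half-life of this magnitude explains why the alkaline aqueous medium is acceptable for SPME sampling times of an hour or less.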

  18. Investigation and Evaluation of the open source ETL tools GeoKettle and Talend Open Studio in terms of their ability to process spatial data

    NASA Astrophysics Data System (ADS)

    Kuhnert, Kristin; Quedenau, Jörn

    2016-04-01

    Integration and harmonization of large spatial data sets has been a major issue since well before the introduction of the INSPIRE spatial data infrastructure. Extracting spatial data from heterogeneous source formats, transforming that data to obtain the required quality for particular purposes, and loading it into a data store are common tasks. This procedure of Extraction, Transformation and Loading of data is called the ETL process. Geographic Information Systems (GIS) can take over many of these tasks, but they are often not suitable for processing large datasets. ETL tools can make the implementation and execution of ETL processes convenient and efficient. One reason for choosing ETL tools for data integration is that they ease maintenance through a clear (graphical) presentation of the transformation steps. Developers and administrators are provided with tools for identifying errors, analyzing processing performance and managing the execution of ETL processes. Another benefit of ETL tools is that most tasks require no or only little scripting, so that researchers without a programming background can also work with them. Investigations of ETL tools for business applications have been available for a long time. However, little work has been published on the capabilities of these tools to handle spatial data. In this work, we review and compare the open source ETL tools GeoKettle and Talend Open Studio in terms of processing spatial data sets of different formats. For the evaluation, ETL processes are performed with both software packages on air quality data measured during the BÄRLIN2014 Campaign initiated by the Institute for Advanced Sustainability Studies (IASS). The aim of the BÄRLIN2014 Campaign is to better understand the sources and distribution of particulate matter in Berlin. The air quality data are available in heterogeneous formats because they were measured with different instruments. For further data analysis, the instrument data have been complemented by other georeferenced data provided by the local environmental authorities. This includes both vector and raster data on, e.g., land use categories or building heights, extracted from flat files and OGC-compliant web services. The requirements on the ETL tools include, for instance, the extraction of different input datasets such as Web Feature Services or vector datasets and the loading of these into databases. The tools also have to manage transformations on spatial datasets, such as applying spatial functions (e.g. intersection, union) or changing spatial reference systems. Preliminary results suggest that many complex transformation tasks can be accomplished with the existing set of components in both software tools, while there are still many gaps in the range of available features. The two ETL tools differ in functionality and in how various steps are implemented. For some tasks no predefined components are available at all, which can partly be compensated by the use of the respective API (freely configurable components in Java or JavaScript).
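The extract-transform-load pattern described here can be sketched end to end in a few lines: read heterogeneous source records, apply a cleaning transformation, and load the result into a database. This is a generic illustration, not GeoKettle or Talend code; the CSV columns, the -999 no-data flag, and the table schema are all hypothetical:

```python
import csv
import io
import sqlite3

# Extract: hypothetical PM10 measurements as CSV (e.g. dumped from a WFS).
raw = io.StringIO(
    "station,lon,lat,pm10\n"
    "A,13.40,52.52,18.5\n"
    "B,13.45,52.50,-999\n"   # -999 is an assumed sensor no-data flag
)

def etl(source, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS air_quality "
                 "(station TEXT, lon REAL, lat REAL, pm10 REAL)")
    for row in csv.DictReader(source):
        pm10 = float(row["pm10"])
        if pm10 < 0:
            continue  # Transform: drop no-data records
        # Load: parameterized insert into the target store
        conn.execute("INSERT INTO air_quality VALUES (?, ?, ?, ?)",
                     (row["station"], float(row["lon"]), float(row["lat"]), pm10))
    conn.commit()

conn = sqlite3.connect(":memory:")
etl(raw, conn)
n_rows = conn.execute("SELECT COUNT(*) FROM air_quality").fetchone()[0]  # 1
```

Dedicated ETL tools add exactly what this sketch lacks: graphical step definitions, error reporting, and spatial transformations such as reprojection.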

  19. Artificial retina model for the retinally blind based on wavelet transform

    NASA Astrophysics Data System (ADS)

    Zeng, Yan-an; Song, Xin-qiang; Jiang, Fa-gang; Chang, Da-ding

    2007-01-01

    An artificial retina is aimed at stimulating the remaining retinal neurons in patients with degenerated photoreceptors. Microelectrode arrays have been developed for this purpose as part of the stimulator. Designing such microelectrode arrays first requires a suitable mathematical method for human retinal information processing. In this paper, a flexible and adjustable model for extracting human visual information is presented, based on the wavelet transform. Given the flexibility of the wavelet transform for image information processing and its consistency with human visual information extraction, wavelet transform theory is applied to the artificial retina model for the retinally blind. The response of the model to a synthetic image is shown. The simulated experiment demonstrates that the model behaves in a manner qualitatively similar to biological retinas and thus may serve as a basis for the development of an artificial retina.
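The core operation of any wavelet-based visual model is a decomposition into a coarse approximation plus detail coefficients. A minimal one-level Haar decomposition of a 1-D signal illustrates the idea (the paper does not specify its wavelet; Haar is used here only as the simplest example):

```python
def haar_step(signal):
    """One level of a Haar decomposition: pairwise averages (approximation)
    and pairwise half-differences (detail). Assumes even-length input."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

approx, detail = haar_step([9, 7, 3, 5])  # ([8.0, 4.0], [1.0, -1.0])
```

The approximation plays the role of the retina's coarse spatial summary, while the detail coefficients carry edge-like information; the original signal is exactly recoverable from the two.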

  20. Resin purification from Dragons Blood by using sub critical solvent extraction method

    NASA Astrophysics Data System (ADS)

    Saifuddin; Nahar

    2018-04-01

    Jernang resin (dragon's blood) is the world's most expensive sap. The resin is obtained from jernang, which grows only on the islands of Sumatra and Borneo. Jernang resin is in demand in China, Hong Kong, and Singapore because it contains compounds such as dracohordin with potential as medicinal ingredients, showing biological and pharmacological activity that is antimicrobial, antiviral, antitumor and cytotoxic. The resin has conventionally been extracted by the maceration method, as practiced by processors in Bireuen, Aceh. However, a significant obstacle remains, namely the low quality of the yield obtained from the jernang resin. A technological innovation, an intensified maceration extraction process using methanol, produced a higher yield than the maceration process carried out in Bireuen. Nevertheless, the use of methanol as a solvent raises production costs, since it is relatively expensive and not environmentally friendly. To overcome this problem, this research proposes a process known as the subcritical solvent method, whose solvent is cheap, abundant and environmentally friendly. The results show that the quality of the resulting jernang resin is better than that obtained by the processing group in Bireuen. The jernang obtained by the maceration method is of class-A quality under the jernang quality specification (SNI 1671:2010), with resin (b/b) 73%, water (w/w) 6.8%, ash (w/w) 7%, impurities (w/w) 32%, a melting point of 88 °C and a red colour. The two-stage treatment obtained a grade between class A and super quality, with resin (b/b) 0.86%, water (w/w) 6.5%, ash (w/w) 2.8%, impurities (w/w) 9%, a melting point of 88 °C and a dark-red colour.

  1. BEAMS Lab: Novel approaches to finding a balance between throughput and sensitivity

    NASA Astrophysics Data System (ADS)

    Liberman, Rosa G.; Skipper, Paul L.; Prakash, Chandra; Shaffer, Christopher L.; Flarakos, Jimmy; Tannenbaum, Steven R.

    2007-06-01

    Development of 14C AMS has long pursued the twin goals of maximizing both sensitivity and precision in the interest, among others, of optimizing radiocarbon dating. Application of AMS to biomedical research is less constrained with respect to sensitivity requirements, but more demanding of high throughput. This work presents some technical and conceptual developments in sample processing and analytical instrumentation designed to streamline the process of extracting quantitative data from the various types of samples encountered in analytical biochemistry.

  2. Coal conversion systems design and process modeling. Volume 1: Application of MPPR and Aspen computer models

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The development of a coal gasification system design and mass and energy balance simulation program for the TVA and other similar facilities is described. The materials-process-product model (MPPM) and the advanced system for process engineering (ASPEN) computer program were selected from available steady state and dynamic models. The MPPM was selected to serve as the basis for development of system level design model structure because it provided the capability for process block material and energy balance and high-level systems sizing and costing. The ASPEN simulation serves as the basis for assessing detailed component models for the system design modeling program. The ASPEN components were analyzed to identify particular process blocks and data packages (physical properties) which could be extracted and used in the system design modeling program. While ASPEN physical properties calculation routines are capable of generating physical properties required for process simulation, not all required physical property data are available, and must be user-entered.

  3. A narrative method for consciousness research.

    PubMed

    Díaz, José-Luis

    2013-01-01

    Some types of first-person narrations of mental processes that constitute phenomenological accounts and texts, such as internal monolog statements, epitomize the best expressions and representations of human consciousness available and therefore may be used to model phenomenological streams of consciousness. The type of autonomous monolog in which an author or narrator declares actual mental processes in a think aloud manner seems particularly suitable for modeling streams of consciousness. A narrative method to extract and depict conscious processes, operations, contents, and states from an acceptable phenomenological text would require three subsequent steps: operational criteria for producing and/or selecting a phenomenological text, a system for detecting text items that are indicative of conscious contents and processes, and a procedure for representing such items in formal dynamic system devices such as Petri nets. The requirements and restrictions of each of these steps are presented, analyzed, and applied to phenomenological texts in the following manner: (1) the relevance of introspective language and narrative analyses to consciousness research and the idea that specific narratives are of paramount interest for such investigation is justified; (2) some of the obstacles and constraints to attain plausible consciousness inferences from narrative texts and the methodological requirements to extract and depict items relevant to consciousness contents and operations from a suitable phenomenological text are examined; (3) a preliminary exercise of the proposed method is used to analyze and chart a classical interior monolog excerpted from James Joyce's Ulysses, a masterpiece of the stream-of-consciousness literary technique and, finally, (4) an inter-subjective evaluation for inter-observer agreement of mental attributions of another phenomenological text (an excerpt from the Intimate Journal of Miguel de Unamuno) is presented using some mathematical tools.

  4. A narrative method for consciousness research

    PubMed Central

    Díaz, José-Luis

    2013-01-01

    Some types of first-person narrations of mental processes that constitute phenomenological accounts and texts, such as internal monolog statements, epitomize the best expressions and representations of human consciousness available and therefore may be used to model phenomenological streams of consciousness. The type of autonomous monolog in which an author or narrator declares actual mental processes in a think aloud manner seems particularly suitable for modeling streams of consciousness. A narrative method to extract and depict conscious processes, operations, contents, and states from an acceptable phenomenological text would require three subsequent steps: operational criteria for producing and/or selecting a phenomenological text, a system for detecting text items that are indicative of conscious contents and processes, and a procedure for representing such items in formal dynamic system devices such as Petri nets. The requirements and restrictions of each of these steps are presented, analyzed, and applied to phenomenological texts in the following manner: (1) the relevance of introspective language and narrative analyses to consciousness research and the idea that specific narratives are of paramount interest for such investigation is justified; (2) some of the obstacles and constraints to attain plausible consciousness inferences from narrative texts and the methodological requirements to extract and depict items relevant to consciousness contents and operations from a suitable phenomenological text are examined; (3) a preliminary exercise of the proposed method is used to analyze and chart a classical interior monolog excerpted from James Joyce’s Ulysses, a masterpiece of the stream-of-consciousness literary technique and, finally, (4) an inter-subjective evaluation for inter-observer agreement of mental attributions of another phenomenological text (an excerpt from the Intimate Journal of Miguel de Unamuno) is presented using some mathematical tools. 
PMID:24265610
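
    The paper's final representational step maps text-derived mental events onto Petri nets. As a hedged illustration of that formalism only (not the authors' actual model; the place and transition names below are invented), a minimal Petri net can be coded in a few lines:

```python
# Minimal Petri net sketch: places hold tokens, transitions fire when all
# of their input places are marked. Place/transition names are illustrative,
# not taken from the paper's Ulysses analysis.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)   # place -> token count
        self.transitions = {}          # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) > 0 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"{name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

net = PetriNet({"percept": 1})
net.add_transition("associate", ["percept"], ["memory"])
net.add_transition("verbalize", ["memory"], ["inner_speech"])
for t in ("associate", "verbalize"):
    net.fire(t)
print(net.marking)  # {'percept': 0, 'memory': 0, 'inner_speech': 1}
```

    Firing "associate" and then "verbalize" moves the single token from the perceptual place to the inner-speech place, mimicking one step of a modeled stream of consciousness.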

  5. Application of simplified Complexity Theory concepts for healthcare social systems to explain the implementation of evidence into practice.

    PubMed

    Chandler, Jacqueline; Rycroft-Malone, Jo; Hawkes, Claire; Noyes, Jane

    2016-02-01

    To examine the application of core concepts from Complexity Theory to explain the findings from a process evaluation undertaken in a trial evaluating implementation strategies for recommendations about reducing surgical fasting times. The proliferation of evidence-based guidance requires a greater focus on its implementation. Theory is required to explain the complex processes across the multiple healthcare organizational levels. This social healthcare context involves the interaction between professionals, patients and the organizational systems in care delivery. Complexity Theory may provide an explanatory framework for the complexities inherent in implementation in social healthcare contexts. A secondary thematic analysis of qualitative process-evaluation data informed by Complexity Theory was undertaken. Seminal texts applying Complexity Theory to the social context were annotated, key concepts extracted, and core Complexity Theory concepts identified. These core concepts were applied as a theoretical lens to explain themes from a process evaluation of a trial evaluating the implementation of strategies to reduce surgical fasting times. The sampled substantive texts provided a representative spread of theoretical development and application of Complexity Theory from the late 1990s to 2013 in social science, healthcare, management and philosophy. The five core Complexity Theory concepts extracted were 'self-organization', 'interaction', 'emergence', 'system history' and 'temporality'. Application of these concepts suggests that routine surgical fasting practice is habituated in the social healthcare system and therefore cannot easily be reversed. A reduction in fasting times requires an incentivised new approach to emerge within the surgical system's priority of completing the operating list. The application of Complexity Theory provides a useful explanation for resistance to changing fasting practice.
Its utility in implementation research warrants further attention and evaluation. © 2015 John Wiley & Sons Ltd.

  6. Bioleaching of vanadium from barren stone coal and its effect on the transition of vanadium speciation and mineral phase

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Lin, Hai; Dong, Ying-bo; Li, Gan-yu

    2018-03-01

    This study determined the optimal conditions required to obtain maximum vanadium extraction and examined the transition of mineral phases and vanadium speciation during the bioleaching process. Parameters including the initial pH value, initial Fe2+ concentration, solid load, and inoculum quantity were examined. Under optimal conditions, 48.92 wt% of the vanadium was extracted through bioleaching. By comparison, chemical leaching (H2SO4, pH 2.0) showed a slower and smaller increase in vanadium yield; the bioleaching yield was 35.11 wt% greater than the chemical leaching yield. The Community Bureau of Reference (BCR) sequential extraction results revealed that 88.62 wt% of the vanadium initially existed in the residual fraction. The bacteria substantially changed the distribution of vanadium speciation during the leaching process, and the residual fraction decreased to 48.44 wt%. The X-ray diffraction (XRD) and Fourier transform infrared (FTIR) results provided evidence that the crystal lattice structure of muscovite was destroyed by the bacteria.

  7. Fractionation study in bioleached metallurgy wastes using six-step sequential extraction.

    PubMed

    Krasnodebska-Ostrega, Beata; Pałdyna, Joanna; Kowalska, Joanna; Jedynak, Łukasz; Golimowski, Jerzy

    2009-08-15

    The stored metallurgy wastes contain residues from ore-processing operations that are characterized by relatively high concentrations of heavy metals. The bioleaching process uses bacteria to recover elements from industrial wastes and to decrease the potential risk of environmental contamination. Wastes were treated with solutions containing bacteria. In this work, an optimized six-stage sequential extraction procedure was applied for the fractionation of Ni, Cr, Fe, Mn, Cu and Zn in iron-nickel metallurgy wastes deposited in Southern Poland (Szklary). Fractionation and total concentrations of elements in wastes before and after various bioleaching treatments were studied. Analyses of the extracts were performed by ICP-MS and FAAS. To achieve the most effective bioleaching of Zn, Cr, Ni, Cu, Mn and Fe, sequential use of both autotrophic and heterotrophic bacteria, combined with flushing of the residue after bioleaching, is required; 80-100% of the total metal concentrations were mobilized after the proposed treatment. Wastes treated according to this procedure could be deposited without any risk of environmental contamination, and additionally the metals could be recovered for industrial purposes.

  8. Beyond the resolution limit: subpixel resolution in animals and now in silicon

    NASA Astrophysics Data System (ADS)

    Wilcox, M. J.

    2007-09-01

    Automatic acquisition of aerial threats at thousands of kilometers distance requires high sensitivity to small differences in contrast and high optical quality for subpixel resolution, since targets occupy much less surface area than a single pixel. Targets travel at high speed and break up in the re-entry phase. Target/decoy discrimination at the earliest possible time is imperative. Real-time performance requires a multifaceted approach, with hyperspectral imaging and analog processing allowing feature extraction in real time. Hyperacuity Systems has developed a prototype chip capable of a nonlinear increase in resolution, or subpixel resolution, far beyond either pixel size or spacing. The performance increase is due to a biomimetic implementation of animal retinas. Photosensitivity is not homogeneous across the sensor surface, allowing pixel parsing. It is remarkably simple to provide this profile to detectors, and we showed at least three ways to do so. Individual photoreceptors have a Gaussian sensitivity profile, and this nonlinear profile can be exploited to extract high-resolution position information. Adaptive analog circuitry provides contrast enhancement and dynamic-range setting with offset and gain control. Pixels are processed in parallel within modular elements called cartridges, like photoreceptor inputs in fly eyes. These modular elements are connected by a novel function for a cell matrix known as L4. The system is exquisitely sensitive to small target motion and operates with a robust signal under degraded viewing conditions, allowing detection of targets smaller than a single pixel or at greater distance. Therefore, not only is instantaneous feature extraction possible but also subpixel resolution. Analog circuitry increases processing speed, with more accurate motion specification for target tracking and identification.
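
    The Gaussian sensitivity profile described above can be exploited numerically: given three samples spanning a response peak, log-parabolic interpolation recovers the peak position to subpixel precision (exactly so for a noise-free Gaussian). A minimal sketch with an invented 1-D sensor profile, not the chip's actual readout:

```python
import math

def subpixel_peak(samples, i):
    """Log-parabolic interpolation around discrete peak index i.

    Exact for a noise-free Gaussian profile, since log of a Gaussian
    is a parabola in position.
    """
    l0, l1, l2 = (math.log(samples[j]) for j in (i - 1, i, i + 1))
    return i + 0.5 * (l0 - l2) / (l0 - 2 * l1 + l2)

# Simulate a Gaussian-profile response centered at 4.3 "pixels"
true_center = 4.3
samples = [math.exp(-0.5 * ((x - true_center) / 1.2) ** 2) for x in range(9)]
i = samples.index(max(samples))   # coarse, whole-pixel peak (index 4)
est = subpixel_peak(samples, i)
print(round(est, 6))  # 4.3
```

    With real, noisy detector data the estimate degrades gracefully; this three-sample formula is a standard tool for subpixel peak localization.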

  9. Possible Applications of Photoautotrophic Biotechnologies at Lunar Settlements

    NASA Technical Reports Server (NTRS)

    McKay, David S.; Allen, Carl; Jones, J. A.; Bayless, D.; Brown, I.; Sarkisova, S.; Garrison, D.

    2007-01-01

    The most ambitious goal of the Vision for Space Exploration is to extend human presence across the solar system. Today, however, missions would have to bring all of the propellant, air, food, water, habitable volumes and shielding needed to sustain settlers beyond Earth. That is why resources for propellants, life support and construction of support systems and habitats must be found in space and utilized if humans hope ever to explore and colonize the solar system. The life support, fuel production and material processing systems currently proposed for spaceflight are essentially disconnected. Only traditional crop production has been proposed as a segment of bioregenerative life support systems, although the efficiency of higher plants for air regeneration is generally low. Thus, the investigation of air bioregeneration techniques based on the activity of photosynthetic organisms with higher rates of CO2 scrubbing and O2 release is very timely and important. Future systems for organic waste utilization in space may also benefit from the use of specific microorganisms. This janitorial job is efficiently carried out by microbes on Earth, which drive and connect different elemental cycles. It is likely that environmental control and life support systems based on bioregeneration will be capable of converting both organic and inorganic components of the waste at lunar settlements into edible biomass. The most challenging technologies for future lunar settlements are the extraction of elements (e.g., Fe, O, Si) from local rocks for industrial feedstocks and the production of propellants. While such extraction can be accomplished by purely inorganic processes, the high energy requirements of such processes motivate the search for alternative technologies with lower energy requirements and appropriate efficiency.
Well-developed terrestrial industrial biotechnologies for metals extraction and conversion could therefore be the prototypes for extraterrestrial biometallurgy.

  10. Dynamic-ETL: a hybrid approach for health data extraction, transformation and loading.

    PubMed

    Ong, Toan C; Kahn, Michael G; Kwan, Bethany M; Yamashita, Traci; Brandt, Elias; Hosokawa, Patrick; Uhrich, Chris; Schilling, Lisa M

    2017-09-13

    Electronic health records (EHRs) contain detailed clinical data stored in proprietary formats with non-standard codes and structures. Participating in multi-site clinical research networks requires EHR data to be restructured and transformed into a common format and standard terminologies, and optimally linked to other data sources. The expertise and scalable solutions needed to transform data to conform to network requirements are beyond the scope of many health care organizations, and there is a need for practical tools that lower the barriers of data contribution to clinical research networks. We designed and implemented a health data transformation and loading approach, which we refer to as Dynamic ETL (Extraction, Transformation and Loading) (D-ETL), that automates part of the process through use of scalable, reusable and customizable code, while retaining manual aspects of the process that require knowledge of complex coding syntax. This approach provides the flexibility required for the ETL of heterogeneous data, variations in semantic expertise, and transparency of transformation logic that are essential to implement ETL conventions across clinical research sharing networks. Processing workflows are directed by the ETL specifications guideline, developed by ETL designers with extensive knowledge of the structure and semantics of health data (i.e., "health data domain experts") and the target common data model. D-ETL was implemented to perform ETL operations that load data from various sources with different database schema structures into the Observational Medical Outcomes Partnership (OMOP) common data model. The results showed that the ETL rule composition methods and the D-ETL engine offer a scalable solution for health data transformation via automatic query generation to harmonize source datasets. D-ETL supports a flexible and transparent process to transform and load health data into a target data model.
This approach offers a solution that lowers technical barriers that prevent data partners from participating in research data networks, and therefore, promotes the advancement of comparative effectiveness research using secondary electronic health data.
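
    The core D-ETL idea of expressing ETL rules as data that an engine turns into queries can be sketched as follows. This is a hedged illustration of the pattern, not D-ETL's actual rule syntax; the source table, columns, and CASE mapping are stand-ins:

```python
# Rule-driven SQL generation sketch: each rule names a target table, a source
# table, and per-column SQL expressions; the engine composes an INSERT query.
RULES = [
    {"target": "person", "source": "patients",
     "columns": {
         "person_id": "patient_id",
         "gender_concept_id": "CASE sex WHEN 'M' THEN 8507 WHEN 'F' THEN 8532 END",
         "year_of_birth": "EXTRACT(YEAR FROM birth_date)",
     }},
]

def rule_to_sql(rule):
    select = ",\n  ".join(f"{expr} AS {col}"
                          for col, expr in rule["columns"].items())
    return f"INSERT INTO {rule['target']}\nSELECT\n  {select}\nFROM {rule['source']};"

sql = rule_to_sql(RULES[0])
print(sql)
```

    Keeping the mapping logic in declarative rules, rather than hand-written SQL, is what makes the transformation transparent and reusable across source schemas.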

  11. Extraction of Organic Molecules from Terrestrial Material: Quantitative Yields from Heat and Water Extractions

    NASA Technical Reports Server (NTRS)

    Beegle, L. W.; Abbey, W. A.; Tsapin, A. T.; Dragoi, D.; Kanik, I.

    2004-01-01

    In the robotic search for life on Mars, different proposed missions will analyze the chemical and biological signatures of life using different platforms. The analysis of samples via analytical instrumentation on the surface of Mars has thus far only been attempted by the two Viking missions. Robotic arms scooped regolith material into a pyrolysis oven attached to a GC/MS. No trace of organic material was found in any of the samples at either of the two landing sites. This null result puts an upper limit on the amount of organics that might be present in Martian soil/rocks, although the level of detection for each individual molecular species is still debated. Determining the absolute limit of detection for each analytical instrument is essential so that null results can be understood. This includes investigating the trade-off between pyrolysis and liquid solvent extraction for releasing organic materials (in terms of extraction efficiencies and the complexity of the sample extraction process). Extraction of organics from field samples can be accomplished by a variety of methods utilizing various solvents, including HCl, pure water, supercritical fluid, and Soxhlet extraction. Using 6N HCl is one of the most common methods and is frequently employed for extraction of organics from meteorites, but it is probably infeasible for robotic exploration owing to the difficulty of storage and transport. Extraction utilizing H2O is promising, but it could be less efficient than 6N HCl. Both supercritical fluid and Soxhlet extraction methods require bulky hardware and complex steps, inappropriate for inclusion on rover spacecraft. This investigation reports the efficiencies of pyrolysis and solvent extraction methods for amino acids in different terrestrial samples. The samples studied here, initially created in aqueous environments, are sedimentary in nature.
These particular samples were chosen because they possibly represent one of the best terrestrial analogs of Mars and they represent one of the absolute best case scenarios for finding organic molecules on the Martian surface.

  12. Full Characterization of CO2-Oil Properties On-Chip: Solubility, Diffusivity, Extraction Pressure, Miscibility, and Contact Angle.

    PubMed

    Sharbatian, Atena; Abedini, Ali; Qi, ZhenBang; Sinton, David

    2018-02-20

    Carbon capture, storage, and utilization technologies target a reduction in net CO2 emissions to mitigate greenhouse gas effects. The largest such projects worldwide involve storing CO2 through enhanced oil recovery, a technologically and economically feasible approach that combines both storage and oil recovery. Successful implementation relies on detailed measurements of CO2-oil properties at relevant reservoir conditions (P = 2.0-13.0 MPa and T = 23 and 50 °C). In this paper, we demonstrate a microfluidic method to quantify the comprehensive suite of mutual properties of a CO2 and crude oil mixture, including solubility, diffusivity, extraction pressure, minimum miscibility pressure (MMP), and contact angle. The time-lapse oil swelling/extraction in response to CO2 exposure under stepwise increasing pressure was quantified via fluorescence microscopy, using the inherent fluorescence of the oil. The CO2 solubilities and diffusion coefficients were determined from the swelling process, with measurements in strong agreement with previous results. The CO2-oil MMP was determined from the subsequent oil extraction process, with measurements within 5% of previous values. In addition, the oil-CO2-silicon contact angle was measured throughout the process, with contact angle increasing with pressure. In contrast with conventional methods, which require days and ∼500 mL of fluid sample, the approach here provides a comprehensive suite of measurements, 100-fold faster with less than 1 μL of sample, and an opportunity to better inform large-scale CO2 projects.

  13. Preparing silica aerogel monoliths via a rapid supercritical extraction method.

    PubMed

    Carroll, Mary K; Anderson, Ann M; Gorka, Caroline A

    2014-02-28

    A procedure for the fabrication of monolithic silica aerogels in eight hours or less via a rapid supercritical extraction process is described. The procedure requires 15-20 min of preparation time, during which a liquid precursor mixture is prepared and poured into wells of a metal mold that is placed between the platens of a hydraulic hot press, followed by several hours of processing within the hot press. The precursor solution consists of a 1.0:12.0:3.6:3.5 × 10^-3 molar ratio of tetramethylorthosilicate (TMOS):methanol:water:ammonia. In each well of the mold, a porous silica sol-gel matrix forms. As the temperature of the mold and its contents is increased, the pressure within the mold rises. After the temperature/pressure conditions surpass the supercritical point for the solvent within the pores of the matrix (in this case, a methanol/water mixture), the supercritical fluid is released, and monolithic aerogel remains within the wells of the mold. With the mold used in this procedure, cylindrical monoliths of 2.2 cm diameter and 1.9 cm height are produced. Aerogels formed by this rapid method have comparable properties (low bulk and skeletal density, high surface area, mesoporous morphology) to those prepared by other methods that involve either additional reaction steps or solvent extractions (lengthier processes that generate more chemical waste). The rapid supercritical extraction method can also be applied to the fabrication of aerogels based on other precursor recipes.
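
    For scaling the precursor recipe, the molar ratio converts directly into reagent masses once a batch size is chosen. A small sketch (the 10 mmol TMOS batch size is an arbitrary illustration, not taken from the protocol; molar masses are standard values):

```python
# Convert the TMOS:methanol:water:ammonia molar ratio into reagent masses
# for a chosen batch size.
ratio = {"TMOS": 1.0, "methanol": 12.0, "water": 3.6, "ammonia": 3.5e-3}
molar_mass = {"TMOS": 152.22, "methanol": 32.04,   # g/mol
              "water": 18.02, "ammonia": 17.03}

n_tmos = 0.010  # mol of TMOS in this hypothetical batch
masses = {r: ratio[r] * n_tmos * molar_mass[r] for r in ratio}  # grams
for r, m in masses.items():
    print(f"{r}: {m * 1000:.1f} mg")
```

    For this batch the dominant component by mass is methanol (about 3.8 g), while the ammonia catalyst amounts to well under a milligram, which is why it is typically dosed as a dilute solution.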

  14. Preparing Silica Aerogel Monoliths via a Rapid Supercritical Extraction Method

    PubMed Central

    Gorka, Caroline A.

    2014-01-01

    A procedure for the fabrication of monolithic silica aerogels in eight hours or less via a rapid supercritical extraction process is described. The procedure requires 15-20 min of preparation time, during which a liquid precursor mixture is prepared and poured into wells of a metal mold that is placed between the platens of a hydraulic hot press, followed by several hours of processing within the hot press. The precursor solution consists of a 1.0:12.0:3.6:3.5 × 10^-3 molar ratio of tetramethylorthosilicate (TMOS):methanol:water:ammonia. In each well of the mold, a porous silica sol-gel matrix forms. As the temperature of the mold and its contents is increased, the pressure within the mold rises. After the temperature/pressure conditions surpass the supercritical point for the solvent within the pores of the matrix (in this case, a methanol/water mixture), the supercritical fluid is released, and monolithic aerogel remains within the wells of the mold. With the mold used in this procedure, cylindrical monoliths of 2.2 cm diameter and 1.9 cm height are produced. Aerogels formed by this rapid method have comparable properties (low bulk and skeletal density, high surface area, mesoporous morphology) to those prepared by other methods that involve either additional reaction steps or solvent extractions (lengthier processes that generate more chemical waste). The rapid supercritical extraction method can also be applied to the fabrication of aerogels based on other precursor recipes. PMID:24637334

  15. Sleep Promotes the Extraction of Grammatical Rules

    PubMed Central

    Nieuwenhuis, Ingrid L. C.; Folia, Vasiliki; Forkstam, Christian; Jensen, Ole; Petersson, Karl Magnus

    2013-01-01

    Grammar acquisition is a high level cognitive function that requires the extraction of complex rules. While it has been proposed that offline time might benefit this type of rule extraction, this remains to be tested. Here, we addressed this question using an artificial grammar learning paradigm. During a short-term memory cover task, eighty-one human participants were exposed to letter sequences generated according to an unknown artificial grammar. Following a time delay of 15 min, 12 h (wake or sleep) or 24 h, participants classified novel test sequences as Grammatical or Non-Grammatical. Previous behavioral and functional neuroimaging work has shown that classification can be guided by two distinct underlying processes: (1) the holistic abstraction of the underlying grammar rules and (2) the detection of sequence chunks that appear at varying frequencies during exposure. Here, we show that classification performance improved after sleep. Moreover, this improvement was due to an enhancement of rule abstraction, while the effect of chunk frequency was unaltered by sleep. These findings suggest that sleep plays a critical role in extracting complex structure from separate but related items during integrative memory processing. Our findings stress the importance of alternating periods of learning with sleep in settings in which complex information must be acquired. PMID:23755173
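
    The chunk-frequency route to classification can be made concrete with a toy associative-chunk-strength score: count letter bigrams in the exposure set, then score each test string by the mean training frequency of its bigrams. The letter strings below are invented and are not the study's actual grammar:

```python
# Toy chunk-frequency scoring in the spirit of associative chunk strength.
from collections import Counter

def bigrams(s):
    return [s[i:i + 2] for i in range(len(s) - 1)]

exposure = ["MSVX", "MSXV", "VXMS"]           # hypothetical training strings
freq = Counter(b for s in exposure for b in bigrams(s))

def chunk_score(s):
    """Mean training frequency of the string's bigram chunks."""
    bs = bigrams(s)
    return sum(freq[b] for b in bs) / len(bs)

print(chunk_score("MSVX"))            # familiar chunks -> high score (2.0)
print(round(chunk_score("XXMM"), 2))  # unfamiliar chunks -> low score (0.33)
```

    A chunk-based classifier accepts strings above some score threshold; the study's point is that sleep improved rule abstraction while leaving this frequency-driven component unchanged.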

  16. Ultrasound-assisted oxidative desulfurization of liquid fuels and its industrial application.

    PubMed

    Wu, Zhilin; Ondruschka, Bernd

    2010-08-01

    The latest environmental regulations require very deep desulfurization to meet ultra-low-sulfur diesel (ULSD, 15 ppm sulfur) specifications. Because of the disadvantages of hydrotreating technology, namely its severe production conditions, costs, and safety and environmental concerns, ultrasound-assisted oxidative desulfurization (UAOD) has been developed as an alternative technology. The UAOD process selectively oxidizes the sulfur in common thiophenes in diesel to sulfoxides and sulfones, which can be removed via selective adsorption or extraction. SulphCo has successfully used a 5000 barrel/day mobile "Sonocracking" unit to duplicate on a commercial scale its proprietary process, which applies ultrasonics at relatively low temperatures and pressures. Capital costs for the UAOD technology are estimated at less than half the cost of a new high-pressure hydrotreater. This review illustrates the physical and chemical mechanisms of the UAOD process and discusses the factors affecting reaction kinetics and product recovery, such as ultrasonic frequency and power, oxidants, catalysts, phase-transfer agents, extractants and adsorbents. Copyright 2009 Elsevier B.V. All rights reserved.

  17. Lunar oxygen and metal for use in near-Earth space: Magma electrolysis

    NASA Technical Reports Server (NTRS)

    Colson, Russell O.; Haskin, Larry A.

    1990-01-01

    Because it is energetically easier to get material from the Moon to Earth orbit than from the Earth itself, the Moon is a potentially valuable source of materials for use in space. The unique conditions on the Moon, such as vacuum, absence of many reagents common on the Earth, and the presence of very nontraditional ores suggest that a unique and nontraditional process for extracting materials from the ores may prove the most practical. With this in mind, an investigation of unfluxed silicate electrolysis as a method for extracting oxygen, iron, and silicon from lunar regolith was initiated and is discussed. The advantages of the process include simplicity of concept, absence of need to supply reagents from Earth, and low power and mass requirements for the processing plant. Disadvantages include the need for uninterrupted high temperature and the highly corrosive nature of the high-temperature silicate melts which has made identifying suitable electrode and container materials difficult.

  18. Prayon process for wet acid purification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davister, A.; Peeterbroeck, M.

    Described is a process developed in Belgium which enables the upgrading of technical phosphoric acid to feed and food grades. After laboratory and pilot tests, Prayon developed and patented a solvent extraction process using a mixture of di-isopropyl ether and tributyl phosphate as solvent. The purified phosphoric acid obtained complies with the quality requirements of the market and can be used for metal treatments, in the manufacture of pure phosphates, for cattle feed, by the fermentation industry, for beverages, etc. Among the advantages of this process are its simplicity of operation, its low power consumption, and minimal environmental pollution. Extensive technological data are given.

  19. Rapid microscale in-gel processing and digestion of proteins using surface acoustic waves.

    PubMed

    Kulkarni, Ketav P; Ramarathinam, Sri H; Friend, James; Yeo, Leslie; Purcell, Anthony W; Perlmutter, Patrick

    2010-06-21

    A new method for in-gel sample processing and tryptic digestion of proteins is described. Sample preparation, rehydration, in situ digestion and peptide extraction from gel slices are dramatically accelerated by treating the gel slice with surface acoustic waves (SAWs). Only 30 minutes of total workflow time is required for this new method to produce base peak chromatograms (BPCs) of similar coverage and intensity to those observed for traditional processing and overnight digestion. Simple setup, good reproducibility, excellent peptide recoveries, rapid turnover of samples and high-confidence protein identifications put this technology at the forefront of the next generation of proteomics sample processing tools.

  20. Technical Parameters Modeling of a Gas Probe Foaming Using an Active Experimental Type Research

    NASA Astrophysics Data System (ADS)

    Tîtu, A. M.; Sandu, A. V.; Pop, A. B.; Ceocea, C.; Tîtu, S.

    2018-06-01

    The present paper deals with a current and complex topic: solving a technical problem regarding the modeling and subsequent optimization of technical parameters related to the natural gas extraction process. The aim of the study is to optimize gas well foaming using experimental research methods, with data obtained by regular well interventions with different foaming agents. Foaming of the accumulated formation water by the foaming agent reduces the hydrostatic pressure, and the foam is then carried to the surface by the produced gas flow. Well production data were analyzed, and a candidate well for the research emerged. The study was carried out in the field, where it was found that, owing to severe depletion of the gas field, well flow rates were decreasing and the wells had begun to load with formation water. Regular foaming of the wells was required to optimize the daily production flow and remove the water accumulated in the wellbore. To analyze the natural gas production process, a factorial experiment and other methods were used; this choice was made because the method can deliver very good research results from a small number of experimental runs. Finally, through this study the extraction-process problems were identified by analyzing and optimizing the technical parameters, which led to a quality improvement of the extraction process.
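
    A full factorial experiment of the kind used here enumerates every combination of factor levels, which is easy to sketch in code. The factor names and levels below are illustrative placeholders, not the paper's actual parameters:

```python
# Two-level full factorial design: every combination of factor levels
# becomes one experimental run.
from itertools import product

factors = {
    "foaming_agent_dose": (0.5, 2.0),   # hypothetical, L per intervention
    "interval_days": (3, 7),            # hypothetical treatment interval
    "agent_type": ("A", "B"),           # hypothetical foaming agents
}
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(len(runs))  # 2^3 = 8 experimental runs
for run in runs[:2]:
    print(run)
```

    The appeal noted in the abstract is visible here: with k two-level factors, only 2^k runs cover all main effects and interactions, so meaningful conclusions come from a small number of experiments.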

  1. 40 CFR 180.1246 - Yeast Extract Hydrolysate from Saccharomyces cerevisiae: exemption from the requirement of a...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 25 2013-07-01 2013-07-01 false Yeast Extract Hydrolysate from... PESTICIDE CHEMICAL RESIDUES IN FOOD Exemptions From Tolerances § 180.1246 Yeast Extract Hydrolysate from... exemption from the requirement of a tolerance for residues of the biochemical pesticide Yeast Extract...

  2. 40 CFR 180.1246 - Yeast Extract Hydrolysate from Saccharomyces cerevisiae: exemption from the requirement of a...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Yeast Extract Hydrolysate from... PESTICIDE CHEMICAL RESIDUES IN FOOD Exemptions From Tolerances § 180.1246 Yeast Extract Hydrolysate from... exemption from the requirement of a tolerance for residues of the biochemical pesticide Yeast Extract...

  3. 40 CFR 180.1246 - Yeast Extract Hydrolysate from Saccharomyces cerevisiae: exemption from the requirement of a...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 24 2011-07-01 2011-07-01 false Yeast Extract Hydrolysate from... PESTICIDE CHEMICAL RESIDUES IN FOOD Exemptions From Tolerances § 180.1246 Yeast Extract Hydrolysate from... exemption from the requirement of a tolerance for residues of the biochemical pesticide Yeast Extract...

  4. 40 CFR 180.1246 - Yeast Extract Hydrolysate from Saccharomyces cerevisiae: exemption from the requirement of a...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 24 2014-07-01 2014-07-01 false Yeast Extract Hydrolysate from... PESTICIDE CHEMICAL RESIDUES IN FOOD Exemptions From Tolerances § 180.1246 Yeast Extract Hydrolysate from... exemption from the requirement of a tolerance for residues of the biochemical pesticide Yeast Extract...

  5. Extraction and textural characterization of above-ground areas from aerial stereo pairs: a quality assessment

    NASA Astrophysics Data System (ADS)

    Baillard, C.; Dissard, O.; Jamet, O.; Maître, H.

    Above-ground analysis is a key step in the reconstruction of urban scenes, but it is a difficult task because of the diversity of the objects involved. We propose a new method for above-ground extraction from an aerial stereo pair that does not require any assumption about object shape or nature. A Digital Surface Model is first produced by a stereoscopic matching stage that preserves discontinuities, and then processed by a region-based Markovian classification algorithm. The extracted above-ground areas are finally characterized as man-made or natural according to the grey-level information. The quality of the results is assessed and discussed.

  6. An improvement of vehicle detection under shadow regions in satellite imagery

    NASA Astrophysics Data System (ADS)

    Karim, Shahid; Zhang, Ye; Ali, Saad; Asif, Muhammad Rizwan

    2018-04-01

    The processing of satellite imagery depends on image quality. Because of low resolution, it is difficult to extract information accurate enough to meet application requirements. For vehicle detection in shadow regions, we use HOG for feature extraction and an SVM for classification; HOG has proven a worthwhile tool for complex environments. Shadow images were scrutinized and found very difficult for detection, with very low detection rates observed, so our effort is directed at enhancing the detection rate in shadow regions by implementing appropriate preprocessing. Vehicles in non-shadow regions are detected precisely, with a higher detection rate than in shadow regions.
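
    The HOG descriptor at the heart of this pipeline reduces an image patch to a histogram of gradient orientations weighted by gradient magnitude. A deliberately minimal, dependency-free sketch of that core step (a production pipeline would instead use a library implementation such as skimage.feature.hog together with a trained SVM):

```python
# Minimal gradient-orientation histogram, the core idea behind HOG.
import math

def orientation_histogram(img, bins=9):
    """L1-normalized histogram of unsigned gradient orientations."""
    h, w = len(img), len(img[0])
    hist = [0.0] * bins
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = img[y][x + 1] - img[y][x - 1]   # central differences
            gy = img[y + 1][x] - img[y - 1][x]
            mag = math.hypot(gx, gy)
            ang = math.degrees(math.atan2(gy, gx)) % 180  # unsigned
            hist[int(ang // (180 / bins)) % bins] += mag
    total = sum(hist) or 1.0
    return [v / total for v in hist]

# Tiny patch with a vertical edge: the horizontal gradient puts all the
# histogram energy in the 0-degree bin.
img = [[0, 0, 1, 1]] * 4
print(orientation_histogram(img))
```

    Real HOG additionally computes such histograms per cell, normalizes them over overlapping blocks, and concatenates them into the feature vector fed to the SVM.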

  7. Comparative evaluation of in-house manual, and commercial semi-automated and automated DNA extraction platforms in the sample preparation of human stool specimens for a Salmonella enterica 5'-nuclease assay.

    PubMed

    Schuurman, Tim; de Boer, Richard; Patty, Rachèl; Kooistra-Smid, Mirjam; van Zwet, Anton

    2007-12-01

    In the present study, three methods (NucliSens miniMAG [bioMérieux], MagNA Pure DNA Isolation Kit III Bacteria/Fungi [Roche], and a silica-guanidinium thiocyanate [Si-GuSCN-F] procedure) for extracting DNA from stool specimens were compared with regard to analytical performance (relative DNA recovery and downstream real-time PCR amplification of Salmonella enterica DNA), stability of the extracted DNA, hands-on time (HOT), total processing time (TPT), and costs. The Si-GuSCN-F procedure showed the highest analytical performance (relative recovery of 99%, S. enterica real-time PCR sensitivity of 91%) at the lowest cost per extraction (euro 4.28). However, this method did require the longest HOT (144 min) and the longest subsequent TPT (176 min) when processing 24 extractions. Both miniMAG and MagNA Pure extraction showed similar performance at first (relative recoveries of 57% and 52%, S. enterica real-time PCR sensitivity of 85%). However, when differences in the observed Ct values after real-time PCR were taken into account, MagNA Pure resulted in a significant increase in Ct value compared with both miniMAG and Si-GuSCN-F (on average +1.26 and +1.43 cycles). With regard to inhibition, all methods showed relatively low inhibition rates (<4%), with miniMAG providing the lowest rate (0.7%). Extracted DNA was stable for at least 1 year for all methods. HOT was lowest for MagNA Pure (60 min) and TPT was shortest for miniMAG (121 min). Costs, finally, were euro 4.28 for Si-GuSCN-F, euro 6.69 for MagNA Pure and euro 9.57 for miniMAG.

  8. "In situ" extraction of essential oils by use of Dean-Stark glassware and a Vigreux column inside a microwave oven: a procedure for teaching green analytical chemistry.

    PubMed

    Chemat, Farid; Perino-Issartier, Sandrine; Petitcolas, Emmanuel; Fernandez, Xavier

    2012-08-01

    One of the principal objectives of sustainable and green processing development remains the dissemination and teaching of green chemistry in colleges, high schools, and academic laboratories. This paper describes simple glassware that illustrates the phenomenon of extraction in a conventional microwave oven as energy source and a process for green analytical chemistry. Simple glassware comprising a Dean-Stark apparatus (for extraction of aromatic plant material and recovery of essential oils and distilled water) and a Vigreux column (as an air-cooled condenser inside the microwave oven) was designed as an in-situ extraction vessel inside a microwave oven. The efficiency of this experiment was validated for extraction of essential oils from 30 g fresh orange peel, a by-product in the production of orange juice. Every laboratory throughout the world can use this equipment. The microwave power is 100 W and the irradiation time 15 min. The method is performed at atmospheric pressure without added solvent or water and furnishes essential oils similar to those obtained by conventional hydro or steam distillation. By use of GC-MS, 22 compounds in orange peel were separated and identified; the main compounds were limonene (72.1%), β-pinene (8.4%), and γ-terpinene (6.9%). This procedure is appropriate for the teaching laboratory, does not require any special microwave equipment, and enables the students to learn the skills of extraction, and chromatographic and spectroscopic analysis. They are also exposed to a dramatic visual example of rapid, sustainable, and green extraction of an essential oil, and are introduced to successful sustainable and green analytical chemistry.

  9. Study of optimal extraction conditions for achieving high yield and antioxidant activity of tomato seed oil.

    PubMed

    Shao, Dongyan; Atungulu, Griffiths G; Pan, Zhongli; Yue, Tianli; Zhang, Ang; Li, Xuan

    2012-08-01

    The value of tomato seed has not been fully recognized. The objectives of this research were to establish suitable processing conditions for extracting oil from tomato seed using solvent, determine the impact of processing conditions on yield and antioxidant activity of the extracted oil, and elucidate the kinetics of the oil extraction process. Four processing parameters, including time, temperature, solvent-to-solid ratio and particle size, were studied. A second-order model was established to describe the oil extraction process. Based on the results, increasing temperature, solvent-to-solid ratio, and extraction time increased oil yield. In contrast, larger particle size reduced the oil yield. The recommended oil extraction conditions were 8 min of extraction time at a temperature of 25 °C, a solvent-to-solids ratio of 5/1 (v/w) and a particle size of 0.38 mm, which gave an oil yield of 20.32% with a recovery rate of 78.56%. The DPPH scavenging activity of the extracted oil was not significantly affected by the extraction parameters. The inhibitory concentration (IC50) of tomato seed oil was 8.67 mg/mL, which was notably low compared to most vegetable oils. The second-order model successfully described the kinetics of the tomato oil extraction process, and the kinetic parameters, including the initial extraction rate (h), the equilibrium concentration of oil (Cs), and the extraction rate constant (k), could be precisely predicted with R² of at least 0.957. The study revealed that tomato seed, which is typically treated as a low-value byproduct of tomato processing, has great potential for producing oil with high antioxidant capability. The impact of processing conditions including time, temperature, solvent-to-solid ratio and particle size on yield and antioxidant activity of extracted tomato seed oil is reported. Optimal conditions and models which describe the extraction process are recommended. The information is vital for determining the extraction processing conditions for industrial production of high-quality tomato seed oil. Journal of Food Science © 2012 Institute of Food Technologists® No claim to original US government works.
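    The kinetic description above can be sketched in code. A minimal sketch, assuming the standard second-order solid-liquid extraction model dC/dt = k(Cs − C)², whose integrated form linearises to t/C = 1/h + t/Cs with initial extraction rate h = k·Cs²; the parameter values below are illustrative, not taken from the study:

```python
# Second-order extraction kinetics (illustrative parameters, not the
# study's data): dC/dt = k * (Cs - C)^2, integrated to
#   C(t) = Cs^2 * k * t / (1 + Cs * k * t),
# linearised as t/C = 1/h + t/Cs with initial rate h = k * Cs^2.

def second_order_conc(t, Cs, k):
    """Oil concentration in the solvent at time t under the model."""
    return Cs * Cs * k * t / (1.0 + Cs * k * t)

def fit_linearised(ts, concs):
    """Recover (h, Cs, k) from a least-squares line fit of t/C vs t."""
    ys = [t / c for t, c in zip(ts, concs)]
    n = len(ts)
    mx = sum(ts) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(ts, ys))
             / sum((x - mx) ** 2 for x in ts))
    intercept = my - slope * mx
    Cs = 1.0 / slope         # slope of the line is 1/Cs
    h = 1.0 / intercept      # intercept is 1/h
    k = h / (Cs * Cs)        # from h = k * Cs^2
    return h, Cs, k

# Synthetic "measurements" generated from known parameters, then refit.
Cs_true, k_true = 0.20, 1.5          # illustrative units
ts = [1.0, 2.0, 4.0, 8.0, 16.0]      # extraction times
concs = [second_order_conc(t, Cs_true, k_true) for t in ts]
h, Cs, k = fit_linearised(ts, concs)
```

    Because the linearised form is a straight line in t, an ordinary least-squares fit of t/C against t recovers h, Cs and k directly, which is how such kinetic parameters are commonly estimated.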

  10. 21 CFR 173.280 - Solvent extraction process for citric acid.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 3 2014-04-01 2014-04-01 false Solvent extraction process for citric acid. 173..., Lubricants, Release Agents and Related Substances § 173.280 Solvent extraction process for citric acid. A solvent extraction process for recovery of citric acid from conventional Aspergillus niger fermentation...

  11. [Studies on the extraction process of total saponins from Paris polyphylla Smith].

    PubMed

    Sun, Zhi-Guo; Zhang, Lin; Li, Ling-Jun; Tian, Jing-Kui

    2007-06-01

    To optimize the extraction process of total saponins from Paris polyphylla Smith. The single-factor test and an orthogonal experiment were used to determine the optimum extraction process. The optimum extraction process was as follows: the plant materials were extracted with 70% ethanol twice, first with 10 BV of solvent for 2 hours and then with 8 BV for 1.5 hours. The yield of total saponins could be up to 4.24% and the total extraction rate of Paris polyphylla I and Paris polyphylla II was 93.28%. The optimum process obtained is stable, reasonable and feasible.

  12. Effect of a hawthorn extract on contraction and energy turnover of isolated rat cardiomyocytes.

    PubMed

    Pöpping, S; Rose, H; Ionescu, I; Fischer, Y; Kammermeier, H

    1995-11-01

    The hawthorn extract LI 132 (crataegus), prepared from leaves and flowers and standardised to 2.2% flavonoids, was investigated with respect to its effect on (1) the contraction, (2) the energy turnover and (3) the apparent refractory period (t(ref)) of isolated cardiac myocytes from adult rats. (1) The contractile behaviour of attached myocytes was analyzed by an image processing system. (2) The energy turnover was calculated from the decrease in oxygen content in the myocyte suspension brought about by cellular respiration. Energy turnover related to cell shortening was differentiated from that required for ionic transport processes by application of the contraction-inhibiting agent 2,3-butanedione monoxime. (3) The apparent refractory period (t(ref)) was evaluated by pacing the myocytes with increasing stimulation rates and determining the frequency at which failure of single contractions occurred. For these purposes, the myocytes were incubated in a stimulation chamber, part of a computer-assisted system that allows simultaneous evaluation of the mechanics and energetics of electrically induced contraction. Within a range of 30-180 microg/ml, the hawthorn extract exhibited a positive inotropic effect on the contraction amplitude, accompanied by a moderate increase of energy turnover for both mechanical and ionic processes. In comparison with other positive inotropic interventions, such as application of the beta-adrenergic agonist isoprenaline, or of the cardiac glycoside ouabain (g-strophantin), or elevation of the extracellular Ca++ concentration, the effects of the hawthorn extract were significantly more economical with respect to the energetics of the myocytes. Furthermore, the extract prolonged the apparent refractory period in the presence and the absence of isoprenaline, which may be indicative of an antiarrhythmic potential.

  13. Using Microwaves for Extracting Water from the Moon

    NASA Technical Reports Server (NTRS)

    Ethridge, Edwin C.

    2009-01-01

    Twenty years ago, the Lunar Prospector remote sensing satellite provided evidence of relatively large hydrogen concentrations at the lunar poles, concentrated in particular in permanently shadowed craters. The scientific hypothesis is that the hydrogen is in the form of cryo-trapped water just under the surface of the soil. If true, this would mean that an average of about 2% water ice is mixed with the lunar soil, existing as ice at cryogenic temperatures. For 5 years we have been investigating the use of microwaves for the processing of lunar soil. One of the early uses could be to use microwave energy to extract volatiles, and in particular water, from the lunar permafrost. Prototype experiments have shown that microwave energy at 2.45 GHz, as in consumer microwave ovens, will couple with and heat cryogenically cooled lunar soil permafrost simulant, resulting in the rapid sublimation of water vapor into the vacuum chamber. The water vapor has been collected on a cryogenic cold trap with high efficiency. The primary advantage of microwave processing is that the volatiles can be extracted in situ; excavation would not be required. Microwave-frequency dielectric property measurements are being made of different lunar soil simulants, and plans are to measure Apollo lunar soil at different frequencies and over a range of temperatures. The materials properties are being used to evaluate the heating of lunar soil and to develop COMSOL models for evaluating different microwave extraction scenarios. With COMSOL, heating from cryogenic temperatures can be calculated using temperature-dependent materials properties. Calculations at different microwave frequencies will allow evaluation of the type of hardware that would be needed to most efficiently extract the water and other volatiles.

  14. 40 CFR 180.1161 - Clarified hydrophobic extract of neem oil; exemption from the requirement of a tolerance.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... neem oil; exemption from the requirement of a tolerance. Clarified hydrophobic extract of neem oil is... 40 Protection of Environment 24 2014-07-01 2014-07-01 false Clarified hydrophobic extract of neem oil; exemption from the requirement of a tolerance. 180.1161 Section 180.1161 Protection of...

  15. 40 CFR 180.1161 - Clarified hydrophobic extract of neem oil; exemption from the requirement of a tolerance.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... neem oil; exemption from the requirement of a tolerance. Clarified hydrophobic extract of neem oil is... 40 Protection of Environment 24 2011-07-01 2011-07-01 false Clarified hydrophobic extract of neem oil; exemption from the requirement of a tolerance. 180.1161 Section 180.1161 Protection of...

  16. 40 CFR 180.1161 - Clarified hydrophobic extract of neem oil; exemption from the requirement of a tolerance.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... neem oil; exemption from the requirement of a tolerance. Clarified hydrophobic extract of neem oil is... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Clarified hydrophobic extract of neem oil; exemption from the requirement of a tolerance. 180.1161 Section 180.1161 Protection of...

  17. 40 CFR 180.1161 - Clarified hydrophobic extract of neem oil; exemption from the requirement of a tolerance.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... neem oil; exemption from the requirement of a tolerance. Clarified hydrophobic extract of neem oil is... 40 Protection of Environment 25 2013-07-01 2013-07-01 false Clarified hydrophobic extract of neem oil; exemption from the requirement of a tolerance. 180.1161 Section 180.1161 Protection of...

  18. 40 CFR 180.1161 - Clarified hydrophobic extract of neem oil; exemption from the requirement of a tolerance.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... neem oil; exemption from the requirement of a tolerance. Clarified hydrophobic extract of neem oil is... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Clarified hydrophobic extract of neem oil; exemption from the requirement of a tolerance. 180.1161 Section 180.1161 Protection of...

  19. Predicting extractives content of Eucalyptus bosistoana F. Muell. Heartwood from stem cores by near infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Li, Yanjie; Altaner, Clemens

    2018-06-01

    Time and resources are the restricting factors for the wider use of chemical information of wood in tree breeding programs. NIR offers an advantage over wet-chemical analysis in these respects and is starting to be used for tree breeding. This work describes the development of a NIR-based assessment of extractive content in heartwood of E. bosistoana which does not require milling and conditioning of the samples. This was achieved by applying signal processing algorithms (external parameter orthogonalisation (EPO) and significance multivariate correlation (sMC)) to spectra obtained from solid wood cores, which were able to correct for moisture content, grain direction and sample form. The accuracy of extractive content predictions was further improved by variable selection, resulting in a root mean square error of 1.27%. Considering the range of extractive content in E. bosistoana heartwood of 1.3 to 15.0%, the developed NIR calibration has the potential to be used in an E. bosistoana breeding program or to assess the spatial variation in extractive content throughout a stem.
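    As a rough illustration of the calibration idea only (not the authors' EPO/sMC pipeline), a minimal sketch: fit a linear calibration from a handful of selected wavelength variables to extractive content by ordinary least squares and report the root-mean-square error; the spectra and reference values below are synthetic:

```python
# Hedged sketch: a minimal NIR-style calibration by ordinary least
# squares on a few selected wavelength variables, with RMSE of the
# fitted predictions. All data here are synthetic and noise-free,
# so the calibration recovers the generating coefficients exactly.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_vars = 30, 5                 # selected wavelength variables
true_coef = np.array([2.0, -1.0, 0.5, 0.0, 1.5])

X = rng.normal(size=(n_samples, n_vars))  # pre-processed spectra
y = X @ true_coef + 3.0                   # reference extractive content (%)

# Fit calibration y ≈ X b + b0 (append an intercept column).
Xa = np.hstack([X, np.ones((n_samples, 1))])
coef, *_ = np.linalg.lstsq(Xa, y, rcond=None)

y_hat = Xa @ coef
rmse = float(np.sqrt(np.mean((y - y_hat) ** 2)))
```

    In practice the reported RMSE would be computed on held-out validation cores rather than the training set, and methods such as PLS would replace plain least squares when many wavelengths are used.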

  20. Application of solid-phase extraction to agar-supported fermentation.

    PubMed

    Le Goff, Géraldine; Adelin, Emilie; Cortial, Sylvie; Servy, Claudine; Ouazzani, Jamal

    2013-09-01

    Agar-supported fermentation (Ag-SF), a variant of solid-state fermentation, has recently been improved by the development of a dedicated 2 m(2) scale pilot facility, Platotex. We investigated the application of solid-phase extraction (SPE) to Ag-SF in order to increase yields and minimize the contamination of the extracts with agar constituents. Selection of the appropriate resin was conducted in liquid-state fermentation, and Diaion HP-20 exhibited the highest recovery yield and selectivity for the metabolites of the model fungal strains Phomopsis sp. and Fusarium sp. SPE applied to Ag-SF resulted in a particular compartmentalization of the culture. The mycelium, which requires oxygen to grow, migrates to the top layer and forms a thick biofilm. The resin beads intercalate between the agar surface and the mycelium layer, and directly trap the compounds secreted by the mycelium through a "solid-solid extraction" (SSE) process. The resin/mycelium layer is easily recovered by scraping the surface, and the target metabolites are extracted with methanol. Ag-SF combined with SSE represents an ideal compromise for the production of bioactive secondary metabolites with limited economic and environmental impact.

  1. Application of forward osmosis membrane technology for oil sands process-affected water desalination.

    PubMed

    Jiang, Yaxin; Liang, Jiaming; Liu, Yang

    2016-01-01

    The extraction process used to obtain bitumen from the oil sands produces large volumes of oil sands process-affected water (OSPW). As a newly emerging desalination technology, forward osmosis (FO) has shown great promise in reducing electrical power requirements, increasing water recovery, and minimizing brine discharge. With the support of this funding, an FO system was constructed using a cellulose triacetate FO membrane to test the feasibility of OSPW desalination and contaminant removal. The FO systems were optimized using different types and concentrations of draw solution. The FO system using 4 M NH4HCO3 as a draw solution achieved 85% water recovery from OSPW, and 80 to 100% contaminant rejection for most metals and ions. A water backwash cleaning method was applied to clean the fouled membrane, and the cleaned membrane achieved 77% water recovery, a performance comparable to that of new FO membranes. This suggests that the membrane fouling was reversible. The FO system developed in this project provides a novel and energy-efficient strategy to remediate the tailings waters generated by oil sands bitumen extraction and processing.

  2. Separation science is the key to successful biopharmaceuticals.

    PubMed

    Guiochon, Georges; Beaver, Lois Ann

    2011-12-09

    The impact of economic change and advances in science, therapy and production processes has resulted in considerable growth in the area of biopharmaceuticals. Progress in the selection of microorganisms and improvements in cell culture and bioreactors are evidenced by increased yields of the desired products in the complex fermentation mixture. At this stage, the downstream process of extraction and purification of the desired biopharmaceutical requires considerable attention in the design and operation of the units used for preparative chromatography. Understanding of the process and optimization of column design and experimental conditions have become critical to the biopharmaceutical industry in order to minimize production costs while satisfying new regulatory requirements. Optimization of the purification of biopharmaceuticals by preparative liquid chromatography, including an examination of column preparation and bed properties, is the focus of this manuscript. Copyright © 2011 Elsevier B.V. All rights reserved.

  3. [Study on extraction process of zhanjin ruji].

    PubMed

    Du, Zhi-qian; Du, Tian-xin; Wang, Zhong-dong; Li, Gen-lin

    2003-01-01

    To select the optimum extraction process of Zhanjin Ruji. The influence of extraction time on the extraction rate of volatile oil was observed, and an orthogonal test was adopted to examine the alcohol extraction with respect to the extraction rate and the content of total saponins in Radix Notoginseng. When the three herbs Radix Angelicae Sinensis, Resina Olibani and Myrrha were extracted with water for 3 hours, 95% of the volatile oil could be distilled. The three herbs Radix Notoginseng, Herba Lycopodii and Radix Gentianae Macrophyllae were extracted with alcohol. Four factors, alcohol concentration (A), number of extractions (B), extraction time (C), and solvent amount (D), had no significant effect on the content of total saponins in Radix Notoginseng in herbal extraction, but factors A and B had a significant effect on the extraction rate. The optimum extraction process was as follows: extraction with 5 times the solvent volume of 60% alcohol, three times, for 1 hour each time. Three replicate experiments showed that the extraction rate was 26.5% and the content of the total saponins in Radix Notoginseng was 17.28% mg.g-1. The above experimental results can provide an experimental basis for deciding the extraction process of Zhanjin Ruji.

  4. Mars Soil-Based Resource Processing and Planetary Protection

    NASA Technical Reports Server (NTRS)

    Sanders, G. B.; Mueller, R. P.

    2015-01-01

    The ability to extract and process resources at the site of exploration into products and services, commonly referred to as In Situ Resource Utilization (ISRU), can have significant benefits for robotic and human exploration missions. In particular, the ability to use in situ resources to make propellants, fuel cell reactants, and life support consumables has been shown in studies to significantly reduce mission mass, cost, and risk, while enhancing or enabling missions not possible without the incorporation of ISRU. In December 2007, NASA completed the Mars Human Design Reference Architecture (DRA) 5.0 study. For the first time in a large scale Mars architecture study, water from Mars soil was considered as a potential resource. At the time of the study, knowledge of water resources (their form, concentration, and distribution) was extremely limited. Also, due to lack of understanding of how to apply planetary protection rules and requirements to ISRU soil-based excavation and processing, an extremely conservative approach was incorporated where only the top several centimeters of ultraviolet (UV) radiated soil could be processed (assumed to be 3% water by mass). While results of the Mars DRA 5.0 study showed that combining atmosphere processing to make oxygen and methane with soil processing to extract water provided the lowest mission mass, atmosphere processing to convert carbon dioxide (CO2) into oxygen was baselined for the mission since it was the lowest power and risk option. With increased knowledge and further clarification of Mars planetary protection rules, and the recent release of the Mars Exploration Program Analysis Group (MEPAG) report on "Special Regions and the Human Exploration of Mars", it is time to reexamine potential water resources on Mars, options for soil processing to extract water, and the implications with respect to planetary protection and Special Regions on Mars.

  5. Qualitative and quantitative evaluation of the genomic DNA extracted from GMO and non-GMO foodstuffs with four different extraction methods.

    PubMed

    Peano, Clelia; Samson, Maria Cristina; Palmieri, Luisa; Gulli, Mariolina; Marmiroli, Nelson

    2004-11-17

    The presence of DNA in foodstuffs derived from or containing genetically modified organisms (GMO) is the basic requirement for labeling of GMO foods in Council Directive 2001/18/CE (Off. J. Eur. Communities 2001, L1 06/2). In this work, four different methods for DNA extraction were evaluated and compared. To rank the different methods, the quality and quantity of DNA extracted from standards, containing known percentages of GMO material and from different food products, were considered. The food products analyzed derived from both soybean and maize and were chosen on the basis of the mechanical, technological, and chemical treatment they had been subjected to during processing. Degree of DNA degradation at various stages of food production was evaluated through the amplification of different DNA fragments belonging to the endogenous genes of both maize and soybean. Genomic DNA was extracted from Roundup Ready soybean and maize MON810 standard flours, according to four different methods, and quantified by real-time Polymerase Chain Reaction (PCR), with the aim of determining the influence of the extraction methods on the DNA quantification through real-time PCR.

  6. NEFI: Network Extraction From Images

    PubMed Central

    Dirnberger, M.; Kehl, T.; Neumann, A.

    2015-01-01

    Networks are amongst the central building blocks of many systems. Given a graph of a network, methods from graph theory enable a precise investigation of its properties. Software for the analysis of graphs is widely available and has been applied to study various types of networks. In some applications, graph acquisition is relatively simple. However, for many networks data collection relies on images where graph extraction requires domain-specific solutions. Here we introduce NEFI, a tool that extracts graphs from images of networks originating in various domains. Regarding previous work on graph extraction, theoretical results are fully accessible only to an expert audience and ready-to-use implementations for non-experts are rarely available or insufficiently documented. NEFI provides a novel platform allowing practitioners to easily extract graphs from images by combining basic tools from image processing, computer vision and graph theory. Thus, NEFI constitutes an alternative to tedious manual graph extraction and special purpose tools. We anticipate NEFI to enable time-efficient collection of large datasets. The analysis of these novel datasets may open up the possibility to gain new insights into the structure and function of various networks. NEFI is open source and available at http://nefi.mpi-inf.mpg.de. PMID:26521675
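    The core idea of graph extraction from a skeletonised network image can be sketched as follows. This is a minimal illustration, not NEFI's actual pipeline: on a 4-connected skeleton, foreground pixels with one neighbour (endpoints) or more than two neighbours (junctions) become graph nodes, and the degree-2 pixel paths between them become edges:

```python
# Hedged sketch of graph extraction from a skeletonised binary image
# (not NEFI's pipeline): classify skeleton pixels by 4-connected
# degree, then trace the paths between node pixels to form edges.

def neighbours(p, pixels):
    """4-connected foreground neighbours of pixel p."""
    r, c = p
    return [(r + dr, c + dc) for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1))
            if (r + dr, c + dc) in pixels]

def extract_graph(pixels):
    """pixels: set of (row, col) skeleton coordinates.
    Nodes are pixels whose degree is not 2 (endpoints and junctions);
    edges connect nodes via intermediate degree-2 path pixels."""
    nodes = {p for p in pixels if len(neighbours(p, pixels)) != 2}
    edges = set()
    for n in nodes:
        for step in neighbours(n, pixels):
            prev, cur = n, step
            while cur not in nodes:           # walk along the path
                nxt = [q for q in neighbours(cur, pixels) if q != prev]
                if not nxt:
                    break
                prev, cur = cur, nxt[0]
            if cur in nodes and cur != n:
                edges.add(frozenset((n, cur)))  # undirected, deduplicated
    return nodes, edges

# A small cross: one central junction, four endpoints, four edges.
cross = {(2, c) for c in range(5)} | {(r, 2) for r in range(5)}
nodes, edges = extract_graph(cross)
```

    Real tools additionally need image segmentation and skeletonisation before this step, plus edge attributes such as length and width, which is where domain-specific solutions come in.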

  7. Automation of lidar-based hydrologic feature extraction workflows using GIS

    NASA Astrophysics Data System (ADS)

    Borlongan, Noel Jerome B.; de la Cruz, Roel M.; Olfindo, Nestor T.; Perez, Anjillyn Mae C.

    2016-10-01

    With the advent of LiDAR technology, higher-resolution datasets have become available for use in different remote sensing and GIS applications. One significant application of LiDAR datasets in the Philippines is in resource feature extraction. Feature extraction using LiDAR datasets requires complex and repetitive workflows which can take a lot of researchers' time when executed and supervised manually. The Development of the Philippine Hydrologic Dataset for Watersheds from LiDAR Surveys (PHD), a project under the Nationwide Detailed Resources Assessment Using LiDAR (Phil-LiDAR 2) program, created a set of scripts, the PHD Toolkit, to automate the processes and workflows necessary for hydrologic feature extraction, specifically Streams and Drainages, Irrigation Network, and Inland Wetlands, using LiDAR datasets. These scripts are written in Python and can be added to the ArcGIS® environment as a toolbox. The toolkit is currently being used as an aid for researchers in hydrologic feature extraction by simplifying the workflows, eliminating human errors when providing the inputs, and providing quick and easy-to-use tools for repetitive tasks. This paper discusses the actual implementation of the different workflows developed by Phil-LiDAR 2 Project 4 for Streams, Irrigation Network and Inland Wetlands extraction.
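    The idea of packaging repetitive extraction steps into an automated workflow can be sketched generically. This is a plain-Python stand-in with toy, hypothetical step functions; the PHD Toolkit itself is built on ArcGIS/arcpy tools, which are not reproduced here:

```python
# Hedged, generic sketch of workflow automation: each processing step
# is a function, and a runner chains them so the whole extraction
# workflow runs repeatably without manual supervision. The step
# functions are toy stand-ins, not real hydrologic algorithms.

def fill_sinks(dem):
    """Toy stand-in for a hydrologic conditioning step."""
    return [max(v, 0) for v in dem]

def flow_accumulation(dem):
    """Toy stand-in: cumulative 'flow' along a 1-D profile."""
    acc, total = [], 0
    for v in dem:
        total += v
        acc.append(total)
    return acc

def extract_streams(acc, threshold):
    """Cells whose accumulation exceeds a threshold become streams."""
    return [i for i, v in enumerate(acc) if v > threshold]

def run_workflow(dem, threshold):
    steps = [fill_sinks, flow_accumulation]
    data = dem
    for step in steps:            # chained, repeatable, no manual steps
        data = step(data)
    return extract_streams(data, threshold)

streams = run_workflow([1, -2, 3, 4, 0, 2], threshold=5)
```

    Wrapping such a runner as a script tool is what lets inputs be validated once and the whole chain re-run on new tiles without per-step supervision.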

  8. Semantic and syntactic interoperability in online processing of big Earth observation data.

    PubMed

    Sudmanns, Martin; Tiede, Dirk; Lang, Stefan; Baraldi, Andrea

    2018-01-01

    The challenge of enabling syntactic and semantic interoperability for comprehensive and reproducible online processing of big Earth observation (EO) data is still unsolved. Supporting both types of interoperability is one of the requirements for efficiently extracting valuable information from the large amount of available multi-temporal gridded data sets. The proposed system wraps world models (semantic interoperability) into OGC Web Processing Services (syntactic interoperability) for semantic online analyses. World models describe spatio-temporal entities and their relationships in a formal way. The proposed system serves as an enabler for (1) technical interoperability, using a standardised interface to be used by all types of clients, and (2) allowing experts from different domains to develop complex analyses together as a collaborative effort. Users connect the world models online to the data, which are maintained in centralised storage as 3D spatio-temporal data cubes. This allows even non-experts to extract valuable information from EO data, because data management, low-level interactions and specific software issues can be ignored. We discuss the concept of the proposed system, provide a technical implementation example and describe three use cases for extracting changes from EO images, and demonstrate the usability also for non-EO, gridded, multi-temporal data sets (CORINE land cover).

  9. Semantic and syntactic interoperability in online processing of big Earth observation data

    PubMed Central

    Sudmanns, Martin; Tiede, Dirk; Lang, Stefan; Baraldi, Andrea

    2018-01-01

    ABSTRACT The challenge of enabling syntactic and semantic interoperability for comprehensive and reproducible online processing of big Earth observation (EO) data is still unsolved. Supporting both types of interoperability is one of the requirements for efficiently extracting valuable information from the large amount of available multi-temporal gridded data sets. The proposed system wraps world models (semantic interoperability) into OGC Web Processing Services (syntactic interoperability) for semantic online analyses. World models describe spatio-temporal entities and their relationships in a formal way. The proposed system serves as an enabler for (1) technical interoperability, using a standardised interface to be used by all types of clients, and (2) allowing experts from different domains to develop complex analyses together as a collaborative effort. Users connect the world models online to the data, which are maintained in centralised storage as 3D spatio-temporal data cubes. This allows even non-experts to extract valuable information from EO data, because data management, low-level interactions and specific software issues can be ignored. We discuss the concept of the proposed system, provide a technical implementation example and describe three use cases for extracting changes from EO images, and demonstrate the usability also for non-EO, gridded, multi-temporal data sets (CORINE land cover). PMID:29387171

  10. Towards automatic patient selection for chemotherapy in colorectal cancer trials

    NASA Astrophysics Data System (ADS)

    Wright, Alexander; Magee, Derek; Quirke, Philip; Treanor, Darren E.

    2014-03-01

    A key factor in the prognosis of colorectal cancer, and its response to chemoradiotherapy, is the ratio of cancer cells to surrounding tissue (the so-called tumour:stroma ratio). Currently the tumour:stroma ratio is calculated manually, by examining H&E-stained slides and counting the proportion of area of each. Virtual slides facilitate this analysis by allowing pathologists to annotate areas of tumour on a given digital slide image, and in-house developed stereometry tools mark random, systematic points on the slide, known as spots. These spots are examined and classified by the pathologist. Typical analyses require a pathologist to score at least 300 spots per tumour. This is a time-consuming (10-60 minutes per case) and laborious task for the pathologist, and automating this process is highly desirable. Using an existing dataset of expert-classified spots from one colorectal cancer clinical trial, an automated tumour:stroma detection algorithm has been trained and validated. Each spot is extracted as an image patch and then processed for feature extraction, identifying colour, texture, stain intensity and object characteristics. These features are used as training data for a random forest classification algorithm and validated against unseen image patches. This process was repeated for multiple patch sizes. Over 82,000 such patches have been used, and results show an accuracy of 79%, depending on image patch size. A second study examining contextual requirements for pathologist scoring was conducted; it indicates that further analysis of structures within each image patch is required in order to improve algorithm accuracy.
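    The feature-extraction step for patch classification can be sketched as follows. A minimal illustration only: the study's actual colour, texture, stain-intensity and object features, and its random-forest classifier, are not reproduced; a patch here is a synthetic array of RGB tuples:

```python
# Hedged sketch of per-patch feature extraction (the classifier that
# would consume these features is omitted). A patch is a list of rows
# of (r, g, b) tuples.

def patch_features(patch):
    """Per-channel colour means plus grey-level variance
    (a crude texture / stain-intensity proxy)."""
    pixels = [px for row in patch for px in row]
    n = len(pixels)
    means = [sum(px[c] for px in pixels) / n for c in range(3)]
    greys = [sum(px) for px in pixels]   # channel sum as grey proxy
    mg = sum(greys) / n
    var = sum((g - mg) ** 2 for g in greys) / n
    return means + [var]

# A uniform "stroma-like" patch has zero texture variance.
flat = [[(200, 120, 150)] * 4 for _ in range(4)]
feats = patch_features(flat)
```

    In practice, one such feature vector per spot, labelled by the pathologist's classification, would form the training matrix for a classifier such as a random forest.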

  11. Bio-Oil Separation and Stabilization by Near-Critical Propane Fractionation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ginosar, Daniel M.; Petkovic, Lucia M.; Agblevor, Foster A.

    Bio-oils produced by thermal process are promising sources of sustainable, low greenhouse gas alternative fuels. These thermal processes are also well suited to decentralized energy production due to low capital and operating costs. Algae feedstocks for bio-oil production are of particular interest, due in part to their high-energy growth yields. Further, algae can be grown in non-arable areas in fresh, brackish, salt water, or even waste water. Unfortunately, bio-oils produced by thermal processes present significant stability challenges. These oils have complex chemical compositions, are viscous, reactive, and thermally unstable. Further, the components within the oils are difficult to separate bymore » fractional distillation. By far, the most effective separation and stabilization method has been solvent extraction. However, liquid phase extraction processes pose two main obstacles to commercialization; they require a significant amount of energy to remove and recover the solvent from the product, and they have a propensity for the solvent to become contaminated with minerals from the char and ash present in the original bio-oil. Separation and fractionation of thermally produced bio-oils using supercritical fluids (SCF) offers the advantages of liquid solvent extraction while drastically reducing energy demands and the predisposition to carry over solids into the extracted phase. SCFs are dense fluids with liquid-like solvent properties and gas-like transport properties. Further, SCF density and solvent strength can be tuned with minor adjustments in pressure, co-solvent addition, or gas anti-solvent addition. Catalytic pyrolysis oils were produced from Scenedesmus dimorphus algae using a fluid catalytic cracking catalyst. Bio-oil produced from catalytic fast pyrolysis (CFP) was separated using critical fluids. Propane extraction was performed at 65 °C at a fluid reduced pressure of 2.0 (85 bar) using an eight to one solvent to feed ratio by weight. 
Extraction of catalytic fast pyrolysis oil with near-critical propane produced an oil extract that was physically and chemically different from, and more stable than, the original oil. The propane extract displayed lower viscosity and lower average molecular weight. The species present in the propane extract were likely the less polar ones, as would be expected when using a non-polar solvent (propane). Carbonyl-containing species in the extract were likely ketones and esters. The raffinate contained a higher amount of OH-bonded species along with the more polar acids, amides, and alcohols. The higher concentration of nitrogen in the raffinate may confirm the presence of amides. Viscosity of the propane extract increased only half as much as that of the CFP bio-oil. Further, in situ NMR aging studies showed that the propane extract was more stable than the raw oil. In conclusion, propane extraction is a promising method to decrease the nitrogen content of bio-oils and to improve the stability of bio-oils obtained by the catalytic pyrolysis of algae-based biomass.
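The reported extraction conditions can be sanity-checked against propane's critical constants (literature values: Tc ≈ 369.8 K, Pc ≈ 42.5 bar); a minimal sketch of the state-point arithmetic, not taken from the paper itself:

```python
# Sanity check of the reported propane extraction conditions against
# propane's critical constants (literature values, not from the abstract).
T_C = 65.0                    # extraction temperature, deg C (from the abstract)
P_BAR = 85.0                  # extraction pressure, bar (from the abstract)
TC_K, PC_BAR = 369.8, 42.5    # propane critical temperature (K) and pressure (bar)

t_reduced = (T_C + 273.15) / TC_K   # reduced temperature T/Tc
p_reduced = P_BAR / PC_BAR          # reduced pressure P/Pc

print(f"Tr = {t_reduced:.2f}")  # ~0.91: below Tc, hence "near-critical", liquid-like propane
print(f"Pr = {p_reduced:.2f}")  # ~2.0: matches the reduced pressure quoted in the abstract
```

The reduced temperature below 1 confirms the abstract's "near critical" (rather than supercritical) description of the propane at these conditions.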

  12. A highly efficient method for extracting next-generation sequencing quality RNA from adipose tissue of recalcitrant animal species.

    PubMed

    Sharma, Davinder; Golla, Naresh; Singh, Dheer; Onteru, Suneel K

    2018-03-01

    The next-generation sequencing (NGS) based RNA sequencing (RNA-Seq) and transcriptome profiling offer an opportunity to unveil complex biological processes. Successful RNA-Seq and transcriptome profiling require a large amount of high-quality RNA. However, NGS-quality RNA isolation is extremely difficult from recalcitrant adipose tissue (AT) with high lipid content and low cell numbers. Further, the amount and biochemical composition of AT lipid vary depending upon the animal species, which can pose different degrees of resistance to RNA extraction. Currently available approaches may work effectively in one species but can be almost unproductive in another. Herein, we report a two-step protocol for the extraction of NGS-quality RNA from AT across a broad range of animal species. © 2017 Wiley Periodicals, Inc.

  13. Automatic Requirements Specification Extraction from Natural Language (ARSENAL)

    DTIC Science & Technology

    2014-10-01

    designers, implementers) involved in the design of software systems. However, natural language descriptions can be informal, incomplete, imprecise...communication of technical descriptions between the various stakeholders (e.g., customers, designers, imple- menters) involved in the design of software systems...the accuracy of the natural language processing stage, the degree of automation, and robustness to noise. 1 2 Introduction Software systems operate in

  14. Design Experiments: Developing and Testing an Intervention for Elementary School-Age Students Who Use Non-Mainstream American English Dialects

    ERIC Educational Resources Information Center

    Thomas-Tate, Shurita; Connor, Carol McDonald; Johnson, Lakeisha

    2013-01-01

    Reading comprehension, defined as the active extraction and construction of meaning from all kinds of text, requires children to fluently decode and understand what they are reading. Basic processes underlying reading comprehension are complex and call on the oral language system and a conscious understanding of this system, i.e., metalinguistic…

  15. Applying Separations Science to Waste Problems.

    DTIC Science & Technology

    1998-01-01

    inert cathode. Centrifugal Contactor for Processing Liquid Radioactive Waste We have developed an annular centrifugal contactor for use in liquid...radioactive waste. The CMT-designed centrifugal contactor has several advantages over other solvent-extraction equipment currently in use. It requires less...Y-12 Plant, Savannah River Site, and Oak Ridge National Laboratory. The benefits that make the centrifugal contactor the equipment of choice in the

  16. 40 CFR 63.2854 - How do I determine the weighted average volume fraction of HAP in the actual solvent loss?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Extraction for Vegetable Oil Production Compliance Requirements § 63.2854 How do I determine the weighted... received for use in your vegetable oil production process. By the end of each calendar month following an... the solvent in each delivery of solvent, including solvent recovered from off-site oil. To determine...

  17. 40 CFR 63.2854 - How do I determine the weighted average volume fraction of HAP in the actual solvent loss?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Extraction for Vegetable Oil Production Compliance Requirements § 63.2854 How do I determine the weighted... received for use in your vegetable oil production process. By the end of each calendar month following an... the solvent in each delivery of solvent, including solvent recovered from off-site oil. To determine...

  18. Degraded Imagery/Art Technique for the Handicapped.

    ERIC Educational Resources Information Center

    Agard, Richard

    Developed for handicapped artists, Degraded Imagery is a technique whereby images can be extracted and refined from a photograph or a collage of photographs. The advantage of this process is that it requires a lower degree of fine motor skills to produce a quality image from a photograph than it does to create a quality image on a blank piece of…

  19. Using Airborne Remote Sensing to Increase Situational Awareness in Civil Protection and Humanitarian Relief - the Importance of User Involvement

    NASA Astrophysics Data System (ADS)

    Römer, H.; Kiefl, R.; Henkel, F.; Wenxi, C.; Nippold, R.; Kurz, F.; Kippnich, U.

    2016-06-01

    Enhancing situational awareness in real-time (RT) civil protection and emergency response scenarios requires the development of comprehensive monitoring concepts combining classical remote sensing disciplines with geospatial information science. In the VABENE++ project of the German Aerospace Center (DLR), monitoring tools are being developed in which innovative data acquisition approaches are combined with information extraction as well as the generation and dissemination of information products to a specific user. DLR's 3K and 4K camera systems, which allow for RT acquisition and pre-processing of high resolution aerial imagery, are applied in two application examples conducted with end users: a civil protection exercise with humanitarian relief organisations and a large open-air music festival in cooperation with a festival organising company. This study discusses how airborne remote sensing can significantly contribute to both situational assessment and awareness, focusing on the downstream processes required for extracting information from imagery and for visualising and disseminating imagery in combination with other geospatial information. Valuable user feedback and impetus for further developments have been obtained from both applications, referring to innovations in thematic image analysis (supporting festival site management) and product dissemination (editable web services). Thus, this study emphasises the important role of user involvement in application-related research, i.e. by aligning it more closely to users' requirements.

  20. Climate impacts of oil extraction increase significantly with oilfield age

    NASA Astrophysics Data System (ADS)

    Masnadi, Mohammad S.; Brandt, Adam R.

    2017-08-01

    Record-breaking temperatures have induced governments to implement targets for reducing future greenhouse gas (GHG) emissions. Use of oil products contributes ~35% of global GHG emissions, and the oil industry itself consumes 3-4% of global primary energy. Because oil resources are becoming increasingly heterogeneous, requiring different extraction and processing methods, GHG studies should evaluate oil sources using detailed project-specific data. Unfortunately, prior oil-sector GHG analysis has largely neglected the fact that the energy intensity of producing oil can change significantly over the life of a particular oil project. Here we use decades-long time-series data from twenty-five globally significant oil fields (>1 billion barrels ultimate recovery) to model GHG emissions from oil production as a function of time. We find that volumetric oil production declines with depletion, but this depletion is accompanied by significant growth (in some cases over tenfold) in per-MJ GHG emissions. Depletion requires increased energy expenditures in drilling, oil recovery, and oil processing. Using probabilistic simulation, we derive a relationship for estimating GHG increases over time, showing an expected doubling in average emissions over 25 years. These trends have implications for long-term emissions and climate modelling, as well as for climate policy.
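The reported "expected doubling in average emissions over 25 years" corresponds to a modest compound growth rate; a minimal sketch of the arithmetic (the exponential form is an illustration of what a 25-year doubling implies, not the paper's fitted relationship):

```python
import math

DOUBLING_YEARS = 25.0   # from the abstract: average per-MJ emissions double in ~25 years

# If emissions grow exponentially, e(t) = e0 * exp(r * t), then a doubling
# time of 25 years implies a growth rate r = ln(2) / 25.
r = math.log(2.0) / DOUBLING_YEARS
print(f"implied growth rate: {r * 100:.2f}% per year")   # ~2.77% per year

# Multiplier on per-MJ emissions after t years at that constant rate:
for years in (10, 25, 50):
    print(years, round(math.exp(r * years), 2))   # 10 -> 1.32, 25 -> 2.0, 50 -> 4.0
```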

  1. Development of an optical character recognition pipeline for handwritten form fields from an electronic health record.

    PubMed

    Rasmussen, Luke V; Peissig, Peggy L; McCarty, Catherine A; Starren, Justin

    2012-06-01

    Although the penetration of electronic health records is increasing rapidly, much of the historical medical record is only available in handwritten notes and forms, which require labor-intensive, human chart abstraction for some clinical research. The few previous studies on automated extraction of data from these handwritten notes have focused on monolithic, custom-developed recognition systems or third-party systems that require proprietary forms. We present an optical character recognition processing pipeline, which leverages the capabilities of existing third-party optical character recognition engines, and provides the flexibility offered by a modular custom-developed system. The system was configured and run on a selected set of form fields extracted from a corpus of handwritten ophthalmology forms. The processing pipeline allowed multiple configurations to be run, with the optimal configuration consisting of the Nuance and LEADTOOLS engines running in parallel with a positive predictive value of 94.6% and a sensitivity of 13.5%. While limitations exist, preliminary experience from this project yielded insights on the generalizability and applicability of integrating multiple, inexpensive general-purpose third-party optical character recognition engines in a modular pipeline.
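The quoted positive predictive value and sensitivity follow the standard confusion-matrix definitions; a small sketch with hypothetical counts (chosen only so the percentages reproduce the reported figures, not taken from the study):

```python
# Hypothetical field counts chosen to illustrate the reported figures
# (PPV 94.6%, sensitivity 13.5%); the study's actual counts are not given here.
tp = 35    # form fields the pipeline read correctly and confidently
fp = 2     # fields it reported but got wrong
fn = 224   # fields it failed to report at all

ppv = tp / (tp + fp)          # of the values returned, how many were right
sensitivity = tp / (tp + fn)  # of all values present, how many were returned

print(f"PPV = {ppv:.1%}, sensitivity = {sensitivity:.1%}")
```

The pattern (high PPV, low sensitivity) is the one described in the abstract: when the parallel engines agreed, they were usually right, but they confidently recognized only a small fraction of the handwritten fields.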

  2. Development of an optical character recognition pipeline for handwritten form fields from an electronic health record

    PubMed Central

    Peissig, Peggy L; McCarty, Catherine A; Starren, Justin

    2011-01-01

    Background Although the penetration of electronic health records is increasing rapidly, much of the historical medical record is only available in handwritten notes and forms, which require labor-intensive, human chart abstraction for some clinical research. The few previous studies on automated extraction of data from these handwritten notes have focused on monolithic, custom-developed recognition systems or third-party systems that require proprietary forms. Methods We present an optical character recognition processing pipeline, which leverages the capabilities of existing third-party optical character recognition engines, and provides the flexibility offered by a modular custom-developed system. The system was configured and run on a selected set of form fields extracted from a corpus of handwritten ophthalmology forms. Observations The processing pipeline allowed multiple configurations to be run, with the optimal configuration consisting of the Nuance and LEADTOOLS engines running in parallel with a positive predictive value of 94.6% and a sensitivity of 13.5%. Discussion While limitations exist, preliminary experience from this project yielded insights on the generalizability and applicability of integrating multiple, inexpensive general-purpose third-party optical character recognition engines in a modular pipeline. PMID:21890871

  3. Solid state cathode materials for secondary magnesium-ion batteries that are compatible with magnesium metal anodes in water-free electrolyte

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crowe, Adam J.; Bartlett, Bart M., E-mail: bartmb@umich.edu

    2016-10-15

    With high elemental abundance, large volumetric capacity, and dendrite-free metal deposition, magnesium metal anodes offer promise in beyond-lithium-ion batteries. However, the increased charge density associated with the divalent magnesium-ion (Mg²⁺), relative to lithium-ion (Li⁺), hinders the ion-insertion and extraction processes within many materials and structures known for lithium-ion cathodes. As a result, many recent investigations incorporate known amounts of water within the electrolyte to provide temporary solvation of the Mg²⁺, improving diffusion kinetics. Unfortunately, with the addition of water, compatibility with magnesium metal anodes disappears due to the formation of an ion-insulating passivating layer. In this short review, recent advances in solid state cathode materials for rechargeable magnesium-ion batteries are highlighted, with a focus on cathode materials that do not require water-contaminated electrolyte solutions for ion insertion and extraction processes. - Graphical abstract: In this short review, we present candidate materials for reversible Mg-battery cathodes that are compatible with magnesium metal in water-free electrolytes. The data suggest that soft, polarizable anions are required for reversible cycling.

  4. Extraction of stability and control derivatives from orbiter flight data

    NASA Technical Reports Server (NTRS)

    Iliff, Kenneth W.; Shafer, Mary F.

    1993-01-01

    The Space Shuttle Orbiter has provided unique and important information on aircraft flight dynamics. This information has provided the opportunity to assess the flight-derived stability and control derivatives for maneuvering flight in the hypersonic regime. In the case of the Space Shuttle Orbiter, these derivatives are required to determine if certain configuration placards (limitations on the flight envelope) can be modified. These placards were determined on the basis of preflight predictions and the associated uncertainties. As flight-determined derivatives are obtained, the placards are reassessed, and some of them are removed or modified. Extraction of the stability and control derivatives was justified by operational considerations and not by research considerations. Using flight results to update the predicted database of the orbiter is one of the most completely documented processes for a flight vehicle. This process followed from the requirement for analysis of flight data for control system updates and for expansion of the operational flight envelope. These results show significant changes in many important stability and control derivatives from the preflight database. This paper presents some of the stability and control derivative results obtained from Space Shuttle flights. Some of the limitations of this information are also examined.

  5. Application of FTA technology to extraction of sperm DNA from mixed body fluids containing semen.

    PubMed

    Fujita, Yoshihiko; Kubo, Shin-ichi

    2006-01-01

    FTA technology is a novel method designed to simplify the collection, shipment, archiving, and purification of nucleic acids from a wide variety of biological sources. In this study, we report a rapid and simple method of extracting DNA from sperm when body fluids mixed with semen were collected using FTA cards. After proteinase K digestion of the sperm and body fluid mixture, the washed pellet suspension (the sperm fraction) and the concentrated supernatant (the epithelial cell fraction) were applied to FTA cards containing DTT. The FTA cards were dried, then added directly to a polymerase chain reaction (PCR) mix and processed by PCR. The time required from separation of the mixed fluid to completion of both the sperm- and epithelial-origin DNA extractions was only about 2.5-3 h. Furthermore, the procedure was extremely simple. Our DNA extraction procedure using an FTA card is therefore well suited to routine work.

  6. Photon extraction and conversion for scalable ion-trap quantum computing

    NASA Astrophysics Data System (ADS)

    Clark, Susan; Benito, Francisco; McGuinness, Hayden; Stick, Daniel

    2014-03-01

    Trapped ions represent one of the most mature and promising systems for quantum information processing. They have high-fidelity one- and two-qubit gates, long coherence times, and their qubit states can be reliably prepared and detected. Taking advantage of these inherent qualities in a system with many ions requires a means of entangling spatially separated ion qubits. One architecture achieves this entanglement through the use of emitted photons to distribute quantum information - a favorable strategy if photon extraction can be made efficient and reliable. Here I present results for photon extraction from an ion in a cavity formed by integrated optics on a surface trap, as well as results in frequency converting extracted photons for long distance transmission or interfering with photons from other types of optically active qubits. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U. S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  7. Nitrogen recycling from fuel-extracted algal biomass: residuals as the sole nitrogen source for culturing Scenedesmus acutus.

    PubMed

    Gu, Huiya; Nagle, Nick; Pienkos, Philip T; Posewitz, Matthew C

    2015-05-01

    In this study, the reuse of nitrogen from fuel-extracted algal residues was investigated. The alga Scenedesmus acutus was found to be able to assimilate nitrogen contained in amino acids, yeast extracts, and proteinaceous alga residuals. Moreover, these alternative nitrogen resources could replace nitrate in culturing media. The ability of S. acutus to utilize the nitrogen remaining in processed algal biomass was unique among the promising biofuel strains tested. This alga was leveraged in a recycling approach where nitrogen is recovered from algal biomass residuals that remain after lipids are extracted and carbohydrates are fermented to ethanol. The protein-rich residuals not only provided an effective nitrogen resource, but also contributed to a carbon "heterotrophic boost" in subsequent culturing, improving overall biomass and lipid yields relative to the control medium with only nitrate. Prior treatment of the algal residues with Diaion HP20 resin was required to remove compounds inhibitory to algal growth. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Fast modal extraction in NASTRAN via the FEER computer program. [based on automatic matrix reduction method for lower modes of structures with many degrees of freedom

    NASA Technical Reports Server (NTRS)

    Newman, M. B.; Pipano, A.

    1973-01-01

    A new eigensolution routine, FEER (Fast Eigensolution Extraction Routine), used in conjunction with NASTRAN at Israel Aircraft Industries is described. The FEER program is based on an automatic matrix reduction scheme whereby the lower modes of structures with many degrees of freedom can be accurately extracted from a tridiagonal eigenvalue problem whose size is of the same order of magnitude as the number of required modes. The process is effected without arbitrary lumping of masses at selected node points or selection of nodes to be retained in the analysis set. The results of computational efficiency studies are presented, showing major arithmetic operation counts and actual computer run times of FEER as compared to other methods of eigenvalue extraction, including those available in the NASTRAN READ module. It is concluded that the tridiagonal reduction method used in FEER would serve as a valuable addition to NASTRAN for highly increased efficiency in obtaining structural vibration modes.
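FEER's tridiagonal (Lanczos-type) reduction is beyond a short sketch, but its basic goal, extracting the lowest mode of a large symmetric system without computing the full spectrum, can be illustrated in plain Python with a shifted power iteration (the 3×3 stiffness-like matrix below is a textbook example, not from the paper; FEER itself is far more efficient for large systems):

```python
# Illustrative only: find the lowest eigenvalue of a small symmetric
# "stiffness" matrix K via power iteration on the shifted matrix B = s*I - K,
# whose largest eigenvalue is s - lambda_min(K).
import math

K = [[2.0, -1.0, 0.0],
     [-1.0, 2.0, -1.0],
     [0.0, -1.0, 2.0]]   # eigenvalues: 2 - sqrt(2), 2, 2 + sqrt(2)
s = 4.0                  # shift chosen >= lambda_max(K)
B = [[s * (i == j) - K[i][j] for j in range(3)] for i in range(3)]

x = [1.0, 0.0, 0.0]      # starting vector
for _ in range(200):
    y = [sum(B[i][j] * x[j] for j in range(3)) for i in range(3)]
    norm = math.sqrt(sum(v * v for v in y))
    x = [v / norm for v in y]

# Rayleigh quotient gives lambda_max(B); undo the shift for lambda_min(K).
Bx = [sum(B[i][j] * x[j] for j in range(3)) for i in range(3)]
lam_min = s - sum(xi * bi for xi, bi in zip(x, Bx))
print(round(lam_min, 4))   # ~0.5858, i.e. 2 - sqrt(2): the lowest mode
```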

  9. 21 CFR 173.280 - Solvent extraction process for citric acid.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 3 2013-04-01 2013-04-01 false Solvent extraction process for citric acid. 173... Solvent extraction process for citric acid. A solvent extraction process for recovery of citric acid from conventional Aspergillus niger fermentation liquor may be safely used to produce food-grade citric acid in...

  10. 21 CFR 173.280 - Solvent extraction process for citric acid.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 3 2011-04-01 2011-04-01 false Solvent extraction process for citric acid. 173... Solvent extraction process for citric acid. A solvent extraction process for recovery of citric acid from conventional Aspergillus niger fermentation liquor may be safely used to produce food-grade citric acid in...

  11. 21 CFR 173.280 - Solvent extraction process for citric acid.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 3 2012-04-01 2012-04-01 false Solvent extraction process for citric acid. 173... Solvent extraction process for citric acid. A solvent extraction process for recovery of citric acid from conventional Aspergillus niger fermentation liquor may be safely used to produce food-grade citric acid in...

  12. 21 CFR 173.280 - Solvent extraction process for citric acid.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 3 2010-04-01 2009-04-01 true Solvent extraction process for citric acid. 173.280... extraction process for citric acid. A solvent extraction process for recovery of citric acid from conventional Aspergillus niger fermentation liquor may be safely used to produce food-grade citric acid in...

  13. Material and Energy Requirement for Rare Earth Production

    NASA Astrophysics Data System (ADS)

    Talens Peiró, Laura; Villalba Méndez, Gara

    2013-10-01

    The use of rare earth metals (REMs) for new applications in renewable and communication technologies has increased concern about future supply as well as the environmental burdens associated with the extraction, use, and disposal (losses) of these metals. Although there are several reports describing and quantifying the production and use of REMs, there is still a lack of quantitative data about the material and energy requirements for their extraction and refining. Such information remains difficult to acquire, as China is still supplying over 95% of the world REM supply. This article attempts to estimate the material and energy requirements for the production of REMs based on the theoretical chemical reactions and thermodynamics. The results show that the material and energy requirements vary greatly depending on the type of mineral ore, production facility, and beneficiation process selected. They also show that the greatest loss occurs during mining (25-50%) and beneficiation (10-30%) of RE minerals. We hope that the material and energy balances presented in this article will be of use in life cycle analysis, resource accounting, and other industrial ecology tools used to quantify the environmental consequences of meeting REM demand for new technology products.

  14. Microencapsulated antimicrobial compounds as a means to enhance electron beam irradiation treatment for inactivation of pathogens on fresh spinach leaves.

    PubMed

    Gomes, Carmen; Moreira, Rosana G; Castell-Perez, Elena

    2011-08-01

    Recent outbreaks associated with the consumption of raw or minimally processed vegetable products, which have resulted in several illnesses and a few deaths, call for urgent actions aimed at improving the safety of those products. Electron beam irradiation can extend shelf-life and assure the safety of fresh produce. However, undesirable effects on organoleptic quality at the doses required to achieve pathogen inactivation limit the use of irradiation. Ways to increase pathogen radiation sensitivity could reduce the dose required for a given level of microbial kill. The objective of this study was to evaluate the effectiveness of using natural antimicrobials when irradiating fresh produce. The minimum inhibitory concentration of 5 natural compounds and extracts (trans-cinnamaldehyde, eugenol, garlic extract, propolis extract, and lysozyme with ethylenediaminetetraacetic acid (disodium salt dihydrate)) was determined against Salmonella spp. and Listeria spp. In order to mask the odors and off-flavors inherent in several compounds, and to increase their solubility, complexes of these compounds and extracts with β-cyclodextrin were prepared by the freeze-drying method. All compounds showed a bacteriostatic effect at different levels for both bacteria. The effectiveness of the microencapsulated compounds was tested by spraying them on the surface of baby spinach inoculated with Salmonella spp. The dose (D₁₀ value) required to reduce the bacterial population by 1 log was 0.190 kGy without antimicrobial addition. The increase in radiation sensitivity (up to 40%) varied with the antimicrobial compound. These results confirm that combining sprayed microencapsulated antimicrobials with electron beam irradiation was effective in increasing the killing effect of irradiation. Foodborne illness outbreaks attributed to fresh produce consumption have increased and present new challenges to food safety. 
Current technologies (water washing or treating with 200 ppm chlorine) cannot eliminate internalized pathogens. Ionizing radiation is a viable alternative for eliminating pathogens; however, the dose required to inactivate these pathogens is often too high to be tolerated by fresh produce without undesirable quality changes. This study uses natural antimicrobial ingredients as radiosensitizers. These ingredients were encapsulated and applied to fresh produce that was subsequently irradiated. The process results in a high level of microorganism inactivation using lower doses than conventional irradiation treatments. © 2011 Institute of Food Technologists®
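The reported D₁₀ value translates directly into dose requirements, since the dose for an n-log reduction is n × D₁₀. A small worked sketch; note that interpreting the "up to 40% increase in radiation sensitivity" as the effective D₁₀ shrinking by a factor of 1.4 is one plausible reading, assumed here purely for illustration:

```python
D10 = 0.190        # kGy per 1-log reduction, from the abstract (no antimicrobial)
target_logs = 5    # an illustrative pathogen-reduction target, not from the study

dose_plain = target_logs * D10
print(f"dose without antimicrobial: {dose_plain:.2f} kGy")   # 0.95 kGy

# Assumed interpretation: a 40% sensitivity increase divides the effective D10 by 1.4.
D10_sensitized = D10 / 1.4
dose_sensitized = target_logs * D10_sensitized
print(f"dose with radiosensitizer:  {dose_sensitized:.2f} kGy")  # ~0.68 kGy
```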

  15. The Adverse Drug Reactions from Patient Reports in Social Media Project: Five Major Challenges to Overcome to Operationalize Analysis and Efficiently Support Pharmacovigilance Process

    PubMed Central

    Dahamna, Badisse; Guillemin-Lanne, Sylvie; Darmoni, Stefan J; Faviez, Carole; Huot, Charles; Katsahian, Sandrine; Leroux, Vincent; Pereira, Suzanne; Richard, Christophe; Schück, Stéphane; Souvignet, Julien; Lillo-Le Louët, Agnès; Texier, Nathalie

    2017-01-01

    Background Adverse drug reactions (ADRs) are an important cause of morbidity and mortality. Classical Pharmacovigilance process is limited by underreporting which justifies the current interest in new knowledge sources such as social media. The Adverse Drug Reactions from Patient Reports in Social Media (ADR-PRISM) project aims to extract ADRs reported by patients in these media. We identified 5 major challenges to overcome to operationalize the analysis of patient posts: (1) variable quality of information on social media, (2) guarantee of data privacy, (3) response to pharmacovigilance expert expectations, (4) identification of relevant information within Web pages, and (5) robust and evolutive architecture. Objective This article aims to describe the current state of advancement of the ADR-PRISM project by focusing on the solutions we have chosen to address these 5 major challenges. Methods In this article, we propose methods and describe the advancement of this project on several aspects: (1) a quality driven approach for selecting relevant social media for the extraction of knowledge on potential ADRs, (2) an assessment of ethical issues and French regulation for the analysis of data on social media, (3) an analysis of pharmacovigilance expert requirements when reviewing patient posts on the Internet, (4) an extraction method based on natural language processing, pattern based matching, and selection of relevant medical concepts in reference terminologies, and (5) specifications of a component-based architecture for the monitoring system. 
Results Considering the 5 major challenges, we (1) selected a set of 21 validated criteria for selecting social media to support the extraction of potential ADRs, (2) proposed solutions to guarantee data privacy of patients posting on the Internet, (3) took into account pharmacovigilance expert requirements with use case diagrams and scenarios, (4) built domain-specific knowledge resources embedding a lexicon, morphological rules, context rules, semantic rules, syntactic rules, and post-analysis processing, and (5) proposed a component-based architecture that allows storage of big data and accessibility to third-party applications through Web services. Conclusions We demonstrated the feasibility of implementing a component-based architecture that allows collection of patient posts on the Internet, near real-time processing of those posts including annotation, and storage in big data structures. In the next steps, we will evaluate the posts identified by the system in social media to clarify the interest and relevance of such an approach to improve conventional pharmacovigilance processes based on spontaneous reporting. PMID:28935617

  16. Experimental Study on Strength Evaluation Applied for Teeth Extraction: An In Vivo Study

    PubMed Central

    Cicciù, Marco; Bramanti, Ennio; Signorino, Fabrizio; Cicciù, Alessandra; Sortino, Francesco

    2013-01-01

    Purpose: The aim of this work was to analyse all the applied movements when extracting healthy upper and lower jaw premolars for orthodontic purposes. The authors wanted to demonstrate that the different bone densities of the mandible and maxilla are not a significant parameter when related to the extraction force applied. The buccal and palatal rocking movements, plus the twisting movements, were also measured in this in vivo study during premolar extraction for orthodontic purposes. Methods: The physical strains or forces transferred onto the teeth during extraction comprise the following three movements: gripping, twisting, and traction. A strain measurement gauge was attached onto an ordinary dentistry plier. The strain measurement gauge consisted of an extensimetric washer with three 45° grids. The system's operation was correlated with the variation of electrical resistance. Results: The variations of resistance (∆R) and all the different forces applied to the teeth (∆V) were recorded by a computerized system. Data were processed through Microsoft Excel. The results underlined the stress distribution on the extracted teeth during gripping, twisting, and flexion. Conclusions: The obtained data showed that the strength required to effect tooth extraction is not influenced by the quality of the bone but is instead influenced by the shape of the tooth's root. PMID:23539609

  17. Single-trial event-related potential extraction through one-unit ICA-with-reference

    NASA Astrophysics Data System (ADS)

    Lih Lee, Wee; Tan, Tele; Falkmer, Torbjörn; Leung, Yee Hong

    2016-12-01

    Objective. In recent years, ICA has been one of the more popular methods for extracting event-related potentials (ERPs) at the single-trial level. It is a blind source separation technique that allows the extraction of an ERP without making strong assumptions on the temporal and spatial characteristics of the ERP. However, the problem with traditional ICA is that the extraction is not direct and is time-consuming due to the need for source selection processing. In this paper, the application of a one-unit ICA-with-Reference (ICA-R), a constrained ICA method, is proposed. Approach. In cases where the time-region of the desired ERP is known a priori, this time information is utilized to generate a reference signal, which is then used to guide the one-unit ICA-R to extract the source signal of the desired ERP directly. Main results. Our results showed that, compared to traditional ICA, ICA-R is a more effective method for analysing ERPs because it avoids manual source selection and requires less computation, thus resulting in faster ERP extraction. Significance. In addition, since the method is automated, it reduces the risk of subjective bias in the ERP analysis. It is also a potential tool for extracting the ERP in online applications.

  18. Single-trial event-related potential extraction through one-unit ICA-with-reference.

    PubMed

    Lee, Wee Lih; Tan, Tele; Falkmer, Torbjörn; Leung, Yee Hong

    2016-12-01

    In recent years, ICA has been one of the more popular methods for extracting event-related potentials (ERPs) at the single-trial level. It is a blind source separation technique that allows the extraction of an ERP without making strong assumptions about its temporal and spatial characteristics. However, the problem with traditional ICA is that the extraction is not direct and is time-consuming because of the source selection processing it requires. In this paper, the application of a one-unit ICA-with-Reference (ICA-R), a constrained ICA method, is proposed. In cases where the time-region of the desired ERP is known a priori, this time information is used to generate a reference signal, which then guides the one-unit ICA-R to extract the source signal of the desired ERP directly. Our results showed that, compared to traditional ICA, ICA-R is a more effective method for analysing ERPs because it avoids manual source selection and requires less computation, resulting in faster ERP extraction. In addition, since the method is automated, it reduces the risk of subjective bias in the ERP analysis. It is also a potential tool for extracting ERPs in online applications.

  19. Tequila production.

    PubMed

    Cedeño, M

    1995-01-01

    Tequila is obtained from the distillation of the fermented juice of the agave plant, Agave tequilana, to which up to 49% (w/v) of an adjunct sugar, mainly from cane or corn, may be added. Agave plants require 8 to 12 years to mature, and during all this time cleaning, pest control, and loosening of the soil are required to produce an initial raw material with the appropriate chemical composition for tequila production. The production process comprises four steps: cooking to hydrolyze inulin into fructose, milling to extract the sugars, fermentation with a strain of Saccharomyces cerevisiae to convert the sugars into ethanol and organoleptic compounds, and, finally, a two-step distillation process. Maturation, if needed, is carried out in white oak barrels to obtain rested or aged tequila after 2 or 12 months, respectively.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frank, Edward; Pegallapati, Ambica K.; Davis, Ryan

    The Department of Energy (DOE) Bioenergy Technologies Office (BETO) Multi-year Program Plan (MYPP) describes the bioenergy objectives pursued by BETO, the strategies for achieving those objectives, the current state of technology (SOT), and a number of design cases that explore the cost and operational performance required to advance the SOT towards middle- and long-term goals (MYPP, 2016). Two options for converting algae to biofuel intermediates were considered in the MYPP, namely algal biofuel production via lipid extraction and algal biofuel production by thermal processing. The first option, lipid extraction, is represented by the Combined Algae Processing (CAP) pathway, in which algae are hydrolyzed in a weak-acid pretreatment step. The treated slurry is fermented for ethanol production from sugars. The fermentation stillage contains most of the lipids from the original biomass, which are recovered through wet solvent extraction. The process residuals after lipid extraction, which contain much of the original mass of amino acids and proteins, are directed to anaerobic digestion (AD) for biogas production and recycling of N and P nutrients. The second option, thermal processing, comprises direct hydrothermal liquefaction (HTL) of the wet biomass, separation of the aqueous, gas, and oil phases, and treatment of the aqueous phase with catalytic hydrothermal gasification (CHG) to produce biogas and to recover N and P nutrients. The present report describes a life cycle analysis of the energy use and greenhouse gas (GHG) emissions of the CAP and HTL options for the scenarios described above. Water use is also reported. Water use during algal biofuel production comes from evaporation during cultivation, from discharge to bleed streams to control pond salinity ("blowdown"), and from use during preprocessing and upgrading. For the scenarios considered to date, most water use was from evaporation and, secondarily, from bleed streams; other use was relatively small at the level of fidelity currently being modeled.

  1. [Research on optimal modeling strategy for licorice extraction process based on near-infrared spectroscopy technology].

    PubMed

    Wang, Hai-Xia; Suo, Tong-Chuan; Yu, He-Shui; Li, Zheng

    2016-10-01

    The manufacture of traditional Chinese medicine (TCM) products always involves processing complex raw materials and real-time monitoring of the manufacturing process. In this study, we investigated different modeling strategies for the extraction process of licorice. Near-infrared spectra associated with the extraction time were used to determine the state of the extraction process. Three modeling approaches, i.e., principal component analysis (PCA), partial least squares regression (PLSR) and parallel factor analysis-PLSR (PARAFAC-PLSR), were adopted for predicting the real-time status of the process. The overall results indicated that PCA, PLSR and PARAFAC-PLSR can effectively detect errors in the extraction procedure and predict the process trajectories, which is of importance for monitoring and controlling extraction processes. Copyright© by the Chinese Pharmaceutical Association.

  2. Extraction-less, rapid assay for the direct detection of 2,4,6-trichloroanisole (TCA) in cork samples.

    PubMed

    Apostolou, Theofylaktos; Pascual, Nuria; Marco, M-Pilar; Moschos, Anastassios; Petropoulos, Anastassios; Kaltsas, Grigoris; Kintzios, Spyridon

    2014-07-01

    2,4,6-trichloroanisole (TCA), the cork taint molecule, has been the target of several analytical approaches over the past few years. In spite of the development of highly efficient and sensitive tools for its detection, ranging from advanced chromatography to biosensor-based techniques, a practical breakthrough for routine cork screening has not yet been realized, partly because of the requirement for a lengthy extraction of TCA in organic solvents (mostly 12% ethanol) and the high sensitivity required. In the present report, we describe a modification of a previously reported biosensor system based on measuring the electric response of cultured fibroblast cells membrane-engineered with the pAb78 TCA-specific antibody. Samples were prepared by macerating cork tissue and mixing it directly with the cellular biorecognition elements, without any intervening extraction step. Using this novel approach, we were able to detect TCA in just five minutes at extremely low concentrations (down to 0.2 ppt). The novel biosensor offers a number of practical benefits, including a considerable reduction in the total assay time (by one day) and full portability, enabling its direct employment for on-site, high-throughput screening of cork in the field and in production facilities, without requiring any supporting infrastructure. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Appraisal on the wound healing activity of different extracts obtained from Aegle marmelos and Mucuna pruriens by in vivo experimental models.

    PubMed

    Toppo, F A; Pawar, R S

    2016-01-01

    The use of a simple and reproducible model is essential for an objective assessment of the effects of external factors on wound healing. Hence, the present study was conducted to evaluate the wound healing activities of sequential extracts of Aegle marmelos leaves (AM) and Mucuna pruriens seeds (MP) in in vivo experimental models. Wistar albino rats were subjected to excision, incision and dead-space wounds, created respectively by excising approximately 250 mm2 of skin, making a 3 cm incision, and implanting a sterilized polyvinyl chloride tube on the back of each rat near either side of the vertebral column. The experimental animals were randomized into eight groups (n = 6): control, standard and treatment groups. Hydrogels of the different extracts were applied topically once daily. The parameters observed were percentage of wound contraction, epithelization period, tensile strength, hydroxyproline content of the granulation tissue, and histological changes during wound healing. The statistical analysis revealed that in the excision, incision, and dead-space wound models all formulations had significant (P < 0.01) wound healing potential. However, the methanolic extract formulation was superior to all other treatments, as evidenced by rapid wound contraction, fewer days required for complete epithelization, increased tensile strength and a significant increase in hydroxyproline content. Compared with the reference-standard-treated group, the wound healing process in the experimental groups was slower. All extracts obtained from AM and MP facilitated the wound healing process in all experimental models.

  4. A combined physicochemical-biological method of NaCl extraction from the irrigation solution in the BTLSS

    NASA Astrophysics Data System (ADS)

    Trifonov, Sergey V.; Tikhomirov, Alexander A.; Ushakova, Sofya; Tikhomirova, Natalia

    2016-07-01

    The use of processed human wastes as a source of minerals for plants in closed biotechnical life support systems (BTLSS) leads to high salt levels in the irrigation solution, as urine contains high concentrations of NaCl. It is important to develop a process that would effectively decrease the NaCl concentration in the irrigation solution and return this salt to the crew's diet. The salt-tolerant plants (Salicornia europaea) used to reduce the NaCl concentration in the irrigation solution require higher salt concentrations than those of the solution, and this problem cannot be resolved by concentrating the solution. At the same time, NaCl extracted from mineralized wastes by physicochemical methods is not pure enough to be included in the crew's diet. This study describes an original physicochemical method of NaCl extraction from the solution, which is intended to be used in combination with the biological method of NaCl extraction by saltwort plants. The physicochemical method produces solutions with high NaCl concentrations, and the saltwort plants serve as a biological filter in the final phase, to produce table salt. The study reports the order in which the physicochemical and biological methods of NaCl extraction from the irrigation solution should be used to enable rapid and effective inclusion of NaCl into the cycling of the BTLSS with humans. This study was carried out in the IBP SB RAS and supported by the grant of the Russian Science Foundation (Project No. 14-14-00599).

  5. Tackling saponin diversity in marine animals by mass spectrometry: data acquisition and integration.

    PubMed

    Decroo, Corentin; Colson, Emmanuel; Demeyer, Marie; Lemaur, Vincent; Caulier, Guillaume; Eeckhaut, Igor; Cornil, Jérôme; Flammang, Patrick; Gerbaux, Pascal

    2017-05-01

    Saponin analysis by mass spectrometry methods is nowadays progressively supplementing other analytical methods such as nuclear magnetic resonance (NMR). Indeed, saponin extracts from plants or marine animals often constitute a complex mixture of (slightly) different saponin molecules that requires extensive purification and separation steps to meet the requirements of NMR spectroscopy measurements. Based on its intrinsic features, mass spectrometry represents an indispensable tool for accessing the structures of saponins within extracts by using LC-MS, MALDI-MS, and tandem mass spectrometry experiments. The combination of different MS methods nowadays allows for a detailed description of saponin structures without extensive purification. However, the structural characterization process is based on low-kinetic-energy CID, which cannot afford a total structure elucidation as far as stereochemistry is concerned. Moreover, the structural difference between saponins in the same extract is often so small that coelution upon LC-MS analysis is unavoidable, rendering isomeric distinction and characterization by CID challenging or impossible. In the present paper, we introduce ion mobility in combination with liquid chromatography to better tackle the structural complexity of saponin congeners. When analyzing saponin extracts with MS-based methods, handling the data remains problematic for the comprehensive reporting of the results, but also for their efficient comparison. We here introduce an original schematic representation using sector diagrams constructed from mass spectrometry data. We strongly believe that the proposed data integration could be useful for data interpretation since it allows for a direct and fast comparison, both in terms of composition and relative proportion, of the saponin contents of different extracts.
Graphical Abstract A combination of state-of-the-art mass spectrometry methods, including ion mobility spectrometry, is developed to afford a complete description of the saponin molecules in natural extracts.

  6. Automatic drawing for traffic marking with MMS LIDAR intensity

    NASA Astrophysics Data System (ADS)

    Takahashi, G.; Takeda, H.; Shimano, Y.

    2014-05-01

    Upgrading the database of CYBER JAPAN has been strategically promoted since the "Basic Act on Promotion of Utilization of Geographical Information" was enacted in May 2007. In particular, there is high demand for the road information that forms a framework of this database. Road inventory mapping must therefore be accurate and free of the variation introduced by individual human operators. Furthermore, the large number of traffic markings that are periodically maintained, and possibly changed, requires an efficient method for updating the spatial data. Currently, we map traffic markings by manual photogrammetric drawing. However, this method is not sufficiently productive, and data variation can arise from individual operators. Meanwhile, Mobile Mapping Systems (MMS) and high-density Laser Imaging Detection and Ranging (LIDAR) scanners are rapidly gaining popularity. The aim of this study is to build an efficient method for automatically drawing traffic markings from MMS LIDAR data. The key idea is to extract lines using a Hough transform focused on changes in local reflection intensity along scan lines; note that the method processes every type of traffic marking. In this paper, we discuss a highly accurate, operator-independent method that applies the following steps: (1) binarize the LIDAR points by intensity and extract the higher-intensity points; (2) generate a Triangulated Irregular Network (TIN) from the higher-intensity points; (3) delete arcs by length and generate outline polygons on the TIN; (4) generate buffers from the outline polygons; (5) extract points within the buffers from the original LIDAR points; (6) extract local-intensity-changing points along scan lines using the extracted points; (7) extract lines from the intensity-changing points through a Hough transform; and (8) connect the lines to generate the automated traffic-marking mapping data.
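
    The Hough-transform step of this pipeline (extracting lines from candidate points) can be sketched as follows. This is a minimal, self-contained illustration with made-up points, not the authors' implementation: each point votes for all lines x·cos(θ) + y·sin(θ) = ρ passing through it, and the (θ, ρ) cell with the most votes is the detected line.

```python
import numpy as np

def hough_best_line(points, n_theta=180, rho_res=0.5):
    """Vote in (theta, rho) space; return the strongest line x*cos(t) + y*sin(t) = rho."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    # Signed distance of every point for every candidate orientation: (N, n_theta).
    rhos = points[:, :1] * np.cos(thetas) + points[:, 1:] * np.sin(thetas)
    idx = np.round(rhos / rho_res).astype(int)       # quantized rho bins
    off = idx.min()
    acc = np.zeros((idx.max() - off + 1, n_theta), dtype=int)
    for j in range(n_theta):                         # accumulate one vote per point
        np.add.at(acc, (idx[:, j] - off, j), 1)
    r, c = np.unravel_index(np.argmax(acc), acc.shape)
    return thetas[c], (r + off) * rho_res, int(acc.max())

# Hypothetical high-intensity points lying on the diagonal y = x,
# standing in for intensity-changing points along scan lines.
pts = np.array([[i, i] for i in range(21)], dtype=float)
theta, rho, votes = hough_best_line(pts)             # theta near 3*pi/4, rho near 0
```

The real method would run this per marking candidate and then connect the detected segments (step 8); the bin resolutions `n_theta` and `rho_res` trade off angular precision against noise tolerance.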

  7. Integrated Forest Products Refinery (IFPR)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    van Heiningen, Adriaan R. P.

    2010-05-29

    Pre-extraction–kraft studies of hardwoods showed that when extracting about 10% of the wood, the final kraft pulp yield and physical properties could only be maintained at a level similar to that of regular kraft pulp when the final extract pH was close to neutral. This so-called “near-neutral” pre-extraction condition at a level of 10% wood dissolution was achieved by contacting the wood chips with green liquor (GL) at a charge of about 3% (as Na2O on wood) at 160 °C for almost 2 hours (or an H-factor of about 800 hrs). During subsequent kraft cooking of the pre-extracted hardwood chips, the effective alkali charge could be reduced by about 3% (as Na2O on wood) and the cooking time shortened relative to regular kraft cooking, while still producing the same bleachable-grade kappa number as the kraft control pulp. For softwood, no extraction conditions were discovered in the present investigation whereby both the final kraft pulp yield and physical properties could be maintained at a level similar to that of regular softwood kraft pulp. Therefore, for hardwoods the “near-neutral” green liquor pre-extraction conditions do meet the requirements of the IFPR concept, while for softwood no such conditions were found. Application of simulated industrial GL at an extraction H-factor of about 800 hrs and a 3% GL charge in a recirculating digester produced a hardwood extract containing about 4% (on wood) of total anhydro-sugars, 2% of acetic acid, and 1.3% of lignin. Xylan comprised 80% of the sugars, of which about 85% was oligomeric. Since only polymeric hemicelluloses and lignin may be adsorbed on pulp (produced at a yield of about 50% from the original wood), the maximum theoretical yield increase due to adsorption may be estimated as 10% on pulp (or 5% on wood).
    However, direct application of raw GL hardwood extract for hemicellulose adsorption onto hardwood kraft pulp led to a yield increase of only about 1% (on pulp). By using the wet-end retention aid guar gum during the adsorption process at a charge of 0.5% on pulp, the yield gain could be increased to about 5%. Unfortunately, most of this yield increase is lost during subsequent alkaline treatments in the pulp bleach plant. It was found that by performing the adsorption under alkaline conditions, this loss is mostly avoided. Thus a permanent adsorption yield of about 3 and 1.5% (on pulp) was obtained with addition of guar gum at a charge of 0.5 and 0.1%, respectively, during adsorption of GL hardwood extract on pre-extracted kraft pulp at optimal conditions of pH 11.5 and 90 °C for 60 minutes at 5% consistency. The beatability of the adsorbed kraft pulps was improved, and significant physical strength improvements were achieved. Further study is needed to determine whether the improvements in pulp yield and paper properties make this an economic IFPR concept. Application of the wood solids of a hot-water extract of Acer rubrum wood strands as a substitute for the polystyrene used in SMC production maintained the water adsorption properties of the final product. Further work on the physical properties of the hemicellulose-containing SMCs needs to be completed to determine the potential of wood extracts for the production of partially renewable SMCs. The discovery of the “near-neutral” green liquor extraction process for hardwood formed the basis for a commercial Integrated Biorefinery that will extract hemicelluloses from wood chips to make biofuels and other specialty chemicals, while maintaining pulp production as proposed in the IFPR concept researched here. This Integrated Biorefinery will be constructed by Red Shield Acquisition LLC (RSA) at the Old Town kraft pulp mill in Maine.
    RSA, in collaboration with the University of Maine, will develop and commercialize the hemicellulose extraction process, the conversion of the hemicellulose sugars into butanol by fermentation, and the separation of specialty chemicals such as acetic acid from the extract. When operating, the facility will produce 1.5 million gallons per year of butanol and create 16 new “green collar” jobs. Previously, a spare pulp digester was converted into a new extractor, and in 2009 it was demonstrated that a good hemicellulose extract could be produced while simultaneously producing market pulp. Since then, more than 250 hours of operational experience has been acquired by the mill generating a hemicellulose extract while simultaneously producing market pulp at a scale of 1000 tonnes (OD)/day of mixed northern hardwood chips.

  8. Effect of thermal processing on T cell reactivity of shellfish allergens - Discordance with IgE reactivity.

    PubMed

    Abramovitch, Jodie B; Lopata, Andreas L; O'Hehir, Robyn E; Rolland, Jennifer M

    2017-01-01

    Crustacean allergy is a major cause of food-induced anaphylaxis. We showed previously that heating increases IgE reactivity of crustacean allergens. Here we investigate the effects of thermal processing of crustacean extracts on cellular immune reactivity. Raw and cooked black tiger prawn, banana prawn, mud crab and blue swimmer crab extracts were prepared and IgE reactivity assessed by ELISA. Mass spectrometry revealed a mix of several allergens in the raw mud crab extract but predominant heat-stable tropomyosin in the cooked extract. PBMC from crustacean-allergic and non-atopic control subjects were cultured with the crab and prawn extracts and proliferation of lymphocyte subsets was analysed by CFSE labelling and flow cytometry. Effector responses were assessed by intracellular IL-4 and IFN-γ, and regulatory T (CD4+CD25+CD127loFoxp3+) cell proportions in cultures were also compared by flow cytometry. For each crustacean species, the cooked extract had greater IgE reactivity than the raw (mud crab p<0.05, other species p<0.01). In contrast, there was a trend for lower PBMC proliferative responses to cooked compared with raw extracts. In crustacean-stimulated PBMC cultures, dividing CD4+ and CD56+ lymphocytes showed higher IL-4+/IFN-γ+ ratios for crustacean-allergic subjects than for non-atopics (p<0.01), but there was no significant difference between raw and cooked extracts. The percentage IL-4+ of dividing CD4+ cells correlated with total and allergen-specific IgE levels (prawns p<0.01, crabs p<0.05). Regulatory T cell proportions were lower in cultures stimulated with cooked compared with raw extracts (mud crab p<0.001, banana prawn p<0.05). In conclusion, cooking did not substantially alter overall T cell proliferative or cytokine reactivity of crustacean extracts, but decreased induction of Tregs. In contrast, IgE reactivity of cooked extracts was increased markedly. 
These novel findings have important implications for improved diagnostics, managing crustacean allergy and development of future therapeutics. Assessment of individual allergen T cell reactivity is required.

  9. Storing files in a parallel computing system based on user-specified parser function

    DOEpatents

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Manzanares, Adam; Torres, Aaron

    2014-10-21

    Techniques are provided for storing files in a parallel computing system based on a user-specified parser function. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a parser from the distributed application for processing the plurality of files prior to storage; and storing one or more of the plurality of files in one or more storage nodes of the parallel computing system based on the processing by the parser. The plurality of files comprise one or more of a plurality of complete files and a plurality of sub-files. The parser can optionally store only those files that satisfy one or more semantic requirements of the parser. The parser can also extract metadata from one or more of the files and the extracted metadata can be stored with one or more of the plurality of files and used for searching for files.
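
    A toy sketch of the idea (with invented names; not the patented implementation): a user-supplied parser both filters files against a semantic requirement and extracts metadata that is kept alongside the stored files for later searching.

```python
# Hypothetical user-specified parser: keeps only files meeting a semantic
# requirement and extracts metadata stored alongside for later searching.

def result_parser(name, data):
    """Return metadata for files containing b'RESULT'; None rejects the file."""
    if b"RESULT" not in data:
        return None
    return {"name": name, "size": len(data)}

def store_files(files, parser):
    """Store the files the parser accepts; collect extracted metadata in an index."""
    storage, index = {}, []
    for name, data in files.items():
        meta = parser(name, data)
        if meta is None:
            continue                      # file fails the semantic requirement
        storage[name] = data              # stand-in for a storage-node write
        index.append(meta)                # metadata usable for file search
    return storage, index

files = {"a.log": b"RESULT ok", "b.log": b"noise"}
storage, index = store_files(files, result_parser)
```

In the patented system the parser is obtained from the distributed application itself and the accepted files (complete files or sub-files) are distributed across storage nodes; here a plain dictionary stands in for that storage layer.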

  10. Ultrasound-assisted oxidative process for sulfur removal from petroleum product feedstock.

    PubMed

    Mello, Paola de A; Duarte, Fábio A; Nunes, Matheus A G; Alencar, Mauricio S; Moreira, Elizabeth M; Korn, Mauro; Dressler, Valderi L; Flores, Erico M M

    2009-08-01

    A procedure using ultrasonic irradiation is proposed for sulfur removal from a petroleum product feedstock. The procedure combines a peroxyacid with ultrasound-assisted treatment in order to comply with the sulfur content required by current fuel regulations. The ultrasound-assisted oxidative desulfurization (UAOD) process was applied to a petroleum product feedstock using dibenzothiophene as a model sulfur compound. The influence of the ultrasonic irradiation time, the amount of oxidizing reagents, the solvent used for the extraction step and the type of organic acid was investigated. Ultrasonic irradiation gave higher sulfur-removal efficiency than experiments performed without it under the same reaction conditions. Using the optimized UAOD conditions, sulfur removal was about 95% after 9 min of ultrasonic irradiation (20 kHz, 750 W, operated at 40%), using hydrogen peroxide and acetic acid, followed by extraction with methanol.

  11. Object-oriented software design in semiautomatic building extraction

    NASA Astrophysics Data System (ADS)

    Guelch, Eberhard; Mueller, Hardo

    1997-08-01

    Developing a system for semiautomatic building acquisition is a complex process that requires constant integration and updating of software modules and user interfaces. To facilitate these processes we apply an object-oriented design not only to the data but also to the software involved. We use the Unified Modeling Language (UML) to describe the object-oriented modeling of the system at different levels of detail. We can distinguish between use cases from the user's point of view, which represent a sequence of actions yielding an observable result, and use cases for programmers, who can use the system as a class library to integrate the acquisition modules into their own software. The structure of the system is based on the model-view-controller (MVC) design pattern. An example from the integration of automated texture extraction for the visualization of results demonstrates the feasibility of this approach.
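
    The MVC structure mentioned above can be sketched in heavily simplified, hypothetical form (all class names invented): a model holds the acquired buildings and notifies attached views of changes, while a controller drives acquisition.

```python
# Hypothetical MVC sketch: model notifies observing views; controller mutates model.

class BuildingModel:
    def __init__(self):
        self.buildings, self._views = [], []

    def attach(self, view):
        self._views.append(view)

    def add_building(self, footprint):
        self.buildings.append(footprint)
        for view in self._views:          # notify observers of the change
            view.update(self)

class CountView:
    """A trivial view that just tracks how many buildings are acquired."""
    def __init__(self):
        self.count = 0

    def update(self, model):
        self.count = len(model.buildings)

class AcquisitionController:
    def __init__(self, model):
        self.model = model

    def acquire(self, footprint):         # e.g. result of semiautomatic extraction
        self.model.add_building(footprint)

model = BuildingModel()
view = CountView()
model.attach(view)
AcquisitionController(model).acquire([(0, 0), (10, 0), (10, 10)])
```

The point of the pattern, as in the paper, is that acquisition modules (controllers) and visualizations (views) can be integrated or swapped without touching the data model.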

  12. TriageTools: tools for partitioning and prioritizing analysis of high-throughput sequencing data.

    PubMed

    Fimereli, Danai; Detours, Vincent; Konopka, Tomasz

    2013-04-01

    High-throughput sequencing is becoming a popular research tool but carries with it considerable costs in terms of computation time, data storage and bandwidth. Meanwhile, some research applications focusing on individual genes or pathways do not necessitate processing of a full sequencing dataset. Thus, it is desirable to partition a large dataset into smaller, manageable, but relevant pieces. We present a toolkit for partitioning raw sequencing data that includes a method for extracting reads that are likely to map onto pre-defined regions of interest. We show the method can be used to extract information about genes of interest from DNA or RNA sequencing samples in a fraction of the time and disk space required to process and store a full dataset. We report speedup factors between 2.6 and 96, depending on settings and samples used. The software is available at http://www.sourceforge.net/projects/triagetools/.
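
    The read-triage idea can be illustrated with a minimal sketch: screen each read for k-mers shared with a pre-defined region of interest and keep only reads likely to map there. The sequences, k-mer length, and threshold below are invented for illustration and are not the TriageTools code.

```python
# Toy read triage: keep only reads sharing k-mers with a region of interest.

def kmers(seq, k):
    """Set of all length-k substrings of seq."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def triage_reads(reads, region, k=5, min_shared=2):
    """Keep reads sharing at least min_shared k-mers with the region."""
    ref = kmers(region, k)
    return [r for r in reads if len(kmers(r, k) & ref) >= min_shared]

region = "ACGTACGTTTGACCA"
reads = ["ACGTACGTTT",    # overlaps the start of the region
         "GGGGGGGGGG",    # unrelated read
         "TTGACCAAAA"]    # overlaps the end of the region
kept = triage_reads(reads, region)
```

A production tool would also hash the reverse complement, stream reads from FASTQ without loading them all into memory, and tune k and the threshold to balance sensitivity against the fraction of the dataset retained.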

  13. A mask quality control tool for the OSIRIS multi-object spectrograph

    NASA Astrophysics Data System (ADS)

    López-Ruiz, J. C.; Vaz Cedillo, Jacinto Javier; Ederoclite, Alessandro; Bongiovanni, Ángel; González Escalera, Víctor

    2012-09-01

    The OSIRIS multi-object spectrograph uses a set of user-customised masks, which are manufactured on demand. The manufacturing process consists of drilling the specified slits into the mask with the required accuracy. Ensuring that the slits are in the right place when observing is of vital importance. We present a tool for checking the quality of the mask manufacturing process, based on analyzing instrument images obtained with the manufactured masks in place. The tool extracts the slit information from these images, relates the specifications to the extracted slit information, and finally reports to the operator whether the manufactured mask fulfills the mask designer's expectations. The proposed tool has been built using scripting languages and standard libraries such as opencv, pyraf and scipy. The software architecture, advantages and limits of this tool in the lifecycle of a multi-object acquisition are presented.

  14. Real-Time Digital Signal Processing Based on FPGAs for Electronic Skin Implementation †

    PubMed Central

    Ibrahim, Ali; Gastaldo, Paolo; Chible, Hussein; Valle, Maurizio

    2017-01-01

    Enabling touch-sensing capability would help appliances understand interaction behaviors with their surroundings. Many recent studies focus on the development of electronic skin because of its necessity in various application domains, namely autonomous artificial intelligence (e.g., robots), biomedical instrumentation, and replacement prosthetic devices. An essential task of the electronic skin system is to process the tactile data locally and send structured information onwards, either to mimic human skin or to respond to the application demands. The electronic skin must be fabricated together with an embedded electronic system whose role is to acquire the tactile data, process them, and extract structured information. Processing tactile data requires efficient methods to extract meaningful information from raw sensor data. Machine learning represents an effective method for data analysis in many domains, and it has recently demonstrated its effectiveness in processing tactile sensor data. In this framework, this paper presents an FPGA-based implementation of digital signal processing for tactile data, providing an implementation of a tensorial kernel function for a machine learning approach. Implementation results are assessed by highlighting the FPGA resource utilization and power consumption. The results demonstrate the feasibility of the proposed implementation when real-time classification of input touch modalities is targeted. PMID:28287448

  15. Real-Time Digital Signal Processing Based on FPGAs for Electronic Skin Implementation.

    PubMed

    Ibrahim, Ali; Gastaldo, Paolo; Chible, Hussein; Valle, Maurizio

    2017-03-10

    Enabling touch-sensing capability would help appliances understand interaction behaviors with their surroundings. Many recent studies focus on the development of electronic skin because of its necessity in various application domains, namely autonomous artificial intelligence (e.g., robots), biomedical instrumentation, and replacement prosthetic devices. An essential task of the electronic skin system is to process the tactile data locally and send structured information onwards, either to mimic human skin or to respond to the application demands. The electronic skin must be fabricated together with an embedded electronic system whose role is to acquire the tactile data, process them, and extract structured information. Processing tactile data requires efficient methods to extract meaningful information from raw sensor data. Machine learning represents an effective method for data analysis in many domains, and it has recently demonstrated its effectiveness in processing tactile sensor data. In this framework, this paper presents an FPGA-based implementation of digital signal processing for tactile data, providing an implementation of a tensorial kernel function for a machine learning approach. Implementation results are assessed by highlighting the FPGA resource utilization and power consumption. The results demonstrate the feasibility of the proposed implementation when real-time classification of input touch modalities is targeted.

  16. Field guide for collecting and processing stream-water samples for the National Water-Quality Assessment Program

    USGS Publications Warehouse

    Shelton, Larry R.

    1994-01-01

The U.S. Geological Survey's National Water-Quality Assessment program includes extensive data-collection efforts to assess the quality of the Nation's streams. These studies require analyses of stream samples for major ions, nutrients, sediments, and organic contaminants. For the information to be comparable among studies in different parts of the Nation, consistent procedures specifically designed to produce uncontaminated samples for trace analysis in the laboratory are critical. This field guide describes the standard procedures for collecting and processing samples for major ions, nutrients, organic contaminants, sediment, and field analyses of conductivity, pH, alkalinity, and dissolved oxygen. Samples are collected and processed using modified and newly designed equipment made of Teflon to avoid contamination, including nonmetallic samplers (D-77 and DH-81) and a Teflon sample splitter. Field solid-phase extraction procedures developed to process samples for organic constituent analyses produce an extracted sample with stabilized compounds for more accurate results. Improvements to standard operational procedures include the use of processing chambers and capsule filtering systems. A modified collecting and processing procedure for organic carbon is designed to avoid contamination from equipment cleaned with methanol. Quality assurance is maintained by strict collecting and processing procedures, replicate sampling, equipment blank samples, and a rigid cleaning procedure using detergent, hydrochloric acid, and methanol.

  17. Automated Fluid Feature Extraction from Transient Simulations

    NASA Technical Reports Server (NTRS)

    Haimes, Robert

    1998-01-01

In the past, feature extraction and identification were interesting concepts, but not required to understand the underlying physics of a steady flow field. This is because the results of the more traditional tools like iso-surfaces, cuts and streamlines were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of much interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one 'snap-shot' of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments like pV3). And methods must be developed to abstract the feature and display it in a manner that physically makes sense. The following is a list of the important physical phenomena found in transient (and steady-state) fluid flow: Shocks; Vortex cores; Regions of Recirculation; Boundary Layers; Wakes.
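Vortex-core detection, one of the feature-extraction tasks listed above, is commonly approached with the Q-criterion; the sketch below (a generic illustration, not the pV3 implementation) computes it for a 2-D velocity field on a regular grid.

```python
import numpy as np

def q_criterion(u, v, dx=1.0, dy=1.0):
    """2-D Q-criterion: Q = 0.5 * (|Omega|^2 - |S|^2), where S and Omega are
    the symmetric and antisymmetric parts of the velocity gradient.
    Q > 0 marks rotation-dominated regions (candidate vortex cores)."""
    du_dy, du_dx = np.gradient(u, dy, dx)   # axis 0 is y, axis 1 is x
    dv_dy, dv_dx = np.gradient(v, dy, dx)
    s11, s22 = du_dx, dv_dy
    s12 = 0.5 * (du_dy + dv_dx)             # strain-rate off-diagonal
    w12 = 0.5 * (du_dy - dv_dx)             # rotation-rate off-diagonal
    S2 = s11**2 + s22**2 + 2.0 * s12**2     # squared Frobenius norms
    W2 = 2.0 * w12**2
    return 0.5 * (W2 - S2)
```

For solid-body rotation (u = -y, v = x) the criterion is positive everywhere, as expected for a pure vortex; thresholding Q on a transient dataset would flag candidate core regions frame by frame.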

  18. Recognition of pigment network pattern in dermoscopy images based on fuzzy classification of pixels.

    PubMed

    Garcia-Arroyo, Jose Luis; Garcia-Zapirain, Begonya

    2018-01-01

One of the most relevant dermoscopic patterns is the pigment network. An innovative method of pattern recognition is presented for its detection in dermoscopy images. It consists of two steps. In the first one, by means of a supervised machine learning process and after performing the extraction of different colour and texture features, a fuzzy classification of pixels into the three categories present in the pattern's definition ("net", "hole" and "other") is carried out. This enables the three corresponding fuzzy sets to be created and, as a result, the three probability images that map them out are generated. In the second step, the pigment network pattern is characterised from a parameterisation process -derived from the system specification- and the subsequent extraction of different features calculated from the combinations of image masks extracted from the probability images, corresponding to the alpha-cuts obtained from the fuzzy sets. The method was tested on a database of 875 images -by far the largest used in the state of the art to detect pigment network- extracted from a public Atlas of Dermoscopy, obtaining an AUC of 0.912 and 88% accuracy, with 90.71% sensitivity and 83.44% specificity. The main contribution of this method is the very design of the algorithm, highly innovative, which could also be used to deal with other pattern recognition problems of a similar nature. Other contributions are: 1. The good performance in discriminating between the pattern and the disturbing artefacts -which means that no prior preprocessing is required in this method- and between the pattern and other dermoscopic patterns; 2. 
It puts forward a new methodological approach for work of this kind, introducing the system specification as a required step prior to algorithm design and development, this specification being the basis for a required parameterisation -in the form of configurable parameters (with their value ranges) and set threshold values- of the algorithm and the subsequent conducting of the experiments. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
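As a rough illustration of the second step, alpha-cut masks can be obtained by thresholding the fuzzy membership (probability) images; the alpha values and the area-fraction feature below are hypothetical choices, not the paper's parameterisation.

```python
import numpy as np

def alpha_cut_masks(prob_img, alphas=(0.5, 0.7, 0.9)):
    """Binary image masks: the alpha-cuts of one fuzzy membership image
    (e.g. the per-pixel probability of belonging to the "net" class)."""
    return {a: prob_img >= a for a in alphas}

def area_fractions(prob_img, alphas=(0.5, 0.7, 0.9)):
    """One simple feature per alpha-cut: fraction of pixels in the mask."""
    masks = alpha_cut_masks(prob_img, alphas)
    return {a: float(masks[a].mean()) for a in alphas}
```

Features of this kind, computed from combinations of the "net", "hole" and "other" probability images, would then feed the pattern characterisation stage.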

  19. An X-Band Radar Terrain Feature Detection Method for Low-Altitude SVS Operations and Calibration Using LiDAR

    NASA Technical Reports Server (NTRS)

    Young, Steve; UijtdeHaag, Maarten; Campbell, Jacob

    2004-01-01

    To enable safe use of Synthetic Vision Systems at low altitudes, real-time range-to-terrain measurements may be required to ensure the integrity of terrain models stored in the system. This paper reviews and extends previous work describing the application of x-band radar to terrain model integrity monitoring. A method of terrain feature extraction and a transformation of the features to a common reference domain are proposed. Expected error distributions for the extracted features are required to establish appropriate thresholds whereby a consistency-checking function can trigger an alert. A calibration-based approach is presented that can be used to obtain these distributions. To verify the approach, NASA's DC-8 airborne science platform was used to collect data from two mapping sensors. An Airborne Laser Terrain Mapping (ALTM) sensor was installed in the cargo bay of the DC-8. After processing, the ALTM produced a reference terrain model with a vertical accuracy of less than one meter. Also installed was a commercial-off-the-shelf x-band radar in the nose radome of the DC-8. Although primarily designed to measure precipitation, the radar also provides estimates of terrain reflectivity at low altitudes. Using the ALTM data as the reference, errors in features extracted from the radar are estimated. A method to estimate errors in features extracted from the terrain model is also presented.
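The calibration-and-alert idea can be sketched as follows; the function names and the 3-sigma threshold are chosen here for illustration only. Residual statistics are estimated against an ALTM-style reference during calibration, then used to flag radar measurements that are inconsistent with the stored terrain model.

```python
import numpy as np

def residual_stats(radar_elev, ref_elev):
    """Calibration: mean and spread of radar-minus-reference elevation errors."""
    res = np.asarray(radar_elev, float) - np.asarray(ref_elev, float)
    return res.mean(), res.std()

def consistency_alert(radar_elev, ref_elev, mu, sigma, k=3.0):
    """Trigger an integrity alert when the mean residual leaves the
    calibrated k-sigma band (threshold choice is illustrative)."""
    res = np.asarray(radar_elev, float) - np.asarray(ref_elev, float)
    return abs(res.mean() - mu) > k * sigma
```

In practice the expected error distributions would be estimated per terrain-feature type from the calibration flights, so that the consistency-checking function alarms only on genuine model/measurement disagreement rather than on nominal sensor noise.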

  20. Emergency Medical Considerations in a Space-Suited Patient.

    PubMed

    Garbino, Alejandro; Nusbaum, Derek M; Buckland, Daniel M; Menon, Anil S; Clark, Jonathan B; Antonsen, Erik L

The StratEx project is a high altitude balloon flight that culminated in a freefall from 41,422 m (135,890 ft), breaking the record for the highest freefall to date. Crew recovery operations required an innovative approach due to the unique nature of the event as well as the equipment involved. The parachutist donned a custom space suit similar to a NASA Extravehicular Mobility Unit (EMU), with life support system mounted to the front and a parachute on the back. This space suit had a metal structure around the torso, which, in conjunction with the parachute and life support assembly, created a significant barrier to extraction from the suit in the event of a medical emergency. For this reason the Medical Support Team coordinated with the pressure suit assembly engineer team for integration, training in suit removal, definition of a priori contingency leadership on site, creation of color-coded extraction scenarios, and extraction drills with a suit mock-up that provided insight into limitations to immediate access. This paper discusses novel extraction processes and contrasts the required medical preparation for this type of equipment with the needs of the prior record-holding jump that used a different space suit with easier immediate access. Garbino A, Nusbaum DM, Buckland DM, Menon AS, Clark JB, Antonsen EL. Emergency medical considerations in a space-suited patient. Aerosp Med Hum Perform. 2016; 87(11):958-962.

  1. Evolving Maturation of the Series-Bosch System

    NASA Technical Reports Server (NTRS)

    Stanley, Christine; Abney, Morgan B.; Barnett, Bill

    2017-01-01

Human exploration missions to Mars and other destinations beyond low Earth orbit require highly robust, reliable, and maintainable life support systems that maximize recycling of water and oxygen. In order to meet this requirement, NASA has continued the development of a Series-Bosch System, a two-stage reactor process that reduces carbon dioxide (CO2) with hydrogen (H2) to produce water and solid carbon. Theoretically, the Bosch process can recover 100% of the oxygen (O2) from CO2 in the form of water, making it an attractive option for long duration missions. The Series-Bosch system includes a reverse water gas shift (RWGS) reactor, a carbon formation reactor (CFR), an H2 extraction membrane, and a CO2 extraction membrane. In 2016, the results of integrated testing of the Series-Bosch system showed great promise and resulted in design modifications to the CFR to further improve performance. This year, integrated testing was conducted with the modified reactor to evaluate its performance and compare it with the performance of the previous configuration. Additionally, a CFR with the capability to load new catalyst and remove spent catalyst in-situ was built. Flow demonstrations were performed to evaluate both the catalyst loading and removal process and the hardware performance. The results of the integrated testing with the modified CFR as well as the flow demonstrations are discussed in this paper.

  2. Examining the nootropic effects of a special extract of Bacopa monniera on human cognitive functioning: 90 day double-blind placebo-controlled randomized trial.

    PubMed

    Stough, Con; Downey, Luke A; Lloyd, Jenny; Silber, Beata; Redman, Stephanie; Hutchison, Chris; Wesnes, Keith; Nathan, Pradeep J

    2008-12-01

    While Ayurvedic medicine has touted the cognitive enhancing effects of Bacopa monniera for centuries, there is a need for double-blind placebo-controlled investigations. One hundred and seven healthy participants were recruited for this double-blind placebo-controlled independent group design investigation. Sixty-two participants completed the study with 80% treatment compliance. Neuropsychological testing using the Cognitive Drug Research cognitive assessment system was conducted at baseline and after 90 days of treatment with a special extract of Bacopa monniera (2 x 150 mg KeenMind) or placebo. The Bacopa monniera product significantly improved performance on the 'Working Memory' factor, more specifically spatial working memory accuracy. The number of false-positives recorded in the Rapid visual information processing task was also reduced for the Bacopa monniera group following the treatment period. The current study provides support for the two other published studies reporting cognitive enhancing effects in healthy humans after a 90 day administration of the Bacopa monniera extract. Further studies are required to ascertain the effective dosage range, the time required to attain therapeutic levels and the effects over a longer term of administration. (c) 2008 John Wiley & Sons, Ltd.

  3. [Study on extraction process of Radix Bupleuri].

    PubMed

    Zhao, Lei; Liu, Benliang; Wu, Fuxiang; Tao, Lanping; Liu, Jian

    2004-10-01

    The orthogonal design was used to optimize extraction process of Radix Bupleuri with content of total saponin and yield of the extract as markers. Factors that have been chosen were ethanol concentration, ethanol consumption, extraction times and extraction time. Each factor had three levels. The result showed that the optimum extraction condition was 80% ethanol, 4 times the amount of material, refluxing for 4 times, 60 minutes each time. The optimized process was stable and workable.
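An orthogonal-array analysis of this kind can be sketched in Python. The L9(3^4) array below is the standard design for four factors at three levels, while the response values in the usage example are hypothetical, not the study's data.

```python
import numpy as np

# Standard L9(3^4) orthogonal array: 9 runs, 4 factors, levels coded 0..2
L9 = np.array([
    [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
    [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
    [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
])

def best_levels(y):
    """For each factor, average the response over runs at each level and
    pick the level with the highest mean ('larger is better' analysis)."""
    y = np.asarray(y, float)
    best = []
    for f in range(L9.shape[1]):
        means = [y[L9[:, f] == lev].mean() for lev in range(3)]
        best.append(int(np.argmax(means)))
    return best
```

`best_levels` returns the winning level index per factor; mapping those indices back to the physical settings (ethanol concentration, solvent amount, number of extractions, time) gives the optimum condition reported by such a study.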

  4. Rare Earth Extraction from NdFeB Magnet Using a Closed-Loop Acid Process.

    PubMed

    Kitagawa, Jiro; Uemura, Ryohei

    2017-08-14

    There is considerable interest in extraction of rare earth elements from NdFeB magnets to enable recycling of these elements. In practical extraction methods using wet processes, the acid waste solution discharge is a problem that must be resolved to reduce the environmental impact of the process. Here, we present an encouraging demonstration of rare earth element extraction from a NdFeB magnet using a closed-loop hydrochloric acid (HCl)-based process. The extraction method is based on corrosion of the magnet in a pretreatment stage and a subsequent ionic liquid technique for Fe extraction from the HCl solution. The rare earth elements are then precipitated using oxalic acid. Triple extraction has been conducted and the recovery ratio of the rare earth elements from the solution is approximately 50% for each extraction process, as compared to almost 100% recovery when using a one-shot extraction process without the ionic liquid but with sufficient oxalic acid. Despite its reduced extraction efficiency, the proposed method with its small number of procedures at almost room temperature is still highly advantageous in terms of both cost and environmental friendliness. This study represents an initial step towards realization of a closed-loop acid process for recycling of rare earth elements.
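Since roughly 50% of the rare earths are recovered on each pass, the benefit of the triple extraction can be checked with a one-line calculation, assuming each pass acts independently on the solution remaining from the previous one:

```python
per_pass = 0.50                             # recovery ratio reported per extraction
passes = 3                                  # triple extraction
cumulative = 1 - (1 - per_pass) ** passes   # fraction recovered overall -> 0.875
```

Under that independence assumption, three sequential 50% extractions recover about 87.5% of the rare earth content, which helps explain why the closed-loop process remains attractive despite its lower per-pass efficiency.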

  5. Optical fiber repeatered transmission systems utilizing SAW filters

    NASA Astrophysics Data System (ADS)

    Rosenberg, R. L.; Ross, D. G.; Trischitta, P. R.; Fishman, D. A.; Armitage, C. B.

    1983-05-01

Baseband digital transmission-line systems capable of signaling rates of several hundred to several thousand Mbit/s are presently being developed around the world. The pulse regeneration process is gated by a timing wave which is synchronous with the symbol rate of the arriving pulse stream. Synchronization is achieved by extracting a timing wave from the arriving pulse stream itself. To date, surface acoustic-wave (SAW) filters have been widely adopted for timing recovery in the in-line regenerators of high-bit-rate systems. The objective of the present investigation is to acquaint the SAW community in general, and SAW filter suppliers in particular, with the requirements for timing recovery filters in repeatered digital transmission systems. Attention is given to the system structure, the timing loop function, the system requirements affecting the timing-recovery filter, the decision process, timing jitter accumulation, the filter 'ringing' requirement, and aspects of reliability.
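The timing-extraction principle can be sketched numerically (a generic illustration, not a SAW-filter model): band-limiting an NRZ stream and passing it through a square-law nonlinearity regenerates a spectral line at the symbol rate, which a narrowband filter such as a SAW device can then extract. The pulse shaping and signal lengths below are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
sps = 8                                   # samples per symbol
n_sym = 4096
bits = rng.integers(0, 2, n_sym) * 2 - 1  # random +/-1 NRZ symbols
sig = np.repeat(bits, sps).astype(float)

# Band-limit so that symbol transitions become smooth edges
taps = np.hamming(sps)
sig = np.convolve(sig, taps / taps.sum(), mode="same")

# Square-law nonlinearity regenerates a spectral line at the symbol rate
line = sig ** 2
spec = np.abs(np.fft.rfft(line - line.mean()))
freqs = np.fft.rfftfreq(line.size, d=1.0)  # cycles/sample
f_peak = freqs[np.argmax(spec)]            # expected near 1/sps
```

The dominant non-DC spectral component sits at 1/sps cycles per sample, i.e. the symbol rate; in a regenerator, a high-Q SAW filter centered there converts this line into the clean timing wave that gates the decision circuit.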

  6. Neural networks to classify speaker independent isolated words recorded in radio car environments

    NASA Astrophysics Data System (ADS)

    Alippi, C.; Simeoni, M.; Torri, V.

    1993-02-01

Many applications, in particular the ones requiring nonlinear signal processing, have proved Artificial Neural Networks (ANN's) to be invaluable tools for model free estimation. The classifying abilities of ANN's are addressed by testing their performance in a speaker independent word recognition application. A real world case requiring implementation of compact integrated devices is taken into account: the classification of isolated words in a radio car environment. A multispeaker database of isolated words was recorded in different environments. Data were first processed to determine the boundaries of each word and then to extract speech features, the latter accomplished by using cepstral coefficient representation, log area ratios and filter-bank techniques. Multilayered perceptron and adaptive vector quantization neural paradigms were tested to find a reasonable compromise between performance and network simplicity, a fundamental requirement for the implementation of compact real time running neural devices.
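The cepstral-coefficient feature extraction mentioned above can be sketched with the real cepstrum; the frame length, window, and coefficient count here are illustrative assumptions, not the study's settings.

```python
import numpy as np

def real_cepstrum(frame, n_coeff=12):
    """Low-order real cepstral coefficients of one speech frame:
    inverse FFT of the log magnitude spectrum of the windowed frame."""
    spectrum = np.fft.rfft(frame * np.hamming(len(frame)))
    log_mag = np.log(np.abs(spectrum) + 1e-12)  # epsilon guards log(0)
    cep = np.fft.irfft(log_mag)
    return cep[:n_coeff]
```

A fixed-length vector of such coefficients per frame is the kind of compact input that a multilayer perceptron or vector-quantization classifier can consume directly.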

  7. Activation of bean (Phaseolus vulgaris) [alpha]-amylase inhibitor requires proteolytic processing of the proprotein

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pueyo, J.J.; Hunt, D.C.; Chrispeels, M.J.

Seeds of the common bean (Phaseolus vulgaris) contain a plant defense protein that inhibits the [alpha]-amylases of mammals and insects. This [alpha]-amylase inhibitor ([alpha]Al) is synthesized as a proprotein on the endoplasmic reticulum and is proteolytically processed after arrival in the protein storage vacuoles to polypeptides of relative molecular weight (M[sub r]) 15,000 to 18,000. The authors report two types of evidence that proteolytic processing is linked to activation of the inhibitory activity. First, by surveying seed extracts of wild accessions of P. vulgaris and other species in the genus Phaseolus, they found that antibodies to [alpha]Al recognize large (M[sub r] 30,000-35,000) polypeptides as well as typical [alpha]Al processing products (M[sub r] 15,000-18,000). [alpha]Al activity was found in all extracts that had the typical [alpha]Al processed polypeptides, but was absent from seed extracts that lacked such polypeptides. Second, they made a mutant [alpha]Al in which asparagine-77 is changed to aspartic acid-77. This mutation slows down the proteolytic processing of pro-[alpha]Al when the gene is expressed in tobacco. When pro-[alpha]Al was separated from mature [alpha]Al by gel filtration, pro-[alpha]Al was found not to have [alpha]-amylase inhibitory activity. The authors interpret these results to mean that formation of the active inhibitor is causally related to proteolytic processing of the proprotein. They suggest that the polypeptide cleavage removes a conformation constraint on the precursor to produce the biochemically active molecule. 43 refs., 5 figs., 1 tab.

  8. Determination of 15 polycyclic aromatic hydrocarbons in aquatic products by solid-phase extraction and GC-MS.

    PubMed

    Liu, Qiying; Guo, Yuanming; Sun, Xiumei; Hao, Qing; Cheng, Xin; Zhang, Lu

    2018-02-22

We propose a method for the simultaneous determination of 15 kinds of polycyclic aromatic hydrocarbons in marine samples (muscle) employing gas chromatography with mass spectrometry after saponification with ultrasound-assisted extraction and solid-phase extraction. The experimental conditions were optimized by the response surface method. In addition, the effects of different lyes and extractants on polycyclic aromatic hydrocarbon extraction were discussed: saturated sodium carbonate was first used for the primary saponification and extracted with 10 mL of ethyl acetate, and secondly 1 mol/L sodium hydroxide and 10 mL of n-hexane were used to achieve better results. The average recovery was 67-112%. Satisfactory data showed that the method has good reproducibility with a relative standard deviation of <13%. The detection limits of polycyclic aromatic hydrocarbons were 0.02-0.13 ng/g. Compared with other methods, this method has the advantages of simple pretreatment, low solvent consumption, maximum polycyclic aromatic hydrocarbon extraction, fast separation, and high extraction efficiency. It is concluded that this method meets the batch processing requirements of the sample and can also be used to determine polycyclic aromatic hydrocarbons in other high-fat (fish, shrimp, crab, shellfish) biological samples. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Technologies for Extracting Valuable Metals and Compounds from Geothermal Fluids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison, Stephen

    2014-04-30

Executive Summary: Simbol Materials studied various methods of extracting valuable minerals from geothermal brines in the Imperial Valley of California, focusing on the extraction of lithium, manganese, zinc and potassium. New methods were explored for managing the potential impact of silica fouling on mineral extraction equipment, and for converting silica management by-products into commercial products. Studies at the laboratory and bench scale focused on manganese, zinc and potassium extraction and the conversion of silica management by-products into valuable commercial products. The processes for extracting lithium and producing lithium carbonate and lithium hydroxide products were developed at the laboratory scale and scaled up to pilot-scale. Several sorbents designed to extract lithium as lithium chloride from geothermal brine were developed at the laboratory scale and subsequently scaled-up for testing in the lithium extraction pilot plant. Lithium: The results of the lithium studies generated the confidence for Simbol to scale its process to commercial operation. The key steps of the process were demonstrated during its development at pilot scale: 1. Silica management. 2. Lithium extraction. 3. Purification. 4. Concentration. 5. Conversion into lithium hydroxide and lithium carbonate products. Results show that greater than 95% of the lithium can be extracted from geothermal brine as lithium chloride, and that the chemical yield in converting lithium chloride to lithium hydroxide and lithium carbonate products is greater than 90%. The product purity produced from the process is consistent with battery grade lithium carbonate and lithium hydroxide. Manganese and zinc: Processes for the extraction of zinc and manganese from geothermal brine were developed. It was shown that they could be converted into zinc metal and electrolytic manganese dioxide after purification.
These processes were evaluated for their economic potential, and at the present time Simbol Materials is evaluating other products with greater commercial value. Potassium: Silicotitanates, zeolites and other sorbents were evaluated as potential reagents for the extraction of potassium from geothermal brines and production of potassium chloride (potash). It was found that zeolites were effective at removing potassium but the capacity of the zeolites and the form that the potassium is in does not have economic potential. Iron-silica by-product: The conversion of iron-silica by-product produced during silica management operations into more valuable materials was studied at the laboratory scale. Results indicate that it is technically feasible to convert the iron-silica by-product into ferric chloride and ferric sulfate solutions which are precursors to a ferric phosphate product. However, additional work to purify the solutions is required to determine the commercial viability of this process. Conclusion: Simbol Materials is in the process of designing its first commercial plant based on the technology developed to the pilot scale during this project. The investment in the commercial plant is hundreds of millions of dollars, and construction of the commercial plant will generate hundreds of jobs. Plant construction will be completed in 2016 and the first lithium products will be shipped in 2017. The plant will have a lithium carbonate equivalent production capacity of 15,000 tonnes per year. The gross revenues from the project are expected to be approximately $80 to $100 million annually. During this development program Simbol grew from a company of about 10 people to over 60 people today. Simbol is expected to employ more than 100 people once the plant is constructed.
Simbol Materials’ business is scalable in the Imperial Valley region because there are eleven geothermal power plants already in operation, which allows Simbol to expand its business from one plant to multiple plants. Additionally, the scope of the resource is vast in terms of potential products such as lithium, manganese and zinc and potentially potassium.
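The headline figures quoted above combine into a simple overall yield estimate, assuming the extraction and conversion stages are independent:

```python
extraction_yield = 0.95  # fraction of lithium recovered from brine as LiCl
conversion_yield = 0.90  # chemical yield of LiCl -> carbonate/hydroxide
overall_yield = extraction_yield * conversion_yield  # ~0.855 overall
```

So a bit under 86% of the brine's lithium would end up in final battery-grade product under these reported stage yields.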

  10. Human mismatch repair protein hMutLα is required to repair short slipped-DNAs of trinucleotide repeats.

    PubMed

    Panigrahi, Gagan B; Slean, Meghan M; Simard, Jodie P; Pearson, Christopher E

    2012-12-07

    Mismatch repair (MMR) is required for proper maintenance of the genome by protecting against mutations. The mismatch repair system has also been implicated as a driver of certain mutations, including disease-associated trinucleotide repeat instability. We recently revealed a requirement of hMutSβ in the repair of short slip-outs containing a single CTG repeat unit (1). The involvement of other MMR proteins in short trinucleotide repeat slip-out repair is unknown. Here we show that hMutLα is required for the highly efficient in vitro repair of single CTG repeat slip-outs, to the same degree as hMutSβ. HEK293T cell extracts, deficient in hMLH1, are unable to process single-repeat slip-outs, but are functional when complemented with hMutLα. The MMR-deficient hMLH1 mutant, T117M, which has a point mutation proximal to the ATP-binding domain, is defective in slip-out repair, further supporting a requirement for hMLH1 in the processing of short slip-outs and possibly the involvement of hMHL1 ATPase activity. Extracts of hPMS2-deficient HEC-1-A cells, which express hMLH1, hMLH3, and hPMS1, are only functional when complemented with hMutLα, indicating that neither hMutLβ nor hMutLγ is sufficient to repair short slip-outs. The resolution of clustered short slip-outs, which are poorly repaired, was partially dependent upon a functional hMutLα. The joint involvement of hMutSβ and hMutLα suggests that repeat instability may be the result of aberrant outcomes of repair attempts.

  11. Supercritical Fluid Fractionation of JP-8

    DTIC Science & Technology

    1991-12-26

applications, such as coffee decaffeination, spice extraction, and lipids purification. The processing principles have also long been well known and practiced... PRINCIPLES OF SUPERCRITICAL FLUID EXTRACTION 8 A. Background on Supercritical Fluid Solubility 8 B. Supercritical Fluid Extraction Process Operation 10 1. Batch Extraction of Solid Materials 10 2. Counter-Current Continuous SCF Processing of Liquid Products 15 3. Supercritical Fluid Extraction vs

  12. Mise en Scene: Conversion of Scenarios to CSP Traces for the Requirements-to-Design-to-Code Project

    NASA Technical Reports Server (NTRS)

    Carter. John D.; Gardner, William B.; Rash, James L.; Hinchey, Michael G.

    2007-01-01

    The "Requirements-to-Design-to-Code" (R2D2C) project at NASA's Goddard Space Flight Center is based on deriving a formal specification expressed in Communicating Sequential Processes (CSP) notation from system requirements supplied in the form of CSP traces. The traces, in turn, are to be extracted from scenarios, a user-friendly medium often used to describe the required behavior of computer systems under development. This work, called Mise en Scene, defines a new scenario medium (Scenario Notation Language, SNL) suitable for control-dominated systems, coupled with a two-stage process for automatic translation of scenarios to a new trace medium (Trace Notation Language, TNL) that encompasses CSP traces. Mise en Scene is offered as an initial solution to the problem of the scenarios-to-traces "D2" phase of R2D2C. A survey of the "scenario" concept and some case studies are also provided.
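A toy version of the scenario-to-trace translation might look like the following; SNL and TNL are not specified in the abstract, so the "actor: action" scenario format and the event-naming convention here are invented purely for illustration.

```python
def scenario_to_trace(scenario):
    """Toy translation of scenario steps, each written 'actor: action',
    into a CSP-style trace: a tuple of dotted event names.
    Hypothetical sketch, not Mise en Scene's SNL/TNL formats."""
    events = []
    for step in scenario:
        actor, _, action = step.partition(":")
        events.append(f"{actor.strip()}.{action.strip().replace(' ', '_')}")
    return tuple(events)
```

A trace produced this way is simply a finite sequence of events, which is exactly the form CSP's traces model expects; the real tool additionally handles control structure (choice, repetition) across its two translation stages.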

  13. Study on extraction process and activity of plant polysaccharides

    NASA Astrophysics Data System (ADS)

    Ma, Xiaogen; Wang, Xiaojing; Fan, Shuangli; Chen, Jiezhong

    2017-10-01

    Recent studies have shown that plant polysaccharides have many pharmacological activities, such as hypoglycemic, anti-inflammatory and tumor inhibition. The pharmacological activities of plant polysaccharides were summarized. The extraction methods of plant polysaccharides were discussed. Finally, the extraction process of Herba Taraxaci polysaccharides was optimized by ultrasonic assisted extraction. Through single factor experiments and orthogonal experiment to optimize the optimum extraction process from dandelion polysaccharide, optimum conditions of dandelion root polysaccharide by ultrasonic assisted extraction method for ultrasonic power 320W, temperature 80°C, extraction time 40min, can get higher dandelion polysaccharide extract.

  14. Ultrasonically enhanced extraction of bioactive principles from Quillaja Saponaria Molina.

    PubMed

    Gaete-Garretón, L; Vargas-Hernández, Yolanda; Cares-Pacheco, María G; Sainz, Javier; Alarcón, John

    2011-07-01

A study of ultrasonic enhancement in the extraction of bioactive principles from Quillaja Saponaria Molina (Quillay) is presented. The effects influencing the extraction process were studied through a two-level factorial design. The effects considered in the experimental design were: granulometry, extraction time, acoustic power, raw matter/solvent ratio (concentration) and acoustic impedance. It was found that for aqueous extraction the main factors affecting the ultrasonically-assisted process were: granulometry, raw matter/solvent ratio and extraction time. The extraction ratio was increased by the ultrasonic effect and a reduction in extraction time was verified without any influence on product quality. In addition, the process can be carried out at lower temperatures than the conventional method. As the process developed uses chips from the branches of trees, and not only the bark, this research contributes to making the saponin exploitation process a sustainable industry. Copyright © 2010 Elsevier B.V. All rights reserved.
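The two-level factorial analysis can be sketched as follows; the design matrix and response values in the usage example are hypothetical, not the study's data.

```python
import numpy as np

def main_effects(design, y):
    """Main effect of each factor in a 2-level design coded -1/+1:
    mean response at the high level minus mean response at the low level."""
    design, y = np.asarray(design, float), np.asarray(y, float)
    return np.array([y[design[:, f] > 0].mean() - y[design[:, f] < 0].mean()
                     for f in range(design.shape[1])])
```

Ranking the absolute main effects identifies which factors (here granulometry, solvent ratio, extraction time, etc.) dominate the extraction yield; interactions would need the usual product columns added to the design.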

  15. Automated ancillary cancer history classification for mesothelioma patients from free-text clinical reports

    PubMed Central

    Wilson, Richard A.; Chapman, Wendy W.; DeFries, Shawn J.; Becich, Michael J.; Chapman, Brian E.

    2010-01-01

    Background: Clinical records are often unstructured, free-text documents that create information extraction challenges and costs. Healthcare delivery and research organizations, such as the National Mesothelioma Virtual Bank, require the aggregation of both structured and unstructured data types. Natural language processing offers techniques for automatically extracting information from unstructured, free-text documents. Methods: Five hundred and eight history and physical reports from mesothelioma patients were split into development (208) and test sets (300). A reference standard was developed and each report was annotated by experts with regard to the patient’s personal history of ancillary cancer and family history of any cancer. The Hx application was developed to process reports, extract relevant features, perform reference resolution and classify them with regard to cancer history. Two methods, Dynamic-Window and ConText, for extracting information were evaluated. Hx’s classification responses using each of the two methods were measured against the reference standard. The average Cohen’s weighted kappa served as the human benchmark in evaluating the system. Results: Hx had a high overall accuracy, with each method, scoring 96.2%. F-measures using the Dynamic-Window and ConText methods were 91.8% and 91.6%, which were comparable to the human benchmark of 92.8%. For the personal history classification, Dynamic-Window scored highest with 89.2% and for the family history classification, ConText scored highest with 97.6%, in which both methods were comparable to the human benchmark of 88.3% and 97.2%, respectively. Conclusion: We evaluated an automated application’s performance in classifying a mesothelioma patient’s personal and family history of cancer from clinical reports. 
To do so, the Hx application must process reports, identify cancer concepts, distinguish the known mesothelioma from ancillary cancers, recognize negation, perform reference resolution and determine the experiencer. Results indicated that both information extraction methods tested were dependant on the domain-specific lexicon and negation extraction. We showed that the more general method, ConText, performed as well as our task-specific method. Although Dynamic- Window could be modified to retrieve other concepts, ConText is more robust and performs better on inconclusive concepts. Hx could greatly improve and expedite the process of extracting data from free-text, clinical records for a variety of research or healthcare delivery organizations. PMID:21031012
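A much-simplified sketch of ConText-style negation detection is shown below; the real ConText algorithm also handles termination terms, scope direction, and the experiencer, and the trigger list, window size, and single-word concepts here are illustrative assumptions.

```python
NEG_TRIGGERS = ["no evidence of", "denies", "negative for", "no history of"]

def is_negated(sentence, concept, window=6):
    """Report the concept as negated if a negation trigger appears within
    `window` words before it. Toy sketch of the ConText idea; assumes the
    concept is a single word."""
    words = sentence.lower().split()
    try:
        idx = words.index(concept.lower())
    except ValueError:
        return False          # concept not mentioned at all
    left_context = " ".join(words[max(0, idx - window):idx])
    return any(trigger in left_context for trigger in NEG_TRIGGERS)
```

Classifying a patient's cancer history then reduces to finding cancer concepts, suppressing the negated ones, and attributing the rest to the patient or a family member (the experiencer step).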

  16. An integrated process for the recovery of high added-value compounds from olive oil using solid support free liquid-liquid extraction and chromatography techniques.

    PubMed

    Angelis, Apostolis; Hamzaoui, Mahmoud; Aligiannis, Nektarios; Nikou, Theodora; Michailidis, Dimitris; Gerolimatos, Panagiotis; Termentzi, Aikaterini; Hubert, Jane; Halabalaki, Maria; Renault, Jean-Hugues; Skaltsounis, Alexios-Léandros

    2017-03-31

An integrated extraction and purification process for the direct recovery of high added-value compounds from extra virgin olive oil (EVOO) is proposed, using solid support free liquid-liquid extraction and chromatography techniques. Two different extraction methods were developed on a laboratory-scale Centrifugal Partition Extractor (CPE): a sequential strategy consisting of several "extraction-recovery" cycles and a continuous strategy based on stationary phase co-current elution. In both cases, EVOO was used as mobile phase diluted in food grade n-hexane (feed mobile phase) and the required biphasic system was obtained by adding ethanol and water as polar solvents. For the sequential process, 17.5L of the feed EVOO-containing organic phase (i.e. 7L of EVOO treated) were extracted, yielding 9.5g of total phenolic fraction and corresponding to a productivity of 5.8g/h/L of CPE column. Regarding the second approach, the co-current process, 2L of the feed oil phase (corresponding to 0.8L of EVOO) were treated at 100mL/min, yielding 1.03g of total phenolic fraction and corresponding to a productivity of 8.9g/h/L of CPE column. The total phenolic fraction was then fractionated by stepwise gradient elution Centrifugal Partition Chromatography (CPC). The biphasic solvent systems were composed of n-hexane, ethyl acetate, ethanol and water in different proportions (X/Y/2/3, v/v). In a single run of 4h on a column with a capacity of 1L, 910mg of oleocanthal, 882mg of oleacein, and 104mg of hydroxytyrosol were successfully recovered from 5g of phenolic extract with purities of 85%, 92% and 90%, respectively. CPC fractions were then submitted to orthogonal chromatographic steps (adsorption on silica gel or size exclusion chromatography), leading to the isolation of eleven additional compounds belonging to triterpenes, phenolic compounds and secoiridoids. Among them, elenolic acid ethyl ester was found to be a new compound. 
Thin Layer Chromatography (TLC), Nuclear Magnetic Resonance (NMR) and High Performance Liquid Chromatography - Diode Array Detector (HPLC-DAD) were used for monitoring and evaluation purposes throughout the entire procedure. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. A risk-based approach to management of leachables utilizing statistical analysis of extractables.

    PubMed

    Stults, Cheryl L M; Mikl, Jaromir; Whelehan, Oliver; Morrical, Bradley; Duffield, William; Nagao, Lee M

    2015-04-01

    To incorporate quality by design concepts into the management of leachables, an emphasis is often put on understanding the extractable profile for the materials of construction for manufacturing disposables, container-closure, or delivery systems. Component manufacturing processes may also impact the extractable profile. An approach was developed to (1) identify critical components that may be sources of leachables, (2) enable an understanding of manufacturing process factors that affect extractable profiles, (3) determine if quantitative models can be developed that predict the effect of those key factors, and (4) evaluate the practical impact of the key factors on the product. A risk evaluation for an inhalation product identified injection molding as a key process. Designed experiments were performed to evaluate the impact of molding process parameters on the extractable profile from an ABS inhaler component. Statistical analysis of the resulting GC chromatographic profiles identified processing factors that were correlated with peak levels in the extractable profiles. The combination of statistically significant molding process parameters was different for different types of extractable compounds. ANOVA models were used to obtain optimal process settings and predict extractable levels for a selected number of compounds. The proposed paradigm may be applied to evaluate the impact of material composition and processing parameters on extractable profiles and utilized to manage product leachables early in the development process and throughout the product lifecycle.

  18. Improved collagen extraction from jellyfish (Acromitus hardenbergi) with increased physical-induced solubilization processes.

    PubMed

    Khong, Nicholas M H; Yusoff, Fatimah Md; Jamilah, B; Basri, Mahiran; Maznah, I; Chan, Kim Wei; Armania, Nurdin; Nishikawa, Jun

    2018-06-15

The efficiency and effectiveness of the collagen extraction process have a huge impact on the quality, supply and cost of the collagen produced. Jellyfish is a potentially sustainable source of collagen whose applications are not limited by religious constraints or by threats of transmittable diseases. The present study compared the extraction yield, physico-chemical properties and in vitro toxicology of collagens obtained by conventional acid-assisted and pepsin-assisted extraction with those from an improved physically assisted extraction process. By increasing physical intervention, the production yield increased significantly compared to the conventional extraction processes (p < .05). Collagen extracted using the improved process was found to possess proximate and amino acid composition similar to that of collagen extracted using pepsin (p > .05), while retaining high molecular weight distributions and polypeptide profiles similar to those of collagen extracted using only acid. Moreover, it exhibited better appearance and instrumental colour and was found to be non-toxic in vitro and free of heavy metal contamination. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. [Studies on extraction process of Radix Platycodi].

    PubMed

    Wu, Biyuan; Sun, Jun; Jiang, Hongfang

    2002-06-01

The orthogonal design was used to optimize the extraction process of Radix Platycodi, with the content of total saponin and the yield of the extract as markers. The factors chosen were alcohol concentration, alcohol consumption, number of extractions and extraction time, each at three levels. The results showed that the optimum extraction conditions were 70% alcohol, 3 times the amount of material, refluxing 5 times for 60 minutes each time; the optimized process was stable and workable.
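For four three-level factors such as those above, orthogonal-design studies of this kind commonly use the standard Taguchi L9(3^4) array. The sketch below hardcodes that well-known array and checks its defining balance property; mapping its columns to the abstract's four factors is an assumption, not something the abstract specifies:

```python
from itertools import combinations

# The standard L9(3^4) orthogonal array: nine runs covering four
# three-level factors (levels coded 1, 2, 3).
L9 = [
    [1, 1, 1, 1], [1, 2, 2, 2], [1, 3, 3, 3],
    [2, 1, 2, 3], [2, 2, 3, 1], [2, 3, 1, 2],
    [3, 1, 3, 2], [3, 2, 1, 3], [3, 3, 2, 1],
]

def is_orthogonal(array):
    """Every pair of columns must contain each level combination
    exactly once, which is the defining property of the array."""
    for i, j in combinations(range(len(array[0])), 2):
        pairs = {(row[i], row[j]) for row in array}
        if len(pairs) != len(array):
            return False
    return True
```

Nine runs thus stand in for the 81 runs of a full 3^4 factorial, which is what makes this design attractive for extraction optimization.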

  20. Smart sensor for terminal homing

    NASA Astrophysics Data System (ADS)

    Panda, D.; Aggarwal, R.; Hummel, R.

    1980-01-01

The practical scene matching problem is considered to present certain complications which extend beyond classical image processing capabilities. Certain aspects of the scene matching problem which must be addressed by a smart sensor for terminal homing are discussed. First, a philosophy for treating the matching problem for the terminal homing scenario is outlined. Then certain aspects of the feature extraction process and symbolic pattern matching are considered. It is thought that in the future, general ideas from artificial intelligence will become more useful for the terminal homing requirements of fast scene recognition and pattern matching.

  1. Reducing full one-loop amplitudes to scalar integrals at the integrand level

    NASA Astrophysics Data System (ADS)

    Ossola, Giovanni; Papadopoulos, Costas G.; Pittau, Roberto

    2007-02-01

    We show how to extract the coefficients of the 4-, 3-, 2- and 1-point one-loop scalar integrals from the full one-loop amplitude of arbitrary scattering processes. In a similar fashion, also the rational terms can be derived. Basically no information on the analytical structure of the amplitude is required, making our method appealing for an efficient numerical implementation.
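The basis decomposition underlying this integrand-level reduction can be sketched, in the standard one-loop notation rather than the paper's exact conventions, as:

```latex
A \;=\; \sum_{i<j<k<l} d_{ijkl}\, I_4^{(ijkl)}
   \;+\; \sum_{i<j<k} c_{ijk}\, I_3^{(ijk)}
   \;+\; \sum_{i<j} b_{ij}\, I_2^{(ij)}
   \;+\; \sum_{i} a_{i}\, I_1^{(i)}
   \;+\; R ,
```

where the $I_n$ are the scalar $n$-point one-loop integrals, the coefficients $d$, $c$, $b$, $a$ depend only on external kinematics, and $R$ collects the rational terms. Sampling the integrand numerator at suitably chosen values of the loop momentum then determines the coefficients numerically, which is why no analytic knowledge of the amplitude is needed.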

  2. Toward the Decision Tree for Inferring Requirements Maturation Types

    NASA Astrophysics Data System (ADS)

    Nakatani, Takako; Kondo, Narihito; Shirogane, Junko; Kaiya, Haruhiko; Hori, Shozo; Katamine, Keiichi

Requirements are elicited step by step during the requirements engineering (RE) process. However, some types of requirements are elicited completely only after the scheduled requirements elicitation process is finished. Such a situation is regarded as problematic. In our study, the difficulty of eliciting various kinds of requirements is observed by component. We refer to these components as observation targets (OTs) and introduce the term “requirements maturation,” which means when and how requirements are elicited completely in a project. Requirements maturation is discussed for physical and logical OTs. OTs viewed from a logical viewpoint are called logical OTs, e.g., quality requirements. The requirements of physical OTs, e.g., modules, components, subsystems, etc., include functional and non-functional requirements. They are influenced by their requesters' environmental changes, as well as by developers' technical changes. In order to infer the requirements maturation period of each OT, we need to know how much these factors influence the OTs' requirements maturation. Based on observation of actual past projects, we defined the PRINCE (Pre Requirements Intelligence Net Consideration and Evaluation) model. It aims to guide developers in their observation of the requirements maturation of OTs. We quantitatively analyzed actual cases of the requirements elicitation process and extracted the essential factors that influence requirements maturation. The results of interviews with project managers were analyzed by WEKA, a data mining system, from which a decision tree was derived. This paper introduces the PRINCE model and the category of logical OTs to be observed. The decision tree that helps developers infer the maturation type of an OT is also described. We evaluate the tree through real projects and discuss its ability to infer requirements maturation types.

  3. A biorefinery scheme to fractionate bamboo into high-grade dissolving pulp and ethanol.

    PubMed

    Yuan, Zhaoyang; Wen, Yangbing; Kapu, Nuwan Sella; Beatson, Rodger; Mark Martinez, D

    2017-01-01

Bamboo is a highly abundant source of biomass which is underutilized despite having a chemical composition and fiber structure similar to those of wood. The main challenge for the industrial processing of bamboo is the high level of silica, which forms water-insoluble precipitates that negatively affect process systems. A cost-competitive and eco-friendly scheme for the production of high-purity dissolving-grade pulp from bamboo not only requires a process for silica removal, but also needs to fully utilize all of the materials dissolved in the process, including lignin, cellulosic and hemicellulosic sugars, and the silica itself. Many investigations have been carried out to resolve the silica issue, but none of them has led to a commercial process. In this work, alkaline pretreatment of bamboo was conducted to extract silica prior to the pulping process. The silica-free substrate was used to produce high-grade dissolving pulp. The dissolved silica, lignin, hemicellulosic sugars, and degraded cellulose in the spent liquors obtained from the alkaline pretreatment and pulping process were recovered to provide high-value bio-based chemicals and fuel. An integrated process which combines dissolving pulp production with the recovery of excellent sustainable biofuel and biochemical feedstocks is presented in this work. Pretreatment at 95 °C with 12% NaOH charge for 150 min extracted all the silica and about 30% of the hemicellulose from bamboo. After kraft pulping, xylanase treatment and cold caustic extraction, pulp with a hemicellulose content of about 3.5% was obtained. This pulp, after bleaching, provided a cellulose acetate grade dissolving pulp with an α-cellulose content higher than 97% and a hemicellulose content less than 2%. The amounts of silica and lignin that could be recovered from the process corresponded to 95 and 77.86% of the two components in the original chips, respectively. 
Enzymatic hydrolysis and fermentation of the concentrated and detoxified sugar mixture liquor showed that an ethanol recovery of 0.46 g/g sugar was achieved with 93.2% of hydrolyzed sugars being consumed. A mass balance of the overall process showed that 76.59 g of solids was recovered from 100 g (o.d.) of green bamboo. The present work proposes an integrated biorefinery process that contains alkaline pre-extraction, kraft pulping, enzyme treatment and cold caustic extraction for the production of high-grade dissolving pulp and recovery of silica, lignin, and hemicellulose from bamboo. This process could alleviate the silica-associated challenges and provide feedstocks for bio-based products, thereby allowing the improvement and expansion of bamboo utilization in industrial processes.

  4. Automated extraction of knowledge for model-based diagnostics

    NASA Technical Reports Server (NTRS)

    Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.

    1990-01-01

The concept of accessing computer aided design (CAD) design databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques, as well as an internal database of component descriptions, to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in formats required by various model-based reasoning tools.

  5. Latent Dirichlet Allocation (LDA) Model and kNN Algorithm to Classify Research Project Selection

    NASA Astrophysics Data System (ADS)

    Safi’ie, M. A.; Utami, E.; Fatta, H. A.

    2018-03-01

Universitas Sebelas Maret has a teaching staff of more than 1,500 people, one of whose tasks is to carry out research. On the other hand, funding support for research and community service is limited, so proposals must be evaluated to determine which research and community service (P2M) submissions are selected. At the selection stage, research proposal documents are collected as unstructured data, and the volume of stored data is very large. Text mining technology is required to extract the information contained in these documents; it is applied to gain knowledge from the documents by automating information extraction. In this article we apply Latent Dirichlet Allocation (LDA) to the documents as a model in the feature extraction process, to obtain terms that represent each document. We then use the k-Nearest Neighbour (kNN) algorithm to classify the documents based on those terms.
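The LDA-features-into-kNN pipeline described above can be sketched with scikit-learn; the toy proposal texts, funding labels, and topic count here are invented for illustration and do not reflect the paper's corpus or parameters:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.neighbors import KNeighborsClassifier

# Toy proposal abstracts and hypothetical selection labels.
docs = [
    "machine learning for crop yield prediction",
    "deep learning image classification research",
    "traditional dance preservation community program",
    "cultural heritage documentation community outreach",
]
labels = [1, 1, 0, 0]  # 1 = selected, 0 = not selected (invented)

# Bag-of-words counts, then LDA topic proportions as document features.
counts = CountVectorizer().fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
features = lda.fit_transform(counts)  # one topic-mixture row per document

# kNN classifies documents in the low-dimensional topic space.
knn = KNeighborsClassifier(n_neighbors=1).fit(features, labels)
```

Each row of `features` is a topic mixture summing to one, so kNN operates on a compact, semantically grouped representation rather than the raw term counts.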

  6. Joint detection of anatomical points on surface meshes and color images for visual registration of 3D dental models

    NASA Astrophysics Data System (ADS)

    Destrez, Raphaël.; Albouy-Kissi, Benjamin; Treuillet, Sylvie; Lucas, Yves

    2015-04-01

Computer aided planning for orthodontic treatment requires knowing the occlusion of separately scanned dental casts. A visually guided registration is conducted, starting with the extraction of corresponding features in both photographs and 3D scans. To achieve this, the dental neck and occlusion surface are first extracted by image segmentation and 3D curvature analysis. Then an iterative registration process is conducted, during which feature positions are refined, guided by previously found anatomic edges. Occlusal edge detection in the image is improved by an original algorithm which follows Canny's poorly detected edges using a priori knowledge of tooth shapes. Finally, the influence of feature extraction and position optimization is evaluated in terms of the quality of the induced registration. The best combination of feature detection and optimization leads to an average positioning error of 1.10 mm and 2.03°.

  7. Quantum Watermarking Scheme Based on INEQR

    NASA Astrophysics Data System (ADS)

    Zhou, Ri-Gui; Zhou, Yang; Zhu, Changming; Wei, Lai; Zhang, Xiafen; Ian, Hou

    2018-04-01

Quantum watermarking technology protects copyright by embedding an invisible quantum signal in quantum multimedia data. In this paper, a watermarking scheme based on INEQR is presented. Firstly, the watermark image is extended to meet the embedding requirement of the carrier image. Secondly, swap and XOR operations are applied to the processed pixels; since there is only one bit per pixel, the XOR operation achieves the effect of simple encryption. Thirdly, both the watermark image embedding and extraction operations are described, where the key image, swap operation and LSB algorithm are used. When the embedding is performed, the binary key image is changed, which indicates that the watermark has been embedded. Conversely, to extract the watermark image, the key's state must first be detected; when the key's state is |1>, the extraction operation is carried out. Finally, to validate the proposed scheme, both the peak signal-to-noise ratio (PSNR) and the security of the scheme are analyzed.
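The XOR step at the heart of such schemes has a simple classical analog: because XOR is self-inverse, the same pixel-wise operation both embeds and recovers a binary image. The sketch below illustrates only that classical idea with invented 2x2 images; it is not the quantum INEQR circuit itself:

```python
# Classical analog of XOR-based binary-image watermarking: embedding
# XORs carrier bits with key-image bits; extraction repeats the XOR.

def xor_embed(carrier, key):
    """Embed by XORing carrier bits with key bits (2D lists of 0/1)."""
    return [[c ^ k for c, k in zip(crow, krow)]
            for crow, krow in zip(carrier, key)]

def xor_extract(watermarked, key):
    """XOR is self-inverse, so the same operation recovers the carrier."""
    return xor_embed(watermarked, key)
```

With any key image, `xor_extract(xor_embed(img, key), key)` returns `img` exactly, which is why a single-bit-per-pixel XOR doubles as a simple encryption.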

  8. Review of hydrofracking, the environmental pollution and some new methods may be used to skip the water in fracking process

    NASA Astrophysics Data System (ADS)

    Wang, B.

    2013-12-01

Shale gas is natural gas found trapped within shale formations, and it has become an increasingly important source of natural gas in the United States since the start of this century. Because shales ordinarily have insufficient permeability to allow significant fluid flow to a well bore, gas production in commercial quantities requires fractures to provide permeability. The shale gas boom is largely due to modern hydraulic fracturing technology, which creates extensive artificial fractures around well bores. At the same time, horizontal drilling is often used with shale gas wells to create maximum borehole surface area in contact with the shale. However, the extraction and use of shale gas can affect the environment through the leaking of extraction fluids into water supplies and the pollution caused by improper processing of natural gas. The challenge in preventing pollution is that shale gas extraction varies widely, even between two wells in the same project. Moreover, enormous amounts of water are needed for drilling, while some of the largest sources of shale gas are found in deserts. So if we can find technologies to substitute for water in the fracking process, we will solve not only the environmental problems but also the water supply issues. Some methods have already been studied for this purpose, such as the CO2 fracking process of Tsuyoshi Ishida et al. Finally, we propose a new method, an air-pressure system, for fracturing shales without using water.

  9. Intensity dependent spread theory

    NASA Technical Reports Server (NTRS)

    Holben, Richard

    1990-01-01

    The Intensity Dependent Spread (IDS) procedure is an image-processing technique based on a model of the processing which occurs in the human visual system. IDS processing is relevant to many aspects of machine vision and image processing. For quantum limited images, it produces an ideal trade-off between spatial resolution and noise averaging, performs edge enhancement thus requiring only mean-crossing detection for the subsequent extraction of scene edges, and yields edge responses whose amplitudes are independent of scene illumination, depending only upon the ratio of the reflectance on the two sides of the edge. These properties suggest that the IDS process may provide significant bandwidth reduction while losing only minimal scene information when used as a preprocessor at or near the image plane.

  10. Computer vision applications for coronagraphic optical alignment and image processing.

    PubMed

    Savransky, Dmitry; Thomas, Sandrine J; Poyneer, Lisa A; Macintosh, Bruce A

    2013-05-10

    Modern coronagraphic systems require very precise alignment between optical components and can benefit greatly from automated image processing. We discuss three techniques commonly employed in the fields of computer vision and image analysis as applied to the Gemini Planet Imager, a new facility instrument for the Gemini South Observatory. We describe how feature extraction and clustering methods can be used to aid in automated system alignment tasks, and also present a search algorithm for finding regular features in science images used for calibration and data processing. Along with discussions of each technique, we present our specific implementation and show results of each one in operation.

  11. Extraterrestrial materials processing and construction

    NASA Technical Reports Server (NTRS)

    Criswell, D. R.

    1978-01-01

    Applications of available terrestrial skills to the gathering of lunar materials and the processing of raw lunar materials into industrial feed stock were investigated. The literature on lunar soils and rocks was reviewed and the chemical processes by which major oxides and chemical elements can be extracted were identified. The gathering of lunar soil by means of excavation equipment was studied in terms of terrestrial experience with strip mining operations on earth. The application of electrostatic benefication techniques was examined for use on the moon to minimize the quantity of materials requiring surface transport and to optimize the stream of raw materials to be transported off the moon for subsequent industrial use.

  12. Protocols for the Investigation of Information Processing in Human Assessment of Fundamental Movement Skills.

    PubMed

    Ward, Brodie J; Thornton, Ashleigh; Lay, Brendan; Rosenberg, Michael

    2017-01-01

    Fundamental movement skill (FMS) assessment remains an important tool in classifying individuals' level of FMS proficiency. The collection of FMS performances for assessment and monitoring has remained unchanged over the last few decades, but new motion capture technologies offer opportunities to automate this process. To achieve this, a greater understanding of the human process of movement skill assessment is required. The authors present the rationale and protocols of a project in which they aim to investigate the visual search patterns and information extraction employed by human assessors during FMS assessment, as well as the implementation of the Kinect system for FMS capture.

  13. Indigenous knowledge of shea processing and quality perception of shea products in Benin.

    PubMed

    Honfo, Fernande G; Linnemann, Anita R; Akissoe, Noël H; Soumanou, Mohamed M; van Boekel, Martinus A J S

    2012-01-01

A survey among 246 people belonging to 14 ethnic groups and living in 5 different parklands in Benin revealed different practices to process shea kernels (namely boiling followed by sun drying and smoking) and to extract shea butter. A relation between parklands, gathering period, and sun-drying conditions was established. Moisture content and appearance of kernels were the selection criteria for users of shea kernels; color was the main characteristic when buying butter. Constraints to be solved are long processing times, lack of milling equipment and high water requirements. Best practices for the smoking, sun-drying, and roasting operations need to be established for further improvement.

  14. Diode laser soldering using a lead-free filler material for electronic packaging structures

    NASA Astrophysics Data System (ADS)

    Chaminade, C.; Fogarassy, E.; Boisselier, D.

    2006-04-01

To date, several lead-free soldering pastes have been qualified for currently used soldering processes. However, regarding the new potential of laser-assisted soldering processes, the behaviour of SnAgCu soldering paste requires new investigation. In the first part of this study, the specific temperature profile of a laser soldering process is investigated using a high power diode laser (HPDL). These experimental results are compared to a thermal simulation developed for this specific application. The second part of this work deals with the diffusion of the tin-based filler material through the nickel barrier, using the information extracted from the temperature simulations.

  15. Energy recovery with turboexpander processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holm, J.

    1985-07-01

Although the primary function of turboexpanders has been to provide efficient, low-temperature refrigeration, the energy thus extracted has also been an important additional feature. Today, turboexpanders are proven reliable and are used widely in the following applications discussed in this article: industrial gases; natural gas (NG) processing; production of liquefied natural gas (LNG); flashing hydrocarbon liquids; NG pressure letdown energy recovery; oilfield cogeneration; and recovery of energy from waste heat. Turboexpander applications for energy conservation resulted because available turboexpanders have the required high-performance capabilities and reliability. At the same time, the development of these energy conservation practices and processes helped further improve turboexpanders.

  16. A multiplexed microfluidic toolbox for the rapid optimization of affinity-driven partition in aqueous two phase systems.

    PubMed

    Bras, Eduardo J S; Soares, Ruben R G; Azevedo, Ana M; Fernandes, Pedro; Arévalo-Rodríguez, Miguel; Chu, Virginia; Conde, João P; Aires-Barros, M Raquel

    2017-09-15

    Antibodies and other protein products such as interferons and cytokines are biopharmaceuticals of critical importance which, in order to be safely administered, have to be thoroughly purified in a cost effective and efficient manner. The use of aqueous two-phase extraction (ATPE) is a viable option for this purification, but these systems are difficult to model and optimization procedures require lengthy and expensive screening processes. Here, a methodology for the rapid screening of antibody extraction conditions using a microfluidic channel-based toolbox is presented. A first microfluidic structure allows a simple negative-pressure driven rapid screening of up to 8 extraction conditions simultaneously, using less than 20μL of each phase-forming solution per experiment, while a second microfluidic structure allows the integration of multi-step extraction protocols based on the results obtained with the first device. In this paper, this microfluidic toolbox was used to demonstrate the potential of LYTAG fusion proteins used as affinity tags to optimize the partitioning of antibodies in ATPE processes, where a maximum partition coefficient (K) of 9.2 in a PEG 3350/phosphate system was obtained for the antibody extraction in the presence of the LYTAG-Z dual ligand. This represents an increase of approx. 3.7 fold when compared with the same conditions without the affinity molecule (K=2.5). Overall, this miniaturized and versatile approach allowed the rapid optimization of molecule partition followed by a proof-of-concept demonstration of an integrated back extraction procedure, both of which are critical procedures towards obtaining high purity biopharmaceuticals using ATPE. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. rEHR: An R package for manipulating and analysing Electronic Health Record data.

    PubMed

    Springate, David A; Parisi, Rosa; Olier, Ivan; Reeves, David; Kontopantelis, Evangelos

    2017-01-01

Research with structured Electronic Health Records (EHRs) is expanding as data becomes more accessible; analytic methods advance; and the scientific validity of such studies is increasingly accepted. However, data science methodology to enable the rapid searching/extraction, cleaning and analysis of these large, often complex, datasets is less well developed. In addition, commonly used software is inadequate, resulting in bottlenecks in research workflows and in obstacles to increased transparency and reproducibility of the research. Preparing a research-ready dataset from EHRs is a complex and time-consuming task requiring substantial data science skills, even for simple designs. In addition, certain aspects of the workflow are computationally intensive, for example extraction of longitudinal data and matching controls to a large cohort, which may take days or even weeks to run using standard software. The rEHR package simplifies and accelerates the process of extracting ready-for-analysis datasets from EHR databases. It has a simple import function to a database backend that greatly accelerates data access times. A set of generic query functions allow users to extract data efficiently without needing detailed knowledge of SQL queries. Longitudinal data extractions can also be made in a single command, making use of parallel processing. The package also contains functions for cutting data by time-varying covariates, matching controls to cases, unit conversion and construction of clinical code lists. There are also functions to synthesise dummy EHRs. The package has been tested with one of the largest primary care EHRs, the Clinical Practice Research Datalink (CPRD), but allows for a common interface to other EHRs. This simplified and accelerated workflow for EHR data extraction results in simpler, cleaner scripts that are more easily debugged, shared and reproduced.
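rEHR itself is an R package; as a language-neutral illustration of the workflow it automates (loading EHR events into a database backend, then running generic per-patient extraction queries), here is a minimal Python/sqlite sketch. The table layout, column names, and code values are invented and are not the rEHR or CPRD schema:

```python
import sqlite3

# Load toy EHR events into an in-memory database backend.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (patient_id INT, code TEXT, event_date TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(1, "C10", "2010-03-01"), (1, "C10", "2012-07-15"), (2, "H33", "2011-01-09")],
)

def first_event(conn, code):
    """Generic query: earliest recorded date of a clinical code per patient,
    the kind of extraction a research-ready dataset is built from."""
    return conn.execute(
        "SELECT patient_id, MIN(event_date) FROM events "
        "WHERE code = ? GROUP BY patient_id", (code,)
    ).fetchall()
```

Wrapping such queries in reusable functions is the design idea: researchers call `first_event(conn, "C10")` rather than hand-writing SQL for every extraction.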

  18. Extraction of actionable information from crowdsourced disaster data.

    PubMed

    Kiatpanont, Rungsun; Tanlamai, Uthai; Chongstitvatana, Prabhas

Natural disasters cause enormous damage to countries all over the world. To deal with these common problems, different activities are required for disaster management at each phase of the crisis. There are three groups of activities: (1) make sense of the situation and determine how best to deal with it, (2) deploy the necessary resources, and (3) harmonize as many parties as possible, using the most effective communication channels. Current technological improvements and developments now enable people to act as real-time information sources. As a result, inundation with crowdsourced data poses a real challenge for a disaster manager. The problem is how to extract the valuable information from a gigantic data pool in the shortest possible time, so that the information is still useful and actionable. This research proposed an actionable-data-extraction process to deal with the challenge. Twitter was selected as a test case because messages posted on Twitter are publicly available. Hashtags, an easy and very efficient technique, were also used to differentiate information. A quantitative approach to extracting useful information from the tweets was supported and verified by interviews with disaster managers from many leading organizations in Thailand to understand their missions. Classification of the information extracted from the collected tweets was first performed manually, and then the tweets were used to train a machine learning algorithm to classify future tweets. One particularly useful and significant category was requests for help. The support vector machine algorithm was used to validate the results of the extraction process on 13,696 sample tweets, achieving over 74 percent accuracy. The results confirmed that the machine learning technique could significantly and practically assist with disaster management by dealing with crowdsourced data.
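A train-then-classify setup of the kind described (hand-labelled tweets feeding a support vector machine) can be sketched with scikit-learn; the four toy tweets and the two labels here are invented and are not the study's Thai corpus or label set:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

# Toy labelled tweets; the study's real corpus had 13,696 samples.
tweets = [
    "need rescue boat family trapped on roof #flood",
    "please send drinking water to the shelter #flood",
    "thoughts and prayers for everyone affected",
    "amazing sunset tonight despite everything",
]
labels = ["request", "other", "other", "other"]
labels = ["request", "request", "other", "other"]

# TF-IDF features feeding a linear-kernel SVM classifier.
clf = make_pipeline(TfidfVectorizer(), LinearSVC())
clf.fit(tweets, labels)
```

Once trained, `clf.predict(["..."])` assigns each incoming tweet to a category such as "request", which is how future crowdsourced messages would be triaged automatically.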

  19. Design of experiments for amino acid extraction from tobacco leaves and their subsequent determination by capillary zone electrophoresis.

    PubMed

    Hodek, Ondřej; Křížek, Tomáš; Coufal, Pavel; Ryšlavá, Helena

    2017-03-01

    In this study, we optimized a method for the determination of free amino acids in Nicotiana tabacum leaves. Capillary electrophoresis with a contactless conductivity detector was used for the separation of 20 proteinogenic amino acids in an acidic background electrolyte. Subsequently, the conditions of extraction with HCl were optimized for the highest extraction yield of the amino acids, because sample treatment of plant materials brings some specific challenges. A central composite face-centered design with a fractional factorial design was used to evaluate the significance of selected factors (HCl volume, HCl concentration, sonication, shaking) on the extraction process. In addition, the composite design helped us find the optimal values for each factor using the response surface method. The limits of detection and limits of quantification for the 20 proteinogenic amino acids were found to be on the order of 10⁻⁵ and 10⁻⁴ mol l⁻¹, respectively. Addition of acetonitrile to the sample was tested as a method commonly used to decrease limits of detection. The ambiguous results of this experiment pointed out some features of plant extract samples, which often require specific approaches. Suitability of the method for metabolomic studies was tested by analysis of a real sample, in which all amino acids, except for L-methionine and L-cysteine, were successfully detected. The optimized extraction process together with the capillary electrophoresis method can be used for the determination of proteinogenic amino acids in plant materials. The resulting inexpensive, simple, and robust method is well suited for various metabolomic studies in plants. As such, the method represents a valuable tool for research and practical application in the fields of biology, biochemistry, and agriculture.
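    As a sketch of the design-of-experiments step, the snippet below builds a face-centred central composite design for two coded factors and fits a full quadratic response surface by least squares. The factor interpretation and yield values are invented; the actual study combined the composite design with a fractional factorial over four factors:

```python
# Illustrative sketch (not the paper's data): a face-centred central composite
# design for two coded factors (say, HCl volume and concentration), with a
# quadratic response surface fitted by ordinary least squares.
import numpy as np

# Face-centred CCD for 2 factors: 4 factorial points, 4 axial (face) points,
# and a centre point, all in coded units [-1, 1].
design = np.array([
    [-1, -1], [1, -1], [-1, 1], [1, 1],   # factorial points
    [-1, 0], [1, 0], [0, -1], [0, 1],     # axial (face-centred) points
    [0, 0],                               # centre point
], dtype=float)

# Simulated extraction yields with a maximum near the centre (made-up numbers).
yields = np.array([70.0, 74.0, 72.0, 75.0, 73.0, 76.0, 74.0, 75.0, 78.0])

# Full quadratic model: 1, x1, x2, x1*x2, x1^2, x2^2
x1, x2 = design[:, 0], design[:, 1]
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(X, yields, rcond=None)

def predict(a, b):
    """Predicted yield at coded factor levels (a, b)."""
    return coef @ np.array([1.0, a, b, a * b, a**2, b**2])

print(round(predict(0, 0), 2))   # fitted response at the centre point
```

    Optimal coded levels then follow from the stationary point of the fitted surface, exactly as the response surface method in the abstract describes.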

  20. The Future Impact of Wind on BPA Power System Load Following and Regulation Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makarov, Yuri V.; Lu, Shuai; McManus, Bart

    Wind power is growing at a very fast pace as an alternative generating resource. As the ratio of wind power to total system capacity increases, the impact of wind on various system aspects becomes significant. This paper presents a methodology to study the future impact of wind on BPA power system load following and regulation requirements. Existing methodologies for similar analysis include dispatch model simulation and standard deviation evaluation on load and wind data. The methodology proposed in this paper uses historical data and stochastic processes to simulate the load balancing processes in the BPA power system. It mimics the actual power system operations, so the results are close to reality, yet a study based on this methodology is convenient to perform. The capacity, ramp rate, and ramp duration characteristics are extracted from the simulation results. System load following and regulation capacity requirements are calculated accordingly. The ramp rate and ramp duration data obtained from the analysis can be used to evaluate generator response or maneuverability requirements and regulating units' energy requirements, respectively.
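    The characteristic-extraction step can be illustrated on a toy balancing signal. The series below is invented; in the paper these statistics come from the simulated BPA load-balancing process:

```python
# Illustrative sketch: extract capacity, ramp-rate, and ramp-duration
# characteristics from a regulation-requirement time series (made-up data).
import numpy as np

# Regulation requirement in MW at 1-minute resolution.
reg = np.array([0, 10, 25, 45, 40, 30, 10, -5, -20, -10, 5, 0], dtype=float)

capacity_up = reg.max()            # MW of upward capability needed
capacity_down = -reg.min()         # MW of downward capability needed

steps = np.diff(reg)               # MW per minute
ramp_rate_max = np.abs(steps).max()

# Ramp duration: length of each run of consecutive same-sign steps.
signs = np.sign(steps)
durations, run = [], 1
for a, b in zip(signs, signs[1:]):
    if a == b and a != 0:
        run += 1
    else:
        durations.append(run)
        run = 1
durations.append(run)

print(capacity_up, capacity_down, ramp_rate_max, max(durations))
```

    The same run-length logic applied to minute-level simulation output yields the distributions of ramp rate and ramp duration that size generator maneuverability and energy requirements.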

  1. Simultaneous application of chemical oxidation and extraction processes is effective at remediating soil Co-contaminated with petroleum and heavy metals.

    PubMed

    Yoo, Jong-Chan; Lee, Chadol; Lee, Jeung-Sun; Baek, Kitae

    2017-01-15

    Chemical extraction and oxidation processes to clean up heavy metals and hydrocarbons from soil have a higher remediation efficiency and take less time than other remediation processes. In the batch extraction/oxidation process, 3% hydrogen peroxide (H₂O₂) and 0.1 M ethylenediaminetetraacetic acid (EDTA) could remove approximately 70% of the petroleum and 60% of the Cu and Pb in the soil, respectively. In particular, petroleum was effectively oxidized by H₂O₂ without the addition of any catalysts, through dissolution of Fe oxides in natural soils. Furthermore, heavy metals bound to Fe-Mn oxyhydroxides could be extracted by metal-EDTA as well as Fe-EDTA complexation due to the high affinity of EDTA for metals. However, the strong binding of Fe-EDTA inhibited the oxidation of petroleum in the extraction-oxidation sequential process because Fe was removed during the extraction step with EDTA. The oxidation-extraction sequential process did not significantly enhance the extraction of heavy metals from soil, because a small portion of heavy metals remained bound to organic matter. Overall, simultaneous application of oxidation and extraction processes resulted in highly efficient removal of both contaminants; this approach can be used to remove co-contaminants from soil in a short amount of time at a reasonable cost. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. High-throughput simultaneous analysis of RNA, protein, and lipid biomarkers in heterogeneous tissue samples.

    PubMed

    Reiser, Vladimír; Smith, Ryan C; Xue, Jiyan; Kurtz, Marc M; Liu, Rong; Legrand, Cheryl; He, Xuanmin; Yu, Xiang; Wong, Peggy; Hinchcliffe, John S; Tanen, Michael R; Lazar, Gloria; Zieba, Renata; Ichetovkin, Marina; Chen, Zhu; O'Neill, Edward A; Tanaka, Wesley K; Marton, Matthew J; Liao, Jason; Morris, Mark; Hailman, Eric; Tokiwa, George Y; Plump, Andrew S

    2011-11-01

    With expanding biomarker discovery efforts and increasing costs of drug development, it is critical to maximize the value of mass-limited clinical samples. The main limitation of available methods is the inability to isolate and analyze, from a single sample, molecules requiring incompatible extraction methods. Thus, we developed a novel semiautomated method for tissue processing and tissue milling and division (TMAD). We used a SilverHawk atherectomy catheter to collect atherosclerotic plaques from patients requiring peripheral atherectomy. Tissue preservation by flash freezing was compared with immersion in RNAlater®, and tissue grinding by traditional mortar and pestle was compared with TMAD. Comparators were protein, RNA, and lipid yield and quality. Reproducibility of analyte yield from aliquots of the same tissue sample processed by TMAD was also measured. The quantity and quality of biomarkers extracted from tissue prepared by TMAD was at least as good as that extracted from tissue stored and prepared by traditional means. TMAD enabled parallel analysis of gene expression (quantitative reverse-transcription PCR, microarray), protein composition (ELISA), and lipid content (biochemical assay) from as little as 20 mg of tissue. The mean correlation was r = 0.97 in molecular composition (RNA, protein, or lipid) between aliquots of individual samples generated by TMAD. We also demonstrated that it is feasible to use TMAD in a large-scale clinical study setting. The TMAD methodology described here enables semiautomated, high-throughput sampling of small amounts of heterogeneous tissue specimens by multiple analytical techniques with generally improved quality of recovered biomolecules.

  3. Integrating Data Sources for Process Sustainability ...

    EPA Pesticide Factsheets

    Performing a chemical process sustainability assessment requires significant data about chemicals, process design specifications, and operating conditions. The required information includes the identity of the chemicals used, the quantities of the chemicals within the context of the sustainability assessment, physical properties of these chemicals, equipment inventory, as well as health, environmental, and safety properties of the chemicals. Much of these data are currently available to the process engineer either from the process design in the chemical process simulation software or online through chemical property and environmental, health, and safety databases. Examples of these databases include the U.S. Environmental Protection Agency’s (USEPA’s) Aggregated Computational Toxicology Resource (ACToR), National Institute for Occupational Safety and Health’s (NIOSH’s) Hazardous Substance Database (HSDB), and National Institute of Standards and Technology’s (NIST’s) Chemistry Webbook. This presentation will provide methods and procedures for extracting chemical identity and flow information from process design tools (such as chemical process simulators) and chemical property information from the online databases. The presentation will also demonstrate acquisition and compilation of the data for use in the EPA’s GREENSCOPE process sustainability analysis tool. This presentation discusses acquisition of data for use in rapid LCI development.

  4. Chromatographic analysis of methylglyoxal and other α-dicarbonyls using gas-diffusion microextraction.

    PubMed

    Santos, Christiane M; Valente, Inês M; Gonçalves, Luís M; Rodrigues, José A

    2013-12-07

    Many α-dicarbonyl compounds, such as methylglyoxal, diacetyl, and pentane-2,3-dione, are important quality markers of processed foods. They are produced by enzymatic and chemical processes; the Maillard reaction is the best-known chemical route for α-dicarbonyl formation. In the case of methylglyoxal, there are obstacles to be overcome when analysing this compound due to its high reactivity, low volatility, and low concentration. Extraction techniques based on the volatilization of methylglyoxal (like solid-phase microextraction) proved ineffective for methylglyoxal extraction from aqueous solutions. Therefore, derivatization is typically applied to increase the analyte's volatility. In this work a new methodology for the extraction and analysis of methylglyoxal, diacetyl, and pentane-2,3-dione from selected food matrices is presented. It is based on a gas-diffusion microextraction step followed by high performance liquid chromatographic analysis. It was successfully applied to port wines, black tea, and soy sauce. Methylglyoxal, diacetyl, and pentane-2,3-dione were quantified in the following concentration ranges: 0.24-1.74 mg L⁻¹, 0.1-1.85 mg L⁻¹, and 0.023-0.15 mg L⁻¹, respectively. The main advantages over existing methodologies are its simplicity in terms of sample handling, not requiring any chemical modification of the α-dicarbonyls prior to the extraction, low reagent consumption, and short time of analysis.

  5. Low temperature synthesis of hierarchical TiO₂ nanostructures for high performance perovskite solar cells by pulsed laser deposition

    DOE PAGES

    Yang, Bin; Mahjouri-Samani, Masoud; Rouleau, Christopher M.; ...

    2016-06-10

    A promising way to advance perovskite solar cells is to improve the quality of the electron transport material, e.g., titanium dioxide (TiO₂), in a direction that increases electron transport and extraction. Although dense TiO₂ films are easily grown in solution, efficient electron extraction suffers due to a lack of interfacial contact area with the perovskite. Conversely, mesoporous films do offer high surface-area-to-volume ratios, thereby promoting efficient electron extraction, but their morphology is relatively difficult to control via conventional solution synthesis methods. Here, a pulsed laser deposition method was used to assemble TiO₂ nanoparticles into TiO₂ hierarchical nanoarchitectures having the anatase crystal structure, and prototype solar cells employing these structures yielded power conversion efficiencies of ~14%. Our approach demonstrates a way to grow high aspect-ratio TiO₂ nanostructures for improved interfacial contact between TiO₂ and perovskite materials, leading to high electron-hole pair separation and electron extraction efficiencies for superior photovoltaic performance. In addition, compared to conventional solution-processed TiO₂ films that require 500 °C to obtain good crystallinity, our relatively low temperature (300 °C) TiO₂ processing method may promote reduced energy consumption during device fabrication as well as enable compatibility with various flexible polymer substrates.

  6. Real-Time EEG Signal Enhancement Using Canonical Correlation Analysis and Gaussian Mixture Clustering

    PubMed Central

    Huang, Chih-Sheng; Yang, Wen-Yu; Chuang, Chun-Hsiang; Wang, Yu-Kai

    2018-01-01

    Electroencephalogram (EEG) signals are usually contaminated with various artifacts, such as signal associated with muscle activity, eye movement, and body motion, which have a noncerebral origin. The amplitude of such artifacts is larger than that of the electrical activity of the brain, so they mask the cortical signals of interest, resulting in biased analysis and interpretation. Several blind source separation methods have been developed to remove artifacts from the EEG recordings. However, the iterative process for measuring separation within multichannel recordings is computationally intractable. Moreover, manually excluding the artifact components requires a time-consuming offline process. This work proposes a real-time artifact removal algorithm that is based on canonical correlation analysis (CCA), feature extraction, and the Gaussian mixture model (GMM) to improve the quality of EEG signals. The CCA was used to decompose EEG signals into components followed by feature extraction to extract representative features and GMM to cluster these features into groups to recognize and remove artifacts. The feasibility of the proposed algorithm was demonstrated by effectively removing artifacts caused by blinks, head/body movement, and chewing from EEG recordings while preserving the temporal and spectral characteristics of the signals that are important to cognitive research. PMID:29599950

  7. Incoherent optical generalized Hough transform: pattern recognition and feature extraction applications

    NASA Astrophysics Data System (ADS)

    Fernández, Ariel; Ferrari, José A.

    2017-05-01

    Pattern recognition and feature extraction are image processing applications of great interest in defect inspection and robot vision, among others. In comparison to purely digital methods, the attractiveness of optical processors for pattern recognition lies in their highly parallel operation and real-time processing capability. This work presents an optical implementation of the generalized Hough transform (GHT), a well-established technique for recognition of geometrical features in binary images. Detection of a geometric feature under the GHT is accomplished by mapping the original image to an accumulator space; the large computational requirements of this mapping make the optical implementation an attractive alternative to digital-only methods. We explore an optical setup where the transformation is obtained and the size and orientation parameters can be controlled, allowing for dynamic scale- and orientation-variant pattern recognition. A compact system for the above purposes results from the use of an electrically tunable lens for scale control and a pupil mask implemented on a high-contrast spatial light modulator for orientation/shape variation of the template. Real-time operation can also be achieved. In addition, by thresholding the GHT and optically inverse transforming, the previously detected features of interest can be extracted.
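    The digital voting step that the optical processor replaces can be written down compactly. This is a simplified, gradient-free GHT, where every image edge point votes for every R-table offset, applied to an invented square template:

```python
# Simplified GHT sketch: each image edge point votes for each reference-point
# offset in the R-table; the accumulator peak locates the detected shape.
import numpy as np

def ght_accumulator(template_pts, image_pts, shape):
    """Cast votes into an accumulator of the given shape; return vote counts."""
    ref = template_pts.mean(axis=0)           # reference point (centroid)
    r_table = ref - template_pts              # displacement vectors
    acc = np.zeros(shape, dtype=int)
    for p in image_pts:
        votes = np.rint(p + r_table).astype(int)
        for y, x in votes:
            if 0 <= y < shape[0] and 0 <= x < shape[1]:
                acc[y, x] += 1
    return acc

# Template: four corners of a square; image: the same square shifted by (7, 5).
template = np.array([[0, 0], [0, 4], [4, 0], [4, 4]], dtype=float)
image = template + np.array([7.0, 5.0])

acc = ght_accumulator(template, image, shape=(20, 20))
peak = np.unravel_index(acc.argmax(), acc.shape)
print(peak)  # → (9, 7): template reference point (2, 2) shifted by (7, 5)
```

    The full GHT additionally indexes the R-table by edge gradient direction, which is what makes the scale and orientation parameters in the optical setup meaningful.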

  8. Methodology to Collect Natural Gas from Methane Hydrate Deposits Using Sunlight: Design of Direct Sunlight Exposure System

    NASA Astrophysics Data System (ADS)

    Shimada, M.; Shimada, J.; Tsunashima, K.; Aoyama, C.

    2017-12-01

    Methane hydrate is anticipated to be an unconventional natural gas energy resource. Two types of methane hydrate are known to exist, based on their settings: the "shallow" type and the "sand layer" type. The shallow type is considered advantageous due to its high purity and simpler exploration. However, few development efforts have been made in the area of extraction techniques. Currently, heating and depressurization are used as methods to collect sand-layer methane hydrate, but these methods are still under examination and not yet implemented. This is probably because fossil fuel is used for the extraction process instead of natural energy. It is necessary to utilize natural energy instead of relying on fossil fuel, which is why sunlight is believed to be the most significant alternative. Solar power generation is a common way to harness sunlight, but it is said that this process causes extreme energy loss, since solar energy converted to electricity must then be converted back to heat energy. A new method is devised to accelerate the decomposition of methane hydrate with direct sunlight delivered through optical fibers. The authors will present details of this new method to collect methane hydrate with direct sunlight exposure.

  9. Microwave Extraction of Lunar Water for Rocket Fuel

    NASA Technical Reports Server (NTRS)

    Ethridge, Edwin C.; Donahue, Benjamin; Kaukler, William

    2008-01-01

    Nearly 50% of the lunar surface is oxygen, present as oxides in silicate rocks and soil. Methods for reduction of these oxides could liberate the oxygen. Remote sensing has provided evidence of significant quantities of hydrogen, possibly indicating hundreds of millions of metric tons (MT) of water at the lunar poles. If the presence of lunar water is verified, water is likely to be the first in situ resource exploited for human exploration and for LOX-H2 rocket fuel. In situ lunar resources offer unique advantages for space operations. Each unit of product produced on the lunar surface represents 6 units that need not be launched into LEO. Previous studies have indicated the economic advantage of LOX for space tugs from LEO to GEO. Use of lunar-derived LOX in a reusable lunar lander would greatly reduce the LEO mass required for a given payload to the Moon, and lunar LOX transported to L2 has unique advantages for a Mars mission. Several methods exist for extraction of oxygen from the soil, but extraction of lunar water has several significant advantages. Microwave heating of lunar permafrost has additional important advantages for water extraction: microwaves penetrate and heat from within, not just at the surface, and excavation is not required. Proof-of-concept experiments using a "moon in a bottle" setup have demonstrated that microwave processing of cryogenic lunar permafrost simulant in a vacuum rapidly and efficiently extracts water by sublimation. A prototype lunar water extraction rover was built and tested for heating of simulant. Microwave power was very efficiently delivered into a simulated lunar soil. Microwave dielectric properties (complex electric permittivity and magnetic permeability) of lunar regolith simulant JSC-1A were measured down to cryogenic temperatures and above room temperature. The microwave penetration has been correlated with the measured dielectric properties. Since the microwave penetration depth is a function of temperature and frequency, an extraction system can be designed for water removal from different depths.

  10. An Approach for Calculating Land Valuation by Using Inspire Data Models

    NASA Astrophysics Data System (ADS)

    Aydinoglu, A. C.; Bovkir, R.

    2017-11-01

    Land valuation is a highly important concept for societies, and governments have always placed emphasis on the process, especially for taxation, expropriation, market capitalization, and economic activity purposes. To achieve an interoperable and standardised land valuation, INSPIRE data models can be very practical and effective. If the data used in the land valuation process are produced in compliance with INSPIRE specifications, a reliable and effective land valuation process can be performed. In this study, the possibility of performing the land valuation process using the INSPIRE data models was analysed, and with the help of Geographic Information Systems (GIS) a case study in Pendik was implemented. For this purpose, data analysis and gathering were performed first. Afterwards, different data structures were transformed according to the INSPIRE data model requirements. For each data set the necessary ETL (Extract-Transform-Load) tools were produced, and all data were transformed according to the target data requirements. With the availability and practicability of the spatial analysis tools of GIS software, land valuation calculations were performed for the study area.

  11. Diffraction based overlay metrology for α-carbon applications

    NASA Astrophysics Data System (ADS)

    Saravanan, Chandra Saru; Tan, Asher; Dasari, Prasad; Goelzer, Gary; Smith, Nigel; Woo, Seouk-Hoon; Shin, Jang Ho; Kang, Hyun Jae; Kim, Ho Chul

    2008-03-01

    Applications that require overlay measurement between layers separated by absorbing interlayer films (such as α-carbon) pose significant challenges for sub-50nm processes. In this paper, scatterometry methods are investigated as an alternative to meet these stringent overlay metrology requirements. A spectroscopic Diffraction Based Overlay (DBO) measurement technique is used in which registration errors are extracted from specially designed diffraction targets. DBO measurements are performed on a detailed set of wafers with varying α-carbon (ACL) thicknesses. The correlation in overlay values between wafers with varying ACL thicknesses will be discussed. The total measurement uncertainty (TMU) requirements for these layers are discussed, and the DBO TMU results from sub-50nm samples are reviewed.

  12. Composition and process for separating cesium ions from an acidic aqueous solution also containing other ions

    DOEpatents

    Dietz, Mark L.; Horwitz, E. Philip; Bartsch, Richard A.; Barrans, Jr., Richard E.; Rausch, David

    1999-01-01

    A crown ether cesium ion extractant is disclosed as is its synthesis. The crown ether cesium ion extractant is useful for the selective purification of cesium ions from aqueous acidic media, and more particularly useful for the isolation of radioactive cesium-137 from nuclear waste streams. Processes for isolating cesium ions from aqueous acidic media using the crown ether cesium extractant are disclosed as are processes for recycling the crown ether cesium extractant and processes for recovering cesium from a crown ether cesium extractant solution.

  13. Composition and process for separating cesium ions from an acidic aqueous solution also containing other ions

    DOEpatents

    Dietz, M.L.; Horwitz, E.P.; Bartsch, R.A.; Barrans, R.E. Jr.; Rausch, D.

    1999-03-30

    A crown ether cesium ion extractant is disclosed as is its synthesis. The crown ether cesium ion extractant is useful for the selective purification of cesium ions from aqueous acidic media, and more particularly useful for the isolation of radioactive cesium-137 from nuclear waste streams. Processes for isolating cesium ions from aqueous acidic media using the crown ether cesium extractant are disclosed as are processes for recycling the crown ether cesium extractant and processes for recovering cesium from a crown ether cesium extractant solution. 4 figs.

  14. Development of a Post-Processing Algorithm for Accurate Human Skull Profile Extraction via Ultrasonic Phased Arrays

    NASA Astrophysics Data System (ADS)

    Al-Ansary, Mariam Luay Y.

    Ultrasound imaging has been favored by clinicians for its safety, affordability, accessibility, and speed compared to other imaging modalities. However, the trade-offs for these benefits are relatively lower image quality and interpretability, which can be addressed by, for example, post-processing methods. One particularly difficult imaging case is associated with the presence of a barrier, such as a human skull, with significantly different acoustical properties than the brain tissue as the target medium. Some methods have been proposed in the literature to account for this structure if the skull's geometry is known. Measuring the skull's geometry is therefore an important task that requires attention. In this work, a new edge detection method for accurate human skull profile extraction via post-processing of ultrasonic A-scans is introduced. This method, referred to as the Selective Echo Extraction (SEE) algorithm, processes each A-scan separately and determines the outermost and innermost boundaries of the skull by means of adaptive filtering. The method can also be used to determine the average attenuation coefficient of the skull. When applied to simulated B-mode images of the skull profile, promising results were obtained. The profiles obtained from the proposed process in simulations were found to be within 0.15λ ± 0.11λ (0.09 ± 0.07 mm) of the actual profiles. Experiments were also performed to test SEE on skull-mimicking phantoms whose major acoustical properties are similar to those of the actual human skull. With experimental data, the profiles obtained with the proposed process were within 0.32λ ± 0.25λ (0.19 ± 0.15 mm) of the actual profile.
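    The per-A-scan task can be illustrated with a toy echo picker. This is not the SEE algorithm itself (its adaptive filtering details are in the thesis); it only shows the basic problem of locating the outer and inner boundary echoes in a single trace, with all signal parameters invented:

```python
# Toy sketch: locate two boundary echoes in a synthetic A-scan by
# thresholding a rectified-and-smoothed envelope (invented parameters).
import numpy as np

fs = 10e6                                  # 10 MHz sampling rate
t = np.arange(2000) / fs                   # 200 µs record

def echo(t0, amp):
    """Gaussian-windowed 2 MHz pulse arriving at time t0."""
    return amp * np.exp(-((t - t0) * 4e6) ** 2) * np.sin(2 * np.pi * 2e6 * (t - t0))

rng = np.random.default_rng(1)
ascan = echo(40e-6, 1.0) + echo(120e-6, 0.6) + 0.01 * rng.normal(size=t.size)

# Envelope estimate: rectify the trace, then smooth with a moving average.
kernel = np.ones(11) / 11
env = np.convolve(np.abs(ascan), kernel, mode="same")

# Threshold relative to the strongest echo; rising edges mark echo onsets.
above = env > 0.3 * env.max()
onsets = np.flatnonzero(np.diff(above.astype(int)) == 1) + 1
arrival_times = t[onsets]
print(arrival_times * 1e6)   # onset times in microseconds
```

    Converting the onset-time pair to a thickness requires the speed of sound in bone, which is where the skull's acoustical properties enter the real method.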

  15. Lunar exploration for resource utilization

    NASA Technical Reports Server (NTRS)

    Duke, Michael B.

    1992-01-01

    The strategy for developing resources on the Moon depends on the stage of space industrialization. A case is made for first developing the resources needed to provide simple materials required in large quantities for space operations. Propellants, shielding, and structural materials fall into this category. As the enterprise grows, it will be feasible to develop additional sources - those more difficult to obtain or required in smaller quantities. Thus, the first materials processing on the Moon will probably take the abundant lunar regolith, extract from it major mineral or glass species, and do relatively simple chemical processing. We need to conduct a lunar remote sensing mission to determine the global distribution of features, geophysical properties, and composition of the Moon, information which will serve as the basis for detailed models of and engineering decisions about a lunar mine.

  16. Ultrasonic Micro-Blades for the Rapid Extraction of Impact Tracks from Aerogel

    NASA Technical Reports Server (NTRS)

    Ishii, H. A.; Graham, G. A.; Kearsley, A. T.; Grant, P. G.; Snead, C. J.; Bradley, J. P.

    2005-01-01

    The science return of NASA's Stardust Mission with its valuable cargo of cometary debris hinges on the ability to efficiently extract particles from silica aerogel collectors. The current method for extracting cosmic dust impact tracks is a mature procedure involving sequential perforation of the aerogel with glass needles on computer controlled micromanipulators. This method is highly successful at removing well-defined aerogel fragments of reasonable optical clarity while causing minimal damage to the surrounding aerogel collector tile. Such a system will be adopted by the JSC Astromaterials Curation Facility in anticipation of Stardust's arrival in early 2006. In addition to Stardust, aerogel is a possible collector for future sample return missions and is used for capture of hypervelocity ejecta in high power laser experiments of interest to LLNL. Researchers will be eager to obtain Stardust samples for study as quickly as possible, and rapid extraction tools requiring little construction, training, or investment would be an attractive asset. To this end, we have experimented with micro-blades for the Stardust impact track extraction process. Our ultimate goal is a rapid extraction system in a clean electron beam environment, such as an SEM or dual-beam FIB, for in situ sample preparation, mounting, and analysis.

  17. Extracting foreground ensemble features to detect abnormal crowd behavior in intelligent video-surveillance systems

    NASA Astrophysics Data System (ADS)

    Chan, Yi-Tung; Wang, Shuenn-Jyi; Tsai, Chung-Hsien

    2017-09-01

    Public safety is a matter of national security and people's livelihoods. In recent years, intelligent video-surveillance systems have become important active-protection systems. A surveillance system that provides early detection and threat assessment could protect people from crowd-related disasters and ensure public safety. Image processing is commonly used to extract features, e.g., people, from a surveillance video. However, little research has been conducted on the relationship between foreground detection and feature extraction. Most current video-surveillance research has been developed for restricted environments, in which the extracted features are limited by having information from a single foreground; they do not effectively represent the diversity of crowd behavior. This paper presents a general framework based on extracting ensemble features from the foreground of a surveillance video to analyze a crowd. The proposed method can flexibly integrate different foreground-detection technologies to adapt to various monitored environments. Furthermore, the extractable representative features depend on the heterogeneous foreground data. Finally, a classification algorithm is applied to these features to automatically model crowd behavior and distinguish an abnormal event from normal patterns. The experimental results demonstrate that the proposed method's performance is both comparable to that of state-of-the-art methods and satisfies the requirements of real-time applications.

  18. Comparison between 2 methods of solid-liquid extraction for the production of Cinchona calisaya elixir: an experimental kinetics and numerical modeling approach.

    PubMed

    Naviglio, Daniele; Formato, Andrea; Gallo, Monica

    2014-09-01

    The purpose of this study is to compare the extraction process for the production of China elixir starting from the same vegetable mixture, as performed by conventional maceration or a cyclically pressurized extraction process (rapid solid-liquid dynamic extraction) using the Naviglio Extractor. Dry residue was used as a marker for the kinetics of the extraction process because it was proportional to the amount of active principles extracted and, therefore, to their total concentration in the solution. UV spectra of the hydroalcoholic extracts allowed for the identification of the predominant chemical species in the extracts, while the organoleptic tests carried out on the final product provided an indication of the acceptance of the beverage and highlighted features that were not detectable by instrumental analytical techniques. In addition, a numerical simulation of the process has been performed, obtaining useful information about the timing of the process (time history) as well as its mathematical description. © 2014 Institute of Food Technologists®
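    The paper's own numerical model is not reproduced here, but a common first-order kinetics assumption illustrates what such a simulation captures: extraction slows as the solvent approaches equilibrium. The rate constant and equilibrium concentration below are invented:

```python
# First-order solid-liquid extraction kinetics (illustrative assumption):
# dC/dt = k * (C_inf - C), integrated with an explicit Euler step and
# compared against the closed-form solution C(t) = C_inf * (1 - exp(-k t)).
import numpy as np

C_inf, k = 50.0, 0.12        # g/L equilibrium concentration, 1/min rate
dt, t_end = 0.01, 60.0       # time step and horizon in minutes

n = int(t_end / dt)
C = 0.0
for _ in range(n):
    C += dt * k * (C_inf - C)   # extraction slows as the solvent saturates

exact = C_inf * (1 - np.exp(-k * t_end))
print(round(C, 3), round(exact, 3))
```

    Fitting the measured dry residue over time to such a curve yields the rate constant, which is how the two extraction techniques' kinetics can be compared quantitatively.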

  19. Compositional similarities of non-solvent extractable fatty acids from recent marine sediments deposited in differing environments

    NASA Astrophysics Data System (ADS)

    Nishimura, Mitsugu; Baker, Earl W.

    1987-06-01

    Five recent sediment samples from a variety of North American continental shelves were analyzed for fatty acids (FAs) in the solvent-extractable (SOLEX) lipids as well as four types of non-solvent-extractable (NONEX) lipids. The NONEX lipids were operationally defined by the succession of extraction procedures required to recover them. The complete procedure included (i) very mild acid treatment, (ii) HF digestion, and (iii) saponification of the sediment residue following exhaustive solvent extraction. The distribution patterns and various compositional parameters of SOLEX FAs in the five sediments fell into three different groups, indicating differences in biological sources, as well as in diagenetic factors and processes, among the three groups of samples. Nevertheless, the compositions of the corresponding NONEX FAs after acid treatment were surprisingly similar. This was also true for the remaining NONEX FA groups in the five sediment samples. The findings implied that most of the NONEX FAs reported here are derived directly from living organisms. It is also concluded that a large part of the NONEX FAs are much more resistant to biodegradation than previously thought, so that they can form a large percentage of total lipids with increasing depth of water and sediments.

  20. A Review of Enzymatic Transesterification of Microalgal Oil-Based Biodiesel Using Supercritical Technology

    PubMed Central

    Taher, Hanifa; Al-Zuhair, Sulaiman; Al-Marzouqi, Ali H.; Haik, Yousef; Farid, Mohammed M.

    2011-01-01

    Biodiesel is considered a promising replacement for petroleum-derived diesel. Using oils extracted from agricultural crops competes with their use as food and cannot realistically satisfy the global diesel-fuel demand. On the other hand, microalgae, which have a much higher oil yield per hectare compared to oil crops, appear to be a source with the potential to completely replace fossil diesel. Microalgae oil extraction is a major step in the overall biodiesel production process. Recently, supercritical carbon dioxide (SC-CO2) has been proposed to replace conventional solvent extraction techniques because it is nontoxic, nonhazardous, chemically stable, and inexpensive. It is an environmentally acceptable solvent that can easily be separated from the products. In addition, the use of SC-CO2 as a reaction medium has also been proposed to eliminate the inhibition limitations encountered in biodiesel production reactions using immobilized enzymes as catalysts. Furthermore, using SC-CO2 allows easy separation of the product. In this paper, conventional biodiesel production with first-generation feedstock, using chemical catalysts and solvent extraction, is compared to new technologies with an emphasis on using microalgae, immobilized lipase, and SC-CO2 as an extraction solvent and reaction medium. PMID:21915372
