Integrated gasifier combined cycle polygeneration system to produce liquid hydrogen
NASA Technical Reports Server (NTRS)
Burns, R. K.; Staiger, P. J.; Donovan, R. M.
1982-01-01
An integrated gasifier combined cycle (IGCC) system which simultaneously produces electricity, process steam, and liquid hydrogen was evaluated and compared to IGCC systems which cogenerate electricity and process steam. A number of IGCC plants, all employing a 15 MWe gas turbine and producing from 0 to 20 tons per day of liquid hydrogen and from 0 to 20 MWt of process steam, were considered. The annual revenue required to own and operate such plants was estimated to be significantly lower than the potential market value of the products. The results indicate a significant potential economic benefit to configuring IGCC systems to produce a clean fuel in addition to electricity and process steam in relatively small industrial applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Provost, G.; Stone, H.; McClintock, M.
2008-01-01
To meet the growing demand for education and experience with the analysis, operation, and control of commercial-scale Integrated Gasification Combined Cycle (IGCC) plants, the Department of Energy’s (DOE) National Energy Technology Laboratory (NETL) is leading a collaborative R&D project with participants from government, academia, and industry. One of the goals of this project is to develop a generic, full-scope, real-time IGCC dynamic plant simulator for use in establishing a world-class research and training center, as well as to promote and demonstrate the technology to power industry personnel. The NETL IGCC dynamic plant simulator will combine, for the first time, a process/gasification simulator and a power/combined-cycle simulator in a single dynamic simulation framework for use in training applications as well as engineering studies. As envisioned, the simulator will have the following features and capabilities:
- A high-fidelity, real-time, dynamic model of the process side (gasification and gas cleaning with CO2 capture) and power-block side (combined cycle) for a generic IGCC plant fueled by coal and/or petroleum coke
- Full-scope training simulator capabilities, including startup, shutdown, load following and shedding, response to fuel and ambient condition variations, control strategy analysis (turbine vs. gasifier lead, etc.), representative malfunctions/trips, alarms, scenarios, trending, snapshots, data historian, and trainee performance monitoring
- The ability to enhance and modify the plant model to facilitate studies of changes in plant configuration and equipment and to support future R&D efforts
To support this effort, process descriptions and control strategies were developed for key sections of the plant as part of the detailed functional specification, which will form the basis of the simulator development.
These plant sections include:
- Slurry Preparation
- Air Separation Unit
- Gasifiers
- Syngas Scrubbers
- Shift Reactors
- Gas Cooling, Medium Pressure (MP) and Low Pressure (LP) Steam Generation, and Knockout
- Sour Water Stripper
- Mercury Removal
- Selexol™ Acid Gas Removal System
- CO2 Compression
- Syngas Reheat and Expansion
- Claus Plant
- Hydrogenation Reactor and Gas Cooler
- Combustion Turbine (CT)-Generator Assemblies
- Heat Recovery Steam Generators (HRSGs) and Steam Turbine (ST)-Generator
In this paper, process descriptions, control strategies, and Process & Instrumentation Diagram (P&ID) drawings for key sections of the generic IGCC plant are presented, along with discussions of some of the operating procedures and representative faults that the simulator will cover. Some of the intended future applications for the simulator are discussed, including plant operation and control demonstrations as well as education and training services such as IGCC familiarization courses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Provost, G.; Zitney, S.; Turton, R.
2009-01-01
To meet increasing demand for education and experience with commercial-scale, coal-fired, integrated gasification combined cycle (IGCC) plants with CO2 capture, the Department of Energy’s (DOE) National Energy Technology Laboratory (NETL) is leading a project to deploy a generic, full-scope, real-time IGCC dynamic plant simulator for use in establishing a world-class research and training center, and to promote and demonstrate IGCC technology to power industry personnel. The simulator, being built by Invensys Process Systems (IPS), will be installed at two separate sites, at NETL and West Virginia University (WVU), and will combine a process/gasification simulator with a power/combined-cycle simulator in a single dynamic simulation framework for use in engineering research studies and training applications. The simulator, scheduled to be launched in mid-year 2010, will have the following capabilities:
- High-fidelity, dynamic model of the process side (gasification and gas cleaning with CO2 capture) and power-block side (combined cycle) for a generic IGCC plant fueled by coal and/or petroleum coke.
- Highly flexible configuration that allows concurrent training on separate gasification and combined cycle simulators, or up to two IGCC simulators.
- Ability to enhance and modify the plant model to facilitate studies of changes in plant configuration, equipment, and control strategies to support future R&D efforts.
- Training capabilities including startup, shutdown, load following and shedding, response to fuel and ambient condition variations, control strategy analysis (turbine vs. gasifier lead, etc.), representative malfunctions/trips, alarms, scenarios, trending, snapshots, data historian, etc.
To support this effort, process descriptions and control strategies were developed for key sections of the plant as part of the detailed functional specification, which is serving as the basis of the simulator development.
In this paper, we highlight the contents of the detailed functional specification for the simulator. We also describe the engineering, design, and expert testing process that the simulator will undergo in order to ensure that maximum fidelity is built into the generic simulator. Future applications and training programs associated with gasification, combined cycle, and IGCC simulations are discussed, including plant operation and control demonstrations, as well as education and training services.
Zhu, Yunhua; Frey, H Christopher
2006-12-01
Integrated gasification combined cycle (IGCC) technology is a promising alternative for clean generation of power and coproduction of chemicals from coal and other feedstocks. Advanced concepts for IGCC systems that incorporate state-of-the-art gas turbine systems, however, are not commercially demonstrated. Therefore, there is uncertainty regarding the future commercial-scale performance, emissions, and cost of such technologies. The Frame 7F gas turbine represents current state-of-practice, whereas the Frame 7H is the most recently introduced advanced commercial gas turbine. The objective of this study was to evaluate the risks and potential payoffs of IGCC technology based on different gas turbine combined cycle designs. Models of entrained-flow gasifier-based IGCC systems with Frame 7F (IGCC-7F) and 7H gas turbine combined cycles (IGCC-7H) were developed in ASPEN Plus. An uncertainty analysis was conducted. Gasifier carbon conversion and project cost uncertainty are identified as the most important uncertain inputs with respect to system performance and cost. The uncertainties in the difference of the efficiencies and costs for the two systems are characterized. Despite uncertainty, the IGCC-7H system is robustly preferred to the IGCC-7F system. Advances in gas turbine design will improve the performance, emissions, and cost of IGCC systems. The implications of this study for decision-making regarding technology selection, research planning, and plant operation are discussed.
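The robust-preference conclusion above rests on comparing the two designs under shared input uncertainty. A minimal Monte Carlo sketch of that kind of comparison, with assumed base efficiencies and carbon-conversion ranges (illustrative placeholders, not the paper's data):

```python
import random

def simulate_efficiency(base_eff, conv_range, n=10_000, seed=1):
    """Sample overall efficiency under uncertain gasifier carbon conversion."""
    rng = random.Random(seed)
    return [base_eff * rng.uniform(*conv_range) for _ in range(n)]

# Assumed base efficiencies and carbon-conversion uncertainty ranges
eff_7f = simulate_efficiency(0.40, (0.95, 0.995), seed=1)  # IGCC-7F
eff_7h = simulate_efficiency(0.44, (0.90, 0.99), seed=2)   # IGCC-7H

# Fraction of paired samples in which the 7H design outperforms the 7F design
p_better = sum(h > f for h, f in zip(eff_7h, eff_7f)) / len(eff_7f)
print(f"P(7H more efficient than 7F) = {p_better:.2f}")
```

A design is "robustly preferred" in this sense when it wins in nearly all sampled scenarios despite the overlapping uncertainty bands.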
ASPEN simulation of a fixed-bed integrated gasification combined-cycle power plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stone, K.R.
1986-03-01
A fixed-bed integrated gasification combined-cycle (IGCC) power plant has been modeled using the Advanced System for Process ENgineering (ASPEN). The ASPEN simulation is based on a conceptual design of a 509-MW IGCC power plant that uses British Gas Corporation (BGC)/Lurgi slagging gasifiers and the Lurgi acid gas removal process. The 39.3-percent thermal efficiency of the plant that was calculated by the simulation compares very favorably with the 39.4 percent that was reported by EPRI. The simulation addresses only thermal performance and does not calculate capital cost or process economics. Portions of the BGC-IGCC simulation flowsheet are based on the SLAGGER fixed-bed gasifier model (Stefano May 1985), the Kellogg-Rust-Westinghouse (KRW) IGCC simulation, and the Texaco-IGCC simulation (Stone July 1985) that were developed at the Department of Energy (DOE), Morgantown Energy Technology Center (METC). The simulation runs in 32 minutes of Central Processing Unit (CPU) time on the VAX-11/780. The BGC-IGCC simulation was developed to give accurate mass and energy balances and to track coal tars and environmental species such as SOx and NOx for a fixed-bed, coal-to-electricity system. This simulation is the third in a series of three IGCC simulations that represent fluidized-bed, entrained-flow, and fixed-bed gasification processes. Alternate process configurations can be considered by adding, deleting, or rearranging unit operation blocks. The gasifier model is semipredictive; it can properly respond to a limited range of coal types and gasifier operating conditions. However, some models in the flowsheet are based on correlations that were derived from the EPRI study, and are therefore limited to coal types and operating conditions that are reasonably close to those given in the EPRI design. 4 refs., 7 figs., 2 tabs.
Analysis of potential benefits of integrated-gasifier combined cycles for a utility system
NASA Technical Reports Server (NTRS)
Choo, Y. K.
1983-01-01
Potential benefits of integrated gasifier combined cycle (IGCC) units were evaluated for a reference utility system by comparing long-range expansion plans using IGCC units and gas turbine peakers with a plan using only state-of-the-art steam turbine units and gas turbine peakers. Also evaluated was the importance of individual IGCC unit characteristics, particularly unit efficiency, unit equivalent forced outage rate, and unit size. A range of IGCC units was analyzed, including cases achievable with state-of-the-art gas turbines and cases assuming advanced gas turbine technology. All utility system expansion plans that used IGCC units showed substantial savings compared with the base expansion plan using the steam turbine units.
Pinon Pine power project nears start-up
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tatar, G.A.; Gonzalez, M.; Mathur, G.K.
1997-12-31
The IGCC facility being built by Sierra Pacific Power Company (SPPCo) at their Tracy Station in Nevada is one of three IGCC facilities being cost-shared by the US Department of Energy (DOE) under their Clean Coal Technology Program. The specific technology to be demonstrated in SPPCo's Round Four Project, known as the Pinon Pine IGCC Project, includes the KRW air-blown pressurized fluidized-bed gasification process with hot gas cleanup, coupled with a combined cycle facility based on a new GE 6FA gas turbine. Construction of the 100 MW IGCC facility began in February 1995, and the first firing of the gas turbine occurred as scheduled on August 15, 1996 with natural gas. Mechanical completion of the gasifier and other outstanding work is due in January 1997. Following the startup of the plant, the project will enter a 42-month operating and testing period during which low-sulfur western and high-sulfur eastern or midwestern coals will be processed.
A regenerative process for carbon dioxide removal and hydrogen production in IGCC
NASA Astrophysics Data System (ADS)
Hassanzadeh Khayyat, Armin
Advanced power generation technologies, such as Integrated Gasification Combined Cycle (IGCC) processes, are among the leading contenders for power generation because of their significantly higher efficiencies and potential environmental advantages compared to conventional coal combustion processes. Although the increase in efficiency of IGCC processes will reduce carbon dioxide emissions per unit of power generated, further reduction in CO2 emissions is crucial given the enforcement of greenhouse gas (GHG) regulations. To avoid efficiency losses in IGCC processes, it is desirable to remove CO2 in the temperature range of 300° to 500°C, which makes regenerable MgO-based sorbents ideal for such operations. In this temperature range, CO2 removal shifts the water-gas shift (WGS) reaction forward, significantly reducing carbon monoxide (CO) and enhancing hydrogen production. However, regenerable, reactive, and attrition-resistant sorbents are required for such an application. In this work, a highly reactive and attrition-resistant regenerable MgO-based sorbent is prepared through dolomite modification, which can simultaneously remove carbon dioxide and enhance hydrogen production in a single reactor. The results of experimental tests conducted in a High-Pressure Thermogravimetric Analyzer (HP-TGA) and high-pressure packed-bed units indicate that in the temperature range of 300° to 500°C at 20 atm, more than 95 molar percent of the CO2 can be removed from simulated coal gas, and the hydrogen concentration can be increased to above 70 percent. However, a declining trend is observed in the capacity of the sorbent exposed to long-term durability analysis, which appears to level off after about 20 cycles. Based on the physical and chemical analysis of the sorbent, a two-zone expanding grain model was applied and gave an excellent fit to the carbonation reaction rate data at various operating conditions.
The modeling results indicate that more than 90 percent purification of hydrogen is achievable, either by increasing the activity of the sorbent towards the water-gas shift reaction or by mixing the sorbent bed with a commercial water-gas shift catalyst. A preliminary economic evaluation of the MgO-based process indicates that it can be economically viable compared to the commercially available WGS/Selexol™ processes.
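The core equilibrium argument (continuously removing CO2 pulls CO + H2O ⇌ CO2 + H2 toward hydrogen) can be checked numerically. A hedged sketch with an assumed equilibrium constant and feed composition; none of these numbers come from the dissertation:

```python
def wgs_extent(K, co0, h2o0, co2_0, h2_0, co2_fixed=None):
    """Bisection solve for the extent x of CO + H2O <-> CO2 + H2 at equilibrium.
    If co2_fixed is given, the sorbent is assumed to hold gas-phase CO2 at that
    level regardless of reaction progress (idealized carbonation)."""
    lo, hi = 0.0, min(co0, h2o0) - 1e-9
    for _ in range(200):
        x = 0.5 * (lo + hi)
        co2 = co2_fixed if co2_fixed is not None else co2_0 + x
        q = (co2 * (h2_0 + x)) / ((co0 - x) * (h2o0 - x))
        if q < K:   # reaction quotient below K: equilibrium lies further right
            lo = x
        else:
            hi = x
    return x

K = 11.7  # approximate WGS equilibrium constant near 400 C (assumed value)
feed = dict(co0=0.40, h2o0=0.40, co2_0=0.10, h2_0=0.10)  # illustrative syngas

x_plain = wgs_extent(K, **feed)                  # equilibrium, no sorbent
x_sorb = wgs_extent(K, **feed, co2_fixed=0.005)  # sorbent keeps CO2 near zero
print(f"CO conversion, equilibrium only:  {x_plain / feed['co0']:.0%}")
print(f"CO conversion with CO2 removal:   {x_sorb / feed['co0']:.0%}")
```

With CO2 held low, the computed CO conversion jumps from roughly the 70% range toward completion, which is the driving idea behind sorption-enhanced hydrogen production.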
NASA Astrophysics Data System (ADS)
Gordeev, S. I.; Bogatova, T. F.; Ryzhkov, A. F.
2017-11-01
Raising the efficiency and environmental friendliness of electric power generation from coal is the aim of numerous research groups today. The traditional approach based on the steam power cycle has approached its efficiency limit, constrained by materials development and maneuverability requirements. The rival approach based on the combined cycle is also drawing nearer to its efficiency limit. However, there is a reserve for efficiency increase in the integrated gasification combined cycle, whose energy efficiency is currently at the level of modern steam-turbine power units; the upper bound on its efficiency is that of the natural gas combined cycle (NGCC). One of the main problems of the IGCC is the higher cost of producing and conditioning fuel gas for the gas turbine unit (GTU). It would be reasonable to decrease the amount of fuel gas needed in the power unit in order to minimize these costs. This can be achieved by raising the heating value of the fuel gas, its sensible heat, and the sensible heat of the cycle air. Using the process flowsheet of a 500 MW IGCC unit firing Kuznetsk bituminous coal, modeled in the Thermoflex software, the influence of the developed technical solutions on plant efficiency is examined. The results show that raising the steam-air blast temperature to 900°C increases conversion efficiency to 84.2%. Raising the fuel gas clean-up temperature to 900°C increases gross/net IGCC efficiency by 3.42%. Cycle air heating reduces the need for fuel gas by 40% and raises gross/net IGCC efficiency by 0.85-1.22%. The proposed solutions allow the IGCC to exceed the net efficiency of comparable plants by 1.8-2.3%.
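The cycle-air heating claim lends itself to a back-of-envelope check: if the combustor must reach a fixed turbine inlet temperature, hotter incoming air displaces fuel-gas heat. A sketch with assumed specific heat, flow, temperatures, and heating value (not the paper's Thermoflex data):

```python
cp_air = 1.1      # kJ/(kg*K), mean specific heat of air over the range (assumed)
m_air = 1000.0    # kg/s cycle air flow (assumed)
t_inlet = 1500.0  # combustor exit / turbine inlet temperature, deg C (assumed)
lhv = 5000.0      # kJ/kg, lower heating value of low-calorific fuel gas (assumed)

def fuel_gas_flow(t_air_in):
    """Fuel-gas flow needed to raise the air stream to the combustor exit
    temperature (fuel mass and dissociation neglected: rough estimate only)."""
    q_needed = m_air * cp_air * (t_inlet - t_air_in)  # kW
    return q_needed / lhv                             # kg/s

base = fuel_gas_flow(t_air_in=400.0)       # compressor discharge, no preheat
preheated = fuel_gas_flow(t_air_in=850.0)  # with cycle-air heating
print(f"Fuel-gas demand reduced by {1 - preheated / base:.0%}")
```

With these assumed numbers the reduction comes out near the 40% figure quoted in the abstract, though the real value depends on the actual cycle parameters.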
Technical and economic assessments of commercial success for IGCC technology in China
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiong, T.
1998-07-01
The experience gained from several Integrated Gasification Combined Cycle (IGCC) demonstration plants operating in the US and Europe facilitates commercial success of this advanced coal-based power generation technology. However, commercialization of coal-based IGCC technology in the West, particularly in the US, is restricted by the low price of natural gas. By contrast, in China--the largest coal producer and consumer in the world--a lack of natural gas supply, strong demand for air pollution control, and relatively low costs of manufacturing and construction provide tremendous opportunities for IGCC applications. The first Chinese IGCC demonstration project was initiated in 1994, and other potential IGCC projects are in planning. IGCC applications in re-powering, fuel switching, and multi-generation also show great market potential in China. However, questions for IGCC development in China remain: where are the realistic opportunities for IGCC projects, and how can these opportunities be converted into commercial success? The answers to these questions should focus on Chinese market needs and emphasize economic benefits, not just clean power. The high price of imported equipment, high financing costs, and the technical risk of first-of-a-kind installation impede IGCC development in China. This paper presents preliminary technical and economic assessments for four typical IGCC applications in the Chinese marketplace: central power station, fuel switching, re-powering, and multi-generation. The major factors affecting project economics--such as plant cost, financing, prices of fuel and electricity, and operating capacity factor--are analyzed. The results indicate that well-proven technology for versatile applications, preferred financing, reduction of plant cost, environmental superiority, and an appropriate project structure are key to the commercial success of IGCC in China.
Improved system integration for integrated gasification combined cycle (IGCC) systems.
Frey, H Christopher; Zhu, Yunhua
2006-03-01
Integrated gasification combined cycle (IGCC) systems are a promising technology for power generation. They include an air separation unit (ASU), a gasification system, and a gas turbine combined cycle power block, and feature competitive efficiency and lower emissions compared to conventional power generation technology. IGCC systems are not yet in widespread commercial use and opportunities remain to improve system feasibility via improved process integration. A process simulation model was developed for IGCC systems with alternative types of ASU and gas turbine integration. The model is applied to evaluate integration schemes involving nitrogen injection, air extraction, and combinations of both, as well as different ASU pressure levels. The optimal nitrogen injection only case in combination with an elevated pressure ASU had the highest efficiency and power output and approximately the lowest emissions per unit output of all cases considered, and thus is a recommended design option. The optimal combination of air extraction coupled with nitrogen injection had slightly worse efficiency, power output, and emissions than the optimal nitrogen injection only case. Air extraction alone typically produced lower efficiency, lower power output, and higher emissions than all other cases. The recommended nitrogen injection only case is estimated to provide annualized cost savings compared to a nonintegrated design. Process simulation modeling is shown to be a useful tool for evaluation and screening of technology options.
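The screening logic in the abstract (comparing integration schemes on efficiency, power output, and emissions per unit output, then recommending a design) can be sketched as a simple ranking. The case names follow the abstract; the numbers are illustrative placeholders, not the paper's results:

```python
# Assumed screening metrics for each integration scheme: net efficiency (frac),
# net power output (MW), and emissions per unit output (arbitrary units).
cases = {
    "nitrogen injection + elevated-pressure ASU": {"eff": 0.395, "mw": 285, "emis": 0.050},
    "air extraction + nitrogen injection":        {"eff": 0.392, "mw": 280, "emis": 0.052},
    "air extraction only":                        {"eff": 0.385, "mw": 270, "emis": 0.060},
}

def score(c):
    """Rank by efficiency first, then output, then (lower) emissions."""
    return (c["eff"], c["mw"], -c["emis"])

best = max(cases, key=lambda name: score(cases[name]))
print("Recommended design:", best)
```

A real evaluation would weigh annualized cost as well; the point of the sketch is only the screening structure, not the numbers.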
NASA Technical Reports Server (NTRS)
Nainiger, J. J.; Burns, R. K.; Easley, A. J.
1982-01-01
A performance and operational economics analysis is presented for an integrated-gasifier, combined-cycle (IGCC) cogeneration system sized to meet steam and baseload electrical requirements. The effect of time variations in steam and electrical requirements is included. The amount and timing of electricity purchases from and sales to the electric utility are determined. The resulting expenses for purchased electricity and revenues from electricity sales are estimated by using an assumed utility rate structure model. Cogeneration results for a range of potential IGCC cogeneration system sizes are compared with the fuel consumption and costs of natural gas and electricity to meet requirements without cogeneration. The results indicate that an IGCC cogeneration system could save about 10 percent of the total fuel energy presently required to supply steam and electrical requirements without cogeneration. Also, for the assumed future fuel and electricity prices, an annual operating cost savings of 21 percent to 26 percent could be achieved with such a cogeneration system. An analysis of the effects of electricity price, fuel price, and system availability indicates that the IGCC cogeneration system has a good potential for economical operation over a wide range of these assumptions.
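The cost comparison described above (cogeneration fuel cost plus utility transactions versus buying gas and grid electricity separately) can be sketched in a few lines. All prices and loads are assumptions for illustration, not values from the report:

```python
def annual_cost_no_cogen(steam_mwh, elec_mwh, gas_price, elec_price, boiler_eff=0.85):
    """Buy natural gas for a steam boiler and electricity from the grid."""
    return steam_mwh / boiler_eff * gas_price + elec_mwh * elec_price

def annual_cost_cogen(coal_mwh_in, coal_price, surplus_mwh, buyback_price):
    """IGCC burns coal to cover both steam and power; surplus power is sold."""
    return coal_mwh_in * coal_price - surplus_mwh * buyback_price

# Assumed annual loads (MWh) and prices ($/MWh thermal or electric)
base = annual_cost_no_cogen(steam_mwh=100_000, elec_mwh=120_000,
                            gas_price=30.0, elec_price=60.0)
cogen = annual_cost_cogen(coal_mwh_in=590_000, coal_price=15.0,
                          surplus_mwh=20_000, buyback_price=40.0)
savings = 1 - cogen / base
print(f"Annual operating cost savings: {savings:.0%}")
```

With these placeholder numbers the savings land in the same general range the abstract reports, but the real figure depends entirely on the rate structure and fuel prices assumed.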
Producing fired bricks using coal slag from a gasification plant in Indiana
Chen, L.-M.; Chou, I.-Ming; Chou, S.-F.J.; Stucki, J.W.
2009-01-01
Integrated gasification combined cycle (IGCC) is a promising power generation technology which increases the efficiency of coal-to-power conversion and enhances carbon dioxide concentration in exhaust emissions for better greenhouse gas capture. Two major byproducts from IGCC plants are bottom slag and sulfur. The sulfur can be processed into commercially viable products, but high value applications need to be developed for the slag material in order to improve economics of the process. The purpose of this study was to evaluate the technical feasibility of incorporating coal slag generated by the Wabash River IGCC plant in Indiana as a raw material for the production of fired bricks. Full-size bricks containing up to 20 wt% of the coal slag were successfully produced at a bench-scale facility. These bricks have color and texture similar to those of regular fired bricks and their water absorption properties met the ASTM specifications for a severe weathering grade. Other engineering properties tests, including compressive strength tests, are in progress.
AO13. High energy, low methane syngas from low-rank coals for coal-to-liquids production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lucero, Andrew; Goyal, Amit; McCabe, Kevin
2015-06-30
An experimental program was undertaken to develop and demonstrate, at lab scale, novel steam reforming catalysts for converting tars, C2+ hydrocarbons, and methane in high-temperature, sulfur-laden environments. Several catalysts were developed and synthesized, along with some catalysts based on recipes found in the literature. Of these, two had good resistance at 90 ppm H2S, with one almost unaffected. Higher concentrations of H2S did affect methane conversion across the catalyst, but performance was fairly stable for up to 200 hours. Based on the results of the experimental program, a techno-economic analysis was developed for IGCC and CTL applications and compared to DOE reference cases to examine the effects of the new technology. In the IGCC cases, the reformer/POX system produces nearly the same amount of electricity for nearly the same cost; however, the reformer/POX case sequesters a higher percentage of the carbon when compared to IGCC alone. For the CTL case, the economics of the new process were nearly identical to the baseline CTL case, but due to improved yields, the greenhouse gas emissions for a given production of fuels were approximately 50% less than the baseline case.
Recycling of residual IGCC slags and their benefits as degreasers in ceramics.
Iglesias Martín, I; Acosta Echeverría, A; García-Romero, E
2013-11-15
This work studies the evolution of IGCC slag grains within a ceramic matrix fired at different temperatures to investigate the effect of using IGCC slag as a degreaser. Pressed ceramic specimens from two clay mixtures are used in this study. The M1 mixture is composed of standard clays, whereas the M2 mixture is composed of the same clays as M1 but contains 15% by weight IGCC slag. The amount of IGCC slag added coincides with the amount of slag typically used as a degreaser in the ceramic industry. Specimens are fired at 950 °C, 1000 °C, 1050 °C, 1100 °C and 1150 °C. The mineralogical composition and the IGCC slag grain shape within the ceramic matrix are determined by X-ray diffraction, polarized light microscopy and scanning electron microscopy. The results reveal that the surface of the slag grains is welded to the ceramic matrix while the quartz grains are separated, which causes increased water absorption and reduces the mechanical strength. IGCC slag, however, reduces water absorption. This behaviour is due to the softening temperature of the slag. This property is quite important from an industrial viewpoint because IGCC slag can serve as an alternative to traditional degreasing agents in the ceramic building industry. Additionally, using IGCC slag allows for the transformation of waste into a secondary raw material, thereby avoiding disposal at landfills; moreover, these industrial wastes are made inert and improve the properties of ceramics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-12-01
A report written by the leading US and Chinese experts in Integrated Gasification Combined Cycle (IGCC) power plants, intended for high-level decision makers, may greatly accelerate the development of an IGCC demonstration project in the People's Republic of China (PRC). The potential market for IGCC systems in China and the competitiveness of IGCC technology with other clean coal options for China have been analyzed in the report. Such information will be useful not only to the Chinese government but also to US vendors and companies. The goal of this report is to analyze the energy supply structure of China, China's energy and environmental protection demand, and the potential market in China in order to make a justified and reasonable assessment of the feasibility of transferring US Clean Coal Technologies to China. The Expert Report was developed and written by the joint US/PRC IGCC experts and will be presented to the State Planning Commission (SPC) by the President of the CAS to ensure consideration of the importance of IGCC for future PRC power production.
NASA Astrophysics Data System (ADS)
Rani, Abha; Singh, Udayan; Jayant; Singh, Ajay K.; Sankar Mahapatra, Siba
2017-07-01
Coal gasification processes are crucial to decarbonisation in the power sector. While underground coal gasification (UCG) and integrated gasification combined cycle (IGCC) differ in the site of gasification, they have considerable similarities in the types of gasifiers used. UCG also offers some additional advantages, such as reduction of the fugitive methane emissions accompanying the coal mining process. Nevertheless, simulation of IGCC plants involving surface coal gasification is likely to give a reasonable indication of the 3E (efficiency, economics and emissions) prospects of the gasification pathway towards electricity. This paper aims at estimating the 3E impacts of gasification processes using simulations carried out in the Integrated Environmental Control Model (IECM) software framework. Key plant-level controls studied in this paper are based on Indian financial regulations and operating costs specific to the country. The impacts of CO2 capture and storage (CCS) in these plants are also studied. The various parameters that can be studied include plant load factor, impact of coal quality and price, type of CO2 capture process, capital costs, etc. It is hoped that relevant insights into electricity generation from gasification may be obtained with this paper.
CoalFleet RD&D augmentation plan for integrated gasification combined cycle (IGCC) power plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2007-01-15
To help accelerate the development, demonstration, and market introduction of integrated gasification combined cycle (IGCC) and other clean coal technologies, EPRI formed the CoalFleet for Tomorrow initiative, which facilitates collaborative research by more than 50 organizations from around the world representing power generators, equipment suppliers, engineering design and construction firms, the U.S. Department of Energy, and others. This group advised EPRI as it evaluated more than 120 coal-gasification-related research projects worldwide to identify gaps or critical-path activities where additional resources and expertise could hasten the market introduction of IGCC advances. The resulting 'IGCC RD&D Augmentation Plan' describes such opportunities and how they could be addressed, both for IGCC plants to be built in the near term (by 2012-15) and over the longer term (2015-25), when demand for new electric generating capacity is expected to soar. For the near term, EPRI recommends 19 projects that could reduce the levelized cost of electricity for IGCC to the level of today's conventional pulverized-coal power plants with supercritical steam conditions and state-of-the-art environmental controls. For the long term, EPRI's recommended projects could reduce the levelized cost of an IGCC plant capturing 90% of the CO2 produced from the carbon in coal (for safe storage away from the atmosphere) to the level of today's IGCC plants without CO2 capture. EPRI's CoalFleet for Tomorrow program is also preparing a companion RD&D augmentation plan for advanced-combustion-based (i.e., non-gasification) clean coal technologies (Report 1013221). 7 refs., 30 figs., 29 tabs., 4 apps.
Baseload coal investment decisions under uncertain carbon legislation.
Bergerson, Joule A; Lave, Lester B
2007-05-15
More than 50% of electricity in the U.S. is generated by coal. The U.S. has large coal resources, and coal is the cheapest fuel in most areas. Coal-fired power plants are likely to continue to provide much of U.S. electricity. However, the type of power plant that should be built is unclear. Technology can reduce pollutant discharges and capture and sequester the CO2 from coal-fired generation. The U.S. Energy Policy Act of 2005 provides incentives for large-scale commercial deployment of Integrated Coal Gasification Combined Cycle (IGCC) systems (e.g., loan guarantees and project tax credits). This analysis examines whether a new coal plant should be Pulverized Coal (PC) or IGCC. Do stricter emissions standards (PM, SO2, NOx, Hg) justify the higher costs of IGCC over PC? How does potential future carbon legislation affect the decision to add carbon capture and storage (CCS) technology? Finally, can the impact of uncertain carbon legislation be minimized? We find that SO2, NOx, PM, and Hg emission standards would have to be far more stringent than twice current standards to justify the increased costs of the IGCC system. A CO2 tax of less than $29/ton would lead companies to continue choosing PC, paying the tax for emitted CO2. The earlier a decision-maker believes the carbon tax will be imposed and the higher the tax, the more likely companies will choose IGCC with CCS. Having government announce the date and level of a carbon tax would promote more sensible decisions, but government would have to use a tax or subsidy to induce companies to choose the technology that is best for society.
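The breakeven-tax reasoning above can be illustrated with a levelized-cost comparison. The costs and emission rates below are assumptions chosen for illustration, not the paper's data, though the resulting breakeven happens to land near the cited range:

```python
def cost_per_mwh(base_cost, emis_t_per_mwh, tax_per_t):
    """Levelized cost including the carbon tax paid on emitted CO2."""
    return base_cost + emis_t_per_mwh * tax_per_t

pc = dict(base=55.0, emis=0.85)   # pulverized coal, no capture (assumed $/MWh, t/MWh)
ccs = dict(base=78.0, emis=0.10)  # IGCC with CCS (assumed $/MWh, t/MWh)

# Tax level at which the two levelized costs cross
breakeven = (ccs["base"] - pc["base"]) / (pc["emis"] - ccs["emis"])
print(f"Breakeven carbon tax: ${breakeven:.0f}/ton CO2")

# Above the breakeven, the capture plant becomes the cheaper choice
tax = 35.0
cheaper = ("IGCC w/CCS"
           if cost_per_mwh(ccs["base"], ccs["emis"], tax)
              < cost_per_mwh(pc["base"], pc["emis"], tax)
           else "PC")
print(f"At ${tax:.0f}/ton, cheaper option: {cheaper}")
```

Below the breakeven, the rational firm builds PC and pays the tax; above it, the capital premium of IGCC with CCS pays for itself, which is the decision structure the paper analyzes.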
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drown, D.P.; Brown, W.R.; Heydorn, E.C.
1997-12-31
The Liquid Phase Methanol (LPMEOH{trademark}) process uses a slurry bubble column reactor to convert syngas (primarily a mixture of carbon monoxide and hydrogen) to methanol. Because of its superior heat management, the process can be designed to directly handle the carbon monoxide (CO)-rich syngas characteristic of the gasification of coal, petroleum coke, residual oil, wastes, or other hydrocarbon feedstocks. When added to an integrated gasification combined cycle (IGCC) power plant, the LPMEOH{trademark} process converts a portion of the CO-rich syngas produced by the gasifier to methanol, and the remainder of the unconverted gas is used to fuel the gas turbine combined-cycle power plant. The LPMEOH{trademark} process has the flexibility to operate in a daily electricity-demand load-following manner. Coproduction of power and methanol via IGCC and the LPMEOH{trademark} process provides opportunities for energy storage for electrical demand peak shaving, clean fuel for export, and/or chemical methanol sales.
Anantharaman, Rahul; Peters, Thijs; Xing, Wen; Fontaine, Marie-Laure; Bredesen, Rune
2016-10-20
Dual phase membranes are highly CO2-selective membranes with an operating temperature above 400 °C. The focus of this work is to quantify the potential of dual phase membranes in pre- and post-combustion CO2 capture processes. The process evaluations show that dual phase membranes integrated with an NGCC power plant for CO2 capture are not competitive with the MEA process for post-combustion capture. However, dual phase membrane concepts outperform the reference Selexol technology for pre-combustion CO2 capture in an IGCC process. The two processes evaluated in this work, post-combustion NGCC and pre-combustion IGCC, represent extremes in the CO2 partial pressure fed to the separation unit. Based on the evaluations it is expected that dual phase membranes could be competitive for post-combustion capture from a pulverized coal fired power plant (PCC) and pre-combustion capture from an Integrated Reforming Combined Cycle (IRCC).
Durán, A; Monteagudo, J M; San Martín, I
2012-05-15
The aim of this work was to study the operation costs of treating a real effluent from an integrated gasification combined cycle (IGCC) power station located in Spain. The study compares different homogeneous photocatalytic processes on a pilot plant scale using different types of radiation (artificial UV or solar UV with a compound parabolic collector). The efficiency of the processes was evaluated by an analysis of the total organic carbon (TOC) removed. The following processes were considered in the study: (i) a photo-Fenton process at an artificial UV pilot plant (with the initial addition of H2O2), (ii) a modified photo-Fenton process with continuous addition of H2O2 and O2 to the system and (iii) a ferrioxalate-assisted solar photo-Fenton process at a compound parabolic collector (CPC) pilot plant. The efficiency of these processes in degrading pollutants has been studied previously, and the results obtained in each of those studies have been published elsewhere. The operational costs due to the consumption of electrical energy, reagents and catalysts were calculated from the optimal conditions of each process. The results showed that the solar photo-Fenton system was economically feasible, being able to achieve up to 75% mineralization with a total cost of 6 €/m3, which can be reduced to 3.6 €/m3 by subtracting the electrical costs because the IGCC plant is self-sufficient in terms of energy. Copyright © 2011 Elsevier Ltd. All rights reserved.
Kobayashi, Makoto; Akiho, Hiroyuki
2017-12-01
Electricity production from coal with a minimal efficiency penalty for carbon dioxide abatement would provide sustainable and compatible energy utilization. One promising option is oxy-fuel type Integrated Gasification Combined Cycle (oxy-fuel IGCC) power generation, which is estimated to achieve a thermal efficiency of 44% on a lower heating value (LHV) basis and to deliver compressed carbon dioxide (CO2) at a concentration of 93 vol%. Proper operation of the plant is established by introducing dry syngas cleaning processes that control halide and sulfur compounds to within the contaminant levels tolerated by the gas turbine. To realize the dry process, a bench-scale test facility was planned to demonstrate the first-ever halide and sulfur removal with a fixed bed reactor using actual syngas from an O2-CO2 blown gasifier for oxy-fuel IGCC power generation. Design parameters for the test facility were required for the candidate halide and sulfur removal sorbents. Breakthrough tests were performed on two kinds of halide sorbents under accelerated conditions and on a honeycomb desulfurization sorbent under varied space velocity conditions. Both the halide and sulfur sorbents exhibited sufficient removal within a satisfactorily short sorbent bed depth, as well as superior bed conversion of the impurity removal reaction. These performance evaluations of the candidate halide and sulfur removal sorbents provided rational and affordable design parameters for the bench-scale test facility to demonstrate the dry syngas cleaning process for the oxy-fuel IGCC system as the scaled-up step of process development. Copyright © 2017 Elsevier Ltd. All rights reserved.
Scoping Studies to Evaluate the Benefits of an Advanced Dry Feed System on the Use of Low-Rank Coal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rader, Jeff; Aguilar, Kelly; Aldred, Derek
2012-03-30
The purpose of this project was to evaluate the ability of advanced low rank coal gasification technology to cause a significant reduction in the COE for IGCC power plants with 90% carbon capture and sequestration compared with the COE for similarly configured IGCC plants using conventional low rank coal gasification technology. GE's advanced low rank coal gasification technology uses the Posimetric Feed System, a new dry coal feed system based on GE's proprietary Posimetric Feeder. In order to demonstrate the performance and economic benefits of the Posimetric Feeder in lowering the cost of low rank coal-fired IGCC power with carbon capture, two case studies were completed. In the Base Case, the gasifier was fed a dilute slurry of Montana Rosebud PRB coal using GE's conventional slurry feed system. In the Advanced Technology Case, the slurry feed system was replaced with the Posimetric Feed System. The process configurations of both cases were kept the same, to the extent possible, in order to highlight the benefit of substituting the Posimetric Feed System for the slurry feed system.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-20
... DEPARTMENT OF AGRICULTURE Rural Utilities Service South Mississippi Electric Cooperative: Plant Ratcliff, Kemper County Integrated Gasification Combined-Cycle (IGCC) Project AGENCY: Rural Utilities... Combined-Cycle (IGCC) Project currently under construction in Kemper County, Mississippi (hereinafter ``the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sullivan, Kevin; Anasti, William; Fang, Yichuan
The main purpose of this project is to look at technologies and philosophies that would help reduce the costs of an Integrated Gasification Combined Cycle (IGCC) plant, increase its availability, or both. GE's approach to this problem is to consider options in three different areas: 1) technology evaluations and development; 2) constructability approaches; and 3) design and operation methodologies. Five separate tasks were identified that fall under the three areas: Task 2 – Integrated Operations Philosophy; Task 3 – Slip Forming of IGCC Components; Task 4 – Modularization of IGCC Components; Task 5 – Fouling Removal; and Task 6 – Improved Slag Handling. Overall, this project produced results on many fronts. Some of the ideas could be utilized immediately by those seeking to build an IGCC plant in the near future. These include the considerations from the Integrated Operations Philosophy task and the different construction techniques of Slip Forming and Modularization (especially if the proposed site is in a remote location or lacks a skilled workforce). Other results include ideas for promising technologies that require further development and testing to realize their full potential and be available for commercial operation. In both areas GE considers this project to be a success in identifying areas outside the core IGCC plant systems that are ripe for cost reduction and availability improvement opportunities.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-26
... Ratcliffe, Kemper County Integrated Gasification Combined-Cycle (IGCC) Project AGENCY: Rural Utilities... Plant Ratcliffe, an Integrated Gasification Combined-Cycle Facility located in Kemper County... Company (MPCo), and will demonstrate the feasibility of the Integrated Gasification Combined-Cycle (IGCC...
TECHNOECONOMIC APPRAISAL OF INTEGRATED GASIFICATION COMBINED-CYCLE POWER GENERATION
The report is a technoeconomic appraisal of the integrated (coal) gasification combined-cycle (IGCC) system. Although not yet a proven commercial technology, IGCC is a future competitive technology to current pulverized-coal boilers equipped with SO2 and NOx controls, because of i...
Advanced IGCC/Hydrogen Gas Turbine Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
York, William; Hughes, Michael; Berry, Jonathan
2015-07-30
The objective of this program was to develop the technologies required for a fuel-flexible (coal-derived hydrogen or syngas) gas turbine for IGCC that met DOE turbine performance goals. The overall DOE Advanced Power System goal was to conduct the research and development (R&D) necessary to produce coal-based IGCC power systems with high efficiency, near-zero emissions, and competitive capital cost. To meet this goal, the DOE Fossil Energy Turbine Program had an interim objective of 2 to 3 percentage points improvement in combined cycle (CC) efficiency. The final goal was 3 to 5 percentage points improvement in CC efficiency above the state of the art for CC turbines in IGCC applications at the time the program started. These efficiency goals were to be met with NOx emissions of less than 2 ppm (@15% O2). As a result of the technologies developed under this program, the DOE goals were exceeded with a projected 8 point efficiency improvement. In addition, a new combustion technology was conceived and developed to overcome the challenges of burning hydrogen and achieving the DOE's NOx goal. This report also covers the developments under the ARRA-funded portion of the program, which include gas turbine technology advancements for improvement in the efficiency, emissions, and cost performance of gas turbines for industrial applications with carbon capture and sequestration. Example applications could be cement plants, chemical plants, refineries, steel and aluminum plants, manufacturing facilities, etc. The DOE's goal of more than 5 percentage points improvement in efficiency was met with cycle analyses performed for representative IGCC Steel Mill and IGCC Refinery applications.
Technologies were developed in this program under the following areas: combustion, larger latter stage buckets, CMC and EBC, advanced materials and coatings, advanced configurations to reduce cooling, sealing and rotor purge flows, turbine aerodynamics, advanced sensors, advancements in first stage hot gas path components, and systems analyses to determine the benefits of all previously mentioned technologies to a gas turbine system in an IGCC configuration. This project built on existing gas turbine technology and product developments, and developed and validated the necessary turbine-related technologies and sub-systems needed to meet the DOE turbine program goals. The scope of the program did not cover the design and validation of a full-scale prototype machine with the technology advances from this program incorporated. In summary, the DOE goals were met with this program. While the commercial landscape has not resulted in a demand for IGCC gas turbines, many of the technologies developed over the course of the program are benefiting the US by being applied to new higher-efficiency natural-gas-fueled gas turbines.
Prospects for the use of SMR and IGCC technologies for power generation in Poland
NASA Astrophysics Data System (ADS)
Wyrwa, Artur; Suwała, Wojciech
2017-11-01
This study is a preliminary assessment of prospects for new power generation technologies that are of particular interest in Poland. We analysed the economic competitiveness of small-size integrated gasification combined cycle (IGCC) units and small modular reactors (SMR). For comparison we used one of the most widely applied and universal metrics, the Levelized Cost of Electricity (LCOE). The LCOE results were complemented with the results of the energy-economic model TIMES-PL in order to analyse the economic viability of these technologies under the operating regime of the entire power system. The results show that, with the techno-economic assumptions presented in the paper, SMRs are a more competitive option than small IGCC units.
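The LCOE metric used in the comparison above reduces to a short calculation: annualized capital plus fixed O&M spread over yearly generation, plus variable O&M and fuel. A minimal sketch; the discount rate and all plant inputs below are hypothetical placeholders, not the paper's assumptions for SMR or IGCC:

```python
# Levelized Cost of Electricity (LCOE) sketch. All inputs are illustrative.

def crf(rate, years):
    """Capital recovery factor: annualizes an overnight capital cost."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def lcoe(capex_per_kw, fom_per_kw_yr, vom_per_mwh,
         heat_rate_mmbtu_per_mwh, fuel_per_mmbtu,
         capacity_factor, rate=0.08, years=30):
    """$/MWh: (annualized capital + fixed O&M) per MWh generated,
    plus variable O&M and fuel cost."""
    mwh_per_kw_yr = 8.76 * capacity_factor            # MWh per kW per year
    fixed = (capex_per_kw * crf(rate, years) + fom_per_kw_yr) / mwh_per_kw_yr
    return fixed + vom_per_mwh + heat_rate_mmbtu_per_mwh * fuel_per_mmbtu

# Hypothetical small-unit inputs: $2000/kW capex, $50/kW-yr fixed O&M,
# $7/MWh variable O&M, 8.8 MMBtu/MWh heat rate, $2/MMBtu fuel, 80% CF.
print(f"LCOE ~ {lcoe(2000, 50, 7, 8.8, 2.0, 0.80):.1f} $/MWh")
```

Comparing two technologies then amounts to evaluating this function with each one's cost and performance inputs.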
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhattacharyya, D.; Turton, R.; Zitney, S.
In this presentation, development of a plant-wide dynamic model of an advanced Integrated Gasification Combined Cycle (IGCC) plant with CO2 capture will be discussed. The IGCC reference plant generates 640 MWe of net power using Illinois No. 6 coal as the feed. The plant includes an entrained, downflow, General Electric Energy (GEE) gasifier with a radiant syngas cooler (RSC), a two-stage water gas shift (WGS) conversion process, and two advanced 'F' class combustion turbines partially integrated with an elevated-pressure air separation unit (ASU). A subcritical steam cycle is considered for heat recovery steam generation. Syngas is selectively cleaned by a SELEXOL acid gas removal (AGR) process. Sulfur is recovered using a two-train Claus unit with tail gas recycle to the AGR. A multistage intercooled compressor is used for compressing CO2 to the pressure required for sequestration. The plant-wide steady-state and dynamic IGCC simulations have been generated using the Aspen Plus® and Aspen Plus Dynamics® process simulators, respectively. The model is based on the Case 2 IGCC configuration detailed in the study available on the NETL website. The GEE gasifier is represented with a restricted equilibrium reactor model where the temperature approach to equilibrium for individual reactions can be modified based on experimental data. In this radiant-only configuration, the syngas from the RSC is quenched in a scrubber. The blackwater from the scrubber bottom is further cleaned in the blackwater treatment plant. The cleaned water is returned to the scrubber and also used for slurry preparation. The acid gas from the sour water stripper (SWS) is sent to the Claus plant. The syngas from the scrubber passes through a sour shift process.
The WGS reactors are modeled as adiabatic plug flow reactors with rigorous kinetics based on the mid-life activity of the shift catalyst. The SELEXOL unit consists of the H2S and CO2 absorbers that are designed to meet the stringent environmental limits and the requirements of other associated units. The model also considers the stripper for recovering H2S, which is sent as a feed to a split-flow Claus unit. The tail gas from the Claus unit is recycled to the SELEXOL unit. The cleaned syngas is sent to the GE 7FB gas turbine. This turbine is modeled as per published data in the literature. Diluent N2 from the elevated-pressure ASU is used for reducing NOx formation. The heat recovery steam generator (HRSG) is modeled by considering generation of high-pressure, intermediate-pressure, and low-pressure steam. All of the vessels, reactors, heat exchangers, and columns have been sized. The basic IGCC process control structure has been synthesized following standard guidelines and existing practices. The steady-state simulation is solved in sequential-modular mode in Aspen Plus® and consists of more than 300 unit operations, 33 design specs, and 16 calculator blocks. The equation-oriented dynamic simulation consists of more than 100,000 equations solved using a multi-step Gear's integrator in Aspen Plus Dynamics®. The challenges faced in solving the dynamic model and key transient results from this dynamic model will also be discussed.
Kumar, Aditya; Shi, Ruijie; Kumar, Rajeeva; Dokucu, Mustafa
2013-04-09
Control system and method for controlling an integrated gasification combined cycle (IGCC) plant are provided. The system may include a controller coupled to a dynamic model of the plant to process a prediction of plant performance and determine a control strategy for the IGCC plant over a time horizon subject to plant constraints. The control strategy may include control functionality to meet a tracking objective and control functionality to meet an optimization objective. The control strategy may be configured to prioritize the tracking objective over the optimization objective based on a coordinate transformation, such as an orthogonal or quasi-orthogonal projection. A plurality of plant control knobs may be set in accordance with the control strategy to generate a sequence of coordinated multivariable control inputs to meet the tracking objective and the optimization objective subject to the prioritization resulting from the coordinate transformation.
Method and system to estimate variables in an integrated gasification combined cycle (IGCC) plant
Kumar, Aditya; Shi, Ruijie; Dokucu, Mustafa
2013-09-17
System and method to estimate variables in an integrated gasification combined cycle (IGCC) plant are provided. The system includes a sensor suite to measure respective plant input and output variables. An extended Kalman filter (EKF) receives sensed plant input variables and includes a dynamic model to generate a plurality of plant state estimates and a covariance matrix for the state estimates. A preemptive-constraining processor is configured to preemptively constrain the state estimates and covariance matrix to be free of constraint violations. A measurement-correction processor may be configured to correct constrained state estimates and a constrained covariance matrix based on processing of sensed plant output variables. The measurement-correction processor is coupled to update the dynamic model with corrected state estimates and a corrected covariance matrix. The updated dynamic model may be configured to estimate values for at least one plant variable not originally sensed by the sensor suite.
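The predict / preemptively-constrain / measurement-correct cycle described in this patent abstract can be sketched in heavily simplified scalar form. A real EKF linearizes a nonlinear plant model and propagates a full covariance matrix; here a linear scalar stand-in is used, and all dynamics, noise levels, and bounds are made up:

```python
# Scalar Kalman filter step with a preemptive constraint on the state
# estimate, loosely mirroring the predict / constrain / correct structure
# described in the abstract. All numbers are illustrative.

def kf_step(x, P, z, a=0.95, q=0.01, h=1.0, r=0.04, lo=0.0, hi=1.0):
    # Predict through the (linear stand-in for a) plant model.
    x_pred = a * x
    P_pred = a * a * P + q
    # Preemptively constrain the estimate before using the measurement.
    x_pred = min(max(x_pred, lo), hi)
    # Measurement correction with gain K.
    K = P_pred * h / (h * h * P_pred + r)
    x_new = x_pred + K * (z - h * x_pred)
    P_new = (1.0 - K * h) * P_pred
    return min(max(x_new, lo), hi), P_new

x, P = 0.5, 1.0                      # initial estimate and covariance
for z in [0.62, 0.60, 0.61]:         # synthetic sensor readings
    x, P = kf_step(x, P, z)
print(x, P)   # estimate moves toward the ~0.6 readings; covariance shrinks
```

The constraint clipping before the correction step is the scalar analogue of the "preemptive-constraining processor" keeping state estimates free of constraint violations.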
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zitney, S.E.
This presentation will examine process systems engineering R&D needs for application to advanced fossil energy (FE) systems and highlight ongoing research activities at the National Energy Technology Laboratory (NETL) under the auspices of a recently launched Collaboratory for Process & Dynamic Systems Research. The three current technology focus areas include: 1) High-fidelity systems with NETL's award-winning Advanced Process Engineering Co-Simulator (APECS) technology for integrating process simulation with computational fluid dynamics (CFD) and virtual engineering concepts, 2) Dynamic systems with R&D on plant-wide IGCC dynamic simulation, control, and real-time training applications, and 3) Systems optimization including large-scale process optimization, stochastic simulation for risk/uncertainty analysis, and cost estimation. Continued R&D aimed at these and other key process systems engineering models, methods, and tools will accelerate the development of advanced gasification-based FE systems and produce increasingly valuable outcomes for DOE and the Nation.
Capture-ready power plants - options, technologies and economics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bohm, M.C.
2006-06-15
A plant can be considered to be capture-ready if at some point in the future it can be retrofitted for carbon capture and sequestration and still be economical to operate. The first part of the thesis outlines the two major designs that are being considered for construction in the near-term - pulverized coal (PC) and integrated gasification/combined cycle (IGCC). It details the steps that are necessary to retrofit each of these plants for CO{sub 2} capture and sequestration and assesses the steps that can be taken to reduce the costs and output de-rating of the plant after a retrofit. The second part of the thesis evaluates the lifetime (40 year) net present value (NPV) costs of plants with differing levels of pre-investment for CO{sub 2} capture. Three scenarios are evaluated - a baseline supercritical PC plant, a baseline IGCC plant and an IGCC plant with pre-investment for capture. The results of this thesis show that a baseline PC plant is the most economical choice under low CO{sub 2} tax rates, and IGCC plants are preferable at higher tax rates. The third part of this thesis evaluates the concept of CO{sub 2} 'lock-in'. CO{sub 2} lock-in occurs when a newly built plant is so prohibitively expensive to retrofit for CO{sub 2} capture that it will never be retrofitted for capture, and offers no economic opportunity to reduce the CO{sub 2} emissions from the plant, besides shutting down or rebuilding. The results show that IGCC plants are expected to have lower lifetime CO{sub 2} emissions than a PC plant, given moderate ($10-35/ton CO{sub 2}) initial tax rates. Higher (above $40) or lower (below $7) initial tax rates do not result in significant differences in lifetime CO{sub 2} emissions from these plants. Little difference is seen in the lifetime CO{sub 2} emissions between the IGCC plants with and without pre-investment for CO{sub 2} capture. 32 refs., 22 figs., 20 tabs., 1 app.
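The lifetime-NPV comparison the thesis performs can be sketched as a discounted sum of capital, annual operating cost, and a CO2 tax that begins in some future year. The figures below are placeholders, not the thesis's inputs:

```python
# 40-year NPV cost of a plant under a CO2 tax starting in a given year.
# All inputs are illustrative, not values from the thesis.

def npv_cost(capex, annual_opex, annual_co2_tons, tax_per_ton,
             tax_start_year, rate=0.08, life=40):
    """Total discounted cost: capital up front, O&M every year,
    CO2 tax only from tax_start_year onward."""
    total = capex
    for t in range(1, life + 1):
        cash = annual_opex
        if t >= tax_start_year:
            cash += annual_co2_tons * tax_per_ton
        total += cash / (1 + rate) ** t
    return total

# Hypothetical comparison: a cheaper high-emitting plant vs a costlier
# low-emitting plant, with a $30/ton tax arriving in year 10.
pc   = npv_cost(capex=800e6,  annual_opex=120e6, annual_co2_tons=3.0e6,
                tax_per_ton=30, tax_start_year=10)
igcc = npv_cost(capex=1100e6, annual_opex=130e6, annual_co2_tons=0.4e6,
                tax_per_ton=30, tax_start_year=10)
print(pc > igcc)   # with this tax level and timing, the low-emitter wins
```

Shifting `tax_start_year` later or lowering `tax_per_ton` flips the ranking back toward the cheaper high-emitting plant, which is the lock-in dynamic the thesis examines.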
Tampa Electric Company Polk Power Station IGCC project: Project status
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDaniel, J.E.; Carlson, M.R.; Hurd, R.
1997-12-31
The Tampa Electric Company Polk Power Station is a nominal 250 MW (net) Integrated Gasification Combined Cycle (IGCC) power plant located to the southeast of Tampa, Florida in Polk County, Florida. This project is being partially funded under the Department of Energy's Clean Coal Technology Program pursuant to a Round II award. The Polk Power Station uses oxygen-blown, entrained-flow IGCC technology licensed from Texaco Development Corporation to demonstrate significant reductions of SO{sub 2} and NO{sub x} emissions when compared to existing and future conventional coal-fired power plants. In addition, this project demonstrates the technical feasibility of commercial-scale IGCC and Hot Gas Clean Up (HGCU) technology. The Polk Power Station achieved 'first fire' of the gasification system on schedule in mid-July 1996. Since that time, significant advances have occurred in the operation of the entire IGCC train. This paper addresses the operating experiences from the start-up and shakedown phase of the plant. Also, with the plant declared in commercial operation as of September 30, 1996, the paper discusses the challenges encountered in the early phases of commercial operation. Finally, the future plans for improving the reliability and efficiency of the Unit in the first quarter of 1997 and beyond, as well as plans for future alternate fuel test burns, are detailed. The presentation features an up-to-the-minute update on actual performance parameters achieved by the Polk Power Station. These parameters include overall Unit capacity, heat rate, and availability. In addition, the current status of the start-up activities for the HGCU portion of the plant is discussed.
Membrane-based systems for carbon capture and hydrogen purification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berchtold, Kathryn A
2010-11-24
This presentation describes the activities being conducted at Los Alamos National Laboratory to develop carbon capture technologies for power systems. This work is aimed at continued development and demonstration of a membrane based pre- and post-combustion carbon capture technology and separation schemes. Our primary work entails the development and demonstration of an innovative membrane technology for pre-combustion capture of carbon dioxide that operates over a broad range of conditions relevant to the power industry while meeting the US DOE's Carbon Sequestration Program goals of 90% CO{sub 2} capture at less than a 10% increase in the cost of energy services. Separating and capturing carbon dioxide from mixed gas streams is a first and critical step in carbon sequestration. To be technically and economically viable, a successful separation method must be applicable to industrially relevant gas streams at realistic temperatures and pressures as well as be compatible with large gas volumes. Our project team is developing polymer membranes based on polybenzimidazole (PBI) chemistries that can purify hydrogen and capture CO{sub 2} at industrially relevant temperatures. Our primary objectives are to develop and demonstrate polymer-based membrane chemistries, structures, deployment platforms, and sealing technologies that achieve the critical combination of high selectivity, high permeability, chemical stability, and mechanical stability all at elevated temperatures (>150 °C) and packaged in a scalable, economically viable, high area density system amenable to incorporation into an advanced Integrated Gasification Combined-Cycle (IGCC) plant for pre-combustion CO{sub 2} capture. Stability requirements are focused on tolerance to the primary synthesis gas components and impurities at various locations in the IGCC process.
Since the process stream compositions and conditions (temperature and pressure) vary throughout the IGCC process, the project is focused on the optimization of a technology that could be positioned upstream or downstream of one or more of the water-gas-shift reactors (WGSRs) or integrated with a WGSR.
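The selectivity/permeability trade-off described above can be illustrated with the basic solution-diffusion flux relation: flux is permeance times the partial-pressure difference across the membrane. The sketch below assumes a hydrogen-selective film (consistent with "purify hydrogen and capture CO2"); the permeance values, selectivity, and stream conditions are all invented for illustration and are not PBI data:

```python
# Solution-diffusion membrane sketch. Permeances and conditions are
# hypothetical; units are left abstract (permeance * bar).

def flux(permeance, p_feed_bar, x_feed, p_perm_bar, x_perm):
    """Partial-pressure-driven flux: permeance times the difference in
    component partial pressure between feed and permeate sides."""
    return permeance * (p_feed_bar * x_feed - p_perm_bar * x_perm)

# Hypothetical shifted syngas at 50 bar: 70% H2, 20% CO2 on the feed side;
# permeate at 1 bar. Assumed H2/CO2 selectivity of 50.
j_h2  = flux(100.0, 50.0, 0.70, 1.0, 0.05)
j_co2 = flux(  2.0, 50.0, 0.20, 1.0, 0.90)
print(j_h2, j_co2)   # H2 permeates rapidly; CO2 stays at feed pressure
```

Keeping the rejected CO2 at feed pressure is what makes such a membrane attractive for pre-combustion capture: less recompression is needed before pipeline transport.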
NASA Technical Reports Server (NTRS)
Nainiger, J. J.; Abbott, J. M.; Burns, R. K.
1981-01-01
In the Cogeneration Technology Alternatives Study (CTAS), a number of advanced coal-fired systems were examined, and systems using an integrated gasifier combined cycle (IGCC) or an atmospheric fluidized bed (AFB) combustor were found to yield attractive results in industrial cogeneration applications. A range of site requirements and cogeneration sizing strategies, using ground rules based on CTAS, were used in comparing an IGCC and an AFB. The effect of time variations in site requirements and the sensitivity to fuel and electricity price assumptions are examined. The economic alternatives of industrial or utility ownership are also considered. The results indicate that the IGCC system has potentially higher fuel and emission savings and could be an attractive option for utility ownership. The AFB steam turbine system has a potentially higher return on investment and could be attractive assuming industrial ownership.
2001-01-01
standards can retrofit with flue-gas-desulfurization systems, use low-sulfur coal, purchase emissions credits, or close. If a power plant's emissions...a flue gas scrubbing device. IGCC technology is even more environmentally friendly. In an IGCC plant, coal is converted into a gaseous fuel, purified...and natural gas have rocketed this industry into the public's spotlight and discussion. Secretary Abraham in a recent speech to the U.S. Chamber of
Novel concepts for the compression of large volumes of carbon dioxide-phase III
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, J. Jeffrey; Allison, Timothy C.; Evans, Neal D.
In the effort to reduce the release of CO2 greenhouse gases to the atmosphere, sequestration of CO2 from Integrated Gasification Combined Cycle (IGCC) and Oxy-Fuel power plants is being pursued. This approach, however, requires significant compression power to boost the pressure to typical pipeline levels. The penalty can be as high as 8-12% on a typical IGCC plant. The goal of this research is to reduce this penalty through novel compression concepts and integration with existing IGCC processes. The primary objective of the study of novel CO2 compression concepts is to reliably boost the pressure of CO2 to pipeline pressures with the minimal amount of energy required. Fundamental thermodynamics were studied to explore pressure rise in both liquid and gaseous states. For gaseous compression, the project investigated novel methods to compress CO2 while removing the heat of compression internal to the compressor. The high pressure ratio, due to the delivery pressure of the CO2 for enhanced oil recovery, results in significant heat of compression. Since less energy is required to boost the pressure of a cooler gas stream, both upstream and inter-stage cooling is desirable. While isothermal compression has been utilized in some services, it has not been optimized for the IGCC environment. Phase I of this project determined the optimum compressor configuration and developed technology concepts for internal heat removal. Other compression options using liquefied CO2 and cryogenic pumping were explored as well. Preliminary analysis indicated up to a 35% reduction in power is possible with the new concepts being considered. In the Phase II program, two experimental test rigs were developed to investigate the two concepts further. A new pump loop facility was constructed to qualify a cryogenic turbopump for use on liquid CO2. Also, an internally cooled compressor diaphragm was developed and tested in a closed loop compressor facility using CO2.
Both test programs successfully demonstrated good performance and mechanical behavior. In Phase III, a pilot compression plant consisting of a multi-stage centrifugal compressor with cooled diaphragm technology has been designed, constructed, and tested. Comparative testing of adiabatic and cooled tests at equivalent inlet conditions shows that the cooled diaphragms reduce power consumption by 3-8% when the compressor is operated as a back-to-back unit and by up to 9% when operated as a straight-through compressor with no intercooler. The power savings, heat exchanger effectiveness, and temperature drops for the cooled diaphragm were all slightly higher than predicted values but showed the same trends.
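The benefit of removing heat during compression, which motivates the cooled-diaphragm work above, can be seen from ideal-gas work expressions: for the same pressure ratio, isothermal compression requires less work than adiabatic. A rough sketch using an ideal-gas approximation for CO2 (real CO2 is strongly non-ideal near pipeline pressures, so the numbers are only qualitative, and the chosen conditions are illustrative):

```python
import math

# Per-mole compression work for CO2, ideal-gas approximation.
# Conditions below are illustrative, not values from the project.

R = 8.314          # gas constant, J/(mol K)
gamma = 1.28       # approximate heat-capacity ratio for CO2
T1 = 313.0         # inlet temperature, K
pr = 10.0          # pressure ratio for one compression section

w_iso = R * T1 * math.log(pr)                                # isothermal
w_ad  = (gamma * R / (gamma - 1)) * T1 * (pr ** ((gamma - 1) / gamma) - 1)

print(f"isothermal {w_iso:.0f} J/mol, adiabatic {w_ad:.0f} J/mol, "
      f"savings {100 * (1 - w_iso / w_ad):.0f}%")
```

Practical cooled compression lands between these two limits, which is consistent with the single-digit-percent savings reported for the cooled diaphragms versus the larger reductions projected for fully integrated cooling concepts.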
Systems Analysis of Physical Absorption of CO2 in Ionic Liquids for Pre-Combustion Carbon Capture.
Zhai, Haibo; Rubin, Edward S
2018-04-17
This study develops an integrated technical and economic modeling framework to investigate the feasibility of ionic liquids (ILs) for precombustion carbon capture. The IL 1-hexyl-3-methylimidazolium bis(trifluoromethylsulfonyl)imide is modeled as a potential physical solvent for CO2 capture at integrated gasification combined cycle (IGCC) power plants. The analysis reveals that the energy penalty of the IL-based capture system comes mainly from compression of the process and product streams and from solvent pumping, while the major capital cost components are the compressors and absorbers. On the basis of the plant-level analysis, the cost of CO2 avoided by the IL-based capture and storage system is estimated to be $63 per tonne of CO2. Technical and economic comparisons between IL- and Selexol-based capture systems at the plant level show that an IL-based system could be a feasible option for CO2 capture. Improving the CO2 solubility of ILs can simplify the capture process configuration and lower the process energy and cost penalties to further enhance the viability of this technology.
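The "$63 per tonne" figure above is a cost of CO2 avoided, a standard plant-level metric: the increase in levelized cost divided by the reduction in emitted CO2 per MWh. A sketch of the computation; the inputs below are hypothetical, not the study's values:

```python
# Cost of CO2 avoided: extra levelized cost per tonne of emissions kept
# out of the atmosphere, relative to a reference plant without capture.
# Inputs are illustrative only.

def cost_of_co2_avoided(lcoe_ref, lcoe_cap, emis_ref, emis_cap):
    """($/MWh difference) / (tCO2/MWh difference) = $/tonne avoided.
    The denominator uses emitted rates, not the capture rate, so the
    energy penalty of the capture system is reflected in the result."""
    return (lcoe_cap - lcoe_ref) / (emis_ref - emis_cap)

# Hypothetical IGCC without vs with a capture-and-storage system:
avoided = cost_of_co2_avoided(lcoe_ref=76.0, lcoe_cap=121.0,
                              emis_ref=0.80, emis_cap=0.09)
print(f"cost of CO2 avoided ~ ${avoided:.0f}/tonne")
```

Because the capture plant's own energy penalty raises `lcoe_cap` and its residual emissions raise `emis_cap`, both effects push the avoided cost above the simple cost of captured CO2.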
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakhamkin, M.; Patel, M.; Andersson, L.
1992-12-01
A previous study sponsored by EPRI concluded that integrating a compressed-air energy storage (CAES) plant with a coal-gasification system (CGS) can reduce the required capacity and cost of the expensive gasification system. The results showed that when compared at an equal plant capacity, the capital cost of the CGS portion of the integrated CAES/CGS plant can be reduced by as much as 30% relative to the same portion of an integrated gasification combined cycle (IGCC) plant. Furthermore, the capital cost of the CAES/CGS plant, configured as a peaking unit, was found to be slightly lower than that of the base-load IGCC plant. However, the overall economics of the CAES/CGS plant were adversely affected by the low capacity factor of the peak-load service and, ultimately, were found to be less attractive than the IGCC plant. The main objective of this study was to develop and analyze integrated CAES/CGS power plant concepts which provide for continuous (around-the-clock) operation of both the CAES reheat turboexpander train and the CGS facility. The developed concepts also provide utility load-management functions by driving the CAES compressor trains with off-peak electricity supplied through the grid. EPRI contracted with Energy Storage & Power Consultants, Inc. (ESPC) to develop conceptual designs, optimized performance characteristics, and preliminary cost data for these CAES/CGS concepts, and to provide a technical and cost comparison to the IGCC plant. The CAES/CGS concepts developed by ESPC for the current study differ from those of Reference 1.
NASA Astrophysics Data System (ADS)
Bellerive, Nathalie
The research project hypothesis is that CO2 capture and sequestration (CCS) technologies lead to a significant decrease in global warming but increase the impacts in all other categories of the study, because the additional processes used for CO2 capture and sequestration require additional quantities of raw materials and energy. Two other objectives are pursued in this project. The first is the modeling of an Integrated Gasification Combined Cycle power plant, for which no generic data were known. The second is to select appropriate hypotheses regarding electricity production technologies, CO2 capture, compression, transportation by pipeline, and finally sequestration. Life Cycle Assessment (LCA) was chosen as the method for this research project. LCA is an exhaustive quantitative method used to evaluate the potential environmental impacts associated with a product, a service, or an activity from resource extraction to waste elimination. This tool is governed by ISO 14040 through ISO 14049 and is supported by the Society of Environmental Toxicology and Chemistry (SETAC) and the United Nations Environment Programme (UNEP). Two power plants were studied: an Integrated Gasification Combined Cycle (IGCC) power plant and a Natural Gas Combined Cycle (NGCC) power plant. In order to sequester CO2 in a geological formation, it is necessary to extract the CO2 from the emission flows. For the IGCC power plant, CO2 was captured before combustion; for the NGCC power plant, capture was done after combustion. Once the CO2 was isolated, it was compressed and directed through a 1,000 km transportation pipeline running over land and under the sea. It is hypothesized that the power plant is 300 km from the shore and the sequestration platform 700 km from France's shore, in the North Sea.
The IGCC power plant modeling and data selection regarding CO2 capture and sequestration were done using primary data from industry and the Ecoinvent generic database (version 1.2), selected for its European sources. Finally, technical calculations and the literature were used to complete the data inventory, which was validated by electricity-sector experts in order to increase data and modeling precision. Results were similar for the IGCC and NGCC power plants using Impact 2002+, an impact analysis method. Global warming potential decreased by 67% with the implementation of CO2 capture and sequestration compared to systems without CCS. Results for all other impact categories demonstrated an increase of 16% to 116% in relative proportions compared to systems without CCS. The main contributor was the additional quantity of energy required to operate the CO2 capture and compression facilities. This additional energy lowered the power plant's overall efficiency and increased the quantity of fossil fuel that needed to be extracted and consumed. The increase in other impacts was mainly due to the additional electricity, the additional fossil fuel (for extraction, treatment, and transportation), and the additional emissions generated during power plant operations. A scenario analysis was done to study the sensitivity and variability of uncertain data in the software modeling of a power plant. Power plant efficiency is the most variable and sensitive parameter in the modeling, followed by the length of the transportation pipeline and the leakage rate during CO2 sequestration. This analysis is notable because the maximum-efficiency scenario with capture (with a short CO2 transportation distance and a low leakage rate) obtained better results on all impact category indicators than the minimum-efficiency scenario without capture.
In fact, positive results on all category indicators were possible during the system comparison between the two cases (with and without capture). (Abstract shortened by UMI.)
ADVANCED SULFUR CONTROL CONCEPTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Apostolos A. Nikolopoulos; Santosh K. Gangwal; William J. McMichael
Conventional sulfur removal in integrated gasification combined cycle (IGCC) power plants involves numerous steps: COS (carbonyl sulfide) hydrolysis, amine scrubbing/regeneration, the Claus process, and tail-gas treatment. Advanced sulfur removal in IGCC systems typically involves the use of zinc oxide-based sorbents. The sulfided sorbent is regenerated using dilute air to produce a dilute SO{sub 2} (sulfur dioxide) tail gas. Under previous contracts the highly effective first-generation Direct Sulfur Recovery Process (DSRP) for catalytic reduction of this SO{sub 2} tail gas to elemental sulfur was developed. This process is currently undergoing field-testing. In this project, advanced concepts were evaluated to reduce the number of unit operations in sulfur removal and recovery. Substantial effort was directed towards developing sorbents that could be directly regenerated to elemental sulfur in an Advanced Hot Gas Process (AHGP). Development of this process has been described in detail in Appendices A-F. RTI began the development of the Single-step Sulfur Recovery Process (SSRP) to eliminate the use of sorbents and multiple reactors in sulfur removal and recovery. This process showed promising preliminary results, and thus further process development of AHGP was abandoned in favor of SSRP. The SSRP is a direct Claus process that consists of injecting SO{sub 2} directly into the quenched coal gas from a coal gasifier and reacting the H{sub 2}S-SO{sub 2} mixture over a selective catalyst to both remove and recover sulfur in a single step. The process is conducted at gasifier pressure and 125 to 160 C. The proposed commercial embodiment of the SSRP involves a liquid phase of molten sulfur with dispersed catalyst in a slurry bubble-column reactor (SBCR).
Ethanol and other oxygenates from low-grade carbonaceous resources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joo, O.S.; Jung, K.D.; Han, S.H.
1995-12-31
Anhydrous ethanol and other C2+ oxygenates can be produced quite competitively and in high yield from low-grade carbonaceous resources via gasification, methanol synthesis, carbonylation of methanol, and hydrogenation, performed consecutively. Gas-phase carbonylation of methanol to form methyl acetate is the key step of the whole process. Methyl acetate can be produced very selectively in a one-step gas-phase reaction in a fixed-bed reactor at a GHSV over 5,000. The consecutive hydrogenation of methyl or ethyl acetate produces anhydrous ethanol in high purity. Co-production of methanol and DME in an IGCC plant is also considered, in which low-grade carbonaceous resources are used as energy sources and surplus power and pre-power gas can be stored in the liquid form of methanol and DME during base-load periods. Further integration of C2+ oxygenate production with IGCC can improve its economics. Such extensive technology integration can generate significant industrial profitability as well as reduce the environmental complications associated with massive energy consumption.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schobeiri, Meinhard; Han, Je-Chin
2014-09-30
This report deals with the aerodynamics and heat transfer problems specific to the high-pressure (HP) turbine sections of IGCC gas turbines. Issues of primary relevance to a turbine stage operating in an IGCC environment are: (1) decreasing the strength of the secondary flow vortices at the hub and tip regions to reduce (a) the secondary flow losses and (b) the potential for endwall deposition, erosion, and corrosion due to secondary-flow-driven migration of gas flow particles to the hub and tip regions; (2) providing a robust film-cooling technology at the hub that sustains high cooling effectiveness and is less sensitive to deposition; and (3) investigating the impact of blade tip geometry on film cooling effectiveness. The document includes numerical and experimental investigations of the above issues. The experimental investigations were performed in the three-stage multi-purpose turbine research facility at the Turbomachinery Performance and Flow Research Laboratory (TPFL), Texas A&M University. For the numerical investigations a commercial Navier-Stokes solver was utilized.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-12-31
The project involves the construction of an 80,000 gallon per day (260 tons per day (TPD)) methanol unit utilizing coal-derived synthesis gas from Eastman's integrated coal gasification facility. The new equipment consists of synthesis gas feed preparation and compression facilities, the liquid phase reactor and auxiliaries, product distillation facilities, and utilities. The technology to be demonstrated is the product of a cooperative development effort by Air Products and DOE in a program that started in 1981. Developed to enhance electric power generation using integrated gasification combined cycle (IGCC) technology, the LPMEOH{trademark} process is ideally suited for directly processing gases produced by modern-day coal gasifiers. Originally tested at a small (10 TPD), DOE-owned experimental unit in LaPorte, Texas, the technology provides several improvements essential for the economic coproduction of methanol and electricity directly from gasified coal. This liquid phase process suspends fine catalyst particles in an inert liquid, forming a slurry. The slurry dissipates the heat of the chemical reaction away from the catalyst surface, protecting the catalyst and allowing the methanol synthesis reaction to proceed at higher rates. At the Eastman complex, the technology is being integrated with existing coal gasifiers. A carefully developed test plan will allow operations at Eastman to simulate electricity demand load-following in coal-based IGCC facilities. The operations will also demonstrate the enhanced stability and heat dissipation of the conversion process, its reliable on/off operation, and its ability to produce methanol as a clean liquid fuel without additional upgrading.
Hybrid Molten Bed Gasifier for High Hydrogen Syngas Production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rue, David
The techno-economic analyses of the hybrid molten bed (HMB) gasification technology and laboratory testing of the HMB process were carried out in this project by the Gas Technology Institute (GTI) and partner Nexant, Inc. under contract with the US Department of Energy's National Energy Technology Laboratory. This report includes the results of two complete IGCC and Fischer-Tropsch TEA analyses comparing HMB gasification with the Shell slagging gasification process as a base case. Also included are the results of the laboratory simulation tests of the HMB process using Illinois #6 coal fed along with natural gas, two different syngases, and steam. Work in this 18-month project was carried out in three main tasks. Task 2 was completed first and involved modeling, mass and energy balances, and gasification process design. The results of this work were provided to Nexant as input to the TEA IGCC and FT configurations studied in detail in Task 3. The results of Task 2 were also used to guide the design of the laboratory-scale testing of the HMB concept in the submerged combustion melting test facility in GTI's industrial combustion laboratory. All project work was completed on time and on budget. A project close-out meeting reviewing project results was conducted on April 1, 2015 at GTI in Des Plaines, IL. The hybrid molten bed gasification process techno-economic analyses found that the HMB process is both technically and economically attractive compared with the Shell entrained-flow gasification process. In the IGCC configuration, HMB gasification provides both efficiency and cost benefits. In the Fischer-Tropsch configuration, HMB shows small benefits, primarily because even at current low natural gas prices, natural gas is more expensive than coal on an energy cost basis. HMB gasification was found in the TEA to improve the overall IGCC economics as compared to the coal-only Shell gasification process. Operationally, the HMB process proved to be robust and easy to operate.
The burner was stable over the full oxygen-to-fuel firing range (0.8 to 1.05 of fuel gas stoichiometry) and with all fuel gases (natural gas and two syngas compositions), with and without steam. The lower Btu content of the syngases presented no combustion difficulties. The molten bed was stable throughout testing and was easily established as a bed of molten glass. As the composition changed from glass cullet to cullet with slag, no instabilities were encountered. The bed temperature and product syngas temperature remained stable throughout testing, demonstrating that the bed serves as a good heat sink for the gasification process. The product syngas temperature measured above the bed was stable at ~1600°F. Testing found that syngas quality, measured as the H2/CO ratio, increased with decreasing oxygen-to-fuel-gas stoichiometric ratio, higher steam-to-inlet-carbon ratio, higher temperature, and with syngas feed compared to natural gas. The highest H2/CO ratios achieved were in the range of 0.70 to 0.78. These values are well below the targets of 1.5 to 2.0 that were expected and predicted by modeling. The team, however, is encouraged that the HMB process can and will achieve H2/CO ratios up to 2.0. The changes needed include direct injection of coal into the molten bed of slag to prevent coal particle bypass into the product gas stream, elevation of the molten bed temperature to approximately 2500°F, and further decrease of the oxygen-to-fuel-gas ratio to well below the 0.85 minimum used in the testing in this project.
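How far the measured syngas falls short of the target can be framed with the water-gas shift reaction. A hedged sketch (a simple mole balance that ignores equilibrium limits) of the fraction of CO that would need to shift to move an H2/CO ratio of 0.75 up to the 2.0 target:

```python
def shift_fraction(h2, co, target_ratio):
    """Fraction of fed CO that must undergo the water-gas shift
    (CO + H2O -> CO2 + H2) to raise H2/CO to target_ratio.
    Simple mole balance; equilibrium and kinetics are ignored."""
    x = (target_ratio * co - h2) / (1.0 + target_ratio)  # moles of CO shifted
    return x / co

# Ratios from the abstract: measured ~0.75 vs. the 2.0 target
f = shift_fraction(h2=0.75, co=1.0, target_ratio=2.0)
print(f"fraction of CO to shift: {f:.2f}")  # -> 0.42
```

This is why the abstract emphasizes higher steam-to-carbon ratios and hotter bed temperatures: both push the shift toward H2.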
Report on all ARRA Funded Technical Work
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
2013-10-05
The main focus of this American Recovery and Reinvestment Act of 2009 (ARRA) funded project was to design an energy-efficient carbon capture and storage (CCS) process using the Recipient's membrane system for H{sub 2} separation and CO{sub 2} capture. In the ARRA-funded project, the Recipient accelerated development and scale-up of ongoing hydrogen membrane technology research and development (R&D). Specifically, this project focused on accelerating the current R&D work scope of the base-program-funded project, involving lab-scale tests, detailed design of a 250 lb/day H{sub 2} process development unit (PDU), and scale-up of membrane tube and coating manufacturing. The project scope included site selection and a Front End Engineering Design (FEED) study of a nominally 4 to 10 ton-per-day (TPD) Pre-Commercial Module (PCM) hydrogen separation membrane system. Process models and techno-economic analysis were updated to include studies on integration of this technology into an Integrated Gasification Combined Cycle (IGCC) power generation system with CCS.
Arroyo, Fátima; Font, Oriol; Fernández-Pereira, Constantino; Querol, Xavier; Juan, Roberto; Ruiz, Carmen; Coca, Pilar
2009-08-15
In this study the purity of the germanium end-products obtained by two different precipitation methods carried out on germanium-bearing solutions was evaluated as the last step of a hydrometallurgical process for the recovery of this valuable element from the Puertollano Integrated Gasification Combined Cycle (IGCC) fly ash. Since H2S is produced as a by-product in the gas cleaning system of the Puertollano IGCC plant, precipitation of germanium as GeS2 was tested by sulfiding the Ge-bearing solutions. The technological and safety issues that surround H2S handling led to the investigation of a novel precipitation procedure: precipitation as an organic complex by adding 1,2-dihydroxybenzene (pyrocatechol, CAT) and cetyltrimethylammonium bromide (CTAB) to the Ge-bearing solutions. Relatively high-purity Ge end-products (90 and 93% hexagonal-GeO2 purity, respectively) were obtained by precipitating Ge from enriched solutions, either as GeS2 by sulfiding the solutions with H2S or as an organic complex with CAT/CTAB mixtures, followed by roasting of the precipitates. Both methods showed high efficiency (>99%) in selectively precipitating Ge in a single precipitation stage from germanium-bearing solutions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-12-31
The project involves the construction of an 80,000 gallon per day (260 TPD) methanol unit utilizing coal-derived synthesis gas from Eastman's integrated coal gasification facility. The new equipment consists of synthesis gas feed preparation and compression facilities, the liquid phase reactor and auxiliaries, product distillation facilities, and utilities. The technology to be demonstrated is the product of a cooperative development effort by Air Products and DOE in a program that started in 1981. Developed to enhance electric power generation using integrated gasification combined cycle (IGCC) technology, the LPMEOH{trademark} process is ideally suited for directly processing gases produced by modern-day coal gasifiers. This liquid phase process suspends fine catalyst particles in an inert liquid, forming a slurry. The slurry dissipates the heat of the chemical reaction away from the catalyst surface, protecting the catalyst and allowing the methanol synthesis reaction to proceed at higher rates. At the Eastman complex, the technology will be integrated with existing coal gasifiers. A carefully developed test plan will allow operations at Eastman to simulate electricity demand load-following in coal-based IGCC facilities. The operations will also demonstrate the enhanced stability and heat dissipation of the conversion process, its reliable on/off operation, and its ability to produce methanol as a clean liquid fuel without additional upgrading. An off-site product testing program will be conducted to demonstrate the suitability of the methanol product as a transportation fuel and as a fuel for stationary applications such as small modular electric power generators for distributed power.
NASA Astrophysics Data System (ADS)
Musulin, Mike, II
The continued failure of synthetic fuels development in the United States to achieve commercialization has been documented through sporadic periods of mounting corporate and government enthusiasm and high levels of research and development effort. Four periods of enthusiasm at the national level were followed by waning intervals of shrinking financial support and sagging R&D work. This continuing cycle of mobilization and stagnation has had a corresponding history in Kentucky. To better understand the potential and the pitfalls of this type of technological development, the history of synthetic fuels development in the United States is presented as background, with a more detailed analysis of synfuels development in Kentucky. The first two periods of interest in synthetic fuels, immediately after the Second World War and in the 1950s, did not result in any proposed plants for Kentucky, but the third and fourth periods of interest created a great deal of activity. A theoretically grounded case study is utilized in this research project to create four different scenarios for the future of synthetic fuels development. The Kentucky experience is utilized in this case study because a fifth incarnation of synthetic fuels development has been proposed for the state in the form of an integrated gasification combined cycle (IGCC) power plant to utilize coal and refuse-derived fuel (RDF). The project has been awarded a grant from the U.S. Department of Energy Clean Coal Technology program. From an examination and analysis of these periods of interest and the subsequent dwindling of interest and participation, four alternative scenarios are constructed. A synfuels breakthrough scenario is described whereby IGCC becomes a viable part of the country's energy future. A multiplex scenario describes how IGCC becomes a particular niche in energy production. The status quo scenario describes how the old patterns of project failure repeat themselves.
The fourth scenario describes how synfuels and other conventional energy sources are rejected in favor of conservation, use of nuclear facilities, and use of alternative fuels.
NASA Astrophysics Data System (ADS)
Luo, Kevin
Coal synthesis gas (syngas) can introduce contaminants into the flow of an Integrated Gasification Combined Cycle (IGCC) industrial gas turbine, which can form molten deposits on components of the first stage of the turbine. Research is being conducted at West Virginia University (WVU) to study the effects of particulate deposition on the thermal barrier coatings (TBC) employed on the airfoils of an IGCC turbine hot section. WVU has been working with the U.S. Department of Energy, National Energy Technology Laboratory (NETL) to simulate deposition on the pressure side of an IGCC turbine first-stage vane to study the effects on film cooling. To simulate the particulate deposition, TBC-coated, angled film-cooled test articles were subjected to accelerated deposition injected into the flow of a combustor facility at a pressure of approximately 4 atm and a gas temperature of 1560 K. Particle characteristics are matched between engine and laboratory conditions using the Stokes number and the particulate loading. To investigate the degradation of the TBC from the particulate deposition, non-destructive evaluations were performed using a load-based multiple-partial-unloading micro-indentation technique and were followed by scanning electron microscopy (SEM) evaluation and energy dispersive X-ray spectroscopy (EDS) examination. The micro-indentation technique used in the study was developed by Kang et al. and can quantitatively evaluate the mechanical properties of materials. The indentation results found that the Young's modulus of the ceramic top coat is higher in areas with deposition formation due to the penetration of the fly ash. The increase in the modulus of elasticity has been shown to result in a reduction of the strain tolerance of the 7% yttria-stabilized zirconia (7YSZ) TBC coatings. The increase in the Young's modulus of the ceramic top coat is due to the stiffening of the YSZ columnar microstructure by the cooled particulate fly ash.
SEM evaluation was used to evaluate the microstructure of the layers within the TBC system, and the SEM micrographs showed that the TBC/fly ash deposition interaction zone made the YSZ coating more susceptible to delamination and promoted a dissolution-reprecipitation mechanism that changes the YSZ morphology and composition. EDS examination provided elemental maps which showed a shallow infiltration depth of the fly ash deposits and an elemental distribution spectrum analysis showed yttria migration from the YSZ top coating into the molten deposition. This preliminary work should lead to future studies in gas turbine material coating systems and their interaction with simulated fly ash and potentially CMAS or volcanic ash deposition.
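The Stokes-number matching mentioned above can be sketched as follows. All rig and engine values here are hypothetical, chosen only to show how a lower rig velocity can be offset by a smaller length scale and viscosity while preserving particle inertia similarity:

```python
def stokes_number(rho_p, d_p, U, mu, L):
    """Particle Stokes number: ratio of the particle response time
    (rho_p * d_p**2 / (18 * mu)) to the flow time scale L / U."""
    return rho_p * d_p**2 * U / (18.0 * mu * L)

# Hypothetical engine vs. laboratory rig conditions (SI units,
# illustrative only): same ash particle, different flow scales.
stk_engine = stokes_number(rho_p=2500.0, d_p=10e-6, U=250.0,
                           mu=5.0e-5, L=0.05)
stk_rig    = stokes_number(rho_p=2500.0, d_p=10e-6, U=100.0,
                           mu=4.0e-5, L=0.025)
print(f"engine Stk = {stk_engine:.2f}, rig Stk = {stk_rig:.2f}")
```

With matched Stokes numbers the particles follow (or depart from) streamlines in the rig the same way they do in the engine, so the deposition pattern is representative even at reduced pressure and temperature.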
Development of ITM oxygen technology for integration in IGCC and other advanced power generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Armstrong, Phillip A.
2015-03-31
Ion Transport Membrane (ITM) technology is based on the oxygen-ion-conducting properties of certain mixed-metal oxide ceramic materials that can separate oxygen from an oxygen-containing gas, such as air, under a suitable driving force. The “ITM Oxygen” air separation system that results from the use of such ceramic membranes produces a hot, pure oxygen stream and a hot, pressurized, oxygen-depleted stream from which significant amounts of energy can be extracted. Accordingly, the technology integrates well with other high-temperature processes, including power generation. Air Products and Chemicals, Inc., the Recipient, in conjunction with a dozen subcontractors, developed ITM Oxygen technology under this five-phase Cooperative Agreement from the laboratory bench scale to implementation in a pilot plant capable of producing power and 100 tons per day (TPD) of purified oxygen. A commercial-scale membrane module manufacturing facility (the “CerFab”), sized to support a conceptual 2000 TPD ITM Oxygen Development Facility (ODF), was also established and operated under this Agreement. In the course of this work, the team developed prototype ceramic production processes and a robust planar ceramic membrane architecture based on a novel ceramic compound capable of high oxygen fluxes. The concept and feasibility of the technology were thoroughly established through laboratory pilot-scale operations testing commercial-scale membrane modules run under industrial operating conditions, with compelling lifetime and reliability performance that supported further scale-up. Auxiliary systems, including contaminant mitigation, process controls, heat exchange, turbo-machinery, combustion, and membrane pressure vessels, were extensively investigated and developed. The Recipient and subcontractors developed efficient process cycles that co-produce oxygen and power based on compact, low-cost ITMs.
Process economics assessments show significant benefits relative to state-of-the-art cryogenic air separation technology in energy-intensive applications such as IGCC, with and without carbon capture.
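For mixed ionic-electronic conducting membranes of this kind, oxygen flux is commonly estimated with the Wagner equation, in which the driving force is the log-ratio of oxygen partial pressures across the membrane. A sketch with assumed material properties (the conductivity, thickness, and pressures below are placeholders, not Air Products data):

```python
import math

F = 96485.0   # Faraday constant, C/mol
R = 8.314     # gas constant, J/(mol K)

def wagner_flux(sigma, T, thickness, p_o2_feed, p_o2_perm):
    """Wagner-equation estimate of O2 flux (mol m^-2 s^-1) through a
    mixed ionic-electronic conducting membrane: flux scales with
    conductivity and temperature, inversely with thickness, and with
    the log-ratio of oxygen partial pressures."""
    return (R * T * sigma / (16.0 * F**2 * thickness)
            * math.log(p_o2_feed / p_o2_perm))

# Assumed: 1 S/m ambipolar conductivity, 850 C (1123 K), 100 um thick
# membrane, 2.1 bar O2 on the pressurized-air feed side vs. 0.05 bar
# on the permeate side.
j = wagner_flux(sigma=1.0, T=1123.0, thickness=100e-6,
                p_o2_feed=2.1, p_o2_perm=0.05)
print(f"O2 flux ~ {j:.2e} mol/(m^2 s)")
```

The equation makes the scale-up levers visible: thinner membranes, hotter operation, and higher feed-side pressure all raise flux, which is why pressurized operation integrates so naturally with gas-turbine cycles.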
NASA Astrophysics Data System (ADS)
Liu, M.; Bi, J.; Huang, Y.; Kinney, P. L.
2016-12-01
Jiangsu, which has three national low-carbon pilot cities, is set to be a model province in China for achieving peak carbon targets before 2030. However, according to local planning for responding to climate change, carbon emissions are projected to keep rising before 2020 even if the strictest measures are implemented. In other words, innovative measures must be put in place after 2020. This work aimed at assessing the air quality and health co-benefits of alternative post-2020 measures to help remove barriers to policy implementation by tying it to local incentives for air quality improvement. To achieve this aim, we select 2010 as the baseline year and develop Business As Usual (BAU) and Traditional Carbon Reduction (TCR) scenarios before 2020. Under BAU, only existing climate and air pollution control policies are considered; under TCR, potential climate policies in local planning and existing air pollution control policies are considered. After 2020, integrated gasification combined cycle (IGCC) plants with carbon capture and storage (CCS) technology and large-scale substitution of renewable energy appear to be two promising pathways for achieving peak carbon targets. Therefore, two additional scenarios (TCR-IGCC and TCR-SRE) are set after 2020. Based on projections of future energy balances and industrial production, we estimate the pollutant emissions and simulate PM2.5 and ozone concentrations for 2017, 2020, 2030, and 2050 using CMAQ. Then, using a health impact assessment approach, premature deaths are estimated and monetized. Results show that the carbon peak in Jiangsu will be achieved before 2030 only under the TCR-IGCC and TCR-SRE scenarios. Under the three policy scenarios, Jiangsu's carbon emission control targets would have substantial effects on primary air pollutant emissions, far beyond those we estimate would be needed to meet the PM2.5 concentration targets in 2017.
Compared with IGCC with CCS, large-scale substitution of renewable energy brings comparable pollutant emission reductions but greater health benefits, because it reduces more emissions from traffic sources, which are more harmful to health. However, large-scale substitution of renewable energy poses challenges for energy supply capacity, which need to be seriously considered in future policy decisions.
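The "estimated and monetized" step in health impact assessments of this kind typically uses a log-linear concentration-response function. A hedged sketch with placeholder inputs (the coefficient, baseline deaths, and value of statistical life below are assumptions for illustration, not values from this study):

```python
import math

def attributable_deaths(beta, delta_pm, baseline_deaths):
    """Log-linear concentration-response function: premature deaths
    avoided by a delta_pm (ug/m3) reduction in annual-mean PM2.5,
    given a mortality coefficient beta per ug/m3."""
    return baseline_deaths * (1.0 - math.exp(-beta * delta_pm))

# Placeholder inputs: beta of 0.004 per ug/m3, 500,000 baseline
# deaths in the exposed population, and a 10 ug/m3 PM2.5 reduction.
deaths = attributable_deaths(beta=0.004, delta_pm=10.0,
                             baseline_deaths=500000.0)
vsl = 1.0e6  # assumed value of statistical life, USD
print(f"avoided deaths: {deaths:.0f}")
print(f"monetized benefit: ${deaths * vsl / 1e9:.1f} billion")
```

Because the benefit scales with both the concentration reduction and the exposed population, measures that cut traffic emissions in dense areas monetize especially well, consistent with the comparison above.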
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dennis, R.A.
1997-05-01
The availability of reliable, low-cost electricity is a cornerstone of the United States' ability to compete in the world market. The Department of Energy (DOE) projects the total consumption of electricity in the US to rise from 2.7 trillion kilowatt-hours in 1990 to 3.5 trillion in 2010. Although energy sources are diversifying, fossil fuels still produce 90 percent of the nation's energy. Coal is our most abundant fossil fuel resource and the source of 56 percent of our electricity. It has been the fuel of choice because of its availability and low cost. A new generation of high-efficiency power systems has made it possible to continue the use of coal while still protecting the environment. Such power systems greatly reduce the pollutants associated with coal-fired plants built before the 1970s. To realize this high efficiency and superior environmental performance, advanced coal-based power systems will require gas stream cleanup under high-temperature and high-pressure (HTHP) process conditions. Presented in this paper are the HTHP particulate capture requirements for the Integrated Gasification Combined Cycle (IGCC) and Pressurized Fluidized-Bed Combustion (PFBC) power systems, the HTHP particulate cleanup systems being implemented in the PFBC and IGCC Clean Coal Technology (CCT) Projects, and the currently available particulate capture performance results.
Novel findings about management of gastric cancer: a summary from 10th IGCC.
Penon, Danila; Cito, Letizia; Giordano, Antonio
2014-07-21
The Tenth International Gastric Cancer Congress (IGCC) was held in Verona, Italy, from June 19 to 22, 2013. The meeting covered various aspects of stomach tumor management, including both strictly clinical approaches and topics more related to basic research. An overview of gastrointestinal stromal tumors was also provided, although it is not discussed here. Here we will discuss some topics related to the molecular biology of gastric cancer (GC), concerning the prognostic, diagnostic and therapeutic tools shown at the conference. Results on well-known subjects, such as E-cadherin loss of expression/function, were presented. They revealed that other mutations of the gene have been identified, reflecting continuing research to improve the diagnosis and prognosis of stomach tumors. At the same time, possible new molecular markers with an established role in other neoplasms were discussed, such as mesothelin, stomatin-like protein 2 and Notch-1. Hence, a wide overview including both old and new diagnostic/prognostic tools was offered. Great attention was also dedicated to possible drugs to be used against GC. These included monoclonal antibodies, such as MS57-2.1, drugs used in other pathologies, such as maraviroc, and natural extracts from plants, such as biflorin. We would like to contribute by summarizing the most impressive studies presented at the IGCC concerning novel findings about the molecular biology of gastric cancer. Although further investigations will be necessary, it can be inferred that more and more tools are being developed to better confront stomach neoplasms.
Ziemkiewicz, Paul; Stauffer, Philip H.; Sullivan-Graham, Jeri; ...
2016-08-04
Carbon capture, utilization and storage (CCUS) seeks beneficial applications for CO2 recovered from fossil fuel combustion. This study evaluated the potential for removing formation water to create additional storage capacity for CO2, while simultaneously treating the produced water for beneficial use. Furthermore, the process would control pressures within the target formation, lessen the risk of caprock failure, and better control the movement of CO2 within that formation. The project plans to highlight the method of using individual wells to produce formation water prior to injecting CO2 as an efficient means of managing reservoir pressure. Because the pressure drawdown resulting from pre-injection formation water production will inversely correlate with pressure buildup resulting from CO2 injection, it can be proactively used to estimate CO2 storage capacity and to plan well-field operations. The project studied the GreenGen site in Tianjin, China, where Huaneng Corporation is capturing CO2 at a coal-fired IGCC power plant. Known as the Tianjin Enhanced Water Recovery (EWR) project, local rock units were evaluated for CO2 storage potential and produced water treatment options were then developed. Average treatment cost for produced water with a cooling water treatment goal ranged from 2.27 to 2.96 US$/m3 (recovery 95.25%), and for a boiler water treatment goal ranged from 2.37 to 3.18 US$/m3 (recovery 92.78%). Importance analysis indicated that water quality parameters and transportation are significant cost factors as the injection-extraction system is managed over time. Our study found that, in a broad sense, active reservoir management in the context of CCUS/EWR is technically feasible. In addition, criteria for evaluating suitable vs. unsuitable reservoir properties, reservoir storage (caprock) integrity, a recommended injection/withdrawal strategy, and cost estimates for water treatment and reservoir management are proposed.
75 FR 28612 - Environmental Impact Statements; Notice of Availability
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-21
... Counties, OR and Adams and Nez Perce Counties, ID, Wait Period Ends: 06/21/2010, Contact: Robert W. Rock.... EIS No. 20100181, Final EIS, DOE, MS, Kemper County Integrated Gasification Combined-Cycle (IGCC...
Coal Integrated Gasification Fuel Cell System Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chellappa Balan; Debashis Dey; Sukru-Alper Eker
2004-01-31
This study analyzes the performance and economics of power generation systems based on Solid Oxide Fuel Cell (SOFC) technology and fueled by gasified coal. System concepts that integrate a coal gasifier with a SOFC, a gas turbine, and a steam turbine were developed and analyzed for plant sizes in excess of 200 MW. Two alternative integration configurations were selected with projected system efficiency of over 53% on a HHV basis, or about 10 percentage points higher than that of state-of-the-art Integrated Gasification Combined Cycle (IGCC) systems. The initial cost of both selected configurations was found to be comparable with IGCC system costs at approximately $1700/kW. An absorption-based CO2 isolation scheme was developed, and its penalty on system performance and cost was estimated to be approximately 2.7% and $370/kW, respectively. Technology gaps and required engineering development efforts were identified and evaluated.
Mitigation of Syngas Cooler Plugging and Fouling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bockelie, Michael J.
This Final Report summarizes research performed to develop a technology to mitigate the plugging and fouling that occur in the syngas cooler used in many Integrated Gasification Combined Cycle (IGCC) plants. The syngas cooler is a firetube heat exchanger located downstream of the gasifier. It offers high thermal efficiency, but its reliability has generally been lower than that of other process equipment in the gasification island. The buildup of ash deposits on the fireside surfaces of the syngas cooler (i.e., fouling) leads to reduced equipment life and increased maintenance costs. Our premise in addressing this problem is that fouling of the syngas cooler cannot be eliminated, but it can be better managed. The research program was funded by DOE over two budget periods: Budget Period 1 (BP1) and Budget Period 2 (BP2). The project used a combination of laboratory-scale experiments, analysis of syngas cooler deposits, modeling, and guidance from industry to develop a better understanding of fouling mechanisms and to develop and evaluate strategies to mitigate syngas cooler fouling and thereby improve syngas cooler performance. The work effort in BP1 and BP2 focused on developing a better understanding of the mechanisms that lead to syngas cooler plugging and fouling and investigating promising concepts to mitigate them.
The work effort focused on the following:
• analysis of syngas cooler deposits and fuels provided by an IGCC plant collaborating with this project;
• jet cleaning tests in the University of Utah Laminar Entrained Flow Reactor to determine the bond strength between an ash deposit and a metal plate, as well as implementing planned equipment modifications to the Laminar Entrained Flow Reactor and the one-ton-per-day, pressurized Pilot Scale Gasifier;
• Computational Fluid Dynamic modeling of industrially relevant syngas cooler configurations to develop a better understanding of deposit formation mechanisms;
• a Techno-Economic Analysis for a representative IGCC plant to investigate the impact on plant economics, in particular the Cost of Electricity (COE), of plant shutdowns caused by syngas cooler plugging and fouling, and the potential economic benefits of strategies to mitigate syngas cooler fouling; and
• modeling and pilot-scale tests to investigate the potential benefits of using a sorbent (fuel additive) to capture the vaporized metals that result in syngas cooler fouling.
All project milestones for BP1 and BP2 were achieved. DOE was provided a briefing on our accomplishments in BP1 and BP2 and our proposed plans for Budget Period 3 (BP3). Based on our research, the mitigation technology selected for investigation in BP3 was the use of a sorbent that can be injected into the gasifier with the fuel slurry to capture vaporized metals that lead to deposit formation in the syngas cooler. The work effort proposed for BP3 would have focused on addressing concerns raised by gasification industry personnel about the impacts of sorbent injection on gasifier performance, so that at the end of BP3 the use of sorbent injection would be at the “pre-commercial” stage and ready for use in a field demonstration that could be funded by industry or DOE.
A Budget Continuation Application (BCA) was submitted to DOE to obtain funding for BP3, but DOE chose not to fund the proposed BP3 effort.
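The Techno-Economic Analysis mentioned above rests on a simple relationship: unplanned outages reduce annual generation, which raises the levelized Cost of Electricity. A minimal sketch of that arithmetic, with entirely illustrative cost figures (not the project's TEA numbers):

```python
def cost_of_electricity(capital_charge, fixed_om, variable_om, fuel_cost,
                        net_mw, capacity_factor):
    """Levelized cost of electricity in $/MWh. capital_charge and
    fixed_om are annual dollar amounts; variable_om and fuel_cost
    are $/MWh. All inputs below are illustrative, not TEA results."""
    annual_mwh = net_mw * 8760 * capacity_factor
    return (capital_charge + fixed_om) / annual_mwh + variable_om + fuel_cost

base = cost_of_electricity(250e6, 40e6, 8.0, 25.0, 600, 0.85)
# Two extra weeks of outage per year for syngas cooler cleaning
degraded = cost_of_electricity(250e6, 40e6, 8.0, 25.0, 600, 0.85 - 14 / 365)
print(round(base, 2), round(degraded, 2))  # COE without and with extra downtime
```

Because the capital charge is fixed, even a modest loss of capacity factor pushes the COE up, which is why mitigating fouling-driven shutdowns shows up directly in plant economics.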
NASA Astrophysics Data System (ADS)
Abaimov, N. A.; Osipov, P. V.; Ryzhkov, A. F.
2016-10-01
In this paper, the development of an advanced bituminous coal entrained-flow air-blown gasifier for a high-power integrated gasification combined cycle is considered. The computational fluid dynamics (CFD) technique is used as the basic development tool. An experiment on a pressurized entrained-flow gasifier was performed by “NPO CKTI” JSC for verification of the thermochemical processes submodel. The kinetic constants for Kuznetsk bituminous coal (flame coal), obtained by the thermal gravimetric analysis method, are used in the model. The calculation results obtained with the CFD model are in satisfactory agreement with the experimental data. On the basis of the verified model, an advanced gasifier structure was suggested which makes it possible to increase the hydrogen content in the synthesis gas and consequently to improve the gas turbine efficiency. To meet the specified requirements, steam is added at the second stage of the MHI-type gasifier, and the heat necessary for air gasification is compensated by supplemental heating of the blast air.
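TGA-derived kinetics enter such CFD models through Arrhenius-type rate expressions. A minimal sketch of a first-order char conversion rate, with placeholder pre-exponential factor and activation energy (not the constants determined for Kuznetsk flame coal):

```python
import math

R_GAS = 8.314  # universal gas constant, J/(mol*K)

def char_conversion_rate(X, T, A=2.0e4, E=1.1e5):
    """First-order char conversion rate dX/dt = A*exp(-E/(R*T))*(1 - X).
    A (1/s) and E (J/mol) are placeholders, not the TGA-derived
    constants for Kuznetsk flame coal."""
    return A * math.exp(-E / (R_GAS * T)) * (1.0 - X)

# Explicit Euler integration of conversion at a constant 1100 K
X, dt = 0.0, 0.01
for _ in range(1000):  # 10 s of residence time
    X += char_conversion_rate(X, 1100.0) * dt
print(round(X, 3))  # fractional char conversion after 10 s
```

In a full CFD model the same rate law is evaluated per cell with the local particle temperature rather than a fixed T.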
A new way to experience the International Gastric Cancer Association Congress: the Web Round Tables.
Morgagni, Paolo; Verlato, Giuseppe; Marrelli, Daniele; Roviello, Franco; de Manzoni, Giovanni
2014-10-01
In an attempt to attract a wider diversity of professionals to the 10th International Gastric Cancer Association Congress (IGCC) held in June 2013, the Scientific Committee of the conference organized a number of pre-congress Web Round Tables to discuss cutting-edge topics relating to gastric cancer treatment. Twenty Web Round Tables, each coordinated by a different chairman, were proposed on the IGCC Website 1 year before the congress. Each chairman identified a number of studies related to the theme of his/her Round Table and invited corresponding authors to send an update of their conclusions in light of their subsequent experience, which would then form the basis of discussion of the Web Round Tables. The chairmen posted several questions regarding these updates on the web and opened a forum for a period of 1-2 months. The forum was free and specifically intended for congress participants. Fifty-one (9.9%) of the 516 authors contacted took part in the initiative. Two hundred fifty participants from 21 countries joined the forum discussion and posted 671 comments. The Web Round Tables were viewed 15,810 times while the forum was open. Overall, the Web Round Tables aroused considerable interest, especially among young professionals working in the area of gastric cancer, who had the opportunity to contact and interact with experts in what often turned out to be an interesting and lively exchange of views. All the discussions are now freely available for consultation on the IGCC website. The Web Round Table experience was presented, with great success, during the conference at special afternoon sessions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hanson, Ronald; Whitty, Kevin
2014-12-01
The integrated gasification combined cycle (IGCC), when combined with carbon capture and storage, can be one of the cleanest methods of extracting energy from coal. Control of coal and biomass gasification processes to accommodate the changing character of input-fuel streams is required for practical implementation of IGCC technologies. Therefore, a fast time-response sensor is needed for real-time monitoring of the composition, and ideally the heating value, of the synthesis gas (here called syngas) as it exits the gasifier. The goal of this project was the design, construction, and demonstration of an in situ laser-absorption sensor to monitor multiple species in the syngas output from practical-scale coal gasifiers. This project investigated the hypothesis of using laser absorption sensing in particulate-laden syngas. Absorption transitions were selected with design rules to optimize signal strength while minimizing interference from other species. Successful in situ measurements in the dusty, high-pressure syngas flow were enabled by Stanford's normalized and scanned wavelength modulation strategy. A prototype sensor for CO, CH4, CO2, and H2O was refined through experiments conducted in the laboratory at Stanford University, in a pilot-scale gasifier at the University of Utah, and in an engineering-scale gasifier at DOE's National Carbon Capture Center, culminating in the demonstration of a prototype sensor at technology readiness level 6 in the 2014 measurement campaign.
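Laser-absorption sensors of this kind recover species mole fractions by inverting the Beer-Lambert law. A minimal sketch of that inversion, with purely illustrative line strength, lineshape, and path-length values (not the actual CO/CH4/CO2/H2O line parameters of the Stanford sensor):

```python
def mole_fraction_from_absorbance(alpha, S, phi, P, L):
    """Invert the Beer-Lambert relation alpha = S*phi*P*x*L for the
    absorber mole fraction x. Common spectroscopy units are assumed:
    S in cm^-2/atm, phi (lineshape value) in cm, P in atm, L in cm."""
    return alpha / (S * phi * P * L)

# Hypothetical measurement: peak absorbance of 0.05 across a 20 cm path
# at 10 atm, for a line of strength 0.1 cm^-2/atm and lineshape peak 0.5 cm
x = mole_fraction_from_absorbance(alpha=0.05, S=0.1, phi=0.5, P=10.0, L=20.0)
print(x)  # mole fraction of the absorbing species
```

Wavelength modulation spectroscopy infers an equivalent absorbance from harmonic signals rather than measuring it directly, which is what makes the technique robust to the beam-intensity losses caused by particulate scattering.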
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paul, P.; Bhattacharyya, D.; Turton, R.
2012-01-01
Future integrated gasification combined cycle (IGCC) power plants with CO2 capture will face stricter operational and environmental constraints. Accurate values of relevant states/outputs/disturbances are needed to satisfy these constraints and to maximize operational efficiency. Unfortunately, a number of these process variables cannot be measured, while others can be measured but have low precision, reliability, or signal-to-noise ratio. In this work, a sensor placement (SP) algorithm is developed for optimal selection of sensor location, number, and type that can maximize the plant efficiency and result in a desired precision of the relevant measured/unmeasured states. The SP algorithm is developed for a selective, dual-stage Selexol-based acid gas removal (AGR) unit for an IGCC plant with pre-combustion CO2 capture. A comprehensive nonlinear dynamic model of the AGR unit is developed in Aspen Plus Dynamics® (APD) and used to generate a linear state-space model that is used in the SP algorithm. The SP algorithm is developed with the assumption that an optimal Kalman filter will be implemented in the plant for state and disturbance estimation, assuming steady-state Kalman filtering and steady-state operation of the plant. The control system is considered to operate based on the estimated states, thereby capturing the effects of the SP algorithm on the overall plant efficiency. The optimization problem is solved by a Genetic Algorithm (GA) considering both linear and nonlinear equality and inequality constraints. Due to the very large number of candidate sensor sets and the long time it takes to solve the constrained optimization problem, which includes more than 1000 states, solution of this problem is computationally expensive.
To reduce the computation time, parallel computing is performed using the Distributed Computing Server (DCS®) and the Parallel Computing Toolbox from MathWorks®. In this presentation, we will share our experience in setting up parallel computing with the GA in the MATLAB® environment and present the overall approach for achieving higher computational efficiency in this framework.
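The core idea, evaluating candidate sensor sets by the estimation precision a steady-state Kalman filter would achieve with them, can be sketched on a toy system. The model below is a hypothetical 3-state plant, not the AGR model, and exhaustive search stands in for the GA (which becomes necessary only when the candidate set is large):

```python
import numpy as np
from itertools import combinations

def steady_state_cov(A, C, Q, R, iters=500):
    """Iterate the Kalman filter Riccati recursion to (near) steady
    state and return the posterior error covariance."""
    P = np.eye(A.shape[0])
    for _ in range(iters):
        Pp = A @ P @ A.T + Q                   # time update (predict)
        S = C @ Pp @ C.T + R                   # innovation covariance
        K = Pp @ C.T @ np.linalg.inv(S)        # Kalman gain
        P = (np.eye(A.shape[0]) - K @ C) @ Pp  # measurement update
    return P

# Toy 3-state plant; each row of C_full is a candidate sensor with
# noise variance r[i]. All numbers are illustrative.
A = np.array([[0.9, 0.1, 0.0],
              [0.0, 0.8, 0.2],
              [0.0, 0.0, 0.95]])
Q = 0.01 * np.eye(3)
C_full = np.eye(3)
r = np.array([0.01, 0.5, 0.05])

# Exhaustive search over 2-sensor subsets; objective is the trace of
# the steady-state estimation error covariance (smaller is better).
best = min(combinations(range(3), 2),
           key=lambda s: np.trace(steady_state_cov(
               A, C_full[list(s)], Q, np.diag(r[list(s)]))))
print(best)  # the best 2-sensor subset for this toy plant
```

With more than 1000 states and many candidate sensors, each covariance evaluation is expensive and the subset space is combinatorial, which is why the authors turn to a GA with parallel evaluation of the population.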
Low Carbon Technology Options for the Natural Gas ...
The ultimate goal of this task is to perform environmental and economic analysis of natural gas based power production technologies (different routes) to investigate and evaluate strategies for reducing emissions from the power sector. It is a broad research area. Initially, the research will be focused on preliminary analyses of hydrogen-fueled power production technologies utilizing hydrogen in large, heavy-duty gas turbines in integrated reformer combined cycle (IRCC) and integrated gasification combined cycle (IGCC) configurations for electric power generation. The research will be expanded step by step, based on findings, to include other advanced pre-combustion and post-combustion technologies (e.g., NET Power, a potentially transformative technology utilizing a high-efficiency CO2 conversion cycle (the Allam cycle), and chemical looping) applied to natural gas, other fossil fuels (coal and heavy oil), and biomass/biofuel. Screening analysis is already under development and data for the analysis is being processed. Immediate actions on this task include preliminary economic and environmental analysis of power production technologies applied to natural gas. Data for catalytic reforming technology to produce hydrogen from natural gas is being collected and compiled in Microsoft Excel. The model will be expanded for exploring and comparing various technology scenarios to meet our goal. The primary focus of this study is to: 1) understand the chemic
Liquid CO 2/Coal Slurry for Feeding Low Rank Coal to Gasifiers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marasigan, Jose; Goldstein, Harvey; Dooher, John
2013-09-30
This study investigates the practicality of using a liquid CO2/coal slurry preparation and feed system for the E-Gas™ gasifier in an integrated gasification combined cycle (IGCC) electric power generation plant configuration. Liquid CO2 has several property differences from water that make it attractive for the coal slurries used in coal gasification-based power plants. First, the viscosity of liquid CO2 is much lower than that of water. This means it should take less energy to pump liquid CO2 through a pipe compared to water, and that a higher solids concentration can be fed to the gasifier, which should decrease the heat required to vaporize the slurry. Second, the heat of vaporization of liquid CO2 is about 80% lower than that of water. This means that less heat from the gasification reactions is needed to vaporize the slurry, which should result in less oxygen being needed to achieve a given gasifier temperature. And third, the surface tension of liquid CO2 is about 2 orders of magnitude lower than that of water, which should result in finer atomization of the liquid CO2 slurry, faster reaction times between the oxygen and coal particles, and better carbon conversion at the same gasifier temperature. EPRI and others have recognized the potential of liquid CO2 to improve the performance of an IGCC plant and have previously conducted systems-level analyses to evaluate this concept. These past studies have shown that a significant increase in IGCC performance can be achieved with liquid CO2 over water with certain gasifiers. Although these previous analyses produced some positive results, they were still based on various assumptions for liquid CO2/coal slurry properties.
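The heat-of-vaporization argument can be illustrated with simple arithmetic: at the same solids loading, the carrier mass per kg of coal is identical, so the vaporization duty scales directly with latent heat. The figures below are rough illustrative values, not design data from the study:

```python
def carrier_vaporization_heat(coal_kg, solids_frac, latent_kj_per_kg):
    """Heat (kJ) needed to vaporize the slurry carrier per coal_kg of
    coal at a given solids mass fraction. Latent heats below are rough
    illustrative figures: ~2260 kJ/kg for water at 1 atm, and about
    80% lower for liquid CO2, per the comparison in the abstract."""
    carrier_kg = coal_kg * (1.0 - solids_frac) / solids_frac
    return carrier_kg * latent_kj_per_kg

# Same 60 wt% solids loading for both carriers, 1 kg coal basis
q_water = carrier_vaporization_heat(1.0, 0.60, 2260.0)
q_co2 = carrier_vaporization_heat(1.0, 0.60, 2260.0 * 0.2)
print(round(q_water), round(q_co2))  # carrier vaporization duty, kJ/kg coal
```

Since that duty is supplied by burning part of the feed with oxygen, the lower carrier heat load is what drives the projected reduction in oxygen demand; the lower viscosity compounds the benefit by allowing a higher solids fraction in the first place.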
Frontal lobe morphometry with MRI in a normal age group of 6-17 year-olds.
Ilkay Koşar, M; Otağ, Ilhan; Sabancıoğulları, Vedat; Atalar, Mehmet; Tetiker, Hasan; Otağ, Aynur; Cimen, Mehmet
2012-12-01
Morphometric data on the frontal lobe are important for surgical planning of lesions in the frontal lobe and its surroundings. Magnetic resonance imaging (MRI) techniques provide suitable data for this purpose. In our study, morphometric data from mid-sagittal MRI of the frontal lobe in certain age and gender groups of children are presented. In a normal group of 6-17-year-old participants, the lengths of lines passing through predetermined points, including the frontal pole (FP), anterior commissure (AC), posterior commissure (PC), the outermost point of the corpus callosum genu (AGCC), the innermost point of the corpus callosum genu (IGCC), the tuberculum sellae (TS), and the points where lines through AGCC and IGCC parallel to the AC-PC line cross the frontal lobe surface (FCS), were measured in three age groups (6-9, 10-13 and 14-17 years) for each gender. The frontal lobe morphometric data were higher in males than in females. Frontal lobe measurements peak in the 10-13 age group in males and in the 6-13 age group in females. In boys, the length of FP-AC increases 4.1% in the 10-13 age group compared with the 6-9 age group, while this increase is 2.3% in girls. Differences between age and gender groups were determined. While the AGCC-IGCC length is 10.4% greater in adults, the AC-PC length in children aged 6-17 is 11.5% greater than in adults. These data will contribute to the preliminary assessment for developing a surgical plan for fine interventions in the frontal lobe and its surroundings in children.
Fuel-Flexible Gasification-Combustion Technology for Production of H2 and Sequestration-Ready CO2
DOE Office of Scientific and Technical Information (OSTI.GOV)
George Rizeq; Parag Kulkarni; Wei Wei
It is expected that in the 21st century the Nation will continue to rely on fossil fuels for electricity, transportation, and chemicals. It will be necessary to improve both the process efficiency and the environmental performance of fossil fuel utilization. GE Global Research is developing an innovative fuel-flexible Unmixed Fuel Processor (UFP) technology to produce H2, power, and sequestration-ready CO2 from coal and other solid fuels. The UFP module offers the potential for reduced cost, increased process efficiency relative to conventional gasification and combustion systems, and near-zero pollutant emissions, including NOx. GE was awarded a contract from U.S. DOE NETL to develop the UFP technology. Work on the Phase I program started in October 2000, and work on the Phase II effort started in April 2005. In the UFP technology, coal and air are simultaneously converted into separate streams of (1) high-purity hydrogen that can be utilized in fuel cells or turbines, (2) sequestration-ready CO2, and (3) high-temperature/pressure vitiated air to produce electricity in a gas turbine. The process produces near-zero emissions with an estimated efficiency higher than IGCC with conventional CO2 separation. The Phase I R&D program established the feasibility of the integrated UFP technology through lab-, bench- and pilot-scale testing and investigated operating conditions that maximize separation of CO2 and pollutants from the vent gas while simultaneously maximizing coal conversion efficiency and hydrogen production. The Phase I effort integrated experimental testing, modeling, and preliminary economic studies to demonstrate the UFP technology. The Phase II effort will focus on three high-risk areas: economics, sorbent attrition and lifetime, and product gas quality for turbines. The economic analysis will include estimating the capital cost as well as the costs of hydrogen and electricity for a full-scale UFP plant.
These costs will be benchmarked against IGCC polygen costs for plants of similar size. Sorbent attrition and lifetime will be addressed via bench-scale experiments that monitor sorbent performance over time and by assessing materials interactions at operating conditions. The product gas from the third reactor (high-temperature vitiated air) will be evaluated to assess the concentrations of particulates, pollutants and other impurities relative to the specifications required for gas turbine feed streams. This is the eighteenth quarterly technical progress report for the UFP program, which is supported by U.S. DOE NETL (Contract No. DE-FC26-00FT40974) and GE. This report summarizes program accomplishments for the Phase II period starting July 01, 2005 and ending September 30, 2005. The report includes an introduction summarizing the UFP technology, main program tasks, and program objectives; it also provides a summary of program activities and accomplishments covering progress in tasks including process modeling, scale-up, and economic analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berry, D.A.; Shoemaker, S.A.
1996-12-31
The Morgantown Energy Technology Center (METC) is currently evaluating hot gas desulfurization (HGD) in its on-site transport reactor facility (TRF). This facility was originally constructed in the early 1980s to explore advanced gasification processes with an entrained reactor, and has recently been modified to incorporate a transport riser reactor. The TRF supports Integrated Gasification Combined Cycle (IGCC) power systems, one of METC's advanced power generation systems. The HGD subsystem is a key developmental item in reducing the cost and increasing the efficiency of the IGCC concept. The TRF is a unique facility with high-temperature, high-pressure, and multiple-reactant gas composition capability. The TRF can be configured for reacting a single flow pass of gas and solids using a variety of gases. The gas input system allows six different gas inputs to be mixed and heated before entering the reaction zones. Current configurations allow the use of air, carbon dioxide, carbon monoxide, hydrogen, hydrogen sulfide, methane, nitrogen, oxygen, steam, or any mixture of these gases. Construction plans include the addition of a coal gas input line. This line will bring hot coal gas from the existing Fluidized-Bed Gasifier (FBG) via the Modular Gas Cleanup Rig (MGCR) after filtering out particulates with ceramic candle filters. Solids can be fed either by a rotary pocket feeder or a screw feeder. Particle sizes may range from 70 to 150 micrometers. Both feeders have a hopper that can hold enough solids for fairly lengthy tests at the higher feed rates, thus eliminating the need for lockhopper transfers during operation.
Degradation of TBC Systems in Environments Relevant to Advanced Gas Turbines for IGCC Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gleeson, Brian
2014-09-30
Air plasma sprayed (APS) thermal barrier coatings (TBCs) are used to provide thermal insulation for the hottest components in gas turbines. Zirconia stabilized with 7 wt% yttria (7YSZ) is the most common ceramic top coat used for turbine blades. The 7YSZ coating can be degraded by the buildup of fly-ash deposits created in the power-generation process. Fly ash from an integrated gasification combined cycle (IGCC) system can result from coal-based syngas. TBCs are also exposed to harsh gas environments containing CO2, SO2, and steam. Degradation from the combined effects of fly ash and harsh gas atmospheres has the potential to severely limit TBC lifetimes. The main objective of this study was to use lab-scale testing to systematically elucidate the interplay between prototypical deposit chemistries (i.e., ash and its constituents, K2SO4, and FeS) and environmental oxidants (i.e., O2, H2O, and CO2) on the degradation behavior of advanced TBC systems. Several mechanisms of early TBC failure were identified, as were the specific fly-ash constituents responsible for degradation. The reactivity of MCrAlY bondcoats used in TBC systems was also investigated. The specific roles of oxide and sulfate components were assessed, together with the complex interplay between gas composition, deposit chemistry, and alloy reactivity. Bondcoat composition design strategies to mitigate corrosion were established, particularly with regard to controlling phase constitution and the amount of reactive elements the bondcoat contains in order to achieve optimal corrosion resistance.
NANOMATERIAL SOLUTIONS FOR HOT COAL GAS CLEANUP - PHASE I
Integrated gasification combined cycle (IGCC) is a coal gasification technique that efficiently uses the hot (900-1500°C) syngas it generates to power both steam and gas turbines. Due to regulations, this syngas must be free of sulfur, and purification is normally carried ...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-19
... supply (i.e., reclaimed effluent from municipal wastewater treatment) pipeline, a natural gas pipeline... the reclaimed effluent, natural gas, and CO 2 pipelines may cause temporary direct impacts to the... target today's most pressing environmental challenges, including reducing mercury and greenhouse gas (GHG...
Integrated gasification combined cycle (IGCC), which uses a gasifier to convert coal to fuel gas and then uses a combined cycle power block to generate electricity, is one of the most promising technologies for generating electricity from coal in an environmentally sustainabl...
40 CFR 60.45Da - Standard for mercury (Hg).
Code of Federal Regulations, 2010 CFR
2010-07-01
...-fired electric utility steam generating unit that burns only lignite, you must not discharge into the... (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Electric Utility... for mercury (Hg). (a) For each coal-fired electric utility steam generating unit other than an IGCC...
Model Based Optimal Sensor Network Design for Condition Monitoring in an IGCC Plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumar, Rajeeva; Kumar, Aditya; Dai, Dan
2012-12-31
This report summarizes the achievements and final results of this program. The objective of this program is to develop a general model-based sensor network design methodology and tools to address key issues in the design of an optimal sensor network configuration: the type, location, and number of sensors used in a network for online condition monitoring. In particular, the focus in this work is to develop software tools for optimal sensor placement (OSP) and use these tools to design optimal sensor network configurations for online condition monitoring of gasifier refractory wear and radiant syngas cooler (RSC) fouling. The methodology developed will be applicable to sensing system design for online condition monitoring for a broad range of applications. The overall approach consists of (i) defining condition monitoring requirements in terms of OSP and mapping these requirements in mathematical terms for the OSP algorithm, (ii) analyzing trade-offs of alternate OSP algorithms, down-selecting the most relevant ones, and developing them for IGCC applications, (iii) enhancing the gasifier and RSC models as required by the OSP algorithms, and (iv) applying the developed OSP algorithms to design the optimal sensor network required for condition monitoring of an IGCC gasifier refractory and RSC fouling. Two key requirements for OSP for condition monitoring are the desired precision for the monitoring variables (e.g., refractory wear) and the reliability of the proposed sensor network in the presence of expected sensor failures. The OSP problem is naturally posed within a Kalman filtering approach as an integer programming problem where the key requirements of precision and reliability are imposed as constraints. The optimization is performed over the overall network cost. Based on an extensive literature survey, two formulations were identified as being relevant to OSP for condition monitoring: one based on an LMI formulation and the other being a standard INLP formulation.
Various algorithms to solve these two formulations were developed and validated. For a given OSP problem, the computational efficiency largely depends on the “size” of the problem. Initially, a simplified 1-D gasifier model assuming axial and azimuthal symmetry was used to test out the various OSP algorithms. Finally, these algorithms were used to design the optimal sensor network for condition monitoring of IGCC gasifier refractory wear and RSC fouling. The sensor types and locations obtained as solutions to the OSP problem were validated using a model-based sensing approach. The OSP algorithm has been developed in modular form and has been packaged as a software tool for OSP design, in which a designer can explore various OSP design algorithms in a user-friendly way. The OSP software tool is implemented in-house in Matlab/Simulink©. The tool also uses a few optimization routines that are freely available on the World Wide Web. In addition, a modular Extended Kalman Filter (EKF) block has been developed in Matlab/Simulink© which can be utilized for model-based sensing of important process variables that are not directly measured, by combining the online sensors with model-based estimation once the hardware sensors and their locations have been finalized. The OSP algorithm details and the results of applying these algorithms to obtain optimal sensor locations for condition monitoring of gasifier refractory wear and RSC fouling profiles are summarized in this final report.
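The constrained integer-programming idea described above can be illustrated with a brute-force sketch: choose the cheapest sensor subset whose steady-state Kalman-filter error variance meets a precision bound. The system matrices, candidate sensors, and costs below are illustrative placeholders, not the report's gasifier/RSC models or its LMI/INLP algorithms, and no reliability constraint is modeled.

```python
# Sketch: OSP posed as "cheapest sensor subset meeting a Kalman-filter
# precision constraint". Matrices and costs are illustrative only.
import itertools
import numpy as np

def steady_posterior_cov(A, Q, H, R, iters=500):
    """Iterate the discrete Riccati recursion to (near) steady state."""
    P = np.eye(A.shape[0])
    for _ in range(iters):
        Pp = A @ P @ A.T + Q                      # predict
        S = H @ Pp @ H.T + R
        K = Pp @ H.T @ np.linalg.inv(S)           # Kalman gain
        P = (np.eye(A.shape[0]) - K @ H) @ Pp     # update
    return P

A = np.array([[0.95, 0.05], [0.0, 0.90]])         # slow "wear-like" dynamics
Q = 0.01 * np.eye(2)
candidates = [np.array([[1.0, 0.0]]),             # sensor 0: sees state 0
              np.array([[0.0, 1.0]]),             # sensor 1: sees state 1
              np.array([[1.0, 1.0]])]             # sensor 2: sees the sum
cost = [1.0, 1.0, 2.5]
precision = 0.05                                   # max allowed error variance

best = None
for r in range(1, len(candidates) + 1):
    for subset in itertools.combinations(range(len(candidates)), r):
        H = np.vstack([candidates[i] for i in subset])
        R = 0.01 * np.eye(len(subset))
        P = steady_posterior_cov(A, Q, H, R)
        if np.max(np.diag(P)) <= precision:        # precision constraint
            c = sum(cost[i] for i in subset)
            if best is None or c < best[0]:
                best = (c, subset)

print(best)
```

A real OSP tool would replace the exhaustive enumeration with the LMI or INLP machinery the report describes, but the feasibility test per candidate network is the same.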
Solar TiO2-assisted photocatalytic degradation of IGCC power station effluents using a Fresnel lens.
Monteagudo, J M; Durán, A; Guerra, J; García-Peña, F; Coca, P
2008-03-01
The heterogeneous TiO2-assisted photocatalytic degradation of wastewater from a thermoelectric power station under concentrated solar light irradiation using a Fresnel lens has been studied. The efficiency of photocatalytic degradation was determined from the analysis of cyanide and formate removal. Firstly, the influence of the initial concentrations of H2O2 and TiO2 on the degradation kinetics of cyanides and formates was studied based on a factorial experimental design. Experimental kinetic constants were fitted using neural networks. Results showed that the photocatalytic process was effective for cyanide destruction (mainly following a molecular mechanism), whereas most formates (degraded mainly via a radical path) remained unaffected. Finally, to improve formate degradation, the effect of lowering pH on the degradation rate was evaluated after complete cyanide destruction. The photooxidation efficiency of formates reaches a maximum at pH around 5-6. Above pH 6, the formate anion is subject to electrostatic repulsion by the negative surface of TiO2. At pH<4.5, formate adsorption and photon absorption are reduced due to some catalyst agglomeration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mobed, Parham; Pednekar, Pratik; Bhattacharyya, Debangsu
Design and operation of energy producing, near “zero-emission” coal plants has become a national imperative. This report on model-based sensor placement describes a transformative two-tier approach to identify the optimum placement, number, and type of sensors for condition monitoring and fault diagnosis in fossil energy system operations. The algorithms are tested on a high-fidelity model of the integrated gasification combined cycle (IGCC) plant. For a condition monitoring network, whether equipment should be considered at a unit level or a systems level depends upon the criticality of the process equipment, its likelihood of failure, and the level of resolution desired for any specific failure. Because of the presence of a high-fidelity model at the unit level, a sensor network can be designed to monitor the spatial profile of the states and estimate fault severity levels. In an IGCC plant, besides the gasifier, the sour water gas shift (WGS) reactor plays an important role. In view of this, condition monitoring of the sour WGS reactor is considered at the unit level, while a detailed plant-wide model of the gasification island, including the sour WGS reactor and the Selexol process, is considered for fault diagnosis at the system level. Finally, the developed algorithms unify the two levels and identify an optimal sensor network that maximizes the effectiveness of the overall system-level fault diagnosis and component-level condition monitoring. This work could have a major impact on the design and operation of future fossil energy plants, particularly at the grassroots level where the sensor network is yet to be identified. In addition, the same algorithms developed in this report can be further enhanced for use in retrofits, where the objectives could be upgrade (addition of more sensors) and relocation of existing sensors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paul, Prokash; Bhattacharyya, Debangsu; Turton, Richard
2017-06-06
Here, a novel sensor network design (SND) algorithm is developed for maximizing process efficiency while minimizing sensor network cost for a nonlinear dynamic process with an estimator-based control system. The multiobjective optimization problem is solved following a lexicographic approach where the process efficiency is maximized first, followed by minimization of the sensor network cost. The partial net present value, which combines the capital cost due to the sensor network and the operating cost due to deviation from the optimal efficiency, is proposed as an alternative objective. The unscented Kalman filter is considered as the nonlinear estimator. The large-scale combinatorial optimization problem is solved using a genetic algorithm. The developed SND algorithm is applied to an acid gas removal (AGR) unit as part of an integrated gasification combined cycle (IGCC) power plant with CO2 capture. Due to the computational expense, a reduced-order nonlinear model of the AGR process is identified and parallel computation is performed during implementation.
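As a rough illustration of the lexicographic approach (best efficiency first, then cheapest network), the toy sketch below brute-forces a handful of candidate sensors instead of using a genetic algorithm; the sensor names, costs, and efficiency surrogate are invented for illustration and are not taken from the AGR model.

```python
# Sketch: lexicographic sensor network design -- maximize process efficiency
# first, then minimize network cost among designs that (nearly) achieve it.
import itertools

sensors = {"T1": 5.0, "P1": 3.0, "F1": 4.0, "C1": 8.0}    # sensor -> cost

def efficiency(network):
    # Toy surrogate: each sensor contributes a fixed efficiency gain.
    gains = {"T1": 1.2, "P1": 0.6, "F1": 0.9, "C1": 1.5}
    return 88.0 + sum(gains[s] for s in network)           # percent

designs = [frozenset(c) for r in range(len(sensors) + 1)
           for c in itertools.combinations(sensors, r)]

# Stage 1: best achievable efficiency over all candidate networks.
best_eff = max(efficiency(d) for d in designs)

# Stage 2: cheapest design within a tolerance of the best efficiency.
tol = 1.0   # percentage points of efficiency we are willing to give up
feasible = [d for d in designs if efficiency(d) >= best_eff - tol]
best_design = min(feasible, key=lambda d: sum(sensors[s] for s in d))

print(best_eff, sorted(best_design), sum(sensors[s] for s in best_design))
```

With these numbers the second stage trades away the weakest sensor (F1) while keeping efficiency within the tolerance, which is exactly the cost-versus-performance trade the lexicographic ordering formalizes.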
Using an operator training simulator in the undergraduate chemical engineering curriculum
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhattacharyya, D.; Turton, R.; Zitney, S.
2012-01-01
An operator training simulator (OTS) is to the chemical engineer what a flight simulator is to the aerospace engineer. The basis of an OTS is a high-fidelity dynamic model of a chemical process that allows an engineer to simulate start-up, shut-down, and normal operation. It can also be used to test the skill and ability of an engineer or operator to respond to and control some unforeseen situation(s) through the use of programmed malfunctions. West Virginia University (WVU) is a member of the National Energy Technology Laboratory’s Regional University Alliance (NETL-RUA). Working through the NETL-RUA, the authors have spent the last four years collaborating on the development of a high-fidelity OTS for an Integrated Gasification Combined Cycle (IGCC) power plant with CO2 capture that is the cornerstone of the AVESTAR™ (Advanced Virtual Energy Simulation Training And Research) Center, with sister facilities at NETL and WVU in Morgantown, WV. This OTS is capable of real-time dynamic simulation of IGCC plant operation, including start-up, shut-down, and power demand load following. The dynamic simulator and its human machine interfaces (HMIs) are based on the DYNSIM and InTouch software, respectively, from Invensys Operations Management. The purpose of this presentation is to discuss the authors’ experiences in using this sophisticated dynamic simulation-based OTS as a hands-on teaching tool in the undergraduate chemical engineering curriculum. At present, the OTS has been used in two separate courses: a new process simulation course and a traditional process control course. In the process simulation course, concepts of steady-state and dynamic simulation were covered prior to exposing the students to the OTS. Moreover, digital logic and the concept of equipment requiring one or more permissive states to be enabled prior to successful operation were also covered.
Students were briefed about start-up procedures and the importance of following a predetermined sequence of actions in order to start up the plant successfully. Student experience with the dynamic simulator consisted of a six-hour training session in which the Claus sulfur capture unit of the IGCC plant was started up. The students were able to operate the simulator through the InTouch-based HMI displays and to study and understand the underlying dynamic modeling approach used in the DYNSIM-based simulator. The concepts learned during the training sessions were further reinforced when students developed their own DYNSIM models for a chemical process and wrote a detailed start-up procedure. In the process control course, students learned how the plant responds dynamically to changes in the manipulated inputs, as well as how the control system impacts plant performance, stability, robustness, and disturbance rejection characteristics. The OTS provided the opportunity to study the dynamics of complicated, “real-life” process plants consisting of hundreds of pieces of equipment. Students implemented ideal forcing functions, tracked the time delay through the entire plant, studied the response of open-loop unstable systems, and learned “good practices” in control system design by taking into account real-world events where significant deviations from the “ideal” or “expected” response can occur. The theory of closed-loop stability was reinforced by implementing the limiting proportional gain for stability limits of real plants. Finally, students were divided into several groups, with each group tasked to control a section of the plant within a set of operating limits in the face of disturbances and simulated process faults. At the end of this test, they suggested ways to improve the control system performance based on the theory they learned in class and the hands-on experience they gained while working on the OTS.
Decontamination of industrial cyanide-containing water in a solar CPC pilot plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duran, A.; Monteagudo, J.M.; San Martin, I.
2010-07-15
The aim of this work was to improve the quality of the wastewater effluent coming from an Integrated Gasification Combined-Cycle (IGCC) power station to meet future environmental legislation. This study examined a homogeneous photocatalytic oxidation process using concentrated solar UV energy (UV/Fe(II)/H2O2) in a solar Compound Parabolic Collector (CPC) pilot plant. The efficiency of the process was evaluated by analysis of the oxidation of cyanides and Total Organic Carbon (TOC). A factorial experimental design allowed the determination of the influences of the operating variables (initial concentrations of H2O2, oxalic acid, and Fe(II), and pH) on the degradation kinetics. Temperature and UV-A solar power were also included in the neural network fittings. The pH was maintained at a value >9.5 during cyanide oxidation to avoid the formation of gaseous HCN and was later lowered to enhance mineralization. Under the optimum conditions ([H2O2] = 2000 ppm, [Fe(II)] = 8 ppm, pH = 3.3 after cyanide oxidation, and [(COOH)2] = 60 ppm), it was possible to degrade 100% of the cyanides and up to 92% of the Total Organic Carbon.
Process for CO2 capture using zeolites from high pressure and moderate temperature gas streams
Siriwardane, Ranjani V [Morgantown, WV]; Stevens, Robert W [Morgantown, WV]
2012-03-06
A method for separating CO2 from a gas stream comprised of CO2 and other gaseous constituents using a zeolite sorbent in a swing-adsorption process, producing a high-temperature CO2 stream at a higher CO2 pressure than the input gas stream. The method utilizes CO2 desorption in a CO2 atmosphere and effectively integrates heat transfer to optimize overall efficiency. H2O adsorption does not preclude effective operation of the sorbent. The cycle may be incorporated in an IGCC for efficient pre-combustion CO2 capture. A particular application operates on shifted syngas at a temperature exceeding 200 °C and produces a dry CO2 stream at low temperature and high CO2 pressure, greatly reducing any compression energy requirements which may subsequently be required.
Recovery of gallium and vanadium from gasification fly ash.
Font, Oriol; Querol, Xavier; Juan, Roberto; Casado, Raquel; Ruiz, Carmen R; López-Soler, Angel; Coca, Pilar; García Peña, Francisco
2007-01-31
The Puertollano Integrated Coal Gasification Combined Cycle (IGCC) Power Plant (Spain) fly ash is characterized by a relatively high content of Ga and V, which occur mainly as Ga2O3 and as Ga3+ and V3+ substituting for Al3+ in the Al-Si fly ash glass matrix. Investigations focused on evaluating the potential recovery of Ga and V from these fly ashes. Several NaOH-based extraction tests were performed on the IGCC fly ash at different temperatures, NaOH/fly ash (NaOH/FA) ratios, NaOH concentrations, and extraction times. The optimal Ga extraction conditions were determined as 25 °C, NaOH 0.7-1 M, a NaOH/FA ratio of 5 L/kg, and 6 h, attaining Ga extraction yields of 60-86%, equivalent to 197-275 mg of Ga/kg of fly ash. Re-circulation of leachates increased initial Ga concentrations (25-38 mg/L) to 188-215 mg/L, while reducing both the content of impurities and NaOH consumption. Carbonation of the concentrated Ga leachate demonstrated that 99% of the bulk Ga content in the leachate precipitates at pH 7.4. At pH 10.5, significant proportions of impurities, mainly Al (91%), co-precipitate while >98% of the bulk Ga remains in solution. A second carbonation of the remaining solution (at pH 7.5) recovers 98.8% of the bulk Ga. Re-dissolution (at pH 0) of the precipitate increases Ga purity from 7 to 30%, this being a suitable Ga end product for further purification by electrolysis. This method yields higher recovery efficiency than the processes currently applied for Ga on an industrial scale. In contrast, low V extraction yields (<64%) were obtained even when using extreme alkaline extraction conditions, which, given the current market price of this element, considerably limits the feasibility of V recovery from IGCC fly ash.
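Taking the upper-end figures reported above at face value, the overall Ga recovery through the leach and the two carbonation steps multiplies out as a short worked example; the chaining assumes the step yields are independent, which the abstract does not state explicitly.

```python
# Worked example: overall Ga recovery across the reported steps, using the
# upper-end figures from the abstract (86% leach yield, 98% Ga retained
# through the impurity-removal carbonation at pH 10.5, 98.8% precipitated
# in the final carbonation at pH 7.5).
leach = 0.86
retained = 0.98       # Ga left in solution after the pH 10.5 carbonation
precipitated = 0.988  # Ga recovered by the second carbonation

overall = leach * retained * precipitated
print(f"overall Ga recovery: {overall:.1%}")   # roughly 83%
```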
Highly Attrition Resistant Zinc Oxide-Based Sorbents for H2S Removal by Spray Drying Technique
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryu, C.K.; Lee, J.B.; Ahn, D.H.
2002-09-19
Primary issues for the fluidized-bed/transport reactor process are a highly attrition resistant sorbent, its high sorption capacity and regenerability, durability, and cost. The overall objective of this project is the development of a superior attrition resistant zinc oxide-based sorbent for hot gas cleanup in integrated coal gasification combined cycle (IGCC) systems. Sorbents applicable to a fluidized-bed hot gas desulfurization process must have a high attrition resistance to withstand the fast solid circulation between a desulfurizer and a regenerator, fast kinetic reactions, and high sulfur sorption capacity. The oxidative regeneration of zinc-based sorbents is usually initiated at greater than 600 °C; its highly exothermic nature causes deactivation of the sorbent as well as complication of the sulfidation process by side reactions. Focusing on solving the attrition and regenerability issues of zinc oxide-based sorbents, we have adapted multi-binder matrices and direct incorporation of a regeneration promoter. The sorbent forming was done with a spray drying technique that is easily scalable to commercial quantities.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-10
.... The IgCC is intended to provide a green model building code provisions for new and existing commercial... DEPARTMENT OF ENERGY 10 CFR Part 430 [Docket No. EERE-2011-BT-BC-0009] Building Energy Codes Program: Presenting and Receiving Comments to DOE Proposed Changes to the International Green Construction...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Middleton, Richard Stephen
2017-05-22
This presentation is part of the US-China Clean Coal project and describes the impact of power plant cycling, techno-economic modeling of combined IGCC and CCS, integrated capacity generation decision making for power utilities, and a new decision support tool for integrated assessment of CCUS.
40 CFR 60.50Da - Compliance determination procedures and methods.
Code of Federal Regulations, 2013 CFR
2013-07-01
... effluent is saturated or laden with water droplets. (2) The Fc factor (CO2) procedures in Method 19 of... operator of an electric utility combined cycle gas turbine that does not meet the definition of an IGCC... of this part. The SO2 and NOX emission rates calculations from the gas turbine used in Method 19 of...
40 CFR 60.50Da - Compliance determination procedures and methods.
Code of Federal Regulations, 2014 CFR
2014-07-01
... effluent is saturated or laden with water droplets. (2) The Fc factor (CO2) procedures in Method 19 of... operator of an electric utility combined cycle gas turbine that does not meet the definition of an IGCC... of this part. The SO2 and NOX emission rates calculations from the gas turbine used in Method 19 of...
Multi-stage circulating fluidized bed syngas cooling
Liu, Guohai; Vimalchand, Pannalal; Guan, Xiaofeng; Peng, WanWang
2016-10-11
A method and apparatus for cooling hot gas streams in the temperature range 800 °C to 1600 °C using multi-stage circulating fluidized bed (CFB) coolers is disclosed. The invention relates to cooling the hot syngas from coal gasifiers, in which the hot syngas entrains substances that foul, erode, and corrode heat transfer surfaces upon contact in conventional coolers. The hot syngas is cooled by extracting and indirectly transferring heat to heat transfer surfaces with circulating inert solid particles in CFB syngas coolers. The CFB syngas coolers are staged to facilitate generation of steam at multiple conditions and of hot boiler feed water, which are necessary for power generation in an IGCC process. The multi-stage syngas cooler can include internally circulating fluidized bed coolers, externally circulating fluidized bed coolers, and hybrid coolers that incorporate features of both. Higher process efficiencies can be realized as the invention can handle hot syngas from various types of gasifiers without the need for a less efficient precooling step.
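The staging idea, i.e. matching successive temperature bands of the syngas to different steam conditions, can be sketched with a simple sensible-heat balance. The flow rate, mean heat capacity, and stage temperatures below are illustrative assumptions, not values from the patent.

```python
# Sketch: staged energy balance for a multi-stage syngas cooler.
# All numbers are illustrative placeholders.
m_dot = 50.0        # syngas mass flow, kg/s
cp = 1.5            # mean specific heat, kJ/(kg*K)

# Cool from 1400 C to 300 C in three CFB stages, each matched to a
# different heat sink (HP steam, IP steam, boiler feed water preheat).
stages = [("HP steam", 1400.0, 900.0),
          ("IP steam", 900.0, 550.0),
          ("BFW preheat", 550.0, 300.0)]

# Duty per stage: Q = m_dot * cp * (T_in - T_out), in kW.
duties = {name: m_dot * cp * (t_in - t_out) for name, t_in, t_out in stages}
total = sum(duties.values())
for name, q in duties.items():
    print(f"{name}: {q/1000:.1f} MW")
print(f"total: {total/1000:.1f} MW")
```

The point of the staging is visible in the split: most of the duty sits in the hottest band, where high-pressure steam can be raised, while the coolest band is only good for feedwater preheat.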
Coal-Gen attendees hear there's no magic bullet
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2007-09-15
Those attending COAL-GEN 2007 in August heard that there is no magic bullet for meeting the energy and infrastructure needs facing the USA. The article reports on the conference, which addressed topics including the development of a supercritical circulating fluidized bed coal unit, IGCC projects, the importance of including carbon capture and sequestration, and the need to attract and train personnel to work in the power industry. 3 photos.
Follow-up after gastrectomy for cancer: the Charter Scaligero Consensus Conference.
Baiocchi, Gian Luca; D'Ugo, Domenico; Coit, Daniel; Hardwick, Richard; Kassab, Paulo; Nashimoto, Atsushi; Marrelli, Daniele; Allum, William; Berruti, Alfredo; Chandramohan, Servarayan Murugesan; Coburn, Natalie; Gonzàlez-Moreno, Santiago; Hoelscher, Arnulf; Jansen, Edwin; Leja, Marcis; Mariette, Christophe; Meyer, Hans-Joachim; Mönig, Stefan; Morgagni, Paolo; Ott, Katia; Preston, Shaun; Rha, Sun Young; Roviello, Franco; Sano, Takeshi; Sasako, Mitsuru; Shimada, Hideaki; Schuhmacher, Cristoph; So Bok-Yan, Jimmy; Strong, Vivian; Yoshikawa, Takaki; Terashima, Masanori; Ter-Ovanesov, Michail; Van der Velde, Cornelis; Memo, Maurizio; Castelli, Francesco; Pecorelli, Sergio; Detogni, Claudio; Kodera, Yasuhiro; de Manzoni, Giovanni
2016-01-01
Presently, there is no scientific evidence supporting a definite role for follow-up after gastrectomy for cancer, and clinical practices are quite different around the globe. The aim of this consensus conference was to present an ideal prototype of follow-up after gastrectomy for cancer, based on shared experiences and taking into account the need to rationalize the diagnostic course without losing the possibility of detecting local recurrence at a potentially curable stage. On June 19-22, 2013 in Verona (Italy), during the 10th International Gastric Cancer Congress (IGCC) of the International Gastric Cancer Association, a consensus meeting was held, concluding a 6-month, Web-based, consensus conference entitled "Rationale of oncological follow-up after gastrectomy for cancer." Forty-eight experts, with a geographical distribution reflecting different health cultures worldwide, participated in the consensus conference, and 39 attended the consensus meeting. Six statements were finally approved, displayed in a plenary session and signed by the vast majority of the 10th IGCC participants. These statements are attached as an annex to the Charter Scaligero on Gastric Cancer. After gastrectomy for cancer, oncological follow-up should be offered to patients; it should be tailored to the stage of the disease, mainly based on cross-sectional imaging, and should be discontinued after 5 years.
NASA Astrophysics Data System (ADS)
Robinson, Patrick J.
Gasification has been used in industry on a relatively limited scale for many years, but it is emerging as the premier unit operation in the energy and chemical industries. The switch from expensive and insecure petroleum to solid hydrocarbon sources (coal and biomass) is occurring due to the vast amount of domestic solid resources, national security and global warming issues. Gasification (or partial oxidation) is a vital component of "clean coal" technology. Sulfur and nitrogen emissions can be reduced, overall energy efficiency is increased and carbon dioxide recovery and sequestration are facilitated. Gasification units in an electric power generation plant produce a fuel gas for driving combustion turbines. Gasification units in a chemical plant generate synthesis gas, which can be used to produce a wide spectrum of chemical products. Future plants are predicted to be hybrid power/chemical plants with gasification as the key unit operation. The coupling of an Integrated Gasification Combined Cycle (IGCC) with a methanol plant can handle swings in power demand by diverting hydrogen gas from a combustion turbine and synthesis gas from the gasifier to a methanol plant for the production of an easily-stored, hydrogen-consuming liquid product. An additional control degree of freedom is provided with this hybrid plant, fundamentally improving the controllability of the process. The idea is to base-load the gasifier and use the more responsive gas-phase units to handle disturbances. During the summer days, power demand can fluctuate up to 50% over a 12-hour period. The winter provides a different problem where spikes of power demand can go up 15% within the hour. The following dissertation develops a hybrid IGCC / methanol plant model, validates the steady-state results with a National Energy Technical Laboratory study, and tests a proposed control structure to handle these significant disturbances. 
All modeling was performed in the widely used chemical process simulators Aspen Plus and Aspen Dynamics. This dissertation first presents a simple approximate method for achieving the objective of having a gasifier model that can be exported into Aspen Dynamics. Limitations in the software dealing with solids make this a necessary task. The basic idea is to use a high molecular weight hydrocarbon that is present in the Aspen library as a pseudo fuel. For many plantwide dynamic studies, a rigorous high-fidelity dynamic model of the gasifier is not needed because its dynamics are very fast and the gasifier gas volume is a relatively small fraction of the total volume of the entire plant. The proposed approximate model captures the essential macro-scale thermal, flow, composition, and pressure dynamics. This work does not attempt to optimize the design or control of gasifiers, but merely presents an idea of how to dynamically simulate coal gasification in an approximate way. This dissertation also presents models of the downstream units of a typical IGCC. Dynamic simulations of the H2S absorption/stripping unit, water-gas shift (WGS) reactors, and CO2 absorption/stripping unit are essential for the development of stable and agile plantwide control structures for this hybrid power/chemical plant. Due to the high pressure of the system, hydrogen sulfide is removed by means of physical absorption. SELEXOL™ (a mixture of the dimethyl ethers of polyethylene glycol) is used to achieve a gas purity of less than 5 ppm H2S. This desulfurized synthesis gas is sent to two water gas shift reactors that convert a total of 99% of the carbon monoxide to hydrogen. Physical absorption of carbon dioxide with Selexol produces a hydrogen-rich stream (90 mol% H2) to be fed into combustion turbines or to a methanol plant. Steady-state economic designs and plantwide control structures are developed in this dissertation.
A steady-state economic design, control structure, and successful turndown of the methanol plant are shown in this dissertation. The plantwide control structure and interaction among units are also shown. The methanol plant was sized to handle a reduction of the power generation from an IGCC by 50%, producing a high-purity methanol stream of 99.5 mol%. Advanced regulatory control structures were designed and play a significant role in the successful turndown of the methanol plant to 20% capacity. The cooled methanol reactor is controlled by the exit temperature instead of a peak temperature within the reactor. During times of low capacity and minimum vapor rate within the column, tray temperature is controlled by recycling some of the distillate and bottoms flow. The gasifier feed is held constant. The product hydrogen from the IGCC is fed to the combustion turbine as required by electric power demand. Synthesis gas fed into the methanol plant maintains the pressure of the hydrogen stream. Make-up hydrogen is also fed to the methanol plant to maintain stoichiometry via a flow ratio. This ratio is adjusted to hold the carbon monoxide composition of the recycle gas in the methanol plant. This dissertation also explores various methods of turning down distillation columns to very low capacity. Recycling flow back to the column was determined to be the best method. Inserting Langmuir-Hinshelwood-Hougen-Watson kinetics into Aspen was also demonstrated with an example.
Methodology for the assessment of oxygen as an energy carrier
NASA Astrophysics Data System (ADS)
Yang, Ming Wei
Because oxygen generation is energy-intensive, the electric power grid would benefit if the process consumed electric power only during low-demand periods. The question addressed in this study is therefore whether oxygen production and/or usage can be modified to achieve energy storage and/or transmission objectives at lower cost. The specific benefit to the grid would be a leveling, over time, of the demand profile, which would require less installed capacity. To track the availability of electricity, a compressed air storage unit is installed between the cryogenic distillation section and the main air compressor of the air separation unit. A profit-maximizing scheme for sizing the storage inventory and related equipment is developed; the optimal scheme is responsive to market conditions. Profits of steel-making, oxy-combustion, and IGCC plants with storage facilities can be higher than those of plants without storage, especially in a high-price market. The price-tracking feature of air storage integration will increase the profit margins of these plants and may push oxy-combustion and integrated gasification combined cycle processes into economic viability. Since oxygen is consumed at customer sites, it may be generated at remote locations and transported to where it is needed. An analysis of the energy losses and costs of oxygen transportation is conducted for various applications. The energy consumption of large-capacity, long-distance GOX and LOX pipelines is lower than that of small-capacity pipelines; however, the transportation losses and costs of GOX and LOX pipelines are still higher than those of electricity transmission.
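The price-tracking idea behind the compressed-air store can be illustrated with a simple threshold arbitrage: run the main air compressor harder and bank compressed air when power is cheap, and draw the store down when power is expensive, while the cryogenic section sees a constant supply. The hourly prices and equipment sizes below are assumptions for illustration only.

```python
# Illustrative price-tracking arbitrage for an air-separation-unit compressed
# air store. A median-price rule decides when to charge/discharge; all
# prices, capacities, and rates are assumed, not the study's values.
prices = [20, 18, 15, 14, 16, 25, 40, 55, 60, 58, 45, 30,
          28, 27, 35, 50, 65, 70, 62, 48, 35, 28, 24, 21]  # $/MWh, hourly
demand_mw = 50.0     # constant compressor load the cryo section needs (assumed)
cap_mwh = 200.0      # storage capacity, MWh-equivalent of compressed air
rate_mw = 50.0       # max extra charge/discharge rate (assumed)
threshold = sorted(prices)[len(prices) // 2]               # median-price rule

level, flat_cost, flex_cost = cap_mwh / 2, 0.0, 0.0
for p in prices:
    flat_cost += demand_mw * p                # no-storage baseline cost
    if p <= threshold and level < cap_mwh:    # cheap hour: overproduce, store air
        extra = min(rate_mw, cap_mwh - level)
        flex_cost += (demand_mw + extra) * p
        level += extra
    elif p > threshold and level > 0:         # expensive hour: draw the store down
        saved = min(rate_mw, level, demand_mw)
        flex_cost += (demand_mw - saved) * p
        level -= saved
    else:
        flex_cost += demand_mw * p
print(f"baseline ${flat_cost:,.0f} vs storage ${flex_cost:,.0f}")
```

Even this crude rule buys most of its energy in the cheap hours and ends the day with the store at least as full as it began, so the cost reduction over the flat-demand baseline is genuine.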
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-02-01
This appendix is a compilation of work done to predict overall cycle performance from gasifier to generator terminals. A spreadsheet has been generated for each case to show the flows within a cycle, giving the gaseous or solid composition, temperature, quantity, and heat content of each flow. Steam and gas turbine performance was predicted with the computer program GTPro. Outputs of all runs for each combined cycle reviewed have been added to this appendix, together with a process schematic displaying all flows predicted through GTPro and the spreadsheet. The numbered bubbles on the schematic correspond to the column headings of the spreadsheet.
Pre-Combustion Carbon Dioxide Capture by a New Dual Phase Ceramic-Carbonate Membrane Reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Jerry Y. S.
2015-01-31
This report documents the synthesis, characterization, and carbon dioxide permeation and separation properties of a new group of ceramic-carbonate dual-phase membranes, along with the results of a laboratory study of their application to the water-gas shift reaction with carbon dioxide separation. A series of ceramic-carbonate dual-phase membranes with various oxygen-ionic or mixed ionic-electronic conducting metal oxide materials was developed in disk, tube, symmetric, and asymmetric geometric configurations. These membranes, with thicknesses of 10 μm to 1.5 mm, show CO2 permeance in the range of 0.5-5×10⁻⁷ mol·m⁻²·s⁻¹·Pa⁻¹ at 500-900°C and measured CO2/N2 selectivity of up to 3000. The CO2 permeation mechanism and the factors that affect CO2 permeation through the dual-phase membranes have been identified, a reliable CO2 permeation model was developed, and a robust method was established for optimizing the microstructure of ceramic-carbonate membranes. The ceramic-carbonate membranes exhibit high stability for high-temperature CO2 separations and the water-gas shift reaction. The water-gas shift reaction in dual-phase membrane reactors was studied by both modeling and experiments; high-temperature syngas water-gas shift in a tubular ceramic-carbonate dual-phase membrane reactor is found to be feasible even without catalyst. The membrane reactor exhibits good CO2 permeation flux, high thermal and chemical stability, and high thermal shock resistance. Reaction and separation conditions in the membrane reactor to produce hydrogen of 93% purity and a CO2 stream of >95% purity, with 90% CO2 capture, have been identified. Integration of the ceramic-carbonate dual-phase membrane reactor with the IGCC process for carbon dioxide capture was analyzed, and a methodology was developed to identify optimum operating conditions for a membrane tube of given dimensions treating coal syngas with targeted performance.
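The reported permeance range translates directly into membrane area requirements, since flux is permeance times the CO2 partial-pressure difference across the membrane. The partial pressures and the 1 kg/s capture duty below are assumed for illustration; only the permeance value comes from the reported range.

```python
# Back-of-envelope membrane sizing from the reported permeance range:
# flux = permeance * (p_feed - p_permeate). Pressures and duty are assumed.
permeance = 5e-7        # mol m^-2 s^-1 Pa^-1 (upper end of reported range)
p_feed = 8e5            # CO2 partial pressure, feed side, Pa (assumed)
p_perm = 1e5            # CO2 partial pressure, permeate side, Pa (assumed)
flux = permeance * (p_feed - p_perm)          # mol m^-2 s^-1
duty = 1.0 * 1000 / 44.01                     # 1 kg/s of CO2 in mol/s
area = duty / flux                            # membrane area required, m^2
print(f"flux = {flux:.3f} mol/(m2*s), area for 1 kg/s CO2 = {area:.0f} m2")
```

At the upper end of the permeance range and a 7 bar driving force, a 1 kg/s capture duty needs on the order of 65 m² of membrane; at the lower end of the range the area grows tenfold, which is why the report stresses permeance improvement.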
The calculation results show that the dual-phase membrane reactor could improve IGCC process efficiency, but at the current CO2 permeance of the membranes the cost of the membrane reactor is high. Further research should be directed toward improving membrane performance and developing cost-effective, scalable methods for fabricating dual-phase membranes and membrane reactors.
Co-gasification of solid waste and lignite - a case study for Western Macedonia.
Koukouzas, N; Katsiadakis, A; Karlopoulos, E; Kakaras, E
2008-01-01
Co-gasification of solid waste and coal is a very attractive and efficient way of generating power, and also an alternative, apart from conventional technologies such as incineration and landfill, for treating waste materials. Co-gasification can yield very clean power plants running on a wide range of solid fuels, but it faces considerable economic and environmental challenges. The aim of this study is to present the existing co-gasification techniques and projects for coal and solid wastes and to investigate the techno-economic feasibility of installing and operating a 30 MWe co-gasification power plant based on integrated gasification combined cycle (IGCC) technology, using lignite and refuse-derived fuel (RDF), in the Western Macedonia prefecture (WMP), Greece. The gasification block was based on the British Gas-Lurgi (BGL) gasifier, while the gas clean-up block was based on cold gas purification. The competitive advantages of co-gasification systems lie in their fuel feedstock and production flexibility as well as in their environmentally sound operation. The process also offers the benefit of commercial application of its by-products, gasification slag and elemental sulphur. Co-gasification of coal and waste can be performed through parallel or direct gasification. Direct gasification is a viable choice for installations with capacities above 350 MWe. Parallel gasification, without extensive treatment of the produced gas, is recommended for small- to medium-size gasifiers installed in regions where coal-fired power plants operate. The preliminary cost estimation indicated that establishing an IGCC RDF/lignite plant in the WMP region is not profitable, owing to the high specific capital investment and despite the lower fuel supply cost.
The technology of co-gasification is not yet mature, so high capital outlays are needed to set up a direct co-gasification plant. The estimated cost of electricity was not competitive with the prices prevailing in the Greek electricity market, and further economic evaluation is therefore required. The project would be acceptable if modular construction of the unit were first adopted near operating power plants, based on parallel co-gasification, gradually incorporating the remaining process steps (gas purification, power generation) with the aim of eventually establishing a true direct co-gasification plant.
[Tampa Electric Company IGCC project]. 1996 DOE annual technical report, January-December 1996
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-12-31
Tampa Electric Company's Polk Power Station Unit 1 (PPS-1) Integrated Gasification Combined Cycle (IGCC) demonstration project uses a Texaco pressurized, oxygen-blown, entrained-flow coal gasifier to convert approximately 2,000 tons per day of coal to syngas. The gasification plant is coupled with a combined cycle power block to produce a net 250 MW of electrical power. Coal is slurried in water, combined with 95% pure oxygen from an air separation unit, and sent to the gasifier to produce a high-temperature, high-pressure, medium-Btu syngas with a heat content of about 250 Btu/cf (HHV). The syngas then flows through a high-temperature heat recovery unit, which cools the syngas before it enters the cleanup systems. Molten coal ash flows from the bottom of the high-temperature heat recovery unit into a water-filled quench chamber, where it solidifies into a marketable slag by-product. Approximately 10% of the raw, hot syngas at 900°F is designed to pass through an intermittently moving bed of metal-oxide sorbent, which removes sulfur-bearing compounds from the syngas. PPS-1 will be the first unit in the world to demonstrate this advanced metal-oxide hot gas desulfurization technology at commercial scale. The emphasis during 1996 centered on start-up activities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alptekin, Gokhan
2013-02-15
Co-gasification of biomass and coal in large-scale Integrated Gasification Combined Cycle (IGCC) plants increases the efficiency and reduces the environmental impact of making synthesis gas ("syngas") for Coal-Biomass-to-Liquids (CBTL) processes that produce transportation fuels. However, the water-gas shift (WGS) and Fischer-Tropsch synthesis (FTS) catalysts used in these processes may be poisoned by multiple contaminants found in coal-biomass-derived syngas: sulfur species, trace toxic metals, halides, nitrogen species, vapors of alkali metals and their salts (e.g., KCl and NaCl), ammonia, and phosphorus. Thus, it is essential to develop a fundamental understanding of poisoning/inhibition mechanisms before investing in the development of any costly mitigation technologies. We therefore investigated the impact of potential contaminants (H2S, NH3, HCN, AsH3, PH3, HCl, NaCl, KCl, AS3, NH4NO3, NH4OH, KNO3, HBr, HF, and HNO3) on the performance and lifetime of commercially available and generic (prepared in-house) WGS and FT catalysts.
IGCC as BACT for Proposed Coal-fired Power Plant Projects
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations, including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are scanned or retyped versions of a paper photocopy of the original. Although considerable effort has been made to quality-assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
Singh, Rajinder P.; Dahe, Ganpat J.; Dudeck, Kevin W.; ...
2014-12-31
Sustainable reliance on hydrocarbon feedstocks for energy generation requires the development of CO₂ separation technology for energy-efficient carbon capture from industrial mixed gas streams. High-temperature, H₂-selective glassy polymer membranes are an attractive option for energy-efficient H₂/CO₂ separations in advanced power production schemes with integrated carbon capture: they enable high overall process efficiencies by providing energy-efficient CO₂ separation at process-relevant operating conditions and, correspondingly, minimized parasitic energy losses. Polybenzimidazole (PBI)-based materials have demonstrated commercially attractive H₂/CO₂ separation characteristics and exceptional tolerance to the operating conditions and chemical environments of hydrocarbon-fuel-derived synthesis gas (syngas). Realizing a commercially attractive carbon capture technology based on these PBI materials requires the development of high-performance, robust PBI hollow fiber membranes (HFMs). In this work, we discuss the outcomes of our recent efforts to demonstrate and optimize the fabrication and performance of PBI HFMs for use in pre-combustion carbon capture schemes. These efforts have resulted in PBI HFMs with commercially attractive fabrication protocols, defect-minimized structures, and commercially attractive permselectivity characteristics at conditions relevant to IGCC syngas processing. The H₂/CO₂ separation performance of the PBI HFMs presented in this document at realistic process conditions exceeds that of any other polymeric system reported to date.
Fuel-Flexible Gasification-Combustion Technology for Production of H2 and Sequestration-Ready CO2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parag Kulkarni; Jie Guan; Raul Subia
In the near future, the nation will continue to rely on fossil fuels for electricity, transportation, and chemicals. It is necessary to improve both the process efficiency and the environmental impact of fossil fuel utilization, including greenhouse gas management. GE Global Research (GEGR) investigated an innovative fuel-flexible Unmixed Fuel Processor (UFP) technology with the potential to produce H₂, power, and sequestration-ready CO₂ from coal and other solid fuels. The UFP technology offers the long-term potential for reduced cost, increased process efficiency relative to conventional gasification and combustion systems, and near-zero pollutant emissions. GE was awarded a contract from U.S. DOE NETL to investigate and develop the UFP technology. Work started on the Phase I program in October 2000 and on the Phase II effort in April 2005. In the UFP technology, coal, water, and air are simultaneously converted into (1) a hydrogen-rich stream that can be utilized in fuel cells or turbines, (2) a CO₂-rich stream for sequestration, and (3) a high-temperature/pressure vitiated air stream to produce electricity in a gas turbine expander. The process produces near-zero emissions with an estimated efficiency higher than that of an Integrated Gasification Combined Cycle (IGCC) process with conventional CO₂ separation. The Phase I R&D program established the chemical feasibility of the major reactions of the integrated UFP technology through lab-, bench-, and pilot-scale testing. A risk analysis session was carried out at the end of the Phase I effort to identify the major risks in the UFP technology, and a plan was developed to mitigate these risks in Phase II of the program. The Phase II effort focused on three high-risk areas: economics, the lifetime of the solids used in the UFP process, and product gas quality for turbines (i.e., the impact of impurities in the coal on the overall system).
The economic analysis included estimating the capital cost as well as the costs of hydrogen and electricity for a full-scale UFP plant. These costs were benchmarked against IGCC polygeneration plants with a similar level of CO₂ capture. Based on the promising results of this economic comparison (performed with help from Worley Parsons), GE recommended a 'Go' decision in April 2006 to continue the experimental investigation of the UFP technology and address the remaining risks, i.e., solids lifetime and the impact of coal impurities on the overall system. The solids attrition and lifetime risk was addressed via bench-scale experiments that monitored solids performance over time and by assessing materials interactions at operating conditions. The product gas under third-reactor (high-temperature vitiated air) operating conditions was evaluated to assess the concentrations of particulates, pollutants, and other impurities relative to the specifications required for gas turbine feed streams. During this investigation, agglomeration of the solids used in the UFP process was identified as a serious risk affecting the lifetime of the solids and, in turn, the feasibility of the UFP technology. The main causes of the agglomeration were the reduction of the oxygen transfer material (OTM) at temperatures of ~1000°C combined with interaction between the OTM and the CO₂-absorbing material (CAM) at high operating temperatures (>1200°C). At the end of Phase II, in March 2008, GEGR recommended a 'No-go' decision on taking the UFP technology to the next level of development, i.e., a 3-5 MW prototype system, at this time. GEGR further recommended focused materials development research programs on improving the performance and lifetime of the solids used in UFP or chemical looping technologies. Scale-up activities would be recommended only after mitigating the risks involved with agglomeration and the overall lifetime of the solids.
This is the final report for Phase II of the DOE-funded Vision 21 program entitled 'Fuel-Flexible Gasification-Combustion Technology for Production of H₂ and Sequestration-Ready CO₂' (DOE Award No. DE-FC26-00NT40974). The report focuses on the major accomplishments and lessons learned in analyzing the risks of the novel UFP technology during Phase II of the DOE program.
NASA Astrophysics Data System (ADS)
Fahie, Monique
With most of the energy produced in the state of Indiana coming from coal, the implementation of policy instruments such as cap-and-trade, which is included in the most recent climate bill, will have significant effects. This thesis analyzes the effects that a cap-and-trade policy might have on investment decisions for alternative technologies in Indiana's power plant sector. Economic models of two representative coal-fired power plants, Gallagher (600 MW) and Rockport (2,600 MW), are used to evaluate the repowering decision for several technologies: integrated gasification combined cycle (IGCC), a wind farm combined with natural gas combined cycle (NGCC), and supercritical pulverized coal (SCPC). The firm makes its decisions based on the net present value (NPV) of cost estimates for these CO2-reducing technologies, the cost of purchasing offsets, and CO2 allowances. The model is applied to a base case and three American Clean Energy and Security Act of 2009 cases derived from the Energy Information Administration (EIA, 2009b), with a sensitivity analysis on the discount rate and capital costs. The results indicate that an SCPC plant without carbon capture and storage (CCS) is the least costly compliance option for both plants under all cases, while retrofitting the existing plant with CCS is the most expensive. Gallagher's three least expensive options across most scenarios were SCPC without CCS, operating the existing plant as is, and investing in wind plus NGCC; Rockport's were SCPC without CCS, operating the existing plant as is, and IGCC without CCS. For both plants, when a 12% discount rate is used, the NPVs of costs are generally lower, and operating the existing plant with the aid of allowances and offsets to remain in compliance is the cheapest option.
If capital costs were to decrease by 30%, SCPC without CCS would remain the least costly investment for both plants, but if costs were to increase by 30%, operating the existing plant as is becomes the least costly option.
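The discount-rate effect the thesis reports (a high discount rate favoring the low-capital "keep operating" option) can be sketched with an NPV-of-cost comparison. The capital and annual cost figures below are illustrative assumptions, not the thesis's estimates.

```python
# Sketch of an NPV-of-cost repowering comparison: each option has an upfront
# capital cost plus a level annual operating/compliance cost; the firm picks
# the lowest total NPV. All dollar figures are assumed for illustration.
def npv_cost(capital, annual, rate, years=30):
    """Present value of capital now plus a level annual cost stream."""
    annuity = sum(1.0 / (1.0 + rate) ** t for t in range(1, years + 1))
    return capital + annual * annuity

options = {                      # ($M capital, $M/yr) -- assumed numbers
    "existing + allowances": (0, 130),
    "SCPC w/o CCS":          (900, 50),
    "retrofit CCS":          (1500, 80),
}
for rate in (0.07, 0.12):
    ranked = sorted(options, key=lambda k: npv_cost(*options[k], rate))
    print(f"r={rate:.0%}: cheapest = {ranked[0]}")
```

With these assumed numbers the capital-heavy SCPC option wins at a 7% discount rate, while at 12% the annuity shrinks enough that continuing to operate the existing plant with allowances becomes cheapest, mirroring the thesis's qualitative finding.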
DOE Office of Scientific and Technical Information (OSTI.GOV)
Del Bravo, R.; Pinacci, P.; Trifilo, R.
1998-07-01
This paper gives a general overview of the api Energia IGCC project, from the project background in 1992 through the progress of construction. api Energia S.p.A., a joint venture between api anonima petroli italiana S.p.A., Roma, Italy (51%), ABB Sae Sadelmi S.p.A., Milano, Italy (25%), and Texaco Development Corporation (24%), is building a 280 MW Integrated Gasification Combined Cycle plant in the api refinery at Falconara Marittima, on Italy's Adriatic coast, using heavy oil residues. The plant is based on the modern concept of feeding a highly efficient combined cycle power plant with a low-heating-value fuel gas produced by gasifying heavy refinery residues. This scheme provides substantial advantages in efficiency and environmental impact over alternative applications of the refinery residues. The electric power produced will feed the national grid. The project has been financed using a project-financing scheme: over 1,000 billion lire, representing 75% of the overall capital requirement, have been provided by a pool of international banks. The project reached financial closure in November 1996, and detailed design and procurement activities started immediately afterward. Engineering, procurement, and construction activities, carried out by a consortium of ABB group companies, are fully on schedule. Commercial operation of the plant is scheduled for November 1999.
Carbon Molecular Sieve Membrane as a True One Box Unit for Large Scale Hydrogen Production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Paul
2012-05-01
IGCC coal-fired power plants show promise for environmentally benign power generation. In these plants coal is gasified to syngas, which is then processed in a water-gas shift (WGS) reactor to maximize the hydrogen/CO₂ content. The gas stream can then be separated into a hydrogen-rich stream, for power generation and/or further purification for sale as a chemical, and a CO₂-rich stream for carbon capture and storage (CCS). Today, the separation is accomplished using conventional absorption/desorption processes with post-separation CO₂ compression. However, this approach carries significant process complexity and energy penalties, accounting for ~20% of the capital cost and ~27% parasitic energy consumption. Ideally, a one-box process is preferred, in which the syngas is fed directly to the WGS reactor without gas pre-treatment, converting the CO to hydrogen in the presence of H₂S and other impurities and delivering a clean hydrogen product for power generation or other uses. The development of such a process is the primary goal of this project. Our proposed "one-box" process includes a catalytic membrane reactor (MR) that makes use of a hydrogen-selective carbon molecular sieve (CMS) membrane and a sulfur-tolerant Co/Mo/Al₂O₃ catalyst. The membrane reactor's behavior was investigated with a bench-top unit under different experimental conditions and compared with modeling results; the model was then used to investigate the design features of the proposed process. CO conversion >99% and hydrogen recovery >90% are feasible at the operating pressures available from IGCC. More importantly, the CMS membrane has demonstrated excellent selectivity for hydrogen over H₂S (>100) and shown no flux loss in the presence of a synthetic "tar"-like material, i.e., naphthalene. In summary, the proposed "one-box" process has been successfully demonstrated with the bench-top reactor.
In parallel, we successfully designed and fabricated a full-scale CMS membrane and module for the proposed application. The full-scale membrane element is 3 inches in diameter and 30 inches long, composed of ~85 single CMS membrane tubes. The membrane tubes and bundles have demonstrated satisfactory thermal, hydrothermal, thermal-cycling, and chemical stability in an environment simulating the temperature, pressure, and contaminant levels of our proposed process. More importantly, the membrane module packed with the CMS bundle withstood over 30 pressure cycles between ambient pressure and 300-600 psi at 200 to 300°C without mechanical degradation. Finally, internal baffles were designed and installed to improve flow distribution within the module, which delivered 90% separation efficiency relative to the efficiency achieved with single membrane tubes. In summary, the full-scale CMS membrane element and module have been successfully developed and tested for our proposed one-box application, and a test quantity of elements/modules has been fabricated for field testing. Multiple field tests were performed under this project at the National Carbon Capture Center (NCCC). The separation efficiency and performance stability of our full-scale membrane elements were verified in tests ranging from 100 to >250 hours of continuous exposure to coal/biomass gasifier off-gas for hydrogen enrichment, with no gas pre-treatment for contaminant removal. In particular, "tar-like" contaminants were effectively rejected by the membrane with no evidence of fouling. In addition, testing was conducted using a hybrid membrane system, i.e., the CMS membrane in conjunction with a palladium membrane, to demonstrate that 99+% H₂ purity and a high degree of CO₂ capture could be achieved.
In summary, the stability and performance of the full-scale hydrogen-selective CMS membrane/module were verified in multiple field tests on coal/biomass gasifier off-gas under this project. A promising process scheme has been developed for power generation and/or hydrogen coproduction with CCS based upon our proposed "one-box" process. Our preliminary economic analysis indicates that about a 10% reduction in the required electricity selling price and a ~40% cost reduction in CCS per ton of CO₂ can be achieved in comparison with the base case of conventional WGS followed by two-stage Selexol® for CCS. Long-term field tests (e.g., >1,000 hours) incorporating the catalyst in the WGS membrane reactor, together with more in-depth analysis of the process scheme, are recommended for future study.
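The equilibrium-shift benefit behind the membrane reactor's >99% CO conversion can be illustrated with the standard WGS equilibrium (CO + H₂O ⇌ CO₂ + H₂, K(T) from the Moe correlation). The 2:1 steam:CO feed and the 90% in-situ H₂ removal fraction below are assumptions for illustration, not values from the report.

```python
import math

def wgs_equilibrium_conversion(T, f_h2_removed, co=1.0, h2o=2.0):
    """Equilibrium CO conversion for CO + H2O <=> CO2 + H2 at T (kelvin),
    with a fraction of the product H2 drawn off through the membrane.
    K(T) from the Moe correlation; the 2:1 steam:CO feed is an assumption."""
    K = math.exp(4577.8 / T - 4.33)
    lo, hi = 0.0, min(co, h2o) * 0.999999
    for _ in range(100):                 # bisection on conversion x
        x = 0.5 * (lo + hi)
        # mole-fraction reaction quotient; totals cancel in the ratio
        q = (x * (1.0 - f_h2_removed) * x) / ((co - x) * (h2o - x))
        lo, hi = (x, hi) if q < K else (lo, x)
    return x / co

x0 = wgs_equilibrium_conversion(573.0, 0.0)   # conventional reactor, 300 C
x9 = wgs_equilibrium_conversion(573.0, 0.9)   # 90% of product H2 removed
print(f"no removal: {x0:.1%}  with 90% H2 removal: {x9:.1%}")
```

Removing the hydrogen as it forms pushes the equilibrium conversion from roughly 98% toward 99.7% under these assumptions, which is the thermodynamic basis for running the shift and the separation in one box.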
Bankole, Temitayo; Jones, Dustin; Bhattacharyya, Debangsu; ...
2017-11-03
In this study, a two-level control methodology consisting of an upper-level scheduler and a lower-level supervisory controller is proposed for an advanced load-following energy plant with CO2 capture. Using an economic objective function at the upper level that considers fluctuations in electricity demand and price, optimal scheduling of the plant's electricity production and carbon capture is implemented for several carbon tax scenarios. The optimal operational profiles are then passed down to the corresponding lower-level supervisory controllers, which are designed using a methodological approach that balances control complexity with performance. It is shown how the optimal carbon capture and electricity production rate profiles for an energy plant such as an integrated gasification combined cycle (IGCC) plant are affected by electricity demand and price fluctuations under different carbon tax scenarios. The paper also presents a Lyapunov stability analysis of the proposed scheme.
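The flavor of the upper-level economic scheduler can be sketched as a per-period optimization: for each hour, pick the (load, capture-rate) pair that maximizes revenue minus fuel cost and carbon tax. The plant numbers, prices, and tax levels below are illustrative assumptions, not the paper's model.

```python
# Minimal sketch of an upper-level scheduler: enumerate (load, capture) pairs
# per period and keep the most profitable. All numbers are assumed.
prices = [25, 22, 20, 35, 60, 75, 65, 40]     # $/MWh over 8 periods (assumed)
loads = [300, 450, 600]                        # allowed net outputs, MW
captures = [0.0, 0.9]                          # capture-rate options
fuel_cost = 15.0                               # $/MWh-equivalent (assumed)
emis = 0.8                                     # t CO2 per MWh uncaptured (assumed)
cap_penalty = 0.10                             # fractional output lost at 90% capture

def profit(price, load, cap, tax):
    net = load * (1.0 - cap_penalty * (cap / 0.9))   # capture parasitic load
    co2 = load * emis * (1.0 - cap)                  # taxed emissions
    return net * price - load * fuel_cost - co2 * tax

results = {}
for tax in (0.0, 5.0, 50.0):
    plan = [max(((l, c) for l in loads for c in captures),
                key=lambda lc: profit(p, lc[0], lc[1], tax))
            for p in prices]
    results[tax] = sum(1 for _, c in plan if c > 0)
    print(f"tax=${tax:.0f}/t: capture on in {results[tax]}/{len(prices)} periods")
```

With no tax the scheduler never pays the capture energy penalty; at a moderate tax it schedules capture into the low-price hours only; at a high tax capture runs in every period, the same qualitative demand/price/tax interaction the paper studies.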
Advanced Acid Gas Separation Technology for Clean Power and Syngas Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amy, Fabrice; Hufton, Jeffrey; Bhadra, Shubhra
2015-06-30
Air Products has developed an acid gas removal technology based on adsorption (Sour PSA) that compares favorably with incumbent AGR technologies. During this DOE-sponsored study, Air Products increased the Sour PSA technology readiness level by successfully operating a two-bed test system on coal-derived sour syngas at the NCCC, validating the lifetime and performance of the adsorbent material. Both proprietary simulation and data obtained during the NCCC testing were used to refine the estimated performance of the Sour PSA technology at commercial scale. In-house experiments on sweet syngas, combined with simulation work, allowed Air Products to develop new PSA cycles that further reduce capital expenditure. Finally, our techno-economic analysis of the Sour PSA technology for both IGCC and coal-to-methanol applications suggests significant improvements in the unit cost of electricity and methanol compared to incumbent AGR technologies.
Chemical Looping Gasification for Hydrogen Enhanced Syngas Production with In-Situ CO2 Capture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kathe, Mandar; Xu, Dikai; Hsieh, Tien-Lin
2014-12-31
This document is the final report for the project titled "Chemical Looping Gasification for Hydrogen Enhanced Syngas Production with In-Situ CO2 Capture" under award number FE0012136, for the performance period 10/01/2013 to 12/31/2014. The project investigated the novel Ohio State chemical looping gasification (CLG) technology for high-efficiency, cost-effective coal gasification for IGCC and methanol production applications. The project developed an optimized oxygen-carrier composition, demonstrated the feasibility of the concept, and completed cold-flow model studies. WorleyParsons completed a techno-economic analysis showing that, for a coal-only feed with carbon capture, the OSU CLG technology reduced the methanol required selling price by 21%, lowered capital costs by 28%, and increased coal consumption efficiency by 14%. Moreover, the Ohio State CLG technology yielded a methanol required selling price lower than that of the reference non-capture case.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kay, John; Stanislowski, Joshua; Tolbert, Scott
Utilities continue to investigate ways to decrease their carbon footprint. Carbon capture and storage (CCS) can enable existing power generation facilities to maintain operations while addressing carbon reduction. Subtask 2.1 - Pathway to Low-Carbon Lignite Utilization focused on several research areas in an effort to decrease the cost of capture across both precombustion and postcombustion platforms. Two postcombustion capture solvents were tested, one from CO2 Solutions Inc. and one from ARCTECH, Inc. The CO2 Solutions solvent had been evaluated previously, and the company had incorporated the concept of a rotating packed bed (RPB) to replace the traditional packed columns typically used. In the limited testing performed at the Energy & Environmental Research Center (EERC), no CO2 reduction benefit was seen from the RPB; however, if the technology could be scaled up, it may offer some savings in capital expense and overall system footprint. Rudimentary tests were conducted with the ARCTECH solvent to evaluate whether it could be used in a spray-tower contactor to capture CO2, SO2, and NOx. After loading, this solvent can be processed into an additional product for filtering wastewater, providing a second-tier usable product. Modeling of the RPB process scaled to a 550-MW power system was also conducted. The reduced cost and smaller footprint of RPB systems highlight their potential for reducing the cost of capturing CO2; however, more extensive testing is needed to truly evaluate their potential at full scale. Hydrogen separation membranes from the Commonwealth Scientific and Industrial Research Organisation (CSIRO) were evaluated through precombustion testing. These had also been tested previously and were improved by CSIRO for this test campaign. They are composed of a vanadium alloy, which is less expensive than the palladium alloys typically used.
Their performance was promising, and they may be good candidates for medium-pressure gasifiers, but much more scale-up work is needed. Next-generation power cycles are currently being developed and show promise for high efficiency; the use of supercritical CO2 to drive a turbine could significantly increase cycle efficiency over traditional steam cycles. The EERC evaluated pressurized oxy-combustion technology from the standpoint of CO2 purification. If impurities can be removed, the cost of CO2 capture can be lowered significantly relative to postcombustion capture systems. Impurity removal consisted of a simple water scrubber referred to as the DeSNOx process. The process worked well, but corrosion management is crucial to its success. A model of this process was constructed. Finally, an integrated gasification combined-cycle (IGCC) system model, developed by the Massachusetts Institute of Technology (MIT), was modified to allow modeling of membrane systems in the IGCC process. This modified model was used to assess the costs of membrane use at full scale. An economic estimate indicated a 14% reduction in cost for CO2 separation relative to the SELEXOL™ process. This subtask was funded through the EERC–DOE Joint Program on Research and Development for Fossil Energy-Related Resources, Cooperative Agreement No. DE-FE0024233. Nonfederal sponsors for this project were the North Dakota Industrial Commission, Basin Electric Power Cooperative, and Allete, Inc. (including BNI Coal and Minnesota Power).
Energy generation potential from coals of the Charqueadas Coalfield, RS, Brazil
NASA Astrophysics Data System (ADS)
Correa da Silva, Z. C.; Heemann, R.; Castro, L.; Ketzer, J. M.
2009-04-01
Three coal seams, I2B (Inferior 2), I1F (Inferior 1) and MB, from the Charqueadas Coalfield, located in the central-east region of the State of Rio Grande do Sul, Southern Brazil, were studied using geological, petrographic, chemical and geochemical techniques and correlated to the SR1, SR2 and SR3 coal seams of the Santa Rita Coalfield. The Charqueadas Coalfield reserves reach 2,993 × 10⁶ metric tons of coal distributed in six coal seams. The study of sedimentary and organic facies is based on subsurface data from five boreholes drilled in the area. These show a well-marked lateral facies change from a subaquatic to a subaerial environment, conditioned by both water-level variations and the irregular palaeotopography of the basement. The coals change from limnic to forest-terrestrial moor types, characterized by variations in composition in terms of macerals, microlithotypes and mineral matter. The coals are rich in mineral matter (28 to 40%); the vitrinite content reaches 50%, inertinite 44%, and liptinite varies from 10 to 30%, on a mineral-matter-free basis. Among the microlithotypes, carbominerite and vitrite predominate. Rank studies carried out by different methods (vitrinite reflectance, fluorescence λmax and red-green quotient, among others) gave conflicting results, which are explained by the strong bituminization of the vitrinite. However, agreement between fluorescence measurements and organic geochemical parameters (e.g. CPI values) confirms that the coals are of High Volatile Bituminous B/C (ASTM) or Gasflammkohle (DIN) rank. Based on these characteristics, the Charqueadas coal seams show great potential for use in Underground Coal Gasification (UCG) and Enhanced Coalbed Methane (ECBM) projects. The state of Rio Grande do Sul is growing rapidly and needs to increase energy supply and efficiency to meet industrial demands, filling the gap between supply and energy generation.
As with conventional IGCC, UCG gas can be used to generate electricity with efficiencies as high as 55%, with overall UCG-IGCC process efficiency reaching 43%. With regard to environmental concerns, UCG minimizes environmental impacts (waste piles/acid mine drainage) and reduces CO2 emissions, because the syngas contains CO2 that can be captured with a relatively low energy penalty. Clean Coal Technologies (CCT), especially UCG and ECBM projects, will be a key factor in maintaining the state's annual economic expansion, in association with energy efficiency improvement programs.
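The two quoted figures are consistent with a simple multiplicative efficiency chain. A minimal sketch of that arithmetic follows; the cold-gas efficiency is an inferred placeholder chosen to reconcile the abstract's numbers, not a value stated in the study.

```python
# Overall UCG-IGCC efficiency as a product of stage efficiencies:
# overall ≈ (UCG cold-gas efficiency) × (combined-cycle electrical efficiency)
cold_gas_eff = 0.78   # assumed/inferred UCG cold-gas efficiency (illustrative)
cc_eff = 0.55         # combined-cycle efficiency quoted in the abstract
overall = cold_gas_eff * cc_eff
print(f"{overall:.0%}")  # close to the 43% overall figure quoted
```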
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyer, Howard
2010-11-30
This project met its objective of furthering the development of an integrated multi-contaminant removal process in which H2S, NH3, HCl and heavy metals including Hg, As, Se and Cd present in coal-derived syngas can be removed to specified levels in a single/integrated process step. The process supports the mission and goals of the Department of Energy's Gasification Technologies Program, namely to enhance the performance of gasification systems, thus enabling U.S. industry to improve the competitiveness of gasification-based processes. The gasification program will reduce equipment costs, improve process environmental performance, and increase process reliability and flexibility. Two sulfur conversion concepts were tested in the laboratory under this project: the solvent-based, high-pressure University of California Sulfur Recovery Process High Pressure (UCSRP-HP) and the catalytic-based, direct oxidation (DO) section of the CrystaSulf-DO process. Each process required a polishing unit to meet the ultra-clean sulfur content goals of <50 ppbv (parts per billion by volume), as may be necessary for fuel cell or chemical production applications. UCSRP-HP was also tested for the removal of trace, non-sulfur contaminants, including ammonia, hydrogen chloride, and heavy metals. A bench-scale unit was commissioned and limited testing was performed with simulated syngas. Aspen Plus®-based computer simulation models were prepared, and the economics of the UCSRP-HP and CrystaSulf-DO processes were evaluated for a nominal 500 MWe, coal-based IGCC power plant with carbon capture. This report covers progress on the UCSRP-HP technology development and the CrystaSulf-DO technology.
CRADA opportunities with METC's gasification and hot gas cleanup facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galloway, E N; Rockey, J M; Tucker, M S
1995-06-01
Opportunities exist for Cooperative Research and Development Agreements (CRADAs) at the Morgantown Energy Technology Center (METC) to support commercialization of IGCC power systems. METC operates an integrated gasifier and hot gas cleanup facility for the development of gasification and hot gas cleanup technologies. The objective of our program is to gather performance data on gasifier operation, particulate removal, desulfurization and regeneration technologies. Additionally, slip streams are provided for developing various technologies such as alkali monitoring, particulate measurement, chloride removal, and contaminant recovery processes. METC's 10-inch-diameter air-blown Fluid Bed Gasifier (FBG) provides 300 lb/hr of coal gas at 1100°F and 425 psig. The particulate-laden gas is transported to METC's Modular Gas Cleanup Rig (MGCR). The gas pressure is reduced to 285 psig before being fed into a candle filter vessel. The candle filter vessel houses four candle filters and multiple test coupons. The particulate-free gas is then desulfurized in a sorbent reactor. Starting in 1996, the MGCR system will be able to regenerate the sorbent in the same vessel.
Power Systems Life Cycle Analysis Tool (Power L-CAT).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andruski, Joel; Drennen, Thomas E.
2011-01-01
The Power Systems L-CAT is a high-level dynamic model that calculates levelized production costs and tracks environmental performance for a range of electricity generation technologies: natural gas combined cycle (using either imported (LNGCC) or domestic natural gas (NGCC)), integrated gasification combined cycle (IGCC), supercritical pulverized coal (SCPC), existing pulverized coal (EXPC), nuclear, and wind. All of the fossil fuel technologies also include an option for carbon capture and sequestration (CCS). The model allows quick sensitivity analysis on key technical and financial assumptions, such as capital, O&M, and fuel costs; interest rates; construction time; heat rates; taxes; depreciation; and capacity factors. The fossil fuel options are based on detailed life cycle analysis reports conducted by the National Energy Technology Laboratory (NETL). For each of these technologies, NETL's detailed LCAs consider five stages associated with energy production: raw material acquisition (RMA), raw material transport (RMT), energy conversion facility (ECF), product transportation and distribution (PT&D), and end-user electricity consumption. The goal of the NETL studies is to compare existing and future fossil fuel technology options using a cradle-to-grave analysis. The NETL reports consider constant-dollar levelized cost of delivered electricity, total plant costs, greenhouse gas emissions, criteria air pollutants, mercury (Hg) and ammonia (NH3) emissions, water withdrawal and consumption, and land use (acreage).
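The levelized-cost screening that a tool of this kind performs can be sketched in a few lines. This is a minimal illustration of the standard LCOE arithmetic, not the L-CAT model itself, and every input value below is an illustrative placeholder rather than a NETL figure.

```python
def crf(rate, years):
    """Capital recovery factor: converts overnight capital into an equivalent annuity."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def lcoe(capital_per_kw, fixed_om_per_kw_yr, var_om_per_mwh,
         fuel_per_mmbtu, heat_rate_btu_per_kwh, capacity_factor,
         rate=0.08, years=30):
    """Levelized cost of electricity in $/MWh (simple screening form)."""
    mwh_per_kw_yr = 8760 * capacity_factor / 1000.0   # annual MWh per kW of capacity
    capital = capital_per_kw * crf(rate, years) / mwh_per_kw_yr
    fixed_om = fixed_om_per_kw_yr / mwh_per_kw_yr
    fuel = fuel_per_mmbtu * heat_rate_btu_per_kwh / 1000.0  # $/MWh
    return capital + fixed_om + var_om_per_mwh + fuel

# Hypothetical IGCC-like inputs: $3500/kW, $60/kW-yr fixed O&M, $7/MWh variable O&M,
# $2/MMBtu coal, 8800 Btu/kWh heat rate, 80% capacity factor
print(round(lcoe(3500, 60, 7, 2.0, 8800, 0.80), 1))
```

Sensitivity analysis of the kind the abstract describes amounts to sweeping one of these inputs while holding the others fixed.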
Unconventional Coal in Wyoming: IGCC and Gasification of Direct Coal Liquefaction Residue
NASA Astrophysics Data System (ADS)
Schaffers, William Clemens
Two unconventional uses for Wyoming Powder River Basin coal were investigated in this study. The first was the use of coal fired integrated gasification combined cycle (IGCC) plants to generate electricity. Twenty-eight different scenarios were modeled using AspenPlusRTM software. These included slurry, mechanical and dried fed gasifiers; Wyodak and Green River coals, 0%, 70%, and 90% CO2 capture; and conventional evaporative vs air cooling. All of the models were constructed on a feed basis of 6,900 tons of coal per day on an "as received basis". The AspenPlus RTM results were then used to create economic models using Microsoft RTM Excel for each configuration. These models assumed a 3 year construction period and a 30 year plant life. Results for capital and operating costs, yearly income, and internal rates of return (IRR) were compared. In addition, the scenarios were evaluated to compare electricity sales prices required to obtain a 12% IRR and to determine the effects of a carbon emissions tax on the sales price. The second part of the study investigated the gasification potential of residue remaining from solvent extraction or liquefaction of Powder River Basin Coal. Coal samples from the Decker mine on the Wyoming-Montana border were extracted with tetralin at a temperature of 360°C and pressure of 250 psi. Residue from the extraction was gasified with CO2 or steam at 833°C, 900°C and 975°C at pressures of 0.1 and 0.4 MPa. Product gases were analyzed with a mass spectrometer. Results were used to determine activation energies, reaction order, reaction rates and diffusion effects. Surface area and electron microscopic analyses were also performed on char produced from the solvent extraction residue.
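The IRR figure such economic models report is the discount rate at which the project's net present value is zero; it can be found numerically. The sketch below uses bisection with an entirely hypothetical cash-flow profile (3 construction years of outlays, then 30 years of net revenue, echoing the study's assumed plant life); none of the dollar amounts come from the study.

```python
def npv(rate, cashflows):
    """Net present value of a list of annual cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=1e-6, hi=1.0, tol=1e-7):
    """Bisection on NPV(rate) = 0; assumes one sign change in the cash flows."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid      # NPV still positive: rate is too low
        else:
            hi = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2

# Hypothetical plant: $500M/yr capital outlay for 3 years, then $230M/yr net revenue for 30 years
flows = [-500.0] * 3 + [230.0] * 30
print(f"IRR = {irr(flows):.1%}")
```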
Capturing the emerging market for climate-friendly technologies: opportunities for Ohio
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2006-11-15
This paper briefly describes the factors driving the growing demand for climate-friendly technologies, some of the key existing companies, organizations, and resources in Ohio, and the potential for Ohio to become a leading supplier of climate solutions. These solutions include a new generation of lower-emitting coal technologies, components for wind turbines, and the feedstocks and facilities to produce biofuels. Several public-private partnerships and initiatives have been established in Ohio. These efforts have encouraged the development of numerous federal- and state-funded projects and attracted major private investments in two increasingly strategic sectors of the Ohio economy: clean-coal technology and alternative energy technology, with a focus on fuel cells. Several major clean-coal projects have recently been initiated in Ohio. In April 2006, the Public Utilities Commission of Ohio approved American Electric Power's (AEP) plan to build a 600 MW clean-coal plant along the Ohio River in Meigs County. The plant will use Integrated Gasification Combined Cycle (IGCC) technology, which makes it easier to capture carbon dioxide for sequestration. Three other potential coal gasification facilities are being considered in Ohio: a combination IGCC and synthetic natural gas plant in Allen County by Global Energy/Lima Energy; a coal-to-fuels facility in Lawrence County by Baard Energy; and a coal-to-fuels facility in Scioto County by CME North American Merchant Energy. The paper concludes with recommendations for how Ohio can capitalize on these emerging opportunities, including focusing and coordinating state funding of climate technology programs, promoting the development of climate-related industry clusters, and exploring export opportunities to states and countries with existing carbon constraints.
Collins, John P.; Way, J. Douglas
1995-09-19
A hydrogen-selective membrane comprises a tubular porous ceramic support having a palladium metal layer deposited on an inside surface of the ceramic support. The thickness of the palladium layer is greater than about 10 µm but typically less than about 20 µm. The hydrogen permeation rate of the membrane is greater than about 1.0 mol/m²·s at a temperature greater than about 500 °C and a transmembrane pressure difference of about 1,500 kPa. Moreover, the hydrogen-to-nitrogen selectivity is greater than about 600 at a temperature greater than about 500 °C and a transmembrane pressure of about 700 kPa. Hydrogen can be separated from a mixture of gases using the membrane. The method may include the step of heating the mixture of gases to a temperature greater than about 400 °C and less than about 1000 °C before the step of flowing the mixture of gases past the membrane. The mixture of gases may include ammonia. The ammonia typically is decomposed to provide nitrogen and hydrogen using a catalyst such as nickel. The catalyst may be placed inside the tubular ceramic support. The mixture of gases may be supplied by an industrial process such as the mixture of exhaust gases from the IGCC process.
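Hydrogen transport through a dense palladium layer is usually described by Sieverts' law, where flux scales with the difference of the square roots of the hydrogen partial pressures and inversely with layer thickness. The sketch below illustrates that relationship only; the permeability value is an illustrative placeholder, not a measured property of the patented membrane.

```python
import math

def h2_flux(permeability, thickness_m, p_feed_pa, p_perm_pa):
    """Sieverts'-law H2 flux (mol/m^2/s): permeability * (sqrt(P_feed) - sqrt(P_perm)) / thickness."""
    return permeability / thickness_m * (math.sqrt(p_feed_pa) - math.sqrt(p_perm_pa))

# Illustrative inputs: permeability 1e-8 mol·m/(m^2·s·Pa^0.5), 10 µm Pd layer,
# 1600 kPa feed vs 100 kPa permeate (about a 1500 kPa transmembrane difference)
flux = h2_flux(1e-8, 10e-6, 1.6e6, 1.0e5)
print(round(flux, 2))  # mol/m^2/s, same order as the >1.0 claimed in the abstract
```

Halving the layer thickness roughly doubles the flux in this regime, which is why the patent bounds the palladium layer between about 10 and 20 µm.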
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gupta, Vijay; Denton, David; Sharma, Pradeep
The key objective of this project was to evaluate the potential to achieve substantial reductions in the production cost of H2-rich syngas via coal gasification with near-zero emissions, owing to the cumulative and synergistic benefits realized when multiple advanced technologies are integrated into the overall conversion process. In this project, Aerojet Rocketdyne's (AR's) advanced gasification technology (currently offered as R-GAS™) and RTI International's (RTI's) advanced warm syngas cleanup technologies were evaluated via a number of comparative techno-economic case studies. AR's advanced gasification technology consists of a dry solids pump and a compact gasifier system. Owing to its unique design, this gasifier has been shown to reduce the capital cost of the gasification block by 40 to 50%. At the start of this project, experimental work had been demonstrated through pilot plant systems for both the gasifier and the dry solids pump. RTI's advanced warm syngas cleanup technologies consist primarily of RTI's Warm Gas Desulfurization Process (WDP) technology, which effectively decouples sulfur and CO2 removal, allowing more flexibility in the selection of the CO2 removal technology, plus associated advanced technologies for direct sulfur recovery and water gas shift (WGS). WDP has been demonstrated at pre-commercial scale using an activated amine carbon dioxide recovery process, which would not have been possible if a majority of the sulfur had not been removed from the syngas by WDP. This pre-commercial demonstration of RTI's advanced warm syngas cleanup system was conducted in parallel with the activities of this project. The technical data and cost information from this pre-commercial demonstration were used extensively in the techno-economic analysis for this project. Both of RTI's advanced WGS technologies were investigated in this project.
Because RTI's advanced fixed-bed WGS (AFWGS) process was successfully implemented in the WDP pre-commercial demonstration test mentioned above, this technology was used as part of RTI's advanced warm syngas technology package for the techno-economic analyses in this project. RTI's advanced transport-reactor-based WGS (ATWGS) process was still conceptual at the start of this project, but one task of this project was to evaluate its technical feasibility. In each of the three application-based comparison studies conducted as part of this project, the reference case was based on an existing Department of Energy National Energy Technology Laboratory (DOE/NETL) system study. Each of these reference cases used existing commercial technology and achieved >90% carbon capture. In the comparison studies on the use of the hydrogen-rich syngas in either an Integrated Gasification Combined Cycle (IGCC) or a Coal-to-Methanol (CTM) plant, the comparison cases consisted of the reference case, a case with the integration of each individual advanced technology (either AR or RTI), and finally a case with the integration of all the advanced technologies (AR and RTI combined). In the Coal-to-Liquids (CTL) comparison study, only three cases were considered: a reference case, a case with just RTI's advanced syngas cleaning technology, and a case with both AR's and RTI's advanced technologies. The results from these comparison studies showed that integrating the advanced technologies yielded substantial benefits, and by far the greatest benefits were achieved in the cases integrating all the advanced technologies. For the IGCC study, the fully integrated case resulted in a 1.4% net efficiency improvement, an 18% reduction in capital cost per kW of capacity, a 12% reduction in operating cost per kWh, and a 75–79% reduction in sulfur emissions.
For the CTM case, the fully integrated plant resulted in a 22% reduction in capital cost, a 13% reduction in operating costs, a >99% net reduction in sulfur emissions, and a 13–15% reduction in CO2 emissions. Because capital cost represents over 60% of the methanol Required Selling Price (RSP), the significant reduction in capital cost for the advanced technology case resulted in an 18% reduction in methanol RSP. For the CTL case, the fully integrated plant resulted in a 16% reduction in capital cost, which represented a 13% reduction in diesel RSP. Finally, the technical feasibility analysis of RTI's ATWGS process demonstrated that a fluid-bed catalyst with sufficient attrition resistance and WGS activity could be made, and that the process achieved about a 24% reduction in capital cost compared to a conventional fixed-bed commercial process.
Gas cleaning system and method
Newby, Richard Allen
2006-06-06
A gas cleaning system for removing at least a portion of contaminants, such as halides, sulfur, particulates, mercury, and others, from a synthesis gas (syngas). The gas cleaning system may include one or more filter vessels coupled in series for removing halides, particulates, and sulfur from the syngas. The gas cleaning system may be operated by receiving gas at a first temperature and pressure and dropping the temperature of the syngas as the gas flows through the system. The gas cleaning system may be used for an application requiring clean syngas, such as, but not limited to, fuel cell power generation, IGCC power generation, and chemical synthesis.
CO2 capture from IGCC gas streams using the AC-ABC process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nagar, Anoop; McLaughlin, Elisabeth; Hornbostel, Marc
The objective of this project was to develop a novel, low-cost CO2 capture process for pre-combustion gas streams. The bench-scale work was conducted at SRI International. A 0.15-MWe integrated pilot plant was constructed and operated for over 700 hours at the National Carbon Capture Center, Wilsonville, AL. The AC-ABC (ammonium carbonate-ammonium bicarbonate) process for capture of CO2 and H2S from the pre-combustion gas stream offers many advantages over Selexol-based technology. The process relies on the simple chemistry of the NH3-CO2-H2O-H2S system and on the ability of the aqueous ammoniated solution to absorb CO2 at near-ambient temperatures and to release it as a high-purity, high-pressure gas at a moderately elevated regeneration temperature. It is estimated that the increase in cost of electricity (COE) with the AC-ABC process will be ~30%, and the cost of CO2 captured is projected to be less than $27/metric ton of CO2 while meeting the 90% CO2 capture goal. The Bechtel Pressure Swing Claus (BPSC) is a complementary technology offered by Bechtel Hydrocarbon Technology Solutions, Inc. BPSC is a high-pressure, sub-dew-point Claus process that allows nearly complete removal of H2S from a gas stream. It operates at gasifier pressures and moderate temperatures and does not affect CO2 content. When coupled with AC-ABC, the combined technologies allow a nearly pure CO2 stream to be captured at high pressure, something that Selexol and other solvent-based technologies cannot achieve.
Physical and Economic Integration of Carbon Capture Methods with Sequestration Sinks
NASA Astrophysics Data System (ADS)
Murrell, G. R.; Thyne, G. D.
2007-12-01
Currently, several different carbon capture technologies are either available or in active development for coal-fired power plants. Each approach has different advantages, limitations and costs that must be integrated with the method of sequestration and the physicochemical properties of carbon dioxide to evaluate which approach is most cost effective. For large-volume point sources such as coal-fired power stations, the only viable sequestration sinks are either oceanic or geological in nature. However, the capture processes and systems under consideration produce carbon dioxide at a variety of pressure and temperature conditions that must be made compatible with the sinks. Integration of all these factors provides a basis for meaningful economic comparisons between the alternatives. The high degree of compatibility between carbon dioxide produced by integrated gasification combined cycle technology and geological sequestration conditions makes it apparent that this coupling currently holds the advantage. On a basis that includes complete source-to-sink sequestration costs, the relative cost benefit of pre-combustion IGCC compared to other, post-combustion methods is on the order of 30%. Additional economic benefits arising from enhanced oil recovery revenues and potential sequestration credits further improve this coupling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kwon, K.C.; Crowe, E.R.; Gangwal, S.K.
1997-01-01
Hot-gas desulfurization for the integrated gasification combined cycle (IGCC) process has been investigated to effectively remove hydrogen sulfide with various metal oxide sorbents at high temperatures and pressures. Metal oxide sorbents such as zinc titanate, zinc ferrite, copper oxide, manganese oxide and calcium oxide were found to be promising in comparison with other removal methods such as membrane separation and reactive membrane separation. The removal of H2S from coal gas mixtures with zinc titanate sorbents was conducted in a batch reactor. The main objectives of this research are to formulate promising metal oxide sorbents for removal of hydrogen sulfide from coal gas mixtures, to compare the reactivity of a formulated sorbent with a sorbent supplied by the Research Triangle Institute at high temperatures and pressures, and to determine the effects of the concentration of moisture in coal gas mixtures on the equilibrium absorption of H2S into metal oxide sorbents. Promising, durable metal oxide sorbents with high sulfur-absorbing capacity were formulated by mixing active metal oxide powders with inert metal oxide powders and calcining these powder mixtures.
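The moisture effect the abstract mentions follows directly from the sulfidation equilibrium: for a metal-oxide sorbent such as ZnO, the reaction ZnO + H2S ⇌ ZnS + H2O gives an equilibrium H2S slip proportional to the water partial pressure divided by the equilibrium constant. The sketch below shows only that trend; the K value is an illustrative placeholder, not measured thermodynamic data.

```python
def h2s_slip_ppmv(h2o_frac, K):
    """Equilibrium H2S mole fraction (ppmv) for the reaction MO + H2S <=> MS + H2O,
    where K = p_H2O / p_H2S at equilibrium, so p_H2S = p_H2O / K."""
    return h2o_frac / K * 1e6

# Illustrative K at gasifier temperature; raising moisture from 5% to 20%
# raises the best attainable H2S slip fourfold.
for h2o in (0.05, 0.10, 0.20):
    print(f"{h2o:.0%} H2O -> {h2s_slip_ppmv(h2o, 2e4):.1f} ppmv H2S")
```

Because K falls with increasing temperature, hot-gas desulfurization trades cleanup depth against the thermal-efficiency benefit of staying hot, which is why equilibrium data at operating moisture levels matter for sorbent selection.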
Degradation of TBC Systems in Environments Relevant to Advanced Gas Turbines for IGCC Systems
NASA Astrophysics Data System (ADS)
Bohna, Nathaniel Allan
Plasma-sprayed (PS) thermal barrier coatings (TBCs) are used to provide thermal insulation for the hottest components in gas turbines. Zirconia stabilized with 7 wt% yttria (7YSZ) is the most common ceramic top coat used for turbine blades. The 7YSZ coating can be degraded by the buildup of fly-ash deposits, which can arise from the fuel source (coal/biomass) used in the combustion process in gas turbines. Fly ash in the integrated gasification combined cycle (IGCC) process can come from coal-based syngas and also from ambient air that passes through the system. TBCs are also exposed to harsh gas environments containing CO2, SO2, and steam. As presented in this thesis, degradation from the combined effects of fly ash and a harsh gas atmosphere can severely limit TBC lifetimes. It is well established that degradation at very high temperatures (≥1250°C) from deposits consisting of the oxides CaO-MgO-Al2O3-SiO2 results from extensive liquid silicate infiltration into the porous YSZ top coat. This infiltration causes early failure resulting from chemical and/or mechanical damage to the ceramic layer. Damage resulting from liquid infiltration, however, is not typically considered at relatively lower temperatures around 1100°C, because liquid silicates would not be expected to form from the oxides in the deposit. A key focus of this study is to assess the mode and extent of TBC degradation at 1100°C in cases where some amount of liquid forms owing to the presence of K2SO4 as a minor ash constituent. Two types of liquid infiltration are observed, depending on the principal oxide (i.e., CaO or SiO2) in the deposit. The degradation is primarily the result of mechanical damage, which results from infiltration caused by the interaction of liquid K2SO4 with either the CaO or the SiO2. The TBCs used in this work are representative of coatings commonly used in the hottest sections of land-based gas turbines.
The specimens consist of 7YSZ top coats deposited on superalloy (René N5 and PWA 1484) substrates that had been coated with NiCoCrAlY bond coats. Two different top coats are studied: conventional low-density 7YSZ and dense vertically cracked coatings. The specific mechanisms of liquid infiltration resulting from CaO and SiO2 are studied by conducting isothermal exposures followed by detailed characterization. The resulting consequences for cyclic lifetimes are also determined. Further, cyclic lifetimes are studied in several gas atmospheres to examine the combined effect of deposit and gas atmosphere on TBC lifetime. This work identifies a TBC degradation mechanism that had not previously been considered. It is clearly shown that deposit-induced attack of TBCs can be highly detrimental at an intermediate temperature such as 1100°C.
Greenhouse gas mitigation in a carbon constrained world - the role of CCS in Germany
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schumacher, Katja; Sands, Ronald D.
2009-01-05
In a carbon-constrained world, at least four classes of greenhouse gas mitigation options are available: energy efficiency, switching to low-carbon or carbon-free energy sources, introduction of carbon dioxide capture and storage along with electric generating technologies, and reductions in emissions of non-CO2 greenhouse gases. The contribution of each option to overall greenhouse gas mitigation varies by cost, scale, and timing. In particular, carbon dioxide capture and storage (CCS) promises to allow low-emission fossil-fuel-based power generation. This is particularly relevant for Germany, where electricity generation is largely coal-based and, at the same time, ambitious climate targets are in place. Our objective is to provide a balanced analysis of the various classes of greenhouse gas mitigation options, with a particular focus on CCS for Germany. We simulate the potential role of advanced fossil-fuel-based electricity generating technologies with CCS (IGCC, NGCC), as well as the potential for retrofitting existing and currently built fossil plants with CCS, from the present through 2050. We employ a computable general equilibrium (CGE) economic model as a core model and integrating tool.
Environmental performance of green building code and certification systems.
Suh, Sangwon; Tomar, Shivira; Leighton, Matthew; Kneifel, Joshua
2014-01-01
We examined the potential life-cycle environmental impact reduction of three green building code and certification (GBCC) systems: LEED, ASHRAE 189.1, and IgCC. A recently completed whole-building life cycle assessment (LCA) database from NIST was applied to a prototype building model specification by NREL. EPA's TRACI 2.0 was used for life cycle impact assessment (LCIA). The results showed that the baseline building model generates about 18 thousand metric tons CO2-equiv. of greenhouse gases (GHGs) and consumes 6 terajoules (TJ) of primary energy and 328 million liters of water over its life cycle. Overall, GBCC-compliant building models generated 0% to 25% lower environmental impacts than the baseline case (average 14% reduction). The largest reductions were associated with acidification (25%), human health-respiratory (24%), and global warming (GW) (22%), while no reductions were observed for ozone layer depletion (OD) and land use (LU). The performances of the three GBCC-compliant building models, measured as life-cycle impact reduction, were comparable. A sensitivity analysis showed that the comparative results were reasonably robust, although some results were relatively sensitive to behavioral parameters, including employee transportation and purchased electricity during the occupancy phase (average sensitivity coefficients 0.26-0.29).
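A sensitivity coefficient of the kind quoted above is simply the fractional change in a life-cycle impact divided by the fractional change in the input parameter that caused it. A minimal sketch follows; the numbers are invented for illustration only and are not values from the study.

```python
def sensitivity_coefficient(base_impact, new_impact, base_param, new_param):
    """Normalized sensitivity: fractional change in a life-cycle impact
    divided by the fractional change in the input parameter that caused it."""
    d_impact = (new_impact - base_impact) / base_impact
    d_param = (new_param - base_param) / base_param
    return d_impact / d_param

# Hypothetical example: a 10% increase in an assumed occupancy-phase
# parameter raises life-cycle GHGs from 18000 to 18500 t CO2-equiv.
s = sensitivity_coefficient(18000.0, 18500.0, 100.0, 110.0)
print(round(s, 3))  # 0.278, inside the 0.26-0.29 range reported
```

A coefficient near 0.28 means a 10% error in the behavioral assumption shifts the computed impact by only about 2.8%, which is why the comparison is described as reasonably robust.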
Should a coal-fired power plant be replaced or retrofitted?
Patiño-Echeverri, Dalia; Morel, Benoit; Apt, Jay; Chen, Chao
2007-12-01
In a cap-and-trade system, a power plant operator can choose to operate while paying for the necessary emissions allowances, retrofit emissions controls to the plant, or replace the unit with a new plant. Allowance prices are uncertain, as are the timing and stringency of requirements for control of mercury and carbon emissions. We model the evolution of allowance prices for SO2, NOx, Hg, and CO2 using geometric Brownian motion with drift, volatility, and jumps, and use an options-based analysis to find the value of the alternatives. In the absence of a carbon price, only if the owners have a planning horizon longer than 30 years would they replace a conventional coal-fired plant with a high-performance unit such as a supercritical plant; otherwise, they would install SO2 and NOx controls on the existing unit. An expectation that the CO2 price will reach $50/t in 2020 makes the installation of an IGCC with carbon capture and sequestration attractive today, even for planning horizons as short as 20 years. A carbon price below $40/t is unlikely to produce investments in carbon capture for electric power.
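The price process described in the abstract (geometric Brownian motion with drift, volatility, and jumps) can be sketched as a Monte Carlo simulation. All parameter values below are illustrative placeholders, not values from the paper.

```python
import numpy as np

def simulate_gbm_jumps(s0, mu, sigma, jump_prob, jump_size, years,
                       steps_per_year=12, n_paths=10_000, seed=0):
    """Simulate allowance-price paths under geometric Brownian motion with
    drift mu, volatility sigma, and occasional upward jumps (e.g., a sudden
    tightening of regulatory requirements). Returns terminal prices."""
    rng = np.random.default_rng(seed)
    dt = 1.0 / steps_per_year
    n_steps = years * steps_per_year
    log_s = np.full(n_paths, np.log(s0))
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), n_paths)          # Brownian increment
        jumped = rng.random(n_paths) < jump_prob * dt        # Poisson-like jump arrival
        log_s += (mu - 0.5 * sigma**2) * dt + sigma * dw + jumped * np.log(1.0 + jump_size)
    return np.exp(log_s)

# Illustrative CO2-allowance run: $10/t today, 8% drift, 30% volatility,
# a 10%/yr chance of a 50% upward jump, over a 13-year horizon.
prices = simulate_gbm_jumps(10.0, 0.08, 0.30, 0.10, 0.50, 13)
print(prices.mean())  # expected terminal price across paths
```

An options-based analysis would then value the retrofit-vs-replace alternatives against the distribution of simulated price paths rather than a single forecast, which is what makes the timing flexibility worth money.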
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arroyo, F.; Fernandez-Pereira, C.; Olivares, J.
2009-04-15
In this article, a hydrometallurgical method for the selective recovery of germanium from fly ash (FA) has been tested at pilot plant scale. The pilot plant flowsheet comprised a first stage of water leaching of FA and a subsequent selective recovery of the germanium from the leachate by a solvent extraction method. The solvent extraction method was based on Ge complexation with catechol in an aqueous solution, followed by the extraction of the Ge-catechol complex (Ge(C{sub 6}H{sub 4}O{sub 2}){sub 3}{sup 2-}) with an extracting organic reagent (trioctylamine) diluted in an organic solvent (kerosene), followed by the subsequent stripping of the organic extract. The process has been tested on a FA generated in an integrated gasification combined cycle (IGCC) process. The paper describes the designed 5 kg/h pilot plant and the tests performed on it. Under the operational conditions tested, approximately 50% of germanium could be recovered from FA after a water extraction at room temperature. Regarding the solvent extraction method, the best operational conditions for obtaining a concentrated germanium-bearing solution practically free of impurities were as follows: extraction time equal to 20 min; aqueous phase/organic phase volumetric ratio equal to 5; stripping with 1 M NaOH; stripping time equal to 30 min; and stripping phase/organic phase volumetric ratio equal to 5. 95% of the germanium was recovered from water leachates using those conditions.
[Removal of CO2 from simulated flue gas of power plants by membrane-based gas absorption processes].
Yang, Ming-Fen; Fang, Meng-Xiang; Zhang, Wei-Feng; Wang, Shu-Yuan; Xu, Zhi-Kang; Luo, Zhong-Yang; Cen, Ke-Fa
2005-07-01
Three typical absorbents, aqueous potassium aminoacetate (AAAP), monoethanolamine (MEA), and methyldiethanolamine (MDEA), were selected to investigate the performance of CO2 separation from flue gas via membrane contactors made of hydrophobic porous polypropylene hollow-fiber membranes. The effects of the absorbent type, the concentrations and flow rates of the feed gas and absorbent solution, and the cyclic CO2 loading on the removal rate and the mass transfer velocity of CO2 are discussed. The results show that the mass transfer velocity was 7.1 mol/(m2·s) for 1 mol/L MEA at a liquid flow rate of 0.1 m/s and a flue gas flow rate of 0.211 m/s. For 1 mol/L AAAP at a flow rate of 0.05 m/s and flue gas at 0.211 m/s, the CO2 removal rate (eta) was 93.2%, and eta was 98% for 4 mol/L AAAP under the same conditions. With AAAP as the absorbent, eta remained above 90% over a wider range of CO2 concentrations. These results indicate that membrane-based absorption is a broadly applicable and promising way to remove CO2 from power plant flue gas, suitable not only for the flue gas of widely used PF and NGCC plants but also, in the future, for that of IGCC plants.
Comparative analyses for selected clean coal technologies in the international marketplace
DOE Office of Scientific and Technical Information (OSTI.GOV)
Szpunar, C.B.; Gillette, J.L.
1990-07-01
Clean coal technologies (CCTs) are being demonstrated in research and development programs under public and private sponsorship. Many of these technologies could be marketed internationally. To explore the scope of these international opportunities and to match particular technologies with markets appearing to have high potential, a study was undertaken that focused on seven representative countries: Italy, Japan, Morocco, Turkey, Pakistan, the People's Republic of China, and Poland. The results suggest that there are international markets for CCTs and that these technologies can be cost competitive with more conventional alternatives. The identified markets include construction of new plants and refurbishment of existing ones, especially when decision makers want to decrease dependence on imported oil. This report describes potential international market niches for U.S. CCTs and discusses the status and implications of ongoing CCT demonstration activities. Twelve technologies were selected as representative of technologies under development for use in new or refurbished industrial or electric utility applications. Included are the following: two generic precombustion technologies: two-stage froth-flotation coal beneficiation and coal-water mixtures (CWMs); four combustion technologies: slagging combustors, integrated-gasification combined-cycle (IGCC) systems, atmospheric fluidized-bed combustors (AFBCs), and pressurized fluidized-bed combustors (PFBCs); and six postcombustion technologies: limestone-injection multistage burner (LIMB) systems, gas-reburning sorbent-injection (GRSI) systems, dual-alkali flue-gas desulfurization (FGD), spray-dryer FGD, the NOXSO process, and selective catalytic reduction (SCR) systems. Major chapters of this report have been processed separately for inclusion in the database.
INTEGRATED GASIFICATION COMBINED CYCLE PROJECT 2 MW FUEL CELL DEMONSTRATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
FuelCell Energy
2005-05-16
With about 50% of power generation in the United States derived from coal and projections indicating that coal will continue to be the primary fuel for power generation in the next two decades, the Department of Energy (DOE) Clean Coal Technology Demonstration Program (CCTDP) has been conducted since 1985 to develop innovative, environmentally friendly processes for the world energy marketplace. The 2 MW Fuel Cell Demonstration was part of the Kentucky Pioneer Energy (KPE) Integrated Gasification Combined Cycle (IGCC) project selected by DOE under Round Five of the Clean Coal Technology Demonstration Program. The participant in the CCTDP V Project was Kentucky Pioneer Energy for the IGCC plant. FuelCell Energy, Inc. (FCE), under subcontract to KPE, was responsible for the design, construction, and operation of the 2 MW fuel cell power plant. Duke Fluor Daniel provided engineering design and procurement support for the balance-of-plant skids. Colt Engineering Corporation provided engineering design, fabrication, and procurement of the syngas processing skids. Jacobs Applied Technology provided the fabrication of the fuel cell module vessels. Wabash River Energy Ltd (WREL) provided the test site. The 2 MW fuel cell power plant utilizes FuelCell Energy's Direct Fuel Cell (DFC) technology, which is based on the internally reforming carbonate fuel cell. This plant is capable of operating on coal-derived syngas as well as natural gas. Prior testing (1992) of a subscale 20 kW carbonate fuel cell stack at the Louisiana Gasification Technology Inc. (LGTI) site using the Dow/Destec gasification plant indicated that operation on coal-derived gas provided normal performance and stable operation. Duke Fluor Daniel and FuelCell Energy developed a commercial plant design for the 2 MW fuel cell. The plant was designed to be modular, factory assembled, and truck shippable to the site.
Five balance-of-plant skids incorporating fuel processing, anode gas oxidation, heat recovery, water treatment/instrument air, and power conditioning/controls were built and shipped to the site. The two fuel cell modules, each rated at 1 MW on natural gas, were fabricated by FuelCell Energy in its Torrington, CT manufacturing facility. The fuel cell modules were conditioned and tested at FuelCell Energy in Danbury and shipped to the site. Installation of the power plant and connection to all required utilities and syngas was completed. Pre-operation checkout of the entire power plant was conducted, and the plant was ready to operate in July 2004. However, fuel gas (natural gas or syngas) was not available at the WREL site due to technical difficulties with the gasifier and other issues. The fuel cell power plant was therefore not operated, and was subsequently removed by October 2005. The WREL fuel cell site was restored to the satisfaction of WREL. FuelCell Energy continues to market carbonate fuel cells for natural gas and digester gas applications. A fuel cell/turbine hybrid is being developed and tested that provides higher efficiency, with the potential to reach the DOE goal of 60% HHV on coal gas. A system study was conducted for a 40 MW direct fuel cell/turbine hybrid (DFC/T) with potential for future coal gas applications. In addition, FCE is developing Solid Oxide Fuel Cell (SOFC) power plants with Versa Power Systems (VPS) as part of the Solid State Energy Conversion Alliance (SECA) program and has an ongoing program for co-production of hydrogen. Future development in these technologies can lead to future coal gas fuel cell applications.
Advanced Hydrogen Turbine Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joseph Fadok
2008-01-01
Siemens has developed a roadmap to achieve the DOE goals for efficiency, cost reduction, and emissions through innovative approaches and novel technologies that build upon worldwide IGCC operational experience, platform technology, and extensive experience in G-class operating conditions. In Phase 1, the technologies and concepts necessary to achieve the program goals were identified for the gas turbine components and supporting technology areas, and testing plans were developed to mitigate identified risks. Multiple studies were conducted to evaluate the impact of different gas turbine and plant technologies on plant performance. The 2015 gas turbine technologies showed a significant improvement in IGCC plant efficiency; however, a severe performance penalty was calculated for high carbon capture cases. Thermodynamic calculations showed that the DOE 2010 and 2015 efficiency targets can be met with a two-step approach. A risk management process was instituted in Phase 1 to identify risks and develop mitigation plans. For the risks identified, testing and development programs are in place, and the risks will be revisited periodically to determine if changes to the plan are necessary. A compressor performance prediction has shown that the design of the compressor for the engine can be achieved with additional stages added to the rear of the compressor. Tip clearance effects were studied, as well as a range of flows and pressure ratios, to evaluate the impacts on both performance and stability. Considerable data was obtained on the four candidate combustion systems: diffusion, catalytic, premix, and distributed combustion. Based on the results of Phase 1, the premixed combustion system and the distributed combustion system were chosen as having the most potential and will be the focus of Phase 2 of the program. Significant progress was also made in obtaining combustion kinetics data for high hydrogen fuels.
The Phase 1 turbine studies indicate initial feasibility of an advanced hydrogen turbine that meets the aggressive targets set forth for it, including increased rotor inlet temperature (RIT), lower total cooling and leakage air (TCLA) flow, higher pressure ratio, and higher mass flow through the turbine compared to the baseline. Maintaining efficiency with high mass flow syngas combustion is achieved using a large, high-AN2 blade 4, which has been identified as a significant advancement beyond the current state of the art. Preliminary results showed feasibility of a rotor system capable of increased power output and operating conditions above the baseline. In addition, several concepts were developed for casing components to address higher operating conditions. A rare earth modified bond coat, intended to reduce oxidation and TBC spallation, demonstrated an increase in TBC spallation life of almost 40%. The results from Phase 1 identified two TBC compositions which satisfy the thermal conductivity requirements and have demonstrated phase stability up to temperatures of 1850°C. The potential to join alloys using a bonding process has been demonstrated, and initial HVOF spray deposition trials were promising. A qualitative ranking of alloys and coatings in environmental conditions was also performed using isothermal tests, where significant variations in alloy degradation were observed as a function of gas composition. Initial basic system configuration schematics and working system descriptions have been produced to define key boundary data and support estimation of costs. A review of existing materials in use for hydrogen transportation shows benefits and tradeoffs for materials that could be used in applications of this type. Hydrogen safety will become a greater risk than with natural gas fuel, as the work done to date in other areas has shown direct implications for this type of use.
Studies were conducted which showed reduced CO{sub 2} and NOx emissions with increased plant efficiency. An approach to maximize plant output is needed in order to address the DOE turbine goal of a 20-30% reduction in combined cycle cost from the baseline. A customer advisory board was instituted during Phase 1 to obtain important feedback regarding the future direction of the project. The technologies being developed for the hydrogen turbine will also be utilized, as appropriate, in the 2010 time frame engine and the FutureGen plant. These new technologies and concepts also have the potential to accelerate commercialization of advanced coal-based IGCC plants in the U.S. and around the world, thereby reducing emissions, water use, solid waste production, and dependence on scarce, expensive, and insecure foreign energy supplies. Technology developments accomplished in Phase 1 provide a solid foundation for successful completion of Phase 2 and confidence that the challenging program goals will be achieved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-12-31
The US Department of Energy (DOE) Morgantown Energy Technology Center (METC) is sponsoring research in advanced methods for controlling contaminants in hot coal gasifier gas (coal gas) streams of integrated gasification combined-cycle (IGCC) power systems. The programs focus on hot-gas particulate removal and desulfurization technologies that match or nearly match the temperatures and pressures of the gasifier, cleanup system, and power generator. The work seeks to eliminate the need for expensive heat recovery equipment, reduce efficiency losses due to quenching, and minimize wastewater treatment costs. The goal of this project is to continue further development of the zinc titanate desulfurization and direct sulfur recovery process (DSRP) technologies by (1) scaling up the zinc titanate reactor system; (2) developing an integrated skid-mounted zinc titanate desulfurization-DSRP reactor system; (3) testing the integrated system over an extended period with real coal gas from an operating gasifier to quantify the degradative effect, if any, of the trace contaminants present in coal gas; (4) developing an engineering database suitable for system scaleup; and (5) designing, fabricating, and commissioning a larger DSRP reactor system capable of operating on a six-fold greater volume of gas than the DSRP reactor used in the bench-scale field test. The work performed during the April 1 through June 30, 1996 period is described.
Innovative energy technologies and climate policy in Germany
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schumacher, Katja; Sands, Ronald D.
2006-12-01
Due to the size and structure of its economy, Germany is one of the largest carbon emitters in the European Union. However, Germany is facing a major renewal and restructuring process in electricity generation. Within the next two decades, up to 50% of current electricity generation capacity may retire because of end-of-plant lifetime and the nuclear phase-out pact of 1998. Substantial opportunities therefore exist for deployment of advanced electricity generating technologies in both a projected baseline and in alternative carbon policy scenarios. We simulate the potential role of coal integrated gasification combined cycle (IGCC), natural gas combined cycle (NGCC), carbon dioxide capture and storage (CCS), and wind power within a computable general equilibrium model of Germany from the present through 2050. These advanced technologies and their role within a future German electricity system are the focus of this paper. We model the response of greenhouse gas emissions in Germany to various technology and carbon policy assumptions over the next few decades. In our baseline scenario, all of the advanced technologies except CCS provide substantial contributions to electricity generation. We also calculate the carbon price at which each fossil technology, combined with CCS, becomes competitive. Constant carbon price experiments are used to characterize the model response to a carbon policy. This provides an estimate of the cost of meeting an emissions target, and the share of emissions reductions available from the electricity generation sector.
Park, Sungwon; Lee, Seungmin; Lee, Youngjun; Seo, Yongwon
2013-07-02
In order to investigate the feasibility of semiclathrate hydrate-based precombustion CO2 capture, thermodynamic, kinetic, and spectroscopic studies were undertaken on the semiclathrate hydrates formed from a fuel gas mixture of H2 (60%) + CO2 (40%) in the presence of quaternary ammonium salts (QASs) such as tetra-n-butylammonium bromide (TBAB) and fluoride (TBAF). The inclusion of QASs significantly stabilized the hydrate dissociation conditions, and this effect was greater for TBAF than for TBAB. However, due to the presence of dodecahedral cages that are partially filled with water molecules, TBAF showed a relatively lower gas uptake than TBAB. From the stability condition measurements and compositional analyses, it was found that with only one step of semiclathrate hydrate formation with the fuel gas mixture from IGCC plants, 95% CO2 can be enriched in the semiclathrate hydrate phase at room temperature. The enclathration of both CO2 and H2 in the cages of the QAS semiclathrate hydrates and the structural transition that results from the inclusion of QASs were confirmed through Raman and (1)H NMR measurements. The experimental results obtained in this study provide the physicochemical background required for understanding selective partitioning and distributions of guest gases in the QAS semiclathrate hydrates and for investigating the feasibility of a semiclathrate hydrate-based precombustion CO2 capture process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joseph Rabovitser
The report presents a feasibility study of a new type of gas turbine. A partial oxidation gas turbine (POGT) shows potential for very high efficiency power generation and ultra-low emissions. Two main features distinguish a POGT from a conventional gas turbine; these are associated with the design arrangement and the thermodynamic processes used in operation. A primary design difference of the POGT is the use of a non-catalytic partial oxidation reactor (POR) in place of a conventional combustor. Another important distinction is that a much smaller compressor is required, one that typically supplies less than half of the air flow required in a conventional gas turbine. From an operational and thermodynamic point of view, a key distinguishing feature is that the working fluid, the fuel gas provided by the POR, has a much higher specific heat than lean combustion products, so more energy per unit mass of fluid can be extracted by the POGT expander than in conventional systems. The POGT exhaust stream contains unreacted fuel that can be combusted in a bottoming cycle or used as syngas for hydrogen or other chemicals production. POGT studies include a feasibility design for converting a conventional turbine to POGT duty, and system analyses of POGT-based units for production of power alone and for combined production of power and syngas/hydrogen for different applications. A retrofit design study was completed for three engines, SGT-800, SGT-400, and SGT-100, and includes: replacing the combustor with the POR, compressor downsizing to about 50% of design flow rate, generator replacement with a 60-90% power output increase, overall unit integration, and extensive testing. POGT performances for four turbines with power output up to 350 MW in POGT mode were calculated.
With a POGT as the topping cycle for power generation systems, the power output from the POGT could be increased up to 90% compared to a conventional engine while keeping hot section temperatures, pressures, and volumetric flows practically identical. In POGT mode, the turbine specific power (turbine net power per lb mass flow from expander exhaust) is twice the value of the conventional turbine. A POGT-based IGCC plant conceptual design was developed and major components have been identified. A fuel-flexible fluid bed gasifier and the novel POGT unit are the key components of the 100 MW IGCC plant for co-producing electricity, hydrogen, and/or syngas. Plant performances were calculated for bituminous coal and oxygen-blown versions. Various POGT-based, natural gas fueled systems for production of electricity only, co-production of electricity and hydrogen, and co-production of electricity and syngas for gas-to-liquid and chemical processes were developed and evaluated. Performance calculations for several versions of these systems were conducted. A 64.6% LHV fuel-to-electricity efficiency in combined cycle was achieved. Such a high efficiency arises from using syngas from the POGT exhaust as a fuel that can provide the required temperature level for superheated steam generation in the HRSG, as well as for combustion air preheating. Studies of POGT materials and of combustion instabilities in the POR were conducted and the results reported. A preliminary market assessment was performed, and recommendations for POGT system applications in the oil industry were defined. POGT technology is ready to proceed to the engineering prototype stage, which is recommended.
Low-pressure hydrocracking of coal-derived Fischer-Tropsch waxes to diesel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dieter Leckel
2007-06-15
Coal-derived low-temperature Fischer-Tropsch (LTFT) wax was hydrocracked at pressures of 3.5-7.0 MPa using silica-alumina-supported sulfided NiW/NiMo and an unsulfided noble metal catalyst, modified with MoO{sub 3}. A low-pressure operation at 3.5 MPa produced a highly isomerized diesel, having low cloud points (from -12 to -28{sup o}C) combined with high cetane numbers (69-73). These properties, together with the extremely low sulfur ({lt}5 ppm) and aromatic ({lt}0.5%) contents, place coal-to-liquid (CTL) derived distillates as highly valuable blending components to achieve Eurograde diesel specifications. The upgrading of coal-based LTFT waxes through hydrocracking to high-quality diesel fuel blend components, in combination with commercially feasible coal-integrated gasification combined cycle (coal-IGCC) CO{sub 2} capture and storage schemes, should make CTL technology more attractive. 28 refs., 7 figs., 8 tabs.
Dual-track CCS stakeholder engagement: Lessons learned from FutureGen in Illinois
Hund, G.; Greenberg, S.E.
2011-01-01
FutureGen, as originally planned, was to be the world's first coal-fueled, near-zero emissions power plant with fully integrated, 90% carbon capture and storage (CCS). From conception through siting and design, it enjoyed strong support from multiple stakeholder groups, which benefited the overall project. Understanding the stakeholder engagement process for this project provides valuable insights into the design of stakeholder programs for future CCS projects. FutureGen is one of few projects worldwide that used open competition for siting both the power plant and storage reservoir. Most site proposals were coordinated by State governments. It was unique in this and other respects relative to the site selection method used on other DOE-supported projects. At the time of site selection, FutureGen was the largest proposed facility designed to combine an integrated gasification combined cycle (IGCC) coal-fueled power plant with a CCS system. Stakeholder engagement by states and the industry consortium responsible for siting, designing, building, and operating the facility took place simultaneously and on parallel tracks. On one track were states spearheading state-wide site assessments to identify candidate sites that they wanted to propose for consideration. On the other track was a public-private partnership between an industry consortium of thirteen coal companies and electric utilities that comprised the FutureGen Alliance (Alliance) and the U.S. Department of Energy (DOE). The partnership was based on a cooperative agreement signed by both parties, which assigned the lead for siting to the Alliance. This paper describes the stakeholder engagement strategies used on both of these tracks and provides examples from the engagement process using the Illinois semi-finalist sites. © 2011 Published by Elsevier Ltd.
Gasification Product Improvement Facility (GPIF). Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-09-01
The gasifier selected for development under this contract is an innovative and patented hybrid technology which combines the best features of both fixed-bed and fluidized-bed types. PyGas{trademark}, meaning Pyrolysis Gasification, is well suited for integration into advanced power cycles such as IGCC. It is also well matched to hot gas clean-up technologies currently in development. Unlike other gasification technologies, PyGas can be designed into both large and small scale systems. It is expected that partial repowering with PyGas could be done at a cost of electricity of only 2.78 cents/kWh, more economical than natural gas repowering. It is extremely unfortunate that Government funding for such a noble cause is becoming reduced to the point where current contracts must be canceled. The Gasification Product Improvement Facility (GPIF) project was initiated to provide a test facility to support early commercialization of advanced fixed-bed coal gasification technology at a cost approaching $1,000 per kilowatt for electric power generation applications. The project was to include an innovative, advanced, air-blown, pressurized, fixed-bed, dry-bottom gasifier and a follow-on hot metal oxide gas desulfurization sub-system. To help defray the cost of testing materials, the facility was to be located at a nearby utility coal-fired generating site. The patented PyGas{trademark} technology was selected via a competitive bidding process as the candidate which best fit overall DOE objectives. The paper describes the accomplishments to date.
Development and Testing of PRD-66 Hot Gas Filters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chambers, J.A.; Garnier, J.E.; McMahon, T. J.
1996-12-31
The overall objective of this program is to develop and commercialize PRD-66 hot gas filters for application in pressurized fluidized bed combustors (PFBC) and Integrated Gasification Combined Cycle (IGCC) power generation systems. The work is being carried out in phases with the following specific objectives: 1. Demonstrate acceptable mechanical, chemical, and filtration properties in exposure tests. 2. Produce and qualify selected prototype design filter elements in high temperature high pressure (HTHP) simulated PFBC exposure tests. 3. (Option) Generate a manufacturing plan to support commercial scale-up. 4. (Option) Recommend process equipment upgrades and produce 50 candle filters. Since the beginning of this program, a parallel evaluation of DuPont Lanxide Composites Inc. (DLC) PRD-66 hot gas candle filters took place using AEP's TIDD PFBC facility. Several PRD-66 filters experienced damage during the final testing phase at TIDD, after highly successful testing in earlier runs. During the past year, DLC has undertaken a study under this contract to understand the mechanism of damage sustained in TIDD Test Segment 5. DLC has formulated a hypothesis for the damage mechanism based on the available evidence, and verified that the damage mechanism is possible given the conditions known to exist in TIDD. Improvements to the filter design to eliminate the root cause of the failure have been undertaken. This report details DLC's conclusions regarding the failure mechanism, the evidence supporting the conclusions, and steps being taken to eliminate the root cause.
MCM-41 support for ultrasmall γ-Fe2O3 nanoparticles for H2S removal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cara, C.; Rombi, E.; Musinu, A.
2017-07-08
In this paper, MCM-41 is proposed to build mesostructured Fe2O3-based sorbents as an alternative to other silica or alumina supports for mid-temperature H2S removal. MCM-41 was synthesized as micrometric (MCM41_M) and nanometric (MCM41_N) particles and impregnated through an efficient two-solvent (hexane–water) procedure to obtain the corresponding γ-Fe2O3@MCM-41 composites. The active phase is homogeneously dispersed within the 2 nm channels in the form of ultrasmall maghemite nanoparticles, assuring high active-phase reactivity. The final micrometric (Fe_MCM41_M) and nanometric (Fe_MCM41_N) composites were tested as sorbents for hydrogen sulphide removal at 300 °C, and the results were compared with a reference sorbent (commercial unsupported ZnO) and an analogous silica-based sorbent (Fe_SBA15). The MCM-41-based sorbents, having the highest surface areas, showed superior performances that were retained after the first sulphidation cycle. Specifically, the micrometric sorbent (Fe_MCM41_M) showed a higher SRC value than the nanometric one (Fe_MCM41_N), due to the low stability of the nanosized particles over time caused by their high reactivity. Finally, the low regeneration temperature (300–350 °C), besides the high removal capacity, renders MCM-41-based systems an alternative class of regenerable sorbents for thermally efficient clean-up processes in Integrated Gasification Combined Cycle (IGCC) systems.
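Reading SRC as sulfur retention capacity (an assumption; the abstract does not expand the acronym), the figure sorbents are compared on can be estimated from a breakthrough test. The sketch below uses entirely hypothetical test conditions, not data from the paper:

```python
def sulfur_retention_capacity(flow_ml_min, h2s_ppm, breakthrough_min,
                              sorbent_g, molar_volume_l=24.0):
    """Estimate sulfur retention capacity (g S per 100 g sorbent) at breakthrough.

    Assumes one S atom per H2S molecule and an ideal-gas molar volume
    supplied by the caller; all numbers here are illustrative.
    """
    S_MOLAR_MASS = 32.06  # g/mol
    h2s_l = (flow_ml_min / 1000.0) * breakthrough_min * h2s_ppm * 1e-6  # L H2S fed
    s_g = h2s_l / molar_volume_l * S_MOLAR_MASS                         # grams of S
    return 100.0 * s_g / sorbent_g

# Hypothetical run: 100 mL/min feed with 5000 ppm H2S, breakthrough at
# 60 min over a 0.5 g sorbent bed.
src = sulfur_retention_capacity(100, 5000, 60, 0.5)
```

Under these placeholder conditions the bed retains roughly 8 g S per 100 g sorbent before breakthrough.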
Pre-Combustion Carbon Capture by a Nanoporous, Superhydrophobic Membrane Contactor Process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyer, Howard; Zhou, S James; Ding, Yong
2012-03-31
This report summarizes progress made during Phase I and Phase II of the project: "Pre-Combustion Carbon Capture by a Nanoporous, Superhydrophobic Membrane Contactor Process," under contract DE-FE-0000646. The objective of this project is to develop a practical and cost-effective technology for CO{sub 2} separation and capture for pre-combustion coal-based gasification plants using a membrane contactor/solvent absorption process. The goal of this technology development project is to separate and capture at least 90% of the CO{sub 2} from Integrated Gasification Combined Cycle (IGCC) power plants with less than a 10% increase in the cost of energy services. Unlike conventional gas separation membranes, the membrane contactor is a novel gas separation process based on the gas/liquid membrane concept. The membrane contactor is an advanced mass transfer device that operates with liquid on one side of the membrane and gas on the other. The membrane contactor can operate with pressures that are almost the same on both sides of the membrane, whereas gas separation membranes use the differential pressure across the membrane as the driving force for separation. The driving force for separation in the membrane contactor process is the chemical potential difference of CO{sub 2} between the gas phase and the absorption liquid. This process is thus easily tailored to suit the needs of pre-combustion separation and capture of CO{sub 2}. Gas Technology Institute (GTI) and PoroGen Corporation (PGC) have developed a novel hollow fiber membrane technology based on the chemically and thermally resistant commercial engineered polymer poly(ether ether ketone), or PEEK. The PEEK membrane material used in the membrane contactor during this technology development program is a high temperature engineered plastic that is virtually non-destructible under the operating conditions encountered in typical gas absorption applications. 
It can withstand contact with most of the common treating solvents. GTI and PGC have developed a nanoporous and superhydrophobic PEEK-based hollow fiber membrane contactor tailored for the membrane contactor/solvent absorption application for syngas cleanup. The membrane contactor modules were scaled up to 8-inch diameter commercial size modules. We have performed extensive laboratory and bench testing using pure gases, a simulated water-gas-shifted (WGS) syngas stream, and a slipstream of gasification-derived syngas from GTI's Flex-Fuel Test Facility (FFTF) gasification plant under commercially relevant conditions. The team has also carried out an engineering and economic analysis of the membrane contactor process to evaluate the economics of this technology and its commercial potential. Our test results have shown that 90% CO{sub 2} capture can be achieved with several physical solvents such as water and chilled methanol. The rate of CO{sub 2} removal by the membrane contactor is in the range of 1.5 to 2.0 kg/m{sup 2}/hr, depending on the operating pressures and temperatures and on the solvents used. The final economic analysis has shown that the membrane contactor process will cause the cost of electricity to increase by 21% from the base plant without CO{sub 2} capture. The goal of a 10% increase in levelized cost of electricity (LCOE) from the base DOE Case 1 (base plant without capture) is not achieved by the membrane contactor. However, the 21% increase in LCOE is a substantial improvement compared with the 31.6% increase in LCOE in DOE Case 2 (state-of-the-art capture technology using two stages of Selexol{TM}).
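The reported removal rate of 1.5 to 2.0 kg/m{sup 2}/hr implies a first-order membrane area for a given capture duty. A minimal sketch, assuming a hypothetical 5000 t/day capture duty (not a figure from the report):

```python
def membrane_area_m2(co2_capture_t_per_day, flux_kg_m2_hr):
    """First-order membrane area for a given CO2 capture duty and flux.

    Ignores driving-force decline along the module, staging, and turndown;
    the capture duty used below is an illustrative plant figure.
    """
    kg_per_hr = co2_capture_t_per_day * 1000.0 / 24.0  # t/day -> kg/hr
    return kg_per_hr / flux_kg_m2_hr

# Reported flux range of 1.5-2.0 kg/m2/hr applied to a nominal 5000 t/day duty:
area_best = membrane_area_m2(5000, 2.0)   # high-flux end of the range
area_worst = membrane_area_m2(5000, 1.5)  # low-flux end of the range
```

The spread between the two areas (roughly 1.0e5 vs 1.4e5 m{sup 2} here) shows why the flux dependence on pressure, temperature, and solvent matters for contactor economics.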
CO2 Capture and Storage in Coal Gasification Projects
NASA Astrophysics Data System (ADS)
Rao, Anand B.; Phadke, Pranav C.
2017-07-01
In response to the global climate change problem, the world community today is in search of an effective means of carbon mitigation. India is a major developing economy, and its economic growth is driven by ever-increasing consumption of energy. Coal is the only fossil fuel that is available in abundance in India and contributes the major share of the total primary energy supply (TPES) in the country. Owing to the large unmet demand for affordable energy, primarily driven by the need for infrastructure development and the increasing incomes and aspirations of people, as well as energy security concerns, India is expected to have continued dependence on coal. Coal is not only the backbone of electric power generation; many major industries like cement, iron and steel, bricks, and fertilizers also consume large quantities of coal. India has very low carbon emissions (~1.5 tCO2 per capita) compared to the world average (4.7 tCO2 per capita) and the developed world (11.2 tCO2 per capita). Although the aggregate emissions of the country are increasing with the rising population and fossil energy use, India has made very little contribution to the historical GHG accumulation in the atmosphere linked to the climate change problem. However, a large fraction of Indian society is vulnerable to the impacts of climate change - due to its geographical location, large dependence on monsoon-based agriculture, and limited technical, financial and institutional capacity. Today, India holds a large potential to offer cost-effective carbon mitigation to tackle the climate change problem. Carbon Capture and Storage (CCS) is the process of extraction of Carbon Dioxide (CO2) from industrial and energy-related sources, transport to storage locations, and long-term isolation from the atmosphere. 
It is a technology that has been developed in recent times and is considered a bridging technology as we move towards carbon-neutral energy sources in response to growing concerns about the climate change problem. Carbon Capture and Storage (CCS) is considered a promising carbon mitigation technology, especially for large point sources such as coal power plants. Gasification of coal helps in better utilization of this resource, offering multiple advantages such as pollution prevention, product flexibility (syngas and hydrogen) and higher efficiency (combined cycle). It also enables the capture of CO2 prior to combustion, from the fuel gas mixture, at relatively lower cost than post-combustion CO2 capture. CCS in gasification projects is considered a promising technology for cost-effective carbon mitigation. Although many projects (power and non-power) have been announced internationally, very few large-scale projects have actually come up. This paper looks at the various aspects of CCS applications in gasification projects, including technical feasibility and economic viability, and discusses an Indian perspective. The impacts of including CCS in gasification projects (e.g. IGCC plants) have been assessed using a simulation tool. The Integrated Environmental Control Model (IECM) - a modelling framework to simulate power plants - has been used to estimate the implications of adding CCS units to IGCC plants on their performance and costs.
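Assessments of the kind described here typically summarize the performance-and-cost impact of a CCS retrofit with the standard cost-of-CO2-avoided metric. A minimal sketch with hypothetical IGCC numbers (not IECM outputs):

```python
def cost_of_co2_avoided(lcoe_ref, lcoe_cc, em_ref, em_cc):
    """Standard cost-of-CO2-avoided metric in $/t CO2.

    lcoe_* are levelized costs of electricity in $/MWh; em_* are emission
    rates in t CO2/MWh for the reference and capture plants.
    """
    return (lcoe_cc - lcoe_ref) / (em_ref - em_cc)

# Illustrative IGCC case: LCOE rises from $90 to $125/MWh while emissions
# fall from 0.75 to 0.10 t CO2/MWh after adding capture.
avoided = cost_of_co2_avoided(90.0, 125.0, 0.75, 0.10)
```

For these placeholder values the avoided cost comes to roughly $54/t CO2; the metric is sensitive to both the LCOE increase and how deeply emissions are cut.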
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2006-07-01
A variety of papers/posters were presented on topics concerning power generation, including solid oxide fuel cells, hydrogen production, mercury as a combustion product, and carbon dioxide separation from flue gas. A total of 31 presentations in slide/overview/viewgraph form and with a separate abstract are available online (one in abstract form only), along with 24 poster papers (text). In addition, 41 abstracts only are available. Papers of particular interest include: Hydrogen production from hydrogen sulfide in IGCC power plants; Oxidation of mercury in products of coal combustion; Computer aided design of advanced turbine aerofoil alloys for industrial gas turbines in coal fired environments; Developing engineered fuel using flyash and biomass; Conversion of hydrogen sulfide in coal gases to elemental sulfur with monolithic catalysts; Intelligent control via wireless sensor networks for advanced coal combustion systems; and Investment of fly ash and activated carbon obtained from pulverized coal boilers (poster).
Tyurin, Michael; Kiriukhin, Michael
2013-09-01
Methanol-resistant mutant acetogen Clostridium sp. MT1424, originally producing only 365 mM acetate from CO₂/CO, was engineered to eliminate acetate production and spore formation using the Cre-lox66/lox71 system, in order to power subsequent methanol production via expression of synthetic methanol dehydrogenase, formaldehyde dehydrogenase and formate dehydrogenase, three copies of each, assembled in a cluster and integrated into the chromosome using a Tn7-based approach. Production of 2.2 M methanol was steady (p < 0.005) in single-step fermentations of a 20% CO₂ + 80% H₂ (v/v) blend in 25-day runs, each in five independent repeats. If the integrated cluster comprised only the three copies of formate dehydrogenase, the respective recombinants produced 95 mM formate (p < 0.005) under the same conditions. For commercialization, the suggested source of inorganic carbon would be CO₂ waste from an IGCC power plant. Hydrogen may be produced in situ via electrolysis powered by solar panels.
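The 20% CO₂ + 80% H₂ feed can be checked against the stoichiometry of CO₂ hydrogenation to methanol (CO₂ + 3 H₂ → CH₃OH + H₂O); the sketch below uses only the feed fractions quoted in the abstract:

```python
# Stoichiometry of CO2 hydrogenation to methanol: CO2 + 3 H2 -> CH3OH + H2O.
# The 20% CO2 + 80% H2 (v/v) blend supplies H2 above the 3:1 stoichiometric
# requirement, leaving headroom for H2 lost to biomass and byproducts.
co2_frac, h2_frac = 0.20, 0.80
h2_per_co2 = h2_frac / co2_frac              # moles H2 fed per mole CO2 (= 4.0)
stoich_ratio = 3.0                           # moles H2 required per mole CO2
excess_h2 = h2_per_co2 / stoich_ratio - 1.0  # fractional H2 excess (~0.33)
```

So the blend carries about a one-third molar excess of H₂ over the methanol stoichiometry.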
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eric Larson; Robert Williams; Thomas Kreutz
2012-03-11
The overall objective of this project was to quantify the energy, environmental, and economic performance of industrial facilities that would coproduce electricity and transportation fuels or chemicals from a mixture of coal and biomass via co-gasification in a single pressurized, oxygen-blown, entrained-flow gasifier, with capture and storage of CO{sub 2} (CCS). The work sought to identify plant designs with promising (Nth plant) economics, superior environmental footprints, and the potential to be deployed at scale as a means for simultaneously achieving enhanced energy security and deep reductions in U.S. GHG emissions in the coming decades. Designs included systems using primarily already-commercialized component technologies, which may have the potential for near-term deployment at scale, as well as systems incorporating some advanced technologies at various stages of R&D. All of the coproduction designs have the common attribute of producing some electricity and also of capturing CO{sub 2} for storage. For each of the co-product pairs, detailed process mass and energy simulations (using Aspen Plus software) were developed for a set of alternative process configurations, on the basis of which lifecycle greenhouse gas emissions, Nth plant economic performance, and other characteristics were evaluated for each configuration. In developing each set of process configurations, focused attention was given to understanding the influence of biomass input fraction and electricity output fraction. Self-consistent evaluations were also carried out for gasification-based reference systems producing only electricity from coal, including integrated gasification combined cycle (IGCC) and integrated gasification solid-oxide fuel cell (IGFC) systems. The reason biomass is considered as a co-feed with coal in cases when gasoline or olefins are co-produced with electricity is to help reduce lifecycle greenhouse gas (GHG) emissions for these systems. 
Storing biomass-derived CO{sub 2} underground represents negative CO{sub 2} emissions if the biomass is grown sustainably (i.e., if one ton of new biomass growth replaces each ton consumed), and this offsets positive CO{sub 2} emissions associated with the coal used in these systems. Different coal:biomass input ratios will produce different net lifecycle greenhouse gas (GHG) emissions for these systems, which is the reason that attention in our analysis was given to the impact of the biomass input fraction. In the case of systems that produce only products with no carbon content, namely electricity, ammonia and hydrogen, only coal was considered as a feedstock because it is possible in theory to essentially fully decarbonize such products by capturing all of the coal-derived CO{sub 2} during the production process.
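The offset logic described above reduces, to first order, to a linear blend of a positive coal term and a negative biomass-with-CCS term. A minimal sketch with placeholder coefficients (not values from the report's lifecycle analysis):

```python
def net_ghg_intensity(biomass_frac, coal_ghg=90.0, biomass_ccs_ghg=-80.0):
    """Net lifecycle GHG intensity of a coal/biomass co-feed plant with CCS.

    Simple linear blend of a positive coal term and a negative term for
    sustainably grown biomass whose CO2 is captured and stored. The
    coefficients (kg CO2e per GJ of feed) are illustrative placeholders.
    """
    return (1.0 - biomass_frac) * coal_ghg + biomass_frac * biomass_ccs_ghg

# Biomass input fraction at which net lifecycle emissions cross zero
# for these placeholder coefficients:
zero_frac = 90.0 / (90.0 + 80.0)
```

This is why the biomass input fraction is the key lever: below `zero_frac` the plant is net-positive, above it net-negative, under the stated assumptions.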
FUEL-FLEXIBLE GASIFICATION-COMBUSTION TECHNOLOGY FOR PRODUCTION OF H2 AND SEQUESTRATION-READY CO2
DOE Office of Scientific and Technical Information (OSTI.GOV)
George Rizeq; Janice West; Arnaldo Frydman
It is expected that in the 21st century the Nation will continue to rely on fossil fuels for electricity, transportation, and chemicals. It will be necessary to improve both the process efficiency and environmental impact performance of fossil fuel utilization. GE Global Research (GEGR) has developed an innovative fuel-flexible Unmixed Fuel Processor (UFP) technology to produce H{sub 2}, power, and sequestration-ready CO{sub 2} from coal and other solid fuels. The UFP module offers the potential for reduced cost, increased process efficiency relative to conventional gasification and combustion systems, and near-zero pollutant emissions including NO{sub x}. GEGR (prime contractor) was awarded a contract from U.S. DOE NETL to develop the UFP technology. Work on this Phase I program started on October 1, 2000. The project team includes GEGR, Southern Illinois University at Carbondale (SIU-C), California Energy Commission (CEC), and T. R. Miles, Technical Consultants, Inc. In the UFP technology, coal and air are simultaneously converted into separate streams of (1) high-purity hydrogen that can be utilized in fuel cells or turbines, (2) sequestration-ready CO{sub 2}, and (3) high temperature/pressure vitiated air to produce electricity in a gas turbine. The process produces near-zero emissions and, based on Aspen Plus process modeling, has an estimated process efficiency 6 percentage points higher than IGCC with conventional CO{sub 2} separation. The current R&D program will determine the feasibility of the integrated UFP technology through pilot-scale testing, and will investigate operating conditions that maximize separation of CO{sub 2} and pollutants from the vent gas, while simultaneously maximizing coal conversion efficiency and hydrogen production. The program integrates experimental testing, modeling and economic studies to demonstrate the UFP technology. This is the third annual technical progress report for the UFP program supported by U.S. DOE NETL (Contract No. 
DE-FC26-00FT40974). This report summarizes program accomplishments for the period starting October 1, 2002 and ending September 30, 2003. The report includes an introduction summarizing the UFP technology, main program tasks, and program objectives; it also provides a summary of program activities and accomplishments covering progress in tasks including lab-scale experimental testing, bench-scale experimental testing, process modeling, pilot-scale system design and assembly, and program management.
Novel polymer membrane process for pre-combustion CO{sub 2} capture from coal-fired syngas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merkel, Tim
2011-09-14
This final report describes work conducted for the Department of Energy (DOE NETL) on development of a novel polymer membrane process for pre-combustion CO{sub 2} capture from coal-fired syngas (award number DE-FE0001124). The work was conducted by Membrane Technology and Research, Inc. (MTR) from September 15, 2009, through December 14, 2011. Tetramer Technologies, LLC (Tetramer) was our subcontract partner on this project. The National Carbon Capture Center (NCCC) at Wilsonville, AL, provided access to syngas gasifier test facilities. The main objective of this project was to develop a cost-effective membrane process that could be used in the relatively near term to capture CO{sub 2} from shifted syngas generated by a coal-fired Integrated Gasification Combined Cycle (IGCC) power plant. In this project, novel polymeric membranes (designated as Proteus™ membranes) with separation properties superior to conventional polymeric membranes were developed. Hydrogen permeance of up to 800 gpu and H{sub 2}/CO{sub 2} selectivity of >12 were achieved using a simulated syngas mixture at 150°C and 50 psig, which exceeds the original project targets of 200 gpu for hydrogen permeance and 10 for H{sub 2}/CO{sub 2} selectivity. Lab-scale Proteus membrane modules (with a membrane area of 0.13 m{sup 2}) were also developed using scaled-up Proteus membranes and high temperature stable module components identified during this project. A mixed-gas hydrogen permeance of about 160 gpu and H{sub 2}/CO{sub 2} selectivity of >12 were achieved using a simulated syngas mixture at 150°C and 100 psig. We believe that a significant improvement in the membrane and module performance is likely with additional development work. Both Proteus membranes and lab-scale Proteus membrane modules were further evaluated using coal-derived syngas streams at the National Carbon Capture Center (NCCC). 
The results indicate that all module components, including the Proteus membrane, were stable under the field conditions (feed pressures: 150-175 psig and feed temperatures: 120-135°C) for over 600 hours. The field performance of both Proteus membrane stamps and Proteus membrane modules is consistent with the results obtained in the lab, suggesting that the presence of sulfur-containing compounds (up to 780 ppm hydrogen sulfide), saturated water vapor, carbon monoxide and heavy hydrocarbons in the syngas feed stream has no adverse effect on the Proteus membrane or module performance. We also performed an economic analysis for a number of membrane process designs developed in this project (using hydrogen-selective membranes, alone or in combination with CO{sub 2}-selective membranes). The current field performance for Proteus membranes was used in the design analysis. The study showed the current best design has the potential to reduce the increase in Levelized Cost of Electricity (LCOE) caused by 90% CO{sub 2} capture to about 15% if co-sequestration of H{sub 2}S is viable. This value is still higher than the DOE target for increase in LCOE (10%); however, compared to the base-case Selexol process that gives a 30% increase in LCOE at 90% CO{sub 2} capture, the membrane-based process appears promising. We believe future improvements in membrane performance have the potential to reach the DOE target.
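The gpu figures above can be turned into a volumetric flux using the standard unit definition (1 gpu = 1e-6 cm³(STP)/(cm²·s·cmHg)). The operating point below is illustrative, not a module rating from the report:

```python
def h2_flux_m3_m2_hr(permeance_gpu, dp_psi):
    """Convert a permeance in gpu into a volumetric H2 flux.

    Uses 1 gpu = 1e-6 cm3(STP)/(cm2 s cmHg) and 1 psi = 5.1715 cmHg;
    dp_psi is the H2 partial-pressure difference across the membrane.
    """
    CMHG_PER_PSI = 5.1715
    flux_cm3_cm2_s = permeance_gpu * 1e-6 * dp_psi * CMHG_PER_PSI
    # cm3/(cm2 s) -> m3(STP)/(m2 hr): factor (1e-6 / 1e-4) * 3600 = 36
    return flux_cm3_cm2_s * 36.0

# An 800 gpu membrane stamp at an assumed 50 psi H2 partial-pressure difference:
flux = h2_flux_m3_m2_hr(800, 50)
```

For these assumed conditions the flux works out to roughly 7.4 m³(STP)/m²/hr, which helps put the 0.13 m{sup 2} lab-scale module area in context.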
FUEL-FLEXIBLE GASIFICATION-COMBUSTION TECHNOLOGY FOR PRODUCTION OF H2 AND SEQUESTRATION-READY CO2
DOE Office of Scientific and Technical Information (OSTI.GOV)
George Rizeq; Janice West; Arnaldo Frydman
It is expected that in the 21st century the Nation will continue to rely on fossil fuels for electricity, transportation, and chemicals. It will be necessary to improve both the process efficiency and environmental impact performance of fossil fuel utilization. GE Global Research has developed an innovative fuel-flexible Unmixed Fuel Processor (UFP) technology to produce H{sub 2}, power, and sequestration-ready CO{sub 2} from coal and other solid fuels. The UFP module offers the potential for reduced cost, increased process efficiency relative to conventional gasification and combustion systems, and near-zero pollutant emissions including NO{sub x}. GE Global Research (prime contractor) was awarded a contract from U.S. DOE NETL to develop the UFP technology. Work on this Phase I program started on October 1, 2000. The project team includes GE Global Research, Southern Illinois University at Carbondale (SIU-C), California Energy Commission (CEC), and T. R. Miles, Technical Consultants, Inc. In the UFP technology, coal and air are simultaneously converted into separate streams of (1) high-purity hydrogen that can be utilized in fuel cells or turbines, (2) sequestration-ready CO{sub 2}, and (3) high temperature/pressure vitiated air to produce electricity in a gas turbine. The process produces near-zero emissions and, based on Aspen Plus process modeling, has an estimated process efficiency 6 percentage points higher than IGCC with conventional CO{sub 2} separation. The current R&D program will determine the feasibility of the integrated UFP technology through pilot-scale testing, and will investigate operating conditions that maximize separation of CO{sub 2} and pollutants from the vent gas, while simultaneously maximizing coal conversion efficiency and hydrogen production. The program integrates experimental testing, modeling and economic studies to demonstrate the UFP technology. 
This is the fourteenth quarterly technical progress report for the UFP program, which is supported by U.S. DOE NETL (Contract No. DE-FC26-00FT40974) and GE. This report summarizes program accomplishments for the period starting January 1, 2004 and ending March 31, 2004. The report includes an introduction summarizing the UFP technology, main program tasks, and program objectives; it also provides a summary of program activities and accomplishments covering progress in tasks including lab-scale experimental testing, pilot-scale shakedown and performance testing, program management and technology transfer.
FUEL-FLEXIBLE GASIFICATION-COMBUSTION TECHNOLOGY FOR PRODUCTION OF H2 AND SEQUESTRATION-READY CO2
DOE Office of Scientific and Technical Information (OSTI.GOV)
George Rizeq; Janice West; Arnaldo Frydman
It is expected that in the 21st century the Nation will continue to rely on fossil fuels for electricity, transportation, and chemicals. It will be necessary to improve both the process efficiency and environmental impact performance of fossil fuel utilization. GE Global Research has developed an innovative fuel-flexible Unmixed Fuel Processor (UFP) technology to produce H{sub 2}, power, and sequestration-ready CO{sub 2} from coal and other solid fuels. The UFP module offers the potential for reduced cost, increased process efficiency relative to conventional gasification and combustion systems, and near-zero pollutant emissions including NO{sub x}. GE Global Research (prime contractor) was awarded a contract from U.S. DOE NETL to develop the UFP technology. Work on this Phase I program started on October 1, 2000. The project team includes GE Global Research, Southern Illinois University at Carbondale (SIU-C), California Energy Commission (CEC), and T. R. Miles, Technical Consultants, Inc. In the UFP technology, coal and air are simultaneously converted into separate streams of (1) high-purity hydrogen that can be utilized in fuel cells or turbines, (2) sequestration-ready CO{sub 2}, and (3) high temperature/pressure vitiated air to produce electricity in a gas turbine. The process produces near-zero emissions and, based on Aspen Plus process modeling, has an estimated process efficiency 6 percentage points higher than IGCC with conventional CO{sub 2} separation. The current R&D program will determine the feasibility of the integrated UFP technology through pilot-scale testing, and will investigate operating conditions that maximize separation of CO{sub 2} and pollutants from the vent gas, while simultaneously maximizing coal conversion efficiency and hydrogen production. The program integrates experimental testing, modeling and economic studies to demonstrate the UFP technology. 
This is the thirteenth quarterly technical progress report for the UFP program, which is supported by U.S. DOE NETL under Contract No. DE-FC26-00FT40974. This report summarizes program accomplishments for the period starting October 1, 2003 and ending December 31, 2003. The report includes an introduction summarizing the UFP technology, main program tasks, and program objectives; it also provides a summary of program activities and accomplishments covering progress in tasks including lab-scale experimental testing, pilot-scale assembly, pilot-scale demonstration and program management and technology transfer.
FUEL-FLEXIBLE GASIFICATION-COMBUSTION TECHNOLOGY FOR PRODUCTION OF H2 AND SEQUESTRATION-READY CO2
DOE Office of Scientific and Technical Information (OSTI.GOV)
George Rizeq; Janice West; Arnaldo Frydman
It is expected that in the 21st century the Nation will continue to rely on fossil fuels for electricity, transportation, and chemicals. It will be necessary to improve both the process efficiency and environmental impact performance of fossil fuel utilization. GE Global Research has developed an innovative fuel-flexible Unmixed Fuel Processor (UFP) technology to produce H{sub 2}, power, and sequestration-ready CO{sub 2} from coal and other solid fuels. The UFP module offers the potential for reduced cost, increased process efficiency relative to conventional gasification and combustion systems, and near-zero pollutant emissions including NO{sub x}. GE Global Research (prime contractor) was awarded a contract from U.S. DOE NETL to develop the UFP technology. Work on this Phase I program started on October 1, 2000. The project team includes GE Global Research, Southern Illinois University at Carbondale (SIU-C), California Energy Commission (CEC), and T. R. Miles, Technical Consultants, Inc. In the UFP technology, coal and air are simultaneously converted into separate streams of (1) high-purity hydrogen that can be utilized in fuel cells or turbines, (2) sequestration-ready CO{sub 2}, and (3) high temperature/pressure vitiated air to produce electricity in a gas turbine. The process produces near-zero emissions and, based on Aspen Plus process modeling, has an estimated process efficiency 6 percentage points higher than IGCC with conventional CO{sub 2} separation. The current R&D program has determined the feasibility of the integrated UFP technology through pilot-scale testing, and investigated operating conditions that maximize separation of CO{sub 2} and pollutants from the vent gas, while simultaneously maximizing coal conversion efficiency and hydrogen production. The program integrated experimental testing, modeling and economic studies to demonstrate the UFP technology. 
This is the fifteenth quarterly technical progress report for the UFP program, which is supported by U.S. DOE NETL (Contract No. DE-FC26-00FT40974) and GE. This report summarizes program accomplishments for the period starting April 1, 2004 and ending June 30, 2004. The report includes an introduction summarizing the UFP technology, main program tasks, and program objectives; it also provides a summary of program activities and accomplishments covering progress in tasks including lab-scale experimental testing, pilot-scale testing, kinetic modeling, program management and technology transfer.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alptekin, Gokhan
The overall objective of the proposed research is to develop a low cost, high capacity CO{sub 2} sorbent and demonstrate its technical and economic viability for pre-combustion CO{sub 2} capture. The specific objectives supporting our research plan were to optimize the chemical structure and physical properties of the sorbent, scale up its production using high throughput manufacturing equipment and bulk raw materials, and then evaluate its performance, first in bench-scale experiments and then in slipstream tests using actual coal-derived synthesis gas. One of the objectives of the laboratory-scale evaluations was to demonstrate the life and durability of the sorbent for over 10,000 cycles and to assess the impact of contaminants (such as sulfur) on its performance. In the field tests, our objective was to demonstrate the operation of the sorbent using actual coal-derived synthesis gas streams generated by air-blown and oxygen-blown commercial and pilot-scale coal gasifiers (the CO{sub 2} partial pressure in these gas streams is significantly different, which directly impacts the operating conditions and hence the performance of the sorbent). To support the field demonstration work, TDA collaborated with Phillips 66 and Southern Company to carry out two separate field tests using actual coal-derived synthesis gas at the Wabash River IGCC Power Plant in Terre Haute, IN and the National Carbon Capture Center (NCCC) in Wilsonville, AL. In collaboration with the University of California, Irvine (UCI), a detailed engineering and economic analysis for the new CO{sub 2} capture system was also proposed to be carried out using Aspen Plus™ simulation software to estimate its effect on plant efficiency.
AVESTAR Center for Operational Excellence of Electricity Generation Plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zitney, Stephen
2012-08-29
To address industry challenges in attaining operational excellence for electricity generation plants, the U.S. Department of Energy’s (DOE) National Energy Technology Laboratory (NETL) has launched a world-class facility for Advanced Virtual Energy Simulation Training and Research (AVESTAR™). This presentation will highlight the AVESTAR™ Center simulators, facilities, and comprehensive training, education, and research programs focused on the operation and control of high-efficiency, near-zero-emission electricity generation plants. The AVESTAR Center brings together state-of-the-art, real-time, high-fidelity dynamic simulators with full-scope operator training systems (OTSs) and 3D virtual immersive training systems (ITSs) into an integrated energy plant and control room environment. AVESTAR’s initial offering combines--for the first time--a “gasification with CO2 capture” process simulator with a “combined-cycle” power simulator in a single OTS/ITS solution for an integrated gasification combined cycle (IGCC) power plant with carbon dioxide (CO2) capture. IGCC systems are an attractive technology option for power generation, especially when capturing and storing CO2 is necessary to satisfy emission targets. The AVESTAR training program offers a variety of courses that merge classroom learning, simulator-based OTS learning in a control-room operations environment, and immersive learning in the interactive 3D virtual plant environment or ITS. All of the courses introduce trainees to base-load plant operation, control, startups, and shutdowns. Advanced courses require participants to become familiar with coordinated control, fuel switching, power-demand load shedding, and load following, as well as to troubleshoot equipment and process malfunctions. Designed to ensure work force development, training is offered for control room and plant field operators, as well as engineers and managers. 
Such comprehensive simulator-based instruction allows for realistic training without compromising worker, equipment, or environmental safety. It also better prepares operators and engineers to manage the plant closer to economic constraints while minimizing or avoiding the impact of any potentially harmful, wasteful, or inefficient events. The AVESTAR Center is also used to augment graduate and undergraduate engineering education in the areas of process simulation, dynamics, control, and safety. Students and researchers gain hands-on simulator-based training experience and learn how commercial-scale power plants respond dynamically to changes in manipulated inputs, such as coal feed flow rate and power demand. Students also analyze how the regulatory control system impacts power plant performance and stability. In addition, students practice start-up, shutdown, and malfunction scenarios. The 3D virtual ITSs are used for plant familiarization, walk-throughs, equipment animations, and safety scenarios. To further leverage the AVESTAR facilities and simulators, NETL and its university partners are pursuing an innovative and collaborative R&D program. In the area of process control, AVESTAR researchers are developing enhanced strategies for regulatory control and coordinated plant-wide control, including gasifier and gas turbine lead, as well as advanced process control using model predictive control (MPC) techniques. Other AVESTAR R&D focus areas include high-fidelity equipment modeling using partial differential equations, dynamic reduced-order modeling, optimal sensor placement, 3D virtual plant simulation, and the modern grid. NETL and its partners plan to continue building the AVESTAR portfolio of dynamic simulators, immersive training systems, and advanced research capabilities to satisfy industry’s growing need for training and experience with the operation and control of clean energy plants.
Future dynamic simulators under development include natural gas combined cycle (NGCC) and supercritical pulverized coal (SCPC) plants with post-combustion CO2 capture. These dynamic simulators are targeted for use in establishing a Virtual Carbon Capture Center (VCCC), similar in concept to the DOE’s National Carbon Capture Center for slipstream testing. The VCCC will enable developers of CO2 capture technologies to integrate, test, and optimize the operation of their dynamic capture models within the context of baseline power plant dynamic models. The objective is to provide hands-on, simulator-based “learn-by-operating” test platforms to accelerate the scale-up and deployment of CO2 capture technologies. Future AVESTAR plans also include pursuing R&D on the dynamics, operation, and control of integrated electricity generation and storage systems for the modern grid era. Special emphasis will be given to combining load-following energy plants with renewable and distributed generating supplies and fast-ramping energy storage systems to provide near-constant baseload power.
CO2-philic oligomers as novel solvents for CO2 absorption
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Matthew B; Luebke, David R; Enick, Robert M
2010-01-01
Desirable properties for an oligomeric CO2-capture solvent in an integrated gasification combined cycle (IGCC) plant include high selectivity for CO2 over H2 and water, low viscosity, low vapor pressure, low cost, and minimal environmental, health, and safety impacts. The neat solvent viscosity, the solubility of CO2 (measured via bubble-point loci and presented on a pressure-composition diagram on a weight basis), and the water miscibility in CO2-philic solvents have been determined and compared to results obtained with Selexol, a commercial oligomeric CO2 solvent. The solvents tested include polyethyleneglycol dimethylether (PEGDME), polypropyleneglycol dimethylether (PPGDME), polypropyleneglycol diacetate (PPGDAc), polybutyleneglycol diacetate (PBGDAc), polytetramethyleneetherglycol diacetate (PTMEGDAc), glyceryl triacetate (GTA), polydimethylsiloxane (PDMS), and a perfluoropolyether (PFPE) with a perfluorinated propyleneglycol monomer unit. Overall, PDMS and PPGDME are the best oligomeric solvents tested and exhibit properties that make them very promising alternatives for the selective absorption of CO2 from a mixed gas stream, especially if the absorption of water is undesirable.
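The selectivity screening described above can be illustrated with a toy Henry's-law calculation. Every number here is a hypothetical placeholder (the Henry's constants and partial pressures are assumptions, not values from the study); the sketch only shows how ideal CO2/H2 selectivity in a physical solvent follows from the relative solubilities:

```python
# Illustrative physical-solvent selectivity estimate. The Henry's constants
# below are hypothetical placeholders for a PDMS-like solvent, not measured
# values from the study.

def henry_mole_fraction(partial_pressure_bar, henry_const_bar):
    """Henry's law in the dilute limit: x = P / H."""
    return partial_pressure_bar / henry_const_bar

H_CO2 = 30.0    # bar, assumed (CO2-philic -> small H, high solubility)
H_H2 = 1500.0   # bar, assumed (H2 is sparingly soluble)

# Shifted-syngas-like partial pressures (bar), illustrative only.
p_CO2, p_H2 = 15.0, 25.0

x_CO2 = henry_mole_fraction(p_CO2, H_CO2)
x_H2 = henry_mole_fraction(p_H2, H_H2)

# Ideal CO2/H2 selectivity reduces to the ratio of Henry's constants.
selectivity = (x_CO2 / p_CO2) / (x_H2 / p_H2)
print(f"CO2 mole fraction in solvent: {x_CO2:.3f}")
print(f"Ideal CO2/H2 selectivity: {selectivity:.0f}")
```

With these assumed constants the ideal selectivity is simply H_H2/H_CO2; real solvent screening, as in the study, also has to weigh viscosity, vapor pressure, and water uptake.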
Tzanidakis, Konstantinos; Oxley, Tim; Cockerill, Tim; ApSimon, Helen
2013-06-01
Integrated Assessment, and the development of strategies to reduce the impacts of air pollution, has tended to focus only upon the direct emissions from different sources, with the indirect emissions associated with the full life-cycle of a technology often overlooked. Carbon Capture and Storage (CCS) reflects a number of new technologies designed to reduce CO2 emissions, but which may have much broader environmental implications than greenhouse gas emissions. This paper considers a wider range of pollutants from a full life-cycle perspective, illustrating a methodology for assessing environmental impacts using source-apportioned, effects-based impact factors calculated by the national-scale UK Integrated Assessment Model (UKIAM). Contrasting illustrative scenarios for the deployment of CCS towards 2050 are presented which compare the life-cycle effects of air pollutant emissions upon human health and ecosystems of business-as-usual, deployment of CCS, and widespread uptake of IGCC for power generation. Together with estimation of the transboundary impacts, we discuss the benefits of an effects-based approach to such assessments in relation to emissions-based techniques.
NASA Astrophysics Data System (ADS)
Kato, Moritoshi; Zhou, Yicheng
This paper presents a novel method to analyze the optimal generation mix based on portfolio theory while accounting for the basic condition for power supply, namely that electricity generation must follow the load curve. The portfolio optimization is integrated with the calculation of a capacity factor for each generation technology in order to satisfy this basic condition. In addition, each generation technology is treated as an asset, and the risks of that asset in both its operation period and its construction period are considered. Environmental measures are evaluated through a restriction on CO2 emissions, represented by a CO2 price. Numerical examples show the optimal generation mix under risks such as deviations in the capacity factor of nuclear power or restrictions on CO2 emissions, the possibility of introducing clean coal technology (IGCC, CCS) or renewable energy, and so on. The results of this work can be applied to setting targets for the future generation mix according to the prospective risks of each generation technology and restrictions on CO2 emissions.
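A minimal sketch of the portfolio idea described above, assuming illustrative costs, risk levels, and an emission factor (none of these numbers come from the paper), with a pure-Python grid search standing in for the paper's optimization:

```python
# Mean-variance sketch of choosing a two-technology generation mix under a CO2
# price. All numbers (costs, standard deviations, the 0.9 tCO2/MWh emission
# factor) are illustrative assumptions only.

def expected_cost(w_coal, co2_price):
    """Expected generation cost ($/MWh) of a coal/nuclear mix."""
    coal = 40.0 + co2_price * 0.9   # fuel cost + carbon cost, assumed
    nuclear = 60.0                  # assumed, carbon-free
    return w_coal * coal + (1 - w_coal) * nuclear

def cost_variance(w_coal):
    """Portfolio cost variance with assumed, uncorrelated technology risks."""
    sd_coal, sd_nuclear = 8.0, 15.0  # $/MWh std devs, assumed
    return (w_coal * sd_coal) ** 2 + ((1 - w_coal) * sd_nuclear) ** 2

def optimal_mix(co2_price, budget):
    """Lowest-risk coal share whose expected cost stays within the budget."""
    feasible = [w / 100 for w in range(101)
                if expected_cost(w / 100, co2_price) <= budget]
    return min(feasible, key=cost_variance, default=None)

# A higher CO2 price shrinks the affordable coal share in the optimal mix.
print(optimal_mix(co2_price=0.0, budget=65.0))   # 0.78
print(optimal_mix(co2_price=30.0, budget=65.0))  # 0.71
```

The paper's method additionally endogenizes capacity factors so that generation matches the load curve; this sketch only captures the risk-versus-cost trade-off under a CO2 price.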
Scoping Studies to Evaluate the Benefits of an Advanced Dry Feed System on the Use of Low-Rank Coal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rader, Jeff; Aguilar, Kelly; Aldred, Derek
2012-11-30
This report describes the development of the design of an advanced dry feed system that was carried out under Task 4.0 of Cooperative Agreement DE-FE0007902 with the US DOE, “Scoping Studies to Evaluate the Benefits of an Advanced Dry Feed System on the use of Low-Rank Coal.” The resulting design will be used for the advanced technology IGCC case with 90% carbon capture for sequestration to be developed under Task 5.0 of the same agreement. The scope of work covered coal preparation and feeding up through the gasifier injector. Subcomponents have been broken down into feed preparation (including grinding and drying), low pressure conveyance, pressurization, high pressure conveyance, and injection. Pressurization of the coal feed is done using Posimetric Feeders sized for the application. In addition, a secondary feed system is described for preparing and feeding slag additive and recycle fines to the gasifier injector. This report includes information on the basis for the design, requirements for down selection of the key technologies used, the down selection methodology, and the final, down-selected design for the Posimetric Feed System, or PFS.
Are renewables portfolio standards cost-effective emission abatement policy?
Dobesova, Katerina; Apt, Jay; Lave, Lester B
2005-11-15
Renewables portfolio standards (RPS) could be an important policy instrument for 3P and 4P control. We examine the costs of renewable power, accounting for the federal production tax credit, the market value of a renewable credit, and the value of producing electricity without emissions of SO2, NOx, mercury, and CO2. We focus on Texas, which has a large RPS and is the largest U.S. electricity producer and one of the largest emitters of pollutants and CO2. We estimate the private and social costs of wind generation in an RPS compared with the current cost of fossil generation, accounting for the pollution and CO2 emissions. We find that society paid about 5.7 cent/kWh more for wind power, counting the additional generation, transmission, intermittency, and other costs. The higher cost includes credits amounting to 1.1 cent/kWh in reduced SO2, NOx, and Hg emissions. These pollution reductions and lower CO2 emissions could be attained at about the same cost using pulverized coal (PC) or natural gas combined cycle (NGCC) plants with carbon capture and sequestration (CCS); the reductions could be obtained more cheaply with an integrated coal gasification combined cycle (IGCC) plant with CCS.
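On one reading of the figures above, the 5.7 cent/kWh societal premium is net of the 1.1 cent/kWh emission credits, so the gross cost gap is larger. That interpretation is an assumption, not a statement from the study; the sketch below only shows the arithmetic it implies:

```python
# Back-of-envelope accounting of the wind premium, under the assumed reading
# that the reported 5.7 c/kWh premium is net of emission credits.

net_premium = 5.7        # c/kWh premium for wind, after credits (reported)
emission_credit = 1.1    # c/kWh credit for avoided SO2, NOx, and Hg (reported)

gross_premium = net_premium + emission_credit
credit_share = emission_credit / gross_premium

print(f"Gross wind cost gap: {gross_premium:.1f} c/kWh")
print(f"Share offset by emission credits: {credit_share:.0%}")
```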
CONCEPTUAL DESIGN AND ECONOMICS OF THE ADVANCED CO2 HYBRID POWER CYCLE
DOE Office of Scientific and Technical Information (OSTI.GOV)
A. Nehrozoglu
2004-12-01
Research has been conducted under United States Department of Energy Contract DE-FC26-02NT41621 to analyze the feasibility of a new type of coal-fired plant for electric power generation. This new type of plant, called the Advanced CO2 Hybrid Power Plant, offers the promise of efficiencies nearing 36 percent, while concentrating CO2 for 100% sequestration. Other pollutants, such as SO2 and NOx, are sequestered along with the CO2, yielding a zero-emissions coal plant. The CO2 Hybrid is a gas turbine-steam turbine combined cycle plant that uses CO2 as its working fluid to facilitate carbon sequestration. The key components of the plant are a cryogenic air separation unit (ASU), a pressurized circulating fluidized bed gasifier, a CO2-powered gas turbine, a circulating fluidized bed boiler, and a supercritical-pressure steam turbine. The gasifier generates a syngas that fuels the gas turbine and a char residue that, together with coal, fuels a CFB boiler to power the supercritical-pressure steam turbine. Both the gasifier and the CFB boiler use a mix of ASU oxygen and recycled boiler flue gas as their oxidant. The resulting CFB boiler flue gas is essentially a mixture of oxygen, carbon dioxide, and water. Cooling the CFB flue gas to 80 °F condenses most of the moisture and leaves a CO2-rich stream containing 3%v oxygen. Approximately 30% of this flue gas stream is further cooled, dried, and compressed for pipeline transport to the sequestration site (the small amount of oxygen in this stream is released and recycled to the system when the CO2 is condensed after final compression and cooling). The remaining 70% of the flue gas stream is mixed with oxygen from the ASU and is ducted to the gas turbine compressor inlet. As a result, the gas turbine compresses a mixture of carbon dioxide (ca. 64%v) and oxygen (ca. 32.5%v) rather than air.
This carbon dioxide rich mixture then becomes the gas turbine working fluid and also becomes the oxidant in the gasification and combustion processes. As a result, the plant provides CO2 for sequestration without the performance and economic penalties associated with water gas shifting and separating CO2 from gas streams containing nitrogen. The cost estimate of the reference plant (the Foster Wheeler combustion hybrid) was based on a detailed prior study of a nominal 300 MWe demonstration plant with a 6F turbine. Therefore, the reference plant capital costs were found to be 30% higher than an estimate for a 425 MW fully commercial IGCC with an H-class turbine (1438 $/kW vs. 1111 $/kW). Consequently, the capital cost of the CO2 hybrid plant was found to be 25% higher than that of the IGCC with pre-combustion CO2 removal (1892 $/kW vs. 1510 $/kW), and the levelized cost of electricity (COE) was found to be 20% higher (7.53 c/kWh vs. 6.26 c/kWh). Although the final costs for the CO2 hybrid are higher, the study confirms that the relative change in cost (or mitigation cost) will be lower. The conceptual design of the plant and its performance and cost, including losses due to CO2 sequestration, are reported. A comparison with other proposed power plant CO2 removal techniques reported in a December 2000 EPRI report is shown. This project supports the DOE research objective of developing concepts for the capture and storage of CO2.
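The rounded percentage premiums quoted above can be checked directly against the reported $/kW and c/kWh pairs:

```python
# Consistency check: the percentage premiums follow from the reported cost
# pairs (the first ratio rounds to 29%, consistent with the ~30% quoted).

pairs = {
    "reference vs. commercial IGCC capital ($/kW)": (1438, 1111),
    "CO2 hybrid vs. IGCC+capture capital ($/kW)":   (1892, 1510),
    "CO2 hybrid vs. IGCC+capture COE (c/kWh)":      (7.53, 6.26),
}

for label, (high, low) in pairs.items():
    premium = (high - low) / low
    print(f"{label}: +{premium:.0%}")
```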
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shah, Jayesh; Hess, Fernando; Horzen, Wessel van
This report examines the feasibility of converting the existing Wabash Integrated Gasification Combined Cycle (IGCC) plant into a liquid fuel facility, with the goal of maximizing jet fuel production. The fuels produced are required to be in compliance with Section 526 of the Energy Independence and Security Act of 2007 (EISA 2007 §526) lifecycle greenhouse gas (GHG) emissions requirements, so lifecycle GHG emissions from the fuel must be equal to or better than conventional fuels. Retrofitting an existing gasification facility reduces the technical risk and capital costs associated with a coal-to-liquids project, leading to a higher probability of implementation and more competitive liquid fuel prices. The existing combustion turbine will continue to operate on low cost natural gas and low carbon fuel gas from the gasification facility. The gasification technology utilized at Wabash is the E-Gas™ Technology and has been in commercial operation since 1995. In order to minimize capital costs, the study maximizes reuse of existing equipment with minimal modifications. Plant data and process models were used to develop process data for downstream units. Process modeling was utilized for the syngas conditioning, acid gas removal, CO2 compression, and utility units. Syngas conversion to Fischer-Tropsch (FT) liquids and upgrading of the liquids was modeled and designed by Johnson Matthey Davy Technologies (JM Davy). In order to maintain the GHG emission profile below that of conventional fuels, the CO2 from the process must be captured and exported for sequestration or enhanced oil recovery. In addition, the power utilized for the plant’s auxiliary loads had to be supplied by a low carbon fuel source. Since the process produces a fuel gas with sufficient energy content to power the plant’s loads, this fuel gas was converted to hydrogen and exported to the existing gas turbine for low carbon power production.
Utilizing low carbon fuel gas and process steam in the existing combined cycle power plant provides sufficient power for all plant loads. The lifecycle GHG profile of the produced jet fuel is 95% of conventional jet fuel. Without converting the fuel gas to a low carbon fuel gas, the emissions would be 108% of conventional jet fuel, and without any GHG mitigation, the profile would be 206%. Oil prices greater than $120 per barrel are required to reach a targeted internal rate of return on equity (IRROE) of 12%. Although the capital expenditure is much less than if a greenfield facility were built, the relatively small size of the plant, the assumed coal price, and the CTL risk profile used in the economic assumptions lead to a high cost of production. Assuming more favorable factors, the economic oil price could be reduced to $78 per barrel with GHG mitigation and $55 per barrel with no GHG mitigation.
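The internal-rate-of-return screening behind the oil-price thresholds above can be sketched with a simple NPV bisection. The cash flows below are hypothetical placeholders, not figures from the Wabash study:

```python
# Sketch of IRR screening: find the discount rate at which project NPV is
# zero, via bisection. Cash flows are hypothetical, for illustration only.

def npv(rate, cash_flows):
    """Net present value; cash_flows[0] is the year-0 outlay (negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=0.0, hi=1.0, tol=1e-8):
    """Bisection on NPV(rate); assumes NPV is positive at lo, negative at hi."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical project: $1000 outlay, then $150/yr for 20 years.
flows = [-1000.0] + [150.0] * 20
print(f"IRR: {irr(flows):.1%}")
```

In a screening like the study's, the annual cash flows would be driven by the assumed oil price, so the "economic oil price" is the price at which this computed IRR reaches the 12% IRROE target.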
Integrated low emissions cleanup system for direct coal-fueled turbines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lippert, T.E.; Newby, R.A.; Alvin, M.A.
1992-01-01
The Westinghouse Electric Corporation Science & Technology Center (W-STC) is developing an Integrated Low Emissions Cleanup (ILEC) concept for high-temperature gas cleaning to meet environmental standards as well as to ensure economical gas turbine life. The ILEC concept simultaneously controls sulfur, particulate, and alkali contaminants in high-pressure fuel gases or combustion gases at temperatures up to 1850°F for advanced power generation systems (PFBC, APFBC, IGCC, DCFT). The objective of this program is to demonstrate, at a bench scale, the conceptual and technical feasibility of the ILEC concept. The ILEC development program has a three-phase structure: Phase 1, laboratory-scale testing; Phase 2, bench-scale equipment design and fabrication; and Phase 3, bench-scale testing. Phase 1 laboratory testing has been completed. In Phase 1, entrained sulfur and alkali sorbent kinetics were measured and evaluated, and commercial-scale performance was projected. Related cold flow model testing has shown that gas-particle contacting within the ceramic barrier filter vessel will provide a good reactor environment. The Phase 1 test results and the commercial evaluation conducted in the Phase 1 program support the bench-scale facility testing to be performed in Phase 3. Phase 2 is nearing completion with the design and assembly of a modified, bench-scale test facility to demonstrate the technical feasibility of the ILEC features. This feasibility testing will be conducted in Phase 3.
Integrated low emissions cleanup system for direct coal-fueled turbines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lippert, T.E.; Newby, R.A.; Alvin, M.A.
1992-12-31
The Westinghouse Electric Corporation Science & Technology Center (W-STC) is developing an Integrated Low Emissions Cleanup (ILEC) concept for high-temperature gas cleaning to meet environmental standards as well as to ensure economical gas turbine life. The ILEC concept simultaneously controls sulfur, particulate, and alkali contaminants in high-pressure fuel gases or combustion gases at temperatures up to 1850°F for advanced power generation systems (PFBC, APFBC, IGCC, DCFT). The objective of this program is to demonstrate, at a bench scale, the conceptual and technical feasibility of the ILEC concept. The ILEC development program has a three-phase structure: Phase 1, laboratory-scale testing; Phase 2, bench-scale equipment design and fabrication; and Phase 3, bench-scale testing. Phase 1 laboratory testing has been completed. In Phase 1, entrained sulfur and alkali sorbent kinetics were measured and evaluated, and commercial-scale performance was projected. Related cold flow model testing has shown that gas-particle contacting within the ceramic barrier filter vessel will provide a good reactor environment. The Phase 1 test results and the commercial evaluation conducted in the Phase 1 program support the bench-scale facility testing to be performed in Phase 3. Phase 2 is nearing completion with the design and assembly of a modified, bench-scale test facility to demonstrate the technical feasibility of the ILEC features. This feasibility testing will be conducted in Phase 3.
High Temperature Syngas Cleanup Technology Scale-up and Demonstration Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gardner, Ben; Turk, Brian; Denton, David
Gasification is a technology for clean energy conversion of diverse feedstocks into a wide variety of useful products such as chemicals, fertilizers, fuels, electric power, and hydrogen. Existing technologies can be employed to clean the syngas from gasification processes to meet the demands of such applications, but they are expensive to build and operate and consume a significant fraction of overall parasitic energy requirements, thus lowering overall process efficiency. RTI International has developed a warm syngas desulfurization process (WDP) utilizing a transport-bed reactor design and a proprietary attrition-resistant, high-capacity solid sorbent with excellent performance replicated at lab, bench, and pilot scales. Results indicated that WDP technology can improve both the efficiency and the cost of gasification plants. The WDP technology achieved ~99.9% removal of total sulfur (as either H2S or COS) from coal-derived syngas at temperatures as high as 600°C and over a wide range of pressures (20-80 bar, pressure-independent performance) and sulfur concentrations. Based on the success of these tests, RTI negotiated a cooperative agreement with the U.S. Department of Energy for precommercial testing of this technology at Tampa Electric Company’s Polk Power Station IGCC facility in Tampa, Florida. The project scope also included a sweet water-gas-shift process for hydrogen enrichment and an activated amine process for 90+% total carbon capture. Because the activated amine process provides some additional non-selective sulfur removal, the integration of these processes was expected to reduce overall sulfur in the syngas to sub-ppmv concentrations, suitable for most syngas applications. The overall objective of this project was to mitigate the technical risks associated with the scale-up and integration of the WDP and carbon dioxide capture technologies, enabling subsequent commercial-scale demonstration.
The warm syngas cleanup pre-commercial test unit was designed and constructed on schedule and under budget and was operated for approximately 1,500 total hours utilizing ~20% of the IGCC’s total syngas as feed (~1.5 MM scfh of dry syngas). The WDP system reduced total sulfur levels to ~10 ppmv (~99.9% removal) from raw syngas that contained as much as 14,000 ppmv of total sulfur. The integration of WDP with the activated amine process enabled further reduction of total sulfur in the final treated syngas to the anticipated sub-ppmv concentrations (>99.99% removal), suitable for stringent syngas applications such as chemicals, fertilizers, and fuels. Techno-economic assessments by RTI and by third parties indicate potential for significant (up to 50%) capital and operating cost reductions for the entire syngas cleanup block when WDP technology is integrated with a broad spectrum of conventional and emerging carbon capture or acid gas removal technologies. This final scientific/technical report covers the pre-FEED, FEED, EPC, commissioning, and operation phases of this project, as well as system performance results. In addition, the report addresses other parallel-funded R&D efforts focused on development and testing of trace contaminant removal process (TCRP) sorbents, a direct sulfur recovery process (DSRP), and a novel sorbent for warm carbon dioxide capture, as well as pre-FEED, FEED, and techno-economic studies to consider the potential benefit for use of WDP for polygeneration of electric power and ammonia/urea fertilizers.
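The removal percentages quoted above follow directly from the inlet and outlet sulfur concentrations:

```python
# Arithmetic behind the removal figures: reducing 14,000 ppmv of total sulfur
# to ~10 ppmv matches the ~99.9% removal reported for WDP alone.

inlet_ppmv = 14_000
wdp_outlet_ppmv = 10

wdp_removal = 1 - wdp_outlet_ppmv / inlet_ppmv
print(f"WDP sulfur removal: {wdp_removal:.3%}")

# Polishing to sub-ppmv with the downstream amine unit (1 ppmv assumed here
# as an upper bound) pushes overall removal past 99.99%.
polished_removal = 1 - 1 / inlet_ppmv
print(f"With sub-ppmv polishing: {polished_removal:.4%}")
```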
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mattes, Karl
Summit Texas Clean Energy, LLC (STCE) is developing the Texas Clean Energy Project (TCEP or the Project) to be located near Penwell, Texas. The TCEP will include an Integrated Gasification Combined Cycle (IGCC) power plant with a nameplate capacity of 400 megawatts electric (MWe), combined with the production of urea fertilizer and the capture, utilization, and storage of carbon dioxide (CO2) sold commercially for regional use in enhanced oil recovery (EOR) in the Permian Basin of west Texas. The TCEP will utilize coal gasification technology to convert Powder River Basin subbituminous coal delivered by rail from Wyoming into a synthetic gas (syngas) that will be cleaned and further treated so that at least 90 percent of the overall carbon entering the IGCC facility will be captured. The clean syngas will then be divided into two high-hydrogen (H2) concentration streams, one of which will be combusted as a fuel in a combined cycle power block for power generation and the other converted into urea fertilizer for commercial sale. The captured CO2 will be divided into two streams: one will be used in producing the urea fertilizer and the other will be compressed for transport by pipeline for offsite use in EOR and permanent underground sequestration. The TCEP was selected by the U.S. Department of Energy (DOE) Office of Fossil Energy (FE) for cost-shared, co-funded financial assistance under Round 3 of its Clean Coal Power Initiative (CCPI). A portion of this financial assistance was budgeted and provided for initial development, permitting, and design activities. STCE and the DOE executed a Cooperative Agreement dated January 29, 2010, which defined the objectives of the Project for all phases. During Phase 1, STCE conducted and completed all objectives defined in the initial development, permitting, and design portions of the Cooperative Agreement.
This topical report summarizes all work associated with the project objectives, and additional work required to complete the financing of the Project. In general, STCE completed project definition and a front-end engineering and design (FEED) study, and applied for and received its Record of Decision (ROD) associated with the NEPA requirements summarized in a detailed Environmental Impact Statement. A topical report covering the results of the FEED is the subject of a separate report submitted to the DOE on January 26, 2012. References to the FEED report are contained herein. In August 2013, STCE executed fixed-price turnkey EPC contracts and, previously in December 2011, a long-term O&M agreement with industry-leading contractors. Other work completed during Phase 1 includes execution of all commercial input and offtake agreements required for project financing. STCE negotiated long-term agreements for power, CO2, and urea offtake. A contract for the purchase of coal feedstock from Cloud Peak Energy’s Cordero Rojo mine was executed, as well as necessary agreements (supplementing the tariff) with the Union Pacific Railroad (UPRR) for delivery of the coal to the TCEP site. STCE executed firm agreements for natural gas transportation with ONEOK and for long-term water supply with a private landowner. In addition, STCE secured options for critical easements and rights-of-way, completed and updated a transmission study, executed an interconnection agreement, and agreed on a target financial closing date of October 31, 2013 with debt, conventional equity, and tax equity participants.
WETTING AND REACTIVE AIR BRAZING OF BSCF FOR OXYGEN SEPARATION DEVICES
DOE Office of Scientific and Technical Information (OSTI.GOV)
LaDouceur, Richard M.; Meier, Alan; Joshi, Vineet V.
Reactive air brazes Ag-CuO and Ag-V2O5 were evaluated for brazing Ba0.5Sr0.5Co0.8Fe0.2O(3-δ) (BSCF). Previous work identified BSCF as the most promising mixed ionic/electronic conducting (MIEC) ceramic material based on the design and oxygen flux requirements of an oxy-fuel plant, such as an integrated gasification combined cycle (IGCC) used to facilitate high-efficiency carbon capture. Apparent contact angles were observed for Ag-CuO and Ag-V2O5 mixtures at 1000 °C for isothermal hold times of 0, 10, 30, and 60 minutes. Wetting apparent contact angles (θ<90°) were obtained for 1%, 2%, and 5% Ag-CuO and Ag-V2O5 mixtures, with the apparent contact angles between 74° and 78° for all compositions and furnace dwell times. Preliminary microstructural analysis indicates that two different interfacial reactions are occurring: Ag-CuO interfacial microstructures revealed dissolution of copper oxide into the BSCF matrix to form copper-cobalt-oxygen-rich dissolution products along the BSCF grain boundaries, while Ag-V2O5 interfacial microstructures revealed the infiltration and replacement of cobalt and iron with vanadium, and silver filling pores in the BSCF microstructure. The Ag-V2O5 interfacial reaction product layer was measured to be significantly thinner than the Ag-CuO reaction product layer. Using a fully articulated four-point flexural bend test fixture, the flexural fracture strength of BSCF was determined to be 95 ± 33 MPa. The fracture strength will be used to ascertain the success of the reactive air braze alloys. Based on these results, brazes were fabricated and mechanically tested to begin optimizing the brazing parameters for this system. An Ag-2.5% CuO braze alloy with a 2.5-minute thermal cycle achieved a hermetic seal with a joint flexural strength of 34 ± 15 MPa, and Ag-1% V2O5 with a 30-minute thermal cycle had a joint flexural strength of 20 ± 15 MPa.
NASA Astrophysics Data System (ADS)
Montalbano, Timothy
Gas turbine engines remain integral to meeting the world's propulsion and power generation needs. Their continued use requires higher-temperature operation to reach higher efficiencies and the implementation of alternative fuels for a lower net-carbon footprint. This necessitates evaluation of the material coatings used to shield the hot-section components of gas turbines in these new extreme environments in order to understand how material degradation mechanisms change. Recently, the US Navy has sought to reduce its use of fossil fuels by implementing a blended hydroprocessed renewable diesel (HRD) derived from algae in its fleet. To evaluate material degradation in this alternative environment, metal alloys are exposed in a simulated combustion environment using either this blended fuel or the traditional diesel-like fuel. Evaluation of the metal alloys showed the development of thick, porous scales with a large depletion of aluminum for the blend fuel test. A mechanism linking an increased solubility of the scale to the blend fuel test environment will be discussed. For power generation applications, Integrated Gasification Combined Cycle (IGCC) power plants can provide electricity with 45% efficiency and full carbon capture by using a synthetic gas (syngas) derived from coal, biomass, or another carbon feedstock. However, the combustion of syngas is known to cause high water vapor content in the exhaust stream, with unknown material consequences. To evaluate the effect of increased humidity, air-plasma-sprayed (APS) yttria-stabilized zirconia (YSZ) is thermally aged in environments with and without humidity. An enhanced destabilization of the parent phase by humid aging is revealed by x-ray diffraction (XRD) and Raman spectroscopy. Microstructural analysis by transmission electron microscopy (TEM) and scanning TEM (STEM) indicates enhanced coarsening of the domain structure of the YSZ in the humid environment.
The enhanced destabilization and coarsening in the humid aging environment is explained mechanistically by water-derived species being incorporated into the YSZ structure and altering the anion sublattice. The characterization of the metal alloy and ceramic coatings exposed in these alternative environments allows for a deeper understanding of the mechanisms behind the material evolution in these environments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Howe, Gary; Albritton, John; Denton, David
In September 2010, RTI and DOE/NETL signed a cooperative agreement (DE-FE000489) to design, build, and operate a pre-commercial syngas cleaning system that would capture up to 90% of the CO2 in the syngas slipstream and demonstrate the ability to reduce syngas contaminants to meet DOE’s specifications for chemical production applications. This pre-commercial syngas cleaning system is operated at Tampa Electric Company’s (TEC) 250-MWe integrated gasification combined cycle (IGCC) plant at Polk Power Station (PPS), located near Tampa, Florida. The syngas cleaning system consists of the following units: Warm Gas Desulfurization Process (WDP) - this unit processes a syngas flow equivalent to 50 MWe of power (corresponding to about 2.0 MM scfh of syngas on a dry basis) to produce a desulfurized syngas with a total sulfur (H2S+COS) concentration of ~10 ppmv. Water Gas Shift (WGS) Reactor - this unit converts sufficient CO into CO2 to enable 90% capture of the CO2 in the syngas slipstream; it uses conventional commercial shift catalyst technologies. Low Temperature Gas Cooling (LTGC) - this unit cools the syngas for the low-temperature activated MDEA process and separates any condensed water. Activated MDEA Process (aMDEA) - this unit employs a non-selective separation for the CO2 and H2S present in the raw syngas stream. Because of the selective sulfur removal by the upstream WDP unit, the CO2 capture target of 90% can be achieved with the added benefit that the total sulfur concentration in the CO2 product is < 100 ppmv. An additional advantage of the activated MDEA process is that the non-selective sulfur removal from the treated syngas reduces sulfur in the treated gas to the very low sub-ppmv concentrations required for chemical production applications.
Testing to date of this pre-commercial syngas cleaning system has shown that the technology has great potential to provide clean syngas from coal- and petcoke-based gasification at increased efficiency and at significantly lower capital and operating costs than conventional syngas cleanup technologies. However, before the technology can be deemed ready for scale-up to a full commercial-scale demonstration, additional R&D testing is needed at the site to address the following critical technical risks: WDP sorbent stability and performance; impact of WDP on downstream cleanup and conversion steps; metallurgy and refractory; syngas cleanup performance and controllability; and carbon capture performance and additional syngas cleanup. The proposed plan to acquire this additional R&D data involves: operation of the units to achieve an additional 3,000 hours of operation within the performance period, with a target of achieving 1,000 of those hours via continuous operation of the entire integrated pre-commercial demonstration system; rapid turnaround of repairs and/or modifications required to return any specific unit to operating status, with documentation and lessons learned to support technology maturation; and proactive performance of maintenance activities during any unplanned outages and, if possible, while operating.
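The WGS unit's role can be illustrated with a simple carbon balance: if 90% of the total carbon (CO + CO2) must ultimately leave as capturable CO2, the required CO conversion follows directly. The syngas composition below is an assumed illustration, not data from the Polk slipstream.

```python
def co_shift_fraction_for_capture(y_co, y_co2, capture=0.90):
    """Fraction of inlet CO that must be shifted (CO + H2O -> CO2 + H2)
    so that `capture` of the total carbon can be removed as CO2.
    Assumes all captured carbon leaves as CO2 and CH4 is negligible."""
    total_c = y_co + y_co2
    x = (capture * total_c - y_co2) / y_co
    return min(max(x, 0.0), 1.0)

# Illustrative CO-rich dry syngas mole fractions (assumed, not from the report)
print(co_shift_fraction_for_capture(0.45, 0.12))
```

For a CO-rich gasifier syngas like this, roughly 87% of the inlet CO must be shifted, which is why the slipstream needs a dedicated WGS reactor ahead of the aMDEA unit.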
Low-Chrome/Chrome Free Refractories for Slagging Gasifiers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bennett, J.P.; Kwong, K.-S.; Powell, C.P.
2007-01-01
Gasifiers are containment vessels used to react carbon-containing materials with oxygen and water, producing syngas (CO and H2) that is used in chemical and power production; syngas is also a potential source of H2 in a future hydrogen economy. Air-cooled slagging gasifiers are one type of gasifier, operating at temperatures from 1275-1575 °C and at pressures of 400 psi or higher. They typically use coal or petroleum coke as the carbon source, materials which contain ash impurities that liquefy at gasification temperatures, producing liquid slag in quantities of 100 or more tons/day, depending on the carbon feed rate and the percent ash present in the feedstock. The molten slag is corrosive to refractory linings, causing chemical dissolution and spalling. The refractory lining is composed of chrome oxide, alumina, and zirconia, and is replaced every 3-24 months. Gasifier users would like greater on-line availability and reliability of gasifier liners, something that has impacted gasifier acceptance by industry. Research is underway at NETL to improve refractory service life and to develop a no-chrome or low-chrome oxide alternative refractory liner. Over 250 samples of no- or low-chrome oxide compositions have been evaluated for slag interactions by cup testing, with potential candidates for further studies including those with ZrO2, Al2O3, and MgO materials. The development of improved liner materials is necessary if technologies such as IGCC and DOE's Near Zero Emissions Advanced Fossil Fuel Power Plant are to be successful and move forward in the marketplace.
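The quoted slag rates follow from a simple mass balance on the feedstock ash; the feed rate and ash fraction below are assumed round numbers for illustration, not values from the report.

```python
def slag_rate_tpd(feed_tpd, ash_frac):
    """Molten slag production (tons/day), assuming essentially all of the
    feedstock ash melts and reports to the slag tap."""
    return feed_tpd * ash_frac

# A hypothetical 1000 ton/day coal feed with 10% ash -> ~100 tons/day of slag,
# consistent with the "100 or more tons/day" figure above.
print(slag_rate_tpd(1000, 0.10))
```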
Catalytic Combustion for Ultra-Low NOx Hydrogen Turbines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Etemad, Shahrokh; Baird, Benjamin; Alavandi, Sandeep
2011-06-30
Precision Combustion, Inc., (PCI) in close collaboration with Solar Turbines, Incorporated, has developed and demonstrated a combustion system for hydrogen fueled turbines that reduces NOx to low single digit level while maintaining or improving current levels of efficiency and eliminating emissions of carbon dioxide. Full scale Rich Catalytic Hydrogen (RCH1) injector was developed and successfully tested at Solar Turbines, Incorporated high pressure test facility demonstrating low single digit NOx emissions for hydrogen fuel in the range of 2200F-2750F. This development work was based on initial subscale development for faster turnaround and reduced cost. Subscale testing provided promising results for 42%more » and 52% H2 with NOx emissions of less than 2 ppm with improved flame stability. In addition, catalytic reactor element testing for substrate oxidation, thermal cyclic injector testing to simulate start-stop operation in a gas turbine environment, and steady state 15 atm. operation testing were performed successfully. The testing demonstrated stable and robust catalytic element component life for gas turbine conditions. The benefit of the catalytic hydrogen combustor technology includes capability of delivering near-zero NOx without costly post-combustion controls and without requirement for added sulfur control. In addition, reduced acoustics increase gas turbine component life. These advantages advances Department of Energy (DOE’s) objectives for achievement of low single digit NOx emissions, improvement in efficiency vs. postcombustion controls, fuel flexibility, a significant net reduction in Integrated Gasification Combined Cycle (IGCC) system net capital and operating costs, and a route to commercialization across the power generation field from micro turbines to industrial and utility turbines.« less
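Gas turbine NOx figures like the sub-2-ppm results above are conventionally reported on a 15% O2 dry basis; the abstract does not give its correction details, so the standard reference-oxygen correction is sketched here as an assumption.

```python
def nox_at_15_pct_o2(nox_ppm_measured, o2_pct_measured):
    """Correct a measured dry-basis NOx concentration (ppmv) to the 15% O2
    reference basis customary for gas turbine emissions reporting."""
    return nox_ppm_measured * (20.9 - 15.0) / (20.9 - o2_pct_measured)

# e.g. 3 ppm measured at 14% stack O2 corrects to ~2.6 ppm at 15% O2
print(nox_at_15_pct_o2(3.0, 14.0))
```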
The Mesaba Energy Project: Clean Coal Power Initiative, Round 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stone, Richard; Gray, Gordon; Evans, Robert
2014-07-31
The Mesaba Energy Project is a nominal 600 MW integrated gasification combined cycle power project located in northeastern Minnesota. It was selected to receive financial assistance pursuant to 10 CFR 600 through a competitive solicitation under Round 2 of the Department of Energy's Clean Coal Power Initiative, which had two stated goals: (1) to demonstrate advanced coal-based technologies that can be commercialized at electric utility scale, and (2) to accelerate the likelihood of deploying demonstrated technologies for widespread commercial use in the electric power sector. The Project was selected in 2004 to receive a total of $36 million. The DOE portion that was equally cost shared in Budget Period 1 amounted to about $22.5 million. Budget Period 1 activities focused on the Project Definition Phase and included: project development, preliminary engineering, environmental permitting, regulatory approvals, and financing to reach financial close and start of construction. The Project is based on ConocoPhillips' E-Gas technology and is designed to be fuel flexible, with the ability to process sub-bituminous coal, a blend of sub-bituminous coal and petroleum coke, and Illinois #6 bituminous coal. Major objectives include the establishment of a reference plant design for Integrated Gasification Combined Cycle (IGCC) technology featuring advanced full slurry quench, multiple-train gasification, integration of the air separation unit, and the demonstration of 90% operational availability and improved thermal efficiency relative to previous demonstration projects. In addition, the Project would demonstrate substantial environmental benefits, as compared with conventional technology, through dramatically lower emissions of sulfur dioxide, nitrogen oxides, volatile organic compounds, carbon monoxide, particulate matter, and mercury.
Major milestones achieved in support of fulfilling the above goals include obtaining Site, High Voltage Transmission Line Route, and Natural Gas Pipeline Route Permits for a Large Electric Power Generating Plant to be located in Taconite, Minnesota. In addition, major pre-construction permit applications have been filed requesting authorization for the Project to i) appropriate water sufficient to accommodate its worst case needs, ii) operate a major stationary source in compliance with regulations established to protect public health and welfare, and iii) physically alter the geographical setting to accommodate its construction. As of the current date, the Water Appropriation Permits have been obtained.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Javad Abbasian; Armin Hassanzadeh Khayyat; Rachid B. Slimane
The specific objective of this project was to develop physically durable and chemically regenerable MgO-based sorbents that can remove carbon dioxide from raw coal gas at the operating conditions prevailing in IGCC processes. A total of sixty-two (62) different sorbents were prepared in this project, either by various sol-gel techniques (22 formulations) or by modification of dolomite (40 formulations). The sorbents were prepared in pellet and granular forms. The sol-gel-based sorbents had very high physical strength, relatively high surface area, and very low average pore diameter. The magnesium content of these sorbents was estimated to be 4-6% w/w. To improve the reactivity of the sorbents toward CO2, the sorbents were impregnated with potassium salts; the potassium content of the sorbents was about 5%. The dolomite-based sorbents were prepared by calcination of dolomite at various temperatures and in various calcination environments (CO2 partial pressure and moisture). Potassium carbonate was added to the half-calcined dolomite through a wet impregnation method. The estimated potassium content of the impregnated sorbents was in the range of 1-6% w/w. In general, the modified dolomite sorbents have significantly higher magnesium content, larger pore diameter, and lower surface area, resulting in significantly higher reactivity compared to the sol-gel sorbents. The reactivities of a number of sorbents toward CO2 were determined in a Thermogravimetric Analyzer (TGA) unit. The results indicated that at low CO2 partial pressures (i.e., 1 atm), the reactivities of the sorbents toward CO2 are very low. At elevated pressures (i.e., a CO2 partial pressure of 10 bar), the maximum conversion of MgO obtained with the sol-gel-based sorbents was about 5%, which corresponds to a maximum CO2 absorption capacity of less than 1%.
The overall capacity of the modified dolomite sorbents was at least one order of magnitude higher than that of the sol-gel-based sorbents. The results of the tests conducted with various dolomite-based sorbents indicate that the reactivity of the modified dolomite sorbent increases with increasing potassium concentration, while higher calcination temperature adversely affects the sorbent reactivity. Furthermore, the results indicate that as long as the absorption temperature is well below the equilibrium temperature, the reactivity of the sorbent improves with increasing temperature (350-425 °C). As the temperature approaches the equilibrium temperature, the rate of CO2 absorption decreases because of the significant increase in the rate of the reverse (i.e., regeneration) reaction. The results of cyclic tests show that the reactivity of the sorbent gradually decreases over repeated cycles. To improve the long-term durability (i.e., reactivity and capacity) of the sorbent, the sorbent was periodically re-impregnated with the potassium additive and calcined. The results indicate that, in general, re-treatment improves the performance of the sorbent, and that the extent of improvement gradually decreases in the cyclic process. The presence of steam significantly enhances sorbent reactivity and significantly slows sorbent deactivation in the cyclic process.
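The reported sol-gel capacity ceiling can be checked with simple carbonation stoichiometry (MgO + CO2 -> MgCO3). A sketch assuming the quoted ~5 wt% Mg loading and ~5% MgO conversion reproduces the "less than 1%" CO2 capacity figure.

```python
M_MG, M_MGO, M_CO2 = 24.305, 40.304, 44.009  # molar masses, g/mol

def co2_capacity_wt_pct(mg_wt_pct, mgo_conversion):
    """CO2 uptake (g CO2 per 100 g sorbent) for a given Mg loading and
    fractional MgO -> MgCO3 conversion."""
    mgo_wt = mg_wt_pct * M_MGO / M_MG            # Mg expressed as MgO
    return mgo_wt * mgo_conversion * M_CO2 / M_MGO

# 5 wt% Mg sol-gel sorbent at 5% conversion -> ~0.45 wt% CO2, i.e. < 1%
print(co2_capacity_wt_pct(5.0, 0.05))
```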
Comparison of Intra-cluster and M87 Halo Orphan Globular Clusters in the Virgo Cluster
NASA Astrophysics Data System (ADS)
Louie, Tiffany Kaye; Tuan, Jin Zong; Martellini, Adhara; Guhathakurta, Puragra; Toloba, Elisa; Peng, Eric; Longobardi, Alessia; Lim, Sungsoon
2018-01-01
We present a study of “orphan” globular clusters (GCs) — GCs with no identifiable nearby host galaxy — discovered in NGVS, a 104 deg² CFHT/MegaCam imaging survey. At the distance of the Virgo cluster, GCs are bright enough to make good spectroscopic targets, and many are barely resolved in good ground-based seeing. Our orphan GC sample is derived from a subset of NGVS-selected GC candidates that were followed up with Keck/DEIMOS spectroscopy. While our primary spectroscopic targets were candidate GC satellites of Virgo dwarf elliptical and ultra-diffuse galaxies, many objects turned out to be non-satellites based on a radial velocity mismatch with the Virgo galaxy they are projected close to. Using a combination of spectral characteristics (e.g., absorption vs. emission), Gaussian mixture modeling of radial velocity and positions, and extreme deconvolution analysis of ugrizk photometry and image morphology, these non-satellites were classified into: (1) intra-cluster GCs (ICGCs) in the Virgo cluster, (2) GCs in the outer halo of M87, (3) foreground Milky Way stars, and (4) background galaxies. The statistical distinction between ICGCs and M87 halo GCs is based on velocity distributions (means of 1100 vs. 1300 km/s and dispersions of 700 vs. 400 km/s, respectively) and radial distribution (diffuse vs. centrally concentrated, respectively). We used coaddition to increase the spectral SNR for the two classes of orphan GCs and measured the equivalent widths (EWs) of the Mg b and H-beta absorption lines. These EWs were compared to single stellar population models to obtain mean age and metallicity estimates. The ICGCs and M87 halo GCs have <[Fe/H]> = -0.6 ± 0.3 and -0.4 ± 0.3 dex, respectively, and mean ages of ≳5 and ≳10 Gyr, respectively.
This suggests the M87 halo GCs formed in relatively high-mass galaxies that avoided being tidally disrupted by M87 until they were close to the cluster center, while the ICGCs formed in relatively low-mass galaxies that were tidally disrupted in the cluster outskirts. Most of this work was carried out by high school students working under the auspices of the Science Internship Program (SIP) at UC Santa Cruz. We are grateful for financial support from the NSF and NASA/STScI.
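The velocity-based part of the classification can be sketched as a two-component Gaussian mixture built from the moments quoted above. The 50/50 prior is an assumption, and the real analysis also folded in positions, photometry, and morphology.

```python
import math

def gauss(v, mu, sigma):
    """1-D normal probability density."""
    return math.exp(-0.5 * ((v - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def p_m87_halo(v, w_halo=0.5):
    """Posterior probability that a GC with radial velocity v (km/s) belongs
    to the M87 halo component (mean 1300, sigma 400) rather than the
    intra-cluster component (mean 1100, sigma 700), assuming a 50/50 prior."""
    p_icgc = gauss(v, 1100.0, 700.0) * (1.0 - w_halo)
    p_halo = gauss(v, 1300.0, 400.0) * w_halo
    return p_halo / (p_halo + p_icgc)

# Near the halo mean, halo membership is favored; extreme velocities
# favor the broader intra-cluster component.
print(p_m87_halo(1300.0), p_m87_halo(2500.0))
```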
NASA Astrophysics Data System (ADS)
Newcomer, Adam
Increasing demand for electricity and an aging fleet of generators are the principal drivers behind an increasing need for large capital investments in the US electric power sector in the near term. The decisions (or lack thereof) by firms, regulators, and policy makers in response to this challenge have long-lasting consequences, incur large economic and environmental risks, and must be made despite large uncertainties about the future operating and business environment. Capital investment decisions are complex: rates of return are not guaranteed; significant uncertainties about future environmental legislation and regulations exist at both the state and national levels, particularly about carbon dioxide emissions; there is an increasing number of shareholder mandates requiring public utilities to reduce their exposure to potentially large losses from stricter environmental regulations; and there are significant concerns about electricity and fuel price levels, supplies, and security. Large-scale, low-carbon electricity generation facilities using coal, such as integrated gasification combined cycle (IGCC) facilities coupled with carbon capture and sequestration (CCS) technologies, have been technically proven but are unprofitable in the current regulatory and business environment, where there is no explicit or implicit price on carbon dioxide emissions. The thesis examines two separate scenarios that are actively discussed by policy and decision makers at corporate, state, and national levels: a future US electricity system where coal plays a role, and one where the role of coal is limited or nonexistent. It intends to provide guidance for firms and policy makers and to outline applications and opportunities for public policies and for private investment decisions to limit the financial risks of electricity generation capital investments under carbon constraints.
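One way to frame the profitability question is the breakeven CO2 price at which an IGCC+CCS plant's levelized cost matches that of a higher-emitting alternative once emissions are charged. All numbers below are hypothetical illustrations, not results from the thesis.

```python
def breakeven_co2_price(lcoe_ccs, lcoe_ref, em_ref, em_ccs):
    """CO2 price ($/tonne) at which a CCS plant's levelized cost ($/MWh)
    equals a reference plant's cost plus its CO2 emissions charge.
    em_* are emission rates in t CO2/MWh."""
    return (lcoe_ccs - lcoe_ref) / (em_ref - em_ccs)

# Hypothetical: $95/MWh IGCC+CCS vs $60/MWh conventional coal,
# emitting 0.09 vs 0.80 t CO2/MWh -> breakeven near $49/tonne
print(breakeven_co2_price(95.0, 60.0, 0.80, 0.09))
```

Below the breakeven price the low-carbon plant is uneconomic, which is the abstract's point about IGCC+CCS being unprofitable with no price on carbon.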
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ames, Forrest; Bons, Jeffrey
2014-09-30
The Department of Energy has goals to move land-based gas turbine systems to alternative fuels, including coal-derived synthetic gas and hydrogen. Coal is the most abundant energy resource in the US and in the world, and it is economically advantageous to develop power systems which can use coal. Integrated gasification combined cycle (IGCC) systems are expected to allow the clean use of coal-derived fuels while improving the ability to capture and sequester carbon dioxide. These cycles will need to maintain or increase turbine entry temperatures to develop competitive efficiencies. The use of coal-derived syngas introduces a range of potential contaminants into the hot section of the gas turbine, including sulfur, iron, calcium, and various alkali metals. Depending on the effectiveness of the gas cleanup processes, there exists a significant likelihood that the remaining materials will become molten in the combustion process and potentially deposit on downstream turbine surfaces. Past evidence suggests that deposition will be a strong function of increasing temperature. Currently, even with the best gas cleanup processes, a small level of particulate matter in the syngas is expected. Consequently, particulate deposition is expected to be an important consideration in the design of turbine components. The leading edge region of first-stage vanes most often has higher deposition rates than other areas due to strong fluid acceleration and streamline curvature in the vicinity of the surface. This region remains one of the most difficult areas in a turbine nozzle to cool due to high inlet temperatures and only a small pressure ratio for cooling. The leading edge of a vane often has relatively high heat transfer coefficients and is often cooled using showerhead film cooling arrays. The throat of the first-stage nozzle is another area where deposition potentially has a strongly adverse effect on turbine performance, as this region meters the turbine inlet flow.
Based on roughness levels found on in-service vanes (Bons et al., 2001; up to 300 microns), flow blockage in first-stage turbine nozzles can easily reach 1 to 2 percent in conventional turbines. Deposition levels in syngas-fueled gas turbines are expected to be even more problematic. The likelihood of significant deposition on the leading edge of vanes in a syngas environment indicates the need to examine this effect on the leading edge cooling problem. It is critical to understand the influence of leading edge geometry and turbulence on deposition rates for both internally cooled and showerhead-cooled leading edge regions. The expected level of deposition in a vane stagnation region not only significantly changes the heat transfer problem but also suggests that cooling arrays may clog. Addressing the cooling issue suggests a need to better understand stagnation region heat transfer with realistic roughness as well as the other variables affecting transport near the leading edge. The question of whether leading edge regions can be cooled internally with modern cooling approaches, thus avoiding the clogging issue, should also be raised. Addressing deposition in the pressure-side throat region of the nozzle is another critical issue for this environment. Issues such as the protective effect of slot and full-coverage discrete-hole film cooling on limiting deposition, as well as the influence of roughness and turbulence on effectiveness, should be examined. The objective of the present study is to address these technical challenges to help enable the development of high-efficiency, syngas-tolerant gas turbine engines.
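The quoted 1-2 percent blockage is consistent with a simple geometric estimate of deposit buildup in the metering throat; the throat width used here is an assumed illustrative value, not a figure from the report.

```python
def throat_blockage_pct(deposit_m, throat_width_m):
    """Percent flow-area blockage of a 2-D nozzle throat when a deposit
    layer of the given thickness builds on both bounding surfaces."""
    return 100.0 * 2.0 * deposit_m / throat_width_m

# 300 micron deposits (the Bons et al. roughness level) on an assumed
# 3 cm throat give ~2% blockage, matching the quoted range.
print(throat_blockage_pct(300e-6, 0.03))
```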
NASA Astrophysics Data System (ADS)
Maldonado, Sergio Elzar
Over 92% of the coal consumed by power plants in the United States (U.S.) is used to generate electricity. The U.S. has the world's largest recoverable reserves of coal; it is estimated that these reserves will last more than 200 years based on current production and demand levels. Integrated Gasification Combined Cycle (IGCC) power plants aim to reduce the amount of pollutants by gasifying coal and producing synthesis gas. Synthesis gas, also known as syngas, is a product of coal gasification and can be used in gas turbines for energy production. Syngas is primarily a mixture of hydrogen and carbon monoxide and is produced by gasifying a solid fuel feedstock such as coal or biomass. The objective of the thesis is to create a flame stability map by performing experiments using high-hydrogen-content fuels with varying compositions of hydrogen representing different coal feedstocks. The experiments shown in this thesis were performed using the High-Pressure Combustion facility in the Center for Space Exploration Technology Research (CSETR) at the University of Texas at El Paso (UTEP). The combustor was fitted with a novel Multi-Tube fuel Injector (MTI) designed to improve flame stability. This thesis presents the results of testing of syngas fuels with compositions of 20, 30, and 40% hydrogen in mixtures with carbon monoxide. Tests were completed for lean conditions at equivalence ratios between 0.6 and 0.9. The experimental results showed that at an equivalence ratio of 0.6, a stable flame was not achieved for any of the fuel mixtures tested. It was also observed that the stability region of the syngas flame increased as the equivalence ratio and the hydrogen concentration in the syngas fuel increased, with the 40% hydrogen-carbon monoxide mixture demonstrating the greatest stability region. Design improvements to the MTI are also discussed as part of the future work on this topic.
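Since both H2 and CO consume 0.5 mol of O2 per mole of fuel (H2 + ½O2 -> H2O; CO + ½O2 -> CO2), the equivalence ratio of these test mixtures reduces to a simple expression. The oxygen supply figure below is an assumed illustration, not a test condition from the thesis.

```python
def equivalence_ratio(y_h2, y_co, o2_per_mol_fuel):
    """Equivalence ratio phi for an H2/CO syngas blend: the stoichiometric
    O2 demand is 0.5*(y_h2 + y_co) per mole of fuel mixture, and phi is
    that demand divided by the O2 actually supplied per mole of fuel."""
    o2_stoich = 0.5 * (y_h2 + y_co)
    return o2_stoich / o2_per_mol_fuel

# A 40/60 H2/CO blend supplied with 0.625 mol O2 per mol fuel burns lean
# at phi = 0.8, inside the tested 0.6-0.9 range.
print(equivalence_ratio(0.40, 0.60, 0.625))
```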
A Review of Materials for Gas Turbines Firing Syngas Fuels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gibbons, Thomas; Wright, Ian G
2009-05-01
Following the extensive development work carried out in the 1990s, gas turbine combined-cycle (GTCC) systems burning natural gas represent a reliable and efficient power generation technology widely used in many parts of the world. A critical factor was that, in order to operate at the high turbine entry temperatures required for high-efficiency operation, aero-engine technology, i.e., single-crystal blades, thermal barrier coatings, and sophisticated cooling techniques, had to be rapidly scaled up and introduced into these large gas turbines. The problems with reliability that resulted have been largely overcome, so that the high-efficiency GTCC power generation system is now a mature technology, capable of achieving high levels of availability. The high price of natural gas and concern about emission of greenhouse gases have focused attention on the desirability of replacing natural gas with gas derived from coal (syngas) in these gas turbine systems, since typical systems analyses indicate that IGCC plants have some potential to fulfill the requirement for a zero-emissions power generation system. In this review, the current status of materials for the critical hot gas path parts in large gas turbines is briefly considered in the context of the need to burn syngas. A critical factor is that syngas is a low-Btu fuel, and the higher mass flow compared to natural gas will tend to increase the power output of the engine; however, modifications to the turbine and to the combustion system also will be necessary. It will be shown that many of the materials used in current engines will also be applicable to units burning syngas but, since the combustion environment will contain a greater level of impurities (especially sulfur, water vapor, and particulates), the durability of some components may be prejudiced. Consequently, some effort will be needed to develop improved coatings to resist attack by sulfur-containing compounds, and also erosion.
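The mass flow argument can be made concrete: for the same firing duty, fuel mass flow scales inversely with heating value. The LHV figures below are typical textbook values assumed for illustration, not numbers from the review.

```python
def fuel_mass_flow_ratio(lhv_ng_mj_kg, lhv_syngas_mj_kg):
    """Ratio of syngas to natural-gas fuel mass flow required to deliver
    the same thermal input (equal firing duty)."""
    return lhv_ng_mj_kg / lhv_syngas_mj_kg

# ~50 MJ/kg natural gas vs ~12 MJ/kg syngas -> roughly 4x the fuel mass
# flow through the combustor, hence the increased power output noted above.
print(fuel_mass_flow_ratio(50.0, 12.0))
```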
Development of an Acoustic Sensor for On-Line Gas Temperature Measurement in Gasifiers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peter Ariessohn; Hans Hornung
2006-01-15
This project was awarded under U.S. Department of Energy (DOE) National Energy Technology Laboratory (NETL) Program Solicitation DE-PS26-02NT41422 and specifically addresses Technical Topical Area 2, Gasification Technologies. The project team includes Enertechnix, Inc. as the main contractor and ConocoPhillips Company as a technical partner, who also provides access to the SG Solutions Gasification Facility (formerly Wabash River Energy Limited), host for the field-testing portion of the research. Since 1989 the U.S. Department of Energy has supported development of advanced coal gasification technology. The Wabash River and TECO IGCC demonstration projects supported by the DOE have demonstrated the ability of these plants to achieve high levels of energy efficiency and extremely low emissions of hazardous pollutants. However, a continuing challenge for this technology is the tradeoff between high carbon conversion, which requires operation with high internal gas temperatures, and limited refractory life, which is exacerbated by those high operating temperatures. Attempts to control internal gas temperature so as to operate these gasifiers at the optimum temperature have been hampered by the lack of a reliable technology for measuring internal gas temperatures. Thermocouples have serious survival problems and provide useful temperature information for only a few days or weeks after startup before burning out. For this reason, the Department of Energy has funded several research projects to develop more robust and reliable temperature measurement approaches for use in coal gasifiers. Enertechnix has developed a line of acoustic gas temperature sensors for use in coal-fired electric utility boilers, kraft recovery boilers, cement kilns, and petrochemical process heaters. Acoustic pyrometry provides several significant advantages for gas temperature measurement in hostile process environments.
First, it is non-intrusive so survival of the measurement components is not a serious problem. Second, it provides a line-of-sight average temperature rather than a point measurement, so the measured temperature is more representative of the process conditions than those provided by thermocouples. Unlike radiation pyrometers, the measured temperature is a linear average over the full path rather than a complicated function of gas temperature and the exponential Beer's law. For this reason, acoustic pyrometry is well suited to tomography allowing detailed temperature maps to be created through the use of multiple path measurements in a plane. Therefore, acoustic pyrometry is an attractive choice for measuring gas temperature inside a coal gasifier. The objective of this project is to adapt acoustic pyrometer technology to make it suitable for measuring gas temperature inside a coal gasifier, to develop a prototype sensor based on this technology, and to demonstrate its performance through testing on a commercial gasifier. The project is organized in three phases, each of approximately one year duration. The first phase consists of researching a variety of sound generation and coupling approaches suitable for use with a high pressure process, evaluation of the impact of gas composition variability on the acoustic temperature measurement approach, evaluation of the impact of suspended particles on sound attenuation, evaluation of slagging issues and development of concepts to deal with this issue, development and testing of key prototype components to allow selection of the best approaches, and development of a conceptual design for a field prototype sensor that can be tested on an operating gasifier. The second phase consists of designing and fabricating a series of prototype sensors, testing them in the lab and at a gasifier facility, and developing a conceptual design for an engineering prototype sensor. 
The third phase consists of designing and fabricating the engineering prototype, testing it in the lab and in a commercial gasifier, and conducting extended field trials to demonstrate sensor performance and investigate the ability to improve gasifier performance through the use of the sensor.
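The physics behind acoustic pyrometry is the composition-dependent sound speed c = sqrt(γRT/M): measure a time of flight over a known path and invert for the path-averaged temperature. The gas properties below are assumed illustrative values for a hot syngas; in practice they must come from the actual gas composition, which is why the project evaluated composition variability.

```python
GAMMA = 1.30   # assumed ratio of specific heats for hot syngas
MOLAR = 0.022  # assumed mean molar mass, kg/mol (H2/CO-rich gas)
R_GAS = 8.314  # J/(mol K)

def gas_temperature_K(path_m, tof_s, gamma=GAMMA, molar_mass=MOLAR):
    """Path-averaged gas temperature from an acoustic time of flight,
    inverting c = sqrt(gamma * R * T / M) with c = path / tof."""
    c = path_m / tof_s
    return c ** 2 * molar_mass / (gamma * R_GAS)

# For these assumed properties, a 3 m path and ~3.38 ms flight time
# (c ~ 887 m/s) correspond to roughly 1600 K.
print(gas_temperature_K(3.0, 3.384e-3))
```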
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shiquan Tao
2006-12-31
The chemistry of sol-gel-derived silica and refractive metal oxides has been systematically studied. Sol-gel processes have been developed for preparing porous silica and semiconductor metal oxide materials. Micelle/reversed-micelle techniques have been developed for preparing nanometer-sized semiconductor metal oxide and noble metal particles. Techniques for doping metal ions, metal oxides, and nanosized metal particles into porous sol-gel materials have also been developed. The optical properties of sol-gel-derived materials in ambient and high-temperature gases have been studied using fiber optic spectroscopic techniques, such as fiber optic ultraviolet/visible absorption spectrometry, fiber optic near-infrared absorption spectrometry, and fiber optic fluorescence spectrometry. Fiber optic spectrometric techniques have been developed for investigating the optical properties of these sol-gel-derived materials prepared as porous optical fibers or as coatings on the surface of silica optical fibers. Optical and electron microscopic techniques have been used to observe the microstructure (pore size, pore shape, sensing-agent distribution) of sol-gel-derived materials, the size and morphology of nanometer metal particles doped into sol-gel-derived porous silica, and the nature of coatings of sol-gel-derived materials on silica optical fiber surfaces. In addition, the chemical reactions of metal ions, nanostructured semiconductor metal oxides, and nanometer-sized metal particles with gas components at room temperature and high temperatures have been investigated with fiber optic spectrometric methods. Three classes of fiber optic sensors have been developed based on this thorough investigation of sol-gel chemistry and sol-gel-derived materials. The first group of fiber optic sensors uses porous silica optical fibers doped with metal ions or metal oxides as transducers for sensing trace NH3 and H2S in high-temperature gas samples.
The second group of fiber optic sensors uses sol-gel-derived porous silica materials doped with nanometer particles of noble metals, in the form of a fiber or a coating, for sensing trace H2, NH3, and HCl in gas samples at ambient temperature. The third class of fiber optic sensors uses sol-gel-derived semiconductor metal oxide coatings on the surface of silica optical fibers as transducers for selectively sensing H2, CH4, and CO at high temperature. In addition, optical fiber temperature sensors that use the fluorescence signal of rare-earth-ion-doped porous silica optical fibers, or the optical absorption signal of thermochromic metal oxide materials coated on the surface of silica optical fibers, have been developed for monitoring the temperature of corrosive gases. Based on the results obtained from this project, the principle of fiber optic sensor techniques for monitoring matrix gas components as well as trace components of coal-gasification-derived syngas has been established. Prototype sensors for sensing trace ammonia and hydrogen sulfide in gasification-derived syngas have been built in our laboratory and tested using gas samples with matrix gas compositions similar to that of gasification-derived fuel gas. Test results illustrated the feasibility of these sensors for application in IGCC processes.
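Absorption-based fiber optic sensors of the kind described ultimately recover concentration from a Beer-Lambert relation, A = ε·l·c. The molar absorptivity and path length below are hypothetical calibration values, not project data.

```python
import math

def absorbance(i0, i):
    """Absorbance A = log10(I0/I) from incident and transmitted intensity."""
    return math.log10(i0 / i)

def concentration(i0, i, epsilon, path_cm):
    """Beer-Lambert concentration estimate c = A / (epsilon * l).
    epsilon (L/(mol cm)) and path_cm are sensor-specific calibration
    values; the ones used in the example are illustrative."""
    return absorbance(i0, i) / (epsilon * path_cm)

# Half the light transmitted (A ~ 0.301) with an assumed epsilon of
# 100 L/(mol cm) over a 1 cm path -> ~3e-3 mol/L
print(concentration(1.0, 0.5, 100.0, 1.0))
```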
NASA Astrophysics Data System (ADS)
Kennedy, Scott Warren
A steady decline in the cost of wind turbines and increased experience in their successful operation have brought this technology to the forefront of viable alternatives for large-scale power generation. Methodologies for understanding the costs and benefits of large-scale wind power development, however, are currently limited. In this thesis, a new and widely applicable technique for estimating the social benefit of large-scale wind power production is presented. The social benefit is based upon wind power's energy and capacity services and the avoidance of environmental damages. The approach uses probabilistic modeling techniques to account for the stochastic interaction between wind power availability, electricity demand, and conventional generator dispatch. A method for including the spatial smoothing effect of geographically dispersed wind farms is also introduced. The model has been used to analyze potential offshore wind power development to the south of Long Island, NY. If natural gas combined cycle (NGCC) and integrated gasifier combined cycle (IGCC) are the alternative generation sources, wind power exhibits a negative social benefit due to its high capacity cost and the relatively low emissions of these advanced fossil-fuel technologies. Environmental benefits increase significantly if charges for CO2 emissions are included. Results also reveal a diminishing social benefit as wind power penetration increases. The dependence of wind power benefits on natural gas and coal prices is also discussed. In power systems with a high penetration of wind generated electricity, the intermittent availability of wind power may influence hourly spot prices. A price responsive electricity demand model is introduced that shows a small increase in wind power value when consumers react to hourly spot prices. The effectiveness of this mechanism depends heavily on estimates of the own- and cross-price elasticities of aggregate electricity demand. 
This work makes a valuable contribution by synthesizing information from research in power market economics, power system reliability, and environmental impact assessment, to develop a comprehensive methodology for analyzing wind power in the context of long-term energy planning.
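The probabilistic valuation described above can be sketched in a few lines of Monte Carlo. The parameters below (marginal fuel cost, avoided emissions rate, CO2 charge, and the wind capacity-factor distribution) are illustrative assumptions, not values from the thesis, and the full model additionally dispatches conventional generators against stochastic demand and credits capacity value:

```python
import random

random.seed(1)

# Illustrative assumptions -- NOT values from the thesis.
FUEL_COST = 40.0   # $/MWh avoided fuel cost of the marginal NGCC unit
CO2_RATE = 0.37    # tCO2/MWh emissions avoided by displacing that unit
CO2_CHARGE = 25.0  # $/tCO2 hypothetical emissions charge
CAPACITY = 100.0   # MW nameplate wind capacity

def annual_social_benefit(n_hours=8760):
    """Monte Carlo estimate of the annual energy-plus-emissions benefit ($)."""
    total = 0.0
    for _ in range(n_hours):
        # Stochastic hourly capacity factor, truncated to [0, 1].
        cf = min(1.0, max(0.0, random.gauss(0.35, 0.2)))
        total += CAPACITY * cf * (FUEL_COST + CO2_RATE * CO2_CHARGE)
    return total
```

Setting CO2_CHARGE to zero in such a sketch reproduces the qualitative finding above: against low-emission NGCC/IGCC alternatives, the emissions term contributes little and the capacity cost can dominate.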
NASA Technical Reports Server (NTRS)
Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.
1974-01-01
A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.
NASA Astrophysics Data System (ADS)
Raju, B. S.; Sekhar, U. Chandra; Drakshayani, D. N.
2017-08-01
The paper investigates optimization of the stereolithography process for SL5530 epoxy resin material to enhance part quality. The performance characteristics selected to evaluate the process are tensile strength, flexural strength, impact strength, and density, and the corresponding process parameters are layer thickness, orientation, and hatch spacing. Because the process intrinsically involves tuning multiple parameters, grey relational analysis, which uses the grey relational grade as a performance index, is adopted to determine the optimal combination of process parameters. Moreover, principal component analysis is applied to evaluate the weighting values of the various performance characteristics so that their relative importance can be properly and objectively described. The results of confirmation experiments reveal that grey relational analysis coupled with principal component analysis can effectively identify the optimal combination of process parameters. This confirms that the proposed approach can be a useful tool for improving process parameters in stereolithography, which is valuable information for machine designers as well as RP machine users.
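As a rough illustration of the grey relational grade used above, the sketch below normalizes larger-is-better responses, forms deviation sequences from the ideal, and averages the grey relational coefficients per experiment. The distinguishing coefficient zeta = 0.5 and equal response weights are conventional defaults; the paper instead derives the weights from PCA:

```python
def grey_relational_grades(responses, zeta=0.5, weights=None):
    """responses: one list of larger-is-better response values per experiment.
    Returns a grey relational grade for each experiment (higher is better)."""
    n_resp = len(responses[0])
    # 1. Normalize each response column (larger-is-better) to [0, 1].
    norm = []
    for j in range(n_resp):
        col = [r[j] for r in responses]
        lo, hi = min(col), max(col)
        norm.append([(v - lo) / (hi - lo) for v in col])
    # 2. Deviation of each experiment from the ideal sequence (all ones).
    dev = [[1.0 - norm[j][i] for j in range(n_resp)] for i in range(len(responses))]
    dmin = min(min(row) for row in dev)
    dmax = max(max(row) for row in dev)
    # 3. Grey relational coefficients, then the weighted grade.
    w = weights or [1.0 / n_resp] * n_resp
    grades = []
    for row in dev:
        coeffs = [(dmin + zeta * dmax) / (d + zeta * dmax) for d in row]
        grades.append(sum(wj * c for wj, c in zip(w, coeffs)))
    return grades
```

The parameter combination with the highest grade is taken as the multi-response optimum.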
Criteria for Comparing Domain Analysis Approaches Version 01.00.00
1991-12-01
Contents fragments: Down-Bottom-Up Domain Analysis Process (1990 Version); Figure 8, FODA's Domain Analysis Process; an overview of some domain analysis approaches, including FODA, which uses the Design Approach for Real-Time Systems (DARTS) design method (Gomaa 1984). Domain analysis is still immature. The FODA report illustrates the process by using the window management …
NASA Technical Reports Server (NTRS)
Deckert, George
2010-01-01
This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts
Designing Image Analysis Pipelines in Light Microscopy: A Rational Approach.
Arganda-Carreras, Ignacio; Andrey, Philippe
2017-01-01
With the progress of microscopy techniques and the rapidly growing amounts of acquired imaging data, there is an increased need for automated image processing and analysis solutions in biological studies. Each new application requires the design of a specific image analysis pipeline, by assembling a series of image processing operations. Many commercial and free bioimage analysis software packages are now available, and several textbooks and reviews have presented the mathematical and computational fundamentals of image processing and analysis. Tens, if not hundreds, of algorithms and methods have been developed and integrated into image analysis software, resulting in a combinatorial explosion of possible image processing sequences. This paper presents a general guideline methodology to rationally address the design of image processing and analysis pipelines. The originality of the proposed approach is to follow an iterative, backwards procedure from the target objectives of analysis. The proposed goal-oriented strategy should help biologists to better apprehend image analysis in the context of their research and should allow them to efficiently interact with image processing specialists.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boggs, Paul T.; Althsuler, Alan; Larzelere, Alex R.
2005-08-01
The Design-through-Analysis Realization Team (DART) is chartered with reducing the time Sandia analysts require to complete the engineering analysis process. The DART system analysis team studied the engineering analysis processes employed by analysts in Centers 9100 and 8700 at Sandia to identify opportunities for reducing overall design-through-analysis process time. The team created and implemented a rigorous analysis methodology based on a generic process flow model parameterized by information obtained from analysts. They also collected data from analysis department managers to quantify the problem type and complexity distribution throughout Sandia's analyst community. They then used this information to develop a community model, which enables a simple characterization of processes that span the analyst community. The results indicate that equal opportunity for reducing analysis process time is available both by reducing the "once-through" time required to complete a process step and by reducing the probability of backward iteration. In addition, reducing the rework fraction (i.e., improving the engineering efficiency of subsequent iterations) offers approximately 40% to 80% of the benefit of reducing the "once-through" time or iteration probability, depending upon the process step being considered. Further, the results indicate that geometry manipulation and meshing is the largest portion of an analyst's effort, especially for structural problems, and offers significant opportunity for overall time reduction. Iteration loops initiated late in the process are more costly than others because they increase "inner loop" iterations. Identifying and correcting problems as early as possible in the process offers significant opportunity for time savings.
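The trade-off between "once-through" time, iteration probability, and rework fraction can be illustrated with a simple geometric-repeat model; this is a hypothetical sketch, not the DART team's actual community model:

```python
def expected_step_time(t_once, p_iter, rework_frac):
    """Expected time for one process step that loops back with probability
    p_iter, where each rework pass costs rework_frac * t_once.
    The number of rework passes is geometric: E[passes] = p / (1 - p)."""
    assert 0.0 <= p_iter < 1.0
    return t_once * (1.0 + p_iter / (1.0 - p_iter) * rework_frac)
```

Halving the rework fraction or halving the iteration probability can then be compared directly on expected step time, mirroring the 40%-80% comparison reported above.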
Methods utilized in evaluating the profitability of commercial space processing
NASA Technical Reports Server (NTRS)
Bloom, H. L.; Schmitt, P. T.
1976-01-01
Profitability analysis is applied to commercial space processing on the basis of business concept definition and assessment and the relationship between ground and space functions. Throughput analysis is demonstrated by analysis of the space manufacturing of surface acoustic wave devices. The paper describes a financial analysis model for space processing and provides key profitability measures for space processed isoenzymes.
QUAGOL: a guide for qualitative data analysis.
Dierckx de Casterlé, Bernadette; Gastmans, Chris; Bryon, Els; Denier, Yvonne
2012-03-01
Data analysis is a complex and contested part of the qualitative research process, which has received limited theoretical attention. Researchers are often in need of useful instructions or guidelines on how to analyze the mass of qualitative data, but face a lack of clear guidance for using particular analytic methods. The aim of this paper is to propose and discuss the Qualitative Analysis Guide of Leuven (QUAGOL), a guide that was developed in order to truly capture the rich insights of qualitative interview data. The article describes six major problems researchers often struggle with during the process of qualitative data analysis. Consequently, the QUAGOL is proposed as a guide to facilitate the process of analysis. Challenges that emerged and lessons learned from the authors' own extensive experiences with qualitative data analysis within the Grounded Theory Approach, as well as from those of other researchers (as described in the literature), are discussed and recommendations are presented. Strengths and pitfalls of the proposed method are discussed in detail. The Qualitative Analysis Guide of Leuven (QUAGOL) offers a comprehensive method to guide the process of qualitative data analysis. The process consists of two parts, each consisting of five stages. The method is systematic but not rigid. It is characterized by iterative processes of digging deeper, constantly moving between the various stages of the process. As such, it aims to stimulate the researcher's intuition and creativity as much as possible. The QUAGOL is a theory- and practice-based guide that supports and facilitates the process of analysis of qualitative interview data. Although the method can facilitate the process of analysis, it cannot guarantee automatic quality. The skills of the researcher and the quality of the research team remain the most crucial components of a successful process of analysis.
Additionally, the importance of constantly moving between the various stages throughout the research process cannot be overstated.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-12
... of performing the technical analysis, management assessment, and program evaluation tasks required to.... Analysis of elements of the review process (including the presubmission process, and investigational device... time to facilitate a more efficient process. This includes analysis of root causes for inefficiencies...
Uncertainty Budget Analysis for Dimensional Inspection Processes (U)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valdez, Lucas M.
2012-07-26
This paper is intended to provide guidance and describe how to prepare an uncertainty analysis of a dimensional inspection process through the utilization of an uncertainty budget analysis. The uncertainty analysis is stated in the same methodology as that of the ISO GUM standard for calibration and testing. There is a specific distinction between how Type A and Type B uncertainty analysis is used in a general and specific process. All theory and applications are utilized to represent both a generalized approach to estimating measurement uncertainty and how to report and present these estimations for dimensional measurements in a dimensional inspection process. The analysis of this uncertainty budget shows that a well-controlled dimensional inspection process produces a conservative process uncertainty, which can be attributed to the necessary assumptions in place for best possible results.
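A minimal sketch of the budget arithmetic, following the GUM root-sum-square rule: Type A components are evaluated statistically from repeated readings, Type B components come from other knowledge (calibration certificates, resolution limits), and both enter the combined standard uncertainty identically; a coverage factor k = 2 is the common choice for roughly 95% confidence:

```python
import math
import statistics

def type_a(readings):
    """Type A standard uncertainty of the mean from repeated readings."""
    return statistics.stdev(readings) / math.sqrt(len(readings))

def combined_uncertainty(components, k=2.0):
    """components: (standard uncertainty, sensitivity coefficient) pairs.
    Returns the combined standard uncertainty u_c (root sum of squares of
    c_i * u_i) and the expanded uncertainty U = k * u_c."""
    u_c = math.sqrt(sum((c * u) ** 2 for u, c in components))
    return u_c, k * u_c
```

Each row of a written uncertainty budget corresponds to one `(u_i, c_i)` pair, so the table and the computation stay in step.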
Tidal analysis and Arrival Process Mining Using Automatic Identification System (AIS) Data
2017-01-01
The data, organized by location, were processed using the Python programming language (van Rossum and Drake 2001) and the Pandas data analysis library. (ERDC/CHL TR-17-2, Coastal Inlets Research Program, January 2017; Brandan M. Scully, Coastal and Hydraulics Laboratory.)
Post-test navigation data analysis techniques for the shuttle ALT
NASA Technical Reports Server (NTRS)
1975-01-01
Postflight test analysis data processing techniques for shuttle approach and landing tests (ALT) navigation data are defined. Postflight test processor requirements are described along with operational and design requirements, data input requirements, and software test requirements. The postflight test data processing is described based on the natural test sequence: quick-look analysis, postflight navigation processing, and error isolation processing. Emphasis is placed on the tradeoffs that must remain open and subject to analysis until final definition is achieved in the shuttle data processing system and the overall ALT plan. A development plan for the implementation of the ALT postflight test navigation data processing system is presented. Conclusions are presented.
Schaub, Jochen; Clemens, Christoph; Kaufmann, Hitto; Schulz, Torsten W
2012-01-01
Development of efficient bioprocesses is essential for cost-effective manufacturing of recombinant therapeutic proteins. To achieve further process improvement and process rationalization, comprehensive analysis of both process data and phenotypic cell-level data is essential. Here, we present a framework for advanced bioprocess data analysis consisting of multivariate data analysis (MVDA), metabolic flux analysis (MFA), and pathway analysis for mapping of large-scale gene expression data sets. This data analysis platform was applied in a process development project with an IgG-producing Chinese hamster ovary (CHO) cell line in which the maximal product titer could be increased from about 5 to 8 g/L. Principal component analysis (PCA), k-means clustering, and partial least-squares (PLS) models were applied to analyze the macroscopic bioprocess data. MFA and gene expression analysis revealed intracellular information on the characteristics of high-performance cell cultivations. By MVDA, for example, correlations between several essential amino acids and the product concentration were observed. The processes could also be grouped into those driven mainly by cell-specific productivity and those driven by process control. By MFA, phenotypic characteristics in glycolysis, glutaminolysis, the pentose phosphate pathway, the citrate cycle, the coupling of amino acid metabolism to the citrate cycle, and the energy yield could be identified. By gene expression analysis, 247 deregulated metabolic genes were identified, which are involved, inter alia, in amino acid metabolism, transport, and protein synthesis.
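Of the multivariate tools mentioned, k-means clustering is the simplest to sketch. The plain-Python version below (an illustration, not the authors' implementation) groups cultivation runs represented as macroscopic feature vectors, e.g. (titer, peak cell density):

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means: points are tuples of equal length; returns the final
    cluster centers and the cluster membership lists."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)  # initialize from k distinct points
    for _ in range(iters):
        # Assign each point to its nearest center (squared Euclidean distance).
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        # Recompute centers; keep the old center if a cluster went empty.
        centers = [
            tuple(sum(d) / len(c) for d in zip(*c)) if c else centers[j]
            for j, c in enumerate(clusters)
        ]
    return centers, clusters
```

In a study like the one above, the resulting clusters would be inspected for the productivity-driven versus process-control-driven grouping described.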
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arastoopour, Hamid; Abbasian, Javad
2014-07-31
This project describes the work carried out to prepare highly reactive and mechanically strong MgO-based sorbents and to develop a Population Balance Equations (PBE) approach to describe the evolution of the particle porosity distribution, linked with Computational Fluid Dynamics (CFD) to perform simulations of CO2 capture and sorbent regeneration. A large number of MgO-based regenerable sorbents were prepared using low-cost and abundant dolomite as the base material. Among the various preparation parameters investigated, the potassium/magnesium (K/Mg) ratio was identified as the key variable affecting the reactivity and CO2 capacity of the sorbent. The optimum K/Mg ratio is about 0.15. The sorbent formulation HD52-P2 was identified as the "best" sorbent formulation, and a large batch (one kg) of the sorbent was prepared for the detailed study. The results of the parametric study indicate the optimum carbonation and regeneration temperatures are 360° and 500°C, respectively. The results also indicate that steam has a beneficial effect on the rate of carbonation and regeneration of the sorbent and that the reactivity and capacity of the sorbent decrease in the cycling process (sorbent deactivation). The results indicate that to achieve a high CO2 removal efficiency, the bed of sorbent should be operated in a temperature range of 370-410°C, which also favors production of hydrogen through the WGS reaction. To describe the carbonation reaction kinetics of the MgO, the Variable Diffusivity Shrinking Core Model (VDM) was developed in this project, which was shown to accurately fit the experimental data. An important advantage of this model is that the changes in the sorbent conversion with time can be expressed in an explicit manner, which significantly reduces the CFD computation time.
A Computational Fluid Dynamics/Population Balance Equations (CFD/PBE) model was developed that accounts for the particle (sorbent) porosity distribution, and a new version of the method of moments, called the Finite size domain Complete set of trial functions Method Of Moments (FCMOM), was used to solve the population balance equations. The PBE model was implemented in a commercial CFD code, Ansys Fluent 13.0. The code was used to test the model in some simple cases, and the results were verified against available analytical solutions in the literature. Furthermore, the code was used to simulate CO2 capture in a packed bed, and the results were in excellent agreement with the experimental data obtained in the packed bed. The National Energy Technology Laboratory (NETL) Carbon Capture Unit (C2U) design was used to simulate the hydrodynamics of the cold-flow gas/solid system (Clark et al. [58]). The results indicate that the pressure drop predicted by the model is in good agreement with the experimental data. Furthermore, the model was shown to be able to predict the chugging behavior observed during the experiment. The model was used as a base case for simulations of reactive flow at elevated pressures and temperatures. The results indicate that by controlling the solid circulation rate, up to 70% CO2 removal can be achieved, and that the solid holdup in the riser is one of the main factors controlling the extent of CO2 removal. The CFD/PBE simulation model indicates that by using a simulated syngas with a composition of 20% CO2, 20% H2O, 30% CO, and 30% H2, the composition (wet basis) in the reactor outlet corresponded to about 60% CO2 capture, with an exit gas containing 65% H2. A preliminary base-case design was developed for a regenerative MgO-based pre-combustion carbon capture process for a 500 MW IGCC power plant.
To minimize the external energy requirement, an extensive heat integration network was developed in Aspen/HYSYS® to produce the steam required in the regenerator and heat integration. In this process, liquid CO2 produced at 50 atm can easily be pumped and sequestered or stored. The preliminary economic analyses indicate that the estimated cost of carbon capture is in the range of $31-$44/ton, suggesting that a regenerative MgO-based process can be a viable option for pre-combustion carbon dioxide capture in advanced gasification-based power systems.
Dynamic analysis of process reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shadle, L.J.; Lawson, L.O.; Noel, S.D.
1995-06-01
The approach and methodology of conducting a dynamic analysis are presented in this poster session in order to describe how this type of analysis can be used to evaluate the operation and control of process reactors. Dynamic analysis of the PyGas™ gasification process is used to illustrate the utility of this approach. PyGas™ is the gasifier being developed for the Gasification Product Improvement Facility (GPIF) by Jacobs-Siffine Engineering and Riley Stoker. In the first step of the analysis, process models are used to calculate the steady-state conditions and associated sensitivities for the process. For the PyGas™ gasifier, the process models are non-linear mechanistic models of the jetting fluidized-bed pyrolyzer and the fixed-bed gasifier. These process sensitivities are key input, in the form of gain parameters or transfer functions, to the dynamic engineering models.
[Process management in the hospital pharmacy to improve patient safety].
Govindarajan, R; Perelló-Juncá, A; Parès-Marimòn, R M; Serrais-Benavente, J; Ferrandez-Martí, D; Sala-Robinat, R; Camacho-Calvente, A; Campabanal-Prats, C; Solà-Anderiu, I; Sanchez-Caparrós, S; Gonzalez-Estrada, J; Martinez-Olalla, P; Colomer-Palomo, J; Perez-Mañosas, R; Rodríguez-Gallego, D
2013-01-01
To define a process management model for a hospital pharmacy in order to measure, analyse and make continuous improvements in patient safety and healthcare quality. In order to implement process management, Igualada Hospital was divided into different processes, one of which was the Hospital Pharmacy. A multidisciplinary management team was given responsibility for each process. For each sub-process one person was identified to be responsible, and a working group was formed under his/her leadership. With the help of each working group, a risk analysis using failure modes and effects analysis (FMEA) was performed, and the corresponding improvement actions were implemented. Sub-process indicators were also identified, and different process management mechanisms were introduced. The first risk analysis with FMEA produced more than thirty preventive actions to improve patient safety. Later, the weekly analysis of errors, as well as the monthly analysis of key process indicators, permitted us to monitor process results and, as each sub-process manager participated in these meetings, also to assume accountability and responsibility, thus consolidating the culture of excellence. The introduction of different process management mechanisms, with the participation of people responsible for each sub-process, introduces a participative management tool for the continuous improvement of patient safety and healthcare quality.
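FMEA prioritization of the kind described above is typically driven by the Risk Priority Number, the product of severity, occurrence, and detection scores on 1-10 scales. A minimal sketch (the failure-mode names below are invented examples, not taken from the study):

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: S x O x D, each scored on a 1-10 scale."""
    for v in (severity, occurrence, detection):
        assert 1 <= v <= 10
    return severity * occurrence * detection

def rank_failure_modes(modes):
    """modes: dict mapping failure-mode name -> (S, O, D).
    Returns names sorted by descending RPN, i.e. priority order for action."""
    return sorted(modes, key=lambda m: rpn(*modes[m]), reverse=True)
```

Preventive actions such as the thirty reported above would then be targeted at the highest-RPN modes first, and the scores re-evaluated after implementation.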
Global Sensitivity Analysis for Process Identification under Model Uncertainty
NASA Astrophysics Data System (ADS)
Ye, M.; Dai, H.; Walker, A. P.; Shi, L.; Yang, J.
2015-12-01
Environmental systems consist of various physical, chemical, and biological processes, and environmental models are built to simulate these processes and their interactions. For model building, improvement, and validation, it is necessary to identify important processes so that limited resources can be used to better characterize them. While global sensitivity analysis has been widely used to identify important processes, the identification is usually based on a deterministic process conceptualization that uses a single model to represent each process. However, environmental systems are complex, and it often happens that a single process may be simulated by multiple alternative models. Ignoring model uncertainty in process identification may bias the identification, in that processes identified as important may not be so in the real world. This study addresses this problem by developing a new method of global sensitivity analysis for process identification. The new method is based on the concepts of Sobol sensitivity analysis and model averaging. Similar to the Sobol sensitivity analysis used to identify important parameters, our new method evaluates the variance change when a process is fixed at its different conceptualizations. The variance considers both parametric and model uncertainty using the method of model averaging. The method is demonstrated using a synthetic study of groundwater modeling that considers a recharge process and a parameterization process, each with two alternative models. Important processes of groundwater flow and transport are evaluated using the new method. The method is mathematically general and can be applied to a wide range of environmental problems.
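The idea of measuring a process's importance as the share of output variance explained by its alternative conceptualizations can be sketched with a toy surrogate model. The two recharge "models" and their prior weights below are hypothetical stand-ins, not the study's groundwater models:

```python
import random
import statistics

random.seed(2)

def output(recharge_model, param):
    # Toy surrogate for a model output; the coefficients are invented.
    return (1.5 if recharge_model == "A" else 2.5) * param

def process_sensitivity(models, weights, n=20000):
    """First-order sensitivity of the process conceptualization:
    S = Var_m(E[Y | m]) / Var(Y), with Var(Y) pooled over both model
    choice (model averaging) and parameter uncertainty."""
    samples, cond_means = [], []
    for m, w in zip(models, weights):
        ys = [output(m, random.gauss(1.0, 0.1)) for _ in range(int(n * w))]
        samples += ys
        cond_means.append((w, statistics.fmean(ys)))
    total_var = statistics.pvariance(samples)
    grand = sum(w * mu for w, mu in cond_means)
    between = sum(w * (mu - grand) ** 2 for w, mu in cond_means)
    return between / total_var
```

A value near 1 means the choice of process conceptualization, rather than parameter uncertainty, dominates the output variance, flagging the process as important to characterize.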
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haurykiewicz, John Paul; Dinehart, Timothy Grant; Parker, Robert Young
2016-05-12
The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mattes, Karl
Summit Texas Clean Energy, LLC (Summit) is developing the Texas Clean Energy Project (TCEP or the project) to be located near Penwell, Texas. The TCEP will include an Integrated Gasification Combined Cycle (IGCC) plant with a nameplate capacity of 400 megawatts electric (MWe), combined with the production of urea fertilizer and the capture, utilization and storage of carbon dioxide (CO2) sold commercially for regional use in enhanced oil recovery (EOR) in the Permian Basin of west Texas. The TCEP will utilize coal gasification technology to convert Powder River Basin sub-bituminous coal delivered by rail from Wyoming into a synthetic gas (syngas) which will be cleaned and further treated so that at least 90 percent of the overall carbon entering the facility will be captured. The clean syngas will then be divided into two high-hydrogen (H2) concentration streams, one of which will be combusted as a fuel in a combined cycle power block for power generation and the other converted into urea fertilizer for commercial sale. The captured CO2 will be divided into two streams: one will be used in producing the urea fertilizer and the other will be compressed for transport by pipeline for offsite use in EOR. The TCEP was selected by the U.S. Department of Energy (DOE) Office of Fossil Energy (FE) for cost-shared co-funded financial assistance under Round 3 of its Clean Coal Power Initiative (CCPI). A portion of this financial assistance was budgeted and provided for initial development, permitting and design activities. Front-end Engineering and Design (FEED) commenced in June 2010 and was completed in July 2011, setting the design basis for entering into the detailed engineering phase of the project.
During Phase 1, TCEP conducted and completed the FEED, applied for and received its air construction permit, provided engineering and other technical information required for development of the draft Environmental Impact Statement, and completed contracts for the sale of all of the urea and most of the CO2. Significant progress was made on the contracts for the purchase of coal feedstock from Cloud Peak Energy’s Cordero Rojo mine and the sale of electricity to CPS Energy, as well as a memorandum of understanding with the Union Pacific Railroad (UPRR) for delivery of the coal to the TCEP site.
Chemical Sensing in Process Analysis.
ERIC Educational Resources Information Center
Hirschfeld, T.; And Others
1984-01-01
Discusses: (1) rationale for chemical sensors in process analysis; (2) existing types of process chemical sensors; (3) sensor limitations, considering lessons of chemometrics; (4) trends in process control sensors; and (5) future prospects. (JN)
Articulating the Resources for Business Process Analysis and Design
ERIC Educational Resources Information Center
Jin, Yulong
2012-01-01
Effective process analysis and modeling are important phases of the business process management lifecycle. When many activities and multiple resources are involved, it is very difficult to build a correct business process specification. This dissertation provides a resource perspective of business processes. It aims at a better process analysis…
Conducting Qualitative Data Analysis: Qualitative Data Analysis as a Metaphoric Process
ERIC Educational Resources Information Center
Chenail, Ronald J.
2012-01-01
In the second of a series of "how-to" essays on conducting qualitative data analysis, Ron Chenail argues the process can best be understood as a metaphoric process. From this orientation he suggests researchers follow Kenneth Burke's notion of metaphor and see qualitative data analysis as the analyst systematically considering the "this-ness" of…
Techno-economic analysis: process model development for existing and conceptual processes; detailed heat integration; economic analysis of integrated processes; integration of process simulation learnings into control. Conceptual Process Design and Techno-Economic Assessment of Ex Situ Catalytic Fast Pyrolysis of Biomass: A…
[A SAS macro program for batch processing of univariate Cox regression analysis for large databases].
Yang, Rendong; Xiong, Jie; Peng, Yangqin; Peng, Xiaoning; Zeng, Xiaomin
2015-02-01
To realize batch processing of univariate Cox regression analysis for a large database with a SAS macro program. We wrote a SAS macro program in SAS 9.2 that can filter and integrate results and export P values to Excel. The program was used to screen survival-correlated RNA molecules in ovarian cancer. The SAS macro program completed the batch processing of the univariate Cox regression analyses, as well as the selection and export of the results. The SAS macro program can reduce the workload of statistical analysis and provides a basis for batch processing of univariate Cox regression analysis.
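The same batch-screening pattern can be sketched outside SAS. The pure-Python version below substitutes a median-split log-rank test (normal approximation) for the Cox model as a stand-in univariate screen; it illustrates the loop-fit-filter-collect workflow, not the authors' macro:

```python
import math

def logrank_p(times, events, groups):
    """Two-group log-rank test p-value (normal approximation to the statistic)."""
    data = sorted(zip(times, events, groups))
    n1 = sum(1 for g in groups if g == 1)
    at1, at0 = n1, len(groups) - n1  # numbers at risk per group
    o_minus_e, var = 0.0, 0.0
    i = 0
    while i < len(data):
        t = data[i][0]
        d1 = d0 = c1 = c0 = 0  # deaths / censorings at this time
        while i < len(data) and data[i][0] == t:
            _, e, g = data[i]
            if g == 1:
                d1 += e; c1 += 1 - e
            else:
                d0 += e; c0 += 1 - e
            i += 1
        d, n = d1 + d0, at1 + at0
        if d and n > 1:
            o_minus_e += d1 - d * at1 / n
            var += d * (at1 / n) * (1 - at1 / n) * (n - d) / (n - 1)
        at1 -= d1 + c1
        at0 -= d0 + c0
    z = o_minus_e / math.sqrt(var) if var else 0.0
    return math.erfc(abs(z) / math.sqrt(2))  # two-sided p

def batch_screen(expression, times, events, alpha=0.05):
    """For each RNA variable: split samples at the median, test, keep p < alpha."""
    hits = {}
    for name, values in expression.items():
        med = sorted(values)[len(values) // 2]
        groups = [1 if v >= med else 0 for v in values]
        p = logrank_p(times, events, groups)
        if p < alpha:
            hits[name] = p
    return hits
```

The `hits` dictionary plays the role of the filtered P-value sheet the macro exports to Excel.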
Using task analysis to improve the requirements elicitation in health information system.
Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa
2007-01-01
This paper describes the application of task analysis within the design process of a Web-based information system for managing clinical information in hemophilia care, in order to improve the requirements elicitation and, consequently, to validate the domain model obtained in a previous phase of the design process (system analysis). The use of task analysis in this case proved to be a practical and efficient way to improve the requirements engineering process by involving users in the design process.
The finite element simulation analysis research of 38CrSi cylindrical power spinning
NASA Astrophysics Data System (ADS)
Liang, Wei; Lv, Qiongying; Zhao, Yujuan; Lv, Yunxia
2018-01-01
To investigate the influence of the main cylindrical spinning process parameters on the spinning process, this paper draws on a real tube power spinning process and uses the ABAQUS finite element analysis software to simulate the tube power spinning of 38CrSi steel. Through analysis of the stress and strain during part forming, it examines the influence of the thickness reduction and the feed rate on the forming process, analyzes the variation of the spinning force, and finally determines a reasonable combination of the main spinning process parameters.
Tornado detection data reduction and analysis
NASA Technical Reports Server (NTRS)
Davisson, L. D.
1977-01-01
Data processing and analysis was provided in support of tornado detection by analysis of radio frequency interference in various frequency bands. Sea state determination data from short pulse radar measurements were also processed and analyzed. A backscatter simulation was implemented to predict radar performance as a function of wind velocity. Computer programs were developed for the various data processing and analysis goals of the effort.
Optical analysis of crystal growth
NASA Technical Reports Server (NTRS)
Workman, Gary L.; Passeur, Andrea; Harper, Sabrina
1994-01-01
Processing and data reduction of holographic images from Spacelab presents some interesting challenges in determining the effects of microgravity on crystal growth processes. Evaluation of several processing techniques, including the Computerized Holographic Image Processing System and the image processing software ITEX150, will provide fundamental information for holographic analysis of the space flight data.
Integrated Structural Analysis and Test Program
NASA Technical Reports Server (NTRS)
Kaufman, Daniel
2005-01-01
An integrated structural-analysis and structure-testing computer program is being developed in order to: Automate repetitive processes in testing and analysis; Accelerate pre-test analysis; Accelerate reporting of tests; Facilitate planning of tests; Improve execution of tests; Create a vibration, acoustics, and shock test database; and Integrate analysis and test data. The software package includes modules pertaining to sinusoidal and random vibration, shock and time replication, acoustics, base-driven modal survey, and mass properties and static/dynamic balance. The program is commanded by use of ActiveX controls. There is minimal need to generate command lines. Analysis or test files are selected by opening a Windows Explorer display. After selecting the desired input file, the program goes to a so-called analysis data process or test data process, depending on the type of input data. The status of the process is given by a Windows status bar, and when processing is complete, the data are reported in graphical, tabular, and matrix form.
Negotiation Process Analysis: A Research and Training Tool.
ERIC Educational Resources Information Center
Williams, Timothy
This paper proposes the use of interaction process analysis to study negotiation behaviors. Following a review of current literature in the field, the paper presents a theoretical framework for the analysis of both labor/management and social negotiation processes. Central to the framework described are two systems of activities that together…
Quantitative analysis of geomorphic processes using satellite image data at different scales
NASA Technical Reports Server (NTRS)
Williams, R. S., Jr.
1985-01-01
When aerial and satellite photographs and images are used in the quantitative analysis of geomorphic processes, either through direct observation of active processes or by analysis of landforms resulting from inferred active or dormant processes, a number of limitations in the use of such data must be considered. Active geomorphic processes work at different scales and rates. Therefore, the capability of imaging an active or dormant process depends primarily on the scale of the process and the spatial-resolution characteristic of the imaging system. Scale is an important factor in recording continuous and discontinuous active geomorphic processes, because what is not recorded will not be considered or even suspected in the analysis of orbital images. If the geomorphic process, or the landform change caused by the process, is smaller than 200 m in the x-y dimension, then it will not be recorded. Although the scale factor is critical in the recording of discontinuous active geomorphic processes, the repeat interval of orbital-image acquisition of a planetary surface is also a consideration, in order to capture a recurring short-lived geomorphic process or to record changes caused by either a continuous or a discontinuous geomorphic process.
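The 200 m threshold above is essentially a comparison of feature size against the sensor's ground resolution. A minimal sketch of such a resolvability check (the 80 m ground-sample distance and the three-pixel rule of thumb are illustrative assumptions, not figures from the paper):

```python
def is_resolvable(feature_m, pixel_m, pixels_needed=3):
    """A feature is considered detectable if it spans at least
    `pixels_needed` ground-sample distances (a common rule of thumb)."""
    return feature_m >= pixels_needed * pixel_m

# Landsat-MSS-class imagery with ~80 m ground resolution (illustrative).
print(is_resolvable(200.0, 80.0))   # a 200 m landform spans fewer than 3 pixels
print(is_resolvable(1000.0, 80.0))  # a 1 km landform is comfortably detectable
```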
Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.
Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias
2016-01-01
To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
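Segmented regression of an interrupted time series, as compared above, fits separate level and slope terms before and after the intervention. A minimal sketch on hypothetical monthly rates (the data and variable names are ours, not from the study):

```python
import numpy as np

def segmented_regression(y, intervention):
    """Fit y = b0 + b1*t + b2*post + b3*(t - intervention)*post,
    where post = 1 at and after the intervention point.
    Returns (baseline level, baseline slope, level change, slope change)."""
    t = np.arange(len(y), dtype=float)
    post = (t >= intervention).astype(float)
    X = np.column_stack([np.ones_like(t), t, post, (t - intervention) * post])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

# Hypothetical monthly event rates: flat baseline, then a drop after month 12.
rng = np.random.default_rng(0)
y = np.r_[20 + rng.normal(0, 0.5, 12), 15 + rng.normal(0, 0.5, 12)]
b0, b1, b2, b3 = segmented_regression(y, 12)
print(round(b2, 1))  # estimated level change at the intervention (about -5)
```

The level-change term `b2` is what distinguishes a genuine intervention effect from the pre-existing trend captured by `b1`.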
Thermodynamic analysis of resources used in manufacturing processes.
Gutowski, Timothy G; Branham, Matthew S; Dahmus, Jeffrey B; Jones, Alissa J; Thiriez, Alexandre
2009-03-01
In this study we use a thermodynamic framework to characterize the material and energy resources used in manufacturing processes. The analysis and data span a wide range of processes from "conventional" processes such as machining, casting, and injection molding, to the so-called "advanced machining" processes such as electrical discharge machining and abrasive waterjet machining, and to the vapor-phase processes used in semiconductor and nanomaterials fabrication. In all, 20 processes are analyzed. The results show that the intensity of materials and energy used per unit of mass of material processed (measured either as specific energy or exergy) has increased by at least 6 orders of magnitude over the past several decades. The increase of material/energy intensity use has been primarily a consequence of the introduction of new manufacturing processes, rather than changes in traditional technologies. This phenomenon has been driven by the desire for precise small-scale devices and product features and enabled by stable and declining material and energy prices over this period. We illustrate the relevance of thermodynamics (including exergy analysis) for all processes in spite of the fact that long-lasting focus in manufacturing has been on product quality--not necessarily energy/material conversion efficiency. We promote the use of thermodynamics tools for analysis of manufacturing processes within the context of rapidly increasing relevance of sustainable human enterprises. We confirm that exergy analysis can be used to identify where resources are lost in these processes, which is the first step in proposing and/or redesigning new more efficient processes.
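The specific-energy measure used in this comparison is simply power divided by material throughput. A toy calculation with illustrative figures (not the authors' data) shows how a low-throughput vapor-phase process can dwarf conventional machining:

```python
def specific_energy(power_w, throughput_kg_per_h):
    """Energy used per kg of material processed (J/kg)."""
    return power_w / (throughput_kg_per_h / 3600.0)

# Hypothetical figures: a machining center vs. a vapor-phase process.
milling = specific_energy(10_000.0, 50.0)   # ~7.2e5 J/kg
cvd = specific_energy(5_000.0, 0.001)       # ~1.8e10 J/kg
print(cvd / milling)  # about four orders of magnitude apart
```

Even these made-up numbers illustrate the paper's point: the intensity gap is driven by throughput, not by raw power consumption.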
Thermochemical Conversion Techno-Economic Analysis | Bioenergy | NREL
NREL's Thermochemical Conversion Analysis team focuses on conceptual process design and techno-economic analysis (TEA). The detailed process models and TEA developed under this project provide insights into the potential economics of thermochemical conversion processes.
Structured Analysis and the Data Flow Diagram: Tools for Library Analysis.
ERIC Educational Resources Information Center
Carlson, David H.
1986-01-01
This article discusses tools developed to aid the systems analysis process (program evaluation and review technique, Gantt charts, organizational charts, decision tables, flowcharts, hierarchy plus input-process-output). Similarities and differences among techniques, library applications of analysis, structured systems analysis, and the data flow…
Meta-analysis using Dirichlet process.
Muthukumarana, Saman; Tiwari, Ram C
2016-02-01
This article develops a Bayesian approach for meta-analysis using the Dirichlet process. The key aspect of the Dirichlet process in meta-analysis is the ability to assess evidence of statistical heterogeneity or variation in the underlying effects across studies while relaxing the distributional assumptions. We assume that the study effects are generated from a Dirichlet process. Under a Dirichlet process model, the study effect parameters have support on a discrete space and enable borrowing of information across studies while facilitating clustering among studies. We illustrate the proposed method by applying it to a dataset on the Program for International Student Assessment on 30 countries. Results from the data analysis, simulation studies, and the log pseudo-marginal likelihood model selection procedure indicate that the Dirichlet process model performs better than conventional alternative methods. © The Author(s) 2012.
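The clustering of study effects described above follows from the discreteness of Dirichlet process draws. A toy stick-breaking sketch (truncated construction with an N(0,1) base measure; all parameters here are illustrative, not the authors' model):

```python
import numpy as np

def stick_breaking(alpha, n_atoms, rng):
    """Truncated stick-breaking construction of DP(alpha, N(0,1)):
    returns atom weights and atom locations."""
    betas = rng.beta(1.0, alpha, size=n_atoms)
    remaining = np.cumprod(1.0 - betas)
    weights = betas * np.r_[1.0, remaining[:-1]]
    atoms = rng.normal(0.0, 1.0, size=n_atoms)
    return weights, atoms

rng = np.random.default_rng(1)
w, atoms = stick_breaking(alpha=2.0, n_atoms=200, rng=rng)
# Study effects drawn from this discrete measure repeat atoms -> clustering.
effects = rng.choice(atoms, size=30, p=w / w.sum())
print(len(np.unique(effects)))  # fewer distinct values than studies: ties occur
```

Because the random measure is almost surely discrete, several of the 30 "studies" share an effect value, which is exactly the borrowing/clustering behavior the abstract describes.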
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-19
... performing the technical analysis, management assessment, and program evaluation tasks required to address... premarket reviews that meet regulatory review standards. 2. Analysis of elements of the review process... process. This includes analysis of root causes for inefficiencies that may affect review performance and...
NASA Technical Reports Server (NTRS)
Fegley, K. A.; Hayden, J. H.; Rehmann, D. W.
1974-01-01
The feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems is investigated. It is shown that a digital computer may be used in an interactive mode for the design, modeling, analysis, and comparison of power processing systems.
Code of Federal Regulations, 2014 CFR
2014-07-01
..., or interpretation of any geological data and information. Initial analysis and processing are the stages of analysis or processing where the data and information first become available for in-house... geochemical) data and information describing each operation of analysis, processing, and interpretation; (2...
Code of Federal Regulations, 2014 CFR
2014-07-01
... initial analysis, processing, or interpretation of any geological data and information. Initial analysis and processing are the stages of analysis or processing where the data and information first become... information are available for submission, inspection, and selection? 580.40 Section 580.40 Mineral Resources...
Code of Federal Regulations, 2011 CFR
2011-07-01
... complete the initial analysis, processing, or interpretation of any geological data and information. Initial analysis and processing are the stages of analysis or processing where the data and information... information are available for submission, inspection, and selection? 280.40 Section 280.40 Mineral Resources...
Yan, Binjun; Fang, Zhonghua; Shen, Lijuan; Qu, Haibin
2015-01-01
The batch-to-batch quality consistency of herbal drugs has always been an important issue. This study proposes a methodology for batch-to-batch quality control based on HPLC-MS fingerprints and a process knowledge base. The extraction process of Compound E-jiao Oral Liquid was taken as a case study. After the HPLC-MS fingerprint analysis method was established, fingerprints of the extract solutions produced under normal and abnormal operating conditions were obtained. Multivariate statistical models were built for fault detection, and a discriminant analysis model was built using the probabilistic discriminant partial least squares method for fault diagnosis. Based on multivariate statistical analysis, process knowledge was acquired and the cause-effect relationship between process deviations and quality defects was revealed. Quality defects were detected successfully by multivariate statistical control charts, and the types of process deviations were diagnosed correctly by discriminant analysis. This work demonstrates the benefits of combining HPLC-MS fingerprints, process knowledge, and multivariate analysis for the quality control of herbal drugs. Copyright © 2015 John Wiley & Sons, Ltd.
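Multivariate fault detection of the kind described above is commonly done with a Hotelling T² statistic on fingerprint features. A minimal sketch under assumed data (the peak areas and batch counts are hypothetical, and T² is only one of several possible control statistics):

```python
import numpy as np

def hotelling_t2(X_ref, x_new):
    """Hotelling T^2 of new observation(s) against reference batches."""
    mu = X_ref.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X_ref, rowvar=False))
    d = np.atleast_2d(x_new) - mu
    # Quadratic form d @ cov_inv @ d.T, one value per row of x_new.
    return np.einsum('ij,jk,ik->i', d, cov_inv, d)

rng = np.random.default_rng(3)
# Hypothetical fingerprint peak areas for 50 normal batches (4 peaks).
X_ref = rng.normal(100.0, 5.0, size=(50, 4))
normal = rng.normal(100.0, 5.0, size=4)
faulty = normal + np.array([0.0, 40.0, 0.0, 0.0])  # one shifted peak
t2_normal, t2_faulty = hotelling_t2(X_ref, np.vstack([normal, faulty]))
print(t2_faulty > t2_normal)  # the abnormal batch stands out
```

A batch whose T² exceeds a control limit (e.g. an F-distribution quantile) would be flagged for the diagnosis step.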
NASA Technical Reports Server (NTRS)
Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.
1994-01-01
Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.
Commercialization Development of Oxygen Fired CFB for Greenhouse Gas Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nsakala ya Nsakala; Gregory N. Liljedahl; David G. Turek
2007-03-31
Given that fossil fuel fired power plants are among the largest and most concentrated producers of CO{sub 2} emissions, recovery and sequestration of CO{sub 2} from the flue gas of such plants has been identified as one of the primary means for reducing anthropogenic (i.e., man-made) CO{sub 2} emissions. In 2001, ALSTOM Power Inc. (ALSTOM) began a two-phase program to investigate the feasibility of various carbon capture technologies. This program was sponsored under a Cooperative Agreement from the US Department of Energy's National Energy Technology Laboratory (DOE). The first phase entailed a comprehensive study evaluating the technical feasibility and economics of alternate CO{sub 2} capture technologies applied to Greenfield US coal-fired electric generation power plants. Thirteen cases, representing various levels of technology development, were evaluated. Seven cases represented coal combustion in CFB type equipment. Four cases represented Integrated Gasification Combined Cycle (IGCC) systems. Two cases represented advanced Chemical Looping Combined Cycle systems. Marion, et al. reported the details of this work in 2003. One of the thirteen cases studied utilized an oxygen-fired circulating fluidized bed (CFB) boiler. In this concept, the fuel is fired with a mixture of oxygen and recirculated flue gas (mainly CO{sub 2}). This combustion process yields a flue gas containing over 80 percent (by volume) CO{sub 2}. This flue gas can be processed relatively easily to enrich the CO{sub 2} content to over 96 percent for use in enhanced oil or gas recovery (EOR or EGR) or simply dried for sequestration. The Phase I study identified the O{sub 2}-fired CFB as having a near term development potential, because it uses conventional commercial CFB technology and commercially available CO{sub 2} capture enabling technologies such as cryogenic air separation and simple rectification or distillation gas processing systems. 
In the long term, air separation technology advancements offer significant reductions in power requirements, which would improve plant efficiency and economics for the oxygen-fired technology. The second phase consisted of pilot-scale testing followed by a refined performance and economic evaluation of the O{sub 2} fired CFB concept. As a part of this workscope, ALSTOM modified its 3 MW{sub th} (9.9 MMBtu/hr) Multiuse Test Facility (MTF) pilot plant to operate with O{sub 2}/CO{sub 2} mixtures of up to 70 percent O{sub 2} by volume. Tests were conducted with coal and petroleum coke. The test objectives were to determine the impacts of oxygen firing on heat transfer, bed dynamics, potential agglomeration, and gaseous and particulate emissions. The test data results were used to refine the design, performance, costs, and economic models developed in Phase-I for the O{sub 2}-fired CFB with CO{sub 2} capture. Nsakala, Liljedahl, and Turek reported results from this study in 2004. ALSTOM identified several items needing further investigation in preparation for large scale demonstration of the oxygen-fired CFB concept, namely: (1) Operation and performance of the moving bed heat exchanger (MBHE) to avoid recarbonation and also for cost savings compared to the standard bubbling fluid bed heat exchanger (FBHE); (2) Performance of the back-end flash dryer absorber (FDA) for sulfur capture under high CO{sub 2}/high moisture flue gas environment using calcined limestone in the fly ash and using fresh commercial lime directly in the FDA; (3) Determination of the effect of recarbonation on fouling in the convective pass; (4) Assessment of the impact of oxygen firing on the mercury, other trace elements, and volatile organic compound (VOC) emissions; and (5) Develop a proposal-level oxygen-fired retrofit design for a relatively small existing CFB steam power plant in preparation for a large-scale demonstration of the O{sub 2} fired CFB concept. 
Hence, ALSTOM responded to a DOE Solicitation to address all these issues with further O{sub 2}-fired MTF pilot testing and a subsequent retrofit design study of oxygen firing and CO{sub 2} capture on an existing air-fired CFB plant. ALSTOM received a contract award from the DOE to conduct a project entitled 'Commercialization Development of Oxygen Fired CFB for Greenhouse Gas Control', under Cooperative Agreement DE-FC26-04NT42205, which is the subject of this topical report.
Analysis of acoustic emission signals and monitoring of machining processes
Govekar; Gradisek; Grabec
2000-03-01
Monitoring of a machining process on the basis of sensor signals requires a selection of informative inputs in order to reliably characterize and model the process. In this article, a system for selection of informative characteristics from signals of multiple sensors is presented. For signal analysis, methods of spectral analysis and methods of nonlinear time series analysis are used. With the aim of modeling relationships between signal characteristics and the corresponding process state, an adaptive empirical modeler is applied. The application of the system is demonstrated by characterization of different parameters defining the states of a turning machining process, such as: chip form, tool wear, and onset of chatter vibration. The results show that, in spite of the complexity of the turning process, the state of the process can be well characterized by just a few proper characteristics extracted from a representative sensor signal. The process characterization can be further improved by joining characteristics from multiple sensors and by application of chaotic characteristics.
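A common way to extract spectral characteristics from a sensor signal, as in the monitoring system above, is to summarize power in frequency bands. A minimal sketch (the band edges and the synthetic signal are illustrative assumptions, not the article's acoustic emission data):

```python
import numpy as np

def band_powers(signal, fs, bands):
    """Mean power spectral density of `signal` in each (lo, hi) Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    return [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in bands]

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
# Hypothetical sensor trace: a strong 50 Hz component plus weak noise.
sig = np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.default_rng(4).normal(size=t.size)
low, high = band_powers(sig, fs, [(40, 60), (200, 400)])
print(low > high)  # energy concentrates in the band containing 50 Hz
```

Band powers like these, possibly joined with nonlinear (chaotic) characteristics, would form the feature vector fed to the empirical modeler.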
Assessing Group Interaction with Social Language Network Analysis
NASA Astrophysics Data System (ADS)
Scholand, Andrew J.; Tausczik, Yla R.; Pennebaker, James W.
In this paper we discuss a new methodology, social language network analysis (SLNA), that combines tools from social language processing and network analysis to assess socially situated working relationships within a group. Specifically, SLNA aims to identify and characterize the nature of working relationships by processing artifacts generated with computer-mediated communication systems, such as instant message texts or emails. Because social language processing is able to identify psychological, social, and emotional processes that individuals are not able to fully mask, social language network analysis can clarify and highlight complex interdependencies between group members, even when these relationships are latent or unrecognized.
Process mining techniques: an application to time management
NASA Astrophysics Data System (ADS)
Khowaja, Ali Raza
2018-04-01
In any environment, people must ensure that all of their work is completed within a given time and to a given quality. To realize the potential of process mining, these processes need to be understood in detail. Personal information and communication on the internet has long been a prominent issue; information and communication tools in daily life record schedules, location, and environmental context, and social media applications make such data available for analysis through event logs, supporting both data analysis and process analysis that combines environmental and location information. Process mining can exploit these real-life processes with the help of event logs already available in such datasets, whether user-censored or user-labeled. These techniques can be used to redesign a user's workflow and to understand the underlying processes in more detail: to improve the quality of the processes of daily life, one examines each process closely and, after analysis, makes changes to obtain better results. In this work, we applied process mining techniques to a single dataset, collected in Korea, covering seven subjects. The paper comments on the efficiency of the processes in the event logs with respect to time management.
Statistical process control methods allow the analysis and improvement of anesthesia care.
Fasting, Sigurd; Gisvold, Sven E
2003-10-01
Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
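The p-charts referred to above place control limits three standard errors around the overall event proportion, with limits that widen for smaller subgroups. A minimal sketch on hypothetical monthly counts (not the study's data):

```python
import numpy as np

def p_chart_limits(events, totals):
    """Per-period 3-sigma control limits for a proportion (p-chart)."""
    events, totals = np.asarray(events, float), np.asarray(totals, float)
    p_bar = events.sum() / totals.sum()           # centre line
    se = np.sqrt(p_bar * (1.0 - p_bar) / totals)  # varies with subgroup size
    ucl = np.clip(p_bar + 3.0 * se, 0.0, 1.0)
    lcl = np.clip(p_bar - 3.0 * se, 0.0, 1.0)
    return p_bar, lcl, ucl

# Hypothetical: ~20% adverse event rate, one clearly out-of-control month.
events = [90, 85, 95, 160, 88]
totals = [500, 480, 510, 500, 495]
p_bar, lcl, ucl = p_chart_limits(events, totals)
rates = np.array(events) / np.array(totals)
print(np.where(rates > ucl)[0])  # index of the out-of-control month
```

Points inside the limits represent the predictable common-cause variation; a point outside them (month 3 here) signals a special cause worth investigating, which is how the study distinguishes stable from unstable processes.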
NASA Technical Reports Server (NTRS)
Singh, S. P.
1979-01-01
The computer software developed to set up a method for Wiener spectrum analysis of photographic films is presented. This method is used for the quantitative analysis of the autoradiographic enhancement process. The software requirements and design for the autoradiographic enhancement process are given along with the program listings and the users manual. A software description and program listings modification of the data analysis software are included.
Logistics Process Analysis Tool (LPAT)
DOE Office of Scientific and Technical Information (OSTI.GOV)
2008-03-31
LPAT is the integrated system combining the ANL-developed Enhanced Logistics Intra Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.
Code of Federal Regulations, 2011 CFR
2011-07-01
...) refrigerant to be returned to a refrigerant reclamation facility that will process it to the appropriate ARI... and Assembly Processes (Process FMEA) and Effects Analysis for Machinery (Machinery FMEA). SAE... Manufacturing and Assembly Processes (Process FMEA), and Potential Failure Mode and Effects Analysis for...
An empirical analysis of the distribution of overshoots in a stationary Gaussian stochastic process
NASA Technical Reports Server (NTRS)
Carter, M. C.; Madison, M. W.
1973-01-01
The frequency distribution of overshoots in a stationary Gaussian stochastic process is analyzed. The primary tools in this analysis are computer simulation and statistical estimation. Computer simulation is used to generate stationary Gaussian stochastic processes with selected autocorrelation functions. An analysis of the simulation results reveals a frequency distribution for overshoots with a functional dependence on the mean and variance of the process. Statistical estimation is then used to estimate the mean and variance of a process. It is shown that, given an autocorrelation function and the mean and variance of the number of overshoots, a frequency distribution for overshoots can be estimated.
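The simulation step described above can be sketched by generating a stationary Gaussian process with a chosen autocorrelation and counting upward level crossings; an AR(1) surrogate (our assumption, not the report's autocorrelation functions) keeps the example short:

```python
import numpy as np

def count_upcrossings(x, level):
    """Number of upward crossings of `level` (overshoot starts) in series x."""
    above = x >= level
    return int(np.sum(~above[:-1] & above[1:]))

def ar1_gaussian(n, rho, rng):
    """Stationary Gaussian AR(1) with autocorrelation rho**k, unit variance."""
    x = np.empty(n)
    x[0] = rng.normal()
    noise = rng.normal(0.0, np.sqrt(1.0 - rho**2), size=n)
    for i in range(1, n):
        x[i] = rho * x[i - 1] + noise[i]
    return x

rng = np.random.default_rng(2)
x = ar1_gaussian(20_000, rho=0.9, rng=rng)
# Higher thresholds are crossed less often, as the frequency analysis expects.
print(count_upcrossings(x, 1.0) > count_upcrossings(x, 2.0) > 0)
```

Tabulating overshoot counts across many simulated realizations, levels, and autocorrelation functions is what yields the empirical frequency distribution the report describes.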
NASA Technical Reports Server (NTRS)
Casasent, D.
1978-01-01
The article discusses several optical configurations used for signal processing. Electronic-to-optical transducers are outlined, noting fixed window transducers and moving window acousto-optic transducers. Folded spectrum techniques are considered, with reference to wideband RF signal analysis, fetal electroencephalogram analysis, engine vibration analysis, signal buried in noise, and spatial filtering. Various methods for radar signal processing are described, such as phased-array antennas, the optical processing of phased-array data, pulsed Doppler and FM radar systems, a multichannel one-dimensional optical correlator, correlations with long coded waveforms, and Doppler signal processing. Means for noncoherent optical signal processing are noted, including an optical correlator for speech recognition and a noncoherent optical correlator.
Conducting qualitative research in mental health: Thematic and content analyses.
Crowe, Marie; Inder, Maree; Porter, Richard
2015-07-01
The objective of this paper is to describe two methods of qualitative analysis - thematic analysis and content analysis - and to examine their use in a mental health context. A description of the processes of thematic analysis and content analysis is provided. These processes are then illustrated by conducting two analyses of the same qualitative data. Transcripts of qualitative interviews are analysed using each method to illustrate these processes. The illustration of the processes highlights the different outcomes from the same set of data. Thematic and content analyses are qualitative methods that serve different research purposes. Thematic analysis provides an interpretation of participants' meanings, while content analysis is a direct representation of participants' responses. These methods provide two ways of understanding meanings and experiences and provide important knowledge in a mental health context. © The Royal Australian and New Zealand College of Psychiatrists 2015.
Canister Storage Building (CSB) Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
POWERS, T.B.
2000-03-16
This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis process identified hazardous conditions and material-at-risk, determined causes for potential accidents, identified preventive and mitigative features, and qualitatively estimated the frequencies and consequences of specific occurrences. The hazard analysis was performed by a team of cognizant CSB operations and design personnel, safety analysts familiar with the CSB, and technical experts in specialty areas. The material included in this report documents the final state of a nearly two-year-long process involving formal facilitated group sessions and independent hazard and accident analysis work. Attachment A provides two lists of hazard analysis team members and describes the background and experience of each: the first is a complete list of the team members involved over the two-year process; the second is the subset of those members who reviewed and agreed to the final hazard analysis documentation. The hazard analysis process led to the selection of candidate accidents for further quantitative analysis. New information relative to the hazards, discovered during the accident analysis, was incorporated into the hazard analysis data in order to compile a complete profile of facility hazards. 
Through this process, the results of the hazard and accident analyses led directly to the identification of safety structures, systems, and components, technical safety requirements, and other controls required to protect the public, workers, and environment.
Neurophysiological analysis of echolocation in bats
NASA Technical Reports Server (NTRS)
Suga, N.
1972-01-01
An analysis of echolocation and signal processing in brown bats is presented. Data cover echo detection, echo ranging, echolocalization, and echo analysis. Efforts were also made to identify the part of the brain that carries out the most essential processing function for echolocation. Results indicate the inferior colliculus and the auditory nuclei function together to process this information.
National Job Corps Study: Report on the Process Analysis. Research and Evaluation Report Series.
ERIC Educational Resources Information Center
Johnson, Terry; Gritz, Mark; Jackson, Russell; Burghardt, John; Boussy, Carol; Leonard, Jan; Orians, Carlyn
This report presents results of a process analysis that describes and documents Job Corps services and operations. Chapter one provides overviews of Job Corps, the national Job Corps study, and the process analysis. Chapter two describes the administrative structure of Job Corps and presents data on the geographic distribution and characteristics…
Research on the raw data processing method of the hydropower construction project
NASA Astrophysics Data System (ADS)
Tian, Zhichao
2018-01-01
Based on the characteristics of fixed quota data, this paper compares various mathematical-statistical analysis methods and selects an improved Grubbs criterion to analyze the data and screen out values unsuitable for processing. It is shown that this method can be applied to the processing of raw fixed quota data, providing a reference for reasonably determining effective quota analysis data.
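As an illustration of the kind of outlier screening an improved Grubbs criterion builds on, the classical Grubbs test statistic for the most extreme value in a sample can be computed as follows (the data values here are invented for illustration, not taken from the paper):

```python
import math

def grubbs_statistic(values):
    """Grubbs' test statistic G for the most extreme value in a sample."""
    n = len(values)
    mean = sum(values) / n
    # Sample standard deviation (n - 1 in the denominator)
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return max(abs(v - mean) for v in values) / sd

# A small dataset with one suspect reading (120.0)
data = [10.1, 9.8, 10.3, 10.0, 9.9, 120.0, 10.2]
G = grubbs_statistic(data)
print(round(G, 3))
```

In practice G would be compared against a critical value (depending on sample size and significance level) to decide whether the extreme observation should be rejected before further quota analysis.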
Usage of information safety requirements in improving tube bending process
NASA Astrophysics Data System (ADS)
Livshitz, I. I.; Kunakov, E.; Lontsikh, P. A.
2018-05-01
This article addresses the improvement of technological process analysis through the implementation of information security requirements. The aim of this research is to analyze how information technology implementation can increase the competitiveness of aircraft industry enterprises, using the tube bending technological process as an example. The article analyzes tube bending types and current techniques. In addition, the potential risks in the tube bending technological process are analyzed from an information security perspective.
Real-time fMRI processing with physiological noise correction - Comparison with off-line analysis.
Misaki, Masaya; Barzigar, Nafise; Zotev, Vadim; Phillips, Raquel; Cheng, Samuel; Bodurka, Jerzy
2015-12-30
While applications of real-time functional magnetic resonance imaging (rtfMRI) are growing rapidly, there are still limitations in real-time data processing compared to off-line analysis. We developed a proof-of-concept real-time fMRI processing (rtfMRIp) system utilizing a personal computer (PC) with a dedicated graphics processing unit (GPU) to demonstrate that it is now possible to perform intensive whole-brain fMRI data processing in real time. The rtfMRIp performs slice-timing correction, motion correction, spatial smoothing, signal scaling, and general linear model (GLM) analysis with multiple noise regressors, including physiological noise modeled with cardiac (RETROICOR) and respiration volume per time (RVT) regressors. The whole-brain data analysis, with more than 100,000 voxels and more than 250 volumes, is completed in less than 300 ms, much faster than the time required to acquire an fMRI volume. Real-time processing implementation cannot be identical to off-line analysis when time-course information is used, such as in slice-timing correction, signal scaling, and GLM analysis. We verified that the reduced slice-timing correction used for real-time analysis produced output comparable to off-line analysis. The real-time GLM analysis, however, showed over-fitting when the number of sampled volumes was small. Our system implemented real-time RETROICOR and RVT physiological noise corrections for the first time, and it is capable of processing these steps on all available data at a given time, without need for recursive algorithms. Comprehensive data processing in rtfMRI is possible with a PC, although the number of samples should be considered in real-time GLM analysis. Copyright © 2015 Elsevier B.V. All rights reserved.
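The GLM step described here amounts to a least-squares fit of each voxel's time course against task and nuisance regressors. A minimal sketch of that idea, with invented regressors and a simulated voxel (this is a toy illustration, not the authors' GPU implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
n_vols = 250                                          # number of fMRI volumes
task = np.sin(np.linspace(0, 6 * np.pi, n_vols))      # toy task regressor
drift = np.linspace(-1, 1, n_vols)                    # slow scanner drift
cardiac = np.cos(np.linspace(0, 40 * np.pi, n_vols))  # stand-in cardiac (RETROICOR-like) regressor

# Simulated voxel time course: task effect + nuisance signals + noise
y = 2.0 * task + 0.5 * drift + 0.8 * cardiac + 0.1 * rng.standard_normal(n_vols)

# Design matrix: intercept, task, and nuisance regressors
X = np.column_stack([np.ones(n_vols), task, drift, cardiac])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta[1], 2))   # estimated task effect, close to the true 2.0
```

With many fewer volumes the same fit becomes unstable, which is the over-fitting behavior the abstract reports for small real-time sample counts.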
A framework supporting the development of a Grid portal for analysis based on ROI.
Ichikawa, K; Date, S; Kaishima, T; Shimojo, S
2005-01-01
In our research on brain function analysis, users require two different simultaneous types of processing: interactive processing of a specific part of the data and high-performance batch processing of an entire dataset. The difference between these two types of processing lies in whether or not the analysis applies to data in the region of interest (ROI). In this study, we propose a Grid portal that has a mechanism to freely assign computing resources to users in a Grid environment according to these two different processing requirements. We constructed a Grid portal that integrates interactive processing and batch processing through the following two mechanisms. First, a job steering mechanism controls job execution based on user-tagged priority among organizations with heterogeneous computing resources; interactive jobs are processed in preference to batch jobs by this mechanism. Second, a priority-based result delivery mechanism administrates a ranking of data significance. The portal ensures the turn-around time of interactive processing through the priority-based job controlling mechanism and provides users with quality of service (QoS) for interactive processing. Users can access the analysis results of interactive jobs in preference to those of batch jobs. The Grid portal has also achieved high-performance computation for MEG analysis with batch processing in the Grid environment. The priority-based job controlling mechanism makes it possible to freely assign computing resources to users' requirements, and the achievement of high-performance computation contributes greatly to the overall progress of brain science. The portal has thus made it possible for users to flexibly apply large computational power to what they want to analyze.
Optimization of Parameter Ranges for Composite Tape Winding Process Based on Sensitivity Analysis
NASA Astrophysics Data System (ADS)
Yu, Tao; Shi, Yaoyao; He, Xiaodong; Kang, Chao; Deng, Bo; Song, Shibo
2017-08-01
This study focuses on the parameter sensitivity of the winding process for composite prepreg tape. Methods of multi-parameter relative sensitivity analysis and single-parameter sensitivity analysis are proposed. A polynomial empirical model of interlaminar shear strength is established by the response surface experimental method. Using this model, the relative sensitivity of key process parameters, including temperature, tension, pressure and velocity, is calculated, and the single-parameter sensitivity curves are obtained. According to the analysis of the sensitivity curves, the stability and instability ranges of each parameter are recognized. Finally, an optimization method for the winding process parameters is developed. The analysis results show that the optimized ranges of the process parameters for interlaminar shear strength are: temperature within [100 °C, 150 °C], tension within [275 N, 387 N], pressure within [800 N, 1500 N], and velocity within [0.2 m/s, 0.4 m/s].
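Single-parameter sensitivity from a response-surface polynomial is essentially a partial derivative of the fitted model, which can be approximated numerically. A minimal sketch, assuming an invented second-order surface (the coefficients below are illustrative, not the paper's fitted model):

```python
# Hypothetical second-order response surface for interlaminar shear strength
# s(T, F) in temperature T (deg C) and tension F (N); coefficients invented
# so that strength peaks near the middle of the optimized parameter ranges.
def strength(T, F):
    return 30.0 + 0.10 * T - 4e-4 * T**2 + 0.02 * F - 2.5e-5 * F**2

def sensitivity(f, x0, i, h=1e-3):
    """Central-difference single-parameter sensitivity df/dx_i at point x0."""
    hi, lo = list(x0), list(x0)
    hi[i] += h
    lo[i] -= h
    return (f(*hi) - f(*lo)) / (2 * h)

# Near the center of the optimized range (T = 125 degC) the temperature
# sensitivity vanishes, i.e. the process sits in a stable region.
s_T = sensitivity(strength, [125.0, 330.0], 0)
print(abs(s_T) < 1e-6)
```

Scanning such a derivative across each parameter's range is one way to recognize the stability and instability intervals the abstract refers to.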
Biomedical image analysis and processing in clouds
NASA Astrophysics Data System (ADS)
Bednarz, Tomasz; Szul, Piotr; Arzhaeva, Yulia; Wang, Dadong; Burdett, Neil; Khassapov, Alex; Chen, Shiping; Vallotton, Pascal; Lagerstrom, Ryan; Gureyev, Tim; Taylor, John
2013-10-01
The Cloud-Based Image Analysis and Processing Toolbox project runs on the Australian National eResearch Collaboration Tools and Resources (NeCTAR) cloud infrastructure and gives researchers access to biomedical image processing and analysis services via remotely accessible user interfaces. By providing user-friendly access to cloud computing resources and new workflow-based interfaces, our solution enables researchers to carry out various challenging image analysis and reconstruction tasks. Several case studies will be presented during the conference.
Problem Based Learning: Cognitive and Metacognitive Processes during Problem Analysis.
ERIC Educational Resources Information Center
De Grave, W. S.; And Others
1996-01-01
To investigate whether problem-based learning leads to conceptual change, the cognitive and metacognitive processes of a group of medical students were studied during the problem analysis phase, and their verbal communication and thinking processes were analyzed. Stimulated recall of the thinking process during the discussion detected a conceptual…
Multi-Disciplinary, Multi-Fidelity Discrete Data Transfer Using Degenerate Geometry Forms
NASA Technical Reports Server (NTRS)
Olson, Erik D.
2016-01-01
In a typical multi-fidelity design process, different levels of geometric abstraction are used for different analysis methods, and transitioning from one phase of design to the next often requires a complete re-creation of the geometry. To maintain consistency between lower-order and higher-order analysis results, Vehicle Sketch Pad (OpenVSP) recently introduced the ability to generate and export several degenerate forms of the geometry, representing the type of abstraction required to perform low- to medium-order analysis for a range of aeronautical disciplines. In this research, the functionality of these degenerate models was extended, so that in addition to serving as repositories for the geometric information that is required as input to an analysis, the degenerate models can also store the results of that analysis mapped back onto the geometric nodes. At the same time, the results are also mapped indirectly onto the nodes of lower-order degenerate models using a process called aggregation, and onto higher-order models using a process called disaggregation. The mapped analysis results are available for use by any subsequent analysis in an integrated design and analysis process. A simple multi-fidelity analysis process for a single-aisle subsonic transport aircraft is used as an example case to demonstrate the value of the approach.
Carroll, Adam J; Badger, Murray R; Harvey Millar, A
2010-07-14
Standardization of analytical approaches and reporting methods via community-wide collaboration can work synergistically with web-tool development to result in rapid community-driven expansion of online data repositories suitable for data mining and meta-analysis. In metabolomics, the inter-laboratory reproducibility of gas-chromatography/mass-spectrometry (GC/MS) makes it an obvious target for such development. While a number of web-tools offer access to datasets and/or tools for raw data processing and statistical analysis, none of these systems are currently set up to act as a public repository by easily accepting, processing and presenting publicly submitted GC/MS metabolomics datasets for public re-analysis. Here, we present MetabolomeExpress, a new File Transfer Protocol (FTP) server and web-tool for the online storage, processing, visualisation and statistical re-analysis of publicly submitted GC/MS metabolomics datasets. Users may search a quality-controlled database of metabolite response statistics from publicly submitted datasets by a number of parameters (eg. metabolite, species, organ/biofluid etc.). Users may also perform meta-analysis comparisons of multiple independent experiments or re-analyse public primary datasets via user-friendly tools for t-test, principal components analysis, hierarchical cluster analysis and correlation analysis. They may interact with chromatograms, mass spectra and peak detection results via an integrated raw data viewer. Researchers who register for a free account may upload (via FTP) their own data to the server for online processing via a novel raw data processing pipeline. MetabolomeExpress https://www.metabolome-express.org provides a new opportunity for the general metabolomics community to transparently present online the raw and processed GC/MS data underlying their metabolomics publications. 
Transparent sharing of these data will allow researchers to assess data quality and draw their own insights from published metabolomics datasets.
Viscoelastic properties of chalcogenide glasses and the simulation of their molding processes
NASA Astrophysics Data System (ADS)
Liu, Weiguo; Shen, Ping; Jin, Na
In order to simulate the precision molding process, the viscoelastic properties of chalcogenide glasses at high temperatures were investigated. Thermomechanical analyses were performed to measure the thermomechanical properties of the chalcogenide glasses, and the creep responses of the glasses at different temperatures were obtained. Finite element analysis was applied to simulate the molding processes. The simulation results were consistent with previously reported experimental results. Stress concentration and evolution during the molding processes were also described by the simulation results.
Microscopic Evaluation of Friction Plug Welds- Correlation to a Processing Analysis
NASA Technical Reports Server (NTRS)
Rabenberg, Ellen M.; Chen, Poshou; Gorti, Sridhar
2017-01-01
Recently an analysis of dynamic forge load data from the friction plug weld (FPW) process and the corresponding tensile test results showed that good plug welds fit well within an analytically determined processing parameter box. There were, however, some outliers that compromised the predictions. Here the microstructure of the plug weld material is presented in view of the load analysis with the intent of further understanding the FPW process and how it is affected by the grain structure and subsequent mechanical properties.
The application of digital techniques to the analysis of metallurgical experiments
NASA Technical Reports Server (NTRS)
Rathz, T. J.
1977-01-01
The application of a specific digital computer system (known as the Image Data Processing System) to the analysis of three NASA-sponsored metallurgical experiments is discussed in some detail. The basic hardware and software components of the Image Data Processing System are presented. Many figures are presented in the discussion of each experimental analysis in an attempt to show the accuracy and speed that the Image Data Processing System affords in analyzing photographic images dealing with metallurgy, and in particular with material processing.
Vision-sensing image analysis for GTAW process control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Long, D.D.
1994-11-01
Image analysis of a gas tungsten arc welding (GTAW) process was completed using video images from a charge coupled device (CCD) camera inside a specially designed coaxial (GTAW) electrode holder. Video data was obtained from filtered and unfiltered images, with and without the GTAW arc present, showing weld joint features and locations. Data Translation image processing boards, installed in an IBM PC AT 386 compatible computer, and Media Cybernetics image processing software were used to investigate edge flange weld joint geometry for image analysis.
Online Analysis Enhances Use of NASA Earth Science Data
NASA Technical Reports Server (NTRS)
Acker, James G.; Leptoukh, Gregory
2007-01-01
Giovanni, the Goddard Earth Sciences Data and Information Services Center (GES DISC) Interactive Online Visualization and Analysis Infrastructure, has provided researchers with advanced capabilities to perform data exploration and analysis with observational data from NASA Earth observation satellites. In the past 5-10 years, examining geophysical events and processes with remote-sensing data required a multistep process of data discovery, data acquisition, data management, and ultimately data analysis. Giovanni accelerates this process by enabling basic visualization and analysis directly on the World Wide Web. In the last two years, Giovanni has added new data acquisition functions and expanded analysis options to increase its usefulness to the Earth science research community.
Sensitivity Analysis in RIPless Compressed Sensing
2014-10-01
The compressive sensing framework finds a wide range of applications in signal processing and analysis. Within this framework, this report ("Analysis of Compressive Sensing Solutions") studies the sensitivity of compressed sensing. More specifically, it shows that in a noiseless and RIP-less setting [11], the recovery process of a compressed sensing framework is…
Self-conscious robotic system design process--from analysis to implementation.
Chella, Antonio; Cossentino, Massimo; Seidita, Valeria
2011-01-01
Developing robotic systems endowed with self-conscious capabilities means realizing complex sub-systems that need ad-hoc software engineering techniques for their modelling, analysis and implementation. In this chapter the whole process (from analysis to implementation) for modelling the development of self-conscious robotic systems is presented, and the newly created design process, PASSIC, which supports each part of it, is fully illustrated.
STATISTICAL ANALYSIS OF SNAP 10A THERMOELECTRIC CONVERTER ELEMENT PROCESS DEVELOPMENT VARIABLES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitch, S.H.; Morris, J.W.
1962-12-15
Statistical analysis, primarily analysis of variance, was applied to evaluate several factors involved in the development of suitable fabrication and processing techniques for the production of lead telluride thermoelectric elements for the SNAP 10A energy conversion system. The analysis methods are described as applied to determining the effects of various processing steps, establishing the value of individual operations, and evaluating the significance of test results. The elimination of unnecessary or detrimental processing steps was accomplished, and the number of required tests was substantially reduced, by applying these statistical methods to the SNAP 10A production development effort.
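The core of such an analysis of variance is the F statistic: the ratio of between-group to within-group variance for a candidate processing factor. A minimal one-way sketch (the output numbers below are invented, not SNAP 10A data):

```python
# One-way ANOVA F statistic, as might be used to judge whether changing a
# processing step affects thermoelectric element output (numbers invented).
def f_statistic(groups):
    k = len(groups)                        # number of treatment groups
    n = sum(len(g) for g in groups)        # total observations
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

step_a = [5.1, 5.3, 5.0, 5.2]   # element output with processing step A
step_b = [4.2, 4.4, 4.3, 4.1]   # output with step A replaced by step B
print(round(f_statistic([step_a, step_b]), 1))
```

A large F relative to the critical value indicates the processing step has a significant effect; a small F marks a step whose elimination would not degrade the product, which is how unnecessary operations can be screened out.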
NASA Technical Reports Server (NTRS)
Consoli, Robert David; Sobieszczanski-Sobieski, Jaroslaw
1990-01-01
Advanced multidisciplinary analysis and optimization methods, namely system sensitivity analysis and non-hierarchical system decomposition, are applied to reduce the cost and improve the visibility of an automated vehicle design synthesis process. This process is inherently complex due to the large number of functional disciplines and associated interdisciplinary couplings. Recent developments in system sensitivity analysis as applied to complex non-hierarchic multidisciplinary design optimization problems enable the decomposition of these complex interactions into sub-processes that can be evaluated in parallel. The application of these techniques results in significant cost, accuracy, and visibility benefits for the entire design synthesis process.
NASA Astrophysics Data System (ADS)
Aprilia, Ayu Rizky; Santoso, Imam; Ekasari, Dhita Murita
2017-05-01
Yogurt is a milk-based product with beneficial effects on health. The yogurt production process is highly susceptible to failure because it involves bacteria and fermentation. For an industry, these risks may cause harm and have a negative impact. For a product to be successful and profitable, the risks that may occur during the production process must be analyzed. Risk analysis can identify risks in detail and both prevent them and determine their handling, so that the risks can be minimized. This study therefore analyzes the risks of the production process with a case study in CV.XYZ. The methods used in this research are Fuzzy Failure Mode and Effect Analysis (fuzzy FMEA) and Fault Tree Analysis (FTA). The results showed that there are 6 risks arising from equipment variables, raw material variables, and process variables. These include the critical risk of a lack of asepsis in the process, specifically damage to the yogurt starter caused by contamination with fungus or other bacteria, and a lack of equipment sanitation. The quantitative FTA results showed that the highest probability is that of the lack of an aseptic process, with a risk of 3.902%. The recommendations for improvement include establishing SOPs (Standard Operating Procedures) covering the process, workers, and environment; controlling the yogurt starter; and improving production planning and equipment sanitation using hot water immersion.
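Quantitative fault tree analysis of the kind used here combines basic-event probabilities through AND/OR gates. A minimal sketch with independent events and hypothetical probabilities (the numbers are invented, not the study's 3.902% inputs):

```python
def and_gate(probs):
    """Top event occurs only if all independent basic events occur."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """Top event occurs if at least one independent basic event occurs."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical numbers: a 'lack of aseptic process' top event fed by an OR
# of starter-culture contamination and poor equipment sanitation.
p_top = or_gate([0.025, 0.015])
print(round(p_top, 4))
```

Minimal cut sets of the tree then show which combinations of basic events dominate the top-event probability, which is what points the recommendations at the starter culture and sanitation controls.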
ERIC Educational Resources Information Center
Ho, Hsuan-Fu; Hung, Chia-Chi
2008-01-01
Purpose: The purpose of this paper is to examine how a graduate institute at National Chiayi University (NCYU), by using a model that integrates analytic hierarchy process, cluster analysis and correspondence analysis, can develop effective marketing strategies. Design/methodology/approach: This is primarily a quantitative study aimed at…
Identifying influential factors of business process performance using dependency analysis
NASA Astrophysics Data System (ADS)
Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank
2011-02-01
We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain a deeper knowledge why certain KPI targets are not met.
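The framework described here learns dependency trees over process and QoS metrics; as a much simpler stand-in for that idea, influential factors can be ranked by their correlation with the KPI. The metric names and data below are invented for illustration, and correlation ranking is a deliberate simplification of the paper's tree construction:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
# Toy process/QoS metrics (names illustrative, not from the paper)
service_latency = rng.normal(200, 40, n)   # ms
queue_length = rng.normal(10, 3, n)
cpu_load = rng.normal(0.5, 0.1, n)

# KPI: e.g. order fulfilment time, driven mostly by service latency
kpi = 1.5 * service_latency + 5.0 * queue_length + rng.normal(0, 20, n)

metrics = {"service_latency": service_latency,
           "queue_length": queue_length,
           "cpu_load": cpu_load}
# Rank metrics by absolute correlation with the KPI
ranked = sorted(metrics,
                key=lambda m: abs(np.corrcoef(metrics[m], kpi)[0, 1]),
                reverse=True)
print(ranked[0])   # the most influential factor
```

A decision-tree learner, as in the paper, additionally captures threshold structure and interactions among metrics, which a flat correlation ranking cannot, but the drill-down intuition is the same.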
Zeng, Rui; Fu, Juan; Wu, La-Bin; Huang, Lin-Fang
2013-07-01
To analyze the components of Citrus reticulata and salt-processed C. reticulata by ultra-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry (UPLC-Q-TOF/MS), and to compare the changes in components before and after processing with salt. Principal component analysis (PCA) and orthogonal partial least squares discriminant analysis (OPLS-DA) were adopted to analyze the differences in fingerprints between crude and processed C. reticulata, showing increased contents of eriocitrin, limonin, nomilin and obacunone in salt-processed C. reticulata. Potential chemical markers were identified as limonin, obacunone and nomilin, which could be used as index components for distinguishing crude from processed C. reticulata.
NASA Astrophysics Data System (ADS)
Han, Zhenyu; Sun, Shouzheng; Fu, Yunzhong; Fu, Hongya
2017-10-01
Viscosity is an important physical indicator for assessing the fluidity of resin, which helps the resin contact the fibers effectively and reduces manufacturing defects during the automated fiber placement (AFP) process. However, the effect of processing parameters on viscosity evolution during the AFP process is rarely studied. In this paper, viscosities at different scales are analyzed based on a multi-scale analysis method. First, the viscous dissipation energy (VDE) within a meso-unit under different processing parameters is assessed using the finite element method (FEM). According to a multi-scale energy transfer model, the meso-unit energy is used as the boundary condition for microscopic analysis. The molecular structure of the micro-system is then built by the molecular dynamics (MD) method, and viscosity curves are obtained by integrating the stress autocorrelation function (SACF) over time. Finally, the correlation of processing parameters to viscosity is revealed using the gray relational analysis method (GRAM). A group of processing parameters is identified that achieves stable viscosity and better fluidity of the resin.
Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0
Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; ...
2008-01-01
The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments present a challenge to managing and processing the information. Simply characterizing the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We give examples of large-scale analysis results and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.
Implementing EVM Data Analysis Adding Value from a NASA Project Manager's Perspective
NASA Technical Reports Server (NTRS)
Counts, Stacy; Kerby, Jerald
2006-01-01
Data analysis is one of the keys to an effective Earned Value Management (EVM) process. Project managers (PMs) must continually evaluate data in assessing the health of their projects, and good analysis of data can assist PMs in making better decisions in managing projects. To better support our PMs, the National Aeronautics and Space Administration (NASA) Marshall Space Flight Center (MSFC) recently renewed its emphasis on sound EVM data analysis practices and processes. During this presentation we will discuss the approach that MSFC followed in implementing better data analysis across its Center, and address our approach to effectively equipping and supporting our projects in applying a sound data analysis process. In addition, the PM for the Space Station Biological Research Project will share her experiences of how effective data analysis can benefit a PM in the decision-making process. The PM will discuss how the emphasis on data analysis has helped create a solid method for assessing the project's performance. Used successfully, data analysis can be an effective and efficient tool in today's environment of increasing workloads and downsizing workforces.
NASA Technical Reports Server (NTRS)
Bonine, Lauren
2015-01-01
The presentation provides insight into the schedule risk analysis process used by the Stratospheric Aerosol and Gas Experiment III on the International Space Station Project. The presentation focuses on the schedule risk analysis process highlighting the methods for identification of risk inputs, the inclusion of generic risks identified outside the traditional continuous risk management process, and the development of tailored analysis products used to improve risk informed decision making.
Analysis of Hospital Processes with Process Mining Techniques.
Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises
2015-01-01
Process mining allows for discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for the analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed for redefining functions in the system and proposing a proper flow of information. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application for making clinical and administrative decisions for the management of hospital activities.
A generic Transcriptomics Reporting Framework (TRF) for 'omics data processing and analysis.
Gant, Timothy W; Sauer, Ursula G; Zhang, Shu-Dong; Chorley, Brian N; Hackermüller, Jörg; Perdichizzi, Stefania; Tollefsen, Knut E; van Ravenzwaay, Ben; Yauk, Carole; Tong, Weida; Poole, Alan
2017-12-01
A generic Transcriptomics Reporting Framework (TRF) is presented that lists parameters that should be reported in 'omics studies used in a regulatory context. The TRF encompasses the processes of transcriptome profiling from data generation to a processed list of differentially expressed genes (DEGs) ready for interpretation. Included within the TRF is a reference baseline analysis (RBA) that encompasses raw data selection; data normalisation; recognition of outliers; and statistical analysis. The TRF itself does not dictate the methodology for data processing, but deals with what should be reported. Its principles are also applicable to sequencing data and other 'omics. In contrast, the RBA specifies a simple data processing and analysis methodology that is designed to provide a comparison point for other approaches and is exemplified here by a case study. By providing transparency on the steps applied during 'omics data processing and analysis, the TRF will increase confidence in the processing of 'omics data and in its regulatory use. Applicability of the TRF is ensured by its simplicity and generality. The TRF can be applied to all types of regulatory 'omics studies, and it can be executed using different commonly available software tools. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.
Thread concept for automatic task parallelization in image analysis
NASA Astrophysics Data System (ADS)
Lueckenhaus, Maximilian; Eckstein, Wolfgang
1998-09-01
Parallel processing of image analysis tasks is an essential method to speed up image processing and helps to exploit the full capacity of distributed systems. However, writing parallel code is a difficult and time-consuming process and often leads to an architecture-dependent program that has to be re-implemented when the hardware changes. It is therefore highly desirable to perform the parallelization automatically. For this we have developed a special kind of thread concept for image analysis tasks. Threads derived from one subtask may share objects and run in the same context, but may follow different threads of execution and work on different data in parallel. In this paper we describe the basics of our thread concept and show how it can be used as the basis of an automatic task parallelization to speed up image processing. We further illustrate the design and implementation of an agent-based system that uses image analysis threads for generating and processing parallel programs while taking the available hardware into account. Tests with our system prototype show that the thread concept combined with the agent paradigm is suitable for speeding up image processing by an automatic parallelization of image analysis tasks.
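The underlying data-parallel idea, i.e. threads from one subtask sharing a context but working on different data, can be sketched with a standard thread pool. This is a generic illustration of the pattern, not the authors' agent-based system, and the tile-mean "operator" is a stand-in for a real image analysis step:

```python
from concurrent.futures import ThreadPoolExecutor

def analyze_tile(tile):
    """Stand-in image-analysis operator: mean brightness of one tile."""
    return sum(tile) / len(tile)

# Split an 'image' (here just a flat pixel list) into tiles that share no
# data, so the tiles can be processed by parallel threads of execution.
image = list(range(16))
tiles = [image[i:i + 4] for i in range(0, 16, 4)]

with ThreadPoolExecutor(max_workers=4) as pool:
    # pool.map preserves tile order in the result list
    results = list(pool.map(analyze_tile, tiles))
print(results)
```

An automatic parallelizer would choose the tiling and worker count from the available hardware rather than hard-coding them as done here.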
Principal process analysis of biological models.
Casagranda, Stefano; Touzeau, Suzanne; Ropers, Delphine; Gouzé, Jean-Luc
2018-06-14
Understanding the dynamical behaviour of biological systems is challenged by their large number of components and interactions. While efforts have been made in this direction to reduce model complexity, they often prove insufficient to grasp which and when model processes play a crucial role. Answering these questions is fundamental to unravel the functioning of living organisms. We design a method for dealing with model complexity, based on the analysis of dynamical models by means of Principal Process Analysis. We apply the method to a well-known model of circadian rhythms in mammals. The knowledge of the system trajectories allows us to decompose the system dynamics into processes that are active or inactive with respect to a certain threshold value. Process activities are graphically represented by Boolean and Dynamical Process Maps. We detect model processes that are always inactive, or inactive on some time interval. Eliminating these processes reduces the complex dynamics of the original model to the much simpler dynamics of the core processes, in a succession of sub-models that are easier to analyse. We quantify by means of global relative errors the extent to which the simplified models reproduce the main features of the original system dynamics and apply global sensitivity analysis to test the influence of model parameters on the errors. The results obtained prove the robustness of the method. The analysis of the sub-model dynamics allows us to identify the source of circadian oscillations. We find that the negative feedback loop involving proteins PER, CRY, CLOCK-BMAL1 is the main oscillator, in agreement with previous modelling and experimental studies. In conclusion, Principal Process Analysis is a simple-to-use method, which constitutes an additional and useful tool for analysing the complex dynamical behaviour of biological systems.
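The core operation, classifying each model process as active or inactive along the trajectory by comparing its contribution against a threshold, can be sketched as follows. The process names, sampled values, and threshold fraction are invented for illustration; they are not the circadian model's terms:

```python
# Toy decomposition of one ODE right-hand side into named processes,
# each sampled at a few time points (values invented for illustration).
delta = 0.1   # threshold fraction of the dominant process's magnitude
processes = {
    "synthesis":   [1.00, 0.90, 0.80, 0.70],
    "degradation": [0.50, 0.55, 0.60, 0.65],
    "binding":     [0.02, 0.03, 0.01, 0.02],
}

times = range(4)
# A process is 'active' at time t when its absolute contribution reaches
# delta times the largest absolute contribution at that time.
activity = {name: [abs(vals[t]) >= delta * max(abs(p[t]) for p in processes.values())
                   for t in times]
            for name, vals in processes.items()}
print(activity["binding"])   # inactive at every sampled time
```

Processes that stay inactive over an interval can be dropped there, yielding the succession of simpler sub-models the method analyses; the global relative error between full and reduced trajectories then validates each reduction.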
Qin, Kunming; Wang, Bin; Li, Weidong; Cai, Hao; Chen, Danni; Liu, Xiao; Yin, Fangzhou; Cai, Baochang
2015-05-01
In traditional Chinese medicine, raw and processed herbs are used to treat different diseases. Suitable quality assessment methods are crucial for discriminating between raw and processed herbs. The dried fruit of Arctium lappa L. and its processed products are widely used in traditional Chinese medicine, yet their therapeutic effects differ. In this study, a novel strategy using high-performance liquid chromatography with diode array detection, coupled with multivariate statistical analysis, to rapidly discriminate raw and processed Arctium lappa L. was proposed and validated. Four main components in a total of 30 batches of raw and processed Fructus Arctii samples were analyzed, and ten characteristic peaks were identified in the fingerprint common pattern. Furthermore, similarity evaluation, principal component analysis, and hierarchical cluster analysis were applied to demonstrate the distinction. The results suggested that the relative amounts of the chemical components of raw and processed Fructus Arctii samples are different. This new method has been successfully applied to detect raw and processed Fructus Arctii in marketed herbal medicinal products. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
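The chemometric part of the workflow above (principal component analysis plus hierarchical cluster analysis on peak-area fingerprints) can be sketched on synthetic data. The batch sizes, peak count, and the size of the "processing" shift below are assumptions for illustration, not the study's measured values.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Synthetic fingerprint matrix: 30 "batches" x 10 peak areas, where
# processed samples have systematically shifted levels of four marker peaks.
rng = np.random.default_rng(0)
raw = rng.normal(1.0, 0.05, size=(15, 10))
processed = raw + rng.normal(0.0, 0.05, size=(15, 10))
processed[:, :4] += 0.8            # assumed effect of processing
X = np.vstack([raw, processed])

# PCA via SVD on the mean-centered matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                      # sample coordinates on the PCs
explained = s**2 / np.sum(s**2)     # variance fraction per component

# Hierarchical cluster analysis (Ward linkage), cut into two groups.
labels = fcluster(linkage(X, method="ward"), t=2, criterion="maxclust")
```

With a shift this large relative to the noise, PC1 captures the raw-versus-processed separation and the two-cluster cut recovers the two sample groups.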
Integration of rocket turbine design and analysis through computer graphics
NASA Technical Reports Server (NTRS)
Hsu, Wayne; Boynton, Jim
1988-01-01
An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.
Launch COLA Gap Analysis for Protection of the International Space Station
NASA Astrophysics Data System (ADS)
Jenkin, Alan B.; McVey, John P.; Peterson, Glenn E.; Sorge, Marlon E.
2013-08-01
For launch missions in general, a collision avoidance (COLA) gap exists between the end of the time interval covered by standard launch COLA screening and the time that other spacecraft can clear a collision with the newly launched objects. To address this issue for the International Space Station (ISS), a COLA gap analysis process has been developed. The first part of the process, nodal separation analysis, identifies launch dates and launch window opportunities when the orbit traces of a launched object and the ISS could cross during the COLA gap. The second and newest part of the analysis process, Monte Carlo conjunction probability analysis, is performed closer to the launch dates of concern to reopen some of the launch window opportunities that would be closed by nodal separation analysis alone. Both parts of the process are described and demonstrated on sample missions.
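The second stage described above, Monte Carlo conjunction probability analysis, can be sketched generically: sample the relative position of the launched object and the ISS from an assumed error distribution and count samples whose miss distance falls inside a combined keep-out radius. The mean miss vector, covariance, and hard-body radius below are invented for illustration and are not the authors' values.

```python
import numpy as np

rng = np.random.default_rng(42)

mean_miss = np.array([2000.0, 500.0, 0.0])        # m, assumed relative miss vector
cov = np.diag([800.0, 300.0, 100.0]) ** 2          # m^2, assumed position covariance
hard_body_radius = 200.0                           # m, assumed combined keep-out radius

# Sample relative positions at closest approach and estimate the
# probability that the miss distance violates the keep-out sphere.
samples = rng.multivariate_normal(mean_miss, cov, size=200_000)
miss = np.linalg.norm(samples, axis=1)
p_collision = np.mean(miss < hard_body_radius)
```

A launch window opportunity closed by nodal separation analysis alone could be reopened if this estimated probability stays below a mission-assurance threshold.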
NASA Technical Reports Server (NTRS)
Phillips, Dave; Haas, William; Barth, Tim; Benjamin, Perakath; Graul, Michael; Bagatourova, Olga
2005-01-01
Range Process Simulation Tool (RPST) is a computer program that assists managers in rapidly predicting and quantitatively assessing the operational effects of proposed technological additions to, and/or upgrades of, complex facilities and engineering systems such as the Eastern Test Range. Originally designed for application to space transportation systems, RPST is also suitable for assessing effects of proposed changes in industrial facilities and large organizations. RPST follows a model-based approach that includes finite-capacity schedule analysis and discrete-event process simulation. A component-based, scalable, open architecture makes RPST easily and rapidly tailorable for diverse applications. Specific RPST functions include: (1) definition of analysis objectives and performance metrics; (2) selection of process templates from a process-template library; (3) configuration of process models for detailed simulation and schedule analysis; (4) design of operations-analysis experiments; (5) schedule- and simulation-based process analysis; and (6) optimization of performance by use of genetic algorithms and simulated annealing. The main benefits afforded by RPST are the provision of information that can be used to reduce costs of operation and maintenance, and the capability for affordable, accurate, and reliable prediction and exploration of the consequences of many alternative proposed decisions.
Text analysis devices, articles of manufacture, and text analysis methods
Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C
2013-05-28
Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes processing circuitry configured to analyze initial text to generate a measurement basis usable in analysis of subsequent text, wherein the measurement basis comprises a plurality of measurement features from the initial text, a plurality of dimension anchors from the initial text and a plurality of associations of the measurement features with the dimension anchors, and wherein the processing circuitry is configured to access a viewpoint indicative of a perspective of interest of a user with respect to the analysis of the subsequent text, and wherein the processing circuitry is configured to use the viewpoint to generate the measurement basis.
This Applications Analysis Report evaluates the Soliditech, Inc., solidification/ stabilization process for the on-site treatment of waste materials. The Soliditech process mixes and chemically treats waste material with Urrichem (a proprietary reagent), additives, pozzolanic mat...
NASA Technical Reports Server (NTRS)
Lieberman, S. L.
1974-01-01
Tables are presented which include: material properties; elemental analysis; silicone RTV formulations; polyester systems and processing; epoxy preblends and processing; urethane materials and processing; epoxy-urethanes elemental analysis; flammability test results, and vacuum effects.
Applications of High-speed motion analysis system on Solid Rocket Motor (SRM)
NASA Astrophysics Data System (ADS)
Liu, Yang; He, Guo-qiang; Li, Jiang; Liu, Pei-jin; Chen, Jian
2007-01-01
The high-speed motion analysis system records images at up to 12,000 fps and analyzes them with an image processing system; data and images are stored directly in electronic memory, which is convenient for management and analysis. Combining the high-speed motion analysis system with an X-ray radiography system produced a high-speed real-time X-ray radiography system capable of diagnosing and measuring dynamic, high-speed processes inside opaque media. Image processing software was developed to improve the quality of the original images and to extract more precise information. This paper introduces typical applications of the high-speed motion analysis system to solid rocket motors (SRMs). The system was used to study anomalous combustion of solid propellant grains with defects, real-time measurement of insulator erosion, the explosive incision process of a motor, the structure and wave character of the plume during ignition and flameout, end burning of solid propellant, flame-front measurement, and airplane-missile compatibility during missile launch; significant results were achieved through this research. For these applications, key problems that degraded image quality, such as motor vibration, power-supply instability, geometric distortion, and noise, were solved, and the image processing software improved the ability to measure image characteristics. The experimental results showed that the system is a powerful facility for studying transient, high-speed processes in solid rocket motors, and its capability continues to be enhanced with the development of image processing techniques.
First On-Site Data Analysis System for Subaru/Suprime-Cam
NASA Astrophysics Data System (ADS)
Furusawa, Hisanori; Okura, Yuki; Mineo, Sogo; Takata, Tadafumi; Nakata, Fumiaki; Tanaka, Manobu; Katayama, Nobuhiko; Itoh, Ryosuke; Yasuda, Naoki; Miyazaki, Satoshi; Komiyama, Yutaka; Utsumi, Yousuke; Uchida, Tomohisa; Aihara, Hiroaki
2011-03-01
We developed an automated on-site quick analysis system for mosaic CCD data of Suprime-Cam, which is a wide-field camera mounted at the prime focus of the Subaru Telescope, Mauna Kea, Hawaii. The first version of the data-analysis system was constructed, and started to operate in general observations. This system is a new function of observing support at the Subaru Telescope to provide the Subaru user community with an automated on-site data evaluation, aiming at improvements of observers' productivity, especially in large imaging surveys. The new system assists the data evaluation tasks in observations by the continuous monitoring of the characteristics of every data frame during observations. The evaluation results and data frames processed by this system are also useful for reducing the data-processing time in a full analysis after an observation. The primary analysis functions implemented in the data-analysis system are composed of automated realtime analysis for data evaluation and on-demand analysis, which is executed upon request, including mosaicing analysis and flat making analysis. In data evaluation, which is controlled by the organizing software, the database keeps track of the analysis histories, as well as the evaluated values of data frames, including seeing and sky background levels; it also helps in the selection of frames for mosaicing and flat making analysis. We examined the system performance and confirmed an improvement in the data-processing time by a factor of 9 with the aid of distributed parallel data processing and on-memory data processing, which makes the automated data evaluation effective.
Pedagogical issues for effective teaching of biosignal processing and analysis.
Sandham, William A; Hamilton, David J
2010-01-01
Biosignal processing and analysis is generally perceived by many students as a topic that is challenging to understand and in which the necessary analytical skills are difficult to acquire. This is a direct consequence of the topic's high mathematical content and many abstract features. The MATLAB and Mathcad software packages offer an excellent algorithm development environment for teaching biosignal processing and analysis modules, and can also be used effectively in many biosignal, and indeed bioengineering, research areas. In this paper, traditional introductory and advanced biosignal processing (and analysis) syllabi are reviewed, and the use of MATLAB and Mathcad for teaching and research is illustrated with a number of examples.
An analysis of the Petri net based model of the human body iron homeostasis process.
Sackmann, Andrea; Formanowicz, Dorota; Formanowicz, Piotr; Koch, Ina; Blazewicz, Jacek
2007-02-01
In the paper a Petri net based model of the human body iron homeostasis is presented and analyzed. The body iron homeostasis is an important but not fully understood complex process. The modeling of the process presented in the paper is expressed in the language of Petri net theory. An application of this theory to the description of biological processes allows for very precise analysis of the resulting models. Here, such an analysis of the body iron homeostasis model from a mathematical point of view is given.
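A Petri net of the kind analyzed above can be interpreted with a few lines of code: places hold tokens, and a transition fires only when every input place carries at least the required number of tokens. The net below is a minimal, hypothetical caricature of iron handling (dietary iron absorbed, then stored or exported), not the authors' validated homeostasis model.

```python
# A transition is a pair (pre, post) of {place: weight} dicts.

def enabled(marking, transition):
    """True if every input place holds at least the required tokens."""
    pre, _post = transition
    return all(marking.get(p, 0) >= w for p, w in pre.items())

def fire(marking, transition):
    """Consume tokens from input places, produce tokens in output places."""
    pre, post = transition
    m = dict(marking)
    for p, w in pre.items():
        m[p] -= w
    for p, w in post.items():
        m[p] = m.get(p, 0) + w
    return m

# Hypothetical places/transitions loosely inspired by iron handling.
absorb = ({"diet": 1}, {"enterocyte": 1})
store = ({"enterocyte": 1}, {"ferritin": 1})
export = ({"enterocyte": 1}, {"plasma": 1})

marking = {"diet": 2}
marking = fire(marking, absorb)    # diet -> enterocyte
marking = fire(marking, export)    # enterocyte -> plasma
```

Reachability and invariant analysis of such markings is what Petri net theory then makes mathematically precise.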
ERIC Educational Resources Information Center
Verger, Antoni; Hermo, Javier Pablo
2010-01-01
The article analyses two processes of higher education regionalisation, MERCOSUR-Educativo in Latin America and the Bologna Process in Europe, from a comparative perspective. The comparative analysis is centered on the content and the governance of both processes and, specifically, on the reasons of their uneven evolution and implementation. We…
Analysis of Doppler radar windshear data
NASA Technical Reports Server (NTRS)
Williams, F.; Mckinney, P.; Ozmen, F.
1989-01-01
The objective of this analysis is to process Lincoln Laboratory Doppler radar data obtained during FLOWS testing at Huntsville, Alabama, in the summer of 1986, to characterize windshear events. The processing includes plotting velocity and F-factor profiles, histogram analysis to summarize statistics, and correlation analysis to demonstrate any correlation between different data fields.
ERIC Educational Resources Information Center
Tutlys, Vidmantas; Spöttl, Georg
2017-01-01
Purpose: This paper aims to explore methodological and institutional challenges on application of the work-process analysis approach in the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…
IRB Process Improvements: A Machine Learning Analysis.
Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A
2017-06-01
Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency, and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single-variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on the initially identified predictors, changes to IRB workflow and staffing procedures were instituted, and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process, including the type of IRB review to be conducted, whether a protocol falls under Veterans Administration purview, and the specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.
Teaching concept analysis to graduate nursing students.
Schiller, Catharine J
2018-04-01
To provide guidance to educators who use the Wilson (1963) concept analysis method, as modified by Walker and Avant (2011), in their graduate nursing curriculum. While graduate nursing curricula often include a concept analysis assignment, there is a paucity of literature to assist educators in guiding students through this challenging process. This article details one way for educators to assist graduate nursing students in learning how to undertake each step of the method. Using examples, it walks the reader through the Walker and Avant (2011) concept analysis process and addresses issues commonly encountered by educators along the way. Having clear information about the steps involved in developing a concept analysis will make it easier for educators to incorporate it into their graduate nursing curriculum and to effectively guide students on their journey through this process. © 2018 Wiley Periodicals, Inc.
Environmental Assessment: Hurlburt Field Soundside Boathouse and Restroom Facility Construction
2007-08-01
In accordance with 42 U.S.C. §4321, et seq., and Air Force Instruction (AFI) 32-7061, The Environmental Impact Analysis Process, the USAF concludes that the Proposed Action will have no… These regulations require federal agencies to analyze the potential environmental…
Signal Processing and Interpretation Using Multilevel Signal Abstractions.
1986-06-01
mappings expressed in the Fourier domain. Previously proposed causal analysis techniques for diagnosis are based on the analysis of intermediate data… can be processed either as individual one-dimensional waveforms or as multichannel data for source detection and direction… microphone data. The signal processing for both spectral analysis of microphone signals and direction determination of acoustic sources involves…
NASA Astrophysics Data System (ADS)
Nesladek, Pavel; Wiswesser, Andreas; Sass, Björn; Mauermann, Sebastian
2008-04-01
The critical dimension off-target (CDO) is a key parameter for mask-house customers, as it directly affects the performance of the mask. The CDO is the difference between the feature-size target and the measured feature size. The change of CD during the process is compensated either within the process or by data correction; these compensation methods are commonly called process bias and data bias, respectively. A difference between data bias and process bias in manufacturing results in a systematic CDO error; however, this systematic error does not account for the instability of the process bias, which results from minor variations in manufacturing processes and from changes in materials and/or logistics. Using several masks, the CDO of the manufacturing line can be estimated. A systematic investigation of unit-process contributions to CDO, and analysis of the factors influencing those contributors, requires a solid understanding of each unit process and a huge number of masks. Rough identification of the contributing processes and partitioning of the final CDO variation among them can be done with approximately 50 masks of identical design, material, and process. This amount of data allows the main contributors to be identified and their effects estimated by means of analysis of variance (ANOVA) combined with multivariate analysis. The analysis does not reveal the root cause of the variation within a particular unit process, but it provides a good estimate of the process's impact on the stability of the manufacturing line. Additionally, this analysis can identify possible interactions between processes, which cannot be investigated if only single processes are considered. The goal of this work is to evaluate the limits that measurement precision and the number of measurements impose on CDO budgeting models, and to partition the variation within the manufacturing process.
The CDO variation splits according to the suggested model into contributions from particular processes or process groups. Last but not least, the power of this method to determine the absolute strength of each parameter is demonstrated. Identifying the root cause of the variation within the unit process itself is not within the scope of this work.
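The ANOVA-based partitioning of CDO variation can be sketched with a one-way layout on synthetic numbers: attribute the variance of CD off-target across masks to a between-process component (here a hypothetical etch-chamber effect) and a residual. Chamber labels, offsets, and mask counts below are assumptions, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(1)
chamber_offset = {"A": 0.0, "B": 1.5, "C": -1.0}   # nm, assumed process effect
groups = {c: off + rng.normal(0.0, 0.5, size=17)   # ~50 masks in total
          for c, off in chamber_offset.items()}

all_cdo = np.concatenate(list(groups.values()))
grand_mean = all_cdo.mean()

# Classic one-way ANOVA sums of squares: SS_total = SS_between + SS_within.
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups.values())
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups.values())
ss_total = ((all_cdo - grand_mean) ** 2).sum()

# Fraction of CDO variance attributable to the chamber "process".
eta_squared = ss_between / ss_total
```

In a budgeting model, eta-squared-style fractions computed per unit process are what split the total CDO variation into per-process contributions.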
Salari-Moghaddam, Asma; Milajerdi, Alireza; Larijani, Bagher; Esmaillzadeh, Ahmad
2018-06-01
No earlier study has summarized findings from previous publications on processed red meat intake and risk of Chronic Obstructive Pulmonary Disease (COPD). This systematic review and meta-analysis was conducted to examine the association between processed red meat intake and COPD risk. We searched PubMed/Medline, ISI Web of Knowledge, Scopus, EMBASE, and Google Scholar up to April 2018 to identify relevant studies. Prospective cohort studies that considered processed red meat as the exposure variable and COPD as the main outcome variable, or as one of the outcomes, were included in the systematic review. Publications in which hazard ratios (HRs) were reported as the effect size were included in the meta-analysis. Finally, five cohort studies were considered in this systematic review and meta-analysis. In total, 289,952 participants, including 8338 subjects with COPD, aged ≥27 years were included in the meta-analysis. These studies were from Sweden and the US. Linear dose-response meta-analysis revealed that each 50 g/week increase in processed red meat intake was associated with an 8% higher risk of COPD (HR: 1.08; 95% CI: 1.03, 1.13). There was evidence of a non-linear association between processed red meat intake and risk of COPD (P < 0.001). In this systematic review and meta-analysis, we found a significant positive association between processed red meat intake and risk of COPD. CRD42017077971. Copyright © 2018 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
NASA Astrophysics Data System (ADS)
Yussup, N.; Rahman, N. A. A.; Ibrahim, M. M.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.
2017-01-01
The Neutron Activation Analysis (NAA) process has been established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s. Most of the established procedures, especially from sample registration to sample analysis, are performed manually. These manual procedures carried out by the NAA laboratory personnel are time consuming and inefficient. Hence, software to support system automation was developed to provide an effective method that replaces redundant manual data entries and produces a faster sample analysis and calculation process. This paper describes the design and development of automation software for the NAA process, which consists of three sub-programs: sample registration; hardware control and data acquisition; and sample analysis. The data flow and connections between the sub-programs are explained. The software is developed using the National Instruments LabVIEW development package.
NASA Astrophysics Data System (ADS)
Zaborowicz, M.; Przybył, J.; Koszela, K.; Boniecki, P.; Mueller, W.; Raba, B.; Lewicki, A.; Przybył, K.
2014-04-01
The aim of the project was to develop software that extracts the characteristics of a greenhouse tomato from its image. Data gathered during image analysis and processing were used to build learning sets for artificial neural networks. The program can process pictures in JPEG format, acquire statistical information about a picture, and export it to an external file. The software is intended for batch analysis of the collected research material, with the obtained information saved as a CSV file, and it computes 33 independent parameters to describe a tested image. Although dedicated to the processing and image analysis of greenhouse tomatoes, the program can also be used to analyze other fruits and vegetables of spherical shape.
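A batch pipeline in the spirit described above, per-image statistical features exported as one CSV row per image, can be sketched as follows. The images are synthetic arrays and the feature names are invented for illustration; the actual program computes 33 parameters tailored to tomatoes.

```python
import csv
import io
import numpy as np

# Synthetic stand-ins for JPEG files: 3 random 64x64 RGB "tomato" images.
rng = np.random.default_rng(3)
images = {f"tomato_{i:02d}.jpg": rng.integers(0, 256, size=(64, 64, 3))
          for i in range(3)}

def features(img):
    """A few simple per-image statistics (hypothetical feature set)."""
    r, g, b = (img[..., c].astype(float) for c in range(3))
    return {
        "mean_red": r.mean(),
        "mean_green": g.mean(),
        "red_green_ratio": r.mean() / g.mean(),
        "brightness_std": img.mean(axis=2).std(),
    }

# One CSV row per image, header derived from the feature dict.
buf = io.StringIO()
rows = [{"file": name, **features(img)} for name, img in images.items()]
writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
writer.writeheader()
writer.writerows(rows)
csv_text = buf.getvalue()
```

Writing to a `StringIO` buffer keeps the sketch self-contained; a real batch run would open a file instead and append a row per processed image.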
NASA Technical Reports Server (NTRS)
Davis, Frank W.; Quattrochi, Dale A.; Ridd, Merrill K.; Lam, Nina S.-N.; Walsh, Stephen J.
1991-01-01
This paper discusses some basic scientific issues and research needs in the joint processing of remotely sensed and GIS data for environmental analysis. Two general topics are treated in detail: (1) scale dependence of geographic data and the analysis of multiscale remotely sensed and GIS data, and (2) data transformations and information flow during data processing. The discussion of scale dependence focuses on the theory and applications of spatial autocorrelation, geostatistics, and fractals for characterizing and modeling spatial variation. Data transformations during processing are described within the larger framework of geographical analysis, encompassing sampling, cartography, remote sensing, and GIS. Development of better user interfaces between image processing, GIS, database management, and statistical software is needed to expedite research on these and other impediments to integrated analysis of remotely sensed and GIS data.
Combined process automation for large-scale EEG analysis.
Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E
2012-01-01
Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
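Two of the automated steps listed above, generating a user-defined band frequency waveform and power spectral density analysis, can be sketched with SciPy on a synthetic EEG trace. The sampling rate, band edges, and signal content below are assumptions, not the study's recording parameters.

```python
import numpy as np
from scipy import signal

fs = 500.0                                   # Hz, assumed sampling rate
t = np.arange(0, 10, 1 / fs)
eeg = (np.sin(2 * np.pi * 6 * t)             # 6 Hz "theta" component
       + 0.5 * np.sin(2 * np.pi * 40 * t)    # 40 Hz "gamma" component
       + 0.1 * np.random.default_rng(0).normal(size=t.size))

# Step 2: extract a user-defined 4-8 Hz band with a zero-phase
# Butterworth band-pass filter (second-order sections for stability).
sos = signal.butter(4, [4, 8], btype="bandpass", fs=fs, output="sos")
theta = signal.sosfiltfilt(sos, eeg)

# Step 5: Welch power spectral density of the raw trace.
freqs, psd = signal.welch(eeg, fs=fs, nperseg=2048)
peak_freq = freqs[np.argmax(psd)]
```

Chaining such steps per recording, plus spike-sorting and quantification, is what turns one-off scripts into the linked large-scale pipeline the abstract describes.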
Heat and Mass Transfer Processes in Scrubber of Flue Gas Heat Recovery Device
NASA Astrophysics Data System (ADS)
Veidenbergs, Ivars; Blumberga, Dagnija; Vigants, Edgars; Kozuhars, Grigorijs
2010-01-01
The paper deals with the heat and mass transfer process research in a flue gas heat recovery device, where complicated cooling, evaporation and condensation processes are taking place simultaneously. The analogy between heat and mass transfer is used during the process of analysis. In order to prepare a detailed process analysis based on heat and mass process descriptive equations, as well as the correlation for wet gas parameter calculation, software in the
An Ibm PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis
NASA Astrophysics Data System (ADS)
Kim, Yongmin; Alexander, Thomas
1986-06-01
In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.
Sapunar, Damir; Grković, Ivica; Lukšić, Davor; Marušić, Matko
2016-05-01
Our aim was to describe a comprehensive model of internal quality management (QM) at a medical school, founded on a business process analysis (BPA) software tool. The BPA software tool was used as the core element for describing all working processes in our medical school, and the resulting system served as the comprehensive model of internal QM. The quality management system at the University of Split School of Medicine included the documentation and analysis of all business processes within the School. The analysis revealed 80 weak points related to one or several business processes. A precise analysis of medical school business processes allows identification of unfinished, unclear, and inadequate points in these processes, and subsequently the respective improvements, an increase in the QM level, and ultimately a rationalization of the institution's work. Our approach offers a potential reference model for the development of a common QM framework allowing continuous quality control, i.e., adjustment and adaptation to the contemporary educational needs of medical students. Copyright © 2016 by Academy of Sciences and Arts of Bosnia and Herzegovina.
Summary and recommendations. [reduced gravitational effects on materials manufactured in space
NASA Technical Reports Server (NTRS)
1975-01-01
An economic analysis using econometric and cost benefit analysis techniques was performed to determine the feasibility of space processing of certain products. The overall objectives of the analysis were (1) to determine specific products or processes uniquely connected with space manufacturing, (2) to select a specific product or process from each of the areas of semiconductors, metals, and biochemicals, and (3) to determine the overall price/cost structure of each product or process considered. The economic elements of the analysis involved a generalized decision making format for analyzing space manufacturing, a comparative cost study of the selected processes in space vs. earth manufacturing, and a supply and demand study of the economic relationships of one of the manufacturing processes. Space processing concepts were explored. The first involved the use of the shuttle as the factory with all operations performed during individual flights. The second concept involved a permanent unmanned space factory which would be launched separately. The shuttle in this case would be used only for maintenance and refurbishment. Finally, some consideration was given to a permanent manned space factory.
Clinical process analysis and activity-based costing at a heart center.
Ridderstolpe, Lisa; Johansson, Andreas; Skau, Tommy; Rutberg, Hans; Ahlfeldt, Hans
2002-08-01
Cost studies; measures of productivity, efficiency, and quality of care; and the links between resources and patient outcomes are fundamental issues for hospital management today. This paper describes the implementation of a model for process analysis and activity-based costing (ABC)/management at a Heart Center in Sweden as a tool for administrative cost information, strategic decision-making, quality improvement, and cost reduction. A commercial software package (QPR) containing two interrelated parts, ProcessGuide and CostControl, was used. All processes at the Heart Center were mapped and graphically outlined. Processes and activities such as health care procedures, research, and education were identified together with their causal relationships to costs and products/services. The construction of the ABC model in CostControl was time-consuming. However, after the ABC/management system was created, it opened the way to new possibilities, including process and activity analysis, simulation, and price calculations. Cost analysis showed large variations in the cost obtained for individual patients undergoing coronary artery bypass grafting (CABG) surgery. We conclude that a process-based costing system is applicable and has the potential to be useful in hospital management.
Ultrasound-enhanced bioscouring of greige cotton: regression analysis of process factors
USDA-ARS?s Scientific Manuscript database
Process factors of enzyme concentration, time, power and frequency were investigated for ultrasound-enhanced bioscouring of greige cotton. A fractional factorial experimental design and subsequent regression analysis of the process factors were employed to determine the significance of each factor a...
Quality Assessment of College Admissions Processes.
ERIC Educational Resources Information Center
Fisher, Caroline; Weymann, Elizabeth; Todd, Amy
2000-01-01
This study evaluated the admissions process for a Master's in Business Administration Program using such quality improvement techniques as customer surveys, benchmarking, and gap analysis. Analysis revealed that student dissatisfaction with the admissions process may be a factor influencing declining enrollment. Cycle time and number of student…
Li, Liang; Wang, Yiying; Xu, Jiting; Flora, Joseph R V; Hoque, Shamia; Berge, Nicole D
2018-08-01
Hydrothermal carbonization (HTC) is a wet, low temperature thermal conversion process that continues to gain attention for the generation of hydrochar. The importance of specific process conditions and feedstock properties on hydrochar characteristics is not well understood. To evaluate this, linear and non-linear models were developed to describe hydrochar characteristics based on data collected from HTC-related literature. A Sobol analysis was subsequently conducted to identify parameters that most influence hydrochar characteristics. Results from this analysis indicate that for each investigated hydrochar property, the model fit and predictive capability associated with the random forest models is superior to both the linear and regression tree models. Based on results from the Sobol analysis, the feedstock properties and process conditions most influential on hydrochar yield, carbon content, and energy content were identified. In addition, a variational process parameter sensitivity analysis was conducted to determine how feedstock property importance changes with process conditions. Copyright © 2018 Elsevier Ltd. All rights reserved.
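The Sobol analysis step described above can be sketched with a standard Saltelli-type estimator for first-order indices. The three-input linear "yield" surrogate below is a hypothetical stand-in (its weights and input names are invented for illustration), not the paper's fitted random forest model:

```python
import numpy as np

def sobol_first_order(f, d, n, rng):
    """Saltelli-style Monte Carlo estimator of first-order Sobol indices."""
    A = rng.random((n, d))          # first independent sample matrix
    B = rng.random((n, d))          # second independent sample matrix
    fA, fB = f(A), f(B)
    var = np.var(np.concatenate([fA, fB]))
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]         # A with column i taken from B
        S[i] = np.mean(fB * (f(ABi) - fA)) / var
    return S

# Hypothetical surrogate: hydrochar yield as a weighted sum of three
# normalized inputs (say temperature, time, feedstock carbon content).
def yield_model(X):
    return 4.0 * X[:, 0] + 2.0 * X[:, 1] + 1.0 * X[:, 2]

rng = np.random.default_rng(0)
S = sobol_first_order(yield_model, 3, 100_000, rng)
# For this linear model the analytic indices are 16/21, 4/21, 1/21
# (about 0.762, 0.190, 0.048), so the first input dominates.
print(S)
```

For a linear model with independent inputs the indices are just the squared weights normalized by their sum, which makes the estimator easy to sanity-check before applying it to a fitted black-box model.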
DDS-Suite - A Dynamic Data Acquisition, Processing, and Analysis System for Wind Tunnel Testing
NASA Technical Reports Server (NTRS)
Burnside, Jathan J.
2012-01-01
Wind tunnels have optimized their steady-state data systems for acquisition and analysis and have even implemented large dynamic-data acquisition systems; however, the development of near-real-time processing and analysis tools for dynamic data has lagged. DDS-Suite is a set of tools used to acquire, process, and analyze large amounts of dynamic data. Each phase of the testing process (acquisition, processing, and analysis) is handled by a separate component, so that bottlenecks in one phase do not affect the others, leading to a robust system. DDS-Suite is capable of acquiring 672 channels of dynamic data at a rate of 275 MB/s. More than 300 channels of the system use 24-bit analog-to-digital cards and are capable of producing data with less than 0.01 of phase difference at 1 kHz. System architecture, design philosophy, and examples of use during NASA Constellation and Fundamental Aerodynamic tests are discussed.
Efficient Process Migration for Parallel Processing on Non-Dedicated Networks of Workstations
NASA Technical Reports Server (NTRS)
Chanchio, Kasidit; Sun, Xian-He
1996-01-01
This paper presents the design and preliminary implementation of MpPVM, a software system that supports process migration for PVM application programs in a non-dedicated heterogeneous computing environment. New concepts of the migration point as well as migration point analysis and necessary data analysis are introduced. In MpPVM, process migrations occur only at previously inserted migration points. Migration point analysis determines appropriate locations to insert migration points, whereas necessary data analysis provides a minimum set of variables to be transferred at each migration point. A new methodology to perform reliable point-to-point data communications in a migration environment is also discussed. Finally, a preliminary implementation of MpPVM and its experimental results are presented, showing the correctness and promising performance of our process migration mechanism in a scalable non-dedicated heterogeneous computing environment. While MpPVM is developed on top of PVM, the process migration methodology introduced in this study is general and can be applied to any distributed software environment.
Hanson, A L; Metzger, L E
2010-02-01
The objective of this study was to determine the effect of increased vitamin D fortification (250 IU/serving) of high-temperature, short-time (HTST)-processed 2% fat milk, UHT-processed 2% fat chocolate milk, and low-fat strawberry yogurt on the sensory characteristics and stability of vitamin D during processing and storage. Three replicates of HTST pasteurized 2% fat milk, UHT pasteurized 2% fat chocolate milk, and low-fat strawberry yogurt were manufactured. Each of the 3 replicates for all products contained a control (no vitamin D fortification), a treatment group with 100 IU vitamin D/serving (current level of vitamin D fortification), and a treatment group with 250 IU vitamin D/serving. A cold-water dispersible vitamin D(3) concentrate was used for all fortifications. The HTST-processed 2% fat milk was stored for 21 d, with vitamin D analysis done before processing and on d 0, 14, and 21. Sensory analysis was conducted on d 14. The UHT-processed 2% fat chocolate milk was stored for 60 d, with vitamin D analysis done before processing and on d 0, 40, and 60. Sensory analysis was conducted on d 40. Low-fat strawberry yogurt was stored for 42 d, with vitamin D analysis done before processing, and on d 0, 28, and 42. Sensory analysis was conducted on d 28. Vitamin D levels in the fortified products were found to be similar to the target levels of fortification (100 and 250 IU vitamin D per serving) for all products, indicating no loss of vitamin D during processing. Vitamin D was also found to be stable over the shelf life of each product. Increasing the fortification of vitamin D from 100 to 250 IU/serving did not result in a change in the sensory characteristics of HTST-processed 2% fat milk, UHT-processed 2% fat chocolate milk, or low-fat strawberry yogurt. These results indicate that it is feasible to increase vitamin D fortification from 100 to 250 IU per serving in these products. Copyright 2010 American Dairy Science Association. 
Vocational Education Operations Analysis Process.
ERIC Educational Resources Information Center
California State Dept. of Education, Sacramento. Vocational Education Services.
This manual on the vocational education operations analysis process is designed to provide vocational administrators/coordinators with an internal device to collect, analyze, and display vocational education performance data. The first section describes the system and includes the following: analysis worksheet, data sources, utilization, system…
Global Persistent Attack: A Systems Architecture, Process Modeling, and Risk Analysis Approach
2008-06-01
develop an analysis process for quantifying risk associated with the limitations presented by a fiscally constrained environment. The second step...previous independent analysis of each force structure provided information for quantifying risk associated with the given force presentations, the
Wet weather highway accident analysis and skid resistance data management system (volume I).
DOT National Transportation Integrated Search
1992-06-01
The objectives and scope of this research are to establish an effective methodology for wet weather accident analysis and to develop a database management system to facilitate information processing and storage for the accident analysis process, skid...
Multiobjective Sensitivity Analysis Of Sediment And Nitrogen Processes With A Watershed Model
This paper presents a computational analysis for evaluating critical non-point-source sediment and nutrient (specifically nitrogen) processes and management actions at the watershed scale. In the analysis, model parameters that bear key uncertainties were presumed to reflect the ...
Code of Federal Regulations, 2011 CFR
2011-07-01
... Acquisition Programs and Major Automated Information System Acquisition Programs. 1 To comply with NEPA and... ANALYSIS PROCESS (EIAP) § 989.1 Purpose. (a) This part implements the Air Force Environmental Impact Analysis Process (EIAP) and provides procedures for environmental impact analysis both within the United...
A DMAIC approach for process capability improvement an engine crankshaft manufacturing process
NASA Astrophysics Data System (ADS)
Sharma, G. V. S. S.; Rao, P. Srinivasa
2014-05-01
The define-measure-analyze-improve-control (DMAIC) approach is a five-phase, scientific approach for reducing deviations and improving the capability levels of manufacturing processes. The present work elaborates on the DMAIC approach applied to reducing the process variations of the stub-end-hole boring operation in the manufacture of a crankshaft. This statistical process control study starts with the selection of the critical-to-quality (CTQ) characteristic in the define phase. The next phase constitutes the collection of dimensional measurement data for the identified CTQ characteristic. This is followed by the analysis and improvement phases, where various quality control tools such as the Ishikawa diagram, physical mechanism analysis, failure modes and effects analysis, and analysis of variance are applied. Finally, process monitoring charts are deployed at the workplace for regular monitoring and control of the concerned CTQ characteristic. By adopting the DMAIC approach, the standard deviation was reduced from 0.003 to 0.002, the process potential capability index (Cp) improved from 1.29 to 2.02, and the process performance capability index (Cpk) improved from 0.32 to 1.45.
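The capability indices reported above follow the standard definitions Cp = (USL - LSL)/(6*sigma) and Cpk = min(USL - mu, mu - LSL)/(3*sigma). A minimal sketch with hypothetical specification limits and process values (not the paper's crankshaft data, whose limits are not given in the abstract):

```python
def cp_cpk(mu, sigma, lsl, usl):
    """Process capability indices for a normally distributed characteristic.

    Cp measures spread against the tolerance width; Cpk also penalizes
    off-center processes via the nearer specification limit.
    """
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical bore diameter spec of 20.000 +/- 0.018 mm; the two sigma
# values echo the abstract's before/after improvement of 0.003 -> 0.002.
cp0, cpk0 = cp_cpk(mu=20.003, sigma=0.003, lsl=19.982, usl=20.018)
cp1, cpk1 = cp_cpk(mu=20.003, sigma=0.002, lsl=19.982, usl=20.018)
print(cp0, cpk0)  # Cp = 2.0, Cpk ~ 1.67
print(cp1, cpk1)  # Cp = 3.0, Cpk = 2.5
```

Note how Cpk stays below Cp whenever the mean drifts off the tolerance midpoint, which is why the paper tracks both indices.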
Lu, Lingbo; Li, Jingshan; Gisler, Paula
2011-06-01
Radiology tests, such as MRI, CT-scan, X-ray and ultrasound, are cost intensive and insurance pre-approvals are necessary to get reimbursement. In some cases, tests may be denied for payments by insurance companies due to lack of pre-approvals, inaccurate or missing necessary information. This can lead to substantial revenue losses for the hospital. In this paper, we present a simulation study of a centralized scheduling process for outpatient radiology tests at a large community hospital (Central Baptist Hospital in Lexington, Kentucky). Based on analysis of the central scheduling process, a simulation model of information flow in the process has been developed. Using such a model, the root causes of financial losses associated with errors and omissions in this process were identified and analyzed, and their impacts were quantified. In addition, "what-if" analysis was conducted to identify potential process improvement strategies in the form of recommendations to the hospital leadership. Such a model provides a quantitative tool for continuous improvement and process control in radiology outpatient test scheduling process to reduce financial losses associated with process error. This method of analysis is also applicable to other departments in the hospital.
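The loss quantification described above can be sketched as a small Monte Carlo model of the claim pipeline. The test volume, error probabilities, and reimbursement figure below are hypothetical placeholders, not Central Baptist Hospital's data or the authors' simulation model:

```python
import random

def expected_denial_loss(n_tests, p_missing_info, p_denied_given_missing,
                         avg_reimbursement, trials=200, seed=1):
    """Monte Carlo estimate of revenue lost to denied radiology claims.

    Each scheduled test may have missing/inaccurate pre-approval
    information; such tests may then be denied reimbursement.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        loss = 0.0
        for _ in range(n_tests):
            if rng.random() < p_missing_info and rng.random() < p_denied_given_missing:
                loss += avg_reimbursement
        total += loss
    return total / trials

# Hypothetical inputs: 500 tests per period, 5% with missing info,
# 40% of those denied, $800 average reimbursement.
est = expected_denial_loss(500, 0.05, 0.4, 800.0)
print(est)  # analytic expectation: 500 * 0.05 * 0.4 * 800 = 8000
```

A model of this shape also supports the "what-if" analysis mentioned in the abstract: lowering `p_missing_info` (e.g. by adding a verification step) directly quantifies the recovered revenue.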
NASA Astrophysics Data System (ADS)
Citraresmi, A. D. P.; Wahyuni, E. E.
2018-03-01
The aim of this study was to examine the implementation of Hazard Analysis and Critical Control Point (HACCP) principles for the identification and prevention of potential hazards in the production process of dried anchovy at PT. Kelola Mina Laut (KML), Lobuk unit, Sumenep. Cold storage is needed at each anchovy processing step in order to maintain the product's physical and chemical condition. In addition, a quality assurance system should be implemented to maintain product quality. The research was conducted using a survey method, following the whole process of producing dried anchovy from the receipt of raw materials to the packaging of the final product. The method of data analysis used was descriptive analysis. Implementation of HACCP at PT. KML, Lobuk unit, Sumenep was conducted by applying Pre-Requisite Programs (PRP) and a preparation stage consisting of 5 initial stages and the 7 principles of HACCP. The results showed that CCPs were found in the boiling process step, with a significant hazard from Listeria monocytogenes bacteria, and in the final sorting process, with a significant hazard of foreign material contamination in the product. Actions taken were controlling the boiling temperature at 100 – 105°C for 3 – 5 minutes and training the sorting process employees.
Walking the Fine Line: Political Decision Making with or without Data.
ERIC Educational Resources Information Center
Merkel-Keller, Claudia
The stages of the policy process are examined and explained in terms of a decision-making framework. The policy process comprises four stages: policy analysis, policy formation, policy decision, and political analysis. Political analysis is the performance of the market analysis needed for a decision. The political weight, rather than the…
The COMPTEL Processing and Analysis Software system (COMPASS)
NASA Astrophysics Data System (ADS)
de Vries, C. P.; COMPTEL Collaboration
The data analysis system of the gamma-ray Compton Telescope (COMPTEL) onboard the Compton-GRO spacecraft is described. A continuous stream of data on the order of 1 kbyte per second is generated by the instrument. The data processing and analysis software is built around a relational database management system (RDBMS) in order to trace the heritage and processing status of all data in the processing pipeline. Four institutes cooperate in this effort, requiring procedures to keep local RDBMS contents identical between the sites and swift exchange of data using network facilities. Lately, there has been a gradual move of the system from central processing facilities towards clusters of workstations.
Human Factors Analysis to Improve the Processing of Ares-1 Launch Vehicle
NASA Technical Reports Server (NTRS)
Dippolito, Gregory M.; Stambolian, Damon B.
2011-01-01
The Constellation Program (CxP) is composed of an array of vehicles used to go to the Moon and Mars. The Ares vehicle, one of the components of CxP, goes through several stages of processing before it is launched at the Kennedy Space Center. In order to have efficient and effective ground processing inside and outside the vehicle, all of the ground processing activities should be analyzed. The analysis for this program was performed by engineers, technicians, and human factors experts with spacecraft processing experience. Data were gathered by observing human activities within physical mockups. The paper focuses on the procedures, analysis, and results from these observations.
Welding process modelling and control
NASA Technical Reports Server (NTRS)
Romine, Peter L.; Adenwala, Jinen A.
1993-01-01
The research and analysis performed, the software developed, and the hardware/software recommendations made during 1992 in developing the PC-based data acquisition system supporting welding process modeling and control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for data acquisition and analysis. Although the WMS supports many of the functions associated with process control, it is not intended to be used for welding process control.
Simulation and Analysis of One-time Forming Process of Automobile Steering Ball Head
NASA Astrophysics Data System (ADS)
Shi, Peicheng; Zhang, Xujun; Xu, Zengwei; Zhang, Rongyun
2018-03-01
To address problems such as large machining allowance, low production efficiency, and material waste during die forging of the ball pin, the cold extrusion process of the ball head was studied, and the forming process was simulated using the finite element analysis software DEFORM-3D. Through analysis of the equivalent stress and strain, the velocity vector field, and the load-displacement curve, the flow behavior of the metal during cold extrusion of the ball pin was clarified, and possible defects during forming were predicted. The results showed that this process can solve the forming problem of the ball pin and provide a theoretical basis for actual production.
STARS Conceptual Framework for Reuse Processes (CFRP). Volume 2: application Version 1.0
1993-09-30
Analysis and Design: DISA/CIM process [DIS93]; Feature-Oriented Domain Analysis (FODA): SEI process [KCH+90]; JIAWG Object-Oriented Domain Analysis: JIAWG… Feature-Oriented Domain Analysis (FODA) Feasibility Study. Technical Report CMU/SEI-90-TR-21, Software Engineering Institute, Carnegie Mellon University, Pittsburgh… Electronic Systems Center, Air Force Materiel Command, USAF, Hanscom AFB, MA 01731-5000. Prepared by: The Boeing Company, IBM, Unisys Corporation, Defense…
NASA Astrophysics Data System (ADS)
Armando, Alessandro; Giunchiglia, Enrico; Ponta, Serena Elisa
We present an approach to the formal specification and automatic analysis of business processes under authorization constraints based on the action language C. The use of C allows for a natural and concise modeling of the business process and the associated security policy, and for the automatic analysis of the resulting specification using the Causal Calculator (CCALC). Our approach improves upon previous work by greatly simplifying the specification step while retaining the ability to perform a fully automatic analysis. To illustrate the effectiveness of the approach, we describe its application to a version of a business process taken from the banking domain and use CCALC to determine resource allocation plans complying with the security policy.
Huang, Jun; Kaul, Goldi; Cai, Chunsheng; Chatlapalli, Ramarao; Hernandez-Abad, Pedro; Ghosh, Krishnendu; Nagi, Arwinder
2009-12-01
To facilitate in-depth process understanding and offer opportunities for developing control strategies to ensure product quality, a combination of experimental design, optimization, and multivariate techniques was integrated into the process development of a drug product. A process DOE was used to evaluate the effects of the design factors on manufacturability and final product CQAs, and to establish a design space that ensures the desired CQAs. Two types of analyses were performed to extract maximal information: DOE effect and response-surface analysis, and multivariate analysis (PCA and PLS). The DOE effect analysis was used to evaluate the interactions and effects of three design factors (water amount, wet massing time, and lubrication time) on response variables (blend flow, compressibility, and tablet dissolution). The design space was established by the combined use of DOE, optimization, and multivariate analysis to ensure the desired CQAs. Multivariate analysis of all variables from the DOE batches was conducted to study relationships between the variables and to evaluate the impact of material attributes/process parameters on manufacturability and final product CQAs. The integrated multivariate approach exemplifies the application of QbD principles and tools to drug product and process development.
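The DOE effect analysis for the three design factors can be illustrated with a coded two-level full factorial, where the main effect of a factor is the mean response at its high level minus the mean at its low level. The eight responses below are hypothetical, not the paper's batch data:

```python
import numpy as np

# Coded 2^3 full-factorial design: columns are water amount,
# wet massing time, and lubrication time at levels -1 / +1.
X = np.array([[s1, s2, s3]
              for s1 in (-1, 1) for s2 in (-1, 1) for s3 in (-1, 1)])

# Hypothetical blend-flow responses for the eight runs.
y = np.array([55, 61, 57, 63, 54, 60, 58, 64], dtype=float)

def main_effects(X, y):
    """Main effect of each factor: mean(y | +1) - mean(y | -1)."""
    return np.array([y[X[:, j] == 1].mean() - y[X[:, j] == -1].mean()
                     for j in range(X.shape[1])])

effects = main_effects(X, y)
print(effects)  # for this fabricated data: 0 (water), 3 (massing), 6 (lubrication)
```

The same contrast logic extends to two-factor interactions by replacing the column test with the product of two coded columns, which is how a DOE package screens for the interactions mentioned in the abstract.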
Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS
NASA Astrophysics Data System (ADS)
Joshi, D. M.; Patel, H. K.
2015-10-01
Cryogenic engineering deals with the development and improvement of low-temperature techniques, processes, and equipment. A process simulator such as Aspen HYSYS, used for the design, analysis, and optimization of process plants, has features that accommodate these special requirements and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of components such as heat exchangers, the Joule-Thomson valve, the turboexpander, and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for maximizing the liquefaction yield of the plant under constraints on the other parameters. The analysis results give a clear basis for deciding parameter values before the actual plant is implemented in the field, and also indicate the productivity and profitability of the given plant configuration, leading to the design of an efficient, productive plant.
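The headline figure of merit for a Linde-Hampson liquefier follows from a textbook energy balance on the cold box (heat exchanger, Joule-Thomson valve, and separator): h_in = y*h_liquid + (1 - y)*h_return. A sketch with illustrative air enthalpies (round numbers, not Aspen HYSYS output):

```python
def linde_liquid_yield(h_in, h_return, h_liquid):
    """Fraction of compressed gas liquefied per pass in an ideal
    Linde-Hampson cycle, from the cold-box energy balance:
        h_in = y * h_liquid + (1 - y) * h_return
    =>  y = (h_return - h_in) / (h_return - h_liquid)
    """
    return (h_return - h_in) / (h_return - h_liquid)

# Illustrative enthalpies for air in kJ/kg: compressed gas entering at
# high pressure and ambient temperature, low-pressure return gas at
# ambient temperature, and saturated liquid at the low pressure.
y = linde_liquid_yield(h_in=430.0, h_return=462.0, h_liquid=-126.0)
print(round(y, 4))  # → 0.0544, i.e. roughly 5% liquefied per pass
```

The yield grows with the isothermal enthalpy drop across the compressor (h_return - h_in), which is exactly the quantity a simulator optimizes when it varies the compression pressure.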
M-DAS: System for multispectral data analysis. [in Saginaw Bay, Michigan
NASA Technical Reports Server (NTRS)
Johnson, R. H.
1975-01-01
M-DAS is a ground data processing system designed for analysis of multispectral data. M-DAS operates on multispectral data from LANDSAT, S-192, M2S and other sources in CCT form. Interactive training by operator-investigators using a variable cursor on a color display was used to derive optimum processing coefficients and data on cluster separability. An advanced multivariate normal-maximum likelihood processing algorithm was used to produce output in various formats: color-coded film images, geometrically corrected map overlays, moving displays of scene sections, coverage tabulations and categorized CCTs. The analysis procedure for M-DAS involves three phases: (1) screening and training, (2) analysis of training data to compute performance predictions and processing coefficients, and (3) processing of multichannel input data into categorized results. Typical M-DAS applications involve iteration between each of these phases. A series of photographs of the M-DAS display are used to illustrate M-DAS operation.
Comprehensive Mass Analysis for Chemical Processes, a Case Study on L-Dopa Manufacture
To evaluate the “greenness” of chemical processes in route selection and process development, we propose a comprehensive mass analysis to inform the stakeholders from different fields. This is carried out by characterizing the mass intensity for each contributing chemical or wast...
The Community Innovation Process: A Conceptualization and Empirical Analysis.
ERIC Educational Resources Information Center
Agnew, John A.; And Others
1978-01-01
Previous research into the community innovation process has tended to emphasize either intercommunity communication or local socioeconomic and political factors. This article incorporates both sets of factors in an analysis of urban renewal, public housing, automated data processing by local municipalities, and public water fluoridation.…
DOT National Transportation Integrated Search
1992-06-01
The objectives and scope of this research are to establish an effective methodology for wet weather accident analysis and to develop a database management system to facilitate information processing and storage for the accident analysis process, skid...
Hydrotreater/Distillation Column Hazard Analysis Report Rev. 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
This project Hazard and Risk Analysis Report contains the results of several hazard analyses and risk assessments. An initial assessment was conducted in 2012, which included a multi-step approach ranging from design reviews to a formal What-If hazard analysis. A second What-If hazard analysis was completed during February 2013 to evaluate the operation of the hydrotreater/distillation column processes to be installed in a process enclosure within the Process Development Laboratory West (PDL-West) facility located on the PNNL campus. The qualitative analysis included participation of project and operations personnel and applicable subject matter experts. The analysis identified potential hazardous scenarios, each based on an initiating event coupled with a postulated upset condition. The unmitigated consequences of each hazardous scenario were generally characterized as a process upset; the exposure of personnel to steam, vapors, or hazardous material; a spray or spill of hazardous material; the creation of a flammable atmosphere; or an energetic release from a pressure boundary.
Metacognition and evidence analysis instruction: an educational framework and practical experience.
Parrott, J Scott; Rubinstein, Matthew L
2015-08-21
The role of metacognitive skills in the evidence analysis process has received little attention in the research literature. While the steps of the evidence analysis process are well defined, the role of higher-level cognitive operations (metacognitive strategies) in integrating the steps of the process is not well understood. In part, this is because it is not clear where and how metacognition is implicated in the evidence analysis process nor how these skills might be taught. The purposes of this paper are to (a) suggest a model for identifying critical thinking and metacognitive skills in evidence analysis instruction grounded in current educational theory and research and (b) demonstrate how freely available systematic review/meta-analysis tools can be used to focus on higher-order metacognitive skills, while providing a framework for addressing common student weaknesses. The final goal of this paper is to provide an instructional framework that can generate critique and elaboration while providing the conceptual basis and rationale for future research agendas on this topic.
A method for identifying EMI critical circuits during development of a large C3
NASA Astrophysics Data System (ADS)
Barr, Douglas H.
The circuit analysis methods and process Boeing Aerospace used on a large, ground-based military command, control, and communications (C3) system are described. This analysis was designed to help identify electromagnetic interference (EMI) critical circuits. The methodology used the MIL-E-6051 equipment criticality categories as the basis for defining critical circuits, relational database technology to help sort through and account for all of the approximately 5000 system signal cables, and Macintosh Plus personal computers to predict critical circuits based on safety margin analysis. The EMI circuit analysis process systematically examined all system circuits to identify which ones were likely to be EMI critical. The process used two separate, sequential safety margin analyses to identify critical circuits (conservative safety margin analysis, and detailed safety margin analysis). These analyses used field-to-wire and wire-to-wire coupling models using both worst-case and detailed circuit parameters (physical and electrical) to predict circuit safety margins. This process identified the predicted critical circuits that could then be verified by test.
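The safety-margin screening described above reduces to a simple comparison per circuit: margin = susceptibility threshold minus predicted coupled level, flagged against a required margin (6 dB is the commonly cited MIL-E-6051 requirement for non-ordnance circuits; the circuit names and levels below are hypothetical):

```python
def safety_margin_db(susceptibility_dbm, induced_dbm):
    """EMI safety margin: susceptibility threshold minus predicted coupled level."""
    return susceptibility_dbm - induced_dbm

def is_critical(margin_db, required_db=6.0):
    """Flag a circuit whose margin falls below the required value
    (6 dB assumed here, per the common MIL-E-6051 non-ordnance requirement)."""
    return margin_db < required_db

# Hypothetical circuits: (name, susceptibility threshold dBm, predicted induced dBm).
circuits = [
    ("valve_cmd", -10.0, -30.0),   # 20 dB margin: not critical
    ("pyro_fire", -20.0, -24.0),   # 4 dB margin: critical
]
flagged = [name for name, thr, ind in circuits
           if is_critical(safety_margin_db(thr, ind))]
print(flagged)  # → ['pyro_fire']
```

In the two-pass scheme described in the abstract, a worst-case version of the coupling prediction would run first over all circuits, and only those flagged would receive the detailed (and more expensive) parameter-level analysis.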
Operation, Modeling and Analysis of the Reverse Water Gas Shift Process
NASA Technical Reports Server (NTRS)
Whitlow, Jonathan E.
2001-01-01
The Reverse Water Gas Shift process is a candidate technology for water and oxygen production on Mars under the In-Situ Propellant Production project. This report focuses on the operation and analysis of the Reverse Water Gas Shift (RWGS) process, which has been constructed at Kennedy Space Center. A summary of results from the initial operation of the RWGS process, along with an analysis of these results, is included in this report. In addition, an evaluation of a material balance model developed from work performed previously under the summer program is included, along with recommendations for further experimental work.
NASA Technical Reports Server (NTRS)
Parrish, R. S.; Carter, M. C.
1974-01-01
This analysis utilizes computer simulation and statistical estimation. Realizations of stationary Gaussian stochastic processes with selected autocorrelation functions were computer simulated. Analysis of the simulated data revealed that the mean and the variance of a process were functionally dependent upon the autocorrelation parameter and the crossing level. Using predicted values for the mean and standard deviation, the distribution parameters were estimated by the method of moments. Thus, given the autocorrelation parameter, crossing level, mean, and standard deviation of a process, the probability of exceeding the crossing level for a particular length of time was calculated.
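A stationary Gaussian process with exponential autocorrelation can be simulated as an AR(1) recursion, and level-crossing statistics estimated empirically. This is a minimal sketch of the general technique, assuming a standard-normal marginal, a hypothetical autocorrelation parameter of 0.8, and a crossing level of 1.0 (not the report's specific cases):

```python
import numpy as np

def simulate_ar1(n, rho, rng):
    """Stationary Gaussian AR(1): x[t] = rho*x[t-1] + sqrt(1-rho^2)*eps[t].

    The sqrt(1-rho^2) innovation scaling keeps the marginal distribution
    standard normal, so exceedance fractions are directly comparable to
    the normal tail probability.
    """
    x = np.empty(n)
    x[0] = rng.standard_normal()
    eps = rng.standard_normal(n)
    c = np.sqrt(1 - rho**2)
    for t in range(1, n):
        x[t] = rho * x[t - 1] + c * eps[t]
    return x

rng = np.random.default_rng(42)
x = simulate_ar1(200_000, rho=0.8, rng=rng)
level = 1.0
frac = np.mean(x > level)
print(frac)  # ≈ 1 - Phi(1) ≈ 0.159, since the marginal is standard normal
```

The autocorrelation parameter does not change the fraction of time spent above the level, but it does change how that time clumps into long excursions, which is what the duration-of-exceedance probability in the abstract captures.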
TU-AB-BRD-03: Fault Tree Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunscombe, P.
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do occur, that they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program.
Learning Objectives: (1) Learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are all about; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and the Center for the Assessment of Radiological Sciences; consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
TU-AB-BRD-02: Failure Modes and Effects Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huq, M.
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do occur, that they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program.
Learning Objectives: (1) Learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Disclosures - Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
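As a rough illustration of the risk-ranking step this session covers, the sketch below (with invented process steps and scores) ranks failure modes by the risk priority number RPN = occurrence x severity x detectability used in TG-100-style FMEA:

```python
# Minimal sketch of FMEA risk ranking: each potential failure mode is
# scored for occurrence (O), severity (S), and lack of detectability (D)
# on 1-10 scales, then ranked by RPN = O * S * D. Scores are illustrative.

failure_modes = [
    {"step": "contouring", "mode": "wrong target volume", "O": 4, "S": 9, "D": 6},
    {"step": "planning",   "mode": "wrong dose grid",     "O": 3, "S": 5, "D": 2},
    {"step": "delivery",   "mode": "patient mis-setup",   "O": 5, "S": 7, "D": 3},
]

for fm in failure_modes:
    fm["RPN"] = fm["O"] * fm["S"] * fm["D"]

# Highest-RPN modes are addressed first with additional quality controls.
ranked = sorted(failure_modes, key=lambda fm: fm["RPN"], reverse=True)
for fm in ranked:
    print(fm["step"], fm["mode"], fm["RPN"])
```

In a real TG-100 analysis the scores come from multidisciplinary team consensus, not a script, but the ranking arithmetic is exactly this simple.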
What carries a mediation process? Configural analysis of mediation.
von Eye, Alexander; Mun, Eun Young; Mair, Patrick
2009-09-01
Mediation is a process that links a predictor and a criterion via a mediator variable. Mediation can be full or partial. This well-established definition operates at the level of variables even if they are categorical. In this article, two new approaches to the analysis of mediation are proposed. Both of these approaches focus on the analysis of categorical variables. The first involves mediation analysis at the level of configurations instead of variables. Thus, mediation can be incorporated into the arsenal of methods of analysis for person-oriented research. Second, it is proposed that Configural Frequency Analysis (CFA) can be used for both exploration and confirmation of mediation relationships among categorical variables. The implications of using CFA are first that mediation hypotheses can be tested at the level of individual configurations instead of variables. Second, this approach leaves the door open for different types of mediation processes to exist within the same set. Using a data example, it is illustrated that aggregate-level analysis can overlook mediation processes that operate at the level of individual configurations.
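A minimal sketch of the configural screening idea, assuming invented counts for three binary variables and a first-order CFA base model of variable independence; cells whose observed counts exceed (fall below) their expected counts beyond a z threshold are flagged as types (antitypes):

```python
# Toy first-order Configural Frequency Analysis (CFA): expected cell
# frequencies are computed under variable independence, and each
# configuration is screened with a z-approximation. All counts invented.
from itertools import product
from math import sqrt

# observed counts for three binary variables (predictor, mediator, criterion)
observed = {
    (0, 0, 0): 40, (0, 0, 1): 10, (0, 1, 0): 12, (0, 1, 1): 8,
    (1, 0, 0): 9,  (1, 0, 1): 11, (1, 1, 0): 10, (1, 1, 1): 40,
}
n = sum(observed.values())

def marginal(axis, level):
    """Marginal probability of one variable taking one level."""
    return sum(c for cfg, c in observed.items() if cfg[axis] == level) / n

labels = {}
for cfg in product((0, 1), repeat=3):
    p = 1.0
    for axis, level in enumerate(cfg):   # cell probability under independence
        p *= marginal(axis, level)
    expected = n * p
    z = (observed[cfg] - expected) / sqrt(expected * (1 - p))
    labels[cfg] = "type" if z > 1.96 else ("antitype" if z < -1.96 else "-")

print(labels)
```

In this invented table the cells (0,0,0) and (1,1,1) emerge as types, the configuration-level pattern that an aggregate, variable-level mediation analysis could miss.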
Metabolic profiling of body fluids and multivariate data analysis.
Trezzi, Jean-Pierre; Jäger, Christian; Galozzi, Sara; Barkovits, Katalin; Marcus, Katrin; Mollenhauer, Brit; Hiller, Karsten
2017-01-01
Metabolome analyses of body fluids are challenging due to pre-analytical variations, such as pre-processing delay and temperature, and constant dynamic changes of biochemical processes within the samples. Therefore, proper sample handling, starting from the time of collection up to the analysis, is crucial to obtain high-quality samples and reproducible results. A metabolomics analysis is divided into 4 main steps: 1) sample collection, 2) metabolite extraction, 3) data acquisition and 4) data analysis. Here, we describe a protocol for gas chromatography coupled to mass spectrometry (GC-MS) based metabolic analysis for biological matrices, especially body fluids. This protocol can be applied to blood serum/plasma, saliva and cerebrospinal fluid (CSF) samples of humans and other vertebrates. It covers sample collection, sample pre-processing, metabolite extraction, GC-MS measurement and guidelines for the subsequent data analysis. Advantages of this protocol include:
• Robust and reproducible metabolomics results, taking into account pre-analytical variations that may occur during the sampling process
• Small sample volume required
• Rapid and cost-effective processing of biological samples
• Logistic regression based determination of biomarker signatures for in-depth data analysis.
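The logistic-regression step mentioned in the advantages could look roughly like the following sketch; the metabolite intensities, labels, and single-feature model are invented for illustration and are much simpler than a real biomarker signature:

```python
# Hedged sketch of logistic-regression screening of a candidate
# biomarker: one invented metabolite intensity per sample, fitted with
# plain stochastic gradient descent (no external libraries).
from math import exp

# metabolite intensity (arbitrary units) and disease label (1 = case)
x = [0.2, 0.4, 0.5, 0.9, 1.1, 1.3, 1.6, 1.8]
y = [0,   0,   0,   0,   1,   1,   1,   1]

w, b, lr = 0.0, 0.0, 0.5
for _ in range(5000):                       # gradient ascent on log-likelihood
    for xi, yi in zip(x, y):
        p = 1.0 / (1.0 + exp(-(w * xi + b)))
        w += lr * (yi - p) * xi
        b += lr * (yi - p)

predict = lambda xi: 1.0 / (1.0 + exp(-(w * xi + b)))
# the fitted model separates low from high intensities
print(round(predict(0.3), 3), round(predict(1.5), 3))
```

A real analysis would combine several metabolites, cross-validate, and report odds ratios or ROC curves rather than raw probabilities.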
Comprehensive NMR analysis of compositional changes of black garlic during thermal processing.
Liang, Tingfu; Wei, Feifei; Lu, Yi; Kodani, Yoshinori; Nakada, Mitsuhiko; Miyakawa, Takuya; Tanokura, Masaru
2015-01-21
Black garlic is a processed food product obtained by subjecting whole raw garlic to thermal processing that causes chemical reactions, such as the Maillard reaction, which change the composition of the garlic. In this paper, we report a nuclear magnetic resonance (NMR)-based comprehensive analysis of raw garlic and black garlic extracts to determine the compositional changes resulting from thermal processing. (1)H NMR spectra with a detailed signal assignment showed that 38 components were altered by thermal processing of raw garlic. For example, the contents of 11 l-amino acids increased during the first step of thermal processing over 5 days and then decreased. Multivariate data analysis revealed changes in the contents of fructose, glucose, acetic acid, formic acid, pyroglutamic acid, cycloalliin, and 5-(hydroxymethyl)furfural (5-HMF). Our results provide comprehensive information on changes in NMR-detectable components during thermal processing of whole garlic.
NASA Technical Reports Server (NTRS)
Goldman, H.; Wolf, M.
1979-01-01
Analyses of slicing processes and junction formation processes are presented. A simple method for evaluating the relative economic merits of competing process options with respect to the cost of energy produced by the system is described. An energy consumption analysis was developed and applied to determine the energy consumption in the solar module fabrication process sequence, from the mining of the SiO2 to shipping. The analysis shows that current technology practice involves inordinate energy use in the purification step, large wastage of the invested energy through losses (particularly poor conversion in slicing), and inadequate yields throughout. The cell process energy expenditures already show a downward trend based on increased throughput rates. Larger improvements, however, depend on the introduction of a more efficient purification process and of acceptable ribbon growing techniques.
Inverse Thermal Analysis of Titanium GTA Welds Using Multiple Constraints
NASA Astrophysics Data System (ADS)
Lambrakos, S. G.; Shabaev, A.; Huang, L.
2015-06-01
Inverse thermal analysis of titanium gas-tungsten-arc welds using multiple constraint conditions is presented. This analysis employs a methodology based on numerical-analytical basis functions for inverse thermal analysis of steady-state energy deposition in plate structures. The results of this type of analysis provide parametric representations of weld temperature histories that can be adopted as input data to various types of computational procedures, such as those for prediction of solid-state phase transformations. In addition, these temperature histories can be used to construct parametric function representations for inverse thermal analysis of welds corresponding to other process parameters or welding processes whose process conditions are within similar regimes. The present study applies an inverse thermal analysis procedure that provides for the inclusion of constraint conditions associated with both solidification and phase transformation boundaries.
The Market Responses to the Government Regulation of Chlorinated Solvents: A Policy Analysis
1988-10-01
…in the process of statistical estimation of model parameters. The results of the estimation process applied to chlorinated solvent markets show the… …policy context for this research. Section III provides analysis necessary to understand the chemicals involved, their production processes and costs, and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohan, P.; Yuan, B.; Patterson, T.
2007-11-15
The presence of vanadium, phosphorus, and sodium impurities in petcoke and coal/petcoke blends used in integrated gasification combined cycle (IGCC) plants warrants a clear understanding of high-temperature material degradation for the development of fuel-flexible gas turbines. In this study, degradation reactions of free-standing air plasma-sprayed (APS) yttria-stabilized zirconia (YSZ) in contact with V2O5, P2O5, and Na2SO4 were investigated at temperatures up to 1200 °C. Phase transformations and microstructural development were examined using X-ray diffraction, scanning electron microscopy, and transmission electron microscopy. Molten V2O5 reacted with solid YSZ to form ZrV2O7 at temperatures below 747 °C. However, at temperatures above 747 °C, molten V2O5 reacted with YSZ to form yttrium vanadate (YVO4). The formation of YVO4 led to the depletion of the Y2O3 stabilizer and deleterious transformation to the monoclinic ZrO2 phase. In addition, studies on YSZ degradation by Na2SO4 and a Na2SO4 + V2O5 mixture (50-50 mol%) showed that Na2SO4 itself had no effect on the degradation of YSZ. However, in the presence of V2O5 at high temperatures, Na2SO4 forms vanadate compounds having lower melting points, such as sodium metavanadate (610 °C), which was found to degrade YSZ by the formation of YVO4 at a relatively lower temperature of 700 °C. P2O5 was found to react with APS YSZ by the formation of ZrP2O7 at all the temperatures studied. At temperatures as low as 200 °C and as high as 1200 °C, molten P2O5 was observed to react with solid YSZ to yield ZrP2O7, which led to the depletion of ZrO2 in YSZ and promoted the formation of the fluorite-cubic ZrO2 phase.
29 CFR 1910.119 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2011 CFR
2011-07-01
... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...
29 CFR 1910.119 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2010 CFR
2010-07-01
... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...
Optimization, an Important Stage of Engineering Design
ERIC Educational Resources Information Center
Kelley, Todd R.
2010-01-01
A number of leaders in technology education have indicated that a major difference between the technological design process and the engineering design process is analysis and optimization. The analysis stage of the engineering design process is when mathematical models and scientific principles are employed to help the designer predict design…
The Impact of Meaning and Dimensionality on Copying Accuracy in Individuals with Autism
ERIC Educational Resources Information Center
Sheppard, Elizabeth; Ropar, Danielle; Mitchell, Peter
2007-01-01
Weak Central Coherence (Frith, 1989) predicts that, in autism, perceptual processing is relatively unaffected by conceptual analysis. Enhanced Perceptual Functioning (Mottron & Burack, 2001) predicts that the perceptual processing of those with autism is less influenced by conceptual analysis only when higher-level processing is detrimental to…
Encapsulation Processing and Manufacturing Yield Analysis
NASA Technical Reports Server (NTRS)
Willis, P. B.
1984-01-01
The development of encapsulation processing and a manufacturing productivity analysis for photovoltaic cells are discussed. The goals were: (1) to understand the relationships between both formulation variables and process variables; (2) to define conditions required for optimum performance; (3) to predict manufacturing yield; and (4) to provide documentation to industry.
Vygotsky's Analysis of Children's Meaning Making Processes
ERIC Educational Resources Information Center
Mahn, Holbrook
2012-01-01
Vygotsky's work is extensive and covers many aspects of the development of children's meaning-making processes in social and cultural contexts. However, his main focus is on the examination of the unification of speaking and thinking processes. His investigation centers on the analysis of the entity created by this unification--an internal…
Encapsulation processing and manufacturing yield analysis
NASA Astrophysics Data System (ADS)
Willis, P. B.
1984-10-01
The development of encapsulation processing and a manufacturing productivity analysis for photovoltaic cells are discussed. The goals were: (1) to understand the relationships between both formulation variables and process variables; (2) to define conditions required for optimum performance; (3) to predict manufacturing yield; and (4) to provide documentation to industry.
Model prototype utilization in the analysis of fault tolerant control and data processing systems
NASA Astrophysics Data System (ADS)
Kovalev, I. V.; Tsarev, R. Yu; Gruzenkin, D. V.; Prokopenko, A. V.; Knyazkov, A. N.; Laptenok, V. D.
2016-04-01
The paper presents a procedure for assessing the profit of implementing a control and data processing system. The creation and analysis of a model prototype is justified by the approach of providing fault tolerance through the inclusion of structural and software redundancy. The developed procedure allows finding the best ratio between the cost of developing and analyzing the model prototype and the earnings from its utilization and the information it produces. The suggested approach is illustrated by a model example of profit assessment and analysis for a control and data processing system.
Language-Based Curriculum Analysis: A Collaborative Assessment and Intervention Process.
ERIC Educational Resources Information Center
Prelock, Patricia A.
1997-01-01
Presents a systematic process for completing a language-based curriculum analysis to address curriculum expectations that may challenge students with communication impairments. Analysis of vocabulary and the demands for comprehension, oral, and written expression within specific content areas provides a framework for collaboration between teachers…
Theories of State Analyzing the Policy Process,
1973-11-01
values and goals - which is the heart of the rational process-- in reality cannot be separated from the actor’s empirical analysis of the situation...rigorous and objective in analysis . How different would our foreign policy actually be? Would it necessarily be better? In fact, would one even need...State, but the fact is that much of the outside research and analysis of policy process is pointed at the 6 As Robert Rothstein says in his valuable
Space processing applications payload equipment study. Volume 2A: Experiment requirements
NASA Technical Reports Server (NTRS)
Smith, A. G.; Anderson, W. T., Jr.
1974-01-01
An analysis of the space processing applications payload equipment was conducted. The primary objective was to perform a review and an update of the space processing activity research equipment requirements and specifications that were derived in the first study. The analysis is based on the six major experimental classes of: (1) biological applications, (2) chemical processes in fluids, (3) crystal growth, (4) glass technology, (5) metallurgical processes, and (6) physical processes in fluids. Tables of data are prepared to show the functional requirements for the areas of investigation.
Risk analysis within environmental impact assessment of proposed construction activity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeleňáková, Martina; Zvijáková, Lenka
Environmental impact assessment is an important process, prior to approval of the investment plan, providing a detailed examination of the likely and foreseeable impacts of proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of environmental impacts of selected constructions (flood protection structures) using risk analysis methods. The application of the methodology designed for the environmental impact assessment process will develop assumptions for further improvements or more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process. Through the use of risk analysis methods in the environmental impact assessment process, the set objective has been achieved. - Highlights: This paper is informed by an effort to develop research with the aim of:
• Improving existing qualitative and quantitative methods for assessing the impacts
• A better understanding of relations between probabilities and consequences
• Methodology for the EIA of flood protection constructions based on risk analysis
• Creative approaches in the search for environmentally friendly proposed activities.
Optimisation of warpage on plastic injection moulding part using response surface methodology (RSM)
NASA Astrophysics Data System (ADS)
Miza, A. T. N. A.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Rashidi, M. M.
2017-09-01
Warpage is often encountered during the injection moulding of thin-shell parts, depending on the process conditions. Integrating finite element (FE) analysis, Moldflow analysis, and response surface methodology (RSM) within a statistical design-of-experiments approach, the warpage values in x, y, and z on thin-shell plastic parts were investigated and minimized. A battery cover of a remote controller, a thin-shell plastic part produced by injection moulding, served as the case study. The optimum process conditions were determined so as to minimize warpage. Four parameters were considered in this study: packing pressure, cooling time, melt temperature, and mould temperature. A two-level full factorial experimental design was constructed in Design-Expert for the RSM analysis to combine these parameters. Analysis of variance (ANOVA) of the FE results identified the process parameters that most strongly influenced warpage. Using RSM, a predictive response surface model for the warpage data is presented.
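As a hedged sketch of the screening stage described above, the following example runs an invented warpage model over a two-level full factorial in the four named parameters and estimates each main effect; the coefficients are made up and do not come from the paper:

```python
# Two-level full factorial screening: each main effect is estimated as
# the mean response at the factor's high level minus the mean at its
# low level. The warpage model below is invented for illustration.
from itertools import product

factors = ["melt_temp", "mould_temp", "packing_pressure", "cooling_time"]
runs = list(product((-1, 1), repeat=4))          # 2^4 = 16 coded runs

# invented warpage (mm): dominated by packing pressure and melt temperature
def warpage(run):
    mt, mo, pp, ct = run
    return 0.30 - 0.08 * pp + 0.05 * mt - 0.01 * ct + 0.02 * mt * pp

effects = {}
for i, name in enumerate(factors):
    hi = [warpage(r) for r in runs if r[i] == 1]
    lo = [warpage(r) for r in runs if r[i] == -1]
    effects[name] = sum(hi) / len(hi) - sum(lo) / len(lo)

for name, eff in sorted(effects.items(), key=lambda kv: -abs(kv[1])):
    print(f"{name}: {eff:+.3f}")
```

In the real study the responses come from Moldflow/FE simulations rather than a formula, and ANOVA then attaches significance levels to these effect estimates before a quadratic response surface is fitted.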
Zhang, Hanyuan; Tian, Xuemin; Deng, Xiaogang; Cao, Yuping
2018-05-16
As an attractive nonlinear dynamic data analysis tool, global preserving kernel slow feature analysis (GKSFA) has achieved great success in extracting the high nonlinearity and inherently time-varying dynamics of batch processes. However, GKSFA is an unsupervised feature extraction method and lacks the ability to utilize batch process class label information, which may not offer the most effective means for dealing with batch process monitoring. To overcome this problem, we propose a novel batch process monitoring method based on the modified GKSFA, referred to as discriminant global preserving kernel slow feature analysis (DGKSFA), by closely integrating discriminant analysis and GKSFA. The proposed DGKSFA method can extract discriminant features of a batch process as well as preserve the global and local geometrical structure information of the observed data. For the purpose of fault detection, a monitoring statistic is constructed based on the distance between the optimal kernel feature vectors of test data and normal data. To tackle the challenging issue of nonlinear fault variable identification, a new nonlinear contribution plot method is also developed to help identify the fault variable after a fault is detected, which is derived from the idea of variable pseudo-sample trajectory projection in the DGKSFA nonlinear biplot. Simulation results conducted on a numerical nonlinear dynamic system and the benchmark fed-batch penicillin fermentation process demonstrate that the proposed process monitoring and fault diagnosis approach can effectively detect faults and distinguish fault variables from normal variables. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
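A toy sketch of the distance-type monitoring statistic described above, with the DGKSFA feature extraction replaced by the identity for brevity and all feature values invented:

```python
# Distance-based fault detection sketch: an alarm is raised when a test
# sample's feature vector drifts too far from the centre of features
# collected under normal operation. Real use would first map samples
# through the (kernel) feature extractor; here features are used as-is.
from math import sqrt

normal = [[0.9, 1.1], [1.0, 1.0], [1.1, 0.9], [1.0, 1.1]]  # invented features
center = [sum(col) / len(normal) for col in zip(*normal)]

def stat(x):
    """Euclidean distance of a sample's features from the normal centre."""
    return sqrt(sum((xi - ci) ** 2 for xi, ci in zip(x, center)))

# illustrative control limit: 3x the largest distance seen under normal operation
limit = 3 * max(stat(x) for x in normal)

print(stat([1.0, 1.05]) <= limit)   # normal-looking sample
print(stat([2.5, 0.1]) <= limit)    # faulty-looking sample
```

In practice the control limit would be set from the statistic's estimated distribution (e.g. by kernel density estimation) rather than an ad-hoc multiplier.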
Khaligh-Razavi, Seyed-Mahdi; Cichy, Radoslaw Martin; Pantazis, Dimitrios; Oliva, Aude
2018-06-07
Animacy and real-world size are properties that describe any object and thus bring basic order into our perception of the visual world. Here, we investigated how the human brain processes real-world size and animacy. For this, we applied representational similarity to fMRI and MEG data to yield a view of brain activity with high spatial and temporal resolutions, respectively. Analysis of fMRI data revealed that a distributed and partly overlapping set of cortical regions extending from occipital to ventral and medial temporal cortex represented animacy and real-world size. Within this set, parahippocampal cortex stood out as the region representing animacy and size stronger than most other regions. Further analysis of the detailed representational format revealed differences among regions involved in processing animacy. Analysis of MEG data revealed overlapping temporal dynamics of animacy and real-world size processing starting at around 150 msec and provided the first neuromagnetic signature of real-world object size processing. Finally, to investigate the neural dynamics of size and animacy processing simultaneously in space and time, we combined MEG and fMRI with a novel extension of MEG-fMRI fusion by representational similarity. This analysis revealed partly overlapping and distributed spatiotemporal dynamics, with parahippocampal cortex singled out as a region that represented size and animacy persistently when other regions did not. Furthermore, the analysis highlighted the role of early visual cortex in representing real-world size. A control analysis revealed that the neural dynamics of processing animacy and size were distinct from the neural dynamics of processing low-level visual features. Together, our results provide a detailed spatiotemporal view of animacy and size processing in the human brain.
Rouhani, M H; Salehi-Abargouei, A; Surkan, P J; Azadbakht, L
2014-09-01
A body of literature exists regarding the association of red and processed meats with obesity; however, the nature and extent of this relation has not been clearly established. The aim of this study is to conduct a systematic review and meta-analysis of the relationship between red and processed meat intake and obesity. We searched multiple electronic databases for observational studies on the relationship between red and processed meat intake and obesity published until July 2013. Odds ratios (ORs) and means for obesity-related indices and for variables that may contribute to heterogeneity were calculated. A systematic review and a meta-analysis were conducted with 21 and 18 studies, respectively (n = 1,135,661). The meta-analysis (n = 113,477) showed that consumption of higher quantities of red and processed meats was a risk factor for obesity (OR: 1.37; 95% CI: 1.14-1.64). Pooled mean body mass index (BMI) and waist circumference (WC) trends showed that in comparison to those in the lowest ntile, subjects in the highest ntile of red and processed meat consumption had higher BMI (mean difference: 1.37; 95% CI: 0.90-1.84 for red meat; mean difference: 1.32; 95% CI: 0.64-2.00 for processed meat) and WC (mean difference: 2.79; 95% CI: 1.86-3.70 for red meat; mean difference: 2.77; 95% CI: 1.87-2.66 for processed meat). The current analysis revealed that red and processed meat intake is directly associated with risk of obesity, and higher BMI and WC. However, the heterogeneity among studies is significant. These findings suggest a decrease in red and processed meat intake. © 2014 The Authors. obesity reviews © 2014 World Obesity.
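For readers unfamiliar with how a pooled OR such as the 1.37 above is produced, the following sketch shows standard inverse-variance (fixed-effect) pooling of log odds ratios; the three studies are invented and are not those in the review:

```python
# Inverse-variance pooling: each study's log-OR is weighted by the
# reciprocal of its variance, with the SE recovered from the reported
# 95% CI. Study values below are invented for illustration.
from math import log, exp, sqrt

# (OR, lower 95% CI, upper 95% CI) per study
studies = [(1.20, 0.95, 1.52), (1.50, 1.10, 2.05), (1.35, 1.05, 1.74)]

num = den = 0.0
for or_, lo, hi in studies:
    se = (log(hi) - log(lo)) / (2 * 1.96)   # SE of log-OR from CI width
    w = 1.0 / se ** 2                        # inverse-variance weight
    num += w * log(or_)
    den += w

pooled = exp(num / den)
ci = (exp(num / den - 1.96 / sqrt(den)), exp(num / den + 1.96 / sqrt(den)))
print(round(pooled, 2), [round(c, 2) for c in ci])
```

A review reporting significant heterogeneity, as this one does, would normally move to a random-effects model (e.g. DerSimonian-Laird), which widens the weights by a between-study variance term.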
Statistical quality control through overall vibration analysis
NASA Astrophysics Data System (ADS)
Carnero, Mª Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos
2010-05-01
The present study introduces the concept of statistical quality control in automotive wheel bearing manufacturing processes. Defects on products under analysis can have a direct influence on passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes likely occur under certain operating conditions and do not require high vibration amplitudes but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, an evaluation of the quality results of the finished parts under different combinations of process variables is assessed. This paper intends to establish the foundations to predict the quality of the products through the analysis of self-induced vibrations during the contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follows a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tools to support the conclusions, such as the chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley and Kruskal-Wallis tests. The analysis presented is the starting point to extend the use of predictive techniques (vibration analysis) for quality control.
This paper demonstrates the existence of predictive variables (high-frequency vibration displacements) that are sensitive to the process setup and the quality of the products obtained. Based on the results of this overall vibration analysis, a second paper will analyse self-induced vibration spectra in order to define limit vibration bands, controllable every cycle or connected to permanent vibration-monitoring systems able to adjust sensitive process variables identified by ANOVA once the vibration readings exceed established quality limits.
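The ANOVA screening described above can be sketched as follows, with invented overall-vibration readings grouped by one hypothetical process variable:

```python
# One-way ANOVA by hand: overall vibration readings (invented, mm/s)
# grouped by a hypothetical grinding-wheel dressing interval, tested
# for a difference in group means via the F statistic.
groups = {
    "short":  [1.1, 1.3, 1.2, 1.4],
    "medium": [1.6, 1.5, 1.7, 1.8],
    "long":   [2.4, 2.6, 2.3, 2.7],
}

all_vals = [v for g in groups.values() for v in g]
grand = sum(all_vals) / len(all_vals)

ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups.values())
ss_within = sum((v - sum(g) / len(g)) ** 2 for g in groups.values() for v in g)

df_b, df_w = len(groups) - 1, len(all_vals) - len(groups)
f_stat = (ss_between / df_b) / (ss_within / df_w)
print(round(f_stat, 1))   # compare against F(2, 9), about 4.26 at the 5% level
```

The paper's workflow additionally checks normality and homoscedasticity (Shapiro-Wilk, Bartlett, etc.) before trusting the F test, and falls back to Kruskal-Wallis when those assumptions fail.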
Gourley, Paul L.; Gourley, Mark F.
1997-01-01
An apparatus and method for microscopic and spectroscopic analysis and processing of biological cells. The apparatus comprises a laser having an analysis region within the laser cavity for containing one or more biological cells to be analyzed. The presence of a cell within the analysis region in superposition with an activated portion of a gain medium of the laser acts to encode information about the cell upon the laser beam, the cell information being recoverable by an analysis means that preferably includes an array photodetector such as a CCD camera and a spectrometer. The apparatus and method may be used to analyze biomedical cells including blood cells and the like, and may include processing means for manipulating, sorting, or eradicating cells after analysis thereof.
Gourley, P.L.; Gourley, M.F.
1997-03-04
An apparatus and method are disclosed for microscopic and spectroscopic analysis and processing of biological cells. The apparatus comprises a laser having an analysis region within the laser cavity for containing one or more biological cells to be analyzed. The presence of a cell within the analysis region in superposition with an activated portion of a gain medium of the laser acts to encode information about the cell upon the laser beam, the cell information being recoverable by an analysis means that preferably includes an array photodetector such as a CCD camera and a spectrometer. The apparatus and method may be used to analyze biomedical cells including blood cells and the like, and may include processing means for manipulating, sorting, or eradicating cells after analysis. 20 figs.
Child versus adult psychoanalysis: two processes or one?
Sugarman, Alan
2009-12-01
Child analysis continues to be seen as a different technique from adult analysis because children are still involved in a developmental process and because the primary objects continue to play active roles in their lives. This paper argues that this is a false dichotomy. An extended vignette of the analysis of a latency-aged girl is used to demonstrate that the psychoanalytic process that develops in child analysis is structurally the same as that in adult analysis. Both revolve around the analysis of resistance and transference and use both to promote knowledge of the patient's mind at work. And both techniques formulate interventions based on the analyst's appraisal of the patient's mental organization. It is hoped that stressing the essential commonality of both techniques will promote the development of an overarching theory of psychoanalytic technique.
An application of computer aided requirements analysis to a real time deep space system
NASA Technical Reports Server (NTRS)
Farny, A. M.; Morris, R. V.; Hartsough, C.; Callender, E. D.; Teichroew, D.; Chikofsky, E.
1981-01-01
The entire procedure of incorporating the requirements and goals of a space flight project into integrated, time-ordered sequences of spacecraft commands is called the uplink process. The Uplink Process Control Task (UPCT) was created to examine the uplink process and determine ways to improve it. The Problem Statement Language/Problem Statement Analyzer (PSL/PSA), designed to assist the designer/analyst/engineer in the preparation of specifications of an information system, is used as a supporting tool to aid in the analysis. Attention is given to a definition of the uplink process, the definition of PSL/PSA, the construction of a PSA database, the value of analysis to the study of the uplink process, and the PSL/PSA lessons learned.
Expert system for web based collaborative CAE
NASA Astrophysics Data System (ADS)
Hou, Liang; Lin, Zusheng
2006-11-01
An expert system for web based collaborative CAE was developed based on knowledge engineering, a relational database and commercial FEA (finite element analysis) software. The architecture of the system is illustrated. In this system, the experts' experiences, theories, typical examples and other related knowledge, which are used in the pre-processing stage of FEA, were categorized into analysis process knowledge and object knowledge. Then, the integrated knowledge model based on the object-oriented method and the rule based method is described. The integrated reasoning process based on CBR (case based reasoning) and rule based reasoning is presented. Finally, the analysis process of this expert system in a web based CAE application is illustrated, and an analysis example of a machine tool's column is given to demonstrate the validity of the system.
Choosing order of operations to accelerate strip structure analysis in parameter range
NASA Astrophysics Data System (ADS)
Kuksenko, S. P.; Akhunov, R. R.; Gazizov, T. R.
2018-05-01
The paper considers the issue of using iterative methods to solve the sequence of linear algebraic systems obtained in the quasistatic analysis of strip structures with the method of moments. Through the analysis of 4 strip structures, the authors show that additional acceleration (up to 2.21 times) of the iterative process can be obtained when solving linear systems repeatedly by choosing a proper order of operations and a preconditioner. The obtained results can be used to accelerate the computer-aided design of various strip structures. The choice of the order of operations to accelerate the process is quite simple and universal, and could be used not only for strip structure analysis but also for a wide range of computational problems.
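The kind of repeated solve the paper accelerates can be sketched as follows: a Jacobi preconditioner (a deliberately simple stand-in for the preconditioners studied) is built once and reused across a sequence of slightly perturbed systems; the matrices are invented toy examples:

```python
# Preconditioned conjugate gradient (PCG) for small dense SPD systems,
# reusing one Jacobi preconditioner across a sequence of related
# systems instead of rebuilding it per solve. All matrices invented.

def cg(A, b, M_inv, tol=1e-10, max_it=200):
    """PCG for dense SPD A (list of lists); M_inv is a diagonal preconditioner."""
    n = len(b)
    x = [0.0] * n
    r = b[:]
    z = [M_inv[i] * r[i] for i in range(n)]
    p = z[:]
    rz = sum(ri * zi for ri, zi in zip(r, z))
    for it in range(1, max_it + 1):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rz / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        if max(abs(ri) for ri in r) < tol:
            return x, it
        z = [M_inv[i] * r[i] for i in range(n)]
        rz_new = sum(ri * zi for ri, zi in zip(r, z))
        p = [zi + (rz_new / rz) * pi for zi, pi in zip(z, p)]
        rz = rz_new
    return x, max_it

base = [[4.0, 1.0, 0.0], [1.0, 5.0, 1.0], [0.0, 1.0, 6.0]]
M_inv = [1.0 / base[i][i] for i in range(3)]   # Jacobi preconditioner, built once

for shift in (0.0, 0.1, 0.2):                  # sequence of perturbed systems
    A = [[base[i][j] + (shift if i == j else 0.0) for j in range(3)]
         for i in range(3)]
    x, its = cg(A, [1.0, 2.0, 3.0], M_inv)
    print(its, [round(v, 4) for v in x])
```

The paper's systems come from a method-of-moments discretization and use stronger preconditioners, but the reuse pattern, factor or build once, apply many times, is the same.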
Second-Order Analysis of Semiparametric Recurrent Event Processes
Guan, Yongtao
2011-01-01
A typical recurrent event dataset consists of an often large number of recurrent event processes, each of which contains multiple event times observed from an individual during a follow-up period. Such data have become increasingly available in medical and epidemiological studies. In this paper, we introduce novel procedures to conduct second-order analysis for a flexible class of semiparametric recurrent event processes. Such an analysis can provide useful information regarding the dependence structure within each recurrent event process. Specifically, we will use the proposed procedures to test whether the individual recurrent event processes are all Poisson processes and to suggest sensible alternative models for them if they are not. We apply these procedures to a well-known recurrent event dataset on chronic granulomatous disease and an epidemiological dataset on meningococcal disease cases in Merseyside, UK to illustrate their practical value. PMID:21361885
Quantifiable and objective approach to organizational performance enhancement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scholand, Andrew Joseph; Tausczik, Yla R.
This report describes a new methodology, social language network analysis (SLNA), that combines tools from social language processing and network analysis to identify socially situated relationships between individuals which, though subtle, are highly influential. Specifically, SLNA aims to identify and characterize the nature of working relationships by processing artifacts generated with computer-mediated communication systems, such as instant message texts or emails. Because social language processing is able to identify psychological, social, and emotional processes that individuals are not able to fully mask, social language network analysis can clarify and highlight complex interdependencies between group members, even when these relationships are latent or unrecognized. This report outlines the philosophical antecedents of SLNA, the mechanics of preprocessing, processing, and post-processing stages, and some example results obtained by applying this approach to a 15-month corporate discussion archive.
Dai, Heng; Ye, Ming; Walker, Anthony P.; ...
2017-03-28
A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
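A minimal numerical sketch of such a process sensitivity index follows, with invented models, weights, and parameter ranges (none of these come from the study itself): each process varies over its competing models and their parameters together, and the index is the share of output variance explained by one process.

```python
# Toy sketch of a process sensitivity index in the spirit described above.
# Each process has two competing models with weights and its own random
# parameter; the index of a process is Var(E[Y | process]) / Var(Y), with
# the process varying over both model choice and parameter value.
import random

random.seed(0)

# Recharge process: two invented competing models of recharge from precipitation p.
recharge_models = [lambda p: 0.20 * p, lambda p: 0.10 * p + 30.0]
recharge_weights = [0.5, 0.5]
# Geology process: two invented competing models of hydraulic conductivity.
geology_models = [lambda k: 5.0 + k, lambda k: 8.0 + 0.5 * k]
geology_weights = [0.5, 0.5]

def sample_process(models, weights, scale):
    """Draw (model index, parameter) for one process."""
    m = random.choices(range(len(models)), weights=weights)[0]
    return m, random.uniform(0.0, scale)

def output(rm, p, gm, k):
    # Invented response: head change ~ recharge / conductivity.
    return recharge_models[rm](p) / geology_models[gm](k)

def process_index(n_outer=300, n_inner=300):
    """First-order index of the recharge process (models + parameters)."""
    cond_means, all_y = [], []
    for _ in range(n_outer):
        rm, p = sample_process(recharge_models, recharge_weights, 100.0)
        ys = []
        for _ in range(n_inner):
            gm, k = sample_process(geology_models, geology_weights, 10.0)
            ys.append(output(rm, p, gm, k))
        cond_means.append(sum(ys) / n_inner)
        all_y.extend(ys)
    mean_y = sum(all_y) / len(all_y)
    var_y = sum((y - mean_y) ** 2 for y in all_y) / len(all_y)
    mean_cm = sum(cond_means) / n_outer
    var_cm = sum((c - mean_cm) ** 2 for c in cond_means) / n_outer
    return var_cm / var_y

S_recharge = process_index()
print(round(S_recharge, 2))
```

The between-model difference in the two recharge models dominates the output variance here, so the recharge process receives a large index, illustrating how model uncertainty (not just parameter uncertainty) enters the measure.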
A State Space Modeling Approach to Mediation Analysis
ERIC Educational Resources Information Center
Gu, Fei; Preacher, Kristopher J.; Ferrer, Emilio
2014-01-01
Mediation is a causal process that evolves over time. Thus, a study of mediation requires data collected throughout the process. However, most applications of mediation analysis use cross-sectional rather than longitudinal data. Another implicit assumption commonly made in longitudinal designs for mediation analysis is that the same mediation…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-18
... technical analysis submitted for parallel-processing by DNREC on December 9, 2010, to address significant... technical analysis submitted by DNREC for parallel-processing on December 9, 2010, to satisfy the... consists of a technical analysis that provides detailed support for Delaware's position that it has...
Image Analysis, Microscopic, and Spectrochemical Study of the PVC Dry Blending Process,
The dry blending process used in the production of electrical-grade PVC formulations has been studied using a combination of image analysis, microscopic...by image analysis techniques. Optical and scanning electron microscopy were used to assess morphological differences. Spectrochemical techniques were used to indicate chemical changes.
Processing Cones: A Computational Structure for Image Analysis.
1981-12-01
A computational structure for image analysis applications, referred to as a processing cone, is described and sample algorithms are presented. A fundamental characteristic of the structure is its hierarchical organization into two-dimensional arrays of decreasing resolution. In this architecture, a prototypical function is defined on a local window of data and applied uniformly to all windows in a parallel manner. Three basic modes of processing are supported in the cone: reduction operations (upward processing), horizontal operations (processing at a single level), and projection operations (downward processing).
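The reduction (upward) mode described above can be sketched directly; the 2x2 mean used here is just one choice of the local window function, and the image values are invented:

```python
# Minimal sketch of the "reduction" (upward) mode of a processing cone: each
# level halves the resolution by applying one local function (here a 2x2
# mean) uniformly over all windows, down to a single apex cell.

def reduce_level(image):
    """One upward step: 2x2 block averaging (dimensions must be even)."""
    h, w = len(image), len(image[0])
    return [[(image[2 * i][2 * j] + image[2 * i][2 * j + 1] +
              image[2 * i + 1][2 * j] + image[2 * i + 1][2 * j + 1]) / 4.0
             for j in range(w // 2)]
            for i in range(h // 2)]

def build_cone(image):
    """Hierarchy of decreasing-resolution arrays, down to a single cell."""
    levels = [image]
    while len(levels[-1]) > 1:
        levels.append(reduce_level(levels[-1]))
    return levels

base = [[0.0, 0.0, 8.0, 8.0],
        [0.0, 0.0, 8.0, 8.0],
        [2.0, 2.0, 6.0, 6.0],
        [2.0, 2.0, 6.0, 6.0]]
cone = build_cone(base)
print(len(cone), cone[-1])   # three levels; the apex holds the global mean
```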
Lab-on-a-chip based total-phosphorus analysis device utilizing a photocatalytic reaction
NASA Astrophysics Data System (ADS)
Jung, Dong Geon; Jung, Daewoong; Kong, Seong Ho
2018-02-01
A lab-on-a-chip (LOC) device for total phosphorus (TP) analysis was fabricated for water quality monitoring. Many commercially available TP analysis systems used to estimate water quality have good sensitivity and accuracy. However, these systems also have many disadvantages such as bulky size, complex pretreatment processes, and high cost, which limit their application. In particular, conventional TP analysis systems require an indispensable pretreatment step, in which the fluidic analyte is heated to 120 °C for 30 min to release the dissolved phosphate, because many phosphates are soluble in water at a standard temperature and pressure. In addition, this pretreatment process requires elevated pressures of up to 1.1 kg cm-2 in order to prevent the evaporation of the heated analyte. Because of these limiting conditions required by the pretreatment processes used in conventional systems, it is difficult to miniaturize TP analysis systems. In this study, we employed a photocatalytic reaction in the pretreatment process. The reaction was carried out by illuminating a photocatalytic titanium dioxide (TiO2) surface formed in a microfluidic channel with ultraviolet (UV) light. This pretreatment process does not require elevated temperatures and pressures. By applying this simplified, photocatalytic-reaction-based pretreatment process to a TP analysis system, greater degrees of freedom are conferred to the design and fabrication of LOC devices for TP monitoring. The fabricated LOC device presented in this paper was characterized by measuring the TP concentration of an unknown sample, and comparing the results with those measured by a conventional TP analysis system. The TP concentrations of the unknown sample measured by the proposed LOC device and the conventional TP analysis system were 0.018 mgP/25 mL and 0.019 mgP/25 mL, respectively. 
The experimental results revealed that the proposed LOC device had a performance comparable to the conventional bulky TP analysis system. Therefore, our device could be directly employed in water quality monitoring as an alternative to conventional TP analysis systems.
Anima: Modular Workflow System for Comprehensive Image Data Analysis
Rantanen, Ville; Valori, Miko; Hautaniemi, Sampsa
2014-01-01
Modern microscopes produce vast amounts of image data, and computational methods are needed to analyze and interpret these data. Furthermore, a single image analysis project may require tens or hundreds of analysis steps, starting from data import and pre-processing to segmentation and statistical analysis, and ending with visualization and reporting. To manage such large-scale image data analysis projects, we present here a modular workflow system called Anima. Anima is designed for comprehensive and efficient image data analysis development, and it contains several features that are crucial in high-throughput image data analysis: programming language independence, batch processing, easily customized data processing, interoperability with other software via application programming interfaces, and advanced multivariate statistical analysis. The utility of Anima is shown with two case studies focusing on testing different algorithms developed in different imaging platforms and an automated prediction of alive/dead C. elegans worms by integrating several analysis environments. Anima is fully open source and available with documentation at www.anduril.org/anima. PMID:25126541
Spectroscopic analysis technique for arc-welding process control
NASA Astrophysics Data System (ADS)
Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel
2005-09-01
The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. In particular, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), an early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing an automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding, using fiber-optic capture of light and a low-cost CCD-based spectrometer, show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
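A hedged sketch of sub-pixel peak location: the LPO algorithm itself is not reproduced here; instead, a standard three-point parabolic interpolation (an assumption, not the paper's method) shows how a peak's central wavelength can be estimated below the pixel spacing of a spectrometer. The line center and sampling grid are synthetic.

```python
# Hedged sketch: a standard three-point parabolic interpolation stands in
# for the paper's LPO sub-pixel algorithm. It locates the central wavelength
# of the strongest peak to sub-pixel accuracy from a sampled spectrum.
import math

def subpixel_peak(wavelengths, intensities):
    """Return the interpolated central wavelength of the highest peak."""
    i = max(range(1, len(intensities) - 1), key=lambda k: intensities[k])
    y0, y1, y2 = intensities[i - 1], intensities[i], intensities[i + 1]
    # Vertex of the parabola through the three samples, in pixel units.
    delta = 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)
    step = wavelengths[i + 1] - wavelengths[i]
    return wavelengths[i] + delta * step

# Synthetic Gaussian emission line centred at 540.23 nm on a 0.5 nm grid.
center = 540.23
wl = [535.0 + 0.5 * k for k in range(21)]
spec = [math.exp(-((w - center) / 0.8) ** 2) for w in wl]

est = subpixel_peak(wl, spec)
print(round(est, 2))
```

With a 0.5 nm pixel pitch, the interpolated estimate lands within a few hundredths of a nanometre of the true line center, which is the kind of accuracy needed to identify atomic species automatically.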
Emotion processing in the visual brain: a MEG analysis.
Peyk, Peter; Schupp, Harald T; Elbert, Thomas; Junghöfer, Markus
2008-06-01
Recent functional magnetic resonance imaging (fMRI) and event-related brain potential (ERP) studies provide empirical support for the notion that emotional cues guide selective attention. Extending this line of research, the whole-head magnetoencephalogram (MEG) was recorded while participants viewed, in separate experimental blocks, a continuous stream of either pleasant and neutral or unpleasant and neutral pictures, presented for 330 ms each. Event-related magnetic fields (ERF) were analyzed after intersubject sensor coregistration, complemented by minimum norm estimates (MNE) to explore neural generator sources. Both streams of analysis converge in demonstrating selective emotion processing in an early (120-170 ms) and a late time interval (220-310 ms). ERF analysis revealed that the polarity of the emotion difference fields was reversed across early and late intervals, suggesting distinct patterns of activation in the visual processing stream. Source analysis revealed amplified processing of emotional pictures in visual processing areas, with more pronounced occipito-parieto-temporal activation in the early time interval and a stronger engagement of more anterior, temporal regions in the later interval. Confirming previous ERP studies showing facilitated emotion processing, the present data suggest that MEG provides a complementary look at the spread of activation in the visual processing stream.
Deng, Bo; Shi, Yaoyao; Yu, Tao; Kang, Chao; Zhao, Pan
2018-01-31
The composite tape winding process, which utilizes a tape winding machine and prepreg tapes, provides a promising way to improve the quality of composite products. Nevertheless, the process parameters of composite tape winding have crucial effects on the tensile strength and void content, which are closely related to the performance of the winding products. In this article, two different objective values of winding products, a mechanical property (tensile strength) and a physical property (void content), were calculated. The paper then presents an integrated methodology that combines multi-parameter relative sensitivity analysis and single-parameter sensitivity analysis to obtain the optimal intervals of the composite tape winding process. First, the global multi-parameter sensitivity analysis method was applied to investigate the sensitivity of each parameter in the tape winding process. Then, the local single-parameter sensitivity analysis method was employed to calculate the sensitivity of a single parameter within the corresponding range. Finally, the stability and instability ranges of each parameter were distinguished. Meanwhile, the authors optimized the process parameter ranges and provided comprehensive optimized intervals of the winding parameters. A verification test validated that the optimized intervals of the process parameters were reliable and stable for the manufacturing of winding products.
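The local single-parameter sensitivity step can be illustrated with an invented response model; the strength function, coefficients, and operating point below are hypothetical, not the paper's:

```python
# Invented illustration: local single-parameter sensitivity of a product
# property to each process parameter, computed as a normalised finite
# difference so sensitivities are comparable across parameters with
# different units.

def strength(params):
    """Hypothetical tensile-strength response to winding parameters."""
    tension = params["tension"]
    temperature = params["temperature"]
    pressure = params["pressure"]
    return 800.0 + 2.0 * tension - 0.01 * (temperature - 80.0) ** 2 + 15.0 * pressure

def local_sensitivity(model, params, name, rel_step=0.01):
    """Normalised sensitivity (dY/Y) / (dX/X) at the operating point."""
    base = model(params)
    perturbed = dict(params)
    perturbed[name] = params[name] * (1.0 + rel_step)
    return ((model(perturbed) - base) / base) / rel_step

point = {"tension": 40.0, "temperature": 80.0, "pressure": 0.4}
sens = {name: local_sensitivity(strength, point, name) for name in point}
ranked = sorted(sens, key=lambda n: abs(sens[n]), reverse=True)
print(ranked[0])
```

Ranking parameters by the magnitude of the normalised sensitivity at each operating point is one way the stable and unstable ranges of a parameter can be distinguished.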
40 CFR 68.67 - Process hazard analysis.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... instrumentation with alarms, and detection hardware such as hydrocarbon sensors.); (4) Consequences of failure of...
40 CFR 68.67 - Process hazard analysis.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... instrumentation with alarms, and detection hardware such as hydrocarbon sensors.); (4) Consequences of failure of...
Janknegt, Robert; Scott, Mike; Mairs, Jill; Timoney, Mark; McElnay, James; Brenninkmeijer, Rob
2007-10-01
Drug selection should be a rational process that embraces the principles of evidence-based medicine. However, many factors may affect the choice of agent. It is against this background that the System of Objectified Judgement Analysis (SOJA) process for rational drug selection was developed. This article describes how the information on which the SOJA process is based was researched and processed.
NASA Astrophysics Data System (ADS)
Mehrpooya, Mehdi; Ansarinasab, Hojat; Moftakhari Sharifzadeh, Mohammad Mehdi; Rosen, Marc A.
2017-10-01
An integrated power plant with a net electrical power output of 3.71 × 10⁵ kW is developed and investigated. The electrical efficiency of the process is found to be 60.1%. The process includes three main sub-systems: a molten carbonate fuel cell system, a heat recovery section, and a cryogenic carbon dioxide capture process. Conventional and advanced exergoeconomic methods are used for analyzing the process. Advanced exergoeconomic analysis is a comprehensive evaluation tool which combines an exergetic approach with economic analysis procedures. With this method, investment and exergy destruction costs of the process components are divided into endogenous/exogenous and avoidable/unavoidable parts. Results of the conventional exergoeconomic analysis demonstrate that the combustion chamber has the largest exergy destruction rate (182 MW) and cost rate (13,100/h). Also, the total process cost rate can be decreased by reducing the cost rate of the fuel cell and improving the efficiency of the combustion chamber and heat recovery steam generator. Based on the total avoidable endogenous cost rate, the priority for modification is the heat recovery steam generator, a compressor, and a turbine of the power plant, in rank order. A sensitivity analysis is done to investigate the exergoeconomic factor parameters by varying the effective parameters.
Arvanitoyannis, Ioannis S; Varzakas, Theodoros H
2009-08-01
Failure Mode and Effect Analysis (FMEA) has been applied to the risk assessment of snail manufacturing. A tentative approach of FMEA application to the snail industry was attempted in conjunction with ISO 22000. Preliminary Hazard Analysis was used to analyze and predict the occurring failure modes in a food chain system (snail processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes upon which the system depends. Critical Control Points were identified and implemented in the cause-and-effect diagram (also known as the Ishikawa, tree, or fishbone diagram). In this work a comparison of ISO 22000 analysis with HACCP is carried out over snail processing and packaging. The main emphasis was put on the quantification of risk assessment by determining the RPN per identified processing hazard. Sterilization of tins, bioaccumulation of heavy metals, packaging of shells, and poisonous mushrooms were the processes identified as the ones with the highest RPN (280, 240, 147, and 144, respectively), and corrective actions were undertaken. Following the application of corrective actions, a second calculation of RPN values was carried out, leading to considerably lower values (below the upper acceptable limit of 130). It is noteworthy that the application of the Ishikawa (cause-and-effect or tree) diagram led to converging results, thus corroborating the validity of conclusions derived from risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO 22000 system of a snail processing industry is considered imperative.
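The RPN arithmetic behind these numbers is straightforward. In the sketch below, the four reported RPNs (280, 240, 147, 144) are reproduced, but the individual severity/occurrence/detection factorizations are invented, since the abstract reports only the products:

```python
# Sketch of the RPN bookkeeping described above. The RPN totals are the
# abstract's reported values; the severity/occurrence/detection split per
# hazard is hypothetical.

THRESHOLD = 130  # upper acceptable RPN limit used in the study

def rpn(severity, occurrence, detection):
    """Risk Priority Number = S x O x D, each scored on a 1-10 scale."""
    return severity * occurrence * detection

# Hypothetical S/O/D factorisations consistent with the reported RPNs.
hazards = {
    "sterilization of tins": (8, 7, 5),        # 280
    "heavy-metal bioaccumulation": (8, 6, 5),  # 240
    "packaging of shells": (7, 7, 3),          # 147
    "poisonous mushrooms": (8, 6, 3),          # 144
}
before = {name: rpn(*sod) for name, sod in hazards.items()}
needs_action = [name for name, value in before.items() if value > THRESHOLD]
print(sorted(before.values(), reverse=True), len(needs_action))
```

All four hazards exceed the 130 limit, which is why each triggered corrective action before the second round of scoring.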
Practical, transparent prospective risk analysis for the clinical laboratory.
Janssens, Pim Mw
2014-11-01
Prospective risk analysis (PRA) is an essential element in quality assurance for clinical laboratories. Practical approaches to conducting PRA in laboratories, however, are scarce. On the basis of the classical Failure Mode and Effect Analysis method, an approach to PRA was developed for application to key laboratory processes. First, the separate, major steps of the process under investigation are identified. Scores are then given for the Probability (P) and Consequence (C) of predefined types of failures and the chances of Detecting (D) these failures. Based on the P and C scores (on a 10-point scale), an overall Risk score (R) is calculated. The scores for each process were recorded in a matrix table. Based on predetermined criteria for R and D, it was determined whether a more detailed analysis was required for potential failures and, ultimately, where risk-reducing measures were necessary, if any. As an illustration, this paper presents the results of the application of PRA to our pre-analytical and analytical activities. The highest R scores were obtained in the stat processes, the most common failure type in the collective process steps was 'delayed processing or analysis', the failure type with the highest mean R score was 'inappropriate analysis' and the failure type most frequently rated as suboptimal was 'identification error'. The PRA designed is a useful semi-objective tool to identify process steps with potential failures rated as risky. Its systematic design and convenient output in matrix tables makes it easy to perform, practical and transparent.
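A minimal sketch of the scoring described above, assuming the overall risk score is the product R = P × C (the abstract does not state the exact formula) and using invented threshold criteria and example steps:

```python
# Minimal sketch of the PRA scoring matrix, assuming R = P x C. The limits,
# process steps, and scores below are invented for illustration.

R_LIMIT = 25   # hypothetical: R above this triggers detailed analysis
D_LIMIT = 4    # hypothetical: poor detectability (low D) also triggers it

def assess(step, p, c, d):
    """Return a matrix-table row for one process step (scores on 1-10)."""
    r = p * c
    flagged = r > R_LIMIT or d < D_LIMIT
    return {"step": step, "P": p, "C": c, "D": d, "R": r, "flag": flagged}

rows = [
    assess("specimen identification", 3, 9, 5),
    assess("stat analysis", 6, 7, 6),
    assess("report authorisation", 2, 4, 8),
]
flagged = [row["step"] for row in rows if row["flag"]]
print(flagged)
```

Recording every step as a row like this reproduces the matrix-table output the approach is praised for: the flagged rows are exactly the steps that warrant a more detailed failure analysis.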
Planning applications in image analysis
NASA Technical Reports Server (NTRS)
Boddy, Mark; White, Jim; Goldman, Robert; Short, Nick, Jr.
1994-01-01
We describe two interim results from an ongoing effort to automate the acquisition, analysis, archiving, and distribution of satellite earth science data. Both results are applications of Artificial Intelligence planning research to the automatic generation of processing steps for image analysis tasks. First, we have constructed a linear conditional planner (CPed), used to generate conditional processing plans. Second, we have extended an existing hierarchical planning system to make use of durations, resources, and deadlines, thus supporting the automatic generation of processing steps in time and resource-constrained environments.
1992-12-21
Only fragmentary OCR text from the report's references and table of contents survives, covering references (e.g., O'Reilly, 1991, X3DNet: An X-Based Neural Network), trace-based protocol analysis, tools for process model testing, and requirements for testing process models using trace-based protocol analysis (TBPA).
Parallel Algorithms for Image Analysis.
1982-06-01
Report documentation page (OCR fragments): Technical Report TR-1180, author Azriel Rosenfeld, grant AFOSR-77-3271. Keywords: image processing; image analysis; parallel processing; cellular computers.
USDA-ARS?s Scientific Manuscript database
Using five centimeter resolution images acquired with an unmanned aircraft system (UAS), we developed and evaluated an image processing workflow that included the integration of resolution-appropriate field sampling, feature selection, object-based image analysis, and processing approaches for UAS i...
The Strategic WAste Minimization Initiative (SWAMI) Software, Version 2.0 is a tool for using process analysis for identifying waste minimization opportunities within an industrial setting. The software requires user-supplied information for process definition, as well as materia...
Knowledge Reasoning with Semantic Data for Real-Time Data Processing in Smart Factory
Wang, Shiyong; Li, Di; Liu, Chengliang
2018-01-01
The application of high-bandwidth networks and cloud computing in manufacturing systems will generate massive amounts of data. Industrial data analysis plays important roles in condition monitoring, performance optimization, flexibility, and transparency of the manufacturing system. However, the currently existing architectures are mainly designed for offline data analysis and are not suitable for real-time data processing. In this paper, we first define the smart factory as a cloud-assisted and self-organized manufacturing system in which physical entities such as machines, conveyors, and products organize production through intelligent negotiation, and the cloud supervises this self-organized process for fault detection and troubleshooting based on data analysis. Then, we propose a scheme to integrate knowledge reasoning and semantic data, where the reasoning engine processes the ontology model with real-time semantic data coming from the production process. Based on these ideas, we build a benchmarking system for a smart candy packing application that supports direct consumer customization and flexible hybrid production, and the data are collected and processed in real time for fault diagnosis and statistical analysis. PMID:29415444
NASA Technical Reports Server (NTRS)
Goldman, H.; Wolf, M.
1978-01-01
Several experimental and projected Czochralski crystal growing process methods were studied and compared to available operations and cost-data of recent production Cz-pulling, in order to elucidate the role of the dominant cost contributing factors. From this analysis, it becomes apparent that the specific add-on costs of the Cz-process can be expected to be reduced by about a factor of three by 1982, and about a factor of five by 1986. A format to guide in the accumulation of the data needed for thorough techno-economic analysis of solar cell production processes was developed.
NASA Technical Reports Server (NTRS)
Junkin, B. G. (Principal Investigator)
1979-01-01
A method is presented for the processing and analysis of digital topography data that can subsequently be entered in an interactive data base in the form of slope, slope length, elevation, and aspect angle. A discussion of the data source and specific descriptions of the data processing software programs are included. In addition, the mathematical considerations involved in the registration of raw digitized coordinate points to the UTM coordinate system are presented. Scale factor considerations are also included. Results of the processing and analysis are illustrated using the Shiprock and Gallup Quadrangle test data.
Data processing for a cosmic ray experiment onboard the solar probes Helios 1 and 2: Experiment 6
NASA Technical Reports Server (NTRS)
Mueller-Mellin, R.; Green, G.; Iwers, B.; Kunow, H.; Wibberenz, G.; Fuckner, J.; Hempe, H.; Witte, M.
1982-01-01
The data processing system for Helios experiment 6, which measures energetic charged particles of solar, planetary, and galactic origin in the inner solar system, is described. The aim of the experiment is to extend knowledge of the origin and propagation of cosmic rays. The different programs for data reduction, analysis, presentation, and scientific evaluation are described, as well as the hardware and software of the data processing equipment. A chronological account of the data processing operation is given. The procedures and methods developed for data analysis can be used, with minor modifications, for the analysis of other space research experiments.
Standardization of pitch-range settings in voice acoustic analysis.
Vogel, Adam P; Maruff, Paul; Snyder, Peter J; Mundt, James C
2009-05-01
Voice acoustic analysis is typically a labor-intensive, time-consuming process that requires the application of idiosyncratic parameters tailored to individual aspects of the speech signal. Such processes limit the efficiency and utility of voice analysis in clinical practice as well as in applied research and development. In the present study, we analyzed 1,120 voice files, using standard techniques (case-by-case hand analysis), taking roughly 10 work weeks of personnel time to complete. The results were compared with the analytic output of several automated analysis scripts that made use of preset pitch-range parameters. After pitch windows were selected to appropriately account for sex differences, the automated analysis scripts reduced processing time of the 1,120 speech samples to less than 2.5 h and produced results comparable to those obtained with hand analysis. However, caution should be exercised when applying the suggested preset values to pathological voice populations.
Generalized Majority Logic Criterion to Analyze the Statistical Strength of S-Boxes
NASA Astrophysics Data System (ADS)
Hussain, Iqtadar; Shah, Tariq; Gondal, Muhammad Asif; Mahmood, Hasan
2012-05-01
The majority logic criterion is applicable in the evaluation process of substitution boxes used in the Advanced Encryption Standard (AES). The performance of modified or advanced substitution boxes is predicted by processing the results of statistical analysis with the majority logic criterion. In this paper, we use the majority logic criterion to analyze some popular and prevailing substitution boxes used in encryption processes. In particular, the majority logic criterion is applied to the AES, affine-power-affine (APA), Gray, Liu J, residue prime, S8 AES, Skipjack, and Xyi substitution boxes. The majority logic criterion is further extended into a generalized majority logic criterion, which offers a broader spectrum for analyzing the effectiveness of substitution boxes in image encryption applications. The integral components of the statistical analyses used for the generalized majority logic criterion are derived from the results of entropy analysis, contrast analysis, correlation analysis, homogeneity analysis, energy analysis, and mean of absolute deviation (MAD) analysis.
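One of the statistical components listed above, entropy analysis, is easy to sketch. The data here are synthetic, and treating near-8-bit entropy as the target for an 8-bit encrypted image is the standard interpretation of the test, not a claim specific to this paper:

```python
# Sketch of one component feeding a majority-logic-style evaluation:
# Shannon entropy of an 8-bit image histogram. A well-encrypted image
# should approach 8 bits/pixel; the pixel data here are synthetic.
import math
import random

def entropy(pixels):
    """Shannon entropy (bits/pixel) of a sequence of 0-255 values."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    n = len(pixels)
    return -sum((h / n) * math.log2(h / n) for h in hist if h)

random.seed(7)
# A "natural" image has a narrow intensity distribution; a cipher image
# should look uniform over the full 0-255 range.
plain = [min(255, max(0, int(random.gauss(120, 10)))) for _ in range(65536)]
cipher = [random.randrange(256) for _ in range(65536)]  # ciphertext stand-in

print(round(entropy(plain), 2), round(entropy(cipher), 2))
```

In a majority-logic evaluation, each such statistic (entropy, contrast, correlation, ...) casts a pass/fail vote, and the box is judged by the majority of the votes.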
Wang, Shengnan; Hua, Yujiao; Zou, Lisi; Liu, Xunhong; Yan, Ying; Zhao, Hui; Luo, Yiyuan; Liu, Juanxiu
2018-02-01
Scrophulariae Radix is one of the most popular traditional Chinese medicines (TCMs). Primary processing of Scrophulariae Radix is an important step that is closely related to the quality of products of this TCM. The aim of this study is to explore the influence of different processing methods on the chemical constituents of Scrophulariae Radix. The differences in chemical constituents among Scrophulariae Radix samples processed by different methods were analyzed by ultra-fast liquid chromatography-triple quadrupole-time-of-flight mass spectrometry coupled with principal component analysis and orthogonal partial least squares discriminant analysis. Furthermore, the contents of 12 index differential constituents in Scrophulariae Radix processed by different methods were simultaneously determined by ultra-fast liquid chromatography coupled with triple quadrupole-linear ion trap mass spectrometry. Gray relational analysis was performed to evaluate the differently processed samples according to the contents of the 12 constituents. All of the results demonstrated that the quality of Scrophulariae Radix processed by the "sweating" method was better. This study provides basic information for revealing the changes in chemical constituents of Scrophulariae Radix processed by different methods and for selecting the most suitable processing method for this TCM. © The Author 2017. Published by Oxford University Press. All rights reserved.
Horsch, Salome; Kopczynski, Dominik; Kuthe, Elias; Baumbach, Jörg Ingo; Rahmann, Sven
2017-01-01
Motivation: Disease classification from molecular measurements typically requires an analysis pipeline from raw noisy measurements to final classification results. Multi-capillary column ion mobility spectrometry (MCC-IMS) is a promising technology for the detection of volatile organic compounds in the air of exhaled breath. From raw measurements, the peak regions representing the compounds have to be identified, quantified, and clustered across different experiments. Currently, several steps of this analysis process require manual intervention by human experts. Our goal is to identify a fully automatic pipeline that yields competitive disease classification results compared with an established but subjective and tedious semi-manual process. Method: We combine a large number of modern methods for peak detection, peak clustering, and multivariate classification into analysis pipelines for raw MCC-IMS data. We evaluate all combinations on three different real datasets in an unbiased cross-validation setting and determine which specific algorithmic combinations lead to high AUC values in disease classification across the different medical application scenarios. Results: The best fully automated analysis process achieves even better classification results than the established manual process. The best algorithms for the three analysis steps are (i) SGLTR (Savitzky-Golay Laplace-operator filter thresholding regions) and LM (Local Maxima) for automated peak identification, (ii) EM clustering (Expectation Maximization) and DBSCAN (Density-Based Spatial Clustering of Applications with Noise) for the clustering step, and (iii) RF (Random Forest) for multivariate classification. Thus, automated methods can replace the manual steps in the analysis process and enable unbiased, high-throughput use of the technology. PMID:28910313
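A toy version of the pipeline's last two steps (DBSCAN for cross-experiment peak clustering, Random Forest with cross-validated AUC for classification) can be sketched with scikit-learn; the peak coordinates, intensities, and disease labels below are fabricated stand-ins for real MCC-IMS measurements.

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Stand-in for detected peak coordinates (retention time, drift time)
# pooled across many measurements; real MCC-IMS peak lists would go here.
peaks = np.vstack([rng.normal(c, 0.05, size=(40, 2))
                   for c in [(1, 1), (2, 3), (4, 2)]])

# Clustering step: group matching peaks across experiments
labels = DBSCAN(eps=0.3, min_samples=5).fit_predict(peaks)
n_clusters = len(set(labels) - {-1})

# Classification step: predict disease from per-cluster peak intensities
X = rng.normal(size=(60, n_clusters))        # synthetic intensity matrix
y = (X[:, 0] > 0).astype(int)                # toy "disease" label
auc = cross_val_score(RandomForestClassifier(random_state=0), X, y,
                      cv=5, scoring="roc_auc").mean()
print(n_clusters, round(auc, 2))
```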
NASA Astrophysics Data System (ADS)
Mariajayaprakash, Arokiasamy; Senthilvelan, Thiyagarajan; Vivekananthan, Krishnapillai Ponnambal
2013-07-01
The various process parameters affecting the quality characteristics of the shock absorber during manufacturing were identified using the Ishikawa diagram and failure mode and effect analysis. The identified process parameters are welding process parameters (squeeze, heat control, wheel speed, and air pressure), damper sealing process parameters (load, hydraulic pressure, air pressure, and fixture height), washing process parameters (total alkalinity, temperature, pH value of rinsing water, and timing), and painting process parameters (flowability, coating thickness, pointage, and temperature). In this paper, the painting and washing process parameters are optimized by the Taguchi method. Although the defects are reasonably minimized by the Taguchi method, in order to achieve zero defects during the processes, a genetic algorithm technique is applied to the optimized parameters obtained by the Taguchi method.
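As a sketch of the Taguchi step, assuming an invented L4 orthogonal array and toy defect counts (not the paper's data): compute a smaller-the-better signal-to-noise ratio for each trial, then pick the level of each factor that maximizes the mean S/N.

```python
import numpy as np

# Hypothetical L4 orthogonal array for two 2-level painting parameters
# (e.g., coating-thickness level, temperature level), with replicated
# defect counts per trial. All numbers are invented for illustration.
L4 = [(0, 0), (0, 1), (1, 0), (1, 1)]
defects = [[8, 9], [6, 7], [4, 5], [3, 3]]

def sn_smaller_better(ys):
    # Taguchi "smaller-the-better" signal-to-noise ratio in dB
    ys = np.asarray(ys, float)
    return -10 * np.log10(np.mean(ys ** 2))

sn = [sn_smaller_better(y) for y in defects]

# Mean S/N per level of each factor; the best level maximizes S/N
best = []
for factor in range(2):
    means = [np.mean([s for row, s in zip(L4, sn) if row[factor] == lvl])
             for lvl in (0, 1)]
    best.append(int(np.argmax(means)))
print(best)
```

In a real study the winning level combination would then seed a genetic algorithm for further refinement, as the abstract describes.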
NASA Technical Reports Server (NTRS)
Johnston, John D.; Parrish, Keith; Howard, Joseph M.; Mosier, Gary E.; McGinnis, Mark; Bluth, Marcel; Kim, Kevin; Ha, Hong Q.
2004-01-01
This is a continuation of a series of papers on modeling activities for JWST. The structural-thermal-optical, often referred to as "STOP", analysis process is used to predict the effect of thermal distortion on optical performance. The benchmark STOP analysis for JWST assesses the effect of an observatory slew on wavefront error. The paper begins with an overview of multi-disciplinary engineering analysis, or integrated modeling, which is a critical element of the JWST mission. The STOP analysis process is then described. This process consists of the following steps: thermal analysis, structural analysis, and optical analysis. Temperatures predicted using geometric and thermal math models are mapped to the structural finite element model in order to predict thermally-induced deformations. Motions and deformations at optical surfaces are input to optical models, and optical performance is predicted using either an optical ray trace or WFE estimation techniques based on prior ray traces or first-order optics. Following the discussion of the analysis process, results are presented based on models representing the design at the time of the System Requirements Review. In addition to baseline performance predictions, sensitivity studies are performed to assess modeling uncertainties. Of particular interest is the sensitivity of optical performance to uncertainties in temperature predictions and variations in metal properties. The paper concludes with a discussion of modeling uncertainty as it pertains to STOP analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palta, J.
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are often caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100 that has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how these tools can be used in a given radiotherapy clinic to develop a risk-based quality management program.
Learning Objectives: (1) Learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are; and (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
NASA Technical Reports Server (NTRS)
Shortle, John F.; Allocco, Michael
2005-01-01
This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.
Digital processing of mesoscale analysis and space sensor data
NASA Technical Reports Server (NTRS)
Hickey, J. S.; Karitani, S.
1985-01-01
The mesoscale analysis and space sensor (MASS) data base management and analysis system is presented. The system was implemented on the research computer system, which provides a wide range of capabilities for processing and displaying large volumes of conventional and satellite-derived meteorological data. The research computer system consists of three primary computers (HP-1000F, Harris/6, and Perkin-Elmer 3250), each of which performs a specific function according to its unique capabilities. The software, data base management, and display capabilities of the research computer system, which together provide a very effective interactive research tool for the digital processing of mesoscale analysis and space sensor data, are described.
In-Situ Molecular Vapor Composition Measurements During Lyophilization.
Liechty, Evan T; Strongrich, Andrew D; Moussa, Ehab M; Topp, Elizabeth; Alexeenko, Alina A
2018-04-11
Monitoring process conditions during lyophilization is essential to ensuring product quality for lyophilized pharmaceutical products. Residual gas analysis has been applied previously in lyophilization applications for leak detection, determination of the endpoint in primary and secondary drying, monitoring sterilization processes, and measuring complex solvents. The purpose of this study is to investigate the temporal evolution of the process gas for various formulations during lyophilization to better understand the relative extraction rates of various molecular compounds over the course of primary drying. In this study, residual gas analysis is used to monitor the molecular composition of gases in the product chamber during lyophilization of aqueous formulations typical for pharmaceuticals. Residual gas analysis is also used in the determination of the primary drying endpoint and compared to the results obtained using the comparative pressure measurement technique. The dynamics of solvent vapors, the species dissolved therein, and the ballast gas (the gas supplied to maintain a set-point pressure in the product chamber) are observed throughout the course of lyophilization. In addition to water vapor and nitrogen, the two most abundant gases for all considered aqueous formulations are oxygen and carbon dioxide. In particular, it is observed that the relative concentrations of carbon dioxide and oxygen vary depending on the formulation, an observation which stems from the varying solubility of these species. This result has implications for product shelf life and stability during the lyophilization process. Chamber process gas composition during lyophilization is quantified for several representative formulations using residual gas analysis. The advantages of the technique lie in its ability to measure the relative concentration of various species during the lyophilization process.
This feature gives residual gas analysis utility in a host of applications from endpoint determination to quality assurance. In contrast to other methods, residual gas analysis is able to determine oxygen and water vapor content in the process gas. These compounds have been shown to directly influence product shelf life. With these results, residual gas analysis technique presents a potential new method for real-time lyophilization process control and improved understanding of formulation and processing effects for lyophilized pharmaceutical products.
Modeling and Analysis of Power Processing Systems (MAPPS). Volume 1: Technical report
NASA Technical Reports Server (NTRS)
Lee, F. C.; Rahman, S.; Carter, R. A.; Wu, C. H.; Yu, Y.; Chang, R.
1980-01-01
Computer aided design and analysis techniques were applied to power processing equipment. Topics covered include: (1) discrete time domain analysis of switching regulators for performance analysis; (2) design optimization of power converters using augmented Lagrangian penalty function technique; (3) investigation of current-injected multiloop controlled switching regulators; and (4) application of optimization for Navy VSTOL energy power system. The generation of the mathematical models and the development and application of computer aided design techniques to solve the different mathematical models are discussed. Recommendations are made for future work that would enhance the application of the computer aided design techniques for power processing systems.
Sun, Meng; Yan, Donghui; Yang, Xiaolu; Xue, Xingyang; Zhou, Sujuan; Liang, Shengwang; Wang, Shumei; Meng, Jiang
2017-05-01
Raw Arecae Semen, the seed of Areca catechu L., as well as Arecae Semen Tostum and Arecae semen carbonisata are traditionally processed by stir-baking for subsequent use in a variety of clinical applications. These three Arecae semen types, important Chinese herbal drugs, have been used in China and other Asian countries for thousands of years. In this study, the sensory technologies of a colorimeter and sensitive validated high-performance liquid chromatography with diode array detection were employed to discriminate raw Arecae semen and its processed drugs. The color parameters of the samples were determined by a colorimeter instrument CR-410. Moreover, the fingerprints of the four alkaloids of arecaidine, guvacine, arecoline and guvacoline were surveyed by high-performance liquid chromatography. Subsequently, Student's t test, the analysis of variance, fingerprint similarity analysis, hierarchical cluster analysis, principal component analysis, factor analysis and Pearson's correlation test were performed for final data analysis. The results obtained demonstrated a significant color change characteristic for components in raw Arecae semen and its processed drugs. Crude and processed Arecae semen could be determined based on colorimetry and high-performance liquid chromatography with a diode array detector coupled with chemometrics methods for a comprehensive quality evaluation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Glenn, Sigrid S.
1985-01-01
Behavior analysis and institutional economics are viewed as having common origins in the early 20th century effort to benefit from the conceptual revolution spurred by Darwin's synthesis. Institutional economics, initiated by Thorstein Veblen, appears to have failed to develop a progressive scientific technology, while behavior analysis has done so. It is suggested that institutional economics has been held back by lack of a synthesizing scientific mechanism that elucidates the relation between technological and ceremonial processes, the two cultural forces described by Veblen. The theory of institutional economist C. E. Ayres, built on Veblen's distinction, is used to clarify the concepts of technological and ceremonial processes for the reader. An analysis of the behavioral processes that might underlie the cultural processes described by Veblen/Ayres suggests that the experimental analysis of behavior has provided concepts that might function as a synthesizing mechanism for the social sciences and, in particular, institutional economics. The Veblen/Ayres dichotomy, now seen in terms of underlying behavioral processes, is used to examine the field of behavior analysis in terms of its origins, its relation to psychology and its current state. The paper concludes with a few practical suggestions as to how behavior analysts might work to enhance survival. PMID:22478617
Increasing Transparency Through a Multiverse Analysis.
Steegen, Sara; Tuerlinckx, Francis; Gelman, Andrew; Vanpaemel, Wolf
2016-09-01
Empirical research inevitably includes constructing a data set by processing raw data into a form ready for statistical analysis. Data processing often involves choices among several reasonable options for excluding, transforming, and coding data. We suggest that instead of performing only one analysis, researchers could perform a multiverse analysis, which involves performing all analyses across the whole set of alternatively processed data sets corresponding to a large set of reasonable scenarios. Using an example focusing on the effect of fertility on religiosity and political attitudes, we show that analyzing a single data set can be misleading and propose a multiverse analysis as an alternative practice. A multiverse analysis offers an idea of how much the conclusions change because of arbitrary choices in data construction and gives pointers as to which choices are most consequential in the fragility of the result. © The Author(s) 2016.
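The multiverse idea can be sketched in a few lines: enumerate every combination of reasonable processing choices, run the same analysis on each resulting data set, and inspect how the conclusion varies. The data set and the processing choices below are invented for illustration.

```python
import itertools
import statistics

# Toy raw data: (predictor, outcome) pairs; the last point is an outlier
raw = [(1, 2.0), (2, 2.9), (3, 4.2), (4, 3.9), (50, 40.0)]

# Reasonable-but-arbitrary processing choices (hypothetical)
exclusion_rules = {
    "keep all": lambda d: d,
    "drop x>10": lambda d: [p for p in d if p[0] <= 10],
}
transforms = {
    "raw y": lambda y: y,
    "y/10": lambda y: y / 10,
}

def pearson_r(data):
    # One analysis (Pearson correlation) per point of the multiverse
    xs, ys = zip(*data)
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in data)
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

multiverse = {}
for (ename, excl), (tname, tr) in itertools.product(exclusion_rules.items(),
                                                    transforms.items()):
    data = [(x, tr(y)) for x, y in excl(raw)]
    multiverse[(ename, tname)] = round(pearson_r(data), 3)

for spec, r in sorted(multiverse.items()):
    print(spec, r)
```

Here the outcome transform is inconsequential (correlation is scale-invariant) while the exclusion rule is highly consequential, which is exactly the kind of distinction a multiverse analysis is meant to surface.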
Human Factors Operability Timeline Analysis to Improve the Processing Flow of the Orion Spacecraft
NASA Technical Reports Server (NTRS)
Schlierf, Roland; Stambolian, Damon B.; Miller, Darcy; Posanda, Juan; Haddock, Mike; Haddad, Mike; Tran, Donald; Henderson, Gena; Barth, Tim
2010-01-01
The Constellation Program (CxP) Orion vehicle goes through several areas and stages of processing before it is launched at the Kennedy Space Center. In order to achieve efficient and effective processing, all of the activities need to be analyzed. This was accomplished by first developing a timeline of events that included each activity; each activity was then analyzed by operability experts and human factors experts with spacecraft processing experience. This paper's focus is to explain the results and the process for developing this human factors operability timeline analysis to improve the processing flow of Orion.
Reduced product yield in chemical processes by second law effects
NASA Technical Reports Server (NTRS)
England, C.; Funk, J. E.
1980-01-01
An analysis of second law effects in chemical processes, where product yield is explicitly related to the individual irreversibilities within the process to indicate a maximum theoretical yield, is presented. Examples are given that indicate differences between first and second law approaches toward process efficiency and process yield. This analysis also expresses production capacity in terms of the heating value of a product. As a result, it is particularly convenient in analyzing fuel conversion plants and their potential for improvement. Relationships are also given for the effects of irreversibilities on requirements for process heat and for feedstocks.
Sensitivity analysis of the add-on price estimate for the edge-defined film-fed growth process
NASA Technical Reports Server (NTRS)
Mokashi, A. R.; Kachare, A. H.
1981-01-01
The analysis is in terms of cost parameters and production parameters. The cost parameters include equipment, space, direct labor, materials, and utilities. The production parameters include growth rate, process yield, and duty cycle. A computer program was developed specifically to do the sensitivity analysis.
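A one-at-a-time sensitivity analysis of this kind can be sketched as follows; the price model and every number in it are illustrative assumptions, not values from the report.

```python
# Hypothetical add-on price model: annual cost divided by annual output.
# All parameter names and numbers are invented for this sketch.
def price(equip, labor, growth_rate, process_yield, duty_cycle):
    annual_cost = equip + labor
    annual_output = growth_rate * process_yield * duty_cycle * 8760  # h/yr
    return annual_cost / annual_output

base = dict(equip=100_000.0, labor=50_000.0,
            growth_rate=2.0, process_yield=0.9, duty_cycle=0.8)
p0 = price(**base)

# One-at-a-time sensitivity: % price change for a +10% parameter change
for name in base:
    perturbed = dict(base, **{name: base[name] * 1.1})
    delta = (price(**perturbed) - p0) / p0 * 100
    print(f"{name}: {delta:+.1f}%")
```

Cost parameters raise the price proportionally to their cost share, while production parameters (growth rate, yield, duty cycle) lower it roughly in inverse proportion, which is why they dominate such sensitivity rankings.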
Global sensitivity analysis of DRAINMOD-FOREST, an integrated forest ecosystem model
Shiying Tian; Mohamed A. Youssef; Devendra M. Amatya; Eric D. Vance
2014-01-01
Global sensitivity analysis is a useful tool to understand process-based ecosystem models by identifying key parameters and processes controlling model predictions. This study reported a comprehensive global sensitivity analysis for DRAINMOD-FOREST, an integrated model for simulating water, carbon (C), and nitrogen (N) cycles and plant growth in lowland forests. The...
State Analysis: A Control Architecture View of Systems Engineering
NASA Technical Reports Server (NTRS)
Rasmussen, Robert D.
2005-01-01
A viewgraph presentation on the state analysis process is shown. The topics include: 1) Issues with growing complexity; 2) Limits of common practice; 3) Exploiting a control point of view; 4) A glimpse at the State Analysis process; 5) Synergy with model-based systems engineering; and 6) Bridging the systems to software gap.
Strategic and Market Analysis | Bioenergy | NREL
NREL's recent efforts in comparative techno-economic analysis consider a wide range of conversion intermediates. NREL has developed first-of-its-kind process models and economic assessments of co-processing, and this work strives to understand the economic incentives, technical risks, and key data gaps that need to be addressed.
2013-03-01
Outline fragment: B. Requirements Analysis Process: 1. Requirements Management and Analysis Plan; 2. Knowledge Point Reviews; ... are Identified; 5. RMAP/CDD Process Analysis and Results. IV. TD Phase Begins.
ERIC Educational Resources Information Center
Coad, Jane; Evans, Ruth
2008-01-01
This article reflects on key methodological issues emerging from children and young people's involvement in data analysis processes. We outline a pragmatic framework illustrating different approaches to engaging children, using two case studies of children's experiences of participating in data analysis. The article highlights methods of…
ERIC Educational Resources Information Center
Bjerstedt, Ake
A three-volume series describes the construction of a self-instructional system as a work process with three main phases: system analysis, system synthesis, and system modification and evaluation. After an introductory discussion of some basic principles of instructional programing, this first volume focuses on the system analysis phase,…
Wójcicki, Tomasz; Nowicki, Michał
2016-01-01
The article presents a selected area of research and development concerning methods of material analysis based on automatic image recognition of the investigated metallographic sections. The objectives of the analyses of materials for gas nitriding technology are described. The methods of preparation of nitrided layers, the steps of the process, and the construction and operation of devices for gas nitriding are given. We discuss the possibility of using digital image processing methods in the analysis of the materials, as well as their essential task groups: improving the quality of the images, segmentation, morphological transformations, and image recognition. The developed analysis model of nitrided layer formation, covering image processing and analysis techniques as well as selected methods of artificial intelligence, is presented. The model is divided into stages, which are formalized in order to better reproduce their actions. The validation of the presented method is performed. The advantages and limitations of the developed solution, as well as the possibilities of its practical use, are listed. PMID:28773389
Processing and analysis of cardiac optical mapping data obtained with potentiometric dyes
Laughner, Jacob I.; Ng, Fu Siong; Sulkin, Matthew S.; Arthur, R. Martin
2012-01-01
Optical mapping has become an increasingly important tool to study cardiac electrophysiology in the past 20 years. Multiple methods are used to process and analyze cardiac optical mapping data, and no consensus currently exists regarding the optimum methods. The specific methods chosen to process optical mapping data are important because inappropriate data processing can affect the content of the data and thus alter the conclusions of the studies. Details of the different steps in processing optical imaging data, including image segmentation, spatial filtering, temporal filtering, and baseline drift removal, are provided in this review. We also provide descriptions of the common analyses performed on data obtained from cardiac optical imaging, including activation mapping, action potential duration mapping, repolarization mapping, conduction velocity measurements, and optical action potential upstroke analysis. Optical mapping is often used to study complex arrhythmias, and we also discuss dominant frequency analysis and phase mapping techniques used for the analysis of cardiac fibrillation. PMID:22821993
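Two of the processing steps named above (spatial filtering and baseline drift removal) can be sketched with SciPy on a synthetic image stack; the frame rate, stack dimensions, and drift below are invented for illustration, not parameters from the review.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.signal import detrend

# Synthetic optical action potential stack: (frames, rows, cols)
t = np.arange(200) / 500.0                    # hypothetical 500 fps
ap = (np.sin(2 * np.pi * 5 * t) > 0.8).astype(float)  # crude periodic upstrokes
rng = np.random.default_rng(2)
stack = ap[:, None, None] + 0.1 * rng.normal(size=(200, 8, 8))
stack += 0.5 * t[:, None, None]               # linear baseline drift

# Spatial filtering: smooth each frame over neighbouring pixels
smoothed = gaussian_filter(stack, sigma=(0, 1, 1))

# Baseline drift removal: subtract a linear trend per pixel over time
corrected = detrend(smoothed, axis=0)

print(corrected.shape)
```

As the review stresses, the choice of kernel width and detrending model can alter upstroke morphology and duration measurements, so these parameters should be reported alongside any results.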
GaiaGrid : Its Implications and Implementation
NASA Astrophysics Data System (ADS)
Ansari, S. G.; Lammers, U.; Ter Linden, M.
2005-12-01
Gaia is an ESA space mission to determine the positions of 1 billion objects in the Galaxy at micro-arcsecond precision. The data analysis and processing requirements of the mission involve about 20 institutes across Europe, each providing specific algorithms for specific tasks, which range from relativistic effects on positional determination to classification, astrometric binary star detection, photometric analysis, and spectroscopic analysis. In an initial phase, a study has been ongoing over the past three years to determine the complexity of Gaia's data processing. Two processing categories have materialised: core and shell. While core deals with routine data processing, shell tasks are algorithms to carry out data analysis, which involves the Gaia community at large. For this latter category, we are currently experimenting with the use of Grid paradigms to allow access to the core data and to augment processing power to simulate and analyse the data in preparation for the actual mission. We present preliminary results and discuss the sociological impact of distributing the tasks amongst the community.
Ibrahim, Reham S; Fathy, Hoda
2018-03-30
Tracking the impact of commonly applied post-harvesting and industrial processing practices on the compositional integrity of ginger rhizome was implemented in this work. Untargeted metabolite profiling was performed using a digitally-enhanced HPTLC method in which the chromatographic fingerprints were extracted using ImageJ software and then analysed with multivariate Principal Component Analysis (PCA) for pattern recognition. A targeted approach was applied using a new, validated, simple and fast HPTLC image analysis method for simultaneous quantification of the officially recognized markers 6-, 8-, and 10-gingerol and 6-shogaol in conjunction with chemometric Hierarchical Clustering Analysis (HCA). The results of both targeted and untargeted metabolite profiling revealed that the peeling, drying, and storage employed during processing have a great influence on the ginger chemo-profile; the different forms of processed ginger should not be used interchangeably. Moreover, it is deemed necessary to consider the holistic metabolic profile for a comprehensive evaluation of ginger during processing. Copyright © 2018. Published by Elsevier B.V.
Zimmermann, Hartmut F; Hentschel, Norbert
2011-01-01
With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has already become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to biopharmaceutical processes brings about some difficulties. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. The application for such a biopharmaceutical process FMEA is widespread. It can be useful whenever a biopharmaceutical manufacturing process is developed or scaled-up, or when it is transferred to a different manufacturing site. It may also be conducted during substantial optimization of an existing process or the development of a second-generation process. According to their resulting risk ratings, process parameters can be ranked for importance and important variables for process development, characterization, or validation can be identified. Health authorities around the world ask pharmaceutical companies to manage risk during development and manufacturing of pharmaceuticals. The so-called failure mode and effect analysis (FMEA) is an established risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to pharmaceutical processes that use modern biotechnology (biopharmaceutical processes) brings about some difficulties, because those biopharmaceutical processes differ from processes in mechanical and electrical industries. 
The proposal presented here explains how a biopharmaceutical process FMEA can be conducted. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. With the help of this guideline, different details of the manufacturing process can be ranked according to their potential risks, and this can help pharmaceutical companies to identify aspects with high potential risks and to react accordingly to improve the safety of medicines.
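The core of such an FMEA can be sketched as a risk priority number (RPN) ranking, where RPN = occurrence x severity x detectability on 1-to-10 scales; the failure modes and scores below are invented for the sketch, not taken from the guideline's rating table.

```python
# Illustrative biopharmaceutical failure modes with 1-to-10 scores for
# occurrence (O), severity (S) and detectability (D). Values are invented.
failure_modes = {
    "wrong buffer pH":         {"O": 4, "S": 8, "D": 3},
    "filter integrity breach": {"O": 2, "S": 9, "D": 6},
    "sensor drift":            {"O": 6, "S": 4, "D": 7},
}

def rpn(scores):
    # Risk priority number: higher means address first
    return scores["O"] * scores["S"] * scores["D"]

ranked = sorted(failure_modes.items(), key=lambda kv: rpn(kv[1]),
                reverse=True)
for name, scores in ranked:
    print(name, rpn(scores))
```

Note that a high detectability score here means a failure is *hard* to detect, which is why a frequent but low-severity failure can still top the ranking.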
Tests of Spectral Cloud Classification Using DMSP Fine Mode Satellite Data.
1980-06-02
Fourier spectral analysis was identified as the most promising technique to upgrade automated processing of satellite imagery data. The resolution of these measurements on the Earth's surface is 0.3 n mi.
A Comparative Analysis of Extract, Transformation and Loading (ETL) Process
NASA Astrophysics Data System (ADS)
Runtuwene, J. P. A.; Tangkawarow, I. R. H. T.; Manoppo, C. T. M.; Salaki, R. J.
2018-02-01
Data and information are currently growing rapidly, in varying amounts and media. This development will eventually produce the very large collections of data better known as Big Data. Business Intelligence (BI) utilizes large amounts of data and information for analysis so that one can obtain important information, which can be used to support decision-making processes. In practice, a process integrating existing data and information into a data warehouse is needed. This data integration process is known as Extract, Transformation and Loading (ETL). Many applications have been developed to carry out the ETL process, but selecting which application is the most time-, cost- and effort-efficient may become a challenge. Therefore, the objective of the study was to provide a comparative analysis of the ETL process using Microsoft SQL Server Integration Services (SSIS) and Pentaho Data Integration (PDI).
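Independent of the specific tool, the three ETL stages can be sketched with the Python standard library; the CSV data and the table schema below are invented for illustration.

```python
import csv
import io
import sqlite3

# --- Extract: read raw rows from a CSV source (in-memory here) ---
raw = "id,name,amount\n1, Alice ,10\n2,Bob,\n3,Carol,7\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# --- Transform: trim whitespace, drop rows with missing amounts ---
clean = [
    {"id": int(r["id"]), "name": r["name"].strip(), "amount": int(r["amount"])}
    for r in rows if r["amount"]
]

# --- Load: insert into a warehouse table (SQLite stands in here) ---
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (id INTEGER, name TEXT, amount INTEGER)")
db.executemany("INSERT INTO sales VALUES (:id, :name, :amount)", clean)
total = db.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # sum over the loaded rows
```

Tools such as SSIS and PDI package exactly these stages (plus scheduling, logging, and error handling) behind graphical designers, which is what the paper's comparison evaluates.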
Second-order analysis of semiparametric recurrent event processes.
Guan, Yongtao
2011-09-01
A typical recurrent event dataset consists of an often large number of recurrent event processes, each of which contains multiple event times observed from an individual during a follow-up period. Such data have become increasingly available in medical and epidemiological studies. In this article, we introduce novel procedures to conduct second-order analysis for a flexible class of semiparametric recurrent event processes. Such an analysis can provide useful information regarding the dependence structure within each recurrent event process. Specifically, we will use the proposed procedures to test whether the individual recurrent event processes are all Poisson processes and to suggest sensible alternative models for them if they are not. We apply these procedures to a well-known recurrent event dataset on chronic granulomatous disease and an epidemiological dataset on meningococcal disease cases in Merseyside, United Kingdom to illustrate their practical value. © 2011, The International Biometric Society.
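A crude first screen, well short of the formal second-order procedures proposed in the article, is the variance-to-mean ratio of event counts per individual: for a homogeneous Poisson process the count over a fixed window has variance equal to its mean, so the ratio is near 1, while clustering pushes it above 1. The counts below are fabricated for illustration:

```python
# Index of dispersion as an informal check for Poisson-ness of counts.
# Event counts per individual are made up, not the CGD or meningococcal data.

def index_of_dispersion(counts):
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)  # sample variance
    return var / mean

poisson_like = [2, 5, 3, 7, 4, 1, 6, 4]     # ratio close to 1
clustered    = [0, 12, 1, 0, 14, 0, 1, 12]  # ratio well above 1 (overdispersed)
```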
SDI-based business processes: A territorial analysis web information system in Spain
NASA Astrophysics Data System (ADS)
Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.
2012-09-01
Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services in their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to Spanish citizens' knowledge about their territory.
Process Feasibility Study in Support of Silicon Material Task 1
NASA Technical Reports Server (NTRS)
Li, K. Y.; Hansen, K. C.; Yaws, C. L.
1979-01-01
Analysis of process system properties was continued for silicon source materials under consideration for producing silicon. The following property data are reported for dichlorosilane, which is involved in processing operations for silicon: critical constants, vapor pressure, heat of vaporization, heat capacity, density, surface tension, thermal conductivity, heat of formation, and Gibbs free energy of formation. The properties are reported as a function of temperature to permit rapid engineering usage. The preliminary economic analysis of the process is described. Cost analysis results for the process (case A: two deposition reactors and six electrolysis cells) are presented based on a preliminary process design of a plant to produce 1,000 metric tons/year of silicon. The fixed capital investment estimate for the plant is $12.47 million (1975 dollars) ($17.47 million, 1980 dollars). Product cost without profit is 8.63 $/kg of silicon (1975 dollars) (12.1 $/kg, 1980 dollars).
10 CFR 712.36 - Medical assessment process.
Code of Federal Regulations, 2010 CFR
2010-01-01
... assigned duties. (b) Employers must provide a job task analysis for those individuals involved in HRP... performed if a job task analysis has not been provided. (c) The medical process by the Designated Physician...
Thermo-Mechanical Analysis for John Deere Electronics Solutions
Impacts of alternative manufacturing processes; die, package, and interface material analysis for power module reliability; manufacturing process impacts versus thermal cycling impacts on power modules
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lines, Amanda M.; Nelson, Gilbert L.; Casella, Amanda J.
Microfluidic devices are a growing field with significant potential for application to small-scale processing of solutions. Much like large-scale processing, fast, reliable, and cost-effective means of monitoring the streams during processing are needed. Here we apply a novel micro-Raman probe to the on-line monitoring of streams within a microfluidic device. For either macro- or micro-scale process monitoring via spectroscopic response, there is the danger of interfering or confounded bands obfuscating results. By utilizing chemometric analysis, a form of multivariate analysis, species can be accurately quantified in solution despite the presence of overlapping or confounded spectroscopic bands. This is demonstrated on solutions of HNO3 and NaNO3 within micro-flow and microfluidic devices.
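The core chemometric idea can be illustrated with classical least squares: if pure-component spectra are known, mixture concentrations are recoverable even when bands overlap. A toy sketch with made-up 5-channel "spectra" (not real Raman data, and far simpler than the multivariate models used in practice):

```python
# Two-component linear spectral unmixing by least squares. The pure-component
# "spectra" below are invented 5-channel vectors, labeled only for illustration.

def unmix(mixture, s1, s2):
    """Solve min ||mixture - c1*s1 - c2*s2||^2 via the 2x2 normal equations."""
    a11 = sum(x * x for x in s1)
    a12 = sum(x * y for x, y in zip(s1, s2))
    a22 = sum(y * y for y in s2)
    b1 = sum(x * m for x, m in zip(s1, mixture))
    b2 = sum(y * m for y, m in zip(s2, mixture))
    det = a11 * a22 - a12 * a12
    return (a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det

s_hno3  = [1.0, 3.0, 1.0, 0.2, 0.1]   # hypothetical pure-component spectrum
s_nano3 = [0.1, 2.0, 2.5, 1.0, 0.3]   # note the overlapping middle bands
mixture = [0.4 * a + 0.7 * b for a, b in zip(s_hno3, s_nano3)]
c1, c2 = unmix(mixture, s_hno3, s_nano3)   # recovers the 0.4 / 0.7 weights
```

Despite the band overlap, the concentrations are recovered exactly in this noise-free case; real data add noise and require the richer multivariate calibration the abstract describes.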
Methodological aspects of fuel performance system analysis at raw hydrocarbon processing plants
NASA Astrophysics Data System (ADS)
Kulbjakina, A. V.; Dolotovskij, I. V.
2018-01-01
The article discusses the methodological aspects of fuel performance system analysis at raw hydrocarbon (RH) processing plants. Modern RH processing facilities are major consumers of energy resources (ER) for their own needs. Reducing ER consumption, including fuel, and developing a rational fuel system structure are complex and relevant scientific tasks that can only be accomplished using system analysis and complex system synthesis. In accordance with the principles of system analysis, the hierarchical structure of the fuel system, the block scheme for the synthesis of the most efficient alternative of the fuel system using mathematical models, and the set of performance criteria have been developed for the main stages of the study. Results from the introduction of specific engineering solutions to develop in-house energy supply sources for RH processing facilities are provided.
Introduction of male circumcision for HIV prevention in Uganda: analysis of the policy process.
Odoch, Walter Denis; Kabali, Kenneth; Ankunda, Racheal; Zulu, Joseph Mumba; Tetui, Moses
2015-06-20
Health policy analysis is important for all health policies especially in fields with ever changing evidence-based interventions such as HIV prevention. However, there are few published reports of health policy analysis in sub-Saharan Africa in this field. This study explored the policy process of the introduction of male circumcision (MC) for HIV prevention in Uganda in order to inform the development processes of similar health policies. Desk review of relevant documents was conducted between March and May 2012. Thematic analysis was used to analyse the data. Conceptual frameworks that demonstrate the interrelationship within the policy development processes and influence of actors in the policy development processes guided the analysis. Following the introduction of MC on the national policy agenda in 2007, negotiation and policy formulation preceded its communication and implementation. Policy proponents included academic researchers in the early 2000s and development partners around 2007. Favourable contextual factors that supported the development of the policy included the rising HIV prevalence, adoption of MC for HIV prevention in other sub-Saharan African countries, and expertise on MC. Additionally, the networking capability of proponents facilitated the change in position of non-supportive or neutral actors. Non-supportive and neutral actors in the initial stages of the policy development process included the Ministry of Health, traditional and Muslim leaders, and the Republican President. Using political authority, legitimacy, and charisma, actors who opposed the policy tried to block the policy development process. Researchers' initial disregard of the Ministry of Health in the research process of MC and the missing civil society advocacy arm contributed to delays in the policy development process. This study underscores the importance of securing top political leadership as well as key implementing partners' support in policy development processes. 
Equally important is the appreciation of the various forms of actors' power and how such power shapes the policy agenda, development process, and content.
Bonan, Brigitte; Martelli, Nicolas; Berhoune, Malik; Maestroni, Marie-Laure; Havard, Laurent; Prognon, Patrice
2009-02-01
To apply the Hazard Analysis and Critical Control Points method to the preparation of anti-cancer drugs; to identify critical control points in our cancer chemotherapy process; and to propose control measures and corrective actions to manage these processes. The Hazard Analysis and Critical Control Points application began in January 2004 in our centralized chemotherapy compounding unit. From October 2004 to August 2005, monitoring of process nonconformities was performed to assess the method. According to the Hazard Analysis and Critical Control Points method, a multidisciplinary team was formed to describe and assess the cancer chemotherapy process. This team listed all of the critical points and calculated their risk indexes according to their frequency of occurrence, their severity, and their detectability. The team defined monitoring, control measures, and corrective actions for each identified risk. Finally, over a 10-month period, pharmacists reported each nonconformity of the process in a follow-up document. Our team described 11 steps in the cancer chemotherapy process. The team identified 39 critical control points, including 11 of higher importance with a high risk index. Over 10 months, 16,647 preparations were performed; 1225 nonconformities were reported during this same period. The Hazard Analysis and Critical Control Points method is relevant when it is used to target a specific process such as the preparation of anti-cancer drugs. This method helped us to focus on the production steps which can have a critical influence on product quality, and led us to improve our process.
NASA Astrophysics Data System (ADS)
Xie, Dongjin; Xu, Jing; Cheng, Haifeng; Wang, Nannan; Zhou, Qun
2018-06-01
The thermochromic compound [(C2H5)2NH2]2CuCl4 displays a solid-solid phase transition at 52 °C, apparent as a color change from green to yellow, induced by the geometry of the [CuCl4]2- anion (regarded as the chromophore of the compound) changing from a square-planar to a flattened tetrahedral structure. Fourier transform infrared (FTIR) spectroscopy and two-dimensional correlation (2D-COS) analysis have been applied to study the role played by the amine and ethyl groups of the ammonium cation during the phase transition in both the heating and cooling processes. With increasing temperature, weakening of the N-H…Cl H-bond and thermal disordering of the alkyl chain both occur in the phase transition. 2D-COS analysis reveals that the N-H…Cl H-bond responds to increasing temperature first and may be the dominant driving force for the structural variation of the [CuCl4]2- anion. Although the thermochromism of [(C2H5)2NH2]2CuCl4 is reversible, the sequential orders of the variations of the NH2+ and alkyl groups derived by 2D-COS analysis during heating and cooling are reversed, indicating that the dynamic process of the phase transition is not perfectly reversible. The existence of an undercooling phenomenon in the cooling process has also been revealed by 2D-COS analysis.
Query-Driven Visualization and Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruebel, Oliver; Bethel, E. Wes; Prabhat, Mr.
2012-11-01
This report focuses on an approach to high performance visualization and analysis, termed query-driven visualization and analysis (QDV). QDV aims to reduce the amount of data that needs to be processed by the visualization, analysis, and rendering pipelines. The goal of the data reduction process is to separate out data that is "scientifically interesting" and to focus visualization, analysis, and rendering on that interesting subset. The premise is that for any given visualization or analysis task, the data subset of interest is much smaller than the larger, complete data set. This strategy of extracting smaller data subsets of interest and focusing the visualization processing on these subsets is complementary to the approach of increasing the capacity of the visualization, analysis, and rendering pipelines through parallelism. This report discusses the fundamental concepts in QDV, their relationship to different stages in the visualization and analysis pipelines, and presents QDV's application to problems in diverse areas, ranging from forensic cybersecurity to high energy physics.
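The pattern reduces to: evaluate the query first, then run the expensive analysis only on the matching subset. A minimal sketch; the record fields and thresholds are hypothetical, not from the report:

```python
# QDV in miniature: data reduction by query, then analysis on the subset only.
# Field names ("energy", "temp") and thresholds are hypothetical.

records = [
    {"energy": 0.5, "temp": 300}, {"energy": 9.1, "temp": 950},
    {"energy": 7.8, "temp": 880}, {"energy": 1.2, "temp": 410},
]

def query(data, predicate):
    """Data reduction: keep only the 'scientifically interesting' records."""
    return [r for r in data if predicate(r)]

interesting = query(records, lambda r: r["energy"] > 5 and r["temp"] > 800)
# Downstream analysis now touches len(interesting) records, not len(records).
mean_energy = sum(r["energy"] for r in interesting) / len(interesting)
```

Production QDV systems accelerate the query step itself with indexing (e.g. bitmap indices) so the reduction does not require a full scan.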
Coal gasification systems engineering and analysis. Appendix A: Coal gasification catalog
NASA Technical Reports Server (NTRS)
1980-01-01
The scope of work in preparing the Coal Gasification Data Catalog included the following subtasks: (1) candidate system subsystem definition, (2) raw materials analysis, (3) market analysis for by-products, (4) alternate products analysis, (5) preliminary integrated facility requirements. Definition of candidate systems/subsystems includes the identity of and alternates for each process unit, raw material requirements, and the cost and design drivers for each process design.
Tao, Ling; Aden, Andy; Elander, Richard T; Pallapolu, Venkata Ramesh; Lee, Y Y; Garlock, Rebecca J; Balan, Venkatesh; Dale, Bruce E; Kim, Youngmi; Mosier, Nathan S; Ladisch, Michael R; Falls, Matthew; Holtzapple, Mark T; Sierra, Rocio; Shi, Jian; Ebrik, Mirvat A; Redmond, Tim; Yang, Bin; Wyman, Charles E; Hames, Bonnie; Thomas, Steve; Warner, Ryan E
2011-12-01
Six biomass pretreatment processes to convert switchgrass to fermentable sugars and ultimately to cellulosic ethanol are compared on a consistent basis in this technoeconomic analysis. The six pretreatment processes are ammonia fiber expansion (AFEX), dilute acid (DA), lime, liquid hot water (LHW), soaking in aqueous ammonia (SAA), and sulfur dioxide-impregnated steam explosion (SO(2)). Each pretreatment process is modeled in the framework of an existing biochemical design model so that systematic variations of process-related changes are consistently captured. The pretreatment area process design and simulation are based on the research data generated within the Biomass Refining Consortium for Applied Fundamentals and Innovation (CAFI) 3 project. Overall ethanol production, total capital investment, and minimum ethanol selling price (MESP) are reported along with selected sensitivity analysis. The results show limited differentiation between the projected economic performances of the pretreatment options, except for processes that exhibit significantly lower monomer sugar and resulting ethanol yields. Copyright © 2011 Elsevier Ltd. All rights reserved.
Xue, Xiu-Juan; Gao, Qing; Qiao, Jian-Hong; Zhang, Jie; Xu, Cui-Ping; Liu, Ju
2014-01-01
This meta-analysis summarizes the published studies on the association between red/processed meat consumption and the risk of lung cancer. Five databases were systematically reviewed, and a random-effects model was used to pool the study results and to assess dose-response relationships. Six cohort studies and twenty-eight case-control studies were included in this meta-analysis. The pooled risk ratios (RR) for total red meat and processed meat were 1.44 (95% CI, 1.29-1.61) and 1.23 (95% CI, 1.10-1.37), respectively. Dose-response analysis revealed that for every increment of 120 grams of red meat per day the risk of lung cancer increases by 35%, and for every increment of 50 grams of processed meat per day the risk increases by 20%. The present dose-response meta-analysis suggests that both red and processed meat consumption have a positive effect on lung cancer risk. PMID:25035778
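Random-effects pooling of risk ratios is usually done by inverse-variance weighting on the log scale with a DerSimonian-Laird estimate of the between-study variance. A sketch of that calculation; the per-study RRs and standard errors below are made up, not the studies pooled in this meta-analysis:

```python
import math

# DerSimonian-Laird random-effects pooling on the log risk-ratio scale.
# Study inputs are fabricated for illustration only.

def pool_random_effects(log_rrs, ses):
    w = [1 / se ** 2 for se in ses]                       # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rrs))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_rrs) - 1)) / c)         # between-study variance
    w_star = [1 / (se ** 2 + tau2) for se in ses]         # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_rrs)) / sum(w_star)
    return math.exp(pooled)

study_rrs = [1.3, 1.6, 1.2, 1.5]
study_ses = [0.10, 0.15, 0.12, 0.20]
pooled_rr = pool_random_effects([math.log(r) for r in study_rrs], study_ses)
```

When the heterogeneity statistic Q is below its degrees of freedom, tau² is truncated to zero and the estimate coincides with the fixed-effect pooled RR.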
Retinal imaging analysis based on vessel detection.
Jamal, Arshad; Hazim Alkawaz, Mohammed; Rehman, Amjad; Saba, Tanzila
2017-07-01
With the increasing advancement of digital imaging and computing power, computationally intelligent technologies are in high demand for use in ophthalmologic care and treatment. In the current research, Retina Image Analysis (RIA) was developed for optometrists at the Eye Care Center in Management and Science University. This research aims to analyze the retina through vessel detection. The RIA assists in the analysis of retinal images, and specialists are provided with various options such as saving, processing, and analyzing retinal images through its advanced interface layout. Additionally, RIA assists in the selection of vessel segments, processing these vessels by calculating their diameter, standard deviation, and length, and displaying detected vessels on the retina. The Agile Unified Process was adopted as the methodology in developing this research. To conclude, Retina Image Analysis may help optometrists gain a better understanding when analyzing the patient's retina. Finally, the Retina Image Analysis procedure was developed using MATLAB (R2011b). Promising results are attained that are comparable to the state of the art. © 2017 Wiley Periodicals, Inc.
FPGA-Based Filterbank Implementation for Parallel Digital Signal Processing
NASA Technical Reports Server (NTRS)
Berner, Stephan; DeLeon, Phillip
1999-01-01
One approach to parallel digital signal processing decomposes a high bandwidth signal into multiple lower bandwidth (rate) signals by an analysis bank. After processing, the subband signals are recombined into a fullband output signal by a synthesis bank. This paper describes an implementation of the analysis and synthesis banks using Field Programmable Gate Arrays (FPGAs).
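The simplest instance of this structure is a two-channel (Haar) bank: the signal is split into half-rate lowpass and highpass subbands and then recombined exactly. The FPGA design uses longer filters and more channels; this sketch only shows the decompose/recombine structure:

```python
# Two-channel analysis/synthesis bank with Haar-style averaging/differencing.
# Only a structural sketch of the subband idea, not the paper's filterbank.

def analysis_bank(x):
    """Split x (even length) into half-rate lowpass and highpass subbands."""
    low  = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    high = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    return low, high

def synthesis_bank(low, high):
    """Recombine the subbands into the full-rate output signal."""
    out = []
    for l, h in zip(low, high):
        out.extend([l + h, l - h])
    return out

signal = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0]
low, high = analysis_bank(signal)
# Each subband can now be processed independently (and in parallel) at half rate.
reconstructed = synthesis_bank(low, high)   # perfect reconstruction
```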
ERIC Educational Resources Information Center
McCormick, Joe Lew
This study examined major stakeholders' perceptions of their involvement and role in the legislative process surrounding the introduction, deliberation, and ultimate passage of the Direct Loan Demonstration Program (DLDP), a federal pilot student loan program. Data analysis was based on a detailed description of the legislative process surrounding…
DEP : a computer program for evaluating lumber drying costs and investments
Stewart Holmes; George B. Harpole; Edward Bilek
1983-01-01
DEP is a modified discounted cash flow computer program designed for economic analysis of wood drying processes. Wood drying processes differ from other processes because of the large amounts of working capital required to finance inventories, and because of the relatively large shares of costs charged to inventory...
Understanding Processes and Timelines for Distributed Photovoltaic
…data from more than 30,000 PV systems across 87 utilities in 16 states to better understand solar photovoltaic (PV) interconnection process time frames in the United States. This study includes an analysis of "Analysis Metrics" that shows the four steps involved in the utility interconnection process for solar…
Information Acquisition, Analysis and Integration
2016-08-03
…of sensing and processing: theory, applications, signal processing, image and video processing, machine learning, technology transfer… Solved elegantly old problems like image and video deblurring, introducing new revolutionary approaches… Polatkan, G. Sapiro, D. Blei, D. B. Dunson, and L. Carin, "Deep learning with hierarchical convolution factor analysis," IEEE…
ERIC Educational Resources Information Center
Duffy, Melissa C.; Azevedo, Roger; Sun, Ning-Zi; Griscom, Sophia E.; Stead, Victoria; Crelinsten, Linda; Wiseman, Jeffrey; Maniatis, Thomas; Lachapelle, Kevin
2015-01-01
This study examined the nature of cognitive, metacognitive, and affective processes among a medical team experiencing difficulty managing a challenging simulated medical emergency case by conducting in-depth analysis of process data. Medical residents participated in a simulation exercise designed to help trainees to develop medical expertise,…
Code of Federal Regulations, 2010 CFR
2010-07-01
..., processing, or interpretation of any geological data and information. Initial analysis and processing are the stages of analysis or processing where the data and information first become available for in-house... information are available for submission, inspection, and selection? 280.40 Section 280.40 Mineral Resources...
Study of process variables associated with manufacturing hermetically-sealed nickel-cadmium cells
NASA Technical Reports Server (NTRS)
Miller, L.; Doan, D. J.; Carr, E. S.
1971-01-01
A program to determine and study the critical process variables associated with the manufacture of aerospace, hermetically-sealed, nickel-cadmium cells is described. The determination and study of the process variables associated with the positive and negative plaque impregnation/polarization process are emphasized. The experimental data resulting from the implementation of fractional factorial design experiments are analyzed by means of a linear multiple regression analysis technique. This analysis permits the selection of preferred levels for certain process variables to achieve desirable impregnated plaque characteristics.
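For a two-level factorial design with coded -1/+1 levels, the design matrix is orthogonal, so each least-squares regression coefficient reduces to a simple average. A toy sketch of that analysis step; the factors, levels, and responses are hypothetical, not the plaque impregnation data:

```python
# Least-squares analysis of a coded two-level factorial experiment.
# Factors and responses below are invented for illustration.

def coefficients(design, response):
    """Regression coefficients for a coded +-1 orthogonal design: X'X = nI,
    so each coefficient is sum(level * y) / n."""
    k = len(design[0])
    n = len(design)
    return [sum(run[j] * y for run, y in zip(design, response)) / n
            for j in range(k)]

# 2^2 full factorial: factors = (temperature, current density), coded levels.
design   = [(-1, -1), (+1, -1), (-1, +1), (+1, +1)]
response = [10.0, 14.0, 11.0, 15.0]     # e.g. plaque loading, arbitrary units
coeffs = coefficients(design, response)
# Classical "main effect" (difference of means) = 2 * coefficient.
```

In a fractional factorial like the one in the study, the same averaging applies but some effects are aliased with interactions, which is why the paper pairs the design with multiple regression.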
NASA Astrophysics Data System (ADS)
Syafrina, R.; Rohman, I.; Yuliani, G.
2018-05-01
This study aims to analyze the concept characteristics of solubility and solubility product that will serve as the basis for the development of a virtual laboratory and students' science process skills. Characteristics of the analyzed concepts include concept definitions, concept attributes, and types of concepts. The concept analysis method follows Herron. The results show that there are twelve chemical concepts that are prerequisites before studying solubility and solubility product, and five core concepts that students must understand in solubility and solubility product. As many as 58.3% of the concept definitions contained in high school textbooks support students' science process skills; the rest are rote definitions. Concept attributes that meet the three levels of chemical representation and can be implemented in a virtual laboratory amount to 66.6%. By type of concept, 83.3% are concepts based on principles and 16.6% are concepts that state a process. Meanwhile, the science process skills that can be developed based on the concept analysis are the abilities to observe, calculate, measure, predict, interpret, hypothesize, apply, classify, and infer.
The effects of pre-processing strategies in sentiment analysis of online movie reviews
NASA Astrophysics Data System (ADS)
Zin, Harnani Mat; Mustapha, Norwati; Murad, Masrah Azrifah Azmi; Sharef, Nurfadhlina Mohd
2017-10-01
With the ever-increasing number of internet applications and social networking sites, people nowadays can easily express their feelings towards any product or service. These online reviews act as an important source for further analysis and improved decision making. The reviews are mostly unstructured by nature and thus need processing, such as sentiment analysis and classification, to provide meaningful information for future use. In text analysis tasks, the appropriate selection of words/features has a huge impact on the effectiveness of the classifier. Thus, this paper explores the effect of pre-processing strategies on the sentiment analysis of online movie reviews. A supervised machine learning method was used to classify the reviews, with a support vector machine (SVM) with linear and non-linear kernels as the classifier. The performance of the classifier is critically examined based on precision, recall, f-measure, and accuracy. Two different feature representations were used: term frequency and term frequency-inverse document frequency. Results show that the pre-processing strategies have a significant impact on the classification process.
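The two feature representations compared above can be sketched in a few lines: raw term frequency (TF) and TF-IDF, which down-weights terms that appear in many documents. The toy "reviews" are invented; real pipelines add tokenisation, stop-word removal, stemming, and the other pre-processing steps the paper studies:

```python
import math

# TF and TF-IDF feature representations on toy documents.

def term_frequency(doc):
    counts = {}
    for term in doc.split():
        counts[term] = counts.get(term, 0) + 1
    return counts

def tf_idf(docs):
    n = len(docs)
    df = {}                                 # document frequency per term
    for doc in docs:
        for term in set(doc.split()):
            df[term] = df.get(term, 0) + 1
    return [{t: c * math.log(n / df[t]) for t, c in term_frequency(doc).items()}
            for doc in docs]

reviews = ["great movie great acting", "boring movie", "great fun"]
vectors = tf_idf(reviews)
# "movie" occurs in 2 of 3 reviews, so it is down-weighted relative to rarer terms.
```

Libraries differ in idf smoothing conventions; the plain log(n/df) form here is the textbook variant.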
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are often caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100 that has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how these tools can be used in a given radiotherapy clinic to develop a risk based quality management program.
Learning Objectives: (1) Learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are all about; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
TU-AB-BRD-04: Development of Quality Management Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomadsen, B.
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are often caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100 that has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how these tools can be used in a given radiotherapy clinic to develop a risk based quality management program.
Learning Objectives: (1) Learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are all about; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
Full Life Cycle of Data Analysis with Climate Model Diagnostic Analyzer (CMDA)
NASA Astrophysics Data System (ADS)
Lee, S.; Zhai, C.; Pan, L.; Tang, B.; Zhang, J.; Bao, Q.; Malarout, N.
2017-12-01
We have developed a system that supports the full life cycle of a data analysis process, from data discovery, to data customization, to analysis, to reanalysis, to publication, and to reproduction. The system, called Climate Model Diagnostic Analyzer (CMDA), is designed to demonstrate that the full life cycle of data analysis can be supported within one integrated system for climate model diagnostic evaluation with global observational and reanalysis datasets. CMDA has four subsystems that are highly integrated to support the analysis life cycle. The Data System manages datasets used by CMDA analysis tools, the Analysis System manages CMDA analysis tools, which are all web services, the Provenance System manages the metadata of CMDA datasets and the provenance of CMDA analysis history, and the Recommendation System extracts knowledge from CMDA usage history and recommends datasets/analysis tools to users. These four subsystems are not only highly integrated but also easily expandable. New datasets can be easily added to the Data System and scanned to be visible to the other subsystems. New analysis tools can be easily registered to be available in the Analysis System and Provenance System. With CMDA, a user can start a data analysis process by discovering datasets of relevance to their research topic using the Recommendation System. Next, the user can customize the discovered datasets for their scientific use (e.g. anomaly calculation, regridding, etc.) with tools in the Analysis System. Next, the user can do their analysis with the tools (e.g. conditional sampling, time averaging, spatial averaging) in the Analysis System. Next, the user can reanalyze the datasets based on the previously stored analysis provenance in the Provenance System. Further, they can publish their analysis process and results to the Provenance System to share with other users. Finally, any user can reproduce the published analysis process and results.
By supporting the full life cycle of climate data analysis, CMDA improves the research productivity and collaboration of its users.
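One of the customization steps mentioned above, anomaly calculation, is simple to sketch: subtract each calendar month's climatology from a monthly time series. The two years of values below are made up; real inputs would be the observational or reanalysis variables CMDA serves:

```python
# Monthly anomaly calculation: value minus that calendar month's climatology.
# The temperature series below is fabricated for illustration.

def monthly_anomalies(series):
    """series: monthly values, January first, whole years only."""
    years = len(series) // 12
    climatology = [sum(series[m + 12 * y] for y in range(years)) / years
                   for m in range(12)]
    return [v - climatology[i % 12] for i, v in enumerate(series)]

temps = [5, 6, 9, 12, 16, 20, 23, 22, 18, 13, 8, 5,
         6, 7, 10, 13, 17, 21, 24, 23, 19, 14, 9, 6]
anomalies = monthly_anomalies(temps)
# Anomalies for each calendar month sum to zero by construction.
```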
Magnesium-Aluminum-Zirconium Oxide Amorphous Ternary Composite: A Dense and Stable Optical Coating
NASA Technical Reports Server (NTRS)
Sahoo, N. K.; Shapiro, A. P.
1998-01-01
In the present work, the process-parameter-dependent optical and structural properties of MgO-Al(2)O(3)-ZrO(2) ternary mixed-composite material have been investigated. Optical properties were derived from spectrophotometric measurements. The surface morphology, grain size distributions, crystallographic phases, and process-dependent material composition of the films have been investigated through the use of atomic force microscopy (AFM), X-ray diffraction analysis, and energy-dispersive X-ray (EDX) analysis. EDX analysis made evident the correlation between the optical constants and the process-dependent compositions in the films. It is possible to achieve environmentally stable amorphous films with high packing density under certain optimized process conditions.
NASA Technical Reports Server (NTRS)
Thomson, F.
1972-01-01
The additional processing performed on data collected over the Rhode River Test Site and Forestry Site in November 1970 is reported, and the techniques and procedures used to obtain the processed results are described. Thermal data collected over three approximately parallel lines of the site were contoured, and the results color coded, to delineate important scene constituents and to identify trees attacked by pine bark beetles. Contouring work and histogram preparation are reviewed, and the important conclusions from the spectral analysis and recognition computer (SPARC) signature extension work are summarized. The SPARC setup and processing records are presented, and recommendations are made for future data collection over the site.
Parallel processing in finite element structural analysis
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
1987-01-01
A brief review is made of the fundamental concepts and basic issues of parallel processing. Discussion focuses on parallel numerical algorithms, performance evaluation of machines and algorithms, and parallelism in finite element computations. A computational strategy is proposed for maximizing the degree of parallelism at different levels of the finite element analysis process including: 1) formulation level (through the use of mixed finite element models); 2) analysis level (through additive decomposition of the different arrays in the governing equations into the contributions to a symmetrized response plus correction terms); 3) numerical algorithm level (through the use of operator splitting techniques and application of iterative processes); and 4) implementation level (through the effective combination of vectorization, multitasking and microtasking, whenever available).
MgO-Al2O3-ZrO2 Amorphous Ternary Composite: A Dense and Stable Optical Coating
NASA Technical Reports Server (NTRS)
Shaoo, Naba K.; Shapiro, Alan P.
1998-01-01
The process-parameter-dependent optical and structural properties of MgO-Al2O3-ZrO2 ternary mixed-composite material were investigated. Optical properties were derived from spectrophotometric measurements. The surface morphology, grain size distributions, crystallographic phases, and process-dependent material composition of films were investigated through the use of atomic force microscopy, x-ray diffraction analysis, and energy-dispersive x-ray analysis. Energy-dispersive x-ray analysis made evident the correlation between the optical constants and the process-dependent compositions in the films. It is possible to achieve environmentally stable amorphous films with high packing density under certain optimized process conditions.
NASA Technical Reports Server (NTRS)
Nagy, S.
1988-01-01
Due to the extraordinary distances viewed by modern telescopes, their optical surfaces must be manufactured to exacting standards of perfection, within a few thousandths of a centimeter. The detection of imperfections smaller than 1/20 of a wavelength of light, for application in building the mirror for the Space Infrared Telescope Facility, was undertaken. Because the mirror must be kept very cold while in space, another factor comes into effect: cryogenics. The process used to test a specific mirror under cryogenic conditions is described, including the follow-up analysis accomplished through computer work. To better illustrate the process and analysis, a Pyrex Hex-Core mirror is followed from laser interferometry in the lab to computer analysis via a program called FRINGE, and this FRINGE analysis is detailed.
dada - a web-based 2D detector analysis tool
NASA Astrophysics Data System (ADS)
Osterhoff, Markus
2017-06-01
The data daemon, dada, is a server backend for unified access to 2D pixel detector image data stored with different detectors, file formats and saved with varying naming conventions and folder structures across instruments. Furthermore, dada implements basic pre-processing and analysis routines from pixel binning over azimuthal integration to raster scan processing. Common user interactions with dada are by a web frontend, but all parameters for an analysis are encoded into a Uniform Resource Identifier (URI) which can also be written by hand or scripts for batch processing.
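Encoding every analysis parameter into the URI is what makes a dada request scriptable and reproducible from its address alone. A minimal Python sketch of building and decoding such a request URI; the host, path, and parameter names here are hypothetical illustrations, not dada's actual API:

```python
from urllib.parse import urlencode, urlsplit, parse_qs

def build_analysis_uri(base, detector, scan, params):
    """Encode one analysis request into a single, self-describing URI.

    Parameter names are hypothetical; dada's real scheme may differ.
    """
    query = {"detector": detector, "scan": scan, **params}
    # Sort keys so the same request always yields the same URI string.
    return f"{base}?{urlencode(sorted(query.items()))}"

uri = build_analysis_uri(
    "https://example.org/dada/analyse",
    detector="pilatus",
    scan="run_0042",
    params={"op": "azimuthal_integration", "bins": "512"},
)

# A batch script can reconstruct the full request from the URI alone.
decoded = {k: v[0] for k, v in parse_qs(urlsplit(uri).query).items()}
```

Because the URI carries the complete parameter set, hand-written or script-generated URIs behave identically to those produced by the web frontend.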
Analysis of launch site processing effectiveness for the Space Shuttle 26R payload
NASA Technical Reports Server (NTRS)
Flores, Carlos A.; Heuser, Robert E.; Pepper, Richard E., Jr.; Smith, Anthony M.
1991-01-01
A trend analysis study has been performed on problem reports recorded during the Space Shuttle 26R payload's processing cycle at NASA-Kennedy, using the defect-flow analysis (DFA) methodology; DFA gives attention to the characteristics of the problem-report 'population' as a whole. It is established that the problem reports contain data which distract from pressing problems, and that fully 60 percent of such reports were caused during processing at NASA-Kennedy. The second major cause of problem reports was design defects.
NASA Technical Reports Server (NTRS)
Brown, R. A.
1986-01-01
This research program focuses on analysis of the transport mechanisms in solidification processes, especially those of interest to the Microgravity Sciences and Applications Program of NASA. Research during the last year has focused on the dynamics of the floating-zone process for growth of small-scale crystals, on the effect of applied magnetic fields on convection and solute segregation in directional solidification, and on the dynamics of microscopic cell formation in two-dimensional solidification of binary alloys. Significant findings are given.
Annalaura, Carducci; Giulia, Davini; Stefano, Ceccanti
2013-01-01
Risk analysis is widely used in the pharmaceutical industry to manage production processes, validation activities, training, and other activities. Several methods of risk analysis are available (e.g., failure mode and effects analysis, fault tree analysis), and one or more should be chosen and adapted to the specific field where they will be applied. Among the methods available, hazard analysis and critical control points (HACCP) is a methodology that has been applied since the 1960s and whose areas of application have expanded over time from food to the pharmaceutical industry. It can be easily and successfully applied to several processes because its main feature is the identification, assessment, and control of hazards. It can also be integrated with other tools, such as fishbone diagrams and flowcharting. The aim of this article is to show how HACCP can be used to manage an analytical process, to propose how to conduct the necessary steps, and to provide data templates useful for documentation that follows current good manufacturing practices. In the quality control process, risk analysis is a useful tool for enhancing the uniformity of technical choices and their documented rationale. Accordingly, it allows for more effective and economical laboratory management, is capable of increasing the reliability of analytical results, and enables auditors and authorities to better understand the choices that have been made. In particular, the article shows how HACCP can be used to manage bacterial endotoxins testing and other analytical processes in a formal, clear, and detailed manner.
From perceptual to lexico-semantic analysis--cortical plasticity enabling new levels of processing.
Schlaffke, Lara; Rüther, Naima N; Heba, Stefanie; Haag, Lauren M; Schultz, Thomas; Rosengarth, Katharina; Tegenthoff, Martin; Bellebaum, Christian; Schmidt-Wilcke, Tobias
2015-11-01
Certain kinds of stimuli can be processed on multiple levels. While the neural correlates of different levels of processing (LOPs) have been investigated to some extent, most studies involve skills and/or knowledge already present when performing the task. In this study we specifically sought to identify neural correlates of an evolving skill that allows the transition from perceptual to lexico-semantic stimulus analysis. Eighteen participants were trained to decode 12 letters of Morse code that were presented acoustically inside and outside of the scanner environment. Morse code was presented in trains of three letters while brain activity was assessed with fMRI. Participants either attended to the stimulus length (perceptual analysis) or evaluated its meaning, distinguishing words from nonwords (lexico-semantic analysis). Perceptual and lexico-semantic analyses shared a mutual network comprising the left premotor cortex, the supplementary motor area (SMA), and the inferior parietal lobule (IPL). Perceptual analysis was associated with strong brain activation in the SMA and the superior temporal gyrus (STG) bilaterally, which remained unaltered from pre- to post-training. In the lexico-semantic analysis after learning, study participants showed additional activation in the left inferior frontal cortex (IFC) and the left occipitotemporal cortex (OTC), regions known to be critically involved in lexical processing. Our data provide evidence for cortical plasticity evolving with a learning process that enables the transition from perceptual to lexico-semantic stimulus analysis. Importantly, the activation pattern remains tied to the task-required LOP and is thus the result of a decision process as to which LOP to engage in. © 2015 The Authors. Human Brain Mapping Published by Wiley Periodicals, Inc.
The road to smoke-free legislation in Ireland.
Currie, Laura M; Clancy, Luke
2011-01-01
To describe the process through which Ireland changed its policies towards smoking in workplaces and to distil lessons for others implementing or extending smoke-free laws. This analysis is informed by a review of secondary sources including a commissioned media analysis, documentary analysis, and key informant interviews with policy actors who provide insight into the process of smoke-free policy development. The policy analysis techniques used include the development of a timeline for policy reform, stakeholder analysis, policy mapping, impact analysis through use of secondary data, and a review process. The policy analysis triangle, which highlights the importance of examining policy content, context, actors, and processes, is used as an analytical framework. The importance of the political, economic, social, and cultural context emerged clearly. The interaction of the context with the policy process, both in identifying the need for policy and in its formulation, demonstrated the opportunity for advocates to exert influence at all points of the process. The campaign to support the legislation had the following characteristics: a sustained, consistent, simple health message; sustained political leadership and commitment; a strong coalition between the Health Alliance, the Office of Tobacco Control, and the Department of Health and Children; cross-party political support; and trade union support. The public and media support clearly demonstrated the benefit of deliberate and consistent planning and organization of a communication strategy. The Irish smoke-free legislation succeeded as a policy initiative because of timing, dedication, planning, implementation, and the existence of strong leadership and a powerful, convinced, credible political champion. © 2010 The Authors, Addiction © 2010 Society for the Study of Addiction.
Contributions to systemic analysis for worm screw production using thread whirling devices
NASA Astrophysics Data System (ADS)
Cretu, G.
2017-08-01
The paper presents a systemic analysis of worm screw processing using thread whirling devices, highlighting all the factors involved in the system. It also analyses these factors as a function of specific machining conditions. The stages of the experimentation program and the methods used to process the data obtained are also presented.
The Controlling Function of the Agent in the Analysis of Question-Response Relationships.
ERIC Educational Resources Information Center
Bierschenk, Inger
In contrast to traditional linguistic analysis, a model based on the empirical agent is presented and tested. A text is regarded as an intentionally produced cognitive process. The analysis has to take the agent (perspective) into account to facilitate an adequate processing of its objectives (viewpoints). Moreover, the model is surface-oriented…
Code of Federal Regulations, 2014 CFR
2014-10-01
... over the lands covered by your application a written analysis of those factors applicable to your... actual costs (see § 2804.14(f) of this subpart). Submitting your analysis with the application will.... While we consider your written analysis, BLM will not process your Category 6 application. (a) FLPMA...
Code of Federal Regulations, 2013 CFR
2013-10-01
... over the lands covered by your application a written analysis of those factors applicable to your... actual costs (see § 2804.14(f) of this subpart). Submitting your analysis with the application will.... While we consider your written analysis, BLM will not process your Category 6 application. (a) FLPMA...
Code of Federal Regulations, 2011 CFR
2011-10-01
... over the lands covered by your application a written analysis of those factors applicable to your... actual costs (see § 2804.14(f) of this subpart). Submitting your analysis with the application will.... While we consider your written analysis, BLM will not process your Category 6 application. (a) FLPMA...
Code of Federal Regulations, 2012 CFR
2012-10-01
... over the lands covered by your application a written analysis of those factors applicable to your... actual costs (see § 2804.14(f) of this subpart). Submitting your analysis with the application will.... While we consider your written analysis, BLM will not process your Category 6 application. (a) FLPMA...
Linear circuit analysis program for IBM 1620 Monitor 2, 1311/1443 data processing system /CIRCS/
NASA Technical Reports Server (NTRS)
Hatfield, J.
1967-01-01
CIRCS is a modification of the IBSNAP Circuit Analysis Program for use on smaller systems. The program retains the basic dc analysis, transient analysis, and FORTRAN 2 formats. It can be used on the IBM 1620/1311 Monitor I Mod 5 system and solves linear networks containing up to 15 nodes and 45 branches.
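A linear dc network of the kind CIRCS solves reduces to a system of nodal equations G·v = i (node conductance matrix times node voltages equals injected currents). A minimal pure-Python sketch for a hypothetical three-resistor, two-node network; this illustrates the underlying nodal method, not the IBSNAP/CIRCS algorithm itself:

```python
def solve_2x2(a, b):
    """Solve the 2x2 linear system a.x = b by Cramer's rule."""
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    x0 = (b[0] * a[1][1] - a[0][1] * b[1]) / det
    x1 = (a[0][0] * b[1] - b[0] * a[1][0]) / det
    return x0, x1

# Example network (hypothetical values): a 1 A current source into node 1;
# R1 = 1 ohm from node 1 to ground, R2 = 2 ohm between nodes 1 and 2,
# R3 = 3 ohm from node 2 to ground.
G = [[1/1 + 1/2, -1/2],        # conductances seen at node 1
     [-1/2, 1/2 + 1/3]]        # conductances seen at node 2
I = [1.0, 0.0]                 # injected currents (amperes)

v1, v2 = solve_2x2(G, I)       # node voltages in volts
```

A 15-node network simply scales this to a 15x15 conductance matrix solved by Gaussian elimination; transient analysis additionally discretizes capacitor and inductor branches in time.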
The SeaDAS Processing and Analysis System: SeaWiFS, MODIS, and Beyond
NASA Astrophysics Data System (ADS)
MacDonald, M. D.; Ruebens, M.; Wang, L.; Franz, B. A.
2005-12-01
The SeaWiFS Data Analysis System (SeaDAS) is a comprehensive software package for the processing, display, and analysis of ocean data from a variety of satellite sensors. Continuous development and user support by programmers and scientists for more than a decade has helped to make SeaDAS the most widely used software package in the world for ocean color applications, with a growing base of users from the land and sea surface temperature community. Full processing support for past (CZCS, OCTS, MOS) and present (SeaWiFS, MODIS) sensors, and anticipated support for future missions such as NPP/VIIRS, enables end users to reproduce the standard ocean archive product suite distributed by NASA's Ocean Biology Processing Group (OBPG), as well as a variety of evaluation and intermediate ocean, land, and atmospheric products. Availability of the processing algorithm source codes and a software build environment also provide users with the tools to implement custom algorithms. Recent SeaDAS enhancements include synchronization of MODIS processing with the latest code and calibration updates from the MODIS Calibration Support Team (MCST), support for all levels of MODIS processing including Direct Broadcast, a port to the Macintosh OS X operating system, release of the display/analysis-only SeaDAS-Lite, and an extremely active web-based user support forum.
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to undergraduate and graduate students in U.S. college and university systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development life cycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and process-performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and discusses in detail the gaps between the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
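The baseline-and-prediction loop of such a Monte Carlo model can be sketched in a few lines. The driver means, spreads, and weights below are invented for illustration only; they are not the ACSI model's actual coefficients:

```python
import random
import statistics

random.seed(42)  # reproducible runs

def simulate_index(n_trials=10_000):
    """Monte Carlo sketch: propagate uncertainty in satisfaction drivers
    to a predicted customer-satisfaction index on a 0-100 scale.

    Driver distributions and weights are hypothetical placeholders.
    """
    scores = []
    for _ in range(n_trials):
        quality = random.gauss(80, 5)   # perceived quality
        value = random.gauss(75, 7)     # perceived value
        expect = random.gauss(78, 4)    # customer expectations
        idx = 0.5 * quality + 0.3 * value + 0.2 * expect
        scores.append(min(100.0, max(0.0, idx)))  # clip to scale
    return scores

scores = simulate_index()
baseline = statistics.mean(scores)   # baseline predicted index
spread = statistics.stdev(scores)    # uncertainty around the baseline
```

Sensitivity analysis then follows by perturbing one driver's distribution at a time and observing how the baseline and spread shift, which is essentially what the dashboard described above visualizes.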
Nott, Melissa T; Chapparo, Christine
2008-09-01
Agitation following traumatic brain injury is characterised by a heightened state of activity with disorganised information processing that interferes with learning and achieving functional goals. This study aimed to identify information processing problems during task performance of a severely agitated adult using the Perceive, Recall, Plan and Perform (PRPP) System of Task Analysis. Second, this study aimed to examine the sensitivity of the PRPP System to changes in task performance over a short period of rehabilitation, and third, to evaluate the guidance provided by the PRPP in directing intervention. A case study research design was employed. The PRPP System of Task Analysis was used to assess changes in task embedded information processing capacity during occupational therapy intervention with a severely agitated adult in a rehabilitation context. Performance is assessed on three selected tasks over a one-month period. Information processing difficulties during task performance can be clearly identified when observing a severely agitated adult following a traumatic brain injury. Processing skills involving attention, sensory processing and planning were most affected at this stage of rehabilitation. These processing difficulties are linked to established descriptions of agitated behaviour. Fluctuations in performance across three tasks of differing processing complexity were evident, leading to hypothesised relationships between task complexity, environment and novelty with information processing errors. Changes in specific information processing capacity over time were evident based on repeated measures using the PRPP System of Task Analysis. This lends preliminary support for its utility as an outcome measure, and raises hypotheses about the type of therapy required to enhance information processing in people with severe agitation. 
The PRPP System is sensitive to information processing changes in severely agitated adults when used to reassess performance over short intervals and can provide direct guidance to occupational therapy intervention to improve task embedded information processing by categorising errors under four stages of an information processing model: Perceive, Recall, Plan and Perform.
Preliminary Thermal-Mechanical Sizing of Metallic TPS: Process Development and Sensitivity Studies
NASA Technical Reports Server (NTRS)
Poteet, Carl C.; Abu-Khajeel, Hasan; Hsu, Su-Yuen
2002-01-01
The purpose of this research was to perform sensitivity studies and develop a process to perform thermal and structural analysis and sizing of the latest Metallic Thermal Protection System (TPS) developed at NASA LaRC (Langley Research Center). Metallic TPS is a key technology for reducing the cost of reusable launch vehicles (RLV), offering the combination of increased durability and competitive weights when compared to other systems. Accurate sizing of metallic TPS requires combined thermal and structural analysis. Initial sensitivity studies were conducted using transient one-dimensional finite element thermal analysis to determine the influence of various TPS and analysis parameters on TPS weight. The thermal analysis model was then used in combination with static deflection and failure mode analysis of the sandwich panel outer surface of the TPS to obtain minimum weight TPS configurations at three vehicle stations on the windward centerline of a representative RLV. The coupled nature of the analysis requires an iterative analysis process, which will be described herein. Findings from the sensitivity analysis are reported, along with TPS designs at the three RLV vehicle stations considered.
Xu, Jia-Min; Wang, Ce-Qun; Lin, Long-Nian
2014-06-25
Multi-channel in vivo recording techniques are used to record ensemble neuronal activity and local field potentials (LFPs) simultaneously. A key point of the technique is how to process these two sets of recorded neural signals properly so that data accuracy can be assured. We introduce data processing approaches for action potentials and LFPs based on original data collected through a multi-channel recording system. Action potentials are high-frequency signals, so a high sampling rate of 40 kHz is normally chosen for recording. Based on the waveforms of extracellularly recorded action potentials, tetrode technology combined with principal component analysis can be used to discriminate the spiking signals of spatially distinct neurons and thereby obtain accurate single-neuron spiking activity. LFPs are low-frequency signals (below 300 Hz), so a sampling rate of 1 kHz is used. Digital filtering is required for LFP analysis to isolate different frequency oscillations: theta (4-12 Hz), dominant during active exploration and rapid-eye-movement (REM) sleep; gamma (30-80 Hz), which accompanies theta during cognitive processing; and high-frequency ripples (100-250 Hz), seen during awake immobility and slow-wave sleep (SWS) in the rodent hippocampus. For the obtained signals, common post-processing methods include inter-spike interval analysis, spike auto-correlation analysis, spike cross-correlation analysis, power spectral density analysis, and spectrogram analysis.
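Isolating those frequency bands amounts to estimating signal power in each range. A minimal pure-Python sketch using a discrete Fourier correlation on a synthetic LFP trace (illustrative only; the synthetic signal and band edges follow the text, but real pipelines use proper digital filters such as Butterworth band-passes):

```python
import math

FS = 1000          # LFP sampling rate in Hz, as in the text
N = FS             # one second of signal

# Synthetic LFP: a strong 8 Hz theta component plus weaker 50 Hz gamma.
lfp = [math.sin(2 * math.pi * 8 * n / FS)
       + 0.4 * math.sin(2 * math.pi * 50 * n / FS)
       for n in range(N)]

def band_power(x, fs, f_lo, f_hi):
    """Sum of DFT magnitudes over integer frequencies in [f_lo, f_hi]."""
    total = 0.0
    for f in range(f_lo, f_hi + 1):
        re = sum(x[n] * math.cos(2 * math.pi * f * n / fs)
                 for n in range(len(x)))
        im = sum(x[n] * math.sin(2 * math.pi * f * n / fs)
                 for n in range(len(x)))
        total += math.hypot(re, im)
    return total

theta = band_power(lfp, FS, 4, 12)      # theta band (4-12 Hz)
gamma = band_power(lfp, FS, 30, 80)     # gamma band (30-80 Hz)
ripple = band_power(lfp, FS, 100, 250)  # ripple band (100-250 Hz)
```

On this synthetic trace the theta band dominates, the gamma band carries the weaker 50 Hz component, and the ripple band is empty, which is exactly the separation the filtering step is meant to achieve.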
The initial design of LAPAN's IR micro bolometer using mission analysis process
NASA Astrophysics Data System (ADS)
Bustanul, A.; Irwan, P.; M. T., Andi; Firman, B.
2016-11-01
As a new player in the infrared (IR) sector, LAPAN has chosen an uncooled, small, and lightweight IR microbolometer as one of the payloads for its next microsatellite project. Driven by the desire to create our own IR microbolometer, a mission-analysis design procedure was applied. After tracing all possible missions, Planck's and Wien's laws for black bodies, temperature responsivity (TR), and sub-pixel response were used to determine the appropriate spectral radiance. The 3.8-4 μm band was selected to detect wildfires (forest fires) and active volcanoes, two major problems faced by Indonesia. To strengthen and broaden the results, iteration was used throughout the process. The analysis then continued with calculation of the ground pixel size, pixel IFOV, swath width, and focal length; the target ground resolution is at least 400 m. The procedure further covered the integration of the optical design, combining optical design software (Zemax) with mechanical (structural and thermal) analysis software such as Nastran and Thermal Desktop/SINDA/FLUINT. The integration process was intended to produce a high-performance optical system for our IR microbolometer that can operate in extreme environments. The results of these analyses, in both graphs and measurements, show that the initial design of LAPAN's IR microbolometer meets the determined requirements, although further evaluation (iteration) is needed. This paper describes the initial design of LAPAN's IR microbolometer using a mission analysis process.
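The spectral-radiance calculation behind that band selection rests on Planck's law, with Wien's displacement law locating the emission peak. A small worked sketch of the generic physics; the 1000 K "hot surface" temperature is an illustrative assumption, not LAPAN's actual design number:

```python
import math

H = 6.626e-34    # Planck constant, J*s
C = 2.998e8      # speed of light, m/s
KB = 1.381e-23   # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Black-body spectral radiance B(lambda, T) in W / (m^2 * sr * m)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    return a / (math.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0)

def wien_peak(temp_k):
    """Wavelength of peak emission (Wien's displacement law), in metres."""
    return 2.898e-3 / temp_k

# A hot volcanic or fire surface near 1000 K peaks around 2.9 um, so the
# 3.8-4 um band sits on the bright shoulder of its emission curve while
# the ~300 K ambient background is far dimmer there.
hot_peak = wien_peak(1000.0)
radiance_hot = planck_radiance(3.9e-6, 1000.0)      # hot target, in-band
radiance_ambient = planck_radiance(3.9e-6, 300.0)   # ambient background
```

The large in-band contrast between the hot target and the ambient background is what makes the 3.8-4 μm window attractive for fire and volcano detection.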
Advanced signal processing analysis of laser-induced breakdown spectroscopy data for the discrimination of obsidian sources
2012-02-09
…different sources [12,13], but the analytical techniques needed for such analysis (XRD, INAA, and ICP-MS) are time consuming and require expensive… a partial least-squares discriminant analysis (PLSDA) that used the SIMPLS solving method [33]. In the experiment design, a leave-one-sample-out (LOSO)…
General RMP Guidance - Appendix D: OSHA Guidance on PSM
OSHA's Process Safety Management (PSM) Guidance on providing complete and accurate written information concerning process chemicals, process technology, and process equipment; including process hazard analysis and material safety data sheets.
CPAS Preflight Drop Test Analysis Process
NASA Technical Reports Server (NTRS)
Englert, Megan E.; Bledsoe, Kristin J.; Romero, Leah M.
2015-01-01
Throughout the Capsule Parachute Assembly System (CPAS) drop test program, the CPAS Analysis Team has developed a simulation and analysis process to support drop test planning and execution. This process includes multiple phases focused on developing test simulations and communicating results to all groups involved in the drop test. CPAS Engineering Development Unit (EDU) series drop test planning begins with the development of a basic operational concept for each test. Trajectory simulation tools include the Flight Analysis and Simulation Tool (FAST) for single bodies, and the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation for the mated vehicle. Results are communicated to the team at the Test Configuration Review (TCR) and Test Readiness Review (TRR), as well as at Analysis Integrated Product Team (IPT) meetings in earlier and intermediate phases of the pre-test planning. The ability to plan and communicate efficiently with rapidly changing objectives and tight schedule constraints is a necessity for safe and successful drop tests.
Conducting a narrative analysis.
Emden, C
1998-07-01
This paper describes the process of narrative analysis as undertaken within a nursing study on scholars and scholarship. It follows an earlier paper, 'Theoretical perspectives on narrative inquiry', that described the influencing ideas of Bruner (1987) and Roof (1994) upon the same study. Analysis procedures are described here in sufficient detail for other researchers wishing to implement a similar approach to do so. The process as described has two main components: (a) strategies of 'core story creation' and 'emplotment'; and (b) issues and dilemmas of narrative analysis, especially relating to rigour. The ideas of Polkinghorne (1988), Mishler (1986), and Labov (in Mishler 1986a) are introduced in so far as they impinge upon the analysis process. These relate especially to the development of key terms and to the analysis strategies of core story creation and emplotment. Outcomes of the study in question are termed 'Signposting the lived-world of scholarship'.
The detection and analysis of point processes in biological signals
NASA Technical Reports Server (NTRS)
Anderson, D. J.; Correia, M. J.
1977-01-01
A pragmatic approach to the detection and analysis of discrete events in biomedical signals is taken. Examples from both clinical and basic research are provided. Introductory sections discuss not only discrete events which are easily extracted from recordings by conventional threshold detectors but also events embedded in other information carrying signals. The primary considerations are factors governing event-time resolution and the effects limits to this resolution have on the subsequent analysis of the underlying process. The analysis portion describes tests for qualifying the records as stationary point processes and procedures for providing meaningful information about the biological signals under investigation. All of these procedures are designed to be implemented on laboratory computers of modest computational capacity.
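The threshold-detection and inter-event-interval steps described above are straightforward to sketch. A minimal pure-Python version on a synthetic record; the signal and threshold are invented for illustration, not the authors' laboratory implementation:

```python
def detect_events(signal, threshold):
    """Return sample indices of upward threshold crossings
    (a simple model of a conventional threshold detector)."""
    events = []
    for i in range(1, len(signal)):
        if signal[i - 1] < threshold <= signal[i]:
            events.append(i)
    return events

def inter_event_intervals(event_indices, fs):
    """Intervals between successive events, in seconds."""
    return [(b - a) / fs for a, b in zip(event_indices, event_indices[1:])]

# Synthetic record: a discrete event every 100 samples on a flat baseline.
fs = 1000  # sampling rate, Hz
record = [0.0] * 1000
for k in range(100, 1000, 100):
    record[k] = 1.0

events = detect_events(record, 0.5)
isis = inter_event_intervals(events, fs)  # regular 0.1 s intervals
```

Event-time resolution here is one sample period (1/fs), which bounds the precision of any subsequent interval or point-process statistics, the limitation the abstract emphasizes.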
Processing and Analysis of Mars Pathfinder Science Data at JPL's Science Data Processing Section
NASA Technical Reports Server (NTRS)
LaVoie, S.; Green, W.; Runkle, A.; Alexander, D.; Andres, P.; DeJong, E.; Duxbury, E.; Freda, D.; Gorjian, Z.; Hall, J.;
1998-01-01
The Mars Pathfinder mission required new capabilities and adaptation of existing capabilities in order to support science analysis and flight operations requirements imposed by the in-situ nature of the mission.
Sustainability Analysis for Products and Processes
Subhas K. Sikdar, National Risk Management Research Laboratory, United States Environmental Protection Agency, 26 W. M.L. King Dr., Cincinnati, OH 45237 (Sikdar.subhas@epa.gov). ABSTRACT: Claims of both sustainable and unsu…
ERIC Educational Resources Information Center
Burns, Daniel J.; Martens, Nicholas J.; Bertoni, Alicia A.; Sweeney, Emily J.; Lividini, Michelle D.
2006-01-01
In a repeated testing paradigm, list items receiving item-specific processing are more likely to be recovered across successive tests (item gains), whereas items receiving relational processing are likely to be forgotten progressively less on successive tests. Moreover, analysis of cumulative-recall curves has shown that item-specific processing…
ERIC Educational Resources Information Center
Bohrn, Isabel C.; Altmann, Ulrike; Jacobs, Arthur M.
2012-01-01
A quantitative, coordinate-based meta-analysis combined data from 354 participants across 22 fMRI studies and one positron emission tomography (PET) study to identify the differences in neural correlates of figurative and literal language processing, and to investigate the role of the right hemisphere (RH) in figurative language processing.…
MSEE: Stochastic Cognitive Linguistic Behavior Models for Semantic Sensing
2013-09-01
recognition, a Gaussian Process Dynamic Model with Social Network Analysis (GPDM-SNA) for small human group action recognition, an extended GPDM-SNA..., and small human group activity modeling based on a Gaussian Process Dynamic Model and Social Network Analysis (SN-GPDM)... Approved for public release; distribution unlimited.
PSP, TSP, XP, CMMI...Eating the Alphabet Soup!
2011-05-19
Topics include continuous process improvement and CMMI maturity levels, focus areas, and process areas: Organizational Performance Management, Causal Analysis and Resolution, Requirements..., Project Management process standardization, Risk Management, Decision Analysis and Resolution, Product Integration, and basic project management.
A cost analysis: processing maple syrup products
Neil K. Huyler; Lawrence D. Garrett
1979-01-01
A cost analysis of processing maple sap to syrup for three fuel types, oil-, wood-, and LP gas-fired evaporators, indicates that: (1) fuel, capital, and labor are the major cost components of processing sap to syrup; (2) wood-fired evaporators show a slight cost advantage over oil- and LP gas-fired evaporators; however, as the cost of wood approaches $50 per cord, wood...
2011-12-01
systems engineering technical and technical management processes. Technical Planning, Stakeholders Requirements Development, and Architecture Design were...Stakeholder Requirements Definition, Architecture Design, and Technical Planning. A purposive sampling of AFRL rapid development program managers and engineers...emphasize one process over another; however, Architecture Design and Implementation scored higher among Technical Processes. Decision Analysis, Technical...
The instrumental genesis process in future primary teachers using Dynamic Geometry Software
NASA Astrophysics Data System (ADS)
Ruiz-López, Natalia
2018-05-01
This paper, which describes a study undertaken with pairs of future primary teachers using GeoGebra software to solve geometry problems, includes a brief literature review, the theoretical framework and methodology used. An analysis of the instrumental genesis process for a pair participating in the case study is also provided. This analysis addresses the techniques and types of dragging used, the obstacles to learning encountered, a description of the interaction between the pair and their interaction with the teacher, and the type of language used. Based on this analysis, possibilities and limitations of the instrumental genesis process are identified for the development of geometric competencies such as conjecture creation, property checking and problem researching. It is also suggested that the methodology used in the analysis of the problem solving process may be useful for those teachers and researchers who want to integrate Dynamic Geometry Software (DGS) in their classrooms.
Closed Loop Requirements and Analysis Management
NASA Technical Reports Server (NTRS)
Lamoreaux, Michael; Verhoef, Brett
2015-01-01
Effective systems engineering involves the use of analysis in the derivation of requirements and the verification of designs against those requirements. The initial development of requirements often depends on analysis for the technical definition of specific aspects of a product. Following the allocation of system-level requirements to a product's components, the closure of those requirements often involves analytical approaches to verify that the requirement criteria have been satisfied. Meanwhile, changes that occur between these two processes need to be managed in order to achieve a closed-loop requirement derivation/verification process. Herein are presented concepts for employing emerging Teamcenter capabilities to jointly manage requirements and analysis data such that analytical techniques are used to effectively derive and allocate requirements, analyses are consulted and updated during change evaluation, and analyses are leveraged during design verification. Recommendations on concept validation case studies are also discussed.
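The closed-loop idea, in which a change to a requirement invalidates the analyses that previously verified it, can be sketched generically in a few lines. This is an illustrative model only; the class names and fields are hypothetical and do not represent the Teamcenter data model:

```python
from dataclasses import dataclass, field

@dataclass
class Analysis:
    name: str
    verified_revision: int = -1  # requirement revision this analysis last verified

@dataclass
class Requirement:
    req_id: str
    text: str
    revision: int = 0
    analyses: list = field(default_factory=list)

    def change(self, new_text):
        # Any change to the requirement text invalidates prior verification.
        self.text = new_text
        self.revision += 1

    def verify(self, analysis):
        analysis.verified_revision = self.revision

    def stale_analyses(self):
        # Analyses that must be re-run before the requirement can be closed.
        return [a for a in self.analyses if a.verified_revision != self.revision]

req = Requirement("SYS-042", "Component mass shall not exceed 120 kg")
fea = Analysis("structural FEA")
req.analyses.append(fea)
req.verify(fea)                        # closed against revision 0
req.change("Component mass shall not exceed 110 kg")
print([a.name for a in req.stale_analyses()])  # prints ['structural FEA']
```

The point of the sketch is the loop itself: deriving a requirement ties it to analyses, a later change bumps its revision, and the stale-analysis query tells the team which verifications must be revisited before closure.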