Science.gov

Sample records for protection process optimization

  1. Design of experiments for thermal protection system process optimization

    NASA Astrophysics Data System (ADS)

    Longani, Hans R.

    2000-01-01

    Solid Rocket Booster (SRB) structures were protected from heating due to aeroshear, radiation, and plume impingement by a Thermal Protection System (TPS) known as Marshall Sprayable Ablative (MSA-2). MSA-2 contains Volatile Organic Compounds (VOCs) and, under strict environmental legislation, had to be eliminated. MSA-2 was also classified as hazardous waste, which made its disposal very costly. Marshall Convergent Coating (MCC-1) replaced MSA-2 and eliminated the use of solvents by delivering the dry filler materials and the fluid resin system to a patented spray gun that utilizes the Convergent Spray Technologies spray process. The selection of the TPS material was based on risk assessment, performance comparisons, processing, application, and cost. A Design of Experiments technique was used to optimize the spraying parameters.
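
    The record above names Design of Experiments but gives neither the spray parameters nor the response that was optimized. The sketch below is a minimal, generic illustration of that step: a two-level full factorial over three hypothetical spray parameters with a synthetic response, with main effects estimated by least squares. The factor names, levels, and response model are assumptions, not data from the study.

```python
# Minimal design-of-experiments sketch (hypothetical factors, synthetic response).
import itertools
import numpy as np

# Three hypothetical spray parameters, coded to -1/+1 levels.
factors = ["fan_air_pressure", "resin_flow_rate", "gun_standoff"]
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))  # 2^3 runs

# Synthetic response standing in for a measured TPS property (e.g., bond strength).
rng = np.random.default_rng(0)
response = (10 + 2.0 * design[:, 0] - 1.5 * design[:, 1] + 0.2 * design[:, 2]
            + rng.normal(0, 0.3, len(design)))

# Fit an intercept + main-effects model and rank the factors by effect size.
X = np.column_stack([np.ones(len(design)), design])
coeffs, *_ = np.linalg.lstsq(X, response, rcond=None)
for name, beta in sorted(zip(factors, coeffs[1:]), key=lambda t: -abs(t[1])):
    # For coded -1/+1 factors, the low-to-high effect is twice the regression coefficient.
    print(f"{name}: estimated effect {2 * beta:+.2f}")
```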

  2. Optimization of radiation protection

    SciTech Connect

    Lochard, J.

    1981-07-01

    The practical and theoretical problems raised by the optimization of radiological protection merit a review of decision-making methods, their relevance, and the way in which they are used in order to better determine what role they should play in the decision-making process. Following a brief summary of the theoretical background of the cost-benefit analysis, we examine the methodological choices implicit in the model presented in the International Commission on Radiological Protection Publication No. 26 and, particularly, the consequences of the theory that the level of radiation protection, the benefits, and the production costs of an activity can be treated separately.

  3. Multiobjective optimization of temporal processes.

    PubMed

    Song, Zhe; Kusiak, Andrew

    2010-06-01

    This paper presents a dynamic predictive-optimization framework of a nonlinear temporal process. Data-mining (DM) and evolutionary strategy algorithms are integrated in the framework for solving the optimization model. DM algorithms learn dynamic equations from the process data. An evolutionary strategy algorithm is then applied to solve the optimization problem guided by the knowledge extracted by the DM algorithm. The concept presented in this paper is illustrated with the data from a power plant, where the goal is to maximize the boiler efficiency and minimize the limestone consumption. This multiobjective optimization problem can be transformed either into a single-objective optimization problem through preference-aggregation approaches or into a Pareto-optimal optimization problem. The computational results have shown the effectiveness of the proposed optimization framework. PMID:19900853
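
    The abstract names preference aggregation plus an evolutionary strategy but does not reproduce the power-plant model or data. As a minimal sketch of that combination, the code below runs a small (mu + lambda) evolution strategy on a weighted sum of two toy objectives standing in for boiler-efficiency loss and limestone consumption; the objective functions, weights, and algorithm parameters are assumptions.

```python
# Minimal (mu + lambda) evolution strategy on a weighted-sum aggregation of two toy objectives.
import numpy as np

def objectives(x):
    f1 = (x[0] - 1.0) ** 2 + x[1] ** 2       # stand-in for "loss of boiler efficiency"
    f2 = x[0] ** 2 + (x[1] - 2.0) ** 2       # stand-in for "limestone consumption"
    return f1, f2

def aggregate(x, w=(0.6, 0.4)):              # preference aggregation: weighted sum
    f1, f2 = objectives(x)
    return w[0] * f1 + w[1] * f2

rng = np.random.default_rng(1)
mu, lam, sigma = 5, 20, 0.5
parents = rng.normal(0.0, 2.0, size=(mu, 2))

for gen in range(100):
    # Each offspring mutates a randomly chosen parent with Gaussian noise.
    idx = rng.integers(0, mu, size=lam)
    offspring = parents[idx] + rng.normal(0.0, sigma, size=(lam, 2))
    pool = np.vstack([parents, offspring])
    pool = pool[np.argsort([aggregate(x) for x in pool])]
    parents = pool[:mu]                       # (mu + lambda) selection
    sigma *= 0.98                             # simple step-size decay

best = parents[0]
print("best settings:", best, "objectives:", objectives(best))
```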

  4. Optimal segmentation and packaging process

    DOEpatents

    Kostelnik, Kevin M.; Meservey, Richard H.; Landon, Mark D.

    1999-01-01

    A process for improving packaging efficiency uses three-dimensional, computer-simulated models with various optimization algorithms to determine the optimal segmentation process and packaging configurations based on constraints including container limitations. The present invention is applied to a process for decontaminating, decommissioning (D&D), and remediating a nuclear facility involving the segmentation and packaging of contaminated items in waste containers in order to minimize the number of cuts, maximize packaging density, and reduce worker radiation exposure. A three-dimensional, computer-simulated facility model of the contaminated items is created. The contaminated items are differentiated. The optimal location, orientation, and sequence of the segmentation and packaging of the contaminated items are determined using the simulated model, the algorithms, and various constraints including container limitations. The cut locations and orientations are transposed to the simulated model. The contaminated items are then actually segmented and packaged. The segmentation and packaging may be simulated beforehand. In addition, the contaminated items may be cataloged and recorded.

  5. Process optimization in optical fabrication

    NASA Astrophysics Data System (ADS)

    Faehnle, Oliver

    2016-03-01

    Predictable and stable fabrication processes are essential for reliable cost and quality management in optical fabrication technology. This paper reports on strategies to generate and control optimum sets of process parameters for, e.g., subaperture polishing of small optics (featuring clear apertures smaller than 2 mm). Emphasis is placed on distinguishing between machine and process optimization, demonstrating that it is possible to set up the ductile-mode grinding process by means other than controlling critical depth of cut. Finally, a recently developed in situ testing technique is applied to monitor surface quality on-machine while the surface under test is being abrasively worked, enabling online optimization of polishing processes and ultimately minimizing polishing time and fabrication cost.

  6. Abrasion protection in process piping

    SciTech Connect

    Accetta, J.

    1996-07-01

    Process piping often is subjected to failure from abrasion or a combination of abrasion and corrosion. Abrasion is a complex phenomenon, with many factors involved to varying degrees. Hard, mineral based alumina ceramic and basalt materials are used to provide protection against abrasion in many piping systems. Successful life extension examples are presented from many different industries. Lined piping components require special attention with regard to operating conditions as well as design and engineering considerations. Economic justification involves direct cost comparisons and avoided costs.

  7. Optimal Climate Protection Policies Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Weber, M.; Barth, V.; Hasselmann, K.; Hooss, G.

    A cost-benefit analysis for greenhouse warming based on a globally integrated coupled climate-macroeconomic cost model SIAM2 (Structural Integrated Assessment Model) is used to compute optimal paths of global CO2 emissions. The aim of the model is to minimize the net time-integrated sum of climate damage and mitigation costs (or maximize the economic and social welfare). The climate model is represented by a nonlinear impulse-response model (NICCS) calibrated against a coupled ocean-atmosphere general circulation model and a three-dimensional global carbon cycle model. The latest version of the economic module is based on a macroeconomic growth model, which is designed to capture not only the interactions between climate damages and economic development, but also the conflicting goals of individual firms and society (government). The model includes unemployment, limited fossil fuel resources, and endogenous and stochastic exogenous technological development (unpredictable labor or fuel efficiency innovations of random impact amplitude at random points in time). One objective of the project is to examine optimal climate protection policies in the presence of uncertainty. A stochastic model is introduced to simulate the development of technology as well as climate change and climate damages. In response to this (stochastic) prediction, the fiscal policy is adjusted gradually in a series of discrete steps. The stochastic module includes probability-based methods, sensitivity studies, and formal scenario analysis.

  8. Optimal design of solidification processes

    NASA Technical Reports Server (NTRS)

    Dantzig, Jonathan A.; Tortorelli, Daniel A.

    1991-01-01

    An optimal design algorithm is presented for the analysis of general solidification processes, and is demonstrated for the growth of GaAs crystals in a Bridgman furnace. The system is optimal in the sense that the prespecified temperature distribution in the solidifying materials is obtained to maximize product quality. The optimization uses traditional numerical programming techniques which require the evaluation of cost and constraint functions and their sensitivities. The finite element method is incorporated to analyze the crystal solidification problem, evaluate the cost and constraint functions, and compute the sensitivities. These techniques are demonstrated in the crystal growth application by determining an optimal furnace wall temperature distribution to obtain the desired temperature profile in the crystal, and hence to maximize the crystal's quality. Several numerical optimization algorithms are studied to determine the proper convergence criteria, effective 1-D search strategies, appropriate forms of the cost and constraint functions, etc. In particular, we incorporate the conjugate gradient and quasi-Newton methods for unconstrained problems. The efficiency and effectiveness of each algorithm are presented in the example problem.
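
    The furnace model, cost and constraint functions, and sensitivities used in the paper are not given in the abstract. The sketch below only illustrates the class of methods mentioned (conjugate gradient and quasi-Newton, here via scipy) on an invented toy problem: choosing furnace-wall zone temperatures so that a simple linear response matrix reproduces a target axial temperature profile.

```python
# Toy inverse design: choose furnace wall temperatures so a linear model of the
# crystal temperature matches a prescribed profile (CG and BFGS from scipy).
import numpy as np
from scipy.optimize import minimize

n_wall, n_crystal = 6, 30
z = np.linspace(0.0, 1.0, n_crystal)
target = 900.0 + 150.0 * z                       # desired axial profile (invented)

# Invented influence matrix: each wall zone mostly heats nearby crystal locations.
centers = np.linspace(0.0, 1.0, n_wall)
A = np.exp(-((z[:, None] - centers[None, :]) ** 2) / 0.02)
A /= A.sum(axis=1, keepdims=True)

def cost(t_wall):
    r = A @ t_wall - target
    return 0.5 * r @ r

def grad(t_wall):                                # analytic sensitivity of the cost
    return A.T @ (A @ t_wall - target)

x0 = np.full(n_wall, 900.0)
for method in ("CG", "BFGS"):
    res = minimize(cost, x0, jac=grad, method=method)
    print(method, "final cost:", res.fun, "iterations:", res.nit)
```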

  9. Optimal segmentation and packaging process

    DOEpatents

    Kostelnik, K.M.; Meservey, R.H.; Landon, M.D.

    1999-08-10

    A process for improving packaging efficiency uses three-dimensional, computer-simulated models with various optimization algorithms to determine the optimal segmentation process and packaging configurations based on constraints including container limitations. The present invention is applied to a process for decontaminating, decommissioning (D and D), and remediating a nuclear facility involving the segmentation and packaging of contaminated items in waste containers in order to minimize the number of cuts, maximize packaging density, and reduce worker radiation exposure. A three-dimensional, computer-simulated facility model of the contaminated items is created. The contaminated items are differentiated. The optimal location, orientation, and sequence of the segmentation and packaging of the contaminated items are determined using the simulated model, the algorithms, and various constraints including container limitations. The cut locations and orientations are transposed to the simulated model. The contaminated items are then actually segmented and packaged. The segmentation and packaging may be simulated beforehand. In addition, the contaminated items may be cataloged and recorded. 3 figs.

  10. Optimal Protective Hypothermia in Arrested Mammalian Hearts

    PubMed Central

    Villet, Outi M.; Ge, Ming; Sekhar, Laigam N.; Corson, Marshall A.; Tylee, Tracy S.; Fan, Lu-Ping; Yao, Lin; Zhu, Chun; Olson, Aaron K.; Buroker, Norman E.; Xu, Cheng-Su; Anderson, David L.; Soh, Yong-Kian; Wang, Elise; Chen, Shi-Han; Portman, Michael A.

    2015-01-01

    Many therapeutic hypothermia recommendations have been reported, but the information supporting them is sparse, revealing a need for data on target therapeutic hypothermia (TTH) from well-controlled experiments. A core temperature ≤35°C is considered hypothermia, and 29°C is a cooling-injury threshold in the pig heart in vivo. Thus, an optimal protective hypothermia (OPH) should lie in the range 29–35°C. This study used a pig cardiopulmonary bypass preparation to decrease the core temperature to the 29–35°C range for 20 minutes before and 60 minutes during heart arrest. The left ventricular (LV) developed pressure, maximum of the first derivative of LV pressure (dP/dtmax), cardiac power, heart rate, cardiac output, and myocardial velocity (Vmax) were recorded continuously via an LV pressure catheter and an aortic flow probe. At 20 minutes of off-pump reperfusion after 60 minutes of arrest, 17 hypothermic hearts showed that the recovery of Vmax and dP/dtmax followed sigmoid curves consisting of two plateaus: a good recovery plateau at 29–30.5°C, where function recovered to baseline level (BL) (Vmax=118.4%±3.9% of BL, LV dP/dtmax=120.7%±3.1% of BL, n=6), and a poor recovery plateau at 34–35°C (Vmax=60.2%±2.8% of BL, LV dP/dtmax=28.0%±5.9% of BL, p<0.05, n=6), similar to the four normothermic arrest (37°C) hearts (Vmax=55.9%±4.8% of BL, LV dP/dtmax=24.5%±2.1% of BL, n=4). The 32–32.5°C arrest hearts showed moderate recovery (n=5). A point of inflection (around 30.5–31°C) existed at the edge of the good recovery plateau, followed by a steep slope. This point represents an OPH that should serve as the TTH. The results are concordant with data in mammalian hearts, suggesting that TTH should target a core temperature of 31°C. PMID:25514569

  11. Pinch technology/process optimization

    SciTech Connect

    Not Available

    1992-12-01

    Improved process efficiency is of great importance to electric utilities and their industrial customers. It enhances company profitability, thereby fostering load retention and strategic load growth. Moreover, the technical means of achieving improved efficiency can significantly impact utility load shapes. By understanding the energy use patterns and options in an industrial facility, the utility and industrial user can work together to define mutually beneficial investment and operating decisions and to clarify how the decisions might be impacted by existing or alternative energy prices. Efforts to achieve such understanding are facilitated by using pinch technology, an innovative and highly effective methodology for systematically analyzing total industrial sites. This report documents a series of twelve industrial process optimization case studies. The studies were carried out using pinch technology. Each study was cosponsored by the industrial site's local electric utility. The twelve studies are as follows: (1) pulp and paper, (2) refinery, (3) refinery, (4) yeast, (5) soups/sauces, (6) cellulose acetate, (7) refinery, (8) chemicals, (9) gelatin capsules, (10) refinery, (11) brewery, (12) cereal grains.

  12. Optimized shielding for space radiation protection

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Cucinotta, F. A.; Kim, M. H.; Schimmerling, W.

    2001-01-01

    Future deep space mission and International Space Station exposures will be dominated by the high-charge and -energy (HZE) ions of the Galactic Cosmic Rays (GCR). A few mammalian systems have been extensively tested over a broad range of ion types and energies. For example, C3H10T1/2 cells, V79 cells, and Harderian gland tumors have been described by various track-structure dependent response models. The attenuation of GCR induced biological effects depends strongly on the biological endpoint, response model used, and material composition. Optimization of space shielding is then driven by the nature of the response model and the transmission characteristics of the given material.

  13. Optimized Shielding for Space Radiation Protection

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Cucinotta, F. A.; Kim, M.-H. Y.; Schimmerling, W.

    2000-01-01

    Future deep space mission and International Space Station exposures will be dominated by the high-charge and -energy (HZE) ions of the Galactic Cosmic Rays (GCR). A few mammalian systems have been extensively tested over a broad range of ion types and energies. For example, C3H10T1/2 cells, V79 cells, and Harderian gland tumors have been described by various track-structure dependent response models. The attenuation of GCR induced biological effects depends strongly on the biological endpoint, response model used, and material composition. Optimization of space shielding is then driven by the nature of the response model and the transmission characteristics of the given material.

  14. Optimal timing in biological processes

    USGS Publications Warehouse

    Williams, B.K.; Nichols, J.D.

    1984-01-01

    A general approach for obtaining solutions to a class of biological optimization problems is provided. The general problem is one of determining the appropriate time to take some action, when the action can be taken only once during some finite time frame. The approach can also be extended to cover a number of other problems involving animal choice (e.g., mate selection, habitat selection). Returns (assumed to index fitness) are treated as random variables with time-specific distributions, and can be either observable or unobservable at the time action is taken. In the case of unobservable returns, the organism is assumed to base decisions on some ancillary variable that is associated with returns. Optimal policies are derived for both situations and their properties are discussed. Various extensions are also considered, including objective functions based on functions of returns other than the mean; nonmonotonic relationships between the observable variable and returns; possible death of the organism before action is taken; and discounting of future returns. A general feature of the optimal solutions for many of these problems is that an organism should be very selective (i.e., should act only when returns or expected returns are relatively high) at the beginning of the time frame and should become less and less selective as time progresses. An example of the application of optimal timing to a problem involving the timing of bird migration is discussed, and a number of other examples for which the approach is applicable are described.
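
    As a concrete instance of the timing problem described above (observable returns, finite horizon), the sketch below runs a backward-induction computation with assumed normal return distributions whose means decline over time; it reproduces the qualitative result that the optimal acceptance threshold falls as the end of the time frame approaches. The distributions are illustrative, not taken from the paper.

```python
# Backward induction for a one-shot "when to act" problem with observable returns.
import numpy as np

T = 10                                   # number of decision periods
rng = np.random.default_rng(2)
n_mc = 20000
# Hypothetical time-specific return distributions (e.g., declining means late in the season).
means = np.linspace(5.0, 3.0, T)
sds = np.full(T, 1.0)

# V[t] = expected return from behaving optimally at period t, before observing the draw.
V = np.zeros(T + 1)
thresholds = np.zeros(T)
for t in range(T - 1, -1, -1):
    draws = rng.normal(means[t], sds[t], n_mc)
    # Act now if the observed return beats the value of waiting (at the last period you must act).
    thresholds[t] = V[t + 1] if t < T - 1 else -np.inf
    V[t] = np.mean(np.maximum(draws, V[t + 1])) if t < T - 1 else np.mean(draws)

print("acceptance threshold by period:", np.round(thresholds, 2))
print("expected return of the optimal policy:", round(V[0], 2))
```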

  15. Optimization of the ion implantation process

    NASA Astrophysics Data System (ADS)

    Maczka, D.; Latuszynski, A.; Kuduk, R.; Partyka, J.

    This work is devoted to the optimization of the ion implantation process in the implanter Unimas of the Institute of Physics, Maria Curie-Sklodowska University, Lublin. The results obtained during several years of operation allow us to determine the optimal work parameters of the device [1-3].

  16. Multiobjective optimization approach: thermal food processing.

    PubMed

    Abakarov, A; Sushkov, Y; Almonacid, S; Simpson, R

    2009-01-01

    The objective of this study was to utilize a multiobjective optimization technique for the thermal sterilization of packaged foods. The multiobjective optimization approach used in this study is based on the optimization of well-known aggregating functions by an adaptive random search algorithm. The applicability of the proposed approach was illustrated by solving widely used multiobjective test problems taken from the literature. The numerical results obtained for the multiobjective test problems and for the thermal processing problem show that the proposed approach can be effectively used for solving multiobjective optimization problems arising in the food engineering field.

  17. Optimization of the Processing of Mo Disks

    SciTech Connect

    Tkac, Peter; Rotsch, David A.; Stepinski, Dominique; Makarashvili, Vakhtang; Harvey, James; Vandegrift, George F.

    2016-01-01

    The objective of this work is to decrease the processing time for irradiated disks of enriched Mo for the production of 99Mo. Results are given for the dissolution of nonirradiated Mo disks, optimization of the process for large-scale dissolution of sintered disks, optimization of the removal of the main side products (Zr and Nb) from dissolved targets, and dissolution of irradiated Mo disks.

  18. Synthesis and optimization of integrated chemical processes

    SciTech Connect

    Barton, Paul I.; Evans, Lawrence B.

    2002-04-26

    This is the final technical report for the project titled ''Synthesis and optimization of integrated chemical processes''. Progress is reported on novel algorithms for the computation of all heteroazeotropic compositions present in complex liquid mixtures; the design of novel flexible azeotropic separation processes using middle vessel batch distillation columns; and theory and algorithms for sensitivity analysis and numerical optimization of hybrid discrete/continuous dynamic systems.

  19. Process optimization for mask fabrication

    NASA Astrophysics Data System (ADS)

    Sakurai, Hideaki; Itoh, Masamitsu; Kumagae, Akitoshi; Anze, Hirohito; Abe, Takayuki; Higashikawa, Iwao

    1998-09-01

    Recently, next-generation mask fabrication processes have been actively examined for application with Electron Beam writing tools and chemically amplified resists. In this study, we used a variable shaped electron beam writing system with an accelerating voltage and chemically amplified resist to investigate the dependence of the CD error in a localized area of a 6025 mask on the process factors, with the goal of fabricating more accurate masks with improved sensitivity. Our results indicated that CD error in a localized area did not depend on the resist thickness. Higher sensitivity and CD uniformity were achieved simultaneously. Moreover, we could isolate the CD error caused by the resist heating effect, which is more apparent at higher doses than at lower doses. However, a higher dose gives rise to a smaller CD change rate. In this experiment, the effect of the lower CD change rate at a higher dose counterbalances the resist heating effect. By decreasing the CD error in a localized area, we obtained a CD uniformity of 14 nm over a 100 mm area on the mask.

  20. Global nonlinear optimization of spacecraft protective structures design

    NASA Technical Reports Server (NTRS)

    Mog, R. A.; Lovett, J. N., Jr.; Avans, S. L.

    1990-01-01

    The global optimization of protective structural designs for spacecraft subject to hypervelocity meteoroid and space debris impacts is presented. This nonlinear problem is first formulated for weight minimization of the space station core module configuration using the Nysmith impact predictor. Next, the equivalence and uniqueness of local and global optima is shown using properties of convexity. This analysis results in a new feasibility condition for this problem. The solution existence is then shown, followed by a comparison of optimization techniques. Finally, a sensitivity analysis is presented to determine the effects of variations in the systemic parameters on optimal design. The results show that global optimization of this problem is unique and may be achieved by a number of methods, provided the feasibility condition is satisfied. Furthermore, module structural design thicknesses and weight increase with increasing projectile velocity and diameter and decrease with increasing separation between bumper and wall for the Nysmith predictor.

  1. Design time optimization for hardware watermarking protection of HDL designs.

    PubMed

    Castillo, E; Morales, D P; García, A; Parrilla, L; Todorovich, E; Meyer-Baese, U

    2015-01-01

    HDL-level design offers important advantages for the application of watermarking to IP cores, but its complexity also requires tools automating these watermarking algorithms. A new tool for signature distribution through combinational logic is proposed in this work. IPP@HDL, a previously proposed high-level watermarking technique, has been employed for evaluating the tool. IPP@HDL relies on spreading the bits of a digital signature at the HDL design level using combinational logic included within the original system. The development of this new tool for the signature distribution has not only extended and eased the applicability of this IPP technique, but it has also improved the signature hosting process itself. Three algorithms were studied in order to develop this automated tool. The selection of a cost function determines the best hosting solutions in terms of area and performance penalties on the IP core to protect. A 1D-DWT core and MD5 and SHA1 digital signatures were used in order to illustrate the benefits of the new tool and its optimization related to the extraction logic resources. Among the proposed algorithms, the alternative based on simulated annealing reduces the additional resources while maintaining an acceptable computation time and also saving designer effort and time. PMID:25861681
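
    The abstract identifies simulated annealing as the preferred algorithm for distributing signature bits into host logic but does not give its cost function or the IPP@HDL internals. The sketch below is a generic simulated-annealing loop over a binary site-assignment vector with an invented area/delay cost; it illustrates the optimization pattern only and is not the tool's implementation.

```python
# Generic simulated annealing over a binary "hosting" assignment (invented cost model).
import numpy as np

rng = np.random.default_rng(3)
n_sites, n_bits = 64, 32
area_penalty = rng.uniform(0.1, 1.0, n_sites)    # invented per-site area cost
delay_penalty = rng.uniform(0.1, 1.0, n_sites)   # invented per-site delay cost

def cost(assign):
    # assign: boolean vector marking which candidate sites host a signature bit.
    return area_penalty[assign].sum() + 2.0 * delay_penalty[assign].max()

state = np.zeros(n_sites, dtype=bool)
state[rng.choice(n_sites, n_bits, replace=False)] = True
best, best_cost, temp = state.copy(), cost(state), 1.0

for step in range(5000):
    # Move: swap one hosting site for an unused one (keeps exactly n_bits sites active).
    cand = state.copy()
    cand[rng.choice(np.flatnonzero(cand))] = False
    cand[rng.choice(np.flatnonzero(~cand))] = True
    delta = cost(cand) - cost(state)
    if delta < 0 or rng.random() < np.exp(-delta / temp):
        state = cand
        if cost(state) < best_cost:
            best, best_cost = state.copy(), cost(state)
    temp *= 0.999                                  # geometric cooling schedule

print("best cost:", round(best_cost, 3), "first sites used:", np.flatnonzero(best)[:8])
```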

  2. Design time optimization for hardware watermarking protection of HDL designs.

    PubMed

    Castillo, E; Morales, D P; García, A; Parrilla, L; Todorovich, E; Meyer-Baese, U

    2015-01-01

    HDL-level design offers important advantages for the application of watermarking to IP cores, but its complexity also requires tools automating these watermarking algorithms. A new tool for signature distribution through combinational logic is proposed in this work. IPP@HDL, a previously proposed high-level watermarking technique, has been employed for evaluating the tool. IPP@HDL relies on spreading the bits of a digital signature at the HDL design level using combinational logic included within the original system. The development of this new tool for the signature distribution has not only extended and eased the applicability of this IPP technique, but it has also improved the signature hosting process itself. Three algorithms were studied in order to develop this automated tool. The selection of a cost function determines the best hosting solutions in terms of area and performance penalties on the IP core to protect. A 1D-DWT core and MD5 and SHA1 digital signatures were used in order to illustrate the benefits of the new tool and its optimization related to the extraction logic resources. Among the proposed algorithms, the alternative based on simulated annealing reduces the additional resources while maintaining an acceptable computation time and also saving designer effort and time.

  3. Design Time Optimization for Hardware Watermarking Protection of HDL Designs

    PubMed Central

    Castillo, E.; Morales, D. P.; García, A.; Parrilla, L.; Todorovich, E.; Meyer-Baese, U.

    2015-01-01

    HDL-level design offers important advantages for the application of watermarking to IP cores, but its complexity also requires tools automating these watermarking algorithms. A new tool for signature distribution through combinational logic is proposed in this work. IPP@HDL, a previously proposed high-level watermarking technique, has been employed for evaluating the tool. IPP@HDL relies on spreading the bits of a digital signature at the HDL design level using combinational logic included within the original system. The development of this new tool for the signature distribution has not only extended and eased the applicability of this IPP technique, but it has also improved the signature hosting process itself. Three algorithms were studied in order to develop this automated tool. The selection of a cost function determines the best hosting solutions in terms of area and performance penalties on the IP core to protect. A 1D-DWT core and MD5 and SHA1 digital signatures were used in order to illustrate the benefits of the new tool and its optimization related to the extraction logic resources. Among the proposed algorithms, the alternative based on simulated annealing reduces the additional resources while maintaining an acceptable computation time and also saving designer effort and time. PMID:25861681

  4. GRA prospectus: optimizing design and management of protected areas

    USGS Publications Warehouse

    Bernknopf, Richard; Halsing, David

    2001-01-01

    Protected areas comprise one major type of global conservation effort that has been in the form of parks, easements, or conservation concessions. Though protected areas are increasing in number and size throughout tropical ecosystems, there is no systematic method for optimally targeting specific local areas for protection, designing the protected area, and monitoring it, or for guiding follow-up actions to manage it or its surroundings over the long run. Without such a system, conservation projects often cost more than necessary and/or risk protecting ecosystems and biodiversity less efficiently than desired. Correcting these failures requires tools and strategies for improving the placement, design, and long-term management of protected areas. The objective of this project is to develop a set of spatially based analytical tools to improve the selection, design, and management of protected areas. In this project, several conservation concessions will be compared using an economic optimization technique. The forest land use portfolio model is an integrated assessment that measures investment in different land uses in a forest. The case studies of individual tropical ecosystems are developed as forest (land) use and preservation portfolios in a geographic information system (GIS). Conservation concessions involve a private organization purchasing development and resource access rights in a certain area and retiring them. Forests are put into conservation, and those people who would otherwise have benefited from extracting resources or selling the right to do so are compensated. Concessions are legal agreements wherein the exact amount and nature of the compensation result from a negotiated agreement between an agent of the conservation community and the local community. Funds are placed in a trust fund, and annual payments are made to local communities and regional/national governments. The payments are made pending third-party verification that the forest expanse

  5. Synthesis of optimal adsorptive carbon capture processes.

    SciTech Connect

    chang, Y.; Cozad, A.; Kim, H.; Lee, A.; Vouzis, P.; Konda, M.; Simon, A.; Sahinidis, N.; Miller, D.

    2011-01-01

    Solid sorbent carbon capture systems have the potential to require significantly lower regeneration energy compared to aqueous monoethanolamine (MEA) systems. To date, the majority of work on solid sorbents has focused on developing the sorbent materials themselves. In order to advance these technologies, it is necessary to design systems that can exploit the full potential and unique characteristics of these materials. The Department of Energy (DOE) recently initiated the Carbon Capture Simulation Initiative (CCSI) to develop computational tools to accelerate the commercialization of carbon capture technology. Solid sorbents are the first Industry Challenge Problem considered under this initiative. An early goal of the initiative is to demonstrate a superstructure-based framework to synthesize an optimal solid sorbent carbon capture process. For a given solid sorbent, there are a number of potential reactors and reactor configurations consisting of various fluidized bed reactors, moving bed reactors, and fixed bed reactors. Detailed process models for these reactors have been developed using Aspen Custom Modeler; however, such models are computationally intractable for large optimization-based process synthesis. Thus, in order to facilitate the use of these models for process synthesis, we have developed an approach for generating simple algebraic surrogate models that can be used in an optimization formulation. This presentation will describe the superstructure formulation which uses these surrogate models to choose among various process alternatives and will describe the resulting optimal process configuration.
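
    The CCSI superstructure formulation and the Aspen Custom Modeler reactor models are not reproduced here. As a minimal sketch of the surrogate-modeling idea described above, the code below samples an "expensive" black-box model (an invented stand-in), fits a simple quadratic algebraic surrogate by least squares, and then optimizes the cheap surrogate; all names and the model are placeholders.

```python
# Fit an algebraic (quadratic) surrogate to samples of an "expensive" model, then optimize it.
import numpy as np

def expensive_reactor_model(x):
    # Placeholder for a detailed process simulation (e.g., a capture-performance proxy).
    return 1.0 - np.exp(-2.0 * x[0]) + 0.3 * (x[1] - 0.5) ** 2

rng = np.random.default_rng(4)
samples = rng.uniform(0.0, 1.0, size=(40, 2))            # design points in [0, 1]^2
y = np.array([expensive_reactor_model(x) for x in samples])

# Quadratic basis: 1, x1, x2, x1^2, x2^2, x1*x2 (a simple algebraic surrogate form).
def basis(X):
    return np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                            X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])

coeffs, *_ = np.linalg.lstsq(basis(samples), y, rcond=None)

# Use the cheap surrogate inside an optimization (here a dense grid search).
g1, g2 = np.meshgrid(np.linspace(0, 1, 101), np.linspace(0, 1, 101))
grid = np.column_stack([g1.ravel(), g2.ravel()])
pred = basis(grid) @ coeffs
best = grid[np.argmin(pred)]
print("surrogate optimum at:", best,
      "true value there:", round(expensive_reactor_model(best), 4))
```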

  6. Bidirectional optimization of the melting spinning process.

    PubMed

    Liang, Xiao; Ding, Yongsheng; Wang, Zidong; Hao, Kuangrong; Hone, Kate; Wang, Huaping

    2014-02-01

    A bidirectional optimizing approach for the melting spinning process based on an immune-enhanced neural network is proposed. The proposed bidirectional model can not only reveal the internal nonlinear relationship between the process configuration and the quality indices of the fibers as final product, but also provide a tool for engineers to develop new fiber products with expected quality specifications. A neural network is taken as the basis for the bidirectional model, and an immune component is introduced to enlarge the searching scope of the solution field so that the neural network has a larger possibility to find the appropriate and reasonable solution, and the error of prediction can therefore be eliminated. The proposed intelligent model can also help to determine what kind of process configuration should be made in order to produce satisfactory fiber products. To make the proposed model practical to the manufacturing, a software platform is developed. Simulation results show that the proposed model can eliminate the approximation error raised by the neural network-based optimizing model, which is due to the extension of focusing scope by the artificial immune mechanism. Meanwhile, the proposed model with the corresponding software can conduct optimization in two directions, namely, the process optimization and category development, and the corresponding results outperform those with an ordinary neural network-based intelligent model. It is also proved that the proposed model has the potential to act as a valuable tool from which the engineers and decision makers of the spinning process could benefit.

  7. Hydrocarbon Processing's process design and optimization '96

    SciTech Connect

    1996-06-01

    This paper compiles information on hydrocarbon processes, describing the application, objective, economics, commercial installations, and licensor. Processes include: alkylation, ammonia, catalytic reformer, crude fractionator, crude unit, vacuum unit, dehydration, delayed coker, distillation, ethylene furnace, FCCU, polymerization, gas sweetening, hydrocracking, hydrogen, hydrotreating (naphtha, distillate, and resid desulfurization), natural gas processing, olefins, polyethylene terephthalate, refinery, styrene, sulfur recovery, and VCM furnace.

  8. Optimization Of A Mass Spectrometry Process

    SciTech Connect

    Lopes, Jose; Alegria, F. Correa; Redondo, Luis; Barradas, N. P.; Alves, E.; Rocha, Jorge

    2011-06-01

    In this paper we present and discuss a system developed to optimize the mass spectrometry process of an ion implanter. The system uses a PC to control and display the mass spectrum. The operator interacts, through LabVIEW code, with an I/O board that interfaces the computer with the ion implanter. Experimental results are shown and the capabilities of the system are discussed.

  9. On process optimization considering LCA methodology.

    PubMed

    Pieragostini, Carla; Mussati, Miguel C; Aguirre, Pío

    2012-04-15

    The goal of this work is to research the state-of-the-art in process optimization techniques and tools based on LCA, focused in the process engineering field. A collection of methods, approaches, applications, specific software packages, and insights regarding experiences and progress made in applying the LCA methodology coupled to optimization frameworks is provided, and general trends are identified. The "cradle-to-gate" concept to define the system boundaries is the most used approach in practice, instead of the "cradle-to-grave" approach. Normally, the relationship between inventory data and impact category indicators is linearly expressed by the characterization factors; then, synergic effects of the contaminants are neglected. Among the LCIA methods, the eco-indicator 99, which is based on the endpoint category and the panel method, is the most used in practice. A single environmental impact function, resulting from the aggregation of environmental impacts, is formulated as the environmental objective in most analyzed cases. SimaPro is the most used software for LCA applications in literature analyzed. The multi-objective optimization is the most used approach for dealing with this kind of problems, where the ε-constraint method for generating the Pareto set is the most applied technique. However, a renewed interest in formulating a single economic objective function in optimization frameworks can be observed, favored by the development of life cycle cost software and progress made in assessing costs of environmental externalities. Finally, a trend to deal with multi-period scenarios into integrated LCA-optimization frameworks can be distinguished providing more accurate results upon data availability.
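
    The review singles out the ε-constraint method for generating Pareto sets in LCA-coupled optimization. As a generic sketch of that method (not any specific case from the review), the code below sweeps an upper bound ε on an invented environmental-impact function while minimizing an invented cost with SLSQP, tracing out a small Pareto front.

```python
# epsilon-constraint method on a toy cost vs. environmental-impact trade-off.
import numpy as np
from scipy.optimize import minimize

def cost(x):            # invented economic objective
    return (x[0] - 3.0) ** 2 + (x[1] - 2.0) ** 2

def impact(x):          # invented aggregated environmental-impact objective
    return x[0] ** 2 + 0.5 * x[1] ** 2

pareto = []
x0 = np.array([1.0, 1.0])
for eps in np.linspace(1.0, 12.0, 8):
    # Minimize cost subject to impact(x) <= eps.
    res = minimize(cost, x0,
                   constraints=[{"type": "ineq", "fun": lambda x, e=eps: e - impact(x)}],
                   bounds=[(0.0, 5.0), (0.0, 5.0)], method="SLSQP")
    if res.success:
        pareto.append((impact(res.x), cost(res.x)))

for imp, c in pareto:
    print(f"impact <= {imp:6.2f}   cost = {c:6.2f}")
```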

  10. On process optimization considering LCA methodology.

    PubMed

    Pieragostini, Carla; Mussati, Miguel C; Aguirre, Pío

    2012-04-15

    The goal of this work is to research the state-of-the-art in process optimization techniques and tools based on LCA, focused in the process engineering field. A collection of methods, approaches, applications, specific software packages, and insights regarding experiences and progress made in applying the LCA methodology coupled to optimization frameworks is provided, and general trends are identified. The "cradle-to-gate" concept to define the system boundaries is the most used approach in practice, instead of the "cradle-to-grave" approach. Normally, the relationship between inventory data and impact category indicators is linearly expressed by the characterization factors; then, synergic effects of the contaminants are neglected. Among the LCIA methods, the eco-indicator 99, which is based on the endpoint category and the panel method, is the most used in practice. A single environmental impact function, resulting from the aggregation of environmental impacts, is formulated as the environmental objective in most analyzed cases. SimaPro is the most used software for LCA applications in literature analyzed. The multi-objective optimization is the most used approach for dealing with this kind of problems, where the ε-constraint method for generating the Pareto set is the most applied technique. However, a renewed interest in formulating a single economic objective function in optimization frameworks can be observed, favored by the development of life cycle cost software and progress made in assessing costs of environmental externalities. Finally, a trend to deal with multi-period scenarios into integrated LCA-optimization frameworks can be distinguished providing more accurate results upon data availability. PMID:22208397

  11. Resist process optimization for further defect reduction

    NASA Astrophysics Data System (ADS)

    Tanaka, Keiichi; Iseki, Tomohiro; Marumoto, Hiroshi; Takayanagi, Koji; Yoshida, Yuichi; Uemura, Ryouichi; Yoshihara, Kosuke

    2012-03-01

    Defect reduction has become one of the most important technical challenges in device mass production. Knowing that resist processing on a clean track strongly impacts defect formation in many cases, we have been trying to improve the track process to enhance customer yield. For example, residual-type defects and pattern collapse are strongly related to process parameters in the developer, and we have reported new develop and rinse methods in previous papers. We have also reported an optimization method for filtration conditions to reduce bridge-type defects, which are mainly caused by foreign substances such as gels in the resist. Even though we have contributed to resist-related defect reduction in past studies, defect reduction requirements continue to be very important. In this paper, we introduce further process improvements for resist defect reduction, including the latest experimental data.

  12. Optimal Vitamin D Supplementation Levels for Cardiovascular Disease Protection

    PubMed Central

    Lugg, Sebastian T.; Howells, Phillip A.; Thickett, David R.

    2015-01-01

    First described in relation to musculoskeletal disease, there is accumulating data to suggest that vitamin D may play an important role in cardiovascular disease (CVD). In this review we aim to provide an overview of the role of vitamin D status as both a marker of and potentially causative agent of hypertension, coronary artery disease, heart failure, atrial fibrillation, stroke, and peripheral vascular disease. The role of vitamin D levels as a disease marker for all-cause mortality is also discussed. We review the current knowledge gathered from experimental studies, observational studies, randomised controlled trials, and subsequent systematic reviews in order to suggest the optimal vitamin D level for CVD protection. PMID:26435569

  13. ABC proteins protect the human body and maintain optimal health.

    PubMed

    Ueda, Kazumitsu

    2011-01-01

    Human MDR1, a multi-drug transporter gene, was isolated as the first of the eukaryote ATP Binding Cassette (ABC) proteins from a multidrug-resistant carcinoma cell line in 1986. In the more than 25 years since, many ABC proteins have been found to play important physiological roles by transporting hydrophobic compounds. Defects in their functions cause various diseases, indicating that endogenous hydrophobic compounds, as well as water-soluble compounds, are properly transported by transmembrane proteins. MDR1 transports a large number of structurally unrelated drugs and is involved in their pharmacokinetics, and thus is a key factor in drug interactions. ABCA1, an ABC protein, eliminates excess cholesterol in peripheral cells by generating HDL. Because ABCA1 is a key molecule in cholesterol homeostasis, its function and expression are highly regulated. Eukaryote ABC proteins function on the body surface facing the outside and in organ pathways to adapt to the extracellular environment and protect the body to maintain optimal health.

  14. Ceramic processing: Experimental design and optimization

    NASA Technical Reports Server (NTRS)

    Weiser, Martin W.; Lauben, David N.; Madrid, Philip

    1992-01-01

    The objectives of this paper are to: (1) gain insight into the processing of ceramics and how green processing can affect the properties of ceramics; (2) investigate the technique of slip casting; (3) learn how heat treatment and temperature contribute to density, strength, and effects of under and over firing to ceramic properties; (4) experience some of the problems inherent in testing brittle materials and learn about the statistical nature of the strength of ceramics; (5) investigate orthogonal arrays as tools to examine the effect of many experimental parameters using a minimum number of experiments; (6) recognize appropriate uses for clay based ceramics; and (7) measure several different properties important to ceramic use and optimize them for a given application.

  15. Mathematical Analysis and Optimization of Infiltration Processes

    NASA Technical Reports Server (NTRS)

    Chang, H.-C.; Gottlieb, D.; Marion, M.; Sheldon, B. W.

    1997-01-01

    A variety of infiltration techniques can be used to fabricate solid materials, particularly composites. In general these processes can be described with at least one time dependent partial differential equation describing the evolution of the solid phase, coupled to one or more partial differential equations describing mass transport through a porous structure. This paper presents a detailed mathematical analysis of a relatively simple set of equations which is used to describe chemical vapor infiltration. The results demonstrate that the process is controlled by only two parameters, alpha and beta. The optimization problem associated with minimizing the infiltration time is also considered. Allowing alpha and beta to vary with time leads to significant reductions in the infiltration time, compared with the conventional case where alpha and beta are treated as constants.

  16. Process-centered revenue-cycle management optimizes payment process.

    PubMed

    Schneider, R J; Mandelbaum, S P; Graboys, K; Bailey, C

    2001-01-01

    By implementing a process-centered revenue cycle, healthcare organizations and group practices can achieve a seamless payment process with clear lines of accountability to achieve target outcomes. The integrated, end-to-end, revenue-cycle process involves four key components: jobs, skills, staffing, and structure; information and information systems; organizational alignment and accountability; and performance measures and evaluation measures. The Henry Ford Health System (HFHS), based in Detroit, Michigan, exemplifies the type of results that are achievable with this model. HFHS includes a group practice with more than 1,000 physicians in 40 specialties. After implementing a process-centered revenue cycle, HFHS dramatically improved registration and verification transactions and optimized revenues.

  17. Optimism and the brain: trait optimism mediates the protective role of the orbitofrontal cortex gray matter volume against anxiety.

    PubMed

    Dolcos, Sanda; Hu, Yifan; Iordan, Alexandru D; Moore, Matthew; Dolcos, Florin

    2016-02-01

    Converging evidence identifies trait optimism and the orbitofrontal cortex (OFC) as personality and brain factors influencing anxiety, but the nature of their relationships remains unclear. Here, the mechanisms underlying the protective role of trait optimism and of increased OFC volume against symptoms of anxiety were investigated in 61 healthy subjects, who completed measures of trait optimism and anxiety, and underwent structural scanning using magnetic resonance imaging. First, the OFC gray matter volume (GMV) was associated with increased optimism, which in turn was associated with reduced anxiety. Second, trait optimism mediated the relation between the left OFC volume and anxiety, thus demonstrating that increased GMV in this brain region protects against symptoms of anxiety through increased optimism. These results provide novel evidence about the brain-personality mechanisms protecting against anxiety symptoms in healthy functioning, and identify potential targets for preventive and therapeutic interventions aimed at reducing susceptibility and increasing resilience against emotional disturbances.
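
    The abstract reports a mediation analysis (OFC gray matter volume → trait optimism → anxiety) without giving the estimation details. The sketch below shows a generic regression-based mediation decomposition on synthetic data for the same variable roles, estimating the indirect effect a·b with a Sobel test; the data and coefficients are invented, not the study's.

```python
# Generic regression-based mediation sketch: X (OFC volume) -> M (optimism) -> Y (anxiety).
import numpy as np

rng = np.random.default_rng(5)
n = 61                                           # sample size matching the study
x = rng.normal(0, 1, n)                          # standardized OFC gray matter volume (synthetic)
m = 0.5 * x + rng.normal(0, 1, n)                # trait optimism (synthetic)
y = -0.6 * m + rng.normal(0, 1, n)               # anxiety (synthetic)

def ols(X_cols, y):
    # Ordinary least squares with standard errors of the coefficients.
    X = np.column_stack([np.ones(len(y))] + X_cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(np.diag(np.linalg.inv(X.T @ X)) * sigma2)
    return beta, se

a_coefs, a_se = ols([x], m)       # path a: X -> M
b_coefs, b_se = ols([m, x], y)    # path b: M -> Y controlling for X (direct effect c' = b_coefs[2])
a, b = a_coefs[1], b_coefs[1]
sobel_z = (a * b) / np.sqrt(b**2 * a_se[1]**2 + a**2 * b_se[1]**2)
print(f"indirect (mediated) effect a*b = {a*b:.3f}, Sobel z = {sobel_z:.2f}")
```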

  18. Optimal signal processing for continuous qubit readout

    NASA Astrophysics Data System (ADS)

    Ng, Shilin; Tsang, Mankei

    2014-08-01

    The measurement of a quantum two-level system, or a qubit in modern terminology, often involves an electromagnetic field that interacts with the qubit, before the field is measured continuously and the qubit state is inferred from the noisy field measurement. During the measurement, the qubit may undergo spontaneous transitions, further obscuring the initial qubit state from the observer. Taking advantage of some well-known techniques in stochastic detection theory, here we propose a signal processing protocol that can infer the initial qubit state optimally from the measurement in the presence of noise and qubit dynamics. Assuming continuous quantum-nondemolition measurements with Gaussian or Poissonian noise and a classical Markov model for the qubit, we derive analytic solutions to the protocol in some special cases of interest using Itō calculus. Our method is applicable to multihypothesis testing for robust qubit readout and relevant to experiments on qubits in superconducting microwave circuits, trapped ions, nitrogen-vacancy centers in diamond, semiconductor quantum dots, or phosphorus donors in silicon.
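
    A discrete-time version of the Gaussian-noise, Markov-qubit readout problem can be written as a hidden-Markov forward filter: propagate a likelihood for each hypothesized initial state along the measurement record and compare them. The sketch below does exactly that with invented transition rates, signal levels, and noise; it illustrates the inference pattern, not the paper's continuous-time Itō-calculus solution.

```python
# Discrete-time hidden-Markov readout: infer the initial qubit state from a noisy record.
import numpy as np

rng = np.random.default_rng(6)
dt, n_steps = 1e-3, 400
gamma_down, gamma_up = 2.0, 0.1                 # invented transition rates (1/s)
levels, sigma = np.array([-1.0, 1.0]), 2.0      # measured signal level per state, noise std

# Transition matrix for the classical two-state Markov model over one time step.
P = np.array([[1 - gamma_up * dt, gamma_up * dt],
              [gamma_down * dt,   1 - gamma_down * dt]])

def simulate(initial_state):
    s, record = initial_state, []
    for _ in range(n_steps):
        record.append(levels[s] + sigma * rng.normal())
        s = rng.choice(2, p=P[s])
    return np.array(record)

def log_likelihood(record, initial_state):
    # Forward algorithm: start concentrated on the hypothesized initial state.
    alpha = np.eye(2)[initial_state]
    logL = 0.0
    for r in record:
        # Gaussian emission likelihoods (constant prefactor dropped; it cancels in the ratio).
        like = np.exp(-0.5 * ((r - levels) / sigma) ** 2)
        alpha = alpha * like
        norm = alpha.sum()
        logL += np.log(norm)
        alpha = P.T @ (alpha / norm)              # propagate one step
    return logL

record = simulate(initial_state=1)
llr = log_likelihood(record, 1) - log_likelihood(record, 0)
print("log-likelihood ratio (initial state 1 vs 0):", round(llr, 2))
```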

  19. Adaptive, predictive controller for optimal process control

    SciTech Connect

    Brown, S.K.; Baum, C.C.; Bowling, P.S.; Buescher, K.L.; Hanagandi, V.M.; Hinde, R.F. Jr.; Jones, R.D.; Parkinson, W.J.

    1995-12-01

    One can derive a model for use in a Model Predictive Controller (MPC) from first principles or from experimental data. Until recently, both methods failed for all but the simplest processes. First principles are almost always incomplete and fitting to experimental data fails for dimensions greater than one as well as for non-linear cases. Several authors have suggested the use of a neural network to fit the experimental data to a multi-dimensional and/or non-linear model. Most networks, however, use simple sigmoid functions and backpropagation for fitting. Training of these networks generally requires large amounts of data and, consequently, very long training times. In 1993 we reported on the tuning and optimization of a negative ion source using a special neural network [2]. One of the properties of this network (CNLSnet), a modified radial basis function network, is that it is able to fit data with few basis functions. Another is that its training is linear, resulting in guaranteed convergence and rapid training. We found the training to be rapid enough to support real-time control. This work has been extended to incorporate this network into an MPC using the model built by the network for predictive control. This controller has shown some remarkable capabilities in such non-linear applications as continuous stirred exothermic tank reactors and high-purity fractional distillation columns [3]. The controller is able not only to build an appropriate model from operating data but also to thin the network continuously so that the model adapts to changing plant conditions. The controller is discussed as well as its possible use in various difficult control problems that face this community.
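
    CNLSnet and the MPC formulation are described in the cited reports, not here. The sketch below illustrates the two ingredients the abstract emphasizes: a radial-basis-function process model whose output weights are trained by a purely linear least-squares fit, then used to choose the next control input by minimizing a one-step-ahead tracking cost. The plant, basis centers, and cost are invented.

```python
# RBF process model trained by linear least squares, then used for one-step predictive control.
import numpy as np

rng = np.random.default_rng(7)

def plant(y, u):                                  # invented nonlinear plant: next output
    return 0.7 * y + 0.4 * np.tanh(u) + 0.05 * rng.normal()

# Collect identification data by exciting the plant with random inputs.
ys, us = [0.0], []
for _ in range(500):
    u = rng.uniform(-2, 2)
    us.append(u)
    ys.append(plant(ys[-1], u))
X = np.column_stack([ys[:-1], us])                # model inputs: (y_k, u_k) -> y_{k+1}
t = np.array(ys[1:])

# RBF features on fixed centers; training the output weights is a linear problem.
centers = np.column_stack([rng.uniform(-2, 2, 40), rng.uniform(-2, 2, 40)])
def features(Z):
    d2 = ((Z[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / 0.5)
w, *_ = np.linalg.lstsq(features(X), t, rcond=None)

def predict(y, u):                                # one-step-ahead model prediction
    return features(np.array([[y, u]])) @ w

# One-step-ahead predictive control: choose u minimizing (prediction - setpoint)^2.
setpoint, y = 0.8, 0.0
for k in range(20):
    candidates = np.linspace(-2, 2, 81)
    preds = np.array([predict(y, u)[0] for u in candidates])
    u = candidates[np.argmin((preds - setpoint) ** 2)]
    y = plant(y, u)
print("output after 20 steps:", round(y, 3), "setpoint:", setpoint)
```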

  20. Utility Theory for Evaluation of Optimal Process Condition of SAW: A Multi-Response Optimization Approach

    SciTech Connect

    Datta, Saurav; Biswas, Ajay; Bhaumik, Swapan; Majumdar, Gautam

    2011-01-17

    A multi-objective optimization problem has been solved in order to estimate an optimal process environment, i.e., the optimal parametric combination achieving the desired quality indicators (related to bead geometry) of a submerged arc weld of mild steel. The quality indicators selected in the study were bead height, penetration depth, bead width, and percentage dilution. The Taguchi method followed by the utility concept has been adopted to evaluate the optimal process condition that achieves the multiple quality requirements of the desired weld.
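
    The orthogonal array, factor levels, and measured bead-geometry data are not reproduced in the abstract. The sketch below shows the generic utility-concept step on invented data: map each response onto a common 0–9 preference scale, average the per-response utilities into a single utility index per run of an L9-style array, and pick the best level of each factor from the run averages.

```python
# Utility-concept aggregation of multiple weld responses over an L9-style orthogonal array.
import numpy as np

# Standard L9 array (first three columns), 0-indexed levels; factor names are hypothetical.
L9 = np.array([[0,0,0],[0,1,1],[0,2,2],[1,0,1],[1,1,2],[1,2,0],[2,0,2],[2,1,0],[2,2,1]])
rng = np.random.default_rng(8)
# Invented measured responses per run: penetration (higher better), bead width (lower better).
penetration = rng.uniform(3.0, 7.0, 9)
bead_width  = rng.uniform(8.0, 14.0, 9)

def utility(x, lo, hi, higher_is_better=True):
    # Map each response onto a common 0-9 preference scale.
    u = 9.0 * (x - lo) / (hi - lo)
    return u if higher_is_better else 9.0 - u

# Equal-weight overall utility index per experimental run.
U = 0.5 * utility(penetration, 3.0, 7.0, True) + 0.5 * utility(bead_width, 8.0, 14.0, False)

# Average utility at each level of each factor; pick the level with the highest mean.
for f, name in enumerate(["current", "voltage", "travel_speed"]):
    level_means = [U[L9[:, f] == lvl].mean() for lvl in range(3)]
    print(f"{name}: best level = {int(np.argmax(level_means))}, level means = "
          + ", ".join(f"{m:.2f}" for m in level_means))
```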

  1. Optimal conservation outcomes require both restoration and protection.

    PubMed

    Possingham, Hugh P; Bode, Michael; Klein, Carissa J

    2015-01-01

    Conservation outcomes are principally achieved through the protection of intact habitat or the restoration of degraded habitat. Restoration is generally considered a lower priority action than protection because protection is thought to provide superior outcomes, at lower costs, without the time delay required for restoration. Yet while it is broadly accepted that protected intact habitat safeguards more biodiversity and generates greater ecosystem services per unit area than restored habitat, conservation lacks a theory that can coherently compare the relative outcomes of the two actions. We use a dynamic landscape model to integrate these two actions into a unified conservation theory of protection and restoration. Using nonlinear benefit functions, we show that both actions are crucial components of a conservation strategy that seeks to optimise either biodiversity conservation or ecosystem services provision. In contrast to conservation orthodoxy, in some circumstances, restoration should be strongly preferred to protection. The relative priority of protection and restoration depends on their costs and also on the different time lags that are inherent to both protection and restoration. We derive a simple and easy-to-interpret heuristic that integrates these factors into a single equation that applies equally to biodiversity conservation and ecosystem service objectives. We use two examples to illustrate the theory: bird conservation in tropical rainforests and coastal defence provided by mangrove forests. PMID:25625277

  2. Optimal Conservation Outcomes Require Both Restoration and Protection

    PubMed Central

    Possingham, Hugh P.; Bode, Michael; Klein, Carissa J.

    2015-01-01

    Conservation outcomes are principally achieved through the protection of intact habitat or the restoration of degraded habitat. Restoration is generally considered a lower priority action than protection because protection is thought to provide superior outcomes, at lower costs, without the time delay required for restoration. Yet while it is broadly accepted that protected intact habitat safeguards more biodiversity and generates greater ecosystem services per unit area than restored habitat, conservation lacks a theory that can coherently compare the relative outcomes of the two actions. We use a dynamic landscape model to integrate these two actions into a unified conservation theory of protection and restoration. Using nonlinear benefit functions, we show that both actions are crucial components of a conservation strategy that seeks to optimise either biodiversity conservation or ecosystem services provision. In contrast to conservation orthodoxy, in some circumstances, restoration should be strongly preferred to protection. The relative priority of protection and restoration depends on their costs and also on the different time lags that are inherent to both protection and restoration. We derive a simple and easy-to-interpret heuristic that integrates these factors into a single equation that applies equally to biodiversity conservation and ecosystem service objectives. We use two examples to illustrate the theory: bird conservation in tropical rainforests and coastal defence provided by mangrove forests. PMID:25625277

  3. Application of simulation models for the optimization of business processes

    NASA Astrophysics Data System (ADS)

    Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří

    2016-06-01

    The paper deals with applications of modeling and simulation tools to the optimization of business processes, in particular the optimization of signal flow in a security company. Simul8 was selected as the modeling tool; it supports process modeling based on discrete-event simulation and enables the creation of a visual model of production and distribution processes.

  4. Multiobjective process optimization of a power unit

    SciTech Connect

    Garduno-Ramirez, R.; Lee, K.Y.

    1999-11-01

    Recent years have witnessed an increased participation of fossil fuel power units (FFPU) in wide-range load-following duties in order to match current power demand patterns and to deal with uncertain economic contexts. This mode of operation imposes high physical stress on the main components and leads to conflicting operational and control situations, since most power units were designed to operate most efficiently at constant rated conditions. The need for extended periods without maintenance and replacement, compliance with stringent emission regulations, and efficient operation calls for the development of effective plant-wide optimization and control methods and systems. Supervisory control, as an interface between the feedback control loops and the economic dispatch and unit commitment systems at upper control layers in power systems, could certainly play a key role in this regard. This paper presents a systematic procedure to generate optimal set-points for the feedback control loops in an FFPU from a given unit load demand profile. The method is flexible enough to accommodate any number of set-points. Also, the optimization procedure is formulated as a multiobjective optimization problem for which the form and number of the objective functions, as well as their preferences, may be modified as required. This approach facilitates adaptation to different operating policies and the realization of performance trade-off analyses.

  5. Intelligent Processing Equipment Within the Environmental Protection Agency

    NASA Technical Reports Server (NTRS)

    Greathouse, Daniel G.; Nalesnik, Richard P.

    1992-01-01

    Protection of the environment and environmental remediation requires the cooperation, at all levels, of government and industry. Intelligent processing equipment, in addition to other artificial intelligence based tools, was used by the Environmental Protection Agency to provide personnel safety and improve the efficiency of those responsible for protection and remediation of the environment. These exploratory efforts demonstrate the feasibility and utility of expanding development and widespread use of these tools. A survey of current intelligent processing equipment applications in the Agency is presented and is followed by a brief discussion of possible uses in the future.

  6. Optimization of health protection of the public following a major nuclear accident: Interaction between radiation protection and social and psychological factors

    SciTech Connect

    Allen, P.T.; Archangelskaya, G.V.; Ramsaev, P.V.

    1996-11-01

    National and international guidance on the optimization of countermeasures to reduce doses in the post-release phase of an accident rightly emphasizes the importance and relevance of psychological, social, and economic factors to this process (e.g., NRPB 1990; ICRP 1991; CEC 1993; IAEA 1994). However, whilst economic factors are, at least partially, taken into account in developing the advice, explicit guidance is not provided on how psychological and social factors should be included in the optimization. Instead it is suggested that this is a matter for those with the appropriate competence and those with responsibility for making the final decisions. This approach implicitly assumes that the optimization of radiation protection and economic factors can be carried out separately from the consideration of psychological and social factors, and that the results of the two procedures can then be combined to arrive at an optimum course of action. We recognize that formal optimization forms only one input to the process of making decisions on countermeasures and that it is important that psychological and social factors, as well as any other factors, are not "double-counted," i.e., accounted for within international advice and then again at the time of the decision. It is our view that the optimization of radiation protection and economic factors, and of certain psychological and social factors, should not be carried out independently. Research conducted by our respective organizations indicates a number of areas in which the optimization of radiation protection and economic factors requires an understanding of key psychological and social processes. These areas fall into three groups: the need to ensure that countermeasures are successfully implemented, the need to achieve a net benefit for overall health, and the need to ensure a smooth transition back to normal living. 10 refs.

  7. Optimizing signal and image processing applications using Intel libraries

    NASA Astrophysics Data System (ADS)

    Landré, Jérôme; Truchetet, Frédéric

    2007-01-01

    This paper presents optimized signal and image processing libraries from Intel Corporation. Intel Performance Primitives (IPP) is a low-level signal and image processing library developed by Intel Corporation to optimize code on Intel processors. Open Computer Vision library (OpenCV) is a high-level library dedicated to computer vision tasks. This article describes the use of both libraries to build flexible and efficient signal and image processing applications.
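
    As a minimal illustration of the high-level library mentioned above, the following sketch runs a basic filtering pipeline with OpenCV's Python bindings; the file names are placeholders and the pipeline itself is not taken from the article.

```python
# Illustrative only: OpenCV used from Python for a simple filtering pipeline.
# The input/output file names are placeholders.
import cv2

image = cv2.imread("input.png", cv2.IMREAD_GRAYSCALE)  # placeholder path
if image is None:
    raise FileNotFoundError("input.png not found")

blurred = cv2.GaussianBlur(image, (5, 5), 1.5)   # suppress noise before edge detection
edges = cv2.Canny(blurred, 50, 150)              # binary edge map (low/high thresholds)
cv2.imwrite("edges.png", edges)
```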

  8. Optimizing carbon storage and biodiversity protection in tropical agricultural landscapes.

    PubMed

    Gilroy, James J; Woodcock, Paul; Edwards, Felicity A; Wheeler, Charlotte; Medina Uribe, Claudia A; Haugaasen, Torbjørn; Edwards, David P

    2014-07-01

    With the rapidly expanding ecological footprint of agriculture, the design of farmed landscapes will play an increasingly important role for both carbon storage and biodiversity protection. Carbon and biodiversity can be enhanced by integrating natural habitats into agricultural lands, but a key question is whether benefits are maximized by including many small features throughout the landscape ('land-sharing' agriculture) or a few large contiguous blocks alongside intensive farmland ('land-sparing' agriculture). In this study, we are the first to integrate carbon storage alongside multi-taxa biodiversity assessments to compare land-sparing and land-sharing frameworks. We do so by sampling carbon stocks and biodiversity (birds and dung beetles) in landscapes containing agriculture and forest within the Colombian Chocó-Andes, a zone of high global conservation priority. We show that woodland fragments embedded within a matrix of cattle pasture hold less carbon per unit area than contiguous primary or advanced secondary forests (>15 years). Farmland sites also support less diverse bird and dung beetle communities than contiguous forests, even when farmland retains high levels of woodland habitat cover. Landscape simulations based on these data suggest that land-sparing strategies would be more beneficial for both carbon storage and biodiversity than land-sharing strategies across a range of production levels. Biodiversity benefits of land-sparing are predicted to be similar whether spared lands protect primary or advanced secondary forests, owing to the close similarity of bird and dung beetle communities between the two forest classes. Land-sparing schemes that encourage the protection and regeneration of natural forest blocks thus provide a synergy between carbon and biodiversity conservation, and represent a promising strategy for reducing the negative impacts of agriculture on tropical ecosystems. However, further studies examining a wider range of ecosystem

  9. Intelligent Signal Processing for Detection System Optimization

    SciTech Connect

    Fu, C Y; Petrich, L I; Daley, P F; Burnham, A K

    2004-06-18

    A wavelet-neural network signal processing method has demonstrated approximately tenfold improvement in the detection limit of various nitrogen and phosphorus compounds over traditional signal-processing methods in analyzing the output of a thermionic detector attached to the output of a gas chromatograph. A blind test was conducted to validate the lower detection limit. All fourteen of the compound spikes were detected when above the estimated threshold, including all three within a factor of two above. In addition, two of six were detected at levels 1/2 the concentration of the nominal threshold. We would have had another two correct hits if we had allowed human intervention to examine the processed data. One apparent false positive in five nulls was traced to a solvent impurity, whose presence was identified by running a solvent aliquot evaporated to 1% residual volume, while the other four nulls were properly classified. We view this signal processing method as broadly applicable in analytical chemistry, and we advocate that advanced signal processing methods be applied as directly as possible to the raw detector output so that less discriminating preprocessing and post-processing does not throw away valuable signal.

  10. Intelligent Signal Processing for Detection System Optimization

    SciTech Connect

    Fu, C Y; Petrich, L I; Daley, P F; Burnham, A K

    2004-12-05

    A wavelet-neural network signal processing method has demonstrated approximately tenfold improvement over traditional signal-processing methods for the detection limit of various nitrogen and phosphorus compounds from the output of a thermionic detector attached to a gas chromatograph. A blind test was conducted to validate the lower detection limit. All fourteen of the compound spikes were detected when above the estimated threshold, including all three within a factor of two above the threshold. In addition, two of six spikes were detected at levels of 1/2 the concentration of the nominal threshold. Another two of the six would have been detected correctly if we had allowed human intervention to examine the processed data. One apparent false positive in five nulls was traced to a solvent impurity, whose presence was subsequently identified by analyzing a solvent aliquot evaporated to 1% residual volume, while the other four nulls were properly classified. We view this signal processing method as broadly applicable in analytical chemistry, and we advocate that advanced signal processing methods should be applied as directly as possible to the raw detector output so that less discriminating preprocessing and post-processing does not throw away valuable signal.

  11. A design optimization process for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Chamberlain, Robert G.; Fox, George; Duquette, William H.

    1990-01-01

    The Space Station Freedom Program is used to develop and implement a process for design optimization. Because the relative worth of arbitrary design concepts cannot be assessed directly, comparisons must be based on designs that provide the same performance from the point of view of station users; such designs can be compared in terms of life cycle cost. Since the technology required to produce a space station is widely dispersed, a decentralized optimization process is essential. A formulation of the optimization process is provided and the mathematical models designed to facilitate its implementation are described.

  12. Energy optimization aspects by injection process technology

    NASA Astrophysics Data System (ADS)

    Tulbure, A.; Ciortea, M.; Hutanu, C.; Farcas, V.

    2016-08-01

    In the proposed paper, the authors examine the energy aspects related to the injection moulding process technology in the automotive industry. Theoretical considerations have been validated by experimental measurements on the manufacturing process, for two types of injections moulding machines, hydraulic and electric. Practical measurements have been taken with professional equipment separately on each technological operation: lamination, compression, injection and expansion. For results traceability, the following parameters were, whenever possible, maintained: cycle time, product weight and the relative time. The aim of the investigations was to carry out a professional energy audit with accurate losses identification. Base on technological diagram for each production cycle, at the end of this contribution, some measure to reduce the energy consumption were proposed.

  13. Optimal estimation for discrete time jump processes

    NASA Technical Reports Server (NTRS)

    Vaca, M. V.; Tretter, S. A.

    1977-01-01

    Optimum estimates of nonobservable random variables or random processes which influence the rate functions of a discrete time jump process (DTJP) are obtained. The approach is based on the a posteriori probability of a nonobservable event expressed in terms of the a priori probability of that event and of the sample function probability of the DTJP. A general representation for optimum estimates and recursive equations for minimum mean squared error (MMSE) estimates are obtained. MMSE estimates are nonlinear functions of the observations. The problem of estimating the rate of a DTJP is considered for the case in which the rate is a random variable with a probability density function of the form c x^k (1-x)^m, and the MMSE estimates are shown to be linear in this case. This class of density functions explains why there are insignificant differences between optimum unconstrained and linear MMSE estimates in a variety of problems.
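
    A small numerical sketch of the linearity result, under the assumption (made here for illustration, not taken verbatim from the paper) that the density c x^k (1-x)^m acts as a Beta-type prior on a per-step jump probability observed through a count of jumps:

```python
# Hedged numerical check: for a jump probability p with prior density
# proportional to p**k * (1-p)**m (a Beta prior), the MMSE estimate given
# s jumps in n steps is the posterior mean, which is linear in s.
k, m, n = 2, 3, 10

def mmse_estimate(s):
    # Posterior is Beta(k + 1 + s, m + 1 + n - s); its mean is linear in s.
    return (k + 1 + s) / (k + m + 2 + n)

for s in range(0, n + 1, 2):
    print(s, round(mmse_estimate(s), 4))
```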

  14. Process sequence optimization for digital microfluidic integration using EWOD technique

    NASA Astrophysics Data System (ADS)

    Yadav, Supriya; Joyce, Robin; Sharma, Akash Kumar; Sharma, Himani; Sharma, Niti Nipun; Varghese, Soney; Akhtar, Jamil

    2016-04-01

    Micro/nano-fluidic MEMS biosensors are devices that detect biomolecules. The emerging micro/nano-fluidic devices provide high throughput and high repeatability with very low response time and reduced device cost as compared to traditional devices. This article presents the experimental details for process sequence optimization of digital microfluidics (DMF) using "electrowetting-on-dielectric" (EWOD). Stress-free thick film deposition of silicon dioxide using PECVD and the subsequent process steps for the EWOD technique have been optimized in this work.

  15. [Optimization of the pertussis vaccine production process].

    PubMed

    Germán Santiago, J; Zamora, N; de la Rosa, E; Alba Carrión, C; Padrón, P; Hernández, M; Betancourt, M; Moretti, N

    1995-01-01

    The production of Pertussis Vaccine was reevaluated at the Instituto Nacional de Higiene "Rafael Rangel" in order to optimize it in terms of vaccine yield, potency, specific toxicity and efficiency (cost per dose). Four different processes, using two culture media (Cohen-Wheeler and Fermentación Glutamato Prolina-1) and two types of bioreactors (25 L Fermentador Caracas and a 450 L industrial fermentor), were compared. Runs were started from freeze-dried strains (134 or 509) and continued until the maximal yield was obtained. It was found that the combination Fermentación Glutamato Prolina-1/industrial fermentor shortened the process to 40 hours while consistently yielding a vaccine of higher potency (7.91 +/- 2.56 IU/human dose) and lower specific toxicity in a mouse bioassay. In addition, the physical aspect of the preparation was rather homogeneous and free of dark aggregates. Most importantly, the biomass yield more than doubled those of the Fermentador Caracas using the two different media and that of the industrial fermentor with the Cohen-Wheeler medium. Therefore, the cost per dose was substantially decreased. PMID:9279028

  16. Optimization in the systems engineering process

    NASA Astrophysics Data System (ADS)

    Lemmerman, Loren A.

    The essential elements of the design process consist of the mission definition phase that provides the system requirements, the conceptual design, the preliminary design and finally the detailed design. Mission definition is performed largely by operations analysts in conjunction with the customer. The result of their study is handed off to the systems engineers for documentation as the systems requirements. The document that provides these requirements is the basis for the further design work of the design engineers at the Lockheed-Georgia Company. The design phase actually begins with conceptual design, which is generally conducted by a small group of engineers using multidisciplinary design programs. Because of the complexity of the design problem, the analyses are relatively simple and generally dependent on parametric analyses of the configuration. The result of this phase is a baseline configuration from which preliminary design may be initiated.

  17. A split-optimization approach for obtaining multiple solutions in single-objective process parameter optimization.

    PubMed

    Rajora, Manik; Zou, Pan; Yang, Yao Guang; Fan, Zhi Wen; Chen, Hung Yi; Wu, Wen Chieh; Li, Beizhi; Liang, Steven Y

    2016-01-01

    It can be observed from the experimental data of different processes that different process parameter combinations can lead to the same performance indicators, but during the optimization of process parameters, using current techniques, only one of these combinations can be found when a given objective function is specified. The combination of process parameters obtained after optimization may not always be applicable in actual production or may lead to undesired experimental conditions. In this paper, a split-optimization approach is proposed for obtaining multiple solutions in a single-objective process parameter optimization problem. This is accomplished by splitting the original search space into smaller sub-search spaces and using GA in each sub-search space to optimize the process parameters. Two different methods, i.e., cluster centers and hill and valley splitting strategy, were used to split the original search space, and their efficiency was measured against a method in which the original search space is split into equal smaller sub-search spaces. The proposed approach was used to obtain multiple optimal process parameter combinations for electrochemical micro-machining. The result obtained from the case study showed that the cluster centers and hill and valley splitting strategies were more efficient in splitting the original search space than the method in which the original search space is divided into smaller equal sub-search spaces.
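
    The splitting idea can be sketched with a toy single-objective problem in which two parameter values give identical performance; here SciPy's differential evolution stands in for the GA used by the authors, and the objective and bounds are invented.

```python
# Simplified sketch of the split-optimization idea (toy objective, not the
# electrochemical micro-machining model): splitting the search space lets a
# single-objective optimizer recover multiple parameter sets with equal merit.
import numpy as np
from scipy.optimize import differential_evolution

def objective(x):
    # Two parameter values (x = -2 and x = +2) give the same performance.
    return (x[0] ** 2 - 4.0) ** 2

sub_spaces = [[(-5.0, 0.0)], [(0.0, 5.0)]]   # equal split of the original space
solutions = []
for bounds in sub_spaces:
    result = differential_evolution(objective, bounds, seed=0)
    solutions.append(result.x[0])

print("optimal parameter values found:", [round(s, 3) for s in solutions])
```

    Running the optimizer over the full interval [-5, 5] would return only one of the two solutions; the split recovers both, which is the behavior the abstract describes.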

  18. A split-optimization approach for obtaining multiple solutions in single-objective process parameter optimization.

    PubMed

    Rajora, Manik; Zou, Pan; Yang, Yao Guang; Fan, Zhi Wen; Chen, Hung Yi; Wu, Wen Chieh; Li, Beizhi; Liang, Steven Y

    2016-01-01

    It can be observed from the experimental data of different processes that different process parameter combinations can lead to the same performance indicators, but during the optimization of process parameters, using current techniques, only one of these combinations can be found when a given objective function is specified. The combination of process parameters obtained after optimization may not always be applicable in actual production or may lead to undesired experimental conditions. In this paper, a split-optimization approach is proposed for obtaining multiple solutions in a single-objective process parameter optimization problem. This is accomplished by splitting the original search space into smaller sub-search spaces and using GA in each sub-search space to optimize the process parameters. Two different methods, i.e., cluster centers and hill and valley splitting strategy, were used to split the original search space, and their efficiency was measured against a method in which the original search space is split into equal smaller sub-search spaces. The proposed approach was used to obtain multiple optimal process parameter combinations for electrochemical micro-machining. The result obtained from the case study showed that the cluster centers and hill and valley splitting strategies were more efficient in splitting the original search space than the method in which the original search space is divided into smaller equal sub-search spaces. PMID:27625978

  19. When teams shift among processes: insights from simulation and optimization.

    PubMed

    Kennedy, Deanna M; McComb, Sara A

    2014-09-01

    This article introduces process shifts to study the temporal interplay among transition and action processes espoused in the recurring phase model proposed by Marks, Mathieu, and Zaccaro (2001). Process shifts are those points in time when teams complete a focal process and change to another process. By using team communication patterns to measure process shifts, this research explores (a) when teams shift among different transition processes and initiate action processes and (b) the potential of different interventions, such as communication directives, to manipulate process shift timing and order and, ultimately, team performance. Virtual experiments are employed to compare data from observed laboratory teams not receiving interventions, simulated teams receiving interventions, and optimal simulated teams generated using genetic algorithm procedures. Our results offer insights about the potential for different interventions to affect team performance. Moreover, certain interventions may promote discussions about key issues (e.g., tactical strategies) and facilitate shifting among transition processes in a manner that emulates optimal simulated teams' communication patterns. Thus, we contribute to theory regarding team processes in 2 important ways. First, we present process shifts as a way to explore the timing of when teams shift from transition to action processes. Second, we use virtual experimentation to identify those interventions with the greatest potential to affect performance by changing when teams shift among processes. Additionally, we employ computational methods including neural networks, simulation, and optimization, thereby demonstrating their applicability in conducting team research.

  20. Image processing to optimize wave energy converters

    NASA Astrophysics Data System (ADS)

    Bailey, Kyle Marc-Anthony

    The world is turning to renewable energies as a means of ensuring the planet's future and well-being. There have been a few attempts in the past to utilize wave power as a means of generating electricity through the use of Wave Energy Converters (WEC), but only recently are they becoming a focal point in the renewable energy field. Over the past few years there has been a global drive to advance the efficiency of WEC. Wave power is produced by placing a device, either onshore or offshore, that captures the energy within ocean surface waves and uses it to drive a mechanical generator. This paper seeks to provide a novel and innovative way to estimate ocean wave frequency through the use of image processing. This will be achieved by applying a complex modulated lapped orthogonal transform filter bank to satellite images of ocean waves. The complex modulated lapped orthogonal transform filter bank provides an equal subband decomposition of the Nyquist-bounded discrete time Fourier transform spectrum. The maximum energy of the 2D complex modulated lapped transform subband is used to determine the horizontal and vertical frequency, which subsequently can be used to determine the wave frequency in the direction of the WEC by a simple trigonometric scaling. The robustness of the proposed method is demonstrated by application to simulated and real satellite images where the frequency is known.
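
    A heavily simplified stand-in for the procedure above, using a plain 2-D FFT instead of the complex modulated lapped orthogonal transform and a synthetic wave image rather than satellite data:

```python
# Simplified stand-in: estimate the dominant horizontal and vertical spatial
# frequencies of a synthetic wave image with a plain 2-D FFT (not the lapped
# transform of the paper), then combine them into a single wave frequency.
import numpy as np

n = 256
x, y = np.meshgrid(np.arange(n), np.arange(n))
fx_true, fy_true = 0.05, 0.02                      # cycles per pixel (toy values)
image = np.sin(2 * np.pi * (fx_true * x + fy_true * y))

spectrum = np.abs(np.fft.rfft2(image))
spectrum[0, 0] = 0.0                               # ignore the DC component
iy, ix = np.unravel_index(np.argmax(spectrum), spectrum.shape)
fy = min(iy, n - iy) / n                           # fold negative row frequencies
fx = ix / n

wave_freq = np.hypot(fx, fy)                       # magnitude of the 2-D frequency
print(f"estimated fx={fx:.3f}, fy={fy:.3f}, |f|={wave_freq:.3f}")
```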

  1. Design and Optimization of an Austenitic TRIP Steel for Blast and Fragment Protection

    NASA Astrophysics Data System (ADS)

    Feinberg, Zechariah Daniel

    In light of the pervasive nature of terrorist attacks, there is a pressing need for the design and optimization of next generation materials for blast and fragment protection applications. Sadhukhan used computational tools and a systems-based approach to design TRIP-120---a fully austenitic transformation-induced plasticity (TRIP) steel. Current work more completely evaluates the mechanical properties of the prototype, optimizes the processing for high performance in tension and shear, and builds models for more predictive power of the mechanical behavior and austenite stability. Under quasi-static and dynamic tension and shear, the design exhibits high strength and high uniform ductility as a result of a strain hardening effect that arises with martensitic transformation. Significantly more martensitic transformation occurred under quasi-static loading conditions (69% in tension and 52% in shear) compared to dynamic loading conditions (13% tension and 5% in shear). Nonetheless, significant transformation occurs at high-strain rates which increases strain hardening, delays the onset of necking instability, and increases total energy absorption under adiabatic conditions. Although TRIP-120 effectively utilizes a TRIP effect to delay necking instability, a common trend of abrupt failure with limited fracture ductility was observed in tension and shear at all strain rates. Further characterization of the structure of TRIP-120 showed that an undesired grain boundary cellular reaction (η phase formation) consumed the fine dispersion of the metastable gamma' phase and limited the fracture ductility. A warm working procedure was added to the processing of TRIP-120 in order to eliminate the grain boundary cellular reaction from the structure. By eliminating η formation at the grain boundaries, warm-worked TRIP-120 exhibits a drastic improvement in the mechanical properties in tension and shear. In quasi-static tension, the optimized warm-worked TRIP-120 with an Mssigma

  2. Information processing capacity while wearing personal protective eyewear.

    PubMed

    Wade, Chip; Davis, Jerry; Marzilli, Thomas S; Weimar, Wendi H

    2006-08-15

    It is difficult to overemphasize the function vision plays in information processing, specifically in maintaining postural control. Vision appears to be an immediate, effortless event, suggesting that eyes need only to be open to employ the visual information provided by the environment. This study is focused on investigating the effect of Occupational Safety and Health Administration regulated personal protective eyewear (29 CFR 1910.133) on physiological and cognitive factors associated with information processing capabilities. Twenty-one college students between the ages of 19 and 25 years were randomly tested in each of three eyewear conditions (control, new and artificially aged) on an inclined and horizontal support surface for auditory and visual stimulus reaction time. Data collection trials consisted of 50 randomly selected (25 auditory, 25 visual) stimuli over a 10-min surface-eyewear condition trial. Auditory stimulus reaction time was significantly affected by the surface by eyewear interaction (F(2,40) = 7.4; p < 0.05). Similarly, analysis revealed a significant surface by eyewear interaction in reaction time following the visual stimulus (F(2,40) = 21.7; p < 0.05). The current findings do not trivialize the importance of personal protective eyewear usage in an occupational setting; rather, they suggest the value of future research focused on the effect that personal protective eyewear has on the physiological, cognitive and biomechanical contributions to postural control. These findings suggest that while personal protective eyewear may serve to protect an individual from eye injury, an individual's use of such personal protective eyewear may have deleterious effects on sensory information associated with information processing and postural control.

  3. Economic optimization of natural hazard protection - conceptual study of existing approaches

    NASA Astrophysics Data System (ADS)

    Spackova, Olga; Straub, Daniel

    2013-04-01

    Risk-based planning of protection measures against natural hazards has become a common practice in many countries. The selection procedure aims at identifying an economically efficient strategy with regard to the estimated costs and risk (i.e. expected damage). A correct setting of the evaluation methodology and decision criteria should ensure an optimal selection of the portfolio of risk protection measures under a limited state budget. To demonstrate the efficiency of investments, indicators such as Benefit-Cost Ratio (BCR), Marginal Costs (MC) or Net Present Value (NPV) are commonly used. However, the methodologies for efficiency evaluation differ amongst different countries and different hazard types (floods, earthquakes etc.). Additionally, several inconsistencies can be found in the applications of the indicators in practice. This is likely to lead to a suboptimal selection of the protection strategies. This study provides a general formulation for optimization of the natural hazard protection measures from a socio-economic perspective. It assumes that all costs and risks can be expressed in monetary values. The study regards the problem as a discrete hierarchical optimization, where the state level sets the criteria and constraints, while the actual optimization is made on the regional level (towns, catchments) when designing particular protection measures and selecting the optimal protection level. The study shows that in case of an unlimited budget, the task is quite trivial, as it is sufficient to optimize the protection measures in individual regions independently (by minimizing the sum of risk and cost). However, if the budget is limited, the need for an optimal allocation of resources amongst the regions arises. To ensure this, minimum values of BCR or MC can be required by the state, which must be achieved in each region. The study investigates the meaning of these indicators in the optimization task at the conceptual level and compares their
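
    The decision structure described above can be caricatured with a tiny example (all costs and risks invented): each region picks the protection level minimizing cost plus residual risk among the options that satisfy a state-imposed minimum benefit-cost ratio.

```python
# Conceptual sketch of the hierarchical setting (all numbers are invented):
# each region chooses among discrete protection levels; with an unlimited
# budget each region minimizes cost + residual risk, while a minimum
# benefit-cost ratio (BCR) imposed by the state screens out options.
regions = {
    # protection level: (annualized cost, residual expected damage i.e. risk)
    "A": {"none": (0.0, 10.0), "dike_1m": (3.0, 4.0), "dike_2m": (6.0, 2.5)},
    "B": {"none": (0.0, 6.0), "dike_1m": (2.0, 3.5), "dike_2m": (5.0, 3.0)},
}

MIN_BCR = 1.0   # state-level criterion: risk reduction required per unit cost

for name, options in regions.items():
    baseline_risk = options["none"][1]
    admissible = {}
    for level, (cost, risk) in options.items():
        bcr = float("inf") if cost == 0 else (baseline_risk - risk) / cost
        if bcr >= MIN_BCR:
            admissible[level] = cost + risk      # total societal cost
    best = min(admissible, key=admissible.get)
    print(f"region {name}: choose {best} (cost + risk = {admissible[best]:.1f})")
```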

  4. Video enhancement method with color-protection post-processing

    NASA Astrophysics Data System (ADS)

    Kim, Youn Jin; Kwak, Youngshin

    2015-01-01

    This study proposes a post-processing method for video enhancement that adopts a color-protection technique. The color-protection is intended to attenuate perceptible artifacts due to over-enhancement in visually sensitive image regions such as low-chroma colors, including skin and gray objects. In addition, reducing the loss in color texture caused by out-of-color-gamut signals is also taken into account. Consequently, color reproducibility of video sequences can be remarkably enhanced while the undesirable visual exaggerations are minimized.

  5. Maximizing the efficiency of multienzyme process by stoichiometry optimization.

    PubMed

    Dvorak, Pavel; Kurumbang, Nagendra P; Bendl, Jaroslav; Brezovsky, Jan; Prokop, Zbynek; Damborsky, Jiri

    2014-09-01

    Multienzyme processes represent an important area of biocatalysis. Their efficiency can be enhanced by optimization of the stoichiometry of the biocatalysts. Here we present a workflow for maximizing the efficiency of a three-enzyme system catalyzing a five-step chemical conversion. Kinetic models of pathways with wild-type or engineered enzymes were built, and the enzyme stoichiometry of each pathway was optimized. Mathematical modeling and one-pot multienzyme experiments provided detailed insights into pathway dynamics, enabled the selection of a suitable engineered enzyme, and afforded high efficiency while minimizing biocatalyst loadings. Optimizing the stoichiometry in a pathway with an engineered enzyme reduced the total biocatalyst load by an impressive 56 %. Our new workflow represents a broadly applicable strategy for optimizing multienzyme processes.
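
    A toy sketch of why stoichiometry optimization pays off, under the simplifying assumption that each step's rate is proportional to its enzyme loading and the pathway flux is set by the slowest step (the rate constants below are invented, not the published kinetic model):

```python
# Toy sketch only: if each step's rate is proportional to its enzyme loading,
# the minimal total biocatalyst load that still meets a target pathway flux is
# obtained by giving each enzyme exactly enough to sustain that flux.
import numpy as np

k_cat = np.array([5.0, 1.0, 3.0])     # effective rate per unit of each enzyme (assumed)
TARGET_FLUX = 2.0                     # required steady-state pathway throughput

# per-enzyme loading that makes every step run exactly at the target flux
optimal_loading = TARGET_FLUX / k_cat
naive_loading = np.full(3, optimal_loading.max())   # equal loads sized by the slowest step

print("optimal stoichiometry:", optimal_loading.round(3))
print("total load (optimized):", optimal_loading.sum().round(3))
print("total load (equal loading):", naive_loading.sum().round(3))
```

    In this invented example the optimized stoichiometry roughly halves the total load relative to equal loading, echoing the kind of reduction reported in the abstract.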

  6. PLUTONIUM PROCESSING OPTIMIZATION IN SUPPORT OF THE MOX FUEL PROGRAM

    SciTech Connect

    GRAY, DEVIN W.; COSTA, DAVID A.

    2007-02-02

    After Los Alamos National Laboratory (LANL) personnel completed polishing 125 kg of plutonium as highly purified PuO2 from surplus nuclear weapons, Duke, COGEMA, Stone, and Webster (DCS) required, as the next process stage, the validation and optimization of all phases of the plutonium polishing flow sheet. Personnel will develop the optimized parameters for use in the upcoming 330 kg production mission.

  7. Economic-Oriented Stochastic Optimization in Advanced Process Control of Chemical Processes

    PubMed Central

    Dobos, László; Király, András; Abonyi, János

    2012-01-01

    Finding the optimal operating region of chemical processes is an inevitable step toward improving economic performance. Usually the optimal operating region is situated close to process constraints related to product quality or process safety requirements. Higher profit can be realized only by assuring a relatively low frequency of violation of these constraints. A multilevel stochastic optimization framework is proposed to determine the optimal setpoint values of control loops with respect to predetermined risk levels, uncertainties, and costs of violation of process constraints. The proposed framework is realized as direct search-type optimization of Monte-Carlo simulation of the controlled process. The concept is illustrated throughout by a well-known benchmark problem related to the control of a linear dynamical system and the model predictive control of a more complex nonlinear polymerization process. PMID:23213298
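
    The framework can be caricatured as follows: a direct search over candidate set-points, each evaluated by Monte-Carlo simulation of the controlled variable, trading profit against an expected cost of constraint violation. The process model and costs below are toy assumptions, not the benchmark problems used in the paper.

```python
# Rough sketch of the idea (toy process): pick a set-point by direct search
# over a Monte-Carlo simulation of the controlled variable, trading profit
# against the expected cost of violating an upper constraint.
import numpy as np

rng = np.random.default_rng(0)
CONSTRAINT = 100.0          # upper limit on the controlled variable
VIOLATION_COST = 50.0       # penalty per violating sample (assumed)
PROFIT_PER_UNIT = 1.0

def expected_objective(setpoint, n_samples=10_000):
    # Assume the controlled variable fluctuates around the set-point.
    realized = setpoint + rng.normal(0.0, 2.0, size=n_samples)
    profit = PROFIT_PER_UNIT * realized.mean()
    violation_prob = np.mean(realized > CONSTRAINT)
    return profit - VIOLATION_COST * violation_prob

candidates = np.arange(90.0, 100.5, 0.5)            # direct-search grid
best = max(candidates, key=expected_objective)
print(f"optimal set-point: {best:.1f}")
```

    The search pushes the set-point close to, but not onto, the constraint, which is the behavior the abstract describes for economically optimal operation.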

  8. Economic-oriented stochastic optimization in advanced process control of chemical processes.

    PubMed

    Dobos, László; Király, András; Abonyi, János

    2012-01-01

    Finding the optimal operating region of chemical processes is an inevitable step toward improving economic performance. Usually the optimal operating region is situated close to process constraints related to product quality or process safety requirements. Higher profit can be realized only by assuring a relatively low frequency of violation of these constraints. A multilevel stochastic optimization framework is proposed to determine the optimal setpoint values of control loops with respect to predetermined risk levels, uncertainties, and costs of violation of process constraints. The proposed framework is realized as direct search-type optimization of Monte-Carlo simulation of the controlled process. The concept is illustrated throughout by a well-known benchmark problem related to the control of a linear dynamical system and the model predictive control of a more complex nonlinear polymerization process.

  9. Mars Soil-Based Resource Processing and Planetary Protection

    NASA Technical Reports Server (NTRS)

    Sanders, G. B.; Mueller, R. P.

    2015-01-01

    The ability to extract and process resources at the site of exploration into products and services, commonly referred to as In Situ Resource Utilization (ISRU), can have significant benefits for robotic and human exploration missions. In particular, the ability to use in situ resources to make propellants, fuel cell reactants, and life support consumables has been shown in studies to significantly reduce mission mass, cost, and risk, while enhancing or enabling missions not possible without the incorporation of ISRU. In December 2007, NASA completed the Mars Human Design Reference Architecture (DRA) 5.0 study. For the first time in a large scale Mars architecture study, water from Mars soil was considered as a potential resource. At the time of the study, knowledge of water resources (their form, concentration, and distribution) was extremely limited. Also, due to lack of understanding of how to apply planetary protection rules and requirements to ISRU soil-based excavation and processing, an extremely conservative approach was incorporated where only the top several centimeters of ultraviolet (UV) radiated soil could be processed (assumed to be 3% water by mass). While results of the Mars DRA 5.0 study showed that combining atmosphere processing to make oxygen and methane with soil processing to extract water provided the lowest mission mass, atmosphere processing to convert carbon dioxide (CO2) into oxygen was baselined for the mission since it was the lowest power and risk option. With increased knowledge and further clarification of Mars planetary protection rules, and the recent release of the Mars Exploration Program Analysis Group (MEPAG) report on "Special Regions and the Human Exploration of Mars", it is time to reexamine potential water resources on Mars, options for soil processing to extract water, and the implications with respect to planetary protection and Special Regions on Mars.

  10. Stress Exposure and Depression in Disadvantaged Women: The Protective Effects of Optimism and Perceived Control

    ERIC Educational Resources Information Center

    Grote, Nancy K.; Bledsoe, Sarah E.; Larkin, Jill; Lemay, Edward P., Jr.; Brown, Charlotte

    2007-01-01

    In the present study, the authors predicted that the individual protective factors of optimism and perceived control over acute and chronic stressors would buffer the relations between acute and chronic stress exposure and severity of depression, controlling for household income, in a sample of financially disadvantaged women. Ninety-seven African…

  11. Optimization of Gas Metal Arc Welding Process Parameters

    NASA Astrophysics Data System (ADS)

    Kumar, Amit; Khurana, M. K.; Yadav, Pradeep K.

    2016-09-01

    This study presents the application of the Taguchi method combined with grey relational analysis to optimize the process parameters of gas metal arc welding (GMAW) of AISI 1020 carbon steel for multiple quality characteristics (bead width, bead height, weld penetration and heat affected zone). An L9 orthogonal array has been used for fabrication of the joints. The experiments have been conducted according to combinations of voltage (V), current (A) and welding speed (Ws). The results revealed that the welding speed is the most significant process parameter. By analyzing the grey relational grades, optimal parameters are obtained, and significant factors are identified using ANOVA. The welding parameters, namely speed, welding current and voltage, have been optimized for material AISI 1020 using the GMAW process. To confirm the robustness of the experimental design, a confirmation test was performed at the selected optimal process parameter setting. Observations from this method may be useful for automotive sub-assemblies, shipbuilding and vessel fabricators and operators to obtain optimal welding conditions.
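
    For readers unfamiliar with grey relational analysis, the following sketch computes grey relational grades for a few invented welding runs with mixed smaller-the-better and larger-the-better responses; the numbers are placeholders, not the study's measurements.

```python
# Illustrative grey relational grade computation for invented GMAW runs with
# two smaller-the-better responses and one larger-the-better response.
import numpy as np

# columns: bead width (mm, smaller better), HAZ (mm, smaller better),
#          penetration (mm, larger better)
responses = np.array([
    [7.2, 2.1, 3.0],
    [6.5, 1.8, 2.6],
    [8.0, 2.4, 3.4],
])
larger_better = np.array([False, False, True])

norm = np.empty_like(responses)
for j in range(responses.shape[1]):
    col = responses[:, j]
    if larger_better[j]:
        norm[:, j] = (col - col.min()) / (col.max() - col.min())
    else:
        norm[:, j] = (col.max() - col) / (col.max() - col.min())

zeta = 0.5                                   # distinguishing coefficient
deviation = 1.0 - norm
coeff = (deviation.min() + zeta * deviation.max()) / (deviation + zeta * deviation.max())
grades = coeff.mean(axis=1)
print("grey relational grades:", grades.round(3))
print("best run:", int(np.argmax(grades)) + 1)
```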

  12. Optimizing a Laser Process for Making Carbon Nanotubes

    NASA Technical Reports Server (NTRS)

    Arepalli, Sivaram; Nikolaev, Pavel; Holmes, William

    2010-01-01

    A systematic experimental study has been performed to determine the effects of each of the operating conditions in a double-pulse laser ablation process that is used to produce single-wall carbon nanotubes (SWCNTs). The comprehensive data compiled in this study have been analyzed to recommend conditions for optimizing the process and scaling up the process for mass production. The double-pulse laser ablation process for making SWCNTs was developed by Rice University researchers. Of all currently known nanotube-synthesizing processes (arc and chemical vapor deposition), this process yields the greatest proportion of SWCNTs in the product material. The aforementioned process conditions are important for optimizing the production of SWCNTs and scaling up production. Reports of previous research (mostly at Rice University) toward optimization of process conditions mention effects of oven temperature and briefly mention effects of flow conditions, but no systematic, comprehensive study of the effects of process conditions was done prior to the study described here. This was a parametric study, in which several production runs were carried out, changing one operating condition for each run. The study involved variation of a total of nine parameters: the sequence of the laser pulses, pulse-separation time, laser pulse energy density, buffer gas (helium or nitrogen instead of argon), oven temperature, pressure, flow speed, inner diameter of the flow tube, and flow-tube material.

  13. Experimental reversion of the optimal quantum cloning and flipping processes

    SciTech Connect

    Sciarrino, Fabio; Secondi, Veronica; De Martini, Francesco

    2006-04-15

    The quantum cloner machine maps an unknown arbitrary input qubit into two optimal clones and one optimal flipped qubit. By combining linear and nonlinear optical methods we experimentally implement a scheme that, after the cloning transformation, restores the original input qubit in one of the output channels, by using local measurements, classical communication, and feedforward. This nonlocal method demonstrates how the information on the input qubit can be restored after the cloning process. The realization of the reversion process is expected to find useful applications in the field of modern multipartite quantum cryptography.

  14. Optimization of polyetherimide processing parameters for optical interconnect applications

    NASA Astrophysics Data System (ADS)

    Zhao, Wei; Johnson, Peter; Wall, Christopher

    2015-10-01

    ULTEM® polyetherimide (PEI) resins have been used in opto-electronic markets since the optical properties of these materials enable the design of critical components under tight tolerances. PEI resins are the material of choice for injection molded integrated lens applications due to good dimensional stability, near infrared (IR) optical transparency, low moisture uptake and high heat performance. In most applications, parts must be produced consistently with minimal deviations to ensure compatibility throughout the lifetime of the part. With the large number of lenses needed for this market, injection molding has been optimized to maximize the production rate. These optimized parameters for high throughput may or may not translate to an optimized optical performance. In this paper, we evaluate and optimize PEI injection molding processes with a focus on optical property performance. A commonly used commercial grade was studied to determine factors and conditions which contribute to optical transparency, color, and birefringence. Melt temperature, mold temperature, injection speed and cycle time were varied to develop optimization trials and evaluate optical properties. These parameters could be optimized to reduce in-plane birefringence from 0.0148 to 0.0006 in this study. In addition, we have studied an optically smooth, sub-10nm roughness mold to re-evaluate material properties with minimal influence from mold quality and further refine resin and process effects for the best optical performance.

  15. Verifying and Validating Proposed Models for FSW Process Optimization

    NASA Technical Reports Server (NTRS)

    Schneider, Judith

    2008-01-01

    This slide presentation reviews Friction Stir Welding (FSW) and the attempts to model the process in order to optimize and improve the process. The studies are ongoing to validate and refine the model of metal flow in the FSW process. There are slides showing the conventional FSW process, a couple of weld tool designs and how the design interacts with the metal flow path. The two basic components of the weld tool are shown, along with geometries of the shoulder design. Modeling of the FSW process is reviewed. Other topics include (1) Microstructure features, (2) Flow Streamlines, (3) Steady-state Nature, and (4) Grain Refinement Mechanisms

  16. Protecting HAZMAT personnel: A multi-phase process

    SciTech Connect

    Ziegler, P. )

    1993-03-01

    Protecting personnel during hazardous substance releases is a process requiring several integrated elements. Managers must ensure the proper training has occurred and the appropriate personal protective equipment is available. They also must have a thorough understanding of applicable regulations, a well-defined contingency planning program, a ready inventory of air monitoring equipment and provisions for outside assistance. Several regulations apply to an organization that could be responsible for an oil or hazardous substance spill. These have been issued by several regulatory agencies, primarily the US Environmental Protection Agency (EPA) and the Occupational Safety and Health Administration (OSHA). For treatment, storage and disposal facilities and both large and small quantity generators under the jurisdiction of the Resource Conservation and Recovery Act (RCRA), several rules detail emergency planning and training requirements. Title III of the Superfund Amendments and Reauthorization Act of 1986 (SARA) established requirements that apply to nearly all industries mainly for emergency incident and chemical use notification. The goal of its provisions, also known as the Emergency Planning and Community Right-to-Know Act (EPCRA), is to enable states and communities to improve chemical safety and better protect public health and the environment.

  17. Multi-Objective Optimization for Alumina Laser Sintering Process

    NASA Astrophysics Data System (ADS)

    Fayed, E. M.; Elmesalamy, A. S.; Sobih, M.; Elshaer, Y.

    2016-09-01

    Selective laser sintering has become one of the most popular additive manufacturing processes due to its flexibility in the creation of complex components. This process has many interacting parameters, which have a significant influence on the process output. In this work, high-purity alumina is sintered through a pulsed Nd:YAG laser sintering process. The aim of this work is to understand the effect of the relevant sintering process parameters (laser power and laser scanning speed) on the quality of the sintered layer (layer surface roughness, layer thickness, vector/line width, and density). Design of experiments and statistical modeling techniques are employed to optimize the process control factors and to establish a relationship between these factors and the output responses. Model results have been verified through experimental work and show reasonable prediction of process responses within the limits of the sintering parameters.

  18. Optimal Signal Processing of Frequency-Stepped CW Radar Data

    NASA Technical Reports Server (NTRS)

    Ybarra, Gary A.; Wu, Shawkang M.; Bilbro, Griff L.; Ardalan, Sasan H.; Hearn, Chase P.; Neece, Robert T.

    1995-01-01

    An optimal signal processing algorithm is derived for estimating the time delay and amplitude of each scatterer reflection using a frequency-stepped CW system. The channel is assumed to be composed of abrupt changes in the reflection coefficient profile. The optimization technique is intended to maximize the target range resolution achievable from any set of frequency-stepped CW radar measurements made in such an environment. The algorithm is composed of an iterative two-step procedure. First, the amplitudes of the echoes are optimized by solving an overdetermined least squares set of equations. Then, a nonlinear objective function is scanned in an organized fashion to find its global minimum. The result is a set of echo strengths and time delay estimates. Although this paper addresses the specific problem of resolving the time delay between the two echoes, the derivation is general in the number of echoes. Performance of the optimization approach is illustrated using measured data obtained from an HP-8510 network analyzer. It is demonstrated that the optimization approach offers a significant resolution enhancement over the standard processing approach that employs an IFFT. Degradation in the performance of the algorithm due to suboptimal model order selection and the effects of additive white Gaussian noise are addressed.
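
    A schematic version of the two-step procedure, with toy parameters: for each candidate pair of delays the echo amplitudes come from a linear least-squares fit, and the pair with the smallest residual is retained (a brute-force grid scan stands in here for the organized scan described above).

```python
# Schematic two-step estimator for a frequency-stepped CW measurement (toy
# parameters, synthetic data): step 1 solves for echo amplitudes by least
# squares, step 2 scans candidate delay pairs for the smallest residual.
import itertools
import numpy as np

rng = np.random.default_rng(1)
freqs = np.linspace(1e9, 1.2e9, 64)                  # stepped CW frequencies (Hz)
true_delays = np.array([10e-9, 13e-9])               # two closely spaced echoes
true_amps = np.array([1.0, 0.6])

def model_matrix(delays):
    return np.exp(-2j * np.pi * np.outer(freqs, delays))

measured = model_matrix(true_delays) @ true_amps
measured += 0.01 * (rng.normal(size=freqs.size) + 1j * rng.normal(size=freqs.size))

delay_grid = np.arange(5e-9, 20e-9, 0.25e-9)
best = (np.inf, None, None)
for d1, d2 in itertools.combinations(delay_grid, 2):
    A = model_matrix(np.array([d1, d2]))
    amps, *_ = np.linalg.lstsq(A, measured, rcond=None)   # step 1: amplitudes
    residual = np.linalg.norm(measured - A @ amps)        # step 2: scan objective
    if residual < best[0]:
        best = (residual, (d1, d2), amps)

print("estimated delays (ns):", np.round(np.array(best[1]) * 1e9, 2))
print("estimated amplitudes:", np.round(np.abs(best[2]), 2))
```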

  19. Optimal Signal Processing of Frequency-Stepped CW Radar Data

    NASA Technical Reports Server (NTRS)

    Ybarra, Gary A.; Wu, Shawkang M.; Bilbro, Griff L.; Ardalan, Sasan H.; Hearn, Chase P.; Neece, Robert T.

    1995-01-01

    An optimal signal processing algorithm is derived for estimating the time delay and amplitude of each scatterer reflection using a frequency-stepped CW system. The channel is assumed to be composed of abrupt changes in the reflection coefficient profile. The optimization technique is intended to maximize the target range resolution achievable from any set of frequency-stepped CW radar measurements made in such an environment. The algorithm is composed of an iterative two-step procedure. First, the amplitudes of the echoes are optimized by solving an overdetermined least squares set of equations. Then, a nonlinear objective function is scanned in an organized fashion to find its global minimum. The result is a set of echo strengths and time delay estimates. Although this paper addresses the specific problem of resolving the time delay between the first two echoes, the derivation is general in the number of echoes. Performance of the optimization approach is illustrated using measured data obtained from an HP-8510 network analyzer. It is demonstrated that the optimization approach offers a significant resolution enhancement over the standard processing approach that employs an IFFT. Degradation in the performance of the algorithm due to suboptimal model order selection and the effects of additive white Gaussian noise are addressed.

  20. Marine protected areas and the value of spatially optimized fishery management

    PubMed Central

    Rassweiler, Andrew; Costello, Christopher; Siegel, David A.

    2012-01-01

    There is a growing focus around the world on marine spatial planning, including spatial fisheries management. Some spatial management approaches are quite blunt, as when marine protected areas (MPAs) are established to restrict fishing in specific locations. Other management tools, such as zoning or spatial user rights, will affect the distribution of fishing effort in a more nuanced manner. Considerable research has focused on the ability of MPAs to increase fishery returns, but the potential for the broader class of spatial management approaches to outperform MPAs has received far less attention. We use bioeconomic models of seven nearshore fisheries in Southern California to explore the value of optimized spatial management in which the distribution of fishing is chosen to maximize profits. We show that fully optimized spatial management can substantially increase fishery profits relative to optimal nonspatial management but that the magnitude of this increase depends on characteristics of the fishing fleet and target species. Strategically placed MPAs can also increase profits substantially compared with nonspatial management, particularly if fishing costs are low, although profit increases available through optimal MPA-based management are roughly half those from fully optimized spatial management. However, if the same total area is protected by randomly placing MPAs, starkly contrasting results emerge: most random MPA designs reduce expected profits. The high value of spatial management estimated here supports continued interest in spatially explicit fisheries regulations but emphasizes that predicted increases in profits can only be achieved if the fishery is well understood and the regulations are strategically designed. PMID:22753469

  1. The optimization of operating parameters on microalgae upscaling process planning.

    PubMed

    Ma, Yu-An; Huang, Hsin-Fu; Yu, Chung-Chyi

    2016-03-01

    The upscaling process planning developed in this study primarily involved optimizing operating parameters, i.e., dilution ratios, during process designs. Minimal variable cost was used as an indicator for selecting the optimal combination of dilution ratios. The upper and lower mean confidence intervals obtained from the actual cultured cell density data were used as the final cell density stability indicator after the operating parameters or dilution ratios were selected. The process planning method and results were demonstrated through three case studies of batch culture simulation. They are (1) final objective cell densities were adjusted, (2) high and low light intensities were used for intermediate-scale cultures, and (3) the number of culture days was expressed as integers for the intermediate-scale culture.

  2. Plasma process optimization for N-type doping applications

    NASA Astrophysics Data System (ADS)

    Raj, Deven; Persing, Harold; Salimian, Siamak; Lacey, Kerry; Qin, Shu; Hu, Jeff Y.; McTeer, Allen

    2012-11-01

    Plasma doping (PLAD) has been adopted across the implant technology space and into high-volume production for both conventional DRAM and NAND doping applications. PLAD has established itself as an alternative to traditional beamline ion implantation. The push for high doping concentration, shallow doping depth, and conformal doping capability expands the need for a PLAD solution to meet such requirements. The unique doping profile and doping characteristics at high dose rates allow PLAD to deliver a high-throughput, differentiated solution to meet the demands of evolving transistor technology. In the PLAD process, ions are accelerated toward the wafer by a negative bias applied to the wafer. Competing mechanisms inherent in plasma doping, such as deposition, sputtering, and etching, require unique control and process optimization. In this work, we look at the distinctive process tool control and characterization features which enable an optimized doping process using n-type (PH3 or AsH3) chemistries. The data in this paper relate process optimization through plasma chemistry studies to the wafer-level results.

  3. Plasma process optimization for N-type doping applications

    SciTech Connect

    Raj, Deven; Persing, Harold; Salimian, Siamak; Lacey, Kerry; Qin Shu; Hu, Jeff Y.; McTeer, Allen

    2012-11-06

    Plasma doping (PLAD) has been adopted across the implant technology space and into high-volume production for both conventional DRAM and NAND doping applications. PLAD has established itself as an alternative to traditional beamline ion implantation. The push for high doping concentration, shallow doping depth, and conformal doping capability expands the need for a PLAD solution to meet such requirements. The unique doping profile and doping characteristics at high dose rates allow PLAD to deliver a high-throughput, differentiated solution to meet the demands of evolving transistor technology. In the PLAD process, ions are accelerated toward the wafer by a negative bias applied to the wafer. Competing mechanisms inherent in plasma doping, such as deposition, sputtering, and etching, require unique control and process optimization. In this work, we look at the distinctive process tool control and characterization features which enable an optimized doping process using n-type (PH3 or AsH3) chemistries. The data in this paper relate process optimization through plasma chemistry studies to the wafer-level results.

  4. Instrumentation for optimizing an underground coal-gasification process

    NASA Astrophysics Data System (ADS)

    Seabaugh, W.; Zielinski, R. E.

    1982-06-01

    While the United States has a coal resource base of 6.4 trillion tons, only seven percent is presently recoverable by mining. The process of in-situ gasification can recover another twenty-eight percent of this vast resource; however, viable technology must be developed for effective in-situ recovery. The key to this technology is a system that can optimize and control the process in real time. An instrumentation system is described that optimizes the composition of the injection gas, controls the in-situ process and conditions the product gas for maximum utilization. The key elements of this system are Monsanto PRISM Systems, a real-time analytical system, and a real-time data acquisition and control system. This system provides for complete automation of the process but can easily be overridden by manual control. The use of this cost-effective system can provide process optimization and is an effective element in developing a viable in-situ technology.

  5. Expert system and process optimization techniques for real-time monitoring and control of plasma processes

    NASA Astrophysics Data System (ADS)

    Cheng, Jie; Qian, Zhaogang; Irani, Keki B.; Etemad, Hossein; Elta, Michael E.

    1991-03-01

    To meet the ever-increasing demand of the rapidly growing semiconductor manufacturing industry, it is critical to have a comprehensive methodology integrating techniques for process optimization, real-time monitoring, and adaptive process control. To this end we have developed an integrated knowledge-based approach combining the latest expert system technology, machine learning methods, and traditional statistical process control (SPC) techniques. This knowledge-based approach is advantageous in that it makes it possible for the tasks of process optimization and adaptive control to be performed consistently and predictably. Furthermore, this approach can be used to construct high-level, qualitative descriptions of processes and thus make the process behavior easy to monitor, predict, and control. Two software packages, RIST (Rule Induction and Statistical Testing) and KARSM (Knowledge Acquisition from Response Surface Methodology), have been developed and incorporated with two commercially available packages, G2 (a real-time expert system) and ULTRAMAX (a tool for sequential process optimization).

  6. An atmosphere protection subsystem in the thermal power station automated process control system

    NASA Astrophysics Data System (ADS)

    Parchevskii, V. M.; Kislov, E. A.

    2014-03-01

    Matters concerned with development of methodical and mathematical support for an atmosphere protection subsystem in the thermal power station automated process control system are considered taking as an example the problem of controlling nitrogen oxide emissions at a gas-and-oil-fired thermal power station. The combined environmental-and-economic characteristics of boilers, which correlate the costs for suppressing emissions with the boiler steam load and mass discharge of nitrogen oxides in analytic form, are used as the main tool for optimal control. A procedure for constructing and applying environmental-and-economic characteristics on the basis of technical facilities available in modern instrumentation and control systems is presented.

  7. Simulation and optimization of the waste nitric acid recovery process

    SciTech Connect

    Oh, S.C.; Yeo, Y.K.; Oh, Y.S.

    1998-02-01

    This paper deals with the simulation and optimization of composite distillation columns for the waste nitric acid recovery process. The composite distillation columns, which consist of a multistage vacuum column and an atmospheric pressure column (half of which consists of a packed bed), were modeled by using an equilibrium stage method and a nonequilibrium stage method. The required physical properties of the nitric acid solution for simulation were obtained from correlations based on experimental data. Simulation results using the nonequilibrium model showed better agreement with actual plant data than those of the equilibrium model. Based on the simulation results, the optimal operating conditions were studied. In the optimization, the reflux ratio was employed as the key variable to maximize the operating profit.

  8. Reliability-based design optimization under stationary stochastic process loads

    NASA Astrophysics Data System (ADS)

    Hu, Zhen; Du, Xiaoping

    2016-08-01

    Time-dependent reliability-based design ensures the satisfaction of reliability requirements for a given period of time, but with a high computational cost. This work improves the computational efficiency by extending the sequential optimization and reliability analysis (SORA) method to time-dependent problems with both stationary stochastic process loads and random variables. The challenge of the extension is the identification of the most probable point (MPP) associated with time-dependent reliability targets. Since a direct relationship between the MPP and reliability target does not exist, this work defines the concept of equivalent MPP, which is identified by the extreme value analysis and the inverse saddlepoint approximation. With the equivalent MPP, the time-dependent reliability-based design optimization is decomposed into two decoupled loops: deterministic design optimization and reliability analysis, and both are performed sequentially. Two numerical examples are used to show the efficiency of the proposed method.

  9. Optimal design of the satellite constellation arrangement reconfiguration process

    NASA Astrophysics Data System (ADS)

    Fakoor, Mahdi; Bakhtiari, Majid; Soleymani, Mahshid

    2016-08-01

    In this article, a novel approach is introduced for satellite constellation reconfiguration based on Lambert's theorem. Some critical problems arise in the reconfiguration phase, such as overall fuel cost minimization, collision avoidance between the satellites on the final orbital pattern, and the maneuvers necessary for the satellites to be deployed in the desired positions in the target constellation. To implement the reconfiguration phase of the satellite constellation arrangement at minimal cost, the hybrid Invasive Weed Optimization/Particle Swarm Optimization (IWO/PSO) algorithm is used to design sub-optimal transfer orbits for the satellites existing in the constellation. Also, the dynamics of the problem are modeled in such a way that optimal assignment of the satellites to the initial and target orbits and optimal orbital transfer are combined in one step. Finally, we claim that our presented idea, i.e., coupled non-simultaneous flight of satellites from the initial orbital pattern, leads to minimal cost. The obtained results show that by employing the presented method, the cost of the reconfiguration process is reduced considerably.

  10. Process optimization and evaluation of novel baicalin solid nanocrystals

    PubMed Central

    Yue, Peng-Fei; Li, Yu; Wan, Jing; Wang, Yong; Yang, Ming; Zhu, Wei-Feng; Wang, Chang-Hong; Yuan, Hai-Long

    2013-01-01

    The objective of this study was to prepare baicalin solid nanocrystals (BCN-SNS) to enhance the oral bioavailability of baicalin. A Box–Behnken design approach was used for process optimization. The physicochemical properties and pharmacokinetics of the optimal BCN-SNS were investigated. Multiple linear regression analysis for process optimization revealed that fine BCN-SNS was obtained at optimal values of homogenization pressure, number of homogenization cycles, TPGS-to-drug ratio (w/w), and MCCS-to-drug ratio (w/w) of 850 bar, 25 cycles, 10%, and 10%, respectively. Transmission electron microscopy and scanning electron microscopy results indicated that no significant aggregation or crystal growth could be observed in the redispersed freeze-dried BCN-SNS. Differential scanning calorimetry and X-ray diffraction results showed that BCN remained in a crystalline state. The dissolution velocity of the freeze-dried BCN-SNS powder was distinctly superior to those of the crude powder and physical mixture. The bioavailability of BCN in rats was increased remarkably after oral administration of BCN-SNS (P < 0.05), compared with those of BCN or the physical mixture. The SNS might be a good choice for oral administration of poorly soluble BCN, due to the improvement of the bioavailability and dissolution velocity of BCN-SNS. PMID:23976849
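
    For readers unfamiliar with the experimental design used here, the snippet below constructs a four-factor Box-Behnken design in coded units and maps the columns to the four factors named in the abstract; the construction is generic and the factor ranges are not the authors'.

    ```python
    # Minimal sketch: build a four-factor Box-Behnken design in coded units (-1, 0, +1)
    # and label the columns with the process variables named in the abstract. The
    # actual factor ranges used by the authors are not given here.
    from itertools import combinations, product
    import numpy as np

    def box_behnken(k, n_center=3):
        runs = []
        for i, j in combinations(range(k), 2):       # every pair of factors
            for a, b in product((-1, 1), repeat=2):  # +/-1 corners of that pair
                row = [0] * k
                row[i], row[j] = a, b
                runs.append(row)
        runs += [[0] * k] * n_center                 # center points
        return np.array(runs, dtype=float)

    design = box_behnken(4)                          # 27 runs for 4 factors
    factors = ["pressure (bar)", "cycles", "TPGS/drug (%)", "MCCS/drug (%)"]
    print(design.shape)                              # (27, 4)
    print(dict(zip(factors, design[0])))
    ```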

  11. Optimization of Forming Processes with Different Sheet Metal Alloys

    NASA Astrophysics Data System (ADS)

    Sousa, Luísa C.; Castro, Catarina F.; António, Carlos C.

    2007-05-01

    Over the past decades, relatively heavy components made of steel alloys have made up the majority of manufactured parts due to steel's low cost, high formability and good strength. The desire to produce lightweight parts has led to studies searching for lighter and stronger materials such as aluminum alloys. However, these exhibit lower elastic stiffness than steel, resulting in higher elastic strains that cause known distortions such as spring-back and thus decrease the accuracy of manufactured net-shape components. This paper presents a computational method to optimize the design of sheet metal processes using genetic algorithms. An inverse approach is considered so that the final geometry of the bent blank closely follows a prescribed one. The method couples a finite element forming simulation with an evolutionary algorithm that searches for the optimal design parameters of the process, i.e. the parameters that ensure a correct net-shape part. Different aluminum alloy candidates for automotive structural applications are considered and the optimal solutions are analyzed.
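
    A minimal sketch of the inverse approach is given below: a small genetic algorithm adjusts process parameters so that a "simulated" final geometry matches a prescribed one. The finite element simulation is replaced by a hypothetical stand-in function, and the parameter names (punch stroke, holder force) and ranges are illustrative, not taken from the paper.

    ```python
    # Minimal sketch of the inverse approach: a small genetic algorithm searches
    # process parameters so that the simulated final geometry matches a prescribed
    # one. The FE forming simulation is replaced by a stand-in function.
    import numpy as np

    rng = np.random.default_rng(1)
    target_profile = np.linspace(0.0, 30.0, 20)             # prescribed geometry (mm)

    def fe_simulation(punch_stroke, holder_force):
        """Stand-in for the FE forming simulation: returns a 'formed' profile."""
        springback = 0.02 * punch_stroke - 0.001 * holder_force
        return np.linspace(0.0, 30.0, 20) * (1.0 + springback)

    def fitness(ind):
        profile = fe_simulation(*ind)
        return -np.linalg.norm(profile - target_profile)     # higher is better

    bounds = np.array([[20.0, 60.0], [1.0, 50.0]])           # stroke (mm), holder force (kN)
    pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(30, 2))

    for generation in range(40):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[-10:]]              # keep the 10 best
        children = parents[rng.integers(0, 10, size=(30, 2)), [0, 1]]  # uniform crossover
        children += rng.normal(0.0, 0.5, size=children.shape)          # mutation
        pop = np.clip(children, bounds[:, 0], bounds[:, 1])

    best = pop[np.argmax([fitness(ind) for ind in pop])]
    print(f"best stroke = {best[0]:.1f} mm, holder force = {best[1]:.1f} kN")
    ```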

  12. Optimization of Forming Processes with Different Sheet Metal Alloys

    SciTech Connect

    Sousa, Luisa C.; Castro, Catarina F.; Antonio, Carlos C.

    2007-05-17

    Over the past decades, relatively heavy components made of steel alloys have made up the majority of manufactured parts due to steel's low cost, high formability and good strength. The desire to produce lightweight parts has led to studies searching for lighter and stronger materials such as aluminum alloys. However, these exhibit lower elastic stiffness than steel, resulting in higher elastic strains that cause known distortions such as spring-back and thus decrease the accuracy of manufactured net-shape components. This paper presents a computational method to optimize the design of sheet metal processes using genetic algorithms. An inverse approach is considered so that the final geometry of the bent blank closely follows a prescribed one. The method couples a finite element forming simulation with an evolutionary algorithm that searches for the optimal design parameters of the process, i.e. the parameters that ensure a correct net-shape part. Different aluminum alloy candidates for automotive structural applications are considered and the optimal solutions are analyzed.

  13. Optimization of a reversible hood for protecting a pedestrian's head during car collisions.

    PubMed

    Huang, Sunan; Yang, Jikuang

    2010-07-01

    This study evaluated and optimized the performance of a reversible hood (RH) for preventing head injuries to an adult pedestrian in car collisions. An FE model of a production car front was introduced and validated. The baseline RH was developed from the original hood in the validated car front model. In order to evaluate the protective performance of the baseline RH, FE models of an adult headform and a 50th percentile human head were used in parallel to impact the baseline RH. Based on this evaluation, the response surface method was applied to optimize the RH in terms of material stiffness, lifting speed, and lifted height. Finally, the headform model and the human head model were again used to evaluate the protective performance of the optimized RH. It was found that the lifted baseline RH clearly reduced the impact responses of the headform model and the human head model compared with the retracted and still-lifting baseline RH. When the optimized RH was lifted, the HIC values of the headform model and the human head model were further reduced to well below 1000, so the risk of pedestrian head injury can be limited as required by EEVC WG17.
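
    The HIC threshold of 1000 mentioned above refers to the standard head injury criterion. The sketch below computes HIC from a resultant head acceleration history; the acceleration pulse is synthetic and a 15 ms window is assumed, so the numbers are illustrative rather than reproductions of the paper's FE results.

    ```python
    # Minimal sketch: compute the Head Injury Criterion (HIC) from a resultant head
    # acceleration history. The trace below is synthetic; the paper obtains it from
    # FE headform / human head impacts. A 15 ms window (HIC15) is assumed here.
    import numpy as np

    def hic(t, a_g, window=0.015):
        """t in seconds, a_g = resultant acceleration in g's."""
        best = 0.0
        # cumulative trapezoidal integral of a_g over time
        cum = np.concatenate(([0.0], np.cumsum(0.5 * (a_g[1:] + a_g[:-1]) * np.diff(t))))
        for i in range(len(t)):
            for j in range(i + 1, len(t)):
                dt = t[j] - t[i]
                if dt > window:
                    break
                avg_a = (cum[j] - cum[i]) / dt
                best = max(best, dt * avg_a ** 2.5)
        return best

    # Synthetic half-sine pulse: ~120 g peak over 10 ms, sampled at 10 kHz.
    t = np.arange(0.0, 0.030, 1e-4)
    a = np.where(t < 0.010, 120.0 * np.sin(np.pi * t / 0.010), 0.0)
    print(f"HIC15 ~ {hic(t, a):.0f}")
    ```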

  14. Attention as reward-driven optimization of sensory processing.

    PubMed

    Chalk, Matthew; Murray, Iain; Seriès, Peggy

    2013-11-01

    Attention causes diverse changes to visual neuron responses, including alterations in receptive field structure, and firing rates. A common theoretical approach to investigate why sensory neurons behave as they do is based on the efficient coding hypothesis: that sensory processing is optimized toward the statistics of the received input. We extend this approach to account for the influence of task demands, hypothesizing that the brain learns a probabilistic model of both the sensory input and reward received for performing different actions. Attention-dependent changes to neural responses reflect optimization of this internal model to deal with changes in the sensory environment (stimulus statistics) and behavioral demands (reward statistics). We use this framework to construct a simple model of visual processing that is able to replicate a number of attention-dependent changes to the responses of neurons in the midlevel visual cortices. The model is consistent with and provides a normative explanation for recent divisive normalization models of attention (Reynolds & Heeger, 2009).

  15. Optimization of the rapping process of an intermittent electrostatic precipitator

    NASA Astrophysics Data System (ADS)

    Miloua, F.; Tilmatine, A.; Gouri, R.; Kadous, N.; Dascalescu, L.

    2008-01-01

    Intermittent operation is specific to electrostatic precipitators (ESPs) used in workshops where the polluting product is generated discontinuously. The rapping system is necessary to ensure continuous and effective operation of a dry electrostatic precipitator, but at the same time it causes re-entrainment of dust and thus degrades filtration efficiency. The objective of this paper is to propose a procedure based on the methodology of experimental design (Taguchi's methodology) for optimizing the rapping process; it consists in determining the optimal rapping values, i.e. the moment, position and force of rapping. Several "one-factor-at-a-time" experimental designs followed by a full factorial design made it possible to model the process and to analyze the interactions between factors. The experiments were carried out on a laboratory device which simulates an industrial precipitator with intermittent operation.
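
    As a minimal sketch of the full factorial step, the snippet below builds a two-level design over the three rapping factors (moment, position, force) and computes main effects as differences of level means; the response values are invented, not the authors' measurements.

    ```python
    # Minimal sketch: two-level full factorial design over the three rapping factors
    # and main effects computed as the difference of mean responses between the high
    # and low levels. The response values are made up; the paper measures filtration
    # efficiency / dust re-entrainment experimentally.
    from itertools import product
    import numpy as np

    factors = ["moment", "position", "force"]
    design = np.array(list(product([-1, 1], repeat=3)), dtype=float)   # 8 runs, coded units

    # Hypothetical measured response (e.g. collection efficiency, %), one value per run.
    response = np.array([92.1, 90.4, 93.0, 91.2, 94.8, 93.5, 95.6, 94.1])

    for k, name in enumerate(factors):
        high = response[design[:, k] == 1].mean()
        low = response[design[:, k] == -1].mean()
        print(f"main effect of {name}: {high - low:+.2f}")
    ```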

  16. Optimizing The DSSC Fabrication Process Using Lean Six Sigma

    NASA Astrophysics Data System (ADS)

    Fauss, Brian

    Alternative energy technologies must become more cost effective to achieve grid parity with fossil fuels. Dye sensitized solar cells (DSSCs) are an innovative third generation photovoltaic technology, which is demonstrating tremendous potential to become a revolutionary technology due to recent breakthroughs in fabrication cost. The study here focused on quality improvement measures undertaken to improve the fabrication of DSSCs and enhance process efficiency and effectiveness. Several quality improvement methods were implemented to optimize the seven individual steps of the DSSC fabrication process. Lean Manufacturing's 5S method successfully increased efficiency in all of the processes. Six Sigma's DMAIC methodology was used to identify and eliminate each of the root causes of defects in the critical titanium dioxide deposition process. These optimizations resulted in the following significant improvements in the production process: 1. fabrication time of the DSSCs was reduced by 54%; 2. fabrication procedures were improved to the extent that all critical defects in the process were eliminated; 3. the yield of functioning DSSCs was increased from 17% to 90%.

  17. Process optimization electrospinning fibrous material based on polyhydroxybutyrate

    NASA Astrophysics Data System (ADS)

    Olkhov, A. A.; Tyubaeva, P. M.; Staroverova, O. V.; Mastalygina, E. E.; Popov, A. A.; Ischenko, A. A.; Iordanskii, A. L.

    2016-05-01

    The article analyzes the influence of the main technological parameters of electrostatic spinning on the morphology and properties of ultrathin fibers based on polyhydroxybutyrate (PHB). It is found that the electrical conductivity and viscosity of the spinning solution affect the formation of the fiber macrostructure, so the geometry of PHB-based fiber materials can be controlled by adjusting the viscosity and conductivity of the spinning solution. The resulting fibers have found use in medicine, particularly in structural elements for the musculoskeletal system.

  18. Characterizations of Overtaking Optimality for Controlled Diffusion Processes

    SciTech Connect

    Jasso-Fuentes, Hector Hernandez-Lerma, Onesimo

    2008-06-15

    In this paper we give conditions for (the existence and) several characterizations of overtaking optimal policies for a general class of controlled diffusion processes. Our characterization results are of a lexicographical type; namely, first we identify the class of so-called canonical policies, and then within this class we search for policies with some special feature; for instance, canonical policies that in addition maximize the bias.

  19. On the optimal design of the disassembly and recovery processes

    SciTech Connect

    Xanthopoulos, A.; Iakovou, E.

    2009-05-15

    This paper tackles the problem of the optimal design of the recovery processes of the end-of-life (EOL) electric and electronic products, with a special focus on the disassembly issues. The objective is to recover as much ecological and economic value as possible, and to reduce the overall produced quantities of waste. In this context, a medium-range tactical problem is defined and a novel two-phased algorithm is presented for a remanufacturing-driven reverse supply chain. In the first phase, we propose a multicriteria/goal-programming analysis for the identification and the optimal selection of the most 'desirable' subassemblies and components to be disassembled for recovery, from a set of different types of EOL products. In the second phase, a multi-product, multi-period mixed-integer linear programming (MILP) model is presented, which addresses the optimization of the recovery processes, while taking into account explicitly the lead times of the disassembly and recovery processes. Moreover, a simulation-based solution approach is proposed for capturing the uncertainties in reverse logistics. The overall approach leads to an easy-to-use methodology that could support effectively middle level management decisions. Finally, the applicability of the developed methodology is illustrated by its application on a specific case study.
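
    A minimal sketch of the component-selection idea in the first phase is given below: a tiny 0-1 program chooses which components to disassemble for recovery under a disassembly-time budget. It uses the PuLP package, and all data are illustrative; the paper's multicriteria/goal-programming formulation and multi-period MILP with lead times are not reproduced here.

    ```python
    # Minimal sketch: a 0-1 program that picks which components to disassemble for
    # recovery, maximizing recovered value under a disassembly-time budget. Data are
    # illustrative; the paper uses a goal-programming phase followed by a
    # multi-product, multi-period MILP with lead times.
    from pulp import LpProblem, LpVariable, LpMaximize, lpSum, LpBinary, value

    components = ["PCB", "housing", "battery", "display", "motor"]
    recovery_value = {"PCB": 8.0, "housing": 1.5, "battery": 4.0, "display": 6.0, "motor": 3.0}    # EUR
    disassembly_time = {"PCB": 6.0, "housing": 2.0, "battery": 3.0, "display": 5.0, "motor": 4.0}  # min
    time_budget = 12.0                                                                             # min per unit

    x = {c: LpVariable(f"take_{c}", cat=LpBinary) for c in components}

    model = LpProblem("disassembly_selection", LpMaximize)
    model += lpSum(recovery_value[c] * x[c] for c in components)               # objective
    model += lpSum(disassembly_time[c] * x[c] for c in components) <= time_budget

    model.solve()
    chosen = [c for c in components if x[c].value() == 1]
    print("disassemble:", chosen, "value =", value(model.objective))
    ```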

  20. Atmospheric Pressure Plasma-Electrospin Hybrid Process for Protective Applications

    NASA Astrophysics Data System (ADS)

    Vitchuli Gangadharan, Narendiran

    2011-12-01

    Chemical and biological (C-B) warfare agents such as sarin, sulfur mustard and anthrax are usually dispersed into the atmosphere in the form of micro-aerosols. They are considered dangerous weapons of mass destruction, second only to nuclear weapons. The airtight protective clothing materials currently available are able to stop the diffusion of threat agents but cannot detoxify them, which endangers the wearers. Extensive research efforts are being made to prepare advanced protective clothing materials that not only prevent the diffusion of C-B agents but also detoxify them into harmless products, thus ensuring the safety and comfort of the wearer. Electrospun nanofiber mats are considered to have filtration characteristics effective enough to stop the diffusion of submicron particulates without sacrificing air permeability, and could be used as barrier materials in protective applications. In addition, functional nanofibers could potentially be developed to detoxify C-B warfare threats into harmless products. In this research, electrospun nanofibers were deposited on fabric surfaces to improve barrier efficiency without sacrificing the comfort-related properties of the fabrics. Multi-functional nanofibers were fabricated through an electrospinning-electrospraying hybrid process and their ability to detoxify simulants of C-B agents was evaluated. Nanofibers were also deposited onto a plasma-pretreated woven fabric substrate through a newly developed plasma-electrospinning hybrid process to improve the adhesive properties of the nanofibers on the fabric surface. The nanofiber adhesion and durability properties were evaluated by peel, flex and abrasion resistance tests. In this research work, the following tasks were carried out: i) controlled deposition of a nanofiber mat onto a woven fabric substrate. Electrospun Nylon 6 fiber mats were deposited onto woven 50/50 Nylon/Cotton fabric with the aim of making them into protective material against submicron

  2. Optimization of Ethanol Autothermal Reforming Process with Chemical Equilibrium Calculations

    NASA Astrophysics Data System (ADS)

    Markova, D.; Valters, K.; Bažbauers, G.

    2009-01-01

    The dependence of carbon formation, hydrogen yield and efficiency of the ethanol autothermal reforming process on critical process factors is studied using chemical equilibrium calculations with a process simulation model built in the ChemCAD environment. The studied process factors are the steam-to-carbon ratio S/C, the air-to-fuel ratio λ and the reactor temperature TATR. Since the goal of the reforming process is to achieve the highest possible H2 concentration in the reformate gas while operating the reformer at maximum efficiency, the optimization was carried out using objective functions that include the hydrogen yield and the amount of heat supplied to the process. It was found that the maximum process efficiency, defined as the ratio of the obtained hydrogen energy to the energy supplied to the process, is 0.61 in the studied range of process factors; this value is achieved at λ = 0.1, S/C values of 2.5-3 and reactor temperatures TATR of 680-695 °C. The hydrogen yield under these conditions is 4.41-4.55 mol/mol C2H5OH.
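
    A minimal sketch of the optimization step is shown below: a grid search over λ and S/C with a surrogate efficiency function chosen to peak near the reported optimum. The surrogate replaces the ChemCAD chemical-equilibrium model, so it only illustrates the search, not the thermodynamics.

    ```python
    # Minimal sketch: grid-search the air-to-fuel ratio (lambda) and steam-to-carbon
    # ratio (S/C) for maximum reforming efficiency. The efficiency function below is a
    # stand-in chosen to peak near the values reported in the abstract; the paper
    # evaluates it with ChemCAD chemical-equilibrium calculations.
    import numpy as np

    def efficiency(lam, s_to_c):
        """Hypothetical surrogate: H2 energy out / energy in, peaking near lam=0.1, S/C=2.75."""
        return 0.61 - 1.5 * (lam - 0.1) ** 2 - 0.02 * (s_to_c - 2.75) ** 2

    lams = np.linspace(0.05, 0.4, 71)
    scs = np.linspace(1.5, 4.0, 51)
    grid = np.array([[efficiency(l, s) for s in scs] for l in lams])

    i, j = np.unravel_index(np.argmax(grid), grid.shape)
    print(f"best efficiency ~ {grid[i, j]:.2f} at lambda = {lams[i]:.2f}, S/C = {scs[j]:.2f}")
    ```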

  3. OPTIMIZATION STUDY FOR FILL STEM MANUFACTURING AND PINCH WELD PROCESSING

    SciTech Connect

    Korinko, P; Karl Arnold, K

    2006-09-06

    A statistically designed experiment was conducted as part of a six sigma project for Fill Stem Manufacturing and Pinch Weld Processing. This multi-year/multi-site project has successfully completed a screening study and used those results as inputs to this optimization study. Eleven welds were made using fairly tight current and cycle range. The welds demonstrate increased burst strength, longer closure length, more net displacement, and improved bond rating with increased current. However, excessive melting remains a concern from a processing viewpoint and may cause adverse metallurgical interactions. Therefore, the highest current levels specified cannot be utilized. A Validation Study is proposed for the Defense Programs Inert Facility.

  4. 28nm node process optimization: a lithography centric view

    NASA Astrophysics Data System (ADS)

    Seltmann, Rolf

    2014-10-01

    Many experts claim that the 28nm technology node will remain the most cost-effective technology node. This results primarily from the cost of manufacturing, since 28nm is the last true single patterning (SP) node, and is reinforced by the dramatic increase in design costs and the limited shrink factor of the following nodes. Thus, it is assumed that this technology will stay in production for many years. To be cost competitive, high yields are mandatory. Meanwhile, leading edge foundries have optimized the yield of the 28nm node to such a level that it is nearly exclusively defined by random defectivity. However, it was a long way to reach that level. In my talk I will concentrate on the contribution of lithography to this yield learning curve, using a critical metal patterning application as an example. I will show what was needed to optimize the process window to a level beyond the usual OPC model work that was common in previous nodes. Reducing the process variability (in particular focus variability) is a complementary need. It will be shown which improvements were needed in tooling, process control and design-mask-wafer interaction to remove all systematic yield detractors. Over the last couple of years new scanner platforms were introduced that were targeted at both better productivity and better parametric performance. But this was not a straightforward path: extra efforts by the tool suppliers together with the fab were needed to bring the tool variability down to the necessary level. Another important topic for reducing variability is the interaction of wafer non-planarity and lithography optimization. Having accurate knowledge of within-die topography is essential for optimum patterning. By completing both the variability reduction work and the process window enhancement work, we were able to turn the original marginal process budget into a robust positive budget, thus ensuring high yield and low costs.

  5. Plasma sprayed manganese-cobalt spinel coatings: Process sensitivity on phase, electrical and protective performance

    NASA Astrophysics Data System (ADS)

    Han, Su Jung; Pala, Zdenek; Sampath, Sanjay

    2016-02-01

    Manganese cobalt spinel (Mn1.5Co1.5O4, MCO) coatings are prepared by the air plasma spray (APS) process to examine their efficacy in serving as protective coatings from Cr-poisoning of the cathode side in intermediate temperature-solid oxide fuel cells (IT-SOFCs). These complex oxides are susceptible to process induced stoichiometric and phase changes which affect their functional performance. To critically examine these effects, MCO coatings are produced with deliberate modifications to the spray process parameters to explore relationship among process conditions, microstructure and functional properties. The resultant interplay among particle thermal and kinetic energies are captured through process maps, which serve to characterize the parametric effects on properties. The results show significant changes to the chemistry and phase composition of the deposited material resulting from preferential evaporation of oxygen. Post deposition annealing recovers oxygen in the coatings and allows partial recovery of the spinel phase, which is confirmed through thermo-gravimetric analysis (TGA)/differential scanning calorimetry (DSC), X-ray Diffraction (XRD), and magnetic hysteresis measurements. In addition, coatings with high density after sintering show excellent electrical conductivity of 40 S cm-1 at 800 °C while simultaneously providing requisite protection characteristics against Cr-poisoning. This study provides a framework for optimal evaluation of MCO coatings in intermediate temperature SOFCs.

  6. Parallel particle swarm optimization on a graphics processing unit with application to trajectory optimization

    NASA Astrophysics Data System (ADS)

    Wu, Q.; Xiong, F.; Wang, F.; Xiong, Y.

    2016-10-01

    In order to reduce the computational time, a fully parallel implementation of the particle swarm optimization (PSO) algorithm on a graphics processing unit (GPU) is presented. Instead of being executed on the central processing unit (CPU) sequentially, PSO is executed in parallel via the GPU on the compute unified device architecture (CUDA) platform. The processes of fitness evaluation, updating of velocity and position of all particles are all parallelized and introduced in detail. Comparative studies on the optimization of four benchmark functions and a trajectory optimization problem are conducted by running PSO on the GPU (GPU-PSO) and CPU (CPU-PSO). The impact of design dimension, number of particles and size of the thread-block in the GPU and their interactions on the computational time is investigated. The results show that the computational time of the developed GPU-PSO is much shorter than that of CPU-PSO, with comparable accuracy, which demonstrates the remarkable speed-up capability of GPU-PSO.
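
    A minimal sketch of the parallel formulation is given below in NumPy: the velocity, position and fitness updates are written as whole-array operations, which is the same structure the paper maps onto CUDA threads. The benchmark (sphere function) and the PSO constants are illustrative.

    ```python
    # Minimal sketch of the parallel idea: all particle updates (velocity, position,
    # fitness) are expressed as whole-array operations. The paper maps these same
    # steps to GPU threads; here NumPy vectorization stands in for that parallelism.
    import numpy as np

    def sphere(x):                      # fitness evaluated for all particles at once
        return np.sum(x ** 2, axis=1)

    rng = np.random.default_rng(0)
    n_particles, dim = 256, 10
    w, c1, c2 = 0.72, 1.49, 1.49        # inertia and acceleration coefficients

    x = rng.uniform(-5.0, 5.0, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), sphere(x)
    gbest = pbest[np.argmin(pbest_f)]

    for it in range(200):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        f = sphere(x)
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[np.argmin(pbest_f)]

    print(f"best fitness after 200 iterations: {pbest_f.min():.3e}")
    ```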

  7. Protective emotional regulation processes towards adjustment in infertile patients.

    PubMed

    Pinto-Gouveia, José; Galhardo, Ana; Cunha, Marina; Matos, Marcela

    2012-03-01

    Little is known about the emotional regulation processes of psychological flexibility/acceptance, self-compassion, and coping styles in infertility and the way they may exert a protective function against depression. The aim of the current study was to explore how these emotion regulation processes are related to depression and to the sense of self-efficacy in dealing with infertility in infertile patients. Gender differences were also considered. One hundred couples without known fertility problems and 100 couples with an infertility diagnosis completed the instruments: Beck Depression Inventory, Coping Styles Questionnaire, Acceptance and Action Questionnaire, Self-Compassion Scale and Infertility Self-efficacy Scale. Infertile couples presented statistically significantly higher scores on depression and lower scores on psychological flexibility/acceptance and self-compassion than the control group. This pattern was particularly evident in women, who also tended to make less use of an emotional/detached coping style and to perceive themselves as less confident in dealing with infertility than men. Multiple regression analysis showed that psychological flexibility/acceptance was a significant predictor of depressive symptoms in men and women with infertility. Emotional regulation processes, such as psychological flexibility/acceptance and self-compassion, seem to be relevant to the understanding of depressive symptoms and psychological adjustment to infertility, suggesting that these issues should be addressed in a therapeutic context with these couples.

  8. Optimization of chemical etching process in niobium cavities

    SciTech Connect

    Tajima, T.; Trabia, M.; Culbreth, W.; Subramanian, S.

    2004-01-01

    Superconducting niobium cavities are important components of linear accelerators. Buffered chemical polishing (BCP) on the inner surface of the cavity is a standard procedure to improve its performance. The quality of BCP, however, has not been optimized well in terms of the uniformity of surface smoothness. A finite element computational fluid dynamics (CFD) model was developed to simulate the chemical etching process inside the cavity. The analysis confirmed the observation of other researchers that the iris section of the cavity received more etching than the equator regions due to higher flow rate. The baffle, which directs flow towards the walls of the cavity, was redesigned using optimization techniques. The redesigned baffle significantly improves the performance of the etching process. To verify these results an experimental setup for flow visualization was created. The setup consists of a high speed, high resolution CCD camera. The camera is positioned by a computer-controlled traversing mechanism. A dye injecting arrangement is used for tracking the fluid path. Experimental results are in general agreement with CFD and optimization results.

  9. Optimism

    PubMed Central

    Carver, Charles S.; Scheier, Michael F.; Segerstrom, Suzanne C.

    2010-01-01

    Optimism is an individual difference variable that reflects the extent to which people hold generalized favorable expectancies for their future. Higher levels of optimism have been related prospectively to better subjective well-being in times of adversity or difficulty (i.e., controlling for previous well-being). Consistent with such findings, optimism has been linked to higher levels of engagement coping and lower levels of avoidance, or disengagement, coping. There is evidence that optimism is associated with taking proactive steps to protect one's health, whereas pessimism is associated with health-damaging behaviors. Consistent with such findings, optimism is also related to indicators of better physical health. The energetic, task-focused approach that optimists take to goals also relates to benefits in the socioeconomic world. Some evidence suggests that optimism relates to more persistence in educational efforts and to higher later income. Optimists also appear to fare better than pessimists in relationships. Although there are instances in which optimism fails to convey an advantage, and instances in which it may convey a disadvantage, those instances are relatively rare. In sum, the behavioral patterns of optimists appear to provide models of living for others to learn from. PMID:20170998

  10. Roll levelling semi-analytical model for process optimization

    NASA Astrophysics Data System (ADS)

    Silvestre, E.; Garcia, D.; Galdos, L.; Saenz de Argandoña, E.; Mendiguren, J.

    2016-08-01

    Roll levelling is a primary manufacturing process used to remove residual stresses and imperfections from metal strips in order to make them suitable for subsequent forming operations. In recent years the importance of this process has become evident with the emergence of Ultra High Strength Steels with strengths > 900 MPa. Optimal machine settings as well as a robust machine design have become critical for the correct processing of these materials. Finite Element Method (FEM) analysis is the technique widely used for both aspects. In this case, however, FEM simulation times exceed what is admissible for both machine development and process optimization. In the present work, a semi-analytical model based on a discrete bending theory is presented. This model is able to calculate the critical levelling parameters, i.e. force, plastification rate and residual stresses, in a few seconds. First the semi-analytical model is presented. Next, several industrial cases are analyzed with both the semi-analytical model and the conventional FEM model. Finally, the results and computation times of both methods are compared.

  11. Process Parameters Optimization in Single Point Incremental Forming

    NASA Astrophysics Data System (ADS)

    Gulati, Vishal; Aryal, Ashmin; Katyal, Puneet; Goswami, Amitesh

    2016-04-01

    This work aims to optimize the formability and surface roughness of parts formed by the single-point incremental forming process for an Aluminium-6063 alloy. The tests are based on Taguchi's L18 orthogonal array, selected on the basis of the DOF. The tests were carried out on a vertical machining center (DMC70V) using CAD/CAM software (SolidWorks V5/MasterCAM). Two levels of tool radius and three levels of sheet thickness, step size, tool rotational speed, feed rate and lubrication have been considered as the input process parameters. Wall angle and surface roughness were considered as the process responses. The influential process parameters for formability and surface roughness were identified with the help of statistical tools (response tables, main effect plots and ANOVA). The parameter with the greatest influence on both formability and surface roughness is lubrication. For formability, the influence in descending order is lubrication, tool rotational speed, feed rate, sheet thickness, step size and tool radius, whereas for surface roughness it is lubrication, feed rate, step size, tool radius, sheet thickness and tool rotational speed. The predicted optimal values for the wall angle and surface roughness are 88.29° and 1.03225 µm. The confirmation experiments were conducted three times, and the wall angle and surface roughness were found to be 85.76° and 1.15 µm, respectively.

  12. Graphics Processing Units and High-Dimensional Optimization

    PubMed Central

    Zhou, Hua; Lange, Kenneth; Suchard, Marc A.

    2011-01-01

    This paper discusses the potential of graphics processing units (GPUs) in high-dimensional optimization problems. A single GPU card with hundreds of arithmetic cores can be inserted in a personal computer and dramatically accelerates many statistical algorithms. To exploit these devices fully, optimization algorithms should reduce to multiple parallel tasks, each accessing a limited amount of data. These criteria favor EM and MM algorithms that separate parameters and data. To a lesser extent block relaxation and coordinate descent and ascent also qualify. We demonstrate the utility of GPUs in nonnegative matrix factorization, PET image reconstruction, and multidimensional scaling. Speedups of 100 fold can easily be attained. Over the next decade, GPUs will fundamentally alter the landscape of computational statistics. It is time for more statisticians to get on-board. PMID:21847315
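
    One of the applications named above, nonnegative matrix factorization, reduces to elementwise and matrix products that parallelize well. The sketch below shows Lee-Seung multiplicative updates on random data in NumPy (CPU); it illustrates why the algorithm is GPU-friendly but is not the authors' GPU implementation.

    ```python
    # Minimal sketch: Lee-Seung multiplicative updates for nonnegative matrix
    # factorization. Every update is an elementwise or matrix product, which is why
    # the algorithm parallelizes well on GPUs; this NumPy version runs on the CPU.
    import numpy as np

    rng = np.random.default_rng(0)
    V = rng.random((200, 100))            # data matrix to factor, V ~ W @ H
    r = 10                                # inner rank
    W = rng.random((200, r))
    H = rng.random((r, 100))
    eps = 1e-9                            # guards against division by zero

    for it in range(200):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)

    err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
    print(f"relative reconstruction error: {err:.3f}")
    ```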

  13. Optimal and adaptive methods of processing hydroacoustic signals (review)

    NASA Astrophysics Data System (ADS)

    Malyshkin, G. S.; Sidel'nikov, G. B.

    2014-09-01

    Different methods of optimal and adaptive processing of hydroacoustic signals for multipath propagation and scattering are considered. Advantages and drawbacks of the classical adaptive (Capon, MUSIC, and Johnson) algorithms and "fast" projection algorithms are analyzed for the case of multipath propagation and scattering of strong signals. The classical optimal approaches to detecting multipath signals are presented. A mechanism of controlled normalization of strong signals is proposed to automatically detect weak signals. The results of simulating the operation of different detection algorithms for a linear equidistant array under multipath propagation and scattering are presented. An automatic detector is analyzed, which is based on classical or fast projection algorithms, which estimates the background proceeding from median filtering or the method of bilateral spatial contrast.
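
    As a minimal illustration of the classical Capon (MVDR) estimator named above, the sketch below computes the spatial spectrum of a linear equidistant array for two simulated plane waves in noise; the scenario and array parameters are invented and do not reflect the authors' multipath scenarios or "fast" projection algorithms.

    ```python
    # Minimal sketch of the Capon (MVDR) spatial spectrum for a linear equidistant
    # array: two simulated plane waves in white noise, half-wavelength spacing.
    import numpy as np

    rng = np.random.default_rng(0)
    n_elem, n_snap = 16, 500
    d_over_lambda = 0.5
    true_doas = np.deg2rad([-20.0, 25.0])

    def steering(theta):
        n = np.arange(n_elem)
        return np.exp(-2j * np.pi * d_over_lambda * n * np.sin(theta))

    # Simulated array snapshots: two unit-power sources plus white noise.
    A = np.column_stack([steering(t) for t in true_doas])
    S = (rng.standard_normal((2, n_snap)) + 1j * rng.standard_normal((2, n_snap))) / np.sqrt(2)
    noise = 0.1 * (rng.standard_normal((n_elem, n_snap)) + 1j * rng.standard_normal((n_elem, n_snap)))
    X = A @ S + noise

    R = X @ X.conj().T / n_snap                       # sample covariance
    R_inv = np.linalg.inv(R)

    scan = np.deg2rad(np.linspace(-90, 90, 361))
    p = np.array([1.0 / np.real(steering(t).conj() @ R_inv @ steering(t)) for t in scan])

    # Report the two strongest local maxima of the spectrum as DOA estimates.
    is_peak = (p[1:-1] > p[:-2]) & (p[1:-1] > p[2:])
    peak_angles = np.rad2deg(scan[1:-1][is_peak])
    top2 = peak_angles[np.argsort(p[1:-1][is_peak])[-2:]]
    print("estimated DOAs (deg):", np.sort(top2))
    ```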

  14. Thickness optimization for lithography process on silicon substrate

    NASA Astrophysics Data System (ADS)

    Su, Xiaojing; Su, Yajuan; Liu, Yansong; Chen, Fong; Liu, Zhimin; Zhang, Wei; Li, Bifeng; Gao, Tao; Wei, Yayi

    2015-03-01

    With the development of lithography, the requirements on critical dimension (CD) and CD uniformity (CDU) have reached a level that is harder and harder to achieve. During lithography exposure, reflection occurs at the interface between the photoresist and the substrate, and this reflection has a negative impact on CD and CDU control. It is possible to optimize the litho stack and film stack thicknesses for different lithography conditions. With the optimized stack, the total reflectivity over all incident angles at the interface can be kept below 0.5%, ideally 0.1%, which in most cases enlarges the process window (PW). The theoretical results are verified by experimental results from the foundry, which ultimately helps the foundry achieve mass production.

  15. Low cost lift-off process optimization for MEMS applications

    NASA Astrophysics Data System (ADS)

    Pandey, Shilpi; Bansal, Deepak; Panwar, Deepak; Shukla, Neha; Kumar, Arvind; Kothari, Prateek; Verma, Seema; Rangra, K. J.

    2016-04-01

    The patterning of thin films plays a major role in the performance of MEMS devices. Wet etching gives an isotropic profile, and the etch rate depends on the temperature, the size of the microstructures and repeated use of the solution. Even with selective etchants, wet etching significantly attacks the underlying layer. Dry etching, on the other hand, is an expensive process. In this paper, a double layer of photoresist is optimized for the lift-off process. The double-layer lift-off technique offers process simplicity and low cost compared with conventional single-layer lift-off or bilayer lift-off with LOR, and the problems of retention and flagging are resolved. The thickness of the double-coat photoresist is 2.3 times that of a single coat.

  16. Optimizing Digital Health Informatics Interventions Through Unobtrusive Quantitative Process Evaluations.

    PubMed

    Gude, Wouter T; van der Veer, Sabine N; de Keizer, Nicolette F; Coiera, Enrico; Peek, Niels

    2016-01-01

    Health informatics interventions such as clinical decision support (CDS) and audit and feedback (A&F) are variably effective at improving care because the underlying mechanisms through which these interventions bring about change are poorly understood. This limits our possibilities to design better interventions. Process evaluations can be used to improve this understanding by assessing fidelity and quality of implementation, clarifying causal mechanisms, and identifying contextual factors associated with variation in outcomes. Coiera describes the intervention process as a series of stages extending from interactions to outcomes: the "information value chain". However, past process evaluations often did not assess the relationships between those stages. In this paper we argue that the chain can be measured quantitatively and unobtrusively in digital interventions thanks to the availability of electronic data that are a by-product of their use. This provides novel possibilities to study the mechanisms of informatics interventions in detail and inform essential design choices to optimize their efficacy. PMID:27577453

  17. An Optimizing Algorithm for Automating Lifecycle Assembly Processes

    SciTech Connect

    Brown, R.G.; Calton, T.L.

    1998-12-09

    Designing products for assembly and disassembly during their entire lifecycle, for purposes including service, field repair, upgrade, and disposal, is a process that involves many disciplines. In addition, finding the best solution often involves considering the design as a whole and its intended lifecycle. Different goals and constraints (compared to initial assembly) require us to revisit the fundamental assumptions and methods that underlie current assembly planning techniques. Previous work in this area has been limited to either academic studies of assembly planning or applied studies of lifecycle assembly processes, which give no attention to automatic planning. It is believed that merging these two areas will result in a much greater ability to design for, analyze, and optimize the disassembly and assembly processes.

  18. US soybean processing industry: optimal size, number and location

    SciTech Connect

    D'Souza, G.E.; Phillips, T.D.; Free, W.J.

    1986-01-01

    The US dominates world soybean production and trade. The soybean industry is confronted with organizing facilities to minimize the costs of assembly, processing, and distribution of the final products. A transshipment model was constructed to minimize the combined costs of assembling and processing soybeans and distributing the co-products - meal and oil - to demand centers. Four solutions are presented, one each for the years 1977, 1981, 1990, and 2000. Trends in supply and utilization data indicate that export growth would be such that future export demand could be fully satisfied only if some degree of domestic use of soybeans for meal and oil were sacrificed. Trucks were the dominant mode of domestic soybean and soybean meal shipments, while rail dominated soybean oil shipments. When regional processing constraints are eliminated, substantial savings in total costs result, indicating that significant future cost reductions are possible on a national basis if processing transshipment points can be more optimally sized and located.

  19. Simulative design and process optimization of the two-stage stretch-blow molding process

    SciTech Connect

    Hopmann, Ch.; Rasche, S.; Windeck, C.

    2015-05-22

    The total production costs of PET bottles are significantly affected by the cost of raw material: approximately 70% of the total costs are spent on raw material. Therefore, the stretch-blow molding industry aims to reduce total production costs through optimized material efficiency. However, there is often a trade-off between optimized material efficiency and the required product properties. Due to a multitude of complex boundary conditions, the design process for new stretch-blow molded products is still a challenging task and is often based on empirical knowledge. Application of current CAE tools supports the design process by reducing development time and costs. This paper describes an approach to determine an optimized preform geometry and corresponding process parameters iteratively. The wall thickness distribution and the local stretch ratios of the blown bottle are calculated in a three-dimensional process simulation. The wall thickness distribution is correlated with an objective function, and the preform geometry as well as the process parameters are varied by an optimization algorithm. Taking into account the correlation between material usage, process history and resulting product properties, integrative coupled simulation steps, e.g. structural analyses or barrier simulations, are performed. The approach is applied to a 0.5 liter PET bottle of Krones AG, Neutraubling, Germany. The investigations show that the design process can be supported by applying this simulative optimization approach. In an optimization study the total bottle weight is reduced from 18.5 g to 15.5 g. The validation of the computed results is in progress.

  20. Computational techniques for design optimization of thermal protective systems for the space shuttle vehicle. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    1971-01-01

    A modular program for design optimization of thermal protection systems is discussed. Its capabilities and limitations are reviewed. Instructions for the operation of the program, output, and the program itself are given.

  1. The Enterprise Derivative Application: Flexible Software for Optimizing Manufacturing Processes

    SciTech Connect

    Ward, Richard C; Allgood, Glenn O; Knox, John R

    2008-11-01

    The Enterprise Derivative Application (EDA) implements the enterprise-derivative analysis for optimization of an industrial process (Allgood and Manges, 2001). It is a tool to help industry planners choose the most productive way of manufacturing their products while minimizing their cost. Developed in MS Access, the application allows users to input initial data ranging from raw material to variable costs and enables the tracking of specific information as material is passed from one process to another. Energy-derivative analysis is based on calculation of sensitivity parameters. For the specific application to a steel production process these include: the cost to product sensitivity, the product to energy sensitivity, the energy to efficiency sensitivity, and the efficiency to cost sensitivity. Using the EDA, for all processes the user can display a particular sensitivity or all sensitivities can be compared for all processes. Although energy-derivative analysis was originally designed for use by the steel industry, it is flexible enough to be applied to many other industrial processes. Examples of processes where energy-derivative analysis would prove useful are wireless monitoring of processes in the petroleum cracking industry and wireless monitoring of motor failure for determining the optimum time to replace motor parts. One advantage of the MS Access-based application is its flexibility in defining the process flow and establishing the relationships between parent and child process and products resulting from a process. Due to the general design of the program, a process can be anything that occurs over time with resulting output (products). So the application can be easily modified to many different industrial and organizational environments. Another advantage is the flexibility of defining sensitivity parameters. Sensitivities can be determined between all possible variables in the process flow as a function of time. Thus the dynamic development of the

  2. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2005-07-01

    The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of the Task 2 is to incorporate UTCHEM reservoir simulator and the modules with the strategic variables and developing the response surface maps to identify the significant variables from each module. The objective of the Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and interface the economic model with UTCHEM production output. Task 4 is on the validation of the framework and performing simulations of oil reservoirs to screen, design and optimize the chemical processes.

  3. Ultrasound assisted manufacturing of paraffin wax nanoemulsions: process optimization.

    PubMed

    Jadhav, A J; Holkar, C R; Karekar, S E; Pinjari, D V; Pandit, A B

    2015-03-01

    This work reports on the process optimization of ultrasound-assisted, paraffin wax in water nanoemulsions stabilized by modified sodium dodecyl sulfate (SDS). It focuses on the optimization of the major emulsification process variables, including sonication time, applied power and surfactant concentration. The effects of these variables were investigated on the basis of the mean droplet diameter and the stability of the prepared emulsion. It was found that a stable emulsion with droplet diameters of about 160.9 nm could be formed with a surfactant concentration of 10 mg/ml, treated at 40% of applied power (power density: 0.61 W/ml) for 15 min. Scanning electron microscopy (SEM) was used to study the morphology of the emulsion droplets. The droplets were solid at room temperature, showing bright spots under polarized light and a spherical shape under SEM. The electrophoretic properties of the emulsion droplets showed a negative zeta potential due to the adsorption of the head sulfate groups of the SDS surfactant. For comparison, a paraffin wax emulsion was prepared via the emulsion inversion point method and its intrinsic stability was checked. Visually, this emulsion separated/creamed within 30 min, while the ultrasonically prepared emulsion remained stable for more than 3 months. From this study, it was found that the ultrasound-assisted emulsification process can be successfully used for the preparation of stable paraffin wax nanoemulsions. PMID:25465097

  4. A Framework to Design and Optimize Chemical Flooding Processes

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope Kamy Sepehrnoori

    2006-08-31

    The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of the Task 2 is to incorporate UTCHEM reservoir simulator and the modules with the strategic variables and developing the response surface maps to identify the significant variables from each module. The objective of the Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and interface the economic model with UTCHEM production output. Task 4 is on the validation of the framework and performing simulations of oil reservoirs to screen, design and optimize the chemical processes.

  6. Wireless image transmission using turbo codes and optimal unequal error protection.

    PubMed

    Thomos, Nikolaos; Boulgouris, Nikolaos V; Strintzis, Michael G

    2005-11-01

    A novel image transmission scheme is proposed for the communication of set partitioning in hierarchical trees (SPIHT) image streams over wireless channels. The proposed scheme employs turbo codes and Reed-Solomon codes in order to deal effectively with burst errors. An algorithm for the optimal unequal error protection of the compressed bitstream is also proposed and applied in conjunction with an inherently more efficient technique for product code decoding. The resulting scheme is tested for the transmission of images over wireless channels. Experimental evaluation clearly demonstrates the superiority of the proposed transmission system in comparison to well-known robust coding schemes.

  7. Optimizing the lithography model calibration algorithms for NTD process

    NASA Astrophysics Data System (ADS)

    Hu, C. M.; Lo, Fred; Yang, Elvis; Yang, T. H.; Chen, K. C.

    2016-03-01

    As patterns shrink to the resolution limits of up-to-date ArF immersion lithography technology, the negative tone development (NTD) process has been an increasingly adopted technique to obtain superior imaging quality by employing bright-field (BF) masks to print the critical dark-field (DF) metal and contact layers. However, from the fundamental materials and process interaction perspectives, several key differences inherently exist between the NTD process and the traditional positive tone development (PTD) system, especially the horizontal/vertical resist shrinkage and developer depletion effects; hence the traditional resist parameters developed for the typical PTD process no longer fit well in NTD process modeling. In order to cope with the inherent differences between PTD and NTD processes and thereby improve NTD modeling accuracy, several NTD models with different combinations of complementary terms were built to account for the NTD-specific resist shrinkage, developer depletion and diffusion, and the wafer CD jump induced by sub-resolution assist feature (SRAF) effects. Each new complementary NTD term is aimed at a specific NTD phenomenon. In this study, the modeling accuracy is compared among different models for the specific patterning characteristics of various feature types. Multiple complementary NTD terms were finally proposed to address all the NTD-specific behaviors simultaneously and further optimize the NTD modeling accuracy. The new algorithm with multiple complementary NTD terms, tested on our critical dark-field layers, demonstrates consistent model accuracy improvement for both calibration and verification.

  8. Source-optimized irregular repeat accumulate codes with inherent unequal error protection capabilities and their application to scalable image transmission.

    PubMed

    Lan, Ching-Fu; Xiong, Zixiang; Narayanan, Krishna R

    2006-07-01

    The common practice for achieving unequal error protection (UEP) in scalable multimedia communication systems is to design rate-compatible punctured channel codes before computing the UEP rate assignments. This paper proposes a new approach to designing powerful irregular repeat accumulate (IRA) codes that are optimized for the multimedia source and to exploiting the inherent irregularity in IRA codes for UEP. Using the end-to-end distortion due to the first error bit in channel decoding as the cost function, which is readily given by the operational distortion-rate function of embedded source codes, we incorporate this cost function into the channel code design process via density evolution and obtain IRA codes that minimize the average cost function instead of the usual probability of error. Because the resulting IRA codes have inherent UEP capabilities due to irregularity, the new IRA code design effectively integrates channel code optimization and UEP rate assignments, resulting in source-optimized channel coding or joint source-channel coding. We simulate our source-optimized IRA codes for transporting SPIHT-coded images over a binary symmetric channel with crossover probability p. When p = 0.03 and the channel code length is long (e.g., with one codeword for the whole 512 x 512 image), we are able to operate at only 9.38% away from the channel capacity with code length 132380 bits, achieving the best published results in terms of average peak signal-to-noise ratio (PSNR). Compared to conventional IRA code design (that minimizes the probability of error) with the same code rate, the performance gain in average PSNR from using our proposed source-optimized IRA code design is 0.8759 dB when p = 0.1 and the code length is 12800 bits. As predicted by Shannon's separation principle, we observe that this performance gain diminishes as the code length increases. PMID:16830898

  9. An Improved Ant Colony Optimization Approach for Optimization of Process Planning

    PubMed Central

    Wang, JinFeng; Fan, XiaoLiang; Ding, Haimin

    2014-01-01

    Computer-aided process planning (CAPP) is an important interface between computer-aided design (CAD) and computer-aided manufacturing (CAM) in computer-integrated manufacturing environments (CIMs). In this paper, process planning problem is described based on a weighted graph, and an ant colony optimization (ACO) approach is improved to deal with it effectively. The weighted graph consists of nodes, directed arcs, and undirected arcs, which denote operations, precedence constraints among operation, and the possible visited path among operations, respectively. Ant colony goes through the necessary nodes on the graph to achieve the optimal solution with the objective of minimizing total production costs (TPCs). A pheromone updating strategy proposed in this paper is incorporated in the standard ACO, which includes Global Update Rule and Local Update Rule. A simple method by controlling the repeated number of the same process plans is designed to avoid the local convergence. A case has been carried out to study the influence of various parameters of ACO on the system performance. Extensive comparative experiments have been carried out to validate the feasibility and efficiency of the proposed approach. PMID:25097874
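
    A minimal ACO sketch in the spirit described above is given below: ants sequence operations under precedence constraints, choosing the next feasible operation with probability proportional to pheromone^α·(1/cost)^β, and the best sequence reinforces its arcs. The cost matrix, constraints and update rule are simplified stand-ins for the paper's TPC model and its Global/Local Update Rules.

    ```python
    # Minimal ACO sketch for operation sequencing under precedence constraints.
    # Costs and constraints are made up; the paper's TPC model, pheromone update
    # strategy and convergence control are not reproduced here.
    import numpy as np

    rng = np.random.default_rng(0)
    n_ops = 8
    cost = rng.uniform(1.0, 10.0, (n_ops, n_ops))            # changeover cost op_i -> op_j
    precedence = [(0, 3), (1, 4), (2, 5), (3, 6)]            # (a, b): a must come before b
    tau = np.ones((n_ops, n_ops))                            # pheromone matrix
    alpha, beta, rho, n_ants, n_iter = 1.0, 2.0, 0.1, 20, 60

    def feasible(visited, remaining):
        return [j for j in remaining if all(a in visited for a, b in precedence if b == j)]

    def tour_cost(seq):
        return sum(cost[seq[k], seq[k + 1]] for k in range(len(seq) - 1))

    best_seq, best_cost = None, np.inf
    for it in range(n_iter):
        for ant in range(n_ants):
            seq, remaining = [], set(range(n_ops))
            while remaining:
                cand = feasible(set(seq), remaining)
                if not seq:
                    nxt = cand[rng.integers(len(cand))]      # random feasible start
                else:
                    w = np.array([tau[seq[-1], j] ** alpha * (1.0 / cost[seq[-1], j]) ** beta
                                  for j in cand])
                    nxt = cand[rng.choice(len(cand), p=w / w.sum())]
                seq.append(nxt)
                remaining.remove(nxt)
            c = tour_cost(seq)
            if c < best_cost:
                best_seq, best_cost = seq, c
        tau *= (1.0 - rho)                                   # evaporation
        for k in range(len(best_seq) - 1):                   # reinforce best-so-far sequence
            tau[best_seq[k], best_seq[k + 1]] += 1.0 / best_cost

    print("best sequence:", best_seq, "cost:", round(best_cost, 2))
    ```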

  10. Numerical Tool Path Optimization for Conventional Sheet Metal Spinning Processes

    NASA Astrophysics Data System (ADS)

    Rentsch, Benedikt; Manopulo, Niko; Hora, Pavel

    2016-08-01

    To this day, conventional sheet metal spinning processes are designed with a very low degree of automation. They are usually executed by experienced personnel, who actively adjust the tool paths during production. The practically unlimited freedom in designing the tool paths enables the efficient manufacturing of complex geometries on one hand, but is challenging to translate into a standardized procedure on the other. The present study aims to propose a systematic methodology, based on a 3D FEM model combined with a numerical optimization strategy, in order to design tool paths. The accurate numerical modelling of the spinning process is firstly discussed, followed by an analysis of appropriate objective functions and constraints required to obtain a failure free tool path design.

  11. Integration of Advanced Simulation and Visualization for Manufacturing Process Optimization

    NASA Astrophysics Data System (ADS)

    Zhou, Chenn; Wang, Jichao; Tang, Guangwu; Moreland, John; Fu, Dong; Wu, Bin

    2016-05-01

    The integration of simulation and visualization can provide a cost-effective tool for process optimization, design, scale-up and troubleshooting. The Center for Innovation through Visualization and Simulation (CIVS) at Purdue University Northwest has developed methodologies for such integration with applications in various manufacturing processes. The methodologies have proven useful for virtual design and virtual training, providing solutions to issues of energy, environment, productivity, safety, and quality in steel and other industries. In collaboration with its industrial partners, CIVS has provided solutions to companies, saving over US$38 million. CIVS is currently working with the steel industry to establish an industry-led Steel Manufacturing Simulation and Visualization Consortium through the support of a National Institute of Standards and Technology AMTech Planning Grant. The consortium focuses on supporting the development and implementation of simulation and visualization technologies to advance steel manufacturing across the value chain.

  12. Optimization of drying process parameters for cauliflower drying.

    PubMed

    Gupta, Manoj Kumar; Sehgal, V K; Arora, Sadhna

    2013-02-01

    The different sizes (3, 4 and 5 cm) of a hybrid variety of cauliflower (variety no. 71) were dehydrated in thin layers at three temperatures of 55, 60 and 65 °C with air velocities of 40, 50 and 60 m/min. Dehydrated samples were analyzed for vitamin C, rehydration ratio and browning. Statistical analysis indicated that drying time was dependent on the initial size of the cauliflower, drying air temperature and air velocity, whereas the rehydration ratio was significantly affected by the combined effect of temperature and airflow velocity. The vitamin C content of the dried cauliflower samples was significantly affected by temperature only, and non-enzymatic browning was a function of temperature, airflow velocity, and the combined effect of temperature and airflow velocity. Optimization of the drying process parameters for the given constraints resulted in 60.10 °C, 59.28 m/min and 3.35 cm. The predicted responses for this optimized combination of process parameters were a drying time of 491.22 min, a vitamin C content of 289.86 mg/100 g, a rehydration ratio of 6.91, and a browning value of 0.14, with a desirability factor of 0.787. PMID:24425888
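
    The single desirability factor (0.787) reported above combines the four responses. A common way to do this, shown here for context and not necessarily the exact form used by the authors, is the Derringer-Suich composite desirability

        \[
        d_i=\begin{cases}
        \left(\dfrac{y_i-L_i}{T_i-L_i}\right)^{w_i}, & \text{response to be maximized},\\[6pt]
        \left(\dfrac{U_i-y_i}{U_i-T_i}\right)^{w_i}, & \text{response to be minimized},
        \end{cases}
        \qquad
        D=\Big(\prod_{i=1}^{n} d_i\Big)^{1/n},
        \]

    where each d_i is clipped to [0, 1], L_i, T_i and U_i are the lower limit, target and upper limit of response i, and D is the overall desirability maximized over temperature, air velocity and sample size.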

  13. Analysis and optimization of coagulation and flocculation process

    NASA Astrophysics Data System (ADS)

    Saritha, V.; Srinivas, N.; Srikanth Vuppala, N. V.

    2015-02-01

    Natural coagulants have been the focus of research by many investigators over the last decade owing to the problems caused by chemical coagulants. Optimization of process parameters is vital for the effectiveness of the coagulation process. In the present study, the optimization of parameters such as pH, coagulant dose and mixing speed was studied using the natural coagulants sago and chitin in comparison with alum. A jar test apparatus was used to perform the coagulation. The results showed that turbidity removal was up to 99 % for both alum and chitin at lower coagulant doses, i.e., 0.1-0.3 g/L, whereas sago showed a reduction of 70-100 % at doses of 0.1 and 0.2 g/L. The optimum pH values observed for sago were 6 and 7, whereas chitin was stable over all pH ranges; the optimum conditions were lower coagulant doses, i.e., 0.1-0.3 g/L, and a mixing regime of rapid mixing at 100 rpm for 10 min followed by slow mixing at 20 rpm for 20 min. Hence, it can be concluded that sago and chitin can be used for treating water even with large seasonal variations in turbidity.

  14. Development of Processing Techniques for Advanced Thermal Protection Materials

    NASA Technical Reports Server (NTRS)

    Selvaduray, Guna; Cox, Michael; Srinivasan, Vijayakumar

    1997-01-01

    The Thermal Protection Materials Branch (TPMB) has been involved in various research programs to improve the properties and structural integrity of existing aerospace high temperature materials. Specimens from various research programs were brought into the analytical laboratory for the purpose of obtaining and refining material characterization. The analytical laboratory in TPMB has many different instruments which were utilized to determine the physical and chemical characteristics of materials. Some of the instruments utilized by the SJSU students are: Scanning Electron Microscopy (SEM), Energy Dispersive X-ray analysis (EDX), X-ray Diffraction (XRD), Fourier Transform Infrared Spectroscopy (FTIR), Ultraviolet/Visible Spectroscopy (UV/VIS), Particle Size Analysis (PSA), and Inductively Coupled Plasma Atomic Emission Spectrometry (ICP-AES). These analytical instruments were utilized in the material characterization of specimens from research programs such as aerogel ceramics (I) and (II), X-33 blankets, ARC-Jet specimens, QUICFIX specimens, and gas permeability of lightweight ceramic ablators. In addition to the analytical instruments, there are several ongoing experiments in the TPMB laboratory; one in particular allows the measurement of the permeability of ceramic ablators, from which physical characteristics of the ablators can be derived.

  15. Process for protecting bonded components from plating shorts

    DOEpatents

    Tarte, Lisa A.; Bonde, Wayne L.; Carey, Paul G.; Contolini, Robert J.; McCarthy, Anthony M.

    2000-01-01

    A method which protects the region between a component and the substrate onto which the component is bonded using an electrically insulating fillet of photoresist. The fillet protects this region from subsequent metal plating, thereby preventing shorts between the plated conductors which run down the sides of the component and onto the substrate.

  16. Strategies for optimal operation of the tellurium electrowinning process

    SciTech Connect

    Broderick, G.; Handle, B.; Paschen, P.

    1999-02-01

    Empirical models predicting the purity of electrowon tellurium have been developed using data from 36 pilot-plant trials. Based on these models, a numerical optimization of the process was performed to identify conditions which minimize the total contamination in Pb and Se while reducing electrical consumption per kilogram of electrowon tellurium. Results indicate that product quality can be maintained and even improved while operating at the much higher electroplating production rates obtained at high current densities. Using these same process settings, the electrical consumption of the process can be reduced by up to 10 pct by operating at midrange temperatures of close to 50 °C. This is particularly attractive when waste heat is available at the plant to help preheat the electrolyte feed. When both Pb and Se are present as contaminants, the most energy-efficient strategy involves the use of a high current density, at a moderate temperature with high flow, for low concentrations of TeO2. If Pb is removed prior to the electrowinning process, the use of a low current density and low electrolyte feed concentration, while operating at a low temperature and moderate flow rates, provides the most significant reduction in Se codeposition.

  17. Cooling system optimization analysis for hot forming processes

    NASA Astrophysics Data System (ADS)

    Ghoo, Bonyoung; Umezu, Yasuyoshi; Watanabe, Yuko

    2013-12-01

    Hot forming technology was developed to produce automotive panels having ultra-high tensile strength over 1500 MPa. The elevated temperature corresponds with decreased flow stress and increased ductility, and hot formed products exhibit almost zero springback. This advanced forming technology increases the need for numerical simulations with coupled thermal-mechanical formulations. In the present study, 3-dimensional finite element analyses of hot forming processes are conducted using JSTAMP/NV and LS-DYNA, taking the cooling system into account. Special attention is paid to the optimization of the cooling system using thermo-mechanical finite element analysis and the influence of various cooling parameters. The presented work shows that an adequate cooling system model and a microstructural phase-transformation material model, together with a proper set of numerical parameters, can give both efficient and accurate design insight into the hot forming manufacturing process. JSTAMP/NV and LS-DYNA therefore form a robust combination for complex hot forming analyses that require thermo-mechanical and microstructural material modeling and varied process modeling. The new JSTAMP/NV function for multi-shot manufacturing processes shows good capability in cooling system evaluation, and the advanced LS-DYNA microstructural phase-transformation model gives good results for martensite fraction and Vickers hardness after quenching.

  18. Optimal processes for probabilistic work extraction beyond the second law

    NASA Astrophysics Data System (ADS)

    Cavina, Vasco; Mari, Andrea; Giovannetti, Vittorio

    2016-07-01

    According to the second law of thermodynamics, for every transformation performed on a system which is in contact with an environment of fixed temperature, the average extracted work is bounded by the decrease of the free energy of the system. However, in a single realization of a generic process, the extracted work is subject to statistical fluctuations which may allow for probabilistic violations of the previous bound. We are interested in enhancing this effect, i.e. we look for thermodynamic processes that maximize the probability of extracting work above a given arbitrary threshold. For any process obeying the Jarzynski identity, we determine an upper bound for the work extraction probability that depends also on the minimum amount of work that we are willing to extract in case of failure, or on the average work we wish to extract from the system. Then we show that this bound can be saturated within the thermodynamic formalism of quantum discrete processes composed by sequences of unitary quenches and complete thermalizations. We explicitly determine the optimal protocol which is given by two quasi-static isothermal transformations separated by a finite unitary quench.
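
    For orientation, a simplified bound of the kind discussed above follows directly from the Jarzynski identity together with Markov's inequality (the bound derived in the paper additionally depends on the work accepted in case of failure):

        \[
        \big\langle e^{\beta W_{\mathrm{ext}}}\big\rangle = e^{-\beta\Delta F}
        \quad\Longrightarrow\quad
        \Pr\big[W_{\mathrm{ext}}\ge w\big]\;\le\;e^{-\beta(w+\Delta F)}\;=\;e^{-\beta(w-W_{\max})},
        \]

    where beta is the inverse temperature, W_ext the extracted work, Delta F the free-energy change, and W_max = -Delta F the second-law bound on the average extracted work; the probability of exceeding the average bound by a given margin decays exponentially in that margin.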

  19. Optimal processes for probabilistic work extraction beyond the second law.

    PubMed

    Cavina, Vasco; Mari, Andrea; Giovannetti, Vittorio

    2016-01-01

    According to the second law of thermodynamics, for every transformation performed on a system which is in contact with an environment of fixed temperature, the average extracted work is bounded by the decrease of the free energy of the system. However, in a single realization of a generic process, the extracted work is subject to statistical fluctuations which may allow for probabilistic violations of the previous bound. We are interested in enhancing this effect, i.e. we look for thermodynamic processes that maximize the probability of extracting work above a given arbitrary threshold. For any process obeying the Jarzynski identity, we determine an upper bound for the work extraction probability that depends also on the minimum amount of work that we are willing to extract in case of failure, or on the average work we wish to extract from the system. Then we show that this bound can be saturated within the thermodynamic formalism of quantum discrete processes composed by sequences of unitary quenches and complete thermalizations. We explicitly determine the optimal protocol which is given by two quasi-static isothermal transformations separated by a finite unitary quench. PMID:27377557

  20. Optimal processes for probabilistic work extraction beyond the second law.

    PubMed

    Cavina, Vasco; Mari, Andrea; Giovannetti, Vittorio

    2016-07-05

    According to the second law of thermodynamics, for every transformation performed on a system which is in contact with an environment of fixed temperature, the average extracted work is bounded by the decrease of the free energy of the system. However, in a single realization of a generic process, the extracted work is subject to statistical fluctuations which may allow for probabilistic violations of the previous bound. We are interested in enhancing this effect, i.e. we look for thermodynamic processes that maximize the probability of extracting work above a given arbitrary threshold. For any process obeying the Jarzynski identity, we determine an upper bound for the work extraction probability that depends also on the minimum amount of work that we are willing to extract in case of failure, or on the average work we wish to extract from the system. Then we show that this bound can be saturated within the thermodynamic formalism of quantum discrete processes composed by sequences of unitary quenches and complete thermalizations. We explicitly determine the optimal protocol which is given by two quasi-static isothermal transformations separated by a finite unitary quench.

  1. Optimal processes for probabilistic work extraction beyond the second law

    PubMed Central

    Cavina, Vasco; Mari, Andrea; Giovannetti, Vittorio

    2016-01-01

    According to the second law of thermodynamics, for every transformation performed on a system which is in contact with an environment of fixed temperature, the average extracted work is bounded by the decrease of the free energy of the system. However, in a single realization of a generic process, the extracted work is subject to statistical fluctuations which may allow for probabilistic violations of the previous bound. We are interested in enhancing this effect, i.e. we look for thermodynamic processes that maximize the probability of extracting work above a given arbitrary threshold. For any process obeying the Jarzynski identity, we determine an upper bound for the work extraction probability that depends also on the minimum amount of work that we are willing to extract in case of failure, or on the average work we wish to extract from the system. Then we show that this bound can be saturated within the thermodynamic formalism of quantum discrete processes composed by sequences of unitary quenches and complete thermalizations. We explicitly determine the optimal protocol which is given by two quasi-static isothermal transformations separated by a finite unitary quench. PMID:27377557

  2. Virtual Optimization of Nasal Insulin Therapy Predicts Immunization Frequency to Be Crucial for Diabetes Protection

    PubMed Central

    Fousteri, Georgia; Chan, Jason R.; Zheng, Yanan; Whiting, Chan; Dave, Amy; Bresson, Damien; Croft, Michael; von Herrath, Matthias

    2010-01-01

    OBJECTIVE Development of antigen-specific strategies to treat or prevent type 1 diabetes has been slow and difficult because of the lack of experimental tools and defined biomarkers that account for the underlying therapeutic mechanisms. RESEARCH DESIGN AND METHODS The type 1 diabetes PhysioLab platform, a large-scale mathematical model of disease pathogenesis in the nonobese diabetic (NOD) mouse, was used to investigate the possible mechanisms underlying the efficacy of nasal insulin B:9-23 peptide therapy. The experimental aim was to evaluate the impact of dose, frequency of administration, and age at treatment on Treg induction and optimal therapeutic outcome. RESULTS In virtual NOD mice, treatment efficacy was predicted to depend primarily on the immunization frequency and stage of the disease and to a lesser extent on the dose. Whereas low-frequency immunization protected against diabetes, an effect attributed to Treg and interleukin (IL)-10 induction in the pancreas 1–2 weeks after treatment, high-frequency immunization failed. These predictions were confirmed with wet-lab approaches, where only low-frequency immunization started at an early disease stage in the NOD mouse resulted in significant protection from diabetes by inducing IL-10 and Treg. CONCLUSIONS Here, the advantage of applying computer modeling in optimizing the therapeutic efficacy of nasal insulin immunotherapy was confirmed. In silico modeling was able to streamline the experimental design and to identify the particular time frame at which biomarkers associated with protection in live NODs were induced. These results support the development and application of humanized platforms for the design of clinical trials (i.e., for the ongoing nasal insulin prevention studies). PMID:20864513

  3. Supplemental Assessment of the Y-12 Groundwater Protection Program Using Monitoring and Remediation Optimization System Software

    SciTech Connect

    Elvado Environmental LLC; GSI Environmental LLC

    2009-01-01

    A supplemental quantitative assessment of the Groundwater Protection Program (GWPP) at the Y-12 National Security Complex (Y-12) in Oak Ridge, TN was performed using the Monitoring and Remediation Optimization System (MAROS) software. This application was previously used as part of a similar quantitative assessment of the GWPP completed in December 2005, hereafter referenced as the 'baseline' MAROS assessment (BWXT Y-12 L.L.C. [BWXT] 2005). The MAROS software contains modules that apply statistical analysis techniques to an existing GWPP analytical database in conjunction with hydrogeologic factors, regulatory framework, and the location of potential receptors, to recommend an improved groundwater monitoring network and optimum sampling frequency for individual monitoring locations. The goal of this supplemental MAROS assessment of the Y-12 GWPP is to review and update monitoring network optimization recommendations resulting from the 2005 baseline report using data collected through December 2007. The supplemental MAROS assessment is based on the findings of the baseline MAROS assessment and includes only the groundwater sampling locations (wells and natural springs) currently granted 'Active' status in accordance with the Y-12 GWPP Monitoring Optimization Plan (MOP). The results of the baseline MAROS assessment provided technical rationale regarding the 'Active' status designations defined in the MOP (BWXT 2006). One objective of the current report is to provide a quantitative review of data collected from Active but infrequently sampled wells to confirm concentrations at these locations. This supplemental MAROS assessment does not include the extensive qualitative evaluations similar to those presented in the baseline report.

  4. Predictive Process Optimization for Fracture Ductility in Automotive TRIP Steels

    NASA Astrophysics Data System (ADS)

    Gong, Jiadong

    In light of the automotive industry's emerging challenges in meeting new energy-saving and environmentally friendly requirements imposed by both government and society, automakers have been working relentlessly to reduce the weight of automobiles. While steel makers have introduced a variety of novel Advanced High Strength Steels (AHSS) to serve this market, TRIP (Transformation Induced Plasticity) steel is one of the most promising materials for auto bodies due to its exceptional combination of strength and formability. However, current commercial automotive TRIP steels demonstrate relatively low hole-expansion (HE) capability, which is critical in the stretch forming of various auto parts. This shortcoming in ductility has been causing fracture issues in the forming process and limits wider application of this steel. The kinetic theory of martensitic transformations and the associated transformation plasticity is applied to the optimization of transformation stability for enhanced mechanical properties in a class of high strength galvannealed TRIP steel. This research leverages newly developed characterization and simulation capabilities, supporting computational design of high-performance steels that exploit optimized transformation plasticity for desired mechanical behaviors, especially hole-expansion ductility. The microstructure of the automotive TRIP sheet steels was investigated using advanced tomographic characterization, including nanoscale Local Electrode Atom Probe (LEAP) microanalysis. The microstructural basis of austenite stability, in particular the austenite carbon concentration, was quantified and correlated with measured fracture ductility through transformation plasticity constitutive laws. Plastic flow stability for enhanced local fracture ductility at high strength is sought to maintain high hole-expansion ductility, by quantifying the optimal stability and the heat-treatment process needed to achieve it. An additional

  5. Process synthesis and optimization for the production of carbon nanostructures.

    PubMed

    Iyuke, S E; Mamvura, T A; Liu, K; Sibanda, V; Meyyappan, M; Varadan, V K

    2009-09-16

    A swirled fluidized bed chemical vapour deposition (SFCVD) reactor has been manufactured and optimized to produce carbon nanostructures on a continuous basis using in situ formation of floating catalyst particles by thermal decomposition of organometallic ferrocene. During the process optimization, carbon nanoballs were produced in the absence of a catalyst at temperatures higher than 1000 degrees C, while carbon nanofibres, single-walled carbon nanotubes, helical carbon nanotubes, multi-walled carbon nanotubes (MWCNTs) and carbon nanofibres (CNFs) were produced in the presence of a catalyst at lower temperatures of between 750 and 900 degrees C. The optimum conditions for producing carbon nanostructures were a temperature of 850 degrees C, acetylene flow rate of 100 ml min(-1), and acetylene gas was used as the carbon source. All carbon nanostructures produced have morphologies and diameters ranging from 15 to 200 nm and wall thicknesses between 0.5 and 0.8 nm. In comparison to the quantity of MWCNTs produced with other methods described in the literature, the SFCVD technique was superior to floating catalytic CVD (horizontal fixed bed) and microwave CVD but inferior to rotary tube CVD. PMID:19706958

  6. Process synthesis and optimization for the production of carbon nanostructures

    NASA Astrophysics Data System (ADS)

    Iyuke, S. E.; Mamvura, T. A.; Liu, K.; Sibanda, V.; Meyyappan, M.; Varadan, V. K.

    2009-09-01

    A swirled fluidized bed chemical vapour deposition (SFCVD) reactor has been manufactured and optimized to produce carbon nanostructures on a continuous basis using in situ formation of floating catalyst particles by thermal decomposition of organometallic ferrocene. During the process optimization, carbon nanoballs were produced in the absence of a catalyst at temperatures higher than 1000 °C, while carbon nanofibres, single-walled carbon nanotubes, helical carbon nanotubes, multi-walled carbon nanotubes (MWCNTs) and carbon nanofibres (CNFs) were produced in the presence of a catalyst at lower temperatures of between 750 and 900 °C. The optimum conditions for producing carbon nanostructures were a temperature of 850 °C, acetylene flow rate of 100 ml min-1, and acetylene gas was used as the carbon source. All carbon nanostructures produced have morphologies and diameters ranging from 15 to 200 nm and wall thicknesses between 0.5 and 0.8 nm. In comparison to the quantity of MWCNTs produced with other methods described in the literature, the SFCVD technique was superior to floating catalytic CVD (horizontal fixed bed) and microwave CVD but inferior to rotary tube CVD.

  7. Optimal lot sizing in screening processes with returnable defective items

    NASA Astrophysics Data System (ADS)

    Vishkaei, Behzad Maleki; Niaki, S. T. A.; Farhangi, Milad; Rashti, Mehdi Ebrahimnezhad Moghadam

    2014-07-01

    This paper is an extension of Hsu and Hsu (Int J Ind Eng Comput 3(5):939-948, 2012) aiming to determine the optimal order quantity of product batches that contain defective items with percentage nonconforming following a known probability density function. The orders are subject to 100 % screening process at a rate higher than the demand rate. Shortage is backordered, and defective items in each ordering cycle are stored in a warehouse to be returned to the supplier when a new order is received. Although the retailer does not sell defective items at a lower price and only trades perfect items (to avoid loss), a higher holding cost incurs to store defective items. Using the renewal-reward theorem, the optimal order and shortage quantities are determined. Some numerical examples are solved at the end to clarify the applicability of the proposed model and to compare the new policy to an existing one. The results show that the new policy provides better expected profit per time.
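
    In outline, the renewal-reward argument referred to above reduces to maximizing the long-run expected profit per unit time (the symbols here are generic, not the paper's notation):

        \[
        \mathrm{ETPU}(y)\;=\;\frac{E[\mathrm{TP}(y)]}{E[T(y)]},
        \qquad
        y^{*}=\arg\max_{y}\;\mathrm{ETPU}(y),
        \]

    where y is the order quantity, TP(y) is the random profit per cycle (revenue from perfect items minus ordering, screening, holding and backorder costs), and T(y) is the random cycle length induced by the random defective fraction; the expectations are taken over the known probability density of the defective percentage before optimizing over y.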

  8. Optimizing an immersion ESL curriculum using analytic hierarchy process.

    PubMed

    Tang, Hui-Wen Vivian

    2011-11-01

    The main purpose of this study is to fill a substantial knowledge gap regarding reaching a uniform group decision in English curriculum design and planning. A comprehensive content-based course criterion model extracted from existing literature and expert opinions was developed. Analytical hierarchy process (AHP) was used to identify the relative importance of course criteria for the purpose of tailoring an optimal one-week immersion English as a second language (ESL) curriculum for elementary school students in a suburban county of Taiwan. The hierarchy model and AHP analysis utilized in the present study will be useful for resolving several important multi-criteria decision-making issues in planning and evaluating ESL programs. This study also offers valuable insights and provides a basis for further research in customizing ESL curriculum models for different student populations with distinct learning needs, goals, and socioeconomic backgrounds.
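
    A minimal sketch of the AHP computation underlying such a study, in Python, assuming a small pairwise comparison matrix assembled from expert judgments; the criteria and the numerical comparisons are purely illustrative.

        import numpy as np

        def ahp_weights(A):
            """Priority weights from a pairwise comparison matrix via the principal eigenvector,
            with Saaty's consistency ratio (CR < 0.1 is conventionally acceptable)."""
            eigvals, eigvecs = np.linalg.eig(A)
            k = np.argmax(eigvals.real)
            w = np.abs(eigvecs[:, k].real)
            w /= w.sum()
            n = A.shape[0]
            ci = (eigvals[k].real - n) / (n - 1)           # consistency index
            ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]   # Saaty's random index
            return w, ci / ri

        # Hypothetical comparisons among three course criteria
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])
        weights, cr = ahp_weights(A)
        print(weights, cr)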

  9. Simulation and optimization technologies for petroleum waste management and remediation process control.

    PubMed

    Qin, X S; Huang, G H; He, L

    2009-01-01

    Leakage and spill of petroleum hydrocarbons from underground storage tanks and pipelines have posed significant threats to groundwater resources across many petroleum-contaminated sites. Remediation of these sites is essential for protecting the soil and groundwater resources and reducing risks to local communities. Although many efforts have been made, effective design and management of various remediation systems are still challenging to practitioners. In recent years, the subsurface simulation model has been combined with techniques of optimization to address important problems of contaminated site management. The combined simulation-optimization system accounts for the complex behavior of the subsurface system and identifies the best management strategy under consideration of the management objectives and constraints. During the past decades, a large number of studies were conducted to simulate contaminant flow and transport in the subsurface and seek cost-effective remediation designs. This paper gives a comprehensive review on recent developments, advancements, challenges, and barriers associated with simulation and optimization techniques in supporting process control of petroleum waste management and site remediation. A number of related methodologies and applications were examined. Perspectives of effective site management were investigated, demonstrating many demanding areas for enhanced research efforts, which include issues of data availability and reliability, concerns in uncertainty, necessity of post-modeling analysis, and usefulness of development of process control techniques.

  10. Statistical process control using optimized neural networks: a case study.

    PubMed

    Addeh, Jalil; Ebrahimzadeh, Ata; Azarbad, Milad; Ranaee, Vahid

    2014-09-01

    The most common statistical process control (SPC) tools employed for monitoring process changes are control charts. A control chart demonstrates that the process has altered by generating an out-of-control signal. This study investigates the design of an accurate system for control chart pattern (CCP) recognition in two respects. First, an efficient system is introduced that includes two main modules: a feature extraction module and a classifier module. In the feature extraction module, a proper set of shape features and statistical features is proposed as an efficient characterization of the patterns. In the classifier module, several neural networks, such as the multilayer perceptron, probabilistic neural network and radial basis function network, are investigated, and the best classifier is chosen on the basis of an experimental study to recognize the CCPs. Second, a hybrid heuristic recognition system based on the cuckoo optimization algorithm (COA) is introduced to improve the generalization performance of the classifier. The simulation results show that the proposed algorithm has high recognition accuracy. PMID:24210290
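
    As a concrete illustration of the feature extraction module described above, a short Python sketch follows; the particular features are plausible examples of shape and statistical features, not the paper's exact set.

        import numpy as np

        def ccp_features(window):
            """A few statistical and shape features of one control-chart window."""
            x = np.asarray(window, dtype=float)
            t = np.arange(len(x))
            slope = np.polyfit(t, x, 1)[0]                                  # linear trend
            crossings = int(np.sum(np.diff(np.sign(x - x.mean())) != 0))    # mean crossings
            return [x.mean(), x.std(), slope, x.max() - x.min(), crossings]

        print(ccp_features([0.1, 0.3, 0.2, 0.6, 0.5, 0.9, 0.8, 1.2]))

    Feature vectors of this kind would then be passed to the classifier (multilayer perceptron, probabilistic neural network or radial basis function network) whose parameters are tuned by the cuckoo optimization algorithm.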

  11. Optimizing the processing and presentation of PPCR imaging

    NASA Astrophysics Data System (ADS)

    Davies, Andrew G.; Cowen, Arnold R.; Parkin, Geoff J. S.; Bury, Robert F.

    1996-03-01

    Photostimulable phosphor computed radiography (CR) is becoming an increasingly popular image acquisition system. The acceptability of this technique, diagnostically, ergonomically and economically, is highly influenced by the method by which the image data are presented to the user. Traditional CR systems utilize an 11" by 14" film hardcopy format and can place two images per exposure onto this film, which does not correspond to the sizes and presentations provided by conventional techniques. It is also the authors' experience that the image enhancement algorithms provided by traditional CR systems do not provide optimal image presentation. An alternative image enhancement algorithm was developed, along with a number of hardcopy formats, designed to match the requirements of the image reporting process. The new image enhancement algorithm, called dynamic range reduction (DRR), is designed to provide a single presentation per exposure, maintaining the appearance of a conventional radiograph while optimizing the rendition of diagnostically relevant features within the image. The algorithm was developed on a Sun SPARCstation but later ported to a Philips EasyVisionRAD workstation. Print formats were developed on the EasyVision to improve the acceptability of the CR hardcopy. For example, for mammographic examinations, four mammograms (a cranio-caudal and medio-lateral view of each breast) are taken for each patient, with all images placed onto a single sheet of 14" by 17" film. The new composite format provides a more suitable image presentation for reporting and is more economical to produce. It is the use of enhanced image processing and presentation which has enabled all mammography undertaken within the general infirmary to be performed using the CR/EasyVisionRAD DRR/3M 969 combination, without recourse to conventional film/screen mammography.

  12. Optimization of image processing algorithms on mobile platforms

    NASA Astrophysics Data System (ADS)

    Poudel, Pramod; Shirvaikar, Mukul

    2011-03-01

    This work presents a technique to optimize popular image processing algorithms on mobile platforms such as cell phones, netbooks and personal digital assistants (PDAs). The increasing demand for video applications like context-aware computing on mobile embedded systems requires the use of computationally intensive image processing algorithms, and the system engineer has a mandate to optimize them so as to meet real-time deadlines. A methodology to take advantage of an asymmetric dual-core processor, which includes an ARM and a DSP core supported by shared memory, is presented with implementation details. The target platform chosen is the popular OMAP 3530 processor for embedded media systems. It has an asymmetric dual-core architecture with an ARM Cortex-A8 and a TMS320C64x Digital Signal Processor (DSP). The development platform was the BeagleBoard with 256 MB of NAND flash and 256 MB of SDRAM. The basic image correlation algorithm is chosen for benchmarking, as it finds widespread application in template matching tasks such as face recognition. The basic algorithm prototypes conform to OpenCV, a popular computer vision library. OpenCV algorithms can be easily ported to the ARM core, which runs a popular operating system such as Linux or Windows CE; however, the DSP is architecturally more efficient at handling DFT algorithms. The algorithms are tested on a variety of images and performance results are presented, measuring the speedup obtained from the dual-core implementation. A major advantage of this approach is that it allows the ARM processor to perform important real-time tasks while the DSP addresses performance-hungry algorithms.
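
    A compact sketch of DFT-based image correlation of the kind benchmarked above, written in Python/NumPy rather than the ARM/DSP C implementation; it computes plain (un-normalized) circular cross-correlation and reports the best-match offset.

        import numpy as np

        def correlate_fft(image, template):
            """Locate a template in an image via frequency-domain cross-correlation."""
            padded = np.zeros_like(image, dtype=float)
            padded[:template.shape[0], :template.shape[1]] = template
            corr = np.fft.ifft2(np.fft.fft2(image) * np.conj(np.fft.fft2(padded))).real
            return np.unravel_index(np.argmax(corr), corr.shape)   # (row, col) of best match

        img = np.random.rand(64, 64)
        tpl = img[20:28, 30:38].copy()
        print(correlate_fft(img, tpl))   # expected to be at or near (20, 30)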

  13. Optimization of a biological wastewater treatment process at a petrochemical plant using process simulation

    SciTech Connect

    Jones, R.M.; Dold, P.L.; Baker, A.J.; Briggs, T.

    1996-12-31

    A research study was conducted on the activated sludge process treating the wastewater from a petrochemical manufacturing facility in Ontario, Canada. The objective of the study was to improve the level of understanding of the process and to evaluate the use of model-based simulation tools as an aid in the optimization of the wastewater treatment facility. Models such as the IAWQ Activated Sludge Model No. 1 (ASM1) have previously been developed and applied to assist in designing new systems and optimizing existing systems for the treatment of municipal wastewaters. However, due to significant differences between the characteristics of the petrochemical plant wastewater and municipal wastewaters, this study required the development of a mechanistic model specifically describing the behavior of the activated sludge treatment of the petrochemical wastewater. This paper outlines the development of the mechanistic model and gives examples of how plant performance issues were investigated through process simulation.

  14. The application of multi-objective optimization method for activated sludge process: a review.

    PubMed

    Dai, Hongliang; Chen, Wenliang; Lu, Xiwu

    2016-01-01

    The activated sludge process (ASP) is the most widely applied biological wastewater treatment approach. Depending on the design and specific application, activated sludge wastewater treatment plants (WWTPs) can achieve biological nitrogen (N) and phosphorus (P) removal in addition to the removal of organic carbon substances. However, effluent N and P limits are getting tighter because of the increased emphasis on environmental protection, while energy conservation and operational reliability must also be ensured. Therefore, the balance between treatment performance and cost becomes a critical issue for the operation of WWTPs, which necessitates multi-objective optimization (MOO). Recent studies in this field have shown promise in utilizing MOO to address the multiple conflicting criteria (i.e. effluent quality, operating cost, operational stability), including studying the ASP models that are primarily responsible for the process and developing MOO methods for the wastewater treatment process, which facilitate better optimization of process performance. Based on this improved understanding of the application of MOO to the ASP, a comprehensive review is conducted to offer a clear vision of the advances, and potential areas for future research in the field are also proposed. PMID:26819377

  15. Design of a tomato packing system by image processing and optimization processing

    NASA Astrophysics Data System (ADS)

    Li, K.; Kumazaki, T.; Saigusa, M.

    2016-02-01

    In recent years, with the development of environmental control systems in plant factories, tomato production has rapidly increased in Japan. However, with the decline in the availability of agricultural labor, there is a need to automate grading, sorting and packing operations. In this research, we designed an automatic packing program in which tomato weight is estimated by image processing and the tomatoes are packed in an optimized configuration. The weight was estimated from pixel area properties after an L*a*b* color model conversion, noise rejection, hole filling and boundary preprocessing. The packing optimization program was designed using a 0-1 knapsack algorithm for dynamic combinatorial optimization.
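
    A minimal sketch of the 0-1 knapsack step mentioned above, in Python with integer gram weights; the box capacity and tomato weights are hypothetical, and the image-based weight estimation is not shown.

        def pack_box(weights, capacity):
            """Choose items (0-1 knapsack) whose total weight best fills the box without exceeding it."""
            n = len(weights)
            best = [[0] * (capacity + 1) for _ in range(n + 1)]
            for i, w in enumerate(weights, 1):
                for c in range(capacity + 1):
                    best[i][c] = best[i - 1][c]
                    if w <= c and best[i - 1][c - w] + w > best[i][c]:
                        best[i][c] = best[i - 1][c - w] + w
            chosen, c = [], capacity                      # backtrack to recover the selection
            for i in range(n, 0, -1):
                if best[i][c] != best[i - 1][c]:
                    chosen.append(i - 1)
                    c -= weights[i - 1]
            return chosen, best[n][capacity]

        print(pack_box([152, 148, 160, 171, 143], 450))   # estimated tomato weights (g), 450 g box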

  16. Novel Optimization Methodology for Welding Process/Consumable Integration

    SciTech Connect

    Quintana, Marie A; DebRoy, Tarasankar; Vitek, John; Babu, Suresh

    2006-01-15

    Advanced materials are being developed to improve the energy efficiency of many industries of the future, including steel, mining, and chemicals, as well as US infrastructure, including bridges, pipelines and buildings. Effective deployment of these materials is highly dependent upon the development of arc welding technology. Traditional welding technology development is slow and often involves expensive and time-consuming trial-and-error experimentation. The reason for this is the lack of useful predictive tools that enable welding technology development to keep pace with the deployment of new materials in various industrial sectors. Literature reviews showed two kinds of modeling activities. Academic and national laboratory efforts focus on developing integrated weld process models employing detailed scientific methodologies; however, these models are cumbersome and not easy to use, and therefore have limited application in real-world industrial conditions. On the other hand, industrial users have relied on simple predictive models based on analytical and empirical equations to drive their product development, but the scope of these simple models is limited. In this research, attempts were made to bridge this gap and provide the industry with a computational tool that combines the advantages of both approaches. The research resulted in the development of predictive tools which facilitate the development of optimized welding processes and consumables. The work demonstrated that it is possible to develop hybrid integrated models relating the weld metal composition and process parameters to the performance of welds. In addition, these tools can be deployed to industrial users through a user-friendly graphical interface. In principle, welding industry users can use these modular tools to guide their selection of welding process parameters and consumable composition. It is hypothesized that by expanding these tools throughout welding industry

  17. XFEL diffraction: developing processing methods to optimize data quality.

    PubMed

    Sauter, Nicholas K

    2015-03-01

    Serial crystallography, using either femtosecond X-ray pulses from free-electron laser sources or short synchrotron-radiation exposures, has the potential to reveal metalloprotein structural details while minimizing damage processes. However, deriving a self-consistent set of Bragg intensities from numerous still-crystal exposures remains a difficult problem, with optimal protocols likely to be quite different from those well established for rotation photography. Here several data processing issues unique to serial crystallography are examined. It is found that the limiting resolution differs for each shot, an effect that is likely to be due to both the sample heterogeneity and pulse-to-pulse variation in experimental conditions. Shots with lower resolution limits produce lower-quality models for predicting Bragg spot positions during the integration step. Also, still shots by their nature record only partial measurements of the Bragg intensity. An approximate model that corrects to the full-spot equivalent (with the simplifying assumption that the X-rays are monochromatic) brings the distribution of intensities closer to that expected from an ideal crystal, and improves the sharpness of anomalous difference Fourier peaks indicating metal positions.

  18. Biometric Attendance and Big Data Analysis for Optimizing Work Processes.

    PubMed

    Verma, Neetu; Xavier, Teenu; Agrawal, Deepak

    2016-01-01

    Although biometric attendance management is available, large healthcare organizations have difficulty with big data analysis for the optimization of work processes. The aim of this project was to assess the implementation of a biometric attendance system and its utility following big data analysis. In this prospective study, the implementation of the biometric system was evaluated over a 3-month period at our institution, and software integration with other existing systems for data analysis was also evaluated. Implementation of the biometric system was successfully completed over a two-month period, with enrollment of 10,000 employees into the system. However, generating reports and taking action for this large number of staff was a challenge. For this purpose, software was developed to capture the duty roster of each employee, integrate it with the biometric system, and add an SMS gateway. This helped in automating the process of sending an SMS to each employee who had not signed in. Standalone biometric systems have limited functionality in large organizations unless they are meshed with the employee duty roster. PMID:27332164

  19. Biometric Attendance and Big Data Analysis for Optimizing Work Processes.

    PubMed

    Verma, Neetu; Xavier, Teenu; Agrawal, Deepak

    2016-01-01

    Although biometric attendance management is available, large healthcare organizations have difficulty with big data analysis for the optimization of work processes. The aim of this project was to assess the implementation of a biometric attendance system and its utility following big data analysis. In this prospective study, the implementation of the biometric system was evaluated over a 3-month period at our institution, and software integration with other existing systems for data analysis was also evaluated. Implementation of the biometric system was successfully completed over a two-month period, with enrollment of 10,000 employees into the system. However, generating reports and taking action for this large number of staff was a challenge. For this purpose, software was developed to capture the duty roster of each employee, integrate it with the biometric system, and add an SMS gateway. This helped in automating the process of sending an SMS to each employee who had not signed in. Standalone biometric systems have limited functionality in large organizations unless they are meshed with the employee duty roster.

  20. Further Development and Assessment of a Broadband Liner Optimization Process

    NASA Technical Reports Server (NTRS)

    Nark, Douglas M.; Jones, Michael G.; Sutliff, Daniel L.

    2016-01-01

    The utilization of advanced fan designs (including higher bypass ratios) and shorter engine nacelles has highlighted a need for increased fan noise reduction over a broader frequency range. Thus, improved broadband liner designs must account for these constraints and, where applicable, take advantage of advanced manufacturing techniques that have opened new possibilities for novel configurations. This work focuses on the use of an established broadband acoustic liner optimization process to design a variable-depth, multi-degree of freedom liner for a high speed fan. Specifically, in-duct attenuation predictions with a statistical source model are used to obtain optimum impedance spectra over the conditions of interest. The predicted optimum impedance information is then used with acoustic liner modeling tools to design a liner aimed at producing impedance spectra that most closely match the predicted optimum values. The multi-degree of freedom design is carried through design, fabrication, and testing. In-duct attenuation predictions compare well with measured data and the multi-degree of freedom liner is shown to outperform a more conventional liner over a range of flow conditions. These promising results provide further confidence in the design tool, as well as the enhancements made to the overall design process.

  1. Optimization process of tribenzoine production as a glycerol derived product

    NASA Astrophysics Data System (ADS)

    Widayat, Abdurrakhman, Rifianto, Y.; Abdullah, Hadiyanto, Samsudin, Asep M.; Annisa, A. N.

    2015-12-01

    Tribenzoin is a glycerol-derived product that can be produced by glycerol conversion via an esterification process. The product can be used in the food, cosmetics and polymer industries, and can also be used to improve the properties of adhesive materials and the water resistance of printer inks. In addition, it is environmentally friendly and renewable because it is not derived from petroleum. This paper discusses the effect of temperature and catalyst concentration on tribenzoin production; yield and product composition were observed as responses. Results processed using a Central Composite Design (CCD) showed that the highest yield, 58.71%, was achieved at the optimal conditions of 63.64 °C, a benzoic acid to glycerol mole ratio of 3.644:1, and a catalyst concentration of 6.25% (wt% of glycerol). FTIR analysis showed absorption at a wavelength of 1761 cm-1 in the fingerprint region and at 3165 cm-1 in the functional group region; the presence of these two bands is strong evidence that the compound is tribenzoin.

  2. Assessing and Optimizing Microarchitectural Performance of Event Processing Systems

    NASA Astrophysics Data System (ADS)

    Mendes, Marcelo R. N.; Bizarro, Pedro; Marques, Paulo

    Event Processing (EP) systems are being progressively used in business critical applications in domains such as algorithmic trading, supply chain management, production monitoring, or fraud detection. To deal with high throughput and low response time requirements, these EP systems mainly use the CPU-RAM sub-system for data processing. However, as we show here, collected statistics on CPU usage or on CPU-RAM communication reveal that available systems are poorly optimized and grossly waste resources. In this paper we quantify some of these inefficiencies and propose cache-aware algorithms and changes on internal data structures to overcome them. We test the before and after system both at the microarchitecture and application level and show that: i) the changes improve microarchitecture metrics such as clocks-per-instruction, cache misses or TLB misses; ii) and that some of these improvements result in very high application level improvements such as a 44% improvement on stream-to-table joins with 6-fold reduction on memory consumption, and order-of-magnitude increase on throughput for moving aggregation operations.

  3. Poling process optimization of piezo nanocomposite PZT/polymer

    NASA Astrophysics Data System (ADS)

    Ridlo, M. Rosyid; Lestari, Titik; Mardiyanto, Oemry, Achiar

    2013-09-01

    The objective of the poling process is to align the electric dipoles inside the perovskite crystal of a piezo material. In simple terms, poling is carried out by applying a high electrical potential across the two sides of the piezo material; the more parallel the electric dipoles, the stronger the piezoelectric characteristics. The optimization involved control of temperature, poling time and electrical voltage. The samples were prepared by the sol-gel method with the precursors tetrabutyl titanate Ti(OC4H9)4, zirconium nitrate Zr(NO3)4·5H2O, lead acetate Pb(CH3COO)2·3H2O and an ethylene glycol solution. The molar ratio was Pb:Zr:Ti = 1.1:0.52:0.48, accounting for Pb loss. The sol-gel process yields PZT nanopowder, which was then mixed with PVDF polymer and pressed at 10 MPa and 150 °C into samples 15 mm in diameter. After poling, the piezoelectric constant d33 was measured. The highest d33 = 45 pC/N was found at the poling parameters V = 5 kV/mm, T = 120 °C and a poling time of 1 hour.

  4. A key to success: optimizing the planning process

    NASA Astrophysics Data System (ADS)

    Turk, Huseyin; Karakaya, Kamil

    2014-05-01

    The air operation planning process is analyzed using a comprehensive approach and the difficulties of planning are identified. Consequently, to optimize the decision-making process of an air operation, a planning process is defined within a virtual command and control structure.

  5. Hydroxyapatite coatings for marble protection: Optimization of calcite covering and acid resistance

    NASA Astrophysics Data System (ADS)

    Graziani, Gabriela; Sassoni, Enrico; Franzoni, Elisa; Scherer, George W.

    2016-04-01

    Hydroxyapatite (HAP) has a much lower dissolution rate and solubility than calcite, especially in an acidic environment, so it has been proposed for the protection of marble against acidic rain corrosion. Promising results were obtained, but further optimization is necessary as the treated layer is often incomplete, cracked and/or porous. In this paper, several parameters were investigated to obtain a coherent, uncracked layer, and to avoid the formation of metastable, soluble phases instead of HAP: the role of the pH of the starting solution; the effect of organic and inorganic additions, and in particular that of ethanol, as it is reported to adsorb on calcite, hence possibly favoring the growth of the HAP layer. Finally, a double application of the treatment was tested. Results were compared to those obtained with ammonium oxalate treatment, widely investigated for marble protection. Results indicate that adding small amounts of ethanol to the formulation remarkably increases the acid resistance of treated samples, and yields better coverage of the surface without crack formation. The effectiveness of the treatment is further enhanced when a second treatment is applied. The efficacy of ethanol-doped DAP mixtures was found to be remarkably higher than that of ammonium oxalate based treatments.

  6. Characterization and optimization of a novel vaccine for protection against Lyme borreliosis.

    PubMed

    Comstedt, Pär; Hanner, Markus; Schüler, Wolfgang; Meinke, Andreas; Schlegl, Robert; Lundberg, Urban

    2015-11-01

    Lyme borreliosis (LB) is the most common vector-borne disease in the northern hemisphere and there is no vaccine available for disease prevention. The majority of LB cases in Europe are caused by four different Borrelia species expressing six different OspA serotypes, whereas in the US only one of these serotypes is present. Immunization with the outer surface protein A (OspA) can prevent infection and the C-terminal part of OspA is sufficient for protection against infection transmitted by Ixodes ticks. Here we show that the order of the stabilized monomeric OspA fragments making up the heterodimers in our LB vaccine does not influence the induced immunogenicity and protection. Using bioinformatics analysis (surface electrostatics), we have designed an improved version of an LB vaccine which has an increased immunogenicity for OspA serotype 3 and an optimized expression and purification profile. The OspA heterodimers were highly purified with low amounts of endotoxin, host cell proteins and host cell DNA. All three proteins were at least 85% triacylated which ensured high immunogenicity. The LB vaccine presented here was designed, produced and characterized to a level which warrants further development as a second generation human LB vaccine.

  7. Tenure: How Due Process Protects Teachers and Students

    ERIC Educational Resources Information Center

    Kahlenberg, Richard D.

    2015-01-01

    Teacher tenure rights, first established more than a century ago, are under unprecedented attack. Tenure--which was enacted to protect students' education and those who provide it--is under assault from coast to coast, in state legislatures, in state courtrooms, and in the media. In June 2014, in the case of "Vergara v. California," a…

  8. Automated dynamic fed-batch process and media optimization for high productivity cell culture process development.

    PubMed

    Lu, Franklin; Toh, Poh Choo; Burnett, Iain; Li, Feng; Hudson, Terry; Amanullah, Ashraf; Li, Jincai

    2013-01-01

    Current industry practices for large-scale mammalian cell cultures typically employ a standard platform fed-batch process with fixed volume bolus feeding. Although widely used, these processes are unable to respond to actual nutrient consumption demands from the culture, which can result in accumulation of by-products and depletion of certain nutrients. This work demonstrates the application of a fully automated cell culture control, monitoring, and data processing system to achieve significant productivity improvement via dynamic feeding and media optimization. Two distinct feeding algorithms were used to dynamically alter feed rates. The first method is based upon on-line capacitance measurements where cultures were fed based on growth and nutrient consumption rates estimated from integrated capacitance. The second method is based upon automated glucose measurements obtained from the Nova Bioprofile FLEX® autosampler where cultures were fed to maintain a target glucose level which in turn maintained other nutrients based on a stoichiometric ratio. All of the calculations were done automatically through in-house integration with a Delta V process control system. Through both media and feed strategy optimization, a titer increase from the original platform titer of 5 to 6.3 g/L was achieved for cell line A, and a substantial titer increase of 4 to over 9 g/L was achieved for cell line B with comparable product quality. Glucose was found to be the best feed indicator, but not all cell lines benefited from dynamic feeding and optimized feed media was critical to process improvement. Our work demonstrated that dynamic feeding has the ability to automatically adjust feed rates according to culture behavior, and that the advantage can be best realized during early and rapid process development stages where different cell lines or large changes in culture conditions might lead to dramatically different nutrient demands.
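
    A heavily simplified sketch of the glucose-indexed feeding idea described above, assuming each bolus of concentrated feed is sized to return the measured glucose to its setpoint; the parameter values are hypothetical and dilution by the added feed is neglected.

        def feed_volume_L(glucose_meas_gL, glucose_set_gL, culture_vol_L, feed_glucose_gL):
            """Bolus feed volume (L) needed to bring culture glucose back to the setpoint."""
            deficit = max(glucose_set_gL - glucose_meas_gL, 0.0)   # g/L below target
            return deficit * culture_vol_L / feed_glucose_gL       # other nutrients scale stoichiometrically

        # 1.5 g/L measured, 4.0 g/L setpoint, 2 L culture, 500 g/L glucose in feed -> 0.01 L bolus
        print(feed_volume_L(1.5, 4.0, 2.0, 500.0))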

  9. Process development in the QbD paradigm: Role of process integration in process optimization for production of biotherapeutics.

    PubMed

    Rathore, Anurag S; Pathak, Mili; Godara, Avinash

    2016-03-01

    Biotherapeutics have become the focus of the pharmaceutical industry due to their proven effectiveness in managing complex diseases. Downstream processes of these molecules consist of several orthogonal, high resolution unit operations designed so as to be able to separate variants having very similar physicochemical properties. Typical process development involves optimization of the individual unit operations based on Quality by Design principles in order to define the design space within which the process can deliver product that meets the predefined specifications. However, limited efforts are dedicated to understanding the interactions between the unit operations. This paper aims to showcase the importance of understanding these interactions and thereby arrive at operating conditions that are optimal for the overall process. It is demonstrated that these are not necessarily same as those obtained from optimization of the individual unit operations. Purification of Granulocyte Colony Stimulating Factor (G-CSF), a biotherapeutic expressed in E. coli., has been used as a case study. It is evident that the suggested approach results in not only higher yield (91.5 vs. 86.4) but also improved product quality (% RP-HPLC purity of 98.3 vs. 97.5) and process robustness. We think that this paper is very relevant to the present times when the biotech industry is in the midst of implementing Quality by Design towards process development. © 2015 American Institute of Chemical Engineers Biotechnol. Prog., 32:355-362, 2016.

  10. Process development in the QbD paradigm: Role of process integration in process optimization for production of biotherapeutics.

    PubMed

    Rathore, Anurag S; Pathak, Mili; Godara, Avinash

    2016-03-01

    Biotherapeutics have become the focus of the pharmaceutical industry due to their proven effectiveness in managing complex diseases. Downstream processes for these molecules consist of several orthogonal, high-resolution unit operations designed to separate variants having very similar physicochemical properties. Typical process development involves optimization of the individual unit operations based on Quality by Design principles in order to define the design space within which the process can deliver product that meets the predefined specifications. However, limited efforts are dedicated to understanding the interactions between the unit operations. This paper aims to showcase the importance of understanding these interactions and thereby arrive at operating conditions that are optimal for the overall process. It is demonstrated that these are not necessarily the same as those obtained from optimization of the individual unit operations. Purification of Granulocyte Colony Stimulating Factor (G-CSF), a biotherapeutic expressed in E. coli, has been used as a case study. It is evident that the suggested approach results in not only higher yield (91.5 vs. 86.4) but also improved product quality (% RP-HPLC purity of 98.3 vs. 97.5) and process robustness. We think that this paper is very relevant to the present times when the biotech industry is in the midst of implementing Quality by Design towards process development. © 2015 American Institute of Chemical Engineers Biotechnol. Prog., 32:355-362, 2016. PMID:26588604

  11. Processing and optimization of functional ceramic coatings and inorganic nanomaterials

    NASA Astrophysics Data System (ADS)

    Nyutu, Edward Kennedy G.

    Processing of functional inorganic materials, including zero-dimensional (0-D) (e.g., nanoparticles), 1-D (nanorods, nanofibers), and 2-D (films/coatings) structures, is of fundamental and technological interest. This research will have two major sections. The first part of section one focuses on the deposition of silicon dioxide onto a pre-deposited molybdenum disilicide coating on molybdenum substrates for both high (>1000 °C) and moderate (500-600 °C) temperature oxidation protection. Chemical vapor deposition (CVD/MOCVD) techniques will be utilized to deposit the metal silicide and oxide coatings. The focus of this study will be to establish optimum deposition conditions and evaluate the metal oxide coating as an oxidation/thermal barrier for Mo substrates under both isothermal (static) and cyclic oxidation conditions. The second part of this section will involve a systematic evaluation of a boron nitride (BN) interface coating prepared by chemical vapor deposition. Ceramic matrix composites (CMCs) are prospective candidates for high (>1000 °C) temperature applications, and fiber-matrix interfaces are the dominant design parameters in CMCs. An important goal of the study is to determine a set of process parameters which would define a boron nitride (BN) interface coating by a chemical vapor deposition (CVD) process with respect to coating. In the first part of the second section, we will investigate a new approach to synthesize ultrafine metal oxides that combines microwave heating and in-situ ultrasonic mixing of two or more liquid precursors with a tubular flow reactor. Different metal oxides such as nickel ferrite and zinc aluminate spinels will be studied. The synthesis of metal oxides was investigated in order to study the effects of the nozzle and microwave (INM process) on the purity, composition, and particle size of the resulting powders. The second part of this research section involves a study of microwave frequency

  12. Chip Design Process Optimization Based on Design Quality Assessment

    NASA Astrophysics Data System (ADS)

    Häusler, Stefan; Blaschke, Jana; Sebeke, Christian; Rosenstiel, Wolfgang; Hahn, Axel

    2010-06-01

    Nowadays, the managing of product development projects is increasingly challenging. Especially the IC design of ASICs with both analog and digital components (mixed-signal design) is becoming more and more complex, while the time-to-market window narrows at the same time. Still, high quality standards must be fulfilled. Projects and their status are becoming less transparent due to this complexity. This makes the planning and execution of projects rather difficult. Therefore, there is a need for efficient project control. A main challenge is the objective evaluation of the current development status. Are all requirements successfully verified? Are all intermediate goals achieved? Companies often develop special solutions that are not reusable in other projects. This makes the quality measurement process itself less efficient and produces too much overhead. The method proposed in this paper is a contribution to solve these issues. It is applied at a German design house for analog mixed-signal IC design. This paper presents the results of a case study and introduces an optimized project scheduling on the basis of quality assessment results.

  13. Optimization of segmented alignment marks for advanced semiconductor fabrication processes

    NASA Astrophysics Data System (ADS)

    Wu, Qiang; Lu, Zhijian G.; Williams, Gary; Zach, Franz X.; Liegl, Bernhard

    2001-08-01

    The continued downscaling of semiconductor fabrication ground rules has imposed increasingly tighter overlay tolerances, which become very challenging at the 100 nm lithographic node. Such tight tolerances will require very high alignment performance. Past experience indicates that good alignment depends largely on alignment signal quality, which, however, can be strongly affected by chip design and various fabrication processes. Under some extreme circumstances, signal quality can even be reduced to the non-usable limit. Therefore, a systematic understanding of alignment marks and a method to predict alignment performance based on mark design are necessary. Motivated by this, we have performed a detailed study of bright-field segmented alignment marks that are used in current state-of-the-art fabrication processes. We find that alignment marks at different lithographic levels can be organized into four basic categories: trench mark, metal mark, damascene mark, and combo mark. The basic principles of these four types of marks turn out to be so similar that they can be characterized within the theoretical framework of a simple model based on optical gratings. An analytic expression has been developed for such a model, and it has been tested using computer simulation with the rigorous time-domain finite-difference (TD-FD) algorithm TEMPEST. Consistent results have been obtained, indicating that the mark signal can be significantly improved through the optimization of mark lateral dimensions, such as segment pitch and segment width. We have also compared simulation studies against experimental data for alignment marks at one typical lithographic level, and good agreement is found.
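
    For intuition about why segment duty cycle matters, a textbook scalar-grating estimate (not the analytic expression developed in the paper) is sketched below; the phase depth and duty cycles are purely illustrative.

```python
import numpy as np

# First-order diffraction efficiency of an ideal binary phase grating
# (scalar approximation): eta_1 = (4/pi^2) * sin^2(phi/2) * sin^2(pi*d),
# where phi is the phase depth of the mark step and d is the duty cycle
# (segment width / segment pitch).
def first_order_efficiency(phase_depth_rad: float, duty_cycle: float) -> float:
    return (4.0 / np.pi**2) * np.sin(phase_depth_rad / 2.0)**2 \
           * np.sin(np.pi * duty_cycle)**2

# Example: a phase depth of pi (e.g. a quarter-wave-deep mark in reflection)
# and a sweep of duty cycles shows the familiar maximum at d = 0.5,
# i.e. equal line and space in the segmentation.
for d in (0.2, 0.35, 0.5, 0.65, 0.8):
    print(d, round(first_order_efficiency(np.pi, d), 3))
```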

  14. Supramodal processing optimizes visual perceptual learning and plasticity.

    PubMed

    Zilber, Nicolas; Ciuciu, Philippe; Gramfort, Alexandre; Azizi, Leila; van Wassenhove, Virginie

    2014-06-01

    Multisensory interactions are ubiquitous in cortex and it has been suggested that sensory cortices may be supramodal i.e. capable of functional selectivity irrespective of the sensory modality of inputs (Pascual-Leone and Hamilton, 2001; Renier et al., 2013; Ricciardi and Pietrini, 2011; Voss and Zatorre, 2012). Here, we asked whether learning to discriminate visual coherence could benefit from supramodal processing. To this end, three groups of participants were briefly trained to discriminate which of a red or green intermixed population of random-dot-kinematograms (RDKs) was most coherent in a visual display while being recorded with magnetoencephalography (MEG). During training, participants heard no sound (V), congruent acoustic textures (AV) or auditory noise (AVn); importantly, congruent acoustic textures shared the temporal statistics - i.e. coherence - of visual RDKs. After training, the AV group significantly outperformed participants trained in V and AVn although they were not aware of their progress. In pre- and post-training blocks, all participants were tested without sound and with the same set of RDKs. When contrasting MEG data collected in these experimental blocks, selective differences were observed in the dynamic pattern and the cortical loci responsive to visual RDKs. First and common to all three groups, vlPFC showed selectivity to the learned coherence levels whereas selectivity in visual motion area hMT+ was only seen for the AV group. Second and solely for the AV group, activity in multisensory cortices (mSTS, pSTS) correlated with post-training performances; additionally, the latencies of these effects suggested feedback from vlPFC to hMT+ possibly mediated by temporal cortices in AV and AVn groups. Altogether, we interpret our results in the context of the Reverse Hierarchy Theory of learning (Ahissar and Hochstein, 2004) in which supramodal processing optimizes visual perceptual learning by capitalizing on sensory

  15. Optimization of a novel enzyme treatment process for early-stage processing of sheepskins.

    PubMed

    Lim, Y F; Bronlund, J E; Allsop, T F; Shilton, A N; Edmonds, R L

    2010-01-01

    An enzyme treatment process for early-stage processing of sheepskins has been previously reported by the Leather and Shoe Research Association of New Zealand (LASRA) as an alternative to current industry operations. The newly developed process had marked benefits over conventional processing in terms of reduced energy usage (73%), processing time (47%), and water use (49%), but had been developed as a "proof of principle". The objective of this work was to develop the process further to a stage ready for adoption by industry. Mass balancing was used to investigate potential modifications for the process based on the understanding developed from a detailed analysis of preliminary design trials. Results showed that a configuration utilising a two-stage counter-current system for the washing stages, with segregation and recycling of the enzyme float prior to dilution in the neutralization stage, was a significant improvement. Benefits over conventional processing include a 50% reduction of residual TDS at the washing stages and 70% savings on water use overall. Benefits over the un-optimized LASRA process are a 30% reduction of solids in the product after the enzyme treatment and neutralization stages, additional water savings of 21%, and a 10% saving in enzyme usage. PMID:20861557
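
    To illustrate the kind of mass balancing mentioned above, the sketch below solves a steady-state TDS balance for a generic two-stage counter-current wash; all stream volumes and concentrations are assumed for illustration and are not taken from the study.

```python
import numpy as np

# Two perfectly mixed wash stages in counter-current arrangement: skins move
# stage 1 -> stage 2 while fresh water enters stage 2 and leaves from stage 1.
# Unknowns: TDS concentration c1, c2 in each stage (kg/m3).
L = 0.8    # liquor carried with the skins between stages, m3 per batch (assumed)
W = 2.0    # counter-current wash water flow, m3 per batch (assumed)
c0 = 25.0  # TDS carried in with the skins, kg/m3 (assumed)

# Stage 1 balance: L*c0 + W*c2 = (L + W)*c1
# Stage 2 balance: L*c1 + W*0  = (L + W)*c2
A = np.array([[L + W, -W],
              [-L,     L + W]])
b = np.array([L * c0, 0.0])
c1, c2 = np.linalg.solve(A, b)
print(f"stage 1 TDS = {c1:.1f} kg/m3, residual TDS leaving with skins = {c2:.1f} kg/m3")
```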

  16. Development of processing techniques for advanced thermal protection materials

    NASA Technical Reports Server (NTRS)

    Selvaduray, Guna S.

    1995-01-01

    The main purpose of this work has been the development and characterization of materials for high temperature applications. Thermal Protection Systems (TPS) are constantly being tested and evaluated for increased thermal shock resistance, high temperature dimensional stability, and tolerance to environmental effects. Materials development was carried out through the use of many different instruments and methods, ranging from extensive elemental analysis to physical attributes testing. The six main focus areas include: (1) protective coatings for carbon/carbon composites; (2) TPS material characterization; (3) improved waterproofing for TPS; (4) modified ceramic insulation for bone implants; (5) improved durability ceramic insulation blankets; and (6) ultra-high temperature ceramics. This report describes the progress made in these research areas during this contract period.

  17. Concept of data processing in multisensor system for perimeter protection

    NASA Astrophysics Data System (ADS)

    Dulski, R.; Kastek, M.; Trzaskawka, P.; Piątkowski, T.; Szustakowski, M.; Życzkowski, M.

    2011-06-01

    The nature of recent terrorist attacks and military conflicts, as well as the necessity to protect bases, convoys and patrols, has given serious impetus to the development of more effective security systems. The perimeter-protection concepts based on zone sensors that have been widely used so far will be replaced in the near future by multi-sensor systems. Such systems can utilize day/night cameras, uncooled IR thermal cameras, and millimeter-wave radars detecting radiation reflected from the target. The ranges of detection, recognition and identification for all targets depend on the parameters of the sensors used and on the observed scene itself. Apart from the sensors, the most important elements that influence system effectiveness are intelligent data analysis and a proper data fusion algorithm. A multi-sensor protection system allows a significant improvement in the probability of intruder detection. The concept of data fusion in a multi-sensor system is introduced. It is based on an image fusion algorithm that allows intruders to be visualized and tracked under any conditions.
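
    As a purely illustrative example of pixel-level fusion (the paper's algorithm is not reproduced here), the sketch below blends co-registered visible and thermal frames and flags hot pixels; the arrays, weight, and threshold are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
visible = rng.random((240, 320))   # normalized day/night camera frame (synthetic)
thermal = rng.random((240, 320))   # normalized, co-registered IR frame (synthetic)

alpha = 0.6                         # weight given to the visible channel (assumed)
fused = alpha * visible + (1.0 - alpha) * thermal

# Emphasize warm objects (potential intruders) detected in the thermal channel.
hot_mask = thermal > 0.95
fused[hot_mask] = 1.0
print("fused frame:", fused.shape, "hot pixels:", int(hot_mask.sum()))
```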

  18. Optimization of Prime-Boost Vaccination Strategies Against Mouse-Adapted Ebolavirus in a Short-Term Protection Study.

    PubMed

    Aviles, Jenna; Bello, Alexander; Wong, Gary; Fausther-Bovendo, Hugues; Qiu, Xiangguo; Kobinger, Gary

    2015-10-01

    In nonhuman primates, complete protection against an Ebola virus (EBOV) challenge has previously been achieved after a single injection with several vaccine platforms. However, long-term protection against EBOV after a single immunization has not been demonstrated to this date. Interestingly, prime-boost regimens have demonstrated longer protection against EBOV challenge, compared with single immunizations. Since prime-boost regimens have the potential to achieve long-term protection, determining optimal vector combinations is crucial. However, testing prime-boost efficiency in long-term protection studies is time consuming and resource demanding. Here, we investigated the optimal prime-boost combination, using DNA, porcine-derived adeno-associated virus serotype 6 (AAV-po6), and human adenovirus serotype 5 (Ad5) vector, in a short-term protection study in the mouse model of EBOV infection. In addition, we also investigated which immune parameters were indicative of a strong boost. Each vaccine platform was titrated in mice to identify which dose (single immunization) induced approximately 20% protection after challenge with a mouse-adapted EBOV. These doses were then used to determine the protection efficacy of various prime-boost combinations, using the same mouse model. In addition, humoral and cellular immune responses against EBOV glycoprotein were analyzed by an enzyme-linked immunosorbent assay, a neutralizing antibody assay, and an interferon γ-specific enzyme-linked immunospot assay. When DNA was used as a prime, Ad5 boost induced the best protection, which correlated with a higher cellular response. In contrast, when AAV-po6 or Ad5 were injected first, better protection was achieved after DNA boost, and this correlated with a higher total glycoprotein-specific immunoglobulin G titer. Prime-boost regimens using independent vaccine platforms may provide a useful strategy to induce long-term immune protection against filoviruses.

  19. Optimization of Prime-Boost Vaccination Strategies Against Mouse-Adapted Ebolavirus in a Short-Term Protection Study.

    PubMed

    Aviles, Jenna; Bello, Alexander; Wong, Gary; Fausther-Bovendo, Hugues; Qiu, Xiangguo; Kobinger, Gary

    2015-10-01

    In nonhuman primates, complete protection against an Ebola virus (EBOV) challenge has previously been achieved after a single injection with several vaccine platforms. However, long-term protection against EBOV after a single immunization has not been demonstrated to this date. Interestingly, prime-boost regimens have demonstrated longer protection against EBOV challenge, compared with single immunizations. Since prime-boost regimens have the potential to achieve long-term protection, determining optimal vector combinations is crucial. However, testing prime-boost efficiency in long-term protection studies is time consuming and resource demanding. Here, we investigated the optimal prime-boost combination, using DNA, porcine-derived adeno-associated virus serotype 6 (AAV-po6), and human adenovirus serotype 5 (Ad5) vector, in a short-term protection study in the mouse model of EBOV infection. In addition, we also investigated which immune parameters were indicative of a strong boost. Each vaccine platform was titrated in mice to identify which dose (single immunization) induced approximately 20% protection after challenge with a mouse-adapted EBOV. These doses were then used to determine the protection efficacy of various prime-boost combinations, using the same mouse model. In addition, humoral and cellular immune responses against EBOV glycoprotein were analyzed by an enzyme-linked immunosorbent assay, a neutralizing antibody assay, and an interferon γ-specific enzyme-linked immunospot assay. When DNA was used as a prime, Ad5 boost induced the best protection, which correlated with a higher cellular response. In contrast, when AAV-po6 or Ad5 were injected first, better protection was achieved after DNA boost, and this correlated with a higher total glycoprotein-specific immunoglobulin G titer. Prime-boost regimens using independent vaccine platforms may provide a useful strategy to induce long-term immune protection against filoviruses. PMID:26038398

  20. Optimized process parameters for fabricating metal particles reinforced 5083 Al composite by friction stir processing

    PubMed Central

    Bauri, Ranjit; Yadav, Devinder; Shyam Kumar, C.N.; Janaki Ram, G.D.

    2015-01-01

    Metal matrix composites (MMCs) exhibit improved strength but suffer from low ductility. Reinforcement with metal particles can be an alternative to retain the ductility in MMCs (Bauri and Yadav, 2010; Thakur and Gupta, 2007) [1,2]. However, processing such composites by conventional routes is difficult. The data presented here relate to friction stir processing (FSP), which was used to process metal-particle-reinforced aluminum matrix composites. The data comprise the processing parameters, tool rotation and traverse speeds, which were optimized to incorporate Ni particles. A wide range of parameters covering tool rotation speeds from 1000 rpm to 1800 rpm and traverse speeds from 6 mm/min to 24 mm/min was explored in order to get a defect-free stir zone and uniform distribution of particles. The right combination of rotation and traverse speed was found from these experiments. Both as-received coarse particles (70 μm) and ball-milled finer particles (10 μm) were incorporated in the Al matrix using the optimized parameters. PMID:26566541

  1. Equal Protection and Due Process: Contrasting Methods of Review under Fourteenth Amendment Doctrine.

    ERIC Educational Resources Information Center

    Hughes, James A.

    1979-01-01

    Argues that the Court has, at times, confused equal protection and due process methods of review, primarily by employing interest balancing in certain equal protection cases that should have been subjected to due process analysis. Available from Harvard Civil Rights-Civil Liberties Law Review, Harvard Law School, Cambridge, MA 02138; sc $4.00.…

  2. Process for producing radiation-induced self-terminating protective coatings on a substrate

    DOEpatents

    Klebanoff, Leonard E.

    2001-01-01

    A gas and radiation are used to produce a protective coating that is substantially void-free on the molecular scale, self-terminating, and degradation resistant. The process can be used to deposit very thin (≈5-20 Å) coatings on critical surfaces needing protection from degradative processes, including corrosion and contamination.

  3. Optimization of the chemical vapor deposition process for fabrication of carbon nanotube/Al composite powders

    SciTech Connect

    He, C.N.; Zhao, N.Q.; Shi, C.S.; Song, S.Z.

    2010-09-15

    In order to optimize the chemical vapor deposition process for fabrication of carbon nanotube/Al composite powders, the effect of different reaction conditions (such as reaction temperature, reaction time, and reaction gas ratio) on the morphological and structural development of the powder and the dispersion of CNTs in the Al powder was investigated using transmission electron microscopy. The results showed that low temperatures (500-550 °C) give rise to herringbone-type carbon nanofibers and high temperatures (600-630 °C) lead to multi-walled CNTs. Long reaction times broaden the CNT size distribution and increase the CNT yield. An appropriate nitrogen flow is preferred for CNT growth, but high and low nitrogen flows result in carbon nanospheres and CNTs with coarse surfaces, respectively. The above results show that appropriate parameters are effective in dispersing the nanotubes in the Al powder, which simultaneously protects the nanotubes from damage.

  4. Coupled finite element simulation and optimization of single- and multi-stage sheet-forming processes

    NASA Astrophysics Data System (ADS)

    Tamasco, Cynthia M.; Rais-Rohani, Masoud; Buijk, Arjaan

    2013-03-01

    This article presents the development and application of a coupled finite element simulation and optimization framework that can be used for design and analysis of sheet-forming processes of varying complexity. The entire forming process from blank gripping and deep drawing to tool release and springback is modelled. The dies, holders, punch and workpiece are modelled with friction, temperature, holder force and punch speed controlled in the process simulation. Both single- and multi-stage sheet-forming processes are investigated. Process simulation is coupled with a nonlinear gradient-based optimization approach for optimizing single or multiple design objectives with imposed sheet-forming response constraints. A MATLAB program is developed and used for data-flow management between process simulation and optimization codes. Thinning, springback, damage and forming limit diagram are used to define failure in the forming process design optimization. Design sensitivity analysis and optimization results of the example problems are presented and discussed.
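
    A minimal sketch of this coupling pattern is shown below, with a toy stand-in for the FE forming simulation, springback as the objective, and thinning as a constraint; the response functions, bounds, limits, and variable names are assumptions rather than the framework's actual interfaces.

```python
from scipy.optimize import minimize

def run_forming_simulation(x):
    holder_force, punch_speed = x
    # Placeholder response surfaces; a real framework would launch the FE
    # solver here and parse springback/thinning from its output files.
    springback = (holder_force - 30.0) ** 2 / 100.0 + 0.05 * punch_speed
    thinning = 0.08 + 0.002 * holder_force - 0.001 * punch_speed
    return springback, thinning

objective = lambda x: run_forming_simulation(x)[0]
max_thinning = 0.12                      # assumed failure limit
constraints = [{"type": "ineq",
                "fun": lambda x: max_thinning - run_forming_simulation(x)[1]}]

result = minimize(objective, x0=[20.0, 10.0], method="SLSQP",
                  bounds=[(5.0, 60.0), (1.0, 30.0)], constraints=constraints)
print("optimal holder force / punch speed:", result.x)
```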

  5. Wear Protection of AJ62 Mg Engine Blocks using Plasma Electrolytic Oxidation Process

    NASA Astrophysics Data System (ADS)

    Zhang, Peng

    2011-12-01

    In order to reduce fuel consumption and pollution, automotive companies are developing magnesium-intensive components. However, due to the low wear resistance of magnesium (Mg) alloys, Mg cylinder bores are vulnerable to sliding wear attack. In this thesis, two approaches were used to protect the cylinder bores, made of a newly developed Mg engine alloy, AJ62 (MgAl6Mn0.34Sr2). The first was to use a Plasma Electrolytic Oxidation (PEO) process to produce oxide coatings on the Mg bores. The wear properties of the PEO coatings were evaluated by sliding wear tests under boundary lubrication conditions at room and elevated temperatures. It was found that, due to substrate softening and vaporization loss of the lubricant, the tribological properties of the PEO coatings deteriorated at the elevated temperature. In order to optimize the PEO process, a statistical method (response surface method) was used to analyze the effects of the 4 main PEO process parameters, each at 2 levels, and their interactions on the tribological properties of the PEO coatings at room and elevated temperatures, individually. A cylinder liner made of an economical metal-matrix composite (MMC) was another approach to improve the wear resistance of the Mg cylinder bore. In this thesis, an Al383/SiO2 MMC was designed to replace the expensive Alusil alloy used in the BMW Mg/Al composite engine to build the cylinder liner. To further increase the wear resistance of the MMC, the PEO process was also used to form an oxide coating on the MMC. The effects of the SiO2 content and coating thickness on the tribological properties of the MMC were studied. To evaluate the wear properties of the optimal PEO coated Mg coupons and the MMC with the oxide coatings, Alusil and cast iron, currently used on the cylinder bores of commercial aluminum engines, were used as reference materials. The optimal PEO coated Mg coupons and the oxidized MMC showed their advantages over the

  6. Designing and testing broadly-protective filoviral vaccines optimized for cytotoxic T-lymphocyte epitope coverage.

    PubMed

    Fenimore, Paul W; Muhammad, Majidat A; Fischer, William M; Foley, Brian T; Bakken, Russell R; Thurmond, James R; Yusim, Karina; Yoon, Hyejin; Parker, Michael; Hart, Mary Kate; Dye, John M; Korber, Bette; Kuiken, Carla

    2012-01-01

    We report the rational design and in vivo testing of mosaic proteins for a polyvalent pan-filoviral vaccine using a computational strategy designed for the Human Immunodeficiency Virus type 1 (HIV-1) but also appropriate for Hepatitis C virus (HCV) and potentially other diverse viruses. Mosaics are sets of artificial recombinant proteins that are based on natural proteins. The recombinants are computationally selected using a genetic algorithm to optimize the coverage of potential cytotoxic T lymphocyte (CTL) epitopes. Because evolutionary history differs markedly between HIV-1 and filoviruses, we devised an adapted computational technique that is effective for sparsely sampled taxa; our first significant result is that the mosaic technique is effective in creating high-quality mosaic filovirus proteins. The resulting coverage of potential epitopes across filovirus species is superior to coverage by any natural variants, including current vaccine strains with demonstrated cross-reactivity. The mosaic cocktails are also robust: mosaics substantially outperformed natural strains when computationally tested against poorly sampled species and more variable genes. Furthermore, in a computational comparison of cross-reactive potential a design constructed prior to the Bundibugyo outbreak performed nearly as well against all species as an updated design that included Bundibugyo. These points suggest that the mosaic designs would be more resilient than natural-variant vaccines against future Ebola outbreaks dominated by novel viral variants. We demonstrate in vivo immunogenicity and protection against a heterologous challenge in a mouse model. This design work delineates the likely requirements and limitations on broadly-protective filoviral CTL vaccines.
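
    The coverage objective behind mosaic design can be pictured with the toy sketch below, which scores small cocktails of candidate sequences by the fraction of population 9-mers they cover; the sequences are synthetic and the brute-force pairwise search merely stands in for the published genetic algorithm.

```python
from itertools import combinations

def kmers(seq, k=9):
    """All k-mers (potential CTL epitopes) contained in a protein sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

# Toy "natural strains" and one artificial recombinant (all hypothetical).
natural = ["MEDYQKLVA" * 4, "MEDFQKLIA" * 4, "MQDYQRLVA" * 4]
population_kmers = set().union(*(kmers(s) for s in natural))
candidates = natural + ["MEDYQKLIA" * 4]

best_pair, best_cov = None, -1.0
for pair in combinations(candidates, 2):        # evaluate 2-valent cocktails
    covered = set().union(*(kmers(s) for s in pair)) & population_kmers
    cov = len(covered) / len(population_kmers)
    if cov > best_cov:
        best_pair, best_cov = pair, cov
print(f"best 2-protein cocktail covers {best_cov:.1%} of population 9-mers")
```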

  7. Plasma Spray and Pack Cementation Process Optimization and Oxidation Behaviour of Novel Multilayered Coatings

    NASA Astrophysics Data System (ADS)

    Gao, Feng

    The hot section components in gas turbines are subjected to a harsh environment, with the temperature being increased continuously. The higher temperature has directly resulted in severe oxidation of these components. Monolithic coatings such as MCrAlY and aluminide have traditionally been used to protect the components from oxidation; however, increased operating temperature quickly deteriorates the coatings due to accelerated diffusion of aluminum in the coatings. To improve the oxidation resistance, a group of multilayered coatings are developed in this study. The multilayered coatings consist of a Cr-Si co-deposited layer as the diffusion barrier, a plasma sprayed NiCrAlY coating as the middle layer and an aluminized top layer. The Cr-Si and aluminized layers are fabricated using pack cementation processes and the NiCrAlY coatings are produced using the Mettech Axial III(TM) System. All of the coating processes are optimized using the methodology of Design of Experiments (DOE) and the results are analyzed using statistical methods. The optimal processes are adopted to fabricate the multilayered coatings for oxidation tests. The coatings are exposed in air at 1050°C and 1150°C for 1000 hr. The results indicate that a Cr layer and a silicon-rich barrier layer have formed at the interface between the Cr-Si coating and the NiCrAlY coating. This barrier layer not only prevents aluminum and chromium from diffusing into the substrate, but also impedes the diffusion of other elements from the substrate into the coating. The results also reveal that, for optimal oxidation resistance at 1050°C, the top layer in a multilayered coating should have an Al/Ni ratio of at least one, whereas the multilayered coating with an Al/Ni ratio of two in the top layer exhibits the best oxidation resistance at 1150°C. The DOE methodology provides an excellent means for process optimization and the selection of the oxidation test matrix, and also offers a more thorough understanding of the

  8. A New Approach of Improving Rain Erosion Resistance of Nanocomposite Sol-Gel Coatings by Optimization Process Factors

    NASA Astrophysics Data System (ADS)

    Hojjati Najafabadi, Akbar; Shoja Razavi, Reza; Mozaffarinia, Reza; Rahimi, Hamed

    2014-05-01

    Erosion-protection nanocomposite sol-gel coatings based on tetraethylorthosilicate (TEOS) and 3-glycidoxypropyltrimethoxysilane (GPTMS) are prepared and characterized to protect marine structures susceptible to damage caused by liquid impact, e.g., the submarine body. This study focuses on the optimization of the compositional and process parameters of transparent hybrid nanocomposite sol-gel coatings resistant to rain erosion, using a statistical design of experiments (DoE) methodology based on a Taguchi orthogonal design. The impact of the compositional and process parameters of the coatings on the erosion protection performance is investigated by a five-factor, four-level design methodology. Hybrid coatings were deposited on AA5083 by a dip coating technique. The optimized coatings are analyzed regarding their adhesion (pull-off), flexibility (impact and mandrel bending), hardness (pencil), wear (Taber wear index), and rain erosion resistance (stationary sample erosion test). The surface morphology and roughness were studied by field-emission scanning electron microscopy (FE-SEM) and atomic force microscopy (AFM). The optimized coatings showed excellent flexibility and adhesion to the substrate with a smooth nanostructured surface; the RMS surface roughness was 1.85 nm. The evaluation of the abrasion results shows cohesive and interfacial wear with abrasive and adhesive mechanisms, respectively. Liquid impact results show cohesive failure of the coatings without any sign of delamination.

  9. Process Optimization of Bismaleimide (BMI) Resin Infused Carbon Fiber Composite

    NASA Technical Reports Server (NTRS)

    Ehrlich, Joshua W.; Tate, LaNetra C.; Cox, Sarah B.; Taylor, Brian J.; Wright, M. Clara; Caraccio, Anne J.; Sampson, Jeffery W.

    2013-01-01

    Bismaleimide (BMI) resins are an attractive new addition to worldwide composite applications. This type of thermosetting polyimide provides several unique characteristics, such as excellent physical property retention at elevated temperatures and in wet environments, constant electrical properties over a vast array of temperature settings, and nonflammability. This makes BMI a popular choice in advanced composites and electronics applications [1]. Bismaleimide-2 (BMI-2) resin was used to infuse intermediate modulus 7 (IM7) based carbon fiber. Two panel configurations consisting of 4 plies with [+45°, 90°]2 and [0°]4 orientations were fabricated. For tensile testing, a [90°]4 configuration was tested by rotating the [0°]4 configuration to lie orthogonal to the load direction of the test fixture. Curing of the BMI-2/IM7 system utilized an optimal infusion process which focused on the integration of the manufacturer-recommended ramp rates, hold times, and cure temperatures. Completion of the cure cycle for the BMI-2/IM7 composite yielded a product with multiple surface voids, determined through visual and metallographic observation. Although the curing cycle was the same for the three panel layups, the surface voids that remained within the material post-cure differed in abundance, shape, and size. For tensile testing, the [0°]4 layup had a 19.9% and 21.7% greater average tensile strain performance at failure compared to the [90°]4 and [+45°, 90°, 90°, -45°] layups, respectively. For tensile stress performance, the [0°]4 layup had a 5.8% and 34.0% greater average performance than the [90°]4 and [+45°, 90°, 90°, -45°] layups.

  10. Advanced landfill leachate treatment using iron-carbon microelectrolysis- Fenton process: Process optimization and column experiments.

    PubMed

    Wang, Liqun; Yang, Qi; Wang, Dongbo; Li, Xiaoming; Zeng, Guangming; Li, Zhijun; Deng, Yongchao; Liu, Jun; Yi, Kaixin

    2016-11-15

    A novel hydrogen peroxide-enhanced iron-carbon (Fe-C) microelectrolysis reactor was proposed for the pretreatment of mature landfill leachate. This reactor, combining microelectrolysis with the Fenton process, showed high treatment efficiency. The operating variables, including Fe-C dosage, H2O2 concentration and initial pH, were optimized by response surface methodology (RSM), with the chemical oxygen demand (COD) removal efficiency and the biochemical oxygen demand to chemical oxygen demand ratio (BOD5/COD) as the responses. The highest COD removal (74.59%) and BOD5/COD (0.50) were obtained at the optimal conditions of Fe-C dosage 55.72 g/L, H2O2 concentration 12.32 mL/L and initial pH 3.12. Three-dimensional excitation and emission matrix (3D-EEM) fluorescence spectroscopy and molecular weight (MW) distribution demonstrated that high molecular weight fractions, such as refractory fulvic-like substances in the leachate, were effectively destroyed during the combined processes, which should be attributed to the combined oxidative effect of microelectrolysis and Fenton. Fixed-bed column experiments were performed and the breakthrough curves at different flow rates were evaluated to determine the practical applicability of the combined process. All these results show that the hydrogen peroxide-enhanced iron-carbon (Fe-C) microelectrolysis reactor is a promising and efficient technology for the treatment of mature landfill leachate. PMID:27450338
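
    The response-surface step can be pictured with the sketch below, which fits a full quadratic model in the three factors to synthetic data and then locates its maximum; the design points, coefficients, and predicted optimum are invented for illustration and do not reproduce the study's fitted model.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
# Factors: Fe-C dosage (g/L), H2O2 (mL/L), initial pH -- ranges assumed.
X = rng.uniform([30, 5, 2], [80, 20, 5], size=(20, 3))

def quad_features(X):
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])

# Synthetic "measured" COD removal with an optimum near (56, 12, 3.1).
y = 75 - 0.01*(X[:, 0]-56)**2 - 0.2*(X[:, 1]-12)**2 - 4*(X[:, 2]-3.1)**2 \
    + rng.normal(0, 0.5, len(X))

beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)
neg_model = lambda x: -(quad_features(np.atleast_2d(x)) @ beta)[0]
opt = minimize(neg_model, x0=[55, 12, 3.5],
               bounds=[(30, 80), (5, 20), (2, 5)])
print("predicted optimum (dosage, H2O2, pH):", np.round(opt.x, 2))
```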

  11. A study of optimizing processes for metallized textile design application

    NASA Astrophysics Data System (ADS)

    Guo, Ronghui

    The purpose of this research is to find an optimum electroless plating process in order to obtain relatively low surface resistance and improve the functional properties and appearance of nickel-plated and copper-plated polyester fabrics. The optimization results indicate that the NiSO4 concentration and the temperature of the plating bath are the most important factors influencing the surface resistance of electroless nickel-plated polyester fabric. However, NiSO4 concentration and pH of the plating bath are the most significant factors affecting electroless copper plating. The microstructures and properties of nickel-, copper-, and nickel/copper multi-layer-plated polyester fabrics have been studied. In the case of electroless nickel plating, the nickel deposit layer becomes more uniform and continuous when prepared at higher NiSO4 concentration and higher bath temperature. As for electroless copper plating, the surface morphology of the copper deposits indicates that the average diameter of the particles increases with the rise of NiSO4 concentration and pH. The surface morphology of the nickel/copper multi-layer deposits reveals the presence of ultra-fine nodules, and the deposits are compact and uniform in size. There is an increase in EMI SE with the rise of Ni2+ concentration and bath temperature for electroless nickel plating, and EMI SE increases with the rise of Ni2+ concentration and pH of the plating solution for electroless copper plating on polyester fabric. With the same deposit weight, the EMI SE of the nickel/copper-plated fabric is much higher than that of the nickel-plated fabric, but slightly lower than that of the copper-plated fabric. However, the anti-corrosive property of the nickel/copper-plated fabric is significantly superior to that of the copper-plated fabric, but slightly inferior to that of the nickel-plated fabric. Design application effects have been explored by controlling the plating conditions. The electroless plating parameters play an

  12. Fast engineering optimization: A novel highly effective control parameterization approach for industrial dynamic processes.

    PubMed

    Liu, Ping; Li, Guodong; Liu, Xinggao

    2015-09-01

    Control vector parameterization (CVP) is an important approach to engineering optimization for industrial dynamic processes. However, its major defect, the low optimization efficiency caused by repeatedly solving the relevant differential equations in the generated nonlinear programming (NLP) problem, limits its wide application in engineering optimization for industrial dynamic processes. A novel, highly effective control parameterization approach, fast-CVP, is first proposed to improve the optimization efficiency for industrial dynamic processes, where the costate gradient formula is employed and a fast approximate scheme is presented to solve the differential equations in dynamic process simulation. Three well-known engineering optimization benchmark problems for industrial dynamic processes are used as illustrations. The research results show that the proposed fast approach achieves fine performance: at least 90% of the computation time can be saved compared with the traditional CVP method, which reveals the effectiveness of the proposed fast engineering optimization approach for the industrial dynamic processes.
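
    A minimal illustration of plain CVP (not the paper's fast-CVP variant) is sketched below: the control u(t) is parameterized as piecewise-constant values on N intervals, the state ODE is integrated for each candidate, and an NLP solver adjusts the N values; the dynamic system and objective are hypothetical.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

N, t_f = 8, 1.0
edges = np.linspace(0.0, t_f, N + 1)          # interval boundaries for u(t)

def simulate(u_segments):
    def rhs(t, x):
        # Select the piecewise-constant control value active at time t.
        u = u_segments[min(np.searchsorted(edges, t, side="right") - 1, N - 1)]
        dx1 = -(u + 0.5 * u**2) * x[0]        # reactant consumed (toy kinetics)
        dx2 = u * x[0]                        # product formed
        return [dx1, dx2]
    sol = solve_ivp(rhs, (0.0, t_f), [1.0, 0.0], max_step=t_f / 200)
    return sol.y[:, -1]

objective = lambda u: -simulate(u)[1]         # maximize final product
res = minimize(objective, x0=np.full(N, 0.5), method="L-BFGS-B",
               bounds=[(0.0, 1.0)] * N)
print("piecewise-constant control profile:", np.round(res.x, 2))
```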

  13. Integration of Product, Package, Process, and Environment: A Food System Optimization

    NASA Technical Reports Server (NTRS)

    Cooper, Maya R.; Douglas, Grace L.

    2015-01-01

    temperature and pressure were linked to final product quality in freeze-dried corn, indicating processing modifications that could lead to improved product shelf life. Storage temperatures and packaging systems were also assessed for the impact to food quality. Reduced temperature storage had inconclusive impact to the progression of rancidity in butter cookies. Frozen storage was detrimental to fruit and vegetable textural attributes but refrigerated storage helped to sustain color and organoleptic ratings for plant-based foods. With regard to packaging systems, the metallized film overwrap significantly decreased the progression of the rancidity of butter cookies as compared to the highest barrier non-metallized film. The inclusion of oxygen scavengers resulted in noticeable moisture gains in butter cookies over time, independent of packaging film systems. Neither emergent processing technology nor the freeze dry optimization resulted in compelling quality differences from current space food provisions such that a five-year shelf life is likely with these processing changes alone. Using a combination of refrigeration and PATS processing is expected to result in organoleptically-acceptable fruit quality for most fruits through five years. The vitamin degradation will be aided somewhat by the cold temperatures but, given the labile nature of vitamin C, a more stable fortification method, such as encapsulation, should also be investigated to ensure vitamin delivery throughout the product life. Similarly, significant improvement to the packaging film used in the MATS processing, optimization of formulation for dielectric properties, vitamin fortification, and reduced temperature storage should be investigated as a hurdle approach to reach a five year shelf life in wet-pack entrees and soups. Baked goods and other environmentally-sensitive spaceflight foods will require an almost impenetrable barrier to protect the foods from oxygen and moisture ingress but scavengers and

  14. Optimization of thermal protection systems for the space shuttle vehicle. Volume 1: Final report

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A study performed to continue development of computational techniques for the Space Shuttle Thermal Protection System is reported. The resulting computer code was used to perform some additional optimization studies on several TPS configurations. The program was developed in Fortran 4 for the CDC 6400, and it was converted to Fortran 5 to be used on the Univac 1108. The computational methodology is developed in modular fashion to facilitate changes and updating of the techniques and to allow overlaying the computer code to fit into approximately 131,000 octal words of core storage. The program logic involves subroutines which handle input and output of information between computer and user, and thermodynamic, stress, dynamic, and weight/estimate analyses of a variety of panel configurations. These include metallic, ablative, and RSI (with and without an underlying phase change material) panels, plus a thermodynamic-only analysis of carbon-carbon systems applied to the leading edge and flat cover panels. Two different thermodynamic analyses are used. The first is a two-dimensional, explicit procedure with variable time steps, which is used to describe the behavior of metallic and carbon-carbon leading edges. The second is a one-dimensional implicit technique used to predict temperatures in the charring ablator and the noncharring RSI. The latter analysis is performed simply by suppressing the chemical reactions and pyrolysis of the TPS material.
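
    A sketch of the second analysis type is given below: a one-dimensional, implicit (backward Euler) conduction solve through an insulation layer with an applied surface heat flux and an adiabatic back face. The material data and heating level are placeholders, and charring, pyrolysis, and surface re-radiation are not modeled.

```python
import numpy as np

n, thickness = 51, 0.05                     # nodes, layer thickness (m)
dx = thickness / (n - 1)
k, rho, cp = 0.06, 150.0, 1000.0            # conductivity, density, specific heat (assumed)
alpha = k / (rho * cp)
dt, steps = 0.1, 600                        # 60 s of heating
q_surface = 5.0e3                           # applied surface heat flux, W/m2 (assumed)

T = np.full(n, 300.0)                       # initial temperature, K
r = alpha * dt / dx**2

# Assemble the constant implicit (backward Euler) matrix once.
A = np.zeros((n, n))
for i in range(1, n - 1):
    A[i, i - 1], A[i, i], A[i, i + 1] = -r, 1 + 2 * r, -r
# Surface node: ghost-node treatment of the imposed flux q = -k dT/dx.
A[0, 0], A[0, 1] = 1 + 2 * r, -2 * r
# Back face: adiabatic (mirror node).
A[-1, -1], A[-1, -2] = 1 + 2 * r, -2 * r

for _ in range(steps):
    b = T.copy()
    b[0] += 2 * r * dx * q_surface / k      # flux source term from the ghost node
    T = np.linalg.solve(A, b)
print(f"surface {T[0]:.0f} K, back face {T[-1]:.0f} K after {steps*dt:.0f} s")
```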

  15. Essentiality of mitochondrial oxidative metabolism for photosynthesis: optimization of carbon assimilation and protection against photoinhibition.

    PubMed

    Padmasree, K; Padmavathi, L; Raghavendra, A S

    2002-01-01

    The review emphasizes the essentiality of mitochondrial oxidative metabolism for photosynthetic carbon assimilation. Photosynthetic activity in chloroplasts and oxidative metabolism in mitochondria interact with each other and stimulate their activities. In the light, the partially modified TCA cycle supplies oxoglutarate to the cytosol and chloroplasts. The marked stimulation of O2 uptake after a few minutes of photosynthetic activity, termed light-enhanced dark respiration (LEDR), is now a well-known phenomenon. Both the cytochrome and alternative pathways of mitochondrial electron transport are important in such interactions. The function of the chloroplast is optimized by the complementary nature of mitochondrial metabolism in multiple ways: facilitation of the export of excess reducing equivalents from chloroplasts, shortening of photosynthetic induction, maintenance of photorespiratory activity, and supply of ATP for sucrose biosynthesis as well as other cytosolic needs. Further, mitochondrial oxidative electron transport and phosphorylation also protect chloroplasts against photoinhibition. Besides mitochondrial respiration, reducing equivalents (and ATP) are used for other metabolic phenomena, such as sulfur or nitrogen metabolism and photorespiration. These reactions often involve peroxisomes and the cytosol. The beneficial interaction between chloroplasts and mitochondria therefore invariably extends also to peroxisomes and the cytosol. While the interorganelle exchange of metabolites is the known basis of such interaction, further experiments are warranted to identify other biochemical signals between them. The use of techniques such as on-line mass spectrometric measurement, novel mutants/transgenics, and variability in metabolism under different growth conditions holds high promise to help the plant biologist to understand this PMID:12027265

  16. Development of Processing Techniques for Advanced Thermal Protection Materials

    NASA Technical Reports Server (NTRS)

    Selvaduray, Guna; Lacson, Jamie; Collazo, Julian

    1997-01-01

    During the period June 1, 1996 through May 31, 1997, the main effort has been in the development of materials for high temperature applications. Thermal Protection Systems (TPS) are constantly being tested and evaluated for thermal shock resistance, high temperature dimensional stability, and tolerance to environmental effects. Materials development was carried out by using many different instruments and methods, ranging from intensive elemental analysis to testing the physical attributes of a material. The material development concentrated on two key areas: (1) development of coatings for carbon/carbon composites, and (2) development of ultra-high temperature ceramics (UHTC). This report describes the progress made in these two areas of research during this contract period.

  17. Development of processing techniques for advanced thermal protection materials

    NASA Technical Reports Server (NTRS)

    Selvaduray, Guna S.

    1994-01-01

    The effort, which was focused on the research and development of advanced materials for use in Thermal Protection Systems (TPS), has involved chemical and physical testing of refractory ceramic tiles, fabrics, threads and fibers. This testing has included determination of the optical properties, thermal shock resistance, high temperature dimensional stability, and tolerance to environmental stresses. Materials have also been tested in the Arc Jet 2 x 9 Turbulent Duct Facility (TDF), the 1 atmosphere Radiant Heat Cycler, and the Mini-Wind Tunnel Facility (MWTF). A significant part of the effort hitherto has gone towards modifying and upgrading the test facilities so that meaningful tests can be carried out. Another important effort during this period has been the creation of a materials database. Computer systems administration and support have also been provided. These are described in greater detail below.

  18. Metamodeling and Optimization of a Blister Copper Two-Stage Production Process

    NASA Astrophysics Data System (ADS)

    Jarosz, Piotr; Kusiak, Jan; Małecki, Stanisław; Morkisz, Paweł; Oprocha, Piotr; Pietrucha, Wojciech; Sztangret, Łukasz

    2016-06-01

    It is often difficult to estimate parameters for a two-stage production process of blister copper (containing 99.4 wt.% of Cu metal) as well as those for most industrial processes with high accuracy, which leads to problems related to process modeling and control. The first objective of this study was to model flash smelting and converting of Cu matte stages using three different techniques: artificial neural networks, support vector machines, and random forests, which utilized noisy technological data. Subsequently, more advanced models were applied to optimize the entire process (which was the second goal of this research). The obtained optimal solution was a Pareto-optimal one because the process consisted of two stages, making the optimization problem a multi-criteria one. A sequential optimization strategy was employed, which aimed for optimal control parameters consecutively for both stages. The obtained optimal output parameters for the first smelting stage were used as input parameters for the second converting stage. Finally, a search for another optimal set of control parameters for the second stage of a Kennecott-Outokumpu process was performed. The optimization process was modeled using a Monte-Carlo method, and both modeling parameters and computed optimal solutions are discussed.
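
    A compressed sketch of the metamodel-then-optimize idea for a single stage is given below; the synthetic data, the random-forest surrogate, and the Monte-Carlo search are illustrative stand-ins for the plant data, the three model families, and the sequential two-stage strategy used in the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(7)
# Hypothetical control parameters for one stage (e.g. O2 enrichment, temperature).
X = rng.uniform([0.0, 900.0], [1.0, 1300.0], size=(400, 2))
# Noisy "plant" response standing in for the technological data.
y = 99.0 - 3.0*(X[:, 0]-0.6)**2 - 1e-5*(X[:, 1]-1200.0)**2 + rng.normal(0, 0.2, 400)

surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Monte-Carlo search over the metamodel in place of running the process model.
candidates = rng.uniform([0.0, 900.0], [1.0, 1300.0], size=(20000, 2))
pred = surrogate.predict(candidates)
best = candidates[np.argmax(pred)]
print("surrogate-optimal settings:", np.round(best, 3),
      "predicted output:", round(pred.max(), 2))
```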

  19. Stretching the limits of forming processes by robust optimization: A demonstrator

    SciTech Connect

    Wiebenga, J. H.; Atzema, E. H.; Boogaard, A. H. van den

    2013-12-16

    Robust design of forming processes using numerical simulations is gaining attention throughout the industry. In this work, it is demonstrated how robust optimization can assist in further stretching the limits of metal forming processes. A deterministic and a robust optimization study are performed, considering a stretch-drawing process of a hemispherical cup product. For the robust optimization study, both the effect of material and process scatter are taken into account. For quantifying the material scatter, samples of 41 coils of a drawing quality forming steel have been collected. The stochastic material behavior is obtained by a hybrid approach, combining mechanical testing and texture analysis, and efficiently implemented in a metamodel based optimization strategy. The deterministic and robust optimization results are subsequently presented and compared, demonstrating an increased process robustness and decreased number of product rejects by application of the robust optimization approach.
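
    The contrast between deterministic and robust formulations can be illustrated with the toy sketch below, in which a single design variable is optimized against a sampled material-scatter parameter using a mean-plus-spread criterion; the response function, scatter statistics, and bounds are invented and do not describe the stretch-drawing study.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)
scatter = rng.normal(loc=1.0, scale=0.15, size=500)    # sampled material scatter

def defect(x, s):
    """Placeholder forming-defect measure for design x and material factor s."""
    return (x * s - 10.0) ** 2

def robust_objective(x, k=3.0):
    vals = defect(x, scatter)
    return vals.mean() + k * vals.std()                 # mean + k*sigma criterion

det = minimize_scalar(lambda x: defect(x, 1.0), bounds=(5, 15), method="bounded")
rob = minimize_scalar(robust_objective, bounds=(5, 15), method="bounded")
print(f"deterministic optimum x = {det.x:.2f}, robust optimum x = {rob.x:.2f}")
```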

  20. A parallel Jacobson-Oksman optimization algorithm. [parallel processing (computers)

    NASA Technical Reports Server (NTRS)

    Straeter, T. A.; Markos, A. T.

    1975-01-01

    A gradient-dependent optimization technique which exploits the vector-streaming or parallel-computing capabilities of some modern computers is presented. The algorithm, derived by assuming that the function to be minimized is homogeneous, is a modification of the Jacobson-Oksman serial minimization method. In addition to describing the algorithm, conditions insuring the convergence of the iterates of the algorithm and the results of numerical experiments on a group of sample test functions are presented. The results of these experiments indicate that this algorithm will solve optimization problems in less computing time than conventional serial methods on machines having vector-streaming or parallel-computing capabilities.

  1. Multiobjective genetic approach for optimal control of photoinduced processes

    SciTech Connect

    Bonacina, Luigi; Extermann, Jerome; Rondi, Ariana; Wolf, Jean-Pierre; Boutou, Veronique

    2007-08-15

    We have applied a multiobjective genetic algorithm to the optimization of multiphoton-excited fluorescence. Our study shows the advantages that this approach can offer to experiments based on adaptive shaping of femtosecond pulses. The algorithm outperforms single-objective optimizations, being totally independent from the bias of user defined parameters and giving simultaneous access to a large set of feasible solutions. The global inspection of their ensemble represents a powerful support to unravel the connections between pulse spectral field features and excitation dynamics of the sample.
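
    At the core of any multiobjective genetic algorithm is non-dominated sorting; the short sketch below extracts the Pareto front from a random set of two-objective scores standing in for candidate pulse shapes.

```python
import numpy as np

rng = np.random.default_rng(11)
objectives = rng.random((200, 2))            # two objectives, both to be maximized

def pareto_front(F):
    """Indices of points not dominated by any other point (maximization)."""
    keep = np.ones(len(F), dtype=bool)
    for i, f in enumerate(F):
        if keep[i]:
            dominated = np.all(F <= f, axis=1) & np.any(F < f, axis=1)
            keep &= ~dominated
            keep[i] = True
    return np.where(keep)[0]

front = pareto_front(objectives)
print(f"{len(front)} non-dominated candidates out of {len(objectives)}")
```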

  2. Process optimization for the production of diosgenin with Trichoderma reesei.

    PubMed

    Zhu, Yuling; Ni, Jinren; Huang, Wen

    2010-06-01

    Based on response surface methodology, an effective microbial system for diosgenin production from enzymatically pretreated Dioscorea zingiberensis tubers with Trichoderma reesei was studied. The fermentation medium was optimized with a central composite design (3(5)) based on a Plackett-Burman design, which identified significant impacts of peptone, K(2)HPO(4) and Tween 80 on diosgenin yield. The effects of different fermentation conditions on diosgenin production were also studied. Four parameters, i.e., incubation period, temperature, initial pH and substrate concentration, were optimized using a 4(5) central composite design. The highest diosgenin yield of 90.57% was achieved with 2.67% (w/v) peptone, 0.29% (w/v) K(2)HPO(4), 0.73% (w/v) Tween 80 and 9.77% (w/v) substrate, at pH 5.8 and 30 °C. The optimal incubation time was 6.5 days. After optimization, the product yield increased by 33.70% compared with the diosgenin yield of 67.74 ± 1.54% under non-optimized conditions. Scale-up fermentation was carried out in a 5.0 l bioreactor, and a maximum diosgenin yield of 90.17 ± 3.12% was obtained at an aeration rate of 0.80 vvm and an agitation rate of 300 rpm. The proposed microbial system is clean and effective for diosgenin production and thus more environmentally acceptable than traditional acid hydrolysis.

  3. Optimal fabrication processes for unidirectional metal-matrix composites: A computational simulation

    NASA Technical Reports Server (NTRS)

    Saravanos, D. A.; Murthy, P. L. N.; Morel, M.

    1990-01-01

    A method is proposed for optimizing the fabrication process of unidirectional metal matrix composites. The temperature and pressure histories are optimized such that the residual microstresses of the composite at the end of the fabrication process are minimized and the material integrity throughout the process is ensured. The response of the composite during the fabrication is simulated based on a nonlinear micromechanics theory. The optimal fabrication problem is formulated and solved with non-linear programming. Application cases regarding the optimization of the fabrication cool-down phases of unidirectional ultra-high modulus graphite/copper and silicon carbide/titanium composites are presented.

  4. Optimization Of PVDF-TrFE Processing Conditions For The Fabrication Of Organic MEMS Resonators.

    PubMed

    Ducrot, Pierre-Henri; Dufour, Isabelle; Ayela, Cédric

    2016-01-01

    This paper reports a systematic optimization of processing conditions of PVDF-TrFE piezoelectric thin films, used as integrated transducers in organic MEMS resonators. Indeed, despite data on electromechanical properties of PVDF found in the literature, optimized processing conditions that lead to these properties remain only partially described. In this work, a rigorous optimization of parameters enabling state-of-the-art piezoelectric properties of PVDF-TrFE thin films has been performed via the evaluation of the actuation performance of MEMS resonators. Conditions such as annealing duration, poling field and poling duration have been optimized and repeatability of the process has been demonstrated.

  5. Optimization Of PVDF-TrFE Processing Conditions For The Fabrication Of Organic MEMS Resonators

    PubMed Central

    Ducrot, Pierre-Henri; Dufour, Isabelle; Ayela, Cédric

    2016-01-01

    This paper reports a systematic optimization of processing conditions of PVDF-TrFE piezoelectric thin films, used as integrated transducers in organic MEMS resonators. Indeed, despite data on electromechanical properties of PVDF found in the literature, optimized processing conditions that lead to these properties remain only partially described. In this work, a rigorous optimization of parameters enabling state-of-the-art piezoelectric properties of PVDF-TrFE thin films has been performed via the evaluation of the actuation performance of MEMS resonators. Conditions such as annealing duration, poling field and poling duration have been optimized and repeatability of the process has been demonstrated. PMID:26792224

  6. Optimization Of PVDF-TrFE Processing Conditions For The Fabrication Of Organic MEMS Resonators

    NASA Astrophysics Data System (ADS)

    Ducrot, Pierre-Henri; Dufour, Isabelle; Ayela, Cédric

    2016-01-01

    This paper reports a systematic optimization of processing conditions of PVDF-TrFE piezoelectric thin films, used as integrated transducers in organic MEMS resonators. Indeed, despite data on electromechanical properties of PVDF found in the literature, optimized processing conditions that lead to these properties remain only partially described. In this work, a rigorous optimization of parameters enabling state-of-the-art piezoelectric properties of PVDF-TrFE thin films has been performed via the evaluation of the actuation performance of MEMS resonators. Conditions such as annealing duration, poling field and poling duration have been optimized and repeatability of the process has been demonstrated.

  7. Emerging drug combinations to optimize renovascular protection and blood pressure goals

    PubMed Central

    Escobar, Carlos; Echarri, Rocio; Barrios, Vivencio

    2012-01-01

    Hypertension and renal disease are closely related. In fact, there is an inverse linear relationship between renal function and prevalence of hypertension. Hypertensive patients with renal dysfunction exhibit a poor clinical profile, which markedly increases their risk for cardiovascular outcomes. This review considers the available evidence on the best therapeutic approach for optimizing renovascular protection in the hypertensive population. To effectively reduce or at least slow the establishment and progression of renal disease in the hypertensive population it is critical to reach blood pressure targets. Many studies have shown that angiotensin-converting enzyme inhibitors and angiotensin receptor blockers prevent or at least delay the development of microalbuminuria in patients with hypertension and type 2 diabetes, reduce the incidence of overt diabetic nephropathy, and are also beneficial in patients with nondiabetic renal disease. Therefore, renin-angiotensin system (RAS) inhibition plays a key role in the prevention of renal outcomes. As the majority of patients with hypertension will need at least two antihypertensive agents to achieve blood pressure goals, the use of RAS inhibitors is a mandatory part of antihypertensive therapy. The question of which antihypertensive agent is the best choice for combining with RAS blockers should be considered. Many studies have shown that diuretics and calcium channel blockers are the best choice. However, more studies are needed to clarify the subgroups of patients who will benefit more from a combination with a diuretic or from a combination with a calcium channel blocker. To date, RAS inhibitors recommended in this context are angiotensin-converting enzyme inhibitors and angiotensin receptor blockers. Aliskiren, the first oral direct renin inhibitor available, has shown promising results. PMID:22536084

  8. Considerations on the Optimal and Efficient Processing of Information-Bearing Signals

    ERIC Educational Resources Information Center

    Harms, Herbert Andrew

    2013-01-01

    Noise is a fundamental hurdle that impedes the processing of information-bearing signals, specifically the extraction of salient information. Processing that is both optimal and efficient is desired; optimality ensures the extracted information has the highest fidelity allowed by the noise, while efficiency ensures limited resource usage. Optimal…

  9. Protected Light-Trapping Silicon by a Simple Structuring Process for Sunlight-Assisted Water Splitting.

    PubMed

    Santinacci, Lionel; Diouf, Maïmouna W; Barr, Maïssa K S; Fabre, Bruno; Joanny, Loïc; Gouttefangeas, Francis; Loget, Gabriel

    2016-09-21

    Macroporous layers are grown onto n-type silicon by successive photoelectrochemical etching in HF-containing solution and chemical etching in KOH. This latter treatment gives the Si surface highly antireflective properties. The duration of the chemical etching is optimized to render the surface as absorbent as possible, and the morphology of the as-grown layer is characterized by scanning electron microscopy. Further functionalization of such a structured Si surface is carried out by atomic layer deposition of a thin conformal and homogeneous TiO2 layer that is crystallized by an annealing at 450 °C. This process allows such surfaces to be used as photoanodes for water oxidation. The 40 nm thick TiO2 film indeed acts as an efficient protective layer against the photocorrosion of the porous Si in KOH, enhances its wettability, and improves the light absorption of the photoelectrode. The macroporous dual-absorber TiO2/Si has a beneficial effect on water oxidation in 1 M KOH and leads to a considerable negative shift of the onset potential of ∼400 mV as well as a 50% increase in photocurrent at 1 V vs SCE. PMID:27575424

  11. [Search for optimal combinations of the hygienic and protective characteristics of body armors].

    PubMed

    Logatkin, S M

    2008-01-01

    The protective composition of a body armor is generally characterized by two major parameters: the area and the level of protection, i.e. resistance to bullets and fragments. These characteristics directly depend on the mass of the body armor and the area of the body it screens. The positive protective characteristics simultaneously have a negative impact on the most important hygienic indices of a body armor, such as comfort and ease of use. The optimum combination of protective and performance characteristics of a body armor is therefore a compromise between its mass and its level of protection.

  12. Optimization of enrichment processes of pentachlorophenol (PCP) from water samples.

    PubMed

    Li, Ping; Liu, Jun-xin

    2004-01-01

    The method of enriching PCP (pentachlorophenol) from the aquatic environment by solid phase extraction (SPE) was studied. Several factors affecting the recovery of PCP, including sample pH, eluting solvent, eluting volume and flow rate of the water sample, were optimized by orthogonal array design (OAD). The optimized conditions were: sample pH 4; eluting solvent, 100% methanol; eluting solvent volume, 2 ml; and water sample flow rate, 4 ml/min. A comparison is made between the SPE and liquid-liquid extraction (LLE) methods. The recoveries of PCP were in the range of 87.6%-133.6% and 79%-120.3% for SPE and LLE, respectively. Important advantages of SPE over LLE include the short extraction time and reduced consumption of organic solvents. SPE can replace LLE for isolating and concentrating PCP from water samples.
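
    As a rough illustration of the orthogonal-array analysis described above, the sketch below averages the PCP recovery over the runs at each level of each factor and keeps the best level. The design matrix, factor levels and recovery values are invented placeholders, not the study's data.

```python
# Hypothetical sketch of main-effects analysis for an orthogonal-array (OAD)
# experiment: for each factor, average the measured recovery over the runs at
# each level and keep the level with the highest mean.
from collections import defaultdict

# each run records the factor settings and the measured recovery (made-up values)
runs = [
    {"pH": 2, "eluent": "MeOH 80%",  "volume": 1, "flow": 2, "recovery": 75.0},
    {"pH": 2, "eluent": "MeOH 90%",  "volume": 2, "flow": 4, "recovery": 82.1},
    {"pH": 4, "eluent": "MeOH 100%", "volume": 2, "flow": 4, "recovery": 95.3},
    {"pH": 4, "eluent": "MeOH 80%",  "volume": 3, "flow": 6, "recovery": 84.7},
    {"pH": 6, "eluent": "MeOH 100%", "volume": 1, "flow": 6, "recovery": 88.9},
    {"pH": 6, "eluent": "MeOH 90%",  "volume": 3, "flow": 2, "recovery": 80.2},
]

def main_effects(runs, factors):
    best = {}
    for f in factors:
        level_values = defaultdict(list)
        for r in runs:
            level_values[r[f]].append(r["recovery"])
        level_means = {lvl: sum(v) / len(v) for lvl, v in level_values.items()}
        best[f] = max(level_means, key=level_means.get)
    return best

print(main_effects(runs, ["pH", "eluent", "volume", "flow"]))
```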

  13. Process for the preparation of protected 3-amino-1,2-dihydroxypropane acetal and derivatives thereof

    DOEpatents

    Hollingsworth, Rawle I.; Wang, Guijun

    2000-01-01

    A process for producing protected 3-amino-1,2-dihydroxypropane acetal, particularly in chiral forms, for use as an intermediate in the preparation of various 3-carbon compounds which are chiral. In particular, the present invention relates to the process for preparation of 3-amino-1,2-dihydroxypropane isopropylidene acetal. The protected 3-amino-1,2-dihydroxypropane acetal is a key intermediate to the preparation of chiral 3-carbon compounds which in turn are intermediates to various pharmaceuticals.

  14. A System-Oriented Approach for the Optimal Control of Process Chains under Stochastic Influences

    NASA Astrophysics Data System (ADS)

    Senn, Melanie; Schäfer, Julian; Pollak, Jürgen; Link, Norbert

    2011-09-01

    Process chains in manufacturing consist of multiple connected processes in terms of dynamic systems. The properties of a product passing through such a process chain are influenced by the transformation of each single process. There exist various methods for the control of individual processes, such as classical state controllers from cybernetics or function mapping approaches realized by statistical learning. These controllers ensure that a desired state is obtained at process end despite variations in the input and disturbances. The interactions between the single processes are thereby neglected, but play an important role in the optimization of the entire process chain. We divide the overall optimization into two phases: (1) the solution of the optimization problem by Dynamic Programming to find the optimal control variable values for each process for any encountered end state of its predecessor and (2) the application of the optimal control variables at runtime for the detected initial process state. The optimization problem is solved by selecting adequate control variables for each process in the chain backwards based on predefined quality requirements for the final product. For the demonstration of the proposed concept, we have chosen a process chain from sheet metal manufacturing with simplified transformation functions.
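
    A minimal Python sketch of the backward Dynamic Programming phase (1) described above: for each process in the chain, taken last to first, and each discretized incoming state, it tabulates the control value with the smallest downstream cost. The transformation functions, state grid and quality target are made-up placeholders, not the paper's sheet-metal models.

```python
# Backward Dynamic Programming over a chain of processes: tabulate, for every
# discretized incoming state of each process, the control that minimizes the
# remaining cost to the final quality requirement (illustrative models only).
import numpy as np

states = np.linspace(0.0, 1.0, 21)           # discretized product state
controls = np.linspace(-0.2, 0.2, 9)         # admissible control values
transforms = [lambda x, u: x + u,             # process 1 (placeholder)
              lambda x, u: 0.9 * x + u,       # process 2 (placeholder)
              lambda x, u: x + 0.5 * u]       # process 3 (placeholder)
target = 0.8                                  # required final product state

def nearest(x):
    return int(np.argmin(np.abs(states - x)))

cost_to_go = np.abs(states - target)          # terminal cost: quality deviation
policy = []

for f in reversed(transforms):
    table = np.empty_like(states, dtype=int)
    new_cost = np.empty_like(states)
    for i, x in enumerate(states):
        costs = [cost_to_go[nearest(f(x, u))] for u in controls]
        table[i] = int(np.argmin(costs))
        new_cost[i] = min(costs)
    policy.append(table)
    cost_to_go = new_cost

policy.reverse()   # policy[k][i] -> index of the optimal control for process k, state i
print("optimal control for process 0 at state 0.5:",
      controls[policy[0][nearest(0.5)]])
```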

  15. Nitrogen removal process optimization in New York City WPCPS: a case study of Wards Island WPCP.

    PubMed

    Ramalingam, K; Fillos, J; Musabyimana, M; Deur, A; Beckmann, K

    2009-01-01

    The New York City Department of Environmental Protection has been engaged in a continuous process to develop a nitrogen removal program to reduce the nitrogen mass discharge from its water pollution control plants, (WPCPs), from 49,158 kg/d to 20,105 kg/d by the year 2017 as recommended by the Long Island Sound Study. As part of the process, a comprehensive research effort was undertaken involving bench, pilot and full scale studies to identify the most effective way to upgrade and optimize the existing WPCPs. Aeration tank 13 (AT-13) at the Wards Island WPCP was particularly attractive as a full-scale research facility because its aeration tank with its dedicated final settling tanks and RAS pumps could be isolated from the remaining treatment facilities. The nitrogen removal performance of AT-13, which, at the time, was operated as a "basic step feed BNR Facility", was evaluated and concurrently nitrification kinetic parameters were measured using in-situ bench scale experiments. Additional bench scale experiments provided denitrification rates using different sources of carbon and measurement of the maximum specific growth rate of nitrifying bacteria. The combined findings were then used to upgrade AT-13 to a "full" BNR facility with carbon and alkalinity addition. This paper will focus on the combined bench and full scale results that were the basis for the consequent upgrade. PMID:19901478

  16. Application of a neural network to simulate analysis in an optimization process

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; Lamarsh, William J., II

    1992-01-01

    A new experimental software package called NETS/PROSSS aimed at reducing the computing time required to solve a complex design problem is described. The software combines a neural network for simulating the analysis program with an optimization program. The neural network is applied to approximate results of a finite element analysis program to quickly obtain a near-optimal solution. Results of the NETS/PROSSS optimization process can also be used as an initial design in a normal optimization process and make it possible to converge to an optimum solution with significantly fewer iterations.
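
    The following sketch illustrates the general idea behind such a surrogate-assisted setup, under the assumption of a scikit-learn neural network standing in for the analysis approximation: sample an expensive analysis, fit a small network, and optimize the cheap surrogate to obtain a near-optimal starting design. The analysis function and bounds are placeholders, not the NETS/PROSSS finite element analysis.

```python
# Surrogate-assisted optimization sketch: train a small neural network on
# samples of an expensive analysis, then optimize the cheap surrogate to get
# a near-optimal starting design for a conventional optimizer.
import numpy as np
from sklearn.neural_network import MLPRegressor
from scipy.optimize import minimize

def expensive_analysis(x):            # stand-in for the finite element analysis
    return (x[0] - 0.3) ** 2 + (x[1] + 0.5) ** 2

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))             # sampled designs
y = np.array([expensive_analysis(x) for x in X])  # analysis results

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                         random_state=0).fit(X, y)

res = minimize(lambda x: surrogate.predict(x.reshape(1, -1))[0],
               x0=np.zeros(2), bounds=[(-1, 1), (-1, 1)])
print("near-optimal design from surrogate:", res.x)
# res.x can then seed an optimization run against the true analysis.
```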

  17. Optimization of frozen sour cherries vacuum drying process.

    PubMed

    Sumić, Zdravko; Tepić, Aleksandra; Vidović, Senka; Jokić, Stela; Malbaša, Radomir

    2013-01-01

    The objective of this research was to optimize the vacuum-drying of frozen sour cherries in order to preserve health-beneficial phytochemicals, as well as textural characteristics. The investigated range of temperature was 46-74 °C and of pressure 17-583 mbar, in a new design of vacuum-dryer equipment. The total solids, a(w) value, total phenolics, vitamin C, antioxidant activity, anthocyanin content, total colour change and firmness were used as quality indicators of dried sour cherry. Within the experimental range of studied variables, the optimum conditions of 54.03 °C and 148.16 mbar were established for vacuum drying of sour cherry. Separate validation experiments were conducted, under optimum conditions, to verify predictions and adequacy of the second-order polynomial models. Under these optimal conditions, the predicted amount of total phenolics was 744 mg CAE/100 g dry weight (dw), vitamin C 1.44 mg/100 g dw, anthocyanin content 125 mg/100 g dw, IC(50) 3.23 mg/ml, total solids 70.72%, a(w) value 0.646, total colour change 52.61 and firmness 3395.4 g. The investigated parameters had a significant effect on the quality of the dried sour cherries. PMID:23017392
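
    A brief sketch of the kind of second-order polynomial (response-surface) model validated above: fit y = b0 + b1*T + b2*P + b3*T^2 + b4*P^2 + b5*T*P to experimental points and locate the stationary point of the fitted surface. The data below are synthetic, generated from a known quadratic, not the sour-cherry measurements.

```python
# Fit a second-order response surface in temperature T and pressure P and
# find its stationary point (synthetic data with a known optimum).
import numpy as np

# design points spanning the stated ranges (46-74 degC, 17-583 mbar)
T = np.array([46.0, 46.0, 60.0, 60.0, 74.0, 74.0, 60.0, 46.0, 74.0])
P = np.array([17.0, 583.0, 17.0, 583.0, 17.0, 583.0, 300.0, 300.0, 300.0])
# synthetic response with its maximum placed near (54 degC, 150 mbar)
y = 745.0 - 0.05 * (T - 54.0) ** 2 - 2e-4 * (P - 150.0) ** 2

A = np.column_stack([np.ones_like(T), T, P, T**2, P**2, T * P])
b, *_ = np.linalg.lstsq(A, y, rcond=None)

# stationary point of the fitted quadratic surface: solve grad = 0
H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
g = np.array([b[1], b[2]])
T_opt, P_opt = np.linalg.solve(H, -g)
print(f"stationary point: T = {T_opt:.1f} degC, P = {P_opt:.0f} mbar")
```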

  19. Optimization of the sonication process for meloxicam nanocrystals preparation

    PubMed Central

    IURIAN, SONIA; TOMUŢA, IOAN; RUS, LUCIA; ACHIM, MARCELA; LEUCUTA, SORIN E.

    2015-01-01

    Background and aims: Meloxicam, a widely recommended NSAID, presents poor water solubility, which limits its bioavailability and onset of effect. The objective of this study is the investigation of the most important factors that influence the efficiency of sonication in the preparation of meloxicam nanocrystals. Methods: The effects of crucial technological sonication parameters (amplitude, time and applied cycle) on the crystal sizes and dissolution were investigated using a central composite experimental design with three factors and three levels. Different mathematical models were applied for the evaluation of the influence of each factor on the measured responses. Results: The amplitude and the time were found to be the most important variables. Their increase determined significant size reduction and homogeneity due to the cavitation phenomenon, while the applied cycle was less important. The crystal size greatly influenced dissolution; a strong correlation was noted between small crystals and fast dissolution after freeze-drying the nanosuspensions. The optimal formulation was obtained by sonication at 100% amplitude, for 45 minutes and cycle 1, conditions which led to 600 nm crystals with a polydispersity index of 0.521. The morphological analysis revealed small, round-shaped crystals with narrow size distribution. Conclusions: The results provided the optimal sonication conditions needed to obtain meloxicam nanosuspensions with high drug dissolution capacity. PMID:26609271

  20. Cooperative optimization of reconfigurable machine tool configurations and production process plan

    NASA Astrophysics Data System (ADS)

    Xie, Nan; Li, Aiping; Xue, Wei

    2012-09-01

    The production process plan design and configurations of a reconfigurable machine tool (RMT) interact with each other. Reasonable process plans with suitable configurations of RMT help to improve product quality and reduce production cost. Therefore, a cooperative strategy is needed to solve both issues concurrently. In this paper, the cooperative optimization model for RMT configurations and production process plan is presented. Its objectives take into account the impacts of both the process and the configuration. Moreover, a novel genetic algorithm is also developed to provide optimal or near-optimal solutions: firstly, its chromosome is redesigned and is composed of three parts, namely operations, process plan and configurations of RMTs; secondly, new selection, crossover and mutation operators are developed to deal with the process constraints from the operation processes (OP) graph, since standard operators could generate illegal solutions violating these limits; eventually, the optimal configurations for RMT under the optimal process plan design can be obtained. Finally, a manufacturing line case composed of three RMTs is presented. The case shows that the optimal process plan and RMT configurations are obtained concurrently, the production cost decreases by 6.28% and nonmonetary performance increases by 22%. The proposed method can determine both the RMT configurations and the production process plan, improving production capacity, functionality and equipment utilization of the RMTs.
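
    As a loose illustration of the chromosome layout described above (operation sequence, process option and RMT configuration per operation), the sketch below builds precedence-feasible random individuals and scores them with a toy cost. The precedence set, options and cost terms are invented, and the paper's dedicated crossover and mutation operators are not reproduced.

```python
# Illustrative three-part chromosome for cooperative process plan / RMT
# configuration optimization, with a feasibility check and a toy fitness.
import random

OPS = ["op1", "op2", "op3", "op4"]
PRECEDENCE = {("op1", "op3"), ("op2", "op4")}   # op1 before op3, op2 before op4
PROCESS_OPTIONS = [0, 1]
CONFIGS = [0, 1, 2]

def feasible(seq):
    pos = {o: i for i, o in enumerate(seq)}
    return all(pos[a] < pos[b] for a, b in PRECEDENCE)

def random_individual():
    while True:
        seq = random.sample(OPS, len(OPS))
        if feasible(seq):
            break
    return {"seq": seq,
            "process": [random.choice(PROCESS_OPTIONS) for _ in OPS],
            "config": [random.choice(CONFIGS) for _ in OPS]}

def fitness(ind):
    # toy cost: reconfigurations between consecutive operations plus a
    # per-operation process cost (lower is better)
    changes = sum(a != b for a, b in zip(ind["config"], ind["config"][1:]))
    return changes + 0.5 * sum(ind["process"])

population = [random_individual() for _ in range(30)]
best = min(population, key=fitness)
print(best["seq"], best["config"], fitness(best))
```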

  1. Image pre-processing for optimizing automated photogrammetry performances

    NASA Astrophysics Data System (ADS)

    Guidi, G.; Gonizzi, S.; Micoli, L. L.

    2014-05-01

    The purpose of this paper is to analyze how optical pre-processing with polarizing filters and digital pre-processing with HDR imaging may improve the automated 3D modeling pipeline based on SFM and Image Matching, with special emphasis on optically non-cooperative surfaces of shiny or dark materials. Because of the automatic detection of homologous points, the presence of highlights due to shiny materials, or nearly uniform dark patches produced by low reflectance materials, may produce erroneous matching involving wrong 3D point estimations, and consequently holes and topological errors on the mesh generated from the associated dense 3D point cloud. This is due to the limited dynamic range of the 8 bit digital images that are matched to each other to generate 3D data. The same 256 levels can be more usefully employed if the actual dynamic range is compressed, avoiding luminance clipping in the darker and lighter image areas. Such an approach is considered here using both optical filtering and HDR processing with tone mapping, with experimental evaluation on different Cultural Heritage objects characterized by non-cooperative optical behavior. Three test images of each object have been captured from different positions, changing the shooting conditions (filter/no-filter) and the image processing (no processing/HDR processing), in order to have the same 3 camera orientations with different optical and digital pre-processing, and applying the same automated process to each photo set.
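
    A minimal sketch of the digital pre-processing idea, assuming a simple exposure-averaging HDR merge and a global Reinhard-style tone curve rather than the calibrated pipeline used in the paper: bracketed shots are merged into a radiance estimate, compressed, and re-quantized to 8 bit before matching.

```python
# Merge bracketed exposures into a rough radiance estimate, compress the
# dynamic range with a global x/(1+x) tone curve, and re-quantize to 8 bit.
import numpy as np

def merge_exposures(images, exposure_times):
    # images: float arrays in [0, 1]; divide out exposure time and average
    return np.mean([img / t for img, t in zip(images, exposure_times)], axis=0)

def tone_map(radiance):
    compressed = radiance / (1.0 + radiance)        # global tone-mapping operator
    return np.clip(compressed * 255.0, 0, 255).astype(np.uint8)

rng = np.random.default_rng(1)
scene = rng.uniform(0.0, 4.0, size=(480, 640))       # synthetic scene radiance
shots = [np.clip(scene * t, 0, 1) for t in (0.25, 1.0, 4.0)]
ldr_for_matching = tone_map(merge_exposures(shots, (0.25, 1.0, 4.0)))
```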

  2. Optimal Control of Markov Processes with Age-Dependent Transition Rates

    SciTech Connect

    Ghosh, Mrinal K. Saha, Subhamay

    2012-10-15

    We study optimal control of Markov processes with age-dependent transition rates. The control policy is chosen continuously over time based on the state of the process and its age. We study infinite horizon discounted cost and infinite horizon average cost problems. Our approach is via the construction of an equivalent semi-Markov decision process. We characterise the value function and optimal controls for both discounted and average cost cases.
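
    For the discounted-cost case, the standard value-iteration recursion on a small finite model gives the flavor of the computation; in the age-dependent setting of the paper the state would be augmented with the age of the process. The transition matrices and costs below are random placeholders.

```python
# Value iteration for a small finite-state, discounted-cost Markov decision
# process (illustrative transition probabilities and costs).
import numpy as np

n_states, n_actions, beta = 3, 2, 0.9
rng = np.random.default_rng(2)
P = rng.dirichlet(np.ones(n_states), size=(n_actions, n_states))  # P[a, s, s']
c = rng.uniform(0, 1, size=(n_actions, n_states))                 # cost c(a, s)

V = np.zeros(n_states)
for _ in range(500):
    Q = c + beta * np.einsum("asj,j->as", P, V)   # Bellman backup per action
    V_new = Q.min(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-8:
        break
    V = V_new

policy = Q.argmin(axis=0)
print("value function:", V, "optimal actions:", policy)
```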

  3. Point-process principal components analysis via geometric optimization.

    PubMed

    Solo, Victor; Pasha, Syed Ahmed

    2013-01-01

    There has been a fast-growing demand for analysis tools for multivariate point-process data driven by work in neural coding and, more recently, high-frequency finance. Here we develop a true or exact (as opposed to one based on time binning) principal components analysis for preliminary processing of multivariate point processes. We provide a maximum likelihood estimator, an algorithm for maximization involving steepest ascent on two Stiefel manifolds, and novel constrained asymptotic analysis. The method is illustrated with a simulation and compared with a binning approach. PMID:23020106
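
    A sketch of one steepest-ascent step on a Stiefel manifold, the kind of constrained update used above: project the Euclidean gradient onto the tangent space at X and retract the update with a QR factorization. The quadratic objective is a stand-in for the point-process log-likelihood maximized in the paper.

```python
# Steepest ascent on the Stiefel manifold St(n, p) = {X : X^T X = I} with a
# QR retraction, maximizing the stand-in objective trace(X^T A X).
import numpy as np

def tangent_project(X, G):
    # remove the component of G that leaves the manifold: G - X * sym(X^T G)
    sym = 0.5 * (X.T @ G + G.T @ X)
    return G - X @ sym

def qr_retract(Y):
    Q, R = np.linalg.qr(Y)
    return Q * np.sign(np.diag(R))    # fix column signs for a unique factor

rng = np.random.default_rng(3)
n, p = 10, 3
A = rng.normal(size=(n, n)); A = A + A.T            # symmetric "data" matrix
X = qr_retract(rng.normal(size=(n, p)))             # start on the manifold

for _ in range(200):
    grad = 2 * A @ X                                 # gradient of trace(X^T A X)
    X = qr_retract(X + 0.01 * tangent_project(X, grad))

print("objective:", np.trace(X.T @ A @ X))   # approaches the sum of the top-p eigenvalues
```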

  5. Control and optimization system and method for chemical looping processes

    DOEpatents

    Lou, Xinsheng; Joshi, Abhinaya; Lei, Hao

    2014-06-24

    A control system for optimizing a chemical loop system includes one or more sensors for measuring one or more parameters in a chemical loop. The sensors are disposed on or in a conduit positioned in the chemical loop. The sensors generate one or more data signals representative of an amount of solids in the conduit. The control system includes a data acquisition system in communication with the sensors and a controller in communication with the data acquisition system. The data acquisition system receives the data signals and the controller generates the control signals. The controller is in communication with one or more valves positioned in the chemical loop. The valves are configured to regulate a flow of the solids through the chemical loop.

  6. Control and optimization system and method for chemical looping processes

    DOEpatents

    Lou, Xinsheng; Joshi, Abhinaya; Lei, Hao

    2015-02-17

    A control system for optimizing a chemical loop system includes one or more sensors for measuring one or more parameters in a chemical loop. The sensors are disposed on or in a conduit positioned in the chemical loop. The sensors generate one or more data signals representative of an amount of solids in the conduit. The control system includes a data acquisition system in communication with the sensors and a controller in communication with the data acquisition system. The data acquisition system receives the data signals and the controller generates the control signals. The controller is in communication with one or more valves positioned in the chemical loop. The valves are configured to regulate a flow of the solids through the chemical loop.

  7. Optimization of process parameters for the manufacturing of rocket casings: A study using processing maps

    NASA Astrophysics Data System (ADS)

    Avadhani, G. S.

    2003-12-01

    Maraging steels possess ultrahigh strength combined with ductility and toughness and can be easily fabricated and heat-treated. Bulk metalworking of maraging steels is an important step in component manufacture. To optimize the hot-working parameters (temperature and strain rate) for the ring rolling process of the maraging steel used for the manufacture of rocket casings, a systematic study was conducted to characterize the hot working behavior by developing processing maps for γ-iron and an indigenous 250 grade maraging steel. The hot deformation behavior of binary alloys of iron with Ni, Co, and Mo, which are major constituents of maraging steel, was also studied. Results from the investigation suggest that all the materials tested exhibit a domain of dynamic recrystallization (DRX). From the instability maps, it was revealed that strain rates above 10 s-1 are not suitable for hot working of these materials. An important result from the stress-strain behavior is that while Co strengthens γ-iron, Ni and Mo cause flow softening. Temperatures around 1125 °C and strain rates between 0.001 and 0.1 s-1 are suitable for the hot working of maraging steel in the DRX domain. Also, higher strain rates may be used in the meta-dynamic recrystallization domain above 1075 °C for high strain rate applications such as ring rolling. The microstructural mechanisms identified from the processing maps, along with grain size analyses and hot ductility measurements, could be used to design hot-working schedules for maraging steel.

  8. Integrated process optimization: lessons from retrovirus and virus-like particle production.

    PubMed

    Cruz, P E; Maranga, L; Carrondo, M J T

    2002-11-13

    The optimization of production and purification processes is usually approached by engineers from a strictly biotechnological point of view. The present paper envisages the definition and application of an optimization model that takes into account the impact of both biological and technological issues upon the optimization protocols and strategies. For this purpose, the optimization of three analogous but different systems comprising animal cell growth and bioparticle production is presented. These systems were: human immunodeficiency virus 1 (HIV-1) and porcine parvovirus (PPV) virus-like particles (VLPs) produced in insect cells, and retrovirus produced in mammalian cells. For the systematization of the optimization process, four levels of optimization were defined: product, technology, design and integration. In this paper, the limits of each of the optimization levels defined are discussed by applying the concept to the systems described. This analysis leads to decisions regarding the production of VLPs and retrovirus as well as on the points relevant for further process development. Finally, the definition of the objective function or performance index, and the possible strategies and tools for bioprocess optimization, are described. Although developed from the three described processes, this approach can, based on the recent literature evidence reviewed here, be applied more universally to the process development of complex biopharmaceuticals.

  9. Pulsed pumping process optimization using a potential flow model

    NASA Astrophysics Data System (ADS)

    Tenney, C. M.; Lastoskie, C. M.

    2007-08-01

    A computational model is applied to the optimization of pulsed pumping systems for efficient in situ remediation of groundwater contaminants. In the pulsed pumping mode of operation, periodic rather than continuous pumping is used. During the pump-off or trapping phase, natural gradient flow transports contaminated groundwater into a treatment zone surrounding a line of injection and extraction wells that transect the contaminant plume. Prior to breakthrough of the contaminated water from the treatment zone, the wells are activated and the pump-on or treatment phase ensues, wherein extracted water is augmented to stimulate pollutant degradation and recirculated for a sufficient period of time to achieve mandated levels of contaminant removal. An important design consideration in pulsed pumping groundwater remediation systems is the pumping schedule adopted to best minimize operational costs for the well grid while still satisfying treatment requirements. Using an analytic two-dimensional potential flow model, optimal pumping frequencies and pumping event durations have been investigated for a set of model aquifer-well systems with different well spacings and well-line lengths, and varying aquifer physical properties. The results for homogeneous systems with greater than five wells and moderate to high pumping rates are reduced to a single, dimensionless correlation. Results for heterogeneous systems are presented graphically in terms of dimensionless parameters to serve as an efficient tool for initial design and selection of the pumping regimen best suited for pulsed pumping operation for a particular well configuration and extraction rate. In the absence of significant retardation or degradation during the pump-off phase, average pumping rates for pulsed operation were found to be greater than the continuous pumping rate required to prevent contaminant breakthrough.

  10. Pulsed pumping process optimization using a potential flow model.

    PubMed

    Tenney, C M; Lastoskie, C M

    2007-08-15

    A computational model is applied to the optimization of pulsed pumping systems for efficient in situ remediation of groundwater contaminants. In the pulsed pumping mode of operation, periodic rather than continuous pumping is used. During the pump-off or trapping phase, natural gradient flow transports contaminated groundwater into a treatment zone surrounding a line of injection and extraction wells that transect the contaminant plume. Prior to breakthrough of the contaminated water from the treatment zone, the wells are activated and the pump-on or treatment phase ensues, wherein extracted water is augmented to stimulate pollutant degradation and recirculated for a sufficient period of time to achieve mandated levels of contaminant removal. An important design consideration in pulsed pumping groundwater remediation systems is the pumping schedule adopted to best minimize operational costs for the well grid while still satisfying treatment requirements. Using an analytic two-dimensional potential flow model, optimal pumping frequencies and pumping event durations have been investigated for a set of model aquifer-well systems with different well spacings and well-line lengths, and varying aquifer physical properties. The results for homogeneous systems with greater than five wells and moderate to high pumping rates are reduced to a single, dimensionless correlation. Results for heterogeneous systems are presented graphically in terms of dimensionless parameters to serve as an efficient tool for initial design and selection of the pumping regimen best suited for pulsed pumping operation for a particular well configuration and extraction rate. In the absence of significant retardation or degradation during the pump-off phase, average pumping rates for pulsed operation were found to be greater than the continuous pumping rate required to prevent contaminant breakthrough.

  11. Optimized Laplacian image sharpening algorithm based on graphic processing unit

    NASA Astrophysics Data System (ADS)

    Ma, Tinghuai; Li, Lu; Ji, Sai; Wang, Xin; Tian, Yuan; Al-Dhelaan, Abdullah; Al-Rodhaan, Mznah

    2014-12-01

    In classical Laplacian image sharpening, all pixels are processed one by one, which leads to a large amount of computation. Traditional Laplacian sharpening on the CPU is considerably time-consuming, especially for large images. In this paper, we propose a parallel implementation of Laplacian sharpening based on the Compute Unified Device Architecture (CUDA), a computing platform for Graphics Processing Units (GPUs), and analyze the impact of image size on performance as well as the relationship between data transfer time and parallel computing time. Further, according to the different characteristics of the different memory types, an improved scheme of our method is developed, which exploits shared memory on the GPU instead of global memory and further increases the efficiency. Experimental results show that the two novel algorithms outperform the traditional sequential method based on OpenCV in terms of computing speed.
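
    For reference, a plain CPU version of Laplacian sharpening is a single 3x3 stencil applied to every pixel, which is exactly the per-pixel work the CUDA kernels parallelize (with shared memory caching each block's neighbourhood). A minimal NumPy/SciPy sketch:

```python
# CPU reference of Laplacian sharpening: convolve with the Laplacian stencil
# and subtract the result from the original image.
import numpy as np
from scipy.ndimage import convolve

LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=float)

def sharpen(img):
    lap = convolve(img.astype(float), LAPLACIAN, mode="nearest")
    return np.clip(img - lap, 0, 255).astype(np.uint8)

rng = np.random.default_rng(4)
image = rng.integers(0, 256, size=(512, 512), dtype=np.uint8)   # synthetic image
sharpened = sharpen(image)
```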

  12. Optimizing the availability of a buffered industrial process

    DOEpatents

    Martz, Jr., Harry F.; Hamada, Michael S.; Koehler, Arthur J.; Berg, Eric C.

    2004-08-24

    A computer-implemented process determines optimum configuration parameters for a buffered industrial process. A population size is initialized by randomly selecting a first set of design and operation values associated with subsystems and buffers of the buffered industrial process to form a set of operating parameters for each member of the population. An availability discrete event simulation (ADES) is performed on each member of the population to determine the product-based availability of each member. A new population is formed having members with a second set of design and operation values related to the first set of design and operation values through a genetic algorithm and the product-based availability determined by the ADES. Subsequent population members are then determined by iterating the genetic algorithm with product-based availability determined by ADES to form improved design and operation values from which the configuration parameters are selected for the buffered industrial process.

  13. Optimization of the BCP processing of elliptical nb srf cavities

    SciTech Connect

    Boffo, C.; Cooper, C.; Rowe, A.; Galasso, G.; /Udine U.

    2006-06-01

    At present, the electropolishing (EP) process is considered the key technology unleashing the capability to produce Niobium SRF cavities performing at or above 35 MV/m. Nevertheless buffered chemical polishing (BCP) remains a cheap, simple and effective processing technique for single grain high gradient and polycrystalline lower gradient cavities. BCP will be adopted to chemically process the third harmonic 3.9 GHz cavities being fabricated at Fermilab [1]. The dimensions and the shape of these cavities yield a strong nonuniformity in the material removal between iris and equator of the cells. This paper describes the thermal-fluid finite element model adopted to simulate the process, the experimental flow visualization tests performed to verify the simulation and a novel device fabricated to solve the problem.

  14. Optimal Conditions for the Control Problem Associated to a Biomedical Process

    NASA Astrophysics Data System (ADS)

    Bundǎu, O.; Juratoni, A.; Chevereşan, A.

    2010-09-01

    This paper considers a mathematical model of an infectious disease of SIS type. We analyze the problem of minimizing the cost of the disease through medical treatment. Mathematical modeling of this process leads to an optimal control problem with a finite horizon. The necessary conditions for optimality are given. Using the optimality conditions, we prove the existence, uniqueness and stability of the steady state for the system of differential equations.

  15. Optimization of an improved single-column chromatographic process for the separation of enantiomers.

    PubMed

    Kazi, Monzure-Khoda; Medi, Bijan; Amanullah, Mohammad

    2012-03-30

    This work addresses optimization of an improved single-column chromatographic (ISCC) process for the separation of guaifenesin enantiomers. Conventional feed injection and fraction collection systems have been replaced with customized components facilitating simultaneous separation and online monitoring with the ultimate objective of application of an optimizing controller. Injection volume, cycle time, desorbent flow rate, feed concentration, and three cut intervals are considered as decision variables. A multi-objective optimization technique based on genetic algorithm (GA) is adopted to achieve maximum productivity and minimum desorbent requirement in the region constrained by product specifications and hardware limitations. The optimization results along with the contribution of decision variables are discussed using Pareto fronts that identify non-dominated solutions. Optimization results of a similar simulated moving bed process have also been included to facilitate comparison with a continuous chromatographic process. PMID:22364669
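
    A small sketch of how a Pareto front is extracted from evaluated operating points when productivity is maximized and desorbent consumption is minimized; the candidate points are synthetic, not outputs of the ISCC model.

```python
# Keep only non-dominated operating points for a maximize/minimize pair of
# objectives (productivity up, desorbent consumption down).
import numpy as np

def pareto_front(productivity, desorbent):
    points = list(zip(productivity, desorbent))
    front = []
    for i, (p_i, d_i) in enumerate(points):
        dominated = any(p_j >= p_i and d_j <= d_i and (p_j > p_i or d_j < d_i)
                        for j, (p_j, d_j) in enumerate(points) if j != i)
        if not dominated:
            front.append(i)
    return front

rng = np.random.default_rng(5)
prod = rng.uniform(0.5, 2.0, 50)        # productivity (illustrative units)
des = rng.uniform(10, 40, 50)           # desorbent per unit product (illustrative)
print("non-dominated operating points:", pareto_front(prod, des))
```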

  17. Optimization of multi-objective integrated process planning and scheduling problem using a priority based optimization algorithm

    NASA Astrophysics Data System (ADS)

    Ausaf, Muhammad Farhan; Gao, Liang; Li, Xinyu

    2015-12-01

    For increasing the overall performance of modern manufacturing systems, effective integration of process planning and scheduling functions has been an important area of consideration among researchers. Owing to the complexity of handling process planning and scheduling simultaneously, most of the research work has been limited to solving the integrated process planning and scheduling (IPPS) problem for a single objective function. As there are many conflicting objectives when dealing with process planning and scheduling, real world problems cannot be fully captured considering only a single objective for optimization. Therefore, considering the multi-objective IPPS (MOIPPS) problem is inevitable. Unfortunately, only a handful of research papers are available on solving the MOIPPS problem. In this paper, an optimization algorithm for solving the MOIPPS problem is presented. The proposed algorithm uses a set of dispatching rules coupled with priority assignment to optimize the IPPS problem for various objectives like makespan, total machine load, total tardiness, etc. A fixed-size external archive coupled with a crowding distance mechanism is used to store and maintain the non-dominated solutions. To compare the results with other algorithms, a C-metric-based method has been used. Instances from four recent papers have been solved to demonstrate the effectiveness of the proposed algorithm. The experimental results show that the proposed method is an efficient approach for solving the MOIPPS problem.
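
    The crowding-distance mechanism mentioned above can be sketched as follows: for each objective, a solution's distance accumulates the normalized gap between its neighbours on that objective axis, and boundary solutions are kept by assigning them infinite distance. The objective values below are illustrative.

```python
# Crowding-distance computation (as used in NSGA-II-style archives) to keep
# a diverse set of non-dominated solutions.
import numpy as np

def crowding_distance(F):
    # F: (n_solutions, n_objectives) array of objective values
    n, m = F.shape
    d = np.zeros(n)
    for j in range(m):
        order = np.argsort(F[:, j])
        d[order[0]] = d[order[-1]] = np.inf          # always keep the extremes
        span = F[order[-1], j] - F[order[0], j]
        if span == 0:
            continue
        for k in range(1, n - 1):
            d[order[k]] += (F[order[k + 1], j] - F[order[k - 1], j]) / span
    return d

F = np.array([[10, 5], [8, 7], [12, 4], [9, 6]], dtype=float)  # e.g. makespan, load
print(crowding_distance(F))
```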

  18. "Just another hoop to jump through?" using environmental laws and processes to protect indigenous rights.

    PubMed

    Middleton, Beth Rose

    2013-11-01

    Protection of culturally important indigenous landscapes has become an increasingly important component of environmental management processes, for both companies and individuals striving to comply with environmental regulations, and for indigenous groups seeking stronger laws to support site protection and cultural/human rights. Given that indigenous stewardship of culturally important sites, species, and practices continues to be threatened or prohibited on lands out of indigenous ownership, this paper examines whether or not indigenous people can meaningfully apply mainstream environmental management laws and processes to achieve protection of traditional sites and associated stewardship activities. While environmental laws can provide a "back door" to protect traditional sites and practices, they are not made for this purpose, and, as such, require specific amendments to become more useful for indigenous practitioners. Acknowledging thoughtful critiques of the cultural incommensurability of environmental law with indigenous environmental stewardship of sacred sites, I interrogate the ability of four specific environmental laws and processes-the Uniform Conservation Easement Act; the National Environmental Policy Act and the California Environmental Quality Act; the Pacific Stewardship Council land divestiture process; and Senate Bill 18 (CA-2004)-to protect culturally important landscapes and practices. I offer suggestions for improving these laws and processes to make them more applicable to indigenous stewardship of traditional landscapes. PMID:23232791

  19. "Just another hoop to jump through?" using environmental laws and processes to protect indigenous rights.

    PubMed

    Middleton, Beth Rose

    2013-11-01

    Protection of culturally important indigenous landscapes has become an increasingly important component of environmental management processes, for both companies and individuals striving to comply with environmental regulations, and for indigenous groups seeking stronger laws to support site protection and cultural/human rights. Given that indigenous stewardship of culturally important sites, species, and practices continues to be threatened or prohibited on lands out of indigenous ownership, this paper examines whether or not indigenous people can meaningfully apply mainstream environmental management laws and processes to achieve protection of traditional sites and associated stewardship activities. While environmental laws can provide a "back door" to protect traditional sites and practices, they are not made for this purpose, and, as such, require specific amendments to become more useful for indigenous practitioners. Acknowledging thoughtful critiques of the cultural incommensurability of environmental law with indigenous environmental stewardship of sacred sites, I interrogate the ability of four specific environmental laws and processes-the Uniform Conservation Easement Act; the National Environmental Policy Act and the California Environmental Quality Act; the Pacific Stewardship Council land divestiture process; and Senate Bill 18 (CA-2004)-to protect culturally important landscapes and practices. I offer suggestions for improving these laws and processes to make them more applicable to indigenous stewardship of traditional landscapes.

  20. Optimization process planning using hybrid genetic algorithm and intelligent search for job shop machining

    PubMed Central

    Salehi, Mojtaba

    2010-01-01

    Optimization of process planning is considered the key technology for computer-aided process planning, which is a rather complex and difficult procedure. A good process plan of a part is built up based on two elements: (1) the optimized sequence of the operations of the part; and (2) the optimized selection of the machine, cutting tool and Tool Access Direction (TAD) for each operation. In the present work, the process planning is divided into preliminary planning, and secondary/detailed planning. In the preliminary stage, based on the analysis of order and clustering constraints as a compulsory constraint aggregation in operation sequencing and using an intelligent searching strategy, the feasible sequences are generated. Then, in the detailed planning stage, using the genetic algorithm which prunes the initial feasible sequences, the optimized operation sequence and the optimized selection of the machine, cutting tool and TAD for each operation based on optimization constraints as an additive constraint aggregation are obtained. The main contribution of this work is the optimization of the sequence of operations of the part, and the optimization of machine, cutting tool and TAD selection for each operation, using the intelligent search and genetic algorithm simultaneously. PMID:21845020
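
    A minimal sketch of the preliminary-planning idea, generating operation sequences that respect precedence (order) constraints by sampling random topological orderings of a constraint graph; the operation graph is invented, and the clustering constraints and the GA refinement stage are not shown.

```python
# Sample a precedence-feasible operation sequence as a random topological
# ordering of the constraint graph (Kahn's algorithm with random tie-breaking).
import random

precedence = {                 # edge a -> b means a must precede b
    "rough_mill": ["finish_mill", "drill"],
    "drill": ["tap"],
    "finish_mill": [],
    "tap": [],
}

def random_feasible_sequence(graph, rng=random):
    indegree = {op: 0 for op in graph}
    for succs in graph.values():
        for s in succs:
            indegree[s] += 1
    ready = [op for op, d in indegree.items() if d == 0]
    seq = []
    while ready:
        op = rng.choice(ready)
        ready.remove(op)
        seq.append(op)
        for s in graph[op]:
            indegree[s] -= 1
            if indegree[s] == 0:
                ready.append(s)
    return seq

print(random_feasible_sequence(precedence))
# such feasible sequences would then be pruned and refined by the genetic algorithm
```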

  1. FinFET Doping; Material Science, Metrology, and Process Modeling Studies for Optimized Device Performance

    SciTech Connect

    Duffy, R.; Shayesteh, M.

    2011-01-07

    In this review paper the challenges that face doping optimization in 3-dimensional (3D) thin-body silicon devices will be discussed, within the context of material science studies, metrology methodologies and process modeling insight, ultimately leading to optimized device performance. The focus will be on ion implantation as the method to introduce the dopants into the target material.

  2. Optimizing an Immersion ESL Curriculum Using Analytic Hierarchy Process

    ERIC Educational Resources Information Center

    Tang, Hui-Wen Vivian

    2011-01-01

    The main purpose of this study is to fill a substantial knowledge gap regarding reaching a uniform group decision in English curriculum design and planning. A comprehensive content-based course criterion model extracted from existing literature and expert opinions was developed. Analytical hierarchy process (AHP) was used to identify the relative…
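
    The core AHP computation can be sketched as deriving criterion weights from a reciprocal pairwise comparison matrix via its principal eigenvector, together with Saaty's consistency ratio; the 3x3 comparison matrix below is illustrative, not the study's expert judgments.

```python
# AHP priority weights from a reciprocal pairwise comparison matrix, plus the
# consistency ratio CR = CI / RI.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)        # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]         # Saaty's random index
print("weights:", weights, "CR:", ci / ri)
```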

  3. Optimization of the lithographic performance for lift-off processing

    NASA Astrophysics Data System (ADS)

    Yin, Wenyan; Fillmore, Ward; Dempsey, Kevin J.

    1999-06-01

    Shipley MICROPOSIT LOL lift-off technology exploits the develop-rate difference in a resist/LOL1000 bi-layer system to generate retrograde profiles. This is an enabling technology for 'additive' processing: deposition follows lithography and the resist is then 'lifted off' to generate a patterned layer.

  4. Optimal evaluation of infectious medical waste disposal companies using the fuzzy analytic hierarchy process

    SciTech Connect

    Ho, Chao Chung

    2011-07-15

    Ever since Taiwan's National Health Insurance implemented the diagnosis-related groups payment system in January 2010, hospital income has declined. Therefore, to meet their medical waste disposal needs, hospitals seek suppliers that provide high-quality services at a low cost. The enactment of the Waste Disposal Act in 1974 had facilitated some improvement in the management of waste disposal. However, since the implementation of the National Health Insurance program, the amount of medical waste from disposable medical products has been increasing. Further, of all the hazardous waste types, the amount of infectious medical waste has increased at the fastest rate. This is because of the increase in the number of items considered as infectious waste by the Environmental Protection Administration. The present study used two important findings from previous studies to determine the critical evaluation criteria for selecting infectious medical waste disposal firms. It employed the fuzzy analytic hierarchy process to set the objective weights of the evaluation criteria and select the optimal infectious medical waste disposal firm through calculation and sorting. The aim was to propose a method of evaluation with which medical and health care institutions could objectively and systematically choose appropriate infectious medical waste disposal firms.

  5. Optimal evaluation of infectious medical waste disposal companies using the fuzzy analytic hierarchy process.

    PubMed

    Ho, Chao Chung

    2011-07-01

    Ever since Taiwan's National Health Insurance implemented the diagnosis-related groups payment system in January 2010, hospital income has declined. Therefore, to meet their medical waste disposal needs, hospitals seek suppliers that provide high-quality services at a low cost. The enactment of the Waste Disposal Act in 1974 had facilitated some improvement in the management of waste disposal. However, since the implementation of the National Health Insurance program, the amount of medical waste from disposable medical products has been increasing. Further, of all the hazardous waste types, the amount of infectious medical waste has increased at the fastest rate. This is because of the increase in the number of items considered as infectious waste by the Environmental Protection Administration. The present study used two important findings from previous studies to determine the critical evaluation criteria for selecting infectious medical waste disposal firms. It employed the fuzzy analytic hierarchy process to set the objective weights of the evaluation criteria and select the optimal infectious medical waste disposal firm through calculation and sorting. The aim was to propose a method of evaluation with which medical and health care institutions could objectively and systematically choose appropriate infectious medical waste disposal firms.

  6. Process Optimization of Bismaleimide (BMI) Resin Infused Carbon Fiber Composite

    NASA Technical Reports Server (NTRS)

    Ehrlich, Joshua W.; Tate, LaNetra C.; Cox, Sarah B.; Taylor, Brian J.; Wright, M. Clara; Faughnan, Patrick D.; Batterson, Lawrence M.; Caraccio, Anne J.; Sampson, Jeffery W.

    2013-01-01

    Engineers today are presented with the opportunity to design and build the next generation of space vehicles out of the lightest, strongest, and most durable materials available. Composites offer excellent structural characteristics and outstanding reliability in many forms that will be utilized in future aerospace applications including the Commercial Crew and Cargo Program and the Orion space capsule. NASA's Composites for Exploration (CoEx) project researches the various methods of manufacturing composite materials of different fiber characteristics while using proven infusion methods of different resin compositions. Development and testing on these different material combinations will provide engineers the opportunity to produce optimal material compounds for multidisciplinary applications. Through the CoEx project, engineers pursue the opportunity to research and develop repair patch procedures for damaged spacecraft. Working in conjunction with Raptor Resins Inc., NASA engineers are utilizing high flow liquid infusion molding practices to manufacture high-temperature composite parts comprised of intermediate modulus 7 (IM7) carbon fiber material. IM7 is a continuous, high-tensile strength composite with outstanding structural qualities such as high shear strength, tensile strength and modulus as well as excellent corrosion, creep, and fatigue resistance. IM7 carbon fiber, combined with existing thermoset and thermoplastic resin systems, can provide improvements in material strength reinforcement and deformation-resistant properties for high-temperature applications. Void analysis of the different layups of the IM7 material discovered the largest total void composition within the [+45/90/90/-45] composite panel. Tensile and compressional testing proved the highest mechanical strength was found in the [0]4 layup. This paper further investigates the infusion procedure of a low-cost/high-performance BMI resin into an IM7 carbon fiber material and the

  7. Charge transfer processes: the role of optimized molecular orbitals.

    PubMed

    Meyer, Benjamin; Domingo, Alex; Krah, Tim; Robert, Vincent

    2014-08-01

    The influence of the molecular orbitals on charge transfer (CT) reactions is analyzed through wave function-based calculations. Characteristic CT processes in the organic radical 2,5-di-tert-butyl-6-oxophenalenoxyl linked with tetrathiafulvalene and the inorganic crystalline material LaMnO3 show that changes in the inner shells must be explicitly taken into account. Such electronic reorganization can lead to a reduction of the CT vertical transition energy up to 66%. A state-specific approach accessible through an adapted CASSCF (complete active space self-consistent field) methodology is capable of reaching good agreement with the experimental spectroscopy of CT processes. A partitioning of the relaxation energy in terms of valence- and inner-shells is offered and sheds light on their relative importance. This work paves the way to the intimate description of redox reactions using quantum chemistry methods.

  8. Strength optimization of alpha-SiC by improved processing

    NASA Technical Reports Server (NTRS)

    Dutta, Sunil

    1986-01-01

    Silicon carbide is of great interest for structural use in aircraft and automobile engines. This ceramic combines high thermal conductivity and low coefficient of thermal expansion, and consequently has good thermal shock resistance. However, like other ceramics, silicon carbide shows strength variability due to processing flaws such as large voids, shrinkage cracks, and inclusions. Agglomerates in the starting powder seem to be the predominant cause of such defects. Improved processing techniques such as slurry pressing and hot isostatic pressing (HIP) were employed to minimize these defects and to improve strength and reliability in the fabricated material. For this purpose 2-inch diameter disks were fabricated by various consolidation techniques. These include: dry pressing and sintering, slurry pressing and sintering, and slurry pressing and HIPing. The results are discussed.

  9. Deconvoluting the Friction Stir Weld Process for Optimizing Welds

    NASA Technical Reports Server (NTRS)

    Schneider, Judy; Nunes, Arthur C.

    2008-01-01

    In the friction stir welding process, the rotating surfaces of the pin and shoulder contact the weld metal and force a rotational flow within the weld metal. Heat, generated by the metal deformation as well as frictional slippage with the contact surface, softens the metal and makes it easier to deform. As in any thermo-mechanical processing of metal, the flow conditions are critical to the quality of the weld. For example, extrusion of metal from under the shoulder of an excessively hot weld may relax local pressure and result in wormhole defects. The trace of the weld joint in the wake of the weld may vary geometrically depending upon the flow streamlines around the tool with some geometry more vulnerable to loss of strength from joint contamination than others. The material flow path around the tool cannot be seen in real time during the weld. By using analytical "tools" based upon the principles of mathematics and physics, a weld model can be created to compute features that can be observed. By comparing the computed observations with actual data, the weld model can be validated or adjusted to get better agreement. Inputs to the model to predict weld structures and properties include: hot working properties of the metal, pin tool geometry, travel rate, rotation and plunge force. Since metals record their prior hot working history, the hot working conditions imparted during FSW can be quantified by interpreting the final microstructure. Variations in texture and grain size result from variations in the strain accommodated at a given strain rate and temperature. Microstructural data from a variety of FSWs has been correlated with prior marker studies to contribute to our understanding of the FSW process. Once this stage is reached, the weld modeling process can save significant development costs by reducing costly trial-and-error approaches to obtaining quality welds.

  10. Optimization and application of Retinex algorithm in aerial image processing

    NASA Astrophysics Data System (ADS)

    Sun, Bo; He, Jun; Li, Hongyu

    2008-04-01

    In this paper, we provide a segmentation-based Retinex for improving the visual quality of aerial images obtained under complex weather conditions. With this method, an aerial image is segmented into different regions, and an adaptive Gaussian based on the segmentation is then used to process it. The method addresses problems existing in previously developed Retinex algorithms, such as halo artifacts and graying-out artifacts. The experimental results also show evidence of its improved effect.
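
    As a point of reference, classic single-scale Retinex estimates the illumination with a Gaussian surround and removes it in the log domain; the segmentation-based method above would instead adapt the Gaussian per region. A minimal sketch with a fixed global scale:

```python
# Single-scale Retinex: subtract the log of a Gaussian-blurred illumination
# estimate from the log image, then rescale to 8 bit.
import numpy as np
from scipy.ndimage import gaussian_filter

def single_scale_retinex(img, sigma=80.0):
    img = img.astype(float) + 1.0                        # avoid log(0)
    retinex = np.log(img) - np.log(gaussian_filter(img, sigma))
    out = (retinex - retinex.min()) / (retinex.max() - retinex.min())
    return (out * 255).astype(np.uint8)

rng = np.random.default_rng(6)
aerial = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)  # synthetic image
enhanced = single_scale_retinex(aerial)
```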

  11. Multiscale metrologies for process optimization of carbon nanotube polymer composites

    DOE PAGES

    Natarajan, Bharath; Orloff, Nathan D.; Ashkar, Rana; Doshi, Sagar; Twedt, Kevin; Krishnamurthy, Ajay; Davis, Chelsea; Forster, Aaron M.; Thostenson, Erik; Obrzut, Jan; et al

    2016-07-18

    Carbon nanotube (CNT) polymer nanocomposites are attractive multifunctional materials with a growing range of commercial applications. With the increasing demand for these materials, it is imperative to develop and validate methods for on-line quality control and process monitoring during production. In this work, a novel combination of characterization techniques is utilized that facilitates the non-invasive assessment of CNT dispersion in epoxy produced by the scalable process of calendering. First, the structural parameters of these nanocomposites are evaluated across multiple length scales (10^-10 m to 10^-3 m) using scanning gallium-ion microscopy, transmission electron microscopy and small-angle neutron scattering. Then, a non-contact resonant microwave cavity perturbation (RCP) technique is employed to accurately measure the AC electrical conductivity of the nanocomposites. Quantitative correlations between the conductivity and structural parameters find the RCP measurements to be sensitive to CNT mass fraction, spatial organization and, therefore, the processing parameters. These results, and the non-contact nature and speed of RCP measurements, identify this technique as being ideally suited for quality control of CNT nanocomposites in a nanomanufacturing environment. In conclusion, when validated by the multiscale characterization suite, RCP may be broadly applicable in the production of hybrid functional materials, such as graphene, gold nanorod, and carbon black nanocomposites.

  12. A Graph-Based Ant Colony Optimization Approach for Process Planning

    PubMed Central

    Wang, JinFeng; Fan, XiaoLiang; Wan, Shuting

    2014-01-01

    The complex process planning problem is modeled as a combinatorial optimization problem with constraints in this paper. An ant colony optimization (ACO) approach has been developed to deal with the process planning problem by simultaneously considering activities such as sequencing operations, selecting manufacturing resources, and determining setup plans to achieve the optimal process plan. A weighted directed graph is constructed to describe the operations, the precedence constraints between operations, and the possible paths between operation nodes. A representation of the process plan is described based on the weighted directed graph. The ant colony traverses the necessary nodes on the graph to achieve the optimal solution with the objective of minimizing total production costs (TPC). Two cases have been carried out to study the influence of various parameters of ACO on the system performance. Extensive comparative experiments have been conducted to demonstrate the feasibility and efficiency of the proposed approach. PMID:24995355
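
    A minimal ant-colony sketch for sequencing nodes of a weighted directed graph: ants build orderings biased by pheromone and edge cost, and pheromone is evaporated and then reinforced along the best ordering found. The graph, weights and parameters are illustrative; the paper's model additionally encodes machine, tool and setup choices.

```python
# Minimal ACO for node sequencing on a weighted directed graph, with
# pheromone evaporation and reinforcement of the iteration-best tour.
import random

nodes = ["A", "B", "C", "D"]
w = {(i, j): random.uniform(1, 10) for i in nodes for j in nodes if i != j}
tau = {edge: 1.0 for edge in w}                 # pheromone on each edge
alpha, rho, n_ants, n_iter = 1.0, 0.1, 10, 50

def build_tour():
    seq = [random.choice(nodes)]
    while len(seq) < len(nodes):
        cur = seq[-1]
        cand = [n for n in nodes if n not in seq]
        probs = [tau[(cur, n)] ** alpha / w[(cur, n)] for n in cand]
        seq.append(random.choices(cand, weights=probs)[0])
    return seq

def cost(seq):
    return sum(w[(a, b)] for a, b in zip(seq, seq[1:]))

best = None
for _ in range(n_iter):
    tours = [build_tour() for _ in range(n_ants)]
    it_best = min(tours, key=cost)
    if best is None or cost(it_best) < cost(best):
        best = it_best
    for edge in tau:                             # evaporation
        tau[edge] *= (1 - rho)
    for a, b in zip(it_best, it_best[1:]):       # reinforcement
        tau[(a, b)] += 1.0 / cost(it_best)

print(best, cost(best))
```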

  13. Life cycle analysis within pharmaceutical process optimization and intensification: case study of active pharmaceutical ingredient production.

    PubMed

    Ott, Denise; Kralisch, Dana; Denčić, Ivana; Hessel, Volker; Laribi, Yosra; Perrichon, Philippe D; Berguerand, Charline; Kiwi-Minsker, Lioubov; Loeb, Patrick

    2014-12-01

    As the demand for new drugs is rising, the pharmaceutical industry faces the quest of shortening development time, and thus, reducing the time to market. Environmental aspects typically still play a minor role within the early phase of process development. Nevertheless, it is highly promising to rethink, redesign, and optimize process strategies as early as possible in active pharmaceutical ingredient (API) process development, rather than later at the stage of already established processes. The study presented herein deals with a holistic life-cycle-based process optimization and intensification of a pharmaceutical production process targeting a low-volume, high-value API. Striving for process intensification by transfer from batch to continuous processing, as well as an alternative catalytic system, different process options are evaluated with regard to their environmental impact to identify bottlenecks and improvement potentials for further process development activities.

  14. The role of optimism in the process of schema-focused cognitive therapy of personality problems.

    PubMed

    Hoffart, Asle; Sexton, Harold

    2002-06-01

    The aim of this study was to examine the determinants and effects of optimism in the process of schema-focused cognitive therapy of personality problems. The sample consisted of 35 patients with panic disorder and/or agoraphobia and DSM-IV Cluster C personality traits who participated in an 11-week residential program with one symptom-focused and one personality-focused phase. This study examines the role played by optimism during the individual sessions of the second phase, using a time series approach. Decreased patient's belief in his/her primary Early Maladaptive Schema and increased patient-experienced empathy from the therapist in a session predicted increased patient-rated optimism before the subsequent session. Increased patient-rated optimism in turn predicted decreased schema belief and distress and increased insight, empathy, and therapist-rated optimism. The slope of optimism across sessions was related to change in most of the overall outcome measures. There appears to be a positive feedback in the process of schema-focused cognitive therapy between decreased schema belief and increased optimism. In addition, optimism appears to mediate the effects of schema belief and therapist empathy on overall improvement, and to serve as an antecedent to decreased distress and to increased empathy, insight, and therapist's optimism. PMID:12051481

  15. The role of optimism in the process of schema-focused cognitive therapy of personality problems.

    PubMed

    Hoffart, Asle; Sexton, Harold

    2002-06-01

    The aim of this study was to examine the determinants and effects of optimism in the process of schema-focused cognitive therapy of personality problems. The sample consisted of 35 patients with panic disorder and/or agoraphobia and DSM-IV Cluster C personality traits who participated in an 11-week residential program with one symptom-focused and one personality-focused phase. This study examines the role played by optimism during the individual sessions of the second phase, using a time series approach. Decreased patient's belief in his/her primary Early Maladaptive Schema and increased patient-experienced empathy from the therapist in a session predicted increased patient-rated optimism before the subsequent session. Increased patient-rated optimism in turn predicted decreased schema belief and distress and increased insight, empathy, and therapist-rated optimism. The slope of optimism across sessions was related to change in most of the overall outcome measures. There appears to be a positive feedback in the process of schema-focused cognitive therapy between decreased schema belief and increased optimism. In addition, optimism appears to mediate the effects of schema belief and therapist empathy on overall improvement, and to serve as an antecedent to decreased distress and to increased empathy, insight, and therapist's optimism.

  16. ``Just Another Hoop to Jump Through?'' Using Environmental Laws and Processes to Protect Indigenous Rights

    NASA Astrophysics Data System (ADS)

    Middleton, Beth Rose

    2013-11-01

    Protection of culturally important indigenous landscapes has become an increasingly important component of environmental management processes, for both companies and individuals striving to comply with environmental regulations, and for indigenous groups seeking stronger laws to support site protection and cultural/human rights. Given that indigenous stewardship of culturally important sites, species, and practices continues to be threatened or prohibited on lands out of indigenous ownership, this paper examines whether or not indigenous people can meaningfully apply mainstream environmental management laws and processes to achieve protection of traditional sites and associated stewardship activities. While environmental laws can provide a “back door” to protect traditional sites and practices, they are not made for this purpose, and, as such, require specific amendments to become more useful for indigenous practitioners. Acknowledging thoughtful critiques of the cultural incommensurability of environmental law with indigenous environmental stewardship of sacred sites, I interrogate the ability of four specific environmental laws and processes—the Uniform Conservation Easement Act; the National Environmental Policy Act and the California Environmental Quality Act; the Pacific Stewardship Council land divestiture process; and Senate Bill 18 (CA-2004)—to protect culturally important landscapes and practices. I offer suggestions for improving these laws and processes to make them more applicable to indigenous stewardship of traditional landscapes.

  17. Efficient Robust Optimization of Metal Forming Processes using a Sequential Metamodel Based Strategy

    NASA Astrophysics Data System (ADS)

    Wiebenga, J. H.; Klaseboer, G.; van den Boogaard, A. H.

    2011-08-01

    The coupling of Finite Element (FE) simulations to mathematical optimization techniques has contributed significantly to product improvements and cost reductions in the metal forming industries. The next challenge is to bridge the gap between deterministic optimization techniques and the industrial need for robustness. This paper introduces a new and generally applicable structured methodology for modeling and solving robust optimization problems. Stochastic design variables or noise variables are taken into account explicitly in the optimization procedure. The metamodel-based strategy is combined with a sequential improvement algorithm to efficiently increase the accuracy of the objective function prediction. This is only done at regions of interest containing the optimal robust design. Application of the methodology to an industrial V-bending process resulted in valuable process insights and an improved robust process design. Moreover, a significant improvement of the robustness (>2σ) was obtained by minimizing the deteriorating effects of several noise variables. The robust optimization results demonstrate the general applicability of the robust optimization strategy and underline the importance of including uncertainty and robustness explicitly in the numerical optimization procedure.
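
    The following is a minimal sketch of the general pattern (fit a metamodel over design and noise variables, then choose the design that minimizes a mean-plus-two-sigma robust objective over sampled noise); the quadratic test response, its coefficients, and the noise distribution are invented, and the sequential infill step used in the paper to refine the metamodel near the optimum is omitted.

        import numpy as np

        rng = np.random.default_rng(0)

        def process_response(x, z):
            """Hypothetical stand-in for an expensive FE metal-forming simulation.
            x : design variable (e.g. punch displacement), z : noise variable
            (e.g. sheet-thickness scatter).  Returns a quality measure to minimize."""
            return (x - 1.5) ** 2 + 0.8 * x * z + 0.3 * z ** 2

        # 1. Sample a small design of experiments over (x, z) and fit a quadratic
        #    metamodel by least squares (a stand-in for kriging/RSM surrogates).
        X = rng.uniform(0.0, 3.0, size=40)
        Z = rng.normal(0.0, 0.5, size=40)
        Y = process_response(X, Z)
        A = np.column_stack([np.ones_like(X), X, Z, X**2, X*Z, Z**2])
        coef, *_ = np.linalg.lstsq(A, Y, rcond=None)

        def surrogate(x, z):
            return coef @ np.array([1.0, x, z, x**2, x*z, z**2])

        # 2. Robust objective: mean + 2*std of the surrogate over the noise
        #    distribution, evaluated by Monte Carlo on a grid of candidate designs.
        z_samples = rng.normal(0.0, 0.5, size=500)
        designs = np.linspace(0.0, 3.0, 61)
        robust = []
        for x in designs:
            y = np.array([surrogate(x, z) for z in z_samples])
            robust.append(y.mean() + 2.0 * y.std())
        best = designs[int(np.argmin(robust))]
        print(f"robust optimum near x = {best:.2f}")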

  18. Process Cost Modeling for Multi-Disciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Freeman, William (Technical Monitor)

    2002-01-01

    For early design concepts, the conventional approach to cost is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and are nowhere near the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This report outlines the development of a process-based cost model in which the physical elements of the vehicle are costed according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed up to provide an overall estimation of the total production cost for a design configuration. This capability to directly link any design configuration to realistic cost estimation is a key requirement for high payoff MDO problems. Another important consideration in this report is the handling of part or product complexity. Here the concept of cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the development of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool. In successive sections, the report addresses the issues of cost modeling as follows. First, an introduction is presented to provide the background for the research work. Next, a quick review of cost estimation techniques is made with the intention to

  19. Experiments with repeating weighted boosting search for optimization in signal processing applications.

    PubMed

    Chen, Sheng; Wang, Xunxian; Harris, Chris J

    2005-08-01

    Many signal processing applications pose optimization problems with multimodal and nonsmooth cost functions. Gradient methods are ineffective in these situations, and optimization methods that require no gradient and can achieve a global optimal solution are highly desirable for tackling these difficult problems. The paper proposes a guided global search optimization technique, referred to as the repeated weighted boosting search. The proposed optimization algorithm is extremely simple and easy to implement, requiring minimal programming effort. A heuristic explanation is given for the global search capability of this technique. Comparison is made with two better-known and widely used guided global search techniques, the genetic algorithm and adaptive simulated annealing, in terms of the requirements for algorithmic parameter tuning. The effectiveness of the proposed algorithm as a global optimizer is investigated through several application examples.

  20. Modeling hydrogen diffusion for solar cell passivation and process optimization

    NASA Astrophysics Data System (ADS)

    Zhang, Yi

    A diffusion model for hydrogen (H) in crystalline silicon was established which takes into account the charged state conversion, junction field, mobile traps, and complex formation and dissociation at dopant and trap sites. Carrier exchange among the various charged species is a "fast" process compared to the diffusion process. A numerical method was developed to solve the densities of various charged species from Poisson's equation involving shallow-level dopants and one "negative U" impurity, e.g., H. A time-domain implicit method was adopted in the finite-difference scheme to solve the fully coupled equations. Limiting versions of the model were applied to problems that are of interest to photovoltaics. A simplified trap-limited model was used to describe the low-temperature diffusion profiles, assuming process-induced traps, a constant bulk trap level, and trapping/detrapping mechanisms. The results of the simulation agreed with those obtained from experiments. The best fit yielded a low surface free H concentration, Cs (~10¹⁴ cm⁻³), from the high-temperature extrapolated diffusivity value. In the case of ion beam hydrogenation, mobile traps needed to be considered. PAS analysis showed the existence of vacancy-type defects in implanted Si substrates. Simulation of hydrogen diffusion in p-n junctions was first attempted in this work. The order of magnitude of Cs (~10¹⁴ cm⁻³) was confirmed. Simulation results showed that the preferred charged state of H is H⁻ (H⁺) on the n- (p-) side of the junction. The accumulation of H⁻ (H⁺) species on the n⁺ (p⁺) side of the n⁺-p (p⁺-n) junction was observed, which could retard the diffusion in the junction. The diffusion of hydrogen through the heavily doped region in a junction is trap-limited. Several popular hydrogenation techniques were evaluated by means of modeling and experimental observations. In particular, PECVD followed by RTP hydrogenation was found to be a two-step process: PECVD deposition serves as a predeposition step of H
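
    The fully coupled charged-species/Poisson treatment is beyond a short example, but the sketch below shows the time-domain implicit (backward-Euler) finite-difference core for plain 1-D diffusion with a fixed surface concentration; the diffusivity, grid, and boundary values are assumed for illustration, and the trap and junction-field terms of the full model are not represented.

        import numpy as np

        def implicit_diffusion_1d(c0, D, dx, dt, n_steps, c_surface):
            """Backward-Euler (fully implicit) 1-D diffusion with a Dirichlet surface BC.

            c0        : initial concentration profile (1-D array)
            D         : constant diffusivity (cm^2/s); trap-limited, field-coupled
                        transport of the full model is not represented here
            c_surface : fixed free-H concentration at x = 0 (e.g. ~1e14 cm^-3)
            """
            n = len(c0)
            r = D * dt / dx**2
            # Tridiagonal system (I - r*Laplacian) c_new = c_old
            A = np.zeros((n, n))
            np.fill_diagonal(A, 1.0 + 2.0 * r)
            A[np.arange(n - 1), np.arange(1, n)] = -r
            A[np.arange(1, n), np.arange(n - 1)] = -r
            A[0, :] = 0.0
            A[0, 0] = 1.0            # fixed surface concentration
            A[-1, -2] = -2.0 * r     # zero-flux back surface (mirror node)

            c = c0.copy()
            for _ in range(n_steps):
                b = c.copy()
                b[0] = c_surface
                c = np.linalg.solve(A, b)
            return c

        profile = implicit_diffusion_1d(np.zeros(200), D=1e-12, dx=1e-6,
                                        dt=1.0, n_steps=600, c_surface=1e14)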

  1. Optimization of high cell density fermentation process for recombinant nitrilase production in E. coli.

    PubMed

    Sohoni, Sujata Vijay; Nelapati, Dhanaraj; Sathe, Sneha; Javadekar-Subhedar, Vaishali; Gaikaiwari, Raghavendra P; Wangikar, Pramod P

    2015-01-01

    Nitrilases constitute an important class of biocatalysts for chiral synthesis. This work was undertaken with the aim to optimize nitrilase production in a host that is well-studied for protein production. Process parameters were optimized for high cell density fermentation, in batch and fed-batch modes, of Escherichia coli BL21 (DE3) expressing Pseudomonas fluorescens nitrilase with a T7 promoter based expression system. Effects of different substrates, temperature and isopropyl β-D-1-thiogalactopyranoside (IPTG) induction on nitrilase production were studied. Super optimal broth containing glycerol but without an inducer gave best results in batch mode with 32 °C as the optimal temperature. Use of IPTG led to insoluble protein and lower enzyme activity. Optimized fed-batch strategy resulted in significant improvement in specific activity as well as volumetric productivity of the enzyme. On a volumetric basis, the activity improved 40-fold compared to the unoptimized batch process. PMID:25739996

  2. Mechanism of cross-sectoral coordination between nature protection and forestry in the Natura 2000 formulation process in Slovakia.

    PubMed

    Sarvašová, Zuzana; Sálka, Jaroslav; Dobšinská, Zuzana

    2013-09-01

    Nature protection as a policy sector is not isolated and is directly or indirectly influenced by many other sectors (e.g. forestry, water management, rural development, energy, etc.). These policy sectors are neither completely segmented nor unaffected by the decisions taken in other policy sectors. Policy formulation in nature protection is therefore also influenced by different sectors. For that reason it is essential to stress the need for inter-sectoral coordination to assure their policy coherence. The aim of this article is to describe the mechanism and modes of cross-sectoral coordination and to analyze the relevant actors and their interactions, using the case of the Natura 2000 formulation process in Slovakia. The European Union (EU) set up an ecological network of special protected areas, known as Natura 2000, to ensure biodiversity by conserving natural habitats and wild fauna and flora in the territory of the Member States. Optimized nature protection must therefore carefully consider existing limits and cross-disciplinary relationships at the EU, national and regional levels. The relations between forestry and biodiversity protection are analyzed using the advocacy coalition framework (ACF). The ACF is used to analyze how two coalitions, in this case the ecological and forest owners' coalitions, advocate or pursue their beliefs in the nature protection and forestry policy fields. The whole process is illustrated at the regional scale with the case study of the formulation of Natura 2000 sites in the Slovak Republic. For better reliability and validity of the research, a combination of various empirical research methods was used, supported by existing theories. So-called triangulation of sociological research, or triangulation of methods, consists of mutual testing of the results of individual methodological steps through identifying corresponding political-science theories, assessing their formal points using primary and secondary document analysis and assessing their

  3. Mechanism of cross-sectoral coordination between nature protection and forestry in the Natura 2000 formulation process in Slovakia.

    PubMed

    Sarvašová, Zuzana; Sálka, Jaroslav; Dobšinská, Zuzana

    2013-09-01

    Nature protection as a policy sector is not isolated and is directly or indirectly influenced by many other sectors (e.g. forestry, water management, rural development, energy, etc.). These policy sectors are neither completely segmented nor unaffected by the decisions taken in other policy sectors. Policy formulation in nature protection is therefore also influenced by different sectors. For that reason it is essential to stress the need for inter-sectoral coordination to assure their policy coherence. The aim of this article is to describe the mechanism and modes of cross-sectoral coordination and to analyze the relevant actors and their interactions, using the case of the Natura 2000 formulation process in Slovakia. The European Union (EU) set up an ecological network of special protected areas, known as Natura 2000, to ensure biodiversity by conserving natural habitats and wild fauna and flora in the territory of the Member States. Optimized nature protection must therefore carefully consider existing limits and cross-disciplinary relationships at the EU, national and regional levels. The relations between forestry and biodiversity protection are analyzed using the advocacy coalition framework (ACF). The ACF is used to analyze how two coalitions, in this case the ecological and forest owners' coalitions, advocate or pursue their beliefs in the nature protection and forestry policy fields. The whole process is illustrated at the regional scale with the case study of the formulation of Natura 2000 sites in the Slovak Republic. For better reliability and validity of the research, a combination of various empirical research methods was used, supported by existing theories. So-called triangulation of sociological research, or triangulation of methods, consists of mutual testing of the results of individual methodological steps through identifying corresponding political-science theories, assessing their formal points using primary and secondary document analysis and assessing their

  4. Process optimization of ultrasonic spray coating of polymer films.

    PubMed

    Bose, Sanjukta; Keller, Stephan S; Alstrøm, Tommy S; Boisen, Anja; Almdal, Kristoffer

    2013-06-11

    In this work we have performed a detailed study of the influence of various parameters on spray coating of polymer films. Our aim is to produce polymer films of uniform thickness (500 nm to 1 μm) and low roughness compared to the film thickness. The coatings are characterized with respect to thickness, roughness (profilometer), and morphology (optical microscopy). Polyvinylpyrrolidone (PVP) is used to do a full factorial design of experiments with selected process parameters such as temperature, distance between spray nozzle and substrate, and speed of the spray nozzle. A mathematical model is developed for statistical analysis which identifies the distance between nozzle and substrate as the most significant parameter. Depending on the drying of the sprayed droplets on the substrate, we define two broad regimes, "dry" and "wet". The optimum condition of spraying lies in a narrow window between these two regimes, where we obtain a film of desired quality. Both with increasing nozzle-substrate distance and temperature, the deposition moves from a wet state to a dry regime. Similar results are also achieved for solvents with low boiling points. Finally, we study film formation during spray coating with poly (D,L-lactide) (PDLLA). The results confirm the processing knowledge obtained with PVP and indicate that the observed trends are identical for spraying of other polymer films.
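
    As an illustration of the design-of-experiments step, the sketch below builds a two-level full factorial over the three cited factors and estimates their main effects; the response values are invented (chosen so that the nozzle-substrate distance dominates, loosely mirroring the paper's finding) and do not come from the reported experiments.

        import itertools
        import numpy as np

        factors = {
            "temperature_C": (25, 60),
            "distance_mm":   (40, 80),
            "speed_mm_s":    (50, 150),
        }

        # Full factorial design: every combination of low/high levels (8 runs).
        runs = list(itertools.product(*factors.values()))

        # Hypothetical measured responses (film roughness, nm) for the 8 runs,
        # in the same order as `runs`; real data would come from profilometry.
        roughness = np.array([620.0, 590.0, 270.0, 240.0, 560.0, 540.0, 230.0, 200.0])

        # Main effect of each factor = mean(high level) - mean(low level).
        runs = np.array(runs, dtype=float)
        for j, name in enumerate(factors):
            low, high = factors[name]
            effect = (roughness[runs[:, j] == high].mean()
                      - roughness[runs[:, j] == low].mean())
            print(f"{name:15s} main effect on roughness: {effect:+.1f} nm")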

  5. Parameter Optimization of Nitriding Process Using Chemical Kinetics

    NASA Astrophysics Data System (ADS)

    Özdemir, İ. Bedii; Akar, Firat; Lippmann, Nils

    2016-09-01

    Using the dynamics of chemical kinetics, an investigation to search for an optimum condition for a gas nitriding process is performed over the solution space spanned by the initial temperature and gas composition of the furnace. For a two-component furnace atmosphere, the results are presented in temporal variations of gas concentrations and the nitrogen coverage on the surface. It seems that the exploitation of the nitriding kinetics can provide important feedback for setting the model-based control algorithms. The present work shows that when the nitrogen gas concentration is not allowed to exceed 6 pct, the N_ad coverage can attain maximum values as high as 0.97. The time evolution of the N_ad coverage also reveals that, as long as the temperature is above the value where nitrogen poisoning of the surface due to the low-temperature adsorption of excess nitrogen occurs, the initial ammonia content in the furnace atmosphere is much more important in the nitriding process than is the initial temperature.

  6. Process optimization of ultrasonic spray coating of polymer films.

    PubMed

    Bose, Sanjukta; Keller, Stephan S; Alstrøm, Tommy S; Boisen, Anja; Almdal, Kristoffer

    2013-06-11

    In this work we have performed a detailed study of the influence of various parameters on spray coating of polymer films. Our aim is to produce polymer films of uniform thickness (500 nm to 1 μm) and low roughness compared to the film thickness. The coatings are characterized with respect to thickness, roughness (profilometer), and morphology (optical microscopy). Polyvinylpyrrolidone (PVP) is used to do a full factorial design of experiments with selected process parameters such as temperature, distance between spray nozzle and substrate, and speed of the spray nozzle. A mathematical model is developed for statistical analysis which identifies the distance between nozzle and substrate as the most significant parameter. Depending on the drying of the sprayed droplets on the substrate, we define two broad regimes, "dry" and "wet". The optimum condition of spraying lies in a narrow window between these two regimes, where we obtain a film of desired quality. Both with increasing nozzle-substrate distance and temperature, the deposition moves from a wet state to a dry regime. Similar results are also achieved for solvents with low boiling points. Finally, we study film formation during spray coating with poly (D,L-lactide) (PDLLA). The results confirm the processing knowledge obtained with PVP and indicate that the observed trends are identical for spraying of other polymer films. PMID:23631433

  7. Thermal modeling of grinding for process optimization and durability improvements

    NASA Astrophysics Data System (ADS)

    Hanna, Ihab M.

    Both thermal and mechanical aspects of the grinding process are investigated in detail in an effort to predict grinding-induced residual stresses. An existing thermal model is used as a foundation for computing heat partitions and temperatures in surface grinding. By numerically processing data from IR temperature measurements of the grinding zone, characterizations are made of the grinding zone heat flux. It is concluded that the typical heat flux profile in the grinding zone is triangular in shape, supporting this often-used assumption found in the literature. Further analyses of the computed heat flux profiles have revealed that actual grinding zone contact lengths exceed geometric contact lengths by an average of 57% for the cases considered. By integrating the resulting heat flux profiles, workpiece energy partitions are computed for several cases of dry conventional grinding of hardened steel. The average workpiece energy partition for the cases considered was 37%. In an effort to more accurately predict grinding zone temperatures and heat fluxes, refinements are made to the existing thermal model. These include consideration of contact length extensions due to local elastic deformations, variations of the assumed contact area ratio as a function of grinding process parameters, consideration of coolant latent heat of vaporization and its effect on heat transfer beyond the coolant boiling point, and incorporation of coolant-workpiece convective heat flux effects outside the grinding zone. The result of the model refinements accounting for contact length extensions and process-dependent contact area ratios is excellent agreement with IR temperature measurements over a wide range of grinding conditions. By accounting for latent heat of vaporization effects, grinding zone temperature profiles are shown to be capable of reproducing measured profiles found in the literature for cases on the verge of thermal surge conditions. Computed peak grinding zone temperatures

  8. Comparison of batch and continuous multi-column protein A capture processes by optimal design.

    PubMed

    Baur, Daniel; Angarita, Monica; Müller-Späth, Thomas; Steinebach, Fabian; Morbidelli, Massimo

    2016-07-01

    Multi-column capture processes show several advantages compared to batch capture. It is however not evident how many columns one should use exactly. To investigate this issue, twin-column CaptureSMB, 3- and 4-column periodic counter-current chromatography (PCC) and single column batch capture are numerically optimized and compared in terms of process performance for capturing a monoclonal antibody using protein A chromatography. Optimization is carried out with respect to productivity and capacity utilization (amount of product loaded per cycle compared to the maximum amount possible), while keeping yield and purity constant. For a wide range of process parameters, all three multi-column processes show similar maximum capacity utilization and performed significantly better than batch. When maximizing productivity, the CaptureSMB process shows optimal performance, except at high feed titers, where batch chromatography can reach higher productivity values than the multi-column processes due to the complete decoupling of the loading and elution steps, albeit at a large cost in terms of capacity utilization. In terms of trade-off, i.e. how much the capacity utilization decreases with increasing productivity, CaptureSMB is optimal for low and high feed titers, whereas the 3-column process is optimal in an intermediate region. Using these findings, the most suitable process can be chosen for different production scenarios.

  9. Comparison of batch and continuous multi-column protein A capture processes by optimal design.

    PubMed

    Baur, Daniel; Angarita, Monica; Müller-Späth, Thomas; Steinebach, Fabian; Morbidelli, Massimo

    2016-07-01

    Multi-column capture processes show several advantages compared to batch capture. It is however not evident how many columns one should use exactly. To investigate this issue, twin-column CaptureSMB, 3- and 4-column periodic counter-current chromatography (PCC) and single column batch capture are numerically optimized and compared in terms of process performance for capturing a monoclonal antibody using protein A chromatography. Optimization is carried out with respect to productivity and capacity utilization (amount of product loaded per cycle compared to the maximum amount possible), while keeping yield and purity constant. For a wide range of process parameters, all three multi-column processes show similar maximum capacity utilization and performed significantly better than batch. When maximizing productivity, the CaptureSMB process shows optimal performance, except at high feed titers, where batch chromatography can reach higher productivity values than the multi-column processes due to the complete decoupling of the loading and elution steps, albeit at a large cost in terms of capacity utilization. In terms of trade-off, i.e. how much the capacity utilization decreases with increasing productivity, CaptureSMB is optimal for low and high feed titers, whereas the 3-column process is optimal in an intermediate region. Using these findings, the most suitable process can be chosen for different production scenarios. PMID:26992151

  10. Optimization of the production process using virtual model of a workspace

    NASA Astrophysics Data System (ADS)

    Monica, Z.

    2015-11-01

    Optimization of the production process is an element of the design cycle consisting of: problem definition, modelling, simulation, optimization and implementation. Without the use of simulation techniques, the only thing that can be achieved is a larger or smaller improvement of the process, not its optimization (i.e., the best result that can be obtained for the conditions under which the process works). Optimization generally comprises management actions that ultimately bring savings in time, resources and raw materials and improve the performance of a specific process. It does not matter whether it is a service or manufacturing process. Optimization generates savings by improving and increasing the efficiency of the processes. It consists primarily of organizational activities that require very little investment, or that rely solely on changing the organization of work. Modern companies operating in a market economy show a significant increase in interest in modern methods of production management and services. This trend is due to high competitiveness: companies that want to achieve success are forced to continually modify the way they manage and to respond flexibly to changing demand. Modern methods of production management not only imply a stable position of the company in its sector, but also influence the improvement of health and safety within the company and contribute to the implementation of more efficient rules for the standardization of work in the company. This is why the paper presents the application of an environment such as Siemens NX to create a virtual model of a production system and to simulate as well as optimize its work. The analyzed system is a robotized workcell consisting of: machine tools, industrial robots, conveyors, auxiliary equipment and buffers. The control program realizing the main task in the virtual workcell can be defined in the software. It is possible, using this tool, to optimize both the

  11. Children's cognitive triage: optimal retrieval or effortful processing?

    PubMed

    Brainerd, C J; Reyna, V F; Howe, M L

    1990-06-01

    Cognitive triage is a surprising nonmonotonic relationship that exists between the order in which children read words out of long-term memory and the memory strengths of those same words. Two forgetting experiments with 7- and 12-year-old children are reported in which fuzzy-trace theory's explanation of this effect was pitted against an effortful processing explanation. The two explanations make different predictions about the relative rates of forgetting for words that are recalled at the primacy and recency positions of output queues. The data consistently favored fuzzy-trace theory's predictions. We discuss the implications of our results for two assumptions that are commonly made in theories of memory development--namely, that recall accuracy is a monotonic-increasing function of memory strength and that recall order is a monotonic-decreasing function of memory strength.

  12. The Integration of LNT and Hormesis for Cancer Risk Assessment Optimizes Public Health Protection.

    PubMed

    Calabrese, Edward J; Shamoun, Dima Yazji; Hanekamp, Jaap C

    2016-03-01

    This paper proposes a new cancer risk assessment strategy and methodology that optimizes population-based responses by yielding the lowest disease/tumor incidence across the entire dose continuum. The authors argue that the optimization can be achieved by integrating two seemingly conflicting models; i.e., the linear no-threshold (LNT) and hormetic dose-response models. The integration would yield the optimized response at a risk of 10 with the LNT model. The integrative functionality of the LNT and hormetic dose response models provides an improved estimation of tumor incidence through model uncertainty analysis and major reductions in cancer incidence via hormetic model estimates. This novel approach to cancer risk assessment offers significant improvements over current risk assessment approaches by revealing a regulatory sweet spot that maximizes public health benefits while incorporating practical approaches for model validation. PMID:26808876

  13. Integration of Virtual Reality with Computational Fluid Dynamics for Process Optimization

    NASA Astrophysics Data System (ADS)

    Wu, B.; Chen, G. H.; Fu, D.; Moreland, John; Zhou, Chenn Q.

    2010-03-01

    Computational Fluid Dynamics (CFD) has become a powerful simulation technology used in many industrial applications for process design and optimization to save energy, improve environment, and reduce costs. In order to better understand CFD results and more easily communicate with non-CFD experts, advanced virtual reality (VR) visualization is desired for CFD post-processing. Efforts have recently been made at Purdue University Calumet to integrate VR with CFD to visualize complex data in three dimensions in an interactive, virtual environment. The virtual engineering environment greatly enhances the value of CFD simulations and allows engineers to gain much needed process insights for the design and optimization of industrial processes.

  14. Codon-optimized filovirus DNA vaccines delivered by intramuscular electroporation protect cynomolgus macaques from lethal Ebola and Marburg virus challenges.

    PubMed

    Grant-Klein, Rebecca J; Altamura, Louis A; Badger, Catherine V; Bounds, Callie E; Van Deusen, Nicole M; Kwilas, Steven A; Vu, Hong A; Warfield, Kelly L; Hooper, Jay W; Hannaman, Drew; Dupuy, Lesley C; Schmaljohn, Connie S

    2015-01-01

    Cynomolgus macaques were vaccinated by intramuscular electroporation with DNA plasmids expressing codon-optimized glycoprotein (GP) genes of Ebola virus (EBOV) or Marburg virus (MARV) or a combination of codon-optimized GP DNA vaccines for EBOV, MARV, Sudan virus and Ravn virus. When measured by ELISA, the individual vaccines elicited slightly higher IgG responses to EBOV or MARV than did the combination vaccines. No significant differences in immune responses of macaques given the individual or combination vaccines were measured by pseudovirion neutralization or IFN-γ ELISpot assays. Both the MARV and mixed vaccines were able to protect macaques from lethal MARV challenge (5/6 vs. 6/6). In contrast, a greater proportion of macaques vaccinated with the EBOV vaccine survived lethal EBOV challenge in comparison to those that received the mixed vaccine (5/6 vs. 1/6). EBOV challenge survivors had significantly higher pre-challenge neutralizing antibody titers than those that succumbed. PMID:25996997

  15. Combined micromechanical and fabrication process optimization for metal-matrix composites

    NASA Technical Reports Server (NTRS)

    Morel, M.; Saravanos, D. A.; Chamis, C. C.

    1991-01-01

    A method is presented to minimize the residual matrix stresses in metal matrix composites. Fabrication parameters such as temperature and consolidation pressure are optimized concurrently with the characteristics (i.e., modulus, coefficient of thermal expansion, strength, and interphase thickness) of a fiber-matrix interphase. By including the interphase properties in the fabrication process, lower residual stresses are achievable. Results for an ultra-high modulus graphite (P100)/copper composite show a reduction of 21 percent for the maximum matrix microstress when optimizing the fabrication process alone. Concurrent optimization of the fabrication process and interphase properties show a 41 percent decrease in the maximum microstress. Therefore, this optimization method demonstrates the capability of reducing residual microstresses by altering the temperature and consolidation pressure histories and tailoring the interphase properties for an improved composite material. In addition, the results indicate that the consolidation pressures are the most important fabrication parameters, and the coefficient of thermal expansion is the most critical interphase property.

  16. Concurrent micromechanical tailoring and fabrication process optimization for metal-matrix composites

    NASA Technical Reports Server (NTRS)

    Morel, M.; Saravanos, D. A.; Chamis, Christos C.

    1990-01-01

    A method is presented to minimize the residual matrix stresses in metal matrix composites. Fabrication parameters such as temperature and consolidation pressure are optimized concurrently with the characteristics (i.e., modulus, coefficient of thermal expansion, strength, and interphase thickness) of a fiber-matrix interphase. By including the interphase properties in the fabrication process, lower residual stresses are achievable. Results for an ultra-high modulus graphite (P100)/copper composite show a reduction of 21 percent for the maximum matrix microstress when optimizing the fabrication process alone. Concurrent optimization of the fabrication process and interphase properties show a 41 percent decrease in the maximum microstress. Therefore, this optimization method demonstrates the capability of reducing residual microstresses by altering the temperature and consolidation pressure histories and tailoring the interphase properties for an improved composite material. In addition, the results indicate that the consolidation pressures are the most important fabrication parameters, and the coefficient of thermal expansion is the most critical interphase property.

  17. Optimization of biodiesel production process using recycled vegetable oil

    NASA Astrophysics Data System (ADS)

    Lugo, Yarely

    The toxic emissions of petroleum diesel and its limited resources have created interest in the development of new energy resources, such as biodiesel. Biodiesel is traditionally produced by a transesterification reaction between vegetable oil and an alcohol in the presence of a catalyst. However, this process is slow and expensive due to the high cost of raw materials. Low-cost feedstock oils, such as recycled oils and animal fats, are available, but they cannot be transesterified with alkaline catalysts due to their high content of free fatty acids, which can lead to undesirable reactions such as saponification. In this study, we reduce the free fatty acid content by using an acid pre-treatment. We compare sulfuric acid, hydrochloric acid and p-toluenesulfonic acid (PTSA) to pre-treat recycled vegetable oil. PTSA removes water after 60 minutes of treatment at room temperature or within 15 minutes at 50°C. The pretreatment was followed by a transesterification reaction using an alkaline catalyst. To minimize costs and accelerate the reaction, the pretreatment and transesterification of recycled vegetable oil were conducted at atmospheric pressure in a microwave oven. Biodiesel was characterized using a GC-MS method.

  18. Method and apparatus for optimized processing of sparse matrices

    DOEpatents

    Taylor, Valerie E.

    1993-01-01

    A computer architecture for processing a sparse matrix is disclosed. The apparatus stores a value-row vector corresponding to nonzero values of a sparse matrix. Each of the nonzero values is located at a defined row and column position in the matrix. The value-row vector includes a first vector including nonzero values and delimiting characters indicating a transition from one column to another. The value-row vector also includes a second vector which defines row position values in the matrix corresponding to the nonzero values in the first vector and column position values in the matrix corresponding to the column position of the nonzero values in the first vector. The architecture also includes a circuit for detecting a special character within the value-row vector. Matrix-vector multiplication is executed on the value-row vector. This multiplication is performed by multiplying an index value of the first vector value by a column value from a second matrix to form a matrix-vector product which is added to a previous matrix-vector product.
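
    The sketch below gives one plausible software reading of the value-row layout (nonzero values listed column by column with a delimiting sentinel, plus a parallel vector of row indices) and uses it for a matrix-vector product; it is written for clarity, and the data-layout details of the patented hardware pipeline may differ.

        import numpy as np

        SENTINEL = None   # delimiting character marking a transition to the next column

        def matvec_value_row(values, rows, x):
            """Sparse matrix-vector product y = A @ x from a value-row representation.

            values : nonzeros listed column by column, with SENTINEL between columns
            rows   : row index of each nonzero, in the same order (no sentinels)
            x      : dense input vector, one entry per matrix column
            """
            y = np.zeros(max(rows) + 1)
            col, k = 0, 0                      # current column, index into `rows`
            for v in values:
                if v is SENTINEL:
                    col += 1                   # column transition
                    continue
                y[rows[k]] += v * x[col]
                k += 1
            return y

        # A = [[5, 0, 1],
        #      [0, 3, 0],
        #      [2, 0, 4]]
        values = [5.0, 2.0, SENTINEL, 3.0, SENTINEL, 1.0, 4.0]
        rows = [0, 2, 1, 0, 2]
        print(matvec_value_row(values, rows, np.array([1.0, 1.0, 1.0])))  # [6. 3. 6.]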

  19. 76 FR 58807 - An Assessment of Decision-Making Processes: Evaluation of Where Land Protection Planning Can...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-22

    ... Incorporate Climate Change Information-- Release of Final Report AGENCY: Environmental Protection Agency (EPA... Decision-Making Processes: Evaluation of Where Land Protection Planning can Incorporate Climate Change... planning can incorporate climate change impacts information into programs. The assessment revealed...

  20. Optimization of the processing technology of Fructus Arctii by response surface methodology.

    PubMed

    Liu, Qi-Di; Qin, Kun-Ming; Shen, Bao-Jia; Cai, Hao; Cai, Bao-Chang

    2015-03-01

    The present study was designed to optimize the processing of Fructus Arctii by response surface methodology (RSM). Based on single-factor studies, a three-variable, three-level Box-Behnken design (BBD) was used to monitor the effects of the independent variables, including processing temperature and time, on the dependent variables. Response surfaces and contour plots of the contents of total lignans, chlorogenic acid, arctiin, and arctigenin were obtained through ultraviolet and visible (UV-Vis) monitoring and high-performance liquid chromatography (HPLC). Fructus Arctii should be processed by heating in a pot at 311 °C, with the medicine at 119 °C, for 123 s with frequent flipping. The experimental values under the optimized processing conditions were consistent with the predicted values. In conclusion, RSM is an effective method to optimize the processing of traditional Chinese medicine (TCM).
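
    The following is a minimal sketch of the response-surface step: a two-factor quadratic model is fitted by least squares and its predicted optimum is located on a grid; the synthetic data, factor ranges, and coefficients are assumptions and are not the Fructus Arctii measurements or the actual BBD runs.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical experimental design: processing temperature (deg C) and time (s).
        temp = rng.uniform(250.0, 360.0, size=15)
        time = rng.uniform(60.0, 180.0, size=15)
        # Synthetic response (e.g. total lignan content) peaking near 310 deg C, 120 s.
        y = 100 - 0.004*(temp - 310)**2 - 0.01*(time - 120)**2 + rng.normal(0, 0.5, 15)

        # Second-order (RSM) model: b0 + b1*T + b2*t + b3*T^2 + b4*t^2 + b5*T*t
        X = np.column_stack([np.ones_like(temp), temp, time, temp**2, time**2, temp*time])
        b, *_ = np.linalg.lstsq(X, y, rcond=None)

        # Locate the predicted optimum on a fine grid of the factor space.
        T, t = np.meshgrid(np.linspace(250, 360, 111), np.linspace(60, 180, 121))
        pred = b[0] + b[1]*T + b[2]*t + b[3]*T**2 + b[4]*t**2 + b[5]*T*t
        i = np.unravel_index(np.argmax(pred), pred.shape)
        print(f"predicted optimum: {T[i]:.0f} deg C, {t[i]:.0f} s")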

  1. Optimization of the processing technology of Fructus Arctii by response surface methodology.

    PubMed

    Liu, Qi-Di; Qin, Kun-Ming; Shen, Bao-Jia; Cai, Hao; Cai, Bao-Chang

    2015-03-01

    The present study was designed to optimize the processing of Fructus Arctii by response surface methodology (RSM). Based on single-factor studies, a three-variable, three-level Box-Behnken design (BBD) was used to monitor the effects of the independent variables, including processing temperature and time, on the dependent variables. Response surfaces and contour plots of the contents of total lignans, chlorogenic acid, arctiin, and arctigenin were obtained through ultraviolet and visible (UV-Vis) monitoring and high-performance liquid chromatography (HPLC). Fructus Arctii should be processed by heating in a pot at 311 °C, with the medicine at 119 °C, for 123 s with frequent flipping. The experimental values under the optimized processing conditions were consistent with the predicted values. In conclusion, RSM is an effective method to optimize the processing of traditional Chinese medicine (TCM). PMID:25835367

  2. A Survey of Stochastic Simulation and Optimization Methods in Signal Processing

    NASA Astrophysics Data System (ADS)

    Pereyra, Marcelo; Schniter, Philip; Chouzenoux, Emilie; Pesquet, Jean-Christophe; Tourneret, Jean-Yves; Hero, Alfred O.; McLaughlin, Steve

    2016-03-01

    Modern signal processing (SP) methods rely very heavily on probability and statistics to solve challenging SP problems. SP methods are now expected to deal with ever more complex models, requiring ever more sophisticated computational inference techniques. This has driven the development of statistical SP methods based on stochastic simulation and optimization. Stochastic simulation and optimization algorithms are computationally intensive tools for performing statistical inference in models that are analytically intractable and beyond the scope of deterministic inference methods. They have been recently successfully applied to many difficult problems involving complex statistical models and sophisticated (often Bayesian) statistical inference techniques. This survey paper offers an introduction to stochastic simulation and optimization methods in signal and image processing. The paper addresses a variety of high-dimensional Markov chain Monte Carlo (MCMC) methods as well as deterministic surrogate methods, such as variational Bayes, the Bethe approach, belief and expectation propagation and approximate message passing algorithms. It also discusses a range of optimization methods that have been adopted to solve stochastic problems, as well as stochastic methods for deterministic optimization. Subsequently, areas of overlap between simulation and optimization, in particular optimization-within-MCMC and MCMC-driven optimization are discussed.
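
    As a small concrete instance of the simulation methods surveyed, the sketch below runs a random-walk Metropolis sampler on a made-up bimodal target; it is only one member of the MCMC family discussed in the survey, and the target density is an assumption.

        import numpy as np

        rng = np.random.default_rng(42)

        def log_target(x):
            """Log of an (unnormalized) bimodal target density, a stand-in for an
            analytically intractable posterior from a signal-processing model."""
            return np.logaddexp(-0.5 * (x - 2.0)**2, -0.5 * (x + 2.0)**2)

        def random_walk_metropolis(n_samples=20000, step=1.0, x0=0.0):
            samples = np.empty(n_samples)
            x, lp = x0, log_target(x0)
            for i in range(n_samples):
                prop = x + step * rng.normal()
                lp_prop = log_target(prop)
                if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
                    x, lp = prop, lp_prop
                samples[i] = x
            return samples

        draws = random_walk_metropolis()
        print("posterior mean ~", draws.mean(), "(target is symmetric about 0)")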

  3. New Grandparents' Mental Health: The Protective Role of Optimism, Self-Mastery, and Social Support

    ERIC Educational Resources Information Center

    Ben Shlomo, Shirley; Taubman - Ben-Ari, Orit

    2012-01-01

    The current study examines the contribution of optimism, self-mastery, perceived social support, and background variables (age, physical health, economic status) to mental health following the transition to grandparenthood. The sample consisted of 257 first-time Israeli grandparents (grandmothers and grandfathers, maternal and paternal) who were…

  4. The Traumatic Impact of the September 11, 2001, Terrorist Attacks and the Potential Protection of Optimism

    ERIC Educational Resources Information Center

    Ai, Amy L.; Evans-Campbell, Teresa; Santangelo, Linda K.; Cascio, Toni

    2006-01-01

    This study examined the impact of the September 11 terrorist attacks on graduate and undergraduate students and the role of optimism in posttraumatic distress. A sample of 457 students who attended courses at three schools of social work (Nevada, Pennsylvania, and Washington) participated in the study. A quarter of them had a known person as an…

  5. Effects of chemical protective equipment on team process performance in small unit rescue operations.

    PubMed

    Grugle, Nancy L; Kleiner, Brian M

    2007-09-01

    In the event of a nuclear, biological, or chemical terrorist attack against civilians, both military and civilian emergency response teams must be able to respond and operate efficiently while wearing protective equipment. Chemical protective equipment protects the user by providing a barrier between the individual and the hazardous environment. Unfortunately, the same equipment that is designed to support the user can potentially cause heat stress, reduced task efficiency, and reduced range of motion. Targeted Acceptable Responses to Generated Events of Tasks (TARGETS), an event-based team performance measurement methodology, was used to investigate the effects of Mission Oriented Protective Posture (MOPP) on the behavioral processes underlying team performance during simulated rescue tasks. In addition, this study determined which team processes were related to team performance outcomes. Results of six primary analyses indicated that team process performance was not degraded by MOPP 4 on any rescue task and that the team processes critical for successful task performance are task-dependent. This article discusses the implications of these results with respect to the study design and the limitations of using an event-based team performance measurement methodology.

  6. Vulnerability and Protection Talk: Systemic Therapy Process with People with Intellectual Disability

    ERIC Educational Resources Information Center

    Pote, Helen; Mazon, Teresa; Clegg, Jennifer; King, Susan

    2011-01-01

    Background: Vulnerability and protection are key concepts within the literature relating to systemic therapy for people with an intellectual disability (ID). This paper explores the processes by which these concepts were discussed in systemic therapy sessions. Method: Four videotapes of systemic therapy sessions were evaluated using a qualitative…

  7. Optimization Control of the Color-Coating Production Process for Model Uncertainty.

    PubMed

    He, Dakuo; Wang, Zhengsong; Yang, Le; Mao, Zhizhong

    2016-01-01

    Optimized control of the color-coating production process (CCPP) aims at reducing production costs and improving economic efficiency while meeting quality requirements. However, because optimization control of the CCPP is hampered by model uncertainty, a strategy that considers model uncertainty is proposed. Previous work has introduced a mechanistic model of CCPP based on process analysis to simulate the actual production process and generate process data. The partial least squares method is then applied to develop predictive models of film thickness and economic efficiency. To manage the model uncertainty, the robust optimization approach is introduced to improve the feasibility of the optimized solution. Iterative learning control is then utilized to further refine the model uncertainty. The constrained film thickness is transformed into one of the tracked targets to overcome the drawback that traditional iterative learning control cannot address constraints. The goal setting of economic efficiency is updated continuously according to the film thickness setting until this reaches its desired value. Finally, fuzzy parameter adjustment is adopted to ensure that the economic efficiency and film thickness converge rapidly to their optimized values under the constraint conditions. The effectiveness of the proposed optimization control strategy is validated by simulation results. PMID:27247563
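
    The sketch below illustrates the predictive-modeling step with scikit-learn's PLSRegression on synthetic process data; the input variables (oven temperature, line speed, coating viscosity), the thickness relationship, and the numbers are assumptions, not the plant model or data used in the paper.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(7)

        # Synthetic process data: oven temperature, line speed, coating viscosity.
        n = 200
        X = np.column_stack([
            rng.normal(250, 10, n),    # oven temperature
            rng.normal(120, 15, n),    # line speed
            rng.normal(0.8, 0.1, n),   # coating viscosity
        ])
        # Hypothetical film thickness responding linearly to the inputs plus noise.
        thickness = 20 + 0.05*X[:, 0] - 0.08*X[:, 1] + 12*X[:, 2] + rng.normal(0, 0.5, n)

        pls = PLSRegression(n_components=2)
        pls.fit(X, thickness)
        print("R^2 on training data:", round(pls.score(X, thickness), 3))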

  8. Optimization Control of the Color-Coating Production Process for Model Uncertainty.

    PubMed

    He, Dakuo; Wang, Zhengsong; Yang, Le; Mao, Zhizhong

    2016-01-01

    Optimized control of the color-coating production process (CCPP) aims at reducing production costs and improving economic efficiency while meeting quality requirements. However, because optimization control of the CCPP is hampered by model uncertainty, a strategy that considers model uncertainty is proposed. Previous work has introduced a mechanistic model of CCPP based on process analysis to simulate the actual production process and generate process data. The partial least squares method is then applied to develop predictive models of film thickness and economic efficiency. To manage the model uncertainty, the robust optimization approach is introduced to improve the feasibility of the optimized solution. Iterative learning control is then utilized to further refine the model uncertainty. The constrained film thickness is transformed into one of the tracked targets to overcome the drawback that traditional iterative learning control cannot address constraints. The goal setting of economic efficiency is updated continuously according to the film thickness setting until this reaches its desired value. Finally, fuzzy parameter adjustment is adopted to ensure that the economic efficiency and film thickness converge rapidly to their optimized values under the constraint conditions. The effectiveness of the proposed optimization control strategy is validated by simulation results.

  9. Optimization Control of the Color-Coating Production Process for Model Uncertainty

    PubMed Central

    He, Dakuo; Wang, Zhengsong; Yang, Le; Mao, Zhizhong

    2016-01-01

    Optimized control of the color-coating production process (CCPP) aims at reducing production costs and improving economic efficiency while meeting quality requirements. However, because optimization control of the CCPP is hampered by model uncertainty, a strategy that considers model uncertainty is proposed. Previous work has introduced a mechanistic model of CCPP based on process analysis to simulate the actual production process and generate process data. The partial least squares method is then applied to develop predictive models of film thickness and economic efficiency. To manage the model uncertainty, the robust optimization approach is introduced to improve the feasibility of the optimized solution. Iterative learning control is then utilized to further refine the model uncertainty. The constrained film thickness is transformed into one of the tracked targets to overcome the drawback that traditional iterative learning control cannot address constraints. The goal setting of economic efficiency is updated continuously according to the film thickness setting until this reaches its desired value. Finally, fuzzy parameter adjustment is adopted to ensure that the economic efficiency and film thickness converge rapidly to their optimized values under the constraint conditions. The effectiveness of the proposed optimization control strategy is validated by simulation results. PMID:27247563

  10. Two-step optimization of pressure and recovery of reverse osmosis desalination process.

    PubMed

    Liang, Shuang; Liu, Cui; Song, Lianfa

    2009-05-01

    Driving pressure and recovery are two primary design variables of a reverse osmosis process that largely determine the total cost of seawater and brackish water desalination. A two-step optimization procedure was developed in this paper to determine the values of driving pressure and recovery that minimize the total cost of RO desalination. It was demonstrated that the optimal net driving pressure is solely determined by the electricity price and the membrane price index, which is a lumped parameter to collectively reflect membrane price, resistance, and service time. On the other hand, the optimal recovery is determined by the electricity price, initial osmotic pressure, and costs for pretreatment of raw water and handling of retentate. Concise equations were derived for the optimal net driving pressure and recovery. The dependences of the optimal net driving pressure and recovery on the electricity price, membrane price, and costs for raw water pretreatment and retentate handling were discussed.
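
    As a numerical companion to the two design variables discussed, the sketch below minimizes a crude total-cost expression over applied pressure and recovery; the cost terms, coefficients, and osmotic-pressure approximation are invented for illustration, and the closed-form optimality equations derived in the paper are not reproduced here.

        import numpy as np
        from scipy.optimize import minimize

        # Invented cost coefficients (per unit of permeate produced)
        ENERGY_PRICE = 0.08     # cost per (pressure unit x feed volume / permeate volume)
        MEMBRANE_INDEX = 60.0   # lumped membrane price / resistance / service-life term
        PRETREAT = 0.15         # pretreatment cost per unit of raw feed water
        PI0 = 30.0              # feed osmotic pressure (bar), seawater-like

        def total_cost(v):
            p, r = v                                  # applied pressure (bar), recovery (-)
            pi_brine = PI0 / (1.0 - r)                # crude concentration-factor estimate
            ndp = p - 0.5 * (PI0 + pi_brine)          # rough average net driving pressure
            if ndp <= 0.0 or not (0.01 < r < 0.95):
                return 1e9                            # infeasible region
            energy = ENERGY_PRICE * p / r             # pumping energy per unit permeate
            membrane = MEMBRANE_INDEX / ndp           # membrane area (capital) per unit permeate
            pretreat = PRETREAT / r                   # raw-water handling per unit permeate
            return energy + membrane + pretreat

        res = minimize(total_cost, x0=np.array([60.0, 0.4]), method="Nelder-Mead")
        p_opt, r_opt = res.x
        print(f"optimal pressure ~ {p_opt:.1f} bar, optimal recovery ~ {r_opt:.2f}")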

  11. Optimization of Training Sets for Neural-Net Processing of Characteristic Patterns from Vibrating Solids

    NASA Technical Reports Server (NTRS)

    Decker, Arthur J.

    2001-01-01

    Artificial neural networks have been used for a number of years to process holography-generated characteristic patterns of vibrating structures. This technology depends critically on the selection and the conditioning of the training sets. A scaling operation called folding is discussed for conditioning training sets optimally for training feed-forward neural networks to process characteristic fringe patterns. Folding allows feed-forward nets to be trained easily to detect damage-induced vibration-displacement-distribution changes as small as 10 nm. A specific application to aerospace of neural-net processing of characteristic patterns is presented to motivate the conditioning and optimization effort.
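
    The abstract does not spell out the folding operation itself, so the sketch below only shows the generic pattern of conditioning characteristic-pattern inputs (rescaling each feature to a bounded range) before training a small feed-forward network; the synthetic fringe data and the min-max scaling are assumptions standing in for the actual conditioning described in the report.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(11)

        # Synthetic "characteristic fringe patterns": 64-pixel intensity vectors whose
        # shape drifts with a hidden displacement-distribution change we want to detect.
        n, pixels = 400, 64
        damage = rng.uniform(0.0, 1.0, n)              # target quantity (arbitrary units)
        x = np.linspace(0, 2*np.pi, pixels)
        patterns = (np.cos(np.outer(1.0 + 0.05*damage, x))
                    + rng.normal(0, 0.02, (n, pixels)))

        # Conditioning step: rescale every input feature into [0, 1] (a simple
        # stand-in for the "folding" conditioning discussed in the report).
        lo, hi = patterns.min(axis=0), patterns.max(axis=0)
        scaled = (patterns - lo) / (hi - lo + 1e-12)

        net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0)
        net.fit(scaled[:300], damage[:300])
        print("held-out R^2:", round(net.score(scaled[300:], damage[300:]), 3))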

  12. MINLP models for the synthesis of optimal peptide tags and downstream protein processing.

    PubMed

    Simeonidis, Evangelos; Pinto, Jose M; Lienqueo, M Elena; Tsoka, Sophia; Papageorgiou, Lazaros G

    2005-01-01

    The development of systematic methods for the synthesis of downstream protein processing operations has seen growing interest in recent years, as purification is often the most complex and costly stage in biochemical production plants. The objective of the work presented here is to develop mathematical models based on mixed-integer optimization techniques, which integrate the selection of optimal peptide purification tags into an established framework for the synthesis of protein purification processes. Peptide tags are comparatively short sequences of amino acids fused onto the protein product, capable of reducing the number of required purification steps. The methodology is illustrated through its application to two example protein mixtures involving up to 13 contaminants and a set of 11 candidate chromatographic steps. The results are indicative of the benefits resulting from the appropriate use of peptide tags in purification processes and provide a guideline for both optimal tag design and downstream process synthesis. PMID:15932268

  13. A Conductivity Relationship for Steady-state Unsaturated Flow Processes under Optimal Flow Conditions

    SciTech Connect

    Liu, H. H.

    2010-09-15

    Optimality principles have been used for investigating physical processes in different areas. This work attempts to apply an optimality principle (that water flow resistance is minimized on a global scale) to steady-state unsaturated flow processes. Based on the calculus of variations, we show that under optimal conditions, hydraulic conductivity for steady-state unsaturated flow is proportional to a power function of the magnitude of water flux. This relationship is consistent with an intuitive expectation that for an optimal water flow system, locations where relatively large water fluxes occur should correspond to relatively small resistance (or large conductance). Similar results were also obtained for hydraulic structures in river basins and tree leaves, as reported in other studies. Consistency of this theoretical result with observed fingering-flow behavior in unsaturated soils and an existing model is also demonstrated.

  14. Optimization of the Temperature-Time Curve for the Curing Process of Thermoset Matrix Composites

    NASA Astrophysics Data System (ADS)

    Aleksendrić, Dragan; Carlone, Pierpaolo; Ćirović, Velimir

    2016-05-01

    An intelligent optimization model aiming at off-line or pre-series optimization of the thermal curing cycle of polymer matrix composites is proposed and discussed. The computational procedure is based on the coupling of a finite element thermochemical process model, dynamic artificial neural networks and genetic algorithms. Objective of the optimization routine is the maximization of the composite degree of cure by the definition of the autoclave temperature. Obtained outcomes evidenced the capability of the method as well as its efficiency with respect to hard computing or experimental procedures.

  15. Optimization of the Temperature-Time Curve for the Curing Process of Thermoset Matrix Composites

    NASA Astrophysics Data System (ADS)

    Aleksendrić, Dragan; Carlone, Pierpaolo; Ćirović, Velimir

    2016-10-01

    An intelligent optimization model aiming at off-line or pre-series optimization of the thermal curing cycle of polymer matrix composites is proposed and discussed. The computational procedure is based on the coupling of a finite element thermochemical process model, dynamic artificial neural networks and genetic algorithms. Objective of the optimization routine is the maximization of the composite degree of cure by the definition of the autoclave temperature. Obtained outcomes evidenced the capability of the method as well as its efficiency with respect to hard computing or experimental procedures.

  16. Protecting Public Health in Nuclear Emergencies-the Need to Broaden the Process.

    PubMed

    Carr, Z; Weiss, W; Roebbel, N; Abrahams, J

    2016-09-01

    The radiation protection system needs to broaden its view of the impact of nuclear accidents beyond radiation dose, taking into consideration the psychological, social and economic determinants that shape the vulnerability of the exposed population, as well as the impacts of emergency countermeasures. It is strongly recommended to pursue strategies, approaches and services that address these aspects within the general health protection system and are applied before, during and after an emergency. The paper raises awareness and proposes a three-step development process for an integrated framework based on the social determinants of health approach. PMID:27542815

  17. Process optimization for osmo-dehydrated carambola (Averrhoa carambola L) slices and its storage studies.

    PubMed

    Roopa, N; Chauhan, O P; Raju, P S; Das Gupta, D K; Singh, R K R; Bawa, A S

    2014-10-01

    An osmotic-dehydration process protocol was developed for carambola (Averrhoa carambola L.), an exotic star-shaped tropical fruit. The process was optimized using Response Surface Methodology (RSM) following a Central Composite Rotatable Design (CCRD). The experimental variables selected for the optimization were soak solution concentration (°Brix), soaking temperature (°C) and soaking time (min), with six experiments at the central point. The effect of the process variables on solid gain and water loss during osmotic dehydration was studied. The data obtained were analyzed using multiple regression to generate suitable mathematical models. Quadratic models fit well (R², 95.58-98.64 %) in describing the effect of the variables on the responses studied. The optimized levels of the process variables were 70 °Brix, 48 °C and 144 min for soak solution concentration, soaking temperature and soaking time, respectively. The predicted and experimental results at the optimized levels showed high correlation. The osmo-dehydrated product prepared under the optimized conditions had a shelf-life of 10, 8 and 6 months at 5 °C, ambient (30 ± 2 °C) and 37 °C, respectively. PMID:25328186

  18. Process optimization for osmo-dehydrated carambola (Averrhoa carambola L) slices and its storage studies.

    PubMed

    Roopa, N; Chauhan, O P; Raju, P S; Das Gupta, D K; Singh, R K R; Bawa, A S

    2014-10-01

    An osmotic-dehydration process protocol was developed for carambola (Averrhoa carambola L.), an exotic star-shaped tropical fruit. The process was optimized using Response Surface Methodology (RSM) following a Central Composite Rotatable Design (CCRD). The experimental variables selected for the optimization were soak solution concentration (°Brix), soaking temperature (°C) and soaking time (min), with six experiments at the central point. The effect of the process variables on solid gain and water loss during osmotic dehydration was studied. The data obtained were analyzed using multiple regression to generate suitable mathematical models. Quadratic models fit well (R², 95.58-98.64 %) in describing the effect of the variables on the responses studied. The optimized levels of the process variables were 70 °Brix, 48 °C and 144 min for soak solution concentration, soaking temperature and soaking time, respectively. The predicted and experimental results at the optimized levels showed high correlation. The osmo-dehydrated product prepared under the optimized conditions had a shelf-life of 10, 8 and 6 months at 5 °C, ambient (30 ± 2 °C) and 37 °C, respectively.

  19. The protective effect of Agaricus blazei Murrill, submerged culture using the optimized medium composition, on alcohol-induced liver injury.

    PubMed

    Wang, Hang; Li, Gang; Zhang, Wenyu; Han, Chunchao; Xu, Xin; Li, Yong-Ping

    2014-01-01

    Agaricus blazei Murrill (ABM), an edible mushroom native to Brazil, is widely used for non-prescription and medicinal purposes. Alcoholic liver disease (ALD), which can develop from prolonged or heavy alcohol intake, is considered a leading cause of liver injury in the modern diet. In this study, the medium composition for ABM was optimized using response surface methodology for maximum mycelial biomass and extracellular polysaccharide (EPS) production. The model predicts maximal mycelial biomass and extracellular polysaccharide of 1.047 g/100 mL and 0.367 g/100 mL, respectively, when potato is 29.88 g/100 mL, glucose is 1.01 g/100 mL, and bran is 1.02 g/100 mL. Verification experiments showed that the results were consistent with the model prediction, and the trends of mycelial biomass and extracellular polysaccharide were also predicted by an artificial neural network. The optimized medium was then used for the submerged culture of ABM, and an alcohol-induced liver injury mouse model was used to examine the protective effect of ABM cultured in the optimized medium on the liver. Hepatic histopathological observations showed that ABM had a significant protective effect in the mouse model of alcoholic liver damage.

  20. Optimization of carbon capture systems using surrogate models of simulated processes.

    SciTech Connect

    Cozad, A.; Chang, Y.; Sahinidis, N.; Miller, D.

    2011-01-01

    With increasing demand placed on power generation plants to reduce carbon dioxide (CO2) emissions, processes to separate and capture CO2 for eventual sequestration are highly sought after. Carbon capture processes impart a parasitic load on the power plants; it is estimated that this would increase the cost of electricity from existing pulverized coal plants by anywhere from 71 to 85 percent [1]. The National Energy Technology Laboratory (NETL) is working to lower this to below a 30 percent increase. To reach this goal, work is being done not only to accurately simulate these processes, but also to leverage those accurate and detailed simulations to design optimal carbon capture processes. The major challenges include the lack of accurate algebraic models of the processes, computationally costly simulations, and insufficiently robust simulations. The first challenge bars the use of provable derivative-based optimization algorithms; the latter two can make direct derivative-free optimization difficult or impossible. To overcome these difficulties, we take a more indirect approach: first, we generate an accurate set of algebraic surrogate models from the simulation, and then use derivative-based solvers to optimize the surrogate models. We developed a method that uses derivative-based and derivative-free optimization alongside machine learning and statistical techniques to generate the set of low-complexity surrogate models using data sampled from detailed simulations. The models are validated and improved through the use of derivative-free solvers to adaptively sample new simulation points. The resulting surrogate models can then be used in superstructure-based process synthesis and solved using derivative-based methods to optimize carbon capture processes.
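
    A minimal sketch of the surrogate-based workflow (sample a simulation, fit a low-complexity algebraic model, optimize it with a derivative-based solver, and adaptively re-sample), with a toy stand-in function in place of the detailed process simulation:

      # Sketch: surrogate-based optimization with adaptive sampling; capture_cost is a
      # hypothetical stand-in for an expensive carbon capture simulation, not the NETL models.
      import numpy as np
      from scipy.optimize import minimize

      def capture_cost(x):                        # "expensive" simulation (toy)
          solvent_rate, regen_temp = x
          return (solvent_rate - 2.0) ** 2 + 0.5 * (regen_temp - 390.0) ** 2 / 100.0 + 3.0

      def features(x):                            # quadratic basis for the algebraic surrogate
          s, t = x
          return np.array([1.0, s, t, s * s, t * t, s * t])

      X = [np.array([s, t]) for s in np.linspace(1.0, 3.0, 5) for t in np.linspace(360, 420, 5)]
      for _ in range(5):                          # adaptive sampling loop
          A = np.array([features(x) for x in X])
          y = np.array([capture_cost(x) for x in X])
          coef, *_ = np.linalg.lstsq(A, y, rcond=None)          # fit surrogate coefficients
          res = minimize(lambda x: features(x) @ coef, x0=[2.0, 390.0],
                         bounds=[(1.0, 3.0), (360.0, 420.0)])   # derivative-based step
          X.append(res.x)                         # re-sample the simulation at the surrogate optimum
      print(res.x, capture_cost(res.x))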

  1. A Study on the Optimization Performance of Fireworks and Cuckoo Search Algorithms in Laser Machining Processes

    NASA Astrophysics Data System (ADS)

    Goswami, D.; Chakraborty, S.

    2014-11-01

    Laser machining is a promising non-contact process for effective machining of difficult-to-process advanced engineering materials. Increasing interest in the use of lasers for various machining operations can be attributed to its several unique advantages, like high productivity, non-contact processing, elimination of finishing operations, adaptability to automation, reduced processing cost, improved product quality, greater material utilization, minimum heat-affected zone and green manufacturing. To achieve the best desired machining performance and high quality characteristics of the machined components, it is extremely important to determine the optimal values of the laser machining process parameters. In this paper, fireworks algorithm and cuckoo search (CS) algorithm are applied for single as well as multi-response optimization of two laser machining processes. It is observed that although almost similar solutions are obtained for both these algorithms, CS algorithm outperforms fireworks algorithm with respect to average computation time, convergence rate and performance consistency.
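
    A minimal sketch of cuckoo search with Levy flights on a single illustrative machining response; the objective function, parameter ranges and algorithm settings below are assumptions for illustration, not the models used in the study.

      # Sketch: cuckoo search via Levy flights minimizing a toy surface-roughness model
      import numpy as np
      from math import gamma, sin, pi

      rng = np.random.default_rng(1)
      lo, hi = np.array([200.0, 1.0]), np.array([500.0, 10.0])   # laser power (W), cutting speed (mm/s)

      def roughness(x):                           # hypothetical response model
          p, v = x
          return 2.0 + 0.002 * (p - 350.0) ** 2 / 100.0 + 0.05 * (v - 6.0) ** 2

      def levy_step(size, beta=1.5):              # Mantegna's algorithm for Levy flights
          sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
                   (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
          return rng.normal(0, sigma, size) / np.abs(rng.normal(0, 1, size)) ** (1 / beta)

      nests = rng.uniform(lo, hi, size=(15, 2))
      fitness = np.array([roughness(n) for n in nests])
      for _ in range(200):
          best = nests[fitness.argmin()].copy()
          for i in range(len(nests)):             # Levy-flight move relative to the best nest
              trial = np.clip(nests[i] + 0.01 * levy_step(2) * (nests[i] - best), lo, hi)
              if roughness(trial) < fitness[i]:
                  nests[i], fitness[i] = trial, roughness(trial)
          worst = fitness.argsort()[-4:]          # abandon a fraction of the worst nests
          nests[worst] = rng.uniform(lo, hi, size=(4, 2))
          fitness[worst] = [roughness(n) for n in nests[worst]]
      print(nests[fitness.argmin()], fitness.min())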

  2. Design and optimization of the SOI field effect diode (FED) for ESD protection

    NASA Astrophysics Data System (ADS)

    Yang, Yang; Salman, Akram A.; Ioannou, Dimitris E.; Beebe, Stephen G.

    2008-10-01

    A thorough investigation is carried out by numerical simulations of the field effect diode (FED) with the aim to explore its potential for ESD protection applications in silicon on insulator (SOI) technologies. It is shown that the carrier lifetime value has an important impact on the device operation. By careful sizing and doping, FED devices with reasonable breakdown voltage values can be achieved but at rather high gate voltage values. Better results are achieved by modifying the doping profile to resemble a PNPN structure with two gates.

  3. Standard for fire protection of DOE electronic computer/data processing systems

    SciTech Connect

    Not Available

    1984-01-01

    The standard applies to all essential electronic computer data processing equipment as well as the storage facilities, associated utilities, and air conditioning systems. The types of construction necessary for a computer building and computer rooms are described, including location and perimeter separation and certain operating requirements. Fire protection systems that may be employed for computer equipment protection are described. Also discussed are the necessary utilities required in a computer area and the emergency controls required for shutdown of these utilities. The types of storage and records and the protection required for each type are discussed. Specific requirements are listed unique to mobile equipment. Emergency operations are described, including programs for firefighting and for the restoration of damaged records. (LEW)

  4. Standard for fire protection of DOE electronic computer/data processing systems

    NASA Astrophysics Data System (ADS)

    1984-01-01

    The standard, which applies to all essential electronic computer data processing equipment and the storage facilities, associated utilities, and air conditioning systems, is outlined. The types of construction necessary for a computer building and computer rooms are described, including location and perimeter separation and certain operating requirements. Fire protection systems that may be employed for computer equipment protection are described. The necessary utilities required in a computer area, the emergency controls required for shutdown of these utilities, and the types of storage and records and the protection required for each type are discussed. Specific requirements unique to mobile equipment are listed. Emergency operations programs for firefighting and for the restoration of damaged records are described.

  5. Application of the Environmental Protection Agency's data quality objective process to environmental monitoring quality control

    SciTech Connect

    Garcia, L.M.

    1995-11-01

    The United States Environmental Protection Agency's (EPA) Data Quality Objectives (DQO) process was applied to two environmental monitoring networks for the purpose of optimizing field quality control sampling to give the highest quality monitoring data with minimal impact on resources. The DQO process, developed primarily to aid in cleanup and restoration activities, is a systematic approach to designing sampling and analysis programs with improved efficiency, cost savings, and measurable and traceable data quality. The two monitoring networks studied had not been subjected to the systematic review and analysis of the DQO process defined by the EPA, and had relied upon field duplicates or replicates as the main source of field quality control data. Sometimes both the duplicate and the routine sample were analyzed by the same analytical laboratory; at other times they were analyzed by different laboratories. This study identified some potential inconsistencies between analytical data and reporting limits from two different laboratories. Application of the EPA DQO process resulted in recommendations for changes in the field quality control sampling program, allowed new insight into the monitoring data, and raised several issues that should be the subject of further investigation.

  6. Improved Sugar Production by Optimizing Planetary Mill Pretreatment and Enzyme Hydrolysis Process

    PubMed Central

    Kwon, Jeong Heo; Lee, Siseon; Lee, Jae-Won; Hong, Youn-Woo; Chang, Jeong Ho; Sung, Daekyung; Kim, Sung Hyun; Sang, Byoung-In; Mitchell, Robert J.; Lee, Jin Hyung

    2015-01-01

    This paper describes an optimization of planetary mill pretreatment and saccharification processes for improving biosugar production. Pitch pine (Pinus rigida) wood sawdust waste was used as the biomass feedstock, and the process parameters optimized in this study were the buffering medium, the milling time, the enzyme quantity, and the incubation time. Glucose yields were improved when acetate buffer was used rather than citrate buffer. In initial tests of each process variable, the optimal values were 100 minutes of milling, an enzyme concentration of 16 FPU/g-biomass, and a 12-hour enzymatic hydrolysis. Interactions between these experimental conditions and their effects on glucose production were then investigated using RSM. Glucose yields from the Pinus rigida waste exceeded 80% under several of the conditions tested, demonstrating that milling can be used to obtain high levels of glucose bioconversion from woody biomass for biorefinery purposes. PMID:26539475

  7. Improved Sugar Production by Optimizing Planetary Mill Pretreatment and Enzyme Hydrolysis Process.

    PubMed

    Kwon, Jeong Heo; Lee, Siseon; Lee, Jae-Won; Hong, Youn-Woo; Chang, Jeong Ho; Sung, Daekyung; Kim, Sung Hyun; Sang, Byoung-In; Mitchell, Robert J; Lee, Jin Hyung

    2015-01-01

    This paper describes an optimization of planetary mill pretreatment and saccharification processes for improving biosugar production. Pitch pine (Pinus rigida) wood sawdust waste was used as the biomass feedstock, and the process parameters optimized in this study were the buffering medium, the milling time, the enzyme quantity, and the incubation time. Glucose yields were improved when acetate buffer was used rather than citrate buffer. In initial tests of each process variable, the optimal values were 100 minutes of milling, an enzyme concentration of 16 FPU/g-biomass, and a 12-hour enzymatic hydrolysis. Interactions between these experimental conditions and their effects on glucose production were then investigated using RSM. Glucose yields from the Pinus rigida waste exceeded 80% under several of the conditions tested, demonstrating that milling can be used to obtain high levels of glucose bioconversion from woody biomass for biorefinery purposes. PMID:26539475

  8. Optimization of process parameters for production of volatile fatty acid, biohydrogen and methane from anaerobic digestion.

    PubMed

    Khan, M A; Ngo, H H; Guo, W S; Liu, Y; Nghiem, L D; Hai, F I; Deng, L J; Wang, J; Wu, Y

    2016-11-01

    The anaerobic digestion process has been primarily utilized for producing methane-containing biogas over the past few years. However, the digestion process can also be optimized for producing volatile fatty acids (VFAs) and biohydrogen. This is the first review article that combines the optimization approaches for all three possible products of anaerobic digestion. In this review, the types and configurations of bioreactor are discussed for each type of product. This is followed by a review of the optimization of common process parameters (e.g. temperature, pH, retention time and organic loading rate) separately for the production of VFA, biohydrogen and methane. The review also covers additional parameters, treatment methods and special additives that have a significant positive effect on the production rate and yield of these products.

  9. Optimization of process parameters for production of volatile fatty acid, biohydrogen and methane from anaerobic digestion.

    PubMed

    Khan, M A; Ngo, H H; Guo, W S; Liu, Y; Nghiem, L D; Hai, F I; Deng, L J; Wang, J; Wu, Y

    2016-11-01

    The anaerobic digestion process has been primarily utilized for producing methane-containing biogas over the past few years. However, the digestion process can also be optimized for producing volatile fatty acids (VFAs) and biohydrogen. This is the first review article that combines the optimization approaches for all three possible products of anaerobic digestion. In this review, the types and configurations of bioreactor are discussed for each type of product. This is followed by a review of the optimization of common process parameters (e.g. temperature, pH, retention time and organic loading rate) separately for the production of VFA, biohydrogen and methane. The review also covers additional parameters, treatment methods and special additives that have a significant positive effect on the production rate and yield of these products. PMID:27570139

  10. Multiresponse Optimization of Process Parameters in Turning of GFRP Using TOPSIS Method

    PubMed Central

    Parida, Arun Kumar; Routara, Bharat Chandra

    2014-01-01

    Taguchi's design of experiment is utilized to optimize the process parameters in turning operation with dry environment. Three parameters, cutting speed (v), feed (f), and depth of cut (d), with three different levels are taken for the responses like material removal rate (MRR) and surface roughness (Ra). The machining is conducted with Taguchi L9 orthogonal array, and based on the S/N analysis, the optimal process parameters for surface roughness and MRR are calculated separately. Considering the larger-the-better approach, optimal process parameters for material removal rate are cutting speed at level 3, feed at level 2, and depth of cut at level 3, that is, v3-f2-d3. Similarly for surface roughness, considering smaller-the-better approach, the optimal process parameters are cutting speed at level 1, feed at level 1, and depth of cut at level 3, that is, v1-f1-d3. Results of the main effects plot indicate that depth of cut is the most influencing parameter for MRR but cutting speed is the most influencing parameter for surface roughness and feed is found to be the least influencing parameter for both the responses. The confirmation test is conducted for both MRR and surface roughness separately. Finally, an attempt has been made to optimize the multiresponses using technique for order preference by similarity to ideal solution (TOPSIS) with Taguchi approach. PMID:27437503
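
    A minimal sketch of the TOPSIS ranking step on an illustrative decision matrix, with MRR treated as a benefit criterion and Ra as a cost criterion; the run data and weights are hypothetical, not the experimental results of the study.

      # Sketch: TOPSIS ranking of turning runs with two responses, MRR (benefit) and Ra (cost)
      import numpy as np

      runs = np.array([[120.0, 2.8],    # each row: [MRR, Ra] for one experimental run
                       [150.0, 3.5],
                       [180.0, 4.1],
                       [140.0, 2.2]])
      weights = np.array([0.5, 0.5])
      benefit = np.array([True, False])           # MRR larger-the-better, Ra smaller-the-better

      norm = runs / np.sqrt((runs ** 2).sum(axis=0))       # vector normalization
      weighted = norm * weights
      ideal = np.where(benefit, weighted.max(axis=0), weighted.min(axis=0))
      anti = np.where(benefit, weighted.min(axis=0), weighted.max(axis=0))
      d_pos = np.linalg.norm(weighted - ideal, axis=1)
      d_neg = np.linalg.norm(weighted - anti, axis=1)
      closeness = d_neg / (d_pos + d_neg)          # higher = closer to the ideal solution
      print(closeness.argsort()[::-1] + 1)         # run numbers ranked best to worst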

  11. Critical Infrastructure Protection II, The International Federation for Information Processing, Volume 290.

    NASA Astrophysics Data System (ADS)

    Papa, Mauricio; Shenoi, Sujeet

    The information infrastructure -- comprising computers, embedded devices, networks and software systems -- is vital to day-to-day operations in every sector: information and telecommunications, banking and finance, energy, chemicals and hazardous materials, agriculture, food, water, public health, emergency services, transportation, postal and shipping, government and defense. Global business and industry, governments, indeed society itself, cannot function effectively if major components of the critical information infrastructure are degraded, disabled or destroyed. Critical Infrastructure Protection II describes original research results and innovative applications in the interdisciplinary field of critical infrastructure protection. Also, it highlights the importance of weaving science, technology and policy in crafting sophisticated, yet practical, solutions that will help secure information, computer and network assets in the various critical infrastructure sectors. Areas of coverage include: - Themes and Issues - Infrastructure Security - Control Systems Security - Security Strategies - Infrastructure Interdependencies - Infrastructure Modeling and Simulation This book is the second volume in the annual series produced by the International Federation for Information Processing (IFIP) Working Group 11.10 on Critical Infrastructure Protection, an international community of scientists, engineers, practitioners and policy makers dedicated to advancing research, development and implementation efforts focused on infrastructure protection. The book contains a selection of twenty edited papers from the Second Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection held at George Mason University, Arlington, Virginia, USA in the spring of 2008.

  12. Applying ILT mask synthesis for co-optimizing design rules and DSA process characteristics

    NASA Astrophysics Data System (ADS)

    Dam, Thuc; Stanton, William

    2014-03-01

    During early stage development of a DSA process, there are many unknown interactions between design, DSA process, RET, and mask synthesis. The computational resolution of these unknowns can guide development towards a common process space whereby manufacturing success can be evaluated. This paper will demonstrate the use of existing Inverse Lithography Technology (ILT) to co-optimize the multitude of parameters. ILT mask synthesis will be applied to a varied hole design space in combination with a range of DSA model parameters under different illumination and RET conditions. The design will range from 40 nm pitch doublet to random DSA designs with larger pitches, while various effective DSA characteristics of shrink bias and corner smoothing will be assumed for the DSA model during optimization. The co-optimization of these design parameters and process characteristics under different SMO solutions and RET conditions (dark/bright field tones and binary/PSM mask types) will also help to provide a complete process mapping of possible manufacturing options. The lithographic performances for masks within the optimized parameter space will be generated to show a common process space with the highest possibility for success.

  13. Optimizing Friction Stir Welding via Statistical Design of Tool Geometry and Process Parameters

    NASA Astrophysics Data System (ADS)

    Blignault, C.; Hattingh, D. G.; James, M. N.

    2012-06-01

    This article considers optimization procedures for friction stir welding (FSW) in 5083-H321 aluminum alloy, via control of weld process parameters and tool design modifications. It demonstrates the potential utility of the "force footprint" (FF) diagram in providing a real-time graphical user interface (GUI) for process optimization of FSW. Multiple force, torque, and temperature responses were recorded during FS welding using 24 different tool pin geometries, and these data were statistically analyzed to determine the relative influence of a number of combinations of important process and tool geometry parameters on tensile strength. Desirability profile charts are presented, which show the influence of seven key combinations of weld process variables on tensile strength. The model developed in this study allows the weld tensile strength to be predicted for other combinations of tool geometry and process parameters to within an average error of 13%. General guidelines for tool profile selection and the likelihood of influencing weld tensile strength are also provided.

  14. SEMICONDUCTOR DEVICES Process optimization of a deep trench isolation structure for high voltage SOI devices

    NASA Astrophysics Data System (ADS)

    Kuiying, Zhu; Qinsong, Qian; Jing, Zhu; Weifeng, Sun

    2010-12-01

    The process reasons for weak point formation of the deep trench on SOI wafers have been analyzed in detail, and an optimized trench process is proposed. It is found that there are two main reasons: one is lateral over-etching of the silicon at the surface of the buried oxide caused by a fringe effect; the other is the slow growth rate of the isolation oxide in the concave silicon corner of the trench bottom. In order to improve the isolation performance of the deep trench, two feasible ways of optimizing the trench process are proposed. The improved process thickens the isolation oxide and rounds sharp silicon corners at their weak points, increasing the applied voltage by 15-20 V at the same leakage current. The proposed new trench isolation process has been verified in the foundry's 0.5-μm HV SOI technology.

  15. Optimization of the high-shear wet granulation wetting process using fuzzy logic modeling.

    PubMed

    Belohlav, Zdenek; Brenkova, Lucie; Kalcikova, Jana; Hanika, Jiri; Durdil, Petr; Tomasek, Vaclav; Palatova, Marta

    2007-01-01

    A fuzzy model has been developed for the optimization of high-shear wet granulation wetting on a plant scale depending on the characteristics of pharmaceutical active substance particles. The model optimized on the basis of experimental data involves a set of rules obtained from expert knowledge and full-scale process data. The skewness coefficient of particle size distribution and the tapped density of the granulated mixture were chosen as the model input variables. The output of the fuzzy ruled system is the optimal quantity of wetting liquid. In comparison to manufacturing practice, a very strong sensitivity of the optimal quantity of the added wetting liquid to the size and shape of the active substance particles has been identified by fuzzy modeling. PMID:17763139
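
    A minimal sketch of how such a fuzzy rule system can map the two input variables to a wetting-liquid quantity; the membership ranges, rule base and output levels are invented for illustration and are not the plant-scale model.

      # Sketch: Mamdani-style fuzzy rules with triangular memberships and
      # weighted-average defuzzification (all numbers assumed)
      def tri(x, a, b, c):                        # triangular membership function
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      def wetting_liquid(skewness, tapped_density):
          skew_low, skew_high = tri(skewness, -1.0, 0.0, 1.0), tri(skewness, 0.5, 2.0, 3.5)
          dens_low, dens_high = tri(tapped_density, 0.2, 0.4, 0.6), tri(tapped_density, 0.5, 0.7, 0.9)
          # rule base: firing strength (min of antecedents) -> crisp output level (kg/batch, assumed)
          rules = [(min(skew_low, dens_low), 80.0),
                   (min(skew_low, dens_high), 60.0),
                   (min(skew_high, dens_low), 110.0),
                   (min(skew_high, dens_high), 90.0)]
          num = sum(w * out for w, out in rules)
          den = sum(w for w, _ in rules)
          return num / den if den > 0 else 0.0

      print(wetting_liquid(skewness=1.2, tapped_density=0.45))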

  16. A case study of optimization in the decision process: Siting groundwater monitoring wells

    SciTech Connect

    Cardwell, H.; Huff, D.; Douthitt, J.; Sale, M.

    1993-12-01

    Optimization is one of the tools available to assist decision makers in balancing multiple objectives and concerns. In a case study of the siting decision for groundwater monitoring wells, we look at the influence of the optimization models on the decisions made by the responsible groundwater specialist. This paper presents a multi-objective integer programming model for determining the location of monitoring wells associated with a groundwater pump-and-treat remediation. After presenting the initial optimization results, we analyze the actual decision and revise the model to incorporate elements of the problem that were later identified as important in the decision-making process. The results of a revised model are compared to the actual siting plans, the recommendations from the initial optimization runs, and the initial monitoring network proposed by the decision maker.
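
    A minimal sketch of a small siting problem solved by enumeration, with hypothetical candidate wells, coverage sets and installation costs; the study's actual multi-objective integer programming model is not reproduced here.

      # Sketch: choose the fewest (then cheapest) wells that cover every plume zone
      from itertools import combinations

      candidates = {                              # well -> (zones it monitors, install cost)
          "W1": ({"A", "B"}, 12.0),
          "W2": ({"B", "C"}, 10.0),
          "W3": ({"C", "D"}, 11.0),
          "W4": ({"A", "D"}, 14.0),
          "W5": ({"B", "D"}, 9.0),
      }
      zones = {"A", "B", "C", "D"}                # zones that must be monitored

      best = None
      for k in range(1, len(candidates) + 1):
          for combo in combinations(candidates, k):
              covered = set().union(*(candidates[w][0] for w in combo))
              if covered != zones:
                  continue                        # constraint: full coverage of the plume
              cost = sum(candidates[w][1] for w in combo)
              if best is None or (k, cost) < (best[0], best[1]):
                  best = (k, cost, combo)         # lexicographic: fewest wells, then lowest cost
          if best is not None and best[0] == k:
              break                               # adding more wells cannot improve the count
      print(best)                                 # e.g. (2, 23.0, ('W1', 'W3'))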

  17. Optimization of enzymatic process for vanillin extraction using response surface methodology.

    PubMed

    Gu, Fenglin; Xu, Fei; Tan, Lehe; Wu, Huasong; Chu, Zhong; Wang, Qinghuang

    2012-01-01

    Vanillin was extracted from vanilla beans using cellulase pretreatment for enzymatic hydrolysis, and response surface methodology (RSM) was applied to optimize the processing parameters of this extraction. The effects of heating time, enzyme quantity and temperature on the enzymatic extraction of vanillin were evaluated, with extraction yield (mg/g) as the response value. The results revealed that increases in heating time and in enzyme quantity (within certain ranges) were associated with enhanced extraction yield, and that the optimal conditions for vanillin extraction were a heating time of 6 h, a temperature of 60 °C and an enzyme quantity of 33.5 mL. Calculated from the final polynomial functions, the optimal vanillin extraction yield was 7.62 mg/g. The predicted results for the optimal reaction conditions were in good agreement with experimental values.

  18. Multi-response optimization of CO2 laser-welding process of austenitic stainless steel

    NASA Astrophysics Data System (ADS)

    Benyounis, K. Y.; Olabi, A. G.; Hashmi, M. S. J.

    2008-02-01

    Recently, laser welding of austenitic stainless steel has received great attention in industry. This is due to its widespread application in petroleum refinement stations, power plants, the pharmaceutical industry and also in households. Therefore, mechanical properties should be controlled to obtain good welded joints. The welding process should be optimized by the proper mathematical models. In this research, the tensile strength and impact strength along with the joint-operating cost of laser-welded butt joints made of AISI304 was investigated. Design-expert software was used to establish the design matrix and to analyze the experimental data. The relationships between the laser-welding parameters (laser power, welding speed and focal point position) and the three responses (tensile strength, impact strength and joint-operating cost) were established. Also, the optimization capabilities in design-expert software were used to optimize the welding process. The developed mathematical models were tested for adequacy using analysis of variance and other adequacy measures. In this investigation, the optimal welding conditions were identified in order to increase the productivity and minimize the total operating cost. Overlay graphs were plotted by superimposing the contours for the various response surfaces. The process parameters effect was determined and the optimal welding combinations were tabulated.

  19. IEC 61511 and the capital project process--a protective management system approach.

    PubMed

    Summers, Angela E

    2006-03-17

    This year, the process industry has reached an important milestone in process safety: the acceptance of an internationally recognized standard for safety instrumented systems (SIS). This standard, IEC 61511, documents good engineering practice for the assessment, design, operation, maintenance, and management of SISs. The foundation of the standard is established by several requirements in Part 1, Clauses 5-7, which cover the development of a management system aimed at ensuring that functional safety is achieved. The management system includes a quality assurance process for the entire SIS lifecycle, requiring the development of procedures, identification of resources and acquisition of tools. For maximum benefit, the deliverables and quality control checks required by the standard should be integrated into the capital project process, addressing safety, environmental, plant productivity, and asset protection. Industry has become inundated with a multitude of programs focusing on safety, quality, and cost performance. This paper introduces a protective management system, which builds upon the work process identified in IEC 61511. Typical capital project phases are integrated with the management system to yield one comprehensive program to efficiently manage process risk. Finally, the paper highlights areas where internal practices or guidelines should be developed to improve program performance and cost effectiveness.

  20. Data-based robust multiobjective optimization of interconnected processes: energy efficiency case study in papermaking.

    PubMed

    Afshar, Puya; Brown, Martin; Maciejowski, Jan; Wang, Hong

    2011-12-01

    Reducing energy consumption is a major challenge for "energy-intensive" industries such as papermaking. A commercially viable energy saving solution is to employ data-based optimization techniques to obtain a set of "optimized" operational settings that satisfy certain performance indices. The difficulties of this are: 1) the problems of this type are inherently multicriteria in the sense that improving one performance index might result in compromising the other important measures; 2) practical systems often exhibit unknown complex dynamics and several interconnections which make the modeling task difficult; and 3) as the models are acquired from the existing historical data, they are valid only locally and extrapolations incorporate risk of increasing process variability. To overcome these difficulties, this paper presents a new decision support system for robust multiobjective optimization of interconnected processes. The plant is first divided into serially connected units to model the process, product quality, energy consumption, and corresponding uncertainty measures. Then multiobjective gradient descent algorithm is used to solve the problem in line with user's preference information. Finally, the optimization results are visualized for analysis and decision making. In practice, if further iterations of the optimization algorithm are considered, validity of the local models must be checked prior to proceeding to further iterations. The method is implemented by a MATLAB-based interactive tool DataExplorer supporting a range of data analysis, modeling, and multiobjective optimization techniques. The proposed approach was tested in two U.K.-based commercial paper mills where the aim was reducing steam consumption and increasing productivity while maintaining the product quality by optimization of vacuum pressures in forming and press sections. The experimental results demonstrate the effectiveness of the method.

  1. Optimization of photo-Fenton process of RO concentrated coking wastewater using response surface methodology.

    PubMed

    Huiqing, Zhang; Chunsong, Ye; Xian, Zhang; Fan, Yang; Jun, Yang; Wei, Zhou

    2012-01-01

    The objective of this study was to investigate the removal of chemical oxygen demand (COD) from reverse osmosis (RO) concentrated coking wastewater by the photo-Fenton process. Starting from single-factor tests, the optimum conditions for the photo-Fenton process were determined using a Box-Behnken design (BBD) and response surface methodology (RSM) to establish a predictive quadratic polynomial model. The optimized parameters, validated by analysis of variance (ANOVA), were an H2O2 concentration of 345.2 mg/L, a pH value of 4.1 and a reaction time of 103.5 minutes under ultraviolet irradiation. The experimental COD removal under the optimized conditions agreed well with the predicted value, with a deviation of 3.2%. The results confirmed that RSM based on BBD is a suitable method for optimizing the operating conditions of RO concentrated coking wastewater treatment.
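
    A minimal sketch of fitting a quadratic response-surface model by least squares and evaluating it at the optimum reported above; the design points and responses are hypothetical, not the study's Box-Behnken data.

      # Sketch: quadratic response surface for COD removal (%) in H2O2 dose (mg/L), pH, time (min)
      import numpy as np

      def quad_terms(x):
          h, p, t = x
          return np.array([1, h, p, t, h * p, h * t, p * t, h * h, p * p, t * t])

      runs = np.array([[250, 3.0,  80], [350, 3.0, 120], [250, 5.0, 120],
                       [350, 5.0,  80], [300, 4.0, 100], [400, 4.0, 100],
                       [200, 4.0, 100], [300, 2.0, 100], [300, 6.0, 100],
                       [300, 4.0,  60], [300, 4.0, 140], [345, 4.1, 104]], float)
      removal = np.array([61, 68, 65, 63, 72, 66, 60, 58, 62, 59, 67, 74], float)

      A = np.vstack([quad_terms(r) for r in runs])
      coef, *_ = np.linalg.lstsq(A, removal, rcond=None)    # fitted model coefficients

      optimum = np.array([345.2, 4.1, 103.5])               # conditions reported above
      print(quad_terms(optimum) @ coef)                      # predicted COD removal at the optimum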

  2. Rational design and optimization of downstream processes of virus particles for biopharmaceutical applications: current advances.

    PubMed

    Vicente, Tiago; Mota, José P B; Peixoto, Cristina; Alves, Paula M; Carrondo, Manuel J T

    2011-01-01

    The advent of advanced therapies in the pharmaceutical industry has moved the spotlight onto virus-like particles and viral vectors produced in cell culture, which hold great promise in a myriad of clinical targets, including cancer prophylaxis and treatment. Even though a couple of cases have reached the clinic, these products have yet to overcome a number of biological and technological challenges before broad utilization. Concerning the manufacturing processes, there is significant research focusing on the optimization of current cell culture systems and, more recently, on developing scalable downstream processes to generate material for pre-clinical and clinical trials. We review the current options for downstream processing of these complex biopharmaceuticals and underline current advances in knowledge-based toolboxes proposed for rational optimization of their processing. Rational tools developed to increase the still scarce knowledge on the purification processes of complex biologicals are discussed as an alternative to the empirical, "black-box" strategies classically used for process development. Innovative methodologies based on surface plasmon resonance, dynamic light scattering, scale-down high-throughput screening and mathematical modeling for supporting ion-exchange chromatography show great potential for a more efficient and cost-effective process design, optimization and equipment prototyping.

  3. Finite Element Based Optimization of Material Parameters for Enhanced Ballistic Protection

    NASA Astrophysics Data System (ADS)

    Ramezani, Arash; Huber, Daniel; Rothe, Hendrik

    2013-06-01

    The threat imposed by terrorist attacks is a major hazard for military installations, vehicles and other items. The large amounts of firearms and projectiles that are available pose serious threats to military forces and even civilian facilities. An important task for international research and development is to avert danger to life and limb. This work will evaluate the performance of modern armor with numerical simulations. It will also provide a brief overview of ballistic tests in order to offer some basic knowledge of the subject, serving as a basis for the comparison of simulation results. The objective of this work is to develop and improve the modern armor used in the security sector. Numerical simulations should replace the expensive ballistic tests and find vulnerabilities of items and structures. By progressively changing the material parameters, the armor is to be optimized. Using a sensitivity analysis, information regarding decisive variables is yielded and vulnerabilities are easily found and eliminated afterwards. To facilitate the simulation, advanced numerical techniques have been employed in the analyses.

  4. Product and Process Improvement Using Mixture-Process Variable Designs and Robust Optimization Techniques

    SciTech Connect

    Sahni, Narinder S.; Piepel, Gregory F.; Naes, Tormod

    2009-04-01

    The quality of an industrial product depends on the raw material proportions and the process variable levels, both of which need to be taken into account in designing a product. This article presents a case study from the food industry in which both kinds of variables were studied by combining a constrained mixture experiment design and a central composite process variable design. Based on the natural structure of the situation, a split-plot experiment was designed and models involving the raw material proportions and process variable levels (separately and combined) were fitted. Combined models were used to study: (i) the robustness of the process to variations in raw material proportions, and (ii) the robustness of the raw material recipes with respect to fluctuations in the process variable levels. Further, the expected variability in the robust settings was studied using the bootstrap.

  5. Numerical simulation study on active and passive hydroforming process optimization of box shaped part

    NASA Astrophysics Data System (ADS)

    Zeng, Y. P.; Dong, J. L.; He, T. D.; Wang, B.

    2016-08-01

    Low qualification rates and inferior quality frequently occur in the general deep drawing process of a certain box-shaped part, so hydroforming was used to optimize the forming process. To study the effect of hydroforming on quality and formability, five process schemes were proposed: general deep drawing, active hydroforming, passive hydroforming, general deep drawing combined with active hydroforming, and passive combined with active hydroforming. Each process was simulated by finite element analysis and the results were compared. The results indicate that passive combined with active hydroforming is the best scheme, giving the smallest thickness thinning and satisfactory formability. The hydroforming pressure and blank holder force were then optimized by adjusting the simulation parameters. The results show that active/passive hydroforming is an effective new method for forming complex parts.

  6. Computational techniques for design optimization of thermal protection systems for the space shuttle vehicle. Volume 1: Final report

    NASA Technical Reports Server (NTRS)

    1971-01-01

    Computational techniques were developed and assimilated for the design optimization. The resulting computer program was then used to perform initial optimization and sensitivity studies on a typical thermal protection system (TPS) to demonstrate its application to the space shuttle TPS design. The program was developed in Fortran IV for the CDC 6400 but was subsequently converted to the Fortran V language to be used on the Univac 1108. The program allows for improvement and update of the performance prediction techniques. The program logic involves subroutines which handle the following basic functions: (1) a driver which calls for input, output, and communication between program and user and between the subroutines themselves; (2) thermodynamic analysis; (3) thermal stress analysis; (4) acoustic fatigue analysis; and (5) weights/cost analysis. In addition, a system total cost is predicted based on system weight and historical cost data of similar systems. Two basic types of input are provided, both of which are based on trajectory data. These are vehicle attitude (altitude, velocity, and angles of attack and sideslip), for external heat and pressure loads calculation, and heating rates and pressure loads as a function of time.

  7. Codon-optimized filovirus DNA vaccines delivered by intramuscular electroporation protect cynomolgus macaques from lethal Ebola and Marburg virus challenges

    PubMed Central

    Grant-Klein, Rebecca J; Altamura, Louis A; Badger, Catherine V; Bounds, Callie E; Van Deusen, Nicole M; Kwilas, Steven A; Vu, Hong A; Warfield, Kelly L; Hooper, Jay W; Hannaman, Drew; Dupuy, Lesley C; Schmaljohn, Connie S

    2015-01-01

    Cynomolgus macaques were vaccinated by intramuscular electroporation with DNA plasmids expressing codon-optimized glycoprotein (GP) genes of Ebola virus (EBOV) or Marburg virus (MARV) or a combination of codon-optimized GP DNA vaccines for EBOV, MARV, Sudan virus and Ravn virus. When measured by ELISA, the individual vaccines elicited slightly higher IgG responses to EBOV or MARV than did the combination vaccines. No significant differences in immune responses of macaques given the individual or combination vaccines were measured by pseudovirion neutralization or IFN-γ ELISpot assays. Both the MARV and mixed vaccines were able to protect macaques from lethal MARV challenge (5/6 vs. 6/6). In contrast, a greater proportion of macaques vaccinated with the EBOV vaccine survived lethal EBOV challenge in comparison to those that received the mixed vaccine (5/6 vs. 1/6). EBOV challenge survivors had significantly higher pre-challenge neutralizing antibody titers than those that succumbed. PMID:25996997

  8. Modeling and optimization of transmission and processing of data in an information computer network

    NASA Astrophysics Data System (ADS)

    Nekrasova, A.; Boriev, Z.; Nyrkov, A.; Sokolov, S.

    2016-04-01

    The paper presents a comparative analysis of routing algorithms that allows optimizing the transmission and processing of data in information computer networks. Special attention is paid to multipath methods of data transmission and to the number of operations necessary to perform them. In addition, the authors consider a linear programming method for solving the above-mentioned problem.
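
    A minimal sketch of single-path routing with Dijkstra's algorithm on a toy network, the building block that multipath comparisons such as the one described here extend; the topology and link costs are illustrative only.

      # Sketch: Dijkstra's shortest-path routing; link weights could represent delay or load
      import heapq

      graph = {                                   # node -> {neighbor: link cost}
          "A": {"B": 2, "C": 5},
          "B": {"A": 2, "C": 1, "D": 4},
          "C": {"A": 5, "B": 1, "D": 1},
          "D": {"B": 4, "C": 1},
      }

      def dijkstra(src, dst):
          dist, prev, heap = {src: 0}, {}, [(0, src)]
          while heap:
              d, u = heapq.heappop(heap)
              if u == dst:
                  break
              if d > dist.get(u, float("inf")):
                  continue                        # stale queue entry
              for v, w in graph[u].items():
                  if d + w < dist.get(v, float("inf")):
                      dist[v], prev[v] = d + w, u
                      heapq.heappush(heap, (d + w, v))
          path, node = [dst], dst
          while node != src:
              node = prev[node]
              path.append(node)
          return list(reversed(path)), dist[dst]

      print(dijkstra("A", "D"))                   # expected: (['A', 'B', 'C', 'D'], 4)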

  9. A strategy to optimize the thermoelectric performance in a spark plasma sintering process

    PubMed Central

    Chiu, Wan-Ting; Chen, Cheng-Lung; Chen, Yang-Yuan

    2016-01-01

    Spark plasma sintering (SPS) is currently widely applied to existing alloys as a means of further enhancing the alloys’ figure of merit. However, the determination of the optimal sintering condition is challenging in the SPS process. This report demonstrates a systematic way to independently optimize the Seebeck coefficient S and the ratio of electrical to thermal conductivity (σ/κ) and thus achieve the maximum figure of merit zT = S²(σ/κ)T. Sb2−xInxTe3 (x = 0–0.2) were chosen as examples to validate the method. Although high sintering temperature and pressure are helpful in enhancing the compactness and electrical conductivity of pressed samples, the resultant deteriorated Seebeck coefficient and increasing thermal conductivity eventually offset the benefit. We found that the optimal sintering temperature coincides with temperatures at which the maximum Seebeck coefficient begins to degrade, whereas the optimal sintering pressure coincided with the pressure at which the σ/κ ratio reaches a maximum. Based on this principle, the optimized sintering conditions were determined, and the zT of Sb1.9In0.1Te3 is raised to 0.92 at 600 K, showing an approximately 84% enhancement. This work develops a facile strategy for selecting the optimal SPS sintering condition to further enhance the zT of bulk specimens. PMID:26975209
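
    A minimal sketch of evaluating zT = S²(σ/κ)T from measured transport properties; the property values below are merely representative of this material class, not the paper's measurements.

      # Sketch: figure of merit zT = S^2 * sigma * T / kappa
      def figure_of_merit(seebeck_V_per_K, sigma_S_per_m, kappa_W_per_mK, T_K):
          return seebeck_V_per_K ** 2 * sigma_S_per_m * T_K / kappa_W_per_mK

      zT = figure_of_merit(seebeck_V_per_K=210e-6,   # 210 microvolts per kelvin (assumed)
                           sigma_S_per_m=5.0e4,      # electrical conductivity (assumed)
                           kappa_W_per_mK=1.4,       # thermal conductivity (assumed)
                           T_K=600.0)
      print(round(zT, 2))                            # ~0.9 for these representative values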

  10. Combining analysis with optimization at Langley Research Center. An evolutionary process

    NASA Technical Reports Server (NTRS)

    Rogers, J. L., Jr.

    1982-01-01

    The evolutionary process of combining analysis and optimization codes was traced with a view toward providing insight into the long term goal of developing the methodology for an integrated, multidisciplinary software system for the concurrent analysis and optimization of aerospace structures. It was traced along the lines of strength sizing, concurrent strength and flutter sizing, and general optimization to define a near-term goal for combining analysis and optimization codes. Development of a modular software system combining general-purpose, state-of-the-art, production-level analysis computer programs for structures, aerodynamics, and aeroelasticity with a state-of-the-art optimization program is required. Incorporation of a modular and flexible structural optimization software system into a state-of-the-art finite element analysis computer program will facilitate this effort. This effort resulted in a software system that is controlled with a special-purpose language, communicates with a data management system, and is easily modified to add new programs and capabilities. A 337 degree-of-freedom finite element model is used in verifying the accuracy of this system.

  11. Non-conventional approaches to food processing in CELSS, 1. Algal proteins: Characterization and process optimization

    NASA Technical Reports Server (NTRS)

    Nakhost, Z.; Karel, M.; Krukonis, V. J.

    1987-01-01

    Protein isolate obtained from green algae cultivated under controlled conditions was characterized. Molecular weight determination of fractionated algal proteins using SDS-polyacrylamide gel electrophoresis revealed a wide spectrum of molecular weights ranging from 15,000 to 220,000. Isoelectric points of dissociated proteins were in the range of 3.95 to 6.20. Amino acid composition of protein isolate compared favorably with FAO standards. The high content of the essential amino acids leucine, valine, phenylalanine and lysine makes algal protein isolate a high quality component of closed ecological life support system diets. To optimize the removal of algal lipids and pigments supercritical carbon dioxide extraction (with and without ethanol as a co-solvent) was used. Addition of ethanol to supercritical carbon dioxide resulted in more efficient removal of algal lipids and produced protein isolate with a good yield and protein recovery. The protein isolate extracted by the above mixture had an improved water solubility.

  12. Non-conventional approaches to food processing in CELSS. I - Algal proteins: Characterization and process optimization

    NASA Technical Reports Server (NTRS)

    Nakhost, Z.; Karel, M.; Krukonis, V. J.

    1987-01-01

    Protein isolate obtained from green algae (Scenedesmus obliquus) cultivated under controlled conditions was characterized. Molecular weight determination of fractionated algal proteins using SDS-polyacrylamide gel electrophoresis revealed a wide spectrum of molecular weights ranging from 15,000 to 220,000. Isoelectric points of dissociated proteins were in the range of 3.95 to 6.20. Amino acid composition of protein isolate compared favorably with FAO standards. High content of essential amino acids leucine, valine, phenylalanine and lysine makes algal protein isolate a high quality component of CELSS diets. To optimize the removal of algal lipids and pigments supercritical carbon dioxide extraction (with and without ethanol as a co-solvent) was used. Addition of ethanol to supercritical CO2 resulted in more efficient removal of algal lipids and produced protein isolate with a good yield and protein recovery. The protein isolate extracted by the above mixture had an improved water solubility.

  13. Optimization of pretreatments and process parameters for sorghum popping in microwave oven using response surface methodology.

    PubMed

    Mishra, Gayatri; Joshi, Dinesh C; Mohapatra, Debabandya

    2015-12-01

    Sorghum is a popular healthy snack food. Popped sorghum was prepared in a domestic microwave oven. A three-factor, three-level Box-Behnken design was used to optimize the pretreatment conditions. Grains were preconditioned to 12-20 % moisture content by the addition of 0-2 % salt solutions, and oil was applied (0-10 % w/w) to the preconditioned grains. Optimization of the pretreatments was based on popping yield, volume expansion ratio, and sensory score. The optimized condition was a moisture content of 16.62 % (wb), 0.55 % salt and 10 % oil, giving a popping yield of 82.228 %, a volume expansion ratio of 14.564 and an overall acceptability of 8.495. Further, the microwave process parameters were optimized using a two-factor, three-level design with microwave power density ranging from 9 to 18 W/g and residence time ranging from 100 to 180 s. For the production of superior quality popped sorghum, the optimized microwave process parameters were a power density of 18 W/g and a residence time of 140 s. PMID:26604356

  14. Adaptive optimal control of highly dissipative nonlinear spatially distributed processes with neuro-dynamic programming.

    PubMed

    Luo, Biao; Wu, Huai-Ning; Li, Han-Xiong

    2015-04-01

    Highly dissipative nonlinear partial differential equations (PDEs) are widely employed to describe the system dynamics of industrial spatially distributed processes (SDPs). In this paper, we consider the optimal control problem of the general highly dissipative SDPs, and propose an adaptive optimal control approach based on neuro-dynamic programming (NDP). Initially, Karhunen-Loève decomposition is employed to compute empirical eigenfunctions (EEFs) of the SDP based on the method of snapshots. These EEFs together with singular perturbation technique are then used to obtain a finite-dimensional slow subsystem of ordinary differential equations that accurately describes the dominant dynamics of the PDE system. Subsequently, the optimal control problem is reformulated on the basis of the slow subsystem, which is further converted to solve a Hamilton-Jacobi-Bellman (HJB) equation. HJB equation is a nonlinear PDE that has proven to be impossible to solve analytically. Thus, an adaptive optimal control method is developed via NDP that solves the HJB equation online using neural network (NN) for approximating the value function; and an online NN weight tuning law is proposed without requiring an initial stabilizing control policy. Moreover, by involving the NN estimation error, we prove that the original closed-loop PDE system with the adaptive optimal control policy is semiglobally uniformly ultimately bounded. Finally, the developed method is tested on a nonlinear diffusion-convection-reaction process and applied to a temperature cooling fin of high-speed aerospace vehicle, and the achieved results show its effectiveness.

  15. Adaptive optimal control of highly dissipative nonlinear spatially distributed processes with neuro-dynamic programming.

    PubMed

    Luo, Biao; Wu, Huai-Ning; Li, Han-Xiong

    2015-04-01

    Highly dissipative nonlinear partial differential equations (PDEs) are widely employed to describe the system dynamics of industrial spatially distributed processes (SDPs). In this paper, we consider the optimal control problem of the general highly dissipative SDPs, and propose an adaptive optimal control approach based on neuro-dynamic programming (NDP). Initially, Karhunen-Loève decomposition is employed to compute empirical eigenfunctions (EEFs) of the SDP based on the method of snapshots. These EEFs together with singular perturbation technique are then used to obtain a finite-dimensional slow subsystem of ordinary differential equations that accurately describes the dominant dynamics of the PDE system. Subsequently, the optimal control problem is reformulated on the basis of the slow subsystem, which is further converted to solve a Hamilton-Jacobi-Bellman (HJB) equation. HJB equation is a nonlinear PDE that has proven to be impossible to solve analytically. Thus, an adaptive optimal control method is developed via NDP that solves the HJB equation online using neural network (NN) for approximating the value function; and an online NN weight tuning law is proposed without requiring an initial stabilizing control policy. Moreover, by involving the NN estimation error, we prove that the original closed-loop PDE system with the adaptive optimal control policy is semiglobally uniformly ultimately bounded. Finally, the developed method is tested on a nonlinear diffusion-convection-reaction process and applied to a temperature cooling fin of high-speed aerospace vehicle, and the achieved results show its effectiveness. PMID:25794375

  16. Ablation Thermal Protection Systems: Suitability of ablation systems to thermal protection depends on complex physical and chemical processes.

    PubMed

    Ungar, E W

    1967-11-10

    The performance of ablation thermal protection systems is intimately related to the mass transfer, heat transfer, and chemical reactions which occur within the gas boundary layer. Production of a liquid layer and phase change or chemical reaction heat sinks greatly improve materials performance. Materials are available which achieve many goals for thermal protection. However, advanced materials which are now being developed provide hope of further reductions in the weight of heat-shielding structures. PMID:17732614

  17. Optimization, Production, and Characterization of a CpG-Oligonucleotide-Ficoll Conjugate Nanoparticle Adjuvant for Enhanced Immunogenicity of Anthrax Protective Antigen

    PubMed Central

    2016-01-01

    We have synthesized and characterized a novel phosphorothioate CpG oligodeoxynucleotide (CpG ODN)-Ficoll conjugated nanoparticulate adjuvant, termed DV230-Ficoll. This adjuvant was constructed from an amine-functionalized-Ficoll, a heterobifunctional linker (succinimidyl-[(N-maleimidopropionamido)-hexaethylene glycol] ester) and the CpG-ODN DV230. Herein, we describe the evaluation of the purity and reactivity of linkers of different lengths for CpG-ODN-Ficoll conjugation, optimization of linker coupling, and conjugation of thiol-functionalized CpG to maleimide-functionalized Ficoll and process scale-up. Physicochemical characterization of independently produced lots of DV230-Ficoll reveals a bioconjugate with a particle size of approximately 50 nm and covalent attachment of more than 100 molecules of CpG per Ficoll. Solutions of purified DV230-Ficoll were stable for at least 12 months at frozen and refrigerated temperatures, and stability was further enhanced in lyophilized form. Compared to nonconjugated monomeric DV230, the DV230-Ficoll conjugate demonstrated improved in vitro potency for induction of IFN-α from human peripheral blood mononuclear cells and induced higher titer neutralizing antibody responses against coadministered anthrax recombinant protective antigen in mice. The processes described here establish a reproducible and robust process for the synthesis of a novel, size-controlled, and stable CpG-ODN nanoparticle adjuvant suitable for manufacture and use in vaccines. PMID:27074387

  18. Optimization, Production, and Characterization of a CpG-Oligonucleotide-Ficoll Conjugate Nanoparticle Adjuvant for Enhanced Immunogenicity of Anthrax Protective Antigen.

    PubMed

    Milley, Bob; Kiwan, Radwan; Ott, Gary S; Calacsan, Carlo; Kachura, Melissa; Campbell, John D; Kanzler, Holger; Coffman, Robert L

    2016-05-18

    We have synthesized and characterized a novel phosphorothioate CpG oligodeoxynucleotide (CpG ODN)-Ficoll conjugated nanoparticulate adjuvant, termed DV230-Ficoll. This adjuvant was constructed from an amine-functionalized-Ficoll, a heterobifunctional linker (succinimidyl-[(N-maleimidopropionamido)-hexaethylene glycol] ester) and the CpG-ODN DV230. Herein, we describe the evaluation of the purity and reactivity of linkers of different lengths for CpG-ODN-Ficoll conjugation, optimization of linker coupling, and conjugation of thiol-functionalized CpG to maleimide-functionalized Ficoll and process scale-up. Physicochemical characterization of independently produced lots of DV230-Ficoll reveals a bioconjugate with a particle size of approximately 50 nm and covalent attachment of more than 100 molecules of CpG per Ficoll. Solutions of purified DV230-Ficoll were stable for at least 12 months at frozen and refrigerated temperatures, and stability was further enhanced in lyophilized form. Compared to nonconjugated monomeric DV230, the DV230-Ficoll conjugate demonstrated improved in vitro potency for induction of IFN-α from human peripheral blood mononuclear cells and induced higher titer neutralizing antibody responses against coadministered anthrax recombinant protective antigen in mice. The processes described here establish a reproducible and robust process for the synthesis of a novel, size-controlled, and stable CpG-ODN nanoparticle adjuvant suitable for manufacture and use in vaccines. PMID:27074387

  19. A multi-objective optimization framework to model 3D river and landscape evolution processes

    NASA Astrophysics Data System (ADS)

    Bizzi, Simone; Castelletti, Andrea; Cominola, Andrea; Mason, Emanuele; Paik, Kyungrock

    2013-04-01

    Water and sediment interactions shape hillslopes, regulate soil erosion and sedimentation, and organize river networks. Landscape evolution and river organization occur at various spatial and temporal scales, and understanding and modelling them is highly complex. The idea of a least action principle governing river network evolution has been proposed many times as a simpler approach among those existing in the literature. These theories assume that river networks, as observed in nature, self-organize and act on soil transportation in order to satisfy a particular "optimality" criterion. Accordingly, river and landscape weathering can be simulated by solving an optimization problem, where the choice of the criterion to be optimized becomes the initial assumption. The comparison between natural river networks and optimized ones verifies the correctness of this initial assumption. Yet various criteria have been proposed in the literature, and there is no consensus on which is better able to explain river network features observed in nature, such as network branching and river bed profile: each one is able to reproduce some river features through simplified modelling of the natural processes, but fails to characterize the whole complexity (3D structure and dynamics) of those processes. Some of the criteria formulated in the literature partly conflict: the reason is that their formulations rely on mathematical and theoretical simplifications of the natural system that are suitable for specific spatial and temporal scales but fail to represent the whole set of processes characterizing landscape evolution. In an attempt to address some of these scientific questions, we tested the suitability of a multi-objective optimization framework to describe river and landscape evolution in a 3D spatial domain. A synthetic landscape is used for this purpose. Multiple, alternative river network evolutions, corresponding to as many tradeoffs between the different and partly
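
    Whatever criteria are chosen, a multi-objective framework of this kind ultimately filters candidate network configurations down to a non-dominated (Pareto-optimal) set. The sketch below shows only that filtering step; the two objective values are random placeholders, not the criteria or landscape model used in the study.

```python
import numpy as np

def pareto_front(objectives: np.ndarray) -> np.ndarray:
    """Return a boolean mask of non-dominated rows (all objectives minimized)."""
    n = objectives.shape[0]
    nondominated = np.ones(n, dtype=bool)
    for i in range(n):
        if not nondominated[i]:
            continue
        # point j dominates i if it is <= in every objective and < in at least one
        dominates_i = np.all(objectives <= objectives[i], axis=1) & \
                      np.any(objectives < objectives[i], axis=1)
        if np.any(dominates_i):
            nondominated[i] = False
    return nondominated

# Illustrative only: random "candidate river networks" scored on two placeholder
# criteria, e.g. total energy dissipation vs. total sediment transport cost.
rng = np.random.default_rng(1)
scores = rng.random((200, 2))
mask = pareto_front(scores)
print(f"{mask.sum()} of {len(scores)} candidate configurations are Pareto-optimal")
```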

  20. Convexity of Ruin Probability and Optimal Dividend Strategies for a General Lévy Process

    PubMed Central

    Yin, Chuancun; Yuen, Kam Chuen; Shen, Ying

    2015-01-01

    We consider the optimal dividends problem for a company whose cash reserves follow a general Lévy process with certain positive jumps and arbitrary negative jumps. The objective is to find a policy which maximizes the expected discounted dividends until the time of ruin. Under appropriate conditions, we use some recent results in the theory of potential analysis of subordinators to obtain the convexity properties of probability of ruin. We present conditions under which the optimal dividend strategy, among all admissible ones, takes the form of a barrier strategy. PMID:26351655
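
    The barrier strategy referred to above can be illustrated with a crude Monte Carlo sketch. The surplus process used here is a simple drift-plus-compound-Poisson example, one special case of the general Lévy setting of the paper, and all parameter values, the time discretization, and the barrier grid are illustrative assumptions.

```python
import numpy as np

def discounted_dividends(barrier, x0=5.0, premium=1.5, lam=1.0, claim_mean=1.0,
                         delta=0.05, horizon=60.0, dt=0.02, n_paths=400, seed=0):
    """Monte Carlo estimate of expected discounted dividends until ruin under a
    barrier strategy: any surplus above `barrier` is paid out immediately."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_paths):
        x, t, paid = x0, 0.0, 0.0
        while t < horizon and x >= 0.0:
            # downward jumps arrive as a Poisson process with exponential sizes
            if rng.random() < lam * dt:
                x -= rng.exponential(claim_mean)
            x += premium * dt
            if x > barrier:                      # pay out the overflow as dividends
                paid += np.exp(-delta * t) * (x - barrier)
                x = barrier
            t += dt
        total += paid
    return total / n_paths

# Crude search over an illustrative grid of barrier levels
for b in (2.0, 4.0, 6.0, 8.0):
    print(f"barrier {b:>4.1f}: expected discounted dividends ~ "
          f"{discounted_dividends(b):.3f}")
```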

  1. Development of Protective Coatings for Co-Sequestration Processes and Pipelines

    SciTech Connect

    Bierwagen, Gordon; Huang, Yaping

    2011-11-30

    The program, entitled Development of Protective Coatings for Co-Sequestration Processes and Pipelines, examined the sensitivity of existing coating systems to supercritical carbon dioxide (SCCO2) exposure and developed a new coating system to protect pipelines from corrosion under SCCO2 exposure. A literature review was also conducted on pipeline corrosion sensors for monitoring pipes used in handling co-sequestration fluids. The research was intended to ensure the safety and reliability of pipelines transporting SCCO2 from the power plant to the sequestration site to mitigate the greenhouse gas effect. Results showed that one commercial coating and one designed formulation can both serve as potential candidates for internal pipeline coatings for transporting SCCO2.

  2. Optimization of a sample processing protocol for recovery of Bacillus anthracis spores from soil

    USGS Publications Warehouse

    Silvestri, Erin E.; Feldhake, David; Griffin, Dale; Lisle, John T.; Nichols, Tonya L.; Shah, Sanjiv; Pemberton, A; Schaefer III, Frank W

    2016-01-01

    Following a release of Bacillus anthracis spores into the environment, there is a potential for lasting environmental contamination in soils. There is a need for detection protocols for B. anthracis in environmental matrices. However, identification of B. anthracis within a soil is a difficult task. Processing soil samples helps to remove debris, chemical components, and biological impurities that can interfere with microbiological detection. This study aimed to optimize a previously used indirect processing protocol, which included a series of washing and centrifugation steps. Optimization of the protocol included identifying an ideal extraction diluent and varying the number of wash steps, the initial centrifugation speed, and the sonication and shaking mechanisms. The optimized protocol was demonstrated at two laboratories in order to evaluate the recovery of spores from loamy and sandy soils. The new protocol demonstrated an improved limit of detection for loamy and sandy soils over the non-optimized protocol, with an approximate matrix limit of detection of 14 spores/g of soil. There were no significant differences overall between the two laboratories for either soil type, suggesting that the processing protocol will be robust enough to use at multiple laboratories while achieving comparable recoveries.

  3. Meltlets® of Soy Isoflavones: Process Optimization and the Effect of Extrusion Spheronization Process Parameters on Antioxidant Activity

    PubMed Central

    Deshmukh, Ketkee; Amin, Purnima

    2013-01-01

    In the current research work an attempt was made to develop “Melt in mouth pellets” (Meltlets®) containing 40% herbal extract of soy isoflavones, intended to provide antioxidant activity in menopausal women. The extrusion-spheronization process was optimized for extruder speed, extruder screen size, spheronization speed, and time. In doing so, the herbal extract incorporated in the pellet matrix was subjected to various processing conditions, such as the presence of other excipients, mixing or kneading to prepare the wet mass, and the heat generated during extrusion, spheronization, and drying. The work therefore further investigates the effect of these processing parameters on the antioxidant activity of the soy isoflavone herbal extract incorporated in the formula. The antioxidant activity of the soya bean herbal extract, the Meltlets®, and the placebo pellets was evaluated using the DPPH free radical scavenging assay and total reduction capacity. PMID:24302800

  4. Optimization of processing temperature in the nitridation process for the synthesis of iron nitride nanoparticles

    SciTech Connect

    Rohith Vinod, K.; Sakar, M.; Balakumar, S.; Saravanan, P.

    2015-06-24

    We have demonstrated an effective nitridation strategy to synthesize ε-Fe3N nanoparticles (NPs) using zero-valent iron NPs (ZVI NPs) as the starting material. The transformation of iron into the iron nitride phase was systematically studied by performing the nitridation process at different processing temperatures. The phase and crystal structure were analyzed by XRD. The morphology and size of the ZVI NPs and ε-Fe3N NPs were analyzed by field emission scanning electron microscopy. Further, their room-temperature magnetic properties were studied using a vibrating sample magnetometer, which revealed that the magnetic properties of ε-Fe3N are associated with the Fe:N ratio in the iron nitride system.

  5. Optimal nonlinear information processing capacity in delay-based reservoir computers

    PubMed Central

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2015-01-01

    Reservoir computing is a recently introduced brain-inspired machine learning paradigm capable of excellent performance in the processing of empirical data. We focus on a particular kind of time-delay-based reservoir computer that has been physically implemented using optical and electronic systems and has shown unprecedented data processing rates. Reservoir computing is well known for the ease of the associated training scheme, but also for the problematic sensitivity of its performance to architecture parameters. This article addresses the reservoir design problem, which remains the biggest challenge to the applicability of this information processing scheme. More specifically, we use the information available regarding the optimal reservoir working regimes to construct a functional link between the reservoir parameters and its performance. This function is used to explore various properties of the device and to choose the optimal reservoir architecture, thus replacing the tedious and time-consuming parameter scans used so far in the literature. PMID:26358528
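
    A minimal discrete-time emulation of such a delay-based reservoir computer is sketched below. The virtual-node count, gains, input mask, and the toy memory task are illustrative assumptions, not the optical/electronic setups or design criterion studied in the article; the sketch only shows the masking, delayed-feedback update, and linear readout that define the architecture.

```python
import numpy as np

# Minimal discrete-time emulation of a delay-based ("single nonlinear node plus
# delay line") reservoir computer. All sizes and gains are illustrative.
rng = np.random.default_rng(0)
N_virtual = 50                 # virtual nodes along the delay line
eta, gamma = 0.6, 0.75         # input scaling and delayed-feedback gain
mask = rng.uniform(-1.0, 1.0, N_virtual)   # random input mask, one weight per node

T = 3000
u = rng.uniform(0.0, 0.5, T)               # scalar input stream
target = np.roll(u, 2)                     # toy task: recall the input 2 steps back
target[:2] = 0.0

states = np.zeros((T, N_virtual))
x = np.zeros(N_virtual)
for k in range(T):
    # each virtual node sees the masked input plus its own state one delay ago
    x = np.tanh(gamma * x + eta * mask * u[k])
    states[k] = x

# Linear readout trained by ridge regression on a training segment
washout, split = 100, 2000
X_tr, y_tr = states[washout:split], target[washout:split]
ridge = 1e-6
W_out = np.linalg.solve(X_tr.T @ X_tr + ridge * np.eye(N_virtual), X_tr.T @ y_tr)

y_hat = states[split:] @ W_out
nrmse = np.sqrt(np.mean((y_hat - target[split:]) ** 2)) / np.std(target[split:])
print(f"test NRMSE on the 2-step memory task: {nrmse:.3f}")
```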

  6. Advanced computational tools for optimization and uncertainty quantification of carbon capture processes

    SciTech Connect

    Miller, David C.; Ng, Brenda; Eslick, John

    2014-01-01

    Advanced multi-scale modeling and simulation has the potential to dramatically reduce development time, resulting in considerable cost savings. The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and universities that is developing, demonstrating, and deploying a suite of multi-scale modeling and simulation tools. One significant computational tool is FOQUS, a Framework for Optimization and Quantification of Uncertainty and Sensitivity, which enables basic data submodels, including thermodynamics and kinetics, to be used within detailed process models to rapidly synthesize and optimize a process and determine the level of uncertainty associated with the resulting process. The overall approach of CCSI is described with a more detailed discussion of FOQUS and its application to carbon capture systems.

  7. Optimal nonlinear information processing capacity in delay-based reservoir computers

    NASA Astrophysics Data System (ADS)

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2015-09-01

    Reservoir computing is a recently introduced brain-inspired machine learning paradigm capable of excellent performance in the processing of empirical data. We focus on a particular kind of time-delay-based reservoir computer that has been physically implemented using optical and electronic systems and has shown unprecedented data processing rates. Reservoir computing is well known for the ease of the associated training scheme, but also for the problematic sensitivity of its performance to architecture parameters. This article addresses the reservoir design problem, which remains the biggest challenge to the applicability of this information processing scheme. More specifically, we use the information available regarding the optimal reservoir working regimes to construct a functional link between the reservoir parameters and its performance. This function is used to explore various properties of the device and to choose the optimal reservoir architecture, thus replacing the tedious and time-consuming parameter scans used so far in the literature.

  8. Optimal nonlinear information processing capacity in delay-based reservoir computers.

    PubMed

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2015-01-01

    Reservoir computing is a recently introduced brain-inspired machine learning paradigm capable of excellent performance in the processing of empirical data. We focus on a particular kind of time-delay-based reservoir computer that has been physically implemented using optical and electronic systems and has shown unprecedented data processing rates. Reservoir computing is well known for the ease of the associated training scheme, but also for the problematic sensitivity of its performance to architecture parameters. This article addresses the reservoir design problem, which remains the biggest challenge to the applicability of this information processing scheme. More specifically, we use the information available regarding the optimal reservoir working regimes to construct a functional link between the reservoir parameters and its performance. This function is used to explore various properties of the device and to choose the optimal reservoir architecture, thus replacing the tedious and time-consuming parameter scans used so far in the literature.

  9. [Optimization of extraction process for tannins from Geranium orientali-tibeticum by supercritical CO2 method].

    PubMed

    Xie, Song; Tong, Zhi-Ping; Tan, Rui; Liu, Xiao-Zhen

    2014-08-01

    In order to optimize the conditions of the supercritical CO2 extraction process for tannins from Geranium orientali-tibeticum, the content of tannins was determined by the phosphomolybdenum-tungsten acid-casein reaction. With extraction pressure, extraction temperature and extraction time as factors, and the content of tannins in the G. orientali-tibeticum extract as the index, the technological conditions were optimized by an orthogonal test. The optimum conditions were as follows: extraction pressure 25 MPa, extraction temperature 50 °C, and extraction time 1.5 h. The content of tannins in the extract was 12.91 mg x g(-1), and the extraction rate was 3.67%. The method established could be used to assay the content of tannins in G. orientali-tibeticum. The circulating extraction was a stable and feasible extraction process and provides a basis for setting the extraction conditions for tannins from G. orientali-tibeticum.
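
    The orthogonal-test analysis behind such conclusions can be sketched as a simple range (level-mean) analysis over an L9(3^3) array. The array is standard, but the response values below are fabricated for illustration and are not the reported tannin contents.

```python
import numpy as np

# Range analysis for an L9(3^3) orthogonal test: for each factor, average the
# response over the runs at each level and pick the level with the best mean.
L9 = np.array([[1, 1, 1], [1, 2, 2], [1, 3, 3],
               [2, 1, 2], [2, 2, 3], [2, 3, 1],
               [3, 1, 3], [3, 2, 1], [3, 3, 2]])   # levels of pressure, temperature, time
y = np.array([10.2, 11.5, 11.0, 12.3, 12.9, 11.8, 11.1, 12.0, 11.4])  # fabricated mg/g

factors = ["pressure", "temperature", "time"]
for f, name in enumerate(factors):
    level_means = [y[L9[:, f] == lvl].mean() for lvl in (1, 2, 3)]
    best_level = int(np.argmax(level_means)) + 1
    spread = max(level_means) - min(level_means)   # range: larger = more influential
    print(f"{name:12s} level means: {np.round(level_means, 2)}  "
          f"best level: {best_level}  range: {spread:.2f}")
```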

  10. Simultaneous optimization of multiple performance characteristics in coagulation-flocculation process for Indian paper industry wastewater.

    PubMed

    Saraswathi, R; Saseetharan, M K

    2012-01-01

    The goal of this study was to optimize the coagulation-flocculation process in wastewater generated from the paper and pulp industry using a grey relational analysis (GRA)-based Taguchi method. Process parameters included types and doses of natural coagulants and coagulant aid, and pH. To track the efficiency of the treatment process, the following responses were chosen for optimization: chemical oxygen demand (COD), total dissolved solids (TDS) and turbidity of wastewater, alone or in combination or all together. Analysis of variance showed that the type and dose of the coagulant aid were the most significant parameters, followed by pH and the dose of the coagulant; the type of coagulant used was found to be insignificant in the coagulation-flocculation process. Optimization of process parameters to achieve lower turbidity and greater removal of COD and TDS was verified in a separate confirmatory experiment, which showed improvements in COD and TDS removal and a decrease in turbidity of 8.2, 6.35 and 26.17%, respectively, with the application of the Taguchi method and GRA.
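
    The grey-relational step of a GRA-based Taguchi analysis can be sketched as follows. The response matrix is fabricated for illustration (it is not the study's data); the normalization, grey relational coefficients, and grades follow the standard GRA recipe with a distinguishing coefficient of 0.5.

```python
import numpy as np

# Grey relational analysis (GRA): combine several responses into a single grade.
# Rows are experimental runs; columns are responses (fabricated values).
# Columns: COD removal (%), TDS removal (%)  -> larger-the-better
#          turbidity (NTU)                   -> smaller-the-better
responses = np.array([
    [72.0, 60.0, 14.0],
    [80.5, 66.0, 10.5],
    [77.0, 63.5, 12.0],
    [85.0, 70.0,  8.0],
])
larger_is_better = np.array([True, True, False])

# Step 1: normalize every response to [0, 1]
norm = np.empty_like(responses, dtype=float)
for j in range(responses.shape[1]):
    col = responses[:, j]
    if larger_is_better[j]:
        norm[:, j] = (col - col.min()) / (col.max() - col.min())
    else:
        norm[:, j] = (col.max() - col) / (col.max() - col.min())

# Step 2: deviations from the ideal sequence (all ones) and grey relational coefficients
delta = 1.0 - norm
zeta = 0.5                                  # distinguishing coefficient, commonly 0.5
coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# Step 3: grey relational grade = (equal-weight) mean coefficient per run
grade = coeff.mean(axis=1)
print("grey relational grades:", np.round(grade, 3))
print("best run (0-indexed):", int(np.argmax(grade)))
```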

  11. Combinatorial techniques to efficiently investigate and optimize organic thin film processing and properties.

    PubMed

    Wieberger, Florian; Kolb, Tristan; Neuber, Christian; Ober, Christopher K; Schmidt, Hans-Werner

    2013-04-08

    In this article we present several newly developed and improved combinatorial techniques to optimize processing conditions and material properties of organic thin films. The combinatorial approach allows investigation of multi-variable dependencies and is an ideal tool for investigating organic thin films intended for high-performance applications. In this context we develop and establish the reliable preparation of gradients of material composition, temperature, exposure, and immersion time. Furthermore, we demonstrate the smart application of combined composition and processing gradients to create combinatorial libraries. First, a binary combinatorial library is created by applying two gradients perpendicular to each other. A third gradient is carried out in very small areas and arranged matrix-like over the entire binary combinatorial library, resulting in a ternary combinatorial library. Ternary combinatorial libraries allow precise trends to be identified for the optimization of multi-variable-dependent processes, which is demonstrated on the lithographic patterning process. Here we verify conclusively the strong interaction, and thus the interdependency, of variables in the preparation and properties of complex organic thin film systems. The established gradient preparation techniques are not limited to lithographic patterning. It is possible to utilize and transfer the reported combinatorial techniques to other multi-variable-dependent processes and to investigate and optimize thin film layers and devices for optical, electro-optical, and electronic applications.

  12. Optimization of the process chain for mirrors made of silicon carbide

    NASA Astrophysics Data System (ADS)

    Waechter, Daniel; Kroedel, Matthias; Huenten, Martin; Klocke, Fritz

    2012-09-01

    Different grades of silicon carbide (SiC) have become established materials for structures as well as optical mirrors in space-borne applications. However, manufacturing still requires considerable effort and restrains wider application in further fields. The research project MirrorFab aims to qualify an optimized process chain for manufacturing mirrors made of Cesic®. Cesic® consists of a SiC matrix reinforced with chopped carbon fibers. There is a space-qualified Cesic® manufacturing process and an established network for the supply chain. The project addresses the required gain in efficiency and flexibility of the manufacturing capabilities. The consortium covers the major parts of the process chain and aims to increase the performance of each manufacturing technology. Additionally, consideration of the complete process chain enables a holistic optimization approach. This paper deals particularly with the optimization of the grinding step after infiltration. The benefit of using an ultra-precision grinding machine for mirrors in the range of 200 mm is evaluated. The paper presents the results of a systematic study on the influence of grit size, type of bond, and the major machining parameters on the surface roughness and the grinding forces when machining Cesic®. A major finding is that the use of ultra-fine grinding wheels does not result in superior surface quality compared to a D46 grinding wheel with a resinoid bond.

  13. Optimized Rapeseed Oils Rich in Endogenous Micronutrients Protect High Fat Diet Fed Rats from Hepatic Lipid Accumulation and Oxidative Stress

    PubMed Central

    Xu, Jiqu; Liu, Xiaoli; Gao, Hui; Chen, Chang; Deng, Qianchun; Huang, Qingde; Ma, Zhonghua; Huang, Fenghong

    2015-01-01

    Micronutrients in rapeseed offer a potential hepatoprotective benefit, but most of them are lost during conventional refining. Thus, some processing technologies have been optimized to improve micronutrient retention in the oil. The aim of this study is to assess whether optimized rapeseed oils (OROs) have positive effects on hepatic lipid accumulation and oxidative stress induced by a high-fat diet. Methods: Rats received experimental diets containing 20% fat, with refined rapeseed oil or OROs obtained with various processing technologies as the lipid source. After 10 weeks of treatment, the liver was assayed for lipid accumulation and oxidative stress. Results: All OROs reduced hepatic triglyceride contents. Microwave pretreatment-cold pressing oil (MPCPO), which had the highest micronutrient contents, also reduced hepatic cholesterol levels. MPCPO significantly decreased hepatic sterol regulatory element-binding transcription factor 1 (SREBP1) but increased peroxisome proliferator-activated receptor α (PPARα) expression, and as a result, MPCPO significantly suppressed acetyl CoA carboxylase and induced carnitine palmitoyl transferase-1 and acyl CoA oxidase expression. Hepatic catalase (CAT) and glutathione peroxidase (GPx) activities as well as reduced glutathione (GSH) contents remarkably increased, and lipid peroxidation levels decreased, in parallel with the increase of micronutrients. Conclusion: OROs had the ability to reduce excessive hepatic fat accumulation and oxidative stress, which indicates that OROs might contribute to ameliorating nonalcoholic fatty liver induced by a high-fat diet. PMID:26473919

  14. The optimization of technological condition in the fermentation process of glutamate by pattern recognition method.

    PubMed

    Xu, C; Chen, C; Wang, H; Sun, J

    1994-01-01

    The technological conditions in the glutamate fermentation process (such as pH, temperature, ventilation rate, etc.) were optimized by a computerized pattern recognition method. The visible optimum region can be found based on the mapping from the multi-dimensional pattern space onto a plane. It is then transformed back into the original data space using Monte Carlo simulation, so the direction of optimization and the best combination of all parameters can be determined. A new mathematical model is proposed based on experimental evidence from production. The conversion ratio of glucose to glutamic acid, the production capacity and the glutamic acid concentration increased by 2.9%, 1.45% and 2.65%, respectively, when this optimization method was applied. The method has been widely extended to factories and has helped decrease both raw material expenses and production costs.

  15. Numerical solution of the problem of optimizing the process of oil displacement by steam

    NASA Astrophysics Data System (ADS)

    Temirbekov, N. M.; Baigereyev, D. R.

    2016-06-01

    The paper is devoted to the problem of optimizing the steam stimulation of an oil reservoir by controlling the steam pressure at the injection well to achieve a preassigned temperature distribution along the reservoir at a given time of development. The relevance of this problem stems from the need to improve methods of heavy oil development: heavy oil reserves exceed those of light oils, and their share tends to grow. As a mathematical model of oil displacement by steam, three-phase non-isothermal flow equations are considered. The problem of optimal control is formulated, and an algorithm for its numerical solution is proposed. As a reference regime, the temperature distribution corresponding to a constant pressure of injected steam is accepted. The solution of the optimization problem shows that, by choosing the steam pressure at the injection well, one can improve the efficiency of steam stimulation and reduce the pressure of the injected steam.

  16. ICPP Fluorinel Dissolution Process (FDP) Plant Protection System (PPS) baseline criteria evaluation

    SciTech Connect

    Allen, G.W.; Clayton, R.J.; Fielding, K.D.; Mozes, M.L.

    1993-06-01

    This report documents a baseline criteria evaluation of the FAST Plant Protection System (PPS) at the Idaho Chemical Processing Plant (ICPP). Westinghouse Idaho Nuclear Company (WINCO) Computer Process Application (CPA) personnel originally prepared this report as requested by the FAST Fluorinel Dissolution Process (FDP) Operational Readiness Review (ORR) committee. It was required by the ORR committee for the 1992 restart of FDP operations. However, on April 29, 1992, the Department of Energy (DOE) directed WINCO to discontinue reprocessing of spent nuclear fuel at the ICPP. This eliminated the mission of the FDP. The report includes an evaluation of the PPS against criteria requested by the ORR committee and against criteria contained in the WINCO PPS Requirements Manual. This second criteria evaluation is summarized in Appendix A.

  17. Optimization and analysis of mixed refrigerant composition for the PRICO natural gas liquefaction process

    NASA Astrophysics Data System (ADS)

    Xu, Xiongwen; Liu, Jinping; Cao, Le

    2014-01-01

    In this paper, the energy optimization of the PRICO natural gas liquefaction (LNG) process was performed with a genetic algorithm (GA) and the process simulation software Aspen Plus. The characteristics of the heat transfer composite curves of the cold box were then obtained and analyzed, and on this basis the heat exchange process in the cold box was divided into three regions. Finally, in order to find the relationship between energy consumption and the composition of the mixed refrigerant, the effects of the refrigerant composition on the temperature difference and the pinch point location were investigated in depth, which is useful for guiding refrigerant charging.
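
    The outer optimization loop of such a study can be sketched with a small genetic algorithm over refrigerant mole fractions. In the paper the objective comes from an Aspen Plus flowsheet; the surrogate objective, component count, and GA settings below are made-up placeholders so the sketch stays self-contained.

```python
import numpy as np

# Toy GA over mixed-refrigerant compositions (mole fractions summing to 1).
# The objective is a placeholder surrogate, not a thermodynamic simulation.
rng = np.random.default_rng(0)
n_comp = 5                                     # e.g. N2, C1, C2, C3, iC5 (assumed)

def surrogate_power(x):
    """Placeholder for the simulator: pretend some composition minimizes power."""
    target = np.array([0.10, 0.30, 0.35, 0.15, 0.10])
    return 1.0 + 5.0 * np.sum((x - target) ** 2)

def random_composition(size):
    z = rng.random((size, n_comp))
    return z / z.sum(axis=1, keepdims=True)     # mole fractions sum to 1

def evolve(pop_size=40, generations=60, mut=0.05):
    pop = random_composition(pop_size)
    for _ in range(generations):
        fitness = np.array([surrogate_power(p) for p in pop])
        parents = pop[np.argsort(fitness)[: pop_size // 2]]   # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = 0.5 * (a + b)                   # arithmetic crossover
            child += rng.normal(0.0, mut, n_comp)   # Gaussian mutation
            child = np.clip(child, 1e-6, None)
            children.append(child / child.sum())    # renormalize to a valid composition
        pop = np.vstack([parents, children])
    fitness = np.array([surrogate_power(p) for p in pop])
    return pop[np.argmin(fitness)], fitness.min()

best_x, best_f = evolve()
print("best composition:", np.round(best_x, 3), "surrogate power:", round(best_f, 4))
```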

  18. Optimization of the laser remelting process for HVOF-sprayed Stellite 6 wear resistant coatings

    NASA Astrophysics Data System (ADS)

    Ciubotariu, Costel-Relu; Frunzăverde, Doina; Mărginean, Gabriela; Șerban, Viorel-Aurel; Bîrdeanu, Aurel-Valentin

    2016-03-01

    Cobalt-base alloys are used in all industrial areas due to their excellent wear resistance. Several studies have shown that Stellite 6 coatings are suitable not only for protection against sliding wear, but also in case of exposure to impact loading. In this respect, a possible application is the protection of hydropower plant components affected by cavitation. The main problem in connection with Stellite 6 is the deposition procedure for the protective layers, with both welding and thermal spraying techniques requiring special measures in order to prevent brittleness of the coating. In this study, Stellite 6 layers were HVOF thermally sprayed on a martensitic 13-4 stainless steel substrate, as typically used for hydraulic machinery components. In order to improve the microstructure of the HVOF-sprayed coatings and their adhesion to the substrate, laser remelting was applied, using a TRUMPF Laser type HL 124P LCU and different working parameters. The microstructure of the coatings obtained for various remelting conditions was evaluated by light microscopy, revealing the optimal value of the pulse power, which provided a homogeneous Stellite 6 layer with good adhesion to the substrate.

  19. Process optimization by response surface design and characterization study on geniposide pharmacosomes.

    PubMed

    Yue, Peng-Fei; Zheng, Qin; Wu, Bin; Yang, Ming; Wang, Mu-Sheng; Zhang, Hai-Yan; Hu, Peng-Yi; Wu, Zhen-Feng

    2012-01-01

    The objective of this study was to prepare and characterize geniposide pharmacosomes (GP-PMS) and optimize the process and formulation variables using response surface methodology. Tetrahydrofuran was used as the reaction medium; GP and phospholipids were dissolved in the medium, and GP-PMS were formed after the organic solvent was evaporated off under vacuum. The process and formulation variables were optimized by a central composite design (CCD) of response surface methodology (RSM). The phospholipid-to-drug ratio (X(1)), reaction temperature (X(2)) and the drug concentration (X(3)) were selected as independent variables, and the yield (%) of GP 'present as a complex' in the PMS was used as the dependent variable. The physico-chemical properties of the complex obtained under the optimal parameters were investigated by means of Fourier transform infrared spectrophotometry (FT-IR), differential scanning calorimetry, the n-octanol/water partition coefficient (P) and particle size analysis. Multiple linear regression analysis for optimization by CCD revealed that the highest yield of GP 'present as a complex' in the GP-PMS was obtained at the optimal settings of X(1), X(2) and X(3) of 3, 50°C and 5.5 mg/mL, respectively. The DSC and IR studies of GP-PMS prepared at the optimal settings demonstrated that GP and phospholipids in the GP-PMS were combined by non-covalent bonds, not forming a new compound. GP-PMS significantly increased the lipophilicity of GP, and P of GP-PMS in n-octanol and water was about 20 times that of the GP raw material. Pharmacosomes could be an alternative approach to improve the absorption and permeation of biologically active constituents.
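
    The response-surface step of a central composite design can be sketched as fitting a full second-order polynomial and locating its stationary point. The design-matrix layout is standard CCD practice, but the coded runs and yield values below are fabricated for illustration and are not the study's data.

```python
import numpy as np
from itertools import combinations

# Fit y = b0 + b'x + x'Bx to central-composite-design data and find the
# stationary point x* = -0.5 * B^{-1} b. All responses are fabricated.
def quadratic_design_matrix(X):
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(k)]                    # intercept, linear
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(k), 2)]    # interactions
    cols += [X[:, i] ** 2 for i in range(k)]                             # pure quadratic
    return np.column_stack(cols)

# Coded factors: X1 = phospholipid:drug ratio, X2 = temperature, X3 = concentration
X = np.array([[-1, -1, -1], [1, -1, -1], [-1, 1, -1], [1, 1, -1],
              [-1, -1, 1], [1, -1, 1], [-1, 1, 1], [1, 1, 1],
              [-1.68, 0, 0], [1.68, 0, 0], [0, -1.68, 0], [0, 1.68, 0],
              [0, 0, -1.68], [0, 0, 1.68], [0, 0, 0], [0, 0, 0], [0, 0, 0]],
             dtype=float)
y = np.array([71, 78, 74, 82, 73, 80, 76, 84, 70, 83, 72, 79, 74, 77, 88, 87, 89],
             dtype=float)                                                # fabricated yields (%)

D = quadratic_design_matrix(X)
beta, *_ = np.linalg.lstsq(D, y, rcond=None)                             # least-squares fit

k = X.shape[1]
b = beta[1:1 + k]
B = np.zeros((k, k))
idx = 1 + k
for (i, j) in combinations(range(k), 2):
    B[i, j] = B[j, i] = beta[idx] / 2.0
    idx += 1
for i in range(k):
    B[i, i] = beta[idx + i]
x_star = -0.5 * np.linalg.solve(B, b)
print("stationary point in coded units:", np.round(x_star, 3))
```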

  20. An intelligent factory-wide optimal operation system for continuous production process

    NASA Astrophysics Data System (ADS)

    Ding, Jinliang; Chai, Tianyou; Wang, Hongfeng; Wang, Junwei; Zheng, Xiuping

    2016-03-01

    In this study, a novel intelligent factory-wide operation system for a continuous production process is designed to optimise the entire production process, which consists of multiple units; furthermore, this system is developed using process operational data to avoid the complexity of mathematical modelling of the continuous production process. The data-driven approach aims to specify the structure of the optimal operation system; in particular, the operational data of the process are used to formulate each part of the system. In this context, the domain knowledge of process engineers is utilised, and a closed-loop dynamic optimisation strategy, which combines feedback, performance prediction, feed-forward, and dynamic tuning schemes into a framework, is employed. The effectiveness of the proposed system has been verified using industrial experimental results.

  1. Applying Business Process Re-engineering Patterns to optimize WS-BPEL Workflows

    NASA Astrophysics Data System (ADS)

    Buys, Jonas; de Florio, Vincenzo; Blondia, Chris

    With the advent of XML-based SOA, WS-BPEL quickly became a widely accepted standard for modeling business processes. Though SOA is said to embrace the principle of business agility, BPEL process definitions are still manually crafted into their final executable version. While SOA has proven to be a giant leap forward in building flexible IT systems, this static BPEL workflow model is somewhat at odds with the need for real business agility and should be enhanced to better sustain continual process evolution. In this paper, we point out the potential of adding business intelligence, in the form of business process re-engineering patterns, to the system to allow for automatic business process optimization. Furthermore, we point out that BPR macro-rules could be implemented by leveraging micro-techniques from computer science. We present some practical examples that illustrate the benefit of such adaptive process models and our preliminary findings.

  2. Process metallurgy simulation for metal drawing process optimization by using two-scale finite element method

    SciTech Connect

    Nakamachi, Eiji; Yoshida, Takashi; Yamaguchi, Toshihiko; Morita, Yusuke; Kuramae, Hiroyuki; Morimoto, Hideo

    2014-10-06

    We developed a two-scale FE analysis procedure based on the crystallographic homogenization method, considering the hierarchical structure of polycrystalline aluminium alloy. It can be characterized as the combination of two scales: the microscopic polycrystal structure and the macroscopic elastic-plastic continuum. The micro polycrystal structure is modeled as a three-dimensional representative volume element (RVE). The RVE is discretized by 3×3×3 eight-node solid finite elements and contains 216 crystal orientations. This FE analysis code can predict the deformation, strain and stress evolutions in wire drawing processes at the macro scale, and further the crystal texture and hardening evolutions at the micro scale. In this study, we analyzed the texture evolution in wire drawing processes with our two-scale FE analysis code under various die drawing angles. We evaluate the texture evolution in the surface and center regions of the wire cross section in order to clarify the effects of processing conditions on the texture evolution.

  3. A Twin Protection Effect? Explaining Twin Survival Advantages with a Two-Process Mortality Model

    PubMed Central

    2016-01-01

    Twin studies that focus on the correlation in age-at-death between twin pairs have yielded important insights into the heritability and role of genetic factors in determining lifespan, but less attention is paid to the biological and social role of zygosity itself in determining survival across the entire life course. Using data from the Danish Twin Registry and the Human Mortality Database, we show that monozygotic twins have greater cumulative survival proportions at nearly every age compared to dizygotic twins and the Danish general population. We examine this survival advantage by fitting these data with a two-process mortality model that partitions survivorship patterns into extrinsic and intrinsic mortality processes roughly corresponding to acute, environmental and chronic, biological origins. We find intrinsic processes confer a survival advantage at older ages for males, while at younger ages, all monozygotic twins show a health protection effect against extrinsic death akin to a marriage protection effect. While existing research suggests an increasingly important role for genetic factors at very advanced ages, we conclude that the social closeness of monozygotic twins is a plausible driver of the survival advantage at ages <65. PMID:27192433

  4. A Twin Protection Effect? Explaining Twin Survival Advantages with a Two-Process Mortality Model.

    PubMed

    Sharrow, David J; Anderson, James J

    2016-01-01

    Twin studies that focus on the correlation in age-at-death between twin pairs have yielded important insights into the heritability and role of genetic factors in determining lifespan, but less attention is paid to the biological and social role of zygosity itself in determining survival across the entire life course. Using data from the Danish Twin Registry and the Human Mortality Database, we show that monozygotic twins have greater cumulative survival proportions at nearly every age compared to dizygotic twins and the Danish general population. We examine this survival advantage by fitting these data with a two-process mortality model that partitions survivorship patterns into extrinsic and intrinsic mortality processes roughly corresponding to acute, environmental and chronic, biological origins. We find intrinsic processes confer a survival advantage at older ages for males, while at younger ages, all monozygotic twins show a health protection effect against extrinsic death akin to a marriage protection effect. While existing research suggests an increasingly important role for genetic factors at very advanced ages, we conclude that the social closeness of monozygotic twins is a plausible driver of the survival advantage at ages <65.
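
    A generic two-process survival partition of the kind described above can be illustrated numerically. The constant-hazard extrinsic term, the Gompertz intrinsic term, and all parameter values below are simplifying assumptions for illustration only; they are not the specific vitality-based model fitted in the paper.

```python
import numpy as np

# Total survival as the product of an extrinsic (acute, roughly age-independent)
# component and an intrinsic (chronic, age-accelerating) component.
def survival(age, m_ext=0.002, a_int=5e-5, b_int=0.095):
    S_ext = np.exp(-m_ext * age)                                     # extrinsic: constant hazard
    S_int = np.exp(-(a_int / b_int) * (np.exp(b_int * age) - 1.0))   # intrinsic: Gompertz
    return S_ext * S_int, S_ext, S_int

ages = np.arange(0, 101, 10)
S, S_ext, S_int = survival(ages)
print(" age   S_total   S_extrinsic   S_intrinsic")
for a, s, se, si in zip(ages, S, S_ext, S_int):
    print(f"{a:4d}   {s:7.3f}   {se:11.3f}   {si:11.3f}")

# A lower extrinsic hazard (e.g. a hypothesized "protection effect") raises survival
# mainly at younger ages, before the intrinsic Gompertz term dominates.
S_protected, *_ = survival(ages, m_ext=0.001)
print("survival gain at age 50 from halving the extrinsic hazard:",
      round(S_protected[5] - S[5], 4))
```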

  5. Statistical optimization of process parameters on biohydrogen production from glucose by Clostridium sp. Fanp2.

    PubMed

    Pan, C M; Fan, Y T; Xing, Y; Hou, H W; Zhang, M L

    2008-05-01

    Statistically based experimental designs were applied to optimize process parameters for hydrogen production from glucose by Clostridium sp. Fanp2, which was isolated from the effluent sludge of an anaerobic hydrogen-producing bioreactor. The important factors influencing hydrogen production, identified by initial Plackett-Burman screening, were glucose, phosphate buffer and vitamin solution. The path of steepest ascent was undertaken to approach the optimal region of the three significant factors. A Box-Behnken design and response surface analysis were adopted to further investigate the mutual interaction between the variables and identify optimal values that yield maximum hydrogen production. Experimental results showed that glucose, vitamin solution and phosphate buffer concentration each had a significant individual influence on the specific hydrogen production potential (Ps). At the same time, glucose and vitamin solution, as well as glucose and phosphate buffer, were interdependent. The optimal conditions for the maximal Ps were: glucose 23.75 g/l, phosphate buffer 0.159 M and vitamin solution 13.3 ml/l. Using this statistical optimization method, the hydrogen production from glucose was increased from 2248.5 to 4165.9 ml H2/l.

  6. Optimization of Electrochemical Treatment Process Conditions for Distillery Effluent Using Response Surface Methodology.

    PubMed

    Arulmathi, P; Elangovan, G; Begum, A Farjana

    2015-01-01

    The distillery industry is recognized as one of the most polluting industries in India, with a large amount of annual effluent production. In the present study, the optimization of electrochemical treatment process variables for treating the color and COD of distillery spent wash, using Ti/Pt as an anode in a batch mode, is reported. Process variables such as pH, current density, electrolysis time, and electrolyte dose were selected as operating variables, and chemical oxygen demand (COD) and color removal efficiency were considered as response variables for optimization using response surface methodology. The indirect electrochemical-oxidation process variables were optimized using a Box-Behnken response surface design (BBD). The results showed that the electrochemical treatment process effectively removed the COD (89.5%) and color (95.1%) of the distillery spent wash under the optimum conditions: pH of 4.12, current density of 25.02 mA/cm(2), electrolysis time of 103.27 min, and electrolyte (NaCl) concentration of 1.67 g/L.

  7. Toxicity assessment of tannery effluent treated by an optimized photo-Fenton process.

    PubMed

    Borba, Fernando Henrique; Módenes, Aparecido Nivaldo; Espinoza-Quiñones, Fernando Rodolfo; Manenti, Diego Ricieri; Bergamasco, Rosangela; Mora, Nora Diaz

    2013-01-01

    In this work, an optimized photo-Fenton process was applied to remove pollutants from tannery industrial effluent (TIE), with the final toxicity level assessed by a lettuce-seed-based bioassay test. A full 3(3) factorial design was applied for the optimization of long-term photo-Fenton experiments. The optimum conditions of the photo-Fenton process were attained at concentrations of 0.3 g Fe(2+) L(-1) and 20 g H2O2 L(-1) and pH 3, for a 120 min UV irradiation time. Reactor operating parameter (ROP) effects on the removal of chemical oxygen demand, colour, turbidity, total suspended solids and total volatile solids were evaluated, suggesting that a broad range of ROP values are also suitable to give results very near to those of the photo-Fenton experiments under optimal conditions. Based on the low calculated median lethal dose (LD50) values from the lettuce-seed-based bioassay test, we suggest that recalcitrant substances are present in treated TIE samples. A possible cause of the high toxicity level could partly be attributed to the nitrate concentration, which was not completely abated by the photo-Fenton process. Apart from this, the photo-Fenton process can be used as part of an industrial effluent treatment system in order to abate high organic pollutant loads. PMID:23837315

  8. An Approach to Optimize Size Parameters of Forging by Combining Hot-Processing Map and FEM

    NASA Astrophysics Data System (ADS)

    Hu, H. E.; Wang, X. Y.; Deng, L.

    2014-11-01

    The size parameters of a 6061 aluminum alloy rib-web forging were optimized by using a hot-processing map and the finite element method (FEM) based on high-temperature compression data. The results show that the stress level of the alloy can be represented by a Zener-Hollomon parameter in a hyperbolic sine-type equation with a hot deformation activation energy of 343.7 kJ/mol. Dynamic recovery and dynamic recrystallization proceeded concurrently during high-temperature deformation of the alloy. Optimal hot-processing parameters for the alloy, corresponding to the peak value of 0.42, are 753 K and 0.001 s-1. The instability domain occurs at deformation temperatures lower than 653 K. FEM is a viable method to validate the hot-processing map in actual manufacturing by analyzing the effect of corner radius, rib width, and web thickness on the workability of the rib-web forging of the alloy. Size parameters of die forgings can be optimized conveniently by combining the hot-processing map and FEM.
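
    The Zener-Hollomon relation quoted above can be evaluated directly. The activation energy Q = 343.7 kJ/mol is taken from the abstract; the constants A, alpha, and n in the hyperbolic-sine law are illustrative placeholders, since the abstract does not report them.

```python
import numpy as np

# Zener-Hollomon / hyperbolic-sine flow-stress relation:
#   Z = strain_rate * exp(Q / (R * T)) = A * [sinh(alpha * sigma)]^n
R = 8.314            # J/(mol*K)
Q = 343.7e3          # J/mol, hot-deformation activation energy from the abstract

def zener_hollomon(strain_rate, T_kelvin):
    return strain_rate * np.exp(Q / (R * T_kelvin))

def flow_stress(Z, A=1.0e24, alpha=0.015, n=5.0):
    # invert Z = A * sinh(alpha*sigma)^n  ->  sigma = (1/alpha) * asinh((Z/A)**(1/n))
    return np.arcsinh((Z / A) ** (1.0 / n)) / alpha

# Evaluate at the reported optimal window (753 K, 0.001 1/s) and two other points
for rate, T in [(0.001, 753.0), (1.0, 753.0), (0.001, 653.0)]:
    Z = zener_hollomon(rate, T)
    print(f"rate={rate:>6} 1/s, T={T:.0f} K -> Z={Z:.3e}, "
          f"flow stress ~ {flow_stress(Z):.1f} MPa (placeholder constants)")
```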

  9. Optimization of Electrochemical Treatment Process Conditions for Distillery Effluent Using Response Surface Methodology

    PubMed Central

    Arulmathi, P.; Elangovan, G.; Begum, A. Farjana

    2015-01-01

    Distillery industry is recognized as one of the most polluting industries in India with a large amount of annual effluent production. In this present study, the optimization of electrochemical treatment process variables was reported to treat the color and COD of distillery spent wash using Ti/Pt as an anode in a batch mode. Process variables such as pH, current density, electrolysis time, and electrolyte dose were selected as operation variables and chemical oxygen demand (COD) and color removal efficiency were considered as response variable for optimization using response surface methodology. Indirect electrochemical-oxidation process variables were optimized using Box-Behnken response surface design (BBD). The results showed that electrochemical treatment process effectively removed the COD (89.5%) and color (95.1%) of the distillery industry spent wash under the optimum conditions: pH of 4.12, current density of 25.02 mA/cm2, electrolysis time of 103.27 min, and electrolyte (NaCl) concentration of 1.67 g/L, respectively. PMID:26491716

  10. Optimization of Electrochemical Treatment Process Conditions for Distillery Effluent Using Response Surface Methodology.

    PubMed

    Arulmathi, P; Elangovan, G; Begum, A Farjana

    2015-01-01

    Distillery industry is recognized as one of the most polluting industries in India with a large amount of annual effluent production. In this present study, the optimization of electrochemical treatment process variables was reported to treat the color and COD of distillery spent wash using Ti/Pt as an anode in a batch mode. Process variables such as pH, current density, electrolysis time, and electrolyte dose were selected as operation variables and chemical oxygen demand (COD) and color removal efficiency were considered as response variable for optimization using response surface methodology. Indirect electrochemical-oxidation process variables were optimized using Box-Behnken response surface design (BBD). The results showed that electrochemical treatment process effectively removed the COD (89.5%) and color (95.1%) of the distillery industry spent wash under the optimum conditions: pH of 4.12, current density of 25.02 mA/cm(2), electrolysis time of 103.27 min, and electrolyte (NaCl) concentration of 1.67 g/L, respectively. PMID:26491716

  11. About Distributed Simulation-based Optimization of Forming Processes using a Grid Architecture

    NASA Astrophysics Data System (ADS)

    Grauer, Manfred; Barth, Thomas

    2004-06-01

    The permanently increasing complexity of products and their manufacturing processes, combined with a shorter "time-to-market", leads to more and more use of simulation and optimization software systems for product design. Finding a "good" design of a product implies the solution of computationally expensive optimization problems based on the results of simulation. Due to the computational load caused by the solution of these problems, the requirements on the Information & Telecommunication (IT) infrastructure of an enterprise or research facility are shifting from stand-alone resources towards the integration of software and hardware resources in a distributed environment for high-performance computing. Resources can comprise software systems, hardware systems, or communication networks. An appropriate IT infrastructure must provide the means to integrate all these resources and enable their use even across a network to cope with requirements from geographically distributed scenarios, e.g. in computational engineering and/or collaborative engineering. Integrating experts' knowledge into the optimization process is essential in order to reduce the complexity caused by the number of design variables and the high dimensionality of the design space. Hence, the utilization of knowledge-based systems must be supported by providing data management facilities as a basis for knowledge extraction from product data. In this paper, the focus is on a distributed problem solving environment (PSE) capable of providing access to a variety of necessary resources and services. A distributed approach integrating simulation and optimization on a network of workstations and cluster systems is presented. For geometry generation the CAD system CATIA is used, coupled with the FEM simulation system INDEED for the simulation of sheet-metal forming processes and the problem solving environment OpTiX for distributed optimization.

  12. An integrated approach of topology optimized design and selective laser melting process for titanium implants materials.

    PubMed

    Xiao, Dongming; Yang, Yongqiang; Su, Xubin; Wang, Di; Sun, Jianfeng

    2013-01-01

    Load-bearing bone implant materials should have sufficient stiffness and large porosity, which interact, since larger porosity causes lower mechanical properties. This paper seeks the maximum-stiffness architecture under the constraint of a specified volume fraction by a topology optimization approach; that is, maximum porosity can be achieved with predefined stiffness properties. The effective elastic moduli of conventional cubic and topology-optimized scaffolds were calculated using the finite element analysis (FEA) method; in addition, specimens with different porosities of 41.1%, 50.3%, 60.2% and 70.7% were fabricated by the Selective Laser Melting (SLM) process and tested in compression. Results showed that the computational effective elastic modulus of the optimized scaffolds was approximately 13% higher than that of the cubic scaffolds, while the experimental stiffness values were about 76% lower than the computational ones. The combination of the topology optimization approach and the SLM process would be suitable for the development of titanium implant materials in consideration of both porosity and mechanical stiffness.

  13. Gas separation using membranes. 1: Optimization of the separation process using new cost parameters

    SciTech Connect

    Hinchliffe, A.B.; Porter, K.E.

    1997-03-01

    This is the first in a series of papers presenting new concepts for the development of membranes for gas separation. In this paper two new cost parameters, which are useful for costing and optimization of membrane gas separation systems, are described. The new parameters, cost permeability and effective selectivity, can be used to show the direction to be taken in membrane research and development. The new parameters are shown to predict accurately the cost of membrane separation plant by correlating bids from membrane plant suppliers using the new parameters with cross-flow design equations. The parameters are used to optimize the membrane gas separation of hydrogen and carbon monoxide for two commercially available membrane systems. The membrane separation is compared with the currently used method, cryogenic flash distillation. Economic evaluation methods are developed to compare different separation methods so that the process as a whole can be optimized. The evaluation shows that, for membrane gas separation, it is important to find the optimum degree of separation; when membrane separation is evaluated at the separation specification for the established cryogenic method, membranes are not competitive; however, when the process is optimized for membrane separation, the cost of separation reduces to less than 60% of the cryogenic separation.

  14. Modeling and optimization of red currants vacuum drying process by response surface methodology (RSM).

    PubMed

    Šumić, Zdravko; Vakula, Anita; Tepić, Aleksandra; Čakarević, Jelena; Vitas, Jasmina; Pavlić, Branimir

    2016-07-15

    Fresh red currants were dried by a vacuum drying process under different drying conditions. A Box-Behnken experimental design with response surface methodology was used for optimization of the drying process in terms of physical (moisture content, water activity, total color change, firmness and rehydration power) and chemical (total phenols, total flavonoids, monomeric anthocyanins and ascorbic acid content and antioxidant activity) properties of the dried samples. Temperature (48-78 °C), pressure (30-330 mbar) and drying time (8-16 h) were investigated as independent variables. Experimental results were fitted to a second-order polynomial model, where regression analysis and analysis of variance were used to determine model fitness and optimal drying conditions. The optimal conditions of the simultaneously optimized responses were a temperature of 70.2 °C, a pressure of 39 mbar and a drying time of 8 h. It could be concluded that vacuum drying provides samples with good physico-chemical properties, similar to the lyophilized sample and better than the conventionally dried sample.

  15. Nacelle/Diverter Integration into the Design Optimization Process Using Pseudo, Warped, and Real Nacelles

    NASA Technical Reports Server (NTRS)

    Cliff, Susan E.; Reuther, James J.; Saunders, David A.; Rimlinger, Mark J.

    1999-01-01

    The computational results of the optimized complete configurations, including nacelles and diverters, are presented in terms of drag count improvement compared with the TCA baseline configuration at Mach 2.4, C_L = 0.1. The three candidate designs are designated by the organization from which they were derived. ARC represents the Ames Research Center 1-03 design, BCAG represents the Boeing Commercial Aircraft Group's design from Seattle, and BLB represents the design from Boeing Long Beach. All CFD methods are in unanimous agreement that the Ames 1-03 configuration has the largest performance improvement, followed closely by the BCAG configuration, with a much smaller improvement attained by Boeing Long Beach. The Ames design was obtained using the single-block wing/body code SYN87-SB with its "pseudo" nacelle option, an elaborate technique for incorporating nacelle/diverter effects into the design optimization process. This technique uses AIRPLANE surface pressure coefficient data with and without the nacelles/diverters. Further details of this method are described. It is reasonable to expect that further improvements could be achieved by including the "real" nacelles directly into the optimization process by use of the newly-developed multiblock optimization code, SYN107-MB, which can handle full configurations.

  16. Modeling and optimization of red currants vacuum drying process by response surface methodology (RSM).

    PubMed

    Šumić, Zdravko; Vakula, Anita; Tepić, Aleksandra; Čakarević, Jelena; Vitas, Jasmina; Pavlić, Branimir

    2016-07-15

    Fresh red currants were dried by a vacuum drying process under different drying conditions. A Box-Behnken experimental design with response surface methodology was used for optimization of the drying process in terms of physical (moisture content, water activity, total color change, firmness and rehydration power) and chemical (total phenols, total flavonoids, monomeric anthocyanins and ascorbic acid content and antioxidant activity) properties of the dried samples. Temperature (48-78 °C), pressure (30-330 mbar) and drying time (8-16 h) were investigated as independent variables. Experimental results were fitted to a second-order polynomial model, where regression analysis and analysis of variance were used to determine model fitness and optimal drying conditions. The optimal conditions of the simultaneously optimized responses were a temperature of 70.2 °C, a pressure of 39 mbar and a drying time of 8 h. It could be concluded that vacuum drying provides samples with good physico-chemical properties, similar to the lyophilized sample and better than the conventionally dried sample. PMID:26948639

  17. Thermodynamic optimization of a Penrose process: An engineers' approach to black hole thermodynamics

    NASA Astrophysics Data System (ADS)

    Bravetti, A.; Gruber, C.; Lopez-Monsalvo, C. S.

    2016-03-01

    In this work we present a new view on the thermodynamics of black holes introducing effects of irreversibility by employing thermodynamic optimization and finite-time thermodynamics. These questions are of importance both in physics and in engineering, combining standard thermodynamics with optimal control theory in order to find optimal protocols and bounds for realistic processes without assuming anything about the microphysics involved. We work out the details of the thermodynamic optimization of a Penrose process, i.e. the problem of finding the maximum work that can be extracted from a Kerr black hole in finite time. This problem has already been addressed in the case of an isolated black hole. Here we consider the case of a black hole immersed in a reservoir and show that the presence of the reservoir can dramatically improve the work output. We discuss the relevance of our results for real astrophysical phenomena, for the comparison with laboratory black holes analogues and for other theoretical aspects of black hole thermodynamics.
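
    The reversible upper bound that any Penrose-type extraction protocol must respect follows from the non-decreasing irreducible mass of a Kerr black hole; a short calculation is sketched below (geometric units, G = c = 1). The spin values are arbitrary examples, and finite-time, irreversible protocols of the kind analyzed in the paper can only extract less than this bound.

```python
import numpy as np

# Irreducible mass of a Kerr black hole:
#   M_irr^2 = (M^2 / 2) * (1 + sqrt(1 - (a/M)^2)),   a = J/M
# M_irr cannot decrease, so at most E_max = M - M_irr can ever be extracted.
def max_extractable_fraction(spin_ratio):
    """spin_ratio = a/M in [0, 1]; returns (M - M_irr) / M."""
    m_irr_over_m = np.sqrt(0.5 * (1.0 + np.sqrt(1.0 - spin_ratio ** 2)))
    return 1.0 - m_irr_over_m

for chi in (0.0, 0.5, 0.9, 0.998, 1.0):
    print(f"a/M = {chi:5.3f} -> at most {100 * max_extractable_fraction(chi):5.2f}% "
          f"of the mass-energy is extractable")
# The extremal case a/M = 1 gives the familiar 1 - 1/sqrt(2) ~ 29.3% bound.
```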

  18. Development of optimization model for sputtering process parameter based on gravitational search algorithm

    NASA Astrophysics Data System (ADS)

    Norlina, M. S.; Diyana, M. S. Nor; Mazidah, P.; Rusop, M.

    2016-07-01

    In the RF magnetron sputtering process, the desirable layer properties are largely influenced by the process parameters and conditions. If the quality of the thin film has not reached its intended level, the experiments have to be repeated until the desired quality has been achieved. This research proposes the Gravitational Search Algorithm (GSA) as the optimization model to reduce the time and cost spent on thin film fabrication. The optimization model's engine has been developed using Java. The model is developed based on the GSA concept, which is inspired by the Newtonian laws of gravity and motion. In this research, the model is expected to optimize four deposition parameters: RF power, deposition time, oxygen flow rate and substrate temperature. The results have turned out to be promising and it could be concluded that the performance of the model is satisfactory in this parameter optimization problem. Future work could compare GSA with other nature-inspired algorithms and test them with various sets of data.
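
    A minimal sketch of a Gravitational Search Algorithm loop over the four deposition parameters named above is given below. The fitness function, parameter bounds and GSA constants are assumptions chosen only to make the example self-contained; a real application would score predicted thin-film quality instead.

    ```python
    # Illustrative Gravitational Search Algorithm (GSA) loop over four deposition
    # parameters (RF power, deposition time, O2 flow rate, substrate temperature).
    # The fitness function, bounds and GSA constants are placeholders, not values
    # from the study.
    import numpy as np

    rng = np.random.default_rng(42)
    bounds = np.array([[50.0, 300.0],   # RF power (W), assumed range
                       [10.0, 120.0],   # deposition time (min), assumed range
                       [5.0, 50.0],     # oxygen flow rate (sccm), assumed range
                       [25.0, 400.0]])  # substrate temperature (C), assumed range

    def fitness(x):                     # placeholder objective: smaller is better
        target = np.array([180.0, 60.0, 20.0, 300.0])
        return np.sum(((x - target) / (bounds[:, 1] - bounds[:, 0])) ** 2)

    n_agents, n_iter = 20, 100
    X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_agents, 4))
    V = np.zeros_like(X)

    for t in range(n_iter):
        fit = np.array([fitness(x) for x in X])
        best, worst = fit.min(), fit.max()
        m = (fit - worst) / (best - worst + 1e-12)   # smaller fitness -> larger mass
        M = m / (m.sum() + 1e-12)
        G = 100.0 * np.exp(-20.0 * t / n_iter)       # decaying gravitational constant
        acc = np.zeros_like(X)
        for i in range(n_agents):
            for j in range(n_agents):
                if i == j:
                    continue
                diff = X[j] - X[i]
                dist = np.linalg.norm(diff) + 1e-12
                acc[i] += rng.random() * G * M[j] * diff / dist
        V = rng.random(X.shape) * V + acc
        X = np.clip(X + V, bounds[:, 0], bounds[:, 1])

    best_idx = np.argmin([fitness(x) for x in X])
    print("best parameters found:", np.round(X[best_idx], 2))
    ```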

  19. An integrated approach of topology optimized design and selective laser melting process for titanium implants materials.

    PubMed

    Xiao, Dongming; Yang, Yongqiang; Su, Xubin; Wang, Di; Sun, Jianfeng

    2013-01-01

    Load-bearing bone implant materials should have both sufficient stiffness and large porosity, two requirements that conflict, since larger porosity leads to lower mechanical properties. This paper seeks the maximum-stiffness architecture under a specified volume-fraction constraint using a topology optimization approach; that is, the maximum porosity is sought for predefined stiffness properties. The effective elastic moduli of conventional cubic and topology-optimized scaffolds were calculated using the finite element analysis (FEA) method; in addition, specimens with porosities of 41.1%, 50.3%, 60.2% and 70.7% were fabricated by the Selective Laser Melting (SLM) process and evaluated by compression testing. Results showed that the computational effective elastic modulus of the optimized scaffolds was approximately 13% higher than that of the cubic scaffolds, while the experimental stiffness values were about 76% lower than the computational ones. The combination of the topology optimization approach and the SLM process would be suitable for the development of titanium implant materials in consideration of both porosity and mechanical stiffness. PMID:23988713

  1. Ethanol production from banana peels using statistically optimized simultaneous saccharification and fermentation process.

    PubMed

    Oberoi, Harinder Singh; Vadlani, Praveen V; Saida, Lavudi; Bansal, Sunil; Hughes, Joshua D

    2011-07-01

    Dried and ground banana peel biomass (BP) after hydrothermal sterilization pretreatment was used for ethanol production using simultaneous saccharification and fermentation (SSF). Central composite design (CCD) was used to optimize concentrations of cellulase and pectinase, temperature and time for ethanol production from BP using SSF. Analysis of variance showed a high coefficient of determination (R(2)) value of 0.92 for ethanol production. On the basis of model graphs and numerical optimization, the validation was done in a laboratory batch fermenter with cellulase, pectinase, temperature and time of nine cellulase filter paper unit/gram cellulose (FPU/g-cellulose), 72 international units/gram pectin (IU/g-pectin), 37 °C and 15 h, respectively. The experiment using optimized parameters in batch fermenter not only resulted in higher ethanol concentration than the one predicted by the model equation, but also saved fermentation time. This study demonstrated that both hydrothermal pretreatment and SSF could be successfully carried out in a single vessel, and use of optimized process parameters helped achieve significant ethanol productivity, indicating commercial potential for the process. To the best of our knowledge, ethanol concentration and ethanol productivity of 28.2 g/l and 2.3 g/l/h, respectively from banana peels have not been reported to date. PMID:21376555

  2. Coatings for protection of equipment for biochemical processing of geothermal residues: Progress report FY`97

    SciTech Connect

    Allan, M.L.

    1997-11-01

    Thermal sprayed ethylene methacrylic acid (EMAA) and ethylene tetrafluoroethylene (ETFE), spray-and-bake ETFE and polyvinylidene fluoride (PVDF) and brushable ceramic-epoxy coatings were evaluated for corrosion protection in a biochemical process to treat geothermal residues. Coupon, Atlas cell, peel strength, cathodic disbondment and abrasion tests were performed in aggressive environments including geothermal sludge, hypersaline brine and sulfur-oxidizing bacteria (Thiobacillus ferrooxidans) to determine suitability for protecting storage tanks and reaction vessels. It was found that all of the coatings were resistant to chemical attack and biodegradation at the test temperature of 55 C. The EMAA coatings protected 316L stainless steel from corrosion in coupon tests. However, corrosion of mild steel substrates thermal sprayed with EMAA and ETFE occurred in Atlas cell tests that simulated a lined reactor operating environment and this resulted in decreased adhesive strength. Peel tests to measure residual adhesion revealed that failure mode was dependent on exposure conditions. Abrasion tests showed that the ceramic-epoxy had good resistance to the abrasive effects of sludge. Thermal sprayed EMAA coatings also displayed abrasion resistance. Cathodic disbondment tests in brine at room temperature indicated that EMAA coatings are resistant to disbondment at applied potentials of {minus}780 to {minus}1,070 mV SCE for the test conditions and duration. Slight disbondment of one specimen occurred at a potential of {minus}1,500 mV SCE. The EMAA may be suited to use in conjunction with cathodic protection although further long-term, higher temperature testing would be needed.

  3. COATINGS FOR PROTECTION OF EQUIPMENT FOR BIOCHEMICAL PROCESSING OF GEOTHERMAL RESIDUES: PROGRESS REPORT FY 97

    SciTech Connect

    ALLAN,M.L.

    1997-11-01

    Thermal sprayed ethylene methacrylic acid (EMAA) and ethylene tetrafluoroethylene (ETFE), spray-and-bake ETFE and polyvinylidene fluoride (PVDF) and brushable ceramic-epoxy coatings were evaluated for corrosion protection in a biochemical process to treat geothermal residues. The findings are also relevant to other moderate temperature brine environments where corrosion is a problem. Coupon, Atlas cell, peel strength, cathodic disbondment and abrasion tests were performed in aggressive environments including geothermal sludge, hypersaline brine and sulfur-oxidizing bacteria (Thiobacillus ferrooxidans) to determine suitability for protecting storage tanks and reaction vessels. It was found that all of the coatings were resistant to chemical attack and biodegradation at the test temperature of 55 C. The EMAA coatings protected 316L stainless steel from corrosion in coupon tests. However, corrosion of mild steel substrates thermal sprayed with EMAA and ETFE occurred in Atlas cell tests that simulated a lined reactor operating environment and this resulted in decreased adhesive strength. Peel tests to measure residual adhesion revealed that failure mode was dependent on exposure conditions. Long-term tests on the durability of ceramic-epoxy coatings in brine and bacteria are ongoing. Initial indications are that this coating has suitable characteristics. Abrasion tests showed that the ceramic-epoxy had good resistance to the abrasive effects of sludge. Thermal sprayed EMAA coatings also displayed abrasion resistance. Cathodic disbondment tests in brine at room temperature indicated that EMAA coatings are resistant to disbondment at applied potentials of {minus}780 to {minus}1,070 mV SCE for the test conditions and duration. Slight disbondment of one specimen occurred at a potential of {minus}1,500 mV SCE. The EMAA may be suited to use in conjunction with cathodic protection although further long-term, higher temperature testing would be needed.

  4. Optimization of Brazilian TNT industry wastewater treatment using combined zero-valent iron and fenton processes.

    PubMed

    Barreto-Rodrigues, Marcio; Silva, Flávio T; Paiva, Teresa C B

    2009-09-15

    This work explores the optimization of combined zero-valent iron and Fenton processes for the treatment of TNT industry wastewater, a residue with recognized polluting potential due to its high concentration of 2,4,6-trinitrotoluene and extremely acidic pH due to the nature of the product purification process. The results of the optimization study indicate that the most efficient condition for reducing the concentration of TNT also generates sufficient amounts of iron(II) for the subsequent oxidative treatment through the Fenton reaction. In general, it was observed that the treatment was highly efficient in terms of meeting the main associated environmental parameters, since it reduced acute toxicity, removed 100% of TNT, 100% of the organic nitrogen and 95.4% of the COD.

  5. Capturing Knowledge In Order To Optimize The Cutting Process For Polyethylene Pipes Using Knowledge Models

    NASA Astrophysics Data System (ADS)

    Rotaru, Ionela Magdalena

    2015-09-01

    Knowledge management is a powerful instrument. Knowledge-based modelling can be applied in areas ranging from business, industry and government to education. Companies engage in efforts to restructure their databases according to knowledge management principles, recognizing in them a guarantee of models built only from relevant and sustainable knowledge that can bring value to the company. The proposed paper presents a theoretical model of what it means to optimize the cutting of polyethylene pipes, bringing together two important engineering fields, the metal cutting process and the gas industry, which meet in order to optimize the butt fusion welding process, specifically the polyethylene cutting step, for polyethylene pipes. The whole approach is shaped by the principles of knowledge management. The study was made in collaboration with companies operating in the field.

  6. Optimization of the Purification and Processing of Carbon Nanotubes for Strong, Conductive and Lightweight Wires

    NASA Astrophysics Data System (ADS)

    Moses, Brian T.

    Single walled carbon nanotubes are produced using standard synthesis and purification techniques. Bulk materials produced using filtration drying are characterized mechanically and electrically for engineering properties. Modifications to the purification process are explored with consideration given for the effects on electrical conductivity and mechanical strength. Raman spectroscopy, thermal oxidation profiling, and high-temperature vacuum annealing are used to gain further insight on the connection between defects and nanotube oxidation during the purification process. It is observed that the mechanical properties are strongly temperature dependent, while electrical conductivity varies with humidity rather than temperature. The use of a thermal vacuum anneal can improve separation of oxidative processes between nanotubes and carbon in the time domain, allowing further optimization of the thermal processing and improved physical properties of nanotube bulk materials post-processing.

  7. Reduced order model based on principal component analysis for process simulation and optimization

    SciTech Connect

    Lang, Y.; Malacina, A.; Biegler, L.; Munteanu, S.; Madsen, J.; Zitney, S.

    2009-01-01

    It is well-known that distributed parameter computational fluid dynamics (CFD) models provide more accurate results than conventional, lumped-parameter unit operation models used in process simulation. Consequently, the use of CFD models in process/equipment co-simulation offers the potential to optimize overall plant performance with respect to complex thermal and fluid flow phenomena. Because solving CFD models is time-consuming compared to the overall process simulation, we consider the development of fast reduced order models (ROMs) based on CFD results to closely approximate the high-fidelity equipment models in the co-simulation. By considering process equipment items with complicated geometries and detailed thermodynamic property models, this study proposes a strategy to develop ROMs based on principal component analysis (PCA). Taking advantage of commercial process simulation and CFD software (for example, Aspen Plus and FLUENT), we are able to develop systematic CFD-based ROMs for equipment models in an efficient manner. In particular, we show that the validity of the ROM is more robust within well-sampled input domain and the CPU time is significantly reduced. Typically, it takes at most several CPU seconds to evaluate the ROM compared to several CPU hours or more to solve the CFD model. Two case studies, involving two power plant equipment examples, are described and demonstrate the benefits of using our proposed ROM methodology for process simulation and optimization.
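
    The reduced-order-model idea can be sketched compactly: compress CFD snapshot fields with principal components and fit a cheap surrogate from process inputs to the retained PCA scores. The snapshots and the linear surrogate below are synthetic stand-ins (the study couples Aspen Plus and FLUENT and may use a richer regressor), so treat names and dimensions as assumptions.

    ```python
    # Sketch of a PCA-based reduced-order model: CFD snapshot fields are compressed
    # with principal components and a cheap surrogate maps process inputs to the
    # retained PCA scores. Snapshots here are synthetic placeholders; the surrogate
    # is a plain linear map (kriging, RBF or neural nets would also fit this slot).
    import numpy as np

    rng = np.random.default_rng(1)
    n_runs, n_cells = 40, 5000                       # CFD runs x field values per run
    inputs = rng.uniform(0.0, 1.0, size=(n_runs, 3)) # e.g. scaled flow rate, T, pressure
    grid = np.linspace(0.0, 1.0, n_cells)
    snapshots = (np.outer(inputs[:, 0], grid)
                 + np.outer(inputs[:, 1] ** 2, np.cos(3.0 * grid)))  # fake CFD fields

    mean = snapshots.mean(axis=0)
    U, s, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)
    k = 5                                            # retained principal components
    scores = (snapshots - mean) @ Vt[:k].T           # PCA scores per CFD run

    A = np.hstack([inputs, np.ones((n_runs, 1))])    # linear surrogate: inputs -> scores
    W, *_ = np.linalg.lstsq(A, scores, rcond=None)

    def rom_predict(x):
        """Approximate the full CFD field for a new input vector x (length 3)."""
        return mean + (np.append(x, 1.0) @ W) @ Vt[:k]

    field = rom_predict(np.array([0.4, 0.7, 0.2]))   # full-field prediction in milliseconds
    print(field.shape)
    ```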

  8. Optimization of electrocoagulation process to treat grey wastewater in batch mode using response surface methodology

    PubMed Central

    2014-01-01

    Background: Discharge of grey wastewater into the ecological system has a negative impact on receiving water bodies. Methods: In the present study, the electrocoagulation (EC) process was investigated to treat grey wastewater under different operating conditions such as initial pH (4–8), current density (10–30 mA/cm2), electrode distance (4–6 cm) and electrolysis time (5–25 min) by using a stainless steel (SS) anode in batch mode. A Box-Behnken response surface design (BBD) with four factors at five levels was employed to optimize and investigate the effect of the process variables on responses such as total solids (TS), chemical oxygen demand (COD) and fecal coliform (FC) removal. Results: The process variables showed a significant effect on the electrocoagulation treatment process. The results were analyzed by Pareto analysis of variance (ANOVA) and second-order polynomial models were developed in order to study the electrocoagulation process statistically. The optimal operating conditions were found to be: initial pH of 7, current density of 20 mA/cm2, electrode distance of 5 cm and electrolysis time of 20 min. Conclusion: These results indicated that the EC process can be scaled up to treat grey wastewater with high removal efficiencies of TS, COD and FC. PMID:24410752

  9. Optimization of the Solution and Processing Parameters for Strontium Titanate Thin Films for Electronic Devices

    NASA Astrophysics Data System (ADS)

    Weiss, Claire Victoria

    Metallo-organic solution deposition (MOSD) and spin-coating were used to deposit strontium titanate (SrTiO3 or STO) thin films on Si and metalized Si substrates. In addition, a thermodynamic model was constructed based on the Landau polynomial for the free energy. Using this model, the thin film strain due to the difference in thermal expansion coefficients (TECs) of the film and substrate was calculated, as well as its effect on the permittivity and tunability. It was found that a large tensile thermal strain develops in the STO/Si material system, and this strain significantly lowers the dielectric response as compared to bulk STO. A multi-dimensional parameter optimization process was used to systematically vary the solution, deposition, and processing parameters of the STO thin films. These parameters include the precursor solution heating, solution molarity/concentration, solution aging, spin-coating recipe, pyrolysis procedure/temperature, annealing temperature, and annealing oxygen environment. X-ray diffraction (XRD), scanning electron microscopy (SEM), atomic force microscopy (AFM), X-ray photoelectron spectroscopy (XPS), spectroscopic ellipsometry (SE), and dielectric/insulating measurements were used to characterize the STO thin film devices. By optimizing various deposition parameters, such as the solution molarity and the pyrolysis temperature, the tensile stress induced from the difference in TECs of the film and substrate, which was predicted by the thermodynamic theory, can be reduced or completely eliminated. This stress relaxation is achieved through the tailoring of compressive "growth stresses" by optimizing the precursor solution molarity as well as the post-deposition heat treatment processing. By utilizing the multi-dimensional parameter optimization process, high-quality, electronic-grade thin film STO can be deposited via the affordable, simple, and industry-standard MOSD technique.

  10. 10 CFR 140.13a - Amount of financial protection required for plutonium processing and fuel fabrication plants.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Amount of financial protection required for plutonium... of financial protection required for plutonium processing and fuel fabrication plants. (a) Each holder of a license issued pursuant to part 70 of this chapter to possess and use plutonium at...

  11. 10 CFR 140.13a - Amount of financial protection required for plutonium processing and fuel fabrication plants.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Amount of financial protection required for plutonium... of financial protection required for plutonium processing and fuel fabrication plants. (a) Each holder of a license issued pursuant to part 70 of this chapter to possess and use plutonium at...

  12. 10 CFR 140.13a - Amount of financial protection required for plutonium processing and fuel fabrication plants.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Amount of financial protection required for plutonium... of financial protection required for plutonium processing and fuel fabrication plants. (a) Each holder of a license issued pursuant to part 70 of this chapter to possess and use plutonium at...

  13. 10 CFR 140.13a - Amount of financial protection required for plutonium processing and fuel fabrication plants.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Amount of financial protection required for plutonium... of financial protection required for plutonium processing and fuel fabrication plants. (a) Each holder of a license issued pursuant to part 70 of this chapter to possess and use plutonium at...

  14. 15 CFR 301.4 - Processing of applications by the Department of the Treasury (Customs and Border Protection).

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Department of the Treasury (Customs and Border Protection). 301.4 Section 301.4 Commerce and Foreign Trade... INSTITUTIONS § 301.4 Processing of applications by the Department of the Treasury (Customs and Border... Customs and Border Protection. If the application appears to be complete, the Commissioner shall...

  15. 15 CFR 301.4 - Processing of applications by the Department of the Treasury (Customs and Border Protection).

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Department of the Treasury (Customs and Border Protection). 301.4 Section 301.4 Commerce and Foreign Trade... INSTITUTIONS § 301.4 Processing of applications by the Department of the Treasury (Customs and Border... Customs and Border Protection. If the application appears to be complete, the Commissioner shall...

  16. 15 CFR 301.4 - Processing of applications by the Department of the Treasury (Customs and Border Protection).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Department of the Treasury (Customs and Border Protection). 301.4 Section 301.4 Commerce and Foreign Trade... INSTITUTIONS § 301.4 Processing of applications by the Department of the Treasury (Customs and Border... Customs and Border Protection. If the application appears to be complete, the Commissioner shall...

  17. 15 CFR 301.4 - Processing of applications by the Department of the Treasury (Customs and Border Protection).

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Department of the Treasury (Customs and Border Protection). 301.4 Section 301.4 Commerce and Foreign Trade... INSTITUTIONS § 301.4 Processing of applications by the Department of the Treasury (Customs and Border... Customs and Border Protection. If the application appears to be complete, the Commissioner shall...

  18. 15 CFR 301.4 - Processing of applications by the Department of the Treasury (Customs and Border Protection).

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Department of the Treasury (Customs and Border Protection). 301.4 Section 301.4 Commerce and Foreign Trade... INSTITUTIONS § 301.4 Processing of applications by the Department of the Treasury (Customs and Border... Customs and Border Protection. If the application appears to be complete, the Commissioner shall...

  19. 10 CFR 140.13a - Amount of financial protection required for plutonium processing and fuel fabrication plants.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Amount of financial protection required for plutonium... of financial protection required for plutonium processing and fuel fabrication plants. (a) Each holder of a license issued pursuant to part 70 of this chapter to possess and use plutonium at...

  20. Critical Protection Item Classification for a waste processing facility at Savannah River Site. Revision 1

    SciTech Connect

    Ades, M.J.; Garrett, R.J.

    1993-12-31

    As a part of its compliance with the Department of Energy requirements for safety of nuclear facilities at the Savannah River Site (SRS), Westinghouse Savannah River Company (WSRC) assigns functional classifications to structures, systems and components (SSCs). As a result, changes in design, operations, maintenance, testing, and inspections of SSCs are performed and backfit requirements are established. This paper describes the Critical Protection Item (CPI) Classification for waste processing facility (WPF) at SRS. The descriptions of the WPF and the processes considered are provided elsewhere. The proposed CPI classification methodology includes the evaluation of the onsite radiological consequences, and the onsite and offsite non-radiological consequences from postulated accidents at the WPF, and comparison of these consequences with allowable frequency-dependent limits. When allowable limits are exceeded, CPIs are identified for accident mitigation.

  1. Biodegradability and toxicity assessment of a real textile wastewater effluent treated by an optimized electrocoagulation process.

    PubMed

    Manenti, Diego R; Módenes, Aparecido N; Soares, Petrick A; Boaventura, Rui A R; Palácio, Soraya M; Borba, Fernando H; Espinoza-Quiñones, Fernando R; Bergamasco, Rosângela; Vilar, Vítor J P

    2015-01-01

    In this work, the application of an iron electrode-based electrocoagulation (EC) process on the treatment of a real textile wastewater (RTW) was investigated. In order to perform an efficient integration of the EC process with a biological oxidation one, an enhancement in the biodegradability and low toxicity of final compounds was sought. Optimal values of EC reactor operation parameters (pH, current density and electrolysis time) were achieved by applying a full factorial 3(3) experimental design. Biodegradability and toxicity assays were performed on treated RTW samples obtained at the optimal values of: pH of the solution (7.0), current density (142.9 A m(-2)) and different electrolysis times. As response variables for the biodegradability and toxicity assessment, the Zahn-Wellens test (Dt), the ratio values of dissolved organic carbon (DOC) relative to low-molecular-weight carboxylates anions (LMCA) and lethal concentration 50 (LC50) were used. According to the Dt, the DOC/LMCA ratio and LC50, an electrolysis time of 15 min along with the optimal values of pH and current density were suggested as suitable for a next stage of treatment based on a biological oxidation process.

  2. Optimization of the ASPN Process to Bright Nitriding of Woodworking Tools Using the Taguchi Approach

    NASA Astrophysics Data System (ADS)

    Walkowicz, J.; Staśkiewicz, J.; Szafirowicz, K.; Jakrzewski, D.; Grzesiak, G.; Stępniak, M.

    2013-02-01

    The subject of the research is optimization of the parameters of the Active Screen Plasma Nitriding (ASPN) process of high speed steel planing knives used in woodworking. The Taguchi approach was applied to develop the plan of experiments and to analyze the obtained experimental results. The optimized ASPN parameters were: process duration, composition and pressure of the gaseous atmosphere, the substrate BIAS voltage and the substrate temperature. The results of the optimization procedure were verified by the tools' behavior in the sharpening operation performed in normal industrial conditions. The ASPN technology proved to be extremely suitable for nitriding the woodworking planing tools, which because of their specific geometry, in particular extremely sharp wedge angles, could not be successfully nitrided using the conventional direct-current plasma nitriding method. The research carried out showed that the values of the fracture toughness coefficient K(sub Ic) are correlated with the maximum spalling depths of the cutting edge measured after sharpening, and therefore may be used as a measure of the quality of the nitrided planing knives. Based on this criterion the optimum parameters of the ASPN process for nitriding high speed planing knives were determined.
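
    The Taguchi analysis step can be illustrated with a few lines of code: compute a signal-to-noise (S/N) ratio for each orthogonal-array run and pick, for every factor, the level with the highest mean S/N. The L9 array, factor names and response values below are invented placeholders, not the experimental data of the study.

    ```python
    # Minimal Taguchi-style analysis: compute a larger-is-better signal-to-noise (S/N)
    # ratio for each orthogonal-array run and pick, for every factor, the level with
    # the highest mean S/N. The L9 array, factor names and responses are invented.
    import numpy as np

    # Hypothetical L9(3^4) orthogonal array: 9 runs, 4 factors coded at levels 0, 1, 2.
    L9 = np.array([[0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
                   [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
                   [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0]])
    # Hypothetical quality response per run (e.g. layer hardness), larger is better.
    y = np.array([610.0, 655.0, 700.0, 640.0, 690.0, 720.0, 665.0, 705.0, 730.0])

    sn = 20.0 * np.log10(y)      # larger-is-better S/N for a single replicate per run

    factors = ["duration", "atmosphere pressure", "BIAS voltage", "temperature"]
    for f, name in enumerate(factors):
        level_means = [sn[L9[:, f] == lvl].mean() for lvl in range(3)]
        print(f"{name}: best level = {int(np.argmax(level_means))},"
              f" mean S/N per level = {np.round(level_means, 2)}")
    ```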

  3. Study of research and development processes through Fuzzy Super FRM model and optimization solutions.

    PubMed

    Sârbu, Flavius Aurelian; Moga, Monika; Calefariu, Gavrilă; Boșcoianu, Mircea

    2015-01-01

    The aim of this study is to measure resources for R&D (research and development) at the regional level in Romania and to obtain primary data that will be important for making the right decisions to increase competitiveness and development based on economic knowledge. Our motivation is that, by using the Super Fuzzy FRM model, we want to determine the state of R&D processes at the regional level by a means other than the statistical survey, while the two optimization methods provide optimization solutions for the R&D actions of enterprises. To fulfill this aim, in this application-oriented paper we used a questionnaire and, for the interpretation of the results, the Super Fuzzy FRM model, which represents the main novelty of the paper: this theory provides a formalism based on matrix calculus that allows processing of large volumes of information and delivers results difficult or impossible to obtain through statistical processing alone. A further novelty of the paper is the optimization solutions provided for the situation in which the sales price is variable and the quantity sold is constant in time, and for the reverse situation. PMID:25821846

  5. Investigating risky, distracting, and protective peer passenger effects in a dual process framework.

    PubMed

    Ross, Veerle; Jongen, Ellen M M; Brijs, Kris; Brijs, Tom; Wets, Geert

    2016-08-01

    Prior studies indicated higher collision rates among young novice drivers with peer passengers. This driving simulator study provided a test for a dual process theory of risky driving by examining social rewards (peer passengers) and cognitive control (inhibitory control). The analyses included age (17-18 yrs, n=30; 21-24 yrs, n=20). Risky, distracting, and protective effects were classified by underlying driver error mechanisms. In the first drive, participants drove alone. In the second, participants drove with a peer passenger. Red-light running (violation) was more prevalent in the presence of peer passengers, which provided initial support for a dual process theory of risky driving. In a subgroup with low inhibitory control, speeding (violation) was more prevalent in the presence of peer passengers. Reduced lane-keeping variability reflected distracting effects. Nevertheless, possible protective effects for amber-light running and hazard handling (cognition and decision-making) were found in the drive with peer passengers. Avenues for further research and possible implications for targets of future driver training programs are discussed. PMID:27218409

  6. A novel powder coating process for attaining taste masking and moisture protective films applied to tablets.

    PubMed

    Cerea, Matteo; Zheng, Weijia; Young, Christopher R; McGinity, James W

    2004-07-26

    A novel powder coating process was developed for the application of taste masking and moisture protective films on tablets while avoiding the use of solvents or water. The coalescence of particles to form a polymeric film was investigated through studies of dry powder layering of micronized acrylic polymer (E PO) to produce free films. Theophylline containing tablets were coated with the same acrylic polymer in a laboratory scale spheronizer using a powder coating technique. The dry powder layer delayed the onset of drug release in pH 6.8 medium, depending on the coating level, while no delay was observed in pH 1.0 medium. The presence of hydrophilic polymers in the acrylic coating layer decreased the lag time for drug release in pH 6.8 medium, while only the presence of HPMC in the film slowed the drug release rate in acidic medium. The dry coating process was demonstrated to be a reliable alternative to solvent or aqueous film coating technologies for applying taste masking and moisture protective film coats onto compressed tablets. A controlled drug release profile was achieved in pH 6.8 media.

  7. The mental health of children affected by armed conflict: protective processes and pathways to resilience.

    PubMed

    Betancourt, Theresa Stichick; Khan, Kashif Tanveer

    2008-06-01

    This paper examines the concept of resilience in the context of children affected by armed conflict. Resilience has been frequently viewed as a unique quality of certain 'invulnerable' children. In contrast, this paper argues that a number of protective processes contribute to resilient mental health outcomes in children when considered through the lens of the child's social ecology. While available research has made important contributions to understanding risk factors for negative mental health consequences of war-related violence and loss, the focus on trauma alone has resulted in inadequate attention to factors associated with resilient mental health outcomes. This paper presents key studies in the literature that address the interplay between risk and protective processes in the mental health of war-affected children from an ecological, developmental perspective. It suggests that further research on war-affected children should pay particular attention to coping and meaning making at the individual level; the role of attachment relationships, caregiver health, resources and connection in the family, and social support available in peer and extended social networks. Cultural and community influences such as attitudes towards mental health and healing as well as the meaning given to the experience of war itself are also important aspects of the larger social ecology. PMID:18569183

  9. Planetary protection, legal ambiguity and the decision making process for Mars sample return

    NASA Technical Reports Server (NTRS)

    Race, M. S.

    1996-01-01

    As scientists and mission planners develop planetary protection requirements for future Mars sample return missions, they must recognize the socio-political context in which decisions about the mission will be made and pay careful attention to public concerns about potential back contamination of Earth. To the extent that planetary protection questions are unresolved or unaddressed at the time of an actual mission, they offer convenient footholds for public challenges in both legal and decision making realms, over which NASA will have little direct control. In this paper, two particular non-scientific areas of special concern are discussed in detail: 1) legal issues and 2) the decision making process. Understanding these areas is critical for addressing legitimate public concerns as well as for fulfilling procedural requirements regardless of whether sample return evokes public controversy. Legal issues with the potential to complicate future missions include: procedural review under the National Environmental Policy Act (NEPA); uncertainty about institutional control and authority; conflicting regulations and overlapping jurisdictions; questions about international treaty obligations and large scale impacts; uncertainties about the nature of the organism; and constitutional and regulatory concerns about quarantine, public health and safety. In light of these important legal issues, it is critical that NASA consider the role and timing of public involvement in the decision making process as a way of anticipating problem areas and preparing for legitimate public questions and challenges to sample return missions.

  10. Planetary protection, legal ambiguity and the decision making process for Mars sample return.

    PubMed

    Race, M S

    1996-01-01

    As scientists and mission planners develop planetary protection requirements for future Mars sample return missions, they must recognize the socio-political context in which decisions about the mission will be made and pay careful attention to public concerns about potential back contamination of Earth. To the extent that planetary protection questions are unresolved or unaddressed at the time of an actual mission, they offer convenient footholds for public challenges in both legal and decision making realms, over which NASA will have little direct control. In this paper, two particular non-scientific areas of special concern are discussed in detail: 1) legal issues and 2) the decision making process. Understanding these areas is critical for addressing legitimate public concerns as well as for fulfilling procedural requirements regardless of whether sample return evokes public controversy. Legal issues with the potential to complicate future missions include: procedural review under the National Environmental Policy Act (NEPA); uncertainty about institutional control and authority; conflicting regulations and overlapping jurisdictions; questions about international treaty obligations and large scale impacts; uncertainties about the nature of the organism; and constitutional and regulatory concerns about quarantine, public health and safety. In light of these important legal issues, it is critical that NASA consider the role and timing of public involvement in the decision making process as a way of anticipating problem areas and preparing for legitimate public questions and challenges to sample return missions. PMID:11538983

  11. Optimization of Training Sets For Neural-Net Processing of Characteristic Patterns From Vibrating Solids

    NASA Technical Reports Server (NTRS)

    Decker, Arthur J. (Inventor)

    2006-01-01

    An artificial neural network is disclosed that processes holography-generated characteristic patterns of vibrating structures along with finite-element models. The present invention provides a folding operation for conditioning training sets for optimally training feed-forward neural networks to process characteristic fringe patterns. The folding operation increases the sensitivity of the feed-forward network for detecting changes in the characteristic pattern. The folding routine scales input pixels according to their location in an intensity range rather than their position in the characteristic pattern.

  12. Optimal Control of the Valve Based on Traveling Wave Method in the Water Hammer Process

    NASA Astrophysics Data System (ADS)

    Cao, H. Z.; Wang, F.; Feng, J. L.; Tan, H. P.

    2011-09-01

    Valve regulation is an effective method for process control during water hammer. The principle of d'Alembert traveling wave theory was used in this paper to construct the exact analytical solution of the water hammer, and the optimal valve speed law that reduces the water hammer pressure to the maximum extent was obtained. Combining this law with the valve characteristic curve, the rule governing how the valve opening changes with time was obtained, which can be used to guide the valve-closing process and to reduce the water hammer pressure to the maximum extent.

  13. The source term and waste optimization of molten salt reactors with processing

    SciTech Connect

    Gat, U.; Dodds, H.L.

    1993-07-01

    The source term of a molten salt reactor (MSR) with fuel processing is reduced by the ratio of processing time to refueling time as compared to solid fuel reactors. The reduction, which can be one to two orders of magnitude, is due to removal of the long-lived fission products. The waste from MSRs can be optimized with respect to its chemical composition, concentration, mixture, shape, and size. The actinides and long-lived isotopes can be separated out and returned to the reactor for transmutation. These features make MSRs more acceptable and simpler in operation and handling.

  14. Eco-techno-economic synthesis of process routes for the production of zinc using combinatorial optimization

    NASA Astrophysics Data System (ADS)

    Sudhölter, S.; Krüger, J.; Reuter, M. A.

    1996-12-01

    The demands placed on the environmental and social acceptability of metallurgical processing technology are rising steadily. Of particular importance are the production techniques, products, and disposal of residues. These aspects are affected by the varying compositions of the primary and secondary raw materials processed in the plants and the rapidly changing market situations in the metallurgical industry. Metallurgical engineers have to select “optimal” processes from a vast number of existing technologies for the primary production of zinc and for the processing of zinc containing residues. To enable the engineer to compare these techniques and to choose the right combination of unit operations, a process design methodology is presented here, which has been adapted from methodologies developed in chemical engineering and minerals processing. In a previous article by the authors, a structural parameter approach was introduced, that implements a synthesis model, which includes all unit operations currently implemented in zinc metallurgy. At the basis of this model is a data base containing the details of the unit operations included in the model. In this article, this methodology is expanded to incorporate an unlimited quantity of different components by introducing the simulated annealing optimization technique to generate optimal flow sheets for the production of zinc under varying constraints which include operation costs, metal prices, environmental costs, and split factors for Zn, Pb, Ag, and Fe. Case studies demonstrate the functionality of this metallurgical tool for the hydrometallurgical recovery of zinc including numerous unit operations for the processing of by-products and residues. It will also be demonstrated how this model can be extended to a “waste management” tool that generates processing routes not only for the residues from the zinc industry but also for zinc containing residues from other processes, e.g., EAF dusts.
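
    A toy version of the simulated-annealing synthesis step is sketched below: a flowsheet is encoded as a binary selection over candidate unit operations and the Metropolis rule accepts or rejects single-unit changes under a cooling schedule. The cost terms, recovery numbers and penalty are invented stand-ins for the operating costs, metal prices and environmental costs mentioned above.

    ```python
    # Toy simulated-annealing search over a set of candidate unit operations, echoing
    # the flow-sheet synthesis idea described above. The "flowsheet" is a binary
    # selection vector and the cost function is an invented stand-in; all numbers
    # are illustrative only.
    import numpy as np

    rng = np.random.default_rng(3)
    n_units = 12                                      # candidate unit operations
    op_cost = rng.uniform(1.0, 5.0, n_units)          # hypothetical operating costs
    recovery = rng.uniform(0.0, 0.3, n_units)         # hypothetical Zn recovery contribution

    def cost(sel):
        """Lower is better: operating cost minus value of recovered zinc, plus a
        penalty if total recovery stays below a target."""
        total_recovery = min(recovery[sel == 1].sum(), 1.0)
        penalty = 50.0 * max(0.0, 0.9 - total_recovery)
        return op_cost[sel == 1].sum() - 20.0 * total_recovery + penalty

    sel = rng.integers(0, 2, n_units)
    best, best_cost = sel.copy(), cost(sel)
    T = 5.0
    for step in range(2000):
        cand = sel.copy()
        cand[rng.integers(n_units)] ^= 1              # flip one unit in/out of the flowsheet
        d = cost(cand) - cost(sel)
        if d < 0 or rng.random() < np.exp(-d / T):    # Metropolis acceptance rule
            sel = cand
            if cost(sel) < best_cost:
                best, best_cost = sel.copy(), cost(sel)
        T *= 0.998                                    # geometric cooling schedule
    print("selected unit operations:", np.nonzero(best)[0], "cost:", round(best_cost, 2))
    ```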

  15. Study of optimal sequences and energy requirements of integrated processing systems

    SciTech Connect

    Al-Enezi, G.A.

    1986-01-01

    The increased demand for high quality unleaded gasoline produced from a refinery has caused an increase in the development of processing alternatives for producing high-octane gasoline components. The production of methyl tertiary butyl ether is currently considered one of the most practical alternatives. The production of methyl tertiary butyl ether is based mainly on the availability of light hydrocarbons as a feed, such as isobutane from a refinery. The availability of isobutane is increased by isomerization of normal butanes. Even though distillation processes are widely used to separate mixtures of light hydrocarbons, they are highly energy intensive. Steady-state designs of several distillation column configurations were studied for separating light hydrocarbon mixtures. A number of energy conservation alternatives were evaluated for the distillation process integrated with an isomerization unit. A modified form of the Complex Method of Box was used for optimizing the design and operating conditions of these energy conservation alternatives. The use of vapor recompression with distillation columns was evaluated as one of the alternatives. Despite the more complex processing scheme required, this alternative used only about 30% of the external energy required in a conventional distillation process for the same separation. The operating conditions of the multi-effect distillation columns were optimized as another alternative. Reduction in energy consumption for this case was about 40% compared to conventional distillation columns.

  16. To protect or abandon: a participatory process on landslide risk mitigation

    NASA Astrophysics Data System (ADS)

    Scolobig, A.; Bayer, J.; Cascini, L.; Ferlisi, S.

    2012-04-01

    With escalating costs of landslide risk mitigation and relief, a challenge for local authorities is to develop landslide risk mitigation measures that are viewed as efficient, feasible and fair by the many stakeholders involved. Innovative measures and the participation of stakeholders in the decision making process are essential elements in developing effective strategies to deal with the ever-changing spatial and temporal patterns of landslide risk. A stakeholder-led policy process, however, can face many social and economic challenges. One of the most difficult is deciding between costly protection measures and relocating homes. Particularly in areas with high population density, protection works are often not built because of economic/environmental constraints or private interests of the local residents. At the same time it is not always possible to relocate households even if the costs are deemed less than protecting them. These issues turned out to be crucial in a recent participatory process for selecting risk mitigation measures in the town of Nocera Inferiore, Southern Italy, which experienced a landslide in 2005 causing three fatalities. The paper reports on this process which was structured in a series of meetings with a group of selected residents and several parallel activities open to the public. The preparatory work included semi-structured interviews carried out with key local stakeholders and a public survey eliciting residents' views on landslide risk mitigation. After describing the background of the landslide risk management problem in Nocera Inferiore, the paper focuses on three packages of risk mitigation measures (each of them not exceeding a total cost of 7 million Euro, namely the available funds) and the key trade-offs that emerged during the meetings with the residents. The participants reached a unanimous consensus on fundamental priorities, i.e. the improvement of the warning system, the implementation of an integrated system of monitoring

  17. Springback prediction and optimization of variable stretch force trajectory in three-dimensional stretch bending process

    NASA Astrophysics Data System (ADS)

    Teng, Fei; Zhang, Wanxi; Liang, Jicai; Gao, Song

    2015-11-01

    Most of the existing studies on stretch force use constant force to reduce springback. However, variable stretch force can reduce springback more efficiently. Current research on springback prediction in stretch bending forming mainly focuses on artificial neural networks combined with finite element simulation, and there is a lack of springback prediction by support vector regression (SVR). In this paper, SVR is applied to predict springback in the three-dimensional stretch bending forming process, and the variable stretch force trajectory is optimized. Six parameters of the variable stretch force trajectory are chosen as the input parameters of the SVR model. Sixty experiments generated by design of experiments (DOE) are carried out to train and test the SVR model. The experimental results confirm that the accuracy of the SVR model is higher than that of artificial neural networks. Based on this model, an optimization algorithm for the variable stretch force trajectory using particle swarm optimization (PSO) is proposed. The springback amount is used as the objective function. Changes in local thickness are used as the criterion for the forming constraints. The objective and constraints are formulated by response surface models, and the precision of the response surface models is examined. Six different stretch force trajectories are employed to verify the springback reduction achieved by the optimum stretch force trajectory, which can efficiently reduce springback. This research proposes a new method of springback prediction using SVR and optimizes the variable stretch force trajectory to reduce springback.
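
    A compact sketch of the prediction-plus-optimization chain described above is shown below: a support vector regression surrogate maps six trajectory parameters to a springback amount, and a plain global-best particle swarm searches the surrogate for a minimum. Training data, bounds and PSO constants are synthetic assumptions, not values from the paper.

    ```python
    # Illustrative SVR surrogate plus particle swarm optimization (PSO) search.
    # The training data, parameter scaling and PSO settings are placeholders.
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)
    X_train = rng.uniform(0.0, 1.0, size=(60, 6))    # 60 DOE runs, 6 trajectory parameters (scaled)
    y_train = ((X_train - 0.4) ** 2).sum(axis=1) + rng.normal(0, 0.01, 60)  # fake springback

    surrogate = SVR(kernel="rbf", C=10.0, epsilon=0.005).fit(X_train, y_train)

    # Plain global-best PSO over the unit hypercube.
    n_particles, n_iter = 30, 200
    pos = rng.uniform(0.0, 1.0, size=(n_particles, 6))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = surrogate.predict(pbest)
    gbest = pbest[np.argmin(pbest_val)].copy()

    for _ in range(n_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, 1.0)
        val = surrogate.predict(pos)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], val[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()

    print("trajectory parameters minimizing predicted springback:", np.round(gbest, 3))
    ```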

  18. Statistical media and process optimization for biotransformation of rice bran to vanillin using Pediococcus acidilactici.

    PubMed

    Kaur, Baljinder; Chakraborty, Debkumar

    2013-11-01

    An isolate of P. acidilactici capable of producing vanillin from rice bran was isolated from a milk product. Response Surface Methodology was employed for statistical media and process optimization for production of biovanillin. Statistical medium optimization was done in two steps involving a Plackett-Burman design and a central composite design. The RSM-optimized vanillin production medium consisted of 15% (w/v) rice bran, 0.5% (w/v) peptone, 0.1% (w/v) ammonium nitrate, 0.005% (w/v) ferulic acid, 0.005% (w/v) magnesium sulphate, and 0.1% (v/v) tween-80, pH 5.6, at a temperature of 37 degrees C under shaking conditions at 180 rpm. 1.269 g/L vanillin was obtained within 24 h of incubation in the optimized culture medium. This is the first report indicating such a high vanillin yield obtained during biotransformation of ferulic acid to vanillin using a Pediococcal isolate.
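
    The Plackett-Burman screening step mentioned above can be illustrated with a small two-level design: estimate the main effect of each candidate factor and rank the factors so that only the influential ones move on to the central composite design. Factor names are taken from the optimized medium above, but the design and yield values are illustrative placeholders.

    ```python
    # Sketch of a two-level screening analysis in the Plackett-Burman spirit: main
    # effects are estimated from an orthogonal design and ranked by magnitude.
    # Response values are invented, not data from the study.
    import numpy as np
    from scipy.linalg import hadamard

    factors = ["rice bran", "peptone", "NH4NO3", "ferulic acid", "MgSO4", "Tween-80", "pH"]
    H = hadamard(8)                    # 8-run, two-level design matrix (+1/-1)
    design = H[:, 1:]                  # drop the all-ones column -> 7 factor columns

    # Hypothetical vanillin yields (g/L) measured for the 8 runs.
    y = np.array([0.61, 0.85, 0.72, 1.10, 0.58, 0.95, 0.70, 1.21])

    effects = design.T @ y / (len(y) / 2)      # main effect of each factor
    for name, eff in sorted(zip(factors, effects), key=lambda t: -abs(t[1])):
        print(f"{name:12s} effect = {eff:+.3f}")
    ```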

  19. Solution chemistry optimization of sol-gel processed PZT thin films

    SciTech Connect

    Lockwood, S.J.; Schwartz, R.W.; Tuttle, B.A.; Thomas, E.V.

    1992-12-31

    We have optimized the ferroelectric properties and microstructural characteristics of sol-gel PZT thin films used in a CMOS-integrated, 256 bit ferroelectric non-volatile memory. The sol-gel process utilized in our work involved the reaction of Zr n-butoxide, Ti isopropoxide, and Pb (IV) acetate in a methanol/acetic acid solvent system. A 10-factor screening experiment identified solution concentration, acetic acid addition, and water volume as the solution chemistry factors having the most significant effects on the remanent polarization, coercive field, ferroelectric loop quality, and microstructural quality. The optimal values for these factors were determined by running a 3-factor uniform shell design, modeling the responses, and testing the models at the predicted optimal conditions. The optimized solution chemistry generated 3-layer, 300--400 nm thick films on RuO{sub 2} coated silicon substrates with coercive fields of less than 25 kV/cm (a 40--50% improvement over the original solution chemistry), a remanent polarization of 25--30 {mu}C/cm{sup 2}, and a reduction in the pyrochlore phase content below observable levels.

  1. Antioxidant protection of proteins and lipids in processed pork loin chops through feed supplementation with avocado.

    PubMed

    Hernández-López, Silvia H; Rodríguez-Carpena, Javier G; Lemus-Flores, Clemente; Galindo-García, Jorge; Estévez, Mario

    2016-06-01

    This study was conducted to analyze the impact of dietary avocado on the oxidative stability of lipids and proteins during pork processing. Loins from control (fed basic diet) and treated pigs (fed on avocado-supplemented diet) were roasted (102 °C/20 min) and subsequently packed in trays wrapped with oxygen-permeable films and chilled at 4 °C for 12 days. At each processing stage (raw, cooked and cooked & chilled), pork samples from both groups were analyzed for the concentration of TBARS, the loss of tryptophan and free thiols, and the formation of protein carbonyls, disulphide bonds and Schiff bases. Processing led to a depletion of tryptophan and sulfur-containing amino acids and an increase of lipid and protein oxidation products. Dietary avocado was not able to protect against the oxidation of tryptophan and thiols but cooked & chilled loins from treated pigs had significantly lower concentration of lipid and protein carbonyls than control counterparts. Likewise, dietary avocado alleviated the formation of Schiff bases during cooking. These results illustrate the benefits of dietary avocado on the oxidative stability of processed pork loins.

  2. Optimizing Cloud Based Image Storage, Dissemination and Processing Through Use of Mrf and Lerc

    NASA Astrophysics Data System (ADS)

    Becker, Peter; Plesea, Lucian; Maurer, Thomas

    2016-06-01

    The volume and number of geospatial images being collected continue to increase exponentially with the ever increasing number of airborne and satellite imaging platforms and the increasing rate of data collection. As a result, the cost of fast storage required to provide access to the imagery is a major cost factor in enterprise image management solutions that handle, process and disseminate the imagery and information extracted from the imagery. Cloud-based object storage offers significantly lower cost and elastic storage for this imagery, but also adds some disadvantages in terms of greater latency for data access and lack of traditional file access. Although traditional file formats such as GeoTIFF, JPEG2000 and NITF can be downloaded from such object storage, their structure and available compression are not optimum and access performance is curtailed. This paper provides details on a solution utilizing a new open image format for storage and access to geospatial imagery optimized for cloud storage and processing. MRF (Meta Raster Format) is optimized for large collections of scenes such as those acquired from optical sensors. The format enables optimized data access from cloud storage, along with the use of new compression options which cannot easily be added to existing formats. The paper also provides an overview of LERC, a new image compression method that can be used with MRF and provides very good lossless and controlled lossy compression.
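
    A hypothetical conversion of a GeoTIFF scene to MRF with LERC compression using the GDAL Python bindings is sketched below, in the spirit of the workflow described in this record. The input file name and the creation options are assumptions and should be checked against the MRF driver documentation of the installed GDAL version.

    ```python
    # Hypothetical GeoTIFF-to-MRF conversion with LERC compression via GDAL's Python
    # bindings. File names are placeholders; creation options are assumptions to be
    # verified against the GDAL MRF driver documentation.
    from osgeo import gdal

    src = "scene.tif"        # hypothetical input scene
    dst = "scene.mrf"        # MRF output (plus its index and data companion files)

    gdal.Translate(
        dst,
        src,
        format="MRF",
        creationOptions=[
            "COMPRESS=LERC",     # controlled lossy/lossless LERC compression
            "BLOCKSIZE=512",     # tile size suited to ranged reads from object storage
        ],
    )
    ```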

  3. Process optimization for extraction of carotenoids from medicinal caterpillar fungus, Cordyceps militaris (Ascomycetes).

    PubMed

    Yang, Tao; Sun, Junde; Lian, Tiantian; Wang, Wenzhao; Dong, Cai-Hong

    2014-01-01

    Natural carotenoids have attracted great attention for their important beneficial effects on human health and food coloring function. Cordyceps militaris, a well-known edible and medicinal fungus, is a potential source of natural carotenoids. The present study aimed to optimize the process parameters for carotenoid extraction from this mushroom. The effects of different methods of breaking the fungal cell wall and of different organic solvents were studied by the one-factor-at-a-time method. Subsequently, the process parameters including the duration of the extraction time, the number of extractions, and the solvent to solid ratio were optimized by using the Box-Behnken design. The optimal extraction conditions included using an acid-heating method to break the cell wall and then extracting three times, each for a 1 h duration, with a 4:1 mixture of acetone:petroleum ether and a solvent:solid ratio of 24:1. The carotenoid content varied from 2122.50 to 3847.50 µg/g dry weight in different commercially obtained fruit bodies of C. militaris. The results demonstrated that C. militaris fruit bodies contained more carotenoids than other known mushrooms. Stability monitoring by HPLC demonstrated that the carotenoids could be stored at 4°C for 40 d. It is suggested that the carotenoid content should be considered as the quality standard of commercial products of this valued mushroom. These findings will facilitate the exploration of carotenoids from C. militaris. PMID:24941034
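
    As a hedged illustration of the response-surface step described above, the sketch below fits a second-order model to a standard three-factor Box-Behnken design and then searches the fitted surface for the best coded settings; the responses are placeholder numbers, not the study's measurements.

```python
# Hedged sketch: second-order response-surface fit over a 3-factor
# Box-Behnken design (coded factors: time, number of extractions,
# solvent:solid ratio), followed by a bounded maximization of the fit.
import numpy as np
from scipy.optimize import minimize

def quad_terms(x):
    x1, x2, x3 = x
    return np.array([1.0, x1, x2, x3, x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])

# Standard 3-factor Box-Behnken design in coded units, plus one centre point.
bbd = [(a, b, 0) for a in (-1, 1) for b in (-1, 1)] + \
      [(a, 0, b) for a in (-1, 1) for b in (-1, 1)] + \
      [(0, a, b) for a in (-1, 1) for b in (-1, 1)] + [(0, 0, 0)]
X_runs = np.array(bbd, dtype=float)

# Placeholder responses; in practice these are the measured carotenoid yields.
y = np.array([3.1, 3.4, 3.0, 3.6, 2.9, 3.3, 3.2, 3.7, 3.0, 3.5, 3.1, 3.6, 3.8])

A = np.vstack([quad_terms(x) for x in X_runs])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)   # second-order model coefficients

# Locate the coded settings that maximize the fitted response inside [-1, 1]^3.
res = minimize(lambda x: -quad_terms(x) @ beta, x0=np.zeros(3),
               bounds=[(-1, 1)] * 3)
print("optimal coded factor levels:", np.round(res.x, 2))
```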

  4. Modeling and optimizing electrodischarge machine process (EDM) with an approach based on genetic algorithm

    NASA Astrophysics Data System (ADS)

    Zabbah, Iman

    2012-01-01

    Electro Discharge Machining (EDM) is the most common non-traditional production method for forming metals and non-oxide ceramics. Increasing the smoothness, increasing the removal of filings, and decreasing the proportional tool erosion all play an important role in this machining and are directly related to the choice of input parameters. The complicated and non-linear nature of EDM has made modelling the process impossible with usual and classic methods. So far, some intelligence-based methods have been used to optimize this process, above all artificial neural networks, which model the process as a black box. The difficulty of this kind of machining is seen when a workpiece is composed of a collection of carbon-based materials such as silicon carbide. In this article, besides using the new mono-pulse EDM technique, we design a fuzzy neural network and model the process with it. The genetic algorithm is then used to find the optimal inputs of the machine. In our research, the workpiece is a non-oxide ceramic, silicon carbide, which makes the control process more difficult. Finally, the results are compared with those of previous methods.
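
    The sketch below illustrates the general idea of coupling a trained process model with a genetic algorithm, as the abstract describes; it is not the authors' code, and the surrogate function, parameter ranges and GA settings are all illustrative assumptions.

```python
# Illustrative sketch: a plain genetic algorithm searching EDM input settings
# against a surrogate model of the process.  The surrogate below is a stand-in;
# in the paper's workflow it would be the fitted fuzzy neural network.
import numpy as np

rng = np.random.default_rng(0)
LOW, HIGH = np.array([1.0, 10.0, 0.5]), np.array([30.0, 200.0, 5.0])  # e.g. current, pulse-on, gap

def surrogate(x):                 # placeholder for the trained process model
    return np.sum((x - (LOW + HIGH) / 2) ** 2)   # pretend "roughness" to minimize

def ga(pop_size=40, gens=100, mut=0.1):
    pop = rng.uniform(LOW, HIGH, size=(pop_size, 3))
    for _ in range(gens):
        fit = np.array([surrogate(x) for x in pop])
        parents = pop[np.argsort(fit)[: pop_size // 2]]         # truncation selection
        alpha = rng.uniform(size=(pop_size // 2, 1))
        kids = alpha * parents + (1 - alpha) * parents[::-1]    # blend crossover
        kids += rng.normal(0, mut, kids.shape) * (HIGH - LOW)   # Gaussian mutation
        pop = np.clip(np.vstack([parents, kids]), LOW, HIGH)
    return min(pop, key=surrogate)

print("best machine settings found:", ga())
```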

  5. Modeling and optimizing electrodischarge machine process (EDM) with an approach based on genetic algorithm

    NASA Astrophysics Data System (ADS)

    Zabbah, Iman

    2011-12-01

    Electro Discharge Machining (EDM) is the most common non-traditional production method for forming metals and non-oxide ceramics. Increasing the smoothness, increasing the removal of filings, and decreasing the proportional tool erosion all play an important role in this machining and are directly related to the choice of input parameters. The complicated and non-linear nature of EDM has made modelling the process impossible with usual and classic methods. So far, some intelligence-based methods have been used to optimize this process, above all artificial neural networks, which model the process as a black box. The difficulty of this kind of machining is seen when a workpiece is composed of a collection of carbon-based materials such as silicon carbide. In this article, besides using the new mono-pulse EDM technique, we design a fuzzy neural network and model the process with it. The genetic algorithm is then used to find the optimal inputs of the machine. In our research, the workpiece is a non-oxide ceramic, silicon carbide, which makes the control process more difficult. Finally, the results are compared with those of previous methods.

  6. Optimization of tetanus toxoid ammonium sulfate precipitation process using response surface methodology.

    PubMed

    Brgles, Marija; Prebeg, Pero; Kurtović, Tihana; Ranić, Jelena; Marchetti-Deschmann, Martina; Allmaier, Günter; Halassy, Beata

    2016-10-01

    Tetanus toxoid (TTd) is a highly immunogenic, detoxified form of tetanus toxin, a causative agent of tetanus disease, produced by Clostridium tetani. Since tetanus disease cannot be eradicated but is easily prevented by vaccination, the need for the tetanus vaccine is permanent. The aim of this work was to investigate the possibility of optimizing TTd purification, i.e., ammonium sulfate precipitation process. The influence of the percentage of ammonium sulfate, starting amount of TTd, buffer type, pH, temperature, and starting purity of TTd on the purification process were investigated using optimal design for response surface models. Responses measured for evaluation of the ammonium sulfate precipitation process were TTd amount (Lf/mL) and total protein content. These two parameters were used to calculate purity (Lf/mgPN) and the yield of the process. Results indicate that citrate buffer, lower temperature, and lower starting amount of TTd result in higher purities of precipitates. Gel electrophoresis combined with matrix-assisted laser desorption ionization-mass spectrometric analysis of precipitates revealed that there are no inter-protein cross-links and that all contaminating proteins have pIs similar to TTd, so this is most probably the reason for the limited success of purification by precipitation.

  7. IOTA: integration optimization, triage and analysis tool for the processing of XFEL diffraction images

    PubMed Central

    Lyubimov, Artem Y.; Uervirojnangkoorn, Monarin; Zeldin, Oliver B.; Brewster, Aaron S.; Murray, Thomas D.; Sauter, Nicholas K.; Berger, James M.; Weis, William I.; Brunger, Axel T.

    2016-01-01

    Serial femtosecond crystallography (SFX) uses an X-ray free-electron laser to extract diffraction data from crystals not amenable to conventional X-ray light sources owing to their small size or radiation sensitivity. However, a limitation of SFX is the high variability of the diffraction images that are obtained. As a result, it is often difficult to determine optimal indexing and integration parameters for the individual diffraction images. Presented here is a software package, called IOTA, which uses a grid-search technique to determine optimal spot-finding parameters that can in turn affect the success of indexing and the quality of integration on an image-by-image basis. Integration results can be filtered using a priori information about the Bravais lattice and unit-cell dimensions and analyzed for unit-cell isomorphism, facilitating an improvement in subsequent data-processing steps. PMID:27275148
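
    A hedged sketch of the per-image grid-search idea: try a small grid of spot-finding parameters on each image and keep the combination that scores best. The function names and parameters are hypothetical placeholders, not IOTA's actual API.

```python
# Hedged sketch of a per-image grid search over spot-finding parameters.
# `process_image` is a hypothetical placeholder for spot finding + indexing +
# integration; it should return a quality score, or None on failure.
from itertools import product

def process_image(image, spot_area_min, spot_height_min):
    """Placeholder: run spot finding, indexing and integration and return a
    quality score (e.g. number of indexed reflections), or None on failure."""
    ...

def best_parameters(image, areas=(3, 5, 7, 9), heights=(2, 3, 4)):
    best = None
    for area, height in product(areas, heights):
        score = process_image(image, area, height)
        if score is not None and (best is None or score > best[0]):
            best = (score, area, height)
    return best   # (score, spot_area_min, spot_height_min) or None
```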

  8. Biocatalytic conversion of poultry processing leftovers: Optimization of hydrolytic conditions and peptide hydrolysate characterization.

    PubMed

    Nikolaev, I V; Sforza, S; Lambertini, F; Ismailova, D Yu; Khotchenkov, V P; Volik, V G; Dossena, A; Popov, V O; Koroleva, O V

    2016-04-15

    Peptide hydrolysate (PH) was produced by deep controllable bioconversion of poultry processing leftovers (broiler necks), by means of a multienzyme composition, containing four commercially available enzyme preparations (Alcalase, Neutrase, Flavourzyme, Protamex). The design of multienzyme composition (MEC) was applied to yield a hydrolysate with adjusted properties, including minimized antigenicity and bitterness. The protein recovery was optimized using Box-Behnken response surface design. The individual and interactive effects of hydrolysis conditions (time, hydromodule and MEC dosage) were studied. The experimental data were analyzed by ANOVA method and a well-predictive, second order polynomial model was developed using multiple regression analysis. Optimal hydrolysis conditions were found to be: hydrolysis time 3 h, hydromodule 2.25 l/kg and dosage of MEC 0.25%. The corresponding predicted value for protein recovery was 75.34%, 2 times higher compared to traditional long-term heating hydrolysis. The PH obtained is a low allergenic product with high antioxidant capacity.

  9. Determining optimal selling price and lot size with process reliability and partial backlogging considerations

    NASA Astrophysics Data System (ADS)

    Hsieh, Tsu-Pang; Cheng, Mei-Chuan; Dye, Chung-Yuan; Ouyang, Liang-Yuh

    2011-01-01

    In this article, we extend the classical economic production quantity (EPQ) model by proposing imperfect production processes and quality-dependent unit production cost. The demand rate is described by any convex decreasing function of the selling price. In addition, we allow for shortages and a time-proportional backlogging rate. For any given selling price, we first prove that the optimal production schedule not only exists but also is unique. Next, we show that the total profit per unit time is a concave function of price when the production schedule is given. We then provide a simple algorithm to find the optimal selling price and production schedule for the proposed model. Finally, we use a couple of numerical examples to illustrate the algorithm and conclude this article with suggestions for possible future research.
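
    The nested structure of the algorithm (an inner search for the production schedule at a fixed price, and an outer search over the concave profit-price relation) can be sketched numerically as below; the demand and cost expressions are illustrative stand-ins, not the article's model.

```python
# Hedged numerical sketch of a nested price / production-schedule search.
# All functional forms and constants here are illustrative assumptions.
from scipy.optimize import minimize_scalar

def demand(p):                     # convex, decreasing price-demand curve (assumed form)
    return 500.0 / (1.0 + 0.05 * p)

def profit_per_unit_time(p, t1):   # placeholder profit for price p and run length t1
    d = demand(p)
    revenue = p * d
    cost = 2.0 * d + 50.0 / t1 + 0.8 * d * t1   # unit cost + setup + holding (illustrative)
    return revenue - cost

def best_schedule(p):              # inner problem: best run length for a given price
    r = minimize_scalar(lambda t1: -profit_per_unit_time(p, t1),
                        bounds=(0.01, 10.0), method="bounded")
    return r.x, -r.fun

def best_price(lo=1.0, hi=100.0):  # outer problem: profit is concave in price
    r = minimize_scalar(lambda p: -best_schedule(p)[1], bounds=(lo, hi), method="bounded")
    t1, profit = best_schedule(r.x)
    return r.x, t1, profit

print(best_price())
```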

  10. Optimization of process parameters in CNC turning of aluminium alloy using hybrid RSM cum TLBO approach

    NASA Astrophysics Data System (ADS)

    Rudrapati, R.; Sahoo, P.; Bandyopadhyay, A.

    2016-09-01

    The main aim of the present work is to analyse the significance of turning parameters on surface roughness in computer numerically controlled (CNC) turning while machining an aluminium alloy. Spindle speed, feed rate and depth of cut have been considered as machining parameters. Experimental runs have been conducted as per the Box-Behnken design method. After experimentation, surface roughness is measured using a stylus profilometer. Factor effects have been studied through analysis of variance. Mathematical modelling has been done by response surface methodology to establish relationships between the input parameters and the output response. Finally, process optimization has been carried out with the teaching-learning-based optimization (TLBO) algorithm. The predicted turning condition has been validated through a confirmatory experiment.
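
    A minimal sketch of the TLBO step, assuming the fitted RSM model is available as a function of the three machining parameters; the stand-in roughness model and parameter ranges below are illustrative only.

```python
# Minimal TLBO sketch: minimize a stand-in surface-roughness model over
# spindle speed, feed rate and depth of cut.
import numpy as np

rng = np.random.default_rng(1)
LOW = np.array([1000.0, 0.05, 0.2])    # spindle speed (rpm), feed (mm/rev), depth (mm)
HIGH = np.array([3000.0, 0.25, 1.0])

def roughness(x):                       # placeholder for the fitted RSM model
    s, f, d = (x - LOW) / (HIGH - LOW)
    return 1.5 + 2.0 * f**2 + 0.8 * d**2 - 0.5 * s + 0.3 * s * f

def tlbo(pop_size=20, iters=100):
    pop = rng.uniform(LOW, HIGH, (pop_size, 3))
    for _ in range(iters):
        fit = np.apply_along_axis(roughness, 1, pop)
        teacher = pop[fit.argmin()]
        tf = rng.integers(1, 3)                               # teaching factor 1 or 2
        # Teacher phase: pull learners toward the teacher, away from the mean.
        cand = pop + rng.uniform(size=pop.shape) * (teacher - tf * pop.mean(axis=0))
        cand = np.clip(cand, LOW, HIGH)
        improve = np.apply_along_axis(roughness, 1, cand) < fit
        pop[improve] = cand[improve]
        # Learner phase: each learner moves relative to a random partner.
        fit = np.apply_along_axis(roughness, 1, pop)
        partners = rng.permutation(pop_size)
        step = np.where((fit < fit[partners])[:, None],
                        pop - pop[partners], pop[partners] - pop)
        cand = np.clip(pop + rng.uniform(size=pop.shape) * step, LOW, HIGH)
        improve = np.apply_along_axis(roughness, 1, cand) < fit
        pop[improve] = cand[improve]
    return pop[np.apply_along_axis(roughness, 1, pop).argmin()]

print("predicted best turning parameters:", np.round(tlbo(), 3))
```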

  11. Performance of the ALTA 4700 with variable print strategy and optimized resist process

    NASA Astrophysics Data System (ADS)

    Allen, Paul C.; Hamaker, H. Christopher; Morgante, Cris; Berwick, Andrew; White, Michael

    2005-11-01

    The ALTA 4700 incorporates new optical subsystems to improve pattern quality performance and has added the capability to do variable multipass printing. The optical system changes are the addition of a 0.9-NA reduction lens and a new AOD subsystem to reduce beam placement and intensity errors. Variable multipass printing allows two-, four- or eight-pass printing, thereby enabling the user to optimize the pattern quality/throughput tradeoff. Local CDU 3σ performance for one pattern is reduced from 8.2 to 5.1 to 3.4 nm as the number of passes is increased from two to four to eight. Reduction of CDU performance is more pattern dependent going from four to eight passes than going from two to four passes. Pattern write times scale roughly linearly with the number of passes. Local pattern loading effects can limit global CDU performance. These effects can be reduced by optimizing resist selection and develop processes.

  12. Wavelet transform based on the optimal wavelet pairs for tunable diode laser absorption spectroscopy signal processing.

    PubMed

    Li, Jingsong; Yu, Benli; Fischer, Horst

    2015-04-01

    This paper presents a novel methodology based on the discrete wavelet transform (DWT) and the choice of optimal wavelet pairs to adaptively process tunable diode laser absorption spectroscopy (TDLAS) spectra for quantitative analysis, such as molecular spectroscopy and trace gas detection. The proposed methodology aims to construct an optimal calibration model for a TDLAS spectrum, regardless of its background structural characteristics, thus facilitating the application of TDLAS as a powerful tool for analytical chemistry. The performance of the proposed method is verified using analysis of both synthetic and observed signals, characterized by different noise levels and baseline drift. Both fitting precision and signal-to-noise ratio are improved significantly using the proposed method.
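
    A hedged sketch of DWT-based processing of a TDLAS-like signal using PyWavelets: decompose, soft-threshold the detail coefficients, reconstruct. The fixed 'db8' wavelet and universal threshold are simplifications; the paper's method selects an optimal wavelet pair instead.

```python
# Hedged sketch: wavelet denoising of a synthetic absorption signal with PyWavelets.
import numpy as np
import pywt

def dwt_denoise(signal, wavelet="db8", level=5):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745         # noise estimate from finest scale
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))       # universal threshold
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

# Synthetic absorption line with noise and a slow baseline drift:
x = np.linspace(-1, 1, 2048)
clean = np.exp(-(x / 0.05) ** 2)
noisy = clean + 0.05 * np.random.randn(x.size) + 0.1 * x
print("residual after denoising:", np.std(dwt_denoise(noisy) - clean))
```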

  13. Review of exchange processes on Ganymede in view of its planetary protection categorization.

    PubMed

    Grasset, O; Bunce, E J; Coustenis, A; Dougherty, M K; Erd, C; Hussmann, H; Jaumann, R; Prieto-Ballesteros, O

    2013-10-01

    In this paper, we provide a detailed review of Ganymede's characteristics that are germane to any consideration of its planetary protection requirements. Ganymede is the largest moon in our solar system and is the subject of one of the main science objectives of the JUICE mission to the jovian system. We explore the probability of the occurrence of potentially habitable zones within Ganymede at present, including those both within the deep liquid ocean and those in shallow liquid reservoirs. We consider the possible exchange processes between the surface and any putative habitats to set some constraints on the planetary protection approach for this moon. As a conclusion, the "remote" versus "significant" chance of contamination will be discussed, according to our current understanding of this giant icy moon. Based on the different estimates we investigate here, it appears extremely unlikely that material would be exchanged downward through the upper icy layer of Ganymede and, thus, bring material into the ocean over timescales consistent with the survival of microorganisms. PMID:24143869

  14. Protection of high temperature superconducting thin-films in a semiconductor processing environment

    SciTech Connect

    Xu, Yizi; Fiske, R.; Sanders, S.C.; Ekin, J.W.

    1996-12-31

    Annealing studies have been carried out for the high temperature superconductor YBaCuO7−δ in a reducing ambient, in order to identify insulator layer(s) that will effectively protect the superconducting film in this hostile environment. While a layer of magnesium oxide (MgO) sputter deposited directly on the YBaCuO7−δ film provides some degree of protection, the authors found that a composite structure of YBCO/SrTiO3/MgO, where the SrTiO3 was grown by laser ablation immediately following YBCO deposition (in-situ process), was much more effective. They also address the need for a buffer layer between YBCO and aluminum (Al) during annealing. Al is most commonly used for semiconductor metallization, but is known to react readily with YBCO at elevated temperatures. The authors found that the most effective buffer layers are platinum (Pt) and gold/platinum (Au/Pt).

  15. A Survey on Optimal Signal Processing Techniques Applied to Improve the Performance of Mechanical Sensors in Automotive Applications

    PubMed Central

    Hernandez, Wilmar

    2007-01-01

    In this paper a survey on recent applications of optimal signal processing techniques to improve the performance of mechanical sensors is made. A comparison between classical filters and optimal filters for automotive sensors is presented, and the current state of the art of applying robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today's cars need is illustrated through several experimental results, which show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way to go. However, the switch between the traditional methods of designing automotive sensors and the new ones cannot be done overnight because there are some open research issues that have to be solved. This paper draws attention to one of the open research issues and tries to arouse researchers' interest in the fusion of intelligent sensors and optimal signal processing techniques.

  16. Optimization of residual stresses in MMC's through the variation of interfacial layer architectures and processing parameters

    NASA Technical Reports Server (NTRS)

    Pindera, Marek-Jerzy; Salzar, Robert S.

    1996-01-01

    The objective of this work was the development of efficient, user-friendly computer codes for optimizing fabrication-induced residual stresses in metal matrix composites through the use of homogeneous and heterogeneous interfacial layer architectures and processing parameter variation. To satisfy this objective, three major computer codes have been developed and delivered to the NASA-Lewis Research Center, namely MCCM, OPTCOMP, and OPTCOMP2. MCCM is a general research-oriented code for investigating the effects of microstructural details, such as layered morphology of SCS-6 SiC fibers and multiple homogeneous interfacial layers, on the inelastic response of unidirectional metal matrix composites under axisymmetric thermomechanical loading. OPTCOMP and OPTCOMP2 combine the major analysis module resident in MCCM with a commercially-available optimization algorithm and are driven by user-friendly interfaces which facilitate input data construction and program execution. OPTCOMP enables the user to identify those dimensions, geometric arrangements and thermoelastoplastic properties of homogeneous interfacial layers that minimize thermal residual stresses for the specified set of constraints. OPTCOMP2 provides additional flexibility in the residual stress optimization through variation of the processing parameters (time, temperature, external pressure and axial load) as well as the microstructure of the interfacial region which is treated as a heterogeneous two-phase composite. Overviews of the capabilities of these codes are provided together with a summary of results that addresses the effects of various microstructural details of the fiber, interfacial layers and matrix region on the optimization of fabrication-induced residual stresses in metal matrix composites.

  17. Optimization of processing parameters for the preparation of phytosterol microemulsions by the solvent displacement method.

    PubMed

    Leong, Wai Fun; Che Man, Yaakob B; Lai, Oi Ming; Long, Kamariah; Misran, Misni; Tan, Chin Ping

    2009-09-23

    The purpose of this study was to optimize the parameters involved in the production of water-soluble phytosterol microemulsions for use in the food industry. In this study, response surface methodology (RSM) was employed to model and optimize four of the processing parameters, namely, the number of cycles of high-pressure homogenization (1-9 cycles), the pressure used for high-pressure homogenization (100-500 bar), the evaporation temperature (30-70 degrees C), and the concentration ratio of microemulsions (1-5). All responses (particle size (PS), polydispersity index (PDI), and percent ethanol residual (%ER)) were well fit by a reduced cubic model obtained by multiple regression after manual elimination. The coefficient of determination (R(2)) and absolute average deviation (AAD) values for PS, PDI, and %ER were 0.9628 and 0.5398%, 0.9953 and 0.7077%, and 0.9989 and 1.0457%, respectively. The optimized processing parameters were 4.88 (approximately 5) cycles of high-pressure homogenization, a homogenization pressure of 400 bar, an evaporation temperature of 44.5 degrees C, and a concentration ratio of microemulsions of 2.34. The corresponding responses for the optimized preparation condition were a minimal particle size of 328 nm, a minimal polydispersity index of 0.159, and <0.1% ethanol residual. The chi-square test verified the model, whereby the experimental values of PS, PDI, and %ER agreed with the predicted values at a 0.05 level of significance. PMID:19694442

  18. Affordable Design: A Methodology to Implement Process-Based Manufacturing Cost into the Traditional Performance-Focused Multidisciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Samareh, J. A.

    2000-01-01

    The primary objective of this paper is to demonstrate the use of process-based manufacturing and assembly cost models in a traditional performance-focused multidisciplinary design and optimization process. The use of automated cost-performance analysis is an enabling technology that could bring realistic process-based manufacturing and assembly cost into multidisciplinary design and optimization. In this paper, we present a new methodology for incorporating process costing into a standard multidisciplinary design optimization process. Material, manufacturing process, and assembly process costs could then be used as the objective function for the optimization method. A case study involving forty-six different configurations of a simple wing is presented, indicating that a design based on performance criteria alone may not necessarily be the most affordable as far as manufacturing and assembly cost is concerned.

  19. Critical Protection Item classification for a waste processing facility at Savannah River Site

    SciTech Connect

    Ades, M.J.; Garrett, R.J.

    1993-10-01

    This paper describes the methodology for Critical Protection Item (CPI) classification and its application to the Structures, Systems and Components (SSC) of a waste processing facility at the Savannah River Site (SRS). The WSRC methodology for CPI classification includes the evaluation of the radiological and non-radiological consequences resulting from postulated accidents at the waste processing facility and comparison of these consequences with allowable limits. The types of accidents considered include explosions and fire in the facility and postulated accidents due to natural phenomena, including earthquakes, tornadoes, and high velocity straight winds. The radiological analysis results indicate that CPIs are not required at the waste processing facility to mitigate the consequences of radiological release. The non-radiological analysis, however, shows that the Waste Storage Tank (WST) and the dike spill containment structures around the formic acid tanks in the cold chemical feed area and waste treatment area of the facility should be identified as CPIs. Accident mitigation options are provided and discussed.

  20. Optimization of laser-assisted glass frit bonding process by response surface methodology

    NASA Astrophysics Data System (ADS)

    Wang, Wen; Xiao, Yanyi; Wu, Xingyang; Zhang, Jianhua

    2016-03-01

    In this work, a systematic study on laser-assisted glass frit bonding process was carried out by response surface methodology (RSM). Laser power, sealing speed and spot diameter were considered as key bonding parameters. Combined with a central rotatable experimental design, RSM was employed to establish mathematical model to predict the relationship between the shear force after bonding and the bonding process parameters. The model was validated experimentally. Based on the model, the interaction effects of the process parameters on the shear force were analyzed and the optimum bonding parameters were achieved. The results indicate that the model can be used to illustrate the relationship between the shear force and the bonding parameters. The predicted results obtained under the optimized parameters by the models are consistent with the experimental results.

  1. Malachite green decolorization by the filamentous fungus Myrothecium roridum--Mechanistic study and process optimization.

    PubMed

    Jasińska, Anna; Paraszkiewicz, Katarzyna; Sip, Anna; Długoński, Jerzy

    2015-10-01

    The filamentous fungus Myrothecium roridum isolated from a dye-contaminated area was investigated in terms of its use for the treatment of Malachite green (MG). The mechanisms involved in this process were established. Peroxidases and cytochrome P-450 do not mediate MG elimination. The laccase of M. roridum IM 6482 was found to be responsible for the decolorization of 8-11% of MG. Thermostable low-molecular-weight factors (LMWF) resistant to sodium azide were found to be largely involved in dye decomposition. In addition, MG decolorization by M. roridum IM 6482 occurred in a non-toxic manner. Data from antimicrobial tests showed that MG toxicity decreased after decolorization. To optimize the MG decolorization process, the effects of operational parameters (such as the medium pH and composition, process temperature and culture agitation) were examined. The results demonstrate that M. roridum IM 6482 may be used effectively as an alternative to traditional decolorization agents. PMID:26185924

  2. Malachite green decolorization by the filamentous fungus Myrothecium roridum--Mechanistic study and process optimization.

    PubMed

    Jasińska, Anna; Paraszkiewicz, Katarzyna; Sip, Anna; Długoński, Jerzy

    2015-10-01

    The filamentous fungus Myrothecium roridum isolated from a dye-contaminated area was investigated in terms of its use for the treatment of Malachite green (MG). The mechanisms involved in this process were established. Peroxidases and cytochrome P-450 do not mediate MG elimination. The laccase of M. roridum IM 6482 was found to be responsible for the decolorization of 8-11% of MG. Thermostable low-molecular-weight factors (LMWF) resistant to sodium azide were found to be largely involved in dye decomposition. In addition, MG decolorization by M. roridum IM 6482 occurred in a non-toxic manner. Data from antimicrobial tests showed that MG toxicity decreased after decolorization. To optimize the MG decolorization process, the effects of operational parameters (such as the medium pH and composition, process temperature and culture agitation) were examined. The results demonstrate that M. roridum IM 6482 may be used effectively as an alternative to traditional decolorization agents.

  3. Optimization of a RF-generated CF4/O2 gas plasma sterilization process.

    PubMed

    Lassen, Klaus S; Nordby, Bolette; Grün, Reinar

    2003-05-15

    A sterilization process using RF-generated (13.56 MHz) CF(4)/O(2) gas plasma was optimized with regard to power, flow rate, exposure time, and RF-system type. The dependency of the sporicidal effect on the positioning of the spore inoculum in the chamber of the RF systems was also investigated. Dried Bacillus stearothermophilus ATCC 7953 endospores were used as test organisms. The treatments were evaluated on the basis of survival curves and the corresponding D values. The only parameter found to affect the sterilization process was the power of the RF system: higher power resulted in higher kill. Finally, when the samples were placed more than 3-8 cm away from a centrally placed electrode in System 2, the sporicidal effect was reduced. The results are discussed and compared to results from the present literature. The RF excitation source is evaluated to be more appropriate for sterilization processes than the MW source. PMID:12687716
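
    For readers unfamiliar with D values, the sketch below shows how one is read off a survival curve: regress log10(survivors) on exposure time and take D = -1/slope, the time for one decimal reduction. The counts are illustrative, not the study's data.

```python
# Small sketch: D value from a log-linear fit of a spore survival curve.
import numpy as np

times = np.array([0, 5, 10, 15, 20], dtype=float)              # min of plasma exposure
survivors = np.array([1e6, 2e5, 4e4, 9e3, 2e3], dtype=float)   # viable spores (illustrative)

slope, intercept = np.polyfit(times, np.log10(survivors), 1)
D_value = -1.0 / slope                                          # min per decimal reduction
print(f"D value ≈ {D_value:.1f} min")
```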

  4. Auto-SEIA: simultaneous optimization of image processing and machine learning algorithms

    NASA Astrophysics Data System (ADS)

    Negro Maggio, Valentina; Iocchi, Luca

    2015-02-01

    Object classification from images is an important task for machine vision and it is a crucial ingredient for many computer vision applications, ranging from security and surveillance to marketing. Image based object classification techniques properly integrate image processing and machine learning (i.e., classification) procedures. In this paper we present a system for automatic simultaneous optimization of algorithms and parameters for object classification from images. More specifically, the proposed system is able to process a dataset of labelled images and to return a best configuration of image processing and classification algorithms and of their parameters with respect to the accuracy of classification. Experiments with real public datasets are used to demonstrate the effectiveness of the developed system.

  5. Optimized formulation and processing protocol for a supplementary bean-based composite flour.

    PubMed

    Ndagire, Catherine T; Muyonga, John H; Manju, Reddy; Nakimbugwe, Dorothy

    2015-11-01

    Protein-energy malnutrition is the most serious nutritional depletion disorder among infants and young children in developing countries, attributable to inadequate energy and nutrient intake, partly due to the high dietary bulk of weaning and infant foods. The gruels fed to children are typically of low nutrient and energy density due to the low flour incorporation rate required for drinking viscosity. The aim of this study was to develop a nutritious product, based on common dry beans and other grains, suitable for supplementary feeding. The optimal processing conditions for the desired nutritional and sensory attributes were determined using Response Surface Methodology. For bean processing, soaking for 6, 15, or 24 h, germination for 24 or 48 h, and cooking under pressure for either 10 or 20 min were the independent variables. The processed bean flour's total polyphenol, phytic acid and protein content, the sensory acceptability of the bean-based composite porridge and its protein and starch digestibility were the dependent variables. Based on product acceptability, antinutrients and protein content, as well as on protein and starch digestibility, the optimum processing conditions for the bean flour for infant and young child feeding were 24 h of soaking, 48 h of malting, and 19 min of steaming under pressure. These conditions resulted in a product with the highest desirability. The model equations developed can be used for predicting the quality of the bean flour and the bean-based composite porridge. Optimally processed bean flour blended with grain amaranth and rice flours at a ratio of 40:30:30, respectively, resulted in a flour giving high energy, mineral, and nutrient density in the final porridge. The composite is well adapted to preparation at the rural community level. The use of these locally available grains and feasible processes could make a great contribution to nutrition security in sub-Saharan Africa and other developing countries. PMID:26788294

  6. Experiments for practical education in process parameter optimization for selective laser sintering to increase workpiece quality

    NASA Astrophysics Data System (ADS)

    Reutterer, Bernd; Traxler, Lukas; Bayer, Natascha; Drauschke, Andreas

    2016-04-01

    Selective Laser Sintering (SLS) is considered one of the most important additive manufacturing processes due to component stability and its broad range of usable materials. However, the influence of the different process parameters on mechanical workpiece properties is still poorly studied, so further optimization is necessary to increase workpiece quality. In order to investigate the impact of various process parameters, laboratory experiments were implemented to improve the understanding of the limitations and advantages of SLS at an educational level. The experiments are based on two different workstations, used to teach students the fundamentals of SLS. First, a 50 W CO2 laser workstation is used to investigate the interaction of the laser beam with the material as a function of varied process parameters, analysed on a single-layered test piece. Second, the FORMIGA P110 laser sintering system from EOS is used to print different 3D test pieces in dependence on various process parameters. Finally, quality attributes are tested, including warpage, dimensional accuracy and tensile strength. For dimension measurements and evaluation of the surface structure, a telecentric lens in combination with a camera is used. A tensile test machine allows testing of the tensile strength and interpretation of stress-strain curves. The developed laboratory experiments are suitable for teaching students the influence of processing parameters. In this context, students will be able to optimize the input parameters depending on the component to be manufactured and to increase the overall quality of the final workpiece.

  7. Optimized breeding strategies for multiple trait integration: II. Process efficiency in event pyramiding and trait fixation.

    PubMed

    Peng, Ting; Sun, Xiaochun; Mumm, Rita H

    2014-01-01

    Multiple trait integration (MTI) is a multi-step process of converting an elite variety/hybrid for value-added traits (e.g. transgenic events) through backcross breeding. From a breeding standpoint, MTI involves four steps: single event introgression, event pyramiding, trait fixation, and version testing. This study explores the feasibility of marker-aided backcross conversion of a target maize hybrid for 15 transgenic events in the light of the overall goal of MTI of recovering equivalent performance in the finished hybrid conversion along with reliable expression of the value-added traits. Using the results to optimize single event introgression (Peng et al. Optimized breeding strategies for multiple trait integration: I. Minimizing linkage drag in single event introgression. Mol Breed, 2013) which produced single event conversions of recurrent parents (RPs) with ≤8 cM of residual non-recurrent parent (NRP) germplasm with ~1 cM of NRP germplasm in the 20 cM regions flanking the event, this study focused on optimizing process efficiency in the second and third steps in MTI: event pyramiding and trait fixation. Using computer simulation and probability theory, we aimed to (1) fit an optimal breeding strategy for pyramiding of eight events into the female RP and seven in the male RP, and (2) identify optimal breeding strategies for trait fixation to create a 'finished' conversion of each RP homozygous for all events. In addition, next-generation seed needs were taken into account for a practical approach to process efficiency. Building on work by Ishii and Yonezawa (Optimization of the marker-based procedures for pyramiding genes from multiple donor lines: I. Schedule of crossing between the donor lines. Crop Sci 47:537-546, 2007a), a symmetric crossing schedule for event pyramiding was devised for stacking eight (seven) events in a given RP. Options for trait fixation breeding strategies considered selfing and doubled haploid approaches to achieve homozygosity

  8. Optimal processing for gel electrophoresis images: Applying Monte Carlo Tree Search in GelApp.

    PubMed

    Nguyen, Phi-Vu; Ghezal, Ali; Hsueh, Ya-Chih; Boudier, Thomas; Gan, Samuel Ken-En; Lee, Hwee Kuan

    2016-08-01

    In biomedical research, gel band size estimation in electrophoresis analysis is a routine process. To facilitate and automate this process, numerous software tools have been released, notably the GelApp mobile app. However, its band detection accuracy is limited due to a band detection algorithm that cannot adapt to the variations in input images. To address this, we used the Monte Carlo Tree Search with Upper Confidence Bound (MCTS-UCB) method to efficiently search for optimal image processing pipelines for the band detection task, thereby improving the segmentation algorithm. Incorporating this into GelApp, we report a significant enhancement of gel band detection accuracy by 55.9 ± 2.0% for protein polyacrylamide gels, and 35.9 ± 2.5% for DNA SYBR green agarose gels. This implementation is a proof-of-concept in demonstrating MCTS-UCB as a strategy to optimize general image segmentation. The improved version of GelApp, GelApp 2.0, is freely available on both the Google Play Store (Android) and the Apple App Store (iOS).
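
    A simplified sketch of the selection rule behind MCTS-UCB: here a flat UCB1 bandit chooses among a few candidate band-detection pipelines by trading off measured accuracy against trial counts. The pipelines and their evaluation are hypothetical placeholders, and the paper searches a full tree of processing steps rather than a flat list.

```python
# Simplified sketch: flat UCB1 selection over candidate image-processing
# pipelines (a stand-in for the tree search used in the paper).
import math
import random

def evaluate(pipeline, image):
    """Placeholder: run the pipeline on the image and return an accuracy in [0, 1]."""
    return random.random()

def ucb_select(pipelines, images, c=1.4):
    counts = [0] * len(pipelines)
    totals = [0.0] * len(pipelines)
    for t, image in enumerate(images, start=1):
        scores = [float("inf") if counts[i] == 0
                  else totals[i] / counts[i] + c * math.sqrt(math.log(t) / counts[i])
                  for i in range(len(pipelines))]
        i = scores.index(max(scores))          # exploit + explore
        reward = evaluate(pipelines[i], image)
        counts[i] += 1
        totals[i] += reward
    best = max(range(len(pipelines)), key=lambda i: totals[i] / max(counts[i], 1))
    return pipelines[best]
```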

  9. Towards Optimal Filtering on ARM for ATLAS Tile Calorimeter Front-End Processing

    NASA Astrophysics Data System (ADS)

    Cox, Mitchell A.

    2015-10-01

    The Large Hadron Collider at CERN generates enormous amounts of raw data, which presents a serious computing challenge. After planned upgrades in 2022, the data output from the ATLAS Tile Calorimeter will increase by 200 times to over 40 Tb/s. Advanced and characteristically expensive Digital Signal Processors (DSPs) and Field Programmable Gate Arrays (FPGAs) are currently used to process this quantity of data. It is proposed that a cost-effective, high data throughput Processing Unit (PU) can be developed by using several ARM Systems-on-Chip in a cluster configuration to allow aggregated processing performance and data throughput while maintaining minimal software design difficulty for the end-user. ARM is a cost-effective and energy-efficient alternative CPU architecture to the long-established x86 architecture. This PU could be used for a variety of high-level algorithms on the high data throughput raw data. An Optimal Filtering algorithm has been implemented in C++ and several ARM platforms have been tested. Optimal Filtering is currently used in the ATLAS Tile Calorimeter front-end for basic energy reconstruction and is currently implemented on DSPs.
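
    A hedged sketch of the reconstruction step itself: Optimal Filtering estimates the pulse amplitude as a weighted sum of the digitized samples. The weights below are made up for illustration; in practice they are derived offline from the pulse shape and the noise autocorrelation.

```python
# Hedged sketch: Optimal Filtering amplitude reconstruction from digitized samples.
import numpy as np

def of_amplitude(samples, weights, pedestal=0.0):
    """Amplitude estimate A = sum_i a_i * (s_i - pedestal)."""
    return float(np.dot(weights, np.asarray(samples, dtype=float) - pedestal))

# Illustrative 7-sample pulse and a made-up weight set (not calibrated values):
samples = [50, 52, 180, 510, 340, 120, 60]
weights = np.array([-0.05, -0.10, 0.15, 0.60, 0.35, 0.10, -0.05])
print("reconstructed amplitude:", of_amplitude(samples, weights, pedestal=50.0))
```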

  10. Optimal processing for gel electrophoresis images: Applying Monte Carlo Tree Search in GelApp.

    PubMed

    Nguyen, Phi-Vu; Ghezal, Ali; Hsueh, Ya-Chih; Boudier, Thomas; Gan, Samuel Ken-En; Lee, Hwee Kuan

    2016-08-01

    In biomedical research, gel band size estimation in electrophoresis analysis is a routine process. To facilitate and automate this process, numerous software tools have been released, notably the GelApp mobile app. However, its band detection accuracy is limited due to a band detection algorithm that cannot adapt to the variations in input images. To address this, we used the Monte Carlo Tree Search with Upper Confidence Bound (MCTS-UCB) method to efficiently search for optimal image processing pipelines for the band detection task, thereby improving the segmentation algorithm. Incorporating this into GelApp, we report a significant enhancement of gel band detection accuracy by 55.9 ± 2.0% for protein polyacrylamide gels, and 35.9 ± 2.5% for DNA SYBR green agarose gels. This implementation is a proof-of-concept in demonstrating MCTS-UCB as a strategy to optimize general image segmentation. The improved version of GelApp, GelApp 2.0, is freely available on both the Google Play Store (Android) and the Apple App Store (iOS). PMID:27251892

  11. Process optimization of continuous gluconic acid fermentation by isolated yeast-like strains of Aureobasidium pullulans.

    PubMed

    Anastassiadis, Savas; Aivasidis, Alexander; Wandrey, Christian; Rehm, Hans-Jürgen

    2005-08-20

    This study was focused on the optimization of a new fermentation process for continuous gluconic acid production by the isolated yeast-like strain Aureobasidium pullulans DSM 7085 (isolate 70). Operational fermentation parameters were optimized in chemostat cultures, using a defined glucose medium. Different optima were found for growth and gluconic acid production for each set of operation parameters. Highest productivity was recorded at pH values between 6.5 and 7.0 and temperatures between 29 and 31 degrees C. A gluconic acid concentration higher than 230 g/L was continuously produced at residence times of 12 h. A steady state extracellular gluconic acid concentration of 234 g/L was measured at pH 6.5. 122% air saturation yielded the highest volumetric productivity and product concentration. The biomass-specific productivity increased steadily upon raising air saturation. An intracellular gluconic acid concentration of about 159 g/L (0.83 mol) was determined at 31 degrees C. This is to be compared with an extracellular concentration of 223 g/L (1.16 mol), which indicates the possible existence of an active transport system for gluconic acid secretion, or the presence of extracellular glucose oxidizing enzymes. The new process provides significant advantages over the traditional discontinuous fungi operations. The process control becomes easier, thus offering stable product quality and quantity.

  12. Magnetically assisted chemical separation (MACS) process: Preparation and optimization of particles for removal of transuranic elements

    SciTech Connect

    Nunez, L.; Kaminski, M.; Bradley, C.; Buchholz, B.A.; Aase, S.B.; Tuazon, H.E.; Vandegrift, G.F.; Landsberger, S.

    1995-05-01

    The Magnetically Assisted Chemical Separation (MACS) process combines the selectivity afforded by solvent extractants with magnetic separation by using specially coated magnetic particles to provide a more efficient chemical separation of transuranic (TRU) elements, other radionuclides, and heavy metals from waste streams. Development of the MACS process uses chemical and physical techniques to elucidate the properties of particle coatings and the extent of radiolytic and chemical damage to the particles, and to optimize the stages of loading, extraction, and particle regeneration. This report describes the development of a separation process for TRU elements from various high-level waste streams. Polymer-coated ferromagnetic particles with an adsorbed layer of octyl(phenyl)-N,N-diisobutylcarbamoylmethylphosphine oxide (CMPO) diluted with tributyl phosphate (TBP) were evaluated for use in the separation and recovery of americium and plutonium from nuclear waste solutions. Due to their chemical nature, these extractants selectively complex americium and plutonium contaminants onto the particles, which can then be recovered from the solution by using a magnet. The partition coefficients were larger than those expected based on liquid-liquid extractions, and the extraction proceeded with rapid kinetics. Extractants were stripped from the particles with alcohols and 400-fold volume reductions were achieved. Particles were more sensitive to acid hydrolysis than to radiolysis. Overall, the optimization of a suitable MACS particle for TRU separation was achieved under simulant conditions, and a MACS unit is currently being designed for an in-lab demonstration.

  13. DEVELOPING AN OPTIMIZED PROCESS STRATEGY FOR ACID CLEANING OF THE SAVANNAH RIVER SITE HLW TANKS

    SciTech Connect

    Ketusky, E

    2006-12-04

    At the Savannah River Site (SRS), there remain approximately 35 million gallons of High Level Waste (HLW), mostly created by the Purex and SRS H-Area Modified (HM) nuclear fuel cycles. The waste is contained in approximately forty-nine tanks fabricated from commercially available carbon steel. In order to minimize general corrosion, the waste is maintained as a very alkaline solution. The very alkaline chemistry has caused hydrated metal oxides to precipitate and form a sludge heel. Over the years, the sludge waste has aged, with some forming a hardened crust. To aid in the removal of the sludge heels from select tanks for closure, the use of oxalic acid to dissolve the sludge is being investigated. Developing an optimized process strategy based on laboratory analyses would be prohibitively costly. This research, therefore, demonstrates that a chemical equilibrium based software program can be used to develop an optimized process strategy for oxalic acid cleaning of the HLW tanks, based on estimating resultant chemistries, minimizing resultant oxalates sent to the evaporator, and minimizing resultant solids sent to the Defense Waste Processing Facility (DWPF).

  14. A Technical Review on Biomass Processing: Densification, Preprocessing, Modeling and Optimization

    SciTech Connect

    Jaya Shankar Tumuluru; Christopher T. Wright

    2010-06-01

    It is now a well-acclaimed fact that burning fossil fuels and deforestation are major contributors to climate change. Biomass from plants can serve as an alternative renewable and carbon-neutral raw material for the production of bioenergy. Low densities of 40–60 kg/m3 for lignocellulosic and 200–400 kg/m3 for woody biomass limit their application for energy purposes. Prior to use in energy applications these materials need to be densified. The densified biomass can have bulk densities over 10 times that of the raw material, helping to significantly reduce technical limitations associated with storage, loading and transportation. Pelleting, briquetting, and extrusion processing are commonly used methods for densification. The aim of the present research is to develop a comprehensive review of biomass processing that includes densification, preprocessing, modeling and optimization. The specific objectives include carrying out a technical review of (a) mechanisms of particle bonding during densification; (b) methods of densification including extrusion, briquetting, pelleting, and agglomeration; (c) effects of process and feedstock variables and biomass biochemical composition on densification; (d) effects of preprocessing such as grinding, preheating, steam explosion, and torrefaction on biomass quality and binding characteristics; (e) models for understanding the compression characteristics; and (f) procedures for response surface modeling and optimization.

  15. Analysis of grinding of superalloys and ceramics for off-line process optimization

    NASA Astrophysics Data System (ADS)

    Sathyanarayanan, G.

    The present study compared the performance of resinoid, vitrified, and electroplated CBN wheels in creep feed grinding of M42 and D2 tool steels. Responses such as specific energy, normal and tangential forces, and surface roughness were used as measures of performance. It was found that creep feed grinding with resinoid, vitrified, and electroplated CBN wheels each has its own advantages, but no single wheel could provide good finish, lower specific energy, and high material removal rates simultaneously. To optimize CBN grinding with different bonded wheels, a Multiple Criteria Decision Making (MCDM) methodology was used. Creep feed grinding of the superalloys Ti-6Al-4V and Inconel 718 has been modeled by utilizing neural networks to optimize the grinding process. A parallel effort was directed at creep feed grinding of alumina ceramics with diamond wheels to investigate the influence of process variables on responses, based on experimental results and statistical analysis. A conflicting influence of the variables was observed. This led to the formulation of the ceramic grinding process as a multi-objective nonlinear mixed integer problem.

  16. Optimizing photo-Fenton like process for the removal of diesel fuel from the aqueous phase

    PubMed Central

    2014-01-01

    Background: In recent years, pollution of soil and groundwater caused by fuel leakage from old underground storage tanks, oil extraction processes, refineries, fuel distribution terminals, improper disposal and spills during transfer has been reported. Diesel fuel has created many problems for water resources. The main objectives of this research were to assess the feasibility of using a photo-Fenton-like method with nano zero-valent iron (nZVI/UV/H2O2) for removing total petroleum hydrocarbons (TPH) and to determine the optimal conditions using the Taguchi method. Results: The influence of different parameters, including the initial concentration of TPH (0.1-1 mg/L), H2O2 concentration (5-20 mmol/L), nZVI concentration (10-100 mg/L), pH (3-9), and reaction time (15-120 min), on the TPH reduction rate in diesel fuel was investigated. The variance analysis suggests that the optimal conditions for the TPH reduction rate from diesel fuel in the aqueous phase are as follows: initial TPH concentration of 0.7 mg/L, nZVI concentration of 20 mg/L, H2O2 concentration of 5 mmol/L, pH 3, and a reaction time of 60 min; the degrees of significance of the study parameters are 7.643, 9.33, 13.318, 15.185 and 6.588%, respectively. The predicted removal rate under the optimal conditions was 95.8% and was confirmed by the data obtained in this study, which were between 95 and 100%. Conclusion: The photo-Fenton-like process using nZVI may enhance the rate of diesel degradation in polluted water and could be used as a pretreatment step for the biological removal of TPH from diesel fuel in the aqueous phase. PMID:24955242
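
    A hedged sketch of the Taguchi-style analysis step: for a larger-is-better response (TPH removal), compute the signal-to-noise ratio for each run and pick, for every factor, the level with the largest mean S/N. The orthogonal array and removal rates below are placeholders, not the study's runs.

```python
# Hedged sketch: larger-is-better Taguchi S/N analysis over a placeholder design.
import numpy as np

# Columns: TPH, H2O2, nZVI, pH, time (levels coded 0/1/2); last column: removal (%).
runs = np.array([
    [0, 0, 0, 0, 0, 72.0],
    [0, 1, 1, 1, 1, 81.0],
    [1, 0, 1, 2, 2, 88.0],
    [1, 2, 2, 0, 1, 69.0],
    [2, 1, 2, 2, 0, 93.0],
    [2, 2, 0, 1, 2, 77.0],
])
factors = ["TPH", "H2O2", "nZVI", "pH", "time"]

sn = -10.0 * np.log10(1.0 / runs[:, -1] ** 2)   # larger-is-better S/N per run
for j, name in enumerate(factors):
    best = max(np.unique(runs[:, j]),
               key=lambda lv: sn[runs[:, j] == lv].mean())
    print(f"{name}: best level {int(best)}")
```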

  17. Simulation based flow distribution network optimization for vacuum assisted resin transfer moulding process

    NASA Astrophysics Data System (ADS)

    Hsiao, Kuang-Ting; Devillard, Mathieu; Advani, Suresh G.

    2004-05-01

    In the vacuum assisted resin transfer moulding (VARTM) process, using a flow distribution network such as flow channels and high permeability fabrics can accelerate the resin infiltration of the fibre reinforcement during the manufacture of composite parts. The flow distribution network significantly influences the fill time and fill pattern and is essential for the process design. The current practice has been to cover the top surface of the fibre preform with the distribution media with the hope that the resin will flood the top surface immediately and penetrate through the thickness. However, this approach has some drawbacks. One is when the resin finds its way to the vent before it has penetrated the preform entirely, which results in a defective part or resin wastage. Also, if the composite structure contains ribs or inserts, this approach invariably results in dry spots. Instead of this intuitive approach, we propose a science-based approach to design the layout of the distribution network. Our approach uses flow simulation of the resin into the network and the preform and a genetic algorithm to optimize the flow distribution network. An experimental case study of a co-cured rib structure is conducted to demonstrate the design procedure and validate the optimized flow distribution network design. Good agreement between the flow simulations and the experimental results was observed. It was found that the proposed design algorithm effectively optimized the flow distribution network of the part considered in our case study and hence should prove to be a useful tool to extend the VARTM process to manufacture of complex structures with effective use of the distribution network layup.

  18. Inverse problems and optimal experiment design in unsteady heat transfer processes identification

    NASA Technical Reports Server (NTRS)

    Artyukhin, Eugene A.

    1991-01-01

    Experimental-computational methods for estimating characteristics of unsteady heat transfer processes are analyzed. The methods are based on the principles of distributed parameter system identification. The theoretical basis of such methods is the numerical solution of nonlinear ill-posed inverse heat transfer problems and optimal experiment design problems. Numerical techniques for solving problems are briefly reviewed. The results of the practical application of identification methods are demonstrated when estimating effective thermophysical characteristics of composite materials and thermal contact resistance in two-layer systems.

  19. Optimal cure cycle design for autoclave processing of thick composite laminates: A feasibility study

    NASA Technical Reports Server (NTRS)

    Hou, Jean W.

    1985-01-01

    The thermal analysis and the calculation of thermal sensitivity of a cure cycle in autoclave processing of thick composite laminates were studied. A finite element program for the thermal analysis and design derivatives calculation for temperature distribution and the degree of cure was developed and verified. It was found that the direct differentiation was the best approach for the thermal design sensitivity analysis. In addition, the approach of the direct differentiation provided time histories of design derivatives which are of great value to the cure cycle designers. The approach of direct differentiation is to be used for further study, i.e., the optimal cycle design.

  20. Testing of New Materials and Computer Aided Optimization of Process Parameters and Clamping Device During Predevelopment of Laser Welding Processes

    NASA Astrophysics Data System (ADS)

    Weidinger, Peter; Günther, Kay; Fitzel, Martin; Logvinov, Ruslan; Ilin, Alexander; Ploshikhin, Vasily; Hugger, Florian; Mann, Vincent; Roth, Stephan; Schmidt, Michael

    The necessity for weight reduction in motor vehicles in order to reduce fuel consumption pushes automotive suppliers to use materials of higher strength. Due to their excellent crash behavior, high strength steels are increasingly applied in various structures. In this paper some predevelopment steps for a material change from a micro-alloyed steel to dual phase and complex phase steels in a T-joint assembly are presented. Initially, the general weldability of the materials regarding pore formation, hardening in the heat affected zone and hot cracking susceptibility is discussed. After this basic investigation, the computer aided design optimization of a clamping device is shown, in which the influences of the clamping jaw, the welding position and the clamping forces on weld quality are presented. Finally, experimental results of the welding process are presented, which validate the numerical simulation.

  1. Optimizing fermentation process miscanthus-to-ethanol biorefinery scale under uncertain conditions

    NASA Astrophysics Data System (ADS)

    Bomberg, Matthew; Sanchez, Daniel L.; Lipman, Timothy E.

    2014-05-01

    Ethanol produced from cellulosic feedstocks has garnered significant interest for greenhouse gas abatement and energy security promotion. One outstanding question in the development of a mature cellulosic ethanol industry is the optimal scale of biorefining activities. This question is important for companies and entrepreneurs seeking to construct and operate cellulosic ethanol biorefineries as it determines the size of investment needed and the amount of feedstock for which they must contract. The question also has important implications for the nature and location of lifecycle environmental impacts from cellulosic ethanol. We use an optimization framework similar to previous studies, but add richer details by treating many of these critical parameters as random variables and incorporating a stochastic sub-model for land conversion. We then use Monte Carlo simulation to obtain a probability distribution for the optimal scale of a biorefinery using a fermentation process and miscanthus feedstock. We find a bimodal distribution with a high peak at around 10-30 MMgal yr-1 (representing circumstances where a relatively low percentage of farmers elect to participate in miscanthus cultivation) and a lower and flatter peak between 150 and 250 MMgal yr-1 (representing more typically assumed land-conversion conditions). This distribution leads to useful insights; in particular, the asymmetry of the distribution—with significantly more mass on the low side—indicates that developers of cellulosic ethanol biorefineries may wish to exercise caution in scale-up.
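
    The overall recipe of the study (draw uncertain inputs, solve the deterministic optimal-scale problem for each draw, and inspect the distribution of optima) can be sketched as below; the cost model is a generic economies-of-scale versus feedstock-hauling trade-off, not the paper's model.

```python
# Hedged sketch: Monte Carlo distribution of optimal biorefinery capacity under
# uncertain adoption, hauling cost and capital scaling (all illustrative).
import numpy as np

rng = np.random.default_rng(42)

def optimal_scale(adoption, haul_cost, scale_exp, capacities=np.linspace(5, 300, 60)):
    # $/gal capital falls with scale; average haul distance grows with supply area.
    capital = 4.0 * (capacities / 50.0) ** (scale_exp - 1.0)
    radius = np.sqrt(capacities / (adoption * 0.8))        # effective supply radius
    feedstock = 0.9 + haul_cost * radius
    total = capital + feedstock                            # $/gal, illustrative
    return capacities[np.argmin(total)]

draws = [optimal_scale(adoption=rng.uniform(0.02, 0.5),
                       haul_cost=rng.uniform(0.002, 0.02),
                       scale_exp=rng.uniform(0.6, 0.8))
         for _ in range(5000)]
hist, edges = np.histogram(draws, bins=20)
print("modal optimal capacity bin (MMgal/yr):", edges[np.argmax(hist)])
```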

  2. Self-adaptive optimal control of dry dual clutch transmission (DCT) during starting process

    NASA Astrophysics Data System (ADS)

    Zhao, Zhiguo; He, Lu; Zheng, Zhengxing; Yang, Yunyun; Wu, Chaochun

    2016-02-01

    An optimal control based on the minimum principle is proposed to solve the problems with the starting process of the self-developed five-speed dry dual clutch transmission (DCT). For the slipping phase, the minimum principle and improved engine constant speed control are adopted to obtain the optimal clutch and engine torques and their rotating speeds, with the minimum jerk intensity and friction work as optimization indices. For the stable running phase, the engine torque is converted to the driver's level of demand. The Matlab/Simulink software platform was used to simulate the DCT vehicle in the starting stage. The simulation and related analysis were conducted for different engine speeds and intentions of the driver. The results showed that the proposed clutch starting control strategy not only reduces the level of jerk and the frictional energy loss but also follows the different starting intentions of the driver. The optimum clutch engagement principle was transformed into the clutch position principle, and a test was carried out on the test bench to validate the effectiveness of the optimum clutch position curve.

  3. Optimizing prescription of Chinese herbal medicine for unstable angina based on partially observable Markov decision process.

    PubMed

    Feng, Yan; Qiu, Yu; Zhou, Xuezhong; Wang, Yixin; Xu, Hao; Liu, Baoyan

    2013-01-01

    Objective. To obtain an initial optimized prescription of Chinese herbal medicine for unstable angina (UA). Methods. Based on a partially observable Markov decision process (POMDP) model, we selected hospitalized patients with 3 syndrome elements, namely qi deficiency, blood stasis, and turbid phlegm, for data mining, analysis, and objective evaluation of the diagnosis and treatment of UA at a deep level, in order to optimize the prescription of Chinese herbal medicine for UA. Results. The recommended treatment options of UA for qi deficiency, blood stasis, and phlegm syndrome patients were as follows: Milkvetch Root + Tangshen + Indian Bread + Largehead Atractylodes Rhizome (ADR = 0.96630); Danshen Root + Chinese Angelica + Safflower + Red Peony Root + Szechwan Lovage Rhizome + Orange Fruit (ADR = 0.76); Snakegourd Fruit + Longstamen Onion Bulb + Pinellia Tuber + Dried Tangerine Peel + Largehead Atractylodes Rhizome + Platycodon Root (ADR = 0.658568). Conclusion. This study initially optimized prescriptions for UA based on a POMDP, which can be used as a reference for further development of UA prescriptions in Chinese herbal medicine. PMID:24078826

  4. Post-processing of Monte Carlo simulations for rapid BNCT source optimization studies

    SciTech Connect

    Bleuel, D.L.; Chu, W.T.; Donahue, R.J.; Ludewigt, B.A.; Vujic, J.

    2000-10-01

    A great advantage of some neutron sources, such as accelerator-produced sources, is that they can be tuned to produce different spectra. Unfortunately, optimization studies are often time-consuming and difficult, as they require a lengthy Monte Carlo simulation for each source. When multiple characteristics, such as energy, angle, and spatial distribution of a neutron beam are allowed to vary, an overwhelming number of simulations may be required. Many optimization studies, therefore, suffer from a small number of datapoints, restrictive treatment conditions, or poor statistics. By scoring pertinent information from every particle tally in a Monte Carlo simulation, then applying appropriate source variable weight factors in a post-processing algorithm, a single simulation can be used to model any number of multiple sources. Through this method, the response to a new source can be modeled in minutes or seconds, rather than hours or days, allowing for the analysis of truly variable source conditions of much greater resolution than is normally possible when a new simulation must be run for each datapoint in a study. This method has been benchmarked and used to recreate optimization studies in a small fraction of the time spent in the original studies.
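
    The post-processing idea sketched below is a generic re-weighting of per-history tallies: if each history's birth energy bin and tally contribution have been recorded, the tally for a different source spectrum follows by weighting each history by the ratio of the new to the original source probability. The data, bin structure and flat original spectrum are illustrative assumptions, not the benchmarked code from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Pretend these were scored once, particle by particle, during a single MC run:
# the source energy bin each history was born in, and its contribution to a tally.
n_bins = 20
source_bin = rng.integers(0, n_bins, size=200_000)     # birth energy bin per history
tally_contrib = rng.exponential(1.0, size=200_000)     # e.g. dose scored per history

# The run was performed with a flat source spectrum over the bins.
original_pdf = np.full(n_bins, 1.0 / n_bins)

def reweighted_tally(new_pdf):
    """Estimate the tally for a different source spectrum without re-running the
    transport: weight each history by the ratio of the new to the original
    source probability of its birth bin."""
    weights = new_pdf[source_bin] / original_pdf[source_bin]
    return np.mean(weights * tally_contrib)

# Example: evaluate a peaked source spectrum in seconds rather than hours.
peaked = np.exp(-0.5 * ((np.arange(n_bins) - 12) / 2.0) ** 2)
peaked /= peaked.sum()
print(reweighted_tally(peaked))
```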

  5. [Process Optimization of PEGylating Fused Protein of LL-37 and Interferon-α2a].

    PubMed

    Zhang, Mingjie

    2015-12-01

    PEGylation is an effective way to prolong the half-life and decrease the immunogenicity of protein drugs. Single-factor experiments showed that the optimal conditions for PEGylating the fused protein of LL-37 and interferon (IFN)-α2a were a PEG molecular weight of 5,000, a fused protein concentration of 0.6 mg/mL, a mole ratio of protein to mPEG₅₀₀₀-SS of 1:10, a reaction temperature of 4 °C, and a pH of 9.0. Orthogonal experiments showed that the order of influence of the 3 main factors is: fused protein concentration > mole ratio of protein to mPEG₅₀₀₀-SS > pH, and that the optimal conditions were a fused protein concentration of 0.6 mg/mL, a mole ratio of protein to mPEG₅₀₀₀-SS of 1:10, and a pH of 8.8. Under these optimal conditions, the average PEGylation rate over 3 parallel experiments was 86.98%. After PEGylation, the interferon activity and antimicrobial activity of the fused protein remained higher than 58% and 97%, respectively. PMID:27079098

  6. Reduced-order model for dynamic optimization of pressure swing adsorption processes

    SciTech Connect

    Agarwal, A.; Biegler, L.; Zitney, S.

    2007-01-01

    Over the past decades, pressure swing adsorption (PSA) processes have been widely used as energy-efficient gas and liquid separation techniques, especially for high purity hydrogen purification from refinery gases. The separation processes are based on solid-gas equilibrium and operate under periodic transient conditions. Models for PSA processes are therefore multiple instances of partial differential equations (PDEs) in time and space with periodic boundary conditions that link the processing steps together. The solution of this coupled stiff PDE system is governed by steep concentration and temperature fronts moving with time. As a result, the optimization of such systems for either design or operation represents a significant computational challenge to current differential algebraic equation (DAE) optimization techniques and nonlinear programming algorithms. Model reduction is one approach to generate cost-efficient low-order models which can be used as surrogate models in the optimization problems. The study develops a reduced-order model (ROM) based on proper orthogonal decomposition (POD), which is a low-dimensional approximation to a dynamic PDE-based model. Initially, a representative ensemble of solutions of the dynamic PDE system is constructed by solving a higher-order discretization of the model using the method of lines, a two-stage approach that discretizes the PDEs in space and then integrates the resulting DAEs over time. Next, the ROM method applies the Karhunen-Loeve expansion to derive a small set of empirical eigenfunctions (POD modes) which are used as basis functions within a Galerkin projection framework to derive a low-order DAE system that accurately describes the dominant dynamics of the PDE system. The proposed method leads to a DAE system of significantly lower order, thus replacing the one obtained from spatial discretization and making the optimization problem computationally efficient. The method has been applied to the dynamic
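
    A minimal sketch of the POD step only (not the full Galerkin-projected DAE system) is given below: the POD modes are the leading left singular vectors of a mean-centered snapshot matrix, and the retained modes give a low-dimensional representation of any full-order state. The snapshot data here are synthetic.

```python
import numpy as np

# Snapshot matrix: each column is the spatial solution of the full PDE model
# at one time instant (here random low-rank data stands in for the PSA bed profiles).
rng = np.random.default_rng(2)
n_space, n_snapshots = 400, 120
snapshots = rng.standard_normal((n_space, 5)) @ rng.standard_normal((5, n_snapshots))
snapshots += 0.01 * rng.standard_normal((n_space, n_snapshots))   # small noise

# POD modes are the left singular vectors of the (mean-centered) snapshot matrix.
mean_field = snapshots.mean(axis=1, keepdims=True)
U, s, _ = np.linalg.svd(snapshots - mean_field, full_matrices=False)

# Keep enough modes to capture, say, 99.9% of the snapshot energy.
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.999) + 1)
modes = U[:, :r]                               # basis for the Galerkin projection
print(f"retained {r} POD modes out of {n_snapshots} snapshots")

# A full-order state x is represented by r coefficients a = modes.T @ (x - mean).
x = snapshots[:, [10]]
a = modes.T @ (x - mean_field)
x_reconstructed = mean_field + modes @ a
print("reconstruction error:", np.linalg.norm(x - x_reconstructed) / np.linalg.norm(x))
```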

  7. Molecular identification of potential denitrifying bacteria and use of D-optimal mixture experimental design for the optimization of denitrification process.

    PubMed

    Ben Taheur, Fadia; Fdhila, Kais; Elabed, Hamouda; Bouguerra, Amel; Kouidhi, Bochra; Bakhrouf, Amina; Chaieb, Kamel

    2016-04-01

    Three bacterial strains (TE1, TD3 and FB2) were isolated from date palm (degla), pistachio and barley. The presence of nitrate reductase (narG) and nitrite reductase (nirS and nirK) genes in the selected strains was detected by PCR. Molecular identification based on the 16S rDNA sequencing method was applied to identify positive strains. In addition, a D-optimal mixture experimental design was used to determine the optimal formulation of probiotic bacteria for the denitrification process. Strains harboring denitrification genes were identified as: TE1, Agrococcus sp. LN828197; TD3, Cronobacter sakazakii LN828198; and FB2, Pediococcus pentosaceus LN828199. PCR results revealed that all strains carried the nirS gene; however, only C. sakazakii LN828198 and Agrococcus sp. LN828197 harbored the nirK and narG genes, respectively. Moreover, the studied bacteria were able to form biofilms on abiotic surfaces to different degrees. Process optimization showed that the greatest reduction of nitrate was 100%, with 14.98% COD consumption and 5.57 mg/l nitrite accumulation. The response values were optimized and showed that the optimal combination was 78.79% C. sakazakii LN828198 (curve value), 21.21% P. pentosaceus LN828199 (curve value) and absence (0%) of Agrococcus sp. LN828197 (curve value). PMID:26893037

  9. Searching for optimal setting conditions in technological processes using parametric estimation models and neural network mapping approach: a tutorial.

    PubMed

    Fjodorova, Natalja; Novič, Marjana

    2015-09-01

    Engineering optimization is an important goal in manufacturing and service industries. In this tutorial we present the concept of traditional parametric estimation models (Factorial Design (FD) and Central Composite Design (CCD)) for finding optimal setting parameters of technological processes. Then the 2D mapping method based on Auto-Associative Neural Networks (ANN) (particularly, the Feed-Forward Bottleneck Neural Network (FFBN NN)) is described in comparison with the traditional methods. The FFBN NN mapping technique enables visualization of all optimal solutions of the considered processes, because input as well as output parameters are projected into the same coordinates of the 2D map. This supports a more efficient way of improving the performance of existing systems. The two methods were compared on the basis of the optimization of solder paste printing processes as well as the optimization of cheese properties. Applying both methods enables a double check, which increases the reliability of the selected optima or specification limits. PMID:26388367
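
    The sketch below shows one way such a bottleneck (auto-associative) network could be set up, assuming scikit-learn's MLPRegressor as a stand-in for the FFBN NN described in the tutorial; the process data are synthetic and the layer sizes are illustrative. Process settings and responses are stacked into one table so that both are projected onto the same 2D map through the two-neuron bottleneck.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import MinMaxScaler

rng = np.random.default_rng(3)

# Rows: experiments; columns: process settings AND measured responses together,
# so that both end up on the same 2D map (synthetic data here).
settings = rng.uniform(0, 1, size=(200, 4))
responses = np.c_[settings[:, 0] * settings[:, 1], np.sin(3 * settings[:, 2])]
X = MinMaxScaler().fit_transform(np.hstack([settings, responses]))

# Auto-associative network with a 2-neuron bottleneck: the net is trained to
# reproduce its own input, and the bottleneck activations give the 2D map.
ae = MLPRegressor(hidden_layer_sizes=(8, 2, 8), activation="tanh",
                  max_iter=5000, random_state=0)
ae.fit(X, X)

def bottleneck_codes(model, data, bottleneck_layer=1):
    """Forward-propagate up to the bottleneck hidden layer and return its output."""
    a = data
    for i in range(bottleneck_layer + 1):
        a = np.tanh(a @ model.coefs_[i] + model.intercepts_[i])
    return a

map2d = bottleneck_codes(ae, X)          # (n_samples, 2) coordinates on the map
print(map2d[:5])
```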

  11. Analysis and Optimization of the Production Process of Cooked Sausage Meat Matrices

    NASA Astrophysics Data System (ADS)

    Diez, L.; Rauh, C.; Delgado, A.

    2010-09-01

    In the production of cooked sausages, a critical step for product quality is the cutting process, in which the comminuting and mixing of meat, fat, ice and spices are carried out. These processes usually take place in bowl cutters, whose main control parameters are the working time, the knife geometry (shape and sharpness) and the rotational velocities of the knives and the bowl. The choice of the geometry and sharpness of the knives influences not only the meat matrix properties (mechanical, rheological, etc.) and, as a consequence, the sensory value of the sausages (size of connective tissue particles, water binding, etc.), but also the energy demand of the production. However, the cutting process is understood only fragmentarily due to the complex colloid-chemical and mechanical behavior of the product. This is documented, on the one hand, by the numerous knife types on the market and the extremely empirical approach used in practice to determine geometry and process parameters and, on the other hand, by contradictory statements and explanations of observed phenomena in the literature. The present contribution applies numerical simulations to analyze thermo-fluid-mechanical phenomena, e.g. shear stresses, during the cutting process of the non-Newtonian meat matrix. Combining these results with selected experimental investigations from the literature, e.g. on sensory properties, knife geometry, and velocity of the knife and bowl, improvements of the cutting and mixing process are proposed using cognitive algorithms (artificial neural networks), aiming at an optimization with regard to energy and time demand and product quality.

  12. A Big Data-driven Model for the Optimization of Healthcare Processes.

    PubMed

    Koufi, Vassiliki; Malamateniou, Flora; Vassilacopoulos, George

    2015-01-01

    Healthcare organizations increasingly navigate a highly volatile, complex environment in which technological advancements and new healthcare delivery business models are the only constants. In their effort to outperform in this environment, healthcare organizations need to be agile enough to respond to these increasingly changing conditions. To act with agility, healthcare organizations need to discover new ways to optimize their operations. To this end, they focus on the healthcare processes that guide healthcare delivery and on the technologies that support them. Business process management (BPM) and Service-Oriented Architecture (SOA) can provide a flexible, dynamic, cloud-ready infrastructure where business process analytics can be used to extract useful insights from mountains of raw data and put them to work in ways beyond the abilities of human brains, or of IT systems from just a year ago. This paper presents a framework which helps healthcare professionals gain better insight within and across their business processes. In particular, it performs real-time analysis on process-related data in order to reveal areas of potential process improvement.

  13. An application of anti-optimization in the process of validating aerodynamic codes

    NASA Astrophysics Data System (ADS)

    Cruz, Juan R.

    An investigation was conducted to assess the usefulness of anti-optimization in the process of validating aerodynamic codes. Anti-optimization is defined here as the intentional search for regions where the computational and experimental results disagree. Maximizing such disagreements can be a useful tool in uncovering errors and/or weaknesses in both analyses and experiments. The codes chosen for this investigation were an airfoil code and a lifting line code used together as an analysis to predict three-dimensional wing aerodynamic coefficients. The parameter of interest was the maximum lift coefficient of the three-dimensional wing, CL max. The test domain encompassed Mach numbers from 0.3 to 0.8, and Reynolds numbers from 25,000 to 250,000. A simple rectangular wing was designed for the experiment. A wind tunnel model of this wing was built and tested in the NASA Langley Transonic Dynamics Tunnel. Selection of the test conditions (i.e., Mach and Reynolds numbers) was made by applying the techniques of response surface methodology and considerations involving the predicted experimental uncertainty. The test was planned and executed in two phases. In the first phase, runs were conducted at the pre-planned test conditions. Based on these results, additional runs were conducted in areas where significant differences in CL max were observed between the computational results and the experiment, in essence applying the concept of anti-optimization. These additional runs were used to verify the differences in CL max and assess the extent of the region where these differences occurred. The results of the experiment showed that the analysis was capable of predicting CL max to within 0.05 over most of the test domain. The application of anti-optimization succeeded in identifying a region where the computational and experimental values of CL max differed by more than 0.05, demonstrating the usefulness of anti-optimization in the process of validating aerodynamic codes.
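
    Conceptually, anti-optimization over the test domain can be sketched as a search for the point of maximum disagreement between prediction and measurement. The stand-in functions below are invented for illustration and bear no relation to the actual airfoil/lifting-line results or the wind tunnel data.

```python
import numpy as np

# Illustrative stand-ins for the predicted and measured maximum lift coefficient
# as functions of Mach number M and Reynolds number Re (neither is the real data).
def cl_max_code(M, Re):
    return 1.2 - 0.5 * M + 0.08 * np.log10(Re / 25_000)

def cl_max_experiment(M, Re):
    # Pretend the experiment departs from the code at high Mach / low Reynolds.
    dip = (0.12 * np.exp(-((M - 0.75) / 0.1) ** 2)
           * np.exp(-((np.log10(Re) - np.log10(4e4)) / 0.3) ** 2))
    return cl_max_code(M, Re) - dip

# Anti-optimization: search the test domain for the point of maximum disagreement.
mach = np.linspace(0.3, 0.8, 101)
reynolds = np.logspace(np.log10(25_000), np.log10(250_000), 101)
MM, RR = np.meshgrid(mach, reynolds)
disagreement = np.abs(cl_max_code(MM, RR) - cl_max_experiment(MM, RR))

i, j = np.unravel_index(np.argmax(disagreement), disagreement.shape)
print(f"largest |dCL_max| = {disagreement[i, j]:.3f} at M = {MM[i, j]:.2f}, "
      f"Re = {RR[i, j]:.0f}  -> schedule follow-up runs here")
```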

  14. Economic on-line optimization for liquids extraction and treating in gas processing plants

    SciTech Connect

    Berkowitz, P.N.; Gamez, J.P.

    1995-11-01

    Significant changes in the gas processing industry are driving processors to become more dependent on their ability to adapt plant operations to respond to changing third-party contracts, wide variability of inlet conditions, and the volatile market pricing of NGLs and residue gas in order to remain competitive and profitable. The need for flexible operations at each facility requires an on-line, real-time supervisory controller/optimizer that manipulates the process to achieve its economic optimum. The economic optimum does not equal the process optimum, and the traditional approach of only meeting the control objectives is insufficient for today's gas plant operations. Because gas plants have no storage of products or residue gas, lost opportunity is immediate, with no means to regain lost profitability. The nonlinear characteristics of the process make typical available control technologies unacceptable for these applications. A new solution that incorporates economics, process dynamics and the economic arrangements of the gas processor has been developed with the cooperation of industry, the Gas Research Institute (GRI) and Continental Controls, Inc. (CCI). The objective function of this software solution is the maximization of profit for the plant and its co-owners. Thus, the optimum set of controlled variables is also dynamic, depending on product price margins, variability of the inlet gas, and the cost of utilities. This simultaneous control and optimization solution has produced benefits that have paid for the project in as little as two and a half months, and typically in four to seven months, depending on the gas throughput and its richness.

  15. Optimization of silver-assisted nano-pillar etching process in silicon

    NASA Astrophysics Data System (ADS)

    Azhari, Ayu Wazira; Sopian, Kamaruzzaman; Desa, Mohd Khairunaz Mat; Zaidi, Saleem H.

    2015-12-01

    In this study, a response surface methodology (RSM) model is developed using a three-level Box-Behnken experimental design (BBD) technique. The model is developed to investigate the influence of metal-assisted chemical etching (MACE) process variables on the nanopillar profiles created in a single-crystalline silicon (Si) substrate. Design-Expert® software (version 7.1) is employed in formulating the RSM model based on five critical process variables: (A) concentration of silver (Ag), (B) concentration of hydrofluoric acid (HF), (C) concentration of hydrogen peroxide (H2O2), (D) deposition time, and (E) etching time. The model is supported by data from 46 experimental configurations. Etched profiles, in terms of lateral etching rate, vertical etching rate, height, size and separation of the Si trenches, and etching uniformity, are characterized using a field emission scanning electron microscope (FE-SEM). A quadratic regression model is developed to correlate the critical process variables and is validated using the analysis of variance (ANOVA) methodology. The model exhibits a near-linear dependence of the lateral and vertical etching rates on both the H2O2 concentration and the etching time. The predicted model is in good agreement with the experimental data, with R2 equal to 0.80 and 0.67 for the etching rate and the lateral etching, respectively. The optimized result shows minimum lateral etching with an average pore size of about 69 nm, while the maximum etching rate is estimated at around 360 nm/min. The model demonstrates that the etching process uniformity is not influenced by either the etchant concentration or the etching time; this lack of uniformity could be attributed to the surface condition of the wafer. Optimization of the process parameters shows adequate accuracy of the model, with acceptable percentage errors of 6%, 59%, 1.8%, 38% and 61% for determination of the height, separation, size, pore size and etching rate, respectively.
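
    A quadratic response-surface fit of the kind produced by Design-Expert can be reproduced in outline as below; the Box-Behnken-like design matrix and the etch-rate response are synthetic stand-ins, not the 46 experimental runs from the study.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(4)

# Coded levels (-1, 0, +1) of the five MACE factors, columns A..E:
# Ag concentration, HF, H2O2, deposition time, etching time (synthetic plan).
X = rng.choice([-1.0, 0.0, 1.0], size=(46, 5))
etch_rate = (300 + 40 * X[:, 2] + 25 * X[:, 4] + 10 * X[:, 2] * X[:, 4]
             - 8 * X[:, 2] ** 2 + rng.normal(0, 15, size=46))

# Full quadratic model: linear, two-factor interaction and squared terms.
quad = PolynomialFeatures(degree=2, include_bias=False)
Xq = quad.fit_transform(X)
model = LinearRegression().fit(Xq, etch_rate)

print("R^2 of the quadratic response surface:", round(model.score(Xq, etch_rate), 3))

# Predict the response at a new, untested factor combination (coded units).
new_point = quad.transform([[0.5, -1.0, 1.0, 0.0, 1.0]])
print("predicted etch rate:", round(model.predict(new_point)[0], 1), "nm/min")
```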

  16. SorLA Complement-type Repeat Domains Protect the Amyloid Precursor Protein against Processing*

    PubMed Central

    Mehmedbasic, Arnela; Christensen, Sofie K.; Nilsson, Jonas; Rüetschi, Ulla; Gustafsen, Camilla; Poulsen, Annemarie Svane Aavild; Rasmussen, Rikke W.; Fjorback, Anja N.; Larson, Göran; Andersen, Olav M.

    2015-01-01

    SorLA is a neuronal sorting receptor that is genetically associated with Alzheimer disease. SorLA interacts directly with the amyloid precursor protein (APP) and affects the processing of the precursor, leading to a decreased generation of the amyloid-β peptide. The SorLA complement-type repeat (CR) domains associate in vitro with APP, but the precise molecular determinants of SorLA·APP complex formation and the mechanisms responsible for the effect of binding on APP processing have not yet been elucidated. Here, we have generated protein expression constructs for SorLA devoid of the 11 CR-domains and for two SorLA mutants harboring substitutions of the fingerprint residues in the central CR-domains. We generated SH-SY5Y cell lines that stably express these SorLA variants to study the binding and processing of APP using co-immunoprecipitation and Western blotting/ELISAs, respectively. We found that the SorLA CR-cluster is essential for interaction with APP and that deletion of the CR-cluster abolishes the protection against APP processing. Mutation of identified fingerprint residues in the SorLA CR-domains leads to changes in the O-linked glycosylation of APP when expressed in SH-SY5Y cells. Our results provide novel information on the mechanisms behind the influence of SorLA activity on APP metabolism by controlling post-translational glycosylation in the Golgi, suggesting new strategies against amyloidogenesis in Alzheimer disease. PMID:25525276

  17. ATOMIC-LEVEL IMAGING OF CO2 DISPOSAL AS A CARBONATE MINERAL: OPTIMIZING REACTION PROCESS DESIGN

    SciTech Connect

    M.J. McKelvy; R. Sharma; A.V.G. Chizmeshya; H. Bearat; R.W. Carpenter; K. Streib

    1999-09-01

    Fossil fuels, especially coal, can support the energy demands of the world for centuries to come, if the environmental problems associated with CO2 emissions can be overcome. Permanent and safe methods for CO2 capture and disposal/storage need to be developed. Mineralization of stationary-source CO2 emissions as carbonates can provide such safe capture and long-term sequestration. Mg(OH)2 carbonation is a leading process candidate, which generates the stable naturally occurring mineral magnesite (MgCO3) and water. Key to process cost and viability are the carbonation reaction rate and its degree of completion. This process, which involves simultaneous dehydroxylation and carbonation, is very promising, but far from optimized. In order to optimize the dehydroxylation/carbonation process, an atomic-level understanding of the mechanisms involved is needed. Since Mg(OH)2 dehydroxylation is intimately associated with the carbonation process, its mechanisms are also of direct interest in understanding and optimizing the process. In the first project year, our investigations have focused on developing an atomic-level understanding of the dehydroxylation/carbonation reaction mechanisms that govern the overall carbonation reaction process in well crystallized material. In years two and three, we will also explore the roles of crystalline defects and impurities. Environmental-cell, dynamic high-resolution transmission electron microscopy has been used to directly observe the dehydroxylation process at the atomic level for the first time. These observations were combined with advanced computational modeling studies to better elucidate the atomic-level process. These studies were combined with direct carbonation studies to better elucidate dehydroxylation/carbonation reaction mechanisms. Dehydroxylation follows a lamellar nucleation and growth process involving oxide layer formation. These layers form lamellar oxyhydroxide regions, which can

  18. Impact of cultivar selection and process optimization on ethanol yield from different varieties of sugarcane

    PubMed Central

    2014-01-01

    Background The development of ‘energycane’ varieties of sugarcane is underway, targeting the use of both sugar juice and bagasse for ethanol production. The current study evaluated a selection of such ‘energycane’ cultivars for the combined ethanol yields from juice and bagasse, by optimization of the dilute acid pretreatment of bagasse for sugar yields. Method A central composite design under response surface methodology was used to investigate the effects of dilute acid pretreatment parameters followed by enzymatic hydrolysis on the combined sugar yield of bagasse samples. The pressed slurry generated from optimum pretreatment conditions (maximum combined sugar yield) was used as the substrate during batch and fed-batch simultaneous saccharification and fermentation (SSF) processes at different solid loadings and enzyme dosages, aiming to reach an ethanol concentration of at least 40 g/L. Results Significant variations were observed in sugar yields (xylose, glucose and combined sugar yield) from pretreatment-hydrolysis of bagasse from different cultivars of sugarcane. Up to 33% difference in combined sugar yield between the best-performing varieties and industrial bagasse was observed at optimal pretreatment-hydrolysis conditions. Significant improvement in overall ethanol yield after SSF of the pretreated bagasse was also observed for the best-performing varieties (84.5 to 85.6%) compared to industrial bagasse (74.5%). The ethanol concentration showed an inverse correlation with lignin content and the ratio of xylose to arabinose, but a positive correlation with glucose yield from pretreatment-hydrolysis. The overall assessment of the cultivars showed greater improvement in the final ethanol concentration (26.9 to 33.9%) and combined ethanol yields per hectare (83 to 94%) for the best-performing varieties with respect to industrial sugarcane. Conclusions These results suggest that the selection of sugarcane variety to optimize ethanol

  19. Implementation and optimization of ultrasound signal processing algorithms on mobile GPU

    NASA Astrophysics Data System (ADS)

    Kong, Woo Kyu; Lee, Wooyoul; Kim, Kyu Cheol; Yoo, Yangmo; Song, Tai-Kyong

    2014-03-01

    General-purpose graphics processing units (GPGPUs) have been used to improve computing power in medical ultrasound imaging systems. Recently, mobile GPUs have become powerful enough to handle 3D games and videos at high frame rates on Full HD or HD resolution displays. This paper proposes a method to implement ultrasound signal processing on a mobile GPU available in a high-end smartphone (Galaxy S4, Samsung Electronics, Seoul, Korea) with programmable shaders on the OpenGL ES 2.0 platform. To maximize the performance of the mobile GPU, optimization of the shader design and load sharing between the vertex and fragment shaders was performed. The beamformed data were captured from a tissue-mimicking phantom (Model 539 Multipurpose Phantom, ATS Laboratories, Inc., Bridgeport, CT, USA) by using a commercial ultrasound imaging system equipped with a research package (Ultrasonix Touch, Ultrasonix, Richmond, BC, Canada). The real-time performance is evaluated by frame rates while varying the range of signal processing blocks. The implementation of ultrasound signal processing on OpenGL ES 2.0 was verified by analyzing the PSNR against a MATLAB gold standard with the same signal path. CNR was also analyzed to verify the method. From the evaluations, the proposed mobile GPU-based processing method shows no significant difference from the processing using MATLAB (i.e., PSNR<52.51 dB). Comparable CNR results (i.e., 11.31) were obtained from both processing methods. With the mobile GPU implementation, frame rates of 57.6 Hz were achieved. The total execution time was 17.4 ms, which was faster than the acquisition time (i.e., 34.4 ms). These results indicate that the mobile GPU-based processing method can support real-time ultrasound B-mode processing on the smartphone.

  20. Development of a Groundwater Transport Simulation Tool for Remedial Process Optimization

    SciTech Connect

    Ivarson, Kristine A.; Hanson, James P.; Tonkin, M.; Miller, Charles W.; Baker, S.

    2015-01-14

    The groundwater remedy for hexavalent chromium at the Hanford Site includes operation of five large pump-and-treat systems along the Columbia River. The systems at the 100-HR-3 and 100-KR-4 groundwater operable units treat a total of about 9,840 liters per minute (2,600 gallons per minute) of groundwater to remove hexavalent chromium, and cover an area of nearly 26 square kilometers (10 square miles). The pump-and-treat systems result in large scale manipulation of groundwater flow direction, velocities, and most importantly, the contaminant plumes. Tracking of the plumes and predicting needed system modifications is part of the remedial process optimization, and is a continual process with the goal of reducing costs and shortening the timeframe to achieve the cleanup goals. While most of the initial system evaluations are conducted by assessing performance (e.g., reduction in contaminant concentration in groundwater and changes in inferred plume size), changes to the well field are often recommended. To determine the placement for new wells, well realignments, and modifications to pumping rates, it is important to be able to predict resultant plume changes. In smaller systems, it may be effective to make small scale changes periodically and adjust modifications based on groundwater monitoring results. Due to the expansive nature of the remediation systems at Hanford, however, additional tools were needed to predict the plume reactions to system changes. A computer simulation tool was developed to support pumping rate recommendations for optimization of large pump-and-treat groundwater remedy systems. This tool, called the Pumping Optimization Model, or POM, is based on a 1-layer derivation of a multi-layer contaminant transport model using MODFLOW and MT3D.

  1. Automation of reverse engineering process in aircraft modeling and related optimization problems

    NASA Technical Reports Server (NTRS)

    Li, W.; Swetits, J.

    1994-01-01

    During 1994, engineering problems in aircraft modeling were studied. The initial concern was to obtain a surface model with desirable geometric characteristics. Much of the effort during the first half of the year was to find an efficient way of solving a computationally difficult optimization model. Since the smoothing technique in the proposal 'Surface Modeling and Optimization Studies of Aerodynamic Configurations' requires solutions of a sequence of large-scale quadratic programming problems, it is important to design algorithms that can solve each quadratic program in a few iterations. This research led to three papers by Dr. W. Li, which were submitted to SIAM Journal on Optimization and Mathematical Programming. Two of these papers have been accepted for publication. Even though significant progress was made during this phase of research and the computation time was reduced from 30 min. to 2 min. for a sample problem, it was not good enough for on-line processing of digitized data points. After discussion with Dr. Robert E. Smith Jr., it was decided not to enforce shape constraints in order to simplify the model. As a consequence, P. Dierckx's nonparametric spline fitting approach was adopted, where one has only one control parameter for the fitting process - the error tolerance. At the same time the surface modeling software developed by Imageware was tested. Research indicated that a substantially improved fit of digitized data points can be achieved if a proper parameterization of the spline surface is chosen. A winning strategy is to incorporate Dierckx's surface fitting with a natural parameterization for aircraft parts. The report consists of 4 chapters. Chapter 1 provides an overview of reverse engineering related to aircraft modeling and some preliminary findings of the effort in the second half of the year. Chapters 2-4 are the research results by Dr. W. Li on penalty functions and conjugate gradient methods for

  2. HiC-Pro: an optimized and flexible pipeline for Hi-C data processing.

    PubMed

    Servant, Nicolas; Varoquaux, Nelle; Lajoie, Bryan R; Viara, Eric; Chen, Chong-Jian; Vert, Jean-Philippe; Heard, Edith; Dekker, Job; Barillot, Emmanuel

    2015-12-01

    HiC-Pro is an optimized and flexible pipeline for processing Hi-C data from raw reads to normalized contact maps. HiC-Pro maps reads, detects valid ligation products, performs quality controls and generates intra- and inter-chromosomal contact maps. It includes a fast implementation of the iterative correction method and is based on a memory-efficient data format for Hi-C contact maps. In addition, HiC-Pro can use phased genotype data to build allele-specific contact maps. We applied HiC-Pro to different Hi-C datasets, demonstrating its ability to easily process large data in a reasonable time. Source code and documentation are available at http://github.com/nservant/HiC-Pro .
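
    The iterative correction mentioned above is, in essence, a matrix-balancing step. The sketch below is a generic ICE-style normalization written for illustration; it is not HiC-Pro's actual implementation, and the toy contact matrix stands in for a real genome-wide map.

```python
import numpy as np

def ice_normalize(contacts, n_iter=50, eps=1e-10):
    """Minimal sketch of iterative correction (ICE-style matrix balancing):
    repeatedly divide the symmetric contact matrix by the outer product of its
    row sums until every non-empty row has (approximately) equal coverage."""
    m = contacts.astype(float).copy()
    bias = np.ones(m.shape[0])
    for _ in range(n_iter):
        coverage = m.sum(axis=1)
        coverage /= coverage[coverage > 0].mean()      # normalize the correction step
        coverage[coverage == 0] = 1.0                  # leave empty bins untouched
        m /= np.outer(coverage, coverage)
        bias *= coverage
        if np.abs(coverage - 1.0).max() < eps:
            break
    return m, bias

# Toy symmetric raw contact map (real input would come from valid Hi-C pairs).
rng = np.random.default_rng(5)
raw = rng.poisson(5, size=(6, 6))
raw = np.triu(raw) + np.triu(raw, 1).T                 # make it symmetric
normalized, bias = ice_normalize(raw)
print(np.round(normalized.sum(axis=1), 3))             # near-uniform row sums
```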

  3. Optimization of cow dung spiked pre-consumer processing vegetable waste for vermicomposting using Eisenia fetida.

    PubMed

    Garg, V K; Gupta, Renuka

    2011-01-01

    This paper reports the optimization of cow dung (CD) spiked pre-consumer processing vegetable waste (PPVW) for vermicomposting using Eisenia fetida in a laboratory-scale study. The vermicomposting process decreased the carbon and organic matter concentration and increased the N, P and K content of the vermicompost. The C:N ratio decreased by 45-69% in the different vermireactors, indicating stabilization of the waste. The heavy metal content was within the permissible limits for application in agricultural soils. It is concluded from the results that addition of up to 40% PPVW with CD can produce a good-quality vermicompost, whereas growth and fecundity of E. fetida were best when the worms were reared in a 20% PPVW + 80% CD feed mixture. However, higher percentages of PPVW in the different vermireactors significantly affected the growth and fecundity of the worms. PMID:20951432

  4. Pre-Hardware Optimization of Spacecraft Image Processing Algorithms and Hardware Implementation

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Petrick, David J.; Flatley, Thomas P.; Hestnes, Phyllis; Jentoft-Nilsen, Marit; Day, John H. (Technical Monitor)

    2002-01-01

    Spacecraft telemetry rates and telemetry product complexity have steadily increased over the last decade presenting a problem for real-time processing by ground facilities. This paper proposes a solution to a related problem for the Geostationary Operational Environmental Spacecraft (GOES-8) image data processing and color picture generation application. Although large super-computer facilities are the obvious heritage solution, they are very costly, making it imperative to seek a feasible alternative engineering solution at a fraction of the cost. The proposed solution is based on a Personal Computer (PC) platform and synergy of optimized software algorithms, and reconfigurable computing hardware (RC) technologies, such as Field Programmable Gate Arrays (FPGA) and Digital Signal Processors (DSP). It has been shown that this approach can provide superior inexpensive performance for a chosen application on the ground station or on-board a spacecraft.

  5. Exploring High-Dimensional Data Space: Identifying Optimal Process Conditions in Photovoltaics: Preprint

    SciTech Connect

    Suh, C.; Glynn, S.; Scharf, J.; Contreras, M. A.; Noufi, R.; Jones, W. B.; Biagioni, D.

    2011-07-01

    We demonstrate how advanced exploratory data analysis coupled to data-mining techniques can be used to scrutinize the high-dimensional data space of photovoltaics in the context of thin films of Al-doped ZnO (AZO), which are essential materials as a transparent conducting oxide (TCO) layer in CuInxGa1-xSe2 (CIGS) solar cells. AZO data space, wherein each sample is synthesized from a different process history and assessed with various characterizations, is transformed, reorganized, and visualized in order to extract optimal process conditions. The data-analysis methods used include parallel coordinates, diffusion maps, and hierarchical agglomerative clustering algorithms combined with diffusion map embedding.

  6. Process Optimization of Seed Precipitation Tank with Multiple Impellers Using Computational Fluid Dynamics

    NASA Astrophysics Data System (ADS)

    Zhao, Hong-Liang; Lv, Chao; Liu, Yan; Zhang, Ting-An

    2015-07-01

    The complex fluid flow in a large-scale tank stirred with multiple Ekato Intermig impellers, as used in the seed precipitation process, was numerically analyzed by the computational fluid dynamics method. The flow field, liquid-solid mixing, and power consumption were simulated by adopting the Eulerian granular multiphase model and the standard k-ɛ turbulence model. A steady multiple reference frame approach was used to represent impeller rotation. The simulated results showed that the five-stage multiple Intermig impeller coupled with sloped baffles could generate axial circulation loops, which is good for uniform solid mixing. The fluid is overmixed under the current industrial conditions. Compared with the current process conditions, a three-stage impeller with L/D of 1.25 could not only meet the industrial requirements but also save more than 20% of the power. The results have important implications for reliable design and optimal performance in industry.

  7. Towards optimization of the silanization process of hydroxyapatite for its use in bone cement formulations.

    PubMed

    Cisneros-Pineda, Olga G; Herrera Kao, Wilberth; Loría-Bastarrachea, María I; Veranes-Pantoja, Yaymarilis; Cauich-Rodríguez, Juan V; Cervantes-Uc, José M

    2014-07-01

    The aim of this work was to provide some fundamental information for the optimization of the silanization of hydroxyapatite (HA) intended for bone cement formulations. The effect of the 3-(trimethoxysilyl) propyl methacrylate (MPS) concentration and of the solvent system (acetone/water or methanol/water mixtures) during HA silanization was monitored by X-ray diffraction (XRD), FTIR spectroscopy and EDX analysis. The effect of silanized HA on the mechanical properties of acrylic bone cements is also reported. It was found that the silanization process rendered hydroxyapatite with lower crystallinity compared to untreated HA. Through EDX, it was observed that the silicon concentration in the HA particles was higher for the acetone-water system than for the methanol-water system, although the mechanical performance of cements prepared with these particles exhibited the opposite behavior. Taking all these results together, it is concluded that the methanol-water system containing MPS at 3 wt.% provides the better results during the silanization process of HA.

  8. Optimization of cold rolling process parameters in order to increasing rolling speed limited by chatter vibrations

    PubMed Central

    Heidari, Ali; Forouzan, Mohammad R.

    2012-01-01

    Chatter has been recognized as a major restriction on increasing the productivity of cold rolling processes, limiting the rolling speed for thin steel strips. It is shown that chatter is closely related to the rolling conditions, so the main aim of this paper is to attain the optimum set points of rolling that achieve maximum rolling speed while preventing chatter from occurring. Two combined methods were used for the optimization. The first method consists of four steps: providing a simulation program for chatter analysis, preparing data from the simulation program based on a central composite design of experiments, developing a statistical model relating the system's tendency to chatter to the rolling parameters by response surface methodology, and finally optimizing the process by a genetic algorithm. The second method has analogous stages, but the central composite design of experiments is replaced by the Taguchi method and response surface methodology is replaced by a neural network method. A study of the influence of the rolling parameters on system stability has also been carried out. By using these combined methods, new set points were determined and a significant improvement in rolling speed was achieved. PMID:25685398
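
    The final optimization step of the first method (a statistical response-surface model maximized by a genetic algorithm) can be sketched as below. The chatter-speed model, parameter bounds and GA settings are all invented for illustration; only the overall structure of a surrogate model searched by a simple real-coded GA reflects the approach described.

```python
import numpy as np

rng = np.random.default_rng(6)

def critical_speed(x):
    """Stand-in for the statistical chatter model: predicted maximum stable
    rolling speed (m/s) as a function of coded rolling parameters
    x = [reduction, inter-stand tension, friction coefficient, strip width]."""
    r, t, mu, w = x
    return 20 + 6 * t - 5 * r**2 - 4 * (mu - 0.3) ** 2 - 2 * w * r

bounds = np.array([[0.1, 0.5], [0.0, 1.0], [0.1, 0.6], [0.5, 1.5]])

def genetic_maximize(f, bounds, pop_size=60, generations=80, mut_sigma=0.05):
    """Very small real-coded GA: tournament selection, blend crossover,
    Gaussian mutation, with elitism. Enough to illustrate the optimization step."""
    dim = len(bounds)
    pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(pop_size, dim))
    for _ in range(generations):
        fitness = np.array([f(ind) for ind in pop])
        new_pop = [pop[np.argmax(fitness)]]                       # elitism
        while len(new_pop) < pop_size:
            i, j = rng.integers(0, pop_size, 2)
            p1 = pop[i] if fitness[i] > fitness[j] else pop[j]    # tournament
            i, j = rng.integers(0, pop_size, 2)
            p2 = pop[i] if fitness[i] > fitness[j] else pop[j]
            alpha = rng.uniform(0, 1, dim)
            child = alpha * p1 + (1 - alpha) * p2                 # blend crossover
            child += rng.normal(0, mut_sigma, dim)                # mutation
            new_pop.append(np.clip(child, bounds[:, 0], bounds[:, 1]))
        pop = np.array(new_pop)
    best = pop[np.argmax([f(ind) for ind in pop])]
    return best, f(best)

best_setpoint, best_speed = genetic_maximize(critical_speed, bounds)
print("optimum set points:", np.round(best_setpoint, 3), "-> speed", round(best_speed, 2))
```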

  9. Optimization of process parameters for drilled hole quality characteristics during cortical bone drilling using Taguchi method.

    PubMed

    Singh, Gurmeet; Jain, Vivek; Gupta, Dheeraj; Ghai, Aman

    2016-09-01

    Orthopaedic surgery involves drilling of bones to get them fixed at their original position. The drilling process used in orthopaedic surgery is much like a conventional mechanical drilling process, and there is every likelihood that it may harm the already damaged bone, the surrounding bone tissue and nerves; the peril is not limited to that. It is very much feared that the recovery of that part may be impeded, so that the repair may not be sustained lifelong. To achieve sustainable orthopaedic surgery, a surgeon must try to control the drilling damage at the time of bone drilling. The area around the holes decides the life of the bone joint, and so the contiguous area of the drilled hole must remain intact and retain its properties even after drilling. This study mainly focuses on the optimization of drilling parameters such as rotational speed, feed rate and type of tool, at three levels each, using Taguchi optimization with surface roughness and material removal rate as responses. Confirmation experiments were also carried out and the results were found to lie within the confidence interval. Scanning electron microscopy (SEM) images assisted in getting micro-level information on the bone damage. PMID:27254280
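
    The Taguchi analysis referred to above rests on signal-to-noise (S/N) ratios: smaller-is-better for surface roughness and larger-is-better for material removal rate. The sketch below computes both from an L9-style set of runs with illustrative response values (not the study's measurements) and averages the S/N per level of one factor.

```python
import numpy as np

# Measured responses for each run of an L9-style Taguchi plan (values illustrative):
# surface roughness Ra (um) is "smaller is better"; material removal rate MRR
# (mm^3/min) is "larger is better". Two repeats per run.
ra  = np.array([[1.8, 1.9], [1.2, 1.3], [2.4, 2.2],
                [1.1, 1.0], [1.6, 1.7], [2.0, 2.1],
                [0.9, 1.0], [1.5, 1.4], [2.3, 2.5]])
mrr = np.array([[12, 13], [18, 17], [9, 10],
                [21, 20], [15, 16], [11, 12],
                [25, 24], [16, 15], [10, 9]], dtype=float)

# Taguchi signal-to-noise ratios (dB) for the two quality characteristics.
sn_smaller_is_better = -10 * np.log10(np.mean(ra**2, axis=1))
sn_larger_is_better  = -10 * np.log10(np.mean(1.0 / mrr**2, axis=1))

# Average S/N per level of the first-column factor (e.g. spindle speed) of an L9
# array, where runs 0-2, 3-5 and 6-8 use levels 1, 2 and 3 of that factor.
for name, sn in [("Ra", sn_smaller_is_better), ("MRR", sn_larger_is_better)]:
    level_means = [sn[0:3].mean(), sn[3:6].mean(), sn[6:9].mean()]
    print(name, "mean S/N by speed level:", np.round(level_means, 2))
```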

  10. A method to optimize the processing algorithm of a computed radiography system for chest radiography.

    PubMed

    Moore, C S; Liney, G P; Beavis, A W; Saunderson, J R

    2007-09-01

    A test methodology using an anthropomorphic-equivalent chest phantom is described for the optimization of the Agfa computed radiography "MUSICA" processing algorithm for chest radiography. The contrast-to-noise ratio (CNR) in the lung, heart and diaphragm regions of the phantom, and the "system modulation transfer function" (sMTF) in the lung region, were measured using test tools embedded in the phantom. Using these parameters the MUSICA processing algorithm was optimized with respect to low-contrast detectability and spatial resolution. Two optimum "MUSICA parameter sets" were derived respectively for maximizing the CNR and sMTF in each region of the phantom. Further work is required to find the relative importance of low-contrast detectability and spatial resolution in chest images, from which the definitive optimum MUSICA parameter set can then be derived. Prior to this further work, a compromised optimum MUSICA parameter set was applied to a range of clinical images. A group of experienced image evaluators scored these images alongside images produced from the same radiographs using the MUSICA parameter set in clinical use at the time. The compromised optimum MUSICA parameter set was shown to produce measurably better images. PMID:17709364
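
    One common form of the CNR measurement described above is sketched below; the exact ROI definitions and noise estimate used in the paper may differ, and the pixel data here are synthetic.

```python
import numpy as np

def contrast_to_noise_ratio(roi_detail, roi_background):
    """CNR between a low-contrast detail ROI and its local background ROI,
    using the pooled standard deviation as the noise estimate."""
    signal = np.mean(roi_detail) - np.mean(roi_background)
    noise = np.sqrt((np.var(roi_detail) + np.var(roi_background)) / 2.0)
    return signal / noise

# Toy pixel data standing in for ROIs extracted from the phantom image.
rng = np.random.default_rng(7)
detail = rng.normal(105, 4, size=(30, 30))       # detail region, slightly brighter
background = rng.normal(100, 4, size=(30, 30))   # surrounding lung-region background
print(f"CNR = {contrast_to_noise_ratio(detail, background):.2f}")
```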

  11. Data processing of vertical scanning white-light interferometry based on particle swarm optimization

    NASA Astrophysics Data System (ADS)

    Hu, Jie; Cui, Changcai; Huang, Hui; Ye, Ruifang

    2013-01-01

    In order to precisely locate the position of zero optical path difference (ZOPD) between the measuring light beam and the reference light beam in a vertical scanning white-light interferometer and thereby realize accurate surface measurement, Particle Swarm Optimization (PSO) was used to process the interferometry data captured by a CCD camera. The envelope of the intensity series of every pixel is first assumed to be approximated by a Gaussian curve. Its parameters are then optimized by the PSO to find the best Gaussian curve, and thereby the position of the ZOPD, with an objective function that minimizes the residual sum of squares between the measured data and the theoretical fitting curve. Finally, the measured surface can be reconstructed from the series of best ZOPD positions obtained by the proposed method. Simulation data and measured data from two standard samples with different kinds of reticles in repeated tests show that the PSO is suitable for precisely locating the ZOPD with low requirements on the sampling step and a small number of frames. Therefore, without reducing the precision, the PSO can be used for data processing in white-light interferometry systems with relatively low requirements on the stepping hardware.
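
    A compact version of the envelope-fitting idea is sketched below: a simple global-best PSO minimizes the residual sum of squares between a crude envelope estimate and a four-parameter Gaussian, and the fitted center is taken as the ZOPD position. The fringe model, demodulation step and PSO settings are illustrative assumptions rather than the authors' exact implementation.

```python
import numpy as np

rng = np.random.default_rng(8)

# Synthetic intensity series of one pixel versus scan position z (um): white-light
# fringes under a Gaussian coherence envelope centred at the true ZOPD position.
z = np.arange(0.0, 20.0, 0.05)
z0_true = 11.3
fringes = np.cos(2 * np.pi * (z - z0_true) / 0.3)
intensity = 100 + 60 * np.exp(-((z - z0_true) / 1.5) ** 2) * fringes + rng.normal(0, 1.5, z.size)

# Crude envelope estimate: magnitude of the AC part of the signal.
envelope = np.abs(intensity - intensity.mean())

def objective(p):
    """Residual sum of squares between the envelope and a Gaussian with
    parameters p = [amplitude, centre (ZOPD), width, offset]."""
    a, z0, w, c = p
    return np.sum((envelope - (c + a * np.exp(-((z - z0) / w) ** 2))) ** 2)

def pso(obj, lower, upper, n_particles=40, n_iter=200, w_in=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer (global-best variant)."""
    dim = len(lower)
    pos = rng.uniform(lower, upper, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([obj(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)]
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w_in * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lower, upper)
        vals = np.array([obj(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)]
    return gbest

best = pso(objective, lower=np.array([1, 0, 0.1, 0]), upper=np.array([100, 20, 5, 50]))
print(f"estimated ZOPD position: {best[1]:.2f} um (true {z0_true} um)")
```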

  13. Optimization and automation of an end-to-end high throughput microscale transient protein production process.

    PubMed

    Bos, Aaron B; Luan, Peng; Duque, Joseph N; Reilly, Dorothea; Harms, Peter D; Wong, Athena W

    2015-09-01

    High throughput protein production from transient transfection of mammalian cells is used in multiple facets of research and development studies. Commonly used formats for these high number expressions are 12-, 24- and 96-well plates at various volumes. However there are no published examples of a 96-deep well plate microscale (1,000 μL) suspension process for mammalian transient expression. For this reason, we aimed to determine the optimal operating conditions for a high producing, microscale HEK293 transient system. We evaluated the hydrodynamic flow and measured the oxygen transfer rate (OTR) and transient protein expression for 96-deep well plates of different well geometries filled at 600-1,000 μL working volumes and agitated at various speeds and orbital diameters. Ultimately, a round well-round bottom (RR) 96-deep well plate with a working volume of 1,000 µL agitated at 1,000 RPM and a 3 mm orbital diameter yielded the highest and most consistent total transient protein production. As plate cultures are subject to evaporation, water loss from different plate seals was measured to identify an optimal plate sealing method. Finally, to enable higher capacity protein production, both expression and purification processes were automated. Functionality of this end-to-end automation workflow was demonstrated with the generation of high levels of human IgG1 antibodies (≥360 µg/mL) with reproducible productivity, product quality and ≥78% purification recovery.

  15. Stochastic analysis and simulation of hydrometeorological processes for optimizing hybrid renewable energy systems

    NASA Astrophysics Data System (ADS)

    Tsekouras, Georgios; Ioannou, Christos; Efstratiadis, Andreas; Koutsoyiannis, Demetris

    2013-04-01

    The drawbacks of conventional energy sources, including their negative environmental impacts, emphasize the need to integrate renewable energy sources into the energy balance. However, renewable sources strongly depend on time-varying and uncertain hydrometeorological processes, including wind speed, sunshine duration and solar radiation. To study the design and management of hybrid energy systems we investigate the stochastic properties of these natural processes, including possible long-term persistence. We use wind speed and sunshine duration time series retrieved from a European database of daily records and estimate representative values of the Hurst coefficient for both variables. We conduct simultaneous generation of synthetic time series of wind speed and sunshine duration on the yearly, monthly and daily scales. To this end we use the Castalia software system, which performs multivariate stochastic simulation. Using these time series as input, we perform stochastic simulation of an autonomous hypothetical hybrid renewable energy system and optimize its performance using genetic algorithms. For the system design we optimize the sizing of the system in order to satisfy the energy demand with high reliability while also minimizing the cost. While the simulation is carried out at the daily scale, a simple method allows utilizing the sub-daily distribution of the produced wind power. Various scenarios are assumed in order to examine the influence of input parameters, such as the Hurst coefficient, and of design parameters such as the photovoltaic panel angle.
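
    The Hurst coefficient mentioned above can be estimated in several ways; the sketch below uses the aggregated-variance method (the variance of block means scales as m^(2H-2)) on a synthetic daily wind-speed series, purely to illustrate the estimation step rather than the Castalia methodology itself.

```python
import numpy as np

def hurst_aggregated_variance(series, block_sizes=(4, 8, 16, 32, 64, 128)):
    """Estimate the Hurst coefficient with the aggregated-variance method:
    for block size m the variance of block means scales as m^(2H - 2)."""
    series = np.asarray(series, dtype=float)
    log_m, log_var = [], []
    for m in block_sizes:
        n_blocks = len(series) // m
        if n_blocks < 2:
            continue
        means = series[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        log_m.append(np.log(m))
        log_var.append(np.log(means.var()))
    slope = np.polyfit(log_m, log_var, 1)[0]
    return 1.0 + slope / 2.0

# White noise should give H close to 0.5; a persistent process gives H > 0.5.
rng = np.random.default_rng(9)
daily_wind = rng.normal(6.0, 2.0, size=20 * 365)        # synthetic daily wind speed
print(f"estimated Hurst coefficient: {hurst_aggregated_variance(daily_wind):.2f}")
```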

  16. Fault Detection of Roller-Bearings Using Signal Processing and Optimization Algorithms

    PubMed Central

    Kwak, Dae-Ho; Lee, Dong-Han; Ahn, Jong-Hyo; Koh, Bong-Hwan

    2014-01-01

    This study presents fault detection of roller bearings through signal processing and optimization techniques. After the occurrence of scratch-type defects on the inner race of bearings, variations of kurtosis values are investigated using two different data processing techniques: minimum entropy deconvolution (MED) and the Teager-Kaiser Energy Operator (TKEO). MED and the TKEO are employed to qualitatively enhance the discrimination of defect-induced repeating peaks in bearing vibration data with measurement noise. Considering the execution sequence of MED and the TKEO, the study found that the kurtosis sensitivity towards a defect in bearings could be greatly improved. Also, the vibration signal from both healthy and damaged bearings is decomposed into multiple intrinsic mode functions (IMFs) through empirical mode decomposition (EMD). The weight vectors of the IMFs become design variables for a genetic algorithm (GA). The weights of each IMF can be optimized through the genetic algorithm to enhance the sensitivity of kurtosis for damaged bearing signals. Experimental results show that the EMD-GA approach successfully improved the resolution of detectability between a roller bearing with a defect and an intact system. PMID:24368701
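
    The two signal-processing operators named above are easy to state: the Teager-Kaiser Energy Operator is psi[n] = x[n]^2 - x[n-1]x[n+1], and kurtosis is the standardized fourth moment used as the defect indicator. The sketch below applies both to a synthetic impulsive bearing signal (illustrative, not the experimental data).

```python
import numpy as np
from scipy.stats import kurtosis

def teager_kaiser(x):
    """Teager-Kaiser Energy Operator: psi[n] = x[n]^2 - x[n-1] * x[n+1]."""
    x = np.asarray(x, dtype=float)
    return x[1:-1] ** 2 - x[:-2] * x[2:]

# Synthetic bearing vibration: background noise plus periodic defect impulses.
rng = np.random.default_rng(10)
fs = 20_000                                   # sampling rate, Hz
signal = rng.normal(0.0, 1.0, fs)             # one second of noise
signal[::200] += 6.0                          # a defect impact every 10 ms (200 samples)

# Kurtosis (Pearson definition) before and after TKEO; for impulsive faults the
# operator typically sharpens the repeating peaks and raises the kurtosis indicator.
print("kurtosis of raw signal:", round(kurtosis(signal, fisher=False), 2))
print("kurtosis after TKEO:   ", round(kurtosis(teager_kaiser(signal), fisher=False), 2))
```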

  18. [THE OPTIMIZATION OF THE ACTIVITY OF ORGANS OF FEDERAL SERVICE FOR SUPERVISION OF CONSUMER RIGHTS PROTECTION AND HUMAN WELFARE IN THE SVERDLOVSK REGION].

    PubMed

    Kuz'min, S V; Gurvich, V B; Romanov, S V; Dikonskaia, O V; Iarushin, S V; Malykh, O L

    2015-01-01

    In the Sverdlovsk region, methodological approaches have been developed and implemented for optimizing the activity of the Directorate and the Centre, aimed at improving sanitary and epidemiological surveillance and consumer rights protection within the framework of the development of a comprehensive regional system of health risk management for the population of the Sverdlovsk region.

  19. Optimization of oxidation processes to improve crystalline silicon solar cell emitters

    SciTech Connect

    Shen, L.; Liang, Z. C. Liu, C. F.; Long, T. J.; Wang, D. L.

    2014-02-15

    Control of the oxidation process is one key issue in producing high-quality emitters for crystalline silicon solar cells. In this paper, the oxidation parameters of pre-oxidation time, oxygen concentration during pre-oxidation and pre-deposition, and drive-in time were optimized by using orthogonal experiments. By analyzing experimental measurements of short-circuit current, open-circuit voltage, series resistance and solar cell efficiency in solar cells with different sheet resistances, produced by using different diffusion processes, we inferred that an emitter with a sheet resistance of approximately 70 Ω/□ performed best under the existing standard solar cell process. Further investigations were conducted on emitters with sheet resistances of approximately 70 Ω/□ that were obtained from different preparation processes. The results indicate that emitters with surface phosphorus concentrations between 4.96 × 10²⁰ cm⁻³ and 7.78 × 10²⁰ cm⁻³ and with junction depths between 0.46 μm and 0.55 μm possessed the best quality. With no extra processing, the final crystalline silicon solar cell efficiency can reach 18.41%, which is an increase of 0.4% (absolute) compared to conventional emitters with 50 Ω/□ sheet resistance.

  20. Numerical model describing optimization of fibres winding process on open and closed frame

    NASA Astrophysics Data System (ADS)

    Petrů, M.; Mlýnek, J.; Martinec, T.

    2016-08-01

    This article discusses a numerical model describing the optimization of the fibre winding process on open and closed frames. The production quality of this type of composite frame depends primarily on the correct winding of the fibres on a polyurethane core. It is especially necessary to ensure the correct winding angles of the fibres on the polyurethane core and the homogeneity of the individual winding layers. The article describes a mathematical model for using an industrial robot in filament winding and how to calculate the trajectory of the robot. The fibres are wound on the polyurethane core, which is fastened to the robot end-effector so that, during the winding process, it passes through a fibre-processing head along a suitably determined robot end-effector trajectory. We use the described numerical model and matrix calculus to compute the trajectory of the robot end-effector that produces the desired passage of the frame through the fibre-processing head. The calculation of the trajectory was programmed in the Delphi development environment. The relations of the numerical model are important for actually solving the passage of a polyurethane core through the fibre-processing head.
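
    The original calculation was programmed in Delphi; the sketch below restates the geometric idea in Python with homogeneous transformation matrices: for each point of the core centreline, the end-effector pose is chosen so that this point, with its tangent aligned to the head axis, coincides with the fixed fibre-processing head. The centreline, frame convention and head pose are illustrative assumptions, not the authors' model.

```python
import numpy as np

def homogeneous(rotation, translation):
    """Assemble a 4x4 homogeneous transformation matrix."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = rotation, translation
    return T

def frame_from_tangent(tangent):
    """Build an orthonormal frame whose x-axis follows the core tangent."""
    x = tangent / np.linalg.norm(tangent)
    helper = np.array([0.0, 0.0, 1.0]) if abs(x[2]) < 0.9 else np.array([0.0, 1.0, 0.0])
    y = np.cross(helper, x); y /= np.linalg.norm(y)
    z = np.cross(x, y)
    return np.column_stack([x, y, z])

# Centreline of the polyurethane core expressed in the core's own coordinate system
# (a simple curved segment here; a real frame geometry would come from CAD data).
s = np.linspace(0, 1, 50)
centreline = np.column_stack([s, 0.2 * np.sin(2 * np.pi * s), np.zeros_like(s)])
tangents = np.gradient(centreline, axis=0)

# Fixed pose of the fibre-processing head in the robot base coordinate system.
T_head = homogeneous(np.eye(3), np.array([1.2, 0.0, 0.8]))

# End-effector poses: move the core so that each centreline point, with its tangent
# aligned to the head axis, sits exactly in the processing head.
trajectory = []
for p, tan in zip(centreline, tangents):
    T_point_in_core = homogeneous(frame_from_tangent(tan), p)
    trajectory.append(T_head @ np.linalg.inv(T_point_in_core))

print("first end-effector pose:\n", np.round(trajectory[0], 3))
```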